DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
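The generic process-flow model described above lends itself to a simple expected-time calculation. The sketch below is a minimal illustration of that idea, not Sandia's actual DART model: a step with a once-through time, a backward-iteration probability, and a rework fraction; all numbers are hypothetical.

```python
# Minimal sketch (not Sandia's actual DART model) of a process step with
# once-through time t, backward-iteration probability p, and rework fraction r.
# Repeat passes beyond the first follow a geometric distribution.

def expected_step_time(t_once: float, p_iter: float, rework: float) -> float:
    """Expected total time for one process step with rework loops."""
    expected_reworks = p_iter / (1.0 - p_iter)  # mean number of repeat passes
    return t_once * (1.0 + rework * expected_reworks)

# Example: an 8-hour meshing step, 30% chance of backward iteration,
# and each repeat pass costing 50% of the once-through time.
print(expected_step_time(8.0, 0.3, 0.5))  # ~9.71 hours
```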
NASA Astrophysics Data System (ADS)
Inc, Mustafa; Yusuf, Abdullahi; Isa Aliyu, Aliyu; Baleanu, Dumitru
2018-03-01
This research presents the symmetry analysis, explicit solutions, and convergence analysis for the time-fractional Cahn-Allen (CA) and time-fractional Klein-Gordon (KG) equations with the Riemann-Liouville (RL) derivative. The time-fractional CA and KG equations are reduced to nonlinear ordinary differential equations of fractional order. We solve the reduced fractional ODEs using an explicit power series method. The convergence analysis for the obtained explicit solutions is investigated. Some figures for the obtained explicit solutions are also presented.
Xu, Duo; Zhu, Xuejiao; Xu, Yuan; Zhang, Liqing
2017-02-01
Objective Routine fasting (12 h) is always applied before laparoscopic cholecystectomy, but prolonged preoperative fasting causes thirst, hunger, and irritability as well as dehydration, low blood glucose, insulin resistance and other adverse reactions. We assessed the safety and efficacy of a shortened preoperative fasting period in patients undergoing laparoscopic cholecystectomy. Methods We searched PubMed, Embase and Cochrane Central Register of Controlled Trials up to 20 November 2015 and selected controlled trials with a shortened fasting time before laparoscopic cholecystectomy. We assessed the results by performing a meta-analysis using a variety of outcome measures and investigated the heterogeneity by subgroup analysis. Results Eleven trials were included. Forest plots showed that a shortened fasting time reduced the operative risk and patient discomfort. A shortened fasting time also reduced postoperative nausea and vomiting as well as operative vomiting. With respect to glucose metabolism, a shortened fasting time significantly reduced abnormalities in the ratio of insulin sensitivity. The C-reactive protein concentration was also reduced by a shortened fasting time. Conclusions A shortened preoperative fasting time increases patients' postoperative comfort, improves insulin resistance, and reduces stress responses. This evidence supports the clinical application of a shortened fasting time before laparoscopic cholecystectomy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hofschen, S.; Wolff, I.
1996-08-01
Time-domain simulation results of two-dimensional (2-D) planar waveguide finite-difference time-domain (FDTD) analysis are normally analyzed using the Fourier transform. The introduced method of time series analysis to extract propagation and attenuation constants reduces the required computation time drastically. Additionally, a nonequidistant discretization together with an adequate excitation technique is used to reduce the number of spatial grid points. This makes it possible to simulate normal- and superconducting planar waveguide structures with very thin conductors and small dimensions, as used in MMIC technology. The simulation results are compared with measurements and show good agreement.
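The abstract does not spell out the time-series algorithm, so the sketch below uses one standard choice for extracting damping and frequency (equivalently, attenuation and propagation constants) from a short record: Prony-type linear prediction. The synthetic signal and parameter values are illustrative only.

```python
# Hedged sketch: extracting attenuation and propagation constants from a
# short time series by Prony-type linear prediction. This is one standard
# approach; the paper's exact time-series method may differ.
import numpy as np

def prony_modes(x: np.ndarray, dt: float, order: int):
    """Fit x[n] ~ sum_k a_k z_k^n and return per-mode damping/frequency."""
    # Linear prediction: solve for coefficients c with x[n] = sum c_m x[n-m].
    n = len(x)
    A = np.column_stack([x[order - m - 1 : n - m - 1] for m in range(order)])
    b = x[order:]
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Roots of the characteristic polynomial give the complex poles z_k.
    z = np.roots(np.concatenate(([1.0], -c)))
    alpha = -np.log(np.abs(z)) / dt      # attenuation (damping) constants
    omega = np.angle(z) / dt             # angular frequencies
    return alpha, omega

# Synthetic check: one damped oscillation with alpha = 2.0, omega = 50.0.
dt, t = 1e-3, np.arange(400) * 1e-3
x = np.exp(-2.0 * t) * np.cos(50.0 * t)
alpha, omega = prony_modes(x, dt, order=2)
print(alpha, omega)  # both poles should recover ~2.0 and +/-50 rad/s
```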
Time-resolved x-ray scattering instrumentation
Borso, C.S.
1985-11-21
An apparatus and method for increased speed and efficiency of data compilation and analysis in real time is presented in this disclosure. Data is sensed and grouped in combinations in accordance with predetermined logic. The combinations are grouped so that a simplified reduced signal results, such as pairwise summing of data values having offsetting algebraic signs, thereby reducing the magnitude of the net pair sum. Bit storage requirements are reduced and speed of data compilation and analysis is increased by manipulation of shorter bit length data values, making real time evaluation possible.
Design Through Analysis (DTA) roadmap vision.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blacker, Teddy Dean; Adams, Charles R.; Hoffman, Edward L.
2004-10-01
The Design through Analysis Realization Team (DART) will provide analysts with a complete toolset that reduces the time to create, generate, analyze, and manage the data generated in a computational analysis. The toolset will be both easy to learn and easy to use. The DART Roadmap Vision provides for progressive improvements that will reduce the Design through Analysis (DTA) cycle time by 90-percent over a three-year period while improving both the quality and accountability of the analyses.
Velo and REXAN - Integrated Data Management and High Speed Analysis for Experimental Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleese van Dam, Kerstin; Carson, James P.; Corrigan, Abigail L.
2013-01-10
The Chemical Imaging Initiative at the Pacific Northwest National Laboratory (PNNL) is creating a ‘Rapid Experimental Analysis’ (REXAN) Framework, based on the concept of reusable component libraries. REXAN allows developers to quickly compose and customize high-throughput analysis pipelines for a range of experiments, as well as supporting the creation of multi-modal analysis pipelines. In addition, PNNL has coupled REXAN with its collaborative data management and analysis environment Velo to create an easy-to-use data management and analysis environment for experimental facilities. This paper will discuss the benefits of Velo and REXAN in the context of three examples: (1) PNNL high-resolution mass spectrometry, reducing analysis times from hours to seconds and enabling the analysis of much larger data samples (100 KB to 40 GB) at the same time; (2) ALS X-ray tomography, reducing analysis times of combined STXM and EM data collected at the ALS from weeks to minutes, decreasing manual work and increasing the data volumes that can be analysed in a single step; and (3) multi-modal nano-scale analysis of STXM and TEM data, providing a semi-automated process for particle detection. The creation of REXAN has significantly shortened the development time for these analysis pipelines. The integration of Velo and REXAN has significantly increased the scientific productivity of the instruments and their users by creating easy-to-use data management and analysis environments with greatly reduced analysis times and improved analysis capabilities.
A strategy for reducing turnaround time in design optimization using a distributed computer system
NASA Technical Reports Server (NTRS)
Young, Katherine C.; Padula, Sharon L.; Rogers, James L.
1988-01-01
There is a need to explore methods for reducing lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
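As a rough modern analogue of the strategy described, the sketch below decomposes an analysis into independent portions and runs them in parallel with Python's multiprocessing module; the 1988 work used a network of smaller computers, and `analyze_substructure` is a hypothetical placeholder for validated analysis software.

```python
# Illustrative stand-in for the paper's distributed strategy: decompose the
# analysis into independent portions and run them in parallel. The original
# work used a network of smaller computers; a process pool is a modern analogue.
from multiprocessing import Pool

def analyze_substructure(substructure_id: int) -> int:
    # Placeholder analysis: in practice this would invoke validated
    # analysis software for one portion of the decomposed problem.
    return sum(i * i for i in range(10_000 * (substructure_id + 1))) % 97

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(analyze_substructure, range(8))
    print(results)  # portions computed concurrently, reducing turnaround time
```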
Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Hou, Zhangshuan; Meng, Da
2016-07-17
In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools, including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
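A minimal sketch of the ARIMA-plus-PCA idea follows, assuming synthetic data: the cross-area correlation is captured by a few principal components, each of which is forecast with ARIMA and mapped back to all areas. The study's sequential Gaussian simulation step is omitted here.

```python
# Hedged sketch of the ARIMA + PCA idea: model cross-correlated area loads by
# fitting ARIMA to a few principal components, preserving inter-area
# dependency. (The study also used sequential Gaussian simulation, omitted.)
import numpy as np
from sklearn.decomposition import PCA
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(500)
base = 10 + np.sin(2 * np.pi * t / 24)                # shared daily cycle
loads = np.column_stack([base + 0.3 * rng.standard_normal(500)
                         for _ in range(4)])          # 4 correlated areas

pca = PCA(n_components=2)
scores = pca.fit_transform(loads)                     # reduced representation

# Forecast each retained component, then map back to all areas at once.
fc = np.column_stack([ARIMA(scores[:, k], order=(2, 0, 1)).fit()
                      .forecast(steps=24) for k in range(2)])
load_fc = pca.inverse_transform(fc)                   # 24-step area forecasts
print(load_fc.shape)                                  # (24, 4)
```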
IoGET: Internet of Geophysical and Environmental Things
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mudunuru, Maruti Kumar
The objective of this project is to provide novel and fast reduced-order models for onboard computation at sensor nodes for real-time analysis. The approach will require that LANL perform high-fidelity numerical simulations, construct simple reduced-order models (ROMs) using machine learning and signal processing algorithms, and use real-time data analysis for ROMs and compressive sensing at sensor nodes.
NASA Astrophysics Data System (ADS)
Baleanu, Dumitru; Inc, Mustafa; Yusuf, Abdullahi; Aliyu, Aliyu Isa
2018-06-01
In this work, we investigate the Lie symmetry analysis, exact solutions and conservation laws (Cls) of the time-fractional Caudrey-Dodd-Gibbon-Sawada-Kotera (CDGDK) equation with Riemann-Liouville (RL) derivative. The time-fractional CDGDK is reduced to a nonlinear ordinary differential equation (ODE) of fractional order. New exact traveling wave solutions for the time-fractional CDGDK are obtained by the fractional sub-equation method. In the reduced equation, the derivative is in the Erdelyi-Kober (EK) sense. Ibragimov's nonlocal conservation method is applied to construct Cls for the time-fractional CDGDK.
Reduced rank models for travel time estimation of low order mode pulses.
Chandrayadula, Tarun K; Wage, Kathleen E; Worcester, Peter F; Dzieciuch, Matthew A; Mercer, James A; Andrew, Rex K; Howe, Bruce M
2013-10-01
Mode travel time estimation in the presence of internal waves (IWs) is a challenging problem. IWs perturb the sound speed, which results in travel time wander and mode scattering. A standard approach to travel time estimation is to pulse compress the broadband signal, pick the peak of the compressed time series, and average the peak time over multiple receptions to reduce variance. The peak-picking approach implicitly assumes there is a single strong arrival and does not perform well when there are multiple arrivals due to scattering. This article presents a statistical model for the scattered mode arrivals and uses the model to design improved travel time estimators. The model is based on an Empirical Orthogonal Function (EOF) analysis of the mode time series. Range-dependent simulations and data from the Long-range Ocean Acoustic Propagation Experiment (LOAPEX) indicate that the modes are represented by a small number of EOFs. The reduced-rank EOF model is used to construct a travel time estimator based on the Matched Subspace Detector (MSD). Analysis of simulation and experimental data show that the MSDs are more robust to IW scattering than peak picking. The simulation analysis also highlights how IWs affect the mode excitation by the source.
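The sketch below illustrates the two ingredients named in the abstract under simplifying assumptions: a reduced-rank EOF basis obtained from the SVD of an ensemble of mode time series, and a matched-subspace statistic that measures how much of a reception's energy lies in that subspace. The ensemble is synthetic, not LOAPEX data.

```python
# Hedged sketch of the two ingredients described: EOFs of an ensemble of mode
# time series (via SVD) and a matched-subspace statistic that measures how
# much of a received series lies in the low-rank EOF subspace.
import numpy as np

rng = np.random.default_rng(1)
n_time, n_obs, rank = 128, 200, 3

# Ensemble of simulated mode arrivals (rows: time samples, cols: receptions).
ensemble = rng.standard_normal((n_time, rank)) @ rng.standard_normal((rank, n_obs))

U, s, _ = np.linalg.svd(ensemble, full_matrices=False)
eofs = U[:, :rank]                     # reduced-rank EOF basis

def msd_statistic(x: np.ndarray, basis: np.ndarray) -> float:
    """Fraction of signal energy captured by the subspace (0 to 1)."""
    proj = basis @ (basis.T @ x)       # orthogonal projection onto the EOFs
    return float(proj @ proj) / float(x @ x)

x_signal = ensemble[:, 0] + 0.1 * rng.standard_normal(n_time)
x_noise = rng.standard_normal(n_time)
print(msd_statistic(x_signal, eofs))   # near 1: consistent with mode arrivals
print(msd_statistic(x_noise, eofs))    # near rank/n_time: noise only
```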
Huang, Xueqing; Ding, Jia; Effgen, Sigi; Turck, Franziska; Koornneef, Maarten
2013-08-01
Shoot branching is a major determinant of plant architecture. Genetic variants for reduced stem branching in the axils of cauline leaves of Arabidopsis were found in some natural accessions and also at low frequency in the progeny of multiparent crosses. Detailed genetic analysis using segregating populations derived from backcrosses with the parental lines and bulked segregant analysis was used to identify the allelic variation controlling reduced stem branching. Eight quantitative trait loci (QTLs) contributing to natural variation for reduced stem branching were identified (REDUCED STEM BRANCHING 1-8 (RSB1-8)). Genetic analysis showed that RSB6 and RSB7, corresponding to flowering time genes FLOWERING LOCUS C (FLC) and FRIGIDA (FRI), epistatically regulate stem branching. Furthermore, FLOWERING LOCUS T (FT), which corresponds to RSB8 as demonstrated by fine-mapping, transgenic complementation and expression analysis, caused pleiotropic effects not only on flowering time, but, in the specific background of active FRI and FLC alleles, also on the RSB trait. The consequence of allelic variation only expressed in late-flowering genotypes revealed novel and thus far unsuspected roles of several genes well characterized for their roles in flowering time control.
Pharmaceutical identifier confirmation via DART-TOF.
Easter, Jacob L; Steiner, Robert R
2014-07-01
Pharmaceutical analysis comprises a large amount of the casework in forensic controlled substances laboratories. In order to reduce the time of analysis for pharmaceuticals, a Direct Analysis in Real Time ion source coupled with an accurate mass time-of-flight (DART-TOF) mass spectrometer was used to confirm identity. DART-TOF spectral data for pharmaceutical samples were analyzed and evaluated by comparison to standard spectra. Identical mass pharmaceuticals were differentiated using collision induced dissociation fragmentation, present/absent ions, and abundance comparison box plots; principal component analysis (PCA) and linear discriminant analysis (LDA) were used for differentiation of identical mass mixed drug spectra. Mass assignment reproducibility and robustness tests were performed on the DART-TOF spectra. Impacts on the forensic science community include a decrease in analysis time over traditional gas chromatography/mass spectrometry (GC/MS) confirmations, better laboratory efficiency, and simpler sample preparation. Using physical identifiers and the DART-TOF to confirm pharmaceutical identity will eliminate the use of GC/MS and effectively reduce analysis time while still complying with accepted analysis protocols. This will prove helpful in laboratories with large backlogs and will simplify the confirmation process.
Lin, Susie; McKenna, Samuel J; Yao, Chuan-Fong; Chen, Yu-Ray; Chen, Chit
2017-01-01
The objective of this study was to evaluate the efficacy of hypotensive anesthesia in reducing intraoperative blood loss, decreasing operation time, and improving the quality of the surgical field during orthognathic surgery. A systematic review and meta-analysis of randomized controlled trials addressing these issues was carried out. An electronic database search was performed. The risk of bias was evaluated with the Jadad Scale and Delphi List. The inverse variance statistical method and a random-effects model were used. Ten randomized controlled trials were included for analysis. Our meta-analysis indicated that hypotensive anesthesia reduced intraoperative blood loss by a mean of about 169 mL. Hypotensive anesthesia was not shown to reduce the operation time for orthognathic surgery, but it did improve the quality of the surgical field. Subgroup analysis indicated that for blood loss in double-jaw surgery, the weighted mean difference favored the hypotensive group, with a reduction in blood loss of 175 mL, but no statistically significant reduction in blood loss was found for anterior maxillary osteotomy. If local anesthesia with epinephrine was used in conjunction with hypotensive anesthesia, the reduction in intraoperative blood loss was increased to 254.93 mL. Hypotensive anesthesia was effective in reducing blood loss and improving the quality of the surgical field, but it did not reduce the operation time for orthognathic surgery. The use of local anesthesia in conjunction with hypotensive general anesthesia further reduced the amount of intraoperative blood loss for orthognathic surgery.
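For readers unfamiliar with the pooling step, the following is a minimal sketch of inverse-variance random-effects meta-analysis with a DerSimonian-Laird between-study variance; the trial means and standard errors are invented for illustration and are not the data from the ten included trials.

```python
# Hedged sketch of inverse-variance random-effects pooling (DerSimonian-Laird
# between-study variance). Trial values below are illustrative, not the
# actual data from the included RCTs.
import numpy as np

# Mean difference in blood loss (mL) and its standard error, per trial.
md = np.array([-150.0, -210.0, -120.0, -180.0])
se = np.array([40.0, 55.0, 35.0, 60.0])

w_fixed = 1.0 / se**2
q = np.sum(w_fixed * (md - np.sum(w_fixed * md) / np.sum(w_fixed))**2)
df = len(md) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)            # between-study variance

w_rand = 1.0 / (se**2 + tau2)            # random-effects weights
pooled = np.sum(w_rand * md) / np.sum(w_rand)
pooled_se = np.sqrt(1.0 / np.sum(w_rand))
print(f"pooled MD = {pooled:.1f} mL, 95% CI +/- {1.96 * pooled_se:.1f}")
```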
NASA Astrophysics Data System (ADS)
Inc, Mustafa; Yusuf, Abdullahi; Aliyu, Aliyu Isa; Baleanu, Dumitru
2018-04-01
This paper studies the symmetry analysis, explicit solutions, convergence analysis, and conservation laws (Cls) for two different space-time fractional nonlinear evolution equations with Riemann-Liouville (RL) derivative. The governing equations are reduced to nonlinear ordinary differential equations (ODEs) of fractional order using their Lie point symmetries. In the reduced equations, the derivative is in the Erdelyi-Kober (EK) sense; the power series technique is applied to derive explicit solutions for the reduced fractional ODEs. The convergence of the obtained power series solutions is also presented. Moreover, the new conservation theorem and the generalization of the Noether operators are developed to construct the nonlocal Cls for the equations. Some interesting figures for the obtained explicit solutions are presented.
Integrated Sensitivity Analysis Workflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.
2014-08-01
Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.
Optimal subinterval selection approach for power system transient stability simulation
Kim, Soobae; Overbye, Thomas J.
2015-10-21
Power system transient stability analysis requires an appropriate integration time step to avoid numerical instability as well as to reduce computational demands. For fast system dynamics, which vary more rapidly than what the time step covers, a fraction of the time step, called a subinterval, is used. However, the optimal value of this subinterval is not easily determined because analysis of the system dynamics might be required. This selection is usually made from engineering experience, and perhaps trial and error. This paper proposes an optimal subinterval selection approach for power system transient stability analysis, which is based on modal analysis using a single machine infinite bus (SMIB) system. Fast system dynamics are identified with the modal analysis, and the SMIB system is used focusing on fast local modes. An appropriate subinterval time step from the proposed approach can reduce the computational burden and achieve accurate simulation responses as well. As a result, the performance of the proposed method is demonstrated with the GSO 37-bus system.
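A hedged sketch of the selection idea: linearize a classical SMIB swing model, find the fastest oscillatory mode from the eigenvalues, and size the subinterval to resolve that mode. The 20-points-per-period rule and all constants are assumptions for illustration, not the paper's exact criterion.

```python
# Hedged sketch of the selection idea: linearize a single machine infinite bus
# (SMIB) model, find the fastest oscillatory mode, and choose the subinterval
# as a fraction of that mode's period. Constants below are illustrative.
import numpy as np

# Classical SMIB swing dynamics linearized about an operating point:
# d(d_delta)/dt = d_omega;  M * d(d_omega)/dt = -Ks * d_delta - D * d_omega
M, D, Ks = 0.05, 0.02, 1.8          # inertia, damping, synchronizing torque
A = np.array([[0.0, 1.0],
              [-Ks / M, -D / M]])

eigvals = np.linalg.eigvals(A)
f_fast = np.max(np.abs(eigvals.imag)) / (2.0 * np.pi)  # fastest mode (Hz)

# Rule of thumb (an assumption, not the paper's exact criterion): resolve the
# fastest mode with ~20 points per period.
subinterval = 1.0 / (20.0 * f_fast)
print(f"fastest mode {f_fast:.2f} Hz -> subinterval {subinterval*1e3:.2f} ms")
```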
Development of WRF-ROI system by incorporating eigen-decomposition
NASA Astrophysics Data System (ADS)
Kim, S.; Noh, N.; Song, H.; Lim, G.
2011-12-01
This study presents the development of the WRF-ROI system, an implementation of Retrospective Optimal Interpolation (ROI) in the Weather Research and Forecasting model (WRF). ROI is a data assimilation algorithm introduced by Song et al. (2009) and Song and Lim (2009). The formulation of ROI is similar to that of Optimal Interpolation (OI), but ROI iteratively assimilates an observation set at a post-analysis time into a prior analysis, potentially providing high-quality reanalysis data. The ROI method assimilates the data at the post-analysis time using a perturbation method (Errico and Raeder, 1999) without an adjoint model. In a previous study, the ROI method was applied to the Lorenz 40-variable model (Lorenz, 1996) to validate the algorithm and to investigate its capability. It is therefore necessary to apply the ROI method to a more realistic and complicated model framework such as WRF. In this research, the reduced-rank formulation of ROI is used instead of a reduced-resolution method. Computational costs can be reduced due to the eigen-decomposition of the background error covariance in the reduced-rank method. When a single profile of observations is assimilated in the WRF-ROI system incorporating eigen-decomposition, the analysis error tends to be reduced compared with the background error. The difference between forecast errors with and without assimilation clearly increases with time, indicating that assimilation improves the forecast.
Time and expected value of sample information wait for no patient.
Eckermann, Simon; Willan, Andrew R
2008-01-01
The expected value of sample information (EVSI) from prospective trials has previously been modeled as the product of EVSI per patient and the number of patients across the relevant time horizon less those "used up" in trials. However, this implicitly assumes that the eligible patient population to which information from a trial can be applied across a time horizon is independent of the time for trial accrual, follow-up and analysis. This article demonstrates that in calculating the EVSI of a trial, the number of patients who benefit from trial information should be reduced by those treated outside as well as within the trial over the time until trial evidence is updated, including time for accrual, follow-up and analysis. Accounting for time is shown to reduce the eligible patient population: 1) independently of trial size, in allowing for the time of follow-up and analysis, and 2) dependently on trial size for the time of accrual, where the patient accrual rate is less than the incidence. Consequently, the EVSI and expected net gain (ENG) at any given trial size are shown to be lower when accounting for time, with lower ENG reinforced in the case of trials undertaken while delaying decisions by the additional opportunity costs of time. Appropriately accounting for time reduces the EVSI of trial designs and increases the opportunity costs of trials undertaken with delay, leading to a lower likelihood of trialing being optimal and to smaller trial designs where it is optimal.
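The accounting argued for above can be written as simple arithmetic. The sketch below, with invented numbers, shows how the EVSI population shrinks once patients treated during accrual, follow-up, and analysis are removed; the function name and parameterization are ours, not the article's.

```python
# Hedged sketch of the article's accounting: patients treated (inside or
# outside the trial) before trial evidence is updated cannot benefit from it,
# so they are removed from the EVSI population. Numbers are illustrative.

def evsi_population(incidence: float, horizon_yrs: float, n_trial: int,
                    accrual_rate: float, t_followup: float,
                    t_analysis: float) -> float:
    """Patients who can still benefit once trial evidence is reported."""
    t_accrual = n_trial / accrual_rate          # size-dependent component
    t_update = t_accrual + t_followup + t_analysis
    treated_before_update = incidence * min(t_update, horizon_yrs)
    return max(0.0, incidence * horizon_yrs - treated_before_update)

naive = 10_000 * 10 - 1_000                     # horizon pop less trial only
timed = evsi_population(10_000, 10, 1_000, 2_000, 1.0, 0.5)
print(naive, timed)  # accounting for time shrinks the EVSI population
```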
USDA-ARS's Scientific Manuscript database
Accurate and rapid assays for glucose are desirable for analysis of glucose and starch in food and feedstuffs. An established colorimetric glucose oxidase-peroxidase method for glucose was modified to reduce analysis time, and evaluated for factors that affected accuracy. Time required to perform t...
Kumar, Mukesh; Rath, Nitish Kumar; Rath, Santanu Kumar
2016-04-01
Microarray-based gene expression profiling has emerged as an efficient technique for classification, prognosis, diagnosis, and treatment of cancer. Frequent changes in the behavior of this disease generate an enormous volume of data. Microarray data satisfy both the veracity and velocity properties of big data, as they keep changing with time. Therefore, the analysis of microarray datasets in a small amount of time is essential. These datasets often contain a large number of expression values, but only a fraction of them correspond to significantly expressed genes. The precise identification of genes of interest that are responsible for causing cancer is imperative in microarray data analysis. Most existing schemes employ a two-phase process such as feature selection/extraction followed by classification. In this paper, various statistical methods (tests) based on MapReduce are proposed for selecting relevant features. After feature selection, a MapReduce-based K-nearest neighbor (mrKNN) classifier is also employed to classify microarray data. These algorithms are successfully implemented in a Hadoop framework. A comparative analysis is done on these MapReduce-based models using microarray datasets of various dimensions. From the obtained results, it is observed that these models consume much less execution time than conventional models in processing big data.
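A plain-Python sketch of the map/reduce split behind mrKNN follows; the actual implementation runs on Hadoop, and the expression data here are random stand-ins. Each "map" task returns its shard's k nearest candidates, and the "reduce" step merges them and votes.

```python
# Hedged sketch of the mrKNN idea in plain Python: each "map" task finds the
# k nearest training samples within its shard; the "reduce" step merges the
# candidates and votes. (The paper's actual implementation runs on Hadoop.)
import heapq
import numpy as np

def map_phase(shard_X, shard_y, query, k):
    d = np.linalg.norm(shard_X - query, axis=1)
    idx = np.argsort(d)[:k]                     # local top-k candidates
    return [(d[i], shard_y[i]) for i in idx]

def reduce_phase(candidate_lists, k):
    merged = heapq.nsmallest(k, (c for lst in candidate_lists for c in lst))
    labels = [label for _, label in merged]
    return max(set(labels), key=labels.count)   # majority vote

rng = np.random.default_rng(2)
X = rng.standard_normal((3000, 50))             # stand-in expression data
y = (X[:, 0] > 0).astype(int)
shards = np.array_split(np.arange(3000), 4)     # 4 "mapper" shards
query = X[7]

parts = [map_phase(X[s], y[s], query, k=5) for s in shards]
print(reduce_phase(parts, k=5))                 # predicted class label
```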
Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis
NASA Technical Reports Server (NTRS)
Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.
2015-01-01
This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in the ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce the development cycle times and costs.
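A minimal POD-Galerkin sketch follows for a linear stand-in thermal model dT/dt = AT + b: snapshots from the full-order run yield a truncated basis, and the projected r x r system is integrated instead of the n x n one. DEIM and TPWL, which the paper needs for the nonlinear radiative terms, are omitted.

```python
# Minimal POD-Galerkin sketch for a linear thermal model dT/dt = A T + b:
# collect snapshots, build a truncated POD basis, and project the system.
# (DEIM and TPWL, used in the paper for nonlinear radiation, are omitted.)
import numpy as np

n, r, steps, dt = 200, 8, 400, 0.01
rng = np.random.default_rng(3)

# Toy conduction model: 1-D diffusion with a heat input (a stand-in for the
# full-scale spacecraft model).
A = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
b = rng.random(n)

T = np.zeros(n)
snaps = []
for _ in range(steps):                        # full-order simulation
    T = T + dt * (A @ T + b)
    snaps.append(T.copy())

U, s, _ = np.linalg.svd(np.array(snaps).T, full_matrices=False)
Phi = U[:, :r]                                # POD basis (n x r)

Ar, br = Phi.T @ A @ Phi, Phi.T @ b           # projected r x r system
q = np.zeros(r)
for _ in range(steps):                        # reduced-order simulation
    q = q + dt * (Ar @ q + br)

print(np.linalg.norm(Phi @ q - T) / np.linalg.norm(T))  # small relative error
```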
Kelly, Elizabeth W; Kelly, Jonathan D; Hiestand, Brian; Wells-Kiser, Kathy; Starling, Stephanie; Hoekstra, James W
2010-01-01
Rapid reperfusion in patients with ST-elevation myocardial infarction (STEMI) is associated with lower mortality. Reduction in door-to-balloon (D2B) time for percutaneous coronary intervention requires multidisciplinary cooperation, process analysis, and quality improvement methodology. Six Sigma methodology was used to reduce D2B times in STEMI patients presenting to a tertiary care center. Specific steps in STEMI care were determined, time goals were established, and processes were changed to reduce each step's duration. Outcomes were tracked, and timely feedback was given to providers. After process analysis and implementation of improvements, mean D2B times decreased from 128 to 90 minutes. Improvement has been sustained; as of June 2010, the mean D2B was 56 minutes, with 100% of patients meeting the 90-minute window for the year. Six Sigma methodology and immediate provider feedback result in significant reductions in D2B times. The lessons learned may be extrapolated to other primary percutaneous coronary intervention centers.
The strength of graduated drivers license programs and fatalities among teen drivers and passengers.
Morrisey, Michael A; Grabowski, David C; Dee, Thomas S; Campbell, Christine
2006-01-01
The purpose of this study is to investigate the effects of differentially stringent graduated drivers license programs on teen driver fatalities, day-time and night-time teen driver fatalities, fatalities of teen drivers with passengers present, and fatalities among teen passengers. The study uses 1992-2002 data on motor vehicle fatalities among 15-17-year-old drivers from the Fatality Analysis Reporting System to identify the effects of "good", "fair", and "marginal" GDL programs based upon designations by the Insurance Institute for Highway Safety. Analysis is conducted using conditional negative binomial regressions with fixed effects. "Good" programs reduce total fatalities among young drivers by 19.4% (c.i. -33.0%, -5.9%). "Fair" programs reduce night-time young driver fatalities by 12.6% (c.i. -23.9%, -1.2%), but have no effect on day-time fatalities. "Marginal" programs had no statistically meaningful effect on driver fatalities. All three types of programs reduced teen passenger fatalities, but the effects of limitations on the number of passengers appear to have had only minimal effects in reducing fatalities among young drivers themselves. Stronger GDL programs are more effective than weaker programs in reducing teenage motor vehicle fatalities.
The ALICE analysis train system
NASA Astrophysics Data System (ADS)
Zimmermann, Markus; ALICE Collaboration
2015-05-01
In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.
Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio
2015-12-01
This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.
Walsh, Alex J.; Sharick, Joe T.; Skala, Melissa C.; Beier, Hope T.
2016-01-01
Time-correlated single photon counting (TCSPC) enables acquisition of fluorescence lifetime decays with high temporal resolution within the fluorescence decay. However, many thousands of photons per pixel are required for accurate lifetime decay curve representation, instrument response deconvolution, and lifetime estimation, particularly for two-component lifetimes. TCSPC imaging speed is inherently limited due to the single photon per laser pulse nature and low fluorescence event efficiencies (<10%) required to reduce bias towards short lifetimes. Here, simulated fluorescence lifetime decays are analyzed by SPCImage and SLIM Curve software to determine the limiting lifetime parameters and photon requirements of fluorescence lifetime decays that can be accurately fit. Data analysis techniques to improve fitting accuracy for low photon count data were evaluated. Temporal binning of the decays from 256 time bins to 42 time bins significantly (p<0.0001) improved fit accuracy in SPCImage and enabled accurate fits with low photon counts (as low as 700 photons/decay), a 6-fold reduction in required photons and therefore improvement in imaging speed. Additionally, reducing the number of free parameters in the fitting algorithm by fixing the lifetimes to known values significantly reduced the lifetime component error from 27.3% to 3.2% in SPCImage (p<0.0001) and from 50.6% to 4.2% in SLIM Curve (p<0.0001). Analysis of nicotinamide adenine dinucleotide–lactate dehydrogenase (NADH-LDH) solutions confirmed temporal binning of TCSPC data and a reduced number of free parameters improves exponential decay fit accuracy in SPCImage. Altogether, temporal binning (in SPCImage) and reduced free parameters are data analysis techniques that enable accurate lifetime estimation from low photon count data and enable TCSPC imaging speeds up to 6x and 300x faster, respectively, than traditional TCSPC analysis. PMID:27446663
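The two analysis techniques evaluated above can be sketched directly, assuming a simulated decay with no instrument response: rebin a 256-bin histogram into 42 coarser bins, then fit a biexponential with both lifetimes fixed so that only the fractional amplitude and scale are free.

```python
# Hedged sketch of the two analysis techniques evaluated: rebinning a 256-bin
# TCSPC decay to 42 bins, and fixing the two lifetimes so only the fractional
# amplitude is fitted. Simulated decay; no instrument response included.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 256)                       # ns, 256 time bins
tau1, tau2, a1, n_photons = 0.4, 2.5, 0.7, 700
ideal = a1 * np.exp(-t / tau1) + (1 - a1) * np.exp(-t / tau2)
decay = rng.poisson(ideal / ideal.sum() * n_photons)  # low-count data

# Temporal binning: 256 -> 42 bins (6 original bins merged per new bin).
m = (len(t) // 6) * 6
t_b = t[:m].reshape(-1, 6).mean(axis=1)
d_b = decay[:m].reshape(-1, 6).sum(axis=1)

def model(t, a1, scale):                          # lifetimes fixed at knowns
    return scale * (a1 * np.exp(-t / tau1) + (1 - a1) * np.exp(-t / tau2))

popt, _ = curve_fit(model, t_b, d_b, p0=[0.5, d_b.max()])
print(f"alpha1 = {popt[0]:.2f} (true {a1})")
```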
Time Series Data Analysis of Wireless Sensor Network Measurements of Temperature.
Bhandari, Siddhartha; Bergmann, Neil; Jurdak, Raja; Kusy, Branislav
2017-05-26
Wireless sensor networks have gained significant traction in environmental signal monitoring and analysis. The cost or lifetime of the system typically depends on the frequency at which environmental phenomena are monitored. If sampling rates are reduced, energy is saved. Using empirical datasets collected from environmental monitoring sensor networks, this work performs time series analyses of measured temperature time series. Unlike previous works which have concentrated on suppressing the transmission of some data samples by time-series analysis but still maintaining high sampling rates, this work investigates reducing the sampling rate (and sensor wake up rate) and looks at the effects on accuracy. Results show that the sampling period of the sensor can be increased up to one hour while still allowing intermediate and future states to be estimated with interpolation RMSE less than 0.2 °C and forecasting RMSE less than 1 °C.
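The evaluation logic reads as a short experiment: subsample a temperature series to a one-hour period, reconstruct the intermediate minutes by interpolation, and report RMSE. The sketch below does exactly that on synthetic diurnal data standing in for the sensor-network measurements.

```python
# Hedged sketch of the paper's evaluation logic: subsample a temperature
# series to a 1-hour period, reconstruct intermediate values by interpolation,
# and report RMSE. Synthetic diurnal data stand in for the real measurements.
import numpy as np
from scipy.interpolate import interp1d

rng = np.random.default_rng(5)
t_min = np.arange(0, 7 * 24 * 60)                       # one week, in minutes
temp = 20 + 5 * np.sin(2 * np.pi * t_min / (24 * 60)) \
          + 0.05 * rng.standard_normal(t_min.size)      # slow diurnal cycle

t_hr = t_min[::60]                                      # 1-hour sampling
f = interp1d(t_hr, temp[::60], kind="cubic")

valid = t_min[(t_min >= t_hr[0]) & (t_min <= t_hr[-1])]
rmse = np.sqrt(np.mean((f(valid) - temp[valid]) ** 2))
print(f"interpolation RMSE at 1 h sampling: {rmse:.3f} C")
```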
Reducing youth screen time: qualitative metasynthesis of findings on barriers and facilitators.
Minges, Karl E; Owen, Neville; Salmon, Jo; Chao, Ariana; Dunstan, David W; Whittemore, Robin
2015-04-01
An integrated perspective on the relevant qualitative findings on the experience of screen time in youth can inform the development of hypotheses to be tested in future research and can guide the development of interventions to decrease sedentary behavior. The purpose of this qualitative metasynthesis was to explore parent, youth, and educational professionals' perceptions of barriers to, and facilitators of, reducing youth screen time. Qualitative metasynthesis techniques were used to analyze and synthesize 15 qualitative studies of screen time among youth (11-18 years) meeting inclusion criteria. The phrases, quotes, and/or author interpretations (i.e., theme or subtheme) were recorded in a data display matrix to facilitate article comparisons. Codes were collapsed into 23 categories of similar conceptual meaning and 3 overarching themes were derived using thematic analysis procedures. Study sample sizes ranged from 6 to 270 participants from 6 countries. Data collection methods included focus groups (n = 6), interviews (n = 4), focus group and interviews (n = 4), and naturalistic observation (n = 1) with youth and/or parents. Data analysis techniques included thematic analysis (n = 9), content analysis (n = 3), grounded theory (n = 1), observation (n = 1), and interpretive phenomenological analysis (n = 1). Three thematic categories were identified: (a) youth's norms: screen time is an integral part of daily life, and facilitates opportunities for entertainment, social interaction, and escapism; (b) family dynamics and parental roles: parents are conflicted and send mixed messages about the appropriate uses and amounts of screen time; and (c) resources and environment: engagement in screen time is dependent on school, community, neighborhood, and home environmental contexts. Screen time is an established norm in many youth cultures, presenting barriers to behavior change. Parents recognize the importance of reducing youth screen time, but model and promote engagement themselves. For youth and parents, mutually agreed rules, limits, and parental monitoring of screen time were perceived as likely to be effective.
Spectral Analysis of the Effects of Daylight Saving Time on Motor Vehicle Fatal Traffic Accidents
DOT National Transportation Integrated Search
1977-04-01
This report shows that Daylight Saving Time (DST) reduces the number of persons killed in motor vehicle fatal traffic accidents by about one percent. This estimate is based on a spectral (Fourier) analysis of these fatalities which utilizes a filteri...
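A hedged sketch of this kind of spectral screen, on synthetic data: the spectrum of a daily fatality series exposes the strong weekly cycle that must be filtered out before a roughly one-percent DST effect could be estimated. The filtering step itself (truncated in the abstract) is not reproduced.

```python
# Hedged sketch of the kind of spectral (Fourier) screen described: examine
# the spectrum of a daily fatality series so strong weekly periodicity can be
# filtered out before estimating a DST effect. Data below are synthetic.
import numpy as np

rng = np.random.default_rng(6)
days = np.arange(3 * 365)
fatalities = 140 + 15 * np.sin(2 * np.pi * days / 7) \
               + 5 * rng.standard_normal(days.size)   # weekly cycle + noise

spec = np.abs(np.fft.rfft(fatalities - fatalities.mean())) ** 2
freqs = np.fft.rfftfreq(days.size, d=1.0)             # cycles per day

peak = freqs[np.argmax(spec)]
print(f"dominant period: {1 / peak:.1f} days")        # ~7-day cycle
```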
Hindricks, Gerhard; Varma, Niraj; Kacet, Salem; Lewalter, Thorsten; Søgaard, Peter; Guédon-Moreau, Laurence; Proff, Jochen; Gerds, Thomas A; Anker, Stefan D; Torp-Pedersen, Christian
2017-06-07
Remote monitoring of implantable cardioverter-defibrillators may improve clinical outcome. A recent meta-analysis of three randomized controlled trials (TRUST, ECOST, IN-TIME) using a specific remote monitoring system with daily transmissions [Biotronik Home Monitoring (HM)] demonstrated improved survival. We performed a patient-level analysis to verify this result with appropriate time-to-event statistics and to investigate further clinical endpoints. Individual data of the TRUST, ECOST, and IN-TIME patients were pooled to calculate absolute risks of endpoints at 1-year follow-up for HM vs. conventional follow-up. All-cause mortality analysis involved all three trials (2405 patients). Other endpoints involved two trials, ECOST and IN-TIME (1078 patients), in which an independent blinded endpoint committee adjudicated the underlying causes of hospitalizations and deaths. The absolute risk of death at 1 year was reduced by 1.9% in the HM group (95% CI: 0.1-3.8%; P = 0.037), equivalent to a risk ratio of 0.62. Also the combined endpoint of all-cause mortality or hospitalization for worsening heart failure (WHF) was significantly reduced (by 5.6%; P = 0.007; risk ratio 0.64). The composite endpoint of all-cause mortality or cardiovascular (CV) hospitalization tended to be reduced by a similar degree (4.1%; P = 0.13; risk ratio 0.85) but without statistical significance. In a pooled analysis of the three trials, HM reduced all-cause mortality and the composite endpoint of all-cause mortality or WHF hospitalization. The similar magnitudes of absolute risk reductions for WHF and CV endpoints suggest that the benefit of HM is driven by the prevention of heart failure exacerbation.
Challenges in reducing the computational time of QSTS simulations for distribution system analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.
The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
Uei, Shu-Lin; Tsai, Chung-Hung; Kuo, Yu-Ming
2016-04-29
Telehealth cost analysis has become a crucial issue for governments in recent years. In this study, we examined cases of metabolic syndrome in Hualien County, Taiwan. This research adopted the framework proposed by Marchand to establish a study process. In addition, descriptive statistics, a t test, analysis of variance, and regression analysis were employed to analyze 100 questionnaires. The results of the t test revealed significant differences in medical health expenditure, number of clinical visits for medical treatment, average amount of time spent commuting to clinics, amount of time spent undergoing medical treatment, and average number of people accompanying patients to medical care facilities or assisting with other tasks over the past month, indicating that offering telehealth care services can reduce health expenditure. The statistical analysis results revealed that customer satisfaction has a positive effect on reducing health expenditure. Therefore, this study shows that telehealth care systems can effectively reduce health expenditure and directly improve customer satisfaction with medical treatment.
Li, Ye; Wang, Hao; Wang, Wei; Xing, Lu; Liu, Shanwen; Wei, Xueyan
2017-01-01
Although many studies have recently examined the impacts of cooperative adaptive cruise control (CACC) systems on traffic efficiency, few have analyzed the safety effects of this advanced driving-assistant system. Thus, the primary objective of this study is to evaluate the impacts of the CACC system on reducing rear-end collision risks on freeways. The CACC model is first developed, based on the Intelligent Driver Model (IDM). Then, two surrogate safety measures derived from the time-to-collision (TTC), namely time exposed time-to-collision (TET) and time integrated time-to-collision (TIT), are introduced for quantifying the collision risks. The safety effects are analyzed both theoretically and experimentally, by linear stability analysis and simulations. The theoretical and simulation results consistently indicate that the CACC system brings dramatic benefits for reducing rear-end collision risks (TET and TIT are reduced by more than 90%, respectively), when the desired time headway and time delay are set properly. The sensitivity analysis indicates there are few differences among different values of the TTC threshold and the length of a CACC platoon. The results also show that the safety improvements weaken with decreasing market penetration rates of CACC and increasing time delay between platoons. We also evaluate the traffic efficiency of the CACC system with different desired time headways.
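The surrogate measures can be computed from follower and leader trajectories using their common definitions, as the sketch below does with an invented car-following scenario: TTC is gap over closing speed while closing, TET accumulates the time TTC spends below a threshold, and TIT integrates the shortfall.

```python
# Hedged sketch of the surrogate safety measures using their common
# definitions: TTC = gap / closing speed when closing; TET accumulates time
# spent below a TTC threshold; TIT integrates the shortfall below it.
import numpy as np

def tet_tit(gap, v_follow, v_lead, dt, ttc_star=3.0):
    closing = v_follow - v_lead
    ttc = np.where(closing > 0, gap / np.maximum(closing, 1e-9), np.inf)
    risky = ttc < ttc_star
    tet = np.sum(risky) * dt                       # time exposed (s)
    tit = np.sum((ttc_star - ttc)[risky]) * dt     # integrated shortfall (s^2)
    return tet, tit

# Illustrative trajectory: follower closes in on a decelerating leader.
dt = 0.1
t = np.arange(0, 15, dt)
v_lead = 20.0 - 0.5 * t                            # leader decelerating (m/s)
v_follow = np.full_like(t, 22.0)                   # follower, constant speed
gap = 100 + np.cumsum((v_lead - v_follow) * dt)    # bumper-to-bumper gap (m)

print(tet_tit(gap, v_follow, v_lead, dt))
```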
Optimizing a Drone Network to Deliver Automated External Defibrillators.
Boutilier, Justin J; Brooks, Steven C; Janmohamed, Alyf; Byers, Adam; Buick, Jason E; Zhan, Cathy; Schoellig, Angela P; Cheskes, Sheldon; Morrison, Laurie J; Chan, Timothy C Y
2017-06-20
Public access defibrillation programs can improve survival after out-of-hospital cardiac arrest, but automated external defibrillators (AEDs) are rarely available for bystander use at the scene. Drones are an emerging technology that can deliver an AED to the scene of an out-of-hospital cardiac arrest for bystander use. We hypothesize that a drone network designed with the aid of a mathematical model combining both optimization and queuing can reduce the time to AED arrival. We applied our model to 53,702 out-of-hospital cardiac arrests that occurred in the 8 regions of the Toronto Regional RescuNET between January 1, 2006, and December 31, 2014. Our primary analysis quantified the drone network size required to deliver an AED 1, 2, or 3 minutes faster than historical median 911 response times for each region independently. A secondary analysis quantified the reduction in drone resources required if RescuNET was treated as a large coordinated region. The region-specific analysis determined that 81 bases and 100 drones would be required to deliver an AED ahead of median 911 response times by 3 minutes. In the most urban region, the 90th percentile of the AED arrival time was reduced by 6 minutes and 43 seconds relative to historical 911 response times in the region. In the most rural region, the 90th percentile was reduced by 10 minutes and 34 seconds. A single coordinated drone network across all regions required 39.5% fewer bases and 30.0% fewer drones to achieve similar AED delivery times. An optimized drone network designed with the aid of a novel mathematical model can substantially reduce the AED delivery time to an out-of-hospital cardiac arrest event.
Error analysis of multipoint flux domain decomposition methods for evolutionary diffusion problems
NASA Astrophysics Data System (ADS)
Arrarás, A.; Portero, L.; Yotov, I.
2014-01-01
We study space and time discretizations for mixed formulations of parabolic problems. The spatial approximation is based on the multipoint flux mixed finite element method, which reduces to an efficient cell-centered pressure system on general grids, including triangles, quadrilaterals, tetrahedra, and hexahedra. The time integration is performed by using a domain decomposition time-splitting technique combined with multiterm fractional step diagonally implicit Runge-Kutta methods. The resulting scheme is unconditionally stable and computationally efficient, as it reduces the global system to a collection of uncoupled subdomain problems that can be solved in parallel without the need for Schwarz-type iteration. Convergence analysis for both the semidiscrete and fully discrete schemes is presented.
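For reference, the mixed formulation of the parabolic model problem that the method discretizes takes the standard form below (notation is ours, not necessarily the paper's):

```latex
% Standard mixed form of the parabolic model problem on a domain $\Omega$.
\begin{align*}
  \mathbf{u} &= -K \nabla p && \text{in } \Omega \times (0,T], \\
  \frac{\partial p}{\partial t} + \nabla \cdot \mathbf{u} &= f
     && \text{in } \Omega \times (0,T], \\
  p &= g && \text{on } \partial\Omega \times (0,T], \qquad
  p = p_0 \ \text{at } t = 0.
\end{align*}
```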
Nonlinear Reduced-Order Analysis with Time-Varying Spatial Loading Distributions
NASA Technical Reports Server (NTRS)
Prezekop, Adam
2008-01-01
Oscillating shocks acting in combination with high-intensity acoustic loadings present a challenge to the design of resilient hypersonic flight vehicle structures. This paper addresses some features of this loading condition and certain aspects of a nonlinear reduced-order analysis with emphasis on system identification leading to formation of a robust modal basis. The nonlinear dynamic response of a composite structure subject to the simultaneous action of locally strong oscillating pressure gradients and high-intensity acoustic loadings is considered. The reduced-order analysis used in this work has been previously demonstrated to be both computationally efficient and accurate for time-invariant spatial loading distributions, provided that an appropriate modal basis is used. The challenge of the present study is to identify a suitable basis for loadings with time-varying spatial distributions. Using a proper orthogonal decomposition and modal expansion, it is shown that such a basis can be developed. The basis is made more robust by incrementally expanding it to account for changes in the location, frequency and span of the oscillating pressure gradient.
Reducing Response Time Bounds for DAG-Based Task Systems on Heterogeneous Multicore Platforms
2016-01-01
Lewars, Brittany; Hurst, Samantha; Crist, Katie; Nebeker, Camille; Madanat, Hala; Nichols, Jeanne; Rosenberg, Dori E; Kerr, Jacqueline
2018-01-01
Background Recent epidemiological evidence indicates that, on average, people are sedentary for approximately 7.7 hours per day. There are deleterious effects of prolonged sedentary behavior that are separate from participation in physical activity and include increased risk of weight gain, cancer, metabolic syndrome, diabetes, and heart disease. Previous trials have used wearable devices to increase physical activity in studies; however, additional research is needed to fully understand how this technology can be used to reduce sitting time. Objective The purpose of this study was to explore the potential of wearable devices as an intervention tool in a larger sedentary behavior study through a general inductive and deductive analysis of focus group discussions. Methods We conducted four focus groups with 15 participants to discuss 7 different wearable devices with sedentary behavior capabilities. Participants recruited for the focus groups had previously participated in a pilot intervention targeting sedentary behavior over a 3-week period and were knowledgeable about the challenges of reducing sitting time. During the focus groups, participants commented on the wearability, functionality, and feedback mechanism of each device and then identified their two favorite and two least favorite devices. Finally, participants designed and described their ideal or dream wearable device. Two researchers, who have expertise analyzing qualitative data, coded and analyzed the data from the focus groups. A thematic analysis approach using Dedoose software (SocioCultural Research Consultants, LLC version 7.5.9) guided the organization of themes that reflected participants’ perspectives. Results Analysis resulted in 14 codes that we grouped into themes. Three themes emerged from our data: (1) features of the device, (2) data the device collected, and (3) how data are displayed. Conclusions Current wearable devices for increasing physical activity are insufficient to intervene on sitting time. This was especially evident when participants voted, as several participants reported using a “process of elimination” as opposed to choosing favorites because none of the devices were ideal for reducing sitting time. To overcome the limitations in current devices, future wearable devices designed to reduce sitting time should include the following features: waterproof, long battery life, accuracy in measuring sitting time, real time feedback on progress toward sitting reduction goals, and flexible options for prompts to take breaks from sitting. PMID:29599105
Prognostic significance of blood coagulation tests in carcinoma of the lung and colon.
Wojtukiewicz, M Z; Zacharski, L R; Moritz, T E; Hur, K; Edwards, R L; Rickles, F R
1992-08-01
Blood coagulation test results were collected prospectively in patients with previously untreated, advanced lung or colon cancer who entered into a clinical trial. In patients with colon cancer, reduced survival was associated (in univariate analysis) with higher values obtained at entry to the study for fibrinogen, fibrin(ogen) split products, antiplasmin, and fibrinopeptide A and accelerated euglobulin lysis times. In patients with non-small cell lung cancer, reduced survival was associated (in univariate analysis) with higher fibrinogen and fibrin(ogen) split products, platelet counts and activated partial thromboplastin times. In patients with small cell carcinoma of the lung, only higher activated partial thromboplastin times were associated (in univariate analysis) with reduced survival in patients with disseminated disease. In multivariate analysis, higher activated partial thromboplastin times were a significant independent predictor of survival for patients with non-small cell lung cancer limited to one hemithorax and with disseminated small cell carcinoma of the lung. Fibrin(ogen) split product levels were an independent predictor of survival for patients with disseminated non-small cell lung cancer as were both the fibrinogen and fibrinopeptide A levels for patients with disseminated colon cancer. These results suggest that certain tests of blood coagulation may be indicative of prognosis in lung and colon cancer. The heterogeneity of these results suggests that the mechanism(s), intensity, and pathophysiological significance of coagulation activation in cancer may differ between tumour types.
Why fibers are better turbulent drag reducing agents than polymers
NASA Astrophysics Data System (ADS)
Boelens, Arnout; Muthukumar, Murugappan
2016-11-01
It is typically reported in the literature that fibers are less effective drag reducing agents than polymers. However, at low concentrations, when charged polymers are added to either distilled or salt water, polymers showing rod-like behavior are found to be better drag reducing agents than polymers showing coil-like behavior. In this study, using hybrid Direct Numerical Simulation with Langevin dynamics, a comparison is performed between polymer and fiber stress tensors in turbulent flow. The stress tensors are found to be similar, suggesting a common drag reducing mechanism in the onset regime. Since fibers do not have an elastic backbone, this must be a viscous effect. Analysis of the viscosity tensor reveals that all terms are negligible, except the off-diagonal shear viscosity associated with rotation. Based on this analysis, we are able to explain why charged polymers showing rod-like behavior are better drag reducing agents than polymers showing coil-like behavior. Additionally, we identify the rotational orientation time as the unifying time scale setting a new time criterion for drag reduction by both flexible polymers and rigid fibers. This research was supported by NSF Grant No. DMR-1404940 and AFOSR Grant No. FA9550-14-1-0164.
Meng, Yutong; Li, Zhirui; Gong, Ke; An, Xiao; Dong, Jiyuan; Tang, Peifu
2018-01-01
Obesity can result in increased blood loss, which is correlated with poor prognosis in total knee arthroplasty (TKA). Clinical application of tranexamic acid is effective in reducing blood loss in TKA. However, most previous studies focused on the effect of tranexamic acid in the whole population, neglecting patients with specific health conditions, such as obesity. We hypothesized that tranexamic acid would reduce blood loss to a greater extent in obese patients than in those of normal weight. A total of 304 patients with knee osteoarthritis treated with TKA from October 2013 to March 2015 were separated into tranexamic, non-tranexamic, obese, and non-obese groups. The demographic characteristics, surgical indices, and hematological indices were all recorded. We first investigated the ability of intravenous tranexamic acid to reduce intraoperative blood loss in knee osteoarthritis patients undergoing unilateral TKA. Second, we performed subgroup analysis to compare the effects of tranexamic acid between obese and non-obese patients separately. Of the 304 patients, 146 (52.0%) received tranexamic acid and 130 (42.8%) were obese. In the analysis of the whole group, both the actual and occult blood loss volume were lower in the tranexamic acid group (both P < 0.05). Tourniquet time was shorter in the tranexamic acid group (P < 0.05). In subgroup analysis, tranexamic acid was shown to reduce theoretical and actual blood loss in both the obese and non-obese groups (P < 0.05). Tranexamic acid reduced occult blood loss and tourniquet time in the obese group (P < 0.05), while no such effects were observed in the non-obese group (P > 0.05). Tranexamic acid can reduce occult blood loss and tourniquet time in obese patients to a greater extent than in patients of normal weight. Therefore, obese knee osteoarthritis patients undergoing TKA can benefit more from tranexamic acid.
Gong, Ke; An, Xiao; Dong, Jiyuan; Tang, Peifu
2018-01-01
Purpose Obesity can result in increased blood loss, which is correlated with poor prognosis in total knee arthroplasty (TKA). Clinical application of tranexamic acid is effective in reducing blood loss in TKA. However, most previous studies focused on the effect of tranexamic acid in the whole population, neglecting patients with specific health conditions, such as obesity. We hypothesized that tranexamic acid would reduce blood loss to a greater extent in obese patients than in those of normal weight. Patients and methods A total of 304 patients with knee osteoarthritis treated with TKA from October 2013 to March 2015 were separated into tranexamic, non-tranexamic, obese, and non-obese groups. The demographic characteristics, surgical indices, and hematological indices were all recorded. We first investigated the ability of intravenous tranexamic acid to reduce intraoperative blood loss in knee osteoarthritis patients undergoing unilateral TKA. Second, we performed subgroup analysis to compare the effects of tranexamic acid between obese and non-obese patients separately. Results Of the 304 patients, 146 (52.0%) received tranexamic acid and 130 (42.8%) were obese. In the analysis of the whole group, both the actual and occult blood loss volume were lower in the tranexamic acid group (both P < 0.05). Tourniquet time was shorter in the tranexamic acid group (P < 0.05). In subgroup analysis, tranexamic acid was shown to reduce theoretical and actual blood loss in both the obese and non-obese groups (P < 0.05). Tranexamic acid reduced occult blood loss and tourniquet time in the obese group (P < 0.05), while no such effects were observed in the non-obese group (P > 0.05). Conclusion Tranexamic acid can reduce occult blood loss and tourniquet time in obese patients to a greater extent than in patients of normal weight. Therefore, obese knee osteoarthritis patients undergoing TKA can benefit more from tranexamic acid. PMID:29695912
Using operations research to plan improvement of the transport of critically ill patients.
Chen, Jing; Awasthi, Anjali; Shechter, Steven; Atkins, Derek; Lemke, Linda; Fisher, Les; Dodek, Peter
2013-01-01
Operations research is the application of mathematical modeling, statistical analysis, and mathematical optimization to understand and improve processes in organizations. The objective of this study was to illustrate how the methods of operations research can be used to identify opportunities to reduce the absolute value and variability of interfacility transport intervals for critically ill patients. After linking data from two patient transport organizations in British Columbia, Canada, for all critical care transports during the calendar year 2006, the steps for transfer of critically ill patients were tabulated into a series of time intervals. Statistical modeling, root-cause analysis, Monte Carlo simulation, and sensitivity analysis were used to test the effect of changes in component intervals on overall duration and variation of transport times. Based on quality improvement principles, we focused on reducing the 75th percentile and standard deviation of these intervals. We analyzed a total of 3808 ground and air transports. Constraining time spent by transport personnel at sending and receiving hospitals was projected to reduce the total time taken by 33 minutes with as much as a 20% reduction in standard deviation of these transport intervals in 75% of ground transfers. Enforcing a policy of requiring acceptance of patients who have life- or limb-threatening conditions or organ failure was projected to reduce the standard deviation of air transport time by 63 minutes and the standard deviation of ground transport time by 68 minutes. Based on findings from our analyses, we developed recommendations for technology renovation, personnel training, system improvement, and policy enforcement. Use of the tools of operations research identifies opportunities for improvement in a complex system of critical care transport.
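The Monte Carlo piece of the analysis above, sampling component intervals and testing how constraining one step shifts the 75th percentile and standard deviation of total transport time, can be sketched as follows. The lognormal interval models and the 45-minute cap are illustrative assumptions, not values fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # simulated transports

# Hypothetical lognormal component intervals (minutes); not the study's fitted models.
dispatch = rng.lognormal(mean=2.5, sigma=0.4, size=N)
travel   = rng.lognormal(mean=3.5, sigma=0.5, size=N)
on_scene = rng.lognormal(mean=3.8, sigma=0.6, size=N)  # time at sending/receiving hospitals

def summarize(total):
    # Quality-improvement targets used in the abstract: 75th percentile and SD.
    return np.percentile(total, 75), total.std()

base_p75, base_sd = summarize(dispatch + travel + on_scene)

# Candidate policy: cap time spent at hospitals at 45 minutes (illustrative).
capped_p75, capped_sd = summarize(dispatch + travel + np.minimum(on_scene, 45.0))

print(f"75th percentile: {base_p75:.1f} -> {capped_p75:.1f} min")
print(f"std deviation:   {base_sd:.1f} -> {capped_sd:.1f} min")
```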
NASA Technical Reports Server (NTRS)
Meyers, Steven D.; Kelly, B. G.; O'Brien, J. J.
1993-01-01
Wavelet analysis is a relatively new technique that is an important addition to standard signal analysis methods. Unlike Fourier analysis, which yields an average amplitude and phase for each harmonic in a dataset, the wavelet transform produces an instantaneous estimate or local value for the amplitude and phase of each harmonic. This allows detailed study of nonstationary spatial or time-dependent signal characteristics. The wavelet transform is discussed, examples are given, and some methods for preprocessing data for wavelet analysis are compared. The usefulness of the transform is demonstrated by studying the dispersion of Yanai waves in a reduced-gravity equatorial model. The group velocity is measured directly over a finite range of wavenumbers by examining the time evolution of the transform. The results agree well with linear theory at higher wavenumbers, but the measured group velocity is reduced at lower wavenumbers, possibly due to interaction with the basin boundaries.
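A minimal sketch of the local amplitude and phase estimates described above, using the PyWavelets continuous wavelet transform with a complex Morlet wavelet on a synthetic nonstationary signal (the signal, wavelet name, and scales are arbitrary choices for illustration, not from the paper):

```python
import numpy as np
import pywt

fs = 100.0                      # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Nonstationary test signal whose instantaneous frequency drifts upward from ~2 Hz.
x = np.sin(2 * np.pi * (2 + 0.6 * t) * t)

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(x, scales, "cmor1.5-1.0", sampling_period=1 / fs)

amplitude = np.abs(coeffs)      # local amplitude per scale and time instant
phase = np.angle(coeffs)        # local phase per scale and time instant

# Ridge: the scale of maximum amplitude at each instant tracks the local frequency.
ridge_freq = freqs[amplitude.argmax(axis=0)]
print(ridge_freq[::200])        # should drift upward over time
```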
NASA Astrophysics Data System (ADS)
Elizar, Suripin, Wibowo, Mochamad Agung
2017-11-01
Delays on construction sites occur due to the systematic accumulation of time waste across the activities that make up the construction process. Work-time waste is non-value-adding activity; the term distinguishes it from the physical construction waste found on site and from other waste that occurs during the construction process. The aim of this study is to identify work-time waste using the concept of Value Stream Mapping (VSM), as applied in smart construction management. VSM analysis is a business process improvement method that originated in the manufacturing community. The research method is based on a theoretically informed case study and a literature review. Data were collected by questionnaire through personal interviews with 383 respondents on construction projects in Indonesia. The results show that the VSM concept can identify the causes of work-time waste. Based on the questionnaire results and a quantitative analysis, 29 variables influencing work-time waste (non-value-adding activities) were obtained. In three construction project cases, an average of 14.88% of working time was classified as waste. Finally, the VSM concept offers a systematic way to reveal current practices and opportunities for improvement in the face of global challenges. Value stream mapping can help reduce work-time waste and improve the quality standards of construction management, and it can help managers make decisions to reduce work-time waste so as to achieve more efficient performance and sustainable construction projects.
[Computer-supported patient history: a workplace analysis].
Schubiger, G; Weber, D; Winiker, H; Desgrandchamps, D; Imahorn, P
1995-04-29
Since 1991, an extensive computer network has been developed and implemented at the Cantonal Hospital of Lucerne. The medical applications include computer aided management of patient charts, medical correspondence, and compilation of diagnosis statistics according to the ICD-9 code. In 1992, the system was introduced as a pilot project in the departments of pediatrics and pediatric surgery of the Lucerne Children's Hospital. This new system has been prospectively evaluated using a workplace analysis. The time taken to complete patient charts and surgical reports was recorded for 14 days before and after the introduction of the computerized system. This analysis was performed for both physicians and secretarial staff. The time delay between the discharge of the patient and the mailing of the discharge letter to the family doctor was also recorded. By conventional means, the average time for the physician to generate a patient chart (26 minutes, n = 119) was slightly lower than the time needed with the computer system (28 minutes, n = 177). However, for a discharge letter, the time needed by the physician was reduced by one third with the computer system and by more than one half for the secretarial staff (32 and 66 minutes conventionally; 22 and 24 minutes respectively with the computer system; p < 0.0001). The time required for the generation of surgical reports was reduced from 17 to 13 minutes per patient and the processing time by secretaries from 37 to 14 minutes. The time delay between the discharge of the patient and the mailing of the discharge letter was reduced by 50% from 7.6 to 3.9 days.(ABSTRACT TRUNCATED AT 250 WORDS)
Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis
NASA Astrophysics Data System (ADS)
Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.
We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach on time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis does. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
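The recurrence-plot-to-network pipeline that RDE-CN builds on can be sketched in its basic, non-density-enhanced form: threshold a distance matrix of logistic-map states into a recurrence matrix, read it as an adjacency matrix, and compute network statistics. The threshold rule below is a common convention, not the paper's RDE step.

```python
import numpy as np

# Logistic map time series, x_{n+1} = r * x_n * (1 - x_n)
r, n = 3.9, 500
x = np.empty(n)
x[0] = 0.4
for i in range(n - 1):
    x[i + 1] = r * x[i] * (1 - x[i])

# Recurrence matrix: states closer than eps are "recurrent".
dist = np.abs(x[:, None] - x[None, :])
eps = 0.1 * x.std()                      # common rule of thumb, not from the paper
R = (dist < eps).astype(int)
np.fill_diagonal(R, 0)                   # drop self-loops

# Interpret R as an adjacency matrix and characterize the resulting network.
degree = R.sum(axis=1)
density = R.sum() / (n * (n - 1))
print(f"mean degree {degree.mean():.1f}, edge density {density:.3f}")
```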
Li, Xueqi; Woodman, Michael; Wang, Selina C
2015-08-01
Pheophytins and pyropheophytin are degradation products of chlorophyll pigments, and their ratios can be used as a sensitive indicator of stress during the manufacturing and storage of olive oil. They increase over time depending on the storage conditions and on whether the oil is exposed to heat treatments during the refining process. The traditional analysis method includes solvent- and time-consuming steps of solid-phase extraction followed by analysis by high-performance liquid chromatography with ultraviolet detection. We developed an improved dilute/fluorescence method in which the multi-step sample preparation was replaced by a simple isopropanol dilution before the high-performance liquid chromatography injection. A quaternary solvent gradient method was used to include a fourth strong solvent wash on a quaternary gradient pump, which avoided the need to premix any solvents and greatly reduced the oil residues left on the column from previous analyses. This new method not only reduces analysis cost and time but also shows reliability, repeatability, and improved sensitivity, which is especially important for low-level samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Chen, Yi-He; Lin, Hui; Xie, Cheng-Long; Zhang, Xiao-Ting; Li, Yi-Gang
2015-06-01
We perform this meta-analysis to compare the efficacy and safety of cryoablation versus radiofrequency ablation for patients with cavotricuspid valve isthmus dependent atrial flutter. By searching the EMBASE, MEDLINE, PubMed and Cochrane electronic databases from March 1986 to September 2014, 7 randomized clinical trials were included. Acute (risk ratio [RR]: 0.93; P = 0.14) and long-term (RR: 0.94; P = 0.08) success rates were slightly lower in the cryoablation group than in the radiofrequency ablation group, but the difference was not statistically significant. Additionally, the fluoroscopy time was nonsignificantly reduced (weighted mean difference [WMD]: -2.83; P = 0.29), whereas procedure time was significantly longer (WMD: 25.95; P = 0.01) in the cryoablation group compared with the radiofrequency ablation group. Furthermore, pain perception during catheter ablation was substantially less in the cryoablation group than in the radiofrequency ablation group (standardized mean difference [SMD]: -2.36; P < 0.00001). Thus, our meta-analysis demonstrated that cryoablation and radiofrequency ablation produce comparable acute and long-term success rates for patients with cavotricuspid valve isthmus dependent atrial flutter. Meanwhile, cryoablation tends to reduce the fluoroscopy time and significantly reduces pain perception, at the cost of significantly prolonged procedure time.
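As a sketch of the pooling arithmetic behind results such as the weighted mean difference for procedure time, here is a generic inverse-variance fixed-effect combination; the per-trial numbers are invented and are not the seven trials of this meta-analysis.

```python
import numpy as np

# Hypothetical per-trial mean differences in procedure time (min) and their SEs.
md = np.array([30.0, 22.5, 18.0, 27.0])
se = np.array([8.0, 6.5, 9.0, 7.0])

w = 1.0 / se**2                      # inverse-variance weights
wmd = np.sum(w * md) / np.sum(w)     # pooled weighted mean difference
se_pooled = np.sqrt(1.0 / np.sum(w))
ci = (wmd - 1.96 * se_pooled, wmd + 1.96 * se_pooled)
print(f"WMD = {wmd:.2f} min, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```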
Chen, Yi-He; Lin, Hui; Xie, Cheng-Long; Zhang, Xiao-Ting; Li, Yi-Gang
2015-01-01
We perform this meta-analysis to compare the efficacy and safety of cryoablation versus radiofrequency ablation for patients with cavotricuspid valve isthmus dependent atrial flutter. By searching the EMBASE, MEDLINE, PubMed and Cochrane electronic databases from March 1986 to September 2014, 7 randomized clinical trials were included. Acute (risk ratio [RR]: 0.93; P = 0.14) and long-term (RR: 0.94; P = 0.08) success rates were slightly lower in the cryoablation group than in the radiofrequency ablation group, but the difference was not statistically significant. Additionally, the fluoroscopy time was nonsignificantly reduced (weighted mean difference [WMD]: −2.83; P = 0.29), whereas procedure time was significantly longer (WMD: 25.95; P = 0.01) in the cryoablation group compared with the radiofrequency ablation group. Furthermore, pain perception during catheter ablation was substantially less in the cryoablation group than in the radiofrequency ablation group (standardized mean difference [SMD]: −2.36; P < 0.00001). Thus, our meta-analysis demonstrated that cryoablation and radiofrequency ablation produce comparable acute and long-term success rates for patients with cavotricuspid valve isthmus dependent atrial flutter. Meanwhile, cryoablation tends to reduce the fluoroscopy time and significantly reduces pain perception, at the cost of significantly prolonged procedure time. PMID:26039980
Lost in a Giant Database: The Potentials and Pitfalls of Secondary Analysis for Deaf Education
ERIC Educational Resources Information Center
Kluwin, T. N.; Morris, C. S.
2006-01-01
Secondary research or archival research is the analysis of data collected by another person or agency. It offers several advantages, including reduced cost, a less time-consuming research process, and access to larger populations and thus greater generalizability. At the same time, it has several limitations, including the fact that the…
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2002-09-30
Samson, Scott; Center for Ocean Technology. OBJECTIVES: The project's objective is to develop automated image analysis software to reduce the effort and time
Production Time Loss Reduction in Sauce Production Line by Lean Six Sigma Approach
NASA Astrophysics Data System (ADS)
Ritprasertsri, Thitima; Chutima, Parames
2017-06-01
In all industries, time losses incurred in processing are very important; they result in lost productivity and added cost. This research aimed to reduce the lost time that occurs in a sauce production line by using the lean six sigma approach. The main objective was to reduce the time for heating sauce, which accounts for much of the time lost in the production line and affects productivity. The methodology comprised the five-phase improvement model of Six Sigma: the defining, measuring, analysing, improving and controlling phases. A cause-and-effect matrix and failure mode and effect analysis (FMEA) were adopted to screen the factors which affect production time loss. The results showed that the percentage of time lost to heating sauce was reduced by 47.76%, which increased productivity to meet the plan.
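The FMEA screening step mentioned above conventionally ranks failure modes by a risk priority number, RPN = severity x occurrence x detection, each scored 1 to 10. A minimal sketch with invented failure modes and scores (not the study's actual data):

```python
# Hypothetical failure modes for the sauce-heating step; 1-10 scores per FMEA convention.
modes = [
    # (failure mode, severity, occurrence, detection)
    ("steam supply fluctuation", 7, 6, 4),
    ("undersized heat exchanger", 8, 3, 5),
    ("late batch changeover",     5, 7, 3),
]

# Rank by RPN so improvement effort targets the riskiest failure modes first.
ranked = sorted(modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:4d}  {name}")
```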
Reduced-Order Aerothermoelastic Analysis of Hypersonic Vehicle Structures
NASA Astrophysics Data System (ADS)
Falkiewicz, Nathan J.
Design and simulation of hypersonic vehicles require consideration of a variety of disciplines due to the highly coupled nature of the flight regime. In order to capture all of the potential effects on vehicle dynamics, one must consider the aerodynamics, aerodynamic heating, heat transfer, and structural dynamics as well as the interactions between these disciplines. The problem is further complicated by the large computational expense involved in capturing all of these effects and their interactions in a full-order sense. While high-fidelity modeling techniques exist for each of these disciplines, the use of such techniques is computationally infeasible in a vehicle design and control system simulation setting for such a highly coupled problem. Early in the design stage, many iterations of analyses may need to be carried out as the vehicle design matures, thus requiring quick analysis turnaround time. Additionally, the number of states used in the analyses must be small enough to allow for efficient control simulation and design. As a result, alternatives to full-order models must be considered. This dissertation presents a fully coupled, reduced-order aerothermoelastic framework for the modeling and analysis of hypersonic vehicle structures. The reduced-order transient thermal solution is a modal solution based on the proper orthogonal decomposition. The reduced-order structural dynamic model is based on projection of the equations of motion onto a Ritz modal subspace that is identified a priori. The reduced-order models are assembled into a time-domain aerothermoelastic simulation framework which uses a partitioned time-marching scheme to account for the disparate time scales of the associated physics. The aerothermoelastic modeling framework is outlined, along with the formulations associated with the unsteady aerodynamics, aerodynamic heating, transient thermal response, and structural dynamics. Results demonstrate the accuracy of the reduced-order transient thermal and structural dynamic models under variation in boundary conditions and flight conditions. The framework is applied to representative hypersonic vehicle control surface structures and a variety of studies are conducted to assess the impact of aerothermoelastic effects on hypersonic vehicle dynamics. The results presented in this dissertation demonstrate the ability of the proposed framework to perform efficient aerothermoelastic analysis.
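The snapshot form of the proper orthogonal decomposition used for the reduced-order thermal model amounts to an SVD of a matrix of full-order solution snapshots; a generic sketch with synthetic low-rank data standing in for thermal fields:

```python
import numpy as np

# Snapshot matrix: each column is a full-order temperature field at one time.
n_dof, n_snap = 2000, 60
rng = np.random.default_rng(1)
# Synthetic snapshots with low-rank structure plus noise (stand-in for real data).
modes_true = rng.standard_normal((n_dof, 3))
S = modes_true @ rng.standard_normal((3, n_snap)) \
    + 0.01 * rng.standard_normal((n_dof, n_snap))

U, sv, _ = np.linalg.svd(S, full_matrices=False)
energy = np.cumsum(sv**2) / np.sum(sv**2)
k = int(np.searchsorted(energy, 0.999)) + 1   # keep modes capturing 99.9% of energy
Phi = U[:, :k]                                # POD basis

# Reduced coordinates: T(t) ~ Phi @ q(t), with q = Phi^T T.
q = Phi.T @ S
err = np.linalg.norm(S - Phi @ q) / np.linalg.norm(S)
print(f"retained {k} of {n_snap} modes; reconstruction error {err:.2e}")
```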
Mediation analysis with time varying exposures and mediators
VanderWeele, Tyler J.; Tchetgen Tchetgen, Eric J.
2016-01-01
Summary In this paper we consider causal mediation analysis when exposures and mediators vary over time. We give non-parametric identification results, discuss parametric implementation, and also provide a weighting approach to direct and indirect effects based on combining the results of two marginal structural models. We also discuss how our results give rise to a causal interpretation of the effect estimates produced from longitudinal structural equation models. When there are time-varying confounders affected by prior exposure and mediator, natural direct and indirect effects are not identified. However, we define a randomized interventional analogue of natural direct and indirect effects that are identified in this setting. The formula that identifies these effects we refer to as the “mediational g-formula.” When there is no mediation, the mediational g-formula reduces to Robins’ regular g-formula for longitudinal data. When there are no time-varying confounders affected by prior exposure and mediator values, then the mediational g-formula reduces to a longitudinal version of Pearl’s mediation formula. However, the mediational g-formula itself can accommodate both mediation and time-varying confounders and constitutes a general approach to mediation analysis with time-varying exposures and mediators. PMID:28824285
Mediation analysis with time varying exposures and mediators.
VanderWeele, Tyler J; Tchetgen Tchetgen, Eric J
2017-06-01
In this paper we consider causal mediation analysis when exposures and mediators vary over time. We give non-parametric identification results, discuss parametric implementation, and also provide a weighting approach to direct and indirect effects based on combining the results of two marginal structural models. We also discuss how our results give rise to a causal interpretation of the effect estimates produced from longitudinal structural equation models. When there are time-varying confounders affected by prior exposure and mediator, natural direct and indirect effects are not identified. However, we define a randomized interventional analogue of natural direct and indirect effects that are identified in this setting. The formula that identifies these effects we refer to as the "mediational g-formula." When there is no mediation, the mediational g-formula reduces to Robins' regular g-formula for longitudinal data. When there are no time-varying confounders affected by prior exposure and mediator values, then the mediational g-formula reduces to a longitudinal version of Pearl's mediation formula. However, the mediational g-formula itself can accommodate both mediation and time-varying confounders and constitutes a general approach to mediation analysis with time-varying exposures and mediators.
2014-01-01
Background There is growing evidence suggesting that prolonged sitting has negative effects on people’s weight, chronic diseases and mortality. Interventions to reduce sedentary time can be an effective strategy to increase daily energy expenditure. The purpose of this study is to evaluate the effectiveness of a six-month primary care intervention to reduce daily sitting time in overweight and mildly obese sedentary patients. Method/Design The study is a randomized controlled trial (RCT). Professionals from thirteen primary health care centers (PHC) will randomly invite mildly obese or overweight patients of both genders, aged between 25 and 65 years old, who spend at least 6 hours a day sitting, to participate. A total of 232 subjects will be randomly allocated to an intervention (IG) and control group (CG) (116 individuals per group). In addition, 50 subjects with fibromyalgia will be included. The primary outcome is: (1) sitting time using the activPAL device and the Marshall questionnaire. The following parameters will also be assessed: (2) sitting time in the workplace (Occupational Sitting and Physical Activity Questionnaire), (3) health-related quality of life (EQ-5D), (4) evolution of stage of change (Prochaska and DiClemente's Stages of Change Model), (5) physical inactivity (Catalan version of the Brief Physical Activity Assessment Tool), (6) number of steps walked (pedometer and activPAL), (7) control based on blood analysis (triglycerides, total cholesterol, HDL, LDL, glycemia and, in diabetic patients, glycated haemoglobin) and (8) blood pressure and anthropometric variables. All parameters will be assessed pre and post intervention, with follow-up three, six and twelve months after the intervention. A descriptive analysis of all variables and a multivariate analysis to assess differences among groups will be undertaken. Multivariate analysis will be carried out to assess time changes of dependent variables. All analyses will be done under the intention-to-treat principle. Discussion If the SEDESTACTIV intervention shows its effectiveness in reducing sitting time, health professionals would have a low-cost intervention tool for the management of sedentary overweight and obese patients. Trial registration ClinicalTrials.gov NCT01729936 PMID:24597534
Martín-Borràs, Carme; Giné-Garriga, Maria; Martínez, Elena; Martín-Cantera, Carlos; Puigdoménech, Elisa; Solà, Mercè; Castillo, Eva; Beltrán, Angela Ma; Puig-Ribera, Anna; Trujillo, José Manuel; Pueyo, Olga; Pueyo, Javier; Rodríguez, Beatriz; Serra-Paya, Noemí
2014-03-05
There is growing evidence suggesting that prolonged sitting has negative effects on people's weight, chronic diseases and mortality. Interventions to reduce sedentary time can be an effective strategy to increase daily energy expenditure. The purpose of this study is to evaluate the effectiveness of a six-month primary care intervention to reduce daily sitting time in overweight and mildly obese sedentary patients. The study is a randomized controlled trial (RCT). Professionals from thirteen primary health care centers (PHC) will randomly invite mildly obese or overweight patients of both genders, aged between 25 and 65 years old, who spend at least 6 hours a day sitting, to participate. A total of 232 subjects will be randomly allocated to an intervention (IG) and control group (CG) (116 individuals per group). In addition, 50 subjects with fibromyalgia will be included. The primary outcome is: (1) sitting time using the activPAL device and the Marshall questionnaire. The following parameters will also be assessed: (2) sitting time in the workplace (Occupational Sitting and Physical Activity Questionnaire), (3) health-related quality of life (EQ-5D), (4) evolution of stage of change (Prochaska and DiClemente's Stages of Change Model), (5) physical inactivity (Catalan version of the Brief Physical Activity Assessment Tool), (6) number of steps walked (pedometer and activPAL), (7) control based on blood analysis (triglycerides, total cholesterol, HDL, LDL, glycemia and, in diabetic patients, glycated haemoglobin) and (8) blood pressure and anthropometric variables. All parameters will be assessed pre and post intervention, with follow-up three, six and twelve months after the intervention. A descriptive analysis of all variables and a multivariate analysis to assess differences among groups will be undertaken. Multivariate analysis will be carried out to assess time changes of dependent variables. All analyses will be done under the intention-to-treat principle. If the SEDESTACTIV intervention shows its effectiveness in reducing sitting time, health professionals would have a low-cost intervention tool for the management of sedentary overweight and obese patients. Trial registration: ClinicalTrials.gov NCT01729936.
ERIC Educational Resources Information Center
Rector, Robert E.; Johnson, Kirk A.; Fagan, Patrick F.; Noyes, Lauren R.
This report uses data from the Fragile Families and Child Well-Being Study (a nationwide survey that collects data on married and non-married parents at the time of the child's birth) to determine how much marriage could reduce poverty among couples who are not married at the time of birth. To determine the impact of marriage on children's and…
NASA Technical Reports Server (NTRS)
Mcgowan, David M.; Bostic, Susan W.; Camarda, Charles J.
1993-01-01
The development of two advanced reduced-basis methods, the force derivative method and the Lanczos method, and two widely used modal methods, the mode displacement method and the mode acceleration method, for transient structural analysis of unconstrained structures is presented. Two example structural problems are studied: an undamped, unconstrained beam subject to a uniformly distributed load which varies as a sinusoidal function of time and an undamped high-speed civil transport aircraft subject to a normal wing tip load which varies as a sinusoidal function of time. These example problems are used to verify the methods and to compare the relative effectiveness of each of the four reduced-basis methods for performing transient structural analyses on unconstrained structures. The methods are verified with a solution obtained by integrating directly the full system of equations of motion, and they are compared using the number of basis vectors required to obtain a desired level of accuracy and the associated computational times as comparison criteria.
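A small sketch of the mode displacement method for a sinusoidally loaded system: solve the generalized eigenproblem, project the load onto a truncated modal basis, solve the decoupled modal equations, and superpose the modes. The 3-DOF matrices are invented, and this toy system is constrained; the unconstrained structures studied in the paper additionally carry rigid-body modes that need special handling.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 3-DOF undamped system (illustrative, not one of the paper's models).
M = np.diag([2.0, 1.0, 1.0])
K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  4.0, -2.0],
              [ 0.0, -2.0,  2.0]])
F0 = np.array([0.0, 0.0, 1.0])       # spatial load pattern
omega_f = 0.8                        # forcing frequency (off-resonance here)

vals, V = eigh(K, M)                 # M-orthonormal mode shapes
omega = np.sqrt(vals)

m = 2                                # retain the first m modes
f = V[:, :m].T @ F0                  # modal forces
w = omega[:m, None]

t = np.linspace(0, 30, 600)
# Closed-form response of q'' + w^2 q = f sin(w_f t) with zero initial conditions:
q = (f[:, None] / (w**2 - omega_f**2)) * (
    np.sin(omega_f * t) - (omega_f / w) * np.sin(w * t))
u = V[:, :m] @ q                     # mode displacement method: superpose modes
print(u[:, -1])                      # physical displacements at the final time
```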
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-08-30
Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, and Big Data Analytics. Hadoop has become a data convergence platform for sensor networks. As one of its core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were examined and a detailed analysis report was produced. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
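The paper's two-phase regression (TPR) is not specified in the abstract; one common way to realize a two-phase predictor is a continuous piecewise-linear fit with a grid-searched breakpoint, sketched below on synthetic task-progress data. The model form and data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic (progress, elapsed-time) pairs with a rate change at 60% progress.
p = np.sort(rng.uniform(0, 1, 200))
t = np.where(p < 0.6, 50 * p, 30 + 120 * (p - 0.6)) + rng.normal(0, 1.5, p.size)

def fit_two_phase(p, t, grid):
    best = None
    for b in grid:
        # Continuous piecewise-linear basis with a hinge at breakpoint b.
        X = np.column_stack([np.ones_like(p), p, np.maximum(p - b, 0.0)])
        beta, *_ = np.linalg.lstsq(X, t, rcond=None)
        sse = np.sum((X @ beta - t) ** 2)
        if best is None or sse < best[0]:
            best = (sse, b, beta)
    return best

sse, b, beta = fit_two_phase(p, t, np.linspace(0.2, 0.8, 61))
# Predicted finishing time = fitted elapsed time extrapolated to progress 1.0.
finish = beta[0] + beta[1] * 1.0 + beta[2] * max(1.0 - b, 0.0)
print(f"breakpoint ~ {b:.2f}, predicted finish ~ {finish:.1f}s")
```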
Design sensitivity analysis of boundary element substructures
NASA Technical Reports Server (NTRS)
Kane, James H.; Saigal, Sunil; Gallagher, Richard H.
1989-01-01
The ability to reduce or condense a three-dimensional model exactly, and then iterate on this reduced-size model representing the parts of the design that are allowed to change in an optimization loop, is discussed. The discussion presents the results obtained from an ongoing research effort to exploit the concept of substructuring within the structural shape optimization context using a Boundary Element Analysis (BEA) formulation. The first part contains a formulation for the exact condensation of portions of the overall boundary element model designated as substructures. The use of reduced boundary element models in shape optimization requires that structural sensitivity analysis can be performed. A reduced sensitivity analysis formulation is then presented that allows for the calculation of structural response sensitivities of both the substructured (reduced) and unsubstructured parts of the model. It is shown that this approach produces significant computational economy in the design sensitivity analysis and reanalysis process by facilitating the block triangular factorization and forward reduction and backward substitution of smaller matrices. The implementation of this formulation is discussed, and timings and accuracies of representative test cases are presented.
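Algebraically, the exact condensation of internal substructure unknowns onto the retained (design) unknowns is a Schur complement; a small dense-matrix sketch, using a random well-conditioned system rather than an actual BEA matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
n_i, n_b = 8, 4  # internal (condensed) and retained (design) unknowns

# Partitioned system [[A_ii, A_ib], [A_bi, A_bb]] [x_i; x_b] = [f_i; f_b]
A = rng.standard_normal((n_i + n_b, n_i + n_b)) + 5.0 * np.eye(n_i + n_b)
f = rng.standard_normal(n_i + n_b)
A_ii, A_ib = A[:n_i, :n_i], A[:n_i, n_i:]
A_bi, A_bb = A[n_i:, :n_i], A[n_i:, n_i:]
f_i, f_b = f[:n_i], f[n_i:]

# Condense: eliminate x_i exactly, leaving a small system in x_b only.
S = A_bb - A_bi @ np.linalg.solve(A_ii, A_ib)       # Schur complement
g = f_b - A_bi @ np.linalg.solve(A_ii, f_i)
x_b = np.linalg.solve(S, g)
x_i = np.linalg.solve(A_ii, f_i - A_ib @ x_b)       # recover internal unknowns

# Verify the condensed solve against the full solve.
print(np.allclose(np.concatenate([x_i, x_b]), np.linalg.solve(A, f)))
```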
Wagner, Lucas; Schmal, Christoph; Staiger, Dorothee; Danisman, Selahattin
2017-01-01
The analysis of circadian leaf movement rhythms is a simple yet effective method to study effects of treatments or gene mutations on the circadian clock of plants. Currently, leaf movements are analysed using time-lapse photography and subsequent bioinformatic analysis of leaf movements. Programs that are used for this purpose either perform only one function (i.e. leaf tip detection or rhythm analysis) or their function is limited to specific computational environments. We developed a leaf movement analysis tool, PALMA, that works from the command line and combines image extraction with rhythm analysis using fast Fourier transformation and non-linear least squares fitting. We validated PALMA on both simulated time series and experiments using the known short-period mutant sensitivity to red light reduced 1 (srr1-1). We compared PALMA with two established leaf movement analysis tools and found it to perform equally well. Finally, we tested the effect of reduced iron conditions on the leaf movement rhythms of wild type plants. Here, we found that PALMA successfully detected period lengthening under reduced iron conditions. PALMA correctly estimated the period of both simulated and real-life leaf movement experiments. As a platform-independent console program that unites both functions needed for the analysis of circadian leaf movements, it is a valid alternative to existing leaf movement analysis tools.
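PALMA's two-stage rhythm analysis, an FFT to seed the period estimate followed by nonlinear least squares refinement, can be sketched roughly as below; the cosine-plus-trend model and all parameters are assumptions, not PALMA's actual implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
t = np.arange(0, 120, 0.5)               # hours, sampled every 0.5 h
# Synthetic leaf-position trace: ~25.5 h rhythm plus drift and noise.
y = 3 * np.cos(2 * np.pi * t / 25.5 + 0.4) + 0.01 * t + rng.normal(0, 0.5, t.size)

# Stage 1: FFT peak gives a starting period estimate.
freqs = np.fft.rfftfreq(t.size, d=0.5)
power = np.abs(np.fft.rfft(y - y.mean())) ** 2
p0_period = 1.0 / freqs[1:][power[1:].argmax()]

# Stage 2: refine with nonlinear least squares on a cosine-plus-trend model.
def model(t, amp, period, phase, slope, offset):
    return amp * np.cos(2 * np.pi * t / period + phase) + slope * t + offset

popt, _ = curve_fit(model, t, y, p0=[y.std(), p0_period, 0.0, 0.0, y.mean()])
print(f"FFT seed {p0_period:.1f} h, fitted period {popt[1]:.2f} h")
```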
Savastano, Simone; Vanni, Vincenzo; Burkart, Roman; Raimondi, Maurizio; Canevari, Fabrizio; Molinari, Simone; Baldi, Enrico; Danza, Aurora I; Caputo, Maria Luce; Mauri, Romano; Regoli, Francois; Conte, Giulio; Benvenuti, Claudio; Auricchio, Angelo
2017-01-01
Early and good quality cardiopulmonary resuscitation (CPR) and the use of automated external defibrillators (AEDs) improve cardiac arrest patients' survival. However, AED peri- and post-shock/analysis pauses may reduce CPR effectiveness. The time performance of 12 different commercially available AEDs was tested in a manikin-based scenario; then the AED recordings from the same tested models following clinical use in both Pavia and Ticino were analyzed to evaluate the post-shock and post-analysis time. None of the AEDs was able to complete the analysis and to charge the capacitors in less than 10 s, and the mean post-shock pause was 6.7±2.4 s. For non-shockable rhythms, the mean analysis time was 10.3±2 s and the mean post-analysis time was 6.2±2.2 s. We analyzed 154 AED records [104 by Emergency Medical Service (EMS) rescuers; 50 by lay rescuers]. EMS rescuers were faster in resuming CPR than lay rescuers [5.3 s (95% CI 5-5.7) vs 8.6 s (95% CI 7.3-10)]. AEDs showed different performances that may reduce CPR quality, mostly for those rescuers following AED instructions. Both technological improvements and better lay rescuer training might be needed. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Takemoto, Michelle; Lewars, Brittany; Hurst, Samantha; Crist, Katie; Nebeker, Camille; Madanat, Hala; Nichols, Jeanne; Rosenberg, Dori E; Kerr, Jacqueline
2018-03-31
Recent epidemiological evidence indicates that, on average, people are sedentary for approximately 7.7 hours per day. There are deleterious effects of prolonged sedentary behavior that are separate from participation in physical activity and include increased risk of weight gain, cancer, metabolic syndrome, diabetes, and heart disease. Previous trials have used wearable devices to increase physical activity; however, additional research is needed to fully understand how this technology can be used to reduce sitting time. The purpose of this study was to explore the potential of wearable devices as an intervention tool in a larger sedentary behavior study through a general inductive and deductive analysis of focus group discussions. We conducted four focus groups with 15 participants to discuss 7 different wearable devices with sedentary behavior capabilities. Participants recruited for the focus groups had previously participated in a pilot intervention targeting sedentary behavior over a 3-week period and were knowledgeable about the challenges of reducing sitting time. During the focus groups, participants commented on the wearability, functionality, and feedback mechanism of each device and then identified their two favorite and two least favorite devices. Finally, participants designed and described their ideal or dream wearable device. Two researchers, with expertise in analyzing qualitative data, coded and analyzed the data from the focus groups. A thematic analysis approach using Dedoose software (SocioCultural Research Consultants, LLC version 7.5.9) guided the organization of themes that reflected participants' perspectives. Analysis resulted in 14 codes that we grouped into themes. Three themes emerged from our data: (1) features of the device, (2) data the device collected, and (3) how data are displayed. Current wearable devices for increasing physical activity are insufficient to intervene on sitting time. This was especially evident when participants voted, as several participants reported using a "process of elimination" as opposed to choosing favorites because none of the devices were ideal for reducing sitting time. To overcome the limitations in current devices, future wearable devices designed to reduce sitting time should include the following features: waterproof design, long battery life, accuracy in measuring sitting time, real-time feedback on progress toward sitting reduction goals, and flexible options for prompts to take breaks from sitting. ©Michelle Takemoto, Brittany Lewars, Samantha Hurst, Katie Crist, Camille Nebeker, Hala Madanat, Jeanne Nichols, Dori E Rosenberg, Jacqueline Kerr. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 31.03.2018.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Etingov, Pavel V.; Ren, Huiying
This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. HPC techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.
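The abstract does not name the smart sampling techniques used; one standard choice is Latin hypercube sampling, which stratifies each uncertain dimension so that far fewer scenarios cover the uncertainty space than plain Monte Carlo. A sketch using scipy, with illustrative uncertainty sources and ranges:

```python
import numpy as np
from scipy.stats import qmc

# Three uncertain inputs for a look-ahead study, e.g. load, wind, and interchange
# forecast errors (the names and ranges are illustrative, not from the paper).
dim, n_scenarios = 3, 50
sampler = qmc.LatinHypercube(d=dim, seed=5)
u = sampler.random(n_scenarios)                 # stratified samples in [0, 1)^3

# Map to physical ranges: +/-5% load error, +/-20% wind error, +/-100 MW.
lower = np.array([-0.05, -0.20, -100.0])
upper = np.array([ 0.05,  0.20,  100.0])
scenarios = qmc.scale(u, lower, upper)

# Each scenario row would seed one contingency-analysis run (dispatched in
# parallel under HPC); here we just show the reduced scenario set.
print(scenarios[:5])
```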
NASA Astrophysics Data System (ADS)
Jiang, Wenqian; Zeng, Bo; Yang, Zhou; Li, Gang
2018-01-01
In non-intrusive load monitoring, load disaggregation can reflect the running state of each appliance, which helps users reduce unnecessary energy costs. Considering time-of-use (TOU) pricing as a demand-side management measure, this paper proposes a method for analyzing the influence of TOU pricing on residential loads based on non-intrusive load monitoring data. Relying on current-signal classification of the residential load, the appliance types and the time-series self-elasticities and cross-elasticities of demand can be obtained. Tests on actual household load data show that, under TOU pricing, the operation of some appliances is shifted in time and electricity use during peak-price periods is reduced, while use during lower-price periods increases, with a certain regularity.
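The self- and cross-elasticities mentioned follow the standard definitions: the percentage change in demand in one period per percentage change in price in the same or another period. A sketch with invented peak/off-peak numbers:

```python
import numpy as np

# Hypothetical hourly-aggregated demand (kWh) before/after TOU, [peak, off-peak].
q_before = np.array([6.0, 4.0])
q_after  = np.array([5.1, 4.8])
p_before = np.array([0.50, 0.50])  # flat tariff (price per kWh)
p_after  = np.array([0.80, 0.30])  # TOU tariff

dq = (q_after - q_before) / q_before   # relative demand changes
dp = (p_after - p_before) / p_before   # relative price changes

self_elasticity_peak = dq[0] / dp[0]   # peak demand response to peak price
cross_elasticity     = dq[1] / dp[0]   # off-peak demand response to peak price
print(self_elasticity_peak, cross_elasticity)
```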
Ciemins, Elizabeth L; Blum, Linda; Nunley, Marsha; Lasher, Andrew; Newman, Jeffrey M
2007-12-01
While there has been a rapid increase of inpatient palliative care (PC) programs, the financial and clinical benefits have not been well established. Determine the effect of an inpatient PC consultation service on costs and clinical outcomes. Multifaceted study included: (1) interrupted time-series design utilizing mean daily costs preintervention and postintervention; (2) matched cohort analysis comparing PC to usual care patients; and (3) analysis of symptom control after consultation. Large private, not-for-profit, academic medical center in San Francisco, California, 2004-2006. Time series analysis included 282 PC patients; matched cohorts included 27 PC with 128 usual care patients; clinical outcome analysis of 48 PC patients. Mean daily patient costs and length of stay (LOS); pain, dyspnea, and secretions assessment scores. Mean daily costs were reduced 33% (p < 0.01) from preintervention to postintervention period. Mean length of stay (LOS) was reduced 30%. Mean daily costs for PC patients were 14.5% lower compared to usual care patients (p < 0.01). Pain, dyspnea, and secretions scores were reduced by 86%, 64%, and 87%, respectively. Over the study period, time to PC referral as well as overall ALOS were reduced by 50%. The large reduction in mean daily costs and LOS resulted in an estimated annual savings of $2.2 million in the study hospital. Our results extend the evidence base of financial and clinical benefits associated with inpatient PC programs. We recommend additional study of best practices for identifying patients and providing consultation services, in addition to progressive management support and reimbursement policy.
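The interrupted time-series component of such a study is commonly estimated by segmented regression of mean daily cost on time, with level-change and slope-change terms at the intervention date; the sketch below uses simulated data, not the study's.

```python
import numpy as np

rng = np.random.default_rng(6)
n, t0 = 104, 52                       # weekly points, intervention at week 52
t = np.arange(n)
post = (t >= t0).astype(float)
# Simulated mean daily costs with a level drop after the intervention.
y = 3000 - 2 * t - 900 * post + rng.normal(0, 120, n)

# Segmented regression: intercept, baseline trend, level change, slope change.
X = np.column_stack([np.ones(n), t, post, post * (t - t0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"baseline trend {beta[1]:.1f}/wk, level change {beta[2]:.0f}, "
      f"slope change {beta[3]:.1f}/wk")
```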
Why "Working Smarter" Isn't Working: White-Collar Productivity Improvement.
ERIC Educational Resources Information Center
Shaw, Edward
2001-01-01
Discusses the productivity and work days of white collar workers. Topics include productivity improvement; task analysis; the amount of time spent reading, and how to reduce it by improving writing skills; time spent in meetings; empowered time management; and sustaining a climate for change. (LRW)
Paramedir: A Tool for Programmable Performance Analysis
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Labarta, Jesus; Gimenez, Judit
2004-01-01
Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.
Visual Analysis of Air Traffic Data
NASA Technical Reports Server (NTRS)
Albrecht, George Hans; Pang, Alex
2012-01-01
In this paper, we present visual analysis tools to help study the impact of policy changes on air traffic congestion. The tools support visualization of time-varying air traffic density over an area of interest using different time granularity. We use this visual analysis platform to investigate how changing the aircraft separation volume can reduce congestion while maintaining key safety requirements. The same platform can also be used as a decision aid for processing requests for unmanned aerial vehicle operations.
Tao, Weiwei; Luo, Xi; Cui, Bai; Liang, Dapeng; Wang, Chunli; Duan, Yangyang; Li, Xiaofen; Zhou, Shiyu; Zhao, Mingjie; Li, Yi; He, Yumin; Wang, Shaowu; Kelley, Keith W; Jiang, Ping; Liu, Quentin
2015-11-24
Cancer patients suffer from diverse symptoms, including depression, anxiety, pain, and fatigue and lower quality of life (QoL) during disease progression. This study aimed to evaluate the benefits of Traditional Chinese Medicine psycho-behavioral interventions (TCM PBIs) on improving QoL by meta-analysis. Electronic literature databases (PubMed, CNKI, VIP, and Wanfang) were searched for randomized, controlled trials conducted in China. The primary intervention was TCM PBIs. The main outcome was health-related QoL (HR QoL) post-treatment. We applied standard meta-analytic techniques to analyze data from papers that reached acceptable criteria. The six TCM PBIs analyzed were acupuncture, Chinese massage, Traditional Chinese Medicine five elements musical intervention (TCM FEMI), Traditional Chinese Medicine dietary supplement (TCM DS), Qigong and Tai Chi. Although both TCM PBIs and non-TCM PBIs reduced functional impairments in cancer patients and led to pain relief, depression remission, reduced time to flatulence following surgery and sleep improvement, TCM PBIs showed more beneficial effects as assessed by reducing both fatigue and gastrointestinal distress. In particular, acupuncture relieved fatigue, reduced diarrhea and decreased time to flatulence after surgery in cancer patients, while therapeutic Chinese massage reduced time to flatulence and time to peristaltic sound. These findings demonstrate the efficacy of TCM PBIs in improving QoL in cancer patients and establish that TCM PBIs represent beneficial adjunctive therapies for cancer patients.
Liang, Dapeng; Wang, Chunli; Duan, Yangyang; Li, Xiaofen; Zhou, Shiyu; Zhao, Mingjie; Li, Yi; He, Yumin; Wang, Shaowu; Kelley, Keith W.; Jiang, Ping; Liu, Quentin
2015-01-01
Background Cancer patients suffer from diverse symptoms, including depression, anxiety, pain, and fatigue and lower quality of life (QoL) during disease progression. This study aimed to evaluate the benefits of Traditional Chinese Medicine psycho-behavioral interventions (TCM PBIs) on improving QoL by meta-analysis. Methods Electronic literature databases (PubMed, CNKI, VIP, and Wanfang) were searched for randomized, controlled trials conducted in China. The primary intervention was TCM PBIs. The main outcome was health-related QoL (HR QoL) post-treatment. We applied standard meta analytic techniques to analyze data from papers that reached acceptable criteria. Results The six TCM PBIs analyzed were acupuncture, Chinese massage, Traditional Chinese Medicine five elements musical intervention (TCM FEMI), Traditional Chinese Medicine dietary supplement (TCM DS), Qigong and Tai Chi. Although both TCM PBIs and non-TCM PBIs reduced functional impairments in cancer patients and led to pain relief, depression remission, reduced time to flatulence following surgery and sleep improvement, TCM PBIs showed more beneficial effects as assessed by reducing both fatigue and gastrointestinal distress. In particular, acupuncture relieved fatigue, reduced diarrhea and decreased time to flatulence after surgery in cancer patients, while therapeutic Chinese massage reduced time to flatulence and time to peristaltic sound. Conclusion These findings demonstrate the efficacy of TCM PBIs in improving QoL in cancer patients and establish that TCM PBIs represent beneficial adjunctive therapies for cancer patients. PMID:26498685
Harmonic versus LigaSure hemostasis technique in thyroid surgery: A meta-analysis
Upadhyaya, Arun; Hu, Tianpeng; Meng, Zhaowei; Li, Xue; He, Xianghui; Tian, Weijun; Jia, Qiang; Tan, Jian
2016-01-01
Harmonic scalpel and LigaSure vessel sealing systems have been suggested as options for saving surgical time and reducing postoperative complications. The aim of the present meta-analysis was to compare surgical time, postoperative complications and other parameters between them for the open thyroidectomy procedure. Studies were retrieved from MEDLINE, Cochrane Library, EMBASE and ISI Web of Science until December 2015. All the randomized controlled trials (RCTs) comparing the Harmonic scalpel and LigaSure during open thyroidectomy were selected. Following data extraction, statistical analyses were performed. Among the 24 studies that were evaluated for eligibility, 7 RCTs with 981 patients were included. The Harmonic scalpel significantly reduced surgical time compared with LigaSure techniques (8.79 min; 95% confidence interval, −15.91 to −1.67; P=0.02). However, no significant difference was observed in intraoperative blood loss, postoperative blood loss, duration of hospital stay, thyroid weight or postoperative serum calcium level between the groups. The present meta-analysis indicated superiority of the Harmonic scalpel only in terms of surgical time compared with LigaSure hemostasis techniques in open thyroid surgery. PMID:27446546
Foston, Marcus; Samuel, Reichel; Ragauskas, Arthur J
2012-09-07
The ability to accurately and rapidly measure plant cell wall composition, relative monolignol content and lignin-hemicellulose inter-unit linkage distributions has become essential to efforts centered on reducing the recalcitrance of biomass by genetic engineering. Growing (13)C enriched transgenic plants is a viable route to achieve the high-throughput, detailed chemical analysis of whole plant cell wall before and after pretreatment and microbial or enzymatic utilization by (13)C nuclear magnetic resonance (NMR) in a perdeuterated ionic liquid solvent system not requiring component isolation. 1D (13)C whole cell wall ionic liquid NMR of natural abundant and (13)C enriched corn stover stem samples suggest that a high level of uniform labeling (>97%) can significantly reduce the total NMR experiment times up to ~220 times. Similarly, significant reduction in total NMR experiment time (~39 times) of the (13)C enriched corn stover stem samples for 2D (13)C-(1)H heteronuclear single quantum coherence NMR was found.
Automated drug dispensing systems in the intensive care unit: a financial analysis.
Chapuis, Claire; Bedouch, Pierrick; Detavernier, Maxime; Durand, Michel; Francony, Gilles; Lavagne, Pierre; Foroni, Luc; Albaladejo, Pierre; Allenet, Benoit; Payen, Jean-Francois
2015-09-09
To evaluate the economic impact of automated drug dispensing systems (ADS) in surgical intensive care units (ICUs). A financial analysis was conducted in three adult ICUs of one university hospital, where ADS were implemented, one in each unit, to replace the traditional floor stock system. Costs were estimated before and after implementation of the ADS on the basis of floor stock inventories, expired drugs, and time spent by nurses and pharmacy technicians on medication-related work activities. A financial analysis was conducted that included operating cash flows, investment cash flows, global cash flow and net present value. After ADS implementation, nurses spent less time on medication-related activities, with an average of 14.7 hours saved per day across the 33 beds. Pharmacy technicians spent more time on floor-stock activities, with an average of 3.5 additional hours per day across the three ICUs. The cost of drug storage was reduced by €44,298 and the cost of expired drugs was reduced by €14,772 per year across the three ICUs. Five years after the initial investment, the global cash flow was €148,229 and the net present value of the project was positive by €510,404. The financial modeling of the ADS implementation in three ICUs showed a high return on investment for the hospital. Medication-related costs and nursing time dedicated to medications are reduced with ADS.
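The net present value reported can be reproduced in form (though not with the study's actual inputs, which are not fully given) by discounting annual cash flows against the initial investment:

```python
# Hypothetical ADS project cash flows (EUR): year 0 investment, then yearly savings.
cash_flows = [-250_000, 120_000, 130_000, 135_000, 140_000, 145_000]
rate = 0.05  # assumed discount rate

# NPV: sum of cash flows discounted back to year 0.
npv = sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))
print(f"NPV over {len(cash_flows) - 1} years: EUR {npv:,.0f}")
```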
NASA Astrophysics Data System (ADS)
Jules, Kenol; Lin, Paul P.
2007-06-01
With the International Space Station currently operational, a significant amount of acceleration data is down-linked, processed and analyzed on the ground daily and continuously for characterization of the space station reduced gravity environment, verification of the vehicle design requirements and science data collection. To help understand the impact of the unique spacecraft environment on the science data, an artificial intelligence monitoring system was developed, which detects in near real time any change in the reduced gravity environment susceptible to affect the on-going experiments. Using a dynamic graphical display, the monitoring system allows science teams, at any time and any location, to see the active vibration disturbances, such as pumps, fans, compressors, crew exercise, re-boost and extra-vehicular activities, that might impact the reduced gravity environment the experiments are exposed to. The monitoring system can detect both known and unknown vibratory disturbance activities. It can also perform trend analysis and prediction by analyzing past data over many increments (an increment usually lasts 6 months) collected onboard the station for selected disturbances. This feature can be used to monitor the health of onboard mechanical systems to detect and prevent potential system failures. The monitoring system has two operating modes: online and offline. Both near real-time online vibratory disturbance detection and offline detection and trend analysis are discussed in this paper.
2013-01-01
Background A smartcard is an integrated circuit card that provides identification, authentication, data storage, and application processing. Among other functions, smartcards can serve as credit and ATM cards and can be used to pay various invoices using a ‘reader’. This study looks at the unit cost and activity time of both a traditional cash billing service and a newly introduced smartcard billing service in an outpatient department in a hospital in Taipei, Taiwan. Methods The activity time required in using the cash billing service was determined via a time and motion study. A cost analysis was used to compare the unit costs of the two services. A sensitivity analysis was also performed to determine the effect of smartcard use and number of cashier windows on incremental cost and waiting time. Results Overall, the smartcard system had a higher unit cost because of the additional service fees and business tax, but it reduced patient waiting time by at least 8 minutes. Thus, it is a convenient service for patients. In addition, if half of all outpatients used smartcards to pay their invoices, along with four cashier windows for cash payments, then the waiting time of cash service users could be reduced by approximately 3 minutes and the incremental cost would be close to breaking even (even though it has a higher overall unit cost that the traditional service). Conclusions Traditional cash billing services are time consuming and require patients to carry large sums of money. Smartcard services enable patients to pay their bill immediately in the outpatient clinic and offer greater security and convenience. The idle time of nurses could also be reduced as they help to process smartcard payments. A reduction in idle time reduces hospital costs. However, the cost of the smartcard service is higher than the cash service and, as such, hospital administrators must weigh the costs and benefits of introducing a smartcard service. In addition to the obvious benefits of the smartcard service, there is also scope to extend its use in a hospital setting to include the notification of patient arrival and use in other departments. PMID:23763904
Chu, Kuan-Yu; Huang, Chunmin
2013-06-13
A smartcard is an integrated circuit card that provides identification, authentication, data storage, and application processing. Among other functions, smartcards can serve as credit and ATM cards and can be used to pay various invoices using a 'reader'. This study looks at the unit cost and activity time of both a traditional cash billing service and a newly introduced smartcard billing service in an outpatient department in a hospital in Taipei, Taiwan. The activity time required in using the cash billing service was determined via a time and motion study. A cost analysis was used to compare the unit costs of the two services. A sensitivity analysis was also performed to determine the effect of smartcard use and number of cashier windows on incremental cost and waiting time. Overall, the smartcard system had a higher unit cost because of the additional service fees and business tax, but it reduced patient waiting time by at least 8 minutes. Thus, it is a convenient service for patients. In addition, if half of all outpatients used smartcards to pay their invoices, along with four cashier windows for cash payments, then the waiting time of cash service users could be reduced by approximately 3 minutes and the incremental cost would be close to breaking even (even though it has a higher overall unit cost than the traditional service). Traditional cash billing services are time consuming and require patients to carry large sums of money. Smartcard services enable patients to pay their bill immediately in the outpatient clinic and offer greater security and convenience. The idle time of nurses could also be reduced as they help to process smartcard payments. A reduction in idle time reduces hospital costs. However, the cost of the smartcard service is higher than the cash service and, as such, hospital administrators must weigh the costs and benefits of introducing a smartcard service. In addition to the obvious benefits of the smartcard service, there is also scope to extend its use in a hospital setting to include the notification of patient arrival and use in other departments.
Li, Ingrid; Mackey, Martin G; Foley, Bridget; Pappas, Evangelos; Edwards, Kate; Chau, Josephine Y; Engelen, Lina; Voukelatos, Alexander; Whelan, Anna; Bauman, Adrian; Winkler, Elisabeth; Stamatakis, Emmanuel
2017-06-01
To examine the effects of different sit-stand protocols on work-time sitting and physical activity (PA) of office workers. Participants (n = 26, 77% women, mean age 42) were randomly allocated to usual sitting (control) or one of three sit-stand protocols (intervention) facilitated by height-adjustable workstations for a 4-week period between June and August 2015. Sitting, standing, and stepping time were assessed by inclinometry (activPAL); leisure-time physical activity (LTPA) by self-report. One-way analysis of covariance (ANCOVA) and post-hoc (Bonferroni) tests explored between-group differences. Compared with baseline, intervention groups reduced work sitting time by 113 minutes/8-hour workday (95% confidence interval [CI] [-147,-79]) and increased work standing time by 96 minutes/8-hour workday (95% CI [67,125]) without significantly impacting LTPA/sleep time. Sit-stand protocols facilitated by height-adjustable workstations appear to reduce office workers' sitting time without significant adverse effects on LTPA.
USDA-ARS?s Scientific Manuscript database
Traditional plating methods are reliable means of Campylobacter identification from poultry samples, but automated gene-based detection systems now available can reduce assay time and simplify data collection and analysis. Bio-Rad and DuPont Qualicon recently introduced Campylobacter assays for their real-time ...
Effect of intervention programs in schools to reduce screen time: a meta-analysis.
Friedrich, Roberta Roggia; Polet, Jéssica Pinto; Schuch, Ilaine; Wagner, Mário Bernardes
2014-01-01
To evaluate the effects of intervention program strategies on the time spent on activities such as watching television, playing videogames, and using the computer among schoolchildren. A search for randomized controlled trials available in the literature was performed in the following electronic databases: PubMed, Lilacs, Embase, Scopus, Web of Science, and Cochrane Library, using the keywords randomized controlled trial, intervention studies, sedentary lifestyle, screen time, and school. A summary measure based on the standardized mean difference was used with a 95% confidence interval. A total of 1,552 studies were identified, of which 16 were included in the meta-analysis. The interventions in the randomized controlled trials (n=8,785) showed a significant effect in reducing screen time, with a standardized mean difference (random effect) of -0.25 (-0.37, -0.13), p<0.01. Interventions have demonstrated positive effects in reducing screen time among schoolchildren. Copyright © 2014 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
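The pooled standardized mean difference in a meta-analysis of this kind is typically computed with an inverse-variance random-effects model. A minimal sketch of the DerSimonian-Laird estimator, using made-up effect sizes and variances rather than the study's actual data:

```python
import numpy as np

def random_effects_pool(y, v):
    """DerSimonian-Laird random-effects pooling of effects y with variances v."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                  # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical SMDs (negative = less screen time) and their variances.
print(random_effects_pool([-0.31, -0.12, -0.45, -0.08], [0.01, 0.02, 0.03, 0.015]))
```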
A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.
O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A
2015-02-01
Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to polymer brushes on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time. Copyright © 2014 Elsevier B.V. All rights reserved.
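One common sphere-on-flat form of the AdG steric force (the Butt et al. variant) is F(D) = (16*pi*kT*R*L0*Gamma^(3/2)/35) * [7*(2L0/D)^(5/4) + 5*(D/2L0)^(7/4) - 12] for 0.2 < D/2L0 < 0.9; the authors' modified version may differ. A sketch of fitting the brush length L0 and grafting density Gamma with SciPy, on synthetic data; the probe radius and parameter values are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

KT = 1.38e-23 * 298      # thermal energy at 298 K (J)
R = 2.0e-6               # probe radius (m), assumed

def adg_force(D, L0, gamma):
    """Sphere-on-flat AdG steric force (N) at separation D (m)."""
    x = D / (2.0 * L0)
    return (16 * np.pi * KT * R * L0 * gamma**1.5 / 35.0) * (
        7 * x**-1.25 + 5 * x**1.75 - 12)

# Synthetic 'force curve': true L0 = 80 nm, gamma = 1e16 m^-2, plus noise.
D = np.linspace(0.25, 0.85, 60) * 2 * 80e-9
F = adg_force(D, 80e-9, 1e16) + 2e-12 * np.random.randn(D.size)
(L0_fit, gamma_fit), _ = curve_fit(adg_force, D, F, p0=(50e-9, 5e15))
print(L0_fit, gamma_fit)
```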
NASA Astrophysics Data System (ADS)
Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.
2012-10-01
We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
Sensitivity analysis of water consumption in an office building
NASA Astrophysics Data System (ADS)
Suchacek, Tomas; Tuhovcak, Ladislav; Rucka, Jan
2018-02-01
This article deals with a sensitivity analysis of real water consumption in an office building. During a long-term field study, reduced pressure in the building's water connection was simulated. A sensitivity analysis of uneven water demand was conducted during working time at various provided pressures and at various time step durations. Correlations between the maximal coefficients of water demand variation during working time and the provided pressure are suggested. The influence of the provided pressure in the water connection on the mean coefficients of water demand variation is pointed out, both for the working hours of all days together and separately for days with identical working hours.
Activating clinical trials: a process improvement approach.
Martinez, Diego A; Tsalatsanis, Athanasios; Yalcin, Ali; Zayas-Castro, José L; Djulbegovic, Benjamin
2016-02-24
The administrative process associated with clinical trial activation has been criticized as costly, complex, and time-consuming. Prior research has concentrated on identifying administrative barriers and proposing various solutions to reduce activation time, and consequently associated costs. Here, we expand on previous research by incorporating social network analysis and discrete-event simulation to support process improvement decision-making. We searched for all operational data associated with the administrative process of activating industry-sponsored clinical trials at the Office of Clinical Research of the University of South Florida in Tampa, Florida. We limited the search to those trials initiated and activated between July 2011 and June 2012. We described the process using value stream mapping, studied the interactions of the various process participants using social network analysis, and modeled potential process modifications using discrete-event simulation. The administrative process comprised 5 sub-processes, 30 activities, 11 decision points, 5 loops, and 8 participants. The mean activation time was 76.6 days. Rate-limiting sub-processes were those of contract and budget development. Key participants during contract and budget development were the Office of Clinical Research, sponsors, and the principal investigator. Simulation results indicate that slight increases in the number of trials arriving at the Office of Clinical Research would increase activation time by 11%. Also, increasing the efficiency of contract and budget development would reduce the activation time by 28%. Finally, better synchronization between contract and budget development would reduce time spent on batching documentation; however, no improvements would be attained in total activation time. The presented process improvement analytic framework not only identifies administrative barriers, but also helps to devise and evaluate potential improvement scenarios. The strength of our framework lies in its system analysis approach that recognizes the stochastic duration of the activation process and the interdependence between process activities and entities.
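The study used a commercial discrete-event simulator; the qualitative effect of rework loops on activation time can nonetheless be illustrated with a simple Monte Carlo sketch. The step durations and loop-back probabilities below are invented placeholders, not the study's fitted values:

```python
import random

# (name, mean duration in days, probability of looping back to redo this step)
STEPS = [("feasibility", 5, 0.05), ("contract", 30, 0.25),
         ("budget", 25, 0.20), ("IRB", 12, 0.10), ("activation", 4, 0.0)]

def activation_time():
    """One simulated pass through the process, redoing steps on rework loops."""
    total = 0.0
    for name, mean, p_loop in STEPS:
        total += random.expovariate(1.0 / mean)
        while random.random() < p_loop:              # rework iteration
            total += random.expovariate(1.0 / mean)  # redo the step
    return total

runs = [activation_time() for _ in range(20000)]
print(sum(runs) / len(runs))  # mean activation time (days)
```

Reducing a loop probability (e.g., the contract loop) and rerunning directly shows how much of the mean activation time is driven by rework rather than once-through duration.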
Large-Scale, Parallel, Multi-Sensor Data Fusion in the Cloud
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Hua, H.
2012-12-01
NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instrument swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To efficiently assemble such decade-scale datasets in a timely manner, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. "SciReduce" is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, in which simple tuples (keys & values) are passed between the map and reduce functions, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Thus, SciReduce uses the native datatypes (geolocated grids, swaths, and points) that geo-scientists are familiar with. We are deploying within SciReduce a versatile set of python operators for data lookup, access, subsetting, co-registration, mining, fusion, and statistical analysis. All operators take in sets of geo-located arrays and generate more arrays. Large, multi-year satellite and model datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of granules) can be compared or fused in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP or webification URLs, thereby minimizing the size of the stored input and intermediate datasets. A typical map function might assemble and quality control AIRS Level-2 water vapor profiles for a year of data in parallel, then a reduce function would average the profiles in lat/lon bins (again, in parallel), and a final reduce would aggregate the climatology and write it to output files. We are using SciReduce to automate the production of multiple versions of a multi-year water vapor climatology (AIRS & MODIS), stratified by Cloudsat cloud classification, and compare it to models (ECMWF & MERRA reanalysis). We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing huge datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer.
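SciReduce itself is not publicly documented here, but the map/reduce pattern the abstract describes (assemble values in parallel, then average them into lat/lon bins) can be sketched with Python's standard library. The granule contents below are randomized stand-ins for retrieved water-vapor values:

```python
import numpy as np
from functools import reduce
from multiprocessing import Pool

BIN = 10  # bin size in degrees

def map_granule(seed):
    """Map: emit {(lat_bin, lon_bin): (sum, count)} for one 'granule'."""
    rng = np.random.default_rng(seed)
    lats, lons = rng.uniform(-90, 90, 1000), rng.uniform(-180, 180, 1000)
    wv = rng.uniform(0, 60, 1000)                    # fake water-vapor values
    out = {}
    for lat, lon, w in zip(lats, lons, wv):
        key = (int(lat // BIN), int(lon // BIN))
        s, n = out.get(key, (0.0, 0))
        out[key] = (s + w, n + 1)
    return out

def reduce_pair(a, b):
    """Reduce: merge partial (sum, count) dictionaries."""
    for key, (s, n) in b.items():
        s0, n0 = a.get(key, (0.0, 0))
        a[key] = (s0 + s, n0 + n)
    return a

if __name__ == "__main__":
    partials = Pool(4).map(map_granule, range(100))  # 100 'granules' in parallel
    merged = reduce(reduce_pair, partials, {})
    climatology = {k: s / n for k, (s, n) in merged.items()}
    print(len(climatology), "bins")
```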
Large-Scale, Parallel, Multi-Sensor Data Fusion in the Cloud
NASA Astrophysics Data System (ADS)
Wilson, B.; Manipon, G.; Hua, H.
2012-04-01
NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To efficiently assemble such decade-scale datasets in a timely manner, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. "SciReduce" is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, in which simple tuples (keys & values) are passed between the map and reduce functions, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Thus, SciReduce uses the native datatypes (geolocated grids, swaths, and points) that geo-scientists are familiar with. We are deploying within SciReduce a versatile set of python operators for data lookup, access, subsetting, co-registration, mining, fusion, and statistical analysis. All operators take in sets of geo-arrays and generate more arrays. Large, multi-year satellite and model datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of granules) can be compared or fused in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP or webification URLs, thereby minimizing the size of the stored input and intermediate datasets. A typical map function might assemble and quality control AIRS Level-2 water vapor profiles for a year of data in parallel, then a reduce function would average the profiles in bins (again, in parallel), and a final reduce would aggregate the climatology and write it to output files. We are using SciReduce to automate the production of multiple versions of a multi-year water vapor climatology (AIRS & MODIS), stratified by Cloudsat cloud classification, and compare it to models (ECMWF & MERRA reanalysis). We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing huge datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer.
Rationale for continuing R&D in indirect coal liquefaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.; Tomlinson, G.
1995-12-31
The objective of this analysis is to use the world energy demand/supply model developed at MITRE to examine future liquid fuels supply scenarios both for the world and for the United States. This analysis has determined the probable extent of future oil resource shortages and the likely time frame in which the shortages will occur. The role that coal liquefaction could play in helping to alleviate this liquid fuels shortfall is also examined. The importance of continuing R&D to improve process performance and reduce the costs of coal-derived transportation fuel is quantified in terms of reducing the time when coal liquids will become competitive with petroleum.
Xiong, Shuyu; Sankaridurg, Padmaja; Naduvilath, Thomas; Zang, Jiajie; Zou, Haidong; Zhu, Jianfeng; Lv, Minzhi; He, Xiangui; Xu, Xun
2017-09-01
Outdoor time is considered to reduce the risk of developing myopia. The purpose is to evaluate the evidence for association between time outdoors and (1) risk of onset of myopia (incident/prevalent myopia); (2) risk of a myopic shift in refractive error; and (3) risk of progression in myopes only. A systematic review followed by a meta-analysis and a dose-response analysis of relevant evidence from the literature was conducted. PubMed, EMBASE and the Cochrane Library were searched for relevant papers. Of the 51 articles with relevant data, 25 were included in the meta-analysis and dose-response analysis. Twenty-three of the 25 articles involved children. Risk ratios (RR) for binary variables and weighted mean differences (WMD) for continuous variables were calculated. A Mantel-Haenszel random-effects model was used to pool the data for meta-analysis. Statistical heterogeneity was assessed using the I² test, with I² ≥ 50% considered to indicate high heterogeneity. Additionally, subgroup analyses (based on participant's age, prevalence of myopia and study type) and sensitivity analyses were conducted. A significant protective effect of outdoor time was found for incident myopia (clinical trials: risk ratio (RR) = 0.536, 95% confidence interval (CI) = 0.338 to 0.850; longitudinal cohort studies: RR = 0.574, 95% CI = 0.395 to 0.834) and prevalent myopia (cross-sectional studies: OR = 0.964, 95% CI = 0.945 to 0.982). With dose-response analysis, an inverse nonlinear relationship was found, with increased time outdoors reducing the risk of incident myopia. Also, pooled results from clinical trials indicated that when outdoor time was used as an intervention, there was a reduced myopic shift of -0.30 D (in both myopes and nonmyopes) compared with the control group (WMD = -0.30, 95% CI = -0.18 to -0.41) after 3 years of follow-up. However, when only myopes were considered, dose-response analysis did not find a relationship between time outdoors and myopic progression (R² = 0.00064). Increased time outdoors is effective in preventing the onset of myopia as well as in slowing the myopic shift in refractive error. But paradoxically, outdoor time was not effective in slowing progression in eyes that were already myopic. Further studies evaluating the effect of outdoor time in various doses and objective measurements of time outdoors may help improve our understanding of the role played by outdoors in onset and management of myopia. © 2017 The Authors. Acta Ophthalmologica published by John Wiley & Sons Ltd on behalf of Acta Ophthalmologica Scandinavica Foundation.
Ho, Chi-Kung; Chen, Fu-Cheng; Chen, Yung-Lung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te; Cheng, Cheng-I
2017-01-01
This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the group transferred through the protocol compared with the traditional referral process group (both p < 0.05). There were also no remarkable differences in the complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were elderly patients, advanced Killip score, and higher level of troponin-I. This study showed that patients transferred through our present protocol had reduced pain to electrocardiography and catheterization laboratory to balloon times in Killip I/II and III/IV patients, respectively. However, this study showed that using a cloud computing system in our present protocol did not reduce DTB time.
Ramp time synchronization. [for NASA Deep Space Network
NASA Technical Reports Server (NTRS)
Hietzke, W.
1979-01-01
A new method of intercontinental clock synchronization has been developed and proposed for possible use by NASA's Deep Space Network (DSN), using a two-way/three-way radio link with a spacecraft. Analysis of preliminary data indicates that the real-time method has an uncertainty of 0.6 microsec, and it is very likely that further work will decrease the uncertainty. Also, the method is compatible with a variety of nonreal-time analysis techniques, which may reduce the uncertainty down to the tens of nanosecond range.
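The abstract gives no formulas, but the core of any two-way synchronization scheme is that the signed path delays cancel when the uplink and downlink measurements are combined. A sketch of the classic two-way computation (NTP-style notation, not necessarily the DSN ramp formulation):

```python
def two_way_offset(t1, t2, t3, t4):
    """Clock offset of station B relative to station A from a two-way exchange.

    t1: A sends, on A's clock      t2: B receives, on B's clock
    t3: B replies, on B's clock    t4: A receives, on A's clock
    Assumes the A->B and B->A path delays are equal; any asymmetry
    appears directly as offset error.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = (t4 - t1) - (t3 - t2)   # round-trip path delay
    return offset, delay

# B's clock runs 0.6 us fast; one-way light time 1.2 s; B replies after 10 s.
print(two_way_offset(0.0, 1.2 + 0.6e-6, 1.2 + 10.0 + 0.6e-6, 2.4 + 10.0))
```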
Lean manufacturing analysis to reduce waste on production process of fan products
NASA Astrophysics Data System (ADS)
Siregar, I.; Nasution, A. A.; Andayani, U.; Sari, R. M.; Syahputri, K.; Anizar
2018-02-01
This research is based on a case study at an electrical company. The product studied is a fan; its production process includes time that is not value-added, such as inefficient movement of raw materials and of molded fan components. This study aims to reduce waste, or non-value-added activities, and to shorten the total lead time by using Value Stream Mapping tools. The lean manufacturing methods used to analyze and reduce the non-value-added activities are the value stream mapping analysis tools, process activity mapping with 5W1H, and the 5-whys tool. The results show that non-value-added activities in the fan production process amount to 647.94 minutes of the total lead time of 725.68 minutes. The process cycle efficiency of the fan production process is therefore still very low, at 11%. Estimates for the improved process show the total lead time decreasing to 340.9 minutes and the process cycle efficiency increasing to 24%, which indicates that the production process has become better.
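The efficiency figures quoted can be reproduced directly from the lead-time numbers, since process cycle efficiency is value-added time divided by total lead time. A quick check of the abstract's arithmetic:

```python
def process_cycle_efficiency(total_lead_time, non_value_added):
    """PCE = value-added time / total lead time."""
    return (total_lead_time - non_value_added) / total_lead_time

print(process_cycle_efficiency(725.68, 647.94))  # ~0.107, reported as 11%
# After improvement: 24% PCE at 340.9 min total implies about
# 0.24 * 340.9 ~ 81.8 min of value-added time, i.e. roughly unchanged;
# the gain comes from removing non-value-added time, not from faster work.
```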
Musil, Carol; Jeanblanc, Alexandra; Burant, Christopher; Zauszniewski, Jaclene; Warner, Camille
2013-01-01
Background Grandmothers living with grandchildren face stressors that may increase depressive symptoms, but cognitive-behavioral strategies, such as resourcefulness, may reduce the effects of stressors on mental health. Purpose This analysis examined the contemporaneous and longitudinal relationships among intra-family strain, resourcefulness and depressive symptoms in 240 grandmothers, classified by caregiving status to grandchildren. Methods Grandmothers raising grandchildren, grandmothers living in multigenerational homes, and non-caregivers to grandchildren reported on intra-family strain, resourcefulness, and depressive symptoms using mailed questionnaires at three time points over five years. Structural equation modeling was used to evaluate the mediating effects of resourcefulness and the relationships between variables. Discussion Grandmother caregiver status had significant effects on depressive symptoms and intra-family strain, but not resourcefulness. At all waves, higher resourcefulness was associated with fewer depressive symptoms, which reduced appraisals of intra-family strain. Conclusions Interventions focused on strengthening resourcefulness could reduce depressive symptoms over time. PMID:23756496
Novel Framework for Reduced Order Modeling of Aero-engine Components
NASA Astrophysics Data System (ADS)
Safi, Ali
The present study focuses on the popular dynamic reduction methods used in the design of complex assemblies (millions of degrees of freedom), where numerous iterations are involved to achieve the final design. Aerospace manufacturers such as Rolls Royce and Pratt & Whitney are actively seeking techniques that reduce computational time while maintaining the accuracy of the models. This involves modal analysis of components with complex geometries to determine the dynamic behavior due to non-linearity and complicated loading conditions. In such cases, sub-structuring and dynamic reduction techniques prove to be efficient tools for reducing design cycle time. Components whose designs are finalized can be dynamically reduced to mass and stiffness matrices at the boundary nodes of the assembly. These matrices conserve the dynamics of the component in the assembly and thus avoid repeated calculations during analysis runs for design modification of other components. This thesis presents a novel framework for the modeling and meshing of any complex structure, in this case an aero-engine casing. The study highlights the effect of meshing techniques on run time. The modal analysis is carried out using an extremely fine mesh to ensure all minor details in the structure are captured correctly in the Finite Element (FE) model. This is used as the reference model against which the results of the reduced model are compared. The study also shows the conditions/criteria under which dynamic reduction can be implemented effectively, demonstrating the accuracy of the Craig-Bampton (C.B.) method and the limitations of static condensation. The study highlights the longer runtime needed to produce the reduced matrices of components compared to the overall runtime of the complete unreduced model; once the components are reduced, however, the assembly run time is significantly shorter. Hence the decision to use Component Mode Synthesis (CMS) should be taken judiciously, considering the number of iterations that may be required during the design cycle.
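For reference, the static (Guyan) condensation that the study contrasts with the Craig-Bampton method eliminates interior DOFs i in favor of boundary DOFs b by assuming no inertial loading on the interior; the standard form is:

\[
K_{\text{red}} = K_{bb} - K_{bi}\,K_{ii}^{-1}\,K_{ib},
\qquad
M_{\text{red}} = T^{\mathsf{T}} M\, T,
\qquad
T = \begin{bmatrix} I \\ -K_{ii}^{-1} K_{ib} \end{bmatrix}.
\]

The Craig-Bampton transformation augments \(T\) with a truncated set of fixed-interface normal modes (eigenvectors of the interior \(K_{ii}, M_{ii}\) problem), which is why it retains accuracy at higher frequencies where pure static condensation degrades.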
CMS Analysis and Data Reduction with Apache Spark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutsche, Oliver; Canali, Luca; Cremer, Illia
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover these new tools are typically actively developed by large communities, often profiting of industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping in reducing the cost of ownership for the end-users. In this talk, we are presenting studies of using Apache Spark for end user data analysis. We are studying the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. Studying the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We are presenting the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. Studying the second thrust, we are presenting studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
Real time gamma-ray signature identifier
Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA
2012-05-15
A real time gamma-ray signature/source identification method and system using principal components analysis (PCA) for transforming and substantially reducing one or more comprehensive spectral libraries of nuclear materials types and configurations into a corresponding concise representation/signature(s) representing and indexing each individual predetermined spectrum in principal component (PC) space, wherein an unknown gamma-ray signature may be compared against the representative signature to find a match or at least characterize the unknown signature from among all the entries in the library with a single regression or simple projection into the PC space, so as to substantially reduce processing time and computing resources and enable real-time characterization and/or identification.
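The patent describes projecting an unknown spectrum into a principal-component space built from a spectral library and matching by proximity, so identification needs only a single projection rather than a full library scan. A compact numpy sketch of that idea, with a random stand-in library; the channel count and number of retained components are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
library = rng.random((500, 1024))        # 500 reference spectra, 1024 channels
mean = library.mean(axis=0)
# PCA via SVD of the mean-centered library.
_, _, vt = np.linalg.svd(library - mean, full_matrices=False)
pcs = vt[:20]                            # keep 20 principal components

def signature(spectrum):
    """Project a spectrum into PC space: its concise library 'signature'."""
    return pcs @ (spectrum - mean)

lib_sigs = (library - mean) @ pcs.T      # index the whole library once

def identify(unknown):
    """Nearest library entry to the unknown's projection (one pass, no regression)."""
    d = np.linalg.norm(lib_sigs - signature(unknown), axis=1)
    return int(np.argmin(d)), float(d.min())

print(identify(library[123] + 0.01 * rng.standard_normal(1024)))  # -> entry 123
```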
Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W
2011-01-01
Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only 2 features, nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence timing of cell division.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Kevin J.; Wright, Bob W.; Jarman, Kristin H.
2003-05-09
A rapid retention time alignment algorithm was developed as a preprocessing utility to be used prior to chemometric analysis of large datasets of diesel fuel gas chromatographic profiles. Retention time variation from chromatogram-to-chromatogram has been a significant impediment against the use of chemometric techniques in the analysis of chromatographic data due to the inability of current multivariate techniques to correctly model information that shifts from variable to variable within a dataset. The algorithm developed is shown to increase the efficacy of pattern recognition methods applied to a set of diesel fuel chromatograms by retaining chemical selectivity while reducing chromatogram-to-chromatogram retention time variations and to do so on a time scale that makes analysis of large sets of chromatographic data practical.
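The paper's own algorithm is not reproduced here, but the simplest form of retention-time alignment, shifting each chromatogram by the lag that maximizes its cross-correlation with a reference, can be sketched as follows (real alignment is often done piecewise over retention-time windows):

```python
import numpy as np

def align_to_reference(chrom, reference):
    """Shift `chrom` by the integer lag maximizing cross-correlation with `reference`."""
    xc = np.correlate(chrom - chrom.mean(), reference - reference.mean(), mode="full")
    lag = np.argmax(xc) - (len(reference) - 1)
    return np.roll(chrom, -lag), lag

# Reference peak at scan 300; the sample is shifted by +7 scans.
t = np.arange(1000)
ref = np.exp(-0.5 * ((t - 300) / 5.0) ** 2)
sample = np.roll(ref, 7) + 0.01 * np.random.randn(1000)
aligned, lag = align_to_reference(sample, ref)
print(lag)  # ~7
```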
NASA Astrophysics Data System (ADS)
Herath, Narmada; Del Vecchio, Domitilla
2018-03-01
Biochemical reaction networks often involve reactions that take place on different time scales, giving rise to "slow" and "fast" system variables. This property is widely used in the analysis of systems to obtain dynamical models with reduced dimensions. In this paper, we consider stochastic dynamics of biochemical reaction networks modeled using the Linear Noise Approximation (LNA). Under time-scale separation conditions, we obtain a reduced-order LNA that approximates both the slow and fast variables in the system. We mathematically prove that the first and second moments of this reduced-order model converge to those of the full system as the time-scale separation becomes large. These mathematical results, in particular, provide a rigorous justification to the accuracy of LNA models derived using the stochastic total quasi-steady state approximation (tQSSA). Since, in contrast to the stochastic tQSSA, our reduced-order model also provides approximations for the fast variable stochastic properties, we term our method the "stochastic tQSSA+". Finally, we demonstrate the application of our approach on two biochemical network motifs found in gene-regulatory and signal transduction networks.
Ikemi, A
1988-01-01
Experiments were conducted to investigate the psychophysiological effects of self-regulation method (SRM), a newly developed method of self-control, using EEG frequency analysis and contingent negative variations (CNV). The results of the EEG frequency analysis showed that there is a significant increase in the percentage (power) of the theta-band and a significant decrease in the percentage (power) of the beta-band during SRM. Moreover, the results of an identical experiment conducted on subjects in a drowsy state showed that the changes in EEG frequencies during SRM can be differentiated from those of a drowsy state. Furthermore, experiments using CNV showed that there is a significant reduction of CNV amplitude during SRM. Despite the reduced amplitude during SRM, the number of errors in a task to evoke the CNV was reduced significantly without significant delay of reaction time. When an identical experiment was conducted in a drowsy state, CNV amplitude was reduced significantly, but reaction time and errors increased. From these experiments, the state of vigilance during SRM was discussed as a state of 'relaxed alertness'.
Improved robustness and performance of discrete time sliding mode control systems.
Chakrabarty, Sohom; Bartoszewicz, Andrzej
2016-11-01
This paper presents a theoretical analysis along with simulations to show that increased robustness can be achieved for discrete time sliding mode control systems by choosing the sliding variable, or the output, to be of relative degree two instead of relative degree one. In other words it successfully reduces the ultimate bound of the sliding variable compared to the ultimate bound for standard discrete time sliding mode control systems. It is also found out that for such a selection of relative degree two output of the discrete time system, the reduced order system during sliding becomes finite time stable in absence of disturbance. With disturbance, it becomes finite time ultimately bounded. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
Okubo, Yoshiro; Schoene, Daniel; Lord, Stephen R
2017-04-01
To examine the effects of stepping interventions on fall risk factors and fall incidence in older people. Electronic databases (PubMed, EMBASE, CINAHL, Cochrane, CENTRAL) and reference lists of included articles from inception to March 2015. Randomised (RCT) or clinical controlled trials (CCT) of volitional and reactive stepping interventions that included older (minimum age 60) people providing data on falls or fall risk factors. Meta-analyses of seven RCTs (n=660) showed that the stepping interventions significantly reduced the rate of falls (rate ratio=0.48, 95% CI 0.36 to 0.65, p<0.0001, I 2 =0%) and the proportion of fallers (risk ratio=0.51, 95% CI 0.38 to 0.68, p<0.0001, I 2 =0%). Subgroup analyses stratified by reactive and volitional stepping interventions revealed a similar efficacy for rate of falls and proportion of fallers. A meta-analysis of two RCTs (n=62) showed that stepping interventions significantly reduced laboratory-induced falls, and meta-analysis findings of up to five RCTs and CCTs (n=36-416) revealed that stepping interventions significantly improved simple and choice stepping reaction time, single leg stance, timed up and go performance (p<0.05), but not measures of strength. The findings indicate that both reactive and volitional stepping interventions reduce falls among older adults by approximately 50%. This clinically significant reduction may be due to improvements in reaction time, gait, balance and balance recovery but not in strength. Further high-quality studies aimed at maximising the effectiveness and feasibility of stepping interventions are required. CRD42015017357. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Zaydfudim, Victor; Dossett, Lesly A; Starmer, John M; Arbogast, Patrick G; Feurer, Irene D; Ray, Wayne A; May, Addison K; Pinson, C Wright
2009-07-01
Ventilator-associated pneumonia (VAP) causes significant morbidity and mortality in critically ill surgical patients. Recent studies suggest that the success of preventive measures is dependent on compliance with ventilator bundle parameters. Implementation of an electronic dashboard will improve compliance with the bundle parameters and reduce rates of VAP in our surgical intensive care unit (SICU). Time series analysis of VAP rates between January 2005 and July 2008, with dashboard implementation in July 2007. Multidisciplinary SICU at a tertiary-care referral center with a stable case mix during the study period. Patients admitted to the SICU between January 2005 and July 2008. Infection control data were used to establish rates of VAP and total ventilator days. For the time series analysis, VAP rates were calculated as quarterly VAP events per 1000 ventilator days. Ventilator bundle compliance was analyzed after dashboard implementation. Differences between expected and observed VAP rates based on time series analysis were used to estimate the effect of intervention. Average compliance with the ventilator bundle improved from 39% in August 2007 to 89% in July 2008 (P < .001). Rates of VAP decreased from a mean (SD) of 15.2 (7.0) to 9.3 (4.9) events per 1000 ventilator days after introduction of the dashboard (P = .01). Quarterly VAP rates were significantly reduced in the November 2007 through January 2008 and February through April 2008 periods (P < .05). For the August through October 2007 and May through July 2008 quarters, the observed rate reduction was not statistically significant. Implementation of an electronic dashboard improved compliance with ventilator bundle measures and is associated with reduced rates of VAP in our SICU.
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-01-01
Distributed Computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were collected and examined in a detailed analysis report. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs. PMID:27589753
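The TPR details are in the paper; the general idea of predicting a task's finish time by fitting separate linear phases to its progress history can be sketched as below. The fixed phase-split point is an assumption for illustration, not the paper's criterion:

```python
import numpy as np

def predict_finish(times, progress, split=0.33):
    """Two-phase fit of progress vs. time; extrapolate the later phase to progress = 1."""
    times, progress = np.asarray(times, float), np.asarray(progress, float)
    late = progress >= split                  # phase-2 points only
    slope, intercept = np.polyfit(times[late], progress[late], 1)
    return (1.0 - intercept) / slope          # time at which progress reaches 1.0

# A task that starts slowly (setup phase) then progresses steadily.
t = np.arange(0, 60, 5.0)
p = np.clip(np.where(t < 20, 0.005 * t, 0.1 + 0.012 * (t - 20)), 0, 1)
print(predict_finish(t, p))  # ~95 s under the phase-2 trend
```

A single-line fit over all points would be dragged down by the slow setup phase and overestimate the finish time; splitting into phases is what lets the estimate track the current trend.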
Optimizing Scientist Time through In Situ Visualization and Analysis.
Patchett, John; Ahrens, James
2018-01-01
In situ processing produces reduced-size persistent representations of a simulation's state while the simulation is running. The need for in situ visualization and data analysis is usually described in terms of supercomputer size and performance in relation to available storage size.
Fitzsimons, Claire F; Kirk, Alison; Baker, Graham; Michie, Fraser; Kane, Catherine; Mutrie, Nanette
2013-11-01
Sedentary behaviours have been linked to poor health, independent of physical activity levels. The objective of this study was to explore an individualised intervention strategy aimed at reducing sedentary behaviours in older Scottish adults. This feasibility and pilot study used a pre-experimental (one group pretest-posttest) design. Participants were enrolled into the study in January-March 2012 and data analysis was completed April-October 2012. The study was based in Glasgow, Scotland. Participants received an individualised consultation targeting sedentary behaviour incorporating feedback from an activPAL activity monitor. Outcome measures were objectively (activPAL) and subjectively measured (Sedentary Behaviour Questionnaire) sedentary time. Twenty four participants received the intervention. Objectively measured total time spent sitting/lying was reduced by 24 min/day (p=0.042), a reduction of 2.2%. Total time spent in stepping activities, such as walking, increased by 13 min/day (p=0.044). Self-report data suggested participants achieved behaviour change by reducing time spent watching television and/or using motorised transport. Interventions to reduce sedentary behaviours in older people are urgently needed. The results of this feasibility and pilot study suggest a consultation approach may help individuals reduce time spent in sedentary behaviours. A larger, controlled trial is warranted with a diverse sample to increase generalisability. © 2013.
NASA Astrophysics Data System (ADS)
Nagai, Toshiki; Mitsutake, Ayori; Takano, Hiroshi
2013-02-01
A new relaxation mode analysis method, which is referred to as the principal component relaxation mode analysis method, has been proposed to handle a large number of degrees of freedom of protein systems. In this method, principal component analysis is carried out first and then relaxation mode analysis is applied to a small number of principal components with large fluctuations. To reduce the contribution of fast relaxation modes in these principal components efficiently, we have also proposed a relaxation mode analysis method using multiple evolution times. The principal component relaxation mode analysis method using two evolution times has been applied to an all-atom molecular dynamics simulation of human lysozyme in aqueous solution. Slow relaxation modes and corresponding relaxation times have been appropriately estimated, demonstrating that the method is applicable to protein systems.
Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N
2007-10-01
Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.
Risk, Issues and Lessons Learned: Maximizing Risk Management in the DoD Ground Domain
2011-10-01
Briefing slides; only fragments are recoverable from the extraction residue: a Carnegie Mellon University "Risk Management Overview for TACOM" notes that risk management is a proactive approach to preventing failures, and the benefits of FMEAs include preventing major risks, reducing failures, minimizing cost, and reducing development time (do it right the first time).
MaPLE: A MapReduce Pipeline for Lattice-based Evaluation and Its Application to SNOMED CT
Zhang, Guo-Qiang; Zhu, Wei; Sun, Mengmeng; Tao, Shiqiang; Bodenreider, Olivier; Cui, Licong
2015-01-01
Non-lattice fragments are often indicative of structural anomalies in ontological systems and, as such, represent possible areas of focus for subsequent quality assurance work. However, extracting the non-lattice fragments in large ontological systems is computationally expensive if not prohibitive, using a traditional sequential approach. In this paper we present a general MapReduce pipeline, called MaPLE (MapReduce Pipeline for Lattice-based Evaluation), for extracting non-lattice fragments in large partially ordered sets and demonstrate its applicability in ontology quality assurance. Using MaPLE in a 30-node Hadoop local cloud, we systematically extracted non-lattice fragments in 8 SNOMED CT versions from 2009 to 2014 (each containing over 300k concepts), with an average total computing time of less than 3 hours per version. With dramatically reduced time, MaPLE makes it feasible not only to perform exhaustive structural analysis of large ontological hierarchies, but also to systematically track structural changes between versions. Our change analysis showed that the average change rates on the non-lattice pairs are up to 38.6 times higher than the change rates of the background structure (concept nodes). This demonstrates that fragments around non-lattice pairs exhibit significantly higher rates of change in the process of ontological evolution. PMID:25705725
MaPLE: A MapReduce Pipeline for Lattice-based Evaluation and Its Application to SNOMED CT.
Zhang, Guo-Qiang; Zhu, Wei; Sun, Mengmeng; Tao, Shiqiang; Bodenreider, Olivier; Cui, Licong
2014-10-01
Non-lattice fragments are often indicative of structural anomalies in ontological systems and, as such, represent possible areas of focus for subsequent quality assurance work. However, extracting the non-lattice fragments in large ontological systems is computationally expensive if not prohibitive, using a traditional sequential approach. In this paper we present a general MapReduce pipeline, called MaPLE (MapReduce Pipeline for Lattice-based Evaluation), for extracting non-lattice fragments in large partially ordered sets and demonstrate its applicability in ontology quality assurance. Using MaPLE in a 30-node Hadoop local cloud, we systematically extracted non-lattice fragments in 8 SNOMED CT versions from 2009 to 2014 (each containing over 300k concepts), with an average total computing time of less than 3 hours per version. With dramatically reduced time, MaPLE makes it feasible not only to perform exhaustive structural analysis of large ontological hierarchies, but also to systematically track structural changes between versions. Our change analysis showed that the average change rates on the non-lattice pairs are up to 38.6 times higher than the change rates of the background structure (concept nodes). This demonstrates that fragments around non-lattice pairs exhibit significantly higher rates of change in the process of ontological evolution.
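For a small hierarchy, the lattice property MaPLE checks can be tested directly: a pair of concepts is a non-lattice pair when its set of minimal common ancestors (dually, maximal common descendants) has more than one element. A pure-Python sketch on a toy is-a graph; SNOMED CT-scale versions need the distributed pipeline described above:

```python
from functools import lru_cache

# Toy is-a hierarchy: child -> set of parents.
PARENTS = {"a": {"x", "y"}, "b": {"x", "y"}, "x": {"root"},
           "y": {"root"}, "root": set()}

@lru_cache(maxsize=None)
def ancestors(c):
    """All strict ancestors of concept c (transitive closure of is-a)."""
    out = set()
    for p in PARENTS[c]:
        out |= {p} | ancestors(p)
    return frozenset(out)

def minimal_common_ancestors(a, b):
    """Common ancestors having no other common ancestor below them."""
    common = ancestors(a) & ancestors(b)
    return {c for c in common if not any(c in ancestors(d) for d in common)}

mca = minimal_common_ancestors("a", "b")
print(mca, "non-lattice" if len(mca) > 1 else "lattice")  # {'x', 'y'} non-lattice
```

Here "a" and "b" share two incomparable minimal ancestors, "x" and "y", so the pair is non-lattice; the all-pairs version of this check over 300k concepts is what makes the MapReduce formulation necessary.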
Solar electric geocentric transfer with attitude constraints: Analysis
NASA Technical Reports Server (NTRS)
Sackett, L. L.; Malchow, H. L.; Delbaum, T. N.
1975-01-01
A time-optimal or nearly time-optimal trajectory program was developed for solar electric geocentric transfer with or without attitude constraints and with an optional initial high-thrust stage. The method of averaging reduces computation time. A nonsingular set of orbital elements is used. The constraints, which are those of one of the SERT-C designs, introduce complexities into the analysis, and the solution yields possible discontinuous changes in thrust direction. The power degradation due to Van Allen radiation is modeled analytically. A wide range of solar cell characteristics is assumed. Effects such as oblateness and shadowing are included. The analysis and the results of many example runs are presented.
USDA-ARS?s Scientific Manuscript database
The objective of this analysis is to estimate and compare the cost-effectiveness of on- and off-field approaches to reducing nitrogen loadings. On-field practices include improving the timing, rate, and method of nitrogen application. Off-field practices include restoring wetlands and establishing v...
Aeroelastic Modeling of X-56A Stiff-Wing Configuration Flight Test Data
NASA Technical Reports Server (NTRS)
Grauer, Jared A.; Boucher, Matthew J.
2017-01-01
Aeroelastic stability and control derivatives for the X-56A Multi-Utility Technology Testbed (MUTT), in the stiff-wing configuration, were estimated from flight test data using the output-error method. Practical aspects of the analysis are discussed. The orthogonal phase-optimized multisine inputs provided excellent data information for aeroelastic modeling. Consistent parameter estimates were determined using output error in both the frequency and time domains. The frequency domain analysis converged faster and was less sensitive to starting values for the model parameters, which was useful for determining the aeroelastic model structure and obtaining starting values for the time domain analysis. Including a modal description of the structure from a finite element model reduced the complexity of the estimation problem and improved the modeling results. Effects of reducing the model order on the short period stability and control derivatives were investigated.
Performance-based, cost- and time-effective pcb analytical methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alvarado, J. S.
1998-06-11
Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.
A Reduced Order Model for Whole-Chip Thermal Analysis of Microfluidic Lab-on-a-Chip Systems
Wang, Yi; Song, Hongjun; Pant, Kapil
2013-01-01
This paper presents a Krylov subspace projection-based Reduced Order Model (ROM) for whole microfluidic chip thermal analysis, including conjugate heat transfer. Two key steps in the reduced order modeling procedure are described in detail, including (1) the acquisition of a 3D full-scale computational model in the state-space form to capture the dynamic thermal behavior of the entire microfluidic chip; and (2) the model order reduction using the Block Arnoldi algorithm to markedly lower the dimension of the full-scale model. Case studies using practically relevant thermal microfluidic chip are undertaken to establish the capability and to evaluate the computational performance of the reduced order modeling technique. The ROM is compared against the full-scale model and exhibits good agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) and over three orders-of-magnitude acceleration in computational speed. The salient model reusability and real-time simulation capability renders it amenable for operational optimization and in-line thermal control and management of microfluidic systems and devices. PMID:24443647
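The Block Arnoldi reduction used here follows the standard Krylov projection pattern: build an orthonormal basis V of a Krylov subspace and project the state-space matrices onto it. A single-input (non-block) sketch with numpy; a true block version orthonormalizes groups of columns per step, and moment matching about a chosen frequency would use shifted/inverted matrices:

```python
import numpy as np

def arnoldi_basis(A, b, m):
    """Orthonormal basis of the Krylov subspace span{b, Ab, ..., A^(m-1) b}."""
    V = np.zeros((len(b), m))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(1, m):
        w = A @ V[:, j - 1]
        for i in range(j):                       # modified Gram-Schmidt
            w -= (V[:, i] @ w) * V[:, i]
        V[:, j] = w / np.linalg.norm(w)
    return V

# Full model: x' = A x + b u, y = c x  ->  reduced model of order m.
rng = np.random.default_rng(1)
n, m = 2000, 20
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n)) / np.sqrt(n)  # stable-ish
b, c = rng.standard_normal(n), rng.standard_normal(n)
V = arnoldi_basis(A, b, m)
A_r, b_r, c_r = V.T @ A @ V, V.T @ b, V.T @ c    # projected (reduced) system
print(A_r.shape)                                  # (20, 20)
```

The reduced (A_r, b_r, c_r) can then be integrated in milliseconds, which is the source of the three-orders-of-magnitude speedup reported for the thermal model.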
Reducing neural network training time with parallel processing
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Lamarsh, William J., II
1995-01-01
Obtaining optimal solutions for engineering design problems is often expensive because the process typically requires numerous iterations involving analysis and optimization programs. Previous research has shown that a near optimum solution can be obtained in less time by simulating a slow, expensive analysis with a fast, inexpensive neural network. A new approach has been developed to further reduce this time. This approach decomposes a large neural network into many smaller neural networks that can be trained in parallel. Guidelines are developed to avoid some of the pitfalls when training smaller neural networks in parallel. These guidelines allow the engineer: to determine the number of nodes on the hidden layer of the smaller neural networks; to choose the initial training weights; and to select a network configuration that will capture the interactions among the smaller neural networks. This paper presents results describing how these guidelines are developed.
On-Line Robust Modal Stability Prediction using Wavelet Processing
NASA Technical Reports Server (NTRS)
Brenner, Martin J.; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.
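Nonparametric wavelet filtering of a noisy response, of the kind used here to suppress disturbances before model validation, can be sketched with PyWavelets by soft-thresholding the detail coefficients. The wavelet choice and universal-threshold rule are generic defaults, not the paper's settings:

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=5):
    """Soft-threshold detail coefficients (universal threshold) and reconstruct."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate, finest scale
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

# A decaying 4 Hz structural mode buried in measurement noise.
t = np.linspace(0, 10, 4096)
signal = np.exp(-0.2 * t) * np.sin(2 * np.pi * 4 * t)
clean = wavelet_denoise(signal + 0.3 * np.random.randn(t.size))
print(np.std(clean - signal))  # residual error after denoising
```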
Ho, Chi-Kung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te
2017-01-01
Background This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). Methods A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. Results There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the group transferred through the protocol compared with the traditional referral process group (both p < 0.05). There were also no remarkable differences in the complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were elderly patients, advanced Killip score, and higher level of troponin-I. Conclusions This study showed that patients transferred through our present protocol had reduced pain to electrocardiography and catheterization laboratory to balloon times in Killip I/II and III/IV patients, respectively. However, this study showed that using a cloud computing system in our present protocol did not reduce DTB time. PMID:28900621
Koch Hansen, Lars; Mohammed, Anna; Pedersen, Magnus; Folkestad, Lars; Brodersen, Jacob; Hey, Thomas; Lyhne Christensen, Nicolaj; Carter-Storch, Rasmus; Bendix, Kristoffer; Hansen, Morten R; Brabrand, Mikkel
2016-12-01
Reducing hands-off time during cardiopulmonary resuscitation (CPR) is believed to increase survival after cardiac arrest because organ perfusion is sustained. The aim of our study was to investigate whether charging the defibrillator before rhythm analysis and shock delivery significantly reduced hands-off time compared with the European Resuscitation Council (ERC) 2010 CPR guideline algorithm in full-scale cardiac arrest scenarios. The study was designed as a full-scale cardiac arrest simulation study including administration of drugs. Participants were randomized to the Stop-Only-While-Shocking (SOWS) algorithm or the ERC2010 algorithm. In SOWS, chest compressions were only interrupted for a post-charging rhythm analysis and immediate shock delivery. A Resusci Anne HLR-D manikin and a LIFEPACK 20 defibrillator were used. The manikin recorded time and chest compressions. A sample size calculation with an α of 0.05 and 80% power showed that we should test four scenarios with each algorithm. Twenty-nine physicians participated in 11 scenarios. Hands-off time was significantly reduced by 17% using the SOWS algorithm compared with ERC2010 [22.1% (SD 2.3) hands-off time vs. 26.6% (SD 4.8); P<0.05]. In full-scale cardiac arrest simulations, a minor change consisting of charging the defibrillator before rhythm check reduces hands-off time by 17% compared with ERC2010 guidelines.
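For context on the sample-size step, a sketch of an a-priori power calculation with statsmodels, assuming a two-sample t-test and an illustrative (hypothetical) standardized effect size; the trial's actual calculation details are not given in the abstract.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical standardized effect size for the hands-off-time difference.
effect_size = 2.5   # a large between-group effect is assumed here

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=effect_size,
                                   alpha=0.05, power=0.80,
                                   alternative='two-sided')
print(f"scenarios required per algorithm: {n_per_group:.1f}")
```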
Software Aids Visualization of Computed Unsteady Flow
NASA Technical Reports Server (NTRS)
Kao, David; Kenwright, David
2003-01-01
Unsteady Flow Analysis Toolkit (UFAT) is a computer program that synthesizes motions of time-dependent flows represented by very large sets of data generated in computational fluid dynamics simulations. Prior to the development of UFAT, it was necessary to rely on static, single-snapshot depictions of time-dependent flows generated by flow-visualization software designed for steady flows. Whereas it typically takes weeks to analyze the results of a large-scale unsteady-flow simulation by use of steady-flow visualization software, the analysis time is reduced to hours when UFAT is used. UFAT can be used to generate graphical objects of flow visualization results using multi-block curvilinear grids in the format of a previously developed NASA data-visualization program, PLOT3D. These graphical objects can be rendered using FAST, another popular flow visualization software developed at NASA. Flow-visualization techniques that can be exploited by use of UFAT include time-dependent tracking of particles, detection of vortex cores, extraction of stream ribbons and surfaces, and tetrahedral decomposition for optimal particle tracking. Unique computational features of UFAT include capabilities for automatic (batch) processing, restart, memory mapping, and parallel processing. These capabilities significantly reduce analysis time and storage requirements, relative to those of prior flow-visualization software. UFAT can be executed on a variety of supercomputers.
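A toy sketch of the core operation behind time-dependent particle tracking: advecting a particle through an unsteady velocity field with fourth-order Runge-Kutta integration. The analytic vortex field is invented for illustration; UFAT itself operates on PLOT3D multi-block curvilinear grids.

```python
import numpy as np

def velocity(p, t):
    # Hypothetical unsteady 2-D vortex field standing in for CFD output.
    x, y = p
    swirl = 1.0 + 0.5 * np.sin(2 * np.pi * t)
    return np.array([-swirl * y, swirl * x])

def rk4_step(p, t, dt):
    # Classic RK4 update used for particle path-line integration.
    k1 = velocity(p, t)
    k2 = velocity(p + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = velocity(p + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = velocity(p + dt * k3, t + dt)
    return p + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Trace one particle through the unsteady flow.
p, dt = np.array([1.0, 0.0]), 0.01
path = [p]
for step in range(500):
    p = rk4_step(p, step * dt, dt)
    path.append(p)
print(np.array(path)[-1])
```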
Shao, Liyang; Zhang, Lianjun; Zhen, Zhen
2017-01-01
Children's blood lead concentrations have been closely monitored over the last two decades in the United States. The bio-monitoring surveillance data collected in local agencies reflected the local temporal trends of children's blood lead levels (BLLs). However, the analysis and modeling of the long-term time series of BLLs have rarely been reported. We attempted to quantify the long-term trends of children's BLLs in the city of Syracuse, New York and evaluate the impacts of local lead poisoning prevention programs and the Lead Hazard Control Program on reducing children's BLLs. We applied interrupted time series analysis to the monthly time series of BLL surveillance data and used ARMA (autoregressive and moving average) models to measure the average blood lead level shift and detect changes in the seasonal pattern. Our results showed that there were three intervention stages over the past 20 years to reduce children's BLLs in the city of Syracuse, NY. The average of children's BLLs decreased significantly after the interventions, declining from 8.77 μg/dL to 3.94 μg/dL during 1992 to 2011. The seasonal variation diminished over the past decade, though more short-term influences appeared in the variation. The lead hazard control treatment intervention proved effective in reducing children's blood lead levels in Syracuse, NY. Also, the reduction of the seasonal variation of children's BLLs reflected the impacts of the local lead-based paint mitigation program. Window and door replacement was the major cost of lead house abatement. However, soil lead was not considered a major source of lead hazard in our analysis. PMID:28182688
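A condensed sketch of an interrupted time-series fit, assuming a monthly series and a single step-intervention dummy; it uses a statsmodels ARMA-type state-space model rather than the authors' exact specification, and the simulated data are purely illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
n, break_point = 240, 120                    # 20 years of monthly data
level = np.where(np.arange(n) < break_point, 8.8, 3.9)   # pre/post means
y = level + rng.standard_normal(n)           # illustrative BLL-like series

# Step dummy coding the post-intervention period.
intervention = (np.arange(n) >= break_point).astype(float)
exog = pd.DataFrame({"intervention": intervention})

# Regression on the intervention step with ARMA(1,1) errors.
model = SARIMAX(y, exog=exog, order=(1, 0, 1))
fit = model.fit(disp=False)
print(fit.params)                            # includes the estimated level shift
```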
Tracking and imaging humans on heterogeneous infrared sensor arrays for law enforcement applications
NASA Astrophysics Data System (ADS)
Feller, Steven D.; Zheng, Y.; Cull, Evan; Brady, David J.
2002-08-01
We present a plan for the integration of geometric constraints in the source, sensor, and analysis levels of sensor networks. The goal of geometric analysis is to reduce the dimensionality and complexity of distributed sensor data analysis so as to achieve real-time recognition of, and response to, significant events. Application scenarios include biometric tracking of individuals, counting and analysis of individuals in groups, and distributed sentient environments. We are particularly interested in using this approach to provide networks of low-cost point detectors, such as infrared motion detectors, with complex imaging capabilities. By extending the capabilities of simple sensors, we expect to reduce the cost of perimeter and site security applications.
Effect of cooking time on some nutrient and antinutrient components of bambara groundnut seeds.
Omoikhoje, Stanley Omoh; Aruna, Mohammed Bashiru; Bamgbose, Adeyemi Mustapha
2009-02-01
The proximate composition, gross energy, mineral composition, percentage sugar, oligosaccharides and antinutrient substances of bambara groundnut seeds subjected to different cooking times were determined. The seeds were cooked for 30, 60, 90 and 120 min. Results of the proximate analysis showed that only the ether extract and ash were significantly (P < 0.05) reduced as the cooking time increased. In contrast, gross energy values significantly (P < 0.05) increased with increased cooking time. Among the mineral elements assayed, calcium, magnesium and iron were significantly (P < 0.05) increased, while phosphorus, potassium, sodium and copper were significantly (P < 0.05) reduced with increased cooking time. Percentage sucrose and glucose of bambara groundnut seeds were significantly (P < 0.05) lowest in the raw form, but increased progressively with increased cooking time. Raffinose and stachyose levels were significantly (P < 0.05) reduced by increased cooking time, with the least values in seeds cooked for 120 min. Trypsin inhibitor, hemagglutinin and tannin were completely eliminated in seeds cooked for 60 min or longer, but the phytin level was reduced significantly (P < 0.05) by cooking. For significant detoxification of antinutrient substances and optimal bioavailability of the component nutrients of bambara groundnut seeds, an optimum cooking time of 60 min at 100 degrees C is therefore recommended.
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
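In the spirit of the reduced, feature-based representations described here, a small sketch: summarize each series by a handful of generic features and use them for classification; the three features and toy series classes stand in for the thousands of methods and datasets in the actual collection.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

def features(x):
    # Reduced representation: a few generic properties of the series.
    ac1 = np.corrcoef(x[:-1], x[1:])[0, 1]        # lag-1 autocorrelation
    return np.array([np.std(x), ac1, np.mean(np.abs(np.diff(x)))])

def make_series(kind):
    if kind == 0:                                  # noisy sine
        return np.sin(np.linspace(0, 20, 200)) + 0.3 * rng.standard_normal(200)
    return np.cumsum(rng.standard_normal(200))     # random walk

X = np.array([features(make_series(k % 2)) for k in range(200)])
y = np.array([k % 2 for k in range(200)])

clf = RandomForestClassifier(random_state=0).fit(X[:150], y[:150])
print(clf.score(X[150:], y[150:]))                 # held-out accuracy
```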
ERIC Educational Resources Information Center
Brown, Kristine M.
2009-01-01
This paper exploits a major, unanticipated reform of the California teachers' pension to provide quasi-experimental evidence on the link between pension generosity and retirement timing. Using two large administrative datasets, the author conducts a reduced-form analysis of the pension reform and estimates a structural model of retirement timing.…
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
Samson, Scott
2003-09-30
The objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
A collaborative approach to lean laboratory workstation design reduces wasted technologist travel.
Yerian, Lisa M; Seestadt, Joseph A; Gomez, Erron R; Marchant, Kandice K
2012-08-01
Lean methodologies have been applied in many industries to reduce waste. We applied Lean techniques to redesign laboratory workstations with the aim of reducing the number of times employees must leave their workstations to complete their tasks. At baseline in 68 workflows (aggregates or sequence of process steps) studied, 251 (38%) of 664 tasks required workers to walk away from their workstations. After analysis and redesign, only 59 (9%) of the 664 tasks required technologists to leave their workstations to complete these tasks. On average, 3.4 travel events were removed for each workstation. Time studies in a single laboratory section demonstrated that workers spend 8 to 70 seconds in travel each time they step away from the workstation. The redesigned workstations will allow employees to spend less time travelling around the laboratory. Additional benefits include employee training in waste identification, improved overall laboratory layout, and identification of other process improvement opportunities in our laboratory.
NASA Technical Reports Server (NTRS)
Bartels, Robert E.
2011-01-01
Launch vehicles frequently experience a reduced stability margin through the transonic Mach number range. This reduced stability margin is caused by an undamping of the aerodynamics in one of the lower-frequency flexible or rigid body modes. Analysis of the behavior of a flexible vehicle is routinely performed with quasi-steady aerodynamic lineloads derived from steady rigid computational fluid dynamics (CFD). However, a quasi-steady aeroelastic stability analysis can be unconservative at the critical Mach numbers where experiment or unsteady computational aeroelastic (CAE) analysis shows a reduced or even negative aerodynamic damping. This paper will present a method of enhancing the quasi-steady aeroelastic stability analysis of a launch vehicle with unsteady aerodynamics. The enhanced formulation uses unsteady CFD to compute the response of selected lower-frequency modes. The response is contained in a time history of the vehicle lineloads. A proper orthogonal decomposition of the unsteady aerodynamic lineload response is used to reduce the scale of the data volume, and system identification is used to derive the aerodynamic stiffness, damping and mass matrices. The results of the enhanced quasi-steady aeroelastic stability analysis are compared with the damping and frequency computed from unsteady CAE analysis and from a quasi-steady analysis. The results show that incorporating unsteady aerodynamics in this way brings the enhanced quasi-steady aeroelastic stability analysis into close agreement with the unsteady CAE analysis.
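A compact sketch of the proper orthogonal decomposition step, assuming the unsteady lineloads are stored as a snapshot matrix (stations × time steps); the SVD-based reduction shown is the standard form, with synthetic data in place of CFD output.

```python
import numpy as np

rng = np.random.default_rng(4)
n_stations, n_steps = 200, 1000
t = np.linspace(0.0, 10.0, n_steps)
x = np.linspace(0.0, 1.0, n_stations)[:, None]

# Synthetic line-load history: two oscillating spatial patterns plus noise.
loads = (np.sin(np.pi * x) * np.cos(2 * np.pi * 1.5 * t)
         + 0.4 * np.sin(3 * np.pi * x) * np.cos(2 * np.pi * 4.0 * t)
         + 0.01 * rng.standard_normal((n_stations, n_steps)))

# POD via SVD of the mean-removed snapshot matrix.
mean = loads.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(loads - mean, full_matrices=False)

energy = s**2 / np.sum(s**2)
r = int(np.searchsorted(np.cumsum(energy), 0.99)) + 1   # modes for 99% energy
print(f"rank-{r} basis captures {np.cumsum(energy)[r-1]:.4f} of the energy")

# Reduced temporal coefficients, the input to system identification.
coeffs = np.diag(s[:r]) @ Vt[:r]
print(coeffs.shape)
```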
Emerging spectra of singular correlation matrices under small power-map deformations
NASA Astrophysics Data System (ADS)
Vinayak; Schäfer, Rudi; Seligman, Thomas H.
2013-09-01
Correlation matrices are a standard tool in the analysis of the time evolution of complex systems in general and financial markets in particular. Yet most analyses assume stationarity of the underlying time series. This tends to be an assumption of varying and often dubious validity. The validity of the assumption improves as shorter time series are used. If many time series are used, this implies an analysis of highly singular correlation matrices. We attack this problem by using the so-called power map, which was introduced to reduce noise. Its nonlinearity breaks the degeneracy of the zero eigenvalues, and we analyze the sensitivity of the emerging spectra to correlations. This sensitivity will be demonstrated for uncorrelated and correlated Wishart ensembles.
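A short sketch of the power map itself, applied entrywise to a highly singular correlation matrix estimated from time series shorter than the number of series; the exponent value is illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n_series, n_times = 50, 20            # more series than observations -> singular C
data = rng.standard_normal((n_series, n_times))

C = np.corrcoef(data)                 # rank <= n_times - 1, so zero eigenvalues
eig_before = np.sort(np.linalg.eigvalsh(C))

# Power map: entrywise |C_ij|**q with the sign preserved (q = 1 leaves C unchanged).
q = 1.1
C_q = np.sign(C) * np.abs(C) ** q
eig_after = np.sort(np.linalg.eigvalsh(C_q))

# The nonlinearity breaks the degeneracy of the zero eigenvalues.
print(np.sum(np.isclose(eig_before, 0.0, atol=1e-10)))
print(eig_after[:5])                  # formerly degenerate eigenvalues spread out
```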
Effects of Parenting Programs on Child Maltreatment Prevention: A Meta-Analysis.
Chen, Mengtong; Chan, Ko Ling
2016-01-01
The objective of this study is to evaluate the effectiveness of parenting programs in reducing child maltreatment and modifying associated factors as well as to examine the moderator variables that are linked to program effects. For this meta-analysis, we searched nine electronic databases to identify randomized controlled trials published before September 2013. The effect sizes of various outcomes at different time points were computed. From the 3,578 studies identified, we selected 37 studies for further analysis. The total random effect size was 0.296. Our results showed that parenting programs successfully reduced substantiated and self-reported child maltreatment reports and reduced the potential for child maltreatment. The programs also reduced risk factors and enhanced protective factors associated with child maltreatment. However, the effects of the parenting programs on reducing parental depression and stress were limited. Parenting programs produced positive effects in low-, middle-, and high-income countries and were effective in reducing child maltreatment when applied as primary, secondary, or tertiary child maltreatment intervention. In conclusion, parenting programs are effective public health approaches to reduce child maltreatment. The evidence-based service of parenting programs could be widely adopted in future practice. © The Author(s) 2015.
Potisek, Nicholas M; Malone, Robb M; Shilliday, Betsy Bryant; Ives, Timothy J; Chelminski, Paul R; DeWalt, Darren A; Pignone, Michael P
2007-01-15
Patients with chronic conditions require frequent care visits. Problems can arise during several parts of the patient visit that decrease efficiency, making it difficult to effectively care for high volumes of patients. The purpose of this study was to test a method to improve patient visit efficiency. We used Patient Flow Analysis to identify inefficiencies in the patient visit, suggest areas for improvement, and test the effectiveness of clinic interventions. At baseline, the mean visit time for 93 anticoagulation clinic patient visits was 84 minutes (+/- 50 minutes) and the mean visit time for 25 chronic pain clinic patient visits was 65 minutes (+/- 21 minutes). Based on these data, we identified specific areas of inefficiency and developed interventions to decrease the mean time of the patient visit. After the interventions, follow-up data showed the mean visit time was reduced to 59 minutes (+/- 25 minutes) for the anticoagulation clinic, a decrease of 25 minutes (39%; p < 0.001). Mean visit time for the chronic pain clinic was reduced to 43 minutes (+/- 14 minutes), a decrease of 22 minutes (34%; p < 0.001). Patient Flow Analysis is an effective technique to identify inefficiencies in the patient visit and efficiently collect patient flow data. Once inefficiencies are identified, they can be improved through brief interventions.
Spada, Eva; Perego, Roberta; Sgamma, Elena Assunta; Proverbio, Daniela
2018-02-01
Feline immunodeficiency virus (FIV) and feline leukemia virus (FeLV) are among the most important feline infectious diseases worldwide. This retrospective study investigated survival times and the effects of selected predictor factors on survival time in a population of owned pet cats in Northern Italy testing positive for the presence of FIV antibodies and FeLV antigen. One hundred and three retrovirus-seropositive cats (53 FIV-seropositive, 40 FeLV-seropositive, and 10 FIV+FeLV-seropositive) were included in the study, along with a population of 103 retrovirus-seronegative age- and sex-matched cats. Survival time was calculated and compared between retrovirus-seronegative, FIV, FeLV and FIV+FeLV-seropositive cats using Kaplan-Meier survival analysis. Cox proportional-hazards regression analysis was used to study the effect of selected predictor factors (male gender, peripheral blood cytopenia as reduced red blood cell (RBC) count, leukopenia, neutropenia and lymphopenia, hypercreatininemia and reduced albumin to globulin ratio) on survival time in the retrovirus-seropositive populations. Median survival times for seronegative, FIV, FeLV and FIV+FeLV-seropositive cats were 3960, 2040, 714 and 77 days, respectively. Compared with retrovirus-seronegative cats, median survival time was significantly lower (P<0.000) in FeLV and FIV+FeLV-seropositive cats. Median survival time in FeLV and FIV+FeLV-seropositive cats was also significantly lower (P<0.000) when compared with FIV-seropositive cats, with the hazard ratio of death in FeLV and FIV+FeLV-seropositive cats being 3.4 and 7.4 times higher, respectively, in comparison with seronegative cats, and 2.3 and 4.8 times higher as compared with FIV-seropositive cats. A Cox proportional-hazards regression analysis showed that FIV and FeLV-seropositive cats with reduced RBC counts at the time of diagnosis of seropositivity had significantly shorter survival times than FIV and FeLV-seropositive cats with normal RBC counts at diagnosis. In summary, FIV-seropositive status did not significantly affect longevity of cats in this study, unlike FeLV and FIV+FeLV-seropositivity. Reduced RBC counts at the time of FIV and FeLV diagnosis could impact negatively on the longevity of seropositive cats, and therefore blood counts should always be evaluated at diagnosis and follow-up of retrovirus-seropositive cats. Copyright © 2017 Elsevier B.V. All rights reserved.
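A bare-bones sketch of this type of survival comparison using the lifelines package; the durations, censoring flags, group labels, and column names are all invented placeholders, not the study's data.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(6)
n = 200
group = rng.integers(0, 2, n)                       # 0 = seronegative, 1 = seropositive
baseline_days = np.where(group == 0, 3960.0, 714.0) # illustrative median scales
T = rng.exponential(baseline_days / np.log(2))      # survival times in days
E = (rng.random(n) < 0.8).astype(int)               # 1 = death observed, 0 = censored

# Kaplan-Meier curve per group.
kmf = KaplanMeierFitter()
for g, label in [(0, "seronegative"), (1, "seropositive")]:
    kmf.fit(T[group == g], event_observed=E[group == g], label=label)
    print(label, kmf.median_survival_time_)

# Cox proportional-hazards regression on group membership.
df = pd.DataFrame({"T": T, "E": E, "seropositive": group})
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(cph.summary[["exp(coef)"]])                   # hazard ratio for seropositivity
```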
Sanjay, Pandanaboyana; Watt, David G; Wigmore, Stephen J
2013-04-01
Fibrin sealants are frequently used in liver surgery to achieve intraoperative haemostasis and reduce post-operative haemorrhage and bile leak. This meta-analysis aimed to review the haemostatic and biliostatic capacity of fibrin sealants in elective liver surgery. An electronic search was performed on the MEDLINE, Embase and PubMed databases using both subject headings and truncated word searches to identify all published articles that are related to this topic. Pooled risk ratios were calculated for categorical outcomes, and mean differences for secondary continuous outcomes, using the fixed-effects and random-effects models for meta-analysis. Ten randomised controlled trials encompassing 1,225 patients were analysed to achieve a summated outcome. Pooled data analysis showed the use of fibrin sealants resulted in reduced time to haemostasis (mean difference -3.45 min [-3.78, -3.13] (P < 0.00001)) and increased numbers of patients with complete haemostasis (risk ratio 1.56, 95 % confidence interval 1.04-2.34, p = 0.03) when compared to controls. The use of fibrin sealants did not influence perioperative blood transfusion requirements, bile leak rates, post-operative haemorrhage, intra-abdominal collections and overall morbidity and mortality compared with controls. There is no solid evidence that the routine use of fibrin sealants reduces the incidence of post-operative haemorrhage or bile leak compared with other treatments. The use of fibrin sealants may reduce the time to haemostasis, but this does not translate to improved perioperative outcomes.
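To make the pooling step concrete, a sketch of fixed-effect inverse-variance pooling for a continuous outcome (mean difference in time to haemostasis), with made-up per-trial summaries; a full meta-analysis would add a random-effects model and heterogeneity statistics.

```python
import numpy as np

# Hypothetical per-trial mean differences (minutes) and their standard errors.
md = np.array([-3.2, -3.6, -3.5, -3.4])
se = np.array([0.30, 0.25, 0.40, 0.20])

# Fixed-effect inverse-variance pooling.
w = 1.0 / se**2
pooled = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled MD = {pooled:.2f} min, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```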
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
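A sketch of the preset pitch-window idea using librosa's pYIN tracker; the sex-specific frequency ranges are common defaults in the voice literature and are assumptions here, since the study's exact preset values are not given in the abstract, and the file path is hypothetical.

```python
import numpy as np
import librosa

# Hypothetical sex-specific pitch windows (Hz) applied as preset parameters.
PITCH_WINDOWS = {"male": (65.0, 260.0), "female": (100.0, 525.0)}

def mean_f0(path, sex):
    y, sr = librosa.load(path, sr=None)
    fmin, fmax = PITCH_WINDOWS[sex]
    f0, voiced, _ = librosa.pyin(y, fmin=fmin, fmax=fmax, sr=sr)
    return float(np.nanmean(f0[voiced]))   # average pitch over voiced frames

# Batch analysis replaces case-by-case hand tuning; the file name is a placeholder.
# print(mean_f0("sample_male.wav", "male"))
```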
Bacheré, N; Diene, G; Delagnes, V; Molinas, C; Moulin, P; Tauber, M
2008-01-01
To describe and evaluate the impact of very early diagnosis and multidisciplinary care on the evolution and care of infants presenting with Prader-Willi syndrome (PWS), 19 infants diagnosed with PWS before the second month of life were followed by a multidisciplinary team. Median age at the time of analysis was 3.1 years [range 0.4-6.5]. The data were compared with data collected in 1997 from 113 questionnaires filled out by members of the French PWS Association. The patients from this latter data set were 12.0 years old [range 4 months to 41 years] at the time of analysis, with a median age of 36 months at diagnosis. Hospitalization time was significantly reduced from 30.0 [range 0-670] to 21 [range 0-90] days (p = 0.043). The duration of gastric tube feeding was significantly reduced from 30.5 [range 0-427] to 15 [range 0-60] days (p = 0.017). Growth hormone treatment was started at a mean age of 1.9 +/- 0.5 years in 10 infants and L-thyroxine in 6 infants. Only 1 infant became obese, at 2.5 years. Early diagnosis combined with multidisciplinary care decreases hospitalization time and the duration of gastric tube feeding and prevents early obesity in PWS infants. (c) 2007 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Shokravi, H.; Bakhary, NH
2017-11-01
Subspace System Identification (SSI) is considered one of the most reliable tools for identification of system parameters. The performance of an SSI scheme is considerably affected by the structure of the associated identification algorithm. The weight matrix is a variable in SSI that is used to reduce the dimensionality of the state-space equation. Generally, one of the weight matrices of Principal Component (PC), Unweighted Principal Component (UPC) or Canonical Variate Analysis (CVA) is used in the structure of an SSI algorithm. An increasing number of studies in the field of structural health monitoring use SSI for damage identification. However, studies that evaluate the performance of the weight matrices, particularly with respect to accuracy, noise resistance, and time complexity, are very limited. In this study, the accuracy, noise-robustness, and time-efficiency of the weight matrices are compared using different qualitative and quantitative metrics. Three evaluation metrics of pole analysis, fit values and elapsed time are used in the assessment process. A numerical model of a mass-spring-dashpot and operational data are used in this research. It is observed that the principal components obtained using PC algorithms are more robust against noise uncertainty and give more stable results for the pole distribution. Furthermore, higher estimation accuracy is achieved using the UPC algorithm. CVA had the worst performance for pole analysis and time-efficiency analysis. The superior performance of the UPC algorithm in elapsed time is attributed to the use of unit weight matrices. The obtained results demonstrate that the process of reducing dimensionality in CVA and PC does not enhance the time efficiency but yields improved modal identification in PC.
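As a point of reference for the SSI variants compared here, a minimal covariance-driven subspace identification sketch using the UPC choice (identity weight matrices), run on a simulated two-mode system standing in for the paper's mass-spring-dashpot model; the block sizes, noise levels, and model order are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(7)
dt = 0.01

def mode_block(f_hz, zeta):
    # 2x2 discrete-time block for a decaying oscillation at f_hz with damping zeta.
    wn, wd = 2 * np.pi * f_hz, 2 * np.pi * f_hz * np.sqrt(1 - zeta**2)
    r = np.exp(-zeta * wn * dt)
    return r * np.array([[np.cos(wd * dt), -np.sin(wd * dt)],
                         [np.sin(wd * dt),  np.cos(wd * dt)]])

A = np.zeros((4, 4))
A[:2, :2] = mode_block(2.0, 0.02)
A[2:, 2:] = mode_block(5.0, 0.03)
C = rng.standard_normal((2, 4))

# Simulate a stochastic (output-only) response, as in operational modal analysis.
x, y = np.zeros(4), np.empty((2, 20000))
for k in range(20000):
    x = A @ x + 0.1 * rng.standard_normal(4)
    y[:, k] = C @ x + 0.01 * rng.standard_normal(2)

# Covariance-driven SSI with UPC weighting (identity weight matrices).
l, i, n = 2, 20, 4
N = y.shape[1]
R = [y[:, k:] @ y[:, :N - k].T / (N - k) for k in range(2 * i)]
H = np.block([[R[p + q + 1] for q in range(i)] for p in range(i)])
U, s, Vt = np.linalg.svd(H)
O = U[:, :n] * np.sqrt(s[:n])                   # observability matrix estimate
A_id = np.linalg.pinv(O[:-l]) @ O[l:]           # shift invariance of observability
lam = np.log(np.linalg.eigvals(A_id)) / dt      # continuous-time poles
print(np.sort(np.abs(lam.imag) / (2 * np.pi)))  # identified frequencies (Hz)
```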
Computational Methods for HSCT-Inlet Controls/CFD Interdisciplinary Research
NASA Technical Reports Server (NTRS)
Cole, Gary L.; Melcher, Kevin J.; Chicatelli, Amy K.; Hartley, Tom T.; Chung, Joongkee
1994-01-01
A program aimed at facilitating the use of computational fluid dynamics (CFD) simulations by the controls discipline is presented. The objective is to reduce the development time and cost for propulsion system controls by using CFD simulations to obtain high-fidelity system models for control design and as numerical test beds for control system testing and validation. An interdisciplinary team has been formed to develop analytical and computational tools in three discipline areas: controls, CFD, and computational technology. The controls effort has focused on specifying requirements for an interface between the controls specialist and CFD simulations and on a new method for extracting linear, reduced-order control models from CFD simulations. Existing CFD codes are being modified to permit time-accurate execution and provide realistic boundary conditions for controls studies. Parallel processing and distributed computing techniques, along with existing system integration software, are being used to reduce CFD execution times and to support the development of an integrated analysis/design system. This paper describes the initial application for the technology being developed, the high speed civil transport (HSCT) inlet control problem; the activities being pursued in each discipline area; and a prototype analysis/design system in place for interactive operation and visualization of a time-accurate HSCT-inlet simulation.
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1996-01-01
NASA's advanced propulsion system, the Space Shuttle Main Engine/Advanced Technology Development (SSME/ATD), has been undergoing extensive flight certification and developmental testing, which involves large numbers of health monitoring measurements. To enhance engine safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess its dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce the risk of catastrophic system failures, expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. During the development of the SSME, ASRI participated in the research and development of several advanced nonlinear signal diagnostic methods for health monitoring and failure prediction in turbomachinery components. However, due to the intensive computational requirements associated with such advanced analysis tasks, current SSME dynamic data analysis and diagnostic evaluation is performed off-line following flight or ground test, with a typical diagnostic turnaround time of one to two days. The objective of MSFC's MPP Prototype System is to eliminate this 'diagnostic lag time' by achieving signal processing and analysis in real time. Such an on-line diagnostic system can provide sufficient lead time to initiate corrective action and also enable efficient scheduling of inspection, maintenance and repair activities. The major objective of this project was to convert and implement a number of advanced nonlinear diagnostic DSP algorithms in a format consistent with that required for integration into the Vanderbilt Multigraph Architecture (MGA) Model Based Programming environment. This effort allows the real-time execution of these algorithms using the MSFC MPP Prototype System. ASRI has completed the software conversion and integration of a sequence of nonlinear signal analysis techniques specified in the SOW for real-time execution on MSFC's MPP Prototype. This report documents and summarizes the results of the contract tasks and provides the complete computer source code, including all FORTRAN/C utilities and all other supporting software libraries required for operation.
Flexible Launch Vehicle Stability Analysis Using Steady and Unsteady Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Bartels, Robert E.
2012-01-01
Launch vehicles frequently experience a reduced stability margin through the transonic Mach number range. This reduced stability margin can be caused by the aerodynamic undamping of one of the lower-frequency flexible or rigid body modes. Analysis of the behavior of a flexible vehicle is routinely performed with quasi-steady aerodynamic line loads derived from steady rigid aerodynamics. However, a quasi-steady aeroelastic stability analysis can be unconservative at the critical Mach numbers, where experiment or unsteady computational aeroelastic analysis shows a reduced or even negative aerodynamic damping. A method of enhancing the quasi-steady aeroelastic stability analysis of a launch vehicle with unsteady aerodynamics is developed that uses unsteady computational fluid dynamics to compute the response of selected lower-frequency modes. The response is contained in a time history of the vehicle line loads. A proper orthogonal decomposition of the unsteady aerodynamic line-load response is used to reduce the scale of the data volume, and system identification is used to derive the aerodynamic stiffness, damping, and mass matrices. The results are compared with the damping and frequency computed from unsteady computational aeroelasticity and from a quasi-steady analysis. The results show that incorporating unsteady aerodynamics in this way brings the enhanced quasi-steady aeroelastic stability analysis into close agreement with the unsteady computational aeroelastic results.
How does spatial extent of fMRI datasets affect independent component analysis decomposition?
Aragri, Adriana; Scarabino, Tommaso; Seifritz, Erich; Comani, Silvia; Cirillo, Sossio; Tedeschi, Gioacchino; Esposito, Fabrizio; Di Salle, Francesco
2006-09-01
Spatial independent component analysis (sICA) of functional magnetic resonance imaging (fMRI) time series can generate meaningful activation maps and associated descriptive signals, which are useful to evaluate datasets of the entire brain or selected portions of it. Besides computational implications, variations in the input dataset combined with the multivariate nature of ICA may lead to different spatial or temporal readouts of brain activation phenomena. By reducing and increasing a volume of interest (VOI), we applied sICA to different datasets from real activation experiments with multislice acquisition and single or multiple sensory-motor task-induced blood oxygenation level-dependent (BOLD) signal sources with different spatial and temporal structure. Using receiver operating characteristics (ROC) methodology for accuracy evaluation and multiple regression analysis as benchmark, we compared sICA decompositions of reduced and increased VOI fMRI time-series containing auditory, motor and hemifield visual activation occurring separately or simultaneously in time. Both approaches yielded valid results; however, the results of the increased VOI approach were spatially more accurate compared to the results of the decreased VOI approach. This is consistent with the capability of sICA to take advantage of extended samples of statistical observations and suggests that sICA is more powerful with extended rather than reduced VOI datasets to delineate brain activity. (c) 2006 Wiley-Liss, Inc.
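A small sketch of spatial ICA with scikit-learn's FastICA on synthetic data: voxels are treated as statistical samples, so the recovered components are spatial maps and the mixing matrix holds the associated time courses; the toy sources and dimensions are invented.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(8)
n_timepoints, n_voxels = 120, 500

# Two spatial "sources": disjoint active regions with super-Gaussian amplitudes.
S = np.zeros((2, n_voxels))
S[0, 50:100] = rng.laplace(size=50)
S[1, 300:380] = rng.laplace(size=80)

# Each time point is a mixture of the spatial maps (task + confound time courses).
mixing = np.column_stack([np.sin(np.linspace(0, 8 * np.pi, n_timepoints)),
                          np.linspace(-1, 1, n_timepoints)])
X = mixing @ S + 0.05 * rng.standard_normal((n_timepoints, n_voxels))

# Spatial ICA: pass voxels as samples so components are independent spatial maps.
ica = FastICA(n_components=2, random_state=0)
maps = ica.fit_transform(X.T)            # (voxels, 2): spatial maps
time_courses = ica.mixing_               # (timepoints, 2): associated signals
print(maps.shape, time_courses.shape)
```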
Li, Tao; Zhang, Lishu; Wang, Zhichao; Duan, Yunrui; Li, Jie; Wang, Junjun; Li, Hui
2018-06-20
Surfaces designed so that liquid metals do not stick to them but instead rebound as soon as possible have received considerable attention due to their importance in many practical technologies. We herein design a ridge structure that can induce a drop to rebound rapidly through the combined effect of centre-drawing recoil and the resulting faster retraction velocity. The suitable sharp angle of the ridge for minimizing the contact time is determined to be 20-30°. Further analysis reveals that a multi-ridge structure, or a two-ridge structure with gaps, can further reduce the contact time. We also highlight the role of the impact velocity in minimizing the contact time, a parameter neglected previously. Our studies open up a new way to reduce the contact time and control the bouncing dynamics of metal drops, providing guidance for potential applications such as preventing splashing molten drops from depositing on clean surfaces.
Maillard, Florie; Pereira, Bruno; Boisseau, Nathalie
2018-02-01
High-intensity interval training (HIIT) is promoted as a time-efficient strategy to improve body composition. The aim of this meta-analysis was to assess the efficacy of HIIT in reducing total, abdominal, and visceral fat mass in normal-weight and overweight/obese adults. Electronic databases were searched to identify all related articles on HIIT and fat mass. Stratified analysis was performed using the nature of HIIT (cycling versus running, target intensity), sex and/or body weight, and the methods of measuring body composition. Heterogeneity was also determined. A total of 39 studies involving 617 subjects were included (mean age 38.8 ± 14.4 years, 52% females). HIIT significantly reduced total (p = 0.003), abdominal (p = 0.007), and visceral (p = 0.018) fat mass, with no differences between the sexes. A comparison showed that running was more effective than cycling in reducing total and visceral fat mass. High-intensity (above 90% peak heart rate) training was more successful in reducing whole-body adiposity, while lower intensities had a greater effect on changes in abdominal and visceral fat mass. Our analysis also indicated that only computed tomography scan or magnetic resonance imaging showed significant abdominal and/or visceral fat-mass loss after HIIT interventions. HIIT is a time-efficient strategy to decrease fat-mass deposits, including those of abdominal and visceral fat mass. There was some evidence of the greater effectiveness of HIIT running versus cycling, but owing to the wide variety of protocols used and the lack of full details about cycling training, further comparisons need to be made. Large, multicenter, prospective studies are required to establish the best HIIT protocols for reducing fat mass according to subject characteristics.
Christner, Martin; Dressler, Dirk; Andrian, Mark; Reule, Claudia; Petrini, Orlando
2017-01-01
The fast and reliable characterization of bacterial and fungal pathogens plays an important role in infectious disease control and tracking of outbreak agents. DNA-based methods are the gold standard for epidemiological investigations, but they are still comparatively expensive and time-consuming. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a fast, reliable and cost-effective technique now routinely used to identify clinically relevant human pathogens. It has been used for subspecies differentiation and typing, but its use for epidemiological tasks, e.g. for outbreak investigations, is often hampered by the complexity of data analysis. We have analysed publicly available MALDI-TOF mass spectra from a large outbreak of Shiga-Toxigenic Escherichia coli in northern Germany using a general purpose software tool for the analysis of complex biological data. The software was challenged with depauperate spectra and reduced learning group sizes to mimic poor spectrum quality and scarcity of reference spectra at the onset of an outbreak. With high quality formic acid extraction spectra, the software's built-in classifier accurately identified outbreak related strains using as few as 10 reference spectra (99.8% sensitivity, 98.0% specificity). Selective variation of processing parameters showed impaired marker peak detection and reduced classification accuracy in samples with high background noise or artificially reduced peak counts. However, the software consistently identified mass signals suitable for a highly reliable marker peak based classification approach (100% sensitivity, 99.5% specificity) even from low quality direct deposition spectra. The study demonstrates that general purpose data analysis tools can effectively be used for the analysis of bacterial mass spectra.
Speeding Products to Market: Waiting Time to First Product Introduction in New Firms.
ERIC Educational Resources Information Center
Schoonhoven, Claudia Bird; And Others
1990-01-01
Using event-history analysis techniques, a longitudinal study of the semiconductor industry found that substantial technological innovation lengthens development times and reduces the speed with which first products reach the marketplace. Organizations that undertook lower levels of technological innovation had relatively lower monthly…
Colello, Raymond J; Tozer, Jordan; Henderson, Scott C
2012-01-01
Photoconversion, the method by which a fluorescent dye is transformed into a stable, osmiophilic product that can be visualized by electron microscopy, is the most widely used method to enable the ultrastructural analysis of fluorescently labeled cellular structures. Nevertheless, the conventional method of photoconversion using widefield fluorescence microscopy requires long reaction times and results in low-resolution cell targeting. Accordingly, we have developed a photoconversion method that ameliorates these limitations by adapting confocal laser scanning microscopy to the procedure. We have found that this method greatly reduces photoconversion times, as compared to conventional wide field microscopy. Moreover, region-of-interest scanning capabilities of a confocal microscope facilitate the targeting of the photoconversion process to individual cellular or subcellular elements within a fluorescent field. This reduces the area of the cell exposed to light energy, thereby reducing the ultrastructural damage common to this process when widefield microscopes are employed. © 2012 by John Wiley & Sons, Inc.
Jitter Reduces Response-Time Variability in ADHD: An Ex-Gaussian Analysis.
Lee, Ryan W Y; Jacobson, Lisa A; Pritchard, Alison E; Ryan, Matthew S; Yu, Qilu; Denckla, Martha B; Mostofsky, Stewart; Mahone, E Mark
2015-09-01
"Jitter" involves randomization of intervals between stimulus events. Compared with controls, individuals with ADHD demonstrate greater intrasubject variability (ISV) performing tasks with fixed interstimulus intervals (ISIs). Because Gaussian curves mask the effect of extremely slow or fast response times (RTs), ex-Gaussian approaches have been applied to study ISV. This study applied ex-Gaussian analysis to examine the effects of jitter on RT variability in children with and without ADHD. A total of 75 children, aged 9 to 14 years (44 ADHD, 31 controls), completed a go/no-go test with two conditions: fixed ISI and jittered ISI. ADHD children showed greater variability, driven by elevations in exponential (tau), but not normal (sigma) components of the RT distribution. Jitter decreased tau in ADHD to levels not statistically different than controls, reducing lapses in performance characteristic of impaired response control. Jitter may provide a nonpharmacologic mechanism to facilitate readiness to respond and reduce lapses from sustained (controlled) performance. © 2012 SAGE Publications.
Real time analysis with the upgraded LHCb trigger in Run III
NASA Astrophysics Data System (ADS)
Szumlak, Tomasz
2017-10-01
The current LHCb trigger system consists of a hardware level, which reduces the LHC bunch-crossing rate of 40 MHz to 1.1 MHz, a rate at which the entire detector is read out, and a second level, implemented in a farm of around 20k parallel-processing CPUs, in which the event rate is reduced to around 12.5 kHz. The LHCb experiment plans a major upgrade of the detector and DAQ system in the LHC long shutdown II (2018-2019). In this upgrade, a purely software-based trigger system is being developed, and it will have to process the full 30 MHz of bunch crossings with inelastic collisions. LHCb will also receive a factor of 5 increase in the instantaneous luminosity, which further contributes to the challenge of reconstructing and selecting events in real time with the CPU farm. We discuss the plans and progress towards achieving efficient reconstruction and selection with a 30 MHz throughput. Another challenge is to exploit the increased signal rate that results from removing the 1.1 MHz readout bottleneck, combined with the higher instantaneous luminosity. Many charm hadron signals can be recorded at up to 50 times higher rate. LHCb is implementing a new paradigm in the form of real-time data analysis, in which abundant signals are recorded in a reduced event format that can be fed directly to the physics analyses. These data do not need any further offline event reconstruction, which allows a larger fraction of the grid computing resources to be devoted to Monte Carlo productions. We discuss how this real-time analysis model is absolutely critical to the LHCb upgrade, and how it will evolve during Run II.
Extending Differential Fault Analysis to Dynamic S-Box Advanced Encryption Standard Implementations
2014-09-18
Attacks on the Advanced Encryption Standard (AES) either target the algorithm or use side channels to reduce entropy, such as Differential Fault Analysis (DFA). At the same time, continuing research strives to enhance AES and mitigate these growing threats. This paper researches the extension of existing DFA attacks to dynamic S-box AES implementations. The S-box is an 8-bit 16x16 table built from an affine transformation on multiplicative inverses, which guarantees a full permutation.
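To ground the S-box description, a sketch that builds the standard (static) AES S-box from the multiplicative inverse in GF(2^8) followed by the affine transformation, checked against FIPS-197 test values; a dynamic S-box scheme of the kind the paper studies would vary this construction, e.g. per key.

```python
def gf_mul(a, b):
    """Multiply two bytes in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def gf_inv(a):
    """Multiplicative inverse via a**254 (0 maps to 0 by convention)."""
    result, base, e = 1, a, 254
    while e:
        if e & 1:
            result = gf_mul(result, base)
        base = gf_mul(base, base)
        e >>= 1
    return result if a else 0

def sbox_byte(x):
    """Affine transformation over GF(2) applied to the inverse of x."""
    y = gf_inv(x)
    out = 0
    for i in range(8):
        bit = ((y >> i) ^ (y >> ((i + 4) % 8)) ^ (y >> ((i + 5) % 8))
               ^ (y >> ((i + 6) % 8)) ^ (y >> ((i + 7) % 8)) ^ (0x63 >> i)) & 1
        out |= bit << i
    return out

SBOX = [sbox_byte(x) for x in range(256)]
assert SBOX[0x00] == 0x63 and SBOX[0x53] == 0xED   # FIPS-197 test values
assert len(set(SBOX)) == 256                        # full permutation
```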
Discrete retardance second harmonic generation ellipsometry.
Dehen, Christopher J; Everly, R Michael; Plocinik, Ryan M; Hedderich, Hartmut G; Simpson, Garth J
2007-01-01
A new instrument was constructed to perform discrete retardance nonlinear optical ellipsometry (DR-NOE). The focus of the design was to perform second harmonic generation NOE while maximizing sample and application flexibility and minimizing data acquisition time. The discrete retardance configuration results in relatively simple computational algorithms for performing nonlinear optical ellipsometric analysis. NOE analysis of a disperse red 19 monolayer yielded results that were consistent with previously reported values for the same surface system, but with significantly reduced acquisition times.
Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins
NASA Technical Reports Server (NTRS)
Brenner, Marty; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.
NASA Astrophysics Data System (ADS)
Li, Hong; Ding, Xue
2017-03-01
This paper combines wavelet analysis and wavelet transform theory with artificial neural networks, preprocessing point feature attributes in intrusion detection to make them suitable for a wavelet neural network. The resulting intrusion classification model gains better adaptability and self-learning ability, greatly enhances the wavelet neural network's capacity to solve field intrusion detection problems, reduces storage space, improves the performance of the constructed neural network, and reduces training time. Simulation experiments on the KDDCup99 data set show that this method reduces the complexity of constructing the wavelet neural network while ensuring the accuracy of intrusion classification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meintz, A.; Markel, T.; Burton, E.
Analysis has been performed on the Transportation Secure Data Center (TSDC) warehouse of collected GPS second-by-second driving profile data for vehicles in the Atlanta, Chicago, Fresno, Kansas City, Los Angeles, Sacramento, and San Francisco Consolidated Statistical Areas (CSAs) to understand in-motion wireless power transfer introduction scenarios. In this work it has been shown that electrification of 1% of road miles could reduce fuel use by 25% for Hybrid Electric Vehicles (HEVs) in these CSAs. This analysis of strategically located infrastructure offers a promising approach to reduced fuel consumption; however, even the most promising 1% of road miles determined by these seven analysis scenarios still represents an impressive 2,700 miles of roadway to electrify. Therefore, to mitigate the infrastructure capital costs, integration of the grid-tied power electronics in the Wireless Power Transfer (WPT) system at the DC link with photovoltaic and/or battery storage is suggested. The integration of these resources would allow the hardware to provide additional revenue through grid services at times of low traffic volume, and conversely, at times of high traffic volume these resources could reduce the peak demand that the WPT system would otherwise add to the grid.
Roy, Vandana; Shukla, Shailja; Shukla, Piyush Kumar; Rawat, Paresh
2017-01-01
Motion generated while capturing the electroencephalography (EEG) signal leads to artifacts, which may reduce the quality of the obtained information. Existing artifact removal methods use canonical correlation analysis (CCA) for removing artifacts along with ensemble empirical mode decomposition (EEMD) and the wavelet transform (WT). A new approach is proposed to further analyse and improve the filtering performance and reduce the filter computation time in highly noisy environments. This new approach to CCA is based on the Gaussian elimination method, which calculates the correlation coefficients using the backslash operation, and is designed for EEG signal motion artifact removal. Gaussian elimination is used to solve the linear equations when calculating eigenvalues, which reduces the computation cost of the CCA method. The proposed method is tested against currently available artifact removal techniques based on EEMD-CCA and the wavelet transform. The performance is tested on synthetic and real EEG signal data. The proposed artifact removal technique is evaluated using efficiency metrics such as del signal-to-noise ratio (DSNR), lambda (λ), root mean square error (RMSE), elapsed time, and ROC parameters. The results indicate the suitability of the proposed algorithm for use as a supplement to algorithms currently in use.
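A reduced sketch of CCA-based source separation for artifact removal: canonical correlation between the multichannel signal and a one-sample-delayed copy orders sources by autocorrelation, and the least autocorrelated (most artifact-like) source is zeroed before back-projection. This generalized-eigenvalue route is one standard formulation, not the authors' Gaussian-elimination/backslash variant, and the data are synthetic.

```python
import numpy as np
from scipy.linalg import eig, pinv

rng = np.random.default_rng(10)
n_ch, n_t = 8, 5000
t = np.arange(n_t) / 250.0

# Synthetic EEG: three smooth rhythms mixed with one broadband motion artifact.
rhythms = [np.sin(2 * np.pi * f * t) for f in (6.0, 10.0, 21.0)]
sources = np.vstack(rhythms + [rng.standard_normal(n_t)])
X = rng.standard_normal((n_ch, 4)) @ sources
X -= X.mean(axis=1, keepdims=True)

# CCA between X(t) and X(t-1) as a generalized eigenvalue problem.
Y, Z = X[:, 1:], X[:, :-1]
Cyy, Czz, Cyz = Y @ Y.T, Z @ Z.T, Y @ Z.T
M = pinv(Cyy) @ Cyz @ pinv(Czz) @ Cyz.T
vals, W = eig(M)
order = np.argsort(-vals.real)          # sort sources by autocorrelation
W = W[:, order].real

S = W.T @ Y                             # canonical sources
S[-1, :] = 0.0                          # drop the least autocorrelated source
X_clean = pinv(W.T) @ S                 # back-project to the channel space
print(X_clean.shape)
```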
Analysis in temporal regime of dispersive invisible structures designed from transformation optics
NASA Astrophysics Data System (ADS)
Gralak, B.; Arismendi, G.; Avril, B.; Diatta, A.; Guenneau, S.
2016-03-01
A simple invisible structure made of two anisotropic homogeneous layers is analyzed theoretically in temporal regime. The frequency dispersion is introduced and an analytic expression of the transient part of the field is derived for large times when the structure is illuminated by a causal excitation. This expression shows that the limiting amplitude principle applies, with transient fields decaying as the power -3/4 of the time. The quality of the cloak is then reduced at short times and remains preserved at large times. The one-dimensional theoretical analysis is supplemented with full-wave numerical simulations in two-dimensional situations which confirm the effect of dispersion.
Sonko, Bakary J; Miller, Leland V; Jones, Richard H; Donnelly, Joseph E; Jacobsen, Dennis J; Hill, James O; Fennessey, Paul V
2003-12-15
Reducing water to hydrogen gas with zinc or uranium metal for determining the D/H ratio is both tedious and time consuming. This has forced most energy metabolism investigators to use the "two-point" technique instead of the "multi-point" technique for estimating total energy expenditure (TEE). Recently, we purchased a new platinum (Pt)-equilibration system that significantly reduces both the time and labor required for D/H ratio determination. In this study, we compared TEE obtained from nine overweight but healthy subjects, estimated using the traditional Zn-reduction method, to that obtained from the new Pt-equilibration system. Rate constants, pool spaces, and CO2 production rates obtained using the two methodologies were not significantly different. Correlation analysis demonstrated that TEEs estimated using the two methods were significantly correlated (r=0.925, p=0.0001). Sample equilibration time was reduced by 66% compared with similar methods. The data demonstrate that the Zn-reduction method can be replaced by the Pt-equilibration method when TEE is estimated using the "multi-point" technique. Furthermore, D equilibration time was significantly reduced.
Rotational relaxation time as unifying time scale for polymer and fiber drag reduction
NASA Astrophysics Data System (ADS)
Boelens, A. M. P.; Muthukumar, M.
2016-05-01
Using hybrid direct numerical simulation plus Langevin dynamics, a comparison is performed between polymer and fiber stress tensors in turbulent flow. The stress tensors are found to be similar, suggesting a common drag reducing mechanism in the onset regime for both flexible polymers and rigid fibers. Since fibers do not have an elastic backbone, this must be a viscous effect. Analysis of the viscosity tensor reveals that all terms are negligible, except the off-diagonal shear viscosity associated with rotation. Based on this analysis, we identify the rotational orientation time as the unifying time scale setting a new time criterion for drag reduction by both flexible polymers and rigid fibers.
Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat
2011-05-27
In this paper, automated sample preparation, retention-time-locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for metabolomics studies were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types × 2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention-time-locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches to data analysis were compared: XCMS followed by principal component analysis (approach 1), and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2). Several features that were up- or down-regulated in the different types were detected. Copyright © 2011 Elsevier B.V. All rights reserved.
Liu, Xianchen; Thompson, John; Phatak, Hemant; Mardekian, Jack; Porcari, Anthony; Johnson, Margot; Cohen, Alexander T
2016-01-01
Treatment with apixaban versus placebo for 12 months significantly reduced symptomatic recurrent venous thromboembolism (VTE) or all-cause death without increasing the rate of major bleeding in the AMPLIFY-EXT trial. This analysis examined the effects of apixaban versus placebo on the rate of all-cause hospitalisations, time to first hospitalisation, and predictors of first hospitalisation in patients with VTE enrolled in AMPLIFY-EXT. Treatment with apixaban 2.5 mg and 5 mg twice daily significantly reduced the rate of all-cause hospitalisations versus placebo (hazard ratio [95% confidence interval], 0.64 [0.43, 0.95]; p=0.026 and 0.54 [0.36, 0.82]; p=0.004, respectively). Apixaban prolonged mean time to first hospitalisation versus placebo by 43 and 49 days for the 2.5-mg and 5-mg twice-daily groups, respectively. Median length of hospital stay during the first hospitalisation was longer for placebo than for apixaban 2.5 mg or 5 mg twice daily (7.0, 5.0, and 4.5 days, respectively). Treatment with apixaban was a significant predictor of lower rates of hospitalisations versus placebo, and severe/moderate renal impairment was a significant predictor of an increased rate. This study supports extended use of apixaban for reducing all-cause hospitalisations and extending time to first hospitalisation in patients with VTE enrolled in AMPLIFY-EXT (www.clinicaltrials.gov registration: #NCT00633893).
Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.
Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy
2015-12-30
While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphics processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors, including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT) regressors. Whole-brain data analysis with more than 100,000 voxels and more than 250 volumes is completed in less than 300 ms, much faster than the time required to acquire an fMRI volume. Real-time processing cannot be implemented identically to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM analysis. We verified that the reduced slice-timing correction for real-time analysis had output comparable to off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time and is capable of processing these steps on all available data at a given time, without the need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, although the number of samples should be considered in real-time GLM analysis. Copyright © 2015 Elsevier B.V. All rights reserved.
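For readers wanting a concrete picture of the GLM step described above, the following is a minimal sketch (not the authors' implementation) of refitting a general linear model on all volumes acquired so far. The regressor count and the pure-noise "voxel" are hypothetical, chosen only to illustrate the over-fitting the authors report when few volumes are available.

```python
import numpy as np

def realtime_glm(Y, X):
    """Fit a GLM to the volumes acquired so far.

    Y : (n_volumes, n_voxels) BOLD data available at this TR
    X : (n_volumes, n_regressors) design matrix (task + nuisance regressors,
        e.g. motion, RETROICOR and RVT terms)
    Returns beta estimates of shape (n_regressors, n_voxels).
    """
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta

# Toy illustration of the over-fitting issue the authors report: with few
# sampled volumes, the residual variance of a pure-noise voxel is driven
# far below its true value.
rng = np.random.default_rng(0)
n_reg = 20                                # hypothetical regressor count
for n_vol in (30, 250):
    X = rng.standard_normal((n_vol, n_reg))
    Y = rng.standard_normal((n_vol, 1))   # pure-noise "voxel"
    resid = Y - X @ realtime_glm(Y, X)
    print(n_vol, float(resid.var()))      # variance shrinks when n_vol is small
```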
Threlkeld, Zachary D; Kozak, Benjamin; McCoy, David; Cole, Sara; Martin, Christine; Singh, Vineeta
2017-07-01
Shorter time-to-thrombolysis in acute ischemic stroke (AIS) is associated with improved functional outcome and reduced morbidity. We evaluate the effect of several interventions to reduce time-to-thrombolysis at an urban, public safety net hospital. All patients treated with tissue plasminogen activator for AIS at our institution between 2008 and 2015 were included in a retrospective analysis of door-to-needle (DTN) time and associated factors. Between 2011 and 2014, we implemented 11 distinct interventions to reduce DTN time. Here, we assess the relative impact of each intervention on DTN time. The median DTN time decreased from 87 (interquartile range: 68-109) minutes preintervention to 49 (interquartile range: 39-63) minutes postintervention. The reduction consisted primarily of a decrease in median time from computed tomography scan order to interpretation. The goal DTN time of 60 minutes or less was achieved in 9% (95% confidence interval: 5%-22%) of cases preintervention, compared with 70% (58%-81%) postintervention. Interventions with the greatest impact on DTN time included the implementation of a stroke group paging system, dedicated emergency department stroke pharmacists, and the development of a stroke code supply box. Multidisciplinary, collaborative interventions are associated with a significant and substantial reduction in time-to-thrombolysis. Such targeted interventions are efficient and achievable in resource-limited settings, where they are most needed. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kuntamalla, Srinivas; Lekkala, Ram Gopal Reddy
2014-10-01
Heart rate variability (HRV) is an important dynamic variable of the cardiovascular system, which operates on multiple time scales. In this study, multiscale entropy (MSE) analysis is applied to HRV signals taken from Physiobank to discriminate congestive heart failure (CHF) patients from healthy young and elderly subjects. The discrimination power of the MSE method decreases as the amount of data is reduced, and the smallest amount of data at which there is a clear discrimination between CHF and normal subjects is found to be 4000 samples. Further, the method fails to discriminate CHF from healthy elderly subjects. In view of this, the Reduced Data Dualscale Entropy Analysis method is proposed to reduce the data size required (to as few as 500 samples) for clearly discriminating CHF patients from young and elderly subjects using only two scales. Further, an easy-to-interpret index is derived using this new approach for the diagnosis of CHF. This index shows 100% accuracy and correlates well with the pathophysiology of heart failure.
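The paper's Reduced Data Dualscale Entropy Analysis is its own method; as a hedged illustration of the underlying machinery, the sketch below implements standard MSE coarse-graining and sample entropy at two scales. The synthetic RR series, the scale pair (1, 5), and the tolerance r = 0.2·SD are assumptions for the example, not the authors' exact choices.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series (naive O(n^2) version)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        # count matching template pairs, excluding self-matches on the diagonal
        return (np.sum(d <= r) - len(templates)) / 2
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale` (MSE coarse-graining)."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(1)
rr = 1.0 + 0.05 * rng.standard_normal(500)   # stand-in for a 500-sample RR series
for scale in (1, 5):                         # two scales, as in the dualscale idea
    print(scale, sample_entropy(coarse_grain(rr, scale)))
```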
Convergence Acceleration and Documentation of CFD Codes for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Marquart, Jed E.
2005-01-01
The development and analysis of turbomachinery components for industrial and aerospace applications has been greatly enhanced in recent years through the advent of computational fluid dynamics (CFD) codes and techniques. Although the use of this technology has greatly reduced the time required to perform analysis and design, there still remains much room for improvement in the process. In particular, there is a steep learning curve associated with most turbomachinery CFD codes, and the computation times need to be reduced in order to facilitate their integration into standard work processes. Two turbomachinery codes have recently been developed by Dr. Daniel Dorney (MSFC) and Dr. Douglas Sondak (Boston University). These codes are entitled Aardvark (for 2-D and quasi 3-D simulations) and Phantom (for 3-D simulations). The codes utilize the General Equation Set (GES), structured grid methodology, and overset O- and H-grids. The codes have been used with success by Drs. Dorney and Sondak, as well as others within the turbomachinery community, to analyze engine components and other geometries. One of the primary objectives of this study was to establish a set of parametric input values which will enhance convergence rates for steady state simulations, as well as reduce the runtime required for unsteady cases. The goal is to reduce the turnaround time for CFD simulations, thus permitting more design parametrics to be run within a given time period. In addition, other code enhancements to reduce runtimes were investigated and implemented. The other primary goal of the study was to develop enhanced users manuals for Aardvark and Phantom. These manuals are intended to answer most questions for new users, as well as provide valuable detailed information for the experienced user. The existence of detailed user's manuals will enable new users to become proficient with the codes, as well as reduce the dependency of new users on the code authors. In order to achieve the objectives listed, the following tasks were accomplished: 1) parametric study of preconditioning parameters and other code inputs; 2) code modifications to reduce runtimes; 3) investigation of compiler options to reduce code runtime; and 4) development/enhancement of users manuals for Aardvark and Phantom.
Extending nonlinear analysis to short ecological time series.
Hsieh, Chih-hao; Anderson, Christian; Sugihara, George
2008-01-01
Nonlinearity is important and ubiquitous in ecology. Though detectable in principle, nonlinear behavior is often difficult to characterize, analyze, and incorporate mechanistically into models of ecosystem function. One obvious reason is that quantitative nonlinear analysis tools are data intensive (require long time series), and time series in ecology are generally short. Here we demonstrate a useful method that circumvents data limitation and reduces sampling error by combining ecologically similar multispecies time series into one long time series. With this technique, individual ecological time series containing as few as 20 data points can be mined for such important information as (1) significantly improved forecast ability, (2) the presence and location of nonlinearity, and (3) the effective dimensionality (the number of relevant variables) of an ecological system.
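As a rough illustration of the kind of nonlinear forecasting such composite time series enable, here is a minimal simplex-style nearest-neighbor predictor on a time-delay embedding; in the paper's setting the library would be built from the concatenated, normalized multispecies series. The embedding dimension, toy logistic-map data, and exponential weighting are assumptions for the sketch, not the authors' exact algorithm.

```python
import numpy as np

def embed(x, E):
    """Time-delay embedding: row i is the lagged vector [x_i, ..., x_{i+E-1}]."""
    return np.column_stack([x[j:len(x) - E + 1 + j] for j in range(E)])

def simplex_forecast(train, test_point, E, k=None):
    """Predict one step ahead from the k nearest embedded neighbors."""
    k = k or E + 1
    lib = embed(train, E)[:-1]                 # embedded library points
    targets = train[E:]                        # their one-step-ahead values
    d = np.linalg.norm(lib - test_point, axis=1)
    idx = np.argsort(d)[:k]
    w = np.exp(-d[idx] / (d[idx][0] + 1e-12))  # exponential distance weights
    return np.sum(w * targets[idx]) / np.sum(w)

# Toy "species" series from a short nonlinear (logistic) map
x = np.empty(60); x[0] = 0.4
for t in range(59):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])
pred = simplex_forecast(x[:55], x[55:58], E=3)
print(pred, x[58])                             # forecast vs. observed value
```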
Time varying voltage combustion control and diagnostics sensor
Chorpening, Benjamin T [Morgantown, WV; Thornton, Jimmy D [Morgantown, WV; Huckaby, E David [Morgantown, WV; Fincham, William [Fairmont, WV
2011-04-19
A time-varying voltage is applied to an electrode, or a pair of electrodes, of a sensor installed in a fuel nozzle disposed adjacent the combustion zone of a continuous combustion system, such as of the gas turbine engine type. The time-varying voltage induces a time-varying current in the flame which is measured and used to determine flame capacitance using AC electrical circuit analysis. Flame capacitance is used to accurately determine the position of the flame from the sensor and the fuel/air ratio. The fuel and/or air flow rate(s) is/are then adjusted to provide reduced flame instability problems such as flashback, combustion dynamics and lean blowout, as well as reduced emissions. The time-varying voltage may be an alternating voltage and the time-varying current may be an alternating current.
BESIII Physics Data Storing and Processing on HBase and MapReduce
NASA Astrophysics Data System (ADS)
LEI, Xiaofeng; Li, Qiang; Kan, Bowen; Sun, Gongxing; Sun, Zhenyu
2015-12-01
In past years, we have successfully applied Hadoop to high-energy physics analysis. Although it has improved the efficiency of data analysis and reduced the cost of cluster building, there is still room for optimization: inflexible pre-selection, inefficient random data reading, and the I/O bottleneck caused by the Fuse layer used to access HDFS. To change this situation, this paper presents a new platform for storing and analysing high-energy physics data. The data structure is changed from DST tree-like files to HBase according to the features of the data and the analysis processes, since HBase is better suited to random data reading than DST files and allows HDFS to be accessed directly. Several optimization measures are taken to obtain good performance. A customized protocol is defined for data serialization and deserialization to decrease the storage space in HBase. To make full use of the locality of data storage in HBase, a new MapReduce model and a new split policy for HBase regions are proposed. In addition, a dynamic, pluggable, easy-to-use TAG (event metadata) based pre-selection subsystem is established; if the conditions are set properly, it can help physicists filter out as much as 99.9% of uninteresting data. This means that substantial I/O resources can be saved, CPU usage can be improved and the time consumed by data analysis can be reduced. Finally, several use cases were designed; the test results show that the new platform performs well, running 3.4 times faster with pre-selection and 20% faster without pre-selection, and that it is stable and scalable.
Cho, Han-Jin; Lee, Kyung Yul; Nam, Hyo Suk; Kim, Young Dae; Song, Tae-Jin; Jung, Yo Han; Choi, Hye-Yeon; Heo, Ji Hoe
2014-10-01
Process improvement (PI) is an approach for enhancing the existing quality improvement process by making changes while keeping the existing process. We have shown that implementation of a stroke code program using a computerized physician order entry system is effective in reducing the in-hospital time delay to thrombolysis in acute stroke patients. We investigated whether implementation of this PI could further reduce the time delays by continuous improvement of the existing process. After determining a key indicator [time interval from emergency department (ED) arrival to intravenous (IV) thrombolysis] and conducting data analysis, the target time from ED arrival to IV thrombolysis in acute stroke patients was set at 40 min. The key indicator was monitored continuously at a weekly stroke conference. The possible reasons for the delay were determined in cases for which IV thrombolysis was not administered within the target time and, where possible, the problems were corrected. The time intervals from ED arrival to the various evaluation steps and treatment before and after implementation of the PI were compared. The median time interval from ED arrival to IV thrombolysis in acute stroke patients was significantly reduced after implementation of the PI (from 63.5 to 45 min, p=0.001). The variation in the time interval was also reduced. A reduction in the evaluation time intervals was achieved after the PI [from 23 to 17 min for computed tomography scanning (p=0.003) and from 35 to 29 min for complete blood counts (p=0.006)]. PI is effective for continuous improvement of the existing process by reducing the time delays between ED arrival and IV thrombolysis in acute stroke patients.
Secker, T J; Pinchin, H E; Hervé, R C; Keevil, C W
2015-01-01
Increasing drying time adversely affects attachment of tissue proteins and prion-associated amyloid to surgical stainless steel, and reduces the efficacy of commercial cleaning chemistries. This study tested the efficacy of commercial humidity retention bags to reduce biofouling on surgical stainless steel and to improve subsequent cleaning. Surgical stainless steel surfaces were contaminated with ME7-infected brain homogenates and left to dry for 15 to 1,440 min either in air, in dry polythene bags or within humidity retention bags. Residual contamination pre/post cleaning was analysed using Thioflavin T/SYPRO Ruby dual staining and microscope analysis. An increase in biofouling was observed with increased drying time in air or in sealed dry bags. Humidity retention bags kept both protein and prion-associated amyloid minimal across the drying times both pre- and post-cleaning. Therefore, humidity bags demonstrate a cheap, easy to implement solution to improve surgical instrument reprocessing and to potentially reduce associated hospital acquired infections.
Testing for intracycle determinism in pseudoperiodic time series.
Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A
2008-06-01
A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.
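A minimal sketch of the second step, standard surrogate analysis, is given below: Fourier-phase-randomized surrogates preserve the linear correlation structure of the data, and a predictability statistic computed on the data is compared against the surrogate distribution. The synthetic series and the placeholder statistic are assumptions for illustration only; the preprocessing of seasonal and trend components (the paper's first step) is presumed already done.

```python
import numpy as np

def phase_surrogate(x, rng):
    """Fourier-phase-randomized surrogate: same power spectrum, random phases."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0      # keep the mean untouched
    phases[-1] = 0.0     # keep the Nyquist bin real for even-length series
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

rng = np.random.default_rng(1)
# Stand-in for a preprocessed pseudoperiodic series (noisy sine residuals)
x = np.sin(np.linspace(0, 40 * np.pi, 800)) + 0.3 * rng.standard_normal(800)
stat = lambda s: np.mean(np.abs(np.diff(s, 2)))   # placeholder predictability statistic
null = [stat(phase_surrogate(x, rng)) for _ in range(99)]
# A data statistic below the null's lower percentile would reject the
# linear-stochastic null hypothesis, i.e. indicate intracycle determinism.
print("data:", stat(x), " null 5th pct:", np.percentile(null, 5))
```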
Forward Period Analysis Method of the Periodic Hamiltonian System.
Wang, Pengfei
2016-01-01
Using forward period analysis (FPA), we obtain the periods of a Morse oscillator and a mathematical pendulum system to an accuracy of 100 significant digits. From these results, the long-term [0, 10^60] (time unit) solutions, ranging from the Planck time to the age of the universe, are computed reliably and quickly with a parallel multiple-precision Taylor series (PMT) scheme. The application of FPA to periodic systems can greatly reduce the computation time of long-term reliable simulations. This scheme provides an efficient way to generate reference solutions, against which long-term simulations using other schemes can be tested.
[Risk factors for anorexia in children].
Liu, Wei-Xiao; Lang, Jun-Feng; Zhang, Qin-Feng
2016-11-01
To investigate the risk factors for anorexia in children, and to reduce the prevalence of anorexia in children, a questionnaire survey and a case-control study were used to collect the general information of 150 children with anorexia (case group) and 150 normal children (control group). Univariate analysis and multivariate logistic stepwise regression analysis were performed to identify the risk factors for anorexia in children. The results of the univariate analysis showed significant differences between the case and control groups in the age in months when supplementary food was added, feeding pattern, whether they liked meat, vegetables and salty food, whether they often took snacks and beverages, whether they liked to play while eating, and whether their parents asked them to eat food on time (P<0.05). The results of the multivariate logistic regression analysis showed that late addition of supplementary food (OR=5.408), high frequency of taking snacks and/or drinks (OR=11.813), and eating while playing (OR=6.654) were major risk factors for anorexia in children. Liking of meat (OR=0.093) and vegetables (OR=0.272) and eating on time as required by parents (OR=0.079) were protective factors against anorexia in children. Timely addition of supplementary food, a proper diet, and development of children's proper eating and living habits can reduce the incidence of anorexia in children.
Practical solutions for reducing container ships' waiting times at ports using simulation model
NASA Astrophysics Data System (ADS)
Sheikholeslami, Abdorreza; Ilati, Gholamreza; Yeganeh, Yones Eftekhari
2013-12-01
The main challenge for container ports is the planning required for berthing container ships while docked in port. The growth of containerization is creating problems for ports and container terminals as they reach the capacity limits of various resources, which increasingly leads to traffic and port congestion. Good planning and management of container terminal operations reduces waiting time for liner ships. Reducing the waiting time improves the terminal's productivity and decreases the port difficulties. Two important keys to reducing waiting time with berth allocation are determining suitable access channel depths and increasing the number of berths, which in this paper are studied and analyzed as practical solutions. Simulation-based analysis is the only way to understand how various resources interact with each other and how they affect the berthing time of ships. We used the Enterprise Dynamics software to produce simulation models due to the complexity and nature of the problems. We further present a case study of berth allocation simulation for the biggest container terminal in Iran; the optimum access channel depth and number of berths are obtained from the simulation results. The results show a significant reduction in the waiting time for container ships and can be useful for major functions in the operation and development of container ship terminals.
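The paper uses Enterprise Dynamics; as a language-neutral illustration of the same idea, the following sketch uses the Python simpy library to model berths as a shared resource and compare mean ship waiting time across berth counts. All arrival, service, and capacity numbers are hypothetical.

```python
import random
import simpy

def ship(env, berths, service_mean, waits):
    arrive = env.now
    with berths.request() as req:
        yield req                       # queue for a free berth
        waits.append(env.now - arrive)  # record this ship's waiting time
        yield env.timeout(random.expovariate(1.0 / service_mean))

def source(env, berths, interarrival_mean, service_mean, waits):
    while True:
        yield env.timeout(random.expovariate(1.0 / interarrival_mean))
        env.process(ship(env, berths, service_mean, waits))

for n_berths in (3, 4, 5):              # hypothetical terminal configurations
    random.seed(42)
    env = simpy.Environment()
    berths = simpy.Resource(env, capacity=n_berths)
    waits = []
    env.process(source(env, berths, interarrival_mean=6.0,  # hours, assumed
                       service_mean=20.0, waits=waits))
    env.run(until=10_000)
    print(n_berths, "berths -> mean wait", sum(waits) / len(waits))
```

With the assumed load (about 3.3 berth-equivalents of demand), three berths saturate while four or five progressively cut the mean wait, mirroring the qualitative finding that adding berths reduces waiting time.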
NASA Astrophysics Data System (ADS)
Abdul, Fakhreza; Pintowantoro, Sungging; Yuwandono, Ridwan Bagus
2018-04-01
With the depletion of nickel sulfide ore resources, the processing of lateritic nickel ore has attracted attention as a way to meet world nickel demand. Reducing lateritic nickel ore with a low-cost carbonaceous reductant has been shown to produce high-grade ferronickel alloy. In this research, reduction was carried out on low-grade lateritic nickel ore (limonite) with 1.25% nickel content using a CO gas reductant formed by the reaction between coal and dolomite. The reduction process was preceded by forming briquettes from a mixture of limonite ore, coal, and Na2SO4; the briquettes were then placed inside a crucible bed together with dolomite and reduced at 1400 °C with holding time variations of 4, 6, and 8 hours. EDX, XRD, and SEM tests were carried out to determine the Fe and Ni grades after reduction, the phases formed, and the morphology of the briquettes after reduction. The results show that the highest increase in nickel grade was obtained with the 8-hour holding time, an increase of 5.84% over the initial grade, while the highest Ni recovery (88.51%) was obtained with the 6-hour holding time. The highest increase in Fe grade and the highest Fe recovery (85.41%) were both obtained with the 4-hour holding time.
Immersed boundary lattice Boltzmann model based on multiple relaxation times
NASA Astrophysics Data System (ADS)
Lu, Jianhua; Han, Haifeng; Shi, Baochang; Guo, Zhaoli
2012-01-01
As an alternative to the single relaxation time (SRT) lattice Boltzmann model, the multiple relaxation time (MRT) lattice Boltzmann model introduces much less numerical boundary slip if a special relationship between the relaxation time parameters is chosen. On the other hand, most current versions of the immersed boundary lattice Boltzmann method, first introduced by Feng and improved by many other authors, suffer from numerical boundary slip, as investigated by Le and Zhang. To reduce such numerical boundary slip, an immersed boundary lattice Boltzmann model based on multiple relaxation times is proposed in this paper, with a special formula relating two of the relaxation time parameters. A rigorous analysis and numerical experiments show that the numerical boundary slip is reduced dramatically with the present model compared to the single-relaxation-time-based model.
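The special formula between the two relaxation parameters is given in the paper itself; a closely related and widely known relation from two-relaxation-time lattice Boltzmann theory is sketched below, where the "magic" combination Λ = (τ⁺ − 1/2)(τ⁻ − 1/2) = 3/16 places a bounce-back wall exactly on the boundary and thereby removes numerical slip for straight walls. Treat this as an analogous illustration, not the paper's own expression.

```python
# A minimal sketch (not the paper's exact formula): in two-relaxation-time
# lattice Boltzmann models the numerical slip of bounce-back walls is governed
# by  Lambda = (tau_plus - 0.5) * (tau_minus - 0.5); choosing Lambda = 3/16
# removes the slip for straight walls.
def tau_minus_for_no_slip(tau_plus, magic=3.0 / 16.0):
    return 0.5 + magic / (tau_plus - 0.5)

nu = 0.01                      # lattice viscosity, assumed for the example
tau_plus = 3.0 * nu + 0.5      # symmetric relaxation time fixed by viscosity
print(tau_minus_for_no_slip(tau_plus))
```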
Harris, Carl M.; Litteral, Charles J.; Damrau, Donna L.
1997-01-01
The U.S. Geological Survey National Water Quality Laboratory has developed a method for the determination of dissolved calcium, iron, magnesium, manganese, silica, and sodium using a modified ultrasonic nebulizer sample-introduction system to an inductively coupled plasma-optical emission spectrometer. The nebulizer's spray chamber has been modified to avoid carryover and memory effects common in some conventional ultrasonic designs. The modified ultrasonic nebulizer is equipped with a high-speed rinse cycle to remove previously analyzed samples from the spray chamber without excessive flush times. This new rinse cycle decreases sample washout times by reducing carryover and memory effects from salt or analytes in previously analyzed samples by as much as 45 percent. Plasma instability has been reduced by repositioning the argon carrier gas inlet on the spray chamber and by directly pumping waste from the chamber, instead of from open drain traps, thereby maintaining constant pressure to the plasma. The ultrasonic nebulizer improves signal intensities, which are 8 to 16 times greater than for a conventional cross-flow pneumatic nebulizer, without being sensitive to clogging from salt buildup as in cross-flow nebulizers. Detection limits for the ultrasonic nebulizer are 4 to 18 times less than detection limits achievable using a cross-flow pneumatic nebulizer, with equivalent sample analysis time.
Reducing time delays in the management of ischemic stroke patients in Northern Italy.
Vidale, Simone; Arnaboldi, Marco; Bezzi, Giacomo; Bono, Giorgio; Grampa, Giampiero; Guidotti, Mario; Perrone, Patrizia; Salmaggi, Andrea; Zarcone, Davide; Zoli, Alberto; Agostoni, Elio
2016-07-15
Thrombolysis represents the best therapy for ischemic stroke, but the main limitation on its administration is time. Avoidable delay is a concept reflecting the effectiveness of the management pathway. For this reason, we designed a study to detect the main delays and then introduce corrective factors; in this paper we describe the results after these corrections. Consecutive patients admitted for ischemic stroke during a 3-month period to 35 hospitals of a macro-area of Northern Italy were enrolled. Each management time was registered, identifying three main intervals: pre-hospital, in-hospital and total times. The corrective interventions were: (1) increasing population awareness of the Emergency Medical Service (EMS); (2) pre-notification of the Emergency Department; (3) use of high-urgency codes; and (4) use of a standardised operational algorithm. Statistical analysis was conducted using time-to-event analysis and Cox proportional hazards regression. 1084 patients were enrolled. EMS was alerted for 56.3% of subjects, mainly females and severe strokes (p<0.001). Thrombolytic treatment was performed in 4.7% of patients. Median pre-hospital and in-hospital times were 113 and 105 min, while total time was 240 min. High-urgency codes at transport contributed to reducing pre-hospital and in-hospital times (p<0.05). EMS use and high-urgency codes promoted thrombolysis. Treatment within 4.5 hours from symptom onset was performed in 14% more patients than in the first phase of the study. The implementation of an organizational system based on EMS use and concomitant high-urgency codes was effective in reducing avoidable delay and increasing thrombolysis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Sample preparation techniques for the determination of trace residues and contaminants in foods.
Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M
2007-06-15
The determination of trace residues and contaminants in complex matrices, such as food, often requires extensive sample extraction and preparation prior to instrumental analysis. Sample preparation is often the bottleneck in analysis and there is a need to minimise the number of steps to reduce both time and sources of error. There is also a move towards more environmentally friendly techniques, which use less solvent and smaller sample sizes. Smaller sample size becomes important when dealing with real life problems, such as consumer complaints and alleged chemical contamination. Optimal sample preparation can reduce analysis time, sources of error, enhance sensitivity and enable unequivocal identification, confirmation and quantification. This review considers all aspects of sample preparation, covering general extraction techniques, such as Soxhlet and pressurised liquid extraction, microextraction techniques such as liquid phase microextraction (LPME) and more selective techniques, such as solid phase extraction (SPE), solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The applicability of each technique in food analysis, particularly for the determination of trace organic contaminants in foods is discussed.
A Four Dimensional Spatio-Temporal Analysis of an Agricultural Dataset
Donald, Margaret R.; Mengersen, Kerrie L.; Young, Rick R.
2015-01-01
While a variety of statistical models now exist for the spatio-temporal analysis of two-dimensional (surface) data collected over time, there are few published examples of analogous models for the spatial analysis of data taken over four dimensions: latitude, longitude, height or depth, and time. When taking account of the autocorrelation of data within and between dimensions, the notion of closeness often differs for each of the dimensions. Here, we consider a number of approaches to the analysis of such a dataset, which arises from an agricultural experiment exploring the impact of different cropping systems on soil moisture. The proposed models vary in their representation of the spatial correlation in the data, the assumed temporal pattern and choice of conditional autoregressive (CAR) and other priors. In terms of the substantive question, we find that response cropping is generally more effective than long fallow cropping in reducing soil moisture at the depths considered (100 cm to 220 cm). Thus, if we wish to reduce the possibility of deep drainage and increased groundwater salinity, the recommended cropping system is response cropping. PMID:26513746
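As a hedged, much-simplified illustration of the CAR priors mentioned above, the snippet below builds the precision matrix of a proper CAR prior over a single chain of depths; the adjacency structure, ρ, and τ are placeholders, and the paper's actual four-dimensional models are considerably richer.

```python
import numpy as np

def car_precision(W, rho=0.9, tau=1.0):
    """Precision matrix of a proper CAR prior: Q = tau * (D - rho * W),
    where W is a symmetric 0/1 adjacency matrix and D = diag(row sums)."""
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

# Chain adjacency over 5 depths (e.g. 100, 130, 160, 190, 220 cm), assumed
n = 5
W = np.zeros((n, n))
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1
Q = car_precision(W)
print(np.all(np.linalg.eigvalsh(Q) > 0))   # positive definite for |rho| < 1
```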
Rules? Relationships?: A Feminist Analysis of Competition and Fair Play in Physical Education
ERIC Educational Resources Information Center
Singleton, Ellen
2003-01-01
Regardless of recent curriculum revisions, physical educators, faced with reduced time and/or inadequate equipment and facilities, continue to offer competitive team sport activities for a high percentage of their program time. When competition is only experienced as a win-lose situation, possibilities that students will derive any morally…
Air Cargo Transportation Route Choice Analysis
NASA Technical Reports Server (NTRS)
Obashi, Hiroshi; Kim, Tae-Seung; Oum, Tae Hoon
2003-01-01
Using a unique feature of air cargo transshipment data in the Northeast Asian region, this paper identifies the critical factors that determine transshipment route choice. Taking advantage of the variations in transport characteristics in each origin-destination airport pair, the paper uses a discrete choice model to describe the transshipment route choice decision made by an agent (i.e., freight forwarder, consolidator, or large shipper). The analysis incorporates two major factors, monetary cost (such as line-haul cost and landing fee) and time cost (i.e., aircraft turnaround time, including loading and unloading time, customs clearance time, and expected schedule delay), along with other controls. The estimation method considers the presence of unobserved attributes and corrects for the resulting endogeneity by use of appropriate instrumental variables. Estimation results find that transshipment volumes are more sensitive to time cost and that a reduction in aircraft turnaround time of 1 hour would be worth an increase in airport charges of more than $1000. Simulation exercises measure the impacts of alternative policy scenarios for a Korean airport, which has recently declared its intention to become a regional hub in Northeast Asia. The results suggest that reducing aircraft turnaround time at the airport would be an effective strategy, rather than subsidizing to reduce airport charges.
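The discrete choice model in such studies is typically a conditional logit over route alternatives; the sketch below writes its log-likelihood directly in numpy with hypothetical cost and time attributes. The data generation and coefficient values are invented for illustration; note that the ratio of the time and cost coefficients is what yields a monetary value of turnaround time like the $1000-per-hour figure above.

```python
import numpy as np

def conditional_logit_loglik(beta, X, y):
    """X: (n_choices, n_alternatives, n_attributes) route attributes
       y: (n_choices,) index of the chosen alternative."""
    u = X @ beta                             # systematic utilities
    u -= u.max(axis=1, keepdims=True)        # numerical stability
    logp = u - np.log(np.exp(u).sum(axis=1, keepdims=True))
    return logp[np.arange(len(y)), y].sum()

# Hypothetical: 3 transshipment routes described by (monetary cost, time cost)
rng = np.random.default_rng(3)
X = rng.standard_normal((200, 3, 2))
true_beta = np.array([-1.0, -2.0])           # time weighted more heavily
p = np.exp(X @ true_beta); p /= p.sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=pi) for pi in p])
print(conditional_logit_loglik(true_beta, X, y))
```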
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoaf, S.; APS Engineering Support Division
A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.
Forbes, Thomas P.; Degertekin, F. Levent; Fedorov, Andrei G.
2010-01-01
Electrochemistry and ion transport in a planar array of mechanically-driven, droplet-based ion sources are investigated using an approximate time scale analysis and in-depth computational simulations. The ion source is modeled as a controlled-current electrolytic cell, in which the piezoelectric transducer electrode, which mechanically drives the charged droplet generation using ultrasonic atomization, also acts as the oxidizing/corroding anode (positive mode). The interplay between advective and diffusive ion transport of electrochemically generated ions is analyzed as a function of the transducer duty cycle and electrode location. A time scale analysis of the relative importance of advective vs. diffusive ion transport provides valuable insight into optimality, from the ionization prospective, of alternative design and operation modes of the ion source operation. A computational model based on the solution of time-averaged, quasi-steady advection-diffusion equations for electroactive species transport is used to substantiate the conclusions of the time scale analysis. The results show that electrochemical ion generation at the piezoelectric transducer electrodes located at the back-side of the ion source reservoir results in poor ionization efficiency due to insufficient time for the charged analyte to diffuse away from the electrode surface to the ejection location, especially at near 100% duty cycle operation. Reducing the duty cycle of droplet/analyte ejection increases the analyte residence time and, in turn, improves ionization efficiency, but at an expense of the reduced device throughput. For applications where this is undesirable, i.e., multiplexed and disposable device configurations, an alternative electrode location is incorporated. By moving the charging electrode to the nozzle surface, the diffusion length scale is greatly reduced, drastically improving ionization efficiency. The ionization efficiency of all operating conditions considered is expressed as a function of the dimensionless Peclet number, which defines the relative effect of advection as compared to diffusion. This analysis is general enough to elucidate an important role of electrochemistry in ionization efficiency of any arrayed ion sources, be they mechanically-driven or electrosprays, and is vital for determining optimal design and operation conditions. PMID:20607111
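For reference, the Peclet number invoked above is conventionally defined as the ratio of the diffusive to the advective time scale; the exact choices of characteristic velocity and length are specific to the device, so the following is only the standard textbook form:

```latex
\mathrm{Pe} \;=\; \frac{U\,L}{D} \;=\; \frac{t_{\mathrm{diff}}}{t_{\mathrm{adv}}},
\qquad t_{\mathrm{diff}} = \frac{L^{2}}{D}, \quad t_{\mathrm{adv}} = \frac{L}{U},
```

where U is a characteristic advection velocity (here set by the droplet ejection), L a characteristic length (e.g. the electrode-to-nozzle distance), and D the diffusivity of the electrogenerated species; large Pe means advection dominates.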
A New Modified Histogram Matching Normalization for Time Series Microarray Analysis.
Astola, Laura; Molenaar, Jaap
2014-07-01
Microarray data are often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.
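For context, quantile normalization forces every array onto a common empirical distribution, which is exactly why it can erase genuine global shifts across time points. A minimal numpy sketch (ties broken arbitrarily) is:

```python
import numpy as np

def quantile_normalize(A):
    """Quantile normalization (columns = arrays, rows = probes): every
    column is mapped onto the same reference distribution by rank."""
    ranks = np.argsort(np.argsort(A, axis=0), axis=0)
    means = np.sort(A, axis=0).mean(axis=1)   # reference distribution
    return means[ranks]

# Toy: two "time point" arrays with different overall levels
A = np.array([[5.0, 4.0],
              [2.0, 1.0],
              [3.0, 4.5],
              [4.0, 2.0]])
print(quantile_normalize(A))
```

On the toy matrix both columns end up with identical value sets, illustrating how a real between-time-point difference in overall expression level would be normalized away.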
Dyer, Bryce; Hassani, Hossein; Shadi, Mehran
2016-01-01
The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length and using time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body that covers the regions of England and Wales, an analysis of six male competition record progressions was undertaken to illustrate their progression. Future forecasts are then projected through use of the Singular Spectrum Analysis technique, a method which has not been applied to sport-based time series data before. All six records have seen progressive improvement and are non-linear in nature. Five records saw their highest rate of record change during the 1950-1969 period. While the frequency of new records has generally fallen since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecast projections in the short to medium term with a high level of fit to the time series data.
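As a hedged sketch of the basic SSA machinery behind such forecasts, the code below embeds a series into a Hankel trajectory matrix, truncates its SVD, and diagonal-averages back to a smoothed signal; forecasting proper extends this with a linear recurrence on the leading components. The window length, rank, and synthetic "record progression" series are assumptions for illustration.

```python
import numpy as np

def ssa_trend(x, L, r):
    """Basic singular spectrum analysis: embed into an L-lagged trajectory
    matrix, keep the r leading SVD components, and diagonal-average back."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]                      # rank-r reconstruction
    y = np.zeros(N); cnt = np.zeros(N)
    for j in range(K):                                    # Hankelization
        y[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1
    return y / cnt

rng = np.random.default_rng(4)
t = np.arange(80, dtype=float)
records = 55.0 * np.exp(-0.01 * t) + rng.normal(0.0, 0.4, 80)  # stand-in series
print(ssa_trend(records, L=40, r=2)[-5:])
```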
Upshur, Ross E G; Moineddin, Rahim; Crighton, Eric J; Mamdani, Muhammad
2006-03-01
The question of how best to reduce waiting times for health care, particularly surgical procedures such as hip and knee replacements is among the most pressing concern of the Canadian health care system. The objective of this study was to test the hypothesis that significant seasonal variation exists in the performance of hip and knee replacement surgery in the province of Ontario. We performed a retrospective, cross-sectional time series analysis examining all hip and knee replacement surgeries in people over the age of 65 in the province of Ontario, Canada between 1992 and 2002. The main outcome measure was monthly hospitalization rates per 100,000 population for all hip and knee replacements. There was a marked increase in the rate of hip and knee replacement surgery over the 10-year period as well as an increasing seasonal variation in surgeries. Highly significant (Fisher Kappa = 16.05, p < 0.01; Bartlett-Kolmogorov-Smirnov Test = 0.31, p < 0.01) and strong (R2Autoreg = 0.85) seasonality was identified in the data. Holidays and utilization caps appear to exert a significant influence on the rate of service provision. It is expected that waiting times for hip and knee replacement could be reduced by reducing seasonal fluctuations in service provision and benchmarking services to peak delivery. The results highlight the importance of system behaviour in seasonal fluctuation of service delivery.
Verdant: automated annotation, alignment and phylogenetic analysis of whole chloroplast genomes.
McKain, Michael R; Hartsock, Ryan H; Wohl, Molly M; Kellogg, Elizabeth A
2017-01-01
Chloroplast genomes are now produced in the hundreds for angiosperm phylogenetics projects, but current methods for annotation, alignment and tree estimation still require some manual intervention, reducing throughput and increasing analysis time for large chloroplast systematics projects. Verdant is a web-based software suite and database built to take advantage of a novel annotation program, annoBTD. Using annoBTD, Verdant provides accurate annotation of chloroplast genomes without manual intervention. Subsequent alignment and tree estimation can incorporate newly annotated and publicly available plastomes and can accommodate a large number of taxa. Verdant sharply reduces the time required for analysis of assembled chloroplast genomes and removes the need for pipelines and software on personal hardware. Verdant is available at http://verdant.iplantcollaborative.org/plastidDB/. It is implemented in PHP, Perl, MySQL, Javascript, HTML and CSS, with all major browsers supported. Contact: mrmckain@gmail.com. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Investigating the effects of PDC cutters geometry on ROP using the Taguchi technique
NASA Astrophysics Data System (ADS)
Jamaludin, A. A.; Mehat, N. M.; Kamaruddin, S.
2017-10-01
At times, the polycrystalline diamond compact (PDC) bit's performance drops and affects the rate of penetration (ROP). The objective of this project is to investigate the effect of PDC cutter geometry on ROP and to optimize it; an intensive study of cutter geometry can further enhance ROP performance. A relatively extended analysis was carried out, and four significant geometry factors that directly improve ROP were identified: cutter size, back rake angle, side rake angle and chamfer angle. An appropriate optimization technique that effectively controls all influential geometry factors during cutter manufacturing is introduced and adopted in this project. Adopting an L9 Taguchi orthogonal array (OA), a simulation experiment was conducted using explicit dynamics finite element analysis. Through a structured Taguchi analysis, ANOVA confirms that the most significant geometry factor for improving ROP is cutter size (99.16% percentage contribution). The optimized cutter is expected to drill with high ROP, which can reduce rig time and, in turn, total drilling cost.
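For readers unfamiliar with the Taguchi signal-to-noise analysis used here, the sketch below computes the larger-is-better S/N ratio over a standard L9 layout and averages it per factor level; the ROP responses are invented for illustration, and replicated runs would simply extend each per-run response list.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-is-better S/N ratio: -10 * log10( mean(1 / y^2) )."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Standard L9 orthogonal array: 4 factors at 3 levels (e.g. cutter size,
# back rake, side rake, chamfer angle), one row per simulation run.
L9 = np.array([[1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
               [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
               [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1]])
# Hypothetical ROP responses (m/h) for the 9 runs
rop = np.array([12.1, 13.4, 12.8, 14.0, 15.2, 13.7, 16.1, 14.9, 15.5])
sn = np.array([sn_larger_is_better([y]) for y in rop])

# Mean S/N per level of factor 0 (say, cutter size): the level with the
# highest mean S/N is the preferred setting.
for level in (1, 2, 3):
    print(level, sn[L9[:, 0] == level].mean())
```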
New Methods for Assessing and Reducing Uncertainty in Microgravity Studies
NASA Astrophysics Data System (ADS)
Giniaux, J. M.; Hooper, A. J.; Bagnardi, M.
2017-12-01
Microgravity surveying, also known as dynamic or 4D gravimetry, is a time-dependent geophysical method used to detect mass fluctuations within the shallow crust by analysing temporal changes in relative gravity measurements. We present here a detailed uncertainty analysis of temporal gravity measurements, considering for the first time all possible error sources, including tilt, error in drift estimation and timing errors. We find that some error sources that are commonly ignored can have a significant impact on the total error budget, and it is therefore likely that some gravity signals have been misinterpreted in previous studies. Our analysis leads to new methods for reducing some of the uncertainties associated with residual gravity estimation. In particular, we propose different approaches for drift estimation and free-air correction depending on the survey set-up. We also provide formulae to recalculate uncertainties for past studies and lay out a framework for best practice in future studies. We demonstrate our new approach on volcanic case studies, which include Kilauea in Hawaii and Askja in Iceland.
Prediction of the space adaptation syndrome
NASA Technical Reports Server (NTRS)
Reschke, M. F.; Homick, J. L.; Ryan, P.; Moseley, E. C.
1984-01-01
The univariate and multivariate relationships of provocative measures used to produce motion sickness symptoms were described. Normative subjects were used to develop and cross-validate sets of linear equations that optimally predict motion sickness in parabolic flights. The possibility of reducing the number of measurements required for prediction was assessed. After describing the variables verbally and statistically for 159 subjects, a factor analysis of 27 variables was completed to improve understanding of the relationships between variables and to reduce the number of measures for prediction purposes. The results of this analysis show that none of the variables are significantly related to the responses to parabolic flights. A set of variables was selected to predict responses to KC-135 flights, and a series of discriminant analyses was completed. The results indicate that low, moderate, or severe susceptibility could be correctly predicted 64 percent and 53 percent of the time in the original and cross-validation samples, respectively. Both the factor analysis and the discriminant analysis provided no basis for reducing the number of tests.
Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng
2017-07-01
Particulate matter with aerodynamic diameter below 10 μm (PM10) is difficult to forecast because of the uncertainties in describing the emission and meteorological fields. This paper proposes a wavelet-ARMA/ARIMA model to forecast short-term series of PM10 concentrations. It was evaluated in experiments using a 10-year data set of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: (1) PM10 concentrations in Taiyuan had a decreasing trend from 2005 to 2012 but increased in 2013, with an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. (2) Spatial differences among the four stations showed that PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburban areas. (3) Wavelet analysis revealed that the trend variation and the changes of the PM10 concentration in Taiyuan were complicated. (4) The proposed wavelet-ARIMA model could be efficiently and successfully applied to PM10 forecasting. Wavelet analysis filters noisy signals and identifies the variation trend and the fluctuation of the PM10 time series; wavelet decomposition and reconstruction reduce the nonstationarity of the data and thus improve the accuracy of the prediction. Compared with traditional ARMA/ARIMA methods, the wavelet-ARMA/ARIMA method effectively reduces the forecasting error, improves the prediction accuracy, and enables multiple-time-scale prediction.
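One common wavelet-ARMA/ARIMA formulation, which may differ in detail from the authors', forecasts each wavelet sub-series separately and sums the component forecasts. A sketch using the PyWavelets and statsmodels packages, with a synthetic seasonal series standing in for the Taiyuan data, is:

```python
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

def wavelet_arima_forecast(x, steps=7, wavelet="db4", level=3, order=(1, 0, 1)):
    """Forecast each wavelet sub-series with its own ARMA/ARIMA model and
    sum the component forecasts (one common wavelet-ARIMA formulation)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    parts = []
    for i in range(len(coeffs)):
        # reconstruct the sub-series contributed by this coefficient band
        mask = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        sub = pywt.waverec(mask, wavelet)[: len(x)]
        res = ARIMA(sub, order=order).fit()
        parts.append(res.forecast(steps))
    return np.sum(parts, axis=0)

# Synthetic stand-in for ~10 years of daily PM10 with a winter-heating peak
rng = np.random.default_rng(0)
t = np.arange(3650)
pm10 = 100 + 40 * np.cos(2 * np.pi * t / 365) + 10 * rng.standard_normal(3650)
print(wavelet_arima_forecast(pm10))
```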
Time Resolved FTIR Analysis of Tailpipe Exhaust for Several Automobiles
NASA Astrophysics Data System (ADS)
White, Allen R.; Allen, James; Devasher, Rebecca B.
2011-06-01
The automotive catalytic converter reduces or eliminates the emission of various chemical species (e.g. CO, hydrocarbons, etc.) that are the products of combustion in automobile exhaust. However, these units are only effective once they have reached operating temperature. The design and placement of catalytic converters have changed in order to reduce both the quantity of emissions and the time required for the converter to become effective. In order to compare the effectiveness of catalytic converters, time-resolved measurements were performed on several vehicles, including a 2010 Toyota Prius, a 2010 Honda Fit, a 1994 Honda Civic, and a 1967 Oldsmobile 442 (which is not equipped with a catalytic converter and is used as a baseline). The newer vehicles demonstrate both a reduced overall level of CO and hydrocarbon emissions and converters that become effective more quickly than older units. The time-resolved emissions will be discussed along with the impact of catalytic converter design and location on the measured emissions.
Risk analysis of Listeria spp. contamination in two types of ready-to-eat chicken meat products.
Keeratipibul, Suwimon; Lekroengsin, Sumalin
2009-01-01
This study was conducted to determine the risk of Listeria contamination in frozen ready-to-eat roasted and steamed chicken meat in a chicken plant in Thailand. Environmental surfaces were divided into three zones. Zone 1 included surfaces in direct contact with products. Zones 2 and 3 included indirect contact surfaces; zone 2 was next to zone 1, and zone 3 was located next to zone 2 and relatively far from the product. A mathematical model for the probability of product contamination after contact with contaminated zone 1 surfaces was established. This model was augmented by an already established model for the probability of Listeria contamination on zone 1 surfaces. Sensitivity analysis revealed that the prevalence of Listeria on zone 1 surfaces before cleaning and sanitizing, production time, and concentration and contact time of sanitizer were correlated with contamination of both products. Alternative risk management measures for reducing the risk of Listeria contamination were developed using sanitizer concentrations of 0.25 to 1.25% (vol/vol), sanitizer contact times of 5 to 20 min, and production times of 5 to 20 h. The plant's risk manager chose a 0.25% (vol/vol) sanitizer concentration, a contact time of 20 min, and a production time of 20 h. After implementation of the selected risk management option, the prevalence of Listeria on roasted and steamed products was reduced by 2.19 and 2.01%, respectively. The prevalence of Listeria in zones 1, 2, and 3 was also reduced by 3.13, 11.24, and 25.66%, respectively.
Antal, Borbála; Kuki, Ákos; Nagy, Lajos; Nagy, Tibor; Zsuga, Miklós; Kéki, Sándor
2016-07-01
Residues of chemicals on clothing products were examined by direct analysis in real-time (DART) mass spectrometry. Our experiments revealed the presence of more than 40 chemicals in 15 different clothing items. The identification was confirmed by DART tandem mass spectrometry (MS/MS) experiments for 14 compounds. The most commonly detected hazardous substances were nonylphenol ethoxylates (NPEs), phthalic acid esters (phthalates), amines released by azo dyes, and quinoline derivatives. DART-MS was able to detect NPEs on the skin of a person wearing a clothing item contaminated by NPE residues. An automated data acquisition and processing method was developed and tested for the recognition of NPE residues, thereby reducing the analysis time.
Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet
NASA Technical Reports Server (NTRS)
Muss, J. A.; Johnson, C. W.; Gotchy, M. B.
2000-01-01
The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.
Sedentary time and markers of inflammation in people with newly diagnosed type 2 diabetes
Falconer, C.L.; Cooper, A.R.; Walhin, J.P.; Thompson, D.; Page, A.S.; Peters, T.J.; Montgomery, A.A.; Sharp, D.J.; Dayan, C.M.; Andrews, R.C.
2014-01-01
Background and aims We investigated whether objectively measured sedentary time was associated with markers of inflammation in adults with newly diagnosed type 2 diabetes. Methods and results We studied 285 adults (184 men, 101 women, mean age 59.0 ± 9.7) who had been recruited to the Early ACTivity in Diabetes (Early ACTID) randomised controlled trial. C-reactive protein (CRP), adiponectin, soluble intracellular adhesion molecule-1 (sICAM-1), interleukin-6 (IL-6), and accelerometer-determined sedentary time and moderate-vigorous physical activity (MVPA) were measured at baseline and after six-months. Linear regression analysis was used to investigate the independent cross-sectional and longitudinal associations of sedentary time with markers of inflammation. At baseline, associations between sedentary time and IL-6 were observed in men and women, an association that was attenuated following adjustment for waist circumference. After 6 months of follow-up, sedentary time was reduced by 0.4 ± 1.2 h per day in women, with the change in sedentary time predicting CRP at follow-up. Every hour decrease in sedentary time between baseline and six-months was associated with 24% (1, 48) lower CRP. No changes in sedentary time between baseline and 6 months were seen in men. Conclusions Higher sedentary time is associated with IL-6 in men and women with type 2 diabetes, and reducing sedentary time is associated with improved levels of CRP in women. Interventions to reduce sedentary time may help to reduce inflammation in women with type 2 diabetes. PMID:24925122
Application of Microchip for Biomarker Analysis
NASA Astrophysics Data System (ADS)
Kataoka, Masatoshi; Yatsushiro, Shouki; Yamamura, Shouhei; Abe, Hiroko
Microchip technologies have received considerable attention due to their competitive advantages, especially with regard to reduced sample and reagent consumption, shorter analysis times, and easy operation. This approach has been successfully used to analyze DNA, amino acids, proteins, and carbohydrates. In the present study, we showed the potential of microchip technologies for biomarker analysis: blood carbohydrate analysis by microchip electrophoresis, quantitative analysis of proteins via antigen-antibody reaction on a microchip, and detection of malaria-infected erythrocytes with a cell microarray chip.
Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Nikbay, Melike; Heeg, Jennifer
2017-01-01
This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and to represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
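As a hedged, one-dimensional illustration of the Polynomial Chaos Expansion approach mentioned for the ITU framework, the sketch below fits Hermite-polynomial coefficients by least squares and reads the output mean and variance off the coefficients; the sample size, degree, and stand-in response function are assumptions, not the paper's setup.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_fit(xi, y, degree=3):
    """Least-squares polynomial chaos fit in one standard-normal variable,
    using probabilists' Hermite polynomials He_k as the basis."""
    Phi = np.column_stack([He.hermeval(xi, np.eye(degree + 1)[k])
                           for k in range(degree + 1)])
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coef

rng = np.random.default_rng(7)
xi = rng.standard_normal(200)           # samples of the uncertain input
y = np.sin(0.5 * xi) + 0.1 * xi**2      # stand-in for, e.g., a flutter response
c = pce_fit(xi, y)
# Under a standard normal, E[He_j He_k] = k! * delta_jk, so the output mean
# is c_0 and the variance is sum_{k>=1} c_k^2 * k!.
fact = np.array([math.factorial(k) for k in range(len(c))])
print("mean:", c[0], "variance:", np.sum(c[1:]**2 * fact[1:]))
```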
NASA Astrophysics Data System (ADS)
Asztalos, Stephen J.; Hennig, Wolfgang; Warburton, William K.
2016-01-01
Pulse shape discrimination applied to certain fast scintillators is usually performed offline. In sufficiently high-event rate environments data transfer and storage become problematic, which suggests a different analysis approach. In response, we have implemented a general purpose pulse shape analysis algorithm in the XIA Pixie-500 and Pixie-500 Express digital spectrometers. In this implementation waveforms are processed in real time, reducing the pulse characteristics to a few pulse shape analysis parameters and eliminating time-consuming waveform transfer and storage. We discuss implementation of these features, their advantages, necessary trade-offs and performance. Measurements from bench top and experimental setups using fast scintillators and XIA processors are presented.
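The XIA firmware algorithm itself belongs to the spectrometers described; a common pulse shape analysis reduction for fast scintillators, which conveys the flavor of compressing each waveform to a few parameters, is the charge-comparison ratio sketched below. The baseline and tail-window settings are detector-specific assumptions.

```python
import numpy as np

def charge_comparison_psd(wave, baseline_n=20, tail_offset=15):
    """Classic charge-comparison pulse shape discrimination for fast
    scintillators: PSD parameter = tail integral / total integral."""
    w = wave - wave[:baseline_n].mean()      # subtract pre-trigger baseline
    peak = int(np.argmax(w))
    total = w[peak:].sum()
    tail = w[peak + tail_offset:].sum()      # window offsets are detector-specific
    return tail / total if total > 0 else np.nan
```

Per-waveform parameters such as this ratio, together with timing and energy, are the kind of compact quantities that can be transferred and stored in place of the full trace at high event rates.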
If Time Is Brain Where Is the Improvement in Prehospital Time after Stroke?
Pulvers, Jeremy N.; Watson, John D. G.
2017-01-01
Despite the availability of thrombolytic and endovascular therapy for acute ischemic stroke, many patients are ineligible due to delayed hospital arrival. The identification of factors related to either early or delayed hospital arrival may reveal potential targets of intervention to reduce prehospital delay and improve access to time-critical thrombolysis and clot retrieval therapy. Here, we have reviewed studies reporting on factors associated with either early or delayed hospital arrival after stroke, together with an analysis of stroke onset to hospital arrival times. Much effort in the stroke treatment community has been devoted to reducing door-to-needle times with encouraging improvements. However, this review has revealed that the median onset-to-door times and the percentage of stroke patients arriving before the logistically critical 3 h have shown little improvement in the past two decades. Major factors affecting prehospital time were related to emergency medical pathways, stroke symptomatology, patient and bystander behavior, patient health characteristics, and stroke treatment awareness. Interventions addressing these factors may prove effective in reducing prehospital delay, allowing prompt diagnosis, which in turn may increase the rates and/or efficacy of acute treatments such as thrombolysis and clot retrieval therapy and thereby improve stroke outcomes. PMID:29209269
Alotaibi, Naif M; Sarzetto, Francesca; Guha, Daipayan; Lu, Michael; Bodo, Andre; Gupta, Shaurya; Dyer, Erin; Howard, Peter; da Costa, Leodante; Swartz, Richard H; Boyle, Karl; Nathens, Avery B; Yang, Victor X D
2017-11-01
The metrics of imaging-to-puncture and imaging-to-reperfusion were recently found to be associated with the clinical outcomes of endovascular thrombectomy for acute ischemic stroke. However, measures for improving workflow within hospitals to achieve better timing results are largely unexplored for endovascular therapy. The aim of this study was to examine our experience with a novel smartphone application developed in house to improve our timing metrics for endovascular treatment. We developed an encrypted smartphone application connecting all stroke team members to expedite conversations and to provide synchronized real-time updates on the time window from stroke onset to imaging and to puncture. The effects of the application on the timing of endovascular therapy were evaluated with a secondary analysis of our single-center cohort. Our primary outcome was imaging-to-puncture time. We assessed the outcomes with nonparametric tests of statistical significance. Forty-five patients met our criteria for analysis among 66 consecutive patients with acute ischemic stroke who received endovascular therapy at our institution. After the implementation of the smartphone application, imaging-to-puncture time was significantly reduced (preapplication median time, 127 minutes; postapplication time, 69 minutes; P < 0.001). Puncture-to-reperfusion time was not affected by the application use (42 minutes vs. 36 minutes). The use of smartphone applications may reduce treatment times for endovascular therapy in acute ischemic stroke. Further studies are needed to confirm our findings. Copyright © 2017. Published by Elsevier Inc.
Qian, Xinyi Lisa; Yarnal, Careen M.; Almeida, David M.
2014-01-01
The stress suppressing model proposes that sufficient resources reduce stress. The stress exposure model suggests that certain factors expose individuals to more stress. The current study tested these two models by assessing the within-person lagged effect of leisure time on the perceived severity of daily stressors. Analyzing eight-day diary data (N=2,022), we found that having more leisure time than usual on a given day reduced the perceived severity of daily stressors the next day, and that the decrease in severity became larger with further increases in leisure time. Additionally, the effect was much stronger among busy individuals who usually had little leisure time. The findings demonstrate an accelerated suppressing effect that differed between persons, and the lagged design affords stronger implications for causality than correlational analysis. PMID:24563564
Dhar, Sanjay; Michel, Raquel; Kanna, Balavenkatesh
2011-01-01
Patient waiting time and waiting room congestion are quality indicators related to the efficiency of ambulatory care systems and patient satisfaction. Our main purpose was to test a program to decrease patient visit cycle time while maintaining high-quality healthcare in a high-volume inner-city hospital-based clinic in New York City. Use of patient flow analysis and the creation of patient care teams proved useful in identifying areas for improvement and in targeting and measuring the effectiveness of interventions. The end result is a reduced visit cycle time, improved provider team performance, and sustained patient care outcomes. © 2010 National Association for Healthcare Quality.
NASA Astrophysics Data System (ADS)
Chen, R. J.; Wang, M.; Yan, X. L.; Yang, Q.; Lam, Y. H.; Yang, L.; Zhang, Y. H.
2017-12-01
The periodic signals tracking algorithm has been used to determine the revolution times of ions stored in storage rings in isochronous mass spectrometry (IMS) experiments. Performing real-time data analysis with the periodic signals tracking algorithm in IMS experiments has been a challenge. In this paper, a parallelization scheme for the periodic signals tracking algorithm is introduced and a new program is developed. Compared to the old program on a Xeon E5540 CPU, the computing time of the data analysis is reduced by factors of ∼71 and ∼346 with the new program on Tesla C1060 and Tesla K20c GPUs, respectively. We succeeded in performing real-time data analysis for the IMS experiments by using the new program on the Tesla K20c GPU.
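The abstract does not give the algorithm's internals; as a hedged illustration of the underlying idea, the revolution time of one stored ion can be estimated by a least-squares fit of successive passage timestamps, t_n ≈ t0 + n·T. The toy sketch below is CPU-only NumPy; the GPU versions parallelize this kind of tracking across many ions and candidate periods.

    import numpy as np

    def revolution_time(timestamps):
        """Least-squares fit of passage times t_n = t0 + n*T; returns T,
        a toy stand-in for the periodic signals tracking algorithm."""
        n = np.arange(len(timestamps))
        T, t0 = np.polyfit(n, timestamps, 1)
        return T

    # Simulated ion with T = 512.3 ns plus 5 ps timing jitter:
    rng = np.random.default_rng(1)
    ts = 512.3e-9 * np.arange(1000) + rng.normal(0, 5e-12, 1000)
    print(revolution_time(ts))  # ~5.123e-07 s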
An analysis of thermal response factors and how to reduce their computational time requirement
NASA Technical Reports Server (NTRS)
Wiese, M. R.
1982-01-01
The RESFAC2 version of the Thermal Response Factor Program (RESFAC) is the result of numerous modifications and additions to the original RESFAC. These modifications and additions have significantly reduced the program's computational time requirement. As a result, the program is more efficient and its code is both readable and understandable. This report describes what a thermal response factor is; analyzes the original matrix algebra calculations and root finding techniques; presents a new root finding technique and streamlined matrix algebra; and supplies ten validation cases and their results.
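Response-factor calculations of this kind hinge on finding the roots of a transcendental characteristic equation. The report's actual equations are not given in the abstract, so the sketch below uses the classic slab-conduction equation x·tan(x) = Bi purely as an illustration, with SciPy's brentq bracketed between the poles of tan(x) so each root is isolated.

    import numpy as np
    from scipy.optimize import brentq

    # Illustrative only: characteristic equation x*tan(x) = Bi of a
    # transient slab-conduction problem; its roots drive the series terms.
    Bi = 2.0
    f = lambda x: x * np.tan(x) - Bi

    # Bracket each root inside (k*pi, k*pi + pi/2), where f changes sign.
    roots = [brentq(f, k * np.pi + 1e-6, k * np.pi + np.pi / 2 - 1e-6)
             for k in range(5)]
    print(roots)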
Markby, Jessica; Boeke, Caroline; Penazzato, Martina; Urick, Brittany; Ghadrshenas, Anisa; Harris, Lindsay; Ford, Nathan; Peter, Trevor
2017-01-01
Background: Despite significant gains made toward improving access, early infant diagnosis (EID) testing programs suffer from long test turnaround times that result in substantial loss to follow-up and mortality associated with delays in antiretroviral therapy initiation. These delays in treatment initiation are particularly impactful because of significant HIV-related infant mortality observed by 2–3 months of age. Short message service (SMS) and general packet radio service (GPRS) printers allow test results to be transmitted immediately to health care facilities on completion of testing in the laboratory. Methods: We conducted a systematic review and meta-analysis to assess the benefit of using SMS/GPRS printers to increase the efficiency of EID test result delivery compared with traditional courier paper–based results delivery methods. Results: We identified 11 studies contributing data for over 16,000 patients from East and Southern Africa. The test turnaround time from specimen collection to result received at the health care facility with courier paper–based methods was 68.0 days (n = 6835), whereas the test turnaround time with SMS/GPRS printers was 51.1 days (n = 6711), resulting in a 2.5-week (25%) reduction in the turnaround time. Conclusions: Courier paper–based EID test result delivery methods are estimated to add 2.5 weeks to EID test turnaround times in low resource settings and increase the risk that infants receive test results during or after the early peak of infant mortality. SMS/GPRS result delivery to health care facility printers significantly reduced test turnaround time and may reduce this risk. SMS/GPRS printers should be considered for expedited delivery of EID and other centralized laboratory test results. PMID:28825941
Constrained reduced-order models based on proper orthogonal decomposition
Reddy, Sohail R.; Freno, Brian Andrew; Cizmas, Paul G. A.; ...
2017-04-09
A novel approach is presented to constrain reduced-order models (ROM) based on proper orthogonal decomposition (POD). The Karush–Kuhn–Tucker (KKT) conditions were applied to the traditional reduced-order model to constrain the solution to user-defined bounds. The constrained reduced-order model (C-ROM) was applied and validated against the analytical solution to the first-order wave equation. C-ROM was also applied to the analysis of fluidized beds. Lastly, it was shown that the ROM and C-ROM produced accurate results and that C-ROM was less sensitive to error propagation through time than the ROM.
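As background for the POD step, a minimal sketch of extracting a POD basis from snapshot data via the SVD is shown below. The snapshot matrix and the 99% energy threshold are illustrative, and the paper's KKT-based constraint step is not shown.

    import numpy as np

    # Snapshot matrix: each column is a solution state at one time instant.
    rng = np.random.default_rng(0)
    snapshots = rng.standard_normal((500, 40))          # illustrative data

    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 0.99)) + 1          # retain 99% of energy
    basis = U[:, :r]                                    # POD modes

    # Reduced coordinates of a full-order state x: a = basis.T @ x;
    # the ROM then evolves a(t) instead of the full state.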
Fully automatic registration and segmentation of first-pass myocardial perfusion MR image sequences.
Gupta, Vikas; Hendriks, Emile A; Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F
2010-11-01
Derivation of diagnostically relevant parameters from first-pass myocardial perfusion magnetic resonance images involves the tedious and time-consuming manual segmentation of the myocardium in a large number of images. To reduce the manual interaction and expedite the perfusion analysis, we propose an automatic registration and segmentation method for the derivation of perfusion linked parameters. A complete automation was accomplished by first registering misaligned images using a method based on independent component analysis, and then using the registered data to automatically segment the myocardium with active appearance models. We used 18 perfusion studies (100 images per study) for validation in which the automatically obtained (AO) contours were compared with expert drawn contours on the basis of point-to-curve error, Dice index, and relative perfusion upslope in the myocardium. Visual inspection revealed successful segmentation in 15 out of 18 studies. Comparison of the AO contours with expert drawn contours yielded 2.23 ± 0.53 mm and 0.91 ± 0.02 as point-to-curve error and Dice index, respectively. The average difference between manually and automatically obtained relative upslope parameters was found to be statistically insignificant (P = .37). Moreover, the analysis time per slice was reduced from 20 minutes (manual) to 1.5 minutes (automatic). We proposed an automatic method that significantly reduced the time required for analysis of first-pass cardiac magnetic resonance perfusion images. The robustness and accuracy of the proposed method were demonstrated by the high spatial correspondence and statistically insignificant difference in perfusion parameters, when AO contours were compared with expert drawn contours. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.
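The Dice index used for validation above is simple to compute from two binary masks; a minimal sketch (mask shapes and values are invented for the example) follows.

    import numpy as np

    def dice(a, b):
        """Dice overlap between two binary masks (1 = myocardium)."""
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    # Compare an automatically obtained mask with an expert-drawn mask.
    auto = np.zeros((128, 128), dtype=bool); auto[30:90, 40:100] = True
    manual = np.zeros((128, 128), dtype=bool); manual[32:92, 42:102] = True
    print(f"Dice = {dice(auto, manual):.3f}")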
A New Modified Histogram Matching Normalization for Time Series Microarray Analysis
Astola, Laura; Molenaar, Jaap
2014-01-01
Microarray data are often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data. PMID:27600344
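For reference, the standard quantile normalization that the paper argues against for time series data can be sketched in a few lines: every column (array) is forced to share the same distribution, namely the mean of the sorted columns. Ties are handled crudely here.

    import numpy as np

    def quantile_normalize(X):
        """Classic QN of an (features x arrays) matrix."""
        order = np.argsort(X, axis=0)
        ranks = np.argsort(order, axis=0)          # rank of each entry per column
        mean_sorted = np.sort(X, axis=0).mean(axis=1)
        return mean_sorted[ranks]                  # map ranks to reference quantiles

    X = np.array([[5., 4., 3.],
                  [2., 1., 4.],
                  [3., 4., 6.],
                  [4., 2., 8.]])
    print(quantile_normalize(X))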
Structural-Vibration-Response Data Analysis
NASA Technical Reports Server (NTRS)
Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.
1983-01-01
Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.
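A minimal version of least-squares fitting of a free-decay record can be written with SciPy's curve_fit. The single-mode model, small-damping approximation, and synthetic data below are illustrative; the actual program identifies multiple modes, dampings, and participation factors simultaneously.

    import numpy as np
    from scipy.optimize import curve_fit

    def free_decay(t, A, zeta, f, phi):
        """Single-mode free decay, small-damping approximation."""
        wn = 2 * np.pi * f
        return A * np.exp(-zeta * wn * t) * np.cos(wn * t + phi)

    # Synthetic noisy record: f = 4 Hz, 2% damping.
    t = np.linspace(0, 5, 2000)
    rng = np.random.default_rng(0)
    y = free_decay(t, 1.0, 0.02, 4.0, 0.3) + rng.normal(0, 0.02, t.size)

    popt, _ = curve_fit(free_decay, t, y, p0=[1.0, 0.01, 3.5, 0.0])
    print("A, zeta, f, phi =", popt)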
USDA-ARS's Scientific Manuscript database
Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR is time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...
Research and application of embedded real-time operating system
NASA Astrophysics Data System (ADS)
Zhang, Bo
2013-03-01
In this paper, based on an analysis of existing embedded real-time operating systems, the architecture of an operating system is designed and implemented. The experimental results show that the design fully complies with the requirements of an embedded real-time operating system and can achieve the goals of reducing the complexity of embedded software design and improving maintainability, reliability, and flexibility. This design therefore has high practical value.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peter W. Carr; K.M. Fuller; D.R. Stoll
A new approach has been developed by modifying a conventional gradient elution liquid chromatograph for the high-throughput screening of biological samples to detect the presence of regulated intoxicants. The goal of this work was to improve the speed of a gradient elution screening method over current approaches by optimizing the operational parameters of both the column and the instrument without compromising the reproducibility of the retention times, which are the basis for the identification. Most importantly, the novel instrument configuration substantially reduces the time needed to re-equilibrate the column between gradient runs, thereby reducing the total time for each analysis. The total analysis time for each gradient elution run is only 2.8 minutes, including 0.3 minutes for column re-equilibration between analyses. Retention times of standard calibration solutes are reproducible to better than 0.002 minutes in consecutive runs. A corrected retention index was adopted to account for day-to-day and column-to-column variations in retention time. The discriminating power and mean list length were calculated for a library of 47 intoxicants and compared with previous work from other laboratories to evaluate fast gradient elution HPLC as a screening tool.
78 FR 24201 - Graco, Inc.; Analysis of Agreement Containing Consent Order To Aid Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-24
.... (``GlasCraft''). The Commission Complaint (``Complaint'') alleges that, at the time of the acquisitions... supply of fast-set equipment might later be interrupted as a result of litigation. To reduce that barrier... be restored. IV. The Consent Agreement Since the acquisitions were completed some time ago, it is not...
USDA-ARS's Scientific Manuscript database
The variation in instar number and the pattern of sequential instar development time of Tenebrio molitor L. (Coleoptera: Tenebrionidae) was studied under 4 different diet regimes. Addition of dietary supplements consisting of dry potato or a mix of dry potato and dry egg whites significantly reduced...
Visual cluster analysis and pattern recognition methods
Osbourn, Gordon Cecil; Martinez, Rubel Francisco
2001-01-01
A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable and improve pattern recognition techniques.
Müller, Jana Annina; Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas
2016-01-01
The aim of this study was to validate a procedure for performing the audio-visual paradigm introduced by Wendt et al. (2015) with reduced practical challenges. The original paradigm records eye fixations using an eye tracker and calculates the duration of sentence comprehension based on a bootstrap procedure. To reduce practical challenges, we first reduced the measurement time by evaluating a smaller measurement set with fewer trials. The results of 16 listeners showed effects comparable to those obtained when testing the original full measurement set on a different group of listeners. Secondly, we introduced electrooculography as an alternative technique for recording eye movements. The correlation between the results of the two recording techniques (eye tracker and electrooculography) was r = 0.97, indicating that both methods are suitable for estimating the processing duration of individual participants. Similar changes in processing duration arising from sentence complexity were found using the eye tracker and the electrooculography procedure. Thirdly, the time course of eye fixations was estimated with an alternative procedure, growth curve analysis, which is more commonly used in recent studies analyzing eye tracking data. The results of the growth curve analysis were compared with the results of the bootstrap procedure. Both analysis methods show similar processing durations. PMID:27764125
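The bootstrap estimate of processing duration is not specified in detail here; as a hedged, generic illustration, a percentile bootstrap confidence interval for a per-listener statistic can be computed as below (the durations array is invented for the example).

    import numpy as np

    def bootstrap_ci(x, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
        """Percentile bootstrap CI for a per-listener statistic,
        e.g. a processing duration in seconds."""
        rng = np.random.default_rng(seed)
        boots = [stat(rng.choice(x, size=len(x), replace=True))
                 for _ in range(n_boot)]
        lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return stat(x), (lo, hi)

    durations = np.array([0.61, 0.55, 0.72, 0.49, 0.66, 0.58])  # illustrative
    print(bootstrap_ci(durations))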
[Evaluating cost/equity in the Colombian health system, 1998-2005].
Eslava-Schmalbach, Javier; Barón, Gilberto; Gaitán-Duarte, Hernando; Alfonso, Helman; Agudelo, Carlos; Sánchez, Carolina
2008-01-01
An economic analysis of cost-equity (from society's viewpoint) for evaluating the impact of Law 100/93 in Colombia between 1998 and 2005. The economic analysis compared costs and equity in health in Colombia between 1998 and 2005. Data were taken from the Colombian statistics administration department (Departamento Administrativo Nacional de Estadística, DANE) and from national demographic and health surveys carried out in 2000 and 2005. Information regarding costs was taken from the National Health Accounts System. Inequity in health was measured with the Inequity in Health Index (IHI). Incremental and average cost-equity analysis covered three sub-periods: 1998-1999 (during which per capita gross domestic product fell in Colombia), 2000-2001 (during which total health expenditure fell), and 2001-2005. An unstable tendency toward reduced inequity in health during the period was revealed. There was an inverse relationship between IHI and public health spending and a direct relationship between out-of-pocket spending on health and equity in health (Spearman, p<0.05). The second period had the best incremental cost-equity ratio. Fluctuations in IHI and marginal cost-equity during the periods analysed suggested that health spending depended on equity in health in Colombia during the period being studied.
Sloan, Robert A; Kim, Youngdeok; Sahasranaman, Aarti; Müller-Riemenschneider, Falk; Biddle, Stuart J H; Finkelstein, Eric A
2018-03-22
A recent meta-analysis surmised that pedometers were a useful panacea to independently reduce sedentary time (ST). To further test and expand on this deduction, we analyzed the ability of a consumer-wearable activity tracker to reduce ST and prolonged sedentary bouts (PSB). We originally conducted a 12-month randomized controlled trial in which 800 employees from 13 organizations were assigned to control, activity tracker, or one of two activity-tracker-plus-incentive groups designed to increase step count. The primary outcome was accelerometer-measured moderate-to-vigorous physical activity. We conducted a secondary analysis of accelerometer-measured daily ST and PSB. A general linear mixed model was used to examine changes in ST and prolonged sedentary bouts, followed by between-group pairwise comparisons. Regression analyses were conducted to examine the association of changes in step counts with ST and PSB. The changes in ST and PSB were not statistically significant and did not differ between the groups (P > 0.05). Increases in step counts were concomitantly associated with decreases in ST and PSB, regardless of intervention (P < 0.05). Caution should be taken when considering consumer-wearable activity trackers as a means to reduce sedentary behavior. Trial registration NCT01855776 Registered: August 8, 2012.
Chowdhury, Olie; Wedderburn, Catherine J; Duffy, Donovan; Greenough, Anne
2012-10-01
Continuous positive airway pressure (CPAP) is widely used in neonatal units both as a primary mode of respiratory support and following extubation from mechanical ventilation. In this review, the evidence for CPAP use particularly in prematurely born infants is considered. Studies comparing methods of CPAP generation have yielded conflicting results, but meta-analysis of randomised trials has demonstrated that delivering CPAP via short nasal prongs is most effective in preventing re-intubation. At present, there is insufficient evidence to establish the safety or efficacy of high flow nasal cannulae for prematurely born infants. Observational studies highlighted that early CPAP use rather than intubation and ventilation was associated with a lower incidence of bronchopulmonary dysplasia (BPD), but this has not been confirmed in three large randomised trials. Meta-analysis of the results of randomised trials has demonstrated that use of CPAP reduces extubation failure, particularly if a CPAP level of 5 cm H2O or more is used. Nasal injury can occur and is related to the length of time CPAP is used; weaning CPAP by pressure rather than by "time-cycling" reduces the weaning time and may reduce BPD. In conclusion, further studies are required to identify the optimum mode of CPAP generation and it is important that prematurely born infants are weaned from CPAP as soon as possible.
Costs of a Staff Communication Intervention to Reduce Dementia Behaviors in Nursing Home Care
Williams, Kristine N.; Ayyagari, Padmaja; Perkhounkova, Yelena; Bott, Marjorie J.; Herman, Ruth; Bossen, Ann
2017-01-01
CONTEXT Persons with Alzheimer’s disease and other dementias experience behavioral symptoms that frequently result in nursing home (NH) placement. Managing behavioral symptoms in the NH increases the staff time required to complete care, and adds to staff stress and turnover, with estimated cost increases of 30%. The Changing Talk to Reduce Resistiveness to Dementia Care (CHAT) study found that an intervention that improved staff communication by reducing elderspeak led to reduced behavioral symptoms of dementia or resistiveness to care (RTC). OBJECTIVE This analysis evaluates the cost-effectiveness of the CHAT intervention to reduce elderspeak communication by staff and RTC behaviors of NH residents with dementia. DESIGN Costs to provide the intervention were determined in eleven NHs that participated in the CHAT study during 2011–2013 using process-based costing. Each NH provided data on staff wages for the quarter before and for two quarters after the CHAT intervention. An incremental cost-effectiveness analysis was completed. ANALYSIS An average cost per participant was calculated based on the number and type of staff attending the CHAT training, plus materials and interventionist time. Regression estimates from the parent study then were applied to determine costs per unit reduction in staff elderspeak communication and resident RTC. RESULTS A one percentage point reduction in elderspeak costs $6.75 per staff member with average baseline elderspeak usage. Assuming that each staff member cares for 2 residents with RTC, a one percentage point reduction in RTC costs $4.31 per resident using average baseline RTC. CONCLUSIONS Costs to reduce elderspeak and RTC depend on baseline levels of elderspeak and RTC, as well as the number of staff participating in CHAT training and the number of residents with dementia-related behaviors. Overall, the 3-session CHAT training program is a cost-effective intervention for reducing RTC behaviors in dementia care. PMID:28503675
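The incremental cost-effectiveness logic reduces to extra dollars divided by extra effect. The sketch below illustrates the computation only; the $90 training cost and 13.3-point reduction are invented inputs chosen to land near the $6.75-per-point figure quoted above, not the study's raw data.

    def icer(cost_new, cost_old, effect_new, effect_old):
        """Incremental cost-effectiveness ratio: extra dollars per extra
        unit of effect (here, percentage-point reduction in elderspeak)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    # Hypothetical: training one staff member costs $90 and reduces
    # elderspeak by 13.3 percentage points versus no change without it.
    print(icer(90.0, 0.0, 13.3, 0.0))   # ~$6.77 per percentage point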
Development of an Efficient Binaural Simulation for the Analysis of Structural Acoustic Data
NASA Technical Reports Server (NTRS)
Lalime, Aimee L.; Johnson, Marty E.; Rizzi, Stephen A. (Technical Monitor)
2002-01-01
Binaural or "virtual acoustic" representation has been proposed as a method of analyzing acoustic and vibroacoustic data. Unfortunately, this binaural representation can require extensive computer power to apply the Head Related Transfer Functions (HRTFs) to a large number of sources, as with a vibrating structure. This work focuses on reducing the number of real-time computations required in this binaural analysis through the use of Singular Value Decomposition (SVD) and Equivalent Source Reduction (ESR). The SVD method reduces the complexity of the HRTF computations by breaking the HRTFs into dominant singular values (and vectors). The ESR method reduces the number of sources to be analyzed in real-time computation by replacing sources on the scale of a structural wavelength with sources on the scale of an acoustic wavelength. It is shown that the effectiveness of the SVD and ESR methods improves as the complexity of the source increases. In addition, preliminary auralization tests have shown that the results from both the SVD and ESR methods are indistinguishable from the results found with the exhaustive method.
Crowdsourcing and Automated Retinal Image Analysis for Diabetic Retinopathy.
Mudie, Lucy I; Wang, Xueyang; Friedman, David S; Brady, Christopher J
2017-09-23
As the number of people with diabetic retinopathy (DR) in the USA is expected to increase threefold by 2050, the need to reduce health care costs associated with screening for this treatable disease is ever present. Crowdsourcing and automated retinal image analysis (ARIA) are two areas where new technology has been applied to reduce costs in screening for DR. This paper reviews the current literature surrounding these new technologies. Crowdsourcing has high sensitivity for normal vs abnormal images; however, when multiple categories for severity of DR are added, specificity is reduced. ARIAs have higher sensitivity and specificity, and some commercial ARIA programs are already in use. Deep learning enhanced ARIAs appear to offer even more improvement in ARIA grading accuracy. The utilization of crowdsourcing and ARIAs may be a key to reducing the time and cost burden of processing images from DR screening.
John-Baptiste, A.; Sowerby, L.J.; Chin, C.J.; Martin, J.; Rotenberg, B.W.
2016-01-01
Background: When prearranged standard surgical trays contain instruments that are repeatedly unused, the redundancy can result in unnecessary health care costs. Our objective was to estimate potential savings by performing an economic evaluation comparing the cost of surgical trays with redundant instruments with surgical trays with reduced instruments ("reduced trays"). Methods: We performed a cost-analysis from the hospital perspective over a 1-year period. Using a mathematical model, we compared the direct costs of trays containing redundant instruments to reduced trays for 5 otolaryngology procedures. We incorporated data from several sources including local hospital data on surgical volume, the number of instruments on redundant and reduced trays, wages of personnel and time required to pack instruments. From the literature, we incorporated instrument depreciation costs and the time required to decontaminate an instrument. We performed 1-way sensitivity analyses on all variables, including surgical volume. Costs were estimated in 2013 Canadian dollars. Results: The cost of redundant trays was $21 806 and the cost of reduced trays was $8803, for a 1-year cost saving of $13 003. In sensitivity analyses, cost savings ranged from $3262 to $21 395, based on the surgical volume at the institution. Variation in surgical volume resulted in a wider range of estimates, with a minimum of $3253 for low-volume to a maximum of $52 012 for high-volume institutions. Interpretation: Our study suggests moderate savings may be achieved by reducing surgical tray redundancy and, if applied to other surgical specialties, may result in savings to Canadian health care systems. PMID:27975045
Model diagnostics in reduced-rank estimation
Chen, Kun
2016-01-01
Reduced-rank methods are very popular in high-dimensional multivariate analysis for conducting simultaneous dimension reduction and model estimation. However, the commonly-used reduced-rank methods are not robust, as the underlying reduced-rank structure can be easily distorted by only a few data outliers. Anomalies are bound to exist in big data problems, and in some applications they themselves could be of the primary interest. While naive residual analysis is often inadequate for outlier detection due to potential masking and swamping, robust reduced-rank estimation approaches could be computationally demanding. Under Stein's unbiased risk estimation framework, we propose a set of tools, including leverage score and generalized information score, to perform model diagnostics and outlier detection in large-scale reduced-rank estimation. The leverage scores give an exact decomposition of the so-called model degrees of freedom to the observation level, which leads to an exact decomposition of many commonly-used information criteria; the resulting quantities are thus named information scores of the observations. The proposed information score approach provides a principled way of combining the residuals and leverage scores for anomaly detection. Simulation studies confirm that the proposed diagnostic tools work well. A pattern recognition example with handwritten digit images and a time series analysis example with monthly U.S. macroeconomic data further demonstrate the efficacy of the proposed approaches. PMID:28003860
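For context, the basic (non-robust) reduced-rank regression estimator whose outlier sensitivity motivates these diagnostics can be sketched as an OLS fit projected onto the top singular vectors of the fitted values; the data below are simulated with a true coefficient rank of 2.

    import numpy as np

    def rrr(X, Y, r):
        """Plain reduced-rank regression: OLS fit, then projection onto
        the top-r right singular vectors of the fitted values."""
        B_ols = np.linalg.pinv(X) @ Y
        _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
        Vr = Vt[:r].T
        return B_ols @ Vr @ Vr.T

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 8))
    B = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 6))  # rank 2
    Y = X @ B + 0.1 * rng.standard_normal((100, 6))
    B_hat = rrr(X, Y, r=2)
    print(np.linalg.matrix_rank(B_hat))  # 2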
Visual Analytics of integrated Data Systems for Space Weather Purposes
NASA Astrophysics Data System (ADS)
Rosa, Reinaldo; Veronese, Thalita; Giovani, Paulo
Analysis of information from multiple data sources obtained through high-resolution instrumental measurements has become a fundamental task in all scientific areas. The development of expert methods able to treat such multi-source data systems, with both large variability and measurement extension, is key to studying complex scientific phenomena, especially those related to systemic analysis in space and environmental sciences. In this talk, we present a time series generalization introducing the concept of a generalized numerical lattice, which represents a discrete sequence of temporal measures for a given variable. In this representation, each generalized numerical lattice carries post-analytical data information. We define a generalized numerical lattice as a set of three parameters representing the following data properties: dimensionality, size, and a post-analytical measure (e.g., the autocorrelation, Hurst exponent, etc.) [1]. From this generalization, any multi-source database can be reduced to a closed set of classified time series in spatiotemporal generalized dimensions. As a case study, we show a preliminary application to space science data, highlighting the possibility of a real-time expert analysis system. In this application, we selected and analyzed, using detrended fluctuation analysis (DFA), several decimetric solar bursts associated with X-class flares. The association with geomagnetic activity is also reported. The DFA method is performed in the framework of an automatic radio burst monitoring system. Our results may characterize the evolution of the variability pattern by computing the DFA scaling exponent over a short sliding window preceding the extreme event [2]. For the first time, the application of systematic fluctuation analysis for space weather purposes is presented. The prototype for visual analytics is implemented in the Compute Unified Device Architecture (CUDA), using Nvidia K20 graphics processing units (GPUs) to reduce the integrated analysis runtime. [1] Veronese et al., doi:10.6062/jcis.2009.01.02.0021, 2010. [2] Veronese et al., doi:10.1016/j.jastp.2010.09.030, 2011.
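The detrended fluctuation analysis used above is straightforward to prototype before porting to CUDA. A minimal NumPy sketch follows; the window sizes and the order-1 detrending are illustrative choices.

    import numpy as np

    def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
        """DFA: slope of log F(n) versus log n, with linear detrending
        of the integrated profile in non-overlapping windows."""
        y = np.cumsum(x - np.mean(x))            # integrated profile
        F = []
        for n in scales:
            n_seg = len(y) // n
            segs = y[:n_seg * n].reshape(n_seg, n)
            t = np.arange(n)
            f2 = []
            for seg in segs:
                c = np.polyfit(t, seg, 1)        # local linear trend
                f2.append(np.mean((seg - np.polyval(c, t)) ** 2))
            F.append(np.sqrt(np.mean(f2)))
        alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
        return alpha

    rng = np.random.default_rng(0)
    print(dfa_exponent(rng.standard_normal(4096)))  # ~0.5 for white noise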
Nanoparticle separation with a miniaturized asymmetrical flow field-flow fractionation cartridge
Müller, David; Cattaneo, Stefano; Meier, Florian; Welz, Roland; de Mello, Andrew J.
2015-01-01
Asymmetrical Flow Field-Flow Fractionation (AF4) is a separation technique applicable to particles over a wide size range. Despite the many advantages of AF4, its adoption in routine particle analysis is somewhat limited by the large footprint of currently available separation cartridges, extended analysis times and significant solvent consumption. To address these issues, we describe the fabrication and characterization of miniaturized AF4 cartridges. Key features of the down-scaled platform include simplified cartridge and reagent handling, reduced analysis costs and higher throughput capacities. The separation performance of the miniaturized cartridge is assessed using certified gold and silver nanoparticle standards. Analysis of gold nanoparticle populations indicates shorter analysis times and increased sensitivity compared to conventional AF4 separation schemes. Moreover, nanoparticulate titanium dioxide populations exhibiting broad size distributions are analyzed in a rapid and efficient manner. Finally, the repeatability and reproducibility of the miniaturized platform are investigated with respect to analysis time and separation efficiency. PMID:26258119
The cost analysis of material handling in Chinese traditional praying paper production plant
NASA Astrophysics Data System (ADS)
Nasution, H.; Budiman, I.; Salim, A.
2018-02-01
The Chinese traditional praying paper industry produces paper products used in traditional Chinese religious worship. This kind of industry is rarely examined, since it exists only in the form of small and medium-sized enterprises (SMEs). The industry produces various kinds of Chinese traditional paper products. The purpose of this research is to increase the amount of production, reduce waiting and moving times, and reduce material handling cost. The research focused on the main production activities and consisted of calculating the capacity of the material handler, the frequency of movement, the cost of material handling, and the total material handling cost. The current displacement pattern leads to an ineffective and inefficient production process. An alternative was developed using production judgment and aisle standards. Based on the observations, it is possible to reduce displacement in production: an alternative that bypasses the movement of rolled paper through the temporary warehouse, sending it directly to the cutting and printing workstation, reduces the material handling cost from 2.26 million rupiahs to 2.00 million rupiahs per production batch. This increases production quantity and reduces waiting and moving times by about 10% relative to the current condition.
Reduced body size and cub recruitment in polar bears associated with sea ice decline.
Rode, Karyn D; Amstrup, Steven C; Regehr, Eric V
2010-04-01
Rates of reproduction and survival are dependent upon adequate body size and condition of individuals. Declines in size and condition have provided early indicators of population decline in polar bears (Ursus maritimus) near the southern extreme of their range. We tested whether patterns in body size, condition, and cub recruitment of polar bears in the southern Beaufort Sea of Alaska were related to the availability of preferred sea ice habitats and whether these measures and habitat availability exhibited trends over time, between 1982 and 2006. The mean skull size and body length of all polar bears over three years of age declined over time, corresponding with long-term declines in the spatial and temporal availability of sea ice habitat. Body size of young, growing bears declined over time and was smaller after years when sea ice availability was reduced. Reduced litter mass and numbers of yearlings per female following years with lower availability of optimal sea ice habitat suggest reduced reproductive output and juvenile survival. These results, based on analysis of a long-term data set, suggest that declining sea ice is associated with nutritional limitations that reduced body size and reproduction in this population.
Real-Time Mapping Spectroscopy on the Ground, in the Air, and in Space
NASA Astrophysics Data System (ADS)
Thompson, D. R.; Allwood, A.; Chien, S.; Green, R. O.; Wettergreen, D. S.
2016-12-01
Real-time data interpretation can benefit both remote in situ exploration and remote sensing. Basic analyses at the sensor can monitor instrument performance and reveal invisible science phenomena in real time. This promotes situational awareness for remote robotic explorers or campaign decision makers, enabling adaptive data collection, reduced downlink requirements, and coordinated multi-instrument observations. Fast analysis is ideal for mapping spectrometers providing unambiguous, quantitative geophysical measurements. This presentation surveys recent computational advances in real-time spectroscopic analysis for Earth science and planetary exploration. Spectral analysis at the sensor enables new operations concepts that significantly improve science yield. Applications include real-time detection of fugitive greenhouse emissions by airborne monitoring, real-time cloud screening and mineralogical mapping by orbital spectrometers, and adaptive measurement by the PIXL instrument on the Mars 2020 rover. Copyright 2016 California Institute of Technology. All Rights Reserved. We acknowledge support of the US Government, NASA, the Earth Science Division and Terrestrial Ecology program.
Adaptive noise cancelling and time-frequency techniques for rail surface defect detection
NASA Astrophysics Data System (ADS)
Liang, B.; Iwnicki, S.; Ball, A.; Young, A. E.
2015-03-01
Adaptive noise cancelling (ANC) is a technique that is very effective at removing additive noise from contaminated signals. It has been widely used in the fields of telecommunication, radar, and sonar signal processing, but it was seldom used for the surveillance and diagnosis of mechanical systems before the late 1990s. As a promising technique it has gradually been exploited for condition monitoring and fault diagnosis. Time-frequency analysis is another useful tool for condition monitoring and fault diagnosis, as it retains both time and frequency information simultaneously. This paper presents an ANC and time-frequency application for railway wheel flat and rail surface defect detection. The experimental results from a scaled roller test rig show that this approach can significantly reduce unwanted interference and extract weak signals from strong background noise. The combination of ANC and time-frequency analysis may provide a useful tool for condition monitoring and fault diagnosis of railway vehicles.
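A minimal LMS-based adaptive noise canceller illustrates the ANC idea: a reference sensor that picks up only (correlated) noise drives an adaptive filter whose error output converges to the cleaned signal. The filter length, step size, and toy signals below are illustrative choices, not the paper's configuration.

    import numpy as np

    def lms_cancel(primary, reference, n_taps=32, mu=0.005):
        """LMS adaptive noise canceller: 'primary' = signal + noise,
        'reference' = correlated noise only; returns the error signal,
        which converges to the cleaned signal."""
        w = np.zeros(n_taps)
        out = np.zeros(len(primary))
        for i in range(n_taps - 1, len(primary)):
            x = reference[i - n_taps + 1:i + 1][::-1]  # current + past samples
            y = w @ x                                  # noise estimate
            e = primary[i] - y                         # cleaned sample
            w += 2 * mu * e * x                        # LMS weight update
            out[i] = e
        return out

    # Toy demo: recover a sine buried in noise seen (filtered) by the primary.
    rng = np.random.default_rng(0)
    n = rng.standard_normal(5000)
    t = np.arange(5000)
    clean = np.sin(2 * np.pi * t / 50)
    primary = clean + np.convolve(n, [0.8, 0.4], mode="same")
    cleaned = lms_cancel(primary, n)
    print("residual noise power:", np.mean((cleaned[1000:] - clean[1000:]) ** 2))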
Brazhe, Nadezda A.; Treiman, Marek; Brazhe, Alexey R.; Find, Ninett L.; Maksimov, Georgy V.; Sosnovtseva, Olga V.
2012-01-01
This paper presents a noninvasive approach to study the redox state of reduced cytochromes of complexes II and III in mitochondria of live cardiomyocytes by means of Raman microspectroscopy. For the first time with the proposed approach we perform studies of rod- and round-shaped cardiomyocytes, representing different morphological and functional states. Raman mapping and cluster analysis reveal that these cardiomyocytes differ in the amounts of reduced cytochromes. The rod-shaped cardiomyocytes possess an uneven distribution of reduced cytochromes between the cell center and periphery. Moreover, by means of Raman spectroscopy we demonstrate a decrease in the relative amounts of reduced cytochromes in rod-shaped cardiomyocytes caused by H2O2-induced oxidative stress, before any visible changes. The results of Raman mapping and the time-dependent study of reduced cytochromes of complexes II and III in cardiomyocytes are in good agreement with our fluorescence indicator studies and other published data. PMID:22957018
QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.
Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter
2015-07-01
Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.
MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce
2015-01-01
Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
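The core idea, emitting multi-keyed intermediate data so several related algorithms share one pass over the input, can be mimicked outside Hadoop in a few lines. The sketch below is a plain-Python toy, not the MRPack implementation, and the three per-record statistics are arbitrary examples.

    from collections import defaultdict

    def multi_algorithm_map(record):
        """Emit multi-keyed intermediate pairs so several related
        analyses share one pass over the input."""
        words = record.split()
        yield ("wordcount", len(words))
        yield ("charcount", len(record))
        yield ("maxword", max((len(w) for w in words), default=0))

    def reduce_all(pairs):
        acc = defaultdict(list)
        for k, v in pairs:
            acc[k].append(v)
        return {"wordcount": sum(acc["wordcount"]),
                "charcount": sum(acc["charcount"]),
                "maxword": max(acc["maxword"])}

    data = ["hadoop mapreduce job", "single pass over the data"]
    pairs = [kv for rec in data for kv in multi_algorithm_map(rec)]
    print(reduce_all(pairs))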
Chromatic Image Analysis For Quantitative Thermal Mapping
NASA Technical Reports Server (NTRS)
Buck, Gregory M.
1995-01-01
Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.
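The ratio-based temperature computation can be sketched directly: the per-pixel brightness ratio at the two wavelengths is mapped to temperature through a monotone calibration curve. The calibration values and images below are invented for illustration; real phosphor calibrations come from reference measurements.

    import numpy as np

    def temperature_map(img_w1, img_w2, calib_ratio, calib_temp):
        """Map the per-pixel brightness ratio at two emission wavelengths
        to temperature via a monotone calibration lookup."""
        ratio = img_w1 / np.maximum(img_w2, 1e-9)
        return np.interp(ratio, calib_ratio, calib_temp)

    # Hypothetical calibration: ratio rises from 0.5 to 2.0 as T goes 300->600 K.
    calib_ratio = np.linspace(0.5, 2.0, 16)
    calib_temp = np.linspace(300.0, 600.0, 16)

    rng = np.random.default_rng(0)
    w1 = rng.uniform(0.8, 1.6, (64, 64))      # brightness at wavelength 1
    w2 = np.ones((64, 64))                    # brightness at wavelength 2
    T = temperature_map(w1, w2, calib_ratio, calib_temp)
    print(T.min(), T.max())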
Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering
NASA Technical Reports Server (NTRS)
Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)
2001-01-01
Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed-SVD, or TSVD, which reduces the number of features into more compact energy density concentrations. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.
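A hedged sketch of the processing chain: build a time-scale energy density with a continuous wavelet transform, take its SVD, and treat the normalized squared singular values as a discrete density from which feature moments can be computed. PyWavelets (pywt) is assumed available; the chirp test signal, Morlet wavelet, and scale range are illustrative.

    import numpy as np
    import pywt  # PyWavelets, assumed available

    rng = np.random.default_rng(0)
    t = np.linspace(0, 2, 2048)
    sig = np.sin(2 * np.pi * (20 + 15 * t) * t) + 0.3 * rng.standard_normal(t.size)

    scales = np.arange(2, 64)
    coefs, _ = pywt.cwt(sig, scales, "morl")      # time-scale coefficients
    E = np.abs(coefs) ** 2                        # discrete energy density

    # Leading singular triplets of the energy density carry the principal
    # time-frequency features; their normalized squares act as a density.
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    p = s ** 2 / np.sum(s ** 2)
    print("energy share of first 3 components:", p[:3].sum())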
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnold, Anthony, E-mail: anthony.arnold@sesiahs.health.nsw.gov.a; Delaney, Geoff P.; Cassapi, Lynette
Purpose: Radiotherapy is a common treatment for cancer patients. Although the incidence of error is low, errors can be severe or affect significant numbers of patients. In addition, errors will often not manifest until long after treatment. This study describes the development of an incident reporting tool that allows categorical analysis and time trend reporting, covering the first 3 years of use. Methods and Materials: A radiotherapy-specific incident analysis system was established. Staff members were encouraged to report actual errors and near-miss events detected at the prescription, simulation, planning, or treatment phases of radiotherapy delivery. Trend reporting was reviewed monthly. Results: Reports were analyzed for the first 3 years of operation (May 2004-2007). A total of 688 reports was received during the study period. The actual error rate was 0.2% per treatment episode. During the study period, the actual error rate fell significantly from 1% per year to 0.3% per year (p < 0.001), as did the total event report rate (p < 0.0001). There were 3.5 times as many near misses reported as actual errors. Conclusions: This system has allowed real-time analysis of events within a radiation oncology department and has contributed to a reduced error rate through a focus on learning and prevention from near-miss reports. Plans are underway to develop this reporting tool for Australia and New Zealand.
Maduell, Francisco; Moreso, Francesc; Mora-Macià, Josep; Pons, Mercedes; Ramos, Rosa; Carreras, Jordi; Soler, Jordi; Torres, Ferrán
2016-01-01
The ESHOL study showed that post-dilution online haemodiafiltration (OL-HDF) reduces all-cause mortality versus haemodialysis. However, during the observation period, 355 patients prematurely completed the study and, according to the study design, these patients were censored at the time of premature termination. The aim of this study was to investigate the outcome of patients who discontinued the study. During follow-up, 207 patients died while under treatment and 47 patients died after discontinuation of the study. Compared with patients maintained on haemodialysis, those randomised to OL-HDF had lower all-cause mortality (12.4 versus 9.46 deaths per 100 patient-years; hazard ratio 0.76, 95% CI 0.59-0.98; P = 0.031). When all-cause mortality was analysed with time-dependent covariates and competing risks for transplantation, the time-dependent Cox analysis showed very similar results to the main analysis, with a hazard ratio of 0.77 (0.60-0.99, P = 0.043). The results of this analysis of the ESHOL trial confirm that post-dilution OL-HDF reduces all-cause mortality versus haemodialysis in prevalent patients. The original results of the ESHOL study, which censored patients discontinuing the study for any reason, were confirmed in the present ITT population without censoring and when all-cause mortality was considered with time-dependent covariates and competing risks for transplantation. Copyright © 2015 Sociedad Española de Nefrología. Published by Elsevier España, S.L.U. All rights reserved.
Quantification of underivatised amino acids on dry blood spot, plasma, and urine by HPLC-ESI-MS/MS.
Giordano, Giuseppe; Di Gangi, Iole Maria; Gucciardi, Antonina; Naturale, Mauro
2012-01-01
In enzyme deficiencies of amino acid (AA) metabolism, the levels of amino acids and their derivatives in physiological fluids may serve as diagnostically significant biomarkers for one or a group of metabolic disorders. It is therefore important to monitor a wide range of free amino acids simultaneously and to quantify them. This is time-consuming with classical methods, all the more so now that many laboratories have introduced newborn screening programs, in which the semiquantitative analysis, detection, and quantification of some amino acids must be performed in a short time to reduce the rate of false positives. We have modified the stable isotope dilution HPLC-electrospray ionization (ESI)-MS/MS method previously described by Qu et al. (Anal Chem 74: 2034-2040, 2002) for a more rapid, robust, sensitive, and specific detection and quantification of underivatised amino acids. The modified method reduces the time of analysis to 10 min, with very good reproducibility of retention times and better separation of the metabolites and their isomers. The omission of the derivatization step provides some important advantages: fast and simple sample preparation and the exclusion of artefacts and interferences. The technique is highly sensitive and specific, and allows monitoring of 40 underivatized amino acids, including the key isomers, and quantification of some of them, covering many diagnostically important intermediates of metabolic pathways. We propose this HPLC-ESI-MS/MS method for underivatized amino acids as a support for newborn screening, as a secondary test using the same dried blood spots for a more accurate and specific examination in cases of suspected metabolic disease. In this way, we avoid collecting plasma from the patient, as would normally be required, reducing anxiety for the parents and further analysis costs. The same method was validated and applied also to plasma and urine samples with good reproducibility, accuracy, and precision. The fast run time, feasibility of high sample throughput, and small amount of sample required make this method very suitable for routine analysis in the clinical setting.
Bicchi, Carlo; Liberto, Erica; Cagliero, Cecilia; Cordero, Chiara; Sgorbini, Barbara; Rubiolo, Patrizia
2008-11-28
The analysis of complex real-world samples of vegetable origin requires rapid and accurate routine methods, enabling laboratories to increase sample throughput and productivity while reducing analysis costs. This study examines shortening enantioselective-GC (ES-GC) analysis time following the approaches used in fast GC. ES-GC separations are due to a weak enantiomer-CD host-guest interaction; the separation is thermodynamically driven and strongly influenced by temperature. As a consequence, fast temperature rates can interfere with enantiomeric discrimination; thus the use of short and/or narrow-bore columns is a possible approach to speeding up ES-GC analyses. The performance of ES-GC with a conventional inner diameter (I.D.) column (25 m length x 0.25 mm I.D., 0.15 microm and 0.25 microm d(f)) coated with 30% of 2,3-di-O-ethyl-6-O-tert-butyldimethylsilyl-beta-cyclodextrin in PS-086 is compared to that of a short conventional-I.D. column (5 m length x 0.25 mm I.D., 0.15 microm d(f)) and of narrow-bore columns of different lengths (1, 2, 5 and 10 m long x 0.10 mm I.D., 0.10 microm d(f)) in analysing racemate standards of pesticides and of flavour and fragrance compounds, as well as real-world samples. Short conventional-I.D. columns gave shorter analysis times and comparable or lower resolutions with the racemate standards, depending mainly on analyte volatility. Narrow-bore columns were tested under different analysis conditions; they provided shorter analysis times and resolutions comparable to those of conventional-I.D. ES columns. The narrow-bore columns offering the most effective compromise between separation efficiency and analysis time are the 5 and 2 m columns; in combination with mass spectrometry as detector, applied to lavender and bergamot essential oil analyses, these reduced analysis time by a factor of at least three while separation of chiral markers remained unaltered.
Epicenter location by analysis of interictal spikes
NASA Technical Reports Server (NTRS)
Hand, C.
2001-01-01
The MEG recording is a quick and painless process that requires no surgery. This approach has the potential to save time, reduce patient discomfort, and eliminate a painful and potentially dangerous surgical step in the treatment procedure.
Visual cluster analysis and pattern recognition template and methods
Osbourn, Gordon Cecil; Martinez, Rubel Francisco
1999-01-01
A method of clustering using a novel template to define a region of influence. Neighboring approximation methods can significantly reduce computation times. The template and method are applicable to, and improve, pattern recognition techniques.
NASA Astrophysics Data System (ADS)
Gao, Yi; Zhu, Liangjia; Norton, Isaiah; Agar, Nathalie Y. R.; Tannenbaum, Allen
2014-03-01
Desorption electrospray ionization mass spectrometry (DESI-MS) provides a highly sensitive imaging technique for differentiating normal and cancerous tissue at the molecular level. This can be very useful, especially under intra-operative conditions where the surgeon has to make crucial decisions about the tumor boundary. In such situations, the time it takes for imaging and data analysis becomes a critical factor. Therefore, in this work we utilize compressive sensing to perform sparse sampling of the tissue, which halves the scanning time. Furthermore, sparse feature selection is performed, which reduces the dimension of the data from about 10^4 to fewer than 50 features and thus significantly shortens the analysis time. This procedure also identifies biochemically important molecules for further pathological analysis. The methods are validated on brain and breast tumor data sets.
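The abstract does not name the sparse feature selection algorithm. As an illustrative stand-in only, an L1-penalized logistic regression performs the same kind of reduction, zeroing out all but a handful of m/z channels; the array names and penalty strength below are assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical inputs: X holds DESI-MS spectra (one row per pixel, one
    # column per m/z channel); y labels each pixel 0 = normal, 1 = tumor.
    def select_peaks(X, y, C=0.05):
        # The L1 penalty drives most coefficients to exactly zero.
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=C).fit(X, y)
        kept = np.flatnonzero(clf.coef_[0])  # indices of surviving channels
        return kept, clf

Tuning C trades the number of retained channels against classification accuracy; shrinking from roughly 10^4 columns to a few dozen corresponds to the reduction reported above.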
Extracellular space preservation aids the connectomic analysis of neural circuits.
Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L
2015-12-09
Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits.
P2P-Based Data System for the EAST Experiment
NASA Astrophysics Data System (ADS)
Shu, Yantai; Zhang, Liang; Zhao, Weifeng; Chen, Haiming; Luo, Jiarong
2006-06-01
A peer-to-peer (P2P)-based EAST Data System is being designed to provide data acquisition and analysis support for the EAST superconducting tokamak. Instead of transferring data to the servers, all collected data are stored in the data acquisition subsystems locally and the PC clients can access the raw data directly using the P2P architecture. Both online and offline systems are based on Napster-like P2P architecture. This allows the peer (PC) to act both as a client and as a server. A simulation-based method and a steady-state operational analysis technique are used for performance evaluation. These analyses show that the P2P technique can significantly reduce the completion time of raw data display and real-time processing on the online system, and raise the workload capacity and reduce the delay on the offline system.
Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements
NASA Astrophysics Data System (ADS)
Papa, A. R.; Akel, A. F.
2009-05-01
Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduced a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to other traditional methods (Fourier, for example), while allowing an almost continuous accompaniment of both amplitude and frequency of signals as time goes by. This advantage brings some possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible trends for future works are advanced.
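Neither the wavelet implementation nor its parameters are given in this summary. A minimal FFT-based Morlet transform in the Torrence-and-Compo style, with an assumed centre frequency w0 and a caller-supplied frequency grid, is sketched below.

    import numpy as np

    def morlet_cwt(x, dt, freqs, w0=6.0):
        # Complex Morlet CWT; rows are frequencies, columns are time.
        x = np.asarray(x, float)
        n = len(x)
        Xf = np.fft.fft(x)
        omega = 2 * np.pi * np.fft.fftfreq(n, dt)
        out = np.empty((len(freqs), n), complex)
        for i, f in enumerate(freqs):
            s = w0 / (2 * np.pi * f)  # scale matching frequency f
            # Fourier transform of the analytic Morlet wavelet at scale s
            psi_hat = (np.pi ** -0.25 * np.sqrt(2 * np.pi * s / dt)
                       * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0))
            out[i] = np.fft.ifft(Xf * psi_hat)
        return out

The magnitude of the coefficients tracks amplitude and their phase tracks frequency, which is precisely the continuous accompaniment of both quantities described above.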
Analysis of a Real-Time Separation Assurance System with Integrated Time-in-Trail Spacing
NASA Technical Reports Server (NTRS)
Aweiss, Arwa S.; Farrahi, Amir H.; Lauderdale, Todd A.; Thipphavong, Adam S.; Lee, Chu H.
2010-01-01
This paper describes the implementation and analysis of an integrated ground-based separation assurance and time-based metering prototype system into the Center-TRACON Automation System. The integration of this new capability accommodates constraints in four-dimensions: position (x-y), altitude, and meter-fix crossing time. Experiments were conducted to evaluate the performance of the integrated system and its ability to handle traffic levels up to twice that of today. Results suggest that the integrated system reduces the number and magnitude of time-in-trail spacing violations. This benefit was achieved without adversely affecting the resolution success rate of the system. Also, the data suggest that the integrated system is relatively insensitive to an increase in traffic of twice the current levels.
Defect-Repairable Latent Feature Extraction of Driving Behavior via a Deep Sparse Autoencoder
Taniguchi, Tadahiro; Takenaka, Kazuhito; Bando, Takashi
2018-01-01
Data representing driving behavior, as measured by various sensors installed in a vehicle, are collected as multi-dimensional sensor time-series data. These data often include redundant information, e.g., both the speed of wheels and the engine speed represent the velocity of the vehicle. Redundant information can be expected to complicate the data analysis, e.g., more factors need to be analyzed; even varying the levels of redundancy can influence the results of the analysis. We assume that the measured multi-dimensional sensor time-series data of driving behavior are generated from low-dimensional data shared by the many types of one-dimensional data of which multi-dimensional time-series data are composed. Meanwhile, sensor time-series data may be defective because of sensor failure. Therefore, another important function is to reduce the negative effect of defective data when extracting low-dimensional time-series data. This study proposes a defect-repairable feature extraction method based on a deep sparse autoencoder (DSAE) to extract low-dimensional time-series data. In the experiments, we show that DSAE provides high-performance latent feature extraction for driving behavior, even for defective sensor time-series data. In addition, we show that the negative effect of defects on the driving behavior segmentation task could be reduced using the latent features extracted by DSAE. PMID:29462931
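The paper's exact architecture is not reproduced here. A toy denoising variant in PyTorch conveys the idea: an L1 penalty keeps the latent code sparse, and randomly masked inputs stand in for defective sensors. Layer sizes, the penalty weight, and the defect rate are all assumptions.

    import torch
    import torch.nn as nn

    class SparseAE(nn.Module):
        def __init__(self, d_in, d_latent=3):
            super().__init__()
            self.enc = nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(),
                                     nn.Linear(64, d_latent))
            self.dec = nn.Sequential(nn.Linear(d_latent, 64), nn.ReLU(),
                                     nn.Linear(64, d_in))

        def forward(self, x):
            z = self.enc(x)
            return self.dec(z), z

    def train_step(model, opt, x, sparsity=1e-3, p_defect=0.1):
        mask = (torch.rand_like(x) > p_defect).float()  # simulated dropouts
        xhat, z = model(x * mask)
        # Reconstruct the clean signal from the defective input, while
        # penalizing non-sparse latent codes.
        loss = ((xhat - x) ** 2).mean() + sparsity * z.abs().mean()
        opt.zero_grad(); loss.backward(); opt.step()
        return loss.item()

Training against the unmasked target is what gives the defect-repair property: the low-dimensional code must explain the redundant channels jointly, so missing ones can be reconstructed.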
Efficient multiscale magnetic-domain analysis of iron-core material under mechanical stress
NASA Astrophysics Data System (ADS)
Nishikubo, Atsushi; Ito, Shumpei; Mifune, Takeshi; Matsuo, Tetsuji; Kaido, Chikara; Takahashi, Yasuhito; Fujiwara, Koji
2018-05-01
For an efficient analysis of magnetization, a partial-implicit solution method is improved using an assembled domain structure model with six-domain mesoscopic particles exhibiting pinning-type hysteresis. The quantitative analysis of non-oriented silicon steel succeeds in predicting the stress dependence of hysteresis loss with computation times greatly reduced by using the improved partial-implicit method. The effect of cell division along the thickness direction is also evaluated.
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis
2012-01-01
industries through hide damage, reductions in animal weight gains, or reduced production of animal products such as milk or eggs (Reviewed by Lehane... chiopterus (Meigen 1830) was abundant on sheep in southern England, although relatively uncommon in nearby light traps. Furthermore, attraction to or... Cross correlation maps: a tool for visualizing and modeling time lagged associations. Vector Borne Zoonotic Dis. 5: 267-275. Duehl, A. J., L. W
An Analysis Of Personalized Learning Systems For Navy Training And Education Settings
2016-12-01
of dedicated “schoolhouse” training and education among the services account for approximately $8.7 billion per year (Department of Defense [DOD... calls it customized learning) opportunities for the Air Force with the sole intent of reducing time-to-train, and thereby significantly reducing... technology to develop and distribute personalized, cost-effective, always available, high-quality training and education to service members and DOD
Real-Time Visualization of an HPF-based CFD Simulation
NASA Technical Reports Server (NTRS)
Kremenetsky, Mark; Vaziri, Arsi; Haimes, Robert; Chancellor, Marisa K. (Technical Monitor)
1996-01-01
Current time-dependent CFD simulations produce very large multi-dimensional data sets at each time step. The visual analysis of computational results is traditionally performed by post-processing the static data on graphics workstations. We present results from an alternate approach in which we analyze the simulation data in situ on each processing node at the time of simulation. The locally analyzed results, usually more economical and in a reduced form, are then combined and sent back for visualization on a graphics workstation.
Hollinghurst, Sandra; Emmett, Clare; Peters, Tim J; Watson, Helen; Fahey, Tom; Murphy, Deirdre J; Montgomery, Alan
2010-01-01
Maternal preferences should be considered in decisions about mode of delivery following a previous cesarean, but risks and benefits are unclear. Decision aids can help decision making, although few studies have assessed costs in conjunction with effectiveness. Economic evaluation of 2 decision aids for women with 1 previous cesarean. Cost-consequences analysis. Data sources were self-reported resource use and outcome and published national unit costs. The target population was women with 1 previous cesarean. The time horizon was 37 weeks' gestation and 6 weeks postnatal. The perspective was health care delivery system. The interventions were usual care, usual care plus an information program, and usual care plus a decision analysis program. The outcome measures were costs to the National Health Service (NHS) in the United Kingdom (UK), score on the Decisional Conflict Scale, and mode of delivery. RESULTS OF MAIN ANALYSIS: Cost of delivery represented 84% of the total cost; mode of delivery was the most important determinant of cost differences across the groups. Mean (SD) total cost per mother and baby: 2033 (677) for usual care, 2069 (738) for information program, and 2019 (741) for decision analysis program. Decision aids reduced decisional conflict. Women using the decision analysis program had fewest cesarean deliveries. Applying a cost premium to emergency cesareans over electives had little effect on group comparisons. Conclusions were unaffected. LIMITATIONS: Disparity in timing of outcomes and costs, data completeness, and quality. CONCLUSIONS: Decision aids can reduce decisional conflict in women with a previous cesarean section when deciding on mode of delivery. The information program could be implemented at no extra cost to the NHS. The decision analysis program might reduce the rate of cesarean sections without any increase in costs.
NASA Astrophysics Data System (ADS)
Li, Xue; Hou, Guangyue; Xing, Junpeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying
2014-12-01
In the present work, direct analysis in real time ionization combined with multi-stage tandem mass spectrometry (DART-MS(n)) was used to investigate the metabolic profile of aconite alkaloids in rat intestinal bacteria. A total of 36 metabolites of three aconite alkaloids were identified by using DART-MS(n), and the feasibility of quantitative analysis of these analytes was examined. Key parameters of the DART ion source, such as helium gas temperature and pressure, the source-to-MS distance, and the speed of the autosampler, were optimized to achieve high sensitivity, enhance reproducibility, and reduce the occurrence of fragmentation. The instrument analysis time for one sample can be less than 10 s with this method. Compared with ESI-MS and UPLC-MS, DART-MS is more efficient for directly detecting metabolic samples, and has the advantage of being a simple, high-speed, high-throughput method.
Target Identification Using Harmonic Wavelet Based ISAR Imaging
NASA Astrophysics Data System (ADS)
Shreyamsha Kumar, B. K.; Prabhakar, B.; Suryanarayana, K.; Thilagavathi, V.; Rajagopal, R.
2006-12-01
A new approach has been proposed to reduce the computations involved in ISAR imaging, using a harmonic wavelet (HW) based time-frequency representation (TFR). Since the HW-based TFR falls into the category of nonparametric time-frequency (T-F) analysis tools, it is computationally efficient compared to parametric T-F analysis tools such as the adaptive joint time-frequency transform (AJTFT), the adaptive wavelet transform (AWT), and the evolutionary AWT (EAWT). Further, the performance of the proposed method of ISAR imaging is compared with ISAR imaging by other nonparametric T-F analysis tools such as the short-time Fourier transform (STFT) and the Choi-Williams distribution (CWD). In ISAR imaging, the use of the HW-based TFR provides similar or better results with a significant (92%) computational advantage over the CWD. The ISAR images thus obtained are identified using a neural network-based classification scheme with a feature set invariant to translation, rotation, and scaling.
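The computational advantage comes from the fact that Newland's harmonic wavelet transform is computed entirely with FFTs. A bare-bones octave-band version, sketched under the assumption of a real input signal whose length is a power of two, looks like this.

    import numpy as np

    def harmonic_wavelet_tfr(x):
        # Octave-band harmonic wavelet time-frequency map via the FFT.
        n = len(x)
        Xf = np.fft.fft(x)
        rows = []
        j = 1
        while 2 ** (j + 1) <= n // 2:
            band = np.zeros(n, complex)
            band[2 ** j:2 ** (j + 1)] = Xf[2 ** j:2 ** (j + 1)]
            w = np.fft.ifft(band)        # complex envelope in this octave
            rows.append(np.abs(w) ** 2)  # instantaneous power vs. time
            j += 1
        return np.array(rows)            # shape: (n_octaves, n_samples)

Each octave costs one inverse FFT, so the whole map is O(N log N) per level, with no parameter estimation step as in AJTFT, AWT, or EAWT.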
Ground support system methodology and architecture
NASA Technical Reports Server (NTRS)
Schoen, P. D.
1991-01-01
A synergistic approach to systems test and support is explored. A building-block architecture provides transportability of data, procedures, and knowledge. The synergistic approach also lowers cost and risk over the life cycle of a program. The determination of design errors at the earliest phase reduces the cost of vehicle ownership. A distributed, scalable architecture is based on industry standards, maximizing transparency and maintainability. An autonomous control structure provides for distributed and segmented systems. Control of interfaces maximizes compatibility and reuse, reducing long-term program cost. An intelligent data management architecture also reduces analysis time and cost through automation.
Time-series analysis of sleep wake stage of rat EEG using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Shinba, Toshikazu; Mugishima, Go; Haraguchi, Hikaru; Inoue, Masayoshi
2008-05-01
We performed electroencephalography (EEG) for six male Wistar rats to clarify temporal behaviors at different levels of consciousness. Levels were identified both by conventional sleep analysis methods and by our novel entropy method. In our method, time-dependent pattern entropy is introduced, by which EEG is reduced to binary symbolic dynamics and the pattern of symbols in a sliding temporal window is considered. A high correlation was obtained between level of consciousness as measured by the conventional method and mean entropy in our entropy method. Mean entropy was maximal while awake (stage W) and decreased as sleep deepened. These results suggest that time-dependent pattern entropy may offer a promising method for future sleep research.
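The abstract leaves the symbolization threshold, pattern length, and window size unspecified; the sketch below binarizes each window around its own mean and takes the Shannon entropy of m-bit patterns, with all parameter values assumed.

    import numpy as np
    from collections import Counter

    def pattern_entropy(x, m=4, win=512, step=64):
        # Time-dependent pattern entropy of a 1-D signal (e.g., one EEG channel).
        ent = []
        for start in range(0, len(x) - win + 1, step):
            seg = x[start:start + win]
            bits = (seg > seg.mean()).astype(int)  # binary symbolic dynamics
            pats = Counter(tuple(bits[i:i + m]) for i in range(win - m + 1))
            p = np.array(list(pats.values()), float)
            p /= p.sum()
            ent.append(-(p * np.log2(p)).sum())  # Shannon entropy of patterns
        return np.array(ent)

On this construction the entropy is bounded by m bits; the finding above corresponds to values near the top of that range during stage W and progressively lower values as sleep deepens.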
Brunetti, Natale Daniele; De Gennaro, Luisa; Correale, Michele; Santoro, Francesco; Caldarola, Pasquale; Gaglione, Antonio; Di Biase, Matteo
2017-04-01
A shorter time to treatment has been shown to be associated with lower mortality rates in acute myocardial infarction (AMI). Several strategies have been adopted with the aim of reducing any delay in the diagnosis of AMI: pre-hospital triage with telemedicine is one such strategy. We therefore aimed to measure the real effect of pre-hospital triage with telemedicine in cases of AMI in a meta-analysis study. We performed a meta-analysis of non-randomized studies with the aim of quantifying the exact reduction of time to treatment achieved by pre-hospital triage with telemedicine. Data were pooled and compared by relative time reduction and 95% CIs. A meta-regression analysis was performed in order to find possible predictors of shorter time to treatment. Eleven studies were selected and finally evaluated in the study. The overall relative reduction of time to treatment with pre-hospital triage and telemedicine was -38/-40% (p<0.001). Absolute time reduction was significantly correlated with time to treatment in the control groups (p<0.001), while relative time reduction was independent. A non-significant trend toward smaller relative time reductions was observed over the years. Pre-hospital triage with telemedicine is associated with a near-halved time to treatment in AMI. The benefit is larger in terms of absolute time-to-treatment reduction in populations with larger delays to treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
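The pooling model is not stated in this summary. As a generic illustration of how per-study effects and variances are combined into a pooled estimate with a 95% CI, a standard DerSimonian-Laird random-effects pool (all inputs hypothetical) runs as follows.

    import numpy as np

    def random_effects_pool(effects, variances):
        # DerSimonian-Laird pooled effect with a 95% confidence interval.
        y = np.asarray(effects, float)
        v = np.asarray(variances, float)
        w = 1.0 / v
        mu_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - mu_fixed) ** 2)      # Cochran heterogeneity statistic
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(y) - 1)) / c)  # between-study variance
        w_re = 1.0 / (v + tau2)
        mu = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return mu, (mu - 1.96 * se, mu + 1.96 * se)

Feeding in per-study relative time reductions and their variances would reproduce the kind of pooled percentage reduction reported above.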
NASA Astrophysics Data System (ADS)
Schwuttke, Ursula M.; Veregge, John R.; Angelino, Robert; Childs, Cynthia L.
1990-10-01
The Monitor/Analyzer of Real-time Voyager Engineering Link (MARVEL) is described. It is the first automation tool to be used in an online mode for telemetry monitoring and analysis in mission operations. MARVEL combines standard automation techniques with embedded knowledge base systems to simultaneously provide real time monitoring of data from subsystems, near real time analysis of anomaly conditions, and both real time and non-real time user interface functions. MARVEL is currently capable of monitoring the Computer Command Subsystem (CCS), Flight Data Subsystem (FDS), and Attitude and Articulation Control Subsystem (AACS) for both Voyager spacecraft, simultaneously, on a single workstation. The goal of MARVEL is to provide cost savings and productivity enhancement in mission operations and to reduce the need for constant availability of subsystem expertise.
Robust passivity analysis for discrete-time recurrent neural networks with mixed delays
NASA Astrophysics Data System (ADS)
Huang, Chuan-Kuei; Shu, Yu-Jeng; Chang, Koan-Yuh; Shou, Ho-Nien; Lu, Chien-Yu
2015-02-01
This article considers the robust passivity analysis for a class of discrete-time recurrent neural networks (DRNNs) with mixed time-delays and uncertain parameters. The mixed time-delays consist of both discrete time-varying and distributed time-delays in a given range, and the uncertain parameters are norm-bounded. The activation functions are assumed to be globally Lipschitz continuous. Based on a new bounding technique and an appropriate type of Lyapunov functional, a sufficient condition is investigated to guarantee the existence of the desired robust passivity condition for the DRNNs, which can be derived in terms of a family of linear matrix inequalities (LMIs). Some free-weighting matrices are introduced to reduce the conservatism of the criterion by using the bounding technique. A numerical example is given to illustrate the effectiveness and applicability of the approach.
Reducing the Time and Cost of Testing Engines
NASA Technical Reports Server (NTRS)
2004-01-01
Producing a new aircraft engine currently costs approximately $1 billion, with 3 years of development time for a commercial engine and 10 years for a military engine. The high development time and cost make it extremely difficult to transition advanced technologies for cleaner, quieter, and more efficient new engines. To reduce this time and cost, NASA created a vision for the future in which designers would use high-fidelity computer simulations early in the design process in order to resolve critical design issues before building the expensive engine hardware. To accomplish this vision, NASA's Glenn Research Center initiated a collaborative effort with the aerospace industry and academia to develop its Numerical Propulsion System Simulation (NPSS), an advanced engineering environment for the analysis and design of aerospace propulsion systems and components. Partners estimate that using NPSS has the potential to dramatically reduce the time, effort, and expense necessary to design and test jet engines by generating sophisticated computer simulations of an aerospace object or system. These simulations permit an engineer to test various design options without having to conduct costly and time-consuming real-life tests. By accelerating and streamlining the engine system design analysis and test phases, NPSS facilitates bringing the final product to market faster. NASA's NPSS Version (V)1.X effort was a task within the Agency's Computational Aerospace Sciences project of the High Performance Computing and Communication program, which had a mission to accelerate the availability of high-performance computing hardware and software to the U.S. aerospace community for use in design processes. The technology brings value back to NASA by improving methods of analyzing and testing space transportation components.
Milne, Tony G E; Vather, Ryash; O'Grady, Gregory; Miquel, Jordi; Biondo, Sebastiano; Bissett, Ian
2018-03-06
Gastrografin has been suggested as a rescue therapy for prolonged post-operative ileus (PPOI), but trial data have been inconclusive. This study aimed to determine the benefit of gastrografin use in patients with PPOI by pooling the results of two recent randomized controlled trials assessing the efficacy of gastrografin compared to placebo given at the time of PPOI diagnosis. Anonymized, individual patient data from patients undergoing elective bowel resection for any indication were included; stoma closure was excluded. The primary outcome was duration of PPOI. Secondary outcomes were time to tolerate oral diet, passage of flatus/stool, requirement and duration of nasogastric tube, length of post-operative stay and rate of post-operative complications. Individual patient data were pooled for analysis (53 gastrografin, 55 placebo). Gastrografin trended towards a reduction in PPOI duration compared to placebo (median 96 h, interquartile range (IQR) 78 h, versus median 120 h, IQR 84 h); however, this result was non-significant (P = 0.11). In addition, no significant difference was detected between the two groups for time to passage of flatus/stool (P = 0.36) and overall length of stay (P = 0.35). Gastrografin conferred a significantly faster time to tolerate an oral diet compared to placebo (median 84 h versus median 107 h, P = 0.04). There was no difference in post-operative complications between the two interventions (P > 0.05). Gastrografin did not significantly reduce PPOI duration or length of stay after abdominal surgery, but did reduce the time to tolerate a solid diet. Further studies are required to clarify the role of gastrografin in PPOI. © 2018 Royal Australasian College of Surgeons.
Time-course human urine proteomics in space-flight simulation experiments.
Binder, Hans; Wirth, Henry; Arakelyan, Arsen; Lembcke, Kathrin; Tiys, Evgeny S; Ivanisenko, Vladimir A; Kolchanov, Nikolay A; Kononikhin, Alexey; Popov, Igor; Nikolaev, Evgeny N; Pastushkova, Lyudmila; Larina, Irina M
2014-01-01
Long-term space travel simulation experiments have enabled the discovery of different aspects of human metabolism, such as the complexity of NaCl salt balance. Detailed proteomics data were collected during the Mars105 isolation experiment, enabling a deeper insight into the molecular processes involved. We studied the abundance of about two thousand proteins extracted from urine samples of six volunteers collected weekly during a 105-day isolation experiment under controlled dietary conditions, including progressive reduction of salt consumption. Machine learning using self-organizing maps (SOM) in combination with different analysis tools was applied to describe the time trajectories of protein abundance in urine. The method enables a personalized and intuitive view of the physiological state of the volunteers. The abundance of more than half of the proteins measured clearly changes in the course of the experiment. The trajectory splits roughly into three time ranges: an early (weeks 1-6), an intermediate (weeks 7-11), and a late one (weeks 12-15). Regulatory modes associated with distinct biological processes were identified using previous knowledge by applying enrichment and pathway flow analysis. Early protein activation modes can be related to immune response and inflammatory processes, activation at intermediate times to developmental and proliferative processes, and late activations to stress and responses to chemicals. The protein abundance profiles support previous results about alternative mechanisms of salt storage in an osmotically inactive form. We hypothesize that reduced NaCl consumption of about 6 g/day will presumably reduce or even prevent the activation of the inflammatory processes observed in the early time range of isolation. SOM machine learning in combination with analysis methods of class discovery and functional annotation enables the straightforward analysis of complex proteomics data sets generated by means of mass spectrometry.
Analysis of the Climate Change Technology Initiative
1999-01-01
Analysis of the impact of specific policies on the reduction of carbon emissions and their impact on U.S. energy use and prices in the 2008-2012 time frame. Also, analyzes the impact of the President's Climate Change Technology Initiative, as defined for the 2000 budget, on reducing carbon emissions from the levels forecast in the Annual Energy Outlook 1999 reference case.
Error Pattern Analysis Applied to Technical Writing: An Editor's Guide for Writers.
ERIC Educational Resources Information Center
Monagle, E. Brette
The use of error pattern analysis can reduce the time and money spent on editing and correcting manuscripts. What is required is noting, classifying, and keeping a frequency count of errors. First an editor should take a typical page of writing and circle each error. After the editor has done a sufficiently large number of pages to identify an…
Effect of abdominopelvic abscess drain size on drainage time and probability of occlusion
Rotman, Jessica A.; Getrajdman, George I.; Maybody, Majid; Erinjeri, Joseph P.; Yarmohammadi, Hooman; Sofocleous, Constantinos T.; Solomon, Stephen B.; Boas, F. Edward
2016-01-01
Background The purpose of this study is to determine whether larger abdominopelvic abscess drains reduce the time required for abscess resolution, or the probability of tube occlusion. Methods 144 consecutive patients who underwent abscess drainage at a single institution were reviewed retrospectively. Results Larger initial drain size did not reduce drainage time, drain occlusion, or drain exchanges (p>0.05). Subgroup analysis did not find any type of collection that benefitted from larger drains. A multivariate model predicting drainage time showed that large collections (>200 ml) required 16 days longer drainage time than small collections (<50 ml). Collections with a fistula to bowel required 17 days longer drainage time than collections without a fistula. Initial drain size and the viscosity of the fluid in the collection had no significant effect on drainage time in the multivariate model. Conclusions 8 F drains are adequate for initial drainage of most serous and serosanguineous collections. 10 F drains are adequate for initial drainage of most purulent or bloody collections. PMID:27634422
Kim, Bo-Bae; Kim, Minji; Park, Yun-Hee; Ko, Youngkyung; Park, Jun-Beom
2017-06-01
Objective Next-generation sequencing was performed to evaluate the effects of short-term application of dexamethasone on human gingiva-derived mesenchymal stem cells. Methods Human gingiva-derived stem cells were treated with a final concentration of 10^-7 M dexamethasone or the same concentration of vehicle control. This was followed by mRNA sequencing and data analysis, gene ontology and pathway analysis, quantitative real-time polymerase chain reaction of mRNA, and western blot analysis of RUNX2 and β-catenin. Results In total, 26,364 mRNAs were differentially expressed. Comparison of the results of dexamethasone versus control at 2 hours revealed that 7 mRNAs were upregulated and 25 mRNAs were downregulated. The application of dexamethasone reduced the expression of RUNX2 and β-catenin in human gingiva-derived mesenchymal stem cells. Conclusion The effects of dexamethasone on stem cells were evaluated with mRNA sequencing, and validation of the expression was performed with quantitative real-time polymerase chain reaction and western blot analysis. The results of this study can provide new insights into the role of mRNA sequencing in maxillofacial areas.
The NASA Modern Era Reanalysis for Research and Applications, Version-2 (MERRA-2)
NASA Astrophysics Data System (ADS)
Gelaro, R.; McCarty, W.; Molod, A.; Suarez, M.; Takacs, L.; Todling, R.
2014-12-01
The NASA Modern Era Reanalysis for Research Applications Version-2 (MERRA-2) is a reanalysis for the satellite era using an updated version of the Goddard Earth Observing System Data Assimilation System Version-5 (GEOS-5) produced by the Global Modeling and Assimilation Office (GMAO). MERRA-2 will assimilate meteorological and aerosol observations not available to MERRA and includes improvements to the GEOS-5 model and analysis scheme so as to provide an ongoing climate analysis beyond MERRA's terminus. MERRA-2 will also serve as a development milestone for a future GMAO coupled Earth system analysis. Production of MERRA-2 began in June 2014 in four processing streams, with convergence to a single near-real time climate analysis expected by early 2015. This talk provides an overview of the MERRA-2 system developments and key science results. For example, compared with MERRA, MERRA-2 exhibits a well-balanced relationship between global precipitation and evaporation, with significantly reduced sensitivity to changes in the global observing system through time. Other notable improvements include reduced biases in the tropical middle- and upper-tropospheric wind and near-surface temperature over continents.
Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander
2017-01-01
Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.
Spatial-temporal discriminant analysis for ERP-based brain-computer interface.
Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej
2013-03-01
Linear discriminant analysis (LDA) has been widely adopted to classify event-related potentials (ERPs) in brain-computer interfaces (BCIs). Good classification performance of an ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time, which may depress the system's practicability and cause users' resistance to the BCI system. In this study, we introduce a spatial-temporal discriminant analysis (STDA) for ERP classification. As a multiway extension of LDA, the STDA method tries to maximize the discriminant information between target and nontarget classes by finding two projection matrices, from the spatial and temporal dimensions collaboratively, which effectively reduces the feature dimensionality in the discriminant analysis and hence significantly decreases the number of required training samples. The proposed STDA method was validated with dataset II of BCI Competition III and a dataset recorded in our own experiments, and compared to state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for validation. The superior classification performance when using few training samples shows that STDA is effective in reducing the system calibration time and improving classification accuracy, thereby enhancing the practicability of ERP-based BCIs.
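A rough sketch of the collaborative two-mode optimization follows; the published STDA includes refinements not reproduced here, and the regularization and component counts are assumptions. The idea is to alternate ordinary Fisher discriminant steps over the spatial and temporal modes of the trial tensor.

    import numpy as np
    from scipy.linalg import eigh

    def mode_lda(Zs, y, k, eps=1e-6):
        # Fisher projection over the first axis of each trial matrix in Zs.
        Zs = np.asarray(Zs)                    # (n_trials, d, r)
        mean_all = Zs.mean(axis=0)
        Sb = np.zeros((Zs.shape[1], Zs.shape[1]))
        Sw = np.zeros_like(Sb)
        for c in np.unique(y):
            Zc = Zs[y == c]
            mc = Zc.mean(axis=0)
            D = mc - mean_all
            Sb += len(Zc) * D @ D.T            # between-class scatter
            for Z in Zc:
                E = Z - mc
                Sw += E @ E.T                  # within-class scatter
        Sw += eps * np.eye(len(Sw))            # regularize singular scatter
        vals, vecs = eigh(Sb, Sw)              # generalized eigenproblem
        return vecs[:, ::-1][:, :k]            # top-k discriminant directions

    def stda(X, y, ks=2, kt=2, n_iter=5):
        # X: (n_trials, channels, times); alternates spatial/temporal steps.
        n, C, T = X.shape
        Wt = np.eye(T)[:, :kt]
        for _ in range(n_iter):
            Ws = mode_lda(X @ Wt, y, ks)                             # spatial
            Wt = mode_lda(np.transpose(Ws.T @ X, (0, 2, 1)), y, kt)  # temporal
        feats = (Ws.T @ X @ Wt).reshape(n, -1)  # ks*kt features per trial
        return Ws, Wt, feats

Because each step estimates a scatter matrix of size channels x channels or times x times, rather than one over the full channels*times vectorization, far fewer training trials are needed, which is the calibration-time saving described above.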
ERIC Educational Resources Information Center
Varis, Olli; And Others
1993-01-01
Presents one approach to handling the trade-off between reducing uncertainty in environmental assessment and management and additional expenses. Uses the approach in the evaluation of three alternatives for a real time river water quality forecasting system. Analysis of risk attitudes, costs and uncertainty indicated the levels of socioeconomic…
Downing, Katherine L; Hnatiuk, Jill A; Hinkley, Trina; Salmon, Jo; Hesketh, Kylie D
2018-01-01
Aim or objective To evaluate the effectiveness of behavioural interventions that report sedentary behaviour outcomes during early childhood. Design Systematic review and meta-analysis. Data sources Academic Search Complete, CINAHL Complete, Global Health, MEDLINE Complete, PsycINFO, SPORTDiscus with Full Text and EMBASE electronic databases were searched in March 2016. Eligibility criteria for selecting studies Inclusion criteria were: (1) published in a peer-reviewed English language journal; (2) sedentary behaviour outcomes reported; (3) randomised controlled trial (RCT) study design; and (4) participants were children with a mean age of ≤5.9 years and not yet attending primary/elementary school at postintervention. Results 31 studies were included in the systematic review and 17 studies in the meta-analysis. The overall mean difference in screen time outcomes between groups was −17.12 (95% CI −28.82 to −5.42) min/day with a significant overall intervention effect (Z=2.87, p=0.004). The overall mean difference in sedentary time between groups was −18.91 (95% CI −33.31 to −4.51) min/day with a significant overall intervention effect (Z=2.57, p=0.01). Subgroup analyses suggest that for screen time, interventions of ≥6 months duration and those conducted in a community-based setting are most effective. For sedentary time, interventions targeting physical activity (and reporting changes in sedentary time) are more effective than those directly targeting sedentary time. Summary/conclusions Despite heterogeneity in study methods and results, overall interventions to reduce sedentary behaviour in early childhood show significant reductions, suggesting that this may be an opportune time to intervene. Trial registration number CRD42015017090. PMID:29449219
Zhu, Fengxiang; Pan, Zaifa; Hong, Chunlai; Wang, Weiping; Chen, Xiaoyang; Xue, Zhiyong; Yao, Yanlai
2016-12-01
Changes in volatile organic compound contents in compost samples during pig manure composting were studied using a headspace solid-phase micro-extraction method (HS-SPME) followed by gas chromatography with mass spectrometric detection (GC/MS). Parameters affecting the SPME procedure were optimized as follows: the coating was carbon molecular sieve/polydimethylsiloxane (CAR/PDMS) fiber, the temperature was 60°C and the time was 30 min. Under these conditions, 87 compounds were identified from 17 composting samples. Most of the volatile components could only be detected before day 22. However, benzenes, alkanes and alkenes increased and eventually stabilized after day 22. Phenol and acid substances, which are important factors for compost quality, were almost undetectable on day 39 in natural compost (NC) samples and on day 13 in maggot-treated compost (MC) samples. Our results indicate that the approach can be effectively used to determine composting times by analysis of volatile substances in compost samples. An appropriate composting time not only ensures the quality of compost and reduces the loss of composting material but also reduces the generation of hazardous substances. The appropriate composting times for MC and NC were approximately 22 days and 40 days, respectively, during the summer in Zhejiang. Copyright © 2016 Elsevier Ltd. All rights reserved.
Liu, Yi-Hua; Dong, Guang-Tong; Ye, Yang; Zheng, Jia-Bin; Zhang, Ying; Lin, Hong-Sheng; Wang, Xue-Qian
2017-01-01
The aim of this study was to evaluate the effects of acupuncture therapy in reducing the duration of postoperative ileus (POI) and enhancing bowel function in cancer patients. A systematic search of six electronic databases for studies published from inception until January 2017 was carried out. Randomized controlled trials (RCTs) involving the use of acupuncture and acupressure for POI and bowel function in cancer patients were identified. Outcomes were extracted from each study and pooled to determine the risk ratio and standardized mean difference. 10 RCTs involving 776 cancer patients were included. Compared with control groups (no acupuncture, sham acupuncture, and other active therapies), acupuncture was associated with shorter time to first flatus and time to first defecation. A subgroup analysis revealed that manual acupuncture was more effective on the time to first flatus and the time to first defecation; electroacupuncture was better at reducing the length of hospital stay. Compared with control groups (sham or no acupressure), acupressure was associated with shorter time to first flatus. However, the GRADE approach indicated a low quality of evidence. Acupuncture and acupressure showed large effect sizes for enhancing bowel function in cancer patients after surgery, but the quality of the included trials was poor or inferior. Further well-powered evidence is needed.
Accelerating next generation sequencing data analysis with system level optimizations.
Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid
2017-08-22
Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features in modern computing architectures. To get the best execution time and utilize these hardware features, it is necessary to tune the system-level parameters before running the application. We studied GATK HaplotypeCaller, a component of common NGS workflows that consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked, and the execution time of HaplotypeCaller was optimized via various system-level parameters, which included: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing, (ii) architecture-specific tuning in the PairHMM library for vectorization, (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer, and (iv) switching the CPU frequency governor from the default 'on-demand' mode to 'performance' mode to accelerate the Java multi-threads. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.
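The exact flags used in the study are not listed in this summary. As a hedged illustration of items (i) and (iv) for GATK 3.x, whose HaplotypeCaller is invoked through the standard -T interface, a wrapper might look like this; the heap size, thread count, and file paths are placeholders.

    import subprocess

    # Parallel garbage collection plus a large heap approximates the
    # in-memory setup described above.
    cmd = [
        "java",
        "-XX:+UseParallelGC",        # tuned parallel garbage collection
        "-XX:ParallelGCThreads=8",   # placeholder thread count
        "-Xmx32g",                   # placeholder heap size
        "-jar", "GenomeAnalysisTK.jar",
        "-T", "HaplotypeCaller",
        "-R", "reference.fasta",
        "-I", "sample.bam",
        "-o", "variants.vcf",
    ]
    subprocess.run(cmd, check=True)

The 'performance' frequency governor of item (iv) is set at the operating-system level (for example with the cpupower utility) rather than per process, so it does not appear in the command line above.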
Hyperspectral image analysis for plant stress detection
USDA-ARS?s Scientific Manuscript database
Abiotic and disease-induced stress significantly reduces plant productivity. Automated on-the-go mapping of plant stress allows timely intervention and mitigating of the problem before critical thresholds are exceeded, thereby, maximizing productivity. A hyperspectral camera analyzed the spectral ...
2012-06-08
assassinated the fact-finding mission reported the charges of Buddhist... At nearly the same time, the Viet Cong increased kidnappings and assassinations to reduce government control and political instability. In fact
Diazo techniques for remote sensor data analysis
NASA Technical Reports Server (NTRS)
Mount, S.; Whitebay, L. E.
1979-01-01
Cost and time to extract land use maps, natural-resource surveys, and other data from aerial and satellite photographs are reduced by diazo processing. Process can be controlled to enhance features such as vegetation, land boundaries, and bodies of water.
Fast time-resolved electrostatic force microscopy: Achieving sub-cycle time resolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karatay, Durmus U.; Harrison, Jeffrey S.; Glaz, Micah S.
The ability to measure microsecond- and nanosecond-scale local dynamics below the diffraction limit with widely available atomic force microscopy hardware would enable new scientific studies in fields ranging from biology to semiconductor physics. However, commercially available scanning-probe instruments typically offer the ability to measure dynamics only on time scales of milliseconds to seconds. Here, we describe in detail the implementation of fast time-resolved electrostatic force microscopy using an oscillating cantilever as a means to measure fast local dynamics following a perturbation to a sample. We show how the phase of the oscillating cantilever relative to the perturbation event is critical to achieving reliable sub-cycle time resolution. We explore how noise affects the achievable time resolution and present empirical guidelines for reducing noise and optimizing experimental parameters. Specifically, we show that reducing the noise on the cantilever by using photothermal excitation instead of piezoacoustic excitation further improves time resolution. We demonstrate the discrimination of signal rise times with time constants as fast as 10 ns, and simultaneous data acquisition and analysis for dramatically improved image acquisition times.
Oral contraception following abortion
Che, Yan; Liu, Xiaoting; Zhang, Bin; Cheng, Linan
2016-01-01
Abstract Oral contraceptives (OCs) following induced abortion offer a reliable method to avoid repeated abortion. However, limited data exist supporting the effective use of OCs postabortion. In the present systematic review and meta-analysis, we report that immediate administration of OCs or combined OCs postabortion may reduce vaginal bleeding time and amount, shorten the menstruation recovery period, increase endometrial thickness 2 to 3 weeks after abortion, and reduce the risk of complications and unintended pregnancies. A total of 8 major authorized Chinese and English databases were screened from January 1960 to November 2014. Randomized controlled trials in which patients had undergone medical or surgical abortions were included. Chinese studies that met the inclusion criteria were divided into 3 groups: administration of OCs postmedical abortion (group I; n = 1712), administration of OCs postsurgical abortion (group II; n = 8788), and administration of OCs in combination with traditional Chinese medicine postsurgical abortion (group III; n = 19,707). In total, 119 of 6160 publications were included in this analysis. A significant difference was observed in group I for vaginal bleeding time (P = 0.0001), the amount of vaginal bleeding (P = 0.03), and menstruation recovery period (P < 0.00001) compared with the control groups. Group II demonstrated a significant difference in vaginal bleeding time (P < 0.00001), the amount of vaginal bleeding (P = 0.0002), menstruation recovery period (P < 0.00001), and endometrial thickness at 2 (P = 0.003) and 3 (P < 0.00001) weeks postabortion compared with the control group. Similarly, a significant difference was observed in group III for reducing vaginal bleeding time (P < 0.00001) and the amount of vaginal bleeding (P < 0.00001), shortening the menstruation recovery period (P < 0.00001), and increasing endometrial thickness 2 and 3 weeks after surgical abortion (P < 0.00001 for both). Immediate administration of OCs postabortion may reduce vaginal bleeding time and amount, shorten the menstruation recovery period, increase endometrial thickness 2 to 3 weeks after abortion, and reduce the risk of complications and unintended pregnancies. PMID:27399060
Refined generalized multiscale entropy analysis for physiological signals
NASA Astrophysics Data System (ADS)
Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian
2018-01-01
Multiscale entropy analysis has become a prevalent complexity measurement and has been successfully applied in various fields. However, it only takes into account the information of mean values (the first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which considers higher moments to coarse-grain a time series, was therefore proposed, and MSEσ² has been implemented. However, MSEσ² may sometimes yield an imprecise estimation of entropy or undefined entropy, and its statistical reliability of sample entropy estimation is reduced as the scale factor increases. For this purpose, we developed a refined model, RMSEσ², to improve MSEσ². Simulations on both white noise and 1/f noise show that RMSEσ² provides higher entropy reliability and reduces the occurrence of undefined entropy, and is especially suitable for short time series. Besides, we discuss the effect on RMSEσ² analysis of outliers, data loss, and other concepts in signal processing. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure, and patients with atrial fibrillation, respectively, and compare it to several popular complexity metrics. The results demonstrate that RMSEσ²-measured complexity (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
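The precise refinement is in the paper itself; what can be sketched from this summary alone is the second-moment (σ²) coarse-graining followed by sample entropy. The parameters m and r and the scale range are assumptions, and scales start at 2 because the variance of a single sample is degenerate. The NaN branch below is exactly the undefined-entropy failure mode the refinement targets.

    import numpy as np

    def sample_entropy(x, m=2, r_factor=0.15):
        x = np.asarray(x, float)
        r = r_factor * x.std()
        def count(mm):
            tmpl = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            hits = 0
            for i in range(len(tmpl) - 1):
                d = np.max(np.abs(tmpl[i + 1:] - tmpl[i]), axis=1)
                hits += int(np.sum(d <= r))
            return hits
        B, A = count(m), count(m + 1)
        return -np.log(A / B) if A and B else np.nan  # undefined if no matches

    def mse_sigma2(x, scales=range(2, 11), m=2):
        # Coarse-grain by the variance of non-overlapping windows, then SampEn.
        x = np.asarray(x, float)
        out = []
        for tau in scales:
            nwin = len(x) // tau
            y = x[:nwin * tau].reshape(nwin, tau).var(axis=1)
            out.append(sample_entropy(y, m))
        return out

As the scale factor grows, the coarse-grained series shortens, which is why short recordings so easily hit the NaN branch and why a refined estimator matters there.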
Han, Jung Mi; Boo, Eun Hee; Kim, Jung A; Yoon, Soo Jin; Kim, Seong Woo
2012-01-01
Objectives This study evaluated the qualitative and quantitative performances of the newly developed information system which was implemented on November 4, 2011 at the National Health Insurance Corporation Ilsan Hospital. Methods Registration waiting time and changes in the satisfaction scores for the key performance indicators (KPI) before and after the introduction of the system were compared; and the economic effects of the system were analyzed by using the information economics approach. Results After the introduction of the system, the waiting time for registration was reduced by 20%, and the waiting time at the internal medicine department was reduced by 15%. The benefit-to-cost ratio was increased to 1.34 when all intangible benefits were included in the economic analysis. Conclusions The economic impact and target satisfaction rates increased due to the introduction of the new system. The results were proven by the quantitative and qualitative analyses carried out in this study. This study was conducted only seven months after the introduction of the system. As such, a follow-up study should be carried out in the future when the system stabilizes. PMID:23115744
Time frequency analysis of sound from a maneuvering rotorcraft
NASA Astrophysics Data System (ADS)
Stephenson, James H.; Tinney, Charles E.; Greenwood, Eric; Watts, Michael E.
2014-10-01
The acoustic signatures produced by a full-scale Bell 430 helicopter during steady-level-flight and transient roll-right maneuvers are analyzed by way of time-frequency analysis. The roll-right maneuvers comprise both a medium and a fast roll rate. Data are acquired using a single ground-based microphone and are analyzed by way of the Morlet wavelet transform to extract the spectral properties and sound pressure levels as functions of time. The findings show that during maneuvering operations of the helicopter, both the overall sound pressure level and the blade-vortex interaction (BVI) sound pressure level are greatest when the roll rate of the vehicle is at its maximum. The reduced inflow in the region of the rotor disk where BVI noise originates is determined to be the cause of the increase in noise: a local decrease in inflow reduces the miss distance of the tip vortex and thereby increases the BVI noise signature. Blade loading and advance ratios are also investigated as possible mechanisms for increased sound production, but are shown to be fairly constant throughout the maneuvers.
Numerical Analysis of Heat Transfer During Quenching Process
NASA Astrophysics Data System (ADS)
Madireddi, Sowjanya; Krishnan, Krishnan Nambudiripad; Reddy, Ammana Satyanarayana
2018-04-01
A numerical model is developed to simulate the immersion quenching process of metals. The time of quench plays an important role if the process involves a defined step-quenching schedule to obtain the desired characteristics. Lumped heat capacity analysis used for this purpose requires the value of the heat transfer coefficient, whose evaluation requires large amounts of experimental data. Experimentation on a sample work piece may not represent the actual component, which may vary in dimension. A fluid-structure interaction technique with a coupled interface between the solid (metal) and liquid (quenchant) is used for the simulations. The initial period of quenching shows a boiling heat transfer phenomenon with high values of heat transfer coefficients (5000 to 2.5 × 10^5 W/m^2 K). For work pieces of equal dimension, shape shows little influence on the cooling rate. Non-uniformity in hardness at the sharp corners can be reduced by rounding off the edges. For a square piece of 20 mm thickness with a 3 mm fillet radius, this difference is reduced by 73%. The model can be used for any metal-quenchant combination to obtain time-temperature data without the necessity of experimentation.
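For reference, the lumped-heat-capacity model mentioned above reduces to a single exponential once the heat transfer coefficient h is known; it is valid only while the Biot number h*Lc/k is small. The property values below describe a hypothetical 20 mm steel cube and a boiling-regime h, purely for illustration.

    import numpy as np

    # Hypothetical 20 mm steel cube quenched from 850 C into a 27 C bath.
    rho, cp = 7800.0, 490.0           # density (kg/m^3), specific heat (J/kg K)
    V, A = 0.02 ** 3, 6 * 0.02 ** 2   # volume (m^3) and surface area (m^2)
    h = 5.0e4                         # boiling-regime HTC (W/m^2 K), assumed
    T0, Tq = 1123.0, 300.0            # initial and quenchant temperatures (K)

    tau = rho * cp * V / (h * A)      # lumped time constant (s)
    t = np.linspace(0.0, 5 * tau, 200)
    T = Tq + (T0 - Tq) * np.exp(-t / tau)  # time-temperature cooling curve

The coupled fluid-structure simulations in the paper exist precisely to supply h as it varies through the boiling regimes, which this single-constant formula cannot capture without extensive experimental data.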
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-06-08
Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently at low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.
NASA Astrophysics Data System (ADS)
Shao, Liyang; Zhang, Yunpeng; Li, Zonglei; Zhang, Zhiyong; Zou, Xihua; Luo, Bin; Pan, Wei; Yan, Lianshan
2016-11-01
Logarithmic detectors (LogDs) have been used in coherent Brillouin optical time-domain analysis (BOTDA) sensors to reduce the effect of phase fluctuation, demodulation complexity, and measurement time. However, because of the inherent properties of LogDs, a DC component at the level of hundreds of millivolts can be generated that prohibits high-gain signal amplification (SA), resulting in unacceptable data acquisition (DAQ) inaccuracies and decoding errors during prototype integration. By generating a reference light at a level similar to the probe light, differential detection can be applied to remove the DC component automatically using a differential amplifier before the DAQ process. Therefore, high-gain SA can be employed to reduce quantization errors. The signal-to-noise ratio of the weak Brillouin gain signal is improved from ~11.5 to ~21.8 dB. A BOTDA prototype is implemented based on the proposed scheme. The experimental results show that the measurement accuracy of the Brillouin frequency shift (BFS) is improved from ±1.9 to ±0.8 MHz at the end of a 40-km sensing fiber.
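A conceptual sketch of why subtracting the DC term permits higher gain before digitization; this is an assumed numerical model, not the authors' implementation, with invented signal levels:

```python
# Differential detection demo: a weak gain trace riding on a ~300 mV DC
# offset is quantized either directly (gain limited by the DC term) or
# after analog subtraction of a reference channel (much higher gain fits
# in the same ADC window, so quantization error shrinks).
import numpy as np

rng = np.random.default_rng(0)
n = 4096
gain_trace = 1e-3 * np.exp(-np.linspace(0, 3, n))    # weak signal, mV scale
dc = 0.3                                             # hundreds-of-mV DC term
probe = dc + gain_trace + 1e-4 * rng.standard_normal(n)
reference = dc + 1e-4 * rng.standard_normal(n)       # reference at similar level

def adc(x, full_scale=1.0, bits=12):
    """Ideal quantizer clipped to +/- full_scale."""
    q = full_scale / 2 ** (bits - 1)
    return np.clip(np.round(x / q) * q, -full_scale, full_scale)

direct = adc(3.0 * probe) / 3.0                      # DC limits gain to ~3x
differential = adc(300.0 * (probe - reference)) / 300.0

err_direct = np.std(direct - probe)
err_diff = np.std(differential - (probe - reference))
print(f"rms quantization error: direct {err_direct:.2e}, differential {err_diff:.2e}")
```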
Detection of epileptiform activity in EEG signals based on time-frequency and non-linear analysis
Gajic, Dragoljub; Djurovic, Zeljko; Gligorijevic, Jovan; Di Gennaro, Stefano; Savic-Gajic, Ivana
2015-01-01
We present a new technique for the detection of epileptiform activity in EEG signals. After preprocessing of the EEG signals, we extract representative features in the time, frequency, and time-frequency domains, as well as using non-linear analysis. The features are extracted in a few frequency sub-bands of clinical interest, since these sub-bands showed much better discriminatory characteristics than the whole frequency band. We then optimally reduce the dimension of the feature space to two using scatter matrices. A decision about the presence of epileptiform activity in EEG signals is made by quadratic classifiers designed in the reduced two-dimensional feature space. The accuracy of the technique was tested on three sets of electroencephalographic (EEG) signals recorded at the University Hospital Bonn: surface EEG signals from healthy volunteers, intracranial EEG signals from epilepsy patients during the seizure-free interval from within the seizure focus, and intracranial EEG signals of epileptic seizures, also from within the seizure focus. An overall detection accuracy of 98.7% was achieved. PMID:25852534
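A hedged sketch of this pipeline: band-limited power features, a scatter-matrix-based projection to two dimensions (here Fisher's LDA, which is built from the same within- and between-class scatter matrices), and a quadratic classifier in the reduced space. The epochs and labels below are synthetic stand-ins, not the Bonn recordings:

```python
# Band-power features -> 2-D scatter-matrix (LDA) projection -> quadratic
# classifier, mirroring the structure described in the abstract.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)

fs = 173.61                                    # Bonn sets' sampling rate, Hz
bands = [(0.5, 4), (4, 8), (8, 13), (13, 30)]  # clinical sub-bands

def band_powers(epoch):
    f, pxx = welch(epoch, fs=fs, nperseg=256)
    return [pxx[(f >= lo) & (f < hi)].sum() for lo, hi in bands]

rng = np.random.default_rng(1)
epochs = rng.standard_normal((300, 1024))      # placeholder EEG epochs
labels = rng.integers(0, 3, size=300)          # healthy / interictal / seizure

X = np.array([band_powers(e) for e in epochs])
X2 = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, labels)
clf = QuadraticDiscriminantAnalysis().fit(X2, labels)
print("training accuracy on synthetic data:", clf.score(X2, labels))
```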
Co, Jayson L; Mejia, Michael Benedict A; Que, Jocelyn C; Dizon, Janine Margarita R
2016-07-01
Mucositis is a disabling effect of radiotherapy in head and neck cancers. There is no current standard on management of radiation-induced mucositis. Honey has been shown to reduce radiation-induced mucositis. A systematic review and meta-analysis were undertaken to assess the ability of honey in reducing the severity of oral mucositis, time to mucositis, weight loss, and treatment interruptions. Eight studies were included and showed that honey was significantly better in lowering the risk for treatment interruptions, weight loss, and delaying time to mucositis, but not severity of mucositis. There is current evidence that honey is beneficial for patients with head and neck cancers by decreasing treatment interruptions, weight loss, and delaying the onset of oral mucositis, but not in decreasing peak mucositis score. In light of the results, honey is a reasonable treatment for radiation-induced mucositis, but more randomized clinical trials (RCTs) should be done. © 2016 Wiley Periodicals, Inc. Head Neck 38: 1119-1128, 2016.
System for sensing droplet formation time delay in a flow cytometer
Van den Engh, Ger; Esposito, Richard J.
1997-01-01
A droplet flow cytometer system that optimizes the droplet formation time delay based on conditions actually experienced includes an automatic droplet sampler, which rapidly moves a plurality of containers stepwise through the droplet stream while simultaneously adjusting the droplet time delay. Through the system, sampling of the actual substance to be processed can be used to minimize the effect of the substance's variations on the determination of which time delay is optimal. Analysis such as cell counting and the like may be conducted manually or automatically and input to a time delay adjustment, which may then act with analysis equipment to revise the time delay estimate actually applied during processing. The automatic sampler can be controlled through a microprocessor and appropriate programming to bracket an initial droplet formation time delay estimate. When a maximum, determined through volume, weight, or other types of analysis, exists in one of the containers, the increment may then be reduced for a more accurate ultimate setting. This may be accomplished while actually processing the sample without interruption.
Multiscale multifractal detrended cross-correlation analysis of financial time series
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian; Wang, Jing; Lin, Aijing
2014-06-01
In this paper, we introduce a method called multiscale multifractal detrended cross-correlation analysis (MM-DCCA). The method allows us to extend the description of the cross-correlation properties between two time series. MM-DCCA may provide new ways of measuring the nonlinearity of two signals, and it helps to present much richer information than multifractal detrended cross-correlation analysis (MF-DCCA) by sweeping the full range of scales over which the multifractal structure of a complex system is discussed. Moreover, to illustrate the advantages of this approach, we use MM-DCCA to analyze the cross-correlation properties between financial time series. We show that this new method can be adapted to the stock markets under investigation and can provide a more faithful and more interpretable description of the dynamic mechanism between financial time series than traditional MF-DCCA. We also propose reducing the scale ranges when analyzing short time series, so that inherent properties which remain hidden when a wide range is used may be exhibited clearly.
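A minimal sketch of the plain DCCA fluctuation function underlying this family of methods (the MF- and multiscale variants add a q-moment and a sweep over scale ranges); the series below are synthetic, not market data:

```python
# Detrended cross-correlation analysis: integrate both series, detrend in
# boxes of size s, average the residual cross-covariance, and read the
# scaling exponent off the log-log slope of F(s).
import numpy as np

def dcca_fluctuation(x, y, s, order=1):
    """Detrended cross-covariance fluctuation F(s) for one box size s."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    n_boxes = len(X) // s
    t = np.arange(s)
    f2 = []
    for b in range(n_boxes):
        xs, ys = X[b * s:(b + 1) * s], Y[b * s:(b + 1) * s]
        rx = xs - np.polyval(np.polyfit(t, xs, order), t)  # local detrending
        ry = ys - np.polyval(np.polyfit(t, ys, order), t)
        f2.append(np.mean(rx * ry))
    return np.sqrt(np.abs(np.mean(f2)))

rng = np.random.default_rng(2)
x = rng.standard_normal(4096)
y = 0.6 * x + 0.8 * rng.standard_normal(4096)
scales = np.unique(np.geomspace(8, 512, 12).astype(int))
F = [dcca_fluctuation(x, y, s) for s in scales]
lam = np.polyfit(np.log(scales), np.log(F), 1)[0]   # log-log slope
print(f"estimated scaling exponent ~ {lam:.2f}")
```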
Leisure-time physical activity and all-cause mortality.
Lahti, Jouni; Holstila, Ansku; Lahelma, Eero; Rahkonen, Ossi
2014-01-01
Physical inactivity is a major public health problem associated with increased mortality risk. It is, however, poorly understood whether vigorous physical activity is more beneficial for reducing mortality risk than activities of lower intensity. The aim of this study was to examine associations of the intensity and volume of leisure-time physical activity with all-cause mortality among middle-aged women and men, while considering sociodemographic and health-related factors as covariates. Questionnaire survey data collected in 2000-02 among 40-60-year-old employees of the City of Helsinki (N = 8960) were linked with register data on mortality (74% gave permission for the linkage), providing a mean follow-up time of 12 years. The analysis included 6429 respondents (79% women). The participants were classified into three groups according to the intensity of physical activity: low moderate, high moderate and vigorous. The volume of physical activity was classified into three groups according to tertiles. Cox regression analysis was used to calculate hazard ratios (HR) and 95% confidence intervals (CIs) for all-cause mortality. During the follow-up, 205 participants died. Leisure-time physical activity was associated with reduced risk of mortality. After adjusting for covariates, the vigorous group (HR = 0.54, 95% CI 0.34-0.86) showed a reduced risk of mortality compared with the low moderate group, whereas for the high moderate group the reduction in mortality risk (HR = 0.72, 95% CI 0.48-1.08) was less clear. Adjusting for the volume of physical activity did not affect the point estimates. Higher volume of leisure-time physical activity was also associated with reduced mortality risk; however, adjusting for the covariates and the intensity of physical activity explained the differences. For healthy middle-aged women and men who engage in some physical activity, vigorous exercise may provide further health benefits preventing premature deaths.
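A hedged illustration of the Cox proportional-hazards step described above, using the lifelines package; the column names, covariates, and data are invented placeholders, far simpler than the study's actual adjustment set:

```python
# Fit a Cox model on synthetic follow-up data and report hazard ratios
# (exp(coef)) with 95% CIs, analogous to the HRs quoted in the abstract.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 200
age = rng.integers(40, 61, n)
vigorous = rng.integers(0, 2, n)
high_moderate = (1 - vigorous) * rng.integers(0, 2, n)

# Synthetic hazard: vigorous activity halves the event rate (assumption).
rate = 0.02 * np.exp(0.03 * (age - 50)) * np.where(vigorous, 0.5, 1.0)
time_to_event = rng.exponential(1 / rate)
followup = np.minimum(time_to_event, 12.0)          # 12-year follow-up cap
died = (time_to_event <= 12.0).astype(int)

df = pd.DataFrame({"followup_years": followup, "died": died,
                   "vigorous": vigorous, "high_moderate": high_moderate,
                   "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()
```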
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin
The state of the art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems used by policy analysts to yield estimates of economic impacts of various types of transportation system failures due to natural hazards, human related attacks or technological accidents. This paper presents a reduced-form approach to simplify the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the "synthetic data" results in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focuses on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
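A sketch of this reduced-form workflow under stated assumptions: Latin Hypercube samples of shock parameters drive a placeholder loss function (standing in for the full CGE run, which is not reproduced here), and OLS plus quantile regression fit linear response surfaces to the synthetic results:

```python
# Latin Hypercube sampling -> surrogate "model runs" -> OLS and quantile
# regression response surfaces, mirroring the paper's reduced-form recipe.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import qmc

rng = np.random.default_rng(4)
names = ["magnitude", "duration", "remediation", "resilience"]
sampler = qmc.LatinHypercube(d=len(names), seed=4)
unit = sampler.random(n=100)                 # one hundred runs, as in the text
lo, hi = [0.0, 1.0, 0.0, 0.0], [0.5, 30.0, 1.0, 1.0]
X = qmc.scale(unit, lo, hi)

def cge_model(m, d, r, s):
    # Placeholder for a CGE simulation: loss grows with shock size and
    # duration, shrinks with remediation and resilience; noise mimics
    # run-to-run variability. Entirely illustrative.
    return 100 * m * d * (1 - 0.5 * r) * (1 - 0.6 * s) * (1 + 0.05 * rng.standard_normal())

df = pd.DataFrame(X, columns=names)
df["loss"] = [cge_model(*row) for row in X]

ols = smf.ols("loss ~ magnitude + duration + remediation + resilience", df).fit()
q90 = smf.quantreg("loss ~ magnitude + duration + remediation + resilience", df).fit(q=0.9)
print(ols.params.round(2))
print(q90.params.round(2))
```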
Conducting On-orbit Gene Expression Analysis on ISS: WetLab-2
NASA Technical Reports Server (NTRS)
Parra, Macarena; Almeida, Eduardo; Boone, Travis; Jung, Jimmy; Lera, Matthew P.; Ricco, Antonio; Souza, Kenneth; Wu, Diana; Richey, C. Scott
2013-01-01
WetLab-2 will enable expanded genomic research on orbit by developing tools that support in situ sample collection, processing, and analysis on ISS. This capability will reduce the time-to-results for investigators and define new pathways for discovery on the ISS National Lab. The primary objective is to develop a research platform on ISS that will facilitate real-time quantitative gene expression analysis of biological samples collected on orbit. WetLab-2 will be capable of processing multiple sample types ranging from microbial cultures to animal tissues dissected on orbit. WetLab-2 will significantly expand the analytical capabilities onboard ISS and enhance science return from ISS.
Czuba, J.A.; Oberg, K.
2008-01-01
Previous work by Oberg and Mueller of the U.S. Geological Survey in 2007 concluded that exposure time (total time spent sampling the flow) is a critical factor in reducing measurement uncertainty. In a subsequent paper, Oberg and Mueller validated these conclusions using one set of data to show that the effect of exposure time on the uncertainty of the measured discharge is independent of stream width, depth, and range of boat speeds. Analysis of eight StreamPro acoustic Doppler current profiler (ADCP) measurements indicates that they fall within, and show a similar trend to, the Rio Grande ADCP data previously reported. Four special validation measurements were made for the purpose of verifying the conclusions of Oberg and Mueller regarding exposure time for Rio Grande and StreamPro ADCPs. Analysis of these measurements confirms that exposure time is a critical factor in reducing measurement uncertainty and is independent of stream width, depth, and range of boat speeds. Furthermore, it appears that the relation between measured discharge uncertainty and exposure time is similar for both Rio Grande and StreamPro ADCPs. These results are applicable to ADCPs that make use of broadband technology using bottom-tracking to obtain the boat velocity. Based on this work, a minimum of two transects should be collected, with an exposure time for all transects greater than or equal to 720 seconds, in order to achieve an uncertainty of ±5 percent when using bottom-tracking ADCPs. © 2008 IEEE.
Khoo, E H; Ahmed, I; Goh, R S M; Lee, K H; Hung, T G G; Li, E P
2013-03-11
The dynamic-thermal electron-quantum medium finite-difference time-domain (DTEQM-FDTD) method is used for efficient analysis of the mode profile in an elliptical microcavity. The resonance peak of the elliptical microcavity is studied by varying the length ratio. It is observed that at some length ratios, a cavity mode is excited instead of a whispering gallery mode, indicating that mode profiles are length-ratio dependent. Through the implementation of the DTEQM-FDTD on a graphics processing unit (GPU), the simulation time is reduced by 300 times compared to the CPU. This leads to an efficient optimization approach for designing microcavity lasers for a wide range of applications in photonic integrated circuits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moscicki, J. K.; Sokolowska, D.; Dziob, D.
2014-02-15
A simplified data analysis protocol for dielectric spectroscopy, used to study conductivity percolation in dehydrating granular media, is discussed. To enhance the visibility of the protonic conductivity contribution to the dielectric loss spectrum, the detrimental effects of low-frequency dielectric relaxation and electrode polarization are removed. Use of the directly measurable monofrequency dielectric loss factor, rather than the estimated DC conductivity, to parameterize the percolation transition substantially reduces the analysis work and time.
Mobile GPU-based implementation of automatic analysis method for long-term ECG.
Fan, Xiaomao; Yao, Qihang; Li, Ye; Chen, Runge; Cai, Yunpeng
2018-05-03
Long-term electrocardiogram (ECG) monitoring is one of the important diagnostic assistant approaches for capturing intermittent cardiac arrhythmias. The combination of miniaturized wearable holters and healthcare platforms enables people to have their cardiac condition monitored at home. The high computational burden created by concurrent processing of numerous holter data poses a serious challenge to the healthcare platform. An alternative solution is to shift the analysis tasks from healthcare platforms to the mobile computing devices. However, long-term ECG data processing is quite time consuming due to the limited computation power of the mobile central processing unit (CPU). This paper proposes a novel parallel automatic ECG analysis algorithm that exploits the mobile graphics processing unit (GPU) to reduce the response time for processing long-term ECG data. By studying the architecture of the sequential automatic ECG analysis algorithm, we parallelized the time-consuming parts and reorganized the entire pipeline in the parallel algorithm to fully utilize the heterogeneous computing resources of the CPU and GPU. The experimental results showed that the average executing time of the proposed algorithm on a clinical long-term ECG dataset (duration 23.0 ± 1.0 h per signal) is 1.215 ± 0.140 s, achieving an average speedup of 5.81 ± 0.39× without compromising analysis accuracy, compared with the sequential algorithm. Meanwhile, the battery energy consumption of the automatic ECG analysis algorithm was reduced by 64.16%. Excluding energy consumption from data loading, 79.44% of the energy consumption could be saved, which alleviates the problem of limited battery working hours for mobile devices. The reduction of response time and battery energy consumption in ECG analysis not only brings a better quality of experience to holter users, but also makes it possible to use mobile devices as ECG terminals for healthcare professionals such as physicians and health advisers, enabling them to inspect patient ECG recordings onsite efficiently without the need for a high-quality wide-area network environment.
Algorithms and Object-Oriented Software for Distributed Physics-Based Modeling
NASA Technical Reports Server (NTRS)
Kenton, Marc A.
2001-01-01
The project seeks to develop methods to more efficiently simulate aerospace vehicles. The goals are to reduce model development time, increase accuracy (e.g., by allowing the integration of multidisciplinary models), facilitate collaboration by geographically distributed groups of engineers, support uncertainty analysis and optimization, reduce hardware costs, and increase execution speeds. These problems are the subject of considerable contemporary research (e.g., Biedron et al. 1999; Heath and Dick, 2000).
Use of Six Sigma Methodology to Reduce Appointment Lead-Time in Obstetrics Outpatient Department.
Ortiz Barrios, Miguel A; Felizzola Jiménez, Heriberto
2016-10-01
This paper focuses on the issue of long appointment lead-times in the obstetrics outpatient department of a maternal-child hospital in Colombia. Because of extended appointment lead-times, women with high-risk pregnancies could develop severe complications in their health status and put their babies at risk. This problem was detected through a project selection process explained in this article, and Six Sigma methodology was used to solve it. First, the process was defined through a SIPOC diagram to identify its input and output variables. Second, Six Sigma performance indicators were calculated to establish the process baseline. Then, a fishbone diagram was used to determine the possible causes of the problem. These causes were validated with the aid of correlation analysis and other statistical tools. Later, improvement strategies were designed to reduce appointment lead-time in this department. Project results showed that the average appointment lead-time fell from 6.89 days to 4.08 days and the standard deviation dropped from 1.57 days to 1.24 days. In this way, the hospital will serve pregnant women faster, which represents a risk reduction for perinatal and maternal mortality.
Head movement compensation in real-time magnetoencephalographic recordings.
Little, Graham; Boe, Shaun; Bardouille, Timothy
2014-01-01
Neurofeedback- and brain-computer interface (BCI)-based interventions can be implemented using real-time analysis of magnetoencephalographic (MEG) recordings. Head movement during MEG recordings, however, can lead to inaccurate estimates of brain activity, reducing the efficacy of the intervention. Most real-time applications in MEG have utilized analyses that do not correct for head movement, so effective means of correcting for head movement are needed to optimize the use of MEG in such applications. Here we provide preliminary validation of a novel analysis technique, real-time source estimation (rtSE), that measures head movement and generates corrected current source time course estimates in real time. rtSE was applied while recording a calibrated phantom to determine phantom position localization accuracy and source amplitude estimation accuracy under stationary and moving conditions. Results were compared to off-line analysis methods to assess the validity of the rtSE technique. The rtSE method allowed accurate estimation of current source activity at the source level in real time, and accounted for movement of the source due to changes in phantom position. The rtSE technique requires modifications and specialized analysis of the following MEG workflow steps, each of which is explained and validated in this work: data acquisition, head position estimation, source localization, and real-time source estimation.
Sparse Solutions for Single Class SVMs: A Bi-Criterion Approach
NASA Technical Reports Server (NTRS)
Das, Santanu; Oza, Nikunj C.
2011-01-01
In this paper we propose an innovative learning algorithm - a variation of the one-class nu support vector machine (SVM) learning algorithm - to produce sparser solutions with much reduced computational complexity. The proposed technique returns an approximate solution, nearly as good as the solution set obtained by the classical approach, by minimizing the original risk function along with a regularization term. We introduce a bi-criterion optimization that helps guide the search towards the optimal set in much reduced time. The outcome of the proposed learning technique was compared with the benchmark one-class SVM algorithm, which more often leads to solutions with redundant support vectors. Throughout the analysis, the problem size for both optimization routines was kept consistent. We have tested the proposed algorithm on a variety of data sources under different conditions to demonstrate its effectiveness. In all cases the proposed algorithm closely preserves the accuracy of standard one-class nu SVMs while reducing both training time and test time by several factors.
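The bi-criterion method itself is not available in standard libraries; the sketch below only reproduces the baseline it is compared against, a one-class nu-SVM, and shows how nu bounds the support-vector fraction, which is the sparsity the proposed approach improves on:

```python
# Baseline one-class nu-SVM on nominal-only data: nu upper-bounds the
# training-error fraction and lower-bounds the support-vector fraction,
# so smaller nu yields sparser (but less tolerant) models.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(5)
X = rng.standard_normal((500, 4))            # placeholder nominal data

for nu in (0.5, 0.1, 0.02):
    model = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(X)
    frac_sv = len(model.support_) / len(X)
    print(f"nu={nu:4.2f}: support-vector fraction = {frac_sv:.2f}")
```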
2008-08-18
...fidelity will be used to reduce the massive experimental testing and associated time required for qualification of new materials. Tools and... developing a model of the thermo-oxidative process for polymer systems that incorporates the effects of reaction rates, Fickian diffusion, time-varying... degradation processes.
Gimonet, Johan; Portmann, Anne-Catherine; Fournier, Coralie; Baert, Leen
2018-06-16
This work shows that an incubation time reduced to 4-5 h to prepare a culture for DNA extraction, followed by an automated DNA extraction, can shorten the hands-on time and the turnaround time by 30% and increase the throughput, while maintaining WGS quality as assessed by high-quality Single Nucleotide Polymorphism analysis. Copyright © 2018. Published by Elsevier B.V.
Moonshiram, Dooshaye; Picon, Antonio; Vazquez-Mayagoitia, Alvaro; ...
2017-02-08
Here, we report the use of time-resolved X-ray absorption spectroscopy on the ns-μs time scale to track the light-induced two-electron transfer processes in a multi-component photocatalytic system consisting of [Ru(bpy)3]2+, a diiron(III,III) model, and triethylamine. EXAFS analysis with DFT calculations confirms the structural configurations of the diiron(III,III) and reduced diiron(II,II) states.
Reduction of VSC and salivary bacteria by a multibenefit mouthrinse.
Boyd, T; Vazquez, J; Williams, M
2008-03-01
To evaluate the effectiveness of a multibenefit mouthrinse containing 0.05% cetylpyridinium chloride (CPC) and 0.025% sodium fluoride in reducing volatile sulfur compound (VSC) levels and total cultivable salivary bacteria, at both 4 h and overnight. In vitro analysis of efficacy was performed using saliva-coated hydroxyapatite disc substrates first treated with the mouthrinse, then exposed to whole human saliva, followed by overnight incubation in air-tight vials. Headspace VSC was quantified by gas chromatography (GC). A clinical evaluation was conducted with 14 subjects using a crossover design. After a seven-day washout period, baseline clinical measurement of VSC was performed by GC analysis of mouth air sampled in the morning prior to eating, drinking or performing any oral hygiene. A 10 mL saline rinse was used to sample and enumerate cultivable salivary bacterial levels via serial dilution and plating. Subjects were instructed to use the treatment rinse twice daily in combination with a controlled brushing regimen. After one week the subjects returned in the morning, prior to eating, drinking or performing oral hygiene, to provide samples of overnight mouth air and salivary bacteria. The subjects then immediately rinsed with the test product and provided additional mouth air and saliva rinse samples 4 h later. A multibenefit rinse containing 0.05% CPC and 0.025% sodium fluoride was found to reduce VSC in vitro by 52%. The rinse also demonstrated a significant clinical reduction in breath VSC (p < 0.05) of 55.8% at 4 h and 23.4% overnight relative to baseline VSC levels. At both time points, the multibenefit rinse was more effective than the control; this difference was statistically significant at the overnight time point (p < 0.05). Total cultivable salivary bacteria levels were also reduced significantly (p < 0.05) at 4 h and overnight by this mouthrinse compared to baseline levels and the control. A multibenefit mouthrinse was shown to reduce VSC levels in vitro via headspace analysis and clinically at the 4 h and overnight time points. A significant reduction in total cultivable salivary bacteria was also observed at all time points, supporting the VSC data.
Gilchrist, Nigel; Dalzell, Kristian; Pearson, Scott; Hooper, Gary; Hoeben, Kit; Hickling, Jeremy; McKie, John; Yi, Ma; Chamberlain, Sandra; McCullough, Caroline; Gutenstein, Marc
2017-05-12
The increasing elderly population and the subsequent rise in total hip fractures in this group mean that more effective management strategies are necessary to improve efficiency. We have changed our patient care strategy across the emergency department (ED), acute orthopaedic wards, operating theatre, post-operation and rehabilitation, and called it the Fracture Neck of Femur Fast Track Pathway. All clinical data and actions were captured, integrated and displayed on a weekly basis using 'signalfromnoise' (SFN) software. The initial four-month analysis of this project showed significant improvement in patient flow within the hospitals. The overall length of stay was reduced by four days. Time in ED was reduced by 30 minutes, and the wait for rehabilitation was reduced by three days. Overall time in rehabilitation was reduced by 3-7 days depending on the facility. On average, fast track patients spent 95 fewer hours in hospital, resulting in 631 bed days saved in this period, with projected savings of NZD700,000. No adverse effects were seen in mortality, readmission or functional improvement status. Fractured neck of femur places increasing clinical demand on a busy tertiary hospital. Length of stay, co-morbidities and waiting time for theatres are seen as major barriers to treatment for these conditions. The wait for rehabilitation can significantly lengthen hospital stay; poor communication between the individual hospital management facets of this condition has also been an ongoing issue. The lack of instant, available electronic information on this patient group has also been seen as a major barrier to improvement. This paper demonstrates how integration of the service components involved in fractured neck of femur care can be achieved. It also shows how the use of electronic data capture and analysis can give quick and easily interpretable data trends that enable change in practice. This paper indicates that cooperation between health professionals and practitioners can significantly improve the length of stay and the time in which patients can be returned home. Full interdisciplinary involvement was the key to this approach. The use of electronic data capture and analysis can be applied to many other health pathways within the health system.
Cost analysis serves many purposes.
Finger, W R
1998-01-01
This article discusses the utility of performing cost analysis of family planning (FP) personnel resources by relying on a system analysis framework in developing countries. A study of a national provider that distributes 16% of all FP services in Mexico found that more efficient use of staff would increase the number of clients served. Nurses and doctors worked slightly more than 6 hours/day, and 38% of a nurse's time and 47% of a physician's time was spent in meetings, administrative duties, unoccupied work time, and personal time. The Mexican government proposed increasing the work day to 8 hours and increasing to 66% the portion of the work day spent on direct client activity. With this change, services would increase from 1.5 million couple-years of protection (CYP) to 1.8 million CYP in 2010, without additional staff, and CYP cost would decline. CYP costs could potentially be reduced by increasing the number of contraceptive units provided per visit and switching from a 1-month- to a 3-month-duration injectable contraceptive. A Bangladesh study found that CYP costs could be reduced by eliminating absenteeism and increasing work time/day by 1 hour. Cost studies can address specific human resource issues. A study in Thailand found that Norplant was more expensive per CYP than injectables and the IUD, and Norplant acceptors were willing to switch to other effective modern methods. The Thai government decided to target Norplant to a few target groups. Staff time use evaluations can be conducted by requiring staff to record their time or by having clients maintain records of staff time on their health cards. The time-motion study, which involves direct observations of how staff spend their time, is costly but avoids estimation error. A CEMOPLAF study in Ecuador found that 1 visit detected almost as many health problems as 4 visits. Some studies examine cost savings related to other services.
NASA Astrophysics Data System (ADS)
Pawłuszek, Kamila; Borkowski, Andrzej
2016-06-01
Since the availability of high-resolution Airborne Laser Scanning (ALS) data, substantial progress has been made in geomorphological research, especially in landslide analysis. First- and second-order derivatives of the Digital Terrain Model (DTM) have become a popular and powerful tool in landslide inventory mapping. Nevertheless, automatic landslide mapping based on sophisticated classifiers, including Support Vector Machines (SVM), Artificial Neural Networks, or Random Forests, is often computationally time-consuming. The objective of this research is to deeply explore the topographic information provided by ALS data and overcome the computational time limitation. For this reason, an extended set of topographic features and Principal Component Analysis (PCA) were used to reduce redundant information. The proposed novel approach was tested on a susceptible area affected by more than 50 landslides, located at Rożnów Lake in the Carpathian Mountains, Poland. The first seven PCA components, carrying 90% of the total variability of the original topographic attributes, were used for SVM classification. Comparing the results with the landslide inventory map, the average user's accuracy (UA), producer's accuracy (PA), and overall accuracy (OA) were calculated for the two models. For the PCA-feature-reduced model, UA, PA, and OA were found to be 72%, 76%, and 72%, respectively; for the non-reduced original topographic model they were 74%, 77%, and 74%, respectively. Using the first seven PCA components instead of the twenty original topographic attributes does not significantly change identification accuracy but reduces computational time.
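A hedged sketch of the PCA-reduced classification described above: project an extended attribute set onto the leading components (about 90% of variance, seven components in the paper) and train an SVM. The feature values below are synthetic placeholders for real topographic attributes:

```python
# StandardScaler -> PCA (retain ~90% variance) -> RBF SVM, mirroring the
# feature-reduction-before-classification idea in the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(6)
X = rng.standard_normal((2000, 20))          # 20 placeholder DTM attributes
y = rng.integers(0, 2, 2000)                 # landslide / non-landslide labels

clf = make_pipeline(StandardScaler(),
                    PCA(n_components=0.90),  # keep ~90% of total variance
                    SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X, y)
n_kept = clf.named_steps["pca"].n_components_
print(f"components kept: {n_kept}; training accuracy: {clf.score(X, y):.2f}")
```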
Ding, Chao; Wang, Chunmao; Dong, Aiqiang; Kong, Minjian; Jiang, Daming; Tao, Kaiyu; Shen, Zhonghua
2012-05-04
Anterolateral minithoracotomy (ALMT) for the radical correction of congenital heart defects is an alternative to median sternotomy (MS) because it reduces operative trauma, accelerates recovery, and yields a better cosmetic outcome after surgery. Our purpose was to determine whether ALMT brings more short-term benefits to patients than conventional MS by means of a meta-analysis of case-control studies published in English-language journals. Six case-control studies published from 1997 to 2011 were identified and synthesized to compare the short-term postoperative outcomes of ALMT and MS. These outcomes were cardiopulmonary bypass time, aortic cross-clamp time, intubation time, intensive care unit (ICU) stay time, and postoperative hospital stay time. ALMT had significantly longer cardiopulmonary bypass times (8.00 min more, 95% CI 0.36 to 15.64 min, p = 0.04). Some evidence suggested that the aortic cross-clamp time of ALMT was longer, though not significantly (2.38 min more, 95% CI -0.15 to 4.91 min, p = 0.06). In addition, ALMT had significantly shorter intubation time (1.66 h less, 95% CI -3.05 to -0.27 h, p = 0.02). Postoperative hospital stay time was significantly shorter with ALMT (1.52 days less, 95% CI -2.71 to -0.33 days, p = 0.01). Some evidence suggested a reduction in ICU stay time in the ALMT group; however, this did not prove to be statistically significant (0.88 days less, 95% CI -0.81 to 0.04 days, p = 0.08). ALMT can bring more benefits to patients with congenital heart defects by reducing intubation time and postoperative hospital stay time, although it has longer cardiopulmonary bypass and aortic cross-clamp times.
Limited-memory adaptive snapshot selection for proper orthogonal decomposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxberry, Geoffrey M.; Kostova-Vassilevska, Tanya; Arrighi, Bill
2015-04-02
Reduced order models are useful for accelerating simulations in many-query contexts, such as optimization, uncertainty quantification, and sensitivity analysis. However, offline training of reduced order models can have prohibitively expensive memory and floating-point operation costs in high-performance computing applications, where memory per core is limited. To overcome this limitation for proper orthogonal decomposition, we propose a novel adaptive selection method for snapshots in time that limits offline training costs by selecting snapshots according to an error control mechanism similar to that found in adaptive time-stepping ordinary differential equation solvers. The error estimator used in this work is related to theory bounding the approximation error in time of proper orthogonal decomposition-based reduced order models, and memory usage is minimized by computing the singular value decomposition using a single-pass incremental algorithm. Results for a viscous Burgers' test problem demonstrate convergence in the limit as the algorithm error tolerances go to zero; in this limit, the full order model is recovered to within discretization error. The resulting method can be used on supercomputers to generate proper orthogonal decomposition-based reduced order models, or as a subroutine within hyperreduction algorithms that require taking snapshots in time, or within greedy algorithms for sampling parameter space.
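A simplified stand-in for the adaptive selection idea, under stated assumptions: a growing orthonormal basis admits a new snapshot only when its projection error exceeds a tolerance, mimicking the error control of an adaptive time-stepper. The paper uses a single-pass incremental SVD; plain Gram-Schmidt is used here only to keep the sketch short:

```python
# Error-controlled snapshot selection: keep snapshot j only if it lies
# sufficiently far outside the span of the basis built so far.
import numpy as np

def adaptive_basis(snapshots, tol=1e-2):
    Q = np.empty((snapshots.shape[0], 0))
    kept = []
    for j, u in enumerate(snapshots.T):
        r = u - Q @ (Q.T @ u)                # component outside current span
        if np.linalg.norm(r) > tol * np.linalg.norm(u):
            Q = np.hstack([Q, (r / np.linalg.norm(r))[:, None]])
            kept.append(j)
    return Q, kept

# Toy trajectory: a slowly translating low-rank field sampled at 200 times.
t = np.linspace(0, 1, 200)
x = np.linspace(0, 1, 400)[:, None]
S = np.sin(2 * np.pi * (x - 0.3 * t)) + 0.1 * np.sin(6 * np.pi * x) * t
Q, kept = adaptive_basis(S, tol=1e-3)
print(f"kept {Q.shape[1]} basis vectors out of {S.shape[1]} snapshots")
```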
Loop transfer recovery for general nonminimum phase discrete time systems. I - Analysis
NASA Technical Reports Server (NTRS)
Chen, Ben M.; Saberi, Ali; Sannuti, Peddapullaiah; Shamash, Yacov
1992-01-01
A complete analysis of loop transfer recovery (LTR) for general nonstrictly proper, not necessarily minimum phase discrete time systems is presented. Three different observer-based controllers, namely, "prediction estimator" and full- or reduced-order type "current estimator" based controllers, are used. The analysis corresponding to all three controllers is unified into a single mathematical framework. The LTR analysis given here focuses on three fundamental issues: (1) the recoverability of a target loop when it is arbitrarily given, (2) the recoverability of a target loop while taking into account its specific characteristics, and (3) the establishment of necessary and sufficient conditions on the given system so that it has at least one recoverable target loop transfer function or sensitivity function. Various differences that arise in the LTR analysis of continuous and discrete systems are pointed out.
Wang, D Z; Wang, C; Shen, C F; Zhang, Y; Zhang, H; Song, G D; Xue, X D; Xu, Z L; Zhang, S; Jiang, G H
2017-05-10
We described the time trend of the acute myocardial infarction (AMI) incidence rate in Tianjin from 1999 to 2013 with the Cochran-Armitage trend (CAT) test and linear regression analysis, and the results were compared. Based on the actual population, the CAT test had much stronger statistical power than linear regression analysis for both the overall incidence trend and age-specific incidence trends (Cochran-Armitage trend P value
Visual cluster analysis and pattern recognition template and methods
Osbourn, G.C.; Martinez, R.F.
1999-05-04
A method of clustering using a novel template to define a region of influence is disclosed. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques. 30 figs.
Code of Federal Regulations, 2014 CFR
2014-04-01
... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS VALUE ENGINEERING... construction. Value Engineering (VE) analysis. The systematic process of reviewing and assessing a project by a...; and (3) Reducing the time to develop and deliver the project. Value Engineering (VE) Job Plan. A...
Code of Federal Regulations, 2013 CFR
2013-04-01
... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS VALUE ENGINEERING... construction. Value Engineering (VE) analysis. The systematic process of reviewing and assessing a project by a...; and (3) Reducing the time to develop and deliver the project. Value Engineering (VE) Job Plan. A...
Supekar, Sarang D; Skerlos, Steven J
2017-10-03
Using a least-cost optimization framework, it is shown that unless emissions reductions beyond those already in place begin at the latest by 2025 (±2 years) for the U.S. automotive sector, and by 2026 (-3 years) for the U.S. electric sector, 2050 targets to achieve necessary within-sector preventative CO2 emissions reductions of 70% or more relative to 2010 will be infeasible. The analysis finds no evidence to justify delaying climate action in the name of reducing technological costs. Even without considering social and environmental damage costs, delaying aggressive climate action does not reduce CO2 abatement costs even under the most optimistic trajectories for improvements in fuel efficiencies, demand, and technology costs in the U.S. auto and electric sectors. In fact, the abatement cost for both sectors is found to increase sharply with every year of delay beyond 2020. When further considering reasonable limits to technology turnover, retirements, and new capacity additions, these costs would be higher, and the feasible time frame for initiating successful climate action on the 70% by 2050 target would be shorter, perhaps having passed already. The analysis also reveals that optimistic business-as-usual scenarios in the U.S. will, conservatively, release 79-108 billion metric tons of CO2. This could represent up to 13% of humanity's remaining carbon budget through 2050.
Lunar Polar Illumination for Power Analysis
NASA Technical Reports Server (NTRS)
Fincannon, James
2008-01-01
This paper presents illumination analyses using the latest Earth-based radar digital elevation model (DEM) of the lunar south pole and an independently developed analytical tool. These results enable the optimum sizing of solar/energy storage lunar surface power systems since they quantify the timing and durations of illuminated and shadowed periods. Filtering and manual editing of the DEM based on comparisons with independent imagery were performed, and a reduced-resolution version of the DEM was produced to reduce the analysis time. A comparison of the DEM with lunar limb imagery was performed in order to validate the absolute heights over the polar latitude range, the accuracy of which affects the impact of long-range, shadow-casting terrain. Average illumination and energy storage duration maps of the south pole region are provided for the worst- and best-case lunar day using the reduced-resolution DEM. Average illumination fractions and energy storage durations are presented for candidate low-energy-storage-duration south pole sites. The best site identified using the reduced-resolution DEM required a 62 hr energy storage duration using a fast recharge power system. Solar and horizon terrain elevations as well as illumination fraction profiles are presented for the best identified site, and the data for both the reduced-resolution and high-resolution DEMs are compared. High-resolution maps for three low-energy-storage-duration areas are presented showing energy storage duration for the worst-case lunar day, surface height, and maximum absolute surface slope.
Wavelet decomposition based principal component analysis for face recognition using MATLAB
NASA Astrophysics Data System (ADS)
Sharma, Mahesh Kumar; Sharma, Shashikant; Leeprechanon, Nopbhorn; Ranjan, Aashish
2016-03-01
For the realization of face recognition systems, in static as well as real-time settings, algorithms such as principal component analysis, independent component analysis, linear discriminant analysis, neural networks, and genetic algorithms have been used for decades. This paper discusses a wavelet decomposition based principal component analysis approach for face recognition. Principal component analysis is chosen over other algorithms due to its relative simplicity, efficiency, and robustness. The term face recognition stands for identifying a person from his facial gestures; it resembles factor analysis in some sense, i.e., the extraction of the principal components of an image. Principal component analysis is subject to some drawbacks, mainly poor discriminatory power and a large computational load in finding eigenvectors. These drawbacks can be greatly reduced by combining wavelet transform decomposition for feature extraction with principal component analysis for pattern representation and classification, analyzing the facial images in both the spatial and frequency domains. From the experimental results, it is envisaged that this face recognition method achieves a significant percentage improvement in recognition rate as well as better computational efficiency.
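A hedged sketch of the wavelet-plus-PCA idea (the paper works in MATLAB; a Python equivalent with PyWavelets and scikit-learn is shown here): a 2-D discrete wavelet transform shrinks each face image to its approximation sub-band, and PCA of those coefficients gives compact eigenface-style descriptors. Random arrays stand in for a real face database:

```python
# 2-D DWT approximation coefficients -> PCA feature extraction.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
faces = rng.random((40, 64, 64))             # 40 placeholder 64x64 images

def wavelet_features(img, wavelet="haar", level=2):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    return coeffs[0].ravel()                 # low-frequency approximation band

X = np.array([wavelet_features(f) for f in faces])  # 40 x 256 vs 40 x 4096 raw
pca = PCA(n_components=20).fit(X)
features = pca.transform(X)                  # compact face descriptors
print(X.shape, "->", features.shape,
      f"variance kept: {pca.explained_variance_ratio_.sum():.2f}")
```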
Evaluation of pre-hospital transport time of stroke patients to thrombolytic treatment.
Simonsen, Sofie Amalie; Andresen, Morten; Michelsen, Lene; Viereck, Søren; Lippert, Freddy K; Iversen, Helle Klingenberg
2014-11-13
Effective treatment of stroke is time dependent. Pre-hospital management is an important link in reducing the time from the occurrence of stroke symptoms to effective treatment. The aim of this study was to evaluate the time used by emergency medical services (EMS) for stroke patients during a five-year period in order to identify potential delays and evaluate the reorganization of EMS in Copenhagen in 2009. We performed a retrospective analysis of ambulance records from stroke patients suitable for thrombolysis from 1 January 2006 to 7 July 2011. We noted the response time from dispatch of the ambulance to arrival at the scene, the on-scene time, and the transport time to the hospital; in total, the alarm-to-door time. In addition, we noted baseline characteristics. We reviewed 481 records (58% male, median age 66 years). The median (IQR) alarm-to-door time was 41 (33-52) minutes, of which 18 (12-24) minutes were spent on scene. Response time was reduced from the period before to the period after the reorganization (7 vs. 5 minutes, p < 0.001). In a linear multiple regression model, higher patient age and longer distance to the hospital correlated with significantly longer transportation time (p < 0.001). This study shows an unchanged alarm-to-door time of 41 minutes over a five-year period. Response time, but not total alarm-to-door time, was reduced during the five years. On-scene time constituted nearly half of the total alarm-to-door time and is thus a point of focus for improvement.
Mass Reduction: The Weighty Challenge for Exploration Space Flight
NASA Technical Reports Server (NTRS)
Kloeris, Vickie L.
2014-01-01
Meeting nutritional and acceptability requirements is critical for the food system for an exploration class space mission. However, this must be achieved within the constraints of available resources such as water, crew time, stowage volume, launch mass and power availability. Due to resource constraints, exploration class missions are not expected to have refrigerators or freezers for food storage, and current per-person food mass must be reduced to improve mission feasibility. The Packaged Food Mass Reduction Trade Study (Stoklosa, 2009) concluded that the mass of the current space food system can be effectively reduced by decreasing the water content of certain foods and offering nutrient-dense substitutes, such as meal replacement bars and beverages. Target nutrient ranges were established based on the nutritional content of the current breakfast and lunch meals in the ISS standard menu. A market survey of available commercial products produced no viable options for meal replacement bar or beverage products. New prototypes for both categories were formulated to meet target nutrient ranges. Samples of prototype products were packaged in high-barrier packaging currently used for ISS and underwent an accelerated shelf life study at 31 °C and 41 °C (50% RH) for 24 weeks. Samples were assessed at the following time points: initial, 6 weeks, 12 weeks, and 24 weeks. Testing at each time point included color, texture, water activity, acceptability, and hexanal analysis (for food bars only). Proof-of-concept prototypes demonstrated that meal replacement food bars and beverages can deliver a comparable macronutrient profile while reducing the overall mass when compared to the ISS standard menu. Suggested future work for meal replacement bars includes reformulation with ingredients that reduce hardness and browning to increase shelf life, micronutrient analysis and potential fortification, and sensory evaluation studies including satiety tests and menu fatigue. Water intake analysis: the water in thermostabilized foods is considered part of a crewmember's daily water intake, so extensive meal replacement would require further analyses to determine whether additional water provisioning would be required per crewmember, negating some of the mass savings.
A simple and reliable method reducing sulfate to sulfide for multiple sulfur isotope analysis.
Geng, Lei; Savarino, Joel; Savarino, Clara A; Caillon, Nicolas; Cartigny, Pierre; Hattori, Shohei; Ishino, Sakiko; Yoshida, Naohiro
2018-02-28
Precise analysis of the four sulfur isotopes of sulfate in geological and environmental samples provides the means to extract unique information in wide geological contexts. Reduction of sulfate to sulfide is the first step to access such information. The conventional reduction method suffers from a cumbersome distillation system, long reaction time, and a large volume of the reducing solution. We present a new and simple method enabling the processing of multiple samples at one time with a much reduced volume of reducing solution. One mL of reducing solution made of HI and NaH2PO2 was added to a septum glass tube with dry sulfate. The tube was heated at 124°C, and the produced H2S was purged with inert gas (He or N2) through gas-washing tubes and then collected in NaOH solution. The collected H2S was converted into Ag2S by adding AgNO3 solution, and the co-precipitated Ag2O was removed by adding a few drops of concentrated HNO3. Within 2-3 h, a 100% yield was observed for samples with 0.2-2.5 μmol Na2SO4. The reduction rate was much slower for BaSO4, and a complete reduction was not observed. International sulfur reference materials NBS-127, SO-5, and SO-6 were processed with this method, and the measured against accepted δ34S values yielded a linear regression line with a slope of 0.99 ± 0.01 and an R² value of 0.998. The new methodology is easy to handle and allows us to process multiple samples at a time. It has also demonstrated good reproducibility in terms of H2S yield and for further isotope analysis. It is thus a good alternative to the conventional manual method, especially when processing samples with a limited amount of sulfate available. © 2017 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons Ltd.
Diagnostics of wear in aeronautical systems
NASA Technical Reports Server (NTRS)
Wedeven, L. D.
1979-01-01
The use of appropriate diagnostic tools for aircraft oil-wetted components is reviewed, noting that they can reduce direct operating costs through reduced unscheduled maintenance, particularly in helicopter engine and transmission systems where bearing failures are a significant cost factor. Engine and transmission wear modes are described, and diagnostic methods for oil and wear particle analysis (the spectrometric oil analysis program, chip detectors, ferrography, in-line oil monitors, and radioactive isotope tagging) are discussed, noting that each is effective over a limited range of particle sizes but that they complement each other when used in parallel. Fine filtration can potentially increase time between overhauls, but reduces the effectiveness of conventional oil monitoring techniques, so alternative diagnostic techniques must be used. It is concluded that the development of a diagnostic system should be parallel with, and integral to, the development of a mechanical system.
Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly
2015-12-18
This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detection (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as to the soil properties. In addition, there are differences in the interpretation of the GC results, which affects the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate an analytical method for faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors and identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min), and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 min to 8 min with all the carbon fractions eluting. The method was successfully applied for fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
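A sketch of the screening step described above: a 2^(6-2) fractional factorial built from a full 2^4 design with generators E = ABC and F = BCD, a standard resolution-IV choice. The paper does not state its generators, so these, and the factor labels, are assumptions for illustration:

```python
# Build a 16-run, 6-factor two-level fractional factorial in coded units
# (-1 / +1): four base factors in full factorial, two generated columns.
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=4)))   # A, B, C, D
A, B, C, D = base.T
E, F = A * B * C, B * C * D                                   # generators
design = np.column_stack([A, B, C, D, E, F])

factors = ["injection volume", "injection temp", "oven program",
           "detector temp", "carrier flow", "solvent ratio"]
print(f"{design.shape[0]} runs screening {len(factors)} factors")
print(dict(zip(factors, design[0])))      # coded levels for the first run
```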
Direct stenting versus balloon predilation: Jury is still out.
Belardi, Jorge A; Albertal, Mariano
2017-08-01
Compared to balloon predilation, direct stenting (DS) shortens procedural time and reduces radiation and contrast exposure. A meta-analysis that included 7 studies comparing these 2 strategies revealed a lower adverse event rate with DS. Studies included in the present meta-analysis were mostly observational and utilized first-generation drug-eluting stents. Patient and lesion selection may explain these positive results. © 2017 Wiley Periodicals, Inc.
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
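A minimal sketch of the AR(1) time-lagged regression mentioned above: for one gene, regress expression at time t on expression at time t-1. The counts are simulated; a real analysis would first normalize the RNA-seq data:

```python
# Simulate and refit an AR(1) trajectory for a single gene's expression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
T = 12
expr = np.empty(T)
expr[0] = 5.0
for t in range(1, T):                      # AR(1): x_t = phi*x_{t-1} + c + noise
    expr[t] = 0.7 * expr[t - 1] + 1.5 + 0.3 * rng.standard_normal()

y, x = expr[1:], sm.add_constant(expr[:-1])
fit = sm.OLS(y, x).fit()
phi = fit.params[1]                        # estimated lag-1 coefficient
print(f"AR(1) coefficient ~ {phi:.2f}; p-value = {fit.pvalues[1]:.3f}")
```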
Expert Systems In Medical Studies - A New Twist
NASA Astrophysics Data System (ADS)
Slagle, James R.; Long, John M.; Wick, Michael R.; Matts, John P.; Leon, Arthur S.
1986-03-01
The use of experts to evaluate large amounts of trial data results in increasingly expensive and time-consuming research. We are investigating the role expert systems can play in reducing the time and expense of research projects. Current methods in large clinical studies for evaluating data are often crude and superficial. We have developed, for a large clinical trial, an expert system for the analysis of treadmill exercise ECG test results. In the cases we are studying, a patient is given a treadmill exercise ECG test once a year for five years. Pairs of these exercise tests are then evaluated by cardiologists to determine the condition of the patient's heart. The results of our system show great promise for the use of expert systems in reducing the time and expense of large clinical trials.
NASA Astrophysics Data System (ADS)
Szurgacz, Dawid
2018-01-01
The article discusses the basic functions of a powered roof support in a longwall unit. The support's function is to provide safety by protecting mine workings against uncontrolled rock falls. The research concerns measures to shorten the time of roof support shifting. The roof support is designed to transfer, under the hazard of rock mass tremors, dynamic loads caused by mining exploitation. The article presents preliminary results on reducing the time of unit advance in order to increase the extraction rate and thus reduce operating costs. Stand tests showed that the flow through the 3/2-way valve cartridges can be increased, and that the level of fluid flowing through the cartridges is adequate to control the individual actuators.
Intelligent traffic lights based on MATLAB
NASA Astrophysics Data System (ADS)
Nie, Ying
2018-04-01
In this paper, I describe a traffic light system and its shortcomings. Through analysis, MATLAB is used to transform camera photographs into digital signals, and road congestion is classified into three levels: heavy congestion, moderate congestion, and light congestion. Through MCU programming, different roads are then assigned different delay times. This method saves time and resources and thereby reduces road congestion.
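The classification step lends itself to a simple sketch. Below is a hedged Python illustration (the paper itself used MATLAB and MCU code, which are not reproduced here); the background-subtraction threshold, occupancy cutoffs, and green-time table are all invented for illustration.

```python
import numpy as np

def congestion_level(frame, background, cutoffs=(0.15, 0.40)):
    """Classify road congestion from a grayscale camera frame via background
    subtraction. The pixel threshold and occupancy cutoffs are invented."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    occupied = float(np.mean(diff > 30))    # fraction of pixels showing vehicles
    if occupied < cutoffs[0]:
        return "light congestion"
    if occupied < cutoffs[1]:
        return "moderate congestion"
    return "heavy congestion"

# Illustrative green-light durations: more congested roads get more time.
GREEN_SECONDS = {"light congestion": 15, "moderate congestion": 30,
                 "heavy congestion": 45}
```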
Analysis of the glow curve of SrB4O7:Dy compounds employing the GOT model
NASA Astrophysics Data System (ADS)
Ortega, F.; Molina, P.; Santiago, M.; Spano, F.; Lester, M.; Caselli, E.
2006-02-01
The glow curve of SrB4O7:Dy phosphors has been analysed with the general one trap (GOT) model. To solve the differential equation describing the GOT model, a novel algorithm has been employed, which significantly reduces the deconvolution time with respect to that required by usual integration algorithms, such as the Runge-Kutta method.
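For context, the GOT model reduces to a single ordinary differential equation for the trapped-carrier population n(T) under a linear heating ramp. The following Python sketch integrates a quasi-equilibrium one-trap equation with a standard stiff solver, the kind of baseline computation the paper's novel algorithm accelerates; the trap parameters and retrapping ratio are illustrative, not fitted values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

kB = 8.617e-5        # Boltzmann constant, eV/K
E, s = 1.0, 1e12     # trap depth (eV) and frequency factor (1/s); illustrative
N, R = 1e10, 0.1     # trap concentration and retrapping ratio; illustrative
beta = 1.0           # linear heating rate, K/s

def dn_dT(T, n):
    # Quasi-equilibrium one-trap kinetics: of the carriers thermally released
    # at rate p(T), a fraction n/(R*(N - n) + n) recombines radiatively.
    p = s * np.exp(-E / (kB * T))
    return -(p / beta) * n * n / (R * (N - n) + n)

T = np.linspace(300.0, 600.0, 1200)
sol = solve_ivp(dn_dT, (T[0], T[-1]), [0.9 * N], t_eval=T, method="LSODA")
intensity = -dn_dT(T, sol.y[0])   # glow intensity I(T) ~ -dn/dT
```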
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlberg, Kevin Thomas; Drohmann, Martin; Tuminaro, Raymond S.
2014-10-01
Model reduction for dynamical systems is a promising approach for reducing the computational cost of large-scale physics-based simulations to enable high-fidelity models to be used in many-query (e.g., Bayesian inference) and near-real-time (e.g., fast-turnaround simulation) contexts. While model reduction works well for specialized problems such as linear time-invariant systems, it is much more difficult to obtain accurate, stable, and efficient reduced-order models (ROMs) for systems with general nonlinearities. This report describes several advances that enable nonlinear reduced-order models (ROMs) to be deployed in a variety of time-critical settings. First, we present an error bound for the Gauss-Newton with Approximated Tensors (GNAT) nonlinear model reduction technique. This bound allows the state-space error for the GNAT method to be quantified when applied with the backward Euler time-integration scheme. Second, we present a methodology for preserving classical Lagrangian structure in nonlinear model reduction. This technique guarantees that important properties--such as energy conservation and symplectic time-evolution maps--are preserved when performing model reduction for models described by a Lagrangian formalism (e.g., molecular dynamics, structural dynamics). Third, we present a novel technique for decreasing the temporal complexity--defined as the number of Newton-like iterations performed over the course of the simulation--by exploiting time-domain data. Fourth, we describe a novel method for refining projection-based reduced-order models a posteriori using a goal-oriented framework similar to mesh-adaptive h-refinement in finite elements. The technique allows the ROM to generate arbitrarily accurate solutions, thereby providing the ROM with a 'failsafe' mechanism in the event of insufficient training data. Finally, we present the reduced-order model error surrogate (ROMES) method for statistically quantifying reduced-order-model errors. This enables ROMs to be rigorously incorporated in uncertainty-quantification settings, as the error model can be treated as a source of epistemic uncertainty. This work was completed as part of a Truman Fellowship appointment. We note that much additional work was performed as part of the Fellowship. One salient project is the development of the Trilinos-based model-reduction software module Razor, which is currently bundled with the Albany PDE code and allows nonlinear reduced-order models to be constructed for any application supported in Albany. Other important projects include the following: 1. ROMES-equipped ROMs for Bayesian inference: K. Carlberg, M. Drohmann, F. Lu (Lawrence Berkeley National Laboratory), M. Morzfeld (Lawrence Berkeley National Laboratory). 2. ROM-enabled Krylov-subspace recycling: K. Carlberg, V. Forstall (University of Maryland), P. Tsuji, R. Tuminaro. 3. A pseudo balanced POD method using only dual snapshots: K. Carlberg, M. Sarovar. 4. An analysis of discrete v. continuous optimality in nonlinear model reduction: K. Carlberg, M. Barone, H. Antil (George Mason University). Journal articles for these projects are in progress at the time of this writing.
An asymptotic induced numerical method for the convection-diffusion-reaction equation
NASA Technical Reports Server (NTRS)
Scroggs, Jeffrey S.; Sorensen, Danny C.
1988-01-01
A parallel algorithm for the efficient solution of a time-dependent reaction-convection-diffusion equation with a small parameter on the diffusion term is presented. The method is based on a domain decomposition that is dictated by singular perturbation analysis. The analysis is used to determine regions where certain reduced equations may be solved in place of the full equation. Parallelism is evident at two levels. Domain decomposition provides parallelism at the highest level, and within each domain there is ample opportunity to exploit parallelism. Run-time results demonstrate the viability of the method.
Compton suppression in BEGe detectors by digital pulse shape analysis.
Mi, Yu-Hao; Ma, Hao; Zeng, Zhi; Cheng, Jian-Ping; Li, Jun-Li; Zhang, Hui
2017-03-01
A new method of pulse shape discrimination (PSD) for BEGe detectors is developed to suppress the Compton continuum by digital pulse shape analysis (PSA), which helps reduce the Compton background level in gamma-ray spectrometry. A decision parameter related to the rise time of the pulse shape is presented. The method was verified by experiments using 60Co and 137Cs sources. The results indicated that the 60Co peak-to-Compton ratio and the Cs-peak to Co-Compton ratio could be improved by factors of more than two and three, respectively. Copyright © 2016 Elsevier Ltd. All rights reserved.
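A rise-time decision parameter of this kind can be sketched directly. The following Python fragment is a hedged stand-in for the paper's PSA parameter: the baseline window, 10%-90% levels, and cutoff are illustrative choices, not the published settings.

```python
import numpy as np

def rise_time(pulse, dt, lo=0.1, hi=0.9):
    """10%-90% rise time of a baseline-subtracted pulse; the baseline window
    and levels are illustrative choices."""
    p = pulse - np.median(pulse[:50])        # crude pre-trigger baseline
    amp = p.max()
    t_lo = int(np.argmax(p >= lo * amp))     # first crossing of the 10% level
    t_hi = int(np.argmax(p >= hi * amp))     # first crossing of the 90% level
    return (t_hi - t_lo) * dt

def is_compton_like(pulse, dt, cutoff=2e-7):
    # Slow-rising (multi-site) events are rejected; the cutoff would be
    # tuned on 60Co/137Cs calibration data.
    return rise_time(pulse, dt) > cutoff

t = np.arange(1000)
pulse = np.where(t > 100, 1.0 - np.exp(-(t - 100) / 40.0), 0.0)
print(rise_time(pulse, dt=1e-8))             # ~8.8e-7 s for this synthetic pulse
```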
Analysis of wheel rim - Material and manufacturing aspects
NASA Astrophysics Data System (ADS)
Misra, Sheelam; Singh, Abhiraaj; James, Eldhose
2018-05-01
The tire in an automobile is supported by the rim of the wheel, and its shape and dimensions should be adjusted to accommodate a specified tire. In this study, a car wheel rim belonging to the disc wheel category is considered. Design is an important industrial operation used to define and specify the quality of the product, and design and modelling reduce the risk of damage involved in the manufacturing process. The design of this wheel rim is performed in modelling software, after which the model is imported for analysis. The analysis software is used to calculate the forces, stresses, torques, and pressures acting on the rim of the wheel, and it reduces the time a human would spend on manual calculations. The analysis considers two different materials, structural steel and aluminium; both are analyzed and their performance is noted.
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeals were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
Time-frequency analysis of acoustic scattering from elastic objects
NASA Astrophysics Data System (ADS)
Yen, Nai-Chyuan; Dragonette, Louis R.; Numrich, Susan K.
1990-06-01
A time-frequency analysis of acoustic scattering from elastic objects was carried out using the time-frequency representation based on a modified version of the Wigner distribution function (WDF) algorithm. A simple and efficient processing algorithm was developed, which provides meaningful interpretation of the scattering physics. The time and frequency representation derived from the WDF algorithm was further reduced to a display which is a skeleton plot, called a vein diagram, that depicts the essential features of the form function. The physical parameters of the scatterer are then extracted from this diagram with the proper interpretation of the scattering phenomena. Several examples, based on data obtained from numerically simulated models and laboratory measurements for elastic spheres and shells, are used to illustrate the capability and proficiency of the algorithm.
NASA Astrophysics Data System (ADS)
Nasertdinova, A. D.; Bochkarev, V. V.
2017-11-01
Deep neural networks with a large number of parameters are a powerful tool for solving problems of pattern recognition, prediction, and classification. Nevertheless, overfitting remains a serious problem in the use of such networks. A method for solving the overfitting problem is proposed in this article. The method is based on reducing the number of independent parameters of a neural network model using principal component analysis, and it can be implemented using existing libraries for neural computing. The algorithm was tested on the recognition of handwritten symbols from the MNIST database, as well as on time-series prediction (series of the average monthly number of sunspots and series from the Lorenz system were used). It is shown that applying principal component analysis reduces the number of parameters of the neural network model while maintaining good results. The average error rate for the recognition of handwritten figures from the MNIST database was 1.12% (comparable to results obtained using deep learning methods), while the number of parameters of the neural network could be reduced by a factor of up to 130.
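One common reading of this approach is to compress the network input with PCA so that the first layer needs far fewer weights. The sketch below, in Python with scikit-learn, illustrates that variant on random stand-in data; the component count, layer size, and data are illustrative, and the paper's exact construction is not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

# Stand-in data shaped like MNIST (784 inputs, 10 classes); illustrative only.
rng = np.random.default_rng(0)
X = rng.random((1000, 784))
y = rng.integers(0, 10, size=1000)

pca = PCA(n_components=50).fit(X)            # keep 50 principal components
X_reduced = pca.transform(X)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=50)
clf.fit(X_reduced, y)

print(f"first-layer weights: {784 * 64} without PCA, {50 * 64} with PCA "
      f"({784 // 50}x fewer)")
```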
Scanlan, Justin Newton
2010-07-01
In recent times, much attention has been focused on the reduction of seclusion and restraint in psychiatric settings. This paper analyzes evidence available from evaluations of single seclusion and/or restraint reduction programmes. A total of 29 papers were included in the review. Seven key strategy types emerged from the analysis: (i) policy change/leadership; (ii) external review/debriefing; (iii) data use; (iv) training; (v) consumer/family involvement; (vi) increase in staff ratio/crisis response teams; and (vii) programme elements/changes. Outcomes indicate that a range of reduction programmes are successful in reducing the frequency and duration of seclusion and restraint use, while at the same time maintaining a safe environment. The development of new seclusion and restraint reduction programmes should include strong leadership from local management; external seclusion and restraint review committees or post-incident debriefing and analysis; broad-based staff training and programme changes at a local level. Behavioural and cognitive-behavioural programmes appear to be very useful in child and adolescent services. Further systematic research should be conducted to more fully understand which elements of successful programmes are the most powerful in reducing incidents of seclusion and restraint.
Flat-plate solar array project process development area: Process research of non-CZ silicon material
NASA Technical Reports Server (NTRS)
Campbell, R. B.
1986-01-01
Several different techniques to simultaneously diffuse the front and back junctions in dendritic web silicon were investigated. A successful simultaneous diffusion reduces the cost of the solar cell by reducing the number of processing steps, the amount of capital equipment, and the labor cost. The three techniques studied were: (1) simultaneous diffusion at standard temperatures and times using a tube-type diffusion furnace or a belt furnace; (2) diffusion using excimer laser drive-in; and (3) simultaneous diffusion at high temperature and short times using a pulse of high-intensity light as the heat source. The excimer laser drive-in and the high-temperature, short-time diffusion were both more successful than diffusion at standard temperatures and times. The three techniques are described in detail and a cost analysis of the more successful techniques is provided.
[Quenched fluorescein: a reference dye for instrument response function of TCSPC].
Pan, Hai-feng; Ding, Jing-xin; Liang, Rong-rong; Tao, Zhan-dong; Liu, Meng-wei; Zhang, San-jun; Xu, Jian-hua
2014-08-01
Measuring the instrument response function (IRF) and fitting by reconvolution algorithms are routine ways to improve time resolution in fluorescence lifetime measurements. Iodide ions were successfully used to quench the fluorescence of fluorescein in this study. By systematically adding saturated NaI water solution to a basic fluorescein solution, the lifetime of fluorescein was reduced from 4 ns to 24 ps. The quenched lifetime of fluorescein obtained from the analysis of time-correlated single photon counting (TCSPC) measurements agrees well with that from femtosecond frequency up-conversion measurements. In time-resolved excitation spectra measurements, the IRF must be measured at various detection wavelengths provided scattering materials are used. This approach not only reduces the complexity of IRF measurement but also avoids the color effect present in the system. It should have wide applications in time-resolved fluorescence spectroscopy and fluorescence lifetime imaging.
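The reconvolution idea is easy to illustrate: the measured decay is modeled as the IRF convolved with an exponential, and the lifetime is a fit parameter. The Python sketch below uses a synthetic Gaussian IRF and Poisson noise; the channel width, IRF shape, and lifetimes are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

dt = 0.004                                   # ns per TCSPC channel; illustrative
t = np.arange(4096) * dt
irf = np.exp(-0.5 * ((t - 1.0) / 0.02) ** 2) # synthetic Gaussian stand-in IRF

def model(t, amp, tau):
    # Measured decay = IRF convolved with an exponential of lifetime tau.
    conv = np.convolve(irf, np.exp(-t / tau))[: t.size] * dt
    return amp * conv

data = np.random.poisson(1000 * model(t, 1.0, 4.0)) / 1000.0   # shot noise
popt, _ = curve_fit(model, t, data, p0=(1.0, 2.0))
print(f"fitted lifetime: {popt[1]:.2f} ns")  # recovers ~4 ns
```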
Smart Extraction and Analysis System for Clinical Research.
Afzal, Muhammad; Hussain, Maqbool; Khan, Wajahat Ali; Ali, Taqdir; Jamshed, Arif; Lee, Sungyoung
2017-05-01
With the increasing use of electronic health records (EHRs), there is a growing need to expand the utilization of EHR data to support clinical research. The key challenge in achieving this goal is the unavailability of smart systems and methods to overcome the issue of data preparation, structuring, and sharing for smooth clinical research. We developed a robust analysis system called the smart extraction and analysis system (SEAS) that consists of two subsystems: (1) the information extraction system (IES), for extracting information from clinical documents, and (2) the survival analysis system (SAS), for a descriptive and predictive analysis to compile the survival statistics and predict the future chance of survivability. The IES subsystem is based on a novel permutation-based pattern recognition method that extracts information from unstructured clinical documents. Similarly, the SAS subsystem is based on a classification and regression tree (CART)-based prediction model for survival analysis. SEAS is evaluated and validated on a real-world case study of head and neck cancer. The overall information extraction accuracy of the system for semistructured text is recorded at 99%, while that for unstructured text is 97%. Furthermore, the automated, unstructured information extraction has reduced the average time spent on manual data entry by 75%, without compromising the accuracy of the system. Moreover, around 88% of patients are found in a terminal or dead state for the highest clinical stage of disease (level IV). Similarly, there is an ∼36% probability of a patient being alive if at least one of the lifestyle risk factors was positive. We presented our work on the development of SEAS to replace costly and time-consuming manual methods with smart automatic extraction of information and survival prediction methods. SEAS has reduced the time and energy of human resources spent unnecessarily on manual tasks.
Optimizing acupuncture treatment for dry eye syndrome: a systematic review.
Kim, Bong Hyun; Kim, Min Hee; Kang, Se Hyun; Nam, Hae Jeong
2018-05-03
In a former meta-analysis, acupuncture was considered a potentially effective treatment for dry eye syndrome (DES), but there was heterogeneity among the outcomes. We updated the meta-analysis and conducted subgroup analyses to reduce the heterogeneity and to suggest the most effective acupuncture method based on clinical trials. We searched for randomized controlled trials (RCTs) in 10 databases (MEDLINE, EMBASE, CENTRAL, AMED, SCOPUS, CNKI, Wanfang database, Oriental Medicine Advanced Searching Integrated System (OASIS), Koreamed, J-stage) and searched by hand to compare the effects of acupuncture and artificial tears (AT). We also conducted subgroup analyses by (1) method of intervention (acupuncture only or acupuncture plus AT), (2) intervention frequency (less than 3 times a week or more than 3 times a week), (3) period of treatment (less than 4 weeks or more than 4 weeks), and (4) acupoints (BL1, BL2, ST1, ST2, TE23, Ex-HN5). The Bucher method was used for subgroup comparisons. Nineteen studies with 1126 patients were included. Significant improvements on the Schirmer test (weighted mean difference [WMD], 2.14; 95% confidence interval [CI], 0.93 to 3.34; p = 0.0005) and break-up time (BUT) (WMD, 0.98; 95% CI, 0.79 to 1.18; p < 0.00001) were reported. In the subgroup analysis, acupuncture plus AT had a weaker effect on BUT but a stronger effect on the Schirmer test and a better overall effect than acupuncture alone. For treatment duration, treatment longer than 1 month was more effective than shorter treatment. With regard to treatment frequency, treatment less than three times a week was more effective than more frequent treatment. In the acupoint analysis, acupuncture treatment including the BL2 and ST1 acupoints was less effective than treatment that did not include them. None of these factors reduced the heterogeneity. Acupuncture was more effective than AT in treating DES but showed high heterogeneity, and intervention differences did not explain the heterogeneity.
Analysis of Site Position Time Series Derived From Space Geodetic Solutions
NASA Astrophysics Data System (ADS)
Angermann, D.; Meisel, B.; Kruegel, M.; Tesmer, V.; Miller, R.; Drewes, H.
2003-12-01
This presentation deals with the analysis of station coordinate time series obtained from VLBI, SLR, GPS and DORIS solutions. We also present time series for the origin and scale derived from these solutions and discuss their contribution to the realization of the terrestrial reference frame. For these investigations we used SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS and DORIS time series were obtained from weekly station coordinate solutions provided by the IGS and by the joint DORIS analysis center (IGN-JPL). We analysed the time series with respect to various aspects, such as non-linear motions, periodic signals and systematic differences (biases). A major focus is on a comparison of the results at co-location sites in order to identify technique- and/or solution-related problems. This may also help to separate and quantify possible effects, and to understand the origin of still existing discrepancies. Technique-related systematic effects (biases) should be reduced to the highest possible extent before the space geodetic solutions are used for a geophysical interpretation of seasonal signals in site position time series.
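A typical first step in such a coordinate time-series analysis is a least-squares fit of offset, trend, and annual plus semiannual terms, with periodic signals and biases then studied in the residuals. Below is a minimal Python sketch on simulated weekly solutions; the amplitudes and noise levels are invented.

```python
import numpy as np

def fit_seasonal(t_years, coord_mm):
    """Least-squares fit of offset, trend, and annual + semiannual terms,
    a common first step before studying residual signals and biases."""
    w = 2 * np.pi
    A = np.column_stack([
        np.ones_like(t_years), t_years,
        np.cos(w * t_years), np.sin(w * t_years),          # annual term
        np.cos(2 * w * t_years), np.sin(2 * w * t_years),  # semiannual term
    ])
    coeffs, *_ = np.linalg.lstsq(A, coord_mm, rcond=None)
    return coeffs, coord_mm - A @ coeffs

# Simulated weekly solutions over 8 years: trend + 3 mm annual signal + noise.
t = np.arange(0, 8, 1 / 52)
series = 1.5 * t + 3.0 * np.sin(2 * np.pi * t) + np.random.normal(0, 1, t.size)
coeffs, residuals = fit_seasonal(t, series)
```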
Wang, Zhi-Quan; Zhang, Rui; Zhang, Peng-Pai; Liu, Xiao-Hong; Sun, Jian; Wang, Jun; Feng, Xiang-Fei; Lu, Qiu-Fen; Li, Yi-Gang
2015-04-01
Warfarin is still the most widely used oral anticoagulant for thromboembolic diseases, despite the recently introduced novel anticoagulants. However, difficulty in maintaining a stable dose within the therapeutic range, and the subsequent serious adverse effects, have markedly limited its use in clinical practice. A pharmacogenetics-based warfarin dosing algorithm is a recently developed strategy for predicting the initial and maintenance doses of warfarin. However, whether this algorithm is superior to the conventional clinically guided dosing algorithm remains controversial. We compared pharmacogenetics-based and clinically guided dosing algorithms in an updated meta-analysis. We searched OVID MEDLINE, EMBASE, and the Cochrane Library for relevant citations. The primary outcome was the percentage of time in the therapeutic range. The secondary outcomes were time to stable therapeutic dose and the risks of adverse events, including all-cause mortality, thromboembolic events, total bleedings, and major bleedings. Eleven randomized controlled trials with 2639 participants were included. Our pooled estimates indicated that the pharmacogenetics-based dosing algorithm did not improve the percentage of time in the therapeutic range [weighted mean difference, 4.26; 95% confidence interval (CI), -0.50 to 9.01; P = 0.08], but it significantly shortened the time to stable therapeutic dose (weighted mean difference, -8.67; 95% CI, -11.86 to -5.49; P < 0.00001). Additionally, the pharmacogenetics-based algorithm significantly reduced the risk of major bleedings (odds ratio, 0.48; 95% CI, 0.23 to 0.98; P = 0.04), but it did not reduce the risks of all-cause mortality, total bleedings, or thromboembolic events. Our results suggest that a pharmacogenetics-based warfarin dosing algorithm significantly improves the efficiency of International Normalized Ratio correction and reduces the risk of major hemorrhage.
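For readers unfamiliar with the pooling step, a fixed-effect inverse-variance pooled weighted mean difference can be computed in a few lines. The Python sketch below recovers standard errors from 95% CIs; the three "studies" are hypothetical numbers, not data from this meta-analysis.

```python
import numpy as np

def pooled_wmd(md, ci_low, ci_high):
    """Fixed-effect inverse-variance pooling of weighted mean differences,
    with standard errors recovered from the 95% CIs."""
    md = np.asarray(md, float)
    se = (np.asarray(ci_high, float) - np.asarray(ci_low, float)) / (2 * 1.96)
    w = 1.0 / se ** 2
    est = np.sum(w * md) / np.sum(w)
    se_p = np.sqrt(1.0 / np.sum(w))
    return est, (est - 1.96 * se_p, est + 1.96 * se_p)

# Three hypothetical studies (not data from this meta-analysis):
est, ci = pooled_wmd([5.1, 2.0, 6.3], [-1.0, -2.5, 1.2], [11.2, 6.5, 11.4])
print(f"pooled WMD {est:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```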
School start times and teenage driver motor vehicle crashes.
Foss, Robert D; Smith, Richard L; O'Brien, Natalie P
2018-04-26
Shifting school start times to 8:30 am or later has been found to improve academic performance and reduce behavior problems. Limited research suggests this may also reduce adolescent driver motor vehicle crashes. A change in the school start time from 7:30 am to 8:45 am for all public high schools in one North Carolina county presented the opportunity to address this question with greater methodologic rigor. We conducted ARIMA interrupted time-series analyses to examine motor vehicle crash rates of high school age drivers in the intervention county and 3 similar comparison counties with comparable urban-rural population distribution. To focus on crashes most likely to be affected, we limited analysis to crashes involving 16- & 17-year-old drivers occurring on days when school was in session. In the intervention county, there was a 14% downward shift in the time-series following the 75-minute delay in school start times (p = .076). There was no change approaching statistical significance in any of the other three counties. Further analysis indicated marked, statistically significant shifts in hourly crash rates in the intervention county, reflecting effects of the change in school start time on young driver exposure. Crashes from 7 to 7:59 am decreased sharply (-25%, p = .008), but increased similarly from 8 to 8:59 am (21%, p = .004). Crashes from 2 to 2:59 pm declined dramatically (-48%, p < .001), then increased to a lesser degree from 3 to 3:59 pm (32%, p = .024) and non-significantly from 4 to 4:59 pm (19%, p = .102). There was no meaningful change in early morning or nighttime crashes, when drowsiness-induced crashes might have been expected to be most common. The small decrease in crashes among high school age drivers following the shift in school start time is consistent with the findings of other studies of teen driver crashes and school start times. All these studies, including the present one, have limitations, but the similar findings suggest that crashes and school start times are indeed related, with earlier start times equating to more crashes. Later high school start times (>8:30 am) appear to be associated with lower adolescent driver crash rates, but additional research is needed to confirm this and to identify the mechanism by which this occurs (reduced drowsiness or reduced exposure). Copyright © 2018 Elsevier Ltd. All rights reserved.
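An interrupted time-series model of this kind can be sketched with a step (intervention) regressor in an ARIMA model. The Python fragment below uses statsmodels on simulated monthly counts with a built-in 14% level shift; the counts, change point, and ARIMA order are illustrative, not the study's specification.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulated monthly crash counts with a 14% level shift after the schedule
# change (month 84 of 120); data and model order are illustrative.
rng = np.random.default_rng(1)
n, change = 120, 84
crashes = rng.poisson(50, n).astype(float)
crashes[change:] *= 0.86

step = np.zeros(n)
step[change:] = 1.0                 # 0 before the intervention, 1 after
fit = ARIMA(crashes, exog=step, order=(1, 0, 0)).fit()
print(fit.params)                   # the exog coefficient estimates the shift
```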
NASA Astrophysics Data System (ADS)
Anggraini, N.
2017-02-01
This research aims to reduce destructive behavior, such as throwing learning materials, in an autistic student by using the correctional “NO!” approach at the CANDA educational institution, Surakarta. The research uses the Single Subject Research (SSR) method with an A-B design, i.e., baseline and intervention. The subject is one autistic student of the CANDA educational institution, named G.A.P. Data were collected by recording events during direct observation throughout the baseline and intervention phases. Data were analyzed by simple descriptive statistics and are displayed in graphical form. Based on the results of the data analysis, it could be concluded that destructive behavior such as throwing learning materials was significantly reduced after the intervention. Based on these results, the correctional “NO!” approach can be used by teachers or therapists to reduce destructive behavior in autistic students.
Medication waste reduction in pediatric pharmacy batch processes.
Toerper, Matthew F; Veltri, Michael A; Hamrock, Eric; Mollenkopf, Nicole L; Holt, Kristen; Levin, Scott
2014-04-01
To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. A pre and post intervention and simulation analysis was conducted during 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules outlining general characteristics that have an impact on the likelihood for waste. Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste.
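The waste-detection idea, inferring from time stamps which prepared doses were discontinued before administration, can be sketched as follows. This Python fragment is a simplified stand-in for the study's algorithm; the batch times and example orders are invented.

```python
from datetime import datetime, timedelta

def wasted_doses(orders, batch_times):
    """Count doses that were prepared at a batch run but discontinued before
    administration, a simplified stand-in for the study's algorithm.
    `orders` is a list of (admin_time, discontinue_time or None) tuples."""
    waste = 0
    for admin, stop in orders:
        # The batch that would have prepared this dose: the latest batch
        # run at or before the scheduled administration time.
        prep = max((b for b in batch_times if b <= admin), default=None)
        if prep is not None and stop is not None and prep < stop <= admin:
            waste += 1      # prepared, then discontinued: the dose is wasted
    return waste

day = datetime(2014, 1, 1)
one_batch = [day + timedelta(hours=6)]
three_batches = [day + timedelta(hours=h) for h in (6, 14, 22)]
orders = [(day + timedelta(hours=15), day + timedelta(hours=13)),  # stopped 1 pm
          (day + timedelta(hours=15), None)]                       # administered
print(wasted_doses(orders, one_batch), wasted_doses(orders, three_batches))
```

With one batch per day, the 6 am run prepares a dose that is later discontinued (wasted); with three batches, the 2 pm run already sees the discontinuation, illustrating the just-in-time effect described above.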
Time-series analysis of foreign exchange rates using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Inoue, Masayoshi
2013-08-01
Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in foreign exchange rates, in particular, the dollar-yen rate. The time-dependent pattern entropy of the dollar-yen rate was found to be high in the following periods: before and after the turning points of the yen from strong to weak or from weak to strong, and the period after the Lehman shock.
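The method can be sketched compactly: binarize daily changes, then compute the Shannon entropy of fixed-length symbol patterns in a sliding window. The Python sketch below uses a random-walk stand-in for the exchange-rate series; the word length and window size are illustrative, not the paper's settings.

```python
import numpy as np

def pattern_entropy(series, word=4, window=100):
    """Time-dependent pattern entropy: binarize daily changes (1 = up, 0 = down)
    and compute the Shannon entropy of `word`-length symbol patterns within a
    sliding window. Word length and window size are illustrative."""
    symbols = (np.diff(series) > 0).astype(int)
    entropies = []
    for start in range(len(symbols) - window + 1):
        w = symbols[start:start + window]
        patterns = np.array([w[i:i + word] for i in range(window - word + 1)])
        _, counts = np.unique(patterns, axis=0, return_counts=True)
        p = counts / counts.sum()
        entropies.append(float(-np.sum(p * np.log2(p))))
    return np.array(entropies)

# Random-walk stand-in for a daily exchange-rate series:
rates = 100 + np.cumsum(np.random.normal(0, 0.5, 600))
H = pattern_entropy(rates)          # one entropy value per window position
```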
Analysis of SSEM Sensor Data Using BEAM
NASA Technical Reports Server (NTRS)
Zak, Michail; Park, Han; James, Mark
2004-01-01
A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO-20827), Vol. 26, No. 9 (September 2002), page 32 and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO-21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
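The DIAD idea, comparing autoregressive coefficients fitted on-line against coefficients learned from nominal training data, can be illustrated with a least-squares AR fit. The sketch below is a hedged approximation, not BEAM's implementation; the AR order, signals, and scoring are all illustrative.

```python
import numpy as np

def ar_coeffs(x, order=4):
    """Least-squares fit of an AR model x[t] = a_1 x[t-1] + ... + a_p x[t-p]."""
    X = np.column_stack([x[order - 1 - i: len(x) - 1 - i] for i in range(order)])
    a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return a

def diad_like_score(window, nominal, order=4):
    # Distance between AR coefficients of the current window and those
    # learned from nominal training data; a large value flags an anomaly.
    return float(np.linalg.norm(ar_coeffs(window, order) - nominal))

train = np.sin(np.linspace(0, 60, 2000)) + np.random.normal(0, 0.05, 2000)
nominal = ar_coeffs(train)
test = np.sin(np.linspace(60, 66, 200)) + np.random.normal(0, 0.5, 200)
print(diad_like_score(test, nominal))   # larger deviation -> larger score
```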
The use of steam explosion to increase the nutrition available from rice straw.
Li, Bin; Chen, Kunjie; Gao, Xiang; Zhao, Chao; Shao, Qianjun; Sun, Qian; Li, Hua
2015-01-01
In the present study, rice straw was pretreated using the steam-explosion (ST) technique to improve the enzymatic hydrolysis of potential reducing sugars for feed utilization. Response surface methodology based on a central composite design was used to optimize the effects of steam pressure, pressure retention time, and straw moisture content on the yield of reducing sugar. All the investigated variables had significant effects (P < 0.001) on the reducing sugar yield. The optimum yield of 30.86% was obtained under the following pretreatment conditions: steam pressure, 1.54 MPa; pressure retention time, 140.5 s; and straw moisture content, 41.6%. The yield after thermal treatment under the same conditions was approximately 16%. Infrared (IR) analysis showed a decrease in the cellulose IR crystallization index. ST noticeably increases reducing sugars in rice straw, and this technique may also be applicable to other cellulose/lignin sources of biomass. © 2014 International Union of Biochemistry and Molecular Biology, Inc.
Data analysis and detection methods for on-line health monitoring of bridge structures
DOT National Transportation Integrated Search
2002-06-01
Developing an efficient structural health monitoring (SHM) technique is important for reducing potential hazards posed to the public by damaged civil structures. The ultimate goal of applying SHM is to detect, localize, and quantify in real time the...
JPRS Report, Science & Technology Europe
1988-06-09
Vico, whose potato chips are known to everyone, the Glucoprocesseur is nevertheless capable of reducing sample analysis times by 75 percent—a test... faltering for reasons that are more political than technical, actually already has its biosensors. The Lyonnaise des Eaux company and the Compagnie
Overset Grid Methods Applied to Nonlinear Potential Flows
NASA Technical Reports Server (NTRS)
Holst, Terry; Kwak, Dochan (Technical Monitor)
2000-01-01
The objectives of this viewgraph presentation are to develop a Chimera-based potential-flow methodology compatible with OVERFLOW and the OVERFLOW infrastructure, creating options for an advanced problem-solving environment, and to significantly reduce turnaround time for aerodynamic analysis and design (primarily at cruise conditions).
Efficient algorithms for single-axis attitude estimation
NASA Technical Reports Server (NTRS)
Shuster, M. D.
1981-01-01
These computationally efficient algorithms determine attitude from measurements of arc lengths and dihedral angles. The dependence of the algorithms on the solution of trigonometric equations is reduced. Both single-time and batch estimators are presented, along with a covariance analysis of each algorithm.
1980-06-01
Dropping the time-lag terms, the equations for X_u, X_x, and X_y reduce to linear algebraic equations; hence, in the quasistatic case the quasistatic variables are described not by differential equations but by linear algebraic equations, and the solution for x_0 then follows directly. (Report contents include: matrices for the two-bladed rotor; 7. Linear System Analysis; 7.1 State Variable Form; 7.2 Constant Coefficient System; 7.2.1 Eigen-analysis.)
NASA Technical Reports Server (NTRS)
Peabody, Hume; Guerrero, Sergio; Hawk, John; Rodriguez, Juan; McDonald, Carson; Jackson, Cliff
2016-01-01
The Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) utilizes an existing 2.4 m diameter, Hubble-sized telescope donated from elsewhere in the federal government for near-infrared sky surveys and exoplanet searches to answer crucial questions about the universe and dark energy. The WFIRST design continues to increase in maturity, detail, and complexity with each design cycle leading to a Mission Concept Review and entrance to the Mission Formulation Phase. Each cycle has required a Structural-Thermal-Optical-Performance (STOP) analysis to ensure the design can meet the stringent pointing and stability requirements. As such, the models have also grown in size and complexity, leading to increased model run time. This paper addresses efforts to reduce the run time while still maintaining sufficient accuracy for STOP analyses. A technique was developed to identify slews between observing orientations that were sufficiently different to warrant recalculation of the environmental fluxes, reducing the total number of radiation calculation points. The inclusion of a cryocooler fluid loop in the model also forced smaller time-steps than desired, which greatly increased the overall run time. The analysis of this fluid model required mitigation to drive the run time down by solving portions of the model at different time scales. Lastly, investigations were made into the impact of the removal of small radiation couplings on run time and accuracy. Use of these techniques allowed the models to produce meaningful results within reasonable run times to meet project schedule deadlines.
Parsons, Brendon A; Marney, Luke C; Siegler, W Christopher; Hoggard, Jamin C; Wright, Bob W; Synovec, Robert E
2015-04-07
Comprehensive two-dimensional (2D) gas chromatography coupled with time-of-flight mass spectrometry (GC × GC-TOFMS) is a versatile instrumental platform capable of collecting highly informative, yet highly complex, chemical data for a variety of samples. Fisher-ratio (F-ratio) analysis applied to the supervised comparison of sample classes algorithmically reduces complex GC × GC-TOFMS data sets to find class distinguishing chemical features. F-ratio analysis, using a tile-based algorithm, significantly reduces the adverse effects of chromatographic misalignment and spurious covariance of the detected signal, enhancing the discovery of true positives while simultaneously reducing the likelihood of detecting false positives. Herein, we report a study using tile-based F-ratio analysis whereby four non-native analytes were spiked into diesel fuel at several concentrations ranging from 0 to 100 ppm. Spike level comparisons were performed in two regimes: comparing the spiked samples to the nonspiked fuel matrix and to each other at relative concentration factors of two. Redundant hits were algorithmically removed by refocusing the tiled results onto the original high resolution pixel level data. To objectively limit the tile-based F-ratio results to only features which are statistically likely to be true positives, we developed a combinatorial technique using null class comparisons, called null distribution analysis, by which we determined a statistically defensible F-ratio cutoff for the analysis of the hit list. After applying null distribution analysis, spiked analytes were reliably discovered at ∼1 to ∼10 ppm (∼5 to ∼50 pg using a 200:1 split), depending upon the degree of mass spectral selectivity and 2D chromatographic resolution, with minimal occurrence of false positives. To place the relevance of this work among other methods in this field, results are compared to those for pixel and peak table-based approaches.
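The core of the F-ratio computation is an analysis-of-variance ratio evaluated per tile. The Python sketch below shows one plausible formulation, tile-summed signals compared between two classes; the tile size and layout are illustrative, not the published algorithm's parameters.

```python
import numpy as np

def tile_signals(chrom, tile=(10, 10)):
    """Sum a 2D chromatogram within non-overlapping tiles (illustrative sizes)."""
    h = (chrom.shape[0] // tile[0]) * tile[0]
    w = (chrom.shape[1] // tile[1]) * tile[1]
    c = chrom[:h, :w].reshape(h // tile[0], tile[0], w // tile[1], tile[1])
    return c.sum(axis=(1, 3))

def fisher_ratio(a, b):
    """Between-class over pooled within-class variance of a tile's summed
    signal across the replicate runs of two sample classes."""
    grand = np.concatenate([a, b]).mean()
    between = len(a) * (a.mean() - grand) ** 2 + len(b) * (b.mean() - grand) ** 2
    within = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) \
             / (len(a) + len(b) - 2)
    return between / within

# Per-tile F-ratios for two classes with 4 replicate chromatograms each:
rng = np.random.default_rng(0)
a = np.stack([tile_signals(rng.random((200, 100))) for _ in range(4)])
b = np.stack([tile_signals(rng.random((200, 100))) for _ in range(4)])
F = np.array([[fisher_ratio(a[:, i, j], b[:, i, j])
               for j in range(a.shape[2])] for i in range(a.shape[1])])
```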
Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf
2016-06-01
This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.
Cost-Reduced M587 Electronic Time Fuze: Root Cause Analysis of July 1979 Early Bursts
1981-04-01
capacitor during gunfire, coupled with an intermittent wire bond (which opens during setback and then closes again), can defeat the initialization circuit... is calibrated, (c) the set time is checked out in fast time to verify that the setting has actually been achieved, and (d) the setter visually com... insensitive to supply voltage. One of the reasons for choosing the twin-T design was that it made it possible to preclude fail-fast failure modes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thappily, Praveen, E-mail: pravvmon@gmail.com, E-mail: shiiuvenus@gmail.com; Shiju, K., E-mail: pravvmon@gmail.com, E-mail: shiiuvenus@gmail.com
Green synthesis of silver nanoparticles was achieved by simple visible-light irradiation using Aloe barbadensis leaf extract as the reducing agent. UV-Vis spectroscopic analysis was used to confirm the successful formation of the nanoparticles. The effect of light irradiation time on the light absorption of the nanoparticles was investigated: up to 25 minutes of irradiation, the absorption increases linearly with time, after which it saturates. Finally, the time-absorption curve was fitted theoretically and a relation between the two was modeled with the help of simulation software.
Kruszewski, Kristen M; Nistico, Laura; Longwell, Mark J; Hynes, Matthew J; Maurer, Joshua A; Hall-Stoodley, Luanne; Gawalt, Ellen S
2013-05-01
Stainless steel 316L (SS316L) is a common material used in orthopedic implants. Bacterial colonization of the surface and subsequent biofilm development can lead to refractory infection of the implant. Since the greatest risk of infection occurs perioperatively, strategies that reduce bacterial adhesion during this time are important. As a strategy to limit bacterial adhesion and biofilm formation on SS316L, self-assembled monolayers (SAMs) were used to modify the SS316L surface. SAMs with long alkyl chains terminated with hydrophobic (-CH3) or hydrophilic (oligoethylene glycol) tail groups were used to form coatings and in an orthogonal approach, SAMs were used to immobilize gentamicin or vancomycin on SS316L for the first time to form an "active" antimicrobial coating to inhibit early biofilm development. Modified SS316L surfaces were characterized using surface infrared spectroscopy, contact angles, MALDI-TOF mass spectrometry and atomic force microscopy. The ability of SAM-modified SS316L to retard biofilm development by Staphylococcus aureus was functionally tested using confocal scanning laser microscopy with COMSTAT image analysis, scanning electron microscopy and colony forming unit analysis. Neither hydrophobic nor hydrophilic SAMs reduced biofilm development. However, gentamicin-linked and vancomycin-linked SAMs significantly reduced S. aureus biofilm formation for up to 24 and 48 h, respectively. Copyright © 2013 Elsevier B.V. All rights reserved.
Bleul, Tim; Rühl, Ralph; Bulashevska, Svetlana; Karakhanova, Svetlana; Werner, Jens; Bazhin, Alexandr V
2015-09-01
Pancreatic ductal adenocarcinoma (PDAC) represents one of the deadliest cancers in the world. All-trans retinoic acid (ATRA) is the major physiologically active form of vitamin A, regulating the expression of many genes. Disturbances of vitamin A metabolism are prevalent in some cancer cells. The main aim of this work was to investigate in depth the components of retinoid signaling in PDAC compared with the normal pancreas, and to demonstrate the clinical importance of retinoid receptor expression. Human tumor tissues obtained from PDAC patients and murine tumors from the orthotopic Panc02 model were used for the analysis of retinoids, using high-performance liquid chromatography mass spectrometry and real-time RT-PCR gene expression analysis. Survival probabilities in univariate analysis were estimated using the Kaplan-Meier method, and the Cox proportional hazards model was used for the multivariate analysis. In this work, we showed for the first time that the ATRA and all-trans retinol concentrations are reduced in PDAC tissue compared with their normal counterparts. The expression of RARα and β as well as RXRα and β is down-regulated in PDAC tissue. This reduced expression of retinoid receptors correlates with the expression of some markers of differentiation and epithelial-to-mesenchymal transition, as well as of cancer stem cell markers. Importantly, the expression of RARα and RXRβ is associated with better overall survival of PDAC patients. Thus, reduction of retinoids and their receptors is an important feature of PDAC and is associated with worse patient survival outcomes. © 2014 Wiley Periodicals, Inc.
Preliminary Benefits Assessment of Traffic Aware Strategic Aircrew Requests (TASAR)
NASA Technical Reports Server (NTRS)
Henderson, Jeff; Idris, Husni; Wing, David J.
2012-01-01
While en route, aircrews submit trajectory change requests to air traffic control (ATC) to better meet their objectives including reduced delays, reduced fuel burn, and passenger comfort. Aircrew requests are currently made with limited to no information on surrounding traffic. Consequently, these requests are uninformed about a key ATC objective, ensuring traffic separation, and therefore less likely to be accepted than requests informed by surrounding traffic and that avoids creating conflicts. This paper studies the benefits of providing aircrews with on-board decision support to generate optimized trajectory requests that are probed and cleared of known separation violations prior to issuing the request to ATC. These informed requests are referred to as traffic aware strategic aircrew requests (TASAR) and leverage traffic surveillance information available through Automatic Dependent Surveillance Broadcast (ADS-B) In capability. Preliminary fast-time simulation results show increased benefits with longer stage lengths since beneficial trajectory changes can be applied over a longer distance. Also, larger benefits were experienced between large hub airports as compared to other airport sizes. On average, an aircraft equipped with TASAR reduced its travel time by about one to four minutes per operation and fuel burn by about 50 to 550 lbs per operation depending on the objective of the aircrew (time, fuel, or weighted combination of time and fuel), class of airspace user, and aircraft type. These preliminary results are based on analysis of approximately one week of traffic in July 2012 and additional analysis is planned on a larger data set to confirm these initial findings.
Optimization study for Pb(II) and COD sequestration by consortium of sulphate-reducing bacteria
NASA Astrophysics Data System (ADS)
Verma, Anamika; Bishnoi, Narsi R.; Gupta, Asha
2017-09-01
In this study, the initial minimum inhibitory concentration (MIC) of Pb(II) ions was analysed to determine the optimum concentration of Pb(II) ions at which the growth of the sulphate-reducing consortium (SRC) was maximum; the MIC for the SRC was found to be 80 ppm of Pb(II) ions. The influence of electron donors such as lactose, sucrose, glucose and sodium lactate was examined to identify the best carbon source for the growth and activity of sulphate-reducing bacteria; sodium lactate was found to be the prime carbon source for the SRC. Optimization of the operating parameters was then performed using the Box-Behnken design model of response surface methodology to explore the effectiveness of three independent operating variables, namely pH (5.0-9.0), temperature (32-42 °C) and time (5.0-9.0 days), on the dependent variables, i.e. protein content, precipitation of Pb(II) ions, and removal of COD by SRC biomass. Maximum removal of COD and Pb(II) was observed to be 91 and 98%, respectively, at pH 7.0, temperature 37 °C and incubation time 7 days. According to the response surface analysis and analysis of variance, the experimental data fitted the quadratic model well, and the interactive influence of pH, temperature and time on Pb(II) and COD removal was highly significant. A high regression coefficient between the variables and the response (r² = 0.9974) corroborates the excellent description of the experimental data by the second-order polynomial regression model. SEM and Fourier transform infrared analyses were performed to investigate the morphology of the PbS precipitates, the sorption mechanism and the functional groups involved in Pb(II) binding in metal-free and metal-loaded SRC biomass.
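The quadratic response-surface fit itself is a standard second-order polynomial regression. The Python sketch below fits such a model over the three factors on a Box-Behnken-style design; the design points and removal values are placeholders, not the study's data.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Box-Behnken-style design over pH (5-9), temperature (32-42 C), time (5-9 d).
# Design points and removal values below are placeholders, not the study data.
X = np.array([[5, 32, 7], [9, 32, 7], [5, 42, 7], [9, 42, 7],
              [5, 37, 5], [9, 37, 5], [5, 37, 9], [9, 37, 9],
              [7, 32, 5], [7, 42, 5], [7, 32, 9], [7, 42, 9], [7, 37, 7]])
y = np.array([60, 68, 65, 72, 58, 66, 70, 75, 62, 64, 79, 83, 98])

quad = PolynomialFeatures(degree=2, include_bias=False)   # full quadratic model
model = LinearRegression().fit(quad.fit_transform(X), y)
print("r^2 =", model.score(quad.transform(X), y))
```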
Huang, Xiaoquan; Chen, Shiyao; Zhao, Hetong; Zeng, Xiaoqing; Lian, Jingjing; Tseng, Yujen; Chen, Jie
2017-03-01
The efficacy of transoral incisionless fundoplication (TIF) performed with the EsophyX device (Redmond, Washington, USA) and its long-term outcomes in gastroesophageal reflux disease (GERD) are debated. We therefore performed a systematic review with meta-analysis of studies evaluating the role of TIF in GERD. A systematic search of EMBASE, SCOPUS, PubMed, and the Cochrane Library Central was performed. All original studies reporting outcomes in GERD patients who underwent TIF were identified. Only randomized controlled trials (RCTs) evaluating the efficacy of TIF and prospective observational studies reporting outcomes after TIF were included. A total of 18 studies (963 patients) published between 2007 and 2015 were identified, including five RCTs and 13 prospective observational studies. The pooled relative risk of response rate to TIF versus PPIs/sham was 2.44 (95% CI 1.25-4.79, p = 0.0009) in RCTs in the intention-to-treat analysis. The total number of refluxes was reduced after TIF compared with the PPIs/sham group. The esophageal acid exposure time and acid reflux episodes after TIF were not significantly improved. Proton-pump inhibitor (PPI) usage increased with time, and most of the patients resumed PPI treatment at a reduced dosage during long-term follow-up. The total satisfaction rate after TIF was about 69.15% at 6 months. The incidence of severe adverse events, consisting of gastrointestinal perforation and bleeding, was 2.4%. TIF is an alternative intervention for controlling GERD-related symptoms with comparable short-term patient satisfaction. Long-term results showed decreased efficacy with time, and patients often resumed PPIs at reduced doses.
Numerical analysis of biomass torrefaction reactor with recirculation of heat carrier
NASA Astrophysics Data System (ADS)
Director, L. B.; Ivanin, O. A.; Sinelshchikov, V. A.
2018-01-01
In this paper, results of a numerical analysis of an energy-technological complex consisting of a gas-piston power plant, a torrefaction reactor with recirculation of the gaseous heat carrier, and a heat-recovery boiler are presented. Calculations of the reactor without and with recirculation of the heat carrier in the torrefaction zone, at different unloading frequencies of the torrefied biomass, were performed. It was shown that in the recirculation mode the power of the gas-piston power plant required to provide a given reactor productivity is reduced several times, and the consumption of fuel gas needed for combustion of volatile torrefaction products in the heat-recovery boiler is reduced by an order of magnitude.
FT-NIR: A Tool for Process Monitoring and More.
Martoccia, Domenico; Lutz, Holger; Cohen, Yvan; Jerphagnon, Thomas; Jenelten, Urban
2018-03-30
With ever-increasing pressure to optimize product quality, to reduce cost and to safely increase production output from existing assets, all combined with regular changes in terms of feedstock and operational targets, process monitoring with traditional instruments reaches its limits. One promising answer to these challenges is in-line, real time process analysis with spectroscopic instruments, and above all Fourier-Transform Near Infrared spectroscopy (FT-NIR). Its potential to afford decreased batch cycle times, higher yields, reduced rework and minimized batch variance is presented and application examples in the field of fine chemicals are given. We demonstrate that FT-NIR can be an efficient tool for improved process monitoring and optimization, effective process design and advanced process control.
NASA Astrophysics Data System (ADS)
DeForest, Craig; Seaton, Daniel B.; Darnell, John A.
2017-08-01
I present and demonstrate a new, general-purpose post-processing technique, "3D noise gating", that can reduce image noise by an order of magnitude or more without effective loss of spatial or temporal resolution in typical solar applications. Nearly all scientific images are, ultimately, limited by noise. Noise can be direct Poisson "shot noise" from photon counting effects, or introduced by other means such as detector read noise. Noise is typically represented as a random variable (perhaps with location- or image-dependent characteristics) that is sampled once per pixel or once per resolution element of an image sequence. Noise limits many aspects of image analysis, including photometry, spatiotemporal resolution, feature identification, morphology extraction, and background modeling and separation. Identifying and separating noise from image signal is difficult. The common practice of blurring in space and/or time works because most image "signal" is concentrated in the low Fourier components of an image, while noise is evenly distributed. Blurring in space and/or time attenuates the high spatial and temporal frequencies, reducing noise at the expense of also attenuating image detail. Noise gating exploits the same property -- "coherence" -- that we use to identify features in images, to separate image features from noise. Processing image sequences through 3D noise gating results in spectacular (more than 10x) improvements in signal-to-noise ratio, while not blurring bright, resolved features in either space or time. This improves most types of image analysis, including feature identification, time sequence extraction, absolute and relative photometry (including differential emission measure analysis), feature tracking, computer vision, correlation tracking, background modeling, cross-scale analysis, visual display/presentation, and image compression. I will introduce noise gating, describe the method, and show examples from several instruments (including SDO/AIA, SDO/HMI, STEREO/SECCHI, and GOES-R/SUVI) that explore the benefits and limits of the technique.
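A minimal version of the gating step can be sketched as follows: transform space-time patches to the Fourier domain, zero components below an estimated noise floor, and invert. This Python sketch is a crude illustration of the idea, not the author's algorithm; the patch size, median-based noise floor, and gating factor are invented, and a real implementation would apodize and blend overlapping patches.

```python
import numpy as np

def noise_gate_3d(cube, patch=12, factor=3.0):
    """Crude 3D noise gating: FFT each (t, y, x) patch, zero Fourier
    components below a noise floor estimated from the patch, and invert.
    Patch size and gating factor are invented; a real implementation
    apodizes and blends overlapping patches."""
    out = np.zeros(cube.shape)
    for t0 in range(0, cube.shape[0] - patch + 1, patch):
        for y0 in range(0, cube.shape[1] - patch + 1, patch):
            for x0 in range(0, cube.shape[2] - patch + 1, patch):
                sl = (slice(t0, t0 + patch), slice(y0, y0 + patch),
                      slice(x0, x0 + patch))
                spec = np.fft.fftn(cube[sl])
                mag = np.abs(spec)
                spec[mag < factor * np.median(mag)] = 0.0   # gate incoherent power
                out[sl] = np.fft.ifftn(spec).real
    return out

# Example: a noisy image sequence of 24 frames of 96x96 pixels.
noisy = np.random.normal(0, 1, (24, 96, 96))
clean = noise_gate_3d(noisy)
```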
Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations
Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.
2016-01-01
Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared with the SM-SPC and SM-MPC methods. RE with the MM-MPC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event on clean or dirty surfaces. PMID:27736999
Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.
Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite: a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite: a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite: a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2, respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared with single medium compositing on both clean and grime-coated materials. RE with the PSC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wallboard, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event on clean or dirty surfaces.
Evaluation of the annual Canadian biodosimetry network intercomparisons
Wilkins, Ruth C.; Beaton-Green, Lindsay A.; Lachapelle, Sylvie; Kutzner, Barbara C.; Ferrarotto, Catherine; Chauhan, Vinita; Marro, Leonora; Livingston, Gordon K.; Boulay Greene, Hillary; Flegal, Farrah N.
2015-01-01
Purpose: To evaluate the importance of annual intercomparisons for maintaining the capacity and capabilities of a well-established biodosimetry network, in conjunction with assessing efficient and effective analysis methods for emergency response. Materials and methods: Annual intercomparisons were conducted between laboratories in the Canadian National Biological Dosimetry Response Plan. Intercomparisons were performed over a six-year period and comprised the shipment of 10-12 irradiated, blinded blood samples for analysis by each of the participating laboratories. Dose estimates were determined by each laboratory using the dicentric chromosome assay (conventional and QuickScan scoring) and, where possible, the cytokinesis-block micronucleus (CBMN) assay. Dose estimates were returned to the lead laboratory for evaluation and comparison. Results: Individual laboratories performed comparably from year to year with only slight fluctuations in performance. Dose estimates using the dicentric chromosome assay were accurate about 80% of the time, and the QuickScan method for scoring the dicentric chromosome assay was shown to reduce the time of analysis without having a significant effect on the dose estimates. Although analysis with the CBMN assay was comparable to QuickScan scoring with respect to speed, the accuracy of the dose estimates was greatly reduced. Conclusions: Annual intercomparisons are necessary to maintain a network of laboratories for emergency response biodosimetry, as they instill confidence in the network's capabilities. PMID:25670072
Yamashita, Ken-Ichiro; Taniwaki, Yoshihide; Utsunomiya, Hidetsuna; Taniwaki, Takayuki
2014-01-01
Impairment of orientation for time (OT) is a characteristic symptom of Alzheimer disease (AD). However, the brain regions underlying OT remain to be elucidated. Using single photon emission computed tomography (SPECT), we examined the brain regions exhibiting hypoperfusion that were associated with OT. We compared regional cerebral blood flow (rCBF) differences between AD and amnesic mild cognitive impairment (aMCI) or normal subjects using 3-dimensional stereotactic surface projection (3D-SSP) analysis. AD patients were divided into OT good and poor groups according to their mean OT scores, and rCBF was then compared between the groups to elucidate OT-specific brain areas. 3D-SSP analysis showed reduced rCBF in the left superior parietal lobule (SPL) and bilateral inferior parietal lobule (IPL) in AD patients. In the poor OT group, 3D-SSP analysis revealed hypoperfusion in the bilateral SPL, IPL, posterior cingulate cortex (PCC), and precuneus. Among these areas, region-of-interest analysis revealed a significantly higher number of hypoperfused pixels in the left PCC in the OT poor AD group. Our SPECT study suggested that hypoperfusion in the left SPL and bilateral IPL was AD specific, and that reduced rCBF in the left PCC was specifically associated with OT. Copyright © 2014 by the American Society of Neuroimaging.
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional time-series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally (including to Poincaré plots with multiple clusters) and more consistently than the conventional measures, and can address questions regarding potential structure underlying the variability of a data set.
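The abstract describes the construction (return maps across multiple time delays) but not the exact TPV statistics, so the sketch below shows only that construction, together with the conventional SD1/SD2 descriptors for comparison; the synthetic R-R series is our stand-in for real data.

```python
import numpy as np

def poincare_points(x, tau=1):
    """Return-map points (x[n], x[n + tau]) for a 1-D time series."""
    x = np.asarray(x, dtype=float)
    return np.column_stack((x[:-tau], x[tau:]))

def sd1_sd2(x, tau=1):
    """Conventional Poincare descriptors: dispersion perpendicular to
    (SD1) and along (SD2) the line of identity."""
    p = poincare_points(x, tau)
    d = (p[:, 1] - p[:, 0]) / np.sqrt(2.0)  # perpendicular to identity line
    s = (p[:, 1] + p[:, 0]) / np.sqrt(2.0)  # along the identity line
    return d.std(ddof=1), s.std(ddof=1)

# Synthetic R-R intervals: slow modulation plus beat-to-beat noise.
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * np.sin(np.arange(600) / 10.0) + 0.01 * rng.standard_normal(600)

# Scanning several delays, as TPV does, exposes structure that the
# usual tau = 1 plot averages away.
for tau in (1, 2, 5, 10):
    sd1, sd2 = sd1_sd2(rr, tau)
    print(f"tau={tau:2d}  SD1={sd1:.4f}  SD2={sd2:.4f}")
```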
NASA Astrophysics Data System (ADS)
Ma, Yan; Yao, Jinxia; Gu, Chao; Chen, Yufeng; Yang, Yi; Zou, Lida
2017-05-01
With the formation of the electric big data environment, more and more big-data analyses are emerging. Complicated data analysis for equipment condition assessment involves many join operations, which are time-consuming. To save time, materialized views are often used: part of the common and critical join results is placed in external storage, avoiding frequent join operations. In this paper we propose methods for selecting and placing materialized views to reduce the query time for electric transmission and transformation equipment and to maximize the profit of service providers. In the selection method we design a way to compute the value of non-leaf nodes based on an MVPP structure chart. In the placement method we use relevance weights to place the selected materialized views, which helps reduce network transmission time. Our experiments show that the proposed selection and placement methods achieve high throughput and good optimization of query time for electric transmission and transformation equipment.
Adjoint-Based Methodology for Time-Dependent Optimization
NASA Technical Reports Server (NTRS)
Yamaleev, N. K.; Diskin, B.; Nielsen, E. J.
2008-01-01
This paper presents a discrete adjoint method for a broad class of time-dependent optimization problems. The time-dependent adjoint equations are derived in terms of the discrete residual of an arbitrary finite volume scheme which approximates unsteady conservation law equations. Although only the 2-D unsteady Euler equations are considered in the present analysis, this time-dependent adjoint method is applicable to the 3-D unsteady Reynolds-averaged Navier-Stokes equations with minor modifications. The discrete adjoint operators involving the derivatives of the discrete residual and the cost functional with respect to the flow variables are computed using a complex-variable approach, which provides discrete consistency and drastically reduces the implementation and debugging cycle. The implementation of the time-dependent adjoint method is validated by comparing the sensitivity derivative with that obtained by forward mode differentiation. Our numerical results show that O(10) optimization iterations of the steepest descent method are needed to reduce the objective functional by 3-6 orders of magnitude for test problems considered.
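The "complex-variable approach" used to build the adjoint operators is, at its core, the complex-step derivative approximation. A toy illustration (not the authors' flow solver) shows its appeal: there is no subtractive cancellation, so the step size can be made vanishingly small and the derivative is accurate to machine precision.

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-30):
    """df/dx via the complex step: f is evaluated once at x + ih and
    the derivative is read off the imaginary part. No differencing,
    hence no cancellation error, unlike finite differences."""
    return np.imag(f(x + 1j * h)) / h

f = lambda x: np.exp(x) * np.sin(x)            # toy residual component
x0 = 0.7
exact = np.exp(x0) * (np.sin(x0) + np.cos(x0))
print(complex_step_derivative(f, x0) - exact)  # ~1e-16, machine precision
```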
Immunophenotyping of posttraumatic neutrophils on a routine haematology analyser.
Groeneveld, Kathelijne Maaike; Heeres, Marjolein; Leenen, Loek Petrus Hendrikus; Huisman, Albert; Koenderman, Leo
2012-01-01
Flow cytometry markers have been proposed as useful predictors for the occurrence of posttraumatic inflammatory complications. However, the current need for a dedicated laboratory and the labour-intensive analytical procedures make these markers less suitable for clinical practice. We tested an approach to overcome these limitations. Neutrophils of healthy donors were incubated with antibodies commonly used in trauma research: CD11b (MAC-1), L-selectin (CD62L), FcγRIII (CD16), and FcγRII (CD32) in active form (MoPhab A27). Flow cytometric analysis was performed both on a FACSCalibur, a standard flow cytometer, and on a Cell-Dyn Sapphire, a routine haematology analyser. There was a high level of agreement between the two types of analysers, with 41% for FcγRIII, 80% for L-selectin, 98% for CD11b, and even 100% agreement for active FcγRII. Moreover, analysis on the routine haematology analyser was possible in less than a quarter of the time needed on the flow cytometer. Analysis of neutrophil phenotype on the Cell-Dyn Sapphire leads to the same conclusions as a standard flow cytometer. The markedly reduced analysis time and labour intensity constitute a step forward in the implementation of this type of analysis in clinical diagnostics in trauma research. Copyright © 2012 Kathelijne Maaike Groeneveld et al.
Finite element based electric motor design optimization
NASA Technical Reports Server (NTRS)
Campbell, C. Warren
1993-01-01
The purpose of this effort was to develop a finite element code for the analysis and design of permanent magnet electric motors. These motors would drive electromechanical actuators in advanced rocket engines. The actuators would control fuel valves and thrust vector control systems. Refurbishing the hydraulic systems of the Space Shuttle after each flight is costly and time-consuming. Electromechanical actuators could replace hydraulics, improve system reliability, and reduce downtime.
Todd E. Ristau; Susan L. Stout
2014-01-01
Assessment of regeneration can be time-consuming and costly. Often, foresters look for ways to minimize the cost of doing inventories. One potential method to reduce time required on a plot is use of percent cover data rather than seedling count data to determine stocking. Robust linear regression analysis was used in this report to predict seedling count data from...
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-20
The look-ahead dynamic simulation software system incorporates high-performance parallel computing technologies, significantly reduces the solution time for each transient simulation case, and brings dynamic simulation analysis into on-line applications to enable more transparency for better reliability and asset utilization. It takes a snapshot of the current power grid status, runs the system dynamic simulation in parallel, and outputs the transient response of the power system in real time.
Analysis of BigFoot HDC SymCap experiment N161205 on NIF
NASA Astrophysics Data System (ADS)
Dittrich, T. R.; Baker, K. L.; Thomas, C. A.; Berzak Hopkins, L. F.; Harte, J. A.; Zimmerman, G. B.; Woods, D. T.; Kritcher, A. L.; Ho, D. D.; Weber, C. R.; Kyrala, G.
2017-10-01
Analysis of NIF implosion experiment N161205 provides insight into both hohlraum and capsule performance. This experiment used an undoped high-density carbon (HDC) ablator driven by a BigFoot x-ray profile in a Au hohlraum. Observations from this experiment include DT fusion yield, bang time, down-scattered ratio (DSR), ion temperature (Tion), and time-resolved x-ray emission images around bang time. These observations are all consistent with an x-ray spectrum having significantly reduced Au m-band emission relative to a standard hohlraum simulation. Attempts to justify the observations using several other simulation modifications will be presented. This work was performed under the auspices of the Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.
Atanassova, Vassia; Sotirova, Evdokia; Doukovska, Lyubka; Bureva, Veselina; Mavrov, Deyan; Tomov, Jivko
2017-01-01
The approach of InterCriteria Analysis (ICA) was applied with the aim of reducing the set of variables at the input of a neural network, taking into account the fact that a large number of inputs increases the number of neurons in the network, making it unusable for hardware implementation. Here, for the first time, correlations between triples of the input parameters used for training the neural networks were obtained with the help of the ICA method. In this case, we use the ICA approach for data preprocessing, which may reduce the total time for training the neural networks and, hence, the time for the network's processing of data and images. PMID:28874908
Duda, Catherine; Rajaram, Kumar; Barz, Christiane; Rosenthal, J Thomas
2013-01-01
There has been an increasing emphasis on health care efficiency and costs and on improving quality in health care settings such as hospitals or clinics. However, there has not been sufficient work on methods of improving access and customer service times in health care settings. The study develops a framework for improving access and customer service time for health care settings. In the framework, the operational concept of the bottleneck is synthesized with queuing theory to improve access and reduce customer service times without reduction in clinical quality. The framework is applied at the Ronald Reagan UCLA Medical Center to determine the drivers for access and customer service times and then provides guidelines on how to improve these drivers. Validation using simulation techniques shows significant potential for reducing customer service times and increasing access at this institution. Finally, the study provides several practice implications that could be used to improve access and customer service times without reduction in clinical quality across a range of health care settings from large hospitals to small community clinics.
Effect of abdominopelvic abscess drain size on drainage time and probability of occlusion.
Rotman, Jessica A; Getrajdman, George I; Maybody, Majid; Erinjeri, Joseph P; Yarmohammadi, Hooman; Sofocleous, Constantinos T; Solomon, Stephen B; Boas, F Edward
2017-04-01
The purpose of this study is to determine whether larger abdominopelvic abscess drains reduce the time required for abscess resolution or the probability of tube occlusion. 144 consecutive patients who underwent abscess drainage at a single institution were reviewed retrospectively. Larger initial drain size did not reduce drainage time, drain occlusion, or drain exchanges (P > .05). Subgroup analysis did not find any type of collection that benefitted from larger drains. A multivariate model predicting drainage time showed that large collections (>200 mL) required 16 days longer drainage time than small collections (<50 mL). Collections with a fistula to bowel required 17 days longer drainage time than collections without a fistula. Initial drain size and the viscosity of the fluid in the collection had no significant effect on drainage time in the multivariate model. 8 F drains are adequate for initial drainage of most serous and serosanguineous collections. 10 F drains are adequate for initial drainage of most purulent or bloody collections. Copyright © 2016 Elsevier Inc. All rights reserved.
Optimal structural design of the midship of a VLCC based on the strategy integrating SVM and GA
NASA Astrophysics Data System (ADS)
Sun, Li; Wang, Deyu
2012-03-01
In this paper, a hybrid modeling and optimization process integrating a support vector machine (SVM) and a genetic algorithm (GA) is introduced to reduce the high time cost of structural optimization of ships. The SVM, rooted in statistical learning theory as an approximate implementation of structural risk minimization, provides good generalization performance when metamodeling the input-output relationship of real problems, and consequently cuts the high time cost of analyses such as FEM. The GA, a powerful optimization technique, has remarkable advantages for problems that can hardly be optimized with common gradient-based methods, which makes it suitable for optimizing models built by the SVM. Based on the SVM-GA strategy, optimization of the structural scantlings in the midship of a very large crude carrier (VLCC) was carried out according to the direct strength assessment method in the common structural rules (CSR), demonstrating the high efficiency of SVM-GA in optimizing ship structural scantlings under heavy computational complexity. The time cost of this optimization was sharply reduced: many more loops were processed within a small amount of time, and the design was improved remarkably.
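As a rough illustration of the SVM-GA strategy, the sketch below fits an SVR surrogate to a handful of samples of an expensive function (standing in for the FEM analysis of the scantlings) and then runs a simple GA over the cheap surrogate. The sample sizes, kernel, and GA settings are illustrative assumptions, not the paper's.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
expensive = lambda x: np.sum((x - 0.3) ** 2, axis=-1)    # stand-in for an FEM run

# 1) Sample the expensive model once and fit an SVM surrogate.
X = rng.uniform(0.0, 1.0, size=(80, 4))
y = expensive(X)
surrogate = SVR(kernel="rbf", C=100.0, epsilon=1e-3).fit(X, y)

# 2) Evolve designs against the cheap surrogate instead of the FEM.
pop = rng.uniform(0.0, 1.0, size=(40, 4))
for gen in range(50):
    fitness = surrogate.predict(pop)
    parents = pop[np.argsort(fitness)[:20]]              # select (minimisation)
    children = (parents[rng.integers(0, 20, 20)] +
                parents[rng.integers(0, 20, 20)]) / 2.0  # blend crossover
    children += rng.normal(0.0, 0.05, children.shape)    # mutation
    pop = np.vstack((parents, np.clip(children, 0.0, 1.0)))

best = pop[np.argmin(surrogate.predict(pop))]
print(best, expensive(best))  # check the surrogate optimum on the true model
```

A real workflow would re-verify candidate optima against the true model and refit the surrogate as needed; that refinement loop is omitted here.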
A new parallel-vector finite element analysis software on distributed-memory computers
NASA Technical Reports Server (NTRS)
Qin, Jiangning; Nguyen, Duc T.
1993-01-01
A new parallel-vector finite element analysis software package, MPFEA (Massively Parallel-vector Finite Element Analysis), is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme along with vector-unrolling techniques is used to enhance vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
An image analysis system for near-infrared (NIR) fluorescence lymph imaging
NASA Astrophysics Data System (ADS)
Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.
2011-03-01
Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze an NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
Performance Analysis of Scientific and Engineering Applications Using MPInside and TAU
NASA Technical Reports Server (NTRS)
Saini, Subhash; Mehrotra, Piyush; Taylor, Kenichi Jun Haeng; Shende, Sameer Suresh; Biswas, Rupak
2010-01-01
In this paper, we present performance analysis of two NASA applications using performance tools like Tuning and Analysis Utilities (TAU) and SGI MPInside. MITgcmUV and OVERFLOW are two production-quality applications used extensively by scientists and engineers at NASA. MITgcmUV is a global ocean simulation model, developed by the Estimating the Circulation and Climate of the Ocean (ECCO) Consortium, for solving the fluid equations of motion using the hydrostatic approximation. OVERFLOW is a general-purpose Navier-Stokes solver for computational fluid dynamics (CFD) problems. Using these tools, we analyze the MPI functions (MPI_Sendrecv, MPI_Bcast, MPI_Reduce, MPI_Allreduce, MPI_Barrier, etc.) with respect to message size of each rank, time consumed by each function, and how ranks communicate. MPI communication is further analyzed by studying the performance of MPI functions used in these two applications as a function of message size and number of cores. Finally, we present the compute time, communication time, and I/O time as a function of the number of cores.
Multiple Adaptation Types with Mitigation: A Framework for Policy Analysis
Effective climate policy will consist of mitigation and adaptation implemented simultaneously in a policy portfolio to reduce the risks of climate change. The relative share of these responses will vary over time and will be adjusted in response to new information. Furthermore,...
Hyperspectral image analysis for water stress detection of apple trees
USDA-ARS?s Scientific Manuscript database
Plant stress significantly reduces plant productivity. Automated on-the-go mapping of plant stress would allow for a timely intervention and mitigation of the problem before critical thresholds are exceeded, thereby maximizing productivity. The spectral signature of plant leaves was analyzed by a ...
USDA-ARS?s Scientific Manuscript database
Field-specific management could help achieve agricultural sustainability by increasing production and decreasing environmental impacts. Near-infrared spectroscopy (NIRS) and geostatistics are relatively unexplored tools that could reduce time, labor, and costs of soil analysis. Our objective was to ...
Extracellular space preservation aids the connectomic analysis of neural circuits
Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L
2015-01-01
Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits. DOI: http://dx.doi.org/10.7554/eLife.08206.001 PMID:26650352
Monitoring and modelling for dry-stone walls terracement maintenance
NASA Astrophysics Data System (ADS)
Preti, Federico; Errico, Alessandro; Giambastiani, Yamuna; Guastini, Enrico; Penna, Daniele
2017-04-01
An analysis of dry-stone wall stability in agricultural areas, based on innovative monitoring and modeling, is presented here. The field test took place in Lamole, a terraced rural area in the province of Florence, Tuscany, central Italy, where wine production is the most important agricultural activity. Results show a good capability of the model to predict the space-time distribution and the intensity of stresses on the instrumented dry-stone wall and to describe the bulging of the ancient walls. We obtained significant evidence that terrace failure in Lamole is mainly related to water concentration pathways at specific portions of the walls. Accurate drainage of the terraced slopes, even by means of simple ditches, could reduce the concentration factor at the critical parts of the terraces, strongly reducing the water pressures on the walls. The effects of high-return-time events were analyzed by artificially reproducing severe rainfall on the experimental area.
Giraudeau, Patrick; Guignard, Nadia; Hillion, Emilie; Baguet, Evelyne; Akoka, Serge
2007-03-12
Quantitative analysis by (1)H NMR is often hampered by heavily overlapping signals that may occur for complex mixtures, especially those containing similar compounds. Bidimensional homonuclear NMR spectroscopy can overcome this difficulty. A thorough review of acquisition and post-processing parameters was carried out to obtain accurate and precise quantitative 2D J-resolved and DQF-COSY spectra in a much reduced time, thus limiting the effect of spectrometer instabilities over the course of the experiment. The number of t(1) increments was reduced as much as possible, and the standard deviation was improved by optimization of the spectral width, number of transients, phase cycling and apodization function. Localized polynomial baseline corrections were applied to the relevant chemical shift areas. Our method was applied to tropine-nortropine mixtures. Quantitative J-resolved spectra were obtained in less than 3 min and quantitative DQF-COSY spectra in 12 min, with an accuracy of 3% for J-spectroscopy and 2% for DQF-COSY, and a standard deviation smaller than 1%.
Della Pelle, Flavio; Di Crescenzo, Maria Chiara; Sergi, Manuel; Montesano, Camilla; Di Ottavio, Francesca; Scarpone, Rossana; Scortichini, Giampiero; Compagnone, Dario
2016-01-01
A rapid, selective and effective method for the extraction, clean-up and concentration of organophosphorus pesticides from wheat, followed by electrospray (ESI) LC-MS/MS analysis, was developed. The μ-SPE (micro-solid-phase extraction) procedure gave good analytical performance and at the same time reduced matrix effects, analysis time and solvent consumption. Limits of detection (LODs) and quantification (LOQs) were in the range of 0.3-10 and 1-30 μg kg(-1), respectively, with good reproducibility (RSD ≤ 13.8) and recoveries between 75% and 109%. Coefficients of determination (r(2)) were greater than 0.996 for the studied pesticides. Despite the reduced sorbent bed mass of the μ-SPE tips (4.2 mg), the analytical data showed that no saturation phenomena occur in the tested concentration range, both for single compounds and for mixtures. Several real samples were analysed, and the concentrations of the selected pesticides were found to be below the respective maximum residue limits (MRLs).
Identification of Behavior Based Safety by Using Traffic Light Analysis to Reduce Accidents
NASA Astrophysics Data System (ADS)
Mansur, A.; Nasution, M. I.
2016-01-01
This work presents the safety assessment of a case study in an important area of field production in the oil and gas industry, namely behavior-based safety (BBS). The company set up a rigorous BBS intervention program that is implemented and deployed continually. In this program, observers are asked to hold discussions with workers during observation and to administer a set of predetermined questions related to work behavior. Traffic Light Analysis (TLA), a risk assessment tool, was used to determine the estimated scores of the BBS questionnaire. The standardization of the TLA appraisal in this study is based on the Regulation of the Minister of Labor and Occupational Safety and Health No. PER.05/MEN/1996. The results show that some items score under 84%, which places them in the yellow category; these should be corrected immediately by the company to prevent continued unsafe behavior among workers. The application of BBS is expected to steadily increase safety performance at work and to be effective in reducing accidents.
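A minimal sketch of traffic-light banding of the observation scores appears below. The cutoffs are assumptions for illustration only: the abstract states just that items under 84% fall in the yellow category, and the exact bands of the cited regulation are not reproduced here.

```python
def traffic_light(score_pct, green_cutoff=85.0, red_cutoff=60.0):
    """Band a BBS observation score. Cutoffs are illustrative
    assumptions, not the regulation's exact values."""
    if score_pct >= green_cutoff:
        return "green"    # acceptable: maintain current practice
    if score_pct >= red_cutoff:
        return "yellow"   # marginal: correct immediately
    return "red"          # unacceptable: stop work and fix

scores = {"PPE use": 92.0, "housekeeping": 78.5, "permit compliance": 55.0}
for item, score in scores.items():
    print(f"{item}: {score:.1f}% -> {traffic_light(score)}")
```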
Computational singular perturbation analysis of stochastic chemical systems with stiffness
NASA Astrophysics Data System (ADS)
Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.
2017-04-01
Computational singular perturbation (CSP) is a useful method for the analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro or meso scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.
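For readers unfamiliar with CSP, the deterministic core of the method is a decomposition of the system Jacobian into fast and slow modes, after which the exhausted fast modes are removed and the dynamics are projected onto the slow manifold. A linear toy sketch of that decomposition (ours; far simpler than the stochastic chemical systems treated in the paper):

```python
import numpy as np

# Stiff linear toy system dx/dt = A x with one fast and one slow mode.
A = np.array([[-1000.0, 999.0],
              [0.0, -1.0]])

w, V = np.linalg.eig(A)
order = np.argsort(np.abs(w))           # slow modes first
w, V = w[order], V[:, order]
print("time scales:", 1.0 / np.abs(w))  # ~1.0 (slow) and ~0.001 (fast)

# CSP-style reduction: once the fast mode has relaxed, the state lives
# on the slow manifold spanned by the slow eigenvectors; projecting
# onto it removes the stiffness from the reduced model.
B = np.linalg.inv(V)                    # rows are the dual (left) basis
slow = slice(0, 1)                      # keep only the slow subspace
x0 = np.array([1.0, 1.0])
x_slow = V[:, slow] @ (B @ x0)[slow]
print("slow-manifold projection of x0:", np.real(x_slow))
```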
Improving Reports Turnaround Time: An Essential Healthcare Quality Dimension.
Khan, Mustafa; Khalid, Parwaiz; Al-Said, Youssef; Cupler, Edward; Almorsy, Lamia; Khalifa, Mohamed
2016-01-01
Turnaround time is one of the most important healthcare performance indicators. King Faisal Specialist Hospital and Research Center in Jeddah, Saudi Arabia worked on reducing the report turnaround time of the neurophysiology lab from more than two weeks to only five working days for 90% of cases. The main quality improvement methodology used was FOCUS PDCA. Using root cause analysis, Pareto analysis and qualitative survey methods, the main factors contributing to the delayed turnaround time and the suggested improvement strategies were identified and implemented: restructuring transcriptionists' daily tasks, rescheduling physicians' time, alerting on new reports, engaging consultants, consistent coordination, and prioritizing critical reports. After implementation, 92% of reports were verified within 5 days, compared with only 6% before implementation; 7% of reports were verified in 5 days to 2 weeks, and only 1% of reports needed more than 2 weeks, compared with 76% before implementation.
Widmar, Nicole Olynk; Lord, Emily; Litster, Annette
2015-01-01
Streamlining purchasing in nonhuman animal shelters can provide multiple financial benefits. Streamlining shelter inputs and thus reducing shelter costs can include trading paid labor and management for fewer, more involved volunteers or purchasing large quantities of medical supplies from fewer vendors to take advantage of bulk-purchasing discounts. Beyond direct savings, time and energy spent on purchasing and inventory control can be reduced through careful management. Although cost-cutting measures may seem attractive, shelter managers are cautioned to consider the potential unintended consequences of short-term cost reduction measures that could limit revenues or increase costs in the future. This analysis illustrates an example of the impact of cost reductions in specific expense categories and the impact on shelter net revenue, as well as the share of expenses across categories. An in-depth discussion of labor and purchasing cost-reducing strategies in the real world of animal shelter management is provided.
Time-based management of patient processes.
Kujala, Jaakko; Lillrank, Paul; Kronström, Virpi; Peltokorpi, Antti
2006-01-01
The purpose of this paper is to present a conceptual framework that would enable the effective application of time-based competition (TBC) and work-in-process (WIP) concepts in the design and management of effective and efficient patient processes. This paper discusses the applicability of these concepts to the design and management of healthcare service production processes. A conceptual framework is derived from the analysis of both existing research and empirical case studies. The paper finds that a patient episode is analogous to a customer order-to-delivery chain in industry. The effective application of TBC and WIP can be achieved by focusing on the throughput time of a patient episode, by reducing the non-value-adding time components, and by minimizing the time categories that are the main cost drivers for all stakeholders involved in the patient episode. The paper shows that the application of TBC in managing patient processes can be limited if there is no consensus about the optimal care episode in the medical community. It is shown that managing patient processes based on time and cost analysis enables one to allocate the optimal amount of resources, which would allow a healthcare system to minimize the total cost of specific episodes of illness. Analysing the total cost of patient episodes can provide useful information for the allocation of limited resources among multiple patient processes. This paper introduces a framework for healthcare managers and researchers to analyze the effect of reducing throughput time on the total cost of patient episodes.
McLawhorn, Alexander S; Carroll, Kaitlin M; Blevins, Jason L; DeNegre, Scott T; Mayman, David J; Jerabek, Seth A
2015-10-01
Template-directed instrumentation (TDI) for total knee arthroplasty (TKA) may streamline operating room (OR) workflow and reduce costs by preselecting implants and minimizing instrument tray burden. A decision model simulated the economics of TDI. Sensitivity analyses determined thresholds for model variables to ensure TDI success. A clinical pilot was reviewed. The accuracy of preoperative templates was validated, and 20 consecutive primary TKAs were performed using TDI. The model determined that preoperative component size estimation should be accurate to ±1 implant size for 50% of TKAs to implement TDI. The pilot showed that preoperative template accuracy exceeded 97%. There were statistically significant improvements in OR turnover time and in-room time for TDI compared to an historical cohort of TKAs. TDI reduces costs and improves OR efficiency. Copyright © 2015 Elsevier Inc. All rights reserved.
Real-Time System Verification by Kappa-Induction
NASA Technical Reports Server (NTRS)
Pike, Lee S.
2005-01-01
We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
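As a concrete illustration of k-induction, the sketch below verifies a safety property of a toy explicit-state system (the paper uses SAL's symbolic engine and timeout automata, not explicit enumeration). Plain induction (k = 1) fails because an unreachable state breaks the inductive step, while 2-induction succeeds; the enumeration over length-k paths also makes visible why the cost grows exponentially with k.

```python
from itertools import product

# Toy transition system: states 2 and 3 are unreachable from INIT, and
# state 2 steps into the "bad" state 3, which breaks plain induction.
STATES = range(4)
INIT = {0}
SUCC = {0: {1}, 1: {0}, 2: {3}, 3: {3}}
P = lambda s: s != 3                      # safety property to verify

def k_induction(k):
    # Base case: P holds on the first k states of every run from INIT.
    frontier = set(INIT)
    for _ in range(k):
        if not all(P(s) for s in frontier):
            return False
        frontier = {t for s in frontier for t in SUCC[s]}
    # Inductive step: k consecutive P-states force P on the next state.
    for path in product(STATES, repeat=k):          # exponential in k
        chained = all(path[i + 1] in SUCC[path[i]] for i in range(k - 1))
        if chained and all(P(s) for s in path):
            if not all(P(t) for t in SUCC[path[-1]]):
                return False
    return True

print(k_induction(1))   # False: plain induction fails (state 2 -> 3)
print(k_induction(2))   # True: the property is 2-inductive
```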
Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes.
Montenegro-Burke, J Rafael; Aisporna, Aries E; Benton, H Paul; Rinehart, Duane; Fang, Mingliang; Huan, Tao; Warth, Benedikt; Forsberg, Erica; Abe, Brian T; Ivanisevic, Julijana; Wolan, Dennis W; Teyton, Luc; Lairson, Luke; Siuzdak, Gary
2017-01-17
The speed and throughput of analytical platforms has been a driving force in recent years in the "omics" technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, Barry
The increasing deployment of distribution-connected photovoltaic (DPV) systems requires utilities to complete complex interconnection studies. Relatively simple interconnection study methods worked well for low penetrations of photovoltaic systems, but more complicated quasi-static time-series (QSTS) analysis is required to make better interconnection decisions as DPV penetration levels increase, and tools and methods must be developed to support this. This paper presents a variable-time-step solver for QSTS analysis that significantly shortens the computational time and effort needed to complete a detailed analysis of the operation of a distribution circuit with many DPV systems. Specifically, it demonstrates that the proposed variable-time-step solver can reduce the required computational time by as much as 84% without introducing any important errors in metrics such as the highest and lowest voltage occurring on the feeder, the number of voltage regulator tap operations, and the total losses realized in the distribution circuit during a 1-yr period. Further improvement in computational speed is possible with the introduction of only modest errors in these metrics, such as a 91% reduction with less than 5% error when predicting voltage regulator operations.
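The abstract does not reproduce the solver's stepping logic; a minimal sketch of the general idea — coast through quiet periods with a coarse time step and drop to fine stepping only when the inputs change materially — could look like the following. The load profile, tolerance, and the power-flow stub are our illustrative assumptions.

```python
import numpy as np

def variable_step_qsts(load, solve, tol=0.01, coarse=60):
    """Sweep a time series of feeder load, calling the expensive
    power-flow solver only when the load has moved more than `tol`
    (relative) since the last solve; otherwise skip `coarse` steps."""
    results = {0: solve(load[0])}
    last = load[0]
    t = 0
    while t + 1 < len(load):
        t_next = min(t + coarse, len(load) - 1)
        window = load[t + 1:t_next + 1]
        if np.all(np.abs(window - last) <= tol * abs(last)):
            t = t_next                      # quiet spell: coarse step
        else:
            t += 1                          # active spell: fine stepping
            results[t] = solve(load[t])
            last = load[t]
    return results

minutes = 1440                               # one synthetic day at 1-min data
load = np.ones(minutes)
load[480:540] = np.linspace(1.0, 1.5, 60)    # morning ramp-up
load[540:1020] = 1.5                         # flat midday plateau
load[1020:1080] = np.linspace(1.5, 1.0, 60)  # evening ramp-down
res = variable_step_qsts(load, lambda p: {"v_pu": 1.0 - 0.05 * p})
print(f"{len(res)} power-flow solves instead of {minutes}")
```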
Yang, Ying; Wang, Congcong; Li, Xinxue; Chai, Qianyun; Fei, Yutong; Xia, Ruyu; Xu, Rongqian; Yang, Li; Liu, Jianping
2015-10-01
Henoch-Schönlein purpura (HSP) is the most common necrotizing vasculitis affecting children, and traditional Chinese herbal medicine (CHM) is widely used to treat it. We aim to explore the evidence for the effectiveness and safety of CHM for HSP in children without renal damage. Randomized controlled trials (RCTs) comparing CHM with conventional medications were searched in five databases. Eligible data were pooled using a random-effects model in RevMan 5.2. Subgroup analysis for different co-interventions and sensitivity analysis for reducing heterogeneity were implemented, and the GRADE approach was adopted. We included 15 trials with 1112 HSP children (age 1-16 years old; disease duration one day to three months). The overall methodological quality of the included trials is relatively low. Adjunctive oral CHM treatments reduced renal damage (6 trials, RR 0.47, 95%CI 0.31-0.72, I(2)=0%) and the subsiding time (days) of purpura (5 trials, mean difference (MD) -3.60, 95%CI -4.21 to -2.99, I(2)=23%), joint pain (5 trials, MD -1.04, 95%CI -1.33 to -0.74, I(2)=1%) and abdominal pain (5 trials, MD -1.69, 95%CI -2.51 to -0.86, I(2)=74%). Subgroup and sensitivity analyses did not change the direction of the results. No severe adverse events were reported. Orally taken adjunctive CHM treatments are effective for children suffering from HSP in terms of reducing renal damage and the subsiding time of purpura, and could possibly reduce the subsiding time of joint and abdominal pain. No reliable conclusion regarding safety is possible based on the safety data retrieved. Further rigorous trials are warranted. Copyright © 2015. Published by Elsevier Ltd.
Woehrle, Holger; Cowie, Martin R; Eulenburg, Christine; Suling, Anna; Angermann, Christiane; d'Ortho, Marie-Pia; Erdmann, Erland; Levy, Patrick; Simonds, Anita K; Somers, Virend K; Zannad, Faiez; Teschler, Helmut; Wegscheider, Karl
2017-08-01
This on-treatment analysis was conducted to facilitate understanding of the mechanisms underlying the increased risk of all-cause and cardiovascular mortality in heart failure patients with reduced ejection fraction and predominant central sleep apnoea randomised to adaptive servo-ventilation versus the control group in the SERVE-HF trial. Time-dependent on-treatment analyses were conducted (unadjusted and adjusted for predictive covariates). A comprehensive, time-dependent model was developed to correct for asymmetric selection effects (to minimise bias). The comprehensive model showed increased cardiovascular death hazard ratios during adaptive servo-ventilation usage periods, slightly lower than those in the SERVE-HF intention-to-treat analysis. Self-selection bias was evident. Patients randomised to adaptive servo-ventilation who crossed over to the control group were at higher risk of cardiovascular death than controls, while control patients who crossed over to adaptive servo-ventilation showed a trend towards lower risk of cardiovascular death than patients randomised to adaptive servo-ventilation. Cardiovascular risk did not increase as nightly adaptive servo-ventilation usage increased. The on-treatment analysis showed results similar to the SERVE-HF intention-to-treat analysis, with an increased risk of cardiovascular death in patients with heart failure with reduced ejection fraction and predominant central sleep apnoea treated with adaptive servo-ventilation. Bias is inevitable and needs to be taken into account in any kind of on-treatment analysis in positive airway pressure studies. Copyright ©ERS 2017.
Comparative loneliness of users versus nonusers of online chatting.
Ong, Chorng-Shyong; Chang, Shu-Chen; Wang, Chih-Chien
2011-01-01
Online chatting is an important component of improving interpersonal relationships online, but it may reduce participants' communication time with family members. We conducted a study of the relationship between participants' intent to engage in online chatting and three dimensions of loneliness: social, familial, and romantic. This study was designed to show the effect of online chatting on each of these three dimensions of loneliness. The participants in the study were 709 students at two universities in Taiwan who were classified on the basis of whether or not they had ever engaged in online chatting. Of the participants, 651 (91.82%) fully completed the questionnaires that served as the study instruments and were included in data analysis. The study found that individuals who had participated in online chatting exhibited greater familial loneliness than those who had not because the time spent in online chatting reduced the time spent in familial relationships. Social loneliness was related to the quality of Internet relationships rather than to the time spent online. Individuals who participated in online chatting had less romantic loneliness because of a greater ease of maintaining romantic relationships online. We conclude that online chatting can reduce social loneliness through high-quality Internet relationships but may exacerbate familial loneliness.
Development of an Efficient Binaural Simulation for the Analysis of Structural Acoustic Data
NASA Technical Reports Server (NTRS)
Johnson, Marty E.; Lalime, Aimee L.; Grosveld, Ferdinand W.; Rizzi, Stephen A.; Sullivan, Brenda M.
2003-01-01
Applying binaural simulation techniques to structural acoustic data can be very computationally intensive, as the number of discrete noise sources can be very large. Typically, head-related transfer functions (HRTFs) are used to individually filter the signals from each of the sources in the acoustic field; creating a binaural simulation therefore implies the use of potentially hundreds of real-time filters. This paper details two methods of reducing the number of real-time computations required: (i) using the singular value decomposition (SVD) to reduce the complexity of the HRTFs by breaking them into dominant singular values and vectors, and (ii) using equivalent source reduction (ESR) to reduce the number of sources to be analyzed in real time by replacing sources on the scale of a structural wavelength with sources on the scale of an acoustic wavelength. The ESR and SVD reduction methods can be combined to provide an estimated computation-time reduction of 99.4% for the structural acoustic data tested. In addition, preliminary tests have shown a 97% correlation between the results of the combined reduction methods and the results found with the current binaural simulation techniques.
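A small sketch of the SVD half of the scheme: factor the bank of HRTF filters and keep only the dominant singular pairs, so that a few shared basis filters run in real time and each source contributes only scalar mixing gains. Synthetic low-rank data stands in for measured HRTFs, and the 99% energy cutoff is our assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic, deliberately redundant "HRTF" bank: 200 source directions,
# 128-tap impulse responses, built from 8 prototypes plus a little noise.
mixing = rng.normal(size=(200, 8))
prototypes = rng.normal(size=(8, 128))
hrtfs = mixing @ prototypes + 0.01 * rng.normal(size=(200, 128))

# Decompose the bank and keep singular pairs carrying 99% of the energy.
U, s, Vt = np.linalg.svd(hrtfs, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
k = int(np.searchsorted(energy, 0.99)) + 1

# Run k shared basis filters (rows of Vt) in real time; each source is
# mixed in with scalar weights instead of its own full-length filter.
weights = U[:, :k] * s[:k]      # per-source mixing gains
basis = Vt[:k]                  # the k shared real-time FIR filters
approx = weights @ basis
print(k, np.linalg.norm(hrtfs - approx) / np.linalg.norm(hrtfs))
```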
Shuval, Kerem; Balasubramanian, Bijal A.; Kendzor, Darla E.; Radford, Nina B.; DeFina, Laura F.; Gabriel, Kelley Pettee
2016-01-01
Introduction Objective estimates, based on waist-worn accelerometers, indicate that adults spend over half their day (55%) in sedentary behaviors. Our study examined the association between sitting time and cardiometabolic risk factors after adjustment for cardiorespiratory fitness (CRF). Methods A cross-sectional analysis was conducted with 4,486 men and 1,845 women who reported daily estimated sitting time, had measures for adiposity, blood lipids, glucose, and blood pressure, and underwent maximal stress testing. We used a modeling strategy using logistic regression analysis to assess CRF as a potential effect modifier and to control for potential confounding effects of CRF. Results Men who sat almost all of the time (about 100%) were more likely to be obese whether defined by waist girth (OR, 2.61; 95% CI, 1.25–5.47) or percentage of body fat (OR, 3.33; 95% CI, 1.35–8.20) than were men who sat almost none of the time (about 0%). Sitting time was not significantly associated with other cardiometabolic risk factors after adjustment for CRF level. For women, no significant associations between sitting time and cardiometabolic risk factors were observed after adjustment for CRF and other covariates. Conclusion As health professionals struggle to find ways to combat obesity and its health effects, reducing sitting time can be an initial step in a total physical activity plan that includes strategies to reduce sedentary time through increases in physical activity among men. In addition, further research is needed to elucidate the relationships between sitting time and CRF for women as well as the underlying mechanisms involved in these relationships. PMID:28033088
Venter, Anre; Maxwell, Scott E; Bolig, Erika
2002-06-01
Adding a pretest as a covariate to a randomized posttest-only design increases statistical power, as does the addition of intermediate time points to a randomized pretest-posttest design. Although typically 5 waves of data are required in this instance to produce meaningful gains in power, a 3-wave intensive design allows the evaluation of the straight-line growth model and may reduce the effect of missing data. The authors identify the statistically most powerful method of data analysis in the 3-wave intensive design. If straight-line growth is assumed, the pretest-posttest slope must assume fairly extreme values for the intermediate time point to increase power beyond the standard analysis of covariance on the posttest with the pretest as covariate, ignoring the intermediate time point.
NASA Technical Reports Server (NTRS)
Griffin, P. R.; Motakef, S.
1989-01-01
Consideration is given to the influence of temporal variations in the magnitude of gravity on natural convection during unidirectional solidification of semiconductors. It is shown that the response time to step changes in g at low Rayleigh numbers is controlled by the momentum diffusive time scale. At higher Rayleigh numbers, the response time to increases in g is reduced because of inertial effects. The degree of perturbation of flow fields by transients in the gravitational acceleration on the Space Shuttle and the Space Station is determined. The analysis is used to derive the requirements for crystal growth experiments conducted on short-duration low-g vehicles. The effectiveness of sounding rockets and KC-135 aircraft for microgravity experiments is also examined.
Reducing adaptive optics latency using Xeon Phi many-core processors
NASA Astrophysics Data System (ADS)
Barr, David; Basden, Alastair; Dipper, Nigel; Schwartz, Noah
2015-11-01
The next generation of Extremely Large Telescopes (ELTs) for astronomy will rely heavily on the performance of their adaptive optics (AO) systems. Real-time control is at the heart of the critical technologies that will enable telescopes to deliver the best possible science and will require a very significant extrapolation from the AO hardware currently used on 4-10 m telescopes. Investigating novel real-time computing architectures and testing their eligibility against anticipated challenges is one of the main priorities of technology development for the ELTs. This paper investigates the suitability of the Intel Xeon Phi, which is a commercial off-the-shelf hardware accelerator. We focus on wavefront reconstruction performance, implementing a straightforward matrix-vector multiplication (MVM) algorithm. We present benchmarking results of the Xeon Phi on a real-time Linux platform, both as a standalone processor and integrated into an existing real-time controller (RTC). The performance of single and multiple Xeon Phis is investigated. We show that this technology has the potential of greatly reducing the mean latency and variations in execution time (jitter) of large AO systems. We present a detailed performance analysis of the Xeon Phi for a typical E-ELT first-light instrument, along with a more general approach that enables us to extend to any AO system size. We show that systematic and detailed performance analysis is an essential part of testing novel real-time control hardware to guarantee optimal science results.
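A minimal sketch of the MVM reconstruction kernel and a naive latency/jitter measurement, in the spirit of the benchmarks described; the slope and actuator counts are assumed orders of magnitude only, and a production RTC (and the Xeon Phi offload itself) would look very different.

```python
import time
import numpy as np

n_slopes, n_actuators = 9232, 5316        # assumed orders of magnitude
R = np.ones((n_actuators, n_slopes), dtype=np.float32)  # reconstruction matrix
s = np.ones(n_slopes, dtype=np.float32)                 # slope vector

lat = []
for _ in range(200):
    t0 = time.perf_counter()
    a = R @ s                             # actuator commands for this frame
    lat.append(time.perf_counter() - t0)

lat = np.array(lat[20:]) * 1e3            # drop warm-up, convert to ms
print(f"mean latency {lat.mean():.2f} ms, jitter (std) {lat.std():.3f} ms")
```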
Seismpol: a Visual-Basic computer program for interactive and automatic earthquake waveform analysis
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio
1997-11-01
A Microsoft Visual-Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation arising from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information: the program uses a band-pass Butterworth filter to process the signals in the frequency domain, decomposing a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
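The covariance-based polarization computation can be sketched as follows; the rectilinearity and azimuth formulas are common textbook definitions and are assumptions here, not necessarily the exact expressions implemented in the program.

```python
import numpy as np

def polarization(z, n, e, win=64, step=32):
    """Sliding-window polarization attributes from three components
    (z, n, e: equal-length 1-D arrays from one three-component station)."""
    out = []
    for i in range(0, len(z) - win, step):
        X = np.vstack([z[i:i+win], n[i:i+win], e[i:i+win]])
        lam, vec = np.linalg.eigh(np.cov(X))      # eigenvalues ascending
        l1, l2, l3 = lam[2], lam[1], lam[0]
        v1 = vec[:, 2]                            # principal eigenvector
        rect = 1.0 - (l2 + l3) / (2.0 * l1)       # rectilinearity in [0, 1]
        azim = np.degrees(np.arctan2(v1[2], v1[1]))  # E over N components
        out.append((i, rect, azim))
    return out
```

High rectilinearity flags a body-wave arrival, and the principal-eigenvector azimuth gives the back-azimuth estimate discussed in the abstract.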
Soft-copy sonography: cost reduction sensitivity analysis in a pediatric hospital.
Don, S; Albertina, M J; Ammann, D
1998-03-01
Our objective was to determine whether interpreting sonograms of pediatric patients using soft copy (computer workstation) instead of laser-printed film could reduce costs for a pediatric radiology department. We used theoretic models of growth to analyze costs. The costs of a sonographic picture archiving and communication system (three interface devices, two workstations, a network server, maintenance expenses, and storage media costs) were compared with the potential savings of eliminating film and increasing technologist efficiency or reducing the number of technologists. The model was based on historic trends and future capitation estimates that will reduce fee-for-service reimbursement. The effects of varying the study volume and reducing technologists' work hours were analyzed. By converting to soft-copy interpretation, we saved 6 min 32 sec per examination by eliminating film-processing waiting time, thus reducing examination time from 30 min to 24 min. During an average day of 27 examinations, 176 min were saved. However, 33 min a day were spent retrieving prior studies from long-term storage; thus, 143 extra minutes a day were available for scanning. This improved efficiency could yield five more sonograms a day with existing staff and equipment. Alternatively, five examinations a day would equate to one half of a full-time-equivalent technologist's position. When our cost analysis accounted for the hospital's anticipated growth in sonography and 5-year equipment depreciation, converting to soft-copy interpretation saved more than $606,000. Increasing volume by just 200 sonograms in the first year, with no further growth, resulted in savings of more than $96,000. If the number of sonograms stayed constant, elimination of film printing alone resulted in a loss of approximately $157,000; reduction of one half of a full-time-equivalent technologist's position would recoup approximately $134,000 of that loss. Soft-copy sonography can save money through improved technologist efficiency, thereby increasing the number of sonograms obtained and the revenue generated. If the number of sonograms does not increase, elimination of printing costs and reduction of staff technologists will not result in savings.
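The workflow arithmetic above can be checked directly; the inputs are taken from the abstract, and the final division into extra examinations is illustrative.

```python
exams_per_day = 27
saved_per_exam_s = 6 * 60 + 32        # 6 min 32 s saved per examination
saved_min = exams_per_day * saved_per_exam_s / 60.0
net_min = saved_min - 33              # minus daily retrieval of prior studies
extra_exams = net_min / 24.0          # a soft-copy exam now takes ~24 min
print(f"{saved_min:.0f} min saved, {net_min:.0f} min net, "
      f"~{extra_exams:.1f} extra exams/day")
# -> 176 min saved, 143 min net, ~6.0 extra exams (the abstract reports five)
```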
2015-12-01
Foods analyzed included apple juice, orange juice, whole milk, 2% reduced fat milk, Egg Beaters processed egg whites, tomato sauce, and several meats, including hamburger meat (80% lean and 20% fat), hot dogs, chicken nuggets, and precooked turkey deli meat (99% fat free).
Scramjet Combustor Simulations Using Reduced Chemical Kinetics for Practical Fuels
2003-12-01
...the aerospace industry in reducing prototype and testing costs and the time needed to bring products to market. Accurate simulation of chemical... JP-8 kinetics and soot models into the UNICORN CFD code (Montgomery et al., 2003a); NSF Phase I and II SBIRs for development of a computer-assisted... Abbreviations: L/D, length divided by diameter; QSS, quasi-steady state; REI, Reaction Engineering International; UNICORN, UNsteady Ignition and COmbustion with ReactioNs; VULCAN, Viscous Upwind aLgorithm for Complex flow ANalysis.
Development of a short version of the modified Yale Preoperative Anxiety Scale.
Jenkins, Brooke N; Fortier, Michelle A; Kaplan, Sherrie H; Mayes, Linda C; Kain, Zeev N
2014-09-01
The modified Yale Preoperative Anxiety Scale (mYPAS) is the current "criterion standard" for assessing child anxiety during induction of anesthesia and has been used in >100 studies. This observational instrument covers 5 items and is typically administered at 4 perioperative time points. Application of this complex instrument in busy operating room (OR) settings, however, presents a challenge. In this investigation, we examined whether the instrument could be modified and made easier to use in OR settings. This study used qualitative methods, principal component analyses, Cronbach αs, and effect sizes to create the mYPAS-Short Form (mYPAS-SF) and reduce the number of assessment time points. Data were obtained from multiple patients (N = 3798; mean age = 5.63 years) who were recruited in previous investigations using the mYPAS over the past 15 years. After qualitative analysis, the "use of parent" item was eliminated due to content overlap with other items. The reduced item set accounted for 82% or more of the variance in child anxiety and produced Cronbach α values of at least 0.92. To reduce the number of assessment time points, a minimum Cohen d effect size criterion of 0.48 change in mYPAS score across time points was used. This led to eliminating the walk-to-the-OR and entrance-to-the-OR time points. Reducing the mYPAS to 4 items, creating the mYPAS-SF that can be administered at 2 time points, retained the accuracy of the measure while allowing the instrument to be more easily used in clinical research settings.
Pichai, Saravanan; Rajesh, M; Reddy, Naveen; Adusumilli, Gopinath; Reddy, Jayaprakash; Joshi, Bhavana
2014-09-01
Skeletal maturation is an integral part of an individual's pattern of growth and development and is a continuous process. Peak growth velocity in standing height is the most valid representation of the rate of overall skeletal growth. Ossification changes of the hand-wrist and cervical vertebrae are reliable indicators of an individual's growth status. The objective of this study was to compare skeletal maturation as measured by hand-wrist bone analysis and cervical vertebral analysis. Hand-wrist radiographs and lateral cephalograms were obtained from 72 male and female subjects aged 7-16 years among the patients visiting the Department of Orthodontics and Dentofacial Orthopedics, R.V. Dental College and Hospital. The 9 stages were reduced to 5 stages for comparison with the cervical vertebral maturation staging of Baccetti et al., and the Bjork, Grave and Brown stages were reduced to six intervals for comparison with the cervical vertebral maturational index (CVMI) staging of Hassel and Farman. These measurements were then compared with the hand-wrist bone analysis, and the results were statistically analyzed using the Mann-Whitney test. There was no significant difference between the hand-wrist analysis and the two cervical vertebral analyses for assessing skeletal maturation. There was no significant difference between the two cervical vertebral analyses, but the CVMI method, being a visual method, is less time-consuming. Vertebral analysis on a lateral cephalogram is as valid as hand-wrist bone analysis, with the advantage of reducing the radiation exposure of growing subjects.
Roswell, Robert O; Greet, Brian; Parikh, Parin; Mignatti, Andrea; Freese, John; Lobach, Iryna; Guo, Yu; Keller, Norma; Radford, Martha; Bangalore, Sripal
2014-07-01
The 2013 American College of Cardiology Foundation/American Heart Association ST-segment elevation myocardial infarction (STEMI) guidelines have shifted focus from door-to-balloon (D2B) time to the time from first medical contact to device activation (contact-to-device time [C2D]). This study investigates the impact of prehospital wireless electrocardiogram transmission (PHT) on reperfusion times to assess the impact of the new guidelines. From January 2009 to December 2012, data were collected on STEMI patients who received percutaneous coronary interventions; 245 patients were included for analysis. The primary outcome was median C2D time in the PHT group, and the secondary outcome was D2B time. Prehospital wireless electrocardiogram transmission was associated with reduced C2D times vs no PHT: 80 minutes (interquartile range [IQR], 64-94) vs 96 minutes (IQR, 79-118), respectively, P < 0.0001. The median D2B time was lower in the PHT group vs the no-PHT group: 45 minutes (IQR, 34-56) vs 63 minutes (IQR, 49-81), respectively, P < 0.0001. Multivariate analysis showed PHT to be the strongest predictor of a C2D time of <90 minutes (odds ratio: 3.73, 95% confidence interval: 1.65-8.39, P = 0.002). Female sex was negatively predictive of achieving a C2D time <90 minutes (odds ratio: 0.23, 95% confidence interval: 0.07-0.73, P = 0.01). In STEMI patients, PHT was associated with significantly reduced C2D and D2B times and was an independent predictor of achieving a target C2D time. As centers adapt to the new guidelines emphasizing C2D time, targeting a shorter D2B time (<50 minutes) is ideal to achieve a C2D time of <90 minutes. © 2014 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Werner, C. L.; Wegmuller, U.; Strozzi, T.; Wiesmann, A.
2006-12-01
Principal contributors to the noise in differential SAR interferograms are the temporal phase stability of the surface, geometry relating to baseline and surface slope, and propagation path delay variations due to tropospheric water vapor and the ionosphere. Time series analysis of multiple interferograms generated from a stack of SAR SLC images seeks to determine the deformation history of the surface while reducing errors. Only those scatterers within a resolution element that are stable and coherent for each interferometric pair contribute to the desired deformation signal. Interferograms with baselines exceeding 1/3 of the critical baseline have substantial geometrical decorrelation for distributed targets. Short-baseline pairs with multiple reference scenes can be combined using least-squares estimation to obtain a global deformation solution. Alternatively, point-like persistent scatterers that do not exhibit the geometrical decorrelation associated with large baselines can be identified; in this approach interferograms are formed from a stack of SAR complex images using a single reference scene, but stable distributed-scatterer pixels are excluded due to the presence of large baselines. We apply both point-based and short-baseline methodologies and compare results for a stack of fine-beam Radarsat data acquired in 2002-2004 over a rapidly subsiding oil field near Lost Hills, CA. We also investigate the density of point-like scatterers with respect to image resolution. The primary difficulty encountered when applying time series methods is phase unwrapping errors due to spatial and temporal gaps; phase unwrapping requires sufficient spatial and temporal sampling. Increasing the SAR range bandwidth increases the range resolution as well as the critical interferometric baseline that defines the required satellite orbital tube diameter. Sufficient spatial sampling also permits unwrapping because of the reduced phase-per-pixel gradient. Short time intervals further reduce the differential phase due to deformation when the deformation is continuous. Lower-frequency systems (L- vs C-band) substantially improve the ability to unwrap the phase correctly by directly reducing both interferometric phase amplitude and temporal decorrelation.
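A least-squares combination of short-baseline pairs of the kind described can be sketched for a single unwrapped pixel as follows; the scene count, pair list, and noise level are invented for illustration, with scene 0 held as the phase reference.

```python
import numpy as np

n_scenes = 6
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4), (4, 5)]
rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0.0, 1.0, n_scenes))
truth -= truth[0]                                 # reference scene has phase 0

# Each interferogram measures phi[j] - phi[i] (plus noise).
A = np.zeros((len(pairs), n_scenes - 1))
for k, (i, j) in enumerate(pairs):
    if j > 0: A[k, j - 1] += 1.0
    if i > 0: A[k, i - 1] -= 1.0
obs = np.array([truth[j] - truth[i] for i, j in pairs]) \
      + rng.normal(0.0, 0.05, len(pairs))

phi, *_ = np.linalg.lstsq(A, obs, rcond=None)     # per-scene phase history
print(np.round(phi, 2), np.round(truth[1:], 2))
```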
Singular-Arc Time-Optimal Trajectory of Aircraft in Two-Dimensional Wind Field
NASA Technical Reports Server (NTRS)
Nguyen, Nhan
2006-01-01
This paper presents a study of minimum time-to-climb trajectory analysis for aircraft flying in a two-dimensional, altitude-dependent wind field. The time-optimal control problem possesses a singular control structure when the lift coefficient is taken as a control variable. A singular arc analysis is performed to obtain an optimal control solution on the singular arc. Using a time-scale separation with the flight path angle treated as a fast state, the dimensionality of the optimal control solution is reduced by eliminating the lift coefficient control. A further singular arc analysis is used to decompose the original optimal control solution into the flight path angle solution and a trajectory solution as a function of the airspeed and altitude. The optimal control solutions for the initial and final climb segments are computed using a shooting method with known starting values on the singular arc. The numerical results of the shooting method show that the optimal flight path angles on the initial and final climb segments are constant. The analytical approach provides a rapid means for analyzing time-optimal trajectories for aircraft performance.
Updated hazard rate equations for dual safeguard systems.
Rothschild, Marc
2007-04-11
A previous paper by this author [M.J. Rothschild, Updated hazard rate equation for single safeguards, J. Hazard. Mater. 130 (1-2) (2006) 15-20] showed that commonly used analytical methods for quantifying failure rates overestimate the risk in some circumstances. This can lead the analyst to mistakenly believe that a given operation presents an unacceptable risk. For a single safeguard system, that paper presented a formula that accurately evaluates the risk over a wide range of conditions. This paper expands on that analysis by evaluating the failure rate for dual safeguard systems. The safeguards can be activated at the same time or at staggered times, and a safeguard may provide an indication of whether it was successful upon a challenge, or its status may go undetected. These combinations were evaluated using a Monte Carlo simulation, and empirical formulas for evaluating the hazard rate were developed from this analysis. It is shown that having the safeguards activate at the same time while providing positive feedback of their individual actions is the most effective arrangement for reducing the hazard rate. The hazard rate can also be reduced by staggering the testing schedules of the safeguards.
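A minimal Monte Carlo sketch of the joint unavailability of two periodically proof-tested safeguards, illustrating the benefit of staggered test schedules noted above; the failure rate, demand rate, and test interval are assumed values, and failures are taken to remain undetected until the next test (i.e., no positive feedback).

```python
import numpy as np

rng = np.random.default_rng(2)
lam = 0.1                  # assumed safeguard failure rate, per year
T = 1.0                    # assumed proof-test interval, years
demands = np.sort(rng.uniform(0.0, 10_000.0, 10_000))   # ~1 demand/year

def mean_joint_pfd(offset):
    """Average probability that BOTH safeguards are in a failed, undetected
    state when a demand arrives; safeguard 2's proof tests are shifted by
    `offset` within the test interval."""
    age1 = demands % T                 # time since safeguard 1 was last tested
    age2 = (demands - offset) % T      # same for safeguard 2
    p1 = 1.0 - np.exp(-lam * age1)
    p2 = 1.0 - np.exp(-lam * age2)
    return float((p1 * p2).mean())

print("simultaneous testing:", round(mean_joint_pfd(0.0), 5))
print("staggered testing:   ", round(mean_joint_pfd(T / 2), 5))
```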
Wu, Shu Juan; Hayden, Joshua A
2018-02-15
Sandwich immunoassays offer advantages in the clinical laboratory but can yield erroneously low results due to the hook (prozone) effect, especially with analytes whose concentrations span several orders of magnitude, such as ferritin. This study investigated a new approach to reduce the likelihood of hook effect in ferritin immunoassays by performing upfront, five-fold dilutions of all samples for ferritin analysis. The impact of this change on turnaround time and costs was also investigated. Ferritin concentrations were analysed in routine clinical practice with and without upfront dilutions on Siemens Centaur® XP (Siemens Healthineers, Erlangen, Germany) immunoanalysers. In addition, one month of baseline data (1026 results) were collected prior to implementing upfront dilutions and one month of data (1033 results) were collected after implementation. Without upfront dilutions, hook effect was observed in samples with ferritin concentrations as low as 86,028 µg/L. With upfront dilutions, samples with ferritin concentrations as high as 126,050 µg/L yielded values greater than the measurement interval and would have been diluted until an accurate value was obtained. The implementation of upfront dilution of ferritin samples led to a decrease in turnaround time from a median of 2 hours and 3 minutes to 1 hour and 18 minutes (P = 0.002). Implementation of upfront dilutions of all ferritin samples reduced the possibility of hook effect, improved turnaround time and saved the cost of performing additional dilutions.
A Simple Exoskeleton That Assists Plantarflexion Can Reduce the Metabolic Cost of Human Walking
Malcolm, Philippe; Derave, Wim; Galle, Samuel; De Clercq, Dirk
2013-01-01
Background Even though walking can be sustained for great distances, considerable energy is required for plantarflexion around the instant of opposite leg heel contact. Different groups attempted to reduce metabolic cost with exoskeletons but none could achieve a reduction beyond the level of walking without exoskeleton, possibly because there is no consensus on the optimal actuation timing. The main research question of our study was whether it is possible to obtain a higher reduction in metabolic cost by tuning the actuation timing. Methodology/Principal Findings We measured metabolic cost by means of respiratory gas analysis. Test subjects walked with a simple pneumatic exoskeleton that assists plantarflexion with different actuation timings. We found that the exoskeleton can reduce metabolic cost by 0.18 ± 0.06 W kg⁻¹ or 6 ± 2% (standard error of the mean) (p = 0.019) below the cost of walking without exoskeleton if actuation starts just before opposite leg heel contact. Conclusions/Significance The optimum timing that we found concurs with the prediction from a mathematical model of walking. While the present exoskeleton was not ambulant, measurements of joint kinetics reveal that the required power could be recycled from knee extension deceleration work that occurs naturally during walking. This demonstrates that it is theoretically possible to build future ambulant exoskeletons that reduce metabolic cost, without power supply restrictions. PMID:23418524
A Method for Generating Reduced Order Linear Models of Supersonic Inlets
NASA Technical Reports Server (NTRS)
Chicatelli, Amy; Hartley, Tom T.
1997-01-01
For the modeling of high-speed propulsion systems, there are at least two major categories of models. One is based on computational fluid dynamics (CFD), and the other is based on design and analysis of control systems. CFD is accurate and gives a complete view of the internal flow field, but it typically has many states and runs much slower than real time. Models based on control design typically run near real time but do not always capture the fundamental dynamics. To provide improved control models, methods are needed that are based on CFD techniques but yield models small enough for control analysis and design.
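One common route from a large linear model to a control-sized one is balanced truncation; the sketch below applies the square-root algorithm to a random stable stand-in system, and is offered as a generic illustration under those assumptions rather than the specific method proposed in the paper.

```python
import numpy as np
from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

rng = np.random.default_rng(9)
n, k = 20, 6                                    # full and reduced orders
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))   # assumed stable system
B = rng.standard_normal((n, 2))
C = rng.standard_normal((2, n))

# Controllability/observability Gramians: A Wc + Wc A^T = -B B^T, etc.
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

Lc = cholesky(Wc, lower=True)
Lo = cholesky(Wo, lower=True)
U, s, Vh = svd(Lo.T @ Lc)                       # s = Hankel singular values

S = np.diag(1.0 / np.sqrt(s[:k]))
T = Lc @ Vh[:k, :].T @ S                        # reduced -> full coordinates
Ti = S @ U[:, :k].T @ Lo.T                      # full -> reduced coordinates
Ar, Br, Cr = Ti @ A @ T, Ti @ B, C @ T          # k-state model
print("Hankel singular values:", s[:8].round(3))
```

States with small Hankel singular values contribute little to the input-output behavior, which is why truncating them preserves the fundamental dynamics a control model needs.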
NASA Technical Reports Server (NTRS)
Goodman, Allen; Shively, R. Joy (Technical Monitor)
1997-01-01
MIDAS, Man-machine Integration Design and Analysis System, is a unique combination of software tools aimed at reducing design cycle time, supporting quantitative predictions of human-system effectiveness and improving the design of crew stations and their associated operating procedures. This project is supported jointly by the US Army and NASA.
Estimating soil solution nitrate concentration from dielectric spectra using PLS analysis
USDA-ARS?s Scientific Manuscript database
Fast and reliable methods for in situ monitoring of soil nitrate-nitrogen concentration are vital for reducing nitrate-nitrogen losses to ground and surface waters from agricultural systems. While several studies have been done to indirectly estimate nitrate-nitrogen concentration from time domain s...
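A hedged sketch of the PLS step named in the title: fitting nitrate-nitrogen concentration to spectral features with scikit-learn's PLSRegression. The synthetic spectra, dimensions, and component count are placeholders, not the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
n_samples, n_channels = 120, 50                 # assumed spectrum dimensions
X = rng.standard_normal((n_samples, n_channels))        # stand-in spectra
y = X[:, :5] @ rng.standard_normal(5) + 0.1 * rng.standard_normal(n_samples)

# Fit on the first 80 samples, score on the held-out 40.
pls = PLSRegression(n_components=5).fit(X[:80], y[:80])
pred = pls.predict(X[80:]).ravel()
ss_res = ((y[80:] - pred) ** 2).sum()
ss_tot = ((y[80:] - y[80:].mean()) ** 2).sum()
print("held-out R^2:", round(1 - ss_res / ss_tot, 2))
```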
Separation and quantitation of plant and insect carbohydrate isomers found on the surface of cotton
USDA-ARS?s Scientific Manuscript database
Cotton stickiness researchers have worked to create ion chromatography (IC) carbohydrate separation methods which allow for minimal analysis time and reduced operational costs. Researchers have also tried to correlate scientifically backed IC data with the available physical stickiness tests, such ...
Methods for Estimating Payload/Vehicle Design Loads
NASA Technical Reports Server (NTRS)
Chen, J. C.; Garba, J. A.; Salama, M. A.; Trubert, M. R.
1983-01-01
Several methods compared with respect to accuracy, design conservatism, and cost. Objective of survey: reduce time and expense of load calculation by selecting approximate method having sufficient accuracy for problem at hand. Methods generally applicable to dynamic load analysis in other aerospace and other vehicle/payload systems.
Failure mode and effect analysis in blood transfusion: a proactive tool to reduce risks.
Lu, Yao; Teng, Fang; Zhou, Jie; Wen, Aiqing; Bi, Yutian
2013-12-01
The aim of blood transfusion risk management is to improve the quality of blood products and to assure patient safety. We utilized failure mode and effect analysis (FMEA), a tool for evaluating risks and identifying preventive measures, to reduce the risks in blood transfusion. The failure modes and effects occurring throughout the whole process of blood transfusion were studied. Each failure mode was evaluated using three scores: severity of effect (S), likelihood of occurrence (O), and probability of detection (D). Risk priority numbers (RPNs) were calculated by multiplying the S, O, and D scores. The plan-do-check-act cycle was also used for continuous improvement. Analysis showed that the failure modes with the highest RPNs, and therefore the greatest risk, were insufficient preoperative assessment of the blood product requirement (RPN, 245), preparation time before infusion of more than 30 minutes (RPN, 240), blood transfusion reaction occurring during the transfusion process (RPN, 224), blood plasma abuse (RPN, 180), and insufficient and/or incorrect clinical information on the request form (RPN, 126). After implementation of preventive measures and reassessment, a reduction in RPN was detected for each risk. The failure mode with the second highest RPN, preparation time before infusion of more than 30 minutes, was examined in detail to demonstrate the efficiency of this tool. The FMEA evaluation model is a useful tool for proactively analyzing and reducing the risks associated with the blood transfusion procedure. © 2013 American Association of Blood Banks.
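The RPN arithmetic is simple to reproduce. In the sketch below, the S/O/D triples are hypothetical (the abstract reports only the resulting RPNs) and were chosen merely so the products match the published values.

```python
# RPN = S x O x D; hypothetical S/O/D triples reproducing the published RPNs.
failure_modes = {
    "insufficient preoperative assessment of blood need": (5, 7, 7),  # 245
    "preparation time before infusion > 30 min":          (6, 8, 5),  # 240
    "transfusion reaction during transfusion":            (8, 7, 4),  # 224
    "blood plasma abuse":                                 (5, 6, 6),  # 180
    "insufficient/incorrect info on request form":        (7, 6, 3),  # 126
}

ranked = sorted(((s * o * d, name) for name, (s, o, d) in failure_modes.items()),
                reverse=True)
for rpn, name in ranked:
    print(f"RPN {rpn:3d}  {name}")
```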
Schneiderhan, Wilhelm; Grundt, Alexander; Wörner, Stefan; Findeisen, Peter; Neumaier, Michael
2013-11-01
Because sepsis has a high mortality rate, rapid microbiological diagnosis is required to enable efficient therapy. The effectiveness of MALDI-TOF mass spectrometry (MALDI-TOF MS) analysis in reducing turnaround times (TATs) for blood culture (BC) pathogen identification when available in a 24-h hospital setting has not been determined. On the basis of data from a total of 912 positive BCs collected within 140 consecutive days, together with workflow analyses of laboratory diagnostics, we evaluated different models to assess the TATs of batch-wise and immediate-response (real-time) MALDI-TOF MS pathogen identification of positive BC results during the night shifts. The results were compared with TATs from routine BC processing and biochemical identification performed during regular working hours. Continuous BC incubation together with batch-wise MALDI-TOF MS analysis enabled significant reductions of up to 58.7 h in the mean TATs for reporting the bacterial species. The TAT of batch-wise MALDI-TOF MS analysis was a mean of 4.9 h longer than that of the immediate-response workflow model under ideal conditions with no constraints in staff availability. Together with continuous cultivation of BCs, the 24-h availability of MALDI-TOF MS can reduce the TAT for microbial pathogen identification within a routine clinical laboratory setting. Batch-wise testing of positive BCs loses a few hours compared with real-time identification but is still far superior to classical BC processing. Larger prospective studies are required to evaluate the contribution of rapid around-the-clock pathogen identification to medical decision-making for septicemic patients.
Murphy, John C; Darragh, Karen; Walsh, Simon J; Hanratty, Colm G
2011-11-15
The RADPAD is a lead-free surgical drape containing bismuth and barium that has been demonstrated to reduce scatter radiation exposure to primary operators during fluoroscopic procedures. It is not known to what degree the RADPAD reduces radiation exposure in operators who perform highly complex percutaneous coronary intervention (PCI) requiring prolonged fluoroscopic screening times. Sixty consecutive patients due to undergo elective complex PCI involving rotational atherectomy, multivessel PCI, or chronic total occlusions were randomized in a 1:1 ratio to have their procedures performed with or without the RADPAD drape in situ. Dosimetry was performed on the left arm of the primary operator. There were 40 cases of chronic total occlusion, including 28 with contralateral injections; 15 cases involving rotational atherectomy; and 5 cases of multivessel PCI. There was no significant difference in screening times or dose-area products between the 2 patient groups. Primary operator radiation dose relative to screening time (RADPAD: slope = 1.44, R² = 0.25; no RADPAD: slope = 4.60, R² = 0.26; analysis of covariance F = 4.81, p = 0.032) and dose-area product (RADPAD: slope = 0.003, R² = 0.26; no RADPAD: slope = 0.011, R² = 0.52; analysis of covariance F = 12.54, p = 0.008) was significantly smaller in the RADPAD cohort compared to the no-RADPAD group. In conclusion, the RADPAD significantly reduces radiation exposure to primary operators during prolonged, complex PCI cases. Copyright © 2011 Elsevier Inc. All rights reserved.
Sheikh, Bassem Yousef; Zihad, S M Neamul Kabir; Sifat, Nazifa; Uddin, Shaikh J; Shilpi, Jamil A; Hamdi, Omer A A; Hossain, Hemayet; Rouf, Razina; Jahan, Ismet Ara
2016-01-01
In addition to its rich nutritional value, the date palm is used in various ethnobotanical practices for the treatment of different disease conditions. The present investigation examined the neuropharmacological and antinociceptive effects of the ethanol extracts of three date cultivars grown in Saudi Arabia, namely Ajwah, Safawy and Sukkari. Neuropharmacological effects were assessed by the pentobarbitone-induced sleeping time, open-field, and hole-board tests. Antinociceptive activity was tested by the acetic acid-induced writhing and hot-plate tests. The date extracts were also subjected to HPLC analysis to detect the presence of common bioactive polyphenols. All three date extracts extended pentobarbitone-induced sleeping time, reduced locomotor activity in the open-field test, and reduced exploratory behaviour in the hole-board test in mice. The extracts also reduced acetic acid-induced writhing and delayed the response time in the hot-plate test. The activities were stronger for Ajwah than for the other two cultivars. HPLC analysis indicated the presence of trans-ferulic acid in all three cultivars, and of (+)-catechin and (-)-epicatechin only in Ajwah and Safawy. The observed neuropharmacological and analgesic activity could be partly due to the presence of (+)-catechin, (-)-epicatechin and trans-ferulic acid, three important plant polyphenols well known for their neuroprotective activity and their ability to exert antioxidant activity on brain cells. The present investigation also supports the ethnobotanical use of date palm for ameliorating pain and CNS disorders.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Boram; Gupta, Rajan; Bhattacharya, Tanmoy
We present a detailed analysis of methods to reduce statistical errors and excited-state contamination in the calculation of matrix elements of quark bilinear operators in nucleon states. All the calculations were done on a 2+1 flavor ensemble with lattices of size $32^3 \times 64$ generated using the rational hybrid Monte Carlo algorithm at $a = 0.081$ fm and with $M_\pi = 312$ MeV. The statistical precision of the data is improved using the all-mode-averaging method. We compare two methods for reducing excited-state contamination: a variational analysis and a two-state fit to data at multiple values of the source-sink separation $t_{\rm sep}$. We show that both methods can be tuned to significantly reduce excited-state contamination and discuss their relative advantages and cost-effectiveness. A detailed analysis of the size of source smearing used in the calculation of quark propagators and the range of values of $t_{\rm sep}$ needed to demonstrate convergence of the isovector charges of the nucleon to the $t_{\rm sep} \to \infty$ estimates is presented.
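For illustration, a generic two-state fit of the kind used to control excited-state contamination can be written in a few lines; the functional form below is the standard two-state ansatz for a two-point correlator (not the paper's three-point ratio fits), and all data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_state(t, A0, E0, r, dE):
    """Ground state plus one excited state: A0 e^{-E0 t} (1 + r e^{-dE t})."""
    return A0 * np.exp(-E0 * t) * (1.0 + r * np.exp(-dE * t))

t = np.arange(2, 20)
rng = np.random.default_rng(4)
data = two_state(t, 1.0, 0.5, 0.6, 0.4) * (1 + 0.01 * rng.standard_normal(t.size))

popt, pcov = curve_fit(two_state, t, data, p0=(1.0, 0.4, 0.5, 0.5))
print("E0 =", popt[1].round(3), " Delta E =", popt[3].round(3))
```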
Stochastic dynamic analysis of marine risers considering Gaussian system uncertainties
NASA Astrophysics Data System (ADS)
Ni, Pinghe; Li, Jun; Hao, Hong; Xia, Yong
2018-03-01
This paper performs the stochastic dynamic response analysis of marine risers with material uncertainties, i.e., in the mass density and elastic modulus, by using the Stochastic Finite Element Method (SFEM) and a model reduction technique. These uncertainties are assumed to have Gaussian distributions. The random mass density and elastic modulus are represented by using the Karhunen-Loève (KL) expansion. The Polynomial Chaos (PC) expansion is adopted to represent the vibration response because the covariance of the output is unknown. Model reduction based on the Iterated Improved Reduced System (IIRS) technique is applied to eliminate the PC coefficients of the slave degrees of freedom to reduce the dimension of the stochastic system. Monte Carlo Simulation (MCS) is conducted to obtain the reference response statistics. Two numerical examples are studied in this paper. The response statistics from the proposed approach are compared with those from MCS. It is noted that the computational time is significantly reduced while accuracy is maintained. The results demonstrate the efficiency of the proposed approach for stochastic dynamic response analysis of marine risers.
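A truncated Karhunen-Loève expansion of a one-dimensional Gaussian field can be sketched as follows; the exponential covariance kernel, correlation length, and modulus statistics are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

n, L, corr_len = 200, 100.0, 20.0                 # grid, length, correlation
x = np.linspace(0.0, L, n)
Cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # covariance kernel

lam, phi = np.linalg.eigh(Cov)                    # eigenpairs, ascending
lam, phi = lam[::-1], phi[:, ::-1]                # reorder: descending

m = 10                                            # truncation order
rng = np.random.default_rng(5)
xi = rng.standard_normal(m)                       # independent N(0, 1) weights

mean_E, cov_E = 210e9, 0.1                        # assumed mean and COV
field = mean_E * (1.0 + cov_E * (phi[:, :m] * np.sqrt(lam[:m])) @ xi)
# `field` is one realization of the random elastic modulus along the riser.
```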
Xie, Li; Chen, Jing; McMickle, Anthony; Awar, Nadia; Nady, Soad; Sredni, Benjamin; Drew, Paul D.; Yu, Shiguang
2014-01-01
We reported that AS101 (organotellurium compound, trichloro(dioxoethylene-O,O′) tellurate) inhibited the differentiation of Th17 cells and reduced the production of IL-17 and GM-CSF. In addition, AS101 promoted the production of IL-2 in activated T cells. Flow cytometric analysis showed that AS101 inhibited Th17 cell proliferation. AS101 blocked the activation of transcriptional factor NFAT, Stat3, and RORγt, and increased activation of Erk1/2, suggesting a mechanism of action of AS101. We further demonstrated that AS101 was effective in amelioration of experimental autoimmune encephalomyelitis (EAE), an animal model of multiple sclerosis. Finally, by real-time PCR analysis we showed that AS101 reduces the IL-17, IFN-γ, GM-CSF, and IL-6 mRNA expression in inflammatory cells of spinal cords. Additionally, flow cytometry analysis also indicated that the CD4+ T cells and IL-17 and GM-CSF-producing cells were reduced in the spinal cords of AS101 treated mice compared to those treated with PBS. PMID:24975323
Impact of Glucose Measurement Processing Delays on Clinical Accuracy and Relevance
Jangam, Sujit R.; Hayter, Gary; Dunn, Timothy C.
2013-01-01
Background In a hospital setting, glucose is often measured from venous blood in the clinical laboratory. However, laboratory glucose measurements are typically not available in real time. In practice, turn-around times for laboratory measurements can be minutes to hours. This analysis assesses the impact of turn-around time on the effective clinical accuracy of laboratory measurements. Methods Data obtained from an earlier study with 58 subjects with type 1 diabetes mellitus (T1DM) were used for this analysis. In the study, glucose measurements using a YSI glucose analyzer were obtained from venous blood samples every 15 min while the subjects were at the health care facility. To simulate delayed laboratory results, each YSI glucose value from a subject was paired with one from a later time point (from the same subject) separated by 15, 30, 45, and 60 min. To assess the clinical accuracy of a delayed YSI result relative to a real-time result, the percentage of YSI pairs that meet the International Organization for Standardization (ISO) 15197:2003(E) standard for glucose measurement accuracy (±15 mg/dl for blood glucose < 75 mg/dl, ±20% for blood glucose ≥ 75 mg/dl) was calculated. Results It was observed that delays of 15 min or more reduce clinical accuracy below the ISO 15197:2003(E) recommendation of 95%. The accuracy was less than 65% for delays of 60 min. Conclusion This analysis suggests that processing delays in glucose measurements reduce the clinical relevance of results in patients with T1DM and may similarly degrade the clinical value of measurements in other patient populations. PMID:23759399
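The pairing-and-scoring computation is easy to reproduce; the sketch below applies the ISO 15197:2003(E) limits quoted above to a simulated glucose trace (the study itself used measured YSI values, so the percentages here are only illustrative).

```python
import numpy as np

def iso_ok(ref, meas):
    """ISO 15197:2003(E): within ±15 mg/dl below 75 mg/dl, else within ±20%."""
    return np.where(ref < 75, np.abs(meas - ref) <= 15,
                    np.abs(meas - ref) <= 0.20 * ref)

rng = np.random.default_rng(6)
glucose = np.clip(150 + np.cumsum(rng.normal(0, 8, 200)), 40, 400)  # 15-min steps

for delay in (1, 2, 3, 4):                    # 15, 30, 45, 60 min delays
    ref, late = glucose[:-delay], glucose[delay:]
    print(f"{15 * delay:>2} min delay: {100 * iso_ok(ref, late).mean():.0f}% "
          "within ISO limits")
```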
NASA Technical Reports Server (NTRS)
Sahai, Ranjana; Pierce, Larry; Cicolani, Luigi; Tischler, Mark
1998-01-01
Helicopter slung load operations are common in both military and civil contexts. The slung load adds load rigid body modes, sling stretching, and load aerodynamics to the system dynamics, which can degrade system stability and handling qualities, and reduce the operating envelope of the combined system below that of the helicopter alone. Further, the effects of the load on system dynamics vary significantly among the large range of loads, slings, and flight conditions that a utility helicopter will encounter in its operating life. In this context, military helicopters and loads are often qualified for slung load operations via flight tests, which can be time-consuming and expensive. One way to reduce the cost and time required to carry out these tests and generate quantitative data more readily is to provide an efficient method for analysis during the flight, so that numerous test points can be evaluated in a single flight test, with evaluations performed in near real time following each test point and prior to clearing the aircraft to the next point. Methodology for this was implemented at Ames and demonstrated in slung load flight tests in 1997 and was improved for additional flight tests in 1999. The parameters of interest for the slung load tests are aircraft handling qualities parameters (bandwidth and phase delay), stability margins (gain and phase margin), and load pendulum roots (damping and natural frequency). A procedure for the identification of these parameters from frequency sweep data was defined using the CIFER software package. CIFER is a comprehensive interactive package of utilities for frequency domain analysis previously developed at Ames for aeronautical flight test applications. It has been widely used in the US on a variety of aircraft, including some early flight-time analysis applications.
Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty
NASA Astrophysics Data System (ADS)
Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang
2016-12-01
Centrifugal compressors often suffer defects such as impeller cracking, resulting in forced outage of the entire plant. Damage diagnostics and condition monitoring of such turbomachinery systems have become an increasingly important and powerful tool to prevent potential failure in components and reduce unplanned forced outages and maintenance costs, while improving the reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients, which represent multiple time-frequency resolutions of the signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficients to remove possible imperfections. The Bayesian posterior-odds-ratio approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density estimated by the Welch method is utilized to evaluate the effectiveness of the Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce the dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework are demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of the vibration signals and principal components.
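The decomposition-then-reduction chain can be sketched with standard libraries; the wavelet choice, decomposition level, channel count, and per-level energy features are assumptions, and scikit-learn's PCA is used here as a simple stand-in for the probabilistic PCA step (omitting the Bayesian cleansing of coefficients).

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
signals = rng.standard_normal((8, 4096))            # 8 vibration channels

features = []
for ch in signals:
    coeffs = pywt.wavedec(ch, "db4", level=4)       # approx + 4 detail levels
    features.append([np.sum(c ** 2) for c in coeffs])   # energy per level

pca = PCA(n_components=2)                           # dimensionality reduction
scores = pca.fit_transform(np.asarray(features))
print(pca.explained_variance_ratio_.round(2))
```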
Can a clinical placement influence stigma? An analysis of measures of social distance.
Moxham, Lorna; Taylor, Ellie; Patterson, Christopher; Perlman, Dana; Brighton, Renee; Sumskis, Susan; Keough, Emily; Heffernan, Tim
2016-09-01
The way people who experience mental illness are perceived by health care professionals, which often includes stigmatising attitudes, can have a significant impact on treatment outcomes and on their quality of life. To determine whether stigma towards people with mental illness varied for undergraduate nursing students who attended a non-traditional clinical placement called Recovery Camp compared to students who attended a 'typical' mental health clinical placement. Quasi-experimental. Seventy-nine third-year nursing students were surveyed; n=40 attended Recovery Camp (intervention), n=39 (comparison group) attended a 'typical' mental health clinical placement. All students completed the Social Distance Scale (SDS) pre- and post-placement and at three-month follow-up. Data analysis consisted of a one-way repeated measures analysis of variance (ANOVA) exploring parameter estimates between group scores across three time points. Two secondary repeated measures ANOVAs were performed to demonstrate the differences in SDS scores for each group across time. Pairwise comparisons demonstrated the differences between time intervals. A statistically significant difference in ratings of stigma between the intervention group and the comparison group existed. Parameter estimates revealed that stigma ratings for the intervention group were significantly reduced post-placement and remained consistently low at three-month follow-up. There was no significant difference in ratings of stigma for the comparison group over time. Students who attended Recovery Camp reported significant decreases in stigma towards people with a mental illness over time, compared to the typical placement group. Findings suggest that a therapeutic recreation based clinical placement was more successful in reducing stigma regarding mental illness in undergraduate nursing students compared to those who attended typical mental health clinical placements. Copyright © 2016 Elsevier Ltd. All rights reserved.
Aceti, Arianna; Gori, Davide; Barone, Giovanni; Callegari, Maria Luisa; Fantini, Maria Pia; Indrio, Flavia; Maggio, Luca; Meneghin, Fabio; Morelli, Lorenzo; Zuccotti, Gianvincenzo; Corvaglia, Luigi
2016-01-01
Probiotics have been linked to a reduction in the incidence of necrotizing enterocolitis and late-onset sepsis in preterm infants. Recently, probiotics have also been shown to reduce the time to achieve full enteral feeding (FEF). However, the relationship between FEF achievement and type of feeding in infants treated with probiotics has not yet been explored. The aim of this systematic review and meta-analysis was to evaluate the effect of probiotics in reducing time to achieve FEF in preterm infants, according to type of feeding (exclusive human milk (HM) vs. formula). Randomized controlled trials involving preterm infants receiving probiotics and reporting on time to reach FEF were included in the systematic review. Trials reporting on outcome according to type of feeding (exclusive HM vs. formula) were included in the meta-analysis. Fixed-effect or random-effects models were used as appropriate. Results were expressed as mean difference (MD) with 95% confidence interval (CI). Twenty-five studies were included in the systematic review. In the five studies recruiting exclusively HM-fed preterm infants, those treated with probiotics reached FEF approximately 3 days before controls (MD −3.15 days, 95% CI −5.25 to −1.05; p = 0.003). Neither of the two studies reporting on exclusively formula-fed infants showed any difference between infants receiving probiotics and controls in terms of FEF achievement. The limited number of included studies did not allow testing for other subgroup differences between HM- and formula-fed infants. However, if confirmed in further studies, the 3-day reduction in time to achieve FEF in exclusively HM-fed preterm infants might have significant implications for their clinical management. PMID:27483319
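A pooled estimate like MD −3.15 (95% CI −5.25 to −1.05) has the standard inverse-variance fixed-effect form; the per-study values below are hypothetical placeholders, not data from the included trials.

```python
import numpy as np

md = np.array([-2.0, -4.1, -3.5, -1.8, -3.9])   # per-study mean differences
se = np.array([1.2, 1.5, 0.9, 1.4, 1.1])        # their standard errors

w = 1.0 / se**2                                  # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"MD {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```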
Knaapen, Paul; de Mulder, Maarten; van der Zant, Friso M; Peels, Hans O; Twisk, Jos W R; van Rossum, Albert C; Cornel, Jan H; Umans, Victor A W M
2009-02-01
Primary percutaneous coronary intervention (PCI) performed in large community hospitals without cardiac surgery back-up facilities (off-site) reduces door-to-balloon time compared with emergency transferal to tertiary interventional centers (on-site). The present study was performed to explore whether off-site PCI for acute myocardial infarction results in reduced infarct size. One hundred twenty-eight patients with acute ST-segment elevation myocardial infarction were randomly assigned to undergo primary PCI at the off-site center (n = 68) or to transferal to an on-site center (n = 60). Three days after PCI, (99m)Tc-sestamibi SPECT was performed to estimate infarct size. Off-site PCI significantly reduced door-to-balloon time compared with on-site PCI (94 ± 54 versus 125 ± 59 min, respectively, p < 0.01), although symptoms-to-treatment time was not significantly reduced (257 ± 211 versus 286 ± 146 min, respectively, p = 0.39). Infarct size was comparable between treatment centers (16 ± 15 versus 14 ± 12%, respectively, p = 0.35). Multivariate analysis revealed that TIMI 0/1 flow grade at initial coronary angiography (OR 3.125, 95% CI 1.17-8.33, p = 0.023), anterior wall localization of the myocardial infarction (OR 3.44, 95% CI 1.38-8.55, p < 0.01), and development of pathological Q-waves (OR 5.07, 95% CI 2.10-12.25, p < 0.01) were independent predictors of an infarct size > 12%. Off-site PCI reduces door-to-balloon time compared with transferal to a remote on-site interventional center but does not reduce infarct size. Instead, pre-PCI TIMI 0/1 flow, anterior wall infarct localization, and development of Q-waves are more important predictors of infarct size.
NASA Astrophysics Data System (ADS)
Alves de Mesquita, Jayme; Lopes de Melo, Pedro
2004-03-01
Thermally sensitive devices (thermistors) have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices cannot provide a good characterization of fast events such as hypopneas. The nasal pressure recording (NPR) technique has recently been suggested to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) to describe the development of a flexible NPR device and (2) to evaluate the performance of this device compared with pneumotachographs (PNTs) and thermistors. After the design details are described, the system's static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p < 0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations; the results showed that the developed device's response was as good as that of the PNT and around 100 times faster (τ = 5.3 ms) than thermistors (τ = 512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples and confirm the good features achieved in the engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to more accurate implementation of medical research projects and to improved diagnosis of sleep-breathing disorders.
NASA Astrophysics Data System (ADS)
Tanaka, Rie; Matsuda, Hiroaki; Sanada, Shigeru
2017-03-01
The density of lung tissue as demonstrated on imagery depends on the relative increases and decreases in the volumes of air and lung vessels per unit volume of lung. Therefore, a time-series analysis of lung texture can be used to evaluate relative pulmonary function. This study was performed to assess time-series analysis of lung texture on dynamic chest radiographs during respiration, and to demonstrate its usefulness in the diagnosis of pulmonary impairments. Sequential chest radiographs of 30 patients were obtained using a dynamic flat-panel detector (FPD; 100 kV, 0.2 mAs/pulse, 15 frames/s, SID = 2.0 m; prototype, Konica Minolta). Imaging was performed during respiration, and 210 images were obtained over 14 seconds. Commercial bone suppression image-processing software (Clear Read Bone Suppression; Riverain Technologies, Miamisburg, Ohio, USA) was applied to the sequential chest radiographs to create corresponding bone suppression images. Average pixel value, standard deviation (SD), kurtosis, and skewness were calculated based on a density histogram analysis in lung regions. Regions of interest (ROIs) were manually located in the lungs, and the same ROIs were traced by the template matching technique during respiration. Average pixel value effectively differentiated regions with ventilatory defects from normal lung tissue. The average pixel values in normal areas changed dynamically in synchronization with the respiratory phase, whereas those in regions of ventilatory defects showed reduced variations in pixel value. There were no significant differences between ventilatory defects and normal lung tissue in the other parameters. We confirmed that time-series analysis of lung texture is useful for the evaluation of pulmonary function in dynamic chest radiography during respiration; pulmonary impairments were detected as reduced changes in pixel value. This technique is a simple, cost-effective diagnostic tool for the evaluation of regional pulmonary function.
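The per-frame histogram statistics can be sketched as follows; the image stack is synthetic, and a fixed ROI stands in for the template-matched ROI tracking described above.

```python
import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(8)
# 14 s at 15 frames/s -> 210 frames; a small synthetic 12-bit image stack.
frames = rng.integers(0, 4096, size=(210, 256, 256), dtype=np.uint16)

r0, c0, h, w = 100, 90, 32, 32            # assumed fixed ROI (no tracking)
stats = []
for f in frames:
    roi = f[r0:r0 + h, c0:c0 + w].astype(float)
    stats.append((roi.mean(), roi.std(),
                  kurtosis(roi, axis=None), skew(roi, axis=None)))

mean_t = np.array([s[0] for s in stats])
# Ventilated lung should show a respiratory-phase oscillation in mean_t;
# a flat trace over the 14 s sequence suggests a ventilatory defect.
```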
Functional analysis and treatment of the diurnal bruxism of a 16-year-old girl with autism.
Armstrong, Amy; Knapp, Vicki Madaus; McAdam, David B
2014-01-01
Bruxism is defined as the clenching and grinding of teeth. This study used a functional analysis to examine whether the bruxism of a 16-year-old girl with autism was maintained by automatic reinforcement or social consequences. A subsequent component analysis of the intervention package described by Barnoy, Najdowski, Tarbox, Wilke, and Nollet (2009) showed that a vocal reprimand (e.g., "stop grinding") effectively reduced the participant's bruxism. Results were maintained across time, and effects extended to novel staff members. © Society for the Experimental Analysis of Behavior.
l-fenfluramine in tests of dominance and anxiety in the rat.
File, S E; Guardiola-Lemaitre, B J
1988-01-01
l-Fenfluramine (1.25 and 2.5 mg/kg) significantly reduced the success of dominant rats competing with untreated middle-rank rats for chocolate. In resident rats, l-fenfluramine (2.5 mg/kg) significantly increased the number of submissions, and the time spent submitting, to untreated rats intruding into their home-cage territory; it also significantly reduced the number of kicks directed at, and the time spent kicking, the intruder, and the incidence of, and time spent in, aggressively grooming the intruder. When the intruder rats were treated with l-fenfluramine, the only significant change was a decrease in the number of wrestling bouts and the time spent wrestling. Since l-fenfluramine did not change other behaviours in this test (e.g. sniffing the opponent), the decrease in dominance behaviours was probably not secondary to nonspecific sedation. In the social interaction test of anxiety, l-fenfluramine (2.5 and 5 mg/kg) significantly reduced the time spent in active social interaction, and decreased motor activity. Analyses of covariance indicated that these were two independent effects. In the elevated plus-maze, l-fenfluramine (1.25-5 mg/kg) significantly decreased the percentage of entries made onto the open arms, and (2.5 and 5 mg/kg) significantly decreased the percentage of time spent on the open arms. The total number of arm entries was reduced by all doses (0.625-5 mg/kg). Analysis of covariance indicated that the decrease in the percentage of time spent on the open arms was secondary to the drop in overall activity. Thus there was no evidence of anxiolytic action in either of these tests; the changes indicated, if anything, anxiogenic effects.(ABSTRACT TRUNCATED AT 250 WORDS)
Reich, H; Moens, Y; Braun, C; Kneissl, S; Noreikat, K; Reske, A
2014-12-01
Quantitative computed tomographic analysis (qCTA) is an accurate but time-intensive method used to quantify the volume, mass and aeration of the lungs. The aim of this study was to validate a time-efficient interpolation technique for application of qCTA in ponies. Forty-one thoracic computed tomographic (CT) scans obtained from eight anaesthetised ponies positioned in dorsal recumbency were included. Total lung volume and mass and their distribution into four compartments (non-aerated, poorly aerated, normally aerated and hyperaerated; defined based on the attenuation in Hounsfield units) were determined for the entire lung from all 5 mm thick CT images, 59 (55-66) per animal. An interpolation technique validated for use in humans was then applied to calculate qCTA results for lung volumes and masses from only 10, 12, and 14 selected CT images per scan. The time required for both procedures was recorded. Results were compared statistically using the Bland-Altman approach. The bias ± 2 SD for total lung volume calculated from interpolation of 10, 12, and 14 CT images was -1.2 ± 5.8%, 0.1 ± 3.5%, and 0.0 ± 2.5%, respectively. The corresponding results for total lung mass were -1.1 ± 5.9%, 0.0 ± 3.5%, and 0.0 ± 3.0%. The average time for analysis of one thoracic CT scan using the interpolation method was 1.5-2 h, compared to 8 h for analysis of all images of one complete thoracic CT scan. The calculation of pulmonary qCTA data by interpolation from 12 CT images was applicable to equine lung CT scans and reduced the time required for analysis by 75%. Copyright © 2014 Elsevier Ltd. All rights reserved.
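As a rough illustration of the interpolation idea (assumptions only, not the validated protocol), per-slice compartment areas measured on the selected CT images can be interpolated along the z-axis and integrated to a volume:

```python
# Illustrative only: interpolate per-slice compartment areas from the
# selected CT images onto the full slice grid, then integrate
# area x thickness to a compartment volume.
import numpy as np

def compartment_volume(areas_sel_cm2, z_sel_cm, z_all_cm):
    areas = np.interp(z_all_cm, z_sel_cm, areas_sel_cm2)  # per-slice areas
    thickness = np.diff(z_all_cm).mean()                  # 0.5 cm slices here
    return areas.sum() * thickness                        # volume in cm^3

z_all = np.arange(0, 30, 0.5)                # 60 slices of a 30 cm lung field
z_sel = np.linspace(0, 29.5, 12)             # the 12 analyzed slices
areas = 20 * np.sin(np.linspace(0.1, np.pi - 0.1, 12))   # toy area profile
print(f"{compartment_volume(areas, z_sel, z_all):.0f} cm^3")
```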
Fast Learning for Immersive Engagement in Energy Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M
Fast computation is critical for immersive engagement with, and learning from, energy simulations, and would be furthered by a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times (typically less than one minute of wall-clock time) suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
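A minimal sketch of the reduced-form idea, assuming a generic tabular input/output interface to the full simulation (the toy "full model" below stands in for an actual NREL code): fit a fast statistical surrogate to a batch of simulation runs, document its accuracy, and answer new queries in milliseconds.

```python
# Hedged sketch: a machine-learned "reduced form" of an expensive simulation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))                  # 500 sampled input scenarios
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2] + rng.normal(0, 0.01, 500)  # "full" runs

surrogate = RandomForestRegressor(n_estimators=200, oob_score=True,
                                  random_state=0).fit(X, y)
print(f"OOB R^2 = {surrogate.oob_score_:.3f}")  # crude accuracy documentation
y_hat = surrogate.predict(rng.uniform(size=(1, 4)))  # near-instant estimate
```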
Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.
2013-01-01
Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945
Study of optoelectronic switch for satellite-switched time-division multiple access
NASA Technical Reports Server (NTRS)
Su, Shing-Fong; Jou, Liz; Lenart, Joe
1987-01-01
The use of optoelectronic switching for satellite-switched time-division multiple access will improve the isolation and reduce the crosstalk of an IF switch matrix. The results of a study on optoelectronic switching are presented. Tasks included a literature search, a system requirements study, candidate switching architecture analysis, and switch model optimization. The results show that the power divider and crossbar switching architectures are good candidates for an IF switch matrix.
Improved result on stability analysis of discrete stochastic neural networks with time delay
NASA Astrophysics Data System (ADS)
Wu, Zhengguang; Su, Hongye; Chu, Jian; Zhou, Wuneng
2009-04-01
This Letter investigates the problem of exponential stability for discrete stochastic time-delay neural networks. By defining a novel Lyapunov functional, an improved delay-dependent exponential stability criterion is established in terms of the linear matrix inequality (LMI) approach. Meanwhile, the computational complexity of the newly established stability condition is reduced because fewer variables are involved. A numerical example is given to illustrate the effectiveness and the benefits of the proposed method.
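The Letter's delay-dependent stochastic criterion is not reproduced here, but the LMI machinery it relies on can be illustrated with the delay-free deterministic special case: a discrete-time system x(k+1) = Ax(k) is exponentially stable if there exists P ≻ 0 with AᵀPA - P ≺ 0, a feasibility problem that a semidefinite solver checks directly. The matrix A below is an arbitrary example.

```python
# Hedged sketch: LMI feasibility check for discrete-time stability
# (delay-free special case, not the Letter's criterion).
import cvxpy as cp
import numpy as np

A = np.array([[0.5, 0.1],
              [0.0, 0.8]])                       # example system matrix
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),             # P positive definite
               A.T @ P @ A - P << -eps * np.eye(n)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print("LMI feasible -> stable" if problem.status == cp.OPTIMAL else "infeasible")
```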
Determining Reduced Order Models for Optimal Stochastic Reduced Order Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonney, Matthew S.; Brake, Matthew R.W.
2015-08-01
The use of parameterized reduced order models (PROMs) within the stochastic reduced order model (SROM) framework is a logical progression for both methods. In this report, five different parameterized reduced order models are selected and critiqued against each other and against the truth model for the example of the Brake-Reuss beam. The models are: a Taylor series using finite differences, a proper orthogonal decomposition of the output, a Craig-Bampton representation of the model, a method that uses hyper-dual numbers to determine the sensitivities, and a meta-model method that uses the hyper-dual results and constructs a polynomial curve to better represent the output data. The methods are compared against a parameter sweep and a distribution propagation in which the first four statistical moments are used as a comparison. Each method produces very accurate results, with the Craig-Bampton reduction being the least accurate. The models are also compared based on the time required to evaluate each model, where the meta-model requires the least computation time by a significant amount. Each of the five models provided accurate results in a reasonable time frame. The determination of which model to use depends on the availability of the high-fidelity model and how many evaluations can be performed. The output distribution is examined using a large Monte Carlo simulation along with a reduced simulation using Latin hypercube and the stochastic reduced order model sampling technique. Both techniques produced accurate results. The stochastic reduced order modeling technique produced less error when compared to exhaustive sampling for the majority of methods.
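Of the five PROMs, the proper orthogonal decomposition is the easiest to sketch. A minimal, generic version follows; the snapshot data and truncation level are invented for illustration.

```python
# Minimal POD sketch: build a reduced basis from output snapshots via the SVD.
import numpy as np

rng = np.random.default_rng(0)
snapshots = rng.random((1000, 50))            # 50 full-model runs, 1000 DOFs
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1   # modes keeping 99.9% of energy
basis = U[:, :r]                              # (1000, r) POD basis
coords = basis.T @ snapshots                  # reduced coordinates per run
err = np.linalg.norm(snapshots - basis @ coords) / np.linalg.norm(snapshots)
print(r, err)                                 # retained rank, relative error
```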
Spatial and temporal study of nitrate concentration in groundwater by means of coregionalization
D'Agostino, V.; Greene, E.A.; Passarella, G.; Vurro, M.
1998-01-01
Spatial and temporal behavior of hydrochemical parameters in groundwater can be studied using tools provided by geostatistics. The cross-variogram can be used to measure the spatial increments between observations at two given times as a function of distance (spatial structure). Taking into account the existence of such a spatial structure, two different data sets (sampled at two different times), representing concentrations of the same hydrochemical parameter, can be analyzed by cokriging in order to reduce the uncertainty of the estimation. In particular, if one of the two data sets is a subset of the other (that is, an undersampled set), cokriging allows us to study the spatial distribution of the hydrochemical parameter at that time, while also considering the statistical characteristics of the full data set established at a different time. This paper presents an application of cokriging by using temporal subsets to study the spatial distribution of nitrate concentration in the aquifer of the Lucca Plain, central Italy. Three data sets of nitrate concentration in groundwater were collected during three different periods in 1991. The first set was from 47 wells, but the second and the third are undersampled and represent 28 and 27 wells, respectively. Comparing the result of cokriging with ordinary kriging showed an improvement of the uncertainty in terms of reducing the estimation variance. The application of cokriging to the undersampled data sets reduced the uncertainty in estimating nitrate concentration and at the same time decreased the cost of the field sampling and laboratory analysis.
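The structural tool behind this cokriging application is the cross-variogram. A hedged numpy sketch of its empirical estimate from collocated wells sampled at two times (coordinates, values, and lag bins are all hypothetical):

```python
# Hypothetical sketch: empirical cross-variogram between two campaigns.
import numpy as np

def empirical_cross_variogram(xy, z1, z2, bin_edges):
    """xy: (n, 2) well coordinates; z1, z2: one parameter (e.g. nitrate)
    at the same wells at two sampling times; bin_edges: lag-distance bins.
    Returns one semivariance value per lag bin (NaN for empty bins)."""
    i, j = np.triu_indices(len(xy), k=1)
    lags = np.linalg.norm(xy[i] - xy[j], axis=1)
    prod = (z1[i] - z1[j]) * (z2[i] - z2[j])
    out = np.full(len(bin_edges) - 1, np.nan)
    which = np.digitize(lags, bin_edges) - 1
    for b in range(len(out)):
        sel = which == b
        if sel.any():
            out[b] = prod[sel].mean() / 2.0
    return out
```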
Gabbert, Dominik D; Entenmann, Andreas; Jerosch-Herold, Michael; Frettlöh, Felicitas; Hart, Christopher; Voges, Inga; Pham, Minh; Andrade, Ana; Pardun, Eileen; Wegner, P; Hansen, Traudel; Kramer, Hans-Heiner; Rickers, Carsten
2013-12-01
The determination of right ventricular volumes and function is of increasing interest for the postoperative care of patients with congenital heart defects. The presentation of volumetry data in terms of volume-time curves allows a comprehensive functional assessment. By using manual contour tracing, the generation of volume-time curves is exceedingly time-consuming. This study describes a fast and precise method for determining volume-time curves for the right ventricle and for the right ventricular outflow tract. The method applies contour detection and includes a feature for identifying the right ventricular outflow tract volume. The segregation of the outflow tract is performed by four-dimensional curved smooth boundary surfaces defined by prespecified anatomical landmarks. The comparison with manual contour tracing demonstrates that the method is accurate and improves the precision of the measurement. Compared to manual contour tracing the bias is <0.1% ± 4.1% (right ventricle) and -2.6% ± 20.0% (right ventricular outflow tract). The standard deviations of inter- and intraobserver variabilities for determining the volume of the right ventricular outflow tract are reduced to less than half the values of manual contour tracing. The time consumption per patient is reduced from 341 ± 80 min (right ventricle) and 56 ± 11 min (right ventricular outflow tract) using manual contour tracing to 46 ± 9 min for a combined analysis of right ventricle and right ventricular outflow tract. The analysis of volume-time curves for the right ventricle and its outflow tract discloses new evaluation methods in clinical routine and science. Copyright © 2013 Wiley Periodicals, Inc.
The quick acquisition technique for laser communication between LEO and GEO
NASA Astrophysics Data System (ADS)
Zhang, Li-zhong; Zhang, Rui-qin; Li, Yong-hao; Meng, Li-xin; Li, Xiao-ming
2013-08-01
Sight-axis alignment between two laser communication terminals is accomplished by the acquisition operation, which is the prerequisite for establishing a free-space optical communication link. For LEO (Low Earth Orbit)-Ground and LEO-GEO (Geostationary Earth Orbit) links in particular, the earth periodically blocks the laser transmission and interrupts communication, so the effective time for each communication window is very short (several minutes to dozens of minutes), and the terminals must reacquire each other to rebuild the link. On the basis of an analysis of traditional methods, this paper presents a new idea: using a long beacon light instead of the circular beacon light, so that the original two-dimensional raster/spiral scanning is replaced by one-dimensional scanning. This method reduces the setup time and decreases the failure probability of acquisition for the LEO-GEO laser communication link. The paper first analyzes the external constraint conditions in the acquisition phase, then establishes the acquisition algorithm models, carries out an optimization analysis of the parameters of the acquisition unit, and reports ground validation experiments of the acquisition strategy. The experiments and analysis show that, compared with traditional capturing methods, the presented method shortens the capturing time by about 40% and reduces the failure probability of capturing by about 30%. The method is therefore significant for the LEO-GEO laser communication link.
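A rough back-of-envelope of why the one-dimensional scan helps (all numbers invented for illustration): the dwell count for a two-dimensional raster over the pointing-uncertainty cone scales with the square of the cone-to-beam ratio, while a long (fan-shaped) beacon only needs to sweep one axis.

```python
# Toy comparison of acquisition dwell counts (illustrative numbers only).
uncertainty_deg = 1.0        # half-angle of the pointing-uncertainty cone
beam_deg = 0.05              # beacon divergence (circular case)
ratio = 2 * uncertainty_deg / beam_deg
dwells_2d = ratio ** 2       # raster/spiral over the full cone
dwells_1d = ratio            # one sweep with a long beacon covering one axis
print(dwells_2d, dwells_1d)  # 1600.0 vs 40.0 for these numbers
```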
Liu, Xuejiao; Zhang, Dongdong; Liu, Yu; Sun, Xizhuo; Han, Chengyi; Wang, Bingyuan; Ren, Yongcheng; Zhou, Junmei; Zhao, Yang; Shi, Yuanyuan; Hu, Dongsheng; Zhang, Ming
2017-05-01
Despite the inverse association between physical activity (PA) and incident hypertension, a comprehensive assessment of the quantitative dose-response association between PA and hypertension has not been reported. We performed a meta-analysis, including dose-response analysis, to quantitatively evaluate this association. We searched the PubMed and Embase databases for articles published up to November 1, 2016. Random-effects generalized least squares regression models were used to assess the quantitative association between PA and hypertension risk across studies. Restricted cubic splines were used to model the dose-response association. We identified 22 articles (29 studies) investigating the risk of hypertension with leisure-time PA or total PA, including 330,222 individuals and 67,698 incident cases of hypertension. The risk of hypertension was reduced by 6% (relative risk, 0.94; 95% confidence interval, 0.92-0.96) with each 10 metabolic equivalent of task (MET) h/wk increment of leisure-time PA. We found no evidence of a nonlinear dose-response association of PA and hypertension (P for nonlinearity = 0.094 for leisure-time PA and 0.771 for total PA). With the linear cubic spline model, compared with inactive individuals, those who met the guideline-recommended minimum level of moderate PA (10 MET h/wk) had a 6% reduced risk of hypertension (relative risk, 0.94; 95% confidence interval, 0.92-0.97). This meta-analysis suggests that additional benefits for hypertension prevention occur as the amount of PA increases. © 2017 American Heart Association, Inc.
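Under the reported log-linear dose-response, the relative risk compounds multiplicatively with dose; a two-line check using only the published RR of 0.94 per 10 MET h/wk:

```python
# Arithmetic check of the log-linear dose-response: RR 0.94 per 10 MET h/wk.
rr_per_10 = 0.94
for met_h in (10, 20, 30, 40):
    print(f"{met_h:>2} MET h/wk: RR = {rr_per_10 ** (met_h / 10):.3f}")
# 10 -> 0.940, 20 -> 0.884, 30 -> 0.831, 40 -> 0.781
```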
Carroll, Robert; Metcalfe, Chris; Steeg, Sarah; Davies, Neil M; Cooper, Jayne; Kapur, Nav; Gunnell, David
2016-01-01
Clinical guidelines have recommended psychosocial assessment of self-harm patients for years, yet estimates of its impact on the risk of repeat self-harm vary. Assessing the association of psychosocial assessment with risk of repeat self-harm is challenging because of confounding by indication. We analysed data from a cohort study of 15,113 patients presenting to the emergency departments of three UK hospitals to investigate the association of psychosocial assessment with risk of repeat hospital presentation for self-harm. Time of day of hospital presentation was used as an instrument for psychosocial assessment, in an attempt to control for confounding by indication. Conventional regression analysis suggested psychosocial assessment was not associated with risk of repeat self-harm within 12 months (risk difference (RD) 0.00, 95% confidence interval (95% CI) -0.01 to 0.02). In contrast, instrumental variable (IV) analysis suggested the risk of repeat self-harm was reduced by 18% (RD -0.18, 95% CI -0.32 to -0.03) in patients receiving a psychosocial assessment. However, the instrument of time of day did not remove all potential effects of confounding by indication, suggesting the IV effect estimate may be biased. We found that psychosocial assessments reduce the risk of repeat self-harm. This is in line with other non-randomised studies based on populations in which allocation to assessment was less subject to confounding by indication. However, as our instrument did not fully balance important confounders across time of day, the IV effect estimate should be interpreted with caution.
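With a binary instrument, the IV estimate reduces to the Wald ratio: the intention-to-treat difference divided by the first-stage difference in assessment rates. A self-contained sketch on synthetic data (all numbers invented, not the study's):

```python
# Wald/IV sketch: binary instrument z (time of day), binary exposure x
# (psychosocial assessment), binary outcome y (repeat self-harm).
import numpy as np

def wald_iv_rd(z, x, y):
    """Risk-difference IV estimate of the effect of x on y, instrumented by z."""
    itt = y[z == 1].mean() - y[z == 0].mean()          # intention-to-treat
    first_stage = x[z == 1].mean() - x[z == 0].mean()  # uptake shift
    return itt / first_stage

rng = np.random.default_rng(1)
z = rng.integers(0, 2, 15000)
x = (rng.random(15000) < 0.4 + 0.2 * z).astype(int)    # instrument shifts uptake
y = (rng.random(15000) < 0.15 - 0.05 * x).astype(int)  # true RD = -0.05
print(wald_iv_rd(z, x, y))                             # approximately -0.05
```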
Evaluation of a New Digital Automated Glycemic Pattern Detection Tool.
Comellas, María José; Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg
2017-11-01
Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. Such tools are emerging in broad variety, in a more or less nonevaluated manner. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying, early in the tool's development and implementation, areas for improvement and potential safety risks. In a multicenter web-based evaluation, 37 endocrinologists were asked to assess the glycemic patterns of 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent analyzing each report and on agreement about the presence or absence of defined patterns. The eDetecta module markedly reduced the time needed to analyze each case relative to nonautomated analysis of the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min). Agreement between endocrinologists and eDetecta varied depending on the pattern, with a high level of agreement on patterns of glycemic variability. Further analysis of areas of low agreement identified where the algorithms used could be improved to optimize trend pattern identification. eDetecta was a useful tool for glycemic pattern detection, helping clinicians reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study.
Liang, Renxing; Grizzle, Robert S.; Duncan, Kathleen E.; McInerney, Michael J.; Suflita, Joseph M.
2014-01-01
Thermophilic sulfide-producing microorganisms from an oil pipeline network were enumerated with different sulfur oxyanions as electron acceptors at 55°C. Most-probable number (MPN) analysis showed that thiosulfate-reducing bacteria were the most numerous sulfidogenic microorganisms in pipeline inspection gauge (PIG) scrapings. Thiosulfate-reducing and methanogenic enrichments were obtained from the MPN cultures that were able to use yeast extract as the electron donor. Molecular analysis revealed that both enrichments harbored the same dominant bacterium, which belonged to the genus Anaerobaculum. The dominant archaeon in the methanogenic enrichment was affiliated with the genus Methanothermobacter. With yeast extract as the electron donor, the general corrosion rate by the thiosulfate-reducing enrichment (8.43 ± 1.40 milli-inch per year, abbreviated as mpy) was about 5.5 times greater than the abiotic control (1.49 ± 0.15 mpy), while the comparable measures for the methanogenic culture were 2.03 ± 0.49 mpy and 0.62 ± 0.07 mpy, respectively. Total iron analysis in the cultures largely accounted for the mass loss of iron measured in the weight loss determinations. Profilometry analysis of polished steel coupons incubated in the presence of the thiosulfate-reducing enrichment revealed 59 pits over an area of 71.16 mm2, while only 6 pits were evident in the corresponding methanogenic incubations. The results show the importance of thiosulfate-utilizing, sulfide-producing fermentative bacteria such as Anaerobaculum sp. in the corrosion of carbon steel, but also suggest that Anaerobaculum sp. are of far less concern when growing syntrophically with methanogens. PMID:24639674
Interventions to reduce the stigma of eating disorders: A systematic review and meta-analysis.
Doley, Joanna R; Hart, Laura M; Stukas, Arthur A; Petrovic, Katja; Bouguettaya, Ayoub; Paxton, Susan J
2017-03-01
Stigma is a problem for individuals with eating disorders (EDs), forming a barrier to disclosure and help-seeking. Interventions to reduce ED stigma may help remove these barriers; however, it is not known which strategies (e.g., explaining etiology to reduce blame, contact with a person with an ED, or educating about ED) are effective in reducing stigma and related outcomes. This review described effectiveness of intervention strategies, and identified gaps in the literature. A search of four databases was performed using the terms (eating disorder* OR bulimi* OR anorexi* OR binge-eating disorder) AND (stigma* OR stereotyp* OR beliefs OR negative attitudes) AND (program OR experiment OR intervention OR education), with additional texts sought through LISTSERVs. Two raters screened papers, extracted data, and assessed quality. Stigma reduction strategies and study characteristics were examined in critical narrative synthesis. Exploratory meta-analysis compared the effects of biological and sociocultural explanations of EDs on attitudinal stigma. Eighteen papers were eligible for narrative synthesis, with four also eligible for inclusion in a meta-analysis. Biological explanations reduced stigma relative to other explanations, including sociocultural explanations in meta-analysis (g = .47, p < .001). Combined education and contact interventions improved stigma relative to control groups or over time. Most studies examined Anorexia Nervosa (AN) stigma and had mostly female, undergraduate participants. Despite apparent effectiveness, research should verify that biological explanations do not cause unintentional harm. Future research should evaluate in vivo contact, directly compare education and contact strategies, and aim to generalize findings across community populations. © 2017 Wiley Periodicals, Inc.
Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea
2015-09-01
The RISMC project aims to develop new advanced simulation-based tools to perform probabilistic risk analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform, employing surrogate models instead of the actual simulation codes. This report focuses on reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
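A hedged sketch of the surrogate idea in this setting: a Gaussian process emulator fit to a handful of expensive runs returns a prediction plus a standard deviation, so it naturally supplies the accuracy documentation and domain-of-validity check mentioned above. The two-input toy function stands in for a RELAP-7 response; nothing here is the actual RISMC toolkit API.

```python
# Hedged sketch: Gaussian process surrogate for an expensive simulation code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(60, 2))             # 60 expensive simulation runs
y = np.exp(-3 * X[:, 0]) * np.cos(6 * X[:, 1])  # stand-in for a code output

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 0.2]),
                              normalize_y=True).fit(X, y)
mu, sd = gp.predict(rng.uniform(0, 1, (5, 2)), return_std=True)
# Large `sd` flags queries outside the surrogate's domain of validity.
```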
The effect of interventions targeting screen time reduction: A systematic review and meta-analysis.
Wu, Lei; Sun, Samio; He, Yao; Jiang, Bin
2016-07-01
Previous studies have evaluated the effectiveness of interventions aimed at screen time reduction, but the results have been inconsistent. We therefore conducted a systematic review and meta-analysis of randomized controlled trials (RCTs) to summarize the accumulating evidence of the impact of interventions targeting screen time reduction on body mass index (BMI) reduction and screen time reduction. The PubMed, Embase, and Cochrane Central Register of Controlled Trials (CENTRAL) databases were searched for RCTs on the effect of interventions targeting screen time reduction. The primary and secondary outcomes were the mean differences between the treatment and control groups in the changes in BMI and in screen viewing time. A random-effects model was used to calculate the pooled mean differences. Fourteen trials including 2238 participants were assessed. The pooled analysis suggested that interventions targeting screen time reduction had a significant effect on BMI reduction (-0.15 kg/m², P < 0.001, I² = 0%) and on screen time reduction (-4.63 h/wk, P = 0.003, I² = 94.6%). Subgroup analysis showed that a significant effect on screen time reduction was observed in studies in which the duration of the intervention was <7 months and in which the interventions were health promotion curricula or counseling. Interventions for screen time reduction might be effective in reducing screen time and preventing excess weight. Further rigorous investigations with larger samples and longer follow-up periods are still needed to evaluate the efficacy of screen time reduction in both children and adults.
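The random-effects pooling used in reviews like this is typically the DerSimonian-Laird estimator; a compact sketch with invented per-trial BMI mean differences and standard errors (not the review's data):

```python
# Illustrative DerSimonian-Laird random-effects pooling (hypothetical trials).
import numpy as np

def dersimonian_laird(effects, ses):
    w = 1.0 / ses**2
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed)**2)            # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-trial variance
    w_re = 1.0 / (ses**2 + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, se_pooled, i2

effects = np.array([-0.30, -0.10, -0.20, 0.05])     # hypothetical kg/m^2
ses = np.array([0.10, 0.08, 0.12, 0.09])
print(dersimonian_laird(effects, ses))
```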
Parallel Event Analysis Under Unix
NASA Astrophysics Data System (ADS)
Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.
The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA; only the organisation of input/output is changed. The user may switch between sequential and parallel processing by simply changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm, using the PVM software, and exhibits near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.
Comparison of histomorphometrical data obtained with two different image analysis methods.
Ballerini, Lucia; Franke-Stenport, Victoria; Borgefors, Gunilla; Johansson, Carina B
2007-08-01
A common way to determine tissue acceptance of biomaterials is to perform histomorphometrical analysis on histologically stained sections from retrieved samples with surrounding tissue, using various methods. The "time and money consuming" methods and techniques used are often "in house standards". We address light microscopic investigations of bone tissue reactions on undecalcified cut and ground sections of threaded implants. To screen sections and generate results faster, the aim of this pilot project was to compare results generated with the in-house standard visual image analysis tool (i.e., quantifications and judgements made by the naked eye) with a custom-made automatic image analysis program. The histomorphometrical bone area measurements revealed no significant differences between the methods, but the results for bony contacts varied significantly. The raw results were in relative agreement, i.e., the values from the two methods were proportional to each other: low bony contact values in the visual method corresponded to low values with the automatic method. With similar-resolution images and further improvements of the automatic method, this difference should become insignificant. A great advantage of the new automatic image analysis method is that it is time saving: analysis time can be significantly reduced.
Kamran, Faisal; Abildgaard, Otto H A; Sparén, Anders; Svensson, Olof; Johansson, Jonas; Andersson-Engels, Stefan; Andersen, Peter E; Khoptyar, Dmitry
2015-03-01
We present a comprehensive study of the application of photon time-of-flight spectroscopy (PTOFS) in the wavelength range 1050-1350 nm as a spectroscopic technique for the evaluation of the chemical composition and structural properties of pharmaceutical tablets. PTOFS is compared to transmission near-infrared spectroscopy (NIRS). In contrast to transmission NIRS, PTOFS is capable of directly and independently determining the absorption and reduced scattering coefficients of the medium. Chemometric models were built on the evaluated absorption spectra for predicting tablet drug concentration. The results are compared to corresponding predictions built on transmission NIRS measurements. The predictive ability of PTOFS and transmission NIRS is comparable when models are based on uniformly distributed tablet sets. For distributions of tablets that are non-uniform with respect to particle size, the predictive ability of PTOFS is better than that of transmission NIRS. Analysis of the reduced scattering spectra shows that PTOFS is able to characterize tablet microstructure and manufacturing process parameters. In contrast to the chemometric pseudo-variables provided by transmission NIRS, PTOFS provides physically meaningful quantities, such as the scattering strength and slope, that relate to particle size. The ability of PTOFS to quantify the reduced scattering spectra, together with its robustness in predicting drug content, makes it suitable for such evaluations in the pharmaceutical industry.
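The "scattering strength and slope" language maps onto the standard power-law parameterization of the reduced scattering spectrum, μs'(λ) = a·(λ/λ0)^(−b), whose amplitude a and exponent b track particle density and size. A hedged fitting sketch with synthetic data (all values invented):

```python
# Hedged sketch: fit the scattering power law to a reduced scattering spectrum.
import numpy as np
from scipy.optimize import curve_fit

def mus_prime(lam, a, b, lam0=1200.0):
    return a * (lam / lam0) ** (-b)              # power-law model

lam = np.linspace(1050, 1350, 60)                # wavelength, nm
rng = np.random.default_rng(3)
measured = mus_prime(lam, a=1.1, b=1.4) + rng.normal(0, 0.01, lam.size)
(a_fit, b_fit), _ = curve_fit(mus_prime, lam, measured, p0=(1.0, 1.0))
print(f"strength a = {a_fit:.2f}, slope b = {b_fit:.2f}")
```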
Mechanisms by which oxygen acts as a surfactant in giant magnetoresistance film growth
NASA Astrophysics Data System (ADS)
Larson, D. J.; Petford-Long, A. K.; Cerezo, A.; Bozeman, S. P.; Morrone, A.; Ma, Y. Q.; Georgalakis, A.; Clifton, P. H.
2003-04-01
The mechanisms by which oxygen acts as a surfactant in giant magnetoresistance multilayers have been elucidated for the first time. Three-dimensional atom probe analysis of Cu/CoFe multilayers reveals the elemental distributions at the atomic level. Interfacial intermixing and oxygen impurity levels have been quantified for the first time. Both with and without oxygen the intermixing is greater at the CoFe-on-Cu interface than at the Cu-on-CoFe one and for both interfaces, oxygen reduced the intermixing. The oxygen largely floats to the growing surface and is incorporated at grain boundaries. The oxygen also reduces conformal roughness and grain boundary grooving, indicating a reduction in long-range surface diffusion.
Modifications Of Hydrostatic-Bearing Computer Program
NASA Technical Reports Server (NTRS)
Hibbs, Robert I., Jr.; Beatty, Robert F.
1991-01-01
Several modifications were made to enhance the utility of HBEAR, a computer program for the analysis and design of hydrostatic bearings. The modifications make the program applicable to more realistic cases and reduce the time and effort necessary to arrive at a suitable design. The program uses a search technique to iterate on the size of the orifice to obtain the required pressure ratio.
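The report does not specify the search, but a bracketing search of the kind described can be sketched in a few lines; the pressure-ratio model below is a hypothetical monotone stand-in, not HBEAR's internal physics.

```python
# Hedged sketch: bisect on orifice diameter until a target pressure ratio
# is met. The pressure_ratio model is a toy stand-in.
def orifice_for_pressure_ratio(target, d_lo=0.1, d_hi=5.0, tol=1e-4):
    def pressure_ratio(d_mm):                    # toy monotone model
        return d_mm**2 / (d_mm**2 + 1.0)
    while d_hi - d_lo > tol:
        mid = 0.5 * (d_lo + d_hi)
        if pressure_ratio(mid) < target:
            d_lo = mid
        else:
            d_hi = mid
    return 0.5 * (d_lo + d_hi)

print(orifice_for_pressure_ratio(0.5))           # ~1.0 mm for the toy model
```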
Extending Sociocultural Theory to Group Creativity
ERIC Educational Resources Information Center
Sawyer, Keith
2012-01-01
Sociocultural theory focuses on group processes through time, and argues that group phenomena cannot be reduced to explanation in terms of the mental states or actions of the participating individuals. This makes sociocultural theory particularly useful in the analysis of group creativity and group learning, because both group creativity and group…
Reconstructing the historic demography of an endangered seabird
Steven R. Beissinger; Zachariah M. Peery
2007-01-01
Reducing extinction risk for threatened species requires determining which demographic parameters are depressed and causing population declines. Museum collections may constitute a unique, underutilized resource for measuring demographic changes over long time periods using age-ratio analysis. We reconstruct the historic demography of a U.S. federally endangered...
Time Patterns in Remote OPAC Use.
ERIC Educational Resources Information Center
Lucas, Thomas A.
1993-01-01
Describes a transaction log analysis of the New York Public Library research libraries' OPAC (online public access catalog). Much of the remote searching occurred when the libraries were closed and was more evenly distributed than internal searching, demonstrating that remote searching could expand access and reduce peak system loads. (Contains…
Looking within to Improve Office Organization
ERIC Educational Resources Information Center
Malinowski, Matthew J.
2009-01-01
When tough economic times set in, school business administrators heighten their normal zeal in finding ways to reduce costs and improve efficiency. The author's school district recently underwent a yearlong internal self-analysis to examine and determine the proper staffing levels for the administrative functions within the school district's…