Fully automatic time-window selection using machine learning for global adjoint tomography
NASA Astrophysics Data System (ADS)
Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.
2017-12-01
Selecting time windows from seismograms in which synthetic seismograms (from simulations) and observed data are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected every day around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires human input and human knowledge or experience, and thus is not fully automatic. The goal of intelligent window selection is to automatically select windows based on a learned engine that is built upon the large number of existing windows generated through the adjoint tomography project. We have formulated the automatic window selection problem as a classification problem: all candidate misfit-calculation windows are classified as either usable or unusable. Given a large number of windows with a known selection mode (selected or not selected), we train a neural network to predict the selection mode of an arbitrary input window. Currently, the five features we extract from each window are its cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value; more features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solving a non-linear optimization problem. We use backpropagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors, and use the mini-batch stochastic gradient method to iteratively optimize the MPNN. Numerical tests show that, with careful selection of the training data and a sufficient amount of it, we are able to train a robust neural network that is capable of selecting windows in arbitrary earthquake data with negligible error compared to existing selection methods (e.g., FLEXWIN). We introduce in detail the mathematical formulation of the window-selection-oriented MPNN and show very encouraging results when applying the new algorithm to real earthquake data.
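As a rough illustration of the classifier just described, the following NumPy sketch trains a small MLP on five-feature window vectors with backpropagation and mini-batch stochastic gradient descent; the architecture, hyperparameters, and synthetic labels are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

# Hypothetical five-feature window vectors:
# [cc_value, cc_time_lag, amplitude_ratio, window_length, min_sta_lta]
rng = np.random.default_rng(0)

def init_mlp(n_in=5, n_hidden=16):
    return {"W1": rng.normal(0, 0.1, (n_in, n_hidden)), "b1": np.zeros(n_hidden),
            "W2": rng.normal(0, 0.1, (n_hidden, 1)), "b2": np.zeros(1)}

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])                  # hidden layer
    y = 1.0 / (1.0 + np.exp(-(h @ p["W2"] + p["b2"])))  # P(select window)
    return h, y

def sgd_step(p, X, t, lr=0.05):
    h, y = forward(p, X)
    dy = (y - t) / len(X)                  # cross-entropy gradient at output
    dh = (dy @ p["W2"].T) * (1 - h ** 2)   # backpropagate through tanh
    p["W2"] -= lr * h.T @ dy
    p["b2"] -= lr * dy.sum(axis=0)
    p["W1"] -= lr * X.T @ dh
    p["b1"] -= lr * dh.sum(axis=0)

# Toy training data standing in for FLEXWIN-labeled windows.
X = rng.normal(size=(1000, 5))
t = (X[:, 0] > 0).astype(float)[:, None]   # label: window selected or not
params = init_mlp()
for epoch in range(200):
    for i in range(0, len(X), 32):         # mini-batches of 32 windows
        sgd_step(params, X[i:i+32], t[i:i+32])
```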
A general graphical user interface for automatic reliability modeling
NASA Technical Reports Server (NTRS)
Liceaga, Carlos A.; Siewiorek, Daniel P.
1991-01-01
Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have text fields, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.
StochKit2: software for discrete stochastic simulation of biochemical systems with events.
Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R
2011-09-01
StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/. petzold@engineering.ucsb.edu.
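For readers unfamiliar with the SSA family that StochKit2 implements, here is a minimal sketch of Gillespie's direct method for a single dimerization reaction; this illustrates the algorithm only, not StochKit2's API or input format.

```python
import numpy as np

# Toy direct-method SSA for a single dimerization reaction 2X -> Y with
# stochastic rate constant c (illustrative only).
def ssa(x0, y0, c, t_end, seed=0):
    rng = np.random.default_rng(seed)
    t, x, y = 0.0, x0, y0
    history = [(t, x, y)]
    while True:
        a = c * x * (x - 1) / 2.0       # propensity of 2X -> Y
        if a <= 0.0:
            break
        tau = rng.exponential(1.0 / a)  # waiting time to the next reaction
        if t + tau > t_end:
            break
        t += tau
        x -= 2; y += 1                  # fire the reaction
        history.append((t, x, y))
    return history

trajectory = ssa(x0=100, y0=0, c=0.01, t_end=10.0)
```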
NASA Astrophysics Data System (ADS)
Horton, Pascal; Jaboyedoff, Michel; Obled, Charles
2018-01-01
Analogue methods provide a statistical precipitation prediction based on synoptic predictors supplied by general circulation models or numerical weather prediction models. The method samples a selection of days in the archives that are similar to the target day to be predicted, and considers their corresponding observed precipitation (the predictand) as the conditional distribution for the target day. The relationship between the predictors and predictands relies on parameters that characterize how and where the similarity between two atmospheric situations is defined. This relationship is usually established by a semi-automatic sequential procedure that has strong limitations: (i) it cannot automatically choose the pressure levels and temporal windows (hour of the day) for a given meteorological variable, (ii) it cannot handle dependencies between parameters, and (iii) it cannot easily handle new degrees of freedom. In this work, a global optimization approach relying on genetic algorithms was used to optimize all parameters jointly and automatically. The global optimization was applied to several variants of the analogue method for the Rhône catchment in the Swiss Alps. The performance scores increased compared to reference methods, especially for days with high precipitation totals. The resulting parameters were found to be relevant and coherent between the different subregions of the catchment. Moreover, they were obtained automatically and objectively, which reduces the effort that needs to be invested in exploration attempts when adapting the method to a new region or a new predictand; for example, it obviates the need to assess a large number of combinations of pressure levels and temporal windows of predictor variables that would otherwise be manually selected beforehand. The optimization could also take into account parameter inter-dependencies. In addition, the approach allowed for new degrees of freedom, such as a possible weighting between pressure levels, and non-overlapping spatial windows.
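As a rough illustration of the joint parameter search described above, the following toy genetic algorithm optimizes a parameter vector under box constraints; the bounds, population size, and score function are placeholders, not the study's analogue-method setup.

```python
import numpy as np

# Toy genetic algorithm for joint parameter calibration (illustrative).
rng = np.random.default_rng(1)
LO = np.array([0.0, 0.0, 0.0])   # placeholder lower parameter bounds
HI = np.array([1.0, 10.0, 5.0])  # placeholder upper parameter bounds

def score(p):
    # Stand-in for a forecast verification score to be maximized.
    return -np.sum((p - (LO + HI) / 2.0) ** 2)

pop = rng.uniform(LO, HI, size=(40, 3))
for generation in range(100):
    fitness = np.array([score(p) for p in pop])
    parents = pop[np.argsort(fitness)[-20:]]           # keep the best half
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20, size=2)]
        child = np.where(rng.random(3) < 0.5, a, b)    # uniform crossover
        child += rng.normal(0.0, 0.05, 3) * (HI - LO)  # Gaussian mutation
        children.append(np.clip(child, LO, HI))
    pop = np.vstack([parents, children])
best = pop[np.argmax([score(p) for p in pop])]
```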
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qiu, J; Washington University in St Louis, St Louis, MO; Li, H. Harold
Purpose: In 2D radiotherapy (RT) patient setup images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g., Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed image, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip-limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and significantly outperforms the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
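A compact sketch of the described pipeline using scikit-image is given below; the Gaussian sigma and the candidate CLAHE clip limits are illustrative stand-ins for the paper's interior-point optimization over those parameters.

```python
import numpy as np
from skimage import exposure, filters

# Sketch: high-pass filtering by subtracting a Gaussian-smoothed image, then
# CLAHE, with a crude entropy-driven search over the clip limit standing in
# for the paper's interior-point optimization.
def enhance(img, sigma=8.0, clip_limits=(0.005, 0.01, 0.02, 0.04)):
    img = img.astype(float)
    img = (img - img.min()) / max(np.ptp(img), 1e-9)   # normalize to [0, 1]
    hipass = img - filters.gaussian(img, sigma=sigma)  # high-pass step
    hipass = (hipass - hipass.min()) / max(np.ptp(hipass), 1e-9)
    best, best_entropy = None, -np.inf
    for clip in clip_limits:
        out = exposure.equalize_adapthist(hipass, clip_limit=clip)  # CLAHE
        hist, _ = np.histogram(out, bins=256, range=(0.0, 1.0))
        p = hist[hist > 0] / hist.sum()
        entropy = -np.sum(p * np.log2(p))              # image entropy
        if entropy > best_entropy:
            best, best_entropy = out, entropy
    return best
```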
Automatic Generation of Building Models with Levels of Detail 1-3
NASA Astrophysics Data System (ADS)
Nguatem, W.; Drauschke, M.; Mayer, H.
2016-06-01
We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start by orienting unsorted image sets (Mayer et al., 2012), compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.
Seismpol_ a visual-basic computer program for interactive and automatic earthquake waveform analysis
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio
1997-11-01
A Microsoft Visual-Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation arising from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by decomposing a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
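A minimal sketch of the sliding-window covariance analysis is shown below; the window length, step, and attribute definitions are common textbook choices, not the Seismpol code.

```python
import numpy as np

# Sliding-window Covariance Matrix Decomposition (CMD): eigen-decompose the
# 3x3 covariance of a three-component window (Z, N, E) to get rectilinearity
# and source azimuth.
def polarization(z, n, e, win=64, step=8):
    results = []
    for i0 in range(0, len(z) - win, step):
        seg = np.vstack([z[i0:i0 + win], n[i0:i0 + win], e[i0:i0 + win]])
        cov = np.cov(seg)                 # 3x3 covariance matrix
        vals, vecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
        l1, l2, l3 = vals[::-1]           # largest first
        rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)
        u = vecs[:, -1]                   # principal polarization vector
        azimuth = np.degrees(np.arctan2(u[2], u[1])) % 360.0  # from N, E parts
        results.append((i0, rectilinearity, azimuth))
    return results
```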
Uncovering effects of self-control and stimulus-driven action selection on the sense of agency.
Wang, Yuru; Damen, Tom G E; Aarts, Henk
2017-10-01
The sense of agency refers to the feeling of causing one's own actions and their resulting effects. Previous research indicates that voluntary action selection is an important factor in shaping the sense of agency. Whereas the volitional nature of the sense of agency is well documented, the present study examined whether agency is modulated when action selection shifts from self-control to a more automatic stimulus-driven process. Seventy-two participants performed an auditory Simon task including congruent and incongruent trials to generate automatic stimulus-driven vs. more self-controlled action, respectively. Responses in the Simon task produced a tone, and agency was assessed with the intentional binding task, an implicit measure of agency. Results showed a Simon effect and a temporal binding effect. However, temporal binding was independent of congruency. These findings suggest that temporal binding, a window to the sense of agency, emerges for both automatic stimulus-driven actions and self-controlled actions. Copyright © 2017 Elsevier Inc. All rights reserved.
Automatic 3D Moment tensor inversions for southern California earthquakes
NASA Astrophysics Data System (ADS)
Liu, Q.; Tape, C.; Friberg, P.; Tromp, J.
2008-12-01
We present a new source mechanism (moment-tensor and depth) catalog for about 150 recent southern California earthquakes with Mw ≥ 3.5. We carefully select the initial solutions from the available earthquake catalogs as well as our own preliminary 3D moment tensor inversion results. We pick useful data windows by assessing the quality of fits between data and synthetics using the automatic windowing package FLEXWIN (Maggi et al., 2008). We compute the source Fréchet derivatives of moment-tensor elements and depth for a recent 3D southern California velocity model inverted from finite-frequency event kernels calculated by adjoint methods and a nonlinear conjugate gradient technique with subspace preconditioning (Tape et al., 2008). We then invert for the source mechanisms and event depths based upon the techniques introduced by Liu et al. (2005). We assess the quality of this new catalog, as well as the other existing ones, by computing 3D synthetics for the updated 3D southern California model. We also plan to apply the moment-tensor inversion methods to automatically determine the source mechanisms of earthquakes with Mw ≥ 3.5 in southern California.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... automatic reversal systems (ARS) for power windows and to make a final decision. The agency has decided not... requirements for automatic reversal systems (ARS) and are withdrawing our 2009 proposal regarding ARS. This... of proposed rulemaking (NPRM) proposing new requirements for ARS. The proposal discussed the agency's...
Impact of Advanced Avionics Technology on Ground Attack Weapon Systems.
1982-02-01
as the relevant feature. 3.0 Problem The task is to perform the automatic cueing of moving objects in a natural environment. Additional problems... views on this subject to the American Defense Preparedness Association (ADPA) on 11 February 1981 in Orlando, Florida. ENVIRONMENTAL CONDITIONS OUR... the operating window or the environmental conditions of combat that our forces may encounter worldwide. The three areas selected were Europe, the
Linear segmentation algorithm for detecting layer boundary with lidar.
Mao, Feiyue; Gong, Wei; Logan, Timothy
2013-11-04
The automatic detection of aerosol- and cloud-layer boundaries (base and top) is important in atmospheric lidar data processing, because the boundary information is not only useful for environmental and climate studies, but can also be used as input for further data processing. Previous methods have demonstrated limitations in defining the base and top and in setting the window size, and have neglected the in-layer attenuation. To overcome these limitations, we present a new layer detection scheme for up-looking lidars based on linear segmentation with reasonable threshold setting, boundary selection, and false-positive removal strategies. Preliminary results from both real and simulated data show that this algorithm can not only detect the layer base as accurately as the simple multi-scale method, but can also detect the layer top more accurately than the simple multi-scale method. Our algorithm can be directly applied to uncalibrated data without requiring any additional measurements or window size selections.
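The segmentation idea can be sketched as follows; the segment length and slope threshold are illustrative placeholders rather than the paper's calibrated settings.

```python
import numpy as np

# Toy layer-boundary detection by linear segmentation of a lidar profile:
# fit a line to each segment; a base is a switch to a rising slope, a top is
# a switch back to a non-rising slope.
def detect_layers(signal, seg_len=10, slope_thresh=0.05):
    x = np.arange(seg_len)
    slopes = []
    for i0 in range(0, len(signal) - seg_len, seg_len):
        m, _ = np.polyfit(x, signal[i0:i0 + seg_len], 1)  # segment slope
        slopes.append((i0, m))
    bases, tops = [], []
    for (i0, m0), (i1, m1) in zip(slopes, slopes[1:]):
        if m0 <= slope_thresh < m1:
            bases.append(i1)   # signal starts increasing: layer base
        elif m0 > slope_thresh >= m1:
            tops.append(i1)    # signal stops increasing: layer top
    return bases, tops
```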
PubMedAlertMe - Standalone Windows-based PubMed SDI Software Application
Ma’ayan, Avi
2008-01-01
PubMedAlertMe is a Windows-based software system for automatically receiving e-mail alert messages about recent publications listed on PubMed. The e-mail messages contain links to newly available abstracts listed on PubMed describing publications that were selectively returned from a specified list of queries. Links are also provided to directly export citations to EndNote, and links are provided to directly forward articles to colleagues. The program is standalone. Thus, it does not require a remote mail server or user registration. PubMedAlertMe is free software, and can be downloaded from: http://amp.pharm.mssm.edu/PubMedAlertMe/PubMedAlertMe_setup.zip PMID:18402930
Information transfer rate with serial and simultaneous visual display formats
NASA Astrophysics Data System (ADS)
Matin, Ethel; Boff, Kenneth R.
1988-04-01
Information communication rate for a conventional display with three spatially separated windows was compared with rate for a serial display in which data frames were presented sequentially in one window. For both methods, each frame contained a randomly selected digit with various amounts of additional display 'clutter.' Subjects recalled the digits in a prescribed order. Large rate differences were found, with faster serial communication for all levels of the clutter factors. However, the rate difference was most pronounced for highly cluttered displays. An explanation for the latter effect in terms of visual masking in the retinal periphery was supported by the results of a second experiment. The working hypothesis that serial displays can speed information transfer for automatic but not for controlled processing is discussed.
Peyrodie, Laurent; Szurhaj, William; Bolo, Nicolas; Pinti, Antonio; Gallois, Philippe
2014-01-01
Muscle artifacts constitute one of the major problems in electroencephalogram (EEG) examinations, particularly for the diagnosis of epilepsy, where pathological rhythms occur within the same frequency bands as those of artifacts. This paper proposes to use the method dual adaptive filtering by optimal projection (DAFOP) to automatically remove artifacts while preserving true cerebral signals. DAFOP is a two-step method. The first step consists in applying the common spatial pattern (CSP) method to two frequency windows to identify the slowest components which will be considered as cerebral sources. The two frequency windows are defined by optimizing convolutional filters. The second step consists in using a regression method to reconstruct the signal independently within various frequency windows. This method was evaluated by two neurologists on a selection of 114 pages with muscle artifacts, from 20 clinical recordings of awake and sleeping adults, subject to pathological signals and epileptic seizures. A blind comparison was then conducted with the canonical correlation analysis (CCA) method and conventional low-pass filtering at 30 Hz. The filtering rate was 84.3% for muscle artifacts with a 6.4% reduction of cerebral signals even for the fastest waves. DAFOP was found to be significantly more efficient than CCA and 30 Hz filters. The DAFOP method is fast and automatic and can be easily used in clinical EEG recordings. PMID:25298967
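A rough sketch of the CSP step in DAFOP is given below: it finds spatial filters whose output variance is large in a slow frequency window and small in a fast, muscle-dominated one, so the slowest components can be kept as cerebral sources. The band edges and number of kept components are illustrative, not the authors' optimized convolutional filters.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2.0), hi / (fs / 2.0)], btype="band")
    return filtfilt(b, a, x, axis=1)

def csp_slow_components(eeg, fs, n_keep=4):
    # eeg: array of shape (n_channels, n_samples)
    low = bandpass(eeg, 1.0, 12.0, fs)     # slow window (cerebral rhythms)
    high = bandpass(eeg, 20.0, 45.0, fs)   # fast window (muscle artifacts)
    C1, C2 = np.cov(low), np.cov(high)
    vals, W = eigh(C1, C1 + C2)            # generalized eigenproblem (CSP)
    W = W[:, np.argsort(vals)[::-1]]       # filters sorted by "slowness"
    sources = W[:, :n_keep].T @ eeg        # components kept as cerebral
    return sources, W
```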
Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen
2017-02-21
To expedite the pace of genome/proteome analysis, we have developed a Python package called Pse-Analysis. The package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluation of prediction quality. All a user needs to do is input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis will automatically construct an ideal predictor and then yield the predicted results for the submitted query samples. All the aforementioned tedious jobs are done automatically by the computer. Moreover, the multiprocessing technique was adopted to enhance computational speed by about six-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be run directly on Windows, Linux, and Unix.
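To make the five automated steps concrete, here is a generic scikit-learn illustration of that workflow; this is NOT the Pse-Analysis API, and the feature choice (toy k-mer counts) and model are assumptions.

```python
import itertools
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def kmer_features(seqs, k=2):
    # Step 1: sample feature extraction (toy k-mer counts over A/C/G/T).
    kmers = ["".join(p) for p in itertools.product("ACGT", repeat=k)]
    return [[s.count(m) for m in kmers] for s in seqs]

def build_predictor(train_seqs, train_labels):
    X = kmer_features(train_seqs)
    model = make_pipeline(StandardScaler(), SVC())
    # Steps 2-4: parameter selection, training, and cross validation.
    search = GridSearchCV(model, {"svc__C": [0.1, 1.0, 10.0]}, cv=2)
    search.fit(X, train_labels)
    # Step 5: evaluate prediction quality.
    print("cross-validated accuracy:", search.best_score_)
    return search.best_estimator_

predictor = build_predictor(
    ["ACGTACGT", "TTTTACGT", "ACGTAAAA", "GGGGACGT",
     "ACACACAC", "GTGTGTGT", "AAAACCCC", "CCCCGGGG"],
    [1, 0, 1, 0, 1, 0, 1, 0])
query_prediction = predictor.predict(kmer_features(["ACGTACGA"]))
```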
Automated selection of computed tomography display parameters using neural networks
NASA Astrophysics Data System (ADS)
Zhang, Di; Neu, Scott; Valentino, Daniel J.
2001-07-01
A collection of artificial neural networks (ANNs) was trained to identify simple anatomical structures in a set of x-ray computed tomography (CT) images. These neural networks learned to associate a point in an image with the anatomical structure containing the point by using the image pixels located on the horizontal and vertical lines that ran through the point. The neural networks were integrated into a computer software tool whose function is to select an index into a list of CT window/level values from the location of the user's mouse cursor. Based upon the anatomical structure selected by the user, the software tool automatically adjusts the image display to optimally view the structure.
Some selected quantitative methods of thermal image analysis in Matlab.
Koprowski, Robert
2016-05-01
The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Graphical abstract: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Horton, Pascal; Weingartner, Rolf; Obled, Charles; Jaboyedoff, Michel
2017-04-01
Analogue methods (AMs) rely on the hypothesis that similar situations, in terms of atmospheric circulation, are likely to result in similar local or regional weather conditions. These methods consist of sampling a certain number of past situations, based on different synoptic-scale meteorological variables (predictors), in order to construct a probabilistic prediction for a local weather variable of interest (predictand). They are often used for daily precipitation prediction, whether in the context of real-time forecasting, reconstruction of past weather conditions, or future climate impact studies. The relationship between predictors and predictands is defined by several parameters (predictor variable, spatial and temporal windows used for the comparison, analogy criteria, and number of analogues), which are often calibrated by means of a semi-automatic sequential procedure that has strong limitations. AMs may include several subsampling levels (e.g. first sorting a set of analogues in terms of circulation, then restricting to those with similar moisture status). The parameter space of the AMs can be very complex, with substantial co-dependencies between the parameters. Thus, global optimization techniques are likely to be necessary for calibrating most AM variants, as they can optimize all parameters of all analogy levels simultaneously. Genetic algorithms (GAs) were found to be successful in finding optimal values of AM parameters. They allow parameter inter-dependencies to be taken into account and permit the objective selection of parameters that were previously chosen manually (such as the pressure levels and the temporal windows of the predictor variables), thus obviating the need to assess a large number of combinations. The performance scores of the optimized methods increased compared to reference methods, and to an even greater extent for days with high precipitation totals. The resulting parameters were found to be relevant and spatially coherent. Moreover, they were obtained automatically and objectively, which reduces the effort invested in exploration attempts when adapting the method to a new region or a new predictand. In addition, the approach allowed for new degrees of freedom, such as a weighting between the pressure levels, and non-overlapping spatial windows. Genetic algorithms were then used further to automatically select predictor variables and analogy criteria. This produced interesting outputs, providing new predictor-criterion combinations. However, some limitations of the approach were encountered, and expert input is likely to remain necessary. Nevertheless, letting GAs explore a dataset for the best predictor for a predictand of interest is certainly useful, particularly when applied to a new predictand or a new region with different climatic characteristics.
Fermentation process tracking through enhanced spectral calibration modeling.
Triadaphillou, Sophia; Martin, Elaine; Montague, Gary; Norden, Alison; Jeffkins, Paul; Stimpson, Sarah
2007-06-15
The FDA process analytical technology (PAT) initiative will materialize in a significant increase in the number of installations of spectroscopic instrumentation. However, to attain the greatest benefit from the data generated, calibration procedures that extract the maximum information content are needed. For example, in fermentation processes, the interpretation of the resulting spectra is challenging as a consequence of the large number of wavelengths recorded, the underlying correlation structure between the wavelengths, and the impact of the measurement environment. Approaches to the development of calibration models have been based on the application of partial least squares (PLS) either to the full spectral signature or to a subset of wavelengths. This paper presents a new approach to calibration modeling that combines a wavelength selection procedure, spectral window selection (SWS), in which windows of wavelengths are automatically selected and subsequently used as the basis of the calibration model. Because the selected windows are not unique when the algorithm is executed repeatedly, multiple models are constructed and then combined using stacking, thereby increasing the robustness of the final calibration model. The methodology is applied to data generated during the monitoring of broth concentrations in an industrial fermentation process from on-line near-infrared (NIR) and mid-infrared (MIR) spectrometers. It is shown that the proposed calibration modeling procedure outperforms traditional calibration procedures, as well as enabling the identification of the critical regions of the spectra with regard to the fermentation process.
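A rough sketch of window selection with stacked PLS models follows; the greedy cross-validated ranking and plain averaging of predictions are simplifying assumptions, not the paper's exact stacking scheme.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Rank candidate wavelength windows by cross-validated R^2, keep the best
# few, and average the predictions of the PLS models built on them.
def select_windows(X, y, width=20, step=10, n_keep=5, n_comp=3):
    scored = []
    for i0 in range(0, X.shape[1] - width + 1, step):
        pls = PLSRegression(n_components=n_comp)
        r2 = cross_val_score(pls, X[:, i0:i0 + width], y, cv=5).mean()
        scored.append((r2, i0))
    return sorted(scored, reverse=True)[:n_keep]

def stacked_predict(X_train, y_train, X_new, windows, width=20, n_comp=3):
    preds = []
    for _, i0 in windows:
        pls = PLSRegression(n_components=n_comp)
        pls.fit(X_train[:, i0:i0 + width], y_train)
        preds.append(pls.predict(X_new[:, i0:i0 + width]).ravel())
    return np.mean(preds, axis=0)   # combine the window models
```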
Improvement of the user interface of multimedia applications by automatic display layout
NASA Astrophysics Data System (ADS)
Lueders, Peter; Ernst, Rolf
1995-03-01
Multimedia research has mainly focused on real-time data capture and display combined with compression, storage and transmission of these data. However, another problem concerns selecting and arranging, in real time, a possibly large amount of data from multiple media on the computer screen, together with the textual and graphical data of regular software. This problem is already known from complex software systems, such as CASE and hypertext, and will be aggravated further in multimedia systems. The aim of our work is to relieve the user of the burden of continuously selecting, placing and sizing windows and their contents, without introducing solutions limited to only a few applications. We present an experimental system which controls the computer screen contents and layouts, directed by user- and/or tool-provided information filtering and prioritization. To be application independent, the screen layout is based on general layout optimization algorithms adapted from VLSI layout, which are controlled by application-specific objective functions. In this paper, we discuss the problems of a comprehensible screen layout, including the stability of optical information in time, the information filtering, the layout algorithms, and the adaptation of the objective function to a specific application. We give examples of different standard applications with layout problems ranging from hierarchical graph layout to window layout. The results show that automatic tool-independent display layout is possible in a real-time interactive environment.
Characterizing artifacts in RR stress test time series.
Astudillo-Salinas, Fabian; Palacio-Baus, Kenneth; Solano-Quinde, Lizandro; Medina, Ruben; Wong, Sara
2016-08-01
Electrocardiographic stress test records contain many artifacts. In this paper we explore a simple method to characterize the amount of artifact present in unprocessed RR stress test time series. Four time series classes were defined: Very good lead, Good lead, Low quality lead and Useless lead. Sixty-five 8-lead ECG stress test records were analyzed. First, the RR time series were annotated by two experts. The automatic methodology is based on dividing the RR time series into non-overlapping windows. Each window is marked as noisy whenever it exceeds an established standard deviation threshold (SDT). Series are classified according to the percentage of windows that exceed a given value, based upon the first manual annotation. Different SDTs were explored. Results show that an SDT close to 20% of the mean provides the best results. The agreement between the annotators' classifications is 70.77%, whereas the agreement between the second annotator and the best automatic method is larger than 63%. Leads classified as Very good leads and Good leads could be combined to improve automatic heartbeat labeling.
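The window-based rule is simple enough to sketch directly; the window length and the class cut-offs on the noisy-window percentage are illustrative placeholders, not the paper's calibrated values.

```python
import numpy as np

# Flag a window as noisy when its standard deviation exceeds ~20% of the
# series mean, then classify the lead by its fraction of noisy windows.
def classify_rr_lead(rr, win=30, sd_frac=0.20):
    rr = np.asarray(rr, dtype=float)
    threshold = sd_frac * rr.mean()
    n_win = len(rr) // win
    noisy = sum(rr[i * win:(i + 1) * win].std() > threshold
                for i in range(n_win))
    pct_noisy = 100.0 * noisy / max(n_win, 1)
    if pct_noisy < 5.0:
        return "Very good lead"
    if pct_noisy < 15.0:
        return "Good lead"
    if pct_noisy < 40.0:
        return "Low quality lead"
    return "Useless lead"
```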
Research on Vehicle Temperature Regulation System Based on Air Convection Principle
NASA Astrophysics Data System (ADS)
Zhuge, Muzi; Li, Xiang; Liang, Caifeng
2018-03-01
Long periods of outdoor parking in the summer lead to excessively high temperatures inside the car, and the harmful gases produced by the vehicle engine remain in the confined space throughout the parking process, which is harmful to the human body. If the air conditioning system is turned on before driving, the cooling rate is slow and the battery drain is large. To solve these problems, we designed a temperature regulation system based on the principle of air convection. An automatic or manual mode can be chosen to control a convection window. In automatic mode, the system detects the ambient temperature via a sensor and transmits the signal to a microcontroller that opens or closes the window; in manual mode, the window can be controlled remotely via Bluetooth. The system is therefore of practical significance for effectively regulating temperature, prolonging battery life, and improving the safety and comfort of vehicles.
On dealing with multiple correlation peaks in PIV
NASA Astrophysics Data System (ADS)
Masullo, A.; Theunissen, R.
2018-05-01
A novel algorithm to analyse PIV images in the presence of strong in-plane displacement gradients and reduce sub-grid filtering is proposed in this paper. Interrogation windows subjected to strong in-plane displacement gradients often produce correlation maps presenting multiple peaks. Standard multi-grid procedures discard such ambiguous correlation windows using a signal to noise (SNR) filter. The proposed algorithm improves the standard multi-grid algorithm allowing the detection of splintered peaks in a correlation map through an automatic threshold, producing multiple displacement vectors for each correlation area. Vector locations are chosen by translating images according to the peak displacements and by selecting the areas with the strongest match. The method is assessed on synthetic images of a boundary layer of varying intensity and a sinusoidal displacement field of changing wavelength. An experimental case of a flow exhibiting strong velocity gradients is also provided to show the improvements brought by this technique.
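A minimal sketch of extracting several candidate peaks from one correlation map is shown below; the relative threshold is an illustrative placeholder for the paper's automatic threshold.

```python
import numpy as np
from scipy.ndimage import maximum_filter

# Keep every local maximum above a threshold relative to the global peak,
# instead of rejecting the window on a signal-to-noise test. Each retained
# peak (row, col) maps to one candidate displacement vector.
def correlation_peaks(corr, rel_thresh=0.6, size=3):
    local_max = corr == maximum_filter(corr, size=size)  # local maxima mask
    strong = corr >= rel_thresh * corr.max()             # threshold cut-off
    peaks = np.argwhere(local_max & strong)
    return [(int(r), int(c), float(corr[r, c])) for r, c in peaks]
```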
Pulse-Echo Ultrasonic Imaging Method for Eliminating Sample Thickness Variation Effects
NASA Technical Reports Server (NTRS)
Roth, Don J. (Inventor)
1997-01-01
A pulse-echo, immersion method for ultrasonic evaluation of a material which accounts for and eliminates nonlevelness in the equipment set-up and sample thickness variation effects employs a single transducer and automatic scanning and digital imaging to obtain an image of a property of the material, such as pore fraction. The nonlevelness and thickness variation effects are accounted for by pre-scan adjustments of the time window to insure that the echoes received at each scan point are gated in the center of the window. This information is input into the scan file so that, during the automatic scanning for the material evaluation, each received echo is centered in its time window. A cross-correlation function calculates the velocity at each scan point, which is then proportionalized to a color or grey scale and displayed on a video screen.
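The cross-correlation velocity measurement at one scan point can be sketched as follows; the function and argument names are illustrative, not the patented implementation.

```python
import numpy as np

# Read the transit time between two gated echoes from the lag of their
# cross-correlation peak, then convert it to a pulse-echo velocity.
def velocity_at_scan_point(echo1, echo2, fs, thickness):
    xc = np.correlate(echo2, echo1, mode="full")
    lag = np.argmax(np.abs(xc)) - (len(echo1) - 1)  # delay in samples
    dt = lag / fs                                   # echo-to-echo transit time
    return 2.0 * thickness / dt                     # round-trip velocity
```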
NASA Astrophysics Data System (ADS)
Hirata, Hiroshi; Itoh, Toshiharu; Hosokawa, Kouichi; Deng, Yuanmu; Susaki, Hitoshi
2005-08-01
This article describes a systematic method for determining the cutoff frequency of the low-pass window function that is used for deconvolution in two-dimensional continuous-wave electron paramagnetic resonance (EPR) imaging. An evaluation function for the criterion used to select the cutoff frequency is proposed, and is the product of the effective width of the point spread function for a localized point signal and the noise amplitude of a resultant EPR image. The present method was applied to EPR imaging for a phantom, and the result of cutoff frequency selection was compared with that based on a previously reported method for the same projection data set. The evaluation function has a global minimum point that gives the appropriate cutoff frequency. Images with reasonably good resolution and noise suppression can be obtained from projections with an automatically selected cutoff frequency based on the present method.
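A sketch of the selection criterion follows: evaluate J(fc) = (effective width of the point spread function) x (image noise amplitude) for each candidate cutoff and keep the minimizer. The width and noise estimators below are crude stand-ins for the paper's definitions, and `reconstruct` is a user-supplied deconvolution routine.

```python
import numpy as np

def psf_width(profile):
    p = profile / profile.sum()
    x = np.arange(len(p))
    mu = np.sum(p * x)
    return 2.0 * np.sqrt(np.sum(p * (x - mu) ** 2))  # ~2 sigma of the PSF

def noise_amp(image, background_mask):
    return image[background_mask].std()              # noise from background

def select_cutoff(cutoffs, reconstruct, projections, bg_mask):
    best_fc, best_J = None, np.inf
    for fc in cutoffs:
        img = reconstruct(projections, fc)           # image of a point phantom
        J = psf_width(img.max(axis=0)) * noise_amp(img, bg_mask)
        if J < best_J:                               # global minimum of J(fc)
            best_fc, best_J = fc, J
    return best_fc
```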
A Limited-Vocabulary, Multi-Speaker Automatic Isolated Word Recognition System.
ERIC Educational Resources Information Center
Paul, James E., Jr.
Techniques for automatic recognition of isolated words are investigated, and a computer simulation of a word recognition system is effected. Considered in detail are data acquisition and digitizing, word detection, amplitude and time normalization, short-time spectral estimation including spectral windowing, spectral envelope approximation,…
Valero, Enrique; Adan, Antonio; Cerrada, Carlos
2012-01-01
This paper is focused on the automatic construction of 3D basic-semantic models of inhabited interiors using laser scanners with the help of RFID technologies. This is an innovative approach in a field in which few publications exist. The general strategy consists of carrying out a selective and sequential segmentation of the point cloud by means of different algorithms which depend on the information that the RFID tags provide. The identification of basic elements of the scene, such as walls, floor, ceiling, windows, doors, tables, chairs and cabinets, and the positioning of their corresponding models can then be calculated. The fusion of both technologies thus allows a simplified 3D semantic indoor model to be obtained. This method has been tested in real scenes under difficult clutter and occlusion conditions, and has yielded promising results. PMID:22778609
NASA Astrophysics Data System (ADS)
Kwon, Hyuk Ju; Yeon, Sang Hun; Lee, Keum Ho; Lee, Kwang Ho
2018-02-01
As studies focusing on building energy saving continue, studies utilizing renewable energy sources instead of fossil fuels are needed. In particular, studies regarding solar energy are being carried out in the field of building science; in order to utilize solar energy effectively, solar radiation entering the indoor space should be admitted or blocked appropriately. Blinds are a typical solar radiation control device capable of controlling indoor thermal and light environments. However, slat-type blinds are usually controlled manually, which has a negative effect on building energy saving. In this regard, studies regarding the automatic control of slat-type blinds have been carried out over the last couple of decades. This study therefore aims to provide preliminary data for optimal control research through the control of slat angle in slat-type blinds, comprehensively considering various input variables. The window area ratio and orientation were selected as input variables. It was found that the optimal control algorithm differed for each window-to-wall ratio and window orientation. In addition, by comparing and analyzing the building energy saving performance for each condition when applying the developed algorithms in simulations, up to 20.7% energy saving was shown in the cooling period and up to 12.3% in the heating period. The building energy saving effect was also greater as the window area ratio increased for a given orientation, and the effects of window-to-wall ratio in the cooling period were higher than in the heating period.
Pulse-echo ultrasonic imaging method for eliminating sample thickness variation effects
NASA Technical Reports Server (NTRS)
Roth, Don J. (Inventor)
1995-01-01
A pulse-echo, immersion method for ultrasonic evaluation of a material is discussed. It accounts for and eliminates nonlevelness in the equipment set-up and sample thickness variation effects, and employs a single transducer, automatic scanning and digital imaging to obtain an image of a property of the material, such as pore fraction. The nonlevelness and thickness variation effects are accounted for by pre-scan adjustments of the time window to ensure that the echoes received at each scan point are gated in the center of the window. This information is input into the scan file so that, during the automatic scanning for the material evaluation, each received echo is centered in its time window. A cross-correlation function calculates the velocity at each scan point, which is then proportionalized to a color or grey scale and displayed on a video screen.
Global moment tensor computation at GFZ Potsdam
NASA Astrophysics Data System (ADS)
Saul, J.; Becker, J.; Hanka, W.
2011-12-01
As part of its earthquake information service, GFZ Potsdam has started to provide seismic moment tensor solutions for significant earthquakes world-wide. The software used to compute the moment tensors is a GFZ-Potsdam in-house development, which uses the framework of the software SeisComP 3 (Hanka et al., 2010). SeisComP 3 (SC3) is a software package for seismological data acquisition, archival, quality control and analysis. SC3 is developed by GFZ Potsdam with significant contributions from its user community. The moment tensor inversion technique uses a combination of several wave types, time windows and frequency bands depending on magnitude and station distance. Wave types include body, surface and mantle waves as well as the so-called 'W-Phase' (Kanamori and Rivera, 2008). The inversion is currently performed in the time domain only. An iterative centroid search can be performed independently both horizontally and in depth. Moment tensors are currently computed in a semi-automatic fashion. This involves inversions that are performed automatically in near-real time, followed by analyst review prior to publication. The automatic results are quite often good enough to be published without further improvements, sometimes in less than 30 minutes from origin time. In those cases where a manual interaction is still required, the automatic inversion usually does a good job at pre-selecting those traces that are the most relevant for the inversion, keeping the work required for the analyst at a minimum. Our published moment tensors are generally in good agreement with those published by the Global Centroid-Moment-Tensor (GCMT) project for earthquakes above a magnitude of about Mw 5. Additionally we provide solutions for smaller earthquakes above about Mw 4 in Europe, which are normally not analyzed by the GCMT project. We find that for earthquakes above Mw 6, the most robust automatic inversions can usually be obtained using the W-Phase time window. The GFZ earthquake bulletin is located at http://geofon.gfz-potsdam.de/eqinfo For more information on the SeisComP 3 software visit http://www.seiscomp3.org
Keefe, Donald J.
1980-01-01
An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.
Yu, Xiao; Ding, Enjie; Chen, Chunxu; Liu, Xiaoming; Li, Li
2015-01-01
Because roller element bearing (REB) failures cause unexpected machinery breakdowns, their fault diagnosis has attracted considerable research attention. Established fault feature extraction methods focus on statistical characteristics of the vibration signal, an approach that loses sight of the continuous waveform features. Considering this weakness, this article proposes a novel feature extraction method for frequency bands, named Window Marginal Spectrum Clustering (WMSC), to select salient features from the marginal spectrum of vibration signals obtained by the Hilbert-Huang Transform (HHT). In WMSC, a sliding window is used to divide an entire HHT marginal spectrum (HMS) into window spectrums, following which the Rand Index (RI) criterion of the clustering method is used to evaluate each window. The windows returning higher RI values are selected to construct characteristic frequency bands (CFBs). Next, a hybrid REB fault diagnosis is constructed, termed by its elements HHT-WMSC-SVM (support vector machines). The effectiveness of HHT-WMSC-SVM is validated by running a series of experiments on REB defect datasets from the Bearing Data Center of Case Western Reserve University (CWRU). The test results evidence three major advantages of the novel method. First, the fault classification accuracy of the HHT-WMSC-SVM model is higher than that of HHT-SVM and ST-SVM, a method that combines statistical characteristics with SVM. Second, with Gauss white noise added to the original REB defect dataset, the HHT-WMSC-SVM model maintains high classification accuracy, while the classification accuracy of the ST-SVM and HHT-SVM models is significantly reduced. Third, fault classification accuracy by HHT-WMSC-SVM can exceed 95% under a Pmin range of 500-800 and an m range of 50-300 for the REB defect dataset with Gauss white noise added at Signal Noise Ratio (SNR) = 5. Experimental results indicate that the proposed WMSC method yields high REB fault classification accuracy and good performance under Gauss white noise. PMID:26540059
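The band-selection idea can be sketched as follows; here a plain FFT magnitude stands in for the HHT marginal spectrum, and the window width, step, and clustering choices are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Slide a window over each sample's marginal spectrum, cluster the samples on
# that window alone, and keep the windows whose clusters best match the known
# fault labels (Rand-type index), yielding characteristic frequency bands.
def select_bands(spectra, labels, width=32, step=16, n_keep=3):
    scored = []
    for i0 in range(0, spectra.shape[1] - width + 1, step):
        pred = KMeans(n_clusters=len(set(labels)), n_init=10).fit_predict(
            spectra[:, i0:i0 + width])
        scored.append((adjusted_rand_score(labels, pred), i0))
    best = sorted(scored, reverse=True)[:n_keep]
    return [(i0, i0 + width) for _, i0 in best]
```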
Tools & Resources | Efficient Windows Collaborative
Replacement Windows for Existing Homes | Efficient Windows
The impact of translucent fabric shades and control strategies on energy savings and visual quality
NASA Astrophysics Data System (ADS)
Wankanapon, Pimonmart
Translucent fabric shades provide opportunities for building occupants to control sunlight penetration for heat reduction, thermal comfort, and visual quality. Regulating shades affects building energy use and can potentially reduce the size of mechanical cooling systems. Shades are not normally included in energy modeling studies during the design process, even though they potentially impact energy use: occupants normally leave shades closed a large fraction of the time, whereas models are generally run with no shades. Automatic shade control is now available, so it is necessary to understand the impact of shades on visual quality and their energy-saving potential in order to optimize their overall performance. Very few studies have addressed shades and their integrated performance on energy consumption and visual quality, and most of these do not reflect modern shade types and their application. The goals of this study are: first, to determine the impact of shades on total, heating, cooling and lighting energy savings with different design and operation parameters; second, to study and develop different automatic shade control strategies to promote and optimize energy savings and visual quality. A simulation-based approach using EnergyPlus in a parametric study provides a better understanding of energy savings under different shade conditions. The parametric runs addressed various building parameters such as geometry, orientation, site climate, glazing/shade properties, and shade control strategies with integrated lighting control. The impact of shades was determined for total building and space heating, cooling and lighting energy savings. The effect of shades on visual quality was studied using EnergyPlus, AGI32 and DAYSIM for several indices such as daylight glare index (DGI), work plane illuminance, luminance ratios and view. Different shade control strategies and integrated lighting control were considered with two translucent fabric shade colors. The results clearly show the benefit of automatic shade control strategies with integrated lighting control over a condition in which shades are closed all day. The main contributor to the total energy savings is lighting energy savings, followed by cooling energy savings. Shades provide greater benefit in hot and moderate climates than in a cold climate. Different control strategies provide savings in the range of 7-35% for annual total space energy, with higher savings for light-colored shades. Control strategies for shades should be selected and optimized based on climate, orientation, window area, and window/shade properties. High-performance glazings, when equipped with shades, show lower energy savings than standard glazings. High-transmittance/reflectance shades, such as white shades, perform better than dark shades in most cases due to the higher lighting energy savings obtained with automatic electric lighting control and the resulting cooling energy savings from rejection of some solar energy and a reduction in the heat from lights. A south orientation showed the least benefit from automatic control of shades compared to other orientations, due to the large fraction of time shades are required to provide visual comfort. Under automatic shade control, energy savings are higher the more often the shades can be raised. The different automatic control strategies present tradeoffs between energy savings and comfort.
With regard to visual quality, daylight quality assessments on view, glare, luminance ratios, and UDI can be used to assess shade control strategies. Automatic shade control can increase the number of view hours while controlling sunlight penetration. With automatic shade control, more daylight hours can be provided within the beneficial range of 100-2000 lux compared to shades that are closed all day. For a person facing the window, discomfort glare is likely to increase the more often the shades are raised. Keeping the shades down ensures an acceptable glare condition, but limits energy savings. Luminance ratios are another metric that can be used to assess shade performance. With white shades, the luminance ratios between the task and proximate surfaces are improved. Dark shades help improve the luminance ratios between the task and distant surfaces. When the shades are left open, even with no direct sunlight in the space, task to window luminance ratios will often exceed 1:10.
Formally specifying the logic of an automatic guidance controller
NASA Technical Reports Server (NTRS)
Guaspari, David
1990-01-01
The following topics are covered in viewgraph form: (1) the Penelope Project; (2) the logic of an experimental automatic guidance control system for a 737; (3) Larch/Ada specification; (4) some failures of informal description; (5) description of mode changes caused by switches; (6) intuitive description of window status (chosen vs. current); (7) design of the code; (8) and specifying the code.
NASA Astrophysics Data System (ADS)
Thébault, Cédric; Doyen, Didier; Routhier, Pierre; Borel, Thierry
2013-03-01
To ensure an immersive, yet comfortable experience, significant work is required during post-production to adapt the stereoscopic 3D (S3D) content to the targeted display and its environment. On the one hand, the content needs to be reconverged using horizontal image translation (HIT) so as to harmonize the depth across the shots. On the other hand, to prevent edge violation, specific re-convergence is required and depending on the viewing conditions floating windows need to be positioned. In order to simplify this time-consuming work we propose a depth grading tool that automatically adapts S3D content to digital cinema or home viewing environments. Based on a disparity map, a stereo point of interest in each shot is automatically evaluated. This point of interest is used for depth matching, i.e. to position the objects of interest of consecutive shots in a same plane so as to reduce visual fatigue. The tool adapts the re-convergence to avoid edge-violation, hyper-convergence and hyper-divergence. Floating windows are also automatically positioned. The method has been tested on various types of S3D content, and the results have been validated by a stereographer.
A software platform for the analysis of dermatology images
NASA Astrophysics Data System (ADS)
Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon
2017-11-01
The purpose of this paper is to present a software platform developed in the Python programming environment that can be used for the processing and analysis of dermatology images. The platform provides the capability to read a file that contains a dermatology image and supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics, and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image and thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
Adaptive synchrosqueezing based on a quilted short-time Fourier transform
NASA Astrophysics Data System (ADS)
Berrian, Alexander; Saito, Naoki
2017-08-01
In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.
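As a rough illustration of the reassignment idea described above (not the SST-QSTFT algorithm itself, which uses multiple adaptively selected windows), the sketch below computes a single-window STFT, estimates instantaneous frequency from a finite difference of the phase spectrum, and reassigns spectrogram energy to the estimated frequency bins. The signal, sampling rate and window parameters are assumed inputs.

```python
"""Rough sketch of STFT-based synchrosqueezing with a finite-difference IF estimate.
Not the SST-QSTFT implementation; a single fixed window is used."""
import numpy as np
from scipy.signal import stft

def synchrosqueeze(x, fs, nperseg=256, hop=32):
    # hop must be small enough that the phase advances by less than pi between frames,
    # otherwise the finite-difference IF estimate aliases.
    f, t, Z = stft(x, fs, nperseg=nperseg, noverlap=nperseg - hop)
    phase = np.unwrap(np.angle(Z), axis=1)
    inst_f = np.gradient(phase, t, axis=1) / (2 * np.pi)   # finite difference of the phase
    Ts = np.zeros(np.abs(Z).shape)
    df = f[1] - f[0]
    for i in range(Z.shape[0]):
        for j in range(Z.shape[1]):
            if np.abs(Z[i, j]) < 1e-8:
                continue
            k = int(round(inst_f[i, j] / df))               # reassign energy to the IF bin
            if 0 <= k < len(f):
                Ts[k, j] += np.abs(Z[i, j])
    return f, t, Ts
```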
NASA Technical Reports Server (NTRS)
Simpson, James J.; Harkins, Daniel N.
1993-01-01
Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archive, browse, order, and distribution of satellite data based upon X Window, high-bandwidth networks, and digital image rendering techniques. SSABLE provides for automatically constructing relational database queries to archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine which satisfies the following three requirements: 1) it has a bitmapped display (monochrome or greater); 2) it is running the X Window system; and 3) it is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia are typically 20-25 s.
Chest CT window settings with multiscale adaptive histogram equalization: pilot study.
Fayad, Laura M; Jin, Yinpeng; Laine, Andrew F; Berkmen, Yahya M; Pearson, Gregory D; Freedman, Benjamin; Van Heertum, Ronald
2002-06-01
Multiscale adaptive histogram equalization (MAHE), a wavelet-based algorithm, was investigated as a method of automatic simultaneous display of the full dynamic contrast range of a computed tomographic image. Interpretation times were significantly lower for MAHE-enhanced images compared with those for conventionally displayed images. Diagnostic accuracy, however, was insufficient in this pilot study to allow recommendation of MAHE as a replacement for conventional window display.
Using Parameters of Dynamic Pulse Function for 3d Modeling in LOD3 Based on Random Textures
NASA Astrophysics Data System (ADS)
Alizadehashrafi, B.
2015-12-01
The pulse function (PF) is a procedural preprocessing technique for generating a computerized virtual photo of a façade within a fixed-size square (Alizadehashrafi et al., 2009; Musliman et al., 2010). The Dynamic Pulse Function (DPF) is an enhanced version of PF that creates the final photo proportional to the real geometry, which avoids distortion when the computerized photo is projected onto the generated 3D model (Alizadehashrafi and Rahman, 2013). Producing a 3D model in LoD3 rather than LoD2 is the challenging goal achieved in this paper. In the DPF-based technique, the geometries of the windows and doors are saved in an XML schema that has no connection with the 3D model in LoD2 or the CityGML format. In this research, the parameters of the Dynamic Pulse Function are used via the Ruby programming language in Trimble SketchUp to generate the windows and doors (exact position and depth) automatically in LoD3, based on the same DPF concept. The advantage of this technique is the automatic generation of a huge number of similar geometries, e.g. windows, by using the DPF parameters together with defined entities and window layers. When the SKP file is converted to CityGML via FME software or CityGML plugins, the 3D model contains the semantic database of entities and window layers, which can connect the CityGML model to MySQL (Alizadehashrafi and Baig, 2014). The concept behind DPF is to use logical operations to project a texture onto the background image in proportion to the real geometry. The projection is based on two dynamic pulses, one vertical and one horizontal, starting from the upper-left corner of the background wall and running downward and to the right, respectively, in the image coordinate system. A logical one/zero at the intersections of the vertical and horizontal pulses determines whether the texture is projected onto the background image. It is possible to define a priority for each layer; for instance, the priority of the door layer can be higher than that of the window layer, which means that the window texture cannot be projected onto the door layer. Orthogonal, rectified, perpendicular, symmetric photos of the 3D objects, proportional to the real façade geometry, must be used to generate the output frame for DPF. DPF produces very high-quality output image files with small data size and much smaller dimensions compared with the photorealistic texturing method. The disadvantage of DPF is that it is a preprocessing method that generates an output image file, rather than an online process that generates the texture within the 3D environment such as CityGML. Furthermore, the result of DPF could previously be used only for 3D models in LoD2 rather than LoD3. In the current work, random textures of the window layers are created from DPF parameters within the Ruby console of Trimble SketchUp to generate the deeper window geometries and their exact positions on the façade automatically, with random textures to increase the Level of Realism (LoR) (Scarpino, 2010). As the output frame in DPF is proportional to the real geometry (height and width of the façade), it is possible to query the XML database and convert the values to units such as meters automatically. In this technique, the perpendicular terrestrial photo of the façade is rectified by a projective transformation based on a frame that is in constrained proportion to the real geometry.
The rectified photos, which are not suitable for texturing but are necessary for measuring, can be resized in constrained proportion to the real geometry before the measuring process. The heights and widths of windows and doors, and the horizontal and vertical distances between windows measured from the upper-left corner of the photo, are the parameters that should be measured to run the program as a plugin in Trimble SketchUp. The system uses these parameters, together with texture file names and file paths, to create the façade semi-automatically. To avoid leaning geometry, the textures of windows, doors, etc. should be cropped and rectified from perpendicular photos so that they can be used in the program to create the whole façade along with its geometries. Texture enhancement, such as removing disturbing objects, adjusting exposure, and left-right or up-down transformations, should be done in advance. The quality, small data size, scale and semantic database for each façade are the prominent advantages of this method.
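The pulse-function projection described above can be illustrated with a toy sketch: two binary pulse trains, one horizontal and one vertical, mark the window columns and rows, and their logical intersection selects where the window texture is pasted onto the background wall image. This is not the authors' Ruby plugin; the array names, grayscale images and the regular window grid are simplifying assumptions.

```python
"""Toy illustration of the pulse-function idea: two binary pulse trains mark window
columns/rows; their outer product gives the mask where the window texture is pasted
onto the wall image. Grayscale 2D arrays assumed; dimensions in pixels, taken to be
proportional to the real facade geometry."""
import numpy as np

def pulse(length, offset, size, gap, count):
    """1 inside each of `count` pulses of width `size`, separated by `gap`."""
    p = np.zeros(length, dtype=bool)
    pos = offset
    for _ in range(count):
        p[pos:pos + size] = True
        pos += size + gap
    return p

def build_facade(wall, win_tex, offset_xy, size_xy, gap_xy, count_xy):
    h_pulse = pulse(wall.shape[1], offset_xy[0], size_xy[0], gap_xy[0], count_xy[0])
    v_pulse = pulse(wall.shape[0], offset_xy[1], size_xy[1], gap_xy[1], count_xy[1])
    mask = np.outer(v_pulse, h_pulse)          # logical 1 where a window is projected
    # Tile the window texture over the facade and paste it only inside the mask.
    reps = (int(np.ceil(wall.shape[0] / win_tex.shape[0])),
            int(np.ceil(wall.shape[1] / win_tex.shape[1])))
    tiled = np.tile(win_tex, reps)[:wall.shape[0], :wall.shape[1]]
    out = wall.copy()
    out[mask] = tiled[mask]
    return out
```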
Accidental strangulation in children by the automatic closing of a car window.
Serena, Kailene; Piva, Jefferson Pedro; Andreolio, Cinara; Carvalho, Paulo Roberto Antonacci; Rocha, Tais Sica da
2018-03-01
Among the main causes of death in our country are car accidents, drowning and accidental burns. Strangulation is a potentially fatal injury and an important cause of homicide and suicide among adults and adolescents. In children, its occurrence is usually accidental. However, in recent years, several cases of accidental strangulation in children around the world have been reported. A 2-year-old male patient was strangled in a car window. The patient was admitted to the pediatric intensive care unit with a Glasgow Coma Scale score of 8 and presented with progressive worsening of respiratory dysfunction and torpor. The patient also presented acute respiratory distress syndrome, acute pulmonary edema and shock. He was managed with protective mechanical ventilation, vasoactive drugs and antibiotic therapy. He was discharged from the intensive care unit without neurological or pulmonary sequelae. After 12 days of hospitalization, he was discharged from the hospital, and his state was very good. The incidence of automobile window strangulation is rare but of high morbidity and mortality due to the resulting choking mechanism. Fortunately, newer cars have devices that stop the automatic closing of the windows if resistance is encountered. However, considering the severity of complications strangulated patients experience, the intensive neuro-ventilatory and hemodynamic management of the pathologies involved is important to reduce morbidity and mortality, as is the need to implement new campaigns for the education of parents and caregivers of children, aiming to avoid easily preventable accidents and to optimize safety mechanisms in cars with electric windows.
Iconic Meaning in Music: An Event-Related Potential Study.
Cai, Liman; Huang, Ping; Luo, Qiuling; Huang, Hong; Mo, Lei
2015-01-01
Although there has been extensive research on the processing of the emotional meaning of music, little is known about other aspects of listeners' experience of music. The present study investigated the neural correlates of the iconic meaning of music. Event-related potentials (ERP) were recorded while a group of 20 music majors and a group of 20 non-music majors performed a lexical decision task in the context of implicit musical iconic meaning priming. ERP analysis revealed a significant N400 effect of congruency in time window 260-510 ms following the onset of the target word only in the group of music majors. Time-course analysis using 50 ms windows indicated significant N400 effects both within the time window 410-460 ms and 460-510 ms for music majors, whereas only a partial N400 effect during time window 410-460 ms was observed for non-music majors. There was also a trend for the N400 effects in the music major group to be stronger than those in the non-major group in the sub-windows of 310-360 ms and 410-460 ms. Especially in the sub-window of 410-460 ms, the topographical map of the difference waveforms between congruent and incongruent conditions revealed different N400 distribution between groups; the effect was concentrated in bilateral frontal areas for music majors, but in central-parietal areas for non-music majors. These results imply probable neural mechanism differences underlying automatic iconic meaning priming of music. Our findings suggest that processing of the iconic meaning of music can be accomplished automatically and that musical training may facilitate the understanding of the iconic meaning of music.
X-window-based 2K display workstation
NASA Astrophysics Data System (ADS)
Weinberg, Wolfram S.; Hayrapetian, Alek S.; Cho, Paul S.; Valentino, Daniel J.; Taira, Ricky K.; Huang, H. K.
1991-07-01
A high-definition, high-performance display station for reading and review of digital radiological images is introduced. The station is based on a Sun SPARC Station 4 and employs X window system for display and manipulation of images. A mouse-operated graphic user interface is implemented utilizing Motif-style tools. The system supports up to four MegaScan gray-scale 2560 X 2048 monitors. A special configuration of frame and video buffer yields a data transfer of 50 M pixels/s. A magnetic disk array supplies a storage capacity of 2 GB with a data transfer rate of 4-6 MB/s. The system has access to the central archive through an ultrahigh-speed fiber-optic network and patient studies are automatically transferred to the local disk. The available image processing functions include change of lookup table, zoom and pan, and cine. Future enhancements will provide for manual contour tracing, length, area, and density measurements, text and graphic overlay, as well as composition of selected images. Additional preprocessing procedures under development will optimize the initial lookup table and adjust the images to a standard orientation.
A VxD-based automatic blending system using multithreaded programming.
Wang, L; Jiang, X; Chen, Y; Tan, K C
2004-01-01
This paper discusses the object-oriented software design for an automatic blending system. By combining the advantages of a programmable logic controller (PLC) and an industrial control PC (ICPC), an automatic blending control system is developed for a chemical plant. The system structure and multithread-based communication approach are first presented in this paper. The overall software design issues, such as system requirements and functionalities, are then discussed in detail. Furthermore, by replacing the conventional dynamic link library (DLL) with virtual X device drivers (VxD's), a practical and cost-effective solution is provided to improve the robustness of the Windows platform-based automatic blending system in small- and medium-sized plants.
Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data
NASA Astrophysics Data System (ADS)
Hong, J. H.; Su, Y. T.
2016-06-01
With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the range of applications and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be used to present the analyzed results: virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: the standardized metadata serve as the known facts about the selected geospatial resources, algorithms were designed for analyzing differences in the temporality and quality of those resources, and the transformation of the analyzed results into visual aids is executed automatically. This presents a new way of bridging the communication gap between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it fails to provide a friendly and correct working platform.
WinTICS-24 --- A Telescope Control Interface for MS Windows
NASA Astrophysics Data System (ADS)
Hawkins, R. Lee
1995-12-01
WinTICS-24 is a telescope control system interface and observing assistant written in Visual Basic for MS Windows. It provides the ability to control a telescope and up to 3 other instruments via the serial ports on an IBM-PC compatible computer, all from one consistent user interface. In addition to telescope control, WinTICS contains an observing logbook, trouble log (which can automatically email its entries to a responsible person), lunar phase display, object database (which allows the observer to type in the name of an object and automatically slew to it), a time of minimum calculator for eclipsing binary stars, and an interface to the Guide CD-ROM for bringing up finder charts of the current telescope coordinates. Currently WinTICS supports control of DFM telescopes, but is easily adaptable to other telescopes and instrumentation.
NASA Astrophysics Data System (ADS)
Gonzalez, Pablo J.
2017-04-01
Automatic interferometric processing of satellite radar data has emerged as a solution to the increasing amount of acquired SAR data. Automatic SAR and InSAR processing ranges from focusing raw echoes to the computation of displacement time series using large stacks of co-registered radar images. However, this type of interferometric processing demands the prescribed or adaptive selection of multiple processing parameters. One of the interferometric processing steps that most strongly influences the final results (displacement maps) is the interferometric phase filtering. There are a large number of phase filtering methods; however, the so-called Goldstein filtering method is the most popular [Goldstein and Werner, 1998; Baran et al., 2003]. The Goldstein filter basically needs two parameters: the size of the filter window and a parameter indicating the filter smoothing intensity. The modified Goldstein method removes the need to select the smoothing parameter, which is instead set from the local interferometric coherence level, but it still requires the dimension of the filtering window to be specified. Optimal filtered phase quality usually requires careful selection of those parameters. Therefore, there is a strong need to develop automatic filtering methods suited to automatic processing while maximizing filtered phase quality. In this paper, I present a recursive adaptive phase filtering algorithm for accurate estimation of differential interferometric ground deformation and local coherence measurements. The proposed filter is based upon the modified Goldstein filter [Baran et al., 2003]. This filtering method improves the quality of the interferograms by performing a recursive iteration using variable (cascade) kernel sizes, and improves the coherence estimation by locally defringing the interferometric phase. The method has been tested using simulations and real cases relevant to the characteristics of the Sentinel-1 mission. I present real examples from C-band interferograms showing strong and weak deformation gradients, with moderate baselines (100-200 m) and temporal baselines of 70 and 190 days over variably vegetated volcanoes (Mt. Etna, Hawaii and Nyiragongo-Nyamulagira). The differential phase of these examples shows intense localized volcano deformation as well as vast areas of small differential phase variation. The proposed method outperforms the classical Goldstein and modified Goldstein filters by preserving subtle phase variations where the deformation fringe rate is high, and by effectively suppressing phase noise in regions of smooth phase variation. Finally, this method also has the advantage of not requiring input parameters, except for the maximum filtering kernel size. References: Baran, I., Stewart, M.P., Kampes, B.M., Perski, Z., Lilly, P. (2003) A modification to the Goldstein radar interferogram filter. IEEE Transactions on Geoscience and Remote Sensing, vol. 41, No. 9, doi:10.1109/TGRS.2003.817212. Goldstein, R.M., Werner, C.L. (1998) Radar interferogram filtering for geophysical applications, Geophysical Research Letters, vol. 25, No. 21, 4035-4038, doi:10.1029/1998GL900033
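For reference, here is a minimal sketch of the baseline Goldstein-style filtering that the recursive adaptive method builds on (not the proposed algorithm): each complex interferogram patch is transformed with a 2D FFT, the spectrum magnitude is smoothed and raised to the smoothing exponent alpha, and the weighted spectrum is transformed back. Non-overlapping tiles and the parameter names are simplifying assumptions; production implementations use overlapping, weighted patches.

```python
"""Sketch of Goldstein-style interferogram filtering on complex patches (assumed
`igram` complex array); `alpha` is the smoothing-intensity exponent, `win` the patch size."""
import numpy as np
from scipy.ndimage import uniform_filter

def goldstein_patch(igram, alpha=0.5, smooth=3):
    S = np.fft.fft2(igram)
    H = uniform_filter(np.abs(S), size=smooth)   # smoothed spectral response
    H = (H / H.max()) ** alpha                   # boost the dominant fringe frequencies
    return np.fft.ifft2(S * H)

def goldstein_filter(igram, alpha=0.5, win=32, smooth=3):
    out = np.zeros_like(igram)
    for i in range(0, igram.shape[0], win):      # non-overlapping tiles (simplification)
        for j in range(0, igram.shape[1], win):
            patch = igram[i:i + win, j:j + win]
            out[i:i + win, j:j + win] = goldstein_patch(patch, alpha, smooth)
    return out
```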
Single-crystalline BaTiO3 films grown by gas-source molecular beam epitaxy
NASA Astrophysics Data System (ADS)
Matsubara, Yuya; Takahashi, Kei S.; Tokura, Yoshinori; Kawasaki, Masashi
2014-12-01
Thin BaTiO3 films were grown on GdScO3 (110) substrates by metalorganic gas-source molecular beam epitaxy. Titanium tetra-isopropoxide (TTIP) was used as a volatile precursor that provides a wide growth window of the supplied TTIP/Ba ratio for automatic adjustment of the film composition. Within the growth window, compressively strained films can be grown with excellent crystalline quality, whereas films grown outside of the growth window are relaxed with inferior crystallinity. This growth method will provide a way to study the intrinsic properties of ferroelectric BaTiO3 films and their heterostructures by precise control of the stoichiometry, structure, and purity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomes, C.
This report describes a successful project for transference of advanced AI technology into the domain of planning of outages of nuclear power plants as part of DOD's dual-use program. ROMAN (Rome Lab Outage Manager) is the prototype system that was developed as a result of this project. ROMAN's main innovation compared to the current state-of-the-art of outage management tools is its capability to automatically enforce safety constraints during the planning and scheduling phase. Another innovative aspect of ROMAN is the generation of more robust schedules that are feasible over time windows. In other words, ROMAN generates a family of schedules by assigning time intervals as start times to activities rather than single start times, without affecting the overall duration of the project. ROMAN uses a constraint satisfaction paradigm combining a global search tactic with constraint propagation. The derivation of very specialized representations for the constraints to perform efficient propagation is a key aspect for the generation of very fast schedules - constraints are compiled into the code, which is a novel aspect of our work using an automatic programming system, KIDS.
Automatic classification of seismic events within a regional seismograph network
NASA Astrophysics Data System (ADS)
Tiira, Timo; Kortström, Jari; Uski, Marja
2015-04-01
A fully automatic method for seismic event classification within a sparse regional seismograph network is presented. The tool is based on a supervised pattern recognition technique, the Support Vector Machine (SVM), trained here to distinguish weak local earthquakes from a bulk of human-made or spurious seismic events. The classification rules rely on differences in signal energy distribution between natural and artificial seismic sources. Seismic records are divided into four windows: P, P coda, S, and S coda. For each signal window, the STA (short-term average) is computed in 20 narrow frequency bands between 1 and 41 Hz. The 80 discrimination parameters are used as training data for the SVM. The SVM models are calculated for 19 on-line seismic stations in Finland. The event data are compiled mainly from fully automatic event solutions that are manually classified after the automatic location process. The station-specific SVM training events include 11-302 positive (earthquake) and 227-1048 negative (non-earthquake) examples. The best voting rules for combining results from different stations are determined during an independent testing period. Finally, the network processing rules are applied to an independent evaluation period comprising 4681 fully automatic event determinations, of which 98 % have been manually identified as explosions or noise and 2 % as earthquakes. The SVM method correctly identifies 94 % of the non-earthquakes and all the earthquakes. The results imply that the SVM tool can identify and filter out blasts and spurious events from fully automatic event solutions with a high level of confidence. The tool helps to reduce the workload in manual seismic analysis by leaving only ~5 % of the automatic event determinations, i.e. the probable earthquakes, for more detailed seismological analysis. The approach presented is easy to adjust to the requirements of a denser or wider high-frequency network, once enough training examples for building a station-specific data set are available.
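A compact sketch of the feature construction described above (80 parameters: short-term averages in 20 narrow bands for the P, P-coda, S and S-coda windows) might look as follows; the band edges, filter order and window bookkeeping are assumptions, not the operational implementation.

```python
"""Sketch of the 80-parameter feature vector: short-term averages in 20 narrow bands
(1-41 Hz) for the P, P-coda, S and S-coda windows, fed to an SVM. Window picks
(`windows`, in samples) and the training set are assumed inputs; fs must exceed 82 Hz
for the highest band to be valid."""
import numpy as np
from scipy.signal import butter, sosfilt
from sklearn.svm import SVC

BANDS = [(1 + 2 * k, 3 + 2 * k) for k in range(20)]     # 1-3, 3-5, ..., 39-41 Hz

def sta_features(trace, fs, windows):
    """windows: dict of four (start, stop) sample ranges: P, Pcoda, S, Scoda."""
    feats = []
    for lo, hi in BANDS:
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(sos, trace)
        for start, stop in windows.values():
            feats.append(np.mean(np.abs(band[start:stop])))   # short-term average
    return np.array(feats)                                    # 20 bands x 4 windows = 80

# clf = SVC(kernel="rbf").fit(feature_matrix, labels)  # labels: earthquake vs. non-earthquake
```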
Adaptive Spot Detection With Optimal Scale Selection in Fluorescence Microscopy Images.
Basset, Antoine; Boulanger, Jérôme; Salamero, Jean; Bouthemy, Patrick; Kervrann, Charles
2015-11-01
Accurately detecting subcellular particles in fluorescence microscopy is of primary interest for further quantitative analysis such as counting, tracking, or classification. Our primary goal is to segment vesicles likely to share nearly the same size in fluorescence microscopy images. Our method termed adaptive thresholding of Laplacian of Gaussian (LoG) images with autoselected scale (ATLAS) automatically selects the optimal scale corresponding to the most frequent spot size in the image. Four criteria are proposed and compared to determine the optimal scale in a scale-space framework. Then, the segmentation stage amounts to thresholding the LoG of the intensity image. In contrast to other methods, the threshold is locally adapted given a probability of false alarm (PFA) specified by the user for the whole set of images to be processed. The local threshold is automatically derived from the PFA value and local image statistics estimated in a window whose size is not a critical parameter. We also propose a new data set for benchmarking, consisting of six collections of one hundred images each, which exploits backgrounds extracted from real microscopy images. We have carried out an extensive comparative evaluation on several data sets with ground-truth, which demonstrates that ATLAS outperforms existing methods. ATLAS does not need any fine parameter tuning and requires very low computation time. Convincing results are also reported on real total internal reflection fluorescence microscopy images.
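A simplified sketch of the core steps is given below, assuming a particular scale-selection criterion (maximum total LoG response) that is not necessarily one of the four criteria proposed in the paper: compute the LoG at several scales, pick one scale, and threshold it locally using window statistics and a factor derived from the user-specified PFA.

```python
"""Simplified sketch of LoG spot detection with a locally adaptive threshold set from a
false-alarm probability (PFA), in the spirit of ATLAS; the scale-selection rule and the
Gaussian tail model for the threshold factor are assumptions."""
import numpy as np
from scipy.ndimage import gaussian_laplace, uniform_filter
from scipy.stats import norm

def detect_spots(image, scales=(1, 2, 3, 4), pfa=1e-3, win=31):
    responses = [-gaussian_laplace(image.astype(float), s) for s in scales]  # bright spots
    best = max(responses, key=lambda r: r.sum())         # crude automatic scale selection
    mu = uniform_filter(best, size=win)                   # local statistics in a window
    sigma = np.sqrt(np.maximum(uniform_filter(best**2, size=win) - mu**2, 1e-12))
    k = norm.isf(pfa)                                     # threshold factor from the PFA
    return best > mu + k * sigma                          # binary spot mask
```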
Looking through the postdisaster policy window
NASA Astrophysics Data System (ADS)
Solecki, William D.; Michaels, Sarah
1994-07-01
Policy windows are transitory opportunities during which the likelihood of adopting new policy or legislative proposals is greater than usual. Accepted wisdom has held that natural disasters serve as focusing events that generate policy windows in their wake. This paper highlights the need for a more circumscribed understanding of when and where policy windows occur based on the experiences of three US regional planning organizations: a hand-picked commission of community leaders, a council of governments, and a special-purpose substate organization. The first operated in the San Francisco Bay Area of California following the Loma Prieta earthquake (October 1989), and the other two in South Carolina's Atlantic coastal plain after Hurricane Hugo (September 1989). The analysis concludes that natural disasters did not transform the agenda or mission of these entities. Policy windows were neither automatic outcomes of focusing events nor did they ensure the adoption of pertinent policy within the organizations investigated. Several conditions are minimally necessary for using policy windows to bring about hazard mitigation: comprehensive institutional conceptualization of hazards management, institutional strength and flexibility, and well-placed, effective policy entrepreneurs.
ESDAPT - APT PROGRAMMING EDITOR AND INTERPRETER
NASA Technical Reports Server (NTRS)
Premack, T.
1994-01-01
ESDAPT is a graphical programming environment for developing APT (Automatically Programmed Tool) programs for controlling numerically controlled machine tools. ESDAPT has a graphical user interface that provides the user with an APT syntax sensitive text editor and windows for displaying geometry and tool paths. APT geometry statements can also be created using menus and screen picks. ESDAPT interprets APT geometry statements and displays the results in its view windows. Tool paths are generated by batching the APT source to an APT processor (COSMIC P-APT recommended). The tool paths are then displayed in the view windows. Hardcopy output of the view windows is in color PostScript format. ESDAPT is written in C-language, yacc, lex, and XView for use on Sun4 series computers running SunOS. ESDAPT requires 4Mb of disk space, 7Mb of RAM, and MIT's X Window System, Version 11 Release 4, or OpenWindows version 3 for execution. Program documentation in PostScript format and an executable for OpenWindows version 3 are provided on the distribution media. The standard distribution medium for ESDAPT is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. This program was developed in 1992.
Alxneit, Ivo
2018-03-30
A python module (HRTEMFringeAnalyzer) is reported to evaluate the local crystallinity of samples from high-resolution transmission electron microscopy images in a mostly automated fashion. The user only selects the size of a square analyser window and a step size which translates the window in the micrograph. Together they define the resolution of the results obtained. Regions where fringe patterns are visible are identified and their lattice spacing d and direction φ as well as the corresponding mean errors σ determined. 1/σ_d is proportional to the coherence length of the structure, whereas σ_φ is a measure of how well the direction of the fringes is defined. Maps of these four indicators are computed. The performance of the program is demonstrated on two very different samples: ill-crystalline carbon deposits on a coked Ni/LFNO (reduced LaFe0.8Ni0.2O3±δ) catalyst and well-crystallized nanoparticles of zinc-doped ceria. In the latter case, the automatic segmentation of large aggregates into individual crystalline domains is achieved by the φ maps.
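The per-window analysis can be sketched as follows: each window is Fourier transformed, the strongest non-DC peak gives the fringe spatial frequency, and from it the lattice spacing d and fringe direction φ. This is only an outline of the idea, not the HRTEMFringeAnalyzer code; the windowing, peak picking and pixel-size handling are assumptions.

```python
"""Sketch of per-window fringe analysis from a 2D FFT peak. `img` is a 2D HRTEM image,
`px` the pixel size in nm; window/step semantics follow the description above."""
import numpy as np

def fringe_in_window(patch, px):
    taper = np.hanning(patch.shape[0])[:, None] * np.hanning(patch.shape[1])[None, :]
    F = np.abs(np.fft.fftshift(np.fft.fft2(patch * taper)))
    c = np.array(F.shape) // 2
    F[c[0] - 2:c[0] + 3, c[1] - 2:c[1] + 3] = 0              # suppress the DC peak
    iy, ix = np.unravel_index(np.argmax(F), F.shape)
    fy = (iy - c[0]) / (patch.shape[0] * px)                 # spatial frequency [1/nm]
    fx = (ix - c[1]) / (patch.shape[1] * px)
    freq = np.hypot(fx, fy)
    if freq == 0:
        return None
    return 1.0 / freq, np.degrees(np.arctan2(fy, fx))        # d spacing [nm], direction [deg]

def fringe_maps(img, px, win=64, step=32):
    rows = range(0, img.shape[0] - win + 1, step)
    cols = range(0, img.shape[1] - win + 1, step)
    return [[fringe_in_window(img[r:r + win, c:c + win], px) for c in cols] for r in rows]
```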
Van Strien, Jan W; Franken, Ingmar H A; Huijding, Jorg
2009-03-04
The early posterior negativity (EPN) reflects early selective visual processing of emotionally significant information. This study explored the association between fear of spiders and the EPN for spider pictures. Fifty women completed a Spider Phobia Questionnaire and watched the random rapid serial presentation of 600 neutral, 600 negatively valenced emotional, and 600 spider pictures (three pictures per second). The EPN was scored as the mean activity in the 225-300-ms time window at lateral occipital electrodes. Participants with higher scores on the phobia questionnaire showed larger (i.e. more negative) EPN amplitudes in response to spider pictures. The results suggest that the attentional capture of spider-related stimuli is an automatic response, which is modulated by the extent of spider fear.
[Computerized monitoring system in the operating center with UNIX and X-window].
Tanaka, Y; Hashimoto, S; Chihara, E; Kinoshita, T; Hirose, M; Nakagawa, M; Murakami, T
1992-01-01
We previously reported a fully automated data logging system in the operating center. We have now revised the system using a highly integrated operating system, UNIX, instead of OS/9. With this multi-task, multi-window (X-window) system, we can monitor all 12 rooms in the operating center at a time. The system in the operating center consists of two computers, a SONY NEWS1450 (UNIX workstation) and a Sord M223 (CP/M, data logger). On the bitmapped display of the workstation, using X-window, the data from all the operating rooms can be visualized. Furthermore, two other minicomputers (a Fujitsu A50 in the conference room and an A60 in the ICU) and a workstation (Sun3-80 in the ICU) were connected via Ethernet. With the remote login function (NFS), we can easily obtain data during an operation from outside the operating center. This system works automatically and needs no routine maintenance.
Large-scale building scenes reconstruction from close-range images based on line and plane feature
NASA Astrophysics Data System (ADS)
Ding, Yi; Zhang, Jianqing
2007-11-01
Automatic generation of 3D models of buildings and other man-made structures from images has become a topic of increasing importance; such models may be used in applications such as virtual reality, the entertainment industry and urban planning. In this paper we address the main problems and available solutions for the generation of 3D models from terrestrial images. We first generate a coarse planar model of the principal scene planes and then reconstruct windows to refine the building models. There are several points of novelty: first, we reconstruct the coarse wire-frame model using line-segment matching with an epipolar geometry constraint; second, we detect the position of all windows in the image and reconstruct the windows by establishing corner-point correspondences between images, then add the windows to the coarse model to refine the building models. The strategy is illustrated on an image triple of a college building.
Generically Used Expert Scheduling System (GUESS): User's Guide Version 1.0
NASA Technical Reports Server (NTRS)
Liebowitz, Jay; Krishnamurthy, Vijaya; Rodens, Ira
1996-01-01
This user's guide contains instructions explaining how to best operate the program GUESS, a generic expert scheduling system. GUESS incorporates several important features for a generic scheduler, including automatic scheduling routines to generate a 'first' schedule for the user, a user interface that includes Gantt charts and enables the human scheduler to manipulate schedules manually, diagnostic report generators, and a variety of scheduling techniques. The current version of GUESS runs on an IBM PC or compatible in the Windows 3.1 or Windows '95 environment.
Wang, Jing-Min; Yang, Ming-Ta; Chen, Po-Lin
2017-01-01
With the advance of science and technology, people desire convenient and comfortable living. Creating comfortable and healthy indoor environments is a major consideration in designing smart homes. As handheld devices become increasingly powerful and ubiquitous, this paper proposes an innovative use of smart handheld devices (SHD), using MIT App Inventor and fuzzy control, to perform real-time monitoring and smart control of the designed intelligent windowsill system (IWS) in a smart home. A compact weather station consisting of environment sensors was built into the IWS for measuring indoor illuminance, temperature-humidity, carbon dioxide (CO2) concentration, and outdoor rain and wind direction. According to the measured environment information, the proposed system can automatically send a command to a fuzzy microcontroller implemented on an Arduino UNO to fully or partly open the electric curtain and electric window, adapting to climate changes in the indoor and outdoor environment. Moreover, the IWS can automatically close the window when rain splashes on it. The presented novel control method for the windowsill not only expands SHD applications but also greatly enhances convenience for users. To validate the feasibility and effectiveness of the IWS, a laboratory prototype was built and confirmed experimentally. PMID:28398266
Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing
NASA Astrophysics Data System (ADS)
LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.
2017-12-01
With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality study and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, (1) the algorithm adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
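One simple way to realize the spatial-temporal scoring described above, shown purely as an assumed illustration rather than the paper's model, is to compare each window's statistic both against the sensor's own history (temporal z-score) and against the other sensors in the same window (spatial z-score), then rank windows by the combined score.

```python
"""Sketch of ranking anomalous time windows across sensor nodes; the scoring rule is an
assumption. `data` is an (n_sensors, n_samples) array of one measured parameter."""
import numpy as np

def rank_windows(data, win=60):
    n_sensors, n = data.shape
    n_win = n // win
    means = data[:, :n_win * win].reshape(n_sensors, n_win, win).mean(axis=2)
    # Temporal z-score: deviation from each sensor's own window history.
    zt = np.abs((means - means.mean(axis=1, keepdims=True)) /
                (means.std(axis=1, keepdims=True) + 1e-9))
    # Spatial z-score: deviation from the other sensors in the same window.
    zs = np.abs((means - means.mean(axis=0, keepdims=True)) /
                (means.std(axis=0, keepdims=True) + 1e-9))
    score = zt + zs
    order = np.dstack(np.unravel_index(np.argsort(score, axis=None)[::-1], score.shape))[0]
    return [(int(s), int(w), float(score[s, w])) for s, w in order]  # (sensor, window, score)
```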
Ye-Lin, Yiyao; Garcia-Casado, Javier; Prats-Boluda, Gema; Alberola-Rubio, José; Perales, Alfredo
2014-01-01
Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system of segmenting EHG recordings that distinguishes between uterine contractions and artifacts. Firstly, the segmentation is performed using an algorithm that generates the TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. After that, these segments are classified in two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of characteristics that led to the highest degree of accuracy in detecting artifacts was then determined. The results showed that it is possible to obtain automatic detection of motion artifacts in segmented EHG recordings with a precision of 92.2% using only seven features. The proposed algorithm and classifier together compose a useful tool for analyzing EHG signals and would help to promote clinical applications of this technique.
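The first stage (deriving a TOCO-like signal and flagging windows with significant amplitude changes) can be sketched as follows; the envelope recipe (rectification plus low-pass filtering), cut-off frequency and threshold factor are illustrative assumptions, not the authors' settings.

```python
"""Sketch of EHG segmentation: derive a TOCO-like envelope and flag windows whose
amplitude rises well above the baseline. `ehg` is a 1D signal sampled at `fs` Hz."""
import numpy as np
from scipy.signal import butter, sosfiltfilt

def toco_like(ehg, fs, cutoff=0.03):
    sos = butter(4, cutoff, btype="lowpass", fs=fs, output="sos")
    return sosfiltfilt(sos, np.abs(ehg))            # rectified, low-pass filtered envelope

def candidate_windows(ehg, fs, win_s=30, factor=1.5):
    env = toco_like(ehg, fs)
    step = int(win_s * fs)
    base = np.median(env)                           # baseline tone of the envelope
    segs = []
    for start in range(0, len(env) - step + 1, step):
        if env[start:start + step].mean() > factor * base:   # significant amplitude change
            segs.append((start / fs, (start + step) / fs))    # window bounds in seconds
    return segs
```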
38 CFR 17.157 - Definition-adaptive equipment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... includes, but is not limited to, a basic automatic transmission, power steering, power brakes, power window lifts, power seats, air-conditioning equipment when necessary for the health and safety of the veteran... MEDICAL Automotive Equipment and Driver Training § 17.157 Definition-adaptive equipment. The term...
Serag, Ahmed; Wilkinson, Alastair G.; Telford, Emma J.; Pataky, Rozalia; Sparrow, Sarah A.; Anblagan, Devasuda; Macnaught, Gillian; Semple, Scott I.; Boardman, James P.
2017-01-01
Quantitative volumes from brain magnetic resonance imaging (MRI) acquired across the life course may be useful for investigating long term effects of risk and resilience factors for brain development and healthy aging, and for understanding early life determinants of adult brain structure. Therefore, there is an increasing need for automated segmentation tools that can be applied to images acquired at different life stages. We developed an automatic segmentation method for human brain MRI, where a sliding window approach and a multi-class random forest classifier were applied to high-dimensional feature vectors for accurate segmentation. The method performed well on brain MRI data acquired from 179 individuals, analyzed in three age groups: newborns (38–42 weeks gestational age), children and adolescents (4–17 years) and adults (35–71 years). As the method can learn from partially labeled datasets, it can be used to segment large-scale datasets efficiently. It could also be applied to different populations and imaging modalities across the life course. PMID:28163680
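A toy sketch of sliding-window classification with a random forest is given below; it uses raw patch intensities on 2D slices as features, which is a simplification of the high-dimensional feature vectors and 3D processing described above.

```python
"""Toy sketch of sliding-window voxel classification with a random forest, on 2D slices;
the feature set (flattened local neighbourhood intensities) is an assumed simplification."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def patch_features(img, r=2):
    """One feature vector per pixel: the flattened (2r+1)^2 neighbourhood."""
    pad = np.pad(img, r, mode="reflect")
    feats = [pad[y:y + img.shape[0], x:x + img.shape[1]].ravel()
             for y in range(2 * r + 1) for x in range(2 * r + 1)]
    return np.stack(feats, axis=1)                   # (n_pixels, (2r+1)^2)

def train_and_segment(train_img, train_labels, test_img):
    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
    clf.fit(patch_features(train_img), train_labels.ravel())   # partial labels also work
    return clf.predict(patch_features(test_img)).reshape(test_img.shape)
```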
[Automated anesthesia record system].
Zhu, Tao; Liu, Jin
2005-12-01
Based on Client/Server architecture, a software of automated anesthesia record system running under Windows operation system and networks has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0 and SQL Server. The system can deal with patient's information throughout the anesthesia. It can collect and integrate the data from several kinds of medical equipment such as monitor, infusion pump and anesthesia machine automatically and real-time. After that, the system presents the anesthesia sheets automatically. The record system makes the anesthesia record more accurate and integral and can raise the anesthesiologist's working efficiency.
Automatic Fringe Detection for Oil Film Interferometry Measurement of Skin Friction
NASA Technical Reports Server (NTRS)
Naughton, Jonathan W.; Decker, Robert K.; Jafari, Farhad
2001-01-01
This report summarizes two years of work on investigating algorithms for automatically detecting fringe patterns in images acquired using oil-drop interferometry for the determination of skin friction. Several different analysis methods were tested, and a combination of a windowed Fourier transform followed by a correlation was found to be most effective. The implementation of this method is discussed and details of the process are described. The results indicate that this method shows promise for automating the fringe detection process, but further testing is required.
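The combination found most effective, a windowed Fourier transform followed by a correlation, can be sketched for a 1D intensity profile taken across the oil film: the local FFT gives the dominant fringe frequency, and correlation with a synthetic fringe refines the match. The profile extraction and parameter values are assumptions, not the report's implementation.

```python
"""Sketch of windowed-Fourier fringe frequency estimation plus template correlation for a
1D intensity profile `profile` (pixel units); window and phase-grid sizes are assumed."""
import numpy as np

def local_fringe_frequency(profile, win=128, step=32):
    freqs = []
    w = np.hanning(win)
    for start in range(0, len(profile) - win + 1, step):
        seg = (profile[start:start + win] - profile[start:start + win].mean()) * w
        spec = np.abs(np.fft.rfft(seg))
        k = np.argmax(spec[1:]) + 1                 # skip the DC bin
        freqs.append((start + win // 2, k / win))   # (centre pixel, cycles per pixel)
    return freqs

def correlate_with_template(profile, freq):
    """Return the phase of the synthetic fringe that best correlates with the profile."""
    x = np.arange(len(profile))
    return max(np.linspace(0, 2 * np.pi, 64, endpoint=False),
               key=lambda ph: np.dot(profile - profile.mean(),
                                     np.cos(2 * np.pi * freq * x + ph)))
```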
Hammer, J S; Strain, J J; Friedberg, A; Fulop, G
1995-05-01
No current system of computerized data entry of clinical information in consultation-liaison (C-L) psychiatry has been well received or has demonstrated that it saves the consultant's time. The inability to achieve accurate, complete, systematic collection of discrete variables and data entry in the harried C-L setting is a major impediment to the advancement of the subspecialty and health services research. The hand-held Notebook computer with Windows PEN ENTRY MICRO-CARES capabilities has permitted one-time direct entry of data at the time of collection at the patient's bedside. Variable choice and selection enhance the completeness and accuracy of data collection. For example, ICD-9 Axis III diagnoses may be selected from a "look-up" that at the same time automatically assigns the appropriate code and diagnosis-related group (DRG) number. A patient narrative can be typed at the nurse's station, a chart note printed for the medical record, and the MICRO-CARES literature database perused with the printing of selected citations, abstracts, and in some cases experts' commentaries for the consultee. The consultant's documentation time is halved using the NOTEBOOK WINDOWS PEN ENTRY MICRO-CARES software, with the advantage of more accurate and complete data description than with the traditional handwritten consultation records. Consultees preferred typewritten to handwritten notes. The cost of the hardware (about $2000) is less than that of an optical scanner, and it permits report generation and archival searches at the nurses' station without returning to the C-L office for scanning. Radio frequency or Ethernet download from the Notebook permits direct data transfer to the C-L office archive computer.
Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W; Gautier, Virginie W
2015-01-01
We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip.
Astrometrica: Astrometric data reduction of CCD images
NASA Astrophysics Data System (ADS)
Raab, Herbert
2012-03-01
Astrometrica is an interactive software tool for scientific-grade astrometric data reduction of CCD images. The current version of the software is for the Windows 32-bit operating system family. Astrometrica reads FITS (8-, 16- and 32-bit integer) and SBIG image files. The size of the images is limited only by available memory. It also offers automatic image calibration (dark frame and flat field correction), automatic reference star identification, automatic moving-object detection and identification, and access to new-generation star catalogs (PPMXL, UCAC 3 and CMC-14), in addition to online help and other features. Astrometrica is shareware, available for use free of charge for a limited period of time (100 days); special arrangements can be made for educational projects.
A new method for automatic discontinuity traces sampling on rock mass 3D model
NASA Astrophysics Data System (ADS)
Umili, G.; Ferrero, A.; Einstein, H. H.
2013-02-01
A new automatic method for discontinuity trace mapping and sampling on a rock mass digital model is described in this work. The implemented procedure allows one to automatically identify discontinuity traces on a Digital Surface Model: traces are detected directly as surface breaklines, by means of the maximum and minimum principal curvature values of the vertices that constitute the model surface. Color influence and user errors, which usually characterize trace mapping on images, are eliminated. Trace sampling procedures based on circular windows and circular scanlines have also been implemented: they are used to infer trace data and to calculate values of mean trace length, expected discontinuity diameter and intensity of rock discontinuities. The method is tested on a case study: results obtained by applying the automatic procedure to the DSM of a rock face are compared to those obtained by performing a manual sampling on the orthophotograph of the same rock face.
NASA Astrophysics Data System (ADS)
Xie, Huan; Luo, Xin; Xu, Xiong; Wang, Chen; Pan, Haiyan; Tong, Xiaohua; Liu, Shijie
2016-10-01
Water bodies are a fundamental element of urban ecosystems, and water mapping is critical for urban and landscape planning and management. While remote sensing has increasingly been used for water mapping in rural areas, applying this spatially explicit approach in urban areas remains challenging, because urban water bodies are mostly small and spectral confusion between water and complex urban features is widespread. The water index (WI) is the most common method for water extraction at the pixel level, and spectral mixture analysis (SMA) has recently been widely employed for analyzing the urban environment at the subpixel level. In this paper, we introduce an automatic subpixel water mapping method for urban areas using multispectral remote sensing data. The objectives of this research are: (1) to develop an automatic technique for extracting land-water mixed pixels using a water index; (2) to derive the most representative endmembers of water and land by utilizing neighboring water pixels and an adaptive iterative search for the optimal neighboring land pixel, respectively; and (3) to apply a linear unmixing model for subpixel water fraction estimation. Specifically, to automatically extract land-water pixels, locally weighted scatterplot smoothing is first applied to the original histogram curve of the WI image. The Otsu threshold is then used as a starting point for selecting land-water pixels, with the land and water thresholds determined from the slopes of the histogram curve. Based on this pixel-level processing, the image is divided into three parts: water pixels, land pixels, and mixed land-water pixels. Spectral mixture analysis (SMA) is then applied to the mixed land-water pixels for water fraction estimation at the subpixel level. Under the assumption that the endmember signature of a target pixel should be more similar to those of adjacent pixels because of spatial dependence, the water and land endmembers are determined from neighboring pure water or pure land pixels within a given distance. To obtain the most representative endmembers for SMA, we designed an adaptive iterative endmember selection method based on the spatial similarity of adjacent pixels. According to the spectral similarity within a spatially adjacent region, the land endmember spectrum is determined by selecting the most representative land pixel in a local window, and the water endmember spectrum is determined by averaging the water pixels in the local window. The proposed hierarchical processing method based on the WI and SMA (WISMA) is applied to urban areas and evaluated using Landsat-8 Operational Land Imager (OLI) images. For comparison, four methods at the pixel and subpixel levels were chosen. The results indicate that the water maps generated by the proposed method correspond closely with the reference water maps at subpixel precision, and that WISMA achieved the best performance in water mapping based on a comprehensive analysis of different accuracy evaluation indexes (RMSE and SE).
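The Otsu threshold and the two-endmember linear unmixing mentioned above are standard building blocks; a minimal Python sketch of both is given below. This is not the authors' WISMA code: the water-index values, the mixed-pixel handling and the endmember spectra are hypothetical, and the histogram smoothing and adaptive endmember search are omitted.

```python
# Hedged sketch (not the WISMA implementation): Otsu threshold on water-index
# values, and a two-endmember least-squares water fraction for a mixed pixel.
import numpy as np

def otsu_threshold(values, bins=256):
    """Classic Otsu threshold on a 1-D array of water-index values."""
    hist, edges = np.histogram(values, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = w[:i].sum(), w[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (w[:i] * centers[:i]).sum() / w0
        m1 = (w[i:] * centers[i:]).sum() / w1
        between = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t

def water_fraction(pixel, land_em, water_em):
    """Least-squares fraction f so that pixel ~= f*water_em + (1-f)*land_em."""
    d = water_em - land_em
    f = np.dot(pixel - land_em, d) / np.dot(d, d)
    return float(np.clip(f, 0.0, 1.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    wi = np.concatenate([rng.normal(-0.3, 0.05, 5000),   # land-like WI values
                         rng.normal(0.4, 0.05, 1000)])   # water-like WI values
    print("Otsu threshold:", round(otsu_threshold(wi), 3))
    land_em = np.array([0.12, 0.10, 0.30, 0.35])    # hypothetical reflectances
    water_em = np.array([0.06, 0.08, 0.04, 0.02])
    mixed_pixel = 0.4 * water_em + 0.6 * land_em
    print("estimated water fraction:",
          water_fraction(mixed_pixel, land_em, water_em))
```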
Co-PylotDB - A Python-Based Single-Window User Interface for Transmitting Information to a Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnette, Daniel W.
2012-01-05
Co-PylotDB, written completely in Python, provides a user interface (UI) with which to select user and data file(s), directories, and file content, and provide or capture various other information for sending data collected from running any computer program to a pre-formatted database table for persistent storage. The interface allows the user to select input, output, make, source, executable, and qsub files. It also provides fields for specifying the machine name on which the software was run, capturing compile and execution lines, and listing relevant user comments. Data automatically captured by Co-PylotDB and sent to the database are user, current directory, local hostname, current date, and time of send. The UI provides fields for logging into a local or remote database server, specifying a database and a table, and sending the information to the selected database table. If a server is not available, the UI provides for saving the command that would have saved the information to a database table for either later submission or for sending via email to a collaborator who has access to the desired database.
Computed Tomographic Window Setting for Bronchial Measurement to Guide Double-Lumen Tube Size.
Seo, Jeong-Hwa; Bae, Jinyoung; Paik, Hyesun; Koo, Chang-Hoon; Bahk, Jae-Hyon
2018-04-01
The bronchial diameter measured on computed tomography (CT) can be used to guide double-lumen tube (DLT) sizes objectively. The bronchus is known to be measured most accurately in the so-called bronchial CT window. The authors investigated whether using the bronchial window results in the selection of more appropriately sized DLTs than using the other windows. CT image analysis and prospective randomized study. Tertiary hospital. Adults receiving left-sided DLTs. The authors simulated selection of DLT sizes based on the left bronchial diameters measured in the lung (width 1,500 Hounsfield unit [HU] and level -700 HU), bronchial (1,000 HU and -450 HU), and mediastinal (400 HU and 25 HU) CT windows. Furthermore, patients were randomly assigned to undergo imaging with either the bronchial or mediastinal window to guide DLT sizes. Using the underwater seal technique, the authors assessed whether the DLT was appropriately sized, undersized, or oversized for the patient. On 130 CT images, the bronchial diameter (9.9 ± 1.2 mm v 10.5 ± 1.3 mm v 11.7 ± 1.3 mm) and the selected DLT size were different in the lung, bronchial, and mediastinal windows, respectively (p < 0.001). In 13 patients (17%), the bronchial diameter measured in the lung window suggested too small DLTs (28 Fr) for adults. In the prospective study, oversized tubes were chosen less frequently in the bronchial window than in the mediastinal window (6/110 v 23/111; risk ratio 0.38; 95% CI 0.19-0.79; p = 0.003). No tubes were undersized after measurements in these two windows. The bronchial measurement in the bronchial window guided more appropriately sized DLTs compared with the lung or mediastinal windows. Copyright © 2017 Elsevier Inc. All rights reserved.
Funama, Yoshinori; Utsunomiya, Daisuke; Taguchi, Katsuyuki; Oda, Seitaro; Shimonobo, Toshiaki; Yamashita, Yasuyuki
2014-05-01
To investigate whether electrocardiogram (ECG)-gated single- and dual-heartbeat computed tomography coronary angiography (CTCA) with automatic exposure control (AEC) yields images with uniform image noise at reduced radiation doses. Using an anthropomorphic chest CT phantom we performed prospectively ECG-gated single- and dual-heartbeat CTCA on a second-generation 320-multidetector CT volume scanner. The exposure phase window was set at 75%, 70-80%, 40-80%, and 0-100% and the heart rate at 60, 80, or corr80 bpm; images were reconstructed with filtered back projection (FBP) or iterative reconstruction (IR, adaptive iterative dose reduction 3D). We applied AEC and set the image noise level to 20 or 25 HU. For each technique we determined the image noise and the radiation dose to the phantom center. With half-scan reconstruction at 60 bpm, a 70-80% phase window and a 20-HU standard deviation (SD) setting, the image noise level and its variation along the z axis showed similar curves with FBP and IR. With half-scan reconstruction, the radiation dose to the phantom center with the 70-80% phase window was 18.89 and 12.34 mGy for FBP and 4.61 and 3.10 mGy for IR at SD settings of 20 and 25 HU, respectively. At 80 bpm with two-segment reconstruction the dose was approximately twice that at 60 bpm at both SD settings. However, the increase in radiation dose at corr80 bpm was limited to 1.39 times that at 60 bpm. AEC at ECG-gated single- and dual-heartbeat CTCA controls the image noise at different radiation doses. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Data in support of energy performance of double-glazed windows.
Shakouri, Mahmoud; Banihashemi, Saeed
2016-06-01
This paper provides the data used in a research project to propose a new simplified window rating system based on annual energy savings ("Developing an empirical predictive energy-rating model for windows by using Artificial Neural Network" (Shakouri Hassanabadi and Banihashemi Namini, 2012) [1], "Climatic, parametric and non-parametric analysis of energy performance of double-glazed windows in different climates" (Banihashemi et al., 2015) [2]). A full factorial simulation study was conducted to evaluate the performance of 26 different types of windows in a four-story residential building. In order to generalize the results, the selected windows were tested in four climates (cold, tropical, temperate, and hot and arid) and four main orientations (North, West, South and East). The accompanying datasets include the annual saved cooling and heating energy in different climates and orientations obtained by using the selected windows. Moreover, a complete dataset is provided that includes the specifications of the 26 windows, climate data, month, and orientation of the window. This dataset can be used to build predictive models for energy efficiency assessment of double-glazed windows.
Letter-sound processing deficits in children with developmental dyslexia: An ERP study.
Moll, Kristina; Hasko, Sandra; Groth, Katharina; Bartling, Jürgen; Schulte-Körne, Gerd
2016-04-01
The time course during letter-sound processing was investigated in children with developmental dyslexia (DD) and typically developing (TD) children using electroencephalography. Thirty-eight children with DD and 25 TD children participated in a visual-auditory oddball paradigm. Event-related potentials (ERPs) elicited by standard and deviant stimuli in an early (100-190 ms) and late (560-750 ms) time window were analysed. In the early time window, ERPs elicited by the deviant stimulus were delayed and less left lateralized over fronto-temporal electrodes for children with DD compared to TD children. In the late time window, children with DD showed higher amplitudes extending more over right frontal electrodes. Longer latencies in the early time window and stronger right hemispheric activation in the late time window were associated with slower reading and naming speed. Additionally, stronger right hemispheric activation in the late time window correlated with poorer phonological awareness skills. Deficits in early stages of letter-sound processing influence later more explicit cognitive processes during letter-sound processing. Identifying the neurophysiological correlates of letter-sound processing and their relation to reading related skills provides insight into the degree of automaticity during letter-sound processing beyond behavioural measures of letter-sound-knowledge. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Keeping PCs up to Date Can Be Fun
ERIC Educational Resources Information Center
Goldsborough, Reid
2004-01-01
The "joy" of computer maintenance takes many forms. These days, automation is the byword. Operating systems such as Microsoft Windows and utility suites such as Symantec's Norton Internet Security let you automatically keep crucial parts of your computer system up to date. It's fun to watch the technology keep tabs on itself. This document offers…
Wojtas-Niziurski, Wojciech; Meng, Yilin; Roux, Benoit; Bernèche, Simon
2013-01-01
The potential of mean force describing conformational changes of biomolecules is a central quantity that determines the function of biomolecular systems. Calculating an energy landscape of a process that depends on three or more reaction coordinates can require substantial computational power, making some multidimensional calculations practically impossible. Here, we present an efficient automated umbrella sampling strategy for calculating multidimensional potentials of mean force. The method progressively learns by itself, through a feedback mechanism, which regions of a multidimensional space are worth exploring and automatically generates a set of umbrella sampling windows that is adapted to the system. The self-learning adaptive umbrella sampling method is first explained with illustrative examples based on simplified reduced model systems, and then applied to two non-trivial situations: the conformational equilibrium of the pentapeptide Met-enkephalin in solution and ion permeation in the KcsA potassium channel. With this method, it is demonstrated that a significantly smaller number of umbrella windows needs to be employed to characterize the free energy landscape over the most relevant regions without any loss in accuracy. PMID:23814508
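A toy sketch of the feedback idea (windows are added only where the estimated free energy stays within a cutoff of the current minimum) is shown below. It is not the authors' implementation: the "free energy" function, grid spacing and cutoff are assumptions, and a real application would run an umbrella simulation and unbiasing at each window instead of evaluating an analytic surface.

```python
# Hedged sketch of the self-learning idea: expand umbrella windows on a 2-D grid
# only from regions whose (toy) free energy remains relevant.
import numpy as np

def toy_free_energy(x, y):
    # Two shallow basins connected by a saddle (arbitrary units, hypothetical).
    return 3.0 * (1 - np.exp(-((x - 1) ** 2 + y ** 2))) \
               * (1 - np.exp(-((x + 1) ** 2 + y ** 2)))

def adaptive_windows(seed, grid_step=0.25, cutoff=2.0, bounds=2.0):
    estimated = {seed: toy_free_energy(*seed)}   # windows already "sampled"
    frontier = [seed]
    while frontier:
        new_frontier = []
        for (x, y) in frontier:
            for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
                nb = (round(x + dx * grid_step, 6), round(y + dy * grid_step, 6))
                if nb in estimated or abs(nb[0]) > bounds or abs(nb[1]) > bounds:
                    continue
                f = toy_free_energy(*nb)          # stands in for an umbrella run
                estimated[nb] = f
                # Feedback step: only expand from windows that remain relevant.
                if f - min(estimated.values()) < cutoff:
                    new_frontier.append(nb)
        frontier = new_frontier
    return estimated

if __name__ == "__main__":
    windows = adaptive_windows(seed=(1.0, 0.0))
    kept = [w for w, f in windows.items() if f - min(windows.values()) < 2.0]
    print(f"{len(kept)} windows placed out of {int((2*2/0.25 + 1)**2)} grid points")
```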
Perovskite Photovoltachromic Supercapacitor with All-Transparent Electrodes.
Zhou, Feichi; Ren, Zhiwei; Zhao, Yuda; Shen, Xinpeng; Wang, Aiwu; Li, Yang Yang; Surya, Charles; Chai, Yang
2016-06-28
Photovoltachromic cells (PVCCs) are of great interest for the self-powered smart windows of architectures and vehicles, which require widely tunable transmittance and automatic color change under photostimuli. Organolead halide perovskite possesses high light absorption coefficient and enables thin and semitransparent photovoltaic device. In this work, we demonstrate co-anode and co-cathode photovoltachromic supercapacitors (PVCSs) by vertically integrating a perovskite solar cell (PSC) with MoO3/Au/MoO3 transparent electrode and electrochromic supercapacitor. The PVCSs provide a seamless integration of energy harvesting/storage device, automatic and wide color tunability, and enhanced photostability of PSCs. Compared with conventional PVCC, the counter electrodes of our PVCSs provide sufficient balancing charge, eliminate the necessity of reverse bias voltage for bleaching the device, and realize reasonable in situ energy storage. The color states of PVCSs not only indicate the amount of energy stored and energy consumed in real time, but also enhance the photostability of photovoltaic component by preventing its long-time photoexposure under fully charged state of PVCSs. This work designs PVCS devices for multifunctional smart window applications commonly made of glass.
Morphological Feature Extraction for Automatic Registration of Multispectral Images
NASA Technical Reports Server (NTRS)
Plaza, Antonio; LeMoigne, Jacqueline; Netanyahu, Nathan S.
2007-01-01
The task of image registration can be divided into two major components, i.e., the extraction of control points or features from images, and the search among the extracted features for the matching pairs that represent the same feature in the images to be matched. Manual extraction of control features can be subjective and extremely time consuming, and often results in few usable points. On the other hand, automated feature extraction allows using invariant target features such as edges, corners, and line intersections as relevant landmarks for registration purposes. In this paper, we present an extension of a recently developed morphological approach for automatic extraction of landmark chips and corresponding windows in a fully unsupervised manner for the registration of multispectral images. Once a set of chip-window pairs is obtained, a (hierarchical) robust feature matching procedure, based on a multiresolution overcomplete wavelet decomposition scheme, is used for registration purposes. The proposed method is validated on a pair of remotely sensed scenes acquired by the Advanced Land Imager (ALI) multispectral instrument and the Hyperion hyperspectral instrument aboard NASA's Earth Observing-1 satellite.
SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, C
Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed an electronic treatment plan reporting software tool that enables fully automatic PDF reporting from Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named "plan2pdf". plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, without any external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf will generate a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, Pinnacle plan summary, orthogonal views through each plan POI and the maximum dose point, DRRs for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-ups of patients' plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf is tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.
Secure Video Surveillance System Acquisition Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
2009-12-04
The SVSS Acquisition Software collects and displays video images from two cameras through a VPN, and stores the images on a collection controller. The software is configured to allow a user to enter a time window to display up to 2 1/2 hours of video for review. The software collects images from the cameras at a rate of 1 image per second and automatically deletes images older than 3 hours. The software code operates in a Linux environment and can be run in a virtual machine on Windows XP. The Sandia software integrates the different COTS software packages to build the video review system.
A median-Gaussian filtering framework for Moiré pattern noise removal from X-ray microscopy image.
Wei, Zhouping; Wang, Jian; Nichol, Helen; Wiebe, Sheldon; Chapman, Dean
2012-02-01
Moiré pattern noise in Scanning Transmission X-ray Microscopy (STXM) imaging introduces significant errors in qualitative and quantitative image analysis. Due to the complex origin of the noise, it is difficult to avoid Moiré pattern noise during the image data acquisition stage. In this paper, we introduce a post-processing method for filtering Moiré pattern noise from STXM images. This method includes a semi-automatic detection of the spectral peaks in the Fourier amplitude spectrum by using a local median filter, and elimination of the spectral noise peaks using a Gaussian notch filter. The proposed median-Gaussian filtering framework shows good results for STXM images whose dimensions are a power of two, provided that parameters such as the threshold, the sizes of the median and Gaussian filters, and the size of the low-frequency window have been properly selected. Copyright © 2011 Elsevier Ltd. All rights reserved.
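A minimal NumPy/SciPy sketch of the median-Gaussian idea is shown below; it is not the published implementation. The peak threshold, filter sizes and the protected low-frequency window are assumed parameters, and peak detection here is fully automatic rather than semi-automatic.

```python
# Hedged sketch: flag spectral peaks where the Fourier amplitude greatly exceeds
# a local median, then suppress them with Gaussian notches before inverse FFT.
import numpy as np
from scipy.ndimage import median_filter

def remove_moire(img, med_size=21, thresh=5.0, notch_sigma=2.0, keep_radius=16):
    F = np.fft.fftshift(np.fft.fft2(img))
    amp = np.abs(F)
    local_med = median_filter(amp, size=med_size)
    peaks = amp > thresh * (local_med + 1e-12)

    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    # Protect the low-frequency window around the spectrum centre.
    peaks &= (yy - cy) ** 2 + (xx - cx) ** 2 > keep_radius ** 2

    notch = np.ones_like(amp)
    for py, px in zip(*np.nonzero(peaks)):
        d2 = (yy - py) ** 2 + (xx - px) ** 2
        notch *= 1.0 - np.exp(-d2 / (2.0 * notch_sigma ** 2))   # Gaussian notch
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * notch)))

if __name__ == "__main__":
    y, x = np.mgrid[0:256, 0:256]
    clean = np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / 5000.0)
    noisy = clean + 0.2 * np.sin(2 * np.pi * (0.23 * x + 0.31 * y))  # fringe noise
    filtered = remove_moire(noisy)
    print("residual RMS:", float(np.sqrt(np.mean((filtered - clean) ** 2))))
```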
NASA Astrophysics Data System (ADS)
He, A.; Quan, C.
2018-04-01
The principal component analysis (PCA) and region matching combined method is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and the algorithm for conversion of orientation to direction in mask areas is computationally heavy and non-optimized. We propose an improved PCA-based region matching method for fringe direction estimation, which includes an improved and robust mask construction scheme, and a fast and optimized orientation-to-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, the fringe pattern filtered by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used for the Hilbert spiral transform (HST) to demodulate the phase. Subsequently, the windowed Fourier ridge (WFR) method is used for refinement of the phase. The robustness and effectiveness of the proposed method are demonstrated on both simulated and experimental fringe patterns.
Glnemo2: Interactive Visualization 3D Program
NASA Astrophysics Data System (ADS)
Lambert, Jean-Charles
2011-10-01
Glnemo2 is an interactive 3D visualization program developed in C++ using the OpenGL library and the Nokia Qt 4.x API. It displays in 3D the particle positions of the different components of an N-body snapshot. It quickly gives a lot of information about the data (shape, density area, formation of structures such as spirals, bars, or peanuts). It allows for in/out zooms, rotations, changes of scale, translations, selection of different groups of particles and plots in different blending colors. It can color particles according to their density or temperature, play with the density threshold, trace orbits, display different time steps, take automatic screenshots to make movies, select particles using the mouse, and fly over a simulation using a given camera path. All these features are accessible from a very intuitive graphical user interface. Glnemo2 supports a wide range of input file formats (Nemo, Gadget 1 and 2, phiGrape, Ramses, list of files, realtime gyrfalcON simulation) which are automatically detected at loading time without user intervention. Glnemo2 uses a plugin mechanism to load the data, so that it is easy to add a new file reader. It is powered by a 3D engine which uses the latest OpenGL technology, such as shaders (GLSL), vertex buffer objects and frame buffer objects, and takes into account the power of the graphics card used in order to accelerate the rendering. With a fast GPU, millions of particles can be rendered in real time. Glnemo2 runs on Linux, Windows (using the MinGW compiler), and Mac OS X, thanks to the Qt 4 API.
Infrared Sky Imager (IRSI) Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Victor R.
2016-04-01
The Infrared Sky Imager (IRSI) deployed at the Atmospheric Radiation Measurement (ARM) Climate Research Facility is a Solmirus Corp. All Sky Infrared Visible Analyzer. The IRSI is an automatic, continuously operating, digital imaging and software system designed to capture hemispheric sky images and provide time series retrievals of fractional sky cover during both the day and night. The instrument provides diurnal, radiometrically calibrated sky imagery in the mid-infrared atmospheric window and imagery in the visible wavelengths for cloud retrievals during daylight hours. The software automatically identifies cloudy and clear regions at user-defined intervals and calculates fractional sky cover, providing a real-time display of sky conditions.
Interior Reconstruction Using the 3d Hough Transform
NASA Astrophysics Data System (ADS)
Dumitru, R.-C.; Borrmann, D.; Nüchter, A.
2013-02-01
Laser scanners are often used to create accurate 3D models of buildings for civil engineering purposes, but the process of manually vectorizing a 3D point cloud is time consuming and error-prone (Adan and Huber, 2011). Therefore, the need to characterize and quantify complex environments in an automatic fashion arises, posing challenges for data analysis. This paper presents a system for 3D modeling by detecting planes in 3D point clouds, based on which the scene is reconstructed at a high architectural level through automatically removing clutter and foreground data. The implemented software detects openings, such as windows and doors, and completes the 3D model by inpainting.
Optical Characterization of Window Materials for Aerospace Applications
NASA Technical Reports Server (NTRS)
Tedjojuwono, Ken K.; Clark, Natalie; Humphreys, William M., Jr.
2013-01-01
An optical metrology laboratory has been developed to characterize the optical properties of optical window materials to be used for aerospace applications. Several optical measurement systems have been selected and developed to measure spectral transmittance, haze, clarity, birefringence, striae, wavefront quality, and wedge. In addition to silica based glasses, several optical lightweight polymer materials and transparent ceramics have been investigated in the laboratory. The measurement systems and selected empirical results for non-silica materials are described. These measurements will be used to form the basis of acceptance criteria for selection of window materials for future aerospace vehicle and habitat designs.
NASA Astrophysics Data System (ADS)
Xiao, Fan; Chen, Zhijun; Chen, Jianguo; Zhou, Yongzhang
2016-05-01
In this study, a novel batch sliding window (BSW) based singularity mapping approach is proposed. Compared to the traditional sliding window (SW) technique, which suffers from the empirical predetermination of a fixed maximum window size and the outlier sensitivity of the least-squares (LS) linear regression method, the BSW based singularity mapping approach can automatically determine the optimal size of the largest window for each estimated position, and utilizes robust linear regression (RLR), which is insensitive to outlier values. In the case study, tin geochemical data from Gejiu, Yunnan, have been processed by the BSW based singularity mapping approach. The results show that the BSW approach can improve the accuracy of the calculated singularity exponent values owing to the determination of the optimal maximum window size. The use of the RLR method in the BSW approach smooths the distribution of singularity index values, leaving few or no highly fluctuating values resembling noise points, which usually make a singularity map rough and discontinuous. Furthermore, the Student's t-statistic diagram indicates a strong spatial correlation between the high geochemical anomaly and known tin polymetallic deposits. Target areas within the high tin geochemical anomaly probably have much higher potential for the exploration of new tin polymetallic deposits than other areas, particularly areas that show strong tin geochemical anomalies but in which no tin polymetallic deposits have yet been found.
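For illustration, the following sketch estimates a window-based singularity index with a robust fit of log(mean concentration) against log(window size). It is not the authors' BSW code: scikit-learn's HuberRegressor stands in for the robust linear regression, and the largest window is simply the largest one that fits in the grid rather than being selected adaptively per location.

```python
# Hedged sketch: singularity exponent from nested square windows and a robust
# log-log regression; in 2-D the fitted slope equals alpha - 2.
import numpy as np
from sklearn.linear_model import HuberRegressor

def singularity_index(grid, row, col, half_sizes=(1, 2, 3, 4, 5, 6, 7)):
    sizes, means = [], []
    for h in half_sizes:
        r0, r1 = row - h, row + h + 1
        c0, c1 = col - h, col + h + 1
        if r0 < 0 or c0 < 0 or r1 > grid.shape[0] or c1 > grid.shape[1]:
            break                                  # largest window that still fits
        sizes.append(2 * h + 1)
        means.append(grid[r0:r1, c0:c1].mean())
    X = np.log(np.array(sizes)).reshape(-1, 1)
    y = np.log(np.array(means))
    slope = HuberRegressor().fit(X, y).coef_[0]    # robust to outlier windows
    return slope + 2.0                             # alpha = slope + E, with E = 2

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    background = rng.lognormal(mean=0.0, sigma=0.3, size=(64, 64))
    background[30:34, 30:34] *= 8.0                # a local enrichment "anomaly"
    print("alpha at anomaly:   ", round(singularity_index(background, 32, 32), 2))
    print("alpha in background:", round(singularity_index(background, 10, 10), 2))
```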
Zeng, Xueqiang; Luo, Gang
2017-12-01
Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state-of-the-art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
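The sketch below illustrates the general flavor of progressive sampling for automatic selection, not the authors' method: a simple pruning loop over growing data samples stands in for the Bayesian optimization, and the candidate algorithms and hyper-parameter values are arbitrary choices.

```python
# Hedged sketch: score candidate (algorithm, hyper-parameter) configurations on
# growing samples and drop poor performers early, so full-data fits are reserved
# for promising configurations.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def progressive_selection(X, y, candidates, fractions=(0.1, 0.3, 1.0), keep=0.5):
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
    alive = list(candidates)
    for frac in fractions:
        n = max(50, int(frac * len(X_tr)))
        scored = []
        for make_model in alive:
            model = make_model()
            model.fit(X_tr[:n], y_tr[:n])
            scored.append((accuracy_score(y_val, model.predict(X_val)), make_model))
        scored.sort(key=lambda t: t[0], reverse=True)
        alive = [m for _, m in scored[:max(1, int(len(scored) * keep))]]  # prune
    return scored[0]

if __name__ == "__main__":
    X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
    candidates = [
        lambda: LogisticRegression(max_iter=500),
        lambda: RandomForestClassifier(n_estimators=50, random_state=0),
        lambda: RandomForestClassifier(n_estimators=200, max_depth=5, random_state=0),
    ]
    best_score, _ = progressive_selection(X, y, candidates)
    print("best validation accuracy:", round(best_score, 3))
```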
NASA Technical Reports Server (NTRS)
Coggeshall, M. E.; Hoffer, R. M.
1973-01-01
Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.
The Use of Variable Q1 Isolation Windows Improves Selectivity in LC-SWATH-MS Acquisition.
Zhang, Ying; Bilbao, Aivett; Bruderer, Tobias; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard; Varesio, Emmanuel
2015-10-02
As tryptic peptides and metabolites are not equally distributed along the mass range, the probability of cross fragment ion interference is higher in certain windows when fixed Q1 SWATH windows are applied. We evaluated the benefits of utilizing variable Q1 SWATH windows with regard to selectivity improvement. Variable windows based on equalizing the distribution of either the precursor ion population (PIP) or the total ion current (TIC) within each window were generated by an in-house software, swathTUNER. These two variable Q1 SWATH window strategies outperformed, with respect to quantification and identification, the basic approach using a fixed window width (FIX) for proteomic profiling of human monocyte-derived dendritic cells (MDDCs). Thus, 13.8 and 8.4% additional peptide precursors, which resulted in 13.1 and 10.0% more proteins, were confidently identified by SWATH using the strategies PIP and TIC, respectively, in the MDDC proteomic sample. On the basis of the spectral library purity score, some improvement warranted by variable Q1 windows was also observed, albeit to a lesser extent, in the metabolomic profiling of human urine. We show that the novel concept of "scheduled SWATH" proposed here, which incorporates (i) variable isolation windows and (ii) precursor retention time segmentation, further improves both peptide and metabolite identifications.
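The precursor-ion-population strategy can be illustrated with a few lines of NumPy: window boundaries are placed at quantiles of the precursor m/z distribution so that each Q1 window isolates roughly the same number of precursors. This is a sketch, not the swathTUNER implementation; the m/z distribution, window count and overlap are hypothetical.

```python
# Hedged sketch: variable Q1 window bounds from equal-population (quantile) bins.
import numpy as np

def variable_windows(precursor_mz, n_windows=32, overlap=1.0):
    """Return (low, high) m/z bounds with ~equal precursor counts per window."""
    edges = np.quantile(precursor_mz, np.linspace(0.0, 1.0, n_windows + 1))
    return [(float(lo) - overlap / 2, float(hi) + overlap / 2)
            for lo, hi in zip(edges[:-1], edges[1:])]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Tryptic-peptide-like m/z values concentrated between ~400 and ~900 (synthetic).
    mz = np.clip(rng.gamma(shape=6.0, scale=110.0, size=20000), 400, 1250)
    for lo, hi in variable_windows(mz, n_windows=8)[:4]:
        print(f"{lo:7.2f} - {hi:7.2f}  width {hi - lo:6.2f}")
```

Dense m/z regions receive narrow windows and sparse regions receive wide ones, which is the intended selectivity gain.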
Process for selectively patterning epitaxial film growth on a semiconductor substrate
Sheldon, P.; Hayes, R.E.
1984-12-04
Disclosed is a process for selectively patterning epitaxial film growth on a semiconductor substrate. The process includes forming a masking member on the surface of the substrate, the masking member having at least two layers including a first layer disposed on the substrate and the second layer covering the first layer. A window is then opened in a selected portion of the second layer by removing that portion to expose the first layer thereunder. The first layer is then subjected to an etchant introduced through the window to dissolve the first layer a sufficient amount to expose the substrate surface directly beneath the window, the first layer being adapted to preferentially dissolve at a substantially greater rate than the second layer so as to create an overhanging ledge portion with the second layer by undercutting the edges thereof adjacent the window. The epitaxial film is then deposited on the exposed substrate surface directly beneath the window. Finally, an etchant is introduced through the window to dissolve the remainder of the first layer so as to lift-off the second layer and materials deposited thereon to fully expose the balance of the substrate surface.
Process for selectively patterning epitaxial film growth on a semiconductor substrate
Sheldon, Peter; Hayes, Russell E.
1986-01-01
A process is disclosed for selectively patterning epitaxial film growth on a semiconductor substrate. The process includes forming a masking member on the surface of the substrate, the masking member having at least two layers including a first layer disposed on the substrate and the second layer covering the first layer. A window is then opened in a selected portion of the second layer by removing that portion to expose the first layer thereunder. The first layer is then subjected to an etchant introduced through the window to dissolve a sufficient amount of the first layer to expose the substrate surface directly beneath the window, the first layer being adapted to preferentially dissolve at a substantially greater rate than the second layer so as to create an overhanging ledge portion with the second layer by undercutting the edges thereof adjacent to the window. The epitaxial film is then deposited on the exposed substrate surface directly beneath the window. Finally, an etchant is introduced through the window to dissolve the remainder of the first layer so as to lift-off the second layer and materials deposited thereon to fully expose the balance of the substrate surface.
Automatic rapid attachable warhead section
Trennel, A.J.
1994-05-10
Disclosed are a method and apparatus for automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly. 10 figures.
Automatic rapid attachable warhead section
Trennel, Anthony J.
1994-05-10
Disclosed are a method and apparatus for (1) automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, (2) automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, (3) manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and (4) automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly.
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-01
.... Actual pile driving time during this work window will depend on a number of factors, such as sediments... period beginning in November 2010, and ending in February 2011. This work window was selected to coincide.... The work window also coincides with the USFWS' required construction work window to avoid the peak...
Interactive floating windows: a new technique for stereoscopic video games
NASA Astrophysics Data System (ADS)
Zerebecki, Chris; Stanfield, Brodie; Tawadrous, Mina; Buckstein, Daniel; Hogue, Andrew; Kapralos, Bill
2012-03-01
The film industry has a long history of creating compelling experiences in stereoscopic 3D. Recently, the video game as an artistic medium has matured into an effective way to tell engaging and immersive stories. Given the current push to bring stereoscopic 3D technology into the consumer market, there is considerable interest in developing stereoscopic 3D video games. Game developers have largely ignored the need to design their games specifically for stereoscopic 3D and have thus relied on automatic conversion and driver technology. Game developers need to evaluate solutions used in other media, such as film, to correct perceptual problems such as window violations, and modify or create new solutions to work within an interactive framework. In this paper we extend the dynamic floating window technique into the interactive domain, enabling the player to position a virtual window in space. By interactively changing the position, size, and 3D rotation of the virtual window, objects can be made to 'break the mask', dramatically enhancing the stereoscopic effect. By demonstrating that solutions from the film industry can be extended into the interactive space, it is our hope that this initiates further discussion in the game development community to strengthen story-telling mechanisms in stereoscopic 3D games.
Use Them ... or Lose Them? The Case for and against Using QR Codes
ERIC Educational Resources Information Center
Cunningham, Chuck; Dull, Cassie
2011-01-01
A quick-response (QR) code is a two-dimensional, black-and-white square barcode and links directly to a URL of one's choice. When the code is scanned with a smartphone, it will automatically redirect the user to the designated URL. QR codes are popping up everywhere--billboards, magazines, posters, shop windows, TVs, computer screens, and more.…
Automatic segmentation of psoriasis lesions
NASA Astrophysics Data System (ADS)
Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang
2014-10-01
The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the assessment of lesions. Current algorithms can only handle erythema alone or only deal with scaling segmentation. In practice, scaling and erythema are often mixed together. In order to obtain a segmentation of the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, exploiting the skin's Tyndall effect in the imaging, to eliminate reflections, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to obtain texture and color features. In this step, an image roughness feature is defined, so that scaling can easily be separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. This algorithm can give reliable segmentation results even when images have different lighting conditions and skin types. On the data set offered by Union Hospital, more than 90% of images can be segmented accurately.
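A compact sketch of such a pipeline is given below, using hypothetical data and labels: Lab colour statistics and a simple roughness measure (local standard deviation of lightness) are computed per sliding window and classified with a random forest. The window size and the roughness definition are assumptions, not the authors' choices.

```python
# Hedged sketch: per-window Lab colour + roughness features fed to a random forest.
import numpy as np
from skimage.color import rgb2lab
from sklearn.ensemble import RandomForestClassifier

def window_features(rgb_img, win=16):
    lab = rgb2lab(rgb_img)
    feats, positions = [], []
    h, w, _ = lab.shape
    for r in range(0, h - win + 1, win):
        for c in range(0, w - win + 1, win):
            patch = lab[r:r + win, c:c + win]
            colour = patch.reshape(-1, 3).mean(axis=0)        # mean L, a, b
            roughness = patch[..., 0].std()                   # texture proxy
            feats.append(np.concatenate([colour, [roughness]]))
            positions.append((r, c))
    return np.array(feats), positions

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((128, 128, 3))          # stand-in for a polarized-light image
    X, pos = window_features(img)
    y = rng.integers(0, 3, size=len(X))      # stand-in labels: 0 skin, 1 erythema, 2 scaling
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print("windows classified:", len(clf.predict(X)))
```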
NASA Astrophysics Data System (ADS)
Ohnuma, Hidetoshi; Kawahira, Hiroichi
1998-09-01
An automatic alternative phase shift mask (PSM) pattern layout tool has been newly developed. This tool is dedicated to embedded DRAM in logic devices, shrinking gate line width while improving line width controllability in lithography processes with a design rule below 0.18 micrometers using KrF excimer laser exposure. The tool can create a Levenson-type PSM that is used, coupled with a binary mask, in a double-exposure method for positive photoresist. By using graphs, this tool automatically creates alternative PSM patterns. Moreover, it does not produce any phase conflicts. By applying it to actual embedded DRAM in logic cells, we have produced 0.16 micrometer gate resist patterns in both random logic and DRAM areas. The patterns were fabricated using two masks with the double exposure method. Gate line width has been well controlled within a practical exposure-focus window.
Searchfield, Grant D; Linford, Tania; Kobayashi, Kei; Crowhen, David; Latzel, Matthias
2018-03-01
To compare preference for and performance of manually selected programmes with an automatic sound classifier, the Phonak AutoSense OS. A single-blind repeated-measures study. Participants were fit with Phonak Virto V90 ITE aids; preferences for different listening programmes were compared across four different sound scenarios (speech in: quiet, noise, loud noise and a car). Following a 4-week trial, preferences were reassessed and the user's preferred programme was compared to the automatic classifier for sound quality and hearing in noise (HINT test) using a 12-loudspeaker array. Twenty-five participants with symmetrical moderate-severe sensorineural hearing loss. Participants' manual programme preferences for the scenarios varied considerably between and within sessions. A HINT Speech Reception Threshold (SRT) advantage was observed for the automatic classifier over participants' manual selections for speech in quiet, loud noise and car noise. Sound quality ratings were similar for both manual and automatic selections. The use of a sound classifier is a viable alternative to manual programme selection.
Alternative Fuels Data Center: Video Download Help
a Windows Media Video (WMV) link and select "Save Target As..." from the shortcut menu. To player on the screen. You can also expand the video to play in full-screen mode using Windows Media played in Windows Media Player. Download Windows Media Player. To watch videos on a Mac, double-click the
ERIC Educational Resources Information Center
Birmingham, Elina; Meixner, Tamara; Iarocci, Grace; Kanan, Christopher; Smilek, Daniel; Tanaka, James W.
2013-01-01
The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5-12 years and adults ("N" = 129) explored faces with a mouse-controlled window in an emotion recognition task. An…
Timing of repetition suppression of event-related potentials to unattended objects.
Stefanics, Gabor; Heinzle, Jakob; Czigler, István; Valentini, Elia; Stephan, Klaas Enno
2018-05-26
Current theories of object perception emphasize the automatic nature of perceptual inference. Repetition suppression (RS), the successive decrease of brain responses to repeated stimuli, is thought to reflect the optimization of perceptual inference through neural plasticity. While functional imaging studies revealed brain regions that show suppressed responses to the repeated presentation of an object, little is known about the intra-trial time course of repetition effects to everyday objects. Here we used event-related potentials (ERP) to task-irrelevant line-drawn objects, while participants engaged in a distractor task. We quantified changes in ERPs over repetitions using three general linear models (GLM) that modelled RS by an exponential, linear, or categorical "change detection" function in each subject. Our aim was to select the model with highest evidence and determine the within-trial time-course and scalp distribution of repetition effects using that model. Model comparison revealed the superiority of the exponential model indicating that repetition effects are observable for trials beyond the first repetition. Model parameter estimates revealed a sequence of RS effects in three time windows (86-140ms, 322-360ms, and 400-446ms) and with occipital, temporo-parietal, and fronto-temporal distribution, respectively. An interval of repetition enhancement (RE) was also observed (320-340ms) over occipito-temporal sensors. Our results show that automatic processing of task-irrelevant objects involves multiple intervals of RS with distinct scalp topographies. These sequential intervals of RS and RE might reflect the short-term plasticity required for optimization of perceptual inference and the associated changes in prediction errors (PE) and predictions, respectively, over stimulus repetitions during automatic object processing. This article is protected by copyright. All rights reserved. © 2018 The Authors European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
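To illustrate the model-comparison step on a single amplitude series (not the authors' GLM pipeline, which operates on full sensor-by-time data), the sketch below fits exponential and linear repetition-suppression models to synthetic per-repetition amplitudes and compares them with a simple AIC.

```python
# Hedged sketch: exponential vs linear repetition-suppression fits on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def exp_model(n, a, b, tau):
    return a + b * np.exp(-n / tau)

def lin_model(n, a, b):
    return a + b * n

def aic(y, y_hat, k):
    rss = np.sum((y - y_hat) ** 2)
    return len(y) * np.log(rss / len(y)) + 2 * k

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reps = np.arange(1, 11, dtype=float)                 # repetition number
    amp = 2.0 + 3.0 * np.exp(-reps / 2.5) + rng.normal(0, 0.2, reps.size)
    p_exp, _ = curve_fit(exp_model, reps, amp, p0=(1.0, 1.0, 2.0), maxfev=5000)
    p_lin, _ = curve_fit(lin_model, reps, amp)
    print("AIC exponential:", round(aic(amp, exp_model(reps, *p_exp), 3), 1))
    print("AIC linear:     ", round(aic(amp, lin_model(reps, *p_lin), 2), 1))
```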
Design and comparison of laser windows for high-power lasers
NASA Astrophysics Data System (ADS)
Niu, Yanxiong; Liu, Wenwen; Liu, Haixia; Wang, Caili; Niu, Haisha; Man, Da
2014-11-01
High-power laser systems are increasingly widely used in industry and military applications. It is necessary to develop a high-power laser system that can operate over long periods of time without appreciable degradation in performance. When a high-energy laser beam is transmitted through a laser window, permanent damage may be caused to the window because of energy absorption by the window material. Therefore, when designing a high-power laser system, a suitable laser window material must be selected and the laser damage threshold of the window must be known. In this paper, a thermal analysis model of a high-power laser window is established, and the relationship between the laser intensity and the thermal-stress field distribution is studied by deriving the formulas using the integral-transform method. The influence of window radius, thickness and laser intensity on the temperature and stress field distributions is analyzed. Then, the performance of K9 glass and fused silica glass is compared, and the laser-induced damage mechanism is analyzed. Finally, the damage thresholds of the laser windows are calculated. The results show that, compared with K9 glass, fused silica glass has a higher damage threshold due to its good thermodynamic properties. The presented theoretical analysis and simulation results are helpful for the design and selection of high-power laser windows.
Kamphuis, C; Mollenhorst, H; Heesterbeek, J A P; Hogeveen, H
2010-08-01
The objective was to develop and validate a clinical mastitis (CM) detection model by means of decision-tree induction. For farmers milking with an automatic milking system (AMS), it is desirable that the detection model has a high level of sensitivity (Se), especially for more severe cases of CM, at a very high specificity (Sp). In addition, an alert for CM should be generated preferably at the quarter milking (QM) at which the CM infection is visible for the first time. Data were collected from 9 Dutch dairy herds milking automatically during a 2.5-yr period. Data included sensor data (electrical conductivity, color, and yield) at the QM level and visual observations of quarters with CM recorded by the farmers. Visual observations of quarters with CM were combined with sensor data of the most recent automatic milking recorded for that same quarter, within a 24-h time window before the visual assessment time. Sensor data of 3.5 million QM were collected, of which 348 QM were combined with a CM observation. Data were divided into a training set, including two-thirds of all data, and a test set. Cows in the training set were not included in the test set and vice versa. A decision-tree model was trained using only clear examples of healthy (n=24,717) or diseased (n=243) QM. The model was tested on 105 QM with CM and a random sample of 50,000 QM without CM. While keeping the Se at a level comparable to that of models currently used by AMS, the decision-tree model was able to decrease the number of false-positive alerts by more than 50%. At an Sp of 99%, 40% of the CM cases were detected. Sixty-four percent of the severe CM cases were detected and only 12.5% of the CM that were scored as watery milk. The Se increased considerably from 40% to 66.7% when the time window increased from less than 24h before the CM observation, to a time window from 24h before to 24h after the CM observation. Even at very wide time windows, however, it was impossible to reach an Se of 100%. This indicates the inability to detect all CM cases based on sensor data alone. Sensitivity levels varied largely when the decision tree was validated per herd. This trend was confirmed when decision trees were trained using data from 8 herds and tested on data from the ninth herd. This indicates that when using the decision tree as a generic CM detection model in practice, some herds will continue having difficulties in detecting CM using mastitis alert lists, whereas others will perform well. Copyright (c) 2010 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
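As a hedged illustration of this kind of alert model (not the authors' decision tree or data), the sketch below trains a small tree on synthetic quarter-milking features and then picks the alert threshold that holds specificity near 99% on healthy quarters, mirroring the Se-at-fixed-Sp evaluation described above.

```python
# Hedged sketch: decision-tree CM alerts at a ~99% specificity operating point.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_healthy, n_cm = 20000, 200
# Columns: electrical conductivity, colour index, yield (all synthetic).
healthy = np.column_stack([rng.normal(5.5, 0.4, n_healthy),
                           rng.normal(0.0, 1.0, n_healthy),
                           rng.normal(2.5, 0.6, n_healthy)])
mastitis = np.column_stack([rng.normal(6.6, 0.6, n_cm),
                            rng.normal(1.5, 1.2, n_cm),
                            rng.normal(1.8, 0.7, n_cm)])
X = np.vstack([healthy, mastitis])
y = np.concatenate([np.zeros(n_healthy), np.ones(n_cm)])

tree = DecisionTreeClassifier(max_depth=4, class_weight="balanced", random_state=0)
tree.fit(X, y)
p = tree.predict_proba(X)[:, 1]

# Alert threshold chosen so that ~99% of healthy quarters fall below it.
threshold = np.quantile(p[y == 0], 0.99)
alerts = p >= threshold
sensitivity = alerts[y == 1].mean()
specificity = 1.0 - alerts[y == 0].mean()
print(f"Se = {sensitivity:.2f} at Sp = {specificity:.3f}")
```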
ERIC Educational Resources Information Center
Law, Maxine S.
1980-01-01
Presented are three curriculum sequences about windows that were designed as part of an architectural education course for junior high school students in Ohio. The three units, or "encounters," deal with stained glass, styles of windows, and functions of the window. Each includes student objectives and selected activities. (SJL)
SplitRacer - a new Semi-Automatic Tool to Quantify And Interpret Teleseismic Shear-Wave Splitting
NASA Astrophysics Data System (ADS)
Reiss, M. C.; Rumpker, G.
2017-12-01
We have developed a semi-automatic, MATLAB-based GUI that combines standard seismological tasks such as the analysis and interpretation of teleseismic shear-wave splitting. Shear-wave splitting analysis is widely used to infer seismic anisotropy, which can be interpreted in terms of lattice-preferred orientation of mantle minerals, or shape-preferred orientation caused by fluid-filled cracks or alternating layers. Seismic anisotropy provides a unique link between directly observable surface structures and the more elusive dynamic processes in the mantle below. Thus, resolving the seismic anisotropy of the lithosphere/asthenosphere is of particular importance for geodynamic modeling and interpretations. The increasing number of seismic stations from temporary experiments and permanent installations creates a new basis for comprehensive studies of seismic anisotropy world-wide. However, the increasingly large data sets pose new challenges for the rapid and reliable analysis of teleseismic waveforms and for the interpretation of the measurements. Well-established routines and programs are available but are often impractical for analyzing large data sets from hundreds of stations. Additionally, shear-wave splitting results are seldom evaluated using the same well-defined quality criteria, which may complicate comparison with results from different studies. SplitRacer has been designed to overcome these challenges by incorporating the following processing steps: i) downloading of waveform data from multiple stations in mseed format using FDSNWS tools; ii) automated initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) particle-motion analysis of selected phases at longer periods to detect and correct for sensor misalignment; iv) splitting analysis of selected phases based on transverse-energy minimization for multiple, randomly selected, relevant time windows; v) one- and two-layer joint-splitting analysis for all phases at one station by simultaneously minimizing their transverse energy, including the analysis of null measurements; vi) comparison of the results with theoretical splitting parameters determined for one, two, or continuously varying anisotropic layer(s). Examples of the application of SplitRacer will be presented.
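The transverse-energy-minimization step (item v) can be sketched as a grid search over fast-axis azimuth and delay time, as below. This is not the SplitRacer code: the synthetic wavelet, backazimuth, grid spacing and sign conventions are assumptions, and real data would additionally require windowing, bandpass filtering and error estimation.

```python
# Hedged sketch: grid search over (fast azimuth, delay) minimizing transverse energy.
import numpy as np

def splitting_grid_search(north, east, baz_deg, dt_samp, max_delay=1.5, dphi=5.0):
    baz = np.radians(baz_deg)
    best = (None, None, np.inf)
    for phi_deg in np.arange(0.0, 180.0, dphi):
        phi = np.radians(phi_deg)
        fast = north * np.cos(phi) + east * np.sin(phi)
        slow = -north * np.sin(phi) + east * np.cos(phi)
        for delay in np.arange(0.0, max_delay + 1e-9, dt_samp):
            shift = int(round(delay / dt_samp))
            slow_corr = np.roll(slow, -shift)               # advance slow component
            n_corr = fast * np.cos(phi) - slow_corr * np.sin(phi)
            e_corr = fast * np.sin(phi) + slow_corr * np.cos(phi)
            transverse = -n_corr * np.sin(baz) + e_corr * np.cos(baz)
            energy = float(np.sum(transverse ** 2))
            if energy < best[2]:
                best = (phi_deg, delay, energy)
    return best

if __name__ == "__main__":
    dt = 0.05
    t = np.arange(0, 30, dt)
    wavelet = np.exp(-((t - 15) / 2.0) ** 2) * np.sin(2 * np.pi * 0.2 * (t - 15))
    baz = 60.0                                   # radial polarisation along backazimuth
    comp_n = wavelet * np.cos(np.radians(baz))
    comp_e = wavelet * np.sin(np.radians(baz))
    # Build split synthetics: fast axis at 30 deg, slow component delayed by 1.0 s.
    phi_true, delay_true = np.radians(30.0), 1.0
    fast = comp_n * np.cos(phi_true) + comp_e * np.sin(phi_true)
    slow = np.roll(-comp_n * np.sin(phi_true) + comp_e * np.cos(phi_true),
                   int(round(delay_true / dt)))
    north = fast * np.cos(phi_true) - slow * np.sin(phi_true)
    east = fast * np.sin(phi_true) + slow * np.cos(phi_true)
    print(splitting_grid_search(north, east, baz, dt)[:2])   # ~ (30.0, 1.0)
```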
NASA Astrophysics Data System (ADS)
Creusen, I. M.; Hazelhoff, L.; De With, P. H. N.
2013-10-01
In large-scale automatic traffic sign surveying systems, the primary computational effort is concentrated at the traffic sign detection stage. This paper focuses on reducing the computational load of the sliding-window object detection algorithm employed for traffic sign detection. Sliding-window object detectors often use a linear SVM to classify the features in a window. In this case, the classification can be seen as a convolution of the feature maps with the SVM kernel. It is well known that convolution can be efficiently implemented in the frequency domain for kernels larger than a certain size. We show that by careful reordering of sliding-window operations, most of the frequency-domain transformations can be eliminated, leading to a substantial increase in efficiency. Additionally, we suggest using the overlap-add method to keep memory use within reasonable bounds. This allows us to keep all the transformed kernels in memory, thereby eliminating even more domain transformations, and allows all scales in a multiscale pyramid to be processed using the same set of transformed kernels. For a typical sliding-window implementation, we have found that detector execution performance improves by a factor of 5.3. As a bonus, many of the detector improvements from the literature, e.g., chi-squared kernel approximations and sub-class splitting algorithms, can be applied more easily and at a lower performance penalty because of the improved scalability.
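The core trick can be sketched in a few lines: for a linear SVM, scoring every window position is a correlation of the feature map with the weight template, which equals a convolution with the flipped template and can therefore run through the FFT. The sketch below uses SciPy with assumed shapes and random data; scipy.signal.oaconvolve is the overlap-add variant of the same operation.

```python
# Sketch: sliding-window linear-SVM scoring as frequency-domain convolution.
import numpy as np
from scipy.signal import fftconvolve

feature_map = np.random.rand(480, 640)   # one feature channel of an image
svm_weights = np.random.rand(32, 32)     # linear-SVM weights for a 32x32 window

# Correlate the weights with the map at every window position; flipping the
# kernel turns the correlation into the convolution that fftconvolve computes.
scores = fftconvolve(feature_map, svm_weights[::-1, ::-1], mode="valid")

# Multi-channel features sum their per-channel scores:
# scores = sum(fftconvolve(f, w[::-1, ::-1], "valid") for f, w in zip(maps, weights))
detections = np.argwhere(scores > scores.mean() + 3 * scores.std())
print(scores.shape, len(detections))
```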
Ionospheric gravity wave measurements with the USU dynasonde
NASA Technical Reports Server (NTRS)
Berkey, Frank T.; Deng, Jun Yuan
1992-01-01
A method for the measurement of ionospheric gravity waves (GW) using the USU Dynasonde is outlined. This method consists of a series of individual procedures, which include functions for data acquisition, adaptive scaling, polarization discrimination, interpolation and extrapolation, digital filtering, windowing, spectrum analysis, GW detection, and graphics display. Concepts of system theory are applied to treat the ionosphere as a system. An adaptive ionogram scaling method was developed for automatically extracting ionogram echo traces from noisy raw sounding data. The method uses the well-known Least Mean Square (LMS) algorithm to form a stochastic optimal estimate of the echo trace, which is then used to control a moving window. The window tracks the echo trace, simultaneously eliminating the noise and interference. Experimental results show that the proposed method functions as designed. Case studies extracting GW from ionosonde measurements were carried out using the techniques described. Geophysically significant events were detected and the resultant processed results are illustrated graphically. The method was also developed with real-time implementation in mind.
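For readers unfamiliar with the LMS step, the sketch below shows the standard update rule on synthetic data; the tap count, step size, and test trace are illustrative, not the Dynasonde parameters.

```python
# Sketch of the LMS update used for adaptive trace tracking (illustrative).
import numpy as np

def lms_filter(x, d, n_taps=8, mu=0.01):
    """Adapt FIR weights w so the filter output tracks the desired signal d."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for k in range(n_taps, len(x)):
        u = x[k - n_taps:k][::-1]     # most recent input samples first
        y[k] = w @ u                  # filter output, the estimate of d[k]
        e = d[k] - y[k]               # estimation error
        w += 2 * mu * e * u           # stochastic-gradient (LMS) update
    return y, w

# Example: track a slowly varying trace buried in noise.
t = np.arange(2000)
trace = np.sin(2 * np.pi * t / 400)
noisy = trace + 0.5 * np.random.randn(t.size)
estimate, weights = lms_filter(noisy, trace)
print("final error RMS:", float(np.sqrt(np.mean((estimate[500:] - trace[500:]) ** 2))))
```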
Storing and Deploying Solar Panels
NASA Technical Reports Server (NTRS)
Browning, D. L.; Stocker, H. M.; Kleidon, E. H.
1982-01-01
Like upward-drawn window shades, solar blankets are unfurled to a length of 89 m, almost filling the opening in a 95.59-meter-square frame. When the frame is completely assembled, the solar blankets are pulled from canisters, one by one, by an electric motor. A thin cushion sheet is rolled up with each blanket to cushion the solar cells. The sheet is taken up on a roller as the blanket is unfurled. Unrolling proceeds automatically.
Noise limitations of multiplier phototubes in the radiation environment of space
NASA Technical Reports Server (NTRS)
Viehmann, W.; Eubanks, A. G.
1976-01-01
The contributions of Cerenkov emission, luminescence, secondary electron emission, and bremsstrahlung to the radiation-induced dark current and noise of multiplier phototubes were analyzed quantitatively. Fluorescence and Cerenkov emission in the tube window are the major contributors and can quantitatively account for dark count levels observed in orbit. Radiation-induced noise can be minimized by shielding, tube selection, and mode of operation. Optical decoupling of window and cathode (side-window tubes) leads to further reduction of radiation-induced dark counts, as does reducing the window thickness and effective cathode area, and selecting window/cathode combinations of low fluorescence efficiency. In trapped-radiation-free regions of near-earth orbits and in free space, Cerenkov emission by relativistic particles contributes predominantly to the photoelectron yield per event. Operating multiplier phototubes in the photon (pulse) counting mode will discriminate against these large pulses and substantially reduce the dark count and noise to levels determined by fluorescence.
Zdravevski, Eftim; Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger
2017-01-01
Assessment of the health benefits associated with physical activity depends on the activity duration, intensity and frequency, so their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether identification performance is improved when using two accelerometers at the hip and ankle, compared to using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature-engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: logistic regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods relative to the total time including the missed ones, was up to 0.875. It could be further improved, up to 0.967, by applying post-classification rules that considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be achieved from either accelerometer position. Machine learning techniques can be used for automatic activity recognition, as they provide very accurate recognition, significantly more accurate than a diary. Identification of jogging periods in adolescents can be performed using only one accelerometer. Performance-wise, there is no significant benefit from using accelerometers at both locations.
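The windowed-classification pipeline can be sketched as follows: segment the raw trace into fixed-length windows, extract simple per-window features, and train one of the named classifiers (Random Forest here). The sampling rate, feature set, and synthetic data are assumptions for illustration, not the study's engineered features.

```python
# Sketch: sliding-window segmentation, feature extraction, classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 10                     # assumed accelerometer sampling rate (Hz)
WIN = 60 * FS               # 60-second windows, one of the study's strategies

def window_features(signal, labels, win=WIN):
    X, y = [], []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        X.append([seg.mean(), seg.std(), np.abs(np.diff(seg)).mean(), seg.max()])
        y.append(int(labels[start:start + win].mean() > 0.5))  # majority label
    return np.array(X), np.array(y)

# Synthetic 3-h acceleration-magnitude trace with one diarized jogging bout.
signal = np.abs(np.random.randn(3 * 3600 * FS))
labels = np.zeros_like(signal)
labels[20000:38000] = 1                       # 30 min of jogging
signal[20000:38000] *= 3                      # jogging is more vigorous

X, y = window_features(signal, labels)
clf = RandomForestClassifier(n_estimators=200).fit(X, y)
print("windows:", len(y), "jogging windows:", int(y.sum()))
```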
Beltrame, Luca; Calura, Enrica; Popovici, Razvan R; Rizzetto, Lisa; Guedez, Damariz Rivero; Donato, Michele; Romualdi, Chiara; Draghici, Sorin; Cavalieri, Duccio
2011-08-01
Many models and analyses of signaling pathways have been proposed. However, none of them takes into account that a biological pathway is not a fixed system; instead, it depends on the organism, tissue and cell type, as well as on physiological, pathological and experimental conditions. The Biological Connection Markup Language (BCML) is a format to describe, annotate and visualize pathways. BCML can store multiple layers of information, permitting a selective view of the pathway as it exists and/or behaves in specific organisms, tissues and cells. Furthermore, BCML can be automatically converted into data formats suitable for analysis and into a fully SBGN-compliant graphical representation, making it an important tool that can be used by both computational biologists and 'wet lab' scientists. The XML schema and the BCML software suite are freely available under the LGPL for download at http://bcml.dc-atlas.net. They are implemented in Java and supported on MS Windows, Linux and OS X.
sEMG feature evaluation for identification of elbow angle resolution in graded arm movement.
Castro, Maria Claudia F; Colombini, Esther L; Aquino, Plinio T; Arjunan, Sridhar P; Kumar, Dinesh K
2014-11-25
Automatic and accurate identification of the elbow angle from surface electromyogram (sEMG) is essential for myoelectrically controlled upper-limb exoskeleton systems. This requires appropriate selection of sEMG features and identification of the limitations of such a system. This study has demonstrated that it is possible to identify three discrete positions of the elbow (full extension, right angle, and the mid-way point) with a window size of only 200 milliseconds. While most features were suitable for this purpose, Power Spectral Density Averages (PSD-Av) performed best. The system correctly classified the sEMG against the elbow angle in 100% of cases when only two discrete positions (full extension and elbow at right angle) were considered, while correct classification was 89% when there were three discrete positions. However, sEMG was unable to accurately determine the elbow position when five discrete angles were considered. It was also observed that there was no difference between the extension and flexion phases.
Kairisto, V; Poola, A
1995-01-01
GraphROC for Windows is a program for clinical test evaluation. It was designed for the handling of large datasets obtained from clinical laboratory databases. In the user interface, graphical and numerical presentations are combined. For simplicity, numerical data is not shown unless requested. Relevant numbers can be "picked up" from the graph by simple mouse operations. Reference distributions can be displayed by using automatically optimized bin widths. Any percentile of the distribution with corresponding confidence limits can be chosen for display. In sensitivity-specificity analysis, both illness- and health-related distributions are shown in the same graph. The following data for any cutoff limit can be shown in a separate click window: clinical sensitivity and specificity with corresponding confidence limits, positive and negative likelihood ratios, positive and negative predictive values and efficiency. Predictive values and clinical efficiency of the cutoff limit can be updated for any prior probability of disease. Receiver Operating Characteristics (ROC) curves can be generated and combined into the same graph for comparison of several different tests. The area under the curve with corresponding confidence interval is calculated for each ROC curve. Numerical results of analyses and graphs can be printed or exported to other Microsoft Windows programs. GraphROC for Windows also employs a new method, developed by us, for the indirect estimation of health-related limits and change limits from mixed distributions of clinical laboratory data.
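The quantities GraphROC reports can be reproduced with a few lines of Python; the sketch below computes an ROC curve, its AUC, and cutoff-specific sensitivity, specificity, and a predictive value updated for an arbitrary prior probability of disease via Bayes' rule. It assumes scikit-learn, and the test-value distributions are synthetic.

```python
# Sketch of GraphROC-style outputs on synthetic laboratory values.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
healthy = rng.normal(50, 10, 1000)   # health-related distribution
ill = rng.normal(70, 12, 200)        # illness-related distribution
values = np.concatenate([healthy, ill])
status = np.concatenate([np.zeros(1000), np.ones(200)])

fpr, tpr, cutoffs = roc_curve(status, values)
auc = roc_auc_score(status, values)

# Sensitivity/specificity and the positive predictive value at one chosen
# cutoff, with PPV updated for an arbitrary prevalence of disease.
cut, prevalence = 65.0, 0.05
se = float((ill > cut).mean())
sp = float((healthy <= cut).mean())
ppv = se * prevalence / (se * prevalence + (1 - sp) * (1 - prevalence))
print(f"AUC={auc:.3f}  Se={se:.2f}  Sp={sp:.2f}  PPV={ppv:.2f}")
```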
10 CFR 429.45 - Automatic commercial ice makers.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 3 2012-01-01 2012-01-01 false Automatic commercial ice makers. 429.45 Section 429.45... PRODUCTS AND COMMERCIAL AND INDUSTRIAL EQUIPMENT Certification § 429.45 Automatic commercial ice makers. (a... automatic commercial ice makers; and (2) For each basic model of automatic commercial ice maker selected for...
10 CFR 429.45 - Automatic commercial ice makers.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 3 2014-01-01 2014-01-01 false Automatic commercial ice makers. 429.45 Section 429.45... PRODUCTS AND COMMERCIAL AND INDUSTRIAL EQUIPMENT Certification § 429.45 Automatic commercial ice makers. (a... automatic commercial ice makers; and (2) For each basic model of automatic commercial ice maker selected for...
10 CFR 429.45 - Automatic commercial ice makers.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 3 2013-01-01 2013-01-01 false Automatic commercial ice makers. 429.45 Section 429.45... PRODUCTS AND COMMERCIAL AND INDUSTRIAL EQUIPMENT Certification § 429.45 Automatic commercial ice makers. (a... automatic commercial ice makers; and (2) For each basic model of automatic commercial ice maker selected for...
Automatic Detection of Welding Defects using Deep Neural Network
NASA Astrophysics Data System (ADS)
Hou, Wenhui; Wei, Ye; Guo, Jie; Jin, Yi; Zhu, Chang'an
2018-01-01
In this paper, we propose an automatic detection scheme comprising three stages for weld defects in x-ray images. First, a preprocessing procedure is applied to the image to locate the weld region. Then a classification model, trained and tested on patches cropped from x-ray images, is constructed based on a deep neural network; this model can learn the intrinsic features of the images without extra computation. Finally, a sliding-window approach is used to scan whole images with the trained model. In order to evaluate the performance of the model, we carried out several experiments. The results demonstrate that the proposed classification model is effective in detecting the quality of welded joints.
Automatic detection, tracking and sensor integration
NASA Astrophysics Data System (ADS)
Trunk, G. V.
1988-06-01
This report surveys the state of the art of automatic detection, tracking, and sensor integration. In the area of detection, various noncoherent integrators such as the moving window integrator, feedback integrator, two-pole filter, binary integrator, and batch processor are discussed. Next, three techniques for controlling false alarms (adaptive thresholds, nonparametric detectors, and clutter maps) are presented. In the area of tracking, a general outline is given of a track-while-scan system, followed by a discussion of the file system, contact-entry logic, coordinate systems, tracking filters, maneuver-following logic, track initiation, track-drop logic, and correlation procedures. Finally, in the area of multisensor integration, the problems of colocated-radar integration, multisite-radar integration, radar-IFF integration, and radar-DF bearing-strobe integration are treated.
docBUILDER - Building Your Useful Metadata for Earth Science Data and Services.
NASA Astrophysics Data System (ADS)
Weir, H. M.; Pollack, J.; Olsen, L. M.; Major, G. R.
2005-12-01
The docBUILDER tool, created by NASA's Global Change Master Directory (GCMD), assists the scientific community in efficiently creating quality data and services metadata. Metadata authors are asked to complete five required fields to ensure enough information is provided for users to discover the data and related services they seek. After the metadata record is submitted to the GCMD, it is reviewed for semantic and syntactic consistency. Currently, two versions are available - a Web-based tool accessible with most browsers (docBUILDERweb) and a stand-alone desktop application (docBUILDERsolo). The Web version is available through the GCMD website, at http://gcmd.nasa.gov/User/authoring.html. This version has been updated and now offers: personalized templates to ease entering similar information for multiple data sets/services; automatic population of Data Center/Service Provider URLs based on the selected center/provider; three-color support to indicate required, recommended, and optional fields; an editable text window containing the XML record, to allow for quick editing; and improved overall performance and presentation. The docBUILDERsolo version offers the ability to create metadata records on a computer wherever you are. Except for installation and the occasional update of keywords, data/service providers are not required to have an Internet connection. This freedom will allow users with portable computers (Windows, Mac, and Linux) to create records in field campaigns, whether in Antarctica or the Australian Outback. This version also offers a spell-checker, in addition to all of the features found in the Web version.
Improved Robustness and Efficiency for Automatic Visual Site Monitoring
2009-09-01
the space of expected poses. To avoid having to compare each test window with the whole training corpus, he builds a template hierarchy by ... directions of motion. In a second layer of clustering, it also learns how the low-level clusters co-occur with each other. An infinite mixture model is used ... implementation. We demonstrate the utility of this detector by modeling scene-level activities with a Hierarchical
Modeling of digital mammograms using bicubic spline functions and additive noise
NASA Astrophysics Data System (ADS)
Graffigne, Christine; Maintournam, Aboubakar; Strauss, Anne
1998-09-01
The purpose of our work is the detection of microcalcifications in digital mammograms. To do so, we model the grey levels of digital mammograms as the sum of a surface trend (a bicubic spline function) and additive noise or texture. We also introduce a robust estimation method to overcome the bias introduced by the microcalcifications. After the estimation, we treat the subtraction-image values as noise. If the noise is not correlated, we fit its probability distribution using the Pearson system of densities. This allows us to threshold the subtraction images accurately and therefore to detect the microcalcifications. If the noise is correlated, a unilateral autoregressive process is used and its coefficients are again estimated by the least-squares method. We then consider non-overlapping windows on the residue image. In each window the texture residue is computed and compared with an a priori threshold. This provides correct localization of the microcalcification clusters. However, this technique is considerably more time-consuming than the automatic threshold assuming uncorrelated noise and does not lead to significantly better results. In conclusion, even if the assumption of uncorrelated noise is not correct, automatic thresholding based on the Pearson system performs quite well on most of our images.
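A minimal sketch of the trend-plus-noise decomposition, assuming SciPy: fit a bicubic smoothing spline as the surface trend, subtract it, and threshold the residuals. The smoothing factor and the 3.5-sigma cutoff are illustrative; the paper fits a Pearson-system density to the residuals instead of assuming a simple cutoff.

```python
# Sketch: bicubic-spline surface trend plus thresholded residual noise.
import numpy as np
from scipy.interpolate import RectBivariateSpline

image = np.random.rand(256, 256) * 10          # stand-in for a mammogram patch
image[120:123, 80:83] += 15                    # a tiny bright cluster
rows, cols = np.arange(image.shape[0]), np.arange(image.shape[1])

# Bicubic (kx = ky = 3) smoothing spline as the surface trend; the smoothing
# factor s controls how much of the image is absorbed into the trend.
trend = RectBivariateSpline(rows, cols, image, kx=3, ky=3, s=1e6)(rows, cols)
residual = image - trend

# Simple automatic cutoff on the residual distribution; the paper instead
# fits a Pearson-system density, which handles skewed noise more faithfully.
mask = residual > residual.mean() + 3.5 * residual.std()
print("candidate microcalcification pixels:", int(mask.sum()))
```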
Shaping Attention with Reward: Effects of Reward on Space- and Object-Based Selection
Shomstein, Sarah; Johnson, Jacoba
2014-01-01
The contribution of rewarded actions to automatic attentional selection remains obscure. We hypothesized that some forms of automatic orienting, such as object-based selection, can be completely abandoned in lieu of reward maximizing strategy. While presenting identical visual stimuli to the observer, in a set of two experiments, we manipulate what is being rewarded (different object targets or random object locations) and the type of reward received (money or points). It was observed that reward alone guides attentional selection, entirely predicting behavior. These results suggest that guidance of selective attention, while automatic, is flexible and can be adjusted in accordance with external non-sensory reward-based factors. PMID:24121412
Detecting Hardware-assisted Hypervisor Rootkits within Nested Virtualized Environments
2012-06-14
least the minimum required for the guest OS and click "Next"; for 64-bit Windows 7 the minimum required is 2048 MB (Figure 66). ... When prompted for Memory, allocate at least the minimum required for the guest OS; for 64-bit Windows 7 the minimum required is 2048 MB (Figure 79). ... 21. Within the virtual disk creation wizard, select VDI for the file type (Figure 81). 22. Select Dynamically
NASA Astrophysics Data System (ADS)
Butkiewicz, T.
2014-12-01
We developed free software that enables researchers to utilize Microsoft's new Kinect for Windows v2 sensor for a range of coastal and ocean mapping applications, as well as for monitoring and measuring experimental scenes. While the original Kinect device used structured light and had very poor resolution, many geophysical researchers found uses for it in their experiments. The new generation of this sensor uses time-of-flight technology and can produce higher-resolution depth measurements with an order of magnitude more accuracy. It is also capable of measurement through and under water. An analysis tool in our application lets users quickly select any arbitrary surface in the sensor's view. The tool automatically scans the surface, then calibrates and aligns a measurement volume to it. Depth readings from the sensor are converted into 3D point clouds, and points falling within this volume are projected into surface coordinates. Raster images can be output that consist of height fields aligned to the surface, generated from these projected measurements and interpolations between them. Images have a simple 1 pixel = 1 mm resolution, with intensity values representing height in mm above the base plane, which enables easy measurement and calculations to be conducted on the images in other analysis packages. Single snapshots can be taken manually on demand, or the software can monitor the surface automatically, capturing frames at preset intervals. This produces time-lapse animations of dynamically changing surfaces. We apply this analysis tool to an experiment studying the behavior of underwater oil in response to flowing water of different speeds and temperatures. Blobs of viscous oils are placed in a flume apparatus, which circulates water past them. Over the course of a couple of hours, the oil blobs spread out, waves slowly ripple across their surfaces, and erosion occurs as smaller blobs break off from the main blob. All of this can be captured in 3D, with mm accuracy, through the water using the Kinect for Windows v2 sensor and our K2MapKit software.
A Method for the Seamlines Network Automatic Selection Based on Building Vector
NASA Astrophysics Data System (ADS)
Li, P.; Dong, Y.; Hu, Y.; Li, X.; Tan, P.
2018-04-01
In order to improve the efficiency of large-scale orthophoto production for cities, this paper presents a method for automatic selection of the seamline network in large-scale orthophotos based on building vectors. First, a simple model of each building is built by combining the building's vector, its height and the DEM, and the imaged area of the building on a single DOM is obtained. Then, the initial Voronoi network of the survey area is automatically generated based on the bottom positions of all images. Finally, the final seamline network is obtained by automatically optimizing all nodes and seamlines in the network based on the imaged areas of the buildings. The experimental results show that the proposed method not only routes the seamline network around buildings quickly, but also retains the Voronoi network's minimum-projection-distortion property, effectively solving the problem of automatic seamline-network selection in orthophoto mosaicking.
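The initial-network step maps directly onto a standard computational-geometry call; the sketch below, assuming SciPy and synthetic image positions, generates the Voronoi partition whose ridges serve as candidate seamlines before the building-aware refinement.

```python
# Sketch: initial Voronoi seamline network seeded at image bottom positions.
import numpy as np
from scipy.spatial import Voronoi

image_centers = np.random.rand(40, 2) * 1000.0   # ground positions of 40 images
vor = Voronoi(image_centers)

seamlines = []
for (i, j), ridge in zip(vor.ridge_points, vor.ridge_vertices):
    if -1 not in ridge:                     # skip ridges extending to infinity
        p, q = vor.vertices[ridge[0]], vor.vertices[ridge[1]]
        seamlines.append((i, j, p, q))      # seamline between images i and j
# The refinement step would move any segment (p, q) crossing a building
# footprint off that building's imaged area.
print(len(seamlines), "candidate seamlines")
```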
Multilocus Association Mapping Using Variable-Length Markov Chains
Browning, Sharon R.
2006-01-01
I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests. PMID:16685642
10 CFR 431.135 - Units to be tested.
Code of Federal Regulations, 2011 CFR
2011-01-01
... EQUIPMENT Automatic Commercial Ice Makers Test Procedures § 431.135 Units to be tested. For each basic model of automatic commercial ice maker selected for testing, a sample of sufficient size shall be selected...
Fragomeni, Breno de Oliveira; Misztal, Ignacy; Lourenco, Daniela Lino; Aguilar, Ignacio; Okimoto, Ronald; Muir, William M
2014-01-01
The purpose of this study was to determine whether the set of genomic regions inferred as accounting for the majority of genetic variation in quantitative traits remains stable over multiple generations of selection. The data set contained phenotypes for five generations of broiler chickens for body weight, breast meat, and leg score. The population consisted of 294,632 animals over five generations and also included genotypes of 41,036 single nucleotide polymorphisms (SNP) for 4,866 animals, after quality control. The SNP effects were calculated by a GWAS-type analysis using the single-step genomic BLUP approach for generations 1-3, 2-4, 3-5, and 1-5. Variances were calculated for windows of 20 SNP. The top ten windows for each trait that explained the largest fraction of the genetic variance across generations were examined. Across generations, the top 10 windows explained more than 0.5% but less than 1% of the total variance. Moreover, the pattern of the windows was not consistent across generations. The windows that explained the greatest variance changed greatly among the combinations of generations, with a few exceptions. In many cases, a window identified as a top window for one combination explained less than 0.1% for the other combinations. We conclude that identification of top SNP windows for a population may have little predictive power for genetic selection in the following generations for the traits evaluated here.
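The window-variance computation itself is simple; given genotypes and estimated SNP effects, the sketch below sums effects within 20-SNP windows and reports each window's share of the total genetic variance. Dimensions and data are synthetic placeholders far smaller than the study's.

```python
# Sketch: variance explained by 20-SNP windows (synthetic, reduced scale).
import numpy as np

n_animals, n_snp, win = 500, 4000, 20
geno = np.random.binomial(2, 0.3, size=(n_animals, n_snp)).astype(float)
effects = np.random.randn(n_snp) * 0.01       # per-SNP effect estimates

gebv = geno @ effects                         # total genomic value per animal
n_win = n_snp // win
window_var = np.array([
    np.var(geno[:, k * win:(k + 1) * win] @ effects[k * win:(k + 1) * win])
    for k in range(n_win)
])
share = window_var / np.var(gebv)             # each window's variance share
top10 = np.argsort(share)[::-1][:10]
print("top windows:", top10, "explaining", f"{share[top10].sum():.2%}")
```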
Ward, R. E.; Purves, T.; Feldman, M.; Schiffman, R. M.; Barry, S.; Christner, M.; Kipa, G.; McCarthy, B. D.; Stiphout, R.
1991-01-01
The Care Windows development project demonstrated the feasibility of an approach designed to add the benefits of an event-driven, graphically-oriented user interface to an existing Medical Information Management System (MIMS) without overstepping economic and logistic constraints. The design solution selected for the Care Windows project incorporates four important design features: (1) the effective de-coupling of servers from requesters, permitting the use of an extensive pre-existing library of MIMS servers; (2) the off-loading of the program-control functions of the requesters to the workstation processor, reducing the load per transaction on central resources and permitting the use of object-oriented development environments available for microcomputers; (3) the selection of a low-end, GUI-capable workstation consisting of a PC-compatible personal computer running Microsoft Windows 3.0; and (4) the development of a highly layered, modular workstation application, permitting the development of interchangeable modules to ensure portability and adaptability. PMID:1807665
Crustal Fracturing Field and Presence of Fluid as Revealed by Seismic Anisotropy
NASA Astrophysics Data System (ADS)
Pastori, M.; Piccinini, D.; de Gori, P.; Margheriti, L.; Barchi, M. R.; di Bucci, D.
2010-12-01
In the last three years, we have developed, tested and improved an automatic analysis code (Anisomat+) to calculate the shear-wave splitting parameters: fast polarization direction (φ) and delay time (δt). The code is a set of MatLab scripts able to retrieve crustal anisotropy parameters from three-component seismic recordings of local earthquakes using the horizontal-component cross-correlation method. The analysis procedure consists of choosing an appropriate frequency range, one that best highlights the signal containing the shear waves, and the length of a time window on the seismogram centered on the S arrival (the temporal window contains at least one cycle of the S wave). The code was compared to two other automatic analysis codes (SPY and SHEBA) and tested on three Italian areas (Val d'Agri, the Tiber Valley and the L'Aquila region) along the Apennine mountains. For each region we used the anisotropic parameters resulting from the automatic computation as a tool to determine the fracture-field geometries connected with the active stress field. We compare the temporal variations of anisotropic parameters to the evolution of the vp/vs ratio for the same seismicity. The anisotropic fast directions are used to define the active stress field (EDA model), and we find general consistency between fast directions and the main stress indicators (focal mechanisms and borehole break-outs). The magnitude of the delay time is used to define the fracture-field intensity, with higher values found in the volume where micro-seismicity occurs. Furthermore, we studied temporal variations of the anisotropic parameters and the vp/vs ratio in order to assess whether fluids play an important role in the earthquake generation process. The close association of variations in the anisotropic and vp/vs parameters with seismicity-rate changes supports the hypothesis that the background seismicity is influenced by fluctuations of pore-fluid pressure in the rocks.
An Approximate Approach to Automatic Kernel Selection.
Ding, Lizhong; Liao, Shizhong
2016-02-02
Kernel selection is a fundamental problem in kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection, and develop two approximate kernel selection algorithms by exploiting the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. Then, we prove an approximation error bound to measure the effect of approximating kernel matrices by multilevel circulant matrices on the hypothesis, and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.
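The computational virtue being exploited can be seen in one dimension: a circulant matrix is diagonalized by the DFT, so linear systems involving it solve in O(n log n) rather than O(n^3). The sketch below is a toy one-level analogue of the paper's multilevel construction: it builds a circulant approximation of a Gaussian kernel and solves a ridge-type system via the FFT, checking against the dense solve.

```python
# Sketch: FFT solve of (C + lambda*I) alpha = y for a circulant kernel C.
import numpy as np

n, lam = 1024, 0.1
t = np.arange(n)
# First column of a circulant approximation to a (periodized) Gaussian kernel.
dist = np.minimum(t, n - t)                     # circular distance
c = np.exp(-(dist ** 2) / (2 * 25.0 ** 2))

y = np.sin(2 * np.pi * t / 128) + 0.1 * np.random.randn(n)

# Eigenvalues of a circulant matrix are the DFT of its first column.
eig = np.fft.fft(c)
alpha = np.real(np.fft.ifft(np.fft.fft(y) / (eig + lam)))

# Check against the dense solve (feasible only at this small size).
C = np.array([np.roll(c, k) for k in range(n)]).T
assert np.allclose((C + lam * np.eye(n)) @ alpha, y, atol=1e-8)
```

The multilevel case generalizes this idea, with a multidimensional DFT diagonalizing the matrix across levels.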
Three-dimensional reconstruction from serial sections in PC-Windows platform by using 3D_Viewer.
Xu, Yi-Hua; Lahvis, Garet; Edwards, Harlene; Pitot, Henry C
2004-11-01
Three-dimensional (3D) reconstruction from serial sections allows identification of objects of interest in 3D and clarifies the relationships among these objects. 3D_Viewer, developed in our laboratory for this purpose, has four major functions: image alignment, movie-frame production, movie viewing, and shift-overlay image generation. Color images captured from serial sections are aligned; the contours of objects of interest are then highlighted in a semi-automatic manner. These 2D images are then automatically stacked at different viewing angles, and their composite images on a projected plane are recorded by an image transform-shift-overlay technique. These composite images are used in the object-rotation movie display. The design considerations of the program and the procedures used for 3D reconstruction from serial sections are described. This program, with a digital image-capture system, a semi-automatic contour-highlighting method, and an automatic image transform-shift-overlay technique, greatly speeds up the reconstruction process. Since images generated by 3D_Viewer are in a general graphic format, data sharing with others is easy. 3D_Viewer is written in MS Visual Basic 6 and is obtainable from our laboratory on request.
Acoustic window planning for ultrasound acquisition.
Göbl, Rüdiger; Virga, Salvatore; Rackerseder, Julia; Frisch, Benjamin; Navab, Nassir; Hennersperger, Christoph
2017-06-01
Autonomous robotic ultrasound has recently gained considerable interest, especially for collaborative applications. Existing methods for acquisition trajectory planning are based solely on geometrical considerations, such as the pose of the transducer with respect to the patient surface. This work aims at establishing acoustic window planning to enable autonomous ultrasound acquisitions of anatomies with restricted acoustic windows, such as the liver or the heart. We propose a fully automatic approach for the planning of acquisition trajectories, which only requires information about the target region as well as existing tomographic imaging data, such as X-ray computed tomography. The framework integrates both geometrical and physics-based constraints to estimate the best ultrasound acquisition trajectories with respect to the available acoustic windows. We evaluate the developed method using virtual planning scenarios based on real patient data as well as real robotic ultrasound acquisitions on a tissue-mimicking phantom. The proposed method yields superior image quality in comparison with a naive planning approach, while maintaining the necessary coverage of the target. We demonstrate that, by taking image-formation properties into account, acquisition planning methods can outperform naive planning. Furthermore, we show the need for such planning techniques, since naive approaches are not sufficient, as they do not take the expected image quality into account.
An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.
Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad
2016-01-01
Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach involves applying a Fast Fourier Transform (FFT) to a signal multiplied by an appropriate window function with fixed resolution. Selecting an appropriate window size is difficult when no background information about the input signal is available. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution. This results in high spectral resolution at low frequencies and high temporal resolution at high frequencies. The paper provides a simple but effective framework for switching between the STFT and the CQT. The proposed method also allows for the dynamic construction of a filter bank according to user-defined parameters, which helps reduce redundant entries in the filter bank. The results obtained from the proposed method not only improve spectrogram visualization but also reduce the computation cost, and the appropriate window length is selected in 87.71% of cases.
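A minimal sketch of the adaptive idea, assuming SciPy: estimate the occupied bandwidth from the spectrum and map it to an STFT window length, so narrow-band signals get long windows for fine frequency resolution. The bandwidth estimator and the mapping are illustrative heuristics, not the paper's empirical model.

```python
# Sketch: bandwidth-driven selection of the STFT window length.
import numpy as np
from scipy.signal import stft

fs = 8000
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 440 * t)                 # narrow-band test signal

# Crude bandwidth estimate: spread of spectral mass above 5% of the peak.
spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), 1 / fs)
occupied = freqs[spec > 0.05 * spec.max()]
bandwidth = occupied.max() - occupied.min() if occupied.size else fs / 2

# Heuristic: window long enough for ~10 FFT bins across the occupied band.
nperseg = int(np.clip(10 * fs / max(bandwidth, 1.0), 64, 4096))
f, tt, Z = stft(x, fs=fs, nperseg=nperseg)
print(f"bandwidth ~ {bandwidth:.1f} Hz -> nperseg = {nperseg}")
```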
Bond-selective imaging of deep tissue through the optical window between 1600 and 1850 nm.
Wang, Pu; Wang, Han-Wei; Sturek, Michael; Cheng, Ji-Xin
2012-01-01
We report the use of an optical window between 1600 nm and 1850 nm for bond-selective deep-tissue imaging through harmonic vibrational excitation and acoustic detection of the resultant pressure waves. In this window, where a local minimum of water absorption resides, we found a five-fold enhancement of the photoacoustic signal with first-overtone excitation of the methylene group CH(2) at 1730 nm, compared to second-overtone excitation at 1210 nm. The enhancement allows 3D mapping of intramuscular fat with improved contrast and of lipid deposition inside an atherosclerotic artery wall in the presence of blood. Moreover, lipid and protein are differentiated based on the first-overtone absorption profiles of the CH(2) and methyl (CH(3)) groups in this window. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Automatic P-S phase picking procedure based on Kurtosis: Vanuatu region case study
NASA Astrophysics Data System (ADS)
Baillard, C.; Crawford, W. C.; Ballu, V.; Hibert, C.
2012-12-01
Automatic P and S phase picking is indispensable for large seismological data sets. Robust algorithms based on comparison of short-term and long-term average ratios (Allen, 1982) are commonly used for event detection, but further improvements can be made in phase identification and picking. We present a picking scheme that consecutively applies Kurtosis-derived Characteristic Functions (CF) and eigenvalue decompositions to 3-component seismic data to independently pick P and S arrivals. When computed over a sliding window of the signal, a sudden increase in the CF reveals a transition from a Gaussian to a non-Gaussian distribution, characterizing the phase onset (Saragiotis, 2002). One advantage of the method is that it requires far fewer adjustable parameters than competing methods. We modified the Kurtosis CF to improve pick precision by computing the CF over several frequency bandwidths, window sizes and smoothing parameters. Once phases were picked, we determined the onset type (P or S) using polarization parameters (rectilinearity, azimuth and dip) calculated using eigenvalue decompositions of the covariance matrix (Cichowicz, 1993). Finally, we removed bad picks using a clustering procedure and the signal-to-noise ratio (SNR); a pick quality index was also assigned based on the SNR value. Amplitude calculation is integrated into the procedure to enable automatic magnitude calculation. We applied this procedure to data from a network of 30 wideband seismometers (including 10 ocean-bottom seismometers) in Vanuatu that ran for 10 months, from May 2008 to February 2009. We manually picked the first 172 events of June, whose local magnitudes range from 0.7 to 3.7, making a total of 1601 picks: 1094 P and 507 S. We then applied our automatic picking to the same dataset. 70% of the manually picked onsets were picked automatically. For P-picks, the difference between manual and automatic picks is 0.01 ± 0.08 s overall; for the best-quality picks (quality index 0: 64% of the P-picks) the difference is -0.01 ± 0.07 s. For S-picks, the difference is -0.09 ± 0.26 s overall and -0.06 ± 0.14 s for good-quality picks (index 1: 26% of the S-picks). Residuals showed no dependence on event magnitude. The method independently picks P and S waves with good precision and only a few parameters to adjust, even for relatively small earthquakes (mostly ≤ 2 Ml). The automatic procedure was then applied to the whole dataset. Earthquake locations obtained by inverting onset arrivals revealed clustering and lineations that helped us constrain the subduction plane. These key parameters will be integrated into 3D finite-difference modeling and compared to GPS data in order to better understand the complex geodynamic behavior of the Vanuatu region.
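The kurtosis CF at the heart of the scheme is easy to prototype; in the sketch below (synthetic trace, illustrative window length), the sliding-window kurtosis rises sharply when the window begins to include the non-Gaussian arrival, and the steepest rise marks the pick.

```python
# Sketch: sliding-window kurtosis as a characteristic function for picking.
import numpy as np
from scipy.stats import kurtosis

fs, onset = 100, 1500                  # sampling rate (Hz), true onset sample
noise = np.random.randn(3000)
signal = noise.copy()
arrival = 5 * np.sin(2 * np.pi * 5 * np.arange(1500) / fs) * np.exp(-np.arange(1500) / 300)
signal[onset:] += arrival

win = 200                              # 2-s sliding window
cf = np.array([kurtosis(signal[k - win:k]) for k in range(win, len(signal))])

# The CF jumps when the window starts to include the non-Gaussian arrival;
# pick at its steepest rise, offset back into trace-sample coordinates.
pick = int(np.argmax(np.diff(cf))) + win
print(f"picked sample {pick}, true onset {onset}")
```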
A Scalable and Dynamic Testbed for Conducting Penetration-Test Training in a Laboratory Environment
2015-03-01
entry point through which to execute a payload to accomplish a higher-level goal: executing arbitrary code, escalating privileges, pivoting ... Mobile Ad Hoc Network Emulator (EMANE) can emulate the entire network stack (physical to application-layer protocols). 2. Methodology: To build a ... to host Windows, Linux, MacOS, Android, and other operating systems without much effort. E. A simple and automatic "restore" function: Many
USACE Takes Going Green to New Heights
2010-08-01
building of the same size, a savings of 4.5 million gallons of drinking water annually. To accomplish this, low-flow faucets, urinals, and showerheads ... conserved with the help of room-occupancy sensors that will automatically turn lights on and off, depending on whether a room is occupied. Natural ... round for the personnel. To conserve this air, large windows in the complex will be highly insulated to prevent air from leaking outside.
Lithographic process window optimization for mask aligner proximity lithography
NASA Astrophysics Data System (ADS)
Voelkel, Reinhard; Vogler, Uwe; Bramati, Arianna; Erdmann, Andreas; Ünal, Nezih; Hofmann, Ulrich; Hennemeyer, Marc; Zoberbier, Ralph; Nguyen, David; Brugger, Juergen
2014-03-01
We introduce a complete methodology for process-window optimization in proximity mask aligner lithography. The commercially available lithography simulation software LAB from GenISys GmbH was used for simulation of light propagation and 3D resist development. The methodology was tested on the practical example of lines and spaces, 5 micron half-pitch, printed in a 1-micron-thick layer of AZ® 1512HS positive photoresist on a silicon wafer. A SUSS MicroTec MA8 mask aligner, equipped with MO Exposure Optics®, was used in simulation and experiment. MO Exposure Optics® is the latest generation of illumination systems for mask aligners. It provides telecentric illumination and excellent light uniformity over the full mask field, and it allows the lithography engineer to freely shape the angular spectrum of the illumination light (customized illumination), which is a mandatory requirement for process-window optimization. Three different illumination settings were tested for proximity gaps of 0 to 100 microns. The results obtained prove that the introduced process-window methodology is a major step toward more robust processes in mask aligner lithography. The most remarkable outcome of the presented study is that a smaller exposure gap does not automatically lead to better print results in proximity lithography, contrary to what a lithographer's instinct would suggest. With more than 5,000 mask aligners installed in research and industry worldwide, the proposed process-window methodology could have a significant impact on yield improvement and cost savings in industry.
Electro-optic tracking R&D for defense surveillance
NASA Astrophysics Data System (ADS)
Sutherland, Stuart; Woodruff, Chris J.
1995-09-01
Two aspects of work on automatic target detection and tracking for electro-optic (EO) surveillance are described. First, a detection and tracking algorithm test-bed, developed by DSTO and running on a PC under Windows NT, is being used to assess candidate algorithms for detection of unresolved and minimally resolved targets. The structure of this test-bed is described and examples are given of its user interfaces and outputs. Second, the development by Australian industry, under a Defence-funded contract, of a reconfigurable generic track processor (GTP) is outlined. The GTP will include reconfigurable image-processing stages and target-tracking algorithms. It will be used to demonstrate automatic detection and tracking capabilities to the Australian Defence Force, and to serve as a hardware base for real-time algorithm refinement.
ACHP Issues Program Comment for GSA on Select Repairs and Upgrades
The program comment covers repairs and upgrades to windows, lighting, roofing, and heating, ventilating, and air-conditioning (HVAC) systems.
NASA Astrophysics Data System (ADS)
Wang, J.; Feng, B.
2016-12-01
Impervious surface area (ISA) has long been studied as an important input to moisture-flux models. In general, ISA impedes groundwater recharge, increases stormflow/flood frequency, and alters in-stream and riparian habitats. Urban areas are recognized as among the richest ISA environments, and urban ISA mapping assists flood prevention and urban planning. Hyperspectral imagery (HI), with its ability to detect subtle spectral signatures, is an ideal candidate for urban ISA mapping. Mapping ISA from HI involves endmember (EM) selection. The high degree of spatial and spectral heterogeneity of the urban environment makes this task difficult: a compromise is needed between the degree of automation and the representativeness of the method. This study tested one manual and two semi-automatic EM selection strategies. The manual method and the first semi-automatic method have been widely used in EM selection. The second semi-automatic EM selection method is rather new and had previously been proposed only for moderate-spatial-resolution satellite imagery. The manual method visually selected the EM candidates from eight landcover types in the original image. The first semi-automatic method chose the EM candidates using a threshold over the pixel purity index (PPI) map. The second semi-automatic method used the triangular shape of the HI scatter plot in the n-dimensional visualizer to identify the V-I-S (vegetation-impervious surface-soil) EM candidates: the pixels located at the triangle vertices. The initial EM candidates from the three methods were further refined using three indexes (EM average RMSE, minimum average spectral angle, and count-based EM selection), generating three spectral libraries that were used to classify the test image with the spectral angle mapper. Accuracy reports for the classification results were generated: the overall accuracies are 85% for the manual method, 81% for the PPI method, and 87% for the V-I-S method. The V-I-S EM selection method thus performed best in this study, demonstrating its value not only for moderate-spatial-resolution satellite images but also for increasingly accessible high-spatial-resolution airborne images. This semi-automatic EM selection method can be adopted for a wide range of remote sensing images and can provide ISA maps for hydrological analysis.
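The classification step, the spectral angle mapper, is compact enough to sketch: given a V-I-S endmember library, each pixel takes the class of the endmember with the smallest spectral angle. Band counts and spectra below are synthetic placeholders.

```python
# Sketch: spectral angle mapper (SAM) classification against V-I-S endmembers.
import numpy as np

def spectral_angle(pixels, endmembers):
    """Angles (radians) between each pixel spectrum and each endmember."""
    p = pixels / np.linalg.norm(pixels, axis=-1, keepdims=True)
    e = endmembers / np.linalg.norm(endmembers, axis=-1, keepdims=True)
    return np.arccos(np.clip(p @ e.T, -1.0, 1.0))

n_bands = 120
library = np.random.rand(3, n_bands)            # vegetation, impervious, soil
image = np.random.rand(200, 200, n_bands)

angles = spectral_angle(image.reshape(-1, n_bands), library)
classes = angles.argmin(axis=1).reshape(200, 200)   # 0=V, 1=I, 2=S
isa_mask = classes == 1                             # impervious surface map
print("ISA fraction:", float(isa_mask.mean()))
```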
Managing multiple image stacks from confocal laser scanning microscopy
NASA Astrophysics Data System (ADS)
Zerbe, Joerg; Goetze, Christian H.; Zuschratter, Werner
1999-05-01
A major goal in neuroanatomy is to obtain precise information about the functional organization of neuronal assemblies and their interconnections. The analysis of histological sections therefore frequently requires high-resolution images in combination with an overview of the structure. To overcome this conflict, we previously introduced software for the automatic acquisition of multiple image stacks (3D-MISA) in confocal laser scanning microscopy. Here, we describe a Windows NT-based software package for fast and easy navigation through multiple image stacks (the MIS-browser), visualization of individual channels and layers, and selection of user-defined subregions. In addition, the MIS-browser provides useful tools for the visualization and evaluation of the data volume, such as brightness and contrast corrections of individual layers and channels. Moreover, it includes a maximum intensity projection, panning and zoom in/out functions within selected channels or focal planes (x/y), and tracking along the z-axis. The import module accepts any tiff format and reconstructs the original image arrangement after the user has defined the sequence of images in x/y and z and the number of channels. The export module allows storage of user-defined subregions (as new single image stacks) for further 3D reconstruction and evaluation.
Burgmans, Mark Christiaan; den Harder, J Michiel; Meershoek, Philippa; van den Berg, Nynke S; Chan, Shaun Xavier Ju Min; van Leeuwen, Fijs W B; van Erkel, Arian R
2017-06-01
To determine the accuracy of automatic and manual co-registration methods for image fusion of three-dimensional computed tomography (CT) with real-time ultrasonography (US) for image-guided liver interventions. CT images of a skills phantom with liver lesions were acquired and co-registered to US using GE Logiq E9 navigation software. Manual co-registration was compared to automatic and semiautomatic co-registration using an active tracker. Also, manual point registration was compared to plane registration with and without an additional translation point. Finally, comparison was made between manual and automatic selection of reference points. In each experiment, accuracy of the co-registration method was determined by measurement of the residual displacement in phantom lesions by two independent observers. Mean displacements for a superficial and deep liver lesion were comparable after manual and semiautomatic co-registration: 2.4 and 2.0 mm versus 2.0 and 2.5 mm, respectively. Both methods were significantly better than automatic co-registration: 5.9 and 5.2 mm residual displacement (p < 0.001; p < 0.01). The accuracy of manual point registration was higher than that of plane registration, the latter being heavily dependent on accurate matching of axial CT and US images by the operator. Automatic reference point selection resulted in significantly lower registration accuracy compared to manual point selection despite lower root-mean-square deviation (RMSD) values. The accuracy of manual and semiautomatic co-registration is better than that of automatic co-registration. For manual co-registration using a plane, choosing the correct plane orientation is an essential first step in the registration process. Automatic reference point selection based on RMSD values is error-prone.
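Manual point registration of this kind typically reduces to a least-squares rigid transform between corresponding points; the sketch below implements the standard Kabsch/SVD solution and reports the residual displacement at a target lesion. The point sets and noise level are synthetic, and this is a generic formulation, not the vendor's navigation algorithm.

```python
# Sketch: rigid point-based CT-US co-registration and residual displacement.
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

ct_points = np.random.rand(4, 3) * 100            # reference points in CT (mm)
true_R, true_t = np.eye(3), np.array([5.0, -2.0, 1.0])
us_points = ct_points @ true_R.T + true_t + np.random.randn(4, 3) * 0.5

R, t = rigid_register(ct_points, us_points)
lesion_ct = np.array([40.0, 30.0, 20.0])
residual = np.linalg.norm((R @ lesion_ct + t) - (true_R @ lesion_ct + true_t))
print(f"residual displacement at lesion: {residual:.2f} mm")
```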
Provision of an X-environment using the HEPiX-X11 scripts
NASA Astrophysics Data System (ADS)
Jones, R. W. L.; Cons, L.; Taddei, A.
1997-02-01
At CERN, we have created a user X11 environment within the HEPiX framework. Customisation is possible at the HEPiX, site, cluster, machine, group and user levels, in order of increasing priority. The management of the X11 session is divorced from the window management. FVWM is the default window manager, being light on system resources while providing most of the desired functionality. The assembly of a correctly ordered .fvwmrc is done automatically by the scripts, with customisation allowed at all of the above levels. Two tools are provided to query aspects of the environment; these may be used both at the start of the X-session and when starting any application. The first is guesskbd, a tool to identify the user's keyboard. The second provides useful information about a given display.
NASA Astrophysics Data System (ADS)
Gerasimov, A. V.; Pashkov, S. V.; Khristenko, Yu. F.
2017-10-01
Space debris formed during the launch and operation of spacecraft in circumterrestrial space, and the flows of micrometeoroids from deep space, pose a real threat to manned and automatic vehicles. Ensuring the fracture resistance of aluminum, glass and ceramic spacecraft elements is therefore an important practical task. These materials are widely used in spacecraft elements such as bodies, tanks, windows, glass in optical devices, heat shields, etc.
Automatic Condensation of Electronic Publications by Sentence Selection.
ERIC Educational Resources Information Center
Brandow, Ronald; And Others
1995-01-01
Describes a system that performs automatic summaries of news from a large commercial news service encompassing 41 different publications. This system was compared to a system that used only the lead sentences of the texts. Lead-based summaries significantly outperformed the sentence-selection summaries. (AEF)
MRI-Guided Selection of Patients for Acute Ischemic Stroke Treatment
Leigh, Richard; Krakauer, John W.
2014-01-01
Purpose of review To summarize what is known about the use of MRI in acute stroke treatments (predominantly thrombolysis), to examine the assumptions and theories behind the interpretation of MR images of acute stroke and how they are used to select patients for therapies, and to suggest directions for future research. Recent findings Recent studies have been contradictory about the usefulness of MRI in selecting patients for treatment. New MRI models for selecting patients have emerged that focus not only on the ischemic penumbra but also the core infarct. Fixed time-window selection parameters are being replaced by individualized MRI features. New ways to interpret traditional MRI sequences are emerging. Summary Although the efficacy of acute stroke treatment is time dependent, the use of fixed time-windows does not account for individual differences in infarct evolution, which could be detected with MRI. While MRI shows promise for identifying patients who should be treated, as well as exclude patients who should not be treated, definitive evidence is still lacking. Future research should focus on validating the use of MRI to select patients for IV therapies in extended time windows. PMID:24978637
Improvements to the modal holographic wavefront sensor.
Kong, Fanpeng; Lambert, Andrew
2016-05-01
The Zernike coefficients of a light wavefront can be calculated directly from the intensity ratios of pairs of spots in the reconstructed image plane of a holographic wavefront sensor (HWFS). However, the response curve of the HWFS depends heavily on the position and size of the detector for each spot and on the distortions introduced by other aberrations. In this paper, we propose a method to measure the intensity of each spot by setting a threshold to select effective pixels and using the weighted average intensity within a selected window. Compared with using the integrated intensity over a small window for each spot, we show through numerical simulation that the proposed method reduces the dependency of the HWFS's response curve on the selection of the detector window. We also recorded a HWFS on a holographic plate using a blue laser and demonstrated its capability to detect the strength of encoded Zernike terms in an aberrated beam.
Thalagala, N
2015-11-01
The normative age ranges during which cohorts of children achieve milestones are called windows of achievement. The patterns of these windows of achievement are known to be both genetically and environmentally dependent. This study aimed to determine the windows of achievement for motor, socio-emotional, language and cognitive development milestones for infants and toddlers in Sri Lanka. A set of 293 milestones identified through a literature review was subjected to content validation using parent and expert reviews, which resulted in the selection of a revised set of 277 milestones. Thereafter, a sample of 1036 children aged 2 months to 30 months was examined to determine whether or not they had attained the selected milestones. Percentile ages for attaining each milestone were determined using a rearranged closed-form equation derived from logistic regression. The parameters required for the calculations were obtained by logistic regression of milestone attainment status against the children's ages. These percentile ages were used to define the respective windows of achievement. A set of 178 robust indicators representing motor, socio-emotional, language and cognitive development skills, and their windows of achievement for 2 to 24 months of age, was determined. Windows of achievement for six gross motor milestones determined in the study were shown to closely overlap a similar set of windows of achievement published by the World Health Organization, indicating the validity of these findings. A methodology combining content validation based on qualitative techniques with age validation based on regression modelling was found to be effective for determining age percentiles for attaining milestones and the respective windows of achievement. © 2015 John Wiley & Sons Ltd.
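The closed-form step can be written directly: once attainment is modeled as a logistic function of age, any percentile age follows by inverting the curve. The coefficients below are hypothetical, chosen only to produce a plausible-looking window.

```python
# Sketch: invert a fitted logistic curve to get percentile ages.
import numpy as np

def percentile_age(b0, b1, p):
    """Age at which a proportion p of children have attained the milestone,
    given P(attained | age) = 1 / (1 + exp(-(b0 + b1 * age)))."""
    return (np.log(p / (1 - p)) - b0) / b1

b0, b1 = -12.0, 1.5     # hypothetical logistic-regression coefficients
lo, hi = percentile_age(b0, b1, 0.05), percentile_age(b0, b1, 0.95)
print(f"window of achievement: {lo:.1f} to {hi:.1f} months")
```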
Intensity-based segmentation and visualization of cells in 3D microscopic images using the GPU
NASA Astrophysics Data System (ADS)
Kang, Mi-Sun; Lee, Jeong-Eom; Jeon, Woong-ki; Choi, Heung-Kook; Kim, Myoung-Hee
2013-02-01
3D microscopy images contain an enormous amount of data, rendering 3D microscopy image processing time-consuming and laborious on a central processing unit (CPU). To mitigate this, users often crop a region of interest (ROI) of the input image to a small size. Although this reduces cost and time, it has drawbacks at the image-processing level: the selected ROI strongly depends on the user, and original image information is lost. To address these problems, we developed a 3D microscopy image-processing tool on a graphics processing unit (GPU). Our tool provides efficient and varied automatic thresholding methods to achieve intensity-based segmentation of 3D microscopy images, and users can select the algorithm to be applied. Further, the tool provides visualization of the segmented volume data and can set the scale, translation, etc. using a keyboard and mouse. However, the 3D objects visualized this way still need to be analyzed to obtain information useful to biologists, which requires quantitative data from the images. Therefore, we label the segmented 3D objects within all 3D microscopy images and obtain quantitative information on each labeled object, which can be used for classification. A user can select the object to be analyzed; our tool displays the selected object in a new window so that more details can be observed. Finally, we validate the effectiveness of our tool by comparing CPU and GPU processing times under matched specifications and configurations.
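The CPU reference pipeline the tool accelerates can be sketched with standard scientific-Python calls, assuming scikit-image and SciPy: automatic (Otsu) thresholding, 3D connected-component labeling, and per-object quantification. The volume is a synthetic placeholder.

```python
# Sketch of the CPU pipeline: Otsu thresholding, 3D labeling, quantification.
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

volume = np.random.rand(64, 256, 256) * 0.3        # z, y, x background noise
volume[20:30, 100:120, 100:120] += 2.0             # one bright synthetic "cell"

mask = volume > threshold_otsu(volume)             # intensity-based segmentation
labels, n_objects = ndimage.label(mask)            # 3D connected components

# Quantitative data per labeled object: voxel counts and centroids.
sizes = ndimage.sum_labels(mask, labels, index=range(1, n_objects + 1))
centroids = ndimage.center_of_mass(mask, labels, range(1, n_objects + 1))
print(n_objects, "object(s); largest:", int(max(sizes)), "voxels")
```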
Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H
Process consistency in planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with the goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that would offer an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics have been introduced into our clinical workflow. We compared the delinquency rates for 4 selected metrics before implementation of each metric with the corresponding rates in 2016. A one-tailed Student t test was used for statistical analysis. Results: With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and the late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completion of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distractions, interruptions, and rushed completion of critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey
2017-02-01
Background: Raman spectroscopy is a non-invasive optical technique which can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) has demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most of the previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, either in the fingerprint or high wavenumber regions. Our objective in this presentation is to explore wavenumber selection based analysis in Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably-sized wavenumber windows, which were determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or the least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions were included. Lesion measurements were divided into a training cohort (n = 518) and a testing cohort (n = 127) according to the measurement time. Results: The area under the receiver operating characteristic curve (ROC) improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for sensitivity levels of 0.99-0.90 increased respectively from 0.17-0.65 to 0.20-0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.
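A rough sketch of the correlation-driven windowing idea: adjacent wavenumbers are grouped into a variably-sized window for as long as neighbouring spectral variables stay strongly correlated. The correlation cutoff, the surrogate data, and the omission of the subsequent stepwise/LASSO window selection are all simplifying assumptions, not details from the paper:

```python
import numpy as np

def correlation_windows(spectra, r_min=0.9):
    """Group adjacent wavenumber columns into variably-sized windows.

    spectra : (n_samples, n_wavenumbers) array of Raman intensities.
    A new window starts whenever the correlation between neighbouring
    columns drops below r_min.
    """
    n = spectra.shape[1]
    windows, start = [], 0
    for j in range(1, n):
        r = np.corrcoef(spectra[:, j - 1], spectra[:, j])[0, 1]
        if r < r_min:                       # correlation broken -> close current window
            windows.append((start, j))
            start = j
    windows.append((start, n))
    return windows

rng = np.random.default_rng(0)
blocks = rng.normal(size=(50, 10))                      # 10 independent spectral "bands"
spectra = np.repeat(blocks, 30, axis=1) + 0.05 * rng.normal(size=(50, 300))
print(len(correlation_windows(spectra)), "wavenumber windows found")
```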
Tutorial Guide: Computer-Aided Structural Modeling (CASM). Version 5.00
1994-04-01
SITE-SPECIFIC DATA DIALOG WINDOW ... 2-4. SAVING PROJECT DATA ... 2-7. PRINTING PROJECT CRITERIA DATA ... ...you to the main CASM screen without saving changes. REGIONAL DATA DIALOG WINDOW: The Regional Data dialog window contains ... information so that it will be included in your hardcopy output. 3. Select OK to save your Regional Data entries. The Regional Data dialog window will disappear.
Illusory displacement of equiluminous kinetic edges.
Ramachandran, V S; Anstis, S M
1990-01-01
A stationary window was cut out of a stationary random-dot pattern. When a field of dots was moved continuously behind the window (a) the window appeared to move in the same direction even though it was stationary, (b) the position of the 'kinetic edges' defining the window was also displaced along the direction of dot motion, and (c) the edges of the window tended to fade on steady fixation even though the dots were still clearly visible. The illusory displacement was enhanced considerably if the kinetic edge was equiluminous and if the 'window' region was seen as 'figure' rather than 'ground'. Since the extraction of kinetic edges probably involves the use of direction-selective cells, the illusion may provide insights into how the visual system uses the output of these cells to localize the kinetic edges.
Multi-Window Controllers for Autonomous Space Systems
NASA Technical Reports Server (NTRS)
Lurie, B. J.; Hadaegh, F. Y.
1997-01-01
Multi-window controllers select between elementary linear controllers using nonlinear windows based on the amplitude and frequency content of the feedback error. The controllers are relatively simple to implement and perform much better than linear controllers. The commanders for such controllers only order the destination point and are freed from generating the command time-profiles. Robotic missions rely heavily on the tasks of acquisition and tracking. For autonomous and optimal control of the spacecraft, the control bandwidth must be larger while the feedback can (and, therefore, must) be reduced. Combining linear compensators via a multi-window nonlinear summer guarantees the minimum-phase character of the combined transfer function. It is shown that the solution may require using several parallel branches and windows. Several examples of multi-window nonlinear controller applications are presented.
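A drastically simplified, amplitude-only illustration of the windowing idea: two linear control laws are blended by a nonlinear window on the feedback-error amplitude. The gains, window thresholds, and toy plant are invented for illustration; the paper's controllers also window on the frequency content of the error and address the minimum-phase property of the combination, which this sketch does not:

```python
import numpy as np

def window_weight(error, low, high):
    """Smooth 0-1 weight that switches a controller in over an error-amplitude band."""
    return float(np.clip((abs(error) - low) / (high - low), 0.0, 1.0))

def multi_window_control(error, integ):
    """Blend a gentle and an aggressive linear controller by error amplitude."""
    u_small = 2.0 * error + 0.5 * integ           # low-gain PI branch for fine tracking
    u_large = 20.0 * error                        # high-gain P branch for acquisition
    w = window_weight(error, low=0.05, high=0.5)  # nonlinear window on the feedback error
    return (1.0 - w) * u_small + w * u_large

# Toy first-order plant x' = -x + u driven toward a commanded destination point.
x, integ, setpoint, dt = 0.0, 0.0, 1.0, 0.01
for _ in range(2000):
    e = setpoint - x
    integ += e * dt
    x += dt * (-x + multi_window_control(e, integ))
print("final tracking error:", round(setpoint - x, 4))
```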
Atmospheric Science Data Center
2016-04-29
ASDC Data Pool Notices: the Data Pool will transition from ... To use IE7 for FTP sessions: (a) select "View", then "Open FTP site in Windows Explorer", or (b) open Windows Explorer and enter the URL of the FTP site in the address bar.
Swiderska, Zaneta; Korzynska, Anna; Markiewicz, Tomasz; Lorent, Malgorzata; Zak, Jakub; Wesolowska, Anna; Roszkowiak, Lukasz; Slodkowska, Janina; Grala, Bartlomiej
2015-01-01
Background. This paper presents a study of hot-spot selection in the assessment of whole slide images of tissue sections collected from meningioma patients. The samples were immunohistochemically stained to determine the Ki-67/MIB-1 proliferation index used for prognosis and treatment planning. Objective. Observer performance was examined by comparing results of the proposed method of automatic hot-spot selection in whole slide images, results of traditional scoring under a microscope, and results of a pathologist's manual hot-spot selection. Methods. The results of scoring the Ki-67 index using optical scoring under a microscope, software for Ki-67 index quantification based on hot spots selected by two pathologists (respectively once and three times), and the same software applied to hot spots selected by the proposed automatic method were compared using Kendall's tau-b statistics. Results. The results show intra- and interobserver agreement. The agreement between Ki-67 scoring with manual and automatic hot-spot selection is high, while agreement between Ki-67 index scoring results in whole slide images and traditional microscopic examination is lower. Conclusions. The agreement observed for the three scoring methods shows that automation of area selection is an effective tool in supporting physicians and in increasing the reliability of Ki-67 scoring in meningioma.
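The agreement statistic used above can be computed directly with SciPy, which returns the tau-b variant when ties are present; the Ki-67 scores below are made up purely to show the call:

```python
from scipy.stats import kendalltau

# Hypothetical Ki-67 index scores (%) for the same cases under two scoring methods.
manual_hotspot = [3.1, 7.5, 12.0, 2.2, 9.8, 15.3, 4.4]
auto_hotspot = [2.9, 8.0, 11.4, 2.5, 10.2, 14.8, 5.0]

tau, p_value = kendalltau(manual_hotspot, auto_hotspot)
print(f"Kendall tau-b = {tau:.2f}, p = {p_value:.3f}")
```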
Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger
2017-01-01
Background: Assessment of the health benefits associated with physical activity depends on the activity duration, intensity and frequency, so their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether identification performance is improved when using two accelerometers at the hip and ankle, compared to using only one at either position. Methods: The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was used to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: logistic regression, support vector machines, random forest and extremely randomized trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. Results: The reported jogging periods were verified by visual inspection and used as the gold standard. After feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods related to the total time including the missed ones, was up to 0.875. It could be further improved up to 0.967 by application of post-classification rules, which considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be achieved from either accelerometer position. Conclusions: Machine learning techniques can be used for automatic activity recognition, as they provide very accurate activity recognition, significantly more accurate than keeping a diary. Identification of jogging periods in adolescents can be performed using only one accelerometer. Performance-wise there is no significant benefit from using accelerometers at both locations. PMID:28880923
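A compact sketch of the pipeline described above: sliding-window feature extraction from an acceleration-magnitude signal followed by a random forest classifier. The window length, the handful of hand-written features, and the synthetic data stand in for the study's automated feature engineering and field recordings:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def window_features(signal, labels, win=60, step=60):
    """Cut a 1 Hz acceleration-magnitude signal into windows and extract simple features."""
    X, y = [], []
    for s in range(0, len(signal) - win + 1, step):
        seg = signal[s:s + win]
        X.append([seg.mean(), seg.std(), seg.max(), np.abs(np.diff(seg)).mean()])
        y.append(int(labels[s:s + win].mean() > 0.5))   # window label = majority activity
    return np.array(X), np.array(y)

rng = np.random.default_rng(1)
rest = rng.normal(1.0, 0.05, 3600)          # ~1 g magnitude at rest
jog = rng.normal(1.8, 0.40, 1800)           # higher, more variable magnitude while jogging
signal = np.concatenate([rest, jog])
labels = np.concatenate([np.zeros_like(rest), np.ones_like(jog)])

X, y = window_features(signal, labels)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print("window classification accuracy:", clf.score(Xte, yte))
```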
Statistical optimisation techniques in fatigue signal editing problem
NASA Astrophysics Data System (ADS)
Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.
2015-02-01
Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small-amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
Automatic extraction of tree crowns from aerial imagery in urban environment
NASA Astrophysics Data System (ADS)
Liu, Jiahang; Li, Deren; Qin, Xunwen; Yang, Jianfeng
2006-10-01
Traditionally, field-based investigation has been the main method of surveying greenbelt in urban environments, which is costly and has a low updating frequency. In high resolution imagery, the structure and texture of tree canopy show great statistical similarity despite large differences in canopy configuration, and the surface structure and texture of tree crowns are very different from those of other land-cover types. In this paper, we present an automatic method to detect tree crowns in high resolution imagery of urban environments without any a priori knowledge. Our method captures the distinctive structure and texture of the tree crown surface: it uses the variance and mathematical expectation of a defined image window to coarsely position candidate canopy blocks, and then analyses their inner structure and texture to refine these candidates. The possible ranges of all feature parameters used in our method are generated automatically from a small number of samples, and HOLEs and their distribution are introduced as an important characteristic in the refinement processing. The isotropy of the candidate image blocks and of the hole distribution is also integrated into our method. After introducing the theory of our method, aerial imagery (with a resolution of about 0.3 m) was used to test it, and the results indicate that our method is an effective approach to automatically detecting tree crowns in urban environments.
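A simplified sketch of the coarse positioning step described above: the image is scanned with a fixed window, and windows whose mean and variance fall inside pre-learned ranges are kept as candidate crown blocks. The window size and statistic ranges are invented for illustration, and the paper's subsequent texture and HOLE-based refinement is omitted:

```python
import numpy as np

def candidate_windows(image, win=32, mean_range=(0.35, 0.65), var_range=(0.05, 0.12)):
    """Return top-left corners of windows whose first- and second-order
    statistics match those expected for tree-crown texture."""
    hits = []
    for r in range(0, image.shape[0] - win + 1, win // 2):
        for c in range(0, image.shape[1] - win + 1, win // 2):
            patch = image[r:r + win, c:c + win]
            m, v = patch.mean(), patch.var()
            if mean_range[0] <= m <= mean_range[1] and var_range[0] <= v <= var_range[1]:
                hits.append((r, c))
    return hits

rng = np.random.default_rng(2)
image = rng.uniform(size=(256, 256))        # stand-in for one band of a 0.3 m aerial image
print(len(candidate_windows(image)), "candidate crown windows")
```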
Temporally flexible feedback signal to foveal cortex for peripheral object recognition
Fan, Xiaoxu; Wang, Lan; Shao, Hanyu; Kersten, Daniel; He, Sheng
2016-01-01
Recent studies have shown that information from peripherally presented images is present in the human foveal retinotopic cortex, presumably because of feedback signals. We investigated this potential feedback signal by presenting noise at the fovea at different object–noise stimulus onset asynchronies (SOAs) while subjects performed a discrimination task on peripheral objects. Results revealed a selective impairment of performance when foveal noise was presented at 250-ms SOA, but only for tasks that required comparing objects’ spatial details, suggesting a task- and stimulus-dependent foveal processing mechanism. Critically, the temporal window of foveal processing was shifted when mental rotation was required for the peripheral objects, indicating that foveal retinotopic processing is not automatically engaged at a fixed time following peripheral stimulation; rather, it occurs at a stage when detailed information is required. Moreover, fMRI measurements using multivoxel pattern analysis showed that both image- and object-category-relevant information about peripheral objects was represented in the foveal cortex. Taken together, our results support the hypothesis of a temporally flexible feedback signal to the foveal retinotopic cortex when discriminating objects in the visual periphery. PMID:27671651
Flash memory management system and method utilizing multiple block list windows
NASA Technical Reports Server (NTRS)
Chow, James (Inventor); Gender, Thomas K. (Inventor)
2005-01-01
The present invention provides a flash memory management system and method with increased performance. The flash memory management system provides the ability to efficiently manage and allocate flash memory use in a way that improves reliability and longevity, while maintaining good performance levels. The flash memory management system includes a free block mechanism, a disk maintenance mechanism, and a bad block detection mechanism. The free block mechanism provides efficient sorting of free blocks to facilitate selecting low use blocks for writing. The disk maintenance mechanism provides for the ability to efficiently clean flash memory blocks during processor idle times. The bad block detection mechanism provides the ability to better detect when a block of flash memory is likely to go bad. The flash status mechanism stores information in fast access memory that describes the content and status of the data in the flash disk. The new bank detection mechanism provides the ability to automatically detect when new banks of flash memory are added to the system. Together, these mechanisms provide a flash memory management system that can improve the operational efficiency of systems that utilize flash memory.
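A toy sketch of the free-block mechanism described above: free blocks are kept sorted by erase count so that the least-worn block is always selected for the next write. The data structure and counts are assumptions for illustration, not the patented implementation:

```python
import heapq

class FreeBlockPool:
    """Min-heap of (erase_count, block_id); pop always returns the least-worn free block."""
    def __init__(self, block_ids):
        self.heap = [(0, b) for b in block_ids]   # new blocks start with zero erases
        heapq.heapify(self.heap)

    def allocate(self):
        erases, block = heapq.heappop(self.heap)  # low-use block chosen for writing
        return block, erases

    def release(self, block, erases):
        # Block returns to the pool after cleaning, with its erase count bumped.
        heapq.heappush(self.heap, (erases + 1, block))

pool = FreeBlockPool(range(8))
block, erases = pool.allocate()
pool.release(block, erases)
print("allocated block", block, "with", erases, "prior erases")
```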
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OS X. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
Select Components and Finish System Design of a Window Air Conditioner with Propane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Bo; Abdelaziz, Omar
This report describes the technical targets for developing a high efficiency window air conditioner (WAC) using propane (R-290). The baseline unit selected for this activity is a GE R-410A WAC. We established collaboration with a Chinese rotary compressor manufacturer to select an R-290 compressor. We first modelled and calibrated the WAC system model using R-410A. Next, we applied the calibrated system model to design the R-290 WAC, and determined strategies to reduce the system charge below 260 grams and achieve the capacity and efficiency targets.
Automatic learning-based beam angle selection for thoracic IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amit, Guy; Marshall, Andrea; Purdie, Thomas G., E-mail: tom.purdie@rmp.uhn.ca
Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner’s clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume coverage and organ-at-risk sparing and were superior to plans produced with fixed sets of common beam angles. The great majority of the automatic plans (93%) were approved as clinically acceptable by three radiation therapy specialists. Conclusions: The results demonstrated the feasibility of utilizing a learning-based approach for automatic selection of beam angles in thoracic IMRT planning. The proposed method may assist in reducing the manual planning workload, while sustaining plan quality.
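A schematic of the learning step described above: per-angle beam scores are regressed on anatomical features with a random forest, and candidate angles are then ranked by predicted score. The feature vector, score definition and data are placeholders, and the authors' interbeam-dependency optimization is omitted:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Placeholder training data: one row per (plan, candidate beam angle).
# Columns might encode target-to-OAR distances, target size, entry depth, gantry angle, ...
X_train = rng.normal(size=(2000, 6))
y_train = rng.uniform(size=2000)       # beam "score" derived from clinically chosen plans

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# For a new patient, score every candidate angle and keep the best few
# (before any interbeam-dependency adjustment, which is not sketched here).
candidate_features = rng.normal(size=(72, 6))          # e.g. 5-degree angular sampling
scores = model.predict(candidate_features)
best_angles = np.argsort(scores)[::-1][:7] * 5         # top 7 candidate angles, in degrees
print("suggested beam angles:", sorted(best_angles.tolist()))
```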
Window Selection Tool | Efficient Windows Collaborative
14 CFR 1214.117 - Launch and orbit parameters for a standard launch.
Code of Federal Regulations, 2010 CFR
2010-01-01
(a) Launch at a time, selected by NASA, from a launch window of not less than 1 hour (a more restrictive launch window may be provided as an optional service). (b) For shared flights from KSC to the standard...
SOSPEX, an interactive tool to explore SOFIA spectral cubes
NASA Astrophysics Data System (ADS)
Fadda, Dario; Chambers, Edward T.
2018-01-01
We present SOSPEX (SOFIA SPectral EXplorer), an interactive tool to visualize and analyze spectral cubes obtained with the FIFI-LS and GREAT instruments onboard the SOFIA Infrared Observatory. This software package is written in Python 3 and is available either through GitHub or Anaconda. Through this GUI it is possible to explore directly the spectral cubes produced by the SOFIA pipeline and archived in the SOFIA Science Archive. Spectral cubes are visualized showing their spatial and spectral dimensions in two different windows. By selecting a part of the spectrum, the flux from the corresponding slice of the cube is visualized in the spatial window. On the other hand, it is possible to define apertures in the spatial window to show the corresponding spectral energy distribution in the spectral window. Flux isocontours can be overlaid on external images in the spatial window, while line names, atmospheric transmission, or external spectra can be overplotted on the spectral window. Atmospheric models with specific parameters can be retrieved, compared to the spectra, and applied to the uncorrected FIFI-LS cubes in cases where the standard values give unsatisfactory results. Subcubes can be selected and saved as FITS files by cropping or cutting the original cubes. Lines and continuum can be fitted in the spectral window, saving the results in JSON files which can be reloaded later. Finally, in the case of spatially extended observations, it is possible to compute spectral moments as a function of position to obtain velocity dispersion maps or velocity diagrams.
Radiometer Calibration and Characterization (RCC) User's Manual: Windows Version 4.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreas, Afshin M.; Wilcox, Stephen M.
2016-02-29
The Radiometer Calibration and Characterization (RCC) software is a data acquisition and data archival system for performing Broadband Outdoor Radiometer Calibrations (BORCAL). RCC provides a unique method of calibrating broadband atmospheric longwave and solar shortwave radiometers using techniques that reduce measurement uncertainty and better characterize a radiometer's response profile. The RCC software automatically monitors and controls many of the components that contribute to uncertainty in an instrument's responsivity. This is a user's manual and guide to the RCC software.
A Trusted Path Design and Implementation for Security Enhanced Linux
2004-09-01
functionality by a member of the team? Witten, et al. [21] provide an excellent discussion of some aspects of the subject. Ultimately, open vs. ... A terminal window is a program like gnome-terminal that provides a TTY-like environment as a window inside an X Windows session. The phrase computer ... Editors selected; No sound or video; No graphics; Check all development boxes except KDE; Administrative tools; System tools; No printing support
Tailored emails prompt electric vehicle owners to engage with tariff switching information
NASA Astrophysics Data System (ADS)
Nicolson, Moira; Huebner, Gesche M.; Shipworth, David; Elam, Simon
2017-06-01
The carbon intensity of the electricity used to charge an electric vehicle (EV) is dependent on when in the day charging occurs. However, persuading EV owners to adopt incentives to charge during off-peak hours is challenging. Here we show that governments could exploit the 'window of opportunity' created when people purchase their first EV to promote time-of-use tariffs. Email recipients (n = 7,038 EV owners) were more likely to click through to an information webpage when the email emphasized specific reductions in home-charging costs versus general bill savings. However, the 'window of opportunity' for maximizing potential adoption is short; email open rates declined from over 70% immediately after purchase to 40% for recipients owning their EV for over three months. These results demonstrate the potential of prompts to change behaviours for which opt-out enrolment (where enrolment is automatic unless people explicitly opt out) would be unethical or less effective.
Non-numeric computation for high eccentricity orbits. [Earth satellite orbit perturbation
NASA Technical Reports Server (NTRS)
Sridharan, R.; Renard, M. L.
1975-01-01
Geocentric orbits of large eccentricity (e = 0.9 to 0.95) are significantly perturbed in cislunar space by the sun and moon. The time-history of the height of perigee subsequent to launch is particularly critical. The determination of 'launch windows' is mostly concerned with preventing the height of perigee from falling below its low initial value before the mission lifetime has elapsed. Between the extremes of high-accuracy digital integration of the equations of motion and of using an approximate, but very fast, stability-criteria method, this paper is concerned with the development of a method of intermediate complexity using non-numeric computation. The computer is used as the theory generator to generalize Lidov's theory using six osculating elements. Symbolic integration is completely automated, and the output is a set of condensed formulae well suited for repeated application in launch window analysis. Examples of applications are given.
ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis
NASA Technical Reports Server (NTRS)
Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.
2006-01-01
Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
Multiresolution forecasting for futures trading using wavelet decompositions.
Zhang, B L; Coggins, R; Jabri, M A; Dersch, D; Flower, B
2001-01-01
We investigate the effectiveness of a financial time-series forecasting strategy which exploits the multiresolution property of the wavelet transform. A financial series is decomposed into an overcomplete, shift-invariant, scale-related representation. In transform space, each individual wavelet series is modeled by a separate multilayer perceptron (MLP). We apply the Bayesian method of automatic relevance determination to choose short past windows (short-term history) for the inputs to the MLPs at lower scales and long past windows (long-term history) at higher scales. To form the overall forecast, the individual forecasts are then recombined by the linear reconstruction property of the inverse transform with the chosen autocorrelation shell representation, or by another perceptron which learns the weight of each scale in the prediction of the original time series. The forecast results are then passed to a money management system to generate trades.
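A minimal sketch of the scheme described above, under several simplifying assumptions: an additive à trous-style moving-average decomposition stands in for the autocorrelation shell wavelet representation, fixed per-scale lags replace the ARD-chosen windows, the data are synthetic, and the per-scale one-step forecasts are simply summed via the additive reconstruction property:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def a_trous(series, levels=3):
    """Additive multiresolution decomposition: series == sum(details) + final smooth."""
    bands, smooth = [], series.astype(float)
    for j in range(levels):
        width = 2 ** (j + 1)
        smoother = np.convolve(smooth, np.ones(width) / width, mode="same")
        bands.append(smooth - smoother)          # detail band at scale j
        smooth = smoother
    bands.append(smooth)                         # low-frequency residue
    return bands

def lagged(series, n_lags):
    """Past-window inputs and next-value targets for one band."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    return X, series[n_lags:]

rng = np.random.default_rng(4)
t = np.arange(512)
price = np.cumsum(rng.normal(size=512)) + 5 * np.sin(t / 20.0)   # surrogate futures series

bands = a_trous(price, levels=3)
lags = [4, 8, 16, 32]      # short history at fine scales, long history at coarse scales
forecast = 0.0
for band, n_lags in zip(bands, lags):
    X, y = lagged(band, n_lags)
    mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0).fit(X, y)
    forecast += mlp.predict(band[-n_lags:].reshape(1, -1))[0]    # one-step forecast per band

print("next-step forecast (sum of per-band forecasts):", round(float(forecast), 3))
```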
Martin, Corinna; Jablonka, Sibylle
2018-01-01
Local and spontaneous calcium signals play important roles in neurons and neuronal networks. Spontaneous or cell-autonomous calcium signals may be difficult to assess because they appear in an unpredictable spatiotemporal pattern and in very small neuronal loci of axons or dendrites. We developed an open source bioinformatics tool for an unbiased assessment of calcium signals in x,y-t imaging series. The tool bases its algorithm on a continuous wavelet transform-guided peak detection to identify calcium signal candidates. The highly sensitive calcium event definition is based on identification of peaks in 1D data through analysis of a 2D wavelet transform surface. For spatial analysis, the tool uses a grid to separate the x,y-image field in independently analyzed grid windows. A document containing a graphical summary of the data is automatically created and displays the loci of activity for a wide range of signal intensities. Furthermore, the number of activity events is summed up to create an estimated total activity value, which can be used to compare different experimental situations, such as calcium activity before or after an experimental treatment. All traces and data of active loci become documented. The tool can also compute the signal variance in a sliding window to visualize activity-dependent signal fluctuations. We applied the calcium signal detector to monitor activity states of cultured mouse neurons. Our data show that both the total activity value and the variance area created by a sliding window can distinguish experimental manipulations of neuronal activity states. Notably, the tool is powerful enough to compute local calcium events and ‘signal-close-to-noise’ activity in small loci of distal neurites of neurons, which remain during pharmacological blockade of neuronal activity with inhibitors such as tetrodotoxin, to block action potential firing, or inhibitors of ionotropic glutamate receptors. The tool can also offer information about local homeostatic calcium activity events in neurites. PMID:29601577
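The wavelet-guided peak detection underlying the tool can be illustrated with SciPy's continuous-wavelet ridge detector. The synthetic fluorescence trace, sampling rate and width range below are assumptions chosen only to show the flavour of the approach, not the tool's own event definition:

```python
import numpy as np
from scipy.signal import find_peaks_cwt

rng = np.random.default_rng(5)
t = np.arange(2000) / 100.0                      # 20 s of imaging at 100 Hz (illustrative)
trace = rng.normal(0, 0.05, t.size)              # baseline fluorescence noise
for onset in (300, 800, 1500):                   # three calcium-transient-like events
    trace[onset:] += 0.8 * np.exp(-np.arange(t.size - onset) / 50.0)

# Ridge lines across a range of wavelet widths flag candidate calcium events.
event_indices = find_peaks_cwt(trace, widths=np.arange(5, 60))
print("candidate event times (s):", np.round(t[event_indices], 2))
```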
Toward realistic radiofrequency ablation of hepatic tumors 3D simulation and planning
NASA Astrophysics Data System (ADS)
Villard, Caroline; Soler, Luc; Gangi, Afshin; Mutter, Didier; Marescaux, Jacques
2004-05-01
Radiofrequency ablation (RFA) has become an increasingly used technique in the treatment of patients with unresectable hepatic tumors. Evaluation of vascular architecture, post-RFA tissue necrosis prediction, and the choice of a suitable needle placement strategy using conventional radiological techniques remain difficult. In an attempt to enhance the safety of RFA, a 3D simulator and treatment planning tool that simulates the necrosis of the treated area and proposes an optimal placement for the needle has been developed. From enhanced spiral CT scans with 2 mm cuts, 3D reconstructions of patients with liver metastases are automatically generated. Virtual needles can be added to the 3D scene, together with their corresponding zones of necrosis, which are displayed as meshed spheroids representing the 60 °C isosurface. The simulator takes into account the cooling effect of local vessels greater than 3 mm in diameter, making the necrosis shapes more realistic. Using a voxel-based algorithm, RFA spheroids are deformed following the shape of the vessels, extended by an additional cooled area. This operation is performed in real time, allowing updates while the needle is adjusted. This makes it possible to observe whether the considered needle placement strategy would burn the whole cancerous zone or not. Planned needle positioning can also be generated automatically by the software to produce complete destruction of the tumor with a 1 cm margin, with maximum respect of the healthy liver and of all major extrahepatic and intrahepatic structures to be avoided. If desired, the radiologist can select an insertion window for the needle on the skin, focusing the search for the trajectory.
Fu, Hai-Yan; Guo, Jun-Wei; Yu, Yong-Jie; Li, He-Dong; Cui, Hua-Peng; Liu, Ping-Ping; Wang, Bing; Wang, Sheng; Lu, Peng
2016-06-24
Peak detection is a critical step in chromatographic data analysis. In the present work, we developed a multi-scale Gaussian smoothing-based strategy for accurate peak extraction. The strategy consisted of three stages: background drift correction, peak detection, and peak filtration. Background drift correction was implemented using a moving window strategy. The new peak detection method is a variant of the system used by the well-known MassSpecWavelet, i.e., chromatographic peaks are found at local maximum values under various smoothing window scales. Therefore, peaks can be detected through the ridge lines of maximum values under these window scales, and signals that are monotonously increased/decreased around the peak position could be treated as part of the peak. Instrumental noise was estimated after peak elimination, and a peak filtration strategy was performed to remove peaks with signal-to-noise ratios smaller than 3. The performance of our method was evaluated using two complex datasets. These datasets include essential oil samples for quality control obtained from gas chromatography and tobacco plant samples for metabolic profiling analysis obtained from gas chromatography coupled with mass spectrometry. Results confirmed the reasonability of the developed method. Copyright © 2016 Elsevier B.V. All rights reserved.
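A minimal sketch of the multi-scale smoothing idea described above: local maxima of the chromatogram are found under several Gaussian smoothing widths, positions that persist as a ridge across scales are kept, and peaks below a signal-to-noise ratio of 3 are discarded. The smoothing scales, ridge tolerance, background window and synthetic chromatogram are illustrative assumptions rather than the authors' parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import argrelmax

rng = np.random.default_rng(6)
x = np.arange(3000)
signal = rng.normal(0, 1.0, x.size) + 0.01 * x          # noise plus slow background drift
for pos, height, width in ((500, 40, 8), (1200, 25, 12), (2200, 60, 10)):
    signal += height * np.exp(-0.5 * ((x - pos) / width) ** 2)

# 1) crude background-drift correction with a moving-minimum window
win = 200
background = np.array([signal[max(0, i - win):i + win].min() for i in x])
corrected = signal - background

# 2) local maxima under several Gaussian smoothing scales
scales = (2, 4, 8, 16)
maxima = [set(argrelmax(gaussian_filter1d(corrected, s))[0]) for s in scales]

# 3) keep positions that persist (within +/-5 points) across all scales -> ridge lines
peaks = [p for p in maxima[0] if all(any(abs(p - q) <= 5 for q in m) for m in maxima[1:])]

# 4) simple S/N filtration: peak height must exceed 3x the estimated noise level
noise = np.std(corrected[corrected < np.percentile(corrected, 80)])
peaks = sorted(p for p in peaks if corrected[p] > 3 * noise)
print("retained peaks at indices:", peaks)
```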
Photonic microstructures for energy-generating clear glass and net-zero energy buildings
NASA Astrophysics Data System (ADS)
Vasiliev, Mikhail; Alghamedi, Ramzy; Nur-E-Alam, Mohammad; Alameh, Kamal
2016-08-01
Transparent energy-harvesting windows are emerging as practical building-integrated photovoltaics (BIPV), capable of generating electricity while simultaneously reducing heating and cooling demands. By incorporating spectrally-selective diffraction gratings as light deflecting structures of high visible transparency into lamination interlayers and using improved spectrally-selective thin-film coatings, most of the visible solar radiation can be transmitted through the glass windows with minimum attenuation. At the same time, the ultraviolet (UV) and a part of incident solar infrared (IR) radiation energy are converted and/or deflected geometrically towards the panel edge for collection by CuInSe2 solar cells. Experimental results show power conversion efficiencies in excess of 3.04% in 10 cm × 10 cm vertically-placed clear glass panels facing direct sunlight, and up to 2.08% in 50 cm × 50 cm installation-ready framed window systems. These results confirm the emergence of a new class of solar window system ready for industrial application.
Alghamedi, Ramzy; Vasiliev, Mikhail; Nur-E-Alam, Mohammad; Alameh, Kamal
2014-10-16
All-inorganic visibly-transparent energy-harvesting clear laminated glass windows are the most practical solution to boosting building-integrated photovoltaics (BIPV) energy outputs significantly while reducing cooling- and heating-related energy consumption in buildings. By incorporating luminophore materials into lamination interlayers and using spectrally-selective thin-film coatings in conjunction with CuInSe2 solar cells, most of the visible solar radiation can be transmitted through the glass window with minimum attenuation while ultraviolet (UV) radiation is down-converted and routed together with a significant part of infrared radiation to the edges for collection by solar cells. Experimental results demonstrate a 10 cm × 10 cm vertically-placed energy-harvesting clear glass panel of transparency exceeding 60%, invisible solar energy attenuation greater than 90% and electrical power output near 30 Wp/m(2) mainly generated by infrared (IR) and UV radiations. These results open the way for the realization of large-area visibly-transparent energy-harvesting clear glass windows for BIPV systems.
Multiple-Diode-Laser Gas-Detection Spectrometer
NASA Technical Reports Server (NTRS)
Webster, Christopher R.; Beer, Reinhard; Sander, Stanley P.
1988-01-01
Small concentrations of selected gases measured automatically. Proposed multiple-laser-diode spectrometer part of system for measuring automatically concentrations of selected gases at part-per-billion level. Array of laser/photodetector pairs measure infrared absorption spectrum of atmosphere along probing laser beams. Adaptable to terrestrial uses as monitoring pollution or control of industrial processes.
Code of Federal Regulations, 2010 CFR
2010-10-01
49 CFR § 236.311 Signal control circuits, selection through track... automatic interlocking. (a) The control circuits for aspects with indications more favorable than "proceed"... (Transportation; Other Regulations Relating to Transportation; 2010-10-01.)
Automatic Text Analysis Based on Transition Phenomena of Word Occurrences
ERIC Educational Resources Information Center
Pao, Miranda Lee
1978-01-01
Describes a method of selecting index terms directly from a word frequency list, an idea originally suggested by Goffman. Results of the analysis of word frequencies of two articles seem to indicate that the automated selection of index terms from a frequency list holds some promise for automatic indexing. (Author/MBR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burgmans, Mark Christiaan, E-mail: m.c.burgmans@lumc.nl; Harder, J. Michiel den, E-mail: chiel.den.harder@gmail.com; Meershoek, Philippa, E-mail: P.Meershoek@lumc.nl
Purpose: To determine the accuracy of automatic and manual co-registration methods for image fusion of three-dimensional computed tomography (CT) with real-time ultrasonography (US) for image-guided liver interventions. Materials and Methods: CT images of a skills phantom with liver lesions were acquired and co-registered to US using GE Logiq E9 navigation software. Manual co-registration was compared to automatic and semiautomatic co-registration using an active tracker. Also, manual point registration was compared to plane registration with and without an additional translation point. Finally, comparison was made between manual and automatic selection of reference points. In each experiment, accuracy of the co-registration method was determined by measurement of the residual displacement in phantom lesions by two independent observers. Results: Mean displacements for a superficial and deep liver lesion were comparable after manual and semiautomatic co-registration: 2.4 and 2.0 mm versus 2.0 and 2.5 mm, respectively. Both methods were significantly better than automatic co-registration: 5.9 and 5.2 mm residual displacement (p < 0.001; p < 0.01). The accuracy of manual point registration was higher than that of plane registration, the latter being heavily dependent on accurate matching of axial CT and US images by the operator. Automatic reference point selection resulted in significantly lower registration accuracy compared to manual point selection despite lower root-mean-square deviation (RMSD) values. Conclusion: The accuracy of manual and semiautomatic co-registration is better than that of automatic co-registration. For manual co-registration using a plane, choosing the correct plane orientation is an essential first step in the registration process. Automatic reference point selection based on RMSD values is error-prone.
Angular selective window systems: Assessment of technical potential for energy savings
Fernandes, Luis L.; Lee, Eleanor S.; McNeil, Andrew; ...
2014-10-16
Static angular selective shading systems block direct sunlight and admit daylight within a specific range of incident solar angles. The objective of this study is to quantify their potential to reduce energy use and peak demand in commercial buildings using state-of-the-art whole-building computer simulation software that allows accurate modeling of the behavior of optically-complex fenestration systems such as angular selective systems. Three commercial systems were evaluated: a micro-perforated screen, a tubular shading structure, and an expanded metal mesh. This evaluation was performed through computer simulation for multiple climates (Chicago, Illinois and Houston, Texas), window-to-wall ratios (0.15-0.60), building codes (ASHRAE 90.1-2004 and 2010) and lighting control configurations (with and without). The modeling of the optical complexity of the systems took advantage of the development of state-of-the-art versions of the EnergyPlus, Radiance and Window simulation tools. Results show significant reductions in perimeter zone energy use; the best system reached 28% and 47% savings, respectively without and with daylighting controls (ASHRAE 90.1-2004, south facade, Chicago, WWR = 0.45). Angular selectivity and thermal conductance of the angle-selective layer, as well as spectral selectivity of low-emissivity coatings, were identified as factors with significant impact on performance.
ECG artifact cancellation in surface EMG signals by fractional order calculus application.
Miljković, Nadica; Popović, Nenad; Djordjević, Olivera; Konstantinović, Ljubica; Šekara, Tomislav B
2017-03-01
New aspects of automatic electrocardiography artifact removal from surface electromyography signals by application of fractional order calculus in combination with linear and nonlinear moving window filters are explored. Surface electromyography recordings of skeletal trunk muscles are commonly contaminated with spike-shaped artifacts. This artifact originates from electrical heart activity, recorded by electrocardiography, which is commonly present in surface electromyography signals recorded in the proximity of the heart. For appropriate assessment of neuromuscular changes by means of surface electromyography, application of a proper filtering technique for the electrocardiography artifact is crucial. A novel method for automatic artifact cancellation in surface electromyography signals applying fractional order calculus and a nonlinear median filter is introduced. The proposed method is compared with the linear moving average filter, with and without prior application of fractional order calculus. 3D graphs for assessment of filter window lengths, crest factors, root mean square differences, and fractional calculus orders (called WFC and WRC graphs) have been introduced. For an appropriate quantitative filtering evaluation, a synthetic electrocardiography signal and an analogous semi-synthetic dataset have been generated. Examples of noise removal in 10 able-bodied subjects and in one patient with muscle dystrophy are presented for qualitative analysis. The crest factors, correlation coefficients, and root mean square differences of the recorded and semi-synthetic electromyography datasets showed that the most successful method was the median filter in combination with fractional order calculus of order 0.9. Significantly greater (p < 0.001) ECG peak reduction was obtained with the median filter than with the moving average filter in cases where the amplitude of muscle contraction was low compared to the ECG spikes. The presented results suggest that the novel method combining a median filter and fractional order calculus can be used for automatic filtering of electrocardiography artifacts in surface electromyography signal envelopes recorded from trunk muscles. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
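One plausible arrangement of the two ingredients named above, offered only as a sketch: a Grünwald–Letnikov fractional differentiation of order 0.9 to emphasise the spike-like ECG component, followed by a nonlinear moving-median window on the rectified signal. The surrogate signals, memory length and window size are assumptions, and the authors' actual processing chain and parameters may differ:

```python
import numpy as np
from scipy.signal import medfilt

def gl_fractional_diff(x, alpha, memory=100):
    """Grunwald-Letnikov fractional differentiation of order alpha (unit sample spacing)."""
    w = np.ones(memory)
    for k in range(1, memory):
        w[k] = w[k - 1] * (k - 1 - alpha) / k       # (-1)^k * binomial(alpha, k) recursion
    return np.convolve(x, w, mode="full")[:len(x)]  # causal fractional-difference filter

rng = np.random.default_rng(7)
fs = 1000                                           # Hz
emg = rng.normal(0, 0.05, 5 * fs)                   # surrogate low-level trunk sEMG
qrs = np.zeros_like(emg)
qrs[::fs] = 1.0                                     # one QRS-like spike per second
contaminated = emg + np.convolve(qrs, np.hanning(40), mode="same")

frac = gl_fractional_diff(contaminated, alpha=0.9)  # fractional-order processing step
cleaned = medfilt(np.abs(frac), kernel_size=201)    # nonlinear moving-median window

print("crest factor before:", round(np.max(np.abs(contaminated)) / np.std(contaminated), 1),
      "after:", round(np.max(cleaned) / np.std(cleaned), 1))
```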
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruse, J. E.; Doundoulakis, G.; Institute of Electronic Structure and Laser, Foundation for Research and Technology–Hellas, N. Plastira 100, 70013 Heraklion
2016-06-14
We analyze a method to selectively grow straight, vertical gallium nitride nanowires by plasma-assisted molecular beam epitaxy (MBE) at sites specified by a silicon oxide mask, which is thermally grown on silicon (111) substrates and patterned by electron-beam lithography and reactive-ion etching. The investigated method requires only a single MBE growth process, i.e., the SiO{sub 2} mask is formed on silicon instead of on a previously grown GaN or AlN buffer layer. We present a systematic and analytical study involving various mask patterns, characterization by scanning electron microscopy, transmission electron microscopy, and photoluminescence spectroscopy, as well as numerical simulations, to evaluate how the dimensions (window diameter and spacing) of the mask affect the distribution of the nanowires, their morphology, and alignment, as well as their photonic properties. Capabilities and limitations of this method of selective-area growth of nanowires have been identified. A window diameter of less than 50 nm and a window spacing larger than 500 nm can provide single nanowire nucleation in nearly all mask windows. The results are consistent with a Ga diffusion length on the silicon dioxide surface on the order of approximately 1 μm.
Threshold automatic selection hybrid phase unwrapping algorithm for digital holographic microscopy
NASA Astrophysics Data System (ADS)
Zhou, Meiling; Min, Junwei; Yao, Baoli; Yu, Xianghua; Lei, Ming; Yan, Shaohui; Yang, Yanlong; Dan, Dan
2015-01-01
The conventional quality-guided (QG) phase unwrapping algorithm is difficult to apply to digital holographic microscopy because of its long execution time. In this paper, we present a threshold automatic selection hybrid phase unwrapping algorithm that combines the existing QG algorithm and the flood-fill (FF) algorithm to solve this problem. The original wrapped phase map is divided into high- and low-quality sub-maps by selecting a threshold automatically, and the FF and QG unwrapping algorithms are then used to unwrap the phase in the respective sub-maps. The feasibility of the proposed method is demonstrated by experimental results, and the execution speed is shown to be much faster than that of the original QG unwrapping algorithm.
Automatic detection and classification of obstacles with applications in autonomous mobile robots
NASA Astrophysics Data System (ADS)
Ponomaryov, Volodymyr I.; Rosas-Miranda, Dario I.
2016-04-01
A hardware implementation of automatic detection and classification of objects that can represent an obstacle for an autonomous mobile robot using stereo vision algorithms is presented. We propose and evaluate a new method to detect and classify objects for a mobile robot in outdoor conditions. This method is divided into two parts: the first is the object detection step, based on the distance from the objects to the camera and a BLOB analysis; the second is the classification step, based on visual primitives and an SVM classifier. The proposed method is executed on a GPU in order to reduce the processing time. This is done on hardware based on a multi-core processor and a GPU platform, using an NVIDIA GeForce GT640 graphics card and MATLAB on a PC running Windows 10.
MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.
Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten
2006-12-01
MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows the parallel and fast calibration for several metabolites simultaneously. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for the installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
NASA Technical Reports Server (NTRS)
Tescher, Andrew G. (Editor)
1989-01-01
Various papers on image compression and automatic target recognition are presented. Individual topics addressed include: target cluster detection in cluttered SAR imagery, model-based target recognition using laser radar imagery, Smart Sensor front-end processor for feature extraction of images, object attitude estimation and tracking from a single video sensor, symmetry detection in human vision, analysis of high resolution aerial images for object detection, obscured object recognition for an ATR application, neural networks for adaptive shape tracking, statistical mechanics and pattern recognition, detection of cylinders in aerial range images, moving object tracking using local windows, new transform method for image data compression, quad-tree product vector quantization of images, predictive trellis encoding of imagery, reduced generalized chain code for contour description, compact architecture for a real-time vision system, use of human visibility functions in segmentation coding, color texture analysis and synthesis using Gibbs random fields.
ERIC Educational Resources Information Center
Doles, Daniel T.
In the constantly changing world of technology, migration is not only inevitable but many times necessary for survival, especially when the end result is simplicity for both users and IT support staff. This paper describes the migration at Franklin College (Indiana). It discusses the reasons for selecting Windows NT, the steps taken to complete…
Investigation on the Frequency Allocation for Radio Astronomy at the L Band
NASA Astrophysics Data System (ADS)
Abidin, Z. Z.; Umar, R.; Ibrahim, Z. A.; Rosli, Z.; Asanok, K.; Gasiprong, N.
2013-09-01
In this paper, the frequency allocation reserved for radio astronomy in the L band by the International Telecommunication Union (ITU), between 1400 and 1427 MHz, is reviewed. We argue that the nearby frequencies are still very important for ground-based radio astronomy by investigating radio objects (H i sources) around 1300-1500 MHz. The L-band window is separated into a group of four windows, namely 1400-1427 MHz (window A), 1380-1400 MHz (window B), 1350-1380 MHz (window C), and 1300-1350 MHz (window D). These windows are selected according to their redshifts from the rest frequency of the hydrogen spectral line at 1420.4057 MHz. Radio objects up to z ≈ 0.1, or frequencies down to 1300 MHz, are examined. We argue that since window B contains important radio objects, namely galaxies, spiral galaxies, and galaxy clusters, this window should also be allocated to radio astronomy. This underlines the significance of window B for ground-based radio astronomers. By investigating the severity of radio frequency interference (RFI) within these windows, we have determined that window B still suffers significant, consistent RFI. The main RFI sources in the four windows have also been identified. We also found that the Department of Civil Aviation of Malaysia is assigned a frequency range of 1215-1427 MHz, which overlaps all four windows and lies inside the protected frequency range for radio astronomy. We also investigated the RFI in the four windows at proposed sites of future radio astronomy observatories in Malaysia and Thailand, and found the two best sites to be Universiti Pendidikan Sultan Idris (UPSI) and Ubon Ratchathani, respectively. It was also determined that RFI in window B increases with population density.
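The window-to-redshift mapping described above follows directly from the Doppler relation f_obs = f_rest/(1 + z). A minimal sketch is given below; the window boundaries are taken from the abstract, and clamping window A to z = 0 is our own convention:

```python
# Sketch: map the four L-band windows to the H I redshift ranges they cover,
# using f_obs = f_rest / (1 + z), i.e. z = f_rest / f_obs - 1.
F_REST = 1420.4057  # MHz, hydrogen 21 cm rest frequency

windows = {
    "A": (1400.0, 1427.0),
    "B": (1380.0, 1400.0),
    "C": (1350.0, 1380.0),
    "D": (1300.0, 1350.0),
}

for name, (f_lo, f_hi) in windows.items():
    z_hi = F_REST / f_lo - 1.0            # lower frequency -> higher redshift
    z_lo = max(F_REST / f_hi - 1.0, 0.0)  # window A extends past the rest frequency
    print(f"window {name}: z = {z_lo:.4f} .. {z_hi:.4f}")
```

Running this reproduces the abstract's upper limit: the low edge of window D (1300 MHz) corresponds to z ≈ 0.093, i.e. roughly z ≈ 0.1.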
Removal of Noise from a Voice Signal by Synthesis
1973-05-01
for 102.4 millisecond windows is about five times as great as the cost of computing for 25.6 millisecond windows. Hammett, in his work on an adaptive...spectrum analysis vocoder, has examined the selection of data window widths in detail [18]. The solution Hammett used to optimize the trade off between...result is: s(t) = Σ_{i=1}^{n} R_i(t − i·T). In this equation n is the number of impulse responses under consideration, s(t) is the resulting synthetic signal
Automatic allograft bone selection through band registration and its application to distal femur.
Zhang, Yu; Qiu, Lei; Li, Fengzan; Zhang, Qing; Zhang, Li; Niu, Xiaohui
2017-09-01
Clinical reports suggest that large bone defects can be effectively restored by allograft bone transplantation, in which allograft bone selection plays an important role. Moreover, there is a strong demand for automatic allograft bone selection methods, as they could greatly improve the management efficiency of large bone banks. Although several automatic methods have been presented to select the most suitable allograft bone from a massive allograft bone bank, these methods still suffer from inaccuracy. In this paper, we propose an effective allograft bone selection method that does not use the contralateral bones. First, the allograft bone is globally aligned to the recipient bone by surface registration. Then, the global alignment is further refined through band registration. The band, defined as the recipient points within the lifted and lowered cutting planes, captures more of the local structure of the defected segment. Therefore, our method achieves robust alignment and high registration accuracy between the allograft and the recipient. Moreover, the existing contour method and surface method can be unified into one framework under our method by adjusting the lift and lower distances of the cutting planes. Finally, our method has been validated on a database of distal femurs. The experimental results indicate that our method outperforms the surface method and the contour method.
Automatic Detection of Electric Power Troubles (ADEPT)
NASA Technical Reports Server (NTRS)
Wang, Caroline; Zeanah, Hugh; Anderson, Audie; Patrick, Clint; Brady, Mike; Ford, Donnie
1988-01-01
Automatic Detection of Electric Power Troubles (ADEPT) is an expert system that integrates knowledge from three different suppliers to offer an advanced fault-detection system. It is designed for two modes of operation: real-time fault isolation and simulated modeling. Real-time fault isolation of components is accomplished on a power system breadboard through the Fault Isolation Expert System (FIES II) interface with a rule system developed in-house. Faults are quickly detected and displayed, and the rules and chain of reasoning are optionally provided on a laser printer. This system consists of a simulated space station power module using direct-current power supplies for solar arrays on three power buses. For tests of the system's ability to locate faults inserted via switches, loads are configured by an INTEL microcomputer and the Symbolics artificial intelligence development system. As these loads are resistive in nature, Ohm's Law is used as the basis for rules by which faults are located. The three-bus system can correct faults automatically where there is a surplus of power available on any of the three buses. Techniques developed and used can be applied readily to other control systems requiring rapid intelligent decisions. Simulated modeling, used for theoretical studies, is implemented using a modified version of Kennedy Space Center's KATE (Knowledge-Based Automatic Test Equipment), FIES II windowing, and an ADEPT knowledge base.
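The Ohm's-Law rule base mentioned above can be illustrated with a small sketch. The function names, tolerance, and fault labels below are hypothetical, not ADEPT's actual rules; the point is simply that a resistive load's expected current I = V/R gives a direct test for open or shorted circuits:

```python
# Minimal sketch of an Ohm's-law fault rule for resistive loads (names and
# tolerance are illustrative, not ADEPT's actual rule base).
def expected_current(volts: float, ohms: float) -> float:
    return volts / ohms

def check_load(volts: float, ohms: float, measured_amps: float,
               tol: float = 0.05) -> str:
    expect = expected_current(volts, ohms)
    if abs(measured_amps - expect) <= tol * expect:
        return "nominal"
    return "open circuit" if measured_amps < expect else "short/overload"

# Example: a 28 V bus feeding a 14-ohm load should draw 2 A.
print(check_load(28.0, 14.0, measured_amps=0.1))  # -> open circuit
```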
Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system
NASA Astrophysics Data System (ADS)
Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong
2018-01-01
We introduce the Wide-Field Imaging Telescope-0 (WIT0) with an automatic observing system. It is developed for monitoring the variability of many sources at a time, e.g., young stellar objects and active galactic nuclei. It can also locate transient sources such as supernovae or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field of view with a 4k × 4k CCD camera (FLI ML16803). To improve the observational efficiency of the system, we developed new automatic observing software, KAOS30 (KHU Automatic Observing Software for the McDonald 30-inch telescope), written in Visual C++ for the Windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports instruments that use the ASCOM driver, additional hardware installation is greatly simplified. We commissioned KAOS30 in 2017 August and are in the process of testing. Based on the WIT0 experience, we will extend KAOS30 to control multiple telescopes in future projects.
NASA Astrophysics Data System (ADS)
Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong
2018-05-01
In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate the feasibility and effectiveness of the method, a comparison with a genetic algorithm (GA) and the successive projections algorithm (SPA) for the detection of different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models using the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, showing predictive performance comparable to GA and SPA.
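As a rough illustration of the iterative predictor weighting idea (not the authors' exact FSC-mIPW-PLS procedure), the sketch below repeatedly fits a PLS model, scores each spectral variable, reweights the spectra, and discards low-importance variables; the scoring rule and keep fraction are assumptions:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def ipw_pls_select(X, y, n_components=5, n_iter=5, keep_frac=0.5):
    """Simplified iterative predictor weighting PLS variable selection.

    Repeatedly fits a PLS model, scores each variable by |coefficient| x
    std(variable), reweights the spectra, and keeps the top fraction.
    """
    idx = np.arange(X.shape[1])
    Xw = X.astype(float).copy()
    for _ in range(n_iter):
        pls = PLSRegression(n_components=min(n_components, Xw.shape[1]))
        pls.fit(Xw, y)
        importance = np.abs(pls.coef_.ravel()) * Xw.std(axis=0)
        importance /= importance.sum()
        Xw = Xw * importance          # predictor weighting step
        keep = importance >= np.quantile(importance, 1.0 - keep_frac)
        Xw, idx = Xw[:, keep], idx[keep]
    return idx  # indices of retained spectral variables
```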
Visualization of conserved structures by fusing highly variable datasets.
Silverstein, Jonathan C; Chhadia, Ankur; Dech, Fred
2002-01-01
Skill, effort, and time are required to identify and visualize anatomic structures in three dimensions from radiological data. Fundamentally, automating these processes requires a technique that uses symbolic information not in the dynamic range of the voxel data. We were developing such a technique based on mutual information for automatic multi-modality image fusion (MIAMI Fuse, University of Michigan). This system previously demonstrated facility at fusing one voxel dataset with integrated symbolic structure information to a CT dataset (of different scale and resolution) from the same person. The next step in the development of our technique was aimed at accommodating the variability of anatomy from patient to patient by using warping to fuse our standard dataset to arbitrary patient CT datasets. A standard symbolic information dataset was created from the full color Visible Human Female by segmenting the liver parenchyma, portal veins, and hepatic veins and overwriting each set of voxels with a fixed color. Two arbitrarily selected patient CT scans of the abdomen were used as reference datasets. We used the warping functions in MIAMI Fuse to align the standard structure data to each patient scan. The key to successful fusion was the focused use of multiple warping control points that place themselves around the structure of interest automatically. The user assigns only a few initial control points to align the scans. Fusions 1 and 2 transformed the atlas with 27 points around the liver to CT1 and CT2, respectively. Fusion 3 transformed the atlas with 45 control points around the liver to CT1, and Fusion 4 transformed the atlas with 5 control points around the portal vein. The CT dataset is augmented with the transformed standard structure dataset, such that the warped structure masks are visualized in combination with the original patient dataset. This combined volume visualization is then rendered interactively in stereo on the ImmersaDesk in an immersive Virtual Reality (VR) environment. The accuracy of the fusions was determined qualitatively by comparing the transformed atlas overlaid on the appropriate CT, examining where the transformed structure atlas was incorrectly overlaid (false positives) and where it was incorrectly not overlaid (false negatives). By this measure, fusions 1 and 2 were correct roughly 50-75% of the time, while fusions 3 and 4 were correct roughly 75-100% of the time. The CT dataset augmented with the transformed dataset was viewed arbitrarily in user-centered perspective stereo, taking advantage of features such as scaling, windowing and volumetric region-of-interest selection. This process of auto-coloring conserved structures in variable datasets is a step toward the goal of a broader, standardized automatic structure visualization method for radiological data. If successful, it would permit identification, visualization or deletion of structures in radiological data by semi-automatically applying canonical structure information to the radiological data (not just processing and visualizing the data's intrinsic dynamic range). More sophisticated selection of control points and patterns of warping may allow for more accurate transforms, and thus advances in visualization, simulation, education, diagnostics, and treatment planning.
Zhang, Bingxu; Gu, Xiaoyan; Li, Yafei; Li, Xiaohong; Gu, Mengxiao; Zhang, Nan; Shen, Xiangguang; Ding, Huanzhong
2014-12-16
Resistance to cephalosporins is a serious problem in veterinary clinics. To limit the emergence of bacterial resistance, the mutant selection window (MSW) hypothesis was investigated with Escherichia coli (E. coli) ATCC 25922 exposed to cefquinome in an animal tissue-cage model. Localized infection with E. coli was established in piglets, and the infected animals were administered various intramuscular doses and intervals of cefquinome to provide antibiotic concentrations below the MIC99, between the MIC99 and the mutant prevention concentration (MPC), and above the MPC. E. coli lost susceptibility when drug concentrations fluctuated between the lower and upper boundaries of the window, defined in vitro as the MIC99 (0.06 μg/mL) and the MPC (0.16 μg/mL), respectively. For PK/PD parameters, no mutant enrichment occurred when T>MIC99 was ≤25% or T>MPC was ≥50% of the administration interval. When T>MIC99 was >25% and T>MPC was <50% of the administration interval, resistance selection was observed. When AUC24h/MIC99 and AUC24h/MPC were considered, the mutant selection window extended from 32.84 h to 125.64 h and from 12.83 h to 49.09 h, respectively. These findings demonstrate that the MSW exists in vivo for time-dependent antimicrobial agents, and that its boundaries fit well with those determined in vitro. Maintaining antimicrobial concentrations above the MPC for >50% of the administration interval is a straightforward way to restrict the acquisition of resistance in this tissue-cage model. This was achieved with daily intramuscular doses of 1 mg cefquinome/kg body weight.
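The PK/PD quantities T>MIC99 and T>MPC used above are simply the fractions of the dosing interval during which the concentration-time curve stays above each threshold. A minimal sketch with a hypothetical concentration profile (the study's actual profiles are not reproduced here):

```python
import numpy as np

def percent_time_above(times_h, conc, threshold, interval_h):
    """Fraction of the dosing interval with concentration above a threshold,
    estimated by linear interpolation on a dense time grid."""
    grid = np.linspace(0.0, interval_h, 1000)
    c = np.interp(grid, times_h, conc)
    return 100.0 * np.mean(c > threshold)

# Hypothetical cefquinome profile over a 12 h interval (not the study's data)
t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0])
c = np.array([0.00, 0.30, 0.25, 0.18, 0.10, 0.04, 0.01])  # ug/mL

print(percent_time_above(t, c, 0.06, 12.0))  # %T > MIC99 (0.06 ug/mL)
print(percent_time_above(t, c, 0.16, 12.0))  # %T > MPC  (0.16 ug/mL)
```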
Simple automatic strategy for background drift correction in chromatographic data analysis.
Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin
2016-06-03
Chromatographic background drift correction, which influences peak detection and time-shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram are initially detected and organized as a new baseline vector. Iterative optimization is then employed to recognize outliers in this vector, which belong to the chromatographic peaks, and to update them in the baseline until convergence. The optimized baseline vector is finally expanded onto the original chromatogram, and linear interpolation is employed to estimate the background drift. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in a metabolic study of Escherichia coli. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, the moving window minimum value strategy, and background drift correction by orthogonal subspace projection. The proposed method allows almost fully automatic implementation of background drift correction, which is convenient for practical use.
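A minimal sketch of the described workflow (local minima as the initial baseline vector, iterative outlier suppression, then interpolation); the smoothing kernel, iteration cap, and convergence tolerance are our own illustrative choices, not the published settings:

```python
import numpy as np
from scipy.signal import argrelmin

def baseline_correct(signal, n_iter=50, tol=1e-6):
    """Illustrative baseline estimation in the spirit of the described method:
    local minima seed a baseline vector, points sitting on peaks are
    iteratively pulled down, and the result is linearly interpolated."""
    signal = np.asarray(signal, dtype=float)
    idx = argrelmin(signal, order=3)[0]           # local minima
    idx = np.concatenate(([0], idx, [len(signal) - 1]))
    base = signal[idx].copy()
    for _ in range(n_iter):
        smooth = np.convolve(base, np.ones(5) / 5, mode="same")
        updated = np.minimum(base, smooth)        # peak-side outliers move down
        if np.max(np.abs(updated - base)) < tol:  # convergence
            break
        base = updated
    drift = np.interp(np.arange(len(signal)), idx, base)
    return signal - drift, drift
```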
Automatic selection of arterial input function using tri-exponential models
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Chen, Jeremy; Castro, Marcelo; Thomasson, David
2009-02-01
Dynamic Contrast Enhanced MRI (DCE-MRI) is one method for drug and tumor assessment. Selecting a consistent arterial input function (AIF) is necessary to calculate tissue and tumor pharmacokinetic parameters in DCE-MRI. This paper presents an automatic and robust method to select the AIF. The first stage is artery detection and segmentation, which employs knowledge about artery structure and the temporal properties of dynamic signal intensity in DCE-MRI. The second stage is AIF model fitting and selection. A tri-exponential model is fitted to every candidate AIF using the Levenberg-Marquardt method, and the best-fitted AIF is selected. Our method has been applied to DCE-MRIs of four different body parts: breast, brain, liver and prostate. The success rate in artery segmentation for 19 cases was 89.6% ± 15.9%. The pharmacokinetic parameters computed from the automatically selected AIFs are highly correlated with those from manually determined AIFs (R2=0.946, P(T<=t)=0.09). Our imaging-based tri-exponential AIF model demonstrated significant improvement over a previously proposed bi-exponential model.
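The model-fitting stage can be sketched as follows; scipy's curve_fit defaults to Levenberg-Marquardt least squares for unconstrained problems, matching the abstract, while the starting values and selection-by-residual rule are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def tri_exp(t, a1, m1, a2, m2, a3, m3):
    """Tri-exponential AIF model: sum of three decaying exponentials."""
    return a1 * np.exp(-m1 * t) + a2 * np.exp(-m2 * t) + a3 * np.exp(-m3 * t)

def fit_aif(t, conc):
    """Fit one candidate AIF curve and return (params, residual sum of squares);
    the candidate with the lowest residual would be selected as the AIF."""
    p0 = [conc.max(), 1.0, conc.max() / 2, 0.1, conc.max() / 4, 0.01]
    params, _ = curve_fit(tri_exp, t, conc, p0=p0, maxfev=10000)
    rss = float(np.sum((conc - tri_exp(t, *params)) ** 2))
    return params, rss
```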
Pandey, Anil Kumar; Saroha, Kartik; Sharma, Param Dev; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
In this study, we developed a simple image processing application in MATLAB that uses suprathreshold stochastic resonance (SSR) to help the user visualize abdominopelvic tumors on exported prediuretic positron emission tomography/computed tomography (PET/CT) images. A brainstorming session was conducted for requirement analysis for the program. It was decided that the program should load the screen-captured PET/CT images and then produce output images in a window with a slider control that enables the user to view the image that best visualizes the tumor, if present. The program was implemented on a personal computer using Microsoft Windows and MATLAB R2013b. The program has an option for the user to select the input image. For the selected image, it displays output images generated using SSR in a separate window with a slider control. The slider control enables the user to view the images and select the one that seems to provide the best visualization of the area(s) of interest. The developed application enables the user to select, process, and view output images in the process of using SSR to detect the presence of abdominopelvic tumors on prediuretic PET/CT images.
Selective excitation of window and buffer layers in chalcopyrite devices and modules
Glynn, Stephen; Repins, Ingrid L.; Burst, James M.; ...
2018-02-02
Window and buffer layers in chalcopyrite devices are well known to affect junctions, conduction, and photo-absorption properties of the device. Some of these layers, particularly 'buffers,' which are deposited directly on top of the absorber, exhibit metastable effects upon exposure to light. Thus, to understand device performance and/or metastability, it is sometimes desirable to selectively excite different layers in the device stack. Absorption characteristics of various window and buffer layers used in chalcopyrite devices are measured. These characteristics are compared with emission spectra of common and available light sources that might be used to optically excite such layers. Effects of the window and buffer absorption on device quantum efficiency and metastability are discussed. For the case of bath-deposited Zn(O,S) buffers, we conclude that this layer is not optically excited in research devices or modules. Furthermore, this provides a complementary mechanism to the chemical differences that may cause long time constants (compared to devices with CdS buffers) associated with reaching a stable 'light-soaked' state.
Adaptive Liquid Crystal Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taheri, Bahman; Bodnar, Volodymyr
2011-12-31
Energy consumption by the private and commercial sectors in the U.S. has grown steadily over the last decade. The uncertainty in the future availability of imported oil, on which this energy consumption relies strongly, has resulted in a dramatic increase in the cost of energy. About 20% of this consumption is used to heat and cool houses and commercial buildings. To reduce dependence on foreign oil and cut down the emission of greenhouse gases, it is necessary to eliminate losses and reduce the total energy consumption of buildings. To achieve this goal it is necessary to redefine the role of conventional windows. At a minimum, windows should stop being a source of energy loss. Ideally, windows should become a source of energy, providing a net gain to reduce the energy used to heat and cool homes. It is possible to have a net energy gain from a window if its light transmission can be dynamically altered, ideally electronically without the need of operator assistance, providing optimal control of the solar gain that varies with season and climate in the U.S. In addition, the window must not require power from the building for operation. Resolution of this problem is a societal challenge and of national interest, and will have a broad global impact. For this purpose, a year-round, all-climate window solution providing an electronically variable solar heat gain coefficient (SHGC) with a wide dynamic range is needed. AlphaMicron, Inc. (AMI) developed and manufactured 1 ft × 1 ft prototype panels for the world's first auto-adjusting Adaptive Liquid Crystal Windows (ALCWs), which can operate from sunlight without the need for an external power source and demonstrate an electronically adjustable SHGC. These novel windows are based on AlphaMicron's patented e-Tint® technology, a guest-host liquid crystal system implemented on flexible, optically clear plastic films. This technology is suitable both for OEM and aftermarket (retrofitting) lamination to new and existing windows. The low power consumption of ALCWs allows on-board power electronics to automatically match transmission through the windows to varying climate conditions without drawing power from the grid. ALCWs can transmit more sunlight in winter to assist in heating and less sunlight in summer to minimize overheating. As such, they can change the window from a source of energy loss to a source of energy gain. In addition, AMI's scalable roll-to-roll process, proven by making the 1 ft × 1 ft ALCW prototype panels, allows cost-effective production of large-scale window panels along with the capability to easily change their color and shape. Beyond architectural glazing in houses and commercial buildings, ALCWs can be used in other applications where control of sunlight is needed, such as greenhouses used by commercial produce growers and botanical gardens, cars, aircraft, etc.
Automatic Registration of Scanned Satellite Imagery with a Digital Map Data Base.
1980-11-01
define the corresponding map window (MW) (procedure TRANSFORM WINDOW MAP ... to a LIST-item). LIN := (code 2621431; pointer LA to the line list; pointer PR1; pointer PR2), LIST := (pointer PL to a LIN-item; pointer ... items where PL-pointers are replaced by a code for the beginning (the number 262140 in our case) and end (the number 26241). Figure 3.2 illustrates a
Fast object detection algorithm based on HOG and CNN
NASA Astrophysics Data System (ADS)
Lu, Tongwei; Wang, Dandan; Zhang, Yanduo
2018-04-01
In the field of computer vision, object classification and object detection are widely used in many areas. Traditional object detection has two main problems: the sliding-window region selection strategy has high time complexity and produces redundant windows, and the hand-crafted features are not sufficiently robust. To solve these problems, a Region Proposal Network (RPN) is used to select candidate regions instead of the selective search algorithm. Compared with traditional algorithms and selective search, the RPN has higher efficiency and accuracy. We combine HOG features and a convolutional neural network (CNN) to extract features, and use an SVM for classification. For TorontoNet, our algorithm's mAP is 1.6 percentage points higher; for OxfordNet, it is 1.3 percentage points higher.
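A minimal sketch of the HOG-plus-SVM part of such a pipeline (proposal generation by the RPN is out of scope here, and the HOG cell sizes and kernel choice are illustrative assumptions):

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(patches):
    """Extract HOG descriptors from equally sized grayscale patches."""
    return np.array([
        hog(p, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        for p in patches
    ])

def train_classifier(train_patches, labels):
    """train_patches and labels would come from region proposals (e.g., an
    RPN) rather than exhaustive sliding windows; here they are placeholders."""
    clf = SVC(kernel="linear")
    clf.fit(hog_features(train_patches), labels)
    return clf
```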
Content-based analysis of Ki-67 stained meningioma specimens for automatic hot-spot selection.
Swiderska-Chadaj, Zaneta; Markiewicz, Tomasz; Grala, Bartlomiej; Lorent, Malgorzata
2016-10-07
Hot-spot based examination of immunohistochemically stained histological specimens is one of the most important procedures in pathomorphological practice. The development of image acquisition equipment and computational units allows for the automation of this process. Moreover, many technical problems occur in everyday histological material, which increases the complexity of the task. Thus, a full context-based analysis of histological specimens is also needed in the quantification of immunohistochemically stained specimens. One of the most important reactions is the Ki-67 proliferation marker in meningiomas, the most frequent intracranial tumour. The aim of our study is to propose a context-based analysis of Ki-67 stained specimens of meningiomas for automatic selection of hot-spots. The proposed solution is based on textural analysis, mathematical morphology, feature ranking and classification, as well as on the proposed hot-spot gradual extinction algorithm, to allow for the proper detection of a set of hot-spot fields. The designed whole slide image processing scheme eliminates artifacts such as hemorrhages, folds or stained vessels from the region of interest. To validate the automatic results, a set of 104 meningioma specimens was selected and twenty hot-spots inside them were identified independently by two experts. The Spearman rho correlation coefficient was used to compare the results, which were also analyzed with the help of a Bland-Altman plot. The results show that most of the cases (84) were examined properly automatically, with at most two fields of view presenting a technical problem. A further 13 had three such fields, and only seven specimens did not meet the requirements for automatic examination. Generally, the automatic system identifies hot-spot areas, and especially their maximum points, better. Analysis of the results confirms the very high concordance between the automatic Ki-67 examination and the experts' results, with a Spearman rho higher than 0.95. The proposed hot-spot selection algorithm, with an extended context-based analysis of whole slide images and the hot-spot gradual extinction algorithm, provides an efficient tool for simulating a manual examination. The presented results confirm that automatic examination of Ki-67 in meningiomas could be introduced in the near future.
The Assessment of Selectivity in Different Quadrupole-Orbitrap Mass Spectrometry Acquisition Modes
NASA Astrophysics Data System (ADS)
Berendsen, Bjorn J. A.; Wegh, Robin S.; Meijer, Thijs; Nielen, Michel W. F.
2015-02-01
Selectivity of the confirmation of identity in liquid chromatography (tandem) mass spectrometry using Q-Orbitrap instrumentation was assessed for different acquisition modes, based on a representative experimental data set constructed from 108 samples, including six different matrix extracts and containing over 100 analytes each. Single-stage full scan, all-ion fragmentation, and product ion scanning were applied. By generating reconstructed ion chromatograms using a unit mass window in targeted MS2, selected reaction monitoring (SRM), regularly applied on triple-quadrupole instruments, was mimicked. This facilitated the comparison of single-stage full scan, all-ion fragmentation, (mimicked) SRM, and product ion scanning applying mass windows down to 1 ppm. Single-factor analysis of variance was carried out on the variance (s2) of the mass error to determine which factors and interactions are significant parameters with respect to selectivity. We conclude that selectivity is related to the target compound (mainly the mass defect), the matrix, sample clean-up, concentration, and mass resolution. Selectivity of the different instrumental configurations was quantified by counting the number of interfering peaks observed in the chromatograms. We conclude that precursor ion selection contributes significantly to selectivity: monitoring a single product ion at high mass accuracy with a 1 Da precursor ion window proved to be equally or more selective than monitoring two transition products in mimicked SRM. In contrast, monitoring a single fragment in all-ion fragmentation mode results in significantly lower selectivity than mimicked SRM. After a thorough inter-laboratory evaluation study, the results of this study can be used for a critical reassessment of the current identification points system and contribute to the next generation of evidence-based and robust performance criteria in residue analysis and sports doping.
climwin: An R Toolbox for Climate Window Analysis.
Bailey, Liam D; van de Pol, Martijn
2016-01-01
When studying the impacts of climate change, there is a tendency to select climate data from a small set of arbitrary time periods or climate windows (e.g., spring temperature). However, these arbitrary windows may not encompass the strongest periods of climatic sensitivity and may lead to erroneous biological interpretations. Therefore, there is a need to consider a wider range of climate windows to better predict the impacts of future climate change. We introduce the R package climwin that provides a number of methods to test the effect of different climate windows on a chosen response variable and compare these windows to identify potential climate signals. climwin extracts the relevant data for each possible climate window and uses this data to fit a statistical model, the structure of which is chosen by the user. Models are then compared using an information criteria approach. This allows users to determine how well each window explains variation in the response variable and compare model support between windows. climwin also contains methods to detect type I and II errors, which are often a problem with this type of exploratory analysis. This article presents the statistical framework and technical details behind the climwin package and demonstrates the applicability of the method with a number of worked examples.
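The core window-scan idea can be sketched outside R as well. The following is a simplified Python analogue, not the climwin API: every candidate window aggregates the climate data, a linear model is fitted, and windows are compared by AIC:

```python
import itertools
import numpy as np

def aic_linear(x, y):
    """AIC of a simple linear model y ~ x fitted by least squares."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = len(y), 3  # intercept, slope, residual variance
    return n * np.log(rss / n) + 2 * k

def scan_windows(daily_climate, response, max_lag=120):
    """daily_climate[i, d]: climate on day d before the event for record i.
    Tries every (open, close) window and scores the mean-aggregated signal."""
    best = (np.inf, None)
    for open_, close in itertools.combinations(range(max_lag), 2):
        signal = daily_climate[:, open_:close + 1].mean(axis=1)
        best = min(best, (aic_linear(signal, response), (open_, close)))
    return best  # (AIC, (window open, window close))
```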
Van Strien, Jan W; Langeslag, Sandra J E; Strekalova, Nadja J; Gootjes, Liselotte; Franken, Ingmar H A
2009-01-28
To examine whether valence and arousal influence recognition memory during early automatic or during more sustained processes, event-related brain potentials (ERPs) of 21 women were recorded while they made old/new judgments in a continuous recognition task with pictures from the International Affective Picture System. The pictures were presented twice and differed in emotional valence and arousal. The P1 peak and four time windows were investigated: 200-300 ms, 300-400 ms, 400-600 ms, and 750-1000 ms after stimulus onset. There was a robust old/new effect starting in the 200-300 ms epoch and lasting all time windows. The valence effect was mainly present in the P1 peak and the 200-400 ms epoch, whereas the arousal effect was found in the 300-1000 ms epoch. Exploratory sLORETA analyses dissociated valence-dependent ventromedial prefrontal activity and arousal-dependent occipital activity in the 350-380 ms time window. Valence interacted with the 200-400 ms old/new effect at central and frontal sites. Arousal interacted with the 750-1000 ms old/new effect at posterior sites. It is concluded that valence influences fast recognition memory, while arousal may influence sustained encoding.
Yin, Xiaoming; Li, Xiang; Zhao, Liping; Fang, Zhongping
2009-11-10
A Shack-Hartmann wavefront sensor (SHWS) splits the incident wavefront into many subsections and converts distorted-wavefront detection into centroid measurement. The accuracy of the centroid measurement determines the accuracy of the SHWS. Many methods have been presented to improve the accuracy of wavefront centroid measurement. However, most of these methods are discussed from the point of view of optics, based on the assumption that the spot intensity of the SHWS has a Gaussian distribution, which is not applicable to the digital SHWS. In this paper, we present a centroid measurement algorithm based on adaptive thresholding and a dynamic windowing method, utilizing image processing techniques for practical application of the digital SHWS in surface profile measurement. The method can detect the centroid of each focal spot precisely and robustly by eliminating the influence of various noise sources, such as diffraction of the digital SHWS, unevenness and instability of the light source, as well as deviation between the centroid of the focal spot and the center of the detection area. The experimental results demonstrate that the algorithm has better precision, repeatability, and stability compared with other commonly used centroid methods, such as the statistical averaging, thresholding, and windowing algorithms.
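A minimal sketch of one way to combine adaptive thresholding with a dynamic window for a single focal spot (the threshold rule, window size, and refinement strategy are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

def spot_centroid(img, seed, win=15, k=3.0):
    """Centroid of one SHWS focal spot with adaptive thresholding and a
    dynamic window. img: 2-D intensity array; seed: rough (row, col)."""
    r, c = seed
    r0, c0 = max(r - win, 0), max(c - win, 0)
    sub = img[r0:r + win + 1, c0:c + win + 1].astype(float)
    thr = sub.mean() + k * sub.std()        # adaptive threshold from local stats
    w = np.clip(sub - thr, 0.0, None)       # suppress background noise
    if w.sum() == 0:
        return float(r), float(c)
    rows, cols = np.indices(w.shape)
    cr = (rows * w).sum() / w.sum() + r0
    cc = (cols * w).sum() / w.sum() + c0
    # The window can then be re-centred on (cr, cc) and shrunk for refinement.
    return cr, cc
```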
Variational estimation of process parameters in a simplified atmospheric general circulation model
NASA Astrophysics Data System (ADS)
Lv, Guokun; Koehl, Armin; Stammer, Detlef
2016-04-01
Parameterizations are used to simulate the effects of unresolved sub-grid-scale processes in current state-of-the-art climate models. The values of the process parameters, which determine the model's climatology, are usually adjusted manually to reduce the difference between the model mean state and the observed climatology. This process requires detailed knowledge of the model and its parameterizations. In this work, a variational method was used to estimate process parameters in the Planet Simulator (PlaSim). The adjoint code was generated by automatic differentiation of the source code. Some hydrological processes were switched off to remove the influence of zero-order discontinuities. In addition, the nonlinearity of the model limits the feasible assimilation window to about one day, which is too short to tune the model's climatology. To extend the feasible assimilation window, nudging terms for all state variables were added to the model's equations, which essentially suppress all unstable directions. In identical twin experiments, we found that the feasible assimilation window could be extended to over one year and accurate parameters could be retrieved. Although the nudging terms translate into a damping of the adjoint variables and therefore tend to erase the information of the data over time, assimilating climatological information is shown to provide sufficient information on the parameters. Moreover, the mechanism of this regularization is discussed.
Automatic blood vessel based-liver segmentation using the portal phase abdominal CT
NASA Astrophysics Data System (ADS)
Maklad, Ahmed S.; Matsuhiro, Mikio; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Shimada, Mitsuo; Iinuma, Gen
2018-02-01
Liver segmentation is the basis for computer-based planning of hepatic surgical interventions. Automatic segmentation of the liver is highly important in the diagnosis and analysis of hepatic diseases and in surgery planning. Blood vessels (BVs) have shown high utility for liver segmentation. In our previous work, we developed a semi-automatic method that segments the liver from portal phase abdominal CT images in two stages. The first stage was interactive segmentation of abdominal blood vessels (ABVs) and their subsequent classification into hepatic (HBVs) and non-hepatic (non-HBVs). This stage involved five interactions: a selective threshold for bone segmentation, selection of two seed points for kidney segmentation, selection of the inferior vena cava (IVC) entrance for starting ABV segmentation, identification of the portal vein (PV) entrance to the liver, and identification of the IVC exit for classifying HBVs from other ABVs (non-HBVs). The second stage is automatic segmentation of the liver based on the segmented ABVs, as described in [4]. Toward full automation of our method, we developed a method [5] that segments ABVs automatically, addressing the first three interactions. In this paper, we propose fully automatic classification of ABVs into HBVs and non-HBVs, and consequently full automation of the liver segmentation proposed in [4]. Results illustrate that the method is effective at segmenting the liver from portal phase abdominal CT images.
Multimedia proceedings of the 10th Office Information Technology Conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, B.
1993-09-10
The CD contains the handouts for all the speakers, demo software from Apple, Adobe, Microsoft, and Zylabs, and video movies of the keynote speakers. Adobe Acrobat is used to provide full-fidelity retrieval of the speakers' slides, and Apple's QuickTime for Macintosh and Windows is used for video playback. ZyIndex is included for Windows users to provide a full-text search engine for selected documents. There are separately labelled installation and operating instructions for Macintosh and Windows users and some general materials common to both sets of users.
Influence of sampling window size and orientation on parafoveal cone packing density
Lombardo, Marco; Serrao, Sebastiano; Ducoli, Pietro; Lombardo, Giuseppe
2013-01-01
We assessed the agreement between sampling windows of different sizes and orientations on packing density estimates in images of the parafoveal cone mosaic acquired using a flood-illumination adaptive optics retinal camera. Horizontally and vertically oriented sampling windows of different sizes (320x160 µm, 160x80 µm and 80x40 µm) were selected at two retinal locations along the horizontal meridian in one eye of ten subjects. At each location, cone density tended to decline with decreasing sampling area. Although the differences in cone density estimates were not statistically significant, Bland-Altman plots showed that the agreement between cone densities estimated with the different sampling window conditions was moderate. The percentage of the preferred packing arrangements of cones by Voronoi tiles was slightly affected by window size and orientation. The results illustrate the importance of specifying the size and orientation of the sampling window used to derive cone metric estimates, to facilitate comparison of different studies. PMID:24009995
The effects of window shape and reticle presence on performance in a vertical alignment task
NASA Technical Reports Server (NTRS)
Rosenberg, Erika L.; Haines, Richard F.; Jordan, Kevin
1989-01-01
This study was conducted to evaluate the effect of selected interior work-station orientational cuing upon the ability to align a target image with local vertical in the frontal plane. Angular error from gravitational vertical in an alignment task was measured for 20 observers viewing through two window shapes (square, round), two initial orientations of a computer-generated space shuttle image, and the presence or absence of a stabilized optical alignment reticle. In terms of overall accuracy, it was found that observer error was significantly smaller for the square window and reticle-present conditions than for the round window and reticle-absent conditions. Response bias data reflected an overall tendency to undershoot and greater variability of response in the round window/no reticle condition. These results suggest that environmental cuing information, such as that provided by square window frames and alignment reticles, may aid in subjective orientation and increase accuracy of response in a Space Station proximity operations alignment task.
Optical performance of segmented aperture windows for solar tower receivers
NASA Astrophysics Data System (ADS)
Buck, Reiner
2017-06-01
Segmented quartz windows are a concept for building larger windows for receivers that require a closed aperture. Reflection losses are a significant loss factor for such solar receivers. Without any additional measures, the reflection loss can reach about 12%. One important measure to improve transmission is the application of anti-reflective coatings, which is beneficial in any case. Another option is modifying the window geometry, especially the edge surfaces of the glass segments. A certain fraction of the reflection losses is caused by a light-guide effect in the glass body for rays entering through the front surface. Changing the cut surfaces in a way that reduces the light-guide effect can significantly improve the transmission of a segmented window. Several possible configurations are evaluated and discussed. The results of ray-tracing simulations verify the improvement. The final selection of the window configuration depends on the optical properties and on mechanical strength, manufacturing and cost considerations. This has to be evaluated for any specific receiver design.
Switchable Materials for Smart Windows.
Wang, Yang; Runnerstrom, Evan L; Milliron, Delia J
2016-06-07
This article reviews the basic principles of and recent developments in electrochromic, photochromic, and thermochromic materials for applications in smart windows. Compared with current static windows, smart windows can dynamically modulate the transmittance of solar irradiation based on weather conditions and personal preferences, thus simultaneously improving building energy efficiency and indoor human comfort. Although some smart windows are commercially available, their widespread implementation has not yet been realized. Recent advances in nanostructured materials provide new opportunities for next-generation smart window technology owing to their unique structure-property relations. Nanomaterials can provide enhanced coloration efficiency, faster switching kinetics, and longer lifetime. In addition, their compatibility with solution processing enables low-cost and high-throughput fabrication. This review also discusses the importance of dual-band modulation of visible and near-infrared (NIR) light, as nearly 50% of solar energy lies in the NIR region. Some latest results show that solution-processable nanostructured systems can selectively modulate the NIR light without affecting the visible transmittance, thus reducing energy consumption by air conditioning, heating, and artificial lighting.
Smart windows with functions of reflective display and indoor temperature-control
NASA Astrophysics Data System (ADS)
Lee, I.-Hui; Chao, Yu-Ching; Hsu, Chih-Cheng; Chang, Liang-Chao; Chiu, Tien-Lung; Lee, Jiunn-Yih; Kao, Fu-Jen; Lee, Chih-Kung; Lee, Jiun-Haw
2010-02-01
In this paper, a switchable window based on a cholesteric liquid crystal (CLC) was demonstrated. Under different applied voltages, incoming light at visible and infrared wavelengths was modulated, respectively. A mixture of CLC with a nematic liquid crystal and a chiral dopant selectively reflected infrared light without bias, which effectively reduced the indoor temperature under sunlight illumination. In this state, transmission in the visible range remained high and the window looked transparent. When the voltage was increased to 15 V, the CLC changed to the focal conic state and could be used as a reflective display, a privacy window, or a screen for a projector. Under a high voltage (30 V), the homeotropic state was achieved. In this state, both infrared and visible light were transmitted and the device acted as a normal window, permitting the infrared spectrum of winter sunlight to enter the room so as to reduce the heating requirement. Such a device can be used as a switchable window in smart buildings, greenhouses and windshields.
Selective Window Application of Gentamicin+ Dexamethasone in Meniere's Disease.
Ardıç, Fazıl Necdet; Tümkaya, Funda; Aykal, Kamil; Çabuk, Burçin
2017-08-01
The purpose of the study is to prevent hearing loss when using intratympanic (IT) gentamicin for intractable Meniere's disease. It is a retrospective case review study. Twenty-five patients who had definite Meniere's disease and had either selective window application or weekly IT gentamicin were included in the study. The first group (selective) had dexamethasone applied to the round window and gentamicin to the oval window during an exploratory tympanotomy procedure. The second group had IT gentamicin at weekly intervals. The degree of caloric weakness (CW) and the average hearing levels at low pitch (HLP) (250, 500, 1000, 2000 Hz) and high pitch (HHP) (4000, 6000, 8000 Hz) were compared before and after treatment. The need for further treatment was noted. In the first group, the average HLP increased from 51.6±7 dB to 52.2±5.6 dB and the average HHP increased from 41.96±20.2 dB to 47.2±18.3 dB after treatment; the CW changed from 37.6±23.9% to 54.6±30.6%. In the second group, the average HLP increased from 56.3±10.5 dB to 61.65±18.3 dB and the average HHP increased from 59.05±17.4 dB to 69.4±21.98 dB after treatment; the CW changed from 45.8±22.3% to 71.53±29.63%. Both methods produced a statistically significant increase in caloric weakness, but only IT gentamicin led to a significant hearing loss in HHP. The use of dexamethasone and gentamicin via different windows in the middle ear is a safe and effective method for Meniere's disease in the short term. Application of dexamethasone protects not only the hearing cells but also the vestibular cells.
Rizzetto, Lisa; Guedez, Damariz Rivero; Donato, Michele; Romualdi, Chiara; Draghici, Sorin; Cavalieri, Duccio
2011-01-01
Motivation: Many models and analyses of signaling pathways have been proposed. However, none of them takes into account that a biological pathway is not a fixed system; instead, it depends on the organism, tissue and cell type, as well as on physiological, pathological and experimental conditions. Results: The Biological Connection Markup Language (BCML) is a format to describe, annotate and visualize pathways. BCML is able to store multiple pieces of information, permitting a selective view of the pathway as it exists and/or behaves in specific organisms, tissues and cells. Furthermore, BCML can be automatically converted into data formats suitable for analysis and into a fully SBGN-compliant graphical representation, making it an important tool that can be used by both computational biologists and 'wet lab' scientists. Availability and implementation: The XML schema and the BCML software suite are freely available under the LGPL for download at http://bcml.dc-atlas.net. They are implemented in Java and supported on MS Windows, Linux and OS X. Contact: duccio.cavalieri@unifi.it; sorin@wayne.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21653523
Temporally-aware algorithms for the classification of anuran sounds.
Luque, Amalia; Romero-Lemos, Javier; Carrasco, Alejandro; Gonzalez-Abril, Luis
2018-01-01
Several authors have shown that the sounds of anurans can be used as an indicator of climate change. Hence, the recording, storage and further processing of a huge number of anuran sounds, distributed over time and space, are required in order to obtain this indicator. Furthermore, it is desirable to have algorithms and tools for the automatic classification of the different classes of sounds. In this paper, six classification methods are proposed, all based on the data-mining domain, which strive to take advantage of the temporal character of the sounds. The definition and comparison of these classification methods are undertaken using several approaches. The main conclusions of this paper are that: (i) the sliding window method attained the best results in the experiments presented, and even outperformed the hidden Markov models usually employed in similar applications; (ii) noteworthy overall classification performance has been obtained, which is an especially striking result considering that the sounds analysed were affected by a highly noisy background; (iii) instance selection for determining the sounds in the training dataset offers better results than cross-validation techniques; and (iv) the temporally-aware classifiers have revealed that they can obtain better performance than their non-temporally-aware counterparts.
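One simple instance of a temporally-aware sliding-window classifier is to smooth per-frame class probabilities over neighbouring frames before assigning labels; the sketch below is an illustrative reading of the idea, not the authors' exact method:

```python
import numpy as np

def sliding_window_labels(frame_probs, width=5):
    """Temporally-aware smoothing of per-frame class probabilities:
    each frame is relabelled from the mean probability over a centred
    sliding window, exploiting the temporal continuity of anuran calls.
    frame_probs: (n_frames, n_classes) array from any base classifier."""
    n, _ = frame_probs.shape
    half = width // 2
    labels = np.empty(n, dtype=int)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        labels[i] = frame_probs[lo:hi].mean(axis=0).argmax()
    return labels
```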
Residual Silicone Detection. [external tank and solid rocket booster surfaces
NASA Technical Reports Server (NTRS)
Smith, T.
1980-01-01
Both photoelectron emission and ellipsometry proved successful in detecting silicone contamination on unpainted and epoxy painted metal surfaces such as those of the external tank and the solid rocket booster. Great success was achieved using photoelectron emission (PEE). Panels were deliberately contaminated to controlled levels and then mapped with PEE to reveal the areas and levels that were contaminated. The panels were then tested with regard to adhesive properties. Tapes were bonded over the contaminated and uncontaminated regions and the peel force was measured, or the contaminated panels were bonded (with CPR 483 foam) to uncontaminated panels and made into lap shear specimens. Other panels were bonded and made into wedge specimens for hydrothermal stress endurance tests. Strong adhesion resulted if the PEE signal fell within an acceptance window, but was poor outside the acceptance window. A prototype instrument is being prepared which can automatically be scanned over the external liquid hydrogen tank and identify those regions that are contaminated and will cause bond degradation.
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
Cortical Oscillatory Mechanisms Supporting the Control of Human Social-Emotional Actions.
Bramson, Bob; Jensen, Ole; Toni, Ivan; Roelofs, Karin
2018-06-20
The human anterior prefrontal cortex (aPFC) is involved in regulating social-emotional behavior, presumably by modulating effective connectivity with downstream parietal, limbic, and motor cortices. Regulating that connectivity might rely on theta-band oscillations (4-8 Hz), a brain rhythm known to create overlapping periods of excitability between distant regions by temporally releasing neurons from inhibition. Here, we used MEG to understand how aPFC theta-band oscillations implement control over prepotent social-emotional behaviors; that is, the control over automatically elicited approach and avoidance actions. Forty human male participants performed a social approach-avoidance task in which they approached or avoided visually displayed emotional faces (happy or angry) by pulling or pushing a joystick. Approaching angry and avoiding happy faces (incongruent condition) requires rapid application of cognitive control to override prepotent habitual action tendencies to approach appetitive and to avoid aversive situations. In the time window before response delivery, trial-by-trial variations in aPFC theta-band power (6 Hz) predicted reaction time increases during emotional control and were inversely related to beta-band power (14-22 Hz) over parietofrontal cortex. In sensorimotor areas contralateral to the moving hand, premovement gamma-band rhythms (60-90 Hz) were stronger during incongruent than congruent trials, with power increases phase locked to peaks of the aPFC theta-band oscillations. These findings define a mechanistic relation between cortical areas involved in implementing rapid control over human social-emotional behavior. The aPFC may bias neural processing toward rule-driven actions and away from automatic emotional tendencies by coordinating tonic disinhibition and phasic enhancement of parietofrontal circuits involved in action selection. SIGNIFICANCE STATEMENT Being able to control social-emotional behavior is crucial for successful participation in society, as is illustrated by the severe social and occupational difficulties experienced by people suffering from social motivational disorders such as social anxiety. In this study, we show that theta-band oscillations in the anterior prefrontal cortex (aPFC), which are thought to provide temporal organization for neural firing during communication between distant brain areas, facilitate this control by linking aPFC to parietofrontal beta-band and sensorimotor gamma-band oscillations involved in action selection. These results contribute to a mechanistic understanding of cognitive control over automatic social-emotional action and point to frontal theta-band oscillations as a possible target of rhythmic neurostimulation techniques during treatment for social anxiety.
Analysis and Comparison of Some Automatic Vehicle Monitoring Systems
DOT National Transportation Integrated Search
1973-07-01
In 1970 UMTA solicited proposals and selected four companies to develop systems to demonstrate the feasibility of different automatic vehicle monitoring techniques. The demonstrations culminated in experiments in Philadelphia to assess the performanc...
Automatic Evolution of Molecular Nanotechnology Designs
NASA Technical Reports Server (NTRS)
Globus, Al; Lawton, John; Wipke, Todd; Saini, Subhash (Technical Monitor)
1998-01-01
This paper describes strategies for automatically generating designs for analog circuits at the molecular level. Software maps out the edges and vertices of potential nanotechnology systems on graphs, then selects appropriate ones through evolutionary or genetic paradigms.
[Study on the automatic parameters identification of water pipe network model].
Jia, Hai-Feng; Zhao, Qi-Feng
2010-01-01
Based on an analysis of problems in the development and application of water pipe network models, automatic identification of model parameters is regarded as a key bottleneck for applying such models in water supply enterprises. A methodology for automatic parameter identification of water pipe network models based on GIS and SCADA databases is proposed. The kernel algorithms of automatic parameter identification are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte Carlo Sampling) is used for automatic identification of the parameters; the detailed technical route based on RSA and MCS is presented. A module for automatic parameter identification of water pipe network models was developed. Finally, with a typical water pipe network selected as a case, a case study on automatic identification of water pipe network model parameters was conducted, and satisfactory results were achieved.
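As a rough illustration of the MCS step, the following Python sketch draws candidate parameters from a prior range, scores them against observed SCADA pressures, and keeps the best-fitting ("behavioural") samples. The hydraulic model, observation values, and acceptance quantile are all illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(roughness):
    """Hypothetical stand-in for a hydraulic simulation: maps a pipe roughness
    coefficient to predicted pressures at three monitored nodes (a real model
    would run a network solver such as EPANET)."""
    return np.array([48.0, 50.0, 47.0]) / np.sqrt(roughness)

observed = np.array([6.66, 6.93, 6.52])  # pressures from SCADA (made-up numbers)

# MCS step: draw candidate parameters from a prior range and keep the
# "behavioural" samples whose simulated pressures best match the observations.
candidates = rng.uniform(30.0, 80.0, size=10_000)
errors = np.array([np.mean((model(c) - observed) ** 2) for c in candidates])
behavioural = candidates[errors < np.quantile(errors, 0.05)]

# An RSA-style check would flag a parameter as sensitive when the behavioural
# and non-behavioural samples have clearly different distributions.
print("identified roughness ~", behavioural.mean())  # close to the true ~52
```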
Thai Automatic Speech Recognition
2005-01-01
used in an external DARPA evaluation involving medical scenarios between an American Doctor and a naïve monolingual Thai patient. 2. Thai Language... dictionary generation more challenging, and (3) the lack of word segmentation, which calls for automatic segmentation approaches to make n-gram language...requires a dictionary and provides various segmentation algorithms to automatically select suitable segmentations. Here we used a maximal matching
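The maximal matching segmentation mentioned in this snippet is easy to sketch. The following Python function implements greedy left-to-right maximal matching against a word dictionary; the toy Latin-script example stands in for unsegmented Thai text, and the dictionary is hypothetical:

```python
def maximal_matching(text, dictionary):
    """Greedy left-to-right maximal matching: at each position take the longest
    dictionary word that matches; fall back to a single character otherwise."""
    words, i = [], 0
    max_len = max(map(len, dictionary))
    while i < len(text):
        for j in range(min(len(text), i + max_len), i, -1):
            if text[i:j] in dictionary or j == i + 1:
                words.append(text[i:j])
                i = j
                break
    return words

# Toy Latin-script stand-in for an unsegmented Thai sentence.
print(maximal_matching("thisisatest", {"this", "is", "a", "test"}))
# -> ['this', 'is', 'a', 'test']
```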
Method of identifying features in indexed data
Jarman, Kristin H [Richland, WA; Daly, Don Simone [Richland, WA; Anderson, Kevin K [Richland, WA; Wahl, Karen L [Richland, WA
2001-06-26
The present invention is a method of identifying features in indexed data, especially useful for distinguishing signal from noise in data provided as a plurality of ordered pairs. Each of the plurality of ordered pairs has an index and a response. The method has the steps of: (a) providing an index window having a first window end located on a first index and extending across a plurality of indices to a second window end; (b) selecting responses corresponding to the plurality of indices within the index window and computing a measure of dispersion of the responses; and (c) comparing the measure of dispersion to a dispersion critical value. Advantages of the present invention include minimizing the effects of low signal-to-noise ratio, signal drift, varying baseline signal, and combinations thereof.
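A minimal sketch of this idea in Python, using the standard deviation as the measure of dispersion and a fixed critical value; both choices, and the flagging logic, are illustrative rather than the patent's exact procedure:

```python
import numpy as np

def find_features(response, window=5, critical_value=2.0):
    """Slide an index window across the responses; flag windows whose dispersion
    (here: standard deviation) exceeds the critical value as feature regions."""
    flags = np.zeros(len(response), dtype=bool)
    for start in range(len(response) - window + 1):
        if np.std(response[start:start + window]) > critical_value:
            flags[start:start + window] = True
    return flags

y = np.random.default_rng(1).normal(0.0, 1.0, 100)
y[40:45] += 15.0                         # an injected peak on a noisy baseline
print(np.flatnonzero(find_features(y)))  # indices around 36..48
```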
A multimodal logistics service network design with time windows and environmental concerns
Zhang, Dezhi; He, Runzhong; Li, Shuangyan; Wang, Zhongwei
2017-01-01
The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, considering transport cost, transport time, carbon emissions, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to solve the abovementioned optimization model. A numerical example is provided to validate the model and the two algorithms, and comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained. PMID:28934272
Superconductive radiofrequency window assembly
Phillips, Harry Lawrence; Elliott, Thomas S.
1998-01-01
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The srf window assembly (20) has a superconducting metal-ceramic design. The srf window assembly (20) comprises a superconducting frame (30), a ceramic plate (40) having a superconducting metallized area, and a superconducting eyelet (50) for sealing plate (40) into frame (30). The plate (40) is brazed to eyelet (50) which is then electron beam welded to frame (30). A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the srf window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator.
NASA Astrophysics Data System (ADS)
Pollastro, Pasquale; Rampone, Salvatore
The aim of this work is to describe a cleaning procedure of GenBank data, producing material to train and to assess the prediction accuracy of computational approaches for gene characterization. A procedure (GenBank2HS3D) has been defined, producing a dataset (HS3D - Homo Sapiens Splice Sites Dataset) of Homo Sapiens splice regions extracted from GenBank (Rel.123 at this time). It selects, from the complete GenBank Primate Division, entries of Human Nuclear DNA according to several assessed criteria; then it extracts exons and introns from these entries (currently 4523 + 3802). Donor and acceptor sites are then extracted as windows of 140 nucleotides around each splice site (3799 + 3799). After discarding windows not including canonical GT-AG junctions (65 + 74), windows with insufficient data (not enough material for a 140-nucleotide window) (686 + 589), windows containing non-AGCT bases (29 + 30), and redundant windows (218 + 226), the remaining windows (2796 + 2880) are reported in the dataset. Finally, windows of false splice sites are selected by searching for canonical GT-AG pairs at non-splicing positions (271 937 + 332 296). The false sites in a range of +/- 60 from a true splice site are marked as proximal. HS3D, release 1.2 at this time, is available at the Web server of the University of Sannio: http://www.sci.unisannio.it/docenti/rampone/.
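The window-extraction and filtering steps can be sketched compactly. The following Python function is an illustrative reconstruction, assuming splice positions are given as indices into the sequence and that the donor-site GT dinucleotide sits at the window center; the exact offsets used by GenBank2HS3D are not specified here:

```python
def extract_windows(sequence, splice_positions, half=70):
    """Cut 140-nucleotide windows around splice sites and apply the dataset's
    filters: canonical-junction check, full-length check, and AGCT-only check."""
    kept = []
    for pos in splice_positions:
        window = sequence[pos - half:pos + half]
        if pos < half or len(window) < 2 * half:
            continue                        # not enough material for 140 nt
        if window[half:half + 2] != "GT":   # donor check; acceptors would use "AG"
            continue
        if set(window) - set("ACGT"):
            continue                        # non-AGCT bases present
        kept.append(window)
    return kept

seq = "A" * 70 + "GT" + "C" * 68            # toy sequence, donor GT at position 70
print(len(extract_windows(seq, [70])))      # -> 1
```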
Considerations in Phase Estimation and Event Location Using Small-aperture Regional Seismic Arrays
NASA Astrophysics Data System (ADS)
Gibbons, Steven J.; Kværna, Tormod; Ringdal, Frode
2010-05-01
The global monitoring of earthquakes and explosions at decreasing magnitudes necessitates the fully automatic detection, location and classification of an ever increasing number of seismic events. Many seismic stations of the International Monitoring System are small-aperture arrays designed to optimize the detection and measurement of regional phases. Collaboration with operators of mines within regional distances of the ARCES array, together with waveform correlation techniques, has provided an unparalleled opportunity to assess the ability of a small-aperture array to provide robust and accurate direction and slowness estimates for phase arrivals resulting from well-constrained events at sites of repeating seismicity. A significant reason for the inaccuracy of current fully-automatic event location estimates is the use of f-k slowness estimates measured in variable frequency bands. The variability of slowness and azimuth measurements for a given phase from a given source region is reduced by the application of almost any constant frequency band. However, the frequency band resulting in the most stable estimates varies greatly from site to site. Situations are observed in which regional P-arrivals from two sites, far closer than the theoretical resolution of the array, result in highly distinct populations in slowness space. This means that the f-k estimates, even at relatively low frequencies, can be sensitive to source and path-specific characteristics of the wavefield and should be treated with caution when inferring a geographical backazimuth under the assumption of a planar wavefront arriving along the great-circle path. Moreover, different frequency bands are associated with different biases, meaning that slowness and azimuth station corrections (commonly denoted SASCs) cannot be calibrated, and should not be used, without reference to the frequency band employed. We demonstrate an example where fully-automatic locations based on a source-region specific fixed-parameter template are more stable than the corresponding analyst-reviewed estimates. The reason is that the analyst selects a frequency band and analysis window which appears optimal for each event. In this case, the frequency band which produces the most consistent direction estimates has neither the best SNR nor the greatest beam-gain, and is therefore unlikely to be chosen by an analyst without calibration data.
Statistical baseline assessment in cardiotocography.
Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura
2017-07-01
Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, the baseline (BL) is fundamental to determine fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR±ΔFHR, with ΔFHR=8 bpm or ΔFHR=10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying such procedure to clinical data; and to compare the statistically-determined ΔFHR value against the experimentally-determined ΔFHR values. To these aims, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted of an FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, according to which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically-determined ΔFHR value was 9.7 bpm. This value was statistically different from 8 bpm (P < 10^-19) but not from 10 bpm (P = 0.16). Thus, ΔFHR=10 bpm is preferable over 8 bpm because it is both experimentally and statistically validated.
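A minimal Python sketch of the described pipeline, using non-overlapping rather than sliding windows for brevity; the sampling rate and the NaN coding of missing samples are assumptions:

```python
import numpy as np

def baseline_delta(fhr, fs=4, window_min=20, max_missing=0.10):
    """Split an FHR trace (NaN = missing) into 20-min windows, linearly
    interpolate missing samples, keep windows corrected by less than 10%, and
    return each accepted window's standard deviation as its dFHR estimate.
    Non-overlapping windows are used here for brevity; the paper slides them."""
    n = int(window_min * 60 * fs)
    deltas = []
    for start in range(0, len(fhr) - n + 1, n):
        w = fhr[start:start + n].astype(float)
        missing = np.isnan(w)
        if missing.mean() >= max_missing:
            continue
        idx = np.arange(n)
        w[missing] = np.interp(idx[missing], idx[~missing], w[~missing])
        deltas.append(w.std())
    return np.array(deltas)

rng = np.random.default_rng(0)
trace = rng.normal(140.0, 10.0, size=4 * 3600)   # 1 h of FHR at 4 Hz (toy data)
trace[rng.random(trace.size) < 0.05] = np.nan    # 5% missing samples
print(baseline_delta(trace))                     # three values near 10 bpm
```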
Sensors and Clinical Mastitis—The Quest for the Perfect Alert
Hogeveen, Henk; Kamphuis, Claudia; Steeneveld, Wilma; Mollenhorst, Herman
2010-01-01
When cows on dairy farms are milked with an automatic milking system or in high-capacity milking parlors, clinical mastitis (CM) cannot be adequately detected without sensors. The objective of this paper is to describe the performance demands of sensor systems to detect CM and to evaluate the current performance of these sensor systems. Several detection models based on different sensors were studied in the past. When evaluating these models, three factors are important: performance (in terms of sensitivity and specificity), the time window, and the similarity of the study data with real farm data. A CM detection system should offer at least a sensitivity of 80% and a specificity of 99%. The time window should not be longer than 48 hours and study circumstances should be as similar to practical farm circumstances as possible. The study design should comprise more than one farm for data collection. Since 1992, 16 peer-reviewed papers have been published with a description and evaluation of CM detection models. There is a large variation in the use of sensors and algorithms, which makes the results difficult to compare. There is also a large difference in performance between the detection models, a large variation in the time windows used, and little similarity between study data. Therefore, it is difficult to compare the overall performance of the different CM detection models. The sensitivity and specificity found in the different studies could, for a large part, be explained by differences in the time window used. None of the described studies satisfied the demands for CM detection models. PMID:22163637
Delay Differential Equation Models of Normal and Diseased Electrocardiograms
NASA Astrophysics Data System (ADS)
Lainscsek, Claudia; Sejnowski, Terrence J.
Time series analysis with nonlinear delay differential equations (DDEs) is a powerful tool since it reveals spectral as well as nonlinear properties of the underlying dynamical system. Here global DDE models are used to analyze electrocardiography recordings (ECGs) in order to capture distinguishing features for different heart conditions such as normal heart beat, congestive heart failure, and atrial fibrillation. To capture distinguishing features of the different data types, the number of terms and delays in the model as well as the order of nonlinearity of the DDE model have to be selected. The DDE structure selection is done in a supervised way by selecting the DDE that best separates different data types. We analyzed 24 h of data from 15 young healthy subjects in normal sinus rhythm (NSR), from 15 congestive heart failure (CHF) patients, and from 15 subjects suffering from atrial fibrillation (AF), selected from the Physionet database. For the analysis presented here we used 5 min non-overlapping data windows on the raw data without any artifact removal. For classification performance we used the Cohen kappa coefficient computed directly from the confusion matrix. The overall classification performance for the three groups was around 72-99% on the 5 min windows for the different approaches. For 2 h data windows the classification for all three groups was above 95%.
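Cohen's kappa, used here as the performance measure, is computed directly from the confusion matrix. A small self-contained Python implementation follows; the example matrix is illustrative only, not from the study:

```python
import numpy as np

def cohen_kappa(confusion):
    """Cohen's kappa from a confusion matrix: observed agreement corrected by
    the agreement expected from the row/column marginals alone."""
    c = np.asarray(confusion, dtype=float)
    total = c.sum()
    observed = np.trace(c) / total
    expected = (c.sum(axis=0) * c.sum(axis=1)).sum() / total ** 2
    return (observed - expected) / (1.0 - expected)

# Illustrative 3x3 confusion matrix for NSR / CHF / AF (not the study's data).
print(round(cohen_kappa([[95, 3, 2], [4, 90, 6], [1, 5, 94]]), 3))  # 0.895
```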
Detecting cheaters without thinking: testing the automaticity of the cheater detection module.
Van Lier, Jens; Revlin, Russell; De Neys, Wim
2013-01-01
Evolutionary psychologists have suggested that our brain is composed of evolved mechanisms. One extensively studied mechanism is the cheater detection module. This module would make people very good at detecting cheaters in a social exchange. A vast amount of research has illustrated performance facilitation on social contract selection tasks. This facilitation is attributed to the alleged automatic and isolated operation of the module (i.e., independent of general cognitive capacity). This study, using the selection task, tested the critical automaticity assumption in three experiments. Experiments 1 and 2 established that performance on social contract versions did not depend on cognitive capacity or age. Experiment 3 showed that experimentally burdening cognitive resources with a secondary task had no impact on performance on the social contract version. However, in all experiments, performance on a non-social contract version did depend on available cognitive capacity. Overall, findings validate the automatic and effortless nature of social exchange reasoning.
A new algorithm to detect earthquakes outside the seismic network: preliminary results
NASA Astrophysics Data System (ADS)
Giudicepietro, Flora; Esposito, Antonietta Maria; Ricciolino, Patrizia
2017-04-01
In this paper we present a new technique for detecting earthquakes outside the seismic network, which are often a cause of failure in automatic analysis systems. Our goal is to develop a robust method that provides the discrimination result as quickly as possible. We discriminate local earthquakes from regional earthquakes, both recorded at the SGG station, equipped with short-period sensors and operated by Osservatorio Vesuviano (INGV) in the Southern Apennines (Italy). The technique uses a Multi Layer Perceptron (MLP) neural network with an architecture composed of an input layer, a hidden layer and a single-node output layer. We pre-processed the data using the Linear Predictive Coding (LPC) technique to extract the spectral features of the signals in a compact form. We performed several experiments by shortening the signal window length. In particular, we used windows of 4, 2 and 1 seconds containing the onset of the local and the regional earthquakes. We used a dataset of 103 local earthquakes and 79 regional earthquakes, most of which occurred in Greece, Albania and Crete. We split the dataset into a training set, for the network training, and a testing set to evaluate the network's capacity for discrimination. In order to assess the network's stability, we repeated this procedure six times, randomly changing the data composition of the training and testing sets and the initial weights of the net. We estimated the performance of this method by calculating the average of the correct detection percentages obtained for each of the six permutations. The average performances are 99.02%, 98.04% and 98.53%, concerning respectively the experiments carried out on 4, 2 and 1 second signal windows. The results show that our method is able to recognize the earthquakes outside the seismic network using only the first second of the seismic records, with a suitable percentage of correct detection. Therefore, this algorithm can be profitably used to make earthquake automatic analyses more robust and reliable. Finally, with appropriate tuning, it can be integrated in multi-parametric systems for monitoring high natural risk areas.
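A rough Python sketch of this pipeline on synthetic data: least-squares LPC coefficients serve as the compact spectral features and are fed to a single-hidden-layer MLP (scikit-learn's MLPClassifier as a stand-in for the authors' network). The signal generator, window length, and labels are toy placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def lpc_features(signal, order=10):
    """Least-squares LPC: fit coefficients that predict each sample from the
    previous `order` samples, giving a compact spectral description."""
    X = np.column_stack([signal[order - k - 1:len(signal) - k - 1]
                         for k in range(order)])
    coeffs, *_ = np.linalg.lstsq(X, signal[order:], rcond=None)
    return coeffs

rng = np.random.default_rng(0)

def synth(freq):
    """Toy stand-in for a 1-s seismogram onset window sampled at 100 Hz."""
    t = np.arange(100) / 100.0
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.normal(size=100)

X = np.array([lpc_features(synth(f)) for f in [5] * 50 + [15] * 50])
y = np.array([0] * 50 + [1] * 50)   # 0 = "local", 1 = "regional" (toy labels)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                    random_state=0).fit(X, y)
print(clf.score(X, y))
```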
Precision Departure Release Capability (PDRC) Final Report
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin Brian; Kistler, Matthew Stephen; Gaither, Frank; Juro, Greg
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows that may be subject to constraints that create localized demand/capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool, based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents research results from the PDRC research activity. Companion papers present the Concept of Operations and a Technology Description.
Precision Departure Release Capability (PDRC) Technology Description
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin; Robinson, Corissia; Null, Jody R.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Technology Description. Companion papers include the Final Report and a Concept of Operations.
Precision Departure Release Capability (PDRC): NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Thomas J.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations.
Precision Departure Release Capability (PDRC) Concept of Operations
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Capps, Richard A.; Day, Kevin Brian
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Concept of Operations. Companion papers include the Final Report and a Technology Description.
The Lick-Gaertner automatic measuring system
NASA Technical Reports Server (NTRS)
Vasilevskis, S.; Popov, W. A.
1971-01-01
The Lick-Gaertner automatic equipment has been designed mainly for the measurement of stellar proper motions with reference to galaxies, and consists of two main components: the survey machine and the automatic measuring engine. The survey machine is used for initial inspection and selection of objects for subsequent measurement. Two plates, up to 17 x 17 inches each, are surveyed simultaneously by means of projection on a screen. The approximate positions of objects selected are measured by two optical screws: helical lines cut through an aluminum coating on glass cylinders. These approximate coordinates, with a precision of the order of 0.03 mm, are transmitted to a card punch by encoders connected to the cylinders.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCarroll, R; UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX; Beadle, B
Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse’s Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher performing algorithm was selected as the primary contouring method, the other used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted from 0.5 to 2cm. Using a logit model the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients and the sensitivity and specificity of the models verified. Results: Per physician rating, the multi-atlas method (4.8/5 point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5cm (brain), 0.75cm (mandible, cord), 1cm (brainstem, cochlea), or 1.25cm (parotid), with sensitivity and specificity greater than 0.95. If sensitivity and specificity constraints are reduced to 0.9, detectable shifts of mandible and brainstem were reduced by 0.25cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified. This fully automated process could be used to flag auto-contours for special review or used with safety margins in a fully automatic treatment planning system.
Fromberger, Peter; Jordan, Kirsten; Steinkrauss, Henrike; von Herder, Jakob; Stolpmann, Georg; Kröner-Herwig, Birgit; Müller, Jürgen Leo
2013-05-01
Recent theories in sexuality highlight the importance of automatic and controlled attentional processes in viewing sexually relevant stimuli. The model of Spiering and Everaerd (2007) assumes that sexually relevant features of a stimulus are preattentively selected and automatically induce focal attention to these sexually relevant aspects. Whether this assumption proves true for pedophiles is unknown. It is the aim of this study to test this assumption empirically for people suffering from pedophilic interests. Twenty-two pedophiles, 8 nonpedophilic forensic controls, and 52 healthy controls simultaneously viewed the picture of a child and the picture of an adult while eye movements were measured. Entry time was assessed as a measure of automatic attentional processes and relative fixation time as a measure of controlled attentional processes. Pedophiles demonstrated significantly shorter entry time to child stimuli than to adult stimuli. The opposite was the case for nonpedophiles, as they showed longer relative fixation time for adult stimuli, and, against all expectations, pedophiles also demonstrated longer relative fixation time for adult stimuli. The results confirmed the hypothesis that pedophiles automatically selected sexually relevant stimuli (children). Contrary to all expectations, this automatic selection did not trigger focal attention to these sexually relevant pictures. Furthermore, pedophiles were first and longest attracted by faces and pubic regions of children; nonpedophiles were first and longest attracted by faces and breasts of adults. The results demonstrated, for the first time, that the face and pubic region are the most attracting regions in children for pedophiles.
New DTM Extraction Approach from Airborne Images Derived Dsm
NASA Astrophysics Data System (ADS)
Mousa, Y. A.; Helmholz, P.; Belton, D.
2017-05-01
In this work, a new filtering approach is proposed for fully automatic Digital Terrain Model (DTM) extraction from Digital Surface Models (DSMs) derived from very high resolution airborne images. Our approach enhances the existing Multi-directional and Slope Dependent (MSD) DTM extraction algorithm by proposing parameters that are more reliable for the selection of ground pixels and the pixelwise classification. To achieve this, four main steps are implemented: Firstly, 8 well-distributed scanlines are used to search for minima as ground points within a pre-defined filtering window size. These selected ground points are stored with their positions on a 2D surface to create a network of ground points. Then, an initial DTM is created using an interpolation method to fill the gaps in the 2D surface. Afterwards, a pixel-to-pixel comparison between the initial DTM and the original DSM is performed, utilizing pixelwise classification of ground and non-ground pixels by applying a vertical height threshold. Finally, the pixels classified as non-ground are removed and the remaining holes are filled. The approach is evaluated using the Vaihingen benchmark dataset provided by the ISPRS working group III/4. The evaluation includes the comparison of our approach, denoted as the Network of Ground Points (NGPs) algorithm, with the DTM created based on MSD as well as a reference DTM generated from LiDAR data. The results show that our proposed approach outperforms the MSD approach.
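A compressed Python sketch of the filtering idea, substituting a 2-D minimum filter for the paper's 8 scanline searches; the window size and height threshold are illustrative:

```python
import numpy as np
from scipy import ndimage

def extract_dtm(dsm, window=33, height_threshold=1.5):
    """Simplified version of the filtering idea: local DSM minima act as ground
    points, a smoothed surface interpolates between them, and pixels rising more
    than a threshold above it are classified as non-ground and replaced.
    A 2-D minimum filter stands in for the paper's 8 scanline searches."""
    ground = ndimage.minimum_filter(dsm, size=window)          # candidate ground
    initial_dtm = ndimage.uniform_filter(ground, size=window)  # gap interpolation
    non_ground = (dsm - initial_dtm) > height_threshold
    dtm = np.where(non_ground, initial_dtm, dsm)               # remove and refill
    return dtm, non_ground

dsm = np.zeros((100, 100))
dsm[40:60, 40:60] = 10.0                 # a building on flat terrain
dtm, mask = extract_dtm(dsm)
print(mask.sum() > 0, float(dtm.max()))  # building detected and flattened
```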
Magnetic braking in Solar-type close binaries
NASA Astrophysics Data System (ADS)
Maceroni, C.; Rucinski, S. M.
In tidally locked binaries the angular momentum loss by magnetic braking affects the orbital period. While this effect is too small to be detected in individual systems, its signature can be seen in the shape of the orbital period distribution of suitable samples. As a consequence, information on the braking mechanisms can be obtained - at least in principle - from the analysis of the distributions, the main problems being the selection of a large and homogeneous sample of binaries and the appropriate treatment of the observational biases. New large databases of variable stars are becoming available as by-products of microlensing projects, which have the advantage of joining, for the first time, sample richness and homogeneity. We report the main results of the analysis of the eclipsing binaries in the OGLE-I catalog, which contains several thousand variables detected in a pencil-beam search volume towards Baade's Window. By means of an automatic filtering algorithm we extracted a sample of 74 detached, equal-mass, main-sequence binary stars with short orbital periods (i.e., in the range 0.19 < P < 8 days) and derived, from the presently observed period distribution after correction for selection effects, the expected slope of the braking law. The results suggest an AML braking law very close to the "saturated" one, with a very weak dependence on the period. However, we are still far from constraining the precise value of the slope, because of the important role played by the observational bias.
1990-12-01
[Report excerpt; the table of contents and window-layout figure were garbled in extraction.] The windows of the PBPKSIM program are based upon a common design with five elements: a Title, a Menu Bar, an Information Line, a Main Display Area, and a Status Area. The Title shows the location within the program by supplying the name of the window being executed. The Menu Bar displays the other windows or other …
Automatic selective attention as a function of sensory modality in aging.
Guerreiro, Maria J S; Adam, Jos J; Van Gerven, Pascal W M
2012-03-01
It was recently hypothesized that age-related differences in selective attention depend on sensory modality (Guerreiro, M. J. S., Murphy, D. R., & Van Gerven, P. W. M. (2010). The role of sensory modality in age-related distraction: A critical review and a renewed view. Psychological Bulletin, 136, 975-1022. doi:10.1037/a0020731). So far, this hypothesis has not been tested in automatic selective attention. The current study addressed this issue by investigating age-related differences in automatic spatial cueing effects (i.e., facilitation and inhibition of return [IOR]) across sensory modalities. Thirty younger (mean age = 22.4 years) and 25 older adults (mean age = 68.8 years) performed 4 left-right target localization tasks, involving all combinations of visual and auditory cues and targets. We used stimulus onset asynchronies (SOAs) of 100, 500, 1,000, and 1,500 ms between cue and target. The results showed facilitation (shorter reaction times with valid relative to invalid cues at shorter SOAs) in the unimodal auditory and in both cross-modal tasks but not in the unimodal visual task. In contrast, there was IOR (longer reaction times with valid relative to invalid cues at longer SOAs) in both unimodal tasks but not in either of the cross-modal tasks. Most important, these spatial cueing effects were independent of age. The results suggest that the modality hypothesis of age-related differences in selective attention does not extend into the realm of automatic selective attention.
NASA Astrophysics Data System (ADS)
Yusop, Hanafi M.; Ghazali, M. F.; Yusof, M. F. M.; Remli, M. A. Pi; Kamarulzaman, M. H.
2017-10-01
The analysis of pressure transient signals has recently been shown to be an accurate and low-cost method for leak and feature detection in water distribution systems. Transient phenomena occur due to sudden changes in fluid propagation in a pipeline system, caused by rapid pressure and flow fluctuations from events such as rapidly closing and opening valves or pump failure. In this paper, the feasibility of the Hilbert-Huang transform (HHT) technique for analysing pressure transient signals is presented and discussed. HHT is a way to decompose a signal into intrinsic mode functions (IMFs). However, a drawback of HHT is the difficulty of selecting the suitable IMF for the subsequent post-processing step, the Hilbert transform (HT). This paper shows that applying the integrated kurtosis-based algorithm for z-filter technique (I-Kaz) to the kurtosis ratio (I-Kaz-kurtosis) enables automatic selection of the IMF that should be used. The technique is demonstrated on a 57.90-meter medium high-density polyethylene (MDPE) pipe installed with a single artificial leak. The analysis results using the I-Kaz-kurtosis ratio confirmed that the method can be used for automatic selection of the IMF even when the signal-to-noise ratio is low. Therefore, the I-Kaz-kurtosis ratio method is recommended as a means to implement automatic selection of the IMF for HHT analysis.
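A minimal Python sketch of automatic IMF selection, assuming the PyEMD package for empirical mode decomposition and simplifying the I-Kaz-to-kurtosis criterion to a plain kurtosis score; the test signal is a toy placeholder:

```python
import numpy as np
from scipy.stats import kurtosis
from PyEMD import EMD  # assumes the PyEMD package is installed

def select_imf(signal):
    """Decompose a pressure transient into IMFs and automatically pick the one
    with the highest kurtosis score; impulsive leak reflections raise kurtosis.
    This is a simplified stand-in for the I-Kaz-to-kurtosis-ratio criterion."""
    imfs = EMD().emd(signal)
    scores = [kurtosis(imf) for imf in imfs]
    return imfs[int(np.argmax(scores))], scores

t = np.linspace(0.0, 1.0, 2000)
sig = np.sin(2 * np.pi * 5 * t) + 0.5 * (np.abs(t - 0.5) < 0.005)  # wave + spike
best_imf, scores = select_imf(sig)
print(np.argmax(scores), [round(s, 1) for s in scores])
```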
Hirose, Tomoaki; Igami, Tsuyoshi; Koga, Kusuto; Hayashi, Yuichiro; Ebata, Tomoki; Yokoyama, Yukihiro; Sugawara, Gen; Mizuno, Takashi; Yamaguchi, Junpei; Mori, Kensaku; Nagino, Masato
2017-03-01
Fusion angiography using reconstructed multidetector-row computed tomography (MDCT) images and cholangiography using reconstructed MDCT images with a cholangiographic agent include an anatomical gap due to the different periods of MDCT scanning. To overcome such gaps, we attempted to develop a cholangiography procedure that automatically reconstructs a cholangiogram from portal-phase MDCT images. The automatically produced cholangiography procedure utilized an original software program that was developed by the Graduate School of Information Science, Nagoya University. This program constructed 5 candidate biliary tracts and automatically selected one as the candidate for cholangiography. The clinical value of the automatically produced cholangiography procedure was estimated based on a comparison with manually produced cholangiography. Automatically produced cholangiograms were reconstructed for 20 patients who underwent MDCT scanning before biliary drainage for distal biliary obstruction. The procedure was able to extract the 5 main biliary branches and the 21 subsegmental biliary branches in 55 and 25% of the cases, respectively. The extent of aberrant connections and aberrant extractions outside the biliary tract was acceptable. Among all of the cholangiograms, 5 were clinically applied with no correction, 8 were applied with modest improvements, and 3 produced a correct cholangiogram prior to automatic selection. Although our procedure requires further improvement based on the analysis of additional patient data, it may represent an alternative to direct cholangiography in the future.
Endpoint in plasma etch process using new modified w-multivariate charts and windowed regression
NASA Astrophysics Data System (ADS)
Zakour, Sihem Ben; Taleb, Hassen
2017-09-01
Endpoint detection is a crucial undertaking for understanding and determining whether a plasma etching process has completed correctly, especially when the etched area is very small (0.1%). It is an essential part of delivering repeatable results on every single wafer. When the film being etched has been completely cleared, the endpoint is reached. To ensure the desired device performance of the produced integrated circuit, an optical emission spectroscopy (OES) sensor is employed. The huge number of gathered wavelengths (profiles) is then analyzed and pre-processed using a newly proposed simple algorithm named Spectra Peak Selection (SPS) to select the important wavelengths; we then employ wavelet analysis (WA) to enhance detection performance by suppressing noise and redundant information. The selected and treated OES wavelengths are then used in modified multivariate control charts (MEWMA and Hotelling) for three statistics (mean, SD and CV) and in windowed polynomial regression for the mean. The use of the three aforementioned statistics is motivated by the need to control a mean shift, a variance shift, and their ratio (CV) when both mean and SD are unstable. The control charts show their performance in detecting the endpoint, especially the W-mean Hotelling chart, whereas the worst result is given by the CV statistic. As the best detection of the endpoint is given by the W-Hotelling mean statistic, this statistic is used to construct a windowed wavelet Hotelling polynomial regression. The latter can only identify the window containing the endpoint phenomenon.
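As a simplified stand-in for the modified multivariate charts, the univariate EWMA chart below flags the first OES frame whose smoothed statistic leaves the control limits; the in-control calibration span, smoothing weight, and limit width are illustrative choices, not the paper's tuning:

```python
import numpy as np

def ewma_endpoint(statistic, lam=0.2, L=3.0, calibration=50):
    """EWMA chart on a per-frame OES statistic: the first frame whose smoothed
    value leaves the time-varying control limits is declared the endpoint."""
    mu = statistic[:calibration].mean()      # in-control mean
    sigma = statistic[:calibration].std()    # in-control spread
    z = mu
    for t, x in enumerate(statistic):
        z = lam * x + (1.0 - lam) * z
        width = sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
        if abs(z - mu) > L * width:
            return t
    return None

rng = np.random.default_rng(0)
frames = np.r_[rng.normal(0, 1, 120), rng.normal(3, 1, 30)]  # shift at frame 120
print(ewma_endpoint(frames))  # detected a few frames after 120
```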
Operation and control software for APNEA
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClelland, J.H.; Storm, B.H. Jr.; Ahearn, J.
1997-11-01
The human interface software for the Lockheed Martin Specialty Components (LMSC) Active/Passive Neutron Examination & Analysis System (APNEA) provides a user-friendly operating environment for the movement and analysis of waste drums. It is written in Microsoft Visual C++ on a Windows NT platform. Object-oriented and multitasking techniques are used extensively to maximize the capability of the system. A waste drum is placed on a loading platform with a fork lift and then automatically moved into the APNEA chamber in preparation for analysis. A series of measurements is performed, controlled by menu commands to hardware components attached as peripheral devices, in order to create data files for analysis. The analysis routines use the files to identify the pertinent radioactive characteristics of the drum, including the type, location, and quantity of fissionable material. At the completion of the measurement process, the drum is automatically unloaded and the data are archived in preparation for storage as part of the drum's data signature. 3 figs.
A Patch-Based Approach for the Segmentation of Pathologies: Application to Glioma Labelling.
Cordier, Nicolas; Delingette, Herve; Ayache, Nicholas
2016-04-01
In this paper, we describe a novel and generic approach to address fully-automatic segmentation of brain tumors by using multi-atlas patch-based voting techniques. In addition to avoiding the local search window assumption, the conventional patch-based framework is enhanced through several simple procedures: an improvement of the training dataset in terms of both label purity and intensity statistics, augmented features to implicitly guide the nearest-neighbor search, multi-scale patches, invariance to cube isometries, and stratification of the votes with respect to cases and labels. A probabilistic model automatically delineates regions of interest enclosing high-probability tumor volumes, which allows the algorithm to achieve highly competitive running time despite minimal processing power and resources. This method was evaluated on the Multimodal Brain Tumor Image Segmentation challenge datasets. State-of-the-art results are achieved, with a limited learning stage thus restricting the risk of overfitting. Moreover, segmentation smoothness does not involve any post-processing.
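The core voting step can be sketched briefly. The following Python function performs brute-force k-nearest-neighbor label fusion over flattened patches; the augmented features, multi-scale patches, isometry invariance, and vote stratification described above are omitted, and the demo data are toy placeholders:

```python
import numpy as np

def patch_vote(target_patches, atlas_patches, atlas_labels, k=5):
    """Assign each target patch the majority label of its k most similar atlas
    patches (brute-force Euclidean search over flattened patches)."""
    labels = []
    for p in target_patches:
        d = np.linalg.norm(atlas_patches - p, axis=1)   # patch similarity
        nearest = atlas_labels[np.argsort(d)[:k]]
        labels.append(np.bincount(nearest).argmax())    # majority vote
    return np.array(labels)

rng = np.random.default_rng(0)
atlas = rng.normal(size=(200, 27))              # 3x3x3 patches, flattened
labels = (atlas.mean(axis=1) > 0).astype(int)   # toy labels: 0 = healthy, 1 = tumor
target = rng.normal(size=(10, 27))
print(patch_vote(target, atlas, labels))
```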
TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information
Struck, Torsten H
2014-01-01
Phylogenies of species or genes are commonplace nowadays in many areas of comparative biological studies. However, phylogenetic reconstructions must contend with artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals might mislead the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, there has been no program allowing the detection of such effects in combination with an implementation into automatic process pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests), which utilize tree-based information like nodal support or patristic distances (PDs) to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions, and being command-line driven, it can be integrated into automatic process pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118
Kauppi, Jukka-Pekka; Martikainen, Kalle; Ruotsalainen, Ulla
2010-12-01
The central purpose of passive signal intercept receivers is to perform automatic categorization of unknown radar signals. Currently, there is an urgent need to develop intelligent classification algorithms for these devices due to the emerging complexity of radar waveforms. Especially multifunction radars (MFRs), capable of performing several simultaneous tasks by utilizing complex, dynamically varying scheduled waveforms, are a major challenge for automatic pattern classification systems. To assist recognition of complex radar emissions in modern intercept receivers, we have developed a novel method to recognize dynamically varying pulse repetition interval (PRI) modulation patterns emitted by MFRs. We use robust feature extraction and classifier design techniques to assist recognition in unpredictable real-world signal environments. We classify received pulse trains hierarchically, which allows unambiguous detection of the subpatterns using a sliding window. Accuracy, robustness and reliability of the technique are demonstrated with extensive simulations using both static and dynamically varying PRI modulation patterns.
SU-F-J-200: An Improved Method for Event Selection in Compton Camera Imaging for Particle Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackin, D; Beddar, S; Polf, J
2016-06-15
Purpose: The uncertainty in the beam range in particle therapy limits the conformality of the dose distributions. Compton scatter cameras (CC), which measure the prompt gamma rays produced by nuclear interactions in the patient tissue, can reduce this uncertainty by producing 3D images confirming the particle beam range and dose delivery. However, the high intensity and short time windows of the particle beams limit the number of gammas detected. We attempt to address this problem by developing a method for filtering gamma ray scattering events from the background by applying the known gamma ray spectrum. Methods: We used a 4-stage Compton camera to record in list mode the energy deposition and scatter positions of gammas from a Co-60 source. Each CC stage contained a 4×4 array of CdZnTe crystals. To produce images, we used a back-projection algorithm and four filtering methods: basic, energy windowing, delta energy (ΔE), and delta scattering angle (Δθ). Basic filtering requires events to be physically consistent. Energy windowing requires event energy to fall within a defined range. ΔE filtering selects events with the minimum difference between the measured and a known gamma energy (1.17 and 1.33 MeV for Co-60). Δθ filtering selects events with the minimum difference between the measured scattering angle and the angle corresponding to a known gamma energy. Results: Energy window filtering reduced the FWHM from 197.8 mm for basic filtering to 78.3 mm. ΔE and Δθ filtering achieved the best results, with FWHMs of 64.3 and 55.6 mm, respectively. In general, Δθ filtering selected events with scattering angles < 40°, while ΔE filtering selected events with angles > 60°. Conclusion: Filtering CC events improved the quality and resolution of the corresponding images. ΔE and Δθ filtering produced similar results, but each favored different events.
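As a concrete illustration of the ΔE criterion, the sketch below keeps the events whose summed deposited energy lies closest to a known Co-60 line. The quantile-based cutoff and all names are assumptions for illustration, not the authors' list-mode pipeline.

```python
import numpy as np

CO60_LINES = np.array([1.17, 1.33])  # known Co-60 gamma energies (MeV)

def delta_e_filter(event_energies, keep_fraction=0.5):
    """Keep the events whose total deposited energy is closest to a
    known source line (the delta-E criterion described above)."""
    # distance from each event's summed energy to the nearest known line
    d = np.min(np.abs(event_energies[:, None] - CO60_LINES[None, :]), axis=1)
    cutoff = np.quantile(d, keep_fraction)   # keep the best-matching fraction
    return event_energies[d <= cutoff], d <= cutoff

# toy usage: 1000 simulated event energies with a uniform background
rng = np.random.default_rng(1)
e = np.concatenate([rng.normal(1.25, 0.15, 700), rng.uniform(0.3, 2.0, 300)])
kept, mask = delta_e_filter(e)
print(f"kept {mask.sum()} of {e.size} events")
```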
Automatic transducer switching provides accurate wide range measurement of pressure differential
NASA Technical Reports Server (NTRS)
Yoder, S. K.
1967-01-01
Automatic pressure transducer switching network sequentially selects any one of a number of limited-range transducers as gas pressure rises or falls, extending the range of measurement and lessening the chances of damage due to high pressure.
iPat: intelligent prediction and association tool for genomic research.
Chen, Chunpeng James; Zhang, Zhiwu
2018-06-01
The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which are the primary method for identifying genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and the tools' zero tolerance for deviations in data formats and for mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third-party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for the specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.
NASA Astrophysics Data System (ADS)
Zhou, Peng; Zhang, Xi; Sun, Weifeng; Dai, Yongshou; Wan, Yong
2018-01-01
An algorithm based on time-frequency analysis is proposed to select an imaging time window for the inverse synthetic aperture radar imaging of ships. An appropriate range bin is selected to perform the time-frequency analysis after radial motion compensation. The selected range bin is the one with the maximum mean amplitude among the range bins whose echoes are confirmed to be contributed by a dominant scatterer. The criterion for judging whether the echoes of a range bin are contributed by a dominant scatterer is key to the proposed algorithm and is therefore described in detail. When the first range bin that satisfies the judgment criterion is found, a sequence composed of the frequencies that have the largest amplitudes in each moment's time-frequency spectrum corresponding to this range bin is employed to calculate the length and the center moment of the optimal imaging time window. Experiments performed with simulated and real data show the effectiveness of the proposed algorithm, and comparisons between the proposed algorithm and the image-contrast-based algorithm (ICBA) are provided. Similar image contrast and lower entropy are obtained using the proposed algorithm compared with the ICBA.
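A simplified version of the window selection can be sketched as follows: pick the range bin with the largest mean amplitude (a crude stand-in for the paper's dominant-scatterer test), track the strongest frequency per time frame in its short-time spectrum, and take the flattest stretch of that track as the imaging window. All parameters and the toy signal are assumptions.

```python
import numpy as np
from scipy.signal import stft

def imaging_window(range_profiles, fs=1.0, nperseg=64, win_frames=8):
    """Select an imaging time window from the range bin with the largest
    mean amplitude (a stand-in for the dominant-scatterer criterion)."""
    # range_profiles: complex array of shape (n_pulses, n_range_bins)
    bin_idx = np.argmax(np.abs(range_profiles).mean(axis=0))
    f, t, Z = stft(range_profiles[:, bin_idx], fs=fs, nperseg=nperseg)
    peak_f = f[np.argmax(np.abs(Z), axis=0)]  # strongest frequency per frame
    # the flattest stretch of the peak-frequency track marks the most
    # stable rotation rate, i.e. the best-focused imaging interval
    var = [np.var(peak_f[i:i + win_frames])
           for i in range(len(peak_f) - win_frames + 1)]
    start = int(np.argmin(var))
    return t[start], t[start + win_frames - 1]

# toy usage: one dominant scatterer with a slowly drifting Doppler
rng = np.random.default_rng(2)
n = 512
phase = np.cumsum(0.2 + 0.05 * np.sin(np.linspace(0, 3, n)))
sig = np.exp(1j * 2 * np.pi * phase)[:, None] + 0.1 * rng.normal(size=(n, 4))
print(imaging_window(sig))
```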
Mento, Giovanni
2017-12-01
A main distinction has been proposed between voluntary and automatic mechanisms underlying temporal orienting (TO) of selective attention. Voluntary TO implies the endogenous directing of attention induced by symbolic cues. Conversely, automatic TO is exogenously instantiated by the physical properties of stimuli. A well-known example of automatic TO is sequential effects (SEs), which refer to the adjustments in participants' behavioral performance as a function of the trial-by-trial sequential distribution of the foreperiod between two stimuli. In this study, a group of healthy adults underwent a cued reaction time task purposely designed to assess both voluntary and automatic TO. During the task, both post-cue and post-target event-related potentials (ERPs) were recorded by means of a high-spatial-resolution EEG system. In the post-cue analysis, the P3a and P3b were identified as two distinct ERP markers showing distinguishable spatiotemporal features and reflecting automatic and voluntary a priori expectancy generation, respectively. Brain source reconstruction further revealed that distinct cortical circuits supported these two temporally dissociable components: the voluntary P3b was supported by a left sensorimotor network, while the automatic P3a was generated by a more distributed frontoparietal circuit. Additionally, post-cue contingent negative variation (CNV) and post-target P3 modulations were observed as common markers of voluntary and automatic expectancy implementation and response selection, although partially dissociable neural networks subserved these two mechanisms. Overall, these results provide new electrophysiological evidence that distinct neural substrates can be recruited depending on the voluntary or automatic nature of the mechanisms subserving TO. Copyright © 2017 Elsevier Ltd. All rights reserved.
Advances in Global Adjoint Tomography - Data Assimilation and Inversion Strategy
NASA Astrophysics Data System (ADS)
Ruan, Y.; Lei, W.; Lefebvre, M. P.; Modrak, R. T.; Smith, J. A.; Bozdag, E.; Tromp, J.
2016-12-01
Seismic tomography provides the most direct way to understand Earth's interior by imaging elastic heterogeneity, anisotropy and anelasticity. Resolving the fine structure of these properties requires accurate simulations of seismic wave propagation in complex 3-D Earth models. On the supercomputer "Titan" at Oak Ridge National Laboratory, we are employing a spectral-element method (Komatitsch & Tromp 1999, 2002) in combination with an adjoint method (Tromp et al., 2005) to accurately calculate theoretical seismograms and Frechet derivatives. Using 253 carefully selected events, Bozdag et al. (2016) iteratively determined a transversely isotropic earth model (GLAD_M15) using 15 preconditioned conjugate-gradient iterations. To obtain higher-resolution images of the mantle, we have expanded our database to more than 4,220 Mw 5.0-7.0 events that occurred between 1995 and 2014. Instead of using the entire database all at once, we choose to draw subsets of about 1,000 events from our database for each iteration to achieve a faster convergence rate with limited computing resources. To provide good coverage of deep structures, we selected approximately 700 deep and intermediate-depth earthquakes and 300 shallow events to start a new iteration. We reinverted the CMT solutions of these events in the latest model and recalculated synthetic seismograms. Using the synthetics as reference seismograms, we selected time windows that show good agreement with data and made measurements within the windows. From the measurements we further assess the overall quality of each event and station, and exclude bad measurements based upon certain criteria. So far, with very conservative criteria, we have assimilated more than 8.0 million windows from 1,000 earthquakes in three period bands for the new iteration. For subsequent iterations, we will change the period bands and window-selection criteria to include more windows. In the inversion, dense array data (e.g., USArray) usually dominate model updates. To better handle this issue, we introduced weighting of stations and events based upon their relative distances and showed that the contribution from dense arrays is better balanced in the Frechet derivatives. We will present a summary of this form of data assimilation and preliminary results of the first few iterations.
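The station weighting mentioned above can be illustrated with a simple density-based scheme: each station is down-weighted by a distance-smoothed count of its neighbours, so dense arrays such as USArray no longer dominate. The Gaussian kernel and the 200 km scale are illustrative assumptions, not the authors' exact weighting.

```python
import numpy as np

def density_weights(lons, lats, scale_km=200.0):
    """Down-weight stations in dense arrays: each weight is the inverse
    of a distance-smoothed neighbour count (an assumed form of the
    relative-distance weighting described above)."""
    R = 6371.0
    lon, lat = np.radians(lons), np.radians(lats)
    # great-circle distances between all station pairs (haversine)
    dlat = lat[:, None] - lat[None, :]
    dlon = lon[:, None] - lon[None, :]
    a = (np.sin(dlat / 2) ** 2
         + np.cos(lat[:, None]) * np.cos(lat[None, :]) * np.sin(dlon / 2) ** 2)
    d = 2 * R * np.arcsin(np.sqrt(np.clip(a, 0, 1)))
    density = np.exp(-(d / scale_km) ** 2).sum(axis=1)  # includes self
    w = 1.0 / density
    return w / w.sum()

# toy usage: a dense four-station cluster plus two isolated stations
lons = np.array([-100.0, -100.1, -99.9, -100.05, -60.0, 20.0])
lats = np.array([40.0, 40.1, 39.9, 40.05, -10.0, 50.0])
print(np.round(density_weights(lons, lats), 3))
```

The isolated stations receive noticeably larger weights than the clustered ones, which is the intended balancing effect.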
Residual motion compensation in ECG-gated interventional cardiac vasculature reconstruction
NASA Astrophysics Data System (ADS)
Schwemmer, C.; Rohkohl, C.; Lauritsch, G.; Müller, K.; Hornegger, J.
2013-06-01
Three-dimensional reconstruction of cardiac vasculature from angiographic C-arm CT (rotational angiography) data is a major challenge. Motion artefacts corrupt image quality, reducing usability for diagnosis and guidance. Many state-of-the-art approaches depend on retrospective ECG-gating of projection data for image reconstruction. A trade-off has to be made regarding the size of the ECG-gating window. A large temporal window is desirable to avoid undersampling. However, residual motion will occur in a large window, causing motion artefacts. We present an algorithm to correct for residual motion. Our approach is based on a deformable 2D-2D registration between the forward projection of an initial, ECG-gated reconstruction, and the original projection data. The approach is fully automatic and does not require any complex segmentation of vasculature, or landmarks. The estimated motion is compensated for during the backprojection step of a subsequent reconstruction. We evaluated the method using the publicly available CAVAREV platform and on six human clinical datasets. We found a better visibility of structure, reduced motion artefacts, and increased sharpness of the vessels in the compensated reconstructions compared to the initial reconstructions. At the time of writing, our algorithm outperforms the leading result of the CAVAREV ranking list. For the clinical datasets, we found an average reduction of motion artefacts by 13 ± 6%. Vessel sharpness was improved by 25 ± 12% on average.
A sentence sliding window approach to extract protein annotations from biomedical articles
Krallinger, Martin; Padron, Maria; Valencia, Alfonso
2005-01-01
Background Within the emerging field of text mining and statistical natural language processing (NLP) applied to biomedical articles, a broad variety of techniques have been developed during the past years. Nevertheless, there is still a great need for comparative assessment of the performance of the proposed methods and for the development of common evaluation criteria. This issue was addressed by the Critical Assessment of Text Mining Methods in Molecular Biology (BioCreative) contest. The aim of this contest was to assess the performance of text mining systems applied to biomedical texts, including tools which recognize named entities such as genes and proteins, and tools which automatically extract protein annotations. Results The "sentence sliding window" approach proposed here was found to efficiently extract text fragments from full-text articles containing annotations on proteins, providing the highest number of correctly predicted annotations. Moreover, the number of correct extractions of individual entities (i.e. proteins and GO terms) involved in the relationships used for the annotations was significantly higher than the correct extractions of the complete annotations (protein-function relations). Conclusion We explored the use of sentence sliding windows for information extraction, especially in a context where conventional training data are unavailable. The combination of our approach with more refined statistical estimators and machine learning techniques might be a way to improve annotation extraction for future biomedical text mining applications. PMID:15960831
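The window construction itself is simple to sketch: split an article into sentences and emit overlapping groups of consecutive sentences as candidate fragments. The naive regex splitter and all names below are assumptions, not the BioCreative system.

```python
import re

def sliding_sentence_windows(text, width=3, step=1):
    """Yield overlapping windows of consecutive sentences, the unit used
    above to score text fragments for candidate protein annotations."""
    # naive sentence splitter; a real system would use a trained tokenizer
    sents = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    for i in range(0, max(1, len(sents) - width + 1), step):
        yield " ".join(sents[i:i + width])

text = ("P53 binds DNA. It regulates the cell cycle. "
        "Mutations are common in tumours. GO:0006915 is apoptosis.")
for w in sliding_sentence_windows(text, width=2):
    print(w)
```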
Zhu, Jingbo; Liu, Baoyue; Shan, Shibo; Ding, Yanl; Kou, Zinong; Xiao, Wei
2015-08-01
To meet the need for efficient purification of products from natural resources, this paper developed an automatic vacuum liquid chromatographic device (AUTO-VLC) and applied it to the component separation of petroleum ether extracts of Schisandra chinensis (Turcz.) Baill. The device comprised a solvent system, a 10-position distribution valve, a 3-position change valve, dynamic axial compression chromatographic columns with three diameters, and a 10-position fraction valve. A programmable logic controller (PLC), the S7-200, was adopted to realize automatic control and monitoring of mobile phase changing, column selection, separation time setting and fraction collection. The separation results showed that six fractions (S1-S6) of different chemical components from 100 g of Schisandra chinensis (Turcz.) Baill. petroleum ether phase were obtained by the AUTO-VLC with a 150 mm diameter dynamic axial compression chromatographic column. A new method for screening the VLC separation parameters by multiple-development TLC was developed and confirmed. The initial mobile phase of the AUTO-VLC was selected by requiring the Rf of all the target compounds to range from 0 to 0.45 in the first development on the TLC; the gradient elution ratio was selected according to the k value (the slope of the linear function between Rf value and number of developments on the TLC) and the resolution of the target compounds; the number of elutions (n) was calculated by the formula n ≈ ΔRf/k. A total of four compounds with purities above 85% and 13 other components were separated from S5 under the selected conditions in only 17 h. Therefore, the development of the automatic VLC and its method are significant for the automatic and systematic separation of traditional Chinese medicines.
NASA Astrophysics Data System (ADS)
Durocher, M.; Mostofi Zadeh, S.; Burn, D. H.; Ashkar, F.
2017-12-01
Floods are one of the most costly hazards, and frequency analysis of river discharges is an important part of the tools at our disposal to evaluate their inherent risks and to provide an adequate response. In comparison to the common examination of annual streamflow maxima, peaks over threshold (POT) is an interesting alternative that makes better use of the available information by including more than one flood event per year (on average). However, a major challenge is the selection of a satisfactory threshold above which peaks are assumed to respect certain conditions necessary for an adequate estimation of the risk. Additionally, studies have shown that POT is also a valuable approach for investigating the evolution of flood regimes in the context of climate change. Recently, automatic procedures for threshold selection were suggested to guide this important choice, which otherwise relies on graphical tools and expert judgment. Furthermore, having an objective automatic procedure allows the analysis to be quickly repeated on a large number of samples, which is useful in the context of large databases or for uncertainty analysis based on resampling. This study investigates the impact of such procedures in a case study including many sites across Canada. A simulation study is conducted to evaluate the bias and predictive power of the automatic procedures in similar conditions, as well as the power of derived nonstationarity tests. The results are also evaluated in the light of expert judgments established in a previous study. Ultimately, this study provides a thorough examination of the considerations that need to be addressed when conducting POT analysis using automatic threshold selection.
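One family of automatic threshold procedures fits a generalized Pareto distribution (GPD) to the exceedances over a grid of candidate thresholds and looks for stability of the estimated shape parameter. The sketch below implements that idea in simplified form; the quantile grid and the stability rule are assumptions, not the specific procedures evaluated in this study.

```python
import numpy as np
from scipy import stats

def auto_threshold(flows, quantiles=np.arange(0.80, 0.99, 0.01)):
    """Fit a GPD to exceedances over a grid of candidate thresholds and
    pick the threshold where the shape estimate is most stable."""
    thresholds = np.quantile(flows, quantiles)
    shapes = []
    for u in thresholds:
        exc = flows[flows > u] - u
        c, loc, scale = stats.genpareto.fit(exc, floc=0.0)
        shapes.append(c)
    shapes = np.array(shapes)
    # stability: smallest spread of the shape over the next few thresholds
    spread = [np.ptp(shapes[i:i + 4]) for i in range(len(shapes) - 3)]
    return thresholds[int(np.argmin(spread))]

# toy usage: synthetic daily flows with a heavy-tailed flood regime
rng = np.random.default_rng(3)
flows = rng.gamma(2.0, 50.0, size=5000)
print(round(auto_threshold(flows), 1))
```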
Ilunga-Mbuyamba, Elisee; Avina-Cervantes, Juan Gabriel; Cepeda-Negrete, Jonathan; Ibarra-Manzano, Mario Alberto; Chalopin, Claire
2017-12-01
Brain tumor segmentation is a routine process in a clinical setting and provides useful information for diagnosis and treatment planning. Manual segmentation, performed by physicians or radiologists, is a time-consuming task due to the large quantity of medical data generated presently. Hence, automatic segmentation methods are needed, and several approaches have been introduced in recent years, including Localized Region-based Active Contour Models (LRACM). Many LRACM are popular, but each of them has strengths and weaknesses. In this paper, the automatic selection of an LRACM based on image content and its application to brain tumor segmentation are presented. Specifically, a framework to select one of three LRACM, i.e., Local Gaussian Distribution Fitting (LGDF), localized Chan-Vese (C-V) and the Localized Active Contour Model with Background Intensity Compensation (LACM-BIC), is proposed. Twelve visual features are extracted to properly select the method that should process a given input image. The system is based on a supervised approach. Applied specifically to Magnetic Resonance Imaging (MRI) images, the experiments showed that the proposed system is able to correctly select the suitable LRACM to handle a specific image. Consequently, the selection framework achieves better accuracy than any of the three LRACM separately. Copyright © 2017 Elsevier Ltd. All rights reserved.
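A minimal version of such a supervised selector can be sketched as follows. The random-forest classifier and the synthetic 12-feature training data are assumptions: the paper specifies only a supervised approach, so the details here are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# hypothetical training set: 12 visual features per MRI image and the
# label of the LRACM (0=LGDF, 1=C-V, 2=LACM-BIC) that segmented it best
rng = np.random.default_rng(4)
X_train = rng.normal(size=(300, 12))
y_train = rng.integers(0, 3, size=300)

METHODS = {0: "LGDF", 1: "localized Chan-Vese", 2: "LACM-BIC"}

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

def select_lracm(features):
    """Return the name of the model predicted to suit this image."""
    return METHODS[int(clf.predict(features.reshape(1, -1))[0])]

print(select_lracm(rng.normal(size=12)))
```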
IDG - INTERACTIVE DIF GENERATOR
NASA Technical Reports Server (NTRS)
Preheim, L. E.
1994-01-01
The Interactive DIF Generator (IDG) utility is a tool used to generate and manipulate Directory Interchange Format files (DIF). Its purpose as a specialized text editor is to create and update DIF files which can be sent to NASA's Master Directory, also referred to as the International Global Change Directory at Goddard. Many government and university data systems use the Master Directory to advertise the availability of research data. The IDG interface consists of a set of four windows: (1) the IDG main window; (2) a text editing window; (3) a text formatting and validation window; and (4) a file viewing window. The IDG main window starts up the other windows and contains a list of valid keywords. The keywords are loaded from a user-designated file and selected keywords can be copied into any active editing window. Once activated, the editing window designates the file to be edited. Upon switching from the editing window to the formatting and validation window, the user has options for making simple changes to one or more files such as inserting tabs, aligning fields, and indenting groups. The viewing window is a scrollable read-only window that allows fast viewing of any text file. IDG is an interactive tool and requires a mouse or a trackball to operate. IDG uses the X Window System to build and manage its interactive forms, and also uses the Motif widget set and runs under Sun UNIX. IDG is written in C-language for Sun computers running SunOS. This package requires the X Window System, Version 11 Revision 4, with OSF/Motif 1.1. IDG requires 1.8Mb of hard disk space. The standard distribution medium for IDG is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The program was developed in 1991 and is a copyrighted work with all copyright vested in NASA. SunOS is a trademark of Sun Microsystems, Inc. X Window System is a trademark of Massachusetts Institute of Technology. OSF/Motif is a trademark of the Open Software Foundation, Inc. UNIX is a trademark of Bell Laboratories.
Surgical anatomy of the round window-Implications for cochlear implantation.
Luers, J C; Hüttenbrink, K B; Beutner, D
2018-04-01
The round window is an important portal for the application of active hearing aids and cochlear implants. Anatomical and topographical knowledge of the round window region is a prerequisite for successful insertion of a cochlear implant electrode. To sum up current knowledge about round window anatomy and to advise the cochlear implant surgeon on optimal placement of an electrode. Systematic Medline search. Search term "round window[Title]" with no date restriction. Only publications in the English language were included. All abstracts were screened for relevance, that is, a focus on the surgical anatomy of the round window. The search results were supplemented with hand searching of selected reviews and reference lists from included studies. Subjective assessment. There is substantial variability in the size and shape of the round window. The round window is regarded as the most reliable surgical landmark to safely locate the scala tympani. Factors affecting the optimal trajectory line for atraumatic electrode insertion are the anatomy of the round window, the anatomy of the intracochlear hook region and the variable orientation and size of the cochlea's basal turn. The very close relation to the sensitive inner ear structures necessitates thorough anatomical knowledge and a careful insertion technique, especially when implanting patients with residual hearing. In order to avoid electrode migration between the scalae and to protect the modiolus and the basilar membrane, it is recommended to aim for an electrode insertion vector from postero-superior to antero-inferior. © 2017 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Yang, Honggang; Lin, Huibin; Ding, Kang
2018-05-01
In rolling bearing diagnosis, the performance of sparse feature extraction by the commonly used K-Singular Value Decomposition (K-SVD) method depends largely on the signal segment selected; furthermore, the computation is relatively slow and the dictionary becomes highly redundant when the fault signal is long. A new sliding window denoising K-SVD (SWD-KSVD) method is proposed, which uses only one small segment of the time-domain signal containing impacts to perform sliding-window dictionary learning and selects an optimal pattern carrying the oscillating information of the rolling bearing fault according to a maximum variance principle. An inner product operation between the optimal pattern and the whole fault signal is performed to enhance the characteristics at the impacts' occurrence moments. Lastly, the signal is reconstructed at the peak points of the inner product to realize the extraction of the rolling bearing fault features. Both simulations and experiments verify that the method can extract the fault features effectively.
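The core idea, choosing the maximum-variance sliding window as the impact pattern and enhancing impacts via inner products with the whole record, can be sketched without the K-SVD dictionary-learning stage. In the sketch below the learned atom is replaced by the raw maximum-variance segment; everything else is an illustrative assumption.

```python
import numpy as np

def detect_impacts(signal, pattern_len=64):
    """Pick the sliding-window segment with maximum variance as the
    impact 'pattern', then correlate it against the whole record to
    enhance the impact occurrence moments (a simple surrogate for the
    SWD-KSVD dictionary-learning step)."""
    vars_ = np.array([signal[i:i + pattern_len].var()
                      for i in range(len(signal) - pattern_len + 1)])
    start = int(np.argmax(vars_))          # max-variance window ~ impacts
    pattern = signal[start:start + pattern_len]
    pattern = pattern / np.linalg.norm(pattern)
    # inner product of the pattern with every window of the signal
    return np.abs(np.correlate(signal, pattern, mode="same"))

# toy usage: periodic decaying impacts buried in noise
rng = np.random.default_rng(5)
sig = 0.3 * rng.normal(size=4096)
for k in range(0, 4096, 512):              # an impact every 512 samples
    sig[k:k + 64] += np.exp(-0.1 * np.arange(64)) * np.sin(0.9 * np.arange(64))
score = detect_impacts(sig)
print(np.argsort(score)[-8:])              # peaks fall near the impact times
```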
Fast Human Detection for Intelligent Monitoring Using Surveillance Visible Sensors
Ko, Byoung Chul; Jeong, Mira; Nam, JaeYeal
2014-01-01
Human detection using visible surveillance sensors is an important and challenging task for intruder detection and safety management. The biggest barrier to real-time human detection is the computational time required for dense image scaling and for scanning the windows extracted from an entire image. This paper proposes fast human detection by selecting optimal levels of image scale using each level's adaptive region-of-interest (ROI). To estimate the image-scaling level, we generate a Hough windows map (HWM) and select a few optimal image scales based on the strength of the HWM and a divide-and-conquer algorithm. Furthermore, adaptive ROIs are arranged per image scale to provide a different search area. We employ a cascade random forests classifier to separate candidate windows into human and nonhuman classes. The proposed algorithm has been successfully applied to real-world surveillance video sequences, and its detection accuracy and computational speed show better performance than those of other related methods. PMID:25393782
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Sallal, K.A.
1999-07-01
The study aims to explore the effect of different climates on window and skylight design in residential buildings. The study house is evaluated against climates that have design opportunities for passive systems, with emphasis on passive cooling. The study applies a variety of methods to evaluate the design. It has found that earth sheltering and night ventilation have the potential to provide 12--29% and 25--77% of the cooling requirements, respectively, for the study house in the selected climates. The reduction of the glazing area from 174 ft{sup 2} to 115 ft{sup 2} has different impacts on the cooling energy cost in the different climates. In climates such as Fresno and Tucson, one should treat cooling energy savings as a priority for window design, particularly when determining the window size. In other climates, such as Albuquerque, the priority of window design should first be given to heating savings requirements.
Writers Identification Based on Multiple Windows Features Mining
NASA Astrophysics Data System (ADS)
Fadhil, Murad Saadi; Alkawaz, Mohammed Hazim; Rehman, Amjad; Saba, Tanzila
2016-03-01
Nowadays, writer identification is in high demand for identifying the original writer of a script with high accuracy. One of the main challenges in writer identification is how to extract the discriminative features of different authors' scripts so as to classify them precisely. In this paper, an adaptive division method for offline Latin script has been implemented using several window sizes. From fragments of binarized text, a set of features is extracted and classified into clusters in the form of groups or classes. Finally, the proposed approach has been tested with various parameters for text division and window sizes. It is observed that selection of the right window size yields a well-positioned window division. The proposed approach is tested on the IAM standard dataset (IAM, Institut für Informatik und angewandte Mathematik, University of Bern, Bern, Switzerland), a constraint-free script database. Finally, the achieved results are compared with several techniques reported in the literature.
Method for preparing dosimeter for measuring skin dose
Jones, Donald E.; Parker, DeRay; Boren, Paul R.
1982-01-01
A personnel dosimeter includes a plurality of compartments containing thermoluminescent dosimeter phosphors for registering radiation dose absorbed in the wearer's sensitive skin layer and for registering more deeply penetrating radiation. Two of the phosphor compartments communicate with thin windows of different thicknesses to obtain a ratio of shallowly penetrating radiation, e.g. beta. A third phosphor is disposed within a compartment communicating with a window of substantially greater thickness than the windows of the first two compartments for estimating the more deeply penetrating radiation dose. By selecting certain phosphors that are insensitive to neutrons and by loading the holder material with neutron-absorbing elements, energetic neutron dose can be estimated separately from other radiation dose. This invention also involves a method of injection molding of dosimeter holders with thin windows of consistent thickness at the corresponding compartments of different holders. This is achieved through use of a die insert having the thin window of precision thickness in place prior to the injection molding step.
Dosimeter for measuring skin dose and more deeply penetrating radiation
Jones, Donald E.; Parker, DeRay; Boren, Paul R.
1981-01-01
A personnel dosimeter includes a plurality of compartments containing thermoluminescent dosimeter phosphors for registering radiation dose absorbed in the wearer's sensitive skin layer and for registering more deeply penetrating radiation. Two of the phosphor compartments communicate with thin windows of different thicknesses to obtain a ratio of shallowly penetrating radiation, e.g. beta. A third phosphor is disposed within a compartment communicating with a window of substantially greater thickness than the windows of the first two compartments for estimating the more deeply penetrating radiation dose. By selecting certain phosphors that are insensitive to neutrons and by loading the holder material with neutron-absorbing elements, energetic neutron dose can be estimated separately from other radiation dose. This invention also involves a method of injection molding of dosimeter holders with thin windows of consistent thickness at the corresponding compartments of different holders. This is achieved through use of a die insert having the thin window of precision thickness in place prior to the injection molding step.
Investigation of high temperature antennas for space shuttle
NASA Technical Reports Server (NTRS)
Kuhlman, E. A.
1973-01-01
The design and development of high temperature antennas for the space shuttle orbiter are discussed. The antenna designs were based on three antenna types, an annular slot (L-Band), a linear slot (C-Band), and a horn (C-Band). The design approach was based on combining an RF window, which provides thermal protection, with an off-the-shelf antenna. Available antenna window materials were reviewed and compared, and the materials most compatible with the design requirements were selected. Two antenna window design approaches were considered: one employed a high temperature dielectric material and a low density insulation material, and the other an insulation material usable for the orbiter thermal protection system. Preliminary designs were formulated and integrated into the orbiter structure. Simple electrical models, with a series of window configurations, were constructed and tested. The results of tests and analyses for the final antenna system designs are given and show that high temperature antenna systems consisting of off-the-shelf antennas thermally protected by RF windows can be designed for the Space Shuttle Orbiter.
Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian
2015-03-01
Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can then be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shifts robustly if a proper window size is selected. The window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
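The shift-estimation core can be sketched as follows: FFT cross-correlation inside each moving window yields one lag per window, and the per-scan refined shift would then be the mode over the windows covering that scan. The toy chromatogram and the single-channel simplification are assumptions; the authors' MWFFT2 package implements the full method.

```python
import numpy as np

def window_shifts(reference, target, win=256, step=128):
    """Estimate the retention-time shift inside each moving window via
    circular FFT cross-correlation (simplified from the method above)."""
    shifts = []
    for i in range(0, len(reference) - win + 1, step):
        r, s = reference[i:i + win], target[i:i + win]
        xc = np.fft.ifft(np.fft.fft(r) * np.conj(np.fft.fft(s))).real
        lag = int(np.argmax(xc))
        if lag > win // 2:               # map lags into [-win/2, win/2)
            lag -= win
        shifts.append(lag)
    return np.array(shifts)

# toy usage: the target chromatogram is the reference delayed by 5 scans
rng = np.random.default_rng(6)
ref = np.convolve(rng.random(2048) < 0.01, np.hanning(31), mode="same")
tgt = np.roll(ref, 5)
print(window_shifts(ref, tgt))  # windows containing peaks report a lag of -5
```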
Opto-mechanical design of optical window for aero-optics effect simulation instruments
NASA Astrophysics Data System (ADS)
Wang, Guo-ming; Dong, Dengfeng; Zhou, Weihu; Ming, Xing; Zhang, Yan
2016-10-01
A complete theory is established in this paper for the opto-mechanical design of the window, which makes the design more rigorous. There are three steps in the design. First, a universal model of the aerodynamic environment is established based on the theory of Computational Fluid Dynamics, and the pneumatic pressure distribution and temperature data of the optical window surface are obtained for an aircraft flying at 5-30 km altitude, 0.5-3 Ma speed, and 0-30° angle of attack. The temperature and pressure distribution values for the maximum constraint are selected as the initial values of the external conditions on the optical window surface. Then, the optical window and mechanical structure are designed: a mechanical structure that meets the security and tightness requirements is developed. Finally, rigorous analysis and evaluation of the designed opto-mechanical structure are given, in two parts. First, a Fluid-Solid-Heat coupled model is built based on finite element analysis; the deformation of the glass and structure can be obtained from the model, which allows assessment of the feasibility of the designed optical window and ancillary structure. Second, the new optical surface is fitted by Zernike polynomials according to the deformation of the surface of the optical window, which allows evaluation of the impact of the window deformation on the imaging quality of the spectral camera.
Transparency of the 2 μm window of Titan's atmosphere
NASA Astrophysics Data System (ADS)
Rannou, P.; Seignovert, B.; Le Mouélic, S.; Maltagliati, L.; Rey, M.; Sotin, C.
2018-02-01
Titan's atmosphere is optically thick and hides the surface and the lower layers from view at almost all wavelengths. However, because gaseous absorptions are spectrally selective, some narrow spectral intervals are relatively transparent and allow the surface to be probed. To use these intervals (called windows), a good knowledge of atmospheric absorption is necessary. Once gas spectroscopic linelists are well established, the absorption inside windows depends on the way the far wings of the methane absorption lines are cut off. We know that the intensity in all the windows can be explained with the same cut-off parameters, except for the window at 2 μm. This discrepancy is generally treated with a workaround that consists in using a different cut-off description for this specific window. This window is relatively transparent, and the surface may have specific spectral signatures that could be detected there. Thus, a good knowledge of atmospheric opacities is essential, and our scope is to better understand what causes the difference between the 2 μm window and the other windows. In this work, we used scattered light at the limb and transmissions in occultation observed with VIMS (Visible and Infrared Mapping Spectrometer) onboard Cassini around the 2 μm window. The data show an absorption feature that contributes to the shape of this window. Our atmospheric model fits the VIMS data at 2 μm well with the same cut-off as for the other windows, provided an additional absorption is introduced in the middle of the window around ≃ 2.065 μm. This explains the discrepancy in the cut-off used at 2 μm, and we show that a gas with a fairly constant mixing ratio, possibly ethane, may be the cause of this absorption. Finally, we studied the impact of this absorption on the retrieval of the surface reflectivity and found that it is significant.
A data variance technique for automated despiking of magnetotelluric data with a remote reference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kappler, K.
2011-02-15
The magnetotelluric method employs co-located surface measurements of electric and magnetic fields to infer the local electrical structure of the earth. The frequency-dependent 'apparent resistivity' curves can be inaccurate at long periods if input data are contaminated - even when robust remote reference techniques are employed. Data despiking prior to processing can result in significantly more reliable estimates of long-period apparent resistivities. This paper outlines a two-step method for the automatic identification and replacement of spike-like contamination in magnetotelluric data, based on the simultaneity of natural electric and magnetic field variations at distant sites. This simultaneity is exploited both to identify windows in time when the array data are compromised and to generate synthetic data that replace observed transient noise spikes. In the first step, windows in the data time series containing spikes are identified via intersite comparison of channel 'activity' - such as the variance of differenced data within each window. In the second step, plausible replacement data for flagged windows are calculated by Wiener filtering coincident data in clean channels. The Wiener filters - which express the time-domain relationship between various array channels - are computed using an uncontaminated segment of array training data. Examples are shown where the algorithm is applied to artificially contaminated data and to real field data. In both cases all spikes are successfully identified. In the case of implanted artificial noise, the synthetic replacement time series are very similar to the original recording. In all cases, apparent resistivity and phase curves obtained by processing the despiked data are much improved over curves obtained from raw data.
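A simplified version of the two steps might look as follows: windows are flagged by comparing variances of the differenced local and remote channels, and a least-squares FIR predictor stands in for the paper's Wiener filters. The ratio threshold, window length, and tap count are illustrative assumptions.

```python
import numpy as np

def flag_spiky_windows(local, remote, win=128, ratio=5.0):
    """Step 1: flag windows where the variance of the differenced local
    channel greatly exceeds that of the simultaneous remote channel."""
    flags = []
    for i in range(0, len(local) - win + 1, win):
        v_loc = np.diff(local[i:i + win]).var()
        v_rem = np.diff(remote[i:i + win]).var()
        flags.append(v_loc > ratio * max(v_rem, 1e-12))
    return np.array(flags)

def fit_fir(clean_in, clean_out, ntaps=16):
    """Step 2 (training): least-squares FIR predictor of one channel
    from another, standing in for the paper's Wiener filters."""
    X = np.column_stack([np.roll(clean_in, k) for k in range(ntaps)])
    h, *_ = np.linalg.lstsq(X[ntaps:], clean_out[ntaps:], rcond=None)
    return h

# toy usage: the remote field drives the local field; a spike hits local
rng = np.random.default_rng(7)
ntaps, win = 16, 128
remote = np.cumsum(rng.normal(size=4096)) * 0.1
local = 0.8 * remote + 0.05 * rng.normal(size=4096)
local[2000:2005] += 30.0                         # implant a spike
h = fit_fir(remote[:1024], local[:1024], ntaps)  # train on a clean segment
synthetic = np.column_stack([np.roll(remote, k) for k in range(ntaps)]) @ h
flags = flag_spiky_windows(local, remote, win)
for j in np.where(flags)[0]:                     # replace flagged windows
    local[j * win:(j + 1) * win] = synthetic[j * win:(j + 1) * win]
print(np.where(flags)[0])                        # the window holding the spike
```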
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method is described for identifying pulsations in time series of magnetic field data which are simultaneously present in multiple channels of data at one or more sensor locations. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations: one 100 km to the north of the cluster and the other 350 km south. When the training data contain signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations observed only in the cluster of local observatories, we identify several types of non-plane-wave signals having similar polarization.
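The eigenvector comparison at the heart of the second stage can be sketched compactly: compute the dominant covariance eigenvector of the training event, slide a window of the same width across the data, and flag positions whose dominant eigenvector is nearly parallel to it. The cosine threshold and toy three-channel data are assumptions.

```python
import numpy as np

def dominant_eigvec(window):
    """Dominant eigenvector of the time-domain covariance matrix of a
    multi-channel window (channels in columns)."""
    w, v = np.linalg.eigh(np.cov(window, rowvar=False))
    return v[:, -1]

def scan_for_pulses(data, template, threshold=0.99):
    """Slide a window the width of the training event across the data
    and flag positions whose dominant eigenvector aligns with the
    template's (|cos| handles the sign ambiguity of eigenvectors)."""
    ref = dominant_eigvec(template)
    n = len(template)
    hits = []
    for i in range(0, len(data) - n + 1, n // 2):
        if abs(np.dot(dominant_eigvec(data[i:i + n]), ref)) > threshold:
            hits.append(i)
    return hits

# toy usage: a 3-channel pulse with fixed polarization recurs in noise
rng = np.random.default_rng(8)
pol = np.array([0.8, 0.5, 0.3])
data = 0.1 * rng.normal(size=(6000, 3))
pulse = np.sin(np.linspace(0, 6 * np.pi, 200))[:, None] * pol
for k in (1000, 4000):
    data[k:k + 200] += pulse
template = pulse + 0.05 * rng.normal(size=(200, 3))
print(scan_for_pulses(data, template))  # windows overlapping the pulses
```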
NASA Astrophysics Data System (ADS)
Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.
2008-12-01
Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science community for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features, and acquiring model input. Thanks to advances in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlet glaciers and collected one-hour-interval data continuously for more than one year at some, but not all, sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono-/stereo-photogrammetry can provide the theoretical and practical fundamentals for data processing, along with digital image processing techniques. The time-lapse images indicate various problems: rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, fox chewing of instrument cables, and the pecking of the plastic window by ravens. Other challenges include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that non-metric digital cameras exhibit large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images need to be processed in a computationally efficient way. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability. We experiment with mono- and stereo-photogrammetric techniques with the aid of automatic correlation matching to efficiently handle the enormous data volumes.
Code of Federal Regulations, 2011 CFR
2011-01-01
... may conduct a review of the test records. The Secretary may then conduct enforcement testing of that...) For automatic commercial ice makers, as well as commercial refrigerators, freezers, and refrigerator... numbers to select the units to be tested. (ii) For automatic commercial ice makers, as well as commercial...
Detecting Cheaters without Thinking: Testing the Automaticity of the Cheater Detection Module
Van Lier, Jens; Revlin, Russell; De Neys, Wim
2013-01-01
Evolutionary psychologists have suggested that our brain is composed of evolved mechanisms. One extensively studied mechanism is the cheater detection module. This module would make people very good at detecting cheaters in a social exchange. A vast amount of research has illustrated performance facilitation on social contract selection tasks. This facilitation is attributed to the alleged automatic and isolated operation of the module (i.e., independent of general cognitive capacity). This study, using the selection task, tested the critical automaticity assumption in three experiments. Experiments 1 and 2 established that performance on social contract versions did not depend on cognitive capacity or age. Experiment 3 showed that experimentally burdening cognitive resources with a secondary task had no impact on performance on the social contract version. However, in all experiments, performance on a non-social contract version did depend on available cognitive capacity. Overall, findings validate the automatic and effortless nature of social exchange reasoning. PMID:23342012
Approximation, abstraction and decomposition in search and optimization
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1992-01-01
In this paper, I discuss four different areas of my research. One portion of my research has focused on automatic synthesis of search control heuristics for constraint satisfaction problems (CSPs). I have developed techniques for automatically synthesizing two types of heuristics for CSPs: Filtering functions are used to remove portions of a search space from consideration. Another portion of my research is focused on automatic synthesis of hierarchic algorithms for solving constraint satisfaction problems (CSPs). I have developed a technique for constructing hierarchic problem solvers based on numeric interval algebra. Another portion of my research is focused on automatic decomposition of design optimization problems. We are using the design of racing yacht hulls as a testbed domain for this research. Decomposition is especially important in the design of complex physical shapes such as yacht hulls. Another portion of my research is focused on intelligent model selection in design optimization. The model selection problem results from the difficulty of using exact models to analyze the performance of candidate designs.
NASA Astrophysics Data System (ADS)
Álvarez, Charlens; Martínez, Fabio; Romero, Eduardo
2015-01-01
Pelvic magnetic resonance imaging (MRI) is used in prostate cancer radiotherapy (RT) as part of radiation planning. Modern protocols require a manual delineation, a tedious and variable activity that may take about 20 minutes per patient, even for trained experts. That considerable time is an important workflow burden in most radiological services. Automatic or semi-automatic methods might improve efficiency by decreasing delineation times while preserving the required accuracy. This work presents a fully automatic atlas-based segmentation strategy that selects the most similar templates for a new MRI using a robust multi-scale SURF analysis. A new segmentation is then achieved by a linear combination of the selected templates, which are previously non-rigidly registered towards the new image. The proposed method shows reliable segmentations, obtaining an average Dice coefficient of 79% when compared with the expert manual segmentation, under a leave-one-out scheme with the training database.
Automatic Semantic Facilitation in Anterior Temporal Cortex Revealed through Multimodal Neuroimaging
Gramfort, Alexandre; Hämäläinen, Matti S.; Kuperberg, Gina R.
2013-01-01
A core property of human semantic processing is the rapid, facilitatory influence of prior input on extracting the meaning of what comes next, even under conditions of minimal awareness. Previous work has shown a number of neurophysiological indices of this facilitation, but the mapping between time course and localization—critical for separating automatic semantic facilitation from other mechanisms—has thus far been unclear. In the current study, we used a multimodal imaging approach to isolate early, bottom-up effects of context on semantic memory, acquiring a combination of electroencephalography (EEG), magnetoencephalography (MEG), and functional magnetic resonance imaging (fMRI) measurements in the same individuals with a masked semantic priming paradigm. Across techniques, the results provide a strikingly convergent picture of early automatic semantic facilitation. Event-related potentials demonstrated early sensitivity to semantic association between 300 and 500 ms; MEG localized the differential neural response within this time window to the left anterior temporal cortex, and fMRI localized the effect more precisely to the left anterior superior temporal gyrus, a region previously implicated in semantic associative processing. However, fMRI diverged from early EEG/MEG measures in revealing semantic enhancement effects within frontal and parietal regions, perhaps reflecting downstream attempts to consciously access the semantic features of the masked prime. Together, these results provide strong evidence that automatic associative semantic facilitation is realized as reduced activity within the left anterior superior temporal cortex between 300 and 500 ms after a word is presented, and emphasize the importance of multimodal neuroimaging approaches in distinguishing the contributions of multiple regions to semantic processing. PMID:24155321
Hügelschäfer, Sabine; Jaudas, Alexander; Achtziger, Anja
2016-10-15
Gender categorization is highly automatic. Studies measuring ERPs during the presentation of male and female faces in a categorization task have shown that this categorization is extremely quick (around 130 ms, indicated by the N170). We tested whether this automatic process can be controlled by goal intentions and implementation intentions. First, we replicated the N170 modulation on gender-incongruent faces as reported in previous research. This effect was only observed in a task in which faces had to be categorized according to gender, but not in a task that required responding to a visual feature added to the face stimuli (the color of a dot) while gender was irrelevant. Second, it turned out that the N170 modulation on gender-incongruent faces was altered if a goal intention was set that aimed at controlling a gender bias. We interpret this finding as an indicator of nonconscious goal pursuit. The N170 modulation was completely absent when this goal intention was furnished with an implementation intention. In contrast, intentions did not alter brain activity in a later time window (P300), which is associated with more complex and rather conscious processes. In line with previous research, the P300 was modulated by gender incongruency even when individuals were strongly involved in another task, demonstrating the automaticity of gender detection. We interpret our findings as evidence that automatic gender categorization, which occurs at a very early processing stage, can be effectively controlled by intentions. Copyright © 2016 Elsevier B.V. All rights reserved.
Building-Integrated Solar Energy Devices based on Wavelength Selective Films
NASA Astrophysics Data System (ADS)
Ulavi, Tejas
A potentially attractive option for building-integrated solar is to employ hybrid solar collectors which serve dual purposes, combining solar thermal technology with either thin-film photovoltaics or daylighting. In this study, two hybrid concepts, a hybrid photovoltaic/thermal (PV/T) collector and a hybrid 'solar window', are presented and analyzed to evaluate technical performance. In both concepts, a wavelength-selective film is coupled with a compound parabolic concentrator (CPC) to reflect and concentrate the infrared portion of the solar spectrum onto a tubular absorber. The visible portion of the spectrum is transmitted through the concentrator to either a thin-film Cadmium Telluride (CdTe) solar panel for electricity generation or into the interior space for daylighting. Special attention is given to the design of the hybrid devices for aesthetic building integration. An adaptive concentrator design based on asymmetrical truncation of CPCs is presented for the hybrid solar window concept. The energetic and spectral split between the solar thermal module and the PV or daylighting module are functions of the optical properties of the wavelength-selective film and the concentrator geometry, and are determined using a Monte Carlo Ray-Tracing (MCRT) model. Results obtained from the MCRT can be used in conjunction with meteorological data for specific applications to study the impact of CPC design parameters, including the half-acceptance angle θc, absorber diameter D, and truncation, on the annual thermal and PV/daylighting efficiencies. The hybrid PV/T system is analyzed for a rooftop application in Phoenix, AZ. Compared to a system of the same area with independent solar thermal and PV modules, the hybrid PV/T provides 20% more energy annually. However, the increase in total delivered energy is due solely to the addition of the thermal module and is achieved at the expense of a decrease in the annual electrical efficiency from 8.8% to 5.8% due to shading by the absorber tubes. For this reason, the PV/T hybrid is not recommended over other options in new installations. The hybrid solar window is evaluated for a horizontal skylight and south- and east-facing vertical windows in Minneapolis, MN. The predicted visible transmittance for the solar window is 0.66 to 0.73 for single-glazed systems and 0.61 to 0.67 for double-glazed systems. The solar heat gain coefficient and the U-factor for the window are comparable to existing glazing technology. Annual thermal efficiencies of up to 24% and 26% are predicted for the vertical window and the horizontal skylight, respectively. Experimental measurements of the solar thermal component of the window confirm the trends of the model. In conclusion, the hybrid solar window combines the functionality of an energy-efficient fenestration system with hybrid thermal energy generation to provide a compelling solution for sustainable design of the built environment.
Innovative Visualization Techniques applied to a Flood Scenario
NASA Astrophysics Data System (ADS)
Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael
2013-04-01
The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user to collaboratively explore, present and communicate visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that include measures of water column (flood height) every 10 minutes at 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain two of these in more detail, as they are not currently in common use for data visualization: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and capture the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, includes multiple windows which respond interactively to user selections, so that when an object is selected or changed in one window, it is automatically updated in all the other windows. These concepts can be part of a collaborative platform, where multiple people share and work together on the data via online access, which also allows remote usage from a mobile platform. Storytelling augments analysis and decision-making capabilities, helping analysts assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs and scatter plots. The map provides flood and building information on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps, etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, and the correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization, which incorporates the ideas of storytelling, web-based linked views, and other visualization techniques for a 4-hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.
Balabin, Roman M; Smirnov, Sergey V
2011-04-29
During the past several years, near-infrared (near-IR/NIR) spectroscopy has increasingly been adopted as an analytical tool in various fields, from the petroleum to the biomedical sector. The NIR spectrum (above 4000 cm(-1)) of a sample is typically measured by modern instruments at a few hundred wavelengths. Recently, considerable effort has been directed towards developing procedures to identify variables (wavelengths) that contribute useful information. Variable selection (VS) or feature selection, also called frequency selection or wavelength selection, is a critical step in data analysis for vibrational spectroscopy (infrared, Raman, or NIRS). In this paper, we compare the performance of 16 different feature selection methods for the prediction of properties of biodiesel fuel, including density, viscosity, methanol content, and water concentration. The feature selection algorithms tested include stepwise multiple linear regression (MLR-step), interval partial least squares regression (iPLS), backward iPLS (BiPLS), forward iPLS (FiPLS), moving window partial least squares regression (MWPLS), (modified) changeable size moving window partial least squares (CSMWPLS/MCSMWPLSR), searching combination moving window partial least squares (SCMWPLS), successive projections algorithm (SPA), uninformative variable elimination (UVE, including UVE-SPA), simulated annealing (SA), back-propagation artificial neural networks (BP-ANN), Kohonen artificial neural network (K-ANN), and genetic algorithms (GAs, including GA-iPLS). Two linear techniques for calibration model building, namely multiple linear regression (MLR) and partial least squares regression/projection to latent structures (PLS/PLSR), are used for the evaluation of biofuel properties. A comparison with a non-linear calibration model, artificial neural networks (ANN-MLP), is also provided. A discussion of gasoline, ethanol-gasoline (bioethanol), and diesel fuel data is presented. Results from other spectroscopic techniques, such as Raman, ultraviolet-visible (UV-vis), or nuclear magnetic resonance (NMR) spectroscopy, can also be greatly improved by an appropriate choice of feature selection method. Copyright © 2011 Elsevier B.V. All rights reserved.
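As an illustration of the interval-based methods listed above, the following minimal Python sketch (not the authors' implementation; X is a spectra matrix and y a property vector, both assumed to exist) fits a PLS model to each wavelength interval and keeps the interval with the lowest cross-validated RMSE, which is the core idea of iPLS:

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def ipls_best_interval(X, y, n_intervals=20, n_components=3):
    # Split the wavelength axis into contiguous intervals and score each one.
    edges = np.linspace(0, X.shape[1], n_intervals + 1, dtype=int)
    best, best_rmse = None, np.inf
    for lo, hi in zip(edges[:-1], edges[1:]):
        pls = PLSRegression(n_components=min(n_components, hi - lo))
        y_cv = cross_val_predict(pls, X[:, lo:hi], y, cv=5)
        rmse = np.sqrt(np.mean((y - y_cv.ravel()) ** 2))
        if rmse < best_rmse:
            best, best_rmse = (lo, hi), rmse
    return best, best_rmse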
DOT National Transportation Integrated Search
1989-06-01
Author's abstract: A nonrandom sample of 120 disproportionately short, tall, and overweight drivers compared the comfort and convenience of the automatic safety belt systems used in seventeen automobiles. Nine vehicles had motorized shoulder belts wi...
Automatic, Multiple Assessment Options in Undergraduate Meteorology Education
ERIC Educational Resources Information Center
Kahl, Jonathan D. W.
2017-01-01
Since 2008, automatic, multiple assessment options have been utilised in selected undergraduate meteorology courses at the University of Wisconsin--Milwaukee. Motivated by a desire to reduce stress among students, the assessment methodology includes examination-heavy and homework-heavy alternatives, differing by an adjustable 15% of the overall…
Varela, P; Silva, A; da Silva, F; da Graça, S; Manso, M E; Conway, G D
2010-10-01
The spectrogram is one of the best-known time-frequency distributions suitable to analyze signals whose energy varies both in time and frequency. In reflectometry, it has been used to obtain the frequency content of FM-CW signals for density profile inversion and also to study plasma density fluctuations from swept and fixed frequency data. Being implemented via the short-time Fourier transform, the spectrogram is limited in resolution, and for that reason several methods have been developed to overcome this problem. Among those, we focus on the reassigned spectrogram technique that is both easily automated and computationally efficient requiring only the calculation of two additional spectrograms. In each time-frequency window, the technique reallocates the spectrogram coordinates to the region that most contributes to the signal energy. The application to ASDEX Upgrade reflectometry data results in better energy concentration and improved localization of the spectral content of the reflected signals. When combined with the automatic (data driven) window length spectrogram, this technique provides improved profile accuracy, in particular, in regions where frequency content varies most rapidly such as the edge pedestal shoulder.
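A minimal illustration of the baseline short-time Fourier spectrogram (an assumption-laden sketch, not the ASDEX Upgrade processing chain; the signal and sampling rate are invented) is given below; the reassignment step described above would additionally require two auxiliary spectrograms computed with time- and frequency-weighted windows:

import numpy as np
from scipy.signal import spectrogram, chirp

fs = 1e6                                    # assumed sampling rate (Hz)
t = np.arange(0, 0.01, 1 / fs)
x = chirp(t, f0=10e3, f1=200e3, t1=t[-1])   # stand-in for an FM-CW beat signal

f, tt, Sxx = spectrogram(x, fs=fs, window='hann', nperseg=256, noverlap=192)
peak_freq = f[np.argmax(Sxx, axis=0)]       # dominant frequency in each time window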
Detection of Splice Sites Using Support Vector Machine
NASA Astrophysics Data System (ADS)
Varadwaj, Pritish; Purohit, Neetesh; Arora, Bhumika
Automatic identification and annotation of the exon and intron regions of genes from DNA sequences has been an important research area in the field of computational biology. Several approaches, viz. Hidden Markov Models (HMM), Artificial Intelligence (AI) based machine learning, and Digital Signal Processing (DSP) techniques, have extensively and independently been used by various researchers to address this challenging task. In this work, we propose a Support Vector Machine based kernel learning approach for the detection of splice sites (the exon-intron boundary) in a gene. Electron-Ion Interaction Potential (EIIP) values of nucleotides have been used for mapping character sequences to corresponding numeric sequences. A Radial Basis Function (RBF) SVM kernel is trained using the EIIP numeric sequences. Furthermore, this was tested on a test gene dataset for the detection of splice sites by shifting a window of 12 residues. Optimum values of the window size and various important parameters of the SVM kernel have been optimized for better accuracy. Receiver Operating Characteristic (ROC) curves have been utilized for displaying the sensitivity rate of the classifier, and the results showed 94.82% accuracy for splice site detection on the test dataset.
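A minimal sketch of this kind of pipeline is given below; it is only illustrative, the EIIP values are the commonly cited ones, and the 12-residue training windows and labels are placeholders rather than the paper's data:

import numpy as np
from sklearn.svm import SVC

EIIP = {'A': 0.1260, 'C': 0.1340, 'G': 0.0806, 'T': 0.1335}

def encode(seq):
    # Map a nucleotide string to its EIIP numeric sequence.
    return [EIIP[b] for b in seq.upper()]

windows = ["AAGGTAAGTCCT", "CTCAGGTGAGTA", "ACGTACGTACGT", "TTTTAAAACCCC"]  # placeholders
labels = [1, 1, 0, 0]                         # 1 = window centred on a splice site

X = np.array([encode(w) for w in windows])
clf = SVC(kernel='rbf', gamma='scale').fit(X, labels)
print(clf.predict([encode("AAGGTGAGTACT")]))  # real use: shift a 12-residue window along the gene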
Sliding Window-Based Region of Interest Extraction for Finger Vein Images
Yang, Lu; Yang, Gongping; Yin, Yilong; Xiao, Rongyang
2013-01-01
Region of Interest (ROI) extraction is a crucial step in an automatic finger vein recognition system. The aim of ROI extraction is to decide which part of the image is suitable for finger vein feature extraction. This paper proposes a finger vein ROI extraction method which is robust to finger displacement and rotation. First, we determine the middle line of the finger, which will be used to correct the image skew. Then, a sliding window is used to detect the phalangeal joints and further to ascertain the height of the ROI. Last, for the corrected image with the determined height, we obtain the ROI by using the internal tangents of the finger edges as the left and right boundaries. The experimental results show that the proposed method can extract the ROI more accurately and effectively compared with other methods, and thus improve the performance of a finger vein identification system. Besides, to acquire high quality finger vein images during the capture process, we propose eight criteria for finger vein capture from different aspects, and these criteria should be helpful to some extent for finger vein capture. PMID:23507824
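A rough sketch of the sliding-window idea (the scoring rule here is an assumption for illustration, not the authors' exact criterion) is to slide a fixed-height window along the finger axis and score each position by its mean brightness, since the phalangeal joints transmit more light and appear as local maxima:

import numpy as np

def joint_rows(img, win_h=20, n_joints=2):
    # img: 2-D grayscale finger image with the finger axis along the rows.
    scores = np.array([img[r:r + win_h].mean() for r in range(img.shape[0] - win_h)])
    return np.sort(np.argsort(scores)[-n_joints:])   # rows of the brightest windows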
Microwave Radiometers for Fire Detection in Trains: Theory and Feasibility Study.
Alimenti, Federico; Roselli, Luca; Bonafoni, Stefania
2016-06-17
This paper introduces the theory of fire detection in moving vehicles by microwave radiometers. The system analysis is discussed and a feasibility study is illustrated on the basis of two implementation hypotheses. The basic idea is to have a fixed radiometer and to look inside the glass windows of the wagon when it passes in front of the instrument antenna. The proposed sensor uses a three-pixel multi-beam configuration that allows an image to be formed by the movement of the train itself. Each pixel is constituted by a direct amplification microwave receiver operating at 31.4 GHz. At this frequency, the antenna can be a 34 cm offset parabolic dish, whereas a 1 K brightness temperature resolution is achievable with an overall system noise figure of 6 dB, an observation bandwidth of 2 GHz and an integration time of 1 ms. The effect of the detector noise is also investigated and several implementation hypotheses are discussed. The presented study is important since it could be applied to the automatic fire alarm in trains and moving vehicles with dielectric wall/windows.
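The quoted 1 K resolution can be sanity-checked with the ideal total-power radiometer equation dT = T_sys / sqrt(B * tau); the antenna temperature below is an assumed value and the paper's detailed noise analysis is more involved:

import math

NF_dB, B, tau = 6.0, 2e9, 1e-3            # noise figure, bandwidth (Hz), integration time (s)
T0, T_ant = 290.0, 300.0                  # reference and assumed antenna temperatures (K)

T_rx = T0 * (10 ** (NF_dB / 10) - 1)      # receiver noise temperature, ~864 K
T_sys = T_ant + T_rx
dT = T_sys / math.sqrt(B * tau)
print(f"dT = {dT:.2f} K")                 # ~0.8 K, consistent with the ~1 K figure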
MEGA7: Molecular Evolutionary Genetics Analysis Version 7.0 for Bigger Datasets.
Kumar, Sudhir; Stecher, Glen; Tamura, Koichiro
2016-07-01
We present the latest version of the Molecular Evolutionary Genetics Analysis (Mega) software, which contains many sophisticated methods and tools for phylogenomics and phylomedicine. In this major upgrade, Mega has been optimized for use on 64-bit computing systems for analyzing larger datasets. Researchers can now explore and analyze tens of thousands of sequences in Mega. The new version also provides an advanced wizard for building timetrees and includes a new functionality to automatically predict gene duplication events in gene family trees. The 64-bit Mega is made available in two interfaces: graphical and command line. The graphical user interface (GUI) is a native Microsoft Windows application that can also be used on Mac OS X. The command line Mega is available as native applications for Windows, Linux, and Mac OS X. They are intended for use in high-throughput and scripted analysis. Both versions are available from www.megasoftware.net free of charge. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Automated image segmentation-assisted flattening of atomic force microscopy images.
Wang, Yuliang; Lu, Tongda; Li, Xiaolai; Wang, Huimin
2018-01-01
Atomic force microscopy (AFM) images normally exhibit various artifacts. As a result, image flattening is required prior to image analysis. To obtain optimized flattening results, foreground features are generally excluded manually using rectangular masks in image flattening, which is time consuming and inaccurate. In this study, a two-step scheme was proposed to achieve optimized image flattening in an automated manner. In the first step, the convex and concave features in the foreground were automatically segmented with accurate boundary detection. The extracted foreground features were taken as exclusion masks. In the second step, data points in the background were fitted as polynomial curves/surfaces, which were then subtracted from the raw images to obtain the flattened images. Moreover, sliding-window-based polynomial fitting was proposed to process images with complex background trends. The working principle of the two-step image flattening scheme was presented, followed by an investigation of the influence of the sliding-window size and polynomial fitting direction on the flattened images. Additionally, the role of image flattening in the morphological characterization and segmentation of AFM images was verified with the proposed method.
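A minimal sketch of mask-assisted flattening (a simplified line-by-line variant under assumed inputs, not the authors' code) fits a polynomial to the background pixels of each scan line and subtracts it, with the segmented foreground supplied as an exclusion mask:

import numpy as np

def flatten_lines(img, mask, order=2):
    # img, mask: 2-D arrays of equal shape; mask is True on foreground features.
    out = np.empty_like(img, dtype=float)
    x = np.arange(img.shape[1])
    for i, line in enumerate(img):
        bg = ~mask[i]                                  # background pixels only
        coeffs = np.polyfit(x[bg], line[bg], order)    # fit the background trend
        out[i] = line - np.polyval(coeffs, x)          # subtract it from the whole line
    return out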
1995-06-01
Energy efficient, 30 and 40 watt ballasts are Rapid Start, thermally protected, automatic resetting, Class P, high or low power factor as required...BALLASTS Energy efficient, 30 and 40 watt Rapid Start, thermally protected, automatic resetting, Class P, high power factor, CBM, sound rated A, unless...BALLASTS Energy efficient, 40 Watt Rapid Start, thermally protected, automatic resetting, Class P, high power factor, CBM, sound rated A, unless
Application of industrial robots in automatic disassembly line of waste LCD displays
NASA Astrophysics Data System (ADS)
Wang, Sujuan
2017-11-01
In the automatic disassembly line of waste LCD displays, LCD displays are disassembled into plastic shells, metal shields, circuit boards, and LCD panels. Two industrial robots are used to cut metal shields and remove circuit boards in this automatic disassembly line. The functions of these two industrial robots, and the solutions to the critical issues of model selection, the interfaces with PLCs, and the workflows, are described in detail in this paper.
Wang, Hui; Xu, Lei; Fan, Zhanming; Liang, Junfu; Yan, Zixu; Sun, Zhonghua
2017-01-01
The aim of this study was to evaluate the workflow efficiency of a new automatic coronary-specific reconstruction technique (Smart Phase, GE Healthcare-SP) for selection of the best cardiac phase with the least coronary motion, compared with expert manual selection (MS) of the best phase in patients with high heart rates. A total of 46 patients with heart rates above 75 bpm who underwent single beat coronary computed tomography angiography (CCTA) were enrolled in this study. CCTA of all subjects were performed on a 256-detector row CT scanner (Revolution CT, GE Healthcare, Waukesha, Wisconsin, US). With the SP technique, the acquired phase range was automatically searched in 2% phase intervals during the reconstruction process to determine the optimal phase for coronary assessment, while for routine expert MS, reconstructions were performed at 5% intervals and a best phase was manually determined. The reconstruction and review times were recorded to measure the workflow efficiency of each method. Two reviewers subjectively assessed image quality for each coronary artery in the MS and SP reconstruction volumes using a 4-point grading scale. The average HR of the enrolled patients was 91.1±19.0 bpm. A total of 204 vessels were assessed. The subjective image quality using SP was comparable to that of MS, 1.45±0.85 vs 1.43±0.81 respectively (p = 0.88). The average time was 246 seconds for manual best phase selection and 98 seconds for SP selection, resulting in an average time saving of 148 seconds (60%) with use of the SP algorithm. The coronary-specific automatic cardiac best phase selection technique (Smart Phase) improves clinical workflow in high heart rate patients and provides image quality comparable with manual cardiac best phase selection. Reconstruction of single-beat CCTA exams with SP can benefit users with less experience in CCTA image interpretation.
2015-01-01
Retinal fundus images are widely used in diagnosing and providing treatment for several eye diseases. Prior works using retinal fundus images detected the presence of exudation with the aid of publicly available datasets using an extensive segmentation process. Though proved to be computationally efficient, they failed to create a diabetic retinopathy feature selection system for transparently diagnosing the disease state. Also, the diagnosis of diseases did not employ machine learning methods to categorize candidate fundus images into true positive and true negative ratios. Several candidate fundus images did not include a more detailed feature selection technique for diabetic retinopathy. To apply machine learning methods and classify candidate fundus images on the basis of a sliding window, a method called Diabetic Fundus Image Recuperation (DFIR) is designed in this paper. The initial phase of the DFIR method selects the optic cup features in digital retinal fundus images based on a sliding window approach. With this, the disease state for diabetic retinopathy is assessed. Feature selection in the DFIR method uses a collection of sliding windows to obtain features based on the histogram value. The histogram-based feature selection, with the aid of a Group Sparsity Non-overlapping function, provides more detailed feature information. Using a Support Vector Model in the second phase, the DFIR method, based on a Spiral Basis Function, effectively ranks the diabetic retinopathy disease levels. The ranking of the disease level for each candidate set provides a promising result for developing a practical automated diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method evaluates factors such as sensitivity, specificity rate, ranking efficiency and feature selection time. PMID:25974230
An Energy-Aware Hybrid ARQ Scheme with Multi-ACKs for Data Sensing Wireless Sensor Networks.
Zhang, Jinhuan; Long, Jun
2017-06-12
Wireless sensor networks (WSNs) are one of the important supporting technologies of edge computing. In WSNs, reliable communications are essential for most applications due to the unreliability of wireless links. In addition, network lifetime is also an important performance metric and needs to be considered in many WSN studies. In this paper, an energy-aware hybrid Automatic Repeat-reQuest (ARQ) scheme is proposed to ensure energy efficiency while guaranteeing network transmission reliability. In the scheme, the source node sends data packets continuously with the correct window size and does not need to wait for an acknowledgement (ACK) confirmation for each data packet. When the destination receives K data packets, it returns multiple copies of one ACK for confirmation to avoid ACK packet loss. The energy consumption of each node in a flat circular network applying the proposed scheme is statistically analyzed, and the cases in which it is more energy efficient than the original scheme are discussed. Moreover, how to select the parameters of the scheme to extend the network lifetime under the network reliability constraint is addressed. In addition, the energy efficiency of the proposed scheme is evaluated. Simulation results are presented to demonstrate that node energy consumption can be reduced and the network lifetime prolonged.
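As a toy illustration of the multi-ACK idea (not the paper's analytical model), with an independent per-packet loss probability p, sending m copies of one ACK after every K data packets leaves the confirmation entirely lost with probability p**m, at an overhead of m/K ACKs per data packet:

p, m, K = 0.1, 3, 8                  # assumed loss rate, ACK copies, window size
p_ack_lost = p ** m                  # 0.001: all three ACK copies are lost
overhead = m / K                     # extra ACK traffic per data packet
print(p_ack_lost, overhead)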
The new generation of OpenGL support in ROOT
NASA Astrophysics Data System (ADS)
Tadel, M.
2008-07-01
OpenGL has been promoted to become the main 3D rendering engine of the ROOT framework. This required a major re-modularization of OpenGL support on all levels, from basic window-system specific interface to medium-level object-representation and top-level scene management. This new architecture allows seamless integration of external scene-graph libraries into the ROOT OpenGL viewer as well as inclusion of ROOT 3D scenes into external GUI and OpenGL-based 3D-rendering frameworks. Scene representation was removed from inside of the viewer, allowing scene-data to be shared among several viewers and providing for a natural implementation of multi-view canvas layouts. The object-graph traversal infrastructure allows free mixing of 3D and 2D-pad graphics and makes implementation of ROOT canvas in pure OpenGL possible. Scene-elements representing ROOT objects trigger automatic instantiation of user-provided rendering-objects based on the dictionary information and class-naming convention. Additionally, a finer, per-object control over scene-updates is available to the user, allowing overhead-free maintenance of dynamic 3D scenes and creation of complex real-time animations. User-input handling was modularized as well, making it easy to support application-specific scene navigation, selection handling and tool management.
National Stormwater Calculator: Low Impact Development ...
The National Stormwater Calculator (NSC) makes it easy to estimate runoff reduction when planning a new development or redevelopment site with low impact development (LID) stormwater controls. The Calculator is currently deployed as a Windows desktop application. The Calculator is organized as a wizard style application that walks the user through the steps necessary to perform runoff calculations on a single urban sub-catchment of 10 acres or less in size. Using an interactive map, the user can select the sub-catchment location and the Calculator automatically acquires hydrologic data for the site. A new LID cost estimation module has been developed for the Calculator. This project involved programming cost curves into the existing Calculator desktop application. The integration of cost components of LID controls into the Calculator increases functionality and will promote greater use of the Calculator as a stormwater management and evaluation tool. The addition of the cost estimation module allows planners and managers to evaluate LID controls based on comparison of project cost estimates and predicted LID control performance. Cost estimation is accomplished based on user-identified size (or auto-sizing based on achieving volume control or treatment of a defined design storm), configuration of the LID control infrastructure, and other key project and site-specific variables, including whether the project is being applied as part of new development or redevelopm
SSAIS: A Program to Assess Adverse Impact in Multistage Selection Decisions
ERIC Educational Resources Information Center
De Corte, Wilfried
2004-01-01
The article describes a Windows program to estimate the expected value and sampling distribution function of the adverse impact ratio for general multistage selections. The results of the program can also be used to predict the risk that a future selection decision will result in an outcome that reflects the presence of adverse impact. The method…
2013-04-01
tumor microenvironment on clonal selection using intravital microscopy. Jae-Hyun Park, Miriam R. Fein, Mikala Egeblad (Cold Spring Harbor)...used surgically implanted mammary imaging windows in immunocompetent mice and injected "brainbow" expressing, syngeneic 4T1 breast carcinoma cells...under the windows. This allowed us to acquire multiple time-lapse imaging series by spinning disk confocal microscopy of the same tumor, done about 3
78 FR 11609 - Special Conditions: Embraer S.A., Model EMB-550 Airplane; Landing Pitchover Condition
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... automatic braking system. The applicable airworthiness regulations do not contain adequate or appropriate... with an automatic braking system. This feature is a pilot-selectable function that allows earlier braking at landing without pilot pedal input. When the autobrake system is armed before landing, it...
Electrophysiological Evidence of Automatic Early Semantic Processing
ERIC Educational Resources Information Center
Hinojosa, Jose A.; Martin-Loeches, Manuel; Munoz, Francisco; Casado, Pilar; Pozo, Miguel A.
2004-01-01
This study investigates the automatic-controlled nature of early semantic processing by means of the Recognition Potential (RP), an event-related potential response that reflects lexical selection processes. For this purpose tasks differing in their processing requirements were used. Half of the participants performed a physical task involving a…
Unsupervised MDP Value Selection for Automating ITS Capabilities
ERIC Educational Resources Information Center
Stamper, John; Barnes, Tiffany
2009-01-01
We seek to simplify the creation of intelligent tutors by using student data acquired from standard computer aided instruction (CAI) in conjunction with educational data mining methods to automatically generate adaptive hints. In our previous work, we have automatically generated hints for logic tutoring by constructing a Markov Decision Process…
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano
Few automated data acquisition and processing systems operate on mainframes, some run on UNIX-based workstations and others on personal computers, equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and its real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure, MSA (multi-station-analysis), for signal detection, phase grouping and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of the S-waves. The on-line application to the latter data set shows that automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
Zhang, Huacheng; Hou, Jue; Hu, Yaoxin; Wang, Peiyao; Ou, Ranwen; Jiang, Lei; Liu, Jefferson Zhe; Freeman, Benny D.; Hill, Anita J.; Wang, Huanting
2018-01-01
Porous membranes with ultrafast ion permeation and high ion selectivity are highly desirable for efficient mineral separation, water purification, and energy conversion, but it is still a huge challenge to efficiently separate monatomic ions of the same valence and similar sizes using synthetic membranes. We report metal organic framework (MOF) membranes, including ZIF-8 and UiO-66 membranes with uniform subnanometer pores consisting of angstrom-sized windows and nanometer-sized cavities for ultrafast selective transport of alkali metal ions. The angstrom-sized windows acted as ion selectivity filters for selection of alkali metal ions, whereas the nanometer-sized cavities functioned as ion conductive pores for ultrafast ion transport. The ZIF-8 and UiO-66 membranes showed a LiCl/RbCl selectivity of ~4.6 and ~1.8, respectively, which are much greater than the LiCl/RbCl selectivity of 0.6 to 0.8 measured in traditional porous membranes. Molecular dynamics simulations suggested that ultrafast and selective ion transport in ZIF-8 was associated with partial dehydration effects. This study reveals ultrafast and selective transport of monovalent ions in subnanometer MOF pores and opens up a new avenue to develop unique MOF platforms for efficient ion separations in the future. PMID:29487910
Reflective insulating blinds for windows and the like
Barnes, P.R.; Shapira, H.B.
1979-12-07
Energy-conserving window blinds are provided. The blinds are fabricated from coupled and adjustable slats, each slat having an insulation layer and a reflective surface to face outwardly when the blinds are closed. A range of desired light and air transmission may be selected with the reflective surfaces of the slats adapted to direct sunlight upward toward the ceiling when the blinds are open. When the blinds are closed, the insulation of the slats reduces the heat loss or gain produced by the windows. If desired, the reflective surfaces of the slats may be concave. The edges of the slats are designed to seal against adjacent slats when the blinds are closed to ensure minimum air flow between slats.
The software and algorithms for hyperspectral data processing
NASA Astrophysics Data System (ADS)
Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid
2017-04-01
Hyperspectral remote sensing techniques are widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. The complex has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin was compiled as a Qt plugin and represents a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct smoothing with a moving average, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method of refinement of spectral albedo parameters using Libradtran, and an analytical least squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). Also, the software supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with similar ones from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages should be noted: fast and low-memory hypercube manipulation features, a user-friendly interface, modularity, and expandability.
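As a hedged illustration of one of the listed built-ins (the software itself is C++/Qt; this is only a SciPy equivalent applied to a placeholder spectrum), Savitzky-Golay smoothing of a single hypercube spectrum could look like:

import numpy as np
from scipy.signal import savgol_filter

spectrum = np.random.default_rng(0).normal(1.0, 0.05, 200)    # placeholder spectrum
smoothed = savgol_filter(spectrum, window_length=11, polyorder=3)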
Using volcanic tremor for eruption forecasting at White Island volcano (Whakaari), New Zealand
NASA Astrophysics Data System (ADS)
Chardot, Lauriane; Jolly, Arthur D.; Kennedy, Ben M.; Fournier, Nicolas; Sherburn, Steven
2015-09-01
Eruption forecasting is a challenging task because of the inherent complexity of volcanic systems. Despite remarkable efforts to develop complex models in order to explain volcanic processes prior to eruptions, the material Failure Forecast Method (FFM) is one of the very few techniques that can provide a forecast time for an eruption. However, the method requires testing and automation before being used as a real-time eruption forecasting tool at a volcano. We developed an automatic algorithm to issue forecasts from volcanic tremor increase episodes recorded by Real-time Seismic Amplitude Measurement (RSAM) at one station and optimised this algorithm for the period August 2011-January 2014 which comprises the recent unrest period at White Island volcano (Whakaari), New Zealand. A detailed residual analysis was paramount to select the most appropriate model explaining the RSAM time evolutions. In a hindsight simulation, four out of the five small eruptions reported during this period occurred within a failure window forecast by our optimised algorithm and the probability of an eruption on a day within a failure window was 0.21, which is 37 times higher than the probability of having an eruption on any day during the same period (0.0057). Moreover, the forecasts were issued prior to the eruptions by a few hours which is important from an emergency management point of view. Whereas the RSAM time evolutions preceding these four eruptions have a similar goodness-of-fit with the FFM, their spectral characteristics are different. The duration-amplitude distributions of the precursory tremor episodes support the hypothesis that several processes were likely occurring prior to these eruptions. We propose that slow rock failure and fluid flow processes are plausible candidates for the tremor source of these episodes. This hindsight exercise can be useful for future real-time implementation of the FFM at White Island. A similar methodology could also be tested at other volcanoes even if only a limited network is available.
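A common linearized form of the FFM (assumed here for illustration; the authors' optimised algorithm adds model selection and residual analysis) fits the inverse of the precursor rate against time and extrapolates to zero to obtain the forecast time:

import numpy as np

def ffm_forecast(t, rsam):
    # t: times (e.g. hours); rsam: tremor amplitude samples used as a rate proxy.
    inv = 1.0 / rsam
    slope, intercept = np.polyfit(t, inv, 1)   # straight-line fit of 1/RSAM versus time
    return -intercept / slope                  # time at which 1/RSAM reaches zero

t = np.linspace(0, 10, 50)
rsam = 100.0 / (12.0 - t)                      # synthetic accelerating tremor
print(ffm_forecast(t, rsam))                   # ~12, the built-in failure time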
DOE Office of Scientific and Technical Information (OSTI.GOV)
EMAM, M; Eldib, A; Lin, M
2014-06-01
Purpose: An in-house Monte Carlo based treatment planning system (MC TPS) has been developed for modulated electron radiation therapy (MERT). Our preliminary MERT planning experience called for a more user friendly graphical user interface. The current work aimed to design graphical windows and tools to facilitate the contouring and planning process. Methods: Our in-house GUI MC TPS is built on a set of EGS4 user codes, namely MCPLAN and MCBEAM, in addition to an in-house optimization code, which was named MCOPTIM. The patient virtual phantom is constructed using the tomographic images in DICOM format exported from clinical treatment planning systems (TPS). Treatment target volumes and critical structures were usually contoured on the clinical TPS and then sent as a structure set file. In our GUI program we developed a visualization tool to allow the planner to visualize the DICOM images and delineate the various structures. We implemented an option in our code for automatic contouring of the patient body and lungs. We also created an interface window displaying a three dimensional representation of the target and also showing a graphical representation of the treatment beams. Results: The new GUI features helped streamline the planning process. The implemented contouring option eliminated the need for performing this step on the clinical TPS. The auto detection option for contouring the outer patient body and lungs was tested on patient CTs and it was shown to be accurate as compared to that of the clinical TPS. The three dimensional representation of the target and the beams allows better selection of the gantry, collimator and couch angles. Conclusion: An in-house GUI program has been developed for more efficient MERT planning. The application of the aiding tools implemented in the program is time saving and gives better control of the planning process.
Otero, José; Palacios, Ana; Suárez, Rosario; Junco, Luis
2014-01-01
When selecting relevant inputs in modeling problems with low quality data, the ranking of the most informative inputs is also uncertain. In this paper, this issue is addressed through a new procedure that allows different crisp feature selection algorithms to be extended to vague data. The partial knowledge about the ordinal of each feature is modelled by means of a possibility distribution, and a ranking is hereby applied to sort these distributions. It will be shown that this technique makes the most use of the available information in some vague datasets. The approach is demonstrated in a real-world application. In the context of massive online computer science courses, methods are sought for automatically providing the student with a qualification through code metrics. Feature selection methods are used to find the metrics involved in the most meaningful predictions. In this study, 800 source code files, collected and revised by the authors in classroom Computer Science lectures taught between 2013 and 2014, are analyzed with the proposed technique, and the most relevant metrics for the automatic grading task are discussed. PMID:25114967
Navarro, Pedro J; Fernández-Isla, Carlos; Alcover, Pedro María; Suardíaz, Juan
2016-07-27
This paper presents a robust method for defect detection in textures, entropy-based automatic selection of the wavelet decomposition level (EADL), based on a wavelet reconstruction scheme, for detecting defects in a wide variety of structural and statistical textures. Two main features are presented. One of the new features is an original use of the normalized absolute function value (NABS) calculated from the wavelet coefficients derived at various different decomposition levels in order to identify textures where the defect can be isolated by eliminating the texture pattern in the first decomposition level. The second is the use of Shannon's entropy, calculated over detail subimages, for automatic selection of the band for image reconstruction, which, unlike other techniques, such as those based on the co-occurrence matrix or on energy calculation, provides a lower decomposition level, thus avoiding excessive degradation of the image, allowing a more accurate defect segmentation. A metric analysis of the results of the proposed method with nine different thresholding algorithms determined that selecting the appropriate thresholding method is important to achieve optimum performance in defect detection. As a consequence, several different thresholding algorithms depending on the type of texture are proposed.
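The entropy-driven level choice can be sketched with PyWavelets as follows (a simplified reading of the idea, with assumed wavelet and entropy definitions; thresholding of the reconstructed image is omitted):

import numpy as np
import pywt

def shannon_entropy(a):
    p = np.abs(a.ravel()) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def select_level(img, wavelet='db4', max_level=4):
    # coeffs = [cA_n, (cH_n, cV_n, cD_n), ..., (cH_1, cV_1, cD_1)]
    coeffs = pywt.wavedec2(img, wavelet, level=max_level)
    entropies = [np.mean([shannon_entropy(d) for d in details]) for details in coeffs[1:]]
    return int(np.argmin(entropies)) + 1   # 1 = coarsest detail level in this ordering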
Automatic Detection of Electric Power Troubles (ADEPT)
NASA Technical Reports Server (NTRS)
Wang, Caroline; Zeanah, Hugh; Anderson, Audie; Patrick, Clint; Brady, Mike; Ford, Donnie
1988-01-01
ADEPT is an expert system that integrates knowledge from three different suppliers to offer an advanced fault-detection system, and is designed for two modes of operation: real-time fault isolation and simulated modeling. Real time fault isolation of components is accomplished on a power system breadboard through the Fault Isolation Expert System (FIES II) interface with a rule system developed in-house. Faults are quickly detected and displayed and the rules and chain of reasoning optionally provided on a Laser printer. This system consists of a simulated Space Station power module using direct-current power supplies for Solar arrays on three power busses. For tests of the system's ability to locate faults inserted via switches, loads are configured by an INTEL microcomputer and the Symbolics artificial intelligence development system. As these loads are resistive in nature, Ohm's Law is used as the basis for rules by which faults are located. The three-bus system can correct faults automatically where there is a surplus of power available on any of the three busses. Techniques developed and used can be applied readily to other control systems requiring rapid intelligent decisions. Simulated modelling, used for theoretical studies, is implemented using a modified version of Kennedy Space Center's KATE (Knowledge-Based Automatic Test Equipment), FIES II windowing, and an ADEPT knowledge base. A load scheduler and a fault recovery system are currently under development to support both modes of operation.
pyGrav, a Python-based program for handling and processing relative gravity data
NASA Astrophysics Data System (ADS)
Hector, Basile; Hinderer, Jacques
2016-06-01
pyGrav is a Python-based open-source software package dedicated to the complete processing of relative-gravity data. It is particularly suited for time-lapse gravity surveys where high precision is sought. Its purpose is to bind together single-task processing codes in a user-friendly interface for handy and fast treatment of raw gravity data from the many stations of a network. The intuitive object-based implementation makes it easy to integrate additional functions (reading/writing routines, processing schemes, data plots) related to the appropriate object (a station, a loop, or a survey). This makes pyGrav an evolving tool. Raw data can be corrected for tides and air pressure effects. The data selection step features a double table-plot graphical window with either manual or automatic selection according to specific thresholds on data channels (tilts, gravity values, gravity standard deviation, duration of measurements, etc.). Instrumental drifts and gravity residuals are obtained by least squares analysis of the dataset. This first step leads to the gravity simple differences between a reference point and any point of the network. When different repetitions of the network are done, the software then computes the gravity double differences and associated errors. The program has been tested on two specific case studies: a large dataset acquired for the study of water storage changes on a small catchment in West Africa, and a dataset operated and processed by several different users for geothermal studies in northern Alsace, France. In both cases, pyGrav proved to be an efficient and easy-to-use solution for the effective processing of relative-gravity data.
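A bare-bones sketch of the drift step (an assumed simple case, not pyGrav's full least-squares network adjustment) estimates a linear instrumental drift from repeated base-station readings and removes it from the loop:

import numpy as np

base_t = np.array([0.0, 2.5, 5.0])           # hours of the base-station occupations
base_g = np.array([0.000, 0.012, 0.025])     # readings (mGal), tides already removed

drift_rate, offset = np.polyfit(base_t, base_g, 1)   # mGal per hour

def drift_corrected(t, g):
    # Remove the fitted drift from any reading taken at time t.
    return g - (offset + drift_rate * t)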
[Development of a Software for Automatically Generated Contours in Eclipse TPS].
Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin
2015-03-01
The automatic generation of planning targets and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop a software tool for automatically generated contours in Eclipse TPS. This software is named Contour Auto Margin (CAM) and is composed of contour operation functions, script-generated visualization, and script file operations. Results: Ten cases of different cancers were selected separately; in Eclipse TPS 11.0, scripts generated by the software could not only automatically generate contours but also perform contour post-processing. For the different cancers, there was no difference between automatically generated contours and manually created contours. CAM is user-friendly and powerful software that can automatically generate contours quickly in Eclipse TPS 11.0. With the help of CAM, plan preparation time is greatly reduced and the working efficiency of radiation therapy physicists is improved.
Automatic Fastening Large Structures: a New Approach
NASA Technical Reports Server (NTRS)
Lumley, D. F.
1985-01-01
The external tank (ET) intertank structure for the space shuttle, a 27.5 ft diameter, 22.5 ft long, externally stiffened, mechanically fastened skin-stringer-frame structure, was a labor-intensive manual structure built on a modified Saturn tooling position. A new approach was developed based on half-section subassemblies. The heart of this manufacturing approach will be a 33 ft high vertical automatic riveting system with a 28 ft rotary positioner coming on-line in mid 1985. The automatic riveting system incorporates many of the latest automatic riveting technologies. Key features include: vertical columns with two sets of independently operating CNC drill-riveting heads; the capability to drill, insert and upset any one-piece fastener up to 3/8 inch diameter, including slugs, without displacing the workpiece; an offset bucking ram with programmable rotation and deep retraction; a vision system for automatic parts program re-synchronization and part edge margin control; and an automatic rivet selection/handling system.
DELINEATING SUBTYPES OF SELF-INJURIOUS BEHAVIOR MAINTAINED BY AUTOMATIC REINFORCEMENT
Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.
2016-01-01
Self-injurious behavior (SIB) is maintained by automatic reinforcement in roughly 25% of cases. Automatically reinforced SIB typically has been considered a single functional category, and is less understood than socially reinforced SIB. Subtyping automatically reinforced SIB into functional categories has the potential to guide the development of more targeted interventions and increase our understanding of its biological underpinnings. The current study involved an analysis of 39 individuals with automatically reinforced SIB and a comparison group of 13 individuals with socially reinforced SIB. Automatically reinforced SIB was categorized into 3 subtypes based on patterns of responding in the functional analysis and the presence of self-restraint. These response features were selected as the basis for subtyping on the premise that they could reflect functional properties of SIB unique to each subtype. Analysis of treatment data revealed important differences across subtypes and provides preliminary support to warrant additional research on this proposed subtyping model. PMID:26223959
Patrick, Regan E; Rastogi, Anuj; Christensen, Bruce K
2015-01-01
Adaptive emotional responding relies on dual automatic and effortful processing streams. Dual-stream models of schizophrenia (SCZ) posit a selective deficit in neural circuits that govern goal-directed, effortful processes versus reactive, automatic processes. This imbalance suggests that when patients are confronted with competing automatic and effortful emotional response cues, they will exhibit diminished effortful responding and intact, possibly elevated, automatic responding compared to controls. This prediction was evaluated using a modified version of the face-vignette task (FVT). Participants viewed emotional faces (automatic response cue) paired with vignettes (effortful response cue) that signalled a different emotion category and were instructed to discriminate the manifest emotion. Patients made less vignette and more face responses than controls. However, the relationship between group and FVT responding was moderated by IQ and reading comprehension ability. These results replicate and extend previous research and provide tentative support for abnormal conflict resolution between automatic and effortful emotional processing predicted by dual-stream models of SCZ.
Rausch, Alexander M; Küng, Vera E; Pobel, Christoph; Markl, Matthias; Körner, Carolin
2017-09-22
The resulting properties of parts fabricated by powder bed fusion additive manufacturing processes are determined by their porosity, local composition, and microstructure. The objective of this work is to examine the influence of the stochastic powder bed on the process window for dense parts by means of numerical simulation. The investigations demonstrate the unique capability of simulating macroscopic domains in the range of millimeters with a mesoscopic approach, which resolves the powder bed and the hydrodynamics of the melt pool. A simulated process window reveals the influence of the stochastic powder layer. The numerical results are verified with an experimental process window for selective electron beam-melted Ti-6Al-4V. Furthermore, the influence of the powder bulk density is investigated numerically. The simulations predict an increase in porosity and surface roughness for samples produced with lower powder bulk densities. Due to its higher probability for unfavorable powder arrangements, the process stability is also decreased. This shrinks the actual parameter range in a process window for producing dense parts.
Continuation of research into software for space operations support, volume 1
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
A prototype workstation executive called the Hardware Independent Software Development Environment (HISDE) was developed. Software technologies relevant to workstation executives were researched and evaluated, and HISDE was used as a test bed for prototyping efforts. New X Windows software concepts and technology were introduced into workstation executives and related applications. The four research efforts performed included: (1) Research into the usability and efficiency of Motif (an X Windows based graphical user interface), which consisted of converting the existing Athena widget based HISDE user interface to Motif, demonstrating the usability of Motif and providing insight into the level of effort required to translate an application from one widget set to another; (2) Prototyping a real-time data display widget, which consisted of researching methods for, and prototyping the selected method of, displaying textual values in an efficient manner; (3) An X Windows performance evaluation, which consisted of a series of performance measurements that demonstrated the ability of low-level X Windows to display textual information; (4) Converting the Display Manager, the application used by NASA for data display during operational mode, to X Window/Motif.
Wen, Wenhui; Wang, Yuxin; Liu, Hongji; Wang, Kai; Qiu, Ping; Wang, Ke
2018-01-01
One benefit of excitation at the 1700-nm window is the more accessible modalities of multiphoton signal generation. It is demonstrated here that the transmittance performance of the objective lens is of vital importance for efficient higher-order multiphoton signal generation and collection excited at the 1700-nm window. Two commonly used objective lenses for multiphoton microscopy (MPM) are characterized and compared, one with a regular coating and the other with a customized coating for high transmittance at the 1700-nm window. Our results show that fourth harmonic generation imaging of mouse tail tendon and 5-photon fluorescence of carbon quantum dots using the regular objective lens show an order of magnitude higher signal than those using the customized objective lens. Besides, the regular objective lens also enables a 3-photon fluorescence imaging depth of >1600 μm in mouse brain in vivo. Our results will provide guidelines for objective lens selection for MPM at the 1700-nm window. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
1988-04-01
e.g., definitions, references, pictures) on the selected item in a separate window. For example, in a hypertext document on astronomy, the reader...might arrive at the highlighted word "Copernicus", select the word with the keyboard or mouse, and then be offered a number of related topics from
NASA Astrophysics Data System (ADS)
Sagnotti, Leonardo
2013-04-01
Modern rock magnetometers and stepwise demagnetization procedures result in the production of large datasets, which need versatile and fast software for their display and analysis. Various software packages for paleomagnetic analyses have been developed recently to overcome the problems linked to the limited capability and the loss of operability of early codes written in obsolete computer languages and/or for platforms not compatible with modern 64 bit processors. The Demagnetization Analysis in Excel (DAIE) workbook is a new software tool designed to make the analysis of demagnetization data easy and accessible in an application (Microsoft Excel) that is widely diffused and available on both the Microsoft Windows and Mac OS X operating systems. The widespread diffusion of Excel should guarantee a long working life, since the compatibility and functionality of current Excel files will most likely be maintained during the development of new processors and operating systems. DAIE is designed for viewing and analyzing stepwise demagnetization data of both discrete and u-channel samples. DAIE consists of a single file and has an open modular structure organized in 10 distinct worksheets. The standard demagnetization diagrams and various parameters of common use are shown on the same worksheet, including selectable parameters and user's choices. The remanence characteristic components may be computed by principal component analysis (PCA) on a selected interval of demagnetization steps. Saving of the PCA data can be done either sample by sample, or automatically by applying the selected choices to all the samples included in the file. The DAIE open structure allows easy personalization, development and improvement. The workbook has the following features which may be valuable for various users: - Operability on nearly all computers and platforms; - Easy input of demagnetization data by "copy and paste" from ASCII files; - Easy export of computed parameters and demagnetization plots; - Complete control of the whole workflow and the possibility of implementation of the workbook by any user; - Modular structure in distinct worksheets for each type of analysis and plot, in order to make implementation and personalization easier; - Opportunity to use the workbook for educational purposes, since all the computations and analyses are easily traceable and accessible; - Automatic and fast analysis of large batches of demagnetization data, such as those measured on u-channel samples. The DAIE workbook and the "User manual" are available for download on a dedicated web site (http://roma2.rm.ingv.it/en/facilities/software/49/daie).
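The PCA step can be illustrated with a Kirschvink-style line fit (assumed here for illustration; DAIE itself implements this inside Excel), where the characteristic direction is the principal axis of the selected demagnetization steps:

import numpy as np

def pca_direction(xyz, anchor_origin=False):
    # xyz: (n_steps, 3) remanence vectors for the chosen demagnetization interval.
    pts = xyz if anchor_origin else xyz - xyz.mean(axis=0)
    _, s, vt = np.linalg.svd(pts, full_matrices=False)
    direction = vt[0]                                   # principal (characteristic) axis
    mad = np.degrees(np.arctan(np.sqrt(s[1]**2 + s[2]**2) / s[0]))  # maximum angular deviation
    return direction, mad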
A simulator evaluation of an automatic terminal approach system
NASA Technical Reports Server (NTRS)
Hinton, D. A.
1983-01-01
The automatic terminal approach system (ATAS) is a concept for improving the pilot/machine interface with cockpit automation. The ATAS can automatically fly a published instrument approach by using stored instrument approach data to tune the airplane's avionics, control its autopilot, and display status information to the pilot. A piloted simulation study was conducted to determine the feasibility of an ATAS, determine pilot acceptance, and examine pilot/ATAS interaction. Seven instrument-rated pilots each flew four instrument approaches with a baseline heading-select autopilot mode. The ATAS runs resulted in lower flight technical error, lower pilot workload, and fewer blunders than the baseline autopilot. The ATAS status display enabled the pilots to maintain situational awareness during the automatic approaches. The system was well accepted by the pilots.
ADMAP (automatic data manipulation program)
NASA Technical Reports Server (NTRS)
Mann, F. I.
1971-01-01
Instructions are presented on the use of ADMAP (automatic data manipulation program), an aerospace data manipulation computer program. The program was developed to aid in processing, reducing, plotting, and publishing electric propulsion trajectory data generated by the low-thrust optimization program HILTOP. The program has the option of generating SC4020 electric plots and therefore requires the SC4020 routines to be available at execution time (even if not used). Several general routines are present, including a cubic spline interpolation routine, an electric plotter dash-line drawing routine, and single-parameter and double-parameter sorting routines. Many routines are tailored to the manipulation and plotting of electric propulsion data, including an automatic scale selection routine, an automatic curve labelling routine, and an automatic graph titling routine. Data are accepted from either punched cards or magnetic tape.
Tie Points Extraction for SAR Images Based on Differential Constraints
NASA Astrophysics Data System (ADS)
Xiong, X.; Jin, G.; Xu, Q.; Zhang, H.
2018-04-01
Automatically extracting tie points (TPs) from large-size synthetic aperture radar (SAR) images is still challenging because the efficiency and correct ratio of the image matching need to be improved. This paper proposes an automatic TP extraction method based on differential constraints for large-size SAR images obtained from approximately parallel tracks, between which the relative geometric distortions are small in the azimuth direction and large in the range direction. Image pyramids are built first, and then the corresponding layers of the pyramids are matched from top to bottom. In this process, similarity is measured by the normalized cross correlation (NCC) algorithm, computed over a rectangular window whose long side is parallel to the azimuth direction. False matches are removed by the differential constrained random sample consensus (DC-RANSAC) algorithm, which applies strong constraints in the azimuth direction and weak constraints in the range direction. Matching points in the lower pyramid images are predicted with a local bilinear transformation model in the range direction. Experiments performed on ENVISAT ASAR and Chinese airborne SAR images validated the efficiency, correct ratio and accuracy of the proposed method.
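To make the matching step concrete, the hedged Python sketch below computes NCC over a rectangular window whose long side lies along the azimuth (row) direction, as described above; the window half-sizes, search radius, and function names are expository assumptions, not values from the paper.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_point(ref, search, r, c, half_az=32, half_rg=8, radius=16):
    """Best match for ref[r, c] in `search`, comparing rectangular
    windows elongated along azimuth (rows)."""
    tpl = ref[r - half_az:r + half_az + 1, c - half_rg:c + half_rg + 1]
    best_score, best_rc = -1.0, (r, c)
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            rr, cc = r + dr, c + dc
            cand = search[rr - half_az:rr + half_az + 1,
                          cc - half_rg:cc + half_rg + 1]
            if cand.shape != tpl.shape:
                continue                      # window fell off the image
            score = ncc(tpl, cand)
            if score > best_score:
                best_score, best_rc = score, (rr, cc)
    return best_rc, best_score
```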
Ligand.Info small-molecule Meta-Database.
von Grotthuss, Marcin; Koczyk, Grzegorz; Pas, Jakub; Wyrwicz, Lucjan S; Rychlewski, Leszek
2004-12-01
Ligand.Info is a compilation of various publicly available databases of small molecules. The total size of the Meta-Database is over 1 million entries. The compound records contain calculated three-dimensional coordinates and sometimes information about biological activity. Some molecules carry information about FDA drug-approval status or anti-HIV activity. The Meta-Database can be downloaded from the http://Ligand.Info web page. The database can also be screened using a Java-based tool. The tool can interactively cluster sets of molecules on the user side and automatically download similar molecules from the server. The application requires the Java Runtime Environment 1.4 or higher, which can be automatically downloaded from Sun Microsystems or Apple Computer and installed during the first use of Ligand.Info on desktop systems that support Java (MS Windows, Mac OS, Solaris, and Linux). The Ligand.Info Meta-Database can be used for virtual high-throughput screening of new potential drugs. The presented examples show that, using a known antiviral drug as a query, the system was able to find other antiviral drugs and inhibitors.
Automatic Censoring CFAR Detector Based on Ordered Data Difference for Low-Flying Helicopter Safety
Jiang, Wen; Huang, Yulin; Yang, Jianyu
2016-01-01
Being equipped with a millimeter-wave radar allows a low-flying helicopter to sense its surroundings in real time, which significantly increases its safety. However, nonhomogeneous clutter environments, such as multiple-target situations and clutter edges, can dramatically degrade radar signal detection performance. In order to improve detection performance in nonhomogeneous clutter environments, this paper proposes a new automatic censored cell-averaging CFAR detector. The proposed detector does not require any prior information about the background environment; it uses a hypothesis test on the first-order difference (FOD) of the ordered data to reject unwanted samples in the reference window. After censoring the unwanted ranked cells, the remaining samples are combined to form an estimate of the background power level, yielding better radar signal detection performance. Simulation results show that the FOD-CFAR detector provides low-loss CFAR performance in a homogeneous environment and also performs robustly in nonhomogeneous environments. Furthermore, measured results from a low-flying helicopter validate the basic performance of the proposed method. PMID:27399714
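The Python sketch below illustrates the general idea of censoring by ordered-data differences before cell averaging; the paper's actual hypothesis test and threshold settings are more elaborate, so the simple median-based jump rule here is an assumption for illustration only.

```python
import numpy as np

def fod_cfar(ref_cells, cut, alpha, fod_factor=3.0):
    """Cell-averaging CFAR with automatic censoring driven by the
    first-order difference (FOD) of the ordered reference data.

    ref_cells : power samples from the reference window (CUT excluded)
    cut       : power of the cell under test
    alpha     : threshold multiplier set by the desired false-alarm rate
    fod_factor: jump size, relative to the median FOD, treated as the
                onset of interfering targets or a clutter edge (assumed)
    """
    ordered = np.sort(ref_cells)
    fod = np.diff(ordered)                   # first-order differences
    jumps = np.nonzero(fod > fod_factor * np.median(fod))[0]
    kept = ordered[:jumps[0] + 1] if jumps.size else ordered
    noise = kept.mean()                      # censored power estimate
    return cut > alpha * noise               # detection decision
```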
NASA Astrophysics Data System (ADS)
Hibino, Daisuke; Hsu, Mingyi; Shindo, Hiroyuki; Izawa, Masayuki; Enomoto, Yuji; Lin, J. F.; Hu, J. R.
2013-04-01
The impact on yield loss of systematic defects that remain after Optical Proximity Correction (OPC) modeling has increased, and achieving an acceptable yield has become more difficult in production at leading technology nodes beyond 20 nm. Furthermore, the process window has become narrow because of the complexity of IC designs and reduced process margins. In the past, systematic defects were inspected by human eyes. However, judgment by human eyes is sometimes unstable and inaccurate, and an enormous amount of time and labor must be expended on one-by-one judgment of several thousand hot-spot defects. To overcome these difficulties and improve yield and manufacturability, an automated system that can quantify shape differences with high accuracy and speed is needed; if such a system meets this goal, the number of inspection points can be increased to obtain higher yield. The Defect Window Analysis (DWA) system developed by Hitachi High-Technologies, which uses high-precision contour extraction from SEM images of real silicon and a quantification method that automatically calculates the difference between defect and non-defect patterns, has been applied to defect judgment in place of judgment by human eyes. The DWA result, which describes process behavior, can be fed back to design, OPC, or mask. This new methodology and its evaluation results are presented in detail in this paper.
Capitani, Paolo; Cerri, Matteo; Amici, Roberto; Baracchi, Francesca; Jones, Christine Ann; Luppi, Marco; Perez, Emanuele; Parmeggiani, Pier Luigi; Zamboni, Giovanni
A shift of physiological regulation from a homeostatic to a non-homeostatic modality characterizes the passage from non-REM sleep (NREMS) to REM sleep (REMS). In the rat, an EEG index which allows the automatic scoring of transitions from NREMS to REMS has been proposed: the NREMS-to-REMS transition indicator value, NIV [J.H. Benington et al., Sleep 17 (1994) 28-36]. However, such transitions are not always followed by a REMS episode; they are often followed by an awakening. In the present study, the relationship between changes in EEG activity and hypothalamic temperature (Thy), taken as an index of autonomic activity, was studied within a window consisting of the 60 s preceding a state change from a consolidated NREMS episode. Furthermore, the probability that a transition would lead to REMS or to wakefulness was analysed. The results showed that, within this time window, both a modified NIV (NIV(60)) and the difference between Thy at the limits of the window (Thy(D)) were related to the probability of REMS onset. For both indices, the relationship with the probability of REMS onset was sigmoid, saturating at a probability level of around 50-60%. The efficacy of Thy(D) as an index for predicting successful transitions from NREMS to REMS supports the view that such a transition is a dynamic process in which the physiological risk of entering REMS is weighed at a central level.
Temporally rendered automatic cloud extraction (TRACE) system
NASA Astrophysics Data System (ADS)
Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.
1999-10-01
Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce the time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and the 3D fast Fourier transform as its primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE maximum flexibility in its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with the manual method is included in this paper.
Automatic vibration mode selection and excitation; combining modal filtering with autoresonance
NASA Astrophysics Data System (ADS)
Davis, Solomon; Bucher, Izhak
2018-02-01
Autoresonance is a well-known nonlinear feedback method for automatically exciting a system at its natural frequency. Though highly effective for single-degree-of-freedom systems, in its simplest form it lacks a mechanism for choosing the mode of excitation when more than one is present. In this case a single mode will be automatically excited, but that mode cannot be chosen or changed. In this paper a new method for automatically exciting a general second-order system at any desired natural frequency using autoresonance is proposed. The article begins by deriving a concise expression for the frequency of the limit cycle induced by an autoresonance feedback loop enclosing the system. The expression is based on modal decomposition and provides valuable insight into the behavior of a system controlled in this way. From this expression, a method for selecting and exciting a desired mode follows naturally by combining autoresonance with modal filtering: by taking appropriate linear combinations of the sensor signals, one can exploit orthogonality to effectively "filter out" all unwanted modes. The desired mode's natural frequency is then automatically reflected in the limit cycle. In experiments the technique has proven extremely robust, even when the amplitude of the desired mode is significantly smaller than that of the others and the modal filters are substantially inaccurate.
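A minimal sketch of the modal-filtering idea, assuming estimates of the mode shapes sampled at the sensor locations are available: the pseudoinverse of the mode-shape matrix yields sensor weights that, by orthogonality, suppress every mode except the selected one. All names and numbers below are illustrative.

```python
import numpy as np

def modal_filter_weights(Phi, mode_index):
    """Phi: (n_sensors, n_modes) matrix of estimated mode shapes.
    Returns sensor weights extracting the chosen modal coordinate."""
    W = np.linalg.pinv(Phi)          # (n_modes, n_sensors)
    return W[mode_index]

# Example: 4 sensors, 3 assumed mode shapes
Phi = np.array([[1.0,  1.0,  1.0],
                [0.8,  0.2, -0.9],
                [0.5, -0.6,  0.4],
                [0.2, -1.0, -0.3]])
w = modal_filter_weights(Phi, mode_index=1)
# filtered = w @ sensor_signals  ->  approximately the 2nd modal
# coordinate, which is then fed back through the autoresonance loop
```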
Multi-alternative decision-making with non-stationary inputs.
Nunes, Luana F; Gurney, Kevin
2016-08-01
One of the most widely implemented models for multi-alternative decision-making is the multihypothesis sequential probability ratio test (MSPRT). It is asymptotically optimal, straightforward to implement, and has found application in modelling biological decision-making. However, the MSPRT is limited in application to discrete ('trial-based'), non-time-varying scenarios. By contrast, real-world situations are continuous and entail stimulus non-stationarity. In these circumstances, decision-making mechanisms (like the MSPRT) which work by accumulating evidence must be able to discard outdated evidence as it becomes progressively irrelevant. To address this issue, we introduce a new decision mechanism by augmenting the MSPRT with a rectangular integration window and a transparent decision boundary. This allows selection and de-selection of options as their evidence changes dynamically. Performance was enhanced by adapting the window size to problem difficulty. Further, we present an alternative windowing method which exponentially decays evidence; it does not significantly degrade performance while greatly reducing the memory resources required. The methods presented have proven successful in allowing the MSPRT algorithm to function in a non-stationary environment.
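As a rough illustration of the exponential-decay variant, the Python sketch below leaks the accumulated log-evidence by a constant factor each step and commits when the posterior over alternatives is sufficiently peaked; the decay factor, commit probability, and interface are assumptions, not the paper's parameterization.

```python
import numpy as np

def leaky_msprt(loglik_stream, n_alt, decay=0.95, p_commit=0.95):
    """MSPRT-style choice with exponentially decayed evidence, so that
    outdated observations fade under non-stationary inputs.

    loglik_stream: yields, per time step, a length-n_alt array of
                   log-likelihoods of the newest observation.
    decay:         forgetting factor; 1.0 recovers plain accumulation.
    p_commit:      posterior probability required to commit.
    """
    y = np.zeros(n_alt)                     # decayed evidence totals
    t = -1
    for t, loglik in enumerate(loglik_stream):
        y = decay * y + loglik              # leak old evidence, add new
        post = np.exp(y - y.max())
        post /= post.sum()                  # posterior over alternatives
        if post.max() >= p_commit:
            return int(post.argmax()), t    # chosen option, decision time
    return None, t                          # still undecided
```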
NASA Astrophysics Data System (ADS)
Hwang, Taejin; Kim, Yong Nam; Kim, Soo Kon; Kang, Sei-Kwon; Cheong, Kwang-Ho; Park, Soah; Yoon, Jai-Woong; Han, Taejin; Kim, Haeyoung; Lee, Meyeon; Kim, Kyoung-Joo; Bae, Hoonsik; Suh, Tae-Suk
2015-06-01
The dose constraint during prostate intensity-modulated radiation therapy (IMRT) optimization should be patient-specific for better rectum sparing. The aims of this study are to suggest a novel method for automatically generating a patient-specific dose constraint by using an experience-based dose volume histogram (DVH) of the rectum and to evaluate the potential of such a dose constraint qualitatively. The normal tissue complication probabilities (NTCPs) of the rectum with respect to V%ratio in our study were divided into three groups, where V%ratio is defined as the percent ratio of the rectal volume overlapping the planning target volume (PTV) to the rectal volume: (1) the rectal NTCPs of the previous study (clinical data), (2) those statistically generated by using the standard normal distribution (calculated data), and (3) those generated by combining the calculated data and the clinical data (mixed data). For the calculated data, a random number whose mean value lay on the curve fitted to the clinical data and whose standard deviation was 1% was generated using the 'randn' function in MATLAB. For each group, we validated whether the probability density function (PDF) of the rectal NTCP could be automatically generated with Gaussian kernel density estimation. The results revealed that the rectal NTCP increased in proportion to V%ratio, that the predictive rectal NTCP was patient-specific, and that the starting point of IMRT optimization for a given patient might differ accordingly. The PDF of the rectal NTCP was obtained automatically for each group, with the smoothness of the probability distribution increasing with the number of data points and with the window width. We showed that during prostate IMRT optimization, patient-specific dose constraints can be generated automatically and that our method can reduce the IMRT optimization time while maintaining IMRT plan quality.
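The two statistical steps described above can be sketched in a few lines of Python (the study itself used MATLAB); the fitted mean value, sample count, and seed below are placeholders, and scipy's automatically chosen kernel bandwidth stands in for the window width discussed in the text.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Step 1: "calculated data" -- random NTCPs whose mean sits on the
# curve fitted to the clinical data, with a 1% standard deviation.
mean_ntcp_on_curve = 8.0                      # % ; illustrative value
calculated = mean_ntcp_on_curve + 1.0 * rng.standard_normal(200)

# Step 2: automatic PDF of the rectal NTCP by Gaussian kernel
# density estimation (bandwidth chosen automatically here).
pdf = gaussian_kde(calculated)
grid = np.linspace(calculated.min(), calculated.max(), 100)
density = pdf(grid)
```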
ERIC Educational Resources Information Center
Woodman, Geoffrey F.; Luck, Steven J.
2007-01-01
In many theories of cognition, researchers propose that working memory and perception operate interactively. For example, in previous studies researchers have suggested that sensory inputs matching the contents of working memory will have an automatic advantage in the competition for processing resources. The authors tested this hypothesis by…
A Neurobiological Theory of Automaticity in Perceptual Categorization
ERIC Educational Resources Information Center
Ashby, F. Gregory; Ennis, John M.; Spiering, Brian J.
2007-01-01
A biologically detailed computational model is described of how categorization judgments become automatic in tasks that depend on procedural learning. The model assumes 2 neural pathways from sensory association cortex to the premotor area that mediates response selection. A longer and slower path projects to the premotor area via the striatum,…
Study of the Acquisition of Peripheral Equipment for Use with Automatic Data Processing Systems.
ERIC Educational Resources Information Center
Comptroller General of the U.S., Washington, DC.
The General Accounting Office (GAO) performed this study because: preliminary indications showed that significant savings could be achieved in the procurement of selected computer components; the Federal Government is investing increasing amounts of money in Automatic Data Processing (ADP) equipment; and there is a widespread congressional…
ERIC Educational Resources Information Center
Army Ordnance Center and School, Aberdeen Proving Ground, MD.
These two texts and student workbook for a secondary/postsecondary-level correspondence course in automatic data processing comprise one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. The purpose stated for the individualized, self-paced…
FuGeF: A Resource Bound Secure Forwarding Protocol for Wireless Sensor Networks
Umar, Idris Abubakar; Mohd Hanapi, Zurina; Sali, A.; Zulkarnain, Zuriati A.
2016-01-01
Resource bound security solutions have facilitated the mitigation of spatio-temporal attacks by altering protocol semantics to provide minimal security while maintaining an acceptable level of performance. The Dynamic Window Secured Implicit Geographic Forwarding (DWSIGF) routing protocol for Wireless Sensor Networks (WSN) has been proposed to achieve minimal selection of malicious nodes by introducing a dynamic collection window period into the protocol's semantics. However, its selection scheme suffers substantial packet losses due to the use of a single distance-based parameter for node selection. In this paper, we propose a Fuzzy-based Geographic Forwarding protocol (FuGeF) to minimize packet loss while maintaining performance. FuGeF utilizes a new form of dynamism and introduces three selection parameters: remaining energy, connectivity cost, and progressive distance, as well as a Fuzzy Logic System (FLS) for node selection. These mechanisms ensure the selection of a non-malicious node. Extensive simulation experiments have been conducted to evaluate the performance of the proposed FuGeF protocol against DWSIGF variants. The simulation results show that the proposed FuGeF outperforms the two DWSIGF variants (DWSIGF-P and DWSIGF-R) in terms of packet delivery. PMID:27338411
Tencer, Michal; Berini, Pierre
2008-11-04
We describe a method for the selective desorption of thiol self-assembled monolayers from gold surfaces having micrometer-scale separations on a substrate. In an electrolyte solution, the electrical resistance between the adjacent areas can be much lower than the resistance between a surface and the counter electrode. Also, both reductive and oxidative thiol desorption may occur. Therefore, the potentials of the surfaces must be independently controlled with a multichannel potentiostat and operating windows for a given thiol/electrolyte system must be established. In this study operating windows were established for 1-dodecanethiol-based SAMs in phosphate buffer, phosphate-buffered saline, and sodium hydroxide solution, and selective SAM removal was successfully performed in a four-electrode configuration.
Wang, Bei; Wang, Xingyu; Ikeda, Akio; Nagamine, Takashi; Shibasaki, Hiroshi; Nakamura, Masatoshi
2014-01-01
EEG (electroencephalograph) interpretation is important for the diagnosis of neurological disorders. Proper adjustment of the montage can highlight the EEG rhythm of interest and avoid false interpretation. The aim of this study was to develop an automatic reference selection method to identify a suitable reference. The results may contribute to accurate inspection of the distribution of EEG rhythms for quantitative EEG interpretation. The method includes two pre-judgements and one iterative detection module. The diffuse case is identified by pre-judgement 1 when intermittent rhythmic waveforms occur over large areas of the scalp. Either the earlobe reference or the averaged reference is then adopted for the diffuse case, depending on pre-judgement 2, which evaluates the effect of the earlobe reference. An iterative detection algorithm is developed for the localised case, when the signal is distributed over a small area of the brain. A suitable averaged reference is finally determined based on the detected focal and distributed electrodes. The presented technique was applied to the pathological EEG recordings of nine patients. One example of the diffuse case is introduced, illustrating the results of the pre-judgements: the diffusely intermittent rhythmic slow wave is identified and the effect of an active earlobe reference is analysed. Two examples of the localised case are presented, showing the results of the iterative detection module: the focal and distributed electrodes are detected automatically over the iterations of the algorithm. The identification of diffuse and localised activity was satisfactory compared with visual inspection. The EEG rhythm of interest can be highlighted using a suitably selected reference. The implementation of an automatic reference selection method helps detect the distribution of an EEG rhythm, which can improve the accuracy of EEG interpretation during both visual inspection and automatic interpretation. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz
2017-01-01
To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required; thresholds can be determined visually or automatically. Visual determination is influenced by the operator's visual acuity, whereas the automatic method is performed entirely by computer algorithms. The aims were to compare visual and automatic segmentation and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between the visual and automatic segmentation methods regarding root canal volume (p=0.93) and root canal surface area (p=0.79). Although both visual and automatic segmentation methods can be used to determine the threshold and calculate root canal volume and surface area, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dise, J; Liang, X; Lin, L
Purpose: To evaluate an automatic interstitial catheter digitization algorithm that reduces treatment planning time and provides a means for adaptive re-planning in HDR brachytherapy of gynecologic cancers. Methods: The semi-automatic catheter digitization tool utilizes a region-growing algorithm in conjunction with a spline model of the catheters. The CT images were first pre-processed to enhance the contrast between the catheters and soft tissue. Several seed locations were selected in each catheter for the region-growing algorithm. The spline model of the catheters assisted the region growing by preventing inter-catheter cross-over caused by air or metal artifacts. Source dwell positions from day-one CT scans were applied to subsequent CTs and forward-calculated using the automatically digitized catheter positions. This method was applied to 10 patients who had received HDR interstitial brachytherapy on an IRB-approved image-guided radiation therapy protocol. The prescribed dose was 18.75 or 20 Gy delivered in 5 fractions, twice daily, over 3 consecutive days. Dosimetric comparisons were made between automatic and manual digitization on day-two CTs. Results: The region-growing algorithm, assisted by the spline model of the catheters, was able to digitize all catheters. The difference between automatically and manually digitized positions was 0.8±0.3 mm. The digitization time ranged from 34 to 43 minutes, with a mean of 37 minutes; the bulk of the time was spent on manual selection of initial seed positions and spline parameter adjustments. There was no significant difference in dosimetric parameters between the automatically and manually digitized plans: D90% to the CTV was 91.5±4.4% for manual digitization versus 91.4±4.4% for automatic digitization (p=0.56). Conclusion: A region-growing algorithm was developed to semi-automatically digitize interstitial catheters in HDR brachytherapy using the Syed-Neblett template. This automatic digitization tool was shown to be accurate compared to manual digitization.
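A bare-bones sketch of the region-growing core described above, assuming a 3-D CT volume and operator-chosen seed voxels; the spline-model constraint that prevents inter-catheter cross-over is deliberately omitted, and the intensity bounds are illustrative.

```python
import numpy as np
from collections import deque

def grow_catheter(volume, seeds, low, high):
    """Flood-fill a catheter from seed voxels across 6-connected
    neighbours whose intensity lies within [low, high]."""
    grown = np.zeros(volume.shape, dtype=bool)
    queue = deque(seeds)                     # e.g. [(z, y, x), ...]
    while queue:
        z, y, x = queue.popleft()
        if grown[z, y, x] or not (low <= volume[z, y, x] <= high):
            continue
        grown[z, y, x] = True
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2] and not grown[nz, ny, nx]):
                queue.append((nz, ny, nx))
    return grown
```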
Grammar-Supported 3d Indoor Reconstruction from Point Clouds for As-Built Bim
NASA Astrophysics Data System (ADS)
Becker, S.; Peter, M.; Fritsch, D.
2015-03-01
The paper presents a grammar-based approach for the robust automatic reconstruction of 3D interiors from raw point clouds. The core of the approach is a 3D indoor grammar which extends our previously published grammar concept for the modeling of 2D floor plans. The grammar allows for the modeling of buildings whose horizontal, continuous floors are traversed by hallways providing access to the rooms, as is the case for most office buildings and public buildings such as schools, hospitals, and hotels. The grammar is designed in such a way that it can be embedded in an iterative automatic learning process, providing a seamless transition from LOD3 to LOD4 building models. Starting from an initial low-level grammar, automatically derived from the window representations of an available LOD3 building model, hypotheses about indoor geometries can be generated. The hypothesized indoor geometries are checked against observation data - here 3D point clouds - collected in the interior of the building. The verified and accepted geometries form the basis for an automatic update of the initial grammar. In this way, the knowledge content of the initial grammar is enriched, leading to a grammar of increased quality. This higher-level grammar can then be applied to predict realistic geometries for building parts where only sparse observation data are available. Thus, our approach allows for the robust generation of complete 3D indoor models whose quality can be improved continuously as new observation data are fed into the grammar-based reconstruction process. The feasibility of our approach is demonstrated on a real-world example.
Automatic segmentation and supervised learning-based selection of nuclei in cancer tissue images.
Nandy, Kaustav; Gudla, Prabhakar R; Amundsen, Ryan; Meaburn, Karen J; Misteli, Tom; Lockett, Stephen J
2012-09-01
Analysis of preferential localization of certain genes within the cell nuclei is emerging as a new technique for the diagnosis of breast cancer. Quantitation requires accurate segmentation of 100-200 cell nuclei in each tissue section to draw a statistically significant result. Thus, for large-scale analysis, manual processing is too time consuming and subjective. Fortuitously, acquired images generally contain many more nuclei than are needed for analysis. Therefore, we developed an integrated workflow that selects, following automatic segmentation, a subpopulation of accurately delineated nuclei for positioning of fluorescence in situ hybridization-labeled genes of interest. Segmentation was performed by a multistage watershed-based algorithm and screening by an artificial neural network-based pattern recognition engine. The performance of the workflow was quantified in terms of the fraction of automatically selected nuclei that were visually confirmed as well segmented and by the boundary accuracy of the well-segmented nuclei relative to a 2D dynamic programming-based reference segmentation method. Application of the method was demonstrated for discriminating normal and cancerous breast tissue sections based on the differential positioning of the HES5 gene. Automatic results agreed with manual analysis in 11 out of 14 cancers, all four normal cases, and all five noncancerous breast disease cases, thus showing the accuracy and robustness of the proposed approach. Published 2012 Wiley Periodicals, Inc.
Urschler, Martin; Grassegger, Sabine; Štern, Darko
2015-01-01
Age estimation of individuals is important in human biology and has various medical and forensic applications. Recent interest in MR-based methods aims to investigate alternatives to established methods involving ionising radiation; automatic, software-based methods additionally promise improved estimation objectivity. The aim was to investigate how informative automatically selected image features are with regard to their ability to discriminate age, by exploring a recently proposed software-based age estimation method for MR images of the left hand and wrist. One hundred and two MR datasets of left hand images were used to evaluate age estimation performance; the method consists of bone and epiphyseal gap volume localisation, computation of one age regression model per bone mapping image features to age, and fusion of the individual bone age predictions into a final age estimate. Quantitatively, the software-based method achieves a mean absolute difference from chronological age of 0.85 years (SD = 0.58 years), as determined by a cross-validation experiment. Qualitatively, it is demonstrated how feature selection works and which image features of skeletal maturation are automatically chosen to model the non-linear regression function. The feasibility of automatic age estimation based on MRI data is shown, and the selected image features are found to be informative for describing anatomical changes during physical maturation in male adolescents.
Interactive vs. automatic ultrasound image segmentation methods for staging hepatic lipidosis.
Weijers, Gert; Starke, Alexander; Haudum, Alois; Thijssen, Johan M; Rehage, Jürgen; De Korte, Chris L
2010-07-01
The aim of this study was to test the hypothesis that automatic segmentation of vessels in ultrasound (US) images can produce similar or better results in grading fatty livers than interactive segmentation. A study was performed in postpartum dairy cows (N=151), as an animal model of human fatty liver disease, to test this hypothesis. Five transcutaneous and five intraoperative US liver images were acquired from each animal, and a liver biopsy was taken. In the liver tissue samples, triacylglycerol (TAG) was measured by biochemical analysis, and hepatic diseases other than hepatic lipidosis were excluded by histopathologic examination. Ultrasonic tissue characterization (UTC) parameters--mean echo level, standard deviation (SD) of echo level, signal-to-noise ratio (SNR), residual attenuation coefficient (ResAtt), and axial and lateral speckle size--were derived using a computer-aided US (CAUS) protocol and software package. First, the liver tissue was interactively segmented by two observers. With increasing fat content, fewer hepatic vessels were visible in the ultrasound images and, therefore, a smaller proportion of the liver needed to be excluded from these images. Automatic segmentation algorithms were then implemented to investigate whether better results could be achieved than with the subjective and time-consuming interactive segmentation procedure. The automatic segmentation algorithms were based on both fixed and adaptive thresholding techniques in combination with a 'speckle'-shaped moving-window exclusion technique. All data were analyzed with and without the postprocessing contained in CAUS and with the different automated segmentation techniques. This enabled us to study the effect of the applied postprocessing steps on single and multiple linear regressions of the various UTC parameters with TAG. Improved correlations for all US parameters were found when using automatic segmentation techniques. Stepwise multiple linear regression formulas were derived and used to predict the TAG level in the liver. Receiver operating characteristic (ROC) analysis was applied to assess the performance and area under the curve (AUC) of TAG prediction and to compare the sensitivity and specificity of the methods. The best speckle-size estimates and overall performance (R2 = 0.71, AUC = 0.94) were achieved with an SNR-based adaptive automatic segmentation method (TAG threshold used: 50 mg/g liver wet weight). Automatic segmentation is thus feasible and profitable.
Shen, Chong; Li, Jie; Zhang, Xiaoming; Shi, Yunbo; Tang, Jun; Cao, Huiliang; Liu, Jun
2016-01-01
The different noise components in a dual-mass micro-electromechanical system (MEMS) gyroscope structure are analyzed in this paper, including mechanical-thermal noise (MTN), electronic-thermal noise (ETN), flicker noise (FN) and Coriolis signal in-phase noise (IPN). The structure's equivalent electronic model is established, and an improved white Gaussian noise reduction method for dual-mass MEMS gyroscopes is proposed, based on sample entropy empirical mode decomposition (SEEMD) and time-frequency peak filtering (TFPF). There is a contradiction in TFPF: selecting a short window length may lead to good preservation of signal amplitude but poor random noise reduction, whereas selecting a long window length may lead to serious attenuation of the signal amplitude but effective random noise reduction. In order to achieve a good tradeoff between valid signal amplitude preservation and random noise reduction, SEEMD is adopted to improve TFPF. First, the original signal is decomposed into intrinsic mode functions (IMFs) by EMD, and the SE of each IMF is calculated in order to classify the numerous IMFs into three different components; then short-window TFPF is employed for the low-frequency component of the IMFs, long-window TFPF is employed for the high-frequency component, and the noise component is discarded directly; finally, the de-noised signal is obtained after reconstruction. Rotation and temperature experiments were carried out to verify the proposed SEEMD-TFPF algorithm; the verification and comparison results show that the de-noising performance of SEEMD-TFPF is better than that achievable with traditional wavelet, Kalman filter and fixed-window-length TFPF methods. PMID:27258276
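To make the SE computation concrete, here is a minimal Python implementation of sample entropy as it might be used to rank IMFs by complexity before assigning TFPF window lengths; the grouping rule in the closing comment is an assumption, not the paper's exact thresholds.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D signal.
    m: template length; r: tolerance as a fraction of the signal SD."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_count(mm):
        tpl = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        n = 0
        for i in range(len(tpl)):
            dist = np.max(np.abs(tpl - tpl[i]), axis=1)
            n += int(np.sum(dist <= tol)) - 1   # exclude self-match
        return n

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Assumed grouping: highest-SE IMFs ~ noise (discard); mid-SE IMFs ->
# long-window TFPF; lowest-SE IMFs -> short-window TFPF.
```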
A versatile software package for inter-subject correlation based analyses of fMRI.
Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi
2014-01-01
In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects at corresponding brain locations. This means ISC can be used to analyze fMRI data without explicitly modeling the stimulus, making it a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with resampling-based statistical inference. ISC-based analyses are data- and computation-intensive, and the ISC Toolbox is equipped with mechanisms to execute parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and we demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments, using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
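The core ISC statistic is simple to sketch (the toolbox itself is Matlab; Python is used here): for one brain location, correlate every pair of subjects' time series and average, optionally within a sliding time window. Array shapes, window length, and step size are illustrative.

```python
import numpy as np

def voxel_isc(ts):
    """Mean pairwise Pearson correlation for one brain location.
    ts: (n_subjects, n_timepoints) array of fMRI time series."""
    r = np.corrcoef(ts)                     # subject-by-subject matrix
    iu = np.triu_indices(ts.shape[0], k=1)  # unique subject pairs
    return r[iu].mean()

def windowed_isc(ts, win=30, step=5):
    """Time-window ISC: slide a window along time and recompute."""
    T = ts.shape[1]
    return np.array([voxel_isc(ts[:, s:s + win])
                     for s in range(0, T - win + 1, step)])
```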
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yixing; Hong, Tianzhen; Piette, Mary Ann
Buildings in cities consume 30–70% of total primary energy, and improving building energy efficiency is one of the key strategies towards sustainable urbanization. Urban building energy models (UBEM) can support city managers to evaluate and prioritize energy conservation measures (ECMs) for investment and the design of incentive and rebate programs. This paper presents the retrofit analysis feature of City Building Energy Saver (CityBES) to automatically generate and simulate UBEM using EnergyPlus based on cities’ building datasets and user-selected ECMs. CityBES is a new open web-based tool to support city-scale building energy efficiency strategic plans and programs. The technical details of using CityBES for UBEM generation and simulation are introduced, including the workflow, key assumptions, and major databases. Also presented is a case study that analyzes the potential retrofit energy use and energy cost savings of five individual ECMs and two measure packages for 940 office and retail buildings in six city districts in northeast San Francisco, United States. The results show that: (1) all five measures together can save 23–38% of site energy per building; (2) replacing lighting with light-emitting diode lamps and adding air economizers to existing heating, ventilation and air-conditioning (HVAC) systems are most cost-effective with an average payback of 2.0 and 4.3 years, respectively; and (3) it is not economical to upgrade HVAC systems or replace windows in San Francisco due to the city's mild climate and minimal cooling and heating loads. Furthermore, the CityBES retrofit analysis feature does not require users to have deep knowledge of building systems or technologies for the generation and simulation of building energy models, which helps overcome major technical barriers for city managers and their consultants to adopt UBEM.
Current status of the UCSF second-generation PACS
NASA Astrophysics Data System (ADS)
Huang, H. K.; Arenson, Ronald L.; Wong, Albert W. K.; Bazzill, Todd M.; Lou, Shyhliang A.; Andriole, Katherine P.; Wang, Jun; Zhang, Jianguo; Wong, Stephen T. C.
1996-05-01
This paper describes the current status of the second-generation PACS at UCSF, commenced in October 1992. The UCSF PACS is designed in-house as a hospital-integrated PACS based on an open-architecture concept using industrial standards, including the UNIX operating system, the C programming language, the X-Window user interface, the TCP/IP communication protocol, the DICOM 3.0 image standard, and the HL7 health data format. Other manufacturers' PACS components that conform to these standards can be easily integrated into the system. Relevant data from the HIS and RIS are automatically incorporated into the PACS using the HL7 data format and the TCP/IP communication protocol. The UCSF system also takes advantage of state-of-the-art communication, storage, and software technologies (ATM, multiple storage media, automatic programming, and multilevel processes) for a better cost-performance system. The primary PACS network is 155-Mbit/s OC3 ATM, with Ethernet as the backup. The UCSF PACS also connects Mt. Zion Hospital and the San Francisco VA Medical Center in the San Francisco Bay Area via an ATM wide-area network, with a T1 line as the backup. Currently, five MR and five CT scanners from multiple sites, two computed radiography systems, two film digitizers, one US PACS module, the hospital HIS, and the department RIS are connected to the PACS network. The image data are managed by a mirrored database (Sybase). The PACS controller, with its 1.3-terabyte optical disk library, acquires 2.5 gigabytes of digital data daily. Four 2K and five 1,600-line multiple-monitor display workstations are on line in neuroradiology, pediatric radiology, and intensive care units for clinical use. In addition, the PACS supports over 100 Macintosh users in the department and at selected hospital sites for both image and text retrieval through a client/server mechanism. We are also developing a computation and visualization node in the PACS network for advancing radiology research.
Predicting Epileptic Seizures in Advance
Moghim, Negin; Corne, David W.
2014-01-01
Epilepsy is the second most common neurological disorder, affecting 0.6–0.8% of the world's population. In this neurological disorder, abnormal activity of the brain causes seizures, whose nature tends to be sudden. Antiepileptic drugs (AEDs) are used as long-term therapeutic solutions that control the condition. Of those treated with AEDs, 35% become resistant to medication. The unpredictable nature of seizures poses risks for the individual with epilepsy, so it is clearly desirable to find more effective ways of preventing seizures for such patients. The automatic detection of oncoming seizures, before their actual onset, can facilitate timely intervention and hence minimize these risks; in addition, advance prediction of seizures can enrich our understanding of the epileptic brain. In this study, drawing on the body of work behind automatic seizure detection and prediction from digitised invasive electroencephalography (EEG) data, a prediction algorithm, ASPPR (Advance Seizure Prediction via Pre-ictal Relabeling), is described. ASPPR facilitates the learning of predictive models targeted at recognizing patterns in EEG activity that lie in a specific time window in advance of a seizure. It then exploits advanced machine learning coupled with the design and selection of appropriate features from EEG signals. Results from evaluating ASPPR independently on 21 different patients suggest that seizures for many patients can be predicted up to 20 minutes in advance of their onset. Compared to benchmark performance represented by a mean S1-score (harmonic mean of sensitivity and specificity) of 90.6% for predicting seizure onset between 0 and 5 minutes in advance, ASPPR achieves mean S1-scores of 96.30% for prediction between 1 and 6 minutes in advance, 96.13% for prediction between 8 and 13 minutes in advance, 94.5% for prediction between 14 and 19 minutes in advance, and 94.2% for prediction between 20 and 25 minutes in advance. PMID:24911316
Sun, Shuping; Jiang, Zhongwei; Wang, Haibin; Fang, Yu
2014-05-01
This paper proposes a novel automatic method for the moment segmentation and peak detection analysis of the heart sound (HS) pattern, with special attention to the characteristics of the envelopes of HS and the properties of the Hilbert transform (HT). Moment segmentation and peak location are accomplished in two steps. First, by applying the Viola integral waveform method in the time domain, the envelope E(T) of the HS signal is obtained, with an emphasis on the first heart sound (S1) and the second heart sound (S2). Then, based on the characteristics of E(T) and the properties of the HT of convex and concave functions, a novel method, the short-time modified Hilbert transform (STMHT), is proposed to automatically locate the moment segmentation and peak points of the HS from the zero-crossing points of the STMHT. A fast algorithm for calculating the STMHT of E(T) can be expressed as multiplying E(T) by an equivalent window W(E). According to the range of heart rates, and based on numerical experiments and the important parameters of the STMHT, a moving window width of N=1 s is validated for locating the moment segmentation and peak points of the HS. The proposed method is validated on sounds from the Michigan HS database and on sounds from clinical heart diseases such as ventricular septal defect (VSD), atrial septal defect (ASD), tetralogy of Fallot (TOF), and rheumatic heart disease (RHD). As a result, for sounds in which S2 can be separated from S1, the average accuracies achieved for the peak of S1 (AP₁), the peak of S2 (AP₂), the moment segmentation points from S1 to S2 (AT₁₂) and the cardiac cycle (ACC) are 98.53%, 98.31%, 98.36% and 97.37%, respectively. For sounds in which S1 cannot be separated from S2, the average accuracies achieved for the peaks of S1 and S2 (AP₁₂) and the cardiac cycle (ACC) are 100% and 96.69%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
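As an illustration of envelope-based localization, the Python sketch below extracts a smoothed analytic envelope and finds its peaks as negative-going zero crossings of a short-time-smoothed derivative. This substitutes a generic derivative test for the paper's STMHT, and both smoothing widths are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def envelope(x, fs, smooth_s=0.02):
    """Smoothed analytic (Hilbert) envelope of a heart-sound signal."""
    env = np.abs(hilbert(x))
    w = max(1, int(smooth_s * fs))
    return np.convolve(env, np.ones(w) / w, mode="same")

def envelope_peaks(env, fs, smooth_s=0.05):
    """Peaks of the envelope as negative-going zero crossings of a
    short-time-smoothed derivative (a stand-in for the STMHT)."""
    w = max(1, int(smooth_s * fs))
    d = np.convolve(np.gradient(env), np.ones(w) / w, mode="same")
    return np.nonzero((d[:-1] > 0) & (d[1:] <= 0))[0]
```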
Interferometric phase measurement of zerodur, aluminum and SXA mirrors at cryogenic temperatures
NASA Technical Reports Server (NTRS)
Magner, Thomas J.; Barney, Richard D.
1988-01-01
A research program was undertaken to determine the surface figure error of several different types of mirrors at cryogenic temperatures. Two-inch-diameter parabolic, spherical and flat mirrors were fabricated from zerodur, aluminum and a metal matrix composite of silicon carbide reinforced aluminum (SXA). The ratio of silicon carbide to aluminum was selected so that the coefficient of thermal expansion (CTE) of the metal matrix matched that of electroless nickel. A liquid helium dewar was modified to add an interferometric-grade window, a cold electronic shutter and a strain-free copper mirror mount. Interferometric phase measurements on each mirror mounted in the dewar were made without the window; with the window; under vacuum; at around 80 K; and between 10 K and 24 K.
NASA Astrophysics Data System (ADS)
Yusop, Hanafi M.; Ghazali, M. F.; Yusof, M. F. M.; PiRemli, M. A.; Karollah, B.; Rusman
2017-10-01
Pressure transient signals occur due to sudden changes in the propagation of the fluid filling a pipeline system, caused by rapid pressure and flow fluctuations such as those produced by quickly closing and opening a valve. The Hilbert-Huang Transform (HHT) is utilised in this research as the method to analyse the pressure transient signal. However, this method presents a difficulty: selecting the suitable intrinsic mode function (IMF) for the subsequent post-processing step, the Hilbert transform (HT). This paper proposes applying the Integrated Kurtosis-based Algorithm for z-filter Technique (I-kaz) to kurtosis ratio (I-kaz-kurtosis ratio), which allows automatic selection of the IMF that should be used. The work demonstrates this on synthetic pressure transient signals generated using transmission line modelling (TLM) in order to test the effectiveness of the I-kaz-kurtosis ratio as an autonomous IMF selector. A straight fluid network was designed using TLM, with a higher resistance fixed at one point to act as a leak and connections to pipe features (a junction, pipe fitting, or blockage). The analysis results using the I-kaz-kurtosis ratio reveal that the method can be utilised for automatic IMF selection even when the signal-to-noise ratio is low. The I-kaz-kurtosis ratio is therefore recommended as an automatic IMF selection criterion for HHT analysis.
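A hedged Python sketch of the selection idea: rank IMFs by an impulsiveness ratio and pass the winner to the Hilbert transform. The published I-kaz-kurtosis statistic is more elaborate than the plain kurtosis ratio used here, which is a simplified stand-in.

```python
import numpy as np
from scipy.stats import kurtosis

def select_imf(imfs):
    """Pick the IMF carrying the transient (leak-related) feature.
    imfs: iterable of 1-D arrays from an EMD of the pressure signal.
    Scores each IMF's kurtosis against the total of the others, since
    transients are impulsive and inflate kurtosis."""
    k = np.array([kurtosis(imf, fisher=False) for imf in imfs])
    ratio = k / (k.sum() - k + 1e-12)       # each IMF vs. the rest
    return int(np.argmax(ratio)), ratio
```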
Momeni, Saba; Pourghassem, Hossein
2014-08-01
Recently, image fusion has gained a prominent role in medical image processing and is useful for diagnosing and treating many diseases. Digital subtraction angiography is one of the most widely used imaging modalities for diagnosing brain vascular diseases and for radiosurgery of the brain. This paper proposes an automatic fuzzy-based multi-temporal fusion algorithm for 2-D digital subtraction angiography images. In this algorithm, to extract the blood vessel map, the valuable frames of the brain angiography video are automatically determined to form the digital subtraction angiography images, based on a novel definition of vessel dispersion generated by the injected contrast material. The proposed fusion scheme contains different fusion methods for high- and low-frequency contents, based on the coefficient characteristics of the wrapping second-generation curvelet transform and a novel content selection strategy. The content selection strategy is defined based on the sample correlation of the curvelet transform coefficients. In the proposed fuzzy-based fusion scheme, the selection of curvelet coefficients is optimized by applying weighted averaging and maximum selection rules to the high-frequency coefficients. For low-frequency coefficients, the maximum selection rule based on a local energy criterion is applied for better visual perception. The proposed fusion algorithm is evaluated on a brain angiography image dataset consisting of one hundred 2-D internal carotid rotational angiography videos. The obtained results demonstrate the effectiveness and efficiency of the proposed fusion algorithm in comparison with common and basic fusion algorithms.
Decision Variants for the Automatic Determination of Optimal Feature Subset in RF-RFE.
Chen, Qi; Meng, Zhaopeng; Liu, Xinyi; Jin, Qianguo; Su, Ran
2018-06-15
Feature selection, which identifies a set of the most informative features from the original feature space, has been widely used to simplify predictors. Recursive feature elimination (RFE), one of the most popular feature selection approaches, is effective in reducing data dimension and increasing efficiency. RFE produces a ranking of features, together with candidate subsets and their corresponding accuracies. The subset with the highest accuracy (HA) or a preset number of features (PreNum) is often used as the final subset. However, this may lead to a large number of features being selected, and if there is no prior knowledge about the preset number, the final subset selection is often ambiguous and subjective. A proper decision variant is in high demand to automatically determine the optimal subset. In this study, we conduct pioneering work exploring the decision variant applied after obtaining a list of candidate subsets from RFE. We provide a detailed analysis and comparison of several decision variants for automatically selecting the optimal feature subset. A random forest (RF)-recursive feature elimination (RF-RFE) algorithm and a voting strategy are introduced. We validated the variants on two quite different molecular biology datasets, one from a toxicogenomic study and the other from protein sequence analysis. The study provides an automated way to determine the optimal feature subset when using RF-RFE.
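A minimal sketch of the RF-RFE flow with one plausible decision variant (a one-standard-error-style rule); the paper's actual variants and voting strategy may differ, and the synthetic data and parameters below are illustrative.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                               random_state=0)
    sizes, accs = [], []
    for k in range(1, X.shape[1] + 1):                # candidate subsets from RFE
        rf = RandomForestClassifier(n_estimators=50, random_state=0)
        Xk = RFE(rf, n_features_to_select=k).fit_transform(X, y)
        accs.append(cross_val_score(rf, Xk, y, cv=5).mean())
        sizes.append(k)

    accs = np.array(accs)
    best = int(accs.argmax())                         # the HA variant
    # decision variant: smallest subset within one SD of the best accuracy
    small = sizes[int(np.argmax(accs >= accs[best] - accs.std()))]
    print("HA size:", sizes[best], "| parsimonious choice:", small)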
Robust Control for the Mercury Laser Altimeter
NASA Technical Reports Server (NTRS)
Rosenberg, Jacob S.
2006-01-01
Mercury Laser Altimeter Science Algorithms is a software system for controlling the laser altimeter aboard the MESSENGER spacecraft, which is to enter orbit about Mercury in 2011. The software controls the altimeter by dynamically modifying hardware inputs for gain, threshold, channel-disable flags, range-window start location, and range-window width, using ranging information provided by the spacecraft and noise counts from the instrument hardware. In addition, because of severe bandwidth restrictions, the software also selects returns for downlink.
Carraro, Luciana; Castelli, Luigi; Macchiella, Claudia
2011-01-01
Research has widely explored the differences between conservatives and liberals, and it has also recently been demonstrated that conservatives display different reactions toward valenced stimuli. However, previous studies have not yet fully illuminated the cognitive underpinnings of these differences. In the current work, we argue that political ideology is related to selective attention processes, such that negative stimuli are more likely to automatically grab the attention of conservatives than of liberals. In Experiment 1, we demonstrated that negative (vs. positive) information impaired the performance of conservatives, more than liberals, in an Emotional Stroop Task. This finding was confirmed in Experiment 2 and in Experiment 3 employing a Dot-Probe Task, demonstrating that threatening stimuli were more likely to attract the attention of conservatives. Overall, the results support the conclusion that people embracing conservative views of the world display automatic selective attention for negative stimuli. PMID:22096486
Howell, Peter; Sackin, Stevie; Glenn, Kazan
2007-01-01
This program of work is intended to develop automatic recognition procedures to locate and assess stuttered dysfluencies. This article and the following one together develop and test recognizers for repetitions and prolongations. The automatic recognizers classify the speech in two stages: in the first, the speech is segmented, and in the second, the segments are categorized. The units that are segmented are words. Here, assessments by human judges on the speech of 12 children who stutter are described using a corresponding procedure. The accuracy of word boundary placement across judges, the categorization of words as fluent, repetition or prolongation, and the durations of the different fluency categories are reported. These measures allow reliable instances of repetitions and prolongations to be selected for training and assessing the recognizers in the subsequent paper. PMID:9328878
Design of Automatic Extraction Algorithm of Knowledge Points for MOOCs
Chen, Haijian; Han, Dongmei; Zhao, Lina
2015-01-01
In recent years, Massive Open Online Courses (MOOCs) have become very popular among college students and have a powerful impact on academic institutions. In the MOOC environment, knowledge discovery and knowledge sharing are very important and are currently often achieved by ontology techniques. In building an ontology, automatic extraction technology is crucial. Because general text mining algorithms do not perform well on online course material, we designed the automatic extraction of course knowledge points (AECKP) algorithm for online courses. It includes document classification, Chinese word segmentation, and POS tagging for each document. The Vector Space Model (VSM) is used to calculate similarity, and weights are designed to optimize the TF-IDF algorithm output values; the terms with the highest scores are selected as knowledge points. Course documents from "C programming language" were selected for the experiment in this study. The results show that the proposed approach can achieve a satisfactory accuracy rate and recall rate. PMID:26448738
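The scoring idea can be sketched as TF-IDF weights adjusted by vector-space similarity to a course profile, with the top-scoring terms kept as candidate knowledge points. Chinese word segmentation, POS tagging and the paper's exact weight design are omitted; the documents and course profile below are invented.

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = ["pointer arithmetic and arrays in c",
            "for loops while loops and control flow",
            "functions parameters and return values"]
    course = "c programming language pointers functions loops"

    vec = TfidfVectorizer()
    D = vec.fit_transform(docs + [course])
    sim = cosine_similarity(D[:-1], D[-1])            # doc-to-course VSM similarity
    scores = np.asarray(D[:-1].multiply(sim).sum(axis=0)).ravel()

    terms = np.array(vec.get_feature_names_out())
    print("candidate knowledge points:", terms[np.argsort(scores)[::-1][:5]])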
POPCORN: a Supervisory Control Simulation for Workload and Performance Research
NASA Technical Reports Server (NTRS)
Hart, S. G.; Battiste, V.; Lester, P. T.
1984-01-01
A multi-task simulation of a semi-automatic supervisory control system was developed to provide an environment in which training, operator strategy development, failure detection and resolution, levels of automation, and operator workload can be investigated. The goal was to develop a well-defined, but realistically complex, task that would lend itself to model-based analysis. The name of the task (POPCORN) reflects the visual display that depicts different task elements milling around waiting to be released and pop out to be performed. The operator's task was to complete each of 100 task elements, represented by different symbols, by selecting a target task and entering the desired command. The simulated automatic system then completed the selected function automatically. Highly significant differences in performance, strategy, and rated workload were found as a function of all experimental manipulations (except reward/penalty).
A Multiple-range Self-balancing Thermocouple Potentiometer
NASA Technical Reports Server (NTRS)
Warshawsky, I; Estrin, M
1951-01-01
A multiple-range potentiometer circuit is described that provides automatic measurement of temperatures or temperature differences with any one of several thermocouple-material pairs. Techniques of automatic reference junction compensation, span adjustment, and zero suppression are described that permit rapid selection of range and wire material, without the necessity for restandardization, by setting of two external tap switches.
Chinese Journal of Lasers (Selected Articles),
1986-04-22
properties. We first investigated silicate-based glasses, then other inorganic glasses such as borate, phosphate, germanate, tellurate ... of the growth of high-melting-temperature ... oxides, several upward-pulling single crystal furnaces with high-precision mechanical movement and high ... automatic electronic weighing systems, and programmable automatic movement correction systems. The reliability of most of these control systems
An automatic camera device for measuring waterfowl use
Cowardin, L.M.; Ashe, J.E.
1965-01-01
A Yashica Sequelle camera was modified and equipped with a timing device so that it would take pictures automatically at 15-minute intervals. Several of these cameras were used to photograph randomly selected quadrats located in different marsh habitats. The number of birds photographed in the different areas was used as an index of waterfowl use.
Dissociating Working Memory Updating and Automatic Updating: The Reference-Back Paradigm
ERIC Educational Resources Information Center
Rac-Lubashevsky, Rachel; Kessler, Yoav
2016-01-01
Working memory (WM) updating is a controlled process through which relevant information in the environment is selected to enter the gate to WM and substitute its contents. We suggest that there is also an automatic form of updating, which influences performance in many tasks and is primarily manifested in reaction time sequential effects. The goal…
ERIC Educational Resources Information Center
Okurut, Jeje Moses
2018-01-01
The impact of automatic promotion practice on students dropping out of Uganda's primary education was assessed using propensity score in difference in differences analysis technique. The analysis strategy was instrumental in addressing the selection bias problem, as well as biases arising from common trends over time, and permanent latent…
Optimizing Input/Output Using Adaptive File System Policies
NASA Technical Reports Server (NTRS)
Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.
1996-01-01
Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.
Automatic thermal control switches. [for use in Space Shuttle borne Get Away Special container
NASA Technical Reports Server (NTRS)
Wing, L. D.
1982-01-01
Two automatic, flexible connection thermal control switches have been designed and tested in a thermal vacuum facility and in the Get Away Special (GAS) container flown on the third Shuttle flight. The switches are complementary in that one switch passes heat when the plate on which it is mounted exceeds some selected temperature and the other switch will pass heat only when the mounting plate temperature is below the selected value. Both switches are driven and controlled by phase-change capsule motors and require no other power source or thermal sensors.
Virgolin, Marco; van Dijk, Irma W E M; Wiersma, Jan; Ronckers, Cécile M; Witteveen, Cees; Bel, Arjan; Alderliesten, Tanja; Bosman, Peter A N
2018-04-01
The aim of this study is to establish the first step toward a novel and highly individualized three-dimensional (3D) dose distribution reconstruction method, based on CT scans and organ delineations of recently treated patients. Specifically, the feasibility of automatically selecting the CT scan of a recently treated childhood cancer patient who is similar to a given historically treated child who suffered from Wilms' tumor is assessed. A cohort of 37 recently treated children between 2 and 6 yr old are considered. Five potential notions of ground-truth similarity are proposed, each focusing on different anatomical aspects. These notions are automatically computed from CT scans of the abdomen and 3D organ delineations (liver, spleen, spinal cord, external body contour). The first is based on deformable image registration, the second on the Dice similarity coefficient, the third on the Hausdorff distance, the fourth on pairwise organ distances, and the last is computed by means of the overlap volume histogram. The relationship between typically available features of historically treated patients and the proposed ground-truth notions of similarity is studied by adopting state-of-the-art machine learning techniques, including random forest. Also, the feasibility of automatically selecting the most similar patient is assessed by comparing ground-truth rankings of similarity with predicted rankings. Similarities (mainly) based on the external abdomen shape and on the pairwise organ distances are highly correlated (Pearson r_p ≥ 0.70) and are successfully modeled with random forests based on historically recorded features (pseudo-R^2 ≥ 0.69). In contrast, similarities based on the shape of internal organs cannot be modeled. For the similarities that random forest can reliably model, an estimation of feature relevance indicates that abdominal diameters and weight are the most important. Experiments on automatically selecting similar patients lead to coarse, yet quite robust results: the most similar patient is retrieved only 22% of the times, however, the error in worst-case scenarios is limited, with the fourth most similar patient being retrieved. Results demonstrate that automatically selecting similar patients is feasible when focusing on the shape of the external abdomen and on the position of internal organs. Moreover, whereas the common practice in phantom-based dose reconstruction is to select a representative phantom using age, height, and weight as discriminant factors for any treatment scenario, our analysis on abdominal tumor treatment for children shows that the most relevant features are weight and the anterior-posterior and left-right abdominal diameters. © 2018 American Association of Physicists in Medicine.
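Of the five ground-truth notions, the Dice similarity coefficient is the most compact to show; a minimal sketch on boolean voxel masks of equal shape (the toy organ boxes are invented):

    import numpy as np

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """Dice = 2|A intersect B| / (|A| + |B|); 1.0 means identical shapes."""
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    a = np.zeros((32, 32, 32), bool); a[8:20, 8:20, 8:20] = True    # organ, patient A
    b = np.zeros((32, 32, 32), bool); b[10:22, 8:20, 8:20] = True   # organ, patient B
    print(f"Dice similarity: {dice(a, b):.3f}")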
VACTIV: A graphical dialog based program for an automatic processing of line and band spectra
NASA Astrophysics Data System (ADS)
Zlokazov, V. B.
2013-05-01
The program VACTIV (Visual ACTIV) has been developed for the automatic analysis of spectrum-like distributions, in particular gamma-ray or alpha spectra, and is a standard graphical dialog based Windows application, driven by menu, mouse and keyboard. On the one hand, it is a conversion of the existing Fortran program ACTIV [1] to the DELPHI language; on the other hand, it is a transformation of the sequential syntax of Fortran programming to a new object-oriented style based on the organization of event interactions. New features implemented in the algorithms of both versions are the following: as the peak model, both an analytical function and a graphical curve could be used; the peak search algorithm was able to recognize not only Gaussian peaks but also peaks of irregular form, both narrow peaks (2-4 channels) and broad ones (50-100 channels); and the regularization technique in the fitting guaranteed a stable solution in the most complicated cases of strongly overlapping or weak peaks. The graphical dialog interface of VACTIV is much more convenient than the batch mode of ACTIV. [1] V.B. Zlokazov, Computer Physics Communications, 28 (1982) 27-37.
NEW VERSION PROGRAM SUMMARY
Program Title: VACTIV
Catalogue identifier: ABAC_v2_0
Licensing provisions: none
Programming language: DELPHI 5-7 Pascal
Computer: IBM PC series
Operating system: Windows
RAM: 1 MB
Keywords: Nuclear physics, spectrum decomposition, least squares analysis, graphical dialog, object-oriented programming
Classification: 17.6
Catalogue identifier of previous version: ABAC_v1_0
Journal reference of previous version: Comput. Phys. Commun. 28 (1982) 27
Does the new version supersede the previous version?: Yes
Nature of problem: The program VACTIV is intended for the precise analysis of arbitrary spectrum-like distributions, e.g. gamma-ray and X-ray spectra, and allows the user to carry out the full cycle of automatic processing of such spectra, i.e. calibration, automatic peak search and estimation of the parameters of interest. VACTIV can run on any standard modern laptop.
Reasons for the new version: At the time of its creation (1999), VACTIV was seemingly the first attempt to apply the newest programming languages and styles to systems of spectrum analysis. Its goal was both to provide a convenient and efficient technique for data processing, and to elaborate the formalism of spectrum analysis in terms of the classes, properties, methods and events of an object-oriented programming language.
Summary of revisions: Compared with ACTIV, VACTIV preserves all the mathematical algorithms, but provides the user with all the benefits of a graphical dialog interface. It allows the user to intervene quickly in the operation of the program, in particular to control the fitting process on-line: depending on the intermediate results, and using the visual form of data representation, the user can change the conditions for the fitting and so achieve optimum performance by selecting the optimum strategy. To find the best conditions for the fitting, one can compress the spectrum, delete blunders from it, smooth it using a high-frequency spline filter and build the background using a low-frequency spline filter; and one can use not only automatic methods for blunder deletion, peak search, peak model forming and calibration, but also manual mouse clicking on the spectrum graph.
Restrictions: To enhance the reliability and portability of the program the majority of the most important arrays have a static allocation; all the arrays are allocated with a surplus, and the total pool of the program is restricted only by the size of the computer virtual memory. A spectrum has the static size of 32 K real words. The maximum size of the least-square matrix is 314 (the maximum number of fitted parameters per one analyzed spectrum interval, not for the whole spectrum), from which it follows that the maximum number of peaks in one spectrum interval is 154. The maximum total number of peaks in the spectrum is not restricted. Running time: The calculation time is negligibly small compared with the time for the dialog; using ini-files the program can be partly used in a semi-dialog mode.
Chen, Keguang; Yin, Dongming; Lyu, Huiying; Yang, Lin; Zhang, Tianyu; Dai, Peidong
2016-01-01
As external auditory canal malformation becomes more severe, the size of the extra-niche fossa becomes smaller; quantifying this provides concrete data and valuable information for the better design, selection and safer implantation of the transducer in the area of the round window niche. Three-dimensional measurement and assessment before surgery might be helpful for a safer surgical approach and implantation of a vibrant soundbridge. The aim of this study was to investigate whether differences exist in the morphology of the posterior tympanum relevant to round window vibroplasty among congenital aural atresia (CAA), congenital aural stenosis (CAS), and a normal control group, and to analyze the effect on round window implantation of a vibrant soundbridge. CT images of 10 normal subjects (20 ears), 27 CAS patients (30 ears), and 25 CAA patients (30 ears) were analyzed. The depth and size of the fossa outside the round window niche relevant to round window vibroplasty (the extra-niche fossa) and the distances between the centers of the round window niche and the extra-niche fossa were calculated based on three-dimensional reconstruction using Mimics software. Finally, the data were analyzed statistically. The size of the extra-niche fossa in the atresia group was smaller than in the stenosis group (p < 0.05); furthermore, the size of the extra-niche fossa in the stenosis group was smaller than that of the control group (p < 0.05). There was no statistically significant difference in the depth of the extra-niche fossa among the groups.
Kentucky geotechnical database.
DOT National Transportation Integrated Search
2005-03-01
Development of a comprehensive dynamic, geotechnical database is described. Computer software selected to program the client/server application in windows environment, components and structure of the geotechnical database, and primary factors cons...
Selecting Cases for Intensive Analysis: A Diversity of Goals and Methods
ERIC Educational Resources Information Center
Gerring, John; Cojocaru, Lee
2016-01-01
This study revisits the task of case selection in case study research, proposing a new typology of strategies that is explicit, disaggregated, and relatively comprehensive. A secondary goal is to explore the prospects for case selection by "algorithm," aka "ex ante," "automatic," "quantitative,"…
Neuronal synchronization and selective color processing in the human brain.
Müller, Matthias M; Keil, Andreas
2004-04-01
In the present study, subjects selectively attended to the color of checkerboards in a feature-based attention paradigm. Induced gamma band responses (GBRs), the induced alpha band, and the event-related potential (ERP) were analyzed to uncover neuronal dynamics during selective feature processing. Replicating previous ERP findings, the selection negativity (SN) with a latency of about 160 msec was extracted. Furthermore, and similarly to previous EEG studies, a gamma band peak in a time window between 290 and 380 msec was found. This peak had its major energy in the 55- to 70-Hz range and was significantly larger for the attended color. Contrary to previous human induced gamma band studies, a much earlier 40- to 50-Hz peak in a time window between 160 and 220 msec after stimulus onset and, thus, concurrent with the SN was prominent, with significantly more energy for attended as opposed to unattended color. The induced alpha band (9.8-11.7 Hz), on the other hand, exhibited a marked suppression for attended color in a time window between 450 and 600 msec after stimulus onset. A comparison of the time course of the 40- to 50-Hz and 55- to 70-Hz induced GBR, the induced alpha band, and the ERP revealed temporal coincidences for changes in the morphology of these brain responses. Despite these similarities in the time domain, the cortical source configuration was found to discriminate between induced GBRs and the SN. Our results suggest that large-scale synchronous high-frequency brain activity as measured in the human GBR plays a specific role in attentive processing of stimulus features.
Actions of the dual FAAH/MAGL inhibitor JZL195 in a murine neuropathic pain model
Adamson Barnes, Nicholas S.; Mitchell, Vanessa A.; Kazantzis, Nicholas P.
2015-01-01
Background and Purpose: While cannabinoids have been proposed as a potential treatment for neuropathic pain, they have limitations. Cannabinoid receptor agonists have good efficacy in animal models of neuropathic pain, but they have a poor therapeutic window. Conversely, selective fatty acid amide hydrolase (FAAH) inhibitors that enhance the endocannabinoid system have a better therapeutic window, but lesser efficacy. We examined whether JZL195, a dual inhibitor of FAAH and monoacylglycerol lipase (MAGL), could overcome these limitations. Experimental Approach: C57BL/6 mice underwent the chronic constriction injury (CCI) model of neuropathic pain. Mechanical and cold allodynia, plus cannabinoid side effects, were assessed in response to systemic drug application. Key Results: JZL195 and the cannabinoid receptor agonist WIN55212 produced dose-dependent reductions in CCI-induced mechanical and cold allodynia, plus side effects including motor incoordination, catalepsy and sedation. JZL195 reduced allodynia with an ED50 at least four times lower than that at which it produced side effects. By contrast, WIN55212 reduced allodynia and produced side effects with similar ED50s. The maximal anti-allodynic effect of JZL195 was greater than that produced by selective FAAH or MAGL inhibitors. The JZL195-induced anti-allodynia was maintained during repeated treatment. Conclusions and Implications: These findings suggest that JZL195 has greater anti-allodynic efficacy than selective FAAH or MAGL inhibitors, plus a greater therapeutic window than a cannabinoid receptor agonist. Thus, dual FAAH/MAGL inhibition may have greater potential for alleviating neuropathic pain than selective FAAH and MAGL inhibitors or cannabinoid receptor agonists. PMID:26398331
Leveraging Paraphrase Labels to Extract Synonyms from Twitter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antoniak, Maria A.; Bell, Eric B.; Xia, Fei
2015-05-18
We present an approach for automatically learning synonyms from a paraphrase corpus of tweets. This work shows improvement on the task of paraphrase detection when we substitute our extracted synonyms into the training set. The synonyms are learned by using chunks from a shallow parse to create candidate synonyms and their context windows, and the synonyms are incorporated into a paraphrase detection system that uses machine translation metrics as features for a classifier. We demonstrate a 2.29% improvement in F1 when we train and test on the paraphrase training set, providing better coverage than previous systems, which shows the potential power of synonyms that are representative of a specific topic.
GRIL: genome rearrangement and inversion locator.
Darling, Aaron E; Mau, Bob; Blattner, Frederick R; Perna, Nicole T
2004-01-01
GRIL is a tool to automatically identify collinear regions in a set of bacterial-size genome sequences. GRIL uses three basic steps. First, regions of high sequence identity are located. Second, some of these regions are filtered based on user-specified criteria. Finally, the remaining regions of sequence identity are used to define significant collinear regions among the sequences. By locating collinear regions of sequence, GRIL provides a basis for multiple genome alignment using current alignment systems. GRIL also provides a basis for using current inversion distance tools to infer phylogeny. GRIL is implemented in C++ and runs on any x86-based Linux or Windows platform. It is available from http://asap.ahabs.wisc.edu/gril
NASA Astrophysics Data System (ADS)
Protsyuk, Yu. I.; Andruk, V. N.; Kazantseva, L. V.
The paper discusses and illustrates the steps of basic processing of digitized images of astro negatives. Software for obtaining rectangular coordinates and photometric values of objects on photographic plates was created in the LINUX/MIDAS/ROMAFOT environment. The program can automatically process a specified number of files in FITS format with sizes up to 20000 x 20000 pixels. Other programs were written in FORTRAN and PASCAL with the ability to work in LINUX or WINDOWS environments. They were used for: identification of stars; separation and exclusion of diffraction satellites and double and triple exposures; elimination of image defects; and reduction to the equatorial coordinates and magnitudes of reference catalogs.
Validation of automatic segmentation of ribs for NTCP modeling.
Stam, Barbara; Peulen, Heike; Rossi, Maddalena M G; Belderbos, José S A; Sonke, Jan-Jakob
2016-03-01
Determination of a dose-effect relation for rib fractures in a large patient group has been limited by the time consuming manual delineation of ribs. Automatic segmentation could facilitate such an analysis. We determine the accuracy of automatic rib segmentation in the context of normal tissue complication probability modeling (NTCP). Forty-one patients with stage I/II non-small cell lung cancer treated with SBRT to 54 Gy in 3 fractions were selected. Using the 4DCT derived mid-ventilation planning CT, all ribs were manually contoured and automatically segmented. Accuracy of segmentation was assessed using volumetric, shape and dosimetric measures. Manual and automatic dosimetric parameters Dx and EUD were tested for equivalence using the Two One-Sided T-test (TOST), and assessed for agreement using Bland-Altman analysis. NTCP models based on manual and automatic segmentation were compared. Automatic segmentation was comparable with the manual delineation in radial direction, but larger near the costal cartilage and vertebrae. Manual and automatic Dx and EUD were significantly equivalent. The Bland-Altman analysis showed good agreement. The two NTCP models were very similar. Automatic rib segmentation was significantly equivalent to manual delineation and can be used for NTCP modeling in a large patient group. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
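The TOST step can be sketched as two one-sided tests of the paired automatic-minus-manual differences against an equivalence margin; the margin and data below are illustrative assumptions, not the study's values.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    manual = rng.normal(30.0, 2.0, 41)            # e.g. EUD from manual contours (Gy)
    auto = manual + rng.normal(0.1, 0.3, 41)      # automatic segmentation result

    margin = 1.0                                  # assumed equivalence margin (Gy)
    diff = auto - manual
    p_low = stats.ttest_1samp(diff, -margin, alternative="greater").pvalue
    p_high = stats.ttest_1samp(diff, margin, alternative="less").pvalue
    p_tost = max(p_low, p_high)                   # both one-sided tests must reject
    print("equivalent" if p_tost < 0.05 else "inconclusive", f"(p = {p_tost:.4f})")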
NASA Astrophysics Data System (ADS)
Davy, Nicholas C.; Sezen-Edmonds, Melda; Gao, Jia; Lin, Xin; Liu, Amy; Yao, Nan; Kahn, Antoine; Loo, Yueh-Lin
2017-08-01
Current smart window technologies offer dynamic control of the optical transmission of the visible and near-infrared portions of the solar spectrum to reduce lighting, heating and cooling needs in buildings and to improve occupant comfort. Solar cells harvesting near-ultraviolet photons could satisfy the unmet need of powering such smart windows over the same spatial footprint without competing for visible or infrared photons, and without the same aesthetic and design constraints. Here, we report organic single-junction solar cells that selectively harvest near-ultraviolet photons, produce open-circuit voltages eclipsing 1.6 V and exhibit scalability in power generation, with active layers (10 cm2) substantially larger than those typical of demonstration organic solar cells (0.04-0.2 cm2). Integration of these solar cells with a low-cost, polymer-based electrochromic window enables intelligent management of the solar spectrum, with near-ultraviolet photons powering the regulation of visible and near-infrared photons for natural lighting and heating purposes.
Evolving Better Cars: Teaching Evolution by Natural Selection with a Digital Inquiry Activity
ERIC Educational Resources Information Center
Royer, Anne M.; Schultheis, Elizabeth H.
2014-01-01
Evolutionary experiments are usually difficult to perform in the classroom because of the large sizes and long timescales of experiments testing evolutionary hypotheses. Computer applications give students a window to observe evolution in action, allowing them to gain comfort with the process of natural selection and facilitating inquiry…
Mars entry guidance based on an adaptive reference drag profile
NASA Astrophysics Data System (ADS)
Liang, Zixuan; Duan, Guangfei; Ren, Zhang
2017-08-01
The conventional Mars entry tracks a fixed reference drag profile (FRDP). To improve the landing precision, a novel guidance approach that utilizes an adaptive reference drag profile (ARDP) is presented. The entry flight is divided into two phases. For each phase, a family of drag profiles corresponding to various trajectory lengths is planned. Two update windows are investigated for the reference drag profile. At each window, the ARDP is selected online from the profile database according to the actual range-to-go. The tracking law for the selected drag profile is designed based on the feedback linearization. Guidance approaches using the ARDP and the FRDP are then tested and compared. Simulation results demonstrate that the proposed ARDP approach achieves much higher guidance precision than the conventional FRDP approach.
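At each update window, the ARDP logic reduces to a nearest-neighbour lookup in the pre-planned profile family; a hypothetical sketch, with invented profile shapes and ranges:

    import numpy as np

    energies = np.linspace(0.0, 1.0, 200)              # normalized energy abscissa
    ranges_km = np.array([500.0, 600.0, 700.0, 800.0]) # planned trajectory lengths
    # one reference drag profile per planned range (shorter range -> higher drag)
    profiles = np.array([15.0 + 10.0 * (800.0 - r) / 300.0 * np.sin(np.pi * energies)
                         for r in ranges_km])

    def select_profile(range_to_go_km):
        i = int(np.argmin(np.abs(ranges_km - range_to_go_km)))
        return ranges_km[i], profiles[i]

    planned, drag_ref = select_profile(640.0)          # actual range-to-go at a window
    print(f"tracking profile planned for {planned:.0f} km, "
          f"peak drag {drag_ref.max():.1f} m/s^2")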
Dong, Shaopeng; Yuan, Mei; Wang, Qiusheng; Liang, Zhiling
2018-05-21
The acoustic emission (AE) method is useful for structural health monitoring (SHM) of composite structures due to its high sensitivity and real-time capability. The main challenge, however, is how to classify the AE data into different failure mechanisms because the detected signals are affected by various factors. Empirical wavelet transform (EWT) is a solution for analyzing the multi-component signals and has been used to process the AE data. In order to solve the spectrum separation problem of the AE signals, this paper proposes a novel modified separation method based on local window maxima (LWM) algorithm. It searches the local maxima of the Fourier spectrum in a proper window, and automatically determines the boundaries of spectrum segmentations, which helps to eliminate the impact of noise interference or frequency dispersion in the detected signal and obtain the meaningful empirical modes that are more related to the damage characteristics. Additionally, both simulation signal and AE signal from the composite structures are used to verify the effectiveness of the proposed method. Finally, the experimental results indicate that the proposed method performs better than the original EWT method in identifying different damage mechanisms of composite structures.
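A bare-bones version of the LWM idea: scan the Fourier magnitude spectrum with a fixed window, keep one dominant peak per window, and place segment boundaries at the minima between retained peaks. The window length, noise threshold and two-tone test signal are illustrative choices, not the paper's settings.

    import numpy as np

    fs = 1000.0
    t = np.arange(0, 1, 1 / fs)
    x = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 220 * t)
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    win = 50                                      # window length in bins (assumed)
    peaks = []
    for start in range(0, mag.size - win, win):
        k = start + int(np.argmax(mag[start:start + win]))
        if mag[k] > 0.1 * mag.max():              # drop windows holding only noise
            peaks.append(k)

    # boundaries at the lowest spectral point between consecutive peaks
    bounds = [p + int(np.argmin(mag[p:q])) for p, q in zip(peaks, peaks[1:])]
    print("segment boundaries (Hz):", freqs[bounds])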
Hussein, Sami; Kruger, Jörg
2011-01-01
Robot-assisted training has proven beneficial as an extension of conventional therapy for improving rehabilitation outcomes. Further facilitation of this positive impact is expected from the application of cooperative control algorithms that increase the patient's contribution to the training effort according to his or her level of ability. This paper presents an approach to cooperative training for end-effector based gait rehabilitation devices, providing the basis for establishing sophisticated cooperative control methods in this class of devices. It uses a haptic control framework to synthesize and render complex, task-specific training environments composed of polygonal primitives. Training assistance is integrated into the haptic control framework as part of the environment: a compliant window is moved along a nominal training trajectory, compliantly guiding and supporting the foot motion. The level of assistance is adjusted via the stiffness of the moving window, and an iterative learning algorithm is used to automatically adjust this assistance level. Stable haptic rendering of the dynamic training environments and adaptive movement assistance were evaluated in two example training scenarios: treadmill walking and stair climbing. Data from preliminary trials with one healthy subject are provided in this paper. © 2011 IEEE
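The assistance adjustment can be pictured as a simple iterative learning rule acting on the stiffness of the moving window from one gait cycle to the next; the gains, limits and error figures below are invented for illustration, not the paper's controller.

    import numpy as np

    def update_stiffness(k, err, target=0.02, gain=400.0, k_min=50.0, k_max=2000.0):
        # more tracking error than tolerated -> stiffer guidance; less -> softer
        return float(np.clip(k + gain * (err - target), k_min, k_max))

    k = 500.0                                          # N/m, initial window stiffness
    for cycle_err in [0.06, 0.05, 0.03, 0.02, 0.01]:   # mean deviation per cycle (m)
        k = update_stiffness(k, cycle_err)
        print(f"stiffness -> {k:.0f} N/m")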
Automatic sleep scoring: a search for an optimal combination of measures.
Krakovská, Anna; Mezeiová, Kristína
2011-09-01
The objective of this study is to find the best set of characteristics of polysomnographic signals for the automatic classification of sleep stages. A selection was made from 74 measures, including linear spectral measures, interdependency measures, and nonlinear measures of complexity that were computed for the all-night polysomnographic recordings of 20 healthy subjects. The adopted multidimensional analysis involved quadratic discriminant analysis, forward selection procedure, and selection by the best subset procedure. Two situations were considered: the use of four polysomnographic signals (EEG, EMG, EOG, and ECG) and the use of the EEG alone. For the given database, the best automatic sleep classifier achieved approximately an 81% agreement with the hypnograms of experts. The classifier was based on the next 14 features of polysomnographic signals: the ratio of powers in the beta and delta frequency range (EEG, channel C3), the fractal exponent (EMG), the variance (EOG), the absolute power in the sigma 1 band (EEG, C3), the relative power in the delta 2 band (EEG, O2), theta/gamma (EEG, C3), theta/alpha (EEG, O1), sigma/gamma (EEG, C4), the coherence in the delta 1 band (EEG, O1-O2), the entropy (EMG), the absolute theta 2 (EEG, Fp1), theta/alpha (EEG, Fp1), the sigma 2 coherence (EEG, O1-C3), and the zero-crossing rate (ECG); however, even with only four features, we could perform sleep scoring with a 74% accuracy, which is comparable to the inter-rater agreement between two independent specialists. We have shown that 4-14 carefully selected polysomnographic features were sufficient for successful sleep scoring. The efficiency of the corresponding automatic classifiers was verified and conclusively demonstrated on all-night recordings from healthy adults. Copyright © 2011 Elsevier B.V. All rights reserved.
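The multidimensional analysis can be sketched as greedy forward selection wrapped around quadratic discriminant analysis; synthetic features stand in for the 74 polysomnographic measures, and the stopping rule is a simple no-improvement check rather than the study's exact procedure.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=400, n_features=20, n_informative=6,
                               n_classes=3, n_clusters_per_class=1, random_state=1)
    selected, remaining, best = [], list(range(X.shape[1])), 0.0
    for _ in range(14):                       # at most 14 features, as in the study
        score, j = max((cross_val_score(QuadraticDiscriminantAnalysis(),
                                        X[:, selected + [j]], y, cv=5).mean(), j)
                       for j in remaining)
        if score <= best:                     # stop when no candidate helps
            break
        best, selected = score, selected + [j]
        remaining.remove(j)
    print("selected features:", selected, f"(accuracy ~ {best:.2f})")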
Automatic color preference correction for color reproduction
NASA Astrophysics Data System (ADS)
Tsukada, Masato; Funayama, Chisato; Tajima, Johji
2000-12-01
The reproduction of natural objects in color images has attracted a great deal of attention. Reproducing more pleasing colors for natural objects is one method of improving image quality. We developed an automatic color correction method that maintains preferred color reproduction for three significant categories: facial skin color, green grass and blue sky. In this method, a representative color in the object area to be corrected is automatically extracted from the input image, and a set of color correction parameters is selected depending on the representative color. The improvement in image quality for reproductions of natural images was more than 93 percent in subjective experiments. These results show the usefulness of our automatic color correction method for the reproduction of preferred colors.
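A toy version of the correction flow: take a representative color from the detected region and shift it toward a category-specific preferred center. The preferred centers and shift strength are invented placeholders, not the paper's parameters.

    import numpy as np

    PREFERRED = {"skin": np.array([200, 160, 140]),    # assumed preferred RGB centers
                 "grass": np.array([90, 150, 70]),
                 "sky": np.array([110, 160, 220])}

    def correct_region(pixels, category, strength=0.4):
        rep = np.median(pixels.reshape(-1, 3), axis=0) # representative color
        shift = strength * (PREFERRED[category] - rep) # move toward the preference
        return np.clip(pixels + shift, 0, 255).astype(np.uint8)

    region = np.full((16, 16, 3), (180, 140, 120), np.uint8)  # flat "skin" patch
    print(correct_region(region, "skin")[0, 0])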
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
NASA Astrophysics Data System (ADS)
Gao, M.; Li, J.
2018-04-01
Geometric correction is an important preprocessing step in the application of GF4 PMS imagery. Geometric correction based on manual selection of control points is time-consuming and laborious. The more common method, based on a reference image, is automatic image registration, which involves several steps and parameters. For the multi-spectral sensor GF4 PMS, it is necessary to identify the best combination of steps and parameters. This study mainly focuses on the following issues: the necessity of Rational Polynomial Coefficient (RPC) correction before automatic registration, the base band for automatic registration, and the configuration of GF4 PMS spatial resolution.
Application of quantum-behaved particle swarm optimization to motor imagery EEG classification.
Hsu, Wei-Yen
2013-12-01
In this study, we propose a recognition system for single-trial analysis of motor imagery (MI) electroencephalogram (EEG) data. Applying event-related brain potential (ERP) data acquired from the sensorimotor cortices, the system chiefly consists of automatic artifact elimination, feature extraction, feature selection and classification. In addition to the use of independent component analysis, a similarity measure is proposed to further remove the electrooculographic (EOG) artifacts automatically. Several potential features, such as wavelet-fractal features, are then extracted for subsequent classification. Next, quantum-behaved particle swarm optimization (QPSO) is used to select features from the feature combination. Finally, selected sub-features are classified by support vector machine (SVM). Compared with without artifact elimination, feature selection using a genetic algorithm (GA) and feature classification with Fisher's linear discriminant (FLD) on MI data from two data sets for eight subjects, the results indicate that the proposed method is promising in brain-computer interface (BCI) applications.
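The core QPSO position update can be shown compactly on a toy continuous objective standing in for classification accuracy over candidate feature weights; the contraction-expansion schedule below is a common choice, not necessarily the paper's setting.

    import numpy as np

    rng = np.random.default_rng(0)
    fitness = lambda x: np.sum(x ** 2, axis=-1)        # toy objective (minimize)

    n, dim, iters = 20, 8, 100
    x = rng.uniform(-5, 5, (n, dim))
    pbest = x.copy()
    gbest = pbest[fitness(pbest).argmin()].copy()

    for it in range(iters):
        beta = 1.0 - 0.5 * it / iters                  # contraction-expansion factor
        mbest = pbest.mean(axis=0)                     # mean of personal bests
        phi = rng.random((n, dim))
        p = phi * pbest + (1 - phi) * gbest            # per-particle local attractor
        u = rng.random((n, dim))
        sign = np.where(rng.random((n, dim)) < 0.5, -1.0, 1.0)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1 / u)
        improved = fitness(x) < fitness(pbest)
        pbest[improved] = x[improved]
        gbest = pbest[fitness(pbest).argmin()].copy()

    print("best fitness found:", float(fitness(gbest)))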
Alternating evolutionary pressure in a genetic algorithm facilitates protein model selection
Offman, Marc N; Tournier, Alexander L; Bates, Paul A
2008-01-01
Background: Automatic protein modelling pipelines are becoming ever more accurate; this has come hand in hand with an increasingly complicated interplay between all of the components involved. Nevertheless, there are still potential improvements to be made in template selection, refinement and protein model selection. Results: In the context of an automatic modelling pipeline, we analysed each step separately, revealing several non-intuitive trends, and explored a new strategy for protein conformation sampling using Genetic Algorithms (GA). We apply the concept of alternating evolutionary pressure (AEP), i.e. intermediate rounds within the GA runs during which unrestrained, linear growth of the model populations is allowed. Conclusion: This approach improves the overall performance of the GA by allowing models to overcome local energy barriers. AEP enabled the selection of the best models in 40% of all targets, compared to 25% for a normal GA. PMID:18673557
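Schematically, AEP amounts to skipping the culling step in selected generations so the population can grow and cross local energy barriers; the toy "energy" below is a stand-in for a protein model score, and every parameter is illustrative.

    import numpy as np

    rng = np.random.default_rng(2)
    energy = lambda pop: np.sum((pop - 1.0) ** 2, axis=1) + np.sin(5 * pop).sum(axis=1)

    pop = rng.normal(0.0, 2.0, (30, 6))                # 30 candidate "models"
    POP_SIZE, AEP_EVERY = 30, 5
    for gen in range(50):
        children = pop + rng.normal(0.0, 0.3, pop.shape)   # mutation/refinement
        pop = np.vstack([pop, children])
        if gen % AEP_EVERY:                            # normal round: apply pressure
            pop = pop[np.argsort(energy(pop))[:POP_SIZE]]
        # else: AEP round, no culling -- unrestrained linear growth
    print("best energy:", float(energy(pop).min()))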
Mueller, David S.
2013-01-01
profiles from the entire cross section and multiple transects to determine a mean profile for the measurement. The use of an exponent derived from normalized data from the entire cross section is shown to be valid for application of the power velocity distribution law in the computation of the unmeasured discharge in a cross section. Selected statistics are combined with empirically derived criteria to automatically select the appropriate extrapolation methods. A graphical user interface (GUI) provides the user tools to visually evaluate the automatically selected extrapolation methods and manually change them, as necessary. The sensitivity of the total discharge to available extrapolation methods is presented in the GUI. Use of extrap by field hydrographers has demonstrated that extrap is a more accurate and efficient method of determining the appropriate extrapolation methods compared with tools currently (2012) provided in the ADCP manufacturers’ software.
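The power-law step behind extrap can be sketched as a log-log fit of the measured portion of the profile followed by integration over the unmeasured top and bottom; the sample profile, 1/6 exponent and depth limits are invented for illustration.

    import numpy as np

    z = np.linspace(0.2, 0.9, 8)                  # normalized height of measured bins
    u = 1.2 * z ** (1 / 6) + 0.01 * np.random.default_rng(3).normal(size=8)

    b, log_a = np.polyfit(np.log(z), np.log(u), 1)     # fit u = a * z**b in log space
    a = np.exp(log_a)
    top = a / (b + 1) * (1.0 - 0.9 ** (b + 1))         # unmeasured near-surface part
    bottom = a / (b + 1) * 0.2 ** (b + 1)              # unmeasured near-bed part
    print(f"fitted exponent {b:.3f}; unmeasured unit discharge {top + bottom:.4f}")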
A staggered-grid convolutional differentiator for elastic wave modelling
NASA Astrophysics Data System (ADS)
Sun, Weijia; Zhou, Binzhong; Fu, Li-Yun
2015-11-01
The computation of derivatives in governing partial differential equations is one of the most investigated subjects in the numerical simulation of physical wave propagation. An analytical staggered-grid convolutional differentiator (CD) for first-order velocity-stress elastic wave equations is derived in this paper by inverse Fourier transformation of the band-limited spectrum of a first derivative operator. A taper window function is used to truncate the infinite staggered-grid CD stencil. The truncated CD operator is almost as accurate as the analytical solution, and as efficient as the finite-difference (FD) method. The selection of window functions will influence the accuracy of the CD operator in wave simulation. We search for the optimal Gaussian windows for different order CDs by minimizing the spectral error of the derivative and comparing the windows with the normal Hanning window function for tapering the CD operators. It is found that the optimal Gaussian window appears to be similar to the Hanning window function for tapering the same CD operator. We investigate the accuracy of the windowed CD operator and the staggered-grid FD method with different orders. Compared to the conventional staggered-grid FD method, a short staggered-grid CD operator achieves an accuracy equivalent to that of a long FD operator, with lower computational costs. For example, an 8th order staggered-grid CD operator can achieve the same accuracy of a 16th order staggered-grid FD algorithm but with half of the computational resources and time required. Numerical examples from a homogeneous model and a crustal waveguide model are used to illustrate the superiority of the CD operators over the conventional staggered-grid FD operators for the simulation of wave propagations.
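The construction condenses to a few lines: sample the ideal band-limited staggered-grid derivative coefficients, taper the truncated stencil with a window (Hanning here; the paper optimizes a Gaussian), and check the operator against an exact derivative. The grid step and test wavenumber are arbitrary choices.

    import numpy as np

    def cd_coeffs(M):
        # ideal staggered-grid derivative coefficients, Hanning-tapered
        m = np.arange(1, M + 1)
        c = (-1.0) ** (m + 1) * 4.0 / (np.pi * (2 * m - 1) ** 2)
        return c * 0.5 * (1 + np.cos(np.pi * (m - 0.5) / M))

    def staggered_diff(f, c, h):
        M = len(c)
        df = np.full(f.size - 1, np.nan)          # derivative lives at midpoints
        for j in range(M - 1, f.size - M):
            df[j] = sum(c[m] * (f[j + m + 1] - f[j - m]) for m in range(M)) / h
        return df

    h, M = 0.01, 8
    x = np.arange(0.0, 1.0, h)
    f = np.sin(2 * np.pi * 3 * x)
    exact = 6 * np.pi * np.cos(2 * np.pi * 3 * (x[:-1] + h / 2))
    err = staggered_diff(f, cd_coeffs(M), h) - exact
    print("max interior error:", np.nanmax(np.abs(err)))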
Rowe, David; Chambers, Scott; Hampson, Amy; Eastwood, Hayden; Campbell, Luke; O'Leary, Stephen
2016-03-01
Cochlear implant recipients show improved speech perception and music appreciation when residual acoustic hearing is combined with the cochlear implant. However, up to one third of patients lose their pre-operative residual hearing weeks to months after implantation, for reasons that are not well understood. This study tested whether this "delayed" hearing loss was influenced by the route of electrode array insertion and/or position of the electrode array within scala tympani in a guinea pig model of cochlear implantation. Five treatment groups were monitored over 12 weeks: (1) round window implant; (2) round window incised with no implant; (3) cochleostomy with medially-oriented implant; (4) cochleostomy with laterally-oriented implant; and (5) cochleostomy with no implant. Hearing was measured at selected time points by the auditory brainstem response. Cochlear condition was assessed histologically, with cochleae three-dimensionally reconstructed to plot electrode paths and estimate tissue response. Electrode array trajectories matched their intended paths. Arrays inserted via the round window were situated nearer to the basilar membrane and organ of Corti over the majority of their intrascalar path compared with arrays inserted via cochleostomy. Round window interventions exhibited delayed, low frequency hearing loss that was not seen after cochleostomy. This hearing loss appeared unrelated to the extent of tissue reaction or injury within scala tympani, although round window insertion was histologically the most traumatic mode of implantation. We speculate that delayed hearing loss was related not to the electrode position as postulated, but rather to the muscle graft used to seal the round window post-intervention, by altering cochlear mechanics via round window fibrosis. Copyright © 2015 Elsevier B.V. All rights reserved.
System for definition of the central-chest vasculature
NASA Astrophysics Data System (ADS)
Taeprasartsit, Pinyo; Higgins, William E.
2009-02-01
Accurate definition of the central-chest vasculature from three-dimensional (3D) multi-detector CT (MDCT) images is important for pulmonary applications. For instance, the aorta and pulmonary artery help in automatic definition of the Mountain lymph-node stations for lung-cancer staging. This work presents a system for defining major vascular structures in the central chest. The system provides automatic methods for extracting the aorta and pulmonary artery and semi-automatic methods for extracting the other major central chest arteries/veins, such as the superior vena cava and azygos vein. Automatic aorta and pulmonary artery extraction are performed by model fitting and selection. The system also extracts certain vascular structure information to validate outputs. A semi-automatic method extracts vasculature by finding the medial axes between provided important sites. Results of the system are applied to lymph-node station definition and guidance of bronchoscopic biopsy.
A chest-shape target automatic detection method based on Deformable Part Models
NASA Astrophysics Data System (ADS)
Zhang, Mo; Jin, Weiqi; Li, Li
2016-10-01
Automatic weapon platforms are an important research direction both domestically and overseas; such a platform must quickly find the object to be shot against a complex background. Fast detection of a given target is therefore the foundation of further tasks. Considering that the chest-shape target is a common target in shooting practice, this paper takes the chest-shape target as the object of interest and studies an automatic target detection method based on Deformable Part Models. The algorithm computes Histogram of Oriented Gradients (HOG) features of the target and trains a model using a latent-variable Support Vector Machine (SVM); in this model, the target image is divided into several parts, yielding a root filter and part filters. Finally, the algorithm detects the target on the HOG feature pyramid using a sliding window. The running time of extracting the HOG pyramid can be shortened by 36% using a lookup table. The results indicate that this algorithm can detect the chest-shape target in natural environments, indoors or outdoors. The true positive rate of detection reaches 76% with many hard samples, and the false positive rate approaches 0. Running on a PC (Intel(R) Core(TM) i5-4200H CPU) with C++, the detection time for images with a resolution of 640 x 480 is 2.093 s. Based on TI's runtime libraries for image pyramids and convolution on the DM642 and other hardware, our detection algorithm is expected to be implementable on a hardware platform, and it has application prospects in actual systems.
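A stripped-down version of the detection pipeline, with HOG features scored by a linear SVM inside a sliding window; the DPM root/part decomposition and the feature pyramid are omitted, and synthetic bright-blob patches stand in for real chest-shape training data.

    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)

    def make_target():                            # bright patch on dark background
        img = rng.normal(0.1, 0.05, (64, 64))
        img[16:48, 20:44] += 0.8
        return np.clip(img, 0, 1)

    feat = lambda p: hog(p, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    pos = [make_target() for _ in range(40)]
    neg = [rng.random((64, 64)) * 0.3 for _ in range(40)]
    clf = LinearSVC().fit([feat(p) for p in pos + neg], [1] * 40 + [0] * 40)

    scene = rng.random((128, 128)) * 0.3
    scene[40:104, 30:94] = make_target()          # paste one target into the scene
    best, best_score = None, -np.inf
    for y in range(0, 128 - 64 + 1, 8):           # stride-8 sliding window
        for x in range(0, 128 - 64 + 1, 8):
            s = clf.decision_function([feat(scene[y:y + 64, x:x + 64])])[0]
            if s > best_score:
                best, best_score = (x, y), s
    print("detected window at", best, "score", round(best_score, 2))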
Automatic representation of urban terrain models for simulations on the example of VBS2
NASA Astrophysics Data System (ADS)
Bulatov, Dimitri; Häufel, Gisela; Solbrig, Peter; Wernerus, Peter
2014-10-01
Virtual simulations have been on the rise together with the fast progress of rendering engines and graphics hardware. Especially in military applications, offensive actions in modern peace-keeping missions have to be quick, firm and precise, especially under the conditions of asymmetric warfare, non-cooperative urban terrain and rapidly developing situations. Going through a mission in simulation can prepare the minds of soldiers and leaders, increase self-confidence and tactical awareness, and ultimately save lives. This work illustrates the potential and limitations of integrating semantic urban terrain models into a simulation. Our system of choice is Virtual Battle Space 2, a simulation system created by Bohemia Interactive Simulations. The topographic object types that we are able to export into this simulation engine are either results of sensor data evaluation (buildings, trees, grass and ground), which is done fully automatically, or entities obtained from publicly available sources (streets and water areas), which can be converted into the system's format with a few mouse clicks. The focus of this work lies in integrating information about building facades into the simulation. We are inspired by state-of-the-art methods that allow for the automatic extraction of doors and windows in laser point clouds captured from building walls, thereby increasing the level of detail of building models. It is consequently important to simulate these animatable entities. In doing so, we are able to make some of the buildings in the simulation accessible.
Design of an automatic production monitoring system on job shop manufacturing
NASA Astrophysics Data System (ADS)
Prasetyo, Hoedi; Sugiarto, Yohanes; Rosyidi, Cucuk Nur
2018-02-01
Every production process requires a monitoring system so that the desired efficiency and productivity can be checked at any time. Such a system is also needed in job shop manufacturing, where performance is mainly influenced by the manufacturing lead time, and processing time is one of the factors that affect it. In a conventional company, the recording of processing time is done manually by the operator on a sheet of paper, a method that is prone to errors. This paper aims to overcome this problem with a system that records and monitors the processing time automatically. The solution is realized by utilizing an electric current sensor, barcodes, RFID, a wireless network and a Windows-based application. An automatic monitoring device is attached to the production machine and is equipped with a touch-screen LCD so that the operator can use it easily. The operator's identity is recorded through RFID embedded in his ID card. The workpiece data are retrieved from the database by scanning the barcode listed on its monitoring sheet. A sensor mounted on the machine measures the actual machining time. The system's outputs are the actual processing time and machine capacity information. The system is connected wirelessly to a workshop planning application belonging to the firm. Test results indicated that all functions of the system run properly. The system successfully enables supervisors, PPIC or higher-level management staff to monitor the processing time quickly and with better accuracy.
Oregon Washington Coastal Ocean Forecast System: Real-time Modeling and Data Assimilation
NASA Astrophysics Data System (ADS)
Erofeeva, S.; Kurapov, A. L.; Pasmans, I.
2016-02-01
Three-day forecasts of ocean currents, temperature and salinity along the Oregon and Washington coasts are produced daily by a numerical ROMS-based ocean circulation model. NAM is used to derive atmospheric forcing for the model. Fresh water discharge from the Columbia River, the Fraser River, and small rivers in Puget Sound is included. The forecast is constrained by open boundary conditions derived from the global Navy HYCOM model and by assimilation, once every 3 days, of recent data, including HF radar surface currents, sea surface temperature from the GOES satellite, and SSH from several satellite altimetry missions. Four-dimensional variational data assimilation is implemented in 3-day time windows using the tangent linear and adjoint codes developed at OSU. The system is semi-autonomous: all the data, including the NAM and HYCOM fields, are automatically updated, and the daily operational forecast is automatically initiated. The pre-assimilation data quality control and post-assimilation forecast quality control require the operator's involvement. The daily forecast and 60 days of hindcast fields are available to the public via OPeNDAP. As part of the system, model validation plots against various satellites and SEAGLIDER are also automatically updated and available on the web (http://ingria.coas.oregonstate.edu/rtdavow/). Lessons learned in this pilot real-time coastal ocean forecasting project help develop and test metrics for forecast skill assessment for the West Coast Operational Forecast System (WCOFS), currently in the testing and development phase at the National Oceanic and Atmospheric Administration (NOAA).
ERIC Educational Resources Information Center
Khatib, Mohammad; Fat'hi, Jalil
2011-01-01
Prompted by the recent shift of attention from just focusing on the top-down processing in L2 reading towards considering the basic component, bottom-up processing, the role of phonological component has also enjoyed popularity among a selected circle of SLA investigators (Koda, 2005). This study investigated the effect of the automatization of…
Exploring the Developmental Changes in Automatic Two-Digit Number Processing
ERIC Educational Resources Information Center
Chan, Winnie Wai Lan; Au, Terry K.; Tang, Joey
2011-01-01
Even when two-digit numbers are irrelevant to the task at hand, adults process them. Do children process numbers automatically, and if so, what kind of information is activated? In a novel dot-number Stroop task, children (Grades 1-5) and adults were shown two different two-digit numbers made up of dots. Participants were asked to select the…
A customizable commercial miniaturized 320×256 indium gallium arsenide shortwave infrared camera
NASA Astrophysics Data System (ADS)
Huang, Shih-Che; O'Grady, Matthew; Groppe, Joseph V.; Ettenberg, Martin H.; Brubaker, Robert M.
2004-10-01
The design and performance of a commercial short-wave-infrared (SWIR) InGaAs microcamera engine is presented. The 0.9-to-1.7 micron SWIR imaging system consists of a room-temperature-TEC-stabilized, 320x256 (25 μm pitch) InGaAs focal plane array (FPA) and a high-performance, highly customizable image-processing set of electronics. The detectivity, D*, of the system is greater than 10^13 cm·√Hz/W at 1.55 μm, and this sensitivity may be adjusted in real time over 100 dB. It features snapshot-mode integration with a minimum exposure time of 130 μs. The digital video processor provides real-time pixel-to-pixel, 2-point dark-current subtraction and non-uniformity compensation along with defective-pixel substitution. Other features include automatic gain control (AGC), gamma correction, 7 preset configurations, adjustable exposure time, external triggering, and windowing. The windowing feature is highly flexible; the region of interest (ROI) may be placed anywhere on the imager and can be varied at will. Windowing allows for high-speed readout enabling such applications as target acquisition and tracking; for example, a 32x32 ROI window may be read out at over 3500 frames per second (fps). Output video is provided as EIA170-compatible analog, or as 12-bit CameraLink-compatible digital. All the above features are accomplished in a small volume < 28 cm³, weight < 70 g, and with low power consumption < 1.3 W at room temperature using this new microcamera engine. Video processing is based on a field-programmable gate array (FPGA) platform with a soft-embedded processor that allows for ease of integration/addition of customer-specific algorithms, processes, or design requirements. The camera was developed with high-performance, space-restricted, power-conscious applications in mind, such as robotic or UAV deployment.
Gronchi, G; Righi, S; Pierguidi, L; Giovannelli, F; Murasecco, I; Viggiano, M P
2018-04-01
The positivity effect in the elderly consists of an attentional preference for positive information as well as avoidance of negative information. Extant theories predict that the positivity effect depends either on controlled attentional processes (socio-emotional selectivity theory) or on an automatic gating selection mechanism (dynamic integration theory). This study examined the role of automatic and controlled attention in the positivity effect. Two dot-probe tasks (with stimulus durations of 100 ms and 500 ms, respectively) were employed to compare the attentional bias of 35 elderly people to that of 35 young adults. The stimuli were expressive faces displaying neutral, disgusted, fearful, and happy expressions. In comparison to young adults, the elderly allocated more attention to happy faces at 100 ms and tended to avoid fearful faces at 500 ms. The findings are not predicted by either theory taken alone, but support the hypothesis that the positivity effect in the elderly is driven by two different processes: an automatic attention bias toward positive stimuli, and a controlled mechanism that diverts attention away from negative stimuli. Copyright © 2018 Elsevier B.V. All rights reserved.
Excoffier, Laurent; Lischer, Heidi E L
2010-05-01
We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.
Towards a portable Raman spectrometer using a concave grating and a time-gated CMOS SPAD.
Li, Zhiyun; Deen, M Jamal
2014-07-28
A low-cost, compact Raman spectrometer suitable for on-line water monitoring applications is explored. A custom-designed concave grating for wavelength selection was fabricated and tested. The detection of the Raman signal is accomplished with a time-gated single photon avalanche diode (TG-SPAD). A fixed gate window of 3.5 ns is designed and applied to the TG-SPAD. The temporal resolution of the SPAD was ~60 ps when tested with a 7 ps, 532 nm solid-state laser. To test the efficiency of the gating in fluorescence signal suppression, different detection windows (3 ns to 0.25 ns) within the 3.5 ns gate window are used to measure the Raman spectra of Rhodamine B. Strong Raman peaks are resolved with this low-cost system.
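The benefit of narrowing the detection window can be illustrated with a simple photon-arrival-time model. The Python sketch below is not the authors' processing code: it assumes an instantaneous Raman response that follows the laser pulse and a single-exponential fluorescence lifetime of a few nanoseconds (both values are made up), and it estimates how the Raman-to-fluorescence count ratio improves as the detection window shrinks.

import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters (not from the paper): laser pulse width, fluorescence
# lifetime, and the number of Raman and fluorescence photons reaching the SPAD.
pulse_sigma_ns = 0.007 / 2.355      # ~7 ps FWHM excitation pulse
fluor_tau_ns = 3.0                  # assumed fluorescence lifetime
n_raman, n_fluor = 10_000, 200_000

# Photon arrival times: Raman follows the pulse; fluorescence decays exponentially.
t_raman = rng.normal(0.0, pulse_sigma_ns, n_raman)
t_fluor = rng.normal(0.0, pulse_sigma_ns, n_fluor) + rng.exponential(fluor_tau_ns, n_fluor)

for window_ns in (3.0, 1.0, 0.5, 0.25):
    raman = np.sum((t_raman >= 0.0) & (t_raman <= window_ns))
    fluor = np.sum((t_fluor >= 0.0) & (t_fluor <= window_ns))
    print(f"{window_ns:4.2f} ns window: Raman/fluorescence count ratio = {raman / fluor:.3f}")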
Smart windows based on cholesteric liquid crystals (Conference Presentation)
NASA Astrophysics Data System (ADS)
Khandelwal, Hitesh; Debije, Michael G.; Schenning, Albert P. H. J.
2017-02-01
With the increase in global warming, the use of active cooling and heating devices is continuously increasing to maintain the interior temperature of the built environment, greenhouses, and cars. To reduce the tremendous amount of energy consumed by cooling and heating devices, we need improved control of transparent features (i.e., windows). In this respect, a smart window capable of reflecting solar infrared energy without interfering with visible light would be very attractive. Most of the technologies developed so far control visible light; they block visual contact with the outside world, which has negative effects on human health. An appealing method to selectively control infrared transmission is to utilize the reflection properties of cholesteric liquid crystals. In our research, we have fabricated a smart window capable of reflecting different amounts of solar infrared energy depending on the specific climate conditions. The reflection bandwidth can be tuned from 120 nm to 1100 nm in the infrared region without interfering with visible solar radiation. Calculations reveal that between 8% and 45% of incident solar infrared light can be reflected with a single cell. Simulation studies predicted that more than 12% of the energy spent on heating, cooling and lighting in the built environment can be saved by using the fabricated smart window compared to a standard double-glazed window.
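For orientation, the reflection band of a cholesteric liquid crystal is governed by the well-known relations λ_c = n̄·p and Δλ = Δn·p, where p is the helical pitch, n̄ the average refractive index, and Δn the birefringence. The short Python sketch below only illustrates how pitch tuning shifts the infrared reflection band; the refractive indices and pitches are assumed values, not figures from this presentation.

# Cholesteric reflection band: center = n_avg * pitch, width = delta_n * pitch.
n_o, n_e = 1.5, 1.7                       # assumed ordinary/extraordinary indices
n_avg, delta_n = (n_o + n_e) / 2, n_e - n_o

for pitch_nm in (500, 700, 900):          # assumed helical pitches
    center = n_avg * pitch_nm
    width = delta_n * pitch_nm
    print(f"pitch {pitch_nm} nm -> reflection band "
          f"{center - width / 2:.0f}-{center + width / 2:.0f} nm "
          f"(center {center:.0f} nm, width {width:.0f} nm)")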
Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang
2017-04-26
This paper develops an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model when mapping a specific land cover, by integrating training and window-based validation sets. Compared to the conventional approach, in which the validation set includes target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tighter hypersphere because of the compact constraint imposed by outlier pixels located adjacent to the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covered only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size, and the overall accuracies were higher than 88% at all window sizes. Moreover, WVS-SVDD showed much less sensitivity to untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, the tradeoff coefficient (C) and kernel width (s), in mapping a homogeneous specific land cover.
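As an illustration of the parameter-selection idea only (not the authors' implementation), the sketch below uses scikit-learn's OneClassSVM, which with an RBF kernel behaves like SVDD, and keeps the (nu, gamma) pair that best separates target from outlier pixels in a validation set; the spectral features are random stand-ins, and the window-based construction of the validation set is not reproduced.

import numpy as np
from sklearn.svm import OneClassSVM

def select_svdd_parameters(train_target, val_target, val_outlier,
                           nus=(0.01, 0.05, 0.1), gammas=(0.01, 0.1, 1.0)):
    """Grid search for a one-class SVM (an SVDD analogue) using a validation
    set that mixes target pixels with spectrally neighbouring outlier pixels."""
    best = None
    for nu in nus:
        for gamma in gammas:
            model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(train_target)
            acc_target = np.mean(model.predict(val_target) == 1)     # inside hypersphere
            acc_outlier = np.mean(model.predict(val_outlier) == -1)  # outside hypersphere
            score = 0.5 * (acc_target + acc_outlier)
            if best is None or score > best[0]:
                best = (score, nu, gamma)
    return best

# Hypothetical spectral features: rows are pixels, columns are bands.
rng = np.random.default_rng(1)
train_target = rng.normal(0.0, 1.0, (200, 6))
val_target = rng.normal(0.0, 1.0, (50, 6))
val_outlier = rng.normal(2.5, 1.0, (50, 6))   # stand-in for neighbouring classes
score, nu, gamma = select_svdd_parameters(train_target, val_target, val_outlier)
print(f"best nu={nu}, gamma={gamma}, balanced accuracy={score:.2f}")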
Rivolo, Simone; Nagel, Eike; Smith, Nicolas P; Lee, Jack
2014-01-01
Coronary Wave Intensity Analysis (cWIA) is a technique capable of separating the effects of proximal arterial haemodynamics from cardiac mechanics. The ability of cWIA to establish a mechanistic link between coronary haemodynamics measurements and the underlying pathophysiology has been widely demonstrated, and the prognostic value of a cWIA-derived metric has recently been proved. However, the clinical application of cWIA has been hindered by its strong dependence on the practitioner, mainly ascribable to the sensitivity of the cWIA-derived indices to the pre-processing parameters. Specifically, as recently demonstrated, the cWIA-derived metrics are strongly sensitive to the Savitzky-Golay (S-G) filter typically used to smooth the acquired traces. This is mainly due to the inability of the S-G filter to deal with the different timescale features present in the measured waveforms. Therefore, we propose to apply an adaptive S-G algorithm that automatically selects the optimal filter parameters pointwise. The accuracy of the newly proposed algorithm is assessed against a cWIA gold standard, provided by a newly developed in-silico cWIA modelling framework, when physiological noise is added to the simulated traces. The adaptive S-G algorithm, when used to automatically select the polynomial degree of the S-G filter, provides satisfactory results, with ≤ 10% error for all the metrics across all the noise levels tested. The newly proposed method therefore makes cWIA fully automatic and independent of the practitioner, opening the possibility of multi-centre trials.
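The idea of pointwise parameter selection can be sketched as follows. This is not the published adaptive S-G algorithm but a simple stand-in: for each sample it fits local polynomials of several degrees, scores them by the residual variance corrected for the number of coefficients, and keeps the Savitzky-Golay output of the winning degree; the window length and candidate degrees are assumed.

import numpy as np
from scipy.signal import savgol_filter

def adaptive_savgol(signal, window_length=11, polyorders=(2, 3, 4, 5)):
    """Pointwise selection of the Savitzky-Golay polynomial order (heuristic).
    For each sample, the order with the smallest degree-of-freedom-corrected
    residual variance over the surrounding window is used."""
    half = window_length // 2
    padded = np.pad(np.asarray(signal, dtype=float), half, mode="reflect")
    # Filtered signal for every candidate order, computed once.
    candidates = {p: savgol_filter(signal, window_length, p) for p in polyorders}
    x = np.arange(window_length)
    out = np.empty(len(signal))
    for i in range(len(signal)):
        window = padded[i:i + window_length]
        best_p, best_score = polyorders[0], np.inf
        for p in polyorders:
            resid = window - np.polyval(np.polyfit(x, window, p), x)
            score = np.sum(resid ** 2) / (window_length - p - 1)  # corrected variance
            if score < best_score:
                best_p, best_score = p, score
        out[i] = candidates[best_p][i]
    return out

t = np.linspace(0.0, 1.0, 500)
trace = np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
smoothed = adaptive_savgol(trace)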
Gorlin, Yelena; Jaramillo, Thomas F.
2014-01-01
The selection of an appropriate substrate is an important initial step for many studies of electrochemically active materials. In order to help researchers with the substrate selection process, we employ a consistent experimental methodology to evaluate the electrochemical reactivity and stability of seven potential substrate materials for electrocatalyst and photoelectrode evaluation. Using cyclic voltammetry with a progressively increased scan range, we characterize three transparent conducting oxides (indium tin oxide, fluorine-doped tin oxide, and aluminum-doped zinc oxide) and four opaque conductors (gold, stainless steel 304, glassy carbon, and highly oriented pyrolytic graphite) in three different electrolytes (sulfuric acid, sodium acetate, and sodium hydroxide). We determine the inert potential window for each substrate/electrolyte combination and make recommendations about which materials may be most suitable for application under different experimental conditions. Furthermore, the testing methodology provides a framework for other researchers to evaluate and report the baseline activity of other substrates of interest to the broader community. PMID:25357131
Benck, Jesse D.; Pinaud, Blaise A.; Gorlin, Yelena; ...
2014-10-30
The selection of an appropriate substrate is an important initial step for many studies of electrochemically active materials. In order to help researchers with the substrate selection process, we employ a consistent experimental methodology to evaluate the electrochemical reactivity and stability of seven potential substrate materials for electrocatalyst and photoelectrode evaluation. Using cyclic voltammetry with a progressively increased scan range, we characterize three transparent conducting oxides (indium tin oxide, fluorine-doped tin oxide, and aluminum-doped zinc oxide) and four opaque conductors (gold, stainless steel 304, glassy carbon, and highly oriented pyrolytic graphite) in three different electrolytes (sulfuric acid, sodium acetate, and sodium hydroxide). Here, we determine the inert potential window for each substrate/electrolyte combination and make recommendations about which materials may be most suitable for application under different experimental conditions. Furthermore, the testing methodology provides a framework for other researchers to evaluate and report the baseline activity of other substrates of interest to the broader community.
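A rough illustration of how an inert potential window might be read off from cyclic voltammetry data is given below. This is a generic thresholding sketch, not the authors' protocol: it scans outward from an assumed quiet potential and reports where the current-density magnitude first exceeds an arbitrary tolerance, using a synthetic CV trace.

import numpy as np

def inert_potential_window(potential_V, current_density, start_V=0.0, tolerance=0.1):
    """Estimate the inert potential window from a CV trace by scanning outward
    from start_V until |current density| exceeds an assumed tolerance.
    (Sorting by potential merges forward and reverse sweeps, which is
    acceptable for this illustration.)"""
    order = np.argsort(potential_V)
    E = np.asarray(potential_V)[order]
    j = np.abs(np.asarray(current_density))[order]
    i0 = min(np.searchsorted(E, start_V), len(E) - 1)
    lower = E[0]
    for i in range(i0, -1, -1):            # scan towards negative potentials
        if j[i] > tolerance:
            lower = E[i]
            break
    upper = E[-1]
    for i in range(i0, len(E)):            # scan towards positive potentials
        if j[i] > tolerance:
            upper = E[i]
            break
    return lower, upper

# Synthetic trace: flat double-layer region with exponential rises at both edges.
E = np.linspace(-1.0, 2.0, 600)
j = 0.02 * np.sign(E) + 5.0 * (np.exp(-(E + 0.8) * 10) + np.exp((E - 1.6) * 10))
print(inert_potential_window(E, j))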
Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali
2017-01-01
Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, version 10 (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five feature subset sizes, and five machine learning classifiers. Model performance was evaluated using macro precision, macro recall, macro F-measure, accuracy, and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures, approaching 85% to 90% for most metrics, with a feature subset size of 30. The proposed system also showed approximately 14% to 16% improvement in overall accuracy compared with existing techniques and the four baselines. The proposed system is feasible and practical to use for automatic classification of ICD-10-related causes of death from autopsy reports. It assists pathologists to accurately and rapidly determine the underlying cause of death based on autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports.
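A minimal, hypothetical analogue of such a pipeline (unigram features, a fixed feature subset of size 30, and a random forest classifier) can be assembled with scikit-learn as shown below; the report snippets, labels, and the chi-squared feature-selection scheme are placeholders rather than the study's actual data or expert-driven feature list.

from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Placeholder autopsy report snippets and cause-of-death labels (illustrative only).
reports = [
    "multiple rib fractures with lung laceration after motor vehicle collision",
    "drowning with water in lungs found near river bank",
    "blunt head trauma from fall down staircase",
    "thermal burns over most of body surface after house fire",
] * 25
labels = ["transport accident", "drowning", "fall", "fire"] * 25

X_train, X_test, y_train, y_test = train_test_split(
    reports, labels, test_size=0.3, random_state=42, stratify=labels)

model = Pipeline([
    ("unigrams", CountVectorizer(ngram_range=(1, 1))),    # unigram features
    ("select", SelectKBest(chi2, k=30)),                  # feature subset of size 30
    ("forest", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))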
Surface Fitting Filtering of LIDAR Point Cloud with Waveform Information
NASA Astrophysics Data System (ADS)
Xing, S.; Li, P.; Xu, Q.; Wang, D.; Li, P.
2017-09-01
Full-waveform LiDAR is an active photogrammetry and remote sensing technology. It provides more detailed information about objects along the path of a laser pulse than discrete-return topographic LiDAR. High-quality point cloud and waveform information can be obtained by waveform decomposition, which can contribute to accurate filtering. A surface fitting filtering method using waveform information is proposed to exploit this advantage. Firstly, the discrete point cloud and waveform parameters are resolved by globally convergent Levenberg-Marquardt decomposition. Secondly, ground seed points are selected, and abnormal ones are detected using waveform parameters and robust estimation. Thirdly, the terrain surface is fitted, and the height difference threshold is determined in consideration of the window size and mean square error. Finally, the points are classified progressively as the window size increases; the filtering process terminates when the window size exceeds a threshold. Waveform data from urban, farmland, and mountain areas of the WATER (Watershed Allied Telemetry Experimental Research) campaign are selected for the experiments. Results show that, compared with the traditional method, the accuracy of point cloud filtering is further improved, and the proposed method has high practical value.
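The progressive, window-based logic of such a ground filter can be illustrated with the simplified sketch below: for each window size, the lowest point per cell is taken as a ground seed, a plane is fitted to the seeds, and points close to the fitted surface are accepted. The waveform-parameter and robust-estimation steps of the actual method are omitted, and all coordinates and thresholds are assumed.

import numpy as np

def surface_fit_filter(points, window_sizes=(2.0, 4.0, 8.0), dz_threshold=0.3):
    """Simplified progressive surface-fitting ground filter.
    points: (N, 3) array of x, y, z. For each window size, ground seeds are the
    lowest points per grid cell, a plane is fitted to the seeds, and points whose
    height above the plane is below dz_threshold are accepted as ground."""
    ground = np.zeros(len(points), dtype=bool)
    for cell in window_sizes:
        # Ground seeds: lowest point in each cell of the current window size.
        ij = np.floor(points[:, :2] / cell).astype(int)
        seeds = {}
        for idx, key in enumerate(map(tuple, ij)):
            if key not in seeds or points[idx, 2] < points[seeds[key], 2]:
                seeds[key] = idx
        seed_idx = np.fromiter(seeds.values(), dtype=int)

        # Fit a plane z = a*x + b*y + c to the seeds by least squares.
        A = np.c_[points[seed_idx, 0], points[seed_idx, 1], np.ones(len(seed_idx))]
        coef, *_ = np.linalg.lstsq(A, points[seed_idx, 2], rcond=None)

        # Classify: points close to the fitted surface are ground.
        z_fit = points[:, 0] * coef[0] + points[:, 1] * coef[1] + coef[2]
        ground |= (points[:, 2] - z_fit) < dz_threshold
    return ground

rng = np.random.default_rng(0)
xy = rng.uniform(0, 50, (2000, 2))
z = 0.02 * xy[:, 0] + 0.01 * xy[:, 1] + rng.normal(0, 0.05, 2000)
z[:200] += rng.uniform(2, 10, 200)            # add some "building/vegetation" points
print("ground points:", surface_fit_filter(np.c_[xy, z]).sum())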
Mutual information-based facial expression recognition
NASA Astrophysics Data System (ADS)
Hazar, Mliki; Hammami, Mohamed; Hanêne, Ben-Abdallah
2013-12-01
This paper introduces a novel low-computation discriminative region representation for the expression analysis task. The proposed approach relies on studies in psychology which show that most of the regions descriptive of and responsible for facial expression are located around certain face parts. The contributions of this work lie in the proposition of a new approach that supports automatic facial expression recognition based on automatic region selection. The region selection step aims to select the descriptive regions responsible for facial expression and was performed using the Mutual Information (MI) technique. For facial feature extraction, we applied Local Binary Patterns (LBP) on the gradient image to encode salient micro-patterns of facial expressions. Experimental studies have shown that using discriminative regions provides better results than using the whole face region while reducing the feature vector dimension.
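A minimal sketch of the feature-extraction step, LBP computed on a gradient-magnitude image, is given below using SciPy and scikit-image with a synthetic patch instead of face data; the mutual-information region selection is not reproduced.

import numpy as np
from scipy import ndimage
from skimage.feature import local_binary_pattern

def lbp_on_gradient(image, radius=1, n_points=8):
    """Uniform-LBP histogram computed on the gradient magnitude of an image."""
    gx = ndimage.sobel(image.astype(float), axis=1)
    gy = ndimage.sobel(image.astype(float), axis=0)
    gradient = np.hypot(gx, gy)
    # Rescale to 8-bit so the LBP thresholding works on integer levels.
    gradient = (255 * gradient / max(gradient.max(), 1e-9)).astype(np.uint8)
    lbp = local_binary_pattern(gradient, n_points, radius, method="uniform")
    n_bins = n_points + 2                    # number of uniform LBP codes
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# Synthetic stand-in for a face region (e.g., a mouth or eye patch).
patch = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.uint8)
print(lbp_on_gradient(patch))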
NASA Technical Reports Server (NTRS)
Spirkovska, Liljana (Inventor)
2006-01-01
Method and system for automatically displaying, visually and/or audibly and/or by an audible alarm signal, relevant weather data for an identified aircraft pilot, when each of a selected subset of measured or estimated aviation situation parameters, corresponding to a given aviation situation, has a value lying in a selected range. Each range for a particular pilot may be a default range, may be entered by the pilot and/or may be automatically determined from experience and may be subsequently edited by the pilot to change a range and to add or delete parameters describing a situation for which a display should be provided. The pilot can also verbally activate an audible display or visual display of selected information by verbal entry of a first command or a second command, respectively, that specifies the information required.
ERIC Educational Resources Information Center
Piper, Jim
1998-01-01
Discusses the importance of paying attention to facility requirements when selecting windows during a school building retrofit. Facility requirements to consider include security needs, lighting, energy conservation, and ease and cost of maintenance. (GR)
Slicing Method for curved façade and window extraction from point clouds
NASA Astrophysics Data System (ADS)
Iman Zolanvari, S. M.; Laefer, Debra F.
2016-09-01
Laser scanning technology is a fast and reliable method to survey structures. However, the automatic conversion of such data into solid models for computation remains a major challenge, especially where non-rectilinear features are present. Since openings and the overall dimensions of buildings are the most critical elements in computational models for structural analysis, this article introduces the Slicing Method as a new, computationally efficient method for extracting overall façade and window boundary points and reconstructing a façade into a geometry compatible with computational modelling. After finding a principal plane, the technique slices a façade into limited portions, with each slice representing a unique, imaginary section passing through the building. This is done along the façade's principal axes to segregate window and door openings from structural portions of the load-bearing masonry walls. The method detects each opening area's boundaries, as well as the overall boundary of the façade, in part by using a one-dimensional projection to accelerate processing. Slicing was optimised at 14.3 slices per vertical metre of building and 25 slices per horizontal metre of building, irrespective of building configuration or complexity. The proposed procedure was validated by its application to three highly decorative, historic brick buildings. Accuracy in excess of 93% was achieved with no manual intervention on highly complex buildings, and nearly 100% on simple ones. Furthermore, computational times were less than 3 sec for datasets of up to 2.6 million points, while similar existing approaches required more than 16 hr for such datasets.
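The core slicing-and-projection idea can be sketched as follows (a toy illustration with a synthetic planar façade slice, not the published algorithm): the points of one horizontal slice are projected onto the façade's horizontal axis, and gaps in the projected point density are reported as candidate opening intervals.

import numpy as np

def openings_in_slice(slice_points_x, facade_min_x, facade_max_x, bin_width=0.1,
                      empty_threshold=1):
    """Project one horizontal slice of façade points onto the x axis and return
    the (x_start, x_end) intervals whose point count falls below a threshold,
    i.e., candidate window/door openings. Units and thresholds are assumed."""
    edges = np.arange(facade_min_x, facade_max_x + bin_width, bin_width)
    counts, _ = np.histogram(slice_points_x, bins=edges)
    openings, start = [], None
    for i, c in enumerate(counts):
        if c < empty_threshold and start is None:
            start = edges[i]
        elif c >= empty_threshold and start is not None:
            openings.append((start, edges[i]))
            start = None
    if start is not None:
        openings.append((start, edges[-1]))
    return openings

# Synthetic façade slice: dense wall points with a gap between x = 2 m and x = 3 m.
rng = np.random.default_rng(0)
wall_x = np.concatenate([rng.uniform(0.0, 2.0, 400), rng.uniform(3.0, 6.0, 600)])
print(openings_in_slice(wall_x, 0.0, 6.0))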
Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...
2015-06-30
The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class supercomputer.
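For context, the weight-window mechanism that these maps drive follows the standard splitting/rouletting rule of Monte Carlo variance reduction; the sketch below shows that generic rule (it is not the CADIS/FW-CADIS implementation): particles heavier than the window's upper bound are split, and particles lighter than the lower bound play Russian roulette.

import random

def apply_weight_window(weight, w_lower, w_upper, survival_weight=None):
    """Standard weight-window check for one particle.
    Returns a list of (possibly split or rouletted) particle weights."""
    if survival_weight is None:
        survival_weight = 0.5 * (w_lower + w_upper)
    if weight > w_upper:
        # Split into n copies so each copy falls back inside the window.
        n = int(weight / w_upper) + 1
        return [weight / n] * n
    if weight < w_lower:
        # Russian roulette: survive with probability weight / survival_weight.
        if random.random() < weight / survival_weight:
            return [survival_weight]
        return []
    return [weight]

print(apply_weight_window(5.0, 0.5, 2.0))   # heavy particle: split
print(apply_weight_window(0.1, 0.5, 2.0))   # light particle: roulette (survives or is killed)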
Layout pattern analysis using the Voronoi diagram of line segments
NASA Astrophysics Data System (ADS)
Dey, Sandeep Kumar; Cheilaris, Panagiotis; Gabrani, Maria; Papadopoulou, Evanthia
2016-01-01
Early identification of problematic patterns in very large scale integration (VLSI) designs is of great value as the lithographic simulation tools face significant timing challenges. To reduce the processing time, such a tool selects only a fraction of possible patterns which have a probable area of failure, with the risk of missing some problematic patterns. We introduce a fast method to automatically extract patterns based on their structure and context, using the Voronoi diagram of line-segments as derived from the edges of VLSI design shapes. Designers put line segments around the problematic locations in patterns called "gauges," along which the critical distance is measured. The gauge center is the midpoint of a gauge. We first use the Voronoi diagram of VLSI shapes to identify possible problematic locations, represented as gauge centers. Then we use the derived locations to extract windows containing the problematic patterns from the design layout. The problematic locations are prioritized by the shape and proximity information of the design polygons. We perform experiments for pattern selection in a portion of a 22-nm random logic design layout. The design layout had 38,584 design polygons (consisting of 199,946 line segments) on layer Mx, and 7079 markers generated by an optical rule checker (ORC) tool. The optical rules specify requirements for printing circuits with minimum dimension. Markers are the locations of some optical rule violations in the layout. We verify our approach by comparing the coverage of our extracted patterns to the ORC-generated markers. We further derive a similarity measure between patterns and between layouts. The similarity measure helps to identify a set of representative gauges that reduces the number of patterns for analysis.
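SciPy has no Voronoi diagram of line segments, but for illustration the construction can be approximated by sampling points densely along each segment and building an ordinary point Voronoi diagram; the sketch below uses this approximation with made-up segment coordinates, not actual layout data.

import numpy as np
from scipy.spatial import Voronoi

def segment_voronoi_approx(segments, samples_per_segment=20):
    """Approximate the Voronoi diagram of line segments by densely sampling
    each segment and building the point Voronoi diagram of the samples."""
    points, owner = [], []
    for seg_id, ((x0, y0), (x1, y1)) in enumerate(segments):
        t = np.linspace(0.0, 1.0, samples_per_segment)
        points.append(np.c_[x0 + t * (x1 - x0), y0 + t * (y1 - y0)])
        owner.extend([seg_id] * samples_per_segment)
    return Voronoi(np.vstack(points)), np.array(owner)

# Made-up layout edges (pairs of endpoints), e.g. from design polygons.
segments = [((0, 0), (4, 0)), ((0, 2), (4, 2)), ((5, -1), (5, 3))]
vor, owner = segment_voronoi_approx(segments)
print(f"{len(vor.vertices)} Voronoi vertices from {len(owner)} sample points")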
Optimal filter parameters for low SNR seismograms as a function of station and event location
NASA Astrophysics Data System (ADS)
Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.
1999-06-01
Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few usable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method to provide optimal filters for low SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δf constant (constant Q). The SNR is calculated from the pre-event noise window and the signal window. The band-pass signals with high SNR are used to indicate the cutoff limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low SNR events. The method provides an optimum filter which can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
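A simplified version of the band-wise SNR screening can be sketched as follows, using octave (constant-Q) bands built from SciPy Butterworth band-pass filters; the corner frequencies, window positions, and SNR threshold are assumed values, and this is not the authors' code.

import numpy as np
from scipy.signal import butter, sosfiltfilt

def optimal_band_limits(trace, fs, noise_slice, signal_slice,
                        f_min=0.5, f_max=16.0, snr_threshold=2.0):
    """Split [f_min, f_max] into octave (constant-Q) bands, compute the SNR of
    each band from a pre-event noise window and a signal window, and return the
    lowest and highest corners of the bands that pass the SNR threshold."""
    passing = []
    f_lo = f_min
    while f_lo * 2.0 <= f_max:
        f_hi = f_lo * 2.0
        sos = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, trace)
        snr = np.std(band[signal_slice]) / (np.std(band[noise_slice]) + 1e-12)
        if snr >= snr_threshold:
            passing.append((f_lo, f_hi))
        f_lo = f_hi
    if not passing:
        return None
    return passing[0][0], passing[-1][1]

# Synthetic seismogram: noise everywhere plus a 4 Hz arrival in the signal window.
fs = 100.0
t = np.arange(0, 120, 1 / fs)
trace = np.random.default_rng(0).normal(0, 1, t.size)
arrival = (t > 60) & (t < 70)
trace[arrival] += 8 * np.sin(2 * np.pi * 4 * t[arrival])
print(optimal_band_limits(trace, fs, noise_slice=slice(0, 3000),
                          signal_slice=slice(6000, 7000)))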
NASA Technical Reports Server (NTRS)
Davis, Robert P.; Underwood, Ian M.
1987-01-01
The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.
Automatically operable self-leveling load table
NASA Technical Reports Server (NTRS)
Burch, J. L. (Inventor)
1974-01-01
A self-leveling load table is described which is automatically maintained level by selectively opening and closing solenoid valves that admit air to and remove air from chambers under the table. The table is floated in a fluid by nine air chambers beneath the table top. These chambers are open at the bottom, and four oppositely located chambers are used for leveling the table by increasing or decreasing their air supply through a flexible hose. Air-bearing pendulums selectively energize solenoid valves that either apply pressurized air to a chamber or evacuate air from it by means of a vacuum source.
Automatic measurement of images on astrometric plates
NASA Astrophysics Data System (ADS)
Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.
1994-04-01
We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: determination of the scale and tilt between the charge-coupled device (CCD) and microscope coordinate systems, and estimation of the signal-to-noise ratio in each field; image identification and improvement of its position and size; final image centering; and image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization, and selection. Problems related to faint images and crowded fields will be approached with special techniques (morphological filters, histogram properties, and fitting models).