Sample records for optimal time window

  1. Optimization of ramp area aircraft push back time windows in the presence of uncertainty

    NASA Astrophysics Data System (ADS)

    Coupe, William Jeremy

    It is well known that airport surface traffic congestion at major airports is responsible for increased taxi-out times, fuel burn, and excess emissions, and there is potential to mitigate these negative consequences through optimizing airport surface traffic operations. Due to a highly congested voice communication channel between pilots and air traffic controllers and a data communication channel that is used only for limited functions, one of the most viable near-term strategies for improvement of the surface traffic is issuing a push back advisory to each departing aircraft. This dissertation focuses on the optimization of a push back time window for each departing aircraft. The optimization takes into account both spatial and temporal uncertainties of ramp area aircraft trajectories. The uncertainties are described by a stochastic kinematic model of aircraft trajectories, which is used to infer distributions of combinations of push back times that lead to conflict among trajectories from different gates. The model is validated and the distributions are included in the push back time window optimization. Under the assumption of a fixed taxiway spot schedule, the computed push back time windows can be integrated with a higher level taxiway scheduler to optimize the flow of traffic from the gate to the departure runway queue. To enable real-time decision making, the computational time of the push back time window optimization is critical and is analyzed throughout.

  2. A multimodal logistics service network design with time windows and environmental concerns

    PubMed Central

    Zhang, Dezhi; He, Runzhong; Wang, Zhongwei

    2017-01-01

    The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to solve the abovementioned optimization model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained. PMID:28934272

  3. A multimodal logistics service network design with time windows and environmental concerns.

    PubMed

    Zhang, Dezhi; He, Runzhong; Li, Shuangyan; Wang, Zhongwei

    2017-01-01

    The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to solve the abovementioned optimization model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained.

  4. Optimal Window and Lattice in Gabor Transform. Application to Audio Analysis.

    PubMed

    Lachambre, Helene; Ricaud, Benjamin; Stempfel, Guillaume; Torrésani, Bruno; Wiesmeyr, Christoph; Onchis-Moaca, Darian

    2015-01-01

    This article deals with the use of optimal lattice and optimal window in Discrete Gabor Transform computation. In the case of a generalized Gaussian window, extending earlier contributions, we introduce an additional local window adaptation technique for non-stationary signals. We illustrate our approach and the earlier one by addressing three time-frequency analysis problems to show the improvements achieved by the use of optimal lattice and window: close frequencies distinction, frequency estimation and SNR estimation. The results are presented, when possible, with real world audio signals.
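
    As a rough illustration of the time-frequency trade-off that motivates window optimization in the Discrete Gabor Transform, the sketch below computes STFTs of a two-tone test signal with Gaussian windows of several widths and reports a crude energy-concentration score. The test signal, window lengths, and scoring rule are illustrative assumptions, not parameters from the article.

```python
# Sketch: effect of Gaussian window width on a discrete Gabor/STFT analysis.
# The two-tone test signal, the window lengths, and the concentration score
# are illustrative assumptions, not values from the cited article.
import numpy as np
from scipy.signal import stft
from scipy.signal.windows import gaussian

fs = 8000.0
t = np.arange(0, 1.0, 1 / fs)
# Two close tones, the "close frequencies distinction" case mentioned above.
x = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 466 * t)

for std in (32, 128, 512):                     # Gaussian width in samples
    win = gaussian(2048, std)
    f, frames, Z = stft(x, fs=fs, window=win, nperseg=2048, noverlap=1536)
    S = np.abs(Z) ** 2
    top = np.sort(S.ravel())[-50:].sum() / S.sum()   # crude concentration score
    print(f"window std = {std:3d} samples -> top-50-bin energy share: {top:.3f}")
```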

  5. A general method to determine sampling windows for nonlinear mixed effects models with an application to population pharmacokinetic studies.

    PubMed

    Foo, Lee Kien; McGree, James; Duffull, Stephen

    2012-01-01

    Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.

  6. Single-agent parallel window search

    NASA Technical Reports Server (NTRS)

    Powley, Curt; Korf, Richard E.

    1991-01-01

    Parallel window search is applied to single-agent problems by having different processes simultaneously perform iterations of Iterative-Deepening-A* (IDA*) on the same problem but with different cost thresholds. This approach is limited by the time to perform the goal iteration. To overcome this disadvantage, the authors consider node ordering. They discuss how global node ordering by minimum h among nodes with equal f = g + h values can reduce the time complexity of serial IDA* by reducing the time to perform the iterations prior to the goal iteration. Finally, the two ideas of parallel window search and node ordering are combined to eliminate the weaknesses of each approach while retaining the strengths. The resulting approach, called simply parallel window search, can be used to find a near-optimal solution quickly, improve the solution until it is optimal, and then finally guarantee optimality, depending on the amount of time available.
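
    A minimal sketch of the serial IDA* loop that parallel window search distributes is given below: each pass of the outer loop corresponds to one cost threshold (one "window"), and the parallel variant hands different thresholds to different processes. The toy grid domain and unit step costs are assumptions for illustration, not the authors' test problems.

```python
# Sketch of Iterative-Deepening-A* (IDA*) on a toy 4-connected grid.
# Parallel window search, as described above, would hand different cost
# thresholds of this loop to different processes; the grid domain is an
# illustrative assumption, not the authors' test domain.
def ida_star(start, goal, blocked, size):
    def h(p):                                  # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    def search(path, g, threshold):
        node = path[-1]
        f = g + h(node)
        if f > threshold:
            return f                           # smallest f that exceeded the window
        if node == goal:
            return "FOUND"
        minimum = float("inf")
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dx, node[1] + dy)
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in blocked and nxt not in path):
                path.append(nxt)
                result = search(path, g + 1, threshold)
                if result == "FOUND":
                    return "FOUND"
                minimum = min(minimum, result)
                path.pop()
        return minimum

    threshold = h(start)
    path = [start]
    while True:                                # each pass = one IDA* iteration
        result = search(path, 0, threshold)
        if result == "FOUND":
            return path
        if result == float("inf"):
            return None
        threshold = result                     # next cost threshold ("window")

print(ida_star((0, 0), (4, 4), blocked={(2, 2), (2, 3)}, size=5))
```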

  7. Computing an optimal time window of audiovisual integration in focused attention tasks: illustrated by studies on effect of age and prior knowledge.

    PubMed

    Colonius, Hans; Diederich, Adele

    2011-07-01

    The concept of a "time window of integration" holds that information from different sensory modalities must not be perceived too far apart in time in order to be integrated into a multisensory perceptual event. Empirical estimates of window width differ widely, however, ranging from 40 to 600 ms depending on context and experimental paradigm. Searching for a theoretical derivation of window width, Colonius and Diederich (Front Integr Neurosci 2010) developed a decision-theoretic framework using a decision rule that is based on the prior probability of a common source, the likelihood of temporal disparities between the unimodal signals, and the payoff for making right or wrong decisions. Here, this framework is extended to the focused attention task where subjects are asked to respond to signals from a target modality only. Invoking the framework of the time-window-of-integration (TWIN) model, an explicit expression for optimal window width is obtained. The approach is probed on two published focused attention studies. The first is a saccadic reaction time study assessing the efficiency with which multisensory integration varies as a function of aging. Although the window widths for young and older adults differ by nearly 200 ms, presumably due to their different peripheral processing speeds, neither of them deviates significantly from the optimal values. In the second study, head saccadic reaction times to a perfectly aligned audiovisual stimulus pair had been shown to depend on the prior probability of spatial alignment. Intriguingly, they reflected the magnitude of the time-window widths predicted by our decision-theoretic framework, i.e., a larger time window is associated with a higher prior probability.

  8. On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2011-01-01

    This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Pade approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to the standard model-reference adaptive control, the σ-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin since it is dependent on the length of a time window and parameters which vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, then the closed-loop input delay system tends to an LTI system. A lower bound of the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.

  9. Fault Diagnosis of Induction Machines in a Transient Regime Using Current Sensors with an Optimized Slepian Window

    PubMed Central

    Burriel-Valencia, Jordi; Martinez-Roman, Javier; Sapena-Bano, Angel

    2018-01-01

    The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current’s spectrogram with a significant reduction of the required computational resources. PMID:29316650

  10. Fault Diagnosis of Induction Machines in a Transient Regime Using Current Sensors with an Optimized Slepian Window.

    PubMed

    Burriel-Valencia, Jordi; Puche-Panadero, Ruben; Martinez-Roman, Javier; Sapena-Bano, Angel; Pineda-Sanchez, Manuel

    2018-01-06

    The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current's spectrogram with a significant reduction of the required computational resources.
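
    A minimal sketch of the core signal-processing step, an STFT/spectrogram computed with a DPSS (Slepian) analysis window via SciPy, is shown below. The chirp test signal, the window length, and the time-halfbandwidth product NW are illustrative assumptions; no fault-frequency analysis from the paper is reproduced.

```python
# Sketch: spectrogram of a non-stationary current-like signal using a Slepian
# (DPSS) analysis window. The chirp test signal and the time-halfbandwidth
# product NW are illustrative assumptions, not values from the cited paper.
import numpy as np
from scipy.signal import spectrogram, chirp
from scipy.signal.windows import dpss

fs = 5000.0
t = np.arange(0, 2.0, 1 / fs)
x = chirp(t, f0=5, f1=50, t1=2.0, method="linear")   # stand-in for a start-up transient

nperseg = 1024
slepian_win = dpss(nperseg, NW=3)                    # single DPSS taper
f, frames, Sxx = spectrogram(x, fs=fs, window=slepian_win,
                             nperseg=nperseg, noverlap=nperseg // 2)
print(Sxx.shape)   # (frequency bins, time frames)
```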

  11. Due-Window Assignment Scheduling with Variable Job Processing Times

    PubMed Central

    Wu, Yu-Bin

    2015-01-01

    We consider a common due-window assignment scheduling problem with jobs having variable processing times on a single machine, where the processing time of a job is a function of its position in a sequence (i.e., a learning effect) or of its starting time (i.e., a deteriorating effect). The problem is to determine the optimal due-windows and the processing sequence simultaneously so as to minimize a cost function that includes earliness, tardiness, the window location, the window size, and the weighted number of tardy jobs. We prove that the problem can be solved in polynomial time. PMID:25918745

  12. Optimal pulse design for communication-oriented slow-light pulse detection.

    PubMed

    Stenner, Michael D; Neifeld, Mark A

    2008-01-21

    We present techniques for designing pulses for linear slow-light delay systems which are optimal in the sense that they maximize the signal-to-noise ratio (SNR) and signal-to-noise-plus-interference ratio (SNIR) of the detected pulse energy. Given a communication model in which input pulses are created in a finite temporal window and output pulse energy is measured in a temporally-offset output window, the SNIR-optimal pulses achieve typical improvements of 10 dB compared to traditional pulse shapes for a given output window offset. Alternatively, for fixed SNR or SNIR, the window offset (detection delay) can be increased by 0.3 times the window width. This approach also invites a communication-based model for delay and signal fidelity.

  13. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-09-02

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberrations varying with look angle. In this paper, deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz mode, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can only correct dominant Lukosz modes and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of conformal window with scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10⁻⁵ in optimized correction and is 1.427 × 10⁻⁵ in un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than traditional stochastic parallel gradient descent (SPGD) method.
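
    The sketch below illustrates the kind of metric function described above: the fraction of an image's spectral density lying at low spatial frequencies. The cutoff radius, the synthetic test image, and the crude blur are assumptions for illustration only.

```python
# Sketch of a sharpness metric built from the low-spatial-frequency content of
# an image's power spectrum, in the spirit of the metric function described
# above. The cutoff radius and the synthetic test image are assumptions.
import numpy as np

def low_frequency_metric(image, cutoff=0.1):
    """Spectral density inside a circular low-frequency region, normalized
    by the total spectral energy."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ny, nx = image.shape
    y, x = np.ogrid[-ny // 2:ny - ny // 2, -nx // 2:nx - nx // 2]
    radius = np.sqrt((x / nx) ** 2 + (y / ny) ** 2)   # normalized spatial frequency
    mask = radius <= cutoff
    return spectrum[mask].sum() / spectrum.sum()

rng = np.random.default_rng(0)
sharp = rng.random((128, 128))
blurred = 0.5 * (np.roll(sharp, 1, axis=0) + sharp)   # crude blur removes high frequencies
print(low_frequency_metric(sharp), low_frequency_metric(blurred))
```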

  14. Single-machine common/slack due window assignment problems with linear decreasing processing times

    NASA Astrophysics Data System (ADS)

    Zhang, Xingong; Lin, Win-Chin; Wu, Wen-Hsiang; Wu, Chin-Chia

    2017-08-01

    This paper studies linear non-increasing processing times and the common/slack due window assignment problems on a single machine, where the actual processing time of a job is a linear non-increasing function of its starting time. The aim is to minimize the sum of the earliness cost, tardiness cost, due window location and due window size. Some optimality results are discussed for the common/slack due window assignment problems and two O(n log n) time algorithms are presented to solve the two problems. Finally, two examples are provided to illustrate the correctness of the corresponding algorithms.

  15. Efficient constraint handling in electromagnetism-like algorithm for traveling salesman problem with time windows.

    PubMed

    Yurtkuran, Alkın; Emel, Erdal

    2014-01-01

    The traveling salesman problem with time windows (TSPTW) is a variant of the traveling salesman problem in which each customer should be visited within a given time window. In this paper, we propose an electromagnetism-like algorithm (EMA) that uses a new constraint handling technique to minimize the travel cost in TSPTW problems. The EMA utilizes the attraction-repulsion mechanism between charged particles in a multidimensional space for global optimization. This paper investigates the problem-specific constraint handling capability of the EMA framework using a new variable bounding strategy, in which real-coded particles' boundary constraints, associated with the corresponding time windows of customers, are introduced and combined with the penalty approach to eliminate infeasibilities regarding time window violations. The performance of the proposed algorithm and the effectiveness of the constraint handling technique have been studied extensively, comparing them to those of state-of-the-art metaheuristics using several sets of benchmark problems reported in the literature. The results of the numerical experiments show that the EMA generates feasible and near-optimal results within shorter computational times compared to the test algorithms.

  16. Efficient Constraint Handling in Electromagnetism-Like Algorithm for Traveling Salesman Problem with Time Windows

    PubMed Central

    Yurtkuran, Alkın

    2014-01-01

    The traveling salesman problem with time windows (TSPTW) is a variant of the traveling salesman problem in which each customer should be visited within a given time window. In this paper, we propose an electromagnetism-like algorithm (EMA) that uses a new constraint handling technique to minimize the travel cost in TSPTW problems. The EMA utilizes the attraction-repulsion mechanism between charged particles in a multidimensional space for global optimization. This paper investigates the problem-specific constraint handling capability of the EMA framework using a new variable bounding strategy, in which real-coded particles' boundary constraints, associated with the corresponding time windows of customers, are introduced and combined with the penalty approach to eliminate infeasibilities regarding time window violations. The performance of the proposed algorithm and the effectiveness of the constraint handling technique have been studied extensively, comparing them to those of state-of-the-art metaheuristics using several sets of benchmark problems reported in the literature. The results of the numerical experiments show that the EMA generates feasible and near-optimal results within shorter computational times compared to the test algorithms. PMID:24723834
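
    A minimal sketch of a penalty-style evaluation for time-window violations is given below: a candidate tour is scored by its travel cost plus a penalty proportional to the total lateness at each customer. The toy instance and the penalty weight are assumptions; the EMA search itself is not shown.

```python
# Sketch of a penalty-based evaluation of a TSPTW tour: total travel cost plus
# a penalty proportional to time-window violations, the kind of infeasibility
# handling described above. The toy instance and penalty weight are assumptions.
def evaluate_tour(tour, travel, windows, penalty_weight=100.0):
    """tour: sequence of indices starting and ending at the depot (index 0);
    travel[i][j]: travel time between i and j;
    windows[i]: (earliest, latest) service time for location i."""
    time_now, cost, penalty = 0.0, 0.0, 0.0
    for prev, nxt in zip(tour, tour[1:]):
        time_now += travel[prev][nxt]
        cost += travel[prev][nxt]
        earliest, latest = windows[nxt]
        if time_now < earliest:
            time_now = earliest              # waiting is allowed, not penalized
        elif time_now > latest:
            penalty += time_now - latest     # late arrival violates the window
    return cost + penalty_weight * penalty

travel = [[0, 4, 6, 3], [4, 0, 5, 7], [6, 5, 0, 2], [3, 7, 2, 0]]
windows = [(0, 100), (3, 6), (8, 12), (2, 20)]
print(evaluate_tour([0, 1, 2, 3, 0], travel, windows))
```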

  17. Time-Frequency Distribution Analyses of Ku-Band Radar Doppler Echo Signals

    NASA Astrophysics Data System (ADS)

    Bujaković, Dimitrije; Andrić, Milenko; Bondžulić, Boban; Mitrović, Srđan; Simić, Slobodan

    2015-03-01

    Real radar echo signals of a pedestrian, a vehicle and a group of helicopters are analyzed in order to maximize signal energy around the central Doppler frequency in the time-frequency plane. An optimization preserving this concentration is suggested, based on three well-known concentration measures. Various window functions and time-frequency distributions were the optimization inputs. Experiments conducted on one analytic and three real signals show that energy concentration depends significantly on the time-frequency distribution and window function used, for all three criteria.
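
    As a small illustration of comparing window functions by energy concentration, the sketch below scores spectrograms of a Doppler-like test tone with an inverse-participation-ratio style measure (sum of squared normalized cell energies). The test signal, the candidate windows, and this particular measure are illustrative assumptions, not the three criteria used in the study.

```python
# Sketch: comparing the energy concentration of spectrograms computed with
# different window functions, using a simple inverse-participation-ratio
# style measure. The test signal is an illustrative stand-in for a radar
# Doppler echo, not data from the cited study.
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.cos(2 * np.pi * (50 * t + 40 * t ** 2))    # slowly modulated Doppler-like tone

def concentration(S):
    """Higher value = energy packed into fewer time-frequency cells."""
    S = S / S.sum()
    return (S ** 2).sum()

for win in ("boxcar", "hann", "blackman", ("gaussian", 64)):
    f, frames, S = spectrogram(x, fs=fs, window=win, nperseg=256, noverlap=192)
    print(win, round(concentration(S), 6))
```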

  18. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics

    PubMed Central

    Dong, Bing; Li, Yan; Han, Xin-li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberrations varying with look angle. In this paper, deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz mode, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can only correct dominant Lukosz modes and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of conformal window with scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10⁻⁵ in optimized correction and is 1.427 × 10⁻⁵ in un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161

  19. Determination of injection molding process windows for optical lenses using response surface methodology.

    PubMed

    Tsai, Kuo-Ming; Wang, He-Yi

    2014-08-20

    This study focuses on injection molding process window determination for obtaining optimal imaging optical properties, astigmatism, coma, and spherical aberration using plastic lenses. The Taguchi experimental method was first used to identify the optimized combination of parameters and significant factors affecting the imaging optical properties of the lens. Full factorial experiments were then implemented based on the significant factors to build the response surface models. The injection molding process windows for lenses with optimized optical properties were determined based on the surface models, and confirmation experiments were performed to verify their validity. The results indicated that the significant factors affecting the optical properties of lenses are mold temperature, melt temperature, and cooling time. According to experimental data for the significant factors, the oblique ovals for different optical properties on the injection molding process windows based on melt temperature and cooling time can be obtained using the curve fitting approach. The confirmation experiments revealed that the average errors for astigmatism, coma, and spherical aberration are 3.44%, 5.62%, and 5.69%, respectively. The results indicated that the process windows proposed are highly reliable.

  20. Optimization of ecosystem model parameters with different temporal variabilities using tower flux data and an ensemble Kalman filter

    NASA Astrophysics Data System (ADS)

    He, L.; Chen, J. M.; Liu, J.; Mo, G.; Zhen, T.; Chen, B.; Wang, R.; Arain, M.

    2013-12-01

    Terrestrial ecosystem models have been widely used to simulate carbon, water and energy fluxes and climate-ecosystem interactions. In these models, some vegetation and soil parameters are determined based on limited studies from the literature without consideration of their seasonal variations. Data assimilation (DA) provides an effective way to optimize these parameters at different time scales. In this study, an ensemble Kalman filter (EnKF) is developed and applied to optimize two key parameters of an ecosystem model, namely the Boreal Ecosystem Productivity Simulator (BEPS): (1) the maximum photosynthetic carboxylation rate (Vcmax) at 25 °C, and (2) the soil water stress factor (fw) for the stomatal conductance formulation. These parameters are optimized through assimilating observations of gross primary productivity (GPP) and latent heat (LE) fluxes measured in a 74-year-old pine forest, which is part of the Turkey Point Flux Station's age-sequence sites. Vcmax is related to leaf nitrogen concentration and varies slowly over the season and from year to year. In contrast, fw varies rapidly in response to soil moisture dynamics in the root-zone. Earlier studies suggested that DA of vegetation parameters at daily time steps leads to Vcmax values that are unrealistic. To overcome the problem, we developed a three-step scheme to optimize Vcmax and fw. First, the EnKF is applied daily to obtain precursor estimates of Vcmax and fw. Then Vcmax is optimized at different time scales assuming fw is unchanged from the first step. The best temporal period or window size is then determined by analyzing the magnitude of the minimized cost function, and the coefficient of determination (R2) and root-mean-square error (RMSE) of GPP and LE between simulation and observation. Finally, the daily fw value is optimized for rain-free days corresponding to the Vcmax curve from the best window size. The optimized fw is then used to model its relationship with soil moisture. We found that the optimized fw is best correlated linearly to soil water content at 5 to 10 cm depth. We also found that both the temporal scale or window size and the a priori uncertainty of Vcmax (given as its standard deviation) are important in determining the seasonal trajectory of Vcmax. During the leaf expansion stage, an appropriate window size leads to a reasonable estimate of Vcmax. In the summer, the fluctuation of optimized Vcmax is mainly caused by the uncertainties in Vcmax but not the window size. Our study suggests that a smooth Vcmax curve optimized from an optimal time window size is close to reality, though the RMSE of GPP at this window is not the minimum. It also suggests that for the accurate optimization of Vcmax, it is necessary to set appropriate levels of uncertainty of Vcmax in the spring and summer because the rate of leaf nitrogen concentration change is different over the season. Parameter optimizations for more sites and multiple years are in progress.
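
    The sketch below shows a single ensemble Kalman filter update of a scalar parameter from one flux observation, the basic operation behind the Vcmax/fw optimization described above. The linear toy forward model, ensemble size, and error statistics are assumptions and stand in for the BEPS model and tower data.

```python
# Minimal sketch of one ensemble Kalman filter (EnKF) update of a scalar model
# parameter from a flux observation, in the spirit of the Vcmax/fw optimization
# described above. The toy forward model, ensemble size, and error statistics
# are assumptions, not the BEPS model.
import numpy as np

rng = np.random.default_rng(42)

def forward_model(vcmax):
    """Hypothetical stand-in for the ecosystem model: maps the parameter to a
    predicted GPP observation (arbitrary units)."""
    return 0.12 * vcmax + 1.5

n_ens = 100
vcmax_ens = rng.normal(60.0, 8.0, n_ens)         # prior ensemble of the parameter
obs, obs_err = 9.9, 0.3                          # observed GPP and its std. dev.

pred = forward_model(vcmax_ens)                  # predicted observations
# Kalman gain from ensemble (co)variances.
gain = np.cov(vcmax_ens, pred)[0, 1] / (np.var(pred, ddof=1) + obs_err ** 2)
# Perturbed-observation EnKF update.
vcmax_post = vcmax_ens + gain * (obs + rng.normal(0, obs_err, n_ens) - pred)

print(f"prior mean {vcmax_ens.mean():.1f}  posterior mean {vcmax_post.mean():.1f}")
```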

  1. Determining the optimal window length for pattern recognition-based myoelectric control: balancing the competing effects of classification error and controller delay.

    PubMed

    Smith, Lauren H; Hargrove, Levi J; Lock, Blair A; Kuiken, Todd A

    2011-04-01

    Pattern recognition-based control of myoelectric prostheses has shown great promise in research environments, but has not been optimized for use in a clinical setting. To explore the relationship between classification error, controller delay, and real-time controllability, 13 able-bodied subjects were trained to operate a virtual upper-limb prosthesis using pattern recognition of electromyogram (EMG) signals. Classification error and controller delay were varied by training different classifiers with a variety of analysis window lengths ranging from 50 to 550 ms and either two or four EMG input channels. Offline analysis showed that classification error decreased with longer window lengths (p < 0.01). Real-time controllability was evaluated with the target achievement control (TAC) test, which prompted users to maneuver the virtual prosthesis into various target postures. The results indicated that user performance improved with lower classification error (p < 0.01) and was reduced with longer controller delay (p < 0.01), as determined by the window length. Therefore, both of these effects should be considered when choosing a window length; it may be beneficial to increase the window length if this results in a reduced classification error, despite the corresponding increase in controller delay. For the system employed in this study, the optimal window length was found to be between 150 and 250 ms, which is within acceptable controller delays for conventional multistate amplitude controllers.
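
    The sketch below illustrates the window-length trade-off on a synthetic signal: time-domain features (mean absolute value and zero crossings) computed from longer analysis windows are steadier, but each classifier decision lags by roughly the window length. The synthetic "EMG" signal and the candidate window lengths are illustrative assumptions, not the study's recordings.

```python
# Sketch: extracting two standard time-domain EMG features over analysis
# windows of different lengths, to illustrate the window-length /
# controller-delay trade-off discussed above. The synthetic signal and the
# candidate window lengths are illustrative assumptions.
import numpy as np

fs = 1000                                    # sampling rate, Hz
rng = np.random.default_rng(1)
n_samples = 5 * fs
envelope = 1 + 0.5 * np.sin(2 * np.pi * 0.5 * np.arange(n_samples) / fs)
emg = rng.normal(0, 1.0, n_samples) * envelope

def window_features(signal, window_ms, fs):
    n = int(fs * window_ms / 1000)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        seg = signal[start:start + n]
        mav = np.mean(np.abs(seg))                      # mean absolute value
        zc = np.sum(seg[:-1] * seg[1:] < 0)             # zero crossings
        feats.append((mav, zc))
    return np.array(feats)

for window_ms in (50, 150, 250, 550):
    feats = window_features(emg, window_ms, fs)
    # Longer windows give steadier features (lower MAV variance), but each
    # decision lags by roughly the window length plus processing time.
    print(f"{window_ms:3d} ms window  MAV std {feats[:, 0].std():.3f}")
```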

  2. Seismic signal time-frequency analysis based on multi-directional window using greedy strategy

    NASA Astrophysics Data System (ADS)

    Chen, Yingpin; Peng, Zhenming; Cheng, Zhuyuan; Tian, Lin

    2017-08-01

    The Wigner-Ville distribution (WVD) is an important time-frequency analysis technique with a highly concentrated energy distribution, used in seismic signal processing. However, it suffers from many cross terms. To suppress the cross terms of the WVD while keeping the concentration of its high energy distribution, an adaptive multi-directional filtering window in the ambiguity domain is proposed. Starting from the relationship between the Cohen class distribution and the Gabor transform, and combining the greedy strategy with the rotational invariance property of the fractional Fourier transform, the multi-directional window extends the one-dimensional, one-directional optimal window function of the optimal fractional Gabor transform (OFrGT) to a two-dimensional, multi-directional window in the ambiguity domain. In this way, the multi-directional window matches the main auto terms of the WVD more precisely. Using the greedy strategy, the proposed window takes into account the optimal and other suboptimal directions, which also solves the problem of the OFrGT, called the local concentration phenomenon, when encountering a multi-component signal. Experiments on different types of both signal models and real seismic signals reveal that the proposed window can overcome the drawbacks of the WVD and the OFrGT mentioned above. Finally, the proposed method is applied to the spectral decomposition of a seismic signal. The results show that the proposed method can explore the spatial distribution of a reservoir more precisely.

  3. Using applet-servlet communication for optimizing window, level and crop for DICOM to JPEG conversion.

    PubMed

    Kamauu, Aaron W C; DuVall, Scott L; Wiggins, Richard H; Avrin, David E

    2008-09-01

    In the creation of interesting radiological cases in a digital teaching file, it is necessary to adjust the window and level settings of an image to effectively display the educational focus. The web-based applet described in this paper presents an effective solution for real-time window and level adjustments without leaving the picture archiving and communications system workstation. Optimized images are created, as user-defined parameters are passed between the applet and a servlet on the Health Insurance Portability and Accountability Act-compliant teaching file server.
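
    A minimal sketch of the underlying window/level transform, the adjustment the applet passes to the servlet before DICOM-to-JPEG conversion, is given below. The sample pixel values and window/level pair are assumptions; none of the applet/servlet plumbing is shown.

```python
# Minimal sketch of the window/level mapping applied to raw pixel values when
# converting DICOM data to an 8-bit JPEG, the adjustment the applet described
# above lets users tune. Sample values are assumptions; no PACS or servlet
# plumbing is shown.
import numpy as np

def apply_window_level(pixels, window, level):
    """Map raw pixel values into 0-255 using a linear window/level transform."""
    low = level - window / 2.0
    scaled = (pixels.astype(np.float64) - low) / window * 255.0
    return np.clip(scaled, 0, 255).astype(np.uint8)

# Hypothetical CT-like values with a soft-tissue window (window=400, level=40).
raw = np.array([[-1000, -50, 40], [80, 300, 2000]], dtype=np.int16)
print(apply_window_level(raw, window=400, level=40))
```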

  4. Occupant-responsive optimal control of smart facade systems

    NASA Astrophysics Data System (ADS)

    Park, Cheol-Soo

    Windows provide occupants with daylight, direct sunlight, visual contact with the outside and a feeling of openness. Windows enable the use of daylighting and offer occupants an outside view. Glazing may also cause a number of problems: undesired heat loss in winter and heat gain in summer. An over-lit window can cause glare, which is another major complaint by occupants. Furthermore, cold or hot window surfaces induce asymmetric thermal radiation which can result in thermal discomfort. To reduce the potential problems of window systems, double skin facades and airflow window systems were introduced in the 1970s. They typically contain interstitial louvers and ventilation openings. The current problem with double skin facades and airflow windows is that their operation requires adequate dynamic control to reach their expected performance. Many studies have recognized that only optimal control enables these systems to truly act as active energy savers and indoor environment controllers. However, an adequate solution for this dynamic optimization problem has thus far not been developed. The primary objective of this study is to develop occupant-responsive optimal control of smart facade systems. The control could be implemented as a smart controller that operates the motorized Venetian blind system and the opening ratio of ventilation openings. The objective of the control is to combine the benefits of large windows with low energy demands for heating and cooling, while keeping visual well-being and thermal comfort at an optimal level. The control uses a simulation model with an embedded optimization routine that allows occupant interaction via the Web. An occupant can access the smart controller from a standard browser and choose a pre-defined mode (energy saving mode, visual comfort mode, thermal comfort mode, default mode, nighttime mode) or set a preferred mode (user-override mode) by moving preference sliders on the screen. The most prominent feature of these systems is the capability of dynamically reacting to the environmental input data through real-time optimization. The proposed occupant-responsive optimal control of smart facade systems could provide a breakthrough in this under-developed area and lead to a renewed interest in smart facade systems.

  5. Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows

    NASA Astrophysics Data System (ADS)

    Jittamai, Phongchai

    This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products into an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling areas, are incorporated in this study. The distribution of multiple products into an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and mathematically formulated. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems, and also providing a solution methodology to compute an input schedule that yields minimum total time violation from due delivery time-windows. The problem is proved to be NP-complete. The heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm is implemented in no longer than O(T·E) time. This dissertation also extends the study to examine some operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced. This algorithm can also be implemented in no longer than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that reversed-flow algorithms provide good solutions in comparison with the optimal solutions. Only 25% of the problems tested were more than 30% greater than the optimal values, and approximately 40% of the tested problems were solved optimally by the algorithms.

  6. Millisecond timing on PCs and Macs.

    PubMed

    MacInnes, W J; Taylor, T L

    2001-05-01

    A real-time, object-oriented solution for displaying stimuli on Windows 95/98, MacOS and Linux platforms is presented. The program, written in C++, utilizes a special-purpose window class (GLWindow), OpenGL, and 32-bit graphics acceleration; it avoids display timing uncertainty by substituting the new window class for the default window code for each system. We report the outcome of tests for real-time capability across PC and Mac platforms running a variety of operating systems. The test program, which can be used as a shell for programming real-time experiments and testing specific processors, is available at http://www.cs.dal.ca/~macinnwj. We propose to provide researchers with a sense of the usefulness of our program, to highlight the ability of many multitasking environments to achieve real time, and to caution users about systems that may not achieve real time, even under optimal conditions.

  7. Fully automatic time-window selection using machine learning for global adjoint tomography

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.

    2017-12-01

    Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected everyday around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires both human input and human knowledge or experience, and thus is not deemed to be fully automatic. The goal of intelligent window selection is to automatically select windows based on a learnt engine that is built upon a huge number of existing windows generated through the adjoint tomography project. We have formulated the automatic window selection problem as a classification problem. All possible misfit calculation windows are classified as either usable or unusable. Given a large number of windows with a known selection mode (select or not select), we train a neural network to predict the selection mode of an arbitrary input window. Currently, the five features we extract from the windows are its cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value. More features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solving a non-linear optimization problem. We use backward propagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors and use the mini-batch stochastic gradient method to iteratively optimize the MPNN. Numerical tests show that with a careful selection of the training data and a sufficient amount of training data, we are able to train a robust neural network that is capable of detecting the waveforms in arbitrary earthquake data with negligible detection error compared to existing selection methods (e.g. FLEXWIN). We will introduce in detail the mathematical formulation of the window-selection-oriented MPNN and show very encouraging results when applying the new algorithm to real earthquake data.
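
    The sketch below trains a small multilayer perceptron on the five window features named above (cross-correlation value, cross-correlation time lag, amplitude ratio, window length, minimum STA/LTA) to predict a select/reject label. The synthetic feature distributions, the labeling rule, and the network size are assumptions standing in for the project's real training windows.

```python
# Sketch: a small MLP classifier over the five window features listed above.
# The synthetic feature distributions, labeling rule, and network size are
# assumptions; they stand in for the real analyst/FLEXWIN-derived windows.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.uniform(0, 1, n),        # cross-correlation value
    rng.uniform(0, 10, n),       # cross-correlation time lag (s)
    rng.lognormal(0, 0.5, n),    # observed/synthetic amplitude ratio
    rng.uniform(20, 200, n),     # window length (s)
    rng.uniform(1, 10, n),       # minimum STA/LTA
])
# Hypothetical labeling rule standing in for analyst-approved windows.
y = ((X[:, 0] > 0.7) & (X[:, 1] < 3.0) & (np.abs(np.log(X[:, 2])) < 0.4)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16, 8),
                                    max_iter=1000, random_state=0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 3))
```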

  8. Localization of interictal epileptic spikes with MEG: optimization of an automated beamformer screening method (SAMepi) in a diverse epilepsy population

    PubMed Central

    Scott, Jonathan M.; Robinson, Stephen E.; Holroyd, Tom; Coppola, Richard; Sato, Susumu; Inati, Sara K.

    2016-01-01

    OBJECTIVE To describe and optimize an automated beamforming technique followed by identification of locations with excess kurtosis (g2) for efficient detection and localization of interictal spikes in medically refractory epilepsy patients. METHODS Synthetic Aperture Magnetometry with g2 averaged over a sliding time window (SAMepi) was performed in 7 focal epilepsy patients and 5 healthy volunteers. The effect of varied window lengths on detection of spiking activity was evaluated. RESULTS Sliding window lengths of 0.5–10 seconds performed similarly, with 0.5 and 1 second windows detecting spiking activity in one of the 3 virtual sensor locations with highest kurtosis. These locations were concordant with the region of eventual surgical resection in these 7 patients who remained seizure free at one year. Average g2 values increased with increasing sliding window length in all subjects. In healthy volunteers kurtosis values stabilized in datasets longer than two minutes. CONCLUSIONS SAMepi using g2 averaged over 1 second sliding time windows in datasets of at least 2 minutes duration reliably identified interictal spiking and the presumed seizure focus in these 7 patients. Screening the 5 locations with highest kurtosis values for spiking activity is an efficient and accurate technique for localizing interictal activity using MEG. SIGNIFICANCE SAMepi should be applied using the parameter values and procedure described for optimal detection and localization of interictal spikes. Use of this screening procedure could significantly improve the efficiency of MEG analysis if clinically validated. PMID:27760068
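
    A minimal sketch of the screening quantity, excess kurtosis (g2) computed over sliding 1-second windows of a virtual-sensor trace, is shown below. The synthetic spiky trace, sampling rate, and flagging threshold are illustrative assumptions; the SAM beamforming step itself is not shown.

```python
# Sketch: excess kurtosis (g2) over sliding time windows of a virtual-sensor
# trace, the screening quantity used above to flag spiking activity. The
# synthetic spiky signal and the 1-second window are illustrative assumptions.
import numpy as np
from scipy.stats import kurtosis

fs = 600                                        # MEG-like sampling rate, Hz
rng = np.random.default_rng(3)
trace = rng.normal(0, 1.0, 120 * fs)            # 2 minutes of background activity
trace[30 * fs::15 * fs] += 25.0                 # occasional interictal-like spikes

window = fs                                     # 1-second sliding window, 50% overlap
g2 = np.array([kurtosis(trace[i:i + window], fisher=True)
               for i in range(0, len(trace) - window, window // 2)])
print("max sliding-window g2:", round(g2.max(), 2))
print("windows flagged (g2 > 3):", int((g2 > 3).sum()))
```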

  9. Evaluation of optimized magnetic resonance perfusion imaging scanning time window after contrast agent injection for differentiating benign and malignant breast lesions

    PubMed Central

    Dong, Jie; Wang, Dawei; Ma, Zhenshen; Deng, Guodong; Wang, Lanhua; Zhang, Jiandong

    2017-01-01

    The aim of the study was to evaluate the 3.0 T magnetic resonance (MR) perfusion imaging scanning time window following contrast injection for differentiating benign and malignant breast lesions and to determine the optimum scanning time window for increased scanner usage efficiency and reduced diagnostic adverse risk factors. A total of 52 women with breast abnormalities were selected for conventional MR imaging and T1 dynamic-enhanced imaging. Quantitative parameters [volume transfer constant (Ktrans), rate constant (Kep) and extravascular extracellular volume fraction (Ve)] were calculated at phases 10, 20, 30, 40 and 50, which represented time windows at 5, 10, 15, 20 and 25 min, respectively, following injection of contrast agent. The association of the parameters at different phases with benign and malignant tumor diagnosis was analyzed. MR perfusion imaging was verified as an effective modality in the diagnosis of breast malignancies and the best scanning time window was identified: i) Values of Ktrans and Kep at all phases were statistically significant in differentiating benign and malignant tumors (P<0.05), while the value of Ve had statistical significance only at phase 10, but not at any other phases (P>0.05); ii) values of Ve in benign tumors increased with phase number, but showed no obvious changes at different phases in malignant tumors; iii) the optimum scanning time window of breast perfusion imaging with 3.0 T MR was between phases 10 and 30 (i.e., between 5 and 15 min after contrast agent injection). The variation trend of Ve values at different phases may serve as a diagnostic reference for differentiating benign and malignant breast abnormalities. The most efficient scanning time window was indicated to be 5 min after contrast injection, based on the observation that the Ve value only had statistical significance in diagnosis at phase 10. However, the optimal scanning time window is from 5 to 15 min following the injection of contrast agent, given that the variation trend of Ve can serve as a diagnostic reference. PMID:28450944

  10. Evaluation of optimized magnetic resonance perfusion imaging scanning time window after contrast agent injection for differentiating benign and malignant breast lesions.

    PubMed

    Dong, Jie; Wang, Dawei; Ma, Zhenshen; Deng, Guodong; Wang, Lanhua; Zhang, Jiandong

    2017-03-01

    The aim of the study was to evaluate the 3.0 T magnetic resonance (MR) perfusion imaging scanning time window following contrast injection for differentiating benign and malignant breast lesions and to determine the optimum scanning time window for increased scanner usage efficiency and reduced diagnostic adverse risk factors. A total of 52 women with breast abnormalities were selected for conventional MR imaging and T1 dynamic-enhanced imaging. Quantitative parameters [volume transfer constant (Ktrans), rate constant (Kep) and extravascular extracellular volume fraction (Ve)] were calculated at phases 10, 20, 30, 40 and 50, which represented time windows at 5, 10, 15, 20 and 25 min, respectively, following injection of contrast agent. The association of the parameters at different phases with benign and malignant tumor diagnosis was analyzed. MR perfusion imaging was verified as an effective modality in the diagnosis of breast malignancies and the best scanning time window was identified: i) Values of Ktrans and Kep at all phases were statistically significant in differentiating benign and malignant tumors (P<0.05), while the value of Ve had statistical significance only at phase 10, but not at any other phases (P>0.05); ii) values of Ve in benign tumors increased with phase number, but showed no obvious changes at different phases in malignant tumors; iii) the optimum scanning time window of breast perfusion imaging with 3.0 T MR was between phases 10 and 30 (i.e., between 5 and 15 min after contrast agent injection). The variation trend of Ve values at different phases may serve as a diagnostic reference for differentiating benign and malignant breast abnormalities. The most efficient scanning time window was indicated to be 5 min after contrast injection, based on the observation that the Ve value only had statistical significance in diagnosis at phase 10. However, the optimal scanning time window is from 5 to 15 min following the injection of contrast agent, given that the variation trend of Ve can serve as a diagnostic reference.

  11. Exclusive queueing model including the choice of service windows

    NASA Astrophysics Data System (ADS)

    Tanaka, Masahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro

    2018-01-01

    In a queueing system involving multiple service windows, choice behavior is a significant concern. This paper incorporates the choice of service windows into a queueing model with a floor represented by discrete cells. We contrived a logit-based choice algorithm for agents considering the numbers of agents and the distances to all service windows. Simulations were conducted with various parameters of agent choice preference for these two elements and for different floor configurations, including the floor length and the number of service windows. We investigated the model from the viewpoint of transit times and entrance block rates. The influences of the parameters on these factors were surveyed in detail and we determined that there are optimum floor lengths that minimize the transit times. In addition, we observed that the transit times were determined almost entirely by the entrance block rates. The results of the presented model are relevant to understanding queueing systems including the choice of service windows and can be employed to optimize facility design and floor management.
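
    A minimal sketch of the logit-based window choice rule is given below: each agent picks a service window with probability proportional to exp(utility), where the utility penalizes the number of agents already queued and the distance to the window. The preference weights and the toy instance are assumptions, not the paper's calibrated parameters.

```python
# Minimal sketch of a logit-based service-window choice rule: choice
# probabilities are a softmax over utilities that weight queue length and
# distance, as described above. The weights and the toy instance are assumptions.
import numpy as np

def choice_probabilities(queue_lengths, distances, beta_queue=0.8, beta_dist=0.3):
    utility = -beta_queue * np.asarray(queue_lengths) - beta_dist * np.asarray(distances)
    expu = np.exp(utility - utility.max())        # numerically stabilized softmax
    return expu / expu.sum()

# Three service windows: current queue sizes and walking distances (in cells).
p = choice_probabilities(queue_lengths=[4, 1, 2], distances=[2, 10, 5])
print(np.round(p, 3))                             # probability of choosing each window
```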

  12. Optimal ranking regime analysis of intra- to multidecadal U.S. climate variability. Part I: Temperature

    USDA-ARS?s Scientific Manuscript database

    The Optimal Ranking Regime (ORR) method was used to identify intra- to multi-decadal (IMD) time windows containing significant ranking sequences in U.S. climate division temperature data. The simplicity of the ORR procedure’s output – a time series’ most significant non-overlapping periods of high o...

  13. Low b-value diffusion-weighted cardiac magnetic resonance imaging: initial results in humans using an optimal time-window imaging approach.

    PubMed

    Rapacchi, Stanislas; Wen, Han; Viallon, Magalie; Grenier, Denis; Kellman, Peter; Croisille, Pierre; Pai, Vinay M

    2011-12-01

    Diffusion-weighted imaging (DWI) using low b-values permits imaging of intravoxel incoherent motion in tissues. However, low b-value DWI of the human heart has been considered too challenging because of additional signal loss due to physiological motion, which reduces both signal intensity and the signal-to-noise ratio (SNR). We address these signal loss concerns by analyzing cardiac motion during a heartbeat to determine the time-window during which cardiac bulk motion is minimal. Using this information to optimize the acquisition of DWI data and combining it with a dedicated image processing approach has enabled us to develop a novel low b-value diffusion-weighted cardiac magnetic resonance imaging approach, which significantly reduces intravoxel incoherent motion measurement bias introduced by motion. Simulations from displacement encoded motion data sets permitted the delineation of an optimal time-window with minimal cardiac motion. A number of single-shot repetitions of low b-value DWI cardiac magnetic resonance imaging data were acquired during this time-window under free-breathing conditions with bulk physiological motion corrected for by using nonrigid registration. Principal component analysis (PCA) was performed on the registered images to improve the SNR, and temporal maximum intensity projection (TMIP) was applied to recover signal intensity from time-fluctuant motion-induced signal loss. This PCATMIP method was validated with experimental data, and its benefits were evaluated in volunteers before being applied to patients. Optimal time-window cardiac DWI in combination with PCATMIP postprocessing yielded significant benefits for signal recovery, contrast-to-noise ratio, and SNR in the presence of bulk motion for both numerical simulations and human volunteer studies. Analysis of mean apparent diffusion coefficient (ADC) maps showed homogeneous values among volunteers and good reproducibility between free-breathing and breath-hold acquisitions. The PCATMIP DWI approach also indicated its potential utility by detecting ADC variations in acute myocardial infarction patients. Studying cardiac motion may provide an appropriate strategy for minimizing the impact of bulk motion on cardiac DWI. Applying PCATMIP image processing improves low b-value DWI and enables reliable analysis of ADC in the myocardium. The use of a limited number of repetitions in a free-breathing mode also enables easier application in clinical conditions.
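
    The sketch below illustrates the PCATMIP post-processing idea on a synthetic stack of registered repetitions: a truncated PCA over the repetition dimension suppresses noise, and a pixel-wise temporal maximum intensity projection then recovers signal lost to motion in individual shots. The synthetic stack, noise level, and number of retained components are assumptions, not the study's imaging data.

```python
# Sketch of the PCATMIP idea: denoise a stack of registered single-shot images
# with a truncated PCA over the repetition dimension, then take a pixel-wise
# temporal maximum intensity projection (TMIP). The synthetic stack and the
# number of retained components are assumptions.
import numpy as np

rng = np.random.default_rng(7)
n_rep, ny, nx = 12, 64, 64
truth = np.zeros((ny, nx)); truth[20:44, 20:44] = 1.0     # common underlying signal
drop = rng.uniform(0.4, 1.0, n_rep)                       # motion-induced signal loss
stack = drop[:, None, None] * truth + rng.normal(0, 0.1, (n_rep, ny, nx))

# Truncated PCA over repetitions (rows = repetitions, columns = pixels).
X = stack.reshape(n_rep, -1)
mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 3                                                     # retained components
X_denoised = mean + (U[:, :k] * s[:k]) @ Vt[:k]

# Temporal maximum intensity projection recovers the pixel-wise signal.
tmip = X_denoised.reshape(n_rep, ny, nx).max(axis=0)
print("mean recovered intensity in the bright region:",
      round(tmip[20:44, 20:44].mean(), 3))
```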

  14. Statistical Determination of the Gating Windows for Respiratory-Gated Radiotherapy Using a Visible Guiding System.

    PubMed

    Oh, Se An; Yea, Ji Woon; Kim, Sung Kyu

    2016-01-01

    Respiratory-gated radiation therapy (RGRT) is used to minimize the radiation dose to normal tissue in lung-cancer patients. Although determining the gating window in the respiratory phase of patients is important in RGRT, it is not easy. Our aim was to determine the optimal gating window when using a visible guiding system for RGRT. Between April and October 2014, the breathing signals of 23 lung-cancer patients were recorded with a real-time position management (RPM) respiratory gating system (Varian, USA). We performed statistical analysis with breathing signals to find the optimal gating window for guided breathing in RGRT. When we compared breathing signals before and after the breathing training, 19 of the 23 patients showed statistically significant differences (p < 0.05). The standard deviation of the respiration signals after breathing training was lowest for phases of 30%-70%. The results showed that the optimal gating window in RGRT is 40% (30%-70%) with respect to repeatability for breathing after respiration training with the visible guiding system. RGRT was performed with the RPM system to confirm the usefulness of the visible guiding system. The RPM system and our visible guiding system improve the respiratory regularity, which in turn should improve the accuracy and efficiency of RGRT.

  15. Removal of Noise from a Voice Signal by Synthesis

    DTIC Science & Technology

    1973-05-01

    for 102.4 millisecond windows is about five times as great as the cost of computing for 25.6 millisecond windows. Hammett, in his work on an adaptive...spectrum analysis vocoder, has examined the selection of data window widths in detail [18]. The solution Hammett used to optimize the trade off between...result is: s(t) = Σ_{i=1}^{n} R_i(t − i·T). In this equation n is the number of impulse responses under consideration, s(t) is the resulting synthetic signal

  16. Defining the therapeutic time window for suppressing the inflammatory prostaglandin E2 signaling after status epilepticus

    PubMed Central

    Du, Yifeng; Kemper, Timothy; Qiu, Jiange; Jiang, Jianxiong

    2016-01-01

    Neuroinflammation is a common feature in nearly all neurological and some psychiatric disorders. Resembling its extraneural counterpart, neuroinflammation can be both beneficial and detrimental depending on the responding molecules. The overall effect of inflammation on disease progression is highly dependent on the extent of inflammatory mediator production and the duration of inflammatory induction. The time-dependent aspect of inflammatory responses suggests that the therapeutic time window for quelling neuroinflammation might vary with molecular targets and injury types. Therefore, it is important to define the therapeutic time window for anti-inflammatory therapeutics, as contradicting or negative results might arise when different treatment regimens are utilized even in similar animal models. Herein, we discuss a few critical factors that can help define the therapeutic time window and optimize treatment paradigm for suppressing the cyclooxygenase-2/prostaglandin-mediated inflammation after status epilepticus. These determinants should also be relevant to other anti-inflammatory therapeutic strategies for the CNS diseases. PMID:26689339

  17. Simulation-based process windows simultaneously considering two and three conflicting criteria in injection molding

    PubMed Central

    Rodríguez-Yáñez, Alicia Berenice; Méndez-Vázquez, Yaileen

    2014-01-01

    Process windows in injection molding are habitually built with only one performance measure in mind. In reality, a more realistic picture can be obtained when considering multiple performance measures at a time, especially in the presence of conflict. In this work, the construction of process windows for injection molding (IM) is undertaken considering two and three performance measures in conflict simultaneously. The best compromises between the criteria involved are identified through the direct application of the concept of Pareto-dominance in multiple criteria optimization. The aim is to provide a formal and realistic strategy to set processing conditions in IM operations. The resulting optimization approach is easily implementable in MS Excel. The solutions are presented graphically to facilitate their use in manufacturing plants. PMID:25530927

  18. Simulation-based process windows simultaneously considering two and three conflicting criteria in injection molding.

    PubMed

    Rodríguez-Yáñez, Alicia Berenice; Méndez-Vázquez, Yaileen; Cabrera-Ríos, Mauricio

    2014-01-01

    Process windows in injection molding are habitually built with only one performance measure in mind. In reality, a more realistic picture can be obtained when considering multiple performance measures at a time, especially in the presence of conflict. In this work, the construction of process windows for injection molding (IM) is undertaken considering two and three performance measures in conflict simultaneously. The best compromises between the criteria involved are identified through the direct application of the concept of Pareto-dominance in multiple criteria optimization. The aim is to provide a formal and realistic strategy to set processing conditions in IM operations. The resulting optimization approach is easily implementable in MS Excel. The solutions are presented graphically to facilitate their use in manufacturing plants.
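
    A minimal sketch of the Pareto-dominance filtering step is given below: candidate processing conditions are screened so that only non-dominated compromises among the (minimized) quality criteria remain. The candidate points and criteria are made-up illustrations, not data from the study; the MS Excel implementation described above is not reproduced.

```python
# Minimal sketch of a Pareto-dominance filter over candidate injection molding
# settings evaluated on conflicting criteria (all to be minimized), the core
# step of the process-window construction described above. The candidate
# points are made-up illustrations, not data from the paper.
import numpy as np

def pareto_front(objectives):
    """Return indices of non-dominated rows (all objectives minimized)."""
    obj = np.asarray(objectives, dtype=float)
    keep = []
    for i, row in enumerate(obj):
        dominated = np.any(np.all(obj <= row, axis=1) & np.any(obj < row, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Rows: candidate processing conditions; columns: e.g. warpage and shrinkage.
candidates = [[0.30, 1.2], [0.25, 1.5], [0.40, 0.9], [0.35, 1.4], [0.22, 1.8]]
print("non-dominated candidates:", pareto_front(candidates))
```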

  19. An online input force time history reconstruction algorithm using dynamic principal component analysis

    NASA Astrophysics Data System (ADS)

    Prawin, J.; Rama Mohan Rao, A.

    2018-01-01

    Knowledge of the dynamic loads acting on a structure is required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time history reconstruction algorithm using Dynamic Principal Component Analysis (DPCA) from acceleration time history response measurements using moving windows. We also present an optimal sensor placement algorithm to place limited sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that, unlike earlier formulations, it does not require finite element idealization of the structure and is therefore free from physical modelling errors. We have considered three numerical examples to validate the accuracy of the proposed DPCA-based method. Effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at the optimal window size and the percentage of window overlap. Studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.

  20. Multi-Window Controllers for Autonomous Space Systems

    NASA Technical Reports Server (NTRS)

    Lurie, B. J.; Hadaegh, F. Y.

    1997-01-01

    Multi-window controllers select between elementary linear controllers using nonlinear windows based on the amplitude and frequency content of the feedback error. The controllers are relatively simple to implement and perform much better than linear controllers. The commanders for such controllers only order the destination point and are freed from generating the command time-profiles. Robotic missions rely heavily on the tasks of acquisition and tracking. For autonomous and optimal control of the spacecraft, the control bandwidth must be larger while the feedback can (and, therefore, must) be reduced. Combining linear compensators via a multi-window nonlinear summer guarantees the minimum-phase character of the combined transfer function. It is shown that the solution may require using several parallel branches and windows. Several examples of multi-window nonlinear controller applications are presented.

  1. Fast Human Detection for Intelligent Monitoring Using Surveillance Visible Sensors

    PubMed Central

    Ko, Byoung Chul; Jeong, Mira; Nam, JaeYeal

    2014-01-01

    Human detection using visible surveillance sensors is an important and challenging task for intruder detection and safety management. The biggest barrier to real-time human detection is the computational time required for dense image scaling and for scanning windows extracted from an entire image. This paper proposes fast human detection by selecting optimal levels of image scale using each level's adaptive region-of-interest (ROI). To estimate the image-scaling level, we generate a Hough windows map (HWM) and select a few optimal image scales based on the strength of the HWM and the divide-and-conquer algorithm. Furthermore, adaptive ROIs are arranged per image scale to provide a different search area. We employ a cascade random forests classifier to separate candidate windows into human and nonhuman classes. The proposed algorithm has been successfully applied to real-world surveillance video sequences, and its detection accuracy and computational speed show better performance than those of other related methods. PMID:25393782

  2. Vehicle routing problem with time windows using natural inspired algorithms

    NASA Astrophysics Data System (ADS)

    Pratiwi, A. B.; Pratama, A.; Sa’diyah, I.; Suprajitno, H.

    2018-03-01

    The distribution of goods requires a strategy that minimizes the total cost of operational activities, but several constraints have to be satisfied, namely the capacity of the vehicles and the service times of the customers. The resulting Vehicle Routing Problem with Time Windows (VRPTW) is a complex constrained problem. This paper proposes nature-inspired algorithms for dealing with the constraints of the VRPTW, involving the Bat Algorithm and Cat Swarm Optimization. The Bat Algorithm is hybridized with Simulated Annealing: the worst solution of the Bat Algorithm is replaced by the solution from Simulated Annealing. Cat Swarm Optimization, an algorithm based on the behaviour of cats, is improved using the Crow Search Algorithm for simpler and faster convergence. The computational results show that these algorithms perform well in finding the minimum total distance, and a higher population size yields better computational performance. The improved Cat Swarm Optimization with Crow Search gives better performance than the hybridization of the Bat Algorithm and Simulated Annealing when dealing with big data.
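
    Whichever metaheuristic generates the candidate routes (Bat Algorithm, Cat Swarm Optimization, or their hybrids), each route must be checked against the capacity and time-window constraints and scored. A minimal sketch of that evaluation step is shown below; the data layout and field names are hypothetical, not taken from the paper's instances.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Customer:
        demand: float
        ready: float      # earliest service start (time window opens)
        due: float        # latest service start (time window closes)
        service: float    # service duration

    def evaluate_route(route, customers, travel_time, capacity):
        """Return (feasible, total_travel_time) for one vehicle route.

        route: customer indices in visiting order; index 0 is the depot.
        travel_time[i][j]: travel time between locations i and j.
        Arrival before 'ready' waits; arrival after 'due' is infeasible.
        """
        load, clock, total, prev = 0.0, 0.0, 0.0, 0
        for c in route:
            load += customers[c].demand
            if load > capacity:
                return False, float("inf")
            leg = travel_time[prev][c]
            total += leg
            clock = max(clock + leg, customers[c].ready)   # wait if early
            if clock > customers[c].due:
                return False, float("inf")
            clock += customers[c].service
            prev = c
        total += travel_time[prev][0]                      # return to depot
        return True, total
    ```

    A metaheuristic would call evaluate_route inside its fitness function and penalize or discard infeasible candidate routes.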

  3. Performance evaluation and optimization of the MiniPET-II scanner

    NASA Astrophysics Data System (ADS)

    Lajtos, Imre; Emri, Miklos; Kis, Sandor A.; Opposits, Gabor; Potari, Norbert; Kiraly, Beata; Nagy, Ferenc; Tron, Lajos; Balkay, Laszlo

    2013-04-01

    This paper presents results of the performance of a small animal PET system (MiniPET-II) installed at our Institute. MiniPET-II is a full ring camera that includes 12 detector modules in a single ring comprised of 1.27×1.27×12 mm3 LYSO scintillator crystals. The axial field of view and the inner ring diameter are 48 mm and 211 mm, respectively. The goal of this study was to determine the NEMA-NU4 performance parameters of the scanner. In addition, we also investigated how the calculated parameters depend on the coincidence time window (τ=2, 3 and 4 ns) and the low threshold settings of the energy window (Elt=250, 350 and 450 keV). Independent measurements supported optimization of the effective system radius and the coincidence time window of the system. We found that the optimal coincidence time window and low threshold energy window are 3 ns and 350 keV, respectively. The spatial resolution was close to 1.2 mm in the center of the FOV with an increase of 17% at the radial edge. The maximum value of the absolute sensitivity was 1.37% for a point source. Count rate tests resulted in peak values for the noise equivalent count rate (NEC) curve and scatter fraction of 14.2 kcps (at 36 MBq) and 27.7%, respectively, using the rat phantom. Numerical values of the same parameters obtained for the mouse phantom were 55.1 kcps (at 38.8 MBq) and 12.3%, respectively. The recovery coefficients of the image quality phantom ranged from 0.1 to 0.87. Altering the τ and Elt resulted in substantial changes in the NEC peak and the sensitivity while the effect on the image quality was negligible. The spatial resolution proved to be, as expected, independent of the τ and Elt. The calculated optimal effective system radius (resulting in the best image quality) was 109 mm. Although the NEC peak parameters do not compare favorably with those of other small animal scanners, it can be concluded that under normal counting situations the MiniPET-II imaging capability assures remarkably good image quality, sensitivity and spatial resolution.

  4. Optimization and performance evaluation of the microPET II scanner for in vivo small-animal imaging

    NASA Astrophysics Data System (ADS)

    Yang, Yongfeng; Tai, Yuan-Chuan; Siegel, Stefan; Newport, Danny F.; Bai, Bing; Li, Quanzheng; Leahy, Richard M.; Cherry, Simon R.

    2004-06-01

    MicroPET II is a newly developed PET (positron emission tomography) scanner designed for high-resolution imaging of small animals. It consists of 17 640 LSO crystals each measuring 0.975 × 0.975 × 12.5 mm3, which are arranged in 42 contiguous rings, with 420 crystals per ring. The scanner has an axial field of view (FOV) of 4.9 cm and a transaxial FOV of 8.5 cm. The purpose of this study was to carefully evaluate the performance of the system and to optimize settings for in vivo mouse and rat imaging studies. The volumetric image resolution was found to depend strongly on the reconstruction algorithm employed and averaged 1.1 mm (1.4 µl) across the central 3 cm of the transaxial FOV when using a statistical reconstruction algorithm with accurate system modelling. The sensitivity, scatter fraction and noise-equivalent count (NEC) rate for mouse- and rat-sized phantoms were measured for different energy and timing windows. Mouse imaging was optimized with a wide open energy window (150-750 keV) and a 10 ns timing window, leading to a sensitivity of 3.3% at the centre of the FOV and a peak NEC rate of 235 000 cps for a total activity of 80 MBq (2.2 mCi) in the phantom. Rat imaging, due to the higher scatter fraction, and the activity that lies outside of the field of view, achieved a maximum NEC rate of 24 600 cps for a total activity of 80 MBq (2.2 mCi) in the phantom, with an energy window of 250-750 keV and a 6 ns timing window. The sensitivity at the centre of the FOV for these settings is 2.1%. This work demonstrates that different scanner settings are necessary to optimize the NEC count rate for different-sized animals and different injected doses. Finally, phantom and in vivo animal studies are presented to demonstrate the capabilities of microPET II for small-animal imaging studies.

  5. VO2 thermochromic smart window for energy savings and generation

    PubMed Central

    Zhou, Jiadong; Gao, Yanfeng; Zhang, Zongtao; Luo, Hongjie; Cao, Chuanxiang; Chen, Zhang; Dai, Lei; Liu, Xinling

    2013-01-01

    The ability to achieve energy saving in architectures and optimal solar energy utilisation affects the sustainable development of the human race. Traditional smart windows and solar cells cannot be combined into one device for energy saving and electricity generation. A VO2 film can respond to the environmental temperature to intelligently regulate infrared transmittance while maintaining visible transparency, and can be applied as a thermochromic smart window. Herein, we report for the first time a novel VO2-based smart window that partially utilises light scattering to solar cells around the glass panel for electricity generation. This smart window combines energy-saving and generation in one device, and offers potential to intelligently regulate and utilise solar radiation in an efficient manner. PMID:24157625

  6. VO₂ thermochromic smart window for energy savings and generation.

    PubMed

    Zhou, Jiadong; Gao, Yanfeng; Zhang, Zongtao; Luo, Hongjie; Cao, Chuanxiang; Chen, Zhang; Dai, Lei; Liu, Xinling

    2013-10-24

    The ability to achieve energy saving in architectures and optimal solar energy utilisation affects the sustainable development of the human race. Traditional smart windows and solar cells cannot be combined into one device for energy saving and electricity generation. A VO2 film can respond to the environmental temperature to intelligently regulate infrared transmittance while maintaining visible transparency, and can be applied as a thermochromic smart window. Herein, we report for the first time a novel VO2-based smart window that partially utilises light scattering to solar cells around the glass panel for electricity generation. This smart window combines energy-saving and generation in one device, and offers potential to intelligently regulate and utilise solar radiation in an efficient manner.

  7. Optimized retrievals of precipitable water from the VAS 'split window'

    NASA Technical Reports Server (NTRS)

    Chesters, Dennis; Robinson, Wayne D.; Uccellini, Louis W.

    1987-01-01

    Precipitable water fields have been retrieved from the VISSR Atmospheric Sounder (VAS) using a radiation transfer model for the differential water vapor absorption between the 11- and 12-micron 'split window' channels. Previous moisture retrievals using only the split window channels provided very good space-time continuity but poor absolute accuracy. This note describes how retrieval errors can be significantly reduced from plus or minus 0.9 to plus or minus 0.6 gm/sq cm by empirically optimizing the effective air temperature and absorption coefficients used in the two-channel model. The differential absorption between the VAS 11- and 12-micron channels, empirically estimated from 135 colocated VAS-RAOB observations, is found to be approximately 50 percent smaller than the theoretical estimates. Similar discrepancies have been noted previously between theoretical and empirical absorption coefficients applied to the retrieval of sea surface temperatures using radiances observed by VAS and polar-orbiting satellites. These discrepancies indicate that radiation transfer models for the 11-micron window appear to be less accurate than the satellite observations.

  8. Optimization of finite difference forward modeling for elastic waves based on optimum combined window functions

    NASA Astrophysics Data System (ADS)

    Jian, Wang; Xiaohong, Meng; Hong, Liu; Wanqiu, Zheng; Yaning, Liu; Sheng, Gui; Zhiyang, Wang

    2017-03-01

    Full waveform inversion and reverse time migration are active research areas in seismic exploration. Forward modeling in the time domain determines the precision of the results, and finite difference numerical solutions have been widely adopted as an important mathematical tool for forward modeling. In this article, an optimum combination of window functions was designed based on the finite difference operator, using a truncated approximation of the spatial convolution series in pseudo-spectrum space, to normalize the outcomes of existing window functions for different orders. The proposed combined window functions not only inherit the characteristics of the individual window functions, providing better truncation results, but also allow the truncation error of the finite difference operator to be controlled manually and visually by adjusting the combinations and analyzing the characteristics of the main and side lobes of the amplitude response. Error levels and elastic forward modeling under the proposed combined scheme were compared with outcomes from conventional window functions and modified binomial windows. Numerical dispersion is significantly suppressed compared with both the modified binomial window finite difference and the conventional finite difference. Numerical simulation verifies the reliability of the proposed method.
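
    As one concrete, simplified illustration of combining window functions to taper a truncated operator, the sketch below blends a Hanning and a Gaussian window with a single weight; the weight and the Gaussian width are illustrative tuning knobs, not the paper's pseudo-spectrum-space formulation.

    ```python
    import numpy as np

    def combined_window(n, lam=0.5, gauss_std=None):
        """Weighted combination of a Hanning and a Gaussian window, as one simple
        way to trade off main-lobe width against side-lobe level when tapering a
        truncated finite-difference/convolution operator.  lam and gauss_std are
        illustrative parameters, not values from the paper."""
        if gauss_std is None:
            gauss_std = n / 6.0
        m = np.arange(n) - (n - 1) / 2.0
        gauss = np.exp(-0.5 * (m / gauss_std) ** 2)
        hann = np.hanning(n)
        w = lam * hann + (1.0 - lam) * gauss
        return w / w.max()                 # keep the centre weight at 1
    ```

    Sweeping lam and inspecting the amplitude response of the tapered operator is the manual, visual error control the abstract describes.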

  9. Optimal ranking regime analysis of TreeFlow dendrohydrological reconstructions

    USDA-ARS?s Scientific Manuscript database

    The Optimal Ranking Regime (ORR) method was used to identify 6-100 year time windows containing significant ranking sequences in 55 western U.S. streamflow reconstructions, and reconstructions of the level of the Great Salt Lake and San Francisco Bay salinity during 1500-2007. The method’s ability t...

  10. The research on the mean shift algorithm for target tracking

    NASA Astrophysics Data System (ADS)

    CAO, Honghong

    2017-06-01

    The traditional mean shift algorithm for target tracking is effective and runs in real time, but it still has some shortcomings. It easily falls into local optima during tracking, and its effectiveness weakens when the object moves fast. Moreover, the size of the tracking window never changes, so the method fails when the size of the moving object changes. As a result, we propose a new method: the particle swarm optimization algorithm is used to optimize the mean shift algorithm for target tracking, while SIFT (scale-invariant feature transform) and an affine transformation make the size of the tracking window adaptive. Finally, we evaluate the method through comparative experiments. The experimental results indicate that the proposed method can effectively track the object while the size of the tracking window adapts to the object.
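
    The core mean shift update that the abstract builds on is a weighted-centroid iteration over the tracking window; a minimal sketch is given below (the PSO, SIFT and affine-transformation refinements are not reproduced, and the function and parameter names are hypothetical).

    ```python
    import numpy as np

    def mean_shift_step(weights, cx, cy, half_w, half_h):
        """One mean shift update of the window centre (cx, cy) on a weight image.

        weights: 2-D array, e.g. the colour-histogram back-projection of the target.
        The new centre is the weighted centroid of the pixels inside the window.
        """
        y0, y1 = max(0, cy - half_h), min(weights.shape[0], cy + half_h + 1)
        x0, x1 = max(0, cx - half_w), min(weights.shape[1], cx + half_w + 1)
        win = weights[y0:y1, x0:x1]
        if win.sum() == 0:
            return cx, cy
        ys, xs = np.mgrid[y0:y1, x0:x1]
        cy_new = int(round((ys * win).sum() / win.sum()))
        cx_new = int(round((xs * win).sum() / win.sum()))
        return cx_new, cy_new

    def track(weights, cx, cy, half_w=20, half_h=20, max_iter=20, eps=1):
        # iterate until the window centre moves less than eps pixels
        for _ in range(max_iter):
            nx, ny = mean_shift_step(weights, cx, cy, half_w, half_h)
            if abs(nx - cx) <= eps and abs(ny - cy) <= eps:
                break
            cx, cy = nx, ny
        return cx, cy
    ```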

  11. Optimization for routing vehicles of seafood product transportation

    NASA Astrophysics Data System (ADS)

    Soenandi, I. A.; Juan, Y.; Budi, M.

    2017-12-01

    The growing use of marine products is creating new transportation challenges for marine product businesses, which must carry products such as seafood to a main warehouse. This becomes a problem when the carrier fleet is limited and there are time constraints related to the freshness of the product. Among the ways to solve this problem is the optimization of vehicle routing. In this study, that strategy is applied to a marine product business in Indonesia, with the aim of optimizing the company's transportation routing under time and capacity windows. Until now, the company has not used a scientific method to manage the routing of its vehicles from the warehouse to the marine product sources. This study solves a stochastic Vehicle Routing Problem (VRP) with time and capacity windows by comparing six methods and looking for the best result, so that the company can choose the method best suited to its existing conditions. We compared the optimization across methods including branch and bound, dynamic programming, and Ant Colony Optimization (ACO), and obtained the best result by running the ACO algorithm on the existing travel time data. The ACO algorithm was able to reduce vehicle travel time by 3189.65 minutes, about 23% less than the existing schedule, under a time constraint of 2 days (including rest time for the driver), using trucks with a capacity of 28 tons; the company needs two vehicles for this transportation.

  12. A scan statistic for identifying optimal risk windows in vaccine safety studies using self-controlled case series design.

    PubMed

    Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M

    2013-08-30

    In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure randomized controlled trials provide little insight into the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from the self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
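
    As a rough illustration of the scanning idea, the sketch below scores candidate risk windows [1, w] with a simple unconditional Poisson log-likelihood ratio; the paper's actual statistic is conditional on each case (self-controlled case series, fixed-effects models) and adjusts for age and seasonality, which this toy omits.

    ```python
    import numpy as np

    def scan_risk_windows(event_days, followup_days, max_window=42):
        """Scan candidate post-vaccination risk windows [1, w] and score each
        with a Poisson log-likelihood ratio (event rate inside vs. outside).

        event_days: day of the adverse event after vaccination, one per case
        followup_days: total observed days per case
        """
        event_days = np.asarray(event_days, dtype=float)
        best = (None, -np.inf)
        total_events = len(event_days)
        total_time = float(np.sum(followup_days))
        for w in range(1, max_window + 1):
            n_in = np.sum(event_days <= w)
            n_out = total_events - n_in
            t_in = np.sum(np.minimum(followup_days, w))
            t_out = total_time - t_in
            if n_in == 0 or t_in == 0 or t_out == 0:
                continue
            r_in, r_out = n_in / t_in, max(n_out, 0.5) / t_out
            r0 = total_events / total_time
            llr = (n_in * np.log(r_in / r0) + n_out * np.log(r_out / r0)
                   if r_in > r_out else 0.0)
            if llr > best[1]:
                best = (w, llr)
        return best  # (window length with the largest scan statistic, its LLR)
    ```

    In practice the maximum of such a statistic would be compared against a Monte Carlo or permutation reference distribution rather than a fixed cut-off.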

  13. Better delivery/pick up routes in the presence of uncertainty.

    DOT National Transportation Integrated Search

    2007-08-01

    We consider the Courier Delivery Problem, a variant of the Vehicle Routing Problem with : time windows in which customers appear probabilistically and their service times are uncertain. : We use scenario-based stochastic optimization with recourse fo...

  14. Robust alignment of chromatograms by statistically analyzing the shifts matrix generated by moving window fast Fourier transform cross-correlation.

    PubMed

    Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian

    2015-03-01

    Retention time shift is one of the most challenging problems during the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shifts robustly if a proper window size has been selected. The window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
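
    The per-window shift estimation at the heart of the method can be sketched as below; this is an illustrative reimplementation, not the MWFFT2 package, and it takes a single global mode of the window shifts rather than the per-scan-point column modes described in the abstract.

    ```python
    import numpy as np

    def window_shifts(reference, sample, window_size=200, step=50):
        """Estimate the retention-time shift of `sample` against `reference`
        inside each moving window using FFT cross-correlation."""
        shifts = []
        for start in range(0, len(reference) - window_size + 1, step):
            r = reference[start:start + window_size]
            s = sample[start:start + window_size]
            r = r - r.mean()
            s = s - s.mean()
            # circular cross-correlation via FFT
            xc = np.fft.ifft(np.fft.fft(r) * np.conj(np.fft.fft(s))).real
            lag = int(np.argmax(xc))
            if lag > window_size // 2:       # map to a signed lag
                lag -= window_size
            shifts.append(lag)
        return np.array(shifts)

    def robust_shift(shifts):
        """Mode of the window shifts, as a robust shift estimate."""
        values, counts = np.unique(shifts, return_counts=True)
        return values[np.argmax(counts)]
    ```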

  15. Free-breathing 3D Cardiac MRI Using Iterative Image-Based Respiratory Motion Correction

    PubMed Central

    Moghari, Mehdi H.; Roujol, Sébastien; Chan, Raymond H.; Hong, Susie N.; Bello, Natalie; Henningsson, Markus; Ngo, Long H.; Goddu, Beth; Goepfert, Lois; Kissinger, Kraig V.; Manning, Warren J.; Nezafat, Reza

    2012-01-01

    Respiratory motion compensation using diaphragmatic navigator (NAV) gating with a 5 mm gating window is conventionally used for free-breathing cardiac MRI. Due to the narrow gating window, scan efficiency is low, resulting in long scan times, especially for patients with irregular breathing patterns. In this work, a new retrospective motion compensation algorithm is presented that reduces the scan time for free-breathing cardiac MRI by increasing the gating window to 15 mm without compromising image quality. The proposed algorithm iteratively corrects for respiratory-induced cardiac motion by optimizing the sharpness of the heart. To evaluate this technique, two coronary MRI datasets with 1.3 mm3 resolution were acquired from 11 healthy subjects (7 females, 25±9 years); one using a NAV with a 5 mm gating window acquired in 12.0±2.0 minutes and one with a 15 mm gating window acquired in 7.1±1.0 minutes. The images acquired with a 15 mm gating window were corrected using the proposed algorithm and compared to the uncorrected images acquired with the 5 mm and 15 mm gating windows. The image quality score, sharpness, and length of the three major coronary arteries were equivalent between the corrected images and the images acquired with a 5 mm gating window (p-value>0.05), while the scan time was reduced by a factor of 1.7. PMID:23132549

  16. Improving the Performance of PbS Quantum Dot Solar Cells by Optimizing ZnO Window Layer

    NASA Astrophysics Data System (ADS)

    Yang, Xiaokun; Hu, Long; Deng, Hui; Qiao, Keke; Hu, Chao; Liu, Zhiyong; Yuan, Shengjie; Khan, Jahangeer; Li, Dengbing; Tang, Jiang; Song, Haisheng; Cheng, Chun

    2017-04-01

    Compared with the intense research on the absorber layer, the window layer has attracted less attention in PbS quantum dot solar cells (QD SCs). Yet the window layer plays a key role in exciton separation, charge drift, and related processes. Herein, the ZnO window layer was systematically investigated for its role in QD SC performance, and the physical mechanism of the improved performance was also explored. It was found that optimized ZnO films with appropriate thickness and doping concentration can balance the optical and electrical properties, and their energy bands align well with the absorber layer for efficient charge extraction. Further characterization demonstrated that window layer optimization helps to reduce surface defects, improve the heterojunction quality, and extend the depletion width. Compared with the control devices, the optimized devices obtained an efficiency of 6.7%, with enhancements of 18% in Voc, 21% in Jsc, 10% in FF, and 58% in power conversion efficiency. The present work suggests a useful strategy for improving device performance by optimizing the window layer in addition to the absorber layer.

  17. A frequency-based window width optimized two-dimensional S-Transform profilometry

    NASA Astrophysics Data System (ADS)

    Zhong, Min; Chen, Feng; Xiao, Chao

    2017-11-01

    A new scheme is proposed as a frequency-based window-width-optimized two-dimensional S-transform profilometry, in which parameters pu and pv are introduced to control the width of a two-dimensional Gaussian window. Unlike the standard two-dimensional S-transform, which uses a Gaussian window with width proportional to the reciprocal local frequency of the tested signal, the window width of the optimized two-dimensional S-transform varies with the pu-th (pv-th) power of the reciprocal local frequency fx (fy) in the x (y) direction. The paper gives a detailed theoretical analysis of the optimized two-dimensional S-transform in fringe analysis, as well as the characteristics of the modified Gaussian window. Simulations are used to evaluate the proposed scheme; the results show that the new scheme has better noise reduction ability and can extract the phase distribution more precisely than the standard two-dimensional S-transform, even when the surface of the measured object varies sharply. Finally, the proposed scheme is demonstrated on three-dimensional surface reconstruction of a complex plastic cat mask to show its effectiveness.
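
    The key modification is that the Gaussian window widths scale with the pu-th and pv-th powers of the reciprocal local frequencies; a small sketch of building such a window follows (the values of pu and pv are illustrative, and the profilometry pipeline itself is not reproduced).

    ```python
    import numpy as np

    def gaussian_window_2d(nx, ny, fx, fy, pu=0.8, pv=0.8):
        """Frequency-dependent 2-D Gaussian window for the modified S-transform.

        Standard S-transform: sigma = 1/|f|.  Optimized version (this abstract):
        sigma_x = 1/|fx|**pu, sigma_y = 1/|fy|**pv, so pu = pv = 1 recovers the
        standard window.  fx, fy are assumed nonzero (the DC component is handled
        separately in the S-transform).
        """
        sigma_x = 1.0 / (abs(fx) ** pu)
        sigma_y = 1.0 / (abs(fy) ** pv)
        x = np.arange(nx) - nx // 2
        y = np.arange(ny) - ny // 2
        X, Y = np.meshgrid(x, y, indexing="ij")
        return np.exp(-(X**2) / (2 * sigma_x**2) - (Y**2) / (2 * sigma_y**2))
    ```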

  18. Improved artificial bee colony algorithm for vehicle routing problem with time windows

    PubMed Central

    Yan, Qianqian; Zhang, Mengjie; Yang, Yunong

    2017-01-01

    This paper investigates a well-known complex combinatorial problem known as the vehicle routing problem with time windows (VRPTW). Unlike the standard vehicle routing problem, each customer in the VRPTW is served within a given time constraint. This paper solves the VRPTW using an improved artificial bee colony (IABC) algorithm. The performance of this algorithm is improved by a local optimization based on a crossover operation and a scanning strategy. Finally, the effectiveness of the IABC is evaluated on some well-known benchmarks. The results demonstrate the power of IABC algorithm in solving the VRPTW. PMID:28961252

  19. Optimization of simultaneous tritium–radiocarbon internal gas proportional counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonicalzi, R. M.; Aalseth, C. E.; Day, A. R.

    Specific environmental applications can benefit from dual tritium and radiocarbon measurements in a single compound. Assuming typical environmental levels, it is often the low tritium activity relative to the higher radiocarbon activity that limits the dual measurement. In this paper, we explore the parameter space for a combined tritium and radiocarbon measurement using a methane sample mixed with an argon fill gas in low-background proportional counters of a specific design. We present an optimized methane percentage, detector fill pressure, and analysis energy windows to maximize measurement sensitivity while minimizing count time. The final optimized method uses a 9-atm fill of P35 (35% methane, 65% argon), and a tritium analysis window from 1.5 to 10.3 keV, which stops short of the tritium beta decay endpoint energy of 18.6 keV. This method optimizes tritium counting efficiency while minimizing radiocarbon beta decay interference.

  20. A study on characteristics of retrospective optimal interpolation with WRF testbed

    NASA Astrophysics Data System (ADS)

    Kim, S.; Noh, N.; Lim, G.

    2012-12-01

    This study presents the application of retrospective optimal interpolation (ROI) with the Weather Research and Forecasting (WRF) model. Song et al. (2009) proposed the ROI method, an optimal interpolation (OI) scheme that gradually assimilates observations over the analysis window to obtain a variance-minimum estimate of the atmospheric state at the initial time of the analysis window. Song and Lim (2011) improved the method by incorporating eigen-decomposition and covariance inflation. The ROI method assimilates data at post-analysis times using a perturbation method (Errico and Raeder, 1999) without an adjoint model. In this study, the ROI method is applied to the WRF model to validate the algorithm and to investigate its capability. The computational cost of ROI can be reduced through the eigen-decomposition of the background error covariance. Using the background error covariance in eigen-space, a one-profile assimilation experiment is performed. The difference between forecast errors with and without assimilation clearly grows as time passes, indicating that assimilation improves the forecast. The characteristics and strengths/weaknesses of the ROI method are investigated by conducting experiments with other data assimilation methods.

  1. Dynamic vehicle routing with time windows in theory and practice.

    PubMed

    Yang, Zhiwei; van Osta, Jan-Paul; van Veen, Barry; van Krevelen, Rick; van Klaveren, Richard; Stam, Andries; Kok, Joost; Bäck, Thomas; Emmerich, Michael

    2017-01-01

    The vehicle routing problem is a classical combinatorial optimization problem. This work is about a variant of the vehicle routing problem with dynamically changing orders and time windows. In real-world applications often the demands change during operation time. New orders occur and others are canceled. In this case new schedules need to be generated on-the-fly. Online optimization algorithms for dynamical vehicle routing address this problem but so far they do not consider time windows. Moreover, to match the scenarios found in real-world problems adaptations of benchmarks are required. In this paper, a practical problem is modeled based on the procedure of daily routing of a delivery company. New orders by customers are introduced dynamically during the working day and need to be integrated into the schedule. A multiple ant colony algorithm combined with powerful local search procedures is proposed to solve the dynamic vehicle routing problem with time windows. The performance is tested on a new benchmark based on simulations of a working day. The problems are taken from Solomon's benchmarks but a certain percentage of the orders are only revealed to the algorithm during operation time. Different versions of the MACS algorithm are tested and a high performing variant is identified. Finally, the algorithm is tested in situ: In a field study, the algorithm schedules a fleet of cars for a surveillance company. We compare the performance of the algorithm to that of the procedure used by the company and we summarize insights gained from the implementation of the real-world study. The results show that the multiple ant colony algorithm can get a much better solution on the academic benchmark problem and also can be integrated in a real-world environment.

  2. Progesterone in experimental permanent stroke: a dose-response and therapeutic time-window study

    PubMed Central

    Wali, Bushra; Ishrat, Tauheed; Won, Soonmi; Stein, Donald G.

    2014-01-01

    Currently, the only approved treatment for ischaemic stroke is tissue plasminogen activator, a clot-buster. This treatment can have dangerous consequences if not given within the first 4 h after stroke. Our group and others have shown progesterone to be beneficial in preclinical studies of stroke, but a progesterone dose-response and time-window study is lacking. We tested male Sprague-Dawley rats (12 months old) with permanent middle cerebral artery occlusion or sham operations on multiple measures of sensory, motor and cognitive performance. For the dose-response study, animals received intraperitoneal injections of progesterone (8, 16 or 32 mg/kg) at 1 h post-occlusion, and subcutaneous injections at 6 h and then once every 24 h for 7 days. For the time-window study, the optimal dose of progesterone was given starting at 3, 6 or 24 h post-stroke. Behavioural recovery was evaluated at repeated intervals. Rats were killed at 22 days post-stroke and brains extracted for evaluation of infarct volume. Both 8 and 16 mg/kg doses of progesterone produced attenuation of infarct volume compared with the placebo, and improved functional outcomes up to 3 weeks after stroke on locomotor activity, grip strength, sensory neglect, gait impairment, motor coordination and spatial navigation tests. In the time-window study, the progesterone group exhibited substantial neuroprotection as late as 6 h after stroke onset. Compared with placebo, progesterone showed a significant reduction in infarct size with 3- and 6-h delays. Moderate doses (8 and 16 mg/kg) of progesterone reduced infarct size and improved functional deficits in our clinically relevant model of stroke. The 8 mg/kg dose was optimal in improving motor, sensory and memory function, and this effect was observed over a large therapeutic time window. Progesterone shows promise as a potential therapeutic agent and should be examined for safety and efficacy in a clinical trial for ischaemic stroke. PMID:24374329

  3. Performance Management and Optimization of Semiconductor Design Projects

    NASA Astrophysics Data System (ADS)

    Hinrichs, Neele; Olbrich, Markus; Barke, Erich

    2010-06-01

    The semiconductor industry is characterized by fast technological changes and small time-to-market windows. Improving productivity is the key factor to stand up to the competitors and thus successfully persist in the market. In this paper a Performance Management System for analyzing, optimizing and evaluating chip design projects is presented. A task graph representation is used to optimize the design process regarding time, cost and workload of resources. Key Performance Indicators are defined in the main areas cost, profit, resources, process and technical output to appraise the project.

  4. Effect of Data Assimilation Parameters on The Optimized Surface CO2 Flux in Asia

    NASA Astrophysics Data System (ADS)

    Kim, Hyunjung; Kim, Hyun Mee; Kim, Jinwoong; Cho, Chun-Ho

    2018-02-01

    In this study, CarbonTracker, an inverse modeling system based on the ensemble Kalman filter, was used to evaluate the effects of data assimilation parameters (assimilation window length and ensemble size) on the estimation of surface CO2 fluxes in Asia. Several experiments with different parameters were conducted, and the results were verified using CO2 concentration observations. The assimilation window lengths tested were 3, 5, 7, and 10 weeks, and the ensemble sizes were 100, 150, and 300. Therefore, a total of 12 experiments using combinations of these parameters were conducted. The experimental period was from January 2006 to December 2009. Differences between the optimized surface CO2 fluxes of the experiments were largest in the Eurasian Boreal (EB) area, followed by Eurasian Temperate (ET) and Tropical Asia (TA), and were larger in boreal summer than in boreal winter. The effect of ensemble size on the optimized biosphere flux is larger than the effect of the assimilation window length in Asia, but their relative importance varies across specific regions of Asia. The optimized biosphere flux was more sensitive to the assimilation window length in EB, whereas it was sensitive to the ensemble size as well as the assimilation window length in ET. The larger the ensemble size and the shorter the assimilation window length, the larger the uncertainty (i.e., spread of ensemble) of optimized surface CO2 fluxes. The 10-week assimilation window and 300-member ensemble were the optimal configuration for CarbonTracker in the Asian region based on several verifications using CO2 concentration measurements.

  5. Regarding the optimization of O1-mode ECRH and the feasibility of EBW startup on NSTX-U

    NASA Astrophysics Data System (ADS)

    Lopez, N. A.; Poli, F. M.

    2018-06-01

    Recently published scenarios for fully non-inductive startup and operation on the National Spherical Torus eXperiment Upgrade (NSTX-U) (Menard et al 2012 Nucl. Fusion 52 083015) show Electron Cyclotron Resonance Heating (ECRH) as an important component in preparing a target plasma for efficient High Harmonic Fast Wave and Neutral Beam heating. The modeling of the propagation and absorption of EC waves in the evolving plasma is required to define the most effective window of operation, and to optimize the launcher geometry for maximal heating and current drive during this window. Here, we extend a previous optimization of O1-mode ECRH on NSTX-U to account for the full time-dependent performance of the ECRH using simulations performed with TRANSP. We find that the evolution of the density profile has a prominent role in the optimization by defining the time window of operation, which in certain cases may be a more important metric to compare launcher performance than the average power absorption. This feature cannot be captured by analysis on static profiles, and should be accounted for when optimizing ECRH on any device that operates near the cutoff density. Additionally, the utility of the electron Bernstein wave (EBW) in driving current and generating closed flux surfaces in the early startup phase has been demonstrated on a number of devices. Using standalone GENRAY simulations, we find that efficient EBW current drive is possible on NSTX-U if the injection angle is shifted below the midplane and aimed towards the top half of the vacuum vessel. However, collisional damping of the EBW is projected to be significant, in some cases accounting for up to 97% of the absorbed EBW power.

  6. Regarding the optimization of O1-mode ECRH and the feasibility of EBW startup on NSTX-U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, Nicolas; Poli, Francesca M.

    Recently published scenarios for fully non-inductive startup and operation on the National Spherical Torus eXperiment Upgrade (NSTX-U) [Menard J et al 2012 Nucl. Fusion 52 083015] show Electron Cyclotron Resonance Heating (ECRH) as an important component in preparing a target plasma for efficient High Harmonic Fast Wave and Neutral Beam heating. The modelling of the propagation and absorption of EC waves in the evolving plasma is required to define the most effective window of operation, and to optimize the launcher geometry for maximal heating and current drive during this window. Here in this paper, we extend a previous optimization of O1-mode ECRH on NSTX-U to account for the full time-dependent performance of the ECRH using simulations performed with TRANSP. We find that the evolution of the density profile has a prominent role in the optimization by defining the time window of operation, which in certain cases may be a more important metric to compare launcher performance than the average power absorption. This feature cannot be captured by analysis on static profiles, and should be accounted for when optimizing ECRH on any device that operates near the cutoff density. Additionally, the utility of the electron Bernstein wave (EBW) in driving current and generating closed flux surfaces in the early startup phase has been demonstrated on a number of devices. Using standalone GENRAY simulations, we find that efficient EBW current drive is possible on NSTX-U if the injection angle is shifted below the midplane and aimed towards the top half of the vacuum vessel. However, collisional damping of the EBW is projected to be significant, in some cases accounting for up to 97% of the absorbed EBW power.

  7. Regarding the optimization of O1-mode ECRH and the feasibility of EBW startup on NSTX-U

    DOE PAGES

    Lopez, Nicolas; Poli, Francesca M.

    2018-03-29

    Recently published scenarios for fully non-inductive startup and operation on the National Spherical Torus eXperiment Upgrade (NSTX-U) [Menard J et al 2012 Nucl. Fusion 52 083015] show Electron Cyclotron Resonance Heating (ECRH) as an important component in preparing a target plasma for efficient High Harmonic Fast Wave and Neutral Beam heating. The modelling of the propagation and absorption of EC waves in the evolving plasma is required to define the most effective window of operation, and to optimize the launcher geometry for maximal heating and current drive during this window. Here in this paper, we extend a previous optimization of O1-mode ECRH on NSTX-U to account for the full time-dependent performance of the ECRH using simulations performed with TRANSP. We find that the evolution of the density profile has a prominent role in the optimization by defining the time window of operation, which in certain cases may be a more important metric to compare launcher performance than the average power absorption. This feature cannot be captured by analysis on static profiles, and should be accounted for when optimizing ECRH on any device that operates near the cutoff density. Additionally, the utility of the electron Bernstein wave (EBW) in driving current and generating closed flux surfaces in the early startup phase has been demonstrated on a number of devices. Using standalone GENRAY simulations, we find that efficient EBW current drive is possible on NSTX-U if the injection angle is shifted below the midplane and aimed towards the top half of the vacuum vessel. However, collisional damping of the EBW is projected to be significant, in some cases accounting for up to 97% of the absorbed EBW power.

  8. Effects of Feedback Timing and Type on Learning ESL Grammar Rules

    ERIC Educational Resources Information Center

    Lavolette, Elizabeth H. P.

    2014-01-01

    The optimal timing of feedback on formative assessments is an open question, with the cognitive processing window theory (Doughty, 2001) underlying the interaction approach suggesting that immediate feedback may be most beneficial for language acquisition (e.g., Gass, 2010; Polio, 2012) and two educational psychology hypotheses conversely…

  9. Optimal word sizes for dissimilarity measures and estimation of the degree of dissimilarity between DNA sequences.

    PubMed

    Wu, Tiee-Jian; Huang, Ying-Hsueh; Li, Lung-An

    2005-11-15

    Several measures of DNA sequence dissimilarity have been developed. The purpose of this paper is 3-fold. Firstly, we compare the performance of several word-based or alignment-based methods. Secondly, we give a general guideline for choosing the window size and determining the optimal word sizes for several word-based measures at different window sizes. Thirdly, we use a large-scale simulation method to simulate data from the distribution of SK-LD (symmetric Kullback-Leibler discrepancy). These simulated data can be used to estimate the degree of dissimilarity beta between any pair of DNA sequences. Our study shows (1) for whole-sequence similarity/dissimilarity identification the window size taken should be as large as possible, but probably not >3000, as restricted by CPU time in practice, (2) for each measure the optimal word size increases with window size, (3) when the optimal word size is used, SK-LD performance is superior in both simulation and real data analysis, (4) the estimate of beta based on SK-LD can be used to quickly filter out a large number of dissimilar sequences and speed up alignment-based database searches for similar sequences and (5) this estimate is also applicable in local similarity comparison situations. For example, it can help in selecting oligo probes with high specificity and, therefore, has potential in probe design for microarrays. The SK-LD algorithm, the estimator of beta and the simulation software are implemented in MATLAB code and are available at http://www.stat.ncku.edu.tw/tjwu
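
    A word-based symmetric Kullback-Leibler discrepancy of the kind described can be sketched as below; this is an illustrative Python version (the authors provide MATLAB code), using pseudocounts to avoid zero word frequencies.

    ```python
    from itertools import product
    import numpy as np

    def word_freqs(seq, k):
        """Relative frequencies of all 4**k DNA words of size k (with pseudocounts)."""
        words = ["".join(w) for w in product("ACGT", repeat=k)]
        counts = dict.fromkeys(words, 1.0)          # pseudocount of 1 per word
        for i in range(len(seq) - k + 1):
            w = seq[i:i + k]
            if w in counts:
                counts[w] += 1.0
        freqs = np.array([counts[w] for w in words])
        return freqs / freqs.sum()

    def sk_ld(seq1, seq2, k=3):
        """Symmetric Kullback-Leibler discrepancy between two sequences' word frequencies."""
        p, q = word_freqs(seq1, k), word_freqs(seq2, k)
        return 0.5 * np.sum(p * np.log(p / q)) + 0.5 * np.sum(q * np.log(q / p))
    ```

    Sliding a window along a long sequence and computing sk_ld per window is how the window-size and word-size trade-off in the abstract would be explored.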

  10. Counter tube window and X-ray fluorescence analyzer study

    NASA Technical Reports Server (NTRS)

    Hertel, R.; Holm, M.

    1973-01-01

    A study was performed to determine the best counter tube window design and X-ray fluorescence analyzer for quantitative analysis of Venusian dust and condensates. The principal objective of the project was to develop the best counter tube window geometry for the sensing element of the instrument. This included formulation of a mathematical model of the window and optimization of its parameters. The proposed detector and instrument have several important features. The instrument will perform a near real-time analysis of dust in the Venusian atmosphere, and is capable of measuring dust layers less than 1 micron thick. In addition, a wide dynamic measurement range will be provided to compensate for extreme variations in count rates. An integral pulse-height analyzer and memory accumulate data and read out spectra for detailed computer analysis on the ground.

  11. Adaptive Window Zero-Crossing-Based Instantaneous Frequency Estimation

    NASA Astrophysics Data System (ADS)

    Sekhar, S. Chandra; Sreenivas, TV

    2004-12-01

    We address the problem of estimating instantaneous frequency (IF) of a real-valued constant amplitude time-varying sinusoid. Estimation of polynomial IF is formulated using the zero-crossings of the signal. We propose an algorithm to estimate nonpolynomial IF by local approximation using a low-order polynomial, over a short segment of the signal. This involves the choice of window length to minimize the mean square error (MSE). The optimal window length found by directly minimizing the MSE is a function of the higher-order derivatives of the IF which are not available a priori. However, an optimum solution is formulated using an adaptive window technique based on the concept of intersection of confidence intervals. The adaptive algorithm enables minimum MSE-IF (MMSE-IF) estimation without requiring a priori information about the IF. Simulation results show that the adaptive window zero-crossing-based IF estimation method is superior to fixed window methods and is also better than adaptive spectrogram and adaptive Wigner-Ville distribution (WVD)-based IF estimators for different signal-to-noise ratio (SNR).

  12. A staggered-grid convolutional differentiator for elastic wave modelling

    NASA Astrophysics Data System (ADS)

    Sun, Weijia; Zhou, Binzhong; Fu, Li-Yun

    2015-11-01

    The computation of derivatives in governing partial differential equations is one of the most investigated subjects in the numerical simulation of physical wave propagation. An analytical staggered-grid convolutional differentiator (CD) for first-order velocity-stress elastic wave equations is derived in this paper by inverse Fourier transformation of the band-limited spectrum of a first derivative operator. A taper window function is used to truncate the infinite staggered-grid CD stencil. The truncated CD operator is almost as accurate as the analytical solution, and as efficient as the finite-difference (FD) method. The selection of window functions will influence the accuracy of the CD operator in wave simulation. We search for the optimal Gaussian windows for different order CDs by minimizing the spectral error of the derivative and comparing the windows with the normal Hanning window function for tapering the CD operators. It is found that the optimal Gaussian window appears to be similar to the Hanning window function for tapering the same CD operator. We investigate the accuracy of the windowed CD operator and the staggered-grid FD method with different orders. Compared to the conventional staggered-grid FD method, a short staggered-grid CD operator achieves an accuracy equivalent to that of a long FD operator, with lower computational costs. For example, an 8th order staggered-grid CD operator can achieve the same accuracy as a 16th order staggered-grid FD algorithm but with half of the computational resources and time required. Numerical examples from a homogeneous model and a crustal waveguide model are used to illustrate the superiority of the CD operators over the conventional staggered-grid FD operators for the simulation of wave propagation.
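
    To illustrate the windowed-CD idea, the sketch below builds the classical band-limited differentiator on a collocated grid, tapers it with a Hanning window, and checks it against an analytic derivative; note that the paper's operator is the staggered-grid variant with optimized Gaussian windows, which is not reproduced here.

    ```python
    import numpy as np

    def tapered_differentiator(M, window=np.hanning):
        """Band-limited first-derivative convolution kernel of half-length M,
        truncated and tapered by `window` (centred, collocated-grid version)."""
        m = np.arange(-M, M + 1)
        h = np.zeros(2 * M + 1)
        nz = m != 0
        h[nz] = (-1.0) ** m[nz] / m[nz]      # ideal band-limited differentiator
        return h * window(2 * M + 1)         # taper to suppress truncation ripple

    # check against the analytic derivative of a smooth test signal
    dx = 0.1
    x = np.arange(0, 20, dx)
    u = np.sin(2.0 * x)
    kernel = tapered_differentiator(M=8)
    du = np.convolve(u, kernel, mode="same") / dx
    err = np.max(np.abs(du[20:-20] - 2.0 * np.cos(2.0 * x)[20:-20]))
    print(f"max interior error: {err:.2e}")
    ```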

  13. Recalibration of the Multisensory Temporal Window of Integration Results from Changing Task Demands

    PubMed Central

    Mégevand, Pierre; Molholm, Sophie; Nayak, Ashabari; Foxe, John J.

    2013-01-01

    The notion of the temporal window of integration, when applied in a multisensory context, refers to the breadth of the interval across which the brain perceives two stimuli from different sensory modalities as synchronous. It maintains a unitary perception of multisensory events despite physical and biophysical timing differences between the senses. The boundaries of the window can be influenced by attention and past sensory experience. Here we examined whether task demands could also influence the multisensory temporal window of integration. We varied the stimulus onset asynchrony between simple, short-lasting auditory and visual stimuli while participants performed two tasks in separate blocks: a temporal order judgment task that required the discrimination of subtle auditory-visual asynchronies, and a reaction time task to the first incoming stimulus irrespective of its sensory modality. We defined the temporal window of integration as the range of stimulus onset asynchronies where performance was below 75% in the temporal order judgment task, as well as the range of stimulus onset asynchronies where responses showed multisensory facilitation (race model violation) in the reaction time task. In 5 of 11 participants, we observed audio-visual stimulus onset asynchronies where reaction time was significantly accelerated (indicating successful integration in this task) while performance was accurate in the temporal order judgment task (indicating successful segregation in that task). This dissociation suggests that in some participants, the boundaries of the temporal window of integration can adaptively recalibrate in order to optimize performance according to specific task demands. PMID:23951203

  14. Waveguide transition with vacuum window for multiband dynamic nuclear polarization systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rybalko, Oleksandr; Bowen, Sean; Zhurbenko, Vitaliy

    2016-05-15

    A low loss waveguide transition section and oversized microwave vacuum window covering several frequency bands (94 GHz, 140 GHz, 188 GHz) is presented. The transition is compact and was optimized for multiband Dynamic Nuclear Polarization (DNP) systems in a full-wave simulator. The window is more broadband than commercially available windows, which are usually optimized for single band operation. It is demonstrated that high-density polyethylene with urethane adhesive can be used as a low loss microwave vacuum window in multiband DNP systems. The overall assembly performance and dimensions are found using full-wave simulations. The practical aspects of the window implementation in the waveguide are discussed. To verify the design and simulation results, the window is tested experimentally at the three frequencies of interest.

  15. Accessing thermoplastic processing windows in metallic glasses using rapid capacitive discharge

    PubMed Central

    Kaltenboeck, Georg; Harris, Thomas; Sun, Kerry; Tran, Thomas; Chang, Gregory; Schramm, Joseph P.; Demetriou, Marios D.; Johnson, William L.

    2014-01-01

    The ability of the rapid-capacitive discharge approach to access optimal viscosity ranges in metallic glasses for thermoplastic processing is explored. Using high-speed thermal imaging, the heating uniformity and stability against crystallization of Zr35Ti30Cu7.5Be27.5 metallic glass heated deeply into the supercooled region is investigated. The method enables homogeneous volumetric heating of bulk samples throughout the entire supercooled liquid region at high rates (~10^5 K/s) sufficient to bypass crystallization throughout. The crystallization onsets at temperatures in the vicinity of the “crystallization nose” were identified and a Time-Temperature-Transformation diagram is constructed, revealing a “critical heating rate” for the metallic glass of ~1000 K/s. Thermoplastic process windows in the optimal viscosity range of 10^0–10^4 Pa·s are identified, being confined between the glass relaxation and the eutectic crystallization transition. Within this process window, near-net forging of a fine precision metallic glass part is demonstrated. PMID:25269892

  16. Sliding window denoising K-Singular Value Decomposition and its application on rolling bearing impact fault diagnosis

    NASA Astrophysics Data System (ADS)

    Yang, Honggang; Lin, Huibin; Ding, Kang

    2018-05-01

    In rolling bearing diagnosis, the performance of sparse feature extraction by the commonly used K-Singular Value Decomposition (K-SVD) method depends largely on the signal segment selected; furthermore, the computation is relatively slow and the dictionary becomes highly redundant when the fault signal is long. A new sliding window denoising K-SVD (SWD-KSVD) method is proposed, which uses only one small segment of the time-domain signal containing impacts to perform sliding-window dictionary learning and selects an optimal pattern carrying the oscillating information of the rolling bearing fault according to a maximum variance principle. An inner product operation between the optimal pattern and the whole fault signal is then performed to enhance the signature of the impacts' occurrence moments. Lastly, the signal is reconstructed at the peak points of the inner product to extract the rolling bearing fault features. Both simulation and experiments verify that the method extracts the fault features effectively.
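
    The enhancement step, an inner product (sliding dot product) between the learned impact pattern and the whole signal followed by peak picking, can be sketched as below; the pattern is assumed to come from the K-SVD dictionary learning described in the abstract, which is not reproduced, and the thresholds are illustrative.

    ```python
    import numpy as np

    def impact_enhancement(signal, pattern):
        """Sliding inner product between a learned impact pattern and the full
        signal; peaks indicate candidate impact (fault) occurrence moments."""
        p = pattern - pattern.mean()
        p /= (np.linalg.norm(p) + 1e-12)
        return np.correlate(signal, p, mode="same")

    def pick_impacts(score, threshold_ratio=0.6, min_gap=100):
        """Simple peak picking on the enhancement score."""
        thr = threshold_ratio * score.max()
        peaks, last = [], -min_gap
        for i in range(1, len(score) - 1):
            if score[i] > thr and score[i] >= score[i - 1] and score[i] >= score[i + 1]:
                if i - last >= min_gap:
                    peaks.append(i)
                    last = i
        return peaks
    ```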

  17. Assimilation of total lightning data using the three-dimensional variational method at convection-allowing resolution

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Zhang, Yijun; Xu, Liangtao; Zheng, Dong; Yao, Wen

    2017-08-01

    A large number of observational analyses have shown that lightning data can be used to indicate areas of deep convection. It is important to assimilate observed lightning data into numerical models, so that more small-scale information can be incorporated to improve the quality of the initial condition and the subsequent forecasts. In this study, the empirical relationship between flash rate, water vapor mixing ratio, and graupel mixing ratio was used to adjust the model relative humidity, which was then assimilated by using the three-dimensional variational data assimilation system of the Weather Research and Forecasting model in cycling mode at 10-min intervals. To find the appropriate assimilation time-window length that yielded significant improvement in both the initial conditions and subsequent forecasts, four experiments with different assimilation time-window lengths were conducted for a squall line case that occurred on 10 July 2007 in North China. It was found that 60 min was the appropriate assimilation time-window length for this case, and a longer assimilation window length was unnecessary since no further improvement was present. Forecasts of 1-h accumulated precipitation during the assimilation period and the subsequent 3-h accumulated precipitation were significantly improved compared with the control experiment without lightning data assimilation. The simulated reflectivity was optimal after 30 min of the forecast; it remained optimal during the following 42 min, and the positive effect from lightning data assimilation began to diminish after 72 min of the forecast. Overall, the improvement from lightning data assimilation can be maintained for about 3 h.

  18. Intra- to Multi-Decadal Temperature Variability over the Continental United States: 1896-2012

    USDA-ARS?s Scientific Manuscript database

    The Optimal Ranking Regime (ORR) method was used to identify intra- to multi-decadal (IMD) time windows containing significant ranking sequences in U.S. climate division temperature data. The simplicity of the ORR procedure’s output – a time series’ most significant non-overlapping periods of high o...

  19. Optimizing read-out of the NECTAr front-end electronics

    NASA Astrophysics Data System (ADS)

    Vorobiov, S.; Feinstein, F.; Bolmont, J.; Corona, P.; Delagnes, E.; Falvard, A.; Gascón, D.; Glicenstein, J.-F.; Naumann, C. L.; Nayman, P.; Ribo, M.; Sanuy, A.; Tavernet, J.-P.; Toussenel, F.; Vincent, P.

    2012-12-01

    We describe the optimization of the read-out specifications of the NECTAr front-end electronics for the Cherenkov Telescope Array (CTA). The NECTAr project aims at building and testing a demonstrator module of a new front-end electronics design, which takes advantage of the know-how acquired while building the cameras of the CAT, H.E.S.S.-I and H.E.S.S.-II experiments. The goal of the optimization work is to define the specifications of the digitizing electronics of a CTA camera, in particular the integration time window, sampling rate and analog bandwidth, using physics simulations. For this work we employed real photomultiplier pulses, sampled at 100 ps with a 600 MHz bandwidth oscilloscope. The individual pulses are drawn randomly at the times at which the photo-electrons, originating from atmospheric showers, arrive at the focal planes of imaging atmospheric Cherenkov telescopes. The timing information is extracted from the existing CTA simulations on the GRID and organized in a local database, together with all the relevant physical parameters (energy, primary particle type, zenith angle, distance from the shower axis, pixel offset from the optical axis, night-sky background level, etc.), and detector configurations (telescope types, camera/mirror configurations, etc.). While investigating the parameter space, an optimal pixel charge integration time window, which minimizes the relative error in the measured charge, has been determined. This will make it possible to gain sensitivity and to lower the energy threshold of the CTA telescopes. We present results of our optimizations and first measurements obtained using the NECTAr demonstrator module.
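
    The core of such an optimization, scanning the charge-integration window width for the value that minimizes the relative error of the reconstructed charge, can be sketched as follows. This is a numpy-only toy with an invented pulse shape, noise level and night-sky-background rate, not the NECTAr simulation chain:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    dt = 0.1                      # sampling step in ns
    t = np.arange(0.0, 60.0, dt)  # 60 ns trace

    def pulse(t0, amp=1.0, rise=1.5, fall=4.0):
        """Toy photomultiplier pulse shape (not the measured NECTAr pulses)."""
        x = np.clip(t - t0, 0.0, None)
        return amp * (1.0 - np.exp(-x / rise)) * np.exp(-x / fall)

    def relative_charge_error(window_ns, n_events=500, nsb_rate=0.1, noise=0.02):
        """Monte-Carlo estimate of the RMS relative charge error for one window."""
        errors = []
        true = pulse(20.0)[t >= 20.0].sum() * dt          # full charge of a clean pulse
        for _ in range(n_events):
            trace = pulse(20.0) + noise * rng.standard_normal(t.size)
            # add random night-sky-background pulses
            for t_nsb in rng.uniform(0.0, 60.0, rng.poisson(nsb_rate * 60.0)):
                trace += pulse(t_nsb, amp=0.3)
            sel = (t >= 20.0) & (t < 20.0 + window_ns)
            charge = trace[sel].sum() * dt
            errors.append((charge - true) / true)
        return np.sqrt(np.mean(np.square(errors)))

    windows = np.arange(2.0, 20.0, 2.0)
    errs = [relative_charge_error(w) for w in windows]
    print("optimal window (ns):", windows[int(np.argmin(errs))])
    ```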

  20. A toy model that predicts the qualitative role of bar bend in a push jerk.

    PubMed

    Santos, Aaron; Meltzer, Norman E

    2009-11-01

    In this work, we describe a simple coarse-grained model of a barbell that can be used to determine the qualitative role of bar bend during a jerk. In simulations of this model, we observed a narrow time window during which the lifter can leverage the elasticity of the bar in order to lift the weight to a maximal height. This time window shifted to later times as the weight was increased. In addition, we found that the optimal time to initiate the drive was strongly correlated with the time at which the bar had reached a maximum upward velocity after recoiling. By isolating the effect of the bar, we obtained a generalized strategy for lifting heavy weight in the jerk.
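
    The qualitative behaviour described above can be reproduced with a very small simulation. The sketch below is a minimal stand-in for such a coarse-grained model, not the authors' formulation: the plate mass, bar stiffness, damping and dip trajectory are all invented, and the "suggested drive time" is simply read off as the moment of maximum upward plate velocity after the recoil.

    ```python
    import numpy as np

    # illustrative parameters only
    m, k, c = 100.0, 2.0e4, 60.0     # plate mass [kg], bar stiffness [N/m], damping
    d, T_dip = 0.12, 0.25            # dip depth [m] and dip duration [s]
    dt, t_end = 1e-4, 1.0

    def hands_height(t):
        """Prescribed lifter dip: smooth drop by d over T_dip, then hold."""
        s = np.minimum(t, T_dip) / T_dip
        return -d * 0.5 * (1.0 - np.cos(np.pi * s))

    t = np.arange(0.0, t_end, dt)
    u = hands_height(t)
    du = np.gradient(u, dt)

    # plate displacement measured from static equilibrium (gravity cancels out)
    x, v = np.zeros_like(t), np.zeros_like(t)
    for i in range(len(t) - 1):
        a = (-k * (x[i] - u[i]) - c * (v[i] - du[i])) / m
        v[i + 1] = v[i] + a * dt
        x[i + 1] = x[i] + v[i + 1] * dt

    i_star = int(np.argmax(v))       # maximum upward plate velocity after the recoil
    print("suggested drive initiation at t = %.3f s" % t[i_star])
    ```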

  1. Three Orbital Burns to Molniya Orbit Via Shuttle/Centaur G Upper Stage

    NASA Technical Reports Server (NTRS)

    Williams, Craig H.

    2015-01-01

    An unclassified analytical trajectory design, performance, and mission study was done for the 1982 to 1986 joint National Aeronautics and Space Administration (NASA)-United States Air Force (USAF) Shuttle/Centaur G upper stage development program to send performance-demanding payloads to high orbits such as Molniya using an unconventional orbit transfer. This optimized three orbital burn transfer to Molniya orbit was compared to the then-baselined two burn transfer. The results of the three dimensional trajectory optimization performed include powered phase steering data and coast phase orbital element data. Time derivatives of the orbital elements as functions of thrust components were evaluated and used to explain the optimization's solution. Vehicle performance as a function of parking orbit inclination was given. Performance and orbital element data was provided for launch windows as functions of launch time. Ground track data was given for all burns and coasts including variation within the launch window. It was found that a Centaur with fully loaded propellant tanks could be flown from a 37 deg inclination low Earth parking orbit and achieve Molniya orbit with comparable performance to the baselined transfer which started from a 57 deg inclined orbit: 9,545 versus 9,552 lb of separated spacecraft weight, respectively. There was a significant reduction in the need for propellant launch time reserve for a 1 hr window: only 78 lb for the three burn transfer versus 320 lb for the two burn transfer. Conversely, this also meant that longer launch windows over more orbital revolutions could be done for the same amount of propellant reserve. There was no practical difference in ground tracking station or airborne assets needed to secure telemetric data, even though the geometric locations of the burns varied considerably. There was a significant adverse increase in total mission elapsed time for the three versus two burn transfer (12 vs. 1-1/4 hr), but this could be accommodated by modest modifications to Centaur systems. Future applications were discussed. The three burn transfer was found to be a viable, arguably preferable, alternative to the two burn transfer.

  2. Three Orbital Burns to Molniya Orbit via Shuttle Centaur G Upper Stage

    NASA Technical Reports Server (NTRS)

    Williams, Craig H.

    2014-01-01

    An unclassified analytical trajectory design, performance, and mission study was done for the 1982-86 joint NASA-USAF Shuttle/Centaur G upper stage development program to send performance-demanding payloads to high orbits such as Molniya using an unconventional orbit transfer. This optimized three orbital burn transfer to Molniya orbit was compared to the then-baselined two burn transfer. The results of the three dimensional trajectory optimization performed include powered phase steering data and coast phase orbital element data. Time derivatives of the orbital elements as functions of thrust components were evaluated and used to explain the optimization's solution. Vehicle performance as a function of parking orbit inclination was given. Performance and orbital element data was provided for launch windows as functions of launch time. Ground track data was given for all burns and coasts including variation within the launch window. It was found that a Centaur with fully loaded propellant tanks could be flown from a 37 deg inclination low Earth parking orbit and achieve Molniya orbit with comparable performance to the baselined transfer which started from a 57 deg inclined orbit: 9,545 lb vs. 9,552 lb of separated spacecraft weight, respectively. There was a significant reduction in the need for propellant launch time reserve for a one hour window: only 78 lb for the three burn transfer vs. 320 lb for the two burn transfer. Conversely, this also meant that longer launch windows over more orbital revolutions could be done for the same amount of propellant reserve. There was no practical difference in ground tracking station or airborne assets needed to secure telemetric data, even though the geometric locations of the burns varied considerably. There was a significant adverse increase in total mission elapsed time for the three vs. two burn transfer (12 vs. 1-1/4 hrs), but this could be accommodated by modest modifications to Centaur systems. Future applications were discussed. The three burn transfer was found to be a viable, arguably preferable, alternative to the two burn transfer.

  3. Optimization of energy window and evaluation of scatter compensation methods in MPS using the ideal observer with model mismatch

    NASA Astrophysics Data System (ADS)

    Ghaly, Michael; Links, Jonathan M.; Frey, Eric

    2015-03-01

    In this work, we used the ideal observer (IO) and IO with model mismatch (IO-MM) applied in the projection domain and an anthropomorphic Channelized Hotelling Observer (CHO) applied to reconstructed images to optimize the acquisition energy window width and evaluate various scatter compensation methods in the context of a myocardial perfusion SPECT defect detection task. The IO has perfect knowledge of the image formation process and thus reflects performance with perfect compensation for image-degrading factors. Thus, using the IO to optimize imaging systems could lead to suboptimal parameters compared to those optimized for humans interpreting SPECT images reconstructed with imperfect or no compensation. The IO-MM allows incorporating imperfect system models into the IO optimization process. We found that with near-perfect scatter compensation, the optimal energy window for the IO and CHO were similar; in its absence the IO-MM gave a better prediction of the optimal energy window for the CHO using different scatter compensation methods. These data suggest that the IO-MM may be useful for projection-domain optimization when model mismatch is significant, and that the IO is useful when followed by reconstruction with good models of the image formation process.
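
    The channelized Hotelling observer used for the reconstructed-image comparisons has a compact linear-algebra core. The sketch below is a generic CHO on synthetic data (square frequency channels, white-noise backgrounds and an invented defect, rather than the study's anthropomorphic channels and SPECT ensembles); it computes the channel-space Hotelling template and the corresponding detectability index.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    nimg = 400

    # toy rotationally symmetric frequency channels (stand-ins for anthropomorphic channels)
    yy, xx = np.mgrid[-16:16, -16:16]
    r = np.hypot(xx, yy).ravel()
    edges = np.linspace(0.0, 16.0, 7)
    U = np.stack([((r >= lo) & (r < hi)).astype(float)
                  for lo, hi in zip(edges[:-1], edges[1:])], axis=1)   # npix x nchan

    # synthetic defect-absent / defect-present image ensembles
    defect = 0.5 * np.exp(-(xx**2 + yy**2) / 8.0).ravel()
    g_absent = rng.normal(10.0, 1.0, (nimg, r.size))
    g_present = g_absent + defect

    v_a, v_p = g_absent @ U, g_present @ U           # channel outputs
    dv = v_p.mean(0) - v_a.mean(0)
    K = 0.5 * (np.cov(v_a.T) + np.cov(v_p.T))        # intra-class channel covariance
    w = np.linalg.solve(K, dv)                       # Hotelling template in channel space
    d_index = np.sqrt(dv @ w)                        # detectability (SNR) index
    print("CHO detectability:", round(float(d_index), 2))
    ```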

  4. Prospective PET image quality gain calculation method by optimizing detector parameters.

    PubMed

    Theodorakis, Lampros; Loudos, George; Prassopoulos, Vasilios; Kappas, Constantine; Tsougos, Ioannis; Georgoulias, Panagiotis

    2015-12-01

    Lutetium-based scintillators with high-performance electronics introduced time-of-flight (TOF) reconstruction in the clinical setting. Let G' be the total signal to noise ratio gain in a reconstructed image using the TOF kernel compared with conventional reconstruction modes. G' is then the product of the G1 gain arising from the reconstruction process itself and (n-1) other gain factors (G2, G3, … Gn) arising from the inherent properties of the detector. We calculated the G2 and G3 gains resulting from the optimization of the coincidence and energy window width for prompts and singles, respectively. Both quantitative and image-based validated Monte Carlo models of Lu2SiO5 (LSO) TOF-permitting and Bi4Ge3O12 (BGO) TOF-nonpermitting detectors were used for the calculations. G2 and G3 values were 1.05 and 1.08 for the BGO detector and G3 was 1.07 for the LSO. A value of almost unity for G2 of the LSO detector indicated a nonsignificant optimization by altering the energy window setting. G' was found to be ∼1.4 times higher for the TOF-permitting detector after reconstruction and optimization of the coincidence and energy windows. The method described could potentially predict image noise variations by altering detector acquisition parameters. It could also further contribute toward a long-lasting debate related to cost-efficiency issues of TOF scanners versus non-TOF ones. Some vendors are nowadays returning to non-TOF product line designs in an effort to reduce crystal costs. Therefore, exploring the limits of image quality gain by altering the parameters of these detectors remains a topical issue.
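
    The multiplicative structure of the gain can be written out explicitly. As a back-of-the-envelope check using only the factors quoted above (an illustration, not the full calculation), the two window-related factors reported for the BGO detector multiply to roughly 1.13:

    ```latex
    G' = \prod_{i=1}^{n} G_i = G_1\,G_2\,G_3 \cdots G_n,
    \qquad
    \left. G_2\,G_3 \right|_{\mathrm{BGO}} \approx 1.05 \times 1.08 \approx 1.13 .
    ```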

  5. Optimal routing of coordinated aircraft to Identify moving surface contacts

    DTIC Science & Technology

    2017-06-01

    Snippets: ... Time; TAO, Tactical Action Officer; TSP, Traveling Salesman Problem; TSPTW, TSP with Time Windows; UAV, unmanned aerial vehicle; VRP, Vehicle Routing ... Orienteering Problem (OP), while the ORCA TI formulation follows the structure of a time-dependent Traveling Salesman Problem (TSP), or a time-dependent ... Fox, Kenneth R., Bezalel Gavish, and Stephen C. Graves. 1980. "An n-Constraint Formulation of the (Time Dependent) Traveling Salesman Problem."

  6. Optimization of knowledge sharing through multi-forum using cloud computing architecture

    NASA Astrophysics Data System (ADS)

    Madapusi Vasudevan, Sriram; Sankaran, Srivatsan; Muthuswamy, Shanmugasundaram; Ram, N. Sankar

    2011-12-01

    Knowledge sharing is done through various knowledge sharing forums, which require multiple logins through multiple browser instances. Here a single multi-forum knowledge sharing concept is introduced which requires only one login session, allowing the user to connect to multiple forums and display the data in a single browser window. A few optimization techniques are also introduced to speed up the access time using a cloud computing architecture.

  7. An Integer Programming Model For Solving Heterogeneous Vehicle Routing Problem With Hard Time Window considering Service Choice

    NASA Astrophysics Data System (ADS)

    Susilawati, Enny; Mawengkang, Herman; Efendi, Syahril

    2018-01-01

    Generally, a Vehicle Routing Problem with Time Windows (VRPTW) can be defined as the problem of determining the optimal set of routes used by a fleet of vehicles to serve a given set of customers with service time restrictions; the objective is to minimize the total travel cost (related to the travel times or distances) and the operational cost (related to the number of vehicles used). In this paper we address a variant of the VRPTW in which the fleet of vehicles is heterogeneous due to the differing sizes of customer demands. The problem, called the Heterogeneous VRP (HVRP), also includes service levels. We use an integer programming model to describe the problem. A feasible neighbourhood approach is proposed to solve the model.
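
    The hard-time-window constraint at the heart of such formulations is easy to state procedurally: a vehicle may wait for a window to open but may never arrive after it closes. The sketch below is a generic feasibility and cost check for one candidate route (an illustration with a made-up instance, not the paper's integer programming model or its neighbourhood search):

    ```python
    from math import inf

    def evaluate_route(route, travel, windows, service, capacity, demand):
        """Return (feasible, arrival_times, total_travel) for a single route.
        route   : list of node ids, depot = 0, e.g. [0, 1, 2, 0]
        travel  : dict (i, j) -> travel time
        windows : dict i -> (earliest, latest) hard time window
        service : dict i -> service duration; demand/capacity in the same units."""
        if sum(demand.get(i, 0) for i in route) > capacity:
            return False, [], inf
        t, total, arrivals = 0.0, 0.0, []
        for prev, cur in zip(route, route[1:]):
            total += travel[prev, cur]
            t += travel[prev, cur]
            earliest, latest = windows[cur]
            t = max(t, earliest)        # waiting until the window opens is allowed
            if t > latest:              # arriving after the window closes is not
                return False, arrivals, inf
            arrivals.append(t)
            t += service.get(cur, 0.0)
        return True, arrivals, total

    # tiny example: depot 0 and customers 1-3
    travel = {(i, j): 10.0 for i in range(4) for j in range(4) if i != j}
    windows = {0: (0, 1000), 1: (0, 50), 2: (40, 90), 3: (100, 150)}
    service = {1: 5.0, 2: 5.0, 3: 5.0}
    demand = {1: 2, 2: 3, 3: 1}
    print(evaluate_route([0, 1, 2, 3, 0], travel, windows, service, 10, demand))
    ```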

  8. Intelligent Hybrid Vehicle Power Control - Part 1: Machine Learning of Optimal Vehicle Power

    DTIC Science & Technology

    2012-06-30

    time window [t - W_DT, t): v_ave, v_max, v_min, a_c, v_st and v_end, where the first four parameters are, respectively, the average speed, maximum speed ... minimum speed and average acceleration during the time period [t - W_DT, t); v_st is the vehicle speed at (t - W_DT), and v_end is the vehicle

  9. Experimental study on the crack detection with optimized spatial wavelet analysis and windowing

    NASA Astrophysics Data System (ADS)

    Ghanbari Mardasi, Amir; Wu, Nan; Wu, Christine

    2018-05-01

    In this paper, highly sensitive crack detection is experimentally realized and presented on a beam under a certain deflection by optimizing spatial wavelet analysis. Due to the existence of a crack in the beam structure, a perturbation/slope singularity is induced in the deflection profile. The spatial wavelet transformation works as a magnifier to amplify the small perturbation signal at the crack location in order to detect and localize the damage. The profile of a deflected aluminum cantilever beam is obtained for both intact and cracked beams by a high resolution laser profile sensor. The Gabor wavelet transformation is applied to the subtraction of the intact and cracked data sets. To improve detection sensitivity, the scale factor of the spatial wavelet transformation and the number of transformation repetitions are optimized. Furthermore, to detect a possible crack close to the measurement boundaries, the wavelet transformation edge effect, which induces large values of the wavelet coefficient around the measurement boundaries, is efficiently reduced by introducing different windowing functions. The results show that a small crack with a depth of less than 10% of the beam height can be localized with a clear perturbation. Moreover, the perturbation caused by a crack 0.85 mm away from one end of the measurement range, which would otherwise be masked by the wavelet transform edge effect, emerges when proper window functions are applied.
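
    The wavelet "magnifier" idea can be illustrated in a few lines: convolve the difference between the intact and cracked deflection profiles with a Gabor-type wavelet and look for the coefficient peak, excluding a guard band against the edge effect. This is a schematic numpy reconstruction with an invented beam, crack and wavelet scale, not the experimental processing chain.

    ```python
    import numpy as np

    # synthetic deflection profiles of a cantilever beam (length normalized to 1)
    x = np.linspace(0.0, 1.0, 2000)
    w_intact = x**2 * (3.0 - x) / 2.0                 # smooth cubic deflection shape
    crack_pos = 0.35
    # crude crack model: a small slope discontinuity beyond the crack location
    w_cracked = w_intact + 2e-4 * np.clip(x - crack_pos, 0.0, None)

    diff = w_cracked - w_intact                       # subtraction of the two data sets

    def gabor_wavelet(n=401, sigma=1.0, freq=5.0):
        """Real part of a Gabor (Gaussian-modulated cosine) wavelet."""
        u = np.linspace(-4.0, 4.0, n)
        return np.exp(-u**2 / (2.0 * sigma**2)) * np.cos(freq * u)

    coeff = np.abs(np.convolve(diff, gabor_wavelet(), mode="same"))

    # guard band against the wavelet-transform edge effect discussed in the paper
    margin = 200
    inner = slice(margin, len(x) - margin)
    i_peak = margin + int(np.argmax(coeff[inner]))
    print("estimated crack location: x = %.3f (true %.2f)" % (x[i_peak], crack_pos))
    ```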

  10. Time-frequency analysis-based time-windowing algorithm for the inverse synthetic aperture radar imaging of ships

    NASA Astrophysics Data System (ADS)

    Zhou, Peng; Zhang, Xi; Sun, Weifeng; Dai, Yongshou; Wan, Yong

    2018-01-01

    An algorithm based on time-frequency analysis is proposed to select an imaging time window for the inverse synthetic aperture radar imaging of ships. An appropriate range bin is selected to perform the time-frequency analysis after radial motion compensation. The selected range bin is that with the maximum mean amplitude among the range bins whose echoes are confirmed to be contributed by a dominant scatterer. The criterion for judging whether the echoes of a range bin are contributed by a dominant scatterer is key to the proposed algorithm and is therefore described in detail. When the first range bin that satisfies the judgment criterion is found, a sequence composed of the frequencies that have the largest amplitudes in each moment's time-frequency spectrum corresponding to this range bin is employed to calculate the length and the center moment of the optimal imaging time window. Experiments performed with simulation data and real data show the effectiveness of the proposed algorithm, and comparisons between the proposed algorithm and the image contrast-based algorithm (ICBA) are provided. Similar image contrast and lower entropy are acquired using the proposed algorithm as compared with those values when using the ICBA.

  11. Helicopter TEM parameters analysis and system optimization based on time constant

    NASA Astrophysics Data System (ADS)

    Xiao, Pan; Wu, Xin; Shi, Zongyang; Li, Jutao; Liu, Lihua; Fang, Guangyou

    2018-03-01

    The helicopter transient electromagnetic (TEM) method is a common geophysical prospecting method, widely used in mineral detection, underground water exploration and environmental investigation. In order to develop an efficient helicopter TEM system, it is necessary to analyze and optimize the system parameters. In this paper, a simple and quantitative method is proposed to analyze the system parameters, such as waveform, power, base frequency, measured field and sampling time. A wire loop model is used to define a comprehensive 'time constant domain' that represents a range of time constants, analogous to a range of conductances, after which the characteristics of the system parameters in this domain are obtained. It is found that the distortion caused by the transmitting base frequency is less than 5% when the ratio of the transmitting period to the target time constant is greater than 6. When the sampling time window is less than the target time constant, the distortion caused by the sampling time window is less than 5%. According to this method, a helicopter TEM system, called CASHTEM, was designed, and a flight test was carried out in a known mining area. The test results show that the system has good detection performance, verifying the effectiveness of the method.

  12. The effect of blood acceleration on the ultrasound power Doppler spectrum

    NASA Astrophysics Data System (ADS)

    Matchenko, O. S.; Barannik, E. A.

    2017-09-01

    The purpose of the present work was to study the influence of blood acceleration and time window length on the power Doppler spectrum for Gaussian ultrasound beams. The work has been carried out on the basis of a continuum model of ultrasound scattering from inhomogeneities in fluid flow. The correlation function of the fluctuations has been considered for uniformly accelerated scatterers, and the resulting power Doppler spectra have been calculated. It is shown that within the initial phase of systole, uniformly accelerated slow blood flow in the pulmonary artery and aorta tends to make the correlation function about 4.89 and 7.83 times wider, respectively, than the sensitivity function of a typical probing system. At peak flow velocities, the sensitivity function becomes, vice versa, about 4.34 and 3.84 times wider, respectively, than the correlation function. In these limiting cases, the resulting spectra can be considered Gaussian. The optimal time window duration decreases with increasing acceleration of blood flow and equals 11.62 and 7.54 ms for the pulmonary artery and aorta, respectively. The width of the resulting power Doppler spectrum is shown to be defined mostly by the wave vector of the incident field, the duration of the signal and the acceleration of the scatterers in the case of low flow velocities. In the opposite case, the geometrical properties of the probing field and the average velocity itself are more essential. In terms of the signal-to-noise ratio, an optimal duration of the time window can be found. The abovementioned results may contribute to improved techniques of Doppler ultrasound diagnostics of the cardiovascular system.

  13. Methodology of mixed load customized bus lines and adjustment based on time windows

    PubMed Central

    Song, Rui

    2018-01-01

    Customized bus routes need to be optimized to meet the personalized trip demands of different passengers. This paper introduces a customized bus routing problem in which the trips for each depot are given, and each bus stop has a fixed time window within which trips should be completed. Treating a trip as a virtual stop, as in the school bus routing problem (SBRP), was the first consideration in solving the problem. Then, a mixed-load customized bus routing model with time windows was established and solved with the CPLEX software. Finally, a simple network with three depots, four pickup stops, and five delivery stops was constructed to verify the correctness of the model. Based on this example, the buses ran a total of 124.42 kilometers, which is 10.35 kilometers less than before. The paths and departure times of the different buses provided by the model were evaluated to meet the given conditions, thus providing valuable information for practical operations.

  14. Optimal Design for Hetero-Associative Memory: Hippocampal CA1 Phase Response Curve and Spike-Timing-Dependent Plasticity

    PubMed Central

    Miyata, Ryota; Ota, Keisuke; Aonishi, Toru

    2013-01-01

    Recently reported experimental findings suggest that the hippocampal CA1 network stores spatio-temporal spike patterns and retrieves temporally reversed and spread-out patterns. In this paper, we explore the idea that the properties of the neural interactions and the synaptic plasticity rule in the CA1 network enable it to function as a hetero-associative memory recalling such reversed and spread-out spike patterns. In line with Lengyel's speculation (Lengyel et al., 2005), we first derive optimally designed spike-timing-dependent plasticity (STDP) rules that are matched to neural interactions formalized in terms of phase response curves (PRCs) for performing the hetero-associative memory function. By maximizing objective functions formulated in terms of mutual information for evaluating memory retrieval performance, we search for STDP window functions that are optimal for retrieval of normal and doubly spread-out patterns under the constraint that the PRCs are those of CA1 pyramidal neurons. The system, which can retrieve normal and doubly spread-out patterns, can also retrieve reversed patterns with the same quality. Finally, we demonstrate that the purposely designed STDP window functions qualitatively conform to typical ones found in CA1 pyramidal neurons.

  15. Determination of Equine Cytochrome c Backbone Amide Hydrogen/Deuterium Exchange Rates by Mass Spectrometry Using a Wider Time Window and Isotope Envelope.

    PubMed

    Hamuro, Yoshitomo

    2017-03-01

    A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized exchange rate and deuterium retention during the analysis of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.

  16. Determination of Equine Cytochrome c Backbone Amide Hydrogen/Deuterium Exchange Rates by Mass Spectrometry Using a Wider Time Window and Isotope Envelope

    NASA Astrophysics Data System (ADS)

    Hamuro, Yoshitomo

    2017-03-01

    A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized exchange rate and deuterium retention during the analysis of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.

  17. Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design

    NASA Astrophysics Data System (ADS)

    Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.

    1987-04-01

    Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi, who is a Deming award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 µm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.

  18. Impulsive noise suppression in color images based on the geodesic digital paths

    NASA Astrophysics Data System (ADS)

    Smolka, Bogdan; Cyganek, Boguslaw

    2015-02-01

    In the paper a novel filtering design based on the concept of exploration of the pixel neighborhood by digital paths is presented. The paths start from the boundary of a filtering window and reach its center. The cost of transitions between adjacent pixels is defined in the hybrid spatial-color space. Then, an optimal path of minimum total cost, leading from pixels of the window's boundary to its center is determined. The cost of an optimal path serves as a degree of similarity of the central pixel to the samples from the local processing window. If a pixel is an outlier, then all the paths starting from the window's boundary will have high costs and the minimum one will also be high. The filter output is calculated as a weighted mean of the central pixel and an estimate constructed using the information on the minimum cost assigned to each image pixel. So, first the costs of optimal paths are used to build a smoothed image and in the second step the minimum cost of the central pixel is utilized for construction of the weights of a soft-switching scheme. The experiments performed on a set of standard color images, revealed that the efficiency of the proposed algorithm is superior to the state-of-the-art filtering techniques in terms of the objective restoration quality measures, especially for high noise contamination ratios. The proposed filter, due to its low computational complexity, can be applied for real time image denoising and also for the enhancement of video streams.

  19. Error-based analysis of optimal tuning functions explains phenomena observed in sensory neurons.

    PubMed

    Yaeli, Steve; Meir, Ron

    2010-01-01

    Biological systems display impressive capabilities in effectively responding to environmental signals in real time. There is increasing evidence that organisms may indeed be employing near optimal Bayesian calculations in their decision-making. An intriguing question relates to the properties of optimal encoding methods, namely determining the properties of neural populations in sensory layers that optimize performance, subject to physiological constraints. Within an ecological theory of neural encoding/decoding, we show that optimal Bayesian performance requires neural adaptation which reflects environmental changes. Specifically, we predict that neuronal tuning functions possess an optimal width, which increases with prior uncertainty and environmental noise, and decreases with the decoding time window. Furthermore, even for static stimuli, we demonstrate that dynamic sensory tuning functions, acting at relatively short time scales, lead to improved performance. Interestingly, the narrowing of tuning functions as a function of time was recently observed in several biological systems. Such results set the stage for a functional theory which may explain the high reliability of sensory systems, and the utility of neuronal adaptation occurring at multiple time scales.

  20. Modeling municipal solid waste collection: A generalized vehicle routing model with multiple transfer stations, gather sites and inhomogeneous vehicles in time windows.

    PubMed

    Son, Le Hoang; Louati, Amal

    2016-06-01

    Municipal Solid Waste (MSW) collection is a necessary process in any municipality, affecting quality of life, economic aspects and urban structure. The intrinsic nature of MSW collection relates to the development of effective vehicle routing models that optimize the total traveling distances of vehicles, the environmental emissions and the investment costs. In this article, we propose a generalized vehicle routing model including multiple transfer stations, gather sites and inhomogeneous vehicles in time windows for MSW collection. It takes into account traveling on one-way routes, the number of vehicles per m² and waiting time at traffic stops for reduction of operational time. The proposed model could be used for scenarios having similar node structures and vehicles' characteristics. A case study of Danang city, Vietnam is given to illustrate the applicability of this model. The experimental results have clearly shown that the new model reduces both the total traveling distances and the operational hours of vehicles in comparison with those of practical scenarios. Optimal routes of vehicles on streets and markets at Danang are given. Those results are significant to practitioners and local policy makers. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. A Simulation Study of Paced TCP

    NASA Technical Reports Server (NTRS)

    Kulik, Joanna; Coulter, Robert; Rockwell, Dennis; Partridge, Craig

    2000-01-01

    In this paper, we study the performance of paced TCP, a modified version of TCP designed especially for high delay-bandwidth networks. In typical networks, TCP optimizes its send rate by transmitting increasingly large bursts, or windows, of packets, one burst per round-trip time, until it reaches a maximum window size, which corresponds to the full capacity of the network. In a network with a high delay-bandwidth product, however, TCP's maximum window size may be larger than the queue size of the intermediate routers, and routers will begin to drop packets as soon as the windows become too large for the router queues. The TCP sender then concludes that the bottleneck capacity of the network has been reached, and it limits its send rate accordingly. Partridge proposed paced TCP as a means of solving the problem of queueing bottlenecks. A sender using paced TCP would release packets in multiple, small bursts during a round-trip time in which ordinary TCP would release a single, large burst of packets. This approach allows the sender to increase its send rate to the maximum window size without encountering queueing bottlenecks. This paper describes the performance of paced TCP in a simulated network and discusses implementation details that can affect the performance of paced TCP.
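
    The pacing idea reduces to spreading one congestion window's worth of packets evenly over a round-trip time instead of sending them back to back. A toy calculation (invented link numbers, not the paper's simulation setup) of the resulting inter-packet gap:

    ```python
    def pacing_interval(cwnd_pkts, rtt_s):
        """Inter-packet gap that spreads one congestion window evenly over one RTT."""
        return rtt_s / cwnd_pkts

    rtt = 0.6                    # s, a high-delay path (e.g. satellite)
    link_rate = 2_000_000 / 8    # bottleneck rate in bytes/s
    mss = 1460                   # bytes per packet
    router_queue = 50            # packets the bottleneck router can buffer

    # window needed to fill the pipe (delay-bandwidth product in packets)
    bdp_pkts = int(link_rate * rtt / mss)
    print("window to fill the pipe:", bdp_pkts, "packets")

    # ordinary TCP: one back-to-back burst per RTT, which can overflow the queue
    print("un-paced burst overflows the queue:", bdp_pkts > router_queue)

    # paced TCP: the same window released as evenly spaced packets
    gap_ms = 1e3 * pacing_interval(bdp_pkts, rtt)
    print("paced TCP sends one packet every %.2f ms" % gap_ms)
    ```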

  2. Adaptive synchrosqueezing based on a quilted short-time Fourier transform

    NASA Astrophysics Data System (ADS)

    Berrian, Alexander; Saito, Naoki

    2017-08-01

    In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.

  3. Real-time image reconstruction and display system for MRI using a high-speed personal computer.

    PubMed

    Haishi, T; Kose, K

    1998-09-01

    A real-time NMR image reconstruction and display system was developed using a high-speed personal computer and optimized for the 32-bit multitasking Microsoft Windows 95 operating system. The system was operated at various CPU clock frequencies by changing the motherboard clock frequency and the processor/bus frequency ratio. When the Pentium CPU was used at the 200 MHz clock frequency, the reconstruction time for one 128 x 128 pixel image was 48 ms and that for the image display on the enlarged 256 x 256 pixel window was about 8 ms. NMR imaging experiments were performed with three fast imaging sequences (FLASH, multishot EPI, and one-shot EPI) to demonstrate the ability of the real-time system. It was concluded that in most cases, a high-speed PC would be the best choice for the image reconstruction and display system for real-time MRI. Copyright 1998 Academic Press.
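
    The reconstruction step of such a pipeline is essentially a 2-D inverse FFT of each 128 x 128 k-space frame. A minimal timing sketch (numpy on a present-day PC, not the original Windows 95 implementation):

    ```python
    import numpy as np
    import time

    # synthetic 128 x 128 complex k-space frame
    kspace = np.random.randn(128, 128) + 1j * np.random.randn(128, 128)

    n_rep = 1000
    t0 = time.perf_counter()
    for _ in range(n_rep):
        image = np.abs(np.fft.fftshift(np.fft.ifft2(kspace)))   # magnitude reconstruction
    dt_ms = 1e3 * (time.perf_counter() - t0) / n_rep
    print("per-frame reconstruction time: %.3f ms" % dt_ms)
    ```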

  4. A probabilistic method for the estimation of residual risk in donated blood.

    PubMed

    Bish, Ebru K; Ragavan, Prasanna K; Bish, Douglas R; Slonim, Anthony D; Stramer, Susan L

    2014-10-01

    The residual risk (RR) of transfusion-transmitted infections, including the human immunodeficiency virus and hepatitis B and C viruses, is typically estimated by the incidence × window-period model, which relies on the following restrictive assumptions: Each screening test, with probability 1, (1) detects an infected unit outside of the test's window period; (2) fails to detect an infected unit within the window period; and (3) correctly identifies an infection-free unit. These assumptions need not hold in practice due to random or systemic errors and individual variations in the window period. We develop a probability model that accurately estimates the RR by relaxing these assumptions, and quantify their impact using a published cost-effectiveness study and also within an optimization model. These assumptions lead to inaccurate estimates in cost-effectiveness studies and to sub-optimal solutions in the optimization model. The testing solution generated by the optimization model translates into fewer expected infections without an increase in the testing cost. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
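
    The contrast between the classical estimate and a relaxed model can be made concrete with a few lines of arithmetic. The sketch below uses invented parameter values (not the published model, its notation, or its data) to compute the textbook incidence × window-period residual risk and a variant in which the three assumptions above are allowed to fail.

    ```python
    # classical incidence x window-period estimate (per donation)
    incidence = 3.0e-5        # new infections per person-year among donors (made up)
    window_days = 9.0         # infectious window period of the screening test (made up)
    rr_classical = incidence * window_days / 365.0

    # relaxed model: the test may miss infections outside the window (assumption 1),
    # may occasionally detect inside the window (assumption 2), with invented values
    sens_outside = 0.999      # P(detect | infected, outside window)
    sens_inside = 0.02        # P(detect | infected, inside window)
    prev_outside = 1.0e-4     # prevalence of established, detectable infection per donation

    rr_relaxed = (incidence * window_days / 365.0) * (1.0 - sens_inside) \
                 + prev_outside * (1.0 - sens_outside)

    print("classical RR per donation: %.2e" % rr_classical)
    print("relaxed   RR per donation: %.2e" % rr_relaxed)
    ```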

  5. Human Mars Mission: Launch Window from Earth Orbit. Pt. 1

    NASA Technical Reports Server (NTRS)

    Young, Archie

    1999-01-01

    The determination of orbital window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the earth parking orbit. The orientation of the escape hyperbola asymptote relative to the earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as a nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle and plane change and, therefore, a delta V penalty. Usually, because of the delta V penalty the actual departure injection window is smaller in duration than that determined by energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit. These modes are: 1) One impulsive maneuver from a Highly Elliptical Orbit (HEO); 2) Two impulsive maneuvers from a Highly Elliptical Orbit (HEO); 3) One impulsive maneuver from a Low Earth Orbit (LEO); 4) Two impulsive maneuvers from LEO; and 5) Three impulsive maneuvers from LEO. The formulation of these five different launch window modes provides a rapid means of generating realistic parametric data for space exploration studies. Also, the formulation provides vector and geometrical data sufficient for use as a good starting point in detailed trajectory analysis based on calculus of variations, steepest descent, or parameter optimization program techniques.

  6. Human Exploration Missions Study Launch Window from Earth Orbit

    NASA Technical Reports Server (NTRS)

    Young, Archie

    2001-01-01

    The determination of orbital launch window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the earth parking orbit. The orientation of the escape hyperbola asymptote relative to the earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as a nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle and plane change and, therefore, a Delta(V) penalty. Usually, because of the Delta(V) penalty the actual departure injection window is smaller in duration than that determined by energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit. These modes are: (1) One impulsive maneuver from a Low Earth Orbit (LEO), (2) Two impulsive maneuvers from LEO, (3) Three impulsive maneuvers from LEO, (4) One impulsive maneuver from a Highly Elliptical Orbit (HEO), (5) Two impulsive maneuvers from a Highly Elliptical Orbit (HEO). The formulation of these five different launch window modes provides a rapid means of generating realistic parametric data for space exploration studies. Also, the formulation provides vector and geometrical data sufficient for use as a good starting point in detailed trajectory analysis based on calculus of variations, steepest descent, or parameter optimization program techniques.

  7. Aggregation of Electric Current Consumption Features to Extract Maintenance KPIs

    NASA Astrophysics Data System (ADS)

    Simon, Victor; Johansson, Carl-Anders; Galar, Diego

    2017-09-01

    All electric powered machines offer the possibility of extracting information and calculating Key Performance Indicators (KPIs) from the electric current signal. Depending on the time window, sampling frequency and type of analysis, different indicators from the micro to macro level can be calculated for such aspects as maintenance, production, energy consumption etc. On the micro-level, the indicators are generally used for condition monitoring and diagnostics and are normally based on a short time window and a high sampling frequency. The macro indicators are normally based on a longer time window with a slower sampling frequency and are used as indicators for overall performance, cost or consumption. The indicators can be calculated directly from the current signal but can also be based on a combination of information from the current signal and operational data like rpm, position etc. One or several of those indicators can be used for prediction and prognostics of a machine's future behavior. This paper uses this technique to calculate indicators for maintenance and energy optimization in electric powered machines and fleets of machines, especially machine tools.

  8. Probabilistic objective functions for sensor management

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald P. S.; Zajic, Tim R.

    2004-08-01

    This paper continues the investigation of a foundational and yet potentially practical basis for control-theoretic sensor management, using a comprehensive, intuitive, system-level Bayesian paradigm based on finite-set statistics (FISST). In this paper we report our most recent progress, focusing on multistep look-ahead -- i.e., allocation of sensor resources throughout an entire future time-window. We determine future sensor states in the time-window using a "probabilistically natural" sensor management objective function, the posterior expected number of targets (PENT). This objective function is constructed using a new "maxi-PIMS" optimization strategy that hedges against unknowable future observation-collections. PENT is used in conjunction with approximate multitarget filters: the probability hypothesis density (PHD) filter or the multi-hypothesis correlator (MHC) filter.

  9. Mechanisms Underlying Decision-Making as Revealed by Deep-Brain Stimulation in Patients with Parkinson's Disease.

    PubMed

    Herz, Damian M; Little, Simon; Pedrosa, David J; Tinkhauser, Gerd; Cheeran, Binith; Foltynie, Tom; Bogacz, Rafal; Brown, Peter

    2018-04-23

    To optimally balance opposing demands of speed and accuracy during decision-making, we must flexibly adapt how much evidence we require before making a choice. Such adjustments in decision thresholds have been linked to the subthalamic nucleus (STN), and therapeutic STN deep-brain stimulation (DBS) has been shown to interfere with this function. Here, we performed continuous as well as closed-loop DBS of the STN while Parkinson's disease patients performed a perceptual decision-making task. Closed-loop STN DBS allowed temporally patterned STN stimulation and simultaneous recordings of STN activity. This revealed that DBS only affected patients' ability to adjust decision thresholds if applied in a specific temporally confined time window during deliberation. Only stimulation in that window diminished the normal slowing of response times that occurred on difficult trials when DBS was turned off. Furthermore, DBS eliminated a relative, time-specific increase in STN beta oscillations and compromised its functional relationship with trial-by-trial adjustments in decision thresholds. Together, these results provide causal evidence that the STN is involved in adjusting decision thresholds in distinct, time-limited processing windows during deliberation. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.

  10. A genetic algorithm for dynamic inbound ordering and outbound dispatching problem with delivery time windows

    NASA Astrophysics Data System (ADS)

    Kim, Byung Soo; Lee, Woon-Seek; Koh, Shiegheun

    2012-07-01

    This article considers an inbound ordering and outbound dispatching problem for a single product in a third-party warehouse, where the demands are dynamic over a discrete and finite time horizon and, moreover, each demand has a time window in which it must be satisfied. Replenishing orders are shipped in containers and the freight cost is proportional to the number of containers used. The problem is classified into two cases, i.e. the non-split demand case and the split demand case, and a mathematical model for each case is presented. An in-depth analysis of the models shows that they are very complicated and that optimal solutions are difficult to find as the problem size becomes large. Therefore, genetic algorithm (GA) based heuristic approaches are designed to solve the problems in a reasonable time. To validate and evaluate the algorithms, finally, some computational experiments are conducted.

  11. Mitigation of time-varying distortions in Nyquist-WDM systems using machine learning

    NASA Astrophysics Data System (ADS)

    Granada Torres, Jhon J.; Varughese, Siddharth; Thomas, Varghese A.; Chiuchiarelli, Andrea; Ralph, Stephen E.; Cárdenas Soto, Ana M.; Guerrero González, Neil

    2017-11-01

    We propose a machine learning-based nonsymmetrical demodulation technique relying on clustering to mitigate time-varying distortions derived from several impairments such as IQ imbalance, bias drift, phase noise and interchannel interference. Experimental results show that those impairments cause centroid movements in the received constellations seen in time windows of 10k symbols in controlled scenarios. In our demodulation technique, the k-means algorithm iteratively identifies the cluster centroids in the constellation of the received symbols in short time windows by means of the optimization of decision thresholds for a minimum BER. We experimentally verified the effectiveness of this computationally efficient technique in multicarrier 16QAM Nyquist-WDM systems over 270 km links. Our nonsymmetrical demodulation technique outperforms the conventional QAM demodulation technique, reducing the OSNR requirement by up to ∼0.8 dB at a BER of 1 × 10^-2 for signals affected by interchannel interference.
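
    The clustering step of such a nonsymmetrical demodulator can be sketched compactly: re-estimate the 16 constellation centroids inside each short time window with k-means seeded at the ideal 16QAM points, then decide symbols by nearest centroid. The toy below uses a synthetic rotation, offset and noise rather than the experimental Nyquist-WDM data:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # ideal 16QAM constellation points
    levels = np.array([-3.0, -1.0, 1.0, 3.0])
    ideal = np.array([x + 1j * y for x in levels for y in levels])

    # one 10k-symbol time window with a small IQ rotation, bias offset and noise
    true_labels = rng.integers(0, 16, 10_000)
    tx = ideal[true_labels]
    noise = 0.25 * (rng.standard_normal(tx.size) + 1j * rng.standard_normal(tx.size))
    rx = tx * np.exp(1j * 0.07) + (0.15 - 0.10j) + noise

    def nearest(symbols, points):
        """Index of the nearest reference point for every received symbol."""
        return np.argmin(np.abs(symbols[:, None] - points[None, :]), axis=1)

    def kmeans_centroids(symbols, init, n_iter=10):
        """Plain k-means in the complex plane, seeded at the ideal constellation."""
        cent = init.copy()
        for _ in range(n_iter):
            labels = nearest(symbols, cent)
            for k in range(cent.size):
                cluster = symbols[labels == k]
                if cluster.size:
                    cent[k] = cluster.mean()
        return cent, labels

    cent, km_labels = kmeans_centroids(rx, ideal)
    conv_labels = nearest(rx, ideal)                 # conventional QAM decisions
    print("SER conventional: %.4f" % np.mean(conv_labels != true_labels))
    print("SER k-means     : %.4f" % np.mean(km_labels != true_labels))
    ```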

  12. Route optimization as an instrument to improve animal welfare and economics in pre-slaughter logistics.

    PubMed

    Frisk, Mikael; Jonsson, Annie; Sellman, Stefan; Flisberg, Patrik; Rönnqvist, Mikael; Wennergren, Uno

    2018-01-01

    Each year, more than three million animals are transported from farms to abattoirs in Sweden. Animal transport is related to economic and environmental costs and a negative impact on animal welfare. Time and the number of pick-up stops between farms and abattoirs are two key parameters for animal welfare. Both are highly dependent on efficient and qualitative transportation planning, which may be difficult if done manually. We have examined the benefits of using route optimization in cattle transportation planning. To simulate the effects of various planning time windows and transportation time regulations and number of pick-up stops along each route, we have used data that represent one year of cattle transport. Our optimization model is a development of a model used in forestry transport that solves a general pick-up and delivery vehicle routing problem. The objective is to minimize transportation costs. We have shown that the length of the planning time window has a significant impact on the animal transport time, the total driving time and the total distance driven; these parameters that will not only affect animal welfare but also affect the economy and environment in the pre-slaughter logistic chain. In addition, we have shown that changes in animal transportation regulations, such as minimizing the number of allowed pick-up stops on each route or minimizing animal transportation time, will have positive effects on animal welfare measured in transportation hours and number of pick-up stops. However, this leads to an increase in working time and driven distances, leading to higher transportation costs for the transport and negative environmental impact.

  13. Route optimization as an instrument to improve animal welfare and economics in pre-slaughter logistics

    PubMed Central

    2018-01-01

    Each year, more than three million animals are transported from farms to abattoirs in Sweden. Animal transport is related to economic and environmental costs and a negative impact on animal welfare. Time and the number of pick-up stops between farms and abattoirs are two key parameters for animal welfare. Both are highly dependent on efficient and qualitative transportation planning, which may be difficult if done manually. We have examined the benefits of using route optimization in cattle transportation planning. To simulate the effects of various planning time windows and transportation time regulations and number of pick-up stops along each route, we have used data that represent one year of cattle transport. Our optimization model is a development of a model used in forestry transport that solves a general pick-up and delivery vehicle routing problem. The objective is to minimize transportation costs. We have shown that the length of the planning time window has a significant impact on the animal transport time, the total driving time and the total distance driven; these parameters that will not only affect animal welfare but also affect the economy and environment in the pre-slaughter logistic chain. In addition, we have shown that changes in animal transportation regulations, such as minimizing the number of allowed pick-up stops on each route or minimizing animal transportation time, will have positive effects on animal welfare measured in transportation hours and number of pick-up stops. However, this leads to an increase in working time and driven distances, leading to higher transportation costs for the transport and negative environmental impact. PMID:29513704

  14. Passive movement improves the learning and memory function of rats with cerebral infarction by inhibiting neuron cell apoptosis.

    PubMed

    Li, Man; Peng, Jun; Wang, Meng-Die; Song, Yan-Ling; Mei, Yuan-Wu; Fang, Yuan

    2014-02-01

    Passive movement has been found to evidently improve ischemic stroke patients' impaired capacity for learning and memory, but the optimal time window for initiating the therapy and the underlying mechanism are not fully understood. In this study, the effect of passive movement at different time windows on the learning and memory of rats with cerebral infarction was examined. The results showed that the expression of caspase-3 and the escape latency in the passive movement group were considerably lower than those in the model group (P < 0.05), while the expression of Bcl-2 mRNA was significantly higher than that in the model group (P < 0.05). Moreover, we found that the changes in escape latency and in the expression of Bcl-2 mRNA and caspase-3 were most significant when the therapy started at 24 h after focal cerebral infarction. These results suggest that passive movement contributes to the recovery of learning and memory in rats with cerebral infarction, which is partially mediated by the inhibition of neuron cell apoptosis, and that the optimal therapeutic time is 24 h after cerebral infarction.

  15. Joint optimization of green vehicle scheduling and routing problem with time-varying speeds

    PubMed Central

    Zhang, Dezhi; Wang, Xin; Ni, Nan; Zhang, Zhuo

    2018-01-01

    Based on an analysis of the congestion effect and changes in the speed of vehicle flow during morning and evening peaks in a large- or medium-sized city, the piecewise function is used to capture the rules of the time-varying speed of vehicles, which are very important in modelling their fuel consumption and CO2 emission. A joint optimization model of the green vehicle scheduling and routing problem with time-varying speeds is presented in this study. Extra wages during nonworking periods and soft time-window constraints are considered. A heuristic algorithm based on the adaptive large neighborhood search algorithm is also presented. Finally, a numerical simulation example is provided to illustrate the optimization model and its algorithm. Results show that, (1) the shortest route is not necessarily the route that consumes the least energy, (2) the departure time influences the vehicle fuel consumption and CO2 emissions and the optimal departure time saves on fuel consumption and reduces CO2 emissions by up to 5.4%, and (3) extra driver wages have significant effects on routing and departure time slot decisions. PMID:29466370

  16. Joint optimization of green vehicle scheduling and routing problem with time-varying speeds.

    PubMed

    Zhang, Dezhi; Wang, Xin; Li, Shuangyan; Ni, Nan; Zhang, Zhuo

    2018-01-01

    Based on an analysis of the congestion effect and changes in the speed of vehicle flow during morning and evening peaks in a large- or medium-sized city, the piecewise function is used to capture the rules of the time-varying speed of vehicles, which are very important in modelling their fuel consumption and CO2 emission. A joint optimization model of the green vehicle scheduling and routing problem with time-varying speeds is presented in this study. Extra wages during nonworking periods and soft time-window constraints are considered. A heuristic algorithm based on the adaptive large neighborhood search algorithm is also presented. Finally, a numerical simulation example is provided to illustrate the optimization model and its algorithm. Results show that, (1) the shortest route is not necessarily the route that consumes the least energy, (2) the departure time influences the vehicle fuel consumption and CO2 emissions and the optimal departure time saves on fuel consumption and reduces CO2 emissions by up to 5.4%, and (3) extra driver wages have significant effects on routing and departure time slot decisions.
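
    The piecewise-constant speed profile used above to represent peak and off-peak periods makes link travel time depend on the departure time, and computing it requires integrating across period boundaries. A small generic sketch of that calculation (the breakpoints and speeds are invented, not the paper's calibration):

    ```python
    def travel_time(distance_km, depart_h, breakpoints, speeds_kmh):
        """Travel time (h) over a link when speed is piecewise constant in time.
        breakpoints: sorted period start times (h), e.g. [0, 7, 9, 17, 19, 24]
        speeds_kmh : speed in each period, len(breakpoints) - 1 entries."""
        t, remaining = depart_h, distance_km
        while remaining > 1e-9:
            # find the current period and how long it lasts from time t
            for lo, hi, v in zip(breakpoints[:-1], breakpoints[1:], speeds_kmh):
                if lo <= t < hi:
                    break
            else:
                raise ValueError("departure time outside the modelled horizon")
            reachable = v * (hi - t)       # distance coverable before the period ends
            if reachable >= remaining:
                t += remaining / v
                remaining = 0.0
            else:
                remaining -= reachable
                t = hi
        return t - depart_h

    # morning peak (7-9) and evening peak (17-19) with congested speeds
    bp = [0, 7, 9, 17, 19, 24]
    v = [50, 20, 50, 20, 50]
    for dep in (6.0, 7.5, 10.0):
        print("depart %.1f h -> %.2f h for 40 km" % (dep, travel_time(40, dep, bp, v)))
    ```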

  17. Catalog of Window Taper Functions for Sidelobe Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doerry, Armin W.

    Window taper functions of finite apertures are well known to control undesirable sidelobes, albeit with performance trade-offs. A plethora of taper functions have been developed over the years to achieve various optimizations. We herein catalog a number of window functions and compare their principal characteristics.
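
    As a quick illustration of the trade-offs the catalog documents, the sketch below estimates the peak sidelobe level and -3 dB mainlobe width of a few common tapers from their zero-padded spectra. It is a generic numpy example, not code from the report.

```python
import numpy as np

def sidelobe_stats(window, pad=64):
    """Peak sidelobe level (dB) and -3 dB mainlobe width (in DFT bins) of a taper."""
    n = len(window)
    spec = np.abs(np.fft.rfft(window, n * pad))
    spec /= spec.max()
    db = 20 * np.log10(np.maximum(spec, 1e-12))
    # the mainlobe ends where the magnitude first starts to rise again (first null)
    first_null = np.argmax(spec[1:] > spec[:-1])
    peak_sidelobe = db[first_null:].max()
    width_3db = 2 * np.argmax(db < -3) / pad      # two-sided width in DFT bins
    return peak_sidelobe, width_3db

if __name__ == "__main__":
    n = 64
    tapers = {
        "rectangular": np.ones(n),
        "hann": np.hanning(n),
        "hamming": np.hamming(n),
        "blackman": np.blackman(n),
    }
    for name, w in tapers.items():
        psl, bw = sidelobe_stats(w)
        print(f"{name:12s} peak sidelobe {psl:7.1f} dB   -3 dB width {bw:.2f} bins")
```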

  18. Evaluation of various energy windows at different radionuclides for scatter and attenuation correction in nuclear medicine.

    PubMed

    Asgari, Afrouz; Ashoor, Mansour; Sohrabpour, Mostafa; Shokrani, Parvaneh; Rezaei, Ali

    2015-05-01

    Improving the signal-to-noise ratio (SNR) and image quality by various methods is very important for detecting abnormalities in body organs. Scatter and attenuation of photons by the organs lead to errors in radiopharmaceutical estimation as well as degradation of images. The choice of a suitable energy window and radionuclide plays a key role in nuclear medicine, the goal being the lowest scatter fraction together with a nearly constant linear attenuation coefficient as a function of phantom thickness. The symmetrical window (SW), asymmetric window (ASW), high window (WH) and low window (WL) were compared using the Tc-99m and Sm-153 radionuclides with a solid water slab phantom (RW3) and a Teflon bone phantom; Matlab software and the Monte Carlo N-Particle (MCNP4C) code were adapted to simulate these methods and to obtain FWHM and full width at tenth maximum (FWTM) values from line spread functions (LSFs). The experimental data were obtained from an Orbiter Scintron gamma camera. Based on the simulation as well as the experimental work, WH and ASW showed the lowest scatter fraction as well as a constant linear attenuation coefficient as a function of phantom thickness. WH and ASW were the optimal windows in nuclear medicine imaging for Tc-99m in the RW3 phantom and Sm-153 in the Teflon bone phantom. Attenuation correction was performed for the WH and ASW optimal windows and for these radionuclides using a filtered back projection algorithm. Simulation and experiment showed very good agreement, both between experimental and simulated data and between theoretical values and simulated data, with discrepancies nominally less than 7.07 % for Tc-99m and less than 8.00 % for Sm-153. Corrected counts were not affected by the thickness of the scattering material. The simulated line spread function (LSF) results for Sm-153 and Tc-99m in phantom, based on the four windows and the TEW method, indicated that the FWHM and FWTM values were approximately the same for the TEW method, WH and ASW, but the sensitivity at the optimal windows was higher. A suitable determination of the energy window width on the energy spectrum can be useful in an optimal design to improve efficiency and contrast. It is found that WH is preferred to ASW, and ASW is preferred to SW.
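
    The FWHM and FWTM figures discussed above are easy to extract from a sampled line spread function. The helper below does this by linear interpolation at the half- and tenth-maximum levels, using a synthetic Gaussian LSF rather than data from the study.

```python
import numpy as np

def width_at_fraction(lsf, dx, fraction):
    """Width of a sampled line spread function at a given fraction of its peak
    (fraction=0.5 gives FWHM, fraction=0.1 gives FWTM), using linear interpolation."""
    lsf = np.asarray(lsf, dtype=float)
    level = fraction * lsf.max()
    above = np.where(lsf >= level)[0]
    left, right = above[0], above[-1]

    def crossing(i_above, i_below):
        # fractional index where the profile crosses `level` between two samples
        return i_above + (lsf[i_above] - level) / (lsf[i_above] - lsf[i_below]) * (i_below - i_above)

    x_left = crossing(left, left - 1) if left > 0 else float(left)
    x_right = crossing(right, right + 1) if right < len(lsf) - 1 else float(right)
    return (x_right - x_left) * dx

if __name__ == "__main__":
    x = np.linspace(-30, 30, 601)                # mm grid, 0.1 mm sampling
    lsf = np.exp(-0.5 * (x / 4.0) ** 2)          # synthetic Gaussian LSF, sigma = 4 mm
    print("FWHM (mm):", round(width_at_fraction(lsf, 0.1, 0.5), 2))   # ~9.42
    print("FWTM (mm):", round(width_at_fraction(lsf, 0.1, 0.1), 2))   # ~17.17
```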

  19. A study on the characteristics of retrospective optimal interpolation using an Observing System Simulation Experiment

    NASA Astrophysics Data System (ADS)

    Kim, Shin-Woo; Noh, Nam-Kyu; Lim, Gyu-Ho

    2013-04-01

    This study presents the introduction of retrospective optimal interpolation (ROI) and its application with the Weather Research and Forecasting (WRF) model. Song et al. (2009) suggested the ROI method, an optimal interpolation (OI) that gradually assimilates observations over the analysis window to obtain a variance-minimum estimate of the atmospheric state at the initial time of the analysis window. The assimilation window of the ROI algorithm is gradually increased, similar to that of quasi-static variational assimilation (QSVA; Pires et al., 1996). Unlike QSVA, however, ROI assimilates data at post-analysis times using a perturbation method (Verlaan and Heemink, 1997) without an adjoint model. Song and Lim (2011) improved this method by incorporating eigen-decomposition and covariance inflation. The computational cost of ROI can be reduced by the eigen-decomposition of the background error covariance, which concentrates the ROI analyses on the error variances of the governing eigenmodes by transforming the control variables into eigenspace. A total energy norm is used for the normalization of each control variable. In this study, the ROI method is applied to the WRF model in an Observing System Simulation Experiment (OSSE) to validate the algorithm and to investigate its capability. Horizontal wind, pressure, potential temperature, and water vapor mixing ratio are used as control variables and observations. First, a single-profile assimilation experiment is performed. Subsequently, OSSEs are performed using a virtual observing system consisting of synop, ship, and sonde data. The difference between forecast errors with and without assimilation clearly grows with time, indicating that assimilation by ROI improves the forecast. The characteristics and strengths/weaknesses of the ROI method are also investigated by conducting experiments with the 3D-Var (3-dimensional variational) and 4D-Var (4-dimensional variational) methods. At the initial time, ROI produces a larger forecast error than 4D-Var. However, the difference between the two experimental results decreases gradually with time, and ROI shows an apparently better result (i.e., smaller forecast error) than 4D-Var after the 9-hour forecast.
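
    At the core of both OI and ROI is the standard minimum-variance analysis update; the snippet below shows that update for a toy three-variable state, leaving out the ROI-specific ingredients (the gradually widening assimilation window, eigen-decomposition of the background covariance, and inflation). All matrices and observations are made up for illustration.

```python
import numpy as np

def oi_update(xb, B, y, H, R):
    """Optimal-interpolation analysis: xa = xb + K (y - H xb), K = B H^T (H B H^T + R)^-1."""
    S = H @ B @ H.T + R                      # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)           # gain
    xa = xb + K @ (y - H @ xb)               # analysis state
    A = (np.eye(len(xb)) - K @ H) @ B        # analysis error covariance
    return xa, A

if __name__ == "__main__":
    xb = np.array([280.0, 5.0, 2.0])         # toy background: T, u, q
    B = np.diag([1.0, 0.5, 0.2])             # background error covariance
    H = np.array([[1.0, 0.0, 0.0]])          # observe temperature only
    R = np.array([[0.25]])                   # observation error covariance
    y = np.array([281.2])
    xa, A = oi_update(xb, B, y, H, R)
    print("analysis:", xa)
    print("analysis error variances:", np.diag(A))
```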

  20. Optimizing searches for electromagnetic counterparts of gravitational wave triggers

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael W.; Tao, Duo; Chan, Man Leong; Chatterjee, Deep; Christensen, Nelson; Ghosh, Shaon; Greco, Giuseppe; Hu, Yiming; Kapadia, Shasvath; Rana, Javed; Salafia, Om Sharan; Stubbs, Christopher W.

    2018-04-01

    With the detection of a binary neutron star system and its corresponding electromagnetic counterparts, a new window of transient astronomy has opened. Due to the size of the sky localization regions, which can span hundreds to thousands of square degrees, there are significant benefits to optimizing tilings for these large sky areas. The rich science promised by gravitational-wave astronomy has led to the proposal of a variety of tiling and time allocation schemes, and for the first time, we make a systematic comparison of some of these methods. We find that differences of a factor of 2 or more in efficiency are possible, depending on the algorithm employed. For this reason, with future surveys searching for electromagnetic counterparts, care should be taken when selecting tiling, time allocation, and scheduling algorithms to optimize counterpart detection.

  1. Optimizing searches for electromagnetic counterparts of gravitational wave triggers

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael W.; Tao, Duo; Chan, Man Leong; Chatterjee, Deep; Christensen, Nelson; Ghosh, Shaon; Greco, Giuseppe; Hu, Yiming; Kapadia, Shasvath; Rana, Javed; Salafia, Om Sharan; Stubbs, Christopher W.

    2018-07-01

    With the detection of a binary neutron star system and its corresponding electromagnetic counterparts, a new window of transient astronomy has opened. Due to the size of the sky localization regions, which can span hundreds to thousands of square degrees, there are significant benefits to optimizing tilings for these large sky areas. The rich science promised by gravitational wave astronomy has led to the proposal of a variety of tiling and time allocation schemes, and for the first time, we make a systematic comparison of some of these methods. We find that differences of a factor of 2 or more in efficiency are possible, depending on the algorithm employed. For this reason, with future surveys searching for electromagnetic counterparts, care should be taken when selecting tiling, time allocation, and scheduling algorithms to optimize counterpart detection.
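
    A common baseline among tiling schemes of this kind is greedy coverage of the localization probability: rank candidate tiles by the probability they enclose and image the best ones that fit the observing time. The sketch below shows that baseline on a fake pixelized sky map; it is not any specific algorithm compared in the paper.

```python
import numpy as np

def greedy_tiles(prob_map, tile_size, n_tiles):
    """Pick the n_tiles non-overlapping tile_size x tile_size tiles with the
    highest enclosed probability from a 2-D probability map."""
    ny, nx = prob_map.shape
    tiles = []
    for i in range(0, ny - tile_size + 1, tile_size):
        for j in range(0, nx - tile_size + 1, tile_size):
            p = prob_map[i:i + tile_size, j:j + tile_size].sum()
            tiles.append((p, i, j))
    tiles.sort(reverse=True)                     # highest enclosed probability first
    chosen = tiles[:n_tiles]
    covered = sum(p for p, _, _ in chosen)
    return chosen, covered

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sky = rng.random((60, 120))                  # fake localization map
    sky /= sky.sum()
    chosen, covered = greedy_tiles(sky, tile_size=10, n_tiles=8)
    print(f"8 tiles cover {100 * covered:.1f}% of the localization probability")
```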

  2. On the multiple depots vehicle routing problem with heterogeneous fleet capacity and velocity

    NASA Astrophysics Data System (ADS)

    Hanum, F.; Hartono, A. P.; Bakhtiar, T.

    2018-03-01

    This manuscript concerns the optimization problem arising in route determination for product distribution. The problem is formulated as a multiple-depot vehicle routing problem with time windows and heterogeneous fleet capacity and velocity. The model includes a number of constraints such as route continuity, multiple-depot availability and serving time, in addition to generic constraints. To deal with the unique feature of heterogeneous velocity, we generate a number of velocity profiles along the road segments, which are then converted into traveling-time tables. An illustrative example of rice distribution among villages by a bureau of logistics is provided. An exact approach is utilized to determine the optimal solution in terms of vehicle routes and starting times of service.

  3. Current status of the real-time processing of complex radar signatures

    NASA Astrophysics Data System (ADS)

    Clay, E.

    The real-time processing technique developed by ONERA to characterize radar signatures at the Brahms station is described. This technique is used for the real-time analysis of the RCS of airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys. Using this technique, it is also possible to optimize the experimental parameters, i.e., the analysis band, the microwave-network gain, and the electromagnetic window of the analysis.

  4. Support Vector Data Description Model to Map Specific Land Cover with Optimal Parameters Determined from a Window-Based Validation Set.

    PubMed

    Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang

    2017-04-26

    This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach, where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint imposed by outlier pixels located adjacent to the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covered only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size and that the overall accuracies were higher than 88% at all window sizes. Moreover, WVS-SVDD showed much less sensitivity to untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, tradeoff coefficient (C) and kernel width (s), in mapping a homogeneous specific land cover.
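
    The parameter-selection step can be sketched as a grid search scored on a validation set that mixes target and neighboring outlier pixels. In the sketch below, scikit-learn's OneClassSVM with an RBF kernel stands in for SVDD (the two are closely related but not identical, and the nu parameter is not the paper's tradeoff coefficient C); the spectra are synthetic.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def select_parameters(train_target, val_target, val_outlier, nus, gammas):
    """Grid-search one-class parameters on a validation set of target and outlier
    pixels (OneClassSVM with an RBF kernel stands in for SVDD here)."""
    best = (None, None, -1.0)
    for nu in nus:
        for gamma in gammas:
            model = OneClassSVM(nu=nu, gamma=gamma).fit(train_target)
            acc = 0.5 * ((model.predict(val_target) == 1).mean()
                         + (model.predict(val_outlier) == -1).mean())
            if acc > best[2]:
                best = (nu, gamma, acc)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    train_target = rng.normal(0.0, 1.0, (200, 6))       # toy "wheat" spectra
    val_target = rng.normal(0.0, 1.0, (100, 6))
    val_outlier = rng.normal(2.5, 1.0, (100, 6))        # neighboring non-target spectra
    print("best (nu, gamma, balanced accuracy):",
          select_parameters(train_target, val_target, val_outlier,
                            nus=[0.05, 0.1, 0.2], gammas=[0.05, 0.1, 0.5]))
```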

  5. Time-Resolved Data Acquisition for In Situ Subsurface Planetary Geochemistry

    NASA Technical Reports Server (NTRS)

    Bodnarik, Julia Gates; Burger, Dan M.; Burger, Arnold; Evans, Larry G.; Parsons, Ann M.; Starr, Richard D.; Stassun, Keivan G.

    2012-01-01

    The current gamma-ray/neutron instrumentation development effort at NASA Goddard Space Flight Center aims to extend the use of active pulsed neutron interrogation techniques to probe the subsurface geochemistry of planetary bodies in situ. All previous NASA planetary science missions that used neutron and/or gamma-ray spectroscopy instruments have relied on a constant neutron source produced from galactic cosmic rays. One of the distinguishing features of this effort is the inclusion of a high intensity 14.1 MeV pulsed neutron generator synchronized with a custom data acquisition system to time each event relative to the pulse. With usually only one opportunity to collect data, it is difficult to set a priori time-gating windows to obtain the best possible results. Acquiring time-tagged, event-by-event data from nuclear induced reactions provides raw data sets containing channel/energy and event time for each gamma ray or neutron detected. The resulting data set can be plotted as a function of time or energy using optimized analysis windows after the data are acquired. Time windows can now be chosen to produce energy spectra that yield the most statistically significant and accurate elemental composition results that can be derived from the complete data set. The advantages of post-processing gamma-ray time-tagged event-by-event data in experimental tests using our prototype instrument will be demonstrated.

  6. Time-resolved Neutron-gamma-ray Data Acquisition for in Situ Subsurface Planetary Geochemistry

    NASA Technical Reports Server (NTRS)

    Bodnarik, Julie G.; Burger, Dan Michael; Burger, A.; Evans, L. G.; Parsons, A. M.; Schweitzer, J. S.; Starr, R. D.; Stassun, K. G.

    2013-01-01

    The current gamma-ray/neutron instrumentation development effort at NASA Goddard Space Flight Center aims to extend the use of active pulsed neutron interrogation techniques to probe the subsurface elemental composition of planetary bodies in situ. Previous NASA planetary science missions that used neutron and/or gamma-ray spectroscopy instruments have relied on neutrons produced from galactic cosmic rays. One of the distinguishing features of this effort is the inclusion of a high intensity 14.1 MeV pulsed neutron generator synchronized with a custom data acquisition system to time each event relative to the pulse. With usually only one opportunity to collect data, it is difficult to set a priori time-gating windows to obtain the best possible results. Acquiring time-tagged, event-by-event data from nuclear induced reactions provides raw data sets containing channel/energy and event time for each gamma ray or neutron detected. The resulting data set can be plotted as a function of time or energy using optimized analysis windows after the data are acquired. Time windows can now be chosen to produce energy spectra that yield the most statistically significant and accurate elemental composition results that can be derived from the complete data set. The advantages of post-processing gamma-ray time-tagged event-by-event data in experimental tests using our prototype instrument will be demonstrated.
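
    The benefit of event-by-event acquisition is simple to express in code: with (time, energy) pairs stored per neutron pulse, any time gate can be applied after the fact and the energy spectrum rebuilt for that gate. The example below uses synthetic events and an assumed microsecond time base, not the instrument's actual data format.

```python
import numpy as np

def gated_spectrum(times_us, energies_kev, gate, bins=128, erange=(0.0, 8000.0)):
    """Histogram the energies of events whose time (relative to the pulse, in
    microseconds) falls inside the chosen gate = (t_min, t_max)."""
    mask = (times_us >= gate[0]) & (times_us < gate[1])
    counts, edges = np.histogram(energies_kev[mask], bins=bins, range=erange)
    return counts, edges

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 100_000
    times = rng.exponential(scale=200.0, size=n)            # synthetic decay-like timing
    energies = rng.choice([2223.0, 4439.0, 6129.0], size=n) + rng.normal(0, 30, n)
    prompt, _ = gated_spectrum(times, energies, gate=(0.0, 50.0))
    delayed, _ = gated_spectrum(times, energies, gate=(200.0, 1000.0))
    print("prompt-gate counts:", prompt.sum(), " delayed-gate counts:", delayed.sum())
```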

  7. Improved safety of retinal photocoagulation with a shaped beam and modulated pulse

    NASA Astrophysics Data System (ADS)

    Sramek, Christopher; Brown, Jefferson; Paulus, Yannis M.; Nomoto, Hiroyuki; Palanker, Daniel

    2010-02-01

    Shorter pulse durations help confine thermal damage during retinal photocoagulation, decrease treatment time and minimize pain. However, the safe therapeutic window (the ratio of threshold powers for rupture and mild coagulation) decreases with shorter exposures. A ring-shaped beam enables safer photocoagulation than conventional beams by reducing the maximum temperature in the center of the spot. Similarly, a temporal pulse modulation that decreases power over time improves safety by maintaining a constant temperature for a significant portion of the pulse. Optimization of the beam and pulse shapes was performed using a computational model. In vivo experiments were performed to verify the predicted improvement. With each of these approaches, the pulse duration can be decreased by a factor of two, from 20 ms down to 10 ms, while maintaining the same therapeutic window.

  8. Generalized serial search code acquisition - The equivalent circular state diagram approach

    NASA Technical Reports Server (NTRS)

    Polydoros, A.; Simon, M. K.

    1984-01-01

    A transform-domain method for deriving the generating function of the acquisition process resulting from an arbitrary serial search strategy is presented. The method relies on equivalent circular state diagrams, uses Mason's formula from flow-graph theory, and employs a minimum number of required parameters. The transform-domain approach is briefly described and the concept of equivalent circular state diagrams is introduced and exploited to derive the generating function and resulting mean acquisition time for three particular cases of interest, the continuous/center Z search, the broken/center Z search, and the expanding window search. An optimization of the latter technique is performed whereby the number of partial windows which minimizes the mean acquisition time is determined. The numerical results satisfy certain intuitive predictions and provide useful design guidelines for such systems.

  9. Detection and display of acoustic window for guiding and training cardiac ultrasound users

    NASA Astrophysics Data System (ADS)

    Huang, Sheng-Wen; Radulescu, Emil; Wang, Shougang; Thiele, Karl; Prater, David; Maxwell, Douglas; Rafter, Patrick; Dupuy, Clement; Drysdale, Jeremy; Erkamp, Ramon

    2014-03-01

    Successful ultrasound data collection strongly relies on the skills of the operator. Among different scans, echocardiography is especially challenging as the heart is surrounded by ribs and lung tissue. Less experienced users might acquire compromised images because of suboptimal hand-eye coordination and less awareness of artifacts. Clearly, there is a need for a tool that can guide and train less experienced users to position the probe optimally. We propose to help users with hand-eye coordination by displaying lines overlaid on B-mode images. The lines indicate the edges of blockages (e.g., ribs) and are updated in real time according to movement of the probe relative to the blockages. They provide information about how probe positioning can be improved. To distinguish between blockage and acoustic window, we use coherence, an indicator of channel data similarity after applying focusing delays. Specialized beamforming was developed to estimate coherence. Image processing is applied to coherence maps to detect unblocked beams and the angle of the lines for display. We built a demonstrator based on a Philips iE33 scanner, from which beamsummed RF data and video output are transferred to a workstation for processing. The detected lines are overlaid on B-mode images and fed back to the scanner display to provide users real-time guidance. Using such information in addition to B-mode images, users will be able to quickly find a suitable acoustic window for optimal image quality, and improve their skill.
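
    A conventional way to quantify the channel similarity mentioned above is the coherence factor, the ratio of coherently to incoherently summed channel energy after focusing delays. The sketch below uses that textbook definition with synthetic channel data; the scanner's actual coherence estimator and beamforming are not reproduced.

```python
import numpy as np

def coherence_factor(channel_data):
    """Coherence factor per sample: |coherent sum|^2 / (N * incoherent sum of powers).
    channel_data: (n_channels, n_samples) focusing-delayed RF data."""
    n = channel_data.shape[0]
    coherent = np.abs(channel_data.sum(axis=0)) ** 2
    incoherent = n * (np.abs(channel_data) ** 2).sum(axis=0)
    return coherent / np.maximum(incoherent, 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    t = np.arange(512)
    signal = np.sin(2 * np.pi * t / 32.0)
    aligned = np.tile(signal, (64, 1)) + 0.1 * rng.standard_normal((64, 512))   # open window
    blocked = rng.standard_normal((64, 512))                                    # rib shadow
    print("mean CF, aligned :", coherence_factor(aligned).mean())
    print("mean CF, blocked :", coherence_factor(blocked).mean())
```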

  10. Low-Energy Defibrillation Failure Correction is Possible Through Nonlinear Analysis of Spatiotemporal Arrhythmia Data

    NASA Astrophysics Data System (ADS)

    Simonotto, Jennifer; Furman, Michael; Beaver, Thomas; Spano, Mark; Kavanagh, Katherine; Iden, Jason; Hu, Gang; Ditto, William

    2004-03-01

    Explanted porcine hearts were Langendorff-perfused, administered a voltage-sensitive fluorescent dye (Di-4-ANEPPS) and illuminated with an Nd:YAG laser (532 nm); the change in fluorescence resulting from electrical activity on the heart surface was recorded with an 80 x 80 pixel CCD camera at 1000 frames per second. The heart was put into fibrillation with rapid ventricular pacing and shocks were administered close to the defibrillation threshold. Defibrillation failure data were analyzed using synchronization, space-time volume plots and recurrence quantification. Preliminary spatiotemporal synchronization results reveal a short window of time (1 second) after defibrillation failure in which the disordered electrical activity becomes ordered; this ordered period occurs 4-5 seconds after the defibrillation shock. Recurrence analysis of a single time series confirmed these results, thus opening the avenue for dynamic defibrillators that can detect an optimal window for cardioversion.

  11. Assessment of precursory information in seismo-electromagnetic phenomena

    NASA Astrophysics Data System (ADS)

    Han, P.; Hattori, K.; Zhuang, J.

    2017-12-01

    Previous statistical studies showed that there were correlations between seismo-electromagnetic phenomena and sizeable earthquakes in Japan. In this study, utilizing Molchan's error diagram, we evaluate whether these phenomena contain precursory information and discuss how they can be used in short-term forecasting of large earthquake events. In practice, for a given series of precursory signals and related earthquake events, each prediction strategy is characterized by the leading time of alarms, the length of the alarm window, and the alarm radius (area) and magnitude. The leading time is the time between a detected anomaly and its following alarm, and the alarm window is the duration that an alarm lasts. The alarm radius and magnitude are the maximum predictable distance and minimum predictable magnitude of earthquake events, respectively. We introduce the modified probability gain (PG') and the probability difference (D') to quantify the forecasting performance and to explore the optimal prediction parameters for a given electromagnetic observation. The methodology is first applied to ULF magnetic data and GPS-TEC data. The results show that earthquake predictions based on electromagnetic anomalies are significantly better than random guesses, indicating that the data contain potentially useful precursory information. Meanwhile, we reveal the optimal prediction parameters for both observations. The methodology proposed in this study could also be applied to other pre-earthquake phenomena to find out whether there is precursory information, and on this basis explore the optimal alarm parameters for practical short-term forecasting.
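
    In a Molchan-style evaluation, a strategy is summarized by the fraction of time occupied by alarms and the fraction of target events missed, and a simple probability gain is the hit rate divided by the alarm fraction. The sketch below computes these generic quantities for a toy alarm set; the modified gain PG' and difference D' defined in the paper are not reproduced.

```python
def molchan_point(alarm_windows, event_times, t_total):
    """Return (alarm time fraction tau, miss rate nu, probability gain) for a
    set of alarm windows [(start, end), ...] over an observation span t_total."""
    alarm_time = sum(end - start for start, end in alarm_windows)
    tau = alarm_time / t_total
    hits = sum(any(s <= t < e for s, e in alarm_windows) for t in event_times)
    nu = 1.0 - hits / len(event_times)            # miss rate
    gain = (hits / len(event_times)) / tau if tau > 0 else float("inf")
    return tau, nu, gain

if __name__ == "__main__":
    alarms = [(10.0, 25.0), (40.0, 55.0), (80.0, 90.0)]   # alarm windows, days
    quakes = [12.0, 33.0, 44.0, 85.0]                     # event times, days
    tau, nu, gain = molchan_point(alarms, quakes, t_total=100.0)
    print(f"alarm fraction {tau:.2f}, miss rate {nu:.2f}, probability gain {gain:.2f}")
```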

  12. Novel characterization method of impedance cardiography signals using time-frequency distributions.

    PubMed

    Escrivá Muñoz, Jesús; Pan, Y; Ge, S; Jensen, E W; Vallverdú, M

    2018-03-16

    The purpose of this document is to describe a methodology to select the most adequate time-frequency distribution (TFD) kernel for the characterization of impedance cardiography (ICG) signals. The predominant ICG beat was extracted from a patient and was synthesized using time-frequency variant Fourier approximations. These synthesized signals were used to optimize several TFD kernels according to a performance maximization. The optimized kernels were tested for noise resistance on a clinical database. The resulting optimized TFD kernels are presented with their performance calculated using newly proposed methods. The procedure explained in this work showcases a new method to select an appropriate kernel for ICG signals and compares the performance of different time-frequency kernels found in the literature for the case of ICG signals. We conclude that, for ICG signals, the performance (P) of the spectrogram with either Hanning or Hamming windows (P = 0.780) and the extended modified beta distribution (P = 0.765) provided similar results, higher than the rest of the analyzed kernels. Graphical abstract: Flowchart for the optimization of time-frequency distribution kernels for impedance cardiography signals.

  13. Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem.

    PubMed

    Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing

    2015-01-01

    Vehicle Routing Problem (VRP) is one of the key issues in optimization of modern logistics system. In this paper, a modified VRP model with hard time window is established and a Hybrid Optimization Algorithm (HOA) based on Fractal Space Filling Curves (SFC) method and Genetic Algorithm (GA) is introduced. By incorporating the proposed algorithm, SFC method can find an initial and feasible solution very fast; GA is used to improve the initial solution. Thereafter, experimental software was developed and a large number of experimental computations from Solomon's benchmark have been studied. The experimental results demonstrate the feasibility and effectiveness of the HOA.

  14. Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem

    PubMed Central

    Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing

    2015-01-01

    Vehicle Routing Problem (VRP) is one of the key issues in optimization of modern logistics system. In this paper, a modified VRP model with hard time window is established and a Hybrid Optimization Algorithm (HOA) based on Fractal Space Filling Curves (SFC) method and Genetic Algorithm (GA) is introduced. By incorporating the proposed algorithm, SFC method can find an initial and feasible solution very fast; GA is used to improve the initial solution. Thereafter, experimental software was developed and a large number of experimental computations from Solomon's benchmark have been studied. The experimental results demonstrate the feasibility and effectiveness of the HOA. PMID:26167171
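
    The space-filling-curve half of the hybrid is the easy part to sketch: map each customer's coordinates to a curve index and visit customers in index order to obtain a fast, feasible starting tour for the GA to improve. The code below uses a Morton (Z-order) key as a stand-in for the fractal curve used in the paper.

```python
def morton_key(x, y, bits=16):
    """Interleave the bits of the quantized coordinates (Z-order curve index)."""
    key = 0
    for b in range(bits):
        key |= ((x >> b) & 1) << (2 * b) | ((y >> b) & 1) << (2 * b + 1)
    return key

def sfc_initial_route(customers, grid=1 << 16):
    """Order customers along a space-filling curve to get an initial VRP tour.
    customers: list of (x, y) with coordinates in [0, 1)."""
    quantized = [(int(x * grid), int(y * grid)) for x, y in customers]
    order = sorted(range(len(customers)), key=lambda i: morton_key(*quantized[i]))
    return order

if __name__ == "__main__":
    import random
    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(12)]
    print("initial visiting order:", sfc_initial_route(pts))
```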

  15. Space-time interpolation of satellite winds in the tropics

    NASA Astrophysics Data System (ADS)

    Patoux, Jérôme; Levy, Gad

    2013-09-01

    A space-time interpolator for creating average geophysical fields from satellite measurements is presented and tested. It is designed for optimal spatiotemporal averaging of heterogeneous data. While it is illustrated with satellite surface wind measurements in the tropics, the methodology can be useful for interpolating, analyzing, and merging a wide variety of heterogeneous and satellite data in the atmosphere and ocean over the entire globe. The spatial and temporal ranges of the interpolator are determined by averaging satellite and in situ measurements over increasingly larger space and time windows and matching the corresponding variability at each scale. This matching provides a relationship between temporal and spatial ranges, but does not provide a unique pair of ranges as a solution to all averaging problems. The pair of ranges most appropriate for a given application can be determined by performing a spectral analysis of the interpolated fields and choosing the smallest values that remove any or most of the aliasing due to the uneven sampling by the satellite. The methodology is illustrated with the computation of average divergence fields over the equatorial Pacific Ocean from SeaWinds-on-QuikSCAT surface wind measurements, for which 72 h and 510 km are suggested as optimal interpolation windows. It is found that the wind variability is reduced over the cold tongue and enhanced over the Pacific warm pool, consistent with the notion that the unstably stratified boundary layer has generally more variable winds and more gustiness than the stably stratified boundary layer. It is suggested that the spectral analysis optimization can be used for any process where time-space correspondence can be assumed.
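
    At each grid point the interpolator reduces to a weighted average of observations that are close in both space and time, with the 510 km and 72 h ranges setting the scales of the weights. A minimal Gaussian-weighted version with synthetic observations is sketched below; the actual weighting function of the interpolator may differ.

```python
import numpy as np

def space_time_average(obs, x0, y0, t0, l_space=510.0, l_time=72.0):
    """Gaussian-weighted space-time average at (x0, y0, t0).
    obs: array of rows (x_km, y_km, t_h, value)."""
    x, y, t, v = obs[:, 0], obs[:, 1], obs[:, 2], obs[:, 3]
    d2 = ((x - x0) ** 2 + (y - y0) ** 2) / l_space ** 2 + ((t - t0) ** 2) / l_time ** 2
    w = np.exp(-0.5 * d2)
    return np.sum(w * v) / np.sum(w)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n = 2000
    obs = np.column_stack([
        rng.uniform(-2000, 2000, n),          # x (km)
        rng.uniform(-2000, 2000, n),          # y (km)
        rng.uniform(0, 240, n),               # t (h)
        rng.normal(5.0, 1.0, n),              # wind component (m/s)
    ])
    print("interpolated value:", space_time_average(obs, 0.0, 0.0, 120.0))
```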

  16. A randomized trial to determine the impact on compliance of a psychophysical peripheral cue based on the Elaboration Likelihood Model.

    PubMed

    Horton, Rachael Jane; Minniti, Antoinette; Mireylees, Stewart; McEntegart, Damian

    2008-11-01

    Non-compliance in clinical studies is a significant issue, but causes remain unclear. Utilizing the Elaboration Likelihood Model of persuasion, this study assessed the psychophysical peripheral cue 'Interactive Voice Response System (IVRS) call frequency' on compliance. 71 participants were randomized to once daily (OD), twice daily (BID) or three times daily (TID) call schedules over two weeks. Participants completed 30-item cognitive function tests at each call. Compliance was defined as proportion of expected calls within a narrow window (+/- 30 min around scheduled time), and within a relaxed window (-30 min to +4 h). Data were analyzed by ANOVA and pairwise comparisons adjusted by the Bonferroni correction. There was a relationship between call frequency and compliance. Bonferroni adjusted pairwise comparisons showed significantly higher compliance (p=0.03) for the BID (51.0%) than TID (30.3%) for the narrow window; for the extended window, compliance was higher (p=0.04) with OD (59.5%), than TID (38.4%). The IVRS psychophysical peripheral cue call frequency supported the ELM as a route to persuasion. The results also support OD strategy for optimal compliance. Models suggest specific indicators to enhance compliance with medication dosing and electronic patient diaries to improve health outcomes and data integrity respectively.
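
    The two compliance definitions are just interval tests around each scheduled call time; a small helper like the one below (with made-up timestamps) makes the narrow and relaxed windows explicit.

```python
from datetime import datetime

def compliance(scheduled, actual, narrow=(-30, 30), relaxed=(-30, 240)):
    """Classify each completed call against its scheduled time.
    narrow / relaxed are (minutes_before, minutes_after) windows."""
    results = []
    for sched, act in zip(scheduled, actual):
        delta_min = (act - sched).total_seconds() / 60.0
        results.append({
            "narrow": narrow[0] <= delta_min <= narrow[1],
            "relaxed": relaxed[0] <= delta_min <= relaxed[1],
        })
    return results

if __name__ == "__main__":
    sched = [datetime(2008, 5, 1, 9, 0), datetime(2008, 5, 1, 21, 0)]
    actual = [datetime(2008, 5, 1, 9, 20), datetime(2008, 5, 2, 0, 30)]
    for r in compliance(sched, actual):
        print(r)
```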

  17. Demonstration Program for Low-Cost, High-Energy-Saving Dynamic Windows

    DTIC Science & Technology

    2014-09-01

    Design: The scope of this project was to demonstrate the impact of dynamic windows via energy savings and HVAC peak-load reduction; to validate the ... temperature and glare. While the installed dynamic window system does not directly control the HVAC or lighting of the facility, those systems are designed ... optimize energy efficiency and HVAC load management. The conversion to inoperable windows caused an unforeseen reluctance to accept the design and ...

  18. Comparison of dual-biomarker PIB-PET and dual-tracer PET in AD diagnosis.

    PubMed

    Fu, Liping; Liu, Linwen; Zhang, Jinming; Xu, Baixuan; Fan, Yong; Tian, Jiahe

    2014-11-01

    To identify the optimal time window for capturing perfusion information from early (11)C-PIB imaging frames (perfusion PIB, (11)C-pPIB) and to compare the performance of (18)F-FDG PET and "dual biomarker" (11)C-PIB PET [(11)C-pPIB and amyloid PIB ((11)C-aPIB)] for classification of AD, MCI and CN subjects. Forty subjects (14 CN, 12 MCI and 14 AD patients) underwent (18)F-FDG and (11)C-PIB PET studies. The Pearson correlation between the (18)F-FDG image and the sum of early (11)C-PIB frames was maximized to identify the optimal time window for (11)C-pPIB. The classification power of the imaging parameters was evaluated with a leave-one-out validation. A 7-min time window yielded the highest correlation between (18)F-FDG and (11)C-pPIB. (11)C-pPIB and (18)F-FDG images shared a similar radioactive distribution pattern. (18)F-FDG performed better than (11)C-pPIB for the classification of both AD vs. CN and MCI vs. CN. (11)C-pPIB + (11)C-aPIB and (18)F-FDG + (11)C-aPIB yielded the highest classification accuracy for AD vs. CN, and (18)F-FDG + (11)C-aPIB had the best classification performance for MCI vs. CN. (11)C-pPIB could serve as a useful biomarker of rCBF for measuring neural activity and improve the diagnostic power of PET for AD in conjunction with (11)C-aPIB. (18)F-FDG and (11)C-PIB dual-tracer PET examination could better detect MCI. • Dual-tracer PET examination provides neurofunctional and neuropathological information for AD diagnosis. • The identified optimal 11C-pPIB time frames had the highest correlation with 18F-FDG. • 11C-pPIB images shared a similar radioactive distribution pattern with 18F-FDG images. • 11C-pPIB can provide neurofunctional information. • Dual-tracer PET examination could better detect MCI.
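
    The window-selection step can be written as a small search: sum the early PIB frames over each candidate start and length, correlate the summed image with the FDG image voxel-wise, and keep the window with the highest Pearson r. The sketch below uses random stand-in data and assumes equal-length frames; it is not the study's processing pipeline.

```python
import numpy as np

def best_early_window(pib_frames, fdg, max_len):
    """pib_frames: (n_frames, n_voxels) early-frame images; fdg: (n_voxels,).
    Return (start, length, r) of the frame window whose summed image has the
    highest Pearson correlation with the FDG image."""
    n_frames = pib_frames.shape[0]
    best = (0, 1, -1.0)
    for start in range(n_frames):
        for length in range(1, min(max_len, n_frames - start) + 1):
            summed = pib_frames[start:start + length].sum(axis=0)
            r = np.corrcoef(summed, fdg)[0, 1]
            if r > best[2]:
                best = (start, length, r)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    perfusion = rng.random(5000)                      # latent perfusion pattern
    frames = perfusion * rng.random((12, 1)) + 0.05 * rng.random((12, 5000))
    fdg = perfusion + 0.05 * rng.random(5000)
    print("best window (start, n_frames, r):", best_early_window(frames, fdg, max_len=8))
```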

  19. Regulation of Viable and Optimal Cohorts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aubin, Jean-Pierre, E-mail: aubin.jp@gmail.com

    This study deals with the evolution of (scalar) attributes (resources or income in evolutionary demography or economics, position in traffic management, etc.) of a population of “mobiles” (economic agents, vehicles, etc.). The set of mobiles sharing the same attributes is regarded as an instantaneous cohort described by the number of its elements. The union of instantaneous cohorts during a mobile window between two attributes is a cohort. Given a measure defining the number of instantaneous cohorts, the accumulation of the mobile attributes on an evolving mobile window is the measure of the cohort on this temporal mobile window. Imposing accumulation constraints and departure conditions, this study is devoted to the regulation of the evolutions of the attributes which are (1) viable, in the sense that the accumulation constraints are satisfied at each instant, and (2), among them, optimal, in the sense that both the duration of the temporal mobile window is maximum and the accumulation on this temporal mobile window is the largest viable one. This value is the “accumulation valuation” function. Viable and optimal evolutions under accumulation constraints are regulated by an “implicit Volterra integro-differential inclusion” built from the accumulation valuation function, which is the solution to a Hamilton–Jacobi–Bellman partial differential equation under constraints constructed for this purpose.

  20. Precision glass molding: Toward an optimal fabrication of optical lenses

    NASA Astrophysics Data System (ADS)

    Zhang, Liangchi; Liu, Weidong

    2017-03-01

    It is costly and time consuming to use machining processes, such as grinding, polishing and lapping, to produce optical glass lenses with complex features. Precision glass molding (PGM) has thus been developed to realize an efficient manufacture of such optical components in a single step. However, PGM faces various technical challenges. For example, a PGM process must be carried out within the super-cooled region of optical glass above its glass transition temperature, in which the material has an unstable non-equilibrium structure. Within a narrow window of allowable temperature variation, the glass viscosity can change from 10^5 to 10^12 Pa·s due to the kinetic fragility of the super-cooled liquid. This makes a PGM process sensitive to its molding temperature. In addition, because of the structural relaxation in this temperature window, the atomic structure that governs the material properties is strongly dependent on time and thermal history. Such complexity often leads to residual stresses and shape distortion in a lens molded, causing unexpected changes in density and refractive index. This review will discuss some of the central issues in PGM processes and provide a method based on a manufacturing chain consideration from mold material selection, property and deformation characterization of optical glass to process optimization. The realization of such optimization is a necessary step for the Industry 4.0 of PGM.

  1. GCaMP expression in retinal ganglion cells characterized using a low-cost fundus imaging system

    NASA Astrophysics Data System (ADS)

    Chang, Yao-Chuan; Walston, Steven T.; Chow, Robert H.; Weiland, James D.

    2017-10-01

    Objective. Virus-transduced, intracellular-calcium indicators are effective reporters of neural activity, offering the advantage of cell-specific labeling. Because there is an optimal time window for the expression of calcium indicators, a suitable tool for tracking GECI expression in vivo following transduction is highly desirable. Approach. We developed a noninvasive imaging approach based on a custom-modified, low-cost fundus viewing system that allowed us to monitor and characterize in vivo bright-field and fluorescence images of the mouse retina. AAV2-CAG-GCaMP6f was injected into a mouse eye. The fundus imaging system was used to measure fluorescence at several time points post injection. At defined time points, we prepared wholemount retina mounted on a transparent multielectrode array and used calcium imaging to evaluate the responsiveness of retinal ganglion cells (RGCs) to external electrical stimulation. Main results. The noninvasive fundus imaging system clearly resolves individual RGCs and axons. RGC fluorescence intensity and the number of observable fluorescent cells show a similar rising trend from week 1 to week 3 after viral injection, indicating a consistent increase of GCaMP6f expression. Analysis of the in vivo fluorescence intensity trend and in vitro neurophysiological responsiveness shows that the slope of intensity versus days post injection can be used to estimate the optimal time for calcium imaging of RGCs in response to external electrical stimulation. Significance. The proposed fundus imaging system enables high-resolution digital fundus imaging in the mouse eye based on off-the-shelf components. The long-term tracking experiment with in vitro calcium imaging validation demonstrates that the system can serve as a powerful tool for monitoring the level of genetically encoded calcium indicator expression and for determining the optimal time window for subsequent experiments.

  2. A method for energy window optimization for quantitative tasks that includes the effects of model-mismatch on bias: application to Y-90 bremsstrahlung SPECT imaging.

    PubMed

    Rong, Xing; Du, Yong; Frey, Eric C

    2012-06-21

    Quantitative Yttrium-90 ((90)Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of (90)Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters, where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for (90)Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance in the estimates as a figure of merit (FOM). However, in situations such as (90)Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction, there will be bias in the activity estimates. In (90)Y bremsstrahlung imaging this is especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Thus variance is not a complete measure of the reliability of the estimates and hence not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative (90)Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the absorbed energy from the radiation per unit mass of tissue, in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM. To calculate this FOM, two analytical expressions were derived for calculating the bias due to model-mismatch and the variance of the VOI activity estimates, respectively. To obtain the optimal acquisition energy window for general situations of interest in clinical (90)Y microsphere imaging, we generated phantoms with multiple tumors of various sizes and various tumor-to-normal activity concentration ratios using a digital phantom that realistically simulates human anatomy, simulated (90)Y microsphere imaging with a clinical SPECT system and typical imaging parameters using a previously validated Monte Carlo simulation code, and used a previously proposed method for modeling the image-degrading effects in quantitative SPECT reconstruction. The obtained optimal acquisition energy window was 100-160 keV. The values of the proposed FOM were much larger than those of an FOM taking into account only the variance of the activity estimates, demonstrating in our experiment that the bias of the activity estimates due to model-mismatch was a more important factor than the variance in limiting the reliability of the activity estimates.
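
    The figure of merit described above combines the model-mismatch bias and the variance of each VOI's activity estimate, weighted by VOI mass. One plausible reading of that FOM is sketched below with made-up numbers; the paper's exact expressions for the bias and variance terms are not reproduced.

```python
import numpy as np

def mass_weighted_rmse(bias, variance, mass):
    """Mass-weighted RMSE of VOI activity estimates: per-VOI MSE = bias^2 + variance,
    averaged with mass weights, then square-rooted (assumed form, for illustration)."""
    bias, variance, mass = map(np.asarray, (bias, variance, mass))
    mse = bias ** 2 + variance
    return float(np.sqrt(np.sum(mass * mse) / np.sum(mass)))

if __name__ == "__main__":
    # three VOIs: two tumours and normal liver (illustrative values only)
    bias = [1.2, 0.8, 3.5]          # activity bias per VOI
    variance = [0.6, 0.4, 2.0]      # activity estimate variance per VOI
    mass = [20.0, 35.0, 1500.0]     # VOI mass (g)
    print("FOM =", mass_weighted_rmse(bias, variance, mass))
```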

  3. Weighted Optimization-Based Distributed Kalman Filter for Nonlinear Target Tracking in Collaborative Sensor Networks.

    PubMed

    Chen, Jie; Li, Jiahong; Yang, Shuanghua; Deng, Fang

    2017-11-01

    The identification of the nonlinearity and coupling is crucial in the nonlinear target tracking problem in collaborative sensor networks. In the adaptive Kalman filtering (KF) method, the nonlinearity and coupling can be regarded as model noise covariance and estimated by minimizing the innovation or residual errors of the states. However, the method requires a large time window of data to achieve a reliable covariance measurement, making it impractical for nonlinear systems that change rapidly. To deal with this problem, a weighted optimization-based distributed KF algorithm (WODKF) is proposed in this paper. The algorithm enlarges the data size of each sensor using the measurements and state estimates received from its connected sensors instead of a longer time window. A new cost function is set as the weighted sum of the bias and oscillation of the state to determine the best estimate of the model noise covariance. The bias and oscillation of the state of each sensor are estimated by polynomial fitting of a time window of state estimates and measurements of the sensor and its neighbors, weighted by the measurement noise covariance. The best estimate of the model noise covariance is computed by minimizing the weighted cost function using the exhaustive method. A sensor selection method is added to the algorithm to decrease the computational load of the filter and increase the scalability of the sensor network. The existence, suboptimality and stability analysis of the algorithm are given. The local probability data association method is used in the proposed algorithm for the multitarget tracking case. The algorithm is demonstrated in simulations on tracking examples for a random signal, one nonlinear target, and four nonlinear targets. Results show the feasibility and superiority of WODKF against other filtering algorithms for a large class of systems.

  4. Surgical anatomy of the round window-Implications for cochlear implantation.

    PubMed

    Luers, J C; Hüttenbrink, K B; Beutner, D

    2018-04-01

    The round window is an important portal for the application of active hearing aids and cochlear implants. Anatomical and topographical knowledge of the round window region is a prerequisite for successful insertion of a cochlear implant electrode. The aim is to sum up current knowledge about the round window anatomy and to give advice to the cochlear implant surgeon for optimal placement of an electrode. Systematic Medline search. Search term "round window[Title]" with no date restriction. Only publications in the English language were included. All abstracts were screened for relevance, that is, a focus on the surgical anatomy of the round window. The search results were supplemented by hand searching of selected reviews and reference lists from included studies. Subjective assessment. There is substantial variability in the size and shape of the round window. The round window is regarded as the most reliable surgical landmark to safely locate the scala tympani. Factors affecting the optimal trajectory line for atraumatic electrode insertion are the anatomy of the round window, the anatomy of the intracochlear hook region and the variable orientation and size of the cochlea's basal turn. The very close relation to the sensitive inner ear structures necessitates thorough anatomic knowledge and a careful insertion technique, especially when implanting patients with residual hearing. In order to avoid electrode migration between the scalae and to protect the modiolus and the basilar membrane, it is recommended to aim for an electrode insertion vector from postero-superior to antero-inferior. © 2017 John Wiley & Sons Ltd.

  5. Comparisons of neural networks to standard techniques for image classification and correlation

    NASA Technical Reports Server (NTRS)

    Paola, Justin D.; Schowengerdt, Robert A.

    1994-01-01

    Neural network techniques for multispectral image classification and spatial pattern detection are compared to the standard techniques of maximum-likelihood classification and spatial correlation. The neural network produced a more accurate classification than maximum-likelihood of a Landsat scene of Tucson, Arizona. Some of the errors in the maximum-likelihood classification are illustrated using decision region and class probability density plots. As expected, the main drawback to the neural network method is the long time required for the training stage. The network was trained using several different hidden layer sizes to optimize both the classification accuracy and training speed, and it was found that one node per class was optimal. The performance improved when 3x3 local windows of image data were entered into the net. This modification introduces texture into the classification without explicit calculation of a texture measure. Larger windows were successfully used for the detection of spatial features in Landsat and Magellan synthetic aperture radar imagery.
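
    The 3x3-window idea replaces each pixel's spectral vector with the stacked spectra of its neighborhood, letting the network see local texture without an explicit texture measure. A minimal feature-extraction helper for an arbitrary multispectral array is sketched below.

```python
import numpy as np

def window_features(image, win=3):
    """image: (rows, cols, bands). Return (n_pixels, win*win*bands) features, where
    each interior pixel is described by the spectra of its win x win neighborhood."""
    r = win // 2
    rows, cols, bands = image.shape
    feats = []
    for i in range(r, rows - r):
        for j in range(r, cols - r):
            patch = image[i - r:i + r + 1, j - r:j + r + 1, :]
            feats.append(patch.reshape(-1))
    return np.array(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    scene = rng.random((64, 64, 6))              # toy 6-band image
    X = window_features(scene)
    print("feature matrix shape:", X.shape)      # (62*62, 54)
```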

  6. Optimizing the location of fuel treatments over time at landscape scales

    Treesearch

    Greg Jones; Woodam Chung

    2011-01-01

    Fuel treatments are a vital part of forest management - but when faced with limited budgets, narrow burning windows, and air quality restrictions, it can be challenging to prioritize where, when, and how fuel treatments should be applied across the landscape to achieve the most benefit. To help ease this process, land managers can turn to various standalone models,...

  7. Evaluation of Building Energy Saving Through the Development of Venetian Blinds' Optimal Control Algorithm According to the Orientation and Window-to-Wall Ratio

    NASA Astrophysics Data System (ADS)

    Kwon, Hyuk Ju; Yeon, Sang Hun; Lee, Keum Ho; Lee, Kwang Ho

    2018-02-01

    As various studies focusing on building energy saving have been continuously conducted, studies utilizing renewable energy sources instead of fossil fuels are needed. In particular, studies regarding solar energy are being carried out in the field of building science; in order to utilize such solar energy effectively, the solar radiation entering the indoor space should be admitted or blocked properly. Blinds are a typical solar radiation control device capable of controlling the indoor thermal and light environments. However, slat-type blinds are usually manually controlled, which has a negative effect on building energy saving. In this regard, studies on the automatic control of slat-type blinds have been carried out over the last couple of decades. Therefore, this study aims to provide preliminary data for optimal control research by controlling the slat angle of slat-type blinds while comprehensively considering various input variables. The window-to-wall ratio and orientation were selected as input variables. It was found that the optimal control algorithm differed for each window-to-wall ratio and window orientation. In addition, by applying the developed algorithms in simulations and comparing the building energy saving performance for each condition, up to 20.7 % energy saving was shown in the cooling period and up to 12.3 % in the heating period. Moreover, the building energy saving effect was greater as the window-to-wall ratio increased for the same orientation, and the effect of the window-to-wall ratio was greater in the cooling period than in the heating period.

  8. Single Mode SU8 Polymer Based Mach-Zehnder Interferometer for Bio-Sensing Application

    NASA Astrophysics Data System (ADS)

    Boiragi, Indrajit; Kundu, Sushanta; Makkar, Roshan; Chalapathi, Krishnamurthy

    2011-10-01

    This paper explains the influence of different parameters on the sensitivity of an optical waveguide Mach-Zehnder Interferometer (MZI) for real-time detection of biomolecules. The sensing principle is based on the interaction of the evanescent field with the biomolecules that become immobilized on the sensing arm. The sensitivity has been calculated by varying the sensing window length, the wavelength and the concentration of the bio-analyte. The maximum attainable sensitivity for the preferred design is on the order of 10^-8 RIU at 840 nm wavelength with a sensing window length of 1 cm. All the simulation work has been carried out with Opti-BPMCAD for the optimization of the MZI device parameters. SU8 polymers are used as the core and cladding materials to fabricate the waveguide. The refractive index of the cladding layer is optimized by varying the curing temperature for a fixed time period, and the achieved index difference between core and cladding is Δn = 0.0151. The fabricated MZI device has been characterized with a laser beam profiler at 840 nm wavelength. This study demonstrates the effect of the different parameters on the sensitivity of a single-mode optical waveguide Mach-Zehnder Interferometer for bio-sensing applications.

  9. A Deep Ensemble Learning Method for Monaural Speech Separation.

    PubMed

    Zhang, Xiao-Lei; Wang, DeLiang

    2016-03-01

    Monaural speech separation is a fundamental problem in robust speech processing. Recently, deep neural network (DNN)-based speech separation methods, which predict either clean speech or an ideal time-frequency mask, have demonstrated remarkable performance improvement. However, a single DNN with a given window length does not leverage contextual information sufficiently, and the differences between the two optimization objectives are not well understood. In this paper, we propose a deep ensemble method, named multicontext networks, to address monaural speech separation. The first multicontext network averages the outputs of multiple DNNs whose inputs employ different window lengths. The second multicontext network is a stack of multiple DNNs. Each DNN in a module of the stack takes the concatenation of original acoustic features and expansion of the soft output of the lower module as its input, and predicts the ratio mask of the target speaker; the DNNs in the same module employ different contexts. We have conducted extensive experiments with three speech corpora. The results demonstrate the effectiveness of the proposed method. We have also compared the two optimization objectives systematically and found that predicting the ideal time-frequency mask is more efficient in utilizing clean training speech, while predicting clean speech is less sensitive to SNR variations.
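
    The first multicontext network described above simply averages the masks predicted by models that see different context-window lengths. The sketch below shows only that averaging step, with placeholder 'models' (functions mapping a padded context window to a per-frequency mask); no DNN framework or trained weights are assumed.

```python
import numpy as np

def context_window(features, t, half_len):
    """Stack frames t-half_len .. t+half_len (edge-padded) into one input vector."""
    n_frames, n_freq = features.shape
    idx = np.clip(np.arange(t - half_len, t + half_len + 1), 0, n_frames - 1)
    return features[idx].reshape(-1)

def multicontext_average(features, models_by_halflen):
    """Average the masks predicted by models that use different window lengths."""
    n_frames, n_freq = features.shape
    masks = np.zeros((len(models_by_halflen), n_frames, n_freq))
    for k, (half_len, model) in enumerate(models_by_halflen.items()):
        for t in range(n_frames):
            masks[k, t] = model(context_window(features, t, half_len))
    return masks.mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    feats = rng.random((100, 64))                          # 100 frames x 64 freq bins
    fake_model = lambda x: 1.0 / (1.0 + np.exp(-x[-64:]))  # placeholder "DNN"
    avg_mask = multicontext_average(feats, {1: fake_model, 3: fake_model, 5: fake_model})
    print("averaged mask shape:", avg_mask.shape)
```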

  10. Sitting in the Pilot's Seat; Optimizing Human-Systems Interfaces for Unmanned Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    Queen, Steven M.; Sanner, Kurt Gregory

    2011-01-01

    One of the pilot-machine interfaces (the forward viewing camera display) for an Unmanned Aerial Vehicle called the DROID (Dryden Remotely Operated Integrated Drone) will be analyzed for optimization. The goal is to create a visual display for the pilot that resembles an out-the-window view as closely as possible. There are currently no standard guidelines for designing pilot-machine interfaces for UAVs. Typically, UAV camera views have a narrow field, which limits the situational awareness (SA) of the pilot. Also, at this time, pilot-UAV interfaces often use displays that have a diagonal length of around 20". Using a small display may result in a distorted and disproportional view for UAV pilots. Making use of a larger display and a camera lens with a wider field of view may minimize the occurrences of pilot error associated with the inability to see "out the window" as in a manned airplane. It is predicted that the pilot will have a less distorted view of the DROID's surroundings, quicker response times and more stable vehicle control. If the experimental results validate this concept, other UAV pilot-machine interfaces will be improved with this design methodology.

  11. Blurred image restoration using knife-edge function and optimal window Wiener filtering.

    PubMed

    Wang, Min; Zhou, Shudao; Yan, Wei

    2018-01-01

    Motion blur in images is usually modeled as the convolution of a point spread function (PSF) and the original image represented as pixel intensities. The knife-edge function can be used to model various types of motion-blurs, and hence it allows for the construction of a PSF and accurate estimation of the degradation function without knowledge of the specific degradation model. This paper addresses the problem of image restoration using a knife-edge function and optimal window Wiener filtering. In the proposed method, we first calculate the motion-blur parameters and construct the optimal window. Then, we use the detected knife-edge function to obtain the system degradation function. Finally, we perform Wiener filtering to obtain the restored image. Experiments show that the restored image has improved resolution and contrast parameters with clear details and no discernible ringing effects.

  12. Blurred image restoration using knife-edge function and optimal window Wiener filtering

    PubMed Central

    Zhou, Shudao; Yan, Wei

    2018-01-01

    Motion blur in images is usually modeled as the convolution of a point spread function (PSF) and the original image represented as pixel intensities. The knife-edge function can be used to model various types of motion-blurs, and hence it allows for the construction of a PSF and accurate estimation of the degradation function without knowledge of the specific degradation model. This paper addresses the problem of image restoration using a knife-edge function and optimal window Wiener filtering. In the proposed method, we first calculate the motion-blur parameters and construct the optimal window. Then, we use the detected knife-edge function to obtain the system degradation function. Finally, we perform Wiener filtering to obtain the restored image. Experiments show that the restored image has improved resolution and contrast parameters with clear details and no discernible ringing effects. PMID:29377950
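
    Independent of how the PSF is estimated from the knife-edge function, the final restoration step is standard Wiener filtering in the frequency domain. A generic numpy version for a simple linear motion-blur PSF is sketched below; the optimal-window construction from the paper is not reproduced.

```python
import numpy as np

def motion_psf(shape, length=15):
    """Simple horizontal motion-blur PSF of the given image shape."""
    psf = np.zeros(shape)
    psf[0, :length] = 1.0 / length
    return psf

def wiener_restore(blurred, psf, k=0.01):
    """Frequency-domain Wiener filter: H* / (|H|^2 + k) applied to the blurred image."""
    H = np.fft.fft2(psf)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F_hat))

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    img = rng.random((128, 128))
    psf = motion_psf(img.shape)
    blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))   # circular blur
    restored = wiener_restore(blurred + 0.001 * rng.standard_normal(img.shape), psf)
    print("restoration error:", np.mean((restored - img) ** 2))
```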

  13. Coherent Amplification of Ultrafast Molecular Dynamics in an Optical Oscillator

    NASA Astrophysics Data System (ADS)

    Aharonovich, Igal; Pe'er, Avi

    2016-02-01

    Optical oscillators present a powerful optimization mechanism. The inherent competition for the gain resources between possible modes of oscillation entails the prevalence of the most efficient single mode. We harness this "ultrafast" coherent feedback to optimize an optical field in time, and show that, when an optical oscillator based on a molecular gain medium is synchronously pumped by ultrashort pulses, a temporally coherent multimode field can develop that optimally dumps a general, dynamically evolving vibrational wave packet into a single vibrational target state. Measuring the emitted field opens a new window to visualization and control of fast molecular dynamics. The realization of such a coherent oscillator with hot alkali dimers appears within experimental reach.

  14. Plan averaging for multicriteria navigation of sliding window IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, David, E-mail: dcraft@partners.org; Papp, Dávid; Unkelbach, Jan

    2014-02-15

    Purpose: To describe a method for combining sliding window plans [intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT)] for use in treatment plan averaging, which is needed for Pareto surface navigation based multicriteria treatment planning. Methods: The authors show that by taking an appropriately defined average of leaf trajectories of sliding window plans, the authors obtain a sliding window plan whose fluence map is the exact average of the fluence maps corresponding to the initial plans. In the case of static-beam IMRT, this also implies that the dose distribution of the averaged plan is the exact dosimetric average of the initial plans. In VMAT delivery, the dose distribution of the averaged plan is a close approximation of the dosimetric average of the initial plans. Results: The authors demonstrate the method on three Pareto optimal VMAT plans created for a demanding paraspinal case, where the tumor surrounds the spinal cord. The results show that the leaf averaged plans yield dose distributions that approximate the dosimetric averages of the precomputed Pareto optimal plans well. Conclusions: The proposed method enables the navigation of deliverable Pareto optimal plans directly, i.e., interactive multicriteria exploration of deliverable sliding window IMRT and VMAT plans, eliminating the need for a sequencing step after navigation and hence the dose degradation that is caused by such a sequencing step.
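
    For static-beam sliding-window delivery, the key identity is that the fluence at a point is proportional to the time between the leading leaf uncovering it and the trailing leaf covering it, so averaging those arrival times averages the fluence exactly. The toy check below assumes monotone, unidirectional leaf motion and unit dose rate.

```python
import numpy as np

# Leaf "trajectories" expressed as arrival times (s) of the leading and trailing
# leaf at each of 10 sample positions along the leaf travel, for two plans.
positions = np.arange(10)
lead_1, trail_1 = positions * 1.0, positions * 1.0 + np.linspace(2, 6, 10)
lead_2, trail_2 = positions * 1.5, positions * 1.5 + np.linspace(5, 3, 10)

fluence_1 = trail_1 - lead_1                # exposure time at each position (plan 1)
fluence_2 = trail_2 - lead_2                # exposure time at each position (plan 2)

# Average the leaf trajectories in time, then recompute the delivered fluence.
lead_avg = 0.5 * (lead_1 + lead_2)
trail_avg = 0.5 * (trail_1 + trail_2)
fluence_avg = trail_avg - lead_avg

# The fluence of the trajectory-averaged plan equals the average of the fluences.
assert np.allclose(fluence_avg, 0.5 * (fluence_1 + fluence_2))
print("trajectory-averaged fluence:", np.round(fluence_avg, 2))
```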

  15. Optimization and comparison of simultaneous and separate acquisition protocols for dual isotope myocardial perfusion SPECT.

    PubMed

    Ghaly, Michael; Links, Jonathan M; Frey, Eric C

    2015-07-07

    Dual-isotope simultaneous-acquisition (DISA) rest-stress myocardial perfusion SPECT (MPS) protocols offer a number of advantages over separate acquisition. However, crosstalk contamination due to scatter in the patient and interactions in the collimator degrade image quality. Compensation can reduce the effects of crosstalk, but does not entirely eliminate image degradations. Optimizing acquisition parameters could further reduce the impact of crosstalk. In this paper we investigate the optimization of the rest Tl-201 energy window width and relative injected activities using the ideal observer (IO), a realistic digital phantom population and Monte Carlo (MC) simulated Tc-99m and Tl-201 projections as a means to improve image quality. We compared performance on a perfusion defect detection task for Tl-201 acquisition energy window widths varying from 4 to 40 keV centered at 72 keV for a camera with a 9% energy resolution. We also investigated 7 different relative injected activities, defined as the ratio of Tc-99m and Tl-201 activities, while keeping the total effective dose constant at 13.5 mSv. For each energy window and relative injected activity, we computed the IO test statistics using a Markov chain Monte Carlo (MCMC) method for an ensemble of 1,620 triplets of fixed and reversible defect-present, and defect-absent noisy images modeling realistic background variations. The volume under the 3-class receiver operating characteristic (ROC) surface (VUS) was estimated and served as the figure of merit. For simultaneous acquisition, the IO suggested that relative Tc-to-Tl injected activity ratios of 2.6-5 and acquisition energy window widths of 16-22% were optimal. For separate acquisition, we observed a broad range of optimal relative injected activities from 2.6 to 12.1 and acquisition energy window of widths 16-22%. A negative correlation between Tl-201 injected activity and the width of the Tl-201 energy window was observed in these ranges. The results also suggested that DISA methods could potentially provide image quality as good as that obtained with separate acquisition protocols. We compared observer performance for the optimized protocols and the current clinical protocol using separate acquisition. The current clinical protocols provided better performance at a cost of injecting the patient with approximately double the injected activity of Tc-99m and Tl-201, resulting in substantially increased radiation dose.

  16. Context Switching with Multiple Register Windows: A RISC Performance Study

    NASA Technical Reports Server (NTRS)

    Konsek, Marion B.; Reed, Daniel A.; Watcharawittayakul, Wittaya

    1987-01-01

    Although previous studies have shown that a large file of overlapping register windows can greatly reduce procedure call/return overhead, the effects of register windows in a multiprogramming environment are poorly understood. This paper investigates the performance of multiprogrammed, reduced instruction set computers (RISCs) as a function of window management strategy. Using an analytic model that reflects context switch and procedure call overheads, we analyze the performance of simple, linearly self-recursive programs. For more complex programs, we present the results of a simulation study. These studies show that a simple strategy that saves all windows prior to a context switch, but restores only a single window following a context switch, performs near optimally.

  17. SU-G-IeP4-12: Performance of In-111 Coincident Gamma-Ray Counting: A Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pahlka, R; Kappadath, S; Mawlawi, O

    2016-06-15

    Purpose: The decay of In-111 results in a non-isotropic gamma-ray cascade, which is normally imaged using a gamma camera. Creating images with a gamma camera using coincident gamma-rays from In-111 has not been previously studied. Our objective was to explore the feasibility of imaging this cascade as coincidence events and to determine the optimal timing resolution and source activity using Monte Carlo simulations. Methods: GEANT4 was used to simulate the decay of the In-111 nucleus and to model the gamma camera. Each photon emission was assigned a timestamp, and the time delay and angular separation for the second gamma-ray in the cascade was consistent with the known intermediate state half-life of 85ns. The gamma-rays are transported through a model of a Siemens dual head Symbia “S” gamma camera with a 5/8-inch thick crystal and medium energy collimators. A true coincident event was defined as a single 171keV gamma-ray followed by a single 245keV gamma-ray within a specified time window (or vice versa). Several source activities (ranging from 10uCi to 5mCi) with and without incorporation of background counts were then simulated. Each simulation was analyzed using varying time windows to assess random events. The noise equivalent count rate (NECR) was computed based on the number of true and random counts for each combination of activity and time window. No scatter events were assumed since sources were simulated in air. Results: As expected, increasing the timing window increased the total number of observed coincidences albeit at the expense of true coincidences. A timing window range of 200–500ns maximizes the NECR at clinically-used source activities. The background rate did not significantly alter the maximum NECR. Conclusion: This work suggests coincident measurements of In-111 gamma-ray decay can be performed with commercial gamma cameras at clinically-relevant activities. Work is ongoing to assess useful clinical applications.
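
    The trade-off being optimized here can be sketched with a few lines of numpy: widening the coincidence window captures more of the 85 ns cascade delays (more trues) while admitting randoms in proportion to the window, and NECR = T^2/(T+R) peaks at an intermediate window. The detection efficiency, singles-rate scaling, and activity below are invented placeholders, not the GEANT4 camera model of the study.

```python
import numpy as np

HALF_LIFE_NS = 85.0   # intermediate-state half-life of the In-111 cascade

def necr(activity_bq, window_ns, eps=0.02, singles_per_bq=0.05):
    """NECR for one (activity, timing window) pair; rate constants are illustrative."""
    frac_in_window = 1.0 - 2.0 ** (-window_ns / HALF_LIFE_NS)   # cascade delays inside the window
    trues = (eps ** 2) * activity_bq * frac_in_window
    singles = singles_per_bq * activity_bq
    randoms = 2.0 * window_ns * 1e-9 * singles ** 2              # randoms grow linearly with the window
    return trues ** 2 / (trues + randoms)

windows = np.arange(50, 1050, 50)          # candidate timing windows (ns)
activity = 3.7e6                           # ~100 uCi, illustrative
best = windows[np.argmax([necr(activity, w) for w in windows])]
print(f"NECR-maximizing window at this activity: {best} ns")
```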

  18. Geometric subspace methods and time-delay embedding for EEG artifact removal and classification.

    PubMed

    Anderson, Charles W; Knight, James N; O'Connor, Tim; Kirby, Michael J; Sokolov, Artem

    2006-06-01

    Generalized singular-value decomposition is used to separate multichannel electroencephalogram (EEG) into components found by optimizing a signal-to-noise quotient. These components are used to filter out artifacts. Short-time principal components analysis of time-delay embedded EEG is used to represent windowed EEG data to classify EEG according to which mental task is being performed. Examples are presented of the filtering of various artifacts and results are shown of classification of EEG from five mental tasks using committees of decision trees.

  19. 76 FR 35425 - Notice of Intent to Grant Partially Exclusive License of the United States Patent Application No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    ... like. For observation of the sample, embodiments provide a thin-membrane window etched in the center of... windows. This gap may be adjusted by employing spacers. Alternatively, the thickness of a film established... optimize resolution each window may have a thickness on the order of 50 nm and the gap may be on the order...

  20. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam's Window.

    PubMed

    Onorante, Luca; Raftery, Adrian E

    2016-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam's window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods.

  1. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam’s Window*

    PubMed Central

    Onorante, Luca; Raftery, Adrian E.

    2015-01-01

    Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam’s window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods. PMID:26917859

  2. The Transcranial Doppler Sonography for Optimal Monitoring and Optimization of Cerebral Perfusion in Aortic Arch Surgery: A Case Series.

    PubMed

    Ghazy, Tamer; Darwisch, Ayham; Schmidt, Torsten; Nguyen, Phong; Elmihy, Sohaila; Fajfrova, Zuzana; Zickmüller, Claudia; Matschke, Klaus; Kappert, Utz

    2017-06-16

    To analyze the feasibility and advantages of transcranial doppler sonography (TCD) for monitoring and optimization of selective cerebral perfusion (SCP) in aortic arch surgery. From April 2013 to April 2014, nine patients with extensive aortic pathology underwent surgery under moderate hypothermic cardiac arrest with unilateral antegrade SCP under TCD monitoring in our institution. Adequate sonographic window and visualization of circle of Willis were to be confirmed. Intraoperatively, a cerebral cross-filling of the contralateral cerebral arteries on the unilateral SCP was to be confirmed with TCD. If no cross-filling was confirmed, an optimization of the SCP was performed via increasing cerebral flow and increasing PCO2. If not successful, the SCP was to be switched to bilateral perfusion. Air bubble hits were recorded at the termination of SCP. A sonographic window was confirmed in all patients. Procedural success was 100%. The mean operative time was 298 ± 89 minutes. Adequate cross-filling was confirmed in 8 patients. In 1 patient, inadequate cross-filling was detected by TCD and an optimization of cerebral flow was necessary, which was successfully confirmed by TCD. There was no conversion to bilateral perfusion. Extensive air bubble hits were confirmed in 1 patient, who suffered a postoperative stroke. The 30-day mortality rate was 0. Conclusion: The TCD is feasible for cerebral perfusion monitoring in aortic surgery. It enables a confirmation of adequacy of cerebral perfusion strategy or the need for its optimization. Documentation of calcific or air-bubble hits might add insight into patients suffering postoperative neurological deficits.

  3. Design optimization of highly asymmetrical layouts by 2D contour metrology

    NASA Astrophysics Data System (ADS)

    Hu, C. M.; Lo, Fred; Yang, Elvis; Yang, T. H.; Chen, K. C.

    2018-03-01

    As design pitch shrinks to the resolution limit of up-to-date optical lithography technology, the Critical Dimension (CD) variation tolerance has been dramatically decreased to ensure the functionality of the device. One of the critical challenges associated with the narrower CD tolerance over the whole chip area is the proximity effect control on asymmetrical layout environments. To fulfill the tight CD control of complex features, the Critical Dimension Scanning Electron Microscope (CD-SEM) based measurement results for qualifying process window and establishing the Optical Proximity Correction (OPC) model become insufficient, thus the 2D contour extraction technique [1-5] has become an increasingly important approach for complementing the insufficiencies of the traditional CD measurement algorithm. To alleviate the long cycle time and high cost penalties for product verification, manufacturing requirements are best handled at the design stage to improve the quality and yield of ICs. In this work, an in-house 2D contour extraction platform was established for layout design optimization of a 39nm half-pitch Self-Aligned Double Patterning (SADP) process layer. Combined with the adoption of the Process Variation Band Index (PVBI), the contour extraction platform enables layout optimization speedup compared to traditional methods. The capability of the 2D contour extraction platform to identify and handle lithography hotspots in complex layout environments allows process window aware layout optimization to meet the manufacturing requirements.

  4. New clinical insights for transiently evoked otoacoustic emission protocols.

    PubMed

    Hatzopoulos, Stavros; Grzanka, Antoni; Martini, Alessandro; Konopka, Wieslaw

    2009-08-01

    The objective of the study was to optimize the area of a time-frequency analysis and then investigate any stable patterns in the time-frequency structure of otoacoustic emissions in a population of 152 healthy adults sampled over one year. TEOAE recordings were collected from 302 ears in subjects presenting normal hearing and normal impedance values. The responses were analyzed by the Wigner-Ville distribution (WVD). The TF region of analysis was optimized by examining the energy content of various rectangular and triangular TF regions. The TEOAE components from the initial recordings and the recordings made 12 months later were compared in the optimized TF region. The best region for TF analysis was identified with base point 1 at 2.24 ms and 2466 Hz, base point 2 at 6.72 ms and 2466 Hz, and the top point at 2.24 ms and 5250 Hz. Correlation indices from the optimized TF region were significantly higher than the traditional indices in the selected time window. An analysis of the TF data within a 12-month period indicated an 85% TEOAE component similarity in 90% of the tested subjects.

  5. Antireflective nanostructures for CPV

    NASA Astrophysics Data System (ADS)

    Buencuerpo, Jeronimo; Torne, Lorena; Alvaro, Raquel; Llorens, Jose Manuel; Dotor, María Luisa; Ripalda, Jose Maria

    2017-09-01

    We have optimized a periodic antireflective nanostructure. The optimal design has a theoretical broadband reflectivity of 0.54% on top of GaInP with an AlInP window layer. Preliminary fabrication attempts have been carried out on top of GaAs substrates. Due to the lack of a window layer, and the need to fine tune the fabrication process, the fabricated nanostructures have a reflectivity of 3.1%, but this is already significantly lower than the theoretical broadband reflectance of standard MgF2/ZnS bilayers (4.5%).

  6. Ionospheric gravity wave measurements with the USU dynasonde

    NASA Technical Reports Server (NTRS)

    Berkey, Frank T.; Deng, Jun Yuan

    1992-01-01

    A method for the measurement of ionospheric Gravity Waves (GW) using the USU Dynasonde is outlined. This method consists of a series of individual procedures, which includes functions for data acquisition, adaptive scaling, polarization discrimination, interpolation and extrapolation, digital filtering, windowing, spectrum analysis, GW detection, and graphics display. Concepts of system theory are applied to treat the ionosphere as a system. An adaptive ionogram scaling method was developed for automatically extracting ionogram echo traces from noisy raw sounding data. The method uses the well known Least Mean Square (LMS) algorithm to form a stochastic optimal estimate of the echo trace which is then used to control a moving window. The window tracks the echo trace, simultaneously eliminating the noise and interference. Experimental results show that the proposed method functions as designed. Case studies which extract GW from ionosonde measurements were carried out using the techniques described. Geophysically significant events were detected and the resultant processed results are illustrated graphically. This method was also developed with real-time implementation in mind.
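
    A toy sketch of the adaptive trace-tracking idea described above is given below: a scalar LMS-style update forms a running estimate of the echo-trace position, and a moving acceptance window centered on that estimate rejects noise and interference samples. The step size, window half-width, and synthetic data are illustrative assumptions; the actual method operates on full ionograms rather than a single noisy trace.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
true_trace = 200 + 30 * np.sin(np.linspace(0, 3 * np.pi, n))   # virtual-height trace vs. frequency bin
noisy = true_trace + 5 * rng.standard_normal(n)
spikes = rng.random(n) < 0.1                                    # sporadic interference
noisy[spikes] += 80 * rng.standard_normal(spikes.sum())

mu = 0.2                  # LMS step size
half_window = 25.0        # half-width of the moving acceptance window
estimate = noisy[0]
tracked = np.empty(n)
for k, sample in enumerate(noisy):
    if abs(sample - estimate) <= half_window:    # sample falls inside the tracking window: accept
        estimate += mu * (sample - estimate)     # scalar LMS update toward the accepted sample
    tracked[k] = estimate                        # outside the window: treat as interference, hold estimate
```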

  7. Robust Dynamic Multi-objective Vehicle Routing Optimization Method.

    PubMed

    Guo, Yi-Nan; Cheng, Jian; Luo, Sha; Gong, Dun-Wei

    2017-03-21

    For dynamic multi-objective vehicle routing problems, the waiting time of vehicles, the number of serving vehicles, and the total distance of routes are normally considered as the optimization objectives. In addition to the above objectives, this paper focuses on fuel consumption, which leads to environmental pollution and energy consumption. Considering the vehicles' load and driving distance, a corresponding carbon emission model was built and set as an optimization objective. Dynamic multi-objective vehicle routing problems with hard time windows and randomly appearing dynamic customers were subsequently modeled. In existing planning methods, when a new service demand comes up, a global vehicle routing optimization method is triggered to find the optimal routes for non-served customers, which is time-consuming. Therefore, a robust dynamic multi-objective vehicle routing method with two phases is proposed. Three highlights of the novel method are: (i) after finding optimal robust virtual routes for all customers by adopting multi-objective particle swarm optimization in the first phase, static vehicle routes for static customers are formed by removing all dynamic customers from the robust virtual routes in the next phase; (ii) dynamically appearing customers are appended to the routes to be served according to their service time and the vehicles' statuses, and global vehicle routing optimization is triggered only when no suitable locations can be found for dynamic customers; (iii) a metric measuring the algorithm's robustness is given. The statistical results indicate that the routes obtained by the proposed method have better stability and robustness, but may be sub-optimal. Moreover, time-consuming global vehicle routing optimization is avoided as dynamic customers appear.

  8. Optimizing Timing of Immunotherapy Improves Control of Tumors by Hypofractionated Radiation Therapy

    PubMed Central

    Baird, Jason R.; Savage, Talicia; Cottam, Benjamin; Friedman, David; Bambina, Shelly; Messenheimer, David J.; Fox, Bernard; Newell, Pippa; Bahjat, Keith S.; Gough, Michael J.; Crittenden, Marka R.

    2016-01-01

    The anecdotal reports of promising results seen with immunotherapy and radiation in advanced malignancies have prompted several trials combining immunotherapy and radiation. However, the ideal timing of immunotherapy with radiation has not been clarified. Tumor bearing mice were treated with 20Gy radiation delivered only to the tumor combined with either anti-CTLA4 antibody or anti-OX40 agonist antibody. Immunotherapy was delivered at a single timepoint around radiation. Surprisingly, the optimal timing of these therapies varied. Anti-CTLA4 was most effective when given prior to radiation therapy, in part due to regulatory T cell depletion. Administration of anti-OX40 agonist antibody was optimal when delivered one day following radiation during the post-radiation window of increased antigen presentation. Combination treatment of anti-CTLA4, radiation, and anti-OX40 using the ideal timing in a transplanted spontaneous mammary tumor model demonstrated tumor cures. These data demonstrate that the combination of immunotherapy and radiation results in improved therapeutic efficacy, and that the ideal timing of administration with radiation is dependent on the mechanism of action of the immunotherapy utilized. PMID:27281029

  9. Predicting Visual Distraction Using Driving Performance Data

    PubMed Central

    Kircher, Katja; Ahlstrom, Christer

    2010-01-01

    Behavioral variables are often used as performance indicators (PIs) of visual or internal distraction induced by secondary tasks. The objective of this study is to investigate whether visual distraction can be predicted by driving performance PIs in a naturalistic setting. Visual distraction is here defined by a gaze based real-time distraction detection algorithm called AttenD. Seven drivers used an instrumented vehicle for one month each in a small scale field operational test. For each of the visual distraction events detected by AttenD, seven PIs such as steering wheel reversal rate and throttle hold were calculated. Corresponding data were also calculated for time periods during which the drivers were classified as attentive. For each PI, means between distracted and attentive states were calculated using t-tests for different time-window sizes (2 – 40 s), and the window width with the smallest resulting p-value was selected as optimal. Based on the optimized PIs, logistic regression was used to predict whether the drivers were attentive or distracted. The logistic regression resulted in predictions which were 76 % correct (sensitivity = 77 % and specificity = 76 %). The conclusion is that there is a relationship between behavioral variables and visual distraction, but the relationship is not strong enough to accurately predict visual driver distraction. Instead, behavioral PIs are probably best suited as complementary to eye tracking based algorithms in order to make them more accurate and robust. PMID:21050615
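
    A compact sketch of the two-stage analysis described above is shown here with synthetic data standing in for the driving performance indicators: for each candidate window size the distracted and attentive means of a PI are compared with a t-test and the window with the smallest p-value is kept, after which a logistic regression predicts the state from the windowed PI. The availability of scipy and scikit-learn, the synthetic separation profile, and the sample sizes are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
window_sizes = np.arange(2, 42, 2)                      # candidate window widths (s)

def pi_samples(window, state, n=200):
    """Synthetic stand-in for a PI (e.g., steering reversal rate) aggregated over a window."""
    separation = np.exp(-((window - 12) ** 2) / 50.0)   # some window sizes separate states better
    return rng.normal(loc=state * separation, scale=1.0, size=n)

# Stage 1: pick the window size whose t-test p-value is smallest.
p_values = [ttest_ind(pi_samples(w, 0), pi_samples(w, 1)).pvalue for w in window_sizes]
best_window = window_sizes[int(np.argmin(p_values))]

# Stage 2: logistic regression on the PI computed with the optimized window.
X = np.concatenate([pi_samples(best_window, 0), pi_samples(best_window, 1)]).reshape(-1, 1)
y = np.concatenate([np.zeros(200), np.ones(200)])
clf = LogisticRegression().fit(X, y)
print("best window (s):", best_window, " training accuracy:", clf.score(X, y))
```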

  10. Lithographic process window optimization for mask aligner proximity lithography

    NASA Astrophysics Data System (ADS)

    Voelkel, Reinhard; Vogler, Uwe; Bramati, Arianna; Erdmann, Andreas; Ünal, Nezih; Hofmann, Ulrich; Hennemeyer, Marc; Zoberbier, Ralph; Nguyen, David; Brugger, Juergen

    2014-03-01

    We introduce a complete methodology for process window optimization in proximity mask aligner lithography. The commercially available lithography simulation software LAB from GenISys GmbH was used for simulation of light propagation and 3D resist development. The methodology was tested for the practical example of lines and spaces, 5 micron half-pitch, printed in a 1 micron thick layer of AZ® 1512HS1 positive photoresist on a silicon wafer. A SUSS MicroTec MA8 mask aligner, equipped with MO Exposure Optics®, was used in simulation and experiment. MO Exposure Optics® is the latest generation of illumination systems for mask aligners. MO Exposure Optics® provides telecentric illumination and excellent light uniformity over the full mask field. MO Exposure Optics® allows the lithography engineer to freely shape the angular spectrum of the illumination light (customized illumination), which is a mandatory requirement for process window optimization. Three different illumination settings have been tested for proximity gaps of 0 to 100 microns. The results obtained prove that the introduced process window methodology is a major step toward more robust processes in mask aligner lithography. The most remarkable outcome of the presented study is that a smaller exposure gap does not automatically lead to better print results in proximity lithography - contrary to what the "good instinct" of a lithographer would expect. With more than 5'000 mask aligners installed in research and industry worldwide, the proposed process window methodology might have significant impact on yield improvement and cost saving in industry.

  11. Early Warning for Large Magnitude Earthquakes: Is it feasible?

    NASA Astrophysics Data System (ADS)

    Zollo, A.; Colombelli, S.; Kanamori, H.

    2011-12-01

    The mega-thrust, Mw 9.0, 2011 Tohoku earthquake has re-opened the discussion among the scientific community about the effectiveness of Earthquake Early Warning (EEW) systems when applied to such large events. Many EEW systems are now under testing or development worldwide and most of them are based on the real-time measurement of ground motion parameters in a few-second window after the P-wave arrival. Currently, we are using the initial Peak Displacement (Pd) and the Predominant Period (τc), among other parameters, to rapidly estimate the earthquake magnitude and damage potential. A well-known problem about the real-time estimation of the magnitude is the parameter saturation. Several authors have shown that the scaling laws between early warning parameters and magnitude are robust and effective up to magnitude 6.5-7; the correlation, however, has not yet been verified for larger events. The Tohoku earthquake occurred near the East coast of Honshu, Japan, on the subduction boundary between the Pacific and the Okhotsk plates. The high quality Kik- and K- networks provided a large quantity of strong motion records of the mainshock, with a wide azimuthal coverage both along the Japan coast and inland. More than 300 3-component accelerograms were available, with an epicentral distance ranging from about 100 km up to more than 500 km. This earthquake thus presents an optimal case study for testing the physical bases of early warning and for investigating the feasibility of a real-time estimation of earthquake size and damage potential even for M > 7 earthquakes. In the present work we used the acceleration waveform data of the main shock for stations along the coast, up to 200 km epicentral distance. We measured the early warning parameters, Pd and τc, within different time windows, starting from 3 seconds, and expanding the testing time window up to 30 seconds. The aim is to verify the correlation of these parameters with Peak Ground Velocity and Magnitude, respectively, as a function of the length of the P-wave window. The entire rupture process of the Tohoku earthquake lasted more than 120 seconds, as shown by the source time functions obtained by several authors. When a 3-second window is used to measure Pd and τc, the result is an obvious underestimation of the event size and final PGV. However, as the time window increases up to 27-30 seconds, the measured values of Pd and τc become comparable with those expected for a magnitude M≥8.5 earthquake, according to the τc vs. M and the PGV vs. Pd relationships obtained in a previous work. Since we did not observe any saturation effect for the predominant period and peak displacement measured within a 30-second P-wave window, we infer that, at least from a theoretical point of view, the estimation of earthquake damage potential through the early warning parameters is still feasible for large events, provided that a longer time window is used for parameter measurement. The off-line analysis of the Tohoku event records shows that reliable estimations of the damage potential could have been obtained 40-50 seconds after the origin time, by updating the measurements of the early warning parameters in progressively enlarged P-wave time windows from 3 to 30 seconds.
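
    As a small sketch of how the two early-warning parameters discussed above are commonly computed from a vertical displacement record u(t) in an expanding P-wave window: Pd is the peak absolute displacement and, in one standard definition, τc = 2π·sqrt(∫u² dt / ∫u̇² dt) over the window. The waveform below is synthetic, no instrument correction or filtering is applied, and the sampling rate is an assumption; it is meant only to show the expanding-window measurement, not to reproduce the study's processing.

```python
import numpy as np

def pd_tauc(displacement, dt, window_s):
    """Peak displacement Pd and predominant period tau_c in a P-wave window of given length."""
    n = int(window_s / dt)
    u = displacement[:n]
    du = np.gradient(u, dt)
    pd = np.max(np.abs(u))
    tau_c = 2.0 * np.pi * np.sqrt(np.trapz(u ** 2, dx=dt) / np.trapz(du ** 2, dx=dt))
    return pd, tau_c

# Synthetic displacement record sampled at 100 Hz (placeholder for a real strong-motion trace).
dt = 0.01
t = np.arange(0.0, 60.0, dt)
u = 0.02 * np.sin(2 * np.pi * t / 8.0) * (1 - np.exp(-t / 10.0))   # metres, illustrative

for window_s in (3, 10, 20, 30):          # expanding P-wave windows, as in the study
    pd, tau_c = pd_tauc(u, dt, window_s)
    print(f"{window_s:2d} s window: Pd = {pd:.4f} m, tau_c = {tau_c:.2f} s")
```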

  12. Data-driven optimal binning for respiratory motion management in PET.

    PubMed

    Kesner, Adam L; Meier, Joseph G; Burckhardt, Darrell D; Schwartz, Jazmin; Lynch, David A

    2018-01-01

    Respiratory gating has been used in PET imaging to reduce the amount of image blurring caused by patient motion. Optimal binning is an approach for using the motion-characterized data by binning it into a single, easy to understand/use, optimal bin. To date, optimal binning protocols have utilized externally driven motion characterization strategies that have been tuned with population-derived assumptions and parameters. In this work, we are proposing a new strategy with which to characterize motion directly from a patient's gated scan, and use that signal to create a patient/instance-specific optimal bin image. Two hundred and nineteen phase-gated FDG PET scans, acquired using data-driven gating as described previously, were used as the input for this study. For each scan, a phase-amplitude motion characterization was generated and normalized using principal component analysis. A patient-specific "optimal bin" window was derived using this characterization, via methods that mirror traditional optimal window binning strategies. The resulting optimal bin images were validated by correlating quantitative and qualitative measurements in the population of PET scans. In 53% (n = 115) of the image population, the optimal bin was determined to include 100% of the image statistics. In the remaining images, the optimal binning windows averaged 60% of the statistics and ranged between 20% and 90%. Tuning the algorithm, through a single acceptance window parameter, allowed for adjustments of the algorithm's performance in the population toward conservation of motion or reduced noise, enabling users to incorporate their definition of optimal. In the population of images that were deemed appropriate for segregation, average lesion SUVmax values were 7.9, 8.5, and 9.0 for the nongated, optimal bin, and gated images, respectively. The Pearson correlation of FWHM measurements between the optimal bin images and the gated images was better than that with the nongated images (0.89 and 0.85, respectively). Generally, optimal bin images had better resolution than the nongated images and better noise characteristics than the gated images. We extended the concept of optimal binning to a data-driven form, updating a traditionally one-size-fits-all approach to a conformal one that supports adaptive imaging. This automated strategy was implemented easily within a large population and encapsulated motion information in an easy to use 3D image. Its simplicity and practicality may make this, or similar approaches, ideal for use in clinical settings. © 2017 American Association of Physicists in Medicine.

  13. Runtime Speculative Software-Only Fault Tolerance

    DTIC Science & Technology

    2012-06-01

    reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing

  14. Enlisted Personnel Allocation System

    DTIC Science & Technology

    1989-03-01

    hierarchy is further subdivided into two characteristic groupings: intelligence qualifications and physical qualifications. ...weighted as 30% of the applicant's Intelligence Qualifications score). As shown in Figure 6, a step function generates a score based on the... There is no artificial time window imposed on any MOS. Any open training date within the full DEP horizon may be recommended by the optimization

  15. Dynamic Network Formation Using Ant Colony Optimization

    DTIC Science & Technology

    2009-03-01

    backhauls, VRP with pick-up and delivery, VRP with satellite facilities, and VRP with time windows (Murata & Itai, 2005). The general vehicle...given route is only visited once. The objective of the basic problem is to minimize a total cost as follows (Murata & Itai, 2005): $\min \sum_{m=1}^{M} c_m$... Problem based on Ant Colony System. Second International Workshop on Freight Transportation and Logistics. Palermo, Italy. Murata, T., & Itai, R. (2005

  16. Visualization Development of the Ballistic Threat Geospatial Optimization

    DTIC Science & Technology

    2015-07-01

    topographic globes, Keyhole Markup Language (KML), and Collada files. World Wind gives the user the ability to import 3-D models and navigate...present. After the first person view window is closed, the images stored in memory are then converted to a QuickTime movie (.MOV). The video will be...processing unit HPC high-performance computing JOGL Java implementation of OpenGL KML Keyhole Markup Language NASA National Aeronautics and Space

  17. Optimizing the admission time of outbound trucks entering a cross-dock with uniform arrival time by considering a queuing model

    NASA Astrophysics Data System (ADS)

    Motaghedi-Larijani, Arash; Aminnayeri, Majid

    2017-03-01

    Cross-docking is a supply-chain strategy that can reduce transportation and inventory costs. This study is motivated by a fruit and vegetable distribution centre in Tehran, which has cross-docks and a limited time to admit outbound trucks. In this article, outbound trucks are assumed to arrive at a cross-dock with a single outbound door according to a uniform distribution on (0, L). The total number of assigned trucks is constant and the loading time is fixed. A queuing model is modified for this situation and the expected waiting time of each customer is calculated. Then, a curve for the waiting time is obtained. Finally, the length of the time window L is optimized to minimize the total cost, which includes the waiting time of the trucks and the admission cost of the cross-dock. Some illustrative examples of cross-docking are presented and solved using the proposed method.
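
    The trade-off in this model can be illustrated with a short Monte Carlo sketch: N trucks arrive uniformly on (0, L), loading at the single door takes a fixed time, and the total cost combines expected truck waiting time with an admission cost that here is assumed proportional to L. The cost coefficients, truck count, and linear admission-cost form are illustrative assumptions, not the paper's exact model or its analytical solution.

```python
import numpy as np

rng = np.random.default_rng(7)

def expected_waiting(L, n_trucks=30, load_time=0.05, n_sim=2000):
    """Monte Carlo estimate of total expected waiting time at a single-door cross-dock."""
    total = 0.0
    for _ in range(n_sim):
        arrivals = np.sort(rng.uniform(0.0, L, n_trucks))
        finish = 0.0
        for a in arrivals:
            start = max(a, finish)            # wait until the door is free
            total += start - a
            finish = start + load_time
    return total / n_sim

def total_cost(L, wait_cost=100.0, admission_cost_per_hour=40.0):
    return wait_cost * expected_waiting(L) + admission_cost_per_hour * L

candidates = np.arange(1.0, 6.5, 0.5)         # candidate admission-window lengths (hours)
best_L = min(candidates, key=total_cost)
print("cost-minimizing admission window length:", best_L, "hours")
```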

  18. Dynamic Granger-Geweke causality modeling with application to interictal spike propagation

    PubMed Central

    Lin, Fa-Hsuan; Hara, Keiko; Solo, Victor; Vangel, Mark; Belliveau, John W.; Stufflebeam, Steven M.; Hamalainen, Matti S.

    2010-01-01

    A persistent problem in developing plausible neurophysiological models of perception, cognition, and action is the difficulty of characterizing the interactions between different neural systems. Previous studies have approached this problem by estimating causal influences across brain areas activated during cognitive processing using Structural Equation Modeling and, more recently, with Granger-Geweke causality. While SEM is complicated by the need for a priori directional connectivity information, the temporal resolution of dynamic Granger-Geweke estimates is limited because the underlying autoregressive (AR) models assume stationarity over the period of analysis. We have developed a novel optimal method for obtaining data-driven directional causality estimates with high temporal resolution in both time and frequency domains. This is achieved by simultaneously optimizing the length of the analysis window and the chosen AR model order using the SURE criterion. Dynamic Granger-Geweke causality in time and frequency domains is subsequently calculated within a moving analysis window. We tested our algorithm by calculating the Granger-Geweke causality of epileptic spike propagation from the right frontal lobe to the left frontal lobe. The results quantitatively suggested the epileptic activity at the left frontal lobe was propagated from the right frontal lobe, in agreement with the clinical diagnosis. Our novel computational tool can be used to help elucidate complex directional interactions in the human brain. PMID:19378280

  19. Acquisition setting optimization and quantitative imaging for 124I studies with the Inveon microPET-CT system.

    PubMed

    Anizan, Nadège; Carlier, Thomas; Hindorf, Cecilia; Barbet, Jacques; Bardiès, Manuel

    2012-02-13

    Noninvasive multimodality imaging is essential for preclinical evaluation of the biodistribution and pharmacokinetics of radionuclide therapy and for monitoring tumor response. Imaging with nonstandard positron-emission tomography [PET] isotopes such as 124I is promising in that context but requires accurate activity quantification. The decay scheme of 124I implies an optimization of both acquisition settings and correction processing. The PET scanner investigated in this study was the Inveon PET/CT system dedicated to small animal imaging. The noise equivalent count rate [NECR], the scatter fraction [SF], and the gamma-prompt fraction [GF] were used to determine the best acquisition parameters for mouse- and rat-sized phantoms filled with 124I. An image-quality phantom as specified by the National Electrical Manufacturers Association NU 4-2008 protocol was acquired and reconstructed with two-dimensional filtered back projection, 2D ordered-subset expectation maximization [2DOSEM], and 3DOSEM with maximum a posteriori [3DOSEM/MAP] algorithms, with and without attenuation correction, scatter correction, and gamma-prompt correction (weighted uniform distribution subtraction). Optimal energy windows were established for the rat phantom (390 to 550 keV) and the mouse phantom (400 to 590 keV) by combining the NECR, SF, and GF results. The coincidence time window had no significant impact regarding the NECR curve variation. Activity concentration of 124I measured in the uniform region of an image-quality phantom was underestimated by 9.9% for the 3DOSEM/MAP algorithm with attenuation and scatter corrections, and by 23% with the gamma-prompt correction. Attenuation, scatter, and gamma-prompt corrections decreased the residual signal in the cold insert. The optimal energy windows were chosen with the NECR, SF, and GF evaluation. Nevertheless, an image quality and an activity quantification assessment were required to establish the most suitable reconstruction algorithm and corrections for 124I small animal imaging.

  20. Part-Task Simulation of Synthetic and Enhanced Vision Concepts for Lunar Landing

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Bailey, Randall E.; Jackson, E. Bruce; Williams, Steven P.; Kramer, Lynda J.; Barnes, James R.

    2010-01-01

    During Apollo, the constraints placed by the design of the Lunar Module (LM) window for crew visibility and landing trajectory were a major problem. Lunar landing trajectories were tailored to provide crew visibility using nearly 70 degrees look-down angle from the canted LM windows. Apollo landings were scheduled only at specific times and locations to provide optimal sunlight on the landing site. The complications of trajectory design and crew visibility are still a problem today. Practical vehicle designs for lunar lander missions using optimal or near-optimal fuel trajectories render the natural vision of the crew from windows inadequate for the approach and landing task. Further, the sun angles for the desirable landing areas in the lunar polar regions create visually powerful, season-long shadow effects. Fortunately, Synthetic and Enhanced Vision (S/EV) technologies, conceived and developed in the aviation domain, may provide solutions to this visibility problem and enable additional benefits for safer, more efficient lunar operations. Piloted simulation evaluations have been conducted to assess the handling qualities of the various lunar landing concepts, including the influence of cockpit displays and the informational data and formats. Evaluation pilots flew various landing scenarios with S/EV displays. For some of the evaluation trials, an eye glasses-mounted, monochrome monocular display, coupled with head tracking, was worn. The head-worn display scene consisted of S/EV fusion concepts. The results of this experiment showed that a head-worn system did not increase the pilot's workload when compared to using just the head-down displays. As expected, the head-worn system did not provide an increase in performance measures. Some pilots commented that the head-worn system provided greater situational awareness compared to just head-down displays.

  1. Part-task simulation of synthetic and enhanced vision concepts for lunar landing

    NASA Astrophysics Data System (ADS)

    Arthur, Jarvis J., III; Bailey, Randall E.; Jackson, E. Bruce; Barnes, James R.; Williams, Steven P.; Kramer, Lynda J.

    2010-04-01

    During Apollo, the constraints placed by the design of the Lunar Module (LM) window for crew visibility and landing trajectory were "a major problem." Lunar landing trajectories were tailored to provide crew visibility using nearly 70 degrees look-down angle from the canted LM windows. Apollo landings were scheduled only at specific times and locations to provide optimal sunlight on the landing site. The complications of trajectory design and crew visibility are still a problem today. Practical vehicle designs for lunar lander missions using optimal or near-optimal fuel trajectories render the natural vision of the crew from windows inadequate for the approach and landing task. Further, the sun angles for the desirable landing areas in the lunar polar regions create visually powerful, season-long shadow effects. Fortunately, Synthetic and Enhanced Vision (S/EV) technologies, conceived and developed in the aviation domain, may provide solutions to this visibility problem and enable additional benefits for safer, more efficient lunar operations. Piloted simulation evaluations have been conducted to assess the handling qualities of the various lunar landing concepts, including the influence of cockpit displays and the informational data and formats. Evaluation pilots flew various landing scenarios with S/EV displays. For some of the evaluation trials, an eye glasses-mounted, monochrome monocular display, coupled with head tracking, was worn. The head-worn display scene consisted of S/EV fusion concepts. The results of this experiment showed that a head-worn system did not increase the pilot's workload when compared to using just the head-down displays. As expected, the head-worn system did not provide an increase in performance measures. Some pilots commented that the head-worn system provided greater situational awareness compared to just head-down displays.

  2. Automated image segmentation-assisted flattening of atomic force microscopy images.

    PubMed

    Wang, Yuliang; Lu, Tongda; Li, Xiaolai; Wang, Huimin

    2018-01-01

    Atomic force microscopy (AFM) images normally exhibit various artifacts. As a result, image flattening is required prior to image analysis. To obtain optimized flattening results, foreground features are generally manually excluded using rectangular masks in image flattening, which is time consuming and inaccurate. In this study, a two-step scheme was proposed to achieve optimized image flattening in an automated manner. In the first step, the convex and concave features in the foreground were automatically segmented with accurate boundary detection. The extracted foreground features were taken as exclusion masks. In the second step, data points in the background were fitted as polynomial curves/surfaces, which were then subtracted from raw images to get the flattened images. Moreover, sliding-window-based polynomial fitting was proposed to process images with complex background trends. The working principle of the two-step image flattening scheme is presented, followed by the investigation of the influence of sliding-window size and polynomial fitting direction on the flattened images. Additionally, the role of image flattening in the morphological characterization and segmentation of AFM images was verified with the proposed method.
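
    Below is a line-by-line sketch of the second step described above: data points outside a foreground mask are fitted with a low-order polynomial and the fit is subtracted from each scan line. The mask here is a crude threshold rather than the paper's segmentation output, the polynomial order is an assumption, and the sliding-window variant for complex backgrounds is not shown.

```python
import numpy as np

def flatten_lines(image, mask, order=2):
    """Subtract a per-scan-line polynomial background fitted to unmasked (background) pixels.

    image : 2-D height map; mask : boolean array, True where foreground features were segmented.
    """
    flattened = np.empty_like(image, dtype=float)
    cols = np.arange(image.shape[1])
    for i, line in enumerate(image):
        bg = ~mask[i]
        coeffs = np.polyfit(cols[bg], line[bg], order)     # fit background points only
        flattened[i] = line - np.polyval(coeffs, cols)     # subtract fitted trend from the whole line
    return flattened

# Synthetic AFM-like image: tilted/bowed background plus a protruding feature.
rng = np.random.default_rng(3)
y, x = np.mgrid[0:256, 0:256]
background = 0.002 * x + 1e-5 * (x - 128) ** 2
feature = 5.0 * (((x - 100) ** 2 + (y - 90) ** 2) < 30 ** 2)
image = background + feature + 0.05 * rng.standard_normal((256, 256))
mask = image - background.mean() > 1.0        # crude stand-in for the segmentation step
flat = flatten_lines(image, mask)
```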

  3. Optimization and comparison of simultaneous and separate acquisition protocols for dual isotope myocardial perfusion SPECT

    PubMed Central

    Ghaly, Michael; Links, Jonathan M; Frey, Eric C

    2015-01-01

    Dual-isotope simultaneous-acquisition (DISA) rest-stress myocardial perfusion SPECT (MPS) protocols offer a number of advantages over separate acquisition. However, crosstalk contamination due to scatter in the patient and interactions in the collimator degrade image quality. Compensation can reduce the effects of crosstalk, but does not entirely eliminate image degradations. Optimizing acquisition parameters could further reduce the impact of crosstalk. In this paper we investigate the optimization of the rest Tl-201 energy window width and relative injected activities using the ideal observer (IO), a realistic digital phantom population and Monte Carlo (MC) simulated Tc-99m and Tl-201 projections as a means to improve image quality. We compared performance on a perfusion defect detection task for Tl-201 acquisition energy window widths varying from 4 to 40 keV centered at 72 keV for a camera with a 9% energy resolution. We also investigated 7 different relative injected activities, defined as the ratio of Tc-99m and Tl-201 activities, while keeping the total effective dose constant at 13.5 mSv. For each energy window and relative injected activity, we computed the IO test statistics using a Markov chain Monte Carlo (MCMC) method for an ensemble of 1,620 triplets of fixed and reversible defect-present, and defect-absent noisy images modeling realistic background variations. The volume under the 3-class receiver operating characteristic (ROC) surface (VUS) was estimated and served as the figure of merit. For simultaneous acquisition, the IO suggested that relative Tc-to-Tl injected activity ratios of 2.6–5 and acquisition energy window widths of 16–22% were optimal. For separate acquisition, we observed a broad range of optimal relative injected activities from 2.6 to 12.1 and acquisition energy window of widths 16–22%. A negative correlation between Tl-201 injected activity and the width of the Tl-201 energy window was observed in these ranges. The results also suggested that DISA methods could potentially provide image quality as good as that obtained with separate acquisition protocols. We compared observer performance for the optimized protocols and the current clinical protocol using separate acquisition. The current clinical protocols provided better performance at a cost of injecting the patient with approximately double the injected activity of Tc-99m and Tl-201, resulting in substantially increased radiation dose. PMID:26083239

  4. Determination of the optimal tolerance for MLC positioning in sliding window and VMAT techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, V., E-mail: vhernandezmasgrau@gmail.com; Abella, R.; Calvo, J. F.

    2015-04-15

    Purpose: Several authors have recommended a 2 mm tolerance for multileaf collimator (MLC) positioning in sliding window treatments. In volumetric modulated arc therapy (VMAT) treatments, however, the optimal tolerance for MLC positioning remains unknown. In this paper, the authors present the results of a multicenter study to determine the optimal tolerance for both techniques. Methods: The procedure used is based on dynalog file analysis. The study was carried out using seven Varian linear accelerators from five different centers. Dynalogs were collected from over 100 000 clinical treatments and in-house software was used to compute the number of tolerance faults as a function of the user-defined tolerance. Thus, the optimal value for this tolerance, defined as the lowest achievable value, was investigated. Results: Dynalog files accurately predict the number of tolerance faults as a function of the tolerance value, especially for low fault incidences. All MLCs behaved similarly and the Millennium120 and the HD120 models yielded comparable results. In sliding window techniques, the number of beams with an incidence of hold-offs >1% rapidly decreases for a tolerance of 1.5 mm. In VMAT techniques, the number of tolerance faults sharply drops for tolerances around 2 mm. For a tolerance of 2.5 mm, less than 0.1% of the VMAT arcs presented tolerance faults. Conclusions: Dynalog analysis provides a feasible method for investigating the optimal tolerance for MLC positioning in dynamic fields. In sliding window treatments, the tolerance of 2 mm was found to be adequate, although it can be reduced to 1.5 mm. In VMAT treatments, the typically used 5 mm tolerance is excessively high. Instead, a tolerance of 2.5 mm is recommended.
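
    A small sketch of the dynalog-style counting the authors describe is given below: given per-control-point differences between planned and actual leaf positions, count, for each candidate tolerance, the fraction of beams whose hold-off incidence exceeds 1%. The deviation statistics are synthetic placeholders (a heavy-tailed distribution), not measured MLC data, and the array sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic "dynalog" deviations: |planned - actual| leaf positions (mm) for
# 200 beams x 60 leaves x 100 control points, heavier-tailed than Gaussian.
deviations = np.abs(rng.standard_t(df=4, size=(200, 60, 100))) * 0.4

tolerances_mm = np.arange(0.5, 3.1, 0.25)
for tol in tolerances_mm:
    # A "hold-off" is flagged whenever a deviation exceeds the user-defined tolerance;
    # a beam is counted as faulty when more than 1% of its samples are flagged.
    holdoff_fraction = (deviations > tol).mean(axis=(1, 2))
    faulty_beams = (holdoff_fraction > 0.01).mean()
    print(f"tolerance {tol:.2f} mm -> {100 * faulty_beams:.1f}% of beams with >1% hold-offs")
```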

  5. Studies and optimization of Pohang Light Source-II superconducting radio frequency system at stable top-up operation with beam current of 400 mA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joo, Youngdo, E-mail: Ydjoo77@postech.ac.kr; Yu, Inha; Park, Insoo

    After three years of upgrading work, the Pohang Light Source-II (PLS-II) is now successfully operating. The final quantitative goal of PLS-II is a top-up user-service operation with a beam current of 400 mA to be completed by the end of 2014. During the beam store test up to 400 mA in the storage ring (SR), it was observed that the vacuum pressure around the radio frequency (RF) window of the superconducting cavity rapidly increases above the interlock level, limiting the maximum beam current that can be stored. Although the available beam current is enhanced by setting a higher RF accelerating voltage, it is better to keep the RF accelerating voltage as low as possible during long top-up operation. We investigated the cause of the window vacuum pressure increment by studying the changes in the electric field distribution at the superconducting cavity and waveguide according to the beam current. In our simulation, an equivalent physical modeling was developed using a finite-difference time-domain code. The simulation revealed that the electric field amplitude at the RF window increases exponentially as the beam current increases, and this high electric field amplitude causes an RF breakdown at the RF window, which is accompanied by the rapid increase of window vacuum pressure. The RF accelerating voltage of the PLS-II RF system was set to 4.95 MV, which was estimated from the maximum available beam current as a function of RF voltage, and the top-up operation test with the beam current of 400 mA was successfully carried out.

  6. Integrating speech in time depends on temporal expectancies and attention.

    PubMed

    Scharinger, Mathias; Steinberg, Johanna; Tavano, Alessandro

    2017-08-01

    Sensory information that unfolds in time, such as in speech perception, relies on efficient chunking mechanisms in order to yield optimally-sized units for further processing. Whether two successive acoustic events receive a one-unit or a two-unit interpretation seems to depend on the fit between their temporal extent and a stipulated temporal window of integration. However, there is ongoing debate on how flexible this temporal window of integration should be, especially for the processing of speech sounds. Furthermore, there is no direct evidence of whether attention may modulate the temporal constraints on the integration window. For this reason, we here examine how different word durations, which lead to different temporal separations of sound onsets, interact with attention. In an Electroencephalography (EEG) study, participants actively and passively listened to words where word-final consonants were occasionally omitted. Words had either a natural duration or were artificially prolonged in order to increase the separation of speech sound onsets. Omission responses to incomplete speech input, originating in left temporal cortex, decreased when the critical speech sound was separated from previous sounds by more than 250 msec, i.e., when the separation was larger than the stipulated temporal window of integration (125-150 msec). Attention, on the other hand, only increased omission responses for stimuli with natural durations. We complemented the event-related potential (ERP) analyses with a frequency-domain analysis of the stimulus presentation rate. Notably, the power at the stimulation frequency showed the same duration and attention effects as the omission responses. We interpret these findings against the background of existing research on temporal integration windows and further suggest that our findings may be accounted for within the framework of predictive coding. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Improving delivery routes using combined heuristic and optimization in a consumer goods distribution company

    NASA Astrophysics Data System (ADS)

    Wibisono, E.; Santoso, A.; Sunaryo, M. A.

    2017-11-01

    XYZ is a distributor of various consumer goods products. The company plans its delivery routes daily and, in order to construct routes in a short amount of time, it simplifies the process by assigning drivers based on geographic regions. This approach results in inefficient use of vehicles, leading to imbalanced workloads. In this paper, we propose a combined method involving a heuristic and optimization to obtain better solutions in acceptable computation time. The heuristic is based on a time-oriented, nearest neighbor (TONN) approach to form clusters if the number of locations is higher than a certain value. The optimization part uses a mathematical modeling formulation based on a vehicle routing problem that considers heterogeneous vehicles, time windows, and fixed costs (HVRPTWF) and is used to solve the routing problem within clusters. A case study using data from one month of the company’s operations is analyzed, and data from one day of operations are detailed in this paper. The analysis shows that the proposed method results in 24% cost savings in that month, and the savings can be as high as 54% on a single day.
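
    A simplified sketch of a time-oriented nearest-neighbour rule of the kind the TONN heuristic is built on is shown below: from the current stop, the next customer is chosen to minimize a weighted combination of travel time and waiting until its time window opens, skipping customers whose windows can no longer be met. The weights, the random instance, and the use of the rule to grow a single route are illustrative assumptions; the paper uses TONN for clustering before the HVRPTWF optimization, which is not reproduced here.

```python
import math, random

random.seed(5)
# customer id -> (x, y, earliest, latest, service_time)
customers = {i: (random.uniform(0, 50), random.uniform(0, 50),
                 random.uniform(0, 200), random.uniform(250, 500), 10.0)
             for i in range(1, 16)}
depot = (25.0, 25.0)

def travel(a, b):
    return math.dist(a, b)

def tonn_route(customers, depot, w_travel=1.0, w_wait=0.5):
    """Greedy time-oriented nearest neighbour: pick the feasible customer with the lowest
    weighted travel-plus-waiting cost until no remaining customer can be served in time."""
    route, time_now, pos = [], 0.0, depot
    pending = dict(customers)
    while pending:
        best, best_cost = None, float("inf")
        for cid, (x, y, earliest, latest, service) in pending.items():
            arrive = time_now + travel(pos, (x, y))
            if arrive > latest:                    # time window already closed: infeasible
                continue
            wait = max(0.0, earliest - arrive)
            cost = w_travel * travel(pos, (x, y)) + w_wait * wait
            if cost < best_cost:
                best, best_cost = cid, cost
        if best is None:
            break                                  # no remaining customer is reachable in time
        x, y, earliest, latest, service = pending.pop(best)
        time_now = max(time_now + travel(pos, (x, y)), earliest) + service
        pos = (x, y)
        route.append(best)
    return route

print("TONN route:", tonn_route(customers, depot))
```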

  8. Evaluation of Aged Garlic Extract Neuroprotective Effect in a Focal Model of Cerebral Ischemia

    NASA Astrophysics Data System (ADS)

    Aguilera, Penélope; Maldonado, Perla D.; Ortiz-Plata, Alma; Barrera, Diana; Chánez-Cárdenas, María Elena

    2008-02-01

    The oxidant species generated in cerebral ischemia have been implicated as important mediators of neuronal injury through damage to lipids, DNA, and proteins. Since ischemia as well as reperfusion insults generate oxidative stress, the administration of antioxidants may limit oxidative damage and ameliorate disease progression. The present work shows the transitory neuroprotective effect of aged garlic extract (AGE) administration (a proposed antioxidant compound) in a middle cerebral artery occlusion (MCAO) model in rats and established its therapeutic window. To determine the optimal time of administration, animals received AGE (1.2 mL/kg) intraperitoneally 30 min before onset of reperfusion (-0.5 R), at the beginning of reperfusion (0R), or 1 h after onset of reperfusion (1R). Additional doses were administered 1, 2, or 3 h after onset of reperfusion. To establish the therapeutic window of AGE, the infarct area was determined for each treatment after different times of reperfusion. Results show that the administration of AGE at the onset of reperfusion reduced the infarct area by 70% (evaluated after 2 h reperfusion). The therapeutic window of AGE was determined. Repeated doses did not extend the temporal window of protection. A significant reduction in the nitrotyrosine level was observed in the brain tissue subjected to MCAO after AGE treatment at the onset of reperfusion. Data in the present work show that AGE exerts a transitory neuroprotective effect in response to ischemia/reperfusion-induced neuronal injury.

  9. Reduced cost and improved figure of sapphire optical components

    NASA Astrophysics Data System (ADS)

    Walters, Mark; Bartlett, Kevin; Brophy, Matthew R.; DeGroote Nelson, Jessica; Medicus, Kate

    2015-10-01

    Sapphire presents many challenges to optical manufacturers due to its high hardness and anisotropic properties. Long lead times and high prices are the typical result of such challenges. The cost of even a simple 'grind and shine' process can be prohibitive. The high precision surfaces required by optical sensor applications further exacerbate the challenge of processing sapphire thereby increasing cost further. Optimax has demonstrated a production process for such windows that delivers over 50% time reduction as compared to traditional manufacturing processes for sapphire, while producing windows with less than 1/5 wave rms figure error. Optimax's sapphire production process achieves significant improvement in cost by implementation of a controlled grinding process to present the best possible surface to the polishing equipment. Following the grinding process is a polishing process taking advantage of chemical interactions between slurry and substrate to deliver excellent removal rates and surface finish. Through experiments, the mechanics of the polishing process were also optimized to produce excellent optical figure. In addition to reducing the cost of producing large sapphire sensor windows, the grinding and polishing technology Optimax has developed aids in producing spherical sapphire components to better figure quality. Through specially developed polishing slurries, the peak-to-valley figure error of spherical sapphire parts is reduced by over 80%.

  10. SU-E-T-294: Dosimetric Analysis of Planning Phase Using Overlap Volume Histogram for Respiratory Gated Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, S; Kim, D; Kim, T

    2015-06-15

    Purpose: The end-of-exhale (EOE) phase is generally preferred as the gating window because the tumor position is more reproducible. However, other gating windows might be more appropriate from a dose distribution perspective. In this pilot study, we proposed to utilize the overlap volume histogram (OVH) to search for an optimized gating window and demonstrated its feasibility. Methods: We acquired 4DCT of 10 phases for 3 lung patients (2 with a target at the right middle lobe and 1 at the right upper lobe). After structures were defined in every phase, the OVH of each OAR was generated to quantify the three-dimensional spatial relationship between the PTV and OARs (bronchus, esophagus, heart, and cord, etc.) at each phase. The OVH gives the overlap volume of an OAR as a function of outward distance from the PTV. The relative overlap volume at 20 mm outward distance from the PTV (ROV-20) was also defined and obtained as a metric for measuring overlap volume. For dose calculation, 3D CRT plans were made for all phases under the same beam angles and objectives (e.g., 95% of the PTV coverage with at least 100% of the prescription dose of 50 Gy). The gating window phase was ranked according to ROV-20, and the relationship between the OVH and dose distribution at each phase was evaluated by comparing the maximum dose, mean dose, and equivalent uniform dose of the OARs. Results: OVHs showed noticeable differences from phase to phase, implying it is possible to find optimal phases for the gating window. For 2 out of 3 patients (both with a target at the RML), the maximum dose, mean dose, and EUD increased as ROV-20 increased. Conclusion: It is demonstrated that optimal phases (from a dose distribution perspective) for the gating window can exist and that the OVH can be a useful tool for determining such phases without performing dose optimization calculations in all phases. This work was supported by the Radiation Technology R&D program (No. 2013M2A2A7043498) and the Mid-career Researcher Program (2012-007883) through the National Research Foundation (NRF) funded by the Ministry of Science, ICT & Future Planning (MSIP) of Korea.
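
    A minimal sketch of how an OVH-style metric such as ROV-20 could be computed from binary PTV and OAR masks; this is an illustration of the concept, not the authors' implementation, and the use of scipy's Euclidean distance transform with isotropic voxel spacing is an assumption.

    import numpy as np
    from scipy import ndimage

    def ovh(ptv_mask, oar_mask, spacing_mm=1.0, distances_mm=np.arange(0, 41, 1.0)):
        """Fraction of the OAR lying within each outward distance from the PTV (boolean masks)."""
        # Distance to the nearest PTV voxel; inside the PTV the distance is 0, so
        # OAR voxels overlapping the PTV are counted at distance 0 (the OVH origin).
        dist_outside = ndimage.distance_transform_edt(~ptv_mask) * spacing_mm
        oar_voxels = oar_mask.sum()
        return np.array([(oar_mask & (dist_outside <= d)).sum() / oar_voxels
                         for d in distances_mm])

    # ROV-20 as used above: relative overlap volume at 20 mm outward distance from the PTV.
    # rov20 = ovh(ptv, oar, spacing_mm=2.0, distances_mm=np.array([20.0]))[0]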

  11. A batch sliding window method for local singularity mapping and its application for geochemical anomaly identification

    NASA Astrophysics Data System (ADS)

    Xiao, Fan; Chen, Zhijun; Chen, Jianguo; Zhou, Yongzhang

    2016-05-01

    In this study, a novel batch sliding window (BSW) based singularity mapping approach is proposed. Unlike the traditional sliding window (SW) technique, which requires empirical predetermination of a fixed maximum window size and relies on an outlier-sensitive least-squares (LS) linear regression, the BSW approach automatically determines the optimal size of the largest window for each estimated position and uses robust linear regression (RLR), which is insensitive to outliers. In the case study, tin geochemical data from Gejiu, Yunnan, were processed with the BSW based singularity mapping approach. The results show that the BSW approach improves the accuracy of the calculated singularity exponent values because the optimal maximum window size is determined for each location. The use of RLR in the BSW approach also smooths the distribution of singularity index values, suppressing the noise-like outliers that otherwise make a singularity map rough and discontinuous. Furthermore, the Student's t-statistic diagram indicates a strong spatial correlation between high geochemical anomalies and known tin polymetallic deposits. Target areas with strong tin geochemical anomalies are therefore likely to have higher potential for the exploration of new tin polymetallic deposits than other areas, particularly where strong anomalies occur but no deposits have yet been found.
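
    A simplified sketch of window-based singularity estimation in the spirit of the approach described above (not the published BSW code): the mean concentration over square windows of increasing size is fitted against window size on log-log axes, and a Theil-Sen estimator stands in for the robust linear regression; the window sizes, cell size, and the omission of the batch selection of the largest window are illustrative assumptions.

    import numpy as np
    from scipy import ndimage

    def singularity_index(grid, half_widths=(1, 2, 3, 4, 5), cell_size=1.0):
        """Estimate the local singularity exponent alpha on a 2-D geochemical grid."""
        r = np.array([(2 * h + 1) * cell_size for h in half_widths])
        # Mean concentration in each window size, computed for every cell at once.
        means = np.stack([ndimage.uniform_filter(grid, size=2 * h + 1) for h in half_widths])
        log_r = np.log(r)
        log_c = np.log(np.maximum(means, 1e-12))          # shape (k, ny, nx)
        # Theil-Sen slope: median over all pairs i<j of (log_c_j - log_c_i)/(log_r_j - log_r_i).
        i, j = np.triu_indices(len(r), k=1)
        slopes = (log_c[j] - log_c[i]) / (log_r[j] - log_r[i])[:, None, None]
        return np.median(slopes, axis=0) + 2.0            # in 2-D, C(r) ~ r**(alpha - 2)

    # alpha < 2 flags enrichment (positive) anomalies, alpha > 2 flags depletion.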

  12. Optimization of the Energy Window for PETbox4, a Preclinical PET Tomograph With a Small Inner Diameter

    NASA Astrophysics Data System (ADS)

    Gu, Z.; Bao, Q.; Taschereau, R.; Wang, H.; Bai, B.; Chatziioannou, A. F.

    2014-06-01

    Small animal positron emission tomography (PET) systems are often designed by employing close geometry configurations. Due to the different characteristics caused by geometrical factors, these tomographs require data acquisition protocols that differ from those optimized for conventional large diameter ring systems. In this work we optimized the energy window for data acquisitions with PETbox4, a 50 mm detector separation (box-like geometry) pre-clinical PET scanner, using the Geant4 Application for Tomographic Emission (GATE). The fractions of different types of events were estimated using a voxelized phantom including a mouse as well as its supporting chamber, mimicking a realistic mouse imaging environment. Separate code was developed to extract additional information about the gamma interactions for more accurate event type classification. Three types of detector backscatter events were identified in addition to the trues, phantom scatters and randoms. The energy window was optimized based on the noise equivalent count rate (NECR) and scatter fraction (SF) with lower-level discriminators (LLD) corresponding to energies from 150 keV to 450 keV. The results were validated based on the calculated image uniformity, spillover ratio (SOR) and recovery coefficient (RC) from physical measurements using the National Electrical Manufacturers Association (NEMA) NU-4 image quality phantom. These results indicate that when PETbox4 is operated with a narrower energy window (350-650 keV), detector backscatter rejection is unnecessary. For the NEMA NU-4 image quality phantom, the SOR for the water chamber decreases by about 45% from 15.1% to 8.3%, and the SOR for the air chamber decreases by 31% from 12.0% to 8.3% at the LLDs of 150 and 350 keV, without obvious change in uniformity, further supporting the simulation-based optimization. The optimization described in this work is not limited to PETbox4, but is also applicable to other scanners with small inner diameter geometries.
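
    A back-of-the-envelope helper showing how candidate lower-level discriminator settings could be ranked by NECR and scatter fraction, assuming the event counts have already been classified (e.g., by a GATE simulation as above); the randoms weighting k follows the common NEMA form and is an assumption, as are the example numbers.

    def necr(trues, scatters, randoms, k=1.0):
        """Noise equivalent count rate: T**2 / (T + S + k*R)."""
        total = trues + scatters + k * randoms
        return trues ** 2 / total if total > 0 else 0.0

    def scatter_fraction(trues, scatters):
        """Scatter fraction: S / (T + S)."""
        return scatters / (trues + scatters)

    # Example: pick the LLD (in keV) with the highest NECR from simulated count rates.
    # rates = {150: (t1, s1, r1), 250: (t2, s2, r2), 350: (t3, s3, r3), 450: (t4, s4, r4)}
    # best_lld = max(rates, key=lambda lld: necr(*rates[lld]))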

  13. Ecophysiological function of leaf 'windows' in Lithops species - 'Living Stones' that grow underground.

    PubMed

    Martin, C E; Brandmeyer, E A; Ross, R D

    2013-01-01

    Leaf temperatures were lower when light entry at the leaf tip window was prevented through covering the window with reflective tape, relative to leaf temperatures of plants with leaf tip windows covered with transparent tape. This was true when leaf temperatures were measured with an infrared thermometer, but not with a fine-wire thermocouple. Leaf tip windows of Lithops growing in high-rainfall regions of southern Africa were larger than the windows of plants (numerous individuals of 17 species) growing in areas with less rainfall and, thus, more annual insolation. The results of this study indicate that leaf tip windows of desert plants with an underground growth habit can allow entry of supra-optimal levels of radiant energy, thus most likely inhibiting photosynthetic activity. Consequently, the size of the leaf tip windows correlates inversely with habitat solar irradiance, minimising the probability of photoinhibition, while maximising the absorption of irradiance in cloudy, high-rainfall regions. © 2012 German Botanical Society and The Royal Botanical Society of the Netherlands.

  14. Optimization of hierarchical structure and nanoscale-enabled plasmonic refraction for window electrodes in photovoltaics.

    PubMed

    Han, Bing; Peng, Qiang; Li, Ruopeng; Rong, Qikun; Ding, Yang; Akinoglu, Eser Metin; Wu, Xueyuan; Wang, Xin; Lu, Xubing; Wang, Qianming; Zhou, Guofu; Liu, Jun-Ming; Ren, Zhifeng; Giersig, Michael; Herczynski, Andrzej; Kempa, Krzysztof; Gao, Jinwei

    2016-09-26

    An ideal network window electrode for photovoltaic applications should provide an optimal surface coverage, a uniform current density into and/or from a substrate, and a minimum of the overall resistance for a given shading ratio. Here we show that metallic networks with quasi-fractal structure provide a near-perfect practical realization of such an ideal electrode. We find that a leaf venation network, which possesses key characteristics of the optimal structure, indeed outperforms other networks. We further show that elements of hierarchical topology, rather than details of the branching geometry, are of primary importance in optimizing the networks, and demonstrate this experimentally on five model artificial hierarchical networks of varied levels of complexity. In addition to these structural effects, networks containing nanowires are shown to acquire transparency exceeding the geometric constraint due to the plasmonic refraction.

  15. Doppler Imaging in Aortic Stenosis: The Importance of the Nonapical Imaging Windows to Determine Severity in a Contemporary Cohort.

    PubMed

    Thaden, Jeremy J; Nkomo, Vuyisile T; Lee, Kwang Je; Oh, Jae K

    2015-07-01

    Although the highest aortic valve velocity was thought to be obtained from imaging windows other than the apex in about 20% of patients with severe aortic stenosis (AS), its occurrence appears to be increasing as the age of patients has increased with the application of transcatheter aortic valve replacement. The aim of this study was to determine the frequency with which the highest peak jet velocity (Vmax) is found at each imaging window, the degree to which neglecting nonapical imaging windows underestimates AS severity, and factors influencing the location of the optimal imaging window in contemporary patients. Echocardiograms obtained in 100 consecutive patients with severe AS from January 3 to May 23, 2012, in which all imaging windows were interrogated, were retrospectively analyzed. AS severity (aortic valve area and mean gradient) was calculated on the basis of the apical imaging window alone and the imaging window with the highest peak jet velocity. The left ventricular-aortic root angle measured in the parasternal long-axis view as well as clinical variables were correlated with the location of highest peak jet velocity. Vmax was most frequently obtained in the right parasternal window (50%), followed by the apex (39%). Subjects with acute angulation more commonly had Vmax at the right parasternal window (65% vs 43%, P = .05) and less commonly had Vmax at the apical window (19% vs 48%, P = .005), but Vmax was still located outside the apical imaging window in 52% of patients with obtuse aortic root angles. If nonapical windows were neglected, 8% of patients (eight of 100) were misclassified from high-gradient severe AS to low-gradient severe AS, and another 15% (15 of 100) with severe AS (aortic valve area < 1.0 cm(2)) were misclassified as having moderate AS (aortic valve area > 1.0 cm(2)). In this contemporary cohort, Vmax was located outside the apical imaging window in 61% of patients, and neglecting the nonapical imaging windows resulted in the misclassification of AS severity in 23% of patients. Aortic root angulation as measured by two-dimensional echocardiography influences the location of Vmax modestly. Despite increasing time constraints on many echocardiography laboratories, these data confirm that routine Doppler interrogation from multiple imaging windows is critical to accurately determine the severity of AS in contemporary clinical practice. Copyright © 2015 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.

  16. Method of high speed flow field influence and restrain on laser communication

    NASA Astrophysics Data System (ADS)

    Meng, Li-xin; Wang, Chun-hui; Qian, Cun-zhu; Wang, Shuo; Zhang, Li-zhong

    2013-08-01

    For a laser communication terminal carried by an airplane or airship, high-speed platform movement causes the surrounding airflow to affect both the platform and the terminal's optical window in two ways. The first is that aerodynamic loading deforms the optical window; the second is that a shock wave and a boundary layer are generated. For subsonic flight, the boundary layer is the dominant influence. The boundary layer changes the local air density and the temperature of the optical window, which deflects the beam and makes the received beam spot flicker. Ultimately, the energy fluctuation of the beam spot reaching the receiver increases, so the bit error rate increases. In this paper, aerodynamic theory is used to analyze the deformation of the optical window due to high-speed airflow, and aero-optics theory is used to analyze the influence of the boundary layer on the laser communication link. On this basis, we explore methods to suppress aerodynamic and aero-optical effects from the perspective of optical window design. Taking the planned experimental aircraft types and equipment installation location into account, we optimized the shape and thickness of the optical window and the shape and size of the air-management kit. Finally, the deformation of the optical window and the airflow distribution were simulated with fluid simulation software for different Mach numbers and flight altitudes. The simulation results show that the optimized optical window suppresses the aerodynamic influence; in addition, the boundary layer is smoothed and the turbulence influence is reduced, which meets the requirements of airborne laser communication.

  17. Planning minimum-energy paths in an off-road environment with anisotropic traversal costs and motion constraints. Doctoral thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, R.S.

    1989-06-01

    For a vehicle operating across arbitrarily-contoured terrain, finding the most fuel-efficient route between two points can be viewed as a high-level global path-planning problem with traversal costs and stability dependent on the direction of travel (anisotropic). The problem assumes a two-dimensional polygonal map of homogeneous cost regions for terrain representation constructed from elevation information. The anisotropic energy cost of vehicle motion has a non-braking component dependent on horizontal distance, a braking component dependent on vertical distance, and a constant path-independent component. The behavior of minimum-energy paths is then proved to be restricted to a small, but optimal set of traversal types. An optimal-path-planning algorithm, using a heuristic search technique, reduces the infinite number of paths between the start and goal points to a finite number by generating sequences of goal-feasible window lists from analyzing the polygonal map and applying pruning criteria. The pruning criteria consist of visibility analysis, heading analysis, and region-boundary constraints. Each goal-feasible window list specifies an associated convex optimization problem, and the best of all locally-optimal paths through the goal-feasible window lists is the globally-optimal path. These ideas have been implemented in a computer program, with results showing considerably better performance than the exponential average-case behavior predicted.

  18. Study of noise transmission through double wall aircraft windows

    NASA Technical Reports Server (NTRS)

    Vaicaitis, R.

    1983-01-01

    Analytical and experimental procedures were used to predict the noise transmitted through double wall windows into the cabin of a twin-engine G/A aircraft. The analytical model was applied to optimize cabin noise through parametric variation of the structural and acoustic parameters. The parametric study includes mass addition, increase in plexiglass thickness, decrease in window size, increase in window cavity depth, depressurization of the space between the two window plates, replacement of the air cavity with a transparent viscoelastic material, change in stiffness of the plexiglass material, and different absorptive materials for the interior walls of the cabin. It was found that increasing the exterior plexiglass thickness and/or decreasing the total window size could achieve the proper amount of noise reduction for this aircraft. The total added weight to the aircraft is then about 25 lbs.

  19. Universal entrainment mechanism controls contact times with motile cells

    NASA Astrophysics Data System (ADS)

    Mathijssen, Arnold J. T. M.; Jeanneret, Raphaël; Polin, Marco

    2018-03-01

    Contact between particles and motile cells underpins a wide variety of biological processes, from nutrient capture and ligand binding to grazing, viral infection, and cell-cell communication. The window of opportunity for these interactions depends on the basic mechanism determining contact time, which is currently unknown. By combining experiments on three different species—Chlamydomonas reinhardtii, Tetraselmis subcordiforms, and Oxyrrhis marina—with simulations and analytical modeling, we show that the fundamental physical process regulating proximity to a swimming microorganism is hydrodynamic particle entrainment. The resulting distribution of contact times is derived within the framework of Taylor dispersion as a competition between advection by the cell surface and microparticle diffusion, and predicts the existence of an optimal tracer size that is also observed experimentally. Spatial organization of flagella, swimming speed, and swimmer and tracer size influence entrainment features and provide tradeoffs that may be tuned to optimize the estimated probabilities for microbial interactions like predation and infection.

  20. 650-nm-band high-power and highly reliable laser diodes with a window-mirror structure

    NASA Astrophysics Data System (ADS)

    Shima, Akihiro; Hironaka, Misao; Ono, Ken-ichi; Takemi, Masayoshi; Sakamoto, Yoshifumi; Kunitsugu, Yasuhiro; Yamashita, Koji

    1998-05-01

    An active layer structure with 658 nm emission at 25 degrees Celsius has been optimized in order to reduce the operating current of the laser diodes (LD) under high-temperature conditions. For improvement of the maximum output power and the reliability limited by mirror degradation, we have applied a zinc-diffused-type window-mirror structure which prevents optical absorption at the mirror facet. As a result, a CW output power of 50 mW is obtained even at 80 degrees Celsius for a 650-micrometer-long window-mirror LD. In addition, a maximum light output power over 150 mW at 25 degrees Celsius has been realized without any optical mirror damage. In the aging tests, the LDs have been operating for over 2,500 - 5,000 hours under the CW condition of 30 - 50 mW at 60 degrees Celsius. The window-mirror structure also enables reliable 60 degrees Celsius, 30 mW CW operation of the LDs with 651 nm emission at 25 degrees Celsius. Moreover, a maximum output power of around 100 mW even at 80 degrees Celsius and reliable 2,000-hour operation at 60 degrees Celsius, 70 mW have been realized for the first time by 659 nm LDs with a long cavity length of 900 micrometers.

  1. Short-time fractional Fourier methods for the time-frequency representation of chirp signals.

    PubMed

    Capus, Chris; Brown, Keith

    2003-06-01

    The fractional Fourier transform (FrFT) provides a valuable tool for the analysis of linear chirp signals. This paper develops two short-time FrFT variants which are suited to the analysis of multicomponent and nonlinear chirp signals. Outputs have similar properties to the short-time Fourier transform (STFT) but show improved time-frequency resolution. The FrFT is a parameterized transform with parameter, a, related to chirp rate. The two short-time implementations differ in how the value of a is chosen. In the first, a global optimization procedure selects one value of a with reference to the entire signal. In the second, a values are selected independently for each windowed section. Comparative variance measures based on the Gaussian function are given and are shown to be consistent with the uncertainty principle in fractional domains. For appropriately chosen FrFT orders, the derived fractional domain uncertainty relationship is minimized for Gaussian windowed linear chirp signals. The two short-time FrFT algorithms have complementary strengths demonstrated by time-frequency representations for a multicomponent bat chirp, a highly nonlinear quadratic chirp, and an output pulse from a finite-difference sonar model with dispersive change. These representations illustrate the improvements obtained in using FrFT based algorithms compared to the STFT.

  2. Accumulated energy norm for full waveform inversion of marine data

    NASA Astrophysics Data System (ADS)

    Shin, Changsoo; Ha, Wansoo

    2017-12-01

    Macro-velocity models are important for imaging the subsurface structure. However, the conventional objective functions of full waveform inversion in the time and the frequency domain have a limited ability to recover the macro-velocity model because of the absence of low-frequency information. In this study, we propose new objective functions that can recover the macro-velocity model by minimizing the difference between the zero-frequency components of the square of seismic traces. Instead of the seismic trace itself, we use the square of the trace, which contains low-frequency information. We apply several time windows to the trace and obtain zero-frequency information of the squared trace for each time window. The shape of the new objective functions shows that they are suitable for local optimization methods. Since we use the acoustic wave equation in this study, this method can be used for deep-sea marine data, in which elastic effects can be ignored. We show that the zero-frequency components of the square of the seismic traces can be used to recover macro-velocities from synthetic and field data.
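
    An illustrative sketch of the windowed zero-frequency misfit described above (not the authors' code): the zero-frequency component of a squared trace inside a time window is simply the windowed sum of squared samples, so the objective compares these sums between modelled and observed data; the window layout and the least-squares form of the misfit are assumptions for illustration.

    import numpy as np

    def accumulated_energy_misfit(d_syn, d_obs, window_starts, window_len):
        """d_syn, d_obs: arrays (n_traces, n_samples); windows given in sample indices."""
        misfit = 0.0
        for t0 in window_starts:
            sl = slice(t0, t0 + window_len)
            e_syn = np.sum(d_syn[:, sl] ** 2, axis=1)   # zero-frequency of the squared trace
            e_obs = np.sum(d_obs[:, sl] ** 2, axis=1)
            misfit += 0.5 * np.sum((e_syn - e_obs) ** 2)
        return misfit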

  3. The quality estimation of exterior wall’s and window filling’s construction design

    NASA Astrophysics Data System (ADS)

    Saltykov, Ivan; Bovsunovskaya, Maria

    2017-10-01

    The article introduces the term "artificial envelope" for residential buildings. The authors propose a complex multifactorial approach to estimating the design quality of external enclosing structures, based on the impact of several parameters: functional, operational, cost, and environmental indices. A design quality index Qk is introduced as an aggregate characteristic of these parameters, and its mathematical dependence on them serves as the target function for the design quality estimation. As an example, the article presents the search for optimal wall and window designs in small, medium, and large residential rooms of economy-class buildings. Plots of the individual parameters of the target function are given for the three room sizes. In the worked example, window opening dimensions are chosen so that the wall and window constructions properly satisfy the combined requirements. The authors also compare the window area recommended by building standards with the area obtained by optimizing the design quality index. The multifactorial approach to searching for an optimal design described in this article can be applied to various structural elements of residential buildings, taking into account the climatic, social, and economic features of the construction area.

  4. Risperidone Effects on Brain Dynamic Connectivity-A Prospective Resting-State fMRI Study in Schizophrenia.

    PubMed

    Lottman, Kristin K; Kraguljac, Nina V; White, David M; Morgan, Charity J; Calhoun, Vince D; Butt, Allison; Lahti, Adrienne C

    2017-01-01

    Resting-state functional connectivity studies in schizophrenia evaluating average connectivity over the entire experiment have reported aberrant network integration, but findings are variable. Examining time-varying (dynamic) functional connectivity may help explain some inconsistencies. We assessed dynamic network connectivity using resting-state functional MRI in patients with schizophrenia, while unmedicated (n = 34), after 1 week (n = 29) and 6 weeks of treatment with risperidone (n = 24), as well as matched controls at baseline (n = 35) and after 6 weeks (n = 19). After identifying 41 independent components (ICs) comprising resting-state networks, sliding window analysis was performed on IC timecourses using an optimal window size validated with linear support vector machines. Windowed correlation matrices were then clustered into three discrete connectivity states (a relatively sparsely connected state, a relatively abundantly connected state, and an intermediately connected state). In unmedicated patients, static connectivity was increased between five pairs of ICs and decreased between two pairs of ICs when compared to controls; dynamic connectivity showed increased coupling between the thalamus and the somatomotor network in one of the three states. State statistics indicated that, in comparison to controls, unmedicated patients had shorter mean dwell times in, and a smaller fraction of time spent in, the sparsely connected state, and longer dwell times in, and a larger fraction of time spent in, the intermediately connected state. Risperidone appeared to normalize mean dwell times after 6 weeks, but not fraction of time. Results suggest that static connectivity abnormalities in schizophrenia may partly be related to altered brain network temporal dynamics rather than consistent dysconnectivity within and between functional networks and demonstrate the importance of implementing complementary data analysis techniques.
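
    A conceptual sketch of the sliding-window / clustering pipeline described above, including the mean dwell time and fraction-of-time state statistics; the window length, step, number of states and k-means settings are illustrative assumptions, not the study's parameters.

    import numpy as np
    from sklearn.cluster import KMeans

    def dynamic_states(ic_timecourses, win=30, step=1, n_states=3):
        """ic_timecourses: array (n_timepoints, n_components) of IC timecourses."""
        t, c = ic_timecourses.shape
        iu = np.triu_indices(c, k=1)
        # Windowed correlation matrices, vectorised to their upper triangles.
        feats = np.array([np.corrcoef(ic_timecourses[s:s + win].T)[iu]
                          for s in range(0, t - win + 1, step)])
        labels = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit_predict(feats)
        # Fraction of time per state and mean dwell time (consecutive windows in a state).
        frac = np.bincount(labels, minlength=n_states) / len(labels)
        runs = np.split(labels, np.where(np.diff(labels) != 0)[0] + 1)
        dwell = [float(np.mean([len(r) for r in runs if r[0] == k]))
                 if any(r[0] == k for r in runs) else 0.0
                 for k in range(n_states)]
        return labels, frac, dwell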

  5. TimepixCam: a fast optical imager with time-stamping

    NASA Astrophysics Data System (ADS)

    Fisher-Levine, M.; Nomerotski, A.

    2016-03-01

    We describe a novel fast optical imager, TimepixCam, based on an optimized silicon pixel sensor with a thin entrance window, read out by a Timepix ASIC. TimepixCam is able to record and time-stamp light flashes in excess of 1,000 photons with high quantum efficiency in the 400-1000 nm wavelength range with 20 ns timing resolution, corresponding to an effective rate of 50 Megaframes per second. The camera was used for imaging ions impinging on a microchannel plate followed by a phosphor screen. Possible applications include spatial and velocity map imaging of ions in time-of-flight mass spectroscopy, coincidence imaging of ions and electrons, and other time-resolved types of imaging spectroscopy.

  6. Multi-alternative decision-making with non-stationary inputs.

    PubMed

    Nunes, Luana F; Gurney, Kevin

    2016-08-01

    One of the most widely implemented models for multi-alternative decision-making is the multihypothesis sequential probability ratio test (MSPRT). It is asymptotically optimal, straightforward to implement, and has found application in modelling biological decision-making. However, the MSPRT is limited in application to discrete ('trial-based'), non-time-varying scenarios. By contrast, real world situations will be continuous and entail stimulus non-stationarity. In these circumstances, decision-making mechanisms (like the MSPRT) which work by accumulating evidence, must be able to discard outdated evidence which becomes progressively irrelevant. To address this issue, we introduce a new decision mechanism by augmenting the MSPRT with a rectangular integration window and a transparent decision boundary. This allows selection and de-selection of options as their evidence changes dynamically. Performance was enhanced by adapting the window size to problem difficulty. Further, we present an alternative windowing method which exponentially decays evidence and does not significantly degrade performance, while greatly reducing the memory resources necessary. The methods presented have proven successful at allowing for the MSPRT algorithm to function in a non-stationary environment.
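
    A toy sketch of evidence accumulation with a rectangular window and an exponential forgetting factor, in the spirit of the windowed decision mechanism described above; the threshold, window length, decay rate, and the margin-based selection rule are illustrative assumptions, not the paper's model.

    import numpy as np
    from collections import deque

    def windowed_decisions(loglik_stream, win=50, threshold=5.0):
        """loglik_stream yields one array of per-hypothesis log-likelihoods per observation."""
        buf = deque(maxlen=win)                       # rectangular window of recent evidence
        for ll in loglik_stream:
            buf.append(np.asarray(ll, dtype=float))
            evidence = np.sum(list(buf), axis=0)      # windowed accumulated log-likelihood
            best = int(np.argmax(evidence))
            margin = evidence[best] - np.max(np.delete(evidence, best))
            # 'Transparent' boundary: a selection can be revised later as the input changes.
            yield best if margin > threshold else None

    def decayed_evidence(evidence, ll, decay=0.98):
        """Exponential alternative: old evidence fades without storing a window in memory."""
        return decay * np.asarray(evidence) + np.asarray(ll)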

  7. Dynamics of retinal photocoagulation and rupture

    NASA Astrophysics Data System (ADS)

    Sramek, Christopher; Paulus, Yannis; Nomoto, Hiroyuki; Huie, Phil; Brown, Jefferson; Palanker, Daniel

    2009-05-01

    In laser retinal photocoagulation, short (<20 ms) pulses have been found to reduce thermal damage to the inner retina, decrease treatment time, and minimize pain. However, the safe therapeutic window (defined as the ratio of power for producing a rupture to that of mild coagulation) decreases with shorter exposures. To quantify the extent of retinal heating and maximize the therapeutic window, a computational model of millisecond retinal photocoagulation and rupture was developed. Optical attenuation of 532-nm laser light in ocular tissues was measured, including retinal pigment epithelial (RPE) pigmentation and cell-size variability. Threshold powers for vaporization and RPE damage were measured with pulse durations ranging from 1 to 200 ms. A finite element model of retinal heating inferred that vaporization (rupture) takes place at 180-190°C. RPE damage was accurately described by the Arrhenius model with activation energy of 340 kJ/mol. Computed photocoagulation lesion width increased logarithmically with pulse duration, in agreement with histological findings. The model will allow for the optimization of beam parameters to increase the width of the therapeutic window for short exposures.
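
    A minimal sketch of the Arrhenius damage integral referenced above. The activation energy is the value reported in the abstract; the frequency factor and the temperature history are placeholders that would come from the thermal model, so they are left as arguments.

    import numpy as np

    R_GAS = 8.314      # J/(mol K)
    E_A = 340e3        # J/mol, activation energy fitted for RPE damage (from the abstract)

    def arrhenius_damage(temps_kelvin, dt, freq_factor):
        """Omega = A * integral exp(-Ea / (R T(t))) dt; Omega >= 1 is commonly taken as damage."""
        temps = np.asarray(temps_kelvin, dtype=float)
        return freq_factor * np.sum(np.exp(-E_A / (R_GAS * temps))) * dt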

  8. Shilling attack detection for recommender systems based on credibility of group users and rating time series.

    PubMed

    Zhou, Wei; Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian

    2018-01-01

    Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance to maintain the fairness and sustainability of recommender systems. The current studies have problems in terms of the poor universality of algorithms, difficulty in selection of user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user findings and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying the credibility evaluation model in-depth based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed. Suspicious rating time segments are determined by constructing a time series, and data streams of the rating items are examined and suspicious rating segments are checked. To analyse features of shilling attacks by a group user's credibility, an abnormal group user discovery method based on time series and time window is proposed. Standard testing datasets are used to verify the effect of the proposed method.
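
    A simplified illustration of flagging suspicious rating time windows for a target item, one ingredient of the detection structure described above (not the paper's full detector): windows whose rating volume deviates strongly from the item's history are passed on for group-user credibility analysis; the window length and z-score threshold are illustrative assumptions, and a mean-rating shift could be tested the same way.

    import numpy as np

    def suspicious_windows(timestamps, win_seconds=86400, z_thresh=3.0):
        """Return indices of time windows with anomalously high rating volume for an item."""
        t = np.asarray(timestamps, dtype=float)
        edges = np.arange(t.min(), t.max() + win_seconds, win_seconds)
        counts = np.bincount(np.digitize(t, edges) - 1, minlength=len(edges))
        z = (counts - counts.mean()) / (counts.std() + 1e-9)
        return np.where(z > z_thresh)[0]              # burst-like windows to inspect further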

  9. Shilling attack detection for recommender systems based on credibility of group users and rating time series

    PubMed Central

    Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian

    2018-01-01

    Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance to maintain the fairness and sustainability of recommender systems. The current studies have problems in terms of the poor universality of algorithms, difficulty in selection of user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user findings and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying the credibility evaluation model in-depth based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed. Suspicious rating time segments are determined by constructing a time series, and data streams of the rating items are examined and suspicious rating segments are checked. To analyse features of shilling attacks by a group user’s credibility, an abnormal group user discovery method based on time series and time window is proposed. Standard testing datasets are used to verify the effect of the proposed method. PMID:29742134

  10. Reflex Triode X-Ray Source Research on Gamble

    DTIC Science & Technology

    2007-06-01

    dosimeters (TLDs) located at the vacuum window (18-27 cm from the converter), near the pinhole camera and near the image plate. II. EXPERIMENTAL ... -MeV electron beams to thin converters in order to optimize emission of sub-100-keV x-rays. Thin converters reduce self-absorption of low-energy ... x-rays, but the beam electrons must pass many times through the converter for efficient x-ray production. The triode configuration was found to be

  11. Optimizing Terminal Conditions Using Geometric Guidance for Low-Control Authority Munitions

    DTIC Science & Technology

    2008-06-01

    Lowest altitude allowable for maximum canard deflection per unit of acceleration constant hT δ g Canard deflection per unit of acceleration transition...target within that range window in less than five minutes from time of fire [17]. The launch platform can supply the munition with some preflight...linear 7. The information supplied by the onboard navigation system has no errors 8. The control system is always able to generate the exact amount

  12. Window panes of eternity. Health, disease, and inherited risk.

    PubMed Central

    Scriver, C. R.

    1982-01-01

    Personal health reflects harmony between individual and experience; it is optimal homeostasis. Disease is an outcome of incongruity leading to dishomeostasis. Relative to earlier times, disease in modern society has higher "heritability" (in the broad meaning of the term). Inherited risks are facts compatible with anticipation and prevention of disease. This viewpoint has major implications for medical practice, deployment of health services, themes of research, and education of health care personnel and citizens. PMID:6763817

  13. Optimization of hierarchical structure and nanoscale-enabled plasmonic refraction for window electrodes in photovoltaics

    PubMed Central

    Han, Bing; Peng, Qiang; Li, Ruopeng; Rong, Qikun; Ding, Yang; Akinoglu, Eser Metin; Wu, Xueyuan; Wang, Xin; Lu, Xubing; Wang, Qianming; Zhou, Guofu; Liu, Jun-Ming; Ren, Zhifeng; Giersig, Michael; Herczynski, Andrzej; Kempa, Krzysztof; Gao, Jinwei

    2016-01-01

    An ideal network window electrode for photovoltaic applications should provide an optimal surface coverage, a uniform current density into and/or from a substrate, and a minimum of the overall resistance for a given shading ratio. Here we show that metallic networks with quasi-fractal structure provide a near-perfect practical realization of such an ideal electrode. We find that a leaf venation network, which possesses key characteristics of the optimal structure, indeed outperforms other networks. We further show that elements of hierarchical topology, rather than details of the branching geometry, are of primary importance in optimizing the networks, and demonstrate this experimentally on five model artificial hierarchical networks of varied levels of complexity. In addition to these structural effects, networks containing nanowires are shown to acquire transparency exceeding the geometric constraint due to the plasmonic refraction. PMID:27667099

  14. Nonuniform Effects of Reinstatement within the Time Window

    ERIC Educational Resources Information Center

    Galluccio, Llissa; Rovee-Collier, Carolyn

    2006-01-01

    A time window is a limited period after an event initially occurs in which additional information can be integrated with the memory of that event. It shuts when the memory is forgotten. The time window hypothesis holds that the impact of a manipulation at different points within the time window is nonuniform. In two operant conditioning…

  15. Modeling and experimental investigation of x-ray spectra from a liquid metal anode x-ray tube

    NASA Astrophysics Data System (ADS)

    David, Bernd R.; Thran, Axel; Eckart, Rainer

    2004-11-01

    This paper presents simulated and measured spectra of a novel type of x-ray tube. The bremsstrahlung-generating principle of this tube is based on the interaction of high-energy electrons with a turbulently flowing liquid metal separated from the vacuum by a thin window. We simulated the interaction of 50-150 keV electrons with liquid metal targets composed of the elements Ga, In, Sn, as well as the solid elements C, W and Re used for the electron windows. We obtained x-ray spectra and energy loss curves for various liquid metal/window combinations and thicknesses of the window material. In terms of optimum heat transport, a thin diamond window in combination with the liquid metal GaInSn is the best-suited system. If photon flux is the optimization criterion, thin tungsten/rhenium windows cooled by GaInSn should be preferred.

  16. Optimizing the performance of bandpass photon detectors for inverse photoemission: Transmission of alkaline earth fluoride window crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thiede, Christian, E-mail: christian.thiede@uni-muenster.de; Schmidt, Anke B.; Donath, Markus

    2015-08-15

    Bandpass photon detectors are widely used in inverse photoemission in the isochromat mode at energies in the vacuum-ultraviolet spectral range. The energy bandpass of gas-filled counters is usually formed by the ionization threshold of the counting gas as high-pass filter and the transmission cutoff of an alkaline earth fluoride window as low-pass filter. The transmission characteristics of the window have, therefore, a crucial impact on the detector performance. We present transmission measurements in the vacuum-ultraviolet spectral range for alkaline earth fluoride window crystals in the vicinity of the transmission cutoff as a function of crystal purity, surface finish, surface contamination, temperature, and thickness. Our findings reveal that the transmission characteristics of the window crystal and, thus, the detector performance depend critically on these window parameters.

  17. Vanadium dioxide nanogrid films for high transparency smart architectural window applications.

    PubMed

    Liu, Chang; Balin, Igal; Magdassi, Shlomo; Abdulhalim, Ibrahim; Long, Yi

    2015-02-09

    This study presents a novel approach towards achieving high luminous transmittance (T(lum)) for vanadium dioxide (VO(2)) thermochromic nanogrid films whilst maintaining the solar modulation ability (ΔT(sol)). The perforated VO(2)-based films employ orderly-patterned nano-holes, which transmit visible light far more favorably while retaining large near-infrared modulation, thereby enhancing ΔT(sol). Numerical optimizations using parameter search algorithms have been implemented through a series of Finite Difference Time Domain (FDTD) simulations, varying film thickness, cell periodicity, grid dimensions and grid arrangement. The best-performing results of T(lum) (76.5%) and ΔT(sol) (14.0%) are comparable, if not superior, to the results calculated from nanothermochromism, nanoporosity and biomimetic nanostructuring. It opens up a new approach for thermochromic smart window applications.

  18. Autopiquer - a Robust and Reliable Peak Detection Algorithm for Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Kilgour, David P. A.; Hughes, Sam; Kilgour, Samantha L.; Mackay, C. Logan; Palmblad, Magnus; Tran, Bao Quoc; Goo, Young Ah; Ernst, Robert K.; Clarke, David J.; Goodlett, David R.

    2017-02-01

    We present a simple algorithm for robust and unsupervised peak detection by determining a noise threshold in isotopically resolved mass spectrometry data. Solving this problem will greatly reduce the subjective and time-consuming manual picking of mass spectral peaks and so will prove beneficial in many research applications. The Autopiquer approach uses autocorrelation to test for the presence of (isotopic) structure in overlapping windows across the spectrum. Within each window, a noise threshold is optimized to remove the most unstructured data, whilst keeping as much of the (isotopic) structure as possible. This algorithm has been successfully demonstrated for both peak detection and spectral compression on data from many different classes of mass spectrometer and for different sample types, and this approach should also be extendible to other types of data that contain regularly spaced discrete peaks.
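
    A rough sketch of the idea only (not the published Autopiquer implementation): within a window, the intensity threshold is raised until the surviving signal is dominated by periodic (isotopic) structure rather than unstructured noise, as judged by its autocorrelation. The structure score and the quantile sweep used here are illustrative simplifications.

    import numpy as np

    def structure_score(x):
        """Peak of the normalised autocorrelation away from zero lag (periodic structure)."""
        x = x - x.mean()
        ac = np.correlate(x, x, mode='full')[len(x) - 1:]
        ac = ac / (ac[0] + 1e-12)
        return float(np.max(np.abs(ac[1:len(x) // 2])))

    def window_threshold(window, quantiles=np.linspace(0.5, 0.99, 25)):
        """Pick the intensity threshold that balances structure kept against data retained."""
        best_q, best = quantiles[0], -np.inf
        for q in quantiles:
            kept = np.where(window > np.quantile(window, q), window, 0.0)
            score = structure_score(kept) * (kept > 0).mean()
            if score > best:
                best, best_q = score, q
        return np.quantile(window, best_q)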

  19. Autopiquer - a Robust and Reliable Peak Detection Algorithm for Mass Spectrometry.

    PubMed

    Kilgour, David P A; Hughes, Sam; Kilgour, Samantha L; Mackay, C Logan; Palmblad, Magnus; Tran, Bao Quoc; Goo, Young Ah; Ernst, Robert K; Clarke, David J; Goodlett, David R

    2017-02-01

    We present a simple algorithm for robust and unsupervised peak detection by determining a noise threshold in isotopically resolved mass spectrometry data. Solving this problem will greatly reduce the subjective and time-consuming manual picking of mass spectral peaks and so will prove beneficial in many research applications. The Autopiquer approach uses autocorrelation to test for the presence of (isotopic) structure in overlapping windows across the spectrum. Within each window, a noise threshold is optimized to remove the most unstructured data, whilst keeping as much of the (isotopic) structure as possible. This algorithm has been successfully demonstrated for both peak detection and spectral compression on data from many different classes of mass spectrometer and for different sample types, and this approach should also be extendible to other types of data that contain regularly spaced discrete peaks.

  20. Time-Frequency Analysis And Pattern Recognition Using Singular Value Decomposition Of The Wigner-Ville Distribution

    NASA Astrophysics Data System (ADS)

    Boashash, Boualem; Lovell, Brian; White, Langford

    1988-01-01

    Time-Frequency analysis based on the Wigner-Ville Distribution (WVD) is shown to be optimal for a class of signals where the variation of instantaneous frequency is the dominant characteristic. Spectral resolution and instantaneous frequency tracking is substantially improved by using a Modified WVD (MWVD) based on an Autoregressive spectral estimator. Enhanced signal-to-noise ratio may be achieved by using 2D windowing in the Time-Frequency domain. The WVD provides a tool for deriving descriptors of signals which highlight their FM characteristics. These descriptors may be used for pattern recognition and data clustering using the methods presented in this paper.

  1. Suppression of sound radiation to far field of near-field acoustic communication system using evanescent sound field

    NASA Astrophysics Data System (ADS)

    Fujii, Ayaka; Wakatsuki, Naoto; Mizutani, Koichi

    2016-01-01

    A method of suppressing sound radiation to the far field of a near-field acoustic communication system using an evanescent sound field is proposed. The amplitude of the evanescent sound field generated from an infinite vibrating plate attenuates exponentially with increasing distance from the surface of the vibrating plate. However, a discontinuity of the sound field exists at the edge of the finite vibrating plate in practice, which broadens the wavenumber spectrum. A sound wave radiates over the evanescent sound field because of broadening of the wavenumber spectrum. Therefore, we calculated the optimum distribution of the particle velocity on the vibrating plate to reduce the broadening of the wavenumber spectrum. We focused on a window function that is utilized in the field of signal analysis for reducing the broadening of the frequency spectrum. The optimization calculation is necessary for the design of a window function suitable for suppressing sound radiation and securing a spatial area for data communication. In addition, a wide frequency bandwidth is required to increase the data transmission speed. Therefore, we investigated a suitable method for calculating the sound pressure level at the far field to confirm the variation of the distribution of sound pressure level determined on the basis of the window shape and frequency. The distribution of the sound pressure level at a finite distance was in good agreement with that obtained at an infinite far field under the condition generating the evanescent sound field. Consequently, the window function was optimized by the method used to calculate the distribution of the sound pressure level at an infinite far field using the wavenumber spectrum on the vibrating plate. According to the result of comparing the distributions of the sound pressure level in the cases with and without the window function, it was confirmed that the area whose sound pressure level was reduced from the maximum level to -50 dB was extended. Additionally, we designed a sound insulator so as to realize a similar distribution of the particle velocity to that obtained using the optimized window function. Sound radiation was suppressed using a sound insulator placed above the vibrating surface in the simulation using the three-dimensional finite element method. On the basis of this finding, it was suggested that near-field acoustic communication that suppresses sound radiation can be realized by applying the optimized window function to the particle velocity field.
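
    A small numerical illustration of why the window shape matters (not the authors' optimisation): the far-field radiation is governed by the wavenumber spectrum of the velocity distribution on the finite plate, so a tapered window suppresses the low-wavenumber sidelobes that a plainly truncated (rectangular) distribution produces. The plate length, surface wavenumber and Hann window are illustrative assumptions.

    import numpy as np

    L_PLATE, N_FFT = 0.2, 4096                # plate length [m] and zero-padded FFT size
    x = np.linspace(-L_PLATE / 2, L_PLATE / 2, 256)
    k_surface = 2000.0                        # surface wavenumber chosen above k0 (evanescent regime)
    rect = np.cos(k_surface * x)              # truncated (rectangular-window) vibration pattern
    hann = rect * np.hanning(len(x))          # same pattern shaped by a Hann window

    def wavenumber_spectrum_db(v):
        """Normalised wavenumber spectrum (dB) of a velocity distribution on the plate."""
        spec = np.abs(np.fft.fftshift(np.fft.fft(v, N_FFT)))
        return 20 * np.log10(spec / spec.max() + 1e-12)

    # Components with |k| < k0 = 2*pi*f/c propagate to the far field; comparing the two
    # spectra near k = 0 shows how the tapered window lowers the radiating sidelobes.
    spec_rect = wavenumber_spectrum_db(rect)
    spec_hann = wavenumber_spectrum_db(hann)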

  2. Forming of complex-shaped composite tubes using optimized bladder-assisted resin transfer molding

    NASA Astrophysics Data System (ADS)

    Schillfahrt, Christian; Fauster, Ewald; Schledjewski, Ralf

    2018-05-01

    This work addresses the manufacturing of tubular composite structures by means of bladder-assisted resin transfer molding using elastomeric bladders. In order to achieve successful processing of such parts, knowledge of the compaction and impregnation behavior of the textile preform is vital. Hence, efficient analytical models that describe the influencing parameters of the preform compaction and filling stage were developed and verified through practical experiments. A process window describing optimal and critical operating conditions during the injection stage was created by evaluating the impact of the relevant process pressures on filling time. Finally, a cascaded injection procedure was investigated that particularly facilitates the manufacturing of long composite tubes.

  3. The Infobiotics Workbench: an integrated in silico modelling platform for Systems and Synthetic Biology.

    PubMed

    Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio

    2011-12-01

    The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.

  4. Design of high-efficiency diffractive optical elements towards ultrafast mid-infrared time-stretched imaging and spectroscopy

    NASA Astrophysics Data System (ADS)

    Xie, Hongbo; Ren, Delun; Wang, Chao; Mao, Chensheng; Yang, Lei

    2018-02-01

    Ultrafast time stretch imaging offers unprecedented imaging speed and enables new discoveries in scientific research and engineering. One challenge in exploiting time stretch imaging in mid-infrared is the lack of high-quality diffractive optical elements (DOEs), which encode the image information into mid-infrared optical spectrum. This work reports the design and optimization of mid-infrared DOE with high diffraction-efficiency, broad bandwidth and large field of view. Using various typical materials with their refractive indices ranging from 1.32 to 4.06 in ? mid-infrared band, diffraction efficiencies of single-layer and double-layer DOEs have been studied in different wavelength bands with different field of views. More importantly, by replacing the air gap of double-layer DOE with carefully selected optical materials, one optimized ? triple-layer DOE, with efficiency higher than 95% in the whole ? mid-infrared window and field of view greater than ?, is designed and analyzed. This new DOE device holds great potential in ultrafast mid-infrared time stretch imaging and spectroscopy.

  5. Phenology-based Spartina alterniflora mapping in coastal wetland of the Yangtze Estuary using time series of GaoFen satellite no. 1 wide field of view imagery

    NASA Astrophysics Data System (ADS)

    Ai, Jinquan; Gao, Wei; Gao, Zhiqiang; Shi, Runhe; Zhang, Chao

    2017-04-01

    Spartina alterniflora is an aggressive invasive plant species that replaces native species, changes the structure and function of the ecosystem across coastal wetlands in China, and is thus a major conservation concern. Mapping the spread of its invasion is a necessary first step for the implementation of effective ecological management strategies. The performance of a phenology-based approach for S. alterniflora mapping is explored in the coastal wetland of the Yangtze Estuary using a time series of GaoFen satellite no. 1 wide field of view camera (GF-1 WFV) imagery. First, a time series of the normalized difference vegetation index (NDVI) was constructed to evaluate the phenology of S. alterniflora. Two phenological stages (the senescence stage from November to mid-December and the green-up stage from late April to May) were determined as important for S. alterniflora detection in the study area based on NDVI temporal profiles, spectral reflectance curves of S. alterniflora and its coexistent species, and field surveys. Three phenology feature sets representing three major phenology-based detection strategies were then compared to map S. alterniflora: (1) the single-date imagery acquired within the optimal phenological window, (2) the multitemporal imagery, including four images from the two important phenological windows, and (3) the monthly NDVI time series imagery. Support vector machines and maximum likelihood classifiers were applied on each phenology feature set at different training sample sizes. For all phenology feature sets, the overall results were produced consistently with high mapping accuracies under sufficient training samples sizes, although significantly improved classification accuracies (10%) were obtained when the monthly NDVI time series imagery was employed. The optimal single-date imagery had the lowest accuracies of all detection strategies. The multitemporal analysis demonstrated little reduction in the overall accuracy compared with the use of monthly NDVI time series imagery. These results show the importance of considering the phenological stage for image selection for mapping S. alterniflora using GF-1 WFV imagery. Furthermore, in light of the better tradeoff between the number of images and classification accuracy when using multitemporal GF-1 WFV imagery, we suggest using multitemporal imagery acquired at appropriate phenological windows for S. alterniflora mapping at regional scales.
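
    A conceptual sketch of the time-series classification strategy compared above, where each pixel's multi-date NDVI profile is fed to a support vector machine; the band math, stacking, and classifier settings are illustrative assumptions, and the study itself used GF-1 WFV imagery with both SVM and maximum likelihood classifiers.

    import numpy as np
    from sklearn.svm import SVC

    def ndvi(red, nir):
        """Normalized difference vegetation index from red and near-infrared bands."""
        return (nir - red) / (nir + red + 1e-6)

    def classify_time_series(ndvi_stack, train_pixels, train_labels):
        """ndvi_stack: (n_dates, ny, nx) NDVI time series; train_pixels: list of (row, col)."""
        n_dates, ny, nx = ndvi_stack.shape
        X_all = ndvi_stack.reshape(n_dates, -1).T               # one temporal profile per pixel
        X_train = np.array([ndvi_stack[:, r, c] for r, c in train_pixels])
        clf = SVC(kernel='rbf', C=10.0, gamma='scale').fit(X_train, train_labels)
        return clf.predict(X_all).reshape(ny, nx)               # class map (e.g. S. alterniflora vs other)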

  6. Fast approximate delivery of fluence maps for IMRT and VMAT

    NASA Astrophysics Data System (ADS)

    Balvert, Marleen; Craft, David

    2017-02-01

    In this article we provide a method to generate the trade-off between delivery time and fluence map matching quality for dynamically delivered fluence maps. At the heart of our method lies a mathematical programming model that, for a given duration of delivery, optimizes leaf trajectories and dose rates such that the desired fluence map is reproduced as well as possible. We begin with the single fluence map case and then generalize the model and the solution technique to the delivery of sequential fluence maps. The resulting large-scale, non-convex optimization problem was solved using a heuristic approach. We test our method using a prostate case and a head and neck case, and present the resulting trade-off curves. Analysis of the leaf trajectories reveals that short time plans have larger leaf openings in general than longer delivery time plans. Our method allows one to explore the continuum of possibilities between coarse, large segment plans characteristic of direct aperture approaches and narrow field plans produced by sliding window approaches. Exposing this trade-off will allow for an informed choice between plan quality and solution time. Further research is required to speed up the optimization process to make this method clinically implementable.

  7. 47 CFR 15.323 - Specific requirements for devices operating in the 1920-1930 MHz sub-band.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...] (c) Devices must incorporate a mechanism for monitoring the time and spectrum windows that its... transmission, devices must monitor the combined time and spectrum windows in which they intend to transmit for... windows without further monitoring. However, occupation of the same combined time and spectrum windows by...

  8. Polyp measurement with CT colonography: multiple-reader, multiple-workstation comparison.

    PubMed

    Young, Brett M; Fletcher, J G; Paulsen, Scott R; Booya, Fargol; Johnson, C Daniel; Johnson, Kristina T; Melton, Zackary; Rodysill, Drew; Mandrekar, Jay

    2007-01-01

    The risk of invasive colorectal cancer in colorectal polyps correlates with lesion size. Our purpose was to define the most accurate methods for measuring polyp size at CT colonography (CTC) using three models of workstations and multiple observers. Six reviewers measured 24 unique polyps of known size (5, 7, 10, and 12 mm), shape (sessile, flat, and pedunculated), and location (straight or curved bowel segment) using CTC data sets obtained at two doses (5 mAs and 65 mAs) and a previously described colonic phantom model. Reviewers measured the largest diameter of polyps on three proprietary workstations. Each polyp was measured with lung and soft-tissue windows on axial, 2D multiplanar reconstruction (MPR), and 3D images. There were significant differences among measurements obtained at various settings within each workstation (p < 0.0001). Measurements on 2D images were more accurate with lung window than with soft-tissue window settings (p < 0.0001). For the 65-mAs data set, the most accurate measurements were obtained in analysis of axial images with lung window, 2D MPR images with lung window, and 3D tissue cube images for Wizard, Advantage, and Vitrea workstations, respectively, without significant differences in accuracy among techniques (0.11 < p < 0.59). The mean absolute error values for these optimal settings were 0.48 mm, 0.61 mm, and 0.76 mm, respectively, for the three workstations. Within the ultralow-dose 5-mAs data set the best methods for Wizard, Advantage, and Vitrea were axial with lung window, 2D MPR with lung window, and 2D MPR with lung window, respectively. Use of nearly all measurement methods, except for the Vitrea 3D tissue cube and the Wizard 2D MPR with lung window, resulted in undermeasurement of the true size of the polyps. Use of CTC computer workstations facilitates accurate polyp measurement. For routine CTC examinations, polyps should be measured with lung window settings on 2D axial or MPR images (Wizard and Advantage) or 3D images (Vitrea). When these optimal methods are used, these three commercial workstations do not differ significantly in acquisition of accurate polyp measurements at routine dose settings.

  9. The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality

    NASA Technical Reports Server (NTRS)

    Conway, Darrel J.; Hughes, Steven P.

    2010-01-01

    The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).

  10. Sub-half-micron contact window design with 3D photolithography simulator

    NASA Astrophysics Data System (ADS)

    Brainerd, Steve K.; Bernard, Douglas A.; Rey, Juan C.; Li, Jiangwei; Granik, Yuri; Boksha, Victor V.

    1997-07-01

    In state-of-the-art IC design and manufacturing, certain lithography layers have unique requirements. Latitudes and tolerances that apply to contacts and polysilicon gates are tight for such critical layers. Industry experts are discussing the most cost-effective ways to use feature-oriented equipment and materials already developed for these layers. Such requirements introduce new dimensions into the traditionally challenging task for the photolithography engineer when considering various combinations of multiple factors to optimize and control the process. In addition, he/she faces a rapidly increasing cost of experiments, limited time and scarce access to equipment to conduct them. All the reasons presented above support simulation as an ideal method to satisfy these demands. However, lithography engineers may be easily dissatisfied with a simulation tool when discovering disagreement between the simulation and experimental data. The problem is that several parameters used in photolithography simulation are very process specific. Calibration, i.e., matching experimental and simulation data using a specific set of procedures, allows one to effectively use the simulation tool. We present results of a simulation-based approach to optimize photolithography processes for sub-0.5 micron contact windows. Our approach consists of: (1) 3D simulation to explore different lithographic options, (2) calibration to a range of process conditions with extensive use of specifically developed optimization techniques. The choice of a 3D simulator is essential because of the 3D nature of the problem of contact window design. We use DEPICT 4.1. This program performs fast aerial image simulation as presented before. For 3D exposure the program uses an extension to three dimensions of the high numerical aperture model combined with Fast Fourier Transforms for maximum performance and accuracy. We use the Kim (U.C. Berkeley) model and the fast marching Level Set method, respectively, for the calculation of resist development rates and resist surface movement during the development process. Calibration efforts were aimed at matching experimental results on contact windows obtained after exposure of a binary mask. Additionally, simulation was applied to conduct quantitative analysis of PSM design capabilities, optical proximity correction, and stepper parameter optimization. Extensive experiments covered exposure (ASML 5500/100D stepper), pre- and post-exposure bake and development (2.38% TMAH, puddle process) of JSR IX725D2G and TOK iP3500 photoresist films on 200 mm test wafers. `Aquatar' was used as the top antireflective coating. SEM pictures of developed patterns were analyzed and compared with simulation results for different values of defocus, exposure energies, numerical aperture and partial coherence.

  11. Exploring How Age, Accession Source, Childbearing and the SWO Career Path Influence Female SWO Retention

    DTIC Science & Technology

    2017-03-01

    OCW from ages 25–30 based on the 2015 U.S. Department of Health and Human Services Report and the Navy’s biennial Pregnancy and Parenthood Survey ... Survey, adds the optimal childbearing window (OCW) to illustrate how pregnancy timing would align between ages 25–30 on each SWO career path. Each...U.S. Department of Health and Human Services report and the biennial U.S. Navy Pregnancy and Parenthood Survey, a majority of women are having

  12. Resource Constrained Planning of Multiple Projects with Separable Activities

    NASA Astrophysics Data System (ADS)

    Fujii, Susumu; Morita, Hiroshi; Kanawa, Takuya

    In this study we consider a resource-constrained planning problem of multiple projects with separable activities. The problem seeks a plan for processing the activities that respects resource availability with time windows. We propose a solution algorithm based on the branch and bound method to obtain the optimal solution minimizing the completion time of all projects. We develop three methods to improve computational efficiency: obtaining an initial solution with the minimum slack time rule, estimating a lower bound that considers both time and resource constraints, and introducing an equivalence relation for the bounding operation. The effectiveness of the proposed methods is demonstrated by numerical examples. In particular, as the number of planned projects increases, the average computational time and the number of searched nodes are reduced.

  13. Better than Optimal by Taking a Limit?

    ERIC Educational Resources Information Center

    Betounes, David

    2012-01-01

    Designing an optimal Norman window is a standard calculus exercise. How much more difficult (or interesting) is its generalization to deploying multiple semicircles along the head (or along head and sill, or head and jambs)? What if we use shapes besides semicircles? As the number of copies of the shape increases and the optimal Norman windows…
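
    For readers who want the baseline problem that this record generalizes, the classic single-semicircle Norman window can be worked out in a few lines. The derivation below is the standard textbook sketch rather than anything taken from the article, assuming a fixed perimeter P and a window made of a rectangle of width 2r and height h capped by a semicircle of radius r:

        \begin{aligned}
        &\text{Constraint: } P = 2h + 2r + \pi r, \qquad \text{Area: } A = 2rh + \tfrac{\pi}{2}r^{2},\\
        &h = \tfrac{1}{2}\bigl(P - (2+\pi)r\bigr) \;\Rightarrow\; A(r) = rP - \bigl(2 + \tfrac{\pi}{2}\bigr)r^{2},\\
        &A'(r) = P - (4+\pi)r = 0 \;\Rightarrow\; r^{*} = \frac{P}{4+\pi}, \qquad h^{*} = r^{*}.
        \end{aligned}

    Because the optimum has h = r, the overall width 2r equals the overall height h + r. The generalizations alluded to above replace the single semicircular cap (and its arc-length contribution to the perimeter) with several lobes or other shapes, but the optimization pattern stays the same.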

  14. Solving a bi-objective mathematical model for location-routing problem with time windows in multi-echelon reverse logistics using metaheuristic procedure

    NASA Astrophysics Data System (ADS)

    Ghezavati, V. R.; Beigi, M.

    2016-12-01

    During the last decade, the stringent pressures from environmental and social requirements have spurred an interest in designing a reverse logistics (RL) network. The success of a logistics system may depend on decisions about facility locations and vehicle routing. The location-routing problem (LRP) simultaneously locates the facilities and designs the travel routes for vehicles among established facilities and existing demand points. In this paper, we consider the location-routing problem with time windows (LRPTW) with a homogeneous fleet, together with the design of a multi-echelon, capacitated reverse logistics network, a setting that arises in many real-life situations in logistics management. Our proposed RL network consists of hybrid collection/inspection centers, recovery centers and disposal centers. Here, we present a new bi-objective mathematical programming (BOMP) model for the LRPTW in reverse logistics. Since this type of problem is NP-hard, the non-dominated sorting genetic algorithm II (NSGA-II) is proposed to obtain the Pareto frontier for the given problem. Several numerical examples are presented to illustrate the effectiveness of the proposed model and algorithm. Also, the present work is an effort to effectively implement the ɛ-constraint method in GAMS software for producing the Pareto-optimal solutions of a BOMP. The results of the proposed algorithm have been compared with the ɛ-constraint method. The computational results show that the ɛ-constraint method is able to solve small-size instances to optimality within reasonable computing times, and for medium-to-large-sized problems, the proposed NSGA-II works better than the ɛ-constraint method.
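
    As a brief reminder of the ɛ-constraint method used as the benchmark in this record, the generic scalarization below reduces a bi-objective model to a sequence of single-objective problems. This is the textbook form rather than the authors' specific GAMS formulation, with f_1 and f_2 standing for the two objectives and X for the feasible set defined by the location, routing, capacity and time-window constraints:

        \min_{x \in X} \; f_1(x) \quad \text{subject to} \quad f_2(x) \le \varepsilon .

    Re-solving this problem over a grid of ɛ values between the smallest and largest attainable values of f_2 traces out an approximation of the Pareto frontier, which is the set of solutions the NSGA-II results are compared against.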

  15. A Feeder-Bus Dispatch Planning Model for Emergency Evacuation in Urban Rail Transit Corridors

    PubMed Central

    Wang, Yun; Yan, Xuedong; Zhou, Yu; Zhang, Wenyi

    2016-01-01

    The mobility of modern metropolises strongly relies on urban rail transit (URT) systems, and such heavy dependence means that even minor service interruptions can make the URT systems unsustainable. This study aims at optimally dispatching the ground feeder-bus to coordinate with the urban rails’ operation for eliminating the effect of unexpected service interruptions in URT corridors. A feeder-bus dispatch planning model was proposed for the collaborative optimization of URT and feeder-bus cooperation under emergency situations, minimizing the total evacuation cost of the feeder-buses. To solve the model, the concept of a dummy feeder-bus system is proposed to transform the non-linear model into an integer linear programming (ILP) model, i.e., a traditional transportation problem. The case study of Line #2 of Nanjing URT in China was adopted to illustrate the model application and sensitivity analyses of the key variables. The modeling results show that as the evacuation time window increases, the total evacuation cost as well as the number of dispatched feeder-buses decrease, and the dispatched feeder-buses need to operate more times along the feeder-bus line. The number of dispatched feeder-buses does not show an obvious change with the increase of parking spot capacity and time window, indicating that simply increasing the parking spot capacity would cause huge waste for the emergent bus utilization. When evacuation demand is unbalanced between stations, more feeder-buses are needed. The method of this study will contribute to improving transportation emergency management and resource allocation for URT systems. PMID:27676179
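
    For context, the "traditional transportation problem" that the dummy feeder-bus construction reduces the model to has the standard linear form shown below. This is the generic textbook formulation rather than the paper's exact notation, with x_ij the evacuation flow assigned from supply node i to demand node j, c_ij its unit cost, s_i the supplies and d_j the demands:

        \min \sum_{i=1}^{m}\sum_{j=1}^{n} c_{ij}\,x_{ij}
        \quad \text{s.t.} \quad \sum_{j=1}^{n} x_{ij} \le s_i \;\,(i=1,\dots,m), \quad
        \sum_{i=1}^{m} x_{ij} \ge d_j \;\,(j=1,\dots,n), \quad x_{ij} \ge 0 .

    Because the transportation constraint matrix is totally unimodular, integer supplies and demands yield integer optimal vertex solutions even when the problem is solved as an ordinary linear program, which is what makes this kind of reduction attractive for dispatch planning.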

  16. An Analysis of Peer-Reviewed Scores and Impact Factors with Different Citation Time Windows: A Case Study of 28 Ophthalmologic Journals

    PubMed Central

    Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu

    2015-01-01

    Background An important attribute of the traditional impact factor was the controversial 2-year citation window. So far, several scholars have proposed using different citation time windows for evaluating journals. However, there is no confirmation of whether a longer citation time window would be better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? In order to understand these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed by the Science Citation Index Expanded (SCIE) database. Methods The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (including 2IF, 3IF, 4IF, 5IF, and 6IF) of 28 ophthalmologic journals were computed and compared in accordance with each impact factor’s definition and formula, using the citation analysis function of the Web of Science (WoS) database. An analysis of the correlation between impact factors with different citation time windows and peer-reviewed scores was carried out. Results Although impact factor values with different citation time windows were different, there was a high level of correlation between them when it came to evaluating journals. In the current study, for ophthalmologic journals’ impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal ranges for comparison, when assessed in relation to peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals. Research Limitations Our study is based on ophthalmology journals and we only analyze the impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns. Originality/Value We designed the survey questionnaire ourselves, specifically to assess the real influence of journals. We used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation. PMID:26295157

  17. An Analysis of Peer-Reviewed Scores and Impact Factors with Different Citation Time Windows: A Case Study of 28 Ophthalmologic Journals.

    PubMed

    Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu

    2015-01-01

    An important attribute of the traditional impact factor was the controversial 2-year citation window. So far, several scholars have proposed using different citation time windows for evaluating journals. However, there is no confirmation of whether a longer citation time window would be better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? In order to understand these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed by the Science Citation Index Expanded (SCIE) database. The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (including 2IF, 3IF, 4IF, 5IF, and 6IF) of 28 ophthalmologic journals were computed and compared in accordance with each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. An analysis of the correlation between impact factors with different citation time windows and peer-reviewed scores was carried out. Although impact factor values with different citation time windows were different, there was a high level of correlation between them when it came to evaluating journals. In the current study, for ophthalmologic journals' impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal ranges for comparison, when assessed in relation to peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals. Our study is based on ophthalmology journals and we only analyze the impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns. We designed the survey questionnaire ourselves, specifically to assess the real influence of journals. We used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation.
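
    The n-year impact factors compared in these two records all follow the same pattern: citations received in the census year to items published in the preceding n years, divided by the number of citable items published in those years. The snippet below is a minimal illustration of that calculation, not the authors' WoS-based workflow; the citation and publication counts are hypothetical placeholders.

        def impact_factor(citations_by_pub_year, items_by_pub_year, census_year, window):
            """Generalized n-year impact factor.

            citations_by_pub_year: citations received in `census_year` to items
                published in each earlier year.
            items_by_pub_year: number of citable items published in each year.
            window: citation time window in years (2 -> classic 2IF, 5 -> 5IF, ...).
            """
            years = range(census_year - window, census_year)
            cites = sum(citations_by_pub_year.get(y, 0) for y in years)
            items = sum(items_by_pub_year.get(y, 0) for y in years)
            return cites / items if items else float("nan")

        # Hypothetical counts for one journal, census year 2013.
        citations = {2007: 150, 2008: 210, 2009: 260, 2010: 310, 2011: 340, 2012: 280}
        items = {2007: 90, 2008: 95, 2009: 100, 2010: 110, 2011: 120, 2012: 115}

        for n in (2, 3, 4, 5, 6):
            print(f"{n}IF = {impact_factor(citations, items, 2013, n):.3f}")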

  18. Predicting progression of mild cognitive impairment to dementia using neuropsychological data: a supervised learning approach using time windows.

    PubMed

    Pereira, Telma; Lemos, Luís; Cardoso, Sandra; Silva, Dina; Rodrigues, Ana; Santana, Isabel; de Mendonça, Alexandre; Guerreiro, Manuela; Madeira, Sara C

    2017-07-19

    Predicting progression from a stage of Mild Cognitive Impairment (MCI) to dementia is a major pursuit in current research. It is broadly accepted that cognition declines along a continuum between MCI and dementia. As such, cohorts of MCI patients are usually heterogeneous, containing patients at different stages of the neurodegenerative process. This hampers the prognostic task. Nevertheless, when learning prognostic models, most studies use the entire cohort of MCI patients regardless of their disease stages. In this paper, we propose a Time Windows approach to predict conversion to dementia, learning with patients stratified using time windows, thus fine-tuning the prognosis regarding the time to conversion. In the proposed Time Windows approach, we grouped patients based on the clinical information of whether they converted (converter MCI) or remained MCI (stable MCI) within a specific time window. We tested time windows of 2, 3, 4 and 5 years. We developed a prognostic model for each time window using clinical and neuropsychological data and compared this approach with the one commonly used in the literature, where all patients are used to learn the models, referred to as the First Last approach. This makes it possible to move from the traditional question "Will an MCI patient convert to dementia somewhere in the future?" to the question "Will an MCI patient convert to dementia in a specific time window?". The proposed Time Windows approach outperformed the First Last approach. The results showed that we can predict conversion to dementia as early as 5 years before the event with an AUC of 0.88 in the cross-validation set and 0.76 in an independent validation set. Prognostic models using time windows have higher performance when predicting progression from MCI to dementia, when compared to the prognostic approach commonly used in the literature. Furthermore, the proposed Time Windows approach is more relevant from a clinical point of view, predicting conversion within a temporal interval rather than sometime in the future and allowing clinicians to adjust treatments and clinical appointments in a timely manner.
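
    A minimal sketch of the stratification step behind the Time Windows approach: given each patient's follow-up length and (if any) time to conversion, patients are labeled converter or stable MCI with respect to a chosen window, and patients whose follow-up is too short to rule out conversion within that window are excluded. The field names and example cohort are hypothetical, not the study's actual dataset.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Patient:
            pid: str
            followup_years: float              # years of follow-up after baseline
            conversion_year: Optional[float]   # years from baseline to dementia, or None

        def label_for_window(p: Patient, window_years: float) -> Optional[str]:
            """Return 'converter', 'stable', or None (excluded) for one time window."""
            if p.conversion_year is not None and p.conversion_year <= window_years:
                return "converter"             # converted within the window
            if p.followup_years >= window_years:
                return "stable"                # observed long enough without converting
            return None                        # follow-up too short: outcome unknown

        # Hypothetical cohort; one prognostic model would be trained per window.
        cohort = [Patient("p01", 6.0, 3.5), Patient("p02", 4.0, None),
                  Patient("p03", 2.5, None), Patient("p04", 7.0, 6.0)]

        for window in (2, 3, 4, 5):
            print(window, {p.pid: label_for_window(p, window) for p in cohort})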

  19. Mission planning optimization of video satellite for ground multi-object staring imaging

    NASA Astrophysics Data System (ADS)

    Cui, Kaikai; Xiang, Junhua; Zhang, Yulin

    2018-03-01

    This study investigates the emergency scheduling problem of ground multi-object staring imaging for a single video satellite. In the proposed mission scenario, the ground objects require a specified duration of staring imaging by the video satellite. The planning horizon is not long, i.e., it is usually shorter than one orbit period. A binary decision variable and the imaging order are used as the design variables, and the total observation revenue combined with the influence of the total attitude maneuvering time is regarded as the optimization objective. Based on the constraints of the observation time windows, satellite attitude adjustment time, and satellite maneuverability, a constraint satisfaction mission planning model is established for ground object staring imaging by a single video satellite. Further, a modified ant colony optimization algorithm with tabu lists (Tabu-ACO) is designed to solve this problem. The proposed algorithm can fully exploit the intelligence and local search ability of ACO. Based on full consideration of the mission characteristics, the design of the tabu lists can reduce the search range of ACO and improve the algorithm efficiency significantly. The simulation results show that the proposed algorithm outperforms the conventional algorithm in terms of optimization performance, and it can obtain satisfactory scheduling results for the mission planning problem.

  20. Timing of favorable conditions, competition and fertility interact to govern recruitment of invasive Chinese tallow tree in stressful environments.

    PubMed

    Gabler, Christopher A; Siemann, Evan

    2013-01-01

    The rate of new exotic recruitment following removal of adult invaders (reinvasion pressure) influences restoration outcomes and costs but is highly variable and poorly understood. We hypothesize that broad variation in average reinvasion pressure of Triadica sebifera (Chinese tallow tree, a major invader) arises from differences among habitats in spatiotemporal availability of realized recruitment windows. These windows are periods of variable duration long enough to permit establishment given local environmental conditions. We tested this hypothesis via a greenhouse mesocosm experiment that quantified how the duration of favorable moisture conditions prior to flood or drought stress (window duration), competition and nutrient availability influenced Triadica success in high stress environments. Window duration influenced pre-stress seedling abundance and size, growth during stress and final abundance; it interacted with other factors to affect final biomass and germination during stress. Stress type and competition impacted final size and biomass, plus germination, mortality and changes in size during stress. Final abundance also depended on competition and the interaction of window duration, stress type and competition. Fertilization interacted with competition and stress to influence biomass and changes in height, respectively, but did not affect Triadica abundance. Overall, longer window durations promoted Triadica establishment, competition and drought (relative to flood) suppressed establishment, and fertilization had weak effects. Interactions among factors frequently produced different effects in specific contexts. Results support our 'outgrow the stress' hypothesis and show that temporal availability of abiotic windows and factors that influence growth rates govern Triadica recruitment in stressful environments. These findings suggest that native seed addition can effectively suppress superior competitors in stressful environments. We also describe environmental scenarios where specific management methods may be more or less effective. Our results enable better niche-based estimates of local reinvasion pressure, which can improve restoration efficacy and efficiency by informing site selection and optimal management.

  1. Timing of Favorable Conditions, Competition and Fertility Interact to Govern Recruitment of Invasive Chinese Tallow Tree in Stressful Environments

    PubMed Central

    Gabler, Christopher A.; Siemann, Evan

    2013-01-01

    The rate of new exotic recruitment following removal of adult invaders (reinvasion pressure) influences restoration outcomes and costs but is highly variable and poorly understood. We hypothesize that broad variation in average reinvasion pressure of Triadica sebifera (Chinese tallow tree, a major invader) arises from differences among habitats in spatiotemporal availability of realized recruitment windows. These windows are periods of variable duration long enough to permit establishment given local environmental conditions. We tested this hypothesis via a greenhouse mesocosm experiment that quantified how the duration of favorable moisture conditions prior to flood or drought stress (window duration), competition and nutrient availability influenced Triadica success in high stress environments. Window duration influenced pre-stress seedling abundance and size, growth during stress and final abundance; it interacted with other factors to affect final biomass and germination during stress. Stress type and competition impacted final size and biomass, plus germination, mortality and changes in size during stress. Final abundance also depended on competition and the interaction of window duration, stress type and competition. Fertilization interacted with competition and stress to influence biomass and changes in height, respectively, but did not affect Triadica abundance. Overall, longer window durations promoted Triadica establishment, competition and drought (relative to flood) suppressed establishment, and fertilization had weak effects. Interactions among factors frequently produced different effects in specific contexts. Results support our ‘outgrow the stress’ hypothesis and show that temporal availability of abiotic windows and factors that influence growth rates govern Triadica recruitment in stressful environments. These findings suggest that native seed addition can effectively suppress superior competitors in stressful environments. We also describe environmental scenarios where specific management methods may be more or less effective. Our results enable better niche-based estimates of local reinvasion pressure, which can improve restoration efficacy and efficiency by informing site selection and optimal management. PMID:23967212

  2. Boosting the down-shifting luminescence of rare-earth nanocrystals for biological imaging beyond 1500 nm.

    PubMed

    Zhong, Yeteng; Ma, Zhuoran; Zhu, Shoujun; Yue, Jingying; Zhang, Mingxi; Antaris, Alexander L; Yuan, Jie; Cui, Ran; Wan, Hao; Zhou, Ying; Wang, Weizhi; Huang, Ngan F; Luo, Jian; Hu, Zhiyuan; Dai, Hongjie

    2017-09-29

    In vivo fluorescence imaging in the near-infrared region between 1500-1700 nm (NIR-IIb window) affords high spatial resolution, deep-tissue penetration, and diminished auto-fluorescence due to the suppressed scattering of long-wavelength photons and large fluorophore Stokes shifts. However, very few NIR-IIb fluorescent probes exist currently. Here, we report the synthesis of a down-conversion luminescent rare-earth nanocrystal with cerium doping (Er/Ce co-doped NaYbF4 nanocrystal core with an inert NaYF4 shell). Ce doping is found to suppress the up-conversion pathway while boosting down-conversion by ~9-fold to produce bright 1550 nm luminescence under 980 nm excitation. Optimization of the inert shell coating surrounding the core and hydrophilic surface functionalization minimize the luminescence quenching effect by water. The resulting biocompatible, bright 1550 nm emitting nanoparticles enable fast in vivo imaging of blood vasculature in the mouse brain and hindlimb in the NIR-IIb window with short exposure time of 20 ms for rare-earth based probes. Fluorescence imaging in the near-infrared window between 1500-1700 nm (NIR-IIb window) offers superior spatial resolution and tissue penetration depth, but few NIR-IIb probes exist. Here, the authors synthesize rare earth down-converting nanocrystals as promising fluorescent probes for in vivo imaging in this spectral region.

  3. Critical disease windows shaped by stress exposure alter allocation trade-offs between development and immunity.

    PubMed

    Kirschman, Lucas J; Crespi, Erica J; Warne, Robin W

    2018-01-01

    Ubiquitous environmental stressors are often thought to alter animal susceptibility to pathogens and contribute to disease emergence. However, duration of exposure to a stressor is likely critical, because while chronic stress is often immunosuppressive, acute stress can temporarily enhance immune function. Furthermore, host susceptibility to stress and disease often varies with ontogeny, increasing during critical developmental windows. How the duration and timing of exposure to stressors interact to shape critical windows and influence disease processes is not well tested. We used ranavirus and larval amphibians as a model system to investigate how physiological stress and pathogenic infection shape development and disease dynamics in vertebrates. Based on a resource allocation model, we designed experiments to test how exposure to stressors may induce resource trade-offs that shape critical windows and disease processes because the neuroendocrine stress axis coordinates developmental remodelling, immune function and energy allocation in larval amphibians. We used wood frog larvae (Lithobates sylvaticus) to investigate how chronic and acute exposure to corticosterone, the dominant amphibian glucocorticoid hormone, mediates development and immune function via splenocyte immunohistochemistry analysis in association with ranavirus infection. Corticosterone treatments affected immune function, as both chronic and acute exposure suppressed splenocyte proliferation, although viral replication rate increased only in the chronic corticosterone treatment. Time to metamorphosis and survival depended on both corticosterone treatment and infection status. In the control and chronic corticosterone treatments, ranavirus infection decreased survival and delayed metamorphosis, although chronic corticosterone exposure accelerated the rate of metamorphosis in uninfected larvae. Acute corticosterone exposure accelerated metamorphosis and increased survival in infected larvae. Interactions between stress exposure (via glucocorticoid actions) and infection impose resource trade-offs that shape optimal allocation between development and somatic function. As a result, critical disease windows are likely shaped by stress exposure because any conditions that induce changes in differentiation rates will alter the duration and susceptibility of organisms to stressors or disease. © 2017 The Authors. Journal of Animal Ecology © 2017 British Ecological Society.

  4. Longitudinal positron emission tomography imaging of glial cell activation in a mouse model of mesial temporal lobe epilepsy: Toward identification of optimal treatment windows.

    PubMed

    Nguyen, Duc-Loc; Wimberley, Catriona; Truillet, Charles; Jego, Benoit; Caillé, Fabien; Pottier, Géraldine; Boisgard, Raphaël; Buvat, Irène; Bouilleret, Viviane

    2018-06-01

    Mesiotemporal lobe epilepsy is the most common type of drug-resistant partial epilepsy, with a specific history that often begins with status epilepticus due to various neurological insults followed by a silent period. During this period, before the first seizure occurs, a specific lesion develops, described as unilateral hippocampal sclerosis (HS). It is still challenging to determine which drugs, administered at which time point, will be most effective during the formation of this epileptic process. Neuroinflammation plays an important role in pathophysiological mechanisms in epilepsy, and therefore brain inflammation biomarkers such as translocator protein 18 kDa (TSPO) can be potent epilepsy biomarkers. TSPO is associated with reactive astrocytes and microglia. A unilateral intrahippocampal kainate injection mouse model can reproduce the defining features of human temporal lobe epilepsy with unilateral HS and the pattern of chronic pharmacoresistant temporal seizures. We hypothesized that longitudinal imaging using TSPO positron emission tomography (PET) with 18 F-DPA-714 could identify optimal treatment windows in a mouse model during the formation of HS. The model was induced into the right dorsal hippocampus of male C57/Bl6 mice. Micro-PET/computed tomographic scanning was performed before model induction and along the development of the HS at 7 days, 14 days, 1 month, and 6 months. In vitro autoradiography and immunohistofluorescence were performed on additional mice at each time point. TSPO PET uptake reached peak at 7 days and mostly related to microglial activation, whereas after 14 days, reactive astrocytes were shown to be the main cells expressing TSPO, reflected by a continuing increased PET uptake. TSPO-targeted PET is a highly potent longitudinal biomarker of epilepsy and could be of interest to determine the therapeutic windows in epilepsy and to monitor response to treatment. Wiley Periodicals, Inc. © 2018 International League Against Epilepsy.

  5. Pinning down high-performance Cu-chalcogenides as thin-film solar cell absorbers: A successive screening approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yubo; Zhang, Wenqing, E-mail: wqzhang@mail.sic.ac.cn, E-mail: pzhang3@buffalo.edu; State Key Laboratory of High Performance Ceramics and Superfine Microstructures, Shanghai Institute of Ceramics, Chinese Academy of Sciences, Shanghai 200050

    2016-05-21

    The photovoltaic performance of Cu-chalcogenide solar cells is strongly correlated with fundamental absorber properties such as an optimal bandgap, a desired band alignment with the window material, and high photon absorption ability. According to these criteria, we carry out a successive screening of 90 Cu-chalcogenides using efficient theoretical approaches. Besides the well-recognized CuInSe2 and Cu2ZnSnSe4 materials, several novel candidates are identified to have optimal bandgaps of around 1.0–1.5 eV, spike-like band alignments with the CdS window layer, sharp photon absorption edges, and high absorption coefficients. These new systems have great potential to be superior absorbers for photovoltaic applications if their carrier transport and defect properties are properly optimized.

  6. Infrared sensor and window system issues

    NASA Astrophysics Data System (ADS)

    Hargraves, Charles H., Jr.; Martin, James M.

    1992-12-01

    EO/IR windows are a significant challenge for the weapon system sensor designer who must design for high EO performance, low radar cross section (RCS), supersonic flight, durability, producibility and affordable initial and life cycle costs. This is particularly true in the 8 to 12 micron IR band at which window materials and coating choices are limited by system design requirements. The requirements also drive the optimization of numerous mechanical, optical, materials, and electrical parameters. This paper addresses the EO/IR window as a system design challenge. The interrelationship of the optical, mechanical, and system design processes are examined. This paper presents a summary of the test results, trade studies and analyses that were performed for multi-segment, flight-worthy optical windows with superior optical performance at subsonic and supersonic aircraft velocities and reduced radar cross section. The impact of the window assembly on EO system modulation transfer function (MTF) and sensitivity will be discussed. The use of conductive coatings for shielding/signature control will be discussed.

  7. The suppression effect of external magnetic field on the high-power microwave window multipactor phenomenon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xue, E-mail: zhangxue.iecas@yahoo.com; Wang, Yong; Fan, Junjie

    2015-02-15

    To suppress the surface multipactor phenomenon and improve the transmitting power of the high-power microwave window, the application of external magnetic fields is theoretically analyzed and simulated. A Monte Carlo algorithm is used to track the secondary electron trajectories and study the multipactor scenario on the surface of a cylinder window. It is confirmed that over-resonant magnetic fields (an external magnetic field whose magnitude is slightly greater than that of a resonant magnetic field) will generate a compensating trajectory and collision, which can suppress the secondary electron avalanche. The optimal value of this external magnetic field that will avoid the multipactor phenomenon on cylinder windows is discussed.

  8. Active noise attenuation in ventilation windows.

    PubMed

    Huang, Huahua; Qiu, Xiaojun; Kang, Jian

    2011-07-01

    The feasibility of applying active noise control techniques to attenuate low frequency noise transmission through a natural ventilation window into a room is investigated analytically and experimentally. The window system is constructed by staggering the opening sashes of a spaced double glazing window to allow ventilation and natural light. An analytical model based on the modal expansion method is developed to calculate the low frequency sound field inside the window and the room and to be used in the active noise control simulations. The effectiveness of the proposed analytical model is validated by using the finite element method. The performance of the active control system for a window with different source and receiver configurations is compared, and it is found that the numerical and experimental results are in good agreement and the best result is achieved when the secondary sources are placed in the center at the bottom of the staggered window. The extra attenuation at the observation points in the optimized window system is almost equivalent to the noise reduction at the error sensor and the frequency range of effective control is up to 390 Hz in the case of a single channel active noise control system. © 2011 Acoustical Society of America

  9. Temporal Integration Windows in Neural Processing and Perception Aligned to Saccadic Eye Movements.

    PubMed

    Wutz, Andreas; Muschter, Evelyn; van Koningsbruggen, Martijn G; Weisz, Nathan; Melcher, David

    2016-07-11

    When processing dynamic input, the brain balances the opposing needs of temporal integration and sensitivity to change. We hypothesized that the visual system might resolve this challenge by aligning integration windows to the onset of newly arriving sensory samples. In a series of experiments, human participants observed the same sequence of two displays separated by a brief blank delay when performing either an integration or segregation task. First, using magneto-encephalography (MEG), we found a shift in the stimulus-evoked time courses by a 150-ms time window between task signals. After stimulus onset, multivariate pattern analysis (MVPA) decoding of task in occipital-parietal sources remained above chance for almost 1 s, and the task-decoding pattern interacted with task outcome. In the pre-stimulus period, the oscillatory phase in the theta frequency band was informative about both task processing and behavioral outcome for each task separately, suggesting that the post-stimulus effects were caused by a theta-band phase shift. Second, when aligning stimulus presentation to the onset of eye fixations, there was a similar phase shift in behavioral performance according to task demands. In both MEG and behavioral measures, task processing was optimal first for segregation and then integration, with opposite phase in the theta frequency range (3-5 Hz). The best fit to neurophysiological and behavioral data was given by a dampened 3-Hz oscillation from stimulus or eye fixation onset. The alignment of temporal integration windows to input changes found here may serve to actively organize the temporal processing of continuous sensory input. Copyright © 2016 The Author(s). Published by Elsevier Ltd.. All rights reserved.

  10. Optimization and Simulation of SLM Process for High Density H13 Tool Steel Parts

    NASA Astrophysics Data System (ADS)

    Laakso, Petri; Riipinen, Tuomas; Laukkanen, Anssi; Andersson, Tom; Jokinen, Antero; Revuelta, Alejandro; Ruusuvuori, Kimmo

    This paper demonstrates the successful printing and optimization of processing parameters of high-strength H13 tool steel by Selective Laser Melting (SLM). D-Optimal Design of Experiments (DOE) approach is used for parameter optimization of laser power, scanning speed and hatch width. With 50 test samples (1×1×1cm) we establish parameter windows for these three parameters in relation to part density. The calculated numerical model is found to be in good agreement with the density data obtained from the samples using image analysis. A thermomechanical finite element simulation model is constructed of the SLM process and validated by comparing the calculated densities retrieved from the model with the experimentally determined densities. With the simulation tool one can explore the effect of different parameters on density before making any printed samples. Establishing a parameter window provides the user with freedom for parameter selection such as choosing parameters that result in fastest print speed.

  11. A Hybrid Approach for CpG Island Detection in the Human Genome.

    PubMed

    Yang, Cheng-Hong; Lin, Yu-Da; Chiang, Yi-Cheng; Chuang, Li-Yeh

    2016-01-01

    CpG islands have been demonstrated to influence local chromatin structures and simplify the regulation of gene activity. However, the accurate and rapid determination of CpG islands for whole DNA sequences remains experimentally and computationally challenging. A novel procedure is proposed to detect CpG islands by combining clustering technology with the sliding-window method (PSO-based). Clustering technology is used to detect the locations of all possible CpG islands and process the data, effectively obviating the need for extensive and unnecessary processing of DNA fragments and thereby improving the efficiency of the sliding-window based particle swarm optimization (PSO) search. This proposed approach, named ClusterPSO, provides versatile and highly-sensitive detection of CpG islands in the human genome. In addition, the detection efficiency of ClusterPSO is compared with eight CpG island detection methods in the human genome. In a comparison of detection efficiency for CpG islands in the human genome, including sensitivity, specificity, accuracy, performance coefficient (PC), and correlation coefficient (CC), ClusterPSO showed superior detection ability among all of the tested methods. Moreover, the combination of clustering technology and the PSO method can successfully overcome their respective drawbacks while maintaining their advantages. Thus, clustering technology can be hybridized with an optimization algorithm to optimize CpG island detection. The prediction accuracy of ClusterPSO was quite high, indicating that the combination of CpGcluster and PSO has several advantages over CpGcluster and PSO alone. In addition, ClusterPSO significantly reduced implementation time.
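
    As a point of reference for the sliding-window side of the hybrid approach, the snippet below scans a sequence with a fixed-size window and flags windows that satisfy the commonly used Gardiner-Garden and Frommer criteria (GC content above 50% and an observed/expected CpG ratio above 0.6, with the minimum-length requirement handled by the window size). This is a generic illustration of window-based CpG screening, not the ClusterPSO algorithm itself.

        def cpg_windows(seq, window=200, step=50, gc_min=0.5, oe_min=0.6):
            """Yield (start, gc_content, obs_exp_ratio) for windows meeting CpG criteria."""
            seq = seq.upper()
            for start in range(0, len(seq) - window + 1, step):
                w = seq[start:start + window]
                c, g = w.count("C"), w.count("G")
                gc = (c + g) / window
                cpg = w.count("CG")
                oe = (cpg * window / (c * g)) if c and g else 0.0  # observed/expected CpG
                if gc > gc_min and oe > oe_min:
                    yield start, gc, oe

        # Tiny synthetic example: a CG-rich stretch embedded in an AT-rich background.
        toy = "AT" * 200 + "CG" * 150 + "TA" * 200
        for start, gc, oe in cpg_windows(toy):
            print(f"window at {start}: GC={gc:.2f}, Obs/Exp CpG={oe:.2f}")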

  12. Optimization for Service Routes of Pallet Service Center Based on the Pallet Pool Mode

    PubMed Central

    He, Shiwei; Song, Rui

    2016-01-01

    Service route optimization (SRO) for a pallet service center must first meet customer demand and then, through reasonable route organization, minimize the total distance driven by the vehicles. The route optimization of a pallet service center is similar to the distribution problems of the vehicle routing problem (VRP) and the Chinese postman problem (CPP), but it has its own characteristics. Based on the relevant research results, the conditions determining the number of vehicles, the one-way direction of routes, loading constraints, and time windows are fully considered, and a chance-constrained programming model with stochastic constraints is constructed, taking the shortest total path of all vehicles for a delivery (recycling) operation as the objective. Given the characteristics of the model, a hybrid intelligent algorithm combining stochastic simulation, a neural network, and an immune clonal algorithm is designed to solve it. Finally, the validity and rationality of the optimization model and algorithm are verified by a case study. PMID:27528865
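
    For reference, a chance-constrained program of the kind mentioned above has the generic form below, where ξ collects the random parameters (for example uncertain demands or travel times), the g_i are the stochastic constraints and the α_i are prescribed confidence levels; this is the standard template, not the paper's exact model:

        \min_{x} \; \mathbb{E}\bigl[f(x,\xi)\bigr]
        \quad \text{s.t.} \quad \Pr\bigl\{ g_i(x,\xi) \le 0 \bigr\} \ge \alpha_i, \quad i = 1,\dots,m .

    When these probabilities have no closed form they are usually estimated by Monte Carlo sampling, which is presumably the role of the stochastic-simulation component in the hybrid intelligent algorithm described here.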

  13. Spatial, temporal, and hybrid decompositions for large-scale vehicle routing with time windows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, Russell W

    This paper studies the use of decomposition techniques to quickly find high-quality solutions to large-scale vehicle routing problems with time windows. It considers an adaptive decomposition scheme which iteratively decouples a routing problem based on the current solution. Earlier work considered vehicle-based decompositions that partition the vehicles across the subproblems. The subproblems can then be optimized independently and merged easily. This paper argues that vehicle-based decompositions, although very effective on various problem classes, also have limitations. In particular, they do not accommodate temporal decompositions and may produce spatial decompositions that are not focused enough. This paper then proposes customer-based decompositions, which generalize vehicle-based decouplings and allow for focused spatial and temporal decompositions. Experimental results on class R2 of the extended Solomon benchmarks demonstrate the benefits of the customer-based adaptive decomposition scheme and its spatial, temporal, and hybrid instantiations. In particular, they show that customer-based decompositions bring significant benefits over large neighborhood search in contrast to vehicle-based decompositions.

  14. Reconfigurable vision system for real-time applications

    NASA Astrophysics Data System (ADS)

    Torres-Huitzil, Cesar; Arias-Estrada, Miguel

    2002-03-01

    Recently, a growing community of researchers has used reconfigurable systems to solve computationally intensive problems. Reconfigurability provides optimized processors for system-on-chip designs and makes it easy to import technology into a new system through reusable modules. The main objective of this work is the investigation of a reconfigurable computer system targeted for computer vision and real-time applications. The system is intended to circumvent the inherent computational load of most window-based computer vision algorithms. It aims to build a system for such tasks by providing an FPGA-based hardware architecture for task-specific vision applications with enough processing power, using as few hardware resources as possible, and a mechanism for building systems using this architecture. Regarding the software part of the system, a library of pre-designed and general-purpose modules that implement common window-based computer vision operations is being investigated. A common generic interface is established for these modules in order to define hardware/software components. These components can be interconnected to develop more complex applications, providing an efficient mechanism for transferring image and result data among modules. Some preliminary results are presented and discussed.

  15. Benchmarks for Enhanced Network Performance: Hands-On Testing of Operating System Solutions to Identify the Optimal Application Server Platform for the Graduate School of Business and Public Policy

    DTIC Science & Technology

    2010-09-01

    for Applied Mathematics. Kennedy, R. C. (2009a). Clocking Windows netbook performance. Retrieved on 08/14/2010, from http...podcasts.infoworld.com/d/hardware/clocking-windows- netbook -performance-883?_kip_ipx=1177119066-1281460794 Kennedy, R. C. (2009b). OfficeBench 7: A cool new way to

  16. Discrete Optimization Model for Vehicle Routing Problem with Scheduling Side Constraints

    NASA Astrophysics Data System (ADS)

    Juliandri, Dedy; Mawengkang, Herman; Bu'ulolo, F.

    2018-01-01

    The Vehicle Routing Problem (VRP) is an important element of many logistics systems, which involve routing and scheduling of vehicles from a depot to a set of customer nodes. This is a hard combinatorial optimization problem whose objective is to find an optimal set of routes used by a fleet of vehicles to serve the demands of a set of customers. The vehicles are required to return to the depot after serving the customers’ demand. The problem incorporates time windows, fleet and driver scheduling, and pick-up and delivery within the planning horizon. The goal is to determine the fleet and driver schedules and the routing policies of the vehicles. The objective is to minimize the overall cost of all routes over the planning horizon. We model the problem as a mixed integer linear program. We develop a combination of heuristics and an exact method for solving the model.
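
    Mixed integer formulations of this kind are commonly paired with fast construction heuristics. The sketch below is a generic greedy route builder for a capacitated VRP with time windows, not the authors' model or algorithm; the instance data, capacity and speed are made-up illustration values.

        import math

        # Hypothetical instance: customer id -> (x, y, demand, earliest, latest, service_time)
        DEPOT = (0.0, 0.0)
        CUSTOMERS = {
            1: (2.0, 3.0, 4, 0, 20, 1.0),
            2: (5.0, 1.0, 3, 5, 25, 1.0),
            3: (1.0, 7.0, 5, 8, 30, 1.0),
            4: (6.0, 6.0, 2, 0, 15, 1.0),
        }
        CAPACITY, SPEED = 8, 1.0

        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])

        def build_routes():
            """Greedy construction: start a route at the depot and repeatedly append the
            nearest unrouted customer that keeps capacity and time-window feasibility."""
            unrouted, routes = set(CUSTOMERS), []
            while unrouted:
                route, load, time, pos = [], 0, 0.0, DEPOT
                while True:
                    best, best_d = None, float("inf")
                    for c in unrouted:
                        x, y, demand, earliest, latest, service = CUSTOMERS[c]
                        d = dist(pos, (x, y))
                        arrival = max(time + d / SPEED, earliest)   # wait if arriving early
                        if load + demand <= CAPACITY and arrival <= latest and d < best_d:
                            best, best_d = c, d
                    if best is None:
                        if not route:   # no customer reachable even from an empty route
                            raise ValueError("instance contains an unservable customer")
                        break           # close this route and start a new vehicle
                    x, y, demand, earliest, latest, service = CUSTOMERS[best]
                    time = max(time + best_d / SPEED, earliest) + service
                    load, pos = load + demand, (x, y)
                    route.append(best)
                    unrouted.discard(best)
                routes.append(route)
            return routes

        print(build_routes())   # e.g. [[1, 2], [3, 4]] for the data above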

  17. Optimal deployment of resources for maximizing impact in spreading processes

    PubMed Central

    2017-01-01

    The effective use of limited resources for controlling spreading processes on networks is of prime significance in diverse contexts, ranging from the identification of “influential spreaders” for maximizing information dissemination and targeted interventions in regulatory networks, to the development of mitigation policies for infectious diseases and financial contagion in economic systems. Solutions for these optimization tasks that are based purely on topological arguments are not fully satisfactory; in realistic settings, the problem is often characterized by heterogeneous interactions and requires interventions in a dynamic fashion over a finite time window via a restricted set of controllable nodes. The optimal distribution of available resources hence results from an interplay between network topology and spreading dynamics. We show how these problems can be addressed as particular instances of a universal analytical framework based on a scalable dynamic message-passing approach and demonstrate the efficacy of the method on a variety of real-world examples. PMID:28900013

  18. JPARSS: A Java Parallel Network Package for Grid Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jie; Akers, Walter; Chen, Ying

    2002-03-01

    The emergence of high speed wide area networks makes grid computing a reality. However, grid applications that need reliable data transfer still have difficulty achieving optimal TCP performance due to the need to tune the TCP window size to improve bandwidth and reduce latency on a high speed wide area network. This paper presents a Java package called JPARSS (Java Parallel Secure Stream (Socket)) that divides data into partitions that are sent over several parallel Java streams simultaneously and allows Java or Web applications to achieve optimal TCP performance in a grid environment without the necessity of tuning TCP window size. This package enables single sign-on, certificate delegation and secure or plain-text data transfer using several security components based on X.509 certificates and SSL. Several experiments will be presented to show that using Java parallel streams is more effective than tuning TCP window size. In addition a simple architecture using Web services ...

  19. TIMESERIESSTREAMING.VI: LabVIEW program for reliable data streaming of large analog time series

    NASA Astrophysics Data System (ADS)

    Czerwinski, Fabian; Oddershede, Lene B.

    2011-02-01

    With modern data acquisition devices that work fast and very precisely, scientists often face the task of dealing with huge amounts of data. These need to be rapidly processed and stored onto a hard disk. We present a LabVIEW program which reliably streams analog time series of MHz sampling. Its run time has virtually no limitation. We explicitly show how to use the program to extract time series from two experiments: for a photodiode detection system that tracks the position of an optically trapped particle and for a measurement of ionic current through a glass capillary. The program is easy to use and versatile as the input can be any type of analog signal. Also, the data streaming software is simple, highly reliable, and can be easily customized to include, e.g., real-time power spectral analysis and Allan variance noise quantification.
    Program summary
    Program title: TimeSeriesStreaming.VI
    Catalogue identifier: AEHT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 250
    No. of bytes in distributed program, including test data, etc.: 63 259
    Distribution format: tar.gz
    Programming language: LabVIEW (http://www.ni.com/labview/)
    Computer: Any machine running LabVIEW 8.6 or higher
    Operating system: Windows XP and Windows 7
    RAM: 60-360 Mbyte
    Classification: 3
    Nature of problem: For numerous scientific and engineering applications, it is highly desirable to have an efficient, reliable, and flexible program to perform data streaming of time series sampled with high frequencies and possibly for long time intervals. This type of data acquisition often produces very large amounts of data not easily streamed onto a computer hard disk using standard methods.
    Solution method: This LabVIEW program is developed to directly stream any kind of time series onto a hard disk. Due to optimized timing and usage of computational resources, such as multicores and protocols for memory usage, this program provides extremely reliable data acquisition. In particular, the program is optimized to deal with large amounts of data, e.g., taken with high sampling frequencies and over long time intervals. The program can be easily customized for time series analyses.
    Restrictions: Only tested in Windows-operating LabVIEW environments, must use TDMS format, acquisition cards must be LabVIEW compatible, driver DAQmx installed.
    Running time: As desirable: microseconds to hours

  20. Training loads and injury risk in Australian football-differing acute: chronic workload ratios influence match injury risk.

    PubMed

    Carey, David L; Blanch, Peter; Ong, Kok-Leong; Crossley, Kay M; Crow, Justin; Morris, Meg E

    2017-08-01

    (1) To investigate whether a daily acute:chronic workload ratio informs injury risk in Australian football players; (2) to identify which combination of workload variable, acute and chronic time window best explains injury likelihood. Workload and injury data were collected from 53 athletes over 2 seasons in a professional Australian football club. Acute:chronic workload ratios were calculated daily for each athlete, and modelled against non-contact injury likelihood using a quadratic relationship. 6 workload variables, 8 acute time windows (2-9 days) and 7 chronic time windows (14-35 days) were considered (336 combinations). Each parameter combination was compared for injury likelihood fit (using R 2 ). The ratio of moderate speed running workload (18-24 km/h) in the previous 3 days (acute time window) compared with the previous 21 days (chronic time window) best explained the injury likelihood in matches (R 2 =0.79) and in the immediate 2 or 5 days following matches (R 2 =0.76-0.82). The 3:21 acute:chronic workload ratio discriminated between high-risk and low-risk athletes (relative risk=1.98-2.43). Using the previous 6 days to calculate the acute workload time window yielded similar results. The choice of acute time window significantly influenced model performance and appeared to reflect the competition and training schedule. Daily workload ratios can inform injury risk in Australian football. Clinicians and conditioning coaches should consider the sport-specific schedule of competition and training when choosing acute and chronic time windows. For Australian football, the ratio of moderate speed running in a 3-day or 6-day acute time window and a 21-day chronic time window best explained injury risk. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
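
    A minimal sketch of the daily acute:chronic workload ratio described in this record (and repeated in the next), using pandas rolling means so that windows of different lengths stay comparable. The 3-day acute and 21-day chronic windows follow the combination the study found most informative, while the workload series itself is synthetic.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)
        days = pd.date_range("2016-01-01", periods=120, freq="D")
        # Synthetic daily moderate-speed running distance (m); zeros are rest days.
        workload = pd.Series(rng.gamma(2.0, 400.0, len(days)) * rng.integers(0, 2, len(days)),
                             index=days, name="moderate_speed_running_m")

        acute = workload.rolling(window=3, min_periods=3).mean()      # 3-day acute window
        chronic = workload.rolling(window=21, min_periods=21).mean()  # 21-day chronic window
        acwr = (acute / chronic).rename("acute_chronic_ratio")

        print(acwr.dropna().tail())   # one ratio per day once 21 days of history exist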

  1. Training loads and injury risk in Australian football—differing acute: chronic workload ratios influence match injury risk

    PubMed Central

    Carey, David L; Blanch, Peter; Ong, Kok-Leong; Crossley, Kay M; Crow, Justin; Morris, Meg E

    2017-01-01

    Aims (1) To investigate whether a daily acute:chronic workload ratio informs injury risk in Australian football players; (2) to identify which combination of workload variable, acute and chronic time window best explains injury likelihood. Methods Workload and injury data were collected from 53 athletes over 2 seasons in a professional Australian football club. Acute:chronic workload ratios were calculated daily for each athlete, and modelled against non-contact injury likelihood using a quadratic relationship. 6 workload variables, 8 acute time windows (2–9 days) and 7 chronic time windows (14–35 days) were considered (336 combinations). Each parameter combination was compared for injury likelihood fit (using R2). Results The ratio of moderate speed running workload (18–24 km/h) in the previous 3 days (acute time window) compared with the previous 21 days (chronic time window) best explained the injury likelihood in matches (R2=0.79) and in the immediate 2 or 5 days following matches (R2=0.76–0.82). The 3:21 acute:chronic workload ratio discriminated between high-risk and low-risk athletes (relative risk=1.98–2.43). Using the previous 6 days to calculate the acute workload time window yielded similar results. The choice of acute time window significantly influenced model performance and appeared to reflect the competition and training schedule. Conclusions Daily workload ratios can inform injury risk in Australian football. Clinicians and conditioning coaches should consider the sport-specific schedule of competition and training when choosing acute and chronic time windows. For Australian football, the ratio of moderate speed running in a 3-day or 6-day acute time window and a 21-day chronic time window best explained injury risk. PMID:27789430

  2. Time-marching multi-grid seismic tomography

    NASA Astrophysics Data System (ADS)

    Tong, P.; Yang, D.; Liu, Q.

    2016-12-01

    From the classic ray-based traveltime tomography to the state-of-the-art full waveform inversion, because of the nonlinearity of seismic inverse problems, a good starting model is essential for preventing the convergence of the objective function toward local minima. With a focus on building high-accuracy starting models, we propose the so-called time-marching multi-grid seismic tomography method in this study. The new seismic tomography scheme consists of a temporal time-marching approach and a spatial multi-grid strategy. We first divide the recording period of seismic data into a series of time windows. Sequentially, the subsurface properties in each time window are iteratively updated starting from the final model of the previous time window. There are at least two advantages of the time-marching approach: (1) the information included in the seismic data of previous time windows has been explored to build the starting models of later time windows; (2) seismic data of later time windows could provide extra information to refine the subsurface images. Within each time window, we use a multi-grid method to decompose the scale of the inverse problem. Specifically, the unknowns of the inverse problem are sampled on a coarse mesh to capture the macro-scale structure of the subsurface at the beginning. Because of the low dimensionality, it is much easier to reach the global minimum on a coarse mesh. After that, finer meshes are introduced to recover the micro-scale properties. That is to say, the subsurface model is iteratively updated on multi-grid in every time window. We expect that high-accuracy starting models should be generated for the second and later time windows. We will test this time-marching multi-grid method by using our newly developed eikonal-based traveltime tomography software package tomoQuake. Real application results in the 2016 Kumamoto earthquake (Mw 7.0) region in Japan will be demonstrated.
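
    The nested structure described above (sequential time windows, each refined over progressively finer grids, with every window warm-started from the previous window's final model) can be summarized by the schematic driver below. The invert_window callable and its arguments are hypothetical placeholders for an actual tomographic inversion such as the tomoQuake solver; only the control flow is meant to mirror the description.

        from typing import Callable, List, Sequence

        def time_marching_multigrid(
            data_windows: Sequence[object],      # seismic data split into time windows
            grids: Sequence[object],             # meshes ordered from coarse to fine
            initial_model: object,
            invert_window: Callable[[object, object, object], object],
        ) -> List[object]:
            """Each time window starts from the previous window's final model, and
            within a window the model is refined from coarse to fine grids."""
            models, model = [], initial_model
            for data in data_windows:            # time-marching over recording periods
                for grid in grids:               # multi-grid refinement, coarse -> fine
                    model = invert_window(data, grid, model)
                models.append(model)             # becomes the start of the next window
            return models

        # Toy stand-in for an inversion step, just to make the driver runnable.
        def toy_invert(data, grid, model):
            return f"{model}|{data}@{grid}"

        print(time_marching_multigrid(["T1", "T2"], ["coarse", "fine"], "m0", toy_invert))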

  3. Prediction on sunspot activity based on fuzzy information granulation and support vector machine

    NASA Astrophysics Data System (ADS)

    Peng, Lingling; Yan, Haisheng; Yang, Zhigang

    2018-04-01

To analyze the fluctuation range of sunspot activity, a combined prediction method based on fuzzy information granulation (FIG) and support vector machine (SVM) was put forward. First, FIG is employed to granulate the sample data and extract valid information from each window, namely the minimum value, the general average value and the maximum value of each window. Second, a forecasting model is built with SVM for each of these components, and a cross-validation procedure is used to optimize the model parameters. Finally, the fluctuation range of sunspots is forecast with the optimized SVM models. A case study demonstrates that the method has high accuracy and can effectively predict the fluctuation of sunspots.
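
    A minimal sketch of the two steps described above, with the fuzzy information granulation simplified to the (minimum, average, maximum) triple of each window and one support vector regressor per component; scikit-learn and the grid-search stand-in for the parameter optimization are assumptions, not details from the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

def granulate(series: np.ndarray, window: int) -> np.ndarray:
    """Split a 1-D series into consecutive windows and keep (min, mean, max) per window."""
    n = len(series) // window
    w = series[: n * window].reshape(n, window)
    return np.column_stack([w.min(axis=1), w.mean(axis=1), w.max(axis=1)])

def fit_component_models(granules: np.ndarray, lags: int = 3):
    """Fit one SVR per component (low / mean / up), each predicting the next
    granule from the previous `lags` granules, with a small grid search
    standing in for the parameter optimization mentioned in the abstract."""
    models = []
    for c in range(granules.shape[1]):
        x = np.array([granules[i:i + lags, c] for i in range(len(granules) - lags)])
        y = granules[lags:, c]
        search = GridSearchCV(SVR(), {"C": [1, 10, 100], "gamma": ["scale", 0.1]}, cv=3)
        models.append(search.fit(x, y).best_estimator_)
    return models

# Toy usage on a synthetic "sunspot-like" series
rng = np.random.default_rng(0)
series = 80 + 60 * np.sin(np.linspace(0, 12 * np.pi, 600)) + rng.normal(0, 10, 600)
g = granulate(series, window=12)          # e.g. yearly granules of monthly values
models = fit_component_models(g)
last = g[-3:, :]                          # forecast the next window's range
forecast = [m.predict(last[:, c].reshape(1, -1))[0] for c, m in enumerate(models)]
print("next window (min, mean, max) ~", np.round(forecast, 1))
```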

  4. Optimal strategies for observation of active galactic nuclei variability with Imaging Atmospheric Cherenkov Telescopes

    NASA Astrophysics Data System (ADS)

    Giomi, Matteo; Gerard, Lucie; Maier, Gernot

    2016-07-01

Variable emission is one of the defining characteristics of active galactic nuclei (AGN). While providing precious information on the nature and physics of the sources, variability is often challenging to observe with time- and field-of-view-limited astronomical observatories such as Imaging Atmospheric Cherenkov Telescopes (IACTs). In this work, we address two questions relevant for the observation of sources characterized by AGN-like variability: what is the most time-efficient way to detect such sources, and what is the observational bias that can be introduced by the choice of the observing strategy when conducting blind surveys of the sky. Different observing strategies are evaluated using simulated light curves and realistic instrument response functions of the Cherenkov Telescope Array (CTA), a future gamma-ray observatory. We show that strategies that make use of very small observing windows, spread over large periods of time, allow for a faster detection of the source and are less influenced by the variability properties of the sources, as compared to strategies that concentrate the observing time in a small number of large observing windows. Although derived using CTA as an example, our conclusions are conceptually valid for any IACT facility and, in general, for all observatories with a small field of view and a limited duty cycle.

  5. Photothermal Therapy Generates a Thermal Window of Immunogenic Cell Death in Neuroblastoma.

    PubMed

    Sweeney, Elizabeth E; Cano-Mejia, Juliana; Fernandes, Rohan

    2018-04-17

A thermal "window" of immunogenic cell death (ICD) elicited by nanoparticle-based photothermal therapy (PTT) in an animal model of neuroblastoma is described. In studies using Prussian blue nanoparticles to administer photothermal therapy (PBNP-PTT) to established localized tumors in the neuroblastoma model, it is observed that PBNP-PTT conforms to the "more is better" paradigm, wherein higher doses of PBNP-PTT generate greater cell/local heating and thereby more cell death, and consequently improved animal survival. However, in vitro analysis of the biochemical correlates of ICD (ATP, high-mobility group box 1, and calreticulin) elicited by PBNP-PTT demonstrates that PBNP-PTT triggers a thermal window of ICD. ICD markers are highly expressed within an optimal temperature (thermal dose) window of PBNP-PTT (63.3-66.4 °C) as compared with higher (83.0-83.5 °C) and lower PBNP-PTT (50.7-52.7 °C) temperatures, which both yield lower expression. Subsequent vaccination studies in the neuroblastoma model confirm the in vitro findings, wherein PBNP-PTT administered within the optimal temperature window results in long-term survival (33.3% at 100 d) compared with PBNP-PTT administered within the higher (0%) and lower (20%) temperature ranges, and controls (0%). The findings demonstrate a tunable immune response to heat generated by PBNP-PTT, which should be critically engaged in the administration of PTT for maximizing its therapeutic benefits. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Cochlear implantation through the round window with a straight slotted electrode array: optimizing the surgical procedure.

    PubMed

    Mom, Thierry; Bachy, Aurélie; Houette, Aubry; Pavier, Yoann; Pastourel, Rémy; Gabrillargues, Jean; Saroul, Nicolas; Gilain, Laurent; Avan, Paul

    2016-04-01

The question addressed here is how to optimize the quality of insertion through the round window with the lowest morbidity when using a straight, slotted electrode array of regular length. This retrospective analysis includes all cases implanted with a cochlear implant Digisonic SP (Neurelec-Oticon Medical) since 2004. We checked the operative charts, the depth of insertion, and the follow-up. For comparisons, contingency tables were used and a Chi-square test was performed. A p value <0.05 was considered significant. 126 cases of patients with non-malformed cochleas were implanted through the round window. The mean age was 53.8 ± 16.2 years for adults and 3.6 ± 2.6 years for children (24 cases). The mean follow-up was 33 ± 22 months. The straight electrode array had either a square tip or a soft pointed tip (n = 84). Full insertion was achieved in 79 out of 84 cases with a soft tip vs. 18 out of 42 square tips (χ² = 41.41, DOF = 1, p < 0.0001). Two cases were stuck at the round window niche by a prominent crista fenestrae. In all cases but one, the chorda tympani was preserved. In one case, a misrouting to the vestibule required a revision surgery. Implantation through the round window with a straight and slotted electrode array with a soft tip (Digisonic SP, Neurelec-Oticon Medical) can lead to a full insertion in 94% of cases. Drilling out a prominent crista fenestrae is recommended.

  7. Electrodeposition of ZnO-doped films as window layer for Cd-free CIGS-based solar cells

    NASA Astrophysics Data System (ADS)

    Tsin, Fabien; Vénérosy, Amélie; Hildebrandt, Thibaud; Hariskos, Dimitrios; Naghavi, Negar; Lincot, Daniel; Rousset, Jean

    2016-02-01

The Cu(In,Ga)Se2 (CIGS) thin film solar cell technology has made steady progress within the last decade, reaching efficiencies up to 22.3% on the laboratory scale and thus surpassing the highest efficiency of polycrystalline silicon solar cells. High-efficiency CIGS modules employ a so-called buffer layer of cadmium sulfide (CdS) deposited by Chemical Bath Deposition (CBD), whose presence, and the associated Cd-containing waste, raise some environmental concerns. A second potential bottleneck for CIGS technology is its window layer made of i-ZnO/ZnO:Al, which is deposited by sputtering and requires expensive vacuum equipment. A non-vacuum deposition of the transparent conductive oxide (TCO), relying on simpler equipment with lower investment costs, would be more economically attractive and could increase the competitiveness of CIGS-based modules with the mainstream silicon-based technologies. In the frame of the Novazolar project, we have developed a low-cost, aqueous-solution, photo-assisted electrodeposition process for the ZnO-based window layer of high-efficiency CIGS-based solar cells. The window layer deposition was first optimized on a classical CdS buffer layer, leading to cells with efficiencies similar to those measured with the sputtered references on the same absorber (15%). The optimized ZnO-doped layer was then adapted to cadmium-free devices, where the CdS is replaced by a chemical-bath-deposited zinc oxysulfide Zn(S,O) buffer layer. The effect of different growth parameters has been studied on CBD-Zn(S,O)-plated co-evaporated Cu(In,Ga)Se2 substrates provided by the Zentrum für Sonnenenergie- und Wasserstoff-Forschung (ZSW). This optimization of the electrodeposition of ZnO:Cl on CIGS/Zn(S,O) stacks led to a record efficiency of 14%, while the reference cell with a sputtered (Zn,Mg)O/ZnO:Al window layer has an efficiency of 15.2%.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Steve; Haji-Sheikh, Michael; Huntington, Andrew

The Voxtel VX-798 is a prototype X-ray pixel array detector (PAD) featuring a silicon sensor photodiode array of 48 × 48 pixels, each 130 μm × 130 μm × 520 μm thick, coupled to a CMOS readout application specific integrated circuit (ASIC). The first synchrotron X-ray characterization of this detector is presented, and its ability to selectively count individual X-rays within two independent arrival time windows, a programmable energy range, and localized to a single pixel is demonstrated. During our first trial run at Argonne National Laboratory's Advanced Photon Source, the detector achieved a 60 ns gating time and 700 eV full width at half-maximum energy resolution, in agreement with design parameters. Each pixel of the PAD holds two independent digital counters, and the discriminator for X-ray energy features both an upper and a lower threshold to window the energy of interest, discarding unwanted background. This smart-pixel technology allows energy and time resolution to be set and optimized in software. It is found that the detector linearity follows an isolated dead-time model, implying that megahertz count rates should be possible in each pixel. Measurement of the line and point spread functions showed negligible spatial blurring. When combined with the timing structure of the synchrotron storage ring, it is demonstrated that the area detector can perform both picosecond time-resolved X-ray diffraction and fluorescence spectroscopy measurements.

  9. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, using real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. This methodology was tested on a set of five batches in which disturbances of different nature were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remaining ones were used to test the methodology. A modelling strategy based on a time sliding window provided the best results in terms of distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
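
    The MSPC statistics named above (Hotelling's T² and the squared prediction error) can be computed from a PCA model fitted on in-control spectra as sketched below. The four-minute sliding-window construction and the theoretical control limits used in the paper are not reproduced; the 99th-percentile limit and the synthetic spectra are assumptions made only to keep the example self-contained.

```python
import numpy as np
from sklearn.decomposition import PCA

def fit_mspc_model(X_normal: np.ndarray, n_components: int = 3):
    """Fit a PCA model on in-control (normal-operation) spectra.
    Rows of X_normal are preprocessed NIR spectra collected during good batches."""
    mean = X_normal.mean(axis=0)
    pca = PCA(n_components=n_components).fit(X_normal - mean)
    scores = pca.transform(X_normal - mean)
    score_var = scores.var(axis=0, ddof=1)          # variance of each PC score
    return mean, pca, score_var

def mspc_statistics(x_new: np.ndarray, mean, pca, score_var):
    """Hotelling's T2 and squared prediction error (SPE) for one new spectrum."""
    t = pca.transform((x_new - mean).reshape(1, -1))[0]
    t2 = float(np.sum(t**2 / score_var))
    residual = (x_new - mean) - pca.inverse_transform(t.reshape(1, -1))[0]
    spe = float(np.sum(residual**2))
    return t2, spe

# Toy usage: in-control spectra vs. a disturbed spectrum
rng = np.random.default_rng(1)
X_ok = rng.normal(0.0, 0.01, size=(200, 150)) + np.linspace(0.2, 0.8, 150)
mean, pca, var = fit_mspc_model(X_ok)
limit_t2 = np.percentile([mspc_statistics(x, mean, pca, var)[0] for x in X_ok], 99)
x_fault = X_ok[0] + 0.05 * np.sin(np.linspace(0, 6, 150))   # simulated disturbance
t2, spe = mspc_statistics(x_fault, mean, pca, var)
print(f"T2={t2:.1f} (99% limit {limit_t2:.1f}), SPE={spe:.4f}")
```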

  10. Optimization of window settings for standard and advanced virtual monoenergetic imaging in abdominal dual-energy CT angiography.

    PubMed

    Caruso, Damiano; Parinella, Ashley H; Schoepf, U Joseph; Stroebel, Maxwell H; Mangold, Stefanie; Wichmann, Julian L; Varga-Szemes, Akos; Ball, B Devon; De Santis, Domenico; Laghi, Andrea; De Cecco, Carlo N

    2017-03-01

To determine the optimal window setting for displaying virtual monoenergetic reconstructions of third-generation dual-source, dual-energy CT (DECT) angiography of the abdomen. Forty-five patients were evaluated with DECT angiography (90/150 kV, 180/90 ref. mAs). Three datasets were reconstructed: standard linear blending (M_0.6), 70 keV traditional virtual monoenergetic (M70), and 40 keV advanced noise-optimized virtual monoenergetic (M40+). The best window setting (width and level, W/L) was assessed by two blinded observers and was correlated with aortic attenuation to obtain the Optimized W/L setting (O-W/L). Subjective image quality was assessed, and vessel diameters were measured to determine any possible influence of different W/L settings. Repeated-measures analysis of variance was used to compare W/L values, image quality, and vessel sizing between M_0.6, M70, and M40+. The Best W/L (B-W/L) for M70 and M40+ was 880/280 and 1410/450, respectively. Results from regression analysis indicated an O-W/L of 850/270 for M70 and 1350/430 for M40+. Significant differences for W and L were found between the Best and the Optimized W/L for M40+, and between M70 and M40+ for both the Best and Optimized W/L. No significant differences for vessel measurements were found using the O-W/L for M40+ compared to the standard M_0.6 (p ≥ 0.16), and significant differences were observed when using the B-W/L with M40+ compared to M_0.6 (p ≤ 0.04). In order to optimize virtual monoenergetic imaging with both traditional M70 and advanced M40+, adjusting the W/L settings is necessary. Our results suggest a W/L setting of 850/270 for M70 and 1350/430 for M40+.

  11. Management of heat in laser tissue welding using NIR cover window material.

    PubMed

    Sriramoju, Vidyasagar; Savage, Howard; Katz, Alvin; Muthukattil, Ronex; Alfano, Robert R

    2011-12-01

Laser tissue welding (LTW) is a novel method of surgical wound closure that uses laser radiation to induce fusion of biological tissues. The molecular dynamics associated with LTW result from both thermal and non-thermal mechanisms. This research focuses exclusively on better heat management to reduce thermal damage of tissues in LTW using near-infrared laser radiation. Infrared continuous-wave (CW) laser radiation at a wavelength of 1,450 nm, corresponding to the absorption band of the combination vibrational modes of water, is used to weld ex vivo porcine aorta. In these studies we determined the optimal laser power and scan speed for better tensile strength of the weld and less tissue dehydration. A significant amount of water loss from the welded tissue results in cellular death and tissue buckling. Various thermally conductive optical cover windows were used as heat sinks to dissipate heat and reduce thermal effects during LTW. Optimal use of this method prevents tissue buckling and minimizes water loss. Diamond, sapphire, BK7, fused silica, and IR quartz transparent optical cover windows were tested. The data from this study suggest that IR quartz, as the material with the most suitable thermal conductivity, is ideal for laser welding of the porcine aorta. Copyright © 2011 Wiley Periodicals, Inc.

  12. Climate optimized planting windows for cotton in the lower Mississippi Delta region

    USDA-ARS?s Scientific Manuscript database

    Unique, variable summer climate of the lower Mississippi Delta region poses a critical challenge to cotton producers in deciding when to plant for optimized production. Traditional 2- to 4-year agronomic field trials conducted in this area fail to capture the effects of long-term climate variabiliti...

  13. Effect of partial covering of the visitor viewing area window on positioning and orientation of zoo orangutans: A preference test.

    PubMed

    Bloomfield, Rachel C; Gillespie, Graeme R; Kerswell, Keven J; Butler, Kym L; Hemsworth, Paul H

    2015-01-01

    The window of the visitor viewing area adjacent to an animal platform in an orangutan enclosure was altered to produce three viewing treatments in a randomized controlled experiment. These treatments were window uncovered, left side of the window covered or right side of the window covered. Observations were conducted on the orangutans present on the platform, and on their location (left or right side), and orientation (towards or away from the window) while on the platform. The partial covering of the window had little effect on the proportion of time orangutans spent on the viewing platform, or on the direction they faced when on the platform. When the orangutans were facing towards the window, and the right side was uncovered, irrespective of whether the left side was covered, they spent about three quarters of the time on the right side, suggesting a preference for the right side of the platform. However, when the right side was covered and the left side uncovered, the animals facing towards the window spent only about a quarter of the time on the right side, that is, they spent more time on the uncovered side. The results suggest that the orangutans have a preference to position themselves to face the window of the visitor viewing area. © 2015 Wiley Periodicals, Inc.

  14. Optimal whole-body PET scanner configurations for different volumes of LSO scintillator: a simulation study.

    PubMed

    Poon, Jonathan K; Dahlbom, Magnus L; Moses, William W; Balakrishnan, Karthik; Wang, Wenli; Cherry, Simon R; Badawi, Ramsey D

    2012-07-07

The axial field of view (AFOV) of the current generation of clinical whole-body PET scanners ranges from 15 to 22 cm, which limits sensitivity and renders applications such as whole-body dynamic imaging or imaging of very low activities in whole-body cellular tracking studies almost impossible. Generally, extending the AFOV significantly increases the sensitivity and count-rate performance. However, extending the AFOV while maintaining detector thickness has significant cost implications. In addition, random coincidences, detector dead time, and object attenuation may reduce scanner performance as the AFOV increases. In this paper, we use Monte Carlo simulations to find the optimal scanner geometry (i.e. AFOV, detector thickness and acceptance angle) based on count-rate performance for scintillator volumes ranging from 10 to 93 l with detector thickness varying from 5 to 20 mm. We compare the results to the performance of a scanner based on the current Siemens Biograph mCT geometry and electronics. Our simulation models were developed based on individual components of the Siemens Biograph mCT and were validated against experimental data using the NEMA NU-2 2007 count-rate protocol. In the study, noise-equivalent count rate (NECR) was computed as a function of maximum ring difference (i.e. acceptance angle) and activity concentration using a 27 cm diameter, 200 cm uniformly filled cylindrical phantom for each scanner configuration. To reduce the effect of random coincidences, we implemented a variable coincidence time window based on the length of the lines of response, which increased NECR performance up to 10% compared to using a static coincidence time window for scanners with large maximum ring difference values. For a given scintillator volume, the optimal configuration results in modest count-rate performance gains of up to 16% compared to the shortest AFOV scanner with the thickest detectors. However, the longest AFOV of approximately 2 m with 20 mm thick detectors resulted in performance gains of 25-31 times higher NECR relative to the current Siemens Biograph mCT scanner configuration.
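
    The variable coincidence time window based on the length of the line of response (LOR) can be illustrated in a few lines: a true coincidence on a given LOR cannot have a time-of-flight difference larger than the LOR length divided by the speed of light, so short LORs can use a tighter window than the worst-case static one. The fixed timing margin below is an illustrative assumption; the exact parameterization used in the paper is not given in the abstract.

```python
import math

C_MM_PER_NS = 299.792458  # speed of light in mm/ns

def coincidence_window_ns(det_a, det_b, timing_margin_ns=1.0):
    """Per-LOR coincidence window: photon time-of-flight difference across the
    line of response plus a fixed margin for detector timing resolution.
    det_a, det_b are (x, y, z) detector positions in millimetres."""
    lor_length = math.dist(det_a, det_b)
    return lor_length / C_MM_PER_NS + timing_margin_ns

# A short, nearly transaxial LOR vs. a long oblique LOR in a ~2 m AFOV scanner
short_lor = coincidence_window_ns((400, 0, 0), (-400, 0, 50))
long_lor = coincidence_window_ns((400, 0, -1000), (-400, 0, 1000))
print(f"short LOR window: {short_lor:.2f} ns, long LOR window: {long_lor:.2f} ns")
```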

  15. Optimal whole-body PET scanner configurations for different volumes of LSO scintillator: a simulation study

    PubMed Central

    Poon, Jonathan K; Dahlbom, Magnus L; Moses, William W; Balakrishnan, Karthik; Wang, Wenli; Cherry, Simon R; Badawi, Ramsey D

    2013-01-01

    The axial field of view (AFOV) of the current generation of clinical whole-body PET scanners range from 15–22 cm, which limits sensitivity and renders applications such as whole-body dynamic imaging, or imaging of very low activities in whole-body cellular tracking studies, almost impossible. Generally, extending the AFOV significantly increases the sensitivity and count-rate performance. However, extending the AFOV while maintaining detector thickness has significant cost implications. In addition, random coincidences, detector dead time, and object attenuation may reduce scanner performance as the AFOV increases. In this paper, we use Monte Carlo simulations to find the optimal scanner geometry (i.e. AFOV, detector thickness and acceptance angle) based on count-rate performance for a range of scintillator volumes ranging from 10 to 90 l with detector thickness varying from 5 to 20 mm. We compare the results to the performance of a scanner based on the current Siemens Biograph mCT geometry and electronics. Our simulation models were developed based on individual components of the Siemens Biograph mCT and were validated against experimental data using the NEMA NU-2 2007 count-rate protocol. In the study, noise-equivalent count rate (NECR) was computed as a function of maximum ring difference (i.e. acceptance angle) and activity concentration using a 27 cm diameter, 200 cm uniformly filled cylindrical phantom for each scanner configuration. To reduce the effect of random coincidences, we implemented a variable coincidence time window based on the length of the lines of response, which increased NECR performance up to 10% compared to using a static coincidence time window for scanners with large maximum ring difference values. For a given scintillator volume, the optimal configuration results in modest count-rate performance gains of up to 16% compared to the shortest AFOV scanner with the thickest detectors. However, the longest AFOV of approximately 2 m with 20 mm thick detectors resulted in performance gains of 25–31 times higher NECR relative to the current Siemens Biograph mCT scanner configuration. PMID:22678106

  16. Optimal whole-body PET scanner configurations for different volumes of LSO scintillator: a simulation study

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Moses, William W.; Balakrishnan, Karthik; Wang, Wenli; Cherry, Simon R.; Badawi, Ramsey D.

    2012-07-01

    The axial field of view (AFOV) of the current generation of clinical whole-body PET scanners range from 15-22 cm, which limits sensitivity and renders applications such as whole-body dynamic imaging or imaging of very low activities in whole-body cellular tracking studies, almost impossible. Generally, extending the AFOV significantly increases the sensitivity and count-rate performance. However, extending the AFOV while maintaining detector thickness has significant cost implications. In addition, random coincidences, detector dead time, and object attenuation may reduce scanner performance as the AFOV increases. In this paper, we use Monte Carlo simulations to find the optimal scanner geometry (i.e. AFOV, detector thickness and acceptance angle) based on count-rate performance for a range of scintillator volumes ranging from 10 to 93 l with detector thickness varying from 5 to 20 mm. We compare the results to the performance of a scanner based on the current Siemens Biograph mCT geometry and electronics. Our simulation models were developed based on individual components of the Siemens Biograph mCT and were validated against experimental data using the NEMA NU-2 2007 count-rate protocol. In the study, noise-equivalent count rate (NECR) was computed as a function of maximum ring difference (i.e. acceptance angle) and activity concentration using a 27 cm diameter, 200 cm uniformly filled cylindrical phantom for each scanner configuration. To reduce the effect of random coincidences, we implemented a variable coincidence time window based on the length of the lines of response, which increased NECR performance up to 10% compared to using a static coincidence time window for scanners with a large maximum ring difference values. For a given scintillator volume, the optimal configuration results in modest count-rate performance gains of up to 16% compared to the shortest AFOV scanner with the thickest detectors. However, the longest AFOV of approximately 2 m with 20 mm thick detectors resulted in performance gains of 25-31 times higher NECR relative to the current Siemens Biograph mCT scanner configuration.

  17. Efficient full-chip SRAF placement using machine learning for best accuracy and improved consistency

    NASA Astrophysics Data System (ADS)

    Wang, Shibing; Baron, Stanislas; Kachwala, Nishrin; Kallingal, Chidam; Sun, Dezheng; Shu, Vincent; Fong, Weichun; Li, Zero; Elsaid, Ahmad; Gao, Jin-Wei; Su, Jing; Ser, Jung-Hoon; Zhang, Quan; Chen, Been-Der; Howell, Rafael; Hsu, Stephen; Luo, Larry; Zou, Yi; Zhang, Gary; Lu, Yen-Wen; Cao, Yu

    2018-03-01

Various computational approaches from rule-based to model-based methods exist to place Sub-Resolution Assist Features (SRAF) in order to increase process window for lithography. Each method has its advantages and drawbacks, and typically requires the user to make a trade-off between time of development, accuracy, consistency and cycle time. Rule-based methods, used since the 90 nm node, require long development time and struggle to achieve good process window performance for complex patterns. Heuristically driven, their development is often iterative and involves significant engineering time from multiple disciplines (Litho, OPC and DTCO). Model-based approaches have been widely adopted since the 20 nm node. While the development of model-driven placement methods is relatively straightforward, they often become computationally expensive when high accuracy is required. Furthermore, these methods tend to yield less consistent SRAFs due to the nature of the approach: they rely on a model which is sensitive to the pattern placement on the native simulation grid, and can be impacted by related grid-dependency effects. Those undesirable effects tend to become stronger when more iterations or complexity are needed in the algorithm to achieve the required accuracy. ASML Brion has developed a new SRAF placement technique on the Tachyon platform that is assisted by machine learning and significantly improves the accuracy of full chip SRAF placement while keeping consistency and runtime under control. A Deep Convolutional Neural Network (DCNN) is trained using the target wafer layout and corresponding Continuous Transmission Mask (CTM) images. These CTM images have been fully optimized using the Tachyon inverse mask optimization engine. The neural-network-generated SRAF guidance map is then used to place SRAFs on full chip. This is different from our existing full-chip MB-SRAF approach, which utilizes a SRAF guidance map (SGM) of mask sensitivity to improve the contrast of the optical image at the target pattern edges. In this paper, we demonstrate that machine learning assisted SRAF placement can achieve a superior process window compared to the SGM model-based SRAF method, while keeping the full-chip runtime affordable and maintaining the consistency of SRAF placement. We describe the current status of this machine-learning-assisted SRAF technique, demonstrate its application to full-chip mask synthesis, and discuss how it can extend the computational lithography roadmap.

  18. Window for Optimal Frequency Operation and Reliability of 3DEG and 2DEG Channels for Oxide Microwave MESFETs and HFETs

    DTIC Science & Technology

    2016-04-01

Only fragments of this report survive extraction: it treats noise and energy relaxation in doped zinc-oxide and structured ZnO transistor materials with a 2-D electron gas (2DEG) channel subjected to a strong field, studied by Monte Carlo simulation including hot-phonon effects at electron gas densities of 1×10¹⁷ to 1×10¹⁸ cm⁻³ (recovered caption: Figure 18, Monte Carlo simulation of density-dependent hot-electron energy relaxation).

  19. Enhanced understanding of non-axisymmetric intrinsic and controlled field impacts in tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    In, Y.; Park, J. -K.; Jeon, Y. M.

Here, an extensive study of intrinsic and controlled non-axisymmetric field (δB) impacts in KSTAR has enhanced the understanding about non-axisymmetric field physics and its implications, in particular, on resonant magnetic perturbation (RMP) physics and power threshold (Pth) for L–H transition. The n = 1 intrinsic non-axisymmetric field in KSTAR was measured to remain as low as δB/B0 ~ 4 × 10⁻⁵ even at high-beta plasmas (βN ~ 2), which corresponds to approximately 20% below the targeted ITER tolerance level. As for the RMP edge-localized-modes (ELM) control, robust n = 1 RMP ELM-crash-suppression has been not only sustained for more than ~90 τE, but also confirmed to be compatible with rotating RMP. An optimal window of radial position of lower X-point (i.e. Rx = 1.44 ± 0.02 m) proved to be quite critical to reach full n = 1 RMP-driven ELM-crash-suppression, while a constraint of the safety factor could be relaxed (q95 = 5 ± 0.25). A more encouraging finding was that even when Rx cannot be positioned in the optimal window, another systematic scan in the vicinity of the previously optimal Rx allows for a new optimal window with relatively small variations of plasma parameters. Also, we have addressed the importance of optimal phasing (i.e. toroidal phase difference between adjacent rows) for n = 1 RMP-driven ELM control, consistent with an ideal plasma response modeling which could predict phasing-dependent ELM suppression windows. In support of ITER RMP study, intentionally misaligned RMPs have been found to be quite effective during ELM-mitigation stage in lowering the peaks of divertor heat flux, as well as in broadening the 'wet' areas. Besides, a systematic survey of Pth dependence on non-axisymmetric field has revealed the potential limit of the merit of low intrinsic non-axisymmetry. Considering that the ITER RMP coils are composed of 3 rows, just like in KSTAR, further 3D physics study in KSTAR is expected to help us minimize the uncertainties of the ITER RMP coils, as well as establish an optimal 3D configuration for ITER and future reactors.

  20. Enhanced understanding of non-axisymmetric intrinsic and controlled field impacts in tokamaks

    NASA Astrophysics Data System (ADS)

    In, Y.; Park, J.-K.; Jeon, Y. M.; Kim, J.; Park, G. Y.; Ahn, J.-W.; Loarte, A.; Ko, W. H.; Lee, H. H.; Yoo, J. W.; Juhn, J. W.; Yoon, S. W.; Park, H.; Physics Task Force in KSTAR, 3D

    2017-11-01

An extensive study of intrinsic and controlled non-axisymmetric field (δB) impacts in KSTAR has enhanced the understanding about non-axisymmetric field physics and its implications, in particular, on resonant magnetic perturbation (RMP) physics and power threshold (Pth) for L-H transition. The n = 1 intrinsic non-axisymmetric field in KSTAR was measured to remain as low as δB/B0 ~ 4 × 10⁻⁵ even at high-beta plasmas (βN ~ 2), which corresponds to approximately 20% below the targeted ITER tolerance level. As for the RMP edge-localized-modes (ELM) control, robust n = 1 RMP ELM-crash-suppression has been not only sustained for more than ~90 τE, but also confirmed to be compatible with rotating RMP. An optimal window of radial position of lower X-point (i.e. Rx = 1.44 ± 0.02 m) proved to be quite critical to reach full n = 1 RMP-driven ELM-crash-suppression, while a constraint of the safety factor could be relaxed (q95 = 5 ± 0.25). A more encouraging finding was that even when Rx cannot be positioned in the optimal window, another systematic scan in the vicinity of the previously optimal Rx allows for a new optimal window with relatively small variations of plasma parameters. Also, we have addressed the importance of optimal phasing (i.e. toroidal phase difference between adjacent rows) for n = 1 RMP-driven ELM control, consistent with an ideal plasma response modeling which could predict phasing-dependent ELM suppression windows. In support of ITER RMP study, intentionally misaligned RMPs have been found to be quite effective during ELM-mitigation stage in lowering the peaks of divertor heat flux, as well as in broadening the 'wet' areas. Besides, a systematic survey of Pth dependence on non-axisymmetric field has revealed the potential limit of the merit of low intrinsic non-axisymmetry. Considering that the ITER RMP coils are composed of 3 rows, just like in KSTAR, further 3D physics study in KSTAR is expected to help us minimize the uncertainties of the ITER RMP coils, as well as establish an optimal 3D configuration for ITER and future reactors.

  1. Enhanced understanding of non-axisymmetric intrinsic and controlled field impacts in tokamaks

    DOE PAGES

    In, Y.; Park, J. -K.; Jeon, Y. M.; ...

    2017-08-24

Here, an extensive study of intrinsic and controlled non-axisymmetric field (δB) impacts in KSTAR has enhanced the understanding about non-axisymmetric field physics and its implications, in particular, on resonant magnetic perturbation (RMP) physics and power threshold (Pth) for L–H transition. The n = 1 intrinsic non-axisymmetric field in KSTAR was measured to remain as low as δB/B0 ~ 4 × 10⁻⁵ even at high-beta plasmas (βN ~ 2), which corresponds to approximately 20% below the targeted ITER tolerance level. As for the RMP edge-localized-modes (ELM) control, robust n = 1 RMP ELM-crash-suppression has been not only sustained for more than ~90 τE, but also confirmed to be compatible with rotating RMP. An optimal window of radial position of lower X-point (i.e. Rx = 1.44 ± 0.02 m) proved to be quite critical to reach full n = 1 RMP-driven ELM-crash-suppression, while a constraint of the safety factor could be relaxed (q95 = 5 ± 0.25). A more encouraging finding was that even when Rx cannot be positioned in the optimal window, another systematic scan in the vicinity of the previously optimal Rx allows for a new optimal window with relatively small variations of plasma parameters. Also, we have addressed the importance of optimal phasing (i.e. toroidal phase difference between adjacent rows) for n = 1 RMP-driven ELM control, consistent with an ideal plasma response modeling which could predict phasing-dependent ELM suppression windows. In support of ITER RMP study, intentionally misaligned RMPs have been found to be quite effective during ELM-mitigation stage in lowering the peaks of divertor heat flux, as well as in broadening the 'wet' areas. Besides, a systematic survey of Pth dependence on non-axisymmetric field has revealed the potential limit of the merit of low intrinsic non-axisymmetry. Considering that the ITER RMP coils are composed of 3 rows, just like in KSTAR, further 3D physics study in KSTAR is expected to help us minimize the uncertainties of the ITER RMP coils, as well as establish an optimal 3D configuration for ITER and future reactors.

  2. Optimized ECR plasma apparatus with varied microwave window thickness

    DOEpatents

    Berry, Lee A.

    1995-01-01

    The present invention describes a technique to control the radial profile of microwave power in an ECR plasma discharge. In order to provide for a uniform plasma density to a specimen, uniform energy absorption by the plasma is desired. By controlling the radial profile of the microwave power transmitted through the microwave window of a reactor, the profile of the transmitted energy to the plasma can be controlled in order to have uniform energy absorption by the plasma. An advantage of controlling the profile using the window transmission characteristics is that variations to the radial profile of microwave power can be made without changing the microwave coupler or reactor design.

  3. Liver enhancement in healthy dogs after gadoxetic acid administration during dynamic contrast-enhanced magnetic resonance imaging.

    PubMed

    Borusewicz, P; Stańczyk, E; Kubiak, K; Spużak, J; Glińska-Suchocka, K; Jankowski, M; Nicpoń, J; Podgórski, P

    2018-05-01

Dynamic contrast enhanced (DCE)-magnetic resonance imaging (MRI) consists of acquisition of native baseline images, followed by a series of acquisitions performed during and after administration of a contrast medium. DCE-MRI, in conjunction with hepatobiliary-specific contrast media, such as gadoxetic acid (GD-EOB-DTPA), allows for precise characterisation of the enhancement pattern of the hepatic parenchyma following administration of the contrast agent. The aim of the study was to assess the temporal pattern of contrast enhancement of the hepatic parenchyma following administration of GD-EOB-DTPA and to determine the optimal time window for post-contrast assessment of the liver. The study was carried out on eight healthy beagle dogs. MRI was performed using a 1.5 T scanner. The imaging protocol included T1 weighted (T1-W) gradient echo (GRE), T2 weighted (T2-W) turbo spin echo (TSE) and dynamic T1-W GRE sequences. The dynamic T1-W sequence was performed using single 10 mm thick slices. Regions of interest (ROIs) were chosen and the signal intensity curves were calculated for quantitative image analysis. The mean time to peak for all dogs was 26 min. The plateau phase lasted on average 21 min. A gradual decrease in the signal intensity of the hepatic parenchyma was observed in all dogs. A DCE-MRI enhancement pattern of the hepatic parenchyma was evident in dogs following the administration of GD-EOB-DTPA, establishing baseline data for an optimal time window between 26 and 41 min after administration of the contrast agent. Copyright © 2018 Elsevier Ltd. All rights reserved.
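
    A sketch of how an optimal post-contrast assessment window might be read off a region-of-interest signal-intensity curve: locate the time to peak and the interval during which the signal stays near its peak. The 95% plateau threshold and the synthetic enhancement curve are assumptions for illustration only, not the criteria used in the study.

```python
import numpy as np

def enhancement_window(times_min: np.ndarray, signal: np.ndarray, plateau_fraction=0.95):
    """Time to peak and the interval during which the hepatic signal stays
    within `plateau_fraction` of its peak, taken here as the usable
    post-contrast assessment window."""
    peak_idx = int(np.argmax(signal))
    threshold = plateau_fraction * signal[peak_idx]
    in_plateau = np.flatnonzero(signal >= threshold)
    return times_min[peak_idx], times_min[in_plateau[0]], times_min[in_plateau[-1]]

# Toy enhancement curve: rapid uptake, long plateau, slow washout
t = np.arange(0, 60, 1.0)                                 # minutes after injection
si = 100 + 80 * (1 - np.exp(-t / 8)) - 0.6 * np.maximum(t - 30, 0)
t_peak, t_start, t_end = enhancement_window(t, si)
print(f"time to peak ~ {t_peak:.0f} min, assessment window ~ {t_start:.0f}-{t_end:.0f} min")
```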

  4. Measuring oxygen tension modulation, induced by a new pre-radiotherapy therapeutic, in a mammary window chamber mouse model

    NASA Astrophysics Data System (ADS)

    Schafer, Rachel; Gmitro, Arthur F.

    2015-03-01

    Tumor regions under hypoxic or low oxygen conditions respond less effectively to many treatment strategies, including radiation therapy. A novel investigational therapeutic, NVX-108 (NuvOx Pharma), has been developed to increase delivery of oxygen through the use of a nano-emulsion of dodecofluoropentane. By raising pO2 levels prior to delivering radiation, treatment efficacy may be improved. To aid in evaluating the novel drug, oxygen tension was quantitatively measured, spatially and temporally, to record the effect of administrating NVX-108 in an orthotopic mammary window chamber mouse model of breast cancer. The oxygen tension was measured through the use of an oxygen-sensitive coating, comprised of phosphorescent platinum porphyrin dye embedded in a polystyrene matrix. The coating, applied to the surface of the coverslip of the window chamber through spin coating, is placed in contact with the mammary fat pad to record the oxygenation status of the surface tissue layer. Prior to implantation of the window chamber, a tumor is grown in the SCID mouse model by injection of MCF-7 cells into the mammary fat pad. Two-dimensional spatial distributions of the pO2 levels were obtained through conversion of measured maps of phosphorescent lifetime. The resulting information on the spatial and temporal variation of the induced oxygen modulation could provide valuable insight into the optimal timing between administration of NVX-108 and radiation treatment to provide the most effective treatment outcome.

  5. Nutrient timing revisited: is there a post-exercise anabolic window?

    PubMed Central

    2013-01-01

    Nutrient timing is a popular nutritional strategy that involves the consumption of combinations of nutrients--primarily protein and carbohydrate--in and around an exercise session. Some have claimed that this approach can produce dramatic improvements in body composition. It has even been postulated that the timing of nutritional consumption may be more important than the absolute daily intake of nutrients. The post-exercise period is widely considered the most critical part of nutrient timing. Theoretically, consuming the proper ratio of nutrients during this time not only initiates the rebuilding of damaged muscle tissue and restoration of energy reserves, but it does so in a supercompensated fashion that enhances both body composition and exercise performance. Several researchers have made reference to an anabolic “window of opportunity” whereby a limited time exists after training to optimize training-related muscular adaptations. However, the importance - and even the existence - of a post-exercise ‘window’ can vary according to a number of factors. Not only is nutrient timing research open to question in terms of applicability, but recent evidence has directly challenged the classical view of the relevance of post-exercise nutritional intake with respect to anabolism. Therefore, the purpose of this paper will be twofold: 1) to review the existing literature on the effects of nutrient timing with respect to post-exercise muscular adaptations, and; 2) to draw relevant conclusions that allow practical, evidence-based nutritional recommendations to be made for maximizing the anabolic response to exercise. PMID:23360586

  6. Cross-modal integration of polyphonic characters in Chinese audio-visual sentences: a MVPA study based on functional connectivity.

    PubMed

    Zhang, Zhengyi; Zhang, Gaoyan; Zhang, Yuanyuan; Liu, Hong; Xu, Junhai; Liu, Baolin

    2017-12-01

This study aimed to investigate the functional connectivity in the brain during the cross-modal integration of polyphonic characters in Chinese audio-visual sentences. The visual sentences were all semantically reasonable, and the audible pronunciations of the polyphonic characters in the corresponding sentence contexts varied across four conditions. To measure the functional connectivity, correlation, coherence and phase synchronization index (PSI) were used, and then multivariate pattern analysis was performed to detect the consensus functional connectivity patterns. These analyses were confined to the time windows of three event-related potential components, P200, N400 and the late positive shift (LPS), to investigate the dynamic changes of the connectivity patterns at different cognitive stages. We found that when differentiating the polyphonic characters with abnormal pronunciations from those with the appropriate ones in audio-visual sentences, significant classification results were obtained based on the coherence in the time window of the P200 component, the correlation in the time window of the N400 component, and the coherence and PSI in the time window of the LPS component. Moreover, the spatial distributions in these time windows were also different, with the recruitment of frontal sites in the time window of the P200 component, the frontal-central-parietal regions in the time window of the N400 component and the central-parietal sites in the time window of the LPS component. These findings demonstrate that the functional interaction mechanisms are different at different stages of audio-visual integration of polyphonic characters.

  7. Real-Time Imaging with Frequency Scanning Array Antenna for Industrial Inspection Applications at W band

    NASA Astrophysics Data System (ADS)

    Larumbe, Belen; Laviada, Jaime; Ibáñez-Loinaz, Asier; Teniente, Jorge

    2018-01-01

A real-time imaging system based on a frequency scanning antenna for conveyor belt setups is presented in this paper. The frequency scanning antenna, together with an inexpensive parabolic reflector, operates in the W band, enabling the detection of details with dimensions on the order of 2 mm. In addition, a low level of sidelobes is achieved by optimizing unequal power dividers that window the power distribution. Furthermore, the quality of the images is enhanced by the radiation pattern properties. The performance of the system is validated by showing simulation as well as experimental results obtained in real time, proving the feasibility of these kinds of frequency scanning antennas for cost-effective imaging applications.

  8. Non-stationary dynamics in the bouncing ball: A wavelet perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behera, Abhinna K., E-mail: abhinna@iiserkol.ac.in; Panigrahi, Prasanta K., E-mail: pprasanta@iiserkol.ac.in; Sekar Iyengar, A. N., E-mail: ansekar.iyengar@saha.ac.in

    2014-12-01

The non-stationary dynamics of a bouncing ball, comprising both periodic as well as chaotic behavior, is studied through wavelet transform. The multi-scale characterization of the time series displays clear signatures of self-similarity, complex scaling behavior, and periodicity. Self-similar behavior is quantified by the generalized Hurst exponent, obtained through both wavelet based multi-fractal detrended fluctuation analysis and Fourier methods. The scale dependent variable window size of the wavelets aptly captures both the transients and non-stationary periodic behavior, including the phase synchronization of different modes. The optimal time-frequency localization of the continuous Morlet wavelet is found to delineate the scales corresponding to neutral turbulence, viscous dissipation regions, and different time varying periodic modulations.

  9. Iterated local search algorithm for solving the orienteering problem with soft time windows.

    PubMed

    Aghezzaf, Brahim; Fahim, Hassan El

    2016-01-01

In this paper we study the orienteering problem with time windows (OPTW) and the impact of relaxing the time windows on the profit collected by the vehicle. The relaxation adopted in the orienteering problem with soft time windows (OPSTW) studied in this research is a late-service relaxation that allows late services to customers, penalized linearly with the lateness. We solve this problem heuristically using a hybrid iterated local search. The results of the computational study show that the proposed approach achieves promising solutions on the OPTW test instances available in the literature; one new best solution is found. On the newly generated OPSTW test instances, the results show that the profit collected under the OPSTW is higher than that collected under the OPTW.
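
    The soft-time-window idea can be made concrete by the route-evaluation step such a heuristic needs: travel along the visit order, wait for windows that have not yet opened, and collect each customer's profit reduced linearly by any lateness. The penalty coefficient, the profit floor at zero and the toy instance below are illustrative assumptions; the hybrid iterated local search itself is not shown.

```python
import math

def evaluate_route(route, nodes, speed=1.0, penalty_per_unit_late=0.5, t_max=100.0):
    """Score of an OPSTW route.

    route : list of node ids, starting and ending at the depot (id 0)
    nodes : {id: (x, y, open, close, service_time, profit)}
    Early arrivals wait until the window opens; late arrivals are still
    served but their profit is reduced linearly with the lateness.
    Returns (collected_profit, finish_time); infeasible if finish_time > t_max.
    """
    t, total = 0.0, 0.0
    for prev, cur in zip(route, route[1:]):
        (x0, y0, *_), (x1, y1, open_t, close_t, service, profit) = nodes[prev], nodes[cur]
        t += math.hypot(x1 - x0, y1 - y0) / speed       # travel
        t = max(t, open_t)                              # wait for the window to open
        lateness = max(0.0, t - close_t)                # soft time window: allowed but penalized
        total += max(0.0, profit - penalty_per_unit_late * lateness)
        t += service                                    # serve the customer
    return total, t

# Tiny instance: depot 0 and three customers
nodes = {
    0: (0, 0, 0, 1e9, 0, 0),
    1: (5, 0, 0, 10, 2, 10),
    2: (5, 5, 8, 15, 2, 8),
    3: (0, 5, 5, 12, 2, 6),
}
print(evaluate_route([0, 1, 2, 3, 0], nodes))
```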

  10. A modular approach to intensity-modulated arc therapy optimization with noncoplanar trajectories

    NASA Astrophysics Data System (ADS)

    Papp, Dávid; Bortfeld, Thomas; Unkelbach, Jan

    2015-07-01

    Utilizing noncoplanar beam angles in volumetric modulated arc therapy (VMAT) has the potential to combine the benefits of arc therapy, such as short treatment times, with the benefits of noncoplanar intensity modulated radiotherapy (IMRT) plans, such as improved organ sparing. Recently, vendors introduced treatment machines that allow for simultaneous couch and gantry motion during beam delivery to make noncoplanar VMAT treatments possible. Our aim is to provide a reliable optimization method for noncoplanar isocentric arc therapy plan optimization. The proposed solution is modular in the sense that it can incorporate different existing beam angle selection and coplanar arc therapy optimization methods. Treatment planning is performed in three steps. First, a number of promising noncoplanar beam directions are selected using an iterative beam selection heuristic; these beams serve as anchor points of the arc therapy trajectory. In the second step, continuous gantry/couch angle trajectories are optimized using a simple combinatorial optimization model to define a beam trajectory that efficiently visits each of the anchor points. Treatment time is controlled by limiting the time the beam needs to trace the prescribed trajectory. In the third and final step, an optimal arc therapy plan is found along the prescribed beam trajectory. In principle any existing arc therapy optimization method could be incorporated into this step; for this work we use a sliding window VMAT algorithm. The approach is demonstrated using two particularly challenging cases. The first one is a lung SBRT patient whose planning goals could not be satisfied with fewer than nine noncoplanar IMRT fields when the patient was treated in the clinic. The second one is a brain tumor patient, where the target volume overlaps with the optic nerves and the chiasm and it is directly adjacent to the brainstem. Both cases illustrate that the large number of angles utilized by isocentric noncoplanar VMAT plans can help improve dose conformity, homogeneity, and organ sparing simultaneously using the same beam trajectory length and delivery time as a coplanar VMAT plan.
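
    The second step — finding a trajectory that efficiently visits the selected anchor directions — can be illustrated with a simple greedy nearest-neighbour ordering in (gantry, couch) angle space, assuming both axes move simultaneously. This is only a stand-in for the combinatorial optimization model described above, which additionally limits the time the beam needs to trace the trajectory.

```python
def angle_dist(a, b):
    """Travel 'cost' between two (gantry, couch) anchor points: assuming both
    axes move simultaneously, the cost is the larger angular excursion."""
    dg = abs(a[0] - b[0]) % 360
    dc = abs(a[1] - b[1]) % 360
    return max(min(dg, 360 - dg), min(dc, 360 - dc))

def greedy_trajectory(anchors, start=(180.0, 0.0)):
    """Order the anchor beam directions by repeatedly moving to the nearest
    unvisited anchor -- a simple stand-in for the combinatorial trajectory
    optimization used in the paper."""
    remaining, order, current = list(anchors), [], start
    while remaining:
        nxt = min(remaining, key=lambda a: angle_dist(current, a))
        order.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return order

anchors = [(200, 10), (120, -30), (330, 20), (60, 0), (250, -15)]
path = greedy_trajectory(anchors)
print(path, "total motion:", sum(angle_dist(a, b) for a, b in zip([(180.0, 0.0)] + path, path)))
```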

  11. A note on windowing for the waveform relaxation

    NASA Technical Reports Server (NTRS)

    Zhang, Hong

    1994-01-01

The technique of windowing has often been used in the implementation of waveform relaxation for solving ODEs or time-dependent PDEs. Its efficiency depends upon problem stiffness and operator splitting. Using model problems, estimates for window length and convergence rate are derived. The effectiveness of windowing is then investigated for non-stiff and stiff cases, respectively. It is concluded that for the former, windowing is highly recommended when a large discrepancy exists between the convergence rate on a time interval and the ones on its subintervals. For the latter, windowing does not provide any computational advantage if machine features are disregarded. The discussion is supported by experimental results.
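
    A concrete illustration of windowed waveform relaxation on a small model problem: a Jacobi splitting of x' = Ax, with each time window swept to convergence before the solution is advanced to the next window. The 2×2 system, forward-Euler integration and tolerances are illustrative choices, not the model problems analyzed in the note.

```python
import numpy as np

def jacobi_wr_window(A, x0, t0, t1, dt, tol=1e-8, max_sweeps=100):
    """Jacobi waveform relaxation for x' = A x on one time window [t0, t1].
    Each sweep integrates every component with forward Euler, using the
    *previous sweep's* waveform for the coupling terms, until the waveform
    stops changing."""
    n_steps = int(round((t1 - t0) / dt))
    D = np.diag(np.diag(A))
    R = A - D
    # initial guess for the waveform: constant, equal to the initial value
    wave = np.tile(x0, (n_steps + 1, 1))
    for sweep in range(max_sweeps):
        new = np.empty_like(wave)
        new[0] = x0
        for k in range(n_steps):
            # coupling uses the old waveform (Jacobi splitting)
            new[k + 1] = new[k] + dt * (D @ new[k] + R @ wave[k])
        if np.max(np.abs(new - wave)) < tol:
            return new, sweep + 1
        wave = new
    return wave, max_sweeps

def windowed_wr(A, x0, t_end, window, dt):
    """Advance window by window, restarting the relaxation from each window's
    converged end state -- the 'windowing' discussed in the note."""
    x, sweeps_used, t = np.asarray(x0, float), [], 0.0
    while t < t_end - 1e-12:
        wave, sweeps = jacobi_wr_window(A, x, t, min(t + window, t_end), dt)
        x = wave[-1]
        sweeps_used.append(sweeps)
        t = min(t + window, t_end)
    return x, sweeps_used

A = np.array([[-2.0, 1.0], [1.0, -2.0]])
x_end, sweeps = windowed_wr(A, [1.0, 0.0], t_end=2.0, window=0.5, dt=0.01)
print("x(2) ~", np.round(x_end, 4), "sweeps per window:", sweeps)
```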

  12. Night-time naturally ventilated offices: Statistical simulations of window-use patterns from field monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yun, Geun Young; Steemers, Koen

    2010-07-15

This paper investigates occupant window-use behaviour in night-time naturally ventilated offices on the basis of a pilot field study, conducted during the summers of 2006 and 2007 in Cambridge, UK, and then demonstrates the effects of employing night-time ventilation on indoor thermal conditions using predictive models of occupant window-use. A longitudinal field study shows that occupants make good use of night-time natural ventilation strategies when provided with openings that allow secure ventilation, and that there is a noticeable time of day effect in window-use patterns (i.e. increased probability of action on arrival and departure). We develop logistic models of window-use for night-time naturally ventilated offices, which are subsequently applied to a behaviour algorithm, including Markov chains and Monte Carlo methods. The simulations using the behaviour algorithm demonstrate a good agreement with the observational data of window-use, and reveal how building design and occupant behaviour collectively affect the thermal performance of offices. They illustrate that the provision of secure ventilation leads to more frequent use of the window, and thus contributes significantly to the achievement of a comfortable indoor environment during the daytime occupied period. For example, the maximum temperature for a night-time ventilated office is found to be 3 °C below the predicted value for a daytime-only ventilated office.
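
    The structure described above — logistic window-use probabilities driving a Markov-chain simulation of window state, with a boost in action probability at arrival and departure — can be sketched as follows. All coefficients and predictors are invented for illustration; the paper fits its own logistic models to the Cambridge field data.

```python
import math
import random

def logistic(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def simulate_window_state(indoor_temp, arrival_hour=8, departure_hour=18, seed=0):
    """Markov-chain simulation of hourly window state (0 = closed, 1 = open).

    `indoor_temp` is a list of 24 hourly indoor temperatures.  Opening and
    closing probabilities come from logistic functions of temperature, with
    an extra boost at arrival and departure (the 'time of day' effect noted
    in the paper).  All coefficients are illustrative, not fitted values.
    """
    rng = random.Random(seed)
    state, states = 0, []
    for hour, temp in enumerate(indoor_temp):
        at_transition = hour in (arrival_hour, departure_hour)
        if state == 0:
            p_open = logistic(-6.0 + 0.25 * temp + (2.0 if at_transition else 0.0))
            state = 1 if rng.random() < p_open else 0
        else:
            p_close = logistic(2.0 - 0.15 * temp + (1.0 if at_transition else 0.0))
            state = 0 if rng.random() < p_close else 1
        states.append(state)
    return states

# Warm afternoon, mild night: fraction of night hours (22:00-06:00) with window open
temps = [20 + 6 * math.sin((h - 6) * math.pi / 12) for h in range(24)]
night = [s for h, s in enumerate(simulate_window_state(temps)) if h >= 22 or h < 6]
print("night-time open fraction:", sum(night) / len(night))
```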

  13. A Numerical Study of the Thermal Characteristics of an Air Cavity Formed by Window Sashes in a Double Window

    NASA Astrophysics Data System (ADS)

    Kang, Jae-sik; Oh, Eun-Joo; Bae, Min-Jung; Song, Doo-Sam

    2017-12-01

Given that the Korean government is implementing what has been termed the energy standards and labelling program for windows, window companies will be required to assign window ratings based on the experimental results for their products. Because this adds to the cost and time required for laboratory tests by window companies, a simulation system for the thermal performance of windows has been prepared to reduce the time and cost burden. In Korea, the thermal performance of a window is usually calculated with the WINDOW/THERM simulator, complying with ISO 15099. For a single window, the simulation results are similar to experimental results. A double window is calculated using the same method, but the calculation results for this type of window are unreliable, because the ISO 15099 treatment of the thermal properties of an air cavity between window sashes is not well suited to a double window. This causes a difference between the simulated and experimental thermal performance of a double window. In this paper, the thermal properties of air cavities between window sashes in a double window are analyzed through computational fluid dynamics (CFD) simulations, and the results are compared to calculation results certified by ISO 15099. The surface temperature of the air cavity analyzed by CFD is compared to the experimental temperatures. These results show that an appropriate calculation method for an air cavity between window sashes in a double window should be established to obtain reliable thermal performance results for a double window.

  14. Real-Time Station Grouping under Dynamic Traffic for IEEE 802.11ah

    PubMed Central

    Tian, Le; Latré, Steven

    2017-01-01

    IEEE 802.11ah, marketed as Wi-Fi HaLow, extends Wi-Fi to the sub-1 GHz spectrum. Through a number of physical layer (PHY) and media access control (MAC) optimizations, it aims to bring greatly increased range, energy-efficiency, and scalability. This makes 802.11ah the perfect candidate for providing connectivity to Internet of Things (IoT) devices. One of these new features, referred to as the Restricted Access Window (RAW), focuses on improving scalability in highly dense deployments. RAW divides stations into groups and reduces contention and collisions by only allowing channel access to one group at a time. However, the standard does not dictate how to determine the optimal RAW grouping parameters. The optimal parameters depend on the current network conditions, and it has been shown that incorrect configuration severely impacts throughput, latency and energy efficiency. In this paper, we propose a traffic-adaptive RAW optimization algorithm (TAROA) to adapt the RAW parameters in real time based on the current traffic conditions, optimized for sensor networks in which each sensor transmits packets with a certain (predictable) frequency and may change the transmission frequency over time. The TAROA algorithm is executed at each target beacon transmission time (TBTT), and it first estimates the packet transmission interval of each station only based on packet transmission information obtained by access point (AP) during the last beacon interval. Then, TAROA determines the RAW parameters and assigns stations to RAW slots based on this estimated transmission frequency. The simulation results show that, compared to enhanced distributed channel access/distributed coordination function (EDCA/DCF), the TAROA algorithm can highly improve the performance of IEEE 802.11ah dense networks in terms of throughput, especially when hidden nodes exist, although it does not always achieve better latency performance. This paper contributes with a practical approach to optimizing RAW grouping under dynamic traffic in real time, which is a major leap towards applying RAW mechanism in real-life IoT networks. PMID:28677617

  15. Real-Time Station Grouping under Dynamic Traffic for IEEE 802.11ah.

    PubMed

    Tian, Le; Khorov, Evgeny; Latré, Steven; Famaey, Jeroen

    2017-07-04

    IEEE 802.11ah, marketed as Wi-Fi HaLow, extends Wi-Fi to the sub-1 GHz spectrum. Through a number of physical layer (PHY) and media access control (MAC) optimizations, it aims to bring greatly increased range, energy-efficiency, and scalability. This makes 802.11ah the perfect candidate for providing connectivity to Internet of Things (IoT) devices. One of these new features, referred to as the Restricted Access Window (RAW), focuses on improving scalability in highly dense deployments. RAW divides stations into groups and reduces contention and collisions by only allowing channel access to one group at a time. However, the standard does not dictate how to determine the optimal RAW grouping parameters. The optimal parameters depend on the current network conditions, and it has been shown that incorrect configuration severely impacts throughput, latency and energy efficiency. In this paper, we propose a traffic-adaptive RAW optimization algorithm (TAROA) to adapt the RAW parameters in real time based on the current traffic conditions, optimized for sensor networks in which each sensor transmits packets with a certain (predictable) frequency and may change the transmission frequency over time. The TAROA algorithm is executed at each target beacon transmission time (TBTT), and it first estimates the packet transmission interval of each station only based on packet transmission information obtained by access point (AP) during the last beacon interval. Then, TAROA determines the RAW parameters and assigns stations to RAW slots based on this estimated transmission frequency. The simulation results show that, compared to enhanced distributed channel access/distributed coordination function (EDCA/DCF), the TAROA algorithm can highly improve the performance of IEEE 802.11ah dense networks in terms of throughput, especially when hidden nodes exist, although it does not always achieve better latency performance. This paper contributes with a practical approach to optimizing RAW grouping under dynamic traffic in real time, which is a major leap towards applying RAW mechanism in real-life IoT networks.
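
    A much-simplified sketch of the two steps summarized above: estimate each station's packet transmission interval from the packets the AP received during the last beacon interval, then group stations with similar estimated intervals into the same RAW slot. This is not the TAROA algorithm itself (whose parameter assignment is not detailed in the abstract); the mean-gap estimator and equal-size grouping are illustrative assumptions.

```python
from collections import defaultdict

def estimate_intervals(rx_log, beacon_interval=1.0):
    """Estimate each station's packet transmission interval from the packets the
    AP received during the last beacon interval.
    rx_log : list of (station_id, rx_time) pairs observed in that interval."""
    times = defaultdict(list)
    for sta, t in rx_log:
        times[sta].append(t)
    est = {}
    for sta, ts in times.items():
        ts.sort()
        # mean inter-packet gap; a single packet gives no gap, so fall back
        # to the beacon interval as a conservative estimate
        est[sta] = (ts[-1] - ts[0]) / (len(ts) - 1) if len(ts) > 1 else beacon_interval
    return est

def assign_raw_slots(intervals, n_slots):
    """Group stations into RAW slots: sort by estimated transmission interval and
    cut the sorted list into `n_slots` contiguous groups, so stations that will
    contend at a similar rate share a slot."""
    ordered = sorted(intervals, key=intervals.get)
    size = -(-len(ordered) // n_slots)                    # ceiling division
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

rx_log = [("s1", 0.05), ("s1", 0.30), ("s1", 0.55), ("s2", 0.10), ("s2", 0.90),
          ("s3", 0.20), ("s4", 0.15), ("s4", 0.40), ("s4", 0.65), ("s4", 0.90)]
intervals = estimate_intervals(rx_log)
print(intervals)
print(assign_raw_slots(intervals, n_slots=2))
```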

  16. Process Flow Features as a Host-Based Event Knowledge Representation

    DTIC Science & Technology

    2012-06-14

    an executing process during a window of time called a process flow. Process flows are calculated from key process data structures extracted from... [Remaining text is list-of-figures residue: Davies-Bouldin and Dunn index results for sliding windows of 5, 10, and 20 on Windows 7.]

  17. Dynamic state estimation based on Poisson spike trains—towards a theory of optimal encoding

    NASA Astrophysics Data System (ADS)

    Susemihl, Alex; Meir, Ron; Opper, Manfred

    2013-03-01

    Neurons in the nervous system convey information to higher brain regions by the generation of spike trains. An important question in the field of computational neuroscience is how these sensory neurons encode environmental information in a way which may be simply analyzed by subsequent systems. Many aspects of the form and function of the nervous system have been understood using the concepts of optimal population coding. Most studies, however, have neglected the aspect of temporal coding. Here we address this shortcoming through a filtering theory of inhomogeneous Poisson processes. We derive exact relations for the minimal mean squared error of the optimal Bayesian filter and, by optimizing the encoder, obtain optimal codes for populations of neurons. We also show that a class of non-Markovian, smooth stimuli are amenable to the same treatment, and provide results for the filtering and prediction error which hold for a general class of stochastic processes. This sets a sound mathematical framework for a population coding theory that takes temporal aspects into account. It also formalizes a number of studies which discussed temporal aspects of coding using time-window paradigms, by stating them in terms of correlation times and firing rates. We propose that this kind of analysis allows for a systematic study of temporal coding and will bring further insights into the nature of the neural code.

  18. Exponential smoothing weighted correlations

    NASA Astrophysics Data System (ADS)

    Pozzi, F.; Di Matteo, T.; Aste, T.

    2012-06-01

    In many practical applications, correlation matrices might be affected by the "curse of dimensionality" and by an excessive sensitiveness to outliers and remote observations. These shortcomings can cause problems of statistical robustness especially accentuated when a system of dynamic correlations over a running window is concerned. These drawbacks can be partially mitigated by assigning a structure of weights to observational events. In this paper, we discuss Pearson's ρ and Kendall's τ correlation matrices, weighted with an exponential smoothing, computed on moving windows using a data-set of daily returns for 300 NYSE highly capitalized companies in the period between 2001 and 2003. Criteria for jointly determining optimal weights together with the optimal length of the running window are proposed. We find that the exponential smoothing can provide more robust and reliable dynamic measures and we discuss that a careful choice of the parameters can reduce the autocorrelation of dynamic correlations whilst keeping significance and robustness of the measure. Weighted correlations are found to be smoother and recovering faster from market turbulence than their unweighted counterparts, helping also to discriminate more effectively genuine from spurious correlations.
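
    As a hedged illustration of the weighting scheme (the decay constant theta, window length and data handling below are arbitrary choices, not the values used in the paper), an exponentially smoothed Pearson correlation over a moving window can be computed as follows.

    import numpy as np

    def exp_weights(window, theta):
        t = np.arange(window)
        w = np.exp((t - (window - 1)) / theta)   # most recent observation weighted highest
        return w / w.sum()

    def weighted_corr(x, y, w):
        mx, my = np.sum(w * x), np.sum(w * y)
        cov = np.sum(w * (x - mx) * (y - my))
        vx = np.sum(w * (x - mx) ** 2)
        vy = np.sum(w * (y - my) ** 2)
        return cov / np.sqrt(vx * vy)

    def rolling_weighted_corr(x, y, window=100, theta=30.0):
        w = exp_weights(window, theta)
        return np.array([weighted_corr(x[i - window:i], y[i - window:i], w)
                         for i in range(window, len(x) + 1)])

    The same weights can in principle be reused for a weighted Kendall's tau by replacing the weighted covariance with a weighted count of concordant and discordant pairs.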

  19. Adjoint-Based Climate Model Tuning: Application to the Planet Simulator

    NASA Astrophysics Data System (ADS)

    Lyu, Guokun; Köhl, Armin; Matei, Ion; Stammer, Detlef

    2018-01-01

    The adjoint method is used to calibrate the medium complexity climate model "Planet Simulator" through parameter estimation. Identical twin experiments demonstrate that this method can retrieve default values of the control parameters when using a long assimilation window of the order of 2 months. Chaos synchronization through nudging, required to overcome limits in the temporal assimilation window in the adjoint method, is employed successfully to reach this assimilation window length. When assimilating ERA-Interim reanalysis data, the observations of air temperature and the radiative fluxes are the most important data for adjusting the control parameters. The global mean net longwave fluxes at the surface and at the top of the atmosphere are significantly improved by tuning two model parameters controlling the absorption of clouds and water vapor. The global mean net shortwave radiation at the surface is improved by optimizing three model parameters controlling cloud optical properties. The optimized parameters improve the free model (without nudging terms) simulation in a way similar to that in the assimilation experiments. Results suggest a promising way for tuning uncertain parameters in nonlinear coupled climate models.

  20. Parameters effective on estimating a nonstationary mixed-phase wavelet using cumulant matching approach

    NASA Astrophysics Data System (ADS)

    Vosoughi, Ehsan; Javaherian, Abdolrahim

    2018-01-01

    Seismic inversion is a process performed to remove the effects of propagated wavelets in order to recover the acoustic impedance. To obtain valid velocity and density values for subsurface layers through the inversion process, it is essential to perform reliable wavelet estimation, for example with the cumulant matching approach. For this purpose, the seismic data were windowed in this work in such a way that two consecutive windows were only one sample apart. We also did not assume a fixed wavelet for any window and let the phase of each wavelet rotate at each sample in the window. Comparing the fourth-order cumulant of the whitened trace and the fourth-order moment of the all-pass operator in each window generated a cost function that is minimized with a non-linear optimization method. In this regard, parameters affecting the estimation of the nonstationary mixed-phase wavelets were tested on the created nonstationary seismic trace at 0.82 s and 1.6 s, and the effects of each parameter on the wavelets estimated at these two times were compared. The parameters studied in this work are window length, taper type, number of iterations, signal-to-noise ratio, bandwidth to central frequency ratio, and Q factor. The results show that, applying the optimum values of the effective parameters, the average correlation of the estimated mixed-phase wavelets with the original ones is about 87%. Moreover, the effectiveness of the proposed approach was examined on a synthetic nonstationary seismic section with variable Q factor values along the time and offset axes. The cumulant matching method was then applied to a cross line of the migrated data from a 3D data set of an oilfield in the Persian Gulf, and the effect of an incorrect Q estimate on the estimated mixed-phase wavelet was considered for the real data set. It is concluded that the accuracy of the estimated wavelet depends on the estimated Q, and an error of more than 10% in the estimated value of Q is acceptable. An 88% correlation was found between the estimated mixed-phase wavelets and the original ones for three horizons. The estimated wavelets were applied to the data and the results of the deconvolution process are presented.

  1. Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR

    NASA Technical Reports Server (NTRS)

    Corpaccioli, Luca; Linskens, Harry; Komar, David R.

    2014-01-01

    The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy uses a grid search and pattern search, an iterative variable grid method. A genetic algorithm can be selectively used to improve search space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission ΔV offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor of 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.

  2. Scheduling algorithm for data relay satellite optical communication based on artificial intelligent optimization

    NASA Astrophysics Data System (ADS)

    Zhao, Wei-hu; Zhao, Jing; Zhao, Shang-hong; Li, Yong-jun; Wang, Xiang; Dong, Yi; Dong, Chen

    2013-08-01

    Optical satellite communication, with the advantages of broadband, large capacity and low power consumption, breaks the bottleneck of traditional microwave satellite communication. The formation of the Space-based Information System with high-performance optical inter-satellite communication technology, and the realization of global seamless coverage and mobile terminal access, are the necessary trends in the development of optical satellite communication. Considering the resources, missions and constraints of the Data Relay Satellite Optical Communication System, a model of optical communication resource scheduling is established and a scheduling algorithm based on artificial intelligence optimization is put forward. For the multi-relay-satellite, multi-user-satellite, multi-optical-antenna and multi-mission scenario with several priority weights, the resources are scheduled reasonably by two operations: "Ascertain Current Mission Scheduling Time" and "Refresh Latter Mission Time-Window". The priority weight is used as a parameter of the fitness function, and the scheduling scheme is optimized by the Genetic Algorithm. In simulation scenarios including 3 relay satellites with 6 optical antennas, 12 user satellites and 30 missions, the results reveal that the algorithm obtains satisfactory results in both efficiency and performance, and that the resource scheduling model and the optimization algorithm are suitable for the multi-relay-satellite, multi-user-satellite and multi-optical-antenna resource scheduling problem.

  3. Photon counting phosphorescence lifetime imaging with TimepixCam

    DOE PAGES

    Hirvonen, Liisa M.; Fisher-Levine, Merlin; Suhling, Klaus; ...

    2017-01-12

    TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window, and read out by a Timepix ASIC. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting (TCSPC) imaging. We have characterised the photon detection capabilities of this detector system, and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.

  4. Recovery considerations for possible high inclination long duration earth orbital missions

    NASA Technical Reports Server (NTRS)

    Obriant, T. E.; Ferguson, J. E.

    1969-01-01

    Problem areas are discussed and various solutions proposed. One of the major recovery problems encountered with missions having higher orbital inclinations than previous missions is the greater likelihood of severe weather conditions in the landing zones, especially if landing zones are optimized for orbital coverage considerations. Restricting the reentry window and increasing in-orbit wait times can partially eliminate the weather problem, but the possibility of emergency landings at higher latitudes still exists. It can be expected that the increased confidence level in spacecraft reliability that will exist by the time the high-inclination missions are flown will reduce the probabilities of an emergency landing in an unfavorable recovery location to a very low level.

  5. Photon counting phosphorescence lifetime imaging with TimepixCam.

    PubMed

    Hirvonen, Liisa M; Fisher-Levine, Merlin; Suhling, Klaus; Nomerotski, Andrei

    2017-01-01

    TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window and read out by a Timepix Application Specific Integrated Circuit. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting imaging. We have characterised the photon detection capabilities of this detector system and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.

  6. Photon counting phosphorescence lifetime imaging with TimepixCam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirvonen, Liisa M.; Fisher-Levine, Merlin; Suhling, Klaus

    TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window, and read out by a Timepix ASIC. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting (TCSPC) imaging. We have characterised the photon detection capabilities of this detector system, and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.

  7. Photon counting phosphorescence lifetime imaging with TimepixCam

    NASA Astrophysics Data System (ADS)

    Hirvonen, Liisa M.; Fisher-Levine, Merlin; Suhling, Klaus; Nomerotski, Andrei

    2017-01-01

    TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window and read out by a Timepix Application Specific Integrated Circuit. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting imaging. We have characterised the photon detection capabilities of this detector system and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.

  8. Time Is Brain: The Stroke Theory of Relativity.

    PubMed

    Gomez, Camilo R

    2018-04-25

    Since the introduction of the philosophical tenet "Time is Brain!," multiple lines of research have demonstrated that other factors contribute to the degree of ischemic injury at any one point in time, and it is now clear that the therapeutic window of acute ischemic stroke is more protracted than it was first suspected. To define a more realistic relationship between time and the ischemic process, we used computational modeling to assess how these 2 variables are affected by collateral circulatory competence. Starting from the premise that the expression "Time=Brain" is mathematically false, we reviewed the existing literature on the attributes of cerebral ischemia over time, with particular attention to relevant clinical parameters, and the effect of different variables, particularly collateral circulation, on the time-ischemia relationship. We used this information to construct a theoretical computational model and applied it to categorically different yet abnormal cerebral perfusion scenarios, allowing comparison of their behavior both overall (i.e., final infarct volume) and in real-time (i.e., instantaneous infarct growth rate). Optimal collateral circulatory competence was predictably associated with slower infarct growth rates and prolongation of therapeutic window. Modeling of identifiable specific types of perfusion maps allows forecasting of the fate of the ischemic process over time. Distinct cerebral perfusion map patterns can be readily identified in patients with acute ischemic stroke. These patterns have inherently different behaviors relative to the time-ischemia construct, allowing the possibility of improving parsing and treatment allocation. It is clearly evident that the effect of time on the ischemic process is relative. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  9. A simple approach to measure transmissibility and forecast incidence.

    PubMed

    Nouvellet, Pierre; Cori, Anne; Garske, Tini; Blake, Isobel M; Dorigatti, Ilaria; Hinsley, Wes; Jombart, Thibaut; Mills, Harriet L; Nedjati-Gilani, Gemma; Van Kerkhove, Maria D; Fraser, Christophe; Donnelly, Christl A; Ferguson, Neil M; Riley, Steven

    2018-03-01

    Outbreaks of novel pathogens such as SARS, pandemic influenza and Ebola require substantial investments in reactive interventions, with consequent implementation plans sometimes revised on a weekly basis. Therefore, short-term forecasts of incidence are often of high priority. In light of the recent Ebola epidemic in West Africa, a forecasting exercise was convened by a network of infectious disease modellers. The challenge was to forecast unseen "future" simulated data for four different scenarios at five different time points. In a similar method to that used during the recent Ebola epidemic, we estimated current levels of transmissibility, over variable time-windows chosen in an ad hoc way. Current estimated transmissibility was then used to forecast near-future incidence. We performed well within the challenge and often produced accurate forecasts. A retrospective analysis showed that our subjective method for deciding on the window of time with which to estimate transmissibility often resulted in the optimal choice. However, when near-future trends deviated substantially from exponential patterns, the accuracy of our forecasts was reduced. This exercise highlights the urgent need for infectious disease modellers to develop more robust descriptions of processes - other than the widespread depletion of susceptible individuals - that produce non-exponential patterns of incidence. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
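
    The windowed estimation-and-projection idea can be sketched with the renewal equation and a discretised serial-interval distribution; the window length, serial interval and function names below are assumptions rather than the authors' exact procedure.

    import numpy as np

    def infection_pressure(incidence, w):
        """Lambda_t = sum_k w[k-1] * I_{t-k}; incidence is a 1-D numpy array of daily counts."""
        lam = np.zeros(len(incidence))
        for t in range(1, len(incidence)):
            k = min(t, len(w))
            lam[t] = np.dot(w[:k], incidence[t - 1::-1][:k])
        return lam

    def estimate_R(incidence, w, window=7):
        """Transmissibility over the trailing window: ratio of cases to infection pressure."""
        lam = infection_pressure(incidence, w)
        return incidence[-window:].sum() / max(lam[-window:].sum(), 1e-9)

    def forecast(incidence, w, R, horizon=14):
        """Project incidence forward assuming transmissibility stays at R."""
        inc = list(incidence)
        for _ in range(horizon):
            recent = np.array(inc[::-1][:len(w)])
            inc.append(R * np.dot(w[:len(recent)], recent))
        return np.array(inc[len(incidence):])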

  10. Therapeutic Time Window for Edaravone Treatment of Traumatic Brain Injury in Mice

    PubMed Central

    Miyamoto, Kazuyuki; Ohtaki, Hirokazu; Dohi, Kenji; Tsumuraya, Tomomi; Song, Dandan; Kiriyama, Keisuke; Satoh, Kazue; Shimizu, Ai; Aruga, Tohru; Shioda, Seiji

    2013-01-01

    Traumatic brain injury (TBI) is a major cause of death and disability in young people. No effective therapy is available to ameliorate its damaging effects. Our aim was to investigate the optimal therapeutic time window of edaravone, a free radical scavenger which is currently used in Japan. We also determined the temporal profile of reactive oxygen species (ROS) production, oxidative stress, and neuronal death. Male C57Bl/6 mice were subjected to a controlled cortical impact (CCI). Edaravone (3.0 mg/kg), or vehicle, was administered intravenously at 0, 3, or 6 hours following CCI. The production of superoxide radicals (O2•−) as a marker of ROS, of nitrotyrosine (NT) as an indicator of oxidative stress, and neuronal death were measured for 24 hours following CCI. Superoxide radical production was clearly evident 3 hours after CCI, with oxidative stress and neuronal cell death becoming apparent after 6 hours. Edaravone administration after CCI resulted in a significant reduction in the injury volume and oxidative stress, particularly at the 3-hour time point. Moreover, the greatest decrease in O2•− levels was observed when edaravone was administered 3 hours following CCI. These findings suggest that edaravone could prove clinically useful to ameliorate the devastating effects of TBI. PMID:23710445

  11. The application of DEA (Data Envelopment Analysis) window analysis in the assessment of influence on operational efficiencies after the establishment of branched hospitals.

    PubMed

    Jia, Tongying; Yuan, Huiyun

    2017-04-12

    Many large-scale public hospitals have established branched hospitals in China. This study provides evidence for strategy making on the management and development of multi-branched hospitals by evaluating and comparing the operational efficiencies of different hospitals before and after their establishment of branched hospitals. DEA (Data Envelopment Analysis) window analysis was performed on a 7-year data pool from five public hospitals, provided by health authorities and institutional surveys. The operational efficiencies of the sample hospitals measured in this study (including technical efficiency, pure technical efficiency and scale efficiency) showed overall increasing trends during this 7-year period; however, a temporary downturn occurred shortly after the establishment of branched hospitals. Pure technical efficiency contributed more to the improvement of technical efficiency than scale efficiency did. The establishment of branched hospitals did not lead to a long-term negative effect on hospital operational efficiencies. Our data indicate the importance of improving scale efficiency via the optimization of organizational management, as well as the advantage of a different form of branch establishment, namely merging and reorganization. This study provides insight into the practical application of DEA window analysis for the assessment of hospital operational efficiencies.
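
    For illustration only, a DEA window analysis can be sketched with the input-oriented CCR envelopment model solved as a linear program; the three-year window and the data layout below are assumptions, not details taken from the study.

    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, j0):
        """X: inputs (m x n), Y: outputs (s x n), j0: index of the evaluated DMU."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]                  # minimise theta
        A_in = np.c_[-X[:, [j0]], X]                 # X @ lam <= theta * x0
        A_out = np.c_[np.zeros((s, 1)), -Y]          # Y @ lam >= y0
        A = np.vstack([A_in, A_out])
        b = np.r_[np.zeros(m), -Y[:, j0]]
        res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] + [(0, None)] * n)
        return res.x[0]                              # efficiency score theta

    def dea_window(X_years, Y_years, window=3):
        """X_years/Y_years: yearly (m x DMUs) and (s x DMUs) arrays; each window
        pools the DMUs of `window` consecutive years into one reference set."""
        scores = []
        for start in range(len(X_years) - window + 1):
            X = np.hstack(X_years[start:start + window])
            Y = np.hstack(Y_years[start:start + window])
            scores.append([ccr_efficiency(X, Y, j) for j in range(X.shape[1])])
        return scores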

  12. Is Latency to Test Deadline a Predictor of Student Test Performance?

    ERIC Educational Resources Information Center

    Landrum, R. Eric; Gurung, Regan A. R.

    2013-01-01

    When students are given a period or window of time to take an exam, is taking an exam earlier in the window (high latency to deadline) related to test scores? In Study 1, students (n = 236) were given windows of time to take online each of 13 quizzes and 4 exams. In Study 2, students (n = 251) similarly took 4 exams online within a test window. In…

  13. S-wave attenuation of the shallow sediments in the North China basin based on borehole seismograms of local earthquakes

    NASA Astrophysics Data System (ADS)

    Wang, Sheng; Li, Zhiwei

    2018-06-01

    S-wave velocity and attenuation structures of shallow sediments play important roles in accurate prediction of strong ground motion. However, it is more difficult to investigate attenuation than velocity structures. In this study, we developed a new approach for estimating the frequency-dependent S-wave attenuation (Q_S^{-1}) structure of shallow sediments based on multiple time window analysis of borehole seismograms from local earthquakes. Multiple time windows for separating direct and surface-reflected S-waves in local earthquake waveforms at borehole stations are selected with a global optimization scheme. For the different time windows, the transfer functions between direct and surface-reflected S-waves are obtained with a weighted averaging scheme, from which frequency-dependent Q_S^{-1} values are derived. Synthetic tests suggest that the proposed method can recover robust and reliable Q_S^{-1} values, especially when the dataset of local earthquakes is not abundant. We applied this method to local earthquake waveforms at 14 borehole seismic stations in the North China basin, and obtained Q_S^{-1} values in the 2-10 Hz frequency band, as well as average V_P, V_S and V_P/V_S ratios, for shallow sediments down to a few hundred meters. Results suggest that Q_S^{-1} values range from 0.01 to 0.06 and generally decrease with frequency. The average attenuation structure of the shallow sediments within the depth of a few hundred meters beneath the 14 borehole stations in the North China basin can be modeled as Q_S^{-1} = 0.056 f^{-0.61}. This is generally consistent with the attenuation structure of sedimentary basins in other areas, such as the Mississippi Embayment sediments in the United States and the Sendai basin in Japan.
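
    As a small worked illustration of the reported power-law form, the coefficients of Q_S^{-1}(f) = a f^{-b} can be recovered from frequency-dependent attenuation estimates by linear regression in log-log space; the sample frequencies and values below are placeholders, not the borehole measurements.

    import numpy as np

    def fit_power_law(freqs, q_inv):
        """Fit q_inv = a * f**(-b) by least squares on log(q_inv) = log(a) - b*log(f)."""
        slope, log_a = np.polyfit(np.log(freqs), np.log(q_inv), 1)
        return np.exp(log_a), -slope

    freqs = np.array([2.0, 4.0, 6.0, 8.0, 10.0])     # Hz (placeholder band)
    q_inv = 0.056 * freqs ** -0.61                   # synthetic example values
    a, b = fit_power_law(freqs, q_inv)               # recovers a ~ 0.056, b ~ 0.61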

  14. An adaptive segment method for smoothing lidar signal based on noise estimation

    NASA Astrophysics Data System (ADS)

    Wang, Yuzhao; Luo, Pingping

    2014-10-01

    An adaptive segmentation smoothing method (ASSM) is introduced in this paper to smooth the signal and suppress the noise. In the ASSM, the noise level is defined as 3σ of the background signal. An integer N is defined for finding the changing positions in the signal curve: if the difference between two adjacent points is greater than 3Nσ, the position is recorded as an end point of a smoothing segment. All end points detected in this way are recorded, and the curves between them are smoothed separately. In the traditional method, the end points of the smoothing windows are fixed; the ASSM instead derives different end points for different signals, so the smoothing windows are set adaptively. Each window is set to half the segment length, and the averaging smoother is then applied within each segment. An iterative process is required to reduce the end-point aberration effect of the averaging smoother, and two or three iterations are enough. In the ASSM, the signals are smoothed in the spatial domain rather than the frequency domain, which avoids frequency-domain disturbances. A lidar echo was simulated in the experimental work, assumed to be produced by a space-borne lidar (e.g. CALIOP), and white Gaussian noise was added to the echo to represent the random noise arising from the environment and the detector. The ASSM was applied to the noisy echo to filter the noise. In the test, N was set to 3 and the number of iterations to two. The results show that the signal can be smoothed adaptively by the ASSM, although N and the number of iterations may need to be re-optimized when the ASSM is applied to a different lidar.
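
    A minimal sketch of this procedure is given below; the background-noise estimate and the simple moving-average smoother are illustrative choices based on the description above, not the authors' exact implementation.

    import numpy as np

    def assm(signal, background, N=3, iterations=2):
        sigma = np.std(background)                  # noise level from the background signal
        thresh = 3 * N * sigma
        # segment end points: adjacent-point differences exceeding 3*N*sigma
        jumps = np.where(np.abs(np.diff(signal)) > thresh)[0] + 1
        edges = np.r_[0, jumps, len(signal)]
        out = np.asarray(signal, dtype=float).copy()
        for _ in range(iterations):                 # iterate to reduce end-point aberration
            for a, b in zip(edges[:-1], edges[1:]):
                seg = out[a:b]
                win = max(1, (b - a) // 2)          # window set to half the segment length
                kernel = np.ones(win) / win
                out[a:b] = np.convolve(seg, kernel, mode='same')
        return out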

  15. A PC-based shutter glasses controller for visual stimulation using multithreading in LabWindows/CVI.

    PubMed

    Gramatikov, Ivan; Simons, Kurt; Guyton, David; Gramatikov, Boris

    2017-05-01

    Amblyopia, commonly known as "lazy eye," is poor vision in an eye from prolonged neurologic suppression. It is a major public health problem, afflicting up to 3.6% of children, and will lead to lifelong visual impairment if not identified and treated in early childhood. Traditional treatment methods, such as occluding or penalizing the good eye with eye patches or blurring eye drops, do not always yield satisfactory results. Newer methods have emerged, based on liquid crystal shutter glasses that intermittently occlude the better eye, or alternately occlude the two eyes, thus stimulating vision in the "lazy" eye. As yet there is no technology that allows easy and efficient optimization of the shuttering characteristics for a given individual. The purpose of this study was to develop an inexpensive, computer-based system to perform liquid crystal shuttering in laboratory and clinical settings to help "wake up" the suppressed eye in amblyopic patients, and to help optimize the individual shuttering parameters such as wave shape, level of transparency/opacity, frequency, and duty cycle of the shuttering. We developed a liquid crystal glasses controller connected by a USB cable to a PC. It generates the voltage waveforms going to the glasses, and has potentiometer knobs for interactive adjustments by the patient. In order to achieve good timing performance in this bidirectional system, we used multithreading programming techniques with data protection, implemented in LabWindows/CVI. The hardware and software developed were assessed experimentally. We achieved an accuracy of ±1 Hz for the frequency, and ±2% for the duty cycle of the occlusion pulses. We consider these values to be satisfactory for the purpose of optimizing the visual stimulation by means of shutter glasses. The system can be used for individual optimization of shuttering attributes by clinicians, for training sessions in clinical settings, or even at home, aimed at stimulating vision in the "lazy" eye. Multithreading offers significant benefits for data acquisition and instrument control, making it possible to implement time-efficient algorithms in inexpensive yet versatile medical instrumentation with only minimum requirements on the hardware. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Sliding-window analysis tracks fluctuations in amygdala functional connectivity associated with physiological arousal and vigilance during fear conditioning.

    PubMed

    Baczkowski, Blazej M; Johnstone, Tom; Walter, Henrik; Erk, Susanne; Veer, Ilya M

    2017-06-01

    We evaluated whether sliding-window analysis can reveal functionally relevant brain network dynamics during a well-established fear conditioning paradigm. To this end, we tested if fMRI fluctuations in amygdala functional connectivity (FC) can be related to task-induced changes in physiological arousal and vigilance, as reflected in the skin conductance level (SCL). Thirty-two healthy individuals participated in the study. For the sliding-window analysis we used windows that were shifted by one volume at a time. Amygdala FC was calculated for each of these windows. Simultaneously acquired SCL time series were averaged over time frames that corresponded to the sliding-window FC analysis, which were subsequently regressed against the whole-brain seed-based amygdala sliding-window FC using the GLM. Surrogate time series were generated to test whether connectivity dynamics could have occurred by chance. In addition, results were contrasted against static amygdala FC and sliding-window FC of the primary visual cortex, which was chosen as a control seed, while a physio-physiological interaction (PPI) was performed as cross-validation. During periods of increased SCL, the left amygdala became more strongly coupled with the bilateral insula and anterior cingulate cortex, core areas of the salience network. The sliding-window analysis yielded a connectivity pattern that was unlikely to have occurred by chance, was spatially distinct from static amygdala FC and from sliding-window FC of the primary visual cortex, but was highly comparable to that of the PPI analysis. We conclude that sliding-window analysis can reveal functionally relevant fluctuations in connectivity in the context of an externally cued task. Copyright © 2017 Elsevier Inc. All rights reserved.
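
    A schematic sketch of this pipeline is given below (window length, array shapes and the per-voxel GLM details are assumptions): seed-based sliding-window connectivity is computed with a one-volume shift, and the window-averaged SCL is regressed against the connectivity time course of every voxel.

    import numpy as np

    def sliding_window_fc(seed, voxels, window):
        """seed: (T,), voxels: (T, V); returns (T - window + 1, V) window-wise correlations."""
        n_win = len(seed) - window + 1
        fc = np.empty((n_win, voxels.shape[1]))
        for i in range(n_win):
            s = seed[i:i + window]
            v = voxels[i:i + window]
            s_z = (s - s.mean()) / s.std()
            v_z = (v - v.mean(axis=0)) / v.std(axis=0)
            fc[i] = (s_z[:, None] * v_z).mean(axis=0)   # Pearson r per voxel
        return fc

    def regress_scl(fc, scl, window):
        """Window-averaged SCL as regressor of sliding-window FC (one GLM per voxel)."""
        scl_win = np.array([scl[i:i + window].mean() for i in range(fc.shape[0])])
        X = np.c_[np.ones_like(scl_win), scl_win]
        beta, *_ = np.linalg.lstsq(X, fc, rcond=None)
        return beta[1]                                  # SCL slope for every voxel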

  17. ZEUS-2: a second generation submillimeter grating spectrometer for exploring distant galaxies

    NASA Astrophysics Data System (ADS)

    Ferkinhoff, Carl; Nikola, Thomas; Parshley, Stephen C.; Stacey, Gordon J.; Irwin, Kent D.; Cho, Hsiao-Mei; Halpern, Mark

    2010-07-01

    ZEUS-2, the second generation (z)Redshift and Early Universe Spectrometer, like its predecessor is a moderate resolution (R~1000) long-slit, echelle grating spectrometer optimized for the detection of faint, broad lines from distant galaxies. It is designed for studying star-formation across cosmic time. ZEUS-2 employs three TES bolometer arrays (555 pixels total) to deliver simultaneous, multi-beam spectra in up to 4 submillimeter windows. The NIST Boulder-built arrays operate at ~100mK and are readout via SQUID multiplexers and the Multi-Channel Electronics from the University of British Columbia. The instrument is cooled via a pulse-tube cooler and two-stage ADR. Various filter configurations give ZEUS-2 access to 7 different telluric windows from 200 to 850 micron enabling the simultaneous mapping of lines from extended sources or the simultaneous detection of the 158 micron [CII] line and the [NII] 122 or 205 micron lines from z = 1-2 galaxies. ZEUS-2 is designed for use on the CSO, APEX and possibly JCMT.

  18. Computed Tomography Window Blending: Feasibility in Thoracic Trauma.

    PubMed

    Mandell, Jacob C; Wortman, Jeremy R; Rocha, Tatiana C; Folio, Les R; Andriole, Katherine P; Khurana, Bharti

    2018-02-07

    This study aims to demonstrate the feasibility of processing computed tomography (CT) images with a custom window blending algorithm that combines soft-tissue, bone, and lung window settings into a single image; to compare the time for interpretation of chest CT for thoracic trauma with window blending and conventional window settings; and to assess diagnostic performance of both techniques. Adobe Photoshop was scripted to process axial DICOM images from retrospective contrast-enhanced chest CTs performed for trauma with a window-blending algorithm. Two emergency radiologists independently interpreted the axial images from 103 chest CTs with both blended and conventional windows. Interpretation time and diagnostic performance were compared with Wilcoxon signed-rank test and McNemar test, respectively. Agreement with Nexus CT Chest injury severity was assessed with the weighted kappa statistic. A total of 13,295 images were processed without error. Interpretation was faster with window blending, resulting in a 20.3% time saving (P < .001), with no difference in diagnostic performance, within the power of the study to detect a difference in sensitivity of 5% as determined by post hoc power analysis. The sensitivity of the window-blended cases was 82.7%, compared to 81.6% for conventional windows. The specificity of the window-blended cases was 93.1%, compared to 90.5% for conventional windows. All injuries of major clinical significance (per Nexus CT Chest criteria) were correctly identified in all reading sessions, and all negative cases were correctly classified. All readers demonstrated near-perfect agreement with injury severity classification with both window settings. In this pilot study utilizing retrospective data, window blending allows faster preliminary interpretation of axial chest CT performed for trauma, with no significant difference in diagnostic performance compared to conventional window settings. Future studies would be required to assess the utility of window blending in clinical practice. Copyright © 2018 The Association of University Radiologists. All rights reserved.
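
    A hedged sketch of the general idea, combining three display windows into one greyscale image, is shown below; the window width/level values and the equal-weight blend are illustrative assumptions, not the published Photoshop algorithm.

    import numpy as np

    def apply_window(hu, level, width):
        lo, hi = level - width / 2.0, level + width / 2.0
        return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)    # map HU to [0, 1] greyscale

    def blend_windows(hu, weights=(1 / 3, 1 / 3, 1 / 3)):
        soft = apply_window(hu, level=50, width=400)       # typical soft-tissue window
        bone = apply_window(hu, level=500, width=2000)     # typical bone window
        lung = apply_window(hu, level=-600, width=1500)    # typical lung window
        w_soft, w_bone, w_lung = weights
        return w_soft * soft + w_bone * bone + w_lung * lung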

  19. Dual Adaptive Filtering by Optimal Projection Applied to Filter Muscle Artifacts on EEG and Comparative Study

    PubMed Central

    Peyrodie, Laurent; Szurhaj, William; Bolo, Nicolas; Pinti, Antonio; Gallois, Philippe

    2014-01-01

    Muscle artifacts constitute one of the major problems in electroencephalogram (EEG) examinations, particularly for the diagnosis of epilepsy, where pathological rhythms occur within the same frequency bands as those of artifacts. This paper proposes to use the method dual adaptive filtering by optimal projection (DAFOP) to automatically remove artifacts while preserving true cerebral signals. DAFOP is a two-step method. The first step consists in applying the common spatial pattern (CSP) method to two frequency windows to identify the slowest components which will be considered as cerebral sources. The two frequency windows are defined by optimizing convolutional filters. The second step consists in using a regression method to reconstruct the signal independently within various frequency windows. This method was evaluated by two neurologists on a selection of 114 pages with muscle artifacts, from 20 clinical recordings of awake and sleeping adults, subject to pathological signals and epileptic seizures. A blind comparison was then conducted with the canonical correlation analysis (CCA) method and conventional low-pass filtering at 30 Hz. The filtering rate was 84.3% for muscle artifacts with a 6.4% reduction of cerebral signals even for the fastest waves. DAFOP was found to be significantly more efficient than CCA and 30 Hz filters. The DAFOP method is fast and automatic and can be easily used in clinical EEG recordings. PMID:25298967

  20. Performance of the round window soft coupler for the backward stimulation of the cochlea in a temporal bone model.

    PubMed

    Gostian, Antoniu-Oreste; Schwarz, David; Mandt, Philipp; Anagiotos, Andreas; Ortmann, Magdalene; Pazen, David; Beutner, Dirk; Hüttenbrink, Karl-Bernd

    2016-11-01

    The round window vibroplasty is a feasible option for the treatment of conductive, sensorineural and mixed hearing loss. Although clinical data suggest a satisfying clinical outcome with various coupling methods, the most efficient coupling technique of the floating mass transducer to the round window is still a matter of debate. For this, a soft silicone-made coupler has been developed recently that aims to ease and optimize the stimulation of the round window membrane of this middle ear implant. We performed a temporal bone study evaluating the performance of the soft coupler compared to the coupling with individually shaped cartilage, perichondrium and the titanium round window coupler with loads up to 20 mN at the unaltered and fully exposed round window niche. The stimulation of the cochlea was measured by the volume velocities of the stapes footplate detected by a laser Doppler vibrometer. The coupling method was computed as significant factor with cartilage and perichondrium allowing for the highest volume velocities followed by the soft and titanium coupler. Exposure of the round window niche allowed for higher volume velocities while the applied load did not significantly affect the results. The soft coupler allows for a good contact to the round window membrane and an effective backward stimulation of the cochlea. Clinical data are mandatory to evaluate performance of this novel coupling method in vivo.

  1. Optimized ECR plasma apparatus with varied microwave window thickness

    DOEpatents

    Berry, L.A.

    1995-11-14

    The present invention describes a technique to control the radial profile of microwave power in an ECR plasma discharge. In order to provide for a uniform plasma density to a specimen, uniform energy absorption by the plasma is desired. By controlling the radial profile of the microwave power transmitted through the microwave window of a reactor, the profile of the transmitted energy to the plasma can be controlled in order to have uniform energy absorption by the plasma. An advantage of controlling the profile using the window transmission characteristics is that variations to the radial profile of microwave power can be made without changing the microwave coupler or reactor design. 9 figs.

  2. HITEMP Material and Structural Optimization Technology Transfer

    NASA Technical Reports Server (NTRS)

    Collier, Craig S.; Arnold, Steve (Technical Monitor)

    2001-01-01

    The feasibility of adding viscoelasticity and the Generalized Method of Cells (GMC) for micromechanical viscoelastic behavior into the commercial HyperSizer structural analysis and optimization code was investigated. The viscoelasticity methodology was developed in four steps. First, a simplified algorithm was devised to test the iterative time stepping method for simple one-dimensional multiple ply structures. Second, GMC code was made into a callable subroutine and incorporated into the one-dimensional code to test the accuracy and usability of the code. Third, the viscoelastic time-stepping and iterative scheme was incorporated into HyperSizer for homogeneous, isotropic viscoelastic materials. Finally, the GMC was included in a version of HyperSizer. MS Windows executable files implementing each of these steps is delivered with this report, as well as source code. The findings of this research are that both viscoelasticity and GMC are feasible and valuable additions to HyperSizer and that the door is open for more advanced nonlinear capability, such as viscoplasticity.

  3. Using Markov Models of Fault Growth Physics and Environmental Stresses to Optimize Control Actions

    NASA Technical Reports Server (NTRS)

    Bole, Brian; Goebel, Kai; Vachtsevanos, George

    2012-01-01

    A generalized Markov chain representation of fault dynamics is presented for the case that available modeling of fault growth physics and future environmental stresses can be represented by two independent stochastic process models. A contrived but representatively challenging example will be presented and analyzed, in which uncertainty in the modeling of fault growth physics is represented by a uniformly distributed dice throwing process, and a discrete random walk is used to represent uncertain modeling of future exogenous loading demands to be placed on the system. A finite horizon dynamic programming algorithm is used to solve for an optimal control policy over a finite time window for the case that stochastic models representing physics of failure and future environmental stresses are known, and the states of both stochastic processes are observable by implemented control routines. The fundamental limitations of optimization performed in the presence of uncertain modeling information are examined by comparing the outcomes obtained from simulations of an optimizing control policy with the outcomes that would be achievable if all modeling uncertainties were removed from the system.
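
    As a generic illustration of the finite-horizon dynamic programming step described above (the transition matrices, stage costs and action set are placeholders, not the fault-growth or loading models from the paper), backward induction over a Markov chain can be written as follows.

    import numpy as np

    def backward_induction(P, cost, terminal_cost, horizon):
        """P[a]: (S, S) transition matrix for action a; cost[a]: (S,) stage cost."""
        V = terminal_cost.astype(float).copy()
        policy = np.zeros((horizon, len(V)), dtype=int)
        for t in reversed(range(horizon)):
            Q = np.stack([cost[a] + P[a] @ V for a in range(len(P))])  # (A, S)
            policy[t] = Q.argmin(axis=0)     # optimal action per state at time t
            V = Q.min(axis=0)                # cost-to-go
        return policy, V

    In the setting described above, the state would be the joint state of the two independent stochastic processes (fault growth and exogenous load), so each P[a] would be built from the product of their individual transition matrices.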

  4. A new adaptive multiple modelling approach for non-linear and non-stationary systems

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Gong, Yu; Hong, Xia

    2016-07-01

    This paper proposes a novel adaptive multiple modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models which are all linear. With data available in an online fashion, the performance of all candidate sub-models are monitored based on the most recent data window, and M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-model are adapted via the recursive least square (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error based on a recent data window, and apply the sum to one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever is the best. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
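
    The constrained combination step lends itself to a short sketch: the weights that minimise the mean square error over the recent data window, subject to summing to one, follow from a small KKT system. This is a minimal illustration under assumed inputs, not the authors' code.

    import numpy as np

    def combine_weights(P, y):
        """P: (n_window, M) sub-model predictions over the recent window, y: (n_window,) targets."""
        M = P.shape[1]
        G = P.T @ P
        KKT = np.block([[2 * G, np.ones((M, 1))],
                        [np.ones((1, M)), np.zeros((1, 1))]])
        rhs = np.r_[2 * P.T @ y, 1.0]
        sol = np.linalg.solve(KKT, rhs)
        return sol[:M]                       # combination weights, summing to one

    # multi-model output for a new vector of sub-model predictions p_new (length M):
    # y_hat = p_new @ combine_weights(P_window, y_window)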

  5. Fuzzy CMAC With incremental Bayesian Ying-Yang learning and dynamic rule construction.

    PubMed

    Nguyen, M N

    2010-04-01

    Inspired by the philosophy of ancient Chinese Taoism, Xu's Bayesian ying-yang (BYY) learning technique performs clustering by harmonizing the training data (yang) with the solution (ying). In our previous work, the BYY learning technique was applied to a fuzzy cerebellar model articulation controller (FCMAC) to find the optimal fuzzy sets; however, this is not suitable for time series data analysis. To address this problem, we propose an incremental BYY learning technique in this paper, with the idea of sliding window and rule structure dynamic algorithms. Three contributions are made as a result of this research. First, an online expectation-maximization algorithm incorporated with the sliding window is proposed for the fuzzification phase. Second, the memory requirement is greatly reduced since the entire data set no longer needs to be obtained during the prediction process. Third, the rule structure dynamic algorithm with dynamically initializing, recruiting, and pruning rules relieves the "curse of dimensionality" problem that is inherent in the FCMAC. Because of these features, the experimental results of the benchmark data sets of currency exchange rates and Mackey-Glass show that the proposed model is more suitable for real-time streaming data analysis.

  6. Changing ASD-ADHD symptom co-occurrence across the lifespan with adolescence as crucial time window: Illustrating the need to go beyond childhood.

    PubMed

    Hartman, Catharina A; Geurts, Hilde M; Franke, Barbara; Buitelaar, Jan K; Rommelse, Nanda N J

    2016-12-01

    Literature on the co-occurrence between Autism Spectrum Disorder (ASD) and Attention-Deficit/Hyperactivity Disorder (ADHD) is strongly biased by a focus on childhood. A review of the adolescent and adult literature on core and related symptoms of ADHD and ASD was conducted. In addition, an empirical approach was used, including 17,173 ASD-ADHD symptom ratings from participants aged 0 to 84 years. Results indicate that ASD/ADHD constellations peak during adolescence and are lower in early childhood and old age. We hypothesize that ASD and ADHD co-occur most on the border of the expected transition to independent adulthood, because that is when social adaptation and executive functioning (EF) skills matter most. Lower correlations in childhood and older age may be due to more diffuse symptoms reflecting, respectively, still-differentiating and de-differentiating EF functions. We plead for a strong research focus on adolescence, which may, after early childhood, be a second crucial time window for a catching-up pattern that explains more optimal outcomes. We discuss obstacles and opportunities of a full lifespan approach into old age. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Implementing and Bounding a Cascade Heuristic for Large-Scale Optimization

    DTIC Science & Technology

    2017-06-01

    solving the monolith, we develop a method for producing lower bounds to the optimal objective function value. To do this, we solve a new integer...as developing and analyzing methods for producing lower bounds to the optimal objective function value of the seminal problem monolith, which this...length of the window decreases, the end effects of the model typically increase (Zerr, 2016). There are four primary methods for correcting end

  8. Scheduling Aircraft Landings under Constrained Position Shifting

    NASA Technical Reports Server (NTRS)

    Balakrishnan, Hamsa; Chandran, Bala

    2006-01-01

    Optimal scheduling of airport runway operations can play an important role in improving the safety and efficiency of the National Airspace System (NAS). Methods that compute the optimal landing sequence and landing times of aircraft must accommodate practical issues that affect the implementation of the schedule. One such practical consideration, known as Constrained Position Shifting (CPS), is the restriction that each aircraft must land within a pre-specified number of positions of its place in the First-Come-First-Served (FCFS) sequence. We consider the problem of scheduling landings of aircraft in a CPS environment in order to maximize runway throughput (minimize the completion time of the landing sequence), subject to operational constraints such as FAA-specified minimum inter-arrival spacing restrictions, precedence relationships among aircraft that arise either from airline preferences or air traffic control procedures that prevent overtaking, and time windows (representing possible control actions) during which each aircraft landing can occur. We present a Dynamic Programming-based approach that scales linearly in the number of aircraft, and describe our computational experience with a prototype implementation on realistic data for Denver International Airport.

  9. In chronic myeloid leukemia patients on second-line tyrosine kinase inhibitor therapy, deep sequencing of BCR-ABL1 at the time of warning may allow sensitive detection of emerging drug-resistant mutants.

    PubMed

    Soverini, Simona; De Benedittis, Caterina; Castagnetti, Fausto; Gugliotta, Gabriele; Mancini, Manuela; Bavaro, Luana; Machova Polakova, Katerina; Linhartova, Jana; Iurlo, Alessandra; Russo, Domenico; Pane, Fabrizio; Saglio, Giuseppe; Rosti, Gianantonio; Cavo, Michele; Baccarani, Michele; Martinelli, Giovanni

    2016-08-02

    Imatinib-resistant chronic myeloid leukemia (CML) patients receiving second-line tyrosine kinase inhibitor (TKI) therapy with dasatinib or nilotinib have a higher risk of disease relapse and progression and not infrequently BCR-ABL1 kinase domain (KD) mutations are implicated in therapeutic failure. In this setting, earlier detection of emerging BCR-ABL1 KD mutations would offer greater chances of efficacy for subsequent salvage therapy and limit the biological consequences of full BCR-ABL1 kinase reactivation. Taking advantage of an already set up and validated next-generation deep amplicon sequencing (DS) assay, we aimed to assess whether DS may allow a larger window of detection of emerging BCR-ABL1 KD mutants predicting for an impending relapse. A total of 125 longitudinal samples from 51 CML patients who had acquired dasatinib- or nilotinib-resistant mutations during second-line therapy were analyzed by DS from the time of failure and mutation detection by conventional sequencing backwards. BCR-ABL1/ABL1%(IS) transcript levels were used to define whether the patient had 'optimal response', 'warning' or 'failure' at the time of first mutation detection by DS. DS was able to backtrack dasatinib- or nilotinib-resistant mutations to the previous sample(s) in 23/51 (45 %) patients. Median mutation burden at the time of first detection by DS was 5.5 % (range, 1.5-17.5 %); median interval between detection by DS and detection by conventional sequencing was 3 months (range, 1-9 months). In 5 cases, the mutations were detectable at baseline. In the remaining cases, response level at the time mutations were first detected by DS could be defined as 'Warning' (according to the 2013 ELN definitions of response to 2nd-line therapy) in 13 cases, as 'Optimal response' in one case, as 'Failure' in 4 cases. No dasatinib- or nilotinib-resistant mutations were detected by DS in 15 randomly selected patients with 'warning' at various timepoints, who later turned into optimal responders with no treatment changes. DS enables a larger window of detection of emerging BCR-ABL1 KD mutations predicting for an impending relapse. A 'Warning' response may represent a rational trigger, besides 'Failure', for DS-based mutation screening in CML patients undergoing second-line TKI therapy.

  10. X-ray characterization of a multichannel smart-pixel array detector.

    PubMed

    Ross, Steve; Haji-Sheikh, Michael; Huntington, Andrew; Kline, David; Lee, Adam; Li, Yuelin; Rhee, Jehyuk; Tarpley, Mary; Walko, Donald A; Westberg, Gregg; Williams, George; Zou, Haifeng; Landahl, Eric

    2016-01-01

    The Voxtel VX-798 is a prototype X-ray pixel array detector (PAD) featuring a silicon sensor photodiode array of 48 × 48 pixels, each 130 µm × 130 µm × 520 µm thick, coupled to a CMOS readout application specific integrated circuit (ASIC). The first synchrotron X-ray characterization of this detector is presented, and its ability to selectively count individual X-rays within two independent arrival time windows, a programmable energy range, and localized to a single pixel is demonstrated. During our first trial run at Argonne National Laboratory's Advanced Photon Source, the detector achieved a 60 ns gating time and 700 eV full width at half-maximum energy resolution in agreement with design parameters. Each pixel of the PAD holds two independent digital counters, and the discriminator for X-ray energy features both an upper and lower threshold to window the energy of interest, discarding unwanted background. This smart-pixel technology allows energy and time resolution to be set and optimized in software. It is found that the detector linearity follows an isolated dead-time model, implying that megahertz count rates should be possible in each pixel. Measurement of the line and point spread functions showed negligible spatial blurring. When combined with the timing structure of the synchrotron storage ring, it is demonstrated that the area detector can perform both picosecond time-resolved X-ray diffraction and fluorescence spectroscopy measurements.
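
    As a brief illustration of the isolated (non-paralysable) dead-time model mentioned above, the observed rate m relates to the true rate n by m = n / (1 + n*tau), which can be inverted to correct measured count rates; the dead time tau is an assumed calibration constant, not a value reported here.

    def observed_rate(true_rate, tau):
        """Isolated (non-paralysable) dead-time model: m = n / (1 + n*tau)."""
        return true_rate / (1.0 + true_rate * tau)

    def corrected_rate(measured_rate, tau):
        """Invert the model: n = m / (1 - m*tau), valid while m*tau < 1."""
        return measured_rate / (1.0 - measured_rate * tau)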

  11. Global Scale Simultaneous Retrieval of Smoothened Vegetation Optical Depth and Surface Roughness Parameter using AMSR-E X-band Observations

    NASA Astrophysics Data System (ADS)

    Lanka, Karthikeyan; Pan, Ming; Konings, Alexandra; Piles, María; D, Nagesh Kumar; Wood, Eric

    2017-04-01

    Traditionally, passive microwave retrieval algorithms such as the Land Parameter Retrieval Model (LPRM) estimate soil moisture and Vegetation Optical Depth (VOD) simultaneously using brightness temperature (Tb) data. The algorithm requires a surface roughness parameter which, despite its implications, is generally assumed to be constant at the global scale. Due to inherent noise in the satellite data and retrieval algorithm, the VOD retrievals are usually observed to fluctuate strongly at the daily scale, which may not occur in reality. Such noisy VOD retrievals, together with a spatially invariant roughness parameter, may affect the quality of soil moisture retrievals. The current work aims to smoothen the VOD retrievals (under the assumption that VOD remains constant over a period of time) and simultaneously generate, for the first time, a global surface roughness map using multiple descending X-band Tb observations of AMSR-E. The methodology utilizes Tb values under a moving-time-window setup to concurrently estimate the soil moisture of each day and a constant VOD within the window. Prior to this step, the surface roughness parameter is estimated using the complete time series of the Tb record. After carrying out the necessary sensitivity analysis, the smoothened VOD along with soil moisture retrievals is generated for the 10-year duration of AMSR-E (2002-2011) with a 7-day moving window using the LPRM framework. The spatial patterns of the resulting global VOD maps are coherent with vegetation biomass and climate conditions. The VOD results also exhibit a smoothening effect in terms of lower standard deviation values. This is also evident from time series comparisons, at several grid locations across the globe, between the smoothened VOD and the LPRM VOD retrievals obtained without optimization over moving windows. The global surface roughness map also exhibits spatial patterns that are strongly influenced by topography and land use conditions. Noticeable features include high roughness over mountainous regions and heavily vegetated tropical rainforests, low roughness in desert areas and moderate roughness values over higher latitudes. The new datasets of VOD and surface roughness can help improve the quality of soil moisture retrievals. Moreover, the proposed methodology is generic in nature and can be applied to the currently operating AMSR2, SMOS, and SMAP soil moisture missions.

  12. Studies on the Parametric Effects of Plasma Arc Welding of 2205 Duplex Stainless Steel

    NASA Astrophysics Data System (ADS)

    Selva Bharathi, R.; Siva Shanmugam, N.; Murali Kannan, R.; Arungalai Vendan, S.

    2018-03-01

    This research study attempts to create an optimized parametric window by employing the Taguchi method for Plasma Arc Welding (PAW) of 2 mm thick 2205 duplex stainless steel. The parameters considered for experimentation and optimization are the welding current, welding speed and pilot arc length. The experiments vary these parameters and record the resulting depth of penetration and bead width. The welding current is varied between 60 and 70 A, the welding speed between 250 and 300 mm/min, and the pilot arc length between 1 and 2 mm. Design of experiments is used for the experimental trials. A back-propagation neural network, a genetic algorithm and Taguchi techniques are used to predict the bead width and depth of penetration, and the predictions are validated against the experimentally achieved results, with which they are in good agreement. Additionally, micro-structural characterizations are carried out to examine the weld quality. The extrapolation of these optimized parametric values yields enhanced weld strength with cost and time reduction.

  13. Optimal deployment of resources for maximizing impact in spreading processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lokhov, Andrey Y.; Saad, David

    The effective use of limited resources for controlling spreading processes on networks is of prime significance in diverse contexts, ranging from the identification of “influential spreaders” for maximizing information dissemination and targeted interventions in regulatory networks, to the development of mitigation policies for infectious diseases and financial contagion in economic systems. Solutions for these optimization tasks that are based purely on topological arguments are not fully satisfactory; in realistic settings, the problem is often characterized by heterogeneous interactions and requires interventions in a dynamic fashion over a finite time window via a restricted set of controllable nodes. The optimal distribution of available resources hence results from an interplay between network topology and spreading dynamics. Here, we show how these problems can be addressed as particular instances of a universal analytical framework based on a scalable dynamic message-passing approach and demonstrate the efficacy of the method on a variety of real-world examples.

  14. Optimal deployment of resources for maximizing impact in spreading processes

    DOE PAGES

    Lokhov, Andrey Y.; Saad, David

    2017-09-12

    The effective use of limited resources for controlling spreading processes on networks is of prime significance in diverse contexts, ranging from the identification of “influential spreaders” for maximizing information dissemination and targeted interventions in regulatory networks, to the development of mitigation policies for infectious diseases and financial contagion in economic systems. Solutions for these optimization tasks that are based purely on topological arguments are not fully satisfactory; in realistic settings, the problem is often characterized by heterogeneous interactions and requires interventions in a dynamic fashion over a finite time window via a restricted set of controllable nodes. The optimal distribution of available resources hence results from an interplay between network topology and spreading dynamics. Here, we show how these problems can be addressed as particular instances of a universal analytical framework based on a scalable dynamic message-passing approach and demonstrate the efficacy of the method on a variety of real-world examples.

  15. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    PubMed

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method at an ICU it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.
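
    A minimal sketch of the parameter optimization loop is given below: a coherence-based stand-in for the SCP index is computed for each patient under every candidate combination of window length, overlap and coherence threshold, and the parameter set whose index correlates most strongly with the GOS is kept. The index definition, the frequency band and the parameter grid are assumptions and do not reproduce the exact selected correlation analysis.

      import numpy as np
      from itertools import product
      from scipy.signal import coherence
      from scipy.stats import spearmanr

      def scp_index(abp, icp, fs, win_s, overlap, coh_thresh):
          """Fraction of windows with high ABP-ICP coherence in a slow-wave band
          (a simplified stand-in for the selected correlation positive index)."""
          nper = int(win_s * fs)
          step = max(1, int(nper * (1.0 - overlap)))
          hits, total = 0, 0
          for start in range(0, len(abp) - nper + 1, step):
              f, cxy = coherence(abp[start:start + nper], icp[start:start + nper],
                                 fs=fs, nperseg=min(256, nper))
              band = (f > 0.02) & (f < 0.07)                 # slow-wave band, illustrative
              hits += cxy[band].mean() > coh_thresh
              total += 1
          return hits / max(total, 1)

      def optimize_parameters(signals, gos, fs=1.0):
          """Grid-search windowing parameters to maximize |Spearman rho| of SCP vs. outcome."""
          best = None
          for win_s, overlap, thr in product([600, 1200, 1800], [0.0, 0.5], [0.4, 0.5, 0.6]):
              scp = [scp_index(abp, icp, fs, win_s, overlap, thr) for abp, icp in signals]
              rho, _ = spearmanr(scp, gos)
              if np.isnan(rho):
                  continue
              if best is None or abs(rho) > abs(best[0]):
                  best = (rho, (win_s, overlap, thr))
          return best

      # toy data: six 'patients' with 3 h of 1 Hz ABP/ICP and made-up GOS scores
      rng = np.random.default_rng(1)
      signals = [(rng.normal(80, 5, 10800), rng.normal(12, 2, 10800)) for _ in range(6)]
      gos = [5, 4, 3, 1, 2, 5]
      print(optimize_parameters(signals, gos))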

  16. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology

    PubMed Central

    Faltermeier, Rupert; Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method at an ICU it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses. PMID:26693250

  17. Study of controlled-release floating tablets of dipyridamole using the dry-coated method.

    PubMed

    Chen, Kai; Wen, Haoyang; Yang, Feifei; Yu, Yibin; Gai, Xiumei; Wang, Haiying; Li, Pingfei; Pan, Weisan; Yang, Xinggang

    2018-01-01

    Dipyridamole (DIP), which has a short biological half-life, has a narrow absorption window and is primarily absorbed in the stomach. Therefore, the purpose of this study was to prepare controlled-release floating (CRF) tablets of dipyridamole by the dry-coated method. The influence on drug release of agents with different viscosities, namely hydroxypropylmethylcellulose (HPMC) and polyvinylpyrrolidone K30 (PVP K30) in the core tablet and low-viscosity HPMC and PVP K30 in the coating layer, was investigated. Then, a study with a three-factor, three-level orthogonal experimental design was used to optimize the formulation of the CRF tablets. After data processing, the optimized formulation was found to be: 80 mg HPMC K4M in the core tablet, 80 mg HPMC E15 in the core tablet and 40 mg PVP K30 in the coating layer. Moreover, an in vitro buoyancy study showed that the optimized formulation had excellent floating ability and could float immediately without a lag time, remaining buoyant for more than 12 h. Furthermore, an in vivo gamma scintigraphic study showed that the gastric residence time of the CRF tablet was about 8 h.

  18. Mass Spectrometry Parameters Optimization for the 46 Multiclass Pesticides Determination in Strawberries with Gas Chromatography Ion-Trap Tandem Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Fernandes, Virgínia C.; Vera, Jose L.; Domingues, Valentina F.; Silva, Luís M. S.; Mateus, Nuno; Delerue-Matos, Cristina

    2012-12-01

    A multiclass analysis method was optimized in order to analyze pesticide traces by gas chromatography with ion-trap and tandem mass spectrometry (GC-MS/MS). The influence of some analytical parameters on pesticide signal response was explored. Five ion trap mass spectrometry (IT-MS) operating parameters, including isolation time (IT), excitation voltage (EV), excitation time (ET), maximum excitation energy or "q" value (q), and isolation mass window (IMW), were numerically tested in order to maximize the instrument analytical signal response. For this, multiple linear regression was used in data analysis to evaluate the influence of the five parameters on the analytical response in the ion trap mass spectrometer and to predict its response. The assessment of the five parameters based on the regression equations substantially increased the sensitivity of IT-MS/MS in the MS/MS mode. The results obtained show that for most of the pesticides these parameters have a strong influence on both signal response and detection limit. Using the optimized method, a multiclass pesticide analysis was performed for 46 pesticides in a strawberry matrix. Levels higher than the limit established for strawberries by the European Union were found in some samples.
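
    The regression step can be sketched as an ordinary least-squares fit of the signal response on the five IT-MS parameters, followed by prediction over a candidate grid; the parameter ranges and response values below are synthetic placeholders, not the paper's measurements.

      import numpy as np

      # Columns: isolation time, excitation voltage, excitation time, q value, isolation
      # mass window; the values and responses below are synthetic placeholders.
      rng = np.random.default_rng(0)
      X = rng.uniform([2, 0.2, 10, 0.2, 1.0], [12, 1.2, 40, 0.45, 5.0], size=(30, 5))
      true_coef = np.array([300.0, 8000.0, 150.0, 20000.0, -900.0])
      y = 5.0e4 + X @ true_coef + rng.normal(0, 2000.0, 30)     # peak-area response

      # Ordinary least squares with an intercept term.
      A = np.column_stack([np.ones(len(X)), X])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("intercept and coefficients:", np.round(coef, 1))

      # Predict the response over a candidate grid and pick the most promising setting.
      candidates = rng.uniform([2, 0.2, 10, 0.2, 1.0], [12, 1.2, 40, 0.45, 5.0], size=(200, 5))
      pred = np.column_stack([np.ones(len(candidates)), candidates]) @ coef
      print("parameter set with highest predicted response:", np.round(candidates[np.argmax(pred)], 2))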

  19. Letter-sound processing deficits in children with developmental dyslexia: An ERP study.

    PubMed

    Moll, Kristina; Hasko, Sandra; Groth, Katharina; Bartling, Jürgen; Schulte-Körne, Gerd

    2016-04-01

    The time course during letter-sound processing was investigated in children with developmental dyslexia (DD) and typically developing (TD) children using electroencephalography. Thirty-eight children with DD and 25 TD children participated in a visual-auditory oddball paradigm. Event-related potentials (ERPs) elicited by standard and deviant stimuli in an early (100-190 ms) and late (560-750 ms) time window were analysed. In the early time window, ERPs elicited by the deviant stimulus were delayed and less left lateralized over fronto-temporal electrodes for children with DD compared to TD children. In the late time window, children with DD showed higher amplitudes extending more over right frontal electrodes. Longer latencies in the early time window and stronger right hemispheric activation in the late time window were associated with slower reading and naming speed. Additionally, stronger right hemispheric activation in the late time window correlated with poorer phonological awareness skills. Deficits in early stages of letter-sound processing influence later more explicit cognitive processes during letter-sound processing. Identifying the neurophysiological correlates of letter-sound processing and their relation to reading related skills provides insight into the degree of automaticity during letter-sound processing beyond behavioural measures of letter-sound-knowledge. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
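
    For readers unfamiliar with ERP time-window analysis, the sketch below extracts the mean amplitude and peak latency of a difference wave within the two reported windows (100-190 ms and 560-750 ms); the sampling rate, waveform and single-channel handling are assumptions for illustration only.

      import numpy as np

      def window_measures(erp, times, t_start, t_stop):
          """Mean amplitude and peak latency of an ERP waveform within [t_start, t_stop] (s)."""
          mask = (times >= t_start) & (times <= t_stop)
          mean_amp = erp[mask].mean()
          peak_latency = times[mask][np.argmax(np.abs(erp[mask]))]
          return mean_amp, peak_latency

      # synthetic single-channel deviant-minus-standard difference wave, 500 Hz sampling
      fs = 500.0
      times = np.arange(-0.1, 0.8, 1.0 / fs)
      rng = np.random.default_rng(2)
      erp = 2.0 * np.exp(-((times - 0.15) / 0.03) ** 2) - 1.5 * np.exp(-((times - 0.65) / 0.08) ** 2)
      erp += rng.normal(0, 0.1, times.size)

      early = window_measures(erp, times, 0.100, 0.190)   # early window reported in the study
      late = window_measures(erp, times, 0.560, 0.750)    # late window reported in the study
      print("early window: mean amplitude %.2f a.u., peak latency %.0f ms" % (early[0], early[1] * 1000))
      print("late window:  mean amplitude %.2f a.u., peak latency %.0f ms" % (late[0], late[1] * 1000))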

  20. WE-AB-209-10: Optimizing the Delivery of Sequential Fluence Maps for Efficient VMAT Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, D; Balvert, M

    2016-06-15

    Purpose: To develop an optimization model and solution approach for computing MLC leaf trajectories and dose rates for high quality matching of a set of optimized fluence maps to be delivered sequentially around a patient in a VMAT treatment. Methods: We formulate the fluence map matching problem as a nonlinear optimization problem where time is discretized but dose rates and leaf positions are continuous variables. For a given allotted time, which is allocated across the fluence maps based on the complexity of each fluence map, the optimization problem searches for the best leaf trajectories and dose rates such that the original fluence maps are closely recreated. Constraints include maximum leaf speed, maximum dose rate, and leaf collision avoidance, as well as the constraint that the ending leaf positions for one map are the starting leaf positions for the next map. The resulting model is non-convex but smooth, and therefore we solve it by local searches from a variety of starting positions. We improve solution time by a custom decomposition approach which allows us to decouple the rows of the fluence maps and solve each leaf pair individually. This decomposition also makes the problem easily parallelized. Results: We demonstrate the method on a prostate case and a head-and-neck case and show that one can recreate fluence maps to a high degree of fidelity in modest total delivery time (minutes). Conclusion: We present a VMAT sequencing method that reproduces optimal fluence maps by searching over a vast number of possible leaf trajectories. By varying the total allotted time given, this approach is the first of its kind to allow users to produce VMAT solutions that span the range of wide-field coarse VMAT deliveries to narrow-field high-MU sliding window-like approaches.
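
    A greatly simplified, single-leaf-pair version of the matching problem can be sketched as follows: leaf positions and dose rates at discrete time steps are optimized so that the delivered fluence reproduces a target 1-D profile, with leaf speed and collision constraints handled as penalties. The smooth aperture model, penalty weights and target profile are assumptions; the paper's formulation, decomposition and solver differ from this toy version.

      import numpy as np
      from scipy.optimize import minimize

      # Target 1-D fluence profile for a single leaf pair (arbitrary units); illustrative only.
      x = np.linspace(0.0, 10.0, 60)                      # positions across the row (cm)
      target = np.exp(-0.5 * ((x - 4.0) / 1.2) ** 2) + 0.6 * np.exp(-0.5 * ((x - 7.0) / 0.8) ** 2)

      T, dt = 20, 1.0            # time steps and seconds per step (allotted time = T*dt)
      v_max, r_max = 2.0, 0.15   # max leaf speed (cm/s) and max dose rate (a.u./s)

      def unpack(p):
          return p[:T], p[T:2 * T], p[2 * T:]             # left positions, right positions, dose rates

      def delivered(p, sharpness=8.0):
          left, right, rate = unpack(p)
          # smooth aperture: ~1 between the leaves, ~0 elsewhere (keeps the objective smooth)
          open_ = (1.0 / (1.0 + np.exp(-sharpness * (x[None, :] - left[:, None])))
                   * 1.0 / (1.0 + np.exp(-sharpness * (right[:, None] - x[None, :]))))
          return (rate[:, None] * dt * open_).sum(axis=0)

      def objective(p):
          left, right, _ = unpack(p)
          fit = np.sum((delivered(p) - target) ** 2)
          speed = np.sum(np.maximum(np.abs(np.diff(left)) / dt - v_max, 0.0) ** 2) \
                + np.sum(np.maximum(np.abs(np.diff(right)) / dt - v_max, 0.0) ** 2)
          collide = np.sum(np.maximum(left - right, 0.0) ** 2)
          return fit + 50.0 * speed + 50.0 * collide      # penalties stand in for hard constraints

      # initial guess: both leaves sweep left to right at constant speed, constant dose rate
      sweep = np.linspace(0.0, 10.0, T)
      p0 = np.concatenate([sweep - 1.0, sweep + 1.0, np.full(T, 0.5 * r_max)])
      bounds = [(-1.0, 11.0)] * (2 * T) + [(0.0, r_max)] * T
      res = minimize(objective, p0, method="L-BFGS-B", bounds=bounds)
      print("residual fluence error:", round(float(np.sum((delivered(res.x) - target) ** 2)), 3))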

  1. Lagged kernel machine regression for identifying time windows of susceptibility to exposures of complex mixtures.

    PubMed

    Liu, Shelley H; Bobb, Jennifer F; Lee, Kyu Ha; Gennings, Chris; Claus Henn, Birgit; Bellinger, David; Austin, Christine; Schnaas, Lourdes; Tellez-Rojo, Martha M; Hu, Howard; Wright, Robert O; Arora, Manish; Coull, Brent A

    2018-07-01

    The impact of neurotoxic chemical mixtures on children's health is a critical public health concern. It is well known that during early life, toxic exposures may impact cognitive function during critical time intervals of increased vulnerability, known as windows of susceptibility. Knowledge on time windows of susceptibility can help inform treatment and prevention strategies, as chemical mixtures may affect a developmental process that is operating at a specific life phase. There are several statistical challenges in estimating the health effects of time-varying exposures to multi-pollutant mixtures, such as: multi-collinearity among the exposures both within time points and across time points, and complex exposure-response relationships. To address these concerns, we develop a flexible statistical method, called lagged kernel machine regression (LKMR). LKMR identifies critical exposure windows of chemical mixtures, and accounts for complex non-linear and non-additive effects of the mixture at any given exposure window. Specifically, LKMR estimates how the effects of a mixture of exposures change with the exposure time window using a Bayesian formulation of a grouped, fused lasso penalty within a kernel machine regression (KMR) framework. A simulation study demonstrates the performance of LKMR under realistic exposure-response scenarios, and demonstrates large gains over approaches that consider each time window separately, particularly when serial correlation among the time-varying exposures is high. Furthermore, LKMR demonstrates gains over another approach that inputs all time-specific chemical concentrations together into a single KMR. We apply LKMR to estimate associations between neurodevelopment and metal mixtures in Early Life Exposures in Mexico and Neurotoxicology, a prospective cohort study of child health in Mexico City.

  2. Optimization of the front contact to minimize short-circuit current losses in CdTe thin-film solar cells

    NASA Astrophysics Data System (ADS)

    Kephart, Jason Michael

    With a growing population and rising standard of living, the world is in need of clean sources of energy at low cost in order to meet both economic and environmental needs. Solar energy is an abundant resource which is fundamentally adequate to meet all human energy needs. Photovoltaics are an attractive way to safely convert this energy to electricity with little to no noise, moving parts, water, or arable land. Currently, thin-film photovoltaic modules based on cadmium telluride are a low-cost solution with multiple GW/year commercial production, but have lower conversion efficiency than the dominant technology, crystalline silicon. Increasing the conversion efficiency of these panels through optimization of the electronic and optical structure of the cell can further lower the cost of these modules. The front contact of the CdTe thin-film solar cell is critical to device efficiency for three important reasons: it must transmit light to the CdTe absorber to be collected, it must form a reasonably passive interface and serve as a growth template for the CdTe, and it must allow electrons to be extracted from the CdTe. The current standard window layer material, cadmium sulfide, has a low bandgap of 2.4 eV which can block over 20% of available light from being converted to mobile charge carriers. Reducing the thickness of this layer or replacing it with a higher-bandgap material can provide a commensurate increase in device efficiency. When the CdS window is made thinner, a degradation in electronic quality of the device is observed with a reduction in open-circuit voltage and fill factor. One commonly used method to enable a thinner optimum CdS thickness is a high-resistance transparent (HRT) layer between the transparent conducting oxide electrode and window layer. The function of this layer has not been fully explained in the literature, and existing hypotheses center on the existence of pinholes in the window layer which are not consistent with observed results. In this work numerous HRT layers were examined beginning with an empirical optimization to create a SnO2-based HRT which allows significantly reduced CdS thickness while maintaining diode quality. The role of this layer was explored through measurement of band alignment parameters via photoemission. These results suggest a negative correlation of work function to device open-circuit voltage, which implies that non-ideal band alignment at the front interface of CdTe is in large part responsible for the loss of electronic quality. Several scenarios explored through 1-dimensional modeling in the SCAPS program corroborate this theory. A sputter-deposited (Mg,Zn)O layer was tested which allows for complete elimination of the CdS window layer with an increase in open-circuit voltage and near complete transmission of all above-bandgap light. An additional window layer material---sputtered, oxygenated CdS---was explored for its transparency. This material was found only to produce high efficiency devices with an effective buffer layer such as the optimized SnO2-base HRT. The dependence of chemical, optical, electrical, and device properties on oxygen content was explored, and the stability of these devices was determined to depend largely on the minimization of copper in the device. Both sputter-deposited alloy window layers appeared to have tunable electron affinity which was critical to optimizing band alignment and therefore device efficiency. 
    Both window layers allowed an AM1.5G efficiency increase from a baseline of approximately 13% to 16%.

  3. Epoch of reionization window. II. Statistical methods for foreground wedge reduction

    NASA Astrophysics Data System (ADS)

    Liu, Adrian; Parsons, Aaron R.; Trott, Cathryn M.

    2014-07-01

    For there to be a successful measurement of the 21 cm epoch of reionization (EoR) power spectrum, it is crucial that strong foreground contaminants be robustly suppressed. These foregrounds come from a variety of sources (such as Galactic synchrotron emission and extragalactic point sources), but almost all share the property of being spectrally smooth and, when viewed through the chromatic response of an interferometer, occupy a signature "wedge" region in cylindrical (k⊥, k∥) Fourier space. The complement of the foreground wedge is termed the "EoR window" and is expected to be mostly foreground-free, allowing clean measurements of the power spectrum. This paper is a sequel to a previous paper that established a rigorous mathematical framework for describing the foreground wedge and the EoR window. Here, we use our framework to explore statistical methods by which the EoR window can be enlarged, thereby increasing the sensitivity of a power spectrum measurement. We adapt the Feldman-Kaiser-Peacock approximation (commonly used in galaxy surveys) for 21 cm cosmology and also compare the optimal quadratic estimator to simpler estimators that ignore covariances between different Fourier modes. The optimal quadratic estimator is found to suppress foregrounds by an extra factor of ~10^5 in power at the peripheries of the EoR window, boosting the detection of the cosmological signal from 12σ to 50σ at the midpoint of reionization in our fiducial models. If numerical issues can be finessed, decorrelation techniques allow the EoR window to be further enlarged, enabling measurements to be made deep within the foreground wedge. These techniques do not assume that the foregrounds are Gaussian distributed, and we additionally prove that a final round of foreground subtraction can be performed after decorrelation in a way that is guaranteed to have no cosmological signal loss.

  4. Theory of Arachnid Prey Localization

    NASA Astrophysics Data System (ADS)

    Stürzl, W.; Kempter, R.; van Hemmen, J. L.

    2000-06-01

    Sand scorpions and many other arachnids locate their prey through highly sensitive slit sensilla at the tips (tarsi) of their eight legs. This sensor array responds to vibrations with stimulus-locked action potentials encoding the target direction. We present a neuronal model to account for stimulus angle determination using a population of second-order neurons, each receiving excitatory input from one tarsus and inhibition from a triad opposite to it. The input opens a time window whose width determines a neuron's firing probability. Stochastic optimization is realized through tuning the balance between excitation and inhibition. The agreement with experiments on the sand scorpion is excellent.

  5. The roles of the trading time risks on stock investment return and risks in stock price crashes

    NASA Astrophysics Data System (ADS)

    Li, Jiang-Cheng; Dong, Zhi-Wei; Yang, Guo-Hui; Long, Chao

    2017-03-01

    The roles of the trading time risks (TTRs) on stock investment return and risks are investigated under the condition of stock price crashes, with Hushen300 (CSI300) and Dow Jones Industrial Average (^DJI) data, respectively. In order to describe the TTR, we employ the escape time, that is, the time taken by the stock price to drop from its maximum to its minimum value within a data window length (DWL). After theoretical and empirical research on the probability density function of return, the results for both ^DJI and CSI300 indicate that: (i) as DWL increases, the expectation of returns and its stability are weakened; (ii) an optimal TTR is related to a maximum return and minimum risk of stock investment in stock price crashes.
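
    The escape-time definition used above can be computed directly from a price series, as in the sketch below; the synthetic price path and window length are placeholders for illustration.

      import numpy as np

      def escape_time(prices):
          """Trading-time risk proxy: number of steps between the maximum price in the
          window and the subsequent minimum (the crash duration inside the window)."""
          prices = np.asarray(prices, dtype=float)
          i_max = int(np.argmax(prices))
          i_min = i_max + int(np.argmin(prices[i_max:]))
          return i_min - i_max

      def rolling_escape_times(prices, dwl):
          """Escape time for each data window of length dwl sliding over the series."""
          return np.array([escape_time(prices[i:i + dwl])
                           for i in range(len(prices) - dwl + 1)])

      # synthetic price path with a crash in the middle (illustrative only)
      rng = np.random.default_rng(3)
      p = np.cumsum(rng.normal(0.0005, 0.01, 500)) + 1.0
      p[250:300] -= np.linspace(0.0, 0.15, 50)            # superimposed crash
      print("median escape time, DWL=120:", int(np.median(rolling_escape_times(p, 120))))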

  6. Multimodal correlation of dynamic [18F]-AV-1451 perfusion PET and neuronal hypometabolism in [18F]-FDG PET.

    PubMed

    Hammes, Jochen; Leuwer, Isabel; Bischof, Gérard N; Drzezga, Alexander; van Eimeren, Thilo

    2017-12-01

    Cerebral glucose metabolism measured with [18F]-FDG PET is a well established marker of neuronal dysfunction in neurodegeneration. The tau-protein tracer [18F]-AV-1451 PET is currently under evaluation and shows promising results. Here, we assess the feasibility of early perfusion imaging with AV-1451 as a substitute for FDG PET in assessing neuronal injury. Twenty patients with suspected neurodegeneration underwent FDG and early phase AV-1451 PET imaging. Ten one-minute timeframes were acquired after application of 200 MBq AV-1451. FDG images were acquired on a different date according to clinical protocol. Early AV-1451 timeframes were coregistered to individual FDG scans and spatially normalized. Voxel-wise intermodal correlations were calculated on a within-subject level for every possible time window. The window with the highest pooled correlation was considered optimal. Z-transformed deviation maps (ZMs) were created from both FDG and early AV-1451 images, comparing against FDG images of healthy controls. Regional patterns and extent of perfusion deficits were highly comparable to metabolic deficits. Best results were observed in a time window from 60 to 360 s (r = 0.86). Correlation strength ranged from r = 0.96 (subcortical gray matter) to 0.83 (frontal lobe) in regional analysis. ZMs of early AV-1451 and FDG images were highly similar. Perfusion imaging with AV-1451 is a valid biomarker for assessment of neuronal dysfunction in neurodegenerative diseases. Radiation exposure and complexity of the diagnostic workup could be reduced significantly by routine acquisition of early AV-1451 images, sparing additional FDG PET.
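
    The window-selection step can be sketched as follows: every candidate span of early frames is averaged and correlated voxel-wise with the FDG image, and the span with the highest correlation is kept. The toy arrays below stand in for coregistered, normalized images of a single subject; in the study the correlation was pooled across subjects.

      import numpy as np
      from itertools import combinations

      def best_perfusion_window(early_frames, fdg, frame_len_s=60):
          """early_frames: (n_frames, n_voxels) frame means for one subject;
          fdg: (n_voxels,) FDG uptake for the same subject (already coregistered)."""
          n_frames = early_frames.shape[0]
          best = (-np.inf, None)
          for start, stop in combinations(range(n_frames + 1), 2):   # candidate frame windows
              perf = early_frames[start:stop].mean(axis=0)
              r = np.corrcoef(perf, fdg)[0, 1]
              if r > best[0]:
                  best = (r, (start * frame_len_s, stop * frame_len_s))   # window in seconds
          return best

      # toy example: 10 one-minute frames, 5000 'voxels'
      rng = np.random.default_rng(4)
      fdg = rng.normal(5.0, 1.0, 5000)
      frames = np.array([fdg * w + rng.normal(0, 1.0, 5000)
                         for w in [0.2, 0.6, 0.9, 1.0, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5]])
      r, window = best_perfusion_window(frames, fdg)
      print(f"best window {window[0]}-{window[1]} s, r = {r:.2f}")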

  7. Optimizing the maximum reported cluster size in the spatial scan statistic for ordinal data.

    PubMed

    Kim, Sehwi; Jung, Inkyung

    2017-01-01

    The spatial scan statistic is an important tool for spatial cluster detection. There have been numerous studies on scanning window shapes. However, little research has been done on the maximum scanning window size or maximum reported cluster size. Recently, Han et al. proposed to use the Gini coefficient to optimize the maximum reported cluster size. However, the method has been developed and evaluated only for the Poisson model. We adopt the Gini coefficient to be applicable to the spatial scan statistic for ordinal data to determine the optimal maximum reported cluster size. Through a simulation study and application to a real data example, we evaluate the performance of the proposed approach. With some sophisticated modification, the Gini coefficient can be effectively employed for the ordinal model. The Gini coefficient most often picked the optimal maximum reported cluster sizes that were the same as or smaller than the true cluster sizes with very high accuracy. It seems that we can obtain a more refined collection of clusters by using the Gini coefficient. The Gini coefficient developed specifically for the ordinal model can be useful for optimizing the maximum reported cluster size for ordinal data and helpful for properly and informatively discovering cluster patterns.
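
    A hedged sketch of the selection mechanics is shown below: a Gini coefficient is computed for the collection of clusters reported under each candidate maximum size, and the size that maximizes the coefficient is kept. The generic Gini function operates on whatever per-cluster summary the Han et al. construction prescribes; the input values here are placeholders, and the exact Lorenz-curve construction of the original method is not reproduced.

      import numpy as np

      def gini(values):
          """Gini coefficient of a set of non-negative per-cluster summary values."""
          v = np.sort(np.asarray(values, dtype=float))
          n = v.size
          if n == 0 or v.sum() == 0.0:
              return 0.0
          cum = np.cumsum(v)
          return (n + 1 - 2.0 * np.sum(cum) / cum[-1]) / n

      def choose_max_reported_size(reported):
          """reported: dict mapping each candidate maximum reported cluster size to the list
          of per-cluster summary values returned by the scan statistic under that setting."""
          scores = {size: gini(vals) for size, vals in reported.items()}
          return max(scores, key=scores.get), scores

      # placeholder output of scan-statistic runs under three candidate maximum sizes
      reported = {0.50: [3.1, 1.2, 1.1], 0.25: [3.4, 2.8, 1.3], 0.10: [4.0, 3.8, 3.6]}
      best_size, scores = choose_max_reported_size(reported)
      print({k: round(v, 3) for k, v in scores.items()}, "-> selected size:", best_size)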

  8. Optimizing the maximum reported cluster size in the spatial scan statistic for ordinal data

    PubMed Central

    Kim, Sehwi

    2017-01-01

    The spatial scan statistic is an important tool for spatial cluster detection. There have been numerous studies on scanning window shapes. However, little research has been done on the maximum scanning window size or maximum reported cluster size. Recently, Han et al. proposed to use the Gini coefficient to optimize the maximum reported cluster size. However, the method has been developed and evaluated only for the Poisson model. We adopt the Gini coefficient to be applicable to the spatial scan statistic for ordinal data to determine the optimal maximum reported cluster size. Through a simulation study and application to a real data example, we evaluate the performance of the proposed approach. With some sophisticated modification, the Gini coefficient can be effectively employed for the ordinal model. The Gini coefficient most often picked the optimal maximum reported cluster sizes that were the same as or smaller than the true cluster sizes with very high accuracy. It seems that we can obtain a more refined collection of clusters by using the Gini coefficient. The Gini coefficient developed specifically for the ordinal model can be useful for optimizing the maximum reported cluster size for ordinal data and helpful for properly and informatively discovering cluster patterns. PMID:28753674

  9. Design Optimization Toolkit: Users' Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods that are suited for gradient/nongradient-based optimization, large scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  10. Novel Approaches for Visualizing and Analyzing Dose-Timing Data from Electronic Drug Monitors, or "How the 'Broken Window' Theory Pertains to ART Adherence".

    PubMed

    Gill, Christopher J; DeSilva, Mary Bachman; Hamer, Davidson H; Keyi, Xu; Wilson, Ira B; Sabin, Lora

    2015-11-01

    Adherence to antiretroviral medications is usually expressed in terms of the proportion of doses taken. However, the timing of doses taken may also be an important dimension of overall adherence. Little is known about whether patients who mistime doses are also more likely to skip doses. Using data from the completed Adherence for Life randomized controlled trial, we created visual and statistical models to capture and analyze dose timing data collected longitudinally with electronic drug monitors (EDM). From scatter plots depicting dose time versus calendar date, we identified dominant patterns of dose taking and calculated key features [slope of line over calendar date; residual mean standard error (RMSE)]. Each was assessed for its ability to categorize subjects with 'sub-optimal' adherence (<95 % of doses taken) using area under the receiver operating characteristic (AROC) curve analysis. Sixty eight subjects contributed EDM data, with ~300 to 400 observations/subject. While regression line slopes did not predict 'sub-optimal' adherence (AROC 0.51, 95 % CI 0.26-0.75), the variability in dose timing (RMSE) was strongly predictive (AROC 0.79, 95 % CI 0.62-0.97). Compared with the lowest quartile of RMSE (minimal dose time variability), each successive quartile roughly doubled the odds of 'sub-optimal' adherence (OR 2.1, 95 % CI 1.3-3.4). Patterns of dose timing and mistiming are strongly related to overall adherence behavior. Notably, individuals who skip doses are more likely to mistime doses, with the degree of risk positively correlated with the extent of dose timing variability.
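
    The two features and their discriminative power can be reproduced in a small sketch: per-subject slope and residual scatter of dose hour versus calendar day are computed by linear regression and then scored with the area under the ROC curve against a binary sub-optimal adherence label. The cohort below is synthetic, and the link between mistiming and skipping is imposed by construction, purely for illustration.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.metrics import roc_auc_score

      def dose_timing_features(days, dose_hours):
          """Slope (hours/day) and residual scatter (RMSE, hours) of dose time vs. calendar day."""
          X = np.asarray(days, float).reshape(-1, 1)
          y = np.asarray(dose_hours, float)
          fit = LinearRegression().fit(X, y)
          resid = y - fit.predict(X)
          return float(fit.coef_[0]), float(np.sqrt(np.mean(resid ** 2)))

      # toy cohort: per-subject EDM dose times and a 'sub-optimal adherence' label
      rng = np.random.default_rng(5)
      features, labels = [], []
      for subject in range(40):
          suboptimal = rng.random() < 0.4
          days = np.arange(300)
          scatter = 3.0 if suboptimal else 0.5          # mistimers scatter more (toy assumption)
          hours = 21.0 + rng.normal(0.0, scatter, days.size)
          slope, rmse = dose_timing_features(days, hours)
          features.append([slope, rmse])
          labels.append(int(suboptimal))

      features = np.array(features)
      print("AROC of slope:", round(roc_auc_score(labels, np.abs(features[:, 0])), 2))
      print("AROC of RMSE :", round(roc_auc_score(labels, features[:, 1]), 2))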

  11. Recent Results on "Approximations to Optimal Alarm Systems for Anomaly Detection"

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

    An optimal alarm system and its approximations may use Kalman filtering for univariate linear dynamic systems driven by Gaussian noise to provide a layer of predictive capability. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. An optimal alarm system can be designed to elicit the fewest false alarms for a fixed detection probability in this particular scenario.
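
    A minimal sketch of these building blocks is given below: a scalar Kalman filter tracks the process, multi-step predictive means and variances are propagated over the prediction window, and an alarm is raised when the probability of exceeding the critical threshold within the window becomes large. The system parameters are placeholders, and the per-step independence approximation used for the crossing probability is a simplification of the optimal alarm construction described in the paper.

      import numpy as np
      from scipy.stats import norm

      # scalar linear-Gaussian system: x[k+1] = a*x[k] + w,  y[k] = x[k] + v
      a, q, r = 0.95, 0.05, 0.2          # state transition, process and measurement noise variances
      L, d, p_alarm = 1.5, 5, 0.5        # critical level, prediction window length, alarm threshold

      def kalman_step(m, P, y):
          """One predict/update cycle of the scalar Kalman filter."""
          m_pred, P_pred = a * m, a * a * P + q
          K = P_pred / (P_pred + r)
          return m_pred + K * (y - m_pred), (1.0 - K) * P_pred

      def level_crossing_probability(m, P):
          """Approximate probability that the process exceeds L at least once in the next
          d steps, treating the per-step predictive distributions as independent."""
          p_no_cross, m_j, P_j = 1.0, m, P
          for _ in range(d):
              m_j, P_j = a * m_j, a * a * P_j + q            # j-step-ahead predictive moments
              p_no_cross *= norm.cdf((L - m_j) / np.sqrt(P_j))
          return 1.0 - p_no_cross

      # run the filter on a synthetic measurement stream and raise alarms when warranted
      rng = np.random.default_rng(6)
      x, m, P = 0.0, 0.0, 1.0
      for k in range(200):
          x = a * x + rng.normal(0, np.sqrt(q))
          y = x + rng.normal(0, np.sqrt(r))
          m, P = kalman_step(m, P, y)
          p = level_crossing_probability(m, P)
          if p > p_alarm:
              print(f"k={k}: predicted crossing probability {p:.2f} -> alarm")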

  12. A test of multiple correlation temporal window characteristic of non-Markov processes

    NASA Astrophysics Data System (ADS)

    Arecchi, F. T.; Farini, A.; Megna, N.

    2016-03-01

    We introduce a sensitive test of memory effects in successive events. The test consists of a combination K of binary correlations at successive times. K decays monotonically from K = 1 for uncorrelated events, as in a Markov process. For a monotonic memory fading, K<1 always. Here we report evidence of a K>1 temporal window in cognitive tasks consisting of the visual identification of the front face of the Necker cube after a previous presentation of the same. We speculate that memory effects provide a temporal window with K>1, and this experiment could be a possible first step towards a better comprehension of this phenomenon. The K>1 behaviour is maximal at an inter-measurement time τ of around 2 s, with inter-subject differences. The K>1 behaviour persists over a time window of 1 s around τ; outside this window the K<1 behaviour is recovered. The universal occurrence of a K>1 window in pairs of successive perceptions suggests that, at variance with single visual stimuli eliciting a suitable response, a pair of stimuli shortly separated in time displays mutual correlations.

  13. The second window ICG technique demonstrates a broad plateau period for near infrared fluorescence tumor contrast in glioblastoma

    PubMed Central

    Sheikh, Saad; Xia, Leilei; Pierce, John; Newton, Andrew; Predina, Jarrod; Cho, Steve; Nasrallah, MacLean; Singhal, Sunil; Dorsey, Jay; Lee, John Y. K.

    2017-01-01

    Introduction Fluorescence-guided surgery has emerged as a powerful tool to detect, localize and resect tumors in the operative setting. Our laboratory has pioneered a novel way to administer an FDA-approved near-infrared (NIR) contrast agent to help surgeons with this task. This technique, coined Second Window ICG, exploits the natural permeability of tumor vasculature and its poor clearance to deliver high doses of indocyanine green (ICG) to tumors. This technique differs substantially from established ICG video angiography techniques that visualize ICG within minutes of injection. We hypothesized that Second Window ICG can provide NIR optical contrast with good signal characteristics in intracranial brain tumors over a longer period of time than previously appreciated with ICG video angiography alone. We tested this hypothesis in an intracranial mouse glioblastoma model, and corroborated this in a human clinical trial. Methods Intracranial tumors were established in 20 mice using the U251-Luc-GFP cell line. Successful grafts were confirmed with bioluminescence. Intravenous tail vein injections of 5.0 mg/kg (high dose) or 2.5 mg/kg (low dose) ICG were performed. The Perkin Elmer IVIS Spectrum (closed field) was used to visualize NIR fluorescence signal at seven delayed time points following ICG injection. NIR signals were quantified using LivingImage software. Based on the success of our results, human subjects were recruited to a clinical trial and intravenously injected with high dose 5.0 mg/kg. Imaging was performed with the VisionSense Iridium (open field) during surgery one day after ICG injection. Results In the murine model, the NIR signal-to-background ratio (SBR) in gliomas peaks at one hour after infusion, then plateaus and remains strong and stable for at least 48 hours. Higher dose 5.0 mg/kg improves NIR signal as compared to lower dose at 2.5 mg/kg (SBR = 3.5 vs. 2.8; P = 0.0624). Although early (≤ 6 hrs) visualization of the Second Window ICG accumulation in gliomas is stronger than late (≥24 hrs) visualization (SBR = 3.94 vs. 2.32; p<0.05) there appears to be a long plateau period of stable ICG NIR signal accumulation within tumors in the murine model. We call this long plateau period the “Second Window” of ICG. In glioblastoma patients, the delayed visualization of intratumoral NIR signal was strong (SBR 7.50 ± 0.74), without any significant difference within the 19 to 30 hour visualization window (R2 = 0.019). Conclusion The Second Window ICG technique allows neurosurgeons to deliver NIR optical contrast agent to human glioblastoma patients, thus providing real-time tumor identification in the operating room. This nonspecific tumor accumulation of ICG within the tumor provides strong signal to background contrast, and is not significantly time dependent between 6 hours to 48 hours, providing a broad plateau for stable visualization. This finding suggests that optimal imaging of the “Second Window of ICG” may be within this plateau period, thus providing signal uniformity across subjects. PMID:28738091

  14. Physiologically Based Absorption Modeling to Design Extended-Release Clinical Products for an Ester Prodrug.

    PubMed

    Ding, Xuan; Day, Jeffrey S; Sperry, David C

    2016-11-01

    Absorption modeling has demonstrated its great value in modern drug product development due to its utility in understanding and predicting in vivo performance. In this case, we integrated physiologically based modeling in the development processes to effectively design extended-release (ER) clinical products for an ester prodrug LY545694. By simulating the trial results of immediate-release products, we delineated complex pharmacokinetics due to prodrug conversion and established an absorption model to describe the clinical observations. This model suggested the prodrug has optimal biopharmaceutical properties to warrant developing an ER product. Subsequently, we incorporated release profiles of prototype ER tablets into the absorption model to simulate the in vivo performance of these products observed in an exploratory trial. The models suggested that the absorption of these ER tablets was lower than the IR products because the extended release from the formulations prevented the drug from taking advantage of the optimal absorption window. Using these models, we formed a strategy to optimize the ER product to minimize the impact of the absorption window limitation. Accurate prediction of the performance of these optimized products by modeling was confirmed in a third clinical trial.

  15. Sunshade for building exteriors

    DOEpatents

    Braunstein, Richard; McKenna, Gregory B.; Hewitt, David W.; Harper, Randolph S.

    2002-01-01

    A sunshade for shading window exteriors includes at least one connecting bracket for attachment to a window mullion, a blade support strut attached to the connecting bracket at a first joint, and a plurality of louvered blades supported by the blade support strut at a second joint. The pivot angle at the first joint may be varied to extend the louvered blades a desired distance from the window mullion. The louvered blades are positioned at a preselected fixed profile angle on the second joint in order to optimize shading at the latitude where the sunshade is installed. In a preferred embodiment, the louvered blades have top walls supporting photovoltaic cells and the sunshade includes electric cables for connecting the photovoltaic cells to an electric circuit.

  16. Enhanced water window x-ray emission from in situ formed carbon clusters irradiated by intense ultra-short laser pulses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakravarty, U.; Rao, B. S.; Arora, V.

    Enhanced water window x-ray emission (23–44 Å) from carbon clusters, formed in situ using a pre-pulse, irradiated by intense (I > 10^17 W/cm^2) ultra-short laser pulse, is demonstrated. An order of magnitude x-ray enhancement over planar graphite target is observed in carbon clusters, formed by a sub-ns pre-pulse, interacting with intense main pulse after a delay. The effect of the delay and the duration of the main pulse is studied for optimizing the x-ray emission in the water window region. This x-ray source has added advantages of being an efficient, high repetition rate, and low debris x-ray source.

  17. Design and fabrication of a large area freestanding compressive stress SiO2 optical window

    NASA Astrophysics Data System (ADS)

    Van Toan, Nguyen; Sangu, Suguru; Ono, Takahito

    2016-07-01

    This paper reports the design and fabrication of a 7.2 mm  ×  9.6 mm freestanding compressive stress SiO2 optical window without buckling. An application of the SiO2 optical window with and without liquid penetration has been demonstrated for an optical modulator and its optical characteristic is evaluated by using an image sensor. Two methods for SiO2 optical window fabrication have been presented. The first method is a combination of silicon etching and a thermal oxidation process. Silicon capillaries fabricated by deep reactive ion etching (deep RIE) are completely oxidized to form the SiO2 capillaries. The large compressive stress of the oxide causes buckling of the optical window, which is reduced by optimizing the design of the device structure. A magnetron-type RIE, which is investigated for deep SiO2 etching, is the second method. This method achieves deep SiO2 etching together with smooth surfaces, vertical shapes and a high aspect ratio. Additionally, in order to avoid a wrinkling optical window, the idea of a Peano curve structure has been proposed to achieve a freestanding compressive stress SiO2 optical window. A 7.2 mm  ×  9.6 mm optical window area without buckling integrated with an image sensor for an optical modulator has been successfully fabricated. The qualitative and quantitative evaluations have been performed in cases with and without liquid penetration.

  18. LLE Review 117 (October-December 2008)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bittle, W., editor

    2009-05-28

    This volume of the LLE Review, covering October-December 2008, features 'Demonstration of the Shock-Timing Technique for Ignition Targets at the National Ignition Facility' by T. R. Boehly, V. N. Goncharov, S. X. Hu, J. A. Marozas, T. C. Sangster, D. D. Meyerhofer (LLE), D. Munro, P. M. Celliers, D. G. Hicks, G. W. Collins, H. F. Robey, O. L. Landen (LLNL), and R. E. Olson (SNL). In this article (p. 1) the authors report on a technique to measure the velocity and timing of shock waves in a capsule contained within hohlraum targets. This technique is critical for optimizing the drive profiles for high-performance inertial-confinement-fusion capsules, which are compressed by multiple precisely timed shock waves. The shock-timing technique was demonstrated on OMEGA using surrogate hohlraum targets heated to 180 eV and fitted with a re-entrant cone and quartz window to facilitate velocity measurements using velocity interferometry. Cryogenic experiments using targets filled with liquid deuterium further demonstrated the entire timing technique in a hohlraum environment. Direct-drive cryogenic targets with multiple spherical shocks were also used to validate this technique, including convergence effects at relevant pressures (velocities) and sizes. These results provide confidence that shock velocity and timing can be measured in NIF ignition targets, thereby optimizing these critical parameters.

  19. 75 FR 11841 - Repowering Assistance Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... application window. SUMMARY: RBS is announcing a new application window to submit applications for the...-time application window for remaining FY 2009 funds. Paperwork Reduction Act In accordance with the... allocate all of the FY 2009 authorized funds. Therefore, the Agency is opening a new application window to...

  20. Correlates of avian building strikes at a glass façade museum surrounded by avian habitat

    NASA Astrophysics Data System (ADS)

    Kahle, L.; Flannery, M.; Dumbacher, J. P.

    2013-12-01

    Bird window collisions are the second largest anthropogenic cause of bird deaths in the world. Effective mitigation requires an understanding of which birds are most likely to strike, when, and why. Here, we examine five years of avian window strike data from the California Academy of Sciences - a relatively new museum with significant glass façade situated in Golden Gate Park, San Francisco. We examine correlates of window-killed birds, including age, sex, season, and migratory or sedentary tendencies of the birds. We also examine correlates of window kills such as presence of habitat surrounding the building and overall window area. We found that males are almost three times more likely than females to mortally strike windows, and immature birds are three times more abundant than adults in our window kill dataset. Among seasons, strikes were not notably different in spring, summer, and fall; however they were notably reduced in winter. There was no statistical effect of building orientation (north, south, east, or west), and the presence of avian habitat directly adjacent to windows had a minor effect. We also report ongoing studies examining various efforts to reduce window kill (primarily external decals and large electronic window blinds.) We hope that improving our understanding of the causes of the window strikes will help us strategically reduce window strikes.

  1. Formation of Size- and Position-Controlled Nanometer Size Pt Dots on GaAs and InP Substrates by Pulsed Electrochemical Deposition

    NASA Astrophysics Data System (ADS)

    Sato, Taketomo; Kaneshiro, Chinami; Okada, Hiroshi; Hasegawa, Hideki

    1999-04-01

    Attempts were made to form regular arrays of size- and position-controlled Pt dots on GaAs and InP by combining an in situ electrochemical process with electron beam (EB) lithography. This utilizes the precipitation of Pt nano-particles at the initial stage of electrodeposition. First, electrochemical conditions were optimized in the mode of self-assembled dot array formation on unpatterned substrates. Minimum in-plane dot diameters of 22 nm and 26 nm on GaAs and InP, respectively, were obtained under the optimal pulsed mode. Then, Pt dots were selectively formed on patterned substrates with open circular windows formed by EB lithography, thereby realizing dot-position control. The Pt dot was found to have been deposited at the center of each open window, and the in-plane diameter of the dot could be controlled by the number, width and period of the pulse-waveform applied to substrates. A minimum diameter of 20 nm was realized in windows with a diameter of 100 nm, using a single pulse. Current-voltage (I-V) measurements using an atomic force microscopy (AFM) system with a conductive probe indicated that each Pt dot/n-GaAs contact possessed a high Schottky barrier height of about 1 eV.

  2. An approach to unbiased subsample interpolation for motion tracking.

    PubMed

    McCormick, Matthew M; Varghese, Tomy

    2013-04-01

    Accurate subsample displacement estimation is necessary for ultrasound elastography because of the small deformations that occur and the subsequent application of a derivative operation on local displacements. Many of the commonly used subsample estimation techniques introduce significant bias errors. This article addresses a reduced-bias approach to subsample displacement estimation that consists of a two-dimensional windowed-sinc interpolation with numerical optimization. It is shown that a Welch or Lanczos window with a Nelder-Mead simplex or regular-step gradient-descent optimization is well suited for this purpose. Little improvement results from a sinc window radius greater than four data samples. The strain signal-to-noise ratio (SNR) obtained in a uniformly elastic phantom is compared with other parabolic and cosine interpolation methods; the strain SNR is improved over parabolic interpolation from 11.0 to 13.6 in the axial direction and from 0.7 to 1.1 in the lateral direction for an applied 1% axial deformation. The improvement was most significant for small strains and displacement tracking in the lateral direction. This approach does not rely on special properties of the image or similarity function, which is demonstrated by its effectiveness with the application of a previously described regularization technique.
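
    A one-dimensional sketch of the idea (the paper works in two dimensions) is shown below: a similarity curve is interpolated at subsample lags with a Lanczos-windowed sinc kernel of radius four and the peak is refined with a Nelder-Mead search. The kernel radius, test curve and optimizer settings are assumptions for illustration only.

      import numpy as np
      from scipy.optimize import minimize

      def lanczos_sinc_interp(samples, t, radius=4):
          """Continuous value at lag t (in sample units) from discrete samples using a
          Lanczos-windowed sinc kernel of the given radius."""
          n0 = int(np.floor(t))
          idx = np.arange(n0 - radius + 1, n0 + radius + 1)
          idx = idx[(idx >= 0) & (idx < len(samples))]
          u = t - idx
          w = np.sinc(u) * np.sinc(u / radius)              # sinc kernel times Lanczos window
          return float(np.sum(samples[idx] * w))

      def subsample_peak(similarity):
          """Refine the integer peak of a similarity (e.g. cross-correlation) curve to
          subsample precision with Nelder-Mead on the windowed-sinc interpolant."""
          k0 = int(np.argmax(similarity))
          res = minimize(lambda t: -lanczos_sinc_interp(similarity, float(t[0])),
                         x0=[float(k0)], method="Nelder-Mead")
          return float(res.x[0])

      # synthetic demonstration: a correlation peak whose true location is between samples
      true_peak = 10.37
      lags = np.arange(21)
      similarity = np.exp(-0.5 * ((lags - true_peak) / 2.5) ** 2)
      print("integer peak:", int(np.argmax(similarity)),
            " subsample peak:", round(subsample_peak(similarity), 3))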

  3. Virtual Monoenergetic Images From a Novel Dual-Layer Spectral Detector Computed Tomography Scanner in Portal Venous Phase: Adjusted Window Settings Depending on Assessment Focus Are Essential for Image Interpretation.

    PubMed

    Hickethier, Tilman; Iuga, Andra-Iza; Lennartz, Simon; Hauger, Myriam; Byrtus, Jonathan; Luetkens, Julian A; Haneder, Stefan; Maintz, David; Doerner, Jonas

    We aimed to determine optimal window settings for conventional polyenergetic (PolyE) and virtual monoenergetic images (MonoE) derived from abdominal portal venous phase computed tomography (CT) examinations on a novel dual-layer spectral-detector CT (SDCT). From 50 patients, SDCT data sets were reconstructed as MonoE at 40 keV as well as PolyE, and the best individual window width and level (W/L) values were assessed manually and separately for the evaluation of abdominal arteries and of liver lesions. Via regression analysis, optimized individual values were mathematically calculated. Subjective image quality parameters as well as vessel and liver lesion diameters were measured to determine the influence of different W/L settings. Attenuation and contrast-to-noise values were significantly higher in MonoE compared with PolyE. Compared with standard settings, almost all adjusted W/L settings varied significantly and yielded higher subjective scores. No differences were found between manually adjusted and mathematically calculated W/L settings. PolyE and MonoE from abdominal portal venous phase SDCT examinations require appropriate W/L settings depending on reconstruction technique and assessment focus.

  4. Using genetic algorithms to optimize the analogue method for precipitation prediction in the Swiss Alps

    NASA Astrophysics Data System (ADS)

    Horton, Pascal; Jaboyedoff, Michel; Obled, Charles

    2018-01-01

    Analogue methods provide a statistical precipitation prediction based on synoptic predictors supplied by general circulation models or numerical weather prediction models. The method samples a selection of days in the archives that are similar to the target day to be predicted, and considers their set of corresponding observed precipitation (the predictand) as the conditional distribution for the target day. The relationship between the predictors and predictands relies on some parameters that characterize how and where the similarity between two atmospheric situations is defined. This relationship is usually established by a semi-automatic sequential procedure that has strong limitations: (i) it cannot automatically choose the pressure levels and temporal windows (hour of the day) for a given meteorological variable, (ii) it cannot handle dependencies between parameters, and (iii) it cannot easily handle new degrees of freedom. In this work, a global optimization approach relying on genetic algorithms is used to optimize all parameters jointly and automatically. The global optimization was applied to some variants of the analogue method for the Rhône catchment in the Swiss Alps. The performance scores increased compared to reference methods, especially for days with high precipitation totals. The resulting parameters were found to be relevant and coherent between the different subregions of the catchment. Moreover, they were obtained automatically and objectively, which reduces the effort that needs to be invested in exploration attempts when adapting the method to a new region or for a new predictand. For example, it obviates the need to assess a large number of combinations of pressure levels and temporal windows of predictor variables that were manually selected beforehand. The optimization could also take into account parameter inter-dependencies. In addition, the approach allowed for new degrees of freedom, such as a possible weighting between pressure levels, and non-overlapping spatial windows.
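
    The sketch below shows the kind of global search loop involved, using a small real-coded genetic algorithm with tournament selection, uniform crossover and Gaussian mutation; the parameter encoding, bounds and the surrogate fitness function are placeholders, since the actual objective would require running the analogue method over a calibration period.

      import numpy as np

      rng = np.random.default_rng(7)

      # Placeholder parameter vector: [spatial window half-width (deg), number of analogues,
      # weight of a second predictor]; bounds define the search space.
      lower = np.array([2.0, 10.0, 0.0])
      upper = np.array([15.0, 80.0, 1.0])

      def fitness(p):
          """Placeholder skill score (higher is better). In practice this would run the
          analogue method over a calibration period and return e.g. a CRPS-based score;
          here it is an arbitrary smooth surrogate with a known optimum."""
          win, n_analog, w = p
          return -((win - 8.0) ** 2 / 20.0 + (n_analog - 35.0) ** 2 / 800.0 + (w - 0.6) ** 2)

      def genetic_algorithm(pop_size=30, generations=60, mut_sigma=0.1, elite=2):
          pop = rng.uniform(lower, upper, size=(pop_size, lower.size))
          for _ in range(generations):
              fit = np.array([fitness(ind) for ind in pop])
              order = np.argsort(fit)[::-1]
              new_pop = [pop[i].copy() for i in order[:elite]]            # elitism
              while len(new_pop) < pop_size:
                  parents = []
                  for _ in range(2):                                      # tournament selection
                      i, j = rng.integers(pop_size, size=2)
                      parents.append(pop[i] if fit[i] >= fit[j] else pop[j])
                  child = np.where(rng.random(lower.size) < 0.5, *parents)      # uniform crossover
                  child = child + rng.normal(0.0, mut_sigma * (upper - lower))  # Gaussian mutation
                  new_pop.append(np.clip(child, lower, upper))
              pop = np.array(new_pop)
          fit = np.array([fitness(ind) for ind in pop])
          return pop[np.argmax(fit)], float(fit.max())

      best_params, best_score = genetic_algorithm()
      print("best parameters:", np.round(best_params, 2), " score:", round(best_score, 3))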

  5. Thermohydraulic behavior of the liquid metal target of a spallation neutron source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeda, Y.

    1996-06-01

    The author presents work done on three main problems. (1) Natural circulation in a double coaxial cylindrical container: The thermohydraulic behaviour of the liquid metal target of the spallation neutron source at PSI has been investigated. The configuration is a natural-circulation loop in a concentric double-tube-type container. The results show that the natural-circulation loop concept is valid for the design phase of the target construction, and the current specified design criteria will be fulfilled with the proposed parameter values. (2) Flow around the window: Water experiments were performed for geometry optimisation of the window shape of the SINQ container, to avoid generating recirculation zones in the peripheral area and to achieve optimal cooling of the central part of the beam entrance window. Flow visualisation techniques were mainly used for various window shapes and gap distances between the window and the guide tube edge. (3) Flow in window cooling channels: Flows in narrow gaps of cooling channels of two different types of windows were studied by flow visualisation techniques. One type is a slightly curved round cooling channel and the other is of hemispherical shape; both have only a 2 mm gap distance, and the water inlet is located on one side while the water flows out from the opposite side. In both cases, the central part of the flow area has lower velocity than the peripheral area.

  6. Affordable Window Insulation with R-10/inch Rating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenifer Marchesi; Redouane Begag; Je Kyun Lee; Danny Ou

    2004-10-15

    During the performance of contract DE-FC26-00-NT40998, entitled "Affordable Window Insulation with R-10/inch Value", research was conducted at Aspen Aerogels, Inc. to develop new transparent aerogel materials suitable for window insulation applications. The project requirements were to develop a formulation or multiple formulations that have high transparency (85-90%) in the visible region, are hydrophobic (will not opacify with exposure to water vapor or liquid), and have at least 2% resiliency (interpreted as recoverable 2% strain and better than 5% strain to failure in compression). Results from an unrelated project showed that silica aerogels covalently bonded to organic polymers exhibit excellent mechanical properties. At the outset of this project, we believed that such a route is the best to improve mechanical properties. We have applied Design of Experiment (DOE) techniques to optimize formulations including both silica aerogels and organically modified silica aerogels ("Ormosils"). We used these DOE results to optimize formulations around the local/global optimization points. This report documents that we succeeded in developing a number of formulations that meet all of the stated criteria. We successfully developed formulations utilizing a two-step approach where the first step involves acid catalyzed hydrolysis and the second step involves base catalyzed condensation to make the gels. The gels were dried using supercritical CO2 and we were able to make 1 foot x 1 foot x 0.5 inch panels that met the criteria established.

  7. Validation of accelerometer wear and nonwear time classification algorithm.

    PubMed

    Choi, Leena; Liu, Zhouwen; Matthews, Charles E; Buchowski, Maciej S

    2011-02-01

    The use of movement monitors (accelerometers) for measuring physical activity (PA) in intervention and population-based studies is becoming a standard methodology for the objective measurement of sedentary and active behaviors and for the validation of subjective PA self-reports. A vital step in PA measurement is the classification of daily time into accelerometer wear and nonwear intervals using its recordings (counts) and an accelerometer-specific algorithm. The purpose of this study was to validate and improve a commonly used algorithm for classifying accelerometer wear and nonwear time intervals using objective movement data obtained in the whole-room indirect calorimeter. We conducted a validation study of a wear or nonwear automatic algorithm using data obtained from 49 adults and 76 youth wearing accelerometers during a strictly monitored 24-h stay in a room calorimeter. The accelerometer wear and nonwear time classified by the algorithm was compared with actual wearing time. Potential improvements to the algorithm were examined using the minimum classification error as an optimization target. The recommended elements in the new algorithm are as follows: 1) zero-count threshold during a nonwear time interval, 2) 90-min time window for consecutive zero or nonzero counts, and 3) allowance of 2-min interval of nonzero counts with the upstream or downstream 30-min consecutive zero-count window for detection of artifactual movements. Compared with the true wearing status, improvements to the algorithm decreased nonwear time misclassification during the waking and the 24-h periods (all P values < 0.001). The accelerometer wear or nonwear time algorithm improvements may lead to more accurate estimation of time spent in sedentary and active behaviors.
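
    The recommended elements can be sketched as a minute-level classification routine: zero counts define candidate nonwear, nonzero spikes of at most 2 minutes are tolerated when flanked by 30 minutes of zeros, and only effective-zero runs of at least 90 minutes are labelled nonwear. The edge handling and the toy day below are assumptions; this is a simplified reading of the published rules, not the reference implementation.

      import numpy as np

      def classify_wear(counts, window=90, spike_tol=2, flank=30):
          """Return a boolean array (True = wear) for minute-level accelerometer counts.

          Nonwear is a run of at least `window` minutes of zero counts; runs of nonzero
          counts no longer than `spike_tol` minutes are tolerated inside a nonwear interval
          when the `flank` minutes immediately before and after them are zero.
          """
          counts = np.asarray(counts)
          n = counts.size
          # step 1: mark 'effective zeros' (true zeros plus tolerated artifactual spikes)
          eff_zero = counts == 0
          i = 0
          while i < n:
              if counts[i] > 0:
                  j = i
                  while j < n and counts[j] > 0:
                      j += 1
                  before = eff_zero[max(0, i - flank):i]
                  after = counts[j:j + flank] == 0
                  if (j - i <= spike_tol and before.size == flank and np.all(before)
                          and after.size == flank and np.all(after)):
                      eff_zero[i:j] = True          # treat the short spike as artifactual
                  i = j
              else:
                  i += 1
          # step 2: nonwear = effective-zero runs of at least `window` minutes
          wear = np.ones(n, dtype=bool)
          i = 0
          while i < n:
              if eff_zero[i]:
                  j = i
                  while j < n and eff_zero[j]:
                      j += 1
                  if j - i >= window:
                      wear[i:j] = False
                  i = j
              else:
                  i += 1
          return wear

      # toy day: 3 h of wear, then 4 h of nonwear containing a 2-min artifactual spike
      counts = np.concatenate([np.random.default_rng(8).integers(50, 500, 180),
                               np.zeros(120), [30, 40], np.zeros(118)])
      wear = classify_wear(counts)
      print("wear minutes:", int(wear.sum()), "of", counts.size)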

  8. A novel fast optical switch based on two cascaded Terahertz Optical Asymmetric Demultiplexers (TOAD).

    PubMed

    Wang, Bing; Baby, Varghese; Tong, Wilson; Xu, Lei; Friedman, Michelle; Runser, Robert; Glesk, Ivan; Prucnal, Paul

    2002-01-14

    A novel optical switch based on cascading two terahertz optical asymmetric demultiplexers (TOAD) is presented. By utilizing the sharp edge of the asymmetric TOAD switching window profile, two TOAD switching windows are overlapped to produce a narrower aggregate switching window, not limited by the pulse propagation time in the SOA of the TOAD. Simulations of the cascaded TOAD switching window show relatively constant window amplitude for different window sizes. Experimental results on cascading two TOADs, each with a switching window of 8ps, but with the SOA on opposite sides of the fiber loop, show a minimum switching window of 2.7ps.
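
    The cascading idea can be illustrated numerically: each TOAD is modelled as an asymmetric window with one sharp and one slow edge, mirrored for the loop with the SOA on the opposite side, and the aggregate response is the product of the two windows. The edge times and offsets below are placeholders chosen only to reproduce the reported order of magnitude, not the experimental device parameters.

      import numpy as np

      t = np.linspace(-10e-12, 20e-12, 4001)          # time axis (s)

      def toad_window(t, t_open, t_close, sharp_edge=0.3e-12, slow_edge=4e-12, reversed_soa=False):
          """Asymmetric switching window: one sharp edge (set by the control pulse) and one
          slow edge (set by carrier recovery / pulse transit in the SOA); reversing the SOA
          position in the loop mirrors which edge is sharp."""
          rise, fall = (sharp_edge, slow_edge) if not reversed_soa else (slow_edge, sharp_edge)
          opening = 0.5 * (1 + np.tanh((t - t_open) / rise))
          closing = 0.5 * (1 + np.tanh((t_close - t) / fall))
          return opening * closing

      def fwhm(t, w):
          above = t[w >= 0.5 * w.max()]
          return above[-1] - above[0]

      w1 = toad_window(t, t_open=0.0, t_close=8e-12)                          # sharp leading edge
      w2 = toad_window(t, t_open=-5e-12, t_close=2.7e-12, reversed_soa=True)  # sharp trailing edge
      aggregate = w1 * w2                                                     # cascaded response

      print("window 1 FWHM: %.1f ps" % (fwhm(t, w1) * 1e12))
      print("window 2 FWHM: %.1f ps" % (fwhm(t, w2) * 1e12))
      print("cascaded FWHM: %.1f ps" % (fwhm(t, aggregate) * 1e12))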

  9. Interface Engineering with MoS2 -Pd Nanoparticles Hybrid Structure for a Low Voltage Resistive Switching Memory.

    PubMed

    Wang, Xue-Feng; Tian, He; Zhao, Hai-Ming; Zhang, Tian-Yu; Mao, Wei-Quan; Qiao, Yan-Cong; Pang, Yu; Li, Yu-Xing; Yang, Yi; Ren, Tian-Ling

    2018-01-01

    Metal oxide-based resistive random access memory (RRAM) has attracted a lot of attention for its scalability, temperature robustness, and potential to achieve machine learning. However, a thick oxide layer results in a relatively high program voltage, while a thin one causes large leakage current and a small window. Owing to these fundamental limitations, a novel interface engineering approach, rather than further optimization of the oxide layer itself, is proposed to reduce the programming voltage and to increase the uniformity and the on/off ratio. Following this idea, a molybdenum disulfide (MoS2)-palladium nanoparticle hybrid structure is used to engineer the oxide/electrode interface of hafnium oxide (HfOx)-based RRAM. Through this interface engineering, the set voltage can be greatly lowered (from -3.5 to -0.8 V) with better uniformity under a relatively thick HfOx layer (≈15 nm), and a 30 times improvement of the memory window can be obtained. Moreover, due to the atomic thickness of the MoS2 film and the high transmittance of ITO, the proposed RRAM exhibits high transparency in visible light. As the proposed interface-engineered RRAM exhibits good transparency, low SET voltage, and a large resistive switching window, it has great potential for data storage in transparent circuits and wearable electronics with relatively low supply voltage. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Using genetic algorithms to achieve an automatic and global optimization of analogue methods for statistical downscaling of precipitation

    NASA Astrophysics Data System (ADS)

    Horton, Pascal; Weingartner, Rolf; Obled, Charles; Jaboyedoff, Michel

    2017-04-01

    Analogue methods (AMs) rely on the hypothesis that similar situations, in terms of atmospheric circulation, are likely to result in similar local or regional weather conditions. These methods consist of sampling a certain number of past situations, based on different synoptic-scale meteorological variables (predictors), in order to construct a probabilistic prediction for a local weather variable of interest (predictand). They are often used for daily precipitation prediction, either in the context of real-time forecasting, reconstruction of past weather conditions, or future climate impact studies. The relationship between predictors and predictands is defined by several parameters (predictor variable, spatial and temporal windows used for the comparison, analogy criteria, and number of analogues), which are often calibrated by means of a semi-automatic sequential procedure that has strong limitations. AMs may include several subsampling levels (e.g. first sorting a set of analogues in terms of circulation, then restricting to those with similar moisture status). The parameter space of the AMs can be very complex, with substantial co-dependencies between the parameters. Thus, global optimization techniques are likely to be necessary for calibrating most AM variants, as they can optimize all parameters of all analogy levels simultaneously. Genetic algorithms (GAs) were found to be successful in finding optimal values of AM parameters. They take parameter inter-dependencies into account and objectively select parameters that previously had to be chosen manually (such as the pressure levels and the temporal windows of the predictor variables), thus obviating the need to assess a large number of combinations. The performance scores of the optimized methods increased compared to reference methods, and even more so for days with high precipitation totals. The resulting parameters were found to be relevant and spatially coherent. Moreover, they were obtained automatically and objectively, which reduces the effort invested in exploration attempts when adapting the method to a new region or to a new predictand. In addition, the approach allowed for new degrees of freedom, such as a weighting between the pressure levels and non-overlapping spatial windows. Genetic algorithms were then used further in order to automatically select predictor variables and analogy criteria. This resulted in interesting outputs, providing new predictor-criterion combinations. However, some limitations of the approach were encountered, and expert input is likely to remain necessary. Nevertheless, letting GAs explore a dataset for the best predictor for a predictand of interest is certainly useful, particularly when applied to a new predictand or a new region with different climatic characteristics.
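
    As a schematic illustration of how a GA can calibrate several analogue-method parameters jointly, the sketch below evolves a small real-valued parameter vector (spatial window sizes, number of analogues, a predictor weight) against a placeholder skill function. The parameter names, bounds, and objective are illustrative assumptions, not those of the cited study, which optimizes full multi-level analogue configurations.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical parameter bounds: [lat window (deg), lon window (deg),
      # number of analogues, weight of a secondary predictor]
      BOUNDS = np.array([[2.0, 20.0], [2.0, 30.0], [10.0, 80.0], [0.0, 1.0]])

      def skill(params):
          """Placeholder verification score to maximize (e.g. a CRPS-based skill).
          A real implementation would run the analogue method with these parameters
          over a calibration period and score the probabilistic predictions."""
          target = np.array([8.0, 12.0, 35.0, 0.4])           # fake optimum
          return -np.sum(((params - target) / BOUNDS[:, 1]) ** 2)

      def genetic_algorithm(pop_size=40, generations=60, mut_sigma=0.1):
          dim = len(BOUNDS)
          pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, dim))
          for _ in range(generations):
              fit = np.array([skill(ind) for ind in pop])
              new_pop = [pop[np.argmax(fit)]]                  # elitism
              while len(new_pop) < pop_size:
                  # tournament selection of two parents
                  p1 = pop[max(rng.choice(pop_size, 3), key=lambda i: fit[i])]
                  p2 = pop[max(rng.choice(pop_size, 3), key=lambda i: fit[i])]
                  alpha = rng.uniform(size=dim)                # blend crossover
                  child = alpha * p1 + (1 - alpha) * p2
                  # Gaussian mutation scaled by the parameter range
                  child += rng.normal(0, mut_sigma, dim) * (BOUNDS[:, 1] - BOUNDS[:, 0])
                  new_pop.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
              pop = np.array(new_pop)
          return pop[np.argmax([skill(ind) for ind in pop])]

      print(genetic_algorithm())    # converges near the fake optimum above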

  11. On the relationship between human search strategies, conspicuity, and search performance

    NASA Astrophysics Data System (ADS)

    Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander

    2005-05-01

    We determined the relationship between search performance with a limited field of view (FOV) and several scanning and scene parameters in human observer experiments. The observers (38 trained army scouts) searched through a large search sector for a target (a camouflaged person) on a heath. From trial to trial the target appeared at a different location. With a joystick the observers scanned through a panoramic image (displayed on a PC-monitor) while the scan path was registered. Four conditions were run differing in sensor type (visual or thermal infrared) and window size (large or small). In conditions with a small window size the zoom option could be used. Detection performance was highly dependent on zoom factor and deteriorated when scan speed increased beyond a threshold value. Moreover, the distribution of scan speeds scales with the threshold speed. This indicates that the observers are aware of their limitations and choose a (near) optimal search strategy. We found no correlation between the fraction of detected targets and overall search time for the individual observers, indicating that both are independent measures of individual search performance. Search performance (fraction detected, total search time, time in view for detection) was found to be strongly related to target conspicuity. Moreover, we found the same relationship between search performance and conspicuity for visual and thermal targets. This indicates that search performance can be predicted directly by conspicuity regardless of the sensor type.

  12. High-Reliability Waveguide Vacuum/Pressure Window

    NASA Technical Reports Server (NTRS)

    Britcliffe, Michael J.; Hanson, Theodore R.; Long, Ezra M.; Montanez, Steven

    2013-01-01

    The NASA Deep Space Network (DSN) uses commercial waveguide windows on the output waveguide of Ka-band (32 GHz) low-noise amplifiers. Mechanical failure of these windows resulted in an unacceptable loss in tracking time. To address this issue, a new Ka-band WR-28 waveguide window has been designed, fabricated, and tested. The window uses a slab of low-loss, low-dielectric constant foam that is bonded into a 1/2-wave-thick waveguide/flange. The foam is a commercially available, rigid, closed-cell polymethacrylimide. It has excellent electrical properties with a dielectric constant of 1.04, and a loss tangent of 0.01. It is relatively strong with a tensile strength of 1 MPa. The material is virtually impermeable to helium. The finished window exhibits a leak rate of less than 3x10(exp -3) cu cm/s with helium. The material is also chemically resistant and can be cleaned with acetone. The window is constructed by fabricating a window body by brazing a short length of WR-28 copper waveguide into a standard rectangular flange, and machining the resulting part to a thickness of 4.6 mm. The foam is machined to a rectangular shape with a dimension of 7.06x3.53 mm. The foam is bonded into the body with a two-part epoxy. After curing, the excess glue and foam are knife-trimmed by hand. The finished window has a loss of less than 0.08 dB (2%) and a return loss of greater than 25 dB at 32 GHz. This meets the requirements for the DSN application. The window is usable for most applications over the entire 26-to-40-GHz waveguide band. The window return loss can be tuned to a required frequency by varying the thickness of the window slightly. Most standard waveguide windows use a thin membrane of material bonded into a recess in a waveguide flange, or sandwiched between two flanges with a polymer seal. Designs using the recessed window are prone to mechanical failure over time due to constraints on the dimensions of the recess that allow the bond to fail. Designs using the sandwich method are often permeable to helium, which prohibits the use of helium leak detection. At the time of this reporting, 40 windows have been produced. Twelve are in operation with a combined operating time of over 30,000 hours without a failure.

  13. 76 FR 23911 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Reef Fish Fishery of the Gulf of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-29

    ... maintenance window for the Gulf individual fishing quota (IFQ) programs, and removing obsolete codified text..., etc.), extends the IFQ maintenance window an additional 8 hours to allow for more time to conduct end... maintenance window. All electronic IFQ transactions must be completed by December 31 at 6 p.m. eastern time...

  14. Automation method to identify the geological structure of seabed using spatial statistic analysis of echo sounding data

    NASA Astrophysics Data System (ADS)

    Kwon, O.; Kim, W.; Kim, J.

    2017-12-01

    Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying the geological structure, including faults, at the design and construction stages is critically important. Unlike tunnels on land, however, it is very difficult to obtain data on the geological structure because of the limits of geological surveys at sea. This study addresses these difficulties by developing a technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, there are technical and economic limits to borehole and geophysical investigation. On the contrary, echo sounding data are easily obtainable, and the reliability of the information is higher compared with the above approaches. This study is aimed at developing an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach, based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is outlined as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data in the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on the map, and (6) visualize the geological structure on the map. The important elements in this study include the optimal size of the moving window, the choice of spatial statistics, and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were performed. Eventually, a user program based on R was developed using the optimal analysis algorithm. The user program was designed to examine the variation of various spatial statistics; by allowing the type of spatial statistic and the percentile standard to be easily designated, it facilitates analysis of the geological structure as these settings vary. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government. (Project Number: 13 Construction Research T01)
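
    A minimal sketch of steps (2) through (5) of the outline above, assuming the bathymetry has already been gridded (step 1): slide a window over the grid, compute a spatial statistic in each window (the local standard deviation is used here as one illustrative choice), and flag cells exceeding a chosen percentile. The window size, statistic, and percentile are placeholders, not the values selected in the study.

      import numpy as np

      def moving_window_statistic(grid, win=5, stat=np.nanstd):
          """Compute a spatial statistic in a win x win moving window (odd win)."""
          pad = win // 2
          padded = np.pad(grid, pad, mode="edge")
          out = np.empty_like(grid, dtype=float)
          for i in range(grid.shape[0]):
              for j in range(grid.shape[1]):
                  out[i, j] = stat(padded[i:i + win, j:j + win])
          return out

      def flag_structures(grid, win=5, percentile=95):
          """Flag cells whose windowed statistic exceeds the given percentile."""
          stat_map = moving_window_statistic(grid, win)
          threshold = np.nanpercentile(stat_map, percentile)
          return stat_map >= threshold

      # Example with synthetic bathymetry: a smooth surface plus a linear step
      # mimicking a fault trace
      x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
      depth = -50 - 5 * x - 3 * y
      depth[x + y > 1.0] -= 4.0
      mask = flag_structures(depth, win=7, percentile=97)
      print("flagged cells:", mask.sum())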

  15. Spatio-temporal Granger causality: a new framework

    PubMed Central

    Luo, Qiang; Lu, Wenlian; Cheng, Wei; Valdes-Sosa, Pedro A.; Wen, Xiaotong; Ding, Mingzhou; Feng, Jianfeng

    2015-01-01

    That physiological oscillations of various frequencies are present in fMRI signals is the rule, not the exception. Herein, we propose a novel theoretical framework, spatio-temporal Granger causality, which allows us to more reliably and precisely estimate the Granger causality from experimental datasets possessing time-varying properties caused by physiological oscillations. Within this framework, Granger causality is redefined as a global index measuring the directed information flow between two time series with time-varying properties. Both theoretical analyses and numerical examples demonstrate that Granger causality is a monotonically increasing function of the temporal resolution used in the estimation. This is consistent with the general principle of coarse graining, which causes information loss by smoothing out very fine-scale details in time and space. Our results confirm that the Granger causality at the finer spatio-temporal scales considerably outperforms the traditional approach in terms of an improved consistency between two resting-state scans of the same subject. To optimally estimate the Granger causality, the proposed theoretical framework is implemented through a combination of several approaches, such as dividing the optimal time window and estimating the parameters at the fine temporal and spatial scales. Taken together, our approach provides a novel and robust framework for estimating the Granger causality from fMRI, EEG, and other related data. PMID:23643924
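
    The Granger-causality quantity at the heart of the framework can be illustrated with a minimal pairwise estimate computed in sliding time windows; the VAR order, window length, and toy data below are illustrative assumptions and do not reproduce the authors' spatio-temporal estimator.

      import numpy as np

      def granger_xy(x, y, p=2):
          """ln(var_restricted / var_full) for the direction x -> y, VAR order p."""
          n = len(y)
          Y = y[p:]
          lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
          lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
          ones = np.ones((n - p, 1))
          Xr = np.hstack([ones, lags_y])            # restricted: own past only
          Xf = np.hstack([ones, lags_y, lags_x])    # full: own past + driver past
          res_r = Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]
          res_f = Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]
          return np.log(res_r.var() / res_f.var())

      def windowed_granger(x, y, win=200, step=50, p=2):
          """Granger causality x -> y evaluated in sliding windows."""
          return [granger_xy(x[s:s + win], y[s:s + win], p)
                  for s in range(0, len(x) - win + 1, step)]

      # Toy example: y is driven by lagged x plus noise
      rng = np.random.default_rng(1)
      x = rng.standard_normal(2000)
      y = np.zeros(2000)
      for t in range(2, 2000):
          y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 1] + 0.1 * rng.standard_normal()
      print(np.mean(windowed_granger(x, y)))        # clearly positive for x -> y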

  16. Markov Tracking for Agent Coordination

    NASA Technical Reports Server (NTRS)

    Washington, Richard; Lau, Sonie (Technical Monitor)

    1998-01-01

    Partially observable Markov decision processes (POMDPs) are an attractive representation of agent behavior, since they capture uncertainty in both the agent's state and its actions. However, finding an optimal policy for POMDPs in general is computationally difficult. In this paper we present Markov Tracking, a restricted problem of coordinating actions with an agent or process represented as a POMDP. Because the actions coordinate with the agent rather than influence its behavior, the optimal solution to this problem can be computed locally and quickly. We also demonstrate the use of the technique on sequential POMDPs, which can be used to model a behavior that follows a linear, acyclic trajectory through a series of states. By imposing a "windowing" restriction that restricts the number of possible alternatives considered at any moment to a fixed size, a coordinating action can be calculated in constant time, making this amenable to coordination with complex agents.
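
    A rough sketch of the windowing idea for a sequential (linear, acyclic) process, assuming it only moves forward: the belief is maintained over a fixed-size window of consecutive states, so each update, and the lookup of a coordinating action for the tracked state, costs constant time. The transition probability, observation model, and window-sliding rule below are illustrative assumptions, not the paper's formulation.

      import numpy as np

      class WindowedTracker:
          """Track a forward-progressing process over a fixed-size state window."""

          def __init__(self, n_states, window=5, p_advance=0.3):
              self.window = window
              self.p_advance = p_advance      # prob. of moving to the next state
              self.start = 0                  # first state covered by the window
              self.belief = np.zeros(window)
              self.belief[0] = 1.0
              self.n_states = n_states

          def update(self, obs_likelihood):
              """obs_likelihood: callable state_index -> P(observation | state)."""
              w, p = self.window, self.p_advance
              # Predict: each state either stays put or advances to its successor
              pred = (1 - p) * self.belief
              pred[1:] += p * self.belief[:-1]
              # Correct with the observation, restricted to the window
              like = np.array([obs_likelihood(self.start + k) for k in range(w)])
              post = pred * like
              post /= post.sum() if post.sum() > 0 else 1.0
              self.belief = post
              # Slide the window forward when mass reaches its leading edge
              while self.belief[-1] > 0.5 and self.start + w < self.n_states:
                  self.belief = np.append(self.belief[1:], 0.0)
                  self.start += 1
              return self.start + int(np.argmax(self.belief))   # most likely state

      # Usage: the coordinating action is simply looked up for the tracked state
      tracker = WindowedTracker(n_states=20, window=5)
      noisy_obs = lambda true_s: (lambda s: np.exp(-0.5 * (s - true_s) ** 2))
      for true_state in [0, 1, 1, 2, 3, 4, 5]:
          print(tracker.update(noisy_obs(true_state)))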

  17. Effective Energy Simulation and Optimal Design of Side-lit Buildings with Venetian Blinds

    NASA Astrophysics Data System (ADS)

    Cheng, Tian

    Venetian blinds are widely used in buildings to control the amount of incoming daylight for improving visual comfort and reducing heat gains in air-conditioning systems. Studies have shown that the proper design and operation of window systems could result in significant energy savings in both lighting and cooling. At present, however, there is no convenient computer tool that allows effective and efficient optimization of the envelope of side-lit buildings with blinds. Three computer tools widely used for this purpose, Adeline, DOE2 and EnergyPlus, have been experimentally examined in this study. Results indicate that the former two tools give unacceptable accuracy due to the unrealistic assumptions adopted, while the last one may generate large errors in certain conditions. Moreover, current computer tools have to conduct hourly energy simulations, which are not necessary for life-cycle energy analysis and optimal design, to provide annual cooling loads. This is not computationally efficient and is particularly unsuitable for optimally designing a building at the initial stage, because the impacts of many design variations and optional features have to be evaluated. A methodology is therefore developed for efficient and effective thermal and daylighting simulations and optimal design of buildings with blinds. Based on geometric optics and the radiosity method, a mathematical model is developed to reasonably simulate the daylighting behaviors of venetian blinds. Indoor illuminance at any reference point can be directly and efficiently computed. The new models have been validated against both experiments and simulations with Radiance. Validation results show that indoor illuminances computed by the new models agree well with the measured data, and the accuracy provided by them is equivalent to that of Radiance. The computational efficiency of the new models is much higher than that of Radiance as well as EnergyPlus. Two new methods are developed for the thermal simulation of buildings. A fast Fourier transform (FFT) method is presented to avoid the root-searching process in the inverse Laplace transform of multilayered walls. Generalized explicit FFT formulae for calculating the discrete Fourier transform (DFT) are developed for the first time. They can largely facilitate the implementation of FFT. The new method also provides a basis for generating the symbolic response factors. Validation simulations show that it can generate the response factors as accurate as the analytical solutions. The second method is for direct estimation of annual or seasonal cooling loads without the need for tedious hourly energy simulations. It is validated by hourly simulation results with DOE2. A symbolic long-term cooling load can then be created by combining the two methods with thermal network analysis. The symbolic long-term cooling load can keep the design parameters of interest as symbols, which is particularly useful for optimal design and sensitivity analysis. The methodology is applied to an office building in Hong Kong for the optimal design of the building envelope. Design variables such as window-to-wall ratio, building orientation, and glazing optical and thermal properties are included in the study. Results show that the selected design values could significantly impact the energy performance of windows, and the optimal design of side-lit buildings could greatly enhance energy savings.
The application example also demonstrates that the developed methodology significantly facilitates the optimal building design and sensitivity analysis, and leads to high computational efficiency.

  18. Acceleration of MCNP calculations for small pipes configurations by using Weight Windows Importance cards created by the SN-3D ATTILA

    NASA Astrophysics Data System (ADS)

    Castanier, Eric; Paterne, Loic; Louis, Céline

    2017-09-01

    In nuclear engineering, both time and precision have to be managed. In shielding design especially, one has to be accurate and efficient to reduce cost (shielding thickness optimization), and for this, 3D codes are used. In this paper, we examine whether the CADIS method can easily be applied to the shielding design of small pipes which go through large concrete walls. We assess the impact of the weight windows (WW) generated by the 3D deterministic code ATTILA versus WW generated directly by MCNP (an iterative and manual process). The comparison is based on the quality of the convergence (estimated relative error (σ), Variance of Variance (VOV) and Figure of Merit (FOM)), on time (computer time + modelling) and on the implementation effort for the engineer.
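
    For reference, the three convergence metrics used in the comparison can be tallied from per-history scores with a few lines of generic Monte Carlo bookkeeping (this is not MCNP or ATTILA code); the figure of merit follows the usual definition FOM = 1/(R^2 T), with R the estimated relative error and T the computing time.

      import numpy as np

      def tally_statistics(samples, run_time_minutes):
          """Relative error R, variance of the variance (VOV), and FOM = 1/(R^2 T)
          for a set of per-history tally scores (moment-based estimates)."""
          x = np.asarray(samples, dtype=float)
          n = len(x)
          mean = x.mean()
          var_mean = x.var(ddof=1) / n            # variance of the mean estimator
          R = np.sqrt(var_mean) / mean            # relative error
          d = x - mean
          vov = np.sum(d ** 4) / np.sum(d ** 2) ** 2 - 1.0 / n
          fom = 1.0 / (R ** 2 * run_time_minutes)
          return R, vov, fom

      scores = np.random.default_rng(2).exponential(1e-6, size=100000)
      print(tally_statistics(scores, run_time_minutes=30.0))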

  19. Self spectrum window method in wigner-ville distribution.

    PubMed

    Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun

    2005-01-01

    Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross-terms and auto-WVD terms in the integral kernel function are orthogonal. With the SSW algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal was used as a window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results. A satisfactory time-frequency distribution was obtained.

  20. Computational model for behavior shaping as an adaptive health intervention strategy.

    PubMed

    Berardi, Vincent; Carretero-González, Ricardo; Klepeis, Neil E; Ghanipoor Machiani, Sahar; Jahangiri, Arash; Bellettiere, John; Hovell, Melbourne

    2018-03-01

    Adaptive behavioral interventions that automatically adjust in real-time to participants' changing behavior, environmental contexts, and individual history are becoming more feasible as the use of real-time sensing technology expands. This development is expected to address shortcomings associated with traditional behavioral interventions, such as the reliance on imprecise intervention procedures and limited, short-lived effects. However, the adaptation strategies of such just-in-time adaptive interventions (JITAIs) often lack a theoretical foundation. Increasing the theoretical fidelity of a trial has been shown to increase effectiveness. This research explores the use of shaping, a well-known process from behavioral theory for engendering or maintaining a target behavior, as a JITAI adaptation strategy. A computational model of behavior dynamics and operant conditioning was modified to incorporate the construct of behavior shaping by adding the ability to vary, over time, the range of behaviors that were reinforced when emitted. Digital experiments were performed with this updated model for a range of parameters in order to identify the behavior shaping features that optimally generated target behavior. Narrowing the range of reinforced behaviors continuously in time led to better outcomes compared with a discrete narrowing of the reinforcement window. Rapid narrowing followed by more moderate decreases in window size was more effective in generating target behavior than the inverse scenario. The computational shaping model represents an effective tool for investigating JITAI adaptation strategies. Model parameters must now be translated from the digital domain to real-world experiments so that model findings can be validated.
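
    A toy illustration of the shaping construct described above (not the authors' computational model of behavior dynamics): an agent emits behaviors on a continuous scale, emissions falling inside a reinforcement window around the target pull the emission distribution toward them, and the window is narrowed according to different schedules.

      import numpy as np

      def run_shaping(schedule, target=1.0, start_width=1.0, steps=500, lr=0.2, seed=0):
          """Simulate shaping of a 1-D behavior toward `target`.

          schedule(u) -> window half-width multiplier for normalized time u in [0, 1].
          The agent's mean behavior moves toward reinforced emissions.
          """
          rng = np.random.default_rng(seed)
          mean, spread = 0.0, 0.3
          for t in range(steps):
              width = schedule(t / steps) * start_width
              behavior = rng.normal(mean, spread)
              # Reinforce only behaviors inside the current window around the target
              if abs(behavior - target) <= width:
                  mean += lr * (behavior - mean)
          return mean                  # final mean behavior, ideally near the target

      continuous = lambda u: 1.0 - 0.9 * u                  # smooth narrowing
      discrete = lambda u: 1.0 if u < 0.5 else 0.1          # single abrupt step
      rapid_then_slow = lambda u: max(0.1, 1.0 - 2.0 * u)   # fast early narrowing

      for name, sched in [("continuous", continuous), ("discrete", discrete),
                          ("rapid-then-moderate", rapid_then_slow)]:
          print(name, round(run_shaping(sched), 3))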

  1. Improvement of the user interface of multimedia applications by automatic display layout

    NASA Astrophysics Data System (ADS)

    Lueders, Peter; Ernst, Rolf

    1995-03-01

    Multimedia research has mainly focused on real-time data capturing and display combined with compression, storage and transmission of these data. However, there is another problem: selecting and arranging, in real time, a possibly large amount of data from multiple media on the computer screen together with the textual and graphical data of regular software. This problem has already been known from complex software systems, such as CASE and hypertext, and will be even more pronounced in multimedia systems. The aim of our work is to relieve the user of the burden of continuously selecting, placing and sizing windows and their contents, but without introducing solutions limited to only a few applications. We present an experimental system which controls the computer screen contents and layouts, directed by user- and/or tool-provided information filtering and prioritization. To be application independent, the screen layout is based on general layout optimization algorithms adapted from VLSI layout, which are controlled by application-specific objective functions. In this paper, we discuss the problems of a comprehensible screen layout including the stability of optical information in time, the information filtering, the layout algorithms and the adaptation of the objective function to include a specific application. We give some examples of different standard applications with layout problems ranging from hierarchical graph layout to window layout. The results show that automatic, tool-independent display layout will be possible in a real-time interactive environment.

  2. A Fast Estimation Algorithm for Two-Dimensional Gravity Data (GEOFAST),

    DTIC Science & Technology

    1979-11-15

    to a wide class of problems (Refs. 9 and 17). The major inhibitor to the widespread application of optimal gravity data processing is the severe... extends directly to two dimensions. Define the n1n2 x n1n2 diagonal window matrix W as the Kronecker product of two one-dimensional windows, W = W1 ⊗ W2 (B...). Inversion of Separable Matrices: Consider the linear system y = Tx (B.3-1), where T is block Toeplitz of dimension n1n2 x n1n2. Its frequency domain

  3. Progress Towards Highly Efficient Windows for Zero—Energy Buildings

    NASA Astrophysics Data System (ADS)

    Selkowitz, Stephen

    2008-09-01

    Energy efficient windows could save 4 quads/year, with an additional 1 quad/year gain from daylighting in commercial buildings. This corresponds to 13% of energy used by US buildings and 5% of all energy used by the US. The technical potential is thus very large and the economic potential is slowly becoming a reality. This paper describes the progress in energy efficient windows that employ low-emissivity glazing, electrochromic switchable coatings and other novel materials. Dynamic systems are being developed that use sensors and controls to modulate daylighting and shading contributions in response to occupancy, comfort and energy needs. Improving the energy performance of windows involves physics in a variety of applications: optics, heat transfer, materials science and applied engineering. Technical solutions must also be compatible with national policy, codes and standards, economics, business practice and investment, real and perceived risks, comfort, health, safety, productivity, amenities, and occupant preference and values. The challenge is to optimize energy performance by understanding and reinforcing the synergetic coupling between these many issues.

  4. Combining the Hanning windowed interpolated FFT in both directions

    NASA Astrophysics Data System (ADS)

    Chen, Kui Fu; Li, Yan Feng

    2008-06-01

    The interpolated fast Fourier transform (IFFT) has been proposed as a way to eliminate the picket fence effect (PFE) of the fast Fourier transform. The modulus-based IFFT, cited in most relevant references, makes use of only the 1st and 2nd highest spectral lines. An approach using three principal spectral lines is proposed. This new approach combines both directions of the complex-spectrum-based IFFT with the Hanning window. The optimal weight to minimize the estimation variance is established from the first-order Taylor series expansion of the noise interference. A numerical simulation is carried out, and the results are compared with the Cramer-Rao bound. It is demonstrated that the proposed approach has a lower estimation variance than the two-spectral-line approach. The improvement depends on the extent to which the sampling deviates from the coherent condition, the best case being a variance reduction of 2/7. However, it is also shown that the estimation variance of the windowed IFFT with the Hanning window is significantly higher than that without windowing.
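
    For context, the widely used two-spectral-line interpolation for a Hann-windowed spectrum estimates the fractional bin offset from the ratio of the two largest magnitude lines; the sketch below implements that standard estimator (not the three-line, complex-spectrum approach proposed in the paper), with an arbitrary test signal.

      import numpy as np

      def hann_two_line_freq(signal, fs):
          """Estimate a sinusoid frequency via Hann-windowed FFT interpolation."""
          n = len(signal)
          spec = np.abs(np.fft.rfft(signal * np.hanning(n)))
          k = int(np.argmax(spec[1:-1])) + 1            # highest interior line
          # Pick the larger neighbour and form the magnitude ratio
          if spec[k + 1] >= spec[k - 1]:
              alpha = spec[k + 1] / spec[k]
              delta = (2 * alpha - 1) / (alpha + 1)     # offset toward bin k+1
          else:
              alpha = spec[k - 1] / spec[k]
              delta = -(2 * alpha - 1) / (alpha + 1)    # offset toward bin k-1
          return (k + delta) * fs / n

      fs, f0 = 1000.0, 123.4567
      t = np.arange(2048) / fs
      x = np.sin(2 * np.pi * f0 * t) + 0.01 * np.random.default_rng(3).standard_normal(2048)
      print(hann_two_line_freq(x, fs))   # close to 123.4567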

  5. Developmental windows of breast cancer risk provide opportunities for targeted chemoprevention

    PubMed Central

    Martinson, Holly A.; Lyons, Traci R.; Giles, Erin D.; Borges, Virginia F.; Schedin, Pepper

    2014-01-01

    The magnitude of the breast cancer problem implores researchers to aggressively investigate prevention strategies. However, several barriers currently reduce the feasibility of breast cancer prevention. These barriers include the inability to accurately predict future breast cancer diagnosis at the individual level, the need for improved understanding of when to implement interventions, uncertainty with respect to optimal duration of treatment, and negative side effects associated with currently approved chemoprevention therapies. Nonetheless, the unique biology of the mammary gland, with its postnatal development and conditional terminal differentiation, may permit the resolution of many of these barriers. Specifically, lifecycle-specific windows of breast cancer risk have been identified that may be amenable to risk-reducing strategies. Here, we argue for prevention research focused on two of these lifecycle windows of risk: postpartum mammary gland involution and peri-menopause. We provide evidence that these windows are highly amenable to targeted, limited-duration treatments. Such approaches could result in the prevention of postpartum and postmenopausal breast cancers, respectively. PMID:23664839

  6. Predicting the optimal process window for the coating of single-crystalline organic films with mobilities exceeding 7 cm2/Vs.

    NASA Astrophysics Data System (ADS)

    Janneck, Robby; Vercesi, Federico; Heremans, Paul; Genoe, Jan; Rolin, Cedric

    2016-09-01

    Organic thin film transistors (OTFTs) based on single-crystalline thin films of organic semiconductors have seen considerable development in recent years. The most successful methods for the fabrication of single-crystalline films are solution-based meniscus-guided coating techniques such as dip-coating, solution shearing, or zone casting. These upscalable methods enable rapid and efficient film formation without additional processing steps. The single-crystalline film quality is strongly dependent on solvent choice, substrate temperature, and coating speed. So far, however, process optimization has been conducted by trial-and-error methods, involving, for example, the variation of coating speeds over several orders of magnitude. Through a systematic study of solvent phase change dynamics in the meniscus region, we develop a theoretical framework that links the optimal coating speed to the solvent choice and the substrate temperature. In this way, we can accurately predict an optimal processing window, enabling fast process optimization. Our approach is verified through systematic OTFT fabrication based on films grown with different semiconductors, solvents, and substrate temperatures. The use of the best predicted coating speeds delivers state-of-the-art devices. In the case of C8BTBT, OTFTs show well-behaved characteristics with mobilities up to 7 cm2/Vs and onset voltages close to 0 V. Our approach also accounts well for the optimal recipes published in the literature. This route considerably accelerates parameter screening for all meniscus-guided coating techniques and unveils the physics of single-crystalline film formation.

  7. Interactive orbital proximity operations planning system instruction and training guide

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Ellis, Stephen R.

    1994-01-01

    This guide instructs users in the operation of a Proximity Operations Planning System. This system uses an interactive graphical method for planning fuel-efficient rendezvous trajectories in the multi-spacecraft environment of the space station and allows the operator to compose a multi-burn transfer trajectory between the initial chaser and target orbital trajectories. The available task time (window) of the mission is predetermined and the maneuver is subject to various operational constraints, such as departure, arrival, spatial, plume impingement, and en route passage constraints. The maneuvers are described in terms of the relative motion experienced in a space station centered coordinate system. Both in-orbital-plane as well as out-of-orbital-plane maneuvering is considered. A number of visual optimization aids are used for assisting the operator in reaching fuel-efficient solutions. These optimization aids are based on the Primer Vector theory. The visual feedback of trajectory shapes, operational constraints, and optimization functions, provided by user-transparent and continuously active background computations, allows the operator to make fast, iterative design changes that rapidly converge to fuel-efficient solutions. The planning tool is an example of operator-assisted optimization of nonlinear cost functions.

  8. An optimization framework for workplace charging strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yongxi; Zhou, Yan

    2015-03-01

    The workplace charging (WPC) has been recently recognized as the most important secondary charging point next to residential charging for plug-in electric vehicles (PEVs). The current WPC practice is spontaneous and grants every PEV a designated charger, which may not be practical or economic when there are a large number of PEVs present at workplace. This study is the first research undertaken that develops an optimization framework for WPC strategies to satisfy all charging demand while explicitly addressing different eligible levels of charging technology and employees' demographic distributions. The optimization model is to minimize the lifetime cost of equipment, installations, and operations, and is formulated as an integer program. We demonstrate the applicability of the model using numerical examples based on national average data. The results indicate that the proposed optimization model can reduce the total cost of running a WPC system by up to 70% compared to the current practice. The WPC strategies are sensitive to the time windows and installation costs, and dominated by the PEV population size. The WPC has also been identified as an alternative sustainable transportation program to the public transit subsidy programs for both economic and environmental advantages.
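
    To make the flavour of such a formulation concrete, a stripped-down integer program can be written with a generic MILP library; the charger technologies, cost figures, dwell-time windows, and capacity constraint below are illustrative placeholders rather than the study's model, which also accounts for installation and lifetime operating costs in more detail.

      import pulp

      # Illustrative data (placeholders): two charger technologies and two
      # employee groups with different dwell-time windows at the workplace.
      techs = {"L1": {"cost_per_port": 2100, "kw": 1.9},
               "L2": {"cost_per_port": 6500, "kw": 6.6}}
      groups = {"full_day": {"vehicles": 30, "dwell_h": 9},
                "short_stay": {"vehicles": 10, "dwell_h": 4}}
      DAILY_KWH = 10         # assumed average daily charging demand per vehicle
      WORKDAY_H = 10         # hours a port is usable per day (placeholder)

      prob = pulp.LpProblem("workplace_charging", pulp.LpMinimize)
      ports = {t: pulp.LpVariable(f"ports_{t}", lowBound=0, cat="Integer") for t in techs}
      assign = {(g, t): pulp.LpVariable(f"assign_{g}_{t}", lowBound=0, cat="Integer")
                for g in groups for t in techs}

      # Objective: minimize equipment plus installation cost (lumped per port here)
      prob += pulp.lpSum(techs[t]["cost_per_port"] * ports[t] for t in techs)

      # Every vehicle must be assigned to some charger technology
      for g in groups:
          prob += pulp.lpSum(assign[g, t] for t in techs) == groups[g]["vehicles"]

      # A vehicle may only use a technology fast enough to charge within its dwell window
      for g in groups:
          for t in techs:
              if DAILY_KWH / techs[t]["kw"] > groups[g]["dwell_h"]:
                  prob += assign[g, t] == 0

      # Capacity: total daily charging hours on each technology must fit on its ports
      for t in techs:
          prob += pulp.lpSum(assign[g, t] * (DAILY_KWH / techs[t]["kw"]) for g in groups) \
                  <= WORKDAY_H * ports[t]

      prob.solve(pulp.PULP_CBC_CMD(msg=0))
      print({t: int(ports[t].value()) for t in techs})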

  9. Sparsely sampling the sky: Regular vs. random sampling

    NASA Astrophysics Data System (ADS)

    Paykari, P.; Pires, S.; Starck, J.-L.; Jaffe, A. H.

    2015-09-01

    Aims: The next generation of galaxy surveys, aiming to observe millions of galaxies, are expensive both in time and money. This raises questions regarding the optimal investment of this time and money for future surveys. In a previous work, we have shown that a sparse sampling strategy could be a powerful substitute for the - usually favoured - contiguous observation of the sky. In our previous paper, regular sparse sampling was investigated, where the sparse observed patches were regularly distributed on the sky. The regularity of the mask introduces a periodic pattern in the window function, which induces periodic correlations at specific scales. Methods: In this paper, we use a Bayesian experimental design to investigate a "random" sparse sampling approach, where the observed patches are randomly distributed over the total sparsely sampled area. Results: We find that in this setting, the induced correlation is evenly distributed amongst all scales as there is no preferred scale in the window function. Conclusions: This is desirable when we are interested in any specific scale in the galaxy power spectrum, such as the matter-radiation equality scale. As the figure of merit shows, however, there is no preference between regular and random sampling to constrain the overall galaxy power spectrum and the cosmological parameters.
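
    The contrast between the two masking strategies is easy to reproduce numerically: the sketch below builds a one-dimensional regular mask and a random mask with the same number of observed patches and compares their window-function power spectra. The grid size and patch geometry are arbitrary choices for illustration, not the survey configuration studied in the paper.

      import numpy as np

      n, patch, n_patches = 4096, 32, 32        # grid size, patch width, patch count
      rng = np.random.default_rng(4)

      # Regular mask: equally spaced patches; random mask: same patches placed at
      # random starts (possible overlaps are ignored in this toy example)
      regular = np.zeros(n)
      for start in np.linspace(0, n - patch, n_patches).astype(int):
          regular[start:start + patch] = 1.0
      random_mask = np.zeros(n)
      for start in rng.choice(n - patch, n_patches, replace=False):
          random_mask[start:start + patch] = 1.0

      def window_power(mask):
          w = np.abs(np.fft.rfft(mask)) ** 2
          return w / w[0]                        # normalise by the zero mode

      p_reg, p_rand = window_power(regular), window_power(random_mask)
      # The regular mask shows strong harmonics at multiples of the patch spacing,
      # whereas the random mask spreads that power across all scales.
      k_spacing = n_patches                      # harmonic index of the regular spacing
      print(p_reg[k_spacing], p_rand[k_spacing])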

  10. Low-Rank Matrix Recovery Approach for Clutter Rejection in Real-Time IR-UWB Radar-Based Moving Target Detection

    PubMed Central

    Sabushimike, Donatien; Na, Seung You; Kim, Jin Young; Bui, Ngoc Nam; Seo, Kyung Sik; Kim, Gil Gyeom

    2016-01-01

    The detection of a moving target using an IR-UWB Radar involves the core task of separating the waves reflected by the static background and by the moving target. This paper investigates the capacity of the low-rank and sparse matrix decomposition approach to separate the background and the foreground in the context of UWB Radar-based moving target detection. Robust PCA models are criticized for being batch-data-oriented, which makes them inconvenient in realistic environments where frames need to be processed as they are recorded in real time. In this paper, a novel method based on overlapping-windows processing is proposed to cope with online processing. The method consists of processing a small batch of frames which is continually updated without changing its size as new frames are captured. We prove that RPCA (via its Inexact Augmented Lagrange Multiplier (IALM) model) can successfully separate the two subspaces, which enhances the accuracy of target detection. The overlapping-windows processing method converges to the same optimal solution as its batch counterpart (i.e., processing batched data with RPCA), and both methods prove the robustness and efficiency of RPCA over the classic PCA and the commonly used exponential averaging method. PMID:27598159
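
    A compact sketch of the overlapping-window idea: keep a fixed-size buffer of radar frames, update it as each new frame arrives, and re-run a basic inexact augmented Lagrange multiplier RPCA on the buffer. Matrix sizes, thresholds, and the stopping rule are illustrative defaults, and this is not the authors' implementation.

      import numpy as np

      def soft_threshold(x, tau):
          return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

      def rpca_ialm(D, max_iter=200, tol=1e-6):
          """Basic inexact ALM robust PCA: D = L (low rank) + S (sparse)."""
          m, n = D.shape
          lam = 1.0 / np.sqrt(max(m, n))
          norm_two = np.linalg.norm(D, 2)
          Y = D / max(norm_two, np.max(np.abs(D)) / lam)
          mu, rho = 1.25 / norm_two, 1.5
          L = np.zeros_like(D); S = np.zeros_like(D)
          for _ in range(max_iter):
              # Low-rank update: singular value thresholding
              U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
              L = U @ np.diag(soft_threshold(sig, 1.0 / mu)) @ Vt
              # Sparse update: entrywise soft thresholding
              S = soft_threshold(D - L + Y / mu, lam / mu)
              Z = D - L - S
              Y += mu * Z
              mu *= rho
              if np.linalg.norm(Z, "fro") / np.linalg.norm(D, "fro") < tol:
                  break
          return L, S

      def overlapping_window_detection(frame_stream, window=20):
          """Keep a fixed-size buffer of frames (columns); on each new frame, drop
          the oldest, append the newest, and re-separate background/foreground."""
          buffer = []
          for frame in frame_stream:
              buffer.append(frame)
              if len(buffer) < window:
                  continue
              D = np.column_stack(buffer)          # samples x frames
              L, S = rpca_ialm(D)
              yield S[:, -1]                       # sparse part of the newest frame
              buffer.pop(0)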

  11. Attenuation correction for the large non-human primate brain imaging using microPET.

    PubMed

    Naidoo-Variawa, S; Lehnert, W; Kassiou, M; Banati, R; Meikle, S R

    2010-04-21

    Assessment of the biodistribution and pharmacokinetics of radiopharmaceuticals in vivo is often performed on animal models of human disease prior to their use in humans. The baboon brain is physiologically and neuro-anatomically similar to the human brain and is therefore a suitable model for evaluating novel CNS radioligands. We previously demonstrated the feasibility of performing baboon brain imaging on a dedicated small animal PET scanner provided that the data are accurately corrected for degrading physical effects such as photon attenuation in the body. In this study, we investigated factors affecting the accuracy and reliability of alternative attenuation correction strategies when imaging the brain of a large non-human primate (papio hamadryas) using the microPET Focus 220 animal scanner. For measured attenuation correction, the best bias versus noise performance was achieved using a (57)Co transmission point source with a 4% energy window. The optimal energy window for a (68)Ge transmission source operating in singles acquisition mode was 20%, independent of the source strength, providing bias-noise performance almost as good as for (57)Co. For both transmission sources, doubling the acquisition time had minimal impact on the bias-noise trade-off for corrected emission images, despite observable improvements in reconstructed attenuation values. In a [(18)F]FDG brain scan of a female baboon, both measured attenuation correction strategies achieved good results and similar SNR, while segmented attenuation correction (based on uncorrected emission images) resulted in appreciable regional bias in deep grey matter structures and the skull. We conclude that measured attenuation correction using a single pass (57)Co (4% energy window) or (68)Ge (20% window) transmission scan achieves an excellent trade-off between bias and propagation of noise when imaging the large non-human primate brain with a microPET scanner.

  12. Attenuation correction for the large non-human primate brain imaging using microPET

    NASA Astrophysics Data System (ADS)

    Naidoo-Variawa, S.; Lehnert, W.; Kassiou, M.; Banati, R.; Meikle, S. R.

    2010-04-01

    Assessment of the biodistribution and pharmacokinetics of radiopharmaceuticals in vivo is often performed on animal models of human disease prior to their use in humans. The baboon brain is physiologically and neuro-anatomically similar to the human brain and is therefore a suitable model for evaluating novel CNS radioligands. We previously demonstrated the feasibility of performing baboon brain imaging on a dedicated small animal PET scanner provided that the data are accurately corrected for degrading physical effects such as photon attenuation in the body. In this study, we investigated factors affecting the accuracy and reliability of alternative attenuation correction strategies when imaging the brain of a large non-human primate (papio hamadryas) using the microPET Focus 220 animal scanner. For measured attenuation correction, the best bias versus noise performance was achieved using a 57Co transmission point source with a 4% energy window. The optimal energy window for a 68Ge transmission source operating in singles acquisition mode was 20%, independent of the source strength, providing bias-noise performance almost as good as for 57Co. For both transmission sources, doubling the acquisition time had minimal impact on the bias-noise trade-off for corrected emission images, despite observable improvements in reconstructed attenuation values. In a [18F]FDG brain scan of a female baboon, both measured attenuation correction strategies achieved good results and similar SNR, while segmented attenuation correction (based on uncorrected emission images) resulted in appreciable regional bias in deep grey matter structures and the skull. We conclude that measured attenuation correction using a single pass 57Co (4% energy window) or 68Ge (20% window) transmission scan achieves an excellent trade-off between bias and propagation of noise when imaging the large non-human primate brain with a microPET scanner.

  13. Informed baseline subtraction of proteomic mass spectrometry data aided by a novel sliding window algorithm.

    PubMed

    Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J

    2016-01-01

    Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of magnitude on many simulated datasets. The advantages of the proposed pipeline include informed and data specific input arguments for baseline subtraction methods, the avoidance of time-intensive and subjective piecewise baseline subtraction, and the ability to automate baseline subtraction completely. Moreover, individual steps can be adopted as stand-alone routines.
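
    For orientation, the general shape of a sliding-window baseline estimate is easy to sketch: a rolling minimum followed by smoothing, subtracted from the spectrum. The window sizes and smoothing choice below are illustrative, and this naive version is exactly the kind of brute-force sliding window the paper's 'continuous' line segment algorithm is designed to outperform.

      import numpy as np

      def rolling_minimum(y, half_width):
          """Naive sliding-window minimum (quadratic-style cost the paper avoids)."""
          n = len(y)
          return np.array([y[max(0, i - half_width):i + half_width + 1].min()
                           for i in range(n)])

      def baseline_subtract(intensities, half_width=150, smooth=51):
          """Estimate and remove the baseline of a mass spectrum."""
          y = np.asarray(intensities, dtype=float)
          base = rolling_minimum(y, half_width)
          # Smooth the piecewise-constant minimum with a simple moving average
          kernel = np.ones(smooth) / smooth
          base = np.convolve(base, kernel, mode="same")
          return np.maximum(y - base, 0.0), base

      # Synthetic spectrum: Gaussian peaks on a slowly decaying baseline
      x = np.arange(20000)
      spectrum = 500 * np.exp(-x / 8000) + 50
      for centre in (3000, 7000, 12000):
          spectrum += 300 * np.exp(-0.5 * ((x - centre) / 20.0) ** 2)
      corrected, baseline = baseline_subtract(spectrum)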

  14. Method for the rapid synthesis of large quantities of metal oxide nanowires at low temperatures

    DOEpatents

    Sunkara, Mahendra Kumar [Louisville, KY; Vaddiraju, Sreeram [Mountain View, CA; Mozetic, Miran [Ljubljan, SI; Cvelbar, Uros [Idrija, SI

    2009-09-22

    A process for the rapid synthesis of metal oxide nanoparticles at low temperatures and methods which facilitate the fabrication of long metal oxide nanowires. The method is based on treatment of metals with oxygen plasma. Using oxygen plasma at low temperatures allows for rapid growth unlike other synthesis methods where nanomaterials take a long time to grow. Density of neutral oxygen atoms in plasma is a controlling factor for the yield of nanowires. The oxygen atom density window differs for different materials. By selecting the optimal oxygen atom density for various materials the yield can be maximized for nanowire synthesis of the metal.

  15. Simultsonic: A Simulation Tool for Ultrasonic Inspection

    NASA Astrophysics Data System (ADS)

    Krishnamurthy, Adarsh; Karthikeyan, Soumya; Krishnamurthy, C. V.; Balasubramaniam, Krishnan

    2006-03-01

    A simulation program, SIMULTSONIC, is under development at CNDE to help determine and/or optimize ultrasonic probe locations for inspection of complex components. SIMULTSONIC provides a ray-trace based assessment initially, followed by a displacement or pressure field-based assessment for user-specified probe positions and a user-selected component. Immersion and contact modes of inspection are available in SIMULTSONIC. The code, written in Visual C++ and operating in the Microsoft Windows environment, provides an interactive user interface. In this paper, the application of SIMULTSONIC to the inspection of very thin-walled pipes (with 450 µm wall thickness) is described. A ray-trace based assessment was done using SIMULTSONIC to determine the standoff distance and the angle of oblique incidence for an immersion mode focused transducer. A 3-cycle Hanning window pulse was chosen for the simulations. Experiments were carried out to validate the simulations. The A-scans and the associated B-scan images obtained through simulations show good correlation with experimental results, both in the arrival time of the signal and in the signal amplitudes. The scope of SIMULTSONIC to deal with parametrically represented surfaces will also be discussed.

  16. Decreasing-Rate Pruning Optimizes the Construction of Efficient and Robust Distributed Networks.

    PubMed

    Navlakha, Saket; Barth, Alison L; Bar-Joseph, Ziv

    2015-07-01

    Robust, efficient, and low-cost networks are advantageous in both biological and engineered systems. During neural network development in the brain, synapses are massively over-produced and then pruned-back over time. This strategy is not commonly used when designing engineered networks, since adding connections that will soon be removed is considered wasteful. Here, we show that for large distributed routing networks, network function is markedly enhanced by hyper-connectivity followed by aggressive pruning and that the global rate of pruning, a developmental parameter not previously studied by experimentalists, plays a critical role in optimizing network structure. We first used high-throughput image analysis techniques to quantify the rate of pruning in the mammalian neocortex across a broad developmental time window and found that the rate is decreasing over time. Based on these results, we analyzed a model of computational routing networks and show using both theoretical analysis and simulations that decreasing rates lead to more robust and efficient networks compared to other rates. We also present an application of this strategy to improve the distributed design of airline networks. Thus, inspiration from neural network formation suggests effective ways to design distributed networks across several domains.
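
    A toy version of the pruning experiment, not the authors' simulation code: start from a hyper-connected random graph, repeatedly route random source-target demands, and prune the least-used edges while preserving connectivity, removing many edges in early rounds and fewer later (a decreasing pruning rate). The graph size, demand model, and pruning schedule below are illustrative assumptions.

      import random
      import networkx as nx

      def prune_network(n_nodes=60, p_init=0.5, rounds=10, demands_per_round=300, seed=0):
          random.seed(seed)
          G = nx.gnp_random_graph(n_nodes, p_init, seed=seed)   # hyper-connected start
          target_edges = 2 * n_nodes                            # rough final edge budget
          for _ in range(rounds):
              # Decreasing pruning rate: remove half of the current excess each round
              to_remove = max(1, int((G.number_of_edges() - target_edges) * 0.5))
              usage = {e: 0 for e in G.edges()}
              for _ in range(demands_per_round):
                  s, t = random.sample(list(G.nodes()), 2)
                  path = nx.shortest_path(G, s, t)
                  for u, v in zip(path, path[1:]):
                      e = (u, v) if (u, v) in usage else (v, u)
                      usage[e] += 1
              # Prune the least-used edges, keeping the graph connected
              for e, _ in sorted(usage.items(), key=lambda kv: kv[1])[:to_remove]:
                  G.remove_edge(*e)
                  if not nx.is_connected(G):
                      G.add_edge(*e)
          return G

      G = prune_network()
      print(G.number_of_edges(), nx.average_shortest_path_length(G))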

  17. Decreasing-Rate Pruning Optimizes the Construction of Efficient and Robust Distributed Networks

    PubMed Central

    Navlakha, Saket; Barth, Alison L.; Bar-Joseph, Ziv

    2015-01-01

    Robust, efficient, and low-cost networks are advantageous in both biological and engineered systems. During neural network development in the brain, synapses are massively over-produced and then pruned-back over time. This strategy is not commonly used when designing engineered networks, since adding connections that will soon be removed is considered wasteful. Here, we show that for large distributed routing networks, network function is markedly enhanced by hyper-connectivity followed by aggressive pruning and that the global rate of pruning, a developmental parameter not previously studied by experimentalists, plays a critical role in optimizing network structure. We first used high-throughput image analysis techniques to quantify the rate of pruning in the mammalian neocortex across a broad developmental time window and found that the rate is decreasing over time. Based on these results, we analyzed a model of computational routing networks and show using both theoretical analysis and simulations that decreasing rates lead to more robust and efficient networks compared to other rates. We also present an application of this strategy to improve the distributed design of airline networks. Thus, inspiration from neural network formation suggests effective ways to design distributed networks across several domains. PMID:26217933

  18. Timing anthropogenic stressors to mitigate their impact on marine ecosystem resilience.

    PubMed

    Wu, Paul Pao-Yen; Mengersen, Kerrie; McMahon, Kathryn; Kendrick, Gary A; Chartrand, Kathryn; York, Paul H; Rasheed, Michael A; Caley, M Julian

    2017-11-02

    Better mitigation of anthropogenic stressors on marine ecosystems is urgently needed to address increasing biodiversity losses worldwide. We explore opportunities for stressor mitigation using whole-of-systems modelling of ecological resilience, accounting for complex interactions between stressors, their timing and duration, background environmental conditions and biological processes. We then search for ecological windows, times when stressors minimally impact ecological resilience, defined here as risk, recovery and resistance. We show for 28 globally distributed seagrass meadows that stressor scheduling that exploits ecological windows for dredging campaigns can achieve up to a fourfold reduction in recovery time and 35% reduction in extinction risk. Although the timing and length of windows vary among sites to some degree, global trends indicate favourable windows in autumn and winter. Our results demonstrate that resilience is dynamic with respect to space, time and stressors, varying most strongly with: (i) the life history of the seagrass genus and (ii) the duration and timing of the impacting stress.

  19. Platform for Postprocessing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don

    2008-01-01

    Taking advantage of the similarities that exist among all waveform-based non-destructive evaluation (NDE) methods, a common software platform has been developed containing multiple- signal and image-processing techniques for waveforms and images. The NASA NDE Signal and Image Processing software has been developed using the latest versions of LabVIEW, and its associated Advanced Signal Processing and Vision Toolkits. The software is useable on a PC with Windows XP and Windows Vista. The software has been designed with a commercial grade interface in which two main windows, Waveform Window and Image Window, are displayed if the user chooses a waveform file to display. Within these two main windows, most actions are chosen through logically conceived run-time menus. The Waveform Window has plots for both the raw time-domain waves and their frequency- domain transformations (fast Fourier transform and power spectral density). The Image Window shows the C-scan image formed from information of the time-domain waveform (such as peak amplitude) or its frequency-domain transformation at each scan location. The user also has the ability to open an image, or series of images, or a simple set of X-Y paired data set in text format. Each of the Waveform and Image Windows contains menus from which to perform many user actions. An option exists to use raw waves obtained directly from scan, or waves after deconvolution if system wave response is provided. Two types of deconvolution, time-based subtraction or inverse-filter, can be performed to arrive at a deconvolved wave set. Additionally, the menu on the Waveform Window allows preprocessing of waveforms prior to image formation, scaling and display of waveforms, formation of different types of images (including non-standard types such as velocity), gating of portions of waves prior to image formation, and several other miscellaneous and specialized operations. The menu available on the Image Window allows many further image processing and analysis operations, some of which are found in commercially-available image-processing software programs (such as Adobe Photoshop), and some that are not (removing outliers, Bscan information, region-of-interest analysis, line profiles, and precision feature measurements).

  20. 75 FR 74687 - Takes of Marine Mammals Incidental to Specified Activities; Construction of the Parsons Slough...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-01

    .... Actual pile driving time during this work window will depend on a number of factors, such as sediments... period beginning in November 2010, and ending in February 2011. This work window was selected to coincide.... The work window also coincides with the USFWS' required construction work window to avoid the peak...

  1. A Window-Washing Challenge

    ERIC Educational Resources Information Center

    Roman, Harry T.

    2010-01-01

    Skyscrapers sure do have a lot of windows, and these windows are cleaned and checked regularly. All this takes time, money, and puts workers at potential risk. Might there be a better way to do it? In this article, the author discusses a window-washing challenge and describes how students can tackle this task, pick up the challenge, and creatively…

  2. Centroid estimation for a Shack-Hartmann wavefront sensor based on stream processing.

    PubMed

    Kong, Fanpeng; Polo, Manuel Cegarra; Lambert, Andrew

    2017-08-10

    When the center of gravity is used to estimate the centroid of the spot in a Shack-Hartmann wavefront sensor, the measurement is corrupted by photon and detector noise. Parameters like the window size often require careful optimization to balance the noise error, dynamic range, and linearity of the response coefficient under different photon flux. The method also needs to be substituted by the correlation method for extended sources. We propose a centroid estimator based on stream processing, where the center of gravity calculation window floats with the incoming pixels from the detector. In comparison with conventional methods, we show that the proposed estimator simplifies the choice of optimized parameters, provides a unit linear coefficient response, and reduces the influence of background and noise. It is shown that the stream-based centroid estimator also works well for limited-size extended sources. A hardware implementation of the proposed estimator is discussed.
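
    The underlying center-of-gravity computation can be accumulated as pixels stream off the detector rather than after buffering a full subaperture frame; the sketch below is a simplified software analogue of that idea (the paper targets a hardware pipeline with a floating window), with a fixed window and a naive background term as illustrative assumptions.

      import numpy as np

      class StreamingCentroid:
          """Accumulate centre-of-gravity sums for one subaperture window as pixels
          arrive row by row, instead of buffering the full frame first."""

          def __init__(self, background=0.0):
              self.background = background
              self.reset()

          def reset(self):
              self.sum_i = 0.0        # total intensity
              self.sum_ix = 0.0       # intensity-weighted column sum
              self.sum_iy = 0.0       # intensity-weighted row sum

          def push(self, row, col, value):
              v = max(value - self.background, 0.0)   # simple background removal
              self.sum_i += v
              self.sum_ix += v * col
              self.sum_iy += v * row

          def centroid(self):
              if self.sum_i == 0:
                  return None
              return self.sum_ix / self.sum_i, self.sum_iy / self.sum_i

      # Usage with a synthetic 16x16 spot centred near (9.3, 6.7)
      rng = np.random.default_rng(5)
      acc = StreamingCentroid(background=2.0)
      for r in range(16):
          for c in range(16):
              signal = 100 * np.exp(-0.5 * (((c - 9.3) / 1.5) ** 2 + ((r - 6.7) / 1.5) ** 2))
              acc.push(r, c, signal + 2.0 + rng.normal(0, 0.5))
      print(acc.centroid())    # approximately (9.3, 6.7)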

  3. Study, optimization, and design of a laser heat engine. [for satellite applications

    NASA Technical Reports Server (NTRS)

    Taussig, R. T.; Cassady, P. E.; Zumdieck, J. F.

    1978-01-01

    Laser heat engine concepts, proposed for satellite applications, are analyzed to determine which engine concept best meets the requirements of high efficiency (50 percent or better), continuous operation in space using near-term technology. The analysis of laser heat engines includes the thermodynamic cycles, engine design, laser power sources, collector/concentrator optics, receiving windows, absorbers, working fluids, electricity generation, and heat rejection. Specific engine concepts, optimized according to thermal efficiency, are rated by their technological availability and scaling to higher powers. A near-term experimental demonstration of the laser heat engine concept appears feasible utilizing an Otto cycle powered by CO2 laser radiation coupled into the engine through a diamond window. Higher cycle temperatures, higher efficiencies, and scalability to larger sizes appear to be achievable from a laser heat engine design based on the Brayton cycle and powered by a CO laser.

  4. Design and implementation of laser target simulator in hardware-in-the-loop simulation system based on LabWindows/CVI and RTX

    NASA Astrophysics Data System (ADS)

    Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong

    2016-11-01

    To satisfy requirements for real-time performance and generality, a laser target simulator for a semi-physical (hardware-in-the-loop) simulation system based on an RTX + LabWindows/CVI platform is proposed in this paper. Compared with the upper/lower-computer simulation platform architecture used in most current real-time systems, this system has better maintainability and portability. The system runs on the Windows platform and uses the Windows RTX real-time extension subsystem, combined with a reflective memory network, to ensure real-time performance and to complete real-time tasks such as calculating the simulation model, transmitting the simulation data, and maintaining real-time communication. The real-time tasks of the simulation system run under the RTSS process. At the same time, LabWindows/CVI is used to build a graphical interface and to handle the non-real-time tasks of the simulation, such as man-machine interaction and the display and storage of simulation data, which run under a Win32 process. Through the design of RTX shared memory and a task scheduling algorithm, data interaction between the real-time RTSS process and the non-real-time Win32 process is achieved. The experimental results show that this system has strong real-time performance, high stability, and high simulation accuracy, and it also provides good human-computer interaction.

  5. A robust and high precision optimal explicit guidance scheme for solid motor propelled launch vehicles with thrust and drag uncertainty

    NASA Astrophysics Data System (ADS)

    Maity, Arnab; Padhi, Radhakant; Mallaram, Sanjeev; Mallikarjuna Rao, G.; Manickavasagam, M.

    2016-10-01

    A new nonlinear optimal and explicit guidance law is presented in this paper for launch vehicles propelled by solid motors. It can ensure very high terminal precision despite not having exact knowledge of the thrust-time curve a priori. The work was motivated by its use for a carrier launch vehicle in a hypersonic mission, which demands an extremely narrow terminal accuracy window for successful initiation of operation of the hypersonic vehicle. The proposed explicit guidance scheme, which computes the optimal guidance command online, ensures the required stringent final conditions with high precision at the injection point. A key feature of the proposed guidance law is an innovative extension of the recently developed model predictive static programming guidance with flexible final time. A penalty function approach is also followed to meet the input and output inequality constraints throughout the vehicle trajectory. The guidance law has been successfully validated in nonlinear six-degree-of-freedom simulation studies, with an inner-loop autopilot designed as well, which significantly enhances confidence in its usefulness. In addition to excellent nominal results, the proposed guidance has been found to be robust in perturbed cases.

  6. Rapidity window dependences of higher order cumulants and diffusion master equation

    NASA Astrophysics Data System (ADS)

    Kitazawa, Masakiyo

    2015-10-01

    We study the rapidity window dependences of higher order cumulants of conserved charges observed in relativistic heavy ion collisions. The time evolution and the rapidity window dependence of the non-Gaussian fluctuations are described by the diffusion master equation. Analytic formulas for the time evolution of cumulants in a rapidity window are obtained for arbitrary initial conditions. We show that the rapidity window dependences of the non-Gaussian cumulants have characteristic structures reflecting the non-equilibrium property of the fluctuations, which can be observed in relativistic heavy ion collisions with present detectors. It is argued that various information on the thermal and transport properties of the hot medium can be revealed experimentally by studying the rapidity window dependences, especially through the combined use of the higher order cumulants. Formulas for higher order cumulants of a probability distribution composed of sub-probabilities, which are useful for various studies of non-Gaussian cumulants, are also presented.
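
    As a purely illustrative companion to the analytic results above (not a reproduction of them), the sketch below estimates the first four cumulants of the net charge counted inside a rapidity window from simulated events; the toy event generator and window sizes are assumptions.

        # Toy estimate of higher-order cumulants in a rapidity window from simulated events.
        import numpy as np
        from scipy.stats import kstat   # k-statistics: unbiased cumulant estimators

        rng = np.random.default_rng(0)
        n_events, n_particles = 100_000, 50
        rapidities = rng.uniform(-3.0, 3.0, size=(n_events, n_particles))
        charges = rng.choice([-1, 1], size=(n_events, n_particles))

        def window_cumulants(delta_y, order=4):
            """Cumulants C1..C_order of the net charge inside |y| < delta_y / 2."""
            inside = np.abs(rapidities) < delta_y / 2.0
            net_charge = (charges * inside).sum(axis=1)
            return [kstat(net_charge, n=k) for k in range(1, order + 1)]

        for dy in (0.5, 1.0, 2.0):
            print(dy, window_cumulants(dy))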

  7. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Managing product development projects is increasingly challenging. In particular, the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window is narrowing. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity, which makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status: Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects, which makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution toward solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces optimized project scheduling on the basis of quality assessment results.

  8. Optimal filter parameters for low SNR seismograms as a function of station and event location

    NASA Astrophysics Data System (ADS)

    Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.

    1999-06-01

    Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few useable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method to provide optimal filters for low SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δ f constant (constant Q). The SNR is calculated on the pre-event noise and signal window. The band pass signals with high SNR are used to indicate the cutoff filter limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low SNR events. The method provides an optimum filter which can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
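
    A minimal sketch of the band-wise selection idea is given below, using a Butterworth filter bank as a stand-in for the constant-Q decomposition; the band layout, window indices, and SNR threshold are illustrative assumptions rather than the authors' settings.

        # Band-wise SNR screening to pick filter limits for a low-SNR seismogram (illustrative).
        import numpy as np
        from scipy.signal import butter, filtfilt

        def band_snr_filter_limits(trace, fs, noise_win, signal_win, q=2.0,
                                   fmin=0.5, fmax=10.0, snr_threshold=2.0):
            bands, f_lo = [], fmin
            while f_lo * q <= fmax:
                bands.append((f_lo, f_lo * q))      # roughly constant f/delta-f bands
                f_lo *= q
            kept = []
            for lo, hi in bands:
                b, a = butter(4, [lo, hi], btype="band", fs=fs)
                band = filtfilt(b, a, trace)
                noise = band[noise_win[0]:noise_win[1]]     # pre-event noise window
                signal = band[signal_win[0]:signal_win[1]]  # signal window
                snr = np.sqrt(np.mean(signal ** 2) / np.mean(noise ** 2))
                if snr >= snr_threshold:
                    kept.append((lo, hi))
            return (kept[0][0], kept[-1][1]) if kept else None  # low/high cutoffs of the optimized filter

        # usage: limits = band_snr_filter_limits(trace, fs=40.0, noise_win=(0, 800), signal_win=(1000, 1800))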

  9. Ultrasound window-modulated compounding Nakagami imaging: Resolution improvement and computational acceleration for liver characterization.

    PubMed

    Ma, Hsiang-Yang; Lin, Ying-Hsiu; Wang, Chiao-Yin; Chen, Chiung-Nien; Ho, Ming-Chih; Tsui, Po-Hsiang

    2016-08-01

    Ultrasound Nakagami imaging is an attractive method for visualizing changes in envelope statistics. Window-modulated compounding (WMC) Nakagami imaging was reported to improve image smoothness. The sliding window technique is typically used for constructing ultrasound parametric and Nakagami images. Using a large window overlap ratio may improve the WMC Nakagami image resolution but reduces computational efficiency. Therefore, the objectives of this study include: (i) exploring the effects of the window overlap ratio on the resolution and smoothness of WMC Nakagami images; (ii) proposing a fast algorithm that is based on the convolution operator (FACO) to accelerate WMC Nakagami imaging. Computer simulations and preliminary clinical tests on liver fibrosis samples (n=48) were performed to validate the FACO-based WMC Nakagami imaging. The results demonstrated that the width of the autocorrelation function and the parameter distribution of the WMC Nakagami image reduce with the increase in the window overlap ratio. One-pixel shifting (i.e., sliding the window on the image data in steps of one pixel for parametric imaging) as the maximum overlap ratio significantly improves the WMC Nakagami image quality. Concurrently, the proposed FACO method combined with a computational platform that optimizes the matrix computation can accelerate WMC Nakagami imaging, allowing the detection of liver fibrosis-induced changes in envelope statistics. FACO-accelerated WMC Nakagami imaging is a new-generation Nakagami imaging technique with an improved image quality and fast computation. Copyright © 2016 Elsevier B.V. All rights reserved.
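
    The sketch below shows, under simplifying assumptions, how sliding-window Nakagami imaging can be cast as convolutions (a box filter for the local moments) and how compounding over window sizes is formed; it is not the authors' FACO implementation, and the window sizes and toy envelope are assumptions.

        # Moment-based Nakagami imaging with box-filter (convolution-style) local statistics.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def nakagami_map(envelope, win):
            """Nakagami m parameter estimated in a win x win sliding window."""
            e2 = envelope ** 2
            m1 = uniform_filter(e2, size=win)        # local mean of R^2
            m2 = uniform_filter(e2 ** 2, size=win)   # local mean of R^4
            var = np.maximum(m2 - m1 ** 2, 1e-12)
            return m1 ** 2 / var                     # m = E[R^2]^2 / Var(R^2)

        def wmc_nakagami(envelope, windows=(3, 5, 7, 9)):
            """Window-modulated compounding: average the maps from several window sizes."""
            return np.mean([nakagami_map(envelope, w) for w in windows], axis=0)

        envelope = np.abs(np.random.randn(128, 128)) + 0.1   # toy envelope data
        wmc_image = wmc_nakagami(envelope)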

  10. New machining method of high precision infrared window part

    NASA Astrophysics Data System (ADS)

    Yang, Haicheng; Su, Ying; Xu, Zengqi; Guo, Rui; Li, Wenting; Zhang, Feng; Liu, Xuanmin

    2016-10-01

    The spherical shell of the photoelectric multifunctional instrument was designed in a multi-optical-channel mode to accommodate the different bands of the sensors, mainly TV, laser and infrared channels. Without affecting the optical diameter, wind resistance and pneumatic performance of the optical system, the overall layout of the spherical shell was optimized to save space and reduce weight. Most of the optical windows were special-shaped; each optical window directly participated in the high-resolution imaging of the corresponding sensor system, and the optical axis parallelism of each sensor needed to meet an accuracy requirement of 0.05 mrad. Therefore, the machining quality of the optical window parts directly affects the photoelectric system's pointing accuracy and interchangeability. Processing and testing of the TV and laser windows were already very mature, whereas the infrared window parts, because of the special nature of the material (transparent, with a high refractive index), presented problems of imaging quality and of controlling the minimum focal length and second-level parallelism during processing. Based on years of practical experience, this paper focused on how to control the shape accuracy and parallelism of infrared window parts during processing. The single-pass yield was increased from 40% to more than 95% and the processing efficiency was significantly enhanced, effectively solving a bottleneck in research and production.

  11. Novel near-infrared spectrum analysis tool: Synergy adaptive moving window model based on immune clone algorithm.

    PubMed

    Wang, Shenghao; Zhang, Yuyan; Cao, Fuyi; Pei, Zhenying; Gao, Xuewei; Zhang, Xu; Zhao, Yong

    2018-02-13

    This paper presents a novel spectrum analysis tool named synergy adaptive moving window modeling based on the immune clone algorithm (SA-MWM-ICA), motivated by the tedious and inconvenient labor involved in selecting pre-processing methods and spectral variables from prior experience. In this work, the immune clone algorithm is introduced into the spectrum analysis field for the first time as a new optimization strategy, addressing shortcomings of the traditional methods. Based on the working principle of the human immune system, the performance of the quantitative model is regarded as the antigen, and a special vector corresponding to this antigen is regarded as the antibody. The antibody contains a pre-processing method optimization region encoded by 11 decimal digits, and a spectrum variable optimization region formed by several moving windows with changeable width and position. A set of original antibodies is created by modeling with this algorithm. After calculating the affinity of these antibodies, those with high affinity are selected for cloning; the rule for cloning is that the higher the affinity, the more copies are made. In the next step, another important operation named hyper-mutation is applied to the cloned antibodies; the rule for hyper-mutation is that the lower the affinity, the higher the mutation probability. Through these steps, several antibodies with high affinity are created. A simulated dataset, a gasoline near-infrared spectra dataset, and a soil near-infrared spectra dataset are employed to verify and illustrate the performance of SA-MWM-ICA. Analysis results show that the quantitative models produced by SA-MWM-ICA perform better than traditional models such as partial least squares (PLS), moving window PLS (MWPLS), genetic algorithm PLS (GAPLS), and pretreatment method classification and adjustable parameter changeable size moving window PLS (CA-CSMWPLS), especially for relatively complex spectra. The selected pre-processing methods and spectrum variables are easily interpreted. The proposed method converges in a few generations and can be used not only for near-infrared spectroscopy analysis but also for other, similar spectral analyses, such as infrared spectroscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
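
    A generic clonal-selection loop in the spirit described above is sketched below; the real-valued encoding and the affinity function are placeholders, not the paper's 11-digit pre-processing region or moving-window encoding.

        # Generic clonal selection: more clones for higher affinity, more mutation for lower affinity.
        import numpy as np

        rng = np.random.default_rng(1)

        def affinity(antibody):
            # Placeholder: in SA-MWM-ICA this would be the quantitative model performance.
            return -np.sum((antibody - 0.3) ** 2)

        def clonal_selection(pop_size=20, dims=8, n_select=5, max_clones=10, generations=50):
            pop = rng.random((pop_size, dims))
            for _ in range(generations):
                aff = np.array([affinity(ab) for ab in pop])
                best = pop[np.argsort(aff)[-n_select:]]        # high-affinity antibodies (ascending order)
                clones = []
                for rank, ab in enumerate(best, start=1):
                    n_clones = int(np.ceil(max_clones * rank / n_select))  # higher affinity -> more copies
                    sigma = 0.5 / rank                                     # higher affinity -> less mutation
                    clones.append(ab + sigma * rng.standard_normal((n_clones, dims)))
                pool = np.vstack([pop] + clones)
                pool_aff = np.array([affinity(ab) for ab in pool])
                pop = pool[np.argsort(pool_aff)[-pop_size:]]   # keep the best antibodies
            return pop[np.argmax([affinity(ab) for ab in pop])]

        best_antibody = clonal_selection()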

  12. Impact of Hypokalemia on Electromechanical Window, Excitation Wavelength and Repolarization Gradients in Guinea-Pig and Rabbit Hearts

    PubMed Central

    Osadchii, Oleg E.

    2014-01-01

    Normal hearts exhibit a positive time difference between the end of ventricular contraction and the end of QT interval, which is referred to as the electromechanical (EM) window. Drug-induced prolongation of repolarization may lead to the negative EM window, which was proposed to be a novel proarrhythmic marker. This study examined whether abnormal changes in the EM window may account for arrhythmogenic effects produced by hypokalemia. Left ventricular pressure, electrocardiogram, and epicardial monophasic action potentials were recorded in perfused hearts from guinea-pig and rabbit. Hypokalemia (2.5 mM K+) was found to prolong repolarization, reduce the EM window, and promote tachyarrhythmia. Nevertheless, during both regular pacing and extrasystolic excitation, the increased QT interval invariably remained shorter than the duration of mechanical systole, thus yielding positive EM window values. Hypokalemia-induced arrhythmogenicity was associated with slowed ventricular conduction, and shortened effective refractory periods, which translated to a reduced excitation wavelength index. Hypokalemia also evoked non-uniform prolongation of action potential duration in distinct epicardial regions, which resulted in increased spatial variability in the repolarization time. These findings suggest that arrhythmogenic effects of hypokalemia are not accounted for by the negative EM window, and are rather attributed to abnormal changes in ventricular conduction times, refractoriness, excitation wavelength, and spatial repolarization gradients. PMID:25141124

  13. High Temperature Tribometer. Phase 1

    DTIC Science & Technology

    1989-06-01

    Figure 2.3.2, Setpoint and Gain Windows in FW.EXE; Figure 2.4.1, Data-Flow Diagram for Data-Acquisition Module. ... mounted in a friction force measuring device. Optimally, material testing results should not be test-machine sensitive, but due to equipment variables ... fixed. The friction force due to sliding should be continuously measured; this is optimally done in conjunction with the normal force measurement via ...

  14. Continuous Fiber Ceramic Composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fareed, Ali; Craig, Phillip A.

    2002-09-01

    Fiber-reinforced ceramic composites demonstrate the high-temperature stability of ceramics, with an increased fracture toughness resulting from the fiber reinforcement of the composite. The material optimization performed under the continuous fiber ceramic composites (CFCC) program included a series of systematic optimizations. The overall goals were to define the processing window, to increase the robustness of the process, to increase process yield while reducing costs, and to define the complexity of parts that could be fabricated.

  15. Study of Einstein-Podolsky-Rosen state for space-time variables in a two photon interference experiment

    NASA Technical Reports Server (NTRS)

    Shih, Y. H.; Sergienko, A. V.; Rubin, M. H.

    1993-01-01

    A pair of correlated photons generated from parametric down conversion was sent to two independent Michelson interferometers. Second order interference was studied by means of a coincidence measurement between the outputs of the two interferometers. The reported experiment and analysis studied this second order interference phenomenon from the point of view of the Einstein-Podolsky-Rosen paradox. The experiment was done in two steps. The first step of the experiment used 50 psec and 3 nsec coincidence time windows simultaneously. The 50 psec window was able to distinguish a 1.5 cm optical path difference in the interferometers. The interference visibility was measured to be 38 percent and 21 percent for the 50 psec time window and 22 percent and 7 percent for the 3 nsec time window, when the optical path differences of the interferometers were 2 cm and 4 cm, respectively. By comparing the visibilities between these two windows, the experiment showed the non-classical effect which resulted from an E.P.R. state. The second step of the experiment used a 20 psec coincidence time window, which was able to distinguish a 6 mm optical path difference in the interferometers. The interference visibility was measured to be 59 percent for an optical path difference of 7 mm. This is the first observation of visibility greater than 50 percent for a two-interferometer E.P.R. experiment, demonstrating nonclassical correlation of space-time variables.
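
    To make the role of the coincidence time window concrete, the following small sketch counts coincidences between two detection channels for different window widths; the timestamps and jitter are synthetic assumptions, not the experimental data.

        # Counting coincidences within a time window (synthetic detection timestamps).
        import numpy as np

        def coincidence_count(t_a, t_b, window):
            """Number of channel-A detections with a channel-B event within +/- window seconds."""
            t_b = np.sort(t_b)
            idx = np.searchsorted(t_b, t_a)
            left = np.abs(t_a - t_b[np.clip(idx - 1, 0, t_b.size - 1)])
            right = np.abs(t_b[np.clip(idx, 0, t_b.size - 1)] - t_a)
            return int(np.sum(np.minimum(left, right) <= window))

        rng = np.random.default_rng(2)
        t_a = np.sort(rng.uniform(0.0, 1.0, 5000))           # arrival times in seconds
        t_b = t_a + rng.normal(0.0, 30e-12, t_a.size)        # correlated partners with 30 ps jitter
        print(coincidence_count(t_a, t_b, window=50e-12))    # 50 ps coincidence window
        print(coincidence_count(t_a, t_b, window=20e-12))    # 20 ps coincidence window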

  16. Reliability of system for precise cold forging

    NASA Astrophysics Data System (ADS)

    Krušič, Vid; Rodič, Tomaž

    2017-07-01

    The influence of scatter in the principal input parameters of the forging system on the dimensional accuracy of the product and on tool life in the closed-die forging process is presented in this paper. Scatter in the essential input parameters for the closed-die upsetting process was adjusted to the maximal values that still enabled reliable production of a dimensionally accurate product at optimal tool life. An operating window was created that contains the maximal scatter of the principal input parameters for the closed-die upsetting process that still ensures the desired dimensional accuracy of the product and the optimal tool life. Application of this adjustment of the process input parameters is shown on the example of a mass-produced inner race of a homokinetic joint. High productivity in the manufacture of elements by cold massive extrusion is often achieved by multiple forming operations performed simultaneously on the same press. By redesigning the time sequence of the forming operations in the multistage forming of a starter barrel during the working stroke, the course of the resultant force is optimized.

  17. Genetic evolution of pancreatic cancer: lessons learnt from the pancreatic cancer genome sequencing project

    PubMed Central

    Iacobuzio-Donahue, Christine A

    2012-01-01

    Pancreatic cancer is a disease caused by the accumulation of genetic alterations in specific genes. Elucidation of the human genome sequence, in conjunction with technical advances in the ability to perform whole exome sequencing, have provided new insight into the mutational spectra characteristic of this lethal tumour type. Most recently, exomic sequencing has been used to clarify the clonal evolution of pancreatic cancer as well as provide time estimates of pancreatic carcinogenesis, indicating that a long window of opportunity may exist for early detection of this disease while in the curative stage. Moving forward, these mutational analyses indicate potential targets for personalised diagnostic and therapeutic intervention as well as the optimal timing for intervention based on the natural history of pancreatic carcinogenesis and progression. PMID:21749982

  18. Bayesian distributed lag interaction models to identify perinatal windows of vulnerability in children's health.

    PubMed

    Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A

    2017-07-01

    Epidemiological research supports an association between maternal exposure to air pollution during pregnancy and adverse children's health outcomes. Advances in exposure assessment and statistics allow for estimation of both critical windows of vulnerability and exposure effect heterogeneity. Simultaneous estimation of windows of vulnerability and effect heterogeneity can be accomplished by fitting a distributed lag model (DLM) stratified by subgroup. However, this can provide an incomplete picture of how effects vary across subgroups because it does not allow for subgroups to have the same window but different within-window effects or to have different windows but the same within-window effect. Because the timing of some developmental processes are common across subpopulations of infants while for others the timing differs across subgroups, both scenarios are important to consider when evaluating health risks of prenatal exposures. We propose a new approach that partitions the DLM into a constrained functional predictor that estimates windows of vulnerability and a scalar effect representing the within-window effect directly. The proposed method allows for heterogeneity in only the window, only the within-window effect, or both. In a simulation study we show that a model assuming a shared component across groups results in lower bias and mean squared error for the estimated windows and effects when that component is in fact constant across groups. We apply the proposed method to estimate windows of vulnerability in the association between prenatal exposures to fine particulate matter and each of birth weight and asthma incidence, and estimate how these associations vary by sex and maternal obesity status in a Boston-area prospective pre-birth cohort study. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
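
    As a simplified, purely illustrative counterpart to the Bayesian constrained model described above, the sketch below fits a plain unpenalized distributed lag regression to simulated weekly exposures and reads off a crude window of vulnerability; the data, number of weeks, effect size, and threshold are all assumptions.

        # Plain distributed lag regression on simulated prenatal exposure data (illustrative only).
        import numpy as np

        rng = np.random.default_rng(3)
        n, n_weeks = 500, 37
        exposure = rng.normal(size=(n, n_weeks))                       # weekly exposure (toy PM2.5)
        true_lag = np.where((np.arange(n_weeks) >= 10) & (np.arange(n_weeks) < 20), -0.3, 0.0)
        outcome = exposure @ true_lag + rng.normal(size=n)             # e.g. birth weight z-score

        X = np.column_stack([np.ones(n), exposure])                    # intercept + one column per week
        coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
        lag_effects = coef[1:]                                         # estimated effect of each week's exposure
        window_weeks = np.where(np.abs(lag_effects) > 0.15)[0]         # crude estimate of the vulnerable weeks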

  19. MO-FG-BRA-08: Swarm Intelligence-Based Personalized Respiratory Gating in Lung SAbR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modiri, A; Sabouri, P; Sawant, A

    Purpose: Respiratory gating is widely deployed as a clinical motion-management strategy in lung radiotherapy. In conventional gating, the beam is turned on during a pre-determined phase window, typically around end-exhalation. In this work, we challenge the notion that end-exhalation is always the optimal gating phase. Specifically, we use a swarm-intelligence-based, inverse planning approach to determine the optimal respiratory phase and MU for each beam with respect to (i) the state of the anatomy at each phase and (ii) the time spent in that state, estimated from long-term monitoring of the patient’s breathing motion. Methods: In a retrospective study of five lung cancer patients, we compared the dosimetric performance of our proposed personalized gating (PG) with that of conventional end-of-exhale gating (CEG) and a previously-developed, fully 4D-optimized plan (combined with MLC tracking delivery). For each patient, respiratory phase probabilities (indicative of the time duration of the phase) were estimated over 2 minutes from lung tumor motion traces recorded previously using the Synchrony system (Accuray Inc.). Based on this information, inverse planning optimization was performed to calculate the optimal respiratory gating phase and MU for each beam. To ensure practical deliverability, each PG beam was constrained to deliver the assigned MU over a time duration comparable to that of CEG delivery. Results: Maximum OAR sparing for the five patients achieved by the PG and the 4D plans compared to CEG plans was: Esophagus Dmax [PG:57%, 4D:37%], Heart Dmax [PG:71%, 4D:87%], Spinal cord Dmax [PG:18%, 4D:68%] and Lung V13 [PG:16%, 4D:31%]. While patients spent the most time in exhalation, the PG-optimization chose end-exhale only for 28% of beams. Conclusion: Our novel gating strategy achieved significant dosimetric improvements over conventional gating, and approached the upper limit represented by fully 4D optimized planning while being significantly simpler and more clinically translatable. This work was partially supported through research funding from National Institutes of Health (R01CA169102) and Varian Medical Systems, Palo Alto, CA, USA.

  20. An Approach to Unbiased Subsample Interpolation for Motion Tracking

    PubMed Central

    McCormick, Matthew M.; Varghese, Tomy

    2013-01-01

    Accurate subsample displacement estimation is necessary for ultrasound elastography because of the small deformations that occur and the subsequent application of a derivative operation on local displacements. Many of the commonly used subsample estimation techniques introduce significant bias errors. This article addresses a reduced bias approach to subsample displacement estimations that consists of a two-dimensional windowed-sinc interpolation with numerical optimization. It is shown that a Welch or Lanczos window with a Nelder–Mead simplex or regular-step gradient-descent optimization is well suited for this purpose. Little improvement results from a sinc window radius greater than four data samples. The strain signal-to-noise ratio (SNR) obtained in a uniformly elastic phantom is compared with other parabolic and cosine interpolation methods; it is found that the strain SNR ratio is improved over parabolic interpolation from 11.0 to 13.6 in the axial direction and 0.7 to 1.1 in the lateral direction for an applied 1% axial deformation. The improvement was most significant for small strains and displacement tracking in the lateral direction. This approach does not rely on special properties of the image or similarity function, which is demonstrated by its effectiveness with the application of a previously described regularization technique. PMID:23493609
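
    A one-dimensional sketch of the idea is given below: a sampled similarity function is interpolated with a Lanczos-windowed sinc of radius four samples and its subsample peak is located with Nelder-Mead; the paper's method is two-dimensional and applied to ultrasound data, and the test function here is synthetic.

        # Windowed-sinc (Lanczos) interpolation plus Nelder-Mead subsample peak search (1-D illustration).
        import numpy as np
        from scipy.optimize import minimize

        def lanczos_interp(samples, x, radius=4):
            """Windowed-sinc interpolation of uniformly sampled data at fractional position x."""
            n0 = int(np.floor(x))
            total = 0.0
            for n in range(n0 - radius + 1, n0 + radius + 1):
                if 0 <= n < samples.size:
                    t = x - n
                    total += samples[n] * np.sinc(t) * np.sinc(t / radius)   # Lanczos kernel
            return total

        corr = np.exp(-0.5 * ((np.arange(32) - 15.3) / 2.0) ** 2)   # sampled similarity function, true peak at 15.3
        res = minimize(lambda p: -lanczos_interp(corr, p[0]),
                       x0=[float(np.argmax(corr))], method="Nelder-Mead")
        subsample_peak = res.x[0]                                    # close to 15.3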

  1. Alternative Fuels Data Center: Hydrogen Drive

    Science.gov Websites


  2. Application of the weighted total field-scattering field technique to 3D-PSTD light scattering model

    NASA Astrophysics Data System (ADS)

    Hu, Shuai; Gao, Taichang; Liu, Lei; Li, Hao; Chen, Ming; Yang, Bo

    2018-04-01

    PSTD (Pseudo Spectral Time Domain) is an excellent model for light scattering simulation of nonspherical aerosol particles. However, due to the particular discretization form of Maxwell's equations that it uses, the traditional Total Field/Scattering Field (TF/SF) technique for FDTD (Finite Difference Time Domain) is not applicable to PSTD, and the time-consuming pure scattering field technique is mainly applied to introduce the incident wave. To this end, the weighted TF/SF technique proposed by X. Gao is generalized and applied to the 3D-PSTD scattering model. Using this technique, the incident light can be effectively introduced by modifying the electromagnetic components in an inserted connecting region between the total field and the scattering field region with incident terms, where the incident terms are obtained by weighting the incident field with a window function. To optimally determine the thickness of the connection region and the window function type for PSTD calculations, their influence on the modeling accuracy is first analyzed. To further verify the effectiveness and advantages of the weighted TF/SF technique, the improved PSTD model is validated against the PSTD model equipped with the pure scattering field technique in terms of both calculation accuracy and efficiency. The results show that the performance of PSTD is not sensitive to the choice of window function. The number of connection layers required decreases with increasing spatial resolution; for a spatial resolution of 24 grids per wavelength, a 6-layer region is thick enough. The scattering phase matrices and integral scattering parameters obtained by the improved PSTD show excellent consistency with those of well-tested models for spherical and nonspherical particles, illustrating that the weighted TF/SF technique can introduce the incident wave precisely. The weighted TF/SF technique also shows higher computational efficiency than the pure scattering field technique.

  3. Comparison of IMRT planning with two-step and one-step optimization: a strategy for improving therapeutic gain and reducing the integral dose

    NASA Astrophysics Data System (ADS)

    Abate, A.; Pressello, M. C.; Benassi, M.; Strigari, L.

    2009-12-01

    The aim of this study was to evaluate the effectiveness and efficiency in inverse IMRT planning of one-step optimization with the step-and-shoot (SS) technique as compared to traditional two-step optimization using the sliding windows (SW) technique. The Pinnacle IMRT TPS allows both one-step and two-step approaches. The same beam setup for five head-and-neck tumor patients and dose-volume constraints were applied for all optimization methods. Two-step plans were produced converting the ideal fluence with or without a smoothing filter into the SW sequence. One-step plans, based on direct machine parameter optimization (DMPO), had the maximum number of segments per beam set at 8, 10, 12, producing a directly deliverable sequence. Moreover, the plans were generated whether a split-beam was used or not. Total monitor units (MUs), overall treatment time, cost function and dose-volume histograms (DVHs) were estimated for each plan. PTV conformality and homogeneity indexes and normal tissue complication probability (NTCP) that are the basis for improving therapeutic gain, as well as non-tumor integral dose (NTID), were evaluated. A two-sided t-test was used to compare quantitative variables. All plans showed similar target coverage. Compared to two-step SW optimization, the DMPO-SS plans resulted in lower MUs (20%), NTID (4%) as well as NTCP values. Differences of about 15-20% in the treatment delivery time were registered. DMPO generates less complex plans with identical PTV coverage, providing lower NTCP and NTID, which is expected to reduce the risk of secondary cancer. It is an effective and efficient method and, if available, it should be favored over the two-step IMRT planning.

  4. Emergence of two near-infrared windows for in vivo and intraoperative SERS.

    PubMed

    Lane, Lucas A; Xue, Ruiyang; Nie, Shuming

    2018-04-06

    Two clear windows in the near-infrared (NIR) spectrum are of considerable current interest for in vivo molecular imaging and spectroscopic detection. The main rationale is that near-infrared light can penetrate biological tissues such as skin and blood more efficiently than visible light because these tissues scatter and absorb less light at longer wavelengths. The first clear window, defined as light wavelengths between 650 nm and 950 nm, has been shown to be far superior to visible light for in vivo and intraoperative optical imaging. The second clear window, operating in the wavelength range of 1000-1700 nm, has been reported to further improve detection sensitivity, spatial resolution, and tissue penetration because tissue photon scattering and background interference are further reduced at longer wavelengths. Here we discuss recent advances in developing biocompatible plasmonic nanoparticles for in vivo and intraoperative surface-enhanced Raman scattering (SERS) in both the first and second NIR windows. In particular, a new class of 'broad-band' plasmonic nanostructures is well suited for surface Raman enhancement across a broad range of wavelengths, allowing a direct comparison of detection sensitivity and tissue penetration between the two NIR windows. Also, optimized and encoded SERS nanoparticles are generally nontoxic and are much brighter than near-infrared quantum dots (QDs), raising new possibilities for ultrasensitive detection of microscopic tumors and image-guided precision surgery. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. On the relationship between instantaneous phase synchrony and correlation-based sliding windows for time-resolved fMRI connectivity analysis.

    PubMed

    Pedersen, Mangor; Omidvarnia, Amir; Zalesky, Andrew; Jackson, Graeme D

    2018-06-08

    Correlation-based sliding window analysis (CSWA) is the most commonly used method to estimate time-resolved functional MRI (fMRI) connectivity. However, instantaneous phase synchrony analysis (IPSA) is gaining popularity mainly because it offers single time-point resolution of time-resolved fMRI connectivity. We aim to provide a systematic comparison between these two approaches, on both temporal and topological levels. For this purpose, we used resting-state fMRI data from two separate cohorts with different temporal resolutions (45 healthy subjects from Human Connectome Project fMRI data with repetition time of 0.72 s and 25 healthy subjects from a separate validation fMRI dataset with a repetition time of 3 s). For time-resolved functional connectivity analysis, we calculated tapered CSWA over a wide range of different window lengths that were temporally and topologically compared to IPSA. We found a strong association in connectivity dynamics between IPSA and CSWA when considering the absolute values of CSWA. The association between CSWA and IPSA was stronger for a window length of ∼20 s (shorter than filtered fMRI wavelength) than ∼100 s (longer than filtered fMRI wavelength), irrespective of the sampling rate of the underlying fMRI data. Narrow-band filtering of fMRI data (0.03-0.07 Hz) yielded a stronger relationship between IPSA and CSWA than wider-band (0.01-0.1 Hz). On a topological level, time-averaged IPSA and CSWA nodes were non-linearly correlated for both short (∼20 s) and long (∼100 s) windows, mainly because nodes with strong negative correlations (CSWA) displayed high phase synchrony (IPSA). IPSA and CSWA were anatomically similar in the default mode network, sensory cortex, insula and cerebellum. Our results suggest that IPSA and CSWA provide comparable characterizations of time-resolved fMRI connectivity for appropriately chosen window lengths. Although IPSA requires narrow-band fMRI filtering, we recommend the use of IPSA given that it does not mandate a (semi-)arbitrary choice of window length and window overlap. A code for calculating IPSA is provided. Copyright © 2018. Published by Elsevier Inc.
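
    The two estimators being compared can be sketched as follows for one pair of narrow-band time series: a tapered sliding-window correlation (CSWA) and a Hilbert-transform-based instantaneous phase synchrony (IPSA). The filter band, window length, taper, and synchrony measure below are illustrative choices, not the exact settings of the paper.

        # Tapered sliding-window correlation vs. instantaneous phase synchrony (illustrative).
        import numpy as np
        from scipy.signal import hilbert, butter, filtfilt

        def narrowband(x, fs, lo=0.03, hi=0.07):
            b, a = butter(2, [lo, hi], btype="band", fs=fs)
            return filtfilt(b, a, x)

        def sliding_window_corr(x, y, win):
            taper = np.hanning(win)
            out = np.empty(x.size - win + 1)
            for i in range(out.size):
                xs, ys = x[i:i + win] * taper, y[i:i + win] * taper
                out[i] = np.corrcoef(xs, ys)[0, 1]
            return out                                        # one value per window position

        def instantaneous_phase_synchrony(x, y):
            dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
            return 1.0 - np.abs(np.sin(dphi / 2.0))           # one value per time point, 1 = in phase

        fs = 1.0 / 0.72                                       # HCP-like repetition time
        t = np.arange(0, 600, 1.0 / fs)
        x = np.sin(2 * np.pi * 0.05 * t) + 0.5 * np.random.randn(t.size)
        y = np.sin(2 * np.pi * 0.05 * t + 0.3) + 0.5 * np.random.randn(t.size)
        xf, yf = narrowband(x, fs), narrowband(y, fs)
        cswa = sliding_window_corr(xf, yf, win=int(20 * fs))  # roughly 20 s window
        ipsa = instantaneous_phase_synchrony(xf, yf)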

  6. Pre-Launch Performance Assessment of the VIIRS Land Surface Temperature Environmental Data Record

    NASA Astrophysics Data System (ADS)

    Hauss, B.; Ip, J.; Agravante, H.

    2009-12-01

    The Visible/Infrared Imager Radiometer Suite (VIIRS) Land Surface Temperature (LST) Environmental Data Record (EDR) provides the surface temperature of land surface including coastal and inland-water pixels at VIIRS moderate resolution (750m) during both day and night. To predict the LST under optimal conditions, the retrieval algorithm utilizes a dual split-window approach with both Short-wave Infrared (SWIR) channels at 3.70 µm (M12) and 4.05 µm (M13), and Long-wave Infrared (LWIR) channels at 10.76 µm (M15) and 12.01 µm (M16) to correct for atmospheric water vapor. Under less optimal conditions, the algorithm uses a fallback split-window approach with M15 and M16 channels. By comparison, the MODIS generalized split-window algorithm only uses the LWIR bands in the retrieval of surface temperature because of the concern for both solar contamination and large emissivity variations in the SWIR bands. In this paper, we assess whether these concerns are real and whether there is an impact on the precision and accuracy of the LST retrieval. The algorithm relies on the VIIRS Cloud Mask IP for identifying cloudy and ocean pixels, the VIIRS Surface Type EDR for identifying the IGBP land cover type for the pixels, and the VIIRS Aerosol Optical Thickness (AOT) IP for excluding pixels with AOT greater than 1.0. In this paper, we will report the pre-launch performance assessment of the LST EDR based on global synthetic data and proxy data from Terra MODIS. Results of both the split-window and dual split-window algorithms will be assessed by comparison either to synthetic "truth" or results of the MODIS retrieval. We will also show that the results of the assessment with proxy data are consistent with those obtained using the global synthetic data.
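
    For orientation, the generic single split-window form is sketched below; the coefficients are placeholders for illustration only and are not the operational VIIRS LST coefficients, which also depend on surface type and, in the dual split-window approach, include additional SWIR terms.

        # Schematic split-window retrieval: correct for water vapour using the channel difference.
        def split_window_lst(t11, t12, a0=1.0, a1=1.0, a2=2.0):
            """Illustrative LST = a0 + a1*T11 + a2*(T11 - T12), brightness temperatures in kelvin."""
            return a0 + a1 * t11 + a2 * (t11 - t12)

        print(split_window_lst(t11=295.0, t12=293.5))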

  7. Hydrocarbon Reservoir Prediction Using Bi-Gaussian S Transform Based Time-Frequency Analysis Approach

    NASA Astrophysics Data System (ADS)

    Cheng, Z.; Chen, Y.; Liu, Y.; Liu, W.; Zhang, G.

    2015-12-01

    Among those hydrocarbon reservoir detection techniques, the time-frequency analysis based approach is one of the most widely used approaches because of its straightforward indication of low-frequency anomalies from the time-frequency maps, that is to say, the low-frequency bright spots usually indicate the potential hydrocarbon reservoirs. The time-frequency analysis based approach is easy to implement, and more importantly, is usually of high fidelity in reservoir prediction, compared with the state-of-the-art approaches, and thus is of great interest to petroleum geologists, geophysicists, and reservoir engineers. The S transform has been frequently used in obtaining the time-frequency maps because of its better performance in controlling the compromise between the time and frequency resolutions than the alternatives, such as the short-time Fourier transform, Gabor transform, and continuous wavelet transform. The window function used in the majority of previous S transform applications is the symmetric Gaussian window. However, one problem with the symmetric Gaussian window is the degradation of time resolution in the time-frequency map due to the long front taper. In our study, a bi-Gaussian S transform that substitutes the symmetric Gaussian window with an asymmetry bi-Gaussian window is proposed to analyze the multi-channel seismic data in order to predict hydrocarbon reservoirs. The bi-Gaussian window introduces asymmetry in the resultant time-frequency spectrum, with time resolution better in the front direction, as compared with the back direction. It is the first time that the bi-Gaussian S transform is used for analyzing multi-channel post-stack seismic data in order to predict hydrocarbon reservoirs since its invention in 2003. The superiority of the bi-Gaussian S transform over traditional S transform is tested on a real land seismic data example. The performance shows that the enhanced temporal resolution can help us depict more clearly the edge of the hydrocarbon reservoir, especially when the thickness of the reservoir is small (such as the thin beds).
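
    A direct (and deliberately slow) sketch of an S transform with an asymmetric bi-Gaussian window is given below; the scaling conventions, window-width factors, and test signal are illustrative assumptions rather than the exact formulation used by the authors.

        # S transform with an asymmetric bi-Gaussian window (short taper on one side, long on the other).
        import numpy as np

        def bigaussian_s_transform(x, fs, freqs, k_front=0.5, k_back=2.0):
            n = x.size
            t = np.arange(n) / fs
            out = np.zeros((len(freqs), n), dtype=complex)
            for i, f in enumerate(freqs):
                sig_front, sig_back = k_front / f, k_back / f    # frequency-dependent window widths
                for j, tau in enumerate(t):
                    d = t - tau
                    sigma = np.where(d < 0, sig_front, sig_back) # asymmetric (bi-Gaussian) taper
                    w = np.exp(-0.5 * (d / sigma) ** 2)
                    w /= w.sum()
                    out[i, j] = np.sum(x * w * np.exp(-2j * np.pi * f * t))
            return out

        fs = 500.0
        t = np.arange(0, 1, 1 / fs)
        x = np.sin(2 * np.pi * 30 * t) * (t < 0.5) + np.sin(2 * np.pi * 80 * t) * (t >= 0.5)
        tf_map = bigaussian_s_transform(x, fs, freqs=np.arange(10.0, 120.0, 10.0))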

  8. CAVE WINDOW

    DOEpatents

    Levenson, M.

    1960-10-25

    A cave window is described. It is constructed of thick glass panes arranged so that interior panes have smaller windowpane areas and exterior panes have larger areas. Exterior panes on the radiation exposure side are remotely replaceable when darkened excessively. Metal shutters minimize exposure time to extend window life.

  9. Development and evaluation of a monolithic floating dosage form for furosemide.

    PubMed

    Menon, A; Ritschel, W A; Sakr, A

    1994-02-01

    The poor bioavailability of orally dosed furosemide (60%), a weakly acidic drug, is due to the presence of a biological window comprised of the upper gastrointestinal tract. The purpose of the present study was to develop and optimize in vitro a monolithic modified-release dosage form (MMR) for furosemide with increased gastric residence time and to evaluate the in vivo performance of the dosage form. The principle of floatation was used to restrict the MMR to the stomach. A two-factor three-level full factorial experimental design was employed for formulation development. A flow-through cell was designed to evaluate in vitro dissolution parameters. Quadratic regression models indicated the polymer viscosity and polymer:drug ratio to be significant (p < 0.05) formulation factors in determining the duration of buoyancy and the release profile. Statistical optimization using response surface methodology with certain physiological constraints relating to gastric emptying time predicted an optimal MMR. In vivo evaluation of the optimized MMR in beagle dogs resulted in a significant increase (p < 0.05) in the absolute bioavailability for the MMR dosage form (42.9%) as compared to the commercially available tablet (33.4%) and enteric product (29.5%). Significant in vitro/in vivo correlations (p < 0.05) were obtained for the MMR using deconvolution analysis normalized for bioavailability. The floating dosage form was found to be a feasible approach in delivering furosemide to the upper gastrointestinal tract to maximize drug absorption.

  10. The home health care routing and scheduling problem with interdependent services.

    PubMed

    Mankowska, Dorota Slawa; Meisel, Frank; Bierwirth, Christian

    2014-03-01

    This paper presents a model for the daily planning of health care services carried out at patients' homes by staff members of a home care company. The planning takes into account individual service requirements of the patients, individual qualifications of the staff and possible interdependencies between different service operations. Interdependencies of services can include, for example, a temporal separation of two services as is required if drugs have to be administered a certain time before providing a meal. Other services like handling a disabled patient may require two staff members working together at a patient's home. The time preferences of patients are included in terms of given time windows. In this paper, we propose a planning approach for the described problem, which can be used for optimizing economical and service oriented measures of performance. A mathematical model formulation is proposed together with a powerful heuristic based on a sophisticated solution representation.

  11. Developmental time windows for axon growth influence neuronal network topology.

    PubMed

    Lim, Sol; Kaiser, Marcus

    2015-04-01

    Early brain connectivity development consists of multiple stages: birth of neurons, their migration and the subsequent growth of axons and dendrites. Each stage occurs within a certain period of time depending on types of neurons and cortical layers. Forming synapses between neurons either by growing axons starting at similar times for all neurons (much-overlapped time windows) or at different time points (less-overlapped) may affect the topological and spatial properties of neuronal networks. Here, we explore the extreme cases of axon formation during early development, either starting at the same time for all neurons (parallel, i.e., maximally overlapped time windows) or occurring for each neuron separately one neuron after another (serial, i.e., no overlaps in time windows). For both cases, the number of potential and established synapses remained comparable. Topological and spatial properties, however, differed: Neurons that started axon growth early on in serial growth achieved higher out-degrees, higher local efficiency and longer axon lengths while neurons demonstrated more homogeneous connectivity patterns for parallel growth. Second, connection probability decreased more rapidly with distance between neurons for parallel growth than for serial growth. Third, bidirectional connections were more numerous for parallel growth. Finally, we tested our predictions with C. elegans data. Together, this indicates that time windows for axon growth influence the topological and spatial properties of neuronal networks opening up the possibility to a posteriori estimate developmental mechanisms based on network properties of a developed network.

  12. Sunlight Responsive Thermochromic Window System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millett, F,A; Byker,H, J

    2006-10-27

    Pleotint has embarked on a novel approach with our Sunlight Responsive Thermochromic, SRT™, windows. We are integrating dynamic sunlight control, high insulation values and low solar heat gain together in a high performance window. The Pleotint SRT window is dynamic because it reversibly changes light transmission based on thermochromics activated directly by the heating effect of sunlight. We can achieve a window package with a low solar heat gain coefficient (SHGC), a low U value and high insulation. At the same time our windows provide good daylighting. Our innovative window design offers architects and building designers the opportunity to choose their desired energy performance, excellent sound reduction, a self-cleaning external pane, or resistance to wind load, blasts, bullets or hurricanes. SRT windows would provide energy savings estimated at up to 30% over traditional window systems. Glass fabricators will be able to use existing equipment to make the SRT window while adding value and flexibility to the basic design. Glazing installers will have the ability to fit the windows with traditional methods, without wires, power supplies and controllers. SRT windows can be retrofit into existing buildings.

  13. Threshold network of a financial market using the P-value of correlation coefficients

    NASA Astrophysics Data System (ADS)

    Ha, Gyeong-Gyun; Lee, Jae Woo; Nobi, Ashadun

    2015-06-01

    Threshold methods in financial networks are important tools for obtaining important information about the financial state of a market. Previously, absolute thresholds of correlation coefficients have been used; however, they have no relation to the length of time. We assign a threshold value depending on the size of the time window by using the P-value concept of statistics. We construct a threshold network (TN) at the same threshold value for two different time window sizes in the Korean Composite Stock Price Index (KOSPI). We measure network properties, such as the edge density, clustering coefficient, assortativity coefficient, and modularity. We determine that a significant difference exists between the network properties of the two time windows at the same threshold, especially during crises. This implies that the market information depends on the length of the time window when constructing the TN. We apply the same technique to Standard and Poor's 500 (S&P500) and observe similar results.
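
    The window-length dependence of a P-value-based threshold can be sketched as follows: for a fixed significance level, the critical Pearson correlation grows as the window shortens, so the same absolute threshold is not comparable across window sizes. The return data below are synthetic, and the significance level is an illustrative choice.

        # P-value-based threshold network: the critical correlation depends on window length.
        import numpy as np
        from scipy.stats import t as t_dist

        def critical_correlation(n_obs, p_value=0.01):
            """Two-sided critical Pearson correlation for a window of n_obs observations."""
            t_crit = t_dist.ppf(1.0 - p_value / 2.0, df=n_obs - 2)
            return t_crit / np.sqrt(n_obs - 2 + t_crit ** 2)

        def threshold_network(returns, p_value=0.01):
            """Adjacency matrix of the threshold network for an (n_obs x n_assets) return window."""
            r_c = critical_correlation(returns.shape[0], p_value)
            corr = np.corrcoef(returns, rowvar=False)
            adj = (np.abs(corr) >= r_c).astype(int)
            np.fill_diagonal(adj, 0)
            return adj

        rng = np.random.default_rng(4)
        returns = rng.normal(size=(250, 30))          # one year of daily returns, 30 stocks
        adj_long = threshold_network(returns)         # long window, smaller critical correlation
        adj_short = threshold_network(returns[:60])   # short window, larger critical correlation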

  14. Effect of the time window on the heat-conduction information filtering model

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Song, Wen-Jun; Hou, Lei; Zhang, Yi-Lu; Liu, Jian-Guo

    2014-05-01

    Recommendation systems have been proposed to filter out the potential tastes and preferences of the normal users online, however, the physics of the time window effect on the performance is missing, which is critical for saving the memory and decreasing the computation complexity. In this paper, by gradually expanding the time window, we investigate the impact of the time window on the heat-conduction information filtering model with ten similarity measures. The experimental results on the benchmark dataset Netflix indicate that by only using approximately 11.11% recent rating records, the accuracy could be improved by an average of 33.16% and the diversity could be improved by 30.62%. In addition, the recommendation performance on the dataset MovieLens could be preserved by only considering approximately 10.91% recent records. Under the circumstance of improving the recommendation performance, our discoveries possess significant practical value by largely reducing the computational time and shortening the data storage space.

  15. Smart windows with functions of reflective display and indoor temperature-control

    NASA Astrophysics Data System (ADS)

    Lee, I.-Hui; Chao, Yu-Ching; Hsu, Chih-Cheng; Chang, Liang-Chao; Chiu, Tien-Lung; Lee, Jiunn-Yih; Kao, Fu-Jen; Lee, Chih-Kung; Lee, Jiun-Haw

    2010-02-01

    In this paper, a switchable window based on a cholesteric liquid crystal (CLC) was demonstrated. Under different applied voltages, incoming light at visible and infrared wavelengths was modulated, respectively. A mixture of CLC with a nematic liquid crystal and a chiral dopant selectively reflected infrared light without bias, which effectively reduced the indoor temperature under sunlight illumination. In this state, transmission in the visible range remained high and the window looked transparent. When the voltage was increased to 15 V, the CLC changed to the focal conic state and could be used as a reflective display, a privacy window, or a projector screen. Under a high voltage (30 V), the homeotropic state was achieved. In this state, both infrared and visible light were transmitted and the device acted as a normal window, permitting the infrared spectrum of winter sunlight to enter the room so as to reduce the heating requirement. Such a device can be used as a switchable window in smart buildings, greenhouses and windshields.

  16. IPO: a tool for automated optimization of XCMS parameters.

    PubMed

    Libiseller, Gunnar; Dvorzak, Michaela; Kleb, Ulrike; Gander, Edgar; Eisenberg, Tobias; Madeo, Frank; Neumann, Steffen; Trausinger, Gert; Sinner, Frank; Pieber, Thomas; Magnes, Christoph

    2015-04-16

    Untargeted metabolomics generates a huge amount of data. Software packages for automated data processing are crucial to successfully process these data. A variety of such software packages exist, but the outcome of data processing strongly depends on algorithm parameter settings. If they are not carefully chosen, suboptimal parameter settings can easily lead to biased results. Therefore, parameter settings also require optimization. Several parameter optimization approaches have already been proposed, but a software package for parameter optimization which is free of intricate experimental labeling steps, fast and widely applicable is still missing. We implemented the software package IPO ('Isotopologue Parameter Optimization') which is fast and free of labeling steps, and applicable to data from different kinds of samples and data from different methods of liquid chromatography - high resolution mass spectrometry and data from different instruments. IPO optimizes XCMS peak picking parameters by using natural, stable (13)C isotopic peaks to calculate a peak picking score. Retention time correction is optimized by minimizing relative retention time differences within peak groups. Grouping parameters are optimized by maximizing the number of peak groups that show one peak from each injection of a pooled sample. The different parameter settings are achieved by design of experiments, and the resulting scores are evaluated using response surface models. IPO was tested on three different data sets, each consisting of a training set and test set. IPO resulted in an increase of reliable groups (146% - 361%), a decrease of non-reliable groups (3% - 8%) and a decrease of the retention time deviation to one third. IPO was successfully applied to data derived from liquid chromatography coupled to high resolution mass spectrometry from three studies with different sample types and different chromatographic methods and devices. We were also able to show the potential of IPO to increase the reliability of metabolomics data. The source code is implemented in R, tested on Linux and Windows and it is freely available for download at https://github.com/glibiseller/IPO . The training sets and test sets can be downloaded from https://health.joanneum.at/IPO .

  17. A study of optimization techniques in HDR brachytherapy for the prostate

    NASA Astrophysics Data System (ADS)

    Pokharel, Ghana Shyam

    Several studies carried out thus far are in favor of dose escalation to the prostate gland to have better local control of the disease. But optimal way of delivery of higher doses of radiation therapy to the prostate without hurting neighboring critical structures is still debatable. In this study, we proposed that real time high dose rate (HDR) brachytherapy with highly efficient and effective optimization could be an alternative means of precise delivery of such higher doses. This approach of delivery eliminates the critical issues such as treatment setup uncertainties and target localization as in external beam radiation therapy. Likewise, dosimetry in HDR brachytherapy is not influenced by organ edema and potential source migration as in permanent interstitial implants. Moreover, the recent report of radiobiological parameters further strengthen the argument of using hypofractionated HDR brachytherapy for the management of prostate cancer. Firstly, we studied the essential features and requirements of real time HDR brachytherapy treatment planning system. Automating catheter reconstruction with fast editing tools, fast yet accurate dose engine, robust and fast optimization and evaluation engine are some of the essential requirements for such procedures. Moreover, in most of the cases we performed, treatment plan optimization took significant amount of time of overall procedure. So, making treatment plan optimization automatic or semi-automatic with sufficient speed and accuracy was the goal of the remaining part of the project. Secondly, we studied the role of optimization function and constraints in overall quality of optimized plan. We have studied the gradient based deterministic algorithm with dose volume histogram (DVH) and more conventional variance based objective functions for optimization. In this optimization strategy, the relative weight of particular objective in aggregate objective function signifies its importance with respect to other objectives. Based on our study, DVH based objective function performed better than traditional variance based objective function in creating a clinically acceptable plan when executed under identical conditions. Thirdly, we studied the multiobjective optimization strategy using both DVH and variance based objective functions. The optimization strategy was to create several Pareto optimal solutions by scanning the clinically relevant part of the Pareto front. This strategy was adopted to decouple optimization from decision such that user could select final solution from the pool of alternative solutions based on his/her clinical goals. The overall quality of treatment plan improved using this approach compared to traditional class solution approach. In fact, the final optimized plan selected using decision engine with DVH based objective was comparable to typical clinical plan created by an experienced physicist. Next, we studied the hybrid technique comprising both stochastic and deterministic algorithm to optimize both dwell positions and dwell times. The simulated annealing algorithm was used to find optimal catheter distribution and the DVH based algorithm was used to optimize 3D dose distribution for given catheter distribution. This unique treatment planning and optimization tool was capable of producing clinically acceptable highly reproducible treatment plans in clinically reasonable time. 
Because this algorithm was able to create clinically acceptable plans automatically within a clinically reasonable time, it is appealing for real-time procedures. Next, we studied the feasibility of multiobjective optimization using an evolutionary algorithm for real-time HDR brachytherapy of the prostate. With properly tuned algorithm-specific parameters, the evolutionary algorithm was also able to create clinically acceptable plans within a clinically reasonable time. However, the algorithm was allowed to run for only a limited number of generations, which is not generally considered optimal for such algorithms; this was done to keep the time window acceptable for real-time procedures. Further study under improved conditions is therefore required to realize the full potential of the algorithm.
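
    As a small illustration of the quantity behind the DVH-based objective discussed above, the sketch below computes a cumulative dose-volume histogram from a set of voxel doses; the dose values and dose levels are synthetic assumptions.

        # Cumulative dose-volume histogram (DVH) from voxel doses (illustrative data).
        import numpy as np

        def cumulative_dvh(dose, bin_width=0.1):
            """Fraction of the structure volume receiving at least each dose level."""
            dose = np.asarray(dose, dtype=float).ravel()
            levels = np.arange(0.0, dose.max() + bin_width, bin_width)
            volume_fraction = np.array([(dose >= d).mean() for d in levels])
            return levels, volume_fraction

        target_dose = np.random.default_rng(5).normal(10.0, 0.8, size=5000)   # toy doses in Gy
        levels, vol = cumulative_dvh(target_dose)
        v10 = vol[np.argmin(np.abs(levels - 10.0))]   # fraction of volume receiving >= 10 Gy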

  18. Net analyte signal-based simultaneous determination of ethanol and water by quartz crystal nanobalance sensor.

    PubMed

    Mirmohseni, A; Abdollahi, H; Rostamizadeh, K

    2007-02-28

    A net analyte signal (NAS)-based method called HLA/GO was applied for the selective determination of a binary mixture of ethanol and water with a quartz crystal nanobalance (QCN) sensor. A full factorial design was applied for the formation of calibration and prediction sets in the concentration ranges 5.5-22.2 microg mL(-1) for ethanol and 7.01-28.07 microg mL(-1) for water. An optimal time range was selected by a procedure based on the calculation of the net analyte signal regression plot in each considered time window for each test sample. A moving window strategy was used to search for the region with maximum linearity of the NAS regression plot (minimum error indicator) and minimum PRESS value. On the basis of the obtained results, the differences in the adsorption profiles in the time range between 1 and 600 s were used to determine mixtures of both compounds by the HLA/GO method. The calculation of the net analyte signal using the HLA/GO method allows determination of several figures of merit, such as selectivity, sensitivity, analytical sensitivity, and limit of detection, for each component. To check the ability of the proposed method to select linear regions of the adsorption profile, a test for detecting non-linear regions of the adsorption profile data in the presence of methanol was also described. The results showed that the method was successfully applied for the determination of ethanol and water.
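    The moving-window search described above can be sketched as follows. This is a hypothetical simplification, not the authors' HLA/GO implementation: each candidate time window of fixed width is scored by the linearity (R^2 of a first-order fit) of the signal in that window, standing in for the NAS regression-plot error indicator, and the most linear window is returned.

        import numpy as np

        def most_linear_window(times, signal, width, step=1):
            """Slide a fixed-width window over a sensor time profile and return the
            (start, end) indices of the window whose points are most linear."""
            times, signal = np.asarray(times), np.asarray(signal)
            best_window, best_r2 = None, -np.inf
            for start in range(0, len(signal) - width + 1, step):
                t = times[start:start + width]
                y = signal[start:start + width]
                slope, intercept = np.polyfit(t, y, 1)        # first-order fit
                ss_res = np.sum((y - (slope * t + intercept)) ** 2)
                ss_tot = np.sum((y - y.mean()) ** 2)
                r2 = 1.0 - ss_res / ss_tot if ss_tot > 0 else 0.0
                if r2 > best_r2:                              # keep the most linear window so far
                    best_window, best_r2 = (start, start + width), r2
            return best_window, best_r2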

  19. A method for visualizing high-density porous polyethylene (medpor, porex) with computed tomographic scanning.

    PubMed

    Vendemia, Nicholas; Chao, Jerry; Ivanidze, Jana; Sanelli, Pina; Spinelli, Henry M

    2011-01-01

    Medpor (Porex Surgical, Inc, Newnan, GA) is composed of porous polyethylene and is commonly used in craniofacial reconstruction. When complications such as seroma or abscess formation arise, diagnostic modalities are limited because Medpor is radiolucent on conventional radiologic studies. This poses a problem in situations where imaging is necessary to distinguish the implant from surrounding tissues. The objective was to present a clinically useful method for imaging Medpor with conventional computed tomographic (CT) scanning. Eleven patients (12 total implants) who had undergone reconstructive surgery with Medpor were included in the study. A retrospective review of CT scans done between 1 and 16 months postoperatively was performed using 3 distinct CT window settings. Measurements of implant dimensions and Hounsfield units were recorded and qualitatively assessed. Of the 3 distinct window settings studied, namely "bone" (W1100/L450), "soft tissue" (W500/L50), and "implant" (W800/L200), the implant window proved the most useful, allowing the investigators to visualize and evaluate Medpor in all cases. Qualitative analysis revealed that Medpor implants could be distinguished from surrounding tissue in both the implant and soft tissue windows, with a density falling between that of fat and fluid. In 1 case, Medpor could not be visualized in the soft tissue window, although it could be visualized in the implant window. Quantitative analysis demonstrated a mean (SD) density of -38.7 (7.4) Hounsfield units. Medpor may be optimally visualized on conventional CT scans using the implant window settings W800/L200, which can aid in imaging Medpor and diagnosing implant-related complications.
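    The window settings quoted above (e.g., the implant window W800/L200) follow the standard window width/level mapping from Hounsfield units to 8-bit display values. The sketch below is a generic Python illustration of that mapping, not code from the study.

        import numpy as np

        def apply_window(hu_image, width=800.0, level=200.0):
            """Map a CT image in Hounsfield units to 8-bit display gray levels using
            a window width/level, here the 'implant' setting W800/L200."""
            lo = level - width / 2.0
            hi = level + width / 2.0
            clipped = np.clip(hu_image, lo, hi)              # values outside the window saturate
            return np.round((clipped - lo) / (hi - lo) * 255.0).astype(np.uint8)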

  20. Bird-Window Collisions at a West-Coast Urban Park Museum: Analyses of Bird Biology and Window Attributes from Golden Gate Park, San Francisco.

    PubMed

    Kahle, Logan Q; Flannery, Maureen E; Dumbacher, John P

    2016-01-01

    Bird-window collisions are a major and poorly-understood generator of bird mortality. In North America, studies of this topic tend to be focused east of the Mississippi River, resulting in a paucity of data from the Western flyways. Additionally, few available data can critically evaluate factors such as time of day, sex and age bias, and effect of window pane size on collisions. We collected and analyzed 5 years of window strike data from a 3-story building in a large urban park in San Francisco, California. To evaluate our window collision data in context, we collected weekly data on local bird abundance in the adjacent parkland. Our study asks two overarching questions: first-what aspects of a bird's biology might make them more likely to fatally strike windows; and second, what characteristics of a building's design contribute to bird-window collisions. We used a dataset of 308 fatal bird strikes to examine the relationships of strikes relative to age, sex, time of day, time of year, and a variety of other factors, including mitigation efforts. We found that actively migrating birds may not be major contributors to collisions as has been found elsewhere. We found that males and young birds were both significantly overrepresented relative to their abundance in the habitat surrounding the building. We also analyzed the effect of external window shades as mitigation, finding that an overall reduction in large panes, whether covered or in some way broken up with mullions, effectively reduced window collisions. We conclude that effective mitigation or design will be required in all seasons, but that breeding seasons and migratory seasons are most critical, especially for low-rise buildings and other sites away from urban migrant traps. Finally, strikes occur throughout the day, but mitigation may be most effective in the morning and midday.

  1. Bird-Window Collisions at a West-Coast Urban Park Museum: Analyses of Bird Biology and Window Attributes from Golden Gate Park, San Francisco

    PubMed Central

    Kahle, Logan Q.; Flannery, Maureen E.; Dumbacher, John P.

    2016-01-01

    Bird-window collisions are a major and poorly-understood generator of bird mortality. In North America, studies of this topic tend to be focused east of the Mississippi River, resulting in a paucity of data from the Western flyways. Additionally, few available data can critically evaluate factors such as time of day, sex and age bias, and effect of window pane size on collisions. We collected and analyzed 5 years of window strike data from a 3-story building in a large urban park in San Francisco, California. To evaluate our window collision data in context, we collected weekly data on local bird abundance in the adjacent parkland. Our study asks two overarching questions: first–what aspects of a bird’s biology might make them more likely to fatally strike windows; and second, what characteristics of a building’s design contribute to bird-window collisions. We used a dataset of 308 fatal bird strikes to examine the relationships of strikes relative to age, sex, time of day, time of year, and a variety of other factors, including mitigation efforts. We found that actively migrating birds may not be major contributors to collisions as has been found elsewhere. We found that males and young birds were both significantly overrepresented relative to their abundance in the habitat surrounding the building. We also analyzed the effect of external window shades as mitigation, finding that an overall reduction in large panes, whether covered or in some way broken up with mullions, effectively reduced window collisions. We conclude that effective mitigation or design will be required in all seasons, but that breeding seasons and migratory seasons are most critical, especially for low-rise buildings and other sites away from urban migrant traps. Finally, strikes occur throughout the day, but mitigation may be most effective in the morning and midday. PMID:26731417

  2. Stereo matching using census cost over cross window and segmentation-based disparity refinement

    NASA Astrophysics Data System (ADS)

    Li, Qingwu; Ni, Jinyan; Ma, Yunpeng; Xu, Jinxin

    2018-03-01

    Stereo matching is a vital requirement for many applications, such as three-dimensional (3-D) reconstruction, robot navigation, object detection, and industrial measurement. To improve the practicability of stereo matching, a method using a census cost over a cross window and segmentation-based disparity refinement is proposed. First, a cross window is obtained using distance difference and intensity similarity in the binocular images. The census cost over the cross window and a color cost are combined as the matching cost, which is aggregated by a guided filter. Then, a winner-takes-all strategy is used to calculate the initial disparities. Second, a graph-based segmentation method is combined with color and edge information to achieve moderate under-segmentation. The segmented regions are classified into reliable regions and unreliable regions by consistency checking. Finally, the two kinds of regions are optimized by plane fitting and propagation, respectively, to match the ambiguous pixels. Experimental results on the Middlebury Stereo Datasets show that the proposed method performs well in occluded and discontinuous regions and obtains smoother disparity maps with a lower average matching error rate than other algorithms.
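    The census cost mentioned above can be illustrated with a fixed square window; the paper's cross (adaptive) window and the additional color term are omitted. This is a hedged Python sketch with assumed grayscale inputs, not the authors' implementation.

        import numpy as np

        def census_transform(img, radius=3):
            """Census transform: for each pixel, a bit vector recording whether each
            neighbour in the (2*radius+1)^2 window is darker than the centre pixel."""
            bits = []
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    if dy == 0 and dx == 0:
                        continue
                    shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                    bits.append((shifted < img).astype(np.uint8))
            return np.stack(bits, axis=-1)                   # shape (rows, cols, n_bits)

        def census_matching_cost(census_left, census_right, disparity):
            """Hamming distance between left pixels and right pixels shifted by `disparity`."""
            shifted = np.roll(census_right, disparity, axis=1)
            return np.sum(census_left != shifted, axis=-1)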

  3. Moving-window dynamic optimization: design of stimulation profiles for walking.

    PubMed

    Dosen, Strahinja; Popović, Dejan B

    2009-05-01

    The overall goal of this research is to improve control for electrical stimulation-based assistance of walking in hemiplegic individuals. We present a simulation for generating an offline input (sensors)-output (intensity of muscle stimulation) representation of walking that serves in synthesizing a rule base for control of electrical stimulation for restoration of walking. The simulation uses a new algorithm termed moving-window dynamic optimization (MWDO). The optimization criterion was to minimize the sum of the squares of tracking errors from desired trajectories, with a penalty function on the total muscle effort. The MWDO was developed in the MATLAB environment and tested using target trajectories characteristic of slow-to-normal walking recorded in a healthy individual and a model with parameters characterizing a potential hemiplegic user. The outputs of the simulation are piecewise constant intensities of electrical stimulation and the trajectories generated when the calculated stimulation is applied to the model. We demonstrated the importance of this simulation by showing the outputs for healthy and hemiplegic individuals using the same target trajectories. Results of the simulation show that the MWDO is an efficient tool for analyzing achievable trajectories and for determining the stimulation profiles that need to be delivered for good tracking.
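    The moving-window idea can be sketched as a receding-horizon loop: over each short window, piecewise-constant stimulation intensities are chosen to minimize the squared tracking error plus an effort penalty, only the first intensity is applied, and the window slides forward. The Python sketch below is a hypothetical illustration, not the MATLAB implementation described here; `simulate(state, u)` is an assumed user-supplied one-step musculoskeletal model.

        import numpy as np
        from scipy.optimize import minimize

        def mwdo(x0, target, simulate, horizon=5, effort_weight=1e-3, u_max=1.0):
            """Moving-window dynamic optimization of a stimulation profile.
            `simulate(state, u)` returns (next_state, output); `target` is the desired output."""
            state, profile = x0, []
            n_steps = len(target)
            for k in range(n_steps):
                win = min(horizon, n_steps - k)

                def window_cost(u_win, state=state, k=k, win=win):
                    s, cost = state, 0.0
                    for j in range(win):
                        s, y = simulate(s, u_win[j])
                        cost += (y - target[k + j]) ** 2 + effort_weight * u_win[j] ** 2
                    return cost

                res = minimize(window_cost, np.zeros(win), method="L-BFGS-B",
                               bounds=[(0.0, u_max)] * win)
                u_now = res.x[0]                       # apply only the first intensity
                state, _ = simulate(state, u_now)      # advance the model by one step
                profile.append(u_now)
            return np.array(profile)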

  4. Noise activated bistable sensor based on chaotic system with output defined by temporal coding and firing rate

    NASA Astrophysics Data System (ADS)

    Korneta, Wojciech; Gomes, Iacyel

    2017-11-01

    Traditional bistable sensors use an external bias signal to drive the response between states, and their detection strategy is based on the output power spectral density or the residence time difference (RTD) between the two sensor states. Recently, noise-activated nonlinear dynamic sensors driven only by noise and based on the RTD technique have been proposed. Here, we present experimental results of dc voltage measurements by a noise-driven bistable sensor based on an electronic Chua's circuit operating in a chaotic regime where two single-scroll attractors coexist. The output of the sensor is quantified by the proportion of time the sensor stays in one state relative to the total observation time and by the spike-count rate, with spikes defined by crossings between attractors. The relationship between the stimulus and each observable for different noise intensities is obtained, the usefulness of each coding scheme is discussed, and the optimal noise intensity for detection is indicated. It is shown that the obtained relationship is the same for any observation time when population coding is used. The optimal time window for detection and the number of units in population coding are found. Our results may be useful for analyzing and understanding neural activity and for designing bistable storage elements at length scales where thermal fluctuations increase drastically and the effect of noise must be taken into consideration.
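    The two read-outs described above (residence proportion and spike-count rate) can be computed from a recorded state variable as in the hedged Python sketch below; the threshold separating the two attractors is an assumed placeholder, not a value from the experiment.

        import numpy as np

        def sensor_observables(x, dt, threshold=0.0):
            """Residence proportion in the 'upper' attractor and spike-count rate,
            with spikes defined as crossings between attractors."""
            in_upper = np.asarray(x) > threshold
            proportion_upper = np.mean(in_upper)                          # time above threshold / total time
            crossings = np.count_nonzero(np.diff(in_upper.astype(int)))   # attractor switches
            spike_rate = crossings / (len(x) * dt)                        # switches per unit time
            return proportion_upper, spike_rate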

  5. An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.

    PubMed

    Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad

    2016-01-01

    The Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach involves applying a Fast Fourier Transform (FFT) to the signal multiplied with an appropriate window function of fixed resolution. Selecting an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution, which results in high spectral resolution at low frequencies and high temporal resolution at high frequencies. In this paper, a simple but effective switching framework between the STFT and the CQT is provided. The proposed method also allows the dynamic construction of a filter bank according to user-defined parameters, which helps reduce redundant entries in the filter bank. The proposed method not only improves spectrogram visualization but also reduces the computation cost and selects the appropriate window length in 87.71% of cases.
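    The fixed-resolution trade-off that motivates the adaptive model can be seen by computing the STFT of a narrow-band test signal with several window lengths, as in the Python sketch below (a generic illustration using scipy, not the proposed model itself).

        import numpy as np
        from scipy.signal import stft

        fs = 8000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        # Narrow-band test signal: two closely spaced tones, which favour a long window.
        x = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 450 * t)

        for nperseg in (256, 1024, 4096):                 # candidate window lengths
            f, tt, Zxx = stft(x, fs=fs, nperseg=nperseg)
            df = f[1] - f[0]                              # frequency resolution of the spectrogram
            dt = tt[1] - tt[0]                            # time resolution of the spectrogram
            print(f"window={nperseg:5d}  df={df:6.2f} Hz  dt={dt * 1000:6.1f} ms")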

  6. Pattern-based IP block detection, verification, and variability analysis

    NASA Astrophysics Data System (ADS)

    Ahmad Ibrahim, Muhamad Asraf Bin; Muhsain, Mohamad Fahmi Bin; Kamal Baharin, Ezni Aznida Binti; Sweis, Jason; Lai, Ya-Chieh; Hurat, Philippe

    2018-03-01

    The goal of a foundry partner is to deliver high quality silicon product to its customers on time. There is an assumed trust that the silicon will yield, function and perform as expected when the design fits all the sign-off criteria. The use of Intellectual Property (IP) blocks is very common today and provides the customer with pre-qualified and optimized functions for their design thus shortening the design cycle. There are many methods by which an IP Block can be generated and placed within layout. Even with the most careful methods and following of guidelines comes the responsibility of sign-off checking. A foundry needs to detect where these IP Blocks have been placed and look for any violations. This includes DRC clean modifications to the IP Block which may or may not be intentional. Using a pattern-based approach to detect all IP Blocks used provides the foundry advanced capabilities to analyze them further for any kind of changes which could void the OPC and process window optimizations. Having any changes in an IP Block could cause functionality changes or even failures. This also opens the foundry to legal and cost issues while at the same time forcing re-spins of the design. In this publication, we discuss the methodology we have employed to avoid process issues and tape-out errors while at the same time reduce our manual work and improve the turnaround time. We are also able to use our pattern analysis to improve our OPC optimizations when modifications are encountered which have not been seen before.

  7. Mission Analysis and Design for Space Based Inter-Satellite Laser Power Beaming

    DTIC Science & Technology

    2010-03-01

    Contents excerpts: 4.3.1 Darwin Results; 4.4 Obscuration Analysis; Appendix B, Additional Multi-Dimensional Darwin Plots from ModelCenter; Appendix C, STK Access Report for ...; figure: Darwin Data Explorer Window Showing Optimized Results in Tabular Form.

  8. Dielectric Windows with a Flat-Topped Characteristic of Transparency

    NASA Astrophysics Data System (ADS)

    Shcherbak, V. V.

    2013-09-01

    A design of radio-transparent baffles in a waveguide with substantially improved matching to the waveguide tract is suggested and optimized over a broad frequency range. The structure is a strip diaphragm inside a dielectric layer. On this basis, an efficient absorber is also created.

  9. Optimal dredge fleet scheduling within environmental work windows.

    DOT National Transportation Integrated Search

    2016-09-15

    The U.S. Army Corps of Engineers (USACE) annually spends more than 100 million dollars on dredging hundreds of navigation projects on more than 12,000 miles of inland and intra-coastal waterways. Building on previous work with USACE, this project exp...

  10. TH-CD-202-08: Feasibility Study of Planning Phase Optimization Using Patient Geometry-Driven Information for Better Dose Sparing of Organ at Risks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, S; Kim, D; Kim, T

    2016-06-15

    Purpose: To propose a simple and effective cost value function for searching for the optimal planning phase (gating window) and to demonstrate its feasibility for respiratory-correlated radiation therapy. Methods: We acquired 4DCT of 10 phases for 10 lung patients who had tumors located near OARs such as the esophagus, heart, and spinal cord (i.e., central lung cancer patients). A simplified mathematical optimization function was established using the overlap volume histogram (OVH) between the target and each organ at risk (OAR) at each phase and the tolerance dose of the selected OARs to achieve dose sparing of surrounding OARs. For all patients and all phases, delineation of the target volume and the selected OARs (esophagus, heart, and spinal cord) was performed (by one observer to avoid inter-observer variation), and cost values were then calculated for all phases. After the breathing phases were ranked according to the cost value function, the relationship between the score and the dose distribution at the highest- and lowest-cost-value phases was evaluated by comparing the mean/max doses. Results: The simplified mathematical cost value function showed noticeable differences from phase to phase, implying that it is possible to find optimal phases for the gating window. The lowest cost value, which may result in a lower mean/max dose to the OARs, was distributed at various phases across patients. The mean doses to the OARs decreased by about 10%, with statistical significance, for all 3 OARs at the phase with the lowest cost value. Also, the max doses to the OARs were about 2-5% lower at the phase with the lowest cost value compared with the phase with the highest cost value. Conclusion: It is demonstrated that the optimal phases (from a dose distribution perspective) for the gating window can differ from patient to patient, and the proposed cost value function can be a useful tool for determining such phases without performing dose optimization calculations. This research was supported by the Mid-career Researcher Program through NRF funded by the Ministry of Science, ICT & Future Planning of Korea (NRF-2014R1A2A1A10050270) and by the Radiation Technology R&D program through the National Research Foundation of Korea funded by the Ministry of Science, ICT & Future Planning (No. 2013M2A2A7038291).
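    The geometry-driven ranking described above can be sketched as follows: for each breathing phase, each OAR is scored by the fraction of its volume lying within a margin of the target (an OVH-style overlap measure) weighted by the inverse of its tolerance dose, and phases are ranked by the summed score. This Python sketch is an assumed simplification, not the study's actual cost value function; the masks, margin, and tolerance doses are placeholders.

        import numpy as np
        from scipy.ndimage import binary_dilation

        def phase_cost(target_mask, oar_masks, tolerance_doses, margin_voxels=5):
            """Lower is better: OARs that overlap an expanded target and have low
            tolerance doses drive the cost up for that phase."""
            expanded = binary_dilation(target_mask, iterations=margin_voxels)
            cost = 0.0
            for oar, tol in zip(oar_masks, tolerance_doses):
                overlap = np.count_nonzero(expanded & oar) / max(np.count_nonzero(oar), 1)
                cost += overlap / tol
            return cost

        def rank_phases(phases, tolerance_doses):
            """`phases` is a list of (target_mask, [oar_masks]) per 4DCT phase; returns
            phase indices sorted from lowest to highest cost, plus the costs."""
            costs = [phase_cost(t, oars, tolerance_doses) for t, oars in phases]
            return list(np.argsort(costs)), costs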

  11. Optimization of rainfall networks using information entropy and temporal variability analysis

    NASA Astrophysics Data System (ADS)

    Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-04-01

    Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential and important. Information entropy can not only represent the uncertainty of the rainfall distribution but also reflect the correlation and information transmission between rainfall stations. Using entropy, this study performs optimization of rainfall networks of similar size located in two big cities in China, Shanghai (in the Yangtze River basin) and Xi'an (in the Yellow River basin), with respect to temporal variability analysis. Through an easy-to-implement greedy ranking algorithm based on the criterion called Maximum Information Minimum Redundancy (MIMR), stations of the networks in the two areas (each area is further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. It is found that observation series with different starting days affect the ranking, pointing to temporal variability during network evaluation. We propose a dynamic network evaluation framework for considering temporal variability, which ranks stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). In this way, we can identify rainfall stations that are temporarily important or redundant and provide useful suggestions for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, during different periods (wet season or dry season) the optimal network from MIMR exhibits differences in entropy values, and the optimal network from the wet season tends to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that optimizing the rainfall network for changing meteorological conditions may be advisable.
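    A greedy entropy-based ranking in the spirit of MIMR can be sketched as below: station records are discretized into histograms, and stations are added one at a time by trading marginal entropy against average redundancy (mutual information) with the stations already selected. This Python sketch is a simplified stand-in, not the exact MIMR objective used in this study.

        import numpy as np

        def entropy(x, bins=10):
            p, _ = np.histogram(x, bins=bins)
            p = p[p > 0] / p.sum()
            return -np.sum(p * np.log2(p))

        def mutual_info(x, y, bins=10):
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

        def greedy_rank(rainfall, bins=10):
            """rainfall: array (n_days, n_stations). Rank stations by marginal entropy
            minus average redundancy with the stations already selected."""
            n = rainfall.shape[1]
            selected, remaining = [], list(range(n))
            while remaining:
                def score(j):
                    h = entropy(rainfall[:, j], bins)
                    if not selected:
                        return h
                    red = np.mean([mutual_info(rainfall[:, j], rainfall[:, s], bins)
                                   for s in selected])
                    return h - red
                best = max(remaining, key=score)
                selected.append(best)
                remaining.remove(best)
            return selected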

  12. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information, as any skew in perceived event likelihood towards the end of a forecast time window may result in an underestimate of the likelihood of an event occurring ‘today’, leading to potentially inappropriate action choices. We thus present some initial guidelines for communicating such eruption forecasts.

  13. Extragalactic Fields Optimized for Adaptive Optics

    DTIC Science & Technology

    2011-03-01

    Received 2010 July 19; accepted 2010 December 30; published 2011 March 1. Abstract excerpts: in this article we present the coordinates of 67 55' x ... fields; the ... Window contains most of the patches we have identified; our optimal field, centered at R.A. 7h24m3s, decl. -1°27..., has an additional advantage of ... equivalent observations undertaken in existing deep fields. Online material: color figures.

  14. The Time Window Vehicle Routing Problem Considering Closed Route

    NASA Astrophysics Data System (ADS)

    Irsa Syahputri, Nenna; Mawengkang, Herman

    2017-12-01

    The Vehicle Routing Problem (VRP) determines the optimal set of routes used by a fleet of vehicles to serve a given set of customers on a predefined graph; the objective is to minimize the total travel cost (related to travel times or distances) and the operational cost (related to the number of vehicles used). In this paper we study a variant on a predefined graph: given a weighted graph G with vertices a and b, and given a set X of closed paths in G, find a minimum-total-travel-cost a-b path P such that no path in X is a subpath of P. Path P is allowed to repeat vertices and edges. We use an integer programming model to describe the problem, and a feasible neighbourhood approach is proposed to solve the model.
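    A basic building block of construction and neighbourhood heuristics for routing with time windows is a feasibility check of a candidate route, sketched below in Python. The travel times, service times, and windows are placeholder inputs; this is a generic illustration rather than the integer programming model or neighbourhood approach proposed in the paper.

        def route_is_feasible(route, travel_time, windows, service_time, depot_start=0.0):
            """Check time-window feasibility of a route (list of customer indices).
            `travel_time[i][j]` is the travel time between nodes, `windows[c] = (earliest, latest)`,
            and waiting is allowed if the vehicle arrives before a window opens."""
            t, prev = depot_start, 0                     # node 0 is the depot
            for c in route:
                t += travel_time[prev][c]
                earliest, latest = windows[c]
                if t > latest:                           # arrived after the window closed
                    return False
                t = max(t, earliest) + service_time[c]   # wait if early, then serve
                prev = c
            return True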

  15. Evaluation of sliding window correlation performance for characterizing dynamic functional connectivity and brain states

    PubMed Central

    Shakil, Sadia; Lee, Chin-Hui; Keilholz, Shella Dawn

    2016-01-01

    A promising recent development in the study of brain function is the dynamic analysis of resting-state functional MRI scans, which can enhance understanding of normal cognition and alterations that result from brain disorders. One widely used method of capturing the dynamics of functional connectivity is sliding window correlation (SWC). However, in the absence of a “gold standard” for comparison, evaluating the performance of the SWC in typical resting-state data is challenging. This study uses simulated networks (SNs) with known transitions to examine the effects of parameters such as window length, window offset, window type, noise, filtering, and sampling rate on the SWC performance. The SWC time course was calculated for all node pairs of each SN and then clustered using the k-means algorithm to determine how resulting brain states match known configurations and transitions in the SNs. The outcomes show that the detection of state transitions and durations in the SWC is most strongly influenced by the window length and offset, followed by noise and filtering parameters. The effect of the image sampling rate was relatively insignificant. Tapered windows provide less sensitivity to state transitions than rectangular windows, which could be the result of the sharp transitions in the SNs. Overall, the SWC gave poor estimates of correlation for each brain state. Clustering based on the SWC time course did not reliably reflect the underlying state transitions unless the window length was comparable to the state duration, highlighting the need for new adaptive window analysis techniques. PMID:26952197
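    The SWC-plus-clustering pipeline evaluated here can be sketched as follows: pairwise correlations are computed inside each sliding window and the resulting vectors are clustered with k-means to label brain states. The Python sketch below is a generic illustration with placeholder window length, offset, and state count, not the study's exact processing chain.

        import numpy as np
        from itertools import combinations
        from sklearn.cluster import KMeans

        def sliding_window_correlation(ts, window, offset):
            """ts: array (n_timepoints, n_nodes). Returns an array (n_windows, n_pairs)
            of pairwise Pearson correlations computed in each window."""
            n_t, n_nodes = ts.shape
            pairs = list(combinations(range(n_nodes), 2))
            rows = []
            for start in range(0, n_t - window + 1, offset):
                seg = ts[start:start + window]
                c = np.corrcoef(seg, rowvar=False)        # node-by-node correlation in this window
                rows.append([c[i, j] for i, j in pairs])
            return np.array(rows)

        def estimate_states(ts, window=30, offset=1, n_states=4):
            swc = sliding_window_correlation(ts, window, offset)
            labels = KMeans(n_clusters=n_states, n_init=10).fit_predict(swc)
            return swc, labels                            # one state label per window position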

  16. Effects of inclination and eccentricity on optimal trajectories between earth and Venus

    NASA Technical Reports Server (NTRS)

    Gravier, J.-P.; Marchal, C.; Culp, R. D.

    1973-01-01

    The true optimal transfers, including the effects of the inclination and eccentricity of the planets' orbits, between earth and Venus are presented as functions of the corresponding idealized Hohmann transfers. The method of determining the optimal transfers using the calculus of variations is presented. For every possible Hohmann window, specified as a continuous function of the longitude of perihelion of the Hohmann trajectory, the corresponding numerically exact optimal two-impulse transfers are given in graphical form. The cases for which the optimal two-impulse transfer is the absolute optimal, and those for which a three-impulse transfer provides the absolute optimal transfer are indicated. This information furnishes everything necessary for quick and accurate orbit calculations for preliminary Venus mission analysis. This makes it possible to use the actual optimal transfers for advanced planning in place of the standard Hohmann transfers.

  17. Surface Transient Binding-Based Fluorescence Correlation Spectroscopy (STB-FCS), a Simple and Easy-to-Implement Method to Extend the Upper Limit of the Time Window to Seconds.

    PubMed

    Peng, Sijia; Wang, Wenjuan; Chen, Chunlai

    2018-05-10

    Fluorescence correlation spectroscopy is a powerful single-molecule tool that is able to capture kinetic processes occurring at the nanosecond time scale. However, the upper limit of its time window is restricted by the dwell time of the molecule of interest in the confocal detection volume, which is usually around submilliseconds for a freely diffusing biomolecule. Here, we present a simple and easy-to-implement method, named surface transient binding-based fluorescence correlation spectroscopy (STB-FCS), which extends the upper limit of the time window to seconds. We further demonstrated that STB-FCS enables capture of both intramolecular and intermolecular kinetic processes whose time scales cross several orders of magnitude.

  18. Photoacoustic CO2 sensor system: design and potential for miniaturization and integration in silicon

    NASA Astrophysics Data System (ADS)

    Huber, J.; Wöllenstein, J.

    2015-05-01

    The detection of CO2 indoors has a large impact on today's sensor market. The ambient room climate is important for human health and wellbeing. The CO2 concentration is a main indicator of indoor climate and correlates with the number of persons inside a room. People in Europe spend more than 90% of their time indoors. This leads to a high demand for miniaturized and energy-efficient CO2 sensors. To realize small and energy-efficient mass-market sensors, we develop novel miniaturized photoacoustic sensor systems with an optimized design for real-time and selective CO2 detection. The sensor system consists of two chambers, a measurement chamber and a detection chamber. The detection chamber contains an integrated pressure sensor under a special gas atmosphere. As the pressure sensor, we use a commercially available cell phone microphone. We describe a possible miniaturization process for the developed system by considering the possibility of integrating all sensor parts. The system is manufactured with precision mechanics and IR-optical sapphire windows as optical connections. During the miniaturization process the sapphire windows are replaced by Si chips with a special IR anti-reflection coating. The developed system is characterized in detail with gas measurements and optical transmission investigations. The results of the characterization process show a high potential for further miniaturization and a high capability for mass-market applications.

  19. Low-cost real-time 3D PC distributed-interactive-simulation (DIS) application for C4I

    NASA Astrophysics Data System (ADS)

    Gonthier, David L.; Veron, Harry

    1998-04-01

    A 3D Distributed Interactive Simulation (DIS) application was developed and demonstrated in a PC environment. The application is capable of running in stealth mode or as a player that includes battlefield simulations, such as ModSAF. PCs can be clustered together, but not necessarily collocated, to run a simulation or training exercise on their own. A 3D perspective view of the battlefield is displayed that includes terrain, trees, buildings and other objects supported by the DIS application. Screen update rates of 15 to 20 frames per second have been achieved with fully lit and textured scenes, thus providing high-quality and fast graphics. A complete PC system can be configured for under $2,500. The software runs under Windows 95 and Windows NT. It is written in C++ and uses a commercial API called RenderWare for 3D rendering. The software uses Microsoft Foundation Classes and Microsoft DirectPlay for joystick input. The RenderWare libraries enhance performance through optimization for MMX and the Pentium Pro processor. RenderWare is used together with the Righteous 3D graphics board from Orchid Technologies, which has an advertised rendering rate of up to 2 million texture-mapped triangles per second. A low-cost PC DIS simulator that can partake in a real-time collaborative simulation with other platforms is thus achieved.

  20. A theory of eu-estrogenemia: a unifying concept

    PubMed Central

    Turner, Ralph J.; Kerber, Irwin J.

    2017-01-01

    Abstract Objective: The aim of the study was to propose a unifying theory for the role of estrogen in postmenopausal women through examples in basic science, randomized controlled trials, observational studies, and clinical practice. Methods: Review and evaluation of the literature relating to estrogen. Discussion: The role of hormone therapy and ubiquitous estrogen receptors after reproductive senescence gains insight from basic science models. Observational studies and individualized patient care in clinical practice may show outcomes that are not reproduced in randomized clinical trials. The understanding gained from the timing hypothesis for atherosclerosis, the critical window theory in neurosciences, randomized controlled trials, and numerous genomic and nongenomic actions of estrogen discovered in basic science provides new explanations to clinical challenges that practitioners face. Consequences of a hypo-estrogenemic duration in women's lives are poorly understood. The Study of Women Across the Nation suggests its magnitude is greater than was previously acknowledged. We propose that the healthy user bias was the result of surgical treatment (hysterectomy with oophorectomy) for many gynecological maladies followed by pharmacological and physiological doses of estrogen to optimize patient quality of life. The past decade of research has begun to demonstrate the role of estrogen in homeostasis. Conclusions: The theory of eu-estrogenemia provides a robust framework to unify the timing hypothesis, critical window theory, randomized controlled trials, the basic science of estrogen receptors, and clinical observations of patients over the past five decades. PMID:28562489

  1. Visual tool for estimating the fractal dimension of images

    NASA Astrophysics Data System (ADS)

    Grossu, I. V.; Besliu, C.; Rusu, M. V.; Jipa, Al.; Bordeianu, C. C.; Felea, D.

    2009-10-01

    This work presents a new Visual Basic 6.0 application for estimating the fractal dimension of images, based on an optimized version of the box-counting algorithm. Following the attempt to separate the real information from "noise", we also considered the family of all band-pass filters with the same band-width (specified as a parameter). The fractal dimension can thus be represented as a function of the pixel color code. The program was used for the study of cracks in paintings, as an additional tool that can help the critic decide whether an artistic work is original or not. Program summary: Program title: Fractal Analysis v01; Catalogue identifier: AEEG_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 29 690; No. of bytes in distributed program, including test data, etc.: 4 967 319; Distribution format: tar.gz; Programming language: MS Visual Basic 6.0; Computer: PC; Operating system: MS Windows 98 or later; RAM: 30M; Classification: 14; Nature of problem: Estimating the fractal dimension of images; Solution method: Optimized implementation of the box-counting algorithm, use of a band-pass filter for separating the real information from "noise", user-friendly graphical interface; Restrictions: Although various file types can be used, the application was mainly conceived for the 8-bit grayscale Windows bitmap file format; Running time: In a first approximation, the algorithm is linear.
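    The box-counting estimate implemented by the program can be sketched in a few lines: count occupied boxes of several sizes on a binary image and fit the slope of log(count) against log(1/size). The Python sketch below is a generic illustration that omits the band-pass filtering step of the original tool.

        import numpy as np

        def box_counting_dimension(binary_img, sizes=(2, 4, 8, 16, 32, 64)):
            """Estimate the fractal dimension of a 2-D binary image by box counting:
            the slope of log(box count) versus log(1/box size)."""
            counts = []
            for s in sizes:
                h = (binary_img.shape[0] // s) * s
                w = (binary_img.shape[1] // s) * s
                trimmed = binary_img[:h, :w]
                blocks = trimmed.reshape(h // s, s, w // s, s)
                occupied = blocks.any(axis=(1, 3))       # box contains at least one 'on' pixel
                counts.append(occupied.sum())
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope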

  2. Region of interest and windowing-based progressive medical image delivery using JPEG2000

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Mukhopadhyay, Sudipta; Wheeler, Frederick W.; Avila, Ricardo S.

    2003-05-01

    An important telemedicine application is the perusal of CT scans (digital format) from a central server housed in a healthcare enterprise across a bandwidth constrained network by radiologists situated at remote locations for medical diagnostic purposes. It is generally expected that a viewing station respond to an image request by displaying the image within 1-2 seconds. Owing to limited bandwidth, it may not be possible to deliver the complete image in such a short period of time with traditional techniques. In this paper, we investigate progressive image delivery solutions by using JPEG 2000. An estimate of the time taken in different network bandwidths is performed to compare their relative merits. We further make use of the fact that most medical images are 12-16 bits, but would ultimately be converted to an 8-bit image via windowing for display on the monitor. We propose a windowing progressive RoI technique to exploit this and investigate JPEG 2000 RoI based compression after applying a favorite or a default window setting on the original image. Subsequent requests for different RoIs and window settings would then be processed at the server. For the windowing progressive RoI mode, we report a 50% reduction in transmission time.

  3. Hospital-treated mental and behavioral disorders and risk of Alzheimer's disease: A nationwide nested case-control study.

    PubMed

    Tapiainen, V; Hartikainen, S; Taipale, H; Tiihonen, J; Tolppanen, A-M

    2017-06-01

    Studies investigating psychiatric disorders as Alzheimer's disease (AD) risk factors have yielded heterogeneous findings. Differences in time windows between the exposure and outcome could be one explanation. We examined whether (1) mental and behavioral disorders in general or (2) specific mental and behavioral disorder categories increase the risk of AD and (3) how the width of the time window between the exposure and outcome affects the results. A nationwide nested case-control study of all Finnish clinically verified AD cases, alive in 2005 and their age, sex and region of residence matched controls (n of case-control pairs 27,948). History of hospital-treated mental and behavioral disorders was available since 1972. Altogether 6.9% (n=1932) of the AD cases and 6.4% (n=1784) of controls had a history of any mental and behavioral disorder. Having any mental and behavioral disorder (adjusted OR=1.07, 95% CI=1.00-1.16) or depression/other mood disorder (adjusted OR=1.17, 95% CI=1.05-1.30) were associated with higher risk of AD with 5-year time window but not with 10-year time window (adjusted OR, 95% CI 0.99, 0.91-1.08 for any disorder and 1.08, 0.96-1.23 for depression). The associations between mental and behavioral disorders and AD were modest and dependent on the time window. Therefore, some of the disorders may represent misdiagnosed prodromal symptoms of AD, which underlines the importance of proper differential diagnostics among older persons. These findings also highlight the importance of appropriate time window in psychiatric and neuroepidemiology research. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  4. Human Mars Mission: Launch Window from Earth Orbit. Pt. 1

    NASA Technical Reports Server (NTRS)

    Young, Archie

    1999-01-01

    The determination of orbital window characteristics is of major importance in the analysis of human interplanetary missions and systems. The orbital launch window characteristics are directly involved in the selection of mission trajectories, the development of orbit operational concepts, and the design of orbital launch systems. The orbital launch window problem arises because of the dynamic nature of the relative geometry between the outgoing (departure) asymptote of the hyperbolic escape trajectory and the Earth parking orbit. The orientation of the escape hyperbola asymptote relative to Earth is a function of time. The required hyperbola energy level also varies with time. In addition, the inertial orientation of the parking orbit is a function of time because of the perturbations caused by the Earth's oblateness. Thus, a coplanar injection onto the escape hyperbola can be made only at a point in time when the outgoing escape asymptote is contained in the plane of the parking orbit. Even though this condition may be planned as the nominal situation, it will not generally represent the more probable injection geometry. The general case of an escape injection maneuver performed at a time other than the coplanar time will involve both a path angle change and a plane change and, therefore, a DELTA V penalty. Usually, because of the DELTA V penalty, the actual departure injection window is smaller in duration than that determined by the energy requirement alone. This report contains the formulation, characteristics, and test cases for five different launch window modes for Earth orbit: (1) one impulsive maneuver from a highly elliptical orbit (HEO); (2) two impulsive maneuvers from a HEO; (3) one impulsive maneuver from a low Earth orbit (LEO); (4) two impulsive maneuvers from LEO; and (5) three impulsive maneuvers from LEO.

  5. Calculation of Retention Time Tolerance Windows with Absolute Confidence from Shared Liquid Chromatographic Retention Data

    PubMed Central

    Boswell, Paul G.; Abate-Pella, Daniel; Hewitt, Joshua T.

    2015-01-01

    Compound identification by liquid chromatography-mass spectrometry (LC-MS) is a tedious process, mainly because authentic standards must be run on a user’s system to be able to confidently reject a potential identity from its retention time and mass spectral properties. Instead, it would be preferable to use shared retention time/index data to narrow down the identity, but shared data cannot be used to reject candidates with an absolute level of confidence because the data are strongly affected by differences between HPLC systems and experimental conditions. However, a technique called “retention projection” was recently shown to account for many of the differences. In this manuscript, we discuss an approach to calculate appropriate retention time tolerance windows for projected retention times, potentially making it possible to exclude candidates with an absolute level of confidence, without needing to have authentic standards of each candidate on hand. In a range of multi-segment gradients and flow rates run among seven different labs, the new approach calculated tolerance windows that were significantly more appropriate for each retention projection than global tolerance windows calculated for retention projections or linear retention indices. Though there were still some small differences between the labs that evidently were not taken into account, the calculated tolerance windows only needed to be relaxed by 50% to make them appropriate for all labs. Even then, 42% of the tolerance windows calculated in this study without standards were narrower than those required by WADA for positive identification, where standards must be run contemporaneously. PMID:26292624

  6. Calculation of retention time tolerance windows with absolute confidence from shared liquid chromatographic retention data.

    PubMed

    Boswell, Paul G; Abate-Pella, Daniel; Hewitt, Joshua T

    2015-09-18

    Compound identification by liquid chromatography-mass spectrometry (LC-MS) is a tedious process, mainly because authentic standards must be run on a user's system to be able to confidently reject a potential identity from its retention time and mass spectral properties. Instead, it would be preferable to use shared retention time/index data to narrow down the identity, but shared data cannot be used to reject candidates with an absolute level of confidence because the data are strongly affected by differences between HPLC systems and experimental conditions. However, a technique called "retention projection" was recently shown to account for many of the differences. In this manuscript, we discuss an approach to calculate appropriate retention time tolerance windows for projected retention times, potentially making it possible to exclude candidates with an absolute level of confidence, without needing to have authentic standards of each candidate on hand. In a range of multi-segment gradients and flow rates run among seven different labs, the new approach calculated tolerance windows that were significantly more appropriate for each retention projection than global tolerance windows calculated for retention projections or linear retention indices. Though there were still some small differences between the labs that evidently were not taken into account, the calculated tolerance windows only needed to be relaxed by 50% to make them appropriate for all labs. Even then, 42% of the tolerance windows calculated in this study without standards were narrower than those required by WADA for positive identification, where standards must be run contemporaneously. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. A fast algorithm for vertex-frequency representations of signals on graphs

    PubMed Central

    Jestrović, Iva; Coyle, James L.; Sejdić, Ervin

    2016-01-01

    The windowed Fourier transform (short time Fourier transform) and the S-transform are widely used signal processing tools for extracting frequency information from non-stationary signals. Previously, the windowed Fourier transform had been adopted for signals on graphs and has been shown to be very useful for extracting vertex-frequency information from graphs. However, high computational complexity makes these algorithms impractical. We sought to develop a fast windowed graph Fourier transform and a fast graph S-transform requiring significantly shorter computation time. The proposed schemes have been tested with synthetic test graph signals and real graph signals derived from electroencephalography recordings made during swallowing. The results showed that the proposed schemes provide significantly lower computation time in comparison with the standard windowed graph Fourier transform and the fast graph S-transform. Also, the results showed that noise has no effect on the results of the algorithm for the fast windowed graph Fourier transform or on the graph S-transform. Finally, we showed that graphs can be reconstructed from the vertex-frequency representations obtained with the proposed algorithms. PMID:28479645

  8. Parallelization of a hydrological model using the message passing interface

    USGS Publications Warehouse

    Wu, Yiping; Li, Tiejian; Sun, Liqun; Chen, Ji

    2013-01-01

    With the increasing knowledge about the natural processes, hydrological models such as the Soil and Water Assessment Tool (SWAT) are becoming larger and more complex with increasing computation time. Additionally, other procedures such as model calibration, which may require thousands of model iterations, can increase running time and thus further reduce rapid modeling and analysis. Using the widely-applied SWAT as an example, this study demonstrates how to parallelize a serial hydrological model in a Windows® environment using a parallel programing technology—Message Passing Interface (MPI). With a case study, we derived the optimal values for the two parameters (the number of processes and the corresponding percentage of work to be distributed to the master process) of the parallel SWAT (P-SWAT) on an ordinary personal computer and a work station. Our study indicates that model execution time can be reduced by 42%–70% (or a speedup of 1.74–3.36) using multiple processes (two to five) with a proper task-distribution scheme (between the master and slave processes). Although the computation time cost becomes lower with an increasing number of processes (from two to five), this enhancement becomes less due to the accompanied increase in demand for message passing procedures between the master and all slave processes. Our case study demonstrates that the P-SWAT with a five-process run may reach the maximum speedup, and the performance can be quite stable (fairly independent of a project size). Overall, the P-SWAT can help reduce the computation time substantially for an individual model run, manual and automatic calibration procedures, and optimization of best management practices. In particular, the parallelization method we used and the scheme for deriving the optimal parameters in this study can be valuable and easily applied to other hydrological or environmental models.
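    The master/slave task distribution described above can be illustrated with a minimal mpi4py sketch (not the actual P-SWAT code): the master splits the task list, every rank including the master runs its share, and results are gathered back on the master. `run_subbasin` is a placeholder for the real model call; run with, e.g., mpiexec -n 4 python sketch.py.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        def run_subbasin(i):
            """Placeholder for one subbasin simulation; the real model call goes here."""
            return i * i

        if rank == 0:
            tasks = list(range(100))                      # e.g. subbasin indices
            chunks = np.array_split(tasks, size)          # master keeps one chunk, workers get the rest
        else:
            chunks = None

        my_tasks = comm.scatter(chunks, root=0)           # distribute the work
        my_results = [run_subbasin(i) for i in my_tasks]  # each process runs its share
        all_results = comm.gather(my_results, root=0)     # collect results on the master

        if rank == 0:
            flat = [r for part in all_results for r in part]
            print(f"collected {len(flat)} subbasin results on the master process")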

  9. A Monte Carlo simulation study of an improved K-edge log-subtraction X-ray imaging using a photon counting CdTe detector

    NASA Astrophysics Data System (ADS)

    Lee, Youngjin; Lee, Amy Candy; Kim, Hee-Joung

    2016-09-01

    Recently, significant effort has been devoted to the development of photon counting detectors (PCDs) based on CdTe for applications in X-ray imaging systems. The motivation for developing PCDs is higher image quality. In particular, the K-edge subtraction (KES) imaging technique using a PCD is able to improve image quality and is useful for increasing the contrast resolution of a target material by utilizing a contrast agent. Based on the above-mentioned technique, we present an idea for an improved K-edge log-subtraction (KELS) imaging technique. The KELS imaging technique based on PCDs can be realized by using different subtraction energy widths of the energy window. In this study, the effects of the KELS imaging technique and the subtraction energy width of the energy window were investigated with respect to contrast, standard deviation, and CNR using a Monte Carlo simulation. We simulated a PCD X-ray imaging system based on CdTe and a polymethylmethacrylate (PMMA) phantom containing various iodine contrast agents. To acquire KELS images, images of the phantom were acquired over different energy ranges above and below the iodine contrast agent K-edge absorption energy (33.2 keV). According to the results, the contrast and standard deviation decreased when the subtraction energy width of the energy window was increased. Also, the CNR using the KELS imaging technique was higher than that of images acquired using the whole energy range. In particular, the maximum differences in CNR between the whole-energy-range and KELS images using 1, 2, and 3 mm diameter iodine contrast agents were 11.33, 8.73, and 8.29 times, respectively. Additionally, the optimum subtraction energy widths of the energy window were 5, 4, and 3 keV for the 1, 2, and 3 mm diameter iodine contrast agents, respectively. In conclusion, we successfully established an improved KELS imaging technique and optimized the subtraction energy width of the energy window, and based on our results, we recommend using this technique for high image quality.
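    The log-subtraction step itself can be illustrated as below: photon-count images from energy windows just below and just above the iodine K-edge (33.2 keV) are log-transformed and subtracted, which suppresses background tissue and enhances the contrast agent. This Python sketch is a generic illustration with placeholder inputs, not the simulated imaging chain of the study.

        import numpy as np

        def k_edge_log_subtraction(counts_below, counts_above, weight=1.0):
            """Difference of log-transformed transmission images acquired below and
            above the K-edge; iodine attenuates much more strongly above the edge,
            so the difference is largest where contrast agent is present."""
            eps = 1e-6                                    # avoid log(0) in empty pixels
            kes = np.log(counts_below + eps) - weight * np.log(counts_above + eps)
            return kes                                    # `weight` may be tuned to null soft tissue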

  10. A hyperspectral image optimizing method based on sub-pixel MTF analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yun; Li, Kai; Wang, Jinqiang; Zhu, Yajie

    2015-04-01

    Hyperspectral imaging collects tens or hundreds of images continuously divided across the electromagnetic spectrum so that details at different wavelengths can be represented. A popular hyperspectral imaging method uses a tunable optical band-pass filter placed in front of the focal plane to acquire images at different wavelengths. To alleviate the influence of chromatic aberration in some segments of a hyperspectral series, this paper provides a hyperspectral optimization method that uses the sub-pixel MTF to evaluate image blurring. The method acquires the edge feature in the target window and uses the line spread function (LSF) to calculate a reliable position for the edge feature; the evaluation grid in each line is then interpolated from the real pixel values based on its position relative to the optimal edge, and the sub-pixel MTF is used to analyze the image in the frequency domain, which increases the dimension of the MTF calculation. The sub-pixel MTF evaluation is reliable, since no image rotation or pixel value estimation is needed and no artificial information is introduced. Theoretical analysis shows that the method proposed in this paper is reliable and efficient when evaluating common images with edges of small tilt angle in real scenes. It also provides a direction for subsequent hyperspectral image blurring evaluation and real-time focal-plane adjustment in related imaging systems.

  11. How the Brain Repairs Stuttering

    ERIC Educational Resources Information Center

    Kell, Christian A.; Neumann, Katrin; von Kriegstein, Katharina; Posenenske, Claudia; von Gudenberg, Alexander W.; Euler, Harald; Giraud, Anne-Lise

    2009-01-01

    Stuttering is a neurodevelopmental disorder associated with left inferior frontal structural anomalies. While children often recover, stuttering may also spontaneously disappear much later after years of dysfluency. These rare cases of unassisted recovery in adulthood provide a model of optimal brain repair outside the classical windows of…

  12. Optimizing Battery Usage and Management for Long Life

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler; Shi, Ying; Wood, Eric

    2016-06-16

    This presentation discusses the impact of system design factors on battery aging and end of life. Topics include sizing of the SOC operating window, cell balancing and thermal management systems and their value in reducing pack degradation rates and cell imbalance growth over lifetime.

  13. A Tri-Band Frequency Selective Surface (FSS) to Diplex Widely Separated Bands for Millimeter Wave Remote Sensing

    NASA Astrophysics Data System (ADS)

    Poojali, Jayaprakash; Ray, Shaumik; Pesala, Bala; Chitti, Krishnamurthy V.; Arunachalam, Kavitha

    2016-10-01

    A substrate-backed frequency selective surface (FSS) is presented for diplexing the widely separated frequency spectrum centered at 55, 89, and 183 GHz with varying bandwidth for spatial separation in the quasi-optical feed network of the millimeter wave sounder. A unit cell composed of a crossed dipole integrated with a circular ring and loaded inside a square ring is optimized for tri-band frequency response with transmission window at 89 GHz and rejection windows at 55 and 183 GHz. The reflection and transmission losses predicted for the optimized unit cell (728 μm × 728 μm) composed of dissimilar resonant shapes is less than 0.5 dB for transverse electric (TE) and transverse magnetic (TM) polarizations and wide angle of incidence (0°-45°). The FSS is fabricated on a 175-μm-thick quartz substrate using microfabrication techniques. The transmission characteristics measured with continuous wave (CW) terahertz transmit receive system are in good agreement with the numerical simulations.

  14. Optimized ex-ovo culturing of chick embryos to advanced stages of development.

    PubMed

    Cloney, Kellie; Franz-Odendaal, Tamara Anne

    2015-01-24

    Research in anatomy, embryology, and developmental biology has largely relied on the use of model organisms. In order to study development in live embryos, model organisms such as the chicken are often used. The chicken is an excellent model organism due to its low cost and minimal maintenance; however, chicken embryos present observational challenges because they are enclosed in an opaque eggshell. In order to properly view the embryo as it develops, the shell must be windowed or removed. Both windowing and ex ovo techniques have been developed to assist researchers in the study of embryonic development. However, each of these methods has limitations and challenges. Here, we present a simple, optimized ex ovo culture technique for chicken embryos that enables the observation of embryonic development from stage HH 19 into late stages of development (HH 40), when many organs have developed. This technique is easy to adopt in both undergraduate classes and more advanced research laboratories where embryo manipulations are conducted.

  15. 77 FR 46950 - Post Office Organization and Administration: Establishment, Classification, and Discontinuance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-07

    ... defines a remotely managed Post Office (RMPO) as a Post Office that offers part-time window service hours... Administrative Post Office. The final rule also defines a part-time Post Office (PTPO) as a Post Office that offers part-time window service hours, is staffed by a Postal Service employee, and reports to a district...

  16. Decision-support models for empiric antibiotic selection in Gram-negative bloodstream infections.

    PubMed

    MacFadden, D R; Coburn, B; Shah, N; Robicsek, A; Savage, R; Elligsen, M; Daneman, N

    2018-04-25

    Early empiric antibiotic therapy in patients can improve clinical outcomes in Gram-negative bacteraemia. However, the widespread prevalence of antibiotic-resistant pathogens compromises our ability to provide adequate therapy while minimizing use of broad antibiotics. We sought to determine whether readily available electronic medical record data could be used to develop predictive models for decision support in Gram-negative bacteraemia. We performed a multi-centre cohort study, in Canada and the USA, of hospitalized patients with Gram-negative bloodstream infection from April 2010 to March 2015. We analysed multivariable models for prediction of antibiotic susceptibility at two empiric windows: Gram-stain-guided and pathogen-guided treatment. Decision-support models for empiric antibiotic selection were developed based on three clinical decision thresholds of acceptable adequate coverage (80%, 90% and 95%). A total of 1832 patients with Gram-negative bacteraemia were evaluated. Multivariable models showed good discrimination across countries and at both Gram-stain-guided (12 models, areas under the curve (AUCs) 0.68-0.89, optimism-corrected AUCs 0.63-0.85) and pathogen-guided (12 models, AUCs 0.75-0.98, optimism-corrected AUCs 0.64-0.95) windows. Compared to antibiogram-guided therapy, decision-support models of antibiotic selection incorporating individual patient characteristics and prior culture results have the potential to increase use of narrower-spectrum antibiotics (in up to 78% of patients) while reducing inadequate therapy. Multivariable models using readily available epidemiologic factors can be used to predict antimicrobial susceptibility in infecting pathogens with reasonable discriminatory ability. Implementation of sequential predictive models for real-time individualized empiric antibiotic decision-making has the potential to optimize adequate coverage for patients while minimizing overuse of broad-spectrum antibiotics, and therefore warrants further prospective evaluation. Readily available epidemiologic risk factors can be used to predict susceptibility of Gram-negative organisms among patients with bacteraemia, using automated decision-making models. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
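
    As a rough illustration of how the clinical decision thresholds above could drive empiric selection, the sketch below picks the narrowest agent whose predicted probability of adequate coverage meets the chosen threshold. The agents, their ordering and the probabilities are hypothetical placeholders, not outputs of the study's models.

    ```python
    # Minimal sketch of a threshold-based empiric antibiotic selection rule.
    # The agents, their narrow-to-broad ordering and the predicted probabilities
    # are hypothetical placeholders, not values or outputs from the study.

    def select_empiric_agent(predictions, coverage_threshold=0.90):
        """Return the narrowest agent whose predicted probability of adequate
        coverage meets the chosen threshold (e.g. 0.80, 0.90 or 0.95)."""
        for agent, p_susceptible in predictions:      # ordered narrow -> broad
            if p_susceptible >= coverage_threshold:
                return agent
        return predictions[-1][0]                     # fall back to broadest agent

    # Hypothetical model output for one patient at the Gram-stain-guided window.
    patient_predictions = [
        ("ceftriaxone", 0.82),
        ("piperacillin-tazobactam", 0.93),
        ("meropenem", 0.99),
    ]

    for threshold in (0.80, 0.90, 0.95):
        print(threshold, "->", select_empiric_agent(patient_predictions, threshold))
    ```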

  17. Tumour-on-a-chip provides an optical window into nanoparticle tissue transport

    NASA Astrophysics Data System (ADS)

    Albanese, Alexandre; Lam, Alan K.; Sykes, Edward A.; Rocheleau, Jonathan V.; Chan, Warren C. W.

    2013-10-01

    Nanomaterials are used for numerous biomedical applications, but the selection of optimal properties for maximum delivery remains challenging. Thus, there is a significant interest in elucidating the nano-bio interactions underlying tissue accumulation. To date, researchers have relied on cell culture or animal models to study nano-bio interactions. However, cell cultures lack the complexity of biological tissues and animal models are prohibitively slow and expensive. Here we report a tumour-on-a-chip system where incorporation of tumour-like spheroids into a microfluidic channel permits real-time analysis of nanoparticle (NP) accumulation at physiological flow conditions. We show that penetration of NPs into the tissue is limited by their diameter and that retention can be improved by receptor targeting. NP transport is predominantly diffusion-limited with convection improving accumulation mostly at the tissue perimeter. A murine tumour model confirms these findings and demonstrates that the tumour-on-a-chip can be useful for screening optimal NP designs prior to in vivo studies.

  18. Imaging windows for long-term intravital imaging

    PubMed Central

    Alieva, Maria; Ritsma, Laila; Giedt, Randy J; Weissleder, Ralph; van Rheenen, Jacco

    2014-01-01

    Intravital microscopy is increasingly used to visualize and quantitate dynamic biological processes at the (sub)cellular level in live animals. By visualizing tissues through imaging windows, individual cells (e.g., cancer, host, or stem cells) can be tracked and studied over a time-span of days to months. Several imaging windows have been developed to access tissues including the brain, superficial fascia, mammary glands, liver, kidney, pancreas, and small intestine among others. Here, we review the development of imaging windows and compare the most commonly used long-term imaging windows for cancer biology: the cranial imaging window, the dorsal skin fold chamber, the mammary imaging window, and the abdominal imaging window. Moreover, we provide technical details, considerations, and trouble-shooting tips on the surgical procedures and microscopy setups for each imaging window and explain different strategies to assure imaging of the same area over multiple imaging sessions. This review aims to be a useful resource for establishing the long-term intravital imaging procedure. PMID:28243510

  19. Imaging windows for long-term intravital imaging: General overview and technical insights.

    PubMed

    Alieva, Maria; Ritsma, Laila; Giedt, Randy J; Weissleder, Ralph; van Rheenen, Jacco

    2014-01-01

    Intravital microscopy is increasingly used to visualize and quantitate dynamic biological processes at the (sub)cellular level in live animals. By visualizing tissues through imaging windows, individual cells (e.g., cancer, host, or stem cells) can be tracked and studied over a time-span of days to months. Several imaging windows have been developed to access tissues including the brain, superficial fascia, mammary glands, liver, kidney, pancreas, and small intestine among others. Here, we review the development of imaging windows and compare the most commonly used long-term imaging windows for cancer biology: the cranial imaging window, the dorsal skin fold chamber, the mammary imaging window, and the abdominal imaging window. Moreover, we provide technical details, considerations, and trouble-shooting tips on the surgical procedures and microscopy setups for each imaging window and explain different strategies to assure imaging of the same area over multiple imaging sessions. This review aims to be a useful resource for establishing the long-term intravital imaging procedure.

  20. Dose-dependent effects of ouabain on spiral ganglion neurons and Schwann cells in mouse cochlea.

    PubMed

    Zhang, Zhi-Jian; Guan, Hong-Xia; Yang, Kun; Xiao, Bo-Kui; Liao, Hua; Jiang, Yang; Zhou, Tao; Hua, Qing-Quan

    2017-10-01

    This study aimed to fully investigate the toxicity of ouabain in the mouse cochlea and the related cellular environment, and to provide an optimal animal model system for cell transplantation in the treatment of auditory neuropathy (AN) and sensorineural hearing loss (SNHL). Different dosages of ouabain were applied to the mouse round window. Auditory brainstem responses and distortion product otoacoustic emissions were used to evaluate cochlear function. Immunohistochemical staining and cochlear surface preparations were performed to detect spiral ganglion neurons (SGNs), Schwann cells and hair cells. Ouabain at dosages of 0.5 mM, 1 mM and 3 mM selectively and permanently destroyed SGNs and their function, while leaving the hair cells relatively intact. Ouabain at 3 mM resulted in the most severe SGN loss and induced significant loss of Schwann cells starting as early as 7 days, with further damage at 14 and 30 days after ouabain exposure. The application of ouabain to the mouse round window therefore induces damage to SGNs and Schwann cells in a dose- and time-dependent manner, and this study established a reliable and accurate animal model system of AN and SNHL.

  1. Diagnosing and ranking retinopathy disease level using diabetic fundus image recuperation approach.

    PubMed

    Somasundaram, K; Rajendran, P Alli

    2015-01-01

    Retinal fundus images are widely used in diagnosing different types of eye diseases. Existing methods such as Feature Based Macular Edema Detection (FMED) and the Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudate detection, respectively. However, these exudate detection schemes did not incorporate a more detailed feature selection technique for the detection of diabetic retinopathy. To categorize the exudates, a Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. DFIR feature selection uses a collection of sliding windows of varying range to obtain features based on histogram values using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of disease level on each candidate set provides a promising basis for developing a practical automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images evaluates the DFIR method in terms of sensitivity, ranking efficiency, and feature selection time.

  2. Diagnosing and Ranking Retinopathy Disease Level Using Diabetic Fundus Image Recuperation Approach

    PubMed Central

    Somasundaram, K.; Alli Rajendran, P.

    2015-01-01

    Retinal fundus images are widely used in diagnosing different types of eye diseases. Existing methods such as Feature Based Macular Edema Detection (FMED) and the Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudate detection, respectively. However, these exudate detection schemes did not incorporate a more detailed feature selection technique for the detection of diabetic retinopathy. To categorize the exudates, a Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. DFIR feature selection uses a collection of sliding windows of varying range to obtain features based on histogram values using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of disease level on each candidate set provides a promising basis for developing a practical automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images evaluates the DFIR method in terms of sensitivity, ranking efficiency, and feature selection time. PMID:25945362
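
    The sliding-window feature-extraction step described in these two records can be illustrated with a short sketch: square windows of varying size are slid over a grayscale image and a normalized intensity histogram is collected from each. The window sizes, bin count and synthetic image below are arbitrary choices, not the DFIR parameters.

    ```python
    import numpy as np

    # Illustrative sliding-window histogram features in the spirit of the DFIR
    # feature-selection stage. Window sizes, bin count and the synthetic image
    # are arbitrary choices, not the paper's parameters.

    def sliding_window_histograms(image, window_size, step, bins=16):
        """Normalized intensity histograms from square windows slid over the image."""
        features = []
        h, w = image.shape
        for top in range(0, h - window_size + 1, step):
            for left in range(0, w - window_size + 1, step):
                patch = image[top:top + window_size, left:left + window_size]
                hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0), density=True)
                features.append(hist)
        return np.asarray(features)

    rng = np.random.default_rng(0)
    image = rng.random((128, 128))          # stand-in for a grayscale fundus image
    for size in (16, 32, 64):               # "sliding windows with varying range"
        feats = sliding_window_histograms(image, window_size=size, step=size // 2)
        print(size, feats.shape)
    ```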

  3. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peplow, Douglas E.; Mosher, Scott W; Evans, Thomas M

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution, and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
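
    The core CADIS relations, a biased source proportional to the product of the source and the adjoint flux, and weight-window targets proportional to the reciprocal of the adjoint flux, can be sketched on a toy space/energy grid as below. The arrays and the window width ratio are invented for illustration and stand in for a real deterministic adjoint solution.

    ```python
    import numpy as np

    # Toy CADIS illustration on a small space/energy grid: a deterministic adjoint
    # solution biases the source and sets consistent weight-window targets. The
    # arrays and the window width ratio are invented for illustration.

    rng = np.random.default_rng(1)
    source = rng.random((20, 5))           # q(r, E): unbiased source distribution
    source /= source.sum()
    adjoint = 0.1 + rng.random((20, 5))    # phi_dagger(r, E): adjoint "importance"

    # Estimated response R, biased source q_hat = q * phi_dagger / R, and
    # weight-window targets w = R / phi_dagger (source particles born in-window).
    R = np.sum(source * adjoint)
    biased_source = source * adjoint / R
    target_weight = R / adjoint

    c = 5.0                                # assumed window width ratio
    w_lower = 2.0 * target_weight / (c + 1.0)
    w_upper = c * w_lower

    print("biased source sums to 1:", np.isclose(biased_source.sum(), 1.0))
    print("birth weights equal targets:",
          np.allclose(source / biased_source, target_weight))
    print("window bounds span", float(w_lower.min()), "to", float(w_upper.max()))
    ```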

  4. Energy efficiency façade design in high-rise apartment buildings using the calculation of solar heat transfer through windows with shading devices

    NASA Astrophysics Data System (ADS)

    Ha, P. T. H.

    2018-04-01

    The architectural design orientation established at the first design stage plays a key role and has a great impact on the energy consumption of a building throughout its life cycle. To provide designers with a simple and useful tool for quantitatively determining and optimizing the energy efficiency of a building at the very first stage of conceptual design, a factor termed building envelope energy efficiency (Khqnl) is investigated and proposed. Heat transfer through windows and other glazed areas of mezzanine floors accounts for 86% of overall thermal transfer through the building envelope, so the Khqnl factor of high-rise buildings largely depends on shading solutions. The author has established tables and charts giving the values of the Khqnl factor for certain high-rise apartment buildings in Hanoi, calculated with a software program for various inputs including types and sizes of shading devices, building orientations, and different points in time. Architects can refer to these tables and charts in façade design to achieve a higher level of energy efficiency.
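
    A back-of-the-envelope solar heat gain estimate shows the kind of quantity such tables and charts summarize; the sketch below uses a generic window heat-gain expression and illustrative shading coefficients, not the author's Khqnl formulation.

    ```python
    # Generic solar heat gain estimate for a shaded window; a rough sketch only,
    # not the author's Khqnl formulation. SHGC, irradiance and shading factors
    # below are illustrative values.

    def window_solar_gain(area_m2, irradiance_w_m2, shgc, shading_factor):
        """Heat gain (W) = area * incident irradiance * solar heat gain coefficient
        * fraction of solar radiation NOT blocked by the shading device."""
        return area_m2 * irradiance_w_m2 * shgc * (1.0 - shading_factor)

    glazed_area = 12.0      # m2 of glazing on one apartment facade (assumed)
    irradiance = 600.0      # W/m2 incident on the facade at the analysed time (assumed)
    shgc = 0.6              # glazing solar heat gain coefficient (assumed)

    for name, shading in [("no shading", 0.0),
                          ("horizontal overhang", 0.35),
                          ("overhang + vertical fins", 0.55)]:
        gain = window_solar_gain(glazed_area, irradiance, shgc, shading)
        print(f"{name}: {gain:.0f} W")
    ```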

  5. Use phase signals to promote lifetime extension for Windows PCs.

    PubMed

    Hickey, Stewart; Fitzpatrick, Colin; O'Connell, Maurice; Johnson, Michael

    2009-04-01

    This paper proposes a signaling methodology for personal computers. Signaling may be viewed as an ecodesign strategy that can positively influence the consumer-to-consumer (C2C) market process. A number of parameters are identified that can provide the basis for signal implementation. These include operating time, operating temperature, operating voltage, power cycle counts, hard disk drive (HDD) Self-Monitoring, Analysis and Reporting Technology (SMART) attributes, and operating system (OS) event information. All of these parameters are currently attainable or derivable via embedded technologies in modern desktop systems. A case study is presented detailing a technical implementation of how such signals can be developed on personal computers running Microsoft Windows operating systems. Collation of lifetime temperature data from a system processor is demonstrated as a possible means of characterizing a usage profile for a desktop system. In addition, event log data is utilized to devise signals indicative of OS quality. The provision of lifetime usage data in the form of intuitive signals indicative of both hardware and software quality can, in conjunction with consumer education, facilitate an optimal remarketing strategy for used systems. This implementation requires no additional hardware.
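
    The temperature-based usage profile mentioned above can be approximated by binning logged processor temperatures into duty-cycle bands, as in the sketch below; the synthetic log and the temperature bands are illustrative only.

    ```python
    import numpy as np

    # Sketch of turning logged processor temperatures into a simple usage-profile
    # signal. The synthetic log and the temperature bands are illustrative only.

    rng = np.random.default_rng(2)
    # Pretend lifetime log: one temperature sample (deg C) per minute of operation.
    temperatures = np.clip(rng.normal(55, 12, size=50_000), 25, 95)

    bands = {"idle / cool (<45 C)": (0, 45),
             "typical load (45-70 C)": (45, 70),
             "heavy load (>70 C)": (70, 200)}

    print(f"logged operating time: {temperatures.size / 60:.0f} h")
    for label, (lo, hi) in bands.items():
        share = np.mean((temperatures >= lo) & (temperatures < hi))
        print(f"{label}: {100 * share:.1f}% of operating time")
    ```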

  6. Constraint based modeling of metabolism allows finding metabolic cancer hallmarks and identifying personalized therapeutic windows.

    PubMed

    Bordel, Sergio

    2018-04-13

    In order to choose optimal personalized anticancer treatments, transcriptomic data should be analyzed within the frame of biological networks. The best known human biological network (in terms of the interactions between its different components) is metabolism. Cancer cells have long been known to have specific metabolic features, and there is currently a growing interest in characterizing new cancer-specific metabolic hallmarks. This article presents a method to find personalized therapeutic windows using RNA-seq data and genome-scale metabolic models. The method is implemented in the python library pyTARG. Our predictions showed that the most selectively anticancer single metabolic reactions (affecting 27 out of 34 considered cancer cell lines and only 1 out of 6 healthy mesenchymal stem cell lines) are those involved in cholesterol biosynthesis. Excluding cholesterol biosynthesis, all the considered cell lines can be selectively affected by targeting different combinations (from 1 to 5 reactions) of only 18 metabolic reactions, which suggests that a small subset of drugs or siRNAs combined in patient-specific ways could be at the core of metabolism-based personalized treatments.
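
    Constraint-based analysis of the kind described here rests on flux balance analysis: maximize an objective flux subject to steady-state stoichiometry and reaction bounds, then block a candidate target reaction and compare the optimum. The toy two-metabolite network below is invented for illustration and is unrelated to pyTARG or any genome-scale model.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy flux balance analysis (FBA): maximise a "biomass" flux subject to
    # steady-state stoichiometry, then block one reaction and compare growth.
    # The 2-metabolite network is invented and unrelated to pyTARG or any
    # genome-scale model.

    # Reactions: v1 uptake->A, v2 A->B (main enzyme), v3 A->B (bypass), v4 B->biomass
    S = np.array([[1, -1, -1,  0],     # metabolite A balance
                  [0,  1,  1, -1]])    # metabolite B balance
    objective = np.array([0, 0, 0, -1])            # maximise v4 (linprog minimises)

    def max_growth(bounds):
        res = linprog(objective, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        return res.x[3]

    base_bounds = [(0, 10), (0, 1000), (0, 2), (0, 1000)]
    knockout_bounds = [(0, 10), (0, 0), (0, 2), (0, 1000)]   # block the main A->B step

    wild_type = max_growth(base_bounds)
    knocked_out = max_growth(knockout_bounds)
    print(f"growth {wild_type:.1f} -> {knocked_out:.1f} after knockout "
          f"({100 * knocked_out / wild_type:.0f}% of baseline)")
    ```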

  7. Photorefractive-based adaptive optical windows

    NASA Astrophysics Data System (ADS)

    Liu, Yuexin; Yang, Yi; Wang, Bo; Fu, John Y.; Yin, Shizhuo; Guo, Ruyan; Yu, Francis T.

    2004-10-01

    Optical windows have been widely used in optical spectrographic processing systems. In this paper, various window profiles, such as rectangular, triangular, Hamming, Hanning, and Blackman, are investigated in detail with regard to their effect on the generated spectrograms, such as the joint time-frequency resolution ΔtΔω and the sidelobe amplitude attenuation. All of these windows can be synthesized in a photorefractive crystal by an angular-multiplexing holographic technique, which renders the system more adaptive. Experimental results are provided.
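
    The sidelobe behaviour that distinguishes these window profiles is easy to quantify numerically: the sketch below reports the peak sidelobe level of each window's spectrum. The window length and zero-padding factor are arbitrary choices.

    ```python
    import numpy as np
    from scipy.signal import get_window, find_peaks

    # Peak sidelobe level (dB below the main lobe) for a few classic window
    # profiles. Window length and zero-padding factor are arbitrary choices.

    N = 64
    for name in ["boxcar", "triang", "hamming", "hann", "blackman"]:
        w = get_window(name, N)
        spectrum = np.abs(np.fft.rfft(w, n=32 * N))      # heavily zero-padded
        spectrum_db = 20 * np.log10(spectrum / spectrum.max() + 1e-12)
        sidelobes, _ = find_peaks(spectrum_db)           # local maxima = sidelobes
        print(f"{name:>8}: peak sidelobe {spectrum_db[sidelobes].max():6.1f} dB")
    ```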

  8. Noise normalization and windowing functions for VALIDAR in wind parameter estimation

    NASA Astrophysics Data System (ADS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Li, Zhiwen

    2006-05-01

    The wind parameter estimates from a state-of-the-art 2-μm coherent lidar system located at NASA Langley, Virginia, named VALIDAR (validation lidar), were compared after normalizing the noise by its estimated power spectra via the periodogram and the linear predictive coding (LPC) scheme. The power spectra and the Doppler shift estimates were the main parameter estimates for comparison. Different types of windowing functions were implemented in the VALIDAR data-processing algorithm and their impact on the wind parameter estimates was observed. Time- and frequency-independent windowing functions such as rectangular, Hanning, and Kaiser-Bessel windows were compared with a time- and frequency-dependent apodized windowing function. A brief overview of ongoing nonlinear algorithm development for Doppler shift correction follows.
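
    The periodogram-based noise normalization can be sketched as follows: a noise power spectrum estimated from a signal-free record is used to flatten the spectrum of a data record before the Doppler peak is located. The sample rate, tone frequency and coloured-noise filter below are illustrative, not VALIDAR parameters.

    ```python
    import numpy as np
    from scipy.signal import welch, periodogram

    # Sketch of periodogram-based noise normalisation: a noise power spectrum
    # estimated from a signal-free record flattens (whitens) the spectrum of a
    # data record before the Doppler peak is located. Sample rate, tone frequency
    # and the coloured-noise filter are illustrative, not VALIDAR parameters.

    rng = np.random.default_rng(3)
    fs, n = 1.0e6, 1 << 14

    def coloured_noise(size):
        """Low-pass filtered white noise, so the noise floor is not flat."""
        return np.convolve(rng.normal(size=size + 7), np.ones(8) / 8, mode="valid")

    reference = coloured_noise(n)                    # signal-free record
    tone = 0.05 * np.sin(2 * np.pi * 2.0e5 * np.arange(n) / fs)
    data = coloured_noise(n) + tone                  # record containing a Doppler tone

    f_ref, noise_psd = welch(reference, fs=fs, window="hann", nperseg=1024)
    f, data_psd = periodogram(data, fs=fs, window="hann")

    # Interpolate the smoothed noise estimate onto the periodogram grid and divide.
    whitened = data_psd / np.interp(f, f_ref, noise_psd)

    peak = 1 + np.argmax(whitened[1:])               # skip the DC bin
    floor = np.median(whitened)
    print(f"Doppler peak at {f[peak] / 1e3:.0f} kHz, "
          f"{10 * np.log10(whitened[peak] / floor):.0f} dB above the whitened floor")
    ```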

  9. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-03

    Chromatographic background drift correction, which influences peak detection and time-shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers in this vector, which belong to the chromatographic peaks, and to update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight data from a metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, the moving-window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
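
    A rough reimplementation of the described strategy, collect local minima, iteratively pull down minima that sit on peaks, then interpolate the surviving baseline nodes over the full time axis, is sketched below; the outlier rule and the synthetic chromatogram are simplifications, not the authors' exact procedure.

    ```python
    import numpy as np
    from scipy.signal import argrelmin

    # Rough sketch of the described strategy: collect local minima as baseline
    # nodes, iteratively pull down nodes that sit on chromatographic peaks, then
    # linearly interpolate the surviving baseline over the full time axis. The
    # outlier rule and the synthetic chromatogram are simplifications.

    def estimate_baseline(signal, min_order=3, max_iter=200):
        idx = argrelmin(signal, order=min_order)[0]
        idx = np.unique(np.concatenate(([0], idx, [signal.size - 1])))
        vals = signal[idx].astype(float)
        for _ in range(max_iter):
            # A node above the mean of its neighbours is treated as an outlier
            # sitting on a peak and is pulled down; stop when nothing changes.
            clipped = np.minimum(vals[1:-1], 0.5 * (vals[:-2] + vals[2:]))
            if np.allclose(clipped, vals[1:-1]):
                break
            vals[1:-1] = clipped
        return np.interp(np.arange(signal.size), idx, vals)

    t = np.linspace(0, 30, 3000)
    true_drift = 0.02 * t + 0.3 * np.sin(t / 6)
    peaks = sum(a * np.exp(-0.5 * ((t - c) / 0.15) ** 2) for a, c in [(2, 8), (3, 15), (1.5, 22)])
    rng = np.random.default_rng(4)
    chromatogram = true_drift + peaks + 0.01 * rng.normal(size=t.size)

    baseline = estimate_baseline(chromatogram)
    corrected = chromatogram - baseline
    print("max |baseline error|:", round(float(np.max(np.abs(baseline - true_drift))), 3))
    ```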

  10. Time-bound product returns and optimal order quantities for mass merchandisers

    NASA Astrophysics Data System (ADS)

    Yu, Min-Chun; Goh, Mark

    2012-01-01

    The return guidelines for a mass merchandiser usually entail a grace period, a markdown on the original price and the condition of the returned items. This research utilises eight scenarios formed from variations of possible return guidelines to model the cost functions of single-product categories for a typical mass merchandiser. Models for the eight scenarios are developed and solved with the objective of maximising the expected profit so as to obtain closed-form solutions for the associated optimal order quantity. An illustrative example and sensitivity analysis are provided to demonstrate the applicability of the model. Our results show that merchandisers who allow returns within a time window, albeit with a penalty cost imposed and with the returned products being recoverable, should plan for larger order quantities, as such returns do little harm to the business. Similarly, merchandisers who allow returns beyond the grace period without any penalty charges, but for whom the returned products are irrecoverable, should manage their stocks in this category more judiciously by ordering as little as possible so as to limit the number of returns, and should carefully consider the effects of their customer-satisfaction-guaranteed policies, if any.
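
    A generic single-period ("newsvendor-style") calculation illustrates how a return window shifts the optimal order quantity; the demand distribution, prices and return behaviour below are invented and do not correspond to the paper's eight scenarios or its closed-form solutions.

    ```python
    import numpy as np

    # Generic single-period ("newsvendor-style") illustration of how a return
    # window changes the optimal order quantity. Demand, prices and return
    # behaviour are invented and are not the paper's scenarios or solutions.

    rng = np.random.default_rng(5)
    demand = rng.normal(100, 25, size=20_000).clip(min=0)     # simulated demand

    price, unit_cost, salvage = 10.0, 6.0, 2.0
    return_rate, refund, recovery = 0.15, 9.0, 7.0   # share returned in the window,
                                                     # refund paid, value recovered

    def expected_profit(q):
        sold = np.minimum(demand, q)
        leftovers = q - sold
        returns = return_rate * sold
        revenue = price * sold - refund * returns + recovery * returns + salvage * leftovers
        return float(np.mean(revenue - unit_cost * q))

    quantities = np.arange(40, 161)
    best_q = quantities[np.argmax([expected_profit(q) for q in quantities])]
    print("optimal order quantity ~", int(best_q))
    ```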

  11. Process improvement for regulatory analyses of custom-blend fertilizers.

    PubMed

    Wegner, Keith A

    2014-01-01

    Chemical testing of custom-blend fertilizers is essential to ensure that the products meet the formulation requirements. For purposes of proper crop nutrition and consumer protection, regulatory oversight promotes compliance and particular attention to blending and formulation specifications. Analyses of custom-blend fertilizer products must be performed and reported within a very narrow window in order to be effective. The Colorado Department of Agriculture's Biochemistry Laboratory is an ISO 17025 accredited facility and conducts analyses of custom-blend fertilizer products primarily during the spring planting season. Using the Lean Six Sigma (LSS) process, the Biochemistry Laboratory has reduced turnaround times from as much as 45 days to as little as 3 days. The LSS methodology focuses on waste reduction by identifying non-value-added steps, unneeded process reviews, opportunities to optimize screening and confirmatory analyses, equipment utilization, nonessential reporting requirements, and inefficient personnel deployment. Eliminating these non-value-added activities helped the laboratory significantly shorten turnaround time and reduce costs. Key improvement elements discovered during the LSS process included focused sample tracking, equipment redundancy, strategic supply stocking, batch size optimization, critical sample paths, elimination of nonessential QC reviews, and more efficient personnel deployment.

  12. Elimination of scattered gamma rays from injection sites using upper offset energy windows in sentinel lymph node scintigraphy.

    PubMed

    Yoneyama, Hiroto; Tsushima, Hiroyuki; Onoguchi, Masahisa; Konishi, Takahiro; Nakajima, Kenichi; Kinuya, Seigo

    2015-05-01

    The identification of sentinel lymph nodes (SLNs) near injection sites is difficult because of scattered gamma rays. The purpose of this study was to investigate the optimal energy windows for eliminating scattered gamma rays in order to improve the detection of SLNs. The clinical study group consisted of 56 female patients with breast cancer. While the energy window was centred at 140 keV with a 20% width for Tc-99m, this window was divided into five 4% subwindows for planar imaging. Regions of interest were placed on the SLNs and the background, and contrast was calculated using a standard equation. The confidence levels of the interpretations were evaluated on a five-grade scale. The contrast provided by 145.6 keV±2% was the best, followed by 140 keV±2%, 151.2 keV±2%, 134.4 keV±2% and 128.8 keV±2%, in that order. When 128.8 keV±2% and 134.4 keV±2% were eliminated from 140 keV±10% (leaving 145.6 keV±6%), the contrast of the SLNs improved significantly. The confidence level of interpretation and the detection rate provided by the planar images with 140 keV±10% were 4.74±0.58 and 94.8%, respectively, and those provided by 145.6 keV±6% were 4.94±0.20 and 100%. Because lower energy windows contain many scattered gamma rays, upper offset energy windows, which exclude the lower energy windows, improve the image contrast of SLNs near injection sites.
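
    The effect of dropping the scatter-heavy lower subwindows can be illustrated with made-up ROI counts and a commonly used contrast definition, (SLN − background)/(SLN + background), which is assumed here since the record does not state the exact equation.

    ```python
    import numpy as np

    # Made-up ROI counts for the five 4% subwindows centred at 128.8, 134.4,
    # 140.0, 145.6 and 151.2 keV, and a commonly used contrast definition
    # (SLN - background) / (SLN + background), assumed here since the record
    # does not give the exact equation.

    sln_counts = np.array([900.0, 1100.0, 1500.0, 1300.0, 800.0])
    bg_counts = np.array([700.0, 750.0, 600.0, 350.0, 250.0])   # scatter-heavy low windows

    def contrast(selected):
        s, b = sln_counts[selected].sum(), bg_counts[selected].sum()
        return (s - b) / (s + b)

    full_window = [0, 1, 2, 3, 4]      # 140 keV +/- 10% (all five subwindows)
    upper_offset = [2, 3, 4]           # 145.6 keV +/- 6% (drops the two lowest)
    print("contrast, full 20% window:", round(contrast(full_window), 3))
    print("contrast, upper offset   :", round(contrast(upper_offset), 3))
    ```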

  13. Exposure to suboptimal temperatures during metamorphosis reveals a critical developmental window in the solitary bee, Megachile rotundata

    USDA-ARS?s Scientific Manuscript database

    Metamorphosis is an important developmental stage for holometabolous insects, during which adult morphology and physiology are established. Proper development relies on optimal body temperatures, and natural ambient temperature (Ta) fluctuations, especially in spring or in northern latitudes, could ...

  14. Optimizing Battery Usage and Management for Long Life

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler; Shi, Ying; Wood, Eric

    2016-06-16

    This presentation discusses the impact of system design factors on battery aging and end of life. Topics include sizing of the state-of-charge operating window, cell balancing, and thermal management systems and their value in reducing pack degradation rates and cell imbalance growth over lifetime.

  15. Evaluation of the pre-posterior distribution of optimized sampling times for the design of pharmacokinetic studies.

    PubMed

    Duffull, Stephen B; Graham, Gordon; Mengersen, Kerrie; Eccleston, John

    2012-01-01

    Information theoretic methods are often used to design studies that aim to learn about pharmacokinetic and linked pharmacokinetic-pharmacodynamic systems. These design techniques, such as D-optimality, provide the optimum experimental conditions. The performance of the optimum design will depend on the ability of the investigator to comply with the proposed study conditions. However, in clinical settings it is not possible to comply exactly with the optimum design and hence some degree of unplanned suboptimality occurs due to error in the execution of the study. In addition, due to the nonlinear relationship of the parameters of these models to the data, the designs are also locally dependent on an arbitrary choice of a nominal set of parameter values. A design that is robust to both study conditions and uncertainty in the nominal set of parameter values is likely to be of use clinically. We propose an adaptive design strategy to account for both execution error and uncertainty in the parameter values. In this study we investigate designs for a one-compartment first-order pharmacokinetic model. We do this in a Bayesian framework using Markov-chain Monte Carlo (MCMC) methods. We consider log-normal prior distributions on the parameters and investigate several prior distributions on the sampling times. An adaptive design was used to find the sampling window for the current sampling time conditional on the actual times of all previous samples.
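
    A minimal locally D-optimal comparison of candidate sampling-time sets for a one-compartment IV-bolus model is sketched below; the nominal parameters, candidate designs and additive-error assumption are illustrative, and the study itself works with Bayesian/MCMC methods and sampling windows rather than this simple criterion.

    ```python
    import numpy as np

    # Minimal locally D-optimal comparison of candidate sampling-time sets for a
    # one-compartment IV-bolus model C(t) = (Dose/V) * exp(-(CL/V) * t). Nominal
    # parameters, candidate designs and the additive-error assumption are
    # illustrative; the study uses Bayesian/MCMC methods and sampling windows.

    DOSE = 100.0
    CL, V = 5.0, 50.0                    # nominal clearance (L/h) and volume (L)

    def sensitivities(t):
        """Jacobian of C(t) with respect to (CL, V) at the nominal parameters."""
        k = CL / V
        c = DOSE / V * np.exp(-k * t)
        dc_dcl = -t / V * c
        dc_dv = c * (k * t - 1.0) / V
        return np.column_stack([dc_dcl, dc_dv])

    def d_criterion(times):
        J = sensitivities(np.asarray(times, dtype=float))
        fim = J.T @ J                    # Fisher information, equal-variance errors
        return np.linalg.det(fim)

    candidates = {
        "early only (0.5, 1, 2 h)": [0.5, 1, 2],
        "spread (0.5, 4, 12 h)": [0.5, 4, 12],
        "late only (8, 12, 24 h)": [8, 12, 24],
    }
    for label, times in candidates.items():
        print(f"{label}: det(FIM) = {d_criterion(times):.3e}")
    ```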

  16. High-impact resistance optical sensor windows

    NASA Astrophysics Data System (ADS)

    Askinazi, Joel; Ceccorulli, Mark L.; Goldman, Lee

    2011-06-01

    Recent field experience with optical sensor windows on both ground and airborne platforms has shown a significant increase in window fracturing from foreign object debris (FOD) impacts and as a by-product of asymmetrical warfare. Common optical sensor window materials such as borosilicate glass do not typically have high impact resistance. Emerging advanced optical window materials such as aluminum oxynitride offer the potential for a significant improvement in FOD impact resistance due to their superior surface hardness, fracture toughness and strength properties. To confirm the potential impact-resistance improvement achievable with these emerging materials, Goodrich ISR Systems, in collaboration with Surmet Corporation, undertook a set of comparative FOD impact tests of optical sensor windows made from borosilicate glass and from aluminum oxynitride. It was demonstrated that the aluminum oxynitride windows could withstand up to three times the FOD impact velocity (as compared with borosilicate glass) before fracture would occur. These highly encouraging test results confirm the utility of this new, highly viable window solution for use in new ground and airborne multispectral window applications as well as for retrofit to current production windows. We believe that this solution can go a long way toward significantly reducing the frequency and life-cycle cost of window replacement.

  17. Research on Scheduling Algorithm for Multi-satellite and Point Target Task on Swinging Mode

    NASA Astrophysics Data System (ADS)

    Wang, M.; Dai, G.; Peng, L.; Song, Z.; Chen, G.

    2012-12-01

    Nowadays, using satellites in space to observe the ground is a major method of obtaining ground information. With the development of space science and technology, fields such as the military and the economy have an ever-growing requirement for space technology because of the benefits of satellites: wide coverage, timeliness, and freedom from area and national restrictions. At the same time, because of the wide use of many kinds of satellites, sensors, repeater satellites and ground receiving stations, ground control systems now face great challenges. Therefore, how to make the best use of satellite resources becomes an important problem for a ground control system. Satellite scheduling distributes resources to all tasks without conflict so as to complete as many tasks as possible and meet user requirements, subject to the constraints of the satellites, sensors and ground receiving stations. Considering the size of a task, tasks can be divided into point tasks and area tasks; this paper considers only point targets. The paper first gives a description of the satellite scheduling problem and a brief introduction to the theory of satellite scheduling, and analyzes the resource and task constraints in scheduling satellites. The input and output flow of the scheduling process is also briefly described. On the basis of these analyses, we put forward a scheduling model, termed the multi-variable optimization model, for multi-satellite and point-target tasks in swinging mode. In this model, the scheduling problem is transformed into a parametric optimization problem; the parameters to be optimized are the swinging angles of the time windows. In view of efficiency and accuracy, several issues relating to satellite scheduling, such as the angular relation between satellites and ground targets, positive and negative swinging angles, and the computation of time windows, are analyzed and discussed, and several strategies to improve the efficiency of the model are put forward. To solve the model, we introduce the concept of an activity sequence map, which separates the choice of activity from the start time of the activity. We also introduce three neighborhood operators to search the solution space, and the front-movement remaining time and the back-movement remaining time are used to analyze the feasibility of generating solutions from the neighborhood operators. Finally, an algorithm based on a genetic algorithm is put forward to solve the problem and the model; population initialization, crossover, mutation, individual evaluation, a collision-decrease operator, selection, and a collision-elimination operator are designed. The scheduling result and a simulation for a practical example with 5 satellites and 100 point targets in swinging mode are given, and the scheduling performance is analyzed for swinging angles of 0°, 5°, 10°, 15° and 25°. The results show that the model and the algorithm are more effective than those without swinging mode.
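
    The data shapes involved, targets, visibility time windows and swinging-angle limits, can be illustrated with a much simpler greedy earliest-finish assignment than the paper's genetic algorithm; the sketch below is only an illustration with invented windows and limits.

    ```python
    from dataclasses import dataclass

    # Much-simplified, invented illustration of assigning point-target observations
    # to visibility time windows on one satellite with a swinging-angle limit; the
    # paper's model is solved with a genetic algorithm, not this greedy rule.

    @dataclass
    class Opportunity:
        target: str
        start: float       # window start (s)
        end: float         # window end (s)
        swing_deg: float   # swinging angle required for the observation

    MAX_SWING = 15.0       # allowed swinging angle (deg)
    OBS_TIME = 30.0        # seconds needed inside the window per observation

    opportunities = [
        Opportunity("T1", 0, 120, 5), Opportunity("T2", 60, 200, 12),
        Opportunity("T3", 90, 150, 22), Opportunity("T4", 180, 400, 8),
        Opportunity("T2", 500, 650, 3), Opportunity("T5", 240, 380, 14),
    ]

    schedule, busy_until, done = [], 0.0, set()
    for opp in sorted(opportunities, key=lambda o: o.end):   # earliest finish first
        if opp.target in done or opp.swing_deg > MAX_SWING:
            continue
        start = max(opp.start, busy_until)
        if start + OBS_TIME <= opp.end:                      # fits inside the window
            schedule.append((opp.target, start, start + OBS_TIME, opp.swing_deg))
            busy_until = start + OBS_TIME
            done.add(opp.target)

    for target, s, e, swing in schedule:
        print(f"{target}: observe {s:.0f}-{e:.0f} s at swing {swing:.0f} deg")
    ```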

  18. Application of MEMS-based x-ray optics as tuneable nanosecond choppers

    NASA Astrophysics Data System (ADS)

    Chen, Pice; Walko, Donald A.; Jung, Il Woong; Li, Zhilong; Gao, Ya; Shenoy, Gopal K.; Lopez, Daniel; Wang, Jin

    2017-08-01

    Time-resolved synchrotron x-ray measurements often rely on using a mechanical chopper to isolate a set of x-ray pulses. We have started the development of micro-electromechanical systems (MEMS)-based x-ray optics as an alternative method of manipulating x-ray beams. For the application of x-ray pulse isolation, we recently achieved a pulse-picking time window of half a nanosecond, which is more than 100 times faster than mechanical choppers can achieve. The MEMS device consists of a comb-drive silicon micromirror designed to efficiently diffract an x-ray beam during oscillation. The MEMS devices were operated in Bragg geometry and their oscillation was synchronized to the x-ray pulses, with a frequency matching subharmonics of the cycling frequency of the x-ray pulses. The microscale structure of the silicon mirror, in terms of its curvature and crystalline quality, ensures a narrow angular spread of the Bragg reflection. After discussing the factors that determine the diffractive time window, this report describes our approach to narrowing the time window to half a nanosecond. The short diffractive time window will allow us to select a single x-ray pulse out of a train of pulses from synchrotron radiation facilities.
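
    An order-of-magnitude estimate of the diffractive time window follows from dividing the angular acceptance of the Bragg reflection by the mirror's angular velocity as it sweeps through the Bragg condition; the numbers below are illustrative assumptions, not the device parameters reported here. Narrower windows follow from faster or larger-amplitude oscillation, or from a narrower acceptance.

    ```python
    import math

    # Order-of-magnitude estimate of the diffractive time window of an oscillating
    # Bragg micromirror: roughly the angular acceptance of the reflection divided
    # by the angular velocity at which the mirror sweeps through the Bragg angle.
    # All numbers are illustrative assumptions, not the device parameters.

    freq = 75e3                   # torsional oscillation frequency (Hz), assumed
    theta_0 = math.radians(1.0)   # oscillation amplitude (rad), assumed
    acceptance = 30e-6            # angular acceptance of the reflection (rad), assumed

    # theta(t) = theta_0 * sin(2*pi*f*t); the sweep is fastest at the zero crossing.
    max_angular_velocity = 2 * math.pi * freq * theta_0        # rad/s
    time_window = acceptance / max_angular_velocity

    print(f"peak angular velocity: {max_angular_velocity:.0f} rad/s")
    print(f"diffractive time window ~ {time_window * 1e9:.1f} ns")
    # Faster or larger-amplitude oscillation, or a narrower acceptance, shortens the window.
    ```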

  19. Windows of sensitivity to toxic chemicals in the motor effects development.

    PubMed

    Ingber, Susan Z; Pohl, Hana R

    2016-02-01

    Many chemicals currently used are known to elicit nervous system effects. In addition, approximately 2000 new chemicals introduced annually have not yet undergone neurotoxicity testing. This review concentrated on motor development effects associated with exposure to environmental neurotoxicants to help identify critical windows of exposure and begin to assess data needs based on a subset of chemicals thoroughly reviewed by the Agency for Toxic Substances and Disease Registry (ATSDR) in Toxicological Profiles and Addenda. Multiple windows of sensitivity were identified that differed based on the maturity level of the neurological system at the time of exposure, as well as dose and exposure duration. Similar but distinct windows were found for both motor activity (GD 8-17 [rats], GD 12-14 and PND 3-10 [mice]) and motor function performance (insufficient data for rats, GD 12-17 [mice]). Identifying specific windows of sensitivity in animal studies was hampered by study designs oriented towards detection of neurotoxicity that occurred at any time throughout the developmental process. In conclusion, while this investigation identified some critical exposure windows for motor development effects, it demonstrates a need for more acute duration exposure studies based on neurodevelopmental windows, particularly during the exposure periods identified in this review. Published by Elsevier Inc.

  20. Windows of sensitivity to toxic chemicals in the motor effects development✩

    PubMed Central

    Ingber, Susan Z.; Pohl, Hana R.

    2017-01-01

    Many chemicals currently used are known to elicit nervous system effects. In addition, approximately 2000 new chemicals introduced annually have not yet undergone neurotoxicity testing. This review concentrated on motor development effects associated with exposure to environmental neurotoxicants to help identify critical windows of exposure and begin to assess data needs based on a subset of chemicals thoroughly reviewed by the Agency for Toxic Substances and Disease Registry (ATSDR) in Toxicological Profiles and Addenda. Multiple windows of sensitivity were identified that differed based on the maturity level of the neurological system at the time of exposure, as well as dose and exposure duration. Similar but distinct windows were found for both motor activity (GD 8–17 [rats], GD 12–14 and PND 3–10 [mice]) and motor function performance (insufficient data for rats, GD 12–17 [mice]). Identifying specific windows of sensitivity in animal studies was hampered by study designs oriented towards detection of neurotoxicity that occurred at any time throughout the developmental process. In conclusion, while this investigation identified some critical exposure windows for motor development effects, it demonstrates a need for more acute duration exposure studies based on neurodevelopmental windows, particularly during the exposure periods identified in this review. PMID:26686904
