Strategies for Fermentation Medium Optimization: An In-Depth Review
Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.
2017-01-01
Optimization of the production medium is required to maximize metabolite yield. This can be achieved by using a wide range of techniques, from the classical “one-factor-at-a-time” approach to modern statistical and mathematical techniques such as artificial neural networks (ANN) and genetic algorithms (GA). Every technique comes with its own advantages and disadvantages, and despite their drawbacks some techniques are applied to obtain the best results. Using various optimization techniques in combination can also provide the desired results. In this article an attempt has been made to review the media optimization techniques currently applied during the fermentation process of metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been carried out, and a logical basis for selecting the design of the fermentation medium is given. Overall, this review provides the rationale for selecting a suitable optimization technique for media design during the fermentation process of metabolite production. PMID:28111566
van der Ploeg, Tjeerd; Austin, Peter C; Steyerberg, Ewout W
2014-12-22
Modern modelling techniques may potentially provide more accurate predictions of binary outcomes than classical techniques. We aimed to study the predictive performance of different modelling techniques in relation to the effective sample size ("data hungriness"). We performed simulation studies based on three clinical cohorts: 1282 patients with head and neck cancer (with 46.9% 5-year survival), 1731 patients with traumatic brain injury (22.3% 6-month mortality) and 3181 patients with minor head injury (7.6% with CT scan abnormalities). We compared three relatively modern modelling techniques: support vector machines (SVM), neural nets (NN), and random forests (RF), and two classical techniques: logistic regression (LR) and classification and regression trees (CART). We created three large artificial databases with 20-fold, 10-fold and 6-fold replication of subjects, where we generated dichotomous outcomes according to different underlying models. We applied each modelling technique to increasingly larger development parts (100 repetitions). The area under the ROC curve (AUC) indicated the performance of each model in the development part and in an independent validation part. Data hungriness was defined by plateauing of the AUC and small optimism (difference between the mean apparent AUC and the mean validated AUC <0.01). We found that a stable AUC was reached by LR at approximately 20 to 50 events per variable, followed by CART, SVM, NN and RF models. Optimism decreased with increasing sample sizes and the same ranking of techniques. The RF, SVM and NN models showed instability and a high optimism even with >200 events per variable. Modern modelling techniques such as SVM, NN and RF may need over 10 times as many events per variable to achieve a stable AUC and a small optimism as classical modelling techniques such as LR. This implies that such modern techniques should only be used in medical prediction problems if very large data sets are available.
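The "data hungriness" comparison above hinges on the optimism statistic (apparent AUC minus validated AUC) as a function of events per variable. The following is a minimal, hypothetical sketch of that calculation using scikit-learn's logistic regression on a synthetic dataset; it is not the authors' simulation code, and the data-generating model is invented purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_vars = 10

def make_data(n):
    # Synthetic dichotomous outcome from a logistic model (illustrative only).
    X = rng.normal(size=(n, n_vars))
    beta = np.linspace(0.2, 1.0, n_vars)
    p = 1.0 / (1.0 + np.exp(-(X @ beta - 1.0)))
    y = rng.binomial(1, p)
    return X, y

X_val, y_val = make_data(20000)  # large independent validation part

for n_dev in (100, 500, 2000, 10000):
    X_dev, y_dev = make_data(n_dev)
    model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
    auc_apparent = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
    auc_validated = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    epv = y_dev.sum() / n_vars  # events per variable in the development part
    print(f"n={n_dev:6d}  EPV={epv:6.1f}  optimism={auc_apparent - auc_validated:.3f}")
```

Repeating the loop over different modelling techniques and many random repetitions would reproduce the kind of optimism-versus-sample-size curves the study describes.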
Method of optimization onboard communication network
NASA Astrophysics Data System (ADS)
Platoshin, G. A.; Selvesuk, N. I.; Semenov, M. E.; Novikov, V. M.
2018-02-01
In this article, optimization levels for an onboard communication network (OCN) are proposed. We defined the basic parameters necessary for the evaluation and comparison of modern OCNs, and we also identified a set of initial data for possible modeling of the OCN. We also proposed a mathematical technique for implementing the OCN optimization procedure. This technique is based on the principles and ideas of binary programming. It is shown that the binary programming technique makes it possible to obtain an inherently optimal solution for avionics tasks. An example of applying the proposed approach to the problem of device assignment in an OCN is considered.
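The device-assignment example mentioned above reduces to a small 0-1 assignment problem. The sketch below is an illustrative, hypothetical formulation: the cost matrix and the exhaustive search over assignments are invented for clarity and are not the authors' binary-programming procedure; realistic problem sizes would call for a proper integer-programming solver.

```python
from itertools import permutations

# Hypothetical cost of attaching device i to switch j (e.g., cable length or latency).
cost = [
    [4, 2, 8],
    [4, 3, 7],
    [3, 1, 6],
]

best_assignment, best_cost = None, float("inf")
# Each permutation corresponds to a 0-1 assignment matrix with one device per switch.
for perm in permutations(range(3)):
    total = sum(cost[dev][sw] for dev, sw in enumerate(perm))
    if total < best_cost:
        best_assignment, best_cost = perm, total

print("device -> switch:", dict(enumerate(best_assignment)), "total cost:", best_cost)
```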
Mathematical Optimization Techniques
NASA Technical Reports Server (NTRS)
Bellman, R. (Editor)
1963-01-01
The papers collected in this volume were presented at the Symposium on Mathematical Optimization Techniques held in the Santa Monica Civic Auditorium, Santa Monica, California, on October 18-20, 1960. The objective of the symposium was to bring together, for the purpose of mutual education, mathematicians, scientists, and engineers interested in modern optimization techniques. Some 250 persons attended. The techniques discussed included recent developments in linear, integer, convex, and dynamic programming as well as the variational processes surrounding optimal guidance, flight trajectories, statistical decisions, structural configurations, and adaptive control systems. The symposium was sponsored jointly by the University of California, with assistance from the National Science Foundation, the Office of Naval Research, the National Aeronautics and Space Administration, and The RAND Corporation, through Air Force Project RAND.
Modern control techniques in active flutter suppression using a control moment gyro
NASA Technical Reports Server (NTRS)
Buchek, P. M.
1974-01-01
The development of organized synthesis techniques using concepts of modern control theory was studied for the design of active flutter suppression systems for two- and three-dimensional lifting surfaces, utilizing a control moment gyro (CMG) to generate the required control torques. Incompressible flow theory is assumed, with the unsteady aerodynamic forces and moments for arbitrary airfoil motion obtained by using the convolution integral based on Wagner's indicial lift function. Linear optimal control theory is applied to find particular optimal sets of gain values that minimize a quadratic performance function. The closed-loop system's response to impulsive gust disturbances and the resulting control power requirements are investigated, and the system eigenvalues necessary to minimize the maximum value of control power are determined.
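As a point of reference for the quadratic-performance-function design mentioned above, the following sketch computes linear-quadratic-regulator gains for a toy two-state plant by solving the continuous algebraic Riccati equation with SciPy. The system matrices and weights are invented for illustration and are not the report's aeroelastic CMG model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy two-state plant (illustrative only, not the aeroelastic model).
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting in the quadratic cost
R = np.array([[0.1]])      # control-effort weighting

P = solve_continuous_are(A, B, Q, R)   # Riccati solution
K = np.linalg.solve(R, B.T @ P)        # optimal feedback gain, u = -K x

closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print("gain K:", K)
print("closed-loop eigenvalues:", closed_loop_eigs)
```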
Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement
NASA Astrophysics Data System (ADS)
Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.
In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement based on hybridized smoothing filters, and we introduce a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.
Scaling Support Vector Machines On Modern HPC Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
You, Yang; Fu, Haohuan; Song, Shuaiwen
2015-02-01
We designed and implemented MIC-SVM, a highly efficient parallel SVM for x86 based multicore and many-core architectures, such as the Intel Ivy Bridge CPUs and Intel Xeon Phi co-processor (MIC). We propose various novel analysis methods and optimization techniques to fully utilize the multilevel parallelism provided by these architectures and serve as general optimization methods for other machine learning tools.
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
Experimental Optimization Methods for Multi-Element Airfoils
NASA Technical Reports Server (NTRS)
Landman, Drew; Britcher, Colin P.
1996-01-01
A modern three element airfoil model with a remotely activated flap was used to investigate optimum flap testing position using an automated optimization algorithm in wind tunnel tests. Detailed results for lift coefficient versus flap vertical and horizontal position are presented for two angles of attack: 8 and 14 degrees. An on-line first order optimizer is demonstrated which automatically seeks the optimum lift as a function of flap position. Future work with off-line optimization techniques is introduced and aerodynamic hysteresis effects due to flap movement with flow on are discussed.
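The on-line first-order optimizer described above climbs the measured lift surface as a function of flap position. A minimal sketch of that idea, using finite-difference gradient ascent on an invented lift function standing in for wind tunnel measurements (not the authors' algorithm or data), is shown below.

```python
import numpy as np

def lift(x, y):
    # Invented smooth stand-in for measured lift coefficient vs. flap (x, y) position.
    return 2.0 - (x - 0.3) ** 2 - 0.5 * (y + 0.1) ** 2

pos = np.array([0.0, 0.0])   # initial flap (horizontal, vertical) position
step, h = 0.2, 1e-3          # ascent step size and finite-difference spacing

for it in range(50):
    grad = np.array([
        (lift(pos[0] + h, pos[1]) - lift(pos[0] - h, pos[1])) / (2 * h),
        (lift(pos[0], pos[1] + h) - lift(pos[0], pos[1] - h)) / (2 * h),
    ])
    if np.linalg.norm(grad) < 1e-4:
        break
    pos = pos + step * grad   # move the flap toward higher lift

print("optimum flap position:", pos, "lift:", lift(*pos))
```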
Cache Energy Optimization Techniques For Modern Processors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
2013-01-01
Modern multicore processors are employing large last-level caches, for example Intel's E7-8800 processor uses 24MB L3 cache. Further, with each CMOS technology generation, leakage energy has been dramatically increasing and hence, leakage energy is expected to become a major source of energy dissipation, especially in last-level caches (LLCs). The conventional schemes of cache energy saving either aim at saving dynamic energy or are based on properties specific to first-level caches, and thus these schemes have limited utility for last-level caches. Further, several other techniques require offline profiling or per-application tuning and hence are not suitable for product systems. In this book, we present novel cache leakage energy saving schemes for single-core and multicore systems; desktop, QoS, real-time and server systems. Also, we present cache energy saving techniques for caches designed with both conventional SRAM devices and emerging non-volatile devices such as STT-RAM (spin-torque transfer RAM). We present software-controlled, hardware-assisted techniques which use dynamic cache reconfiguration to configure the cache to the most energy efficient configuration while keeping the performance loss bounded. To profile and test a large number of potential configurations, we utilize low-overhead, micro-architecture components, which can be easily integrated into modern processor chips. We adopt a system-wide approach to save energy to ensure that cache reconfiguration does not increase energy consumption of other components of the processor. We have compared our techniques with state-of-the-art techniques and have found that our techniques outperform them in terms of energy efficiency and other relevant metrics. The techniques presented in this book have important applications in improving energy-efficiency of higher-end embedded, desktop, QoS, real-time, server processors and multitasking systems. This book is intended to be a valuable guide for both newcomers and veterans in the field of cache power management. It will help graduate students, CAD tool developers and designers in understanding the need of energy efficiency in modern computing systems. Further, it will be useful for researchers in gaining insights into algorithms and techniques for micro-architectural and system-level energy optimization using dynamic cache reconfiguration. We sincerely believe that the "food for thought" presented in this book will inspire the readers to develop even better ideas for designing "green" processors of tomorrow.
Trajectory Optimization for Missions to Small Bodies with a Focus on Scientific Merit.
Englander, Jacob A; Vavrina, Matthew A; Lim, Lucy F; McFadden, Lucy A; Rhoden, Alyssa R; Noll, Keith S
2017-01-01
Trajectory design for missions to small bodies is tightly coupled both with the selection of targets for a mission and with the choice of spacecraft power, propulsion, and other hardware. Traditional methods of trajectory optimization have focused on finding the optimal trajectory for an a priori selection of destinations and spacecraft parameters. Recent research has expanded the field of trajectory optimization to multidisciplinary systems optimization that includes spacecraft parameters. The logical next step is to extend the optimization process to include target selection based not only on engineering figures of merit but also scientific value. This paper presents a new technique to solve the multidisciplinary mission optimization problem for small-bodies missions, including classical trajectory design, the choice of spacecraft power and propulsion systems, and also the scientific value of the targets. This technique, when combined with modern parallel computers, enables a holistic view of the small body mission design process that previously required iteration among several different design processes.
Metaheuristic Algorithms for Convolution Neural Network
Rere, L M Rasdi; Fanany, Mohamad Ivan; Arymurthy, Aniati Murni
2016-01-01
A typical modern optimization technique is usually either heuristic or metaheuristic. This technique has managed to solve some optimization problems in the research area of science, engineering, and industry. However, implementation strategy of metaheuristic for accuracy improvement on convolution neural networks (CNN), a famous deep learning method, is still rarely investigated. Deep learning relates to a type of machine learning technique, where its aim is to move closer to the goal of artificial intelligence of creating a machine that could successfully perform any intellectual tasks that can be carried out by a human. In this paper, we propose the implementation strategy of three popular metaheuristic approaches, that is, simulated annealing, differential evolution, and harmony search, to optimize CNN. The performances of these metaheuristic methods in optimizing CNN on classifying MNIST and CIFAR dataset were evaluated and compared. Furthermore, the proposed methods are also compared with the original CNN. Although the proposed methods show an increase in the computation time, their accuracy has also been improved (up to 7.14 percent). PMID:27375738
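To make the simulated-annealing option above concrete, here is a minimal, hypothetical sketch of annealing over two CNN hyperparameters. The objective is an invented stand-in for validation error (actually training and evaluating a CNN on MNIST or CIFAR is omitted), so this illustrates only the acceptance rule and cooling schedule, not the paper's implementation.

```python
import math
import random

random.seed(1)

def validation_error(lr, dropout):
    # Invented stand-in for CNN validation error; real use would train and evaluate a CNN.
    return (math.log10(lr) + 2.5) ** 2 + 4 * (dropout - 0.4) ** 2 + random.gauss(0, 0.01)

state = {"lr": 1e-1, "dropout": 0.1}
energy = validation_error(**state)
temperature = 1.0

for step in range(500):
    candidate = {
        "lr": min(1e-1, max(1e-5, state["lr"] * 10 ** random.uniform(-0.3, 0.3))),
        "dropout": min(0.9, max(0.0, state["dropout"] + random.uniform(-0.05, 0.05))),
    }
    cand_energy = validation_error(**candidate)
    # Metropolis acceptance: always accept improvements, sometimes accept worse moves.
    if cand_energy < energy or random.random() < math.exp((energy - cand_energy) / temperature):
        state, energy = candidate, cand_energy
    temperature *= 0.99  # geometric cooling schedule

print("best-found hyperparameters:", state, "error:", round(energy, 4))
```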
NASA Astrophysics Data System (ADS)
Jafari, S.; Hojjati, M. H.
2011-12-01
Rotating disks work mostly at high angular velocity, which results in a large centrifugal force and consequently induces large stresses and deformations. Minimizing the weight of such disks yields benefits such as low dead weight and lower costs. This paper aims at finding an optimal disk thickness profile for minimum weight design using simulated annealing (SA) and particle swarm optimization (PSO) as two modern optimization techniques. In the semi-analytical approach, the radial domain of the disk is divided into several virtual sub-domains (rings), and the weight of each ring must be minimized. The inequality constraint equation used in the optimization ensures that the maximum von Mises stress is always less than the yield strength of the disk material, so that the rotating disk does not fail. The results show that the minimum weight obtained by the two methods is almost identical. The PSO method gives a profile with slightly less weight (6.9% less than SA), while both PSO and SA are easy to implement and provide more flexibility compared with classical methods.
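A minimal sketch of the constrained PSO idea follows: the "weight" and "stress" functions are invented surrogates for the paper's disk model, and the yield constraint is handled with a simple penalty term. It illustrates only the generic velocity/position update, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

def penalized_weight(x):
    # Invented surrogate: "weight" of a two-ring thickness profile plus a penalty
    # whenever a stand-in "stress" constraint (stress <= allowable) is violated.
    weight = 3.0 * x[0] + 2.0 * x[1]
    stress = 5.0 / (x[0] + 0.1) + 3.0 / (x[1] + 0.1)
    return weight + 1e3 * max(0.0, stress - 40.0)

n_particles, n_dims = 20, 2
pos = rng.uniform(0.05, 1.0, size=(n_particles, n_dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([penalized_weight(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for it in range(200):
    r1, r2 = rng.random((2, n_particles, n_dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.05, 1.0)
    vals = np.array([penalized_weight(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best thickness profile:", gbest, "objective:", pbest_val.min())
```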
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spong, D.A.
The design techniques and physics analysis of modern stellarator configurations for magnetic fusion research rely heavily on high performance computing and simulation. Stellarators, which are fundamentally 3-dimensional in nature, offer significantly more design flexibility than more symmetric devices such as the tokamak. By varying the outer boundary shape of the plasma, a variety of physics features, such as transport, stability, and heating efficiency can be optimized. Scientific visualization techniques are an important adjunct to this effort as they provide a necessary ergonomic link between the numerical results and the intuition of the human researcher. The authors have developed a variety of visualization techniques for stellarators which both facilitate the design optimization process and allow the physics simulations to be more readily understood.
Efficient Parameter Searches for Colloidal Materials Design with Digital Alchemy
NASA Astrophysics Data System (ADS)
Dodd, Paul M.; Geng, Yina; van Anders, Greg; Glotzer, Sharon C.
Optimal colloidal materials design is challenging, even for high-throughput or genomic approaches, because the design space provided by modern colloid synthesis techniques can easily have dozens of dimensions. In this talk we present the methodology of an inverse approach we term "digital alchemy" to perform rapid searches of design-parameter spaces with up to 188 dimensions that yield thermodynamically optimal colloid parameters for target crystal structures with up to 20 particles in a unit cell. The method relies only on fundamental principles of statistical mechanics and Metropolis Monte Carlo techniques, and yields particle attribute tolerances via analogues of familiar stress-strain relationships.
Loop shaping design for tracking performance in machine axes.
Schinstock, Dale E; Wei, Zhouhong; Yang, Tao
2006-01-01
A modern interpretation of classical loop shaping control design methods is presented in the context of tracking control for linear motor stages. Target applications include noncontacting machines such as laser cutters and markers, water jet cutters, and adhesive applicators. The methods are directly applicable to the common PID controller and are pertinent to many electromechanical servo actuators other than linear motors. In addition to explicit design techniques a PID tuning algorithm stressing the importance of tracking is described. While the theory behind these techniques is not new, the analysis of their application to modern systems is unique in the research literature. The techniques and results should be important to control practitioners optimizing PID controller designs for tracking and in comparing results from classical designs to modern techniques. The methods stress high-gain controller design and interpret what this means for PID. Nothing in the methods presented precludes the addition of feedforward control methods for added improvements in tracking. Laboratory results from a linear motor stage demonstrate that with large open-loop gain very good tracking performance can be achieved. The resultant tracking errors compare very favorably to results from similar motions on similar systems that utilize much more complicated controllers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.
2016-02-01
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show the detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics society. The characteristics of the sequence analysis, such as data and compute-intensive natures, make it very attractive to optimize and parallelize by using both traditional software approach and innovated hardware acceleration technologies. PMID:25937944
Surface texture measurement for dental wear applications
NASA Astrophysics Data System (ADS)
Austin, R. S.; Mullen, F.; Bartlett, D. W.
2015-06-01
The application of surface topography measurement and characterization within dental materials science is highly active and rapidly developing, in line with many modern industries. Surface measurement and structuring is used extensively within oral and dental science to optimize the optical, tribological and biological performance of natural and biomimetic dental materials. Although there has historically been little standardization in the use and reporting of surface metrology instrumentation and software, the dental industry is beginning to adopt modern areal measurement and characterization techniques, especially as the dental industry is increasingly adopting digital impressioning techniques in order to leverage CAD/CAM technologies for the design and construction of dental restorations. As dental treatment becomes increasingly digitized and reliant on advanced technologies such as dental implants, wider adoption of standardized surface topography and characterization techniques will become ever more essential. The dental research community welcomes the advances that are being made in surface topography measurement science towards realizing this ultimate goal.
Cornut, P-L; Soldermann, Y; Robin, C; Barranco, R; Kerhoas, A; Burillon, C
2013-12-01
To report the financial impact of using modern lens and vitreoretinal surgical techniques. Bottom-up sterilization and consumables costs for new surgical techniques (microincisional coaxial phacoemulsification and transconjunctival sutureless vitrectomy) and the corresponding former techniques (phacoemulsification with 3.2-mm incision and 20G vitrectomy) were determined. These costs were compared to each other and to the target costs of the Diagnosis Related Groups for public hospitals (Groupes Homogènes de Séjours [GHS]) concerned, extracted from the analytic accounting data of the French National Cost Study (Étude Nationale des Coûts [ENC]) for 2009 (target=sum of sterilization costs posted under medical logistics, consumables, implantable medical devices, and special pharmaceuticals posted as direct expenses). For outpatient lens surgery with or without vitrectomy (GHS code: 02C05J): the ENC's target cost for 2009 was 339€ out of a total of 1432€. The cost detailed in this study was 4 % higher than the target cost when the procedure was performed using the former technique (3.2mm sutured incision) and 12 % lower when the procedure was performed using the new technique (1.8mm sutureless) after removing now unnecessary consumables and optimization of the technique. For level I retinal detachment surgeries (GHS code: 02C021): the ENC's 2009 target cost was 641€ out of a total of 3091€. The cost specified in this study was 1 % lower than the target cost when the procedure was done using the former technique (20-G vitrectomy) and 16 % less when the procedure was performed using the new technique (transconjunctival vitrectomy) after removal of now unnecessary consumables and optimization of the technique. Contrary to generally accepted ideas, implementing modern techniques in ocular surgery can result in direct cost and sterilization savings when the operator takes advantage of the possibilities these techniques offer in terms of simplification of the procedures to do away with consumables that are no longer necessary. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
User’s guide to SNAP for ArcGIS®: ArcGIS interface for scheduling and network analysis program
Woodam Chung; Dennis Dykstra; Fred Bower; Stephen O’Brien; Richard Abt; John Sessions
2012-01-01
This document introduces computer software named SNAP for ArcGIS®, which has been developed to streamline scheduling and transportation planning for timber harvest areas. Using modern optimization techniques, it can be used to spatially schedule timber harvest with consideration of harvesting costs, multiple products, alternative...
ERIC Educational Resources Information Center
Kanagarajan, Sujith; Ramakrishnan, Sivakumar
2018-01-01
Over the past few years, the Ubiquitous Learning Environment (ULE) has become a mobile- and sensor-based, technology-equipped environment that suits the requirements of the modern education discipline. Ambient Intelligence (AmI) makes the ULE much smarter through the support of optimization and intelligent techniques. Various efforts have been so far…
Application of evolutionary computation in ECAD problems
NASA Astrophysics Data System (ADS)
Lee, Dae-Hyun; Hwang, Seung H.
1998-10-01
Design of modern electronic systems is a complicated task which demands the use of computer-aided design (CAD) tools. Since many problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve those problems. We have applied evolutionary computation techniques to solve ECAD problems such as technology mapping, microcode-bit optimization, data path ordering and peak power estimation, where their benefits are well observed. This paper presents experiences and discusses issues in those applications.
The use of modern measurement techniques for designing pro ecological constructions
NASA Astrophysics Data System (ADS)
Wieczorowski, Michał; Gapiński, Bartosz; Szymański, Maciej; Rękas, Artur
2017-10-01
In this paper, some possibilities of applying modern length and angle metrology techniques to the design of constructions that support ecology are presented. The paper is based on a project in which a lighter bus and train car seat was developed. Different options are presented, including static and dynamic photogrammetry, computed tomography and thermography. Research related to the dynamic behaviour of the designed structures provided input for determining the deformation of a seat and of the passengers sitting on it during traffic accidents. Work connected to the strength of construction elements made it possible to optimize their dimensions while maintaining proper durability. Metrological actions taken in relation to production machines and equipment made it possible to better recognize the phenomena that take place during the manufacturing process and to correct its parameters, which in turn also contributed to slimming down the construction.
Comparison of Acceleration Techniques for Selected Low-Level Bioinformatics Operations
Langenkämper, Daniel; Jakobi, Tobias; Feld, Dustin; Jelonek, Lukas; Goesmann, Alexander; Nattkemper, Tim W.
2016-01-01
In recent years, clock rates of modern processors have stagnated while the demand for computing power continued to grow. This applies particularly to the fields of life sciences and bioinformatics, where new technologies keep on creating rapidly growing piles of raw data with increasing speed. The number of cores per processor increased in an attempt to compensate for slight increments of clock rates. This technological shift demands changes in software development, especially in the field of high performance computing, where parallelization techniques are gaining in importance due to the pressing issue of large sized datasets generated by, e.g., modern genomics. This paper presents an overview of state-of-the-art manual and automatic acceleration techniques and lists some applications employing these in different areas of sequence informatics. Furthermore, we provide examples for automatic acceleration of two use cases to show typical problems and gains of transforming a serial application to a parallel one. The paper should aid the reader in deciding on a suitable technique for the problem at hand. We compare four different state-of-the-art automatic acceleration approaches (OpenMP, PluTo-SICA, PPCG, and OpenACC). Their performance as well as their applicability for selected use cases is discussed. While optimizations targeting the CPU worked better in the complex k-mer use case, optimizers for Graphics Processing Units (GPUs) performed better in the matrix multiplication example. But performance is only superior at a certain problem size due to data migration overhead. We show that automatic code parallelization is feasible with current compiler software and yields significant increases in execution speed. Automatic optimizers for CPU are mature and usually no additional manual adjustment is required. In contrast, some automatic parallelizers targeting GPUs still lack maturity and are limited to simple statements and structures. PMID:26904094
Optimization of the production process using virtual model of a workspace
NASA Astrophysics Data System (ADS)
Monica, Z.
2015-11-01
Optimization of the production process is an element of the design cycle consisting of problem definition, modelling, simulation, optimization and implementation. Without the use of simulation techniques, the only thing that can be achieved is a larger or smaller improvement of the process, not its optimization (i.e., the best result that can be obtained for the conditions under which the process works). Optimization generally comprises management actions that ultimately bring savings in time, resources and raw materials and improve the performance of a specific process. It does not matter whether it is a service or manufacturing process. Optimization generates savings by improving and increasing the efficiency of processes. It consists primarily of organizational activities that require very little investment or rely solely on changing the organization of work. Modern companies operating in a market economy show a significant increase in interest in modern methods of production management and services. This trend is due to the high competitiveness among companies, which, in order to succeed, are forced to continually modify the way they manage and to respond flexibly to changing demand. Modern methods of production management not only imply a stable position of the company in its sector, but also influence the improvement of health and safety within the company and contribute to the implementation of more efficient rules for the standardization of work. This is why the paper presents the application of an environment such as Siemens NX to create a virtual model of a production system and to simulate as well as optimize its work. The analyzed system is a robotized workcell consisting of machine tools, industrial robots, conveyors, auxiliary equipment and buffers. The control program realizing the main task in the virtual workcell can be defined within the environment. Using this tool, it is possible to optimize both the object trajectory and the cooperation process.
A parallel Jacobson-Oksman optimization algorithm. [parallel processing (computers)
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Markos, A. T.
1975-01-01
A gradient-dependent optimization technique which exploits the vector-streaming or parallel-computing capabilities of some modern computers is presented. The algorithm, derived by assuming that the function to be minimized is homogeneous, is a modification of the Jacobson-Oksman serial minimization method. In addition to describing the algorithm, conditions insuring the convergence of the iterates of the algorithm and the results of numerical experiments on a group of sample test functions are presented. The results of these experiments indicate that this algorithm will solve optimization problems in less computing time than conventional serial methods on machines having vector-streaming or parallel-computing capabilities.
Utilization-Based Modeling and Optimization for Cognitive Radio Networks
NASA Astrophysics Data System (ADS)
Liu, Yanbing; Huang, Jun; Liu, Zhangxiong
The cognitive radio technique promises to manage and allocate the scarce radio spectrum in the highly varying and disparate modern environments. This paper considers a cognitive radio scenario composed of two queues for the primary (licensed) users and cognitive (unlicensed) users. According to the Markov process, the system state equations are derived and an optimization model for the system is proposed. Next, the system performance is evaluated by calculations which show the rationality of our system model. Furthermore, discussions among different parameters for the system are presented based on the experimental results.
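As an illustration of the kind of Markov model described above, the sketch below builds a small continuous-time Markov chain for one primary-user channel shared with a capacity-limited cognitive-user queue and solves for its stationary distribution. The rates, state space, and preemption rule are all invented for this example and are not the paper's system equations.

```python
import numpy as np

# Invented rates (per unit time) for a toy two-class channel-sharing model.
lam_p, mu_p = 0.4, 1.0   # primary-user arrival / service rates
lam_s, mu_s = 0.6, 0.8   # cognitive-user arrival / service rates
max_s = 2                # cognitive-user queue capacity

states = [(p, s) for p in range(2) for s in range(max_s + 1)]
index = {st: i for i, st in enumerate(states)}
Q = np.zeros((len(states), len(states)))

for (p, s), i in index.items():
    if p == 0:
        Q[i, index[(1, s)]] += lam_p      # primary arrival grabs the channel
    else:
        Q[i, index[(0, s)]] += mu_p       # primary departure
    if s < max_s:
        Q[i, index[(p, s + 1)]] += lam_s  # cognitive user joins the queue
    if s > 0 and p == 0:
        Q[i, index[(p, s - 1)]] += mu_s   # cognitive user served only on a free channel
np.fill_diagonal(Q, -Q.sum(axis=1))

# Stationary distribution: solve pi @ Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(len(states))])
b = np.zeros(len(states) + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print({st: round(pr, 3) for st, pr in zip(states, pi)})
```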
NASA Technical Reports Server (NTRS)
Riedel, S. A.
1979-01-01
A method by which modern and classical control theory techniques may be integrated in a synergistic fashion and used in the design of practical flight control systems is presented. A general procedure is developed, and several illustrative examples are included. Emphasis is placed not only on the synthesis of the design, but on the assessment of the results as well. The first step is to establish the differences, distinguishing characteristics and connections between the modern and classical control theory approaches. Ultimately, this uncovers a relationship between bandwidth goals familiar in classical control and cost function weights in the equivalent optimal system. In order to obtain a practical optimal solution, it is also necessary to formulate the problem very carefully, and each choice of state, measurement and output variable must be judiciously considered. Once design goals are established and problem formulation completed, the control system is synthesized in a straightforward manner. Three steps are involved: filter-observer solution, regulator solution, and the combination of those two into the controller. Assessment of the controller permits an examination and expansion of the synthesis results.
NASA Technical Reports Server (NTRS)
Hanks, Brantley R.; Skelton, Robert E.
1991-01-01
Vibration in modern structural and mechanical systems can be reduced in amplitude by increasing stiffness, redistributing stiffness and mass, and/or adding damping if design techniques are available to do so. Linear Quadratic Regulator (LQR) theory in modern multivariable control design attacks the general dissipative elastic system design problem in a global formulation. The optimal design, however, allows electronic connections and phase relations which are not physically practical or possible in passive structural-mechanical devices. The restriction of LQR solutions (to the Algebraic Riccati Equation) to design spaces which can be implemented as passive structural members and/or dampers is addressed. A general closed-form solution to the optimal free-decay control problem is presented which is tailored for structural-mechanical systems. The solution includes, as subsets, special cases such as the Rayleigh Dissipation Function and total energy. Weighting matrix selection is a constrained choice among several parameters to obtain desired physical relationships. The closed-form solution is also applicable to active control design for systems where perfect, collocated actuator-sensor pairs exist.
Pedersen, C B
1987-06-01
The modern surgical treatment of otosclerosis consists of replacement of the sound conducting function of the stapes by a prosthesis. The results obtained in 100 consecutive patients using the small fenestra technique and a 0.4 mm. Teflon and steel wire prosthesis are reported. The surgical technique is described. The hearing was improved in all patients. In 92 per cent of the patients an optimal hearing gain was found after an observation time of 1 to 4 years. Five patients required re-operation during the observation time. The small fenestra technique and the Fisch prosthesis were considered optimal in respect to technical difficulty, hearing improvement and complication rate. There was no sensorineural hearing loss in this series of patients. The absence of serious complications makes it reasonable to operate on both ears in patients with bilateral hearing loss. The results are as good in elderly people as in younger people. Therefore the operation can be offered for patients in all age groups.
A pilot modeling technique for handling-qualities research
NASA Technical Reports Server (NTRS)
Hess, R. A.
1980-01-01
A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.
Properties of nucleon resonances by means of a genetic algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandez-Ramirez, C.; Moya de Guerra, E.; Instituto de Estructura de la Materia, CSIC, Serrano 123, E-28006 Madrid
2008-06-15
We present an optimization scheme that employs a genetic algorithm (GA) to determine the properties of low-lying nucleon excitations within a realistic photo-pion production model based upon an effective Lagrangian. We show that with this modern optimization technique it is possible to reliably assess the parameters of the resonances and the associated error bars as well as to identify weaknesses in the models. To illustrate the problems the optimization process may encounter, we provide results obtained for the nucleon resonances Δ(1230) and Δ(1700). The former can be easily isolated and thus has been studied in depth, while the latter is not as well known experimentally.
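To illustrate the genetic-algorithm fitting strategy in general terms, the sketch below fits the parameters of a simple Breit-Wigner line shape to synthetic data using truncation selection, blend crossover and Gaussian mutation. The model, data and GA settings are invented for illustration; the cited analysis fits an effective-Lagrangian photo-pion production model, not this toy curve.

```python
import numpy as np

rng = np.random.default_rng(3)

def breit_wigner(E, mass, width, amp):
    # Simple Breit-Wigner line shape, used here only as an illustrative model.
    return amp * width**2 / ((E - mass) ** 2 + width**2 / 4.0)

E = np.linspace(1.0, 2.0, 80)
true = (1.23, 0.12, 10.0)                       # invented "mass", "width", "amplitude"
data = breit_wigner(E, *true) + rng.normal(0, 0.2, E.size)

def fitness(ind):
    return -np.sum((breit_wigner(E, *ind) - data) ** 2)  # negative chi-square

lo = np.array([1.0, 0.01, 1.0])
hi = np.array([2.0, 0.5, 20.0])
pop = rng.uniform(lo, hi, size=(60, 3))

for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)][-30:]                      # truncation selection
    mates = parents[rng.integers(0, 30, size=(30, 2))]
    alpha = rng.random((30, 1))
    children = alpha * mates[:, 0] + (1 - alpha) * mates[:, 1]   # blend crossover
    children += rng.normal(0, 0.02, children.shape) * (hi - lo)  # Gaussian mutation
    children = np.clip(children, lo, hi)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("fitted (mass, width, amplitude):", np.round(best, 3))
```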
Concepts and algorithms for terminal-area traffic management
NASA Technical Reports Server (NTRS)
Erzberger, H.; Chapel, J. D.
1984-01-01
The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.
NASA Astrophysics Data System (ADS)
Gómez-Galán, J. A.; Sánchez-Rodríguez, T.; Sánchez-Raya, M.; Martel, I.; López-Martín, A.; Carvajal, R. G.; Ramírez-Angulo, J.
2014-06-01
This paper evaluates the design of front-end electronics in modern technologies to be used in a new generation of heavy ion detectors—HYDE (FAIR, Germany)—proposing novel architectures to achieve high gain in a low voltage environment. As conventional topologies of operational amplifiers in modern CMOS processes show limitations in terms of gain, novel approaches must be raised. The work addresses the design using transistors with channel length of no more than double the feature size and a supply voltage as low as 1.2 V. A front-end system has been fabricated in a 90 nm process including gain boosting techniques based on regulated cascode circuits. The analog channel has been optimized to match a detector capacitance of 5 pF and exhibits a good performance in terms of gain, speed, linearity and power consumption.
Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience.
Paninski, L; Cunningham, J P
2018-06-01
Modern large-scale multineuronal recording methodologies, including multielectrode arrays, calcium imaging, and optogenetic techniques, produce single-neuron resolution data of a magnitude and precision that were the realm of science fiction twenty years ago. The major bottlenecks in systems and circuit neuroscience no longer lie in simply collecting data from large neural populations, but rather in understanding this data: developing novel scientific questions, with corresponding analysis techniques and experimental designs to fully harness these new capabilities and meaningfully interrogate these questions. Advances in methods for signal processing, network analysis, dimensionality reduction, and optimal control, developed in lockstep with advances in experimental neurotechnology, promise major breakthroughs in multiple fundamental neuroscience problems. These trends are clear in a broad array of subfields of modern neuroscience; this review focuses on recent advances in methods for analyzing neural time-series data with single-neuronal precision. Copyright © 2018 Elsevier Ltd. All rights reserved.
Nanoscale surface characterization using laser interference microscopy
NASA Astrophysics Data System (ADS)
Ignatyev, Pavel S.; Skrynnik, Andrey A.; Melnik, Yury A.
2018-03-01
Nanoscale surface characterization is one of the most significant parts of modern materials development and application. Modern microscopes are expensive and complicated tools, and their use for industrial tasks is limited due to laborious sample preparation, measurement procedures, and low operation speed. The laser modulation interference microscopy (MIM) method for real-time quantitative and qualitative analysis of glass, metals, ceramics, and various coatings has a spatial resolution of 0.1 nm vertically and up to 100 nm laterally. It is proposed as an alternative to traditional scanning electron microscopy (SEM) and atomic force microscopy (AFM) methods. It is demonstrated that in the case of roughness metrology for super smooth (Ra >1 nm) surfaces, the application of laser interference microscopy techniques is preferable to conventional SEM and AFM. The comparison of a semiconductor test structure for lateral dimension measurements obtained with SEM, AFM and a white light interferometer also demonstrates the advantages of the MIM technique.
Application of Modern Fortran to Spacecraft Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Williams, Jacob; Falck, Robert D.; Beekman, Izaak B.
2018-01-01
In this paper, applications of the modern Fortran programming language to the field of spacecraft trajectory optimization and design are examined. Modern object-oriented Fortran has many advantages for scientific programming, although many legacy Fortran aerospace codes have not been upgraded to use the newer standards (or have been rewritten in other languages perceived to be more modern). NASA's Copernicus spacecraft trajectory optimization program, originally a combination of Fortran 77 and Fortran 95, has attempted to keep up with modern standards and makes significant use of the new language features. Various algorithms and methods are presented from trajectory tools such as Copernicus, as well as modern Fortran open source libraries and other projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Specht, Lena; Yahalom, Joachim; Illidge, Tim; Berthelsen, Anne Kiil; Constine, Louis S; Eich, Hans Theodor; Girinsky, Theodore; Hoppe, Richard T; Mauch, Peter; Mikhaeel, N George; Ng, Andrea
2014-07-15
Radiation therapy (RT) is the most effective single modality for local control of Hodgkin lymphoma (HL) and an important component of therapy for many patients. These guidelines have been developed to address the use of RT in HL in the modern era of combined modality treatment. The role of reduced volumes and doses is addressed, integrating modern imaging with 3-dimensional (3D) planning and advanced techniques of treatment delivery. The previously applied extended field (EF) and original involved field (IF) techniques, which treated larger volumes based on nodal stations, have now been replaced by the use of limited volumes, based solely on detectable nodal (and extranodal extension) involvement at presentation, using contrast-enhanced computed tomography, positron emission tomography/computed tomography, magnetic resonance imaging, or a combination of these techniques. The International Commission on Radiation Units and Measurements concepts of gross tumor volume, clinical target volume, internal target volume, and planning target volume are used for defining the targeted volumes. Newer treatment techniques, including intensity modulated radiation therapy, breath-hold, image guided radiation therapy, and 4-dimensional imaging, should be implemented when their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control. The highly conformal involved node radiation therapy (INRT), recently introduced for patients for whom optimal imaging is available, is explained. A new concept, involved site radiation therapy (ISRT), is introduced as the standard conformal therapy for the scenario, commonly encountered, wherein optimal imaging is not available. There is increasing evidence that RT doses used in the past are higher than necessary for disease control in this era of combined modality therapy. The use of INRT and of lower doses in early-stage HL is supported by available data. Although the use of ISRT has not yet been validated in a formal study, it is more conservative than INRT, accounting for suboptimal information and appropriately designed for safe local disease control. The goal of modern smaller field radiation therapy is to reduce both treatment volume and treatment dose while maintaining efficacy and minimizing acute and late sequelae. This review is a consensus of the International Lymphoma Radiation Oncology Group (ILROG) Steering Committee regarding the modern approach to RT in the treatment of HL, outlining a new concept of ISRT in which reduced treatment volumes are planned for the effective control of involved sites of HL. Nodal and extranodal non-Hodgkin lymphomas (NHL) are covered separately by ILROG guidelines. Copyright © 2014 Elsevier Inc. All rights reserved.
Practical optimal flight control system design for helicopter aircraft. Volume 1: Technical Report
NASA Technical Reports Server (NTRS)
Hofmann, L. G.; Riedel, S. A.; Mcruer, D.
1980-01-01
A method by which modern and classical theory techniques may be integrated in a synergistic fashion and used in the design of practical flight control systems is presented. A general procedure is developed, and several illustrative examples are included. Emphasis is placed not only on the synthesis of the design, but on the assessment of the results as well.
Optimization of process parameters in welding of dissimilar steels using robot TIG welding
NASA Astrophysics Data System (ADS)
Navaneeswar Reddy, G.; VenkataRamana, M.
2018-03-01
Robot TIG welding is a modern technique used for joining two work pieces with high precision. Design of Experiments is used to conduct experiments by varying weld parameters such as current, wire feed and travel speed. The welding parameters play an important role in joining the dissimilar stainless steels SS 304L and SS 430. In this work, the influence of welding parameters on robot TIG-welded specimens is investigated using Response Surface Methodology. The micro Vickers hardness of the weldments is measured, and the process parameters are optimized to maximize the hardness of the weldments.
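As an illustration of the kind of design-of-experiments analysis described above, the following sketch computes larger-is-better Taguchi signal-to-noise ratios for an L9 array and picks the preferred level of each factor. The factor assignment and hardness values are hypothetical placeholders, not data from the study, which uses Response Surface Methodology rather than this simplified main-effects analysis.

```python
# Sketch: larger-is-better Taguchi S/N analysis for an L9 experiment.
# Factor levels and hardness values are hypothetical, not from the study.
import numpy as np

# L9 orthogonal array: columns = current, wire feed, travel speed (coded levels 1-3)
L9 = np.array([
    [1, 1, 1], [1, 2, 2], [1, 3, 3],
    [2, 1, 2], [2, 2, 3], [2, 3, 1],
    [3, 1, 3], [3, 2, 1], [3, 3, 2],
])
hardness = np.array([182., 190., 176., 201., 195., 188., 210., 184., 199.])  # HV, hypothetical

# Larger-is-better S/N ratio for each run (single replicate)
sn = -10.0 * np.log10(1.0 / hardness**2)

# Mean S/N per factor level; the level with the highest mean is preferred
for factor, name in enumerate(["current", "wire feed", "travel speed"]):
    means = [sn[L9[:, factor] == lvl].mean() for lvl in (1, 2, 3)]
    best = int(np.argmax(means)) + 1
    print(f"{name}: level means {np.round(means, 2)} -> choose level {best}")
```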
Diagnosis and Diagnostic Imaging of Anal Canal Cancer.
Ciombor, Kristen K; Ernst, Randy D; Brown, Gina
2017-01-01
Anal canal cancer is an uncommon malignancy but one that is often curable with optimal therapy. Owing to its unique location, histology, risk factors, and usual presentation, a careful diagnostic approach is warranted. This approach includes an excellent history and physical examination, including digital rectal examination, laboratory data, and comprehensive imaging. Anal cancer staging and formulation of a treatment plan depends on accurate imaging data. Modern radiographic techniques have improved staging quality and accuracy, and a thorough knowledge of anal anatomy is paramount to the optimal multidisciplinary treatment of this disease. Copyright © 2016 Elsevier Inc. All rights reserved.
Study of aerodynamic surface control of space shuttle boost and reentry, volume 1
NASA Technical Reports Server (NTRS)
Chang, C. J.; Connor, C. L.; Gill, G. P.
1972-01-01
The optimization technique is described which was used in the study for applying modern optimal control technology to the design of shuttle booster engine reaction control systems and aerodynamic control systems. Complete formulations are presented for both the ascent and reentry portions of the study. These formulations include derivations of the 6D perturbation equations of motion and the process followed in the control and blending law selections. A total hybrid software concept applied to the study is described in detail. Conclusions and recommendations based on the results of the study are included.
Global Design Optimization for Aerodynamics and Rocket Propulsion Components
NASA Technical Reports Server (NTRS)
Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)
2000-01-01
Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
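To make the polynomial response-surface idea concrete, the sketch below fits a quadratic surrogate to sampled design points by least squares and then optimizes the surrogate with a bounded search. The two-variable test function and noise level are synthetic placeholders; the article's applications involve far more design variables and experimental/CFD data.

```python
# Sketch: quadratic response-surface surrogate fitted to sampled design points,
# then optimized with a bounded gradient-based search. Data are synthetic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))             # two design variables, 30 samples
y = 1.0 + X[:, 0] - 2.0*X[:, 1] + X[:, 0]*X[:, 1] + 3.0*X[:, 0]**2 + X[:, 1]**2
y += rng.normal(scale=0.05, size=y.shape)             # noisy "experiments"

def basis(x):
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2], axis=-1)

coeffs, *_ = np.linalg.lstsq(basis(X), y, rcond=None)  # least-squares fit of the surrogate

surrogate = lambda x: basis(np.asarray(x)) @ coeffs
res = minimize(surrogate, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("estimated optimum:", res.x, "predicted response:", res.fun)
```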
Alternatives for jet engine control
NASA Technical Reports Server (NTRS)
Leake, R. J.; Sain, M. K.
1978-01-01
General goals of the research were classified into two categories. The first category involves the use of modern multivariable frequency domain methods for control of engine models in the neighborhood of a quiescent point. The second category involves the use of nonlinear modelling and optimization techniques for control of engine models over a more extensive part of the flight envelope. In the frequency domain category, works were published in the areas of low-interaction design, polynomial design, and multiple setpoint studies. A number of these ideas progressed to the point at which they are starting to attract practical interest. In the nonlinear category, advances were made both in engine modelling and in the details associated with software for determination of time optimal controls. Nonlinear models for a two spool turbofan engine were expanded and refined; and a promising new approach to automatic model generation was placed under study. A two time scale scheme was developed to do two-dimensional dynamic programming, and an outward spiral sweep technique has greatly speeded convergence times in time optimal calculations.
Civil and mechanical engineering applications of sensitivity analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Komkov, V.
1985-07-01
In this largely tutorial presentation, the historical development of optimization theories has been outlined as applied to mechanical and civil engineering design, and the development of modern sensitivity techniques during the last 20 years has been traced. Some of the difficulties, and the progress made in overcoming them, have been outlined. Some of the recently developed theoretical methods have been stressed to indicate their importance to computer-aided design technology.
Finite Element Based Optimization of Material Parameters for Enhanced Ballistic Protection
NASA Astrophysics Data System (ADS)
Ramezani, Arash; Huber, Daniel; Rothe, Hendrik
2013-06-01
Terrorist attacks pose a major hazard to military installations, vehicles and other assets. The large quantities of firearms and projectiles that are available pose serious threats to military forces and even civilian facilities. An important task for international research and development is to avert danger to life and limb. This work evaluates the effect of modern armor with numerical simulations. It also provides a brief overview of ballistic tests in order to offer some basic knowledge of the subject, serving as a basis for the comparison of simulation results. The objective of this work is to develop and improve the modern armor used in the security sector. Numerical simulations should replace expensive ballistic tests and reveal vulnerabilities of items and structures. By progressively changing the material parameters, the armor is optimized. A sensitivity analysis yields information about the decisive variables, so that vulnerabilities are easily found and subsequently eliminated. To facilitate the simulation, advanced numerical techniques have been employed in the analyses.
NASA Astrophysics Data System (ADS)
Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta
2016-06-01
With the increased trend in automation of modern manufacturing industry, the human intervention in routine, repetitive and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce the human intervention in selection of optimal cutting tools and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of appropriate cutting tools and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on his knowledge base or an extensive search of a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in databooks and tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy is developed and implemented using MathWorks MATLAB Version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for selection of appropriate cutting tools and optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MAGEE,GLEN I.
Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
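One of the standard optimizations in this setting is replacing bit-by-bit Galois-field multiplication with log/antilog table lookups. The sketch below shows a table-driven GF(2^8) multiply and a systematic LFSR-style Reed-Solomon encoder; the field polynomial 0x11d and generator construction are common textbook choices and are not necessarily those used in the AURA project.

```python
# Sketch: table-driven GF(2^8) arithmetic plus a systematic Reed-Solomon encoder.
# The primitive polynomial 0x11D and this encoder structure are conventional
# choices, assumed here for illustration only.
GF_EXP = [0] * 512
GF_LOG = [0] * 256
x = 1
for i in range(255):
    GF_EXP[i] = x
    GF_LOG[x] = i
    x <<= 1
    if x & 0x100:              # reduce modulo x^8 + x^4 + x^3 + x^2 + 1
        x ^= 0x11D
for i in range(255, 512):      # duplicate table so gf_mul never needs a modulo
    GF_EXP[i] = GF_EXP[i - 255]

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return GF_EXP[GF_LOG[a] + GF_LOG[b]]   # two lookups and an add

def gf_poly_mul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for j, qj in enumerate(q):
        for i, pi in enumerate(p):
            r[i + j] ^= gf_mul(pi, qj)
    return r

def rs_generator_poly(nsym):
    g = [1]
    for i in range(nsym):
        g = gf_poly_mul(g, [1, GF_EXP[i]])  # multiply by (x + alpha^i)
    return g

def rs_encode(msg, nsym):
    """Systematic encoding: append nsym parity bytes via LFSR-style polynomial division."""
    gen = rs_generator_poly(nsym)
    parity = [0] * nsym
    for byte in msg:
        coef = byte ^ parity[0]
        parity = parity[1:] + [0]
        if coef:
            for j in range(nsym):
                parity[j] ^= gf_mul(gen[j + 1], coef)
    return list(msg) + parity

print(rs_encode(b"hello world", nsym=8))
```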
Optimal placement of FACTS devices using optimization techniques: A review
NASA Astrophysics Data System (ADS)
Gaur, Dipesh; Mathew, Lini
2018-03-01
Modern power systems face overloading problems, especially in transmission networks that operate near their maximum limits. Today's power system networks tend to become unstable and prone to collapse under disturbances. Flexible AC Transmission Systems (FACTS) provide solutions to problems such as line overloading, voltage stability, losses and power flow, and FACTS can play an important role in improving the static and dynamic performance of a power system. However, FACTS devices require a high initial investment; their location, type and rating are therefore vital and should be optimized so that placement in the network yields maximum benefit. In this paper, different optimization methods such as Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) are discussed and compared for determining the optimal location, type and rating of devices. FACTS devices such as the Thyristor Controlled Series Compensator (TCSC), Static Var Compensator (SVC) and Static Synchronous Compensator (STATCOM) are considered here. The effects of these FACTS controllers on different IEEE bus network parameters, such as generation cost, active power loss and voltage stability, have been analyzed and compared among the devices.
The theory of variational hybrid quantum-classical algorithms
NASA Astrophysics Data System (ADS)
McClean, Jarrod R.; Romero, Jonathan; Babbush, Ryan; Aspuru-Guzik, Alán
2016-02-01
Many quantum algorithms have daunting resource requirements when compared to what is available today. To address this discrepancy, a quantum-classical hybrid optimization scheme known as ‘the quantum variational eigensolver’ was developed (Peruzzo et al 2014 Nat. Commun. 5 4213) with the philosophy that even minimal quantum resources could be made useful when used in conjunction with classical routines. In this work we extend the general theory of this algorithm and suggest algorithmic improvements for practical implementations. Specifically, we develop a variational adiabatic ansatz and explore unitary coupled cluster where we establish a connection from second order unitary coupled cluster to universal gate sets through a relaxation of exponential operator splitting. We introduce the concept of quantum variational error suppression that allows some errors to be suppressed naturally in this algorithm on a pre-threshold quantum device. Additionally, we analyze truncation and correlated sampling in Hamiltonian averaging as ways to reduce the cost of this procedure. Finally, we show how the use of modern derivative free optimization techniques can offer dramatic computational savings of up to three orders of magnitude over previously used optimization techniques.
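The hybrid loop at the heart of this algorithm can be illustrated with a toy classical simulation: a parameterized state is prepared, its energy expectation is evaluated, and a derivative-free classical optimizer (here Nelder-Mead) adjusts the parameters. The single-qubit Hamiltonian and R_y ansatz below are illustrative assumptions, not the paper's ansätze or optimizers.

```python
# Toy sketch of the variational quantum-classical loop: a derivative-free
# classical optimizer minimizes the energy expectation of a parameterized state.
# The single-qubit Hamiltonian and R_y ansatz are illustrative only.
import numpy as np
from scipy.optimize import minimize

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = 0.5 * Z + 0.3 * X                     # toy Hamiltonian

def energy(theta):
    t = theta[0]
    psi = np.array([np.cos(t / 2.0), np.sin(t / 2.0)])   # |psi> = R_y(t)|0>
    return float(psi @ H @ psi)           # <psi|H|psi>

result = minimize(energy, x0=[0.1], method="Nelder-Mead")
print("VQE estimate:", result.fun, " exact ground energy:", np.linalg.eigvalsh(H)[0])
```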
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
[Conceptual approach to formation of a modern system of medical provision].
Belevitin, A B; Miroshnichenko, Iu V; Bunin, S A; Goriachev, A B; Krasavin, K D
2009-09-01
Within the framework of shaping the new profile of the medical service of the Armed Forces, the principal approaches to optimizing the development of the medical supply system were determined. The following principles were proposed: hierarchic structuring, purposeful orientation, vertical task sharing, horizontal task sharing, complex simulation, and permanent improvement. The main directions for optimizing the structure and composition of the medical supply system of the Armed Forces are: forming modern medical supply institutions (centers for support with equipment and materiel) on the basis of central and regional storehouses, and assigning to them several functions of the military administration bodies; and creating medical supply offices on the basis of military hospitals serving as base treatment-and-prophylaxis institutions in assigned territorial zones of responsibility, in order to carry out the complex of tasks of supplying the attached units and institutions with medical equipment. The medical supply system is built on three levels: Center - military region (Navy region) - territorial zone of responsibility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo; Ronald Boring; Lew Hanes
2013-09-01
The U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) program is collaborating with a U.S. nuclear utility to bring about a systematic fleet-wide control room modernization. To facilitate this upgrade, a new distributed control system (DCS) is being introduced into the control rooms of these plants. The DCS will upgrade the legacy plant process computer and emergency response facility information system. In addition, the DCS will replace an existing analog turbine control system with a display-based system. With technology upgrades comes the opportunity to improve the overall human-system interaction between the operators and the control room. To optimize operator performance, the LWRS Control Room Modernization research team followed a human-centered approach published by the U.S. Nuclear Regulatory Commission. NUREG-0711, Rev. 3, Human Factors Engineering Program Review Model (O’Hara et al., 2012), prescribes four phases for human factors engineering. This report provides examples of the first phase, Planning and Analysis. The three elements of Planning and Analysis in NUREG-0711 that are most crucial to initiating control room upgrades are: • Operating Experience Review: Identifies opportunities for improvement in the existing system and provides lessons learned from implemented systems. • Function Analysis and Allocation: Identifies which functions at the plant may be optimally handled by the DCS vs. the operators. • Task Analysis: Identifies how tasks might be optimized for the operators. Each of these elements is covered in a separate chapter. Examples are drawn from workshops with reactor operators that were conducted at the LWRS Human System Simulation Laboratory (HSSL) and at the respective plants. The findings in this report represent generalized accounts of more detailed proprietary reports produced for the utility for each plant. The goal of this LWRS report is to disseminate the technique and provide examples sufficient to serve as a template for other utilities’ projects for control room modernization.
Plane-Wave DFT Methods for Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bylaska, Eric J.
Modern plane-wave DFT methods and software (contained in the NWChem package) that allow for both geometry optimization and ab initio molecular dynamics simulations are described in detail. Significant emphasis is placed on aspects of these methods that are of interest to computational chemists and useful for simulating chemistry, including techniques for calculating charged systems, exact exchange (i.e. hybrid DFT methods), and highly efficient AIMD/MM methods. Sample applications on the structure of the goethite+water interface and the hydrolysis of nitroaromatic molecules are described.
Combs, Stephanie E; Debus, Jürgen; Feick, Günter; Hadaschik, Boris; Hohenfellner, Markus; Schüle, Roland; Zacharias, Jens-Peter; Schwardt, Malte
2014-11-04
A brainstorming and consensus meeting organized by the German Cancer Aid focused on modern treatment of prostate cancer and promising innovative techniques and research areas. Besides optimization of screening algorithms, molecular-based stratification and individually tailored treatment regimens will be the future of multimodal prostate cancer management. Effective interdisciplinary structures, including biobanking and data collection mechanisms are the basis for such developments.
Using Genotype Abundance to Improve Phylogenetic Inference
Mesin, Luka; Victora, Gabriel D; Minin, Vladimir N; Matsen, Frederick A
2018-01-01
Modern biological techniques enable very dense genetic sampling of unfolding evolutionary histories, and thus frequently sample some genotypes multiple times. This motivates strategies to incorporate genotype abundance information in phylogenetic inference. In this article, we synthesize a stochastic process model with standard sequence-based phylogenetic optimality, and show that tree estimation is substantially improved by doing so. Our method is validated with extensive simulations and an experimental single-cell lineage tracing study of germinal center B cell receptor affinity maturation. PMID:29474671
Mahan, Angel F; McEvoy, Matthew D; Gravenstein, Nikolaus
2016-04-01
In modern practice, real-time ultrasound guidance is commonly employed for the placement of internal jugular vein catheters. With a new tool, such as ultrasound, comes the opportunity to refine and further optimize the ultrasound view during jugular vein catheterization. We describe jugular vein access techniques and use the long-axis view as an alternative to the commonly employed short-axis cross-section view for internal jugular vein access and cannulation. The long-axis ultrasound-guided internal jugular vein approach for internal jugular vein cannulation is a useful alternative technique that can provide better needle tip and guidewire visualization than the more traditional short-axis ultrasound view.
Spatial frequency performance limitations of radiation dose optimization and beam positioning
NASA Astrophysics Data System (ADS)
Stewart, James M. P.; Stapleton, Shawn; Chaudary, Naz; Lindsay, Patricia E.; Jaffray, David A.
2018-06-01
The flexibility and sophistication of modern radiotherapy treatment planning and delivery methods have advanced techniques to improve the therapeutic ratio. Contemporary dose optimization and calculation algorithms facilitate radiotherapy plans which closely conform the three-dimensional dose distribution to the target, with beam shaping devices and image guided field targeting ensuring the fidelity and accuracy of treatment delivery. Ultimately, dose distribution conformity is limited by the maximum deliverable dose gradient; shallow dose gradients challenge techniques to deliver a tumoricidal radiation dose while minimizing dose to surrounding tissue. In this work, this ‘dose delivery resolution’ observation is rigorously formalized for a general dose delivery model based on the superposition of dose kernel primitives. It is proven that the spatial resolution of a delivered dose is bounded by the spatial frequency content of the underlying dose kernel, which in turn defines a lower bound in the minimization of a dose optimization objective function. In addition, it is shown that this optimization is penalized by a dose deposition strategy which enforces a constant relative phase (or constant spacing) between individual radiation beams. These results are further refined to provide a direct, analytic method to estimate the dose distribution arising from the minimization of such an optimization function. The efficacy of the overall framework is demonstrated on an image guided small animal microirradiator for a set of two-dimensional hypoxia guided dose prescriptions.
Model-based phase-shifting interferometer
NASA Astrophysics Data System (ADS)
Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian
2015-10-01
A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed instead of the traditional complicated system structure, to achieve versatile, high precision and quantitative surface tests. In the MPI, the partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a Zygo interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI holds large potential in modern optical shop testing.
Maintaining relationships with your patients by maximizing your online presence.
Donnelly, John; Kaaihue, Maarit
2011-01-01
Medical practices that take full advantage of today's online consumer-driven culture will leave other practices in their wake. With today's modern consumers looking to the Internet more and more for finding medical solutions for their family, it is imperative that your practice uses all of the tools available for creating and maintaining its online presence. We all know that having a functional Web site these days is a necessity for practically any business in any industry; however, taking your online presence further by using a few techniques can set up your practice for great success. Your online marketing should help your practice with managing patient relationships at all levels. To best reach this goal, continually analyzing data and updating your online marketing approach will help further drive leads and conversions. Using a few search engine optimization techniques as well as optimal design and marketing methods will allow you to more easily find prospective patients, build trust and credibility with your current patients, and manage your reputation.
SIMD Optimization of Linear Expressions for Programmable Graphics Hardware
Bajaj, Chandrajit; Ihm, Insung; Min, Jungki; Oh, Jinsang
2009-01-01
The increased programmability of graphics hardware allows efficient graphical processing unit (GPU) implementations of a wide range of general computations on commodity PCs. An important factor in such implementations is how to fully exploit the SIMD computing capacities offered by modern graphics processors. Linear expressions in the form of ȳ = Ax̄ + b̄, where A is a matrix, and x̄, ȳ and b̄ are vectors, constitute one of the most basic operations in many scientific computations. In this paper, we propose a SIMD code optimization technique that enables efficient shader codes to be generated for evaluating linear expressions. It is shown that performance can be improved considerably by efficiently packing arithmetic operations into four-wide SIMD instructions through reordering of the operations in linear expressions. We demonstrate that the presented technique can be used effectively for programming both vertex and pixel shaders for a variety of mathematical applications, including integrating differential equations and solving a sparse linear system of equations using iterative methods. PMID:19946569
Reirradiation of head and neck cancer using modern highly conformal techniques.
Ho, Jennifer C; Phan, Jack
2018-04-23
Locoregional disease recurrence or development of a second primary cancer after definitive radiotherapy for head and neck cancers remains a treatment challenge. Reirradiation utilizing traditional techniques has been limited by concern for serious toxicity. With the advent of newer, more precise radiotherapy techniques, such as intensity-modulated radiotherapy (IMRT), proton radiotherapy, and stereotactic body radiotherapy (SBRT), there has been renewed interest in curative-intent head and neck reirradiation. However, as most studies were retrospective, single-institutional experiences, the optimal modality is not clear. We provide a comprehensive review of the outcomes of relevant studies using these 3 head and neck reirradiation techniques, followed by an analysis and comparison of the toxicity, tumor control, concurrent systemic therapy, and prognostic factors. Overall, there is evidence that IMRT, proton therapy, and SBRT reirradiation are feasible treatment options that offer a chance for durable local control and survival. Prospective studies, particularly randomized trials, are needed. © 2018 Wiley Periodicals, Inc.
Neoadjuvant radiotherapeutic strategies in pancreatic cancer
Roeder, Falk
2016-01-01
This review summarizes the current status of neoadjuvant radiation approaches in the treatment of pancreatic cancer, including a description of modern radiation techniques, and an overview on the literature regarding neoadjuvant radio- or radiochemotherapeutic strategies both for resectable and irresectable pancreatic cancer. Neoadjuvant chemoradiation for locally-advanced, primarily non- or borderline resectable pancreas cancer results in secondary resectability in a substantial proportion of patients with consecutively markedly improved overall prognosis and should be considered as possible alternative in pretreatment multidisciplinary evaluations. In resectable pancreatic cancer, outstanding results in terms of response, local control and overall survival have been observed with neoadjuvant radio- or radiochemotherapy in several phase I/II trials, which justify further evaluation of this strategy. Further investigation of neoadjuvant chemoradiation strategies should be performed preferentially in randomized trials in order to improve comparability of the current results with other treatment modalities. This should include the evaluation of optimal sequencing with newer and more potent systemic induction therapy approaches. Advances in patient selection based on new molecular markers might be of crucial interest in this context. Finally modern external beam radiation techniques (intensity-modulated radiation therapy, image-guided radiation therapy and stereotactic body radiation therapy), new radiation qualities (protons, heavy ions) or combinations with alternative boosting techniques widen the therapeutic window and contribute to the reduction of toxicity. PMID:26909133
Yu, Daxiong; Ma, Ruijie; Fang, Jianqiao
2015-05-01
There are many eminent acupuncture masters in modern times in the regions of Zhejiang province, where acupuncture schools with numerous distinctive characteristics have developed and exerted important influence at home and abroad. Through collection of the literature on the acupuncture schools in Zhejiang and interviews with the parties involved, it has been found that the acupuncture manipulation techniques of these modern masters are distinctively featured. The techniques are developed on the basis of Neijing (Internal Classic), Jinzhenfu (Ode to Gold Needle) and Zhenjiu Dacheng (Great Compendium of Acupuncture and Moxibustion). Whether following the old maxims or studying on their own, every master lays emphasis on the research and interpretation of the classical theories and integrates the traditional with the modern. In this paper, the acupuncture manipulation techniques of modern Zhejiang acupuncture masters are described from four aspects: the needling techniques in the Internal Classic, the feijingzouqi needling technique, the penetrating needling technique, and innovations in acupuncture manipulation.
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walz-Flannigan, A; Lucas, J; Buchanan, K
Purpose: Manual technique selection in radiography is needed for imaging situations where there is difficulty in proper positioning for AEC, for prostheses, for non-bucky imaging, or for guiding image repeats. Basic information about how to provide consistent image signal and contrast for various kV and tissue thicknesses is needed to create manual technique charts, and is relevant for physicists involved in technique chart optimization. Guidance on technique combinations and rules of thumb for providing consistent image signal still in use today is based on measurements of the optical density of screen-film combinations and older-generation x-ray systems. Tools such as a kV-scale chart can be useful for knowing how to modify mAs when kV is changed in order to maintain a consistent image receptor signal level. We evaluate these tools for modern equipment for use in optimizing properly size-scaled techniques. Methods: We used a water phantom to measure calibrated signal change for CR and DR (with grid) for various beam energies. Tube current values were calculated that would yield a consistent image signal response. Data were fitted to provide sufficient granularity of detail to compose a technique-scale chart. Tissue thickness was approximated as equivalent to 80% of the water depth. Results: We created updated technique-scale charts, providing mAs and kV combinations to achieve consistent signal for CR and DR for various tissue-equivalent thicknesses. We show how this information can be used to create properly scaled size-based manual technique charts. Conclusion: Relative scaling of mAs and kV for constant signal (i.e. the shape of the curve) appears substantially similar between film-screen and CR/DR. This supports the notion that image receptor related differences are minor factors for relative (not absolute) changes in mAs with varying kV. However, as demonstrated, these detailed technique scales, which are difficult to find, are useful tools for manual chart optimization.
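One way to turn such phantom measurements into a manual technique scale is to fit a simple signal model and solve for the mAs that holds the detector signal at a fixed target as kV varies. The power-law model form and all numbers below are illustrative assumptions, not the authors' data or fitted curve.

```python
# Sketch: derive a kV-mAs technique scale from phantom measurements by fitting
# an assumed power-law signal model S = c * mAs * kV**n and solving for the mAs
# that keeps S at a target value. Model form and data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical calibrated detector signal measured through a fixed water depth
kv  = np.array([60., 70., 81., 90., 102., 109., 117., 125.])
mas = np.array([32., 20., 12.5, 9., 6.3, 5., 4., 3.2])
sig = np.array([1010., 985., 1002., 990., 1008., 995., 1001., 992.])  # ~constant by design

def model(x, c, n):
    kv_, mas_ = x
    return c * mas_ * kv_**n

(c, n), _ = curve_fit(model, (kv, mas), sig, p0=[1e-4, 3.0])

target = 1000.0
for k in np.arange(60, 126, 5):
    m = target / (c * k**n)
    print(f"{k:5.0f} kV -> {m:6.1f} mAs for constant signal")
```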
Linear-Quadratic-Gaussian Regulator Developed for a Magnetic Bearing
NASA Technical Reports Server (NTRS)
Choi, Benjamin B.
2002-01-01
Linear-Quadratic-Gaussian (LQG) control is a modern state-space technique for designing optimal dynamic regulators. It enables us to trade off regulation performance and control effort, and to take into account process and measurement noise. The Structural Mechanics and Dynamics Branch at the NASA Glenn Research Center has developed an LQG control for a fault-tolerant magnetic bearing suspension rig to optimize system performance and to reduce the sensor and processing noise. The LQG regulator consists of an optimal state-feedback gain and a Kalman state estimator. The first design step is to seek a state-feedback law that minimizes the cost function of regulation performance, which is measured by a quadratic performance criterion with user-specified weighting matrices, and to define the tradeoff between regulation performance and control effort. The next design step is to derive a state estimator using a Kalman filter because the optimal state feedback cannot be implemented without full state measurement. Since the Kalman filter is an optimal estimator when dealing with Gaussian white noise, it minimizes the asymptotic covariance of the estimation error.
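The two design steps described above (a state-feedback gain from a Riccati equation, and a Kalman estimator gain from the dual equation) can be sketched for a generic continuous-time plant as below. The second-order plant, weighting matrices and noise intensities are placeholders, not the magnetic bearing rig's model.

```python
# Sketch: continuous-time LQG design for an illustrative second-order plant.
# The LQR gain and Kalman gain each come from an algebraic Riccati equation;
# plant matrices, weights and noise intensities here are placeholders.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # plant dynamics (illustrative)
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

Q = np.diag([10.0, 1.0])                    # regulation-performance weight
R = np.array([[0.1]])                       # control-effort weight
W = np.diag([1e-3, 1e-3])                   # process-noise intensity
V = np.array([[1e-2]])                      # measurement-noise intensity

# Optimal state feedback u = -K x
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Kalman estimator gain L (dual Riccati problem)
S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(V)

print("LQR gain K:", K)
print("Kalman gain L:", L.ravel())
```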
Liang, Jie; Gao, Xiang; Zeng, Guangming; Hua, Shanshan; Zhong, Minzhou; Li, Xiaodong; Li, Xin
2018-01-09
Climate change and human activities cause uncertain changes to species biodiversity by altering their habitat. The uncertainty of climate change requires planners to balance the benefits and costs of making a conservation plan. Here an optimal protection approach for the Lesser White-fronted Goose (LWfG), coupling Modern Portfolio Theory (MPT) and Marxan selection, is proposed. MPT was used to provide suggested investment weights for protected areas (PA) and to reduce the influence of climatic uncertainty, while Marxan was utilized to choose a series of specific locations for the PA. We argue that combining these two commonly used techniques in the conservation plan, covering both asset allocation and PA selection, enhances the efficiency of protection for this rare bird. In the MPT analyses, the uncertainty of the conservation outcome can be reduced when conservation effort is allocated to Hunan, Jiangxi and the Yangtze River delta. In the Marxan model, the optimal locations for habitat restoration based on the existing nature reserves were identified. Clear priorities for the location and allocation of assets can be provided based on this research, helping decision makers to build a conservation strategy for the LWfG.
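The portfolio step can be illustrated with a minimal Markowitz mean-variance allocation across candidate regions, trading expected conservation benefit against outcome variance across climate scenarios. The region names mirror the abstract, but the expected benefits, covariance matrix and risk-aversion weight are hypothetical stand-ins, not the study's inputs.

```python
# Sketch: Markowitz mean-variance allocation of conservation effort across
# candidate regions. Expected benefits and the scenario covariance are
# hypothetical placeholders, not the study's data.
import numpy as np
from scipy.optimize import minimize

regions = ["Hunan", "Jiangxi", "Yangtze delta", "Other"]
mu = np.array([0.12, 0.10, 0.11, 0.06])           # expected conservation benefit
Sigma = np.array([[0.020, 0.004, 0.003, 0.001],
                  [0.004, 0.018, 0.002, 0.001],
                  [0.003, 0.002, 0.015, 0.002],
                  [0.001, 0.001, 0.002, 0.030]])  # outcome covariance across scenarios
risk_aversion = 4.0

def neg_utility(w):
    return risk_aversion * w @ Sigma @ w - mu @ w  # variance penalty minus benefit

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * len(mu)
w0 = np.full(len(mu), 1.0 / len(mu))
res = minimize(neg_utility, w0, bounds=bounds, constraints=cons)
for name, w in zip(regions, res.x):
    print(f"{name:>14s}: {w:.2%} of conservation effort")
```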
An Advanced simulation Code for Modeling Inductive Output Tubes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thuc Bui; R. Lawrence Ives
2012-04-27
During the Phase I program, CCR completed several major building blocks for a 3D large signal, inductive output tube (IOT) code using modern computer language and programming techniques. These included a 3D, Helmholtz, time-harmonic, field solver with a fully functional graphical user interface (GUI), automeshing and adaptivity. Other building blocks included the improved electrostatic Poisson solver with temporal boundary conditions to provide temporal fields for the time-stepping particle pusher as well as the self electric field caused by time-varying space charge. The magnetostatic field solver was also updated to solve for the self magnetic field caused by time-changing current density in the output cavity gap. The goal function to optimize an IOT cavity was also formulated, and the optimization methodologies were investigated.
NASA Astrophysics Data System (ADS)
Bird, Robert; Nystrom, David; Albright, Brian
2017-10-01
The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms, the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.
Impact of Chaos Functions on Modern Swarm Optimizers.
Emary, E; Zawbaa, Hossam M
2016-01-01
Exploration and exploitation are two essential components of any optimization algorithm. Too much exploration leads to oscillation and slow convergence, while too much exploitation risks premature convergence and the optimizer becoming stuck in local minima. Therefore, balancing the rates of exploration and exploitation over the optimization lifetime is a challenge. This study evaluates the impact of using chaos-based control of exploration/exploitation rates against using the systematic native control. Three modern algorithms were used in the study, namely the grey wolf optimizer (GWO), antlion optimizer (ALO) and moth-flame optimizer (MFO), in the domain of machine learning for feature selection. Results on a set of standard machine learning data, using a set of assessment indicators, show an advance in optimization algorithm performance when repeated periods of declining exploration rates are used in place of systematically decreased exploration rates.
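A minimal way to see what chaos-based control of the exploration rate means in practice is to drive the GWO parameter a with a logistic map instead of the usual linear decrease from 2 to 0. The sketch below does this on the sphere test function; it only illustrates the idea, and is not the study's feature-selection setup or its specific chaotic maps.

```python
# Sketch: grey wolf optimizer on the sphere function, with the exploration
# parameter "a" driven by a logistic chaotic map instead of a linear decrease.
import numpy as np

def sphere(x):
    return float(np.sum(x**2))

def chaotic_gwo(fobj, dim=10, n_wolves=20, iters=200, lb=-5.0, ub=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_wolves, dim))
    chaos = 0.7                                   # logistic-map state in (0, 1)
    best_x, best_f = None, np.inf
    for t in range(iters):
        fitness = np.apply_along_axis(fobj, 1, X)
        order = np.argsort(fitness)
        alpha, beta, delta = X[order[:3]]
        if fitness[order[0]] < best_f:
            best_x, best_f = alpha.copy(), float(fitness[order[0]])

        chaos = 4.0 * chaos * (1.0 - chaos)       # logistic map x_{k+1} = 4 x_k (1 - x_k)
        a = 2.0 * chaos * (1.0 - t / iters)       # chaotic, decaying exploration rate

        candidates = []
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(X.shape), rng.random(X.shape)
            A = 2.0 * a * r1 - a
            C = 2.0 * r2
            D = np.abs(C * leader - X)
            candidates.append(leader - A * D)
        X = np.clip(sum(candidates) / 3.0, lb, ub)
    return best_x, best_f

print("best sphere value:", chaotic_gwo(sphere)[1])
```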
WE-D-303-00: Computational Phantoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, John; Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA
2015-06-15
Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: Understand the need and requirements of computational phantoms in medical physics research Discuss the developments and applications of computational phantoms Know the promises and limitations of computational phantoms in solving complex problems.
Thompson, R.S.; Anderson, K.H.; Bartlein, P.J.
2008-01-01
The method of modern analogs is widely used to obtain estimates of past climatic conditions from paleobiological assemblages, and despite its frequent use, this method involves so-far untested assumptions. We applied four analog approaches to a continental-scale set of bioclimatic and plant-distribution presence/absence data for North America to assess how well this method works under near-optimal modern conditions. For each point on the grid, we calculated the similarity between its vegetation assemblage and those of all other points on the grid (excluding nearby points). The climate of the points with the most similar vegetation was used to estimate the climate at the target grid point. Estimates based on the use of the Jaccard similarity coefficient had smaller errors than those based on the use of a new similarity coefficient, although the latter may be more robust because it does not assume that the "fossil" assemblage is complete. The results of these analyses indicate that presence/absence vegetation assemblages provide a valid basis for estimating bioclimates on the continental scale. However, the accuracy of the estimates is strongly tied to the number of species in the target assemblage, and the analog method is necessarily constrained to produce estimates that fall within the range of observed values. We applied the four modern analog approaches and the mutual overlap (or "mutual climatic range") method to estimate bioclimatic conditions represented by the plant macrofossil assemblage from a packrat midden of Last Glacial Maximum age from southern Nevada. In general, the estimation approaches produced similar results with regard to moisture conditions, but there was a greater range of estimates for growing-degree days. Despite its limitations, the modern analog technique can provide paleoclimatic reconstructions that serve as the starting point for the interpretation of past climatic conditions.
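The core of the Jaccard-based analog calculation can be sketched in a few lines: compute the Jaccard coefficient between a target presence/absence assemblage and every grid cell, exclude nearby points, and average the climate of the k best analogs. The grid data below are random placeholders standing in for the continental data set.

```python
# Sketch: modern-analog climate estimation using the Jaccard coefficient on
# presence/absence vectors. Grid data here are random placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_grid, n_taxa = 500, 120
presence = rng.random((n_grid, n_taxa)) < 0.15          # presence/absence matrix
climate = rng.normal(size=(n_grid, 2))                   # e.g. growing-degree days, moisture index

def jaccard(a, b):
    union = np.sum(a | b)
    return np.sum(a & b) / union if union else 0.0

def analog_estimate(target_assemblage, k=10, exclude=None):
    sims = np.array([jaccard(target_assemblage, presence[i]) for i in range(n_grid)])
    if exclude is not None:
        sims[exclude] = -1.0                              # drop nearby points, as in the study
    best = np.argsort(sims)[::-1][:k]
    return climate[best].mean(axis=0), sims[best]

estimate, similarities = analog_estimate(presence[42], exclude=[42])
print("estimated climate:", estimate, " true:", climate[42])
```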
Optimization of dental implantation
NASA Astrophysics Data System (ADS)
Dol, Aleksandr V.; Ivanov, Dmitriy V.
2017-02-01
Modern dentistry cannot exist without dental implantation. This work is devoted to the study of the "bone-implant" system and to the optimization of dental prosthesis installation. Modern non-invasive methods such as MRI and 3D scanning, as well as numerical calculations and 3D prototyping, allow optimization of all stages of dental prosthetics. An integrated approach to the planning of implant surgery can significantly reduce the risk of complications in the first few days after treatment and throughout the service life of the prosthesis.
Assessment of Speech in Primary Cleft Palate by Two-layer Closure (Conservative Management).
Jain, Harsha; Rao, Dayashankara; Sharma, Shailender; Gupta, Saurabh
2012-01-01
Treatment of the cleft palate has evolved over a long period of time. The various techniques of cleft palate repair that are practiced today are the results of principles learned through many years of modifications. The challenge in the art of modern palatoplasty is no longer successful closure of the cleft palate but an optimal speech outcome without compromising maxillofacial growth. Throughout these periods of evolution in the treatment of cleft palate, the effectiveness of various treatment protocols has been challenged by controversies concerning speech and maxillofacial growth. In this article we have evaluated the results of Pinto's modification of the Wardill-Kilner palatoplasty, without radical dissection of the levator veli palatini muscle, on speech and postoperative fistula in two different age groups in 20 patients. Preoperative and 6-month postoperative speech assessment values indicated that two-layer palatoplasty (modified Wardill-Kilner V-Y pushback technique) without an intravelar veloplasty technique gave good speech outcomes.
Current role of modern radiotherapy techniques in the management of breast cancer
Ozyigit, Gokhan; Gultekin, Melis
2014-01-01
Breast cancer is the most common type of malignancy in females. Advances in systemic therapies and radiotherapy (RT) have provided long survival rates in breast cancer patients. RT has a major role in the management of breast cancer. During the past 15 years several developments took place in the field of imaging and irradiation techniques, intensity-modulated RT, hypofractionation and partial-breast irradiation. Currently, improvements in RT technology allow a subsequent decrease in treatment-related complications such as fibrosis and long-term cardiac toxicity while improving loco-regional control rates and cosmetic results. Thus, it is crucial that modern radiotherapy techniques be carried out with maximum care and efficiency. Several randomized trials provide evidence for the feasibility of modern radiotherapy techniques in the management of breast cancer. However, the role of modern radiotherapy techniques in the management of breast cancer will continue to be defined by the mature results of randomized trials. The current review provides up-to-date, evidence-based data on the role of modern radiotherapy techniques in the management of breast cancer. PMID:25114857
Optimizing product life cycle processes in design phase
NASA Astrophysics Data System (ADS)
Faneye, Ola. B.; Anderl, Reiner
2002-02-01
Life cycle concepts not only serve as a basis for assisting product developers in understanding the dependencies between products and their life cycles, they also help in identifying potential opportunities for improvement in products. Common traditional concepts focus mainly on energy and material flow across life phases, necessitating the availability of metrics derived from a reference product. Knowledge of life cycle processes gained from an existing product is directly reused in its redesign. Depending on sales volume, nevertheless, the environmental impact before product optimization can be substantial. With modern information technologies, computer-aided life cycle methodologies can today be applied well before product use. On the basis of a virtual prototype, life cycle processes are analyzed and optimized using simulation techniques. This preventive approach not only helps in minimizing (or even eliminating) environmental burdens caused by the product; costs incurred due to changes in the real product can also be avoided. The paper highlights the relationship between product and life cycle and presents a computer-based methodology for optimizing the product life cycle during design, as presented by SFB 392: Design for Environment - Methods and Tools at Technical University, Darmstadt.
NASA Technical Reports Server (NTRS)
Vanderplaats, Garrett; Townsend, James C. (Technical Monitor)
2002-01-01
The purpose of this research under the NASA Small Business Innovative Research program was to develop algorithms and associated software to solve very large nonlinear, constrained optimization tasks. Key issues included efficiency, reliability, memory, and gradient calculation requirements. This report describes the general optimization problem, ten candidate methods, and detailed evaluations of four candidates. The algorithm chosen for final development is a modern recreation of a 1960s external penalty function method that uses very limited computer memory and computational time. Although of lower efficiency, the new method can solve problems orders of magnitude larger than current methods. The resulting BIGDOT software has been demonstrated on problems with 50,000 variables and about 50,000 active constraints. For unconstrained optimization, it has solved a problem in excess of 135,000 variables. The method includes a technique for solving discrete variable problems that finds a "good" design, although a theoretical optimum cannot be guaranteed. It is very scalable in that the number of function and gradient evaluations does not change significantly with increased problem size. Test cases are provided to demonstrate the efficiency and reliability of the methods and software.
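The exterior penalty idea described above can be sketched on a small constrained problem: the constrained objective is replaced by f(x) + r * sum(max(0, g(x))^2) and minimized for an increasing sequence of penalty parameters r. The problem and constants below are illustrative, not the BIGDOT formulation.

```python
# Sketch: classical exterior penalty method - minimize f(x) + r * sum(max(0, g(x))^2)
# for an increasing sequence of penalty parameters r. Illustrative problem only.
import numpy as np
from scipy.optimize import minimize

def f(x):                          # objective
    return (x[0] - 3.0)**2 + (x[1] - 2.0)**2

def g(x):                          # inequality constraints, feasible when g <= 0
    return np.array([x[0] + x[1] - 4.0, -x[0], -x[1]])

def penalized(x, r):
    viol = np.maximum(0.0, g(x))
    return f(x) + r * np.sum(viol**2)

x = np.array([0.0, 0.0])
for r in [1.0, 10.0, 100.0, 1000.0]:            # progressively stiffer penalty
    x = minimize(lambda z: penalized(z, r), x).x
print("approximate constrained optimum:", x)     # exact answer is (2.5, 1.5)
```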
H2 optimal control techniques for resistive wall mode feedback in tokamaks
NASA Astrophysics Data System (ADS)
Clement, Mitchell; Hanson, Jeremy; Bialek, Jim; Navratil, Gerald
2018-04-01
DIII-D experiments show that a new, advanced algorithm enables resistive wall mode (RWM) stability control in high performance discharges using external coils. DIII-D can excite strong, locked or nearly locked external kink modes whose rotation frequencies and growth rates are on the order of the magnetic flux diffusion time of the vacuum vessel wall. Experiments have shown that modern control techniques like linear quadratic Gaussian (LQG) control require less current than the proportional controller in use at DIII-D when using control coils external to DIII-D’s vacuum vessel. Experiments were conducted to develop control of a rotating n = 1 perturbation using an LQG controller derived from VALEN and external coils. Feedback using this LQG algorithm outperformed a proportional gain only controller in these perturbation experiments over a range of frequencies. Results from high βN experiments also show that advanced feedback techniques using external control coils may be as effective as internal control coil feedback using classical control techniques.
Tarasov, Andrii; Rauhut, Doris; Jung, Rainer
2017-12-01
Analytical methods for the quantification of haloanisoles and halophenols in cork matrices are summarized in the current review. Sample-preparation and sample-treatment techniques are compared and discussed from the perspective of their efficiency, time and extractant optimization, and ease of performance. The primary interest of these analyses is usually 2,4,6-trichloroanisole (TCA), which is the major wine contaminant among the haloanisoles. Two concepts of TCA determination are described in the review: releasable TCA and total TCA analyses. Chromatographic, bioanalytical and sensorial methods are compared according to their application in the cork industry and in scientific investigations. Finally, it is shown that modern analytical techniques are able to provide the required sensitivity, selectivity and repeatability for haloanisole and halophenol determination. Copyright © 2017 Elsevier B.V. All rights reserved.
Taming the Wild: A Unified Analysis of Hogwild!-Style Algorithms.
De Sa, Christopher; Zhang, Ce; Olukotun, Kunle; Ré, Christopher
2015-12-01
Stochastic gradient descent (SGD) is a ubiquitous algorithm for a variety of machine learning problems. Researchers and industry have developed several techniques to optimize SGD's runtime performance, including asynchronous execution and reduced precision. Our main result is a martingale-based analysis that enables us to capture the rich noise models that may arise from such techniques. Specifically, we use our new analysis in three ways: (1) we derive convergence rates for the convex case (Hogwild!) with relaxed assumptions on the sparsity of the problem; (2) we analyze asynchronous SGD algorithms for non-convex matrix problems including matrix completion; and (3) we design and analyze an asynchronous SGD algorithm, called Buckwild!, that uses lower-precision arithmetic. We show experimentally that our algorithms run efficiently for a variety of problems on modern hardware.
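The update pattern that Hogwild!-style algorithms rely on can be illustrated with a small lock-free SGD loop: several workers update a shared parameter vector without synchronization, each touching only the coordinates its (sparse) sample involves. The Python-thread version below only illustrates the pattern; the speedups analyzed in the paper come from true shared-memory parallelism in lower-level implementations, and the regression problem here is a synthetic placeholder.

```python
# Conceptual sketch of a Hogwild!-style update loop: workers run SGD on a shared
# parameter vector with no locks. Python threads illustrate the pattern only.
import numpy as np
import threading

rng = np.random.default_rng(0)
n, d = 5000, 50
X = (rng.random((n, d)) < 0.1) * rng.normal(size=(n, d))   # sparse-ish features
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)                      # shared state, updated without locking
lr = 0.05

def worker(indices):
    for i in indices:
        xi = X[i]
        nz = np.nonzero(xi)[0]       # only the coordinates this sample touches
        grad = (xi[nz] @ w[nz] - y[i]) * xi[nz]
        w[nz] -= lr * grad           # racy in-place update, Hogwild!-style

threads = [threading.Thread(target=worker, args=(rng.permutation(n),)) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print("parameter error:", np.linalg.norm(w - w_true))
```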
NASA Astrophysics Data System (ADS)
Sun, Ning; Wu, Yiming; Chen, He; Fang, Yongchun
2018-03-01
Underactuated cranes play an important role in modern industry. Specifically, in most practical applications, crane systems exhibit significant double pendulum characteristics, which makes the control problem quite challenging. Moreover, most existing planners/controllers obtained with standard methods/techniques for double pendulum cranes cannot minimize the energy consumption when fulfilling the transportation tasks. Therefore, from a practical perspective, this paper proposes an energy-optimal solution for transportation control of double pendulum cranes. By applying the presented approach, the transportation objective, including fast trolley positioning and swing elimination, is achieved with minimized energy consumption, and the residual oscillations are suppressed effectively with all the state constraints being satisfied during the entire transportation process. As far as we know, this is the first energy-optimal solution for transportation control of underactuated double pendulum cranes with various state and control constraints. Hardware experimental results are included to verify the effectiveness of the proposed approach, whose superior performance is demonstrated by experimental comparison with several alternative controllers.
Verification techniques for x-ray and mammography applications
NASA Astrophysics Data System (ADS)
Kotsopoulos, Stavros A.; Lymberopoulos, Dimitris C.
1993-07-01
The integration of the Medical Information Environment demands the study and development of high-speed data communication systems with specially designed 'endsystems' (MWS, etc.) for flexible and reliable data transmission/reception, handling and manipulation. An important parameter affecting overall system performance is the 'quality factor' of the communicated medical data produced by a wide range of modern modalities. The present paper describes a set of tests, performed in a medical communication network based on a teleworking platform, in order to optimize the sensitivity parameters of the modalities through remote fine re-adjustments guided by experts.
Experimental Investigation and Optimization of Response Variables in WEDM of Inconel - 718
NASA Astrophysics Data System (ADS)
Karidkar, S. S.; Dabade, U. A.
2016-02-01
Effective utilisation of Wire Electrical Discharge Machining (WEDM) technology is a challenge for modern manufacturing industries. Day by day, new materials with higher strengths and capabilities are being developed to fulfil customers' needs. Inconel 718 is one such material, which is extensively used in aerospace applications such as gas turbines, rocket motors and spacecraft, as well as in nuclear reactors and pumps. This paper deals with the experimental investigation of optimal machining parameters in WEDM for surface roughness, kerf width and dimensional deviation, using DoE techniques such as the Taguchi methodology with an L9 orthogonal array. Keeping the peak current constant at 70 A, the effect of the other process parameters on the above response variables was analysed. The experimental results obtained were statistically analysed using Minitab-16 software. Analysis of Variance (ANOVA) shows pulse-on time as the most influential parameter, followed by wire tension, whereas spark gap set voltage is observed to be non-influential. The multi-objective optimization technique Grey Relational Analysis (GRA) yields optimal machining parameters of pulse-on time 108 machine units, spark gap set voltage 50 V and wire tension 12 gm for the response variables considered in the experimental analysis.
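The grey relational analysis step can be sketched as follows: normalize each response, convert deviations from the ideal into grey relational coefficients, average them into a grade per run, and rank the runs. The nine-run response values below are hypothetical, not the measured WEDM data, and all three responses are treated as smaller-is-better for illustration.

```python
# Sketch: grey relational analysis over an L9 experiment with three responses
# (all treated as smaller-is-better). Values are hypothetical placeholders.
import numpy as np

# rows = 9 runs; columns = [surface roughness (um), kerf width (mm), dimensional deviation (%)]
R = np.array([
    [2.8, 0.32, 1.2], [2.5, 0.30, 1.0], [2.9, 0.33, 1.4],
    [2.2, 0.29, 0.9], [2.6, 0.31, 1.1], [2.4, 0.30, 1.3],
    [2.1, 0.28, 0.8], [2.7, 0.32, 1.2], [2.3, 0.29, 1.0],
])

# Normalize each response as smaller-is-better onto [0, 1]
norm = (R.max(axis=0) - R) / (R.max(axis=0) - R.min(axis=0))

# Grey relational coefficients with distinguishing coefficient zeta = 0.5
delta = 1.0 - norm
zeta = 0.5
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Grey relational grade = mean coefficient per run; rank runs (best first)
grade = grc.mean(axis=1)
ranking = np.argsort(grade)[::-1] + 1
print("grades:", np.round(grade, 3))
print("run ranking (best first):", ranking)
```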
Discrete particle swarm optimization for identifying community structures in signed social networks.
Cai, Qing; Gong, Maoguo; Shen, Bo; Ma, Lijia; Jiao, Licheng
2014-10-01
The modern science of networks has greatly facilitated our understanding of complex systems. Community structure is believed to be one of the notable features of complex networks representing real complicated systems. Very often, uncovering community structures in networks can be regarded as an optimization problem; thus, many evolutionary-algorithm-based approaches have been put forward. Particle swarm optimization (PSO) is an artificial intelligence algorithm originating from social behaviour such as bird flocking and fish schooling. PSO has proved to be an effective optimization technique. However, PSO was originally designed for continuous optimization, which confounds its application to discrete contexts. In this paper, a novel discrete PSO algorithm is suggested for identifying community structures in signed networks. In the suggested method, the particles' status has been redesigned in discrete form so as to make PSO suitable for discrete scenarios, and the particles' updating rules have been reformulated by making use of the topology of the signed network. Extensive experiments, compared with three state-of-the-art approaches on both synthetic and real-world signed networks, demonstrate that the proposed method is effective and promising. Copyright © 2014 Elsevier Ltd. All rights reserved.
Optimal case-control matching in practice.
Cologne, J B; Shibata, Y
1995-05-01
We illustrate modern matching techniques and discuss practical issues in defining the closeness of matching for retrospective case-control designs (in which the pool of subjects already exists when the study commences). We empirically compare matching on a balancing score, analogous to the propensity score for treated/control matching, with matching on a weighted distance measure. Although both methods in principle produce balance between cases and controls in the marginal distributions of the matching covariates, the weighted distance measure provides better balance in practice because the balancing score can be poorly estimated. We emphasize the use of optimal matching based on efficient network algorithms. An illustration is based on the design of a case-control study of hepatitis B virus infection as a possible confounder and/or effect modifier of radiation-related primary liver cancer in atomic bomb survivors.
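The idea of optimal (rather than greedy) matching via assignment/network algorithms can be sketched as follows, assuming hypothetical case and control covariates and illustrative covariate weights; `scipy.optimize.linear_sum_assignment` serves here only as a stand-in assignment solver, not the algorithm used by the authors.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)

# Hypothetical covariates (age, dose) for cases and a larger control pool.
cases = rng.normal([60.0, 1.2], [8.0, 0.5], size=(20, 2))
controls = rng.normal([58.0, 1.0], [10.0, 0.6], size=(80, 2))

# Weighted Euclidean distance between every case and every control.
weights = np.array([1.0, 2.0])            # illustrative covariate weights
diff = cases[:, None, :] - controls[None, :, :]
cost = np.sqrt(((diff * weights) ** 2).sum(axis=2))

# Optimal (total-distance-minimizing) 1:1 matching, unlike greedy matching,
# which fixes pairs one at a time and can leave poor matches at the end.
case_idx, control_idx = linear_sum_assignment(cost)
total = cost[case_idx, control_idx].sum()
print("first matched pairs:", list(zip(case_idx.tolist(), control_idx.tolist()))[:5])
print("total matched distance:", round(float(total), 2))
```

Replacing the weighted distance with an estimated balancing score changes only how `cost` is built; the assignment step stays the same.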
Reliability-based optimization of an active vibration controller using evolutionary algorithms
NASA Astrophysics Data System (ADS)
Saraygord Afshari, Sajad; Pourtakdoust, Seid H.
2017-04-01
Many modern industrial systems such as aircraft, rotating turbines and satellite booms cannot perform their desired tasks accurately if their structural vibrations are not properly inhibited and controlled. Structural health monitoring and online reliability calculations are emerging means of handling system-imposed uncertainties. Since stochastic forcing is unavoidable in most engineering systems, it often needs to be taken into account in the control design process. In this research, smart material technology is utilized for structural health monitoring and control in order to keep the system in a reliable performance range. In this regard, a reliability-based cost function is assigned both to controller gain optimization and to sensor placement. The proposed scheme is implemented and verified for a wing section. Comparison of results for the frequency responses is presented to show the potential applicability of the presented technique.
Efficacy of Code Optimization on Cache-Based Processors
NASA Technical Reports Server (NTRS)
VanderWijngaart, Rob F.; Saphir, William C.; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
In this paper a number of techniques for improving the cache performance of a representative piece of numerical software are presented. Target machines are popular processors from several vendors: MIPS R5000 (SGI Indy), MIPS R8000 (SGI PowerChallenge), MIPS R10000 (SGI Origin), DEC Alpha EV4 + EV5 (Cray T3D & T3E), IBM RS6000 (SP Wide-node), Intel PentiumPro (Ames' Whitney), Sun UltraSparc (NERSC's NOW). The optimizations all attempt to increase the locality of memory accesses, but they meet with rather varied and often counterintuitive success on the different computing platforms. We conclude that it may be genuinely impossible to obtain portable performance on the current generation of cache-based machines. At the least, it appears that the performance of modern commodity processors cannot be described with parameters defining the cache alone.
Application of Post Modern Portfolio Theory to Mitigate Risk in International Shipping
2011-03-24
The concept of portfolio optimization pioneered by Dr. Harry Markowitz and still used today for investment diversification is applied to the ... is currently referred to as "Post-Modern Portfolio Theory." It begins with the foundations of portfolio optimization as created by Harry ... "Portfolio Theory," and is still considered to be one of the foundations of economic theory, garnering ...
Meleşcanu Imre, M; Preoteasa, E; Țâncu, AM; Preoteasa, CT
2013-01-01
Rationale. Imaging methods are increasingly used in the clinical workflow of modern dentistry. Since implant-based treatment alternatives are nowadays seen as the standard of care in edentulous patients, these techniques must be integrated into complete denture treatment. Aim. The study presents some evaluation techniques for the edentulous patient treated with conventional dentures or mini dental implant (mini SKY Bredent) overdentures, using profile teleradiography. These offer data useful for optimal positioning of the artificial teeth and the mini dental implants, helping to obtain an esthetic and functional treatment outcome. We also propose a method for designing a simple surgical guide that allows prosthetically driven implant placement. Material and method. Clinical case reports were made, highlighting the importance of cephalometric evaluation on lateral teleradiographs in completely edentulous patients. A clinical case reporting, step by step, the preparation of the surgical guide (Bredent radio-opaque silicone) in order to place the mini dental implants under the best prosthetic and anatomic conditions is presented. Conclusions. The profile teleradiograph is a useful tool for the practitioner. It allows establishing the optimal site for implant placement, in good relation with the overdenture. The conventional denture can easily, and at relatively low cost, be transformed into a surgical guide used during implant placement. PMID:23599828
Timing analysis by model checking
NASA Technical Reports Server (NTRS)
Naydich, Dimitri; Guaspari, David
2000-01-01
The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
Lorenz, Romy; Monti, Ricardo Pio; Violante, Inês R.; Anagnostopoulos, Christoforos; Faisal, Aldo A.; Montana, Giovanni; Leech, Robert
2016-01-01
Functional neuroimaging typically explores how a particular task activates a set of brain regions. Importantly though, the same neural system can be activated by inherently different tasks. To date, there is no approach available that systematically explores whether and how distinct tasks probe the same neural system. Here, we propose and validate an alternative framework, the Automatic Neuroscientist, which turns the standard fMRI approach on its head. We use real-time fMRI in combination with modern machine-learning techniques to automatically design the optimal experiment to evoke a desired target brain state. In this work, we present two proof-of-principle studies involving perceptual stimuli. In both studies optimization algorithms of varying complexity were employed; the first involved a stochastic approximation method while the second incorporated a more sophisticated Bayesian optimization technique. In the first study, we achieved convergence for the hypothesized optimum in 11 out of 14 runs in less than 10 min. Results of the second study showed how our closed-loop framework accurately and with high efficiency estimated the underlying relationship between stimuli and neural responses for each subject in one to two runs: with each run lasting 6.3 min. Moreover, we demonstrate that using only the first run produced a reliable solution at a group-level. Supporting simulation analyses provided evidence on the robustness of the Bayesian optimization approach for scenarios with low contrast-to-noise ratio. This framework is generalizable to numerous applications, ranging from optimizing stimuli in neuroimaging pilot studies to tailoring clinical rehabilitation therapy to patients and can be used with multiple imaging modalities in humans and animals. PMID:26804778
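A minimal sketch of the Bayesian optimization loop underlying such closed-loop experiment design is shown below: a Gaussian-process surrogate with an RBF kernel and an expected-improvement acquisition over a one-dimensional stimulus parameter. The objective function is a synthetic stand-in for the measured brain response, not the authors' fMRI pipeline, and all settings are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ls=0.2, var=1.0):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def objective(x):
    """Stand-in for the (noisy) measured response to a stimulus parameter x."""
    return np.exp(-(x - 0.7) ** 2 / 0.02) + 0.05 * np.random.randn(*np.shape(x))

grid = np.linspace(0.0, 1.0, 201)        # candidate stimulus parameters
X = list(np.random.uniform(0, 1, 3))     # a few initial random probes
Y = [float(objective(np.array([x]))) for x in X]
noise = 0.05 ** 2

for it in range(15):
    Xa, Ya = np.array(X), np.array(Y)
    K = rbf(Xa, Xa) + noise * np.eye(len(Xa))
    Kinv = np.linalg.inv(K)
    ks = rbf(grid, Xa)
    mu = ks @ Kinv @ Ya                                   # posterior mean
    var = np.clip(rbf(grid, grid).diagonal()
                  - np.sum(ks @ Kinv * ks, axis=1), 1e-12, None)
    sd = np.sqrt(var)                                     # posterior std dev
    best = Ya.max()
    z = (mu - best) / sd
    ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)     # expected improvement
    x_next = float(grid[np.argmax(ei)])                   # next "stimulus" to try
    X.append(x_next)
    Y.append(float(objective(np.array([x_next]))))

print("estimated optimum:", round(X[int(np.argmax(Y))], 3))
```

Each iteration spends one "measurement" where the surrogate predicts the largest expected gain, which is why such loops can converge in a handful of runs.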
$$\\mathscr{H}_2$$ optimal control techniques for resistive wall mode feedback in tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clement, Mitchell; Hanson, Jeremy; Bialek, Jim
2018-02-28
DIII-D experiments show that a new, advanced algorithm improves resistive wall mode (RWM) stability control in high performance discharges using external coils. DIII-D can excite strong, locked or nearly locked external kink modes whose rotation frequencies and growth rates are on the order of the magnetic flux diffusion time of the vacuum vessel wall. The VALEN RWM model has been used to gauge the effectiveness of RWM control algorithms in tokamaks. Simulations and experiments have shown that modern control techniques like Linear Quadratic Gaussian (LQG) control will perform better, using 77% less current, than classical techniques when using control coils external to DIII-D's vacuum vessel. Experiments were conducted to develop control of a rotating n = 1 perturbation using an LQG controller derived from VALEN and external coils. Feedback using this LQG algorithm outperformed a proportional-gain-only controller in these perturbation experiments over a range of frequencies. Results from high N experiments also show that advanced feedback techniques using external control coils may be as effective as internal control coil feedback using classical control techniques.
Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform
NASA Astrophysics Data System (ADS)
Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.
2017-03-01
The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.
WE-D-303-01: Development and Application of Digital Human Phantoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Segars, P.
2015-06-15
Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: Understand the need and requirements of computational phantoms in medical physics research Discuss the developments and applications of computational phantoms Know the promises and limitations of computational phantoms in solving complex problems.
Innovative Teaching Practice: Traditional and Alternative Methods (Challenges and Implications)
ERIC Educational Resources Information Center
Nurutdinova, Aida R.; Perchatkina, Veronika G.; Zinatullina, Liliya M.; Zubkova, Guzel I.; Galeeva, Farida T.
2016-01-01
The relevance of the present issue is caused by the strong need for alternative methods of foreign language learning and the need for language training and retraining of modern professionals. The aim of the article is to identify the basic techniques and skills in using various modern techniques in the context of modern educational tasks. The…
Enhanced interfaces for web-based enterprise-wide image distribution.
Jost, R Gilbert; Blaine, G James; Fritz, Kevin; Blume, Hartwig; Sadhra, Sarbjit
2002-01-01
Modern Web browsers support image distribution with two shortcomings: (1) image grayscale presentation at client workstations is often sub-optimal and generally inconsistent with the presentation state on diagnostic workstations and (2) an Electronic Patient Record (EPR) application usually cannot directly access images with an integrated viewer. We have modified our EPR and our Web-based image-distribution system to allow access to images from within the EPR. In addition, at the client workstation, a grayscale transformation is performed that consists of two components: a client-display-specific component based on the characteristic display function of the class of display system, and a modality-specific transformation that is downloaded with every image. The described techniques have been implemented in our institution and currently support enterprise-wide clinical image distribution. The effectiveness of the techniques is reviewed.
Megabrasion: a conservative strategy for the anterior dentition.
Magne, P
1997-05-01
Continuous developments in adhesive restorative techniques, tooth whitening procedures, and restorative materials have significantly broadened the initially defined spectrum of indications for composite restorations. These developments have thereby contributed to the achievement of one of the major objectives of conservative restorative dentistry--the maximum preservation of sound tooth structure. In order to optimize the application of modern composite resin technology, mastering the basic principles of natural aesthetics is an essential prerequisite. The learning objective of this article is to discuss the etiology of enamel discoloration and conservative treatment strategies, including microabrasion and masking procedures. Emphasis is placed on a simple procedure--the megabrasion technique--which does not depend extensively on the artistic skills of the operator. It represents a useful and predictable approach for the elimination of white opaque stains on the enamel and yellow-brown enamel discolorations.
NASA Technical Reports Server (NTRS)
Brown, Aaron J.
2015-01-01
The International Space Station's (ISS) trajectory is coordinated and executed by the Trajectory Operations and Planning (TOPO) group at NASA's Johnson Space Center. TOPO group personnel routinely generate look-ahead trajectories for the ISS that incorporate translation burns needed to maintain its orbit over the next three to twelve months. The burns are modeled as in-plane, horizontal burns, and must meet operational trajectory constraints imposed by both NASA and the Russian Space Agency. In generating these trajectories, TOPO personnel must determine the number of burns to model, each burn's Time of Ignition (TIG), and magnitude (i.e. deltaV) that meet these constraints. The current process for targeting these burns is manually intensive, and does not take advantage of more modern techniques that can reduce the workload needed to find feasible burn solutions, i.e. solutions that simply meet the constraints, or provide optimal burn solutions that minimize the total DeltaV while simultaneously meeting the constraints. A two-level, hybrid optimization technique is proposed to find both feasible and globally optimal burn solutions for ISS trajectory planning. For optimal solutions, the technique breaks the optimization problem into two distinct sub-problems, one for choosing the optimal number of burns and each burn's optimal TIG, and the other for computing the minimum total deltaV burn solution that satisfies the trajectory constraints. Each of the two aforementioned levels uses a different optimization algorithm to solve one of the sub-problems, giving rise to a hybrid technique. Level 2, or the outer level, uses a genetic algorithm to select the number of burns and each burn's TIG. Level 1, or the inner level, uses the burn TIGs from Level 2 in a sequential quadratic programming (SQP) algorithm to compute a minimum total deltaV burn solution subject to the trajectory constraints. The total deltaV from Level 1 is then used as a fitness function by the genetic algorithm in Level 2 to select the number of burns and their TIGs for the next generation. In this manner, the two levels solve their respective sub-problems separately but collaboratively until a burn solution is found that globally minimizes the deltaV across the entire trajectory. Feasible solutions can also be found by simply using the SQP algorithm in Level 1 with a zero cost function. This paper discusses the formulation of the Level 1 sub-problem and the development of a prototype software tool to solve it. The Level 2 sub-problem will be discussed in a future work. Following the Level 1 formulation and solution, several look-ahead trajectory examples for the ISS are explored. In each case, the burn targeting results using the current process are compared against a feasible solution found using Level 1 in the proposed technique. Level 1 is then used to find a minimum deltaV solution given the fixed number of burns and burn TIGs. The optimal solution is compared with the previously found feasible solution to determine the deltaV (and therefore propellant) savings. The proposed technique seeks to both improve the current process for targeting ISS burns, and to add the capability to optimize ISS burns in a novel fashion. The optimal solutions found using this technique can potentially save hundreds of kilograms of propellant over the course of the ISS mission compared to feasible solutions alone. 
While the software tool being developed to implement this technique is specific to ISS, the concept is extensible to other long-duration, central-body orbiting missions that must perform orbit maintenance burns to meet operational trajectory constraints.
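A highly simplified sketch of the two-level idea follows, with a toy burn-effectiveness model standing in for the real ISS trajectory constraints (the effectiveness function, deltaV cap and altitude-gain target below are assumptions for illustration only): the inner level calls SciPy's SLSQP to minimize total deltaV for fixed burn TIGs, and the outer level is a tiny genetic algorithm over the TIGs.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# --- Hypothetical toy model (NOT the ISS dynamics) ---------------------------
# A burn at time t (days) raises a "terminal altitude" metric by
# dv * effectiveness(t); the trajectory must gain at least TARGET in total.
def effectiveness(t):
    return 1.0 + 0.3 * np.sin(2 * np.pi * np.asarray(t) / 30.0)  # illustrative

TARGET, DV_CAP, HORIZON = 4.0, 1.5, 90.0

# --- Level 1: SQP (SLSQP) finds the minimum-total-deltaV burns for fixed TIGs.
def level1(tigs):
    n = len(tigs)
    eff = effectiveness(tigs)
    cons = [{"type": "ineq", "fun": lambda dv: dv @ eff - TARGET}]
    res = minimize(lambda dv: dv.sum(), x0=np.full(n, 0.5),
                   bounds=[(0.0, DV_CAP)] * n, constraints=cons, method="SLSQP")
    feasible = res.success and res.x @ eff >= TARGET - 1e-6
    return (res.x.sum() if feasible else np.inf), res.x

# --- Level 2: a tiny genetic algorithm over the burn TIGs --------------------
def random_tigs():
    return np.sort(rng.uniform(0.0, HORIZON, size=rng.integers(3, 6)))

population = [random_tigs() for _ in range(12)]
for generation in range(20):
    scored = sorted(population, key=lambda t: level1(t)[0])
    parents = scored[:4]                              # elitist selection
    children = []
    for _ in range(8):
        a, b = rng.choice(len(parents), 2, replace=False)
        k = min(len(parents[a]), len(parents[b]))
        child = 0.5 * (parents[a][:k] + parents[b][:k])           # crossover
        child = np.sort(np.clip(child + rng.normal(0, 2.0, k), 0, HORIZON))
        children.append(child)
    population = parents + children

best_tigs = min(population, key=lambda t: level1(t)[0])
cost, dvs = level1(best_tigs)
print("best TIGs (days):", np.round(best_tigs, 1), "total deltaV:", round(cost, 3))
```

The split mirrors the paper's separation of concerns: the inner solver only ever sees a fixed burn schedule, while the outer search explores how many burns to use and when to place them, using the inner minimum deltaV as its fitness.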
Data mining: childhood injury control and beyond.
Tepas, Joseph J
2009-08-01
Data mining is defined as the automatic extraction of useful, often previously unknown information from large databases or data sets. It has become a major part of modern life and is extensively used in industry, banking, government, and health care delivery. The process requires a data collection system that integrates input from multiple sources containing critical elements that define outcomes of interest. Appropriately designed data mining processes identify and adjust for confounding variables. The statistical modeling used to manipulate accumulated data may involve any number of techniques. As predicted results are periodically analyzed against those observed, the model is consistently refined to optimize precision and accuracy. Whether applying integrated sources of clinical data to inferential probabilistic prediction of risk of ventilator-associated pneumonia or population surveillance for signs of bioterrorism, it is essential that modern health care providers have at least a rudimentary understanding of what the concept means, how it basically works, and what it means to current and future health care.
Old patterns, new meaning: the 1845 hospital of Bezm-i Alem in Istanbul.
Shefer, Miri
2005-01-01
This paper discusses the history of an 1845 Ottoman hospital founded by Bezm-i Alem, mother of the reigning sultan Abdülmecit I (reigned 1839-1856), embedded in the medical and political contexts of the Middle East in the nineteenth century. The main focus of this paper is the Ottoman discourse of modernization, which identified progress with modernization and westernization and induced a belief in the positive character of progress, with a high degree of optimism regarding the success of the process. The Bezm-i Alem hospital illustrates the medical reality of the 19th century, reconstructed through Ottoman eyes rather than from the perspective of foreigners with their own agenda and biases. In many respects it continued previous medical traditions; other aspects reveal brand new developments in Ottoman medicine and hospital management. Ottoman medical reality was one of coexistence and rivalry: traditional conceptions of medicine and health were believed and practiced side-by-side with new western-like concepts and techniques.
In situ wavefront correction and its application to micromanipulation
NASA Astrophysics Data System (ADS)
Čižmár, Tomáš; Mazilu, Michael; Dholakia, Kishan
2010-06-01
In any optical system, distortions to a propagating wavefront reduce the spatial coherence of a light field, making it increasingly difficult to obtain the theoretical diffraction-limited spot size. Such aberrations are severely detrimental to optimal performance in imaging, nanosurgery, nanofabrication and micromanipulation, as well as other techniques within modern microscopy. We present a generic method based on complex modulation for true in situ wavefront correction that allows compensation of all aberrations along the entire optical train. The power of the method is demonstrated for the field of micromanipulation, which is very sensitive to wavefront distortions. We present direct trapping with optimally focused laser light carrying power of a fraction of a milliwatt as well as the first trapping through highly turbid and diffusive media. This opens up new perspectives for optical micromanipulation in colloidal and biological physics and may be useful for various forms of advanced imaging.
Real-Time Optimization and Control of Next-Generation Distribution Infrastructure
NREL Grid Modernization
This project develops innovative, real-time optimization and control methods for next-generation distribution infrastructure.
Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers
Wang, Bei; Ethier, Stephane; Tang, William; ...
2017-06-29
The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
Vielreicher, M.; Schürmann, S.; Detsch, R.; Schmidt, M. A.; Buttgereit, A.; Boccaccini, A.; Friedrich, O.
2013-01-01
This review focuses on modern nonlinear optical microscopy (NLOM) methods that are increasingly being used in the field of tissue engineering (TE) to image tissue non-invasively and without labelling at depths unreachable by conventional microscopy techniques. With NLOM techniques, biomaterial matrices, cultured cells and the extracellular matrix they produce may be visualized with high resolution. After introducing classical imaging methodologies such as µCT, MRI, optical coherence tomography, electron microscopy and conventional microscopy, two-photon fluorescence (2-PF) and second harmonic generation (SHG) imaging are described in detail (principle, power, limitations) together with their most widely used TE applications. Besides our own cell encapsulation, cell printing and collagen scaffolding systems and their NLOM imaging, the most current research articles are reviewed. These cover imaging of autofluorescence and fluorescence-labelled tissue and biomaterial structures, SHG-based quantitative morphometry of collagen I and other proteins, imaging of vascularization, and online monitoring techniques in TE. Finally, some insight is given into state-of-the-art three-photon-based imaging methods (e.g. coherent anti-Stokes Raman scattering, third harmonic generation). This review provides an overview of the powerful and constantly evolving field of multiphoton microscopy, which is an indispensable tool for the development of artificial tissues in regenerative medicine and which is likely to gain importance also as a means of general diagnostic medical imaging. PMID:23864499
Multiple brain metastases irradiation with Eleka Axesse stereotactic system
NASA Astrophysics Data System (ADS)
Filatov, P. V.; Polovnikov, E. S.; Orlov, K. Yu.; Krutko, A. V.; Kirilova, I. A.; Moskalev, A. V.; Filatova, E. V.; Zheravin, A. A.
2017-09-01
Brain metastases are one of the factors complicating the treatment of a malignant tumor. Radiation therapy, especially radiosurgery, plays an important role in modern treatment practice. During 2011-2016, 32 patients (from 29 to 67 years old) with multiple brain metastases underwent treatment with SRS or SRT in our center. The number of secondary lesions varied from 2 to 11. Eight patients underwent microsurgical resection. Seven patients had recurrence after whole brain radiotherapy. Thirty patients underwent single-fraction SRS and two patients with large metastases (bigger than 3 cm) underwent fractionated SRT. The treatment was delivered with the dedicated linear accelerator stereotactic system Elekta Axesse (Elekta AB, Stockholm, Sweden). Different stereotactic fixation devices were used, namely, the Leksell G frame, the non-invasive HeadFIX frame, and a reinforced thermoplastic mask (IMRT perforation). All treatments used a volumetric modulated arc therapy (VMAT) technique and an Image Guided Radiation Therapy (IGRT) technique. All lesions were treated from a single isocenter, which allowed reducing the treatment time and the overall dose to the patient's body. All patients tolerated the treatment satisfactorily. No adverse reactions or complications were observed in any case during or right after the treatment. Different stereotactic fixation devices and modern treatment techniques allowed creating an optimal, safe and comfortable treatment pathway for each patient. The treatment time was from 15 to 50 minutes. Patient position verification after or during the treatment demonstrated good accuracy for all fixation types and a low level of intrafraction motion.
NASA Astrophysics Data System (ADS)
Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad
2017-11-01
Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the application of modern technology and by the machining parameters. Modern technology can be applied through CNC machining; turning is one of the machining processes that can be performed on a CNC machine. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to choose machining parameters that minimize both the processing time and the environmental impact. This research develops a multi-objective optimization model to minimize the processing time and environmental impact of a CNC turning process, resulting in optimal values of the decision variables cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of Eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
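A minimal weighted-sum sketch of such a bi-objective machining-parameter problem is given below. The processing-time and eco-indicator models are hypothetical toy functions of cutting speed and feed rate, not the paper's validated models, and SciPy's L-BFGS-B is used here instead of OptQuest.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical process models (illustrative forms and coefficients only):
# processing time falls with cutting speed v (m/min) and feed rate f (mm/rev),
# while the eco-indicator-style environmental score rises with both.
def processing_time(x):
    v, f = x
    return 5.0e3 / (v * f)                 # minutes per part (toy model)

def eco_indicator(x):
    v, f = x
    return 0.002 * v ** 1.5 + 8.0 * f      # millipoints per part (toy model)

def weighted_objective(x, w=0.6):
    # Normalize both objectives by rough reference magnitudes before weighting.
    return w * processing_time(x) / 50.0 + (1 - w) * eco_indicator(x) / 10.0

bounds = [(100.0, 300.0), (0.1, 0.4)]      # cutting speed and feed rate ranges
res = minimize(weighted_objective, x0=np.array([150.0, 0.2]),
               bounds=bounds, method="L-BFGS-B")
v_opt, f_opt = res.x
print(f"cutting speed = {v_opt:.1f} m/min, feed rate = {f_opt:.3f} mm/rev")
print(f"time = {processing_time(res.x):.2f} min, eco = {eco_indicator(res.x):.2f}")
```

Sweeping the weight w between 0 and 1 traces out a simple Pareto front between the two objectives, which is the trade-off the multi-objective model is meant to expose.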
NASA Technical Reports Server (NTRS)
Halyo, N.
1976-01-01
A digital automatic control law to capture a steep glideslope and track the glideslope to a specified altitude is developed for the longitudinal/vertical dynamics of a CTOL aircraft using modern estimation and control techniques. The control law uses a constant-gain Kalman filter to process guidance information from the microwave landing system and acceleration data from body-mounted accelerometers. The filter outputs navigation data and wind velocity estimates, which are used in controlling the aircraft. Results from a digital simulation of the aircraft dynamics and the control law are presented for various wind conditions.
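A constant-gain (steady-state) Kalman filter of the kind described can be sketched as follows for a simple two-state vertical-axis model; the matrices, noise levels and signals below are illustrative assumptions, not the report's values.

```python
import numpy as np

dt = 0.05                                   # sample period (s)
F = np.array([[1.0, dt], [0.0, 1.0]])       # state: [altitude error, sink rate]
B = np.array([[0.5 * dt ** 2], [dt]])       # accelerometer input enters here
H = np.array([[1.0, 0.0]])                  # MLS-style position measurement
Q = np.diag([1e-4, 1e-3])                   # process noise (illustrative)
R = np.array([[4.0]])                       # measurement noise (illustrative)

# Iterate the discrete Riccati recursion to steady state, then freeze the
# resulting constant Kalman gain, as in a constant-gain filter design.
P = np.eye(2)
for _ in range(2000):
    P = F @ P @ F.T + Q                     # covariance time update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    P = (np.eye(2) - K @ H) @ P             # covariance measurement update
print("constant Kalman gain:", K.ravel())

# Filter loop: propagate with the accelerometer, correct with the measurement.
rng = np.random.default_rng(0)
x_true = np.array([[10.0], [-1.0]])         # hypothetical true state
x_hat = np.zeros((2, 1))
for step in range(200):
    accel = np.array([[0.2]])               # hypothetical sensed acceleration
    x_true = F @ x_true + B @ accel
    z = H @ x_true + rng.normal(0.0, 2.0, (1, 1))   # noisy position measurement
    x_hat = F @ x_hat + B @ accel           # time update
    x_hat = x_hat + K @ (z - H @ x_hat)     # measurement update with fixed gain
print("final estimate:", x_hat.ravel(), " truth:", x_true.ravel())
```

Freezing the gain avoids propagating the covariance online, which is why constant-gain filters were attractive for flight computers of that era.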
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
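The cost advantage of adjoint methods can be illustrated with a toy discrete adjoint for a parameterized linear system A(p)u = b and output J = c^T u: a single extra (adjoint) solve yields dJ/dp for all parameters, whereas finite differences would need one forward solve per parameter. All quantities below are synthetic; this is a conceptual sketch, not the CFD implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 200                      # state size, number of design parameters

# Hypothetical parameterized system: A(p) = A0 + sum_i p_i * Ai, with A(p) u = b.
A0 = np.eye(n) * 5.0 + 0.1 * rng.standard_normal((n, n))
Ai = 0.01 * rng.standard_normal((m, n, n))
b = rng.standard_normal(n)
c = rng.standard_normal(n)          # output functional J(u) = c^T u
p = 0.1 * rng.standard_normal(m)

A = A0 + np.tensordot(p, Ai, axes=1)
u = np.linalg.solve(A, b)           # one forward solve

# Adjoint solve: A^T lam = c.  Then dJ/dp_i = -lam^T (dA/dp_i) u for every i,
# i.e. sensitivities with respect to all m parameters from one extra solve.
lam = np.linalg.solve(A.T, c)
grad = -np.einsum('i,kij,j->k', lam, Ai, u)

# Check one component against a finite difference.
k, eps = 7, 1e-6
p2 = p.copy(); p2[k] += eps
u2 = np.linalg.solve(A0 + np.tensordot(p2, Ai, axes=1), b)
fd = (c @ u2 - c @ u) / eps
print("adjoint:", grad[k], " finite difference:", fd)
```

The same scaling argument carries over to nonlinear CFD: the adjoint solve costs roughly one extra analysis regardless of how many design variables the gradient covers.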
Damage Detection Using Holography and Interferometry
NASA Technical Reports Server (NTRS)
Decker, Arthur J.
2003-01-01
This paper reviews classical approaches to damage detection using laser holography and interferometry. The paper then details the modern uses of electronic holography and neural-net-processed characteristic patterns to detect structural damage. The design of the neural networks and the preparation of the training sets are discussed. The use of a technique to optimize the training sets, called folding, is explained. Then a training procedure is detailed that uses the holography-measured vibration modes of the undamaged structures to impart damage-detection sensitivity to the neural networks. The inspections of an optical strain gauge mounting plate and an International Space Station cold plate are presented as examples.
Diffraction-geometry refinement in the DIALS framework
Waterman, David G.; Winter, Graeme; Gildea, Richard J.; ...
2016-03-30
Rapid data collection and modern computing resources provide the opportunity to revisit the task of optimizing the model of diffraction geometry prior to integration. A comprehensive description is given of new software that builds upon established methods by performing a single global refinement procedure, utilizing a smoothly varying model of the crystal lattice where appropriate. This global refinement technique extends to multiple data sets, providing useful constraints to handle the problem of correlated parameters, particularly for small wedges of data. Examples of advanced uses of the software are given and the design is explained in detail, with particular emphasis on the flexibility and extensibility it entails.
Modern digital flight control system design for VTOL aircraft
NASA Technical Reports Server (NTRS)
Broussard, J. R.; Berry, P. W.; Stengel, R. F.
1979-01-01
Methods for and results from the design and evaluation of a digital flight control system (DFCS) for a CH-47B helicopter are presented. The DFCS employed proportional-integral control logic to provide rapid, precise response to automatic or manual guidance commands while following conventional or spiral-descent approach paths. It contained altitude- and velocity-command modes, and it adapted to varying flight conditions through gain scheduling. Extensive use was made of linear systems analysis techniques. The DFCS was designed using linear-optimal estimation and control theory, and the effects of gain scheduling were assessed by examination of closed-loop eigenvalues and time responses.
Coprocessors for quantum devices
NASA Astrophysics Data System (ADS)
Kay, Alastair
2018-03-01
Quantum devices, from simple fixed-function tools to the ultimate goal of a universal quantum computer, will require high-quality, frequent repetition of a small set of core operations, such as the preparation of entangled states. These tasks are perfectly suited to realization by a coprocessor or supplementary instruction set, as is common practice in modern CPUs. In this paper, we present two quintessentially quantum coprocessor functions: production of a Greenberger-Horne-Zeilinger state and implementation of optimal universal (asymmetric) quantum cloning. Both are based on the evolution of a fixed Hamiltonian. We introduce a technique for deriving the parameters of these Hamiltonians based on the numerical integration of Toda-like flows.
Flexible use and technique extension of logistics management
NASA Astrophysics Data System (ADS)
Xiong, Furong
2011-10-01
As we all know, modern logistics originated in the United States, developed in Japan, matured in Europe, and has expanded in China; this is the generally recognized track of the historical development of modern logistics. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deep-water port, China's modern logistics industry will develop at a strong, leap-forward pace and will catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of modern logistics management techniques in China, which has certain practical and guiding significance.
Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel
2017-06-15
Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
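Two of the ingredients mentioned, an item response model and adaptive item selection, can be sketched as follows with a two-parameter-logistic (2PL) model and maximum-information selection; the item bank and simulated examinee are hypothetical and unrelated to the melodic discrimination test itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical item bank: discrimination a and difficulty b for 30 items.
a = rng.uniform(0.8, 2.0, 30)
b = rng.normal(0.0, 1.0, 30)

def p_correct(theta, a, b):                 # 2PL response probability
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):          # Fisher information of each item
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1 - p)

def update_theta(responses, used, grid=np.linspace(-4, 4, 161)):
    # Maximum-likelihood ability estimate over a grid of theta values.
    ll = np.zeros_like(grid)
    for idx, y in zip(used, responses):
        p = p_correct(grid, a[idx], b[idx])
        ll += y * np.log(p) + (1 - y) * np.log(1 - p)
    return grid[np.argmax(ll)]

true_theta = 0.8                            # simulated examinee ability
theta, used, responses = 0.0, [], []
for step in range(10):
    info = item_information(theta, a, b)
    info[used] = -np.inf                    # avoid re-administering items
    nxt = int(np.argmax(info))              # maximum-information selection
    y = int(rng.random() < p_correct(true_theta, a[nxt], b[nxt]))
    used.append(nxt); responses.append(y)
    theta = update_theta(responses, used)

print("estimated ability after 10 adaptive items:", round(float(theta), 2))
```

Because each item is chosen where it is most informative at the current ability estimate, an adaptive test of this kind typically reaches a given reliability with far fewer items than a fixed-length test.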
Optimization of cascade blade mistuning under flutter and forced response constraints
NASA Technical Reports Server (NTRS)
Murthy, D. V.; Haftka, R. T.
1984-01-01
In the development of modern turbomachinery, flutter instabilities and excessive forced response of a cascade of blades have often turned out to be extremely difficult to eliminate. The study of these instabilities and the forced response is complicated by the presence of mistuning, that is, small differences among the individual blades. The theory of mistuned cascade behavior shows that mistuning can have a beneficial effect on the stability of the rotor. This beneficial effect is produced by the coupling between the more stable and less stable flutter modes introduced by mistuning. The effect of mistuning on the forced response can be either beneficial or adverse. Kaza and Kielb have studied the effects of two types of mistuning on the flutter and forced response: alternate mistuning, where alternate blades are identical, and random mistuning. The objective here is to investigate other patterns of mistuning which maximize the beneficial effects on the flutter and forced response of the cascade. Numerical optimization techniques are employed to obtain optimal mistuning patterns. The optimization program seeks to minimize the amount of mistuning required to satisfy constraints on flutter speed and forced response.
Leroch, Michaela; Mernke, Dennis; Koppenhoefer, Dieter; Schneider, Prisca; Mosbach, Andreas; Doehlemann, Gunther; Hahn, Matthias
2011-05-01
The green fluorescent protein (GFP) and its variants have been widely used in modern biology as reporters that allow a variety of live-cell imaging techniques. So far, GFP has rarely been used in the gray mold fungus Botrytis cinerea because of low fluorescence intensity. The codon usage of B. cinerea genes strongly deviates from that of commonly used GFP-encoding genes and reveals a lower GC content than other fungi. In this study, we report the development and use of a codon-optimized version of the B. cinerea enhanced GFP (eGFP)-encoding gene (Bcgfp) for improved expression in B. cinerea. Both the codon optimization and, to a smaller extent, the insertion of an intron resulted in higher mRNA levels and increased fluorescence. Bcgfp was used for localization of nuclei in germinating spores and for visualizing host penetration. We further demonstrate the use of promoter-Bcgfp fusions for quantitative evaluation of various toxic compounds as inducers of the atrB gene encoding an ABC-type drug efflux transporter of B. cinerea. In addition, a codon-optimized mCherry-encoding gene was constructed which yielded bright red fluorescence in B. cinerea.
Roeber, Florian; Kahn, Lewis
2014-10-15
The specific diagnosis of gastrointestinal nematode infections in ruminants is routinely based on larval culture techniques and on the morphological identification of developed third-stage larvae. However, research on the ecology and developmental requirements of different species suggests that the environmental conditions (e.g., temperature and humidity) for optimal development vary between species. Thus, employing a common culture protocol for all species will favour the development of certain species over others and can cause biased results, in particular when species proportions in a mixed infection are to be determined. Furthermore, the morphological identification of L3 larvae is complicated by a lack of distinctive, obvious features that would allow identification of all key species. In the present paper we review in detail the potential limitations of larval culture techniques and morphological identification and give an account of some modern molecular alternatives for the specific diagnosis of gastrointestinal nematode infection in ruminants. Copyright © 2014 Elsevier B.V. All rights reserved.
Response Surface Methods for Spatially-Resolved Optical Measurement Techniques
NASA Technical Reports Server (NTRS)
Danehy, P. M.; Dorrington, A. A.; Cutler, A. D.; DeLoach, R.
2003-01-01
Response surface methods (or methodology), RSM, have been applied to improve data quality for two vastly different spatially-resolved optical measurement techniques. In the first application, modern design of experiments (MDOE) methods, including RSM, are employed to map the temperature field in a direct-connect supersonic combustion test facility at NASA Langley Research Center. The laser-based measurement technique known as coherent anti-Stokes Raman spectroscopy (CARS) is used to measure temperature at various locations in the combustor. RSM is then used to develop temperature maps of the flow. Even though the temperature fluctuations at a single point in the flowfield have a standard deviation on the order of 300 K, RSM provides analytic fits to the data having 95% confidence interval half width uncertainties in the fit as low as +/-30 K. Methods of optimizing future CARS experiments are explored. The second application of RSM is to quantify the shape of a 5-meter diameter, ultra-light, inflatable space antenna at NASA Langley Research Center.
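The essence of such an RSM fit can be sketched as a second-order (quadratic) response surface estimated by least squares from scattered, noisy point measurements; the data below are synthetic stand-ins with roughly 300 K point-to-point scatter, not the CARS measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for scattered point measurements T(x, y) with large
# point-to-point scatter, as with single-shot temperature measurements.
x = rng.uniform(-1, 1, 120)
y = rng.uniform(-1, 1, 120)
T_true = 1800 + 300 * x - 200 * y - 250 * x * y - 150 * x**2 + 100 * y**2
T = T_true + rng.normal(0, 300, x.size)     # ~300 K shot-to-shot fluctuations

# Full second-order (quadratic) response-surface model in two factors.
X = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
coef, *_ = np.linalg.lstsq(X, T, rcond=None)

# The fitted surface averages over many points, which is what shrinks the
# uncertainty of the fit well below the single-point scatter.
T_hat = X @ coef
rmse = np.sqrt(np.mean((T_hat - T_true) ** 2))
print("fitted coefficients:", np.round(coef, 1))
print("RMS error of the fitted surface vs. truth:", round(float(rmse), 1), "K")
```

The confidence interval of the fitted surface narrows with the number of points and the adequacy of the quadratic model, which is how fits far tighter than the raw scatter become possible.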
Assessment of Speech in Primary Cleft Palate by Two-layer Closure (Conservative Management)
Jain, Harsha; Rao, Dayashankara; Sharma, Shailender; Gupta, Saurabh
2012-01-01
Treatment of the cleft palate has evolved over a long period of time. The various techniques of cleft palate repair practiced today are the results of principles learned through many years of modifications. The challenge in the art of modern palatoplasty is no longer successful closure of the cleft palate but an optimal speech outcome without compromising maxillofacial growth. Throughout these periods of evolution in the treatment of cleft palate, the effectiveness of various treatment protocols has been challenged by controversies concerning speech and maxillofacial growth. In this article we have evaluated the results of Pinto's modification of Wardill–Kilner palatoplasty without radical dissection of the levator veli palatini muscle on speech and post-op fistula in two different age groups in 20 patients. Preoperative and 6-month postoperative speech assessment values indicated that two-layer palatoplasty (modified Wardill–Kilner V-Y pushback technique) without an intravelar veloplasty technique was good for speech. PMID:23066454
Management of irregular astigmatism.
Goggin, M; Alpins, N; Schmid, L M
2000-08-01
Using a liberal definition of corneal irregularity, modern videokeratoscopy may define approximately 40% of normal corneas with a toric refractive error as possessing primary irregular astigmatism. The causes of secondary forms of irregular astigmatism include corneal surgery, trauma, dystrophies, and infections. Internal refractive surface and media irregularity, or noncorneal astigmatism (ocular residual astigmatism), contributes to irregular astigmatism of the entire refractive path, of which crystalline lenticular astigmatism is usually the principal component. Treatment options have increased in recent years, particularly, though not exclusively, through the advent of tailored corneal excimer laser ablations. However, discussion continues concerning the systematic approach necessary to enable treatment to achieve an optimal optical surface for the eye. Discussion also continues as to what constitutes the optimal corneal shape. Some refractive procedures may increase higher order aberrations in the attempt to neutralize refractive astigmatism. The way to further refinement of the commonly performed refractive techniques will ultimately lie in the integrated inclusion of a trio of technologies: topographic analysis of the corneal surface, wavefront analysis of ocular refractive aberrations, and vector planning to enable the appropriate balance in emphasis between these two diagnostic modalities. For the uncommon, irregularly roughened corneas, the ablatable polymer techniques show some promise.
Selişteanu, Dan; Șendrescu, Dorin; Georgeanu, Vlad; Roman, Monica
2015-01-01
Monoclonal antibodies (mAbs) are at present one of the fastest growing products of pharmaceutical industry, with widespread applications in biochemistry, biology, and medicine. The operation of mAbs production processes is predominantly based on empirical knowledge, the improvements being achieved by using trial-and-error experiments and precedent practices. The nonlinearity of these processes and the absence of suitable instrumentation require an enhanced modelling effort and modern kinetic parameter estimation strategies. The present work is dedicated to nonlinear dynamic modelling and parameter estimation for a mammalian cell culture process used for mAb production. By using a dynamical model of such kind of processes, an optimization-based technique for estimation of kinetic parameters in the model of mammalian cell culture process is developed. The estimation is achieved as a result of minimizing an error function by a particle swarm optimization (PSO) algorithm. The proposed estimation approach is analyzed in this work by using a particular model of mammalian cell culture, as a case study, but is generic for this class of bioprocesses. The presented case study shows that the proposed parameter estimation technique provides a more accurate simulation of the experimentally observed process behaviour than reported in previous studies.
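A minimal sketch of PSO-based kinetic parameter estimation is shown below, using a toy Monod-type rate expression and synthetic "measurements" in place of the mammalian cell culture model; the swarm settings are standard textbook values, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy kinetic model: Monod-type specific rate mu(S) = mu_max * S / (K + S).
def model(params, S):
    mu_max, K = params
    return mu_max * S / (K + S)

S = np.linspace(0.1, 10.0, 25)                       # substrate levels
true = np.array([0.6, 1.5])                          # "unknown" parameters
data = model(true, S) + rng.normal(0, 0.01, S.size)  # synthetic measurements

def error(params):                                   # objective to minimize
    return np.sum((model(params, S) - data) ** 2)

# Standard PSO with inertia w and cognitive/social factors c1, c2.
n, w, c1, c2 = 30, 0.72, 1.49, 1.49
lo, hi = np.array([0.01, 0.01]), np.array([2.0, 10.0])
pos = rng.uniform(lo, hi, (n, 2))
vel = np.zeros((n, 2))
pbest, pbest_f = pos.copy(), np.array([error(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for it in range(200):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([error(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("estimated [mu_max, K]:", np.round(gbest, 3), " true:", true)
```

Because PSO needs only objective evaluations, the same loop applies unchanged when the model evaluation is a full dynamic simulation of the bioprocess rather than a closed-form rate law.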
Modern Languages and Interculturality in the Primary Sector in England, Greece, Italy and Spain.
ERIC Educational Resources Information Center
Cerezal, Fernando
1997-01-01
Addresses concerns and issues regarding modern language teaching and learning at primary schools in Greece, Italy, Spain, and England. It focuses on the optimal age for learning and acquiring languages and on the educational reforms which have been undertaken in each country relating to early modern language teaching and learning and…
Status and Perspectives of Neutron Imaging Facilities
NASA Astrophysics Data System (ADS)
Lehmann, E.; Trtik, P.; Ridikas, D.
The methodology and the application range of neutron imaging techniques have been significantly improved at numerous facilities worldwide in the last decades. This progress has been achieved by new detector systems, the setup of dedicated, optimized and flexible beam lines and a much better understanding of the complete imaging process thanks to complementary simulations. Furthermore, new applications and research topics were found and implemented. However, since the quality and the number of neutron imaging facilities depend greatly on access to suitable beam ports, there is still an enormous potential to implement state-of-the-art neutron imaging techniques at many more facilities. On the one hand, there are prominent and powerful sources which do not permit the implementation of neutron imaging techniques because priority is given exclusively to neutron scattering and irradiation techniques. On the other hand, there are modern and useful devices which remain under-utilized and lack either the capacity or the know-how to develop attractive user programs and/or industrial partnerships. In this overview of the international status of neutron imaging facilities, we will specify details about the current situation.
Speckle-based at-wavelength metrology of X-ray mirrors with super accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kashyap, Yogesh; Wang, Hongchang; Sawhney, Kawal, E-mail: kawal.sawhney@diamond.ac.uk
2016-05-15
X-ray active mirrors, such as bimorph and mechanically bendable mirrors, are increasingly being used on beamlines at modern synchrotron source facilities to generate either focused or “top-hat” beams. As well as optical tests in the metrology lab, it is becoming increasingly important to optimise and characterise active optics under actual beamline operating conditions. The recently developed X-ray speckle-based at-wavelength metrology technique has shown great potential. The technique has been established and further developed at the Diamond Light Source and is increasingly being used to optimise active mirrors. Details of the X-ray speckle-based at-wavelength metrology technique and an example of its applicability in characterising and optimising a micro-focusing bimorph X-ray mirror are presented. Importantly, an unprecedented angular sensitivity in the range of two nanoradians for measuring the slope error of an optical surface has been demonstrated. Such super-precision metrology will be beneficial to the manufacturers of polished mirrors and also in the optimization of beam shaping during experiments.
NASA Astrophysics Data System (ADS)
Salcedo-Sanz, S.
2016-10-01
Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on the modeling of nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from the specific modeling of a real phenomenon, and also their novelty in terms of comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches for facing hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to review in detail the most important meta-heuristics based on them. A discussion on the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation is carried out to complete the review of these techniques. We also describe some of the most important application areas, in a broad sense, of meta-heuristics, and describe freely accessible software frameworks which can be used to ease the implementation of these algorithms.
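Among the classical physics-inspired meta-heuristics alluded to above, simulated annealing is the archetype. The following minimal sketch (with an assumed toy objective, perturbation scale and cooling schedule) illustrates the Metropolis acceptance rule and cooling loop that such algorithms share; it is an illustration of the general pattern, not any specific algorithm from the review.

```python
import math
import random

random.seed(1)

def rastrigin(x):
    # Multimodal test function with many local minima; global minimum at x = 0.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def simulated_annealing(f, x0, t0=10.0, t_min=1e-3, alpha=0.95, steps_per_t=100):
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            cand = [xi + random.gauss(0.0, 0.5) for xi in x]   # random perturbation
            fc = f(cand)
            # Metropolis acceptance: always keep improvements, sometimes keep worse moves.
            if fc < fx or random.random() < math.exp((fx - fc) / t):
                x, fx = cand, fc
                if fx < fbest:
                    best, fbest = list(x), fx
        t *= alpha                                             # geometric cooling schedule
    return best, fbest

sol, val = simulated_annealing(rastrigin, x0=[3.0, -2.0])
print("solution:", sol, "objective:", val)
```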
Application of modern control theory to the design of optimum aircraft controllers
NASA Technical Reports Server (NTRS)
Power, L. J.
1973-01-01
The synthesis procedure presented is based on the solution of the output regulator problem of linear optimal control theory for time-invariant systems. By this technique, solution of the matrix Riccati equation leads to a constant linear feedback control law for an output regulator which will maintain a plant in a particular equilibrium condition in the presence of impulse disturbances. Two simple algorithms are presented that can be used in an automatic synthesis procedure for the design of maneuverable output regulators requiring only selected state variables for feedback. The first algorithm is for the construction of optimal feedforward control laws that can be superimposed upon a Kalman output regulator and that will drive the output of a plant to a desired constant value on command. The second algorithm is for the construction of optimal Luenberger observers that can be used to obtain feedback control laws for the output regulator requiring measurement of only part of the state vector. This algorithm constructs observers which have minimum response time under the constraint that the magnitude of the gains in the observer filter be less than some arbitrary limit.
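As a concrete illustration of the Riccati-based synthesis described above, the sketch below solves the continuous algebraic Riccati equation for a hypothetical double-integrator plant with SciPy and forms the constant linear feedback law; the plant matrices and weights are assumptions for illustration, not the aircraft models of the report.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical time-invariant plant: a double integrator.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state/output weighting
R = np.array([[1.0]])      # control weighting

# Solve the algebraic matrix Riccati equation A'P + PA - P B R^-1 B' P + Q = 0.
P = solve_continuous_are(A, B, Q, R)

# Constant linear feedback control law u = -K x for the regulator.
K = np.linalg.solve(R, B.T @ P)
print("feedback gains:", K)

# Closed-loop eigenvalues confirm the regulator is stabilizing.
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```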
Parental investment and the optimization of human family size
Lawson, David W.; Mace, Ruth
2011-01-01
Human reproductive behaviour is marked by exceptional variation at the population and individual level. Human behavioural ecologists propose adaptive hypotheses to explain this variation as shifting phenotypic optima in relation to local socioecological niches. Here we review evidence that variation in fertility (offspring number), in both traditional and modern industrialized populations, represents optimization of the life-history trade-off between reproductive rate and parental investment. While a reliance on correlational methods suggests the true costs of sibling resource competition are often poorly estimated, a range of anthropological and demographic studies confirm that parents balance family size against offspring success. Evidence of optimization is less forthcoming. Declines in fertility associated with modernization are particularly difficult to reconcile with adaptive models, because fertility limitation fails to enhance offspring reproductive success. Yet, considering alternative measures, we show that modern low fertility confers many advantages on offspring, which are probably transmitted to future generations. Evidence from populations that have undergone or initiated demographic transition indicate that these rewards to fertility limitation fall selectively on relatively wealthy individuals. The adaptive significance of modern reproductive behaviour remains difficult to evaluate, but may be best understood in response to rising investment costs of rearing socially and economically competitive offspring. PMID:21199838
State of the art in treatment of facial paralysis with temporalis tendon transfer.
Sidle, Douglas M; Simon, Patrick
2013-08-01
Temporalis tendon transfer is a technique for dynamic facial reanimation. Since its inception, nearly 80 years ago, it has undergone a wealth of innovation to produce the modern operation. The purpose of this review is to update the literature as to the current techniques and perioperative management of patients undergoing temporalis tendon transfer. The modern technique focuses on the minimally invasive approaches and aesthetic refinements to enhance the final product of the operation. The newest techniques as well as preoperative assessment and postoperative rehabilitation are discussed. When temporalis tendon transfer is indicated for facial reanimation, the modern operation offers a refined technique that produces an aesthetically acceptable outcome. Preoperative smile assessment and postoperative smile rehabilitation are necessary and are important adjuncts to a successful operation.
MO-B-BRB-01: Optimize Treatment Planning Process in Clinical Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, W.
The radiotherapy treatment planning process has evolved over the years with innovations in treatment planning, treatment delivery and imaging systems. Treatment modality and simulation technologies are also rapidly improving and affecting the planning process. For example, image-guided radiation therapy has been widely adopted for patient setup, leading to margin reduction and isocenter repositioning after simulation. Stereotactic body radiation therapy (SBRT) and radiosurgery (SRS) have gradually become the standard of care for many treatment sites, which demand a higher throughput for the treatment plans even if the number of treatments per day remains the same. Finally, simulation, planning and treatment are traditionally sequential events. However, with emerging adaptive radiotherapy, they are becoming more tightly intertwined, leading to iterative processes. Enhanced efficiency of planning is therefore becoming more critical and poses a serious challenge to the treatment planning process; Lean Six Sigma approaches are being utilized increasingly to balance the competing needs for speed and quality. In this symposium we will discuss the treatment planning process and illustrate effective techniques for managing workflow. Topics will include: Planning techniques: (a) beam placement, (b) dose optimization, (c) plan evaluation, (d) export to RVS. Planning workflow: (a) image import, (b) image fusion, (c) contouring, (d) plan approval, (e) plan check, (f) chart check, (g) sequential and iterative processes. Influence of upstream and downstream operations: (a) simulation, (b) immobilization, (c) motion management, (d) QA, (e) IGRT, (f) treatment delivery, (g) SBRT/SRS, (h) adaptive planning. Reduction of delay between planning steps with Lean systems due to (a) communication, (b) limited resources, (c) contouring, (d) plan approval, (e) treatment. Optimizing planning processes: (a) contour validation, (b) consistent planning protocols, (c) protocol/template sharing, (d) semi-automatic plan evaluation, (e) quality checklists for error prevention, (f) iterative processes, (g) balance of speed and quality. Learning Objectives: Gain familiarity with the workflow of the modern treatment planning process. Understand the scope and challenges of managing modern treatment planning processes. Gain familiarity with Lean Six Sigma approaches and their implementation in the treatment planning workflow.
MO-B-BRB-00: Optimizing the Treatment Planning Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kapur, A.
Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian
2014-09-01
Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation or medical management. This article illustrates a PM approach that enables the economic potential of (cost-)effective disease management programs (DMPs) to be fully exploited by optimized candidate selection, as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy to apply for health insurance companies. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate for this setting that our model can compete with the expensive solutions offered by professional PM vendors and outperforms non-predictive standard approaches for DMP selection commonly used in the market.
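A minimal sketch of GLM-based candidate scoring of the kind described above is shown below, using statsmodels on synthetic data; the covariates, coefficients and selection rule are assumptions for illustration only, not the insurer's actual model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500

# Hypothetical insured portfolio: age, prior-year cost and a chronic-condition flag.
age = rng.integers(30, 80, n)
prior_cost = rng.gamma(2.0, 800.0, n)
chronic = rng.integers(0, 2, n)

# Synthetic outcome: probability that the member benefits from the DMP.
logit = -6.0 + 0.04 * age + 0.0006 * prior_cost + 1.2 * chronic
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Logistic GLM (binomial family with logit link) fitted to the portfolio.
X = sm.add_constant(np.column_stack([age, prior_cost, chronic]))
glm = sm.GLM(y, X, family=sm.families.Binomial()).fit()

# Rank members by predicted benefit and enrol the top decile first.
scores = glm.predict(X)
top_candidates = np.argsort(scores)[::-1][: n // 10]
print(glm.params)
print("first 10 selected members:", top_candidates[:10])
```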
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zingone, Gaetano; Licata, Vincenzo; Calogero, Cucchiara
2008-07-08
The present work fits into the interesting theme of seismic prevention for protection of the monumental patrimony made up of churches with drum domes. Specifically, with respect to a church in the historic area of Catania, chosen as a monument exemplifying the typology examined, the seismic behavior is analyzed in the linear field using modern dynamic identification techniques. The dynamically identified computational model made it possible to identify the macro-element most at risk, the dome-drum system. With respect to this system, the behavior in the nonlinear field is analyzed through dynamic tests on large-scale models in the presence of various types of improving reinforcement. The results are used to appraise the ameliorative contribution afforded by each of them and to choose the most suitable type of reinforcement, optimizing the stiffness/ductility ratio of the system.
Pelvic packing or angiography: competitive or complementary?
Suzuki, Takashi; Smith, Wade R; Moore, Ernest E
2009-04-01
Pelvic angiography is an established technique that has evolved into a highly effective means of controlling arterial pelvic haemorrhage. The current dominant paradigm for haemodynamically unstable patients with pelvic fractures is angiographic management combined with mechanical stabilisation of the pelvis. However, an effective rapid screening tool for arterial bleeding in pelvic fracture patients has yet to be identified. There is also no precise way to determine the major source of bleeding responsible for haemodynamic instability. In many pelvic fracture patients, bleeding arises from venous lacerations and fractured bony surfaces, which are not effectively treated with angiography. Modern pelvic packing consists of time-saving and minimally invasive techniques which appear to result in effective control of the haemorrhage via tamponade. This review article focuses on the recent body of knowledge on angiography and pelvic packing. We propose the optimal role for each modality in trauma centres.
Vectorization with SIMD extensions speeds up reconstruction in electron tomography.
Agulleiro, J I; Garzón, E M; García, I; Fernández, J J
2010-06-01
Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable, so high-performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors, in combination with other single-processor optimization techniques. This approach succeeds in producing full-resolution tomograms with an important reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.
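The sketch below illustrates, in NumPy rather than explicit SIMD intrinsics, the same spirit of vectorizing the inner back-projection loop over all pixels at once; the phantom, forward-projection scheme and sizes are assumptions for illustration, not the WBP/SIRT implementations of the paper.

```python
import numpy as np

def backproject(sinogram, angles_deg, size):
    """Naive unfiltered parallel-beam back-projection; the per-angle inner loop
    is fully vectorized over all image pixels (the SIMD idea, in array form)."""
    n_det = sinogram.shape[1]
    coords = np.arange(size) - size / 2.0
    xx, yy = np.meshgrid(coords, coords)
    det = np.arange(n_det) - n_det / 2.0
    recon = np.zeros((size, size))
    for proj, ang in zip(sinogram, np.radians(angles_deg)):
        # Detector coordinate of every pixel for this angle, computed at once.
        t = xx * np.cos(ang) + yy * np.sin(ang)
        recon += np.interp(t.ravel(), det, proj).reshape(size, size)
    return recon * np.pi / len(angles_deg)

# Tiny synthetic test: a point-like object, 60 projections over 180 degrees.
size, n_angles = 64, 60
angles = np.linspace(0.0, 180.0, n_angles, endpoint=False)
phantom = np.zeros((size, size)); phantom[40, 20] = 1.0
coords = np.arange(size) - size / 2.0
xx, yy = np.meshgrid(coords, coords)
sino = np.array([
    # Crude forward projection by binning pixel values along each angle.
    np.histogram((xx * np.cos(a) + yy * np.sin(a)).ravel(),
                 bins=size, range=(-size / 2.0, size / 2.0),
                 weights=phantom.ravel())[0]
    for a in np.radians(angles)
])
rec = backproject(sino, angles, size)
print("reconstruction peak near original point:", np.unravel_index(rec.argmax(), rec.shape))
```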
Xiong, Hanqing; Zhou, Zhenqiao; Zhu, Mingqiang; Lv, Xiaohua; Li, Anan; Li, Shiwei; Li, Longhui; Yang, Tao; Wang, Siming; Yang, Zhongqin; Xu, Tonghui; Luo, Qingming; Gong, Hui; Zeng, Shaoqun
2014-01-01
Resin embedding is a well-established technique to prepare biological specimens for microscopic imaging. However, it is not compatible with modern green-fluorescent protein (GFP) fluorescent-labelling technique because it significantly quenches the fluorescence of GFP and its variants. Previous empirical optimization efforts are good for thin tissue but not successful on macroscopic tissue blocks as the quenching mechanism remains uncertain. Here we show most of the quenched GFP molecules are structurally preserved and not denatured after routine embedding in resin, and can be chemically reactivated to a fluorescent state by alkaline buffer during imaging. We observe up to 98% preservation in yellow-fluorescent protein case, and improve the fluorescence intensity 11.8-fold compared with unprocessed samples. We demonstrate fluorescence microimaging of resin-embedded EGFP/EYFP-labelled tissue block without noticeable loss of labelled structures. This work provides a turning point for the imaging of fluorescent protein-labelled specimens after resin embedding. PMID:24886825
Theory, simulation and experiments for precise deflection control of radiotherapy electron beams.
Figueroa, R; Leiva, J; Moncada, R; Rojas, L; Santibáñez, M; Valente, M; Velásquez, J; Young, H; Zelada, G; Yáñez, R; Guillen, Y
2018-03-08
Conventional radiotherapy is mainly delivered by linear accelerators. Although linear accelerators provide dual (electron/photon) radiation beam modalities, both are intrinsically produced by a megavoltage electron current. Modern radiotherapy treatment techniques are based on suitable devices inserted into or attached to conventional linear accelerators. Thus, precise control of the delivered beam becomes a key issue. This work presents an integral description of electron beam deflection control as required for a novel radiotherapy technique based on convergent photon beam production. Theoretical and Monte Carlo approaches were initially used for designing and optimizing the device's components. Then, dedicated instrumentation was developed for experimental verification of the electron beam deflection due to the designed magnets. Both Monte Carlo simulations and experimental results support the reliability of the electrodynamics models used to predict megavoltage electron beam control. Copyright © 2018 Elsevier Ltd. All rights reserved.
De Greef, J; Villani, K; Goethals, J; Van Belle, H; Van Caneghem, J; Vandecasteele, C
2013-11-01
Due to ongoing developments in the EU waste policy, Waste-to-Energy (WtE) plants are to be optimized beyond current acceptance levels. In this paper, a non-exhaustive overview of advanced technical improvements is presented and illustrated with facts and figures from state-of-the-art combustion plants for municipal solid waste (MSW). Some of the data included originate from regular WtE plant operation - before and after optimisation - as well as from defined plant-scale research. Aspects of energy efficiency and (re-)use of chemicals, resources and materials are discussed and support, in light of best available techniques (BAT), the idea that WtE plant performance can still be improved significantly, without a direct need for expensive techniques, tools or re-design. In the first instance, diagnostic skills and a thorough understanding of processes and operations allow the silent optimisation potential to be reclaimed. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sanan, P.; Tackley, P. J.; Gerya, T.; Kaus, B. J. P.; May, D.
2017-12-01
StagBL is an open-source parallel solver and discretization library for geodynamic simulation, encapsulating and optimizing operations essential to staggered-grid finite volume Stokes flow solvers. It provides a parallel staggered-grid abstraction with a high-level interface in C and Fortran. On top of this abstraction, tools are available to define boundary conditions and interact with particle systems. Tools and examples to efficiently solve Stokes systems defined on the grid are provided in small (direct solver), medium (simple preconditioners), and large (block factorization and multigrid) model regimes. By working directly with leading application codes (StagYY, I3ELVIS, and LaMEM) and providing an API and examples to integrate with others, StagBL aims to become a community tool supplying scalable, portable, reproducible performance toward novel science in regional- and planet-scale geodynamics and planetary science. By implementing kernels used by many research groups beneath a uniform abstraction layer, the library will enable optimization for modern hardware, thus reducing community barriers to large- or extreme-scale parallel simulation on modern architectures. In particular, the library will include CPU-, Manycore-, and GPU-optimized variants of matrix-free operators and multigrid components. The common layer provides a framework upon which to introduce innovative new tools. StagBL will leverage p4est to provide distributed adaptive meshes, and incorporate a multigrid convergence analysis tool. These options, in addition to a wealth of solver options provided by an interface to PETSc, will make the most modern solution techniques available from a common interface. StagBL in turn provides a PETSc interface, DMStag, to its central staggered grid abstraction. We present public version 0.5 of StagBL, including preliminary integration with application codes and demonstrations with its own demonstration application, StagBLDemo. Central to StagBL is the notion of an uninterrupted pipeline from toy/teaching codes to high-performance, extreme-scale solves. StagBLDemo replicates the functionality of an advanced MATLAB-style regional geodynamics code, thus providing users with a concrete procedure to exceed the performance and scalability limitations of smaller-scale tools.
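The staggered-grid layout at the core of such a library can be pictured in a few lines of NumPy: pressure unknowns live at cell centres and velocity components on cell faces, so a finite-volume divergence is naturally defined per cell. The sketch below illustrates only that layout on a uniform grid; it is an assumption-laden toy, not the StagBL/DMStag API.

```python
import numpy as np

nx, ny, dx = 8, 8, 1.0   # uniform grid spacing in both directions

# Staggered-grid storage: pressure at cell centres, x-velocity on vertical
# faces, y-velocity on horizontal faces (one extra face in each direction).
p  = np.zeros((ny,     nx))      # cell centres
vx = np.zeros((ny,     nx + 1))  # x-faces
vy = np.zeros((ny + 1, nx))      # y-faces

# A simple divergence-free test field: uniform flow in x.
vx[:, :] = 1.0

def divergence(vx, vy, dx):
    """Finite-volume divergence evaluated at cell centres from face velocities."""
    return (vx[:, 1:] - vx[:, :-1]) / dx + (vy[1:, :] - vy[:-1, :]) / dx

div = divergence(vx, vy, dx)
print("max |div v| =", np.abs(div).max())   # ~0 for this field
```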
A Brief Survey of Modern Optimization for Statisticians
Lange, Kenneth; Chi, Eric C.; Zhou, Hua
2014-01-01
Modern computational statistics is turning more and more to high-dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include nondifferentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience. PMID:25242858
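One principle such a survey stresses is handling nondifferentiable penalties like the lasso with proximal steps. The following sketch is a minimal proximal-gradient (ISTA) solver on synthetic data; the problem sizes, penalty level and fixed step rule are illustrative assumptions, not an algorithm taken from the survey itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse regression problem.
n, p = 100, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = [3.0, -2.0, 1.5, 0.0, 4.0]
y = X @ beta_true + 0.1 * rng.standard_normal(n)

def soft_threshold(z, tau):
    """Proximal operator of the l1 penalty tau * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def lasso_ista(X, y, lam, iters=500):
    """Proximal-gradient (ISTA) minimization of 0.5*||y - Xb||^2 + lam*||b||_1."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ b - y)
        b = soft_threshold(b - step * grad, step * lam)
    return b

beta_hat = lasso_ista(X, y, lam=5.0)
print("nonzero coefficients:", np.flatnonzero(beta_hat))
```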
Atomdroid: a computational chemistry tool for mobile platforms.
Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M
2012-04-23
We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
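As a flavour of the molecular-mechanics routines such a tool bundles, the sketch below performs a naive steepest-descent geometry optimization of a small Lennard-Jones cluster; the potential, starting geometry and step size are illustrative assumptions and are unrelated to Atomdroid's actual force fields.

```python
import numpy as np

def lj_energy_forces(pos, eps=1.0, sigma=1.0):
    """Lennard-Jones energy and forces for a small cluster of atoms."""
    n = len(pos)
    energy = 0.0
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rvec = pos[i] - pos[j]
            r = np.linalg.norm(rvec)
            sr6 = (sigma / r) ** 6
            energy += 4.0 * eps * (sr6 ** 2 - sr6)
            # Force on atom i from atom j: -dE/dr along the unit vector rvec/r.
            fmag = 24.0 * eps * (2.0 * sr6 ** 2 - sr6) / r
            forces[i] += fmag * rvec / r
            forces[j] -= fmag * rvec / r
    return energy, forces

def steepest_descent(pos, step=0.01, iters=500):
    """Very simple gradient-descent geometry optimization."""
    for _ in range(iters):
        _, forces = lj_energy_forces(pos)
        pos = pos + step * forces
    return pos, lj_energy_forces(pos)[0]

# Hypothetical starting geometry: a slightly stretched trimer.
start = np.array([[0.0, 0.0, 0.0], [1.3, 0.0, 0.0], [0.6, 1.2, 0.0]])
opt, e = steepest_descent(start.copy())
print("optimized energy:", e)
```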
Autogenous bone grafts in the esthetic zone: optimizing the procedure using piezosurgery.
Majewski, Piotr
2012-12-01
Soft and hard tissue defects pose a therapeutic challenge in modern implant dentistry. There are a multitude of surgical techniques available, and it is necessary to match the problem with the solution. This report describes the reconstruction of the alveolar ridge in the esthetic zone with the help of autogenous bone blocks harvested from the chin that were shaped to fit and stabilized at the recipient site. The procedures were performed using Piezosurgery, which made it possible to introduce surgical modifications and had a significant impact on the accuracy of the procedure. An observation period of 2 to 7 years showed positive stable results for treatment in terms of function and esthetics.
An Introduction to Modern Missing Data Analyses
ERIC Educational Resources Information Center
Baraldi, Amanda N.; Enders, Craig K.
2010-01-01
A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous to traditional techniques (e.g. deletion and mean imputation techniques) because they require less stringent assumptions and mitigate the pitfalls of traditional…
Fast matrix multiplication and its algebraic neighbourhood
NASA Astrophysics Data System (ADS)
Pan, V. Ya.
2017-11-01
Matrix multiplication is among the most fundamental operations of modern computations. By 1969 it was still commonly believed that the classical algorithm was optimal, although the experts already knew that this was not so. Worldwide interest in matrix multiplication instantly exploded in 1969, when Strassen decreased the exponent 3 of cubic time to 2.807. Then everyone expected to see matrix multiplication performed in quadratic or nearly quadratic time very soon. Further progress, however, turned out to be capricious. It was at stalemate for almost a decade, then a combination of surprising techniques (completely independent of Strassen's original ones and much more advanced) enabled a new decrease of the exponent in 1978-1981 and then again in 1986, to 2.376. By 2017 the exponent has still not passed through the barrier of 2.373, but most disturbing was the curse of recursion — even the decrease of exponents below 2.7733 required numerous recursive steps, and each of them squared the problem size. As a result, all algorithms supporting such exponents supersede the classical algorithm only for inputs of immense sizes, far beyond any potential interest for the user. We survey the long study of fast matrix multiplication, focusing on neglected algorithms for feasible matrix multiplication. We comment on their design, the techniques involved, implementation issues, the impact of their study on the modern theory and practice of Algebraic Computations, and perspectives for fast matrix multiplication. Bibliography: 163 titles.
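To make the exponent discussion concrete, here is a minimal sketch of Strassen's seven-multiplication recursion with a cutoff to the classical algorithm; the cutoff value and the power-of-two size restriction are simplifying assumptions for illustration.

```python
import numpy as np

def strassen(A, B, cutoff=64):
    """Strassen's recursion: 7 block multiplications per level, giving exponent
    log2(7) ~ 2.807. Falls back to the classical algorithm below a cutoff."""
    n = A.shape[0]
    if n <= cutoff:
        return A @ B
    k = n // 2
    A11, A12, A21, A22 = A[:k, :k], A[:k, k:], A[k:, :k], A[k:, k:]
    B11, B12, B21, B22 = B[:k, :k], B[:k, k:], B[k:, :k], B[k:, k:]
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)
    C = np.empty_like(A)
    C[:k, :k] = M1 + M4 - M5 + M7
    C[:k, k:] = M3 + M5
    C[k:, :k] = M2 + M4
    C[k:, k:] = M1 - M2 + M3 + M6
    return C

# Sanity check against the classical algorithm (n must be a power of two here).
rng = np.random.default_rng(0)
A, B = rng.standard_normal((256, 256)), rng.standard_normal((256, 256))
print(np.allclose(strassen(A, B), A @ B))
```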
Palkowski, Marek; Bielecki, Wlodzimierz
2017-06-02
RNA secondary structure prediction is a compute-intensive task that lies at the core of several search algorithms in bioinformatics. Fortunately, RNA folding approaches, such as the Nussinov base pair maximization, involve mathematical operations over affine control loops whose iteration space can be represented by the polyhedral model. Polyhedral compilation techniques have proven to be a powerful tool for the optimization of dense array codes. However, classical affine loop nest transformations used with these techniques do not effectively optimize dynamic programming codes for RNA structure prediction. The purpose of this paper is to present a novel approach allowing for the generation of a parallel tiled Nussinov RNA loop nest exposing significantly higher performance than that of known related code. This effect is achieved by improving code locality and parallelizing the calculation. In order to improve code locality, we apply our previously published technique of automatic loop nest tiling to all three loops of the Nussinov loop nest. This approach first forms original rectangular 3D tiles and then corrects them to establish their validity by applying the transitive closure of a dependence graph. To produce parallel code, we apply the loop skewing technique to the tiled Nussinov loop nest. The technique is implemented as a part of the publicly available polyhedral source-to-source TRACO compiler. The generated code was run on modern Intel multi-core processors and coprocessors. We present the speed-up factor of the generated parallel Nussinov RNA code and demonstrate that it is considerably faster than related codes in which only the two outer loops of the Nussinov loop nest are tiled.
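For reference, the dynamic program being tiled is the Nussinov base-pair maximization itself; the sketch below is a plain, untiled Python version of its triple loop nest (the pairing rules and minimum loop length are standard simplifying assumptions), not the generated TRACO code.

```python
def nussinov(seq, min_loop=1):
    """Nussinov base-pair maximization: the O(n^3) dynamic program whose
    triple loop nest is the target of the polyhedral tiling described above."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):            # increasing subsequence length
        for i in range(0, n - span):
            j = i + span
            best = dp[i + 1][j]                    # base i left unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)   # i pairs with j
            for k in range(i + 1, j):              # bifurcation into two subproblems
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))   # maximum number of base pairs for a toy sequence
```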
SU-E-I-97: Smart Auto-Planning Framework in An EMR Environment (SAFEE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, B; Chen, S; Mutaf, Y
2014-06-01
Purpose: Our Radiation Oncology Department uses clinical practice guidelines for patient treatment, including normal tissue sparing and other dosimetric constraints. These practice guidelines were adapted from national guidelines, clinical trials, literature reviews, and practitioners' own experience. Modern treatment planning systems (TPS) have the capability of incorporating these practice guidelines to automatically create radiation therapy treatment plans with little human intervention. We are developing a software infrastructure to integrate clinical practice guidelines and the radiation oncology electronic medical record (EMR) system into the radiation therapy treatment planning system (TPS) for auto-planning. Methods: Our Smart Auto-Planning Framework in an EMR environment (SAFEE) uses a software pipeline framework to integrate practice guidelines, EMR, and TPS. The SAFEE system starts by retrieving diagnosis information and the physician's prescription from the EMR system. After approval of contouring, SAFEE will automatically create plans according to our guidelines. Based on clinical objectives, SAFEE will automatically select treatment delivery techniques (such as 3DRT/IMRT/VMAT) and optimize plans. When necessary, SAFEE will create multiple treatment plans with different combinations of parameters. SAFEE's pipeline structure makes it very flexible to integrate various techniques, such as Model-Based Segmentation (MBS) and plan optimization algorithms, e.g., Multi-Criteria Optimization (MCO). In addition, SAFEE uses machine learning, data mining techniques, and an integrated database to create a clinical knowledge base and then answer clinical questions, such as how to score plan quality or how volume overlap affects physicians' decisions in beam and treatment technique selection. Results: In our institution, we use the Varian Aria EMR system and the RayStation TPS from RaySearch, whose ScriptService API allows control by external programs. These applications are the building blocks of our SAFEE system. Conclusion: SAFEE is a feasible method of integrating clinical information to develop an auto-planning paradigm to improve clinical workflow in cancer patient care.
Yoga and mental health: A dialogue between ancient wisdom and modern psychology
Vorkapic, Camila Ferreira
2016-01-01
Background: Many yoga texts make reference to the importance of mental health and the use of specific techniques in the treatment of mental disorders. Different concepts utilized in modern psychology may not stem from contemporary ideas; instead, they seem to share a common root with ancient wisdom. Aims: The goal of this perspective article is to correlate modern techniques used in psychology and psychiatry with yogic practices in the treatment of mental disorders. Materials and Methods: The current article presents a dialogue between the yogic approach to the treatment of mental disorders and concepts used in modern psychology, such as meta-cognition, disidentification, deconditioning and interoceptive exposure. Conclusions: Contemplative research has found that modern interventions in psychology might not come from modern concepts after all, but share great similarity with ancient yogic knowledge, giving us the opportunity to integrate the psychological wisdom of both East and West. PMID:26865774
Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts
ERIC Educational Resources Information Center
Nunnery, Christopher Edward
2011-01-01
Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…
Treatment of rabies in the 21st century: curing the incurable?
Franka, Richard; Rupprecht, Charles E
2011-10-01
Despite the extreme case fatality attributable to rabies, reports of survivors provide a faint glimpse of a possibility of overcoming this deadly disease, even after clinical symptoms manifest. At present, no existing approach fulfills modern medical criteria for the optimal therapy of rabies. Until new efficacious antiviral compounds and optimized treatment protocols are developed, animal population management and vaccination of major reservoirs and vectors, minimization of the risk of viral exposures, and appropriate and early postexposure prophylaxis, remain the hallmarks of modern public health intervention.
Oyebode, Femi
2014-04-01
This is a brief commentary on the value of optimism in therapy. It draws on the philosophical writings of Schopenhauer and Aristotle. It suggests that the modern preoccupation with optimism may be as extreme as the bleak pessimistic outlook favoured by Schopenhauer.
ERIC Educational Resources Information Center
Sellar, Sam
2016-01-01
This article puts forward the provocation that optimism has become a trap for educational research. It is argued that optimism underpins the implicit model of modern educational thought, which is oriented toward the "future" and wants it to be better. However, optimism can become a trap when it encourages investment in promises about the…
Optimized method for manufacturing large aspheric surfaces
NASA Astrophysics Data System (ADS)
Zhou, Xusheng; Li, Shengyi; Dai, Yifan; Xie, Xuhui
2007-12-01
Aspheric optics are being used more and more widely in modern optical systems because of their ability to correct aberrations, enhance image quality, enlarge the field of view and extend the range of effect, while reducing the weight and volume of the system. With the development of optical technology, there is an increasingly pressing requirement for large-aperture, high-precision aspheric surfaces. The original computer controlled optical surfacing (CCOS) technique cannot meet the challenge of precision and machining efficiency, a problem that has received considerable attention from researchers. To address the limitations of the original polishing process, an optimized method for manufacturing large aspheric surfaces is put forward. Subsurface damage (SSD), full-aperture errors and the full band of frequency errors are all controlled by this method. A smaller SSD depth can be obtained by using a low-hardness tool and small abrasive grains in the grinding process. For full-aperture error control, edge effects can be controlled by using smaller tools and an amendment model of the material removal function. For control of the full band of frequency errors, low-frequency errors can be corrected with the optimized material removal function, while medium-high-frequency errors are addressed using the uniform removal principle. With this optimized method, the accuracy of a K9 glass paraboloid mirror can reach rms 0.055 waves (where a wave is 0.6328 μm) in a short time. The results show that the optimized method can guide large aspheric surface manufacturing effectively.
Optimizing communication satellites payload configuration with exact approaches
NASA Astrophysics Data System (ADS)
Stathakis, Apostolos; Danoy, Grégoire; Bouvry, Pascal; Talbi, El-Ghazali; Morelli, Gianluigi
2015-12-01
The satellite communications market is competitive and rapidly evolving. The payload, which is in charge of applying frequency conversion and amplification to the signals received from Earth before their retransmission, is made of various components. These include reconfigurable switches that permit the re-routing of signals based on market demand or because of some hardware failure. In order to meet modern requirements, the size and the complexity of current communication payloads are increasing significantly. Consequently, the optimal payload configuration, which was previously done manually by the engineers with the use of computerized schematics, is now becoming a difficult and time consuming task. Efficient optimization techniques are therefore required to find the optimal set(s) of switch positions to optimize some operational objective(s). In order to tackle this challenging problem for the satellite industry, this work proposes two Integer Linear Programming (ILP) models. The first one is single-objective and focuses on the minimization of the length of the longest channel path, while the second one is bi-objective and additionally aims at minimizing the number of switch changes in the payload switch matrix. Experiments are conducted on a large set of instances of realistic payload sizes using the CPLEX® solver and two well-known exact multi-objective algorithms. Numerical results demonstrate the efficiency and limitations of the ILP approach on this real-world problem.
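The flavour of such ILP models can be conveyed with a tiny assignment example: binary variables route each channel over one candidate switch path, and the objective minimizes the longest channel path. The sketch below (using the open-source PuLP/CBC toolchain rather than CPLEX, with made-up channels and path lengths) is an illustration only, not the payload model of the paper.

```python
import pulp

# Hypothetical toy instance: 3 channels, 4 candidate switch paths; path lengths
# measured in number of switches traversed (not the actual payload model).
channels = ["ch1", "ch2", "ch3"]
paths = {"p1": 2, "p2": 3, "p3": 4, "p4": 6}

prob = pulp.LpProblem("payload_path_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("route", (channels, paths), cat="Binary")
longest = pulp.LpVariable("longest_path", lowBound=0)

prob += longest                                                # minimize the longest channel path
for c in channels:
    prob += pulp.lpSum(x[c][p] for p in paths) == 1            # every channel gets exactly one path
    prob += pulp.lpSum(paths[p] * x[c][p] for p in paths) <= longest
for p in paths:
    prob += pulp.lpSum(x[c][p] for c in channels) <= 1         # a path carries at most one channel

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("longest channel path:", pulp.value(longest))
for c in channels:
    chosen = [p for p in paths if pulp.value(x[c][p]) > 0.5]
    print(c, "->", chosen[0])
```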
CUDA Optimization Strategies for Compute- and Memory-Bound Neuroimaging Algorithms
Lee, Daren; Dinov, Ivo; Dong, Bin; Gutman, Boris; Yanovsky, Igor; Toga, Arthur W.
2011-01-01
As neuroimaging algorithms and technology continue to grow faster than CPU performance in complexity and image resolution, data-parallel computing methods will be increasingly important. The high performance, data-parallel architecture of modern graphical processing units (GPUs) can reduce computational times by orders of magnitude. However, its massively threaded architecture introduces challenges when GPU resources are exceeded. This paper presents optimization strategies for compute- and memory-bound algorithms for the CUDA architecture. For compute-bound algorithms, the registers are reduced through variable reuse via shared memory and the data throughput is increased through heavier thread workloads and maximizing the thread configuration for a single thread block per multiprocessor. For memory-bound algorithms, fitting the data into the fast but limited GPU resources is achieved through reorganizing the data into self-contained structures and employing a multi-pass approach. Memory latencies are reduced by selecting memory resources whose cache performance are optimized for the algorithm's access patterns. We demonstrate the strategies on two computationally expensive algorithms and achieve optimized GPU implementations that perform up to 6× faster than unoptimized ones. Compared to CPU implementations, we achieve peak GPU speedups of 129× for the 3D unbiased nonlinear image registration technique and 93× for the non-local means surface denoising algorithm. PMID:21159404
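One of the compute-bound strategies mentioned, heavier per-thread workloads, is commonly realized as a grid-stride loop. The sketch below shows that pattern with Numba's CUDA JIT on a simple saxpy kernel (it needs the numba package and a CUDA-capable GPU to run); the kernel and launch configuration are illustrative assumptions, not the registration or denoising kernels of the paper.

```python
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    # Grid-stride loop: each thread processes several elements, so a fixed
    # launch configuration covers arrays larger than the grid ("heavier
    # thread workloads" in the terminology of the abstract).
    start = cuda.grid(1)
    stride = cuda.gridsize(1)
    for i in range(start, x.size, stride):
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads_per_block = 256
blocks = 128                      # deliberately fewer blocks than n / threads_per_block
saxpy[blocks, threads_per_block](2.0, x, y, out)
print(np.allclose(out, 2.0 * x + y))
```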
Kazakis, Georgios; Kanellopoulos, Ioannis; Sotiropoulos, Stefanos; Lagaros, Nikos D
2017-10-01
The construction industry has a major impact on the environment in which we spend most of our lives. Therefore, it is important that the outcome of architectural intuition performs well and complies with the design requirements. Architects usually describe as "optimal design" their choice among a rather limited set of design alternatives, dictated by their experience and intuition. However, modern design of structures requires accounting for a great number of criteria derived from multiple disciplines, often of conflicting nature. Such criteria derive from structural engineering, eco-design, bioclimatic and acoustic performance. The resulting vast number of alternatives enhances the need for computer-aided architecture in order to increase the possibility of arriving at a more preferable solution. Therefore, the incorporation of smart, automatic tools in the design process, able to further guide the designer's intuition, becomes even more indispensable. The principal aim of this study is to present possibilities for integrating automatic computational techniques related to topology optimization into the intuition phase of civil structure design as part of computer-aided architectural design. In this direction, different aspects of a new computer-aided architectural era related to the interpretation of the optimized designs, difficulties resulting from the increased computational effort, and 3D printing capabilities are covered herein.
CUDA optimization strategies for compute- and memory-bound neuroimaging algorithms.
Lee, Daren; Dinov, Ivo; Dong, Bin; Gutman, Boris; Yanovsky, Igor; Toga, Arthur W
2012-06-01
As neuroimaging algorithms and technology continue to grow faster than CPU performance in complexity and image resolution, data-parallel computing methods will be increasingly important. The high performance, data-parallel architecture of modern graphical processing units (GPUs) can reduce computational times by orders of magnitude. However, its massively threaded architecture introduces challenges when GPU resources are exceeded. This paper presents optimization strategies for compute- and memory-bound algorithms for the CUDA architecture. For compute-bound algorithms, the registers are reduced through variable reuse via shared memory and the data throughput is increased through heavier thread workloads and maximizing the thread configuration for a single thread block per multiprocessor. For memory-bound algorithms, fitting the data into the fast but limited GPU resources is achieved through reorganizing the data into self-contained structures and employing a multi-pass approach. Memory latencies are reduced by selecting memory resources whose cache performance are optimized for the algorithm's access patterns. We demonstrate the strategies on two computationally expensive algorithms and achieve optimized GPU implementations that perform up to 6× faster than unoptimized ones. Compared to CPU implementations, we achieve peak GPU speedups of 129× for the 3D unbiased nonlinear image registration technique and 93× for the non-local means surface denoising algorithm. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Mitroi, Nicoleta; Moţa, Maria
2008-01-01
Since Hippocrates it has been known that nutrition plays a very important role in maintaining health, people being advised to consider "food intake as a real medicine". Modern science shows that what is important and necessary for an optimal state of health is not only the intake of specific nutrients but, above all, the specific quantities of each nutrient to be taken. New notions have consequently appeared, such as dietetic recommendations and nutritional epidemiology. At the same time, it has been emphasized that nutrition can directly contribute to the appearance of disease (nutrients/food generally interact with the genes in a "benign" way, but in some circumstances this interaction can also have fatal consequences). Human development is influenced by both environmental factors (diet, smoking, education, physical activity etc.) and heredity; both factors should be given equal attention if our aim is to maintain the state of health. Experimental studies are often dedicated either only to the influence of environmental factors or exclusively to the influence of genes, and not to both simultaneously [4]. Modern techniques and methods of study are designed to solve this problem.
Modern bioanalysis of proteins by electrophoretic techniques.
Krizkova, Sona; Ryvolova, Marketa; Masarik, Michal; Zitka, Ondrej; Adam, Vojtech; Hubalek, Jaromir; Eckschlager, Tomas; Kizek, Rene
2014-01-01
In 1957, a protein rich in cysteine and able to bind cadmium was isolated from horse kidney and named metallothionein according to its structural properties. Subsequently, this protein and metallothionein-like proteins have been found in tissues of other animal species, yeasts, fungi and plants. MT, as a potential cancer marker, is in the focus of interest, and its properties, functions and behaviour under various conditions are intensively studied. Our protocol describes the separation of the two major mammalian isoforms of MT (MT-1 and MT-2) using capillary electrophoresis (CE) coupled with a UV detector. This protocol enables separation of the MT isoforms and study of their basic behaviour, as well as their quantification with a detection limit of units of ng per μL. Sodium borate buffer (20 mM, pH 9.5) was optimized as the background electrolyte, and the separation was carried out in a fused silica capillary with an internal diameter of 75 μm and an electric field intensity of 350 V/cm. The optimal detection wavelength was 254 nm.
Extreme Scale Plasma Turbulence Simulations on Top Supercomputers Worldwide
Tang, William; Wang, Bei; Ethier, Stephane; ...
2016-11-01
The goal of the extreme scale plasma turbulence studies described in this paper is to expedite the delivery of reliable predictions on confinement physics in large magnetic fusion systems by using world-class supercomputers to carry out simulations with unprecedented resolution and temporal duration. This has involved architecture-dependent optimizations of performance scaling and addressing code portability and energy issues, with the metrics for multi-platform comparisons being 'time-to-solution' and 'energy-to-solution'. Realistic results addressing how confinement losses caused by plasma turbulence scale from present-day devices to the much larger $25 billion international ITER fusion facility have been enabled by innovative advances in the GTC-P code including (i) implementation of one-sided communication from the MPI 3.0 standard; (ii) creative optimization techniques on Xeon Phi processors; and (iii) development of a novel performance model for the key kernels of the PIC code. Our results show that modeling data movement is sufficient to predict performance on modern supercomputer platforms.
Coordinated Platoon Routing in a Metropolitan Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, Jeffrey; Munson, Todd; Sokolov, Vadim
2016-10-10
Platooning vehicles—connected and automated vehicles traveling with small intervehicle distances—use less fuel because of reduced aerodynamic drag. Given a network defined by vertex and edge sets and a set of vehicles with origin/destination nodes/times, we model and solve the combinatorial optimization problem of coordinated routing of vehicles in a manner that routes them to their destination on time while using the least amount of fuel. Common approaches decompose the platoon coordination and vehicle routing into separate problems. Our model addresses both problems simultaneously to obtain the best solution. We use modern modeling techniques and constraints implied from analyzing the platoon routing problem to address larger numbers of vehicles and larger networks than previously considered. While the numerical method used is unable to certify optimality for candidate solutions to all networks and parameters considered, we obtain excellent solutions in approximately one minute for much larger networks and vehicle sets than previously considered in the literature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duval, B.C.; Allen, G.; Madaoui, K.
The paper describes how modern geoscience techniques, developed in large part through intensive exploration programs, can be used at the field level to improve reservoir prediction and production planning and also to optimize recovery. Detailed sedimentological studies have allowed the authors to determine the environment of the reservoir formations and help define the likely shape and size of individual sands and refine the reservoir model. An illustration is given by fields located in the Mahakam delta area of Kalimantan (Handil, Tunu) and in the Gulf of Thailand (Bongkot). Sequence stratigraphy assists in identifying efficient regional seals which, at field scale, lead to the recomposition of a great number of individual sands (several hundreds in some cases) into fewer flow units, making the system manageable from a reservoir standpoint. This technology was used extensively to delineate the giant Peciko gas field of Indonesia. The geophysical approach to reservoir parameters and the use of seismic attributes are rapidly expanding. The Yadana gas field in the Gulf of Martaban (Myanmar) is a case in point showing how porosities can be determined from impedances obtained by seismic inversion techniques. An example from the Bongkot field shows how 3D seismic and direct hydrocarbon indication (DHI) technology are used to deal with complex faulting to optimize deviated well profiles and improve recoveries.
Technology optimization techniques for multicomponent optical band-pass filter manufacturing
NASA Astrophysics Data System (ADS)
Baranov, Yuri P.; Gryaznov, Georgiy M.; Rodionov, Andrey Y.; Obrezkov, Andrey V.; Medvedev, Roman V.; Chivanov, Alexey N.
2016-04-01
Narrowband optical devices (such as IR-sensing devices, celestial navigation systems, solar-blind UV systems and many others) are one of the fastest-growing areas in optical manufacturing. However, signal strength in this type of application is quite low, and the performance of devices depends on the attenuation level of wavelengths outside the operating range. Modern detectors (photodiodes, matrix detectors, photomultiplier tubes and others) usually do not have the required selectivity or, at worst, have higher sensitivity to the background spectrum. Manufacturing a single-component band-pass filter with a high attenuation level is a resource-intensive task, and sometimes it is not possible to find a solution to this problem using existing technologies. Different types of filters have technology variations of transmittance profile shape due to various production factors. At the same time, there are multiple tasks with strict requirements for background spectrum attenuation in narrowband optical devices. For example, in a solar-blind UV system wavelengths above 290-300 nm must be attenuated by 180 dB. In this paper, techniques for assembling multi-component optical band-pass filters from multiple single elements with technology variations of transmittance profile shape, so as to achieve an optimal signal-to-noise ratio (SNR), are proposed. Relationships between the signal-to-noise ratio and different characteristics of the transmittance profile shape are shown. The practical results obtained were in rather good agreement with our calculations.
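The central point, that transmittances of stacked elements multiply so their out-of-band attenuations in dB add, can be illustrated numerically. The sketch below uses idealized Gaussian pass-bands with leaky floors as stand-ins for manufactured elements; all profile parameters and band definitions are assumptions for illustration, not measured filter data.

```python
import numpy as np

wl = np.linspace(250.0, 400.0, 601)    # wavelength grid, nm
dw = wl[1] - wl[0]

def bandpass(wl, center, width, peak=0.9, floor_db=-40.0):
    """Idealized single-element transmittance: a Gaussian pass-band sitting on a
    leaky out-of-band floor (the floor level varies between manufactured elements)."""
    floor = 10.0 ** (floor_db / 10.0)
    return np.maximum(peak * np.exp(-0.5 * ((wl - center) / width) ** 2), floor)

# Three hypothetical elements with technology variations of the profile shape.
elements = [bandpass(wl, 270.0, 8.0, floor_db=-35.0),
            bandpass(wl, 272.0, 9.0, floor_db=-42.0),
            bandpass(wl, 269.0, 7.5, floor_db=-38.0)]

# Stacked (multi-component) filter: transmittances multiply, so out-of-band
# attenuations expressed in dB add up.
total = np.prod(elements, axis=0)

signal_band = (wl > 260.0) & (wl < 280.0)       # operating range
background = wl >= 300.0                        # solar-blind requirement region

in_band = total[signal_band].sum() * dw
leakage = total[background].sum() * dw
print("worst attenuation above 300 nm: %.1f dB" % (10.0 * np.log10(total[background].max())))
print("in-band / out-of-band ratio: %.1f dB" % (10.0 * np.log10(in_band / leakage)))
```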
Performance of coincidence-based PSD on LiF/ZnS Detectors for Multiplicity Counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, Sean M.; Stave, Sean C.; Lintereur, Azaree
Mass accountancy measurement is a nuclear nonproliferation application which utilizes coincidence and multiplicity counters to verify special nuclear material declarations. With a well-designed and efficient detector system, several relevant parameters of the material can be verified simultaneously. 6LiF/ZnS scintillating sheets may be used for this purpose due to a combination of high efficiency and short die-away times in systems designed with this material, but involve choices of detector geometry and exact material composition (e.g., the addition of Ni-quenching in the material) that must be optimized for the application. Multiplicity counting for verification of declared nuclear fuel mass involves neutron detection in conditions where several neutrons arrive in a short time window, with confounding gamma rays. This paper considers coincidence-based Pulse-Shape Discrimination (PSD) techniques developed to work under conditions of high pileup, and the performance of these algorithms with different detection materials. Simulated and real data from modern LiF/ZnS scintillator systems are evaluated with these techniques, and the relationship between the performance under pileup and material characteristics (e.g., neutron peak width and total light collection efficiency) is determined, to allow for an optimal choice of detector and material.
NASA Astrophysics Data System (ADS)
Buyuk, Ersin; Karaman, Abdullah
2017-04-01
We estimated transmissivity and storage coefficient values from single-well water-level measurements positioned ahead of the mining face by using the particle swarm optimization (PSO) technique. The water-level response to the advancing mining face is described by a semi-analytical function that is not suitable for conventional inversion schemes, because the partial derivatives are difficult to calculate. Moreover, the logarithmic behaviour of the model makes it difficult to obtain an initial model that leads to stable convergence. PSO appears to obtain a reliable solution that produces a reasonable fit between the water-level data and the model function response. Optimization methods have been used to find optimum conditions corresponding to either the minimum or maximum of a given objective function with regard to some criteria. Unlike PSO, traditional non-linear optimization methods have been used for many hydrogeologic and geophysical engineering problems. These methods exhibit difficulties such as dependence on the initial model, evaluation of the partial derivatives required when linearizing the model, and trapping at local optima. Recently, particle swarm optimization (PSO) has become a focus of modern global optimization; it is inspired by the social behaviour of bird swarms and appears to be a reliable and powerful algorithm for complex engineering applications. PSO, which does not depend on an initial model and is a derivative-free stochastic process, appears capable of searching all possible solutions in the model space, around both local and global optimum points.
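A bare-bones particle swarm optimizer of the general kind described is sketched below; the two fitted parameters stand in for transmissivity and storage coefficient, but the model function and synthetic data are placeholders rather than the semi-analytical water-level response used in the study:

import numpy as np

def model(params, t):
    # placeholder forward model; in the study this would be the semi-analytical water-level response
    T, S = params
    return np.log1p(T * t) / (1.0 + S * t)

def misfit(params, t, obs):
    return np.sum((model(params, t) - obs) ** 2)

def pso(t, obs, n_particles=30, n_iter=200, bounds=((1e-4, 1e2), (1e-6, 1e0))):
    lo = np.array([b[0] for b in bounds]); hi = np.array([b[1] for b in bounds])
    x = lo + (hi - lo) * np.random.rand(n_particles, 2)      # particle positions (candidate models)
    v = np.zeros_like(x)                                     # particle velocities
    pbest = x.copy()
    pbest_f = np.array([misfit(p, t, obs) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                                # commonly used inertia/acceleration weights
    for _ in range(n_iter):
        r1, r2 = np.random.rand(n_particles, 2), np.random.rand(n_particles, 2)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([misfit(p, t, obs) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

t = np.linspace(1, 100, 50)
obs = model((2.0, 0.01), t) + 0.01 * np.random.randn(t.size)  # synthetic "water-level" data
print("recovered parameters:", pso(t, obs))

Note that, as the abstract emphasizes, no initial model or derivative information is required, only forward evaluations of the misfit.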
Kriz, J; Baues, C; Engenhart-Cabillic, R; Haverkamp, U; Herfarth, K; Lukas, P; Schmidberger, H; Marnitz-Schulze, S; Fuchs, M; Engert, A; Eich, H T
2017-02-01
In radiotherapy (RT) of Hodgkin's lymphoma (HL), field design has changed substantially from extended-field RT (EF-RT) to involved-field RT (IF-RT) and now to involved-node RT (IN-RT) and involved-site RT (IS-RT), as have treatment techniques. The purpose of this article is to demonstrate the establishment of a quality assurance program (QAP) covering modern RT techniques and field designs within the German Hodgkin Study Group (GHSG). In the era of modern conformal RT, this QAP had to be fundamentally adapted, and a new evaluation process has been intensively discussed by the radiotherapeutic expert panel of the GHSG. The expert panel developed guidelines and criteria to analyse "modern" field designs and treatment techniques. This work is based on a dataset of 11 patients treated within the sixth study generation (HD16-17). To develop a QAP for "modern RT", the expert panel defined criteria for analysing current RT procedures. The consensus on a modified QAP in ongoing and future trials is presented. With this schedule, the QAP of the GHSG could serve as a model for other study groups.
Technological advances in radiotherapy of rectal cancer: opportunities and challenges.
Appelt, Ane L; Sebag-Montefiore, David
2016-07-01
This review summarizes the available evidence for the use of modern radiotherapy techniques for chemoradiotherapy for rectal cancer, with specific focus on intensity-modulated radiotherapy (IMRT) and volumetric arc therapy (VMAT) techniques. The dosimetric benefits of IMRT and VMAT are well established, but prospective clinical studies are limited, with phase I-II studies only. Recent years have seen the publication of a few larger prospective patient series as well as some retrospective cohorts, several of which include much-needed late toxicity data. Overall results are encouraging, as toxicity levels - although varying across reports - appear lower than for 3D conformal radiotherapy. Innovative treatment techniques and strategies which may be facilitated by the use of IMRT/VMAT include simultaneously integrated tumour boost, adaptive treatment, selective sparing of specific organs to enable chemotherapy escalation, and nonsurgical management. Few prospective studies of IMRT and VMAT exist, which causes uncertainty not only regarding the clinical benefit of these technologies but also regarding their optimal use. The priority for future research should be subgroups of patients who might receive relatively greater benefit from innovative treatment techniques, such as patients receiving chemoradiotherapy with definitive intent and patients treated with dose escalation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McQuinn, Kristen B. W.; Skillman, Evan D.; Dolphin, Andrew E.
Accurate distances are fundamental for interpreting various measured properties of galaxies. Surprisingly, many of the best-studied spiral galaxies in the Local Volume have distance uncertainties that are much larger than can be achieved with modern observation techniques. Using Hubble Space Telescope optical imaging, we use the tip of the red giant branch method to measure the distances to six galaxies that are included in the Spitzer Infrared Nearby Galaxies Survey program and its offspring surveys. The sample includes M63, M74, NGC 1291, NGC 4559, NGC 4625, and NGC 5398. We compare our results with distances reported to these galaxies based on a variety of methods. Depending on the technique, there can be a wide range in published distances, particularly from the Tully–Fisher relation. In addition, differences between the planetary nebula luminosity function and surface brightness fluctuation techniques can vary between galaxies, suggesting inaccuracies that cannot be explained by systematics in the calibrations. Our distances improve upon previous results, as we use a well-calibrated, stable distance indicator, precision photometry in an optimally selected field of view, and a Bayesian maximum likelihood technique that reduces measurement uncertainties.
Construction and Potential Applications of Biosensors for Proteins in Clinical Laboratory Diagnosis.
Liu, Xuan; Jiang, Hui
2017-12-04
Biosensors for proteins have shown attractive advantages compared to traditional techniques in clinical laboratory diagnosis. By virtue of modern fabrication modes and detection techniques, various immunosensing platforms have been reported on the basis of the specific recognition between antigen-antibody pairs. In addition, profiting from the development of nanotechnology and molecular biology, diverse fabrication and signal amplification strategies have been designed for the detection of protein antigens, which has led to great achievements in fast quantitative and simultaneous testing with extremely high sensitivity and specificity. Besides antigens, the determination of antibodies also possesses great significance for clinical laboratory diagnosis. In this review, we categorize recent immunosensors for proteins by their detection techniques. The basic concepts of the detection techniques, the sensing mechanisms, and the relevant signal amplification strategies are introduced. Since antibodies and antigens hold equal positions in immunosensing, all biosensing strategies for antigens can be extended to antibodies under appropriate optimizations. Biosensors for antibodies are summarized, focusing on potential applications in clinical laboratory diagnosis, such as a series of biomarkers for infectious diseases and autoimmune diseases, and the evaluation of vaccine immunity. The excellent performance of these biosensors offers prospects for future antibody-detection-based disease serodiagnosis.
The Taguchi Method Application to Improve the Quality of a Sustainable Process
NASA Astrophysics Data System (ADS)
Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.
2018-06-01
Taguchi's method has long been used to improve the quality of the analyzed processes and products. This research addresses an unusual situation, namely the modeling of certain technical parameters in a process intended to be sustainable, by improving process quality and ensuring quality through an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of the interaction between agricultural sustainability principles and the application of Taguchi's method. The experimental method used in this practical study combines engineering techniques with experimental statistical modeling to achieve rapid improvement of quality costs, in effect seeking optimization of the existing processes and of the main technical parameters. The paper is a purely technical study that promotes a technical experiment using the Taguchi method, considered to be an effective method since it allows 70 to 90% of the desired optimization of the technical parameters to be achieved rapidly. The remaining 10 to 30 percent can be obtained with one or two complementary experiments, limited to the 2 to 4 technical parameters considered to be the most influential. Applying Taguchi's method allowed the simultaneous study, in the same experiment, of the influence factors considered most important, in different combinations, and at the same time the determination of each factor's contribution.
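The core Taguchi calculation can be illustrated in a few lines; the orthogonal array, factor labels and response values below are invented for the example and do not come from the paper:

import numpy as np

# L4(2^3) orthogonal array: three two-level factors, hypothetical labels A, B, C
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# measured responses for each run, replicated twice (placeholder values)
y = np.array([[8.2, 8.4], [9.1, 9.3], [7.5, 7.7], [8.8, 9.0]])

# "larger is better" signal-to-noise ratio for each run
sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))

for f, name in enumerate("ABC"):
    level_means = [sn[L4[:, f] == lvl].mean() for lvl in (0, 1)]
    print(f"factor {name}: mean S/N at levels 0/1 = {level_means}, "
          f"effect = {abs(level_means[1] - level_means[0]):.2f}")

Ranking the factor effects in this way is what allows the bulk of the optimization to be reached in a single designed experiment, with the most influential factors refined afterwards.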
Advanced Techniques for Ultrasonic Imaging in the Presence of Material and Geometrical Complexity
NASA Astrophysics Data System (ADS)
Brath, Alexander Joseph
The complexity of modern engineering systems is increasing in several ways: advances in materials science are leading to the design of materials which are optimized for strength, conductivity, temperature resistance, etc., leading to complex material microstructure; the combination of additive manufacturing and shape optimization algorithms is leading to components with incredibly intricate geometrical complexity; and engineering systems are being designed to operate at larger scales in ever harsher environments. As a result, at the same time that there is an increasing need for reliable and accurate defect detection and monitoring capabilities, many of the currently available non-destructive evaluation techniques are rendered ineffective by this increasing material and geometrical complexity. This thesis addresses the challenges posed by inspection and monitoring problems in complex engineering systems with a three-part approach. In order to address material complexities, a model of wavefront propagation in anisotropic materials is developed, along with efficient numerical techniques to solve for the wavefront propagation in inhomogeneous, anisotropic material. Since material and geometrical complexities significantly affect the ability of ultrasonic energy to penetrate into the specimen, measurement configurations are tailored to specific applications which utilize arrays of either piezoelectric (PZT) or electromagnetic acoustic transducers (EMAT). These measurement configurations include novel array architectures as well as the exploration of ice as an acoustic coupling medium. Imaging algorithms which were previously developed for isotropic materials with simple geometry are adapted to utilize the more powerful wavefront propagation model and novel measurement configurations.
NASA Astrophysics Data System (ADS)
Shirzaei, M.; Walter, T. R.
2009-10-01
Modern geodetic techniques provide valuable and near real-time observations of volcanic activity. Characterizing the source of deformation based on these observations has become of major importance in related monitoring efforts. We investigate two random search approaches, simulated annealing (SA) and genetic algorithm (GA), and utilize them in an iterated manner. The iterated approach helps to prevent GA in general and SA in particular from getting trapped in local minima, and it also increases redundancy for exploring the search space. We apply a statistical competency test for estimating the confidence interval of the inversion source parameters, considering their internal interaction through the model, the effect of the model deficiency, and the observational error. Here, we present and test this new randomly iterated search and statistical competency (RISC) optimization method together with GA and SA for the modeling of data associated with volcanic deformations. Following synthetic and sensitivity tests, we apply the improved inversion techniques to two episodes of activity in the Campi Flegrei volcanic region in Italy, observed by the interferometric synthetic aperture radar technique. Inversion of these data allows derivation of deformation source parameters and their associated quality so that we can compare the two inversion methods. The RISC approach was found to be an efficient method in terms of computation time and search results and may be applied to other optimization problems in volcanic and tectonic environments.
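The 'iterated' ingredient of the RISC idea can be caricatured as restarting a simple annealing search from several random models and keeping the overall best; the forward model below is a stand-in, not the volcanic deformation source model inverted in the paper:

import numpy as np

def forward(m):
    # stand-in forward model mapping source parameters (depth, volume) to a predicted observation
    depth, volume = m
    return volume / (depth**2 + 1.0)

def objective(m, d_obs):
    return (forward(m) - d_obs) ** 2

def annealing_run(d_obs, n_steps=2000, T0=1.0):
    m = np.random.uniform([0.5, 0.1], [10.0, 5.0])      # random starting model
    f = objective(m, d_obs)
    for k in range(n_steps):
        T = T0 * (1 - k / n_steps)                      # simple linear cooling schedule
        trial = m + 0.1 * np.random.randn(2)
        ft = objective(trial, d_obs)
        if ft < f or np.random.rand() < np.exp(-(ft - f) / max(T, 1e-9)):
            m, f = trial, ft                            # Metropolis-style acceptance
    return m, f

d_obs = forward(np.array([3.0, 2.0]))                   # synthetic observation
runs = [annealing_run(d_obs) for _ in range(10)]        # iterated restarts reduce trapping in local minima
best_m, best_f = min(runs, key=lambda r: r[1])
print("best model over 10 iterated runs:", best_m)

The spread of the restart results also gives a crude picture of parameter trade-offs, which the RISC method formalizes with its statistical competency test.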
Professional Competence of a Teacher in Higher Educational Institution
ERIC Educational Resources Information Center
Abykanova, Bakytgul; Tashkeyeva, Gulmira; Idrissov, Salamat; Bilyalova, Zhupar; Sadirbekova, Dinara
2016-01-01
Modern reality brings certain corrections to the understanding of forms and methods of teaching various courses in higher educational institution. A special role among the educational techniques and means in the college educational environment is taken by the modern technologies, such as using the techniques, means and ways, which are aimed at…
An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques
ERIC Educational Resources Information Center
de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.
2017-01-01
This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…
ERIC Educational Resources Information Center
Fitzgerald, Mary
2017-01-01
This article reflects on the ways in which socially engaged arts practices can contribute to reconceptualizing the contemporary modern dance technique class as a powerful site of social change. Specifically, the author considers how incorporating socially engaged practices into pedagogical models has the potential to foster responsible citizenship…
Visualization tool for human-machine interface designers
NASA Astrophysics Data System (ADS)
Prevost, Michael P.; Banda, Carolyn P.
1991-06-01
As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.
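The spring/attractor/repeller metaphor maps naturally onto a force-directed layout. The sketch below is a generic illustration with invented relatedness weights, not the ISLE implementation: related display elements are pulled together while a short-range repulsion keeps them from overlapping:

import numpy as np

n = 6                                                    # number of information sources on the display
pos = np.random.rand(n, 2)                               # initial panel positions on a unit display
related = np.random.rand(n, n)                           # hypothetical pairwise relatedness weights
related = (related + related.T) / 2
np.fill_diagonal(related, 0)

for _ in range(500):
    force = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d) + 1e-9
            force[i] += 0.01 * related[i, j] * d         # spring: strongly related sources attract
            force[i] -= 0.001 * d / dist**3              # repeller: keep panels separated
    pos = np.clip(pos + force, 0.0, 1.0)                 # stay within display bounds

print(np.round(pos, 2))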
The Solutions of the Agricultural Land Use Monitoring Problems
ERIC Educational Resources Information Center
Vershinin, Valentin V.; Murasheva, Alla A.; Shirokova, Vera A.; Khutorova, Alla O.; Shapovalov, Dmitriy A.; Tarbaev, Vladimir A.
2016-01-01
The modern landscape is a holistic system of interconnected and interacting components. Among the questions of primary importance are the evaluation of the stability of the modern landscape (including the agrarian landscape) and its optimization. The main complex characteristic of landscape inhomogeneity in the process of agricultural use is provided by materials of quantitative and…
Strategies on the Implementation of China's Logistics Information Network
NASA Astrophysics Data System (ADS)
Dong, Yahui; Li, Wei; Guo, Xuwen
Economic globalization and the trend toward networked e-commerce have determined that the logistics industry will develop rapidly in the 21st century. In order to achieve the optimal allocation of resources, a worldwide rapid and sound customer service system should be established, and the establishment of a corresponding modern logistics system is the inevitable choice arising from this requirement. It is also the inevitable choice for the development of the modern logistics industry in China. The close combination of modern logistics and information networks can better promote the development of the logistics industry. Through an analysis of the status of the logistics industry in China, this paper summarizes some common problems that domestic logistics enterprises face in building logistics information systems. Based on logistics information system planning methods and principles, it sets out a logistics information system to optimize the management model.
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress in developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the 'Center') will be a resource for industry, DOE programs, and academia in validation efforts.
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
ERIC Educational Resources Information Center
Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.
2014-01-01
Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…
Alternatives for jet engine control
NASA Technical Reports Server (NTRS)
Sain, M. K.
1979-01-01
The research is classified into two categories: (1) the use of modern multivariable frequency domain methods for control of engine models in the neighborhood of a set-point, and (2) the use of nonlinear modelling and optimization techniques for control of engine models over a more extensive part of the flight envelope. Progress in the first category included the extension of CARDIAD (Complex Acceptability Region for Diagonal Dominance) methods developed with the help of the grant to the case of engine models with four inputs and four outputs. A suitable bounding procedure for the dominance function was determined. Progress in the second category had its principal focus on automatic nonlinear model generation. Simulations of models produced satisfactory results when compared with the NASA DYNGEN digital engine deck.
NASA Astrophysics Data System (ADS)
Afanasyev, A. P.; Bazhenov, R. I.; Luchaninov, D. V.
2018-05-01
The main purpose of the research is to develop techniques for defining the best technical and economic trajectories of cables in urban power systems. The proposed algorithms for calculating cable-laying routes take into consideration the topological, technical and economic features of the cabling. The discrete variant of the fast marching method is applied as the calculation tool. It has certain advantages compared to other approaches; in particular, the algorithm is computationally inexpensive because it is not iterative. The resulting cable-laying trajectories are considered optimal from the point of view of technical and economic criteria and correspond to the present rules of modern urban development.
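As a rough analogue only (a Dijkstra-style pass over a cost grid rather than the authors' actual fast marching formulation), a cheapest cable route can be computed in a single non-iterative sweep; the grid, costs and obstacle below are invented:

import heapq
import numpy as np

cost = np.ones((20, 20))
cost[5:15, 8] = 10.0                                     # hypothetical expensive strip (e.g. existing utilities)
start, goal = (0, 0), (19, 19)

dist = np.full(cost.shape, np.inf)
dist[start] = 0.0
heap = [(0.0, start)]
while heap:                                              # single sweep over the grid, no iteration to convergence
    d, (r, c) = heapq.heappop(heap)
    if d > dist[r, c]:
        continue
    if (r, c) == goal:
        break
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < cost.shape[0] and 0 <= nc < cost.shape[1]:
            nd = d + cost[nr, nc]
            if nd < dist[nr, nc]:
                dist[nr, nc] = nd
                heapq.heappush(heap, (nd, (nr, nc)))

print("cheapest route cost:", dist[goal])

A technical-and-economic criterion would enter through the per-cell cost values, which is where the topological and economic features mentioned above would be encoded.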
Dijkman, B; Wellens, H J
2001-09-01
The dual chamber implantable cardioverter defibrillator (ICD) combines the ability to detect and treat ventricular and atrial arrhythmias with the possibilities of modern heart stimulation techniques. Advanced pacing algorithms together with extended arrhythmia detection capabilities can give rise to new types of device-device interactions. Some of the possible interactions are illustrated by four cases documented in four models of dual chamber ICDs. The functioning of new features in dual chamber devices is influenced by the fact that the pacemaker is not a separate device but a part of the ICD system, and that both are being used in a patient with arrhythmia. Programming measures are suggested to optimize the use of new pacing algorithms while maintaining correct arrhythmia detection.
NASA Astrophysics Data System (ADS)
Sánchez, Florencio; Craciun, Valentin
2018-07-01
Research on nanomaterials and nanostructures is continuing to grow at a rapid pace as they are used in many important devices like transistors, sensors, MEMS or components of modern tools for diagnosis and treatment in medicine. The functional properties of the materials used in these devices depend on their microstructure, and can be finely tuned using physical and chemical synthesis or various processing techniques that change the structure, composition, morphology and defect type and concentration. The investigation of stress, stoichiometry, phase structure and defects at the atomic level is necessary to understand, model and further optimize the electric, magnetic, optical and mechanical properties of the nanosystems, and for engineers to design new, better and more reliable devices.
NASA Astrophysics Data System (ADS)
Bednyakova, Anastasia; Turitsyn, Sergei K.
2015-03-01
The key to generating stable optical pulses is mastery of nonlinear light dynamics in laser resonators. Modern techniques to control the buildup of laser pulses are based on nonlinear science and include classical solitons, dissipative solitons, parabolic pulses (similaritons) and various modifications and blending of these methods. Fiber lasers offer remarkable opportunities to apply one-dimensional nonlinear science models for the design and optimization of very practical laser systems. Here, we propose a new concept of a laser based on the adiabatic amplification of a soliton pulse in the cavity—the adiabatic soliton laser. The adiabatic change of the soliton parameters during evolution in the resonator relaxes the restriction on the pulse energy inherent in traditional soliton lasers. Theoretical analysis is confirmed by extensive numerical modeling.
Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey
2014-01-01
With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Exploiting these opportunities requires effective data-analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.
GPU accelerated FDTD solver and its application in MRI.
Chi, J; Liu, F; Jin, J; Mason, D G; Crozier, S
2010-01-01
The finite difference time domain (FDTD) method is a popular technique for computational electromagnetics (CEM). The large computational power often required, however, has been a limiting factor for its applications. In this paper, we will present a graphics processing unit (GPU)-based parallel FDTD solver and its successful application to the investigation of a novel B1 shimming scheme for high-field magnetic resonance imaging (MRI). The optimized shimming scheme exhibits considerably improved transmit B(1) profiles. The GPU implementation dramatically shortened the runtime of FDTD simulation of electromagnetic field compared with its CPU counterpart. The acceleration in runtime has made such investigation possible, and will pave the way for other studies of large-scale computational electromagnetic problems in modern MRI which were previously impractical.
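For orientation, a minimal one-dimensional FDTD update loop on the CPU is shown below; it illustrates only the leapfrog field updates, not the 3D GPU kernels or the MRI-specific B1 shimming setup of the paper:

import numpy as np

nx, nt = 400, 1000
ez = np.zeros(nx)                                        # electric field samples
hy = np.zeros(nx - 1)                                    # magnetic field samples (staggered grid)
c = 0.5                                                  # Courant number, <= 1 for stability in 1D

for t in range(nt):
    hy += c * (ez[1:] - ez[:-1])                         # H update from the curl of E
    ez[1:-1] += c * (hy[1:] - hy[:-1])                   # E update from the curl of H
    ez[nx // 4] += np.exp(-((t - 30) / 10.0) ** 2)       # soft Gaussian source

print("peak |Ez| after propagation:", np.abs(ez).max())

On a GPU each of these vectorized updates becomes a kernel over the grid, which is why the method parallelizes so well.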
Replica Exchange Molecular Dynamics in the Age of Heterogeneous Architectures
NASA Astrophysics Data System (ADS)
Roitberg, Adrian
2014-03-01
The rise of GPU-based codes has allowed MD to reach timescales only dreamed of 5 years ago. Even within this new paradigm there is still a need for advanced sampling techniques. Modern supercomputers (e.g. Blue Waters, Titan, Keeneland) have made available to users a significant number of GPUs and CPUs, which in turn translates into amazing opportunities for dream calculations. Replica-exchange based methods can optimally use this combination of codes and architectures to explore conformational variability in large systems. I will show our recent work in porting the program Amber to GPUs, and the support for replica exchange methods, where the replicated dimension could be temperature, pH, Hamiltonian, umbrella windows, and combinations of those schemes.
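The exchange step at the heart of replica exchange is a Metropolis swap between neighbouring temperatures; the schematic below (placeholder energies and temperature ladder, not Amber's implementation) shows the acceptance test:

import numpy as np

kB = 0.0019872041                                        # Boltzmann constant in kcal/(mol*K)

def attempt_swap(E_i, E_j, T_i, T_j, rng=np.random):
    # Metropolis criterion for exchanging configurations between replicas at T_i and T_j
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_j - E_i)
    return delta <= 0 or rng.rand() < np.exp(-delta)

temps = [300.0, 320.0, 342.0, 366.0]                     # hypothetical temperature ladder
energies = [-1200.0, -1185.0, -1170.0, -1152.0]          # placeholder potential energies per replica
for k in range(len(temps) - 1):
    print(f"swap {k}<->{k+1}:", attempt_swap(energies[k], energies[k+1], temps[k], temps[k+1]))

The same acceptance structure carries over when the replicated dimension is pH, the Hamiltonian or an umbrella window, with the appropriate weights replacing the Boltzmann factors.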
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley W.
2009-01-01
Supporting the Aeronautics Research Mission Directorate guidelines, the National Aeronautics and Space Administration [NASA] Dryden Flight Research Center is developing a multidisciplinary design, analysis, and optimization [MDAO] tool. This tool will leverage existing tools and practices, and allow the easy integration and adoption of new state-of-the-art software. Today's modern aircraft design in the transonic speed regime is a challenging task due to the computation time required for unsteady aeroelastic analysis using a Computational Fluid Dynamics [CFD] code. Design approaches in this speed regime are mainly based on manual trial and error. Because of the time required for unsteady CFD computations in the time domain, this considerably slows down the whole design process, and the analyses are usually performed repeatedly to optimize the final design. As a result, there is considerable motivation to be able to perform aeroelastic calculations more quickly and inexpensively. This paper describes the development of an unsteady transonic aeroelastic design methodology for design optimization using a reduced modeling method and unsteady aerodynamic approximation. The method requires the unsteady transonic aerodynamics to be represented in the frequency or Laplace domain. A dynamically linear assumption is used for creating Aerodynamic Influence Coefficient [AIC] matrices in the transonic speed regime. Unsteady CFD computations are needed only for the important columns of an AIC matrix, which correspond to the primary flutter modes. Order reduction techniques, such as Guyan reduction and the improved reduction system, are used to reduce the size of the problem, and transonic flutter can then be found by classic methods such as rational function approximation, p-k, p, and root-locus techniques. Such a methodology could be incorporated into the MDAO tool for design optimization at a reasonable computational cost. The proposed technique is verified using the Aerostructures Test Wing 2, actually designed, built, and tested at NASA Dryden Flight Research Center. The results from the full order model and the approximate reduced order model are analyzed and compared.
Method of Optimizing the Construction of Machining, Assembly and Control Devices
NASA Astrophysics Data System (ADS)
Iordache, D. M.; Costea, A.; Niţu, E. L.; Rizea, A. D.; Babă, A.
2017-10-01
Industry dynamics, driven by economic and social requirements, must generate more interest in technological optimization, capable of ensuring a steady development of advanced technical means to equip machining processes. For these reasons, the development of tools, devices, work equipment and control, as well as the modernization of machine tools, is the certain solution for modernizing production systems, one that requires considerable time and effort. This type of approach is also related to our theoretical, experimental and industrial applications of recent years, presented in this paper, whose main objectives are the elaboration and use of mathematical models, new calculation methods, optimization algorithms, new processing and control methods, as well as structures for the construction and configuration of technological equipment with a high level of performance and substantially reduced costs.
EMU Suit Performance Simulation
NASA Technical Reports Server (NTRS)
Cowley, Matthew S.; Benson, Elizabeth; Harvill, Lauren; Rajulu, Sudhakar
2014-01-01
Introduction: Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. To verify that new suit designs meet requirements, full prototypes must be built and tested with human subjects. However, numerous design iterations will occur before the hardware meets those requirements. Traditional draw-prototype-test paradigms for research and development are prohibitively expensive with today's shrinking Government budgets. Personnel at NASA are developing modern simulation techniques that focus on a human-centric design paradigm. These new techniques make use of virtual prototype simulations and fully adjustable physical prototypes of suit hardware. This is extremely advantageous and enables comprehensive design down-selections to be made early in the design process. Objectives: The primary objective was to test modern simulation techniques for evaluating the human performance component of two EMU suit concepts, pivoted and planar style hard upper torso (HUT). Methods: This project simulated variations in EVA suit shoulder joint design and subject anthropometry and then measured the differences in shoulder mobility caused by the modifications. These estimations were compared to human-in-the-loop test data gathered during past suited testing using four subjects (two large males, two small females). Results: Results demonstrated that EVA suit modeling and simulation are feasible design tools for evaluating and optimizing suit design based on simulated performance. The suit simulation model was found to be advantageous in its ability to visually represent complex motions and volumetric reach zones in three dimensions, giving designers a faster and deeper comprehension of suit component performance vs. human performance. Suit models were able to discern differing movement capabilities between EMU HUT configurations, generic suit fit concerns, and specific suit fit concerns for crewmembers based on individual anthropometry
World War II, tantalum, and the evolution of modern cranioplasty technique.
Flanigan, Patrick; Kshettry, Varun R; Benzel, Edward C
2014-04-01
Cranioplasty is a unique procedure with a rich history. Since ancient times, a diverse array of materials from coconut shells to gold plates has been used for the repair of cranial defects. More recently, World War II greatly increased the demand for cranioplasty procedures and renewed interest in the search for a suitable synthetic material for cranioprostheses. Experimental evidence revealed that tantalum was biologically inert to acid and oxidative stresses. In fact, the observation that tantalum did not absorb acid resulted in the metal being named after Tantalus, the Greek mythological figure who was condemned to a pool of water in the Underworld that would recede when he tried to take a drink. In clinical use, malleability facilitated a single-stage cosmetic repair of cranial defects. Tantalum became the preferred cranioplasty material for more than 1000 procedures performed during World War II. In fact, its use was rapidly adopted in the civilian population. During World War II and the heyday of tantalum cranioplasty, there was a rapid evolution in prosthesis implantation and fixation techniques significantly shaping how cranioplasties are performed today. Several years after the war, acrylic emerged as the cranioplasty material of choice. It had several clear advantages over its metallic counterparts. Titanium, which was less radiopaque and had a more optimal thermal conductivity profile (less thermally conductive), eventually supplanted tantalum as the most common metallic cranioplasty material. While tantalum cranioplasty was popular for only a decade, it represented a significant breakthrough in synthetic cranioplasty. The experiences of wartime neurosurgeons with tantalum cranioplasty played a pivotal role in the evolution of modern cranioplasty techniques and ultimately led to a heightened understanding of the necessary attributes of an ideal synthetic cranioplasty material. Indeed, the history of tantalum cranioplasty serves as a model for innovative thinking and adaptive technology development.
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2017-02-01
Sensitivity is a critical index measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
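As a quick numerical companion (a generic four-step phase-shifting estimator with Poisson noise, not the wavelength-shifting derivation of the paper; photon count and visibility are assumed), the spread of the recovered phase over repeated shot-noise realizations approximates the achievable sensitivity:

import numpy as np

rng = np.random.default_rng(0)
N_photons = 1e5                                          # mean photons per frame (assumed)
visibility = 0.9                                         # fringe visibility (assumed)
true_phase = 0.7                                         # rad
shifts = np.array([0, np.pi / 2, np.pi, 3 * np.pi / 2])  # four-step phase shifting

estimates = []
for _ in range(2000):
    mean = N_photons * (1 + visibility * np.cos(true_phase + shifts)) / 2
    I = rng.poisson(mean).astype(float)                  # shot-noise-limited frame intensities
    estimates.append(np.arctan2(I[3] - I[1], I[0] - I[2]))  # standard 4-step phase estimator

print("empirical phase standard deviation (rad):", np.std(estimates))

Sweeping N_photons in such a simulation reproduces the expected trend that sensitivity improves roughly as the inverse square root of the detected photon number.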
[The Morbidity of Students Conditioned by Diet Character in Modern Condition of Education].
Novokhatskaya, E A; Yakovleva, T P; Kalitina, M A
2017-09-01
The article considers the characteristics of nervous psychic adaptation, morbidity and the character of the diet of students of the Russian State Social University. The main incentives for combining university studies and work are analyzed. The impact of combining studies and work, and of diet regimen and quality, on health is investigated. The psychological studies were implemented using computerized techniques of psychological testing and data collection with a blank-form technique. The morbidity of students was assessed using a questionnaire. It is established that students combining studies and work have optimal indices of nervous psychic adaptation; however, their level of morbidity is twice as high as that of students not combining studies and work. The analysis of the regimen and character of the students' diet established deviations in the regimen and structure of the diet. The ratio of proteins, fats and carbohydrates in the students' daily ration was imbalanced (1.0:1.4:6.1) owing to a surplus of fat and especially carbohydrates, which can subsequently result in the development of diseases related to an irregular diet.
Structural Optimization of a Force Balance Using a Computational Experiment Design
NASA Technical Reports Server (NTRS)
Parker, P. A.; DeLoach, R.
2002-01-01
This paper proposes a new approach to force balance structural optimization featuring a computational experiment design. Currently, this multi-dimensional design process requires the designer to perform a simplification by executing parameter studies on a small subset of design variables. This one-factor-at-a-time approach varies a single variable while holding all others at a constant level. Consequently, subtle interactions among the design variables, which can be exploited to achieve the design objectives, are undetected. The proposed method combines Modern Design of Experiments techniques to direct the exploration of the multi-dimensional design space, and a finite element analysis code to generate the experimental data. To efficiently search for an optimum combination of design variables and minimize the computational resources, a sequential design strategy was employed. Experimental results from the optimization of a non-traditional force balance measurement section are presented. An approach to overcome the unique problems associated with the simultaneous optimization of multiple response criteria is described. A quantitative single-point design procedure that reflects the designer's subjective impression of the relative importance of various design objectives, and a graphical multi-response optimization procedure that provides further insights into available tradeoffs among competing design objectives are illustrated. The proposed method enhances the intuition and experience of the designer by providing new perspectives on the relationships between the design variables and the competing design objectives providing a systematic foundation for advancements in structural design.
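To make the idea concrete, a toy two-factor computational experiment with a fitted quadratic response surface is sketched below; the analytic stand-in replaces the finite element run and nothing here mirrors the actual balance geometry or response criteria:

import numpy as np
from itertools import product

def fea_response(x1, x2):
    # stand-in for the finite element output at coded factor settings (x1, x2)
    return 5.0 + 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2 - 1.2 * x1**2 - 0.5 * x2**2

levels = (-1.0, 0.0, 1.0)                                # coded levels of a face-centred design
runs = np.array(list(product(levels, levels)))
y = np.array([fea_response(a, b) for a, b in runs])

# fit the full quadratic model y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones(len(runs)), runs[:, 0], runs[:, 1],
                     runs[:, 0] * runs[:, 1], runs[:, 0]**2, runs[:, 1]**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted response-surface coefficients:", np.round(coef, 3))

Because the interaction term is estimated explicitly, the fitted surface exposes exactly the variable couplings that a one-factor-at-a-time study would miss.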
Hierarchical random walks in trace fossils and the origin of optimal search behavior
Sims, David W.; Reynolds, Andrew M.; Humphries, Nicolas E.; Southall, Emily J.; Wearmouth, Victoria J.; Metcalfe, Brett; Twitchett, Richard J.
2014-01-01
Efficient searching is crucial for timely location of food and other resources. Recent studies show that diverse living animals use a theoretically optimal scale-free random search for sparse resources known as a Lévy walk, but little is known of the origins and evolution of foraging behavior and the search strategies of extinct organisms. Here, using simulations of self-avoiding trace fossil trails, we show that randomly introduced strophotaxis (U-turns)—initiated by obstructions such as self-trail avoidance or innate cueing—leads to random looping patterns with clustering across increasing scales that is consistent with the presence of Lévy walks. This predicts that optimal Lévy searches may emerge from simple behaviors observed in fossil trails. We then analyzed fossilized trails of benthic marine organisms by using a novel path analysis technique and find the first evidence, to our knowledge, of Lévy-like search strategies in extinct animals. Our results show that simple search behaviors of extinct animals in heterogeneous environments give rise to hierarchically nested Brownian walk clusters that converge to optimal Lévy patterns. Primary productivity collapse and large-scale food scarcity characterizing mass extinctions evident in the fossil record may have triggered adaptation of optimal Lévy-like searches. The findings suggest that Lévy-like behavior has been used by foragers since at least the Eocene but may have a more ancient origin, which might explain recent widespread observations of such patterns among modern taxa. PMID:25024221
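A schematic of the mechanism (not the authors' simulation code) is easy to write down: a walker steers randomly, and detecting its own trail ahead triggers a U-turn, which is the strophotaxis rule described above; all step sizes and thresholds below are arbitrary:

import numpy as np

rng = np.random.default_rng(1)
pos = np.array([0.0, 0.0])
heading = 0.0
visited = {(0, 0)}                                       # coarse grid cells already crossed by the trail
trail = [pos.copy()]

for _ in range(5000):
    heading += rng.normal(0, 0.1)                        # gentle random steering
    step = np.array([np.cos(heading), np.sin(heading)])
    ahead = tuple(np.round(pos + 2 * step).astype(int))
    if ahead in visited:                                 # own trail detected ahead: strophotaxis
        heading += np.pi                                 # U-turn
        step = np.array([np.cos(heading), np.sin(heading)])
    pos = pos + step
    visited.add(tuple(np.round(pos).astype(int)))
    trail.append(pos.copy())

print("trail points:", len(trail))

Analysing the resulting move-length distribution across scales is then the step at which Lévy-like clustering would or would not appear.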
Magnetic resonance spectroscopic imaging for improved treatment planning of prostate cancer
NASA Astrophysics Data System (ADS)
Venugopal, Niranjan
Prostate cancer is the most common malignancy afflicting Canadian men in 2011. Physicians use digital rectal exams (DRE), blood tests for prostate specific antigen (PSA) and transrectal ultrasound (TRUS)-guided biopsies for the initial diagnosis of prostate cancer. None of these tests detail the spatial extent of prostate cancer - information critical for using new therapies that can target the cancerous prostate. With an MRI technique called proton magnetic resonance spectroscopic imaging (1H-MRSI), biochemical analysis of the entire prostate can be done without the need for biopsy, providing detailed information beyond the non-specific changes in hardness felt by an experienced urologist in a DRE, the presence of PSA in blood, or the "blind-guidance" of TRUS-guided biopsy. A hindrance to acquiring high quality 1H-MRSI data comes from signal originating from fatty tissue surrounding the prostate that tends to mask or distort signal from within the prostate, thus reducing the overall clinical usefulness of 1H-MRSI data. This thesis has three major areas of focus: 1) The development of an optimized 1H-MRSI technique, called conformal voxel magnetic resonance spectroscopy (CV-MRS), to deal with the removal of unwanted lipid-contaminating artifacts at short and long echo times. 2) An in vivo human study to test the CV-MRS technique, including healthy volunteers and cancer patients scheduled for radical prostatectomy or radiation therapy. 3) A study to determine the efficacy of using the 1H-MRSI data for optimized radiation treatment planning using modern delivery techniques like intensity modulated radiation treatment. Data collected from the study using the optimized CV-MRS method show significantly reduced lipid contamination, resulting in high quality spectra throughout the prostate. Combining the CV-MRS technique with spectral-spatial excitation further reduced lipid contamination and opened up the possibility of detecting metabolites with short T2 relaxation times. Results from the in vivo study were verified with post-histopathological data. Lastly, 1H-MRSI data was incorporated into the radiation treatment planning software and used to assess tumour control by escalating the radiation to prostate lesions that were identified by 1H-MRSI. In summary, this thesis demonstrates the clinical feasibility of using advanced spectroscopic imaging techniques for improved diagnosis and treatment of prostate cancer.
NASA Technical Reports Server (NTRS)
Schmidt, Phillip; Garg, Sanjay; Holowecky, Brian
1992-01-01
A parameter optimization framework is presented to solve the problem of partitioning a centralized controller into a decentralized hierarchical structure suitable for integrated flight/propulsion control implementation. The controller partitioning problem is briefly discussed and a cost function to be minimized is formulated, such that the resulting 'optimal' partitioned subsystem controllers will closely match the performance (including robustness) properties of the closed-loop system with the centralized controller while maintaining the desired controller partitioning structure. The cost function is written in terms of parameters in a state-space representation of the partitioned sub-controllers. Analytical expressions are obtained for the gradient of this cost function with respect to parameters, and an optimization algorithm is developed using modern computer-aided control design and analysis software. The capabilities of the algorithm are demonstrated by application to partitioned integrated flight/propulsion control design for a modern fighter aircraft in the short approach to landing task. The partitioning optimization is shown to lead to reduced-order subcontrollers that match the closed-loop command tracking and decoupling performance achieved by a high-order centralized controller.
NASA Technical Reports Server (NTRS)
Schmidt, Phillip H.; Garg, Sanjay; Holowecky, Brian R.
1993-01-01
A parameter optimization framework is presented to solve the problem of partitioning a centralized controller into a decentralized hierarchical structure suitable for integrated flight/propulsion control implementation. The controller partitioning problem is briefly discussed and a cost function to be minimized is formulated, such that the resulting 'optimal' partitioned subsystem controllers will closely match the performance (including robustness) properties of the closed-loop system with the centralized controller while maintaining the desired controller partitioning structure. The cost function is written in terms of parameters in a state-space representation of the partitioned sub-controllers. Analytical expressions are obtained for the gradient of this cost function with respect to parameters, and an optimization algorithm is developed using modern computer-aided control design and analysis software. The capabilities of the algorithm are demonstrated by application to partitioned integrated flight/propulsion control design for a modern fighter aircraft in the short approach to landing task. The partitioning optimization is shown to lead to reduced-order subcontrollers that match the closed-loop command tracking and decoupling performance achieved by a high-order centralized controller.
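A toy version of the parameter-optimization idea (a generic frequency-response matching problem, not the actual flight/propulsion formulation) is sketched below: the low-order sub-controller parameters are adjusted so its response tracks a 'centralized' target over a frequency grid:

import numpy as np
from scipy.optimize import minimize

w = np.logspace(-1, 2, 60)                               # frequency grid, rad/s

def centralized(jw):
    # stand-in for the full-order centralized controller's frequency response
    return (5.0 * (jw + 2.0)) / ((jw + 0.5) * (jw + 10.0))

def partitioned(params, jw):
    k, a = params                                        # parameters of a first-order sub-controller
    return k / (jw + a)

def cost(params):
    jw = 1j * w
    err = centralized(jw) - partitioned(params, jw)
    return np.sum(np.abs(err) ** 2)                      # performance-matching cost over frequency

res = minimize(cost, x0=[1.0, 1.0], method="Nelder-Mead")
print("optimal (k, a):", res.x, "cost:", res.fun)

The papers above work with an analytical gradient of a richer cost (including robustness terms) over state-space parameters, but the structure, a scalar cost minimized over sub-controller parameters, is the same.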
Optimizing the construction of devices to control inaccessible surfaces - case study
NASA Astrophysics Data System (ADS)
Niţu, E. L.; Costea, A.; Iordache, M. D.; Rizea, A. D.; Babă, Al
2017-10-01
The modern concept for the evolution of manufacturing systems requires multi-criteria optimization of technological processes and equipment, prioritizing the associated criteria according to their importance. Technological preparation of manufacturing can be developed, depending on the production volume, up to the limit of favourable economic effects related to recovering the costs of designing and executing the technological equipment. Devices, as subsystems of the technological system, in the general context of the modernization and diversification of machines, tools, semi-finished products and drives, are made in a multitude of constructive variants, which in many cases do not allow their identification, study and improvement. This paper presents a case study in which the multi-criteria analysis of some structures, based on a novel general optimization method, is used in order to determine the optimal construction variant of a control device. The rational construction of the control device confirms that the optimization method and the proposed calculation methods are correct and determine a different system configuration, new features and functions, and a specific method of working to control inaccessible surfaces.
Should psychology be ‘positive’? Letting the philosophers speak
Oyebode, Femi
2014-01-01
This is a brief commentary on the value of optimism in therapy. It draws on the philosophical writings of Schopenhauer and Aristotle. It suggests that the modern preoccupation with optimism may be as extreme as the bleak pessimistic outlook favoured by Schopenhauer. PMID:25237498
When Do Painters Make Their Best Work?
ERIC Educational Resources Information Center
Franses, P. H.
2013-01-01
This Research Note proposes that modern art painters make their best works at the optimal moment in their lives, a moment that could then be associated with the Divine proportion (the Fibonacci phi). An analysis of 189 highest-priced works by as many modern art painters, comparing the moment of creation with their life span of these artists,…
Three Naive Questions: Addressed to the Modern Educational Optimism
ERIC Educational Resources Information Center
Krstic, Predrag
2016-01-01
This paper aims to question anew the popular and supposedly self-evident affirmation of education, in its modern incarnation as in its historical notion. The "naive" questions suggest that we have recently taken for granted that education ought to be for the masses, that it ought to be upbringing, and that it is better than ignorance.…
ERIC Educational Resources Information Center
Papadopoulos, Ioannis
2010-01-01
The issue of the area of irregular shapes is absent from the modern mathematical textbooks in elementary education in Greece. However, there exists a collection of books written for educational purposes by famous Greek scholars dating from the eighteenth century, which propose certain techniques concerning the estimation of the area of such…
Testing New Proxies for Photosymbiosis in the Fossil Record
NASA Astrophysics Data System (ADS)
Tornabene, C.; Martindale, R. C.; Schaller, M. F.
2015-12-01
Photosymbiosis is a mutualistic relationship that many corals have developed with dinoflagellates called zooxanthellae. The dinoflagellates, of the genus Symbiodinium, photosynthesize and provide corals with most of their energy, while in turn coral hosts live in waters where zooxanthellae have optimal exposure to sunlight. Thanks to this relationship, symbiotic corals calcify faster than non-symbiotic corals. Photosymbiosis is therefore considered the evolutionary innovation that allowed corals to become major reef-builders through geological time. This relationship is extremely difficult to study. Zooxanthellae, which are housed in the coral tissue, are not preserved in fossil coral skeletons, thus determining whether corals had symbionts requires a robust proxy. In order to address this critical question, the goal of this research is to test new proxies for ancient photosymbiosis. Currently the project is focused on assessing the nitrogen (δ15N) isotopes of corals' organic matrices, sensu Muscatine et al. (2005), as well as carbon and oxygen (δ13C, δ18O) isotopes of fossil coral skeletons. Samples from Modern, Pleistocene, Oligocene and Triassic coral skeletons were analyzed to test the validity of these proxies. Coral samples comprise both (interpreted) symbiotic and non-symbiotic fossil corals from the Oligocene and Triassic as well as symbiotic fossil corals from the Modern and Pleistocene to corroborate our findings with the results of Muscatine et al. (2005). Samples were tested for diagenesis through petrographic and scanning electron microscope (SEM) analyses to avoid contamination. Additionally, a novel technique that has not yet been applied to the fossil record was tested. The technique aims to recognize dinosterol, a dinoflagellate biomarker, in both modern and fossil coral samples. The premise of this proxy is that symbiotic corals should contain the dinoflagellate biomarker, whereas those lacking symbionts should lack dinosterol. Results from this research will ideally lead to the development of a definitive, quantitative test for whether fossil corals had symbionts.
NASA Astrophysics Data System (ADS)
Srikantha, Pirathayini
Today's electric grid is rapidly evolving to provide for heterogeneous system components (e.g. intermittent generation, electric vehicles, storage devices, etc.) while catering to diverse consumer power demand patterns. In order to accommodate this changing landscape, the widespread integration of cyber communication with physical components can be witnessed in all facets of the modern power grid. This ubiquitous connectivity provides an elevated level of awareness and decision-making ability to system operators. Moreover, devices that were typically passive in the traditional grid are now 'smarter' as they can respond to remote signals, learn about local conditions and even make their own actuation decisions if necessary. These advantages can be leveraged to reap unprecedented long-term benefits that include sustainable, efficient and economical power grid operations. Furthermore, challenges introduced by emerging trends in the grid such as high penetration of distributed energy sources, rising power demands, deregulation and cyber-security concerns due to vulnerabilities in standard communication protocols can be overcome by tapping into the active nature of modern power grid components. In this thesis, distributed constructs in optimization and game theory are utilized to design the seamless real-time integration of a large number of heterogeneous power components, such as distributed energy sources with highly fluctuating generation capacities and flexible power consumers with varying demand patterns, to achieve optimal operations across multiple levels of hierarchy in the power grid. Specifically, advanced data acquisition, cloud analytics (such as prediction), control and storage systems are leveraged to promote sustainable and economical grid operations while ensuring that physical network, generation and consumer comfort requirements are met. Moreover, privacy and security considerations are incorporated into the core of the proposed designs, and these serve to improve the resiliency of the future smart grid. It is demonstrated both theoretically and practically that the techniques proposed in this thesis are highly scalable and robust with superior convergence characteristics. These distributed and decentralized algorithms allow individual actuating nodes to execute self-healing and adaptive actions when exposed to changes in the grid, so that the optimal operating state in the grid is maintained consistently.
Development of a Composite Tailoring Procedure for Airplane Wings
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi
2000-01-01
The quest for finding optimum solutions to engineering problems has existed for a long time. In modern times, the development of optimization as a branch of applied mathematics is regarded to have originated in the works of Newton, Bernoulli and Euler. Venkayya has presented a historical perspective on optimization in [1]. The term 'optimization' is defined by Ashley [2] as a procedure "...which attempts to choose the variables in a design process so as formally to achieve the best value of some performance index while not violating any of the associated conditions or constraints". Ashley presented an extensive review of practical applications of optimization in the aeronautical field till about 1980 [2]. It was noted that there existed an enormous amount of published literature in the field of optimization, but its practical applications in industry were very limited. Over the past 15 years, though, optimization has been widely applied to address practical problems in aerospace design [3-5]. The design of high performance aerospace systems is a complex task. It involves the integration of several disciplines such as aerodynamics, structural analysis, dynamics, and aeroelasticity. The problem involves multiple objectives and constraints pertaining to the design criteria associated with each of these disciplines. Many important trade-offs exist between the parameters involved which are used to define the different disciplines. Therefore, the development of multidisciplinary design optimization (MDO) techniques, in which different disciplines and design parameters are coupled into a closed loop numerical procedure, seems appropriate to address such a complex problem. The importance of MDO in successful design of aerospace systems has been long recognized. Recent developments in this field have been surveyed by Sobieszczanski-Sobieski and Haftka [6].
Photonics and nanophotonics and information and communication technologies in modern food packaging.
Sarapulova, Olha; Sherstiuk, Valentyn; Shvalagin, Vitaliy; Kukhta, Aleksander
2015-01-01
The problem of integrating information and communication technologies (ICT) with the packaging industry and food production was analyzed, and the prospects of combining the latest advances in nanotechnology, including nanophotonics, with ICT to create modern smart packaging were outlined. Luminescent films with zinc oxide nanoparticles, whose luminescence intensity changes as nano-ZnO interacts with the decay compounds of food products, were investigated for active and intelligent packaging. Highly luminescent transparent films were obtained from a colloidal suspension of ZnO and polyvinylpyrrolidone (PVP). The influence of molecular mass, nano-ZnO concentration, and film thickness on the luminescent properties of the films was studied in order to optimize the composition. Coating the obtained films with polyvinyl alcohol was considered as a way to eliminate the water solubility of PVP, and the luminescent properties of films with different coatings were studied. A water-insoluble composition based on ZnO stabilized with colloidal silicon dioxide and PVP in polymethylmethacrylate was also developed, and its luminescent properties were investigated. The compositions are non-toxic, safe, and suitable for application to the inner surface of active and intelligent packaging by printing techniques such as screen printing, flexography, inkjet, and pad printing.
Semiserin, V A; Khritinin, D F; Maev, I V; Karakozov, A T; Eremin, M N; Olenicheva, E L
2012-01-01
This paper synthesizes current and recent data on the problem of metabolic syndrome (MS) combined with toxic liver injury (CCI). Statistics for the last 15 years on the dynamics of alimentary-constitutional obesity (ABC) among contracted-service officers of the Russian Ministry of Defense are presented. Two years of experience applying modern non-invasive methods for diagnosing liver fibrosis, and its dynamics during complex treatment of patients with MS combined with CCI, are illustrated with 57 patients. Considerable attention is paid to the psychological and emotional adjustment of patients with ABC, given the complexity of examination and treatment planning when motivational and behavioral responses are disturbed. The high clinical efficiency of combined drug therapy for MS and CCI and the diagnostic value of modern non-invasive methods for diagnosing hepatic fibrosis are reliably demonstrated. Elastography significantly improves the clinical evaluation of the liver and of treatment effectiveness, allows early detection of initial degrees of hepatic fibrosis, and helps to choose the optimal treatment regimen and to evaluate the results dynamically.
Computer Assisted Design, Prediction, and Execution of Economical Organic Syntheses
NASA Astrophysics Data System (ADS)
Gothard, Nosheen Akber
The synthesis of useful organic molecules via simple and cost-effective routes is a core challenge in organic chemistry. In industry and academia alike, organic chemists use their chemical intuition, technical expertise and published procedures to determine an optimal pathway. This approach not only takes time and effort but can also be cost prohibitive, and many potentially optimal routes sketched on paper are never experimentally tested. In addition, newly discovered methods are often overlooked in favor of established techniques. This thesis reports a computational technique that assists the discovery of economical synthetic routes to useful organic targets. Organic chemistry exists as a network in which chemicals are connected by reactions, analogous to cities connected by roads on a geographic map. This network topology of the network of organic chemistry (NOC) allows graph theory to be applied to devise algorithms for the synthetic optimization of organic targets. A computational approach comprising customizable algorithms, pre-screening filters, and existing chemoinformatic techniques is capable of answering complex questions and performing mechanistic tasks desired by chemists, such as the optimization of organic syntheses. One-pot reactions are central to modern synthesis since they save resources and time by avoiding isolation, purification, characterization, and the production of chemical waste after each synthetic step. Sometimes such reactions are identified by chance or, more often, by careful inspection of the individual steps that are to be wired together. Here, algorithms are used to discover one-pot reactions, which are then validated experimentally, demonstrating that the computationally predicted sequences can indeed be carried out in good overall yields. The experimental examples are chosen to form small networks of reactions around useful chemicals such as quinoline scaffolds, quinoline-based inhibitors of the phosphoinositide 3-kinase delta (PI3Kdelta) enzyme, and thiophene derivatives. In this way, individual synthetic connections are replaced with two-, three-, or even four-step one-pot sequences. Lastly, the computational method is used to devise hypothetical synthetic routes to popular pharmaceutical drugs such as Naproxen® and Taxol®. The algorithmically generated optimal pathways are evaluated against chemical logic. Applying a labor/cost factor reveals that not all shorter synthesis routes are economical; sometimes the "longest way round is the shortest way home," and lengthier routes prove more economical and environmentally friendly.
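To make the graph-theoretic idea concrete, the sketch below (not from the thesis) treats a tiny, hypothetical reaction network as a weighted directed graph and runs Dijkstra's algorithm to find the cheapest route to a target. Compound names and step costs are invented for illustration; a real NOC search would add the pre-screening filters and chemoinformatic checks described above.

```python
import heapq

# Toy reaction network: each directed edge (precursor -> product) carries a
# hypothetical "cost" combining reagent price, labor and waste handling.
# Compound names and costs are illustrative only.
reactions = {
    "aniline": [("quinoline_core", 3.0), ("target_inhibitor", 12.0)],  # direct step is pricier
    "quinoline_core": [("chloro_quinoline", 2.0)],
    "chloro_quinoline": [("target_inhibitor", 4.0)],
}

def cheapest_route(start, goal):
    """Dijkstra search over the reaction network; returns (total_cost, path)."""
    frontier = [(0.0, start, [start])]
    best = {}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if best.get(node, float("inf")) <= cost:
            continue
        best[node] = cost
        for product, step_cost in reactions.get(node, []):
            heapq.heappush(frontier, (cost + step_cost, product, path + [product]))
    return float("inf"), []

if __name__ == "__main__":
    cost, path = cheapest_route("aniline", "target_inhibitor")
    print(f"cost={cost}: " + " -> ".join(path))
```

On this toy graph the one-step route costs more than the three-step route, echoing the observation above that the shortest sequence is not always the most economical.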
Overview of the Wheat Genetic Transformation and Breeding Status in China.
Han, Jiapeng; Yu, Xiaofen; Chang, Junli; Yang, Guangxiao; He, Guangyuan
2017-01-01
In the past two decades, Chinese scientists have achieved significant progress on three aspects of wheat genetic transformation. First, a wheat transformation platform has been established and optimized to improve the transformation efficiency, to shorten the time from the start of the transformation procedure to the recovery of fertile transgenic wheat plants, and to overcome genotype dependence so that a wide range of elite wheat varieties can be transformed. Second, with the help of emerging techniques such as CRISPR/Cas9, the function of over 100 wheat genes has been investigated. Finally, modern technology has been combined with traditional breeding techniques such as crossing to accelerate the application of wheat transformation. Overall, wheat end-use quality and stress tolerance characteristics have been improved by wheat genetic engineering. So far, transgenic wheat lines carrying quality-improving and stress-tolerance genes have reached the Production Test stage in the field. The debates and future directions of wheat transformation are discussed, and a brief summary of the history of Chinese wheat breeding research is also provided in this review.
Extreme Learning Machine and Particle Swarm Optimization in optimizing CNC turning operation
NASA Astrophysics Data System (ADS)
Janahiraman, Tiagrajah V.; Ahmad, Nooraziah; Hani Nordin, Farah
2018-04-01
The CNC machine is controlled by manipulating cutting parameters that directly influence process performance. Many optimization methods have been applied to obtain the optimal cutting parameters for a desired performance function; nonetheless, industry still relies on traditional techniques to obtain these values, largely because of a lack of knowledge of optimization techniques. Therefore, a simple yet easy-to-implement Optimal Cutting Parameters Selection System is introduced to help manufacturers easily understand and determine the best parameters for their turning operations. The system consists of two stages: modelling and optimization. For modelling the relation between input, in-process and output parameters, a hybrid of the Extreme Learning Machine and Particle Swarm Optimization is applied; this modelling technique tends to converge faster than other artificial intelligence techniques and gives accurate results. For the optimization stage, Particle Swarm Optimization is again used to obtain the optimal cutting parameters for the performance function preferred by the manufacturer. Overall, the system can reduce the gap between academia and industry by introducing a simple, easy-to-implement optimization technique that gives accurate results while remaining fast.
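The following sketch (not the authors' code) illustrates the particle swarm optimization stage: it minimizes a stand-in cost surface over two cutting parameters, where the quadratic surrogate plays the role the trained Extreme Learning Machine model plays in the paper. The bounds, coefficients and the surrogate itself are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_cost(x):
    """Stand-in for the trained ELM model: maps (cutting speed, feed rate) to a
    performance score to be minimized (e.g. surface roughness). Purely illustrative."""
    speed, feed = x[..., 0], x[..., 1]
    return (speed - 180.0) ** 2 / 1e4 + (feed - 0.12) ** 2 * 50.0 + 0.3 * speed * feed / 100.0

# Assumed bounds: cutting speed in m/min, feed in mm/rev.
lo = np.array([100.0, 0.05])
hi = np.array([300.0, 0.30])

n_particles, n_iters = 30, 200
w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration coefficients

pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), surrogate_cost(pos)
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)             # keep particles inside the bounds
    val = surrogate_cost(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best (speed, feed):", gbest, "cost:", pbest_val.min())
```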
Gianatassio, Ryan; Lopchuk, Justin M.; Wang, Jie; Pan, Chung-Mao; Malins, Lara R.; Prieto, Liher; Brandt, Thomas A.; Collins, Michael R.; Gallego, Gary M.; Sach, Neal W.; Spangler, Jillian E.; Zhu, Huichin; Zhu, Jinjiang; Baran, Phil S.
2015-01-01
To optimize drug candidates, modern medicinal chemists are increasingly turning to an unconventional structural motif: small, strained ring systems. However, the difficulty of introducing substituents such as bicyclo[1.1.1]pentanes, azetidines, or cyclobutanes often outweighs the challenge of synthesizing the parent scaffold itself. Thus, there is an urgent need for general methods to rapidly and directly append such groups onto core scaffolds. Here we report a general strategy to harness the embedded potential energy of effectively spring-loaded C–C and C–N bonds with the most oft-encountered nucleophiles in pharmaceutical chemistry, amines. Strain-release amination can diversify a range of substrates with a multitude of desirable bioisosteres at both the early and late stages of a synthesis. The technique has also been applied to peptide labeling and bioconjugation. PMID:26816372
Ramses-GPU: Second-order MUSCL-Hancock finite volume fluid solver
NASA Astrophysics Data System (ADS)
Kestener, Pierre
2017-10-01
RamsesGPU is a reimplementation of RAMSES (ascl:1011.007) that drops the adaptive mesh refinement (AMR) features in order to optimize 3D uniform-grid algorithms for modern graphics processing units (GPUs), providing an efficient software package for astrophysics applications that do not need AMR but do require a very large number of integration time steps. RamsesGPU provides a very efficient C++/CUDA/MPI implementation of a second-order MUSCL-Hancock finite volume fluid solver for compressible hydrodynamics, as well as a magnetohydrodynamics solver based on the constrained transport technique. Other useful modules include static gravity, dissipative terms (viscosity, resistivity), and a forcing source term for turbulence studies; special care was taken to enhance parallel input/output performance by using state-of-the-art libraries such as HDF5 and Parallel-NetCDF.
The new era of cardiac surgery: hybrid therapy for cardiovascular disease.
Solenkova, Natalia V; Umakanthan, Ramanan; Leacche, Marzia; Zhao, David X; Byrne, John G
2010-11-01
Surgical therapy for cardiovascular disease carries excellent long-term outcomes but it is relatively invasive. With the development of new devices and techniques, modern cardiovascular surgery is trending toward less invasive approaches, especially for patients at high risk for traditional open heart surgery. A hybrid strategy combines traditional surgical treatments performed in the operating room with treatments traditionally available only in the catheterization laboratory with the goal of offering patients the best available therapy for any set of cardiovascular diseases. Examples of hybrid procedures include hybrid coronary artery bypass grafting, hybrid valve surgery and percutaneous coronary intervention, hybrid endocardial and epicardial atrial fibrillation procedures, and hybrid coronary artery bypass grafting/carotid artery stenting. This multidisciplinary approach requires strong collaboration between cardiac surgeons, vascular surgeons, and interventional cardiologists to obtain optimal patient outcomes.
Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool
NASA Technical Reports Server (NTRS)
Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark
2011-01-01
A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) is developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to best follow guidance commands while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with the required design specifications. "Tower Clearance" and "Load Relief" designs have been achieved for the liftoff and maximum dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in frequency-domain Monte Carlo analyses using ASAT.
Axelrod, David A; Vagefi, Parsia A; Roberts, John P
2015-08-01
The liver transplant allocation system has evolved into a "sickest-first" ranking system based on objective criteria. Yet organs continue to be distributed first within OPOs and regions that are largely based on historical practice patterns related to kidney transplantation and were never designed to minimize waitlist deaths or equalize opportunity for liver transplantation. The current proposal is a move to enhance survival through the application of modern mathematical techniques to optimize liver distribution. Like MELD-based allocation, it will never be perfect and should be continually evaluated and revised. However, the disparity in access, which favors those residing in or able to travel to privileged areas to the detriment of patients dying on the list in underserved areas, is simply not defensible in 2015.
Classification without labels: learning from mixed samples in high energy physics
NASA Astrophysics Data System (ADS)
Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse
2017-10-01
Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.
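A minimal sketch of the CWoLa idea (not the paper's quark/gluon benchmark) is given below: a classifier is trained only on which mixture each event came from, yet it ends up separating signal from background. The 1D Gaussian feature and the 0.7/0.3 signal fractions are arbitrary toy choices.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def draw(n, f_signal):
    """Mixed sample: signal ~ N(+1, 1), background ~ N(-1, 1); f_signal is the
    (unknown in practice) signal fraction. Toy 1D feature for illustration."""
    n_sig = rng.binomial(n, f_signal)
    x = np.concatenate([rng.normal(+1.0, 1.0, n_sig), rng.normal(-1.0, 1.0, n - n_sig)])
    truth = np.concatenate([np.ones(n_sig), np.zeros(n - n_sig)])
    return x.reshape(-1, 1), truth

# Two mixtures with different (and unspecified to the classifier) signal fractions.
x_a, _ = draw(10000, 0.7)
x_b, _ = draw(10000, 0.3)

# CWoLa: label events by *which mixture* they came from, not by signal/background.
X = np.vstack([x_a, x_b])
mix_label = np.concatenate([np.ones(len(x_a)), np.zeros(len(x_b))])
clf = GradientBoostingClassifier(max_depth=3, n_estimators=100).fit(X, mix_label)

# Evaluate against true signal/background labels (available only in this toy setup).
X_test, truth_test = draw(5000, 0.5)
scores = clf.predict_proba(X_test)[:, 1]
print("signal-vs-background AUC of the mixture-trained classifier:",
      round(roc_auc_score(truth_test, scores), 3))
```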
Integrated control-system design via generalized LQG (GLQG) theory
NASA Technical Reports Server (NTRS)
Bernstein, Dennis S.; Hyland, David C.; Richter, Stephen; Haddad, Wassim M.
1989-01-01
Thirty years of control systems research have produced an enormous body of theoretical results in feedback synthesis. Yet such results see relatively little practical application, and there remains an unsettling gap between classical single-loop techniques (Nyquist, Bode, root locus, pole placement) and modern multivariable approaches (LQG and H-infinity theory). Large-scale, complex systems, such as high-performance aircraft and flexible space structures, now demand efficient, reliable design of multivariable feedback controllers which optimally trade off performance against modeling accuracy, bandwidth, sensor noise, actuator power, and control-law complexity. A methodology is described which encompasses numerous practical design constraints within a single unified formulation. The approach, which is based upon coupled systems of modified Riccati and Lyapunov equations, encompasses time-domain linear-quadratic-Gaussian theory and frequency-domain H-infinity theory, as well as classical objectives such as gain and phase margin via the Nyquist circle criterion. In addition, this approach encompasses the optimal projection approach to reduced-order controller design. The current status of the overall theory is reviewed, including both continuous-time and discrete-time (sampled-data) formulations.
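As a concrete instance of the Riccati-equation machinery underlying LQG design, the sketch below (not from the report) solves a standard continuous-time LQR problem, the deterministic half of LQG, for a hypothetical double-integrator plant using SciPy's algebraic Riccati solver. The weighting matrices are arbitrary illustrative choices.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical double-integrator plant: x = [position, velocity], u = force.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Illustrative LQR weights: penalize position error and control effort.
Q = np.diag([10.0, 1.0])
R = np.array([[0.1]])

# Solve the continuous algebraic Riccati equation  A'P + PA - P B R^{-1} B' P + Q = 0.
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain  u = -K x,  with  K = R^{-1} B' P.
K = np.linalg.solve(R, B.T @ P)
print("LQR gain K:", K)

# Closed-loop eigenvalues should have negative real parts (stable).
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```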
Evaluation of a Multicore-Optimized Implementation for Tomographic Reconstruction
Agulleiro, Jose-Ignacio; Fernández, José Jesús
2012-01-01
Tomography allows elucidation of the three-dimensional structure of an object from a set of projection images. In life sciences, electron microscope tomography is providing invaluable information about the cell structure at a resolution of a few nanometres. Here, large images are required to combine wide fields of view with high resolution requirements. The computational complexity of the algorithms along with the large image size then turns tomographic reconstruction into a computationally demanding problem. Traditionally, high-performance computing techniques have been applied to cope with such demands on supercomputers, distributed systems and computer clusters. In the last few years, the trend has turned towards graphics processing units (GPUs). Here we present a detailed description and a thorough evaluation of an alternative approach that relies on exploitation of the power available in modern multicore computers. The combination of single-core code optimization, vector processing, multithreading and efficient disk I/O operations succeeds in providing fast tomographic reconstructions on standard computers. The approach turns out to be competitive with the fastest GPU-based solutions thus far. PMID:23139768
Feasibility of Tactical Air Delivery Resupply Using Gliders
2016-12-01
... "Sparrow," using modern design and manufacturing techniques including AutoCAD, 3D printing, laser cutting and CorelDraw, and conducting field testing ... the desired point(s) of impact due to the atmospheric three-dimensional (3D) wind and density field encountered by the descending load under canopy.
NASA Astrophysics Data System (ADS)
Huang, Jian; Wei, Kai; Jin, Kai; Li, Min; Zhang, YuDong
2018-06-01
The Sodium laser guide star (LGS) plays a key role in modern astronomical Adaptive Optics Systems (AOSs). The spot size and photon return of the Sodium LGS depend strongly on the laser power density distribution at the Sodium layer and thus affect the performance of the AOS. The power density distribution is degraded by turbulence in the uplink path, launch system aberrations, the beam quality of the laser, and so forth. Even without any aberrations, the TEM00 Gaussian type is still not the optimal power density distribution to obtain the best balance between the measurement error and temporal error. To optimize and control the LGS power density distribution at the Sodium layer to an expected distribution type, a method that combines pre-correction and beam-shaping is proposed. A typical result shows that under strong turbulence (Fried parameter (r0) of 5 cm) and for a quasi-continuous wave Sodium laser (power (P) of 15 W), in the best case, our method can effectively optimize the distribution from the Gaussian type to the "top-hat" type and enhance the photon return flux of the Sodium LGS; at the same time, the total error of the AOS is decreased by 36% with our technique for a high power laser and poor seeing.
Real-time CO2 sensor for the optimal control of electronic EGR system
NASA Astrophysics Data System (ADS)
Kim, Gwang-jung; Choi, Byungchul; Choi, Inchul
2013-12-01
In modern diesel engines, EGR (Exhaust Gas Recirculation) is an important technique for reducing nitrogen oxide (NOx) emissions. This paper describes the development and experimental results of a fiber-optic sensor using absorption at a 2.7 μm wavelength to quantify, in real time, the CO2 concentration that is the primary variable of the EGR rate (CO2 in the exhaust gas versus CO2 in the intake gas, %). A real-time laser absorption method was developed using a DFB (distributed feedback) diode laser and a waveguide to enable the optimal design and control of the electronic EGR system required by the `Euro-6' and `Tier 4 Final' NOx emission regulations. While EGR is effective in reducing NOx significantly, the amount of HC and CO in the exhaust gas increases if the EGR rate is not controlled according to driving conditions. It is therefore important to recirculate an appropriate amount of exhaust gas under the operating conditions that generate large amounts of NOx. In this study, we evaluated the basic characteristics and functions of our optical sensor in order to find the optimal design conditions, and demonstrated the CO2 measurement speed, accuracy and linearity in bench-scale experiments under conditions similar to a real engine.
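A hedged illustration of the underlying absorption measurement is sketched below: the Beer-Lambert law converts a transmitted-intensity ratio into a CO2 number density and volume fraction. The cross-section, path length and total gas density are placeholder values, not parameters of the sensor described above, which would be calibrated against reference gas mixtures.

```python
import math

# Beer-Lambert law: I = I0 * exp(-sigma * N * L), so the number density is
#   N = -ln(I / I0) / (sigma * L)
# The values below are assumptions for illustration only.
sigma = 5.0e-19   # effective absorption cross-section near 2.7 um [cm^2], assumed
L = 10.0          # optical path length [cm], assumed

def co2_number_density(i_transmitted, i_reference):
    """Infer CO2 number density [molecules/cm^3] from a transmission measurement."""
    return -math.log(i_transmitted / i_reference) / (sigma * L)

def co2_percent(n_co2, total_number_density=2.5e19):
    """Convert number density to a volume fraction, assuming a total gas number
    density typical of roughly 1 atm and room temperature."""
    return 100.0 * n_co2 / total_number_density

i0, i = 1.00, 0.88          # example reference and transmitted intensities
n = co2_number_density(i, i0)
print(f"CO2 ~ {n:.3e} cm^-3 = {co2_percent(n):.2f} %")
```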
On the Optimization of Aerospace Plane Ascent Trajectory
NASA Astrophysics Data System (ADS)
Al-Garni, Ahmed; Kassem, Ayman Hamdy
A hybrid heuristic optimization technique based on genetic algorithms and particle swarm optimization has been developed and tested for trajectory optimization problems with multiple constraints and a multi-objective cost function. The technique is used to calculate control settings for two types of ascent trajectories (constant dynamic pressure and minimum-fuel-minimum-heat) for a two-dimensional model of an aerospace plane. A thorough statistical analysis of the hybrid technique is carried out to compare it with both basic genetic algorithms and particle swarm optimization with respect to convergence and execution time. Genetic algorithm optimization showed better execution-time performance, while particle swarm optimization showed better convergence. The hybrid technique, benefiting from both, showed superior and robust performance, striking a compromise between convergence behavior and execution time.
Modern Education in China. Bulletin, 1919, No. 44
ERIC Educational Resources Information Center
Edmunds, Charles K.
1919-01-01
The Chinese conception of life's values is so different from that of western peoples that they have failed to develop modern technique and scientific knowledge. Now that they have come to see the value of these, rapid and fundamental changes are taking place. When modern scientific knowledge is added to the skill which the Chinese already have in…
Dynamic positioning configuration and its first-order optimization
NASA Astrophysics Data System (ADS)
Xue, Shuqiang; Yang, Yuanxi; Dang, Yamin; Chen, Wu
2014-02-01
Traditional geodetic network optimization deals with static and discrete control points. The modern space geodetic network is, on the other hand, composed of moving control points in space (satellites) and on the Earth (ground stations). The network configuration composed of these facilities is essentially dynamic and continuous. Moreover, besides the position parameter which needs to be estimated, other geophysical information or signals can also be extracted from the continuous observations. The dynamic (continuous) configuration of the space network determines whether a particular frequency of signals can be identified by this system. In this paper, we employ the functional analysis and graph theory to study the dynamic configuration of space geodetic networks, and mainly focus on the optimal estimation of the position and clock-offset parameters. The principle of the D-optimization is introduced in the Hilbert space after the concept of the traditional discrete configuration is generalized from the finite space to the infinite space. It shows that the D-optimization developed in the discrete optimization is still valid in the dynamic configuration optimization, and this is attributed to the natural generalization of least squares from the Euclidean space to the Hilbert space. Then, we introduce the principle of D-optimality invariance under the combination operation and rotation operation, and propose some D-optimal simplex dynamic configurations: (1) (Semi) circular configuration in 2-dimensional space; (2) the D-optimal cone configuration and D-optimal helical configuration which is close to the GPS constellation in 3-dimensional space. The initial design of GPS constellation can be approximately treated as a combination of 24 D-optimal helixes by properly adjusting the ascending node of different satellites to realize a so-called Walker constellation. In the case of estimating the receiver clock-offset parameter, we show that the circular configuration, the symmetrical cone configuration and helical curve configuration are still D-optimal. It shows that the given total observation time determines the optimal frequency (repeatability) of moving known points and vice versa, and one way to improve the repeatability is to increase the rotational speed. Under the Newton's law of motion, the frequency of satellite motion determines the orbital altitude. Furthermore, we study three kinds of complex dynamic configurations, one of which is the combination of D-optimal cone configurations and a so-called Walker constellation composed of D-optimal helical configuration, the other is the nested cone configuration composed of n cones, and the last is the nested helical configuration composed of n orbital planes. It shows that an effective way to achieve high coverage is to employ the configuration composed of a certain number of moving known points instead of the simplex configuration (such as D-optimal helical configuration), and one can use the D-optimal simplex solutions or D-optimal complex configurations in any combination to achieve powerful configurations with flexile coverage and flexile repeatability. Alternately, how to optimally generate and assess the discrete configurations sampled from the continuous one is discussed. 
The proposed configuration optimization framework takes into account the well-known regular polygons (such as the equilateral triangle and the square) in two-dimensional space and the regular polyhedra (regular tetrahedron, cube, regular octahedron, regular icosahedron, and regular dodecahedron). It shows that the conclusions drawn by the proposed technique are more general and no longer limited by particular sampling schemes. Using the conditional equation of the D-optimal nested helical configuration, the relevant issues of GNSS constellation optimization are solved, and examples based on the GPS constellation are given to verify the validity of the newly proposed optimization technique. The proposed technique is potentially helpful in the maintenance and quadratic optimization of a single GNSS whose orbital inclination and orbital altitude change under precession, as well as in optimally nesting GNSSs to achieve globally homogeneous coverage of the Earth.
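The D-optimality criterion used throughout can be illustrated with a small numerical sketch (not from the paper): for a 2D positioning problem, the determinant of the information matrix A^T A is computed for an evenly spread versus a clustered set of known points, showing why symmetric configurations such as the equilateral triangle are favored. The geometries below are toy examples.

```python
import numpy as np

def info_det(azimuths_deg):
    """D-optimality measure for a 2D positioning configuration: determinant of the
    normal-equation (information) matrix A^T A, where each row of A is the unit
    direction vector from the unknown point to one known point."""
    az = np.radians(np.asarray(azimuths_deg, dtype=float))
    A = np.column_stack([np.cos(az), np.sin(az)])
    return np.linalg.det(A.T @ A)

# Three known points spread evenly (equilateral-triangle geometry) ...
even = [0.0, 120.0, 240.0]
# ... versus three known points bunched into one quadrant.
clustered = [0.0, 20.0, 40.0]

print("det(A^T A), even spread :", round(info_det(even), 4))
print("det(A^T A), clustered   :", round(info_det(clustered), 4))
```

The evenly spread configuration yields the larger determinant, i.e. the smaller confidence region for the estimated position, which is the sense in which the symmetric simplex configurations discussed above are D-optimal.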
Strength training for the warfighter.
Kraemer, William J; Szivak, Tunde K
2012-07-01
Optimizing strength training for the warfighter is challenged by past training philosophies that no longer serve the modern warfighter facing the "anaerobic battlefield." Training approaches for integration of strength with other needed physical capabilities have been shown to require a periodization model that has the flexibility for changes and is able to adapt to ever-changing circumstances affecting the quality of workouts. Additionally, sequencing of workouts to limit over-reaching and development of overtraining syndromes that end in loss of duty time and injury are paramount to long-term success. Allowing adequate time for rest and recovery and recognizing the negative influences of extreme exercise programs and excessive endurance training will be vital in moving physical training programs into a more modern perspective as used by elite strength-power anaerobic athletes in sports today. Because the warfighter is an elite athlete, it is time that training approaches that are scientifically based are updated within the military to match the functional demands of modern warfare and are given greater credence and value at the command levels. A needs analysis, development of periodized training modules, and individualization of programs are needed to optimize the strength of the modern warfighter. We now have the knowledge, professional coaches and nonprofit organization certifications with continuing education units, and modern training technology to allow this to happen. Ultimately, it only takes command decisions and implementation to make this possible.
A profile of the demographics and training characteristics of professional modern dancers.
Weiss, David S; Shah, Selina; Burchette, Raoul J
2008-01-01
Modern dancers are a unique group of artists, performing a diverse repertoire in dance companies of various sizes. In this study, 184 professional modern dancers in the United States (males N=49, females N=135), including members of large and small companies as well as freelance dancers, were surveyed regarding their demographics and training characteristics. The mean age of the dancers was 30.1 +/- 7.3 years, and they had danced professionally for 8.9 +/- 7.2 years. The average Body Mass Index (BMI) was 23.6 +/- 2.4 for males and 20.5 +/- 1.7 for females. Females had started taking dance class earlier (age 6.5 +/- 4.2 years) as compared to males (age 15.6 +/- 6.2 years). Females were more likely to have begun their training in ballet, while males more often began with modern classes (55% and 51% respectively, p < 0.0001). The professional modern dancers surveyed spent 8.3 +/- 6.0 hours in class and 17.2 +/- 12.6 hours in rehearsal each week. Eighty percent took modern technique class and 67% reported that they took ballet technique class. The dancers who specified what modern technique they studied (N=84) reported between two and four different techniques. The dancers also participated in a multitude of additional exercise regimens for a total of 8.2 +/- 6.6 hours per week, with the most common types being Pilates, yoga, and upper body weightlifting. The dancers wore many different types of footwear, depending on the style of dance being performed. For modern dance alone, dancers wore 12 different types of footwear. Reflecting the diversity of the dancers and companies surveyed, females reported performing for 23.3 +/- 14.0 weeks (range: 2-52 weeks) per year; males reported performing 20.4 +/- 13.9 weeks (range: 1-40) per year. Only 18% of the dancers did not have any health insurance, with 54% having some type of insurance provided by their employer. However, 23% of the dancers purchased their own insurance, and 22% had insurance provided by their families. Only 16% of dancers reported that they had Workers' Compensation coverage, despite the fact that they were all professionals, including many employed by major modern dance companies across the United States. It is concluded that understanding the training profile of the professional modern dancer should assist healthcare providers in supplying appropriate medical care for these performers.
Optimal ventilation of the anesthetized pediatric patient.
Feldman, Jeffrey M
2015-01-01
Mechanical ventilation of the pediatric patient is challenging because small changes in delivered volume can be a significant fraction of the intended tidal volume. Anesthesia ventilators have traditionally been poorly suited to delivering small tidal volumes accurately, and pressure-controlled ventilation has become used commonly when caring for pediatric patients. Modern anesthesia ventilators are designed to deliver small volumes accurately to the patient's airway by compensating for the compliance of the breathing system and delivering tidal volume independent of fresh gas flow. These technology advances provide the opportunity to implement a lung-protective ventilation strategy in the operating room based upon control of tidal volume. This review will describe the capabilities of the modern anesthesia ventilator and the current understanding of lung-protective ventilation. An optimal approach to mechanical ventilation for the pediatric patient is described, emphasizing the importance of using bedside monitors to optimize the ventilation strategy for the individual patient.
Looking ahead in systems engineering
NASA Technical Reports Server (NTRS)
Feigenbaum, Donald S.
1966-01-01
Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.
[Aerobic methylobacteria as promising objects of modern biotechnology].
Doronina, N V; Toronskava, L; Fedorov, D N; Trotsenko, Yu A
2015-01-01
The experimental data of the past decade concerning the metabolic peculiarities of aerobic methylobacteria and the prospects for their use in different fields of modern biotechnology, including genetic engineering techniques, have been summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, B; Southern Medical University, Guangzhou, Guangdong; Tian, Z
Purpose: While compressed sensing-based cone-beam CT (CBCT) iterative reconstruction techniques have demonstrated a tremendous capability for reconstructing high-quality images from undersampled noisy data, their long computation time still hinders wide application in routine clinical practice. The purpose of this study is to develop a reconstruction framework that employs modern consensus optimization techniques to achieve CBCT reconstruction on a multi-GPU platform for improved computational efficiency. Methods: The total projection data were evenly distributed to multiple GPUs. Each GPU performed reconstruction using its own projection data with a conventional total variation regularization approach to ensure image quality. In addition, the solutions from the GPUs were subject to a consistency constraint that they should be identical. We solved the optimization problem, with all constraints considered rigorously, using an alternating direction method of multipliers (ADMM) algorithm. The reconstruction framework was implemented using OpenCL on a platform with two Nvidia GTX590 GPU cards, each with two GPUs. We studied the performance of our method and demonstrated its advantages through a simulation case with an NCAT phantom and an experimental case with a Catphan phantom. Results: Compared with CBCT images reconstructed using the conventional FDK method with full projection datasets, our proposed method achieved comparable image quality with about one third of the projections. The computation time on the multi-GPU platform was ∼55 s and ∼35 s in the two cases, respectively, achieving a speedup factor of ∼3.0 compared with single-GPU reconstruction. Conclusion: We have developed a consensus ADMM-based CBCT reconstruction method that enables reconstruction on a multi-GPU platform. The achieved efficiency makes this method clinically attractive.
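A minimal consensus-ADMM sketch (not the authors' OpenCL implementation) is given below: a least-squares problem is split across several workers standing in for GPUs, each worker solves its local subproblem, and the dual updates drive all local copies toward a common consensus solution. The toy problem size, rho and noise level are assumptions, and the real method replaces the local quadratic term with a TV-regularized CBCT subproblem.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy problem: recover x_true from data split across 4 "workers" (GPU stand-ins).
n, m, workers, rho = 20, 50, 4, 1.0
x_true = rng.normal(size=n)
A = [rng.normal(size=(m, n)) for _ in range(workers)]
b = [Ai @ x_true + 0.01 * rng.normal(size=m) for Ai in A]

# Consensus ADMM: each worker minimizes 0.5*||A_i x_i - b_i||^2 while all local
# copies x_i are driven to agree with the consensus variable z.
x = [np.zeros(n) for _ in range(workers)]
u = [np.zeros(n) for _ in range(workers)]
z = np.zeros(n)

# Local normal-equation matrices (A_i^T A_i + rho I).
lhs = [Ai.T @ Ai + rho * np.eye(n) for Ai in A]

for _ in range(100):
    for i in range(workers):                                   # local (per-GPU) updates
        rhs = A[i].T @ b[i] + rho * (z - u[i])
        x[i] = np.linalg.solve(lhs[i], rhs)
    z = np.mean([x[i] + u[i] for i in range(workers)], axis=0)  # consensus update
    for i in range(workers):                                    # dual updates
        u[i] += x[i] - z

print("relative error:", np.linalg.norm(z - x_true) / np.linalg.norm(x_true))
```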
A Language for Specifying Compiler Optimizations for Generic Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcock, Jeremiah J.
2007-01-01
Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization: they are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
Impact of the macroeconomic factors on university budgeting in the US and Russia
NASA Astrophysics Data System (ADS)
Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly
2017-10-01
This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in data science and machine learning have made it possible to apply automated techniques to problems ranging from genetic engineering and particle physics to sociology and economics. This paper is a first step toward a robust toolkit that will help universities withstand macroeconomic challenges using modern predictive analytics techniques.
Identification of Microorganisms by Modern Analytical Techniques.
Buszewski, Bogusław; Rogowska, Agnieszka; Pomastowski, Paweł; Złoch, Michał; Railean-Plugaru, Viorica
2017-11-01
Rapid detection and identification of microorganisms is a challenging and important aspect in a wide range of fields, from medical to industrial, affecting human lives. Unfortunately, classical methods of microorganism identification are based on time-consuming and labor-intensive approaches. Screening techniques require the rapid and cheap grouping of bacterial isolates; however, modern bioanalytics demand comprehensive bacterial studies at a molecular level. Modern approaches for the rapid identification of bacteria use molecular techniques, such as 16S ribosomal RNA gene sequencing based on polymerase chain reaction or electromigration, especially capillary zone electrophoresis and capillary isoelectric focusing. However, there are still several challenges with the analysis of microbial complexes using electromigration technology, such as uncontrolled aggregation and/or adhesion to the capillary surface. Thus, an approach using capillary electrophoresis of microbial aggregates with UV and matrix-assisted laser desorption ionization time-of-flight MS detection is presented.
Barac, Bosko
2002-05-01
Modern neurology has completely changed as a science and medical discipline in its concepts of the etiologies of neurological diseases and in its capabilities for their diagnosis, management, rehabilitation and prevention. Advances in the neurological sciences have produced rapid growth in the number of neurologists, new subspecialties and neurological institutions worldwide, raising questions about their possible application given financial restrictions in many countries. Neurology in Croatia has followed modern tendencies in the world: in line with its humanistic tradition, an orientation toward the patient appeared early. From this experience grew a concern for the optimal organization of neurological services, later pursued in the Research Group on the Organization and Delivery of Neurological Services, founded within the World Federation of Neurology. The main activities and the Recommendations related to Neurology in Public Health are described, together with the proposed levels of organization of neurological services, aiming at optimal and rational neurological care. Problems of international collaboration on cost-effectiveness in neurology are emphasized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas; Muldoon, Frank; Henry, Stephen Michael
In order to effectively plan the management and modernization of their large and diverse fleets of vehicles, Program Executive Office Ground Combat Systems (PEO GCS) and Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet - respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This paper contains a thorough documentation of the terminology, parameters, variables, and constraints that comprise the fleet management mixed integer linear programming (MILP) mathematical formulation. This paper, which is an update to the original CPAT formulation document published in 2015 (SAND2015-3487), covers the formulation of important new CPAT features.
Impact of archeomagnetic field model data on modern era geomagnetic forecasts
NASA Astrophysics Data System (ADS)
Tangborn, Andrew; Kuang, Weijia
2018-03-01
A series of geomagnetic data assimilation experiments have been carried out to demonstrate the impact of assimilating archeomagnetic data via the CALS3k.4 geomagnetic field model from the period between 10 and 1590 CE. The assimilation continues with the gufm1 model from 1590 to 1990 and CM4 model from 1990 to 2000 as observations, and comparisons between these models and the geomagnetic forecasts are used to determine an optimal maximum degree for the archeomagnetic observations, and to independently estimate errors for these observations. These are compared with an assimilation experiment that uses the uncertainties provided with CALS3k.4. Optimal 20 year forecasts in 1990 are found when the Gauss coefficients up to degree 3 are assimilated. In addition we demonstrate how a forecast and observation bias correction scheme could be used to reduce bias in modern era forecasts. Initial experiments show that this approach can reduce modern era forecast biases by as much as 50%.
Optimization of a Thermodynamic Model Using a Dakota Toolbox Interface
NASA Astrophysics Data System (ADS)
Cyrus, J.; Jafarov, E. E.; Schaefer, K. M.; Wang, K.; Clow, G. D.; Piper, M.; Overeem, I.
2016-12-01
Scientific modeling of the Earth's physical processes is an important driver of modern science. The behavior of these models is governed by a set of input parameters, and it is crucial to choose accurate input parameters that also preserve the physics being simulated. To simulate real-world processes effectively, the model's output must be close to the observed measurements. To achieve this, input parameters are tuned until the objective function, the error between the simulated outputs and the observed measurements, is minimized. We developed an auxiliary package that serves as a Python interface between the user and DAKOTA. The package makes it easy to conduct parameter space explorations, parameter optimizations and sensitivity analyses while tracking and storing results in a database. Performing these analyses via a Python library also allows users to combine analysis techniques, for example finding an approximate equilibrium with optimization and then immediately exploring the space around it. We used the interface to calibrate input parameters for a heat flow model commonly used in permafrost science. We performed optimization on the first three layers of the permafrost model, each with two thermal conductivity coefficients as input parameters. The parameter space explorations indicate that the objective function does not always have a unique minimum. We found that gradient-based optimization works best for objective functions with a single minimum; otherwise, we employ more advanced Dakota methods such as genetic optimization and mesh-based convergence to find the optimal input parameters. We were able to recover six initially unknown thermal conductivity parameters to within 2% of their known values. Our initial tests indicate that the developed interface to the Dakota toolbox can be used to perform analysis and optimization on a `black box' scientific model more efficiently than using Dakota alone.
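The calibration loop that such an interface automates can be sketched as follows (a stand-in, not the actual Dakota-driven package): a toy heat-flow-like model with two thermal-conductivity-like parameters is fitted to synthetic observations by minimizing the sum of squared errors. The model form, parameter values and optimizer choice are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the permafrost heat-flow model: predicts temperature at a few
# depths from two thermal-conductivity-like parameters. Purely illustrative.
depths = np.array([0.5, 1.0, 2.0, 4.0])

def toy_model(params):
    k_frozen, k_thawed = params
    return -5.0 * np.exp(-depths / k_frozen) + 0.5 * depths / k_thawed

# "Observations" generated from known parameters plus noise, so recovery can be checked.
true_params = np.array([1.3, 2.1])
rng = np.random.default_rng(3)
obs = toy_model(true_params) + 0.02 * rng.normal(size=depths.size)

def objective(params):
    """Sum of squared errors between model output and observations."""
    return float(np.sum((toy_model(params) - obs) ** 2))

result = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead")
print("recovered parameters:", result.x, "objective:", result.fun)
```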
Application of Modern Design of Experiments to CARS Thermometry in a Model Scramjet Engine
NASA Technical Reports Server (NTRS)
Danehy, P. M.; DeLoach, R.; Cutler, A. D.
2002-01-01
We have applied formal experiment design and analysis to optimize the measurement of temperature in a supersonic combustor at NASA Langley Research Center. We used the coherent anti-Stokes Raman spectroscopy (CARS) technique to map the temperature distribution in the flowfield downstream of a 1160 K, Mach 2 freestream into which supersonic hydrogen fuel is injected at an angle of 30 degrees. CARS thermometry is inherently a single-point measurement technique; it was used to map the flow by translating the measurement volume through the flowfield. The method known as "Modern Design of Experiments" (MDOE) was used to estimate the data volume required, design the test matrix, perform the experiment and analyze the resulting data. MDOE allowed us to match the volume of data acquired to the precision requirements of the customer. Furthermore, one aspect of MDOE, known as response surface methodology, allowed us to develop precise maps of the flowfield temperature, enabling interpolation between measurement points. An analytic function in two spatial variables was fit to the data from a single measurement plane. Fitting with a Cosine Series Bivariate Function allowed the mean temperature to be mapped with 95% confidence interval half-widths of +/- 30 Kelvin, comfortably meeting the +/- 50 Kelvin requirement specified prior to performing the experiments. We estimate that applying MDOE to the present experiment saved a factor of 5 in data volume acquired, compared to experiments executed in the traditional manner. Furthermore, the precision requirements could have been met with less than half the data acquired.
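The response-surface step can be illustrated with a short sketch (not the actual flight data or fitting code): a truncated bivariate cosine series is fitted to scattered temperature measurements by linear least squares, after which the surface can be evaluated anywhere in the measurement plane. The series order and synthetic data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for single-plane CARS data: temperature T measured at
# scattered (x, y) points, both normalized to [0, 1]; not the actual data.
n = 200
x, y = rng.random(n), rng.random(n)
T = 1200 + 300 * np.cos(np.pi * x) * np.cos(2 * np.pi * y) + 20 * rng.normal(size=n)

def cosine_design(x, y, order=3):
    """Design matrix for a truncated bivariate cosine series:
    T(x, y) ~ sum_{i,j<=order} c_ij * cos(i*pi*x) * cos(j*pi*y)."""
    cols = [np.cos(i * np.pi * x) * np.cos(j * np.pi * y)
            for i in range(order + 1) for j in range(order + 1)]
    return np.column_stack(cols)

A = cosine_design(x, y)
coef, *_ = np.linalg.lstsq(A, T, rcond=None)

# The fitted surface can now be evaluated anywhere in the plane, which is what
# permits interpolation between measurement points.
xg, yg = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
T_fit = cosine_design(xg.ravel(), yg.ravel()) @ coef
print(T_fit.reshape(xg.shape).round(1))
```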
MO-B-BRB-02: Maintain the Quality of Treatment Planning for Time-Constraint Cases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, J.
The radiotherapy treatment planning process has evolved over the years with innovations in treatment planning, treatment delivery and imaging systems. Treatment modality and simulation technologies are also rapidly improving and affecting the planning process. For example, Image-guided-radiation-therapy has been widely adopted for patient setup, leading to margin reduction and isocenter repositioning after simulation. Stereotactic Body radiation therapy (SBRT) and Radiosurgery (SRS) have gradually become the standard of care for many treatment sites, which demand a higher throughput for the treatment plans even if the number of treatments per day remains the same. Finally, simulation, planning and treatment are traditionally sequential events. However, with emerging adaptive radiotherapy, they are becoming more tightly intertwined, leading to iterative processes. Enhanced efficiency of planning is therefore becoming more critical and poses serious challenges to the treatment planning process; Lean Six Sigma approaches are being utilized increasingly to balance the competing needs for speed and quality. In this symposium we will discuss the treatment planning process and illustrate effective techniques for managing workflow. Topics will include: Planning techniques: (a) beam placement, (b) dose optimization, (c) plan evaluation, (d) export to RVS. Planning workflow: (a) import images, (b) image fusion, (c) contouring, (d) plan approval, (e) plan check, (f) chart check, (g) sequential and iterative process. Influence of upstream and downstream operations: (a) simulation, (b) immobilization, (c) motion management, (d) QA, (e) IGRT, (f) treatment delivery, (g) SBRT/SRS, (h) adaptive planning. Reduction of delay between planning steps with Lean systems due to (a) communication, (b) limited resources, (c) contouring, (d) plan approval, (e) treatment. Optimizing planning processes: (a) contour validation, (b) consistent planning protocol, (c) protocol/template sharing, (d) semi-automatic plan evaluation, (e) quality checklist for error prevention, (f) iterative process, (g) balance of speed and quality. Learning Objectives: Gain familiarity with the workflow of the modern treatment planning process. Understand the scope and challenges of managing modern treatment planning processes. Gain familiarity with Lean Six Sigma approaches and their implementation in the treatment planning workflow.
ORACLS: A system for linear-quadratic-Gaussian control law design
NASA Technical Reports Server (NTRS)
Armstrong, E. S.
1978-01-01
A modern control theory design package (ORACLS) for constructing controllers and optimal filters for systems modeled by linear time-invariant differential or difference equations is described. Numerical linear-algebra procedures are used to implement the linear-quadratic-Gaussian (LQG) methodology of modern control theory. Algorithms are included for computing eigensystems of real matrices, the relative stability of a matrix, factored forms for nonnegative definite matrices, the solutions and least squares approximations to the solutions of certain linear matrix algebraic equations, the controllability properties of a linear time-invariant system, and the steady state covariance matrix of an open-loop stable system forced by white noise. Subroutines are provided for solving both the continuous and discrete optimal linear regulator problems with noise free measurements and the sampled-data optimal linear regulator problem. For measurement noise, duality theory and the optimal regulator algorithms are used to solve the continuous and discrete Kalman-Bucy filter problems. Subroutines are also included which give control laws causing the output of a system to track the output of a prescribed model.
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes their use to examine the performance of given testing procedures or associations between investigated factors to be difficult. We turn our focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
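A small Monte Carlo sketch (not from the paper) of the EPV/ROC connection follows: for a one-sided Z-test, the expected p-value under the alternative is estimated directly and compared with P(Z_H0 > Z_H1), i.e. one minus the AUC separating the null and alternative test-statistic distributions. The effect size and sample size are arbitrary choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# One-sided Z-test of H0: mu = 0 versus H1: mu > 0, with n observations per sample.
n, mu_alt, n_sim = 25, 0.4, 20000

# Simulate test statistics under the alternative and convert them to p-values.
z_alt = rng.normal(mu_alt * np.sqrt(n), 1.0, n_sim)
p_values = norm.sf(z_alt)                      # p = P(Z > z_obs | H0)
epv = p_values.mean()                          # expected p-value under H1

# ROC view: the EPV equals P(Z_H0 > Z_H1) for independent draws, i.e. one minus
# the AUC of the test statistic separating H0 from H1.
z_null = rng.normal(0.0, 1.0, n_sim)
prob_null_exceeds_alt = (z_null[:, None] > z_alt[None, :1000]).mean()

print("EPV (Monte Carlo):      ", round(epv, 4))
print("P(Z_H0 > Z_H1) estimate:", round(prob_null_exceeds_alt, 4))
```

The two printed numbers agree up to Monte Carlo noise, which is the sense in which a smaller EPV corresponds to a better-separated (higher-AUC) decision rule.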
Clustering methods for the optimization of atomic cluster structure
NASA Astrophysics Data System (ADS)
Bagattini, Francesco; Schoen, Fabio; Tigli, Luca
2018-04-01
In this paper, we propose a revised global optimization method and apply it to large scale cluster conformation problems. In the 1990s, the so-called clustering methods were considered among the most efficient general purpose global optimization techniques; however, their usage has quickly declined in recent years, mainly due to the inherent difficulties of clustering approaches in large dimensional spaces. Inspired by the machine learning literature, we redesigned clustering methods in order to deal with molecular structures in a reduced feature space. Our aim is to show that by suitably choosing a good set of geometrical features coupled with a very efficient descent method, an effective optimization tool is obtained which is capable of finding, with a very high success rate, all known putative optima for medium size clusters without any prior information, both for Lennard-Jones and Morse potentials. The main result is that, beyond being a reliable approach, the proposed method, based on the idea of starting a computationally expensive deep local search only when it seems worth doing so, is capable of saving a huge number of local searches with respect to an analogous algorithm which does not employ a clustering phase. In this paper, we are not claiming the superiority of the proposed method compared to specific, refined, state-of-the-art procedures, but rather indicating a quite straightforward way to save local searches by means of a clustering scheme working in a reduced variable space, which might prove useful when included in many modern methods.
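The central idea, spending an expensive local search only when a start point is not already covered by an explored cluster in a reduced feature space, can be sketched as follows; the Lennard-Jones toy problem, the sorted-distance feature vector and the coverage threshold are assumptions for illustration, not the authors' algorithm.

```python
# Minimal sketch: skip the costly local descent whenever a new start point
# lies close (in a reduced feature space) to a previously explored minimum.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist

def lj_energy(x):
    r = pdist(x.reshape(-1, 3))
    return np.sum(4.0 * (r**-12 - r**-6))

def features(x):
    # Reduced, permutation/rotation-invariant descriptor: sorted pair distances.
    return np.sort(pdist(x.reshape(-1, 3)))

rng = np.random.default_rng(1)
n_atoms, n_starts, radius = 7, 200, 0.3      # 'radius' is an assumed coverage threshold
explored, best = [], (np.inf, None)
local_searches = 0

for _ in range(n_starts):
    x0 = rng.uniform(-1.2, 1.2, size=3 * n_atoms)
    f0 = features(x0)
    if any(np.linalg.norm(f0 - f) < radius * len(f0) ** 0.5 for f in explored):
        continue                              # already covered: save a local search
    res = minimize(lj_energy, x0, method="L-BFGS-B")
    local_searches += 1
    explored.append(features(res.x))
    if res.fun < best[0]:
        best = (res.fun, res.x)

print(f"local searches actually run: {local_searches} / {n_starts}")
print(f"lowest Lennard-Jones energy found: {best[0]:.4f}")
```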
Besseris, George J
2018-03-01
Generalized regression neural networks (GRNN) may act as crowdsourcing cognitive agents to screen small, dense and complex datasets. The concurrent screening and optimization of several complex physical and sensory traits of bread is developed using a structured Taguchi-type micro-mining technique. A novel product outlook is offered to industrial operations to cover separate aspects of smart product design, engineering and marketing. Four controlling factors were selected to be modulated directly on a modern production line: 1) the dough weight, 2) the proofing time, 3) the baking time, and 4) the oven zone temperatures. Concentrated experimental recipes were programmed using the Taguchi-type L9(3^4) OA-sampler to detect potentially non-linear multi-response tendencies. The fused behavior of the master-ranked bread characteristics was smart-sampled with GRNN-crowdsourcing and robust analysis. The combination of the oven zone temperatures was found to play a highly influential role in all investigated scenarios. Moreover, the oven zone temperatures and the dough weight appeared to be instrumental when attempting to synchronously adjust all four physical characteristics. The optimal oven-zone temperature setting for concurrent screening-and-optimization was found to be 270-240 °C. The optimized (median) responses for loaf weight, moisture, height, width, color, flavor, crumb structure, softness, and elasticity are: 782 g, 34.8%, 9.36 cm, 10.41 cm, 6.6, 7.2, 7.6, 7.3, and 7.0, respectively.
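Since a GRNN is essentially Nadaraya-Watson kernel regression, screening an L9(3^4) design with one can be sketched as below; the response values, the 782 g target and the kernel width are placeholder assumptions, not the study's data.

```python
# Hedged GRNN sketch: kernel-weighted regression over a Taguchi L9(3^4) design.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Predict with a GRNN: kernel-weighted average of training responses."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma**2))
    return (w @ y_train) / w.sum(axis=1)

# Standard L9 orthogonal array for four 3-level factors (levels coded -1, 0, +1).
L9 = np.array([[-1, -1, -1, -1], [-1, 0, 0, 0], [-1, 1, 1, 1],
               [ 0, -1,  0,  1], [ 0, 0, 1, -1], [ 0, 1, -1, 0],
               [ 1, -1,  1,  0], [ 1, 0, -1, 1], [ 1, 1, 0, -1]], float)
loaf_weight = np.array([770, 775, 781, 778, 783, 779, 785, 780, 782], float)  # placeholder responses

# Screen all 81 candidate level combinations and pick the one closest to target.
grid = np.array(np.meshgrid(*[[-1, 0, 1]] * 4)).reshape(4, -1).T.astype(float)
pred = grnn_predict(L9, loaf_weight, grid)
best = grid[np.argmin(np.abs(pred - 782.0))]
print("setting predicted closest to the 782 g target:", best)
```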
Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
NASA Astrophysics Data System (ADS)
Chernoded, Andrey; Dudko, Lev; Myagkov, Igor; Volkov, Petr
2017-10-01
Most of the modern analyses in high energy physics use signal-versus-background classification techniques based on machine learning methods, and neural networks in particular. Deep learning neural networks are the most promising modern technique to separate signal and background and nowadays can be widely and successfully implemented as part of a physics analysis. In this article we compare the application of deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.
Modernization of vertical Pelton turbines with the help of CFD and model testing
NASA Astrophysics Data System (ADS)
Mack, Reiner; Gola, Bartlomiej; Smertnig, Martin; Wittwer, Bernhard; Meusburger, Peter
2014-03-01
The modernization of water turbines offers a high potential for increasing already installed hydropower capacity. In many projects the existing waterways allow a substantial increase of the available flow capacity and with it the energy output. Upgrading to a state-of-the-art hydraulic, mechanical and electrical design will also increase the available power considerably after rehabilitation. The two-phase nature of the flow in Pelton turbines requires special care in the application of the available design methods during hydraulic refurbishment. Where the flow in the high-pressure section of the turbine is mainly of single-phase nature, CFD has been used as a standard tool for many years. The jet quality, and with it the exploration of the sources of flow disturbance that cause poor free-surface quality, can also be investigated with CFD. The interaction of the jet with the buckets of the runner is likewise examined by means of CFD. However, because of the two-phase and transient nature of the flow, its accuracy with respect to hydraulic efficiency is in very few cases good enough for a reliable and accurate prediction of absolute numbers. The optimization of hydraulic bucket profiles is therefore always checked with measurements in homologous scaled model turbines. A similar situation exists for the housing flow after the water is discharged from the runner. Here, too, CFD techniques are available to explore the general mechanisms. However, due to the two-phase nature of the flow, where only a very small space is filled with moving water, the experimental setup in a model turbine is always the final proof for optimizations of housing inserts and modifications. The hydraulic design of a modernization project for a power station equipped with vertical Pelton turbines of two different designs is described in this paper. It is shown how CFD is applied to determine the losses in the high-pressure section and how these results are combined with the model tests carried out in the hydraulic laboratory. Finally, the achieved model turbine results are compared with measurements carried out on the prototype.
Current status of rotational atherectomy.
Tomey, Matthew I; Kini, Annapoorna S; Sharma, Samin K
2014-04-01
Rotational atherectomy facilitates percutaneous coronary intervention for complex de novo lesions with severe calcification. A strategy of routine rotational atherectomy has not, however, conferred a reduction in restenosis or major adverse cardiac events. As it is technically demanding, rotational atherectomy is also uncommon. At the 25-year anniversary since the introduction of rotational atherectomy, we sought to review the current state of the art in rotational atherectomy technique, safety, and efficacy data in the modern era of drug-eluting stents, strategies to prevent and manage complications, including slow-flow/no-reflow and burr entrapment, and appropriate use in the context of the broader evolution in the management of stable ischemic heart disease. Fundamental elements of optimal technique include use of a single burr with a burr-to-artery ratio of 0.5 to 0.6, rotational speed of 140,000 to 150,000 rpm, gradual burr advancement using a pecking motion, short ablation runs of 15 to 20 s, and avoidance of decelerations >5,000 rpm. Combined with meticulous technique, optimal antiplatelet therapy, vasodilators, flush solution, and provisional use of atropine, temporary pacing, vasopressors, and mechanical support may prevent slow-flow/no-reflow, which in contemporary series is reported in 0.0% to 2.6% of cases. On the basis of the results of recent large clinical trials, a subset of patients with complex coronary artery disease previously assigned to rotational atherectomy may be directed instead to medical therapy alone or bypass surgery. For patients with de novo severely calcified lesions for which rotational atherectomy remains appropriate, referral centers of excellence are required. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Single-drop optimization of protein crystallization.
Meyer, Arne; Dierks, Karsten; Hilterhaus, Dierk; Klupsch, Thomas; Mühlig, Peter; Kleesiek, Jens; Schöpflin, Robert; Einspahr, Howard; Hilgenfeld, Rolf; Betzel, Christian
2012-08-01
A completely new crystal-growth device has been developed that permits charting a course across the phase diagram to produce crystalline samples optimized for diffraction experiments. The utility of the device is demonstrated for the production of crystals for the traditional X-ray diffraction data-collection experiment, of microcrystals optimal for data-collection experiments at a modern microbeam insertion-device synchrotron beamline and of nanocrystals required for data collection on an X-ray laser beamline.
Ultra-small dye-doped silica nanoparticles via modified sol-gel technique
NASA Astrophysics Data System (ADS)
Riccò, R.; Nizzero, S.; Penna, E.; Meneghello, A.; Cretaio, E.; Enrichi, F.
2018-05-01
In modern biosensing and imaging, fluorescence-based methods constitute the most widely used approach to achieve optimal detection of analytes, both in solution and at the single-particle level. Despite the huge progress made in recent decades in the development of plasmonic biosensors and label-free sensing techniques, fluorescent molecules remain the most commonly used contrast agents to date for commercial imaging and detection methods. However, they exhibit low stability, can be difficult to functionalise, and often result in a low signal-to-noise ratio. Thus, embedding fluorescent probes into robust and bio-compatible materials, such as silica nanoparticles, can substantially improve the detection limit and dramatically increase the sensitivity. In this work, ultra-small fluorescent silica nanoparticles (NPs) for optical biosensing applications were doped with a fluorescent dye, using simple water-based sol-gel approaches based on the classical Stöber procedure. By systematically modulating the reaction parameters, controllable size tuning of particle diameters down to 10 nm was achieved. Particle morphology and optical response were evaluated, showing possible single-molecule behaviour, without employing microemulsion methods to achieve similar results.
High-resolution non-destructive three-dimensional imaging of integrated circuits.
Holler, Mirko; Guizar-Sicairos, Manuel; Tsai, Esther H R; Dinapoli, Roberto; Müller, Elisabeth; Bunk, Oliver; Raabe, Jörg; Aeppli, Gabriel
2017-03-15
Modern nanoelectronics has advanced to a point at which it is impossible to image entire devices and their interconnections non-destructively because of their small feature sizes and the complex three-dimensional structures resulting from their integration on a chip. This metrology gap implies a lack of direct feedback between design and manufacturing processes, and hampers quality control during production, shipment and use. Here we demonstrate that X-ray ptychography-a high-resolution coherent diffractive imaging technique-can create three-dimensional images of integrated circuits of known and unknown designs with a lateral resolution in all directions down to 14.6 nanometres. We obtained detailed device geometries and corresponding elemental maps, and show how the devices are integrated with each other to form the chip. Our experiments represent a major advance in chip inspection and reverse engineering over the traditional destructive electron microscopy and ion milling techniques. Foreseeable developments in X-ray sources, optics and detectors, as well as adoption of an instrument geometry optimized for planar rather than cylindrical samples, could lead to a thousand-fold increase in efficiency, with concomitant reductions in scan times and voxel sizes.
Artificial intelligence in robot control systems
NASA Astrophysics Data System (ADS)
Korikov, A.
2018-05-01
This paper analyzes modern concepts of artificial intelligence and known definitions of the term "level of intelligence". In robotics, an artificial intelligence system is defined as a system that works intelligently and optimally. The author proposes to use optimization methods for the design of intelligent robot control systems. The article formalizes robotic control system design problems as a class of constrained extremum problems. Solving these problems is rather complicated due to their high dimensionality, polymodality and a priori uncertainty. Decomposition of the extremum problems according to the method suggested by the author reduces them to a sequence of simpler problems that can be successfully solved by modern computing technology. Several possible approaches to solving such problems are considered in the article.
NASA Astrophysics Data System (ADS)
Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.
2016-05-01
In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
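A rough sketch of what such a vectorizable inner loop can look like is given below, with the loop over energy groups expressed as SIMD-friendly vector operations; the flat-source MOC segment update shown here (with volume normalization omitted) and the data layout are assumptions for illustration, not the authors' proxy applications.

```python
# Illustrative MOC inner loop: attenuate the angular flux along one track
# segment for all energy groups at once and accumulate the scalar-flux tally.
import numpy as np

n_groups = 64
rng = np.random.default_rng(2)

sigma_t = rng.uniform(0.1, 2.0, n_groups)   # total cross sections per group
q = rng.uniform(0.0, 1.0, n_groups)         # flat source in the region per group
psi_in = rng.uniform(0.0, 1.0, n_groups)    # incoming angular flux per group
length, weight = 0.35, 0.7                   # segment length and quadrature weight

def sweep_segment(psi_in, sigma_t, q, length, weight):
    """One segment of one characteristic track; the group loop is a single
    vector expression rather than a scalar loop, which is what vectorizes."""
    tau = sigma_t * length
    exp_tau = np.exp(-tau)
    delta = (psi_in - q / sigma_t) * (1.0 - exp_tau)   # change in angular flux
    psi_out = psi_in - delta
    scalar_flux_contrib = weight * delta / sigma_t     # tally into the region
    return psi_out, scalar_flux_contrib

psi_out, tally = sweep_segment(psi_in, sigma_t, q, length, weight)
print(psi_out[:4], tally[:4])
```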
NASA Astrophysics Data System (ADS)
Kómár, P.; Kessler, E. M.; Bishof, M.; Jiang, L.; Sørensen, A. S.; Ye, J.; Lukin, M. D.
2014-08-01
The development of precise atomic clocks plays an increasingly important role in modern society. Shared timing information constitutes a key resource for navigation with a direct correspondence between timing accuracy and precision in applications such as the Global Positioning System. By combining precision metrology and quantum networks, we propose a quantum, cooperative protocol for operating a network of geographically remote optical atomic clocks. Using nonlocal entangled states, we demonstrate an optimal utilization of global resources, and show that such a network can be operated near the fundamental precision limit set by quantum theory. Furthermore, the internal structure of the network, combined with quantum communication techniques, guarantees security both from internal and external threats. Realization of such a global quantum network of clocks may allow construction of a real-time single international time scale (world clock) with unprecedented stability and accuracy.
Distance majorization and its applications.
Chi, Eric C; Zhou, Hua; Lange, Kenneth
2014-08-01
The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton's method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications.
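A minimal sketch of the distance-majorization idea for the simplest case, projecting a point onto an intersection of convex sets, is given below; the two sets, the penalty schedule and the iteration count are illustrative assumptions.

```python
# Project a point onto an intersection of convex sets by repeatedly majorizing
# each squared distance to a set by the squared distance to its current
# projection (majorization-minimization plus a classical penalty method).
import numpy as np

def proj_ball(x, radius=1.0):
    n = np.linalg.norm(x)
    return x if n <= radius else radius * x / n

def proj_halfspace(x, a, b):
    a = np.asarray(a, float)
    excess = a @ x - b
    return x if excess <= 0 else x - excess * a / (a @ a)

def project_onto_intersection(y, projections, n_iter=200):
    y = np.asarray(y, float)
    x, rho = y.copy(), 1.0
    for _ in range(n_iter):
        # Majorization: dist(x, C_i)^2 <= ||x - P_{C_i}(x_k)||^2 with equality at x_k.
        anchors = [P(x) for P in projections]
        # Exact minimizer of 0.5||x - y||^2 + (rho/2) * sum_i ||x - anchor_i||^2.
        x = (y + rho * sum(anchors)) / (1.0 + rho * len(anchors))
        rho *= 1.05   # penalty method: slowly tighten the constraints
    return x

y = np.array([2.0, 2.0])
sets = [lambda x: proj_ball(x, 1.0), lambda x: proj_halfspace(x, a=[1.0, -1.0], b=0.0)]
print("projection onto ball ∩ halfspace:", project_onto_intersection(y, sets))
```

Each surrogate subproblem only needs the projections onto the individual sets, which is exactly the situation the abstract describes: easy per-set projections, hard joint projection.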
Ho, Steven C L; Yang, Yuansheng
2014-08-01
Promoters are essential on plasmid vectors to initiate transcription of the transgenes when generating mammalian cell lines expressing therapeutic recombinant proteins. High and sustained levels of gene expression are desired during therapeutic protein production, while controlled gene expression is useful for cell engineering. As many finely controlled promoters exhibit cell and product specificity, new promoters need to be identified, optimized and carefully evaluated before use. Suitable promoters can be identified using techniques ranging from simple molecular biology methods to modern high-throughput omics screenings. Promoter engineering is often required after identification to either obtain high and sustained expression or to provide a wider range of gene expression. This review discusses some of the available methods to identify and engineer promoters for therapeutic recombinant protein expression in mammalian cells.
New Approaches to Providing Individualized Diabetes Care in the 21st Century
Powell, Priscilla W.; Corathers, Sarah D.; Raymond, Jennifer; Streisand, Randi
2016-01-01
Building from a foundation of rapid innovation, the 21st century is poised to offer considerable new approaches to providing modern diabetes care. The focus of this paper is the evolving role of diabetes care providers collaboratively working with patients and families toward the goals of achieving optimal clinical and psychosocial outcomes for individuals living with diabetes. Advances in monitoring, treatment and technology have been complemented by trends toward patient-centered care with expertise from multiple health care disciplines. The evolving clinical care delivery system extends far beyond adjustment of insulin regimens. Effective integration of patient-centered strategies, such as shared-decision making, motivational interviewing techniques, shared medical appointments, and multidisciplinary team collaboration, into a dynamic model of diabetes care delivery holds promise in reaching glycemic targets and improving patients’ quality of life. PMID:25901504
Feicht, W; Buchner, A; Riesenberg, R
2001-05-01
Trifunctional bispecific antibodies open up new immunological possibilities in tumour treatment. Prior to clinical application, comprehensive investigations using animal models and in vitro examinations need to be done. To investigate long-term interactions between various immunologically active blood cells and individual tumour cells in the presence of antibodies, we developed an incubation system for experimental cell cultures on an inverted microscope. The system consists of a perspex box with a central moisture chamber with an integrated water reservoir, external air circulation heating, and a CO2 supply. The sterile cell cultures are located in the wells of a slide positioned within a depression in the water reservoir. The newly developed incubation system enables continuous long-term observation of experiments under optimal cell culture conditions in combination with modern video techniques.
It's time to reinvent the general aviation airplane
NASA Technical Reports Server (NTRS)
Stengel, Robert F.
1988-01-01
Current designs for general aviation airplanes have become obsolete, and avenues for major redesign must be considered. New designs should incorporate recent advances in electronics, aerodynamics, structures, materials, and propulsion. Future airplanes should be optimized to operate satisfactorily in a positive air traffic control environment, to afford safety and comfort for point-to-point transportation, and to take advantage of automated manufacturing techniques and high production rates. These requirements have broad implications for airplane design and flying qualities, leading to a concept for the Modern Equipment General Aviation (MEGA) airplane. Synergistic improvements in design, production, and operation can provide a much needed fresh start for the general aviation industry and the traveling public. In this investigation a small four place airplane is taken as the reference, although the proposed philosophy applies across the entire spectrum of general aviation.
Locality-Aware CTA Clustering For Modern GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ang; Song, Shuaiwen; Liu, Weifeng
2017-04-08
In this paper, we proposed a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrated the capability of the existing GPU hardware to exploit such locality, both spatially and temporally, on L1 or L1/Tex unified cache. To verify the potential of this locality, we quantified its existence in a broad spectrum of applications and discussed its sources of origin. Based on these insights, we proposed the concept of CTA-Clustering and its associated software techniques. Finally, we evaluated these techniques on all modern generations of NVIDIA GPU architectures. The experimental results showed that our proposed clustering techniques could significantly improve on-chip cache performance.
Optimizing Engineering Tools Using Modern Ground Architectures
2017-12-01
...scientific community. Traditional computing architectures were not capable of processing the data efficiently, or in some cases, could not process the... thesis investigates how these modern computing architectures could be leveraged by industry and academia to improve the performance and capabilities of
Modern Initial Management of Severe Limbs Trauma in War Surgery: Orthopaedic Damage Control
2010-04-01
...avoid fat embolism, allow optimal nursing and medical evacuation without any secondary functional consequences [3]. 2.2.1 Indications: The... decrease the risk of fat embolism. ...injuries. Orthopaedic imperatives: multiple open shaft fractures with blood loss, complex epiphyseal fractures requiring a long, difficult, bloody surgical
NASA Technical Reports Server (NTRS)
Martos, Borja; Kiszely, Paul; Foster, John V.
2011-01-01
As part of the NASA Aviation Safety Program (AvSP), a novel pitot-static calibration method was developed to allow rapid in-flight calibration for subscale aircraft while flying within confined test areas. This approach uses Global Positioning System (GPS) technology coupled with modern system identification methods that rapidly compute optimal pressure error models over a range of airspeed with defined confidence bounds. This method has been demonstrated in subscale flight tests and has shown small 2-σ error bounds with significant reduction in test time compared to other methods. The current research was motivated by the desire to further evaluate and develop this method for full-scale aircraft. A goal of this research was to develop an accurate calibration method that enables reductions in test equipment and flight time, thus reducing costs. The approach involved analysis of data acquisition requirements, development of efficient flight patterns, and analysis of pressure error models based on system identification methods. Flight tests were conducted at The University of Tennessee Space Institute (UTSI) utilizing an instrumented Piper Navajo research aircraft. In addition, the UTSI engineering flight simulator was used to investigate test maneuver requirements and handling qualities issues associated with this technique. This paper provides a summary of piloted simulation and flight test results that illustrates the performance and capabilities of the NASA calibration method. Discussion of maneuver requirements and data analysis methods is included as well as recommendations for piloting technique.
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.
Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T
2017-01-01
Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
Modern adjuncts and technologies in microsurgery: an historical and evidence-based review.
Pratt, George F; Rozen, Warren M; Chubb, Daniel; Whitaker, Iain S; Grinsell, Damien; Ashton, Mark W; Acosta, Rafael
2010-11-01
While modern reconstructive surgery was revolutionized with the introduction of microsurgical techniques, microsurgery itself has seen the introduction of a range of technological aids and modern techniques aiming to improve dissection times, anastomotic times, and overall outcomes. These include improved preoperative planning, anastomotic aids, and earlier detection of complications with higher salvage rates. Despite the potential for substantial impact, many of these techniques have been evaluated in a limited fashion, and the evidence for each has not been universally explored. The purpose of this review was to establish and quantify the evidence for each technique. A search of relevant medical databases was performed to identify literature providing evidence for each technology. Levels of evidence were thus accumulated and applied to each technique. There is a relative paucity of evidence for many of the more recent technologies described in the field of microsurgery, with no randomized controlled trials, and most studies in the field comprising case series only. Current evidence-based suggestions include the use of computed tomographic angiography (CTA) for the preoperative planning of perforator flaps, the intraoperative use of a mechanical anastomotic coupling aid (particularly the Unilink® coupler), and postoperative flap monitoring with strict protocols using clinical bedside monitoring and/or the implantable Doppler probe. Despite the breadth of technologies introduced into the field of microsurgery, there is substantial variation in the degree of evidence presented for each, suggesting the role for much future research, particularly from emerging technologies such as robotics and modern simulators. Copyright © 2010 Wiley-Liss, Inc.
A Survey of Architectural Techniques For Improving Cache Power Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
Modern processors are using increasingly larger sized on-chip caches. Also, with each CMOS technology generation, there has been a significant increase in their leakage energy consumption. For this reason, cache power management has become a crucial research issue in modern processor design. To address this challenge and also meet the goals of sustainable computing, researchers have proposed several techniques for improving energy efficiency of cache architectures. This paper surveys recent architectural techniques for improving cache power efficiency and also presents a classification of these techniques based on their characteristics. For providing an application perspective, this paper also reviews several real-world processor chips that employ cache energy saving techniques. The aim of this survey is to enable engineers and researchers to get insights into the techniques for improving cache power efficiency and motivate them to invent novel solutions for enabling low-power operation of caches.
Zakhia, Frédéric; de Lajudie, Philippe
2006-03-01
Taxonomy is the science that studies the relationships between organisms. It comprises classification, nomenclature, and identification. Modern bacterial taxonomy is polyphasic. This means that it is based on several molecular techniques, each one retrieving information at a different cellular level (proteins, fatty acids, DNA...). The obtained results are combined and analysed to reach a "consensus taxonomy" of a microorganism. Until 1970, only a small number of classification techniques were available to microbiologists (mainly phenotypic characterization was performed: for example, the ability of a Rhizobium strain to nodulate a given legume species). With the development of characterization techniques based on the polymerase chain reaction, bacterial taxonomy has undergone great changes. In particular, the classification of the legume-nodulating bacteria has been repeatedly modified over the last 20 years. We present here a review of the molecular techniques currently used in bacterial characterization, with examples of the application of these techniques to the study of the legume-nodulating bacteria.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bales, Benjamin B; Barrett, Richard F
In almost all modern scientific applications, developers achieve the greatest performance gains by tuning algorithms, communication systems, and memory access patterns, while leaving low level instruction optimizations to the compiler. Given the increasingly varied and complicated x86 architectures, the value of these optimizations is unclear, and, due to time and complexity constraints, it is difficult for many programmers to experiment with them. In this report we explore the potential gains of these 'last mile' optimization efforts on an AMD Barcelona processor, providing readers with relevant information so that they can decide whether investment in the presented optimizations is worthwhile.
Gas Dynamic Modernization of Axial Uncooled Turbine by Means of CFD and Optimization Software
NASA Astrophysics Data System (ADS)
Marchukov, E. Yu; Egorov, I. N.
2018-01-01
The results of multicriteria optimization of a three-stage low-pressure turbine are described in the paper. The aim of the optimization is to improve the turbine operating process with respect to three criteria: turbine outlet flow angle, value of residual swirl at the turbine outlet, and turbine efficiency. Full reprofiling of all blade rows is carried out while solving the optimization problem. Reprofiling includes changes in both the shape of the flat blade sections (profiles) and the three-dimensional shape of the blades. The study is carried out with 3D numerical models of the turbines.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include an approach for rigorous analysis of operations functions, use of the resources of a large computer network, and provisions for efficient research and access to information.
Particle-In-Cell Analysis of an Electric Antenna for the BepiColombo/MMO spacecraft
NASA Astrophysics Data System (ADS)
Miyake, Yohei; Usui, Hideyuki; Kojima, Hirotsugu
The BepiColombo/MMO spacecraft is planned to provide a first electric field measurement in Mercury's magnetosphere by mounting two types of electric antennas: WPT and MEFISTO. The sophisticated calibration of such measurements should be performed based on precise knowledge of the antenna characteristics in space plasma. However, it is difficult to know practical antenna characteristics considering the plasma kinetics and spacecraft-plasma interactions by means of theoretical approaches. Furthermore, some modern antenna design techniques such as the "hockey puck" principle are applied to MEFISTO, which introduces much complexity in its overall configuration. Thus a strong demand arises for the establishment of a numerical method that can handle the complex configuration and plasma dynamics for evaluating the electric properties of the modern instrument. For the self-consistent antenna analysis, we have developed a particle simulation code named EMSES based on the particle-in-cell technique, including a treatment of antenna conductive surfaces. In this paper, we mainly focus on electrostatic (ES) features and the photoelectron distribution in the vicinity of MEFISTO. Our simulation model includes (1) a photoelectron guard electrode, (2) a bias current provided from the spacecraft body to the sensing element, (3) a floating potential treatment for the spacecraft body, and (4) photoelectron emission from sunlit surfaces of the conductive bodies. Of these, the photoelectron guard electrode is a key technology for producing an optimal plasma environment around MEFISTO. Specifically, we introduced a pre-amplifier housing called the puck, located between the conductive boom and the sensor wire. The photoelectron guard is then simulated by forcibly fixing the potential difference between the puck surface and the spacecraft body. For the modeling, we use the capacity matrix technique in order to assure conservation of the total charge owned by the entire spacecraft body. We report numerical analyses of the influence of the guard electrode on the surrounding plasma environment using the developed model.
Airfoil optimization for unsteady flows with application to high-lift noise reduction
NASA Astrophysics Data System (ADS)
Rumpfkeil, Markus Peer
The use of steady-state aerodynamic optimization methods in the computational fluid dynamic (CFD) community is fairly well established. In particular, the use of adjoint methods has proven to be very beneficial because their cost is independent of the number of design variables. The application of numerical optimization to airframe-generated noise, however, has not received as much attention, but with the significant quieting of modern engines, airframe noise now competes with engine noise. Optimal control techniques for unsteady flows are needed in order to be able to reduce airframe-generated noise. In this thesis, a general framework is formulated to calculate the gradient of a cost function in a nonlinear unsteady flow environment via the discrete adjoint method. The unsteady optimization algorithm developed in this work utilizes a Newton-Krylov approach since the gradient-based optimizer uses the quasi-Newton method BFGS, Newton's method is applied to the nonlinear flow problem, GMRES is used to solve the resulting linear problem inexactly, and last but not least the linear adjoint problem is solved using Bi-CGSTAB. The flow is governed by the unsteady two-dimensional compressible Navier-Stokes equations in conjunction with a one-equation turbulence model, which are discretized using structured grids and a finite difference approach. The effectiveness of the unsteady optimization algorithm is demonstrated by applying it to several problems of interest including shocktubes, pulses in converging-diverging nozzles, rotating cylinders, transonic buffeting, and an unsteady trailing-edge flow. In order to address radiated far-field noise, an acoustic wave propagation program based on the Ffowcs Williams and Hawkings (FW-H) formulation is implemented and validated. The general framework is then used to derive the adjoint equations for a novel hybrid URANS/FW-H optimization algorithm in order to be able to optimize the shape of airfoils based on their calculated far-field pressure fluctuations. Validation and application results for this novel hybrid URANS/FW-H optimization algorithm show that it is possible to optimize the shape of an airfoil in an unsteady flow environment to minimize its radiated far-field noise while maintaining good aerodynamic performance.
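The core mechanics, marching the unsteady problem forward in time and then sweeping an adjoint backwards to obtain a gradient whose cost does not grow with the number of design variables, can be illustrated on a toy problem; the scalar ODE, the time-integrated cost and the explicit-Euler discretization below are assumptions for illustration, not the thesis solver.

```python
# Toy discrete-adjoint gradient for an unsteady problem, checked against a
# central finite difference.
import numpy as np

nt, dt = 200, 0.01

def forward(alpha, u0=1.0):
    """Explicit-Euler march of du/dt = -alpha*u + sin(t); returns the full history."""
    u = np.empty(nt + 1)
    u[0] = u0
    for n in range(nt):
        u[n + 1] = u[n] + dt * (-alpha * u[n] + np.sin(n * dt))
    return u

def cost(u):
    return 0.5 * dt * np.sum(u[1:] ** 2)       # time-integrated "noise" proxy

def adjoint_gradient(alpha):
    u = forward(alpha)
    lam, grad = 0.0, 0.0
    for n in reversed(range(nt)):
        lam = dt * u[n + 1] + lam              # dJ/du_{n+1}: local cost term + carried adjoint
        grad -= lam * dt * u[n]                # explicit dependence: d u_{n+1}/d alpha = -dt*u_n
        lam *= (1.0 - dt * alpha)              # d u_{n+1}/d u_n carries the adjoint back a step
    return grad

alpha, eps = 0.8, 1e-6
g_adj = adjoint_gradient(alpha)
g_fd = (cost(forward(alpha + eps)) - cost(forward(alpha - eps))) / (2 * eps)
print(f"adjoint gradient {g_adj:.6f}  vs  finite difference {g_fd:.6f}")
```

The same reverse-sweep structure underlies the unsteady adjoint used for shape gradients, only with the scalar state replaced by the discretized flow field and the design scalar by the set of shape variables.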
Platform-dependent optimization considerations for mHealth applications
NASA Astrophysics Data System (ADS)
Kaghyan, Sahak; Akopian, David; Sarukhanyan, Hakob
2015-03-01
Modern mobile devices contain integrated sensors that enable a multitude of applications in fields such as mobile health (mHealth), entertainment, sports, etc. Human physical activity monitoring is one such emerging application. A range of challenges relate to activity monitoring tasks, particularly finding optimal solutions and architectures for the corresponding mobile software applications. This work addresses mobile computations related to the integrated inertial and positioning sensors that can be used for activity monitoring, such as accelerometers, gyroscopes, integrated global positioning system (GPS) receivers and WLAN-based positioning. Some of these aspects are discussed in this paper. Each of the sensing data sources has its own characteristics, such as specific data formats, data rates and signal acquisition durations, and these specifications affect energy consumption. Energy consumption also varies significantly as sensor data acquisition is followed by data analysis, including various transformations and signal processing algorithms. This paper addresses several aspects of more optimal activity monitoring implementations exploiting the state-of-the-art capabilities of modern platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Illidge, Tim, E-mail: Tim.Illidge@ics.manchester.ac.uk; Specht, Lena; Yahalom, Joachim
2014-05-01
Radiation therapy (RT) is the most effective single modality for local control of non-Hodgkin lymphoma (NHL) and is an important component of therapy for many patients. Many of the historic concepts of dose and volume have recently been challenged by the advent of modern imaging and RT planning tools. The International Lymphoma Radiation Oncology Group (ILROG) has developed these guidelines after multinational meetings and analysis of available evidence. The guidelines represent an agreed consensus view of the ILROG steering committee on the use of RT in NHL in the modern era. The roles of reduced volume and reduced doses are addressed, integrating modern imaging with 3-dimensional planning and advanced techniques of RT delivery. In the modern era, in which combined-modality treatment with systemic therapy is appropriate, the previously applied extended-field and involved-field RT techniques that targeted nodal regions have now been replaced by limiting the RT to smaller volumes based solely on detectable nodal involvement at presentation. A new concept, involved-site RT, defines the clinical target volume. For indolent NHL, often treated with RT alone, larger fields should be considered. Newer treatment techniques, including intensity modulated RT, breath holding, image guided RT, and 4-dimensional imaging, should be implemented, and their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control.
Modern quantitative schlieren techniques
NASA Astrophysics Data System (ADS)
Hargather, Michael; Settles, Gary
2010-11-01
Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
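A small sketch of that workflow follows, using synthetic stand-in data for the calibrated refractive-index gradient and standard reference values for air; none of the numbers come from the paper.

```python
# Quantitative-schlieren post-processing sketch: integrate a measured gradient,
# apply the Gladstone-Dale relation, then an ideal-gas assumption at constant
# pressure to obtain a temperature field.
import numpy as np

dx = 1e-4                       # pixel size along the integration direction [m]
K = 2.26e-4                     # Gladstone-Dale constant for air [m^3/kg]
p, R_air = 101325.0, 287.0      # ambient pressure [Pa], specific gas constant [J/(kg K)]
n_ambient = 1.000277            # refractive index of the undisturbed field

x = np.arange(0, 0.02, dx)
dn_dx = -4e-3 * np.exp(-((x - 0.01) / 2e-3) ** 2)   # stand-in for calibrated schlieren data

# Integrate the gradient starting from a known ambient boundary value.
n = n_ambient + np.cumsum(dn_dx) * dx

rho = (n - 1.0) / K             # Gladstone-Dale: n - 1 = K * rho
T = p / (rho * R_air)           # ideal gas at constant pressure

print(f"density range: {rho.min():.3f} .. {rho.max():.3f} kg/m^3")
print(f"temperature range: {T.min():.1f} .. {T.max():.1f} K")
```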
Real-time colouring and filtering with graphics shaders
NASA Astrophysics Data System (ADS)
Vohl, D.; Fluke, C. J.; Barnes, D. G.; Hassan, A. H.
2017-11-01
Despite the popularity of the Graphics Processing Unit (GPU) for general purpose computing, one should not forget about the practicality of the GPU for fast scientific visualization. As astronomers have increasing access to three-dimensional (3D) data from instruments and facilities like integral field units and radio interferometers, visualization techniques such as volume rendering offer means to quickly explore spectral cubes as a whole. As most 3D visualization techniques have been developed in fields of research like medical imaging and fluid dynamics, many transfer functions are not optimal for astronomical data. We demonstrate how transfer functions and graphics shaders can be exploited to provide new astronomy-specific explorative colouring methods. We present 12 shaders, including four novel transfer functions specifically designed to produce intuitive and informative 3D visualizations of spectral cube data. We compare their utility to classic colour mapping. The remaining shaders highlight how common computation like filtering, smoothing and line ratio algorithms can be integrated as part of the graphics pipeline. We discuss how this can be achieved by utilizing the parallelism of modern GPUs along with a shading language, letting astronomers apply these new techniques at interactive frame rates. All shaders investigated in this work are included in the open source software shwirl (Vohl 2017).
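As a rough illustration of what a transfer function does (written in NumPy rather than as an actual GPU shader), the snippet below maps voxel values to colour and opacity; the colour ramp and the synthetic cube are arbitrary assumptions, not one of the twelve shaders in shwirl.

```python
# A toy volume-rendering transfer function: faint voxels become transparent,
# bright voxels opaque and warm-coloured. On a GPU this logic would live in
# the fragment shader.
import numpy as np

def transfer_function(values, vmin, vmax):
    t = np.clip((values - vmin) / (vmax - vmin), 0.0, 1.0)
    rgba = np.empty(values.shape + (4,))
    rgba[..., 0] = t                   # red ramps up with intensity
    rgba[..., 1] = t ** 2              # green comes in later -> warm colours
    rgba[..., 2] = 0.2 * (1.0 - t)     # a little blue at the faint end
    rgba[..., 3] = t ** 1.5            # opacity: faint voxels stay see-through
    return rgba

cube = np.random.default_rng(9).normal(size=(32, 32, 32)) ** 2   # stand-in spectral cube
print(transfer_function(cube, cube.min(), cube.max()).shape)      # (32, 32, 32, 4)
```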
Bending the Rules: Widefield Microscopy and the Abbe Limit of Resolution
Verdaasdonk, Jolien S.; Stephens, Andrew D.; Haase, Julian; Bloom, Kerry
2014-01-01
One of the most fundamental concepts of microscopy is that of resolution–the ability to clearly distinguish two objects as separate. Recent advances such as structured illumination microscopy (SIM) and point localization techniques including photoactivated localization microscopy (PALM), and stochastic optical reconstruction microscopy (STORM) strive to overcome the inherent limits of resolution of the modern light microscope. These techniques, however, are not always feasible or optimal for live cell imaging. Thus, in this review, we explore three techniques for extracting high resolution data from images acquired on a widefield microscope–deconvolution, model convolution, and Gaussian fitting. Deconvolution is a powerful tool for restoring a blurred image using knowledge of the point spread function (PSF) describing the blurring of light by the microscope, although care must be taken to ensure accuracy of subsequent quantitative analysis. The process of model convolution also requires knowledge of the PSF to blur a simulated image which can then be compared to the experimentally acquired data to reach conclusions regarding its geometry and fluorophore distribution. Gaussian fitting is the basis for point localization microscopy, and can also be applied to tracking spot motion over time or measuring spot shape and size. All together, these three methods serve as powerful tools for high-resolution imaging using widefield microscopy. PMID:23893718
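A small sketch of the Gaussian-fitting approach on a synthetic widefield spot is shown below; the spot parameters, image size and noise model are assumptions for illustration only.

```python
# Localize a fluorescent spot by least-squares fitting of a 2D Gaussian.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    x, y = coords
    return offset + amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma**2))

# Synthetic 21x21 pixel spot with Poisson noise.
rng = np.random.default_rng(3)
yy, xx = np.mgrid[0:21, 0:21]
truth = gauss2d((xx, yy), amp=300.0, x0=10.3, y0=9.6, sigma=1.8, offset=100.0)
image = rng.poisson(truth).astype(float)

p0 = [image.max() - image.min(), 10.0, 10.0, 2.0, image.min()]   # rough initial guess
popt, pcov = curve_fit(gauss2d, (xx.ravel(), yy.ravel()), image.ravel(), p0=p0)
amp, x0, y0, sigma, offset = popt
print(f"fitted centre: ({x0:.2f}, {y0:.2f}) px, sigma = {sigma:.2f} px")
```

The fitted centre is typically recovered to a small fraction of a pixel, which is the basis of the point-localization methods mentioned above.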
NASA Astrophysics Data System (ADS)
Cao, X.; Tian, F.; Telford, R.; Ni, J.; Xu, Q.; Chen, F.; Liu, X.; Stebich, M.; Zhao, Y.; Herzschuh, U.
2017-12-01
Pollen-based quantitative reconstructions of past climate variables are a standard palaeoclimatic approach. Despite knowing that the spatial extent of the calibration-set affects the reconstruction result, guidance is lacking as to how to determine a suitable spatial extent of the pollen-climate calibration-set. In this study, past mean annual precipitation (Pann) during the Holocene (since 11.5 cal ka BP) is reconstructed repeatedly for pollen records from Qinghai Lake (36.7°N, 100.5°E; north-east Tibetan Plateau), Gonghai Lake (38.9°N, 112.2°E; north China) and Sihailongwan Lake (42.3°N, 126.6°E; north-east China) using calibration-sets of varying spatial extents extracted from the modern pollen dataset of China and Mongolia (2559 sampling sites and 168 pollen taxa in total). Results indicate that the spatial extent of the calibration-set has a strong impact on model performance, analogue quality and reconstruction diagnostics (absolute value, range, trend, optimum). Generally, these effects are stronger with the modern analogue technique (MAT) than with weighted averaging partial least squares (WA-PLS). With respect to fossil spectra from northern China, the spatial extent of calibration-sets should be restricted to ca. 1000 km in radius because small-scale calibration-sets (<800 km radius) will likely fail to include enough spatial variation in the modern pollen assemblages to reflect the temporal range shifts during the Holocene, while too broad a scale calibration-set (>1500 km radius) will include taxa with very different pollen-climate relationships. Based on our results we conclude that the optimal calibration-set should 1) cover a reasonably large spatial extent with an even distribution of modern pollen samples; 2) possess good model performance as indicated by cross-validation, high analogue quality, and excellent fit with the target fossil pollen spectra; 3) possess high taxonomic resolution, and 4) obey the modern and past distribution ranges of taxa inferred from palaeo-genetic and macrofossil studies.
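A hedged sketch of the modern analogue technique itself is given below: a fossil sample's Pann is reconstructed from its k closest modern analogues under squared-chord distance. All data here are synthetic placeholders, not the China/Mongolia dataset.

```python
# Modern analogue technique (MAT): distance-weighted mean climate of the k
# nearest modern pollen assemblages.
import numpy as np

def squared_chord(a, b):
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2, axis=-1)

def mat_reconstruct(fossil, modern_pollen, modern_pann, k=5):
    d = squared_chord(modern_pollen, fossil)      # distance to every modern site
    nearest = np.argsort(d)[:k]                   # k best analogues
    w = 1.0 / np.maximum(d[nearest], 1e-9)        # inverse-distance weights
    return np.sum(w * modern_pann[nearest]) / np.sum(w)

rng = np.random.default_rng(4)
n_sites, n_taxa = 500, 30
modern_pollen = rng.dirichlet(np.ones(n_taxa), size=n_sites)   # pollen proportions per site
modern_pann = rng.uniform(100, 900, size=n_sites)              # mm/yr, placeholder values
fossil_sample = rng.dirichlet(np.ones(n_taxa))

print(f"reconstructed Pann: {mat_reconstruct(fossil_sample, modern_pollen, modern_pann):.0f} mm")
```

Restricting `modern_pollen` to sites within a given radius of the fossil site is exactly the calibration-set extent that the study varies.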
Application of contrast media in post-mortem imaging (CT and MRI).
Grabherr, Silke; Grimm, Jochen; Baumann, Pia; Mangin, Patrice
2015-09-01
The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered for the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific for each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider by using different media.
Chopped random-basis quantum optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caneva, Tommaso; Calarco, Tommaso; Montangero, Simone
2011-08-15
In this work, we describe in detail the chopped random basis (CRAB) optimal control technique recently introduced to optimize time-dependent density matrix renormalization group simulations [P. Doria, T. Calarco, and S. Montangero, Phys. Rev. Lett. 106, 190501 (2011)]. Here, we study the efficiency of this control technique in optimizing different quantum processes and we show that in the considered cases we obtain results equivalent to those obtained via different optimal control methods while using less resources. We propose the CRAB optimization as a general and versatile optimal control technique.
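A toy sketch of the CRAB recipe follows: expand the control correction in a few randomized ("chopped") basis functions and optimize the coefficients with a gradient-free search. The two-level state-transfer problem, basis size and pulse shapes below are illustrative assumptions, not the paper's many-body setting.

```python
# CRAB-style control: guess pulse x truncated randomized Fourier correction,
# with the few coefficients optimized by Nelder-Mead.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

sx = np.array([[0, 1], [1, 0]], complex)
sz = np.array([[1, 0], [0, -1]], complex)
T, n_steps, delta = 2.0, 100, 1.0                 # total time, time steps, fixed detuning
t = np.linspace(0, T, n_steps)
rng = np.random.default_rng(5)
freqs = 2 * np.pi * (np.arange(1, 4) + rng.uniform(-0.5, 0.5, 3)) / T   # chopped random basis

def pulse(c):
    a, b = c[:3], c[3:]
    correction = 1.0 + a @ np.sin(np.outer(freqs, t)) + b @ np.cos(np.outer(freqs, t))
    return (np.pi / T) * np.sin(np.pi * t / T) * correction

def infidelity(c):
    psi = np.array([1.0, 0.0], complex)           # start in |0>
    dt = T / n_steps
    for omega in pulse(c):
        H = 0.5 * (omega * sx + delta * sz)
        psi = expm(-1j * H * dt) @ psi
    return 1.0 - abs(psi[1]) ** 2                 # want to end in |1>

res = minimize(infidelity, x0=np.zeros(6), method="Nelder-Mead",
               options={"maxiter": 500, "xatol": 1e-4, "fatol": 1e-6})
print(f"final infidelity: {res.fun:.4f}")
```

The appeal mirrored in the abstract is that only a handful of coefficients are optimized, so inexpensive gradient-free searches suffice.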
New evidence favoring multilevel decomposition and optimization
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Polignone, Debra A.
1990-01-01
The issue of the utility of multilevel decomposition and optimization remains controversial. To date, only the structural optimization community has actively developed and promoted multilevel optimization techniques. However, even this community acknowledges that multilevel optimization is ideally suited for a rather limited set of problems. It is warned that decomposition typically requires eliminating local variables by using global variables and that this in turn causes ill-conditioning of the multilevel optimization by adding equality constraints. The purpose is to suggest a new multilevel optimization technique. This technique uses behavior variables, in addition to design variables and constraints, to decompose the problem. The new technique removes the need for equality constraints, simplifies the decomposition of the design problem, simplifies the programming task, and improves the convergence speed of multilevel optimization compared to conventional optimization.
Injuries in students of three different dance techniques.
Echegoyen, Soledad; Acuña, Eugenia; Rodríguez, Cristina
2010-06-01
As with any athlete, the dancer has a high risk of injury. Most studies carried out relate to classical and modern dance; however, there is a lack of reports on injuries involving other dance techniques. This study is an attempt to determine the differences in the incidence, the exposure-related rates, and the kinds of injuries in three different dance techniques. A prospective study of dance injuries was carried out between 2004 and 2007 on students of modern, Mexican folkloric, and Spanish dance at the Escuela Nacional de Danza. A total of 1,168 injuries were registered in 444 students; the injury rate was 4 injuries/student for modern dance and 2 injuries/student for Mexican folkloric and Spanish dance. The rate per training hour was 4 for modern, 1.8 for Mexican folkloric, and 1.5 injuries/1,000 hr of training for Spanish dance. The lower extremity was the most frequently injured region (70.47%), and overuse injuries comprised 29% of the total. The most frequent injuries were strain, sprain, back pain, and patellofemoral pain. This study is based on consistent medical diagnosis of the injuries and is the first attempt in Mexico to compare the incidence of injuries in different dance techniques. To decrease the frequency of student injury, it is important to incorporate prevention programs into dance program curricula. More studies are necessary to define causes and mechanisms of injury, as well as an analysis of training methodology, to decrease the incidence of the muscle imbalances resulting in injury.
The art of providing resuscitation in Greek mythology.
Siempos, Ilias I; Ntaidou, Theodora K; Samonis, George
2014-12-01
We reviewed Greek mythology to accumulate tales of resuscitation and we explored whether these tales could be viewed as indirect evidence that ancient Greeks considered resuscitation strategies similar to those currently used. Three compendia of Greek mythology: The Routledge Handbook of Greek Mythology, The Greek Myths by Robert Graves, and Greek Mythology by Ioannis Kakridis were used to find potentially relevant narratives. Thirteen myths that may suggest resuscitation (including 1 case of autoresuscitation) were identified. Methods to attempt mythological resuscitation included use of hands (which may correlate with basic life support procedures), a kiss on the mouth (similar to mouth-to-mouth resuscitation), application of burning torches (which might recall contemporary use of external defibrillators), and administration of drugs (a possible analogy to advanced life support procedures). A careful assessment of relevant myths demonstrated that interpretations other than medical might be more credible. Although several narratives of Greek mythology might suggest modern resuscitation techniques, they do not clearly indicate that ancient Greeks presaged scientific methods of resuscitation. Nevertheless, these elegant tales reflect humankind's optimism that a dying human might be restored to life if the appropriate procedures were implemented. Without this optimism, scientific improvement in the field of resuscitation might not have been achieved.
Assembling networks of microbial genomes using linear programming.
Holloway, Catherine; Beiko, Robert G
2010-11-20
Microbial genomes exhibit complex sets of genetic affinities due to lateral genetic transfer. Assessing the relative contributions of parent-to-offspring inheritance and gene sharing is a vital step in understanding the evolutionary origins and modern-day function of an organism, but recovering and showing these relationships is a challenging problem. We have developed a new approach that uses linear programming to find between-genome relationships, by treating tables of genetic affinities (here, represented by transformed BLAST e-values) as an optimization problem. Validation trials on simulated data demonstrate the effectiveness of the approach in recovering and representing vertical and lateral relationships among genomes. Application of the technique to a set comprising Aquifex aeolicus and 75 other thermophiles showed an important role for large genomes as 'hubs' in the gene sharing network, and suggested that genes are preferentially shared between organisms with similar optimal growth temperatures. We were also able to discover distinct and common genetic contributors to each sequenced representative of genus Pseudomonas. The linear programming approach we have developed can serve as an effective inference tool in its own right, and can be an efficient first step in a more-intensive phylogenomic analysis.
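One plausible way to cast such an affinity table as a linear program is sketched below (an assumed formulation for illustration, not necessarily the authors' exact model): a genome's affinity profile is explained as a convex mixture of other genomes' profiles with minimal L1 misfit.

```python
# LP sketch: minimize sum(e) subject to |donors^T w - target| <= e, w >= 0,
# sum(w) = 1, solved with scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(6)
m, n = 8, 40                                        # candidate contributor genomes, gene families
donors = rng.uniform(0, 1, size=(m, n))             # transformed affinity profiles (placeholder)
true_w = np.array([0.6, 0.3, 0.1] + [0.0] * (m - 3))
target = true_w @ donors + rng.normal(0, 0.02, n)   # profile of the genome of interest

# Decision variables: [w_1..w_m, e_1..e_n].
c = np.concatenate([np.zeros(m), np.ones(n)])
A_ub = np.block([[donors.T, -np.eye(n)], [-donors.T, -np.eye(n)]])
b_ub = np.concatenate([target, -target])
A_eq = np.concatenate([np.ones(m), np.zeros(n)])[None, :]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * (m + n))
print("estimated contributions:", np.round(res.x[:m], 3))
```

Sparse weight vectors recovered this way are what allow vertical ("parental") and lateral contributors to be read off directly.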
NASA Astrophysics Data System (ADS)
Tsai, Cheng-Mu; Fang, Yi-Chin; Chen, Zhen Hsiang
2011-10-01
This study used aspheric lenses to realize laser flat-top optimization and applied a genetic algorithm (GA) to find the optimal results. Using the characteristics of aspheric lenses to obtain an optimized, high-quality flat-top optical system for the 355 nm band of an Nd:YAG laser, this study employed the LightTools damped least-squares (DLS) optimizer together with a GA, an artificial-intelligence optimization method, to determine the optimal aspheric coefficients and obtain the optimal solution. This study applied aspheric lenses with the GA for the flattening of laser beams, using two aspheric lenses in the aspheric-surface optical system to achieve 80% spot narrowing with a standard deviation of 0.6142.
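A minimal genetic-algorithm sketch of the selection/crossover/mutation loop is given below; the objective is a stand-in (flattening a composite profile built from weighted Gaussian beamlets), not the LightTools aspheric-lens model of the study.

```python
# Toy GA: evolve beamlet weights so the composite profile is as flat as possible
# over a central window.
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(-3, 3, 301)
centres = np.linspace(-2, 2, 9)
beamlets = np.exp(-((x[:, None] - centres[None, :]) ** 2))   # fixed Gaussian beamlets

def flatness_error(weights):
    profile = beamlets @ np.abs(weights)
    window = np.abs(x) < 1.5
    target = profile[window].mean()
    return np.sqrt(np.mean((profile[window] - target) ** 2)) / target   # relative RMS ripple

pop_size, n_genes, n_gen = 60, len(centres), 200
pop = rng.uniform(0, 1, size=(pop_size, n_genes))
for _ in range(n_gen):
    fitness = np.array([flatness_error(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[: pop_size // 2]]       # truncation selection
    children = []
    while len(children) < pop_size - len(parents):
        i, j = rng.integers(len(parents), size=2)
        cut = rng.integers(1, n_genes)                         # one-point crossover
        child = np.concatenate([parents[i][:cut], parents[j][cut:]])
        child += rng.normal(0, 0.05, n_genes) * (rng.random(n_genes) < 0.2)   # sparse mutation
        children.append(np.clip(child, 0, None))
    pop = np.vstack([parents, children])

best = min(pop, key=flatness_error)
print(f"relative RMS ripple of best profile: {flatness_error(best):.4f}")
```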
Conceptual design optimization study
NASA Technical Reports Server (NTRS)
Hollowell, S. J.; Beeman, E. R., II; Hiyama, R. M.
1990-01-01
The feasibility of applying multilevel functional decomposition and optimization techniques to conceptual design of advanced fighter aircraft was investigated. Applying the functional decomposition techniques to the conceptual design phase appears to be feasible. The initial implementation of the modified design process will optimize wing design variables. A hybrid approach, combining functional decomposition techniques for generation of aerodynamic and mass properties linear sensitivity derivatives with existing techniques for sizing mission performance and optimization, is proposed.
NASA Astrophysics Data System (ADS)
Taneja, Jayant Kumar
Electricity is an indispensable commodity to modern society, yet it is delivered via a grid architecture that remains largely unchanged over the past century. A host of factors are conspiring to topple this dated yet venerated design: developments in renewable electricity generation technology, policies to reduce greenhouse gas emissions, and advances in information technology for managing energy systems. Modern electric grids are emerging as complex distributed systems in which a portfolio of power generation resources, often incorporating fluctuating renewable resources such as wind and solar, must be managed dynamically to meet uncontrolled, time-varying demand. Uncertainty in both supply and demand makes control of modern electric grids fundamentally more challenging, and growing portfolios of renewables exacerbate the challenge. We study three electricity grids: the state of California, the province of Ontario, and the country of Germany. To understand the effects of increasing renewables, we develop a methodology to scale renewables penetration. Analyzing these grids yields key insights about rigid limits to renewables penetration and their implications in meeting long-term emissions targets. We argue that to achieve deep penetration of renewables, the operational model of the grid must be inverted, changing the paradigm from load-following supplies to supply-following loads. To alleviate the challenge of supply-demand matching on deeply renewable grids, we first examine well-known techniques, including altering management of existing supply resources, employing utility-scale energy storage, targeting energy efficiency improvements, and exercising basic demand-side management. Then, we create several instantiations of supply-following loads -- including refrigerators, heating and cooling systems, and laptop computers -- by employing a combination of sensor networks, advanced control techniques, and enhanced energy storage. We examine the capacity of each load for supply-following and study the behaviors of populations of these loads, assessing their potential at various levels of deployment throughout the California electricity grid. Using combinations of supply-following strategies, we can reduce peak natural gas generation by 19% on a model of the California grid with 60% renewables. We then assess remaining variability on this deeply renewable grid incorporating supply-following loads, characterizing additional capabilities needed to ensure supply-demand matching in future sustainable electricity grids.
The deconvolution of complex spectra by artificial immune system
NASA Astrophysics Data System (ADS)
Galiakhmetova, D. I.; Sibgatullin, M. E.; Galimullin, D. Z.; Kamalova, D. I.
2017-11-01
An application of the artificial immune system method to the decomposition of complex spectra is presented. The results of decomposing a model contour consisting of three Gaussian components are demonstrated. The artificial immune system is an optimization method inspired by the behaviour of the biological immune system and belongs to the family of modern heuristic search and optimization methods.
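For readers unfamiliar with the method, the following is a hedged, clonal-selection-style sketch of decomposing a synthetic contour into three Gaussian components; the synthetic spectrum, parameter ranges and mutation settings are illustrative and not taken from the paper.

```python
# Clonal-selection-style search that fits three Gaussian components to a
# synthetic contour; parameters (amplitude, centre, width) are encoded as a
# 9-element antibody. Purely illustrative settings.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)

def model(p):
    a = np.zeros_like(x)
    for amp, mu, sig in p.reshape(3, 3):
        a += amp * np.exp(-0.5 * ((x - mu) / max(sig, 1e-3)) ** 2)
    return a

target = model(np.array([1.0, 3.0, 0.5, 0.7, 5.0, 0.8, 0.4, 7.0, 0.6]))

def affinity(p):                      # higher is better
    return -np.sum((model(p) - target) ** 2)

pop = rng.uniform([0, 0, 0.1] * 3, [2, 10, 2] * 3, size=(40, 9))
for _ in range(200):
    order = np.argsort([-affinity(p) for p in pop])
    elite = pop[order[:10]]
    clones = np.repeat(elite, 4, axis=0)          # clone the best antibodies
    clones += rng.normal(0, 0.05, clones.shape)   # hypermutation
    pop = np.vstack([elite, clones])[:40]

best = max(pop, key=affinity)
print(best.reshape(3, 3))             # recovered (amp, centre, width) triplets
```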
An active antenna for ELF magnetic fields
NASA Technical Reports Server (NTRS)
Sutton, John F.; Spaniol, Craig
1994-01-01
The work of Nikola Tesla, especially that directed toward world-wide electrical energy distribution via excitation of the earth-ionosphere cavity resonances, has stimulated interest in the study of these resonances. Not only are they important for their potential use in the transmission of intelligence and electrical power, they are important because they are an integral part of our natural environment. This paper describes the design of a sensitive, untuned, low noise active antenna which is uniquely suited to modern earth-ionosphere cavity resonance measurements employing fast-Fourier transform techniques for near-real-time data analysis. It capitalizes on a little known field-antenna interaction mechanism. Recently, the authors made preliminary measurements of the magnetic fields in the earth-ionosphere cavity. During the course of this study, the problem of designing an optimized ELF magnetic field sensor presented itself. The sensor would have to be small, light weight (for portable use), and capable of detecting the 5-50 Hz picoTesla-level signals generated by the natural excitations of the earth-ionosphere cavity resonances. A review of the literature revealed that past researchers had employed very large search coils, both tuned and untuned. Hill and Bostick, for example, used coils of 30,000 turns wound on high permeability cores of 1.83 m length, weighing 40 kg. Tuned coils are unsuitable for modern fast-Fourier transform data analysis techniques which require a broad spectrum input. 'Untuned' coils connected to high input impedance voltage amplifiers exhibit resonant responses at the resonant frequency determined by the coil inductance and the coil distributed winding capacitance. Also, considered as antennas, they have effective areas equal only to their geometrical areas.
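For orientation (standard circuit theory, not a statement from the paper), the self-resonance mentioned above is set by the coil inductance \(L\) and its distributed winding capacitance \(C_{\mathrm{dist}}\): \( f_{\mathrm{res}} = 1/(2\pi\sqrt{L\,C_{\mathrm{dist}}}) \).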
A method for aircraft concept exploration using multicriteria interactive genetic algorithms
NASA Astrophysics Data System (ADS)
Buonanno, Michael Alexander
2005-08-01
The problem of aircraft concept selection has become increasingly difficult in recent years due to changes in the primary evaluation criteria of concepts. In the past, performance was often the primary discriminator, whereas modern programs have placed increased emphasis on factors such as environmental impact, economics, supportability, aesthetics, and other metrics. The revolutionary nature of the vehicles required to simultaneously meet these conflicting requirements has prompted a shift from design using historical data regression techniques for metric prediction to the use of sophisticated physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to select a sub-optimum baseline vehicle. Some extremely important concept decisions, such as the type of control surface arrangement to use, are frequently made without sufficient understanding of their impact on the important system metrics due to a lack of historical guidance, computational resources, or analysis tools. This thesis discusses the difficulties associated with revolutionary system design, and introduces several new techniques designed to remedy them. First, an interactive design method has been developed that allows the designer to provide feedback to a numerical optimization algorithm during runtime, thereby preventing the optimizer from exploiting weaknesses in the analytical model. This method can be used to account for subjective criteria, or as a crude measure of un-modeled quantitative criteria. Other contributions of the work include a modified Structured Genetic Algorithm that enables the efficient search of large combinatorial design hierarchies and an improved multi-objective optimization procedure that can effectively optimize several objectives simultaneously. A new conceptual design method has been created by drawing upon each of these new capabilities and aspects of more traditional design methods. The ability of this new technique to assist in the design of revolutionary vehicles has been demonstrated using a problem of contemporary interest: the concept exploration of a supersonic business jet. This problem was found to be a good demonstration case because of its novelty and unique requirements, and the results of this proof of concept exercise indicate that the new method is effective at providing additional insight into the relationship between a vehicle's requirements and its favorable attributes.
Research on the optimization of vehicle distribution routes in logistics enterprises
NASA Astrophysics Data System (ADS)
Fan, Zhigou; Ma, Mengkun
2018-01-01
With the rapid development of modern logistics, the vehicle routing problem has become one of the urgent problems in the logistics industry. The rationality of distribution route planning directly affects the efficiency and quality of logistics distribution. This paper first introduces the definition of logistics distribution and three methods for optimizing distribution routes, then analyzes a current vehicle distribution route using a representative example, and finally puts forward optimization schemes for the logistics distribution route.
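Because the abstract does not specify which of the three optimization methods it introduces, the sketch below shows only a generic nearest-neighbour route-construction heuristic for a single vehicle; the customer coordinates are invented.

```python
# Generic nearest-neighbour route construction for a single delivery vehicle.
# Customer coordinates are made up for illustration.
import numpy as np

rng = np.random.default_rng(2)
depot = np.array([0.0, 0.0])
customers = rng.uniform(-10, 10, size=(8, 2))

route, remaining, pos = [], list(range(len(customers))), depot
while remaining:
    nxt = min(remaining, key=lambda i: np.linalg.norm(customers[i] - pos))
    route.append(nxt)
    pos = customers[nxt]
    remaining.remove(nxt)

length = np.linalg.norm(customers[route[0]] - depot) + sum(
    np.linalg.norm(customers[route[k + 1]] - customers[route[k]])
    for k in range(len(route) - 1)
) + np.linalg.norm(depot - customers[route[-1]])
print("visit order:", route, "total distance:", round(float(length), 2))
```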
Optimal Damping of Perturbations of Moving Thermoelastic Panel
NASA Astrophysics Data System (ADS)
Banichuk, N. V.; Ivanova, S. Yu.
2018-01-01
The translational motion of a thermoelastic web subject to transverse vibrations caused by initial perturbations is considered. It is assumed that a web moving with a constant translational velocity is described by the model of a thermoelastic panel simply supported at its ends. The problem of optimal damping of vibrations when applying active transverse actions is formulated. For solving the optimization problem, modern methods developed in control theory for systems with distributed parameters described by partial differential equations are used.
Crossroads: Modern Interactive Intersections and Accessible Pedestrian Signals
ERIC Educational Resources Information Center
Barlow, Janet M.; Franck, Lukas
2005-01-01
This article discusses the interactive nature of modern actuated intersections and the effect of that interface on pedestrians who are visually impaired. Information is provided about accessible pedestrian signals (APS), the role of blindness professionals in APS installation decisions, and techniques for crossing streets with APS.
Sample preparation for the analysis of isoflavones from soybeans and soy foods.
Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A
2009-01-02
This manuscript reviews the current state of the art and the most recent advances, as well as current trends and future prospects, in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures, are discussed. The methods most commonly used for the extraction of isoflavones, with both conventional and "modern" techniques, are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high-performance liquid chromatography are also covered.
NASA Astrophysics Data System (ADS)
Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun
2014-08-01
To address information fusion, process integration, and collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform based on product data and business processes that adopts modern manufacturing, information and management techniques. The architecture and system integration of the digital management platform are discussed in this paper. The platform can realize information sharing and interaction across the information flow, control flow and value stream throughout the life cycle, from user requirements to end of service, and it can also enhance process control, collaborative research and the service capability of ultra-precision optical elements.
Summary of Optimization Techniques That Can Be Applied to Suspension System Design
DOT National Transportation Integrated Search
1973-03-01
Summaries are presented of the analytic techniques available for three levitated vehicle suspension optimization problems: optimization of passive elements for fixed configuration; optimization of a free passive configuration; optimization of a free ...
NASA Technical Reports Server (NTRS)
Schutz, Bob E.
1993-01-01
Satellite Laser Ranging (SLR) has a rich history of development which began in the 1960s with 10 meter-level, first-generation systems. These systems evolved with order-of-magnitude improvements to systems that now produce several-millimeter single-shot range precisions. What began, in part, as an interesting application of the new laser technology has become an essential component of modern, precision space geodesy, which in turn enables contributions to a variety of science areas. Modern space geodesy is the beneficiary of technological developments which have enabled precision geodetic measurements. Aside from SLR and its closely related technique, Lunar Laser Ranging (LLR), Very Long Baseline Interferometry (VLBI) has also made prominent science contributions. In recent years, the Global Positioning System (GPS) has shown rapidly growing popularity as a result of its demonstrated low cost and high-precision instrumentation. Other modern techniques such as DORIS have demonstrated the ability to make significant science contributions; furthermore, PRARE can be expected to contribute in its own right. An appropriate question is: why should several techniques be financially supported? While there are several answers, I offer the opinion that, in consideration of the broad science areas that benefit from space geodesy, no single technique can meet all the requirements and/or expectations of the science areas in which space geodesy contributes or has the potential for contributing. The more well-known science areas include plate tectonics, earthquake processes, Earth rotation/orientation, gravity (static and temporal), ocean circulation, and land and ice topography, to name a few applications. It is unfortunate that the modern space geodesy techniques are often viewed as competitive, but this view is usually encouraged by funding competition, especially in an era of growing needs but diminishing budgets. The techniques are, for the most part, complementary, and the ability to reduce the data to geodetic parameters from several techniques promotes confidence in the geophysical interpretations. In the following sections, the current SLR applications are reviewed in the context of the other techniques. The strengths and limitations of SLR are reviewed and speculation about future prospects is offered.
A Simple Laser Microphone for Classroom Demonstration
ERIC Educational Resources Information Center
Moses, James M.; Trout, K. P.
2006-01-01
Communication through the modulation of electromagnetic radiation has become a foundational technique in modern technology. In this paper we discuss a modern day method of eavesdropping based upon the modulation of laser light reflected from a window pane. A simple and affordable classroom demonstration of a "laser microphone" is…
Review and Analysis of Peak Tracking Techniques for Fiber Bragg Grating Sensors
2017-01-01
Fiber Bragg Grating (FBG) sensors are among the most popular elements for fiber optic sensor networks used for the direct measurement of temperature and strain. Modern FBG interrogation setups measure the FBG spectrum in real time and determine the shift of the Bragg wavelength of the FBG in order to estimate the physical parameters. The problem of determining the peak wavelength of the FBG from a spectral measurement limited in resolution and noise is referred to as the peak-tracking problem. In this work, the peak-tracking approaches are reviewed and classified, outlining their algorithmic implementations: methods based on direct estimation, interpolation, correlation, resampling, transforms, and optimization are discussed in all their proposed implementations. Then, a simulation based on coupled-mode theory compares the performance of the main peak-tracking methods in terms of accuracy and signal-to-noise-ratio resilience. PMID:29039804
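As a hedged illustration of the two simplest peak-tracking families mentioned (direct estimation and interpolation), the snippet below estimates a Bragg peak from a noisy synthetic spectrum by centroiding and by three-point parabolic interpolation; the Gaussian spectrum model is a toy stand-in, not the coupled-mode simulation used in the review.

```python
# Toy Bragg-peak tracking: centroid (direct estimation) and three-point
# parabolic interpolation around the spectral maximum.
import numpy as np

rng = np.random.default_rng(3)
wl = np.linspace(1549.5, 1550.5, 501)                  # wavelength grid, nm
true_peak = 1550.07
spectrum = np.exp(-0.5 * ((wl - true_peak) / 0.05) ** 2) + rng.normal(0, 0.02, wl.size)

# Centroid estimate over the samples above half of the maximum.
mask = spectrum > 0.5 * spectrum.max()
centroid = np.sum(wl[mask] * spectrum[mask]) / np.sum(spectrum[mask])

# Parabolic interpolation around the discrete maximum.
i = int(np.argmax(spectrum))
y0, y1, y2 = spectrum[i - 1], spectrum[i], spectrum[i + 1]
delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)           # sub-sample offset
parabolic = wl[i] + delta * (wl[1] - wl[0])

print(f"centroid: {centroid:.4f} nm, parabolic: {parabolic:.4f} nm")
```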
Micromachined fragment capturer for biomedical applications.
Choi, Young-Soo; Lee, Dong-Weon
2011-11-01
Due to changes in modern diet, a form of heart disease called chronic total occlusion has become a serious disease to be treated as an emergency. In this study, we propose a micromachined capturer that is designed and fabricated to collect plaque fragments generated during surgery to remove the thrombus. The fragment capturer consists of a plastic body made by rapid prototyping, SU-8 mesh structures using MEMS techniques, and ionic polymer metal composite (IPMC) actuators. An array of IPMC actuators combined with the SU-8 net structure was optimized to effectively collect plaque fragments. The evaporation of solvent through the actuator's surface was prevented using a coating of SU-8 and polydimethylsiloxane thin film on the actuator. This approach improved the available operating time of the IPMC, which primarily depends on solvent loss. Our preliminary results demonstrate the possibility of using the capturer for biomedical applications. © 2011 American Institute of Physics
Distance majorization and its applications
Chi, Eric C.; Zhou, Hua; Lange, Kenneth
2014-01-01
The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton’s method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications. PMID:25392563
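A stripped-down sketch of the penalty-plus-majorization idea described above (without the quasi-Newton acceleration the paper adds): project a point onto the intersection of two convex sets for which the individual projections are easy. The sets, penalty schedule and iteration count are illustrative assumptions.

```python
# Distance-majorization sketch: find the point of the intersection of a
# halfspace and a ball closest to a given point 'a'. Each iteration solves
# the majorized subproblem in closed form; rho is increased as in a
# classical penalty method. Quasi-Newton acceleration is omitted.
import numpy as np

a = np.array([3.0, 2.0])

def proj_ball(x, center=np.zeros(2), radius=1.0):      # projection onto a ball
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def proj_halfspace(x, w=np.array([1.0, 1.0]), b=0.5):  # {x : w.x <= b}
    v = w @ x - b
    return x if v <= 0 else x - v * w / (w @ w)

projections = [proj_ball, proj_halfspace]
x, rho = a.copy(), 1.0
for _ in range(200):
    targets = [P(x) for P in projections]
    # Majorized subproblem: min 0.5||x - a||^2 + (rho/2) * sum ||x - P_i(x_k)||^2
    x = (a + rho * sum(targets)) / (1.0 + rho * len(targets))
    rho *= 1.05                                        # penalty continuation

print("approximate projection onto the intersection:", np.round(x, 4))
```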
Progress and challenges in implementing the research on ESKAPE pathogens.
Rice, Louis B
2010-11-01
The ESKAPE pathogens (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species) are responsible for a substantial percentage of nosocomial infections in the modern hospital and represent the vast majority of isolates whose resistance to antimicrobial agents presents serious therapeutic dilemmas for physicians. Over the years, improved molecular biology techniques have led to detailed information about individual resistance mechanisms in all these pathogens. However, there remains a lack of compelling data on the interplay between resistance mechanisms and between the bacteria themselves. In addition, data on the impact of clinical interventions to decrease the prevalence of resistance are also lacking. The difficulty in identifying novel antimicrobial agents with reliable activity against these pathogens argues for an augmentation of research in the basic and population science of resistance, as well as careful studies to identify optimal strategies for infection control and antimicrobial use.
Sparse QSAR modelling methods for therapeutic and regenerative medicine
NASA Astrophysics Data System (ADS)
Winkler, David A.
2018-02-01
The quantitative structure-activity relationship (QSAR) method was popularized by Hansch and Fujita over 50 years ago. The usefulness of the method for drug design and development has been shown in the intervening years. As it was developed initially to elucidate which molecular properties modulated the relative potency of putative agrochemicals, and at a time when computing resources were scarce, there is much scope for applying modern mathematical methods to improve the QSAR method and to extend the general concept to the discovery and optimization of bioactive molecules and materials more broadly. I describe research over the past two decades where we have rebuilt the unit operations of the QSAR method using improved mathematical techniques, and have applied this valuable platform technology to new and important areas of research and industry such as nanoscience, omics technologies, advanced materials, and regenerative medicine. This paper was presented as the 2017 ACS Herman Skolnik lecture.
Bahraminasab, Marjan; Farahmand, Farzam
2017-09-01
The trend in biomaterials development has now headed for tailoring the properties and making hybrid materials to achieve the optimal performance metrics in a product. Modern manufacturing processes along with advanced computational techniques enable systematical fabrication of new biomaterials by design strategy. Functionally graded materials as a recent group of hybrid materials have found numerous applications in biomedical area, particularly for making orthopedic prostheses. This article, therefore, seeks to address the following research questions: (RQ1) What is the desired structure of orthopedic hybrid materials? (RQ2) What is the contribution of the literature in the development of hybrid materials in the field of orthopedic research? (RQ3) Which type of manufacturing approaches is prevalently used to build these materials for knee and hip implants? (RQ4) Is there any inadequacy in the methods applied?
Non-linear optical flow cytometry using a scanned, Bessel beam light-sheet.
Collier, Bradley B; Awasthi, Samir; Lieu, Deborah K; Chan, James W
2015-05-29
Modern flow cytometry instruments have become vital tools for high-throughput analysis of single cells. However, as issues with the cellular labeling techniques often used in flow cytometry have become more of a concern, the development of label-free modalities for cellular analysis is increasingly desired. Non-linear optical phenomena (NLO) are of growing interest for label-free analysis because of the ability to measure the intrinsic optical response of biomolecules found in cells. We demonstrate that a light-sheet consisting of a scanned Bessel beam is an optimal excitation geometry for efficiently generating NLO signals in a microfluidic environment. The balance of photon density and cross-sectional area provided by the light-sheet allowed significantly larger two-photon fluorescence intensities to be measured in a model polystyrene microparticle system compared to measurements made using other excitation focal geometries, including a relaxed Gaussian excitation beam often used in conventional flow cytometers.
Experimental validation of docking and capture using space robotics testbeds
NASA Technical Reports Server (NTRS)
Spofford, John; Schmitz, Eric; Hoff, William
1991-01-01
This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulation of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to optimize numerically a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to derive symbolically dynamic and kinematic equations.
[Medical image compression: a review].
Noreña, Tatiana; Romero, Eduardo
2013-01-01
Modern medicine is an increasingly complex, evidence-based activity; it draws on information from multiple sources: medical record text, sound recordings, and images and videos generated by a large number of devices. Medical imaging is one of the most important sources of information, since it offers comprehensive support of medical procedures for diagnosis and follow-up. However, the amount of information generated by image-capturing devices quickly exceeds the storage available in radiology services, generating additional costs for devices with greater storage capacity. Moreover, the current trend of developing applications in cloud computing has limitations: even though virtual storage is accessible from anywhere, connections are made through the internet. In these scenarios the optimal use of information necessarily requires powerful compression algorithms adapted to the needs of medical practice. In this paper we present a review of compression techniques used for image storage, and a critical analysis of them from the point of view of their use in clinical settings.
Application of CFD to the analysis and design of high-speed inlets
NASA Technical Reports Server (NTRS)
Rose, William C.
1995-01-01
Over the past seven years, efforts under the present Grant have been aimed at being able to apply modern Computational Fluid Dynamics to the design of high-speed engine inlets. In this report, a review of previous design capabilities (prior to the advent of functioning CFD) was presented and the example of the NASA 'Mach 5 inlet' design was given as the premier example of the historical approach to inlet design. The philosophy used in the Mach 5 inlet design was carried forward in the present study, in which CFD was used to design a new Mach 10 inlet. An example of an inlet redesign was also shown. These latter efforts were carried out using today's state-of-the-art, full computational fluid dynamics codes applied in an iterative man-in-the-loop technique. The potential usefulness of an automated machine design capability using an optimizer code was also discussed.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Modern Observational Techniques for Comets
NASA Technical Reports Server (NTRS)
Brandt, J. C. (Editor); Greenberg, J. M. (Editor); Donn, B. (Editor); Rahe, J. (Editor)
1981-01-01
Techniques are discussed in the following areas: astrometry, photometry, infrared observations, radio observations, spectroscopy, imaging of coma and tail, image processing of observation. The determination of the chemical composition and physical structure of comets is highlighted.
Study on feed forward neural network convex optimization for LiFePO4 battery parameters
NASA Astrophysics Data System (ADS)
Liu, Xuepeng; Zhao, Dongmei
2017-08-01
Motivated by the LiFePO4 batteries used in modern automated walking equipment for facility agriculture, the parameter identification of the LiFePO4 battery is analyzed. An improved method for the battery process model is proposed, and an on-line estimation algorithm is presented. The parameters of the battery are identified using a feed-forward neural network convex optimization algorithm.
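The abstract leaves the network and training procedure unspecified, so the following is only a rough stand-in: a tiny feed-forward network fitted by plain gradient descent to a synthetic open-circuit-voltage curve; the data, architecture and learning rate are invented.

```python
# Tiny feed-forward network (one hidden layer, trained by plain gradient
# descent) fitted to a synthetic OCV-vs-SOC curve as a stand-in for the
# battery parameter estimation described in the abstract.
import numpy as np

rng = np.random.default_rng(4)
soc = np.linspace(0.05, 0.95, 60).reshape(-1, 1)                  # state of charge
ocv = 3.2 + 0.7 * soc + 0.05 * np.sin(6 * soc) + rng.normal(0, 0.005, soc.shape)

W1, b1 = rng.normal(0, 0.5, (1, 16)), np.zeros(16)
W2, b2 = rng.normal(0, 0.5, (16, 1)), np.zeros(1)
lr = 0.05

for _ in range(5000):
    h = np.tanh(soc @ W1 + b1)                 # hidden layer
    pred = h @ W2 + b2
    err = pred - ocv
    # Backpropagation for the mean-squared error.
    gW2 = h.T @ err / len(soc); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = soc.T @ dh / len(soc); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("max fit error [V]:", float(np.abs(pred - ocv).max()))
```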
Challenging the Sacred Assumption: A Call for a Systemic Review of Army Aviation Maintenance
2017-05-25
structure, training, equipping and sustainment. Each study intends to optimize the force structure to achieve a balance between the modernization and... operational budgets. Since 1994, Army Aviation force structures, training resources, available equipment and aircraft have changed significantly. Yet... and are focused on force structure, training, equipping and sustainment.
From experimental imaging techniques to virtual embryology.
Weninger, Wolfgang J; Tassy, Olivier; Darras, Sébastien; Geyer, Stefan H; Thieffry, Denis
2004-01-01
Modern embryology increasingly relies on descriptive and functional three-dimensional (3D) and four-dimensional (4D) analysis of physically, optically, or virtually sectioned specimens. To cope with the technical requirements, new methods for highly detailed in vivo imaging, as well as for the generation of high-resolution digital volume data sets for the accurate visualisation of transgene activity and gene product presence in the context of embryo morphology, have recently been developed and continue to be refined. These methods profoundly change the scientific applicability, appearance and style of modern embryo representations. In this paper, we present an overview of the emerging techniques to create, visualise and administrate embryo representations (databases, digital data sets, 3-4D embryo reconstructions, models, etc.), and discuss the implications of these new methods for the work of modern embryologists, including research, teaching, the selection of specific model organisms, and potential collaborators.
Gotti, Roberto; Amadesi, Elisa; Fiori, Jessica; Bosi, Sara; Bregola, Valeria; Marotti, Ilaria; Dinelli, Giovanni
2018-01-12
Phenolic compounds have received great attention among the health-promoting phytochemicals in common wheat (Triticum aestivum L.), mainly because of their strong antioxidant properties. In the present study a simple Capillary Zone Electrophoresis (CZE) method with UV detection was optimized and validated for the quantitation of six of the most important phenolic acids in whole grain, i.e. sinapic, ferulic, syringic, p-coumaric, vanillic and p-hydroxybenzoic acid. The separation was achieved in a running buffer composed of sodium phosphate solution (50 mM) in water/methanol 80:20 (v/v) at pH 6.0, using a fused-silica capillary at a temperature of 30 °C under application of 27 kV. By means of a diode array detector, and made possible by the favorable characteristic UV spectra, the quantitation of the solutes was carried out at 200, 220 and 300 nm in the complex matrices represented by the soluble and bound fractions of wheat flours. The validation parameters of the method, i.e. linearity, sensitivity, precision, accuracy and robustness, were in line with those obtained by consolidated separation techniques applied for the same purposes (e.g., HPLC-UV), with a significant advantage in terms of analysis time (less than 12 min). Ten varieties of soft wheat (five modern Italian and five old Italian genotypes) were analysed and the data were subjected to Principal Components Analysis (PCA). Interestingly, significant differences in the quantitative phenolic acid profiles were observed between the modern and the ancient genotypes, with the latter showing higher amounts of the main phenolic acids. Copyright © 2017 Elsevier B.V. All rights reserved.
Chamrád, Ivo; Rix, Uwe; Stukalov, Alexey; Gridling, Manuela; Parapatics, Katja; Müller, André C.; Altiok, Soner; Colinge, Jacques; Superti-Furga, Giulio; Haura, Eric B.; Bennett, Keiryn L.
2014-01-01
While targeted therapy based on the idea of attenuating the activity of a preselected, therapeutically relevant protein has become one of the major trends in modern cancer therapy, no truly specific targeted drug has been developed and most clinical agents have displayed a degree of polypharmacology. Therefore, the specificity of anticancer therapeutics has emerged as a highly important but severely underestimated issue. Chemical proteomics is a powerful technique combining postgenomic drug-affinity chromatography with high-end mass spectrometry analysis and bioinformatic data processing to assemble a target profile of a desired therapeutic molecule. Due to high demands on the starting material, however, chemical proteomic studies have been mostly limited to cancer cell lines. Herein, we report a down-scaling of the technique to enable the analysis of very low abundance samples, as those obtained from needle biopsies. By a systematic investigation of several important parameters in pull-downs with the multikinase inhibitor bosutinib, the standard experimental protocol was optimized to 100 µg protein input. At this level, more than 30 well-known targets were detected per single pull-down replicate with high reproducibility. Moreover, as presented by the comprehensive target profile obtained from miniaturized pull-downs with another clinical drug, dasatinib, the optimized protocol seems to be extendable to other drugs of interest. Sixty distinct human and murine targets were finally identified for bosutinib and dasatinib in chemical proteomic experiments utilizing core needle biopsy samples from xenotransplants derived from patient tumor tissue. Altogether, the developed methodology proves robust and generic and holds many promises for the field of personalized health care. PMID:23901793
Developments in flow visualization methods for flight research
NASA Technical Reports Server (NTRS)
Holmes, Bruce J.; Obara, Clifford J.; Manuel, Gregory S.; Lee, Cynthia C.
1990-01-01
With the introduction of modern airplanes utilizing laminar flow, flow visualization has become an important diagnostic tool in determining aerodynamic characteristics such as surface flow direction and boundary-layer state. A refinement of the sublimating chemical technique has been developed to define both the boundary-layer transition location and the transition mode. In response to the need for flow visualization at subsonic and transonic speeds and altitudes above 20,000 feet, the liquid crystal technique has been developed. A third flow visualization technique that has been used is infrared imaging, which offers non-intrusive testing over a wide range of test conditions. A review of these flow visualization methods and recent flight results is presented for a variety of modern aircraft and flight conditions.
NASA Astrophysics Data System (ADS)
Davendralingam, Navindran
Conceptual design of aircraft and the airline network (routes) on which aircraft fly are inextricably linked to passenger-driven demand. Many factors influence passenger demand for various Origin-Destination (O-D) city pairs, including demographics, geographic location, seasonality, socio-economic factors and, naturally, the operations of directly competing airlines. The expansion of airline operations involves the identification of appropriate aircraft to meet projected future demand. The decisions made in incorporating and subsequently allocating these new aircraft to serve air travel demand affect the inherent risk and profit potential as predicted through the airline revenue management systems. Competition between airlines then translates to latent passenger observations of the routes served between O-D pairs and ticket pricing; this in effect reflexively drives future states of demand. This thesis addresses the integrated nature of aircraft design, airline operations and passenger demand in order to maximize future expected profits as new aircraft are brought into service. The goal of this research is to develop an approach that treats aircraft design, airline network design and passenger demand as a unified framework in order to provide better integrated design solutions that maximize the expected profits of an airline. This is investigated through two approaches. The first is a static model that poses the concurrent-engineering paradigm above as an investment portfolio problem. Modern financial portfolio optimization techniques are used to weigh the risk of serving future projected demand with a 'yet to be introduced' aircraft against the potential future profits. Robust optimization methodologies are incorporated to mitigate model sensitivity and address the estimation risks associated with such optimization techniques. The second extends the portfolio approach to include the dynamic effects of an airline's operations. A dynamic programming approach is employed to simulate the reflexive nature of airline supply-demand interactions by modeling the aggregate changes in demand that would result from tactical allocations of aircraft to maximize profit. The best yet-to-be-introduced aircraft maximizes profit by minimizing the long-term fleetwide direct operating costs.
Drug eluting stents and modern stent technologies for in-stent restenosis.
Werner, Martin
2017-08-01
The implantation of metallic stents has become a standard procedure to improve the outcome after angioplasty of peripheral vessels. However, the occurrence of in-stent restenosis hampers the long-term efficacy of these procedures and is associated with the re-occurrence of symptoms. The optimal treatment modality for in-stent restenosis in the peripheral vasculature has not been well examined. This review discusses the existing evidence for the treatment of in-stent restenosis with drug-eluting stents and modern stent technologies.
Kalman Filter Tracking on Parallel Architectures
NASA Astrophysics Data System (ADS)
Cerati, Giuseppe; Elmer, Peter; Lantz, Steven; McDermott, Kevin; Riley, Dan; Tadel, Matevž; Wittich, Peter; Würthwein, Frank; Yagil, Avi
2015-12-01
Power density constraints are limiting the performance improvements of modern CPUs. To address this we have seen the introduction of lower-power, multi-core processors, but the future will be even more exciting. In order to stay within the power density limits but still obtain Moore's Law performance/price gains, it will be necessary to parallelize algorithms to exploit larger numbers of lightweight cores and specialized functions like large vector units. Example technologies today include Intel's Xeon Phi and GPGPUs. Track finding and fitting is one of the most computationally challenging problems for event reconstruction in particle physics. At the High Luminosity LHC, for example, this will be by far the dominant problem. The need for greater parallelism has driven investigations of very different track finding techniques including Cellular Automata or returning to Hough Transform. The most common track finding techniques in use today are however those based on the Kalman Filter [2]. Significant experience has been accumulated with these techniques on real tracking detector systems, both in the trigger and offline. They are known to provide high physics performance, are robust and are exactly those being used today for the design of the tracking system for HL-LHC. Our previous investigations showed that, using optimized data structures, track fitting with Kalman Filter can achieve large speedup both with Intel Xeon and Xeon Phi. We report here our further progress towards an end-to-end track reconstruction algorithm fully exploiting vectorization and parallelization techniques in a realistic simulation setup.
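For orientation, the per-track arithmetic being vectorized and parallelized is the textbook Kalman predict/update cycle; the toy Python below (not the authors' C++ implementation) shows those operations for a single constant-velocity track with made-up hit positions.

```python
# Textbook Kalman filter predict/update for one track and one measurement;
# the parallelization work described in the abstract amounts to running
# many of these small matrix operations concurrently across track candidates.
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])     # constant-velocity transition
H = np.array([[1.0, 0.0]])                 # position-only measurement
Q = 1e-3 * np.eye(2)                       # process noise
R = np.array([[0.05]])                     # measurement noise

x = np.array([0.0, 1.0])                   # state: position, velocity
P = np.eye(2)

for z in [1.04, 1.98, 3.05, 3.96]:         # toy hit positions
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("final state (position, velocity):", np.round(x, 3))
```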
A novel, modernized Golgi-Cox stain optimized for CLARITY cleared tissue.
Kassem, Mustafa S; Fok, Sandra Y Y; Smith, Kristie L; Kuligowski, Michael; Balleine, Bernard W
2018-01-15
High resolution neuronal information is extraordinarily useful in understanding the brain's functionality. The development of the Golgi-Cox stain allowed observation of the neuron in its entirety with unrivalled detail. Tissue clearing techniques, e.g., CLARITY and CUBIC, provide the potential to observe entire neuronal circuits intact within tissue and without previous restrictions with regard to section thickness. Here we describe an improved Golgi-Cox stain method, optimised for use with CLARITY and CUBIC, that can be used in both fresh and fixed tissue. Using this method, we were able to observe neurons in their entirety within a fraction of the time traditionally taken to clear tissue (48 h). We were also able to show for the first time that Golgi-stained tissue is fluorescent when visualized using a multi-photon microscope, allowing us to image synaptic spines with detail previously unachievable. These novel methods provide cheap and easy-to-use techniques to investigate the morphology of cellular processes in the brain at a new-found depth, speed, utility and detail, without previous restrictions of time, tissue type and section thickness. This is the first application of a Golgi-Cox stain to cleared brain tissue; it is investigated and discussed in detail, describing the different methodologies that may be used, a comparison between the different clearing techniques and, lastly, the novel interaction of these techniques with this ultra-rapid stain. Copyright © 2017 Elsevier B.V. All rights reserved.
Instrumentation and fusion for congenital spine deformities.
Hedequist, Daniel J
2009-08-01
A retrospective clinical review. To review the use of modern instrumentation of the spine for congenital spinal deformities. Spinal instrumentation has evolved since the advent of the Harrington rod. There is a paucity of literature, which discusses the use of modern spinal instrumentation in congenital spine deformity cases. This review focuses on modern instrumentation techniques for congenital scoliosis and kyphosis. A systematic review was performed of the literature to discuss spinal implant use for congenital deformities. Spinal instrumentation may be safely and effectively used in cases of congenital spinal deformity. Spinal surgeons taking care of children with congenital spine deformities need to be trained in all aspects of modern spinal instrumentation.
Nicolodelli, Gustavo; Senesi, Giorgio Saverio; de Oliveira Perazzoli, Ivan Luiz; Marangoni, Bruno Spolon; De Melo Benites, Vinícius; Milori, Débora Marcondes Bastos Pereira
2016-09-15
Organic fertilizers are obtained from waste of plant or animal origin. One of the advantages of organic fertilizers is that, through composting, they recycle organic waste of urban and agricultural origin, whose disposal would otherwise cause environmental impacts. The need for fast and accurate analysis of both major and minor/trace elements contained in new-generation organic mineral and inorganic fertilizers has promoted the application of modern analytical techniques. In particular, laser-induced breakdown spectroscopy (LIBS) is proving to be a very promising, quick and practical technique to detect and measure contaminants and nutrients in fertilizers. Although this technique presents some limitations, such as low sensitivity compared to other spectroscopic techniques, the use of double-pulse (DP) LIBS is an alternative to conventional single-pulse (SP) LIBS. The macronutrients (Ca, Mg, K, P), micronutrients (Cu, Fe, Na, Mn, Zn) and the contaminant Cr in fertilizers were evaluated using LIBS in SP and DP configurations. A comparative study of both configurations was performed using key parameters optimized for improving LIBS performance. The limit of detection (LOD) values obtained by DP LIBS improved by up to seven times compared with SP LIBS. In general, the marked improvement obtained when using the DP system in the simultaneous LIBS quantitative determination for fertilizer analysis could be ascribed to the larger ablated mass of the sample. The results presented in this study show the promising potential of the DP LIBS technique for a qualitative analysis of fertilizers, without requiring sample preparation with chemical reagents. Copyright © 2016 Elsevier B.V. All rights reserved.
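The abstract does not state how the LODs were computed; a convention commonly used in LIBS work (assumed here for orientation only) is \( \mathrm{LOD} = 3\,\sigma_{\mathrm{blank}}/S \), where \( \sigma_{\mathrm{blank}} \) is the standard deviation of the background signal and \( S \) is the slope of the calibration curve.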
How Farmers Learn about Environmental Issues: Reflections on a Sociobiographical Approach
ERIC Educational Resources Information Center
Vandenabeele, Joke; Wildemeersch, Danny
2012-01-01
At the time of this research, protests of farmers against new environmental policy measures received much media attention. News reports suggested that farmers' organizations rejected the idea that modern farming techniques cause damage to the environment and even tried to undermine attempts to reconcile the goals of modern agriculture with…
Older Learning Engagement in the Modern City
ERIC Educational Resources Information Center
Lido, Catherine; Osborne, Michael; Livingston, Mark; Thakuriah, Piyushimita; Sila-Nowicka, Katarzyna
2016-01-01
This research employs novel techniques to examine older learners' journeys, educationally and physically, in order to gain a "three-dimensional" picture of lifelong learning in the modern urban context of Glasgow. The data offers preliminary analyses of an ongoing 1,500 household survey by the Urban Big Data Centre (UBDC). A sample of…
Commodification of Ghana's Volta River: An Example of Ellul's Autonomy of Technique
ERIC Educational Resources Information Center
Agbemabiese, Lawrence; Byrne, John
2005-01-01
Jacques Ellul argued that modernity's nearly exclusive reliance on science and technology to design society would threaten human freedom. Of particular concern for Ellul was the prospect of the technical milieu overwhelming culture. The commodification of the Volta River in order to modernize Ghana illustrates the Ellulian dilemma of the autonomy…
Modern Methodology and Techniques Aimed at Developing the Environmentally Responsible Personality
ERIC Educational Resources Information Center
Ponomarenko, Yelena V.; Zholdasbekova, Bibisara A.; Balabekov, Aidarhan T.; Kenzhebekova, Rabiga I.; Yessaliyev, Aidarbek A.; Larchenkova, Liudmila A.
2016-01-01
The article discusses the positive impact of an environmentally responsible individual as the social unit able to live in harmony with the natural world, himself/herself and other people. The purpose of the article is to provide theoretical substantiation of modern teaching methods. The authors considered the experience of philosophy, psychology,…
Pape, G; Raiss, P; Kleinschmidt, K; Schuld, C; Mohr, G; Loew, M; Rickert, M
2010-12-01
Loosening of the glenoid component is one of the major causes of failure in total shoulder arthroplasty. Possible risk factors for loosening of cemented components include eccentric loading, poor bone quality, inadequate cementing technique and insufficient cement penetration. The application of a modern cementing technique has become an established procedure in total hip arthroplasty. The goal of modern cementing techniques in general is to improve cement penetration into the cancellous bone. Modern cementing techniques include the cement vacuum-mixing technique, retrograde filling of the cement under pressurisation and the use of a pulsatile lavage system. The main purpose of this study was to analyse cement penetration into the glenoid bone when using modern cementing techniques and to investigate the relationship between bone mineral density (BMD) and cement penetration. Furthermore, we measured the temperature at the glenoid surface before and after jet-lavage in different patients during total shoulder arthroplasty. It is known that the surrounding temperature of the bone has an effect on the polymerisation of the cement. Data from this experiment provide the temperature setting for the in-vitro study. The glenoid surface temperature was measured in 10 patients with a hand-held non-contact temperature measurement device. The bone mineral density was measured by DEXA. Eight paired cadaver scapulae were allocated (n = 16). Each pair comprised two scapulae from one donor (matched-pair design). Two different glenoid components were used, one with pegs and the other with a keel. The glenoids for the in-vitro study were prepared with the bone compaction technique by the same surgeon in all cases. Pulsatile lavage was used to clean the glenoid of blood and bone fragments. Low-viscosity bone cement was applied retrogradely into the glenoid using a syringe. A constant pressure was applied with a modified force-sensor impactor. Micro-computed tomography scans were used to analyse the cement penetration into the cancellous bone. The mean temperature during the in-vivo arthroplasty of the glenoid was 29.4 °C (27.2-31 °C) before and 26.2 °C (25-27.5 °C) after jet-lavage. The overall peak BMD was 0.59 (range 0.33-0.99) g/cm². Mean cement penetration was 107.9 (range 67.6-142.3) mm² in the peg group and 128.3 (range 102.6-170.8) mm² in the keel group. The thickness of the cement layer varied from 0 to 2.1 mm in the pegged group and from 0 to 2.4 mm in the keeled group. A strong negative correlation between BMD and mean cement penetration was found for the peg group (r = -0.834; p < 0.01) and for the keel group (r = -0.727; p < 0.041). Micro-CT showed an inhomogeneous dispersion of the cement within the cancellous bone. Data from the in-vivo temperature measurement indicate that the temperature at the glenohumeral surface during the operation differs from the body core temperature and should be considered in further in-vitro studies with human specimens. Bone mineral density is negatively correlated with cement penetration in the glenoid. The application of a modern cementing technique in the glenoid provides sufficient cement penetration, although the dispersion of the cement is inhomogeneous. The findings of this study should be considered in further discussions about cementing technique and cement penetration into the cancellous bone of the glenoid. © Georg Thieme Verlag KG Stuttgart · New York.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shortt, Ken; Davidsson, Lena; Hendry, Jolyon
2008-05-01
The International Atomic Energy Agency organized an international conference called, 'Quality Assurance and New Techniques in Radiation Medicine' (QANTRM). It dealt with quality assurance (QA) in all aspects of radiation medicine (diagnostic radiology, nuclear medicine, and radiotherapy) at the international level. Participants discussed QA issues pertaining to the implementation of new technologies and the need for education and staff training. The advantage of developing a comprehensive and harmonized approach to QA covering both the technical and the managerial issues was emphasized to ensure the optimization of benefits to patient safety and effectiveness. The necessary coupling between medical radiation imaging and radiotherapy was stressed, particularly for advanced technologies. However, the need for a more systematic approach to the adoption of advanced technologies was underscored by a report on failures in intensity-modulated radiotherapy dosimetry auditing tests in the United States, which could imply inadequate implementation of QA for these new technologies. A plenary session addressed the socioeconomic impact of introducing advanced technologies in resource-limited settings. How shall the dual gaps, one in access to basic medical services and the other in access to high-quality modern technology, be addressed?
Warner, Daniel; Dijkstra, Jan; Hendriks, Wouter H; Pellikaan, Wilbert F
2014-03-30
Knowledge of digesta passage kinetics in ruminants is essential to predict nutrient supply to the animal in relation to optimal animal performance, environmental pollution and animal health. Fractional passage rates (FPR) of feed are widely used in modern feed evaluation systems and mechanistic rumen models, but data on nutrient-specific FPR are scarce. Such models generally rely on conventional external marker techniques, which do not always describe digesta passage kinetics in a satisfactory manner. Here the use of stable isotope-labelled dietary nutrients as a promising novel tool to assess nutrient-specific passage kinetics is discussed. Some major limitations of this technique include a potential marker migration, a poor isotope distribution in the labelled feed and a differential disappearance rate of isotopes upon microbial fermentation in non-steady state conditions. Such limitations can often be circumvented by using intrinsically stable isotope-labelled plant material. Data are limited but indicate that external particulate markers overestimate rumen FPR of plant fibre compared with the internal stable isotope markers. Stable isotopes undergo the same digestive mechanism as the labelled feed components and are thus of particular interest to specifically measure passage kinetics of digestible dietary nutrients. © 2013 Society of Chemical Industry.
Shortt, Ken; Davidsson, Lena; Hendry, Jolyon; Dondi, Maurizio; Andreo, Pedro
2008-01-01
The International Atomic Energy Agency organized an international conference called, "Quality Assurance and New Techniques in Radiation Medicine" (QANTRM). It dealt with quality assurance (QA) in all aspects of radiation medicine (diagnostic radiology, nuclear medicine, and radiotherapy) at the international level. Participants discussed QA issues pertaining to the implementation of new technologies and the need for education and staff training. The advantage of developing a comprehensive and harmonized approach to QA covering both the technical and the managerial issues was emphasized to ensure the optimization of benefits to patient safety and effectiveness. The necessary coupling between medical radiation imaging and radiotherapy was stressed, particularly for advanced technologies. However, the need for a more systematic approach to the adoption of advanced technologies was underscored by a report on failures in intensity-modulated radiotherapy dosimetry auditing tests in the United States, which could imply inadequate implementation of QA for these new technologies. A plenary session addressed the socioeconomic impact of introducing advanced technologies in resource-limited settings. How shall the dual gaps, one in access to basic medical services and the other in access to high-quality modern technology, be addressed?
Hua, Shanshan; Liang, Jie; Zeng, Guangming; Xu, Min; Zhang, Chang; Yuan, Yujie; Li, Xiaodong; Li, Ping; Liu, Jiayu; Huang, Lu
2015-11-15
Groundwater management in China has been facing challenges from both climate change and urbanization and is considered a national priority nowadays. However, unprecedented uncertainty exists in future scenarios, making it difficult to formulate management planning paradigms. In this paper, we apply modern portfolio theory (MPT) to formulate an optimal staged investment in groundwater contamination remediation in China. This approach generates optimal weights of investment for each stage of groundwater management and helps maximize expected return while minimizing overall risk in the future. We find that the efficient frontier of investment displays an upward-sloping shape in risk-return space. The expected value of the groundwater vulnerability index increases from 0.6118 to 0.6230 as the risk from uncertainty increases from 0.0118 to 0.0297. If the management investment is constrained not to exceed a certain total cost until the year 2050, the efficient frontier can help decision makers make the most appropriate choice in the trade-off between risk and return. Copyright © 2015 Elsevier Ltd. All rights reserved.
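A minimal Markowitz-style sweep in the spirit of MPT (the stage "returns", covariances and risk-aversion values below are invented, not the study's groundwater data): each risk-aversion setting yields an optimal split of investment across remediation stages, and the collection of solutions traces an efficient frontier.

```python
# Minimal mean-variance sweep: for each risk-aversion value, find stage
# investment weights that trade expected return against variance.
# Returns and covariances are invented for illustration.
import numpy as np

mu = np.array([0.04, 0.06, 0.09])                  # expected "return" per stage
cov = np.array([[0.0004, 0.0001, 0.0000],
                [0.0001, 0.0009, 0.0002],
                [0.0000, 0.0002, 0.0025]])

def optimal_weights(gamma, n_grid=41):
    # Brute-force search over the 2-simplex (fine for three stages).
    best, best_val = None, -np.inf
    for w1 in np.linspace(0, 1, n_grid):
        for w2 in np.linspace(0, 1 - w1, n_grid):
            w = np.array([w1, w2, 1 - w1 - w2])
            val = mu @ w - gamma * w @ cov @ w     # mean-variance objective
            if val > best_val:
                best, best_val = w, val
    return best

for gamma in (1.0, 5.0, 20.0):                     # low to high risk aversion
    w = optimal_weights(gamma)
    print(f"gamma={gamma:>4}: weights={np.round(w, 2)}, "
          f"return={mu @ w:.4f}, risk={np.sqrt(w @ cov @ w):.4f}")
```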
Docosahexaenoic Acid and Cognition throughout the Lifespan
Weiser, Michael J.; Butt, Christopher M.; Mohajeri, M. Hasan
2016-01-01
Docosahexaenoic acid (DHA) is the predominant omega-3 (n-3) polyunsaturated fatty acid (PUFA) found in the brain and can affect neurological function by modulating signal transduction pathways, neurotransmission, neurogenesis, myelination, membrane receptor function, synaptic plasticity, neuroinflammation, membrane integrity and membrane organization. DHA is rapidly accumulated in the brain during gestation and early infancy, and the availability of DHA via transfer from maternal stores impacts the degree of DHA incorporation into neural tissues. The consumption of DHA leads to many positive physiological and behavioral effects, including those on cognition. Advanced cognitive function is uniquely human, and the optimal development and aging of cognitive abilities has profound impacts on quality of life, productivity, and advancement of society in general. However, the modern diet typically lacks appreciable amounts of DHA. Therefore, in modern populations, maintaining optimal levels of DHA in the brain throughout the lifespan likely requires obtaining preformed DHA via dietary or supplemental sources. In this review, we examine the role of DHA in optimal cognition during development, adulthood, and aging with a focus on human evidence and putative mechanisms of action. PMID:26901223
Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence
2013-03-01
Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of an association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
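As a hedged illustration of the kind of contrast the authors describe (not their exact procedures), the sketch below compares an ordinary Pearson correlation with two more outlier-resistant summaries, a Spearman rank correlation and a Pearson correlation recomputed after trimming extreme points; the data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 0.5 * x + rng.normal(scale=0.5, size=200)
y[:5] += 8.0                                   # inject a few gross outliers

pearson = stats.pearsonr(x, y)[0]              # classic, sensitive to outliers
spearman = stats.spearmanr(x, y)[0]            # rank-based, more robust

# Crude trimming: drop the 5% largest residuals from a preliminary straight-line fit.
slope, intercept = np.polyfit(x, y, 1)
resid = np.abs(y - (slope * x + intercept))
keep = resid < np.quantile(resid, 0.95)
trimmed = stats.pearsonr(x[keep], y[keep])[0]

print(f"Pearson={pearson:.3f}  Spearman={spearman:.3f}  trimmed Pearson={trimmed:.3f}")
```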
Thermodynamic metrics and optimal paths.
Sivak, David A; Crooks, Gavin E
2012-05-11
A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
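For orientation, the linear-response structure the abstract refers to can be written compactly; the notation below is a generic sketch consistent with the thermodynamic-length literature rather than a transcription of the paper's equations.

```latex
% Friction tensor over control parameters \lambda, built from equilibrium
% fluctuations of the forces X_i conjugate to the \lambda_i:
\zeta_{ij}(\boldsymbol{\lambda}) = \beta \int_{0}^{\infty}
    \left\langle \delta X_{j}(t)\, \delta X_{i}(0) \right\rangle_{\boldsymbol{\lambda}} \, dt .

% In linear response, the mean excess (dissipated) work of a finite-time protocol
% \lambda(t) and the associated thermodynamic length are
\left\langle W_{\mathrm{ex}} \right\rangle \approx \int_{0}^{\tau}
    \dot{\boldsymbol{\lambda}}^{\mathsf{T}} \zeta(\boldsymbol{\lambda})
    \dot{\boldsymbol{\lambda}} \, dt ,
\qquad
\mathcal{L} = \int_{0}^{\tau}
    \sqrt{\dot{\boldsymbol{\lambda}}^{\mathsf{T}} \zeta(\boldsymbol{\lambda})
    \dot{\boldsymbol{\lambda}}} \; dt ,

% so minimally dissipative protocols follow geodesics of the metric \zeta.
```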
Organizational Decision Making
1975-08-01
…the lack of formal techniques typically used by large organizations, digress on the advantages of formal over informal… optimization; for example, one might do a number of optimization calculations, each time using a different measure of effectiveness as the optimized …final decision. The next level of computer application involves the use of computerized optimization techniques.
Modern drug discovery technologies: opportunities and challenges in lead discovery.
Guido, Rafael V C; Oliva, Glaucius; Andricopulo, Adriano D
2011-12-01
The identification of promising hits and the generation of high-quality leads are crucial steps in the early stages of drug discovery projects. The definition and assessment of both chemical and biological space have revitalized the screening process model and emphasized the importance of exploring the intrinsic complementary nature of classical and modern methods in drug research. In this context, the widespread use of combinatorial chemistry and sophisticated screening methods for the discovery of lead compounds has created a large demand for small organic molecules that act on specific drug targets. Modern drug discovery involves the employment of a wide variety of technologies and expertise in multidisciplinary research teams. The synergistic effects between experimental and computational approaches to the selection and optimization of bioactive compounds emphasize the importance of integrating advanced technologies in drug discovery programs. These technologies (VS, HTS, SBDD, LBDD, QSAR, and so on) are complementary in the sense that they have shared goals, and thus the combination of empirical and in silico efforts is feasible at many different levels of lead optimization and new chemical entity (NCE) discovery. This paper provides a brief perspective on the evolution and use of key drug design technologies, highlighting opportunities and challenges.
The effects of modern cementing techniques on the longevity of total hip arthroplasty.
Poss, R; Brick, G W; Wright, R J; Roberts, D W; Sledge, C B
1988-07-01
Modern prosthetic design and cementing techniques have dramatically improved femoral component fixation. Compared to studies reported in the 1970s, the incidence of radiographic loosening for periods up to 5 years postoperatively has been reduced by at least a factor of 10. These results are the benchmark by which alternative forms of femoral component fixation must be measured. With the likelihood of increased longevity of total hip arthroplasty resulting from improved fixation, the problems of wear debris from the bearing surfaces and loss of bone stock with time will become preeminent.
Lecomte, Dominique; Plu, Isabelle; Froment, Alain
2012-06-01
Forensic examination is often requested when skeletal remains are discovered. Detailed visual observation can provide much information, such as the human or animal origin, sex, age, stature, and ancestry, and approximate time since death. New three-dimensional imaging techniques can provide further information (osteometry, facial reconstruction). Bone chemistry, and particularly measurement of stable or unstable carbon and nitrogen isotopes, yields information on diet and time since death, respectively. Genetic analyses of ancient DNA are also developing rapidly. Although seldom used in a judicial context, these modern anthropologic techniques are nevertheless available for the most complex cases.
Stability-Constrained Aerodynamic Shape Optimization with Applications to Flying Wings
NASA Astrophysics Data System (ADS)
Mader, Charles Alexander
A set of techniques is developed that allows the incorporation of flight dynamics metrics as an additional discipline in a high-fidelity aerodynamic optimization. Specifically, techniques for including static stability constraints and handling qualities constraints in a high-fidelity aerodynamic optimization are demonstrated. These constraints are developed from stability derivative information calculated using high-fidelity computational fluid dynamics (CFD). Two techniques are explored for computing the stability derivatives from CFD. One technique uses an automatic differentiation adjoint technique (ADjoint) to efficiently and accurately compute a full set of static and dynamic stability derivatives from a single steady solution. The other technique uses a linear regression method to compute the stability derivatives from a quasi-unsteady time-spectral CFD solution, allowing for the computation of static, dynamic and transient stability derivatives. Based on the characteristics of the two methods, the time-spectral technique is selected for further development, incorporated into an optimization framework, and used to conduct stability-constrained aerodynamic optimization. This stability-constrained optimization framework is then used to conduct an optimization study of a flying wing configuration. This study shows that stability constraints have a significant impact on the optimal design of flying wings and that, while static stability constraints can often be satisfied by modifying the airfoil profiles of the wing, dynamic stability constraints can require a significant change in the planform of the aircraft in order for the constraints to be satisfied.
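To make the regression idea concrete, here is a hedged sketch with synthetic data (not the thesis's CFD outputs): the pitching-moment coefficient is sampled at several angles of attack and a static stability derivative is recovered by ordinary least squares.

```python
import numpy as np

# Synthetic force-coefficient samples standing in for CFD solutions at
# several angles of attack (radians); the values are illustrative only.
alpha = np.deg2rad(np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0]))
cm = np.array([0.030, 0.016, 0.001, -0.014, -0.030, -0.044])   # pitching-moment coefficient

# Linear model Cm(alpha) ~ Cm0 + Cm_alpha * alpha, fitted by least squares.
A = np.column_stack([np.ones_like(alpha), alpha])
(cm0, cm_alpha), *_ = np.linalg.lstsq(A, cm, rcond=None)

print(f"Cm0 = {cm0:.4f}, Cm_alpha = {cm_alpha:.4f} per rad")
# A negative Cm_alpha indicates static pitch stability, which an optimizer
# could enforce as an inequality constraint.
```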
Evolutionary optimization methods for accelerator design
NASA Astrophysics Data System (ADS)
Poklonskiy, Alexey A.
Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features, such as ease of implementation, modest requirements on the objective function, good tolerance to noise, robustness, and the ability to perform an efficient global search. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe in detail GATool, the evolutionary algorithm and software package used in this work. We then use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the trade-off between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package of the COSY Infinity scientific computing package. We design a model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and the methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT. We assess REPROPT's performance on the standard constrained optimization test problems for EAs with a variety of different configurations and suggest optimal default parameter values based on the results. We then study the performance of the REPA method on the same set of test problems and compare the obtained results with those of several commonly used constrained optimization methods with EAs. Based on the obtained results, particularly on the outstanding performance of REPA on a test problem that presents significant difficulty for the other reviewed EAs, we conclude that the proposed method is useful and competitive. We discuss REPA parameter tuning for difficult problems and critically review some of the problems from the de facto standard test problem set for constrained optimization with EAs. In order to demonstrate the practical usefulness of the developed method, we study several problems of accelerator design and demonstrate how they can be solved with EAs. These problems include a simple accelerator design problem (design a quadrupole triplet to be stigmatically imaging, find all possible solutions), a complex real-life accelerator design problem (optimization of the front-end section for a future neutrino factory), and a problem of normal form defect function optimization, which is used to rigorously estimate the stability of beam dynamics in circular accelerators. The positive results we obtained suggest that the application of EAs to problems from accelerator theory can be very beneficial and has large potential.
The developed optimization scenarios and tools can be used to approach similar problems.
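The core loop of a real-coded genetic algorithm of the kind the thesis builds on can be sketched in a few lines. This is a generic illustration (tournament selection, blend crossover, Gaussian mutation) on the Rastrigin function, not the GATool implementation itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def ga(fitness, dim=5, pop_size=60, gens=200, bounds=(-5.12, 5.12),
       mut_sigma=0.3, elite=2):
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    for _ in range(gens):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)
        new_pop = [pop[i].copy() for i in order[:elite]]        # elitism
        while len(new_pop) < pop_size:
            # Tournament selection of two parents.
            i, j = rng.integers(pop_size, size=2)
            p1 = pop[i] if scores[i] < scores[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            p2 = pop[i] if scores[i] < scores[j] else pop[j]
            # Blend crossover followed by Gaussian mutation.
            w = rng.random(dim)
            child = w * p1 + (1 - w) * p2 + rng.normal(0, mut_sigma, dim)
            new_pop.append(np.clip(child, lo, hi))
        pop = np.array(new_pop)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmin(scores)], scores.min()

best_x, best_f = ga(rastrigin)
print("best Rastrigin value:", round(best_f, 4))
```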
The Handbook of Medical Image Perception and Techniques
NASA Astrophysics Data System (ADS)
Samei, Ehsan; Krupinski, Elizabeth
2014-07-01
1. Medical image perception Ehsan Samei and Elizabeth Krupinski; Part I. Historical Reflections and Theoretical Foundations: 2. A short history of image perception in medical radiology Harold Kundel and Calvin Nodine; 3. Spatial vision research without noise Arthur Burgess; 4. Signal detection theory, a brief history Arthur Burgess; 5. Signal detection in radiology Arthur Burgess; 6. Lessons from dinners with the giants of modern image science Robert Wagner; Part II. Science of Image Perception: 7. Perceptual factors in reading medical images Elizabeth Krupinski; 8. Cognitive factors in reading medical images David Manning; 9. Satisfaction of search in traditional radiographic imaging Kevin Berbaum, Edmund Franken, Robert Caldwell and Kevin Schartz; 10. The role of expertise in radiologic image interpretation Calvin Nodine and Claudia Mello-Thoms; 11. A primer of image quality and its perceptual relevance Robert Saunders and Ehsan Samei; 12. Beyond the limitations of human vision Maria Petrou; Part III. Perception Metrology: 13. Logistical issues in designing perception experiments Ehsan Samei and Xiang Li; 14. ROC analysis: basic concepts and practical applications Georgia Tourassi; 15. Multi-reader ROC Steve Hillis; 16. Recent developments in FROC methodology Dev Chakraborty; 17. Observer models as a surrogate to perception experiments Craig Abbey and Miguel Eckstein; 18. Implementation of observer models Matthew Kupinski; Part IV. Decision Support and Computer Aided Detection: 19. CAD: an image perception perspective Maryellen Giger and Weijie Chen; 20. Common designs of CAD studies Yulei Jiang; 21. Perceptual effect of CAD in reading chest images Matthew Freedman and Teresa Osicka; 22. Perceptual issues in mammography and CAD Michael Ulissey; 23. How perceptual factors affect the use and accuracy of CAD for interpretation of CT images Ronald Summers; 24. CAD: risks and benefits for radiologists' decisions Eugenio Alberdi, Andrey Povyakalo, Lorenzo Strigini and Peter Ayton; Part V. Optimization and Practical Issues: 25. Optimization of 2D and 3D radiographic systems Jeff Siewerdson; 26. Applications of AFC methodology in optimization of CT imaging systems Kent Ogden and Walter Huda; 27. Perceptual issues in reading mammograms Margarita Zuley; 28. Perceptual optimization of display processing techniques Richard Van Metter; 29. Optimization of display systems Elizabeth Krupinski and Hans Roehrig; 30. Ergonomic radiologist workplaces in the PACS environment Carl Zylack; Part VI. Epilogue: 31. Future prospects of medical image perception Ehsan Samei and Elizabeth Krupinski; Index.
Advances in optimal routing through computer networks
NASA Technical Reports Server (NTRS)
Paz, I. M.
1977-01-01
The optimal routing problem is defined. Progress in solving the problem during the previous decade is reviewed, with special emphasis on technical developments made during the last few years. The relationships between the routing, the throughput, and the switching technology used are discussed and their future trends are reviewed. Economic aspects are also briefly considered. Modern technical approaches for handling the routing problems and, more generally, the flow control problems are reviewed.
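As a minimal, hedged illustration of the classical formulation underlying such surveys (not a technique attributed to this report), the sketch below computes minimum-delay routes in a small network with Dijkstra's algorithm, where edge weights stand in for link delays.

```python
import heapq

def dijkstra(graph, source):
    """graph: dict node -> list of (neighbor, link_cost). Returns cost-to-node map."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical 5-node network with link delays in milliseconds.
net = {
    "A": [("B", 2.0), ("C", 5.0)],
    "B": [("C", 1.0), ("D", 4.0)],
    "C": [("D", 1.0), ("E", 7.0)],
    "D": [("E", 2.0)],
    "E": [],
}
print(dijkstra(net, "A"))   # {'A': 0.0, 'B': 2.0, 'C': 3.0, 'D': 4.0, 'E': 6.0}
```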
Control of Smart Building Using Advanced SCADA
NASA Astrophysics Data System (ADS)
Samuel, Vivin Thomas
Complete control of a building requires a proper SCADA implementation and an optimization strategy, together with a well-designed channel between the communication protocol and the SCADA system. This paper concentrates mainly on the interface between the communication protocol and the SCADA implementation, from which an optimization and energy-saving scheme is derived for large-scale industrial buildings. The communication channel is used to control the building remotely from a distant location. Temperature values and equipment power ratings are considered, and threshold values are set for implementing the FDD technique while controlling the equipment. A building management system is a vital resource for maintaining any building and for safety purposes. Smart buildings refer to various distinct features, including complete automation systems, office building controls, and data center controls. ELCs are used to communicate the building's load values to a remote server over an Ethernet communication channel. Depending on demand fluctuation and peak voltage, the loads operate differently, increasing the consumption rate and thus the annual consumption bill. Today, saving energy and reducing the consumption bill are essential for the long-term operation of any building. The equipment is monitored regularly and an optimization strategy is implemented for cost reduction in the automation system, resulting in a reduction of the annual cost and an increase in load lifetime.
Basheti, Iman A; Reddel, Helen K; Armour, Carol L; Bosnic-Anticevich, Sinthia Z
2005-05-01
Optimal effects of asthma medications are dependent on correct inhaler technique. In a telephone survey, 77/87 patients reported that their Turbuhaler technique had not been checked by a health care professional. In a subsequent pilot study, 26 patients were randomized to receive one of 3 Turbuhaler counseling techniques, administered in the community pharmacy. Turbuhaler technique was scored before and 2 weeks after counseling (optimal technique = score 9/9). At baseline, 0/26 patients had optimal technique. After 2 weeks, optimal technique was achieved by 0/7 patients receiving standard verbal counseling (A), 2/8 receiving verbal counseling augmented with emphasis on Turbuhaler position during priming (B), and 7/9 receiving augmented verbal counseling plus physical demonstration (C) (Fisher's exact test for A vs C, p = 0.006). Satisfactory technique (4 essential steps correct) also improved (A: 3/8 to 4/7; B: 2/9 to 5/8; and C: 1/9 to 9/9 patients) (A vs C, p = 0.1). Counseling in Turbuhaler use represents an important opportunity for community pharmacists to improve asthma management, but physical demonstration appears to be an important component to effective Turbuhaler training for educating patients toward optimal Turbuhaler technique.
Courses in Modern Physics for Non-science Majors, Future Science Teachers, and Biology Students
NASA Astrophysics Data System (ADS)
Zollman, Dean
2001-03-01
For the past 15 years Kansas State University has offered a course in modern physics for students who are not majoring in physics. This course carries a prerequisite of one physics course so that the students have a basic introduction in classical topics. The majors of students range from liberal arts to engineering. Future secondary science teachers whose first area of teaching is not physics can use the course as part of their study of science. The course has evolved from a lecture format to one which is highly interactive and uses a combination of hands-on activities, tutorials and visualizations, particularly the Visual Quantum Mechanics materials. Another course encourages biology students to continue their physics learning beyond the introductory course. Modern Miracle Medical Machines introduces the basic physics which underlie diagnosis techniques such as MRI and PET and laser surgical techniques. Additional information is available at http://www.phys.ksu.edu/perg/
[Achievements and enlightenment of modern acupuncture therapy for stroke based on the neuroanatomy].
Chen, Li-Fang; Fang, Jian-Qiao; Chen, Lu-Ni; Wang, Chao
2014-04-01
To date, three main representative achievements have been made in the treatment of stroke patients with acupuncture therapy: scalp acupuncture intervention, the "Xing Nao Kai Qiao" (restoring consciousness and inducing resuscitation) acupuncture technique, and nape acupuncture therapy. Regarding their neurobiological mechanisms, scalp acupuncture therapy is based on the functional localization of the cerebral cortex, "Xing Nao Kai Qiao" acupuncture therapy is closely related to nerve trunk stimulation, and nape acupuncture therapy obtains its therapeutic effects from the nerve innervation of the regional neck-nape area. In fact, the effects of these three acupuncture interventions are all closely associated with modern neuroanatomy. In the treatment of post-stroke spastic paralysis, cognitive disorder, and depression with acupuncture therapy, modern neuroanatomical knowledge should be one of the key theoretical bases, and new therapeutic techniques should be explored and developed continuously.
Functional rehabilitation in advanced intraoral cancer.
Barret, Juan P; Roodenburg, Jan L
2017-02-01
Modern treatment of advanced intraoral cancer involves multidisciplinary teams using complicated reconstructive techniques to provide improved survival with optimal rehabilitation. Mastication is an important part of this process, and it can be severely impaired by tumor ablation. Whether flap reconstruction is a determinant factor in dental rehabilitation is still under debate. Thirty-five patients with advanced intraoral cancer were reviewed to determine the dental rehabilitation achieved with different reconstructive techniques. The patients were treated with a multidisciplinary team approach. The patients' demographics, primary treatment, reconstructive surgery, dental rehabilitation, and functional outcome were recorded and analyzed. Nine patients had stage III disease, and 26 patients had stage IV. Thirty-two patients (91.42%) received postoperative radiotherapy. Masticatory and dental functional rehabilitation of patients was very poor. Only 15 patients (42.86%) could eat a normal diet, whereas 18 patients (51.42%) could manage only soft diets, and 2 patients (5.72%) could only be fed with a liquid diet. Denture rehabilitation was even more frustrating and had a direct impact on masticatory rehabilitation. Only 10 patients (28.57%) could use dentures postoperatively, and 40% of patients (14 patients) could not use any denture at all. Of all reconstructive techniques, the free radial forearm flap provides the best functional outcome. Reconstruction of advanced intraoral cancer results in poor denture rehabilitation, especially when bulky flaps are used. If massive resections are necessary, free radial forearm flap reconstruction provides the best functional outcome.
[Ligament-controlled positioning of the knee prosthesis components].
Widmer, K-H; Zich, A
2015-04-01
There are at least two predominant goals in total knee replacement: first, the surgeon aims to achieve an optimal postoperative kinematic motion close to the patient's physiological range, and second, he aims for concurrent high ligament stability to establish pain-free movement for the entire range of motion. A number of prosthetic designs and surgical techniques have been developed in recent years to achieve both of these targets. This study presents another modified surgical procedure for total knee implantation. As in common practice the osteotomies are planned preoperatively, referencing well-defined bony landmarks, but their placement and orientation are also controlled intraoperatively in a stepwise sequence via ligamentous linkages. This method is open to all surgical approaches and can be applied for PCL-conserving or -sacrificing techniques. The anterior femoral osteotomy is carried out first, followed by the distal femoral osteotomy. Then, the extension gap is finalized by tensioning the ligaments and "top-down" referencing at the level of the tibial osteotomy, followed by finishing the flexion gap in the same way, except that the osteotomy of the posterior condyles is referenced in a "bottom-up" fashion. Hence, this technique relies on both bony and ligament-controlled procedures. Thus, it respects the modified ligamentous framework and drives the prosthetic components into the new ligamentous envelope. Further improvement may be achieved by additional control of the kinematics during surgery by applying modern computer navigation technology.
Dynamic Visualization of Co-expression in Systems Genetics Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
New, Joshua Ryan; Huang, Jian; Chesler, Elissa J
2008-01-01
Biologists hope to address grand scientific challenges by exploring the abundance of data made available through modern microarray technology and other high-throughput techniques. The impact of this data, however, is limited unless researchers can effectively assimilate such complex information and integrate it into their daily research; interactive visualization tools are called for to support the effort. Specifically, typical studies of gene co-expression require novel visualization tools that enable the dynamic formulation and fine-tuning of hypotheses to aid the process of evaluating sensitivity of key parameters. These tools should allow biologists to develop an intuitive understanding of the structure of biological networks and discover genes which reside in critical positions in networks and pathways. By using a graph as a universal data representation of correlation in gene expression data, our novel visualization tool employs several techniques that when used in an integrated manner provide innovative analytical capabilities. Our tool for interacting with gene co-expression data integrates techniques such as: graph layout, qualitative subgraph extraction through a novel 2D user interface, quantitative subgraph extraction using graph-theoretic algorithms or by querying an optimized b-tree, dynamic level-of-detail graph abstraction, and template-based fuzzy classification using neural networks. We demonstrate our system using a real-world workflow from a large-scale, systems genetics study of mammalian gene co-expression.
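A hedged sketch of the underlying data representation (synthetic expression matrix, not the study's data): genes become nodes, and an edge is added whenever the absolute correlation of two expression profiles exceeds a threshold, after which highly connected genes can be inspected.

```python
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples = 50, 30
expr = rng.normal(size=(n_genes, n_samples))                 # synthetic expression matrix
expr[1] = expr[0] + rng.normal(scale=0.2, size=n_samples)    # make genes 0 and 1 co-express

corr = np.corrcoef(expr)                                     # gene-by-gene correlation matrix
threshold = 0.8
adj = (np.abs(corr) >= threshold) & ~np.eye(n_genes, dtype=bool)   # graph adjacency

degree = adj.sum(axis=1)
hubs = np.argsort(degree)[::-1][:5]
print("most connected genes:", hubs, "degrees:", degree[hubs])
```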
Performance of Grey Wolf Optimizer on large scale problems
NASA Astrophysics Data System (ADS)
Gupta, Shubham; Deep, Kusum
2017-01-01
Numerous nature-inspired optimization techniques have been proposed in the literature for solving nonlinear continuous optimization problems, and they can be applied to real-life problems where conventional techniques cannot. The Grey Wolf Optimizer is one such technique and has been gaining popularity over the last two years. The objective of this paper is to investigate the performance of the Grey Wolf Optimization Algorithm on large-scale optimization problems. The algorithm is implemented on 5 common scalable problems from the literature, namely the Sphere, Rosenbrock, Rastrigin, Ackley, and Griewank functions. The dimensions of these problems are varied from 50 to 1000. The results indicate that the Grey Wolf Optimizer is a powerful nature-inspired optimization algorithm for large-scale problems, with the exception of Rosenbrock, which is a unimodal function.
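A compact sketch of the Grey Wolf Optimizer's update rule on the Sphere function is given below; hyperparameters and dimensionality are illustrative, and this is a simplified rendering of the published algorithm rather than the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return np.sum(x**2)

def gwo(fitness, dim=50, n_wolves=30, iters=500, lb=-100.0, ub=100.0):
    wolves = rng.uniform(lb, ub, size=(n_wolves, dim))
    for t in range(iters):
        scores = np.array([fitness(w) for w in wolves])
        order = np.argsort(scores)
        alpha, beta, delta = (wolves[order[k]].copy() for k in range(3))
        a = 2.0 - 2.0 * t / iters                    # control parameter decreases 2 -> 0
        for i in range(n_wolves):
            x_new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a                   # exploration/exploitation coefficient
                C = 2 * r2
                D = np.abs(C * leader - wolves[i])
                x_new += leader - A * D              # candidate pulled toward this leader
            wolves[i] = np.clip(x_new / 3.0, lb, ub)
    scores = np.array([fitness(w) for w in wolves])
    return wolves[np.argmin(scores)], scores.min()

best, best_val = gwo(sphere)
print("best Sphere value:", best_val)
```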
Modern developments for ground-based monitoring of fire behavior and effects
Colin C. Hardy; Robert Kremens; Matthew B. Dickinson
2010-01-01
Advances in electronic technology over the last several decades have been staggering. The cost of electronics continues to decrease while system performance increases seemingly without limit. We have applied modern techniques in sensors, electronics and instrumentation to create a suite of ground based diagnostics that can be used in laboratory (~ 1 m2), field scale...
ERIC Educational Resources Information Center
Lozano-Parada, Jaime H.; Burnham, Helen; Martinez, Fiderman Machuca
2018-01-01
A classical nonlinear system, the "Brusselator", was used to illustrate the modeling and simulation of oscillating chemical systems using stability analysis techniques with modern software tools such as Comsol Multiphysics, Matlab, and Excel. A systematic approach is proposed in order to establish a regime of parametric conditions that…
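To show what such a simulation looks like in practice, here is a hedged, self-contained sketch of the Brusselator rate equations integrated with SciPy; the parameter values are generic choices in the oscillatory regime, not the specific Comsol/Matlab/Excel workflow of the article.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Brusselator: dx/dt = A + x^2*y - (B + 1)*x,  dy/dt = B*x - x^2*y.
# The steady state (x, y) = (A, B/A) loses stability (oscillations) when B > 1 + A^2.
A, B = 1.0, 3.0

def brusselator(t, z):
    x, y = z
    return [A + x**2 * y - (B + 1) * x,
            B * x - x**2 * y]

sol = solve_ivp(brusselator, (0.0, 50.0), [1.0, 1.0], max_step=0.05)
x = sol.y[0]
tail = x[len(x) // 2:]                    # discard the transient
print("x oscillates between about", round(tail.min(), 2), "and", round(tail.max(), 2))
```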
Do Vascular Networks Branch Optimally or Randomly across Spatial Scales?
Newberry, Mitchell G.; Savage, Van M.
2016-01-01
Modern models that derive allometric relationships between metabolic rate and body mass are based on the architectural design of the cardiovascular system and presume sibling vessels are symmetric in terms of radius, length, flow rate, and pressure. Here, we study the cardiovascular structure of the human head and torso and of a mouse lung based on three-dimensional images processed via our software Angicart. In contrast to modern allometric theories, we find systematic patterns of asymmetry in vascular branching, potentially explaining previously documented mismatches between predictions (power-law or concave curvature) and observed empirical data (convex curvature) for the allometric scaling of metabolic rate. To examine why these systematic asymmetries in vascular branching might arise, we construct a mathematical framework to derive predictions based on local, junction-level optimality principles that have been proposed to be favored in the course of natural selection and development. The two most commonly used principles are material-cost optimizations (construction materials or blood volume) and optimization of efficient flow via minimization of power loss. We show that material-cost optimization solutions match with distributions for asymmetric branching across the whole network but do not match well for individual junctions. Consequently, we also explore random branching that is constrained at scales that range from local (junction-level) to global (whole network). We find that material-cost optimizations are the strongest predictor of vascular branching in the human head and torso, whereas locally or intermediately constrained random branching is comparable to material-cost optimizations for the mouse lung. These differences could be attributable to developmentally-programmed local branching for larger vessels and constrained random branching for smaller vessels. PMID:27902691
Elmiger, Marco P; Poetzsch, Michael; Steuer, Andrea E; Kraemer, Thomas
2018-03-06
High-resolution mass spectrometry and modern data independent acquisition (DIA) methods enable the creation of general unknown screening (GUS) procedures. However, even when DIA is used, its potential is far from being exploited, because often the untargeted acquisition is followed by a targeted search. Applying an actual GUS (including untargeted screening) produces an immense amount of data that must be dealt with. An optimization of the parameters regulating the feature detection and hit generation algorithms of the data processing software could significantly reduce the amount of unnecessary data and thereby the workload. Design of experiment (DoE) approaches allow a simultaneous optimization of multiple parameters. In a first step, parameters are evaluated (crucial or noncrucial). Second, crucial parameters are optimized. The aim in this study was to reduce the number of hits without missing analytes. The parameter settings obtained from the optimization were compared to the standard settings by analyzing a test set of blood samples spiked with 22 relevant analytes as well as 62 authentic forensic cases. The optimization led to a marked reduction of workload (12.3 to 1.1% and 3.8 to 1.1% hits for the test set and the authentic cases, respectively) while simultaneously increasing the identification rate (68.2 to 86.4% and 68.8 to 88.1%, respectively). This proof of concept study emphasizes the great potential of DoE approaches to master the data overload resulting from modern data independent acquisition methods used for general unknown screening procedures by optimizing software parameters.
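The two-step DoE logic (screen which parameters matter, then optimize the crucial ones) can be illustrated with a generic two-level full factorial screen; the parameter names, levels, and scoring function below are hypothetical stand-ins, not the software settings used in the study.

```python
from itertools import product

# Hypothetical feature-detection parameters and their low/high screening levels.
levels = {
    "mass_tolerance_ppm": (2.0, 10.0),
    "min_peak_intensity": (1e3, 1e5),
    "rt_window_s": (5.0, 30.0),
}

def score(settings):
    """Stand-in response: pretend tighter settings produce fewer spurious hits."""
    return (settings["mass_tolerance_ppm"] * 120
            + 1e6 / settings["min_peak_intensity"]
            + settings["rt_window_s"] * 8)

runs = []
for combo in product(*levels.values()):
    settings = dict(zip(levels.keys(), combo))
    runs.append((settings, score(settings)))

# Main effect of each factor: mean response at high level minus mean at low level.
for name, (lo, hi) in levels.items():
    hi_mean = sum(s for cfg, s in runs if cfg[name] == hi) / (len(runs) / 2)
    lo_mean = sum(s for cfg, s in runs if cfg[name] == lo) / (len(runs) / 2)
    print(f"{name}: main effect = {hi_mean - lo_mean:+.1f}")
```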
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the latter for the demonstration cases and target HPC systems employed.
Genetic algorithm optimization of a film cooling array on a modern turbine inlet vane
NASA Astrophysics Data System (ADS)
Johnson, Jamie J.
In response to the need for more advanced gas turbine cooling design methods that factor in the 3-D flowfield and heat transfer characteristics, this study involves the computational optimization of a pressure side film cooling array on a modern turbine inlet vane. Latin hypercube sampling, genetic algorithm reproduction, and Reynolds-Averaged Navier-Stokes (RANS) computational fluid dynamics (CFD) as an evaluation step are used to assess a total of 1,800 film cooling designs over 13 generations. The process was efficient due to the Leo CFD code's ability to estimate cooling mass flux at surface grid cells using a transpiration boundary condition, eliminating the need for remeshing between designs. The optimization resulted in a unique cooling design relative to the baseline with new injection angles, compound angles, cooling row patterns, hole sizes, a redistribution of cooling holes away from the over-cooled midspan to hot areas near the shroud, and a lower maximum surface temperature. To experimentally confirm relative design trends between the optimized and baseline designs, flat plate infrared thermography assessments were carried out at design flow conditions. Use of flat plate experiments to model vane pressure side cooling was justified through a conjugate heat transfer CFD comparison of the 3-D vane and flat plate, which showed similar cooling performance trends at multiple span locations. The optimized flat plate model exhibited lower minimum surface temperatures at multiple span locations compared to the baseline. Overall, this work shows promise of optimizing film cooling to reduce design cycle time and save cooling mass flow in a gas turbine.
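For reference, Latin hypercube sampling of a cooling-design parameter space can be set up in a few lines with SciPy's QMC module; the parameter names and ranges here are invented placeholders, not the vane design variables of the study.

```python
from scipy.stats import qmc

# Hypothetical design variables: injection angle (deg), compound angle (deg),
# hole diameter (mm), and row spacing (in hole diameters).
l_bounds = [20.0,  0.0, 0.3, 3.0]
u_bounds = [50.0, 60.0, 0.8, 8.0]

sampler = qmc.LatinHypercube(d=4, seed=7)
unit_sample = sampler.random(n=10)                 # 10 designs in the unit hypercube
designs = qmc.scale(unit_sample, l_bounds, u_bounds)

for k, d in enumerate(designs):
    print(f"design {k}: angle={d[0]:5.1f}  compound={d[1]:5.1f}  "
          f"dia={d[2]:.2f}  spacing={d[3]:.1f}")
```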
The analytical representation of viscoelastic material properties using optimization techniques
NASA Technical Reports Server (NTRS)
Hill, S. A.
1993-01-01
This report presents a technique to model viscoelastic material properties with a function of the form of the Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be determined analytically through optimization techniques. This technique is employed in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was utilized to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and to use data sets that have uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
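A hedged sketch of this kind of fit, with both the exponential time constants and the coefficients left free for the optimizer, is shown below; the relaxation data are synthetic, not the propellant or Viton data from the report.

```python
import numpy as np
from scipy.optimize import least_squares

def prony(t, params):
    """Relaxation modulus E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    e_inf = params[0]
    e_i = params[1::2]
    tau_i = params[2::2]
    return e_inf + np.sum(e_i[:, None] * np.exp(-t[None, :] / tau_i[:, None]), axis=0)

# Synthetic "measured" data generated from a known 2-term series plus noise.
t = np.logspace(-2, 3, 60)
true = np.array([1.0, 3.0, 0.1, 2.0, 10.0])          # E_inf, E1, tau1, E2, tau2
rng = np.random.default_rng(3)
data = prony(t, true) * (1 + 0.01 * rng.normal(size=t.size))

# Fit all constants simultaneously; positivity is enforced through bounds.
x0 = np.array([0.5, 1.0, 1.0, 1.0, 100.0])
fit = least_squares(lambda p: prony(t, p) - data, x0, bounds=(1e-6, np.inf))
print("fitted parameters:", np.round(fit.x, 3))
```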
A Course in Heterogeneous Catalysis: Principles, Practice, and Modern Experimental Techniques.
ERIC Educational Resources Information Center
Wolf, Eduardo E.
1981-01-01
Outlines a multidisciplinary course which comprises fundamental, practical, and experimental aspects of heterogeneous catalysis. The course structure is a combination of lectures and demonstrations dealing with the use of spectroscopic techniques for surface analysis. (SK)
Conservation and Preservation of Archives.
ERIC Educational Resources Information Center
Kathpalia, Y. P.
1982-01-01
Presents concept of preventive conservation of archival records as a new science resulting from the use of modern techniques and chemicals. Various techniques for storage, proper environment, preventive de-acidification, fire prevention, restoration, and staff considerations are described. References are provided. (EJS)
Arroyo, Pedro; Pardío-López, Jeanette; Loria, Alvar; Fernández-García, Victoria
2010-01-01
The objective of this article is to provide information on cooking techniques used by two rural communities of Yucatán. We used a 24-hour recall method with 275 participants consuming 763 dishes. Dishes were classified according to cooking technique: 205 were lard-fried (27%), 169 oil-fried (22%), and 389 boiled/grilled (51%). The smaller more secluded community (San Rafael) consumed more fried dishes than the larger community (Uci) (54% versus 45%) and used more lard-frying than Uci (65% versus 46%). The more extensive use of lard in the smaller community appears to be due to fewer modernizing influences such as the availability and use of industrialized vegetable oils. Copyright © Taylor & Francis Group, LLC
Power Systems Operations and Controls | Grid Modernization | NREL
A Course for All Students: Foundations of Modern Engineering
ERIC Educational Resources Information Center
Best, Charles L.
1971-01-01
Describes a course for non-engineering students at Lafayette College which includes the design process in a project. Also included are the study of modeling, optimization, simulation, computer application, and simple feedback controls. (Author/TS)
DOT National Transportation Integrated Search
2012-07-01
Supplementary cementitious materials (SCM) have become common parts of modern concrete practice. The blending of two or three : cementitious materials to optimize durability, strength, or economics provides owners, engineers, materials suppliers, and...
NASA Astrophysics Data System (ADS)
Prokhorov, Sergey
2017-10-01
The building industry is currently going through hard times. The cost of operating machines and mechanisms in construction and installation work accounts for a substantial part of total building construction expenses. There is a need for a highly efficient method that not only increases production but also reduces the direct costs of operating the machine fleet and increases its energy efficiency. To achieve this goal, we plan to use modern methods of work organization, high-tech and energy-saving machine tools and technologies, and optimal mechanization sets. The optimization criteria are the operating prime cost and the efficiency of the set. In solving the actual task, we conclude that analyzing mechanized work and energy audits together with production data, prime costs, and energy resource costs allows a complex machine fleet to be assembled, the ecological level to be improved, and the quality of construction and installation work to be increased.
Singular Optimal Controls of Rocket Motion (Survey)
NASA Astrophysics Data System (ADS)
Kiforenko, B. N.
2017-05-01
A survey of the current state of the art and a discussion of open problems in improving methods for investigating variational problems, with a focus on the mechanics of space flight, are presented. The main attention is paid to improving methods for solving variational problems of rocket motion in gravitational fields, including rocket motion in the atmosphere. These problems are directly connected with the perennial practical problem in astronautics of increasing the payload that carrier rockets place into circumplanetary orbits. An analysis is given of modern approaches to controlling the motion of rockets and spacecraft on trajectories with singular arcs, which are optimal for the motion of a variable-mass body in a resisting medium. The results presented for some maneuvers can serve as an information source for decision making in the design of promising rocket and space technology.
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
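The accuracy check the report describes, comparing semi-analytical sensitivities against finite differences, follows a simple pattern; the function below is a toy stand-in for an aerodynamic response, not the parabolized Navier-Stokes solver.

```python
import numpy as np

def response(x):
    """Toy stand-in for an aerodynamic objective, e.g. a drag-like quantity."""
    return np.sin(x[0]) * x[1] ** 2 + 0.5 * x[0] * x[1]

def analytic_grad(x):
    """Hand-derived (semi-analytical) sensitivities of the toy response."""
    return np.array([np.cos(x[0]) * x[1] ** 2 + 0.5 * x[1],
                     2.0 * np.sin(x[0]) * x[1] + 0.5 * x[0]])

def finite_diff_grad(f, x, h=1e-6):
    g = np.zeros_like(x)
    for i in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2 * h)          # central differences
    return g

x0 = np.array([0.7, 1.3])
print("analytic:   ", analytic_grad(x0))
print("finite diff:", finite_diff_grad(response, x0))
```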
Optimization of freeform surfaces using intelligent deformation techniques for LED applications
NASA Astrophysics Data System (ADS)
Isaac, Annie Shalom; Neumann, Cornelius
2018-04-01
For many years, optical designers have had great interest in designing efficient optimization algorithms that bring significant improvement to their initial designs. However, the optimization is limited by the large number of parameters present in Non-Uniform Rational B-Spline (NURBS) surfaces. This limitation was overcome by an indirect technique known as optimization using free-form deformation (FFD). In this approach, the optical surface is placed inside a cubical grid. The vertices of this grid are modified, which deforms the underlying optical surface during the optimization. One of the challenges in this technique is the selection of appropriate vertices of the cubical grid, because these vertices share no relationship with the optical performance. When irrelevant vertices are selected, the computational complexity increases. Moreover, the surfaces created by them are not always feasible to manufacture, which is the same problem faced in any optimization technique while creating freeform surfaces. Therefore, this research addresses these two important issues and provides feasible design techniques to solve them. Finally, the proposed techniques are validated using two different illumination examples: a street lighting lens and a stop lamp for automobiles.
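The basic free-form deformation machinery is standard and can be sketched directly: points embedded in a control lattice are re-expressed in Bernstein form, so that moving lattice vertices smoothly deforms the embedded geometry. The lattice size and the specific vertex displacement below are arbitrary illustrations, not the optimization variables of the paper.

```python
import numpy as np
from math import comb

def bernstein(n, i, s):
    return comb(n, i) * (s ** i) * ((1 - s) ** (n - i))

def ffd(points, lattice):
    """Deform points given in local lattice coordinates in [0,1]^3 by a control
    lattice of shape (l+1, m+1, n+1, 3)."""
    l, m, n = (d - 1 for d in lattice.shape[:3])
    out = np.zeros_like(points)
    for p_idx, (s, t, u) in enumerate(points):
        x = np.zeros(3)
        for i in range(l + 1):
            for j in range(m + 1):
                for k in range(n + 1):
                    w = bernstein(l, i, s) * bernstein(m, j, t) * bernstein(n, k, u)
                    x += w * lattice[i, j, k]
        out[p_idx] = x
    return out

# A 2x2x2 control lattice with vertices at the corners of the unit cube.
gx, gy, gz = np.meshgrid([0.0, 1.0], [0.0, 1.0], [0.0, 1.0], indexing="ij")
lattice = np.stack([gx, gy, gz], axis=-1)                      # shape (2, 2, 2, 3)

surface_pts = np.array([[0.5, 0.5, 0.5], [0.25, 0.75, 0.5]])   # points on an "optical surface"
print("before:", ffd(surface_pts, lattice))

lattice[1, 1, 1] += np.array([0.0, 0.0, 0.3])                  # pull one vertex upward
print("after: ", ffd(surface_pts, lattice))
```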
Cheng, Xu-Dong; Feng, Liang; Gu, Jun-Fei; Zhang, Ming-Hua; Jia, Xiao-Bin
2014-11-01
Chinese medicine prescriptions are the outcome of clinical treatment decisions in traditional Chinese medicine (TCM), based on the differentiation of symptoms and signs. They are also the basis for the secondary development of TCM. Studying prescriptions helps to clarify the material basis of their efficacy and their pharmacological mechanisms, which is an important guarantee for the modernization of traditional Chinese medicine. Currently, there is no systematic account of the methods and technology for basic research on Chinese medicine prescriptions. This paper focuses on how to build an effective technology system for prescription research. Based on the "component structure" theory, a four-step technology system of "prescription analysis, screening of the material basis, analysis of the material basis, and optimization and verification" is proposed. The technology system analyzes the material basis at three levels, namely Chinese medicine pieces, constituents, and compounds, which reflects the overall efficacy of Chinese medicine. Ideas of prescription optimization and remodeling are introduced into the system. The technology system combines existing research with new techniques and methods and is used to explore research approaches suitable for material basis research and prescription remodeling. The system provides a reference for the secondary development of traditional Chinese medicine and for industrial upgrading.
High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems
NASA Technical Reports Server (NTRS)
Kolano, Paul Z.; Ciotti, Robert B.
2012-01-01
Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to utilize multiple concurrent readers/writers from multiple systems to overcome various single-system limitations, such as number of processors and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, utilize a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that utilize multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading is used to ensure that nodes are kept as busy as possible. Read/write parallelism allows individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be performed in parallel. Mcp and msum provide significant performance improvements over standard cp and md5sum using multiple types of parallelism and other optimizations. The total speed-ups from all improvements are significant. Mcp improves cp performance over 27x, msum improves md5sum performance almost 19x, and the combination of mcp and msum improves verified copies via cp and md5sum by almost 22x. These improvements come in the form of drop-in replacements for cp and md5sum, so are easily used and are available for download as open source software at http://mutil.sourceforge.net.
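To illustrate the split-file idea in a hedged way (this is not the mcp/msum implementation), the sketch below hashes fixed-size chunks of a file in a thread pool and then combines the chunk digests into a single tree-style digest; note that this combined digest intentionally differs from a plain md5sum of the whole file.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 4 * 1024 * 1024          # 4 MiB chunks

def hash_chunk(path, offset, size):
    h = hashlib.md5()
    with open(path, "rb") as f:
        f.seek(offset)
        h.update(f.read(size))
    return offset, h.hexdigest()

def tree_checksum(path, file_size, workers=8):
    offsets = range(0, file_size, CHUNK)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(hash_chunk, path, off, CHUNK) for off in offsets]
        digests = [f.result() for f in futures]
    digests.sort()                                # restore file order by offset
    root = hashlib.md5()
    for _, d in digests:
        root.update(d.encode())                   # combine chunk hashes into a root hash
    return root.hexdigest()

if __name__ == "__main__":
    import os, sys
    target = sys.argv[1]
    print(tree_checksum(target, os.path.getsize(target)))
```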
Hernandez, Wilmar
2007-01-01
In this paper, a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art in applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be made overnight, because some open research issues remain to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
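As a hedged, generic contrast between a classical filter and an optimal one (not the specific automotive sensors or filters evaluated in the survey), the sketch below smooths a noisy constant-velocity position signal with a moving average and with a two-state Kalman filter.

```python
import numpy as np

rng = np.random.default_rng(5)
dt, n = 0.01, 500
t = np.arange(n) * dt
true_pos = 2.0 * t                               # constant-velocity motion
meas = true_pos + rng.normal(scale=0.5, size=n)  # noisy sensor readings

# Classical approach: moving-average (FIR low-pass) filter.
win = 25
moving_avg = np.convolve(meas, np.ones(win) / win, mode="same")

# Optimal approach: Kalman filter with state [position, velocity].
F = np.array([[1.0, dt], [0.0, 1.0]])            # state transition
H = np.array([[1.0, 0.0]])                       # we only measure position
Q = np.array([[1e-5, 0.0], [0.0, 1e-3]])         # process noise covariance
R = np.array([[0.25]])                           # measurement noise variance
x = np.zeros((2, 1))
P = np.eye(2)
kalman = np.zeros(n)
for k in range(n):
    x = F @ x                                    # predict
    P = F @ P @ F.T + Q
    y = meas[k] - (H @ x)[0, 0]                  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T / S[0, 0]                        # Kalman gain
    x = x + K * y                                # update
    P = (np.eye(2) - K @ H) @ P
    kalman[k] = x[0, 0]

print("moving-average RMSE:", np.sqrt(np.mean((moving_avg - true_pos) ** 2)))
print("Kalman RMSE:        ", np.sqrt(np.mean((kalman - true_pos) ** 2)))
```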
Are we fully utilizing the functionalities of modern operating room ventilators?
Liu, Shujie; Kacmarek, Robert M; Oto, Jun
2017-12-01
The modern operating room ventilators have become very sophisticated and many of their features are comparable with those of an ICU ventilator. To fully utilize the functionality of modern operating room ventilators, it is important for clinicians to understand in depth the working principle of these ventilators and their functionalities. Piston ventilators have the advantages of delivering accurate tidal volume and certain flow compensation functions. Turbine ventilators have great ability of flow compensation. Ventilation modes are mainly volume-based or pressure-based. Pressure-based ventilation modes provide better leak compensation than volume-based. The integration of advanced flow generation systems and ventilation modes of the modern operating room ventilators enables clinicians to provide both invasive and noninvasive ventilation in perioperative settings. Ventilator waveforms can be used for intraoperative neuromonitoring during cervical spine surgery. The increase in number of new features of modern operating room ventilators clearly creates the opportunity for clinicians to optimize ventilatory care. However, improving the quality of ventilator care relies on a complete understanding and correct use of these new features. VIDEO ABSTRACT: http://links.lww.com/COAN/A47.
How to mathematically optimize drug regimens using optimal control.
Moore, Helen
2018-02-01
This article gives an overview of a technique called optimal control, which is used to optimize real-world quantities represented by mathematical models. I include background information about the historical development of the technique and applications in a variety of fields. The main focus here is the application to diseases and therapies, particularly the optimization of combination therapies, and I highlight several such examples. I also describe the basic theory of optimal control, and illustrate each of the steps with an example that optimizes the doses in a combination regimen for leukemia. References are provided for more complex cases. The article is aimed at modelers working in drug development, who have not used optimal control previously. My goal is to make this technique more accessible in the biopharma community.
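The generic structure of such a problem can be summarized compactly; the following is a standard textbook formulation (Pontryagin-style necessary conditions) given for orientation, not the specific leukemia model used in the article.

```latex
% Choose a dosing schedule u(t) to minimize a cost over the treatment horizon [0, T]:
\min_{u(\cdot)} \; J[u] = \Phi\big(x(T)\big) + \int_{0}^{T} L\big(x(t), u(t)\big)\, dt ,
\qquad \dot{x}(t) = f\big(x(t), u(t)\big), \quad x(0) = x_0 .

% Necessary conditions follow from the Hamiltonian
H(x, u, \lambda) = L(x, u) + \lambda^{\mathsf{T}} f(x, u) :
\qquad
\dot{\lambda} = -\frac{\partial H}{\partial x}, \quad
\lambda(T) = \left.\frac{\partial \Phi}{\partial x}\right|_{t=T}, \quad
u^{*}(t) = \arg\min_{u} H\big(x(t), u, \lambda(t)\big) .
```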
An integrated study of earth resources in the state of California using remote sensing techniques
NASA Technical Reports Server (NTRS)
1973-01-01
University of California investigations to determine the usefulness of modern remote sensing techniques have concentrated on the water resources of the state. The studies consider in detail the supply, demand, and impact relationships.
Optimization techniques applied to passive measures for in-orbit spacecraft survivability
NASA Technical Reports Server (NTRS)
Mog, Robert A.; Price, D. Marvin
1987-01-01
"Optimization Techniques Applied to Passive Measures for In-Orbit Spacecraft Survivability" is a six-month study designed to evaluate the effectiveness of the geometric programming (GP) optimization technique in determining the optimal design of a meteoroid and space debris protection system for the Space Station Core Module configuration. Geometric programming was found to be superior to other methods in that it provided maximum protection from impact problems at the lowest weight and cost.
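Geometric programming itself is easy to demonstrate with a toy model. The sketch below, using CVXPY's geometric-programming mode, minimizes a hypothetical shield areal mass subject to a made-up posynomial penetration constraint; the model and coefficients are purely illustrative and are not drawn from the study.

```python
import cvxpy as cp

# Design variables: bumper thickness t_b, rear wall thickness t_w, standoff s (all positive).
t_b = cp.Variable(pos=True)
t_w = cp.Variable(pos=True)
s = cp.Variable(pos=True)

# Hypothetical areal mass to minimize (a posynomial in the design variables).
mass = 2.7 * t_b + 2.8 * t_w + 0.05 * s

constraints = [
    # Made-up ballistic-limit requirement: larger thicknesses / standoff reduce the
    # left-hand side, which must not exceed 1 (monomial <= 1 form).
    0.4 * t_b**-0.5 * t_w**-1.0 * s**-0.3 <= 1,
    s <= 30.0,            # geometric envelope limit
    t_b >= 0.05,
    t_w >= 0.05,
]

prob = cp.Problem(cp.Minimize(mass), constraints)
prob.solve(gp=True)       # solve in geometric-programming (log-log convex) mode
print("t_b =", round(t_b.value, 3), "t_w =", round(t_w.value, 3), "s =", round(s.value, 2))
print("areal mass =", round(prob.value, 3))
```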
Fitting Prony Series To Data On Viscoelastic Materials
NASA Technical Reports Server (NTRS)
Hill, S. A.
1995-01-01
An improved method of fitting Prony series to data on viscoelastic materials involves the use of least-squares optimization techniques. Because it is based on optimization techniques, it yields closer correlation with data than the traditional method. It involves no assumptions regarding the γ_i's and higher-order terms, and provides for as many Prony terms as needed to represent higher-order subtleties in the data. The curve-fitting problem is treated as a design-optimization problem and solved by use of partially constrained optimization techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ranallo, F; Szczykutowicz, T
2014-06-01
Purpose: To provide correct guidance in the proper selection of pitch and rotation time for optimal CT imaging with multi-slice scanners. Methods: There exists a widespread misconception concerning the role of pitch in patient dose with modern multi-slice scanners, particularly with the use of mA modulation techniques. We investigated the relationship of pitch and rotation time to image quality, dose, and scan duration with CT scanners from different manufacturers in a way that clarifies this misconception. The source of this misconception may lie in the role of pitch in single-slice CT scanners. Results: We found that image noise and dose are generally independent of the selected effective mAs (mA*time/pitch) with manual mA technique settings and are generally independent of the selected pitch and/or rotation time with automatic mA modulation techniques. However, we did find that on certain scanners the use of a pitch just above 0.5 provided images of equal image noise at a lower dose compared with the use of a pitch just below 1.0. Conclusion: The misconception that the use of a lower pitch over-irradiates patients by wasting dose is clearly false. The use of a lower pitch provides images of equal or better image quality at the same patient dose, whether using manual mA or automatic mA modulation techniques. By decreasing the pitch and the rotation time by equal amounts, both helical and patient motion artifacts can be reduced without affecting the exam time. The use of lower helical pitch also allows better scanning of larger patients by allowing a greater effective mAs, if the exam time can be extended. The one caution with the use of low pitch is not related to patient dose, but to the length of the scan time if the rotation time is not set short enough. Partial research funding from GE HealthCare.
[Antiangiogenic agents in ARMD treatment].
Coroi, Mihaela-Cristiana; Demea, Sorina; Todor, Meda; Apopei, Emmanuela
2012-01-01
The aim of antiangiogenic agents in the treatment of age-related macular degeneration is to destroy choroidal neovascularization while minimally affecting central vision. We present a case of substantial central vision recovery after 3 intravitreal injections of Avastin. The therapeutic decision and patient monitoring were based on imaging studies, such as OCT and AFG. A modern therapeutic approach to neovascular forms of age-related macular degeneration, backed up by AFG and OCT, is a treatment method for this disabling illness that brings patients optimal functional and structural improvement.
The iLappSurgery taTME app: a modern adjunct to the teaching of surgical techniques.
Atallah, S; Brady, R R W
2016-09-01
Application-based technology has emerged as a method of modern information communication, and this has been applied towards surgical training and education. It allows surgeons the ability to obtain portable and instant access to information that is otherwise difficult to deliver. The iLappSurgery Foundation has recently launched the transanal total mesorectal excision educational application (taTME app) which provides a useful adjunct, especially for surgeons interested in mastery of the taTME technique and its principles. The article provides a detailed review of the application, which has achieved a large user-base since its debut in June, 2016.
Advances in Patellofemoral Arthroplasty.
Strickland, Sabrina M; Bird, Mackenzie L; Christ, Alexander B
2018-06-01
To describe current indications, implants, economic benefits, comparison to TKA, and functional and patient-reported outcomes of patellofemoral arthroplasty. Modern onlay implants and improved patient selection have allowed for recent improvements in short- and long-term outcomes after patellofemoral joint replacement surgery. Patellofemoral arthroplasty has become an increasingly utilized technique for the successful treatment of isolated patellofemoral arthritis. Advances in patient selection, implant design, and surgical technique have resulted in improved performance and longevity of these implants. Although short- and mid-term data for modern patellofemoral arthroplasties appear promising, further long-term clinical studies are needed to evaluate how new designs and technologies will affect patient outcomes and long-term implant performance.
Insecticide ADME for support of early-phase discovery: combining classical and modern techniques.
David, Michael D
2017-04-01
The two factors that determine an insecticide's potency are its binding to a target site (intrinsic activity) and the ability of its active form to reach the target site (bioavailability). Bioavailability is dictated by the compound's stability and transport kinetics, which are determined by both physical and biochemical characteristics. At BASF Global Insecticide Research, we characterize bioavailability in early research with an ADME (Absorption, Distribution, Metabolism and Excretion) approach, combining classical and modern techniques. For biochemical assessment of metabolism, we purify native insect enzymes using classical techniques, and recombinantly express individual insect enzymes that are known to be relevant in insecticide metabolism and resistance. For analytical characterization of an experimental insecticide and its metabolites, we conduct classical radiotracer translocation studies when a radiolabel is available. In discovery, where typically no radiolabel has been synthesized, we utilize modern high-resolution mass spectrometry to probe complex systems for the test compound and its metabolites. By using these combined approaches, we can rapidly compare the ADME properties of sets of new experimental insecticides and aid in the design of structures with an improved potential to advance in the research pipeline. © 2016 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Drewry, D.; Berny-Mier y Teran, J. C.; Dutta, D.; Gepts, P.
2017-12-01
Hyperspectral sensing in the visible through shortwave infrared (VSWIR) portion of the spectrum has been demonstrated to provide significant information on the structural and functional properties of vegetation, resulting in powerful techniques to discern species differences, characterize crop nutrient or water stress, and quantify the density of foliage in agricultural fields. Modern machine-learning techniques allow the entire set of spectral bands, on the order of hundreds with modern field and airborne spectrometers, to be used to develop models that can simultaneously retrieve a variety of foliar chemical compounds and hydrological and structural states. The application of these techniques, in the context of leaf-level measurements of VSWIR reflectance or more complicated airborne remote surveys, has the potential to revolutionize high-throughput methods to phenotype germplasm that optimizes yield, resource-use efficiencies, or alternate objectives related to disease resistance or biomass accumulation, for example. Here we focus on breeding trials for a set of warm-season legumes, conducted in both greenhouse and field settings, and spanning a set of diverse genotypes providing a range of adaptation to drought and yield potential in the context of cultivation in a semi-arid climate. At the leaf level, a large set of spectral reflectance measurements spanning 400-2500 nanometers were made for plants across various growth stages in field experiments that induced severe drought, along with sampling for relevant trait values. We will discuss the development and performance of algorithms for a range of leaf traits related to gas exchange, leaf structure, hydrological status, nutrient contents and stable isotope discrimination, along with their relationships to drought resistance and yield. We likewise discuss the effectiveness of quantifying relevant foliar and canopy traits through airborne imaging spectroscopy from small unmanned aerial vehicles (sUAVs), and future directions that augment VSWIR spectral coverage to include the thermal infrared portion of the spectrum, including our recent efforts to accurately retrieve vegetation surface temperature and estimate consumptive water use in agricultural systems throughout the diurnal cycle.
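A common realization of such spectra-to-trait models is partial least squares (PLS) regression on the full band set. The sketch below is a hedged illustration with synthetic spectra and a synthetic trait, not the authors' algorithm, showing how a cross-validated PLS retrieval might be set up.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hedged sketch: cross-validated PLS retrieval of a leaf trait from full-spectrum
# reflectance. Spectra and the trait are synthetic stand-ins for illustration.
rng = np.random.default_rng(0)
X = rng.random((120, 2101))                       # 120 leaves, 400-2500 nm at 1 nm
y = X[:, 300:320].mean(axis=1) + 0.05 * rng.standard_normal(120)   # synthetic trait

pls = PLSRegression(n_components=10)              # latent-variable count is a tuning choice
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"cross-validated R^2 = {r2:.2f}")
```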
NASA Astrophysics Data System (ADS)
Pflaumann, Uwe; Duprat, Josette; Pujol, Claude; Labeyrie, Laurent D.
1996-02-01
We present a data set of 738 planktonic foraminiferal species counts from sediment surface samples of the eastern North Atlantic and the South Atlantic between 87°N and 40°S, 35°E and 60°W including published Climate: Long-Range Investigation, Mapping, and Prediction (CLIMAP) data. These species counts are linked to Levitus's [1982] modern water temperature data for the four caloric seasons, four depth ranges (0, 30, 50, and 75 m), and the combined means of those depth ranges. The relation between planktonic foraminiferal assemblages and sea surface temperature (SST) data is estimated using the newly developed SIMMAX technique, which is an acronym for a modern analog technique (MAT) with a similarity index, based on (1) the scalar product of the normalized faunal percentages and (2) a weighting procedure of the modern analog's SSTs according to the inverse geographical distances of the most similar samples. Compared to the classical CLIMAP transfer technique and conventional MAT techniques, SIMMAX provides a more confident reconstruction of paleo-SSTs (correlation coefficient is 0.994 for the caloric winter and 0.993 for caloric summer). The standard deviation of the residuals is 0.90°C for caloric winter and 0.96°C for caloric summer at 0-m water depth. The SST estimates reach optimum stability (standard deviation of the residuals is 0.88°C) at the average 0- to 75-m water depth. Our extensive database provides SST estimates over a range of -1.4 to 27.2°C for caloric winter and 0.4 to 28.6°C for caloric summer, allowing SST estimates which are especially valuable for the high-latitude Atlantic during glacial times. An electronic supplement of this material may be obtained on a diskette or by Anonymous FTP from KOSMOS.AGU.ORG. (Log in to AGU's FTP account using ANONYMOUS as the username and GUEST as the password. Go to the right directory by typing CD APPEND. Type LS to see what files are available. Type GET and the name of the file to get it. Finally type EXIT to leave the system.) (Paper 95PA01743, SIMMAX: A modern analog technique to deduce Atlantic sea surface temperatures from planktonic foraminifera in deep-sea sediments, Uwe Pflaumann, Josette Duprat, Claude Pujol, and Laurent D. Labeyrie). Diskette may be ordered from American Geophysical Union, 2000 Florida Avenue, N.W., Washington, DC 20009; payment must accompany order.
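The core of the SIMMAX estimator can be sketched in a few lines: a similarity index computed as the scalar product of unit-normalized faunal percentage vectors, followed by inverse-geographic-distance weighting of the best modern analogs' SSTs. The code below is a schematic reconstruction from that description with synthetic inputs; variable names and the number of analogs retained are assumptions.

```python
import numpy as np

# Schematic SIMMAX-style estimate (names and analog count assumed):
# similarity = scalar product of unit-normalized faunal percentage vectors;
# SST = inverse-geographic-distance-weighted mean of the best analogs' modern SSTs.
def simmax_sst(fossil_counts, modern_counts, modern_sst, distances_km, n_best=10):
    f = fossil_counts / np.linalg.norm(fossil_counts)
    M = modern_counts / np.linalg.norm(modern_counts, axis=1, keepdims=True)
    similarity = M @ f                              # one value per modern core-top sample
    best = np.argsort(similarity)[-n_best:]         # most similar modern analogs
    w = 1.0 / np.maximum(distances_km[best], 1.0)   # inverse-distance weights
    return np.sum(w * modern_sst[best]) / np.sum(w)

# Tiny synthetic demonstration.
rng = np.random.default_rng(0)
modern = rng.random((738, 26))
fossil = modern[3] + 0.01 * rng.random(26)
print(simmax_sst(fossil, modern,
                 modern_sst=rng.uniform(-1.4, 28.6, 738),
                 distances_km=rng.uniform(10, 5000, 738)))
```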
NASA Technical Reports Server (NTRS)
Sreekanta Murthy, T.
1992-01-01
Results of an investigation of formal nonlinear programming-based numerical optimization techniques for helicopter airframe vibration reduction are summarized. The objective and constraint functions and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.
A programing system for research and applications in structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Rogers, J. L., Jr.
1981-01-01
The flexibility necessary for such diverse utilizations is achieved by combining, in a modular manner, a state-of-the-art optimization program, a production level structural analysis program, and user supplied and problem dependent interface programs. Standard utility capabilities in modern computer operating systems are used to integrate these programs. This approach results in flexibility of the optimization procedure organization and versatility in the formulation of constraints and design variables. Features shown in numerical examples include: variability of structural layout and overall shape geometry, static strength and stiffness constraints, local buckling failure, and vibration constraints.
Muscle optimization techniques impact the magnitude of calculated hip joint contact forces.
Wesseling, Mariska; Derikx, Loes C; de Groote, Friedl; Bartels, Ward; Meyer, Christophe; Verdonschot, Nico; Jonkers, Ilse
2015-03-01
In musculoskeletal modelling, several optimization techniques are used to calculate muscle forces, which strongly influence resultant hip contact forces (HCF). The goal of this study was to calculate muscle forces using four different optimization techniques, i.e., two different static optimization techniques, computed muscle control (CMC) and the physiological inverse approach (PIA). We investigated their subsequent effects on HCFs during gait and sit-to-stand, and found that at the first peak in gait, at 15-20% of the gait cycle, CMC calculated the highest HCFs (median 3.9 times peak GRF (pGRF)). When comparing calculated HCFs to experimental HCFs reported in the literature, the former were up to 238% larger. Both static optimization techniques produced lower HCFs (median 3.0 and 3.1 pGRF), while PIA included muscle dynamics without an excessive increase in HCF (median 3.2 pGRF). The increased HCFs in CMC were potentially caused by higher muscle forces resulting from co-contraction of agonists and antagonists around the hip. Alternatively, these higher HCFs may be caused by the slightly poorer tracking of the net joint moment by the muscle moments calculated by CMC. We conclude that the use of different optimization techniques affects calculated HCFs, and static optimization approached experimental values best. © 2014 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
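For readers unfamiliar with static optimization, the sketch below shows the usual form of the problem: find muscle forces that reproduce a required joint moment while minimizing a sum of squared activations, after which the joint contact force follows from the muscle and external forces. Moment arms, maximum forces, and the required moment are illustrative assumptions, not values from this study.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged static-optimization sketch with made-up numbers: choose muscle activations
# that reproduce a required hip moment while minimizing summed squared activation;
# the resulting muscle forces feed into the hip contact force calculation.
r     = np.array([0.05, 0.04, 0.03])      # moment arms [m] (assumed)
f_max = np.array([3000., 2500., 1500.])   # maximum isometric forces [N] (assumed)
m_req = 80.0                              # required net joint moment [N*m] (assumed)

cost = lambda a: np.sum(a ** 2)                                  # activation criterion
cons = {"type": "eq", "fun": lambda a: r @ (a * f_max) - m_req}  # moment balance
res  = minimize(cost, x0=np.full(3, 0.2), bounds=[(0, 1)] * 3, constraints=cons)

muscle_forces = res.x * f_max
print(muscle_forces, muscle_forces.sum())   # summed muscle force contributes to the HCF
```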
Chemistry Is Dead. Long Live Chemistry!
Lavis, Luke D
2017-10-03
Chemistry, once king of fluorescence microscopy, was usurped by the field of fluorescent proteins. The increased demands of modern microscopy techniques on the "photon budget" require better and brighter fluorophores, causing a renewed interest in synthetic dyes. Here, we review the recent advances in biochemistry, protein engineering, and organic synthesis that have allowed a triumphant return of chemical fluorophores to modern biological imaging.
[Watsu: a modern method in physiotherapy, body regeneration, and sports].
Weber-Nowakowska, Katarzyna; Gebska, Magdalena; Zyzniewska-Banaszak, Ewelina
2013-01-01
Progress in existing methods of physiotherapy and body regeneration and introduction of new methods has made it possible to precisely select the techniques according to patient needs. The modern therapist is capable of improving the physical and mental condition of the patient. Watsu helps the therapist eliminate symptoms from the locomotor system and reach the psychic sphere at the same time.
Graphic Poetry: How to Help Students Get the Most out of Pictures
ERIC Educational Resources Information Center
Chiang, River Ya-ling
2013-01-01
This paper attempts to give an account of some innovative work in paintings and modern poetry and to show how modern poets, such as Jane Flanders and Anne Sexton, the two American poets in particular, express and develop radically new conventions for their respective arts. Also elaborated is how such changes in artistic techniques are related to…
Analytics and Action in Afghanistan
2010-09-01
rests on rational technology, and ultimately on scientific knowledge. No country could be modern without being economically advanced or...backwardness to enlightened modernity. Underdeveloped countries had failed to progress to what Max Weber called rational legalism because of the grip...Douglas Pike, Viet Cong: The Organization and Techniques of the National Liberation Front of South Vietnam (Boston: Massachusetts Institute of Technology
Europe Report, Science and Technology
1986-09-30
to certain basic products of the food industry such as beer, vinegar, spirits, starches, etc. It is also assumed that modern biotechnologies...Czechoslovak food production. This is also the objective of innovative and modernizing programs in the fermented food sectors. The program for the...cattle and improves fodder utilization, assuming balanced doses of fodder. The development of fermentation techniques of production will occur within
Progress in multidisciplinary design optimization at NASA Langley
NASA Technical Reports Server (NTRS)
Padula, Sharon L.
1993-01-01
Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.
ERIC Educational Resources Information Center
Ferguson, Albert S.
Experiences with various modern management techniques and practices in selected small, private church-related colleges were studied. For comparative purposes, practices in public colleges and universities were also assessed. Management techniques used in small companies were identified through review of the literature and the management seminars…
Optimized Orthovoltage Stereotactic Radiosurgery
NASA Astrophysics Data System (ADS)
Fagerstrom, Jessica M.
Because of its ability to treat intracranial targets effectively and noninvasively, stereotactic radiosurgery (SRS) is a prevalent treatment modality in modern radiation therapy. This work focused on SRS delivering rectangular function dose distributions, which are desirable for some targets such as those with functional tissue included within the target volume. In order to achieve such distributions, this work used fluence modulation and energies lower than those utilized in conventional SRS. In this work, the relationship between prescription isodose and dose gradients was examined for standard, unmodulated orthovoltage SRS dose distributions. Monte Carlo-generated energy deposition kernels were used to calculate 4pi, isocentric dose distributions for a polyenergetic orthovoltage spectrum, as well as monoenergetic orthovoltage beams. The relationship between dose gradients and prescription isodose was found to be field size and energy dependent, and values were found for prescription isodose that optimize dose gradients. Next, a pencil-beam model was used with a Genetic Algorithm search heuristic to optimize the spatial distribution of added tungsten filtration within apertures of cone collimators in a moderately filtered 250 kVp beam. Four cone sizes at three depths were examined with a Monte Carlo model to determine the effects of the optimized modulation compared to open cones, and the simulations found that the optimized cones were able to achieve both improved penumbra and flatness statistics at depth compared to the open cones. Prototypes of the filter designs calculated using mathematical optimization techniques and Monte Carlo simulations were then manufactured and inserted into custom built orthovoltage SRS cone collimators. A positioning system built in-house was used to place the collimator and filter assemblies temporarily in the 250 kVp beam line. Measurements were performed in water using radiochromic film scanned with both a standard white light flatbed scanner as well as a prototype laser densitometry system. Measured beam profiles showed that the modulated beams could more closely approach rectangular function dose profiles compared to the open cones. A methodology has been described and implemented to achieve optimized SRS delivery, including the development of working prototypes. Future work may include the construction of a full treatment platform.
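One element of this workflow, the genetic-algorithm search over a discretized filter-thickness profile, can be sketched generically as follows. The cost function here is a stand-in placeholder (a real study would score flatness and penumbra with a pencil-beam or Monte Carlo dose model), and the population size, mutation rate, and thickness bounds are assumptions for illustration.

```python
import numpy as np

# Hedged GA sketch: search over a discretized filter-thickness profile.
rng = np.random.default_rng(0)

def profile_cost(thickness):                         # stand-in objective
    target = np.linspace(0.6, 0.0, thickness.size)   # arbitrary "ideal" profile
    return np.sum((thickness - target) ** 2)

def genetic_algorithm(n_bins=16, pop=40, gens=200, mut=0.05):
    population = rng.uniform(0.0, 1.0, (pop, n_bins))       # thickness per radial bin
    for _ in range(gens):
        fitness = np.array([profile_cost(ind) for ind in population])
        parents = population[np.argsort(fitness)[:pop // 2]]   # truncation selection
        children = []
        for _ in range(pop - parents.shape[0]):
            a, b = parents[rng.integers(parents.shape[0], size=2)]
            cut = rng.integers(1, n_bins)
            child = np.concatenate([a[:cut], b[cut:]])          # one-point crossover
            child += mut * rng.standard_normal(n_bins)          # Gaussian mutation
            children.append(np.clip(child, 0.0, 1.0))
        population = np.vstack([parents, np.array(children)])
    return population[np.argmin([profile_cost(ind) for ind in population])]

print(genetic_algorithm())
```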
Practical Problems in the Cement Industry Solved by Modern Research Techniques
ERIC Educational Resources Information Center
Daugherty, Kenneth E.; Robertson, Les D.
1972-01-01
Practical chemical problems in the cement industry are being solved by such techniques as infrared spectroscopy, gas chromatography-mass spectrometry, X-ray diffraction, atomic absorption and arc spectroscopy, thermally evolved gas analysis, Mossbauer spectroscopy, transmission and scanning electron microscopy. (CP)
New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases
NASA Astrophysics Data System (ADS)
Brescia, Massimo
2012-11-01
Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology to extract the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems, since it has to orchestrate complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has led to a huge production of data, whose warehouse management and the need to optimize analysis and mining procedures are changing how modern science is conceived. Classical data exploration, based on local user-owned data storage and limited computing infrastructures, is no longer efficient in the case of MDS spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context modern experimental and observational science requires a good understanding of computer science, network infrastructures, Data Mining, etc., i.e. of all those techniques which fall into the domain of the so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing, and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools, based on Web 2.0 technologies and Machine Learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications offering a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms will also be introduced.
NASA Astrophysics Data System (ADS)
Chen, Shiyu; Li, Haiyang; Baoyin, Hexi
2018-06-01
This paper investigates a method for optimizing multi-rendezvous low-thrust trajectories using indirect methods. An efficient technique, labeled costate transforming, is proposed to optimize multiple trajectory legs simultaneously rather than optimizing each trajectory leg individually. Complex inner-point constraints and a large number of free variables are one main challenge in optimizing multi-leg transfers via shooting algorithms. This difficulty is reduced by first optimizing each trajectory leg individually; the results may then be utilized as an initial guess in the simultaneous optimization of multiple trajectory legs. In this paper, the limitations of similar techniques in previous research are surpassed, and a homotopic approach is employed to improve the convergence efficiency of the shooting process in multi-rendezvous low-thrust trajectory optimization. Numerical examples demonstrate that the newly introduced techniques are valid and efficient.
Oliver, Kelly; Manton, David John
2015-01-01
Effective behavior management guides children through the complex social context of dentistry utilizing techniques based on a current understanding of the social, emotional, and cognitive development of children. Behavior management techniques facilitate effective communication and establish social and behavioral guidelines for the dental environment. Contemporary parenting styles, expectations, and attitudes of modern parents and society have influenced the use of behavior management techniques with a prevailing emphasis on communicative techniques and pharmacological management over aversive techniques.
A fast and objective multidimensional kernel density estimation method: fastKDE
O'Brien, Travis A.; Kashinath, Karthik; Cavanaugh, Nicholas R.; ...
2016-03-07
Numerous facets of scientific research implicitly or explicitly call for the estimation of probability densities. Histograms and kernel density estimates (KDEs) are two commonly used techniques for estimating such information, with the KDE generally providing a higher fidelity representation of the probability density function (PDF). Both methods require specification of either a bin width or a kernel bandwidth. While techniques exist for choosing the kernel bandwidth optimally and objectively, they are computationally intensive, since they require repeated calculation of the KDE. A solution for objectively and optimally choosing both the kernel shape and width has recently been developed by Bernacchia and Pigolotti (2011). While this solution theoretically applies to multidimensional KDEs, it has not been clear how to practically do so. A method for practically extending the Bernacchia-Pigolotti KDE to multidimensions is introduced. This multidimensional extension is combined with a recently-developed computational improvement to their method that makes it computationally efficient: a 2D KDE on 10^5 samples only takes 1 s on a modern workstation. This fast and objective KDE method, called the fastKDE method, retains the excellent statistical convergence properties that have been demonstrated for univariate samples. The fastKDE method exhibits statistical accuracy that is comparable to state-of-the-science KDE methods publicly available in R, and it produces kernel density estimates several orders of magnitude faster. The fastKDE method does an excellent job of encoding covariance information for bivariate samples. This property allows for direct calculation of conditional PDFs with fastKDE. It is demonstrated how this capability might be leveraged for detecting non-trivial relationships between quantities in physical systems, such as transitional behavior.
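As a point of reference for the problem being solved, the sketch below builds a standard bivariate Gaussian KDE with scipy on synthetic correlated samples. This is not the fastKDE method; the bandwidth follows scipy's default Scott's rule rather than the objective Bernacchia-Pigolotti selection discussed above.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Baseline sketch (not fastKDE): a standard bivariate Gaussian KDE on synthetic
# correlated samples, with scipy's default Scott's-rule bandwidth.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=10_000)

kde = gaussian_kde(samples.T)                     # expects shape (dim, n_samples)
grid = np.mgrid[-3:3:100j, -3:3:100j]             # evaluation grid
density = kde(grid.reshape(2, -1)).reshape(100, 100)
print(density.sum() * (6.0 / 99) ** 2)            # integrates to roughly 1 on this grid
```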
Medicine and the space odyssey.
Charlton, Bruce G
2006-01-01
Up to the mid-1960s, science and technology (including medicine) were generally regarded as exciting, beautiful and spiritually enthralling, and the space odyssey seemed a symbol of the optimistic future of humankind. The early seventies saw a growing disillusionment with space travel as part of a mood of cultural pessimism and anti-modernization, and this combined with a resurgence of therapeutic nihilism in medicine. But recent discussions of renewed space exploration and a Mars mission may be evidence of a changing zeitgeist, with Western culture moving towards a bolder and more optimistic attitude. The adventure of space travel, exploration and colonization could be seen as both a barometer of cultural optimism and an enterprise which would feed back into cultural optimism for many decades to come. Medical science could also be a beneficiary, since greater boldness and optimism would be likely to renew the goals of medicine to do positive good, as contrasted with the necessary, but relatively uninspiring, requirement to minimize risk and harm. In a modernizing society humankind needs to look outward as well as inward: we need a frontier, and we need to grow. A resurgent space odyssey may be the best way that this can be enacted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirayama, S; Fujimoto, R
Purpose: The purpose was to demonstrate a developed acceleration technique for dose optimization and to investigate its applicability to the optimization process in a treatment planning system (TPS) for proton therapy. Methods: In the developed technique, the dose matrix is divided into two parts, main and halo, based on beam sizes. The boundary of the two parts is varied depending on the beam energy and water equivalent depth by utilizing the beam size as a single threshold parameter. The optimization is executed with two levels of iterations. In the inner loop, doses from the main part are updated, whereas doses from the halo part remain constant. In the outer loop, the doses from the halo part are recalculated. We implemented this technique in the optimization process in the TPS and investigated the dependence of the speedup effect on the target volume and the applicability to the worst-case optimization (WCO) in benchmarks. Results: We created irradiation plans for various cubic targets and measured the optimization time while varying the target volume. The speedup effect improved as the target volume increased, and the calculation speed increased by a factor of six for a 1000 cm3 target. An IMPT plan for the RTOG benchmark phantom was created in consideration of ±3.5% range uncertainties using the WCO. Beams were irradiated at 0, 45, and 315 degrees. The target's prescribed dose and the OAR's Dmax were set to 3 Gy and 1.5 Gy, respectively. Using the developed technique, the calculation speed increased by a factor of 1.5. Meanwhile, no significant difference in the calculated DVHs was found before and after incorporating the technique into the WCO. Conclusion: The developed technique could be adapted to the TPS's optimization. The technique was effective particularly for large target cases.
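The two-level iteration described above can be sketched generically as follows. All names are assumptions: D_main and D_halo stand for dose-influence matrices split by beam size and w for spot weights; the inner loop updates only the main-part dose while the halo contribution is frozen, and the outer loop refreshes the halo dose. This is an illustration of the scheme, not the TPS implementation.

```python
import numpy as np

# Hedged sketch of a two-level (main/halo) dose-optimization update (names assumed).
def optimize_weights(D_main, D_halo, d_presc, w, n_outer=5, n_inner=50, lr=1e-4):
    for _ in range(n_outer):
        d_halo_fixed = D_halo @ w                 # recomputed once per outer iteration
        for _ in range(n_inner):
            d = D_main @ w + d_halo_fixed         # halo part held constant here
            grad = D_main.T @ (d - d_presc)       # gradient of 0.5 * ||d - d_presc||^2
            w = np.maximum(w - lr * grad, 0.0)    # projected step keeps weights non-negative
    return w

# Tiny synthetic demonstration.
rng = np.random.default_rng(0)
D_main, D_halo = rng.random((200, 50)), 0.05 * rng.random((200, 50))
w_opt = optimize_weights(D_main, D_halo, d_presc=np.ones(200), w=np.full(50, 0.1))
print(np.linalg.norm((D_main + D_halo) @ w_opt - 1.0))
```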
Optimization of Driving Styles for Fuel Economy Improvement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malikopoulos, Andreas; Aguilar, Juan P.
2012-01-01
Modern vehicles have sophisticated electronic control units, particularly to control engine operation with respect to a balance between fuel economy, emissions, and power. These control units are designed for specific driving conditions and testing. However, each individual driving style is different and rarely meets those driving conditions. In the research reported here we investigate those driving-style factors that have a major impact on fuel economy. An optimization framework is proposed with the aim of optimizing driving styles with respect to these driving factors. A set of polynomial metamodels is constructed to reflect the responses produced by changes in the driving factors. We then compare the optimized driving styles to the original ones and evaluate the efficiency and effectiveness of the optimization formulation.
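In outline, such a metamodel-based optimization fits a low-order polynomial response surface of fuel consumption to sampled driving-factor values and then minimizes that surface. The sketch below uses two made-up driving-style factors and synthetic data purely for illustration; it is not the study's metamodel.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

# Hedged sketch: quadratic metamodel of fuel consumption over two assumed
# driving-style factors (mean acceleration, idle fraction), then minimized.
rng = np.random.default_rng(2)
X = rng.uniform([0.5, 0.05], [3.0, 0.4], size=(200, 2))                  # factor samples
fuel = 5.0 + 1.2 * X[:, 0] ** 2 + 8.0 * X[:, 1] + 0.1 * rng.standard_normal(200)

poly = PolynomialFeatures(degree=2)
model = LinearRegression().fit(poly.fit_transform(X), fuel)

metamodel = lambda x: model.predict(poly.transform(x.reshape(1, -1)))[0]
res = minimize(metamodel, x0=[1.5, 0.2], bounds=[(0.5, 3.0), (0.05, 0.4)])
print(res.x, res.fun)    # driving-style factors that minimize predicted fuel use
```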
[Discussion on the cultural loss and return of modern acupuncture].
Liu, Bing; Zhao, Jing-sheng; Gao, Shu-zhong
2009-08-01
The philosophical ontology analysis was used in this study to explore the self-factors related to the cultural loss of modern acupuncture, and to establish the theoretical constructs and the clinical model for the cultural return. It is indicated that the most important factors related to the cultural loss of modern acupuncture are the separation of technical characteristics and cultural connotations and the diversion of modern techniques away from classical acupuncture. An effective way of the cultural return is to build a harmonious theoretical and clinical model to develop acupuncture. Based on the foundation of acupuncture from its own culture roots, the traditional sense and cultural values should be enhanced to facilitate the cultural return of acupuncture in theory and clinical practice.
Candidate eco-friendly gas mixtures for MPGDs
NASA Astrophysics Data System (ADS)
Benussi, L.; Bianco, S.; Saviano, G.; Muhammad, S.; Piccolo, D.; Ferrini, M.; Parvis, M.; Grassini, S.; Colafranceschi, S.; Kjølbro, J.; Sharma, A.; Yang, D.; Chen, G.; Ban, Y.; Li, Q.
2018-02-01
Modern gas detectors for the detection of particles require F-based gases for optimal performance. Recent regulations demand that the use of environmentally unfriendly F-based gases be limited or banned. This review studies the properties of potential eco-friendly candidate gas replacements.
Ant colony optimization algorithm for signal coordination of oversaturated traffic networks.
DOT National Transportation Integrated Search
2010-05-01
Traffic congestion is a daily and growing problem of the modern era in almost all major cities in the world. Increasing traffic demand strains the existing transportation system, leading to oversaturated network conditions, especially at peak hou...
Computed tomography automatic exposure control techniques in 18F-FDG oncology PET-CT scanning.
Iball, Gareth R; Tout, Deborah
2014-04-01
Computed tomography (CT) automatic exposure control (AEC) systems are now used in all modern PET-CT scanners. A collaborative study was undertaken to compare AEC techniques of the three major PET-CT manufacturers for fluorine-18 fluorodeoxyglucose half-body oncology imaging. An audit of 70 patients was performed for half-body CT scans taken on a GE Discovery 690, Philips Gemini TF and Siemens Biograph mCT (all 64-slice CT). Patient demographic and dose information was recorded and image noise was calculated as the SD of Hounsfield units in the liver. A direct comparison of the AEC systems was made by scanning a Rando phantom on all three systems for a range of AEC settings. The variation in dose and image quality with patient weight was significantly different for all three systems, with the GE system showing the largest variation in dose with weight and Philips the least. Image noise varied with patient weight in Philips and Siemens systems but was constant for all weights in GE. The z-axis mA profiles from the Rando phantom demonstrate that these differences are caused by the nature of the tube current modulation techniques applied. The mA profiles varied considerably according to the AEC settings used. CT AEC techniques from the three manufacturers yield significantly different tube current modulation patterns and hence deliver different doses and levels of image quality across a range of patient weights. Users should be aware of how their system works and of steps that could be taken to optimize imaging protocols.
The unique role of halogen substituents in the design of modern agrochemicals.
Jeschke, Peter
2010-01-01
The past 30 years have witnessed a period of significant expansion in the use of halogenated compounds in the field of agrochemical research and development. The introduction of halogens into active ingredients has become an important concept in the quest for a modern agrochemical with optimal efficacy, environmental safety, user friendliness and economic viability. Outstanding progress has been made, especially in synthetic methods for particular halogen-substituted key intermediates that were previously prohibitively expensive. Interestingly, there has been a rise in the number of commercial products containing 'mixed' halogens, e.g. one or more fluorine, chlorine, bromine or iodine atoms in addition to one or more further halogen atoms. Extrapolation of the current trend indicates that a definite growth is to be expected in fluorine-substituted agrochemicals throughout the twenty-first century. A number of these recently developed agrochemical candidates containing halogen substituents represent novel classes of chemical compounds with new modes of action. However, the complex structure-activity relationships associated with biologically active molecules mean that the introduction of halogens can lead to either an increase or a decrease in the efficacy of a compound, depending on its changed mode of action, physicochemical properties, target interaction or metabolic susceptibility and transformation. In spite of modern design concepts, it is still difficult to predict the sites in a molecule at which halogen substitution will result in optimal desired effects. This review describes comprehensively the successful utilisation of halogens and their unique role in the design of modern agrochemicals, exemplified by various commercial products from Bayer CropScience coming from different agrochemical areas.
Flexible multibody simulation of automotive systems with non-modal model reduction techniques
NASA Astrophysics Data System (ADS)
Shiiba, Taichi; Fehr, Jörg; Eberhard, Peter
2012-12-01
The stiffness of the body structure of an automobile has a strong relationship with its noise, vibration, and harshness (NVH) characteristics. In this paper, the effect of the stiffness of the body structure upon ride quality is discussed with flexible multibody dynamics. In flexible multibody simulation, the local elastic deformation of the vehicle has traditionally been described with modal shape functions. Recently, linear model reduction techniques from system dynamics and mathematics came into focus to find more sophisticated elastic shape functions. In this work, the NVH-relevant states of a racing kart are simulated, whereas the elastic shape functions are calculated with modern model reduction techniques like moment matching by projection on Krylov subspaces, singular value decomposition-based reduction techniques, and combinations of those. The whole elastic multibody vehicle model consisting of tyres, steering, axle, etc. is considered, and an excitation with vibration characteristics in a wide frequency range is evaluated in this paper. The accuracy and the calculation performance of these modern model reduction techniques are investigated, including a comparison with the modal reduction approach.
Addressing Angular Single-Event Effects in the Estimation of On-Orbit Error Rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, David S.; Swift, Gary M.; Wirthlin, Michael J.
2015-12-01
Our study describes complications introduced by angular direct ionization events on space error rate predictions. In particular, prevalence of multiple-cell upsets and a breakdown in the application of effective linear energy transfer in modern-scale devices can skew error rates approximated from currently available estimation models. Moreover, this paper highlights the importance of angular testing and proposes a methodology to extend existing error estimation tools to properly consider angular strikes in modern-scale devices. Finally, these techniques are illustrated with test data provided from a modern 28 nm SRAM-based device.
Merchandising Techniques and Libraries.
ERIC Educational Resources Information Center
Green, Sylvie A.
1981-01-01
Proposes that libraries employ modern booksellers' merchandising techniques to improve circulation of library materials. Using displays in various ways, the methods and reasons for weeding out books, replacing worn book jackets, and selecting new books are discussed. Suggestions for learning how to market and 11 references are provided. (RBF)
Dance Critique as Signature Pedagogy
ERIC Educational Resources Information Center
Kearns, Lauren
2017-01-01
The curriculum of preprofessional university degree programs in dance typically comprise four components: theory and history, dance technique, creative process, and performance. This article focuses on critique in the modern dance technique and choreography components of the dance curriculum. Bachelor of Fine Arts programs utilize critique as a…
Quantitative proteomics in the field of microbiology.
Otto, Andreas; Becher, Dörte; Schmidt, Frank
2014-03-01
Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Kamalova, Lera A.; Koletvinova, Natal'ya D.
2016-01-01
This article aims to study the problems of reading and to improve the reading culture of bachelor's students of elementary education in modern higher-education institutions, and to develop the most effective methods and techniques for improving students' reading culture in the study of Humanities disciplines. The leading method of the study of this…
System-level power optimization for real-time distributed embedded systems
NASA Astrophysics Data System (ADS)
Luo, Jiong
Power optimization is one of the crucial design considerations for modern electronic systems. In this thesis, we present several system-level power optimization techniques for real-time distributed embedded systems, based on dynamic voltage scaling, dynamic power management, and management of peak power and variance of the power profile.

Dynamic voltage scaling has been widely acknowledged as an important and powerful technique to trade off dynamic power consumption and delay. Efficient dynamic voltage scaling requires effective variable-voltage scheduling mechanisms that can adjust voltages and clock frequencies adaptively based on workloads and timing constraints. For this purpose, we propose static variable-voltage scheduling algorithms utilizing critical-path-driven timing analysis for the case when tasks are assumed to have uniform switching activities, as well as energy-gradient-driven slack allocation for a more general scenario. The proposed techniques can achieve close-to-optimal power savings with very low computational complexity, without violating any real-time constraints.

We also present algorithms for power-efficient joint scheduling of multi-rate periodic task graphs along with soft aperiodic tasks. The power issue is addressed through both dynamic voltage scaling and power management. Periodic task graphs are scheduled statically. Flexibility is introduced into the static schedule to allow the on-line scheduler to make local changes to PE schedules through resource reclaiming and slack stealing, without interfering with the validity of the global schedule. We provide a unified framework in which the response times of aperiodic tasks and power consumption are dynamically optimized simultaneously.

Interconnection network fabrics point to a new generation of power-efficient and scalable interconnection architectures for distributed embedded systems. As the system bandwidth continues to increase, interconnection networks become power/energy limited as well. Variable-frequency links have been designed by circuit designers for both parallel and serial links, which can adaptively regulate the supply voltage of transceivers to a desired link frequency, to exploit the variations in bandwidth requirement for power savings. We propose solutions for simultaneous dynamic voltage scaling of processors and links. The proposed solution considers real-time scheduling, flow control, and packet routing jointly. It can trade off the power consumption on processors and communication links via efficient slack allocation, and lead to more power savings than dynamic voltage scaling on processors alone.

For battery-operated systems, the battery lifespan is an important concern. Due to the effects of discharge rate and battery recovery, the discharge pattern of batteries has an impact on the battery lifespan. Battery models indicate that even under the same average power consumption, reducing peak power current and variance in the power profile can increase the battery efficiency and thereby prolong battery lifetime. To take advantage of these effects, we propose battery-driven scheduling techniques for embedded applications, to reduce the peak power and the variance in the power profile of the overall system under real-time constraints. The proposed scheduling algorithms are also beneficial in addressing reliability and signal integrity concerns by effectively controlling peak power and variance of the power profile.
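The core energy argument behind dynamic voltage scaling can be made concrete with a back-of-the-envelope model: dynamic energy scales roughly as C*V^2 per cycle, and the voltage can drop when the clock is slowed to fill available slack. The sketch below encodes that textbook relationship with assumed numbers; it illustrates the principle only, not the thesis' scheduling algorithms.

```python
# Hedged illustration of the dynamic-voltage-scaling energy argument (numbers assumed):
# dynamic energy per run is roughly C * V^2 * cycles, and slowing the clock to just
# meet the deadline allows a proportionally lower supply voltage.
def dvs_energy(cycles, f_max, v_max, deadline, c_eff=1e-9):
    t_min = cycles / f_max                    # execution time at full speed [s]
    slack = max(deadline - t_min, 0.0)
    f_run = cycles / (t_min + slack)          # slowest frequency that still meets the deadline
    v_run = v_max * f_run / f_max             # assume voltage scales ~linearly with frequency
    return c_eff * v_run ** 2 * cycles        # switched-capacitance energy model

tight = dvs_energy(1e9, 1e9, 1.2, deadline=1.0)   # no slack: full voltage
loose = dvs_energy(1e9, 1e9, 1.2, deadline=2.0)   # 2x slack: half frequency, half voltage
print(tight, loose)                               # the relaxed case uses ~1/4 the energy
```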
Hashim, H A; Abido, M A
2015-01-01
This paper presents a comparative study of fuzzy controller design for the twin rotor multi-input multi-output (MIMO) system (TRMS) considering the most promising evolutionary techniques. These are the gravitational search algorithm (GSA), particle swarm optimization (PSO), artificial bee colony (ABC), and differential evolution (DE). In this study, the gains of four fuzzy proportional derivative (PD) controllers for the TRMS have been optimized using the considered techniques. The optimization techniques are developed to identify the optimal control parameters for system stability enhancement, to cancel high nonlinearities in the model, to reduce the coupling effect, and to drive TRMS pitch and yaw angles into the desired tracking trajectory efficiently and accurately. The most effective technique in terms of system response due to different disturbances has been investigated. In this work, it is observed that GSA is the most effective technique in terms of solution quality and convergence speed. PMID:25960738
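Of the techniques compared, particle swarm optimization is the simplest to sketch. The code below tunes two gains against a stand-in quadratic cost; in the actual study the cost would come from simulating the TRMS tracking error under the fuzzy PD controllers, and all hyperparameters shown here are assumptions.

```python
import numpy as np

# Hedged PSO sketch for tuning two controller gains against a user-supplied cost.
def pso(cost, bounds, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, lo.size))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

best_gains = pso(lambda g: (g[0] - 4.0) ** 2 + (g[1] - 0.8) ** 2,
                 bounds=[(0.0, 10.0), (0.0, 5.0)])
print(best_gains)    # approaches the stand-in optimum near (4.0, 0.8)
```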
Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J
2016-01-05
Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next-generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness, while also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
Liao, Xing; Xie, Yan-ming
2014-10-01
The impact of evidence-based medicine and clinical epidemiology on clinical research has contributed to the development of Chinese medicine in modern times over the past two decades. Many concepts and methods of modern science and technology are emerging in Chinese medicine research, resulting in constant progress. Systematic reviews, randomized controlled trials and other advanced mathematical approaches and statistical analysis methods have brought reform to Chinese medicine. In this new era, Chinese medicine researchers have many opportunities and challenges. On the one hand, Chinese medicine researchers need to dedicate themselves to providing enough evidence to the world through rigorous studies, whilst on the other hand, they also need to keep up with the speed of modern medical research. For example, real-world studies, comparative effectiveness research, propensity score techniques and registry studies have recently emerged. This article aims to inspire Chinese medicine researchers to explore new areas by introducing these new ideas and new techniques.
Travis, F; Olson, T; Egenes, T; Gupta, H K
2001-07-01
This study tested the prediction that reading Vedic Sanskrit texts, without knowledge of their meaning, produces a distinct physiological state. We measured EEG, breath rate, heart rate, and skin conductance during: (1) 15-min Transcendental Meditation (TM) practice; (2) 15-min reading verses of the Bhagavad Gita in Sanskrit; and (3) 15-min reading the same verses translated in German, Spanish, or French. The two reading conditions were randomly counterbalanced, and subjects filled out experience forms between each block to reduce carryover effects. Skin conductance levels significantly decreased during both reading Sanskrit and TM practice, and increased slightly during reading a modern language. Alpha power and coherence were significantly higher when reading Sanskrit and during TM practice, compared to reading modern languages. Similar physiological patterns when reading Sanskrit and during practice of the TM technique suggests that the state gained during TM practice may be integrated with active mental processes by reading Sanskrit.
[Construction of multiple drug release system based on components of traditional Chinese medicine].
Liu, Dan; Jia, Xiaobin; Yu, Danhong; Zhang, Zhenhai; Sun, E
2012-08-01
With the development of the modernization drive of traditional Chinese medicine (TCM) preparations, research on new types of TCM dosage forms has become a hot spot in the field. Because of the complexity of TCM components and the uncertainty of their material basis, there is still no scientific system for modern TCM dosage forms so far. Modern TCM preparations must inevitably take into account the multi-component nature and the general functional characteristics of multi-link and multi-target action. The author suggests building a multiple drug release system for TCM using diverse preparation techniques and drug release methods at multiple levels, on the basis of the nature and functional characteristics of TCM components. This essay elaborates the ideas for building the multiple traditional Chinese medicine release system, its theoretical basis, preparation techniques and assessment system, and current problems and solutions, in order to build a multiple TCM release system with a view to enhancing the bioavailability of TCM components and providing a new form for TCM preparations.
Flight Test Validation of Optimal Input Design and Comparison to Conventional Inputs
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1997-01-01
A technique for designing optimal inputs for aerodynamic parameter estimation was flight tested on the F-18 High Angle of Attack Research Vehicle (HARV). Model parameter accuracies calculated from flight test data were compared on an equal basis for optimal input designs and conventional inputs at the same flight condition. In spite of errors in the a priori input design models and distortions of the input form by the feedback control system, the optimal inputs increased estimated parameter accuracies compared to conventional 3-2-1-1 and doublet inputs. In addition, the tests using optimal input designs demonstrated enhanced design flexibility, allowing the optimal input design technique to use a larger input amplitude to achieve further increases in estimated parameter accuracy without departing from the desired flight test condition. This work validated the analysis used to develop the optimal input designs, and demonstrated the feasibility and practical utility of the optimal input design technique.
Discovery of Newer Therapeutic Leads for Prostate Cancer
2009-06-01
promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques and use this...quantities of the plant extracts using supercritical fluid extraction techniques. Large scale plant collections were conducted for 14 of the top 20...material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of
Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph
Radioisotope studies of the farmville meteorite using γγ-coincidence spectrometry.
Howard, Chris; Ferm, Megan; Cesaratto, John; Daigle, Stephen; Iliadis, Christian
2014-12-01
Radionuclides are cosmogenically produced in meteorites before they fall to the surface of the Earth. Measurement of the radioactive decay of such nuclides provides a wealth of information on the irradiation conditions of the meteorite fragment, the intensity of cosmic rays in the inner solar system, and the magnetic activity of the Sun. We report here on the detection of (26)Al using a sophisticated spectrometer consisting of a HPGe detector and a NaI(Tl) annulus. It is shown that modern γ-ray spectrometers represent an interesting alternative to other detection techniques. Data are obtained for a fragment of the Farmville meteorite and compared to results from Geant4 simulations. In particular, we report on optimizing the detection sensitivity by using suitable coincidence gates for deposited energy and event multiplicity. We measured an (26)Al activity of 48.5±3.5 dpm/kg for the Farmville meteorite, in agreement with previously reported values for other H chondrites. Copyright © 2014 Elsevier Ltd. All rights reserved.
Decentralized Fuzzy MPC on Spatial Power Control of a Large PHWR
NASA Astrophysics Data System (ADS)
Liu, Xiangjie; Jiang, Di; Lee, Kwang Y.
2016-08-01
Reliable power control for stabilizing the spatial oscillations is quite important for ensuring the safe operation of a modern pressurized heavy water reactor (PHWR), since these spatial oscillations can cause “flux tilting” in the reactor core. In this paper, a decentralized fuzzy model predictive control (DFMPC) is proposed for spatial control of a PHWR. Due to the load-dependent dynamics of the nuclear power plant, fuzzy modeling is used to approximate the nonlinear process. A fuzzy Lyapunov function and a “quasi-min-max” strategy are utilized in designing the DFMPC to reduce conservatism. Plant-wide stability is achieved by the asymptotically positive realness constraint (APRC) for this decentralized MPC. The optimization problem is solved with a receding horizon scheme involving the linear matrix inequalities (LMIs) technique. Through dynamic simulations, it is demonstrated that the designed DFMPC can effectively suppress spatial oscillations developed in the PHWR and, further, shows advantages over the typical parallel distributed compensation (PDC) control scheme.
Triple-helix molecular switch-based aptasensors and DNA sensors.
Bagheri, Elnaz; Abnous, Khalil; Alibolandi, Mona; Ramezani, Mohammad; Taghdisi, Seyed Mohammad
2018-07-15
Utilization of traditional analytical techniques is limited because they are generally time-consuming and require high consumption of reagents, complicated sample preparation and expensive equipment. Therefore, it is of great interest to achieve sensitive, rapid and simple detection methods. It is believed that nucleic acid assays, especially aptamers, are very important in modern life sciences for target detection and biological analysis. Aptamers and DNA-based sensors have been widely used for the design of various sensors owing to their unique features. In recent years, triple-helix molecular switch (THMS)-based aptasensors and DNA sensors have been broadly utilized for the detection and analysis of different targets. The THMS relies on the formation of DNA triplex via Watson-Crick and Hoogsteen base pairings under optimal conditions. This review focuses on recent progress in the development and applications of electrochemical, colorimetric, fluorescence and SERS aptasensors and DNA sensors, which are based on THMS. Also, the advantages and drawbacks of these methods are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
Baritugo, Kei-Anne; Kim, Hee Taek; David, Yokimiko; Choi, Jong-Il; Hong, Soon Ho; Jeong, Ki Jun; Choi, Jong Hyun; Joo, Jeong Chan; Park, Si Jae
2018-05-01
Bio-based production of industrially important chemicals provides an eco-friendly alternative to current petrochemical-based processes. Because of the limited supply of fossil fuel reserves, various technologies utilizing microbial host strains for the sustainable production of platform chemicals from renewable biomass have been developed. Corynebacterium glutamicum is a non-pathogenic industrial microbial species traditionally used for L-glutamate and L-lysine production. It is a promising species for industrial production of bio-based chemicals because of its flexible metabolism that allows the utilization of a broad spectrum of carbon sources and the production of various amino acids. Classical breeding, systems, synthetic biology, and metabolic engineering approaches have been used to improve its applications, ranging from traditional amino-acid production to modern biorefinery systems for production of value-added platform chemicals. This review describes recent advances in the development of genetic engineering tools and techniques for the establishment and optimization of metabolic pathways for bio-based production of major C2-C6 platform chemicals using recombinant C. glutamicum.
Improving the Reliability of Technological Subsystems Equipment for Steam Turbine Unit in Operation
NASA Astrophysics Data System (ADS)
Brodov, Yu. M.; Murmansky, B. E.; Aronson, R. T.
2017-11-01
The authors present an integrated approach to improving the reliability of the steam turbine unit (STU), along with examples of its implementation for the various STU technological subsystems. Based on statistical analysis of damage to individual turbine parts and components, on the development and application of modern repair methods and technologies, and on operational monitoring techniques, the critical components and elements of equipment are identified and priorities are proposed for improving the reliability of STU equipment in operation. Results are presented from the analysis of malfunctions of equipment in various STU technological subsystems operating as part of power units and at cross-linked thermal power plants and resulting in turbine unit shutdown (failure). Proposals are formulated and justified for adjusting maintenance and repair of turbine components and parts, condenser unit equipment, the regeneration subsystem, and the oil supply system, which make it possible to increase operational reliability, reduce the cost of STU maintenance and repair, and optimize the timing and scope of repairs.
Advanced online control mode selection for gas turbine aircraft engines
NASA Astrophysics Data System (ADS)
Wiseman, Matthew William
The modern gas turbine aircraft engine is a complex, highly nonlinear system that operates in a widely varying environment. Traditional engine control techniques based on the hydromechanical control concepts of early turbojet engines are unable to deliver the performance required from today's advanced engine designs. A new type of advanced control utilizing multiple control modes and an online mode selector is investigated, and various strategies for improving the baseline mode selection architecture are introduced. The ability to fine-tune actuator command outputs is added to the basic mode selection and blending process, and mode selection designs that are valid for the entire flight envelope are presented. Methods for optimizing the mode selector to improve overall engine performance are also discussed. Finally, using flight test data from a GE F110-powered F-16 aircraft, the full-envelope mode selector designs are validated and shown to provide significant performance benefits. Specifically, thrust command tracking is enhanced while critical engine limits are protected, with very little impact on engine efficiency.
Developing a bubble number-density paleoclimatic indicator for glacier ice
Spencer, M.K.; Alley, R.B.; Fitzpatrick, J.J.
2006-01-01
Past accumulation rate can be estimated from the measured number-density of bubbles in an ice core and the reconstructed paleotemperature, using a new technique. Density increase and grain growth in polar firn are both controlled by temperature and accumulation rate, and the integrated effects are recorded in the number-density of bubbles as the firn changes to ice. An empirical model of these processes, optimized to fit published data on recently formed bubbles, reconstructs accumulation rates using recent temperatures with an uncertainty of 41% (P < 0.05). For modern sites considered here, no statistically significant trend exists between mean annual temperature and the ratio of bubble number-density to grain number-density at the time of pore close-off; optimum modeled accumulation-rate estimates require an eventual ~2.02 ± 0.08 (P < 0.05) bubbles per close-off grain. Bubble number-density in the GRIP (Greenland) ice core is qualitatively consistent with independent estimates for a combined temperature decrease and accumulation-rate increase there during the last 5 kyr.
2-Step scalar deadzone quantization for bitplane image coding.
Auli-Llinas, Francesc
2013-12-01
Modern lossy image coding systems generate a quality progressive codestream that, truncated at increasing rates, produces an image with decreasing distortion. Quality progressivity is commonly provided by an embedded quantizer that employs uniform scalar deadzone quantization (USDQ) together with a bitplane coding strategy. This paper introduces a 2-step scalar deadzone quantization (2SDQ) scheme that achieves the same coding performance as USDQ while reducing the coding passes and the emitted symbols of the bitplane coding engine. This serves to reduce the computational costs of the codec and/or to code high dynamic range images. The main insights behind 2SDQ are the use of two quantization step sizes that approximate wavelet coefficients with more or less precision depending on their density, and a rate-distortion optimization technique that adjusts the distortion decreases produced when coding 2SDQ indexes. The integration of 2SDQ in current codecs is straightforward. The applicability and efficiency of 2SDQ are demonstrated within the framework of JPEG2000.
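As a rough, hypothetical illustration of the general idea of deadzone quantization with two step sizes (the step sizes, threshold, and index handling below are illustrative assumptions only, not the scheme specified in the paper or in JPEG2000):

    import numpy as np

    def usdq_quantize(x, step):
        # Uniform scalar deadzone quantization: index magnitude grows in units of
        # `step`, with a double-width deadzone around zero.
        return np.sign(x) * np.floor(np.abs(x) / step)

    def usdq_dequantize(q, step, delta=0.5):
        # Mid-interval reconstruction of the quantization indices.
        return np.sign(q) * (np.abs(q) + delta) * step * (q != 0)

    def two_step_quantize(x, fine_step, coarse_step, threshold):
        # Toy two-step variant: the dense population of low-magnitude coefficients
        # uses the fine step, the sparse large coefficients use the coarse step.
        small = np.abs(x) < threshold
        q = np.empty_like(x)
        q[small] = usdq_quantize(x[small], fine_step)
        q[~small] = usdq_quantize(x[~small], coarse_step)
        return q, small

    coeffs = np.random.default_rng(0).laplace(0.0, 2.0, 1000)   # wavelet-like distribution
    q, mask = two_step_quantize(coeffs, fine_step=0.5, coarse_step=2.0, threshold=4.0)
    print("coefficients in fine / coarse regime:", int(mask.sum()), int((~mask).sum()))

In an actual embedded coder the indices would then be emitted bitplane by bitplane; this sketch shows only the quantization stage.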
Cooperative photometric redshift estimation
NASA Astrophysics Data System (ADS)
Cavuoti, S.; Tortora, C.; Brescia, M.; Longo, G.; Radovich, M.; Napolitano, N. R.; Amaro, V.; Vellucci, C.
2017-06-01
In modern galaxy surveys photometric redshifts play a central role in a broad range of studies, from gravitational lensing and dark matter distribution to galaxy evolution. Using a dataset of ~ 25,000 galaxies from the second data release of the Kilo Degree Survey (KiDS) we obtain photometric redshifts with five different methods: (i) Random forest, (ii) Multi Layer Perceptron with Quasi Newton Algorithm, (iii) Multi Layer Perceptron with an optimization network based on the Levenberg-Marquardt learning rule, (iv) the Bayesian Photometric Redshift model (or BPZ) and (v) a classical SED template fitting procedure (Le Phare). We show how SED fitting techniques can provide useful information on the galaxy spectral type, which can be used to improve the capability of machine learning methods by constraining systematic errors and reducing the occurrence of catastrophic outliers. We use such classification to train specialized regression estimators, demonstrating that such a hybrid approach, involving SED fitting and machine learning in a single collaborative framework, is capable of improving the overall prediction accuracy of photometric redshifts.
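A minimal sketch of the hybrid idea described above, routing each galaxy to a regressor specialized for its SED-derived spectral type; the photometric bands, class labels, and synthetic data below are assumptions for illustration, not the KiDS pipeline:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Toy data standing in for survey photometry: magnitudes in a few bands, an
    # SED-fitting spectral-type label, and spectroscopic redshifts for training.
    rng = np.random.default_rng(0)
    n = 5000
    mags = rng.normal(20.0, 1.5, size=(n, 4))        # hypothetical u, g, r, i magnitudes
    sed_type = rng.integers(0, 3, size=n)             # hypothetical SED classes (e.g. E, Sp, SB)
    z_spec = np.abs(0.1 * mags[:, 2] - 1.5 + 0.05 * rng.normal(size=n))

    # "Specialized" regressors: one estimator trained per SED class.
    models = {}
    for cls in np.unique(sed_type):
        sel = sed_type == cls
        models[cls] = RandomForestRegressor(n_estimators=200, random_state=0).fit(
            mags[sel], z_spec[sel])

    def predict_photoz(mag_row, cls):
        # Route each galaxy to the regressor specialized for its SED class.
        return models[cls].predict(mag_row.reshape(1, -1))[0]

    print(predict_photoz(mags[0], sed_type[0]))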
Pharmacokinetic properties and in silico ADME modeling in drug discovery.
Honório, Kathia M; Moda, Tiago L; Andricopulo, Adriano D
2013-03-01
The discovery and development of a new drug are time-consuming, difficult and expensive. This complex process has evolved from classical methods into an integration of modern technologies and innovative strategies addressed to the design of new chemical entities to treat a variety of diseases. The development of new drug candidates is often limited by initial compounds lacking reasonable chemical and biological properties for further lead optimization. Huge libraries of compounds are frequently selected for biological screening using a variety of techniques and standard models to assess potency, affinity and selectivity. In this context, it is very important to study the pharmacokinetic profile of the compounds under investigation. Recent advances have been made in the collection of data and the development of models to assess and predict pharmacokinetic properties (ADME--absorption, distribution, metabolism and excretion) of bioactive compounds in the early stages of drug discovery projects. This paper provides a brief perspective on the evolution of in silico ADME tools, addressing challenges, limitations, and opportunities in medicinal chemistry.
Greenwood, Taylor J; Lopez-Costa, Rodrigo I; Rhoades, Patrick D; Ramírez-Giraldo, Juan C; Starr, Matthew; Street, Mandie; Duncan, James; McKinstry, Robert C
2015-01-01
The marked increase in radiation exposure from medical imaging, especially in children, has caused considerable alarm and spurred efforts to preserve the benefits but reduce the risks of imaging. Applying the principles of the Image Gently campaign, data-driven process and quality improvement techniques such as process mapping and flowcharting, cause-and-effect diagrams, Pareto analysis, statistical process control (control charts), failure mode and effects analysis, "lean" or Six Sigma methodology, and closed feedback loops led to a multiyear program that has reduced overall computed tomographic (CT) examination volume by more than fourfold and concurrently decreased radiation exposure per CT study without compromising diagnostic utility. This systematic approach involving education, streamlining access to magnetic resonance imaging and ultrasonography, auditing with comparison with benchmarks, applying modern CT technology, and revising CT protocols has led to a more than twofold reduction in CT radiation exposure between 2005 and 2012 for patients at the authors' institution while maintaining diagnostic utility. (©)RSNA, 2015.
Costa, Paulo R; Caldas, Linda V E
2002-01-01
This work presents the development and evaluation of modern techniques for calculating radiation protection barriers in clinical radiographic facilities. Our methodology uses realistic primary and scattered spectra. The primary spectra were computer simulated using a waveform generalization and a semiempirical model (the Tucker-Barnes-Chakraborty model). The scattered spectra were obtained from published data. An analytical function was used to produce attenuation curves from polychromatic radiation for specified kVp, waveform, and filtration. The results of this analytical function are given in ambient dose equivalent units. The attenuation curves were obtained by application of Archer's model to computer simulation data. The parameters for the best fit to the model using primary and secondary radiation data from different radiographic procedures were determined. They resulted in an optimized model for shielding calculation for any radiographic room. The shielding costs were about 50% lower than those calculated using the traditional method based on Report No. 49 of the National Council on Radiation Protection and Measurements.
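A small sketch of the kind of fit involved, assuming the commonly used three-parameter form of Archer's model, B(x) = [(1 + β/α)·exp(αγx) − β/α]^(−1/γ); the attenuation data below are synthetic stand-ins, not the simulated spectra of the paper:

    import numpy as np
    from scipy.optimize import curve_fit

    def archer_transmission(x, alpha, beta, gamma):
        # Archer's three-parameter model for broad-beam transmission B through a
        # barrier of thickness x (commonly used form).
        return ((1.0 + beta / alpha) * np.exp(alpha * gamma * x) - beta / alpha) ** (-1.0 / gamma)

    # Synthetic attenuation curve standing in for a simulated primary/scatter spectrum.
    thickness = np.linspace(0.0, 3.0, 30)                  # hypothetical barrier thickness (mm)
    true = archer_transmission(thickness, 2.5, 15.0, 0.6)
    measured = true * (1 + 0.02 * np.random.default_rng(1).normal(size=thickness.size))

    params, _ = curve_fit(archer_transmission, thickness, measured, p0=(2.0, 10.0, 0.5))
    print("fitted alpha, beta, gamma:", params)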
NASA Astrophysics Data System (ADS)
Veselov, F. V.; Erokhina, I. V.; Makarova, A. S.; Khorshev, A. A.
2017-03-01
The article deals with issues of technical and economic substantiation of priorities and scopes of modernizing the existing thermal power plants (TPPs) in Russia to work out long-term forecasts of the development of the industry. The current situation in the TPP modernization trends is analyzed. The updated initial figures of the capital and operation costs are presented and the obtained estimates of the comparative efficiency of various investment decisions on modernization and equipment replacement at gas-and-oil-burning and coal-fired TPPs with regard to the main zones of the national Unified Power System (UPS) of Russia are cited. The results of optimization of the generating capacity structure underlie a study of alternative TPP modernization strategies that differ in the scope of switching to new technologies, capital intensity, and energy efficiency (decrease in the average heat rate). To provide an integral economic assessment of the above strategies, the authors modified the traditional approach based on determination of the overall discounted costs of power supply (least-cost planning) supplemented with a comparison by the weighted average wholesale price of the electricity. A method for prediction of the wholesale price is proposed reasoning from the direct and dual solutions of the optimization problem. The method can be adapted to various combinations of the mechanisms of payment for the electricity and the capacity on the basis of marginal and average costs. Energy and economic analysis showed that the opposite effects of reduction in the capital investment and fuel saving change in a nonlinear way as the scope of the switch to more advanced power generation technologies at the TPPs increases. As a consequence, a strategy for modernization of the existing power plants rational with respect to total costs of the power supply and wholesale electricity prices has been formulated. The strategy combines decisions on upgrade and replacement of the equipment at the existing power plants of various types. The basic parameters of the strategy for the future until 2035 are provided.
USDA-ARS?s Scientific Manuscript database
Dietary fat is a macronutrient that has historically engendered considerable controversy and continues to do so. Contentious areas include optimal amount and type for cardiovascular disease risk reduction, and role in body weight regulation. Dietary fats and oils are unique in modern times in that ...
OPTIMIZATION OF MODERN DISPERSIVE RAMAN SPECTROMETERS FOR MOLECULAR SPECIATION OF ORGANICS IN WATER
Pesticides and industrial chemicals are typically complex organic molecules with multiple heteroatoms that can ionize in water. However, models for understanding the behavior of these chemicals in the environment typically assume that they exist exclusively as neutral species --...
DOT National Transportation Integrated Search
2014-04-01
Transportation and logistics companies increasingly : rely on modern technologies and in-vehicle tools : (also known as telematics systems) to optimize their : truck fleet operations. Telematics is technology that : combines telecommunications (i.e.,...
Optimization of life support systems and their systems reliability
NASA Technical Reports Server (NTRS)
Fan, L. T.; Hwang, C. L.; Erickson, L. E.
1971-01-01
The identification, analysis, and optimization of life support systems and subsystems have been investigated. For each system or subsystem that has been considered, the procedure involves the establishment of a set of system equations (or mathematical model) based on theory and experimental evidence; the analysis and simulation of the model; the optimization of the operation, control, and reliability; analysis of sensitivity of the system based on the model; and, if possible, experimental verification of the theoretical and computational results. Research activities include: (1) modeling of air flow in a confined space; (2) review of several different gas-liquid contactors utilizing centrifugal force; (3) review of carbon dioxide reduction contactors in space vehicles and other enclosed structures; (4) application of modern optimal control theory to environmental control of confined spaces; (5) optimal control of a class of nonlinear diffusional distributed parameter systems; (6) optimization of system reliability of life support systems and subsystems; (7) modeling, simulation and optimal control of the human thermal system; and (8) analysis and optimization of the water-vapor electrolysis cell.
Cicero, Raúl; Criales, José Luis; Cardoso, Manuel
2009-01-01
The impressive development of computed tomography (CT) techniques such as three-dimensional helical CT produces a spatial image of the bony thorax. At the beginning of the 16th century Leonardo da Vinci drew with great precision the thorax oseum. These drawings show an outstanding similarity with the images obtained by three-dimensional helical CT. The cumbersome task of the Renaissance genius is a prime example of the careful study of human anatomy. Modern imaging techniques require perfect anatomic knowledge of the human body in order to generate exact interpretations of images. Leonardo's example is alive for anybody devoted to modern imaging studies.
Diffraction scattering computed tomography: a window into the structures of complex nanomaterials
Birkbak, M. E.; Leemreize, H.; Frølich, S.; Stock, S. R.
2015-01-01
Modern functional nanomaterials and devices are increasingly composed of multiple phases arranged in three dimensions over several length scales. Therefore there is a pressing demand for improved methods for structural characterization of such complex materials. An excellent emerging technique that addresses this problem is diffraction/scattering computed tomography (DSCT). DSCT combines the merits of diffraction and/or small angle scattering with computed tomography to allow imaging the interior of materials based on the diffraction or small angle scattering signals. This allows, e.g., one to distinguish the distributions of polymorphs in complex mixtures. Here we review this technique and give examples of how it can shed light on modern nanoscale materials. PMID:26505175
NASA Astrophysics Data System (ADS)
Wang, Zhen-yu; Yu, Jian-cheng; Zhang, Ai-qun; Wang, Ya-xing; Zhao, Wen-tao
2017-12-01
Combining high precision numerical analysis methods with optimization algorithms for systematic exploration of a design space has become an important topic in modern design methods. During the design process of an underwater glider's flying-wing structure, a surrogate model is introduced to decrease the computation time of a high precision analysis. By these means, the contradiction between precision and efficiency is resolved effectively. Based on parametric geometry modeling, mesh generation and computational fluid dynamics analysis, a surrogate model is constructed by adopting design of experiment (DOE) theory to solve the multi-objective design optimization problem of the underwater glider. The procedure of surrogate model construction is presented, and the Gaussian kernel function is specifically discussed. The Particle Swarm Optimization (PSO) algorithm is applied to the hydrodynamic design optimization. The hydrodynamic performance of the underwater glider with the optimized flying-wing structure increases by 9.1%.
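The surrogate idea can be sketched generically as follows: fit a Gaussian-kernel regressor to a design-of-experiments sample of an expensive objective and let the optimizer query the cheap surrogate instead. The objective function and design variables below are purely illustrative assumptions, not the glider's CFD model:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Stand-in for an expensive CFD objective (e.g. lift-to-drag of a wing shape);
    # the two "design variables" and this analytic function are hypothetical.
    def expensive_objective(x):
        return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + 0.1 * x[:, 0]

    rng = np.random.default_rng(0)
    doe = rng.uniform(0, 1, size=(40, 2))       # design-of-experiments sample points
    y = expensive_objective(doe)

    surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
    surrogate.fit(doe, y)

    # The cheap surrogate can now be queried thousands of times by an optimizer (e.g. PSO).
    candidates = rng.uniform(0, 1, size=(10000, 2))
    best = candidates[np.argmax(surrogate.predict(candidates))]
    print("surrogate optimum near:", best)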
Development of seismic tomography software for hybrid supercomputers
NASA Astrophysics Data System (ADS)
Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton
2015-04-01
Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.
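The linearized inversion step described above can be sketched as a damped least-squares solve; the sparse matrix and travel-time residuals below are synthetic placeholders, not a real velocity model:

    import numpy as np
    from scipy.sparse import random as sparse_random
    from scipy.sparse.linalg import lsqr

    # Hypothetical tomographic matrix: each row is one ray, each column one grid cell,
    # entries are ray path lengths through that cell (here just a random sparse stand-in).
    rng = np.random.default_rng(0)
    n_rays, n_cells = 2000, 500
    G = sparse_random(n_rays, n_cells, density=0.02, random_state=0)

    true_ds = rng.normal(0, 0.05, n_cells)                        # true slowness perturbation
    residuals = G @ true_ds + 0.01 * rng.normal(size=n_rays)      # travel-time residuals

    # Tikhonov-style regularization is provided here by LSQR's `damp` parameter.
    ds_est = lsqr(G, residuals, damp=0.1)[0]
    print("model update norm:", np.linalg.norm(ds_est))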
Ultra-small dye-doped silica nanoparticles via modified sol-gel technique.
Riccò, R; Nizzero, S; Penna, E; Meneghello, A; Cretaio, E; Enrichi, F
2018-01-01
In modern biosensing and imaging, fluorescence-based methods constitute the most diffused approach to achieve optimal detection of analytes, both in solution and at the single-particle level. Despite the huge progress made in recent decades in the development of plasmonic biosensors and label-free sensing techniques, fluorescent molecules remain the most commonly used contrast agents to date for commercial imaging and detection methods. However, they exhibit low stability, can be difficult to functionalise, and often result in a low signal-to-noise ratio. Thus, embedding fluorescent probes into robust and bio-compatible materials, such as silica nanoparticles, can substantially enhance the detection limit and dramatically increase the sensitivity. In this work, ultra-small fluorescent silica nanoparticles (NPs) for optical biosensing applications were doped with a fluorescent dye, using simple water-based sol-gel approaches based on the classical Stöber procedure. By systematically modulating reaction parameters, controllable size tuning of particle diameters as low as 10 nm was achieved. Particle morphology and optical response were evaluated, showing possible single-molecule behaviour, without employing microemulsion methods to achieve similar results. Graphical abstract: We report a simple, cheap, reliable protocol for the synthesis and systematic tuning of ultra-small (< 10 nm) dye-doped luminescent silica nanoparticles.
On the Impact of Execution Models: A Case Study in Computational Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram
2015-05-25
Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has comparable performance to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Using this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe that these lessons will be beneficial to researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.
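As a toy illustration of why work stealing can outperform a static partition on irregular workloads (this is a generic simulation sketch, not the paper's runtime or load-balancing implementation):

    import random
    from collections import deque

    def simulate(task_costs, n_workers=4, steal=True):
        # Each worker owns a deque of tasks; idle workers optionally steal from the
        # tail of another worker's deque. Returns the simulated makespan.
        queues = [deque() for _ in range(n_workers)]
        for i, cost in enumerate(task_costs):        # simple static partition of the tasks
            queues[i % n_workers].append(cost)
        busy_until = [0.0] * n_workers
        now = 0.0
        while any(queues):
            for w in range(n_workers):
                if busy_until[w] <= now:
                    if queues[w]:
                        busy_until[w] = now + queues[w].popleft()
                    elif steal:                      # idle worker steals from a victim's tail
                        victims = [v for v in range(n_workers) if len(queues[v]) > 1]
                        if victims:
                            queues[w].append(queues[random.choice(victims)].pop())
            now += 1.0
        return max(busy_until)

    random.seed(1)
    tasks = [random.expovariate(1 / 20) for _ in range(200)]   # irregular task costs
    print("static makespan:", round(simulate(tasks, steal=False), 1),
          " work-stealing makespan:", round(simulate(tasks, steal=True), 1))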
Parallel and Portable Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Lee, S. R.; Cummings, J. C.; Nolen, S. D.; Keen, N. D.
1997-08-01
We have developed a multi-group, Monte Carlo neutron transport code in C++ using object-oriented methods and the Parallel Object-Oriented Methods and Applications (POOMA) class library. This transport code, called MC++, currently computes k and α eigenvalues of the neutron transport equation on a rectilinear computational mesh. It is portable to and runs in parallel on a wide variety of platforms, including MPPs, clustered SMPs, and individual workstations. It contains appropriate classes and abstractions for particle transport and, through the use of POOMA, for portable parallelism. Current capabilities are discussed, along with physics and performance results for several test problems on a variety of hardware, including all three Accelerated Strategic Computing Initiative (ASCI) platforms. Current parallel performance indicates the ability to compute α-eigenvalues in seconds or minutes rather than days or weeks. Current and future work on the implementation of a general transport physics framework (TPF) is also described. This TPF employs modern C++ programming techniques to provide simplified user interfaces, generic STL-style programming, and compile-time performance optimization. Physics capabilities of the TPF will be extended to include continuous energy treatments, implicit Monte Carlo algorithms, and a variety of convergence acceleration techniques such as importance combing.
Low-dose CT in clinical diagnostics.
Fuentes-Orrego, Jorge M; Sahani, Dushyant V
2013-09-01
Computed tomography (CT) has become key for patient management due to its outstanding capabilities for detecting disease processes and assessing treatment response, which has led to expansion in CT imaging for diagnostic and image-guided therapeutic interventions. Despite these benefits, the growing use of CT has raised concerns about the radiation risks associated with radiation exposure. The purpose of this article is to familiarize the reader with fundamental concepts of dose metrics for assessing radiation exposure and weighing radiation-associated risks. The article also discusses general approaches for reducing radiation dose while preserving diagnostic quality. The authors provide additional insight for undertaking protocol optimization, customizing scanning techniques based on the patients' clinical scenario and demographics. Supplemental strategies are postulated using more advanced post-processing techniques for achieving further dose improvements. The technologic offerings of CT are integral to modern medicine and its role will continue to evolve. Although the estimated risks from the low levels of radiation of a single CT exam are uncertain, it is prudent to minimize the dose from CT by applying common sense solutions and using other simple strategies as well as exploiting technologic innovations. These efforts will enable us to take advantage of all the clinical benefits of CT while minimizing the likelihood of harm to patients.
Tsunami risk zoning in south-central Chile
NASA Astrophysics Data System (ADS)
Lagos, M.
2010-12-01
The recent 2010 Chilean tsunami revealed the need to optimize methodologies for assessing the risk of disaster. In this context, modern techniques and criteria for the evaluation of the tsunami phenomenon were applied in the coastal zone of south-central Chile as a specific methodology for the zoning of tsunami risk. This methodology allows the identification and validation of a tsunami hazard scenario; the spatialization of factors that have an impact on the risk; and the zoning of the tsunami risk. For the hazard evaluation, different scenarios were modeled by means of numerical simulation techniques, selecting and validating the results that best fit the observed tsunami data. Hydrodynamic parameters of the inundation as well as physical and socioeconomic vulnerability aspects were considered for the spatialization of the factors that affect the tsunami risk. The tsunami risk zoning was integrated into a Geographic Information System (GIS) by means of multicriteria evaluation (MCE). The results of the tsunami risk zoning show that the local characteristics and their location, together with the concentration of poverty levels, establish spatially differentiated risk levels. This information builds the basis for future applied studies in land use planning that tend to minimize the risk levels associated with the tsunami hazard. This research is supported by Fondecyt 11090210.
Quantitative high dynamic range beam profiling for fluorescence microscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, T. J., E-mail: t.j.mitchell@dur.ac.uk; Saunter, C. D.; O’Nions, W.
2014-10-15
Modern developmental biology relies on optically sectioning fluorescence microscope techniques to produce non-destructive in vivo images of developing specimens at high resolution in three dimensions. As optimal performance of these techniques is reliant on the three-dimensional (3D) intensity profile of the illumination employed, the ability to directly record and analyze these profiles is of great use to the fluorescence microscopist or instrument builder. Though excitation beam profiles can be measured indirectly using a sample of fluorescent beads and recording the emission along the microscope detection path, we demonstrate an alternative approach where a miniature camera sensor is used directly within the illumination beam. Measurements taken using our approach are solely concerned with the illumination optics as the detection optics are not involved. We present a miniature beam profiling device and high dynamic range flux reconstruction algorithm that together are capable of accurately reproducing quantitative 3D flux maps over a large focal volume. Performance of this beam profiling system is verified within an optical test bench and demonstrated for fluorescence microscopy by profiling the low NA illumination beam of a single plane illumination microscope. The generality and success of this approach showcases a widely flexible beam amplitude diagnostic tool for use within the life sciences.
Statistics and Informatics in Space Astrophysics
NASA Astrophysics Data System (ADS)
Feigelson, E.
2017-12-01
The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, discovery of exoplanets, and classifying multiwavelength surveys is too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has significantly grown in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet the research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.
NASA Astrophysics Data System (ADS)
Altıparmak, Hamit; Al Shahadat, Mohamad; Kiani, Ehsan; Dimililer, Kamil
2018-04-01
Robotic agriculture requires smart and practical techniques to substitute machine intelligence for human intelligence. Strawberry is one of the important Mediterranean products, and enhancing its productivity requires modern, machine-based methods. Whereas a human identifies disease-infected leaves by eye, the machine should also be capable of vision-based disease identification. The objective of this paper is to practically verify the applicability of a new computer-vision method for discriminating between healthy and disease-infected strawberry leaves that requires neither a neural network nor time-consuming training. The proposed method was tested under outdoor lighting conditions using a regular DSLR camera without any particular lens. Since a human brain approximates the type and degree of disease infection, a fuzzy decision maker classifies the leaves from images captured on-site, mimicking the properties of human vision. Optimizing the fuzzy parameters for a typical strawberry production area at summer mid-day in Cyprus produced 96% accuracy for segmented iron deficiency and 93% accuracy for the other segmented case, using a typical human instant classification approximation as the benchmark, a higher accuracy than a human eye identifier. The fuzzy-based classifier provides an approximate result for deciding whether the leaf is healthy or not.
Advances and Challenges in Treatment of Locally Advanced Rectal Cancer
Smith, J. Joshua; Garcia-Aguilar, Julio
2015-01-01
Dramatic improvements in the outcomes of patients with rectal cancer have occurred over the past 30 years. Advances in surgical pathology, refinements in surgical techniques and instrumentation, new imaging modalities, and the widespread use of neoadjuvant therapy have all contributed to these improvements. Several questions emerge as we learn of the benefits or lack thereof for components of the current multimodality treatment in subgroups of patients with nonmetastatic locally advanced rectal cancer (LARC). What is the optimal surgical technique for distal rectal cancers? Do all patients need postoperative chemotherapy? Do all patients need radiation? Do all patients need surgery, or is a nonoperative, organ-preserving approach warranted in selected patients? Answering these questions will lead to more precise treatment regimens, based on patient and tumor characteristics, that will improve outcomes while preserving quality of life. However, the idea of shifting the treatment paradigm (chemoradiotherapy, total mesorectal excision, and adjuvant therapy) currently applied to all patients with LARC to a more individually tailored approach is controversial. The paradigm shift toward organ preservation in highly selected patients whose tumors demonstrate clinical complete response to neoadjuvant treatment is also controversial. Herein, we highlight many of the advances and resultant controversies that are likely to dominate the research agenda for LARC in the modern era. PMID:25918296
Digital pre-compensation techniques enabling high-capacity bandwidth variable transponders
NASA Astrophysics Data System (ADS)
Napoli, Antonio; Berenguer, Pablo Wilke; Rahman, Talha; Khanna, Ginni; Mezghanni, Mahdi M.; Gardian, Lennart; Riccardi, Emilio; Piat, Anna Chiadò; Calabrò, Stefano; Dris, Stefanos; Richter, André; Fischer, Johannes Karl; Sommerkorn-Krombholz, Bernd; Spinnler, Bernhard
2018-02-01
Digital pre-compensation techniques are among the enablers for cost-efficient high-capacity transponders. In this paper we describe various methods to mitigate the impairments introduced by state-of-the-art components within modern optical transceivers. Numerical and experimental results validate their performance and benefits.
Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment
ERIC Educational Resources Information Center
Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James
2010-01-01
The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…
Three Contributions of a Spiritual Perspective to Counseling, Psychotherapy, and Behavior Change.
ERIC Educational Resources Information Center
Bergin, Allen E.
1988-01-01
Describes ways in which a spiritual approach can contribute to the modern applied science of behavior change. Divides approach into three areas: conception of human nature, moral frame of reference, and set of techniques. Discusses and demonstrates the transitional person technique. (Author/BH)
Reimbursement-Based Economics--What Is It and How Can We Use It to Inform Drug Policy Reform?
Coyle, Doug; Lee, Karen M; Mamdani, Muhammad; Sabarre, Kelley-Anne; Tingley, Kylie
2015-01-01
In Ontario, approximately $3.8 billion is spent annually on publicly funded drug programs. The annual growth in Ontario Public Drug Program (OPDP) expenditure has been limited to 1.2% over the course of 3 years. Concurrently, the Ontario Drug Policy Research Network (ODPRN) was appointed to conduct drug class review research relating to formulary modernization within the OPDP. Drug class reviews by ODPRN incorporate a novel methodological technique called reimbursement-based economics, which focuses on reimbursement strategies and may be particularly relevant for policy-makers. To describe the reimbursement-based economics approach. Reimbursement-based economics aims to identify the optimal reimbursement strategy for drug classes by incorporating a review of economic literature, comprehensive budget impact analyses, and consideration of cost-effectiveness. This 3-step approach is novel in its focus on the economic impact of alternate reimbursement strategies rather than individual therapies. The methods involved within the reimbursement-based approach are detailed. To facilitate the description, summary methods and findings from a recent application to formulary modernization with respect to the drug class tryptamine-based selective serotonin receptor agonists (triptans) used to treat migraine headaches are presented. The application of reimbursement-based economics in drug policy reforms allows policy-makers to consider the cost-effectiveness and budget impact of different reimbursement strategies allowing consideration of the trade-off between potential cost savings vs increased access to cost-effective treatments. © 2015 American Headache Society.
2017-11-01
ARL-TR-8225, NOV 2017, US Army Research Laboratory: Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.
Research on an augmented Lagrangian penalty function algorithm for nonlinear programming
NASA Technical Reports Server (NTRS)
Frair, L.
1978-01-01
The augmented Lagrangian (ALAG) Penalty Function Algorithm for optimizing nonlinear mathematical models is discussed. The mathematical models of interest are deterministic in nature and finite dimensional optimization is assumed. A detailed review of penalty function techniques in general and the ALAG technique in particular is presented. Numerical experiments are conducted utilizing a number of nonlinear optimization problems to identify an efficient ALAG Penalty Function Technique for computer implementation.
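A minimal sketch of a generic augmented Lagrangian iteration for an equality-constrained problem, minimizing L_A(x, λ, ρ) = f(x) + λᵀc(x) + (ρ/2)‖c(x)‖² in x and then updating λ; the example problem, penalty schedule, and inner solver below are illustrative assumptions, not the specific ALAG variant studied in the report:

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative problem: minimize f(x) = (x0 - 2)^2 + (x1 - 1)^2
    # subject to the equality constraint c(x) = x0 + x1 - 1 = 0.
    f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
    c = lambda x: np.array([x[0] + x[1] - 1.0])

    def augmented_lagrangian(f, c, x0, rho=1.0, iters=20):
        # Basic ALAG loop: minimize the augmented Lagrangian in x, then apply the
        # first-order multiplier update lambda <- lambda + rho * c(x), increasing rho.
        x, lam = np.asarray(x0, dtype=float), np.zeros(c(x0).size)
        for _ in range(iters):
            L = lambda x: f(x) + lam @ c(x) + 0.5 * rho * np.sum(c(x) ** 2)
            x = minimize(L, x, method="BFGS").x
            lam = lam + rho * c(x)
            rho *= 2.0
        return x, lam

    x_opt, lam_opt = augmented_lagrangian(f, c, x0=[0.0, 0.0])
    print("x* ~", x_opt, " multiplier ~", lam_opt)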
Optimizing Prevention of HIV and Unplanned Pregnancy in Discordant African Couples.
Wall, Kristin M; Kilembe, William; Vwalika, Bellington; Haddad, Lisa B; Khu, Naw Htee; Brill, Ilene; Onwubiko, Udodirim; Chomba, Elwyn; Tichacek, Amanda; Allen, Susan
2017-08-01
Dual method use, which combines condoms with a more effective modern contraceptive to optimize prevention of HIV and unplanned pregnancy, is underutilized in high-risk heterosexual couples. Heterosexual HIV-discordant Zambian couples were enrolled from couples' voluntary HIV counseling and testing services into an open cohort with 3-monthly follow-up (1994-2012). Relative to dual method use, defined as consistent condom use plus modern contraception, we examine predictors of (1) condom-only use (suboptimal pregnancy prevention) or (2) modern contraceptive use with inconsistent condom use (effective pregnancy prevention and suboptimal HIV prevention). Among 3,049 couples, dual method use occurred in 28% of intervals in M+F- and 23% in M-F+, p < 0.01; condom-only use in 56% in M+F- and 61% in M-F+, p < 0.01; and modern contraceptive use with inconsistent condom use in 16% regardless of serostatus. Predictors (p < 0.05) of condom-only use included the man being HIV+ (adjusted hazard ratio, aHR = 1.15); baseline oral contraceptive pill (aHR = 0.76), injectable (aHR = 0.48), or implant (aHR = 0.60) use; woman's age (aHR = 1.04 per 5 years) and lifetime number of sex partners (aHR = 1.01); postpartum periods (aHR = 1.25); and HIV stage of the index partner III/IV versus I (aHR = 1.10). Predictors (p < 0.05) of modern contraceptive use with inconsistent condom use included woman's age (aHR = 0.94 per 5 years) and HIV+ male circumcision (aHR = 1.51), while time-varying implant use was associated with more consistent condom use (aHR = 0.80). Three-quarters of follow-up intervals did not include dual method use. This highlights the need for counseling to reduce unintended pregnancy and HIV transmission and enable safer conception.
Optimization of the Magnetic Field Homogeneity Area for Solenoid Type Magnets
NASA Astrophysics Data System (ADS)
Perepelkin, Eugene; Polyakova, Rima; Tarelkin, Aleksandr; Kovalenko, Alexander; Sysoev, Pavel; Sadovnikova, Marianne; Yudin, Ivan
2018-02-01
Homogeneous magnetic fields are important requisites in modern physics research. In this paper we discuss the problem of magnetic field homogeneity area maximization for solenoid magnets. We discuss A-model and B-model, which are basic types of solenoid magnets used to provide a homogeneous field, and methods for their optimization. We propose C-model which can be used for the NICA project. We have also carried out a cross-check of the C-model with the parameters stated for the CLEO II detector.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sivak, David; Crooks, Gavin
A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
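In the notation commonly used in the thermodynamic-length literature (the symbols here are ours, stated only to make the linear-response statement concrete), the friction tensor and the excess (dissipated) work can be written as

    \zeta_{ij}(\boldsymbol{\lambda}) = \beta \int_0^{\infty} dt'\,
        \langle \delta X_i(t')\, \delta X_j(0) \rangle_{\boldsymbol{\lambda}},
    \qquad
    \langle W_{\mathrm{ex}} \rangle \approx \int_0^{\tau} dt\,
        \dot{\boldsymbol{\lambda}}^{\mathsf T}\, \zeta(\boldsymbol{\lambda})\, \dot{\boldsymbol{\lambda}},

where X_i is the force conjugate to control parameter λ_i and β = 1/k_B T. Under these linear-response assumptions, the thermodynamic length L = ∫_0^τ dt (λ̇ᵀζλ̇)^{1/2} bounds the dissipation via the Cauchy-Schwarz inequality, ⟨W_ex⟩ ≥ L²/τ, so optimal finite-time protocols follow geodesics of the metric ζ.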
Optimal service using Matlab - simulink controlled Queuing system at call centers
NASA Astrophysics Data System (ADS)
Balaji, N.; Siva, E. P.; Chandrasekaran, A. D.; Tamilazhagan, V.
2018-04-01
This paper presents graphical, integrated-model-based academic research on telephone call centres. It introduces an important feature: impatient customers and abandonments in the queueing system. The modern call centre, however, is a complex socio-technical system, and queueing theory has become a suitable tool in the telecom industry for providing better online services. Through these Matlab-Simulink multi-queue structured models, better solutions are obtained for complex situations at call centres, and service performance measures are analyzed at the optimal level through the Simulink queueing model.
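A minimal example of the kind of staffing calculation such queueing models support: the classical M/M/c Erlang-C waiting probability and mean wait. This simple sketch ignores the abandonment/impatience feature the paper adds, and the traffic numbers are illustrative:

    import math

    def erlang_c(arrival_rate, service_rate, servers):
        # Probability that an arriving call must wait in an M/M/c queue (Erlang C).
        a = arrival_rate / service_rate                 # offered load in Erlangs
        rho = a / servers                               # agent utilisation (must be < 1)
        summation = sum(a ** k / math.factorial(k) for k in range(servers))
        top = a ** servers / (math.factorial(servers) * (1 - rho))
        return top / (summation + top)

    def average_wait(arrival_rate, service_rate, servers):
        pw = erlang_c(arrival_rate, service_rate, servers)
        return pw / (servers * service_rate - arrival_rate)

    # Example: 120 calls/hour, mean handling time 5 min (service rate 12/hour), 12 agents.
    print("P(wait) =", round(erlang_c(120, 12, 12), 3),
          " mean wait (h) =", round(average_wait(120, 12, 12), 4))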
Control of vibrations of a moving beam
NASA Astrophysics Data System (ADS)
Banichuk, N. V.; Ivanova, S. Yu; Makeev, E. V.; Sinitsyn, A. V.
2018-04-01
The translational motion of a thermoelastic beam under transverse vibrations caused by initial perturbations is considered. It is assumed that a beam moving at a constant translational speed is described by a model of a thermoelastic panel supported at the edges of the considered span. The problem of optimal suppression of vibrations is formulated when applying active transverse influences to the panel. To solve the optimization problem, modern methods developed in the theory of control of systems with distributed parameters described by partial differential equations are used.
From middens to modern estuaries, oyster shells sequester source-specific nitrogen
NASA Astrophysics Data System (ADS)
Darrow, Elizabeth S.; Carmichael, Ruth H.; Andrus, C. Fred T.; Jackson, H. Edwin
2017-04-01
Oysters (Crassostrea virginica) were an important food resource for native peoples of the northern Gulf of Mexico, who deposited waste shells in middens. Nitrogen (N) stable isotopes (δ15N) in bivalve shells have been used as modern proxies for estuarine N sources because they approximate δ15N in suspended particulate matter. We tested the use of midden shell δ15N as a proxy for ancient estuarine N sources. We hypothesized that isotopic signatures in ancient shells from coastal Mississippi would differ from modern shells due to increased anthropogenic N sources, such as wastewater, through time. We decalcified shells using an acidification technique previously developed for modern bivalves, but modified to determine δ15N, δ13C, %N, and % organic C of these low-N, high-C specimens. The modified method resulted in the greatest percentage of usable data from midden shells. Our results showed that oyster shell δ15N did not significantly differ between ancient (500-2100 years old) and modern oysters from the same locations where the sites had undergone relatively little land-use change. δ15N values in modern shells, however, were positively correlated with water column nitrate concentrations associated with urbanization. When N content and total shell mass were combined, we estimated that middens sequestered 410-39,000 kg of relic N, buried at a rate of up to 5 kg N m-2 yr-1. This study provides a relatively simple technique to assess baseline conditions in ecosystems over long time scales by demonstrating that midden shells can be an indicator of pre-historic N source to estuaries and are a potentially significant but previously uncharacterized estuarine N sink.
Knight, Andrew; Watson, Katherine D.
2017-01-01
Simple Summary The identity of Jack the Ripper remains one of the greatest unsolved crime mysteries in history. Jack was notorious both for the brutality of his murders and also for his habit of stealing organs from his victims. His speed and skill in doing so, in conditions of poor light and haste, fueled theories he was a surgeon. However, re-examination of a mortuary sketch from one of his victims has revealed several key aspects that strongly suggest he had no professional surgical training. Instead, the technique used was more consistent with that of a slaughterhouse worker. There were many small-scale slaughterhouses in East London in the 1880s, within which conditions were harsh for animals and workers alike. The brutalizing effects of such work only add to concerns highlighted by modern research that those who commit violence on animals are more likely to target people. Modern slaughterhouses are more humane in some ways but more desensitizing in others, and sociological research has indicated that communities with slaughterhouses are more likely to experience the most violent of crimes. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. Abstract Hundreds of theories exist concerning the identity of “Jack the Ripper”. His propensity for anatomical dissection with a knife—and in particular the rapid location and removal of specific organs—led some to speculate that he must have been surgically trained. However, re-examination of a mortuary sketch of one of his victims has revealed several aspects of incisional technique highly inconsistent with professional surgical training. Related discrepancies are also apparent in the language used within the only letter from Jack considered to be probably authentic. The techniques he used to dispatch his victims and retrieve their organs were, however, highly consistent with techniques used within the slaughterhouses of the day. East London in the 1880s had a large number of small-scale slaughterhouses, within which conditions for both animals and workers were exceedingly harsh. Modern sociological research has highlighted the clear links between the infliction of violence on animals and that inflicted on humans, as well as increased risks of violent crimes in communities surrounding slaughterhouses. Conditions within modern slaughterhouses are more humane in some ways but more desensitising in others. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. PMID:28394281
A performance model for GPUs with caches
Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...
2014-06-24
To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, the Radeon HD 6970, the model estimates with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
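The flavour of a sampling-based linear model can be sketched as follows; the single work-group-count feature and the timing numbers are illustrative assumptions, not the feature set or measurements used in the paper:

    import numpy as np

    # Hypothetical timing samples: (number of work-groups launched, measured runtime in ms)
    samples = np.array([[64, 1.9], [128, 3.6], [256, 7.1], [512, 14.0]])

    # Fit runtime ~ slope * work_groups + intercept from the sampled small launches.
    slope, intercept = np.polyfit(samples[:, 0], samples[:, 1], deg=1)

    def predict_runtime_ms(work_groups):
        return slope * work_groups + intercept

    print("predicted runtime for full kernel (8192 work-groups): %.1f ms"
          % predict_runtime_ms(8192))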
Nguyen, Phong Thanh; Abbosh, Amin; Crozier, Stuart
2017-06-01
In this paper, a technique for noninvasive microwave hyperthermia treatment for breast cancer is presented. In the proposed technique, microwave hyperthermia of patient-specific breast models is implemented using a three-dimensional (3-D) antenna array based on differential beam-steering subarrays to locally raise the temperature of the tumor to therapeutic values while keeping healthy tissue at normal body temperature. This approach is realized by optimizing the excitations (phases and amplitudes) of the antenna elements using the global optimization method particle swarm optimization. The antennae excitation phases are optimized to maximize the power at the tumor, whereas the amplitudes are optimized to accomplish the required temperature at the tumor. During the optimization, the technique ensures that no hotspots exist in healthy tissue. To implement the technique, a combination of linked electromagnetic and thermal analyses using MATLAB and the full-wave electromagnetic simulator is conducted. The technique is tested at 4.2 GHz, which is a compromise between the required power penetration and focusing, in a realistic simulation environment, which is built using a 3-D antenna array of 4 × 6 unidirectional antenna elements. The presented results on very dense 3-D breast models, which have the realistic dielectric and thermal properties, validate the capability of the proposed technique in focusing power at the exact location and volume of tumor even in the challenging cases where tumors are embedded in glands. Moreover, the models indicate the capability of the technique in dealing with tumors at different on- and off-axis locations within the breast with high efficiency in using the microwave power.
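A toy sketch of the optimization step: particle swarm optimization over per-element phases of free-space point sources, rewarding constructive interference at a target point while penalizing a hotspot point. The array geometry, cost weighting, and point-source field model are illustrative assumptions and do not represent the paper's full electromagnetic-thermal simulation:

    import numpy as np

    rng = np.random.default_rng(0)
    c0, f = 3e8, 4.2e9
    k = 2 * np.pi * f / c0                                   # free-space wavenumber

    antennas = rng.uniform(-0.08, 0.08, size=(24, 3))        # 24 element positions (m), illustrative
    tumor = np.array([0.0, 0.0, 0.05])                        # target focal point
    healthy = np.array([0.03, 0.02, 0.02])                    # point where a hotspot must be avoided

    def field(point, phases):
        # Coherent sum of unit-amplitude point sources with the given phases.
        r = np.linalg.norm(antennas - point, axis=1)
        return np.abs(np.sum(np.exp(1j * (phases - k * r)) / r))

    def cost(phases):
        return -field(tumor, phases) + 0.5 * field(healthy, phases)

    # Minimal particle swarm optimization over the phase vector.
    n_particles, dims, iters = 40, antennas.shape[0], 200
    pos = rng.uniform(0, 2 * np.pi, (n_particles, dims))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dims))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]

    print("focal gain at tumor vs. healthy point:",
          field(tumor, gbest) / field(healthy, gbest))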
Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem.
Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing
2015-01-01
Vehicle Routing Problem (VRP) is one of the key issues in optimization of modern logistics system. In this paper, a modified VRP model with hard time window is established and a Hybrid Optimization Algorithm (HOA) based on Fractal Space Filling Curves (SFC) method and Genetic Algorithm (GA) is introduced. By incorporating the proposed algorithm, SFC method can find an initial and feasible solution very fast; GA is used to improve the initial solution. Thereafter, experimental software was developed and a large number of experimental computations from Solomon's benchmark have been studied. The experimental results demonstrate the feasibility and effectiveness of the HOA.
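A minimal sketch of the space-filling-curve step: map customers to a grid, order them by their Hilbert-curve index, and cut the ordering into capacity-feasible routes that a GA could then refine. The customer data, capacity, and greedy cutting rule are illustrative assumptions, not the paper's model:

    import random

    def hilbert_index(n, x, y):
        # Distance of grid cell (x, y) along the Hilbert curve on an n x n grid
        # (n a power of two); classic iterative bit-twiddling formulation.
        d = 0
        s = n // 2
        while s > 0:
            rx = 1 if (x & s) > 0 else 0
            ry = 1 if (y & s) > 0 else 0
            d += s * s * ((3 * rx) ^ ry)
            if ry == 0:                      # rotate the quadrant
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            s //= 2
        return d

    # Hypothetical customers: (x, y, demand) on a unit square, vehicle capacity 15.
    random.seed(0)
    customers = [(random.random(), random.random(), random.randint(1, 5)) for _ in range(30)]
    GRID, CAPACITY = 64, 15

    # 1) Order customers along the space-filling curve (fast initial tour).
    ordered = sorted(range(len(customers)),
                     key=lambda i: hilbert_index(GRID,
                                                 int(customers[i][0] * (GRID - 1)),
                                                 int(customers[i][1] * (GRID - 1))))

    # 2) Greedily cut the ordering into capacity-feasible routes.
    routes, load, current = [], 0, []
    for i in ordered:
        if load + customers[i][2] > CAPACITY:
            routes.append(current)
            current, load = [], 0
        current.append(i)
        load += customers[i][2]
    routes.append(current)
    print(len(routes), "initial routes:", routes)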
Functional rehabilitation in advanced intraoral cancer
Roodenburg, Jan L.
2017-01-01
Introduction: Modern treatment of advanced intraoral cancer involves multidisciplinary teams using complicated reconstructive techniques to provide improved survival with optimal rehabilitation. Mastication is an important part of this process, and it can be severely impaired by tumor ablation. Whether flap reconstruction is a determinant factor in dental rehabilitation is still debated. Patients and methods: Thirty-five patients with advanced intraoral cancer were reviewed to determine the dental rehabilitation achieved with different reconstructive techniques. The patients were treated with a multidisciplinary team approach. The patients' demographics, primary treatment, reconstructive surgery, dental rehabilitation, and functional outcome were recorded and analyzed. Results: Nine patients had stage III disease, and 26 patients had stage IV. Thirty-two patients (91.42%) received postoperative radiotherapy. Masticatory and dental functional rehabilitation of the patients was very poor. Only 15 patients (42.86%) could eat a normal diet, whereas 18 patients (51.42%) could manage only soft diets, and 2 patients (5.72%) could only be fed a liquid diet. Denture rehabilitation was even more disappointing and had a direct impact on masticatory rehabilitation. Only 10 patients (28.57%) could use dentures postoperatively, and 40% of patients (14 patients) could not use any denture at all. Of all reconstructive techniques, the free radial forearm flap provided the best functional outcome. Conclusions: Reconstruction of advanced intraoral cancer results in poor denture rehabilitation, especially when bulky flaps are used. If massive resections are necessary, free radial forearm flap reconstruction provides the best functional outcome. PMID:29177211
NASA Astrophysics Data System (ADS)
Neji, N.; Jridi, M.; Alfalou, A.; Masmoudi, N.
2016-02-01
The double random phase encryption (DRPE) method is a well-known all-optical architecture which has many advantages, especially in terms of encryption efficiency. However, the method presents some vulnerabilities against attacks and requires a large quantity of information to encode the complex output plane. In this paper, we present an innovative hybrid technique to enhance the performance of the DRPE method in terms of compression and encryption. An optimized simultaneous compression and encryption method is applied to the real and imaginary components of the DRPE output plane. The compression and encryption technique consists of an innovative randomized arithmetic coder (RAC) that compresses the DRPE output planes well and at the same time enhances the encryption. The RAC is obtained by an appropriate selection of some conditions in the binary arithmetic coding (BAC) process and by using a pseudo-random number to encrypt the corresponding outputs. The proposed technique is able to process video content and to be standard compliant with modern video coding standards such as H.264 and HEVC. Simulations demonstrate that the proposed crypto-compression system addresses the drawbacks of the DRPE method: the cryptographic properties of DRPE are enhanced while a compression rate of one-sixth can be achieved. FPGA implementation results show the high performance of the proposed method in terms of maximum operating frequency, hardware occupation, and dynamic power consumption.
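The Python sketch below illustrates the overall pipeline of compressing and encrypting the real and imaginary DRPE components separately. The paper's randomized arithmetic coder is not reproduced here; zlib compression followed by a keyed keystream XOR is used as an explicit stand-in, and the "DRPE output plane" is simply a random complex field generated for demonstration.

# Illustrative sketch of the joint compression-encryption pipeline applied to
# the real and imaginary parts of a DRPE output plane.  The paper's randomized
# arithmetic coder (RAC) is NOT reproduced here; zlib compression followed by a
# keyed keystream XOR is used as a plain stand-in, and the "DRPE output" is a
# random complex field generated only for demonstration.
import zlib
import numpy as np

rng = np.random.default_rng(42)
drpe_output = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))

def compress_and_encrypt(plane, seed):
    """Quantise a real-valued plane to 8 bits, compress it, XOR with a keystream."""
    lo, hi = plane.min(), plane.max()
    q = np.round(255 * (plane - lo) / (hi - lo)).astype(np.uint8)
    compressed = zlib.compress(q.tobytes(), 9)
    keystream = np.random.default_rng(seed).integers(0, 256, len(compressed), dtype=np.uint8)
    cipher = bytes((b ^ int(k)) for b, k in zip(compressed, keystream))
    return cipher, (lo, hi)

cipher_re, scale_re = compress_and_encrypt(drpe_output.real, seed=1234)
cipher_im, scale_im = compress_and_encrypt(drpe_output.imag, seed=5678)
print(len(cipher_re), len(cipher_im), "bytes after compression and encryption of the two planes")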
Networked localization of sniper shots using acoustics
NASA Astrophysics Data System (ADS)
Hengy, S.; Hamery, P.; De Mezzo, S.; Duffner, P.
2011-06-01
The presence of snipers in modern conflicts leads to high insecurity for soldiers. In order to improve soldier protection against this threat, the French-German Research Institute of Saint-Louis (ISL) initiated studies in the domain of acoustic localization of shots. Mobile antennas mounted on the soldier's helmet were initially used for real-time detection, classification and localization of sniper shots. This showed good performance in open-terrain scenarios, and also in urban scenarios if the array was in the shot corridor, meaning that the microphones first detect the direct wave and then the reflections of the Mach and muzzle waves. As soon as the acoustic arrays were not near the shot corridor (only reflections detected), this solution lost its efficiency and erroneous position estimates were produced. In order to estimate the position of the shooter in every kind of urban scenario, ISL started studying time-reversal techniques. Knowing the position of every reflective object in the environment (buildings, walls, ...), it should be possible to estimate the position of the shooter. First, a synthetic propagation algorithm was developed and validated for real-scale applications. It was then validated for small-scale models, allowing us to test our time-reversal-based algorithms in the laboratory. In this paper we discuss the challenges raised by sniper localization using time-reversal techniques, the main difficulties that can be encountered, and possible solutions to optimize the use of this technique.
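The Python sketch below illustrates the geometric principle that makes localization from reflections possible when the environment is known: a reflecting wall can be replaced by mirror-image microphones, and the shooter position is the point whose predicted direct and reflected arrival-time differences best match the measurements. The geometry, sound speed and noisy "measured" times are synthetic, and this is a simplified stand-in rather than ISL's time-reversal algorithm.

# Illustrative sketch of the geometric idea behind localisation in a known
# environment: a reflecting wall is modelled by mirror-image microphones, and
# the shooter position is the grid point whose predicted arrival-time pattern
# best matches the measured one.  All geometry and timings are synthetic.
import numpy as np

c = 343.0                                              # speed of sound, m/s
mics = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5]])  # helmet-mounted array (m)
wall_x = -5.0                                          # vertical wall behind the array
image_mics = mics.copy()
image_mics[:, 0] = 2 * wall_x - mics[:, 0]             # mirror the microphones across the wall

true_shooter = np.array([35.0, 12.0])

def arrival_times(src):
    direct = np.linalg.norm(mics - src, axis=1) / c
    reflected = np.linalg.norm(image_mics - src, axis=1) / c
    return np.concatenate([direct, reflected])

measured = arrival_times(true_shooter) + 1e-5 * np.random.default_rng(0).standard_normal(6)

# Grid search: keep the candidate whose relative arrival-time pattern (TDOA)
# best matches the measurement.
best, best_err = None, np.inf
for x in np.linspace(0, 60, 121):
    for y in np.linspace(-30, 30, 121):
        t = arrival_times(np.array([x, y]))
        err = np.sum(((t - t[0]) - (measured - measured[0])) ** 2)
        if err < best_err:
            best, best_err = (x, y), err
print("estimated shooter position:", best)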
Drummer, Olaf H
2010-01-01
Forensic toxicology has developed as a forensic science in recent years and is now widely used to assist in death investigations, in civil and criminal matters involving drug use, in drugs-of-abuse testing in correctional settings and custodial medicine, in road and workplace safety, in matters involving environmental pollution, as well as in sports doping. Drugs most commonly targeted include amphetamines, benzodiazepines, cannabis, cocaine and the opiates, but testing can extend to any other illicit substance, almost any over-the-counter or prescribed drug, and poisons available to the community. The discipline requires high-level skills in analytical techniques together with a solid knowledge of pharmacology and pharmacokinetics. Modern techniques rely heavily on immunoassay screening analyses and mass spectrometry (MS) for confirmatory analyses, using either high-performance liquid chromatography or gas chromatography as the separation technique. Tandem MS has become more and more popular compared to single-stage MS. It is essential that analytical systems are fully validated and fit for purpose, and that assay batches are monitored with quality controls. External proficiency programs monitor both the assay and the personnel performing the work. For a laboratory to perform optimally, it is vital that the circumstances and context of the case are known and that the laboratory understands the limitations of the analytical systems used, including drug stability. Drugs and poisons can change concentration postmortem due to poor or unequal quality of blood and other specimens, anaerobic metabolism and redistribution. The latter poses the greatest difficulty in the interpretation of postmortem results.
Modern radiosurgical and endovascular classification schemes for brain arteriovenous malformations.
Tayebi Meybodi, Ali; Lawton, Michael T
2018-05-04
Stereotactic radiosurgery (SRS) and endovascular techniques are commonly used for treating brain arteriovenous malformations (bAVMs). They are usually used as ancillary techniques to microsurgery but may also be used as solitary treatment options. Careful patient selection requires a clear estimate of the treatment efficacy and complication rates for the individual patient. As such, classification schemes are an essential part of the patient selection paradigm for each treatment modality. While the Spetzler-Martin grading system and its subsequent modifications are commonly used for microsurgical outcome prediction for bAVMs, the same system(s) may not be easily applicable to SRS and endovascular therapy. Several radiosurgery- and endovascular-based grading scales have been proposed for bAVMs; however, a comprehensive review of these systems, including a discussion of their relative advantages and disadvantages, has been missing. This paper is dedicated to modern classification schemes designed for SRS and endovascular techniques.
Ding, Hai-quan; Lu, Qi-peng
2012-01-01
"Digital agriculture" or "precision agriculture" is an important direction of modern agriculture technique. It is the combination of the modern information technique and traditional agriculture and becomes a hotspot field in international agriculture research in recent years. As a nondestructive, real-time, effective and exact analysis technique, near infrared spectroscopy, by which precision agriculture could be carried out, has vast prospect in agrology and gradually gained the recognition. The present paper intends to review the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should based on portable NIR spectrograph in order to acquire qualitative or quantitative information from real-time measuring in field. In addition, NIRS could be combined with space remote sensing to macroscopically control the way crop is growing and the nutrition crops need, to change the current state of our country's agriculture radically.
Advanced grazing-incidence techniques for modern soft-matter materials analysis
Hexemer, Alexander; Müller-Buschbaum, Peter
2015-01-01
The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed. PMID:25610632
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pharhizgar, K.D.; Lunce, S.E.
1994-12-31
Knowledge-based technology acquisition techniques and customer information profiles are known as assimilative integrated discovery systems (AIDS) in modern organizations. These systems have access, through processing, to both deep and broad domains of information in modern societies. Through these systems, organizations and individuals can predict future trend probabilities and events concerning their customers. AIDSs are new techniques that produce new information which informants can use without the help of the original knowledge sources, owing to the existence of highly sophisticated computerized networks. This paper analyzes the dangers and side effects of the misuse of information through illegal, unethical and immoral access to the database of such an integrated, assimilative information system. Cognitivistic mapping, pragmatistic informational design gathering, and holistic classifiable and distributive techniques are potentially abusive systems whose outputs can easily be misused by businesses when researching a firm's customers.
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1985-01-01
Pilot/vehicle analysis techniques for optimizing aircraft handling qualities are presented. The analysis approach considered is based on optimal-control frequency-domain techniques. These techniques stem from an optimal-control formulation of a Neal-Smith-like analysis of aircraft attitude dynamics, extended to analyze the flared landing task. Some modifications to the technique are suggested and discussed. An in-depth analysis of the effect of the experimental variables, such as the prefilter, is conducted to gain further insight into the flared landing task for this class of vehicle dynamics.
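As a compact illustration of the optimal-control machinery on which such pilot/vehicle analyses rest, the Python sketch below computes a linear-quadratic regulator for a hypothetical short-period pitch-attitude model. The state-space matrices and weights are invented, and an LQR is a simplified stand-in for the frequency-domain, Neal-Smith-like formulation discussed in the abstract, not the method itself.

# Illustrative sketch: an LQR gain computed for a made-up short-period
# pitch-attitude model, standing in for the optimal-control formulation used
# in pilot/vehicle analyses.  All numbers are assumptions.
import numpy as np
from scipy.linalg import solve_continuous_are

# State x = [angle of attack, pitch rate, pitch attitude]; input = elevator.
A = np.array([[-1.2,  1.0, 0.0],
              [-4.0, -1.5, 0.0],
              [ 0.0,  1.0, 0.0]])
B = np.array([[-0.1],
              [-6.0],
              [ 0.0]])
Q = np.diag([0.0, 0.1, 10.0])    # penalise attitude error most heavily
R = np.array([[1.0]])            # penalise control (pilot) effort

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)  # optimal feedback gain, u = -K x
print("LQR gains:", K.round(3))
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K).round(3))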
Options of system integrated environment modelling in the predicated dynamic cyberspace
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janková, Martina; Dvořák, Jiří
In this article, some selected options of the contemporary conception of cybernetic system models are briefly discussed in a correspondingly integrable environment with modern system-dynamics thinking, all within the cyberspace of possible projection of predicted system characteristics. The key to new capabilities of system-integration modelling in the considered cyberspace is mainly the ability to improve the environment and the system-integration options, with the aim of modern control in a hierarchically arranged dynamic cyberspace, e.g. in the currently desired electronic business with information. The aim of this article is to assess in general the trends in the use of modern modelling methods, considering cybernetics applications verified in practice, the modern concept of project management, and also the potential integration of artificial intelligence into the projecting and project management of integrable and intelligent models, e.g. with optimal structures and adaptable behaviour. The article results from the solution of a specific partial research task at the faculty; in particular, it stresses the points showing that the new economics will be based more and more on information and on the knowledge-system-defined cyberspace of modern management.
Modernization and new technologies: Coping with the information explosion
NASA Technical Reports Server (NTRS)
Blados, Walter R.; Cotter, Gladys A.
1993-01-01
Information has become a valuable and strategic resource in all societies and economies. Scientific and technical information is especially important in developing and maintaining a strong national science and technology base. The expanding use of information technology, the growth of interdisciplinary research, and an increase in international collaboration are changing the characteristics of information. This modernization effort applies new technology to current processes to provide near-term benefits to the user. At the same time, we are developing a long-term modernization strategy designed to transition the program to a multimedia, global 'library without walls'. Notwithstanding this modernization program, it is recognized that no one information center can hope to collect all the relevant data. We see information and information systems changing and becoming more international in scope. We are finding that many nations are expending resources on national systems which duplicate each other, while at the same time many useful sources of aerospace information are not being collected even as the range of such sources expands. This paper reviews the NASA modernization program and raises for consideration new possibilities for unifying the various aerospace database efforts into a cooperative international aerospace database initiative, one that can optimize the cost/benefit equation for all participants.
NASA Astrophysics Data System (ADS)
Saunders, R.; Samei, E.; Badea, C.; Yuan, H.; Ghaghada, K.; Qi, Y.; Hedlund, L. W.; Mukundan, S.
2008-03-01
Dual-energy contrast-enhanced breast tomosynthesis has been proposed as a technique to improve the detection of early-stage cancer in young, high-risk women. This study focused on optimizing this technique using computer simulations. The simulations used analytical calculations to optimize the signal-difference-to-noise ratio (SdNR) of the resulting images at constant dose. The optimization covered the radiographic technique, the distribution of dose between the two single-energy projection images, and the weighting factor for the dual-energy subtraction. Importantly, the SdNR included both anatomical and quantum noise sources, as dual-energy imaging reduces anatomical noise at the expense of increased quantum noise. Assuming a tungsten anode, the maximum SdNR at constant dose was achieved for a high-energy beam at 49 kVp with 92.5 μm copper filtration and a low-energy beam at 49 kVp with 95 μm tin filtration. These analytical calculations were followed by Monte Carlo simulations that included the effects of scattered radiation and detector properties. Finally, the feasibility of this technique was tested in a small-animal imaging experiment using a novel iodinated liposomal contrast agent. The results illustrate the utility of dual-energy imaging and establish the optimal acquisition parameters for this technique. This work was supported in part by grants from the Komen Foundation (PDF55806), the Cancer Research and Prevention Foundation, and the NIH (NCI R21 CA124584-01). CIVM is an NCRR/NCI National Resource under P41-05959/U24-CA092656.
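The Python sketch below illustrates the dual-energy weighted subtraction of (log-domain) low- and high-energy signals and the SdNR figure of merit that the study optimizes, using invented signal and noise levels. Only the subtraction weight is searched here, whereas the study also optimized kVp, filtration and dose allocation.

# Illustrative sketch of dual-energy weighted subtraction and the SdNR figure
# of merit.  The tissue/iodine signal values and noise levels are invented, and
# the search is over the subtraction weight only, not over beam parameters.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                          # pixels per simulated image

# Hypothetical mean log-signals (arbitrary units): background varies with
# glandularity, the lesion adds iodine contrast on top of it.
gland = rng.normal(0.0, 0.05, n)                     # anatomical background noise
lesion = 0.04                                        # iodine contrast in the lesion
low  = 1.0 * gland + lesion * 1.0 + rng.normal(0, 0.02, n)   # low-energy image
high = 0.6 * gland + lesion * 0.3 + rng.normal(0, 0.02, n)   # high-energy image
low_bg, high_bg = low - lesion * 1.0, high - lesion * 0.3    # same pixels, lesion removed

def sdnr(w):
    de, de_bg = high - w * low, high_bg - w * low_bg
    return abs(de.mean() - de_bg.mean()) / de_bg.std()

weights = np.linspace(0.0, 1.5, 151)
best_w = max(weights, key=sdnr)
print(f"optimal weighting factor ~ {best_w:.2f}, SdNR = {sdnr(best_w):.2f}")

The weight that cancels the anatomical background exactly is not necessarily optimal once quantum noise is included, which is why the abstract stresses that both noise sources enter the SdNR.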
Optimal time-domain technique for pulse width modulation in power electronics
NASA Astrophysics Data System (ADS)
Mayergoyz, I.; Tyagi, S.
2018-05-01
An optimal time-domain technique for pulse width modulation is presented. It is based on exact and explicit analytical solutions for inverter circuits, obtained for any sequence of rectangular input voltage pulses. Two optimality criteria are discussed and illustrated by numerical examples.
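A minimal Python sketch of the underlying idea follows: for an RL inverter load, the current response to any sequence of rectangular voltage pulses can be propagated in closed form from one switching instant to the next, with no numerical integration. The circuit values and switching pattern are arbitrary examples, not taken from the paper.

# Illustrative sketch of the exact time-domain propagation: within each
# constant-voltage interval the RL-load current follows a known exponential,
# so it can be advanced switching instant to switching instant in closed form.
# Circuit values and the pulse pattern are arbitrary examples.
import math

R, L, Vdc = 2.0, 5e-3, 200.0            # load resistance, inductance, DC-link voltage
tau = L / R

# A PWM half-period described as (voltage level, duration in seconds).
pulse_sequence = [(+Vdc, 40e-6), (0.0, 25e-6), (+Vdc, 60e-6), (0.0, 30e-6),
                  (-Vdc, 40e-6), (0.0, 25e-6)]

i, t = 0.0, 0.0
for v, dt in pulse_sequence:
    i_inf = v / R                                   # steady-state current for this level
    i = i_inf + (i - i_inf) * math.exp(-dt / tau)   # exact solution over the interval
    t += dt
    print(f"t = {t*1e6:7.1f} us  v = {v:6.1f} V  i = {i:7.2f} A")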