Sample records for rapid problem setup

  1. A hybrid computer program for rapidly solving flowing or static chemical kinetic problems involving many chemical species

    NASA Technical Reports Server (NTRS)

    Mclain, A. G.; Rao, C. S. R.

    1976-01-01

    A hybrid chemical kinetic computer program was assembled which provides a rapid solution to problems involving flowing or static, chemically reacting, gas mixtures. The computer program uses existing subroutines for problem setup, initialization, and preliminary calculations and incorporates a stiff ordinary differential equation solution technique. A number of check cases were recomputed with the hybrid program and the results were almost identical to those previously obtained. The computational time saving was demonstrated with a propane-oxygen-argon shock tube combustion problem involving 31 chemical species and 64 reactions. Information is presented to enable potential users to prepare an input data deck for the calculation of a problem.
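    As a hedged illustration of the stiff ordinary differential equation solution technique referred to above (this is not the NASA program itself), the following Python sketch integrates a toy A -> B -> C kinetics system using a BDF stiff solver; the species and rate constants are hypothetical.

    ```python
    # Sketch only: a small stiff kinetics system solved with a stiff (BDF) method,
    # analogous in spirit to the solver technique the record describes.
    from scipy.integrate import solve_ivp

    k1, k2 = 1.0, 1.0e6   # widely separated rate constants make the system stiff

    def rates(t, y):
        a, b, c = y                      # concentrations of A, B, C
        return [-k1 * a,                 # A -> B
                k1 * a - k2 * b,         # B produced from A, consumed by B -> C
                k2 * b]                  # C accumulates

    sol = solve_ivp(rates, (0.0, 5.0), [1.0, 0.0, 0.0],
                    method="BDF", rtol=1e-8, atol=1e-12)
    print(sol.y[:, -1])                  # composition at the final time
    ```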

  2. Development of Experimental Setup of Metal Rapid Prototyping Machine using Selective Laser Sintering Technique

    NASA Astrophysics Data System (ADS)

    Patil, S. N.; Mulay, A. V.; Ahuja, B. B.

    2018-04-01

    Unlike traditional manufacturing processes, additive manufacturing in the form of rapid prototyping allows designers to produce parts that were previously considered too complex to make economically. A shift is taking place from plastic prototypes to fully functional metallic parts built by direct deposition of metallic powders, since the produced parts can be used directly for their intended purpose. This work is directed towards the development of an experimental setup of a metal rapid prototyping machine using selective laser sintering, and it studies the various parameters that play an important role in metal rapid prototyping with the SLS technique. The machine structure is divided into three main categories: (1) Z-movement of the bed and table, (2) an X-Y movement arrangement for the laser, and (3) the feeder mechanism. Z-movement of the bed is controlled using a lead screw, bevel gear pair and stepper motor, which maintains the accuracy of the layer thickness. X-Y movements are controlled using timing belts and stepper motors for precise movement of the laser source. A feeder mechanism is then developed to control the uniformity of the metal powder layer thickness. Simultaneously, a study is carried out for material selection. Various types of metal powders can be used for metal RP: a single metal powder, a mixture of two metal powders, or a combination of metal and polymer powders. The conclusion favors a mixture of two metal powders to minimize problems such as the balling effect and porosity. The developed system can be validated by conducting experiments on manufactured parts to check their mechanical and metallurgical properties. After studying the results of these experiments, process parameters such as laser properties (power, speed, etc.) and material properties (grain size, structure, etc.) will be optimized. This work is mainly focused on the design and development of a cost-effective experimental setup for metal rapid prototyping using the SLS technique, which gives a feel for the metal rapid prototyping process and its important parameters.

  3. Single Machine Scheduling and Due Date Assignment with Past-Sequence-Dependent Setup Time and Position-Dependent Processing Time

    PubMed Central

    Zhao, Chuan-Li; Hsu, Hua-Feng

    2014-01-01

    This paper considers single machine scheduling and due date assignment with setup time. The setup time is proportional to the length of the already processed jobs; that is, the setup time is past-sequence-dependent (p-s-d). It is assumed that a job's processing time depends on its position in a sequence. The objective functions include total earliness, the weighted number of tardy jobs, and the cost of due date assignment. We analyze these problems with two different due date assignment methods. We first consider the model with job-dependent position effects. For each case, by converting the problem to a series of assignment problems, we proved that the problems can be solved in O(n^4) time. For the model with job-independent position effects, we proved that the problems can be solved in O(n^3) time by providing a dynamic programming algorithm. PMID:25258727
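    As a hedged aside (not the paper's exact cost expression), the assignment-problem device mentioned in this record can be illustrated as follows: when the cost of placing job j in sequence position r can be written as a matrix C[j, r], the sequencing subproblem becomes a linear assignment problem. The positional cost below is a hypothetical stand-in.

    ```python
    # Sketch: sequencing via a linear assignment problem over job-position costs.
    # The cost matrix combines a hypothetical positional effect r**a with the
    # classical positional weight (n - r + 1) on processing times.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    p = np.array([4.0, 2.0, 7.0, 5.0])        # base processing times
    n = len(p)
    a = -0.1                                   # hypothetical position-effect exponent
    positions = np.arange(1, n + 1)

    C = np.outer(p, positions ** a) * (n - positions + 1)   # C[j, r]

    rows, cols = linear_sum_assignment(C)      # optimal job -> position assignment
    order = [int(j) for _, j in sorted(zip(cols, rows))]
    print("job order:", order, "total cost:", round(C[rows, cols].sum(), 3))
    ```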

  4. Single machine scheduling and due date assignment with past-sequence-dependent setup time and position-dependent processing time.

    PubMed

    Zhao, Chuan-Li; Hsu, Chou-Jung; Hsu, Hua-Feng

    2014-01-01

    This paper considers single machine scheduling and due date assignment with setup time. The setup time is proportional to the length of the already processed jobs; that is, the setup time is past-sequence-dependent (p-s-d). It is assumed that a job's processing time depends on its position in a sequence. The objective functions include total earliness, the weighted number of tardy jobs, and the cost of due date assignment. We analyze these problems with two different due date assignment methods. We first consider the model with job-dependent position effects. For each case, by converting the problem to a series of assignment problems, we proved that the problems can be solved in O(n^4) time. For the model with job-independent position effects, we proved that the problems can be solved in O(n^3) time by providing a dynamic programming algorithm.

  5. Analysis of dispatching rules in a stochastic dynamic job shop manufacturing system with sequence-dependent setup times

    NASA Astrophysics Data System (ADS)

    Sharma, Pankaj; Jain, Ajai

    2014-12-01

    Stochastic dynamic job shop scheduling problems with sequence-dependent setup times are among the most difficult classes of scheduling problems. This paper assesses the performance of nine dispatching rules in such a shop with respect to makespan, mean flow time, maximum flow time, mean tardiness, maximum tardiness, number of tardy jobs, total setups and mean setup time. A discrete-event simulation model of a stochastic dynamic job shop manufacturing system is developed for the investigation, and nine dispatching rules identified from the literature are incorporated in it. The simulation experiments are conducted with a due date tightness factor of 3, a shop utilization of 90% and setup times smaller than processing times. Results indicate that the shortest setup time (SIMSET) rule provides the best performance for the mean flow time and number of tardy jobs measures. The job with similar setup and modified earliest due date (JMEDD) rule provides the best performance for the makespan, maximum flow time, mean tardiness, maximum tardiness, total setups and mean setup time measures.
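    For concreteness, a minimal sketch of how a shortest-setup-time (SIMSET) dispatching decision can be coded is given below; it is not the paper's simulation model, and the job fields and setup-time table are hypothetical.

    ```python
    # Sketch: SIMSET-style dispatching from a queue of waiting jobs.
    def pick_next_job(queue, setup, last_job_type):
        """Pick the waiting job with the smallest setup time from the last
        processed job type; ties are broken by earliest arrival."""
        return min(queue, key=lambda job: (setup[(last_job_type, job["type"])],
                                           job["arrival"]))

    setup = {("A", "A"): 0, ("A", "B"): 5, ("B", "A"): 4, ("B", "B"): 0}
    queue = [{"id": 1, "type": "B", "arrival": 3.0},
             {"id": 2, "type": "A", "arrival": 7.5}]
    print(pick_next_job(queue, setup, last_job_type="A"))   # -> job 2 (zero setup)
    ```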

  6. Implementation of an automated test setup for measuring electrical conductance of concrete.

    DOT National Transportation Integrated Search

    2007-01-01

    This project was designed to provide the Virginia Department of Transportation (VDOT) with an automated laboratory setup for performing the rapid chloride permeability test (RCPT) to measure the electrical conductance of concrete in accordance with a...

  7. Feasibility study of a take-home array-based functional electrical stimulation system with automated setup for current functional electrical stimulation users with foot-drop.

    PubMed

    Prenton, Sarah; Kenney, Laurence P; Stapleton, Claire; Cooper, Glen; Reeves, Mark L; Heller, Ben W; Sobuh, Mohammad; Barker, Anthony T; Healey, Jamie; Good, Timothy R; Thies, Sibylle B; Howard, David; Williamson, Tracey

    2014-10-01

    To investigate the feasibility of unsupervised community use of an array-based automated setup functional electrical stimulator for current foot-drop functional electrical stimulation (FES) users. Feasibility study. Gait laboratory and community use. Participants (N=7) with diagnosis of unilateral foot-drop of central neurologic origin (>6mo) who were regular users of a foot-drop FES system (>3mo). Array-based automated setup FES system for foot-drop (ShefStim). Logged usage, logged automated setup times for the array-based automated setup FES system and diary recording of problems experienced, all collected in the community environment. Walking speed, ankle angles at initial contact, foot clearance during swing, and the Quebec User Evaluation of Satisfaction with Assistive Technology version 2.0 (QUEST version 2.0) questionnaire, all collected in the gait laboratory. All participants were able to use the array-based automated setup FES system. Total setup time took longer than participants' own FES systems, and automated setup time was longer than in a previous study of a similar system. Some problems were experienced, but overall, participants were as satisfied with this system as their own FES system. The increase in walking speed (N=7) relative to no stimulation was comparable between both systems, and appropriate ankle angles at initial contact (N=7) and foot clearance during swing (n=5) were greater with the array-based automated setup FES system. This study demonstrates that an array-based automated setup FES system for foot-drop can be successfully used unsupervised. Despite setup's taking longer and some problems, users are satisfied with the system and it would appear as effective, if not better, at addressing the foot-drop impairment. Further product development of this unique system, followed by a larger-scale and longer-term study, is required before firm conclusions about its efficacy can be reached. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  8. Laser-assisted chemical vapor deposition setup for fast synthesis of graphene patterns

    NASA Astrophysics Data System (ADS)

    Zhang, Chentao; Zhang, Jianhuan; Lin, Kun; Huang, Yuanqing

    2017-05-01

    An automatic setup based on the laser-assisted chemical vapor deposition method has been developed for the rapid synthesis of graphene patterns. The key components of this setup include a laser beam control and focusing unit, a laser spot monitoring unit, and a vacuum and flow control unit. A laser beam with precisely controlled power is focused on the surface of a nickel foil substrate by the laser beam control and focusing unit for localized heating. A rapid heating and cooling process at the localized region is induced by the relative movement between the focused laser spot and the nickel foil substrate, which causes the decomposition of the gaseous hydrocarbon and the out-diffusion of excess carbon atoms to form graphene patterns along the laser scanning path. All the fabrication parameters that affect the quality and number of graphene layers, such as laser power, laser spot size, laser scanning speed, pressure of the vacuum chamber, and gas flow rates, can be precisely controlled and monitored during the preparation of graphene patterns. A simulation of the temperature distribution was carried out via the finite element method, providing scientific guidance for regulating the temperature distribution during experiments. A multi-layer graphene ribbon with few defects was synthesized to verify the setup's performance in the rapid growth of high-quality graphene patterns. Furthermore, this setup has potential applications in other laser-based graphene synthesis and processing.

  9. Pharmacist review and its impact on Singapore nursing homes

    PubMed Central

    Chia, Hui Shan; Ho, John Aik Hui; Lim, Bernadette Daolin

    2015-01-01

    INTRODUCTION There is a high prevalence of polypharmacy and inappropriate medication use in Singapore nursing homes. This study primarily explored the benefits of pharmacist reviews in local nursing homes. The secondary aims were to review the potential cost savings gained from following the pharmacists’ recommendations and to identify the possible risks associated with polypharmacy and inappropriate medication use. METHODS A retrospective period prevalence study was performed. We analysed the pharmacotherapy problems highlighted by pharmacists in three nursing homes and the rate of acceptance of pharmacists’ recommendations. Data was collected in two phases: (a) a one-month pre-setup period, during which 480 patients were reviewed (i.e. one-time review before weekly pharmacist visits); and (b) a six-month post-setup period, during which the 480 patients were reviewed again. Pharmacotherapy problems were classified according to a clinical pharmacist recommendation taxonomy and potential risks were identified. Monthly cost savings were calculated and compared with the monthly costs of pharmacist reviews. RESULTS A total of 392 pharmacotherapy problems were identified, with pharmacist recommendations noted for each problem. Among the 392 recommendations, 236 (60.2%) were accepted. The pharmacotherapy problems were analysed for potential risks, including falls (16.0%) and constipation (13.1%). The acceptance rates were higher during the post-setup period compared to the pre-setup period (p < 0.0001). Total direct acquisition cost savings during the pre- and post-setup periods were SGD 388.30 and SGD 876.69, respectively. CONCLUSION The provision of pharmaceutical care to nursing home residents resulted in improved medication safety and quality of care. PMID:26451051

  10. Grouping in decomposition method for multi-item capacitated lot-sizing problem with immediate lost sales and joint and item-dependent setup cost

    NASA Astrophysics Data System (ADS)

    Narenji, M.; Fatemi Ghomi, S. M. T.; Nooraie, S. V. R.

    2011-03-01

    This article examines a dynamic and discrete multi-item capacitated lot-sizing problem in a completely deterministic production or procurement environment with limited production/procurement capacity where lost sales (the loss of customer demand) are permitted. There is no inventory space capacity and the production activity incurs a fixed charge linear cost function. Similarly, the inventory holding cost and the cost of lost demand are both associated with a linear no-fixed charge function. For the sake of simplicity, a unit of each item is assumed to consume one unit of production/procurement capacity. We analyse a different version of setup costs incurred by a production or procurement activity in a given period of the planning horizon. In this version, called the joint and item-dependent setup cost, an additional item-dependent setup cost is incurred separately for each produced or ordered item on top of the joint setup cost.
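    A schematic objective for this cost structure (generic symbols, not the article's notation) can be written as follows, with a joint setup indicator y_t, item setup indicators z_it, production quantities x_it, inventories I_it and lost sales L_it:

    ```latex
    % Schematic formulation only; all symbols are generic placeholders.
    \min \sum_{t=1}^{T} \Big( F_t\, y_t + \sum_{i=1}^{N} \big( f_{it}\, z_{it}
          + c_{it}\, x_{it} + h_{it}\, I_{it} + \ell_{it}\, L_{it} \big) \Big)
    \quad \text{s.t.}\quad
    I_{it} = I_{i,t-1} + x_{it} - (d_{it} - L_{it}),\qquad
    \sum_{i=1}^{N} x_{it} \le C_t\, y_t,\qquad x_{it} \le C_t\, z_{it}.
    ```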

  11. An algorithm for a single machine scheduling problem with sequence dependent setup times and scheduling windows

    NASA Technical Reports Server (NTRS)

    Moore, J. E.

    1975-01-01

    An enumeration algorithm is presented for solving a scheduling problem similar to the single machine job shop problem with sequence dependent setup times. The scheduling problem differs from the job shop problem in two ways. First, its objective is to select an optimum subset of the available tasks to be performed during a fixed period of time. Secondly, each task scheduled is constrained to occur within its particular scheduling window. The algorithm is currently being used to develop typical observational timelines for a telescope that will be operated in earth orbit. Computational times associated with timeline development are presented.

  12. Time-domain finite elements in optimal control with application to launch-vehicle guidance. PhD. Thesis

    NASA Technical Reports Server (NTRS)

    Bless, Robert R.

    1991-01-01

    A time-domain finite element method is developed for optimal control problems. The theory derived is general enough to handle a large class of problems including optimal control problems that are continuous in the states and controls, problems with discontinuities in the states and/or system equations, problems with control inequality constraints, problems with state inequality constraints, or problems involving any combination of the above. The theory is developed in such a way that no numerical quadrature is necessary regardless of the degree of nonlinearity in the equations. Also, the same shape functions may be employed for every problem because all strong boundary conditions are transformed into natural or weak boundary conditions. In addition, the resulting nonlinear algebraic equations are very sparse. Use of sparse matrix solvers allows for the rapid and accurate solution of very difficult optimization problems. The formulation is applied to launch-vehicle trajectory optimization problems, and results show that real-time optimal guidance is realizable with this method. Finally, a general problem solving environment is created for solving a large class of optimal control problems. The algorithm uses both FORTRAN and a symbolic computation program to solve problems with a minimum of user interaction. The use of symbolic computation eliminates the need for user-written subroutines which greatly reduces the setup time for solving problems.

  13. Rapid detection of bacteriophages in starter culture using water-in-oil-in-water emulsion microdroplets.

    PubMed

    Wang, Min S; Nitin, Nitin

    2014-10-01

    Bacteriophage contamination of starter culture and raw material poses a major problem in the fermentation industry. In this study, a rapid detection of lytic phage contamination in starter culture using water-in-oil-in-water (W/O/W) emulsion microdroplets was described. Model bacteria with varying concentrations of lytic phages were encapsulated in W/O/W emulsion microdroplets using a simple needle-in-tube setup. The detection of lytic phage contamination was accomplished in 1 h using propidium iodide labeling of the phage-infected bacteria inside the W/O/W emulsion microdroplets. Using this approach, a detection limit of 10^2 PFU/mL of phages was achieved quantitatively, while 10^4 PFU/mL of phages could be detected qualitatively based on visual comparison of the fluorescence images. Given the simplicity and sensitivity of this approach, it is anticipated that this method can be adapted to any strains of bacteria and lytic phages that are commonly used for fermentation, and has potential for a rapid detection of lytic phage contamination in the fermentation industry.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peratt, A.L.; Mostrom, M.A.

    With the availability of 80--125 MHz microprocessors, the methodology developed for the simulation of problems in pulsed power and plasma physics on modern day supercomputers is now amenable to application on a wide range of platforms including laptops and workstations. While execution speeds with these processors do not match those of large scale computing machines, resources such as computer-aided-design (CAD) and graphical analysis codes are available to automate simulation setup and process data. This paper reports on the adaptation of IVORY, a three-dimensional, fully-electromagnetic, particle-in-cell simulation code, to this platform independent CAD environment. The primary purpose of this talk is to demonstrate how rapidly a pulsed power/plasma problem can be scoped out by an experimenter on a dedicated workstation. Demonstrations include a magnetically insulated transmission line, power flow in a graded insulator stack, a relativistic klystron oscillator, and the dynamics of a coaxial thruster for space applications.

  15. Screening and monitoring microbial xenobiotics' biodegradation by rapid, inexpensive and easy to perform microplate UV-absorbance measurements.

    PubMed

    Herzog, Bastian; Lemmer, Hilde; Horn, Harald; Müller, Elisabeth

    2014-02-22

    Evaluation of the biodegradation potential of xenobiotics, shown here for benzotriazoles (corrosion inhibitors) and sulfamethoxazole (a sulfonamide antibiotic), by microbial communities and/or pure cultures normally requires time-intensive and costly LC/GC methods that are, in the case of laboratory setups, not always needed. Given the usage of high concentrations to apply a high selective pressure on the microbial communities/pure cultures in laboratory setups, a simple UV-absorbance measurement (UV-AM) was developed and validated for screening a large number of setups, requiring almost no preparation and significantly less time and money compared to LC/GC methods. This rapid and easy-to-use method was evaluated by comparing its measured values to LC-UV and GC-MS/MS results. Furthermore, its application for monitoring and screening unknown activated sludge communities (ASC) and mixed pure cultures has been tested and shown to detect biodegradation of benzotriazole (BTri), 4- and 5-tolyltriazole (4-TTri, 5-TTri) as well as SMX. In laboratory setups, xenobiotic concentrations above 1.0 mg L^-1 could be detected, without any enrichment or preparation, after optimization of the method. As UV-AM does not require much preparatory work and can be conducted in 96 or even 384 well plate formats, the number of possible parallel setups and the screening efficiency were significantly increased while analytical and laboratory costs were reduced to a minimum.

  16. Screening and monitoring microbial xenobiotics’ biodegradation by rapid, inexpensive and easy to perform microplate UV-absorbance measurements

    PubMed Central

    2014-01-01

    Background Evaluation of the biodegradation potential of xenobiotics, shown here for benzotriazoles (corrosion inhibitors) and sulfamethoxazole (a sulfonamide antibiotic), by microbial communities and/or pure cultures normally requires time-intensive and costly LC/GC methods that are, in the case of laboratory setups, not always needed. Results Given the usage of high concentrations to apply a high selective pressure on the microbial communities/pure cultures in laboratory setups, a simple UV-absorbance measurement (UV-AM) was developed and validated for screening a large number of setups, requiring almost no preparation and significantly less time and money compared to LC/GC methods. This rapid and easy-to-use method was evaluated by comparing its measured values to LC-UV and GC-MS/MS results. Furthermore, its application for monitoring and screening unknown activated sludge communities (ASC) and mixed pure cultures has been tested and shown to detect biodegradation of benzotriazole (BTri), 4- and 5-tolyltriazole (4-TTri, 5-TTri) as well as SMX. Conclusions In laboratory setups, xenobiotic concentrations above 1.0 mg L^-1 could be detected, without any enrichment or preparation, after optimization of the method. As UV-AM does not require much preparatory work and can be conducted in 96 or even 384 well plate formats, the number of possible parallel setups and the screening efficiency were significantly increased while analytical and laboratory costs were reduced to a minimum. PMID:24558966
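    As a hedged illustration of the kind of calculation such microplate screening reduces to (not the authors' protocol), relative absorbance loss against an abiotic control can flag likely degraders; the wells, values and cutoff below are hypothetical.

    ```python
    # Sketch: percent removal from microplate UV-absorbance readings over time.
    import numpy as np

    a_control = np.array([0.82, 0.81, 0.82, 0.80])   # abiotic control wells
    a_sample  = np.array([0.80, 0.66, 0.41, 0.22])   # inoculated wells

    removal_pct = 100.0 * (1.0 - a_sample / a_control)
    degrader = removal_pct[-1] > 20.0                 # hypothetical screening cutoff
    print(removal_pct.round(1), "flag as degrader:", bool(degrader))
    ```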

  17. A novel hybrid genetic algorithm to solve the make-to-order sequence-dependent flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Mirabi, Mohammad; Fatemi Ghomi, S. M. T.; Jolai, F.

    2014-04-01

    Flow-shop scheduling problem (FSP) deals with the scheduling of a set of n jobs that visit a set of m machines in the same order. As the FSP is NP-hard, there is no efficient algorithm to reach the optimal solution of the problem. To minimize the holding, delay and setup costs of large permutation flow-shop scheduling problems with sequence-dependent setup times on each machine, this paper develops a novel hybrid genetic algorithm (HGA) with three genetic operators. The proposed HGA applies a modified approach to generate a pool of initial solutions, and uses an improved heuristic, called the iterated swap procedure, to improve them. We consider a make-to-order production approach in which some job sequences are treated as tabu based on a maximum allowable setup cost. The results are compared with some recently developed heuristics, and the computational experiments show that the proposed HGA performs very competitively with respect to the accuracy and efficiency of its solutions.
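    A generic pairwise-swap improvement pass, in the spirit of the iterated swap procedure mentioned above (the cost model and data are illustrative, not the paper's), might look like this:

    ```python
    # Sketch: swap-based local improvement of a sequence under
    # sequence-dependent setups (single-resource makespan used as the cost).
    import itertools

    def cost(seq, proc, setup):
        total, prev = 0.0, None
        for j in seq:
            total += (setup[prev][j] if prev is not None else 0.0) + proc[j]
            prev = j
        return total

    def iterated_swap(seq, proc, setup):
        best, improved = list(seq), True
        while improved:
            improved = False
            for i, k in itertools.combinations(range(len(best)), 2):
                cand = best[:]
                cand[i], cand[k] = cand[k], cand[i]
                if cost(cand, proc, setup) < cost(best, proc, setup):
                    best, improved = cand, True
        return best

    proc = [3.0, 2.0, 4.0]
    setup = [[0, 5, 1], [2, 0, 6], [1, 3, 0]]
    print(iterated_swap([0, 1, 2], proc, setup))
    ```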

  18. Detection of the presence of Chlamydia trachomatis bacteria using diffusing wave spectroscopy with a small number of scatterers

    NASA Astrophysics Data System (ADS)

    Ulyanov, Sergey; Ulianova, Onega; Filonova, Nadezhda; Moiseeva, Yulia; Zaitsev, Sergey; Saltykov, Yury; Polyanina, Tatiana; Lyapina, Anna; Kalduzova, Irina; Larionova, Olga; Utz, Sergey; Feodorova, Valentina

    2018-04-01

    The theory of diffusing wave spectroscopy is adapted here for the first time to the problem of rapid detection of Chlamydia trachomatis bacteria in blood samples of Chlamydia patients. A formula for the correlation function of temporal fluctuations of speckle intensity is derived for the case of a small number of scattering events, and the dependence of the spectral bandwidth on the average number of scatterers is analyzed. A setup for detecting the presence of C. trachomatis cells in aqueous suspension is designed, and good agreement between theoretical results and experimental data is shown. The possibility of detecting the presence of C. trachomatis cells in the probing volume using diffusing wave spectroscopy with a small number of scatterers is successfully demonstrated for the first time.

  19. Internal Temperature Control For Vibration Testers

    NASA Technical Reports Server (NTRS)

    Dean, Richard J.

    1996-01-01

    Vibration test fixtures with internal thermal-transfer capabilities developed. Made of aluminum for rapid thermal transfer. Small size gives rapid response to changing temperatures, with better thermal control. Setup quicker and internal ducting facilitates access to parts being tested. In addition, internal flows smaller, so less energy consumed in maintaining desired temperature settings.

  20. Optimization-based manufacturing scheduling with multiple resources and setup requirements

    NASA Astrophysics Data System (ADS)

    Chen, Dong; Luh, Peter B.; Thakur, Lakshman S.; Moreno, Jack, Jr.

    1998-10-01

    The increasing demand for on-time delivery and low prices forces manufacturers to seek effective schedules that improve the coordination of multiple resources and reduce internal product costs associated with labor, setup and inventory. This study describes the design and implementation of a scheduling system for J. M. Product Inc., whose manufacturing is characterized by the need to consider machines and operators simultaneously, where an operator may attend several operations at the same time, and by the presence of machines requiring significant setup times. Scheduling problems with these characteristics are typical for many manufacturers, very difficult to handle, and have not been adequately addressed in the literature. In this study, both machines and operators are modeled as resources with finite capacities to obtain efficient coordination between them, and an operator's time can be shared by several operations at the same time to make full use of the operator. Setups are explicitly modeled following our previous work, with additional penalties on excessive setups to reduce setup costs and avoid possible scrap. An integer formulation with a separable structure is developed to maximize on-time delivery of products, low inventory and a small number of setups. Within the Lagrangian relaxation framework, the problem is decomposed into individual subproblems that are effectively solved using dynamic programming with the additional penalties embedded in the state transitions. A heuristic is then developed, building on our previous work, to obtain a feasible schedule with a new mechanism to satisfy operator capacity constraints. The method has been implemented in C++ with a user-friendly interface, and numerical testing shows that it generates high-quality schedules in a timely fashion. Through the simultaneous consideration of machines and operators, the two resources are well coordinated to facilitate the smooth flow of parts through the system, and the explicit modeling of setups and the associated penalties clusters parts with the same setup requirements together to avoid excessive setups.
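    The Lagrangian relaxation idea described above can be sketched with a projected subgradient update on the multiplier of a relaxed capacity constraint; the tiny instance and step-size rule below are hypothetical, not the paper's model.

    ```python
    # Sketch: subgradient update for a multiplier on a relaxed capacity constraint.
    import numpy as np

    value = np.array([6.0, 5.0, 4.0])    # reward for scheduling each job now
    usage = np.array([3.0, 2.0, 2.0])    # capacity each job would consume
    capacity = 4.0
    lam = 0.0                            # Lagrange multiplier

    for k in range(1, 51):
        x = (value - lam * usage > 0).astype(float)   # subproblems decouple
        g = usage @ x - capacity                      # constraint violation (subgradient)
        lam = max(0.0, lam + (1.0 / k) * g)           # projected subgradient step
    print("multiplier:", round(lam, 3), "selected jobs:", x)
    ```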

  1. Application of Particle Swarm Optimization in Computer Aided Setup Planning

    NASA Astrophysics Data System (ADS)

    Kafashi, Sajad; Shakeri, Mohsen; Abedini, Vahid

    2011-01-01

    Recent research has sought to integrate computer-aided design (CAD) and computer-aided manufacturing (CAM) environments. The role of process planning is to convert the design specification into manufacturing instructions. Setup planning has a basic role in computer aided process planning (CAPP) and significantly affects the overall cost and quality of the machined part. This research focuses on the automatic generation of setups and on finding the best feasible setup plan. In order to computerize the setup planning process, three major steps are performed in the proposed system: (a) extraction of the machining data of the part, (b) analysis and generation of all possible setups, and (c) optimization to reach the best setup plan based on cost functions. Considering workshop resources such as machine tool, cutter and fixture, all feasible setups can be generated. The problem is then constrained by technological considerations such as TAD (tool approach direction), tolerance relationships and feature precedence relationships to keep the approach realistic and practical. The optimal setup plan is obtained by applying the PSO (particle swarm optimization) algorithm to the system using cost functions. A real sample part is used to demonstrate the performance and productivity of the system.
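    The particle swarm optimization step the record relies on follows the standard velocity/position update; the sketch below uses a stand-in objective (a sphere function) rather than a setup-planning cost model, and common textbook constants.

    ```python
    # Sketch: canonical PSO update loop on a placeholder objective.
    import numpy as np

    rng = np.random.default_rng(0)
    n_particles, dim = 20, 4
    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration constants

    cost = lambda X: np.sum(X**2, axis=1)          # hypothetical cost function

    X = rng.uniform(-5, 5, (n_particles, dim))
    V = np.zeros_like(X)
    pbest, pbest_cost = X.copy(), cost(X)
    gbest = pbest[np.argmin(pbest_cost)]

    for _ in range(100):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = X + V
        c = cost(X)
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = X[better], c[better]
        gbest = pbest[np.argmin(pbest_cost)]
    print("best cost:", round(pbest_cost.min(), 6))
    ```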

  2. Hybrid Pareto artificial bee colony algorithm for multi-objective single machine group scheduling problem with sequence-dependent setup times and learning effects.

    PubMed

    Yue, Lei; Guan, Zailin; Saif, Ullah; Zhang, Fei; Wang, Hao

    2016-01-01

    Group scheduling is significant for efficient and cost-effective production systems. However, setup times exist between groups, and they need to be reduced by sequencing the groups efficiently. The present research focuses on a sequence-dependent group scheduling problem with the aim of simultaneously minimizing the makespan and the total weighted tardiness. In most production scheduling problems the processing times of jobs are assumed to be fixed, yet the actual processing time of a job may be reduced by a "learning effect". The integration of sequence-dependent group scheduling with learning effects has rarely been considered in the literature. Therefore, this research considers a single machine group scheduling problem with sequence-dependent setup times and learning effects simultaneously. A novel hybrid Pareto artificial bee colony algorithm (HPABC), incorporating some steps of a genetic algorithm, is proposed for this problem to obtain Pareto solutions. Furthermore, five different sizes of test problems (small, small-medium, medium, large-medium, large) are solved using the proposed HPABC. The Taguchi method is used to tune the effective parameters of the proposed HPABC for each problem category. The performance of HPABC is compared with three well-known multi-objective optimization algorithms: the improved strength Pareto evolutionary algorithm (SPEA2), the non-dominated sorting genetic algorithm II (NSGAII) and particle swarm optimization (PSO). Results indicate that HPABC outperforms SPEA2, NSGAII and PSO and gives better Pareto optimal solutions in terms of diversity and quality for almost all instances of the different problem sizes.
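    A minimal sketch of how such a schedule can be evaluated is given below: it combines a position-based learning effect of the common form p*r**a (a < 0) with sequence-dependent group setup times. The data, the exponent and the within-group position counting are illustrative assumptions, not the paper's instances.

    ```python
    # Sketch: makespan of one group sequence under a positional learning effect.
    def makespan(group_order, jobs_in_group, group_setup, a=-0.2):
        t, prev = 0.0, None
        for g in group_order:
            t += group_setup[prev][g] if prev is not None else group_setup[None][g]
            for r, p in enumerate(jobs_in_group[g], start=1):
                t += p * r ** a          # actual processing time shrinks with position
            prev = g
        return t

    jobs_in_group = {"G1": [4.0, 3.0], "G2": [5.0, 2.0, 2.0]}
    group_setup = {None: {"G1": 1.0, "G2": 2.0},
                   "G1": {"G2": 3.0}, "G2": {"G1": 1.5}}
    print(round(makespan(["G1", "G2"], jobs_in_group, group_setup), 2))
    ```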

  3. When a Problem Is More than a Teacher's Question

    ERIC Educational Resources Information Center

    Olson, Jo Clay; Knott, Libby

    2013-01-01

    Not only are the problems teachers pose throughout their teaching of great importance but also the ways in which they use those problems make this a critical component of teaching. A problem-posing episode includes the problem setup, the statement of the problem, and the follow-up questions. Analysis of problem-posing episodes of precalculus…

  4. Analysis of aerobic granular sludge formation based on grey system theory.

    PubMed

    Zhang, Cuiya; Zhang, Hanmin

    2013-04-01

    Based on grey entropy analysis, the relational grade of operational parameters with aerobic granular sludge's granulation indicators was studied. The former consisted of settling time (ST), aeration time (AT), superficial gas velocity (SGV), height/diameter (H/D) ratio and organic loading rate (OLR), while the latter included sludge volume index (SVI) and set-up time. The calculated results showed that for SVI and set-up time, the influence orders and the corresponding grey entropy relational grades (GERG) were: SGV (0.9935) > AT (0.9921) > OLR (0.9894) > ST (0.9876) > H/D (0.9857) and SGV (0.9928) > H/D (0.9914) > AT (0.9909) > OLR (0.9897) > ST (0.9878). The chosen parameters were all key impact factors, as each GERG was larger than 0.98. SGV played an important role in improving SVI transformation and facilitating the set-up process. The influence of ST on SVI and set-up time was relatively low due to its dual functions. SVI transformation and rapid set-up demanded different optimal H/D ratio scopes (10-20 and 16-20). Meanwhile, different functions could be obtained through adjusting certain factors' scope.

  5. Accurate setup of paraspinal patients using a noninvasive patient immobilization cradle and portal imaging.

    PubMed

    Lovelock, D Michael; Hua, Chiaho; Wang, Ping; Hunt, Margie; Fournier-Bidoz, Nathalie; Yenice, Kamil; Toner, Sean; Lutz, Wendell; Amols, Howard; Bilsky, Mark; Fuks, Zvi; Yamada, Yoshiya

    2005-08-01

    Because of the proximity of the spinal cord, effective radiotherapy of paraspinal tumors to high doses requires highly conformal dose distributions, accurate patient setup, setup verification, and patient immobilization. An immobilization cradle has been designed to facilitate the rapid setup and radiation treatment of patients with paraspinal disease. For all treatments, patients were set up to within 2.5 mm of the design using an amorphous silicon portal imager. Setup reproducibility of the target using the cradle and associated clinical procedures was assessed by measuring the setup error prior to any correction. From 350 anterior/posterior images and 303 lateral images, the standard deviations, as determined by the imaging procedure, were 1.3 mm, 1.6 mm, and 2.1 mm in the ant/post, right/left, and superior/inferior directions. Immobilization was assessed by measuring patient shifts between localization images taken before and after treatment. From 67 ant/post image pairs and 49 lateral image pairs, the standard deviations were found to be less than 1 mm in all directions. Careful patient positioning and immobilization has enabled us to develop a successful clinical program of high-dose, conformal radiotherapy of paraspinal disease using a conventional linac equipped with dynamic multileaf collimation and an amorphous silicon portal imager.

  6. Combustion of Coal/Oil/Water Slurries

    NASA Technical Reports Server (NTRS)

    Kushida, R. O.

    1982-01-01

    Proposed test setup would measure combustion performance of new fuels by rapidly heating a droplet of coal/oil/water mixture and recording resulting explosion. Such mixtures are being considered as petroleum substitutes in oil-fired furnaces.

  7. Length matters: Improved high field EEG-fMRI recordings using shorter EEG cables.

    PubMed

    Assecondi, Sara; Lavallee, Christina; Ferrari, Paolo; Jovicich, Jorge

    2016-08-30

    The use of concurrent EEG-fMRI recordings has increased in recent years, allowing new avenues of medical and cognitive neuroscience research; however, currently used setups present problems with data quality and reproducibility. We propose a compact experimental setup for concurrent EEG-fMRI at 4T and compare it to a more standard reference setup. The compact setup uses short EEG cables connecting to the amplifiers, which are placed right at the back of the head RF coil on a form-fitting extension force-locked to the patient MR bed. We compare the two setups in terms of sensitivity to MR-room environmental noise, interferences between measuring devices (EEG or fMRI), and sensitivity to functional responses in a visual stimulation paradigm. The compact setup reduces the system sensitivity to both external noise and MR-induced artefacts by at least 60%, with negligible EEG noise induced from the mechanical vibrations of the cryogenic cooling compression pump. The compact setup improved EEG data quality and the overall performance of MR-artifact correction techniques. Both setups were similar in terms of the fMRI data, with higher reproducibility for cable placement within the scanner in the compact setup. This improved compact setup may be relevant to MR laboratories interested in reducing the sensitivity of their EEG-fMRI experimental setup to external noise sources, setting up an EEG-fMRI workplace for the first time, or for creating a more reproducible configuration of equipment and cables. Implications for safety and ergonomics are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Rapid test for the detection of hazardous microbiological material

    NASA Astrophysics Data System (ADS)

    Mordmueller, Mario; Bohling, Christian; John, Andreas; Schade, Wolfgang

    2009-09-01

    After attacks with anthrax pathogens were committed all over the world starting in 2001, the fast detection and determination of biological samples has attracted interest. A very promising method for a rapid test is Laser Induced Breakdown Spectroscopy (LIBS). LIBS is an optical method which uses time-resolved or time-integrated spectral analysis of optical plasma emission after pulsed laser excitation. Even though LIBS is well established for the determination of metals and other inorganic materials, the analysis of microbiological organisms is difficult due to their very similar stoichiometric composition. To analyze such similar LIBS spectra, computer-assisted chemometrics is a very useful approach. In this paper we report on first results of developing a compact and fully automated rapid test for the detection of hazardous microbiological material. Experiments have been carried out with two setups: a bulky one composed of standard laboratory components and a compact one consisting of miniaturized industrial components. Both setups work at an excitation wavelength of λ = 1064 nm (Nd:YAG). Data analysis is done by Principal Component Analysis (PCA) with an adjacent neural network for fully automated sample identification.
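    The chemometric step described above (PCA followed by a classifier) can be sketched as below; the spectra are random stand-ins and the layer sizes are arbitrary, so this only illustrates the data flow, not the authors' trained model.

    ```python
    # Sketch: PCA dimensionality reduction of spectra, then a small neural classifier.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)
    spectra = rng.random((40, 2048))          # 40 hypothetical emission spectra
    labels = np.repeat([0, 1], 20)            # two hypothetical sample classes

    scores = PCA(n_components=5).fit_transform(spectra)
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                        random_state=0).fit(scores[:30], labels[:30])
    print("held-out predictions:", clf.predict(scores[30:]))
    ```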

  9. Ant system: optimization by a colony of cooperating agents.

    PubMed

    Dorigo, M; Maniezzo, V; Colorni, A

    1996-01-01

    An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call ant system (AS). We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed methodology to the classical traveling salesman problem (TSP), and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing using TSP. To demonstrate the robustness of the approach, we show how the ant system (AS) can be applied to other optimization problems like the asymmetric traveling salesman, the quadratic assignment and the job-shop scheduling. Finally we discuss the salient characteristics of the AS: global data structure revision, distributed communication and probabilistic transitions.
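    For reference, the ant-system rules are commonly stated as follows (textbook form, which may differ in detail from the paper's notation): the transition probability of ant k from city i to city j, the trail update after each cycle, and the per-ant trail deposit with tour length L_k and visibility eta_ij = 1/d_ij.

    ```latex
    p^{k}_{ij} =
      \frac{[\tau_{ij}]^{\alpha}\,[\eta_{ij}]^{\beta}}
           {\sum_{l \in \mathrm{allowed}_k} [\tau_{il}]^{\alpha}\,[\eta_{il}]^{\beta}},
    \qquad
    \tau_{ij} \leftarrow \rho\,\tau_{ij} + \sum_{k} \Delta\tau^{k}_{ij},
    \qquad
    \Delta\tau^{k}_{ij} =
      \begin{cases} Q / L_k & \text{if ant } k \text{ used edge } (i,j),\\
                    0       & \text{otherwise.} \end{cases}
    ```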

  10. A computational analysis of lower bounds for the economic lot sizing problem in remanufacturing with separate setups

    NASA Astrophysics Data System (ADS)

    Aishah Syed Ali, Sharifah

    2017-09-01

    This paper considers the economic lot sizing problem in remanufacturing with separate setups (ELSRs), where remanufactured and new products are produced on dedicated production lines. Since this problem is NP-hard in general, which leads to computational inefficiency and low-quality solutions, we present (a) a multicommodity formulation and (b) a strengthened formulation based on a priori addition of valid inequalities in the space of original variables, which are then compared with the Wagner-Whitin based formulation available in the literature. Computational experiments on a large number of test data sets are performed to evaluate the different approaches. The numerical results show that our strengthened formulation outperforms all the other tested approaches in terms of linear relaxation bounds. Finally, we conclude with future research directions.

  11. Self-powered electrospinning apparatus based on a hand-operated Wimshurst generator

    NASA Astrophysics Data System (ADS)

    Han, Wen-Peng; Huang, Yuan-Yuan; Yu, Miao; Zhang, Jun-Cheng; Yan, Xu; Yu, Gui-Feng; Zhang, Hong-Di; Yan, Shi-Ying; Long, Yun-Ze

    2015-03-01

    A conventional electrospinning setup cannot work without a plug (electricity supply). In this article, we report a self-powered electrospinning setup based on a hand-operated Wimshurst generator. The new device has better applicability and portability than a typical conventional electrospinning setup because it is lightweight and can work without an external power supply. Experimental parameters of the apparatus such as the minimum number of handle turns to generate enough energy to spin, rotation speed of the handle and electrospinning distance were investigated. Different polymers such as polystyrene (PS), poly(vinylidene fluoride) (PVDF), polycaprolactone (PCL) and polylactic acid (PLA) were electrospun into ultrathin fibers successfully by this apparatus. The stability, reliability, and repeatability of the new apparatus demonstrate that it can be used as not only a demonstrator for an electrospinning process, but also a beneficial complement to conventional electrospinning especially where or when without a power supply, and may be used in wound healing and rapid hemostasis, etc.

  12. Application of genetic algorithm in integrated setup planning and operation sequencing

    NASA Astrophysics Data System (ADS)

    Kafashi, Sajad; Shakeri, Mohsen

    2011-01-01

    Process planning is an essential component for linking design and the manufacturing process. Setup planning and operation sequencing are two main tasks in process planning, and many studies have solved these two problems separately. Considering that the two functions are complementary, it is necessary to integrate them more tightly so that the performance of a manufacturing system can be improved economically and competitively. This paper presents a generative system and a genetic algorithm (GA) approach for process planning of a given part. The proposed approach and optimization methodology analyze the TAD (tool approach direction), the tolerance relations between features and the feature precedence relations to generate all possible setups and operations using a workshop resource database. Based on these technological constraints, the GA approach, which adopts a feature-based representation, optimizes the setup plan and the sequence of operations using cost indices. A case study shows that the developed system can generate satisfactory results, optimizing setup planning and operation sequencing simultaneously under feasible conditions.

  13. New scheduling rules for a dynamic flexible flow line problem with sequence-dependent setup times

    NASA Astrophysics Data System (ADS)

    Kia, Hamidreza; Ghodsypour, Seyed Hassan; Davoudpour, Hamid

    2017-09-01

    In the literature, multi-objective dynamic scheduling problems and simple priority rules are widely studied. Although simple rules are not efficient enough, owing to their simplicity and lack of general insight, composite dispatching rules perform well because they are derived from experiments. In this paper, a dynamic flexible flow line problem with sequence-dependent setup times is studied. The objective of the problem is minimization of mean flow time and mean tardiness. A 0-1 mixed integer model of the problem is formulated. Since the problem is NP-hard, four new composite dispatching rules are proposed to solve it by applying a genetic programming framework and choosing proper operators. Furthermore, a discrete-event simulation model is built to examine the performance of the scheduling rules, considering the four new heuristic rules and six heuristic rules adapted from the literature. The experimental results make clear that the composite dispatching rules formed by genetic programming perform better than the others in minimizing mean flow time and mean tardiness.

  14. Method and apparatus for studying high-temperature properties of conductive materials in the interests of nuclear power engineering

    NASA Astrophysics Data System (ADS)

    Savvatimskiy, A. I.; Onufriev, S. V.

    2016-12-01

    Physical processes during a rapid (microsecond) heating of metals, carbon, and their compounds by a single pulse of electric current are discussed. Effects arising in such short-term heating near the melting point are noted: the electron emission and heat capacity anomalies and the possible occurrence of Frenkel pairs (interstitial atoms and vacancies). The problem of measuring the temperature using optical methods under pulse heating is considered, including the use of a specimen in the form of a blackbody model. The melting temperature of carbon (4800-4900 K) is measured at increased pulse pressure. The results of studying the properties of metals (zirconium and hafnium as examples) and of zirconium carbide at high temperatures are discussed. The schematics of the pulse setups and the instrumentation, as well as specimens for a pulse experiment, are presented.

  15. Initial Implementation of Transient VERA-CS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerlach, Andrew; Kochunas, Brendan; Salko, Robert

    In this milestone the capabilities of both CTF and MPACT were extended to perform coupled transient calculations. This required several small changes in MPACT to set up the problems correctly, perform the edits correctly, and call the appropriate CTF interfaces in the right order. For CTF, revisions and corrections to the transient timestepping algorithm were made, as well as the addition of a new interface subroutine to allow MPACT to drive CTF at each timestep. With the modifications completed, the initial coupled capability was demonstrated on some problems used for code verification, a hypothetical small mini-core, and a Watts Bar demonstration problem. For each of these cases the results showed good agreement with the previous MPACT internal TH feedback model that relied on a simplified fuel heat conduction model and simplified coolant treatment. After the pulse the results are notably different as expected, where the effects of convection of heat to the coolant can be observed. Areas for future work were discussed, including assessment and development of the CTF dynamic fuel deformation and gap conductance models, addition of suitable transient boiling and CHF models for the rapid heating and cooling rates seen in RIAs, additional validation and demonstration work, and areas for improvement to the code input and output capabilities.

  16. Balancing antagonistic time and resource utilization constraints in over-subscribed scheduling problems

    NASA Technical Reports Server (NTRS)

    Smith, Stephen F.; Pathak, Dhiraj K.

    1991-01-01

    In this paper, we report work aimed at applying concepts of constraint-based problem structuring and multi-perspective scheduling to an over-subscribed scheduling problem. Previous research has demonstrated the utility of these concepts as a means for effectively balancing conflicting objectives in constraint-relaxable scheduling problems, and our goal here is to provide evidence of their similar potential in the context of HST observation scheduling. To this end, we define and experimentally assess the performance of two time-bounded heuristic scheduling strategies in balancing the tradeoff between resource setup time minimization and satisfaction of absolute time constraints. The first strategy considered is motivated by dispatch-based manufacturing scheduling research, and employs a problem decomposition that concentrates local search on minimizing resource idle time due to setup activities. The second is motivated by research in opportunistic scheduling and advocates a problem decomposition that focuses attention on the goal activities that have the tightest temporal constraints. Analysis of experimental results gives evidence of differential superiority on the part of each strategy in different problem solving circumstances. A composite strategy based on recognition of characteristics of the current problem solving state is then defined and tested to illustrate the potential benefits of constraint-based problem structuring and multi-perspective scheduling in over-subscribed scheduling problems.

  17. A note on resource allocation scheduling with group technology and learning effects on a single machine

    NASA Astrophysics Data System (ADS)

    Lu, Yuan-Yuan; Wang, Ji-Bo; Ji, Ping; He, Hongyu

    2017-09-01

    In this article, single-machine group scheduling with learning effects and convex resource allocation is studied. The goal is to find the optimal job schedule, the optimal group schedule, and resource allocations of jobs and groups. For the problem of minimizing the makespan subject to limited resource availability, it is proved that the problem can be solved in polynomial time under the condition that the setup times of groups are independent. For the general setup times of groups, a heuristic algorithm and a branch-and-bound algorithm are proposed, respectively. Computational experiments show that the performance of the heuristic algorithm is fairly accurate in obtaining near-optimal solutions.

  18. Delft Dashboard: a quick setup tool for coastal and estuarine models

    NASA Astrophysics Data System (ADS)

    Nederhoff, C., III; Van Dongeren, A.; Van Ormondt, M.; Veeramony, J.

    2016-02-01

    We developed the easy-to-use Delft DashBoard (DDB) software for the rapid setup of coastal and estuarine hydrodynamic and basic morphological numerical models. In the "Model Maker" toolbox, users have the capability to set up Delft3D models in a minimal amount of time (on the order of an hour) for any location in the world. DDB draws upon public internet data sources of bathymetry and tides to construct the model. With additional toolboxes, these models can be forced with parameterized hurricane wind fields or uplift of the sea surface due to tsunamis, nested in publicly available ocean models, and forced with meteo data (wind speed, pressure, temperature). In this presentation we will show the skill of a model set up with Delft Dashboard and compare it to well-calibrated benchmark models. These latter models have been set up using detailed input data and boundary conditions. We have tested the functionality of Delft DashBoard and evaluate the performance and robustness of the DDB model system on a variety of cases, ranging from coastal to basin models. Furthermore, we have performed a sensitivity study to investigate the most critical physical and numerical processes. The software can benefit operational modellers, as well as scientists and consultants.

  19. Problems of Female School Teachers in Kerala

    ERIC Educational Resources Information Center

    Nath, Baiju K.

    2008-01-01

    The problems of employed women vary with the nature of the job, the sector in which they work, and the family setup. A fairly large proportion of the teaching community is comprised of female teachers, teaching being one of the major service sectors chosen by women in the state. The study aimed to examine the Personal, Familial and Professional problems faced by…

  20. A Comparative Study of Health Status and Quality of Life of Elderly People Living in Old Age Homes and within Family Setup in Raigad District, Maharashtra.

    PubMed

    Amonkar, Priyanka; Mankar, Madhavi Jogesh; Thatkar, Pandurang; Sawardekar, Pradeep; Goel, Rajesh; Anjenaya, Seema

    2018-01-01

    The traditional concept of the family in India providing support to the elderly is changing with the disintegration of joint families. In this scenario the concept of old age homes (OAHs) is gaining momentum and the number of people seeking OAH care is rapidly increasing. However, not much is known about the quality of life (QOL) of Indian elderly staying in the OAH setup. The aim was to assess and compare the health status, quality of life and depression of elderly people living in OAHs and within families, using the WHOQOL-OLD questionnaire and the Geriatric Depression Scale (GDS). A cross-sectional study was conducted among elderly people aged above 60 years. After obtaining written consent and matching for age, sex and socioeconomic status, 60 elderly people from OAHs and 120 elderly people living within a family setup were selected randomly. The WHOQOL-OLD standard questionnaire and the GDS were used to assess quality of life and depression in the elderly. The QOL of the elderly in the domains of autonomy, past, present and future activities, social participation and intimacy was better in the family setup (60.62, 70.62, 66.14 and 58.43) than in OAHs (51.35, 62.91, 59.47 and 41.16) (p<0.05). There was a statistically significant difference in the mean geriatric depression scores of the two groups (3.96 within the family setup and 5.76 in OAHs). The quality of life of the elderly within the family setup was better than that of the elderly in OAHs.

  1. SU-E-J-15: A Patient-Centered Scheme to Mitigate Impacts of Treatment Setup Error

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, L; Southern Medical University, Guangzhou; Tian, Z

    2014-06-01

    Purpose: Current Intensity Modulated Radiation Therapy (IMRT) is plan-centered. At each treatment fraction, we position the patient to match the setup in the treatment plan. Inaccurate setup can compromise the delivered dose distribution and hence lead to suboptimal treatments. Moreover, the current setup approach via couch shift under image guidance can correct translational errors, while rotational and deformation errors are hard to address. To overcome these problems, we propose in this abstract a patient-centered scheme to mitigate the impacts of treatment setup errors. Methods: In the patient-centered scheme, we first position the patient on the couch approximately matching the planned setup. Our Supercomputing Online Replanning Environment (SCORE) is then employed to design an optimal treatment plan based on the daily patient geometry. It hence mitigates the impacts of treatment setup error and reduces the requirements on setup accuracy. We have conducted simulation studies in 10 head-and-neck (HN) patients to investigate the feasibility of this scheme. Rotational and deformation setup errors were simulated. Specifically, 1, 3, 5, 7 degrees of rotation were applied in the pitch, roll, and yaw directions; deformation errors were simulated by splitting neck movements into four basic types: rotation, lateral bending, flexion and extension. Setup variation ranges are based on observed numbers in previous studies. Dosimetric impacts of our scheme were evaluated on PTVs and OARs in comparison with the original plan dose with the original geometry and the original plan dose recalculated with the new setup geometries. Results: With the conventional plan-centered approach, setup error could lead to significant PTV D99 decrease (−0.25∼+32.42%) and contralateral-parotid Dmean increase (−35.09∼+42.90%). The patient-centered approach is effective in mitigating such impacts to 0∼+0.20% and −0.03∼+5.01%, respectively. Computation time is <128 s. Conclusion: A patient-centered scheme is proposed to mitigate setup error impacts using replanning. Its superiority in terms of dosimetric impacts and feasibility has been shown through simulation studies on HN cases.

  2. A Laboratory Exercise with Related Rates.

    ERIC Educational Resources Information Center

    Sworder, Steven C.

    A laboratory experiment, based on a simple electric circuit that can be used to demonstrate the existence of real-world "related rates" problems, is outlined and an equation for voltage across the capacitor terminals during discharge is derived. The necessary materials, setup methods, and experimental problems are described. A student laboratory…

  3. Integrated Field Screening for Rapid Sediment Characterization

    DTIC Science & Technology

    2004-08-01

    ... operating procedure; SOW, statement of work; sq, square mile(s); SSC San Diego, Space and Naval Warfare Systems Center, San Diego; TBT, tributyltin ... (PCBs, tributyltin [TBT]); the data show these areas are not very contaminated. 3.4 Physical Set-up and Operation: The details of the methodology for the ...

  4. Optimization and development of solar power system under diffused sunlight condition in rural areas with supercapacitor integration

    NASA Astrophysics Data System (ADS)

    Castelino, Roystan V.; Jana, Suman; Kumhar, Rajesh; Singh, Niraj K.

    2018-04-01

    The simulation and hardware-based experiment presented in this paper show the possibility of increasing the reliability of solar power under diffused-light conditions by using a supercapacitor module. The experimental setup can be used in areas where sunlight is intermittent and the radiation is diffuse. Under diffused radiation, solar PV cells perform very poorly, but with this setup the power yield can be increased greatly. Numerical models are used to reproduce the voltage and current response of the hardware setup in a MATLAB Simulink based environment. To convert scattered solar radiation into electricity using a conventional solar PV module, the batteries have to be linked with a rapid charging/discharging device such as a supercapacitor module. The conventional method uses a charging circuit that dumps the power if the voltage is below a certain level, whereas the present circuit utilizes the entire power even when the voltage is low under diffused sunlight; no power is dumped in this circuit. The efficiency and viability of this lab-scale experimental setup can be examined with further experiments and an industrial-scale model.

  5. American & Soviet engineers examine ASTP docking set-up following tests

    NASA Image and Video Library

    1974-07-10

    S74-25394 (10 July 1974) --- A group of American and Soviet engineers of the Apollo-Soyuz Test Project working group three examines an ASTP docking set-up following a docking mechanism fitness test conducted in Building 13 at the Johnson Space Center. Working Group No. 3 is concerned with ASTP docking problems and techniques. The joint U.S.-USSR ASTP docking mission in Earth orbit is scheduled for the summer of 1975. The Apollo docking mechanism is atop the Soyuz docking mechanism.

  6. Microcontroller-Based Experimental Setup and Experiments for SCADA Education

    ERIC Educational Resources Information Center

    Sahin, S.; Olmez, M.; Isler, Y.

    2010-01-01

    In the field of automation technology, research and development for industrial applications has increased rapidly in recent years. Therefore, industrial automation and control education is a very important element of the industrialization process in developing countries, such as Turkey, which needs to keep abreast of the latest developments in…

  7. Split and flow: reconfigurable capillary connection for digital microfluidic devices.

    PubMed

    Lapierre, Florian; Harnois, Maxime; Coffinier, Yannick; Boukherroub, Rabah; Thomy, Vincent

    2014-09-21

    Supplying liquid to droplet-based microfluidic microsystems remains a delicate task facing the problems of coupling continuous to digital or macro- to microfluidic systems. Here, we take advantage of superhydrophobic microgrids to address this problem. Insertion of a capillary tube inside a microgrid aperture leads to a simple and reconfigurable droplet generation setup.

  8. Problem-Based Learning in Wind Energy Using Virtual and Real Setups

    ERIC Educational Resources Information Center

    Santos-Martin, D.; Alonso-Martinez, J.; Eloy-Garcia Carrasco, J.; Arnaltes, S.

    2012-01-01

    The use of wind energy is now an established fact, and many educational institutions are introducing this topic into their engineering studies. Problem-based learning (PBL), as a student-centered instructional approach, has contributed to important developments in engineering education over the last few years. This paper presents the experience of…

  9. Progress towards a rapidly rotating ultracold Fermi gas

    NASA Astrophysics Data System (ADS)

    Hu, Ming-Guang; van de Graaff, Michael; Cornell, Eric; Jin, Deborah

    2015-05-01

    We are designing an experiment with the goal of creating a rapidly rotating ultracold Fermi gas, which is a promising system in which to study quantum Hall physics. We propose to use selective evaporation of a gas that has been initialized with a modest rotation rate to increase the angular momentum per particle in order to reach rapid rotation. We have performed simulations of this evaporation process for a model optical trap potential. Achieving rapid rotation will require a very smooth, very harmonic, and dynamically variable optical trap. We plan to use a setup consisting of two acousto-optical modulators to "paint" an optical dipole trapping potential that can be made smooth, radially symmetric, and harmonic. This project is supported by NSF, NIST, and NASA.

  10. Quantum Experiments and Graphs: Multiparty States as Coherent Superpositions of Perfect Matchings.

    PubMed

    Krenn, Mario; Gu, Xuemei; Zeilinger, Anton

    2017-12-15

    We show a surprising link between experimental setups to realize high-dimensional multipartite quantum states and graph theory. In these setups, the paths of photons are identified such that the photon-source information is never created. We find that each of these setups corresponds to an undirected graph, and every undirected graph corresponds to an experimental setup. Every term in the emerging quantum superposition corresponds to a perfect matching in the graph. Calculating the final quantum state is in the #P-complete complexity class, thus it cannot be done efficiently. To strengthen the link further, theorems from graph theory, such as Hall's marriage problem, are rephrased in the language of pair creation in quantum experiments. We show explicitly how this link allows one to answer questions about quantum experiments (such as which classes of entangled states can be created) with graph theoretical methods, and how to potentially simulate properties of graphs and networks with quantum experiments (such as critical exponents and phase transitions).
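
    As a toy illustration of the graph-theory side of this correspondence (not code from the paper), the Python sketch below enumerates the perfect matchings of a small undirected graph by brute force; each matching would correspond to one term in the emerging superposition, and the exponential growth of this enumeration hints at why counting matchings is #P-complete in general.

      def perfect_matchings(vertices, edges):
          """Enumerate all perfect matchings of an undirected graph by recursion.

          vertices: list of vertex labels (even length for a perfect matching to exist)
          edges:    set of frozensets, each containing two distinct vertices
          """
          if not vertices:
              yield []
              return
          v = vertices[0]
          for u in vertices[1:]:
              if frozenset((v, u)) in edges:
                  rest = [w for w in vertices if w not in (v, u)]
                  for m in perfect_matchings(rest, edges):
                      yield [(v, u)] + m

      # Example: a 4-cycle a-b-c-d-a has exactly two perfect matchings,
      # i.e. a two-term superposition in the corresponding experiment.
      verts = ["a", "b", "c", "d"]
      edgs = {frozenset(p) for p in [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]}
      matchings = list(perfect_matchings(verts, edgs))
      print(len(matchings), matchings)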

  11. Principle and analysis of a rotational motion Fourier transform infrared spectrometer

    NASA Astrophysics Data System (ADS)

    Cai, Qisheng; Min, Huang; Han, Wei; Liu, Yixuan; Qian, Lulu; Lu, Xiangning

    2017-09-01

    Fourier transform infrared spectroscopy is an important technique for studying molecular energy levels, analyzing material compositions, and detecting environmental pollutants. A novel rotational-motion Fourier transform infrared spectrometer with high stability and ultra-rapid scanning characteristics is proposed in this paper. The basic principle, the optical path difference (OPD) calculations, and some tolerance analysis are elaborated. The OPD of this spectrometer is obtained by the continuous rotational motion of a pair of parallel mirrors instead of the translational motion of a traditional Michelson interferometer. Because of the rotational motion, it avoids the tilt problems that occur in the translational-motion Michelson interferometer. There is a cosine relationship between the OPD and the rotation angle of the parallel mirrors. An optical model is set up in the non-sequential mode of the ZEMAX software, and the interferogram of monochromatic light is simulated using the ray tracing method. The simulated interferogram is consistent with the theoretically calculated interferogram. As the rotating mirrors are the only moving elements in this spectrometer, the parallelism of the rotating mirrors and the vibration during the scan are analyzed; the vibration of the parallel mirrors is the main error source during the rotation. This high-stability, ultra-rapid-scanning Fourier transform infrared spectrometer is a suitable candidate for airborne and space-borne remote sensing.
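
    The abstract only states that the OPD varies as a cosine of the rotation angle, so the following numpy sketch assumes a generic OPD(theta) = OPD_max*cos(theta) law (the actual geometric factors and numbers are not given and are invented here) and shows how a monochromatic interferogram recorded against rotation angle can be resampled onto a uniform OPD grid before the Fourier transform.

      import numpy as np

      opd_max = 1.0e-3          # assumed 1 mm maximum optical path difference, in metres
      wavelength = 10.6e-6      # monochromatic source, e.g. a 10.6 um line

      theta = np.linspace(0.0, np.pi, 20000)        # rotation angle of the mirror pair
      opd = opd_max * np.cos(theta)                 # assumed OPD(theta) = OPD_max * cos(theta)

      # Ideal two-beam interferogram of a monochromatic line (unit mean intensity).
      interferogram = 0.5 * (1.0 + np.cos(2.0 * np.pi * opd / wavelength))

      # Resample onto an equidistant OPD grid before the FFT to recover the spectrum.
      opd_uniform = np.linspace(opd.min(), opd.max(), 4096)
      ig_uniform = np.interp(opd_uniform, opd[::-1], interferogram[::-1])  # np.interp needs ascending x
      spectrum = np.abs(np.fft.rfft(ig_uniform - ig_uniform.mean()))
      d_opd = opd_uniform[1] - opd_uniform[0]
      recovered_wavenumber = spectrum.argmax() / (len(ig_uniform) * d_opd)  # in 1/m
      print("recovered wavelength ~", 1.0 / recovered_wavenumber, "m")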

  12. Self-powered electrospinning apparatus based on a hand-operated Wimshurst generator.

    PubMed

    Han, Wen-Peng; Huang, Yuan-Yuan; Yu, Miao; Zhang, Jun-Cheng; Yan, Xu; Yu, Gui-Feng; Zhang, Hong-Di; Yan, Shi-Ying; Long, Yun-Ze

    2015-03-19

    A conventional electrospinning setup cannot work without a plug (electricity supply). In this article, we report a self-powered electrospinning setup based on a hand-operated Wimshurst generator. The new device has better applicability and portability than a typical conventional electrospinning setup because it is lightweight and can work without an external power supply. Experimental parameters of the apparatus, such as the minimum number of handle turns needed to generate enough energy to spin, the rotation speed of the handle and the electrospinning distance, were investigated. Different polymers such as polystyrene (PS), poly(vinylidene fluoride) (PVDF), polycaprolactone (PCL) and polylactic acid (PLA) were electrospun into ultrathin fibers successfully by this apparatus. The stability, reliability, and repeatability of the new apparatus demonstrate that it can be used not only as a demonstrator of the electrospinning process, but also as a beneficial complement to conventional electrospinning, especially where or when no power supply is available, and may be used in wound healing, rapid hemostasis, etc.

  13. Setup for polarized neutron imaging using in situ 3He cells at the Oak Ridge National Laboratory High Flux Isotope Reactor CG-1D beamline

    NASA Astrophysics Data System (ADS)

    Dhiman, I.; Ziesche, Ralf; Wang, Tianhao; Bilheux, Hassina; Santodonato, Lou; Tong, X.; Jiang, C. Y.; Manke, Ingo; Treimer, Wolfgang; Chatterji, Tapan; Kardjilov, Nikolay

    2017-09-01

    In the present study, we report a new setup for polarized neutron imaging at the ORNL High Flux Isotope Reactor CG-1D beamline using an in situ 3He polarizer and analyzer. This development is very important for extending the capabilities of the imaging instrument at ORNL providing a polarized beam with a large field-of-view, which can be further used in combination with optical devices like Wolter optics, focusing guides, or other lenses for the development of microscope arrangement. Such a setup can be of advantage for the existing and future imaging beamlines at the pulsed neutron sources. The first proof-of-concept experiment is performed to study the ferromagnetic phase transition in the Fe3Pt sample. We also demonstrate that the polychromatic neutron beam in combination with in situ 3He cells can be used as the initial step for the rapid measurement and qualitative analysis of radiographs.

  14. Setup for polarized neutron imaging using in situ 3He cells at the Oak Ridge National Laboratory High Flux Isotope Reactor CG-1D beamline.

    PubMed

    Dhiman, I; Ziesche, Ralf; Wang, Tianhao; Bilheux, Hassina; Santodonato, Lou; Tong, X; Jiang, C Y; Manke, Ingo; Treimer, Wolfgang; Chatterji, Tapan; Kardjilov, Nikolay

    2017-09-01

    In the present study, we report a new setup for polarized neutron imaging at the ORNL High Flux Isotope Reactor CG-1D beamline using an in situ 3He polarizer and analyzer. This development is very important for extending the capabilities of the imaging instrument at ORNL providing a polarized beam with a large field-of-view, which can be further used in combination with optical devices like Wolter optics, focusing guides, or other lenses for the development of microscope arrangement. Such a setup can be of advantage for the existing and future imaging beamlines at the pulsed neutron sources. The first proof-of-concept experiment is performed to study the ferromagnetic phase transition in the Fe3Pt sample. We also demonstrate that the polychromatic neutron beam in combination with in situ 3He cells can be used as the initial step for the rapid measurement and qualitative analysis of radiographs.

  15. Optimal Cluster Mill Pass Scheduling With an Accurate and Rapid New Strip Crown Model

    NASA Astrophysics Data System (ADS)

    Malik, Arif S.; Grandhi, Ramana V.; Zipf, Mark E.

    2007-05-01

    Besides the requirement to roll coiled sheet at high levels of productivity, the optimal pass scheduling of cluster-type reversing cold mills presents the added challenge of assigning mill parameters that facilitate the best possible strip flatness. The pressures of intense global competition, and the requirements for increasingly thinner, higher quality specialty sheet products that are more difficult to roll, continue to force metal producers to commission innovative flatness-control technologies. This means that during the on-line computerized set-up of rolling mills, the mathematical model should not only determine the minimum total number of passes and maximum rolling speed, it should simultaneously optimize the pass-schedule so that desired flatness is assured, either by manual or automated means. In many cases today, however, on-line prediction of strip crown and corresponding flatness for the complex cluster-type rolling mills is typically addressed either by trial and error, by approximate deflection models for equivalent vertical roll-stacks, or by non-physical pattern recognition style models. The abundance of the aforementioned methods is largely due to the complexity of cluster-type mill configurations and the lack of deflection models with sufficient accuracy and speed for on-line use. Without adequate assignment of the pass-schedule set-up parameters, it may be difficult or impossible to achieve the required strip flatness. In this paper, we demonstrate optimization of cluster mill pass-schedules using a new accurate and rapid strip crown model. This pass-schedule optimization includes computations of the predicted strip thickness profile to validate mathematical constraints. In contrast to many of the existing methods for on-line prediction of strip crown and flatness on cluster mills, the demonstrated method requires minimal prior tuning and no extensive training with collected mill data. To rapidly and accurately solve the multi-contact problem and predict the strip crown, a new customized semi-analytical modeling technique that couples the Finite Element Method (FEM) with classical solid mechanics was developed to model the deflection of the rolls and strip while under load. The technique employed offers several important advantages over traditional methods to calculate strip crown, including continuity of elastic foundations, non-iterative solution when using predetermined foundation moduli, continuous third-order displacement fields, simple stress-field determination, and a comparatively faster solution time.

  16. Integer Linear Programming for Constrained Multi-Aspect Committee Review Assignment

    PubMed Central

    Karimzadehgan, Maryam; Zhai, ChengXiang

    2011-01-01

    Automatic review assignment can significantly improve the productivity of many people such as conference organizers, journal editors and grant administrators. A general setup of the review assignment problem involves assigning a set of reviewers on a committee to a set of documents to be reviewed under the constraint of review quota so that the reviewers assigned to a document can collectively cover multiple topic aspects of the document. No previous work has addressed such a setup of committee review assignments while also considering matching multiple aspects of topics and expertise. In this paper, we tackle the problem of committee review assignment with multi-aspect expertise matching by casting it as an integer linear programming problem. The proposed algorithm can naturally accommodate any probabilistic or deterministic method for modeling multiple aspects to automate committee review assignments. Evaluation using a multi-aspect review assignment test set constructed using ACM SIGIR publications shows that the proposed algorithm is effective and efficient for committee review assignments based on multi-aspect expertise matching. PMID:22711970
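
    A minimal sketch of the kind of integer program involved, written with the PuLP modelling library and using entirely made-up expertise scores, quota, and load limits (this is an illustration of the general formulation, not the authors' model):

      import pulp

      reviewers = ["r1", "r2", "r3"]
      papers = ["p1", "p2"]
      # Hypothetical reviewer-paper affinity scores aggregated over topic aspects.
      score = {("r1", "p1"): 0.9, ("r1", "p2"): 0.2,
               ("r2", "p1"): 0.4, ("r2", "p2"): 0.8,
               ("r3", "p1"): 0.5, ("r3", "p2"): 0.6}
      quota = 2          # reviewers required per paper
      max_load = 2       # papers a single reviewer may take

      prob = pulp.LpProblem("committee_review_assignment", pulp.LpMaximize)
      x = pulp.LpVariable.dicts("assign", (reviewers, papers), cat="Binary")

      # Objective: total expertise coverage of the assignment.
      prob += pulp.lpSum(score[r, p] * x[r][p] for r in reviewers for p in papers)

      for p in papers:                       # each paper gets exactly `quota` reviewers
          prob += pulp.lpSum(x[r][p] for r in reviewers) == quota
      for r in reviewers:                    # review-quota constraint per reviewer
          prob += pulp.lpSum(x[r][p] for p in papers) <= max_load

      prob.solve()
      assignment = [(r, p) for r in reviewers for p in papers if x[r][p].value() == 1]
      print(assignment)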

  17. 3D near-infrared imaging based on a single-photon avalanche diode array sensor

    NASA Astrophysics Data System (ADS)

    Mata Pavia, Juan; Charbon, Edoardo; Wolf, Martin

    2011-07-01

    An imager for optical tomography was designed based on a detector with 128×128 single-photon pixels that included a bank of 32 time-to-digital converters. Due to the high spatial resolution and the possibility of performing time resolved measurements, a new contact-less setup has been conceived in which scanning of the object is not necessary. This enables one to perform high-resolution optical tomography with a much higher acquisition rate, which is fundamental in clinical applications. The setup has a resolution of 97 ps and operates with a laser source with an average power of 3 mW. This new imaging system generated a large amount of data that could not be processed by established methods; therefore, new concepts and algorithms were developed to take full advantage of it. Images were generated using a new reconstruction algorithm that combined general inverse problem methods with Fourier transforms in order to reduce the complexity of the problem. Simulations show that the potential resolution of the new setup is in the order of millimeters. Experiments have been performed to confirm this potential. Images derived from the measurements demonstrate that we have already reached a resolution of 5 mm.

  18. RapidIO as a multi-purpose interconnect

    NASA Astrophysics Data System (ADS)

    Baymani, Simaolhoda; Alexopoulos, Konstantinos; Valat, Sébastien

    2017-10-01

    RapidIO (http://rapidio.org/) technology is a packet-switched high-performance fabric, which has been under active development since 1997. Originally meant to be a front side bus, it developed into a system level interconnect which is today used in all 4G/LTE base stations worldwide. RapidIO is often used in embedded systems that require high reliability, low latency and scalability in a heterogeneous environment, features that are highly interesting for several use cases, such as data analytics and data acquisition (DAQ) networks. We will present the results of evaluating RapidIO in a data analytics environment, from setup to benchmark. Specifically, we will share the experience of running ROOT and Hadoop on top of RapidIO. To demonstrate the multi-purpose characteristics of RapidIO, we will also present the results of investigating RapidIO as a technology for high-speed DAQ networks using a generic multi-protocol event-building emulation tool. In addition we will present lessons learned from implementing native ports of CERN applications to RapidIO.

  19. Rapid small-scale column testing of granular activated carbon for organic micro-pollutant removal in treated domestic wastewater.

    PubMed

    Zietzschmann, F; Müller, J; Sperlich, A; Ruhl, A S; Meinel, F; Altmann, J; Jekel, M

    2014-01-01

    This study investigates the applicability of the rapid small-scale column test (RSSCT) concept for testing of granular activated carbon (GAC) for organic micro-pollutant (OMP) removal from wastewater treatment plant (WWTP) effluent. The chosen experimental setup was checked using pure water, WWTP effluent, different GAC products, and variable hydrodynamic conditions with different flow velocities and differently sized GAC, as well as different empty bed contact times (EBCTs). The setup shows satisfactory reproducibility and robustness. RSSCTs in combination with WWTP effluent are effective for comparing the OMP removal potentials of different GAC products and are a useful tool for estimating the behaviour of larger filters. Due to the potentially high competition between OMPs and bulk organics, breakthrough curves are likely to have unfavorable shapes when treating WWTP effluent. This effect can be counteracted by extending the EBCT. With respect to the strong competition observed in GAC treatment of WWTP effluent, the small organic acids and neutral substances are retained longer in the RSSCT filters and are likely to cause the majority of the observed adsorption competition with OMPs.

  20. Numerical Simulation Of Cratering Effects In Adobe

    DTIC Science & Technology

    2013-07-01

    ... Development of Material Parameters ... Problem Setup ... Parameter Adjustments ... Glossary ... dependent yield surface with the Geological Yield Surface (GEO) modeled in CTH using well-characterized adobe. By identifying key parameters that ...

  1. Synthetic depth data creation for sensor setup planning and evaluation of multi-camera multi-person trackers

    NASA Astrophysics Data System (ADS)

    Pattke, Marco; Martin, Manuel; Voit, Michael

    2017-05-01

    Tracking people with cameras in public areas is common today. However, with an increasing number of cameras it becomes harder and harder to review the data manually. Especially in safety-critical areas, automatic image exploitation could help to solve this problem. Setting up such a system can, however, be difficult because of its increased complexity. Sensor placement is critical to ensure that people are detected and tracked reliably. We try to solve this problem using a simulation framework that is able to simulate different camera setups in the desired environment, including animated characters. We combine this framework with our self-developed, distributed and scalable system for people tracking to test its effectiveness, and we can show the results of the tracking system in real time in the simulated environment.

  2. Rapid determination of Faraday rotation in optical glasses by means of secondary Faraday modulator.

    PubMed

    Sofronie, M; Elisa, M; Sava, B A; Boroica, L; Valeanu, M; Kuncser, V

    2015-05-01

    A rapid, highly sensitive method for determining the Faraday rotation of optical glasses is proposed. Starting from an experimental setup based on a Faraday rod coupled to a lock-in amplifier in the detection chain, two methodologies were developed for providing reliable results on samples presenting small and large Faraday rotations. The proposed methodologies were critically discussed and compared, via results obtained in transmission geometry, on a new series of aluminophosphate glasses with or without rare-earth doping ions. An example of how the method can be used for a rapid examination of the optical homogeneity of the sample with respect to magneto-optical effects is also provided.

  3. Needleless electrospinning with twisted wire spinneret

    NASA Astrophysics Data System (ADS)

    Holopainen, Jani; Penttinen, Toni; Santala, Eero; Ritala, Mikko

    2015-01-01

    A needleless electrospinning setup named ‘Needleless Twisted Wire Electrospinning’ was developed. The polymer solution is electrospun from the surface of a twisted wire set to a high voltage and collected on a cylindrical collector around the wire. Multiple Taylor cones are simultaneously self-formed on the downward-flowing solution. The system is robust and simple, with no moving parts aside from the syringe pump used to transport the solution to the top of the wire. The structure and process parameters of the setup and the results on the preparation of polyvinyl pyrrolidone (PVP), hydroxyapatite (HA) and bioglass fibers with the setup are presented. PVP fiber sheets with areas of 40 × 120 cm² and masses up to 1.15 g were prepared. High production rates of 5.23 g/h and 1.40 g/h were achieved for PVP and HA, respectively. The major limiting factor of the setup is drying of the polymer solution on the wire during the electrospinning process, which eventually forces an interruption of the process to clean the wire. Possible solutions to this problem and other ways to develop the setup are discussed. The presented system provides a simple way to increase the production rate and the area of the fiber sheet compared with conventional needle electrospinning.

  4. Set-up of a decision support system to support sustainable development of the Laguna de Bay, Philippines.

    PubMed

    Nauta, Tjitte A; Bongco, Alicia E; Santos-Borja, Adelina C

    2003-01-01

    Over recent decades, population expansion, deforestation, land conversion, urbanisation, intense fisheries and industrialisation have produced massive changes in the Laguna de Bay catchment, Philippines. The resulting problems include rapid siltation of the lake, eutrophication, inputs of toxics, flooding problems and loss of biodiversity. Rational and systematic resolution of conflicting water use and water allocation interests is now urgently needed in order to ensure sustainable use of the water resources. With respect to the competing and conflicting pressures on the water resources, the Laguna Lake Development Authority (LLDA) needs to achieve comprehensive management and development of the area. In view of these problems and needs, the Government of the Netherlands funded a two-year project entitled 'Sustainable Development of the Laguna de Bay Environment'. A comprehensive tool has been developed to support decision-making at catchment level. This consists of an ArcView GIS-database linked to a state-of-the-art modelling suite, including hydrological and waste load models for the catchment area and a three-dimensional hydrodynamic and water quality model (Delft3D) linked to a habitat evaluation module for the lake. In addition, MS Office based tools to support a stakeholder analysis and financial and economic assessments have been developed. The project also focused on technical studies relating to dredging, drinking water supply and infrastructure works. These aimed to produce technically and economically feasible solutions to water quantity and quality problems. The paper also presents the findings of a study on the development of polder islands in the Laguna de Bay, addressing the water quantity and quality problems and focusing on the application of the decision support system.

  5. Robust Linuron Degradation in On-Farm Biopurification Systems Exposed to Sequential Environmental Changes

    PubMed Central

    Sniegowski, Kristel; Bers, Karolien; Ryckeboer, Jaak; Jaeken, Peter; Spanoghe, Pieter; Springael, Dirk

    2011-01-01

    On-farm biopurification systems (BPS) treat pesticide-contaminated wastewater of farms through biodegradation. Adding pesticide-primed soil has been shown to be beneficial for the establishment of pesticide-degrading populations in BPS. However, no data exist on the response of pesticide-degrading microbiota, either endogenous or introduced with pesticide-primed soil, when BPS are exposed to expected less favorable environmental conditions like cold periods, drought periods, and periods without a pesticide supply. Therefore, the response of microbiota mineralizing the herbicide linuron in BPS microcosm setups inoculated either with a linuron-primed soil or a nonprimed soil to a sequence of such less favorable conditions was examined. A period without linuron supply or a drought period reduced the size of the linuron-mineralizing community in both setups. The most severe effect was recorded for the setup containing nonprimed soil, in which stopping the linuron supply decreased the linuron degradation capacity to nondetectable levels. In both systems, linuron mineralization rapidly reestablished after conventional operation conditions were restored. A cold period and feeding with a pesticide mixture did not affect linuron mineralization. The changes in the linuron-mineralizing capacity in microcosms containing primed soil were associated with the dynamics of a particular Variovorax phylotype that previously had been associated with linuron mineralization. This study suggests that the pesticide-mineralizing community in BPS is robust in stress situations imposed by changes in environmental conditions expected to occur on farms. Moreover, it suggests that, in cases where effects do occur, recovery is rapid after restoring conventional operation conditions. PMID:21803897

  6. Multiparametric Experiments and Multiparametric Setups for Metering Explosive Eruptions

    NASA Astrophysics Data System (ADS)

    Taddeucci, J.; Scarlato, P.; Del Bello, E.

    2016-12-01

    Explosive eruptions are multifaceted processes best studied by integrating a variety of observational perspectives. This need marries well with the continuous stream of new means that technological progress provides to volcanologists to parameterize these eruptions. For decades, new technologies have been tested and integrated approaches have been attempted during so-called multiparametric experiments, i.e., short field campaigns with many different instruments (and scientists) targeting natural laboratory volcanoes. Recently, portable multiparametric setups have been developed, including a few highly complementary instruments to be rapidly deployed at any erupting volcano. Multiparametric experiments and setups share most of their challenges, like technical issues, site logistics, and data processing and interpretation. Our FAMoUS (FAst MUltiparametric Setup) setup pivots around coupled high-speed imaging (visible and thermal) and acoustic (infrasonic to audible) recording, plus occasional seismic recording and sample collection. FAMoUS provided new insights on pyroclast ejection and settling and on jet noise dynamics at volcanoes worldwide. In the last years we conducted a series of BAcIO (Broadband ACquisition and Imaging Operation) experiments at Stromboli (Italy). These hosted state-of-the-art and prototypal eruption-metering technologies, including: multiple high-speed high-definition cameras for 3-D imaging; combined visible-infrared-ultraviolet imaging; in-situ and remote gas measurements; UAV aerial surveys; Doppler radar; and microphone arrays. This combined approach provides new understanding of the fundamental controls of Strombolian-style activity and allows for crucial cross-validation of instruments and techniques. Several documentary expeditions participated in the BAcIO, attesting to its tremendous potential for public outreach. Finally, sharing field work promotes interdisciplinary discussions and cooperation like nothing in the world.

  7. A hybrid flowshop scheduling model considering dedicated machines and lot-splitting for the solar cell industry

    NASA Astrophysics Data System (ADS)

    Wang, Li-Chih; Chen, Yin-Yann; Chen, Tzu-Li; Cheng, Chen-Yang; Chang, Chin-Wei

    2014-10-01

    This paper studies a solar cell industry scheduling problem, which is similar to traditional hybrid flowshop scheduling (HFS). In a typical HFS problem, the allocation of machine resources for each order should be scheduled in advance. However, the challenge in solar cell manufacturing is that the number of machines can be adjusted dynamically to complete a job. An optimal production scheduling model is developed to explore these issues, considering practical characteristics such as the hybrid flowshop, a parallel machine system, dedicated machines, and sequence-independent and sequence-dependent job setup times. The objective of this model is to minimise the makespan and to decide the processing sequence of the orders/lots in each stage, the lot-splitting decisions for the orders, and the number of machines used to satisfy the demands in each stage. From the experimental results, lot-splitting has a significant effect on shortening the makespan, and the improvement is influenced by the processing time and the setup time of the orders. Therefore, the threshold point for improving the makespan can be identified. In addition, the model also indicates that more lot-splitting, that is, greater flexibility in allocating orders/lots to machines, results in better scheduling performance.
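
    To see why lot-splitting shortens the makespan only up to the point where setups start to dominate, consider a deliberately simple two-machine, single-order example with invented numbers (this is an illustration, not the paper's model):

      # One order of 120 units, unit processing time 1 min, setup time 30 min per lot,
      # two identical parallel machines at a single stage.
      units, unit_time, setup = 120, 1.0, 30.0

      def makespan(num_lots, machines=2):
          """Split the order into equal lots, assign lots to machines round-robin,
          and return the completion time of the busiest machine."""
          lot_units = units / num_lots
          lot_time = setup + lot_units * unit_time
          lots_per_machine = -(-num_lots // machines)          # ceiling division
          return lots_per_machine * lot_time

      for n in (1, 2, 4, 8):
          print(f"{n} lot(s): makespan = {makespan(n):.0f} min")
      # 1 lot : 150 min (only one machine can be used)
      # 2 lots:  90 min (both machines run in parallel)
      # 4 lots: 120 min
      # 8 lots: 180 min (setup time starts to dominate)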

  8. Combined Tensor Fitting and TV Regularization in Diffusion Tensor Imaging Based on a Riemannian Manifold Approach.

    PubMed

    Baust, Maximilian; Weinmann, Andreas; Wieczorek, Matthias; Lasser, Tobias; Storath, Martin; Navab, Nassir

    2016-08-01

    In this paper, we consider combined TV denoising and diffusion tensor fitting in DTI using the affine-invariant Riemannian metric on the space of diffusion tensors. Instead of first fitting the diffusion tensors and then denoising them, we define a suitable TV-type energy functional which incorporates the measured DWIs (using an inverse problem setup) and which measures the nearness of neighboring tensors in the manifold. To minimize this functional, we propose generalized forward-backward splitting algorithms which combine an explicit and several implicit steps performed on a decomposition of the functional. We validate the performance of the derived algorithms on synthetic and real DTI data. In particular, we work on real 3D data. To our knowledge, the present paper describes the first approach to TV regularization in a combined manifold and inverse problem setup.

  9. The quasi-optimality criterion in the linear functional strategy

    NASA Astrophysics Data System (ADS)

    Kindermann, Stefan; Pereverzyev, Sergiy, Jr.; Pilipenko, Andrey

    2018-07-01

    The linear functional strategy for the regularization of inverse problems is considered. For selecting the regularization parameter therein, we propose the heuristic quasi-optimality principle and some modifications including the smoothness of the linear functionals. We prove convergence rates for the linear functional strategy with these heuristic rules, taking into account the smoothness of the solution and the functionals and imposing a structural condition on the noise. Furthermore, we study these noise conditions in both a deterministic and a stochastic setup and verify that for mildly ill-posed problems and Gaussian noise these conditions are satisfied almost surely, whereas in the severely ill-posed case and in a similar setup the corresponding noise condition fails to hold. Moreover, we propose an aggregation method for adaptively optimizing the parameter choice rule by making use of improved rates for linear functionals. Numerical results indicate that this method yields better results than the standard heuristic rule.
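
    As a generic illustration of the quasi-optimality rule (applied here to plain Tikhonov regularization of a toy problem rather than to the paper's linear functional strategy, with invented problem sizes and grids), the parameter is picked from a geometric grid by minimizing the distance between consecutive regularized solutions:

      import numpy as np

      rng = np.random.default_rng(0)

      # Mildly ill-posed toy problem: smoothing kernel matrix, noisy data.
      n = 60
      t = np.linspace(0, 1, n)
      A = np.exp(-30.0 * (t[:, None] - t[None, :]) ** 2) / n
      x_true = np.sin(2 * np.pi * t)
      y = A @ x_true + 1e-3 * rng.standard_normal(n)

      def tikhonov(alpha):
          """Tikhonov-regularized solution x_alpha = (A^T A + alpha I)^{-1} A^T y."""
          return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

      # Quasi-optimality: on a geometric grid alpha_k = alpha_0 * q^k, pick the k
      # minimizing ||x_{alpha_{k+1}} - x_{alpha_k}|| (a noise-level-free heuristic).
      alphas = 1e-1 * 0.7 ** np.arange(40)
      solutions = [tikhonov(a) for a in alphas]
      diffs = [np.linalg.norm(solutions[k + 1] - solutions[k]) for k in range(len(alphas) - 1)]
      k_star = int(np.argmin(diffs))
      print("chosen alpha:", alphas[k_star],
            "reconstruction error:", np.linalg.norm(solutions[k_star] - x_true))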

  10. Microwave-mediated magneto-optical trap for polar molecules

    NASA Astrophysics Data System (ADS)

    Dizhou, Xie; Wenhao, Bu; Bo, Yan

    2016-05-01

    Realizing a molecular magneto-optical trap has been a dream for cold molecular physicists for a long time. However, due to the complex energy levels and the small effective Lande g-factor of the excited states, the traditional magneto-optical trap (MOT) scheme does not work very well for polar molecules. One way to overcome this problem is the switching MOT, which requires very fast switching of both the magnetic field and the laser polarizations. Switching laser polarizations is relatively easy, but fast switching of the magnetic field is experimentally challenging. Here we propose an alternative approach, the microwave-mediated MOT, which requires a slight change of the current experimental setup to solve the problem. We calculate the MOT force and compare it with the traditional MOT and the switching MOT scheme. The results show that we can operate a good MOT with this simple setup. Project supported by the Fundamental Research Funds for the Central Universities of China.

  11. Numerical solution of a coefficient inverse problem with multi-frequency experimental raw data by a globally convergent algorithm

    NASA Astrophysics Data System (ADS)

    Nguyen, Dinh-Liem; Klibanov, Michael V.; Nguyen, Loc H.; Kolesov, Aleksandr E.; Fiddy, Michael A.; Liu, Hui

    2017-09-01

    We analyze in this paper the performance of a newly developed globally convergent numerical method for a coefficient inverse problem for the case of multi-frequency experimental backscatter data associated with a single incident wave. These data were collected using a microwave scattering facility at the University of North Carolina at Charlotte. The challenges for the inverse problem under consideration come not only from its high nonlinearity and severe ill-posedness but also from the facts that the amount of measured data is minimal and that these raw data are contaminated by a significant amount of noise, due to a non-ideal experimental setup. This setup is motivated by our target application in detecting and identifying explosives. We show in this paper how the raw data can be preprocessed and successfully inverted using our inversion method. More precisely, we are able to reconstruct the dielectric constants and the locations of the scattering objects with good accuracy, without using any advanced a priori knowledge of their physical and geometrical properties.

  12. Automation and results of Adjacent Band Emission testing

    DOT National Transportation Integrated Search

    2015-03-01

    Problem Statement : Multiple groups conduct tests in various ways - Outcomes vary based on test setup and assumptions - No standard has been established to conduct such tests - Spectrum is scarce and the need for compliance testing will only increase...

  13. A Python tool to set up relative free energy calculations in GROMACS

    PubMed Central

    Klimovich, Pavel V.; Mobley, David L.

    2015-01-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper [14], recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge [16]. Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations. PMID:26487189

  14. A Mixed Integer Linear Programming Approach to Electrical Stimulation Optimization Problems.

    PubMed

    Abouelseoud, Gehan; Abouelseoud, Yasmine; Shoukry, Amin; Ismail, Nour; Mekky, Jaidaa

    2018-02-01

    Electrical stimulation optimization is a challenging problem. Even when a single region is targeted for excitation, the problem remains a constrained multi-objective optimization problem. The constrained nature of the problem results from safety concerns, while its multiple objectives originate from the requirement that non-targeted regions should remain unaffected. In this paper, we propose a mixed integer linear programming formulation that can successfully address the challenges facing this problem. Moreover, the proposed framework can conclusively check the feasibility of the stimulation goals. This helps researchers to avoid wasting time trying to achieve goals that are impossible under a chosen stimulation setup. The superiority of the proposed framework over alternative methods is demonstrated through simulation examples.

  15. PyNCS: a microkernel for high-level definition and configuration of neuromorphic electronic systems

    PubMed Central

    Stefanini, Fabio; Neftci, Emre O.; Sheik, Sadique; Indiveri, Giacomo

    2014-01-01

    Neuromorphic hardware offers an electronic substrate for the realization of asynchronous event-based sensory-motor systems and large-scale spiking neural network architectures. In order to characterize these systems, configure them, and carry out modeling experiments, it is often necessary to interface them to workstations. The software used for this purpose typically consists of a large monolithic block of code which is highly specific to the hardware setup used. While this approach can lead to highly integrated hardware/software systems, it hampers the development of modular and reconfigurable infrastructures thus preventing a rapid evolution of such systems. To alleviate this problem, we propose PyNCS, an open-source front-end for the definition of neural network models that is interfaced to the hardware through a set of Python Application Programming Interfaces (APIs). The design of PyNCS promotes modularity, portability and expandability and separates implementation from hardware description. The high-level front-end that comes with PyNCS includes tools to define neural network models as well as to create, monitor and analyze spiking data. Here we report the design philosophy behind the PyNCS framework and describe its implementation. We demonstrate its functionality with two representative case studies, one using an event-based neuromorphic vision sensor, and one using a set of multi-neuron devices for carrying out a cognitive decision-making task involving state-dependent computation. PyNCS, already applicable to a wide range of existing spike-based neuromorphic setups, will accelerate the development of hybrid software/hardware neuromorphic systems, thanks to its code flexibility. The code is open-source and available online at https://github.com/inincs/pyNCS. PMID:25232314

  16. Optical Imaging and Radiometric Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.

    2010-01-01

    OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented using MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digital sampled versions of functions ranging from Zernike polynomials to combination of sinusoidal wave functions in 2D, to functions generated from a spatial frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge diffusion modulation transfer function (MTF).
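
    A minimal numpy sketch of the Fourier-optics PSF step described above, using a circular pupil and a Zernike-defocus-shaped OPD map with illustrative numbers; this is a generic textbook calculation, not OPTOOL code:

      import numpy as np

      n = 256
      x = np.linspace(-1, 1, n)
      xx, yy = np.meshgrid(x, x)
      rr = np.hypot(xx, yy)
      pupil = (rr <= 1.0).astype(float)               # unit-radius circular aperture

      wavelength = 1.0e-6                              # 1 um, illustrative
      defocus = 50e-9 * (2.0 * rr**2 - 1.0) * pupil    # Zernike-defocus-shaped OPD map, metres

      def psf_from(opd):
          """Fraunhofer PSF: squared modulus of the FFT of the complex pupil function."""
          field = pupil * np.exp(1j * 2.0 * np.pi * opd / wavelength)
          amp = np.fft.fftshift(np.fft.fft2(field, s=(4 * n, 4 * n)))   # zero-pad for finer sampling
          return np.abs(amp) ** 2

      psf_aberrated = psf_from(defocus)
      psf_perfect = psf_from(np.zeros_like(defocus))
      print("approximate Strehl ratio:", psf_aberrated.max() / psf_perfect.max())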

  17. PyNCS: a microkernel for high-level definition and configuration of neuromorphic electronic systems.

    PubMed

    Stefanini, Fabio; Neftci, Emre O; Sheik, Sadique; Indiveri, Giacomo

    2014-01-01

    Neuromorphic hardware offers an electronic substrate for the realization of asynchronous event-based sensory-motor systems and large-scale spiking neural network architectures. In order to characterize these systems, configure them, and carry out modeling experiments, it is often necessary to interface them to workstations. The software used for this purpose typically consists of a large monolithic block of code which is highly specific to the hardware setup used. While this approach can lead to highly integrated hardware/software systems, it hampers the development of modular and reconfigurable infrastructures thus preventing a rapid evolution of such systems. To alleviate this problem, we propose PyNCS, an open-source front-end for the definition of neural network models that is interfaced to the hardware through a set of Python Application Programming Interfaces (APIs). The design of PyNCS promotes modularity, portability and expandability and separates implementation from hardware description. The high-level front-end that comes with PyNCS includes tools to define neural network models as well as to create, monitor and analyze spiking data. Here we report the design philosophy behind the PyNCS framework and describe its implementation. We demonstrate its functionality with two representative case studies, one using an event-based neuromorphic vision sensor, and one using a set of multi-neuron devices for carrying out a cognitive decision-making task involving state-dependent computation. PyNCS, already applicable to a wide range of existing spike-based neuromorphic setups, will accelerate the development of hybrid software/hardware neuromorphic systems, thanks to its code flexibility. The code is open-source and available online at https://github.com/inincs/pyNCS.

  18. Initialization and Setup of the Coastal Model Test Bed: STWAVE

    DTIC Science & Technology

    2017-01-01

    ... Laboratory (CHL) Field Research Facility (FRF) in Duck, NC. The improved evaluation methodology will promote rapid enhancement of model capability and focus ... (Blanton 2008) study. This regional digital elevation model (DEM), with a cell size of 10 m, was generated from numerous datasets collected at different ... INFORMATION: For additional information, contact Spicer Bak, Coastal Observation and Analysis Branch, Coastal and Hydraulics Laboratory, 1261 Duck Road

  19. Quality assurance for kilo- and megavoltage in-room imaging and localization for off- and online setup error correction.

    PubMed

    Balter, James M; Antonuk, Larry E

    2008-01-01

    In-room radiography is not a new concept for image-guided radiation therapy. Rapid advances in technology, however, have made this positioning method convenient, and thus radiograph-based positioning has propagated widely. The paradigms for quality assurance of radiograph-based positioning include imager performance, systems integration, infrastructure, procedure documentation and testing, and support for positioning strategy implementation.

  20. Modelling rapid subsurface flow at the hillslope scale with explicit representation of preferential flow paths

    NASA Astrophysics Data System (ADS)

    Wienhöfer, J.; Zehe, E.

    2012-04-01

    Rapid lateral flow processes via preferential flow paths are widely accepted to play a key role in the rainfall-runoff response of temperate humid headwater catchments. A quantitative description of these processes, however, is still a major challenge in hydrological research, not least because detailed information about the architecture of subsurface flow paths is often impossible to obtain at a natural site without disturbing the system. Our study combines physically based modelling and field observations with the objective of better understanding how flow network configurations influence the hydrological response of hillslopes. The system under investigation is a forested hillslope with a small perennial spring at the study area Heumöser, a headwater catchment of the Dornbirnerach in Vorarlberg, Austria. In-situ point measurements of field-saturated hydraulic conductivity and dye staining experiments at the plot scale revealed that shrinkage cracks and biogenic macropores function as preferential flow paths in the fine-textured soils of the study area, and these preferential flow structures were active in fast subsurface transport of artificial tracers at the hillslope scale. For modelling of water and solute transport, we followed the approach of implementing preferential flow paths as spatially explicit structures of high hydraulic conductivity and low retention within the 2D process-based model CATFLOW. Many potential configurations of the flow path network were generated as realisations of a stochastic process informed by macropore characteristics derived from the plot-scale observations. Together with different realisations of soil hydraulic parameters, this approach results in a Monte Carlo study. The model setups were used for short-term simulation of a sprinkling and tracer experiment, and the results were evaluated against measured discharges and tracer breakthrough curves. Although both criteria were used for model evaluation, several model setups still produced acceptable matches to the observed behaviour. These setups were selected for long-term simulation, the results of which were compared against water level measurements at two piezometers along the hillslope and the integral discharge response of the spring in order to reject some non-behavioural model setups and further reduce equifinality. The results of this study indicate that process-based modelling can provide a means to distinguish preferential flow networks at the hillslope scale when complementary measurements are available to constrain the range of behavioural model setups. These models can further be employed as a virtual reality to investigate the characteristics of flow path architectures and to explore effective parameterisations for larger-scale applications.
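
    In highly simplified form, the Monte Carlo screening described above looks like the following sketch: random parameter sets standing in for flow-path configurations are generated, a toy two-path runoff model is run for each, and only the setups whose simulated discharge matches the (here synthetic) observations within a tolerance are kept as behavioural. This is a generic illustration, not the CATFLOW workflow; the model, parameters and thresholds are all invented.

      import numpy as np

      rng = np.random.default_rng(42)

      def simulate(rain, k_matrix, f_macro):
          """Toy two-path runoff model: a fast (macropore) and a slow (matrix) linear reservoir."""
          q = np.zeros_like(rain)
          s_fast, s_slow = 0.0, 0.0
          for i, p in enumerate(rain):
              s_fast += f_macro * p
              s_slow += (1.0 - f_macro) * p
              q_fast, q_slow = 0.5 * s_fast, k_matrix * s_slow   # fast path drains quickly
              s_fast -= q_fast
              s_slow -= q_slow
              q[i] = q_fast + q_slow
          return q

      rain = rng.exponential(2.0, size=200) * (rng.random(200) < 0.3)   # intermittent rainfall
      q_obs = simulate(rain, k_matrix=0.05, f_macro=0.3)                # synthetic "observations"
      q_obs = q_obs + 0.02 * q_obs.std() * rng.standard_normal(q_obs.size)

      # Monte Carlo ensemble of candidate setups; keep only behavioural ones (NSE above threshold).
      behavioural = []
      for _ in range(2000):
          k, f = rng.uniform(0.01, 0.2), rng.uniform(0.0, 0.8)
          q_sim = simulate(rain, k, f)
          nse = 1.0 - np.sum((q_sim - q_obs) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)
          if nse > 0.9:
              behavioural.append((k, f, nse))

      print(len(behavioural), "behavioural setups out of 2000")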

  1. Quantitative filter technique measurements of spectral light absorption by aquatic particles using a portable integrating cavity absorption meter (QFT-ICAM).

    PubMed

    Röttgers, Rüdiger; Doxaran, David; Dupouy, Cecile

    2016-01-25

    The accurate determination of light absorption coefficients of particles in water, especially in very oligotrophic oceanic areas, is still a challenging task. Concentrating aquatic particles on a glass fiber filter and using the Quantitative Filter Technique (QFT) is a common practice. Its routine application is limited by the necessary use of high performance spectrophotometers, distinct problems induced by the strong scattering of the filters and artifacts induced by freezing and storing samples. Measurements of the sample inside a large integrating sphere reduce scattering effects and direct field measurements avoid artifacts due to sample preservation. A small, portable Integrating Cavity Absorption Meter setup (QFT-ICAM) is presented that allows rapid measurements of a sample filter. The measurement technique takes into account artifacts due to chlorophyll-a fluorescence. The QFT-ICAM is shown to be highly comparable to similar measurements in laboratory spectrophotometers, in terms of accuracy, precision, and path length amplification effects. No spectral artifacts were observed when compared to measurement of samples in suspension, whereas freezing and storing of sample filters induced small losses of water-soluble pigments (probably phycoerythrins). Remaining problems in determining the particulate absorption coefficient with the QFT-ICAM are strong sample-to-sample variations of the path length amplification, as well as fluorescence by pigments that is emitted in a different spectral region than that of chlorophyll-a.

  2. Exploring New Physics Frontiers Through Numerical Relativity.

    PubMed

    Cardoso, Vitor; Gualtieri, Leonardo; Herdeiro, Carlos; Sperhake, Ulrich

    2015-01-01

    The demand to obtain answers to highly complex problems within strong-field gravity has been met with significant progress in the numerical solution of Einstein's equations - along with some spectacular results - in various setups. We review techniques for solving Einstein's equations in generic spacetimes, focusing on fully nonlinear evolutions but also on how to benchmark those results with perturbative approaches. The results address problems in high-energy physics, holography, mathematical physics, fundamental physics, astrophysics and cosmology.

  3. Rapid Material Appearance Acquisition Using Consumer Hardware

    PubMed Central

    Filip, Jiří; Vávra, Radomír; Krupička, Mikuláš

    2014-01-01

    A photo-realistic representation of material appearance can be achieved by means of a bidirectional texture function (BTF) capturing a material’s appearance for varying illumination, viewing directions, and spatial pixel coordinates. BTF captures many non-local effects in material structure such as inter-reflections, occlusions, shadowing, or scattering. The acquisition of BTF data is usually time- and resource-intensive due to the high dimensionality of BTF data. This results in expensive, complex measurement setups and/or excessively long measurement times. We propose an approximate BTF acquisition setup based on a simple, affordable mechanical gantry containing a consumer camera and two LED lights. It captures a very limited subset of material surface images by shooting several video sequences. A psychophysical study comparing captured and reconstructed data with the reference BTFs of seven tested materials revealed that results of our method show a promising visual quality. Speed of the setup has been demonstrated on measurement of human skin and on measurement and modeling of a glue desiccation time-varying process. As it allows for fast, inexpensive acquisition of approximate BTFs, this method can be beneficial to visualization applications demanding less accuracy, where BTF utilization has previously been limited. PMID:25340451

  4. Contributions to the problem of piezoelectric accelerometer calibration. [using lock-in voltmeter

    NASA Technical Reports Server (NTRS)

    Jakab, I.; Bordas, A.

    1974-01-01

    After discussing the principal calibration methods for piezoelectric accelerometers, an experimental setup for accelerometer calibration by the reciprocity method is described. It is shown how the use of a lock-in voltmeter eliminates errors due to viscous damping and electrical loading.

  5. Videoconferencing On-Line: Enhancing Communication over the Internet.

    ERIC Educational Resources Information Center

    Fetterman, David M.

    1996-01-01

    Discusses the development and use of Internet videoconferencing to aid collaboration among college faculty on campus and throughout the world, to facilitate consultation, and to improve teaching. The author examines the hardware and software needed, the setup process, and the difficulties and problem areas. (GR)

  6. UMTS rapid response real-time seismic networks: implementation and strategies at INGV

    NASA Astrophysics Data System (ADS)

    Govoni, Aladino; Margheriti, Lucia; Moretti, Milena; Lauciani, Valentino; Sensale, Gianpaolo; Bucci, Augusto; Criscuoli, Fabio

    2015-04-01

    The benefits of portable real-time seismic networks are several and well known. During the management of a temporary experiment, the real-time data make it possible to detect and rapidly fix problems with power supply, time synchronization and disk failures and, most important, degradation of seismic signal quality due to unexpected noise sources or sensor alignment/tampering. This usually minimizes field maintenance trips and maximizes both the quantity and the quality of the acquired data. When the area of the temporary experiment is not well monitored by the local permanent network, the real-time data from the temporary experiment can be fed to the permanent network monitoring system, greatly improving both the real-time hypocentral locations and the final revised bulletin. All these benefits also apply in case of seismic crises, when rapid-deployment stations can significantly contribute to the aftershock analysis. Nowadays data transmission using meshed radio networks or satellite systems is not a big technological problem for a permanent seismic network, where each site is optimized for the device power consumption and is usually installed by specialized technicians who can configure transmission devices and align antennas. This is not usually practical for temporary networks, and especially for rapid response networks where the installation time is the main concern. These difficulties are substantially lowered using the now widespread UMTS technology for data transmission. A small (but sometimes power hungry) properly configured device with an omnidirectional antenna must be added to the station assembly. All setups are usually configured before deployment, and this allows for easy installation also by untrained personnel. We describe here the implementation of a UMTS-based portable seismic network for both temporary experiments and rapid response applications developed at INGV. The first field experimentation of this approach dates back to the 2009 L'Aquila aftershock sequence, and since then it has been customized and refined to overcome most reliability and security issues using an industry-standard VPN architecture that avoids UMTS provider firewall problems and does not expose to the Internet the usually weak and attack-prone data acquisition ports. With this approach all the devices are protected inside a local network and the only exposed port is the VPN server's. This solution improves both the security and the bandwidth available for data transmission. While most of the experimentation has been carried out using the RefTek units of the INGV Mobile Network, this solution applies equally well to most seismic data loggers available on the market. Overall, UMTS data transmission has been used in most temporary seismic experiments and in all seismic emergencies that have occurred in Italy since 2010, and has proved to be a very cost-effective approach, with real-time data acquisition rates usually greater than 97% and all the benefits that result from the fast integration of the temporary data in the National Network monitoring system and in the EIDA data bank.

  7. An economic production model for deteriorating items and time dependent demand with rework and multiple production setups

    NASA Astrophysics Data System (ADS)

    Uthayakumar, R.; Tharani, S.

    2017-12-01

    Recently, much emphasis has been given to studying the control and maintenance of production inventories of deteriorating items. Rework is one of the main issues in reverse logistics and green supply chains, since it can reduce production cost and environmental problems. Many researchers have focused on developing rework models, but few of them have developed models for deteriorating items. For this reason, we take up productivity and rework with deterioration as the major concern of this paper. A production-inventory model for deteriorating items in which one cycle has n production setups and one rework setup (an (n, 1) policy) is considered, with stock-dependent demand in case 1 and exponential demand in case 2. An effective iterative solution procedure is developed to obtain the optimal timing so that the total cost of the system is minimized. Numerical and sensitivity analyses are discussed to examine the outcome of the proposed solution procedure.

  8. Iterated greedy algorithms to minimize the total family flow time for job-shop scheduling with job families and sequence-dependent set-ups

    NASA Astrophysics Data System (ADS)

    Kim, Ji-Su; Park, Jung-Hyeon; Lee, Dong-Ho

    2017-10-01

    This study addresses a variant of job-shop scheduling in which jobs are grouped into job families but are processed individually. The problem can be found in various industrial systems, especially in reprocessing shops of remanufacturing systems. If the reprocessing shop is of job-shop type and has component-matching requirements, it can be regarded as a job shop with job families, since the components of a product constitute a job family. In particular, sequence-dependent set-ups, in which the set-up time depends on the job just completed and the next job to be processed, are also considered. The objective is to minimize the total family flow time, where the flow time of a family is determined by the maximum among the completion times of the jobs within that family. A mixed-integer programming model is developed and two iterated greedy algorithms with different local search methods are proposed. Computational experiments were conducted on modified benchmark instances and the results are reported.
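
    A minimal Python sketch of the iterated greedy framework is given below, applied to a simplified single-machine analogue with sequence-dependent set-up times rather than the paper's job-shop/job-family setting; the job data, the destruction size d and the acceptance rule are illustrative assumptions.

    ```python
    import random

    # Illustrative iterated greedy loop (destruction / greedy construction /
    # acceptance) on a simplified single-machine problem with sequence-dependent
    # set-up times, minimizing total completion time. Instance data are made up.

    def total_completion_time(seq, proc, setup):
        t = total = 0
        prev = None
        for j in seq:
            t += (setup[prev][j] if prev is not None else 0) + proc[j]
            total += t
            prev = j
        return total

    def greedy_insert(partial, jobs, proc, setup):
        seq = list(partial)
        for j in jobs:
            best = min(range(len(seq) + 1),
                       key=lambda i: total_completion_time(seq[:i] + [j] + seq[i:], proc, setup))
            seq.insert(best, j)
        return seq

    def iterated_greedy(jobs, proc, setup, d=2, iters=200, seed=0):
        random.seed(seed)
        current = best = greedy_insert([], list(jobs), proc, setup)
        for _ in range(iters):
            removed = random.sample(current, d)                       # destruction
            partial = [j for j in current if j not in removed]
            candidate = greedy_insert(partial, removed, proc, setup)  # construction
            if total_completion_time(candidate, proc, setup) <= total_completion_time(current, proc, setup):
                current = candidate                                   # simple acceptance rule
            if total_completion_time(current, proc, setup) < total_completion_time(best, proc, setup):
                best = current
        return best

    # Small hypothetical instance: 4 jobs with a made-up set-up time matrix.
    jobs = [0, 1, 2, 3]
    proc = {0: 5, 1: 3, 2: 6, 3: 2}
    setup = {i: {j: abs(i - j) + 1 for j in jobs} for i in jobs}
    print(iterated_greedy(jobs, proc, setup))
    ```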

  9. A Simple Experimental Setup for Teaching Additive Colors with Arduino

    NASA Astrophysics Data System (ADS)

    Carvalho, Paulo Simeão; Hahn, Marcelo

    2016-04-01

    The result of mixing additive colors is always fascinating to young students. When we teach this topic to 14- to 16-year-old students, they do not usually notice that we use maximum light intensities of red (R), green (G), and blue (B) to obtain the yellow, magenta, and cyan colors in order to build the well-known additive color diagram of Fig. 1. But what about using different light intensities for R, G, and B? What colors do we get? This problem of color mixing has been discussed intensively for decades by several authors, as pointed out by Ruiz's "Color Addition and Subtraction Apps" work and the references included therein. An early LED demonstrator for additive color mixing dates back to 1985, and apps to illustrate color mixing are available online. In this work, we describe an experimental setup making use of a microcontroller device, the Arduino Uno. This setup is designed as a game in order to improve students' understanding of color mixing.
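
    As a rough companion to the abstract, the sketch below mixes hypothetical R, G and B channel intensities (0-255) in software; it only illustrates the additive rule the setup demonstrates and is not the Arduino game itself.

    ```python
    # Additive colour mixing: three channel intensities in the 0-255 range combine
    # directly into one RGB colour. All intensity values below are hypothetical.

    def mix(r, g, b):
        """Return the additive mixture of red, green and blue intensities."""
        clip = lambda x: max(0, min(255, int(x)))
        return (clip(r), clip(g), clip(b))

    print(mix(255, 255, 0))    # yellow  = R + G at full intensity
    print(mix(255, 0, 255))    # magenta = R + B at full intensity
    print(mix(0, 255, 255))    # cyan    = G + B at full intensity
    print(mix(255, 128, 0))    # partial intensities give intermediate hues (orange-ish)
    ```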

  10. Dual-color fluorescence cross-correlation spectroscopy in a single nanoaperture : towards rapid multicomponent screening at high concentrations.

    PubMed

    Wenger, Jérôme; Gérard, Davy; Lenne, Pierre-François; Rigneault, Hervé; Dintinger, José; Ebbesen, Thomas W; Boned, Annie; Conchonaud, Fabien; Marguet, Didier

    2006-12-11

    Single nanometric apertures in a metallic film are used to develop a simple and robust setup for dual-color fluorescence cross-correlation spectroscopy (FCCS) at high concentrations. While the nanoaperture concept has already proven useful for single-species analysis, its extension to the dual-color case brings interesting new features. The alignment and overlap of the two excitation beams are greatly simplified. No confocal pinhole is used, relaxing the requirement for accurate correction of chromatic aberrations. Compared to two-photon excitation, nanoapertures have the advantage of working with standard fluorophore constructs having high absorption cross-sections and well-known absorption/emission spectra. Thanks to the ultra-low volume analysed within a single aperture, fluorescence correlation analysis can be performed with single-molecule resolution at micromolar concentrations, a gain of three orders of magnitude compared to conventional setups. As applications of this technique, we follow the kinetics of an enzymatic cleavage reaction at a 2 μM DNA oligonucleotide concentration. We also demonstrate that FCCS in nanoapertures can be applied to the fast screening of a sample for dual-labeled species within a 1 s acquisition time. This offers new possibilities for rapid screening applications in biotechnology at high concentrations.
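
    A minimal numpy sketch of the underlying cross-correlation analysis is given below: it estimates G_x(tau) = <dF_g(t)·dF_r(t+tau)>/(<F_g><F_r>) from two synthetic intensity traces; the trace model and all parameters are assumptions for illustration only.

    ```python
    import numpy as np

    # Synthetic two-channel fluorescence traces: a common (dual-labelled) Poisson
    # component plus independent background in each channel. Purely illustrative.
    rng = np.random.default_rng(0)
    n = 100_000
    common = rng.poisson(2.0, n)                 # co-diffusing, dual-labelled species
    f_green = common + rng.poisson(5.0, n)       # green channel trace
    f_red = common + rng.poisson(5.0, n)         # red channel trace

    def cross_correlation(a, b, max_lag=200):
        """Estimate G_x(tau) = <da(t) db(t+tau)> / (<a><b>) for tau = 1..max_lag."""
        da, db = a - a.mean(), b - b.mean()
        denom = a.mean() * b.mean()
        lags = np.arange(1, max_lag + 1)
        g = np.array([np.mean(da[:-lag] * db[lag:]) for lag in lags]) / denom
        return lags, g

    lags, g_x = cross_correlation(f_green, f_red)
    print(g_x[:5])   # positive amplitude indicates co-diffusing, dual-labelled species
    ```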

  11. Learning overcomplete representations from distributed data: a brief review

    NASA Astrophysics Data System (ADS)

    Raja, Haroon; Bajwa, Waheed U.

    2016-05-01

    Most of the research on dictionary learning has focused on developing algorithms under the assumption that data are available at a centralized location. But often the data are not available at a centralized location due to practical constraints like data aggregation costs, privacy concerns, etc. Using centralized dictionary learning algorithms may not be the optimal choice in such settings. This motivates the design of dictionary learning algorithms that treat the distributed nature of data as one of the problem variables. Just as in centralized settings, the distributed dictionary learning problem can be posed in more than one way depending on the problem setup. The most notable distinguishing features are the online versus batch nature of the data and the representative versus discriminative nature of the dictionaries. In this paper, several distributed dictionary learning algorithms that are designed to tackle different problem setups are reviewed. One of these algorithms is cloud K-SVD, which solves the dictionary learning problem for batch data in distributed settings. One distinguishing feature of cloud K-SVD is that it has been shown to converge to its centralized counterpart, namely, the K-SVD solution. On the other hand, no such guarantees are provided for other distributed dictionary learning algorithms. Convergence of cloud K-SVD to the centralized K-SVD solution means that problems solvable by K-SVD in centralized settings can now be solved in distributed settings with similar performance. Finally, cloud K-SVD is used as an example to show the advantages attainable by deploying distributed dictionary learning algorithms for real-world distributed datasets.
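
    For orientation, the sketch below runs only the centralized baseline using scikit-learn's alternating sparse-coding/dictionary-update scheme (not K-SVD itself, and not distributed); in a cloud K-SVD style setting the signal matrix would instead remain split across nodes, with consensus iterations replacing the pooled dictionary update. The synthetic data are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    # Centralized baseline only: learn an overcomplete dictionary from pooled
    # synthetic data generated by a sparse linear model.
    rng = np.random.default_rng(0)
    true_dict = rng.standard_normal((20, 50))                      # 20-dim signals, 50 atoms
    codes = rng.standard_normal((1000, 50)) * (rng.random((1000, 50)) < 0.05)
    signals = codes @ true_dict + 0.01 * rng.standard_normal((1000, 20))

    learner = MiniBatchDictionaryLearning(n_components=50, alpha=0.1, random_state=0)
    sparse_codes = learner.fit_transform(signals)
    print(learner.components_.shape)   # learned dictionary: (50 atoms, 20 dimensions)
    print(np.mean(sparse_codes != 0))  # average sparsity of the learned representation
    ```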

  12. Training Moldmakers for Industry.

    ERIC Educational Resources Information Center

    Allyn, Edward P.

    1978-01-01

    In 1974, in response to the critical shortage of trained moldmakers, Berkshire Community College (Massachusetts) developed the first two-year college plastic moldmaking and design associate degree curriculum in the United States. The program focuses on the problems encountered in interpreting blueprints and machine set-up instructions in industry.…

  13. IMPACT OF WATER CHEMISTRY ON MANGANESE REMOVAL DURING OXIDATION/FILTRATION TREATMENT

    EPA Science Inventory

    This is a poster showing the purpose and setup of our pilot plant experiments with manganese filtration. The focus is on the differences, effectiveness, and problems with using chlorine and potassium permanganate in oxidation/filtration. The poster will show the results and findi...

  14. Supporting Advice Sharing for Technical Problems in Residential Settings

    ERIC Educational Resources Information Center

    Poole, Erika Shehan

    2010-01-01

    Visions of future computing in residential settings often come with assumptions of seamless, well-functioning, properly configured devices and network connectivity. In the near term, however, processes of setup, maintenance, and troubleshooting are fraught with difficulties; householders regularly report these tasks as confusing, frustrating, and…

  15. Humidity measurements in passive heat and moisture exchangers applications: a critical issue.

    PubMed

    Dubini, G; Fumero, R

    2000-01-01

    A reliable, quantitative assessment of the humidification performance of passive heat and moisture exchangers in mechanically ventilated patients is still to be achieved, although relevant efforts have been made to date. One of the major problems to tackle lies in the difficulty of humidity measurements, both in vivo (during either anaesthesia or intensive care unit treatments) and in in vitro set-ups. In this paper, a review of the basic operating principles of humidity sensors as well as an analysis of their use within in vivo and in vitro tests are presented. Particular attention is devoted to the limitations arising from the specific measurement set-up, as they may affect the results notably.

  16. Measurement-device-independent quantum key distribution for Scarani-Acin-Ribordy-Gisin 04 protocol

    PubMed Central

    Mizutani, Akihiro; Tamaki, Kiyoshi; Ikuta, Rikizo; Yamamoto, Takashi; Imoto, Nobuyuki

    2014-01-01

    The measurement-device-independent quantum key distribution (MDI QKD) was proposed to make BB84 completely free from any side-channel in detectors. As in prepare & measure QKD, the use of other protocols in the MDI setting would be advantageous in some practical situations. In this paper, we consider the SARG04 protocol in the MDI setting. The prepare & measure SARG04 is proven to be able to generate a key up to two-photon emission events. In the MDI setting we show that key generation is possible from events with single- or two-photon emission by one party and single-photon emission by the other party, but the two-photon emission event by both parties cannot contribute to the key generation. In contrast to the prepare & measure SARG04 protocol, where the experimental setup is exactly the same as for BB84, the measurement setup for SARG04 in the MDI setting cannot be the same as that for BB84, since the measurement setup for BB84 in the MDI setting induces too many bit errors. To overcome this problem, we propose two alternative experimental setups, and we simulate the resulting key rate. Our study highlights the requirements that MDI QKD poses on the implementation of a variety of QKD protocols. PMID:24913431

  17. Towards simultaneous measurements of electronic and structural properties in ultra-fast x-ray free electron laser absorption spectroscopy experiments

    NASA Astrophysics Data System (ADS)

    Gaudin, J.; Fourment, C.; Cho, B. I.; Engelhorn, K.; Galtier, E.; Harmand, M.; Leguay, P. M.; Lee, H. J.; Nagler, B.; Nakatsutsumi, M.; Ozkan, C.; Störmer, M.; Toleikis, S.; Tschentscher, Th; Heimann, P. A.; Dorchies, F.

    2014-04-01

    The rapidly growing field of ultrafast science with X-ray lasers unveils atomic-scale processes with unprecedented time resolution, bringing the so-called "molecular movie" within reach. X-ray absorption spectroscopy is one of the most powerful x-ray techniques, providing both local atomic order and electronic structure when coupled with ad hoc theory. Collecting absorption spectra within a few x-ray pulses is possible only in a dispersive setup. We demonstrate ultrafast time-resolved measurements of the LIII-edge x-ray absorption near-edge spectra of irreversibly laser-excited molybdenum using an average of only a few x-ray pulses, with a signal-to-noise ratio limited only by the saturation level of the detector. The simplicity of the experimental set-up makes this technique versatile and applicable to a wide range of pump-probe experiments, particularly in the case of non-reversible processes.

  18. Laboratory grown subaerial biofilms on granite: application to the study of bioreceptivity.

    PubMed

    Vázquez-Nion, Daniel; Silva, Benita; Troiano, Federica; Prieto, Beatriz

    2017-01-01

    Simulated environmental colonisation of granite was induced under laboratory conditions in order to develop an experimental protocol for studying bioreceptivity. The experimental set-up proved suitable for producing subaerial biofilms by inoculating granite blocks with planktonic multi-species phototrophic cultures derived from natural biofilms. The ability of four different cultures to form biofilms was monitored over a three-month growth period via colour measurements, quantification of photosynthetic pigments and EPS, and CLSM observations. One of the cultures under study, which comprised several taxa including Bryophyta, Charophyta, Chlorophyta and Cyanobacteria, was particularly suitable as an inoculum, mainly because of its microbial richness, its rapid adaptability to the substratum and its high colonisation capacity. The use of this culture as an inoculum in the proposed experimental set-up to produce subaerial biofilms under laboratory conditions will contribute to standardising the protocols involved, thus enabling more objective assessment of the bioreceptivity of granite in further experiments.

  19. Towards simultaneous measurements of electronic and structural properties in ultra-fast x-ray free electron laser absorption spectroscopy experiments

    PubMed Central

    Gaudin, J.; Fourment, C.; Cho, B. I.; Engelhorn, K.; Galtier, E.; Harmand, M.; Leguay, P. M.; Lee, H. J.; Nagler, B.; Nakatsutsumi, M.; Ozkan, C.; Störmer, M.; Toleikis, S.; Tschentscher, Th; Heimann, P. A.; Dorchies, F.

    2014-01-01

    The rapidly growing field of ultrafast science with X-ray lasers unveils atomic-scale processes with unprecedented time resolution, bringing the so-called "molecular movie" within reach. X-ray absorption spectroscopy is one of the most powerful x-ray techniques, providing both local atomic order and electronic structure when coupled with ad hoc theory. Collecting absorption spectra within a few x-ray pulses is possible only in a dispersive setup. We demonstrate ultrafast time-resolved measurements of the LIII-edge x-ray absorption near-edge spectra of irreversibly laser-excited molybdenum using an average of only a few x-ray pulses, with a signal-to-noise ratio limited only by the saturation level of the detector. The simplicity of the experimental set-up makes this technique versatile and applicable to a wide range of pump-probe experiments, particularly in the case of non-reversible processes. PMID:24740172

  20. Object recognition through a multi-mode fiber

    NASA Astrophysics Data System (ADS)

    Takagi, Ryosuke; Horisaki, Ryoichi; Tanida, Jun

    2017-04-01

    We present a method of recognizing an object through a multi-mode fiber. A number of speckle patterns transmitted through a multi-mode fiber are provided to a classifier based on machine learning. We experimentally demonstrated binary classification of face and non-face targets based on this method. The measurement process of the experimental setup was random and nonlinear, because a multi-mode fiber is a typical strongly scattering medium and no reference light was used in our setup. Comparisons between three supervised learning methods, support vector machine, adaptive boosting, and neural network, are also provided. All of these learning methods achieved high accuracy rates of about 90% for the classification. The approach presented here can realize a compact and smart optical sensor. It is practically useful for medical applications, such as endoscopy. Our study also indicates a promising use of artificial intelligence, which has progressed rapidly, for reducing optical and computational costs in optical sensing systems.
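
    The comparison of the three supervised learners can be sketched with scikit-learn as below; real speckle images are replaced by random feature vectors with class-dependent statistics, so the data, feature dimension and accuracies are placeholders, not the paper's experiment.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    # Hypothetical stand-in data for "face" vs "non-face" speckle features.
    rng = np.random.default_rng(0)
    n_per_class, n_features = 300, 64
    face = rng.normal(0.2, 1.0, (n_per_class, n_features))
    non_face = rng.normal(-0.2, 1.0, (n_per_class, n_features))
    X = np.vstack([face, non_face])
    y = np.array([1] * n_per_class + [0] * n_per_class)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # Fit and score the three classifiers compared in the abstract.
    for name, clf in [("SVM", SVC()),
                      ("AdaBoost", AdaBoostClassifier()),
                      ("Neural network", MLPClassifier(max_iter=1000))]:
        clf.fit(X_tr, y_tr)
        print(name, clf.score(X_te, y_te))
    ```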

  1. Towards simultaneous measurements of electronic and structural properties in ultra-fast x-ray free electron laser absorption spectroscopy experiments

    DOE PAGES

    Gaudin, J.; Fourment, C.; Cho, B. I.; ...

    2014-04-17

    The rapidly growing field of ultrafast science with X-ray lasers unveils atomic-scale processes with unprecedented time resolution, bringing the so-called "molecular movie" within reach. X-ray absorption spectroscopy is one of the most powerful x-ray techniques, providing both local atomic order and electronic structure when coupled with ad hoc theory. Collecting absorption spectra within a few x-ray pulses is possible only in a dispersive setup. We demonstrate ultrafast time-resolved measurements of the LIII-edge x-ray absorption near-edge spectra of irreversibly laser-excited molybdenum using an average of only a few x-ray pulses, with a signal-to-noise ratio limited only by the saturation level of the detector. The simplicity of the experimental set-up makes this technique versatile and applicable to a wide range of pump-probe experiments, particularly in the case of non-reversible processes.

  2. Sequential x-ray diffraction topography at 1-BM x-ray optics testing beamline at the advanced photon source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoupin, Stanislav, E-mail: sstoupin@aps.anl.gov; Shvyd’ko, Yuri; Trakhtenberg, Emil

    2016-07-27

    We report progress on implementation and commissioning of sequential X-ray diffraction topography at 1-BM Optics Testing Beamline of the Advanced Photon Source to accommodate growing needs of strain characterization in diffractive crystal optics and other semiconductor single crystals. The setup enables evaluation of strain in single crystals in the nearly-nondispersive double-crystal geometry. Si asymmetric collimator crystals of different crystallographic orientations were designed, fabricated and characterized using in-house capabilities. Imaging the exit beam using digital area detectors permits rapid sequential acquisition of X-ray topographs at different angular positions on the rocking curve of a crystal under investigation. Results on sensitivity and spatial resolution are reported based on experiments with high-quality Si and diamond crystals. The new setup complements laboratory-based X-ray topography capabilities of the Optics group at the Advanced Photon Source.

  3. Effective bandwidth guaranteed routing schemes for MPLS traffic engineering

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Jain, Nidhi

    2001-07-01

    In this work, we present online algorithms for dynamically routing bandwidth-guaranteed label switched paths (LSPs), where LSP set-up requests (specified by a pair of ingress and egress routers as well as a bandwidth requirement) arrive one by one and there is no a priori knowledge regarding future LSP set-up requests. In addition, we consider rerouting of LSPs in this work. Rerouting of LSPs has not been well studied in previous work on LSP routing. The need for LSP rerouting arises in a number of ways: occurrence of faults (link and/or node failures), re-optimization of existing LSPs' routes to accommodate traffic fluctuation, requests with higher priorities, and so on. We formulate bandwidth-guaranteed LSP routing with rerouting capability as a multi-commodity flow problem. The solution to this problem is used as the benchmark for comparing other, computationally less costly algorithms studied in this paper. Furthermore, to utilize network resources more efficiently, we propose online routing algorithms that route bandwidth demands over multiple paths at the ingress router to satisfy customer requests while providing better service survivability. Traffic splitting and distribution over the multiple paths are carefully handled using table-based hashing schemes while the order of packets within a flow is preserved. Preliminary simulations are conducted to show the performance of different design choices and the effectiveness of the rerouting and multi-path routing algorithms in terms of LSP set-up request rejection probability and bandwidth blocking probability.
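
    A minimal sketch of the table-based hashing idea for multi-path traffic splitting follows; path names, weights and the flow tuple are illustrative assumptions, and the point is only that all packets of one flow hash to the same path, preserving their order.

    ```python
    import hashlib

    # Weighted table of candidate paths: each flow hashes to one table entry,
    # so every packet of that flow takes the same path and order is preserved.
    paths = ["LSP-A", "LSP-B", "LSP-C"]
    weights = [2, 1, 1]                       # e.g. LSP-A should carry half the flows
    table = [p for p, w in zip(paths, weights) for _ in range(w)]

    def select_path(src_ip, dst_ip, src_port, dst_port, proto="tcp"):
        key = f"{src_ip}|{dst_ip}|{src_port}|{dst_port}|{proto}".encode()
        idx = int(hashlib.md5(key).hexdigest(), 16) % len(table)
        return table[idx]

    print(select_path("10.0.0.1", "10.0.1.9", 34567, 80))
    print(select_path("10.0.0.1", "10.0.1.9", 34567, 80))   # same flow -> same path
    print(select_path("10.0.0.2", "10.0.1.9", 40000, 80))   # different flow may differ
    ```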

  4. Optical solver of combinatorial problems: nanotechnological approach.

    PubMed

    Cohen, Eyal; Dolev, Shlomi; Frenkel, Sergey; Kryzhanovsky, Boris; Palagushkin, Alexandr; Rosenblit, Michael; Zakharov, Victor

    2013-09-01

    We present an optical computing system to solve NP-hard problems. As nano-optical computing is a promising avenue for the next generation of computers performing parallel computations, we investigate the application of submicron, or even subwavelength, computing device designs. The system utilizes a setup of exponentially sized masks with exponential space complexity, produced in a polynomial-time preprocessing step. The masks are later used to solve the problem in polynomial time. The size of the masks is reduced to nanoscale density. Simulations were done to choose a proper design, and actual implementations show the feasibility of such a system.

  5. Automatically assessing properties of dynamic cameras for camera selection and rapid deployment of video content analysis tasks in large-scale ad-hoc networks

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; van Rest, Jeroen H. C.; ten Hove, Johan-Martijn; ter Haar, Frank B.; Burghouts, Gertjan J.

    2017-10-01

    Video analytics is essential for managing large quantities of raw data that are produced by video surveillance systems (VSS) for the prevention, repression and investigation of crime and terrorism. Analytics is highly sensitive to changes in the scene and to changes in the optical chain, so a VSS with analytics needs careful configuration and prompt maintenance to avoid false alarms. However, there is a trend from static VSS consisting of fixed CCTV cameras towards more dynamic VSS deployments over public/private multi-organization networks, consisting of a wider variety of visual sensors, including pan-tilt-zoom (PTZ) cameras, body-worn cameras and cameras on moving platforms. This trend will lead to more dynamic scenes and more frequent changes in the optical chain, creating structural problems for analytics. If these problems are not adequately addressed, analytics will not be able to continue to meet end users' developing needs. In this paper, we present a three-part solution for managing the performance of complex analytics deployments. The first part is a register containing meta data describing relevant properties of the optical chain, such as intrinsic and extrinsic calibration, and parameters of the scene such as lighting conditions or measures of scene complexity (e.g. number of people). The second part frequently assesses these parameters in the deployed VSS, stores changes in the register, and signals relevant changes in the setup to the VSS administrator. The third part uses the information in the register to dynamically configure analytics tasks based on VSS operator input. In order to support the feasibility of this solution, we give an overview of related state-of-the-art technologies for autocalibration (self-calibration), scene recognition and lighting estimation in relation to person detection. The presented solution allows for rapid and robust deployment of Video Content Analysis (VCA) tasks in large-scale ad-hoc networks.

  6. A Python tool to set up relative free energy calculations in GROMACS.

    PubMed

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.

  7. Structured grid technology to enable flow simulation in an integrated system environment

    NASA Astrophysics Data System (ADS)

    Remotigue, Michael Gerard

    An application-driven Computational Fluid Dynamics (CFD) environment needs flexible and general tools to effectively solve complex problems in a timely manner. In addition, reusable, portable, and maintainable specialized libraries will aid in rapidly developing integrated systems or procedures. The presented structured grid technology enables flow simulation for complex geometries by addressing grid generation, grid decomposition/solver setup, solution, and interpretation. Grid generation is accomplished with the graphical, arbitrarily-connected, multi-block structured grid generation software system (GUM-B) developed and presented here. GUM-B is an integrated system comprising specialized libraries for the graphical user interface and graphical display coupled with a solid-modeling data structure that utilizes a structured grid generation library and a geometric library based on Non-Uniform Rational B-Splines (NURBS). A presented modification of the solid-modeling data structure provides the capability for arbitrarily-connected regions between the grid blocks. The presented grid generation library provides algorithms that are reliable and accurate. GUM-B has been utilized to generate numerous structured grids for complex geometries in hydrodynamics, propulsors, and aerodynamics. The versatility of the libraries that compose GUM-B is also displayed in a prototype to automatically regenerate a grid for a free-surface solution. Grid decomposition and solver setup are accomplished with the graphical grid manipulation and repartition software system (GUMBO) developed and presented here. GUMBO is an integrated system comprising specialized libraries for the graphical user interface and graphical display coupled with a structured grid-tools library. The described functions within the grid-tools library reduce the possibility of human error during decomposition and setup for the numerical solver by accounting for boundary conditions and connectivity. GUMBO is linked with a flow solver interface, to the parallel UNCLE code, to provide load balancing tools and solver setup. Weeks of boundary condition and connectivity specification and validation have been reduced to hours. The UNCLE flow solver is utilized for the solution of the flow field. To accelerate convergence toward a quick engineering answer, a full multigrid (FMG) approach coupled with UNCLE, which is a full approximation scheme (FAS), is presented. The prolongation operators used in the FMG-FAS method are compared. The procedure is demonstrated on a marine propeller in incompressible flow. Interpretation of the solution is accomplished by vortex feature detection. Regions of "Intrinsic Swirl" are located by interrogating the velocity gradient tensor for complex eigenvalues. The "Intrinsic Swirl" parameter is visualized on a solution of a marine propeller to determine whether any vortical features are captured. The libraries and the structured grid technology presented herein are flexible and general enough to tackle a variety of complex applications. This technology has significantly enhanced the capability of ERC personnel to effectively calculate solutions for complex geometries.
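
    The vortex-detection step lends itself to a short sketch: a point is flagged as lying in a region of "Intrinsic Swirl" when its velocity gradient tensor has a pair of complex eigenvalues. The sample tensors below are made up for illustration.

    ```python
    import numpy as np

    def has_swirl(grad_u, tol=1e-12):
        """grad_u is the 3x3 velocity gradient tensor du_i/dx_j at a point."""
        eigvals = np.linalg.eigvals(grad_u)
        return bool(np.any(np.abs(eigvals.imag) > tol))

    # Solid-body-like rotation in the x-y plane: complex eigenvalues -> swirl.
    rotation = np.array([[0.0, -1.0, 0.0],
                         [1.0,  0.0, 0.0],
                         [0.0,  0.0, 0.0]])
    # Pure strain: real eigenvalues -> no swirl.
    strain = np.diag([1.0, -0.5, -0.5])

    print(has_swirl(rotation))  # True
    print(has_swirl(strain))    # False
    ```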

  8. Simulation and Analysis of Converging Shock Wave Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D.; Shashkov, Mikhail J.

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  9. Soil moisture flow and nitrate transport through partially saturated zone considering mobile-immobile approach using 3D tank setup

    NASA Astrophysics Data System (ADS)

    Tomar, J.; Yadav, B. K.

    2016-12-01

    The aim of this study is to investigate soil water flow and nitrate movement through the vadose zone using the mobile-immobile approach in a large-scale three-dimensional (3D) tank setup. The 3D sand tank setup was fabricated with dimensions of 60 cm length, 30 cm width and 60 cm height and embedded with horizontal and vertical layers of sampling ports. The tank was filled with a homogeneous porous medium with an average grain size of 0.5 to 1.0 mm, and a nitrate concentration of 300 mg/L was applied at the top with a constant, uniformly distributed water flux of 150 mL/h using a sprinkler system. Pore water samples were collected hourly from the sampling ports and were analyzed using a UV spectrophotometer. The soil hydraulic and solute transport parameters were deduced from the laboratory experiments for simulating the considered 3D domain using the mobile-immobile approach. Soil moisture flow and contaminant transport equations were numerically solved to simulate nitrate movement in the tank setup. The simulated breakthrough curves (BTCs) show that nitrate movement in the mobile region is faster than in the immobile region by a factor of 1.2. The results show that the mobile-immobile approach to predicting solute transport in the variably saturated zone can be used effectively in the field after obtaining the required parameters from laboratory experiments under similar environmental conditions. A high concentration of 130 ppm was observed along the lateral and transverse axes at 5 cm depth. These results will help in further field investigations and in the implementation of decontamination techniques.

  10. Measurement of electromagnetic tracking error in a navigated breast surgery setup

    NASA Astrophysics Data System (ADS)

    Harish, Vinyas; Baksh, Aidan; Ungi, Tamas; Lasso, Andras; Baum, Zachary; Gauvin, Gabrielle; Engel, Jay; Rudan, John; Fichtinger, Gabor

    2016-03-01

    PURPOSE: The measurement of tracking error is crucial to ensure the safety and feasibility of electromagnetically tracked, image-guided procedures. Measurement should occur in a clinical environment because electromagnetic field distortion depends on positioning relative to the field generator and metal objects. However, we could not find an accessible and open-source system for calibration, error measurement, and visualization. We developed such a system and tested it in a navigated breast surgery setup. METHODS: A pointer tool was designed for concurrent electromagnetic and optical tracking. Software modules were developed for automatic calibration of the measurement system, real-time error visualization, and analysis. The system was taken to an operating room to test for field distortion in a navigated breast surgery setup. Positional and rotational electromagnetic tracking errors were then calculated using optical tracking as a ground truth. RESULTS: Our system is quick to set up and can be rapidly deployed. The process from calibration to visualization also takes only a few minutes. Field distortion was measured in the presence of various surgical equipment. Positional and rotational error in a clean field was approximately 0.90 mm and 0.31°. The presence of a surgical table, an electrosurgical cautery, and an anesthesia machine increased the error by up to a few tenths of a millimeter and a tenth of a degree. CONCLUSION: In a navigated breast surgery setup, measurement and visualization of tracking error defines a safe working area in the presence of surgical equipment. Our system is available as an extension for the open-source 3D Slicer platform.
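
    A minimal sketch of how paired electromagnetic and optical (ground-truth) poses can be turned into positional and rotational errors is given below; the 4x4 transforms are hypothetical and the numbers are chosen only to echo the magnitudes reported above.

    ```python
    import numpy as np

    def positional_error(t_em, t_opt):
        """Distance between the translation parts of two 4x4 homogeneous transforms."""
        return float(np.linalg.norm(t_em[:3, 3] - t_opt[:3, 3]))

    def rotational_error_deg(t_em, t_opt):
        """Angle of the relative rotation R_rel = R_em^T R_opt, in degrees."""
        r_rel = t_em[:3, :3].T @ t_opt[:3, :3]
        cos_angle = np.clip((np.trace(r_rel) - 1.0) / 2.0, -1.0, 1.0)
        return float(np.degrees(np.arccos(cos_angle)))

    # Hypothetical pose pair: ~0.9 mm offset and 0.31 degree rotation about z.
    t_opt = np.eye(4)
    t_em = np.eye(4)
    t_em[:3, 3] = [0.0005, 0.0004, 0.0006]          # metres
    theta = np.radians(0.31)
    t_em[:3, :3] = [[np.cos(theta), -np.sin(theta), 0],
                    [np.sin(theta),  np.cos(theta), 0],
                    [0, 0, 1]]

    print(positional_error(t_em, t_opt) * 1000, "mm")
    print(rotational_error_deg(t_em, t_opt), "deg")
    ```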

  11. All-Optical Wavelength-Path Service With Quality Assurance by Multilayer Integration System

    NASA Astrophysics Data System (ADS)

    Yagi, Mikio; Tanaka, Shinya; Satomi, Shuichi; Ryu, Shiro; Asano, Shoichiro

    2006-09-01

    In the future all-optical network controlled by generalized multiprotocol label switching (GMPLS), the wavelength path between end nodes will change dynamically. This inevitably means that the fiber parameters along the wavelength path will also vary. This variation in fiber parameters influences the signal quality of high-speed transmission systems (bit rates over 40 Gb/s). Therefore, at path setup, the fiber-parameter effect should be adequately compensated. Moreover, the path setup must be completed fast enough to meet the network-application demands. To realize the rapid setup of adequate paths, a multilayer integration system for all-optical wavelength-path quality assurance is proposed. This multilayer integration system is evaluated in a field trial. In the trial, the GMPLS control plane, measurement plane, and data plane were coordinated to maintain the quality of a 40-Gb/s wavelength path that would otherwise be degraded by the influence of chromatic dispersion. It is also demonstrated that the multilayer integration system can assure the signal quality in the face of not only chromatic dispersion but also degradation in the optical signal-to-noise ratio, by the use of a 2R regeneration system. Our experiments confirm that the proposed multilayer integration system is an essential part of future all-optical networks.

  12. Land use patterns and urbanization in the holy city of Varanasi, India: a scenario.

    PubMed

    Kumar, Manoj; Mukherjee, Nivedita; Sharma, Gyan Prakash; Raghubanshi, A S

    2010-08-01

    Rapid urbanization and increasing land use changes due to population and economic growth in selected landscapes are being witnessed of late in India and other developing countries. The cities are expanding in all directions, resulting in large-scale urban sprawl and changes in urban land use. The spatial pattern of such changes is more clearly noticed on the urban fringes, or city-peripheral rural areas, than in the city center. In fact, this is reflected in changing urban land use patterns. There is an urgent need to accurately describe land use changes for planning and sustainable management. In recent times, remote sensing has gained importance as a vital tool in the analysis and integration of spatial data. This study intends to estimate land use patterns in a planned and an unplanned urban setup and also to analyze the impact of change in land use pattern in the Varanasi urban environment. The results indicate that the planned urban setup had higher tree cover than the unplanned area of Varanasi City, although considerable disparity existed within the planned urban setups. The results emphasize the need to critically review concepts of urban planning and give more consideration to the preservation and management of urban tree cover/greenspace.

  13. SU-F-P-30: Clinical Assessment of Auto Beam-Hold Triggered by Fiducial Localization During Prostate RapidArc Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, P; Chen, Q

    2016-06-15

    Purpose: To assess the clinical efficacy of auto beam hold during prostate RapidArc delivery, triggered by fiducial localization on kV imaging with a Varian True Beam. Methods: Prostate patients with four gold fiducials were candidates in this study. Daily setup was accomplished by aligning to fiducials using orthogonal kV imaging. During RapidArc delivery, a kV image was automatically acquired with a momentary beam hold every 60 degrees of gantry rotation. The position of each fiducial was identified by a search algorithm and compared to a predetermined 1.4 cm diameter target area. Treatment continued if all the fiducials were within the target area. If any fiducial was outside the target area the beam hold was not released, and the operators determined if the patient needed re-alignment using the daily setup method. Results: Four patients were initially selected. For three patients, the auto beam hold performed seamlessly. In one instance, the system correctly identified misaligned fiducials, stopped treatment, and the patient was re-positioned. The fourth patient had a prosthetic hip which sometimes blocked the fiducials and caused the fiducial search algorithm to fail. The auto beam hold was disabled for this patient and the therapists manually monitored the fiducial positions during treatment. Average delivery time for a 2-arc fraction was increased by 59 seconds. Phantom studies indicated the dose discrepancy related to multiple beam holds is <0.1%. For a plan with 43 fractions, the additional imaging increased dose by an estimated 68 cGy. Conclusion: Automated intrafraction kV imaging can effectively perform auto beam holds due to patient movement, with the exception of prosthetic hip patients. The additional imaging dose and delivery time are clinically acceptable. It may be a cost-effective alternative to Calypso in RapidArc prostate patient delivery. Further study is warranted to explore its feasibility under various clinical conditions.
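
    The gating logic can be sketched as below: the beam hold is released only if every detected fiducial lies inside a 1.4 cm diameter target area around its planned position. Coordinates are hypothetical and the real search algorithm on kV images is not reproduced.

    ```python
    import numpy as np

    TARGET_DIAMETER_CM = 1.4

    def beam_may_continue(detected, planned, diameter=TARGET_DIAMETER_CM):
        """True if every detected fiducial is within diameter/2 of its planned position."""
        detected, planned = np.asarray(detected), np.asarray(planned)
        distances = np.linalg.norm(detected - planned, axis=1)
        return bool(np.all(distances <= diameter / 2.0))

    planned = [[0.0, 0.0], [1.0, 2.0], [2.5, 0.5], [3.0, 2.0]]   # four gold fiducials (cm)
    aligned = [[0.1, 0.0], [1.0, 2.1], [2.4, 0.5], [3.0, 1.9]]
    shifted = [[0.9, 0.0], [1.9, 2.1], [3.3, 0.5], [3.9, 1.9]]   # patient moved ~0.9 cm

    print(beam_may_continue(aligned, planned))   # True: beam hold is released
    print(beam_may_continue(shifted, planned))   # False: beam held, re-align patient
    ```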

  14. High dimensional linear regression models under long memory dependence and measurement error

    NASA Astrophysics Data System (ADS)

    Kaul, Abhishek

    This dissertation consists of three chapters. The first chapter introduces the models under consideration and motivates the problems of interest. A brief literature review is also provided in this chapter. The second chapter investigates the properties of the Lasso under long range dependent model errors. The Lasso is a computationally efficient approach to model selection and estimation, and its properties are well studied when the regression errors are independent and identically distributed. We study the case where the regression errors form a long memory moving average process. We establish a finite sample oracle inequality for the Lasso solution. We then show the asymptotic sign consistency in this setup. These results are established in the high dimensional setup (p > n), where p can increase exponentially with n. Finally, we show the consistency, the n^(1/2−d)-consistency of the Lasso, along with the oracle property of the adaptive Lasso, in the case where p is fixed. Here d is the memory parameter of the stationary error sequence. The performance of the Lasso is also analysed in the present setup with a simulation study. The third chapter proposes and investigates the properties of a penalized quantile-based estimator for measurement error models. Standard formulations of prediction problems in high dimensional regression models assume the availability of fully observed covariates and sub-Gaussian, homogeneous model errors. This makes these methods inapplicable to measurement error models, where covariates are unobservable and observations are possibly non-sub-Gaussian and heterogeneous. We propose weighted penalized corrected quantile estimators for the regression parameter vector in linear regression models with additive measurement errors, where the unobservable covariates are nonrandom. The proposed estimators forgo the need for the above-mentioned model assumptions. We study these estimators in both the fixed dimensional and high dimensional sparse setups; in the latter setup, the dimensionality can grow exponentially with the sample size. In the fixed dimensional setting we provide the oracle properties associated with the proposed estimators. In the high dimensional setting, we provide bounds for the statistical error associated with the estimation, which hold with asymptotic probability 1, thereby providing the ℓ1-consistency of the proposed estimator. We also establish the model selection consistency in terms of the correctly estimated zero components of the parameter vector. A simulation study that investigates the finite sample accuracy of the proposed estimator is also included in this chapter.
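
    An illustrative sketch of the second chapter's setting follows: the Lasso fitted to a sparse high-dimensional model whose errors are a moving average with slowly decaying weights (long-memory-like). It demonstrates nothing about the theory; all constants, the decay exponent and the penalty level are assumptions.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p, s = 200, 500, 5                       # n samples, p >> n predictors, s nonzero
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:s] = [3.0, -2.0, 1.5, 2.5, -1.0]

    # Long-memory-style errors: MA weights decaying like k^(d-1) with d = 0.3.
    d, q = 0.3, 500
    weights = np.arange(1, q + 1) ** (d - 1.0)
    innovations = rng.standard_normal(n + q)
    errors = np.array([weights @ innovations[i:i + q][::-1] for i in range(n)])

    y = X @ beta + 0.5 * errors
    fit = Lasso(alpha=0.3).fit(X, y)
    support = np.flatnonzero(fit.coef_)
    print("estimated support:", support)        # ideally recovers indices 0..4
    ```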

  15. Velocity Measurements in Nasal Cavities by Means of Stereoscopic Piv - Preliminary Tests

    NASA Astrophysics Data System (ADS)

    Cozzi, Fabio; Felisati, Giovanni; Quadrio, Maurizio

    2017-08-01

    The prediction of detailed flow patterns in human nasal cavities using computational fluid dynamics (CFD) can provide essential information on the potential relationship between patient-specific geometrical characteristics of the nasal anatomy and health problems, and ultimately lead to improved surgery. The complex flow structure and the intricate geometry of the nasal cavities make achieving such goals a challenge for CFD specialists. The need for experimental data to validate and improve the numerical simulations is particularly crucial. To this aim, an experimental set-up based on Stereo PIV and a silicone phantom of the nasal cavities have been designed and realized at Politecnico di Milano. This work describes the main features and challenges of the set-up along with some preliminary results.

  16. Yeast Two-Hybrid: State of the Art

    PubMed Central

    Beyaert, Rudi

    1999-01-01

    Genome projects are approaching completion and are saturating sequence databases. This paper discusses the role of the two-hybrid system as a generator of hypotheses. Apart from this rather exhaustive, financially and labour-intensive procedure, more refined functional studies can be undertaken. Indeed, by making hybrids of two-hybrid systems, customised approaches can be developed in order to attack specific function-related problems. For example, one could set up a "differential" screen by combining a forward and a reverse approach in a three-hybrid set-up. Another very interesting project is the use of peptide libraries in two-hybrid approaches. This could enable the identification of peptides with very high specificity, comparable to "real" antibodies. With the technology available, the only limitation is imagination. PMID:12734586

  17. High-throughput mouse genotyping using robotics automation.

    PubMed

    Linask, Kaari L; Lo, Cecilia W

    2005-02-01

    The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.

  18. Identification of the Species of Origin for Meat Products by Rapid Evaporative Ionization Mass Spectrometry.

    PubMed

    Balog, Julia; Perenyi, Dora; Guallar-Hoyas, Cristina; Egri, Attila; Pringle, Steven D; Stead, Sara; Chevallier, Olivier P; Elliott, Chris T; Takats, Zoltan

    2016-06-15

    Increasingly abundant food fraud cases have brought food authenticity and safety into major focus. This study presents a fast and effective way to identify meat products using rapid evaporative ionization mass spectrometry (REIMS). The experimental setup was demonstrated to be able to record a mass spectrometric profile of meat specimens in a time frame of <5 s. A multivariate statistical algorithm was developed and successfully tested for the identification of animal tissue with different anatomical origin, breed, and species with 100% accuracy at species and 97% accuracy at breed level. Detection of the presence of meat originating from a different species (horse, cattle, and venison) has also been demonstrated with high accuracy using mixed patties with a 5% detection limit. REIMS technology was found to be a promising tool in food safety applications providing a reliable and simple method for the rapid characterization of food products.

  19. Simulation of Flow for an Immersed Sphere

    DTIC Science & Technology

    2016-12-01

    Problem Set-Up … 4.0 Results … CFD computer codes are now widely applied in the commercial world for aircraft design with little requirement for wind tunnel testing. A wide range of … as the burning of fuel in gas turbine combustors. Intricate multiphase physics equations couple the behavior of gas phase CFD algorithms with …

  20. Anabat bat detection system: description and maintenance manual.

    Treesearch

    Douglas W. Waldren

    2000-01-01

    Anabat bat detection systems record ultrasonic bat calls on cassette tape by using a sophisticated ultrasonic microphone and cassette tape interface. This paper describes equipment setup and some maintenance issues. The layout and function of display panels are presented with special emphasis on how to use this information to troubleshoot equipment problems. The...

  1. [Individual indirect bonding technique (IIBT) using set-up model].

    PubMed

    Kyung, H M

    1989-01-01

    There has been much progress in the edgewise appliance since E.H. Angle. One of the most important procedures in the edgewise appliance is correct bracket positioning. Neither the conventional edgewise appliance nor the straight wire and lingual appliances can be used effectively unless the bracket position is accurate. Improper bracket positioning may cause many problems during treatment, especially in the finishing stage. It may require either rebonding after removal of the malpositioned bracket, or a greater number of arch wires and more complex wire bending, making effective treatment difficult. This led me to develop the Individual Indirect Bonding Technique, which uses a multi-purpose set-up model to determine a correct and objective bracket position for each individual patient. This technique is more accurate in bracket positioning than previous indirect bonding techniques, because it decides the bracket position on a set-up model that has been produced to have the occlusal relationship the clinician desires. This technique is especially effective in the straight wire appliance and the lingual appliance, in which correct bracket positioning is indispensable.

  2. Addressing fluorogenic real-time qPCR inhibition using the novel custom Excel file system 'FocusField2-6GallupqPCRSet-upTool-001' to attain consistently high fidelity qPCR reactions

    PubMed Central

    Ackermann, Mark R.

    2006-01-01

    The purpose of this manuscript is to discuss fluorogenic real-time quantitative polymerase chain reaction (qPCR) inhibition and to introduce/define a novel Microsoft Excel-based file system which provides a way to detect and avoid inhibition, and enables investigators to consistently design dynamically-sound, truly LOG-linear qPCR reactions very quickly. The qPCR problems this invention solves are universal to all qPCR reactions, and it performs all necessary qPCR set-up calculations in about 52 seconds (using a pentium 4 processor) for up to seven qPCR targets and seventy-two samples at a time – calculations that commonly take capable investigators days to finish. We have named this custom Excel-based file system "FocusField2-6GallupqPCRSet-upTool-001" (FF2-6-001 qPCR set-up tool), and are in the process of transforming it into professional qPCR set-up software to be made available in 2007. The current prototype is already fully functional. PMID:17033699

  3. Mechanistic insights of rapid liver regeneration after associating liver partition and portal vein ligation for stage hepatectomy.

    PubMed

    Moris, Demetrios; Vernadakis, Spyridon; Papalampros, Alexandros; Vailas, Michail; Dimitrokallis, Nikolaos; Petrou, Athanasios; Dimitroulis, Dimitrios

    2016-09-07

    To highlight the potential mechanisms of regeneration in the Associating Liver Partition and Portal vein ligation for Stage hepatectomy models (clinical and experimental) that could unlock the mystery behind the extraordinary capability of the liver for regeneration, which would help in designing new therapeutic options for the regenerative drive in difficult settings, such as chronic liver disease. Associating Liver Partition and Portal vein ligation for Stage hepatectomy has been recently advocated to induce rapid future liver remnant hypertrophy that significantly shortens the time to the second-stage hepatectomy. The introduction of Associating Liver Partition and Portal vein ligation for Stage hepatectomy into the surgical armamentarium of therapeutic tools for liver surgeons represented a real breakthrough in the history of liver surgery. A comprehensive literature review of Associating Liver Partition and Portal vein ligation for Stage hepatectomy and its utility in liver regeneration is performed. Liver regeneration after Associating Liver Partition and Portal vein ligation for Stage hepatectomy is a combination of portal flow changes and parenchymal transection that generates a systemic response inducing hepatocyte proliferation and remodeling. Associating Liver Partition and Portal vein ligation for Stage hepatectomy represents a real breakthrough in the history of liver surgery because it offers rapid liver regeneration potential that facilitates resection of liver tumors that were previously thought unresectable. The jury is still out, though, in terms of safety, efficacy and oncological outcomes. As far as Associating Liver Partition and Portal vein ligation for Stage hepatectomy-induced liver regeneration is concerned, further research in the field should focus on the role of non-parenchymal cells in liver regeneration as well as on the effect of Associating Liver Partition and Portal vein ligation for Stage hepatectomy on liver regeneration in the setting of parenchymal liver disease.

  4. Scanning two-photon continuous flow lithography for synthesis of high-resolution 3D microparticles.

    PubMed

    Shaw, Lucas A; Chizari, Samira; Shusteff, Maxim; Naghsh-Nilchi, Hamed; Di Carlo, Dino; Hopkins, Jonathan B

    2018-05-14

    Demand continues to rise for custom-fabricated and engineered colloidal microparticles across a breadth of application areas. This paper demonstrates an improvement in the fabrication rate of high-resolution 3D colloidal particles by using two-photon scanning lithography within a microfluidic channel. To accomplish this, we present (1) an experimental setup that supports fast, 3D scanning by synchronizing a galvanometer, piezoelectric stage, and an acousto-optic switch, and (2) a new technique for modifying the laser's scan path to compensate for the relative motion of the rapidly-flowing photopolymer medium. The result is an instrument that allows for rapid conveyor-belt-like fabrication of colloidal objects with arbitrary 3D shapes and micron-resolution features.

  5. Stability, ghost, and strong coupling in nonrelativistic general covariant theory of gravity with λ ≠ 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang Yongqing; Wang Anzhong

    2011-05-15

    In this paper, we investigate three important issues: stability, ghost, and strong coupling, in the Horava-Melby-Thompson setup of the Horava-Lifshitz theory with λ ≠ 1, generalized recently by da Silva. We first develop the general linear scalar perturbations of the Friedmann-Robertson-Walker (FRW) universe with arbitrary spatial curvature and find that an immediate by-product of the setup is that, in all the inflationary models described by a scalar field, the FRW universe is necessarily flat. Applying them to the case of the Minkowski background, we find that it is stable, and, similar to the case λ = 1, the spin-0 graviton is eliminated. The vector perturbations vanish identically in the Minkowski background. Thus, similar to general relativity, a free gravitational field in this setup is completely described by a spin-2 massless graviton, even with λ ≠ 1. We also study the ghost problem in the FRW background and find explicitly the ghost-free conditions. To study the strong coupling problem, we consider two different kinds of spacetimes, all with the presence of matter: one is cosmological, and the other is static. We find that the coupling becomes strong for a process with energy higher than M_pl|c_ψ|^(5/2) in the flat FRW background and M_pl|c_ψ|^3 in a static weak gravitational field, where |c_ψ| ≡ |(1−λ)/(3λ−1)|^(1/2).

  6. Compact, light-weight and cost-effective microscope based on lensless incoherent holography for telemedicine applications.

    PubMed

    Mudanyali, Onur; Tseng, Derek; Oh, Chulwoo; Isikman, Serhan O; Sencan, Ikbal; Bishara, Waheb; Oztoprak, Cetin; Seo, Sungkyu; Khademhosseini, Bahar; Ozcan, Aydogan

    2010-06-07

    Despite the rapid progress in optical imaging, most of the advanced microscopy modalities still require complex and costly set-ups that unfortunately limit their use beyond well equipped laboratories. In the meantime, microscopy in resource-limited settings has requirements significantly different from those encountered in advanced laboratories, and such imaging devices should be cost-effective, compact, light-weight and appropriately accurate and simple to be usable by minimally trained personnel. Furthermore, these portable microscopes should ideally be digitally integrated as part of a telemedicine network that connects various mobile health-care providers to a central laboratory or hospital. Toward this end, here we demonstrate a lensless on-chip microscope weighing approximately 46 grams with dimensions smaller than 4.2 cm x 4.2 cm x 5.8 cm that achieves sub-cellular resolution over a large field of view of approximately 24 mm(2). This compact and light-weight microscope is based on digital in-line holography and does not need any lenses, bulky optical/mechanical components or coherent sources such as lasers. Instead, it utilizes a simple light-emitting-diode (LED) and a compact opto-electronic sensor-array to record lensless holograms of the objects, which then permits rapid digital reconstruction of regular transmission or differential interference contrast (DIC) images of the objects. Because this lensless incoherent holographic microscope has orders-of-magnitude improved light collection efficiency and is very robust to mechanical misalignments it may offer a cost-effective tool especially for telemedicine applications involving various global health problems in resource limited settings.
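
    The digital reconstruction step can be sketched with the angular spectrum method, which back-propagates the recorded hologram to the object plane; the hologram array, wavelength, pixel size and propagation distance below are made-up placeholders and this is not the authors' reconstruction code.

    ```python
    import numpy as np

    def angular_spectrum_propagate(field, wavelength, pixel_size, z):
        """Propagate a complex field by distance z using the angular spectrum method."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=pixel_size)
        fy = np.fft.fftfreq(ny, d=pixel_size)
        FX, FY = np.meshgrid(fx, fy)
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
        transfer = np.exp(1j * kz * z) * (arg > 0)       # drop evanescent components
        return np.fft.ifft2(np.fft.fft2(field) * transfer)

    # Hypothetical 512x512 hologram (unit amplitude plus noise), ~0.5 um illumination,
    # 2.2 um sensor pixels, object plane ~1 mm above the sensor.
    hologram = 1.0 + 0.05 * np.random.default_rng(0).standard_normal((512, 512))
    reconstruction = angular_spectrum_propagate(hologram.astype(complex),
                                                wavelength=0.5e-6,
                                                pixel_size=2.2e-6,
                                                z=-1.0e-3)   # negative z: back-propagation
    print(np.abs(reconstruction).mean())
    ```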

  7. Television Commercial Preferences of Children Aged 3-6 Years

    ERIC Educational Resources Information Center

    Yurtsever Kilicgun, Muge

    2016-01-01

    Problem Statement: When children watch television, they are exposed to commercial advertisements whose general purpose is to make a positive impression on viewers about a commodity or service in order to drive the sales of that commodity or service. Due to their voiced and moving images, their setup and characters, and their being short and…

  8. Solar powered automobile automation for heatstroke prevention

    NASA Astrophysics Data System (ADS)

    Singh, Navtej Swaroop; Sharma, Ishan; Jangid, Santosh

    2016-03-01

    Heatstroke inside a car has been a critical problem in every part of the world. Non-exertional heat stroke results from exposure to a high environmental temperature, whereas exertional heat stroke results from strenuous exercise. This paper presents a solution for this fatal problem and proposes an embedded system that is cost-effective and feasible to implement. The proposed system consists of an information-sharing platform, sensor interfacing, a Global System for Mobile Communications (GSM) module and a real-time monitoring system, all powered by a solar panel. The system has been simulated and tested with an experimental setup.
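    The abstract describes the architecture only at block level (sensors, GSM link, real-time monitoring, solar power). As a purely illustrative sketch of the kind of monitoring loop such a setup implies, the following Python outline may help; the function names, the 40 °C threshold and the phone number are hypothetical placeholders, not details taken from the paper.

      # Hedged sketch of a cabin-temperature monitoring loop; hardware calls are placeholders.
      import time

      TEMP_THRESHOLD_C = 40.0        # assumed alert threshold, not from the paper
      ALERT_NUMBER = "+10000000000"  # placeholder phone number

      def read_cabin_temperature():
          """Placeholder for a temperature-sensor read; returns degrees Celsius."""
          raise NotImplementedError

      def send_sms(number, message):
          """Placeholder for a GSM-module transaction."""
          raise NotImplementedError

      def monitor(poll_seconds=30):
          alerted = False
          while True:
              temp_c = read_cabin_temperature()
              if temp_c >= TEMP_THRESHOLD_C and not alerted:
                  send_sms(ALERT_NUMBER, "Cabin temperature {:.1f} C - check vehicle".format(temp_c))
                  alerted = True          # avoid repeated alerts for the same event
              elif temp_c < TEMP_THRESHOLD_C:
                  alerted = False
              time.sleep(poll_seconds)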

  9. The design of nonlinear observers for wind turbine dynamic state and parameter estimation

    NASA Astrophysics Data System (ADS)

    Ritter, B.; Schild, A.; Feldt, M.; Konigorski, U.

    2016-09-01

    This contribution addresses the dynamic state and parameter estimation problem which arises with more advanced wind turbine controllers. These control devices need precise information about the system's current state to outperform conventional industrial controllers effectively. First, the necessity of a profound scientific treatment on nonlinear observers for wind turbine application is highlighted. Secondly, the full estimation problem is introduced and the variety of nonlinear filters is discussed. Finally, a tailored observer architecture is proposed and estimation results of an illustrative application example from a complex simulation set-up are presented.

  10. Rank the voltage across light bulbs … then set up the live experiment

    NASA Astrophysics Data System (ADS)

    Jacobs, Greg C.

    2018-02-01

    The Tasks Inspired by Physics Education Research (TIPERS) workbooks pose questions in styles quite different from the end-of-chapter problems that those of us of a certain age were assigned back in the days before Netscape. My own spin on TIPERS is not just to do them on paper, but to have students set up the situations in the laboratory to verify—or contradict—their paper solutions. The circuits unit is particularly conducive to creating quick-and-dirty lab setups that demonstrate the result of conceptually framed problems.

  11. Experimental investigation of the flow dynamics and rheology of complex fluids in pipe flow by hybrid multi-scale velocimetry

    NASA Astrophysics Data System (ADS)

    Haavisto, Sanna; Cardona, Maria J.; Salmela, Juha; Powell, Robert L.; McCarthy, Michael J.; Kataja, Markku; Koponen, Antti I.

    2017-11-01

    A hybrid multi-scale velocimetry method utilizing Doppler optical coherence tomography in combination with either magnetic resonance imaging or ultrasound velocity profiling is used to investigate pipe flow of four rheologically different working fluids under varying flow regimes. These fluids include water, an aqueous xanthan gum solution, a softwood fiber suspension, and a microfibrillated cellulose suspension. The measurement setup enables not only the analysis of the rheological (bulk) behavior of a studied fluid but gives simultaneously information on their wall layer dynamics, both of which are needed for analyzing and solving practical fluid flow-related problems. Preliminary novel results on rheological and boundary layer flow properties of the working fluids are reported and the potential of the hybrid measurement setup is demonstrated.

  12. The SMILETRAP facility

    NASA Astrophysics Data System (ADS)

    Carlberg, C.; Borgenstrand, H.; Rouleau, G.; Schuch, R.; Söderberg, F.; Bergström, I.; Jertz, R.; Schwarz, T.; Stein, J.; Bollen, G.; Kluge, H.-J.; Mann, R.

    1995-01-01

    The SMILETRAP experimental set-up, a Penning trap mass spectrometer for highly charged ions, is described. Capture and observation of cyclotron frequencies of externally produced highly charged ions, rapid interchange of investigated and reference ions and measurements of the rotational kinetic energies are demonstrated. Mass measurements utilizing different charge states and species to verify the consistency of the measurements are presented. A relative uncertainty of about 10⁻⁹ is attained in comparisons between highly charged carbon, nitrogen, oxygen, neon and the singly charged hydrogen molecule.
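    For context, the principle behind such Penning-trap mass comparisons can be stated compactly; the relation below is standard background rather than anything specific to the SMILETRAP record above.

      \[
        \nu_c = \frac{1}{2\pi}\,\frac{qB}{m}
        \qquad\Longrightarrow\qquad
        \frac{m_\mathrm{ion}}{m_\mathrm{ref}}
          = \frac{q_\mathrm{ion}}{q_\mathrm{ref}}\,
            \frac{\nu_{c,\mathrm{ref}}}{\nu_{c,\mathrm{ion}}},
      \]

    so measuring the cyclotron frequencies of the investigated and reference ions in the same magnetic field B yields their mass ratio directly, which is why rapid interchange of the two ion species matters.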

  13. A computer program for the localization of small areas in roentgenological images

    NASA Technical Reports Server (NTRS)

    Keller, R. A.; Baily, N. A.

    1976-01-01

    A method and associated algorithm are presented which allow a simple and accurate determination to be made of the location of small symmetric areas presented in roentgenological images. The method utilizes an operator to visually spot object positions but eliminates the need for critical positioning accuracy on the operator's part. The rapidity of measurement allows results to be evaluated on-line. Parameters associated with the algorithm have been analyzed, and methods to facilitate an optimum choice for any particular experimental setup are presented.

  14. Internet-based videoconferencing and data collaboration for the imaging community.

    PubMed

    Poon, David P; Langkals, John W; Giesel, Frederik L; Knopp, Michael V; von Tengg-Kobligk, Hendrik

    2011-01-01

    Internet protocol-based digital data collaboration with videoconferencing is not yet well utilized in the imaging community. Videoconferencing, combined with proven low-cost solutions, can provide reliable functionality and speed, which will improve rapid, time-saving, and cost-effective communications, within large multifacility institutions or globally with the unlimited reach of the Internet. The aim of this project was to demonstrate the implementation of a low-cost hardware and software setup that facilitates global data collaboration using WebEx and GoToMeeting Internet protocol-based videoconferencing software. Both products' features were tested and evaluated for feasibility across 2 different Internet networks, including a video quality and recording assessment. Cross-compatibility with an Apple OS is also noted in the evaluations. Departmental experiences with WebEx pertaining to clinical trials are also described. Real-time remote presentation of dynamic data was generally consistent across platforms. A reliable and inexpensive hardware and software setup for complete Internet-based data collaboration/videoconferencing can be achieved.

  15. Pressure jump relaxation setup with IR detection and millisecond time resolution

    NASA Astrophysics Data System (ADS)

    Schiewek, Martin; Krumova, Marina; Hempel, Günter; Blume, Alfred

    2007-04-01

    An instrument is described that allows the use of Fourier transform infrared (FTIR) spectroscopy as a detection system for kinetic processes after a pressure jump of up to 100 bars. The pressure is generated using a high performance liquid chromatography (HPLC) pump and water as a pressure transducing medium. A flexible membrane separates the liquid sample in the IR cell from the pressure transducing medium. Two electromagnetic switching valves in the setup enable pressure jumps with a decay time of 4 ms. The FTIR spectrometer is configured to measure time resolved spectra in the millisecond time regime using the rapid scan mode. All components are computer controlled. For a demonstration of the capability of the method, first results on the kinetics of a phase transition between two lamellar phases of an aqueous phospholipid dispersion are presented. This combination of FTIR spectroscopy with the pressure jump relaxation technique can also be used for other systems which display cooperative transitions with concomitant volume changes.

  16. A temperature-jump NMR probe setup using rf heating optimized for the analysis of temperature-induced biomacromolecular kinetic processes

    NASA Astrophysics Data System (ADS)

    Rinnenthal, Jörg; Wagner, Dominic; Marquardsen, Thorsten; Krahn, Alexander; Engelke, Frank; Schwalbe, Harald

    2015-02-01

    A novel temperature jump (T-jump) probe operational at B0 fields of 600 MHz (14.1 Tesla) with an integrated cage radio-frequency (rf) coil for rapid (<1 s) heating in high-resolution (HR) liquid-state NMR-spectroscopy is presented and its performance investigated. The probe consists of an inner 2.5 mm "heating coil" designed for generating rf-electric fields of 190-220 MHz across a lossy dielectric sample and an outer two coil assembly for 1H-, 2H- and 15N-nuclei. High B0 field homogeneities (0.7 Hz at 600 MHz) are combined with high heating rates (20-25 K/s) and only small temperature gradients (<±1.5 K, 3 s after 20 K T-jump). The heating coil is under control of a high power rf-amplifier within the NMR console and can therefore easily be accessed by the pulse programmer. Furthermore, implementation of a real-time setup including synchronization of the NMR spectrometer's air flow heater with the rf-heater used to maintain the temperature of the sample is described. Finally, the applicability of the real-time T-jump setup for the investigation of biomolecular kinetic processes in the second-to-minute timescale is demonstrated for samples of a model 14mer DNA hairpin and a 15N-selectively labeled 40nt hsp17-RNA thermometer.

  17. ABrIL - Advanced Brain Imaging Lab : a cloud based computation environment for cooperative neuroimaging projects.

    PubMed

    Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo

    2014-01-01

    Neuroscience is an increasingly multidisciplinary and highly cooperative field where neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing number of computing resources and skills that need to be put in place at every lab. Typically, each group tries to set up its own servers and workstations to support its neuroimaging needs, having to learn everything from operating-system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even if a strong collaboration among several groups is going on. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the NeuroImage journal. The ABrIL system has shown its applicability in several neuroscience projects with a relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.

  18. Are annual layers preserved in NorthGRIP Eemian ice?

    NASA Astrophysics Data System (ADS)

    Kettner, E.; Bigler, M.; Nielsen, M. E.; Steffensen, J. P.; Svensson, A.

    2009-04-01

    A newly developed setup for continuous flow analysis (CFA) of ice cores in Copenhagen is optimized for high resolution analysis of four components: Soluble sodium (mainly deriving from sea salt), soluble ammonium (related to biological processes and biomass burning events), insoluble dust particles (basically transported from Asian deserts to Greenland), and the electrolytic melt water conductivity (which is a bulk signal for all ionic constituents). Furthermore, we are for the first time implementing a flow cytometer to obtain high quality dust concentration and size distribution profiles based on individual dust particle measurements. Preliminary measurements show that the setup is able to resolve annual layers of 1 cm thickness. Ice flow models predict that annual layers in the Eemian section of the Greenland NorthGRIP ice core (130-115 ka BP) have a thickness of around 1 cm. However, the visual stratigraphy of the ice core indicates that the annual layering in the Eemian section may be disturbed by micro folds and rapid crystal growth. In this case study we will measure the impurity content of an Eemian segment of the NorthGRIP ice core with the new CFA setup. This will allow for a comparison to well-known impurity levels of the Holocene in both Greenland and Antarctic ice and we will attempt to determine if annual layers are still present in the ice.

  19. Regulation of Renewable Energy Sources to Optimal Power Flow Solutions Using ADMM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yijian; Hong, Mingyi; Dall'Anese, Emiliano

    This paper considers power distribution systems featuring renewable energy sources (RESs), and develops a distributed optimization method to steer the RES output powers to solutions of AC optimal power flow (OPF) problems. The design of the proposed method leverages suitable linear approximations of the AC-power flow equations, and is based on the Alternating Direction Method of Multipliers (ADMM). Convergence of the RES-inverter output powers to solutions of the OPF problem is established under suitable conditions on the stepsize as well as mismatches between the commanded setpoints and actual RES output powers. In a broad sense, the methods and results proposed here are also applicable to other distributed optimization problem setups with ADMM and inexact dual updates.
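    The abstract does not reproduce the OPF-specific update rules, so as a generic illustration of the ADMM pattern it builds on, the sketch below solves a small consensus problem with quadratic local costs. The problem data, step size and iteration count are illustrative assumptions and are not taken from the paper.

      # Hedged sketch: scaled-form consensus ADMM for a toy problem, not the paper's OPF method.
      # Each agent i minimizes (a_i/2) * (x_i - b_i)^2 subject to the consensus constraint x_i = z.
      import numpy as np

      a = np.array([1.0, 2.0, 4.0])   # illustrative local curvatures
      b = np.array([3.0, -1.0, 0.5])  # illustrative local targets
      rho = 1.0                       # ADMM step size (assumed)

      x = np.zeros(3)                 # local primal variables
      u = np.zeros(3)                 # scaled dual variables
      z = 0.0                         # consensus variable

      for _ in range(200):
          x = (a * b + rho * (z - u)) / (a + rho)   # local x-updates (closed form)
          z = float(np.mean(x + u))                 # consensus z-update
          u = u + x - z                             # dual updates

      # z converges to the weighted average sum(a*b)/sum(a) of the local targets.
      print(z, np.dot(a, b) / np.sum(a))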

  20. Assessment of PIV-based unsteady load determination of an airfoil with actuated flap

    NASA Astrophysics Data System (ADS)

    Sterenborg, J. J. H. M.; Lindeboom, R. C. J.; Simão Ferreira, C. J.; van Zuijlen, A. H.; Bijl, H.

    2014-02-01

    For complex experimental setups involving movable structures it is not trivial to directly measure unsteady loads. An alternative is to deduce unsteady loads indirectly from measured velocity fields using Noca's method. The ultimate aim is to use this method in future work to determine unsteady loads for fluid-structure interaction problems. The focus in this paper is first on the application and assessment of Noca's method for an airfoil with an oscillating trailing edge flap. To the best of our knowledge, Noca's method has not yet been applied to airfoils with moving control surfaces or fluid-structure interaction problems. In addition, wind tunnel corrections for this type of unsteady flow problem are considered.

  1. A 2D and 3D Code Comparison of Turbulent Mixing in Spherical Implosions

    NASA Astrophysics Data System (ADS)

    Flaig, Markus; Thornber, Ben; Grieves, Brian; Youngs, David; Williams, Robin; Clark, Dan; Weber, Chris

    2016-10-01

    Turbulent mixing due to Richtmyer-Meshkov and Rayleigh-Taylor instabilities has proven to be a major obstacle on the way to achieving ignition in inertial confinement fusion (ICF) implosions. Numerical simulations are an important tool for understanding the mixing process; however, the results of such simulations depend on the choice of grid geometry and the numerical scheme used. In order to clarify this issue, we compare the simulation codes FLASH, TURMOIL, HYDRA, MIRANDA and FLAMENCO for the problem of the growth of single- and multi-mode perturbations on the inner interface of a dense imploding shell. We consider two setups: a single-shock setup with a convergence ratio of 4, as well as a higher-convergence multi-shock setup that mimics a typical NIF mixcap experiment. We employ both single-mode and ICF-like broadband perturbations. We find good agreement between all codes concerning the evolution of the mix layer width; however, there are differences in the small-scale mixing. We also develop a Bell-Plesset model that is able to predict the mix layer width and find excellent agreement with the simulation results. This work was supported by resources provided by the Pawsey Supercomputing Centre with funding from the Australian Government.

  2. Quantitative estimation of magnetic nanoparticle distributions in one dimension using low-frequency continuous wave electron paramagnetic resonance

    NASA Astrophysics Data System (ADS)

    Coene, A.; Crevecoeur, G.; Dupré, L.; Vaes, P.

    2013-06-01

    In recent years, magnetic nanoparticles (MNPs) have gained increased attention due to their superparamagnetic properties. These properties allow the development of innovative biomedical applications such as targeted drug delivery and tumour heating. However, these modalities lack effective operation arising from the inaccurate quantification of the spatial MNP distribution. This paper proposes an approach for assessing the one-dimensional (1D) MNP distribution using electron paramagnetic resonance (EPR). EPR is able to accurately determine the MNP concentration in a single volume but not the MNP distribution throughout this volume. A new approach that exploits the solution of inverse problems for the correct interpretation of the measured EPR signals, is investigated. We achieve reconstruction of the 1D distribution of MNPs using EPR. Furthermore, the impact of temperature control on the reconstructed distributions is analysed by comparing two EPR setups where the latter setup is temperature controlled. Reconstruction quality for the temperature-controlled setup increases with an average of 5% and with a maximum increase of 13% for distributions with relatively lower iron concentrations and higher resolutions. However, these measurements are only a validation of our new method and form no hard limits.

  3. Remote clinical assessment of gastrointestinal endoscopy (tele-endoscopy): an initial experience.

    PubMed Central

    Kim, C. Y.; Etemad, B.; Glenn, T. F.; Mackey, H. A.; Viator, G. E.; Wallace, M. B.; Mokhashi, M. S.; Cotton, P. B.; Hawes, R. H.

    2000-01-01

    BACKGROUND: Gastrointestinal (GI) endoscopy is an effective tool to screen for cancers of the digestive tract. However, access to endoscopy is limited in many parts of South Carolina. This trial is a part of a prospective multi-part study for remote cancer screening in coastal South Carolina. This pilot study was to evaluate the quality of tele-endoscopy for cancer screening. METHODS: 10 patients scheduled for endoscopic procedures were observed simultaneously by the endoscopist and a remote observer connected over a 512 kbps ISDN line. Findings by both were compared for concordance on malignant or premalignant lesions. RESULTS: The image quality was adequate to support remote diagnosis of GI cancer and abnormal lesions by an experienced observer. However, assessment of the esophagogastric junction for Barrett's esophagus was equivocal. CONCLUSIONS: Overall, our tele-endoscopy setup shows great promise for remote supervision or observation of endoscopic procedures done by nurse endoscopists. Tele-endoscopy is both adequate and feasible for diagnosis of most gastrointestinal lesions. Subtle lesions still may be missed in our current setup. However, improvements are being made in our setup to address the problem with resolution prior to further evaluation. PMID:11079918

  4. A Technology-Assisted Learning Setup as Assessment Supplement for Three Persons with a Diagnosis of Post-Coma Vegetative State and Pervasive Motor Impairment

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Buonocunto, Francesca; Sacco, Valentina; Colonna, Fabio; Navarro, Jorge; Lanzilotti, Crocifissa; Bosco, Andrea; Megna, Gianfranco; De Tommaso, Marina

    2009-01-01

    Post-coma persons in an apparent condition of vegetative state and pervasive motor impairment pose serious problems in terms of assessment and intervention options. A technology-based learning assessment procedure might serve for them as a diagnostic supplement with possible implications for rehabilitation intervention. The learning assessment…

  5. Prioritizing parts from cutting bills when gang-ripping first

    Treesearch

    R. Edward Thomas

    1996-01-01

    Computer optimization of gang-rip-first processing is a difficult problem when working with specific cutting bills. Interactions among board grade and size, arbor setup, and part sizes and quantities greatly complicate the decision making process. Cutting the wrong parts at any moment will mean that more board footage will be required to meet the bill. Using the ROugh...

  6. How much can we trust high-resolution spectroscopic stellar chemical abundances?

    NASA Astrophysics Data System (ADS)

    Blanco-Cuaresma, S.; Nordlander, T.; Heiter, U.; Jofré, P.; Masseron, T.; Casamiquela, L.; Tabernero, H. M.; Bhat, S. S.; Casey, A. R.; Meléndez, J.; Ramírez, I.

    2017-03-01

    To study stellar populations, it is common to combine chemical abundances from different spectroscopic surveys/studies where different setups were used. These inhomogeneities can lead us to inaccurate scientific conclusions. In this work, we studied one aspect of the problem: When deriving chemical abundances from high-resolution stellar spectra, what differences originate from the use of different radiative transfer codes?

  7. Interferometer using a 3 × 3 coupler and Faraday mirrors

    NASA Astrophysics Data System (ADS)

    Breguet, J.; Gisin, N.

    1995-06-01

    A new interferometric setup using a 3 × 3 coupler and two Faraday mirrors is presented. It has the advantages of being built only with passive components, of freedom from the polarization fading problem, and of operation with an LED. It is well suited for sensing time-dependent signals and does not depend on reciprocal or nonreciprocal constant perturbations.

  8. Operational Assessment of Tools for Accelerating Leader Development (ALD): Volume 1, Capstone Report

    DTIC Science & Technology

    2009-06-01

    in units and user juries provided feedback on the tools. The pressures of the operational environment seriously limited the time available to work...following functions: account set-up, user authentication, learning management, usage monitoring, problem reporting, assessment data collection, data...especially sources of data) represented—demonstration/assessment manager, operations manager, Web site experts, users (target audience), data collectors

  9. Social Loafing on Group Projects: Structural Antecedents and Effect on Student Satisfaction

    ERIC Educational Resources Information Center

    Aggarwal, Praveen; O'Brien, Connie L.

    2008-01-01

    To respond to the expectations of the industry and business school accreditation bodies, marketing faculty have been making extensive use of group projects in their curricula. A common problem with the use of student groups, however, is that of social loafing. In this study, we identify some easy-to-implement project set-up factors and examine…

  10. Model-based optimal design of experiments - semidefinite and nonlinear programming formulations

    PubMed Central

    Duarte, Belmiro P.M.; Wong, Weng Kee; Oliveira, Nuno M.C.

    2015-01-01

    We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations, to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require the design space to be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques for finding D–, A– and E–optimal designs using design problems in biochemical engineering and show the method can also be directly applied to tackle additional issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D–optimal designs but requires more computation than the SDP formulation. The efficiencies of the generated designs from the two methods are generally very close and so we recommend the SDP formulation in practice. PMID:26949279
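    As a minimal illustration of the SDP-style formulation on a discretized design space, the sketch below computes a D-optimal design with CVXPY. The candidate points, the model (a quadratic in one factor) and the equal-interval grid are assumptions chosen for illustration, not the paper's chemical-engineering case studies.

      # Hedged sketch: D-optimal design over a discretized design space using CVXPY.
      # Assumed model: y = b0 + b1*x + b2*x^2 on x in [-1, 1]; not the paper's models.
      import numpy as np
      import cvxpy as cp

      xs = np.linspace(-1.0, 1.0, 21)                 # candidate design points
      F = np.vstack([np.ones_like(xs), xs, xs**2]).T  # regression vectors f(x)

      w = cp.Variable(len(xs), nonneg=True)           # design weights
      # Information matrix M(w) = sum_i w_i f(x_i) f(x_i)^T
      M = sum(w[i] * np.outer(F[i], F[i]) for i in range(len(xs)))

      problem = cp.Problem(cp.Maximize(cp.log_det(M)), [cp.sum(w) == 1])
      problem.solve()

      support = [(x, wi) for x, wi in zip(xs, w.value) if wi > 1e-3]
      print(support)   # for this model the support concentrates near x = -1, 0, +1 with weights ~1/3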

  11. Model-based optimal design of experiments - semidefinite and nonlinear programming formulations.

    PubMed

    Duarte, Belmiro P M; Wong, Weng Kee; Oliveira, Nuno M C

    2016-02-15

    We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations, to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require the design space to be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques for finding D-, A- and E-optimal designs using design problems in biochemical engineering and show the method can also be directly applied to tackle additional issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D-optimal designs but requires more computation than the SDP formulation. The efficiencies of the generated designs from the two methods are generally very close and so we recommend the SDP formulation in practice.

  12. Investigation of a novel approach for the cross-linking characterization of SU-8 photoresist materials by means of optical dispersion measurements

    NASA Astrophysics Data System (ADS)

    Taudt, Ch.; Baselt, T.; Koch, E.; Hartmann, P.

    2014-03-01

    The increase in efficiency and precision in the production of semiconductor structures using polymeric materials like SU-8 is crucial to securing technological innovation within this industry. The manufacturing of structures on wafers demands a high quality of materials, tools and production processes. In particular, deviations in the materials' parameters (e.g. cross-linking state, density or mechanical properties) could lead to subsequent problems such as a reduced lifetime of structures and systems. Problems during the soft-bake and post-exposure-bake processes in particular can lead to an inhomogeneous distribution of material properties. This paper describes a novel approach for the characterization of SU-8 material properties in relation to a second epoxy-based material of different cross-linking by the measurement of optical dispersion within the material. A white-light interferometer was used; the setup consisted of a white-light source, a Michelson-type interferometer and a spectrometer. The investigation of the dispersion characteristics was carried out by detecting the equalization wavelength for different positions of the reference arm in a range from 400 to 900 nm. The measured time delay due to dispersion ranges from 850 to 1050 ps/m. For evaluation purposes, a 200 μm SU-8 sample was characterized in the described setup regarding its dispersion characteristics in relation to bulk epoxy material. The novel measurement approach allowed a fast and high-resolution material characterization of SU-8 microstructures which was suitable for integration in production lines. The outlook considers modifications of the experimental setup for on-wafer measurements.

  13. Rapid long-wave infrared laser-induced breakdown spectroscopy measurements using a mercury-cadmium-telluride linear array detection system.

    PubMed

    Yang, Clayton S-C; Brown, Eiei; Kumi-Barimah, Eric; Hommerich, Uwe; Jin, Feng; Jia, Yingqing; Trivedi, Sudhir; D'souza, Arvind I; Decuir, Eric A; Wijewarnasuriya, Priyalal S; Samuels, Alan C

    2015-11-20

    In this work, we develop a mercury-cadmium-telluride linear array detection system that is capable of rapidly capturing (∼1-5 s) a broad spectrum of atomic and molecular laser-induced breakdown spectroscopy (LIBS) emissions in the long-wave infrared (LWIR) region (∼5.6-10 μm). Similar to the conventional UV-Vis LIBS, a broadband emission spectrum of condensed phase samples covering the whole 5.6-10 μm region can be acquired from just a single laser-induced microplasma or by averaging a few single laser-induced microplasmas. Atomic and molecular signature emission spectra of solid inorganic and organic tablets and thin liquid films deposited on a rough asphalt surface are observed. This setup is capable of rapidly probing samples "as is" without the need of elaborate sample preparation and also offers the possibility of a simultaneous UV-Vis and LWIR LIBS measurement.

  14. Toward 2D and 3D imaging of magnetic nanoparticles using EPR measurements.

    PubMed

    Coene, A; Crevecoeur, G; Leliaert, J; Dupré, L

    2015-09-01

    Magnetic nanoparticles (MNPs) are an important asset in many biomedical applications. An effective working of these applications requires an accurate knowledge of the spatial MNP distribution. A promising, noninvasive, and sensitive technique to visualize MNP distributions in vivo is electron paramagnetic resonance (EPR). Currently only 1D MNP distributions can be reconstructed. In this paper, the authors propose extending 1D EPR toward 2D and 3D using computer simulations to allow accurate imaging of MNP distributions. To find the MNP distribution belonging to EPR measurements, an inverse problem needs to be solved. The solution of this inverse problem highly depends on the stability of the inverse problem. The authors adapt 1D EPR imaging to realize the imaging of multidimensional MNP distributions. Furthermore, the authors introduce partial volume excitation in which only parts of the volume are imaged to increase stability of the inverse solution and to speed up the measurements. The authors simulate EPR measurements of different 2D and 3D MNP distributions and solve the inverse problem. The stability is evaluated by calculating the condition measure and by comparing the actual MNP distribution to the reconstructed MNP distribution. Based on these simulations, the authors define requirements for the EPR system to cope with the added dimensions. Moreover, the authors investigate how EPR measurements should be conducted to improve the stability of the associated inverse problem and to increase reconstruction quality. The approach used in 1D EPR can only be employed for the reconstruction of small volumes in 2D and 3D EPR due to numerical instability of the inverse solution. The authors performed EPR measurements of increasing cylindrical volumes and evaluated the condition measure. This showed that a reduction of the inherent symmetry in the EPR methodology is necessary. By reducing the symmetry of the EPR setup, quantitative images of larger volumes can be obtained. The authors found that, by selectively exciting parts of the volume, the authors could increase the reconstruction quality even further while reducing the number of measurements. Additionally, the inverse solution of this activation method degrades slower for increasing volumes. Finally, the methodology was applied to noisy EPR measurements: using the reduced EPR setup symmetry and the partial activation method, an increase in reconstruction quality of ≈80% is obtained, together with a 10% speedup of the measurements. Applying the aforementioned requirements to the EPR setup and stabilizing the EPR measurements showed a tremendous increase in noise robustness, thereby making EPR a valuable method for quantitative imaging of multidimensional MNP distributions.
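    The abstract repeatedly refers to the condition measure of the linear inverse problem that maps the MNP distribution to EPR signals. A minimal sketch of that kind of diagnostic, together with a non-negative least-squares reconstruction, is given below; the system matrix here is a made-up smoothing kernel used only to illustrate the workflow, not an EPR forward model.

      # Hedged sketch: conditioning check and non-negative reconstruction for a
      # linear measurement model y = A @ c (A is a stand-in kernel, not an EPR model).
      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(0)
      n = 40                                   # voxels along one dimension
      x = np.linspace(0.0, 1.0, n)

      # Stand-in system matrix: each measurement is a broad Gaussian average of the profile.
      centers = np.linspace(0.0, 1.0, 30)
      A = np.exp(-((x[None, :] - centers[:, None]) ** 2) / (2 * 0.05 ** 2))

      print("condition number:", np.linalg.cond(A))   # large value -> unstable inversion

      c_true = np.zeros(n)
      c_true[10:15] = 1.0                      # synthetic concentration profile
      y = A @ c_true + 0.01 * rng.standard_normal(A.shape[0])

      c_est, _ = nnls(A, y)                    # non-negativity acts as a regularizer
      print("relative reconstruction error:", np.linalg.norm(c_est - c_true) / np.linalg.norm(c_true))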

  15. Experimental setup and procedure for the measurement of the 7Be(n,p)7Li reaction at n_TOF

    NASA Astrophysics Data System (ADS)

    Barbagallo, M.; Andrzejewski, J.; Mastromarco, M.; Perkowski, J.; Damone, L. A.; Gawlik, A.; Cosentino, L.; Finocchiaro, P.; Maugeri, E. A.; Mazzone, A.; Dressler, R.; Heinitz, S.; Kivel, N.; Schumann, D.; Colonna, N.; Aberle, O.; Amaducci, S.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Bellia, G.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cortés-Giraldo, M. A.; Cristallo, S.; Diakaki, M.; Dietz, M.; Domingo-Pardo, C.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Furman, V.; Göbel, K.; García, A. R.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González-Romero, E.; Griesmayer, E.; Guerrero, C.; Gunsing, F.; Harada, H.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Johnston, K.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lerendegui-Marco, J.; Lo Meo, S.; Lonsdale, S. J.; Macina, D.; Manna, A.; Marganiec, J.; Martínez, T.; Martins-Correia, J. G.; Masi, A.; Massimi, C.; Mastinu, P.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Pappalardo, A. D.; Patronis, N.; Pavlik, A.; Piscopo, M.; Porras, I.; Praena, J.; Quesada, J. M.; Radeck, D.; Rauscher, T.; Reifarth, R.; Robles, M. S.; Rubbia, C.; Ryan, J. A.; Sabaté-Gilarte, M.; Saxena, A.; Schell, J.; Schillebeeckx, P.; Sedyshev, P.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Warren, S.; Weiss, C.; Woods, P. J.; Wright, T.; Žugec, P.

    2018-04-01

    Following the completion of the second neutron beam line and the related experimental area (EAR2) at the n_TOF spallation neutron source at CERN, several experiments were planned and performed. The high instantaneous neutron flux available in EAR2 allows the investigation of neutron-induced reactions with charged particles in the exit channel, even employing targets made of small amounts of short-lived radioactive isotopes. After the successful measurement of the 7Be(n,α)α cross section, the 7Be(n,p)7Li reaction was studied in order to provide still missing cross section data of relevance for Big Bang Nucleosynthesis (BBN), in an attempt to find a solution to the cosmological lithium abundance problem. This paper describes the experimental setup employed in such a measurement and its characterization.

  16. The Army and the Need for an Amphibious Capability

    DTIC Science & Technology

    2015-05-23

    prevailing Army-Marine amphibious set-up was unsound because only the Army had both the means and the grasp of the problem to plan, prepare, and... The Army and the Need for an Amphibious Capability A Monograph by MAJ Joseph E. Malone United States Army...

  17. Dakota Graphical User Interface v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest; Glickman, Matthew; Gibson, Marcus

    Graphical analysis environment for Sandia’s Dakota software for optimization and uncertainty quantification. The Dakota GUI is an interactive graphical analysis environment for creating, running, and interpreting Dakota optimization and uncertainty quantification studies. It includes problem (Dakota study) set-up, option specification, simulation interfacing, analysis execution, and results visualization. Through the use of wizards, templates, and views, Dakota GUI helps users navigate Dakota’s complex capability landscape.

  18. Machine Tool Technology. Automatic Screw Machine Troubleshooting & Set-Up Training Outlines [and] Basic Operator's Skills Set List.

    ERIC Educational Resources Information Center

    Anoka-Hennepin Technical Coll., Minneapolis, MN.

    This set of two training outlines and one basic skills set list are designed for a machine tool technology program developed during a project to retrain defense industry workers at risk of job loss or dislocation because of conversion of the defense industry. The first troubleshooting training outline lists the categories of problems that develop…

  19. The role of health care ADR (alternative dispute resolution) in reducing legal fees.

    PubMed

    Joseph, D M

    1995-11-01

    An increasingly complex health care system undergoing rapid changes is an ideal set-up for frequent conflicts among the numerous participants. While conflict is inevitable, the manner in which it is handled can markedly affect the outcome of the dispute and the future relationship of the parties, as well as the emotional and financial cost of the dispute. This article presents an overview of the principles and processes of alternative dispute resolution (ADR), and describes how these processes are currently being used to resolve health care disputes.

  20. Testing Instrument for Flight-Simulator Displays

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1987-01-01

    Displays for flight-training simulators rapidly aligned with aid of integrated optical instrument. Calibrations and tests such as aligning boresight of display with respect to user's eyes, checking and adjusting display horizon, checking image sharpness, measuring illuminance of displayed scenes, and measuring distance of optical focus of scene performed with single unit. New instrument combines all measurement devices in single, compact, integrated unit. Requires just one initial setup. Employs laser and produces narrow, collimated beam for greater measurement accuracy. Uses only one moving part, double right prism, to position laser beam.

  1. A small-displacement sensor using total internal reflection theory and surface plasmon resonance technology for heterodyne interferometry.

    PubMed

    Wang, Shinn-Fwu

    2009-01-01

    A small-displacement sensor based on total-internal reflection theory and surface plasmon resonance technology is proposed for use in heterodyne interferometry. A small displacement can be obtained simply by measuring the variation in phase difference between s- and p-polarization states with the small-displacement sensor. The theoretical displacement resolution of the small-displacement sensor can reach 0.45 nm. The sensor has some additional advantages, e.g., a simple optical setup, high resolution, high sensitivity and rapid measurement. Its feasibility is also demonstrated.

  2. Design and optimization of input shapers for liquid slosh suppression

    NASA Astrophysics Data System (ADS)

    Aboel-Hassan, Ameen; Arafa, Mustafa; Nassef, Ashraf

    2009-02-01

    The need for fast maneuvering and accurate positioning of flexible structures poses a control challenge. The inherent flexibility in these lightly damped systems creates large undesirable residual vibrations in response to rapid excitations. Several control approaches have been proposed to tackle this class of problems, of which the input shaping technique is appealing in many aspects. While input shaping has been widely investigated to attenuate residual vibrations in flexible structures, less attention was granted to expand its viability in further applications. The aim of this work is to develop a methodology for applying input shaping techniques to suppress sloshing effects in open moving containers to facilitate safe and fast point-to-point movements. The liquid behavior is modeled using finite element analysis. The input shaper parameters are optimized to find the commands that would result in minimum residual vibration. Other objectives, such as improved robustness, and motion constraints such as deflection limiting are also addressed in the optimization scheme. Numerical results are verified on an experimental setup consisting of a small motor-driven water tank undergoing rectilinear motion, while measuring both the tank motion and free surface displacement of the water. The results obtained suggest that input shaping is an effective method for liquid slosh suppression.
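    Input shaping of the kind investigated here works by convolving the motion command with a short impulse sequence tuned to the dominant slosh mode. As a minimal illustration, the sketch below builds the classic two-impulse zero-vibration (ZV) shaper for an assumed natural frequency and damping ratio and applies it to a step command; the numerical values and the choice of a plain ZV shaper (rather than the optimized, robust shapers of the paper) are illustrative assumptions.

      # Hedged sketch: zero-vibration (ZV) input shaper for an assumed slosh mode.
      import numpy as np

      f_n = 1.2          # assumed slosh natural frequency [Hz]
      zeta = 0.02        # assumed damping ratio
      dt = 0.001         # command sample time [s]

      w_n = 2 * np.pi * f_n
      w_d = w_n * np.sqrt(1 - zeta**2)
      K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta**2))

      # Two impulses: amplitudes sum to 1, second impulse delayed by half a damped period.
      amps = np.array([1.0, K]) / (1.0 + K)
      times = np.array([0.0, np.pi / w_d])

      # Build the shaper as an impulse train and convolve it with a step command.
      shaper = np.zeros(int(round(times[-1] / dt)) + 1)
      for a, t in zip(amps, times):
          shaper[int(round(t / dt))] += a

      step = np.ones(int(2.0 / dt))             # 2 s unit step command
      shaped = np.convolve(step, shaper)[:len(step)]
      print(shaped[:5], shaped[-5:])            # staircase command: first 1/(1+K), then 1.0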

  3. Modelling the Centers of Galaxies

    NASA Technical Reports Server (NTRS)

    Smith, B. F.; Miller, R. H.; Young, Richard E. (Technical Monitor)

    1997-01-01

    The key to studying central regions by means of n-body numerical experiments is to concentrate on the central few parsecs of a galaxy, replacing the remainder of the galaxy by a suitable boundary condition, rather after the manner in which stellar interiors can be studied without a detailed stellar atmosphere by replacing the atmosphere with a boundary condition. Replacements must be carefully designed because the long range gravitational force means that the core region is sensitive to mass outside that region and because particles can exchange between the outer galaxy and the core region. We use periodic boundary conditions, coupled with an iterative procedure to generate initial particle loads in isothermal equilibrium. Angular momentum conservation is ensured for problems including systematic rotation by a circular reflecting boundary and by integrating in a frame that rotates with the mean flow. Mass beyond the boundary contributes to the gravitational potential, but does not participate in the dynamics. A symplectic integration scheme has been developed for rotating coordinate systems. This combination works well, leading to robust configurations. Some preliminary results with this combination show that: (1) Rotating systems are extremely sensitive to non-axisymmetric external potentials, and (2) a second core, orbiting near the main core (like the M31 second core system), shows extremely rapid orbital decay. The experimental setups will be discussed, along with preliminary results.
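    The abstract mentions a symplectic integration scheme developed for rotating coordinate systems. As background, the sketch below shows the plain (non-rotating) kick-drift-kick leapfrog that such schemes extend, applied to a Kepler orbit; it is a generic illustration, not the scheme from the paper.

      # Hedged sketch: kick-drift-kick leapfrog (a symplectic integrator) for a point
      # mass in a Kepler potential; generic example, not the paper's rotating-frame scheme.
      import numpy as np

      GM = 1.0
      dt = 1e-3

      def accel(r):
          return -GM * r / np.linalg.norm(r) ** 3

      r = np.array([1.0, 0.0])        # initial position
      v = np.array([0.0, 1.0])        # circular-orbit speed for GM = 1, r = 1

      for _ in range(20000):
          v = v + 0.5 * dt * accel(r)   # half kick
          r = r + dt * v                # drift
          v = v + 0.5 * dt * accel(r)   # half kick

      energy = 0.5 * np.dot(v, v) - GM / np.linalg.norm(r)
      print("energy (should stay near -0.5):", energy)   # symplectic schemes bound the energy error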

  4. High Resolution Melting (HRM) for High-Throughput Genotyping-Limitations and Caveats in Practical Case Studies.

    PubMed

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik

    2017-11-03

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  5. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    PubMed Central

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz

    2017-01-01

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup. PMID:29099791

  6. Coupling LaGrit unstructured mesh generation and model setup with TOUGH2 flow and transport: A case study

    DOE PAGES

    Sentis, Manuel Lorenzo; Gable, Carl W.

    2017-06-15

    There are many applications in science and engineering modeling where an accurate representation of a complex model geometry in the form of a mesh is important. In applications of flow and transport in subsurface porous media, this is manifest in models that must capture complex geologic stratigraphy, structure (faults, folds, erosion, deposition) and infrastructure (tunnels, boreholes, excavations). Model setup, defined as the activities of geometry definition, mesh generation (creation, optimization, modification, refine, de-refine, smooth), assigning material properties, initial conditions and boundary conditions, requires specialized software tools to automate and streamline the process. In addition, some model setup tools will provide more utility if they are designed to interface with and meet the needs of a particular flow and transport software suite. A control volume discretization that uses a two point flux approximation is, for example, most accurate when the underlying control volumes are 2D or 3D Voronoi tessellations. In this paper we will present the coupling of LaGriT, a mesh generation and model setup software suite, and TOUGH2 to model subsurface flow problems, and we show an example of how LaGriT can be used as a model setup tool for the generation of a Voronoi mesh for the simulation program TOUGH2. To generate the MESH file for TOUGH2 from the LaGriT output, a standalone module Lagrit2Tough2 was developed, which is presented here and will be included in a future release of LaGriT. In this paper an alternative method to generate a Voronoi mesh for TOUGH2 with LaGriT is presented, and thanks to the modular and command-based structure of LaGriT this method is well suited to generating a mesh for complex models.

  7. Nonequilibrium thermodynamics of single DNA hairpins in a dual-trap optical tweezers setup

    NASA Astrophysics Data System (ADS)

    Crivellari, M. Ribezzi; Huguet, J. M.; Ritort, F.

    2011-03-01

    We use two counter-propagating laser beams to create a dual-trap optical tweezers setup which is free from cross interference between the beams and provides great instrumental stability. This setup works by direct measurement of light momentum, separately for each trap, and is based on the Minitweezers design [1]. The dual-trap setup has many applications: it can be used to study the force-dependent unfolding kinetics of single molecules and to address fundamental problems in nonequilibrium thermodynamics of small systems [2]. Recent progress in statistical physics has shown the importance of considering large energy deviations in the behavior of systems that are driven out of equilibrium by time-dependent forces. Prominent examples are nonequilibrium work relations (e.g. the Jarzynski equality [3]) and fluctuation theorems. By repeated measurement of the irreversible work, the Jarzynski equality allows us to recover the free energy difference ΔF between two thermodynamic states by taking exponential averages of the work W done by the external agent on the system, ⟨e^(-βW)⟩ = e^(-βΔF).
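    The Jarzynski equality the abstract invokes, and the corresponding free-energy estimator from N repeated pulls, can be stated compactly; the relation below is standard background included for clarity rather than a result of the paper.

      \[
        \left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}
        \qquad\Longrightarrow\qquad
        \Delta F \simeq -\frac{1}{\beta}\,
          \ln\!\left(\frac{1}{N}\sum_{k=1}^{N} e^{-\beta W_k}\right),
      \]

    where β = 1/k_B T and W_k is the irreversible work measured in the k-th repetition of the pulling protocol.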

  8. Coupling LaGrit unstructured mesh generation and model setup with TOUGH2 flow and transport: A case study

    NASA Astrophysics Data System (ADS)

    Sentís, Manuel Lorenzo; Gable, Carl W.

    2017-11-01

    There are many applications in science and engineering modeling where an accurate representation of a complex model geometry in the form of a mesh is important. In applications of flow and transport in subsurface porous media, this is manifest in models that must capture complex geologic stratigraphy, structure (faults, folds, erosion, deposition) and infrastructure (tunnels, boreholes, excavations). Model setup, defined as the activities of geometry definition, mesh generation (creation, optimization, modification, refine, de-refine, smooth), assigning material properties, initial conditions and boundary conditions, requires specialized software tools to automate and streamline the process. In addition, some model setup tools will provide more utility if they are designed to interface with and meet the needs of a particular flow and transport software suite. A control volume discretization that uses a two point flux approximation is, for example, most accurate when the underlying control volumes are 2D or 3D Voronoi tessellations. In this paper we will present the coupling of LaGriT, a mesh generation and model setup software suite, and TOUGH2 (Pruess et al., 1999) to model subsurface flow problems, and we show an example of how LaGriT can be used as a model setup tool for the generation of a Voronoi mesh for the simulation program TOUGH2. To generate the MESH file for TOUGH2 from the LaGriT output, a standalone module Lagrit2Tough2 was developed, which is presented here and will be included in a future release of LaGriT. In this paper an alternative method to generate a Voronoi mesh for TOUGH2 with LaGriT is presented, and thanks to the modular and command-based structure of LaGriT this method is well suited to generating a mesh for complex models.
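    The remark that a two-point flux approximation is most accurate on Voronoi control volumes follows from the form of the discrete flux; the standard expression is shown below as background rather than as a result of the paper.

      \[
        F_{ij} \;=\; -\,k\,A_{ij}\,\frac{p_j - p_i}{d_{ij}},
      \]

    where A_ij is the interface area shared by cells i and j and d_ij is the distance between their centers. On a Voronoi tessellation the segment joining the two cell centers is orthogonal to the shared face, which is what makes this two-point expression consistent.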

  9. Regulation of Renewable Energy Sources to Optimal Power Flow Solutions Using ADMM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Zhang, Yijian; Hong, Mingyi

    This paper considers power distribution systems featuring renewable energy sources (RESs), and develops a distributed optimization method to steer the RES output powers to solutions of AC optimal power flow (OPF) problems. The design of the proposed method leverages suitable linear approximations of the AC-power flow equations, and is based on the Alternating Direction Method of Multipliers (ADMM). Convergence of the RES-inverter output powers to solutions of the OPF problem is established under suitable conditions on the stepsize as well as mismatches between the commanded setpoints and actual RES output powers. In a broad sense, the methods and results proposed here are also applicable to other distributed optimization problem setups with ADMM and inexact dual updates.

  10. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM), is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input), in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The User's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  11. Ergonomics in the electronic library.

    PubMed Central

    Thibodeau, P L; Melamut, S J

    1995-01-01

    New technologies are changing the face of information services and how those services are delivered. Libraries spend a great deal of time planning the hardware and software implementations of electronic information services, but the human factors are often overlooked. Computers and electronic tools have changed the nature of many librarians' daily work, creating new problems, including stress, fatigue, and cumulative trauma disorders. Ergonomic issues need to be considered when designing or redesigning facilities for electronic resources and services. Libraries can prevent some of the common problems that appear in the digital workplace by paying attention to basic ergonomic issues when designing workstations and work areas. Proper monitor placement, lighting, workstation setup, and seating prevent many of the common occupational problems associated with computers. Staff training will further reduce the likelihood of ergonomic problems in the electronic workplace. PMID:7581189

  12. Calorimetric method of ac loss measurement in a rotating magnetic field.

    PubMed

    Ghoshal, P K; Coombs, T A; Campbell, A M

    2010-07-01

    A method is described for calorimetric ac-loss measurements of high-Tc superconductors (HTS) at 80 K. It is based on a technique used at 4.2 K for conventional superconducting wires that allows an easy loss measurement in parallel or perpendicular external field orientation. This paper focuses on the ac-loss measurement setup and its calibration in a rotating magnetic field. The experimental setup demonstrates loss measurement via a temperature-rise method under the influence of a rotating magnetic field. The slight temperature increase of the sample in an ac field is used as a measure of the losses. The aim is to simulate the loss in rotating machines using HTS. This is a unique technique to measure total ac loss in HTS at power frequencies. The sample is mounted onto a cold finger extended from a liquid nitrogen heat exchanger (HEX). The thermal insulation between the HEX and the sample is provided by a sample holder of low thermal conductivity and low eddy-current heating, placed in a vacuum vessel. A temperature sensor and a noninductive heater have been incorporated in the sample holder, allowing a rapid sample change. The main part of the data obtained in the calorimetric measurement is used for calibration. The focus is on the accuracy and calibrations required to predict the actual ac losses in HTS. This setup has the advantage of being able to measure the total ac loss under the influence of a continuously moving field, as experienced by any rotating machine.
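    The calibration step referred to above is conceptually simple. Assuming the sample-holder temperature rise is linear in dissipated power over the small range involved (our assumption, not necessarily the paper's exact procedure), the heater provides the conversion factor:

      \[
        P_\mathrm{ac} \;\approx\; P_\mathrm{heater}\,
          \frac{\Delta T_\mathrm{ac}}{\Delta T_\mathrm{heater}},
      \]

    where ΔT_heater is the steady temperature rise produced by a known heater power P_heater and ΔT_ac is the rise observed under the rotating field.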

  13. X-band EPR setup with THz light excitation of Novosibirsk Free Electron Laser: Goals, means, useful extras

    NASA Astrophysics Data System (ADS)

    Veber, Sergey L.; Tumanov, Sergey V.; Fursova, Elena Yu.; Shevchenko, Oleg A.; Getmanov, Yaroslav V.; Scheglov, Mikhail A.; Kubarev, Vitaly V.; Shevchenko, Daria A.; Gorbachev, Iaroslav I.; Salikova, Tatiana V.; Kulipanov, Gennady N.; Ovcharenko, Victor I.; Fedin, Matvey V.

    2018-03-01

    Electron Paramagnetic Resonance (EPR) station at the Novosibirsk Free Electron Laser (NovoFEL) user facility is described. It is based on X-band (∼9 GHz) EPR spectrometer and operates in both Continuous Wave (CW) and Time-Resolved (TR) modes, each allowing detection of either direct or indirect influence of high-power NovoFEL light (THz and mid-IR) on the spin system under study. The optics components including two parabolic mirrors, shutters, optical chopper and multimodal waveguide allow the light of NovoFEL to be directly fed into the EPR resonator. Characteristics of the NovoFEL radiation, the transmission and polarization-retaining properties of the waveguide used in EPR experiments are presented. The types of proposed experiments accessible using this setup are sketched. In most practical cases the high-power radiation applied to the sample induces its rapid temperature increase (T-jump), which is best visible in TR mode. Although such influence is a by-product of THz radiation, this thermal effect is controllable and can deliberately be used to induce and measure transient signals of arbitrary samples. The advantage of tunable THz radiation is the absence of photo-induced processes in the sample and its high penetration ability, allowing fast heating of a large portion of virtually any sample and inducing intense transients. Such T-jump TR EPR spectroscopy with THz pulses has been previewed for the two test samples, being a useful supplement for the main goals of the created setup.

  14. Time-lapse recordings of human corneal epithelial healing.

    PubMed

    Hardarson, Thorir; Hanson, Charles; Claesson, Margareta; Stenevi, Ulf

    2004-04-01

    The aim of this study was to design an experimental set-up for the study of human corneal epithelial wound healing in a controlled in vitro situation. A time-lapse set-up was used. This allowed for pictures to be captured with a magnification ranging from x 80 to x 1800. Pictures were captured at 1-min intervals during the observation period, which lasted up to 4 days. Human corneal tissue was obtained from the Eye Bank or from surgery. A small, rounded lesion was produced in the corneal epithelium with a miniature drill. The specimens were placed in a mini-incubator; the camera focused on the epithelial lesion and continuously observed using the time-lapse set-up. The healing process of human corneal epithelium could be followed for several days. The initial healing response could be divided into a slow, a rapid and a consolidating phase. The first two phases lasted about 12 hours, and by then, epithelial cells covered the lesion. Depending on the origin of the tissue and the placement of the lesion, variations in the healing response could be seen. The time-lapse technique makes it possible to study epithelial wound healing over time at the cellular level. Data collected in this way can fill the gap between in vivo studies, where, by nature, human wound healing studies are restricted, and cell culture techniques, where cellular responses in many cases differ from the in vivo situation.

  15. Diet for rapid weight loss

    MedlinePlus

    ... loss - rapid weight loss; Overweight - rapid weight loss; Obesity - rapid weight loss; Diet - rapid weight loss ... for people who have health problems because of obesity. For these people, losing a lot of weight ...

  16. Optimizing the triple-axis spectrometer PANDA at the MLZ for small samples and complex sample environment conditions

    NASA Astrophysics Data System (ADS)

    Utschick, C.; Skoulatos, M.; Schneidewind, A.; Böni, P.

    2016-11-01

    The cold-neutron triple-axis spectrometer PANDA at the neutron source FRM II has been serving an international user community studying condensed matter physics problems. We report on a new setup, improving the signal-to-noise ratio for small samples and pressure cell setups. Analytical and numerical Monte Carlo methods are used for the optimization of elliptic and parabolic focusing guides. They are placed between the monochromator and sample positions, and the flux at the sample is compared to the one achieved by standard monochromator focusing techniques. A 25 times smaller spot size is achieved, associated with a factor of 2 increased intensity, within the same divergence limits, ±2°. This optional neutron focusing guide shall establish a top-class spectrometer for studying novel exotic properties of matter in combination with more stringent sample environment conditions such as extreme pressures associated with small sample sizes.

  17. Physics, ballistics, and psychology: a history of the chronoscope in/as context, 1845-1890.

    PubMed

    Schmidgen, Henning

    2005-02-01

    In Wilhelm Wundt's (1832-1920) Leipzig laboratory and at numerous other research sites, the chronoscope was used to conduct reaction time experiments. The author argues that the history of the chronoscope is the history not of an instrument but of an experimental setup. This setup was initially devised by the English physicist and instrument maker Charles Wheatstone (1802-1875) in the early 1840s. Shortly thereafter, it was improved by the German clockmaker and mechanic Matthäus Hipp (1813-1893). In the 1850s, the chronoscope was introduced to ballistic research. In the early 1860s, Neuchâtel astronomer Adolphe Hirsch (1830-1901) applied it to the problem of physiological time. The extensions and variations of chronoscope use within the contexts of ballistics, physiology, and psychology presented special challenges. These challenges were met with specific attempts to reduce the errors in chronoscopic experiments on shooting stands and in the psychological laboratory.

  18. The Adiabatic Theorem and Linear Response Theory for Extended Quantum Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, Sven; De Roeck, Wojciech; Fraas, Martin

    2018-03-01

    The adiabatic theorem refers to a setup where an evolution equation contains a time-dependent parameter whose change is very slow, measured by a vanishing parameter ɛ. Under suitable assumptions the solution of the time-inhomogeneous equation stays close to an instantaneous fixpoint. In the present paper, we prove an adiabatic theorem with an error bound that is independent of the number of degrees of freedom. Our setup is that of quantum spin systems where the manifold of ground states is separated from the rest of the spectrum by a spectral gap. One important application is the proof of the validity of linear response theory for such extended, genuinely interacting systems. In general, this is a long-standing mathematical problem, which can be solved in the present particular case of a gapped system, relevant, e.g., for the integer quantum Hall effect.

  19. Implementation of Hadamard spectroscopy using MOEMS as a coded aperture

    NASA Astrophysics Data System (ADS)

    Vasile, T.; Damian, V.; Coltuc, D.; Garoi, F.; Udrea, C.

    2015-02-01

    Although spectrometers have nowadays reached a high level of performance, output signals are often weak, and traditional slit spectrometers still confront the problem of poor optical throughput, which limits their efficiency in low-light conditions. In order to overcome these issues, Hadamard Spectroscopy (HS) was implemented in a conventional Ebert-Fastie type spectrometer setup by substituting the exit slit with a digital micro-mirror device (DMD), which acts as a coded aperture. The theory behind HS and the functionality of the DMD are presented. The improvements brought by HS are demonstrated by means of a spectrometric experiment in which a higher-SNR spectrum is acquired. Comparative experiments were conducted in order to quantify the SNR difference between HS and the scanning slit method. The results show an SNR gain of 3.35 in favor of HS. One can conclude that the HS method is a great asset for low-light spectrometric experiments.
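
    To illustrate the multiplex advantage that motivates HS, the sketch below simulates an S-matrix mask pattern (derived from a Hadamard matrix) against a scanning slit for a detector-noise-limited toy spectrum. It illustrates the principle only, not the Ebert-Fastie/DMD implementation of the paper; the spectrum, noise level and matrix order are assumed.

      # Minimal sketch of the Hadamard multiplex advantage (assumed detector-noise-
      # limited case). An S-matrix mask pattern is simulated and compared with a
      # scanning slit; spectrum and noise level are illustrative placeholders.
      import numpy as np
      from scipy.linalg import hadamard

      rng = np.random.default_rng(0)
      n = 32                                   # Hadamard order (power of 2)
      H = hadamard(n)
      S = (1 - H[1:, 1:]) // 2                 # S-matrix of order n-1 (0/1 mask rows)
      m = n - 1

      true_spectrum = np.exp(-0.5 * ((np.arange(m) - 12) / 3.0) ** 2)  # toy line
      sigma = 0.05                             # detector noise per readout

      # Scanning slit: one spectral channel per readout.
      slit = true_spectrum + rng.normal(0, sigma, m)

      # Hadamard: about half the channels open per readout, then decode.
      multiplexed = S @ true_spectrum + rng.normal(0, sigma, m)
      decoded = np.linalg.solve(S, multiplexed)

      print("slit RMS error    :", np.sqrt(np.mean((slit - true_spectrum) ** 2)))
      print("Hadamard RMS error:", np.sqrt(np.mean((decoded - true_spectrum) ** 2)))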

  20. Wide band continuous all-fiber comb generator at 1.5 micron

    NASA Astrophysics Data System (ADS)

    Lemaître, François; Mondin, Linda; Orlik, X.

    2017-11-01

    We present an all-fiber continuous optical frequency comb generator (OFCG) able to generate over 6 nm (750 GHz) at 1560 nm using a combination of electro-optic and acousto-optic modulations. As opposed to numerous experimental setups that use the longitudinal modes of an optical cavity to generate continuous optical frequency combs, our setup does not need any active stabilization of the cavity length, since we use the intrinsically high stability of radiofrequency sources to generate the multiple lines of the comb laser. Moreover, compared to the work of ref [1], the hybrid optical modulation we use suppresses the instability caused by interference between the generated lines. We note that these lines benefit from the spectral quality of the seed laser, because the spectral widths of the synthesized hyperfrequency and radiofrequency signals are generally narrower than those of laser sources.

  1. Controller design for a class of nonlinear MIMO coupled system using multiple models and second level adaptation.

    PubMed

    Pandey, Vinay Kumar; Kar, Indrani; Mahanta, Chitralekha

    2017-07-01

    In this paper, an adaptive control method using multiple models with second level adaptation is proposed for a class of nonlinear multi-input multi-output (MIMO) coupled systems. Multiple estimation models are used to tune the unknown parameters at the first level. The second level adaptation provides a single parameter vector for the controller. A feedback linearization technique is used to design a state feedback control. The efficacy of the designed controller is validated by conducting real-time experiments on a laboratory setup of the twin rotor MIMO system (TRMS). The TRMS setup is discussed in detail, and experiments were performed for the regulation and tracking problems of pitch and yaw control using different reference signals. An Extended Kalman Filter (EKF) has been used to observe the unavailable states of the TRMS. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Improved resolution in practical light microscopy by means of a glass-fiber 2π-tilting device

    NASA Astrophysics Data System (ADS)

    Bradl, Joachim; Rinke, Bernd; Schneider, Bernhard; Hausmann, Michael; Cremer, Christoph G.

    1996-01-01

    The spatial resolution of a conventional light microscope or a confocal laser scanning microscope can be determined by calculating the point spread function for the objective used. Normally, ideal conditions are assumed for these calculations. Such conditions, however, are often not fulfilled in biological applications especially in those cases where biochemical requirements (e.g. buffer conditions) influence the specimen preparation on the microscope slide (i.e. 'practical' light microscopy). It has been shown that the problem of a reduced z-resolution in 3D-microscopy (optical sectioning) can be overcome by a capillary in a 2π-tilting device that allows object rotation into an optimal perspective. The application of the glass capillary instead of a standard slide has an additional influence on the imaging properties of the microscope. Therefore, another 2π-tilting device was developed, using a glass fiber for object fixation and rotation. Such a fiber could be covered by standard cover glasses. To estimate the resolution of this setup, point spread functions were measured under different conditions using fluorescent microspheres of subwavelength dimensions. Results obtained from standard slide setups were compared to the glass fiber setup. These results showed that in practice rotation leads to an overall 3D-resolution improvement.

  3. Development and testing of a homogenous multi-wavelength LED light source

    NASA Astrophysics Data System (ADS)

    Bolton, Frank J.; Bernat, Amir; Jacques, Steven L.; Levitz, David

    2017-03-01

    Multispectral imaging of human tissue is a powerful method that allows for quantifying the scattering and absorption parameters of the tissue and differentiating tissue types or identifying pathology. This method requires imaging at multiple wavelengths and then fitting the measured data to a model based on light transport theory. Earlier, a mobile phone based multi-spectral imaging system was developed to image the uterine cervix from the colposcopy geometry, outside the patient's body at a distance of 200-300 mm. Such imaging of a distant object has inherent challenges, as bright and homogenous illumination is required. Several solutions addressing this problem were developed, with varied degrees of success. In this paper, several multi-spectral illumination setups were developed and tested for brightness and uniformity. All setups were specifically designed with low cost in mind, utilizing a printed circuit board with surface-mounted LEDs. The three setups include: LEDs illuminating the target directly, LEDs focused by a 3D printed miniature lens array, and LEDs coupled to a mixing lens and focusing optical system. In order to compare the illumination uniformity and intensity performance, two experiments were performed. Test results are presented, and various tradeoffs between the three system configurations are discussed.

  4. The learning unit "Orthodontic set-up" as a new-media module in teaching.

    PubMed

    Asselmeyer, T; Fischer, V; Matthies, H; Schwestka-Polly, R

    2004-07-01

    The present study examines the extent to which computer-assisted learning units provided independently of place and time are used in self-study as a supplement to the classical classroom instruction of dental students. Indications as to whether such teaching modules improve training in orthodontics should be obtained from this. Attention was focussed on the implementation and evaluation of the "Orthodontic set-up" teaching module, which can be accessed in the Internet and Intranet of the university. The didactic arrangement offered classical university courses in parallel (four lectures on the subjects of occlusion, function, diagnostics, and therapy) in addition to the electronically communicated teaching contents. In addition, intensive supervision during the production of the set-up was guaranteed. The use of this multimedia learning concept was in general assessed positively by 63 surveyed students in the 2002/03 winter semester. The results revealed on the one hand the intensity of use and features of the acquisition of knowledge (use types), and on the other hand, in terms of professional relevance, the contents were found to be well explained, didactically attractive, and understandably presented. However, numerous drawbacks were also mentioned (technical and time problems; qualification deficits). The experience gained in this project should encourage more future investment in the development of alternative university didactic models.

  5. Readiness of the ATLAS detector: Performance with the first beam and cosmic data

    NASA Astrophysics Data System (ADS)

    Pastore, F.

    2010-05-01

    During 2008 the ATLAS experiment went through an intense period of preparation to have the detector fully commissioned for the first beam period. In about 30 h of beam time available to ATLAS in 2008 the systems went through a rapid setup sequence, from successfully recording the first bunch ever reaching ATLAS, to setting up the timing of the trigger system synchronous to the incoming single beams. The so-called splash events were recorded, where the beam was stopped on a collimator 140 m upstream of ATLAS, showering the experiment with millions of particles per beam shot. These events were found to be extremely useful for timing setup. After the stop of the beam operation, the experiment went through an extensive cosmic ray data taking campaign, recording more than 500 million cosmic ray events. These events have been used to make significant progress on the calibration and alignment of the detector. This paper describes the commissioning programme and the results obtained from both the single beam data and the cosmic data recorded in 2008.

  6. Applicability of UV laser-induced solid-state fluorescence spectroscopy for characterization of solid dosage forms.

    PubMed

    Woltmann, Eva; Meyer, Hans; Weigel, Diana; Pritzke, Heinz; Posch, Tjorben N; Kler, Pablo A; Schürmann, Klaus; Roscher, Jörg; Huhn, Carolin

    2014-10-01

    High production output of solid pharmaceutical formulations requires fast methods to ensure their quality. Likewise, fast analytical procedures are required in forensic sciences, for example at customs, to substantiate an initial suspicion. We here present the design and the optimization of an instrumental setup for rapid and non-invasive characterization of tablets by laser-induced fluorescence spectroscopy (with a UV laser, λex = 266 nm, as excitation source) in reflection geometry. The setup was first validated with regard to repeatability, bleaching phenomena, and sensitivity. The effect of the physical and chemical properties of the samples on the spectra, e.g. their hardness, homogeneity, chemical composition, and the granule grain size of the uncompressed material, was investigated using a series of tablets manufactured in accordance with design of experiments. Investigation of tablets with regard to homogeneity, especially, is extremely important in pharmaceutical production processes. We demonstrate that multiplicative scatter correction is an appropriate tool for data preprocessing of fluorescence spectra. Tablets with different physical and chemical characteristics can be discriminated well from their fluorescence spectra by subjecting the results to principal component analysis.
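
    As an illustration of the multiplicative scatter correction step mentioned above, the following minimal sketch regresses each spectrum against the mean spectrum and removes the fitted offset and gain. The toy spectra and scatter effects are invented for demonstration.

      # Minimal sketch of multiplicative scatter correction (MSC) as a spectral
      # preprocessing step: each spectrum is regressed linearly against the mean
      # spectrum, then offset- and slope-corrected. The toy spectra are illustrative.
      import numpy as np

      def msc(spectra):
          """spectra: (n_samples, n_wavelengths). Returns MSC-corrected spectra."""
          reference = spectra.mean(axis=0)
          corrected = np.empty_like(spectra, dtype=float)
          for i, s in enumerate(spectra):
              # Fit s ≈ a + b * reference by ordinary least squares.
              b, a = np.polyfit(reference, s, deg=1)
              corrected[i] = (s - a) / b
          return corrected

      rng = np.random.default_rng(1)
      wl = np.linspace(0, 1, 200)
      base = np.exp(-0.5 * ((wl - 0.4) / 0.05) ** 2)        # common fluorescence band
      # Simulated scatter effects: random gain and baseline offset per tablet.
      raw = np.array([rng.uniform(0.7, 1.3) * base + rng.uniform(-0.1, 0.1)
                      for _ in range(5)])
      print("spread before MSC:", raw.std(axis=0).max())
      print("spread after  MSC:", msc(raw).std(axis=0).max())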

  7. Remote Advanced Payload Test Rig (RAPTR) Portable Payload Test System for the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Calvert, John; Freas, George, II

    2017-01-01

    The RAPTR was developed to test ISS payloads for NASA. RAPTR is a simulation of the Command and Data Handling (C&DH) interfaces of the ISS (MIL-STD 1553B, Ethernet and TAXI) and is designed to facilitate rapid testing and deployment of payload experiments to the ISS. The ISS Program's goal is to reduce the amount of time it takes a payload developer to build, test and fly a payload, including payload software. The RAPTR meets this need with its user oriented, visually rich interface. Additionally, the Analog and Discrete (A&D) signals of the following payload types may be tested with RAPTR: (1) EXPRESS Sub Rack Payloads; (2) ELC payloads; (3) External Columbus payloads; (4) External Japanese Experiment Module (JEM) payloads. The automated payload configuration setup and payload data inspection infrastructure is found nowhere else in ISS payload test systems. Testing can be done with minimal human intervention and setup, as the RAPTR automatically monitors parameters in the data headers that are sent to, and come from the experiment under test.

  8. In-Situ Swelling For Holographic Color Control

    NASA Astrophysics Data System (ADS)

    Walker Parker, Julie L.; Benton, Stephen A.

    1989-05-01

    Deliberate variations of the emulsion thickness between holographic exposures and reconstruction produce a range of output wavelengths from a fixed exposure wavelength, a technique known as "pseudo-color" multi-color reflection holography. Usual methods require the removal of the film or plate from the holographic setup between exposures for imbibition of a swelling agent, followed by drying and replacement, so that a retention of the swelling agent forces a physical increase in the thickness of the emulsion. The density (and hence the thickness) of the gelatin binder can also be varied by changing its electrolytic environment. By immersing the holographic emulsion in a suitable solution, allowing it to come to a new equilibrium thickness, and exposing with a long-wavelength laser, shorter wavelength reconstructions can be obtained without removing the film or plate from the setup. Accurate changes of solution can make a precise sequence of swellings possible, producing multiple reconstruction colors from a set of constant-wavelength recordings. Here we describe pre-treatments of the emulsion that make rapid and stable equilibria possible, and swelling bath sequences that produce color primaries suitable for full-color computer-graphic holographic imagery.

  9. Analysis of Feeder Bus Network Design and Scheduling Problems

    PubMed Central

    Almasi, Mohammad Hadi; Karim, Mohamed Rehan

    2014-01-01

    A growing concern for public transit is its inability to shift passenger's mode from private to public transport. In order to overcome this problem, a more developed feeder bus network and matched schedules will play important roles. The present paper aims to review some of the studies performed on Feeder Bus Network Design and Scheduling Problem (FNDSP) based on three distinctive parts of the FNDSP setup, namely, problem description, problem characteristics, and solution approaches. The problems consist of different subproblems including data preparation, feeder bus network design, route generation, and feeder bus scheduling. Subsequently, descriptive analysis and classification of previous works are presented to highlight the main characteristics and solution methods. Finally, some of the issues and trends for future research are identified. This paper is targeted at dealing with the FNDSP to exhibit strategic and tactical goals and also contributes to the unification of the field which might be a useful complement to the few existing reviews. PMID:24526890

  10. SU-E-J-15: Automatically Detect Patient Treatment Position and Orientation in KV Portal Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qiu, J; Yang, D

    2015-06-15

    Purpose: In the course of radiation therapy, the complex information processing workflow can result in potential errors, such as incorrect or inaccurate patient setups. With automatic image checks and patient identification, such errors could be effectively reduced. For this purpose, we developed a simple and rapid image processing method to automatically detect the patient position and orientation in 2D portal images, so as to allow automatic checks of positions and orientations for patients' daily RT treatments. Methods: Based on the principle of portal image formation, a set of whole body DRR images was reconstructed from multiple whole body CT volume datasets and fused together to be used as the matching template. To identify the patient setup position and orientation shown in a 2D portal image, the portal image was preprocessed (contrast enhancement, down-sampling and couch table detection), then matched to the template image so as to identify the laterality (left or right), position, orientation and treatment site. Results: Five days' worth of clinically qualified portal images were gathered randomly, then processed by the automatic detection and matching method without any additional information. The detection results were visually checked by physicists. Detection was correct for 182 of 200 kV portal images, a correct rate of 91%. Conclusion: The proposed method can detect patient setup and orientation quickly and automatically. It only requires the image intensity information in kV portal images. This method can be useful in the framework of Electronic Chart Check (ECCK) to reduce the potential errors in the workflow of radiation therapy and so improve patient safety. In addition, the auto-detection results, such as the patient treatment site position and patient orientation, could be useful to guide subsequent image processing procedures, e.g. verification of patient daily setup accuracy. This work was partially supported by a research grant from Varian Medical System.
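
    As a conceptual illustration only (the abstract does not detail the authors' matching algorithm), the sketch below locates a small portal-image patch inside a larger template by brute-force normalized cross-correlation; the arrays are synthetic stand-ins for the fused DRR template and a preprocessed kV portal image.

      # Conceptual sketch (not the authors' implementation): locating a small portal
      # image within a larger DRR template by brute-force normalized cross-correlation.
      # Orientation checks could be added by repeating the search on flipped or rotated
      # versions of the portal image. Arrays here are synthetic placeholders.
      import numpy as np

      def best_match(template, patch):
          """Return (row, col, score) of the best NCC match of patch inside template."""
          th, tw = template.shape
          ph, pw = patch.shape
          p = (patch - patch.mean()) / (patch.std() + 1e-12)
          best = (-1, -1, -np.inf)
          for r in range(th - ph + 1):
              for c in range(tw - pw + 1):
                  w = template[r:r + ph, c:c + pw]
                  w = (w - w.mean()) / (w.std() + 1e-12)
                  score = float((w * p).mean())
                  if score > best[2]:
                      best = (r, c, score)
          return best

      rng = np.random.default_rng(2)
      drr_template = rng.normal(size=(64, 64))          # stand-in for the fused DRR
      portal = drr_template[20:36, 30:46].copy()        # stand-in for a kV portal image
      print(best_match(drr_template, portal))           # expect (20, 30, ~1.0)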

  11. Design and experimental study of an integrated vapor chamber-thermal energy storage system

    NASA Astrophysics Data System (ADS)

    Kota, Krishna M.

    Future defense, aerospace and automotive technologies involve electronic systems that release high pulsed waste heat, such as during high power microwave and laser diode applications in tactical and combat aircraft, and electrical and electronic systems in hybrid electric vehicles, which will require the development of an efficient thermal management system. A key design issue is the need for fast charging so as not to overheat the key components. The goal of this work is to study the fabrication and technology implementation feasibility of a novel high energy storage, high heat flux passive heat sink. The key focus is to verify, by theory and experiments, the practicability of using phase change materials as temporary storage of waste heat for heat sink applications. The reason for storing the high heat fluxes temporarily is to be able to reject the heat at the average level when the heat source is off. Accordingly, a concept of a dual latent heat sink intended for moderate to low thermal duty cycle electronic heat sink applications is presented. This heat sink design combines the features of a vapor chamber with rapid thermal energy storage employing graphite foam inside the heat storage facility along with phase change materials, and is attractive owing to its passive operation, unlike some of the current thermal management techniques for cooling of electronics that employ forced air circulation or external heat exchangers. In addition to the concept, end-application dependent criteria to select an optimized design for this dual latent heat sink are presented. A thermal resistance concept based design tool/model has been developed to analyze and optimize the design for experiments. The model showed that it is possible to have a dual latent heat sink design capable of handling 7 MJ of thermal load at a heat flux of 500 W/cm2 (over an area of 100 cm2) with a volume of 0.072 m3 and weighing about 57.5 kg. It was also found that with such high heat flux absorption capability, the proposed conceptual design could have a vapor-to-condenser temperature difference of less than 10°C with a volume storage density of 97 MJ/m3 and a mass storage density of 0.122 MJ/kg. The effectiveness of this heat sink depends on the rapidness of the heat storage facility in the design during the pulse heat generation period of the duty cycle. Heat storage in this heat sink involves transient simultaneous laminar film condensation of vapor and melting of an encapsulated phase change material in graphite foam. Therefore, this conjugate heat transfer problem, including the wall inertia effect, is numerically analyzed, and the effectiveness of the heat storage mechanism of the heat sink is verified. An effective heat capacity formulation is employed for modeling the phase change problem and is solved using the finite element method. The results of the developed model showed that the concept is effective in preventing undue temperature rise of the heat source. Experiments are performed to investigate the fabrication and implementation feasibility and heat transfer performance for validating the objectives of the design, i.e., to show that the vapor chamber-thermal energy storage (VCTES) heat sink is practicable and that using PCM helps in arresting the vapor temperature rise in the heat sink. For this purpose, a prototype version of the VCTES heat sink is fabricated and tested for thermal performance. The volume foot-print of the vapor chamber is about 6" x 5" x 2.5". A custom fabricated thermal energy storage setup is incorporated inside this vapor chamber.
A heat flux of 40 W/cm2 is applied at the source as a pulse and convection cooling is used on the condenser surface. Experiments are done with and without using PCM in the thermal energy storage setup. It is found that using PCM as a second latent system in the setup helps in lowering the undue temperature rise of the heat sink system. It is also found that the thermal resistance between the vapor chamber and the thermal energy storage setup, the pool boiling resistance at the heat source in the vapor chamber, the condenser resistance during heat discharging were key parameters that affect the thermal performance. Some suggestions for future improvements in the design to ease its implementation and enhance the heat transfer of this novel heat sink are also presented.
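
    The effective heat capacity formulation mentioned above can be illustrated with a much simpler model than the finite element analysis used in the work: a 1D explicit finite-difference slab in which the latent heat is smeared over a narrow melting range. The material values, flux and geometry below are assumed for illustration and are not those of the graphite-foam/PCM composite.

      # Simplified 1D finite-difference sketch of the effective-heat-capacity idea:
      # latent heat is spread over a narrow melting band so a single energy equation
      # can be time-marched. All numbers are illustrative placeholders.
      import numpy as np

      def simulate(latent, q_flux=2.0e4, t_end=60.0):
          """March a 1D explicit finite-difference slab with an effective heat capacity."""
          k, rho, cp = 5.0, 800.0, 2000.0          # W/m-K, kg/m^3, J/kg-K (assumed)
          t_melt, half_range = 330.0, 2.0          # K
          nx, length = 50, 0.005                   # nodes, slab thickness (m)
          dx = length / (nx - 1)
          dt = 0.2 * rho * cp * dx * dx / k        # conservative explicit step
          T = np.full(nx, 300.0)
          for _ in range(int(t_end / dt)):
              c = np.full(nx, cp)
              c[np.abs(T - t_melt) <= half_range] += latent / (2 * half_range)
              lap = np.zeros(nx)
              lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
              T = T + dt * k * lap / (rho * c)
              T[0] += dt * q_flux / (rho * c[0] * dx)   # heated face (pulse flux)
              T[-1] = T[-2]                             # insulated back face
          return T[0]

      print("front-face T without PCM:", round(simulate(latent=0.0), 1), "K")
      print("front-face T with PCM   :", round(simulate(latent=2.0e5), 1), "K")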

  12. Operational Characteristics of an Ultra Compact Combustor

    DTIC Science & Technology

    2014-03-27

    ...to control this temperature profile to the turbine. A thermally non-uniform flow can create problems with power extraction and heat loading within... NOx) in an experimental rig set-up using air jet cross flows in non-reacting and reacting conditions at high pressure. NOx formation has become the... performance. One of the obstacles for implementing a UCC is the ability to control this temperature profile to the turbine.

  13. The Prediction of Transducer Element Performance from In-Air Measurements.

    DTIC Science & Technology

    1982-01-19

    (Report front-matter fragments: "Predicted and Measured Transducer Impedance", "Principle of Operation of Fotonic Sensor", "Experimental Set-up for...") ...inferred from tests of the assembled element, and cannot account for assembly problems such as misalignment and improper glue joints. Thus, the... the results neither predict nor account for the element variability found in actual practice. Our purpose, then, is to derive the lumped-parameter...

  14. Er:Yb phosphate glass laser with nonlinear absorber for phase-sensitive optical time domain reflectometry

    NASA Astrophysics Data System (ADS)

    Zhirnov, A. A.; Pnev, A. B.; Svelto, C.; Norgia, M.; Pesatori, A.; Galzerano, G.; Laporta, P.; Shelestov, D. A.; Karasik, V. E.

    2017-11-01

    A novel laser for phase-sensitive optical time-domain reflectometry (Φ-OTDR) is presented. The advantages of a compact solid-state laser are listed, and current problems are shown. Experiments with a microchip single-optical-element laser, from setup construction to usage in a Φ-OTDR system, are presented. A new laser scheme with a two-photon intracavity absorber is suggested and its advantages are described.

  15. Automatic Emboli Detection System for the Artificial Heart

    NASA Astrophysics Data System (ADS)

    Steifer, T.; Lewandowski, M.; Karwat, P.; Gawlikowski, M.

    In spite of the progress in material engineering and ventricular assist device construction, thromboembolism remains the most crucial problem in mechanical heart supporting systems. Therefore, the ability to monitor the patient's blood for clot formation should be considered an important factor in the development of heart supporting systems. The well-known methods for automatic embolus detection are based on the monitoring of the ultrasound Doppler signal. A working system utilizing ultrasound Doppler is being developed for the purpose of flow estimation and emboli detection in the clinical artificial heart ReligaHeart EXT. The system will be based on the existing dual channel multi-gate Doppler device with RF digital processing. A specially developed clamp-on cannula probe, equipped with 2-4 MHz piezoceramic transducers, enables easy system setup. We present the issues related to the development of automatic emboli detection via Doppler measurements. We consider several algorithms for flow estimation and emboli detection. We discuss their efficiency and confront them with the requirements of our experimental setup. Theoretical considerations are then compared with preliminary experimental findings from a) flow studies with blood-mimicking fluid and b) in-vitro flow studies with animal blood. Finally, we discuss some more methodological issues: we consider several possible approaches to the problem of verification of the accuracy of the detection system.

  16. Miniaturized Sample Preparation and Rapid Detection of Arsenite in Contaminated Soil Using a Smartphone.

    PubMed

    Siddiqui, Mohd Farhan; Kim, Soocheol; Jeon, Hyoil; Kim, Taeho; Joo, Chulmin; Park, Seungkyung

    2018-03-04

    Conventional methods for analyzing heavy metal contamination in soil and water generally require laboratory instruments, complex procedures, skilled personnel and a significant amount of time. With the advancement in computing and multitasking performance, smartphone-based sensors potentially allow the transition of laboratory-based analytical processes to field-applicable, simple methods. In the present work, we demonstrate a novel miniaturized setup for simultaneous sample preparation and smartphone-based optical sensing of arsenic As(III) in contaminated soil. A colorimetric detection protocol utilizing aptamers, gold nanoparticles and NaCl has been optimized and tested on the PDMS chip to obtain high sensitivity, with a limit of detection of 0.71 ppm (in the sample) and a correlation coefficient of 0.98. The performance of the device is further demonstrated through a comparative analysis of arsenic-spiked soil samples against a standard laboratory method, and good agreement, with a correlation coefficient of 0.9917 and an average difference of 0.37 ppm, is experimentally achieved. With the Android application on the device to run the experiment, the whole process from sample preparation to detection is completed within 3 hours without the need for skilled personnel. The approximate cost of the setup is estimated at around 1 USD and its weight at 55 g. Therefore, the presented method offers a simple, rapid, portable and cost-effective means for onsite sensing of arsenic in soil. Combined with the geometric information inside the smartphones, the system will allow monitoring of the contamination status of soils in a nation-wide manner.

  17. Magnetospheric Gamma-Ray Emission in Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Katsoulakos, Grigorios; Rieger, Frank M.

    2018-01-01

    The rapidly variable, very high-energy (VHE) gamma-ray emission from active galactic nuclei (AGNs) has been frequently associated with non-thermal processes occurring in the magnetospheres of their supermassive black holes. The present work aims to explore the adequacy of different gap-type (unscreened electric field) models to account for the observed characteristics. Based on a phenomenological description of the gap potential, we estimate the maximum extractable gap power L_gap for different magnetospheric setups, and study its dependence on the accretion state of the source. L_gap is found in general to be proportional to the Blandford–Znajek jet power L_BZ and a sensitive function of gap size h, i.e., L_gap ∼ L_BZ (h/r_g)^β, where the power index β ≥ 1 is dependent on the respective gap setup. The transparency of the vicinity of the black hole to VHE photons generally requires a radiatively inefficient accretion environment and thereby imposes constraints on possible accretion rates, and correspondingly on L_BZ. Similarly, rapid variability, if observed, may allow one to constrain the gap size h ∼ cΔt. Combining these constraints, we provide a general classification to assess the likelihood that the VHE gamma-ray emission observed from an AGN can be attributed to a magnetospheric origin. When applied to prominent candidate sources, these considerations suggest that the variable (day-scale) VHE activity seen in the radio galaxy M87 could be compatible with a magnetospheric origin, while such an origin appears less likely for the (minute-scale) VHE activity in IC 310.
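
    A quick numerical illustration of the quoted scaling L_gap ∼ L_BZ (h/r_g)^β shows how steeply the extractable gap power falls for gaps much smaller than the gravitational radius; the values of β and h/r_g are illustrative only.

      # Illustration of L_gap/L_BZ = (h/r_g)**beta for a few assumed values.
      for beta in (1.0, 2.0, 3.0):
          row = [f"{h_frac ** beta:.3g}" for h_frac in (0.01, 0.1, 0.5, 1.0)]
          print(f"beta={beta}: L_gap/L_BZ at h/r_g = 0.01, 0.1, 0.5, 1 ->", row)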

  18. The focal plane reception pattern calculation for a paraboloidal antenna with a nearby fence

    NASA Technical Reports Server (NTRS)

    Schmidt, Richard F.; Cheng, Hwai-Soon; Kao, Michael W.

    1987-01-01

    A computer simulation program is described which is used to estimate the effects of a proximate diffraction fence on the performance of paraboloid antennas. The computer program is written in FORTRAN. The physical problem, mathematical formulation and coordinate references are described. The main control structure of the program and the functions of the individual subroutines are discussed. The Job Control Language set-up and program instructions are provided in the user's instructions to help users execute the program. A sample problem with an appropriate output listing is made available as an illustration of the usage of the program.

  19. PIV Measurements in the 14 x 22 Low Speed Tunnel: Recommendations for Future Testing

    NASA Technical Reports Server (NTRS)

    Watson, Ralph D.; Jenkins, Luther N.; Yao, Chung-Sheng; McGinley, Catherine B.; Paschal, Keith B.; Neuhart, Dan H.

    2003-01-01

    During the period from February 4 to March 21, 2003 stereo digital particle imaging velocimetry measurements were made on a generic high lift model, the Trap Wing, as part of the High Lift Flow Physics Experiment. These measurements were the first PIV measurements made in the NASA, Langley Research Center 14 x 22 Foot Low Speed Tunnel, and several problems were encountered and solved in the acquisition of the data. It is the purpose of this paper to document the solutions to these problems and to make recommendations for further improvements to the tunnel/setup in order to facilitate future measurements of this type.

  20. User's manual for three dimensional FDTD version B code for scattering from frequency-dependent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file, a discussion of radar cross section computations, a discussion of some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  1. Experimental realization of a one-way quantum computer algorithm solving Simon's problem.

    PubMed

    Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G

    2014-11-14

    We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem, a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.
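
    For intuition about the classical side of the gap, the toy sketch below treats Simon's problem classically: a black box with f(x) = f(x XOR s) is queried until a collision reveals s, which typically takes on the order of 2^(n/2) queries, whereas the quantum algorithm needs only O(n). It is not a simulation of the photonic cluster-state experiment.

      # Classical brute-force view of Simon's problem: query the black box until a
      # collision f(x1) == f(x2) is found; then s = x1 XOR x2. Illustration only.
      import random

      def make_oracle(n, s):
          """Build a 2-to-1 black box with hidden period s (s != 0)."""
          table, labels = {}, {}
          for x in range(2 ** n):
              key = min(x, x ^ s)
              if key not in labels:
                  labels[key] = len(labels)
              table[x] = labels[key]
          return lambda x: table[x]

      def classical_queries_to_find_s(n, s, seed=0):
          f = make_oracle(n, s)
          rng = random.Random(seed)
          seen = {}
          for q, x in enumerate(rng.sample(range(2 ** n), 2 ** n), start=1):
              y = f(x)
              if y in seen:                      # collision: previous input XOR x = s
                  return q, seen[y] ^ x
              seen[y] = x
          raise AssertionError("unreachable for s != 0")

      queries, found = classical_queries_to_find_s(n=8, s=0b10110101)
      print("queries used:", queries, "recovered s:", bin(found))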

  2. Long-Wave Infrared (LWIR) Molecular Laser-Induced Breakdown Spectroscopy (LIBS) Emissions of Thin Solid Explosive Powder Films Deposited on Aluminum Substrates.

    PubMed

    Yang, Clayton S-C; Jin, Feng; Trivedi, Sudhir B; Brown, Ei E; Hommerich, Uwe; Tripathi, Ashish; Samuels, Alan C

    2017-04-01

    Thin solid films made of high nitro (NO2)/nitrate (NO3) content explosives were deposited on sand-blasted aluminum substrates and then studied using a mercury-cadmium-telluride (MCT) linear array detection system that is capable of rapidly capturing a broad spectrum of atomic and molecular laser-induced breakdown spectroscopy (LIBS) emissions in the long-wave infrared region (LWIR; ∼5.6-10 µm). Despite the similarities of their chemical compositions and structures, thin films of three commonly used explosives (RDX, HMX, and PETN) studied in this work can be rapidly identified in ambient air by their molecular LIBS emission signatures in the LWIR region. A preliminary assessment of the detection limit for a thin film of RDX on aluminum appears to be much lower than 60 µg/cm2. This LWIR LIBS setup is capable of rapidly probing and characterizing samples without the need for elaborate sample preparation and also offers the possibility of simultaneous ultraviolet-visible and LWIR LIBS measurements.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sentis, Manuel Lorenzo; Gable, Carl W.

    There are many applications in science and engineering modeling where an accurate representation of a complex model geometry in the form of a mesh is important. In applications of flow and transport in subsurface porous media, this is manifest in models that must capture complex geologic stratigraphy, structure (faults, folds, erosion, deposition) and infrastructure (tunnels, boreholes, excavations). Model setup, defined as the activities of geometry definition, mesh generation (creation, optimization, modification, refinement, de-refinement, smoothing), and assignment of material properties, initial conditions and boundary conditions, requires specialized software tools to automate and streamline the process. In addition, some model setup tools will provide more utility if they are designed to interface with and meet the needs of a particular flow and transport software suite. A control volume discretization that uses a two-point flux approximation is, for example, most accurate when the underlying control volumes are 2D or 3D Voronoi tessellations. In this paper we present the coupling of LaGriT, a mesh generation and model setup software suite, and TOUGH2 to model subsurface flow problems, and we show an example of how LaGriT can be used as a model setup tool for the generation of a Voronoi mesh for the simulation program TOUGH2. To generate the MESH file for TOUGH2 from the LaGriT output, a standalone module, Lagrit2Tough2, was developed; it is presented here and will be included in a future release of LaGriT. In this paper an alternative method to generate a Voronoi mesh for TOUGH2 with LaGriT is presented; thanks to the modular and command-based structure of LaGriT, this method is well suited to generating meshes for complex models.
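
    As a conceptual illustration of the Voronoi control volumes that a two-point flux approximation favours (not the LaGriT/Lagrit2Tough2 workflow itself), the sketch below tessellates an arbitrary 2D point set with scipy and lists the cell-to-cell connections that a TOUGH2-style MESH file would ultimately need.

      # Conceptual sketch: 2D Voronoi tessellation of generator points. The point set
      # and domain are arbitrary placeholders; this is not the LaGriT workflow.
      import numpy as np
      from scipy.spatial import Voronoi

      rng = np.random.default_rng(3)
      points = rng.uniform(0.0, 100.0, size=(30, 2))    # generator (cell centre) points
      vor = Voronoi(points)

      print("number of Voronoi vertices:", len(vor.vertices))
      # ridge_points pairs index the two generators sharing a face; this connection
      # list (plus face areas and centre distances) is essentially what a TOUGH2-style
      # MESH file needs for its connection (CONNE) block.
      for (i, j), verts in zip(vor.ridge_points[:5], vor.ridge_vertices[:5]):
          centre_distance = np.linalg.norm(points[i] - points[j])
          print(f"cells {i:2d}-{j:2d}: centre distance {centre_distance:6.2f}, "
                f"shared face vertices {verts}")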

  4. Lowering the Barrier for Standards-Compliant and Discoverable Hydrological Data Publication

    NASA Astrophysics Data System (ADS)

    Kadlec, J.

    2013-12-01

    The growing need for sharing and integration of hydrological and climate data across multiple organizations has resulted in the development of distributed, services-based, standards-compliant hydrological data management and data hosting systems. The problem with these systems is complicated set-up and deployment. Many existing systems assume that the data publisher has remote-desktop access to a locally managed server and experience with computer network setup. For corporate websites, shared web hosting services with limited root access provide an inexpensive, dynamic web presence solution using the Linux, Apache, MySQL and PHP (LAMP) software stack. In this paper, we hypothesize that a webhosting service provides an optimal, low-cost solution for hydrological data hosting. We propose a software architecture of a standards-compliant, lightweight and easy-to-deploy hydrological data management system that can be deployed on the majority of existing shared internet webhosting services. The architecture and design is validated by developing Hydroserver Lite: a PHP and MySQL-based hydrological data hosting package that is fully standards-compliant and compatible with the Consortium of Universities for Advancement of Hydrologic Sciences (CUAHSI) hydrologic information system. It is already being used for management of field data collection by students of the McCall Outdoor Science School in Idaho. For testing, the Hydroserver Lite software has been installed on multiple free and low-cost webhosting sites including Godaddy, Bluehost and 000webhost. The number of steps required to set up the server is compared with the number of steps required to set up other standards-compliant hydrologic data hosting systems including THREDDS, IstSOS and MapServer SOS.

  5. Non-linear transfer characteristics of stimulation and recording hardware account for spurious low-frequency artifacts during amplitude modulated transcranial alternating current stimulation (AM-tACS).

    PubMed

    Kasten, Florian H; Negahbani, Ehsan; Fröhlich, Flavio; Herrmann, Christoph S

    2018-05-31

    Amplitude modulated transcranial alternating current stimulation (AM-tACS) has been recently proposed as a possible solution to overcome the pronounced stimulation artifact encountered when recording brain activity during tACS. In theory, AM-tACS does not entail power at its modulating frequency, thus avoiding the problem of spectral overlap between brain signal of interest and stimulation artifact. However, the current study demonstrates how weak non-linear transfer characteristics inherent to stimulation and recording hardware can reintroduce spurious artifacts at the modulation frequency. The input-output transfer functions (TFs) of different stimulation setups were measured. Setups included recordings of signal-generator and stimulator outputs and M/EEG phantom measurements. 6th-degree polynomial regression models were fitted to model the input-output TFs of each setup. The resulting TF models were applied to digitally generated AM-tACS signals to predict the frequency of spurious artifacts in the spectrum. All four setups measured for the study exhibited low-frequency artifacts at the modulation frequency and its harmonics when recording AM-tACS. Fitted TF models showed non-linear contributions significantly different from zero (all p < .05) and successfully predicted the frequency of artifacts observed in AM-signal recordings. Results suggest that even weak non-linearities of stimulation and recording hardware can lead to spurious artifacts at the modulation frequency and its harmonics. These artifacts were substantially larger than alpha-oscillations of a human subject in the MEG. Findings emphasize the need for more linear stimulation devices for AM-tACS and careful analysis procedures, taking into account low-frequency artifacts to avoid confusion with effects of AM-tACS on the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
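
    The mechanism described above can be reproduced in a few lines: an ideal AM waveform has essentially no spectral power at its modulation frequency, but passing it through a weak polynomial non-linearity (here an assumed quadratic-plus-cubic term standing in for the fitted TFs) reintroduces a component there. Frequencies and coefficients are illustrative.

      # Minimal sketch: an AM carrier has no power at the modulation frequency, but a
      # weak polynomial non-linearity (assumed coefficients) reintroduces it.
      import numpy as np

      fs, dur = 10_000.0, 10.0                     # sampling rate (Hz), duration (s)
      t = np.arange(0, dur, 1 / fs)
      f_carrier, f_mod = 220.0, 10.0               # illustrative frequencies
      am = (1 + 0.8 * np.sin(2 * np.pi * f_mod * t)) * np.sin(2 * np.pi * f_carrier * t)

      distorted = am + 0.05 * am ** 2 + 0.02 * am ** 3   # weak non-linearity (assumed)

      def amplitude_at(signal, freq):
          spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
          freqs = np.fft.rfftfreq(len(signal), 1 / fs)
          return spectrum[np.argmin(np.abs(freqs - freq))]

      for name, sig in (("ideal AM", am), ("after non-linear TF", distorted)):
          print(f"{name:20s} amplitude at {f_mod:.0f} Hz:", f"{amplitude_at(sig, f_mod):.2e}")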

  6. Food quality inspection by speckle decorrelation properties of bacteria colonies

    NASA Astrophysics Data System (ADS)

    Bianco, V.; Mandracchia, B.; Nazzaro, F.; Marchesano, V.; Gennari, O.; Paturzo, M.; Grilli, S.; Ferraro, P.

    2017-06-01

    The development of tools for rapid food quality inspection is a highly pursued goal. These could be valuable devices to be used by food producers in factories or by the customers themselves in specific installations at the marketplace. Here we show how speckle patterns in coherent imaging systems can be employed as indicators of the presence of bacteria colonies contaminating food or water. Speckle decorrelation is induced by the self-propelling movement of these organisms when they interact with coherent light. Hence, their presence can be detected using a simple setup in a condition in which the single element cannot be imaged, but the properties of the ensemble can be exploited. Thanks to the small magnification factor we set, our system can inspect a large Field-of-View (FoV). We show the possibility to discriminate between fresh and contaminated food, thus paving the way to rapid food quality testing by consumers at the marketplace.
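
    A minimal sketch of the decorrelation idea: the frame-to-frame correlation of a speckle pattern stays high for a static sample and drops when moving scatterers rearrange the pattern. The frames below are synthetic stand-ins, not data from the described coherent imaging setup.

      # Frame-to-frame correlation as a decorrelation indicator; synthetic frames only.
      import numpy as np

      def frame_correlation(a, b):
          a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      rng = np.random.default_rng(4)
      static = rng.normal(size=(128, 128))
      fresh_sample = [static + 0.02 * rng.normal(size=static.shape) for _ in range(5)]
      contaminated = [0.6 * static + 0.8 * rng.normal(size=static.shape) for _ in range(5)]

      for label, frames in (("fresh", fresh_sample), ("contaminated", contaminated)):
          corr = np.mean([frame_correlation(frames[k], frames[k + 1]) for k in range(4)])
          print(f"{label:12s} mean frame-to-frame correlation: {corr:.3f}")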

  7. Rapid Parameterization Schemes for Aircraft Shape Optimization

    NASA Technical Reports Server (NTRS)

    Li, Wu

    2012-01-01

    A rapid shape parameterization tool called PROTEUS is developed for aircraft shape optimization. This tool can be applied directly to any aircraft geometry that has been defined in PLOT3D format, with the restriction that each aircraft component must be defined by only one data block. PROTEUS has eight types of parameterization schemes: planform, wing surface, twist, body surface, body scaling, body camber line, shifting/scaling, and linear morphing. These parametric schemes can be applied to two types of components: wing-type surfaces (e.g., wing, canard, horizontal tail, vertical tail, and pylon) and body-type surfaces (e.g., fuselage, pod, and nacelle). These schemes permit the easy setup of commonly used shape modification methods, and each customized parametric scheme can be applied to the same type of component for any configuration. This paper explains the mathematics for these parametric schemes and uses two supersonic configurations to demonstrate the application of these schemes.

  8. The value and throughput of rest Thallium-201/stress Technetium -99m sestamibi dual-isotope myocardial SPECT.

    PubMed

    Okudan, Berna; Smitherman, Thomas C

    2004-06-01

    Myocardial perfusion scintigraphy is an established method in cardiology for the diagnosis and evaluation of coronary artery disease (CAD). Thallium-201 and Tc-99m sestamibi myocardial perfusion imaging has been widely accepted as a non-invasive diagnostic procedure for the detection of CAD, risk stratification and myocardial viability assessment. However, standard Tl-201 redistribution and same-day or 2-day rest/stress Tc-99m sestamibi protocols are time-consuming. Hence, the dual isotope rest thallium-201/stress technetium-99m sestamibi gated single-photon emission tomography protocol has gained increasing popularity for these applications. Combining the use of thallium-201 with technetium-99m agents permits optimal image resolution and simultaneous assessment of viability. Dual-isotope imaging may use either a separate or a simultaneous acquisition set-up. The more rapid completion of these studies is appreciated as an advantage by patients, technologists, interpreting and referring physicians, nurses and hospital management. Simultaneous imaging has the potential advantages of precise pixel registration, and artifacts, if present, are identical in both the thallium and sestamibi images and require only one imaging session. A disadvantage is the spillover of activity from the Tc-99m window into the Tl-201 window; fortunately, this problem can be overcome. Separate acquisition dual-isotope imaging also has some disadvantages: differences in defect resolution due to attenuation and scatter between Tl-201 and Tc-99m sestamibi potentially result in interpretation problems. Studies of the cost-effectiveness of dual isotope imaging showed that selective elimination of some of the rest studies may decrease the cost of the nuclear procedures and should be considered in the current health care system.

  9. SGRAPH (SeismoGRAPHer): Seismic waveform analysis and integrated tools in seismology

    NASA Astrophysics Data System (ADS)

    Abdelwahed, Mohamed F.

    2012-03-01

    Although numerous seismological programs are currently available, most of them suffer from the inability to manipulate different data formats and the lack of embedded seismological tools. SeismoGRAPHer, or simply SGRAPH, is a new system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH was intended to be a tool sufficient for performing basic waveform analysis and solving advanced seismological problems. The graphical user interface (GUI) utilities and the Windows functionalities, such as dialog boxes, menus, and toolbars, simplify the user interaction with the data. SGRAPH supports common data formats, such as SAC, SEED, GSE, ASCII, and Nanometrics Y-format, and provides the ability to solve many seismological problems with built-in inversion tools. Loaded traces are maintained, processed, plotted, and saved in SAC, ASCII, or PS (PostScript) file formats. SGRAPH includes Generalized Ray Theory (GRT), genetic algorithm (GA), least-square fitting, auto-picking, fast Fourier transforms (FFT), and many additional tools. This program provides rapid estimation of earthquake source parameters, location, attenuation, and focal mechanisms. Advanced waveform modeling techniques are provided for crustal structure and focal mechanism estimation. SGRAPH has been employed in the Egyptian National Seismic Network (ENSN) as a tool assisting with routine work and data analysis. More than 30 users have been using previous versions of SGRAPH in their research for more than 3 years. The main features of this application are ease of use, speed, small disk space requirements, and the absence of third-party developed components. Because of its architectural structure, SGRAPH can be interfaced with newly developed methods or applications in seismology. A complete setup file, including the SGRAPH package with the online user guide, is available.

  10. A corrosion control manual for rail rapid transit

    NASA Technical Reports Server (NTRS)

    Gilbert, L. O.; Fitzgerald, J. F., II; Menke, J. T.

    1982-01-01

    In 1979, during the planning stage of the Metropolitan Dade County Transit System, the need was expressed for a corrosion control manual oriented to urban rapid transit system use. This manual responds to that need. The objective of the manual is to aid rail rapid transit agencies by providing practical solutions to selected corrosion problems. The scope of the manual encompasses corrosion problems of the facilities of rapid transit systems: structures and tracks, platforms and stations, power and signals, and cars. It also discusses stray electric current corrosion. Both design and maintenance solutions are provided for each problem. Also included are descriptions of the types of corrosion and their causes, descriptions of rapid transit properties, a list of corrosion control committees, and NASA, DOD, and ASTM specifications and design criteria to which reference is made in the manual. A bibliography of papers and excerpts of reports and a glossary of frequently used terms are provided.

  11. An Optical Lever For The Metrology Of Grazing Incidence Optics

    NASA Astrophysics Data System (ADS)

    DeCew, Alan E.; Wagner, Robert W.

    1986-11-01

    Research Optics & Development, Inc. is using a slope tracing profilometer to measure the figure of optical surfaces which cannot be measured conveniently by interferometric means. As a metrological tool, the technique has its greatest advantage as an in-process measurement system. An optician can easily convert from polishing to measurement in less than a minute. This rapid feedback allows figure correction with minimal wasted effort and setup time. The present configuration of the slope scanner provides resolutions to 1 micro-radian. By implementing minor modifications, the resolution could be improved by an order of magnitude.

  12. State dependent optimization of measurement policy

    NASA Astrophysics Data System (ADS)

    Konkarikoski, K.

    2010-07-01

    Measurements are the key to rational decision making. Measurement information generates value when it is applied in decision making. An investment cost and maintenance costs are associated with each component of the measurement system. Clearly, there is, under a given set of scenarios, a measurement setup that is optimal in expected (discounted) utility. This paper deals with how measurement policy optimization is affected by different system states and how this problem can be tackled.
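
    A toy sketch of the optimization being described, under the assumption of a simple additive utility model: choose the subset of candidate sensors whose expected utility over a set of scenarios, minus investment and maintenance cost, is largest. Sensors, probabilities, utilities and costs are invented placeholders, and a real formulation would also discount over time.

      # Enumerate candidate measurement setups and pick the one with the largest
      # net expected utility. All numbers are invented placeholders.
      from itertools import combinations

      sensors = {"flow": 3.0, "temp": 1.5, "level": 2.0}        # cost per sensor
      scenarios = [  # (probability, utility of information per installed sensor)
          (0.6, {"flow": 5.0, "temp": 1.0, "level": 2.5}),      # normal operation
          (0.4, {"flow": 8.0, "temp": 4.0, "level": 1.0}),      # fault-prone state
      ]

      def net_expected_utility(setup):
          gain = sum(p * sum(u[s] for s in setup) for p, u in scenarios)
          return gain - sum(sensors[s] for s in setup)

      candidates = [set(c) for r in range(len(sensors) + 1)
                    for c in combinations(sensors, r)]
      best = max(candidates, key=net_expected_utility)
      print("best setup:", sorted(best), "net expected utility:",
            round(net_expected_utility(best), 2))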

  13. Variational method for integrating radial gradient field

    NASA Astrophysics Data System (ADS)

    Legarda-Saenz, Ricardo; Brito-Loeza, Carlos; Rivera, Mariano; Espinosa-Romero, Arturo

    2014-12-01

    We propose a variational method for integrating information obtained from a circular fringe pattern. The proposed method is a suitable choice for objects with radial symmetry. First, we analyze the information contained in the fringe pattern captured by the experimental setup, and then formulate the problem of recovering the wavefront using techniques from the calculus of variations. The performance of the method is demonstrated by numerical experiments with both synthetic and real data.
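    As a rough sketch of the kind of variational formulation involved (the authors' functional for the radially symmetric case may differ in its weighting and regularization), integrating a measured gradient field g amounts to a least-squares problem over the wavefront w:

    ```latex
    \min_{w}\; E[w] \;=\; \int_{\Omega} \bigl\lVert \nabla w(\mathbf{r}) - \mathbf{g}(\mathbf{r}) \bigr\rVert^{2}\, d\Omega \;+\; \lambda\, R[w],
    \qquad \text{whose Euler--Lagrange equation for } \lambda = 0 \text{ is the Poisson equation } \Delta w = \nabla \cdot \mathbf{g}.
    ```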

  14. Efficacy and workload analysis of a fixed vertical couch position technique and a fixed‐action–level protocol in whole‐breast radiotherapy

    PubMed Central

    Verhoeven, Karolien; Weltens, Caroline; Van den Heuvel, Frank

    2015-01-01

    Quantification of the setup errors is vital to define appropriate setup margins preventing geographical misses. The no-action-level (NAL) correction protocol reduces the systematic setup errors and, hence, the setup margins. The manual entry of the setup corrections in the record-and-verify software, however, increases the susceptibility of the NAL protocol to human errors. Moreover, the impact of skin mobility on the anteroposterior patient setup reproducibility in whole-breast radiotherapy (WBRT) is unknown. In this study, we therefore investigated the potential of fixed vertical couch position-based patient setup in WBRT. The possibility of introducing a threshold for correction of the systematic setup errors was also explored. We measured the anteroposterior, mediolateral, and superior-inferior setup errors during fractions 1-12 and weekly thereafter with tangential angled single modality paired imaging. These setup data were used to simulate the residual setup errors of the NAL protocol, the fixed vertical couch position protocol, and the fixed-action-level protocol with different correction thresholds. Population statistics of the setup errors of 20 breast cancer patients and 20 breast cancer patients with additional regional lymph node (LN) irradiation were calculated to determine the setup margins of each off-line correction protocol. Our data showed the potential of the fixed vertical couch position protocol to restrict the systematic and random anteroposterior residual setup errors to 1.8 mm and 2.2 mm, respectively. Compared to the NAL protocol, a correction threshold of 2.5 mm reduced the frequency of mediolateral and superior-inferior setup corrections by 40% and 63%, respectively. The implementation of the correction threshold did not deteriorate the accuracy of the off-line setup correction compared to the NAL protocol. The combination of the fixed vertical couch position protocol, for correction of the anteroposterior setup error, and the fixed-action-level protocol with a 2.5 mm correction threshold, for correction of the mediolateral and superior-inferior setup errors, was shown to provide adequate and comparable patient setup accuracy in WBRT and WBRT with additional LN irradiation. PACS numbers: 87.53.Kn, 87.57.-s
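    As a schematic illustration of the off-line logic being compared (the 2.5 mm threshold follows the abstract, but the code is only a toy reconstruction with made-up numbers, not the authors' software), the NAL and fixed-action-level corrections can be simulated on a series of measured setup errors:

    ```python
    # Schematic sketch of off-line setup-correction protocols (not the authors' software).
    # NAL: correct all later fractions by the mean error of the first N measured fractions.
    # Fixed action level: apply that correction only if it exceeds a threshold (e.g. 2.5 mm).
    import numpy as np

    def nal_residuals(errors, n_first=5):
        correction = errors[:n_first].mean()
        residual = errors.copy()
        residual[n_first:] -= correction           # later fractions corrected by the mean
        return residual

    def fixed_action_level_residuals(errors, n_first=5, threshold=2.5):
        correction = errors[:n_first].mean()
        residual = errors.copy()
        if abs(correction) > threshold:            # correct only above the action level
            residual[n_first:] -= correction
        return residual

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # hypothetical daily setup errors in mm: 3 mm systematic + 2 mm (1 SD) random
        daily = 3.0 + 2.0 * rng.standard_normal(20)
        print("NAL residual mean:", nal_residuals(daily)[5:].mean())
        print("FAL residual mean:", fixed_action_level_residuals(daily)[5:].mean())
    ```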

  15. Cone beam CT-based set-up strategies with and without rotational correction for stereotactic body radiation therapy in the liver.

    PubMed

    Bertholet, Jenny; Worm, Esben; Høyer, Morten; Poulsen, Per

    2017-06-01

    Accurate patient positioning is crucial in stereotactic body radiation therapy (SBRT) due to a high dose regimen. Cone-beam computed tomography (CBCT) is often used for patient positioning based on radio-opaque markers. We compared six CBCT-based set-up strategies with or without rotational correction. Twenty-nine patients with three implanted markers received 3-6 fraction liver SBRT. The markers were delineated on the mid-ventilation phase of a 4D-planning-CT. One pretreatment CBCT was acquired per fraction. Set-up strategy 1 used only translational correction based on manual marker match between the CBCT and planning CT. Set-up strategy 2 used automatic 6 degrees-of-freedom registration of the vertebrae closest to the target. The 3D marker trajectories were also extracted from the projections and the mean position of each marker was calculated and used for set-up strategies 3-6. Translational correction only was used for strategy 3. Translational and rotational corrections were used for strategies 4-6 with the rotation being either vertebrae based (strategy 4), or marker based and constrained to ±3° (strategy 5) or unconstrained (strategy 6). The resulting set-up error was calculated as the 3D root-mean-square set-up error of the three markers. The set-up error of the spinal cord was calculated for all strategies. The bony anatomy set-up (2) had the largest set-up error (5.8 mm). The marker-based set-up with unconstrained rotations (6) had the smallest set-up error (0.8 mm) but the largest spinal cord set-up error (12.1 mm). The marker-based set-up with translational correction only (3) or with bony anatomy rotational correction (4) had equivalent set-up error (1.3 mm) but rotational correction reduced the spinal cord set-up error from 4.1 mm to 3.5 mm. Marker-based set-up was substantially better than bony-anatomy set-up. Rotational correction may improve the set-up, but further investigations are required to determine the optimal correction strategy.
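    The end point used to compare the strategies, the 3D root-mean-square set-up error over the implanted markers, is simple to compute; a minimal sketch with hypothetical residual displacement vectors (in mm) is:

    ```python
    # Sketch: 3D root-mean-square set-up error over implanted markers (illustrative only).
    import numpy as np

    def rms_setup_error(residuals_mm):
        """residuals_mm: (n_markers, 3) residual displacement vectors after set-up."""
        distances = np.linalg.norm(residuals_mm, axis=1)   # 3D error of each marker
        return np.sqrt(np.mean(distances ** 2))

    # hypothetical residuals of three markers after a translational-only correction
    residuals = np.array([[0.8, -0.5, 1.1],
                          [1.2,  0.3, -0.7],
                          [-0.4, 0.9,  0.6]])
    print("3D RMS set-up error: %.2f mm" % rms_setup_error(residuals))
    ```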

  16. Single product lot-sizing on unrelated parallel machines with non-decreasing processing times

    NASA Astrophysics Data System (ADS)

    Eremeev, A.; Kovalyov, M.; Kuznetsov, P.

    2018-01-01

    We consider a problem in which at least a given quantity of a single product has to be partitioned into lots, and lots have to be assigned to unrelated parallel machines for processing. In one version of the problem, the maximum machine completion time should be minimized, in another version of the problem, the sum of machine completion times is to be minimized. Machine-dependent lower and upper bounds on the lot size are given. The product is either assumed to be continuously divisible or discrete. The processing time of each machine is defined by an increasing function of the lot volume, given as an oracle. Setup times and costs are assumed to be negligibly small, and therefore, they are not considered. We derive optimal polynomial time algorithms for several special cases of the problem. An NP-hard case is shown to admit a fully polynomial time approximation scheme. An application of the problem in energy efficient processors scheduling is considered.
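    One plausible way to see why the continuously divisible makespan version is tractable (a sketch under the paper's oracle model, not necessarily the authors' algorithm) is to bisect on the makespan T: a value T is feasible if the machines, each loaded with the largest lot it can finish within T while respecting its lot-size bounds, jointly cover the required quantity.

    ```python
    # Sketch: bisection on the makespan for single-product lot-sizing on unrelated machines.
    # Assumes a continuously divisible product and increasing processing-time oracles p_i(v).
    # Hypothetical instance; this is an illustrative approach, not the paper's algorithm.
    from typing import Callable, List, Tuple

    def max_lot_within(T: float, p: Callable[[float], float],
                       lo: float, hi: float, tol: float = 1e-9) -> float:
        """Largest lot volume in [lo, hi] a machine can finish within time T (0 if none)."""
        if p(lo) > T:
            return 0.0                       # even the minimum lot does not fit
        if p(hi) <= T:
            return hi
        a, b = lo, hi                        # invert the increasing oracle by bisection
        while b - a > tol:
            m = 0.5 * (a + b)
            a, b = (m, b) if p(m) <= T else (a, m)
        return a

    def feasible(T: float, machines: List[Tuple[Callable[[float], float], float, float]],
                 demand: float) -> bool:
        return sum(max_lot_within(T, p, lo, hi) for p, lo, hi in machines) >= demand

    def min_makespan(machines, demand, T_hi=1e6, tol=1e-6) -> float:
        T_lo = 0.0
        while T_hi - T_lo > tol:
            T = 0.5 * (T_lo + T_hi)
            T_lo, T_hi = (T_lo, T) if feasible(T, machines, demand) else (T, T_hi)
        return T_hi

    if __name__ == "__main__":
        machines = [(lambda v: 2.0 * v, 0.0, 10.0),          # linear machine
                    (lambda v: v + 0.1 * v * v, 0.0, 10.0)]  # convex machine
        print("minimum makespan ~", round(min_makespan(machines, demand=8.0), 3))
    ```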

  17. Prediction-Correction Algorithms for Time-Varying Constrained Optimization

    DOE PAGES

    Simonetto, Andrea; Dall'Anese, Emiliano

    2017-07-26

    This article develops online algorithms to track solutions of time-varying constrained optimization problems. Particularly, resembling workhorse Kalman filtering-based approaches for dynamical systems, the proposed methods involve prediction-correction steps to provably track the trajectory of the optimal solutions of time-varying convex problems. The merits of existing prediction-correction methods have been shown for unconstrained problems and for setups where computing the inverse of the Hessian of the cost function is computationally affordable. This paper addresses the limitations of existing methods by tackling constrained problems and by designing first-order prediction steps that rely on the Hessian of the cost function (and do not require the computation of its inverse). In addition, the proposed methods are shown to improve the convergence speed of existing prediction-correction methods when applied to unconstrained problems. Numerical simulations corroborate the analytical results and showcase performance and benefits of the proposed algorithms. A realistic application of the proposed method to real-time control of energy resources is presented.
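    The prediction-correction idea can be illustrated on a toy time-varying problem (a minimal sketch; the paper's constrained, first-order prediction machinery is more elaborate): between sampling times, a prediction step extrapolates the iterate along the drift of the problem, and a correction step then runs a few projected-gradient iterations on the newly sampled cost.

    ```python
    # Toy prediction-correction tracking sketch (illustrative; not the paper's algorithm).
    # Problem: minimize 0.5*||x - a(t)||^2 over the box [0, 1]^n, with a(t) drifting in time.
    import numpy as np

    def a_of_t(t, n=3):
        return 0.5 + 0.6 * np.sin(t + np.arange(n))   # time-varying target (may leave box)

    def project_box(x):
        return np.clip(x, 0.0, 1.0)

    def track(h=0.1, steps=100, n=3, gamma=0.5, n_corr=3):
        x = np.zeros(n)
        errs = []
        for k in range(steps):
            t_now, t_next = k * h, (k + 1) * h
            # prediction: shift by the drift of the unconstrained minimizer, then project
            x = project_box(x + (a_of_t(t_next, n) - a_of_t(t_now, n)))
            # correction: a few projected-gradient steps on the newly sampled cost
            for _ in range(n_corr):
                grad = x - a_of_t(t_next, n)
                x = project_box(x - gamma * grad)
            errs.append(np.linalg.norm(x - project_box(a_of_t(t_next, n))))
        return np.mean(errs)

    if __name__ == "__main__":
        print("mean tracking error:", round(track(), 4))
    ```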

  19. X-band EPR setup with THz light excitation of Novosibirsk Free Electron Laser: Goals, means, useful extras.

    PubMed

    Veber, Sergey L; Tumanov, Sergey V; Fursova, Elena Yu; Shevchenko, Oleg A; Getmanov, Yaroslav V; Scheglov, Mikhail A; Kubarev, Vitaly V; Shevchenko, Daria A; Gorbachev, Iaroslav I; Salikova, Tatiana V; Kulipanov, Gennady N; Ovcharenko, Victor I; Fedin, Matvey V

    2018-03-01

    Electron Paramagnetic Resonance (EPR) station at the Novosibirsk Free Electron Laser (NovoFEL) user facility is described. It is based on X-band (∼9 GHz) EPR spectrometer and operates in both Continuous Wave (CW) and Time-Resolved (TR) modes, each allowing detection of either direct or indirect influence of high-power NovoFEL light (THz and mid-IR) on the spin system under study. The optics components including two parabolic mirrors, shutters, optical chopper and multimodal waveguide allow the light of NovoFEL to be directly fed into the EPR resonator. Characteristics of the NovoFEL radiation, the transmission and polarization-retaining properties of the waveguide used in EPR experiments are presented. The types of proposed experiments accessible using this setup are sketched. In most practical cases the high-power radiation applied to the sample induces its rapid temperature increase (T-jump), which is best visible in TR mode. Although such influence is a by-product of THz radiation, this thermal effect is controllable and can deliberately be used to induce and measure transient signals of arbitrary samples. The advantage of tunable THz radiation is the absence of photo-induced processes in the sample and its high penetration ability, allowing fast heating of a large portion of virtually any sample and inducing intense transients. Such T-jump TR EPR spectroscopy with THz pulses has been previewed for the two test samples, being a useful supplement for the main goals of the created setup. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Modeling flow and solute transport at a tile drain field site by explicit representation of preferential flow structures: Equifinality and uncertainty

    NASA Astrophysics Data System (ADS)

    Zehe, E.; Klaus, J.

    2011-12-01

    Rapid flow in connected preferential flow paths is crucial for fast transport of water and solutes through soils, especially at tile-drained field sites. The present study tests whether an explicit treatment of worm burrows is feasible for modeling water flow, bromide and pesticide transport in structured heterogeneous soils with a 2-dimensional Richards-based model. The essence is to represent worm burrows as morphologically connected paths of low flow resistance and low retention capacity in the spatially highly resolved model domain. The underlying extensive database to test this approach was collected during an irrigation experiment, which investigated transport of bromide and the herbicide Isoproturon at a 900 m² tile-drained field site. In a first step we investigated whether the inherent uncertainty in key data causes equifinality, i.e., whether there are several spatial model setups that reproduce tile drain event discharge in an acceptable manner. We found a considerable equifinality in the spatial setup of the model, when key parameters such as the area density of worm burrows and the maximum volumetric water flows inside these macropores were varied within the ranges of either our measurement errors or measurements reported in the literature. Thirteen model runs yielded a Nash-Sutcliffe coefficient of more than 0.9. Also, the flow volumes were in good accordance and peak timing errors were less than or equal to 20 min. In the second step we thus investigated whether this "equifinality" in spatial model setups could be reduced by including the bromide tracer data in the model falsification process. We simulated transport of bromide for the 13 spatial model setups, which performed best with respect to reproducing tile drain event discharge, without any further calibration. Four of these 13 model setups allowed bromide transport to be modeled within fixed limits of acceptability. Parameter uncertainty and equifinality could thus be reduced. Thirdly, we selected one of those four setups for simulating transport of Isoproturon, which was applied the day before the irrigation experiment, and tested different parameter combinations to characterise adsorption according to the footprint database. Simulations could, however, only reproduce the observed event-based leaching behaviour when we allowed for retardation coefficients very close to one. This finding is consistent with various field observations. We conclude: a) A realistic representation of dominating structures and their topology is of key importance for predicting preferential water and mass flows at tile-drained hillslopes. b) Parameter uncertainty and equifinality could be reduced, but a system-inherent equifinality in a 2-dimensional Richards-based model has to be accepted.
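    For reference, the acceptance measure quoted above, the Nash-Sutcliffe efficiency of simulated against observed tile drain discharge, has the standard form:

    ```latex
    NSE \;=\; 1 \;-\; \frac{\sum_{i=1}^{N}\bigl(Q_i^{\mathrm{obs}} - Q_i^{\mathrm{sim}}\bigr)^{2}}{\sum_{i=1}^{N}\bigl(Q_i^{\mathrm{obs}} - \overline{Q^{\mathrm{obs}}}\bigr)^{2}} .
    ```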

  1. Eliminating Size-Associated Diffusion Constraints for Rapid On-Surface Bioassays with Nanoparticle Probes.

    PubMed

    Li, Junwei; Zrazhevskiy, Pavel; Gao, Xiaohu

    2016-02-24

    Nanoparticle probes enable implementation of advanced on-surface assay formats, but impose often underappreciated size-associated constraints, in particular on assay kinetics and sensitivity. The present study highlights substantially slower diffusion-limited assay kinetics due to the rapid development of a nanoprobe depletion layer next to the surface, which static incubation and mixing of bulk solution employed in conventional assay setups often fail to disrupt. In contrast, cyclic solution draining and replenishing yields reaction-limited assay kinetics irrespective of the probe size. Using common surface bioassays, enzyme-linked immunosorbent assays and immunofluorescence, this study shows that this conceptually distinct approach effectively "erases" size-dependent diffusion constraints, providing a straightforward route to rapid on-surface bioassays employing bulky probes and procedures involving multiple labeling cycles, such as multicycle single-cell molecular profiling. For proof-of-concept, the study demonstrates that the assay time can be shortened from hours to minutes with the same probe concentration and, at a typical incubation time, comparable target labeling can be achieved with up to eight times lower nanoprobe concentration. The findings are expected to enable realization of novel assay formats and stimulate development of rapid on-surface bioassays with nanoparticle probes. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Evaluation of cross-polarized near infrared hyperspectral imaging for early detection of dental caries

    NASA Astrophysics Data System (ADS)

    Usenik, Peter; Bürmen, Miran; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan

    2012-01-01

    Despite major improvements in dental healthcare and oral hygiene, dental caries remains one of the most prevalent oral diseases and represents the primary cause of oral pain and tooth loss. The initial stages of dental caries are characterized by demineralization of enamel crystals and are difficult to diagnose. Near infrared (NIR) hyperspectral imaging is a new promising technique for detection of early changes in the surfaces of carious teeth. This noninvasive imaging technique can characterize and differentiate between the sound tooth surface and initial or advanced tooth caries. The absorbing and scattering properties of dental tissues are reflected in distinct spectral features, which can be measured, quantified and used to accurately classify and map different dental tissues. Specular reflections from the tooth surface, which appear as bright spots, mostly located around the edges and the crests of the teeth, act as a noise factor which can significantly interfere with the spectral measurements and analysis of the acquired images, degrading the accuracy of the classification and diagnosis. Employing a cross-polarized imaging setup can solve this problem; however, this has yet to be systematically evaluated, especially in broadband hyperspectral imaging setups. In this paper, we employ a cross-polarized illumination setup utilizing state-of-the-art high-contrast broadband wire-grid polarizers in the spectral range from 900 nm to 1700 nm for hyperspectral imaging of natural and artificial carious lesions of various degrees.

  3. Training of Ability for Engineering Design through Long Term Internship Program

    NASA Astrophysics Data System (ADS)

    Konishi, Masami; Gofuku, Akio; Tomita, Eiji

    The education program for engineering design capabilities through long-term internship at Okayama University started in 2006. The program, supported by MEXT, is aimed at educating students in the Graduate School of Natural Science and Technology of Okayama University. The internship satellite laboratory of the University is located near the collaborating companies, where students work on project themes extracted from problems in the companies' factories. Through the program, students develop the ability to set up and solve a problem while considering cost, due date, and the performance of the solution. Students are also expected to gain knowledge of patents and ethics required for skillful engineers.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanabe, S; Utsunomiya, S; Abe, E

    Purpose: To assess the accuracy of fiducial marker-based setup using ExacTrac (ExT-based setup) as compared with soft tissue-based setup using cone-beam CT (CBCT-based setup) for patients with prostate cancer receiving intensity-modulated radiation therapy (IMRT), for the purpose of investigating whether ExT-based setup can be an alternative to CBCT-based setup. Methods: The setup accuracy was analyzed prospectively for 7 prostate cancer patients with three implanted fiducial markers who received IMRT. All patients were treated after CBCT-based setup was performed and the corresponding shifts were recorded. ExacTrac images were obtained before and after CBCT-based setup. The fiducial marker-based shifts were calculated based on those two images and recorded on the assumption that the setup correction was carried out by fiducial marker-based auto correction. The mean and standard deviation of the absolute differences and the correlation between CBCT and ExT shifts were estimated. Results: A total of 178 image datasets were analyzed. Regarding the differences between CBCT and ExT shifts, 133 (75%) of the 178 image datasets showed differences smaller than 3 mm in all dimensions. Mean differences in the anterior-posterior (AP), superior-inferior (SI), and left-right (LR) dimensions were 1.8 ± 1.9 mm, 0.7 ± 1.9 mm, and 0.6 ± 0.8 mm, respectively. The percentages of shift agreements within ±3 mm were 76% for AP, 90% for SI, and 100% for LR. The Pearson correlation coefficients between CBCT and ExT shifts were 0.80 for AP, 0.80 for SI, and 0.65 for LR. Conclusion: This work showed that the accuracy of ExT-based setup was correlated with that of CBCT-based setup, implying that ExT-based setup has the potential to be an alternative to CBCT-based setup. Further work is needed to specify the conditions under which ExT-based setup can provide accuracy comparable to CBCT-based setup.
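    The agreement statistics reported above are straightforward to reproduce on paired shift data; a minimal sketch with made-up shift values (mm) is:

    ```python
    # Sketch: agreement statistics between CBCT-based and ExacTrac-based shifts (toy data).
    import numpy as np

    cbct = np.array([[1.2, -0.4, 0.8], [2.1, 0.3, -0.5], [-0.7, 1.0, 0.2]])  # mm, AP/SI/LR
    ext  = np.array([[0.9, -0.1, 0.6], [1.6, 0.8, -0.3], [-0.2, 0.7, 0.4]])  # mm, AP/SI/LR

    diff = np.abs(cbct - ext)
    for i, axis in enumerate(["AP", "SI", "LR"]):
        r = np.corrcoef(cbct[:, i], ext[:, i])[0, 1]       # Pearson correlation per axis
        print(f"{axis}: mean |diff| = {diff[:, i].mean():.2f} mm, r = {r:.2f}")
    print("within +/-3 mm:", np.mean(diff <= 3.0) * 100, "%")
    ```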

  5. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied version of the codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
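    As a self-contained illustration of the kind of update loop a 2D FDTD code executes (a generic free-space TM example in normalized units with a hard source and perfectly conducting walls, not the Penn State TEA/TMA code):

    ```python
    # Minimal 2D FDTD (TMz, normalized units) sketch -- generic illustration only,
    # not the Penn State TEA/TMA code. PEC walls, sinusoidal hard source at the center.
    import numpy as np

    nx, ny, nt = 100, 100, 200
    S = 0.5                                  # Courant number (stable in 2D for S <= 1/sqrt(2))
    Ez = np.zeros((nx, ny))
    Hx = np.zeros((nx, ny - 1))
    Hy = np.zeros((nx - 1, ny))

    for n in range(nt):
        # update magnetic fields from the curl of Ez
        Hx -= S * (Ez[:, 1:] - Ez[:, :-1])
        Hy += S * (Ez[1:, :] - Ez[:-1, :])
        # update Ez in the interior from the curl of H
        Ez[1:-1, 1:-1] += S * ((Hy[1:, 1:-1] - Hy[:-1, 1:-1])
                               - (Hx[1:-1, 1:] - Hx[1:-1, :-1]))
        Ez[nx // 2, ny // 2] = np.sin(2 * np.pi * n / 30.0)   # hard source

    print("peak |Ez| after %d steps: %.3f" % (nt, np.abs(Ez).max()))
    ```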

  6. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  7. User's manual for three dimensional FDTD version D code for scattering from frequency-dependent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version D is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version D code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMOND.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  8. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  9. User's manual for three dimensional FDTD version B code for scattering from frequency-dependent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONB.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  10. Express railway disaster in Amagasaki: a review of urban disaster response capacity in Japan.

    PubMed

    Nagata, Takashi; Rosborough, Stephanie N; VanRooyen, Michael J; Kozawa, Shuichi; Ukai, Takashi; Nakayama, Shinichi

    2006-01-01

    On the morning of 25 April 2005, a Japan Railway express train derailed in an urban area of Amagasaki, Japan. The crash was Japan's worst rail disaster in 40 years. This study chronicles the rescue efforts and highlights the capacity of Japan's urban disaster response. Public reports were gathered from the media, Internet, government, fire department, and railway company. Four key informants, who were close to the disaster response, were interviewed to corroborate public data and highlight challenges facing the response. The crash left 107 passengers dead and 549 injured. First responders, most of whom were volunteers, were helpful in the rescue effort, and no lives were lost due to transport delays or faulty triage. Responders criticized an early decision to withdraw rescue efforts, a delay in heliport set-up, the inefficiency of the information and instruction center, and emphasized the need for training in confined space medicine. Communication and chain-of-command problems created confusion at the scene. The urban disaster response to the train crash in Amagasaki was rapid and effective. The Kobe Earthquake and other incidents sparked changes that improved disaster preparedness in Amagasaki. However, communication and cooperation among responders were hampered, as in previous disasters, by the lack of a structured command system. Application of an incident command system may improve disaster coordination in Japan.

  11. Second harmonic generation in a molecular magnetic chain

    NASA Astrophysics Data System (ADS)

    Cavigli, L.; Sessoli, R.; Gurioli, M.; Bogani, L.

    2006-05-01

    A setup for the determination of all the components of the second harmonic generation tensor in molecular materials is presented. It allows overcoming depletion problems, which one can expect to be common in molecular systems. A preliminary characterization of the nonlinear properties of the single chain magnet CoPhOMe is carried out. We observe a high second harmonic signal, comparable to that of urea, and show that the bulk contributions are dominant over the surface ones.

  12. LES of flow in the street canyon

    NASA Astrophysics Data System (ADS)

    Fuka, Vladimír; Brechler, Josef

    2012-04-01

    Results of a computer simulation of flow over a series of street canyons are presented in this paper. The setup is adapted from an experimental study by [4] with two different shapes of buildings. The problem is simulated by an LES model, CLMM (Charles University Large Eddy Microscale Model), and the results are analysed using proper orthogonal decomposition and spectral analysis. The results in the channel (the layout from the experiment) are compared with results obtained with a free top boundary.
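    Computationally, the proper orthogonal decomposition applied to the LES fields reduces to a singular value decomposition of the snapshot matrix; a minimal sketch of the snapshot method (with random data standing in for velocity snapshots) is:

    ```python
    # Sketch: snapshot POD of a velocity field via SVD (toy data in place of LES output).
    import numpy as np

    rng = np.random.default_rng(1)
    n_points, n_snapshots = 5000, 200
    U = rng.standard_normal((n_points, n_snapshots))     # columns = velocity snapshots
    U -= U.mean(axis=1, keepdims=True)                   # subtract the temporal mean field

    # thin SVD: columns of phi are spatial POD modes, s**2 ~ modal (kinetic) energy
    phi, s, vt = np.linalg.svd(U, full_matrices=False)
    energy = s**2 / np.sum(s**2)
    print("energy captured by the first 5 modes: %.1f %%" % (100 * energy[:5].sum()))
    ```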

  13. Design of SIP transformation server for efficient media negotiation

    NASA Astrophysics Data System (ADS)

    Pack, Sangheon; Paik, Eun Kyoung; Choi, Yanghee

    2001-07-01

    Voice over IP (VoIP) is one of the advanced services supported by the next generation of mobile communication. VoIP should support various media formats and terminals existing together. This heterogeneous environment may prevent diverse users from establishing VoIP sessions among them. To solve the problem, an efficient media negotiation mechanism is required. In this paper, we propose an efficient media negotiation architecture using the transformation server and the Intelligent Location Server (ILS). The transformation server is an extended Session Initiation Protocol (SIP) proxy server. It can modify an unacceptable session INVITE message into an acceptable one using the ILS. The ILS is a directory server based on the Lightweight Directory Access Protocol (LDAP) that keeps the user's location information and available media information. The proposed architecture can eliminate the unnecessary response and re-INVITE messages of the standard SIP architecture. It takes only 1.5 round trip times to negotiate two different media types, while the standard media negotiation mechanism takes 2.5 round trip times. The extra processing time in message handling is negligible in comparison to the reduced round trip time. The experimental results show that the session setup time in the proposed architecture is less than the setup time in the standard SIP. These results verify that the proposed media negotiation mechanism is more efficient in solving diversity problems.

  14. Development of a standardized and safe airborne antibacterial assay, and its evaluation on antibacterial biomimetic model surfaces.

    PubMed

    Al-Ahmad, Ali; Zou, Peng; Solarte, Diana Lorena Guevara; Hellwig, Elmar; Steinberg, Thorsten; Lienkamp, Karen

    2014-01-01

    Bacterial infection of biomaterials is a major concern in medicine, and different kinds of antimicrobial biomaterial have been developed to deal with this problem. To test the antimicrobial performance of these biomaterials, the airborne bacterial assay is used, which involves the formation of biohazardous bacterial aerosols. We here describe a new experimental set-up which allows safe handling of such pathogenic aerosols, and standardizes critical parameters of this otherwise intractable and strongly user-dependent assay. With this new method, reproducible, thorough antimicrobial data (number of colony forming units and live-dead-stain) was obtained. Poly(oxonorbornene)-based Synthetic Mimics of Antimicrobial Peptides (SMAMPs) were used as antimicrobial test samples. The assay was able to differentiate even between subtle sample differences, such as different sample thicknesses. With this new set-up, the airborne bacterial assay was thus established as a useful, reliable, and realistic experimental method to simulate the contamination of biomaterials with bacteria, for example in an intraoperative setting.

  15. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows for software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
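    To make the idea concrete, the sketch below writes a deliberately simplified compose file (the service and image names are hypothetical placeholders, not those of the drug target screening platform) and brings the stack up; it only assumes Docker with the Compose plugin is installed.

    ```python
    # Sketch: deploying a small multi-container stack with Docker Compose from Python.
    # Service and image names are hypothetical placeholders, not the platform described above.
    import pathlib
    import subprocess

    COMPOSE = """\
    version: "3"
    services:
      webapp:
        image: example/webapp:latest        # hypothetical web front end
        ports:
          - "8080:80"
        depends_on:
          - database
      database:
        image: postgres:15                  # shared infrastructure container
        environment:
          POSTGRES_PASSWORD: example
        volumes:
          - dbdata:/var/lib/postgresql/data
    volumes:
      dbdata:
    """

    # write the compose file and bring up the whole stack with a single command
    pathlib.Path("docker-compose.yml").write_text(COMPOSE)
    subprocess.run(["docker", "compose", "up", "-d"], check=True)
    ```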

  16. A Corrosion Control Manual for Rail Rapid Transit

    NASA Technical Reports Server (NTRS)

    Gilbert, L. O.; Fitzgerald, J. H., III; Menke, J. T.; Lizak, R. M. (Editor)

    1982-01-01

    This manual addresses corrosion problems in the design, construction, and maintenance of rapid transit systems. Design and maintenance solutions are provided for each problem covered. The scope encompasses all facilities of urban rapid transit systems: structures and tracks, platforms and stations, power and signals, and cars. The types of corrosion and their causes as well as rapid transit properties are described. Corrosion control committees and the NASA, DOD, and ASTM specifications and design criteria to which reference is made in the manual are listed. A bibliography of papers and excerpts of reports is provided and a glossary of frequently used terms is included.

  17. Intracavitary moderator balloon combined with (252)Cf brachytherapy and boron neutron capture therapy, improving dosimetry in brain tumour and infiltrations.

    PubMed

    Brandão, S F; Campos, T P R

    2015-07-01

    This article proposes a combination of californium-252 ((252)Cf) brachytherapy, boron neutron capture therapy (BNCT) and an intracavitary moderator balloon catheter applied to brain tumour and infiltrations. Dosimetric evaluations were performed on three protocol set-ups: (252)Cf brachytherapy combined with BNCT (Cf-BNCT); Cf-BNCT with a balloon catheter filled with light water (LWB) and the same set-up with heavy water (HWB). Cf-BNCT-HWB has presented dosimetric advantages to Cf-BNCT-LWB and Cf-BNCT in infiltrations at 2.0-5.0 cm from the balloon surface. However, Cf-BNCT-LWB has shown superior dosimetry up to 2.0 cm from the balloon surface. Cf-BNCT-HWB and Cf-BNCT-LWB protocols provide a selective dose distribution for brain tumour and infiltrations, mainly further from the (252)Cf source, sparing the normal brain tissue. Malignant brain tumours grow rapidly and often spread to adjacent brain tissues, leading to death. Improvements in brain radiation protocols have been continuously achieved; however, brain tumour recurrence is observed in most cases. Cf-BNCT-LWB and Cf-BNCT-HWB represent new modalities for selectively combating brain tumour infiltrations and metastasis.

  18. Automatic real time evaluation of red blood cell elasticity by optical tweezers

    NASA Astrophysics Data System (ADS)

    Moura, Diógenes S.; Silva, Diego C. N.; Williams, Ajoke J.; Bezerra, Marcos A. C.; Fontes, Adriana; de Araujo, Renato E.

    2015-05-01

    Optical tweezers have been used to trap, manipulate, and measure individual cell properties. In this work, we show that the association of a computer-controlled optical tweezers system with image processing techniques allows rapid and reproducible evaluation of cell deformability. In particular, the deformability of red blood cells (RBCs) plays a key role in the transport of oxygen through the blood microcirculation. The automatic measurement process consisted of three steps: acquisition, segmentation of images, and measurement of the elasticity of the cells. An optical tweezers system was set up on an upright microscope equipped with a CCD camera and a motorized XYZ stage, computer controlled by a Labview platform. On the optical tweezers setup, the deformation of the captured RBC was obtained by moving the motorized stage. The automatic real-time homemade system was evaluated by measuring RBC elasticity from normal donors and patients with sickle cell anemia. Approximately 150 erythrocytes were examined, and the elasticity values obtained by using the developed system were compared to the values measured by two experts. With the automatic system, there was a significant time reduction (60×) in the erythrocyte elasticity evaluation. The automated system can help to expand the applications of optical tweezers in hematology and hemotherapy.

  19. A novel APPI-MS setup for in situ degradation product studies of atmospherically relevant compounds: capillary atmospheric pressure photo ionization (cAPPI).

    PubMed

    Kersten, Hendrik; Derpmann, Valerie; Barnes, Ian; Brockmann, Klaus J; O'Brien, Rob; Benter, Thorsten

    2011-11-01

    We report on the development of a novel atmospheric pressure photoionization setup and its applicability for in situ degradation product studies of atmospherically relevant compounds. A custom miniature spark discharge lamp was embedded into an ion transfer capillary, which separates the atmospheric pressure from the low pressure region in the first differential pumping stage of a conventional atmospheric pressure ionization mass spectrometer. The lamp operates with a continuous argon flow and produces intense light emissions in the VUV. The custom lamp is operated windowless and efficiently illuminates the sample flow through the transfer capillary on an area smaller than 1 mm(2). Limits of detection in the lower ppbV range, a temporal resolution of milliseconds in the positive as well as the quasi simultaneously operating negative ion mode, and a significant reduction of ion transformation processes render this system applicable to real time studies of rapidly changing chemical systems. The method termed capillary atmospheric pressure photo ionization (cAPPI) is characterized with respect to the lamp emission properties as a function of the operating conditions, temporal response, and its applicability for in situ degradation product studies of atmospherically relevant compounds, respectively.

  20. R4SA for Controlling Robots

    NASA Technical Reports Server (NTRS)

    Aghazarian, Hrand

    2009-01-01

    The R4SA GUI mentioned in the immediately preceding article is a user-friendly interface for controlling one or more robots. This GUI makes it possible to perform meaningful real-time field experiments and research in robotics at an unmatched level of fidelity, within minutes of setup. It provides such powerful graphing modes as that of a digitizing oscilloscope that displays up to 250 variables at rates between 1 and 200 Hz. This GUI can be configured as multiple intuitive interfaces for acquisition of data, command, and control to enable rapid testing of subsystems or an entire robot system while simultaneously performing analysis of data. The R4SA software establishes an intuitive component-based design environment that can be easily reconfigured for any robotic platform by creating or editing setup configuration files. The R4SA GUI enables event-driven and conditional sequencing similar to those of Mars Exploration Rover (MER) operations. It has been certified as part of the MER ground support equipment and, therefore, is allowed to be utilized in conjunction with MER flight hardware. The R4SA GUI could also be adapted to use in embedded computing systems, other than that of the MER, for commanding and real-time analysis of data.

  1. Design of an experimental apparatus for measurement of the surface tension of metastable fluids

    NASA Astrophysics Data System (ADS)

    Vinš, V.; Hrubý, J.; Hykl, J.; Blaha, J.; Šmíd, B.

    2013-04-01

    A unique experimental apparatus for measurement of the surface tension of aqueous mixtures has been designed, manufactured, and tested in our laboratory. The novelty of the setup is that it allows measurement of surface tension by two different methods: a modified capillary elevation method in a long vertical capillary tube and a method inspired by the approach of Hacker (National Advisory Committee for Aeronautics, Technical Note 2510, 1-20, 1951), i.e. in a short horizontal capillary tube. Functionality of all main components of the apparatus, e.g., glass chamber with the capillary tube, temperature control unit consisting of two thermostatic baths with special valves for rapid temperature jumps, helium distribution setup allowing pressure variation above the liquid meniscus inside the capillary tube, has been successfully tested. Preliminary results for the surface tension of the stable and metastable supercooled water measured by the capillary elevation method at atmospheric pressure are provided. The surface tension of water measured at temperatures between +26 °C and -11 °C is in good agreement with the extrapolated IAPWS correlation (IAPWS Release on Surface Tension of Ordinary Water Substance, September 1994); however it disagrees with data by Hacker.
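    For orientation, the capillary elevation method infers the surface tension from the measured rise height through the standard capillary-rise (Jurin) relation; the authors' working equation may include further corrections, e.g. for the meniscus shape and the helium overpressure:

    ```latex
    \sigma \;=\; \frac{\rho\, g\, h\, r}{2\cos\theta},
    ```

    where ρ is the liquid density, g the gravitational acceleration, h the capillary elevation, r the capillary radius, and θ the contact angle.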

  2. RF magnetized ring-shaped plasma for target utilization obtained with circular magnet monopole arrangement

    NASA Astrophysics Data System (ADS)

    Amzad Hossain, Md.; Ohtsu, Yasunori

    2018-01-01

    We proposed a new setup for generating outer ring-shaped radio frequency (RF) magnetized plasma near the chamber wall using monopole magnet setups. Three monopole magnet setups with (a) R = 5 mm, (b) R = 20 mm, and (c) R = 35 mm were investigated, where R is the gap between the magnets in consecutive circles. The distributions of the two dimensional magnetic flux lines, the absolute value of the horizontal magnetic flux density, and the discharge voltage were investigated for the proposed setups to produce outer ring-shaped plasma. A highly luminous ring-shaped plasma was observed for the setup (a), whereas multi-ring discharges were observed for the setups (b) and (c). It was found that the electron temperature decreases with increasing gas pressure for all cases. The electron temperatures were 2.42, 1.71, and 1.15 eV at an Ar gas pressure of 4 Pa for setups (a), (b), and (c), respectively. The plasma density was approximately the same for setups (b) and (c) at all gas pressures. The highest plasma densities were 6.26 × 1015, 1.06 × 1016, and 1.11 × 1016 m-3 at 5 Pa for setups (a), (b), and (c), respectively. It was found that the electron mean free path was 41.4, 63.17, and 84.66 mm at an Ar gas pressure of 5 Pa for setups (a), (b), and (c), respectively. The electron neutral collision frequency for setup (a) was higher than those for setups (b) and (c) at a constant RF power of 40 W and an axial distance of z = 13 mm from the target surface. The radial profile of the ion saturation current for setup (b) was more uniform than those for setups (a) and (c).

  3. One-fourth of the prisoners are underweight in Northern Ethiopia: a cross-sectional study.

    PubMed

    Abera, Semaw Ferede; Adane, Kelemework

    2017-05-15

    Despite the fact that prisoners are exposed to different health problems, prison health problems are often overlooked by researchers and no previous study has investigated nutritional problems of prisoners in Ethiopia. Cross-sectional data were collected from 809 prisoners from nine major prison setups in the Tigray region of Ethiopia. A proportional stratified sampling technique was used to select the total number of participants needed from each prison site. The outcome of this study was underweight defined as a body mass index (BMI) of less than 18.5 kg/m². Multivariable binary logistic regression was performed to identify determinants of underweight at a p-value of less than 0.05. The prevalence of underweight was 25.2% (95% CI; 22.3%-28.3%). Khat chewing (OR = 2.08; 95% CI = 1.17, 3.70) and longer duration of incarceration (OR = 1.07; 95% CI = 1.01, 1.14) were associated with a significantly increased risk of underweight. Additionally, previous incarceration (OR = 1.54; 95% CI = 0.99, 2.42) was a relevant determinant of underweight with a borderline significance. In contrast, family support (OR = 0.61; 95% CI = 0.43, 0.85) and farmer occupation (OR = 0.59; 95% CI = 0.36, 0.98) compared to those who were unemployed were important protective determinants significantly associated with lower risk of underweight. In summary, the burden of underweight was higher among prisoners in Tigray region who had respiratory tract infections. The study has enhanced our understanding of the determinants of underweight in the prison population. We strongly recommend that nutritional support, such as therapeutic feeding programs for severely or moderately underweight prisoners, and environmental health interventions of the prison setups should be urgently implemented to correct the uncovered nutritional problem and its associated factors for improving the health status of prisoners.

  4. ASD FieldSpec Calibration Setup and Techniques

    NASA Technical Reports Server (NTRS)

    Olive, Dan

    2001-01-01

    This paper describes the Analytical Spectral Devices (ASD) FieldSpec Calibration Setup and Techniques. The topics include: 1) ASD FieldSpec FR Spectroradiometer; 2) Components of Calibration; 3) Equipment List; 4) Spectral Setup; 5) Spectral Calibration; 6) Radiometric and Linearity Setup; 7) Radiometric Setup; 8) Datasets Required; 9) Data Files; and 10) Field of View Measurement. This paper is in viewgraph form.

  5. Infant multiple breath washout using a new commercially available device: Ready to replace the previous setup?

    PubMed

    Kentgens, Anne-Christianne; Guidi, Marisa; Korten, Insa; Kohler, Lena; Binggeli, Severin; Singer, Florian; Latzin, Philipp; Anagnostopoulou, Pinelopi

    2018-05-01

    Multiple breath washout (MBW) is a sensitive test to measure lung volumes and ventilation inhomogeneity from infancy on. The commonly used setup for infant MBW, based on ultrasonic flowmeter, requires extensive signal processing, which may reduce robustness. A new setup may overcome some previous limitations but formal validation is lacking. We assessed the feasibility of infant MBW testing with the new setup and compared functional residual capacity (FRC) values of the old and the new setup in vivo and in vitro. We performed MBW in four healthy infants and four infants with cystic fibrosis, as well as in a Plexiglas lung simulator using realistic lung volumes and breathing patterns, with the new (Exhalyzer D, Spiroware 3.2.0, Ecomedics) and the old setup (Exhalyzer D, WBreath 3.18.0, ndd) in random sequence. The technical feasibility of MBW with the new device-setup was 100%. Intra-subject variability in FRC was low in both setups, but differences in FRC between the setups were considerable (mean relative difference 39.7%, range 18.9; 65.7, P = 0.008). Corrections of software settings decreased FRC differences (14.0%, -6.4; 42.3, P = 0.08). Results were confirmed in vitro. MBW measurements with the new setup were feasible in infants. However, despite attempts to correct software settings, outcomes between setups were not interchangeable. Further work is needed before widespread application of the new setup can be recommended. © 2018 Wiley Periodicals, Inc.

  6. Accurate and fast creep test for viscoelastic fluids using disk-probe-type and quadrupole-arrangement-type electromagnetically spinning systems

    NASA Astrophysics Data System (ADS)

    Hirano, Taichi; Sakai, Keiji

    2017-07-01

    Viscoelasticity is a unique characteristic of soft materials and describes its dynamic response to mechanical stimulations. A creep test is an experimental method for measuring the strain ratio/rate against an applied stress, thereby assessing the viscoelasticity of the materials. We propose two advanced experimental systems suitable for the creep test, adopting our original electromagnetically spinning (EMS) technique. This technique can apply a constant torque by a noncontact mechanism, thereby allowing more sensitive and rapid measurements. The viscosity and elasticity of a semidilute wormlike micellar solution were determined using two setups, and the consistency between the results was assessed.

  7. Exclusive J/Ψ vector-meson production in high-energy nuclear collisions

    NASA Astrophysics Data System (ADS)

    Ramnath, A.; Weigert, H.; Hamilton, A.

    2014-12-01

    We illustrate the first steps in a cross-section determination for exclusive J/Ψ production in ultra-peripheral heavy ion collisions from two viewpoints. First, the setup for a theoretical calculation is done in the context of the Colour Glass Condensate effective field theory, using the Gaussian truncation to parametrise rapidity-averaged n-point correlators. Secondly, a feasibility study is carried out using STARlight Monte Carlo simulations to predict how many exclusive J/Ψ vector mesons might be expected in ATLAS at the LHC. In a data set corresponding to 160 μb⁻¹ of total integrated luminosity, about 150 candidate events are expected.

  8. Two approaches to the rapid screening of crystallization conditions

    NASA Technical Reports Server (NTRS)

    Mcpherson, Alexander

    1992-01-01

    A screening procedure is described for estimating conditions under which crystallization will proceed, thus providing a starting point for more careful experiments. The initial procedure uses the experimental setup of McPherson (1982) which supports 24 individual hanging drop experiments for screening variables such as the precipitant type, the pH, the temperature, and the effects of certain additives and which uses about 1 mg of protein. A second approach is proposed (which is rather hypothetical at this stage and needs a larger sample), based on the isoelectric focusing of protein samples on concentration gradients of common precipitating agents. Using this approach, crystals of concanavalin B and canavalin were obtained.

  9. Computational imaging with a single-pixel detector and a consumer video projector

    NASA Astrophysics Data System (ADS)

    Sych, D.; Aksenov, M.

    2018-02-01

    Single-pixel imaging is a novel, rapidly developing imaging technique that employs spatially structured illumination and a single-pixel detector. In this work, we experimentally demonstrate a fully operating modular single-pixel imaging system. Light patterns in our setup are created with the help of a computer-controlled digital micromirror device from a consumer video projector. We investigate how different working modes and settings of the projector affect the quality of reconstructed images. We develop several image reconstruction algorithms and compare their performance for real imaging. Also, we discuss the potential use of the single-pixel imaging system for quantum applications.
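    A minimal numerical sketch of the single-pixel measurement model (generic, using orthogonal Hadamard patterns; the authors' projector patterns and reconstruction algorithms may differ):

    ```python
    # Sketch: single-pixel imaging with orthogonal (Hadamard) patterns -- generic illustration.
    import numpy as np
    from scipy.linalg import hadamard

    n = 16                                      # image is n x n
    N = n * n
    scene = np.zeros((n, n))
    scene[4:12, 6:10] = 1.0                     # simple synthetic object
    x = scene.ravel()

    H = hadamard(N).astype(float)               # rows = illumination patterns (+1/-1)
    y = H @ x                                   # single-pixel detector readings, one per pattern
    x_rec = (H.T @ y) / N                       # exact inverse because H H^T = N * I

    print("max reconstruction error:", np.abs(x_rec - x).max())
    ```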

  10. Stereoscopic Feature Tracking System for Retrieving Velocity of Surface Waters

    NASA Astrophysics Data System (ADS)

    Zuniga Zamalloa, C. C.; Landry, B. J.

    2017-12-01

    The present work is concerned with the surface velocity retrieval of flows using a stereoscopic setup and finding the correspondence in the images via feature tracking (FT). The feature tracking provides a key benefit of substantially reducing the level of user input. In contrast to other commonly used methods (e.g., normalized cross-correlation), FT does not require the user to prescribe interrogation window sizes and removes the need for masking when specularities are present. The results of the current FT methodology are comparable to those obtained via Large Scale Particle Image Velocimetry while requiring little to no user input which allowed for rapid, automated processing of imagery.
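    A rough sketch of the feature-tracking step (generic OpenCV Shi-Tomasi corners plus pyramidal Lucas-Kanade with hypothetical frame files; the authors' stereoscopic correspondence and velocity scaling are not shown):

    ```python
    # Sketch: frame-to-frame feature tracking with OpenCV (generic; not the authors' pipeline).
    # Assumes two consecutive grayscale frames of the water surface, e.g. frame0.png / frame1.png.
    import cv2
    import numpy as np

    frame0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file names
    frame1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    # detect Shi-Tomasi corners on the first frame (no interrogation windows or masks needed)
    p0 = cv2.goodFeaturesToTrack(frame0, maxCorners=500, qualityLevel=0.01, minDistance=7)

    # track them into the second frame with pyramidal Lucas-Kanade optical flow
    p1, status, _ = cv2.calcOpticalFlowPyrLK(frame0, frame1, p0, None)

    good0 = p0[status.ravel() == 1].reshape(-1, 2)
    good1 = p1[status.ravel() == 1].reshape(-1, 2)
    displacements = good1 - good0                              # pixel displacement per feature
    print("tracked %d features, mean displacement (px): %s"
          % (len(displacements), displacements.mean(axis=0)))
    ```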

  11. Using Multiple Schedules During Functional Communication Training to Promote Rapid Transfer of Treatment Effects

    PubMed Central

    Fisher, Wayne W.; Greer, Brian D.; Fuhrman, Ashley M.; Querim, Angie C.

    2016-01-01

    Multiple schedules with signaled periods of reinforcement and extinction have been used to thin reinforcement schedules during functional communication training (FCT) to make the intervention more practical for parents and teachers. We evaluated whether these signals would also facilitate rapid transfer of treatment effects from one setting to the next and from one therapist to the next. With two children, we conducted FCT in the context of mixed (baseline) and multiple (treatment) schedules introduced across settings or therapists using a multiple baseline design. Results indicated that when the multiple schedules were introduced, the functional communication response came under rapid discriminative control, and problem behavior remained at near-zero rates. We extended these findings with another individual by using a more traditional baseline in which problem behavior produced reinforcement. Results replicated those of the previous participants and showed rapid reductions in problem behavior when multiple schedules were implemented across settings. PMID:26384141

  12. Using multiple schedules during functional communication training to promote rapid transfer of treatment effects.

    PubMed

    Fisher, Wayne W; Greer, Brian D; Fuhrman, Ashley M; Querim, Angie C

    2015-12-01

    Multiple schedules with signaled periods of reinforcement and extinction have been used to thin reinforcement schedules during functional communication training (FCT) to make the intervention more practical for parents and teachers. We evaluated whether these signals would also facilitate rapid transfer of treatment effects across settings and therapists. With 2 children, we conducted FCT in the context of mixed (baseline) and multiple (treatment) schedules introduced across settings or therapists using a multiple baseline design. Results indicated that when the multiple schedules were introduced, the functional communication response came under rapid discriminative control, and problem behavior remained at near-zero rates. We extended these findings with another individual by using a more traditional baseline in which problem behavior produced reinforcement. Results replicated those of the previous participants and showed rapid reductions in problem behavior when multiple schedules were implemented across settings. © Society for the Experimental Analysis of Behavior.

  13. Optimal resolution in maximum entropy image reconstruction from projections with multigrid acceleration

    NASA Technical Reports Server (NTRS)

    Limber, Mark A.; Manteuffel, Thomas A.; Mccormick, Stephen F.; Sholl, David S.

    1993-01-01

    We consider the problem of image reconstruction from a finite number of projections over the space L¹(Ω), where Ω is a compact subset of ℝ². We prove that, given a discretization of the projection space, the function that generates the correct projection data and maximizes the Boltzmann-Shannon entropy is piecewise constant on a certain discretization of Ω, which we call the 'optimal grid'. It is on this grid that one obtains the maximum resolution given the problem setup. The size of this grid grows very quickly as the number of projections and number of cells per projection grow, indicating fast computational methods are essential to make its use feasible. We use a Fenchel duality formulation of the problem to keep the number of variables small while still using the optimal discretization, and propose a multilevel scheme to improve convergence of a simple cyclic maximization scheme applied to the dual problem.
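    The underlying variational problem has the generic form sketched below (in the notation of the abstract; the discretization of the projection operators follows the paper):

    ```latex
    \max_{f \in L^{1}(\Omega),\; f \ge 0} \; -\!\int_{\Omega} f(\mathbf{x}) \ln f(\mathbf{x})\, d\mathbf{x}
    \quad \text{subject to} \quad R_{i} f = b_{i}, \quad i = 1, \dots, m,
    ```

    where each R_i denotes a (discretized) projection operator and b_i the corresponding measured projection data.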

  14. FASOR - A second generation shell of revolution code

    NASA Technical Reports Server (NTRS)

    Cohen, G. A.

    1978-01-01

    An integrated computer program entitled Field Analysis of Shells of Revolution (FASOR) currently under development for NASA is described. When completed, this code will treat prebuckling, buckling, initial postbuckling and vibrations under axisymmetric static loads as well as linear response and bifurcation under asymmetric static loads. Although these modes of response are treated by existing programs, FASOR extends the class of problems treated to include general anisotropy and transverse shear deformations of stiffened laminated shells. At the same time, a primary goal is to develop a program which is free of the usual problems of modeling, numerical convergence and ill-conditioning, laborious problem setup, limitations on problem size and interpretation of output. The field method is briefly described, the shell differential equations are cast in a suitable form for solution by this method and essential aspects of the input format are presented. Numerical results are given for both unstiffened and stiffened anisotropic cylindrical shells and compared with previously published analytical solutions.

  15. Quantum chi-squared and goodness of fit testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Temme, Kristan; Verstraete, Frank

    2015-01-15

    A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are very much related to each other, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d. quantum states is developed and a max-min characterization for the optimal measurement is introduced. We find the quantum measurement which leads both to the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.

  16. Flight-deck automation - Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilising, and improving control and monitoring systems. Investigation into flight-deck automation systems is important as the knowledge gained can be applied to other systems such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.

  17. Real-time and rapid detection of Salmonella Typhimurium using an inexpensive lab-built surface plasmon resonance setup

    NASA Astrophysics Data System (ADS)

    Lukose, Jijo; Shetty, Vignesh; Ballal, Mamatha; Chidangil, Santhosh; Sinha, Rajeev K.

    2018-07-01

    Cost-effective diagnostic platforms for rapid pathogen detection are always in demand in both the developing and developed worlds. However, exorbitant diagnostic expenses and the inability to detect pathogens early are a matter of concern for the sustainability and affordability of healthcare devices, which are crucial for deciding how to provide healthcare solutions to the masses, especially in developing countries. Herein, we present the rapid and real-time detection of Salmonella Typhimurium using an inexpensive lab-built surface plasmon resonance (SPR) imaging setup. Pathogen detection is accomplished with the aid of a monoclonal antibody immobilized on a 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide:N-hydroxysuccinimide-modified self-assembled monolayer covalently bonded to an Au thin film. Successful pathogen detection is performed at two concentrations, ~1.5 × 10⁸ and ~1 × 10⁶ cfu ml⁻¹, in phosphate-buffered saline solution. The developed system is capable of detecting bacterial cells within 6–7 min after their injection onto the SPR sensor surface. The present study reveals a cost-effective device having high potential for pathogen detection without any labelling tags.

  18. Integrated Reconfigurable Intelligent Systems (IRIS) for Complex Naval Systems

    DTIC Science & Technology

    2011-02-23

    Excerpt (table of contents and text fragments): Introduction; 2.2 General Model Setup; 2.2.1 Co-Simulation Principles; 2.2.2 Double pendulum: a simple example; 2.2.3 Description of numerical ... pendulum sample problem; 2.3 Discussion of Approach with Respect to Proposed Subtasks; 2.4 Results Discussion and Future Work; Task 3 ... [Kim and Praehofer 2000]. 2.2.2 Double pendulum: a simple example. In order to be able to evaluate co-simulation principles, specifically an ...

  19. Hawking radiation in an electromagnetic waveguide?

    PubMed

    Schützhold, Ralf; Unruh, William G

    2005-07-15

    It is demonstrated that the propagation of electromagnetic waves in an appropriately designed waveguide is (for large wavelengths) analogous to that within a curved space-time--such as around a black hole. As electromagnetic radiation (e.g., microwaves) can be controlled, amplified, and detected (with present-day technology) much easier than sound, for example, we propose a setup for the experimental verification of the Hawking effect. Apart from experimentally testing this striking prediction, this would facilitate the investigation of the trans-Planckian problem.

  20. Graphical approach for multiple values logic minimization

    NASA Astrophysics Data System (ADS)

    Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.

    1999-03-01

    Multiple valued logic (MVL) is sought for designing high complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system is dependent on optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL logic optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.

  1. A Clock Fingerprints-Based Approach for Wireless Transmitter Identification

    NASA Astrophysics Data System (ADS)

    Zhao, Caidan; Xie, Liang; Huang, Lianfen; Yao, Yan

    Cognitive radio (CR) was proposed as one of the promising solutions to the problem of low spectrum utilization. However, security problems such as the primary user emulation (PUE) attack severely limit its applications. In this paper, we propose a clock fingerprints-based authentication approach to prevent PUE attacks in CR networks with the help of curve fitting and a classifier. An experimental setup was constructed using WLAN cards and software radio devices, and the corresponding results show that satisfactory identification of wireless transmitters can be achieved.
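
    The abstract does not give the exact fingerprinting algorithm; one common ingredient of clock-based fingerprinting is estimating a transmitter's clock skew from the slope of observed timestamp offsets. A minimal Python sketch with hypothetical data (the paper's curve-fitting and classifier stages are not reproduced):

        # Estimate clock skew (in ppm) as the slope of timestamp offset versus local time.
        import numpy as np

        local_rx_time = np.array([0.0, 1.0, 2.0, 3.0, 4.0])                     # receiver clock, s
        frame_timestamp = np.array([0.0, 1.00002, 2.00004, 3.00006, 4.00008])   # sender clock, s

        offset = frame_timestamp - local_rx_time
        skew_ppm = np.polyfit(local_rx_time, offset, 1)[0] * 1e6                # slope in ppm
        print(f"estimated clock skew: {skew_ppm:.1f} ppm")                      # a fingerprint feature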

  2. Fiber Amplifier Report for NEPP 2008

    NASA Technical Reports Server (NTRS)

    Thomes, Joe; Ott, Melanie; LaRocca, Frank; Chuska, Rick; Switzer, Rob

    2008-01-01

    Ongoing qualification activities of LiNbO3 modulators. Passive (unpumped) radiation testing of Er-, Yb-, and Er/Yb-doped fibers: a) Yb-doped fibers exhibit higher radiation resistance than Er-doped fibers; b) Er/Yb co-doped fibers exhibit largest radiation resistance. Active (pumped) radiation testing of Yb-doped fibers conducted at NASA GSFC: a) Typical decay behavior observed; b) No comparison could be made to other fibers due to problems with test setup. Development of new high power fiber terminations.

  3. The effect of systematic set-up deviations on the absorbed dose distribution for left-sided breast cancer treated with respiratory gating

    NASA Astrophysics Data System (ADS)

    Edvardsson, A.; Ceberg, S.

    2013-06-01

    The aim of this study was 1) to investigate inter-fraction set-up uncertainties for patients treated with respiratory gating for left-sided breast cancer, 2) to investigate the effect of the inter-fraction set-up on the absorbed dose distribution for the target and organs at risk (OARs) and 3) to optimize the set-up correction strategy. By acquiring multiple set-up images, the systematic set-up deviation was evaluated. The effect of the systematic set-up deviation on the absorbed dose distribution was evaluated by 1) simulation in the treatment planning system and 2) measurements with a biplanar diode array. The set-up deviations could be decreased using a no action level correction strategy. Not using the clinically implemented adaptive maximum likelihood factor for the gating patients resulted in better set-up. When the uncorrected set-up deviations were simulated, the average mean absorbed dose was increased from 1.38 to 2.21 Gy for the heart, 4.17 to 8.86 Gy to the left anterior descending coronary artery and 5.80 to 7.64 Gy to the left lung. Respiratory gating can induce systematic set-up deviations which would result in an increased mean absorbed dose to the OARs if left uncorrected, and which should therefore be addressed by an appropriate correction strategy.

  4. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose

    2018-03-01

    Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.

  5. Quantum annealing for the number-partitioning problem using a tunable spin glass of ions

    PubMed Central

    Graß, Tobias; Raventós, David; Juliá-Díaz, Bruno; Gogolin, Christian; Lewenstein, Maciej

    2016-01-01

    Exploiting quantum properties to outperform classical ways of information processing is an outstanding goal of modern physics. A promising route is quantum simulation, which aims at implementing relevant and computationally hard problems in controllable quantum systems. Here we demonstrate that in a trapped ion setup, with present day technology, it is possible to realize a spin model of the Mattis-type that exhibits spin glass phases. Our method produces the glassy behaviour without the need for any disorder potential, just by controlling the detuning of the spin-phonon coupling. Applying a transverse field, the system can be used to benchmark quantum annealing strategies which aim at reaching the ground state of the spin glass starting from the paramagnetic phase. In the vicinity of a phonon resonance, the problem maps onto number partitioning, and instances which are difficult to address classically can be implemented. PMID:27230802

  6. Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness

    NASA Astrophysics Data System (ADS)

    Kaushik, Anshul; Ramani, Anand

    2014-04-01

    Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.

  7. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel Jose

    Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.

  8. OMOGENIA: A Semantically Driven Collaborative Environment

    NASA Astrophysics Data System (ADS)

    Liapis, Aggelos

    Ontology creation can be thought of as a social procedure. Indeed, the concepts involved generally need to be elicited from communities of domain experts and end-users by teams of knowledge engineers. Many problems in ontology creation appear to resemble certain problems in software design, particularly with respect to the setup of collaborative systems. For instance, the resolution of conceptual conflicts between formalized ontologies is a major engineering problem as ontologies move into widespread use on the semantic web. Such conflict resolution often requires human collaboration and, except in simple cases, cannot be achieved by automated methods. In this chapter we discuss research in the field of computer-supported cooperative work (CSCW) that focuses on classification and which throws light on ontology building. Furthermore, we present a semantically driven collaborative environment called OMOGENIA as a natural way to display and examine the structure of an evolving ontology in a collaborative setting.

  9. Electron Transport In Nanowires - An Engineer'S View

    NASA Astrophysics Data System (ADS)

    Nawrocki, W.

    In the paper, technological problems connected to electron transport in mesoscopic structures and nanostructures are considered. The electrical conductance of nanowires formed by metallic contacts was measured in an experimental setup proposed by Costa-Kramer et al. The investigation has been performed in air at room temperature, measuring the conductance between two vibrating metal wires with a standard oscilloscope. Conductance quantization in units of G₀ = 2e²/h = (12.9 kΩ)⁻¹ up to five quanta of conductance has been observed for nanowires formed in many metals. The explanation of this universal phenomenon is the formation of a nanometer-sized wire (nanowire) between macroscopic metallic contacts, which induces, according to the theory proposed by Landauer, the quantization of conductance. Thermal problems in nanowires are also discussed in the paper.
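
    For reference, the conductance quantum quoted above follows directly from fundamental constants; a short Python check:

        # Conductance quantum G0 = 2e^2/h and its inverse resistance (~12.9 kOhm).
        e = 1.602176634e-19   # elementary charge, C
        h = 6.62607015e-34    # Planck constant, J*s

        G0 = 2 * e**2 / h
        print(f"G0   = {G0:.4e} S")               # ~7.748e-05 S
        print(f"1/G0 = {1 / G0 / 1e3:.1f} kOhm")  # ~12.9 kOhm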

  10. Impact of computer use on children's vision.

    PubMed

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  11. GenASiS Basics: Object-oriented utilitarian functionality for large-scale physics simulations

    DOE PAGES

    Cardall, Christian Y.; Budiardja, Reuben D.

    2015-06-11

    Aside from numerical algorithms and problem setup, large-scale physics simulations on distributed-memory supercomputers require more basic utilitarian functionality, such as physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. Here we describe and make available Fortran 2003 classes furnishing extensible object-oriented implementations of this sort of rudimentary functionality, along with individual `unit test' programs and larger example problems demonstrating their use. Lastly, these classes compose the Basics division of our developing astrophysics simulation code GenASiS (General Astrophysical Simulation System), but their fundamental nature makes them useful for physics simulations in many fields.

  12. A practical guide to the Piccolo autopilot

    NASA Astrophysics Data System (ADS)

    Mornhinweg, Anton

    In support of a UAV contract the Piccolo SL and Piccolo II autopilots were installed and operated on various aircraft. Numerous problems with the autopilot setup and analysis processes were found along with numerous problems with documentation and autopilot system information. Major areas of concern are identified along with objectives to eliminate the major areas of concern. Piccolo simulator vehicle gain calculations and Piccolo generation 2 version 2.1.4 control laws are reverse engineered. A complete modeling guide is created. Methods are developed to perform and analyze doublet maneuvers. A series of flight procedures are outlined that include methods for tuning gains. A series of MATLAB graphical user interfaces were created to analyze flight data and pertinent control loop data for gain tuning.

  13. Heading in the right direction? An innovative approach toward proper patient head positioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grush, William H.; Steffen, Gary A

    2002-12-31

    An in-house-manufactured modification of the standard A-F foam rubber head-neck supports (a.k.a. Timo Supports) was designed to eliminate clinical setup problems with head immobilization and instability during treatment, thus providing a more comfortable head rest for the patient. The custom design of this head holder seeks to eliminate superior-to-inferior shift, and minimize the lateral right-to-left rotational movement of the head when coupled with an AquaPlast casting system. By focusing attention on the seating of the occipital portion of the head and the contour of the patient's neck, the aforementioned problems of movement were addressed, while adhering to the interests of patient comfort in this modified head support system.

  14. Experimental instruction in photonics for high school students: approaches to managing problems faced

    NASA Astrophysics Data System (ADS)

    Choong, Zhengyang

    2017-08-01

    Student research projects are increasingly common at the K-12 level. However, students often face difficulties in the course of their school research projects such as setting realistic timelines and expectations, handling problems stemming from a lack of self-confidence, as well as being sufficiently disciplined for sustained communication and experimentation. In this work, we explore manifestations of these problems in the context of a photonics project, characterising the spectrum of the breakdown flash from Silicon Avalanche Photodiodes. We report on the process of planning and building the setup, data collection, analysis and troubleshooting, as well as the technical and human problems at each step. Approaches that were found to be helpful in managing the aforementioned problems are discussed, including an attention to detail during experimental work, as well as communicating in a forthcoming manner. The former allowed for clearer planning and the setting of quantifiable proximal goals; the latter helped in motivating discipline, and also helped in the understanding of research as an iterative learning process without a clear definition of success or failure.

  15. High-resolution continuous-flow analysis setup for water isotopic measurement from ice cores using laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Emanuelsson, B. D.; Baisden, W. T.; Bertler, N. A. N.; Keller, E. D.; Gkinis, V.

    2015-07-01

    Here we present an experimental setup for water stable isotope (δ18O and δD) continuous-flow measurements and provide metrics defining the performance of the setup during a major ice core measurement campaign (Roosevelt Island Climate Evolution; RICE). We also use the metrics to compare alternate systems. Our setup is the first continuous-flow laser spectroscopy system that uses off-axis integrated cavity output spectroscopy (OA-ICOS; analyzer manufactured by Los Gatos Research, LGR) in combination with an evaporation unit to continuously analyze water samples from an ice core. A Water Vapor Isotope Standard Source (WVISS) calibration unit, manufactured by LGR, was modified to (1) enable measurements on several water standards, (2) increase the temporal resolution by reducing the response time and (3) reduce the influence from memory effects. While this setup was designed for the continuous-flow analysis (CFA) of ice cores, it can also continuously analyze other liquid or vapor sources. The custom setups provide a shorter response time (~ 54 and 18 s for the 2013 and 2014 setups, respectively) compared to the original WVISS unit (~ 62 s), which is an improvement in measurement resolution. Another improvement compared to the original WVISS is that the custom setups have a reduced memory effect. Stability tests comparing the custom and WVISS setups were performed and Allan deviations (σAllan) were calculated to determine precision at different averaging times. For the custom 2013 setup the precision after integration times of 10³ s is 0.060 and 0.070 ‰ for δ18O and δD, respectively. The corresponding σAllan values for the custom 2014 setup are 0.030, 0.060 and 0.043 ‰ for δ18O, δD and δ17O, respectively. For the WVISS setup the precision is 0.035, 0.070 and 0.042 ‰ after 10³ s for δ18O, δD and δ17O, respectively. Both the custom setups and WVISS setup are influenced by instrumental drift with δ18O being more drift sensitive than δD. The σAllan values for δ18O are 0.30 and 0.18 ‰ for the custom 2013 and WVISS setup, respectively, after averaging times of 10⁴ s (2.78 h). Using response time tests and stability tests, we show that the custom setups are more responsive (shorter response time), whereas the University of Copenhagen (UC) setup is more stable. More broadly, comparisons of different setups address the challenge of integrating vaporizer/spectrometer isotope measurement systems into a CFA campaign with many other analytical instruments.
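
    A minimal Python sketch of the (non-overlapping) Allan deviation used above to quantify precision at different averaging times, applied to a hypothetical isotope time series sampled at 1 Hz:

        # Non-overlapping Allan deviation for an averaging time tau = m * sample_interval.
        import numpy as np

        def allan_deviation(y, m):
            # y: evenly sampled measurements; m: samples per averaging bin.
            n_bins = len(y) // m
            bins = y[:n_bins * m].reshape(n_bins, m).mean(axis=1)   # bin averages
            return np.sqrt(0.5 * np.mean(np.diff(bins) ** 2))

        rng = np.random.default_rng(0)
        delta_18O = rng.normal(0.0, 0.1, 100_000)      # hypothetical per-second d18O readings (per mil)
        for m in (1, 10, 100, 1000):
            print(m, allan_deviation(delta_18O, m))    # precision versus averaging time (s)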

  16. Can the use of pulsed direct current induce oscillation in the applied pressure during spark plasma sintering?

    PubMed Central

    Salamon, David; Eriksson, Mirva; Nygren, Mats; Shen, Zhijian

    2012-01-01

    The spark plasma sintering (SPS) process is known for its rapid densification of metals and ceramics. The mechanism behind this rapid densification has been discussed during the last few decades and is yet uncertain. During our SPS experiments we noticed oscillations in the applied pressure, related to a change in electric current. In this study, we investigated the effect of pulsed electrical current on the applied mechanical pressure and related changes in temperature. We eliminated the effect of sample shrinkage in the SPS setup and used a transparent quartz die allowing direct observation of the sample. We found that the use of pulsed direct electric current in our apparatus induces pressure oscillations with the amplitude depending on the current density. While sintering Ti samples we observed temperature oscillations resulting from pressure oscillations, which we attribute to magnetic forces generated within the SPS apparatus. The described current–pressure–temperature relations might increase understanding of the SPS process. PMID:27877472

  17. Protecting tropical forests from the rapid expansion of rubber using carbon payments.

    PubMed

    Warren-Thomas, Eleanor M; Edwards, David P; Bebber, Daniel P; Chhang, Phourin; Diment, Alex N; Evans, Tom D; Lambrick, Frances H; Maxwell, James F; Nut, Menghor; O'Kelly, Hannah J; Theilade, Ida; Dolman, Paul M

    2018-03-02

    Expansion of Hevea brasiliensis rubber plantations is a resurgent driver of deforestation, carbon emissions, and biodiversity loss in Southeast Asia. Southeast Asian rubber extent is massive, equivalent to 67% of oil palm, with rapid further expansion predicted. Results-based carbon finance could dis-incentivise forest conversion to rubber, but efficacy will be limited unless payments match, or at least approach, the costs of avoided deforestation. These include opportunity costs (timber and rubber profits), plus carbon finance scheme setup (transaction) and implementation costs. Using comprehensive Cambodian forest data, exploring scenarios of selective logging and conversion, and assuming land-use choice is based on net present value, we find that carbon prices of $30-$51 per tCO2 are needed to break even against costs, higher than those currently paid on carbon markets or through carbon funds. To defend forests from rubber, either carbon prices must be increased, or other strategies are needed, such as corporate zero-deforestation pledges, and governmental regulation and enforcement of forest protection.
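
    A back-of-envelope version of the break-even calculation described above, written as a short Python sketch; all per-hectare figures are hypothetical placeholders, not the paper's Cambodian data:

        # Break-even carbon price: total cost of avoided deforestation divided by avoided emissions.
        rubber_npv = 8000.0           # USD/ha, forgone rubber profit (opportunity cost)
        timber_npv = 2000.0           # USD/ha, forgone selective-logging profit
        transaction_cost = 300.0      # USD/ha, carbon scheme setup
        implementation_cost = 700.0   # USD/ha, ongoing protection

        avoided_emissions = 300.0     # tCO2/ha kept out of the atmosphere

        break_even_price = (rubber_npv + timber_npv + transaction_cost
                            + implementation_cost) / avoided_emissions
        print(f"break-even carbon price: ${break_even_price:.0f} per tCO2")   # ~$37/tCO2 here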

  18. Chiral phase transition at finite chemical potential in 2 +1 -flavor soft-wall anti-de Sitter space QCD

    NASA Astrophysics Data System (ADS)

    Bartz, Sean P.; Jacobson, Theodore

    2018-04-01

    The phase transition from hadronic matter to chirally symmetric quark-gluon plasma is expected to be a rapid crossover at zero quark chemical potential (μ), becoming first order at some finite value of μ, indicating the presence of a critical point. Using a three-flavor soft-wall model of anti-de Sitter/QCD, we investigate the effect of varying the light and strange quark masses on the order of the chiral phase transition. At zero quark chemical potential, we reproduce the Columbia Plot, which summarizes the results of lattice QCD and other holographic models. We then extend this holographic model to examine the effects of finite quark chemical potential. We find that the chemical potential does not affect the critical line that separates first-order from rapid crossover transitions. This excludes the possibility of a critical point in this model, suggesting that a different setup is necessary to reproduce all the features of the QCD phase diagram.

  19. From Cleanroom to Desktop: Emerging Micro-Nanofabrication Technology for Biomedical Applications

    PubMed Central

    Wang, Wei

    2010-01-01

    This review is motivated by the growing demand for low-cost, easy-to-use, compact-size yet powerful micro-nanofabrication technology to address emerging challenges of fundamental biology and translational medicine in regular laboratory settings. Recent advancements in the field benefit considerably from rapidly expanding material selections, ranging from inorganics to organics and from nanoparticles to self-assembled molecules. Meanwhile a great number of novel methodologies, employing off-the-shelf consumer electronics, intriguing interfacial phenomena, bottom-up self-assembly principles, etc., have been implemented to transit micro-nanofabrication from a cleanroom environment to a desktop setup. Furthermore, the latest application of micro-nanofabrication to emerging biomedical research will be presented in detail, which includes point-of-care diagnostics, on-chip cell culture as well as bio-manipulation. While significant progresses have been made in the rapidly growing field, both apparent and unrevealed roadblocks will need to be addressed in the future. We conclude this review by offering our perspectives on the current technical challenges and future research opportunities. PMID:21161384

  20. BEAP profiles as rapid test system for status analysis and early detection of process incidents in biogas plants.

    PubMed

    Refai, Sarah; Berger, Stefanie; Wassmann, Kati; Hecht, Melanie; Dickhaus, Thomas; Deppenmeier, Uwe

    2017-03-01

    A method was developed to quantify the performance of microorganisms involved in different digestion levels in biogas plants. The test system was based on the addition of butyrate (BCON), ethanol (ECON), acetate (ACON) or propionate (PCON) to biogas sludge samples and the subsequent analysis of CH4 formation in comparison to control samples. The combination of the four values was referred to as BEAP profile. Determination of BEAP profiles enabled rapid testing of a biogas plant's metabolic state within 24 h and an accurate mapping of all degradation levels in a lab-scale experimental setup. Furthermore, it was possible to distinguish between specific BEAP profiles for standard biogas plants and for biogas reactors with process incidents (beginning of NH4+-N inhibition, start of acidification, insufficient hydrolysis and potential mycotoxin effects). Finally, BEAP profiles also functioned as a warning system for the early prediction of critical NH4+-N concentrations leading to a drop of CH4 formation.
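
    A minimal Python sketch of how a BEAP profile could be assembled from measured CH4 formation, expressing each substrate-spiked sample relative to its control; the numbers are hypothetical and only illustrate the normalization step:

        # BEAP profile: CH4 formation after adding butyrate (BCON), ethanol (ECON),
        # acetate (ACON) or propionate (PCON), normalized to an unspiked control.
        ch4_control = 10.0   # hypothetical CH4 yield of the control sample in 24 h
        ch4_spiked = {"BCON": 14.0, "ECON": 15.5, "ACON": 13.0, "PCON": 11.0}

        beap_profile = {k: v / ch4_control for k, v in ch4_spiked.items()}
        print(beap_profile)  # e.g., a low PCON ratio could hint at weak propionate degradation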

  1. Real-Time Leaky Lamb Wave Spectrum Measurement and Its Application to NDE of Composites

    NASA Technical Reports Server (NTRS)

    Lih, Shyh-Shiuh; Bar-Cohen, Yoseph

    1999-01-01

    Numerous analytical and theoretical studies of the behavior of leaky Lamb waves (LLW) in composite materials have been documented in the literature. One of the key issues constraining the application of this method as a practical tool is the amount of data that needs to be acquired and the slow process involved in such experiments. Recently, a methodology that allows quasi-real-time acquisition of LLW dispersion data was developed. At each angle of incidence, the reflection spectrum is available in real time from the experimental setup and can be used for rapid detection of defects. This technique can be used to rapidly acquire the various plate wave modes along various angles of incidence for the characterization of the material elastic properties. The experimental method and data acquisition technique are described in this paper. Experimental data were used to examine a series of flaws including porosity and delaminations and demonstrated the efficiency of the developed technique.

  2. From cleanroom to desktop: emerging micro-nanofabrication technology for biomedical applications.

    PubMed

    Pan, Tingrui; Wang, Wei

    2011-02-01

    This review is motivated by the growing demand for low-cost, easy-to-use, compact-size yet powerful micro-nanofabrication technology to address emerging challenges of fundamental biology and translational medicine in regular laboratory settings. Recent advancements in the field benefit considerably from rapidly expanding material selections, ranging from inorganics to organics and from nanoparticles to self-assembled molecules. Meanwhile a great number of novel methodologies, employing off-the-shelf consumer electronics, intriguing interfacial phenomena, bottom-up self-assembly principles, etc., have been implemented to transit micro-nanofabrication from a cleanroom environment to a desktop setup. Furthermore, the latest application of micro-nanofabrication to emerging biomedical research will be presented in detail, which includes point-of-care diagnostics, on-chip cell culture as well as bio-manipulation. While significant progresses have been made in the rapidly growing field, both apparent and unrevealed roadblocks will need to be addressed in the future. We conclude this review by offering our perspectives on the current technical challenges and future research opportunities.

  3. Stability of azimuthal-angle observables under higher order corrections in inclusive three-jet production

    NASA Astrophysics Data System (ADS)

    Caporale, F.; Celiberto, F. G.; Chachamis, G.; Gómez, D. Gordo; Vera, A. Sabio

    2017-04-01

    Recently, a new family of observables consisting of azimuthal-angle generalized ratios was proposed in a kinematical setup that resembles the usual Mueller-Navelet jets but with an additional tagged jet in the central region of rapidity. Nontagged minijet activity between the three jets can affect significantly the azimuthal angle orientation of the jets and is accounted for by the introduction of two Balitsky-Fadin-Kuraev-Lipatov (BFKL) gluon Green functions. Here, we calculate the, presumably, most relevant higher order corrections to the observables by now convoluting the three leading order jet vertices with two gluon Green functions at next-to-leading logarithmic approximation. The corrections appear to be mostly moderate, giving us confidence that the recently proposed observables are actually an excellent way to probe the BFKL dynamics at the LHC. Furthermore, we allow for the jets to take values in different rapidity bins in various configurations such that a comparison between our predictions and the experimental data is a straightforward task.

  4. Bubble-driven mixer integrated with a microfluidic bead-based ELISA for rapid bladder cancer biomarker detection.

    PubMed

    Lin, Yen-Heng; Wang, Chia-Chu; Lei, Kin Fong

    2014-04-01

    In this study, fine bubbles were successfully generated and used as a simple, low-cost driving force for mixing fluids in an integrated microfluidic bead-based enzyme-linked immunosorbent assay (ELISA) to rapidly and quantitatively detect apolipoprotein A1 (APOA1), a biomarker highly correlated with bladder cancer. A wooden gas diffuser was embedded underneath a microfluidic chip to refine injected air and generate bubbles of less than 0.3 mm. The rising bubbles caused disturbances and convection in the fluid, increasing the probability of analyte interaction. This setup not only simplifies the micromixer design but also achieves rapid mixing with a small airflow as a force. We used this bubble-driven micromixer in a bead-based ELISA that targeted APOA1. The results indicate that this micromixer reduced the time for each incubation from 60 min in the conventional assay to 8 min with the chip, resulting in a reduction of total ELISA reaction time from 3-4 h to 30-40 min. Furthermore, the concentration detection limit was 9.16 ng/mL, which was lower than the detection cut-off value (11.16 ng/mL) for bladder cancer diagnosis reported in the literature. Therefore, this chip can be used to achieve rapid low-cost bladder cancer detection and may be used in point-of-care cancer monitoring.

  5. Rapid localized heating of graphene coating on a silicon mold by induction for precision molding of polymer optics.

    PubMed

    Zhang, Lin; Zhou, Wenchen; Yi, Allen Y

    2017-04-01

    In compression molding of polymer optical components with micro/nanoscale surface features, rapid heating of the mold surface is critical for the implementation of this technology for large-scale applications. In this Letter, a novel method of a localized rapid heating process is reported. This process is based on induction heating of a thin conductive coating deposited on a silicon mold. Since the graphene coating is very thin (∼45  nm), a high heating rate of 10∼20°C/s can be achieved by employing a 1200 W 30 kHz electrical power unit. Under this condition, the graphene-coated surface and the polymer substrate can be heated above the polymer's glass transition temperature within 30 s and subsequently cooled down to room temperature within several tens of seconds after molding, resulting in an overall thermal cycle of about 3 min or shorter. The feasibility of this process was validated by fabrication of optical gratings, micropillar matrices, and microlens arrays on polymethylmethacrylate (PMMA) substrates with very high precision. The uniformity and surface geometries of the replicated optical elements are evaluated using an optical profilometer, a diffraction test setup, and a Shack-Hartmann wavefront sensor built with a molded PMMA microlens array. Compared with the conventional bulk heating molding process, this novel rapid localized induction heating process could improve replication efficiency with better geometrical fidelity.

  6. SU-E-J-22: A Feasibility Study On KV-Based Whole Breast Radiation Patient Setup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Q; Zhang, M; Yue, N

    Purpose: In-room kilovoltage x-ray (kV) imaging provides higher contrast than Megavoltage (MV) imaging with faster acquisition time compared with on-board cone-beam computed tomography (CBCT), thus improving patient setup accuracy and efficiency. In this study we evaluated the clinical feasibility of utilizing kV imaging for whole breast radiation patient setup. Methods: For six breast cancer patients with whole breast treatment plans using two opposed tangential fields, MV-based patient setup was conducted by aligning patient markers with in-room lasers and MV portal images. Beam's-eye-view kV images were acquired using the Varian OBI system after the setup process. In-house software was developed to overlay MLC block information onto the kV images to demonstrate the field shape for verification. The kV-based patient digital shift was derived by performing rigid registration between the kV image and the digitally reconstructed radiograph (DRR) to align the bony structure. This digital shift between the kV-based and MV-based setup was defined as the setup deviation. Results: Six sets of kV images were acquired for breast patients. The mean setup deviation was 2.3 mm, 2.2 mm and 1.8 mm in the anterior-posterior, superior-inferior and left-right directions, respectively. The average setup deviation magnitude was 4.3 ± 1.7 mm for six patients. Patients with large breasts had larger setup deviations (4.4–6.2 mm). There was no strong correlation between the MV-based shift and the setup deviation. Conclusion: A preliminary clinical workflow for kV-based whole breast radiation setup was established and tested. We observed setup deviations with magnitudes below 5 mm. With the benefit of providing higher contrast and MLC blocks overlaid on the images for treatment field verification, it is feasible to use kV imaging for breast patient setup.
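
    A small Python sketch of the vector arithmetic behind the reported deviation magnitudes, using the per-direction values above as an illustrative single shift; note that the 4.3 mm figure in the abstract is an average of per-patient magnitudes, which need not equal the magnitude of the per-direction means.

        # 3D setup deviation magnitude from per-direction shifts (mm):
        # anterior-posterior, superior-inferior, left-right. Values are illustrative.
        import math

        ap, si, lr = 2.3, 2.2, 1.8
        magnitude = math.sqrt(ap**2 + si**2 + lr**2)
        print(f"{magnitude:.1f} mm")  # ~3.7 mm for this particular shift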

  7. A parameter control method in reinforcement learning to rapidly follow unexpected environmental changes.

    PubMed

    Murakoshi, Kazushi; Mizuno, Junya

    2004-11-01

    In order to rapidly follow unexpected environmental changes, we propose a parameter control method in reinforcement learning that changes each of the learning parameters in an appropriate direction. We determine each appropriate direction on the basis of relationships between behaviors and neuromodulators, taking an emergency as the key concept. Computer experiments show that agents using our proposed method could rapidly respond to unexpected environmental changes, regardless of which of two reinforcement learning algorithms (Q-learning and the actor-critic (AC) architecture) or which of two learning problems (discontinuous and continuous state-action problems) was used.
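
    The abstract does not spell out the neuromodulator-inspired update rule, so the following Python sketch only illustrates the general idea under assumed thresholds: when an unexpectedly large TD error signals an environmental change, the learning rate and exploration rate of a tabular Q-learning agent are temporarily raised.

        # Minimal sketch: boost the learning rate and exploration when the TD error spikes,
        # so the agent re-adapts quickly after an unexpected environmental change.
        # Thresholds and boost values are assumptions, not the paper's rule.
        import numpy as np

        n_states, n_actions = 10, 2
        Q = np.zeros((n_states, n_actions))
        alpha, epsilon, gamma = 0.1, 0.05, 0.95
        EMERGENCY_TD = 1.0                      # assumed threshold for an "emergency"

        def update(s, a, r, s_next):
            global alpha, epsilon
            td_error = r + gamma * Q[s_next].max() - Q[s, a]
            if abs(td_error) > EMERGENCY_TD:    # a large surprise: learn faster, explore more
                alpha, epsilon = 0.5, 0.3
            else:                               # otherwise relax back toward baseline values
                alpha = max(0.1, alpha * 0.99)
                epsilon = max(0.05, epsilon * 0.99)
            Q[s, a] += alpha * td_error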

  8. [Comparison of four identical electronic noses and three measurement set-ups].

    PubMed

    Koczulla, R; Hattesohl, A; Biller, H; Hofbauer, J; Hohlfeld, J; Oeser, C; Wirtz, H; Jörres, R A

    2011-08-01

    Volatile organic compounds (VOCs) can be used as biomarkers in exhaled air. VOC profiles can be detected by an array of nanosensors of an electronic nose. These profiles can be analysed using bioinformatics. It is, however, not known whether different devices of the same model measure identically and to which extent different set-ups and the humidity of the inhaled air influence the VOC profile. Three different measuring set-ups were designed and three healthy control subjects were measured with each of them, using four devices of the same model (Cyranose 320™, Smiths Detection). The exhaled air was collected in a plastic bag. Either ambient air was used as reference (set-up Leipzig), or the reference air was humidified (100% relative humidity) (set-up Marburg and set-up Munich). In the set-up Marburg the subjects inhaled standardised medical air (Aer medicinalis Linde, AGA AB) out of a compressed air bottle through a demand valve; this air (after humidification) was also used as reference. In the set-up Leipzig the subjects inhaled VOC-filtered ambient air, in the set-up Munich unfiltered room air. The data were evaluated using either the real-time data or the changes in resistance as calculated by the device. The results were clearly dependent on the set-up. Apparently, humidification of the reference air could reduce the variance between devices, but this result was also dependent on the evaluation method used. When comparing the three subjects, the set-ups Munich and Marburg mapped these in a similar way, whereas not only the signals but also the variance of the set-up Leipzig were larger. Measuring VOCs with an electronic nose has not yet been standardised and the set-up significantly affects the results. As other researchers use further methods, it is currently not possible to draw generally accepted conclusions. More systematic tests are required to find the most sensitive and reliable but still feasible set-up so that comparability is improved. © Georg Thieme Verlag KG Stuttgart · New York.

  9. NIAAA's rapid response to college drinking problems initiative: reinforcing the use of evidence-based approaches in college alcohol prevention.

    PubMed

    Dejong, William; Larimer, Mary E; Wood, Mark D; Hartman, Roger

    2009-07-01

    The National Institute on Alcohol Abuse and Alcoholism (NIAAA) created the Rapid Response to College Drinking Problems initiative so that senior college administrators facing an alcohol-related crisis could get assistance from well-established alcohol researchers and NIAAA staff. Based on a competitive grant process, NIAAA selected five teams of research scientists with expertise in college drinking research. NIAAA then invited college administrators to propose interventions to address a recently experienced alcohol-related problem. Between September 2004 and September 2005, NIAAA selected 15 sites and paired each recipient college with a scientific team. Together, each program development/evaluation team, working closely with NIAAA scientific staff, jointly designed, implemented, and evaluated a Rapid Response project. This supplement reports the results of several Rapid Response projects, plus other findings of interest that emerged from that research. Eight articles present evaluation findings for prevention and treatment interventions, which can be grouped by the individual, group/interpersonal, institutional, and community levels of the social ecological framework. Additional studies provide further insights that can inform prevention and treatment programs designed to reduce alcohol-related problems among college students. This article provides an overview of these findings, placing them in the context of the college drinking intervention literature. College drinking remains a daunting problem on many campuses, but evidence-based strategies, such as those described in this supplement, provide hope that more effective solutions can be found. The Rapid Response initiative has helped solidify the necessary link between research and practice in college alcohol prevention and treatment.

  10. Rapid classification of landsat TM imagery for phase 1 stratification using the automated NDVI threshold supervised classification (ANTSC) methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2002-01-01

    FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information...
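
    A minimal Python sketch of the NDVI computation and a simple threshold test of the kind such a methodology relies on; the band values and threshold are placeholders, not the study's parameters:

        # NDVI = (NIR - Red) / (NIR + Red); plots whose NDVI falls outside the range expected
        # for their recorded land cover can be flagged for review. Values are placeholders.
        import numpy as np

        red = np.array([0.10, 0.30, 0.05])   # red reflectance at three plot locations
        nir = np.array([0.50, 0.35, 0.45])   # near-infrared reflectance

        ndvi = (nir - red) / (nir + red)
        forest_threshold = 0.5               # assumed lower bound for forested plots
        flagged = ndvi < forest_threshold    # candidate "problem plots"
        print(ndvi, flagged)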

  11. Rapid design and optimization of low-thrust rendezvous/interception trajectory for asteroid deflection missions

    NASA Astrophysics Data System (ADS)

    Li, Shuang; Zhu, Yongsheng; Wang, Yukai

    2014-02-01

    Asteroid deflection techniques are essential in order to protect the Earth from catastrophic impacts by hazardous asteroids. Rapid design and optimization of low-thrust rendezvous/interception trajectories is considered as one of the key technologies to successfully deflect potentially hazardous asteroids. In this paper, we address a general framework for the rapid design and optimization of low-thrust rendezvous/interception trajectories for future asteroid deflection missions. The design and optimization process includes three closely associated steps. Firstly, shape-based approaches and genetic algorithm (GA) are adopted to perform preliminary design, which provides a reasonable initial guess for subsequent accurate optimization. Secondly, Radau pseudospectral method is utilized to transcribe the low-thrust trajectory optimization problem into a discrete nonlinear programming (NLP) problem. Finally, sequential quadratic programming (SQP) is used to efficiently solve the nonlinear programming problem and obtain the optimal low-thrust rendezvous/interception trajectories. The rapid design and optimization algorithms developed in this paper are validated by three simulation cases with different performance indexes and boundary constraints.

  12. Intracavitary moderator balloon combined with 252Cf brachytherapy and boron neutron capture therapy, improving dosimetry in brain tumour and infiltrations

    PubMed Central

    Brandão, S F

    2015-01-01

    Objective: This article proposes a combination of californium-252 (252Cf) brachytherapy, boron neutron capture therapy (BNCT) and an intracavitary moderator balloon catheter applied to brain tumour and infiltrations. Methods: Dosimetric evaluations were performed on three protocol set-ups: 252Cf brachytherapy combined with BNCT (Cf-BNCT); Cf-BNCT with a balloon catheter filled with light water (LWB) and the same set-up with heavy water (HWB). Results: Cf-BNCT-HWB has presented dosimetric advantages to Cf-BNCT-LWB and Cf-BNCT in infiltrations at 2.0–5.0 cm from the balloon surface. However, Cf-BNCT-LWB has shown superior dosimetry up to 2.0 cm from the balloon surface. Conclusion: Cf-BNCT-HWB and Cf-BNCT-LWB protocols provide a selective dose distribution for brain tumour and infiltrations, mainly further from the 252Cf source, sparing the normal brain tissue. Advances in knowledge: Malignant brain tumours grow rapidly and often spread to adjacent brain tissues, leading to death. Improvements in brain radiation protocols have been continuously achieved; however, brain tumour recurrence is observed in most cases. Cf-BNCT-LWB and Cf-BNCT-HWB represent new modalities for selectively combating brain tumour infiltrations and metastasis. PMID:25927876

  13. Actuation of atomic force microscopy microcantilevers using contact acoustic nonlinearities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torello, D.; Degertekin, F. Levent, E-mail: levent.degertekin@me.gatech.edu

    2013-11-15

    A new method of actuating atomic force microscopy (AFM) cantilevers is proposed in which a high frequency (>5 MHz) wave modulated by a lower frequency (∼300 kHz) wave passes through a contact acoustic nonlinearity at the contact interface between the actuator and the cantilever chip. The nonlinearity converts the high frequency, modulated signal to a low frequency drive signal suitable for actuation of tapping-mode AFM probes. The higher harmonic content of this signal is filtered out mechanically by the cantilever transfer function, providing for clean output. A custom probe holder was designed and constructed using rapid prototyping technologies and off-the-shelf components and was interfaced with an Asylum Research MFP-3D AFM, which was then used to evaluate the performance characteristics with respect to standard hardware and linear actuation techniques. Using a carrier frequency of 14.19 MHz, it was observed that the cantilever output was cleaner with this actuation technique and added no significant noise to the system. This setup, without any optimization, was determined to have an actuation bandwidth on the order of 10 MHz, suitable for high speed imaging applications. Using this method, an image was taken that demonstrates the viability of the technique and is compared favorably to images taken with a standard AFM setup.

  14. An in vitro lung model to assess true shunt fraction by multiple inert gas elimination.

    PubMed

    Varadarajan, Balamurugan; Vogt, Andreas; Hartwich, Volker; Vasireddy, Rakesh; Consiglio, Jolanda; Hugi-Mayr, Beate; Eberle, Balthasar

    2017-01-01

    The Multiple Inert Gas Elimination Technique, based on Micropore Membrane Inlet Mass Spectrometry (MMIMS-MIGET), has been designed as a rapid and direct method to assess the full range of ventilation-to-perfusion (V/Q) ratios. MMIMS-MIGET distributions have not been assessed in an experimental setup with predefined V/Q distributions. We aimed (I) to construct a novel in vitro lung model (IVLM) for the simulation of predefined V/Q distributions with five gas exchange compartments and (II) to correlate shunt fractions derived from MMIMS-MIGET with preset reference shunt values of the IVLM. Five hollow-fiber membrane oxygenators switched in parallel within a closed extracorporeal oxygenation circuit were ventilated with sweep gas (V) and perfused with human red cell suspension or saline (Q). Inert gas solution was infused into the perfusion circuit of the gas exchange assembly. Sweep gas flow (V) was kept constant and reference shunt fractions (IVLM-S) were established by bypassing one or more oxygenators with perfusate flow (Q). The derived shunt fractions (MM-S) were determined using MIGET by MMIMS from the retention data. Shunt derived by MMIMS-MIGET correlated well with preset reference shunt fractions. The in vitro lung model is a convenient system for the setup of predefined true shunt fractions in validation of MMIMS-MIGET.

  15. On the Piezoelectric Detection of Guided Ultrasonic Waves

    PubMed Central

    2017-01-01

    In order to quantify the wave motion of guided ultrasonic waves, the characteristics of piezoelectric detectors, or ultrasonic transducers and acoustic emission sensors, have been evaluated systematically. Such guided waves are widely used in structural health monitoring and nondestructive evaluation, but methods of calibrating piezoelectric detectors have been inadequate. This study relied on laser interferometry for the base displacement measurement of bar waves, from which eight different guided wave test set-ups were developed with known wave motion using piezoelectric transmitters. Both plates and bars of 12.7 and 6.4 mm thickness were used as wave propagation media. The upper frequency limit was 2 MHz. Outputs of guided wave detectors were obtained on the test set-ups and their receiving sensitivities were characterized and averaged. While each sensitivity spectrum was noisy for a detector, the averaged spectrum showed a good convergence to a unique receiving sensitivity. Twelve detectors were evaluated and their sensitivity spectra determined in absolute units. Generally, these showed rapidly dropping sensitivity with increasing frequency due to waveform cancellation on their sensing areas. This effect contributed to vastly different sensitivities to guided waves and to normally incident waves for each one of the 12 detectors tested. Various other effects are discussed and recommendations on methods of implementing the approach developed are provided.

  16. Efficient Computation of Sparse Matrix Functions for Large-Scale Electronic Structure Calculations: The CheSS Library.

    PubMed

    Mohr, Stephan; Dawson, William; Wagner, Michael; Caliste, Damien; Nakajima, Takahito; Genovese, Luigi

    2017-10-10

    We present CheSS, the "Chebyshev Sparse Solvers" library, which has been designed to solve typical problems arising in large-scale electronic structure calculations using localized basis sets. The library is based on a flexible and efficient expansion in terms of Chebyshev polynomials and presently features the calculation of the density matrix, the calculation of matrix powers for arbitrary powers, and the extraction of eigenvalues in a selected interval. CheSS is able to exploit the sparsity of the matrices and scales linearly with respect to the number of nonzero entries, making it well-suited for large-scale calculations. The approach is particularly adapted for setups leading to small spectral widths of the involved matrices and outperforms alternative methods in this regime. By coupling CheSS to the DFT code BigDFT, we show that such a favorable setup is indeed possible in practice. In addition, the approach based on Chebyshev polynomials can be massively parallelized, and CheSS exhibits excellent scaling up to thousands of cores even for relatively small matrix sizes.
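
    A dense, self-contained Python illustration of the Chebyshev-expansion idea behind CheSS; the real library works on sparse matrices with linear scaling and massive parallelism, and nothing below mirrors its actual API:

        # Approximate f(A) for a symmetric matrix A whose spectrum lies in [lo, hi],
        # using a Chebyshev expansion evaluated with the three-term matrix recurrence.
        import numpy as np

        def chebyshev_matrix_function(A, f, order, lo, hi):
            a, b = (hi - lo) / 2.0, (hi + lo) / 2.0
            B = (A - b * np.eye(len(A))) / a          # spectrum mapped into [-1, 1]
            theta = np.pi * (np.arange(order) + 0.5) / order
            fvals = f(a * np.cos(theta) + b)          # f sampled at Chebyshev nodes
            c = np.array([2.0 / order * np.sum(fvals * np.cos(j * theta))
                          for j in range(order)])
            c[0] *= 0.5
            T_prev, T_curr = np.eye(len(A)), B        # T_0(B), T_1(B)
            F = c[0] * T_prev + c[1] * T_curr
            for j in range(2, order):
                T_prev, T_curr = T_curr, 2.0 * B @ T_curr - T_prev
                F += c[j] * T_curr
            return F

        A = np.diag([0.1, 0.4, 0.9])                  # toy Hamiltonian (already diagonal)
        fermi = lambda x: 1.0 / (1.0 + np.exp((x - 0.5) / 0.05))
        print(np.round(chebyshev_matrix_function(A, fermi, order=60, lo=0.0, hi=1.0), 3))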

  17. A scheme for a shot-to-shot, femtosecond-resolved pulse length and arrival time measurement of free electron laser x-ray pulses that overcomes the time jitter problem between the FEL and the laser

    NASA Astrophysics Data System (ADS)

    Juranić, P. N.; Stepanov, A.; Peier, P.; Hauri, C. P.; Ischebeck, R.; Schlott, V.; Radović, M.; Erny, C.; Ardana-Lamas, F.; Monoszlai, B.; Gorgisyan, I.; Patthey, L.; Abela, R.

    2014-03-01

    The recent entry of X-ray free electron lasers (FELs) to all fields of physics has created an enormous need, both from scientists and operators, for better characterization of the beam created by these facilities. Of particular interest is the measurement of the arrival time of the FEL pulse relative to a laser pump, for pump-probe experiments, and the measurement of the FEL pulse length. This article describes a scheme that corrects one of the major sources of uncertainty in these types of measurements, namely the jitter in the arrival time of the FEL relative to an experimental laser beam. The setup presented here uses a combination of THz streak cameras and a spectral encoding setup to reduce the effect of an FEL's jitter, leaving the pulse length as the only variable that can affect the accuracy of the pulse length and arrival time measurement. A discussion of underlying principles is also provided.

  18. Terahertz solid immersion microscopy for sub-wavelength-resolution imaging of biological objects and tissues

    NASA Astrophysics Data System (ADS)

    Chernomyrdin, Nikita V.; Kucheryavenko, Anna S.; Malakhov, Kirill M.; Schadko, Alexander O.; Komandin, Gennady A.; Lebedev, Sergey P.; Dolganova, Irina N.; Kurlov, Vladimir N.; Lavrukhin, Denis V.; Ponomarev, Dmitry S.; Yurchenko, Stanislav O.; Tuchin, Valery V.; Zaytsev, Kirill I.

    2018-04-01

    We have developed a method of terahertz (THz) solid immersion microscopy for imaging of biological objects and tissues. It relies on a solid immersion lens (SIL), which focuses the THz beam into the evanescent-field volume and strongly reduces the dimensions of the THz beam caustic. By solving the problems of sample handling at the focal plane and of raster scanning the sample surface with the focused THz beam, THz SIL microscopy has been adapted for imaging of soft tissues. We have assembled an experimental setup based on a backward-wave oscillator, as a continuous-wave source operating at the wavelength of λ = 500 μm, and a Golay cell, as a detector of the THz wave intensity. By imaging a razor blade, we have demonstrated a 0.2λ resolution for the proposed THz SIL configuration. Using the experimental setup, we have performed THz imaging of a mint leaf, revealing its sub-wavelength features. The results highlight the potential of THz SIL microscopy in biomedical applications of THz science and technology.

  19. Trapped Modes in a Three-Layer Fluid

    NASA Astrophysics Data System (ADS)

    Saha, Sunanda; Bora, Swaroop Nandan

    2018-03-01

    In this work, trapped mode frequencies are computed for a submerged horizontal circular cylinder with the hydrodynamic set-up involving an infinite depth three-layer incompressible fluid with layer-wise different densities. The impermeable cylinder is fully immersed in either the bottom layer or the upper layer. The effect of surface tension at the surface of separation is neglected. In this set-up, there exist three wave numbers: the lowest one on the free surface and the other two on the internal interfaces. For each wave number, there exist two modes for which trapped waves exist. The existence of these trapped modes is shown by numerical evidence. We investigate the variation of these trapped modes subject to change in the depth of the middle layer as well as the submergence depth. We show numerically that two-layer and single-layer results cannot be recovered in the double and single limiting cases of the density ratios tending to unity. The existence of trapped modes shows that in general, a radiation condition for the waves at infinity is insufficient for the uniqueness of the solution of the scattering problem.

  20. Modelling Pulsar Glitches: The Hydrodynamics of Superfluid Vortex Avalanches in Neutron Stars

    NASA Astrophysics Data System (ADS)

    Khomenko, V.; Haskell, B.

    2018-05-01

    The dynamics of quantised vorticity in neutron star interiors is at the heart of most pulsar glitch models. However, the large number of vortices (up to ≈10^13) involved in a glitch and the huge disparity in scales between the femtometre scale of vortex cores and the kilometre scale of the star make quantum dynamical simulations of the problem computationally intractable. In this paper, we take a first step towards developing a mean field prescription to include the dynamics of vortices in large-scale hydrodynamical simulations of superfluid neutron stars. We consider a one-dimensional setup and show that vortex accumulation and differential rotation in the neutron superfluid lead to propagating waves, or `avalanches', as solutions for the equations of motion for the superfluid velocities. We introduce an additional variable, the fraction of free vortices, and test different prescriptions for its advection with the superfluid flow. We find that the new terms lead to solutions with a linear component in the rise of a glitch, and that, in specific setups, they can give rise to glitch precursors and even to decreases in frequency, or `anti-glitches'.

  1. Performance evaluation of distributed wavelength assignment in WDM optical networks

    NASA Astrophysics Data System (ADS)

    Hashiguchi, Tomohiro; Wang, Xi; Morikawa, Hiroyuki; Aoyama, Tomonori

    2004-04-01

    In WDM wavelength-routed networks, prior to a data transfer, a call setup procedure is required to reserve a wavelength path between the source-destination node pairs. A distributed approach to connection setup can achieve very high speed while improving the reliability and reducing the implementation cost of the network. However, along with these advantages, the distributed scheme poses major challenges in how the management and allocation of wavelengths can be carried out efficiently. In this paper, we apply a distributed wavelength assignment algorithm named priority-based wavelength assignment (PWA), originally proposed for use in burst-switched optical networks, to the problem of reserving wavelengths in the path reservation protocols of distributed-control optical networks. Instead of assigning wavelengths randomly, this approach lets each node select the "safest" wavelengths based on wavelength utilization history, thereby preventing unnecessary future contention. The simulation results presented in this paper show that the proposed protocol can enhance the performance of the system without introducing any apparent drawbacks.
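
    The "safest wavelength" idea can be sketched with a toy selector that keeps a per-wavelength contention history and, among the wavelengths currently free on the route, reserves the one that has historically caused the fewest conflicts. The class, the history metric and the tie-breaking rule below are illustrative assumptions, not the published PWA protocol.

```python
from collections import defaultdict

class PriorityWavelengthSelector:
    """Toy sketch of priority-based wavelength assignment (PWA): each node
    keeps a contention history per wavelength and, among the wavelengths
    currently free on the route, reserves the least-contended ("safest") one."""

    def __init__(self, num_wavelengths):
        self.num_wavelengths = num_wavelengths
        self.history = defaultdict(int)          # wavelength -> observed contentions

    def record_contention(self, wavelength):
        self.history[wavelength] += 1            # update history after a failed setup

    def select(self, free_wavelengths):
        if not free_wavelengths:
            return None                          # call is blocked
        # lowest historical contention wins; ties broken by lowest index
        return min(free_wavelengths, key=lambda w: (self.history[w], w))

# usage: a node with 8 wavelengths; wavelengths 2 and 5 are free on the route
node = PriorityWavelengthSelector(num_wavelengths=8)
node.record_contention(2)                        # wavelength 2 caused a conflict before
print(node.select({2, 5}))                       # -> 5
```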

  2. Transfer orbit stage mechanisms thermal vacuum test

    NASA Technical Reports Server (NTRS)

    Oleary, Scott T.

    1990-01-01

    A systems-level mechanisms test was conducted on the Orbital Sciences Corp.'s Transfer Orbit Stage (TOS). The TOS is a unique partially reusable transfer vehicle which will boost a satellite into its operational orbit from the Space Shuttle's cargo bay. The mechanical cradle and tilt assemblies will return to earth with the Space Shuttle while the Solid Rocket Motor (SRM) and avionics package are expended. A mechanisms test was performed on the forward cradle and aft tilting assemblies of the TOS under thermal vacuum conditions. Actuating these assemblies under a 1 g environment and thermal vacuum conditions proved to be a complex task. Pneumatic test fixturing was used to lift the forward cradle and to tilt the SRM and avionics package. Clinometers, linear voltage displacement transducers, and load cells were used in the thermal vacuum chamber to measure the performance and characteristics of the TOS mechanism assembly. Incorporation of the instrumentation and pneumatic system into the test setup was not routine since pneumatic actuation of flight hardware had not been previously performed in the facility. The methods used are presented along with the problems experienced during the design, setup and test phases.

  3. Practical methods for generating alternating magnetic fields for biomedical research

    NASA Astrophysics Data System (ADS)

    Christiansen, Michael G.; Howe, Christina M.; Bono, David C.; Perreault, David J.; Anikeeva, Polina

    2017-08-01

    Alternating magnetic fields (AMFs) cause magnetic nanoparticles (MNPs) to dissipate heat while leaving surrounding tissue unharmed, a mechanism that serves as the basis for a variety of emerging biomedical technologies. Unfortunately, the challenges and costs of developing experimental setups commonly used to produce AMFs with suitable field amplitudes and frequencies present a barrier to researchers. This paper first presents a simple, cost-effective, and robust alternative for small AMF working volumes that uses soft ferromagnetic cores to focus the flux into a gap. As the experimental length scale increases to accommodate animal models (working volumes of 100s of cm3 or greater), poor thermal conductivity and volumetrically scaled core losses render that strategy ineffective. Comparatively feasible strategies for these larger volumes instead use low loss resonant tank circuits to generate circulating currents of 1 kA or greater in order to produce the comparable field amplitudes. These principles can be extended to the problem of identifying practical routes for scaling AMF setups to humans, an infrequently acknowledged challenge that influences the extent to which many applications of MNPs may ever become clinically relevant.

  4. Test One to Test Many: A Unified Approach to Quantum Benchmarks

    NASA Astrophysics Data System (ADS)

    Bai, Ge; Chiribella, Giulio

    2018-04-01

    Quantum benchmarks are routinely used to validate the experimental demonstration of quantum information protocols. Many relevant protocols, however, involve an infinite set of input states, of which only a finite subset can be used to test the quality of the implementation. This is a problem, because the benchmark for the finitely many states used in the test can be higher than the original benchmark calculated for infinitely many states. This situation arises in the teleportation and storage of coherent states, for which the benchmark of 50% fidelity is commonly used in experiments, although finite sets of coherent states normally lead to higher benchmarks. Here, we show that the average fidelity over all coherent states can be indirectly probed with a single setup, requiring only two-mode squeezing, a 50-50 beam splitter, and homodyne detection. Our setup enables a rigorous experimental validation of quantum teleportation, storage, amplification, attenuation, and purification of noisy coherent states. More generally, we prove that every quantum benchmark can be tested by preparing a single entangled state and measuring a single observable.

  5. Kalman filter based control for Adaptive Optics

    NASA Astrophysics Data System (ADS)

    Petit, Cyril; Quiros-Pacheco, Fernando; Conan, Jean-Marc; Kulcsár, Caroline; Raynaud, Henri-François; Fusco, Thierry

    2004-12-01

    Classical Adaptive Optics suffers from a limited corrected Field Of View. This drawback has led to the development of Multi-Conjugate Adaptive Optics (MCAO). While the first MCAO experimental set-ups are presently under construction, little attention has been paid to the control loop. This is, however, a key element in the optimization process, especially for MCAO systems. Different approaches have been proposed in recent articles for astronomical applications: simple integrator, Optimized Modal Gain Integrator and Kalman filtering. We study here Kalman filtering, which seems a very promising solution. Following the work of Brice Leroux, we focus on a frequential characterization of Kalman filters, computing a transfer matrix. The result provides much insight into their behaviour and allows comparisons with classical controllers. It also appears that straightforward improvements of the system models can lead to filtering of static aberrations and vibrations. Simulation results are proposed and analysed thanks to our frequential characterization. Related problems such as model errors, aliasing effect reduction, and experimental implementation and testing of a Kalman filter control loop on a simplified MCAO experimental set-up are then discussed.
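
    As a reminder of the estimator being discussed, a generic discrete-time Kalman filter predict/update step is sketched below; the matrices are placeholders for whatever turbulence and deformable-mirror state model is chosen, and the sketch does not reproduce the authors' MCAO formulation.

```python
import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One predict/update cycle of a discrete-time Kalman filter.
    x, P : current state estimate and its covariance
    y    : new measurement (e.g. wavefront-sensor slopes)
    A, C : state-transition and observation matrices
    Q, R : process and measurement noise covariances
    Generic textbook form, shown only to illustrate the estimator."""
    # predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # update
    S = C @ P_pred @ C.T + R                  # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```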

  6. Carolinas Coastal Change Processes Project data report for observations near Diamond Shoals, North Carolina, January-May 2009

    USGS Publications Warehouse

    Armstrong, Brandy N.; Warner, John C.; Voulgaris, George; List, Jeffrey H.; Thieler, E. Robert; Martini, Marinna A.; Montgomery, Ellyn T.

    2011-01-01

    This Open-File Report provides information collected for an oceanographic field study that occurred during January - May 2009 to investigate processes that control the sediment transport dynamics at Diamond Shoals, North Carolina. The objective of this report is to make the data available in digital form and to provide information to facilitate further analysis of the data. The report describes the background, experimental setup, equipment, and locations of the sensor deployments. The edited data are presented in time-series plots for rapid visualization of the data set, and in data files that are in the Network Common Data Format (netcdf). Supporting observational data are also included.

  7. Low-Cost Peptide Microarrays for Mapping Continuous Antibody Epitopes.

    PubMed

    McBride, Ryan; Head, Steven R; Ordoukhanian, Phillip; Law, Mansun

    2016-01-01

    With the increasing need for understanding antibody specificity in antibody and vaccine research, pepscan assays provide a rapid method for mapping and profiling antibody responses to continuous epitopes. We have developed a relatively low-cost method to generate peptide microarray slides for studying antibody binding. Using a setup of an IntavisAG MultiPep RS peptide synthesizer, a Digilab MicroGrid II 600 microarray printer robot, and an InnoScan 1100 AL scanner, the method allows the interrogation of up to 1536 overlapping, alanine-scanning, and mutant peptides derived from the target antigens. Each peptide is tagged with a polyethylene glycol aminooxy terminus to improve peptide solubility, orientation, and conjugation efficiency to the slide surface.

  8. Microscopic Optical Projection Tomography In Vivo

    PubMed Central

    Meyer, Heiko; Ripoll, Jorge; Tavernarakis, Nektarios

    2011-01-01

    We describe a versatile optical projection tomography system for rapid three-dimensional imaging of microscopic specimens in vivo. Our tomographic setup eliminates the strongly asymmetric resolution in xy and z that results from optical sectioning in conventional confocal microscopy. It allows for robust, high-resolution fluorescence as well as absorption imaging of live transparent invertebrate animals such as C. elegans. This system offers considerable advantages over currently available methods when imaging dynamic developmental processes and animal ageing: it permits monitoring of spatio-temporal gene expression and anatomical alterations with single-cell resolution, it utilizes both fluorescence and absorption as sources of contrast, and it is easily adaptable for a range of small model organisms. PMID:21559481

  9. Biomass conversion determined via fluorescent cellulose decay assay.

    PubMed

    Wischmann, Bente; Toft, Marianne; Malten, Marco; McFarland, K C

    2012-01-01

    An example of a rapid microtiter plate assay (fluorescence cellulose decay, FCD) that determines the conversion of cellulose in a washed biomass substrate is reported. The conversion, as verified by HPLC, is shown to correlate with the monitored FCD in the assay. The FCD assay activity correlates with the performance of multicomponent enzyme mixtures and is thus useful for the biomass industry. The development of an optimized setup of the 96-well microtiter plate is described and is used to test a model that shortens the assay incubation time from 72 to 24 h. A step-by-step procedure of the final assay is described.

  10. Performance and Calibration of H2RG Detectors and SIDECAR ASICs for the RATIR Camera

    NASA Technical Reports Server (NTRS)

    Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Klein, Christopher R.; Butler, Nathaniel R.; Bloom, Josh; de Diego, José A.; Simón Farah, Alejandro D.; Gehrels, Neil A.; Georgiev, Leonid

    2012-01-01

    The Reionization And Transient Infra-Red (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) follow-up and will provide simultaneous optical and infrared photometric capabilities. The infrared portion of this camera incorporates two Teledyne HgCdTe HAWAII-2RG detectors, controlled by Teledyne's SIDECAR ASICs. While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 interface card and IDE development environment. Together, this setup comprises Teledyne's Development Kit, which is a bundled solution that can be efficiently integrated into future ground-based systems. In this presentation, we characterize the system's read noise, dark current, and conversion gain.

  11. Biomimetic and bioinspired nanoparticles for targeted drug delivery.

    PubMed

    Gagliardi, Mariacristina

    2017-03-01

    In drug targeting, the urgent need for more effective and less iatrogenic therapies is pushing toward a complete revision of carrier setup. After the era of 'particles used as homing systems', novel prototypes are now emerging. Newly conceived carriers are endowed with better biocompatibility, biodistribution and targeting properties. The biomimetic approach bestows such improved functional properties. By exploiting biological molecules, organisms and cells, or taking inspiration from them, drug vector performance is now rapidly progressing toward the perfect carrier. Following this direction, researchers have refined carrier properties, achieving significant results. The present review summarizes recent advances in biomimetic and bioinspired drug vectors, derived from biologicals or obtained by processing synthetic materials with a biomimetic approach.

  12. Heat-enhanced peptide synthesis on Teflon-patterned paper.

    PubMed

    Deiss, Frédérique; Yang, Yang; Matochko, Wadim L; Derda, Ratmir

    2016-06-14

    In this report, we describe the methodology for 96 parallel organic syntheses of peptides on Teflon-patterned paper assisted by heating with an infra-red lamp. SPOT synthesis is an important technology for production of peptide arrays on a paper-based support for rapid identification of peptide ligands, epitope mapping, and identification of bio-conjugation reactions. The major drawback of the SPOT synthesis methodology published to date is suboptimal reaction conversion due to mass transport limitations in the unmixed reaction spot. The technology developed in this report overcomes these problems by changing the environment of the reaction from static to dynamic (flow-through), and by further accelerating the reaction by selective heating of the reaction support in contact with activated amino acids. Patterning paper with Teflon allows droplets of organic solvents to be confined in a zone on the paper array and to flow through the paper at a well-defined rate, providing a convenient, power-free setup for flow-through solid-phase synthesis and efficient assembly of peptide arrays. We employed an infra-red (IR) lamp to locally heat the cellulosic support during the flow-through delivery of the reagents to each zone of the paper-based array. We demonstrate that IR-heating in solid phase peptide synthesis shortened the reaction time necessary for amide bond formation down to 3 minutes; in some couplings of alpha amino acids, conversion rates increased up to fifteen-fold. The IR-heating improved the assembly of difficult sequences, such as homo-oligomers of all 20 natural amino acids.

  13. A technique for sequential segmental neuromuscular stimulation with closed loop feedback control.

    PubMed

    Zonnevijlle, Erik D H; Abadia, Gustavo Perez; Somia, Naveen N; Kon, Moshe; Barker, John H; Koenig, Steven; Ewert, D L; Stremel, Richard W

    2002-01-01

    In dynamic myoplasty, dysfunctional muscle is assisted or replaced with skeletal muscle from a donor site. Electrical stimulation is commonly used to train and animate the skeletal muscle to perform its new task. Due to simultaneous tetanic contractions of the entire myoplasty, muscles are deprived of perfusion and fatigue rapidly, causing long-term problems such as excessive scarring and muscle ischemia. Sequential stimulation contracts part of the muscle while other parts rest, thus significantly improving blood perfusion. However, the muscle still fatigues. In this article, we report a test of the feasibility of using closed-loop control to economize the contractions of the sequentially stimulated myoplasty. A simple stimulation algorithm was developed and tested on a sequentially stimulated neo-sphincter designed from a canine gracilis muscle. Pressure generated in the lumen of the myoplasty neo-sphincter was used as feedback to regulate the stimulation signal via three control parameters, thereby optimizing the performance of the myoplasty. Additionally, we investigated and compared the efficiency of amplitude and frequency modulation techniques. Closed-loop feedback enabled us to maintain target pressures within 10% deviation using amplitude modulation and optimized control parameters (correction frequency = 4 Hz, correction threshold = 4%, and transition time = 0.3 s). The large-scale stimulation/feedback setup was unfit for chronic experimentation, but can be used as a blueprint for a small-scale version to unveil the theoretical benefits of closed-loop control in chronic experimentation.
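
    A minimal sketch of the kind of threshold-based correction loop described above is given below, assuming a simple fixed-step amplitude update evaluated at the correction frequency; the parameter names, the update rule and the numbers are illustrative assumptions, not the controller used in the study.

```python
def amplitude_correction(target_pressure, measured_pressure, amplitude,
                         correction_threshold=0.04, step=0.05,
                         amp_min=0.0, amp_max=1.0):
    """One cycle of a threshold-based amplitude-modulation loop, run at the
    correction frequency (e.g. 4 Hz). If the measured lumen pressure deviates
    from the target by more than the threshold, the stimulation amplitude is
    nudged up or down by a fixed step. Illustrative sketch of the idea only."""
    error = (target_pressure - measured_pressure) / target_pressure
    if abs(error) > correction_threshold:
        amplitude += step if error > 0 else -step
    return min(max(amplitude, amp_min), amp_max)

# example: pressure 8% below target -> amplitude is increased by one step
print(amplitude_correction(target_pressure=100.0, measured_pressure=92.0,
                           amplitude=0.5))        # -> ~0.55
```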

  14. Other ways of measuring `Big G'

    NASA Astrophysics Data System (ADS)

    Rothleitner, Christian

    2016-03-01

    In 1798, the British scientist Henry Cavendish performed the first laboratory experiment to determine the gravitational force between two massive bodies. From his result, Newton's gravitational constant, G, was calculated. Cavendish's measurement principle was the torsion balance, invented by John Michell some 15 years before. During the following two centuries, more than 300 new measurements followed. Although technology - and physics - developed rapidly during this time, surprisingly, most experiments were still based on the same principle. In fact, the most accurate determination of G to date is a measurement based on the torsion balance principle. Despite the fact that G was one of the first fundamental physical constants ever measured, and despite the huge number of experiments performed on it to this day, its CODATA recommended value still has the highest standard measurement uncertainty when compared to other fundamental physical constants. Even more serious is the fact that even measurements based on the same principle often do not overlap within their attributed standard uncertainties. It must be assumed that various experiments are subject to one or more unknown biases. In this talk I will present some alternative experimental setups to the torsion balance which have been performed or proposed to measure G. Although their estimated uncertainties are often higher than those of most torsion balance experiments, revisiting such ideas is worthwhile. Advances in technology could offer solutions to problems which were previously insurmountable; these solutions could result in lower measurement uncertainties. New measurement principles could also help to uncover hidden systematic effects.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbiest, G. J., E-mail: Verbiest@physik.rwth-aachen.de; Zalm, D. J. van der; Oosterkamp, T. H.

    The application of ultrasound in an Atomic Force Microscope (AFM) gives access to subsurface information. However, no commercially available AFM exists that is equipped with this technique. The main problems are the electronic crosstalk in the AFM setup and the insufficiently strong excitation of the cantilever at ultrasonic (MHz) frequencies. In this paper, we describe the development of an add-on that provides a solution to these problems by using a special piezo element with a lowest resonance frequency of 2.5 MHz and by separating the electronic connection for this high frequency piezo element from all other connections. In this sense, we provide researchers with the possibility to perform subsurface measurements with their existing AFMs and hopefully also pave the way for the development of a commercial AFM that is capable of imaging subsurface features with nanometer resolution.

  16. Methodology for stereoscopic motion-picture quality assessment

    NASA Astrophysics Data System (ADS)

    Voronov, Alexander; Vatolin, Dmitriy; Sumin, Denis; Napadovsky, Vyacheslav; Borisov, Alexey

    2013-03-01

    Creating and processing stereoscopic video imposes additional quality requirements related to view synchronization. In this work we propose a set of algorithms for detecting typical stereoscopic-video problems, which appear owing to imprecise setup of capture equipment or incorrect postprocessing. We developed a methodology for analyzing the quality of S3D motion pictures and for revealing their most problematic scenes. We then processed 10 modern stereo films, including Avatar, Resident Evil: Afterlife and Hugo, and analyzed changes in S3D-film quality over the years. This work presents real examples of common artifacts (color and sharpness mismatch, vertical disparity and excessive horizontal disparity) in the motion pictures we processed, as well as possible solutions for each problem. Our results enable improved quality assessment during the filming and postproduction stages.
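
    One of the simplest checks mentioned above, color mismatch between the two views, can be illustrated with a crude per-channel comparison of frame statistics; the metric and threshold below are illustrative assumptions, far simpler than the detectors developed in the paper.

```python
import numpy as np

def color_mismatch(left, right, threshold=5.0):
    """Crude per-channel color-mismatch check between the two views of a
    stereo frame: compares channel means and flags the frame if any channel
    differs by more than `threshold` grey levels. Illustrative only."""
    diff = np.abs(left.reshape(-1, left.shape[-1]).mean(axis=0)
                  - right.reshape(-1, right.shape[-1]).mean(axis=0))
    return diff, bool(np.any(diff > threshold))

# toy frames: the right view is 8 grey levels brighter in the red channel
rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(480, 640, 3)).astype(float)
right = left.copy()
right[..., 0] += 8.0
print(color_mismatch(left, right))      # -> (array([8., 0., 0.]), True)
```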

  17. High-resolution continuous flow analysis setup for water isotopic measurement from ice cores using laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Emanuelsson, B. D.; Baisden, W. T.; Bertler, N. A. N.; Keller, E. D.; Gkinis, V.

    2014-12-01

    Here we present an experimental setup for continuous flow measurements of water stable isotopes (δ18O and δD). It is the first continuous flow laser spectroscopy system that uses Off-Axis Integrated Cavity Output Spectroscopy (OA-ICOS; analyzer manufactured by Los Gatos Research - LGR) in combination with an evaporation unit to continuously analyze sample from an ice core. A Water Vapor Isotopic Standard Source (WVISS) calibration unit, manufactured by LGR, was modified to (1) increase the temporal resolution by reducing the response time, (2) enable measurements on several water standards, and (3) reduce the influence of memory effects. While this setup was designed for the Continuous Flow Analysis (CFA) of ice cores, it can also continuously analyze other liquid or vapor sources. The modified setup provides a shorter response time (~54 and 18 s for the 2013 and 2014 setups, respectively) compared to the original WVISS unit (~62 s), which is an improvement in measurement resolution. Another improvement compared to the original WVISS is that the modified setup has a reduced memory effect. Stability tests comparing the modified WVISS and WVISS setups were performed and Allan deviations (σAllan) were calculated to determine precision at different averaging times. For the 2013 modified setup the precision after integration times of 10^3 s is 0.060 and 0.070‰ for δ18O and δD, respectively. For the WVISS setup the corresponding σAllan values are 0.030, 0.060 and 0.043‰ for δ18O, δD and δ17O, respectively. For the WVISS setup the precision is 0.035, 0.070 and 0.042‰ after 10^3 s for δ18O, δD and δ17O, respectively. Both the modified setups and the WVISS setup are influenced by instrumental drift, with δ18O being more drift-sensitive than δD. The σAllan values for δ18O are 0.30 and 0.18‰ for the modified (2013) and WVISS setups, respectively, after averaging times of 10^4 s (2.78 h). The Isotopic Water Analyzer (IWA)-modified WVISS setup used during the 2013 Roosevelt Island Climate Evolution (RICE) ice core processing campaign achieved high-precision measurements, in particular for δD, with high temporal resolution for the upper part of the core, where a seasonally resolved isotopic signal is preserved.
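
    The Allan deviations quoted above quantify precision as a function of averaging time; a minimal non-overlapping estimator is sketched below, assuming an evenly sampled series, purely to illustrate the metric rather than the authors' processing chain.

```python
import numpy as np

def allan_deviation(y, dt, tau):
    """Non-overlapping Allan deviation of an evenly sampled series y
    (sample interval dt) at averaging time tau. Simple estimator used
    only to illustrate the precision figures quoted above."""
    m = int(round(tau / dt))                  # samples per averaging bin
    n_bins = len(y) // m
    if n_bins < 2:
        raise ValueError("series too short for this averaging time")
    bins = y[: n_bins * m].reshape(n_bins, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(bins) ** 2))

# example: white noise of 0.1 permil std, 1 s sampling, 10^3 s averaging
rng = np.random.default_rng(0)
d18O = rng.normal(0.0, 0.1, size=100_000)
print(allan_deviation(d18O, dt=1.0, tau=1e3))   # ~0.1/sqrt(1000) ≈ 0.003 permil
```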

  18. Rapid Classification of Landsat TM Imagery for Phase 1 Stratification Using the Automated NDVI Threshold Supervised Classification (ANTSC) Methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2005-01-01

    FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....

  19. C3I Rapid Prototype Investigation.

    DTIC Science & Technology

    1986-01-01

    feasibility of applying rapid prototyping techniques to Air Force C3I system developments. This report presents the technical progress during the...computer functions. The cost to use each in terms of hardware, software, analysis, and needed further developments was assessed. Prototyping approaches were...acquirer, and developer are the basis for problems in C3I system developments. These problems destabilize the requirements determination process

  20. Teaching Problem-Solving Skills to Nuclear Engineering Students

    ERIC Educational Resources Information Center

    Waller, E.; Kaye, M. H.

    2012-01-01

    Problem solving is an essential skill for nuclear engineering graduates entering the workforce. Training in qualitative and quantitative aspects of problem solving allows students to conceptualise and execute solutions to complex problems. Solutions to problems in high consequence fields of study such as nuclear engineering require rapid and…

  1. Complete analytical solution of electromagnetic field problem of high-speed spinning ball

    NASA Astrophysics Data System (ADS)

    Reichert, T.; Nussbaumer, T.; Kolar, J. W.

    2012-11-01

    In this article, a small sphere spinning in a rotating magnetic field is analyzed in terms of the resulting magnetic flux density distribution and the current density distribution inside the ball. From these densities, the motor torque and the eddy current losses can be calculated. An analytical model is derived, and its results are compared to a 3D finite element analysis. The model gives insight into the torque and loss characteristics of a solid rotor induction machine setup, which aims at rotating the sphere beyond 25 Mrpm.

  2. FY17Q4 Ristra project: Release Version 1.0 of a production toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungerford, Aimee L.; Daniel, David John

    2017-09-21

    The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.

  3. Fringe stabilizers and their application to telecommunication device manufacturing

    NASA Astrophysics Data System (ADS)

    Odhner, Jefferson E.

    2000-10-01

    The ability to create stable holographic gratings is an important part of the production of many telecommunication products. The stability problem is increased by the need to use ultra-violet light for close fringe spacing and by long exposure times on photoresist - a relatively low-sensitivity material. Active fringe locking increases the modulation depth and efficiency of these holographic gratings. A discussion of how fringe lockers work and how they can be incorporated into a manufacturing set-up is followed by results of using fringe lockers in the manufacturing of some telecommunication devices.

  4. Perception of power modulation of light in conjunction with acoustic stimulation

    NASA Astrophysics Data System (ADS)

    Hahlweg, Cornelius F.; Weyer, Cornelia; Gercke-Hahn, Harald; Gutzmann, Holger L.; Brahmann, Andre; Rothe, Hendrik

    2013-09-01

    The present paper is derived from an ongoing study on the human perception of combined optical and acoustical periodic stimuli. The work originates from problems in occupational medicine concerning artificial illumination and certain machinery with coherent optical and acoustical emissions, and the effects involved are also interesting in the context of Optics and Music. Because of the difficulties in evaluating the physical and psychological effects of such coherent stimuli, in a first step we asked whether such coherence is perceivable at all. The concept, experimental set-up and first results are discussed briefly.

  5. Pitfalls in PCR troubleshooting: Expect the unexpected?

    PubMed Central

    Schrick, Livia; Nitsche, Andreas

    2015-01-01

    PCR is a well-understood and established laboratory technique often used in molecular diagnostics. Considerable experience has been accumulated over recent years regarding the design of PCR assays and their set-up, including in-depth troubleshooting to obtain the optimal PCR assay for each purpose. Here we report a PCR troubleshooting case that produced a surprising result never observed before. With this report we hope to sensitize the reader to this peculiar problem and to save troubleshooting effort in similar situations, especially in time-critical and ambitious diagnostic settings. PMID:27077041

  6. The detection error of thermal test low-frequency cable based on M sequence correlation algorithm

    NASA Astrophysics Data System (ADS)

    Wu, Dongliang; Ge, Zheyang; Tong, Xin; Du, Chunlin

    2018-04-01

    The problem of low accuracy and low efficiency in off-line detection of faults in thermal-test low-frequency cables can be solved by designing a cable fault detection system that uses an FPGA-generated M-sequence code (linear feedback shift register sequence) as the pulse signal source. The design principle of the SSTDR (spread spectrum time-domain reflectometry) reflection method and the hardware setup for on-line monitoring are discussed in this paper. Test data show that the detection error increases with the fault distance along the thermal-test low-frequency cable.
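
    The measurement principle can be sketched as follows: an M-sequence generated by a linear feedback shift register is injected into the cable, and the lag of the cross-correlation peak between the injected and reflected signals gives the round-trip delay and hence the fault distance. The tap positions, signal model and propagation velocity below are illustrative assumptions, not the parameters of the system described.

```python
import numpy as np

def m_sequence(taps=(7, 6), length=127, seed=1):
    """Generate a +/-1 M-sequence from a 7-bit linear feedback shift register.
    The tap positions are one common choice for a maximal-length sequence,
    given here only for illustration."""
    state = [(seed >> i) & 1 for i in range(7)]
    out = []
    for _ in range(length):
        out.append(state[-1])
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(out) * 2 - 1

def locate_fault(probe, received, dt, velocity):
    """SSTDR-style fault location: the lag of the cross-correlation peak
    between the injected M-sequence and the reflected signal gives the
    round-trip delay, hence the one-way distance to the fault."""
    corr = np.correlate(received, probe, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(probe) - 1)
    return 0.5 * lag * dt * velocity            # distance in metres

# toy example: echo of the probe delayed by 40 samples, 1 ns sampling,
# propagation velocity 2e8 m/s -> fault at 0.5 * 40e-9 * 2e8 = 4 m
probe = m_sequence()
received = np.zeros(300)
received[40:40 + len(probe)] += 0.3 * probe
print(locate_fault(probe, received, dt=1e-9, velocity=2e8))   # -> ~4.0
```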

  7. Thermotropic Liquid Crystal-Assisted Chemical and Biological Sensors

    PubMed Central

    Honaker, Lawrence W.; Usol’tseva, Nadezhda; Mann, Elizabeth K.

    2017-01-01

    In this review article, we analyze recent progress in the application of liquid crystal-assisted advanced functional materials for sensing biological and chemical analytes. Multiple research groups demonstrate substantial interest in liquid crystal (LC) sensing platforms, generating an increasing number of scientific articles. We review trends in implementing LC sensing techniques and identify common problems related to the stability and reliability of the sensing materials as well as to experimental set-ups. Finally, we suggest possible means of bridging scientific findings to viable and attractive LC sensor platforms. PMID:29295530

  8. Piezoelectric and optical setup to measure an electrical field: application to the longitudinal near-field generated by a tapered coax.

    PubMed

    Euphrasie, S; Vairac, P; Cretin, B; Lengaigne, G

    2008-03-01

    We propose a new setup to measure an electrical field in one direction. This setup is made of a piezoelectric sintered lead zirconate titanate film and an optical interferometric probe. We used this setup to investigate how the shape of the extremity of a coaxial cable influences the longitudinal electrical near-field generated by it. For this application, we designed our setup to have a spatial resolution of 100 μm in the direction of the electrical field. Simulations and experiments are presented.

  9. Assessing water resources in Azerbaijan using a local distributed model forced and constrained with global data

    NASA Astrophysics Data System (ADS)

    Bouaziz, Laurène; Hegnauer, Mark; Schellekens, Jaap; Sperna Weiland, Frederiek; ten Velden, Corine

    2017-04-01

    In many countries, data is scarce, incomplete and often not easily shared. In these cases, global satellite and reanalysis data provide an alternative to assess water resources. To assess water resources in Azerbaijan, a completely distributed and physically based hydrological wflow-sbm model was set up for the entire Kura basin. We used SRTM elevation data, a locally available river map and one from OpenStreetMap to derive the drainage direction network at the model resolution of approximately 1x1 km. OpenStreetMap data was also used to derive the fraction of paved area per cell to account for the reduced infiltration capacity (cf. Schellekens et al., 2014). We used the results of a global study to derive root zone capacity based on climate data (Wang-Erlandsson et al., 2016). To account for the variation in vegetation cover over the year, monthly averages of Leaf Area Index, based on MODIS data, were used. For the soil-related parameters, we used global estimates as provided by Dai et al. (2013). This enabled the rapid derivation of a first estimate of parameter values for our hydrological model. Digitized local meteorological observations were scarce and available only for a limited time period. Therefore several sources of global meteorological data were evaluated: (1) EU-WATCH global precipitation, temperature and derived potential evaporation for the period 1958-2001 (Harding et al., 2011), (2) WFDEI precipitation, temperature and derived potential evaporation for the period 1979-2014 (Weedon et al., 2014), (3) MSWEP precipitation (Beck et al., 2016) and (4) local precipitation data from more than 200 stations in the Kura basin, available from the NOAA website for a period up to 1991. The latter, together with data archives from Azerbaijan, were used as a benchmark to evaluate the global precipitation datasets for the overlapping period 1958-1991. By comparing the datasets, we found that monthly mean precipitation of EU-WATCH and WFDEI coincided well with the NOAA stations and that MSWEP slightly overestimated precipitation amounts. On a daily basis, there were discrepancies in the peak timing and magnitude between measured precipitation and the global products. A bias between EU-WATCH and WFDEI temperature and potential evaporation was observed, and to model the water balance correctly it was necessary to correct EU-WATCH to the WFDEI mean monthly values. Overall, the available sources enabled the rapid set-up and forcing of a hydrological model with relatively good performance for assessing water resources in Azerbaijan with a limited calibration effort, and they allow for a similar set-up anywhere in the world. Timing and quantification of peak volume remain a weakness of the global data, making them difficult to use for some applications (flooding) and for detailed calibration. Selecting and comparing different sources of global meteorological data is important for obtaining a reliable set that improves model performance. - Beck et al., 2016. MSWEP: 3-hourly 0.25° global gridded precipitation (1979-2014) by merging gauge, satellite, and reanalysis data. Hydrol. Earth Syst. Sci. Discuss. - Dai, Y. et al., 2013. Development of a China Dataset of Soil Hydraulic Parameters Using Pedotransfer Functions for Land Surface Modeling. Journal of Hydrometeorology - Harding, R. et al., 2011. WATCH: Current knowledge of the Terrestrial global water cycle, J. Hydrometeorol. - Schellekens, J. et al., 2014. Rapid setup of hydrological and hydraulic models using OpenStreetMap and the SRTM derived digital elevation model. Environmental Modelling & Software - Wang-Erlandsson, L. et al., 2016. Global Root Zone Storage Capacity from Satellite-Based Evaporation. Hydrology and Earth System Sciences - Weedon, G. et al., 2014. The WFDEI meteorological forcing data set: WATCH Forcing Data methodology applied to ERA-Interim reanalysis data, Water Resources Research.

  10. Stochastic formulation of patient positioning using linac-mounted cone beam imaging with prior knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoegele, W.; Loeschel, R.; Dobler, B.

    2011-02-15

    Purpose: In this work, a novel stochastic framework for patient positioning based on linac-mounted CB projections is introduced. Based on this formulation, the most probable shifts and rotations of the patient are estimated, incorporating interfractional deformations of patient anatomy and other uncertainties associated with patient setup. Methods: The target position is assumed to be defined by and is stochastically determined from positions of various features such as anatomical landmarks or markers in CB projections, i.e., radiographs acquired with a CB-CT system. The patient positioning problem of finding the target location from CB projections is posed as an inverse problem with prior knowledge and is solved using a Bayesian maximum a posteriori (MAP) approach. The prior knowledge is three-fold and includes the accuracy of an initial patient setup (such as in-room laser and skin marks), the plasticity of the body (relative shifts between target and features), and the feature detection error in CB projections (which may vary depending on specific detection algorithm and feature type). For this purpose, MAP estimators are derived and a procedure of using them in clinical practice is outlined. Furthermore, a rule of thumb is theoretically derived, relating basic parameters of the prior knowledge (initial setup accuracy, plasticity of the body, and number of features) and the parameters of CB data acquisition (number of projections and accuracy of feature detection) to the expected estimation accuracy. Results: MAP estimation can be applied to arbitrary features and detection algorithms. However, to experimentally demonstrate its applicability and to perform the validation of the algorithm, a water-equivalent, deformable phantom with features represented by six 1 mm chrome balls was utilized. These features were detected in the cone beam projections (XVI, Elekta Synergy) by a local threshold method for demonstration purposes only. The accuracy of estimation (strongly varying for different plasticity parameters of the body) agreed with the rule of thumb formula. Moreover, based on this rule of thumb formula, about 20 projections for 6 detectable features seem to be sufficient for a target estimation accuracy of 0.2 cm, even for relatively large feature detection errors with standard deviation of 0.5 cm and spatial displacements of the features with standard deviation of 0.5 cm. Conclusions: The authors have introduced a general MAP-based patient setup algorithm accounting for different sources of uncertainties, which are utilized as the prior knowledge in a transparent way. This new framework can be further utilized for different clinical sites, as well as theoretical developments in the field of patient positioning for radiotherapy.
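
    The flavour of the MAP estimation can be conveyed with a heavily simplified one-dimensional Gaussian sketch that fuses feature-derived shift measurements with the initial-setup prior by precision weighting; it ignores rotations, feature correlations and the separate plasticity term of the full framework, and the numbers are illustrative only.

```python
import numpy as np

def map_shift_1d(measured_shifts, sigma_detect, sigma_prior, prior_mean=0.0):
    """MAP estimate of a 1-D couch shift from several feature-derived shift
    measurements, assuming independent Gaussian errors. The prior encodes the
    initial (laser/skin-mark) setup accuracy; sigma_detect lumps together the
    per-feature detection and displacement uncertainty. Heavily simplified
    relative to the full estimator in the paper."""
    measured_shifts = np.asarray(measured_shifts, dtype=float)
    w_meas = len(measured_shifts) / sigma_detect**2     # total measurement precision
    w_prior = 1.0 / sigma_prior**2                      # prior precision
    return (w_meas * measured_shifts.mean() + w_prior * prior_mean) / (w_meas + w_prior)

# 6 features, each with 0.5 cm combined detection/displacement error and
# 0.5 cm initial setup accuracy: the estimate shrinks slightly toward zero
print(map_shift_1d([0.4, 0.2, 0.35, 0.3, 0.25, 0.3],
                   sigma_detect=0.5, sigma_prior=0.5))   # -> ~0.26 cm
```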

  11. Rapid detection of bacterial contamination in cell or tissue cultures based on Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Bolwien, Carsten; Sulz, Gerd; Becker, Sebastian; Thielecke, Hagen; Mertsching, Heike; Koch, Steffen

    2008-02-01

    Monitoring the sterility of cell or tissue cultures is an essential task, particularly in the fields of regenerative medicine and tissue engineering when implanting cells into the human body. We present a system based on a commercially available microscope equipped with a microfluidic cell that prepares the particles found in the solution for analysis, a Raman-spectrometer attachment optimized for non-destructive, rapid recording of Raman spectra, and a data acquisition and analysis tool for identification of the particles. In contrast to conventional sterility testing in which samples are incubated over weeks, our system is able to analyze milliliters of supernatant or cell suspension within hours by filtering relevant particles and placing them on a Raman-friendly substrate in the microfluidic cell. Critical particles are first identified via microscopic imaging and subsequent image analysis, and micro-Raman analysis of those particles is then carried out with an excitation wavelength of 785 nm. The potential of this setup is demonstrated by results of artificial contamination of samples with a pool of bacteria, fungi, and spores: single-channel spectra of the critical particles are automatically baseline-corrected without using background data and classified via hierarchical cluster analysis, showing great promise for accurate and rapid detection and identification of contaminants.
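
    The classification step, hierarchical cluster analysis of the spectra, can be illustrated with SciPy on synthetic spectra; the normalization, linkage choice and toy peaks below are assumptions made for the example and do not reproduce the baseline correction or spectral preprocessing used in the study.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# synthetic "spectra": two groups with peaks at different wavenumber channels
rng = np.random.default_rng(1)
x = np.arange(600)
group_a = [np.exp(-(x - 200)**2 / 50.0) + 0.02 * rng.standard_normal(600) for _ in range(5)]
group_b = [np.exp(-(x - 400)**2 / 50.0) + 0.02 * rng.standard_normal(600) for _ in range(5)]
spectra = np.vstack(group_a + group_b)

# normalize each spectrum, then cluster with Ward linkage
spectra = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)
Z = linkage(spectra, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)     # the two groups separate into two clusters, e.g. [1 1 1 1 1 2 2 2 2 2]
```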

  12. Flash (Ultra-Rapid) Spark-Plasma Sintering of Silicon Carbide

    PubMed Central

    Olevsky, Eugene A.; Rolfing, Stephen M.; Maximenko, Andrey L.

    2016-01-01

    A new ultra-rapid process of flash spark plasma sintering is developed. The idea of flash spark plasma sintering (or flash hot pressing - FHP) stems from the conducted theoretical analysis of the role of thermal runaway phenomena for material processing by flash sintering. The major purpose of the present study is to theoretically analyze the thermal runaway nature of flash sintering and to experimentally address the challenge of uncontrollable thermal conditions by the stabilization of the flash sintering process through the application of the external pressure. The effectiveness of the developed FHP technique is demonstrated by the few seconds–long consolidation of SiC powder in an industrial spark plasma sintering device. Specially designed sacrificial dies heat the pre-compacted SiC powder specimens to a critical temperature before applying any voltage to the powder volume and allowing the electrode-punches of the SPS device setup to contact the specimens and pass electric current through them under elevated temperatures. The experimental results demonstrate that flash sintering phenomena can be realized using conventional SPS devices. The usage of hybrid heating SPS devices is pointed out as the mainstream direction for the future studies and utilization of the new flash hot pressing (ultra-rapid spark plasma sintering) technique. PMID:27624641

  13. Flash (Ultra-Rapid) Spark-Plasma Sintering of Silicon Carbide

    DOE PAGES

    Olevsky, Eugene A.; Rolfing, Stephen M.; Maximenko, Andrey L.

    2016-09-14

    A new ultra-rapid process of flash spark plasma sintering is developed. The idea of flash spark plasma sintering (or flash hot pressing - FHP) stems from the conducted theoretical analysis of the role of thermal runaway phenomena for material processing by flash sintering. The major purpose of the present study is to theoretically analyze the thermal runaway nature of flash sintering and to experimentally address the challenge of uncontrollable thermal conditions by the stabilization of the flash sintering process through the application of the external pressure. The effectiveness of the developed FHP technique is demonstrated by the few seconds–long consolidation of SiC powder in an industrial spark plasma sintering device. Specially designed sacrificial dies heat the pre-compacted SiC powder specimens to a critical temperature before applying any voltage to the powder volume and allowing the electrode-punches of the SPS device setup to contact the specimens and pass electric current through them under elevated temperatures. The experimental results demonstrate that flash sintering phenomena can be realized using conventional SPS devices. The usage of hybrid heating SPS devices is pointed out as the mainstream direction for the future studies and utilization of the new flash hot pressing (ultra-rapid spark plasma sintering) technique.

  14. Attosecond transient absorption instrumentation for thin film materials: Phase transitions, heat dissipation, signal stabilization, timing correction, and rapid sample rotation.

    PubMed

    Jager, Marieke F; Ott, Christian; Kaplan, Christopher J; Kraus, Peter M; Neumark, Daniel M; Leone, Stephen R

    2018-01-01

    We present an extreme ultraviolet (XUV) transient absorption apparatus tailored to attosecond and femtosecond measurements on bulk solid-state thin-film samples, specifically when the sample dynamics are sensitive to heating effects. The setup combines methodology for stabilizing sub-femtosecond time-resolution measurements over 48 h and techniques for mitigating heat buildup in temperature-dependent samples. Single-point beam stabilization in pump and probe arms and periodic time-zero reference measurements are described for accurate timing and stabilization. A hollow-shaft motor configuration for rapid sample rotation, raster scanning capability, and additional diagnostics are described for heat mitigation. Heat transfer simulations performed using a finite element analysis allow comparison of sample rotation and traditional raster scanning techniques for 100 Hz pulsed laser measurements on vanadium dioxide, a material that undergoes an insulator-to-metal transition at a modest temperature of 340 K. Experimental results are presented confirming that the vanadium dioxide (VO 2 ) sample cannot cool below its phase transition temperature between laser pulses without rapid rotation, in agreement with the simulations. The findings indicate the stringent conditions required to perform rigorous broadband XUV time-resolved absorption measurements on bulk solid-state samples, particularly those with temperature sensitivity, and elucidate a clear methodology to perform them.

  15. Investigation of droplet nucleation in CCS relevant systems - design and testing of the expansion chamber

    NASA Astrophysics Data System (ADS)

    Čenský, Miroslav; Hrubý, Jan; Vinš, Václav; Hykl, Jiří; Šmíd, Bohuslav

    2018-06-01

    A unique in-house designed experimental apparatus for investigation of nucleation of droplets in CCS relevant systems is being developed by the present team. The apparatus allows simulating various processes relevant to CCS technologies. Gaseous mixtures with CO2 are prepared in a Mixture Preparation Device (MPD) based on accurate adjustment of flow rates of individual components [EPJ Web of Conferences 143, 02140 (2017)]. The mixture then flows into an expansion chamber, where it undergoes a rapid adiabatic expansion. As a consequence of adiabatic cooling, the mixture becomes supersaturated and nucleation and simultaneous growth of droplets occurs. In this study, we describe the design and testing of the expansion part of the experimental setup. The rapid expansion was realized using two valve systems, one for low pressures (up to 0.7 MPa) and the other for high pressures (up to 10 MPa). A challenge for a proper design of the expansion system is avoiding acoustic oscillations. These can occur either in the mode of Helmholtz resonator, where the compressible gas in the chamber acts as a spring and the rapidly moving gas in the valve system as a mass, or in the "flute" mode, where acoustic waves are generated in a long outlet tubing.

  16. Attosecond transient absorption instrumentation for thin film materials: Phase transitions, heat dissipation, signal stabilization, timing correction, and rapid sample rotation

    NASA Astrophysics Data System (ADS)

    Jager, Marieke F.; Ott, Christian; Kaplan, Christopher J.; Kraus, Peter M.; Neumark, Daniel M.; Leone, Stephen R.

    2018-01-01

    We present an extreme ultraviolet (XUV) transient absorption apparatus tailored to attosecond and femtosecond measurements on bulk solid-state thin-film samples, specifically when the sample dynamics are sensitive to heating effects. The setup combines methodology for stabilizing sub-femtosecond time-resolution measurements over 48 h and techniques for mitigating heat buildup in temperature-dependent samples. Single-point beam stabilization in pump and probe arms and periodic time-zero reference measurements are described for accurate timing and stabilization. A hollow-shaft motor configuration for rapid sample rotation, raster scanning capability, and additional diagnostics are described for heat mitigation. Heat transfer simulations performed using a finite element analysis allow comparison of sample rotation and traditional raster scanning techniques for 100 Hz pulsed laser measurements on vanadium dioxide, a material that undergoes an insulator-to-metal transition at a modest temperature of 340 K. Experimental results are presented confirming that the vanadium dioxide (VO2) sample cannot cool below its phase transition temperature between laser pulses without rapid rotation, in agreement with the simulations. The findings indicate the stringent conditions required to perform rigorous broadband XUV time-resolved absorption measurements on bulk solid-state samples, particularly those with temperature sensitivity, and elucidate a clear methodology to perform them.

  17. Rapid Prototyping for In Vitro Knee Rig Investigations of Prosthetized Knee Biomechanics: Comparison with Cobalt-Chromium Alloy Implant Material

    PubMed Central

    Schröder, Christian; Steinbrück, Arnd; Müller, Tatjana; Woiczinski, Matthias; Chevalier, Yan; Müller, Peter E.; Jansson, Volkmar

    2015-01-01

    Retropatellar complications after total knee arthroplasty (TKA) such as anterior knee pain and subluxations might be related to altered patellofemoral biomechanics, in particular to trochlear design and femorotibial joint positioning. A method was developed to test femorotibial and patellofemoral joint modifications separately with 3D-rapid prototyped components for in vitro tests, but material differences may further influence results. This pilot study aims at validating the use of prostheses made of photopolymerized rapid prototype material (RPM) by measuring the sliding friction with a ring-on-disc setup as well as knee kinematics and retropatellar pressure on a knee rig. Cobalt-chromium alloy (standard prosthesis material, SPM) prostheses served as validation standard. Friction coefficients between these materials and polytetrafluoroethylene (PTFE) were additionally tested as this latter material is commonly used to protect pressure sensors in experiments. No statistical differences were found between friction coefficients of both materials to PTFE. UHMWPE shows higher friction coefficient at low axial loads for RPM, a difference that disappears at higher load. No measurable statistical differences were found in knee kinematics and retropatellar pressure distribution. This suggests that using polymer prototypes may be a valid alternative to original components for in vitro TKA studies and future investigations on knee biomechanics. PMID:25879019

  18. Development of Data Acquisition Set-up for Steady-state Experiments

    NASA Astrophysics Data System (ADS)

    Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    For short-duration experiments, digitized data is generally transferred for processing and storage after the experiment, whereas in a steady-state experiment the data is acquired, processed, displayed and stored continuously in a pipelined manner. This requires acquiring data through special techniques for storage and viewing the data on the go to display the current trends of various physical parameters. A small data acquisition set-up has been developed for continuously acquiring signals for various physical parameters at different sampling rates during long-duration experiments. This includes the hardware set-up for signal digitization, a Field Programmable Gate Array (FPGA)-based timing system for clock synchronization and event/trigger distribution, time slicing of data streams for storage of data chunks to enable viewing of data during acquisition, channel profile display through down-sampling, etc. In order to store a long data stream of indefinite duration, the data stream is divided into data slices/chunks of user-defined time duration. Data chunks avoid the problem of server data being inaccessible until the channel data file is closed at the end of the long-duration experiment. A graphical user interface has been developed in the LabVIEW application development environment for configuring the data acquisition hardware and storing data chunks on the local machine as well as at a remote data server through Python for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for a steady-state experiment.
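
    The time-slicing idea can be sketched in Python: the stream is written as fixed-duration chunk files so that each completed chunk is immediately available to viewers and the remote data server while acquisition continues. The file naming, CSV format and the acquisition stub below are placeholders; the actual system uses FPGA-timed hardware, LabVIEW and Python utilities.

```python
def acquire_block(n_samples):
    """Placeholder for the hardware read; returns one block of samples."""
    return [0.0] * n_samples

def stream_to_chunks(sample_rate_hz, chunk_seconds, block_seconds=1.0,
                     prefix="channel01"):
    """Continuously acquire data and write it as fixed-duration chunk files,
    so that completed chunks can be read by viewers or a server while the
    experiment is still running. Illustrative sketch of the time-slicing
    idea only."""
    samples_per_block = int(sample_rate_hz * block_seconds)
    blocks_per_chunk = int(chunk_seconds / block_seconds)
    chunk_index = 0
    while True:                                   # runs for the whole experiment
        fname = f"{prefix}_chunk{chunk_index:06d}.csv"
        with open(fname, "w") as f:
            for _ in range(blocks_per_chunk):
                block = acquire_block(samples_per_block)
                f.write(",".join(f"{v:.6g}" for v in block) + "\n")
        chunk_index += 1                          # the closed file is now safe to read

# e.g. stream_to_chunks(sample_rate_hz=1000, chunk_seconds=60)
```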

  19. Kilovoltage cone-beam CT imaging dose during breast radiotherapy: A dose comparison between a left and right breast setup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Alexandra, E-mail: Alexandra.quinn@health.nsw.gov.au; Centre for Medical Radiation Physics, University of Wollongong, NSW; Liverpool and Macarthur Cancer Therapy Centres, NSW

    2014-07-01

    The purpose of this study was to investigate the delivered dose from a kilovoltage cone-beam computed tomography (kV-CBCT) acquired in breast treatment position for a left and right breast setup. The dose was measured with thermoluminescent dosimeters positioned within a female anthropomorphic phantom at organ locations. Imaging was performed on an Elekta Synergy XVI system with the phantom setup on a breast board. The image protocol involved 120 kVp, 140 mAs, and a 270° arc rotation clockwise 0° to 270° for the left breast setup and 270° to 180° for the right breast setup (maximum arc rotations possible). The dose delivered to the left breast, right breast, and heart was 5.1 mGy, 3.9 mGy, and 4.0 mGy for the left breast setup kV-CBCT, and 6.4 mGy, 6.0 mGy, and 4.8 mGy for the right breast setup kV-CBCT, respectively. The rotation arc of the kV-CBCT influenced the dose delivered, with the right breast setup kV-CBCT found to deliver a dose of up to 4 mGy or 105% higher to the treated breast's surface in comparison with the left breast setup. This is attributed to the kV-CBCT source being more proximal to the anterior of the phantom for a right breast setup, whereas the source is more proximal to the posterior of the patient for a left-side scan.

  20. Poster - 33: Dosimetry Comparison of Prone Breast Forward and Inverse Treatment planning considering daily setup variations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Runqing; Zhan, Lixin; Osei, Ernest

    2016-08-15

Introduction: The purpose of this study is to investigate the effects of daily setup variations on prone breast forward field-in-field (FinF) and inverse IMRT treatment planning. Methods: A Rando phantom (left breast) and a Pixy phantom (right breast) were built and CT scanned in the prone position. Treatment planning (TP) was performed in the Eclipse TP system. A forward FinF plan and an inverse IMRT plan were created to satisfy the CTV coverage and OAR criteria. The daily setup variations were assumed to be 5 mm in the left-right, superior-inferior, and anterior-posterior directions. The DVHs of CTV coverage and OARs were compared for both the forward FinF plan and the inverse IMRT plan under the 5 mm setup variation. Results and Discussion: The DVHs of CTV coverage showed only small variations under the 5 mm setup variation for the forward FinF and inverse IMRT plans for both phantoms. However, for setup variations in the left-right direction, the DVH of CTV coverage of the IMRT plan showed the worst variation for both phantoms. For anterior-posterior variation, the CTV could not obtain full coverage when the breast chest wall is shallow; however, with the guidance of MV imaging, the breast chest wall will be checked during the MV imaging setup. Thus the setup variations have more effect on the inverse IMRT plan than on the forward FinF plan, especially in the left-right direction. Conclusions: The forward FinF plan is recommended clinically when daily setup variation is considered.

  1. Impacts of dynamical ocean coupling in MJO experiments using NICAM/NICOCO

    NASA Astrophysics Data System (ADS)

    Miyakawa, T.

    2016-12-01

The cloud-system-resolving atmospheric model NICAM has been successful in producing Madden-Julian Oscillations (MJOs), with its prediction skill estimated to be about 4 weeks in a series of hindcast experiments for winter MJO events during 2003-2012 (Miyakawa et al. 2014). A simple mixed-layer ocean model has been applied with nudging towards a prescribed "persistent anomaly SST", which maintains the initial anomaly with a time-varying climatological seasonal cycle. This setup enables the model to interact with an ocean with reasonably realistic SST, and also to run in a "forecast mode", without using any observational information after the initial date. A limitation is that under this setup, the model skill drops if the oceanic anomaly rapidly changes after the initial date in the real world. Here we run a recently developed, fully 3D-ocean coupled version, NICAM-COCO (NICOCO), and explore its impact on MJO simulations. Dynamical ocean models can produce oceanic waves/currents, but will also have biases and drift away from reality. In a sub-seasonal simulation (an initial-value problem), it is essential to weigh the merit of better-represented oceanic signals against the demerit of bias/drift. A test-case simulation series featuring an MJO that triggered the abrupt termination of a major El Nino in 1998 shows that the abrupt termination occurs in all 9 simulation members, highlighting the merit of ocean coupling. However, this is a case where oceanic signals are at their extremes. We carried out an estimation of MJO prediction skill for a preliminary 1-degree-mesh ocean version of NICOCO in a similar manner to Miyakawa et al. (2014). The MJO skill was degraded for simulations that were initialized at RMM phases 1 and 2 (corresponding to the Indian Ocean), while those initialized at phase 8 (Africa) were not strongly affected. The tendency of the model ocean to overestimate the Maritime Continent warm pool SST possibly delays the eastward propagation of the MJO convective envelope, accounting for the degraded prediction skill (phases 1 and 2). Reference: Madden-Julian Oscillation prediction skill of a new-generation global model demonstrated using a supercomputer. Miyakawa, T., M. Satoh, H. Miura, H. Tomita, H. Yashiro, A. T. Noda, Y. Yamada, C. Kodama, M. Kimoto & K. Yoneyama. Nature Comm. 5, 3769, doi:10.1038/ncomms4769.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ribezzi-Crivellari, M.; Huguet, J. M.; Ritort, F.

We present a dual-trap optical tweezers setup which directly measures forces using linear momentum conservation. The setup uses a counter-propagating geometry, which allows momentum measurement on each beam separately. The experimental advantages of this setup include low drift due to all-optical manipulation, and a robust calibration (independent of the features of the trapped object or buffer medium) due to the force measurement method. Although this design does not attain the high resolution of some co-propagating setups, we show that it can be used to perform different single-molecule measurements: fluctuation-based molecular stiffness characterization at different forces and hopping experiments on molecular hairpins. Remarkably, in our setup it is possible to manipulate very short tethers (such as molecular hairpins with short handles) down to the limit where beads are almost in contact. The setup is used to illustrate a novel method for measuring the stiffness of optical traps and tethers on the basis of equilibrium force fluctuations, i.e., without the need of measuring the force vs molecular extension curve. This method is of general interest for dual-trap optical tweezers setups and can be extended to setups which do not directly measure forces.

  3. Rapid assessment procedures in injury control.

    PubMed

    Klevens, Joanne; Anderson, Mark

    2004-03-01

    Injuries are among the leading causes of death and disability worldwide. The burden caused by injuries is even greater among the poorer nations and is projected to increase. Very often the lack of technical and financial resources, as well as the urgency of the problem, preclude applying sophisticated surveillance and research methods for generating relevant information to develop effective interventions. In these settings, it is necessary to consider more rapid and less costly methods in applying the public health approach to the problem of injury prevention and control. Rapid Assessment Procedures (RAP), developed within the fields of epidemiology, anthropology and health administration, can provide valid information in a manner that is quicker, simpler, and less costly than standard data collection methods. RAP have been applied widely and successfully to infectious and chronic disease issues, but have not been used extensively, if at all, as tools in injury control. This paper describes Rapid Assessment Procedures that (1) are useful for understanding the scope of the problem and for identifying potential risk factors, (2) can assist practitioners in determining intervention priorities, (3) can provide in-depth knowledge about a specific injury-related problem, and (4) can be used in surveillance systems to monitor outcomes. Finally, the paper describes some of the caveats in using RAP.

  4. AQUATOX Setup Guide

    EPA Pesticide Factsheets

The new Guidance in AQUATOX Setup and Application provides a quick-start guide introducing major model features, as well as a cookbook to guide basic model setup, calibration, and validation.

  5. A novel bench-scale column assay to investigate site-specific nitrification biokinetics in biological rapid sand filters.

    PubMed

    Tatari, K; Smets, B F; Albrechtsen, H-J

    2013-10-15

A bench-scale assay was developed to obtain site-specific nitrification biokinetic information from biological rapid sand filters employed in groundwater treatment. The experimental set-up uses granular material subsampled from a full-scale filter, packed in a column, and operated with controlled and continuous hydraulic and ammonium loading. Flow rates and flow recirculation around the column are chosen to mimic full-scale hydrodynamic conditions and to minimize axial gradients. A reference ammonium loading rate is calculated based on the average loading experienced in the active zone of the full-scale filter. Effluent concentrations of ammonium are analyzed when the bench-scale column is subject to the reference loading, from which removal rates are calculated. Subsequently, removal rates above the reference loading are measured by imposing short-term loading variations. A critical loading rate corresponding to the maximum removal rate can then be inferred. The assay was successfully applied to characterize the biokinetic behavior of a test rapid sand filter; removal rates at reference loading matched full-scale observations, while a maximum removal capacity of 6.9 g NH4(+)-N/m(3) packed sand/h could easily be determined at a loading of 7.5 g NH4(+)-N/m(3) packed sand/h. This assay, with conditions reflecting full-scale operation, and in which the biological activity is subject to minimal physical disturbance, provides a simple and fast, yet powerful tool to gain insight into nitrification kinetics in rapid sand filters. Copyright © 2013 Elsevier Ltd. All rights reserved.
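
    For orientation only, the volumetric removal rate reported above can be related to influent/effluent ammonium concentrations, flow rate and packed sand volume as in the hedged sketch below; the numbers are hypothetical and are not data from the study:

    ```python
    def removal_rate(c_in_mg_per_L, c_out_mg_per_L, flow_L_per_h, sand_volume_m3):
        """Volumetric NH4+-N removal rate in g N per m^3 packed sand per hour."""
        removed_mg_per_h = (c_in_mg_per_L - c_out_mg_per_L) * flow_L_per_h
        return removed_mg_per_h / 1000.0 / sand_volume_m3  # mg -> g

    # Hypothetical example: 2 L/h through 0.5 L of packed sand, 1.8 mg/L removed
    print(removal_rate(2.0, 0.2, 2.0, 0.0005), "g NH4+-N / m^3 sand / h")  # 7.2
    ```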

  6. A unified approach to fluid-flow, geomechanical, and seismic modelling

    NASA Astrophysics Data System (ADS)

    Yarushina, Viktoriya; Minakov, Alexander

    2016-04-01

The perturbation of pore pressure can generate seismicity. This is supported by observations from human activities that involve fluid injection into rocks at high pressure (hydraulic fracturing, CO2 storage, geothermal energy production) and by natural examples such as volcanic earthquakes, although the seismic signals that emerge during geotechnical operations are small in both amplitude and duration when compared to their natural counterparts. A possible explanation for the earthquake source mechanism is based on a number of in situ stress measurements suggesting that crustal rocks are close to their plastic yield limit. Hence, a rapid increase of the pore pressure decreases the effective normal stress and thus can trigger seismic shear deformation. At the same time, little attention has been paid to the fact that the perturbation of fluid pressure itself represents an acoustic source. Moreover, non-double-couple source mechanisms are frequently reported from the analysis of microseismicity. A consistent formulation of the source mechanism describing microseismic events should include both a shear and an isotropic component. Thus, improved understanding of the interaction between fluid flow and seismic deformation is needed. With this study we aim to increase the competence in integrating real-time microseismic monitoring with geomechanical modelling such that there is a feedback loop between monitored deformation and stress field modelling. We propose fully integrated seismic, geomechanical and reservoir modelling. Our mathematical formulation is based on a fundamental set of force balance, mass balance, and constitutive poro-elastoplastic equations for two-phase media consisting of a deformable solid rock frame and a viscous fluid. We consider a simplified 1D modelling setup for a consistent acoustic source and wave propagation in poro-elastoplastic media. In this formulation the seismic wave is generated by local changes of the stress field and pore pressure induced by, e.g., fault generation or strain localization. This approach gives a unified framework to characterize microseismicity of both class-I (pressure-induced) and class-II (stress-triggered) events. We consider two modelling setups. In the first setup the event is located within the reservoir and associated with a pressure/stress drop due to fracture initiation. In the second setup we assume that a seismic wave from a distant source hits the reservoir. The unified formulation of poro-elastoplastic deformation allows us to link the macroscopic stresses to local seismic instability.

  7. Concept and set-up of an IR-gas sensor construction kit

    NASA Astrophysics Data System (ADS)

    Sieber, Ingo; Perner, Gernot; Gengenbach, Ulrich

    2015-10-01

The paper presents an approach to a cost-efficient, modularly built non-dispersive infrared gas sensor (NDIR) based on a construction kit. The modularity of the approach offers several advantages. First of all, it allows the performance of the gas sensor to be adapted to individual specifications by choosing suitable modular components. The sensitivity of the sensor, for example, can be altered by selecting a source that emits a favorable wavelength spectrum with respect to the absorption spectrum of the gas to be measured, or by tuning the measuring distance (the ray path inside the medium to be measured). Furthermore, the developed approach is very well suited to teaching. Together with students, a construction kit based on an optical free-space system was developed and partly implemented to be used as a teaching and training aid for bachelor and master students at our institute. The components of the construction kit are interchangeable and freely fixable on a base plate. They are classified into five groups: sources, reflectors, detectors, gas feed, and analysis cell. The choice of source and detector and the positions of the components are fundamental for experimenting with and testing different configurations and beam paths. The reflectors are implemented with an aluminum-coated adhesive foil mounted onto a support structure fabricated by additive manufacturing. This approach allows the reflecting surface geometry to be derived from the optical design tool and the 3D-printing files to be generated by applying related design rules. The rapid fabrication process and the adjustment of the modules on the base plate allow rapid, almost LEGO®-like, experimental assessment of design ideas. The subject of this paper is the modeling, design, and optimization of the reflective optical components, as well as of the optical subsystem. The realization of a sample set-up used as a teaching aid and the optical measurement of the beam path in comparison with the simulation results are shown as well.

  8. ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.

    PubMed

    Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi

    2017-08-01

With the growing interest in advanced image guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as the robot operating system (ROS) in robotics and 3D Slicer in medical image computing, could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between the ROS-based devices and the medical image computing platforms. Performance tests demonstrated that the bridge could stream transforms, strings, points, and images at 30 fps in both directions successfully. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D Slicer and Lego Mindstorms with ROS, as a prototyping and educational platform for IGT research; and the smart tissue autonomous robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of the advanced image-based planning/navigation offered by medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
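
    As a minimal sketch of the ROS side of such a setup (the node and topic names here are hypothetical, and this is not the actual ROS-IGTL-Bridge interface), a rospy node can publish transform messages at 30 fps that a bridge node could then forward to external software:

    ```python
    #!/usr/bin/env python
    # Minimal rospy publisher sketch; the topic name is hypothetical and does not
    # correspond to the real ROS-IGTL-Bridge topics.
    import rospy
    from geometry_msgs.msg import Transform

    def publish_transforms():
        rospy.init_node("tracker_publisher")
        pub = rospy.Publisher("/tracker/transform", Transform, queue_size=10)
        rate = rospy.Rate(30)  # 30 fps, matching the streaming rate reported in the paper
        while not rospy.is_shutdown():
            msg = Transform()
            msg.translation.x = 0.0
            msg.translation.y = 0.0
            msg.translation.z = 100.0  # mm, arbitrary demo value
            msg.rotation.w = 1.0       # identity orientation
            pub.publish(msg)
            rate.sleep()

    if __name__ == "__main__":
        publish_transforms()
    ```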

  9. Experimental setup for investigation of two-phase (water-air) flows in a tube

    NASA Astrophysics Data System (ADS)

    Kazunin, D. V.; Lashkov, V. A.; Mashek, I. Ch.; Khoronzhuk, R. S.

    2018-05-01

A special setup was designed and built at St. Petersburg State University for experimental research into the flow dynamics of air-water mixtures in a pipeline. The test section of the setup allows simulating a wide range of flow regimes of a gas-liquid mixture. The parameters of the experimental setup are given; the initial test results are discussed.

  10. Modifications to the rapid melt/rapid quench and transparent polymer video furnaces for the KC-135

    NASA Technical Reports Server (NTRS)

    Smith, Guy A.; Kosten, Sue E.; Workman, Gary L.

    1990-01-01

Given here is a summary of tasks performed on two furnace systems, the Transparent Polymer Furnace (TPF) and the Rapid Melt/Rapid Quench (RMRQ) furnace, to be used aboard NASA's KC-135. It was determined that major changes were needed for both furnaces to operate according to the scientific investigators' experiment parameters. Discussed here are what the problems were, what was required to solve them, and possible future enhancements. It was determined that the enhancements would be required for the furnaces to perform at their optimal levels. Services provided include hardware and software modifications, Safety Data Package documentation, ground-based testing, transportation to and from Ellington Air Field, operation of the hardware during KC-135 flights, and post-flight data processing.

  11. Pathgroups, a dynamic data structure for genome reconstruction problems.

    PubMed

    Zheng, Chunfang

    2010-07-01

Ancestral gene order reconstruction problems, including the median problem, quartet construction, small phylogeny, guided genome halving and genome aliquoting, are NP-hard. Available heuristics dedicated to each of these problems are computationally costly for even small instances. We present a data structure enabling rapid heuristic solution to all these ancestral genome reconstruction problems. A generic greedy algorithm with look-ahead, based on an automatically generated priority system, suffices for all the problems using this data structure. The efficiency of the algorithm is due to fast updating of the structure during run time and to the simplicity of the priority scheme. We illustrate with the first rapid algorithm for quartet construction and apply this to a set of yeast genomes to corroborate a recent gene sequence-based phylogeny. Availability: http://albuquerque.bioinformatics.uottawa.ca/pathgroup/Quartet.html. Contact: chunfang313@gmail.com. Supplementary data are available at Bioinformatics online.
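
    The priority system itself is not described in the abstract; the sketch below only illustrates the generic shape of a priority-queue-driven greedy solver with a simple look-ahead term. All names and scores are made up, and this is not the Pathgroups algorithm:

    ```python
    import heapq

    def greedy_with_lookahead(moves):
        """Toy greedy loop: each candidate move carries an immediate gain and a
        one-step look-ahead gain, and moves are applied in order of combined
        priority. Illustrates the pattern only, not Pathgroups' priority rules."""
        heap = [(-(gain + lookahead), name) for name, gain, lookahead in moves]
        heapq.heapify(heap)
        applied, total = [], 0.0
        while heap:
            neg_priority, name = heapq.heappop(heap)
            applied.append(name)   # a real solver would also update its data structure here
            total += -neg_priority
        return applied, total

    # Hypothetical candidate moves: (name, immediate gain, look-ahead gain)
    print(greedy_with_lookahead([("join_AB", 3, 1), ("join_CD", 2, 4), ("join_EF", 5, 0)]))
    ```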

  12. Consensus oriented fuzzified decision support for oil spill contingency management.

    PubMed

    Liu, Xin; Wirtz, Kai W

    2006-06-30

    Studies on multi-group multi-criteria decision-making problems for oil spill contingency management are in their infancy. This paper presents a second-order fuzzy comprehensive evaluation (FCE) model to resolve decision-making problems in the area of contingency management after environmental disasters such as oil spills. To assess the performance of different oil combat strategies, second-order FCE allows for the utilization of lexical information, the consideration of ecological and socio-economic criteria and the involvement of a variety of stakeholders. On the other hand, the new approach can be validated by using internal and external checks, which refer to sensitivity tests regarding its internal setups and comparisons with other methods, respectively. Through a case study, the Pallas oil spill in the German Bight in 1998, it is demonstrated that this approach can help decision makers who search for an optimal strategy in multi-thread contingency problems and has a wider application potential in the field of integrated coastal zone management.

  13. Early screening of an infant's visual system

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.; Jorge, Jorge M.

    1999-06-01

It is of the utmost importance to the development of a child's visual system that she perceives clear, focused retinal images. Furthermore, if refractive problems are not corrected in due time, amblyopia may occur. Myopia and hyperopia cause important problems in the future only when they are significantly large, whereas for astigmatism (rather frequent in infants) and anisometropia the problems tend to be more stringent. The early evaluation of the visual status of human infants is thus of critical importance. Photorefraction is a convenient technique for this kind of subject. Essentially, a light beam is delivered into the eyes; it is refracted by the ocular media, strikes the retina (focusing or not), reflects off and is collected by a camera. The photorefraction setup we established, using new technological breakthroughs in the fields of imaging devices, digital image processing and fiber optics, allows a fast, noninvasive evaluation of children's visual status (refractive errors, accommodation, strabismus, ...). Results of the visual screening of a group of at-risk children, descendants of blind or amblyopic individuals, will be presented.

  14. Multipoint to multipoint routing and wavelength assignment in multi-domain optical networks

    NASA Astrophysics Data System (ADS)

    Qin, Panke; Wu, Jingru; Li, Xudong; Tang, Yongli

    2018-01-01

In multipoint-to-multipoint (MP2MP) routing and wavelength assignment (RWA) problems, researchers usually assume the optical network to be a single domain. However, in practice optical networks are developing toward multi-domain and larger-scale deployments. In this context, multi-core shared tree (MST)-based MP2MP RWA introduces new problems, including selecting the optimal multicast domain sequence and deciding in which domains the core nodes should be located. In this letter, we focus on MST-based MP2MP RWA problems in multi-domain optical networks; mixed integer linear programming (MILP) formulations to optimally construct MP2MP multicast trees are presented. A heuristic algorithm based on network virtualization and a weighted clustering algorithm (NV-WCA) is proposed. Simulation results show that, under different traffic patterns, the proposed algorithm achieves significant improvement in network resource occupation and multicast tree setup latency compared with conventional algorithms that were proposed for a single-domain network environment.

  15. User's manual for three dimensional FDTD version D code for scattering from frequency-dependent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

The Penn State Finite Difference Time Domain Electromagnetic Scattering Code, version D, is a 3-D numerical electromagnetic scattering code based upon the finite difference time domain (FDTD) technique. The manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction; description of the FDTD method; operation; resource requirements; version D code capabilities; a brief description of the default scattering geometry; a brief description of each subroutine; a description of the include file; a section briefly discussing Radar Cross Section computations; a section discussing some scattering results; a sample problem setup section; a new problem checklist; references; and figure titles. The FDTD technique models transient electromagnetic scattering and interactions with objects of arbitrary shape and/or material composition. In the FDTD method, Maxwell's curl equations are discretized in time and space, and all derivatives (temporal and spatial) are approximated by central differences.
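
    As a toy illustration of the central-difference update at the heart of the FDTD method (a bare 1D free-space example in Python, unrelated to the Penn State version D code and its frequency-dependent material models):

    ```python
    import numpy as np

    def fdtd_1d(n_cells=200, n_steps=500):
        """Bare-bones 1D FDTD in free space: E and H are staggered in space and
        time and advanced with central differences (normalized units, Courant
        number 0.5), with a soft Gaussian source injected at one cell."""
        ez = np.zeros(n_cells)
        hy = np.zeros(n_cells - 1)
        courant = 0.5
        for step in range(n_steps):
            hy += courant * (ez[1:] - ez[:-1])           # update H from the curl of E
            ez[1:-1] += courant * (hy[1:] - hy[:-1])     # update E from the curl of H
            ez[n_cells // 4] += np.exp(-((step - 30) / 10.0) ** 2)  # soft source
        return ez

    if __name__ == "__main__":
        print(fdtd_1d()[:10])
    ```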

  16. Accuracy of off-line bioluminescence imaging to localize targets in preclinical radiation research.

    PubMed

    Tuli, Richard; Armour, Michael; Surmak, Andrew; Reyes, Juvenal; Iordachita, Iulian; Patterson, Michael; Wong, John

    2013-04-01

In this study, we investigated the accuracy of using off-line bioluminescence imaging (BLI) and tomography (BLT) to guide irradiation of small soft tissue targets on a small animal radiation research platform (SARRP) with on-board cone beam CT (CBCT) capability. A small glass bulb containing BL cells was implanted as a BL source in the abdomen of 11 mouse carcasses. Bioluminescence imaging and tomography were acquired for each carcass. Six carcasses were set up visually without immobilization and 5 were restrained in position with tape. All carcasses were set up in the treatment position on the SARRP, where the centroid position of the bulb on CBCT was taken as "truth". In the 2D visual setup, the carcass was set up by aligning the point of brightest luminescence with the vertical beam axis. In the CBCT-assisted setup, the pose of the carcass on CBCT was aligned with that on the 2D BL image for setup. For both 2D setup methods, the offset of the bulb centroid on CBCT from the vertical beam axis was measured. In the BLT-CBCT fusion method, the 3D torso on BLT and CBCT was registered and the 3D offset of the respective source centroids was calculated. The setup results were independent of the carcass being immobilized or not due to the onset of rigor mortis. The 2D offset of the perceived BL source position from the CBCT bulb position was 2.3 mm ± 1.3 mm. The 3D offset between BLT and CBCT was 1.5 mm ± 0.9 mm. Given the rigidity of the carcasses, the setup results represent the best that can be achieved with off-line 2D BLI and 3D BLT. The setup uncertainty would require the use of an undesirably large margin of 4-5 mm. The results compel the implementation of on-board BLT capability on the SARRP to eliminate setup error and to improve BLT accuracy.

  17. Accuracy of Off-Line Bioluminescence Imaging to Localize Targets in Preclinical Radiation Research

    PubMed Central

    Tuli, Richard; Armour, Michael; Surmak, Andrew; Reyes, Juvenal; Iordachita, Iulian; Patterson, Michael; Wong, John

    2013-01-01

In this study, we investigated the accuracy of using off-line bioluminescence imaging (BLI) and tomography (BLT) to guide irradiation of small soft tissue targets on a small animal radiation research platform (SARRP) with on-board cone beam CT (CBCT) capability. A small glass bulb containing BL cells was implanted as a BL source in the abdomen of 11 mouse carcasses. Bioluminescence imaging and tomography were acquired for each carcass. Six carcasses were set up visually without immobilization and 5 were restrained in position with tape. All carcasses were set up in the treatment position on the SARRP, where the centroid position of the bulb on CBCT was taken as “truth”. In the 2D visual setup, the carcass was set up by aligning the point of brightest luminescence with the vertical beam axis. In the CBCT-assisted setup, the pose of the carcass on CBCT was aligned with that on the 2D BL image for setup. For both 2D setup methods, the offset of the bulb centroid on CBCT from the vertical beam axis was measured. In the BLT-CBCT fusion method, the 3D torso on BLT and CBCT was registered and the 3D offset of the respective source centroids was calculated. The setup results were independent of the carcass being immobilized or not due to the onset of rigor mortis. The 2D offset of the perceived BL source position from the CBCT bulb position was 2.3 mm ± 1.3 mm. The 3D offset between BLT and CBCT was 1.5 mm ± 0.9 mm. Given the rigidity of the carcasses, the setup results represent the best that can be achieved with off-line 2D BLI and 3D BLT. The setup uncertainty would require the use of an undesirably large margin of 4–5 mm. The results compel the implementation of on-board BLT capability on the SARRP to eliminate setup error and to improve BLT accuracy. PMID:23578189

  18. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification has been assessed using a laser tracking device. The accuracy of calibration and of image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and of the patient verification system motion was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.

  19. Lensfree diffractive tomography for the imaging of 3D cell cultures

    NASA Astrophysics Data System (ADS)

    Berdeu, Anthony; Momey, Fabien; Dinten, Jean-Marc; Gidrol, Xavier; Picollet-D'hahan, Nathalie; Allier, Cédric

    2017-02-01

New microscopes are needed to reach the full potential of 3D organoid culture studies by gathering large, quantitative and systematic data over extended periods of time while preserving the integrity of the living sample. In order to reconstruct large volumes while preserving the ability to catch every single cell, we propose new imaging platforms based on lens-free microscopy, a technique that addresses these needs in the context of 2D cell culture, providing label-free and non-phototoxic acquisition of large datasets. We built lens-free diffractive tomography setups performing multi-angle acquisitions of 3D organoid cultures embedded in Matrigel and developed dedicated 3D holographic reconstruction algorithms based on the Fourier diffraction theorem. Nonetheless, holographic setups do not record the phase of the incident wave front, and the biological samples in a Petri dish strongly limit the angular coverage. These limitations introduce numerous artefacts in the sample reconstruction. We developed several methods to overcome them, such as multi-wavelength imaging or iterative phase retrieval. The most promising technique currently developed is based on a regularised inverse problem approach applied directly to the 3D volume to be reconstructed. 3D reconstructions were performed on several complex samples, such as 3D networks or spheroids embedded in capsules, with large reconstructed volumes up to 25 mm3 while still being able to identify single cells. To our knowledge, this is the first time that such an inverse problem approach has been implemented in the context of lens-free diffractive tomography, enabling the reconstruction of large, fully 3D volumes of unstained biological samples.

  20. On the predictivity of pore-scale simulations: Estimating uncertainties with multilevel Monte Carlo

    NASA Astrophysics Data System (ADS)

    Icardi, Matteo; Boccardo, Gianluca; Tempone, Raúl

    2016-09-01

A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can in fact be hindered by many factors, including sample heterogeneity, computational and imaging limitations, model inadequacy and imperfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another "equivalent" sample and setup). The stochastic nature can arise due to the multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A fully automatic workflow is developed in an open-source code [1] that includes rigid-body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers, extrapolation and post-processing techniques. The proposed method can be efficiently used in many porous media applications for problems such as stochastic homogenization/upscaling, propagation of uncertainty from microscopic fluid and rock properties to macro-scale parameters, and robust estimation of the Representative Elementary Volume size for arbitrary physics.
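
    A generic multilevel Monte Carlo estimator is sketched below under the assumption that a level-l sample is a pore-scale quantity (e.g. a permeability value) computed at increasing resolution; this only illustrates the textbook telescoping-sum structure, not the authors' workflow:

    ```python
    import numpy as np

    def mlmc_estimate(sample_level, n_levels, n_samples_per_level):
        """Textbook MLMC: E[P_L] ~= sum_l mean(P_l - P_{l-1}), with level 0 using
        P_{-1} = 0. In a real MLMC run the fine and coarse solves at a given level
        share the same random geometry so that their difference has small variance."""
        estimate = 0.0
        for level, n in zip(range(n_levels), n_samples_per_level):
            diffs = []
            for _ in range(n):
                fine = sample_level(level)
                coarse = sample_level(level - 1) if level > 0 else 0.0
                diffs.append(fine - coarse)
            estimate += np.mean(diffs)
        return estimate

    # Hypothetical sampler: a "permeability" whose discretization bias shrinks with level
    rng = np.random.default_rng(0)
    sampler = lambda level: 1.0 + 2.0 ** -(level + 1) + 0.05 * rng.standard_normal()
    print(mlmc_estimate(sampler, n_levels=4, n_samples_per_level=[400, 100, 25, 6]))
    ```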

  1. Building gene expression profile classifiers with a simple and efficient rejection option in R.

    PubMed

    Benso, Alfredo; Di Carlo, Stefano; Politano, Gianfranco; Savino, Alessandro; Hafeezurrehman, Hafeez

    2011-01-01

The collection of gene expression profiles from DNA microarrays and their analysis with pattern recognition algorithms is a powerful technology applied to several biological problems. Common pattern recognition systems classify samples by assigning them to a set of known classes. However, in a clinical diagnostics setup, novel and unknown classes (new pathologies) may appear, and one must be able to reject those samples that do not fit the trained model. The problem of implementing a rejection option in a multi-class classifier has not been widely addressed in the statistical literature. Gene expression profiles represent a critical case study since they suffer from the curse-of-dimensionality problem, which negatively affects the reliability of both traditional rejection models and more recent approaches such as one-class classifiers. This paper presents a set of empirical decision rules that can be used to implement a rejection option in a set of multi-class classifiers widely used for the analysis of gene expression profiles. In particular, we focus on the classifiers implemented in the R Language and Environment for Statistical Computing (R for short in the remainder of this paper). The main contribution of the proposed rules is their simplicity, which enables easy integration with available data analysis environments. Since tuning the involved parameters is often a complex and delicate task in the definition of a rejection model, in this paper we exploit an evolutionary strategy to automate this process. This allows the final user to maximize the rejection accuracy with minimum manual intervention. This paper shows how simple decision rules can help the application of complex machine learning algorithms in real experimental setups. The proposed approach is almost completely automated and is therefore a good candidate for integration into data analysis flows in labs where the machine learning expertise required to tune traditional classifiers might not be available.
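
    A minimal sketch of a rejection option in the spirit of the abstract (this is not the paper's empirical rules or its R implementation; the threshold value is arbitrary): classify a sample only when the top class probability is convincing, otherwise reject it as not fitting the trained model.

    ```python
    import numpy as np

    def classify_with_rejection(class_probabilities, threshold=0.8):
        """Return the predicted class index, or -1 (reject) when the classifier
        is not confident enough. A toy stand-in for the paper's empirical
        decision rules."""
        class_probabilities = np.asarray(class_probabilities)
        best = class_probabilities.argmax(axis=1)
        confident = class_probabilities.max(axis=1) >= threshold
        return np.where(confident, best, -1)

    # Three samples x three classes of posterior probabilities (made up)
    probs = [[0.95, 0.03, 0.02],   # accepted as class 0
             [0.40, 0.35, 0.25],   # rejected: no class is convincing
             [0.10, 0.05, 0.85]]   # accepted as class 2
    print(classify_with_rejection(probs))  # [ 0 -1  2]
    ```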

  2. [Statistical Process Control (SPC) can help prevent treatment errors without increasing costs in radiotherapy].

    PubMed

    Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C

    2010-01-01

Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, the root cause was identified and eliminated to prevent set-up errors. Set-up errors were measured in the medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions, and the upper control limits were then calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using subgroups of 3 patients, three times each shift. These values were plotted on a control chart in real time. Control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on an X chart, helped monitor the set-up process stability and, if and when stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified, and corrective action was taken before proceeding with the treatment. The SPC protocol focuses on controlling the variability due to assignable causes instead of focusing on patient-to-patient variability, which normally does not exist. Compared to weekly sampling of the set-up error in each and every patient, which may only ensure that just those sampled sessions were set up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces the control costs. Copyright © 2009 SECA. Published by Elsevier Espana. All rights reserved.
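
    As an illustration of the kind of calculation involved (a textbook Shewhart X-bar chart for subgroups of three, using the standard A2 constant; the data and limits below are hypothetical, not the clinic's):

    ```python
    import numpy as np

    A2_N3 = 1.023  # standard Shewhart X-bar chart constant for subgroups of size 3

    def xbar_control_limits(subgroups):
        """Centre line, control limits and out-of-control flags for subgroup means.
        `subgroups` is an (m, 3) array of set-up errors (mm), e.g. three patients
        measured per shift. In practice the limits would be fixed from a baseline
        period before monitoring starts."""
        subgroups = np.asarray(subgroups, dtype=float)
        xbar = subgroups.mean(axis=1)                                   # subgroup means
        rbar = (subgroups.max(axis=1) - subgroups.min(axis=1)).mean()   # mean range
        centre = xbar.mean()
        ucl = centre + A2_N3 * rbar
        lcl = centre - A2_N3 * rbar
        out_of_control = (xbar > ucl) | (xbar < lcl)
        return centre, lcl, ucl, out_of_control

    # Hypothetical lateral set-up errors (mm) for five subgroups of 3 patients
    data = [[1.2, -0.5, 0.8], [0.3, 0.9, -1.1], [0.0, 0.4, -0.2],
            [1.0, 0.6, 0.2], [3.5, 2.8, 3.1]]   # last subgroup drifts and is flagged
    print(xbar_control_limits(data))
    ```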

  3. Low-Cost Manufacturing, Usability, and Security: An Analysis of Bluetooth Simple Pairing and Wi-Fi Protected Setup

    NASA Astrophysics Data System (ADS)

    Kuo, Cynthia; Walker, Jesse; Perrig, Adrian

    Bluetooth Simple Pairing and Wi-Fi Protected Setup specify mechanisms for exchanging authentication credentials in wireless networks. Both Simple Pairing and Protected Setup support multiple setup mechanisms, which increases security risks and hurts the user experience. To improve the security and usability of these specifications, we suggest defining a common baseline for hardware features and a consistent, interoperable user experience across devices.

  4. Setup errors and effectiveness of Optical Laser 3D Surface imaging system (Sentinel) in postoperative radiotherapy of breast cancer.

    PubMed

    Wei, Xiaobo; Liu, Mengjiao; Ding, Yun; Li, Qilin; Cheng, Changhai; Zong, Xian; Yin, Wenming; Chen, Jie; Gu, Wendong

    2018-05-08

Breast-conserving surgery (BCS) plus postoperative radiotherapy has become the standard treatment for early-stage breast cancer. The aim of this study was to compare the setup accuracy of optical surface imaging by the Sentinel system with the cone-beam computerized tomography (CBCT) imaging currently used in our clinic for patients who received BCS. Two optical surface scans were acquired, before and immediately after couch movement correction. The correlation between the setup errors determined by the initial optical surface scan and by CBCT was analyzed. The deviation of the second optical surface scan from the reference planning CT was considered an estimate of the residual errors of the new method for patient setup correction. The consequences in terms of the planning target volume (PTV) margins necessary for treatment sessions without setup correction were evaluated. We analyzed 145 scans in 27 patients treated for early-stage breast cancer. The setup errors of skin-marker-based patient alignment determined by optical surface scan and by CBCT were correlated, and the residual setup errors determined by the optical surface scan after couch movement correction were reduced. Optical surface imaging provides a convenient method for improving the setup accuracy for breast cancer patients without unnecessary imaging dose.

  5. Predictors of Problem Gambling Severity in Treatment Seeking Gamblers

    ERIC Educational Resources Information Center

    Hounslow, Vanessa; Smith, David; Battersby, Malcolm; Morefield, Kate

    2011-01-01

    Problem gambling has become a widespread problem following the rapid expansion of electronic gaming machines into hotels and clubs over the last 10 years. Recent literature indicates that certain factors can influence problem gambling severity, such as psychiatric co-morbidity and personality traits, gambling related cognitions, substance use and…

  6. Multipurpose setup for low-temperature conversion electron Mössbauer spectroscopy

    NASA Astrophysics Data System (ADS)

    Augustyns, V.; Trekels, M.; Gunnlaugsson, H. P.; Masenda, H.; Temst, K.; Vantomme, A.; Pereira, L. M. C.

    2017-05-01

    We describe an experimental setup for conversion electron Mössbauer spectroscopy (CEMS) at low temperature. The setup is composed of a continuous flow cryostat (temperature range of 4.2-500 K), detector housing, three channel electron multipliers, and corresponding electronics. We demonstrate the capabilities of the setup with CEMS measurements performed on a sample consisting of a thin enriched 57Fe film, with a thickness of 20 nm, deposited on a silicon substrate. We also describe exchangeable adaptations (lid and sample holder) which extend the applicability of the setup to emission Mössbauer spectroscopy as well as measurements under an applied magnetic field.

  7. Claims and Appeals (Medicare)

    MedlinePlus


  8. Rapid Production of Internally Structured Colloids by Flash Nanoprecipitation of Block Copolymer Blends.

    PubMed

    Grundy, Lorena S; Lee, Victoria E; Li, Nannan; Sosa, Chris; Mulhearn, William D; Liu, Rui; Register, Richard A; Nikoubashman, Arash; Prud'homme, Robert K; Panagiotopoulos, Athanassios Z; Priestley, Rodney D

    2018-05-08

    Colloids with internally structured geometries have shown great promise in applications ranging from biosensors to optics to drug delivery, where the internal particle structure is paramount to performance. The growing demand for such nanomaterials necessitates the development of a scalable processing platform for their production. Flash nanoprecipitation (FNP), a rapid and inherently scalable colloid precipitation technology, is used to prepare internally structured colloids from blends of block copolymers and homopolymers. As revealed by a combination of experiments and simulations, colloids prepared from different molecular weight diblock copolymers adopt either an ordered lamellar morphology consisting of concentric shells or a disordered lamellar morphology when chain dynamics are sufficiently slow to prevent defect annealing during solvent exchange. Blends of homopolymer and block copolymer in the feed stream generate more complex internally structured colloids, such as those with hierarchically structured Janus and patchy morphologies, due to additional phase separation and kinetic trapping effects. The ability of the FNP process to generate such a wide range of morphologies using a simple and scalable setup provides a pathway to manufacturing internally structured colloids on an industrial scale.

  9. Alleviating speech and deglutition: Role of a prosthodontist in multidisciplinary management of velopharyngeal insufficiency.

    PubMed

    Nanda, Aditi; Koli, Dheeraj; Sharma, Sunanda; Suryavanshi, Shalini; Verma, Mahesh

    2015-01-01

Surgical resection of the soft palate due to cancer affects the effective functioning of the velopharyngeal mechanism (speech and deglutition). With the loss of speech intelligibility, hyper-resonance in the voice and impaired swallowing (due to nasal regurgitation), there is a depreciation in the quality of life of such an individual. In a multidisciplinary setup, the role of a prosthodontist has been described in rehabilitating such patients by fabrication of a speech-aid prosthesis. The design and method of fabrication of the prosthesis are simple and easy to perform. The use of the prosthesis, together with speech training by a speech pathologist, resulted in improvement in speech. Furthermore, an improvement in swallowing was noted, resulting in improved nutritional intake and general well-being of the individual. The take-home message is that in the treatment of oral cancer, feasible and rapid rehabilitation should be pursued in order to make the patient socially more acceptable. The onus lies on the prosthodontist to practise this in a rapid manner before the morale of the patient becomes low due to the associated stigma of cancer.

  10. Innovative technology for web-based data management during an outbreak

    PubMed Central

    Mukhi, Shamir N; Chester, Tammy L Stuart; Klaver-Kibria, Justine DA; Nowicki, Deborah L; Whitlock, Mandy L; Mahmud, Salah M; Louie, Marie; Lee, Bonita E

    2011-01-01

    Lack of automated and integrated data collection and management, and poor linkage of clinical, epidemiological and laboratory data during an outbreak can inhibit effective and timely outbreak investigation and response. This paper describes an innovative web-based technology, referred to as Web Data, developed for the rapid set-up and provision of interactive and adaptive data management during outbreak situations. We also describe the benefits and limitations of the Web Data technology identified through a questionnaire that was developed to evaluate the use of Web Data implementation and application during the 2009 H1N1 pandemic by Winnipeg Regional Health Authority and Provincial Laboratory for Public Health of Alberta. Some of the main benefits include: improved and secure data access, increased efficiency and reduced error, enhanced electronic collection and transfer of data, rapid creation and modification of the database, conversion of specimen-level to case-level data, and user-defined data extraction and query capabilities. Areas requiring improvement include: better understanding of privacy policies, increased capability for data sharing and linkages between jurisdictions to alleviate data entry duplication. PMID:23569597

  11. Rapid cable tension estimation using dynamic and mechanical properties

    NASA Astrophysics Data System (ADS)

    Martínez-Castro, Rosana E.; Jang, Shinae; Christenson, Richard E.

    2016-04-01

Main tension elements are critical to the overall stability of cable-supported bridges. A dependable and rapid determination of cable tension is desired to assess the state of a cable-supported bridge and evaluate its operability. A portable smart sensor setup is presented to reduce post-processing time and deployment complexity while reliably determining cable tension from dynamic characteristics extracted through spectral analysis. A self-recording accelerometer is coupled with a single-board microcomputer that communicates wirelessly with a remote host computer. The portable smart sensing device is designed such that additional algorithms, sensors and controlling devices for various monitoring applications can be installed and operated for additional structural assessment. The tension-estimating algorithms are based on taut string theory and are extended to consider bending stiffness. The successful combination of cable properties allows the use of a cable's dynamic behavior to determine the tension force. The tension-estimating algorithms are experimentally validated on a through-arch steel bridge subject to ambient vibration induced by passing traffic. The estimated tension is in good agreement with previously determined tension values for the structure.
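
    A hedged sketch of the simplest underlying relation, the taut-string link between natural frequency and tension (the bending-stiffness correction used in the paper is omitted, and the cable properties below are invented):

    ```python
    def taut_string_tension(frequency_hz, mode_number, length_m, mass_per_length_kg_m):
        """Taut-string estimate of cable tension (N): T = 4 m L^2 (f_n / n)^2.
        Ignores bending stiffness and sag, so it is only a first approximation."""
        return 4.0 * mass_per_length_kg_m * length_m ** 2 * (frequency_hz / mode_number) ** 2

    # Hypothetical hanger cable: first mode at 2.5 Hz, 20 m long, 50 kg/m
    print(taut_string_tension(2.5, 1, 20.0, 50.0), "N")  # 500000 N = 500 kN
    ```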

  12. A Rapid Survival Assay to Measure Drug-Induced Cytotoxicity and Cell Cycle Effects

    PubMed Central

    Valiathan, Chandni; McFaline, Jose L.

    2012-01-01

We describe a rapid method to accurately measure the cytotoxicity of mammalian cells upon exposure to various drugs. Using this assay, we obtain survival data in a fraction of the time required to perform the traditional clonogenic survival assay, considered the gold standard. The dynamic range of the assay allows sensitivity measurements on a multi-log scale, allowing better resolution of comparative sensitivities. Moreover, the results obtained contain additional information on cell cycle effects of the drug treatment. Cell survival is obtained from a quantitative comparison of proliferation between drug-treated and untreated cells. During the assay, cells are treated with a drug and, following a recovery period, allowed to proliferate in the presence of bromodeoxyuridine (BrdU). Cells that synthesize DNA in the presence of BrdU exhibit quenched Hoechst fluorescence easily detected by flow cytometry; quenching is used to determine relative proliferation in treated versus untreated cells. Finally, the multi-well setup of this assay allows the simultaneous screening of multiple cell lines, multiple doses, or multiple drugs to accurately measure cell survival and cell cycle changes after drug treatment.

  13. Water level effects on breaking wave setup for Pacific Island fringing reefs

    NASA Astrophysics Data System (ADS)

    Becker, J. M.; Merrifield, M. A.; Ford, M.

    2014-02-01

    The effects of water level variations on breaking wave setup over fringing reefs are assessed using field measurements obtained at three study sites in the Republic of the Marshall Islands and the Mariana Islands in the western tropical Pacific Ocean. At each site, reef flat setup varies over the tidal range with weaker setup at high tide and stronger setup at low tide for a given incident wave height. The observed water level dependence is interpreted in the context of radiation stress gradients specified by an idealized point break model generalized for nonnormally incident waves. The tidally varying setup is due in part to depth-limited wave heights on the reef flat, as anticipated from previous reef studies, but also to tidally dependent breaking on the reef face. The tidal dependence of the breaking is interpreted in the context of the point break model in terms of a tidally varying wave height to water depth ratio at breaking. Implications for predictions of wave-driven setup at reef-fringed island shorelines are discussed.
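
    As a worked illustration of the radiation-stress balance behind such models (the classical result for normally incident, depth-limited breaking on a planar beach, not the generalized point-break model used in the paper):

    ```python
    def setup_slope_factor(gamma):
        """Classic surf-zone result for depth-limited breaking, H = gamma*(h + eta):
        d(eta)/dx = -K dh/dx with K = (3 gamma^2 / 8) / (1 + 3 gamma^2 / 8)."""
        k = 3.0 * gamma ** 2 / 8.0
        return k / (1.0 + k)

    def shoreline_setup(breaker_depth_m, gamma=0.8):
        """Setup gained between the break point and the still-water shoreline on a
        planar beach, ignoring the set-down at the break point; values are illustrative."""
        return setup_slope_factor(gamma) * breaker_depth_m

    print(shoreline_setup(2.0))   # ~0.39 m of setup for 2 m breaking depth, gamma = 0.8
    ```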

  14. Comparison of organ-at-risk sparing and plan robustness for spot-scanning proton therapy and volumetric modulated arc photon therapy in head-and-neck cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barten, Danique L. J., E-mail: d.barten@vumc.nl; Tol, Jim P.; Dahele, Max

Purpose: Proton radiotherapy for head-and-neck cancer (HNC) aims to improve organ-at-risk (OAR) sparing over photon radiotherapy. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The authors investigated OAR sparing and plan robustness for spot-scanning proton planning techniques and compared these with volumetric modulated arc therapy (VMAT) photon plans. Methods: Ten HNC patients were replanned using two-arc VMAT (RapidArc) and spot-scanning proton techniques. OARs to be spared included the contra- and ipsilateral parotid and submandibular glands and individual swallowing muscles. Proton plans were made using Multifield Optimization (MFO, using three, five, and seven fields) and Single-field Optimization (SFO, using three fields). OAR sparing was evaluated using the mean dose to the composite salivary glands (Comp_Sal) and composite swallowing muscles (Comp_Swal). Plan robustness was determined for setup and range uncertainties (±3 mm for setup, ±3% HU), evaluating V95% and V107% for clinical target volumes. Results: Averaged over all patients, Comp_Sal/Comp_Swal mean doses were lower for the three-field MFO plans (14.6/16.4 Gy) compared to the three-field SFO plans (20.0/23.7 Gy) and VMAT plans (23.0/25.3 Gy). Using more than three fields resulted in differences in OAR sparing of less than 1.5 Gy between plans. SFO plans were significantly more robust than MFO plans. VMAT plans were the most robust. Conclusions: MFO plans had improved OAR sparing but were less robust than SFO and VMAT plans, while SFO plans were more robust than MFO plans but resulted in less OAR sparing. Robustness of the MFO plans did not increase with more fields.

  15. A Mock Circulatory System Incorporating a Compliant 3D-Printed Anatomical Model to Investigate Pulmonary Hemodynamics.

    PubMed

    Knoops, Paul G M; Biglino, Giovanni; Hughes, Alun D; Parker, Kim H; Xu, Linzhang; Schievano, Silvia; Torii, Ryo

    2017-07-01

    A realistic mock circulatory system (MCS) could be a valuable in vitro testbed to study human circulatory hemodynamics. The objective of this study was to design a MCS replicating the pulmonary arterial circulation, incorporating an anatomically representative arterial model suitable for testing clinically relevant scenarios. A second objective of the study was to ensure the system's compatibility with magnetic resonance imaging (MRI) for additional measurements. A latex pulmonary arterial model with two generations of bifurcations was manufactured starting from a 3D-printed mold reconstructed from patient data. The model was incorporated into a MCS for in vitro hydrodynamic measurements. The setup was tested under physiological pulsatile flow conditions and results were evaluated using wave intensity analysis (WIA) to investigate waves traveling in the arterial system. Increased pulmonary vascular resistance (IPVR) was simulated as an example of one pathological scenario. Flow split between right and left pulmonary artery was found to be realistic (54 and 46%, respectively). No substantial difference in pressure waveform was observed throughout the various generations of bifurcations. Based on WIA, three main waves were identified in the main pulmonary artery (MPA), that is, forward compression wave, backward compression wave, and forward expansion wave. For IPVR, a rise in mean pressure was recorded in the MPA, within the clinical range of pulmonary arterial hypertension. The feasibility of using the MCS in the MRI scanner was demonstrated with the MCS running 2 h consecutively while acquiring preliminary MRI data. This study shows the development and verification of a pulmonary MCS, including an anatomically correct, compliant latex phantom. The setup can be useful to explore a wide range of hemodynamic questions, including the development of patient- and pathology-specific models, considering the ease and low cost of producing rapid prototyping molds, and the versatility of the setup for invasive and noninvasive (i.e., MRI) measurements. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  16. A Mock Circulatory System Incorporating a Compliant 3D-Printed Anatomical Model to Investigate Pulmonary Hemodynamics

    PubMed Central

    Knoops, Paul G.M.; Biglino, Giovanni; Hughes, Alun D.; Parker, Kim H.; Xu, Linzhang; Schievano, Silvia; Torii, Ryo

    2017-01-01

    A realistic mock circulatory system (MCS) could be a valuable in vitro testbed to study human circulatory hemodynamics. The objective of this study was to design a MCS replicating the pulmonary arterial circulation, incorporating an anatomically representative arterial model suitable for testing clinically relevant scenarios. A second objective of the study was to ensure the system's compatibility with magnetic resonance imaging (MRI) for additional measurements. A latex pulmonary arterial model with two generations of bifurcations was manufactured starting from a 3D-printed mold reconstructed from patient data. The model was incorporated into a MCS for in vitro hydrodynamic measurements. The setup was tested under physiological pulsatile flow conditions and results were evaluated using wave intensity analysis (WIA) to investigate waves traveling in the arterial system. Increased pulmonary vascular resistance (IPVR) was simulated as an example of one pathological scenario. Flow split between right and left pulmonary artery was found to be realistic (54 and 46%, respectively). No substantial difference in pressure waveform was observed throughout the various generations of bifurcations. Based on WIA, three main waves were identified in the main pulmonary artery (MPA), that is, forward compression wave, backward compression wave, and forward expansion wave. For IPVR, a rise in mean pressure was recorded in the MPA, within the clinical range of pulmonary arterial hypertension. The feasibility of using the MCS in the MRI scanner was demonstrated with the MCS running 2 h consecutively while acquiring preliminary MRI data. This study shows the development and verification of a pulmonary MCS, including an anatomically correct, compliant latex phantom. The setup can be useful to explore a wide range of hemodynamic questions, including the development of patient- and pathology-specific models, considering the ease and low cost of producing rapid prototyping molds, and the versatility of the setup for invasive and noninvasive (i.e., MRI) measurements. PMID:27925228

  17. Paleomagnetism studies at micrometer scales using quantum diamond microscopy

    NASA Astrophysics Data System (ADS)

    Kehayias, P.; Fu, R. R.; Glenn, D. R.; Lima, E. A.; Men, M.; Merryman, H.; Walsworth, A.; Weiss, B. P.; Walsworth, R. L.

    2017-12-01

    Traditional paleomagnetic experiments generally measure the net magnetic moment of cm-size rock samples. Field tests such as the conglomerate and fold tests, based on the measurements of such cm-size samples, are frequently used to constrain the timing of magnetization. However, structures permitting such field tests may occur at the micron scale in geological samples, precluding paleomagnetic field tests using traditional bulk measurement techniques. The quantum diamond microscope (QDM) is a recently developed technology that uses magnetically-sensitive nitrogen-vacancy (NV) color centers in diamond for magnetic mapping with micron resolution [1]. QDM data were previously used to identify the ferromagnetic carriers in chondrules and terrestrial zircons and to image the magnetization distribution in multi-domain dendritic magnetite. Taking advantage of new hardware components, we have developed an optimized QDM setup with a 1E-15 J/T moment sensitivity over a measurement area of several millimeters squared. The improved moment sensitivity of the new QDM setup permits us to image natural remanent magnetization (NRM) in weakly magnetized samples, thereby enabling paleomagnetic field tests at the millimeter scale. We will present recent and ongoing QDM measurements of (1) the Renazzo class carbonaceous (CR) chondrite GRA 95229 and (2) 1 cm scale folds in a post-Bitter Springs Stage ( 790 Ma) carbonate from the Svanbergfjellet Formation (Svalbard). Results from the GRA 95229 micro-conglomerate test, performed on single chondrules containing dusty olivine metals crystallized during chondrule formation, hold implications for the preservation of nebular magnetic field records. The Svanbergfjellet Formation micro-fold test can help confirm the primary origin of a paleomagnetic pole at 790 Ma, which has been cited as evidence for rapid true polar wander in the 820-790 Ma interval. In addition, we will detail technical aspects of the new QDM setup, emphasizing key elements that enable improved sensitivity. [1] D. R. Glenn et al., arXiv:1707.06714 (2017).

  18. Preventive Visit and Yearly Wellness Exams

    MedlinePlus


  19. Column experiments on organic micropollutants - applications and limitations

    NASA Astrophysics Data System (ADS)

    Banzhaf, Stefan; Hebig, Klaus

    2016-04-01

    As organic micropollutants become more and more ubiquitous in the aquatic environment, a sound understanding of their fate and transport behaviour is needed. This is to assure both a safe and clean drinking water supply for mankind in the future and to protect the aquatic environment from pollution and negative consequences caused by manmade contamination. Apart from countless field studies, column experiments were and are frequently used to study the transport of organic micropollutants. As the transport of (organic) solutes in groundwater is controlled by the chemical and physical properties of the compounds, the solvent (the groundwater including all solutes), and the substrate (the aquifer material), the adjustment and control of these boundary conditions allow a multitude of different experimental setups to be studied and specific research questions to be addressed. The main purpose, however, remains to study the transport of a specific compound and its sorption and degradation behaviour in a specific sediment or substrate. Apart from the effective control of the individual boundary conditions, the main advantage of column studies compared to other experimental setups (such as field studies or batch/microcosm studies) is that conservative and reactive solute breakthrough curves are obtained, which represent the sum of the transport processes. The analysis of these curves is well developed and established. Limitations of this experimental method are also presented here: the effects observed in column studies are often a result of dynamic, non-equilibrium processes. Time (or flow velocity) plays a major role, in contrast to batch experiments, in which all processes are observed until equilibrium is reached in the substrate-solution system. Slightly modified boundary conditions in different experiments have a strong influence on the transport and degradation behaviour of organic micropollutants. This is a severe issue when it comes to general findings on the transport behaviour of a specific organic compound that are transferable to any given hydrogeochemical environment. Unfortunately, the results of most column experiments therefore remain restricted to their specific setup. Column experiments can provide good estimates of all relevant transport parameters. However, the obtained results will almost always be limited to the scale at which they were obtained. This means that direct application to field-scale studies is infeasible, as too many parameters are exclusive to the laboratory column setup. The remaining future challenge is to develop standard column experiments on organic micropollutants that overcome this issue. Here, we present a review of column experiments on organic micropollutants. We present different setups, discuss their weaknesses, problems and advantages, and provide ideas on how to obtain more comparable results on the transport of organic micropollutants in the future.
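
    As an illustration of how transport parameters are typically extracted from a conservative breakthrough curve, the following Python sketch fits velocity and dispersion with a one-dimensional advection-dispersion solution (continuous injection, high-Peclet approximation). The column length, observation times and concentrations are hypothetical placeholders, not data from any particular study.

      import numpy as np
      from scipy.special import erfc
      from scipy.optimize import curve_fit

      def breakthrough(t, v, D, x=0.30):
          """Relative concentration C/C0 at distance x [m] for a continuous injection,
          1D advection-dispersion, high-Peclet approximation (second Ogata-Banks term dropped)."""
          return 0.5 * erfc((x - v * t) / (2.0 * np.sqrt(D * t)))

      # Hypothetical outlet data from a 0.30 m column (times in days, C/C0 dimensionless)
      t_obs = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.7, 1.0])
      c_obs = np.array([0.00, 0.02, 0.15, 0.48, 0.75, 0.95, 0.99])

      (v_fit, D_fit), _ = curve_fit(breakthrough, t_obs, c_obs, p0=[1.0, 0.01],
                                    bounds=([0.0, 1e-6], [10.0, 1.0]))
      print(f"seepage velocity ~ {v_fit:.2f} m/d, dispersion coefficient ~ {D_fit:.4f} m^2/d")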

  20. Experimental investigation of refractory metals in the premelting region during fast heating

    NASA Astrophysics Data System (ADS)

    Senchenko, V. N.; Belikov, R. S.; Popov, V. S.

    2015-11-01

    This work demonstrates the experimental possibility of investigating highly refractory materials around their melting points, particularly in the premelting region, with high accuracy. The authors describe the developed experimental setup based on rapid resistive self-heating of a sample by a large current pulse generated by a capacitor discharge circuit, which allows fast pulse interruption by a temperature feedback signal. The sample temperature was measured with a two-channel microsecond radiation pyrometer. Preliminary experiments were conducted on tantalum and molybdenum at a heating rate of 10^8 K/s. The method allows investigating thermophysical properties of refractory conductive materials such as the melting temperature, heat of melting, specific resistivity, specific enthalpy and specific heat capacity in the solid and liquid phases, especially in the premelting region.
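
    The thermophysical quantities mentioned follow from the recorded electrical and pyrometric signals: the specific enthalpy is the time integral of the Joule heating power per unit mass, and the specific heat is its derivative with respect to temperature. A minimal Python sketch of that post-processing (array names and the neglect of radiative losses are assumptions) is:

      import numpy as np

      def pulse_heating_properties(t, current, voltage, temperature, mass):
          """Sketch: specific enthalpy and heat capacity from a pulse-heating record.
          t, current, voltage, temperature are the recorded arrays (s, A, V, K);
          mass is the sample mass in kg. Radiative losses are neglected, which is
          the usual assumption at heating rates of ~1e8 K/s."""
          power = current * voltage                                  # Joule heating [W]
          # cumulative trapezoidal integral of power over time
          dq = 0.5 * (power[1:] + power[:-1]) * np.diff(t)
          enthalpy = np.concatenate(([0.0], np.cumsum(dq))) / mass   # [J/kg]
          cp = np.gradient(enthalpy, temperature)                    # c_p = dh/dT [J/(kg K)]
          return enthalpy, cp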

  1. A Fully Immersive Set-Up for Remote Interaction and Neurorehabilitation Based on Virtual Body Ownership

    PubMed Central

    Perez-Marcos, Daniel; Solazzi, Massimiliano; Steptoe, William; Oyekoya, Oyewole; Frisoli, Antonio; Weyrich, Tim; Steed, Anthony; Tecchia, Franco; Slater, Mel; Sanchez-Vives, Maria V.

    2012-01-01

    Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., as a physical robot). The importance of secure and rapid communication between the nodes is also stressed and an example implemented solution is described. Finally, we discuss the proposed approach with reference to the existing literature and systems. PMID:22787454

  2. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, M; Feigenberg, S

    Purpose: To evaluate the effectiveness of using 3D-surface imaging to guide breath-holding (BH) left-side breast treatment. Methods: Two 3D-surface-image-guided BH procedures were implemented and evaluated: normal-BH, taking BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 Normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercial 3D-surface-tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to the CT scan. Tangential 3D/IMRT plans were generated. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process based on the information provided by the 3D-surface-tracking system for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to the CT scan. For >90% of fractions, based on the setup deltas from the 3D-surface-tracking system, adjustments of the patient setup were needed after the initial setup using lasers. 3D-surface-guided setup accuracy is comparable to CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) is 40% (Normal-BH)/91% (DIBH) of treatments for the first 5 fractions and then drops to 16% (Normal-BH)/46% (DIBH). The necessity of re-setup is highly patient-specific for Normal-BH but highly random among patients for DIBH. Overall, a −0.8±2.4 mm accuracy of the anterior pericardial shadow position was achieved. Conclusion: 3D-surface-image technology provides effective intervention to the treatment process and ensures favorable day-to-day setup accuracy. DIBH setup appears to be more uncertain, and this would be the patient group that will benefit most from the extra information of 3D surface setup.

  3. Rapid Generation of Optimal Asteroid Powered Descent Trajectories Via Convex Optimization

    NASA Technical Reports Server (NTRS)

    Pinson, Robin; Lu, Ping

    2015-01-01

    This paper investigates a convex optimization based method that can rapidly generate the fuel optimal asteroid powered descent trajectory. The ultimate goal is to autonomously design the optimal powered descent trajectory on-board the spacecraft immediately prior to the descent burn. Compared to a planetary powered landing problem, the major difficulty is the complex gravity field near the surface of an asteroid that cannot be approximated by a constant gravity field. This paper uses relaxation techniques and a successive solution process that seeks the solution to the original nonlinear, nonconvex problem through the solutions to a sequence of convex optimal control problems.
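
    A minimal sketch of one convex subproblem in such a successive scheme is given below, using CVXPY with a discretized double-integrator model. The gravity vector is held fixed within the subproblem (in the successive process it would be re-evaluated along the previous trajectory), and all numbers, bounds and the simple relaxed thrust-magnitude constraint are placeholder assumptions rather than the paper's exact formulation.

      import numpy as np
      import cvxpy as cp

      # Hypothetical problem data (placeholders, not from the paper)
      N, dt = 100, 1.0                       # time steps and step length [s]
      g = np.array([0.0, 0.0, -5e-5])        # gravity, held fixed in this subproblem [km/s^2]
      r0 = np.array([0.2, 0.1, 0.5])         # initial position [km]
      v0 = np.array([0.0, 0.0, -0.001])      # initial velocity [km/s]
      u_max = 5e-4                           # maximum thrust acceleration [km/s^2]

      r = cp.Variable((N + 1, 3))            # position
      v = cp.Variable((N + 1, 3))            # velocity
      u = cp.Variable((N, 3))                # thrust acceleration
      s = cp.Variable(N)                     # relaxed thrust magnitude (slack)

      cons = [r[0] == r0, v[0] == v0, r[N] == 0, v[N] == 0]
      for k in range(N):
          cons += [
              r[k + 1] == r[k] + dt * v[k] + 0.5 * dt ** 2 * (u[k] + g),
              v[k + 1] == v[k] + dt * (u[k] + g),
              cp.norm(u[k]) <= s[k],         # ||u_k|| <= s_k (second-order cone)
              s[k] <= u_max,
          ]

      # Minimizing the summed commanded acceleration is a proxy for fuel use
      prob = cp.Problem(cp.Minimize(dt * cp.sum(s)), cons)
      prob.solve()
      print(prob.status, prob.value)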

  4. Positron lifetime setup based on DRS4 evaluation board

    NASA Astrophysics Data System (ADS)

    Petriska, M.; Sojak, S.; Slugeň, V.

    2014-04-01

    A digital positron lifetime setup based on the DRS4 evaluation board designed at the Paul Scherrer Institute has been constructed and tested in the positron annihilation laboratory of the Slovak University of Technology in Bratislava. The high bandwidth, low power consumption and short readout time make the DRS4 chip attractive for a positron annihilation lifetime (PALS) setup, replacing traditional ADCs and TDCs. Software for online and offline pulse analysis for the PALS setup was developed with the Qt, Qwt and ALGLIB libraries.
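
    In a digital PALS setup of this kind, lifetimes are obtained in software from the digitized detector pulses, typically by timing each pulse at a constant fraction of its amplitude. The Python sketch below (not the authors' Qt/ALGLIB code; the fraction and sampling interval are assumptions) shows the idea:

      import numpy as np

      def cfd_time(waveform, dt, fraction=0.3):
          """Digital constant-fraction timing on a single baseline-subtracted, positive pulse.
          dt is the sampling interval (e.g. ~0.2 ns for a DRS4 running at 5 GS/s).
          Returns the interpolated instant at which the leading edge crosses
          `fraction` of the pulse amplitude."""
          peak = np.argmax(waveform)
          threshold = fraction * waveform[peak]
          # last sample on the leading edge still below the threshold
          i = np.nonzero(waveform[:peak] < threshold)[0][-1]
          # linear interpolation between samples i and i+1
          frac = (threshold - waveform[i]) / (waveform[i + 1] - waveform[i])
          return (i + frac) * dt

      # lifetime estimate = cfd_time(stop_pulse, dt) - cfd_time(start_pulse, dt)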

  5. Wave setup over a Pacific Island fringing reef

    NASA Astrophysics Data System (ADS)

    Vetter, O.; Becker, J. M.; Merrifield, M. A.; Pequignet, A.-C.; Aucan, J.; Boc, S. J.; Pollock, C. E.

    2010-12-01

    Measurements obtained across a shore-attached, fringing reef on the southeast coast of the island of Guam are examined to determine the relationship between incident waves and wave-driven setup during storm and nonstorm conditions. Wave setup on the reef flat correlates well (r > 0.95) and scales near the shore as approximately 35% of the incident root mean square wave height in 8 m water depth. Waves generated by tropical storm Man-Yi result in a 1.3 m setup during the peak of the storm. Predictions based on traditional setup theory (steady state, inviscid cross-shore momentum and depth-limited wave breaking) and an idealized model of localized wave breaking at the fore reef are in agreement with the observations. The reef flat setup is used to estimate a similarity parameter at breaking that is in agreement with observations from a steeply sloping sandy beach. A weak (˜10%) increase in setup is observed across the reef flat during wave events. The inclusion of bottom stress in the cross-shore momentum balance may account for a portion of this signal, but this assessment is inconclusive as the reef flat currents in some cases are in the wrong direction to account for the increase. An independent check of fringing reef setup dynamics is carried out for measurements at the neighboring island of Saipan with good agreement.
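
    As a quick numerical illustration of the reported scaling (the 0.35 factor comes from the abstract; the incident wave height below is an arbitrary example):

      # Reef-flat wave setup estimated from the reported empirical scaling:
      # setup ~= 0.35 * Hrms measured in 8 m water depth.
      h_rms_8m = 2.0                      # example incident RMS wave height [m]
      setup = 0.35 * h_rms_8m
      print(f"estimated reef-flat setup: {setup:.2f} m")   # 0.70 m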

  6. The influence of age, muscle strength and speed of information processing on recovery responses to external perturbations in gait.

    PubMed

    Senden, R; Savelberg, H H C M; Adam, J; Grimm, B; Heyligers, I C; Meijer, K

    2014-01-01

    Dynamic imbalance caused by external perturbations to gait can successfully be counteracted by adequate recovery responses. The current study investigated how the recovery response is moderated by age, walking speed, muscle strength and speed of information processing. The gait pattern of 50 young and 45 elderly subjects was repeatedly perturbed at 20% and 80% of the first half of the swing phase using the Timed Rapid impact Perturbation (TRiP) set-up. Recovery responses were identified using 2D cameras. Muscular factors (dynamometer) and speed of information processing parameters (computer-based reaction time task) were determined. The stronger, faster reacting and faster walking young subjects recovered more often by an elevating strategy than elderly subjects. Twenty three per cent of the differences in recovery responses were explained by a combination of walking speed (B=-13.85), reaction time (B=-0.82), maximum extension strength (B=0.01) and rate of extension moment development (B=0.19). The recovery response that subjects employed when gait was perturbed by the TRiP set-up was modified by several factors; the individual contribution of walking speed, muscle strength and speed of information processing was small. Insight into remaining modifying factors is needed to assist and optimise fall prevention programmes. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. The formation of CdS quantum dots and Au nanoparticles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiener, Andreas; Schmidt, Ella; Bergmann, Christoph

    We report on microsecond-resolved in-situ SAXS experiments of the early nucleation and growth behavior of both cadmium sulfide (CdS) quantum dots in aqueous solution, including the temperature dependence, and of gold (Au) nanoparticles. A novel free-jet setup was developed to access reaction times as early as 20 μs. As the signal is weak, particularly at the beginning of the reaction, the containment-free nature of this sample environment proved crucial. The SAXS data reveal a two-step pathway with a surprising stability of a structurally relaxed cluster with a diameter of about 2 nm. While these develop rapidly by ionic assembly, a further slower growth is attributed to cluster attachment. WAXS diffraction confirms that the particles at this early stage are not yet crystalline. This growth mode is confirmed for a temperature range from 25°C to 45°C. An energy barrier for the diffusion of primary clusters in water of 0.60 eV was experimentally observed, in agreement with molecular simulations. To access reaction times beyond 100 ms, a stopped-drop setup, again containment-free, is introduced. SAXS experiments on the growth of Au nanoparticles on an extended time scale show much slower growth with only one population. Further, the influence of ionizing X-ray radiation on the Au particle formation and growth is discussed.

  8. The ionoluminescence apparatus at the LABEC external microbeam facility

    NASA Astrophysics Data System (ADS)

    Calusi, S.; Colombo, E.; Giuntini, L.; Giudice, A. Lo; Manfredotti, C.; Massi, M.; Pratesi, G.; Vittone, E.

    2008-05-01

    In this paper, we describe the main features of the ionoluminescence (IL) apparatus recently installed at the external scanning microbeam facility of the 3 MV Tandetron accelerator of the INFN LABEC Laboratory in Firenze. The peculiarity of this IL set-up resides in the fact that the light produced by the ion irradiation of the specimen is collected by a bifurcated optical fiber, so that photons are shunted both to a CCD spectrometer, working in the 200-900 nm wavelength range, and to a photomultiplier (PMT). The accurate focusing of the optical system allows high photon collection efficiency and this results in rapid acquisition of luminescence spectra with low ion currents on luminescent materials; simultaneously, luminescence maps with a spatial resolution of 10 μm can be acquired through the synchronization of PMT photon detection with the position of the scanning focused ion beam. An optical filter with a narrow passband facing the photomultiplier allows chromatic selectivity of the luminescence centres. The IL apparatus is synergistically integrated into the existing set-up for ion beam analyses (IBA). The upgraded system permits simultaneous IL and PIXE/PIGE/BS measurements. With our integrated system, we have been studying raw lapis lazuli samples of different known origins and precious lapis lazuli artworks of the Collezione Medicea of Museum of Natural History, University of Firenze, aiming at characterising their composition and provenance.

  9. Coupled modes locally interacting with qubits: Critical assessment of the rotating-wave approximation

    NASA Astrophysics Data System (ADS)

    Cárdenas, P. C.; Teixeira, W. S.; Semião, F. L.

    2017-04-01

    The interaction of qubits with quantized modes of electromagnetic fields has been largely addressed in the quantum optics literature under the rotating wave approximation (RWA), where rapid oscillating terms in the qubit-mode interaction picture Hamiltonian can be neglected. At the same time, it is generally accepted that, provided the interaction is sufficiently strong or for long times, the RWA tends to describe physical phenomena incorrectly. In this work, we extend the investigation of the validity of the RWA to a more involved setup where two qubit-mode subsystems are brought to interaction through their harmonic coordinates. Our treatment is all analytic thanks to a sequence of carefully chosen unitary transformations, which allows us to diagonalize the Hamiltonian within and without the RWA. By also considering qubit dephasing, we find that the purity of the two-qubit state presents non-Markovian features which become more pronounced as the coupling between the modes gets stronger and the RWA loses its validity. In the same regime, there occurs fast generation of entanglement between the qubits, which is also not correctly described under the RWA. The setup and results presented here clearly show the limitations of the RWA in a scenario amenable to exact description and free from numerical uncertainties. Consequently, it may be of interest for the community working with cavity or circuit quantum electrodynamic systems in the strong coupling regime.

  10. Problem-Based Learning in Instrumentation: Synergism of Real and Virtual Modular Acquisition Chains

    ERIC Educational Resources Information Center

    Nonclercq, A.; Biest, A. V.; De Cuyper, K.; Leroy, E.; Martinez, D. L.; Robert, F.

    2010-01-01

    As part of an instrumentation course, a problem-based learning framework was selected for laboratory instruction. Two acquisition chains were designed to help students carry out realistic instrumentation problems. The first tool is a virtual (simulated) modular acquisition chain that allows rapid overall understanding of the main problems in…

  11. Internet Computer Coaches for Introductory Physics Problem Solving

    ERIC Educational Resources Information Center

    Xu Ryan, Qing

    2013-01-01

    The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem-solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem solving skills throughout the…

  12. On-Line Use of Three-Dimensional Marker Trajectory Estimation From Cone-Beam Computed Tomography Projections for Precise Setup in Radiotherapy for Targets With Respiratory Motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worm, Esben S., E-mail: esbeworm@rm.dk; Department of Medical Physics, Aarhus University Hospital, Aarhus; Hoyer, Morten

    2012-05-01

    Purpose: To develop and evaluate accurate and objective on-line patient setup based on a novel semiautomatic technique in which three-dimensional marker trajectories were estimated from two-dimensional cone-beam computed tomography (CBCT) projections. Methods and Materials: Seven treatment courses of stereotactic body radiotherapy for liver tumors were delivered in 21 fractions in total to 6 patients by a linear accelerator. Each patient had two to three gold markers implanted close to the tumors. Before treatment, a CBCT scan with approximately 675 two-dimensional projections was acquired during a full gantry rotation. The marker positions were segmented in each projection. From this, the three-dimensional marker trajectories were estimated using a probability-based method. The required couch shifts for patient setup were calculated from the mean marker positions along the trajectories. A motion phantom moving with known tumor trajectories was used to examine the accuracy of the method. Trajectory-based setup was retrospectively used off-line for the first five treatment courses (15 fractions) and on-line for the last two treatment courses (6 fractions). Automatic marker segmentation was compared with manual segmentation. The trajectory-based setup was compared with setup based on conventional CBCT guidance on the markers (first 15 fractions). Results: Phantom measurements showed that trajectory-based estimation of the mean marker position was accurate within 0.3 mm. The on-line trajectory-based patient setup was performed within approximately 5 minutes. The automatic marker segmentation agreed with manual segmentation within 0.36 ± 0.50 pixels (mean ± SD; pixel size, 0.26 mm in isocenter). The accuracy of conventional volumetric CBCT guidance was compromised by motion smearing (≤21 mm) that induced an absolute three-dimensional setup error of 1.6 ± 0.9 mm (maximum, 3.2) relative to trajectory-based setup. Conclusions: The first on-line clinical use of trajectory estimation from CBCT projections for precise setup in stereotactic body radiotherapy was demonstrated. Uncertainty in the conventional CBCT-based setup procedure was eliminated with the new method.

  13. Constraint monitoring in TOSCA

    NASA Technical Reports Server (NTRS)

    Beck, Howard

    1992-01-01

    The Job-Shop Scheduling Problem (JSSP) deals with the allocation of resources over time to factory operations. Allocations are subject to various constraints (e.g., production precedence relationships, factory capacity constraints, and limits on the allowable number of machine setups) which must be satisfied for a schedule to be valid. The identification of constraint violations and the monitoring of constraint threats play a vital role in schedule generation in terms of the following: (1) directing the scheduling process; and (2) informing scheduling decisions. This paper describes a general mechanism for identifying constraint violations and monitoring threats to the satisfaction of constraints throughout schedule generation.
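
    A toy version of such constraint monitoring is sketched below in Python (not the TOSCA implementation; the schedule format and limits are hypothetical): it scans a schedule for machine-capacity overlaps and for machines exceeding an allowed number of setups.

      from collections import defaultdict

      def find_violations(schedule, max_setups=3):
          """schedule: list of dicts like {'op': 'A1', 'machine': 'M1',
          'start': 0, 'end': 5, 'setup': True}  (hypothetical format).
          Flags overlapping operations on the same unary machine and machines
          whose number of setups exceeds the allowed limit."""
          violations = []
          by_machine = defaultdict(list)
          for op in schedule:
              by_machine[op['machine']].append(op)
          for machine, ops in by_machine.items():
              ops.sort(key=lambda o: o['start'])
              for a, b in zip(ops, ops[1:]):
                  if b['start'] < a['end']:                       # capacity violated
                      violations.append(('overlap', machine, a['op'], b['op']))
              n_setups = sum(1 for o in ops if o['setup'])
              if n_setups > max_setups:                           # setup limit violated
                  violations.append(('too_many_setups', machine, n_setups))
          return violations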

  14. Sierra/Aria 4.48 Verification Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal Fluid Development Team

    Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite and the results of the test checked under mesh refinement against the correct analytic result. For each of the tests presented in this document the test setup, derivation of the analytic solution, and comparison of the code results to the analytic solution is provided. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems.

  15. Application of LASCA imaging for detection of disorders of blood microcirculation in chicken embryo, infected by Chlamydia trachomatis

    NASA Astrophysics Data System (ADS)

    Ulianova, Onega; Subbotina, Irina; Filonova, Nadezhda; Zaitsev, Sergey; Saltykov, Yury; Polyanina, Tatiana; Lyapina, Anna; Ulyanov, Sergey; Larionova, Olga; Feodorova, Valentina

    2018-04-01

    The methods of t-LASCA and s-LASCA imaging have been adapted for the first time to the problem of monitoring blood microcirculation in the chicken embryo model. A set-up for LASCA imaging of the chicken embryo was assembled. Disorders of blood microcirculation in an embryonated chicken egg infected by Chlamydia trachomatis were detected. The speckle-imaging technique is compared with white-light ovoscopy and with a new method of laser ovoscopy based on the scattering of coherent light; the advantages of LASCA imaging for early detection of the developing chlamydial agent are demonstrated.
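
    The core computation behind s-LASCA is the local speckle contrast K = sigma/mean over a sliding window; lower contrast indicates stronger blurring by moving blood. A minimal Python sketch (the window size is an assumption) is:

      import numpy as np
      from scipy.ndimage import uniform_filter

      def spatial_speckle_contrast(image, window=7):
          """s-LASCA: local speckle contrast K = sigma/mean over a sliding window.
          Lower K indicates more motion (stronger speckle blurring); t-LASCA applies
          the same statistic over a stack of frames instead of a spatial window."""
          img = image.astype(float)
          mean = uniform_filter(img, size=window)
          mean_sq = uniform_filter(img * img, size=window)
          var = np.clip(mean_sq - mean * mean, 0.0, None)
          return np.sqrt(var) / (mean + 1e-12)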

  16. Preparing images for publication: part 2.

    PubMed

    Bengel, Wolfgang; Devigus, Alessandro

    2006-08-01

    The transition from conventional to digital photography presents many advantages for authors and photographers in the field of dentistry, but also many complexities and potential problems. No uniform procedures for authors and publishers exist at present for producing high-quality dental photographs. This two-part article aims to provide guidelines for preparing images for publication and improving communication between these two parties. Part 1 provided information about basic color principles, factors that can affect color perception, and digital color management. Part 2 describes the camera setup, discusses how to take a photograph suitable for publication, and outlines steps for the image editing process.

  17. How to design a cartographic continuum to help users to navigate between two topographic styles?

    NASA Astrophysics Data System (ADS)

    Ory, Jérémie; Touya, Guillaume; Hoarau, Charlotte; Christophe, Sidonie

    2018-05-01

    Geoportals and geovisualization tools provide users with various cartographic abstractions that describe a geographical space differently. Our purpose is to be able to design cartographic continuums, i.e. a set of in-between maps allowing users to navigate between two topographic styles. This paper addresses the problem of the interpolation between two topographic abstractions with different styles. We detail our approach in two steps. Firstly, we set up a comparison in order to identify which structural elements of a cartographic abstraction should be interpolated. Secondly, we propose an approach based on two design methods for map interpolation.

  18. A proposed atom interferometry determination of G at 10^-5 using a cold atomic fountain

    NASA Astrophysics Data System (ADS)

    Rosi, G.

    2018-02-01

    In precision metrology, the determination of the Newtonian gravity constant G represents a real problem, since its history is plagued by huge unknown discrepancies between a large number of independent experiments. In this paper, we propose a novel experimental setup for measuring G with a relative accuracy of 10^-5, using a standard cold atomic fountain and matter wave interferometry. We discuss in detail the major sources of systematic errors, and provide the expected statistical uncertainty. The feasibility of determining G at the 10^-6 level is also discussed.

  19. Linking mathematics with engineering applications at an early stage - implementation, experimental set-up and evaluation of a pilot project

    NASA Astrophysics Data System (ADS)

    Rooch, Aeneas; Junker, Philipp; Härterich, Jörg; Hackl, Klaus

    2016-03-01

    Too difficult, too abstract, too theoretical - many first-year engineering students complain about their mathematics courses. The project MathePraxis aims to resolve this disaffection. It links mathematical methods as they are taught in the first semesters with practical problems from engineering applications - and thereby shall give first-year engineering students a vivid and convincing impression of where they will need mathematics in their later working life. But since real applications usually require more than basic mathematics and first-year engineering students typically are not experienced with construction, mensuration and the use of engineering software, such an approach is hard to realise. In this article, we show that it is possible. We report on the implementation of MathePraxis at Ruhr-Universität Bochum. We describe the set-up and the implementation of a course on designing a mass damper which combines basic mathematical techniques with an impressive experiment. In an accompanying evaluation, we have examined the students' motivation relating to mathematics. This opens up new perspectives how to address the need for a more practically oriented mathematical education in engineering sciences.

  20. Hand hygiene: Back to the basics of infection control

    PubMed Central

    Mathur, Purva

    2011-01-01

    Health care associated infections are drawing increasing attention from patients, insurers, governments and regulatory bodies. This is not only because of the magnitude of the problem in terms of the associated morbidity, mortality and cost of treatment, but also due to the growing recognition that most of these are preventable. The medical community is witnessing in tandem unprecedented advancements in the understanding of pathophysiology of infectious diseases and the global spread of multi-drug resistant infections in health care set-ups. These factors, compounded by the paucity of availability of new antimicrobials have necessitated a re-look into the role of basic practices of infection prevention in modern day health care. There is now undisputed evidence that strict adherence to hand hygiene reduces the risk of cross-transmission of infections. With “Clean Care is Safer Care” as a prime agenda of the global initiative of WHO on patient safety programmes, it is time for developing countries to formulate the much-needed policies for implementation of basic infection prevention practices in health care set-ups. This review focuses on one of the simplest, low cost but least accepted from infection prevention: hand hygiene. PMID:22199099

  1. Analysis of Invasion Dynamics of Matrix-Embedded Cells in a Multisample Format.

    PubMed

    Van Troys, Marleen; Masuzzo, Paola; Huyck, Lynn; Bakkali, Karima; Waterschoot, Davy; Martens, Lennart; Ampe, Christophe

    2018-01-01

    In vitro tests of cancer cell invasion are the "first line" tools of preclinical researchers for screening the multitude of chemical compounds or cell perturbations that may aid in halting or treating cancer malignancy. In order to have predictive value or to contribute to designing personalized treatment regimes, these tests need to take into account the cancer cell environment and measure effects on invasion in sufficient detail. The in vitro invasion assays presented here are a trade-off between feasibility in a multisample format and mimicking the complexity of the tumor microenvironment. They allow testing multiple samples and conditions in parallel using 3D-matrix-embedded cells and deal with the heterogeneous behavior of an invading cell population in time. We describe the steps to take, the technical problems to tackle and useful software tools for the entire workflow: from the experimental setup to the quantification of the invasive capacity of the cells. The protocol is intended to guide researchers to standardize experimental set-ups and to annotate their invasion experiments in sufficient detail. In addition, it provides options for image processing and a solution for storage, visualization, quantitative analysis, and multisample comparison of acquired cell invasion data.

  2. Two-mode bosonic quantum metrology with number fluctuations

    NASA Astrophysics Data System (ADS)

    De Pasquale, Antonella; Facchi, Paolo; Florio, Giuseppe; Giovannetti, Vittorio; Matsuoka, Koji; Yuasa, Kazuya

    2015-10-01

    We search for the optimal quantum pure states of identical bosonic particles for applications in quantum metrology, in particular, in the estimation of a single parameter for the generic two-mode interferometric setup. We consider the general case in which the total number of particles is fluctuating around an average N with variance ΔN². By recasting the problem in the framework of classical probability, we clarify the maximal accuracy attainable and show that it is always larger than the one reachable with a fixed number of particles (i.e., ΔN = 0). In particular, for larger fluctuations, the error in the estimation diminishes proportionally to 1/ΔN, below the Heisenberg-like scaling 1/N. We also clarify the best input state, which is a quasi-NOON state for a generic setup and, for some special cases, a two-mode Schrödinger-cat state with a vacuum component. In addition, we search for the best state within the class of pure Gaussian states with a given average N, which is revealed to be a product state (with no entanglement) with a squeezed vacuum in one mode and the vacuum in the other.

  3. Model of head-neck joint fast movements in the frontal plane.

    PubMed

    Pedrocchi, A; Ferrigno, G

    2004-06-01

    The objective of this work is to develop a model representing the physiological systems driving fast head movements in frontal plane. All the contributions occurring mechanically in the head movement are considered: damping, stiffness, physiological limit of range of motion, gravitational field, and muscular torques due to voluntary activation as well as to stretch reflex depending on fusal afferences. Model parameters are partly derived from the literature, when possible, whereas undetermined block parameters are determined by optimising the model output, fitting to real kinematics data acquired by a motion capture system in specific experimental set-ups. The optimisation for parameter identification is performed by genetic algorithms. Results show that the model represents very well fast head movements in the whole range of inclination in the frontal plane. Such a model could be proposed as a tool for transforming kinematics data on head movements in 'neural equivalent data', especially for assessing head control disease and properly planning the rehabilitation process. In addition, the use of genetic algorithms seems to fit well the problem of parameter identification, allowing for the use of a very simple experimental set-up and granting model robustness.

  4. Hyperbolic Positioning with Antenna Arrays and Multi-Channel Pseudolite for Indoor Localization

    PubMed Central

    Fujii, Kenjirou; Sakamoto, Yoshihiro; Wang, Wei; Arie, Hiroaki; Schmitz, Alexander; Sugano, Shigeki

    2015-01-01

    A hyperbolic positioning method with antenna arrays consisting of proximately-located antennas and a multi-channel pseudolite is proposed in order to overcome the problems of indoor positioning with conventional pseudolites (ground-based GPS transmitters). A two-dimensional positioning experiment using actual devices is conducted. The experimental result shows that the positioning accuracy varies centimeter- to meter-level according to the geometric relation between the pseudolite antennas and the receiver. It also shows that the bias error of the carrier-phase difference observables is more serious than their random error. Based on the size of the bias error of carrier-phase difference that is inverse-calculated from the experimental result, three-dimensional positioning performance is evaluated by computer simulation. In addition, in the three-dimensional positioning scenario, an initial value convergence analysis of the non-linear least squares is conducted. Its result shows that initial values that can converge to a right position exist at least under the proposed antenna setup. The simulated values and evaluation methods introduced in this work can be applied to various antenna setups; therefore, by using them, positioning performance can be predicted in advance of installing an actual system. PMID:26437405
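
    The hyperbolic (range-difference) positioning and its sensitivity to the initial value can be reproduced with a small nonlinear least-squares sketch; the antenna coordinates, measurements and initial guess below are hypothetical placeholders.

      import numpy as np
      from scipy.optimize import least_squares

      # Hypothetical transmit-antenna positions of the multi-channel pseudolite [m]
      antennas = np.array([[0.0, 0.0, 2.5],
                           [4.0, 0.0, 2.5],
                           [4.0, 3.0, 2.5],
                           [0.0, 3.0, 2.5]])

      def residuals(p, antennas, range_diffs):
          """Range differences are taken between antenna i (i >= 1) and antenna 0."""
          d = np.linalg.norm(antennas - p, axis=1)
          return (d[1:] - d[0]) - range_diffs

      # Noise-free synthetic measurements generated from an assumed true position
      true_pos = np.array([1.0, 1.0, 0.8])
      d = np.linalg.norm(antennas - true_pos, axis=1)
      range_diffs = d[1:] - d[0]

      sol = least_squares(residuals, x0=[2.0, 1.5, 1.0], args=(antennas, range_diffs))
      print("estimated position:", sol.x)   # converges to true_pos for a good initial value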

  5. Use of an AC induction motor system for producing finger movements in human subjects.

    PubMed

    Proudlock, F A; Scott, J J

    1998-12-01

    This report describes the set-up and evaluation of a novel system for producing precise finger movements, for tests of movement perception. The specifications were to construct a system using commercially available components that were easy to use but which offered both flexibility and also high precision control. The system was constructed around an industrial AC induction motor with an optical encoder, controlled by an AC servo digital control module that could be programmed using a simple, high-level language. This set-up fulfilled the requirements regarding position and velocity control for a range of movements and also the facility for the subject to move the joint voluntarily while still attached to the motor. However a number of problems were encountered, the most serious being the level of vibration and the inability to vary the torque during movements. The vibration was reduced to the point where it did not affect the subject, by the introduction of mechanical dampening using an anti-vibration coupling and a pneumatic splint. The torque control could not be modified during rotation and so the system could only be operated using constant torque for any given movement.

  6. Visual communications with side information via distributed printing channels: extended multimedia and security perspectives

    NASA Astrophysics Data System (ADS)

    Voloshynovskiy, Sviatoslav V.; Koval, Oleksiy; Deguillaume, Frederic; Pun, Thierry

    2004-06-01

    In this paper we address visual communications via printing channels from an information-theoretic point of view as communications with side information. The solution to this problem addresses important aspects of multimedia data processing, security and management, since printed documents are still the most common form of visual information representation. Two practical approaches to side information communications for printed documents are analyzed in the paper. The first approach represents a layered joint source-channel coding for printed documents. This approach is based on a self-embedding concept where information is first encoded assuming a Wyner-Ziv set-up and then embedded into the original data using a Gel'fand-Pinsker construction and taking into account properties of printing channels. The second approach is based on Wyner-Ziv and Berger-Flynn-Gray set-ups and assumes two separated communications channels where an appropriate distributed coding should be elaborated. The first printing channel is considered to be a direct visual channel for images ("analog" channel with degradations). The second "digital channel" with constrained capacity is considered to be an appropriate auxiliary channel. We demonstrate both theoretically and practically how one can benefit from this sort of "distributed paper communications".

  7. PsiQuaSP-A library for efficient computation of symmetric open quantum systems.

    PubMed

    Gegg, Michael; Richter, Marten

    2017-11-24

    In a recent publication we showed that permutation symmetry reduces the numerical complexity of Lindblad quantum master equations for identical multi-level systems from exponential to polynomial scaling. This is important for open system dynamics including realistic system-bath interactions and dephasing in, for instance, the Dicke model, multi-Λ system setups, etc. Here we present an object-oriented C++ library that allows one to set up and solve arbitrary quantum optical Lindblad master equations, especially those that are permutationally symmetric in the multi-level systems. PsiQuaSP (Permutation symmetry for identical Quantum Systems Package) uses the PETSc package for sparse linear algebra methods and differential equations as a basis. The aim of PsiQuaSP is to provide flexible, storage-efficient and scalable code while being as user friendly as possible. It is easily applied to many quantum optical or quantum information systems with more than one multi-level system. We first review the basics of the permutation symmetry for multi-level systems in quantum master equations. The application of PsiQuaSP to quantum dynamical problems is illustrated with several typical, simple examples of open quantum optical systems.

  8. Collider shot setup for Run 2 observations and suggestions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annala, J.; Joshel, B.

    1996-01-31

    This note is intended to provoke discussion on Collider Run II shot setup. We hope this is a start of activities that will converge on a functional description of what is needed for shot setups in Collider Run II. We will draw on observations of the present shot setup to raise questions and make suggestions for the next Collider run. It is assumed that the reader has some familiarity with Collider operational issues. Shot setup is defined to be the time between the end of a store and the time the Main Control Room declares colliding beams. This is the time between Tevatron clock events SCE and SCB. This definition does not consider the time experiments use to turn on their detectors. This analysis was suggested by David Finley. The operational scenarios for Run II will require higher levels of reliability and speed for shot setup. See Appendix I and II. For example, we estimate that a loss of 3 pb^-1/week (with 8 hour stores) will occur if shot setups take 90 minutes instead of 30 minutes. In other words: if you do 12 shots in one week and accept an added delay of one minute in each shot, you will lose more than 60 nb^-1 for that week alone (based on a normal shot setup of 30 minutes). These demands should lead us to be much more pedantic about all the factors that affect shot setups. Shot setup will be viewed as a distinct process that is composed of several inter-dependent 'components': procedures, hardware, controls, and sociology. These components don't directly align with the different Accelerator Division departments, but are topical groupings of the needed accelerator functions. Defining these components, and categorizing our suggestions within them, are part of the goal of this document. Of course, some suggestions span several of these components.

  9. Investigation of the effects of process and geometrical parameters on formability in tube hydroforming using a modular hydroforming tool

    NASA Astrophysics Data System (ADS)

    Joghan, Hamed Dardaei; Staupendahl, Daniel; Hassan, Hamad ul; Henke, Andreas; Keesser, Thorsten; Legat, Francois; Tekkaya, A. Erman

    2018-05-01

    Tube hydroforming is one of the most important manufacturing processes for the production of exhaust systems. Tube hydroforming allows generating parts with highly complex geometries with the forming accuracies needed in the automotive sector. This is possible due to the form-closed nature of the production process. One of the main cost drivers is tool manufacturing, which is expensive and time consuming, especially when forming large parts. To cope with the design trend of individuality, which is gaining more and more importance and leads to a high number of product variants, a new flexible tool design was developed. The designed tool offers high flexibility in manufacturing different shapes and geometries of tubes with just local alterations and relocation of tool segments. The tolerancing problems of state-of-the-art segmented tools are overcome by an innovative and flexible die holder design. The break-even point of this initially more expensive tool design is already reached when forming more than 4 different tube shapes. Together with an additionally designed rotary hydraulic tube feeding system, a highly adaptable forming setup is generated. To investigate the performance of the developed tool setup, a study on geometrical and process parameters during forming of a spherical dome was done. Austenitic stainless steel (grade 1.4301) tube with a diameter of 40 mm and a thickness of 1.5 mm was used for the investigations. The experimental analyses were supported by finite element simulations and statistical analyses. The results show that the flexible tool setup can efficiently be used to analyze the interaction of the inner pressure, friction, and the location of the spherical dome and demonstrate the high influence of the feeding rate on the formed part.

  10. Photon-HDF5: Open Data Format and Computational Tools for Timestamp-based Single-Molecule Experiments

    PubMed Central

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2017-01-01

    Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse, and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data is stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious, unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations make it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g. photon-timestamps, detectors, etc) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes possible extending the format in backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long term archival, avoiding the effort to document custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help defining future versions of Photon-HDF5. PMID:28649160
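
    Because HDF5 files are self-describing, a Photon-HDF5 file can be explored with generic tools such as h5py; the group and dataset names below follow the Photon-HDF5 layout documented at www.photon-hdf5.org but should be treated as assumptions and checked against the actual file.

      import h5py

      # Minimal exploration of a Photon-HDF5 (or any HDF5) file with h5py.
      with h5py.File("measurement.h5", "r") as f:
          # HDF5 is self-describing: list what the file actually contains
          f.visit(print)

          photon_data = f["photon_data"]
          timestamps = photon_data["timestamps"][:]                       # raw clock ticks
          unit = photon_data["timestamps_specs/timestamps_unit"][()]      # seconds per tick
          print(f"{timestamps.size} photons over {timestamps[-1] * unit:.2f} s")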

  11. Photon-HDF5: Open Data Format and Computational Tools for Timestamp-based Single-Molecule Experiments.

    PubMed

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-02-13

    Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse, and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data is stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious, unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations make it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g. photon-timestamps, detectors, etc) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes possible extending the format in backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long term archival, avoiding the effort to document custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help defining future versions of Photon-HDF5.

  12. Optical alignment procedure utilizing neural networks combined with Shack-Hartmann wavefront sensor

    NASA Astrophysics Data System (ADS)

    Adil, Fatime Zehra; Konukseven, Erhan İlhan; Balkan, Tuna; Adil, Ömer Faruk

    2017-05-01

    In the design of pilot helmets with night vision capability, to not limit or block the sight of the pilot, a transparent visor is used. The reflected image from the coated part of the visor must coincide with the physical human sight image seen through the nonreflecting regions of the visor. This makes the alignment of the visor halves critical. In essence, this is an alignment problem of two optical parts that are assembled together during the manufacturing process. The Shack-Hartmann wavefront sensor is commonly used for the determination of misalignments through wavefront measurements, which are quantified in terms of the Zernike polynomials. Although the Zernike polynomials provide very useful feedback about the misalignments, the corrective actions are basically ad hoc. This stems from the fact that there exists no easy inverse relation between the misalignment measurements and the physical causes of the misalignments. This study aims to construct this inverse relation by making use of the expressive power of neural networks in representing such complex relations. For this purpose, a neural network is designed and trained in MATLAB® on which types of misalignments result in which wavefront measurements, quantitatively given by Zernike polynomials. This way, manual and iterative alignment processes relying on trial and error are replaced by the trained guesses of a neural network, so the alignment process is reduced to applying counteractions based on the misalignment causes. Such training requires data containing misalignment and measurement sets in fine detail, which is hard to obtain manually on a physical setup. For that reason, the optical setup is completely modeled in Zemax® software, and Zernike polynomials are generated for misalignments applied in small steps. The performance of the neural network is tested on the actual physical setup and found promising.
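
    The inverse mapping described (Zernike coefficients to misalignment parameters) is an ordinary multi-output regression; a minimal sketch with scikit-learn on synthetic placeholder data (the real training set would come from the ray-tracing model) is:

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      # Placeholder training data: X holds Zernike coefficients, y the misalignment
      # parameters (tilts/decentres) that produced them. Shapes and values are hypothetical.
      rng = np.random.default_rng(0)
      y = rng.uniform(-1.0, 1.0, size=(2000, 4))            # 4 misalignment parameters
      A = rng.normal(size=(4, 15))                           # stand-in forward sensitivity
      X = y @ A + 0.01 * rng.normal(size=(2000, 15))         # 15 Zernike coefficients

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
      net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
      net.fit(X_tr, y_tr)
      print("held-out R^2:", net.score(X_te, y_te))

      # In use: feed measured Shack-Hartmann Zernike coefficients to net.predict()
      # to obtain the corrective adjustments directly, instead of iterating by hand.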

  13. Photon-HDF5: open data format and computational tools for timestamp-based single-molecule experiments

    NASA Astrophysics Data System (ADS)

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-02-01

    Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse, and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data is stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious, unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations make it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon- HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g. photon-timestamps, detectors, etc) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes possible extending the format in backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon- HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon- HDF5 allows sharing data in a format suitable for long term archival, avoiding the effort to document custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help defining future versions of Photon-HDF5.

  14. Effect of patient setup errors on simultaneously integrated boost head and neck IMRT treatment plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, Jeffrey V.; Keall, Paul J.; Wu Qiuwen

    2005-10-01

    Purpose: The purpose of this study is to determine dose delivery errors that could result from random and systematic setup errors for head-and-neck patients treated using the simultaneous integrated boost (SIB)-intensity-modulated radiation therapy (IMRT) technique. Methods and Materials: Twenty-four patients who participated in an intramural Phase I/II parotid-sparing IMRT dose-escalation protocol using the SIB treatment technique had their dose distributions reevaluated to assess the impact of random and systematic setup errors. The dosimetric effect of random setup error was simulated by convolving the two-dimensional fluence distribution of each beam with the random setup error probability density distribution. Random setup errors of σ = 1, 3, and 5 mm were simulated. Systematic setup errors were simulated by randomly shifting the patient isocenter along each of the three Cartesian axes, with each shift selected from a normal distribution. Systematic setup error distributions with Σ = 1.5 and 3.0 mm along each axis were simulated. Combined systematic and random setup errors were simulated for Σ = σ = 1.5 and 3.0 mm along each axis. For each dose calculation, the gross tumor volume (GTV) dose received by 98% of the volume (D98), clinical target volume (CTV) D90, nodes D90, cord D2, and parotid D50 and parotid mean dose were evaluated with respect to the plan used for treatment, both for the structure dose and for an effective planning target volume (PTV) with a 3-mm margin. Results: Simultaneous integrated boost-IMRT head-and-neck treatment plans were found to be less sensitive to random setup errors than to systematic setup errors. For random-only errors, errors exceeded 3% only when the random setup error σ exceeded 3 mm. Simulated systematic setup errors with Σ = 1.5 mm resulted in approximately 10% of plans having more than a 3% dose error, whereas Σ = 3.0 mm resulted in half of the plans having more than a 3% dose error and 28% with a 5% dose error. Combined random and systematic dose errors with Σ = σ = 3.0 mm resulted in more than 50% of plans having at least a 3% dose error and 38% of the plans having at least a 5% dose error. Evaluation with respect to a 3-mm expanded PTV reduced the observed dose deviations greater than 5% for the Σ = σ = 3.0 mm simulations to 5.4% of the plans simulated. Conclusions: Head-and-neck SIB-IMRT dosimetric accuracy would benefit from methods to reduce patient systematic setup errors. When GTV, CTV, or nodal volumes are used for dose evaluation, plans simulated including the effects of random and systematic errors deviate substantially from the nominal plan. The use of PTVs for dose evaluation in the nominal plan improves agreement with evaluated GTV, CTV, and nodal dose values under simulated setup errors. PTV concepts should be used for SIB-IMRT head-and-neck squamous cell carcinoma patients, although the size of the margins may be less than those used with three-dimensional conformal radiation therapy.
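
    For a Gaussian random setup error, the convolution of the fluence with the setup-error probability density reduces to a Gaussian blur; a minimal Python sketch (the pixel size and sigma values are placeholders) is:

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def blur_fluence(fluence, sigma_mm, pixel_mm):
          """Convolve a 2D fluence map with a Gaussian setup-error PDF (sigma in mm).
          This models the dose-averaging effect of random setup errors over many fractions."""
          return gaussian_filter(fluence, sigma=sigma_mm / pixel_mm)

      # Example: 1, 3 and 5 mm random error applied to a toy 10x10 cm fluence map (2.5 mm pixels)
      fluence = np.zeros((40, 40))
      fluence[10:30, 10:30] = 1.0
      blurred = {s: blur_fluence(fluence, s, pixel_mm=2.5) for s in (1.0, 3.0, 5.0)}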

  15. Feedforward operation of a lens setup for large defocus and astigmatism correction

    NASA Astrophysics Data System (ADS)

    Verstraete, Hans R. G. W.; Almasian, MItra; Pozzi, Paolo; Bilderbeek, Rolf; Kalkman, Jeroen; Faber, Dirk J.; Verhaegen, Michel

    2016-04-01

    In this manuscript, we present a lens setup for large defocus and astigmatism correction. A deformable defocus lens and two rotational cylindrical lenses are used to control the defocus and astigmatism. The setup is calibrated using a simple model that allows the calculation of the lens inputs so that a desired defocus and astigmatism are actuated on the eye. The setup is tested by determining the feedforward prediction error, imaging a resolution target, and removing introduced aberrations.

  16. Laser-induced transient grating setup with continuously tunable period

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vega-Flick, A.; Applied Physics Department, CINVESTAV-Unidad Mérida, Carretera Antigua a Progreso Km 6, Cordemex, Mérida, Yucatán 97310 Mexico; Eliason, J. K.

    2015-12-15

    We present a modification of the laser-induced transient grating setup enabling continuous tuning of the transient grating period. The fine control of the period is accomplished by varying the angle of the diffraction grating used to split excitation and probe beams. The setup has been tested by measuring dispersion of bulk and surface acoustic waves in both transmission and reflection geometries. The presented modification is fully compatible with optical heterodyne detection and can be easily implemented in any transient grating setup.

  17. Searching for memories, Sudoku, implicit check bits, and the iterative use of not-always-correct rapid neural computation.

    PubMed

    Hopfield, J J

    2008-05-01

    The algorithms that simple feedback neural circuits representing a brain area can rapidly carry out are often adequate to solve easy problems but for more difficult problems can return incorrect answers. A new excitatory-inhibitory circuit model of associative memory displays the common human problem of failing to rapidly find a memory when only a small clue is present. The memory model and a related computational network for solving Sudoku puzzles produce answers that contain implicit check bits in the representation of information across neurons, allowing a rapid evaluation of whether the putative answer is correct or incorrect through a computation related to visual pop-out. This fact may account for our strong psychological feeling of right or wrong when we retrieve a nominal memory from a minimal clue. This information allows more difficult computations or memory retrievals to be done in a serial fashion by using the fast but limited capabilities of a computational module multiple times. The mathematics of the excitatory-inhibitory circuits for associative memory and for Sudoku, both of which are understood in terms of energy or Lyapunov functions, is described in detail.
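    For readers unfamiliar with the energy-function (Lyapunov) view invoked above, the sketch below implements the textbook binary Hopfield associative memory, in which asynchronous updates never increase the network energy; it is only a didactic stand-in and not the excitatory-inhibitory circuit model analyzed in the paper.

```python
# Classic binary (+/-1) Hopfield associative memory, illustrating the energy/Lyapunov view.
import numpy as np

def train_hopfield(patterns):
    # Hebbian weight matrix with zero diagonal.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    return -0.5 * s @ W @ s

def recall(W, s, steps=200, seed=0):
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1   # asynchronous update; energy never increases
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]], dtype=float)
W = train_hopfield(patterns)
cue = np.array([1, -1, 1, -1, 1, -1, -1, -1], dtype=float)  # noisy clue
out = recall(W, cue)
print(out, energy(W, out))
```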

  18. Pragmatic setup for bioparticle responses by dielectrophoresis for resource limited environment application

    NASA Astrophysics Data System (ADS)

    Ali, Mohd Anuar Md; Yeop Majlis, Burhanuddin; Kayani, Aminuddin Ahmad

    2017-12-01

    Various dielectrophoretic responses of bioparticles, including cell-chaining, spinning, rotation, and clustering, are of high interest in the field because of their potential for biomedical and clinical applications. Numerous studies have produced these dielectrophoretic responses using sophisticated equipment; however, for applications in resource-limited environments, such as portable, sustainable, and environmentally friendly diagnostic tools, establishing a pragmatic setup built from standard, non-sophisticated, and low-cost equipment is an important task. Here we show the advantages of judicious design optimization of a tip microelectrode, together with the selection of the suspending medium and the optimization of the electric signal configuration, in establishing a setup that can produce the aforementioned dielectrophoretic responses with standard equipment, i.e., a pragmatic setup.

  19. Single-Camera Stereoscopy Setup to Visualize 3D Dusty Plasma Flows

    NASA Astrophysics Data System (ADS)

    Romero-Talamas, C. A.; Lemma, T.; Bates, E. M.; Birmingham, W. J.; Rivera, W. F.

    2016-10-01

    A setup to visualize and track individual particles in multi-layered dusty plasma flows is presented. The setup consists of a single camera with variable frame rate, and a pair of adjustable mirrors that project the same field of view from two different angles to the camera, allowing for three-dimensional tracking of particles. Flows are generated by inclining the plane in which the dust is levitated using a specially designed setup that allows for external motion control without compromising vacuum. Dust illumination is achieved with an optics arrangement that includes a Powell lens that creates a laser fan with adjustable thickness and with approximately constant intensity everywhere. Both the illumination and the stereoscopy setup allow for the camera to be placed at right angles with respect to the levitation plane, in preparation for magnetized dusty plasma experiments in which there will be no direct optical access to the levitation plane. Image data and analysis of unmagnetized dusty plasma flows acquired with this setup are presented.

  20. Noiseless amplification of weak coherent fields exploiting energy fluctuations of the field

    NASA Astrophysics Data System (ADS)

    Partanen, Mikko; Häyrynen, Teppo; Oksanen, Jani; Tulkki, Jukka

    2012-12-01

    Quantum optics dictates that amplification of a pure state by any linear deterministic amplifier always introduces noise in the signal and results in a mixed output state. However, it has recently been shown that noiseless amplification becomes possible if the requirement of a deterministic operation is relaxed. Here we propose and analyze a noiseless amplification scheme where the energy required to amplify the signal originates from the stochastic fluctuations in the field itself. In contrast to previous amplification setups, our setup shows that a signal can be amplified even if no energy is added to the signal from external sources. We investigate the relation between the amplification and its success rate as well as the statistics of the output states after successful and failed amplification processes. Furthermore, we also optimize the setup to find the maximum success rates in terms of the reflectivities of the beam splitters used in the setup and discuss the relation of our setup with the previous setups.

  1. A biolayer interferometry-based assay for rapid and highly sensitive detection of biowarfare agents.

    PubMed

    Mechaly, Adva; Cohen, Hila; Cohen, Ofer; Mazor, Ohad

    2016-08-01

    Biolayer interferometry (BLI) is an optical technique that uses fiber-optic biosensors for label-free real-time monitoring of protein-protein interactions. In this study, we coupled the advantages of the Octet Red BLI system (automation, fluidics-free, and on-line monitoring) with a signal enhancement step and developed a rapid and sensitive immunological-based method for detection of biowarfare agents. As a proof of concept, we chose to demonstrate the efficacy of this novel assay for the detection of agents representing two classes of biothreats, proteinaceous toxins and bacterial pathogens: ricin, a lethal plant toxin, and the gram-negative bacterium Francisella tularensis, the causative agent of tularemia. The assay setup consisted of biotinylated antibodies immobilized to the biosensor coupled with alkaline phosphatase-labeled antibodies as the detection moiety to create nonsoluble substrate crystals that precipitate on the sensor surface, thereby inducing a significant wavelength interference. It was found that this BLI-based assay enables sensitive detection of these pathogens (detection limits of 10 pg/ml and 1 × 10⁴ pfu/ml for ricin and F. tularensis, respectively) within a very short time frame (17 min). Owing to its simplicity, this assay can be easily adapted to detect other analytes in general, and biowarfare agents in particular, in a rapid and sensitive manner. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Application of Natural Isotopic Abundance ¹H-¹³C- and ¹H-¹⁵N-Correlated Two-Dimensional NMR for Evaluation of the Structure of Protein Therapeutics.

    PubMed

    Arbogast, Luke W; Brinson, Robert G; Marino, John P

    2016-01-01

    Methods for characterizing the higher-order structure of protein therapeutics are in great demand for establishing consistency in drug manufacturing, for detecting drug product variations resulting from modifications in the manufacturing process, and for comparing a biosimilar to an innovator reference product. In principle, solution NMR can provide a robust approach for characterization of the conformation(s) of protein therapeutics in formulation at atomic resolution. However, molecular weight limitations and the perceived need for stable isotope labeling have to date limited its practical applications in the biopharmaceutical industry. Advances in NMR magnet and console technologies, cryogenically cooled probes, and new rapid acquisition methodologies, particularly selective optimized flip-angle short transient pulse schemes and nonuniform sampling, have greatly ameliorated these limitations. Here, we describe experimental methods for the collection and analysis of 2D ¹H(N)-¹⁵N-amide- and ¹H-¹³C-methyl-correlated spectra applied to protein drug products at natural isotopic abundance, including representatives from the rapidly growing class of monoclonal antibody (mAb) therapeutics. Practical aspects of experimental setup and data acquisition for both standard and rapid acquisition NMR techniques are described. Furthermore, strategies for the statistical comparison of 2D ¹H(N)-¹⁵N-amide- and ¹H-¹³C-methyl-correlated spectra are detailed. 2016 Published by Elsevier Inc.

  3. Sparsity-promoting and edge-preserving maximum a posteriori estimators in non-parametric Bayesian inverse problems

    NASA Astrophysics Data System (ADS)

    Agapiou, Sergios; Burger, Martin; Dashti, Masoumeh; Helin, Tapio

    2018-04-01

    We consider the inverse problem of recovering an unknown functional parameter u in a separable Banach space, from a noisy observation vector y of its image through a known possibly non-linear map 𝒢. We adopt a Bayesian approach to the problem and consider Besov space priors (see Lassas et al (2009 Inverse Problems Imaging 3 87-122)), which are well-known for their edge-preserving and sparsity-promoting properties and have recently attracted wide attention especially in the medical imaging community. Our key result is to show that in this non-parametric setup the maximum a posteriori (MAP) estimates are characterized by the minimizers of a generalized Onsager-Machlup functional of the posterior. This is done independently for the so-called weak and strong MAP estimates, which as we show coincide in our context. In addition, we prove a form of weak consistency for the MAP estimators in the infinitely informative data limit. Our results are remarkable for two reasons: first, the prior distribution is non-Gaussian and does not meet the smoothness conditions required in previous research on non-parametric MAP estimates. Second, the result analytically justifies existing uses of the MAP estimate in finite but high dimensional discretizations of Bayesian inverse problems with the considered Besov priors.
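    Schematically, and under the simplifying assumption of Gaussian observational noise with covariance Γ (the paper treats the general non-parametric setting rigorously, with precise constants and function-space definitions), the MAP characterization described above identifies the estimator with the minimizer of a Tikhonov-type functional whose penalty is the sparsity-promoting Besov norm:

```latex
% Illustrative form only; notation follows the abstract, constants omitted.
u_{\mathrm{MAP}} \;\in\; \arg\min_{u}\;
  \tfrac{1}{2}\,\bigl\|\Gamma^{-1/2}\bigl(y - \mathcal{G}(u)\bigr)\bigr\|^{2}
  \;+\; \|u\|_{B^{s}_{1,1}} .
```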

  4. Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time.

    PubMed

    Avellar, Gustavo S C; Pereira, Guilherme A S; Pimenta, Luciano C A; Iscold, Paulo

    2015-11-02

    This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem's (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles' maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs.
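    A deliberately simplified mixed-integer model in the spirit of the approach described above (not the paper's formulation) is sketched below: coverage lanes are assigned to UAVs, the solver decides how many UAVs to launch, and the objective is the mission makespan including a per-UAV setup (preparation and launch) time; the lane times, endurance, setup time, and the use of the PuLP package are all assumptions of this sketch.

```python
# Simplified illustrative MILP: assign coverage lanes to UAVs, choose how many UAVs to use,
# and minimize the makespan including a per-UAV setup time. Requires the PuLP package.
import pulp

lane_time = [6.0, 4.0, 5.0, 3.0, 7.0]   # flight time to cover each lane (min), illustrative
setup_time = 5.0                         # time to prepare and launch one UAV (min)
max_uavs = 3
max_flight_time = 15.0                   # endurance per UAV (min)

prob = pulp.LpProblem("uav_coverage", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", (range(len(lane_time)), range(max_uavs)), cat="Binary")
use = pulp.LpVariable.dicts("use", range(max_uavs), cat="Binary")
makespan = pulp.LpVariable("makespan", lowBound=0)

prob += makespan                                              # objective: minimize makespan
for i in range(len(lane_time)):                               # every lane covered exactly once
    prob += pulp.lpSum(x[i][k] for k in range(max_uavs)) == 1
for k in range(max_uavs):
    load = pulp.lpSum(lane_time[i] * x[i][k] for i in range(len(lane_time)))
    prob += load <= max_flight_time * use[k]                  # endurance, and link x to use
    prob += load + setup_time * use[k] <= makespan            # setup counts toward completion

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("UAVs used:", sum(int(use[k].value()) for k in range(max_uavs)),
      "makespan (min):", makespan.value())
```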

  5. Resolving Rapid Variation in Energy for Particle Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haut, Terry Scot; Ahrens, Cory Douglas; Jonko, Alexandra

    2016-08-23

    Resolving the rapid variation in energy in neutron and thermal radiation transport is needed for the predictive simulation capability in high-energy density physics applications. Energy variation is difficult to resolve due to rapid variations in cross sections and opacities caused by quantized energy levels in the nuclei and electron clouds. In recent work, we have developed a new technique to simultaneously capture slow and rapid variations in the opacities and the solution using homogenization theory, which is similar to multiband (MB) and to the finite-element with discontiguous support (FEDS) method, but does not require closure information. We demonstrated the accuracy and efficiency of the method for a variety of problems. We are researching how to extend the method to problems with multiple materials and the same material but with different temperatures and densities. In this highlight, we briefly describe homogenization theory and some results.

  6. Can We Advance Macroscopic Quantum Systems Outside the Framework of Complex Decoherence Theory?

    PubMed Central

    Brezinski, Mark E; Rupnick, Maria

    2016-01-01

    Macroscopic quantum systems (MQS) are macroscopic systems driven by quantum rather than classical mechanics, a long-studied area with minimal success until recently. Harnessing the benefits of quantum mechanics on a macroscopic level would revolutionize fields ranging from telecommunication to biology, the latter being the focus here for the reasons discussed. Contrary to misconceptions, there are no known physical laws that prevent the development of MQS. Instead, quantum behaviour is generally believed to be universally lost in complex systems through environmental entanglements (decoherence). But we argue that MQS are achievable if decoherence compensation is developed, naturally or artificially, from top-down rather than current reductionist approaches. This paper advances the MQS field by taking a complex systems approach to decoherence. First, we discuss why complex-system (top-down) approaches to decoherence are needed. Specifically, complex adaptive systems (CAS) are not amenable to reductionist models (and their master equations) because of emergent behaviour, approximation failures, failure to account for quantum compensatory mechanisms, neglect of path integrals, and the subentity problem. In addition, MQS must exist within the context of the classical world, where rapid decoherence and prolonged coherence are both needed. Nature has already demonstrated this for quantum subsystems such as photosynthesis and magnetoreception. Second, we perform a preliminary study that illustrates a top-down approach to potential MQS. In summary, reductionist arguments against MQS are not justifiable. It is more likely that MQS are not easily detectable in large intact classical systems or have been destroyed by reductionist experimental set-ups. This complex systems approach to decoherence, using top-down investigations, is critical to paradigm shifts in MQS research in both biological and non-biological systems. PMID:29200743

  7. Simulation of a shock tube with a small exit nozzle

    NASA Astrophysics Data System (ADS)

    Luan, Yigang; Olzmann, Matthias; Magagnato, Franco

    2018-02-01

    Shock tubes are frequently used to rapidly heat reaction mixtures in order to study chemical reaction mechanisms and kinetics in the field of combustion chemistry [1]. In the present work, the flow field inside a shock tube with a small nozzle in the end plate has been investigated to support the analysis of reacting chemical mixtures with an attached mass spectrometer and to clarify whether the usual assumptions for the flow field and the related thermodynamics are fulfilled. In particular, the details of the flow physics inside the tube and of the flow out of the nozzle in the end plate have been examined. Due to the large differences in the typical length scales and the large pressure ratios of this special device, a very strong numerical stiffness prevails during the simulation. Second-order Roe numerical schemes have been employed to simulate the flow field inside the shock tube. The simulations were performed with the commercial code ANSYS Fluent [2]. Axisymmetric boundary conditions are employed to reduce CPU time. A density-based transient scheme has been used and validated in terms of accuracy and efficiency. The simulation results for pressure and density are compared with analytical solutions. Numerical results show that a density-based numerical scheme performs better when dealing with shock-tube problems [5]. The flow field near the nozzle is studied in detail, and the effects of the nozzle on pressure and temperature variations inside the tube are investigated. The results show that this special shock-tube setup can be used to study high-temperature gas-phase chemical reactions with reasonable accuracy.

  8. Photoinduced Changes of Surface Topography in Amorphous, Liquid-Crystalline, and Crystalline Films of Bent-Core Azobenzene-Containing Substance.

    PubMed

    Bobrovsky, Alexey; Mochalov, Konstantin; Oleinikov, Vladimir; Solovyeva, Daria; Shibaev, Valery; Bogdanova, Yulia; Hamplová, Vĕra; Kašpar, Miroslav; Bubnov, Alexej

    2016-06-09

    Recently, photofluidization and mass-transfer effects have gained substantial interest because they enable photocontrolled manipulation of material structure and physicochemical properties. In this work, the surface topographies of amorphous, nematic, and crystalline films of an azobenzene-containing bent-core (banana-shaped) compound were studied using a special experimental setup combining polarizing optical microscopy and atomic force microscopy. Spin-coating or rapid cooling of the samples enabled the formation of glassy amorphous or nematic films of the substance. The effects of UV and visible-light irradiation on the surface roughness of the films were investigated. It was found that UV irradiation leads to the fast isothermal transition of the nematic and crystalline phases into the isotropic phase. This effect is associated with E-Z photoisomerization of the compound, accompanied by a decrease of the anisometry of the bent-core molecules. Focused polarized visible-light irradiation (457.9 nm) results in mass-transfer phenomena and induces the formation of so-called "craters" in amorphous and crystalline films of the substance. The observed photofluidization and mass-transfer processes allow glass-forming bent-core azobenzene-containing substances to be considered for the creation of promising materials with photocontrollable surface topographies. Such compounds are of principal importance for the solution of a broad range of problems related to the investigation of surface phenomena in colloid and physical chemistry, such as surface modification for chemical and catalytic reactions, predetermined morphology of surfaces and interfaces in soft matter, and chemical and biochemical sensing.

  9. Can We Advance Macroscopic Quantum Systems Outside the Framework of Complex Decoherence Theory?

    PubMed

    Brezinski, Mark E; Rupnick, Maria

    2014-07-01

    Macroscopic quantum systems (MQS) are macroscopic systems driven by quantum rather than classical mechanics, a long-studied area with minimal success until recently. Harnessing the benefits of quantum mechanics on a macroscopic level would revolutionize fields ranging from telecommunication to biology, the latter being the focus here for the reasons discussed. Contrary to misconceptions, there are no known physical laws that prevent the development of MQS. Instead, quantum behaviour is generally believed to be universally lost in complex systems through environmental entanglements (decoherence). But we argue that MQS are achievable if decoherence compensation is developed, naturally or artificially, from top-down rather than current reductionist approaches. This paper advances the MQS field by taking a complex systems approach to decoherence. First, we discuss why complex-system (top-down) approaches to decoherence are needed. Specifically, complex adaptive systems (CAS) are not amenable to reductionist models (and their master equations) because of emergent behaviour, approximation failures, failure to account for quantum compensatory mechanisms, neglect of path integrals, and the subentity problem. In addition, MQS must exist within the context of the classical world, where rapid decoherence and prolonged coherence are both needed. Nature has already demonstrated this for quantum subsystems such as photosynthesis and magnetoreception. Second, we perform a preliminary study that illustrates a top-down approach to potential MQS. In summary, reductionist arguments against MQS are not justifiable. It is more likely that MQS are not easily detectable in large intact classical systems or have been destroyed by reductionist experimental set-ups. This complex systems approach to decoherence, using top-down investigations, is critical to paradigm shifts in MQS research in both biological and non-biological systems.

  10. Fast inverse scattering solutions using the distorted Born iterative method and the multilevel fast multipole algorithm

    PubMed Central

    Hesford, Andrew J.; Chew, Weng C.

    2010-01-01

    The distorted Born iterative method (DBIM) computes iterative solutions to nonlinear inverse scattering problems through successive linear approximations. By decomposing the scattered field into a superposition of scattering by an inhomogeneous background and by a material perturbation, large or high-contrast variations in medium properties can be imaged through iterations that are each subject to the distorted Born approximation. However, the need to repeatedly compute forward solutions still imposes a very heavy computational burden. To ameliorate this problem, the multilevel fast multipole algorithm (MLFMA) has been applied as a forward solver within the DBIM. The MLFMA computes forward solutions in linear time for volumetric scatterers. The typically regular distribution and shape of scattering elements in the inverse scattering problem allow the method to take advantage of data redundancy and reduce the computational demands of the normally expensive MLFMA setup. Additional benefits are gained by employing Kaczmarz-like iterations, where partial measurements are used to accelerate convergence. Numerical results demonstrate both the efficiency of the forward solver and the successful application of the inverse method to imaging problems with dimensions in the neighborhood of ten wavelengths. PMID:20707438

  11. Using 3D Printing for Rapid Prototyping of Characterization Tools for Investigating Powder Blend Behavior.

    PubMed

    Hirschberg, Cosima; Boetker, Johan P; Rantanen, Jukka; Pein-Hackelbusch, Miriam

    2018-02-01

    There is an increasing need to provide more detailed insight into the behavior of particulate systems. Current powder characterization tools have been developed empirically, and in many cases modification of existing equipment is difficult. More flexible tools are needed to provide understanding of complex powder behavior, such as the mixing process and segregation phenomena. An approach based on fast prototyping of new powder handling geometries and interfacing solutions for process analytical tools is reported. This study utilized 3D printing for rapid prototyping of customized geometries; the overall goal was to assess the mixing process of powder blends at small scale with a combination of spectroscopic and mechanical monitoring. As part of the segregation evaluation studies, the flowability of three different paracetamol/filler blends at different ratios was investigated, inter alia to define the percolation thresholds. Blends with a paracetamol wt% above the percolation threshold were subsequently investigated in relation to their segregation behavior. Rapid prototyping using 3D printing allowed the design of two funnels with tailored flow behavior (funnel flow) of model formulations, which could be monitored with an in-line near-infrared (NIR) spectrometer. Calculating the root mean square (RMS) of the scores of the first two principal components of the NIR spectra visualized spectral variation as a function of process time. In the same setup, mechanical properties (basic flow energy) of the powder blend were monitored during blending. Rapid prototyping allowed for fast modification of powder testing geometries and easy interfacing with process analytical tools, opening new possibilities for more detailed powder characterization.
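    A minimal sketch of the spectral-variation metric mentioned above is given below: project the in-line NIR spectra onto their first two principal components and take the RMS of the scores for each spectrum; the data here are random placeholders and the use of scikit-learn's PCA is an assumption, not the paper's implementation.

```python
# Sketch: RMS of the first two principal-component scores as a spectral-variation metric.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
spectra = rng.random((120, 300))         # 120 in-line NIR spectra x 300 wavelength channels (placeholder)

scores = PCA(n_components=2).fit_transform(spectra)   # first two PC scores per spectrum
rms = np.sqrt(np.mean(scores**2, axis=1))             # one RMS value per time point
print(rms[:5])
```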

  12. Startup of RAPID-L Lunar Base Reactor by Lithium Release Module

    NASA Astrophysics Data System (ADS)

    Kambe, Mitsuru

    The RAPID-L concept, a 200 kWe uranium-nitride-fueled, lithium-cooled fast reactor to be combined with a thermoelectric power conversion system for a lunar base power system, is demonstrated. Unique approaches to reactivity control system design have been attempted in the RAPID-L concept. The reactor involves the following innovative reactivity control systems: Lithium Expansion Modules (LEM) for inherent reactivity feedback, Lithium Injection Modules (LIM) for inherent ultimate shutdown, and Lithium Release Modules (LRM) for automated reactor startup. All these systems adopt lithium-6 as a liquid poison instead of conventional B4C rods or Be reflectors. These systems are effective independently of the magnitude and direction of the gravitational force. In 2006, however, the following design amendments were made. 1) B4C poison rods were added to ensure criticality safety against unintended positive reactivity insertion by the LRMs due to fire in a launch-phase accident, because the LRM freeze seal melts at 800°C, which results in positive reactivity insertion. 2) A lower hot-standby temperature of 200°C was adopted instead of the conventional 800°C to reduce the external power required at startup. In this paper, the development of the LRM orifice, which dominates the startup transient of RAPID-L, is discussed. Attention was focused on how to achieve a sufficiently small flow rate of 6Li in the orifice, because this enables a moderate positive reactivity insertion rate. The LRM orifice performance has been confirmed using a 0.5 mm diameter SUS316 orifice/lithium flow test setup in a glove box.

  13. Comparison of six electromyography acquisition setups on hand movement classification tasks

    PubMed Central

    Pizzolato, Stefano; Tagliapietra, Luca; Cognolato, Matteo; Reggiani, Monica; Müller, Henning

    2017-01-01

    Hand prostheses controlled by surface electromyography are promising due to the non-invasive approach and the control capabilities offered by machine learning. Nevertheless, dexterous prostheses are still scarcely spread due to control difficulties, low robustness and often prohibitive costs. Several sEMG acquisition setups are now available, ranging in terms of costs between a few hundred and several thousand dollars. The objective of this paper is the relative comparison of six acquisition setups on an identical hand movement classification task, in order to help the researchers to choose the proper acquisition setup for their requirements. The acquisition setups are based on four different sEMG electrodes (including Otto Bock, Delsys Trigno, Cometa Wave + Dormo ECG and two Thalmic Myo armbands) and they were used to record more than 50 hand movements from intact subjects with a standardized acquisition protocol. The relative performance of the six sEMG acquisition setups is compared on 41 identical hand movements with a standardized feature extraction and data analysis pipeline aimed at performing hand movement classification. Comparable classification results are obtained with three acquisition setups including the Delsys Trigno, the Cometa Wave and the affordable setup composed of two Myo armbands. The results suggest that practical sEMG tests can be performed even when costs are relevant (e.g. in small laboratories, developing countries or use by children). All the presented datasets can be used for offline tests and their quality can easily be compared as the data sets are publicly available. PMID:29023548

  14. Comparison of six electromyography acquisition setups on hand movement classification tasks.

    PubMed

    Pizzolato, Stefano; Tagliapietra, Luca; Cognolato, Matteo; Reggiani, Monica; Müller, Henning; Atzori, Manfredo

    2017-01-01

    Hand prostheses controlled by surface electromyography are promising due to the non-invasive approach and the control capabilities offered by machine learning. Nevertheless, dexterous prostheses are still scarcely spread due to control difficulties, low robustness and often prohibitive costs. Several sEMG acquisition setups are now available, ranging in terms of costs between a few hundred and several thousand dollars. The objective of this paper is the relative comparison of six acquisition setups on an identical hand movement classification task, in order to help the researchers to choose the proper acquisition setup for their requirements. The acquisition setups are based on four different sEMG electrodes (including Otto Bock, Delsys Trigno, Cometa Wave + Dormo ECG and two Thalmic Myo armbands) and they were used to record more than 50 hand movements from intact subjects with a standardized acquisition protocol. The relative performance of the six sEMG acquisition setups is compared on 41 identical hand movements with a standardized feature extraction and data analysis pipeline aimed at performing hand movement classification. Comparable classification results are obtained with three acquisition setups including the Delsys Trigno, the Cometa Wave and the affordable setup composed of two Myo armbands. The results suggest that practical sEMG tests can be performed even when costs are relevant (e.g. in small laboratories, developing countries or use by children). All the presented datasets can be used for offline tests and their quality can easily be compared as the data sets are publicly available.

  15. Measuring uncertainty in dose delivered to the cochlea due to setup error during external beam treatment of patients with cancer of the head and neck.

    PubMed

    Yan, M; Lovelock, D; Hunt, M; Mechalakos, J; Hu, Y; Pham, H; Jackson, A

    2013-12-01

    To use Cone Beam CT scans obtained just prior to treatments of head and neck cancer patients to measure the setup error and cumulative dose uncertainty of the cochlea. Data from 10 head and neck patients with 10 planning CTs and 52 Cone Beam CTs taken at time of treatment were used in this study. Patients were treated with conventional fractionation using an IMRT dose painting technique, most with 33 fractions. Weekly radiographic imaging was used to correct the patient setup. The authors used rigid registration of the planning CT and Cone Beam CT scans to find the translational and rotational setup errors, and the spatial setup errors of the cochlea. The planning CT was rotated and translated such that the cochlea positions match those seen in the cone beam scans, cochlea doses were recalculated and fractional doses accumulated. Uncertainties in the positions and cumulative doses of the cochlea were calculated with and without setup adjustments from radiographic imaging. The mean setup error of the cochlea was 0.04 ± 0.33 or 0.06 ± 0.43 cm for RL, 0.09 ± 0.27 or 0.07 ± 0.48 cm for AP, and 0.00 ± 0.21 or -0.24 ± 0.45 cm for SI with and without radiographic imaging, respectively. Setup with radiographic imaging reduced the standard deviation of the setup error by roughly 1-2 mm. The uncertainty of the cochlea dose depends on the treatment plan and the relative positions of the cochlea and target volumes. Combining results for the left and right cochlea, the authors found the accumulated uncertainty of the cochlea dose per fraction was 4.82 (0.39-16.8) cGy, or 10.1 (0.8-32.4) cGy, with and without radiographic imaging, respectively; the percentage uncertainties relative to the planned doses were 4.32% (0.28%-9.06%) and 10.2% (0.7%-63.6%), respectively. Patient setup error introduces uncertainty in the position of the cochlea during radiation treatment. With the assistance of radiographic imaging during setup, the standard deviation of setup error reduced by 31%, 42%, and 54% in RL, AP, and SI direction, respectively, and consequently, the uncertainty of the mean dose to cochlea reduced more than 50%. The authors estimate that the effects of these uncertainties on the probability of hearing loss for an individual patient could be as large as 10%.

  16. Measuring uncertainty in dose delivered to the cochlea due to setup error during external beam treatment of patients with cancer of the head and neck

    PubMed Central

    Yan, M.; Lovelock, D.; Hunt, M.; Mechalakos, J.; Hu, Y.; Pham, H.; Jackson, A.

    2013-01-01

    Purpose: To use Cone Beam CT scans obtained just prior to treatments of head and neck cancer patients to measure the setup error and cumulative dose uncertainty of the cochlea. Methods: Data from 10 head and neck patients with 10 planning CTs and 52 Cone Beam CTs taken at time of treatment were used in this study. Patients were treated with conventional fractionation using an IMRT dose painting technique, most with 33 fractions. Weekly radiographic imaging was used to correct the patient setup. The authors used rigid registration of the planning CT and Cone Beam CT scans to find the translational and rotational setup errors, and the spatial setup errors of the cochlea. The planning CT was rotated and translated such that the cochlea positions match those seen in the cone beam scans, cochlea doses were recalculated and fractional doses accumulated. Uncertainties in the positions and cumulative doses of the cochlea were calculated with and without setup adjustments from radiographic imaging. Results: The mean setup error of the cochlea was 0.04 ± 0.33 or 0.06 ± 0.43 cm for RL, 0.09 ± 0.27 or 0.07 ± 0.48 cm for AP, and 0.00 ± 0.21 or −0.24 ± 0.45 cm for SI with and without radiographic imaging, respectively. Setup with radiographic imaging reduced the standard deviation of the setup error by roughly 1–2 mm. The uncertainty of the cochlea dose depends on the treatment plan and the relative positions of the cochlea and target volumes. Combining results for the left and right cochlea, the authors found the accumulated uncertainty of the cochlea dose per fraction was 4.82 (0.39–16.8) cGy, or 10.1 (0.8–32.4) cGy, with and without radiographic imaging, respectively; the percentage uncertainties relative to the planned doses were 4.32% (0.28%–9.06%) and 10.2% (0.7%–63.6%), respectively. Conclusions: Patient setup error introduces uncertainty in the position of the cochlea during radiation treatment. With the assistance of radiographic imaging during setup, the standard deviation of setup error reduced by 31%, 42%, and 54% in RL, AP, and SI direction, respectively, and consequently, the uncertainty of the mean dose to cochlea reduced more than 50%. The authors estimate that the effects of these uncertainties on the probability of hearing loss for an individual patient could be as large as 10%. PMID:24320510

  17. Review and Analysis of Peak Tracking Techniques for Fiber Bragg Grating Sensors

    PubMed Central

    2017-01-01

    Fiber Bragg Grating (FBG) sensors are among the most popular elements in fiber-optic sensor networks used for the direct measurement of temperature and strain. Modern FBG interrogation setups measure the FBG spectrum in real time and determine the shift of the Bragg wavelength of the FBG in order to estimate the physical parameters. The problem of determining the peak wavelength of the FBG from a spectral measurement limited in resolution and affected by noise is referred to as the peak-tracking problem. In this work, the main peak-tracking approaches are reviewed and classified, outlining their algorithmic implementations: methods based on direct estimation, interpolation, correlation, resampling, transforms, and optimization are discussed in all their proposed implementations. Then, a simulation based on coupled-mode theory compares the performance of the main peak-tracking methods in terms of accuracy and signal-to-noise-ratio resilience. PMID:29039804
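    As an illustration of the simplest estimators covered by such a review, the sketch below applies an intensity-weighted centroid and a three-point parabolic interpolation to a synthetic FBG reflection peak; the wavelength grid, bandwidth, and noise level are made-up values and the code is not taken from the paper.

```python
# Two simple peak-tracking estimators applied to a synthetic FBG reflection spectrum.
import numpy as np

wl = np.linspace(1549.0, 1551.0, 501)                 # wavelength grid (nm), illustrative
true_peak = 1550.12
rng = np.random.default_rng(2)
spectrum = np.exp(-((wl - true_peak) / 0.08) ** 2) + 0.01 * rng.normal(size=wl.size)

# Intensity-weighted centroid over the strongest part of the peak.
mask = spectrum > 0.5 * spectrum.max()
centroid = np.sum(wl[mask] * spectrum[mask]) / np.sum(spectrum[mask])

# Parabolic interpolation around the maximum sample (3-point fit).
i = int(np.argmax(spectrum))
y0, y1, y2 = spectrum[i - 1], spectrum[i], spectrum[i + 1]
delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)          # sub-sample offset in samples
parabolic = wl[i] + delta * (wl[1] - wl[0])

print(f"centroid: {centroid:.4f} nm, parabolic: {parabolic:.4f} nm, true: {true_peak} nm")
```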

  18. Opening the Pandora's box of quantum spinor fields

    NASA Astrophysics Data System (ADS)

    Bonora, L.; Silva, J. M. Hoff da; Rocha, R. da

    2018-02-01

    Lounesto's classification of spinors is a comprehensive and exhaustive algorithm that, based on the bilinear covariants, discloses the possibility of a large variety of spinors, comprising regular and singular spinors and their unexpected applications in physics, and including the cases of Dirac, Weyl, and Majorana as very particular spinor fields. In this paper we pose the problem of an analogous classification in the framework of second quantization. We first discuss in general the nature of the problem. Then we start the analysis of two basic bilinear covariants, the scalar and pseudoscalar, in the second quantized setup, with expressions applicable to the quantum field theory extended to all types of spinors. One can see that a wider set of possibilities opens up with respect to the classical case. A quantum reconstruction algorithm is also proposed. The Feynman propagator is extended for spinors in all classes.

  19. Equilibrium expert: an add-in to Microsoft Excel for multiple binding equilibrium simulations and parameter estimations.

    PubMed

    Raguin, Olivier; Gruaz-Guyon, Anne; Barbet, Jacques

    2002-11-01

    An add-in to Microsoft Excel was developed to simulate multiple binding equilibria. A partition function, readily written even when the equilibrium is complex, describes the experimental system. It involves the concentrations of the different free molecular species and of the different complexes present in the experiment. As a result, the software is not restricted to a series of predefined experimental setups but can handle a large variety of problems involving up to nine independent molecular species. Binding parameters are estimated by nonlinear least-squares fitting of experimental measurements supplied by the user. The fitting process allows user-defined weighting of the experimental data. The flexibility of the software and the way it may be used to describe common experimental situations and to deal with usual problems, such as tracer reactivity or nonspecific binding, is demonstrated by a few examples. The software is available free of charge upon request.
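    The general idea can be illustrated with a much smaller example than the add-in supports: the hedged sketch below simulates a single 1:1 ligand-receptor binding equilibrium and recovers the dissociation constant by nonlinear least squares with SciPy; the concentrations, noise level, and parameter names are illustrative assumptions, not part of the software described above.

```python
# Toy 1:1 binding equilibrium: simulate data, then fit Kd by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def bound_fraction(ligand_total, kd, receptor_total=1e-9):
    # Exact 1:1 equilibrium solution for the bound-receptor concentration.
    b = ligand_total + receptor_total + kd
    complex_conc = (b - np.sqrt(b**2 - 4 * ligand_total * receptor_total)) / 2
    return complex_conc / receptor_total

ligand = np.logspace(-10, -6, 12)                     # total ligand titration (M), illustrative
rng = np.random.default_rng(3)
data = bound_fraction(ligand, kd=5e-9) + 0.02 * rng.normal(size=ligand.size)

fit = least_squares(lambda p: bound_fraction(ligand, p[0]) - data,
                    x0=[1e-8], bounds=(1e-12, 1e-3))
print(f"estimated Kd = {fit.x[0]:.2e} M")
```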

  20. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  1. Quantum key distribution session with 16-dimensional photonic states.

    PubMed

    Etcheverry, S; Cañas, G; Gómez, E S; Nogueira, W A T; Saavedra, C; Xavier, G B; Lima, G

    2013-01-01

    The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, that can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations has been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD.

  2. A dynamic method to forecast the wheel slip for antilock braking system and its experimental evaluation.

    PubMed

    Oniz, Yesim; Kayacan, Erdal; Kaynak, Okyay

    2009-04-01

    The control of an antilock braking system (ABS) is a difficult problem due to its strongly nonlinear and uncertain characteristics. To overcome this difficulty, the integration of gray-system theory and sliding-mode control is proposed in this paper. This way, the prediction capabilities of the former and the robustness of the latter are combined to regulate optimal wheel slip depending on the vehicle forward velocity. The design approach described is novel, considering that a point, rather than a line, is used as the sliding control surface. The control algorithm is derived and subsequently tested on a quarter vehicle model. Encouraged by the simulation results indicating the ability to overcome the stated difficulties with fast convergence, experimental results are carried out on a laboratory setup. The results presented indicate the potential of the approach in handling difficult real-time control problems.

  3. Optimal directed searches for continuous gravitational waves

    NASA Astrophysics Data System (ADS)

    Ming, Jing; Krishnan, Badri; Papa, Maria Alessandra; Aulbert, Carsten; Fehrmann, Henning

    2016-03-01

    Wide parameter space searches for long-lived continuous gravitational wave signals are computationally limited. It is therefore critically important that the available computational resources are used rationally. In this paper we consider directed searches, i.e., targets for which the sky position is known accurately but the frequency and spin-down parameters are completely unknown. Given a list of such potential astrophysical targets, we therefore need to prioritize. On which target(s) should we spend scarce computing resources? What parameter space region in frequency and spin-down should we search through? Finally, what is the optimal search setup that we should use? In this paper we present a general framework that allows us to solve all three of these problems. This framework is based on maximizing the probability of making a detection subject to a constraint on the maximum available computational cost. We illustrate the method for a simplified problem.
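    In schematic form (notation illustrative, not the paper's), the framing described above is a constrained optimization: choose the targets, the frequency and spin-down region, and the search setup that maximize the detection probability subject to a fixed computing budget:

```latex
% Schematic only; the paper derives the optimal choices rigorously.
\max_{\text{targets},\,\Delta f,\,\Delta\dot f,\,\text{setup}}\; P_{\mathrm{det}}
\quad \text{subject to} \quad
C_{\mathrm{total}}\bigl(\text{targets},\Delta f,\Delta\dot f,\text{setup}\bigr) \le C_{\max}.
```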

  4. Laser-induced breakdown spectroscopy for detection of heavy metals in environmental samples

    NASA Astrophysics Data System (ADS)

    Wisbrun, Richard W.; Schechter, Israel; Niessner, Reinhard; Schroeder, Hartmut

    1993-03-01

    The application of LIBS technology as a sensor for heavy metals in solid environmental samples has been studied. This specific application introduces some new problems in the LIBS analysis. Some of them are related to the particular distribution of contaminants in the grained samples. Other problems are related to the mechanical properties of the samples and to general matrix effects, such as the water and organic fiber content of the sample. An attempt has been made to optimize the experimental set-up with respect to the various parameters involved. The understanding of these factors has enabled the adjustment of the technique to the substrates of interest. The special importance of the grain size and of the laser-induced aerosol production is pointed out. Calibration plots for the analysis of heavy metals in diverse sand and soil samples have been obtained. The detection limits are shown to be generally below the concentrations restricted by recent regulations.

  5. Quantum key distribution session with 16-dimensional photonic states

    NASA Astrophysics Data System (ADS)

    Etcheverry, S.; Cañas, G.; Gómez, E. S.; Nogueira, W. A. T.; Saavedra, C.; Xavier, G. B.; Lima, G.

    2013-07-01

    The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, that can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations has been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD.

  6. NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding

    PubMed Central

    Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo

    2016-01-01

    Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280

  7. A closer look at the probabilities of the notorious three prisoners.

    PubMed

    Falk, R

    1992-06-01

    The "problem of three prisoners", a counterintuitive teaser, is analyzed. It is representative of a class of probability puzzles where the correct solution depends on explication of underlying assumptions. Spontaneous beliefs concerning the problem and intuitive heuristics are reviewed. The psychological background of these beliefs is explored. Several attempts to find a simple criterion to predict whether and how the probability of the target event will change as a result of obtaining evidence are examined. However, despite the psychological appeal of these attempts, none proves to be valid in general. A necessary and sufficient condition for change in the probability of the target event, following observation of new data, is proposed. That criterion is an extension of the likelihood-ratio principle (which holds in the case of only two complementary alternatives) to any number of alternatives. Some didactic implications concerning the significance of the chance set-up and reliance on analogies are discussed.

  8. Quantum key distribution session with 16-dimensional photonic states

    PubMed Central

    Etcheverry, S.; Cañas, G.; Gómez, E. S.; Nogueira, W. A. T.; Saavedra, C.; Xavier, G. B.; Lima, G.

    2013-01-01

    The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, that can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations has been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD. PMID:23897033

  9. Hybrid Self-Adaptive Evolution Strategies Guided by Neighborhood Structures for Combinatorial Optimization Problems.

    PubMed

    Coelho, V N; Coelho, I M; Souza, M J F; Oliveira, T A; Cota, L P; Haddad, M N; Mladenovic, N; Silva, R C P; Guimarães, F G

    2016-01-01

    This article presents an Evolution Strategy (ES)-based algorithm designed to self-adapt its mutation operators, guiding the search through the solution space using a Self-Adaptive Reduced Variable Neighborhood Search procedure. Given the specific local search operators for each individual, the proposed population-based approach also fits into the context of Memetic Algorithms. The proposed variant uses the Greedy Randomized Adaptive Search Procedure with different greedy parameters for generating its initial population, providing an interesting exploration-exploitation balance. To validate the proposal, this framework is applied to solve three different NP-hard combinatorial optimization problems: an Open-Pit-Mining Operational Planning Problem with dynamic allocation of trucks, an Unrelated Parallel Machine Scheduling Problem with Setup Times, and the calibration of a hybrid fuzzy model for Short-Term Load Forecasting. Computational results point out the convergence of the proposed model and highlight its ability to combine move operations from distinct neighborhood structures along the optimization. The results gathered and reported in this article represent collective evidence of the performance of the method on challenging combinatorial optimization problems from different application domains. The proposed evolution strategy demonstrates an ability to adapt the strength of the mutation disturbance across the generations of its evolution process. The effectiveness of the proposal motivates the application of this novel evolutionary framework for solving other combinatorial optimization problems.
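    The self-adaptation idea at the heart of the method can be illustrated with a minimal continuous-domain (1+1)-ES using log-normal step-size self-adaptation, sketched below; this is only a didactic stand-in, since the article's algorithm is population based, hybridized with a reduced variable neighborhood search, and applied to combinatorial problems. The objective function, learning rate, and iteration budget are illustrative choices.

```python
# Minimal (1+1)-ES with log-normal step-size self-adaptation on a toy objective.
import numpy as np

def sphere(x):
    return float(np.sum(x**2))

rng = np.random.default_rng(4)
x, sigma = rng.normal(size=5), 1.0
tau = 1.0 / np.sqrt(2 * len(x))                       # learning rate for the step size
best = sphere(x)
for _ in range(2000):
    sigma_child = sigma * np.exp(tau * rng.normal()) # mutate the strategy parameter first
    child = x + sigma_child * rng.normal(size=x.size)
    if sphere(child) <= best:                        # greedy (1+1) replacement
        x, sigma, best = child, sigma_child, sphere(child)
print(f"best objective: {best:.3e}, final step size: {sigma:.3e}")
```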

  10. Noise Source Identification and Dynamic Modeling of a Pneumatic Nailing Device

    NASA Astrophysics Data System (ADS)

    Nili Ahmadabadi, Zahra

    Exposure to hazardous noise levels emitted by pneumatic nailing devices contributes significantly to risk of hearing damage among the construction workers throughout the world. This health problem comes from the lack of appropriate technology such as low noise devices which in turn results from the lack of scientific knowledge about designing reduced noise devices. This study contributes to the design improvement of pneumatic nailing devices through identifying the noise sources and developing the simulation tool required to redesign the pneumatic nailing device. To identify the noise sources, the study uses a combination of two complementary experimental approaches. The first makes use of time-synchronized data analysis of several variables during the machine operation. This strategy allows identifying the physical processes and provides a detailed separation of the noise generation mechanisms in successive time sequences. However, since multiple noise sources radiate at the same time, this observation approach is not sufficient for noise source identification and ranking. Thus, it is completed by a selective wrapping and muffler procedure. This technique provides overall generated noise associated with each process, as well as ranking of the three major sources: (1) exhaust noise, (2) machine body vibrations, and (3) workpiece vibrations. A special investigation is conducted on this third one with two cases: a workpiece/worktable setup representative of the actual field usage of a nailing device and a workpiece/sandbox setup used in a standardized laboratory test. The study evaluates the efficiency of the workpiece/sandbox setup in reducing the workpiece radiation and obtains a typical workpiece contribution on an actual worksite. To provide a simulation tool, a dynamic model of the pneumatic nailing device needs to be developed. Dynamic modeling of the nailing device requires mathematical modeling of the physical processes involved in its operation. All of these processes can be described through already existing mathematical relations, except for the penetration resistance force (PRF) imposed on the nails when penetrating the wood. The PRF depends on various factors. This study follows two approaches in parallel to develop an empirical prediction law for the PRF: quasi-static and high-speed. The quasi-static approach provides a rapid and precise representation of the law at quasistatic penetration velocities. The law covers the entire displacement range, various nail geometries and sizes, and wood types. The high-speed approach aims to provide a law which covers a much wider range of penetration velocities. The approach is complicated since it requires a sophisticated test machine to conduct the nail driving tests at high penetration velocities. The study designs and fabricates an advanced test machine to later extend the prediction range of the PRF law. The last part of this study develops the dynamic model of a nail gun while integrating the quasi-static PRF law. The model includes dynamics of all the air chambers and the moving parts, and interactions and impacts/contacts between different parts. The study integrates a comprehensive experimental validation of the model. Future improvements in the dynamic model precision will be possible by using the extended version of the PRF law.

  11. Magnetic MIMO Signal Processing and Optimization for Wireless Power Transfer

    NASA Astrophysics Data System (ADS)

    Yang, Gang; Moghadam, Mohammad R. Vedady; Zhang, Rui

    2017-06-01

    In magnetic resonant coupling (MRC) enabled multiple-input multiple-output (MIMO) wireless power transfer (WPT) systems, multiple transmitters (TXs) each with one single coil are used to enhance the efficiency of simultaneous power transfer to multiple single-coil receivers (RXs) by constructively combining their induced magnetic fields at the RXs, a technique termed "magnetic beamforming". In this paper, we study the optimal magnetic beamforming design in a multi-user MIMO MRC-WPT system. We introduce the multi-user power region that constitutes all the achievable power tuples for all RXs, subject to the given total power constraint over all TXs as well as their individual peak voltage and current constraints. We characterize each boundary point of the power region by maximizing the sum-power deliverable to all RXs subject to their minimum harvested power constraints. For the special case without the TX peak voltage and current constraints, we derive the optimal TX current allocation for the single-RX setup in closed-form as well as that for the multi-RX setup. In general, the problem is a non-convex quadratically constrained quadratic programming (QCQP), which is difficult to solve. For the case of one single RX, we show that the semidefinite relaxation (SDR) of the problem is tight. For the general case with multiple RXs, based on SDR we obtain two approximate solutions by applying time-sharing and randomization, respectively. Moreover, for practical implementation of magnetic beamforming, we propose a novel signal processing method to estimate the magnetic MIMO channel due to the mutual inductances between TXs and RXs. Numerical results show that our proposed magnetic channel estimation and adaptive beamforming schemes are practically effective, and can significantly improve the power transfer efficiency and multi-user performance trade-off in MIMO MRC-WPT systems.

  12. The Compact and Inexpensive "Arrowhead" Setup for Holographic Interferometry

    ERIC Educational Resources Information Center

    Ladera, Celso L.; Donoso, Guillermo

    2011-01-01

    Hologram recording and holographic interferometry are intrinsically sensitive to phase changes, and therefore both are easily perturbed by minuscule optical path perturbations. It is therefore very convenient to bank on holographic setups with a reduced number of optical components. Here we present a compact off-axis holographic setup that…

  13. Rapid Selective Annealing of Cu Thin Films on Si Using Microwaves

    NASA Technical Reports Server (NTRS)

    Brain, R. A.; Atwater, H. A.; Watson, T. J.; Barmatz, M.

    1994-01-01

    A major goal of the semiconductor industry is to lower the processing temperatures needed for interconnects in silicon integrated circuits. Typical rapid thermal annealing processes heat the film as well as the substrate, creating device problems.

  14. Cryocooler based test setup for high current applications

    NASA Astrophysics Data System (ADS)

    Pradhan, Jedidiah; Das, Nisith Kr.; Roy, Anindya; Duttagupta, Anjan

    2018-04-01

    A cryo-cooler based cryogenic test setup has been designed, fabricated, and tested. The setup incorporates two cryo-coolers, one for sample cooling and the other for cooling the large magnet coil. The performance and versatility of the setup have been tested using large samples of high-temperature superconductor magnet coil as well as short samples carrying high current. Several un-calibrated temperature sensors have been calibrated using this system. This paper presents the details of the system along with the results of different performance tests.

  15. Tissue type determination by impedance measurement: A bipolar and monopolar comparison

    PubMed Central

    Sharp, Jack; Bouazza-Marouf, Kaddour; Noronha, Dorita; Gaur, Atul

    2017-01-01

    Background: In certain medical applications, it is necessary to be able to determine the position of a needle inside the body, specifically with regard to identifying certain tissue types. By measuring the electrical impedance of specific tissue types, it is possible to determine the type of tissue at the tip of the needle (or probe). Materials and Methods: Two methods have been investigated for electric impedance detection: bipolar and monopolar. Commercially available needle electrodes are of a monopolar type. Although many patents exist on bipolar setups, these have not yet been commercialized. This paper reports a comparison of monopolar and bipolar setups for tissue type determination. In vitro experiments were carried out on pork to compare this investigation with other investigations in this field. Results: The results show that both monopolar and bipolar setups are capable of determining tissue type. However, the bipolar setup showed slightly better results; the difference between the impedances of the different soft tissue types was greater than with the monopolar method. Conclusion: Both monopolar and bipolar electrical impedance setups work very similarly in inhomogeneous volumes such as biological tissue. There is clear potential for clinical applications of impedance-based needle guidance with both the monopolar and bipolar setups. It is, however, worth noting that the bipolar setup is more versatile. PMID:28217047

  16. Numerical investigation of a scalable setup for efficient terahertz generation using a segmented tilted-pulse-front excitation.

    PubMed

    Pálfalvi, László; Tóth, György; Tokodi, Levente; Márton, Zsuzsanna; Fülöp, József András; Almási, Gábor; Hebling, János

    2017-11-27

    A hybrid-type terahertz pulse source is proposed for high-energy terahertz pulse generation. It combines the conventional tilted-pulse-front setup with a transmission stair-step echelon-faced nonlinear crystal having a period in the hundred-micrometer range. The most important advantage of the setup is the possibility of using a plane-parallel nonlinear optical crystal to produce a good-quality, symmetric terahertz beam. Another advantage of the proposed setup is the significant reduction of imaging errors, which is important in the case of the wide pump beams used in high-energy experiments. A one-dimensional model was developed for determining the terahertz generation efficiency, and it was used for a quantitative comparison between the proposed hybrid setup and previously introduced terahertz sources. With lithium niobate as the nonlinear material, calculations predict an approximately ten-fold increase in the efficiency of the presently described hybrid terahertz pulse source with respect to that of the earlier proposed setup, which utilizes a reflective stair-step echelon and a prism-shaped nonlinear optical crystal. Using pump pulses of 50 mJ energy, 500 fs duration, and 8 mm beam spot radius, approximately 1% conversion efficiency and 0.5 mJ terahertz pulse energy can be reached with the newly proposed setup.

  17. The potential for optical beam shaping of UV laser sources for mass scale quarantine disinfection applications

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd

    2010-08-01

    Recent events concerning H1N1 "swine flu" have demonstrated to the world the significant potential for rapid increases in death and illness among all age groups, and even among the healthy population [1], when a highly infectious influenza virus is introduced. In terms of mass casualties due to a pandemic, preparedness and response planning must be done. One course of action to prevent a pandemic outbreak or reduce the impact of a bioterrorist event is the use of isolation or quarantine facilities. The first level of isolation or quarantine is within the personal residence of the person exposed or infected. In cases where the specific virus is extremely contagious and its onset of symptoms is rapid and severe, there will be a need for the deployment and setup of larger self-contained quarantine facilities. Such facilities are used to house infectious individuals to minimize the exposure of susceptible individuals to contagious individuals, especially when specialized care or treatment is required and during the viral shedding period (5 to 7 days). These types of facilities require non-shared air conditioning, heating, and ventilating systems where 100% of the air is vented to the outside through a series of disinfection systems and staged filters. Although chemical disinfection is possible, there is a desire to incorporate intense UV radiation as a means to deactivate and disinfect airborne viruses within hospital settings and isolated mass-scale quarantine facilities. UV radiation is also being considered for disinfection of contaminated surfaces, such as table tops, walls, and floors in hospitals and temporary quarantine facilities. In such applications the use of UV bulb technology can create many problems: for instance, bulb technology requires numerous bulbs to treat a large volume of air, generates significant heat, consumes significant power, and does not produce large fluxes of UV light efficiently. This paper presents several methods of creating quarantine-level disinfection systems using high-intensity UV laser sources instead of UV bulbs, by using laser beam shaping optics in conjunction with traditional optical laser beam delivery techniques.

  18. TerraFERMA: The Transparent Finite Element Rapid Model Assembler for multiphysics problems in Earth sciences

    NASA Astrophysics Data System (ADS)

    Wilson, Cian R.; Spiegelman, Marc; van Keken, Peter E.

    2017-02-01

    We introduce and describe a new software infrastructure TerraFERMA, the Transparent Finite Element Rapid Model Assembler, for the rapid and reproducible description and solution of coupled multiphysics problems. The design of TerraFERMA is driven by two computational needs in Earth sciences. The first is the need for increased flexibility in both problem description and solution strategies for coupled problems where small changes in model assumptions can lead to dramatic changes in physical behavior. The second is the need for software and models that are more transparent so that results can be verified, reproduced, and modified in a manner such that the best ideas in computation and Earth science can be more easily shared and reused. TerraFERMA leverages three advanced open-source libraries for scientific computation that provide high-level problem description (FEniCS), composable solvers for coupled multiphysics problems (PETSc), and an options handling system (SPuD) that allows the hierarchical management of all model options. TerraFERMA integrates these libraries into an interface that organizes the scientific and computational choices required in a model into a single options file from which a custom compiled application is generated and run. Because all models share the same infrastructure, models become more reusable and reproducible, while still permitting the individual researcher considerable latitude in model construction. TerraFERMA solves partial differential equations using the finite element method. It is particularly well suited for nonlinear problems with complex coupling between components. TerraFERMA is open-source and available at http://terraferma.github.io, which includes links to documentation and example input files.
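
    For readers unfamiliar with the underlying libraries, the fragment below shows the style of high-level PDE description that FEniCS provides and that TerraFERMA builds on (a plain Poisson problem using the legacy dolfin interface); it is only an illustration of the library layer, not TerraFERMA's own options-file input format.

```python
# Minimal legacy-FEniCS (dolfin) example of high-level PDE description:
# -laplace(u) = 1 on the unit square with homogeneous Dirichlet boundaries.
from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction, TestFunction,
                    Function, Constant, DirichletBC, dot, grad, dx, solve)

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "P", 1)

u, v = TrialFunction(V), TestFunction(V)
f = Constant(1.0)
bc = DirichletBC(V, Constant(0.0), "on_boundary")

a = dot(grad(u), grad(v)) * dx          # bilinear form
L = f * v * dx                          # linear form
uh = Function(V)
solve(a == L, uh, bc)                   # assembles and solves (PETSc under the hood)
```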

  19. Field Performance Of Concrete Admixtures

    DOT National Transportation Integrated Search

    1998-06-30

    This project investigated compatibility problems involving two concrete admixtures from W.R. Grace Products and Dacotah portland cement. The problems experienced by the South Dakota Department of Transportation (SDDOT) were described as rapid slump l...

  20. Persona-Based Journaling: Striving for Authenticity in Representing the Problem-Solving Process

    ERIC Educational Resources Information Center

    Liljedahl, Peter

    2007-01-01

    Students' mathematical problem-solving experiences are fraught with failed attempts, wrong turns, and partial successes that move in fits and jerks, oscillating between periods of inactivity, stalled progress, rapid advancement, and epiphanies. Students' problem-solving journals, however, do not always reflect this rather organic process. Without…

  1. The Next Decade in Higher Education: Obvious Problems and Possible Solutions.

    ERIC Educational Resources Information Center

    Cheit, Earl F.

    Problems confronting higher education are considered from the perspective of state coordinating agencies. Ten obvious problems are as follows: adjusting to new enrollment patterns; attempting to close the cost-income gap; expenditures increasing more rapidly than income; supporting the capacity for research and advanced study; meeting new…

  2. COMPARISON OF LAPAROSCOPIC SKILLS PERFORMANCE USING SINGLE-SITE ACCESS (SSA) DEVICES VS. AN INDEPENDENT-PORT SSA APPROACH

    PubMed Central

    Schill, Matthew R.; Varela, J. Esteban; Frisella, Margaret M.; Brunt, L. Michael

    2015-01-01

    Background We compared performance of validated laparoscopic tasks on four commercially available single-site access (SSA) devices (AD) versus an independent-port (IP) SSA set-up. Methods A prospective, randomized comparison of laparoscopic skills performance on four AD (GelPOINT™, SILS™ Port, SSL Access System™, TriPort™) and one IP SSA set-up was conducted. Eighteen medical students (2nd–4th year), four surgical residents, and five attending surgeons were trained to proficiency in multi-port laparoscopy using four laparoscopic drills (peg transfer, bean drop, pattern cutting, extracorporeal suturing) in a laparoscopic trainer box. Drills were then performed in random order on each IP-SSA and AD-SSA set-up using straight laparoscopic instruments. Repetitions were timed and errors recorded. Data are mean ± SD, and statistical analysis was by two-way ANOVA with Tukey HSD post-hoc tests. Results Attending surgeons had significantly faster total task times than residents or students (p < 0.001), but the difference between residents and students was NS. Pair-wise comparisons revealed significantly faster total task times for the IP-SSA set-up compared to all four AD-SSA set-ups within the student group only (p < 0.05). Total task times for residents and attending surgeons showed a similar profile, but the differences were NS. When data for the three groups were combined, the total task time was less for the IP-SSA set-up than for each of the four AD-SSA set-ups (p < 0.001). Similarly, the IP-SSA set-up was significantly faster than 3 of 4 AD-SSA set-ups for peg transfer, 3 of 4 for pattern cutting, and 2 of 4 for suturing. No significant differences in error rates between IP-SSA and AD-SSA set-ups were detected. Conclusions When compared to an IP-SSA laparoscopic set-up, single-site access devices are associated with longer task performance times in a trainer box model, independent of level of training. Task performance was similar across different SSA devices. PMID:21993938

  3. The Effect of Vegetation on Sea-Swell Waves, Infragravity Waves and Wave-Induced Setup

    NASA Astrophysics Data System (ADS)

    Roelvink, J. A.; van Rooijen, A.; McCall, R. T.; Van Dongeren, A.; Reniers, A.; van Thiel de Vries, J.

    2016-02-01

    Aquatic vegetation in the coastal zone (e.g. mangrove trees) attenuates wave energy and thereby reduces flood risk along many shorelines worldwide. However, in addition to the attenuation of incident-band (sea-swell) waves, vegetation may also affect infragravity-band (IG) waves and the wave-induced water level setup (in short: wave setup). Currently, knowledge on the effect of vegetation on IG waves and wave setup is lacking, while they are key parameters for coastal risk assessment. In this study, the process-based storm impact model XBeach was extended with formulations for attenuation of sea-swell and IG waves as well as the effect on the wave setup, in two modes: the sea-swell wave phase-resolving (non-hydrostatic) and the phase-averaged (surfbeat) mode. In surfbeat mode a wave shape model was implemented to estimate the wave phase and to capture the intra-wave scale effect of emergent vegetation and nonlinear waves on the wave setup. Both modeling modes were validated using data from two flume experiments and show good skill in computing the attenuation of both sea-swell and IG waves as well as the effect on the wave-induced water level setup. In surfbeat mode, the prediction of nearshore mean water levels greatly improved when using the wave shape model, while in non-hydrostatic mode this effect is directly accounted for. Subsequently, the model was used to study the influence of the bottom profile slope and the location of the vegetation field on the computed wave setup with and without vegetation. It was found that the reduction in wave setup is strongly related to the location of the vegetation relative to the wave breaking point, and that the wave setup is lower for milder slopes. The extended version of XBeach developed within this study can be used to study the nearshore hydrodynamics on coasts fronted by vegetation such as mangroves. It can also serve as a tool for storm impact studies on coasts with aquatic vegetation, and can help to quantify the coastal protection function of vegetation.

  4. Optimizing Mouse Surgery with Online Rectal Temperature Monitoring and Preoperative Heat Supply. Effects on Post-Ischemic Acute Kidney Injury.

    PubMed

    Marschner, Julian A; Schäfer, Hannah; Holderied, Alexander; Anders, Hans-Joachim

    2016-01-01

    Body temperature affects outcomes of tissue injury. We hypothesized that online body core temperature recording and selective interventions help to standardize peri-interventional temperature control and the reliability of outcomes in experimental renal ischemia reperfusion injury (IRI). We recorded core temperature in up to seven mice in parallel using a Thermes USB recorder and ret-3-iso rectal probes with three different protocols. Setup A: Heating pad during ischemia time; Setup B: Heating pad from incision to wound closure; Setup C: A ventilated heating chamber before surgery and during ischemia time with surgeries performed on a heating pad. Temperature profile recording displayed significant declines upon induction of anesthesia. The profile of the baseline experimental setup A revealed that <1% of the temperature readings were within the target range of 36.5 to 38.5°C. Setups B and C increased the target range readings to 34.6 ± 28.0% and 99.3 ± 1.5%, respectively. Setup C significantly increased S3 tubular necrosis, neutrophil influx, and mRNA expression of kidney injury markers. In addition, using setup C, different ischemia times generated a linear correlation with acute tubular necrosis parameters with low variability, which further correlated with the degree of kidney atrophy 5 weeks after surgery. Changing the temperature control setup from A to C was equivalent to 10 minutes more ischemia time. We conclude that body temperature drops quickly in mice upon initiating anesthesia. Immediate heat supply, e.g. in a ventilated heating chamber, and online core temperature monitoring can help to standardize and optimize experimental outcomes.

  5. Clinical experience with a 3D surface patient setup system for alignment of partial-breast irradiation patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bert, Christoph; Metheany, Katherine G.; Doppke, Karen P.

    2006-03-15

    Purpose: To assess the utility of surface imaging on patient setup for accelerated partial-breast irradiation (APBI). Methods and Material: A photogrammetry system was used in parallel to APBI setup by laser and portal imaging. Surface data were acquired after laser and port-film setup for 9 patients. Surfaces were analyzed in comparison to a reference surface from the first treatment session by use of rigid transformations. The surface model after laser setup was used in a simulated photogrammetry setup procedure. In addition, breathing data were acquired by surface acquisition at a frame rate of 7 Hz. Results: Mean 3D displacement was 7.3 mm (SD, 4.4 mm) and 7.6 mm (SD, 4.2 mm) for laser and port film, respectively. Simulated setup with the photogrammetry system yielded mean displacement of 1 mm (SD, 1.2 mm). Distance analysis resulted in mean distances of 3.7 mm (SD, 4.9 mm), 4.3 mm (SD, 5.6 mm), and 1.6 mm (SD, 2.4 mm) for laser, port film, and photogrammetry, respectively. Breathing motion at isocenter was smaller than 3.7 mm, with a mean of 1.9 mm (SD, 1.1 mm). Conclusions: Surface imaging for PBI setup appears promising. Alignment of the 3D breast surface achieved by stereo-photogrammetry shows greater breast topology congruence than when patients are set up by laser or portal imaging. A correlation of breast surface and CTV must be quantitatively established.

  6. Parallelized traveling cluster approximation to study numerically spin-fermion models on large lattices

    NASA Astrophysics Data System (ADS)

    Mukherjee, Anamitra; Patel, Niravkumar D.; Bishop, Chris; Dagotto, Elbio

    2015-06-01

    Lattice spin-fermion models are important for studying correlated systems where quantum dynamics allows for a separation between slow and fast degrees of freedom. The fast degrees of freedom are treated quantum mechanically, while the slow variables, generically referred to as the "spins," are treated classically. At present, exact diagonalization coupled with classical Monte Carlo (ED + MC) is extensively used to solve numerically a general class of lattice spin-fermion problems. In this common setup, the classical variables (spins) are treated via the standard MC method while the fermion problem is solved by exact diagonalization. The "traveling cluster approximation" (TCA) is a real-space variant of the ED + MC method that allows one to solve spin-fermion problems on lattices with up to 10^3 sites. In this publication, we present a novel reorganization of the TCA algorithm in a manner that can be efficiently parallelized. This allows us to solve generic spin-fermion models easily on 10^4 lattice sites and, with some effort, on 10^5 lattice sites, representing the record lattice sizes studied for this family of models.
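
    To make the ED + MC / traveling-cluster idea concrete, the toy sketch below updates classical angles on a 1D chain with Metropolis moves, but computes the fermionic energy change by diagonalizing only a small cluster around the updated site. The model (spinless fermions with a cosine coupling to the classical angle), its parameters, and the zero-temperature band-energy bookkeeping are illustrative assumptions, not the Hamiltonians or the parallelization scheme of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
L, Nc = 64, 8                                   # lattice size and traveling-cluster size
t_hop, J, beta, filling = 1.0, 2.0, 10.0, 0.5   # illustrative parameters
theta = rng.uniform(0.0, np.pi, L)              # classical "spin" angles

def cluster_energy(theta, center):
    """Zero-T band energy of a small open cluster centred on the updated site."""
    sites = [(center + d) % L for d in range(-Nc // 2, Nc // 2)]
    H = np.zeros((Nc, Nc))
    for a in range(Nc - 1):
        H[a, a + 1] = H[a + 1, a] = -t_hop      # hopping inside the cluster
    for a, s in enumerate(sites):
        H[a, a] = -J * np.cos(theta[s])         # coupling to the classical variable
    eps = np.linalg.eigvalsh(H)
    return eps[: int(filling * Nc)].sum()

for sweep in range(100):                        # Metropolis sweeps over classical angles
    for _ in range(L):
        i = rng.integers(L)
        old = theta[i]
        e_old = cluster_energy(theta, i)
        theta[i] = rng.uniform(0.0, np.pi)
        dE = cluster_energy(theta, i) - e_old
        if dE > 0 and rng.random() >= np.exp(-beta * dE):
            theta[i] = old                      # reject the move
```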

  7. Methods for measuring water activity (aw) of foods and its applications to moisture sorption isotherm studies.

    PubMed

    Zhang, Lida; Sun, Da-Wen; Zhang, Zhihang

    2017-03-24

    Moisture sorption isotherms are commonly determined by the saturated salt slurry method, which suffers from long measurement times, cumbersome labor, and microbial deterioration of samples. Thus, a novel method, the aw measurement (AWM) method, has been developed to overcome these drawbacks. The fundamentals and applications of this fast method are introduced with respect to its typical operational steps, the variety of equipment set-ups, and the samples to which it has been applied. The resulting rapidity and reliability have been evaluated by comparison with conventional methods. This review also discusses factors impairing measurement precision and accuracy, including inappropriate choice of pre-drying/wetting techniques and incomplete moisture uniformity in samples due to inadequate time. This analysis and the corresponding suggestions can facilitate an improved AWM method with better accuracy and lower time cost.

  8. Dimensional measuring techniques in the automotive and aircraft industry

    NASA Astrophysics Data System (ADS)

    Muench, K. H.; Baertlein, Hugh

    1994-03-01

    Optical tooling methods used in industry are rapidly being replaced by new electronic sensor techniques. The impact of new measuring technologies on the production process has caused major changes on the industrial shop floor as well as within industrial measurement systems. The paper deals with one particular industrial measuring system, the manual theodolite measuring system (TMS), within the aircraft and automobile industry. With TMS, setup, data capture, and data analysis are flexible enough to suit industry's demands regarding speed, accuracy, and mobility. Examples show the efficiency and the wide range of TMS applications. In cooperation with industry, the Video Theodolite System (VTS) was developed. Its origin, functions, capabilities, and future plans are briefly described. With the VTS, a major step has been taken toward vision systems for industrial applications.

  9. Spinning Disk Confocal Imaging of Neutrophil Migration in Zebrafish

    PubMed Central

    Lam, Pui-ying; Fischer, Robert S; Shin, William D.; Waterman, Clare M; Huttenlocher, Anna

    2014-01-01

    Live-cell imaging techniques have been substantially improved due to advances in confocal microscopy instrumentation coupled with ultrasensitive detectors. The spinning disk confocal system is capable of generating images of fluorescent live samples with broad dynamic range and high temporal and spatial resolution. The ability to acquire fluorescent images of living cells in vivo on a millisecond timescale allows the dissection of biological processes that have not previously been visualized in a physiologically relevant context. In vivo imaging of rapidly moving cells such as neutrophils can be technically challenging. In this chapter, we describe the practical aspects of imaging neutrophils in zebrafish embryos using spinning disk confocal microscopy. Similar setups can also be applied to image other motile cell types and signaling processes in translucent animals or tissues. PMID:24504955

  10. Developing LAr Scintillation Light Collection Ideas in the Short Baseline Neutrino Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szelc, A. M.

    2016-02-08

    Scintillation light is becoming the most rapidly developing feature of Liquid Argon Time Projection Chamber (LArTPC) neutrino detectors due to its capability to enhance and expand their physics reach traditionally based on charge readout. The SBND detector, set to be built on the Booster Neutrino Beam Line at Fermilab, is in a unique position to test novel liquid argon scintillation light readout systems in a detector with physics neutrino events. The different ideas under consideration by the collaboration are described, including an array of PMTs detecting direct light, SiPM-coupled lightguide bars, and a setup which uses PMTs/SiPMs and wavelength-shifter-covered reflector foils, as well as their respective strengths and physics foci and the benchmarks used to compare them.

  11. Reflection measurements of microwave absorbers

    NASA Astrophysics Data System (ADS)

    Baker, Dirk E.; van der Neut, Cornelis A.

    1988-12-01

    A swept-frequency interferometer is described for making rapid, real-time assessments of localized inhomogeneities in planar microwave absorber panels. An aperture-matched exponential horn is used to reduce residual reflections in the system to about -37 dB. This residual reflection is adequate for making comparative measurements on planar absorber panels whose reflectivities usually fall in the -15 to -25 dB range. Reflectivity measurements on a variety of planar absorber panels show that multilayer Jaumann absorbers have the greatest inhomogeneity, while honeycomb absorbers generally have excellent homogeneity within a sheet and from sheet to sheet. The test setup is also used to measure the center frequencies of resonant absorbers. With directional couplers and aperture-matched exponential horns, the technique can be easily applied in the standard 2 to 40 GHz waveguide bands.

  12. Rapid stabilization of thawing soils For enhanced vehicle mobility: a field demonstration project

    DOT National Transportation Integrated Search

    1999-02-01

    Thawing soil presents a formidable challenge for vehicle operations cross-country and on unsurfaced roads. To mitigate the problem, a variety of stabilization techniques were evaluated for their suitability for rapid employment to enhance military ve...

  13. A Flexible Pilot-Scale Setup for Real-Time Studies in Process Systems Engineering

    ERIC Educational Resources Information Center

    Panjapornpon, Chanin; Fletcher, Nathan; Soroush, Masoud

    2006-01-01

    This manuscript describes a flexible, pilot-scale setup that can be used for training students and carrying out research in process systems engineering. The setup allows one to study a variety of process systems engineering concepts such as design feasibility, design flexibility, control configuration selection, parameter estimation, process and…

  14. Treatability Study Pilot Test Operation Field Photos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jun

    Photos in each group are in chronological order as captured: Group I Tank Platform Setup, November 14, 2017; Group II Tank Setup, November 15, 2017; Group III Aboveground Injection System (AIS) Setup, November 20, 2017; Group IV Chemical Mixing, November 21, 2017; Group V KB-1 Bacteria Injection, November 27, 2017; Group VI Miscellaneous.

  15. Field instrumentation and testing to study set-up phenomenon of piles driven into Louisiana clayey soils : final report.

    DOT National Transportation Integrated Search

    2016-07-01

    This research study aims to investigate the pile set-up phenomenon for clayey soils and develop empirical models to predict pile set-up : resistance at certain time after end of driving (EOD). To fulfill the objective, a total number of twelve prestr...

  16. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.
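
    As a sketch of the ANOVA-based computation for a balanced one-factor random-effects model like the one described here, the function below estimates the population mean, the systematic (between-patient) component, and the random (within-patient) component from a patients-by-fractions table of setup errors along one axis. Variable names and the example data are illustrative assumptions, and the confidence-interval calculations of the note are not reproduced.

```python
import numpy as np

def setup_error_components(errors):
    """errors: array of shape (patients, fractions), setup errors along one axis (mm)."""
    m, n = errors.shape
    patient_means = errors.mean(axis=1)
    grand_mean = errors.mean()                                   # population mean setup error
    ms_between = n * ((patient_means - grand_mean) ** 2).sum() / (m - 1)
    ms_within = ((errors - patient_means[:, None]) ** 2).sum() / (m * (n - 1))
    sigma_random = np.sqrt(ms_within)                            # random (interfraction) component
    sigma_systematic = np.sqrt(max(ms_between - ms_within, 0.0) / n)
    return grand_mean, sigma_systematic, sigma_random

# illustrative data: 20 patients, 5 fractions each
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.5, size=(20, 1)) + rng.normal(0.0, 2.0, size=(20, 5))
print(setup_error_components(data))
```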

  17. Ecotourism and Interpretation in Costa Rica: Parallels and Peregrinations.

    ERIC Educational Resources Information Center

    Williams, Wayne E.

    1994-01-01

    Discusses the ecotourism industry in Costa Rica and some of the problems faced by its national park system, including megaparks, rapid increase in tourism, and interpretive services. Suggests alternatives for the problems. (MKR)

  18. SU-E-J-88: The Study of Setup Error Measured by CBCT in Postoperative Radiotherapy for Cervical Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runxiao, L; Aikun, W; Xiaomei, F

    2015-06-15

    Purpose: To compare two registration methods in CBCT-guided radiotherapy for cervical carcinoma, analyze the setup errors, and determine the margin required to extend the clinical target volume (CTV) to the planning target volume (PTV). Methods: Twenty patients with cervical carcinoma were enrolled. All patients underwent CT simulation in the supine position. The CT images were transferred to the treatment planning system, where the CTV, PTV, and organs at risk (OAR) were defined, and then transmitted to the XVI workstation. CBCT scans were performed before radiotherapy and registered to the planning CT images using bone and gray-value registration methods. The two methods were compared to obtain the left-right (X), superior-inferior (Y), and anterior-posterior (Z) setup errors, and the margin required for CTV to PTV was calculated. Results: Setup errors are unavoidable in postoperative cervical carcinoma irradiation. The setup errors measured by the bone method (systematic ± random) in the X (left-right), Y (superior-inferior), and Z (anterior-posterior) directions were (0.24 ± 3.62), (0.77 ± 5.05), and (0.13 ± 3.89) mm, respectively; the setup errors measured by the gray-value method (systematic ± random) in the X, Y, and Z directions were (0.31 ± 3.93), (0.85 ± 5.16), and (0.21 ± 4.12) mm, respectively. The spatial distribution of setup errors was largest in the Y direction. The margins were 4 mm on the X axis, 6 mm on the Y axis, and 4 mm on the Z axis, respectively. The two registration methods gave similar results and are both recommended. Conclusion: Both bone and gray-value registration methods can provide accurate setup error measurements. Based on the setup errors, PTV margins of 4 mm, 4 mm, and 6 mm are suggested in the X, Y, and Z directions for postoperative radiotherapy of cervical carcinoma.

  19. Evaluation of overall setup accuracy and adequate setup margins in pelvic image-guided radiotherapy: Comparison of the male and female patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laaksomaa, Marko, E-mail: marko.laaksomaa@pshp.fi; Kapanen, Mika; Department of Medical Physics, Tampere University Hospital

    We evaluated adequate setup margins for the radiotherapy (RT) of pelvic tumors based on overall position errors of bony landmarks. We also estimated the difference in setup accuracy between the male and female patients. Finally, we compared the patient rotation for 2 immobilization devices. The study cohort included 64 consecutive male and 64 female patients. Altogether, 1794 orthogonal setup images were analyzed. Observer-related deviation in image matching and the effect of patient rotation were explicitly determined. Overall systematic and random errors were calculated in 3 orthogonal directions. Anisotropic setup margins were evaluated based on residual errors after weekly image guidance. The van Herk formula was used to calculate the margins. Overall, 100 patients were immobilized with a house-made device. The patient rotation was compared against 28 patients immobilized with CIVCO's Kneefix and Feetfix. We found that the usually applied isotropic setup margin of 8 mm covered all the uncertainties related to patient setup for most RT treatments of the pelvis. However, margins of even 10.3 mm were needed for the female patients with very large pelvic target volumes centered either in the symphysis or in the sacrum containing both of these structures. This was because the effect of rotation (p ≤ 0.02) and the observer variation in image matching (p ≤ 0.04) were significantly larger for the female patients than for the male patients. Even with daily image guidance, the required margins remained larger for the women. Patient rotations were largest about the lateral axes. The difference between the required margins was only 1 mm for the 2 immobilization devices. The largest component of overall systematic position error came from patient rotation. This emphasizes the need for rotation correction. Overall, larger position errors and setup margins were observed for the female patients with pelvic cancer than for the male patients.
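
    For reference, the van Herk recipe mentioned above combines the systematic component Σ and the random component σ into a CTV-to-PTV margin as 2.5Σ + 0.7σ; the numbers in the usage line below are illustrative, not values from the study.

```python
def van_herk_margin(sigma_systematic_mm, sigma_random_mm):
    # CTV-to-PTV margin (mm) = 2.5 * Sigma + 0.7 * sigma (van Herk et al.)
    return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

print(van_herk_margin(2.0, 3.0))   # e.g. 2 mm systematic, 3 mm random -> 7.1 mm
```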

  20. A versatile setup using femtosecond adaptive spectroscopic techniques for coherent anti-Stokes Raman scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Yujie, E-mail: styojm@physics.tamu.edu; Voronine, Dmitri V.; Sokolov, Alexei V.

    2015-08-15

    We report a versatile setup based on the femtosecond adaptive spectroscopic techniques for coherent anti-Stokes Raman scattering. The setup uses a femtosecond Ti:Sapphire oscillator source and a folded 4f pulse shaper, in which the pulse shaping is carried out through conventional optical elements and does not require a spatial light modulator. Our setup is simple in alignment, and can be easily switched between the collinear single-beam and the noncollinear two-beam configurations. We demonstrate the capability for investigating both transparent and highly scattering samples by detecting transmitted and reflected signals, respectively.

  1. Colorado's energy boom: impact on crime and criminal justice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-02-01

    Information is reported on the impact of rapid energy development on western slope criminal justice agencies. The focus is on crime rates, law enforcement, the courts, and juvenile justice problems. The problems that are likely to develop and what might be done to minimize the negative consequences are analyzed. The social characteristics of boom towns and the changes resulting from rapid growth, the changes in crime rates, the impact experienced by law enforcement agencies and the courts, and information on planning and funding in impact areas are described. (MCW)

  2. Influence of different setups of the Frankfort horizontal plane on 3-dimensional cephalometric measurements.

    PubMed

    Santos, Rodrigo Mologni Gonçalves Dos; De Martino, José Mario; Haiter Neto, Francisco; Passeri, Luis Augusto

    2017-08-01

    The Frankfort horizontal (FH) is a plane that intersects both porions and the left orbitale. However, other combinations of points have also been used to define this plane in 3-dimensional cephalometry. These variations are based on the hypothesis that they do not affect the cephalometric analysis. We investigated the validity of this hypothesis. The material included cone-beam computed tomography data sets of 82 adult subjects with Class I molar relationship. A third-party method of cone-beam computed tomography-based 3-dimensional cephalometry was performed using 7 setups of the FH plane. Six lateral cephalometric hard tissue measurements relative to the FH plane were carried out for each setup. Measurement differences were calculated for each pair of setups of the FH plane. The number of occurrences of differences greater than the limits of agreement was counted for each of the 6 measurements. Only 3 of 21 pairs of setups had no occurrences for the 6 measurements. No measurement had no occurrences for the 21 pairs of setups. Setups based on left or right porion and both orbitales had the greatest number of occurrences for the 6 measurements. This investigation showed that significant and undesirable measurement differences can be produced by varying the definition of the FH plane. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
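
    As a small illustration of how an FH-plane setup enters such measurements, the sketch below builds a plane from three landmarks (both porions and the left orbitale) and evaluates the signed distance of another landmark to it; all coordinates are invented for illustration and do not represent any of the seven setups compared in the study.

```python
import numpy as np

por_r = np.array([ 60.0,  0.0,  0.0])    # right porion (hypothetical coordinates, mm)
por_l = np.array([-60.0,  0.0,  0.0])    # left porion
orb_l = np.array([-30.0, 70.0, -2.0])    # left orbitale

normal = np.cross(por_l - por_r, orb_l - por_r)
normal /= np.linalg.norm(normal)         # unit normal of the FH plane

landmark = np.array([0.0, 80.0, -35.0])  # some other hard-tissue landmark (hypothetical)
signed_distance = np.dot(landmark - por_r, normal)   # mm relative to the FH plane
print(signed_distance)
```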

  3. Development of a New Optical Measuring Set-Up

    NASA Astrophysics Data System (ADS)

    Miroshnichenko, I. P.; Parinov, I. A.

    2018-06-01

    The paper describes a newly developed optical measuring set-up for the contactless recording and processing of small spatial (linear and angular) displacements of control surfaces, based on laser technologies and optical interference methods. The set-up is designed to handle the measurement tasks that arise in studies of the physical and mechanical properties of new materials and in diagnosing the state of structural materials by active acoustic methods of nondestructive testing. The structure of the set-up and its constituent parts are described, and its construction and operation during measurements are discussed. New technical solutions for implementing the components of the set-up are presented. The original specialized software is described; it is used for a priori analysis of measurement results, for processing while measurements are being performed, and for a posteriori analysis. Moreover, the influences of internal and external disturbances on the measurement results are determined, allowing the results to be corrected directly during their acquisition. The technical solutions used in the set-up are protected by Russian Federation patents, and the software by certificates of state registration of computer programs. The set-up is intended for use in instrumentation, mechanical engineering, shipbuilding, aviation, the energy sector, etc.

  4. Compensating Unknown Time-Varying Delay in Opto-Electronic Platform Tracking Servo System.

    PubMed

    Xie, Ruihong; Zhang, Tao; Li, Jiaquan; Dai, Ming

    2017-05-09

    This paper investigates the problem of compensating miss-distance delay in an opto-electronic platform tracking servo system. According to the characteristics of LOS (line-of-sight) motion, we set up a Markovian process model and compensate this unknown time-varying delay with a feed-forward forecasting controller based on robust H∞ control. Finally, simulation of a double closed-loop PI (proportional-integral) control system indicates that the proposed method is effective for compensating unknown time-varying delay. Tracking experiments on the opto-electronic platform indicate that the RMS (root-mean-square) error is 1.253 mrad when tracking a 10° 0.2 Hz signal.

  5. Optogenetic Light Crafting Tools for the Control of Cardiac Arrhythmias.

    PubMed

    Richter, Claudia; Christoph, Jan; Lehnart, Stephan E; Luther, Stefan

    2016-01-01

    The control of spatiotemporal dynamics in biological systems is a fundamental problem in nonlinear sciences and has important applications in engineering and medicine. Optogenetic tools combined with advanced optical technologies provide unique opportunities to develop and validate novel approaches to control spatiotemporal complexity in neuronal and cardiac systems. Understanding of the mechanisms and instabilities underlying the onset, perpetuation, and control of cardiac arrhythmias will enable the development and translation of novel therapeutic approaches. Here we describe in detail the preparation and optical mapping of transgenic channelrhodopsin-2 (ChR2) mouse hearts, cardiac cell cultures, and the optical setup for photostimulation using digital light processing.

  6. Experimental Study and a Mathematical Model of the Processes in Frozen Soil Under a Reservoir with a Hot Heat-Transfer Agent

    NASA Astrophysics Data System (ADS)

    Kislitsyn, A. A.; Shastunova, U. Yu.; Yanbikova, Yu. F.

    2018-05-01

    On an experimental setup, the authors have measured temperature fields in frozen soil during the filling of a reservoir with hot heat-transfer agent (oil), and also the change in the shape and position of the front of ice melting (isotherms T = 0°C) with time. The approximate solution of a two-dimensional Stefan problem on thawing of frozen soil has been given; it has been shown that satisfactory agreement with experimental results can only be obtained with account taken of the convective transfer of heat due to the water motion in the region of thawed soil.
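
    A minimal one-dimensional enthalpy-method sketch of a melting-front (Stefan) problem is given below; the material properties, boundary temperature, and discretization are illustrative assumptions, and the study's own model is two-dimensional and includes convective heat transfer in the thawed zone.

```python
import numpy as np

rho, c, k = 1000.0, 2100.0, 2.0      # illustrative density, heat capacity, conductivity
L_fus = 334e3                        # latent heat of fusion, J/kg
nx, dx, dt = 200, 0.01, 10.0         # grid size, spacing (m), and time step (s)
T_hot, T_init = 60.0, -5.0           # reservoir and initial soil temperatures, deg C

H = rho * c * T_init * np.ones(nx)   # volumetric enthalpy; H = 0 is frozen soil at 0 C
H[0] = rho * (c * T_hot + L_fus)     # boundary node held fully thawed at T_hot

def temperature(H):
    T = np.where(H < 0.0, H / (rho * c), 0.0)                           # frozen region
    return np.where(H > rho * L_fus, (H - rho * L_fus) / (rho * c), T)  # thawed (else 0 C)

for _ in range(20000):               # explicit conservative update of interior nodes
    T = temperature(H)
    flux = -k * np.diff(T) / dx
    H[1:-1] -= dt * np.diff(flux) / dx

thaw_depth = dx * np.count_nonzero(temperature(H) > 0.0)
print(f"approximate thaw depth after {20000 * dt / 3600:.1f} h: {thaw_depth:.2f} m")
```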

  7. Population Switching and Charge Sensing in Quantum Dots: A Case for a Quantum Phase Transition

    NASA Astrophysics Data System (ADS)

    Goldstein, Moshe; Berkovits, Richard; Gefen, Yuval

    2010-06-01

    A broad and a narrow level of a quantum dot connected to two external leads may swap their respective occupancies as a function of an external gate voltage. By mapping this problem onto a multiflavored Coulomb gas we show that such population switching is not abrupt. However, trying to measure it by adding a third electrostatically coupled lead may render this switching an abrupt first order quantum phase transition. This is related to the interplay of the Mahan mechanism versus the Anderson orthogonality catastrophe, in similitude to the Fermi edge singularity. A concrete setup for experimental observation of this effect is also suggested.

  8. Noninvasive control of dental calculus removal: qualification of two fluorescence methods

    NASA Astrophysics Data System (ADS)

    Gonchukov, S.; Sukhinina, A.; Bakhmutov, D.; Biryukova, T.

    2013-02-01

    The main condition for preventing periodontitis is complete removal of calculus from the tooth surface. This procedure should be performed without harming adjacent unaffected tooth tissues. Nevertheless, sensitive and precise detection of the tooth-calculus interface remains a problem, and a potential risk of hard tissue damage persists. In this work it was shown that fluorescence diagnostics during calculus removal can be successfully used for precise noninvasive detection of the calculus-tooth interface. Moreover, a simple implementation of this method, free from the need for a spectrometer, can be employed. Such a simple calculus-detection set-up can be integrated with calculus removal devices.

  9. A Conditional Curie-Weiss Model for Stylized Multi-group Binary Choice with Social Interaction

    NASA Astrophysics Data System (ADS)

    Opoku, Alex Akwasi; Edusei, Kwame Owusu; Ansah, Richard Kwame

    2018-04-01

    This paper proposes a conditional Curie-Weiss model as a model for decision making in a stylized society made up of binary decision makers that face a particular dichotomous choice between two options. Following Brock and Durlauf (Discrete choice with social interaction I: theory, 1995), we set up both socio-economic and statistical mechanical models for the choice problem. We point out when both the socio-economic and statistical mechanical models give rise to the same self-consistent equilibrium mean choice level(s). The phase diagram of the associated statistical mechanical model and its socio-economic implications are discussed.
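
    A minimal sketch of the self-consistency condition typical of Curie-Weiss-type mean-field models is shown below, solved by fixed-point iteration. The textbook single-group equation m = tanh(beta (J m + h)) and the parameter values are used here only to illustrate what a "self-consistent equilibrium mean choice level" is; they are not the paper's conditional multi-group model.

```python
import numpy as np

def mean_choice(beta, J, h, m0=0.1, tol=1e-12, max_iter=100000):
    """Fixed-point iteration for m = tanh(beta * (J * m + h))."""
    m = m0
    for _ in range(max_iter):
        m_new = np.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

print(mean_choice(beta=0.5, J=1.0, h=0.0))   # below the critical point: m -> 0
print(mean_choice(beta=1.5, J=1.0, h=0.0))   # above it: a nonzero mean choice level emerges
```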

  10. Holographic Moire, An Optical Tool For The Determination Of Displacements, Strains, Contours, And Slopes Of Surfaces

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.

    1982-06-01

    In conventional holographic interferometry, the observed fringe patterns are determined by the object displacement and deformation, and by the illumination and observation configurations. The obtained information may not be in the most convenient form for further data processing. To overcome this problem, and to create new possibilities, holographic fringe patterns can be changed by modifying the optical setup. As a result of these modifications, well-known procedures of the moire method can be applied to holographic interferometry. Components of displacement and components of the strain tensor can be isolated and measured separately. Surface contours and slopes can also be determined.

  11. Incremental electrohydraulic forming - A new approach for the manufacture of structured multifunctional sheet metal blanks

    NASA Astrophysics Data System (ADS)

    Djakow, Eugen; Springer, Robert; Homberg, Werner; Piper, Mark; Tran, Julian; Zibart, Alexander; Kenig, Eugeny

    2017-10-01

    Electrohydraulic Forming (EHF) processes permit the production of complex, sharp-edged geometries even when high-strength materials are used. Unfortunately, the forming zone is often limited compared to other sheet metal forming processes. The use of a special industrial-robot-based tool setup and an incremental process strategy could provide a promising solution to this problem. This paper describes such an innovative approach using an electrohydraulic incremental forming machine, which can be employed to manufacture the large, multifunctional, and complex part geometries in steel, aluminium, magnesium, and reinforced plastics that are used in lightweight constructions or heating elements.

  12. Experimental Study and a Mathematical Model of the Processes in Frozen Soil Under a Reservoir with a Hot Heat-Transfer Agent

    NASA Astrophysics Data System (ADS)

    Kislitsyn, A. A.; Shastunova, U. Yu.; Yanbikova, Yu. F.

    2018-03-01

    On an experimental setup, the authors have measured temperature fields in frozen soil during the filling of a reservoir with hot heat-transfer agent (oil), and also the change in the shape and position of the front of ice melting (isotherms T = 0°C) with time. The approximate solution of a two-dimensional Stefan problem on thawing of frozen soil has been given; it has been shown that satisfactory agreement with experimental results can only be obtained with account taken of the convective transfer of heat due to the water motion in the region of thawed soil.

  13. GEOMAGIA50: An archeointensity database with PHP and MySQL

    NASA Astrophysics Data System (ADS)

    Korhonen, K.; Donadini, F.; Riisager, P.; Pesonen, L. J.

    2008-04-01

    The GEOMAGIA50 database stores 3798 archeomagnetic and paleomagnetic intensity determinations dated to the past 50,000 years. It also stores details of the measurement setup for each determination, which are used for ranking the data according to prescribed reliability criteria. The ranking system aims to alleviate the data reliability problem inherent in this kind of data. GEOMAGIA50 is based on two popular open source technologies. The MySQL database management system is used for storing the data, whereas the functionality and user interface are provided by server-side PHP scripts. This technical brief gives a detailed description of GEOMAGIA50 from a technical viewpoint.
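
    To illustrate the kind of filtered, ranked query such a relational design supports, here is a small self-contained sketch using Python's sqlite3 in place of MySQL/PHP. The table layout, column names, and the toy "reliability score" are hypothetical and do not reproduce the actual GEOMAGIA50 schema or its prescribed ranking criteria.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE intensity (
    id INTEGER PRIMARY KEY,
    age_bp INTEGER,            -- age, years before present (hypothetical column)
    intensity_ut REAL,         -- paleointensity, microtesla
    n_specimens INTEGER,       -- specimens per determination
    alteration_check INTEGER,  -- 1 if an alteration check was applied
    cooling_rate_corr INTEGER  -- 1 if a cooling-rate correction was applied
)""")
con.executemany(
    "INSERT INTO intensity VALUES (?, ?, ?, ?, ?, ?)",
    [(1, 3200, 55.2, 5, 1, 1), (2, 3400, 48.9, 2, 0, 0), (3, 2900, 61.0, 8, 1, 0)],
)

# Select determinations in an age window and rank them by a toy reliability score.
rows = con.execute("""
    SELECT id, age_bp, intensity_ut,
           n_specimens + 3 * (alteration_check + cooling_rate_corr) AS score
    FROM intensity
    WHERE age_bp BETWEEN 2500 AND 3500
    ORDER BY score DESC
""").fetchall()
print(rows)
```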

  14. New high resolution Random Telegraph Noise (RTN) characterization method for resistive RAM

    NASA Astrophysics Data System (ADS)

    Maestro, M.; Diaz, J.; Crespo-Yepes, A.; Gonzalez, M. B.; Martin-Martinez, J.; Rodriguez, R.; Nafria, M.; Campabadal, F.; Aymerich, X.

    2016-01-01

    Random Telegraph Noise (RTN) is one of the main reliability problems of resistive switching-based memories. To understand the physics behind RTN, a complete and accurate RTN characterization is required. The standard equipment used to analyse RTN has a typical time resolution of ∼2 ms which prevents evaluating fast phenomena. In this work, a new RTN measurement procedure, which increases the measurement time resolution to 2 μs, is proposed. The experimental set-up, together with the recently proposed Weighted Time Lag (W-LT) method for the analysis of RTN signals, allows obtaining a more detailed and precise information about the RTN phenomenon.
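
    As a rough illustration of time-lag-style analysis of an RTN trace, the sketch below builds a Gaussian-weighted map of consecutive sample pairs (I[n], I[n+1]), on which discrete current levels show up as spots along the diagonal. The grid size, kernel width, and toy trace are illustrative assumptions and are not the exact formulation or parameters of the cited Weighted Time Lag method.

```python
import numpy as np

def weighted_time_lag_map(current, bins=200, sigma=None):
    """Gaussian-weighted time-lag map of a current trace (illustrative parameters)."""
    x, y = current[:-1], current[1:]
    lo, hi = current.min(), current.max()
    grid = np.linspace(lo, hi, bins)
    if sigma is None:
        sigma = (hi - lo) / 100.0
    gx, gy = np.meshgrid(grid, grid)
    psi = np.zeros_like(gx)
    for xi, yi in zip(x, y):                   # each pair adds a Gaussian kernel
        psi += np.exp(-((gx - xi) ** 2 + (gy - yi) ** 2) / (2.0 * sigma ** 2))
    return grid, psi / psi.max()

# toy two-level trace with measurement noise
rng = np.random.default_rng(0)
levels = rng.integers(0, 2, 2000) * 1e-6       # hypothetical 0 / 1 uA RTN levels
trace = levels + rng.normal(0.0, 5e-8, 2000)
grid, psi = weighted_time_lag_map(trace)
```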

  15. The Complex Economic System of Supply Chain Financing

    NASA Astrophysics Data System (ADS)

    Zhang, Lili; Yan, Guangle

    Supply Chain Financing (SCF) refers to a series of innovative and complex financial services based on the supply chain. The SCF set-up is a complex system in which supply chain management and financing services for Small and Medium Enterprises (SMEs) are systematically intertwined. This paper establishes the organization structure of the SCF system and presents two financing models, with and without the participation of a third-party logistics provider (3PL). Using Information Economics and Game Theory, the interrelationship among the diverse economic sectors is analyzed, and the economic mechanism behind the development and existence of the SCF system is demonstrated. New ideas and approaches for solving the SME financing problem are given.

  16. Planetary investigation utilizing an imaging spectrometer system based upon charge injection technology

    NASA Technical Reports Server (NTRS)

    Wattson, R. B.; Harvey, P.; Swift, R.

    1975-01-01

    An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 collinear tunable acousto-optic filter, a 61 inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 inch spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time, and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data and final equipment performance limits are given.

  17. Experimental model of the device for detection of nuclear cycle materials by photoneutron technology

    NASA Astrophysics Data System (ADS)

    Bakalyarov, A. M.; Karetnikov, M. D.; Kozlov, K. N.; Lebedev, V. I.; Meleshko, E. A.; Obinyakov, B. A.; Ostashev, I. E.; Tupikin, N. A.; Yakovlev, G. V.

    2007-08-01

    The inherent difficulty of inspecting sea containers makes them a potential channel for smuggling nuclear materials. Experts believe that only active technologies, based on recording the products of radiation induced in sensitive materials, can solve this problem. The paper reports on an experimental model of a device, based on the U-28 electron LINAC, for the detection of nuclear materials by photonuclear technology. Preliminary numerical optimization of the output units (converter, filter, collimator) for shaping the bremsstrahlung was carried out. The setup of the experimental device and initial results of recording prompt and delayed fission products are discussed.

  18. An Information-Summarising Instruction Strategy for Improving the Web-Based Problem Solving Abilities of Students

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Kuo, Fan-Ray

    2011-01-01

    As knowledge rapidly expands and accumulates, training and assessing students' information searching ability for solving problems on the Internet has become an important and challenging issue. This research aims to improve the web-based problem solving abilities of primary school students by employing an information summarising approach for…

  19. Population Problems: A Constituent of General Culture in the 21st Century.

    ERIC Educational Resources Information Center

    Rath, Ferdinand J. C. M.

    1993-01-01

    Compares modern population problems with those of previous generations. Examines variations in population problems in different countries and world regions and the ways in which demographic events (e.g., rapid population growth or urbanization) in one region affect other regions. Advocates preparing for demographic changes through education. (DMM)

  20. Rapid and repeatable fabrication of high A/R silk fibroin microneedles using thermally-drawn micromolds.

    PubMed

    Lee, JiYong; Park, Seung Hyun; Seo, Il Ho; Lee, Kang Ju; Ryu, WonHyoung

    2015-08-01

    Thermal drawing is a versatile rapid prototyping method that can freely form microneedle (MN) structures with ultra-high aspect ratios without relying on any complex and expensive process. However, it is still challenging to repeatedly produce MNs with identical shapes using thermal drawing, due to small fluctuations in processing conditions such as temperature, drawing speed, drawing height, or parallelism of the drawing setup. In addition, thermal drawing is only applicable to thermoplastic materials, and most natural biomaterials are incompatible with this method. Thus, we propose the use of thermal drawing to fabricate master molds with high aspect ratios and replication of the shape by micromolding. In this work, high-A/R MNs with various body profiles were fabricated by thermal drawing and replicated into silk fibroin (SF) MNs multiple times using micromolding. The original MN shape was precisely copied to the SF MNs. Methanol treatment enhanced the mechanical strength of the SF MNs by up to about 113%, depending on the treatment duration. We also demonstrated that methanol exposure time could effectively control drug release rates from SF MNs. Copyright © 2015 Elsevier B.V. All rights reserved.
