Sample records for "handle extremely large"

  1. Diazo compounds in continuous-flow technology.

    PubMed

    Müller, Simon T R; Wirth, Thomas

    2015-01-01

    Diazo compounds are very versatile reagents in organic chemistry and meet the challenge of selective assembly of structurally complex molecules. Their leaving group is dinitrogen; therefore, they are very clean and atom-efficient reagents. However, diazo compounds are potentially explosive and extremely difficult to handle on an industrial scale. This review discusses how continuous-flow technology can help make these powerful reagents accessible on a large scale. Microstructured devices can greatly improve heat transfer and help with the safe handling of dangerous reagents. The in situ formation and subsequent consumption of diazo compounds are discussed along with advances in handling diazomethane and ethyl diazoacetate. The potential large-scale applications of a given methodology are emphasized. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Handling Emergency Management in [an] Object Oriented Modeling Environment

    NASA Technical Reports Server (NTRS)

    Tokgoz, Berna Eren; Cakir, Volkan; Gheorghe, Adrian V.

    2010-01-01

    It has been understood that protecting a nation from extreme disasters is a challenging task. The impacts of extreme disasters on a nation's critical infrastructures, economy and society could be devastating. A protection plan by itself is not sufficient when a disaster strikes. Hence, there is a need for a holistic approach to establish more resilient infrastructures that can withstand extreme disasters. A resilient infrastructure can be defined as a system or facility that is able to withstand damage, but if affected, can be readily and cost-effectively restored. The key issue in establishing resilient infrastructures is to incorporate existing protection plans with comprehensive preparedness actions to respond, recover and restore as quickly as possible, and to minimize extreme disaster impacts. Although national organizations will respond to a disaster, extreme disasters need to be handled mostly by local emergency management departments. Since emergency management departments have to deal with complex systems, they must have a manageable plan and efficient organizational structures to coordinate all these systems. A strong organizational structure is the key to responding fast before and during disasters, and recovering quickly after them. In this study, the entire emergency management function is viewed as an enterprise and modelled through an enterprise management approach. Managing an enterprise or a large complex system is a very challenging task. It is critical for an enterprise to respond to challenges in a timely manner with quick decision making. This study addresses the problem of handling emergency management at the regional level in an object oriented modelling environment developed using TopEase software. The Emergency Operation Plan of the City of Hampton, Virginia, has been incorporated into TopEase for analysis. The methodology used in this study has been supported by a case study on critical infrastructure resiliency in Hampton Roads.

  3. How to recover more value from small pine trees: Essential oils and resins

    Treesearch

    Vasant M. Kelkar; Brian W. Geils; Dennis R. Becker; Steven T. Overby; Daniel G. Neary

    2006-01-01

    In recent years, the young dense forests of northern Arizona have suffered extreme droughts, wildfires, and insect outbreaks. Improving forest health requires reducing forest density by cutting many small-diameter trees with the consequent production of large volumes of residual biomass. To offset the cost of handling this low-value timber, additional marketing options...

  4. Perspectives in astrophysical databases

    NASA Astrophysics Data System (ADS)

    Frailis, Marco; de Angelis, Alessandro; Roberto, Vito

    2004-07-01

    Astrophysics has become a domain extremely rich in scientific data. Data mining tools are needed to extract information from such large data sets. This calls for an approach to data management emphasizing the efficiency and simplicity of data access; efficiency is obtained using multidimensional access methods and simplicity is achieved by properly handling metadata. Moreover, clustering and classification techniques on large data sets pose additional requirements in terms of computation and memory scalability and interpretability of results. In this study we review some possible solutions.

  5. A Bit-Encoding Based New Data Structure for Time and Memory Efficient Handling of Spike Times in an Electrophysiological Setup.

    PubMed

    Ljungquist, Bengt; Petersson, Per; Johansson, Anders J; Schouenborg, Jens; Garwicz, Martin

    2018-04-01

    Recent neuroscientific and technical developments of brain machine interfaces have put increasing demands on neuroinformatic databases and data handling software, especially when managing data in real time from large numbers of neurons. Extrapolating these developments, we set out here to construct a scalable software architecture that would enable near-future massively parallel recording, organization and analysis of neurophysiological data on a standard computer. To this end we combined, for the first time in the present context, bit-encoding of spike data with a specific communication format for real time transfer and storage of neuronal data, synchronized by a common time base across all unit sources. We demonstrate that our architecture can simultaneously handle data from more than one million neurons and provide, in real time (< 25 ms), feedback based on analysis of previously recorded data. In addition to managing recordings from very large numbers of neurons in real time, it also has the capacity to handle the extensive periods of recording time necessary in certain scientific and clinical applications. Furthermore, the bit-encoding proposed has the additional advantage of allowing an extremely fast analysis of spatiotemporal spike patterns in a large number of neurons. Thus, we conclude that this architecture is well suited to support current and near-future Brain Machine Interface requirements.
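    To make the bit-encoding idea concrete, here is a minimal sketch (not the authors' implementation; the population size and 64-bit word packing are illustrative assumptions) in which each time bin is a bit array with one bit per neuron, so a spatiotemporal pattern query reduces to a bitwise AND plus a popcount:

      import numpy as np

      N_NEURONS = 1_000_000            # hypothetical population size
      WORDS = N_NEURONS // 64 + 1      # 64 neuron flags per uint64 word

      def encode_bin(spiking_ids):
          """Pack the IDs of neurons that spiked in one time bin into a bit array."""
          words = np.zeros(WORDS, dtype=np.uint64)
          np.bitwise_or.at(words, spiking_ids // 64,
                           np.uint64(1) << (spiking_ids % 64).astype(np.uint64))
          return words

      def pattern_overlap(bin_a, bin_b):
          """Count neurons active in both bins: one AND plus a popcount."""
          return int(np.unpackbits(np.bitwise_and(bin_a, bin_b).view(np.uint8)).sum())

    On this encoding, comparing two time bins touches roughly 16,000 machine words regardless of how many spikes occurred, which is what makes pattern analysis over a million neurons fast.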

  6. Solar Power Satellites - A Review of the Space Transportation Options.

    DTIC Science & Technology

    1980-03-01

    already exists with such systems, gained mainly through liquid-metal breeder reactor programmes. For example, inlet temperatures of 970 °C can be handled...alternatives exist. In addition, there would be extreme reluctance on the part of most governments to allow large reactors, producing gigawatts of power, to...antenna. The reactors employed are high-temperature gas-cooled breeders, which convert U-238 into fissile plutonium. Each of the modules includes a

  7. Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane

    NASA Technical Reports Server (NTRS)

    Gera, Joseph; Bosworth, John T.

    1987-01-01

    Initial envelope clearance and subsequent flight testing of a new, fully augmented airplane with an extremely high degree of static instability can place unusual demands on the flight test approach. Previous flight test experience with these kinds of airplanes is very limited or nonexistent. Safe and efficient flight testing may be further complicated by the multiplicity of control effectors that may be present on this class of airplanes. This paper describes some novel flight test and analysis techniques in the flight dynamics and handling qualities area. These techniques were utilized during the initial flight envelope clearance of the X-29A aircraft and were largely responsible for the completion of the flight controls clearance program without any incidents or significant delays.

  8. How we shipped our flip and standard too

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deigl, H.J.; Feltz, D.E.

    1984-07-01

    This paper highlights the planning and handling activities for the shipment of irradiated TRIGA fuel from Texas A and M University to the Argonne National Laboratory/West (ANL/West) reactor facility at Idaho Falls, Idaho. Attention is focused on the enormous time spent on planning and preparations prior to the shipment. The actual handling time at the NSCR for three shipping packages containing a total of 51 elements was only 4 days, but the time spent in planning and preparation exceeded 16 months. The fuel was transferred for shipment without incident, and from a health physics standpoint the exercise went very well. Whole body exposures and hand doses were minimal for such a large undertaking. ANL/West health physicists reported contamination of the lifting devices for the HFIR cask when they received it. These pieces were wipe tested and contamination was found to be less than 200 dpm. If they were contaminated, we were extremely fortunate during handling not to contaminate our facility or personnel.

  9. Traffic sharing algorithms for hybrid mobile networks

    NASA Technical Reports Server (NTRS)

    Arcand, S.; Murthy, K. M. S.; Hafez, R.

    1995-01-01

    In a hybrid (terrestrial + satellite) mobile personal communications network environment, a large size satellite footprint (supercell) overlays a large number of smaller size, contiguous terrestrial cells. We assume that the users have either a terrestrial only single mode terminal (SMT) or a terrestrial/satellite dual mode terminal (DMT), and the ratio of DMTs to the total terminals is defined as gamma. It is assumed that the call assignments to and handovers between terrestrial cells and satellite supercells take place in a dynamic fashion when necessary. The objectives of this paper are twofold: (1) to propose and define a class of traffic sharing algorithms to manage terrestrial and satellite network resources efficiently by handling call handovers dynamically, and (2) to analyze and evaluate the algorithms by maximizing the traffic load handling capability (defined in erl/cell) over a wide range of terminal ratios (gamma) given an acceptable range of blocking probabilities. Two of the algorithms (G & S) in the proposed class perform extremely well for a wide range of gamma.
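    As a rough illustration of the kind of capacity analysis involved (a generic overflow model, not the paper's G or S algorithms; all channel counts and loads below are placeholder assumptions), the sketch uses the Erlang-B recursion, with blocked dual-mode calls overflowing from a terrestrial cell to the satellite supercell:

      def erlang_b(load_erl, channels):
          """Erlang-B blocking probability via the standard recursion."""
          b = 1.0
          for c in range(1, channels + 1):
              b = load_erl * b / (c + load_erl * b)
          return b

      def dmt_blocking(load_per_cell, n_cells, gamma, cell_ch, sat_ch):
          """Blocking seen by dual-mode terminals under simple overflow routing.
          The overflow stream is treated as Poisson, which is optimistic since
          overflow traffic is peaked."""
          b_cell = erlang_b(load_per_cell, cell_ch)
          overflow = n_cells * load_per_cell * gamma * b_cell  # erlangs offered to satellite
          return b_cell * erlang_b(overflow, sat_ch)

      print(dmt_blocking(load_per_cell=6.0, n_cells=16, gamma=0.3, cell_ch=8, sat_ch=40))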

  10. High-capacity high-speed recording

    NASA Astrophysics Data System (ADS)

    Jamberdino, A. A.

    1981-06-01

    Continuing advances in wideband communications and information handling are leading to extremely large volume digital data systems for which conventional data storage techniques are becoming inadequate. The paper presents an assessment of alternative recording technologies for the extremely wideband, high capacity storage and retrieval systems currently under development. Attention is given to longitudinal and rotary head high density magnetic recording, laser holography in human readable/machine readable devices and a wideband recorder, digital optical disks, and spot recording in microfiche formats. The electro-optical technologies considered are noted to be capable of providing data bandwidths up to 1000 megabits/sec and total data storage capacities in the 10^11 to 10^12 bit range, an order of magnitude improvement over conventional technologies.

  11. Advanced spacecraft: What will they look like and why

    NASA Technical Reports Server (NTRS)

    Price, Humphrey W.

    1990-01-01

    The next century of spaceflight will witness an expansion in the physical scale of spacecraft, from the extreme of the microspacecraft to the very large megaspacecraft. This will spawn, respectively, advances in highly integrated and miniaturized components, and in lightweight structures, space fabrication, and exotic control systems. Challenges are also presented by the advent of advanced propulsion systems, many of which require controlling and directing hot plasma, dissipating large amounts of waste heat, and handling very high radiation sources. Vehicle configuration studies for a number of these types of advanced spacecraft were performed, and some of them are presented along with the rationale for their physical layouts.

  12. Vertical interferometer workstation for testing large spherical optics

    NASA Astrophysics Data System (ADS)

    Truax, B.

    2013-09-01

    The design of an interferometer workstation for the testing of large concave and convex spherical optics is presented. The workstation handles optical components and mounts up to 425 mm in diameter and up to 40 kg in mass, with 6 axes of adjustment. A unique method for the implementation of focus, roll and pitch was used, allowing for extremely precise adjustment. The completed system includes transmission spheres with f-numbers from f/1.6 to f/0.82, incorporating reference surface diameters of up to 306 mm and surface accuracies of better than 63 nm PVr. The design challenges and resulting solutions are discussed. System performance results are presented.

  13. Handle-shaped Prominence

    NASA Image and Video Library

    2001-02-17

    NASA's Extreme Ultraviolet Imaging Telescope aboard ESA's SOHO spacecraft took this image of a huge, handle-shaped prominence in 1999. Prominences are huge clouds of relatively cool, dense plasma suspended in the Sun's hot, thin corona.

  14. Easy-To-Use Connector-Assembly Tool

    NASA Technical Reports Server (NTRS)

    Redmon, John W., Jr.; Jankowski, Fred

    1988-01-01

    Tool compensates for user's loss of dexterity under awkward conditions. Has jaws that swivel over 180 degrees so their angle adjusts with respect to handles. Oriented and held in position most comfortable and effective for user in given situation. Jaws lined with rubber pads so they conform to irregularly shaped parts and grip firmly but gently. Once tool engages part, it locks onto it so user can release handles without losing part. Ratchet mechanism in tool allows user to work handles back and forth in confined space to connect or disconnect part. Quickly positioned, locked, and released. Gives user feel of its grip on part. Frees grasping muscles from work during part of task, giving user greater freedom to move hand. Operates with only one hand, leaving user's other hand free to manipulate wiring or other parts. Also adapts to handling and positioning extremely hot or extremely cold fluid lines, contaminated objects, abrasive or sharp objects, fragile items, and soft objects.

  15. Rare events modeling with support vector machine: Application to forecasting large-amplitude geomagnetic substorms and extreme events in financial markets.

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, V. V.; Ganguli, S. B.

    2001-12-01

    Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore, many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate ones. Support vector machine (SVM) is a machine learning system that can provide an optimal generalization using very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make SVM suitable for modeling rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
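    A minimal sketch of the general recipe (an SVM trained on a small, high-dimensional, imbalanced data set); the synthetic features and the event rule below are placeholders, not the solar-wind or market inputs used in the paper:

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 30))                    # limited, high-dimensional inputs
      y = (X[:, 0] + 0.5 * X[:, 1] > 1.8).astype(int)   # rare "large event" label (~5%)

      clf = make_pipeline(StandardScaler(),
                          SVC(kernel="rbf", class_weight="balanced"))  # upweight the rare class
      clf.fit(X, y)
      print("predicted event fraction:", clf.predict(X).mean())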

  16. An Efficient Pipeline Wavefront Phase Recovery for the CAFADIS Camera for Extremely Large Telescopes

    PubMed Central

    Magdaleno, Eduardo; Rodríguez, Manuel; Rodríguez-Ramos, José Manuel

    2010-01-01

    In this paper we show a fast, specialized hardware implementation of the wavefront phase recovery algorithm using the CAFADIS camera. The CAFADIS camera is a new plenoptic sensor patented by the Universidad de La Laguna (Canary Islands, Spain): international patent PCT/ES2007/000046 (WIPO publication number WO/2007/082975). It can simultaneously measure the wavefront phase and the distance to the light source in a real-time process. The pipeline algorithm is implemented using Field Programmable Gate Arrays (FPGAs). These devices provide an architecture capable of handling the sensor output stream using a massively parallel approach, and they are efficient enough to resolve several Adaptive Optics (AO) problems in Extremely Large Telescopes (ELTs) in terms of processing time requirements. The FPGA implementation of the wavefront phase recovery algorithm using the CAFADIS camera is based on the very fast computation of two-dimensional fast Fourier transforms (FFTs). Thus, we have carried out a comparison between our novel FPGA 2D-FFT and other implementations. PMID:22315523
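    The computational core, recovering phase from measured slopes with a handful of 2D FFTs, can be written compactly. The sketch below assumes a square grid, forward-difference (Hudgin-type) slope geometry and periodic boundaries; it shows why the kernel maps naturally onto FPGA FFT pipelines, but it is not the CAFADIS pipeline itself:

      import numpy as np

      def reconstruct_phase(sx, sy):
          """Least-squares wavefront from x/y slope maps via 2D FFTs."""
          n = sx.shape[0]                            # assumes square n x n grids
          k = np.fft.fftfreq(n)
          dx = np.exp(2j * np.pi * k)[None, :] - 1   # forward-difference filter in x
          dy = np.exp(2j * np.pi * k)[:, None] - 1   # and in y
          denom = np.abs(dx) ** 2 + np.abs(dy) ** 2
          denom[0, 0] = 1.0                          # piston is unobservable; pin it
          phase_hat = (np.conj(dx) * np.fft.fft2(sx) +
                       np.conj(dy) * np.fft.fft2(sy)) / denom
          phase_hat[0, 0] = 0.0
          return np.fft.ifft2(phase_hat).real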

  17. An efficient pipeline wavefront phase recovery for the CAFADIS camera for extremely large telescopes.

    PubMed

    Magdaleno, Eduardo; Rodríguez, Manuel; Rodríguez-Ramos, José Manuel

    2010-01-01

    In this paper we show a fast, specialized hardware implementation of the wavefront phase recovery algorithm using the CAFADIS camera. The CAFADIS camera is a new plenoptic sensor patented by the Universidad de La Laguna (Canary Islands, Spain): international patent PCT/ES2007/000046 (WIPO publication number WO/2007/082975). It can simultaneously measure the wavefront phase and the distance to the light source in a real-time process. The pipeline algorithm is implemented using Field Programmable Gate Arrays (FPGAs). These devices provide an architecture capable of handling the sensor output stream using a massively parallel approach, and they are efficient enough to resolve several Adaptive Optics (AO) problems in Extremely Large Telescopes (ELTs) in terms of processing time requirements. The FPGA implementation of the wavefront phase recovery algorithm using the CAFADIS camera is based on the very fast computation of two-dimensional fast Fourier transforms (FFTs). Thus, we have carried out a comparison between our novel FPGA 2D-FFT and other implementations.

  18. The problem of extreme events in paired-watershed studies

    Treesearch

    James W. Hornbeck

    1973-01-01

    In paired-watershed studies, the occurrence of an extreme event during the after-treatment period presents a problem: the effects of treatment must be determined by using greatly extrapolated regression statistics. Several steps are presented to help ensure careful handling of extreme events during analysis and reporting of research results.

  19. Unique Chernobyl Cranes for Deconstruction Activities in the New Safe Confinement - 13542

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parameswaran, N.A. Vijay; Chornyy, Igor; Owen, Rob

    2013-07-01

    The devastation left behind by the Chernobyl nuclear power plant (ChNPP) Unit 4 accident, which occurred on April 26, 1986, presented unparalleled technical challenges to the world engineering and scientific community. One of the largest tasks in progress is the design and construction of the New Safe Confinement (NSC). The NSC is an engineered enclosure for the entire object shelter (OS) that includes a suite of process equipment. The process equipment will be used for the dismantling of the destroyed ChNPP Unit. One of the major mechanical handling systems to be installed in the NSC is the Main Cranes System (MCS). The planned decontamination and decommissioning or dismantling (D and D) activities will require the handling of heavily shielded waste disposal casks containing nuclear fuel as well as lifting and transporting extremely large structural elements. These activities, to be performed within the NSC, will require large and sophisticated cranes. The article will focus on the unique design features of the MCS for the D and D activities. (authors)

  20. Critical parts are stored and shipped in environmentally controlled reusable container

    NASA Technical Reports Server (NTRS)

    Kummerfeld, K. R.

    1966-01-01

    Environmentally controlled, hermetically sealed, reusable metal cabinet with storage drawers is used to ship and store sensitive electronic, pneumatic, or hydraulic parts or medical supplies under extreme weather or handling conditions. This container is compatible with on-site and transportation handling facilities.

  1. Topological Analysis and Gaussian Decision Tree: Effective Representation and Classification of Biosignals of Small Sample Size.

    PubMed

    Zhang, Zhifei; Song, Yang; Cui, Haochen; Wu, Jayne; Schwartz, Fernando; Qi, Hairong

    2017-09-01

    Bucking the trend of big data, in microdevice engineering small sample size is common, especially when the device is still at the proof-of-concept stage. The small sample size, small interclass variation, and large intraclass variation have brought new challenges to biosignal analysis. Novel representation and classification approaches need to be developed to effectively recognize targets of interest in the absence of a large training set. Moving away from the traditional signal analysis in the spatiotemporal domain, we exploit the biosignal representation in the topological domain that reveals the intrinsic structure of point clouds generated from the biosignal. Additionally, we propose a Gaussian-based decision tree (GDT), which can efficiently classify biosignals even when the sample size is extremely small. This study is motivated by the application of mastitis detection using low-voltage alternating current electrokinetics (ACEK), where five categories of biosignals need to be recognized with only two samples in each class. Experimental results demonstrate the robustness of the topological features as well as the advantage of GDT over some conventional classifiers in handling small datasets. Our method reduces the voltage of ACEK to a safe level and still yields high-fidelity results with a short assay time. This paper makes two distinctive contributions to the field of biosignal analysis: performing signal processing in the topological domain and handling extremely small datasets. Currently, there have been no related works that can efficiently tackle the dilemma between avoiding electrochemical reaction and accelerating the assay process using ACEK.
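    The abstract does not spell out the GDT construction, so the sketch below shows only the ingredient such a tree is built from: fitting a per-class Gaussian to one feature from very few samples (two per class suffice for a mean and a spread) and assigning by log-likelihood. Feature selection and the tree logic itself are omitted:

      import numpy as np

      def fit_gaussians(x, y):
          """Per-class (mean, std) for one feature; usable with 2 samples per class."""
          return {c: (x[y == c].mean(), x[y == c].std() + 1e-9) for c in np.unique(y)}

      def classify(value, params):
          """Pick the class whose Gaussian gives the highest log-likelihood."""
          loglik = lambda mu, sd: -np.log(sd) - 0.5 * ((value - mu) / sd) ** 2
          return max(params, key=lambda c: loglik(*params[c]))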

  2. 9 CFR 3.92 - Handling.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Handling. 3.92 Section 3.92 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL...: (1) Shelter from sunlight and extreme heat. Sufficient shade must be provided to protect the nonhuman...

  3. 9 CFR 3.92 - Handling.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Handling. 3.92 Section 3.92 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL...: (1) Shelter from sunlight and extreme heat. Sufficient shade must be provided to protect the nonhuman...

  4. 9 CFR 3.92 - Handling.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Handling. 3.92 Section 3.92 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL...: (1) Shelter from sunlight and extreme heat. Sufficient shade must be provided to protect the nonhuman...

  5. Claims incidence of work-related disorders of the upper extremities: Washington state, 1987 through 1995.

    PubMed

    Silverstein, B; Welp, E; Nelson, N; Kalat, J

    1998-12-01

    This study examined the claim incidence rate, cost, and industry distribution of work-related upper extremity disorders in Washington. Washington State Fund workers' compensation claims from 1987 to 1995 were abstracted and categorized into general and specific disorders of gradual or sudden onset. Accepted claims included 100,449 for hand/wrist disorders (incidence rate: 98.2/10,000 full-time equivalents; carpal tunnel syndrome rate: 27.3), 30,468 for elbow disorders (incidence rate: 29.7; epicondylitis rate: 11.7), and 55,315 for shoulder disorders (incidence rate: 54.0; rotator cuff syndrome rate: 19.9). Average direct workers' compensation claims costs (medical treatment and indemnity) were $15,790 (median: $6774) for rotator cuff syndrome, $12,794 for carpal tunnel syndrome (median: $4190), and $6593 for epicondylitis (median: $534). Construction and food processing were among the industries with the highest rate ratios for all disorders (> 4.0). Upper extremity disorders represent a large and costly problem in Washington State industry. Industries characterized by manual handling and repetitive work have high rate ratios. The contingent workforce appears to be at high risk.

  6. Care and handling of container plants from storage to outplanting

    Treesearch

    Thomas D. Landis; R. Kasten Dumroese

    2011-01-01

    Nursery plants are in a period of high risk from the time they leave the protected environment of the nursery to when they are outplanted. During handling and shipping, nursery stock may be exposed to many damaging stresses, including extreme temperatures, desiccation, mechanical injuries, and storage molds. This is also the period of greatest financial risk, because...

  7. Finger doses for staff handling radiopharmaceuticals in nuclear medicine.

    PubMed

    Pant, Gauri S; Sharma, Sanjay K; Rath, Gaura K

    2006-09-01

    Radiation doses to the fingers of occupational workers handling 99mTc-labeled compounds and 131I for diagnostic and therapeutic procedures in nuclear medicine were measured by thermoluminescence dosimetry. The doses were measured at the base of the ring finger and the index finger of both hands in 2 groups of workers. Group 1 (7 workers) handled 99mTc-labeled radiopharmaceuticals, and group 2 (6 workers) handled 131I for diagnosis and therapy. Radiation doses to the fingertips of 3 workers also were measured. Two were from group 1, and 1 was from group 2. The doses to the base of the fingers for the radiopharmacy staff and physicians from group 1 were observed to be 17+/-7.5 (mean+/-SD) and 13.4+/-6.5 microSv/GBq, respectively. Similarly, the dose to the base of the fingers for the 3 physicians in group 2 was estimated to be 82.0+/-13.8 microSv/GBq. Finger doses for the technologists in both groups could not be calculated per unit of activity because they did not handle the radiopharmaceuticals directly. Their doses were reported in millisieverts that accumulated in 1 wk. The doses to the fingertips of the radiopharmacy worker and the physician in group 1 were 74.3+/-19.8 and 53.5+/-21.9 microSv/GBq, respectively. The dose to the fingertips of the physician in group 2 was 469.9+/-267 microSv/GBq. The radiation doses to the fingers of nuclear medicine staff at our center were measured. The maximum expected annual dose to the extremities appeared to be less than the annual limit (500 mSv/y), except for a physician who handled large quantities of 131I for treatment. Because all of these workers are on rotation and do not constantly handle radioactivity throughout the year, the doses to the base of the fingers or the fingertips should not exceed the prescribed annual limit of 500 mSv.

  8. Determination of electron-nucleus collisions geometry with forward neutrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, L.; Aschenauer, E.; Lee, J. H.

    2014-12-29

    There are a large number of physics programs one can explore in electron-nucleus collisions at a future electron-ion collider. Collision geometry is very important in these studies, but measurements providing event-by-event geometric control were rarely discussed in prior deep-inelastic scattering experiments off nuclei. This paper provides detailed studies of the potential for tagging collision geometry through forward neutron multiplicity measurements with a zero-degree calorimeter. Such a geometry handle, if achieved, can be extremely beneficial in constraining nuclear effects for the electron-nucleus program at an electron-ion collider.

  9. Survival in extreme environments - on the current knowledge of adaptations in tardigrades.

    PubMed

    Møbjerg, N; Halberg, K A; Jørgensen, A; Persson, D; Bjørn, M; Ramløv, H; Kristensen, R M

    2011-07-01

    Tardigrades are microscopic animals found worldwide in aquatic as well as terrestrial ecosystems. They belong to the invertebrate superclade Ecdysozoa, as do the two major invertebrate model organisms: Caenorhabditis elegans and Drosophila melanogaster. We present a brief description of the tardigrades and highlight species that are currently used as models for physiological and molecular investigations. Tardigrades are uniquely adapted to a range of environmental extremes. Cryptobiosis, currently referred to as a reversible ametabolic state induced by e.g. desiccation, is common especially among limno-terrestrial species. It has been shown that entry into and exit from cryptobiosis may involve synthesis of bioprotectants in the form of selective carbohydrates and proteins as well as high levels of antioxidant enzymes and other free radical scavengers. However, at present a general scheme of mechanisms explaining this phenomenon is lacking. Importantly, recent research has shown that tardigrades even in their active states may be extremely tolerant to environmental stress, handling extreme levels of ionizing radiation and large fluctuations in external salinity, and avoiding freezing by supercooling to below -20 °C, presumably relying on efficient DNA repair mechanisms and osmoregulation. This review summarizes the current knowledge on adaptations found among tardigrades, and presents new data on tardigrade cell numbers and osmoregulation. © 2011 The Authors. Acta Physiologica © 2011 Scandinavian Physiological Society.

  10. Double PCL sign does not always indicate a bucket-handle tear of medial meniscus.

    PubMed

    Liu, Chen; Zheng, Hua Yong; Huang, Yan; Li, Hai Peng; Wu, Han; Sun, Tian Sheng; Yao, Jian Hua

    2016-09-01

    The discoid medial meniscus is an extremely rare anomaly. Bilateral discoid medial menisci are rarer still but intermittently reported. We report the first case of bilateral discoid medial menisci with a positive double PCL sign, which typically indicates a bucket-handle tear of the medial meniscus. A literature review was also conducted on bilateral discoid medial menisci.

  11. Reaction factoring and bipartite update graphs accelerate the Gillespie Algorithm for large-scale biochemical systems.

    PubMed

    Indurkhya, Sagar; Beal, Jacob

    2010-01-06

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage proportional to the number of reactions and species, rather than the quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
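    To make the factoring/bipartite-update idea concrete, here is a toy direct-method Gillespie simulation (not LOLCAT itself; the two-reaction network and rate constants are placeholders) in which a species-to-reactions map confines propensity recomputation to the reactions actually affected by a firing:

      import math, random

      # (rate constant, reactant species, product species), stoichiometry 1 throughout
      reactions = [(1.0, ("A", "B"), ("C",)),
                   (0.5, ("C",), ("A", "B"))]
      state = {"A": 100, "B": 100, "C": 0}

      # species -> indices of reactions whose propensity depends on that species
      deps = {}
      for i, (_, reac, _) in enumerate(reactions):
          for s in reac:
              deps.setdefault(s, set()).add(i)

      def propensity(i):
          rate, reac, _ = reactions[i]
          a = rate
          for s in reac:
              a *= state[s]
          return a

      a = [propensity(i) for i in range(len(reactions))]
      t = 0.0
      while t < 1.0 and sum(a) > 0:
          a0 = sum(a)
          t += -math.log(1.0 - random.random()) / a0   # exponential waiting time
          r, i = random.random() * a0, 0               # choose a reaction by propensity
          while r > a[i]:
              r -= a[i]
              i += 1
          _, reac, prod = reactions[i]
          touched = set()
          for s in reac:
              state[s] -= 1
              touched |= deps.get(s, set())
          for s in prod:
              state[s] += 1
              touched |= deps.get(s, set())
          for j in touched:                            # sparse update instead of full recompute
              a[j] = propensity(j)
      print(t, state)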

  12. Reaction Factoring and Bipartite Update Graphs Accelerate the Gillespie Algorithm for Large-Scale Biochemical Systems

    PubMed Central

    Indurkhya, Sagar; Beal, Jacob

    2010-01-01

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage proportional to the number of reactions and species, rather than the quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models. PMID:20066048

  13. Enabling large-scale next-generation sequence assembly with Blacklight

    PubMed Central

    Couger, M. Brian; Pipes, Lenore; Squina, Fabio; Prade, Rolf; Siepel, Adam; Palermo, Robert; Katze, Michael G.; Mason, Christopher E.; Blood, Philip D.

    2014-01-01

    A variety of extremely challenging biological sequence analyses were conducted on the XSEDE large shared memory resource Blacklight, using current bioinformatics tools and encompassing a wide range of scientific applications. These include genomic sequence assembly, very large metagenomic sequence assembly, transcriptome assembly, and sequencing error correction. The data sets used in these analyses included uncategorized fungal species, reference microbial data, very large soil and human gut microbiome sequence data, and primate transcriptomes, composed of both short-read and long-read sequence data. A new parallel command execution program was developed on the Blacklight resource to handle some of these analyses. These results, initially reported at XSEDE13 and expanded here, represent significant advances for their respective scientific communities. The breadth and depth of the results achieved demonstrate the ease of use, versatility, and unique capabilities of the Blacklight XSEDE resource for scientific analysis of genomic and transcriptomic sequence data, and the power of these resources, together with XSEDE support, in meeting the most challenging scientific problems. PMID:25294974

  14. Environmental Symposium Held in Crystal City, Virginia on May 5-6, 1992

    DTIC Science & Technology

    1992-05-01

    addition, the Act created a new program designed to prevent sudden, accidental releases of extremely hazardous substances. Generally, the Act sets forth a...prevention of sudden...The owner or operator of any facility handling an extremely hazardous substance will also be required to prepare and implement a risk management plan to detect and prevent or minimize the potential for an accidental release of extremely hazardous substances. EPA may require that such

  15. The Global Streamflow Indices and Metadata archive (G-SIM): A compilation of global streamflow time series indices and meta-data

    NASA Astrophysics Data System (ADS)

    Do, Hong; Gudmundsson, Lukas; Leonard, Michael; Westra, Seth; Senerivatne, Sonia

    2017-04-01

    In-situ observations of daily streamflow with global coverage are a crucial asset for understanding large-scale freshwater resources, which are an essential component of the Earth system and a prerequisite for societal development. Here we present the Global Streamflow Indices and Metadata archive (G-SIM), a collection of indices derived from more than 20,000 daily streamflow time series across the globe. These indices are designed to support global assessments of change in wet and dry extremes, and have been compiled from 12 free-to-access online databases (seven national databases and five international collections). The G-SIM archive also includes significant metadata to support detailed understanding of streamflow dynamics, including drainage-area shapefiles and many essential catchment properties such as land cover type, soil and topographic characteristics. The project's automated data handling and quality control procedures make G-SIM a reproducible, extendible archive that can be utilised for many purposes in large-scale hydrology. Some potential applications include the identification of observational trends in hydrological extremes, the assessment of climate change impacts on streamflow regimes, and the validation of global hydrological models.
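    A small sketch of the kind of per-station index computation such an archive compiles (the file name, column names, and the 95% completeness screen are hypothetical, not the actual G-SIM schema):

      import pandas as pd

      q = pd.read_csv("station_0001.csv", parse_dates=["date"],
                      index_col="date")["discharge"]      # one daily streamflow series

      year = q.index.year
      amax = q.groupby(year).max()                        # wet-extreme index: annual maximum
      min7 = q.rolling(7).mean().groupby(year).min()      # dry-extreme index: 7-day low flow
      coverage = q.notna().groupby(year).mean()           # fraction of days with data
      indices = pd.DataFrame({"AMAX": amax, "MIN7": min7})[coverage >= 0.95]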

  16. The National Extreme Events Data and Research Center (NEED)

    NASA Astrophysics Data System (ADS)

    Gulledge, J.; Kaiser, D. P.; Wilbanks, T. J.; Boden, T.; Devarakonda, R.

    2014-12-01

    The Climate Change Science Institute at Oak Ridge National Laboratory (ORNL) is establishing the National Extreme Events Data and Research Center (NEED), with the goal of transforming how the United States studies and prepares for extreme weather events in the context of a changing climate. NEED will encourage the myriad, distributed extreme events research communities to move toward the adoption of common practices and will develop a new database compiling global historical data on weather- and climate-related extreme events (e.g., heat waves, droughts, and hurricanes) and related information about impacts, costs, recovery, and available research. Currently, extreme event information is not easy to access and is largely incompatible and inconsistent across web sites. NEED's database development will take into account differences in time frames, spatial scales, treatments of uncertainty, and other parameters and variables, and leverage informatics tools developed at ORNL (i.e., the Metadata Editor [1] and Mercury [2]) to generate standardized, robust documentation for each database along with a web-searchable catalog. In addition, NEED will facilitate convergence on commonly accepted definitions and standards for extreme events data and will enable integrated analyses of coupled threats, such as hurricanes/sea-level rise/flooding and droughts/wildfires. Our goal and vision is that NEED will become the premier integrated resource for the general study of extreme events. References: [1] Devarakonda, Ranjeet, et al. "OME: Tool for generating and managing metadata to handle BigData." Big Data (Big Data), 2014 IEEE International Conference on. IEEE, 2014. [2] Devarakonda, Ranjeet, et al. "Mercury: reusable metadata management, data discovery and access system." Earth Science Informatics 3.1-2 (2010): 87-94.

  17. Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method.

    PubMed

    Matsushima, Kyoji; Nakahara, Sumio

    2009-12-01

    A large-scale full-parallax computer-generated hologram (CGH) with four billion (2^16 x 2^16) pixels is created to reconstruct a fine true 3D image of a scene, with occlusions. The polygon-based method numerically generates the object field of a surface object whose shape is provided by a set of vertex data of polygonal facets, while the silhouette method makes it possible to reconstruct the occluded scene. A novel technique using a segmented frame buffer is presented for handling and propagating large wave fields even in the case where the whole wave field cannot be stored in memory. We demonstrate that the full-parallax CGH, calculated by the proposed method and fabricated by a laser lithography system, reconstructs a fine 3D image accompanied by a strong sensation of depth.

  18. Status and path to a final EUVL reticle-handling solution

    NASA Astrophysics Data System (ADS)

    He, Long; Orvek, Kevin; Seidel, Phil; Wurm, Stefan; Underwood, Jon; Betancourt, Ernie

    2007-03-01

    In extreme ultraviolet lithography (EUVL), the lack of a suitable material to build conventional pellicles calls for industry standardization of new techniques for protection and handling throughout the reticle's lifetime. This includes reticle shipping, robotic handling, in-fab transport, storage, and use in atmospheric environments for metrology and vacuum environments for EUV exposure. In this paper, we review the status of the industry-wide progress in developing EUVL reticle-handling solutions. We show the industry's leading reticle carrier approaches for particle-free protection, such as improvements in conventional single carrier designs and new EUVL-specific carrier concepts, including variations on a removable pellicle. Our tests indicate that the dual-pod approach with a removable pellicle led to nearly particle-free use during a simulated life cycle, at ~50 nm inspection sensitivity. We will provide an assessment of the remaining technical challenges facing EUVL reticle-handling technology. Finally, we will review the progress of the SEMI EUVL Reticle-handling Task Force in its efforts to standardize a final EUV reticle protection and handling solution.

  19. Development of an instrument for assessing workstyle in checkout cashier work (BAsIK).

    PubMed

    Kjellberg, Katarina; Palm, Peter; Josephson, Malin

    2012-01-01

    Checkout cashier work consists of handling a large number of items during a work shift, which implies repetitive movements of the shoulders, arms and hands/wrists, and a high work rate. The work is associated with a high prevalence of disorders in the neck and upper extremity. The concept of workstyle explains how ergonomic and psychosocial factors interact in the development of work-related upper extremity disorders. The aim of the project was to develop an instrument for the occupational health services to be used in the efforts to prevent upper extremity disorders in checkout cashier work. The instrument is based on the workstyle concept and is intended to be used as a tool to identify high-risk workstyle and needs for interventions, such as training and education. The instrument, BAsIK, consists of four parts; a questionnaire about workstyle, an observation protocol for work technique, a checklist about the design of the checkout and a questionnaire about work organization. The instrument was developed by selecting workstyle items developed for office work and adapting them to checkout cashier work, discussions with researchers and ergonomists, focus-group interviews with cashiers, observations of video recordings of cashiers, and studies of existing guidelines and checklists.

  20. Computational data sciences for assessment and prediction of climate extremes

    NASA Astrophysics Data System (ADS)

    Ganguly, A. R.

    2011-12-01

    Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science that is relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at scales or for the extremes of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes and tropical cyclones in a useful and interpretable fashion. A community-wide effort is motivated to develop and adapt computational data science tools for translating climate model simulations to information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.

  1. bigSCale: an analytical framework for big-scale single-cell data.

    PubMed

    Iacono, Giovanni; Mereu, Elisabetta; Guillaumet-Adkins, Amy; Corominas, Roser; Cuscó, Ivon; Rodríguez-Esteban, Gustavo; Gut, Marta; Pérez-Jurado, Luis Alberto; Gut, Ivo; Heyn, Holger

    2018-06-01

    Single-cell RNA sequencing (scRNA-seq) has significantly deepened our insights into complex tissues, with the latest techniques capable of processing tens of thousands of cells simultaneously. Analyzing increasing numbers of cells, however, generates extremely large data sets, extending processing time and challenging computing resources. Current scRNA-seq analysis tools are not designed to interrogate large data sets and often lack sensitivity to identify marker genes. With bigSCale, we provide a scalable analytical framework to analyze millions of cells, which addresses the challenges associated with large data sets. To handle the noise and sparsity of scRNA-seq data, bigSCale uses large sample sizes to estimate an accurate numerical model of noise. The framework further includes modules for differential expression analysis, cell clustering, and marker identification. A directed convolution strategy allows processing of extremely large data sets, while preserving transcript information from individual cells. We evaluated the performance of bigSCale using both a biological model of aberrant gene expression in patient-derived neuronal progenitor cells and simulated data sets, which underlines the speed and accuracy in differential expression analysis. To test its applicability for large data sets, we applied bigSCale to assess 1.3 million cells from the mouse developing forebrain. Its directed down-sampling strategy accumulates information from single cells into index cell transcriptomes, thereby defining cellular clusters with improved resolution. Accordingly, index cell clusters identified rare populations, such as reelin (Reln)-positive Cajal-Retzius neurons, for which we report previously unrecognized heterogeneity associated with distinct differentiation stages, spatial organization, and cellular function. Together, bigSCale presents a solution to address future challenges of large single-cell data sets. © 2018 Iacono et al.; Published by Cold Spring Harbor Laboratory Press.

  2. A DSP-based readout and online processing system for a new focal-plane polarimeter at AGOR

    NASA Astrophysics Data System (ADS)

    Hagemann, M.; Bassini, R.; van den Berg, A. M.; Ellinghaus, F.; Frekers, D.; Hannen, V. M.; Häupke, T.; Heyse, J.; Jacobs, E.; Kirsch, M.; Krüsemann, B.; Rakers, S.; Sohlbach, H.; Wörtche, H. J.

    1999-11-01

    A Focal-Plane Polarimeter (FPP) for the large acceptance Big-Bite Spectrometer (BBS) at AGOR using a novel readout architecture has been commissioned at the KVI Groningen. The instrument is optimized for medium-energy polarized proton scattering near or at 0°. For the handling of the high counting rates at extreme forward angles and for the suppression of small-angle scattering in the graphite analyzer, a high-performance data processing DSP system connecting to the LeCroy FERA and PCOS ECL bus architecture has been made operational and tested successfully. Details of the system and the functions of the various electronic components are described.

  3. The R-Shell approach - Using scheduling agents in complex distributed real-time systems

    NASA Technical Reports Server (NTRS)

    Natarajan, Swaminathan; Zhao, Wei; Goforth, Andre

    1993-01-01

    Large, complex real-time systems such as space and avionics systems are extremely demanding in their scheduling requirements. Current OS design approaches are quite limited in the capabilities they provide for task scheduling. Typically, they simply implement a particular uniprocessor scheduling strategy and do not provide any special support for network scheduling, overload handling, fault tolerance, distributed processing, etc. Our design of the R-Shell real-time environment facilitates the implementation of a variety of sophisticated but efficient scheduling strategies, including incorporation of all these capabilities. This is accomplished by the use of scheduling agents which reside in the application run-time environment and are responsible for coordinating the scheduling of the application.

  4. The extreme ultraviolet explorer archive

    NASA Astrophysics Data System (ADS)

    Polomski, E.; Drake, J. J.; Dobson, C.; Christian, C.

    1993-09-01

    The Extreme Ultraviolet Explorer (EUVE) public archive was created to handle the storage, maintenance, and distribution of EUVE data and ancillary documentation, information, and software. Access to the archive became available to the public on July 17, 1992, only 40 days after the launch of the EUVE satellite. A brief overview of the archive's contents and the various methods of access is presented.

  5. The Extreme Ultraviolet Explorer science instruments development - Lessons learned

    NASA Technical Reports Server (NTRS)

    Malina, Roger F.; Battel, S.

    1991-01-01

    The science instruments development project for the Extreme Ultraviolet Explorer (EUVE) satellite is reviewed. Issues discussed include the philosophical basis of the program, the establishment of a tight development team, the approach to planning and phasing activities, the handling of the most difficult technical problems, and the assessment of the work done during the preimplementation period of the project.

  6. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    PubMed

    Schaarup-Jensen, K; Rasmussen, M R; Thorndahl, S

    2009-01-01

    In urban drainage modelling, long-term extreme statistics have become an important basis for decision-making, e.g. in connection with renovation projects. It is therefore of great importance to minimize the uncertainties in long-term predictions of maximum water levels and combined sewer overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties in rainfall inputs and parameters, and in the assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme event statistics of maximum water levels in manholes and CSO volumes. Traditionally, long-term rainfall series from a local rain gauge are unavailable. In the present case study, however, long and local rain series are available. 2 rainfall gauges have recorded events for approximately 9 years at 2 locations within the catchment. Besides these 2 gauges, another 7 gauges are located within 20 kilometers of the catchment. All gauges are included in the Danish national rain gauge system, which was launched in 1976. The paper describes to what extent the extreme event statistics based on these 9 series diverge from each other and how this diversity can be handled, e.g. by introducing an "averaging procedure" based on the variability within the set of statistics. All simulations are performed by means of the MOUSE LTS model.
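    The flavor of the comparison can be sketched as follows: empirical return levels of annual maxima are computed per rainfall series, and their spread across the 9 series summarizes the sensitivity. The synthetic Gumbel samples below stand in for the 9 sets of simulated annual-maximum water levels, and the Weibull plotting position is one conventional choice among several:

      import numpy as np

      def return_levels(annual_max, T=(2, 5, 10)):
          """Empirical return levels via Weibull plotting positions."""
          x = np.sort(annual_max)
          p = np.arange(1, len(x) + 1) / (len(x) + 1)   # non-exceedance probability
          return {t: np.interp(1 - 1 / t, p, x) for t in T}

      series = [np.random.default_rng(s).gumbel(1.0, 0.3, 9) for s in range(9)]
      levels = [return_levels(s) for s in series]
      ten_year = [lv[10] for lv in levels]
      print("10-yr level: mean", np.mean(ten_year), "spread (sd)", np.std(ten_year))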

  7. Addition of Electrostatic Forces to EDEM with Applications to Triboelectrically Charged Particles

    NASA Technical Reports Server (NTRS)

    Hogue, Michael D.; Calle, Carlos; Curry, David

    2008-01-01

    Tribocharging of particles is common in many processes, including fine powder handling and mixing, printer toner transport, and dust extraction. In a lunar environment, with its high vacuum and lack of water, electrostatic forces are an important factor to consider when designing and operating equipment. Dust mitigation and management is critical to safe and predictable performance of people and equipment. The extreme nature of lunar conditions makes it difficult and costly to carry out the experiments on Earth that are necessary to better understand how particles gather and transfer charge between each other and with equipment surfaces. DEM (Discrete Element Modeling) provides an excellent virtual laboratory for studying tribocharging of particles, as well as for the design of devices for dust mitigation and for other purposes related to handling and processing of lunar regolith. Theoretical and experimental work has been performed pursuant to incorporating screened Coulombic electrostatic forces into EDEM™, a commercial DEM software package. The DEM software is used to model the trajectories of large numbers of particles for industrial particulate handling and processing applications and can be coupled with other solvers and numerical models to calculate particle interaction with surrounding media and force fields. In this paper we present an overview of the theoretical calculations and experimental data and their comparison to the results of the DEM simulations. We also discuss current plans to revise the DEM software with advanced electrodynamic and mechanical algorithms.
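    The pairwise force described can be sketched with a screened-Coulomb (Yukawa-type) law; the exponential screening form and the screening length are illustrative assumptions, not necessarily the exact expression added to EDEM:

      import numpy as np

      EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

      def screened_coulomb_force(p1, p2, q1, q2, screen_len):
          """Force on particle 1 from the potential U = k*q1*q2*exp(-r/L)/r."""
          d = p2 - p1
          r = np.linalg.norm(d)
          mag = (q1 * q2 / (4 * np.pi * EPS0) * np.exp(-r / screen_len)
                 * (1 / r**2 + 1 / (screen_len * r)))
          return -mag * d / r   # like charges (mag > 0) push particle 1 away from 2

    In a DEM loop this would be evaluated for neighbor pairs within a cutoff of a few screening lengths, since the exponential makes more distant contributions negligible.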

  8. Chemical Accident Prevention: Site Security

    EPA Pesticide Factsheets

    This chemical safety alert assists facilities that routinely handle extremely hazardous substances, along with SERCs, LEPCs, and emergency responders, in their efforts to reduce criminally caused releases and vulnerability to terrorist activity.

  9. Toward high-throughput genotyping: dynamic and automatic software for manipulating large-scale genotype data using fluorescently labeled dinucleotide markers.

    PubMed

    Li, J L; Deng, H; Lai, D B; Xu, F; Chen, J; Gao, G; Recker, R R; Deng, H W

    2001-07-01

    To efficiently manipulate large amounts of genotype data generated with fluorescently labeled dinucleotide markers, we developed a Microsoft-based database management system. The system offers several advantages. First, it accommodates the dynamic nature of the accumulation of genotype data during the genotyping process; some data need to be confirmed or replaced by repeat lab procedures. With the system, raw genotype data can be imported easily and continuously and incorporated into the database during a genotyping process that may continue over an extended period of time in large projects. Second, almost all of the procedures are automatic, including auto-comparison of the raw data read by different technicians from the same gel, auto-adjustment among the allele fragment-size data from cross-runs or cross-platforms, auto-binning of alleles, and auto-compilation of genotype data for suitable programs to perform inheritance checks in pedigrees. Third, the system provides functions to track electrophoresis gel files to locate gel or sample sources for any resultant genotype data, which is extremely helpful for double-checking the consistency of raw and final data and for directing repeat experiments. In addition, the user-friendly graphic interface of the system renders processing of large amounts of data much less labor-intensive. Furthermore, the system has built-in mechanisms to detect some genotyping errors and to assess the quality of genotype data, which are then summarized in automatically generated statistical reports. The system can easily handle >500,000 genotype data entries, a number more than sufficient for typical whole-genome linkage studies. The modules and programs we developed for the system can be extended to other database platforms, such as Microsoft SQL Server, if the capability to handle still greater quantities of genotype data simultaneously is desired.
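    One of the functions described, automatic binning of alleles, can be sketched with a simple gap-based grouping of fragment sizes; the 0.7 bp gap threshold is an illustrative assumption, not the paper's rule:

      def autobin(sizes, gap=0.7):
          """Group sorted fragment sizes into allele bins separated by > gap bp."""
          sizes = sorted(sizes)
          bins, current = [], [sizes[0]]
          for s in sizes[1:]:
              if s - current[-1] > gap:
                  bins.append(current)
                  current = [s]
              else:
                  current.append(s)
          bins.append(current)
          return {round(sum(b) / len(b), 1): b for b in bins}  # bin center -> members

      print(autobin([102.1, 102.3, 104.0, 104.2, 106.1]))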

  10. 21 CFR 522.995 - Fluprostenol sodium injection.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... for food. For veterinary use only. Federal law restricts this drug to use by or on the order of a... respiratory problems should exercise extreme caution when handling this product. In the early stages, women...

  11. 21 CFR 522.995 - Fluprostenol sodium injection.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... for food. For veterinary use only. Federal law restricts this drug to use by or on the order of a... respiratory problems should exercise extreme caution when handling this product. In the early stages, women...

  12. Management of stormwater facility maintenance residuals

    DOT National Transportation Integrated Search

    1998-06-01

    Current research on stormwater maintenance residuals has revealed that the source and nature of these materials is extremely variable, that regulation can be ambiguous, and handling can be costly and difficult. From a regulatory perspective, data ind...

  13. HOMOGENEOUS NUCLEAR POWER REACTOR

    DOEpatents

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature; it is self-regulating under extreme operating conditions and is controlled by the thermal expansion of the liquid fuel. The liquid fuel is a solution of uranium, phosphoric acid, and water that requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  14. Mooring and ground handling rigid airships

    NASA Technical Reports Server (NTRS)

    Walker, H., Jr.

    1975-01-01

    The problems of mooring and ground handling rigid airships are discussed. A brief history of mooring and ground handling of rigid airships from July 2, 1900 through September 1, 1939 is included, along with a brief history of ground handling developments with large U.S. Navy nonrigid airships between September 1, 1939 and August 31, 1962, where the equipment and techniques developed appear applicable to future large rigid airships. Finally, recommendations are made pertaining to equipment and procedures that appear desirable and feasible for future rigid airship programs.

  15. Solar Power Generation in Extreme Space Environments

    NASA Technical Reports Server (NTRS)

    Elliott, Frederick W.; Piszczor, Michael F.

    2016-01-01

    The exploration of space requires power for guidance, navigation, and control; instrumentation; thermal control; communications and data handling; and many subsystems and activities. Generating sufficient and reliable power in deep space through the use of solar arrays becomes even more challenging as solar intensity decreases and high radiation levels begin to degrade the performance of photovoltaic devices. The Extreme Environments Solar Power (EESP) project goal is to develop advanced photovoltaic technology to address these challenges.

  16. Ergonomic material-handling device

    DOEpatents

    Barsnick, Lance E.; Zalk, David M.; Perry, Catherine M.; Biggs, Terry; Tageson, Robert E.

    2004-08-24

    A hand-held ergonomic material-handling device capable of moving heavy objects, such as large waste containers and other large objects requiring mechanical assistance. The ergonomic material-handling device can be used with neutral postures of the back, shoulders, wrists and knees, thereby reducing potential injury to the user. The device has two key features: 1) the height of the handles can be adjusted to ergonomically fit the needs of the user's back, wrists and shoulders; and 2) the rounded handlebar shape, together with the size and configuration of the handles, keeps the user's wrists in a neutral posture during manipulation of the device.

  17. Cold Regions Environmental Considerations

    DTIC Science & Technology

    2009-02-03

    [Fragment of a table on cold-regions hydrology (braided streams, variable discharge, seasonal breakup; frozen lakes and bogs); the table structure is not recoverable from this extract.] ...fuel hoses may crack, increasing the potential for fuel spills. Extreme care must be used when handling cables at cold temperatures, protecting the

  18. Scalable domain decomposition solvers for stochastic PDEs in high performance computing

    DOE PAGES

    Desai, Ajit; Khalil, Mohammad; Pettit, Chris; ...

    2017-09-21

    Stochastic spectral finite element models of practical engineering systems may involve solutions of linear systems, or linearized systems for nonlinear problems, with billions of unknowns. For stochastic modeling it is therefore essential to design robust, parallel and scalable algorithms that can efficiently utilize high-performance computing to tackle such large-scale systems. Domain decomposition based iterative solvers can handle such systems, and though these algorithms exhibit excellent scalability, significant algorithmic and implementation challenges remain in extending them to solve extreme-scale stochastic systems on emerging computing platforms. Intrusive polynomial chaos expansion based domain decomposition algorithms are extended here to concurrently handle high resolution in both the spatial and stochastic domains using an in-house implementation. Sparse iterative solvers with efficient preconditioners are employed to solve the resulting global and subdomain-level local systems through multi-level iterative solvers. We also use parallel sparse matrix-vector operations to reduce the floating-point operations and memory requirements. Numerical and parallel scalabilities of these algorithms are presented for the diffusion equation with a spatially varying diffusion coefficient modeled by a non-Gaussian stochastic process. Scalability of the solvers with respect to the number of random variables is also investigated.
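
    The intrusive polynomial chaos solver itself is far beyond a snippet, but the domain-decomposition idea underlying it can be illustrated on a deterministic 1D Poisson problem with a damped additive Schwarz iteration (the grid size, subdomain split and damping factor are illustrative choices, not values from the paper):

```python
import numpy as np

# 1D Poisson: -u'' = 1 on (0,1), u(0)=u(1)=0, on 99 interior points.
n, h = 99, 1.0 / 100
A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
f = np.ones(n)

# Two overlapping subdomains (indices 0-54 and 45-98; 10-point overlap)
subdomains = [np.arange(0, 55), np.arange(45, n)]

u = np.zeros(n)
for it in range(200):
    r = f - A @ u                              # global residual
    du = np.zeros(n)
    for idx in subdomains:                     # independent local solves
        du[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])
    u += 0.5 * du                              # damping for the overlap region
    if np.linalg.norm(r) < 1e-8 * np.linalg.norm(f):
        break
print(it, np.linalg.norm(f - A @ u))
```

    The local solves are independent, which is what makes the method attractive on parallel machines; in practice the subdomain solves are preconditioners inside a Krylov iteration rather than a stationary sweep as here.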

  19. Scalable domain decomposition solvers for stochastic PDEs in high performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desai, Ajit; Khalil, Mohammad; Pettit, Chris

    Stochastic spectral finite element models of practical engineering systems may involve solutions of linear systems, or linearized systems for nonlinear problems, with billions of unknowns. For stochastic modeling it is therefore essential to design robust, parallel and scalable algorithms that can efficiently utilize high-performance computing to tackle such large-scale systems. Domain decomposition based iterative solvers can handle such systems, and though these algorithms exhibit excellent scalability, significant algorithmic and implementation challenges remain in extending them to solve extreme-scale stochastic systems on emerging computing platforms. Intrusive polynomial chaos expansion based domain decomposition algorithms are extended here to concurrently handle high resolution in both the spatial and stochastic domains using an in-house implementation. Sparse iterative solvers with efficient preconditioners are employed to solve the resulting global and subdomain-level local systems through multi-level iterative solvers. We also use parallel sparse matrix-vector operations to reduce the floating-point operations and memory requirements. Numerical and parallel scalabilities of these algorithms are presented for the diffusion equation with a spatially varying diffusion coefficient modeled by a non-Gaussian stochastic process. Scalability of the solvers with respect to the number of random variables is also investigated.

  20. Web-based visualization of very large scientific astronomy imagery

    NASA Astrophysics Data System (ADS)

    Bertin, E.; Pillay, R.; Marmo, C.

    2015-04-01

    Visualizing and navigating through large astronomy images from a remote location with current astronomy display tools can be a frustrating experience in terms of speed and ergonomics, especially on mobile devices. In this paper, we present a high-performance, versatile and robust client-server system for remote visualization and analysis of extremely large scientific images. Applications of this work include survey image quality control, interactive data query and exploration, citizen science, as well as public outreach. The proposed software is entirely open source and is designed to be generic and applicable to a variety of datasets. It provides access to floating-point data at terabyte scales, with the ability to precisely adjust image settings in real time. The proposed clients are lightweight, platform-independent web applications built on standard HTML5 web technologies and compatible with both touch- and mouse-based devices. We put the system to the test, assess its performance, and show that a single server can comfortably handle more than a hundred simultaneous users accessing full-precision 32-bit astronomy data.
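
    Servers of this kind typically stream pre-computed multi-resolution tile pyramids; a back-of-the-envelope sketch of such a pyramid's layout (the 256-pixel tile size and the example mosaic dimensions are assumptions, not parameters from the paper):

```python
import math

def tile_pyramid(width, height, tile=256):
    """Zoom levels and per-level tile counts for a pyramid over a
    width x height image; the deepest level is full resolution."""
    levels = int(math.ceil(math.log2(max(width, height) / tile))) + 1
    layout = []
    for z in range(levels):
        scale = 2 ** (levels - 1 - z)        # level 0 is the coarsest
        w, h = math.ceil(width / scale), math.ceil(height / scale)
        layout.append((z, math.ceil(w / tile), math.ceil(h / tile)))
    return layout

# e.g. a hypothetical 120000 x 80000 pixel survey mosaic
for z, tx, ty in tile_pyramid(120_000, 80_000):
    print(f"level {z}: {tx} x {ty} tiles")
```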

  1. A two steps solution approach to solving large nonlinear models: application to a problem of conjunctive use.

    PubMed

    Vieira, J; Cunha, M C

    2011-01-01

    This article describes a method for solving large nonlinear problems in two steps. The two-step approach improves solution efficiency by handling smaller, simpler models and providing better starting points. The set of nonlinear constraints (named complicating constraints) that makes the solution of the model complex and time-consuming is eliminated from step one. The complicating constraints are added only in the second step, so that a solution of the complete model is then found. The solution method is applied to a large-scale problem of conjunctive use of surface water and groundwater resources. The results obtained are compared with solutions determined by directly solving the complete model in one single step. In all examples the two-step approach allowed a significant reduction of the computation time. This gain in efficiency can be extremely important for work in progress, and it can be particularly useful for cases where computation time is a critical factor in obtaining an optimized solution in due time.
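
    A minimal sketch of the two-step idea on a toy problem, assuming SciPy's SLSQP solver; the objective and constraints are invented stand-ins for the conjunctive-use model:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for a large nonlinear program: a quadratic cost, one
# simple linear constraint, and one "complicating" nonlinear
# constraint that is withheld in step one.
objective = lambda x: (x[0] - 3) ** 2 + (x[1] - 2) ** 2
simple_cons = [{"type": "ineq", "fun": lambda x: 4 - x[0] - x[1]}]        # x0 + x1 <= 4
complicating_cons = [{"type": "ineq", "fun": lambda x: 2 - x[0] * x[1]}]  # x0 * x1 <= 2

x0 = np.zeros(2)

# Step 1: smaller, simpler model -> cheap solution, good starting point.
step1 = minimize(objective, x0, constraints=simple_cons, method="SLSQP")

# Step 2: full model, warm-started at the step-1 optimum.
step2 = minimize(objective, step1.x,
                 constraints=simple_cons + complicating_cons, method="SLSQP")
print(step2.x, step2.fun)
```

    Step one is cheap to solve and supplies a near-feasible warm start for the complete model, which is exactly the efficiency mechanism the article reports.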

  2. Capturing spatial and temporal patterns of widespread, extreme flooding across Europe

    NASA Astrophysics Data System (ADS)

    Busby, Kathryn; Raven, Emma; Liu, Ye

    2013-04-01

    Statistical characterisation of physical hazards is an integral part of the probabilistic catastrophe models used by the reinsurance industry to estimate losses from large scale events. Extreme flood events are not restricted by country boundaries, which poses an issue for reinsurance companies as their exposures often extend beyond them. We discuss challenges and solutions that allow us to appropriately capture the spatial and temporal dependence of extreme hydrological events on a continental scale, which in turn enables us to generate an industry-standard stochastic event set for estimating financial losses from widespread flooding. In presenting our event set methodology, we focus on explaining how extreme value theory (EVT) and dependence modelling are used to account for short, inconsistent hydrological data from different countries, and how to make appropriate statistical decisions that best characterise the nature of flooding across Europe. The consistency of input data is of vital importance when identifying historical flood patterns. Collating data from numerous sources inherently causes inconsistencies, and we demonstrate our robust approach to assessing the data and refining it to compile a single consistent dataset. This dataset is then extrapolated using a parameterised EVT distribution to estimate extremes. Our method then captures the dependence of flood events across countries using an advanced multivariate extreme value model. Throughout, important statistical decisions are explored, including: (1) the choice of distribution; (2) the threshold to apply for extracting extreme data points; (3) a regional analysis; (4) the definition of a flood event, which is often linked with the reinsurance industry's hours clause; and (5) the handling of missing values. Finally, having modelled the historical patterns of flooding across Europe, we sample from this model to generate our stochastic event set comprising thousands of events over thousands of years. We then briefly illustrate how this is applied within a probabilistic model to estimate the catastrophic loss curves used by the reinsurance industry.
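
    As a univariate illustration of the peaks-over-threshold EVT machinery mentioned above (the event-set model itself is multivariate and proprietary; the synthetic data, threshold quantile and observation frequency below are assumptions):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
flows = rng.gamma(shape=2.0, scale=50.0, size=20_000)   # synthetic daily flows

u = np.quantile(flows, 0.99)            # threshold choice, a key decision
exceed = flows[flows > u] - u           # peaks-over-threshold excesses

# Fit a generalized Pareto distribution to the excesses (location fixed at 0)
xi, _, sigma = genpareto.fit(exceed, floc=0)

# 100-year return level, assuming ~365 observations per year
rate = exceed.size / flows.size                      # exceedance probability
m = 100 * 365                                        # observations in 100 years
ret100 = u + (sigma / xi) * ((m * rate) ** xi - 1)
print(f"threshold {u:.1f}, xi {xi:.3f}, 100-yr level {ret100:.1f}")
```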

  3. Unique and massive Chernobyl cranes for deconstruction activities in the new safe confinement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parameswaran, N. A. Vijay; Chornyy, Igor; Owen, Rob

    2013-07-01

    On 26 April 1986, the worst nuclear power plant accident in history occurred at the Chernobyl plant in Ukraine (then part of the Soviet Union). The destruction of Unit 4 sent highly radioactive fallout over Belarus, Russia, Ukraine, and Europe. The object shelter, a containment sarcophagus, was built in November 1986 to limit exposure to radiation. However, it had a planned lifespan of only 25 years and would probably not survive even a moderate seismic event in a region that has more than its share of such events. It was time to take action. One of the largest tasks in progress is the design and construction of the New Safe Confinement (NSC). The NSC is an engineered enclosure for the entire object shelter that includes a suite of process equipment. The process equipment will be used for the dismantling of the destroyed Chernobyl Nuclear Power Plant Unit 4. One of the major mechanical handling systems to be installed in the new safe confinement is the Main Cranes System. The planned decontamination and decommissioning or dismantling activities will require the handling of heavily shielded waste disposal casks containing nuclear fuel, as well as the lifting and transporting of extremely large structural elements. These activities, to be performed within the new safe confinement, will require large and sophisticated cranes. The article focuses on the current progress of the new safe confinement and of the main cranes system for the decommissioning or dismantling activities. (authors)

  4. 5. Credit BG. This interior view shows the weigh room, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Credit BG. This interior view shows the weigh room, looking west (240°): Electric lighting and scale read-outs (boxes with circular windows on the wall) are fitted with explosion-proof enclosures; these enclosures prevent malfunctioning electrical parts from sparking and starting fires or explosions. One marble table and scale have been removed at the extreme left of the view. Two remaining scales handle small and large quantities of propellants and additives. Marble tables do not absorb chemicals or conduct electricity; their mass also prevents vibration from upsetting the scales. The floor has an electrically conductive coating to dissipate static electric charges, thus preventing sparks which might ignite propellants. - Jet Propulsion Laboratory Edwards Facility, Weigh & Control Building, Edwards Air Force Base, Boron, Kern County, CA

  5. Larger-Stroke Piezoelectrically Actuated Microvalve

    NASA Technical Reports Server (NTRS)

    Yang, Eui-Hyeok

    2003-01-01

    A proposed normally-closed microvalve would contain a piezoelectric bending actuator instead of a piezoelectric linear actuator like that of the microvalve described in the preceding article. Whereas the stroke of the linear actuator of the preceding article would be limited to approximately 6 micrometers, the stroke of the proposed bending actuator would lie in the approximate range of 10 to 15 micrometers, large enough to enable the microvalve to handle a variety of liquids containing suspended particles having sizes up to 10 micrometers. Such particulate-laden liquids occur in a variety of microfluidic systems, one example being a system that sorts cells or large biomolecules for analysis. In comparison with the linear actuator of the preceding article, the bending actuator would be smaller and less massive. The combination of increased stroke, smaller mass, and smaller volume would be obtained at the cost of decreased actuation force: the proposed actuator would generate a force in the approximate range of 1 to 4 N, the exact amount depending on operating conditions and details of design. This level of actuation force would be too low to enable the valve to handle a fluid at the high pressure level mentioned in the preceding article. The proposal encompasses two alternative designs: one featuring a miniature piezoelectric bimorph actuator and one featuring a thick-film unimorph piezoelectric actuator (see figure). In either version, the valve would consume a power of only 0.01 W when actuated at a frequency of 100 Hz. Also, in either version, it would be necessary to attach a soft elastomeric sealing ring to the valve seat so that any particles that settle on the seat would be pushed deep into the elastomeric material to prevent or reduce leakage. The overall dimensions of the bimorph version would be 7 by 7 by 1 mm. The actuator in this version would generate a force of 1 N and a stroke of 10 micrometers at an applied potential of 150 V. The actuation force would be sufficient to enable the valve to handle a fluid pressurized up to about 50 psi (approximately 0.35 MPa). The overall dimensions of the unimorph version would be 2 by 2 by 0.5 mm. In this version, an electric field across the piezoelectric film on a diaphragm would cause the film to pull on, and thereby bend, the diaphragm. At an applied potential of 20 V, the actuator in this version would generate a stroke of 10 micrometers and a force of 0.01 N. This force level would be too low to enable handling of fluids at pressures comparable to those of the bimorph version. This version would be useful primarily in microfluidic and nanofluidic applications that involve extremely low differential pressures and in which there are requirements for extreme miniaturization of valves. Examples of such applications include liquid chromatography and sequencing of deoxyribonucleic acid.

  6. Building 909, oblique view to southeast, 135 mm lens. Building ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Building 909, oblique view to southeast, 135 mm lens. Building 908 at extreme right for context. - Travis Air Force Base, Handling Crew Building, North of W Street, Armed Forces Special Weapons Project Q Area, Fairfield, Solano County, CA

  7. An analytical study of electric vehicle handling dynamics

    NASA Technical Reports Server (NTRS)

    Greene, J. E.; Segal, D. J.

    1979-01-01

    Hypothetical electric vehicle configurations were studied by applying available analytical methods. Elementary linearized models were used in addition to a highly sophisticated vehicle dynamics computer simulation technique. Physical properties of specific EVs were defined for various battery and powertrain packaging approaches applied to a range of weight distributions and inertial properties that characterize a generic class of EVs. Computer simulations of structured maneuvers were performed to predict handling qualities in the normal driving range and during various extreme conditions related to accident avoidance. Results indicate that an EV with forward weight bias will possess handling qualities superior to those of a comparable EV that is rear-heavy or equally balanced. The importance of properly matching tires, suspension systems, and brake-system front/rear torque proportioning to a given EV configuration during the design stage is demonstrated.

  8. Evaluation of Flying Qualities and Guidance Displays for an Advanced Tilt-Wing STOL Transport Aircraft in Final Approach and Landing

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Franklin, James A.; Hardy, Gordon H.

    2002-01-01

    A piloted simulation was performed on the Vertical Motion Simulator at NASA Ames Research Center to evaluate flying qualities of a tilt-wing Short Take-Off and Landing (STOL) transport aircraft during final approach and landing. The experiment was conducted to assess the design's handling qualities, and to evaluate the use of flightpath-centered guidance for the precision approach and landing tasks required to perform STOL operations in instrument meteorological conditions, turbulence, and wind. Pilots rated the handling qualities to be satisfactory for all operations evaluated except those encountering extreme crosswinds and severe windshear; even in these difficult meteorological conditions, adequate handling qualities were maintained. The advanced flight control laws and guidance displays provided consistent performance and precision landings.

  9. Ergonomics of disposable handles for minimally invasive surgery.

    PubMed

    Büchel, D; Mårvik, R; Hallabrin, B; Matern, U

    2010-05-01

    The ergonomic deficiencies of currently available minimally invasive surgery (MIS) instrument handles have been addressed in many studies. In this study, a new ergonomic pistol handle concept, realized as a prototype, and two disposable ring handles were investigated according to ergonomic properties set by new European standards. In this study, 25 volunteers performed four practical tasks to evaluate the ergonomics of the handles used in standard operating procedures (e.g., measuring a suture and cutting to length, precise maneuvering and targeting, and dissection of a gallbladder). Moreover, 20 participants underwent electromyography (EMG) tests to measure the muscle strain they experienced while carrying out the basic functions (grasp, rotate, and maneuver) in the x, y, and z axes. The data measured included the number of errors, the time required for task completion, perception of pressure areas, and EMG data. The values for usability in the test were effectiveness, efficiency, and user satisfaction. Surveys relating to the subjective rating were completed after each task for each of the three handles tested. Each handle except the new prototype caused pressure areas and pain. Extreme differences in muscle strain could not be observed for any of the three handles. Experienced surgeons worked more quickly with the prototype when measuring and cutting a suture (approximately 20%) and during precise maneuvering and targeting (approximately 20%). On the other hand, they completed the dissection task faster with the handle manufactured by Ethicon. Fewer errors were made with the prototype in dissection of the gallbladder. In contrast to the handles available on the market, the prototype was always rated as positive by the volunteers in the subjective surveys. None of the handles could fulfil all of the requirements with top scores. Each handle had its advantages and disadvantages. In contrast to the ring handles, the volunteers could fulfil most of the tasks more efficiently using the prototype handle without any remarkable pressure areas, cramps, or pain.

  10. Genetic selection for temperament traits in dairy and beef cattle.

    PubMed

    Haskell, Marie J; Simm, Geoff; Turner, Simon P

    2014-01-01

    Animal temperament can be defined as a response to environmental or social stimuli. There are a number of temperament traits in cattle that contribute to their welfare, including their response to handling or milking, response to challenge such as human approach or intervention at calving, and response to conspecifics. In a number of these areas, the genetic basis of the trait has been studied. Heritabilities have been estimated and in some cases quantitative trait loci (QTL) have been identified. The variation is sometimes considerable and moderate heritabilities have been found for the major handling temperament traits, making them amenable to selection. Studies have also investigated the correlations between temperament and other traits, such as productivity and meat quality. Despite this, there are relatively few examples of temperament traits being used in selection programmes. Most often, animals are screened for aggression or excessive fear during handling or milking, with extreme animals being culled, or EBVs for temperament are estimated, but these traits are not commonly included routinely in selection indices, despite there being economic, welfare and human safety drivers for their inclusion. There may be a number of constraints and barriers. For some traits and breeds, there may be difficulties in collecting behavioral data on sufficiently large populations of animals to estimate genetic parameters. Most selection indices require estimates of economic values, and it is often difficult to assign an economic value to a temperament trait. The effects of selection primarily for productivity traits on temperament and welfare are discussed. Future opportunities include automated data collection methods and the wider use of genomic information in selection.

  11. Genetic selection for temperament traits in dairy and beef cattle

    PubMed Central

    Haskell, Marie J.; Simm, Geoff; Turner, Simon P.

    2014-01-01

    Animal temperament can be defined as a response to environmental or social stimuli. There are a number of temperament traits in cattle that contribute to their welfare, including their response to handling or milking, response to challenge such as human approach or intervention at calving, and response to conspecifics. In a number of these areas, the genetic basis of the trait has been studied. Heritabilities have been estimated and in some cases quantitative trait loci (QTL) have been identified. The variation is sometimes considerable and moderate heritabilities have been found for the major handling temperament traits, making them amenable to selection. Studies have also investigated the correlations between temperament and other traits, such as productivity and meat quality. Despite this, there are relatively few examples of temperament traits being used in selection programmes. Most often, animals are screened for aggression or excessive fear during handling or milking, with extreme animals being culled, or EBVs for temperament are estimated, but these traits are not commonly included routinely in selection indices, despite there being economic, welfare and human safety drivers for their inclusion. There may be a number of constraints and barriers. For some traits and breeds, there may be difficulties in collecting behavioral data on sufficiently large populations of animals to estimate genetic parameters. Most selection indices require estimates of economic values, and it is often difficult to assign an economic value to a temperament trait. The effects of selection primarily for productivity traits on temperament and welfare are discussed. Future opportunities include automated data collection methods and the wider use of genomic information in selection. PMID:25374582

  12. New morphological data of Litomosoides chagasfilhoi (Nematoda: Filarioidea) parasitizing Nectomys squamipes in Rio de Janeiro, Brazil.

    PubMed

    Muniz-Pereira, Luís Cláudio; Gonçalves, Paula Araujo; Guimarães, Erick Vaz; Fonseca, Fábio de Oliveira; Santos, José Augusto Albuquerque Dos; Maldonado-Júnior, Arnaldo; Moraes, Antonio Henrique Almeida de

    2016-01-01

    Litomosoides chagasfilhoi, originally described by Moraes Neto, Lanfredi & De Souza (1997) parasitizing the abdominal cavity of the wild rodent Akodon cursor (Winge, 1887), was found in the abdominal cavity of Nectomys squamipes (Brants, 1827) from the municipality of Rio Bonito, Rio de Janeiro State, Brazil. This study led to the addition of new morphological data and a new geographical distribution record for this filarioid in Brazil. Several characters were detailed and emended relative to previous records of L. chagasfilhoi in N. squamipes, confirming the original description in A. cursor: buccal capsule longer than wide with walls thinner than the lumen; right spicule slightly sclerotized, with a slender membranous distal extremity bearing a small tongue-like terminal portion; and left spicule with the handle longer than the blade, whose edges form large membranous wings folded longitudinally.

  13. E-ELT M1 test facility

    NASA Astrophysics Data System (ADS)

    Dimmler, M.; Marrero, J.; Leveque, S.; Barriga, P.; Sedghi, B.; Mueller, M.

    2012-09-01

    During the advanced design phase of the European Extremely Large Telescope (E-ELT) several critical components have been prototyped. During the last year some of them have been tested in dedicated test stands. In particular, a representative section of the E-ELT primary mirror has been assembled with 2 active and 2 passive segments. This test stand is equipped with complete prototype segment subunits, i.e. including support mechanisms, glass segments, edge sensors, position actuators as well as additional metrology for monitoring. The purpose is to test various procedures such as calibration, alignment and handling and to study control strategies. In addition the achievable component and subsystem performances are evaluated, and interface issues are identified. In this paper an overview of the activities related to the E-ELT M1 Test Facility will be given. Experiences and test results are presented.

  14. The climbing crawling robot (a unique cable robot for space and Earth)

    NASA Technical Reports Server (NTRS)

    Kerley, James J.; May, Edward; Eklund, Wayne

    1991-01-01

    Some of the greatest concerns in robotic design have been the high center of gravity of the robot, the irregular or flat surfaces the robot has to work on, the weight of a robot that has to handle heavy loads or apply large forces, and the ability of the robot to climb straight up in the air. This climbing crawling robot handles these problems well with magnets, suction cups, or actuators. The cables give body to the robot, and it performs much like a caterpillar. The computer program is simple and inexpensive, as is the robot. One of the important features of this system is that the robots can work in pairs or triplets to handle jobs that would be extremely difficult for a single robot. The light weight of the robot allows it to handle quite heavy loads. Its many feet give the robot numerous anchor points where a single pair of feet would give it trouble.

  15. Measurement of doses to the extremities of nuclear medicine staff

    NASA Astrophysics Data System (ADS)

    Shousha, Hany A.; Farag, Hamed; Hassan, Ramadan A.

    2010-01-01

    Medical uses of ionizing radiation now represent >95% of all man-made radiation exposure and are the largest single radiation source after natural background radiation. Therefore, it is important to quantify the amount of radiation received by occupationally exposed individuals, both to optimize the working conditions for staff and to compare doses across departments to ensure compatibility with the recommended standards. For some groups working with unsealed sources in nuclear medicine units, the hands are more heavily exposed to ionizing radiation than the rest of the body. A personal dosimetry service runs extensively in Egypt, but doses to the extremities have not been measured to any wide extent. The purpose of this study was to investigate the equivalent radiation doses to the fingers for five nuclear medicine staff occupational groups for which heavy irradiation of the hands was suspected. Finger doses were measured for (1) nuclear medicine physicians, (2) technologists, (3) nurses and (4) physicists. The fifth group contained three technicians handling 131I, while the others handled 99mTc. Each staff member working with radioactive material wore two thermoluminescent dosimeters (TLDs) during the whole testing period, which lasted from 1 to 4 weeks. Staff performed their work on a regular basis throughout the month, and mean annual doses were calculated for these groups. Results showed that the mean equivalent doses to the fingers of the technologist, nurse and physicist groups were 30.24±14.5, 30.37±17.5 and 16.3±7.7 μSv/GBq, respectively. Equivalent doses for the physicians could not be calculated per unit of activity because they did not handle the radiopharmaceuticals directly; their doses were reported as millisieverts (mSv) accumulated in one week. Similarly, the dose to the fingers of individuals in Group 5 was estimated to be 126.13±38.2 μSv/GBq. The maximum average finger dose in this study was noted in the technologists who handled therapeutic 131I (2.5 mSv). In conclusion, the maximum expected annual dose to the extremities is less than the annual limit (500 mSv/y).

  16. Handling Qualities of Large Flexible Aircraft. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Poopaka, S.

    1980-01-01

    The effects on handling qualities of interaction between elastic modes and the rigid-body dynamics of a large flexible aircraft are studied by mathematical computer simulation. An analytical method is developed to predict pilot ratings when there is severe mode interaction. This is done by extending the optimal control model of human pilot response to include a mode-decomposition mechanism. The handling qualities are determined for a longitudinal tracking task using a large flexible aircraft, with parametric variations in the undamped natural frequencies of the two lowest-frequency symmetric elastic modes made to induce varying amounts of mode interaction.

  17. Doormat or Dictator?

    ERIC Educational Resources Information Center

    Pawlas, George E.

    1993-01-01

    Somewhere between easygoing and hardboiled management extremes lies the realm of true leadership. An effective administrator gets results by leading people (not ordering them), learning how to handle them, and discovering what makes each one tick. A true leader captures and holds staff members' confidence, helps them develop needed skills, and…

  18. Developing a safe on-orbit cryogenic depot

    NASA Technical Reports Server (NTRS)

    Bahr, Nicholas J.

    1992-01-01

    New U.S. space initiatives will require technology to realize planned programs such as piloted lunar and Mars missions. Key to the optimal execution of such missions are high-performance orbit transfer vehicles and propellant storage facilities. Large amounts of liquid hydrogen and oxygen demand a uniquely designed on-orbit cryogenic propellant depot. Because of the inherent dangers in propellant storage and handling, a comprehensive system safety program must be established. This paper shows how the myriad, complex hazards demonstrate the need for an integrated safety effort applied from program conception through operational use. Even though the cryogenic depot is still in the conceptual stage, many of the hazards have been identified, including fatigue due to heavy thermal loading from environmental and operating temperature extremes; micrometeoroid and/or depot ancillary equipment impact (an important problem due to the large surface area needed to house the large quantities of propellant); docking and maintenance hazards; and hazards associated with extended extravehicular activity. Various safety analysis techniques are presented for each program phase, and specific system safety implementation steps are listed. Enhanced risk assessment is demonstrated through the incorporation of these methods.

  19. Massively parallel support for a case-based planning system

    NASA Technical Reports Server (NTRS)

    Kettler, Brian P.; Hendler, James A.; Anderson, William A.

    1993-01-01

    Case-based planning (CBP), a kind of case-based reasoning, is a technique in which previously generated plans (cases) are stored in memory and can be reused to solve similar planning problems in the future. CBP can save considerable time over generative planning, in which a new plan is produced from scratch. CBP thus offers a potential (heuristic) mechanism for handling intractable problems. One drawback of CBP systems has been the need for a highly structured memory to reduce retrieval times. This approach requires significant domain engineering and complex memory indexing schemes to make these planners efficient. In contrast, our CBP system, CaPER, uses a massively parallel frame-based AI language (PARKA) and can do extremely fast retrieval of complex cases from a large, unindexed memory. The ability to do fast, frequent retrievals has many advantages: indexing is unnecessary; very large case bases can be used; memory can be probed in numerous alternate ways; and queries can be made at several levels, allowing more specific retrieval of stored plans that better fit the target problem with less adaptation. In this paper we describe CaPER's case retrieval techniques and some experimental results showing its good performance, even on large case bases.

  20. Large-Scale Residential Demolition

    EPA Pesticide Factsheets

    The EPA provides resources for handling residential demolitions or renovations. This includes planning, handling harmful materials, recycling, funding, compliance assistance, good practices and regulations.

  1. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces, and it runs in parallel on multiprocessors. It can also be used in conjunction with a k-d tree hierarchy, in which approximate models and error terms are stored in the nodes of the tree so that fast approximate renderings can be created. We have extended our software to handle time-varying data where the data change but the grid does not, and we are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique is at present limited to regular grids, we are pursuing algorithms that extend the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models, based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere and then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this gives a quantitative measure of the effects of approximation. We have created new tools for exploring the differences between images produced by various rendering methods. Images created by our software can be stored in the SGI RGB format. Our idtools software reads in a pair of images and compares them using various metrics: differences under the RGB, HSV, and HSL color models can be calculated and shown, and the autocorrelation function and Fourier transform of the image and of image differences can also be computed. We will explore how these image differences compare in order to find useful metrics for quantifying the success of various visualization approaches. In general, progress was consistent with our research plan for the second year of the grant.
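
    The idtools metrics are described only in outline; a sketch of the kind of comparisons involved, assuming uint8 RGB arrays as input (these are illustrative routines, not the actual idtools code):

```python
import numpy as np

def rgb_rmse(img_a, img_b):
    """Root-mean-square RGB difference between two images of identical
    shape (H, W, 3), e.g. renderings of the same view by two methods."""
    d = img_a.astype(np.float64) - img_b.astype(np.float64)
    return float(np.sqrt(np.mean(d ** 2)))

def luminance_diff_spectrum(img_a, img_b):
    """Magnitude spectrum of the luminance difference; concentrated
    low-frequency energy suggests smooth, large-scale approximation error."""
    w = np.array([0.299, 0.587, 0.114])                 # Rec. 601 luminance
    d = (img_a.astype(np.float64) - img_b.astype(np.float64)) @ w
    return np.abs(np.fft.fftshift(np.fft.fft2(d)))
```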

  2. Functional constraints on the evolution of long butterfly proboscides: lessons from Neotropical skippers (Lepidoptera: Hesperiidae)

    PubMed Central

    Bauder, J A S; Morawetz, L; Warren, A D; Krenn, H W

    2015-01-01

    Extremely long proboscides are rare among butterflies outside of the Hesperiidae, yet representatives of several genera of skipper butterflies possess proboscides longer than 50 mm. Although extremely elongated mouthparts can be regarded as advantageous adaptations for gaining access to nectar in deep-tubed flowers, the scarcity of long-proboscid butterflies is a phenomenon that has not been adequately accounted for. So far, this scarcity has been explained by functional costs arising from increased flower handling times caused by decelerated nectar intake rates. However, insects can compensate for the negative influence of a long proboscis through changes in the morphological configuration of the feeding apparatus. Here, we measured nectar intake rates in 34 species representing 21 Hesperiidae genera from a Costa Rican lowland rainforest area to explore the impact of proboscis length, cross-sectional area of the food canal and body size on intake rate. Long-proboscid skippers did not suffer from reduced intake rates, owing to their large body size and enlarged food canals. In addition, video analyses of flower-visiting behaviour revealed that suction times increased with proboscis length, suggesting that long-proboscid skippers drink a larger amount of nectar from deep-tubed flowers. Despite these advantages, we showed that functional costs of exaggerated mouthparts exist in terms of longer manipulation times per flower. Finally, we discuss the significance of scaling relationships for the foraging efficiency of butterflies and why some skipper taxa, in particular, have evolved extremely long proboscides. PMID:25682841

  3. A Look at Technologies Vis-a-vis Information Handling Techniques.

    ERIC Educational Resources Information Center

    Swanson, Rowena W.

    The paper examines several ideas for information handling implemented with new technologies that suggest directions for future development. These are grouped under the topic headings: Handling Large Data Banks, Providing Personalized Information Packages, Providing Information Specialist Services, and Expanding Man-Machine Interaction. Guides in…

  4. Work activities and musculoskeletal complaints among preschool workers.

    PubMed

    Grant, K A; Habes, D J; Tepper, A L

    1995-12-01

    The potential for musculoskeletal trauma among preschool workers has been largely unexplored in the United States. This case report describes an investigation conducted to identify and evaluate possible causes of back and lower extremity pain among 22 workers at a Montessori day care facility. Investigators met with and distributed a questionnaire to school employees, and made measurements of workstation and furniture dimensions. Investigators also recorded the normal work activities of school employees on videotape, and performed a work sampling study to estimate the percentage of time employees spend performing various tasks and in certain postures. Questionnaire results from 18 employees indicated that back pain/discomfort was a common musculoskeletal complaint, reported by 61% of respondents. Neck/shoulder pain, lower extremity pain and hand/wrist pain were reported by 33, 33 and 11% of respondents, respectively. Observation and analysis of work activities indicated that employees spend significant periods of time kneeling, sitting on the floor, squatting, or bending at the waist. Furthermore, staff members who work with smaller children (i.e. six weeks to 18 months of age) performed more lifts and assumed more awkward lower extremity postures than employees who work with older children (3-4 years of age). Analysis of two lifting tasks using the revised NIOSH lifting equation indicated that employees who handle small children may be at increased risk of lifting-related low back pain. Investigators concluded that day care employees at this facility are at increased risk of low back pain and lower extremity (i.e. knee) injury due to work activities that require awkward or heavy lifts, and static working postures. Recommendations for reducing or eliminating these risks by modifying the workplace and changing the organization and methods of work are presented.
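
    For reference, the revised NIOSH lifting equation mentioned above has the standard published (1991) form, which this record itself does not reproduce:

```latex
\mathrm{RWL} = \mathrm{LC}\times\mathrm{HM}\times\mathrm{VM}\times\mathrm{DM}\times\mathrm{AM}\times\mathrm{FM}\times\mathrm{CM},
\qquad
\mathrm{LI} = \frac{\text{load weight}}{\mathrm{RWL}}
```

    Here LC = 23 kg is the load constant; with horizontal distance H, vertical height V and travel distance D in cm, and asymmetry angle A in degrees, the multipliers are HM = 25/H, VM = 1 - 0.003|V - 75|, DM = 0.82 + 4.5/D and AM = 1 - 0.0032A, while the frequency and coupling multipliers FM and CM are read from tables. A lifting index LI above 1 indicates an elevated risk of lifting-related low back pain, which is the criterion the investigators applied to the two child-lifting tasks.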

  5. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    PubMed

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community in applying quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This requirement becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
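
    SIproc's API is not shown here, but the core out-of-core pattern it relies on, streaming slabs of a disk-resident cube through memory, can be sketched as follows (the file path and cube shape are hypothetical):

```python
import numpy as np

def band_means(path, shape, chunk_lines=64):
    """Out-of-core band means over a hyperspectral cube stored on disk
    as a (lines, samples, bands) float32 array too large for RAM."""
    cube = np.memmap(path, dtype=np.float32, mode="r", shape=shape)
    acc = np.zeros(shape[2], dtype=np.float64)
    for start in range(0, shape[0], chunk_lines):
        block = np.asarray(cube[start:start + chunk_lines])  # one slab in RAM
        acc += block.reshape(-1, shape[2]).sum(axis=0)
    return acc / (shape[0] * shape[1])

# e.g. means = band_means("cube.raw", shape=(2048, 2048, 256))
```

    Each slab could equally be handed to a GPU library before accumulation; the streaming structure is what keeps the working set small regardless of cube size.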

  6. Andromeda: a peptide search engine integrated into the MaxQuant environment.

    PubMed

    Cox, Jürgen; Neuhauser, Nadin; Michalski, Annette; Scheltema, Richard A; Olsen, Jesper V; Mann, Matthias

    2011-04-01

    A key step in mass spectrometry (MS)-based proteomics is the identification of peptides in sequence databases by their fragmentation spectra. Here we describe Andromeda, a novel peptide search engine using a probabilistic scoring model. On proteome data, Andromeda performs as well as Mascot, a widely used commercial search engine, as judged by sensitivity and specificity analysis based on target decoy searches. Furthermore, it can handle data with arbitrarily high fragment mass accuracy, is able to assign and score complex patterns of post-translational modifications, such as highly phosphorylated peptides, and accommodates extremely large databases. The algorithms of Andromeda are provided. Andromeda can function independently or as an integrated search engine of the widely used MaxQuant computational proteomics platform and both are freely available at www.maxquant.org. The combination enables analysis of large data sets in a simple analysis workflow on a desktop computer. For searching individual spectra Andromeda is also accessible via a web server. We demonstrate the flexibility of the system by implementing the capability to identify cofragmented peptides, significantly improving the total number of identified peptides.
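
    Andromeda's published scoring model is binomial: the peptide score is -10 log10 of the probability of matching at least k of the n theoretical fragment masses by chance. A sketch (in practice the match probability p is derived from the fragment mass tolerance, a detail omitted here):

```python
from math import comb, log10

def binomial_score(n, k, p):
    """-10*log10 P(at least k of n theoretical fragment masses match by
    chance), the binomial form underlying Andromeda's peptide score."""
    tail = sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))
    return float("inf") if tail == 0.0 else -10.0 * log10(tail)

# 4 of 20 theoretical fragments matched, 1% random-match probability
print(round(binomial_score(20, 4, 0.01), 1))
```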

  7. Design and evaluation of a new ergonomic handle for instruments in minimally invasive surgery.

    PubMed

    Sancibrian, Ramon; Gutierrez-Diez, María C; Torre-Ferrero, Carlos; Benito-Gonzalez, Maria A; Redondo-Figuero, Carlos; Manuel-Palazuelos, Jose C

    2014-05-01

    Laparoscopic surgery techniques have been demonstrated to provide massive benefits to patients. However, surgeons are subjected to demanding working conditions because of the poor ergonomic design of the instruments. In this article, a new ergonomic handle design is presented. This handle is designed using ergonomic principles, aiming to provide both more intuitive manipulation of the instrument and a shape that reduces the high-pressure zones in contact with the surgeon's hand. The ergonomic characteristics of the new handle were evaluated in objective and subjective studies. The experimental evaluation was performed with 28 volunteers by comparing the new handle with the ring-handle (RH) concept in an instrument available on the market. The volunteers' muscle activation and the motions of the hand, wrist, and arm were studied while they performed different tasks. The data measured in the experiment include electromyography and goniometry values. The results of the subjective analysis reveal that most volunteers (64%) preferred the new prototype to the RH, reporting less pain and less difficulty in completing the tasks. The results of the objective study reveal that the hyperflexion of the wrist required for manipulation of the instrument is strongly reduced. The new ergonomic handle not only provides important ergonomic advantages but also improves efficiency in completing the tasks. Compared with RH instruments, the new prototype reduced the high-pressure areas and the extreme motions of the wrist. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe strike on southern Taiwan awakened public awareness of large scale landslide disasters. Large scale landslide disasters produce large quantities of sediment, which negatively affects the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but also pose a major challenge for processing and adding value. The study defined basic data formats and standards for the various types of data collected about these reservoirs and then provided a management platform based on those formats and standards. Meanwhile, for practicality and convenience, the large scale landslide disaster database system is built with the ability both to provide and to receive information, so users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may be out of date at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design, so users can easily handle and develop this large scale landslide disaster database system.

  9. Applied extreme-value statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinnison, R.R.

    1983-05-01

    The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognise all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles, and to provide the necessary statistical background so that the non-statistician scientist can recognise an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.
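
    In the spirit of the text, a minimal worked example of a classical extreme-value analysis: fitting a Gumbel distribution to (synthetic) annual maxima with SciPy and reading off a 50-year return value.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(1)
# Synthetic annual maxima of some pollutant concentration
annual_max = rng.gumbel(loc=120.0, scale=15.0, size=40)

loc, scale = gumbel_r.fit(annual_max)                 # maximum-likelihood fit
# Concentration exceeded on average once in 50 years
x50 = gumbel_r.ppf(1 - 1 / 50, loc=loc, scale=scale)
print(f"mu = {loc:.1f}, beta = {scale:.1f}, 50-yr value = {x50:.1f}")
```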

  10. Technology architecture guidelines for a health care system.

    PubMed

    Jones, D T; Duncan, R; Langberg, M L; Shabot, M M

    2000-01-01

    Although the demand for use of information technology within the healthcare industry is intensifying, relatively little has been written about guidelines to optimize IT investments. A technology architecture is a set of guidelines for technology integration within an enterprise. The architecture is a critical tool in the effort to control information technology (IT) operating costs by constraining the number of technologies supported. A well-designed architecture is also an important aid to integrating disparate applications, data stores and networks. The authors led the development of a thorough, carefully designed technology architecture for a large and rapidly growing health care system. The purpose and design criteria are described, as well as the process for gaining consensus and disseminating the architecture. In addition, the processes for using, maintaining, and handling exceptions are described. The technology architecture is extremely valuable to health care organizations both in controlling costs and promoting integration.

  11. Technology architecture guidelines for a health care system.

    PubMed Central

    Jones, D. T.; Duncan, R.; Langberg, M. L.; Shabot, M. M.

    2000-01-01

    Although the demand for use of information technology within the healthcare industry is intensifying, relatively little has been written about guidelines to optimize IT investments. A technology architecture is a set of guidelines for technology integration within an enterprise. The architecture is a critical tool in the effort to control information technology (IT) operating costs by constraining the number of technologies supported. A well-designed architecture is also an important aid to integrating disparate applications, data stores and networks. The authors led the development of a thorough, carefully designed technology architecture for a large and rapidly growing health care system. The purpose and design criteria are described, as well as the process for gaining consensus and disseminating the architecture. In addition, the processes for using, maintaining, and handling exceptions are described. The technology architecture is extremely valuable to health care organizations both in controlling costs and promoting integration. PMID:11079913

  12. Tape tracking and handling for magnetic tape recorders. [aboard spacecraft

    NASA Technical Reports Server (NTRS)

    Paroby, W.; Disilvestre, R.

    1975-01-01

    One of the critical performance- and life-limiting elements of a spacecraft tape recorder instrumentation system, and one that has received little attention in the technical literature, is magnetic tape tracking and handling technology. This technology is required to understand how to gently transfer tape from one reel to another with proper alignment and a desirably uniform velocity at the read and write transducer heads. The increased demand for high data rates (i.e., multi-track spacecraft recording instrumentation systems), coupled with performance under extreme environmental conditions, requires a thorough knowledge of the various parameters that establish an optimally designed tape tracking and handling system. Stress analysis techniques, substantiated with tape tracking test data, are required to evaluate these parameters and show the effect of each on a tape recorder instrumentation tracking system. The technology is also applicable to ground-based tape recorders, where the detrimental effects of edge guidance can be eliminated.

  13. Finasteride. Does it affect spermatogenesis and pregnancy?

    PubMed Central

    Pole, M.; Koren, G.

    2001-01-01

    QUESTION: A few women have asked me whether finasteride, taken by their partners for male pattern baldness, will affect their pregnancies. The product monograph is very alarming: it sounds as if even handling the medication could cause harm, especially to a male fetus. Should a man stop taking finasteride if his partner is planning pregnancy or is pregnant? What is the risk to the fetus if its mother accidentally handles crushed or broken tablets? ANSWER: To date, there are no reports of adverse pregnancy outcomes among women exposed to finasteride. Taking 1 mg of finasteride daily did not have any clinically significant effect on men's semen. Absorption through the skin while handling tablets is extremely unlikely to cause fetal exposure or harm. There is no reason to discontinue the drug. Motherisk is currently following up women who are pregnant or planning pregnancy and whose partners are taking finasteride. PMID:11785276

  14. EUVL mask dual pods to be used for mask shipping and handling in exposure tools

    NASA Astrophysics Data System (ADS)

    Gomei, Yoshio; Ota, Kazuya; Lystad, John; Halbmair, Dave; He, Long

    2007-03-01

    The concept of Extreme Ultra-Violet Lithography (EUVL) mask dual pods is proposed for use in both mask shipping and handling in exposure tools. The inner pod was specially designed to protect masks from particle contamination during shipping from mask houses to wafer factories. It can be installed in a load-lock chamber of exposure tools and evacuated while holding the mask inside. The inner pod upper cover is removed just before the mask is installed on the mask stage. Prototypes were manufactured and tested for shipping and for vacuum cycling. We counted particle adders through these actions at a detection limit of 54 nm and above. The adder count was close to zero; that is, the obtained result is within the noise level of our present evaluation environment. This indicates that the present concept is highly feasible for EUVL mask shipping and handling in exposure tools.

  15. Handling Qualities of Large Rotorcraft in Hover and Low Speed

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos; Theodore, Colin R.; Lawrence, Ben; Blanken, Chris L.

    2015-01-01

    According to a number of system studies, large capacity advanced rotorcraft with a capability of high cruise speeds (approx. 350 mph) as well as vertical and/or short take-off and landing (V/STOL) flight could alleviate anticipated air transportation capacity issues by making use of non-primary runways, taxiways, and aprons. These advanced aircraft pose a number of design challenges, as well as unknown issues in the flight control and handling qualities domains. A series of piloted simulation experiments has been conducted on the NASA Ames Research Center Vertical Motion Simulator (VMS) in recent years to systematically investigate the fundamental flight control and handling qualities issues associated with the characteristics of large rotorcraft, including tiltrotors, in hover and low-speed maneuvering.

  16. The extremely long-tongued Neotropical butterfly Eurybia lycisca (Riodinidae): Proboscis morphology and flower handling

    PubMed Central

    Bauder, Julia A.S.; Lieskonig, Nora R.; Krenn, Harald W.

    2011-01-01

    Few species of true butterflies (Lepidoptera: Papilionoidea) have evolved a proboscis that greatly exceeds the length of the body. This study is the first to examine the morphology of an extremely long butterfly proboscis and to describe how it is used to obtain nectar from flowers with very deep corolla tubes. The proboscis of Eurybia lycisca (Riodinidae) is approximately twice as long as the body. It has a maximal length of 45.6 mm (mean length 36.5 mm ± 4.1 S.D., N = 20) and is extremely thin, measuring only about 0.26 mm at its maximum diameter. The proboscis has a unique arrangement of short sensilla at the tip, and its musculature arrangement is derived. The flower handling times on the preferred nectar plant, Calathea crotalifera (Marantaceae), were exceptionally long (mean 54.5 sec ± 28.5 S.D., N = 26). When feeding on the deep flowers remarkably few proboscis movements occur. The relationship between Eurybia lycisca and its preferred nectar plant and larval host plant, Calathea crotalifera, is not mutualistic since the butterfly exploits the flowers without contributing to their pollination. We hypothesize that the extraordinarily long proboscis of Eurybia lycisca is an adaptation for capitalizing on the pre-existing mutualistic interaction of the host plant with its pollinating long-tongued nectar feeding insects. PMID:21115131

  17. Green and Smart: Hydrogels to Facilitate Independent Practical Learning

    ERIC Educational Resources Information Center

    Hurst, Glenn A.

    2017-01-01

    A laboratory experiment was developed to enable students to investigate the use of smart hydrogels for potential application in targeted drug delivery. This is challenging for students to explore practically because of the extremely high risks of handling cross-linking agents such as glutaraldehyde. Genipin is a safe and green alternative that has…

  18. Geospatial Analysis and Model Evaluation Software (GAMES): Integrated Web-Based Analysis and Visualization

    DTIC Science & Technology

    2014-04-11

    …particle location files for each source (hours); dti: time step in seconds; horzmix: CONSTANT = use the value of horcon… However, if leg lengths are short, extreme values of D/Lo can occur. We will handle these by assigning a maximum to the output. This is discussed by…

  19. Penetrating missile-type head injury from a defective badminton racquet.

    PubMed

    Pappano, Dante; Murray, Elizabeth; Cimpello, Lynn Babcock; Conners, Gregory

    2009-06-01

    Injuries occurring during badminton are rarely serious and primarily involve the lower extremities. We report an instance wherein a patient suffered serious brain injury related to playing with a defective badminton racquet. The possibility of similar injuries following the separation of the racquet head and shaft from the handle needs to be disseminated.

  20. Visible Machine Learning for Biomedicine.

    PubMed

    Yu, Michael K; Ma, Jianzhu; Fisher, Jasmin; Kreisberg, Jason F; Raphael, Benjamin J; Ideker, Trey

    2018-06-14

    A major ambition of artificial intelligence lies in translating patient data to successful therapies. Machine learning models face particular challenges in biomedicine, however, including handling of extreme data heterogeneity and lack of mechanistic insight into predictions. Here, we argue for "visible" approaches that guide model structure with experimental biology. Copyright © 2018. Published by Elsevier Inc.

  1. 78 FR 22202 - Marketing Order Regulating the Handling of Spearmint Oil Produced in the Far West; Salable...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-15

    ... avoiding extreme fluctuations in supplies and prices to help maintain stability in the spearmint oil market... experiencing higher than average returns. Lastly, improving global economic conditions have led to increased... improving economic indicators for the Far West Scotch spearmint oil industry outlined above, the Committee...

  2. 77 FR 33076 - Marketing Order Regulating the Handling of Spearmint Oil Produced in the Far West; Salable...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-05

    ... avoiding extreme fluctuations in supplies and prices to help maintain stability in the spearmint oil market... spearmint oil industry experienced relatively good economic conditions, which motivated producers to... purchases just to rebuild inventories that were depleted during the worst of the recent U.S. economic...

  3. A general method for handling missing binary outcome data in randomized controlled trials

    PubMed Central

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims: The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design: We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting: We apply our general method to data from two smoking cessation trials. Participants: A total of 489 and 1758 participants from two smoking cessation trials. Measurements: The abstinence outcomes were obtained using telephone interviews. Findings: The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. Conclusions: A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. PMID:25171441
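
    The sensitivity-analysis idea in this record invites a small illustration. The sketch below is not the authors' model; it is a minimal, hypothetical version in which a single parameter delta (the assumed odds of abstinence among non-responders relative to responders in the same arm) is swept upward from zero, where delta = 0 reproduces the 'missing = smoking' analysis. The arm-level counts are invented for illustration.

    # Minimal sensitivity-analysis sketch (not the authors' exact model).
    # delta is a hypothetical sensitivity parameter: the assumed odds of
    # abstinence among non-responders relative to responders in the same
    # arm; delta = 0 reproduces the "missing = smoking" analysis.
    def imputed_abstinence_rate(n_abstinent, n_smoking, n_missing, delta):
        """Expected abstinence rate after imputing the missing outcomes."""
        n_observed = n_abstinent + n_smoking
        p_observed = n_abstinent / n_observed
        odds_missing = delta * p_observed / (1.0 - p_observed)
        p_missing = odds_missing / (1.0 + odds_missing)
        return (n_abstinent + n_missing * p_missing) / (n_observed + n_missing)

    # Hypothetical arm-level counts (abstinent, smoking, missing).
    for delta in (0.0, 0.25, 0.5, 1.0):
        p_treat = imputed_abstinence_rate(60, 120, 64, delta)
        p_ctrl = imputed_abstinence_rate(40, 140, 60, delta)
        print(f"delta={delta:4.2f}  risk difference = {p_treat - p_ctrl:+.3f}")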

  4. A modified generalized extremal optimization algorithm for the quay crane scheduling problem with interference constraints

    NASA Astrophysics Data System (ADS)

    Guo, Peng; Cheng, Wenming; Wang, Yi

    2014-10-01

    The quay crane scheduling problem (QCSP) determines the handling sequence of tasks at ship bays by a set of cranes assigned to a container vessel such that the vessel's service time is minimized. A number of heuristics or meta-heuristics have been proposed to obtain near-optimal solutions, overcoming the NP-hardness of the problem. In this article, the idea of generalized extremal optimization (GEO) is adapted to solve the QCSP with respect to various interference constraints. The resulting GEO is termed the modified GEO. A randomized searching method for neighbouring task-to-QC assignments to an incumbent task-to-QC assignment is developed in executing the modified GEO. In addition, a unidirectional search decoding scheme is employed to transform a task-to-QC assignment into an active quay crane schedule. The effectiveness of the developed GEO is tested on a suite of benchmark problems introduced by K.H. Kim and Y.M. Park in 2004 (European Journal of Operational Research, Vol. 156, No. 3). Compared with other well-known existing approaches, the experimental results show that the proposed modified GEO is capable of obtaining the optimal or near-optimal solution in a reasonable time, especially for large-sized problems.
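
    The GEO idea adapts readily to a sketch. Below is a minimal, generic bit-string version of extremal optimization with the usual tau-distributed rank-selection rule; the QCSP-specific task-to-QC encoding, interference constraints and unidirectional decoding scheme from the article are not reproduced, and the toy fitness function is purely illustrative.

    import random

    def geo_minimize(fitness, n_bits, tau=1.5, iterations=2000, seed=0):
        """Minimal generalized extremal optimization on a bit string."""
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(n_bits)]
        best, best_f = x[:], fitness(x)
        weights = [k ** (-tau) for k in range(1, n_bits + 1)]
        for _ in range(iterations):
            # Evaluate the fitness obtained by flipping each single bit.
            flips = []
            for i in range(n_bits):
                x[i] ^= 1
                flips.append((fitness(x), i))
                x[i] ^= 1
            # Rank flips from most to least improving; rank k is chosen
            # with probability proportional to k**(-tau), so good moves
            # are favoured but worse ones stay possible (escaping optima).
            flips.sort()
            k = rng.choices(range(n_bits), weights=weights)[0]
            x[flips[k][1]] ^= 1
            f = fitness(x)
            if f < best_f:
                best, best_f = x[:], f
        return best, best_f

    # Toy usage: minimize the number of 1-bits (optimum: all-zero string).
    best, best_f = geo_minimize(lambda bits: sum(bits), n_bits=16)
    print(best_f, best)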

  5. Modular mechatronic system for stationary bicycles interfaced with virtual environment for rehabilitation.

    PubMed

    Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos

    2014-06-05

    Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirement for efficacious cycling; specifically, recruitment of both extremities and exercising at a high intensity. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system was successful in demonstrating that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders.

  6. Two hot to handle: How do we manage the simultaneous impacts of climate change and natural disasters on human health?

    NASA Astrophysics Data System (ADS)

    Phalkey, R. K.; Louis, V. R.

    2016-05-01

    Climate change is one of the major challenges we face today. There is recognition, alongside evidence, that the health impacts of both climate change and natural disasters are significant and rising. The impacts of both are also complex and span well beyond health to include environmental, social, demographic, cultural, and economic aspects of human lives. Nonetheless, integrated impact assessments are rare, and so are system-level approaches or systematic preparedness and adaptation strategies to brace for the two simultaneously, particularly in low- and middle-income countries. Ironically, the impacts of both climate change and natural disasters will be disproportionately borne by low emitters. Sufficiently large and long-term data from comprehensive weather, socio-economic, demographic and health observational systems are currently unavailable to guide adaptation strategies with the necessary precision. In the absence of these, and given the uncertainties around the health impact projections alongside the geographic disparities even within countries, the main question is: how can countries prepare to brace for the unknown? We certainly cannot wait to obtain answers to all the questions before we plan solutions. Strengthening health systems is therefore a pragmatic "zero regrets" strategy and should be adopted without delay, before the parallel impacts of climate change and associated extreme weather events (and the disasters thereof) become too hot to handle.

  7. Differential seed handling by two African primates affects seed fate and establishment of large-seeded trees

    NASA Astrophysics Data System (ADS)

    Gross-Camp, Nicole D.; Kaplin, Beth A.

    2011-11-01

    We examined the influence of seed handling by two semi-terrestrial African forest primates, chimpanzees ( Pan troglodytes) and l'Hoest's monkeys ( Cercopithecus lhoesti), on the fate of large-seeded tree species in an afromontane forest. Chimpanzees and l'Hoest's monkeys dispersed eleven seed species over one year, with quantity and quality of dispersal varying through time. Primates differed in their seed handling behaviors with chimpanzees defecating large seeds (>0.5 cm) significantly more than l'Hoest's. Furthermore, they exhibited different oral-processing techniques with chimpanzees discarding wadges containing many seeds and l'Hoest's monkeys spitting single seeds. A PCA examined the relationship between microhabitat characteristics and the site where primates deposited seeds. The first two components explained almost half of the observed variation. Microhabitat characteristics associated with sites where seeds were defecated had little overlap with those characteristics describing where spit seeds arrived, suggesting that seed handling in part determines the location where seeds are deposited. We monitored a total of 552 seed depositions through time, recording seed persistence, germination, and establishment. Defecations were deposited significantly farther from an adult conspecific than orally-discarded seeds where they experienced the greatest persistence but poorest establishment. In contrast, spit seeds were deposited closest to an adult conspecific but experienced the highest seed establishment rates. We used experimental plots to examine the relationship between seed handling, deposition site, and seed fate. We found a significant difference in seed handling and fate, with undispersed seeds in whole fruits experiencing the lowest establishment rates. Seed germination differed by habitat type with open forest experiencing the highest rates of germination. Our results highlight the relationship between primate seed handling and deposition site and seed fate, and may be helpful in developing models to predict seed shadows and recruitment patterns of large-seeded trees.

  8. Ride quality sensitivity to SAS control law and to handling quality variations

    NASA Technical Reports Server (NTRS)

    Roberts, P. A.; Schmidt, D. K.; Swaim, R. L.

    1976-01-01

    The RQ trends which large flexible aircraft exhibit under various parameterizations of control laws and handling qualities are discussed. A summary of the assumptions and solution technique, a control law parameterization review, a discussion of ride sensitivity to handling qualities, and the RQ effects generated by implementing relaxed static stability configurations are included.

  9. Statistical downscaling modeling with quantile regression using lasso to estimate extreme rainfall

    NASA Astrophysics Data System (ADS)

    Santri, Dewi; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall is one of the climatic elements with high diversity and has many negative impacts, especially extreme rainfall. Therefore, several methods are required to minimize the damage that may occur. So far, global circulation models (GCMs) are the best method to forecast global climate changes, including extreme rainfall. Statistical downscaling (SD) is a technique to develop the relationship between GCM output as global-scale independent variables and rainfall as a local-scale response variable. Using the GCM method presents difficulties when assessed against observations because GCM output has high dimension and multicollinearity between the variables. Common methods used to handle this problem are principal component analysis (PCA) and partial least squares regression. A newer method that can be used is the lasso, which has advantages in simultaneously controlling the variance of the fitted coefficients and performing automatic variable selection. Quantile regression is a method that can be used to detect extreme rainfall at both the dry and wet extremes. The objective of this study is modeling SD using quantile regression with lasso to predict extreme rainfall in Indramayu. The results showed that extreme rainfall (extreme wet in January, February and December) in Indramayu could be predicted properly by the model at the 90th quantile.
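
    A minimal sketch of this modeling approach is shown below, assuming scikit-learn's QuantileRegressor (quantile regression with an L1/lasso penalty controlled by alpha); the GCM predictor matrix and the rainfall series are synthetic stand-ins, not the Indramayu data.

    import numpy as np
    from sklearn.linear_model import QuantileRegressor

    rng = np.random.default_rng(42)
    n_months, n_gcm_cells = 240, 25          # e.g. a grid of GCM output cells
    X = rng.normal(size=(n_months, n_gcm_cells))
    # Synthetic rainfall driven by a few cells plus right-skewed noise.
    y = 50 + 8 * X[:, 0] - 5 * X[:, 3] + rng.gamma(2.0, 10.0, size=n_months)

    # Fit the 90th conditional quantile; the L1 penalty zeroes out most of
    # the collinear, high-dimensional GCM predictors automatically.
    model = QuantileRegressor(quantile=0.90, alpha=0.1, solver="highs")
    model.fit(X, y)
    print("selected predictors:", np.flatnonzero(model.coef_))
    print("90th-quantile estimate for month 0:", model.predict(X[:1])[0])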

  10. Asbestos: A Lingering Danger. AIO Red Paper #20.

    ERIC Educational Resources Information Center

    Malcolm, Stuart

    Its unique qualities make asbestos extremely useful in industry, yet it is termed one of the most dangerous and insidious substances in the workplace. Composed mostly of fibers, asbestos is readily freed into the atmosphere during handling, constituting a real health risk. There are two ways asbestos can enter the human body: by inhalation or…

  11. Functional constraints on the evolution of long butterfly proboscides: lessons from Neotropical skippers (Lepidoptera: Hesperiidae).

    PubMed

    Bauder, J A S; Morawetz, L; Warren, A D; Krenn, H W

    2015-03-01

    Extremely long proboscides are rare among butterflies outside of the Hesperiidae, yet representatives of several genera of skipper butterflies possess proboscides longer than 50 mm. Although extremely elongated mouthparts can be regarded as advantageous adaptations to gain access to nectar in deep-tubed flowers, the scarcity of long-proboscid butterflies is a phenomenon that has not been adequately accounted for. So far, this scarcity has been explained by functional costs arising from increased flower handling times caused by decelerated nectar intake rates. However, insects can compensate for the negative influence of a long proboscis through changes in the morphological configuration of the feeding apparatus. Here, we measured nectar intake rates in 34 species representing 21 Hesperiidae genera from a Costa Rican lowland rainforest area to explore the impact of proboscis length, cross-sectional area of the food canal and body size on intake rate. Long-proboscid skippers did not suffer from reduced intake rates due to their large body size and enlarged food canals. In addition, video analyses of the flower-visiting behaviour revealed that suction times increased with proboscis length, suggesting that long-proboscid skippers drink a larger amount of nectar from deep-tubed flowers. Despite these advantages, we showed that functional costs of exaggerated mouthparts exist in terms of longer manipulation times per flower. Finally, we discuss the significance of scaling relationships on the foraging efficiency of butterflies and why some skipper taxa, in particular, have evolved extremely long proboscides. © 2015 The Authors. Journal of Evolutionary Biology published by John Wiley & Sons Ltd on behalf of the European Society for Evolutionary Biology.

  12. Recent developments in user-job management with Ganga

    NASA Astrophysics Data System (ADS)

    Currie, R.; Elmsheuser, J.; Fay, R.; Owen, P. H.; Richards, A.; Slater, M.; Sutcliffe, W.; Williams, M.

    2015-12-01

    The Ganga project was originally developed for use by LHC experiments and has been used extensively throughout Run 1 in both LHCb and ATLAS. This document describes some of the most recent developments within the Ganga project. There have been improvements in the handling of large-scale computational tasks in the form of a new GangaTasks infrastructure. Improvements in file handling through a new IGangaFile interface make handling files largely transparent to the end user. In addition, the performance and usability of Ganga have both been addressed through the development of a new queues system that allows for parallel processing of job-related tasks.
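
    For readers unfamiliar with Ganga, job definition in a Ganga session follows a simple declarative pattern. The sketch below uses the documented Job/Executable/Local session API, but exact class names and defaults vary between Ganga versions, so treat it as illustrative rather than exact.

    # Illustrative Ganga-session sketch (run inside a Ganga session, where
    # Job, Executable and Local are predefined session objects).
    j = Job(name="demo")
    j.application = Executable(exe="/bin/echo", args=["hello", "grid"])
    j.backend = Local()   # swap for a batch or grid backend in production
    j.submit()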

  13. Safe gas handling and system design for the large scale production of amorphous silicon based solar cells

    NASA Astrophysics Data System (ADS)

    Fortmann, C. M.; Farley, M. V.; Smoot, M. A.; Fieselmann, B. F.

    1988-07-01

    Solarex is one of the leaders in amorphous silicon based photovoltaic production and research. The large scale production environment presents unique safety concerns related to the quantity of dangerous materials as well as the number of personnel handling these materials. The safety measures explored by this work include gas detection systems, training, and failure-resistant gas handling systems. Our experiences with flow-restricting orifices in the CGA connections and the use of steel cylinders are reviewed. The hazards and efficiency of wet scrubbers for silane exhausts are examined. We have found it useful to provide the scrubbers with temperature alarms.

  14. An Investigation of Large Tilt-Rotor Hover and Low Speed Handling Qualities

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos A.; Decker, William A.; Theodore, Colin R.; Lindsey, James E.; Lawrence, Ben; Blanken, Chris L.

    2011-01-01

    A piloted simulation experiment conducted on the NASA-Ames Vertical Motion Simulator evaluated the hover and low speed handling qualities of a large tilt-rotor concept, with particular emphasis on longitudinal and lateral position control. Ten experimental test pilots evaluated different combinations of Attitude Command-Attitude Hold (ACAH) and Translational Rate Command (TRC) response types, nacelle conversion actuator authority limits and inceptor choices. Pilots performed evaluations in revised versions of the ADS-33 Hover, Lateral Reposition and Depart/Abort MTEs and moderate turbulence conditions. Level 2 handling qualities ratings were primarily recorded using ACAH response type in all three of the evaluation maneuvers. The baseline TRC conferred Level 1 handling qualities in the Hover MTE, but there was a tendency to enter into a PIO associated with nacelle actuator rate limiting when employing large, aggressive control inputs. Interestingly, increasing rate limits also led to a reduction in the handling qualities ratings. This led to the identification of a nacelle rate to rotor longitudinal flapping coupling effect that induced undesired, pitching motions proportional to the allowable amount of nacelle rate. A modification that counteracted this effect significantly improved the handling qualities. Evaluation of the different response type variants showed that inclusion of TRC response could provide Level 1 handling qualities in the Lateral Reposition maneuver by reducing coupled pitch and heave off axis responses that otherwise manifest with ACAH. Finally, evaluations in the Depart/Abort maneuver showed that uncertainty about commanded nacelle position and ensuing aircraft response, when manually controlling the nacelle, demanded high levels of attention from the pilot. Additional requirements to maintain pitch attitude within 5 deg compounded the necessary workload.

  15. The Challenges of Using Horses for Practical Teaching Purposes in Veterinary Programmes

    PubMed Central

    Gronqvist, Gabriella; Rogers, Chris; Gee, Erica; Bolwell, Charlotte; Gordon, Stuart

    2016-01-01

    Simple Summary: Veterinary students often lack previous experience in handling horses and other large animals. This article discusses the challenges of using horses for veterinary teaching purposes and the potential consequences to student and equine welfare. The article proposes a conceptual model to optimise equine welfare, and subsequently student safety, during practical equine handling classes. Abstract: Students enrolled in veterinary degrees often come from an urban background with little previous experience in handling horses and other large animals. Many veterinary degree programmes place importance on the teaching of appropriate equine handling skills, yet within the literature it is commonly reported that time allocated for practical classes often suffers due to time constraint pressure from other elements of the curriculum. The effect of this pressure on animal handling teaching time is reflected in the self-reported low level of animal handling competency, particularly equine, in students with limited prior experience with horses. This is a concern as a naive student is potentially at higher risk of injury to themselves when interacting with horses. Additionally, a naive student with limited understanding of equine behaviour may, through inconsistent or improper handling, increase the anxiety and compromise the welfare of these horses. There is a lack of literature investigating the welfare of horses in university teaching facilities, appropriate handling procedures, and student safety. This article focuses on the importance for students to be able to interpret equine behaviour and the potential consequences of poor handling skills to equine and student welfare. Lastly, the authors suggest a conceptual model to optimise equine welfare, and subsequently student safety, during practical equine handling classes. PMID:27845702

  16. [Accidents in equestrian sports : Analysis of injury mechanisms and patterns].

    PubMed

    Schröter, C; Schulte-Sutum, A; Zeckey, C; Winkelmann, M; Krettek, C; Mommsen, P

    2017-02-01

    Equestrian sports are one of the most popular forms of sport in Germany, while also being one of the most accident-prone. Furthermore, riding accidents are frequently associated with a high degree of injury severity and mortality. Nevertheless, there are insufficient data regarding incidences, demographics, mechanisms of accidents, injury severity and patterns and outcome of injured persons in amateur equestrian sports. Accordingly, it was the aim of the present study to retrospectively analyze these aspects. A total of 503 patients were treated in the emergency room of the Hannover Medical School because of an accident during recreational horse riding between 2006 and 2011. Females were predominantly affected (89.5 %). The mean age of the patients was 26.2 ± 14.9 years, and women (24.5 ± 12.5 years) were on average younger than men (40.2 ± 23.9 years). A special risk group was girls and young women aged between 10 and 39 years. The overall injury severity was measured using the injury severity score (ISS). Based on the total population, the head was the most common injury location (17.3 %), followed by the upper extremities (15.2 %) and the thoracic and lumbar spine (10.9 %). The three most common injury locations after falling from a horse were the head (17.5 %), the upper extremities (17.4 %), and the thoracic and lumbar spine (12.9 %). The most frequent injuries while handling horses were foot injuries (17.2 %), followed by head (16.6 %) and mid-facial injuries (15.0 %). With respect to the mechanism of injury, accidents while riding were predominant (74 %), while accidents when handling horses accounted for only 26 %. The median ISS was 9.8 points. The proportion of multiple trauma patients (ISS > 16) was 18.1 %. Based on the total sample, the average in-hospital patient stay was 5.3 ± 5.4 days, with a significantly higher proportion of hospitalized patients in the group of riding accidents. No fatal cases were found in this study, but the danger of riding should not be underestimated. The large number of sometimes severe injuries, with ISS values up to 62 points, can be interpreted as an indication that recreational riding can easily result in life-threatening situations. Girls and young women were identified as a group at particular risk. This study demonstrated that the three most common injury locations after falling from a horse were the head, the upper extremities, and the thoracic and lumbar spine. The most frequent injuries while handling horses were foot injuries, followed by head and mid-facial injuries.

  17. Development of longitudinal handling qualities criteria for large advanced supersonic aircraft

    NASA Technical Reports Server (NTRS)

    Sudderth, R. W.; Bohn, J. G.; Caniff, M. A.; Bennett, G. R.

    1975-01-01

    Longitudinal handling qualities criteria in terms of airplane response characteristics were developed. The criteria cover high speed cruise maneuvering, landing approach, and stall recovery. Data substantiating the study results are reported.

  18. Comparison of upper extremity glenohumeral joint forces in children with cerebral palsy using anterior and posterior walkers - biomed 2009.

    PubMed

    Strifling, Kelly M B; Konop, Katherine A; Wang, Mei; Harris, Gerald F

    2009-01-01

    Walkers are prescribed with the notion that one type of walker will be better for a child than another. One underlying justification for this practice is the theory that one walker may produce less stress on the upper extremities as the patient uses the walker. Nevertheless, upper extremity joint loading is not typically analyzed during walker-assisted gait in children with spastic diplegic cerebral palsy. It has been difficult to evaluate the theory of walker prescription based on upper extremity stresses because loading on the upper extremities has not been quantified until recently. In this study, weight bearing on the glenohumeral joints was analyzed in five children with spastic diplegic cerebral palsy using both anterior and posterior walkers fitted with 6-axis handle transducers. Though the walkers' effects on the upper extremities proved to be similar between walker types, the differences between the walkers may have some clinical significance in the long run. In general, posterior walker use created larger glenohumeral joint forces. Though these differences are not statistically significant, over time and with repetitive loading they may be clinically significant.

  19. New Surgical Drapes for Observation of the Lower Extremities during Abdominal Aortic Repair.

    PubMed

    Obitsu, Yukio; Shigematsu, Hiroshi; Satou, Kazuhiro; Watanabe, Yoshiko; Saiki, Naozumi; Koizumii, Nobusato

    2010-01-01

    For the early diagnosis and therapy of peripheral thromboembolism (TE) as a complication of abdominal aortic repair (AAR), we developed and evaluated the usefulness of surgical drapes that permit observation of the lower extremities during AAR. Between January 2007 and June 2009, the handling, durability, and usefulness of the new surgical drapes were evaluated during AAR in 157 patients with abdominal aortic aneurysms and 9 patients with peripheral arterial disease. The drapes are manufactured by Hogy Medical Co. Ltd. and made of a water-repellent, spun lace, non-woven fabric, including a transparent polyethylene film that covers the patients' legs. This transparent film enables inspection and palpation of the lower extremities during surgery for early diagnosis and therapy of peripheral TE. As a peripheral complication, 1 patient had right lower extremity TE. This was diagnosed immediately after anastomosis; thrombectomy was performed, and the remaining clinical course was uneventful. In all patients, the drapes permitted observation of the lower extremities, and the dorsal arteries were palpable. There were no problems with durability. The new surgical drapes permit observation of the lower extremities during AAR for early diagnosis and treatment of peripheral TE.

  20. The extremely long-tongued neotropical butterfly Eurybia lycisca (Riodinidae): proboscis morphology and flower handling.

    PubMed

    Bauder, Julia A S; Lieskonig, Nora R; Krenn, Harald W

    2011-03-01

    Few species of true butterflies (Lepidoptera: Papilionoidea) have evolved a proboscis that greatly exceeds the length of the body. This study is the first to examine the morphology of an extremely long butterfly proboscis and to describe how it is used to obtain nectar from flowers with very deep corolla tubes. The proboscis of Eurybia lycisca (Riodinidae) is approximately twice as long as the body. It has a maximal length of 45.6 mm (mean length 36.5 mm ± 4.1 S.D., N = 20) and is extremely thin, measuring only about 0.26 mm at its maximum diameter. The proboscis has a unique arrangement of short sensilla at the tip, and its musculature arrangement is derived. The flower handling times on the preferred nectar plant, Calathea crotalifera (Marantaceae), were exceptionally long (mean 54.5 sec ± 28.5 S.D., N = 26). When feeding on the deep flowers remarkably few proboscis movements occur. The relationship between Eurybia lycisca and its preferred nectar plant and larval host plant, Calathea crotalifera, is not mutualistic since the butterfly exploits the flowers without contributing to their pollination. We hypothesize that the extraordinarily long proboscis of Eurybia lycisca is an adaptation for capitalizing on the pre-existing mutualistic interaction of the host plant with its pollinating long-tongued nectar feeding insects. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. To repair or not to repair: with FAVOR there is no question

    NASA Astrophysics Data System (ADS)

    Garetto, Anthony; Schulz, Kristian; Tabbone, Gilles; Himmelhaus, Michael; Scheruebl, Thomas

    2016-10-01

    In the mask shop, the challenges associated with today's advanced technology nodes, both technical and economic, are becoming increasingly difficult. The constant drive to continue shrinking features means more masks per device, smaller manufacturing tolerances and more complexity along the manufacturing line with respect to the number of manufacturing steps required. Furthermore, the extremely competitive nature of the industry makes it critical for mask shops to optimize asset utilization and processes in order to maximize their competitive advantage and, in the end, profitability. Full maximization of profitability in such a complex and technologically sophisticated environment simply cannot be achieved without the use of smart automation. Smart automation allows productivity to be maximized through better asset utilization and process optimization. Reliability is improved through the minimization of manual interactions, leading to fewer human error contributions and a more efficient manufacturing line. In addition to these improvements in productivity and reliability, extra value can be added through the collection and cross-verification of data from multiple sources, which provides more information about our products and processes. When it comes to handling mask defects, for instance, the process consists largely of time-consuming manual interactions that are error prone and often require quick decisions from operators and engineers who are under pressure. The handling of defects itself is a multi-step process consisting of several iterations of inspection, disposition, repair, review and cleaning steps. Smaller manufacturing tolerances and more complex features contribute to a higher number of defects to be handled, as well as greater complexity in handling them. In this paper, the recent efforts undertaken by ZEISS to provide solutions which address these challenges, particularly those associated with defectivity, will be presented. From automation of aerial image analysis to the use of data-driven decision making to predict and propose an optimized back-end-of-line process flow, productivity and reliability improvements are targeted by smart automation. Additionally, the generation of the ideal aerial image from the design and several repair enhancement features offer additional capabilities to improve the efficiency and yield associated with defect handling.

  2. Accurate high-speed liquid handling of very small biological samples.

    PubMed

    Schober, A; Günther, R; Schwienhorst, A; Döring, M; Lindemann, B F

    1993-08-01

    Molecular biology techniques require the accurate pipetting of buffers and solutions with volumes in the microliter range. Traditionally, hand-held pipetting devices are used to fulfill these requirements, but many laboratories have also introduced robotic workstations for the handling of liquids. Piston-operated pumps are commonly used in manually as well as automatically operated pipettors. These devices cannot meet the demands for extremely accurate pipetting of very small volumes at the high speed that would be necessary for certain applications (e.g., in sequencing projects with high throughput). In this paper we describe a technique for the accurate microdispensation of biochemically relevant solutions and suspensions with the aid of a piezoelectric transducer. It is suitable for liquids with viscosities between 0.5 and 500 millipascal seconds (mPa·s). The obtainable drop sizes range from 5 picoliters to a few nanoliters with up to 10,000 drops per second. Liquids can be dispensed in single or accumulated drops to handle a wide volume range. The system proved to be extremely well suited for the handling of biological samples. It did not show any detectable negative impact on the biological function of dissolved or suspended molecules or particles.

  3. The Management Challenge: Handling Exams Involving Large Quantities of Students, on and off Campus--A Design Concept

    ERIC Educational Resources Information Center

    Larsson, Ken

    2014-01-01

    This paper looks at the process of managing large numbers of exams efficiently and securely with the use of dedicated IT support. The system integrates regulations at different levels, from national to local (even down to departments), and ensures that the rules are employed in all stages of handling the exams. The system has a proven record of…

  4. An Investigation of Large Tilt-Rotor Short-Term Attitude Response Handling Qualities Requirements in Hover

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos; Decker, William A.; Theodore, Colin R.; Blanken, Christopher L.; Berger, Tom

    2010-01-01

    A piloted simulation investigation was conducted using the NASA Ames Vertical Motion Simulator to study the impact of pitch, roll and yaw attitude bandwidth and phase delay on handling qualities of large tilt-rotor aircraft. Multiple bandwidth and phase delay pairs were investigated for each axis. The simulation also investigated the effect that the pilot offset from the center of gravity has on handling qualities. While pilot offset does not change the dynamics of the vehicle, it does affect the proprioceptive and visual cues and it can have an impact on handling qualities. The experiment concentrated on two primary evaluation tasks: a precision hover task and a simple hover pedal turn. Six pilots flew over 1400 data runs with evaluation comments and objective performance data recorded. The paper will describe the experiment design and methodology, discuss the results of the experiment and summarize the findings.

  5. The plasticity of extracellular fluid homeostasis in insects.

    PubMed

    Beyenbach, Klaus W

    2016-09-01

    In chemistry, the ratio of all dissolved solutes to the solution's volume yields the osmotic concentration. The present Review uses this chemical perspective to examine how insects deal with challenges to extracellular fluid (ECF) volume, solute content and osmotic concentration (pressure). Solute/volume plots of the ECF (hemolymph) reveal that insects tolerate large changes in all three of these ECF variables. Challenges beyond those tolerances may be 'corrected' or 'compensated'. While a correction simply reverses the challenge, compensation accommodates the challenge with changes in the other two variables. Most insects osmoregulate by keeping ECF volume and osmotic concentration within a wide range of tolerance. Other insects osmoconform, allowing the ECF osmotic concentration to match the ambient osmotic concentration. Aphids are unique in handling solute and volume loads largely outside the ECF, in the lumen of the gut. This strategy may be related to the apparent absence of Malpighian tubules in aphids. Other insects can suspend ECF homeostasis altogether in order to survive extreme temperatures. Thus, ECF homeostasis in insects is highly dynamic and plastic, which may partly explain why insects remain the most successful class of animals in terms of both species number and biomass. © 2016. Published by The Company of Biologists Ltd.

  6. The Effect of Tool Handle Shape on Hand Muscle Load and Pinch Force in a Simulated Dental Scaling Task

    PubMed Central

    Dong, Hui; Loomer, Peter; Barr, Alan; LaRoche, Charles; Young, Ed; Rempel, David

    2007-01-01

    Work-related upper extremity musculoskeletal disorders, including carpal tunnel syndrome, are prevalent among dentists and dental hygienists. An important risk factor for developing these disorders is forceful pinching which occurs during periodontal work such as dental scaling. Ergonomically designed dental scaling instruments may help reduce the prevalence of carpal tunnel syndrome among dental practitioners. In this study, 8 custom-designed dental scaling instruments with different handle shapes were used by 24 dentists and dental hygienists to perform a simulated tooth scaling task. The muscle activity of two extensors and two flexors in the forearm was recorded with electromyography while thumb pinch force was measured by pressure sensors. The results demonstrated that the instrument handle with a tapered, round shape and a 10 mm diameter required the least muscle load and pinch force when performing simulated periodontal work. The results from this study can guide dentists and dental hygienists in selection of dental scaling instruments. PMID:17156742

  7. HAND TRUCK FOR HANDLING EQUIPMENT

    DOEpatents

    King, D.W.

    1959-02-24

    A truck is described for the handling of large and relatively heavy pieces of equipment and particularly for the handling of ion source units for use in calutrons. The truck includes a chassis and a frame pivoted to the chassis so as to be operable to swing in the manner of a boom. The frame has spaced members so arranged that the device to be handled can be suspended between or passed between these spaced members and also rotated with respect to the frame when the device is secured to the spaced members.

  8. A general method for handling missing binary outcome data in randomized controlled trials.

    PubMed

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-12-01

    The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. We propose a sensitivity analysis where standard analyses, which could include 'missing = smoking' and 'last observation carried forward', are embedded in a wider class of models. We apply our general method to data from two smoking cessation trials involving a total of 489 and 1758 participants. The abstinence outcomes were obtained using telephone interviews. The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. © 2014 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of the Society for the Study of Addiction.

  9. Effects of scapula-upward taping using kinesiology tape in a patient with shoulder pain caused by scapular downward rotation

    PubMed Central

    Kim, Byeong-Jo; Lee, Jung-Hoon

    2015-01-01

    [Purpose] The aim of this study was to evaluate the effects of scapula-upward taping (SUT) in a patient with shoulder pain caused by scapular downward rotation (SDR). [Subject] A 26-year-old male with SDR experienced severe pain in the left shoulder when he lifted his left upper extremity to hold the handle in a bus and during and after push-up exercise. [Methods] The patient underwent SUT for a period of 1 month, 5 times per week, for an average of 9 h/d. [Results] The patient’s radiographs showed that the degree of SDR had decreased; the left shoulder pain also decreased in the resting state and during and after push-up exercise. The manual muscle strength test grades of the upper trapezius, lower trapezius, and serratus anterior had increased. The patient was able to lift the left upper extremity to hold the handle in a bus and perform the push-up exercise without experiencing any pain. [Conclusion] Repeated SUT application may be a beneficial treatment method for alleviating the degree of SDR and shoulder pain in SDR patients. PMID:25729213

  10. A comparison of alternative variants of the lead and lag time TTO.

    PubMed

    Devlin, Nancy; Buckingham, Ken; Shah, Koonal; Tsuchiya, Aki; Tilling, Carl; Wilkinson, Grahame; van Hout, Ben

    2013-05-01

    'Lead Time' TTO improves upon conventional TTO by providing a uniform method for eliciting positive and negative values. This research investigates (i) the values generated from different combinations of time in poor health and in full health; and the order in which these appear (lead vs. lag); (ii) whether values concur with participants' views about states; (iii) methods for handling extreme preferences. n = 208 participants valued five EQ-5D states, using two of four variants. Combinations of lead time and health state duration were: 10 years and 20 years; 5 years and 1 year; 5 years and 10 years; and a health state duration of 5 years with a lag time of 10 years. Longer lead times capture more preferences, but may involve a framing effect. Lag time results in less non-trading for mild states, and less time being traded for severe states. Negative values broadly agree with participants' stated opinion that the state is worse than dead. The values are sensitive to the ratio of lead time to duration of poor health, and the order in which these appear (lead vs. lag). It is feasible to handle extreme preferences though challenges remain. Copyright © 2012 John Wiley & Sons, Ltd.

  11. Semi-supervised and unsupervised extreme learning machines.

    PubMed

    Huang, Gao; Song, Shiji; Gupta, Jatinder N D; Wu, Cheng

    2014-12-01

    Extreme learning machines (ELMs) have proven to be efficient and effective learning mechanisms for pattern classification and regression. However, ELMs are primarily applied to supervised learning problems. Only a few existing research papers have used ELMs to explore unlabeled data. In this paper, we extend ELMs for both semi-supervised and unsupervised tasks based on the manifold regularization, thus greatly expanding the applicability of ELMs. The key advantages of the proposed algorithms are as follows: 1) both the semi-supervised ELM (SS-ELM) and the unsupervised ELM (US-ELM) exhibit learning capability and computational efficiency of ELMs; 2) both algorithms naturally handle multiclass classification or multicluster clustering; and 3) both algorithms are inductive and can handle unseen data at test time directly. Moreover, it is shown in this paper that all the supervised, semi-supervised, and unsupervised ELMs can actually be put into a unified framework. This provides new perspectives for understanding the mechanism of random feature mapping, which is the key concept in ELM theory. Empirical study on a wide range of data sets demonstrates that the proposed algorithms are competitive with the state-of-the-art semi-supervised or unsupervised learning algorithms in terms of accuracy and efficiency.
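
    The basic ELM machinery that SS-ELM and US-ELM extend is compact enough to sketch. The numpy example below shows the random feature mapping and the closed-form ridge solve for the output weights; the manifold-regularization terms that distinguish the semi-supervised and unsupervised variants are omitted, and the toy data are invented.

    import numpy as np

    def elm_train(X, T, n_hidden=100, reg=1e-2, seed=0):
        """Random feature mapping + closed-form ridge solve (basic ELM)."""
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
        b = rng.normal(size=n_hidden)                 # random biases
        H = np.tanh(X @ W + b)                        # random feature mapping
        # Ridge-regularized least squares for the output weights beta.
        beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    # Toy usage: a 2-class problem with one-hot targets.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    labels = (X[:, 0] > 0).astype(int)
    T = np.eye(2)[labels]
    W, b, beta = elm_train(X, T)
    pred = elm_predict(X, W, b, beta).argmax(axis=1)
    print("training accuracy:", (pred == labels).mean())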

  12. Abnormal MRI in a patient with 'headache with neurological deficits and CSF lymphocytosis (HaNDL)'.

    PubMed

    Yilmaz, A; Kaleagasi, H; Dogu, O; Kara, E; Ozge, A

    2010-05-01

    A 27-year-old woman was admitted to the Emergency Department with right upper-extremity numbness and mild weakness followed by a bifrontal throbbing headache for 30 min, which was similar to a 12-h headache that had occurred 3 days earlier. Laboratory tests were unremarkable except for cerebrospinal fluid (CSF) lymphocytic pleocytosis. On the following day, a headache episode with left hemiparesis and hemihypoaesthesia, left hemifield visuospatial inattention, anosognosia and confusion recurred. The headache was diagnosed as headache and neurological deficits with cerebrospinal fluid lymphocytosis (HaNDL) syndrome according to the criteria of the second edition of the International Classification of Headache Disorders. Simultaneously performed magnetic resonance imaging (MRI) revealed swelling of the grey matter, CSF enhancement in the sulci of the right temporal and occipital regions and hypoperfusion of the same brain regions. During the following 10 days two more similar episodes recurred, and during the ensuing 12 months the patient remained headache free. Neuroimaging findings in HaNDL syndrome are generally thought to be virtually normal. The MRI abnormalities in our patient have not been reported in HaNDL syndrome previously, although they have been reported in hemiplegic migraine patients. The findings in our case suggest that hemiplegic migraine and HaNDL syndrome may share a common pathophysiological pathway resulting in similar imaging findings and neurological symptoms.

  13. Extreme Value Analysis of hydro meteorological extremes in the ClimEx Large-Ensemble

    NASA Astrophysics Data System (ADS)

    Wood, R. R.; Martel, J. L.; Willkofer, F.; von Trentini, F.; Schmid, F. J.; Leduc, M.; Frigon, A.; Ludwig, R.

    2017-12-01

    Many studies show an increase in the magnitude and frequency of hydrological extreme events in the course of climate change. However, the contribution of natural variability to the magnitude and frequency of hydrological extreme events is not yet settled. A reliable estimate of extreme events is of great interest for water management and public safety. In the course of the ClimEx Project (www.climex-project.org) a new single-model large-ensemble was created by dynamically downscaling the CanESM2 large-ensemble with the Canadian Regional Climate Model version 5 (CRCM5) for a European domain and a Northeastern North-American domain. By utilizing the ClimEx 50-Member Large-Ensemble (CRCM5 driven by the CanESM2 Large-Ensemble), a thorough analysis of natural variability in extreme events is possible. Are current extreme value statistical methods able to account for natural variability? How large is the natural variability for, e.g., a 1/100-year return period derived from a 50-member large-ensemble for Europe and Northeastern North America? These questions are addressed by applying various generalized extreme value (GEV) distributions to the ClimEx Large-Ensemble. Return levels (5-, 10-, 20-, 30-, 60- and 100-year) based on time series of various lengths (20-, 30-, 50-, 100- and 1500-year) are analyzed for the maximum one-day precipitation (RX1d), the maximum three-hourly precipitation (RX3h) and the streamflow of selected catchments in Europe. The long time series of the ClimEx Ensemble (7500 years) allow us to give a first reliable estimate of the magnitude and frequency of certain extreme events.
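
    The return-level computation behind such an analysis can be sketched compactly. The example below assumes scipy's genextreme distribution (whose shape parameter c is the negative of the climatological xi convention) and a synthetic series of annual maxima standing in for the ClimEx data.

    from scipy.stats import genextreme

    # Synthetic annual precipitation maxima (mm), one value per year.
    annual_max = genextreme.rvs(c=-0.1, loc=40.0, scale=10.0, size=50,
                                random_state=42)

    shape, loc, scale = genextreme.fit(annual_max)
    for T in (5, 10, 20, 30, 60, 100):
        # The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
        level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
        print(f"{T:3d}-year return level: {level:6.1f} mm")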

  14. Complaint handling in healthcare: expectation gaps between physicians and the public; results of a survey study.

    PubMed

    Friele, R D; Reitsma, P M; de Jong, J D

    2015-10-01

    Patients who submit complaints about the healthcare they have received are often dissatisfied with the response to their complaints. This is usually attributed to the failure of physicians to respond adequately to what complainants want, e.g. an apology or an explanation. However, expectations of complaint handling among the public may colour how they evaluate the way their own complaint is handled. This descriptive study assesses expectations of complaint handling in healthcare among the public and physicians. Negative public expectations, and the gap between these expectations and those of physicians, may explain patients' dissatisfaction with complaints procedures. We held two surveys: one among physicians, using a panel of 3366 physicians (response rate 57 %; the panel contained all kinds of physicians, such as GPs, medical specialists and physicians working in nursing homes), and one among the public, using the Dutch Healthcare Consumer Panel (n = 1422, response rate 68 %). We asked both panels identical questions about their expectations of how complaints are handled in healthcare. Differences in expectation scores between the public and the physicians were tested using non-parametric tests. The public have negative expectations about how complaints are handled. Physicians' expectations are far more positive, demonstrating large expectation gaps between physicians and the public. The large expectation gap means that when the public and physicians meet because of a complaint, they are likely to start off with opposite expectations of the situation. This is not a favourable condition for a positive outcome of a complaints procedure. The negative public preconceptions about the way their complaint will be handled will prove hard to change during the process of complaints handling. People tend to see what they thought would happen, almost inevitably leading to a negative judgement about how their complaint was handled.

  15. MSL's Widgets: Adding Robustness to Martian Sample Acquisition, Handling, and Processing

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Kennedy, Brett; Lin, Justin; DeGrosse, Patrick; Cady, Ian; Onufer, Nicholas; Sigel, Deborah; Jandura, Louise; Anderson, Robert; Katz, Ira

    2013-01-01

    Mars Science Laboratory's (MSL) Sample Acquisition, Sample Processing and Handling (SA-SPaH) system is one of the most ambitious terrain interaction and manipulation systems ever built and successfully used outside of planet Earth. Mars has a ruthless environment that has surprised many who have tried to explore there. The robustness widget program was implemented by the MSL project to help ensure the SA-SPaH system would be robust enough to withstand the surprises of this ruthless Martian environment. The robustness widget program was carried out under extreme schedule pressure and responsibility, but was accomplished with resounding success. This paper will focus on a behind-the-scenes look at MSL's robustness widgets: the particle fun zone, the wind guards, and the portioner pokers.

  16. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation

    PubMed Central

    Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2014-01-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal to eventually make it usable in a clinical setting. PMID:27081299

  17. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation.

    PubMed

    Lee, Jae H; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T; Seo, Youngho

    2014-11-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal to eventually make it usable in a clinical setting.
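
    To make the algorithm behind both records concrete, here is a minimal serial MLEM iteration in Python; the system matrix A and measurement vector y are synthetic placeholders, and the Spark/GraphX parallelization of the papers is not reproduced here.

        # Minimal MLEM sketch (toy inputs), the update the papers parallelize
        # on Spark/GraphX: x <- x * (A^T (y / A x)) / (A^T 1).
        import numpy as np

        def mlem(A, y, n_iter=50):
            x = np.ones(A.shape[1])              # uniform initial image
            sens = A.sum(axis=0)                 # sensitivity (column sums)
            for _ in range(n_iter):
                proj = A @ x                     # forward projection
                ratio = y / np.maximum(proj, 1e-12)
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x

        rng = np.random.default_rng(0)
        A = rng.random((64, 16))                 # toy system matrix
        y = rng.poisson(A @ rng.random(16))      # toy Poisson counts
        print(mlem(A, y)[:4])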

  18. Design and simulation of integration system between automated material handling system and manufacturing layout in the automotive assembly line

    NASA Astrophysics Data System (ADS)

    Seha, S.; Zamberi, J.; Fairu, A. J.

    2017-10-01

    A material handling system (MHS) is an important contributor to plant productivity and is recognized as an integral part of today's manufacturing systems. MHS technology and equipment types have grown tremendously. Based on observation of the case study, issues involving the material handling system contributed to reduced production efficiency. This paper proposes a new design integrating the material handling system with the manufacturing layout by investigating the influence of both. An approach using Delmia Quest simulation software is introduced, and the simulation results are used to assess the influence of this integration on the performance of an automotive assembly line. The results show that assembly line output increases by more than 31% over the current system. The average source throughput rate went up to 252 units per working hour in model 3, showing the effectiveness of the pick-to-light system as efficient storage equipment. Overall, the results show that the automated guided vehicle (AGV) and the pick-to-light system had a large, significant effect on the automotive assembly line, and the change of layout also yielded a large, significant improvement in performance.

  19. Ergonomics study on the handle length and lift angle for the culinary spatula.

    PubMed

    Wu, Swei-Pi; Hsieh, Chang-Sheng

    2002-09-01

    The culinary spatula (turning shovel) is one of the most common cooking tools used in the kitchen in Asia. However, the culinary spatula has seldom been ergonomically investigated. When a person uses a spatula to cook food, the operations involve repetitive bent-wrist motions, such as dorsiflexion, palmar flexion, and radial and ulnar deviations. These movements may cause cumulative trauma disorders in the upper extremities, in particular carpal tunnel syndrome. A poorly designed culinary spatula will be ergonomically inefficient and cause injury to the hand and wrist. The purpose of this study was to investigate the effects of spatula handle length and lift angle on food-frying, food-turning, and food-shoveling performance. Eight female subjects were tested using 16 different culinary spatulas, with four handle lengths (20, 25, 30 and 35 cm) and four lift angles (15, 25, 35 and 45 degrees). The criterion measures included cooking performance and rating of perceived exertion. The subjects ranked their preference after all of the tasks in the tests were completed. The results showed that: (1) The handle length had a significant influence on cooking performance and rating of perceived exertion. The optimal handle lengths for frying food, turning food, and shoveling food were 20, 25 and 25 cm, respectively. (2) The lift angle significantly affected cooking performance and rating of perceived exertion. The optimal lift angles for frying food, turning food, and shoveling food were 15, 15 and 25 degrees, respectively. (3) Both the handle length and lift angle had significant effects on subjective preference. For the handle length, the 20 cm length was the best; for the lift angle, the 25 degrees angle was the best. (4) In general, a spatula with a 20 cm handle length and 25 degrees lift angle was the best; a spatula with a 25 cm handle length and 15 degrees lift angle was the second most preferred. (5) However, to prevent users from touching the edge of a hot pan, a spatula with a 25 cm handle length and 25 degrees lift angle is suggested.

  20. PeakRanger: A cloud-enabled peak caller for ChIP-seq data

    PubMed Central

    2011-01-01

    Background: Chromatin immunoprecipitation (ChIP), coupled with massively parallel short-read sequencing (seq), is used to probe chromatin dynamics. Although there are many algorithms to call peaks from ChIP-seq datasets, most are tuned either to handle punctate sites, such as transcription factor binding sites, or broad regions, such as histone modification marks; few can do both. Other algorithms are limited in their configurability, performance on large data sets, and ability to distinguish closely-spaced peaks. Results: In this paper, we introduce PeakRanger, a peak caller software package that works equally well on punctate and broad sites, can resolve closely-spaced peaks, has excellent performance, and is easily customized. In addition, PeakRanger can be run in a parallel cloud computing environment to obtain extremely high performance on very large data sets. We present a series of benchmarks to evaluate PeakRanger against 10 other peak callers, and demonstrate the performance of PeakRanger on both real and synthetic data sets. We also present real-world usages of PeakRanger, including peak-calling in the modENCODE project. Conclusions: Compared to other peak callers tested, PeakRanger offers improved resolution in distinguishing extremely closely-spaced peaks. PeakRanger has above-average spatial accuracy in terms of identifying the precise location of binding events. PeakRanger also has excellent sensitivity and specificity in all benchmarks evaluated. In addition, PeakRanger offers significant improvements in run time when running on a single processor system, and very marked improvements when allowed to take advantage of the MapReduce parallel environment offered by a cloud computing resource. PeakRanger can be downloaded at the official site of the modENCODE project: http://www.modencode.org/software/ranger/ PMID:21554709

  1. Urban heat stress: novel survey suggests health and fitness as future avenue for research and adaptation strategies

    NASA Astrophysics Data System (ADS)

    Schuster, Christian; Honold, Jasmin; Lauf, Steffen; Lakes, Tobia

    2017-04-01

    Extreme heat has tremendous adverse effects on human health. Heat stress is expected to further increase due to urbanization, an aging population, and global warming. Previous research has identified correlations between extreme heat and mortality. However, the underlying physical, behavioral, environmental, and social risk factors remain largely unknown and comprehensive quantitative investigation on an individual level is lacking. We conducted a new cross-sectional household questionnaire survey to analyze individual heat impairment (self-assessed and reported symptoms) and a large set of potential risk factors in the city of Berlin, Germany. This unique dataset (n = 474) allows for the investigation of new relationships, especially between health/fitness and urban heat stress. Our analysis found previously undocumented associations, leading us to generate new hypotheses for future research: various health/fitness variables returned the strongest associations with individual heat stress. Our primary hypothesis is that age, the most commonly used risk factor, is outperformed by health/fitness as a dominant risk factor. Related variables seem to more accurately represent humans’ cardiovascular capacity to handle elevated temperature. Among them, active travel was associated with reduced heat stress. We observed statistical associations for heat exposure regarding the individual living space but not for the neighborhood environment. Heat stress research should further investigate individual risk factors of heat stress using quantitative methodologies. It should focus more on health and fitness and systematically explore their role in adaptation strategies. The potential of health and fitness to reduce urban heat stress risk means that encouraging active travel could be an effective adaptation strategy. Through reduced CO2 emissions from urban transport, societies could reap double rewards by addressing two root causes of urban heat stress: population health and global warming.

  2. Analysis of the dependence of extreme rainfalls

    NASA Astrophysics Data System (ADS)

    Padoan, Simone; Ancey, Christophe; Parlange, Marc

    2010-05-01

    The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speed or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects, given their focus on mean process levels. The areal modelling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial in flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have a similar asymptotic motivation to the univariate generalized extreme value (GEV) distribution, but provide a general approach to modeling extreme processes that incorporates temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed by [3] and [4]. This work illustrates methods for the statistical modelling of spatial extremes and gives examples of their use through an extremal data analysis of Swiss precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of Extremes. Journal of the Royal Statistical Society, Series B. To appear.

  3. Restructuring Big Data to Improve Data Access and Performance in Analytic Services Making Research More Efficient for the Study of Extreme Weather Events and Application User Communities

    NASA Astrophysics Data System (ADS)

    Ostrenga, D.; Shen, S.; Vollmer, B.; Meyer, D. L.

    2017-12-01

    The NASA MERRA-2 climate reanalysis contains numerous atmosphere, land, and ocean variables, grouped into 95 products with an archived volume of over 300 TB. The data are saved as hourly files, day files (at hourly time steps) and month files containing up to 125 parameters. Due to the large number of data files and the sheer data volumes, it is challenging for users, especially those in the application research community, to work with the original data files. Most of these researchers prefer to focus on a small region or a single location, using the hourly data over long time periods to analyze extreme weather events or, say, winds for renewable energy applications. At the GES DISC, we have been working closely with the science teams and the application user community to create several new value-added data products and high-quality services to facilitate the use of the model data for various types of research. We have tested converting the hourly data from one day per file into different data cubes, such as one month, one year, or whole mission, and then analyzed how efficiently the restructured data can be accessed through various services. Initial results show that, compared to the original file structure, the new structure significantly improves performance when accessing long time series. Performance is associated with the cube size and structure, the compression method, and how the data are accessed. The optimized data cube structure will not only improve data access, but also enable better online analytic services for statistical analysis and extreme event mining. Two case studies using the newly structured data and value-added services will be presented: the California drought and the extreme drought of the northeastern states of Brazil. Furthermore, data access and analysis through cloud storage capabilities will be investigated.
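
    The restructuring idea can be sketched with xarray: combine many hourly files, rechunk so that each chunk holds a long time series over a small spatial tile, and write an analysis-ready store. The file pattern, variable name and chunk sizes below are hypothetical choices, not the actual MERRA-2 layout or the GES DISC implementation.

        # Hedged sketch: rechunk hourly reanalysis files for time-series access.
        import xarray as xr

        ds = xr.open_mfdataset("merra2_hourly_*.nc4", combine="by_coords")
        ds = ds.chunk({"time": -1, "lat": 20, "lon": 20})   # whole time axis per spatial tile
        ds.to_zarr("merra2_timeseries.zarr", mode="w")

        # A multi-year single-point series now reads from only one spatial tile.
        series = xr.open_zarr("merra2_timeseries.zarr")["T2M"].sel(
            lat=40.0, lon=-105.0, method="nearest")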

  4. Coastal Hazards and Integration of Impacts in Local Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Knudsen, P.; Sorensen, C.; Molgaard, M. R.; Broge, N. H.; Andersen, O. B.

    2016-12-01

    Data on sea and groundwater levels, precipitation, land subsidence, geology, and geotechnical soil properties are combined with information on flood and erosion protection measures to analyze water-related impacts from climate change at an exposed coastal location. Future sea extremes will have a large impact but several coupled effects in the hydrological system need to be considered as well to provide for optimal protection and mitigation efforts. For instance, the investment and maintenance costs of securing functional water and wastewater pipes are significantly reduced by incorporating knowledge about climate change. The translation of regional sea level rise evidence and projections to concrete impact measures should take into account the potentially affected stakeholders who must collaborate on common and shared adaptation solutions. Here, knowledge integration across levels of governance and between research, private and public institutions, and the local communities provides: understanding of the immediate and potential future challenges; appreciation of different stakeholder motives, business agendas, legislative constraints etc., and a common focus on how to cost-efficiently adapt to and manage impacts of climate change. By construction of a common working platform that is updated with additional data and knowledge, e.g. from future regional models or extreme events, advances in sea level research can more readily be translated into concrete and local impact measures in a way that handles uncertainties in the future climate and urban development as well as suiting the varying stakeholder needs.

  5. PMHT Approach for Multi-Target Multi-Sensor Sonar Tracking in Clutter.

    PubMed

    Li, Xiaohua; Li, Yaan; Yu, Jing; Chen, Xiao; Dai, Miao

    2015-11-06

    Multi-sensor sonar tracking has many advantages, such as the potential to reduce the overall measurement uncertainty and the possibility to hide the receiver. However, multi-target multi-sensor sonar tracking is challenging because of the complexity of the underwater environment, especially the low target detection probability and the extremely large number of false alarms caused by reverberation. In this work, to solve the problem of multi-target multi-sensor sonar tracking in the presence of clutter, a novel probabilistic multi-hypothesis tracker (PMHT) approach based on the extended Kalman filter (EKF) and unscented Kalman filter (UKF) is proposed. The PMHT can efficiently handle the unknown measurements-to-targets and measurements-to-transmitters data association ambiguity. The EKF and UKF are used to deal with the high degree of nonlinearity in the measurement model. The simulation results show that the proposed algorithm can greatly improve target tracking performance in a cluttered environment at a low computational load.
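
    As a sketch of the filtering building block, the following Python snippet shows one EKF measurement update with a generic range/bearing sonar-style measurement model; the model, noise values and state are illustrative stand-ins, not the paper's bistatic multi-sensor formulation.

        # Minimal EKF measurement update (assumed toy model).
        import numpy as np

        def ekf_update(x, P, z, R, h, H_jac):
            H = H_jac(x)                        # Jacobian of h at the current state
            S = H @ P @ H.T + R                 # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
            x_new = x + K @ (z - h(x))          # corrected state
            P_new = (np.eye(len(x)) - K @ H) @ P
            return x_new, P_new

        # Range-bearing measurement of a 2D position state [px, py].
        h = lambda x: np.array([np.hypot(x[0], x[1]), np.arctan2(x[1], x[0])])
        def H_jac(x):
            r = np.hypot(x[0], x[1])
            return np.array([[x[0] / r, x[1] / r],
                             [-x[1] / r**2, x[0] / r**2]])

        x, P = np.array([100.0, 50.0]), np.eye(2) * 25.0
        z, R = np.array([112.0, 0.47]), np.diag([4.0, 0.001])
        x, P = ekf_update(x, P, z, R, h, H_jac)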

  6. Computational analysis of drop formation before and after the first singularity: the fate of free and satellite drops during simple dripping and DOD drop formation

    NASA Astrophysics Data System (ADS)

    Chen, Alvin U.; Basaran, Osman A.

    2000-11-01

    Drop formation from a capillary --- dripping mode --- or an ink jet nozzle --- drop-on-demand (DOD) mode --- falls into a class of scientifically challenging yet practically useful free surface flows that exhibit a finite time singularity, i.e. the breakup of an initially single liquid mass into two or more fragments. While computational tools to model such problems have been developed recently, they lack the accuracy needed to quantitatively predict all the dynamics observed in experiments. Here we present a new finite element method (FEM) based on a robust algorithm for elliptic mesh generation and remeshing to handle extremely large interface deformations. The new algorithm allows continuation of computations beyond the first singularity to track fates of both primary and any satellite drops. The accuracy of the computations is demonstrated by comparison of simulations with experimental measurements made possible with an ultra high-speed digital imager capable of recording 100 million frames per second.

  7. Microfluidics on liquid handling stations (μF-on-LHS): a new industry-compatible microfluidic platform

    NASA Astrophysics Data System (ADS)

    Kittelmann, Jörg; Radtke, Carsten P.; Waldbaur, Ansgar; Neumann, Christiane; Hubbuch, Jürgen; Rapp, Bastian E.

    2014-03-01

    Since its early days, microfluidics as a scientific discipline has been an interdisciplinary research field with a wide scope of potential applications. Besides tailored assays for point-of-care (PoC) diagnostics, microfluidics has been an important tool for large-scale screening of reagents and building blocks in organic chemistry, pharmaceutics and medical engineering. Furthermore, numerous potential marketable products have been described over the years. However, especially in industrial applications, microfluidics is often considered only an alternative technology for fluid handling, a field which is industrially dominated by large-scale numerically controlled fluid and liquid handling stations. Numerous noteworthy products have dominated this field in the last decade and have inhibited the widespread application of microfluidics technology. However, automated liquid handling stations and microfluidics do not have to be considered mutually exclusive approaches. We have recently introduced a hybrid fluidic platform combining an industrially established liquid handling station and a generic microfluidic interfacing module that allows probing a microfluidic system (such as an assay or a synthesis array) using the instrumentation provided by the liquid handling station. We term this technology "Microfluidics on Liquid Handling Stations (μF-on-LHS)" - a classical "best of both worlds" approach that combines the highly evolved, automated and industry-proven LHS systems with any type of microfluidic assay. In this paper we show, to the best of our knowledge, the first droplet microfluidics application on an industrial LHS using the μF-on-LHS concept.

  8. Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation

    NASA Astrophysics Data System (ADS)

    Tchiguirinskaia, Ioulia; Scherzer, Daniel

    2016-04-01

    Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating the extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the nonlinear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework, extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but some more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. It is therefore no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In a way, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate a cyclone with the help of a 3D ordinary differential system. Being well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for predictability and prediction of processes with long-range dependence.
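
    For reference, the classical Euler gyroscope equations that the cascade construction starts from can be written in principal body axes as follows (a standard textbook form, not the authors' cascade generalisation):

        % Euler's rigid-body (gyroscope) equations, principal axes
        I_1\,\dot{\omega}_1 = (I_2 - I_3)\,\omega_2\omega_3 + M_1
        I_2\,\dot{\omega}_2 = (I_3 - I_1)\,\omega_3\omega_1 + M_2
        I_3\,\dot{\omega}_3 = (I_1 - I_2)\,\omega_1\omega_2 + M_3

    where the I_i are the principal moments of inertia, the ω_i the body-frame angular velocity components, and the M_i the applied torques; the quadratic coupling between the ω_i is the nonlinearity that the cascade construction propagates across scales.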

  9. Alumina Handling Dustiness

    NASA Astrophysics Data System (ADS)

    Authier-Martin, Monique

    Dustiness of calcined alumina is a major concern, causing undesirable working conditions and serious alumina losses. These losses occur primarily during unloading and handling or pot loading and crust breaking. The handling side of the problem is first addressed. The Perra pulvimeter constitutes a simple and reproducible tool to quantify handling dustiness and yields results in agreement with plant experience. Attempts are made to correlate dustiness with bulk properties (particle size, attrition index, …) for a large number of diverse aluminas. The characterization of the dust generated with the Perra pulvimeter is most revealing. The effect of the addition of E.S.P. dust is also reported.

  10. Extreme alien light allows survival of terrestrial bacteria

    NASA Astrophysics Data System (ADS)

    Johnson, Neil; Zhao, Guannan; Caycedo, Felipe; Manrique, Pedro; Qi, Hong; Rodriguez, Ferney; Quiroga, Luis

    2013-07-01

    Photosynthetic organisms provide a crucial coupling between the Sun's energy and metabolic processes supporting life on Earth. Searches for extraterrestrial life focus on seeking planets with similar incident light intensities and environments. However, the impact of abnormal photon arrival times has not been considered. Here we present the counterintuitive result that broad classes of extreme alien light could support terrestrial bacterial life, whereas sources more similar to our Sun might not. Our detailed microscopic model uses state-of-the-art empirical inputs, including Atomic Force Microscopy (AFM) images. It predicts a highly nonlinear survivability for the basic lifeform Rsp. photometricum, whereby toxic photon feeds get converted into a benign metabolic energy supply by an interplay between the membrane's spatial structure and temporal excitation processes. More generally, our work suggests a new handle for manipulating terrestrial photosynthesis using currently available photon sources with extreme-value statistics.

  11. A low cost, high precision extreme/harsh cold environment, autonomous sensor data gathering and transmission platform.

    NASA Astrophysics Data System (ADS)

    Chetty, S.; Field, L. A.

    2014-12-01

    SWIMS III is a low-cost, autonomous sensor data gathering platform developed specifically for extreme/harsh cold environments. The Arctic Ocean's continuing decrease in summertime ice reflects rapidly diminishing multi-year ice due to the effects of climate change. Ice911 Research aims to develop environmentally inert materials that, when deployed, will increase the albedo, enabling the formation and/or preservation of multi-year ice. SWIMS III's sophisticated autonomous sensors are designed to measure the albedo, weather, water temperature and other environmental parameters. The platform uses low-cost, high-accuracy/precision sensors and an extreme-environment command and data handling computer system with satellite and terrestrial wireless communications. The system also incorporates tilt sensors and sonar-based ice thickness sensors. It is lightweight and can be deployed by hand by a single person. This presentation covers the technical and design challenges in developing and deploying these platforms.

  12. Military Standardization Handbook: Aircraft Refueling Handbook

    DTIC Science & Technology

    1992-10-20

    explosive serious deterioration of many rubber materials. It is as AVGAS. All aviation fuels must be handled therefore extremely important that only…additives are advised dissolved water a fuel will hold in parts per million to follow all instructions, wear gloves and aprons. (ppm) is approximately…FSII prevents the growth of contain one of these additives may cause abnormal funguses and other microorganisms which can wear or malfunctions or

  13. Criteria for Handling Qualities of Military Aircraft.

    DTIC Science & Technology

    1982-06-01

    loop precognitive manner. The pilot is able to apply discrete, step-like inputs which more or less exactly produce the desired aircraft response. Some…While closed loop operation depends upon the frequency domain response characteristics, successful precognitive control requires the time domain…represents the other extreme of the pilot task from the precognitive time response situation. Much work was done in attempting to predict pilot opinion from

  14. Modular mechatronic system for stationary bicycles interfaced with virtual environment for rehabilitation

    PubMed Central

    2014-01-01

    Background: Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges in implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirements for efficacious cycling, specifically recruitment of both extremities and exercising at a high intensity. Methods: In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handlebars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. Results: The modular mechatronic kit for exercise bicycles was evaluated in bench and human tests. Bench tests performed on the sensorized handlebars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. Conclusions: The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most bicycles into VR cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders. PMID:24902780

  15. Characterization of extreme years in Central Europe between 2000 and 2016 according to specific vegetation characteristics based on Earth Observatory data

    NASA Astrophysics Data System (ADS)

    Kern, Anikó; Marjanović, Hrvoje; Barcza, Zoltán

    2017-04-01

    Extreme weather events frequently occur in Central Europe, affecting the state of the vegetation over large areas. Droughts and heat waves affect all plant functional types, but the response of the vegetation is not uniform and depends on other parameters, plant strategies and the antecedent meteorological conditions. Meteorologists struggle with the definition of extreme events and the selection of years that can be considered extreme in terms of meteorological conditions, due to the large variability of the meteorological parameters in both time and space. One way to overcome this problem is to define extreme weather based on its observed effect on plant state. The Normalized Difference Vegetation Index (NDVI), the Enhanced Vegetation Index (EVI), the Leaf Area Index (LAI), the Fraction of Photosynthetically Active Radiation (FPAR) and the Gross Primary Production (GPP) are different measures of land vegetation derived from remote sensing data, providing information about plant state, but it is less known how weather anomalies affect these measures. We used the official vegetation products created from the measurements of the MODerate resolution Imaging Spectroradiometer (MODIS) on board the Terra satellite to select and characterize the extreme years in Central European countries during the 2000-2016 period. The applied Collection-6 MOD13 NDVI/EVI, MOD15 LAI/FPAR and MOD17 GPP datasets have 500 m × 500 m spatial resolution covering the region of the Carpathian Basin. After quality and noise filtering (and temporal interpolation in the case of MOD13), 8-day anomaly values were derived to investigate the different years. The freely available FORESEE meteorological database was used to study climate variability in the region. Daily precipitation and maximum/minimum temperature fields on a 1/12° × 1/12° grid were resampled to the 8-day temporal and 500 m × 500 m spatial resolution of the MODIS products. To discriminate the behavior of the various plant functional types, the MODIS (MCD12) and CORINE (CLC2012) land cover datasets were applied and handled together. Based on the determination of reliable pixels for the different plant types, the responses of broadleaf forests, coniferous forests, grasslands and croplands were discriminated and investigated. Characteristic time periods were selected based on the remote sensing data to define anomalies, and the meteorological data were then used to define the critical periods within the year that have the strongest effect on the observed anomalies. Similarities and dissimilarities between the behaviors of the different remotely sensed measures are also studied to elucidate the consistency of the indices. The results indicate that the diverse remote sensing indices typically co-vary but reveal strong plant functional type dependency. The study suggests that selecting extreme years based on annual data is not the best choice, as shorter periods within the years explain the anomalies better than annual data. The results can be used to select anomalous years outside of the satellite era as well. Keywords: remote sensing; meteorology; extreme years; MODIS; NDVI; EVI; LAI; FPAR; GPP; phenology
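
    The anomaly step can be illustrated in a few lines of Python: subtract each pixel's mean annual cycle of 8-day composites from the observed values. The array shape and the growing-season window below are hypothetical placeholders; the real processing also includes the quality filtering and interpolation described above.

        # Hedged sketch: per-pixel 8-day anomalies from a mean annual cycle.
        import numpy as np

        rng = np.random.default_rng(1)
        ndvi = rng.random((17, 46, 50, 50))      # (years, 8-day periods, y, x), toy data

        climatology = np.nanmean(ndvi, axis=0)   # mean annual cycle per pixel
        anomaly = ndvi - climatology             # anomaly per year and period

        # Rank years by mean growing-season anomaly to flag candidate extremes.
        growing = anomaly[:, 12:30].mean(axis=(1, 2, 3))
        print(np.argsort(growing)[:3])           # indices of the most negative years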

  16. Large-Scale Meteorological Patterns Associated with Extreme Precipitation in the US Northeast

    NASA Astrophysics Data System (ADS)

    Agel, L. A.; Barlow, M. A.

    2016-12-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. Tropopause height provides a compact representation of large-scale circulation patterns, as it is linked to mid-level circulation, low-level thermal contrasts and low-level diabatic heating. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into a larger context. Six tropopause patterns are identified on extreme days: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns show various degrees of anomalously strong upward motion during, and moisture transport preceding, extreme precipitation events.
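
    A minimal version of the KMC step can be sketched with scikit-learn: flatten each daily tropopause-height map into a vector, standardize, and cluster. The synthetic array and grid size are placeholders; the study clusters only extreme-precipitation days and retains six patterns.

        # Hedged sketch: k-means on daily tropopause-height maps.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        days, ny, nx = 500, 20, 30
        tropopause = rng.random((days, ny, nx))   # toy daily height fields

        X = tropopause.reshape(days, -1)          # one flattened map per day
        X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each grid point

        km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(X)
        patterns = km.cluster_centers_.reshape(6, ny, nx)  # composite patterns
        print(np.bincount(km.labels_))            # days assigned to each pattern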

  17. Handling Missing Data With Multilevel Structural Equation Modeling and Full Information Maximum Likelihood Techniques.

    PubMed

    Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda

    2016-08-01

    With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.
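
    FIML itself is typically run inside SEM software, but the multiple-imputation alternative the review contrasts it with can be sketched in Python with scikit-learn's (still experimental) IterativeImputer; the toy data and the simple pooling of column means are illustrative only.

        # Hedged sketch of multiple imputation (not FIML): several stochastic
        # completions of a dataset with ~20% missing values, then pooling.
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        rng = np.random.default_rng(3)
        X = rng.normal(size=(200, 4))
        X[rng.random(X.shape) < 0.2] = np.nan

        estimates = []
        for seed in range(5):
            imp = IterativeImputer(sample_posterior=True, random_state=seed)
            estimates.append(imp.fit_transform(X).mean(axis=0))
        print(np.mean(estimates, axis=0))         # pooled column means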

  18. Safe Patient Handling and Mobility: Development and Implementation of a Large-Scale Education Program.

    PubMed

    Lee, Corinne; Knight, Suzanne W; Smith, Sharon L; Nagle, Dorothy J; DeVries, Lori

    This article addresses the development, implementation, and evaluation of an education program for safe patient handling and mobility at a large academic medical center. The ultimate goal of the program was to increase safety during patient mobility/transfer and reduce nursing staff injury from lifting/pulling. This comprehensive program was designed on the basis of the principles of prework, application, and support at the point of care. A combination of online learning, demonstration, skill evaluation, and coaching at the point of care was used to achieve the goal. Specific roles and responsibilities were developed to facilitate implementation. It took 17 master trainers, 88 certified trainers, 176 unit-based trainers, and 98 coaches to put 3706 nurses and nursing assistants through the program. Evaluations indicated both an increase in knowledge about safe patient handling and an increased ability to safely mobilize patients. The challenge now is sustainability of safe patient-handling practices and the growth and development of trainers and coaches.

  19. Discrete Element Modeling of Triboelectrically Charged Particles

    NASA Technical Reports Server (NTRS)

    Hogue, Michael D.; Calle, Carlos I.; Weitzman, Peter S.; Curry, David R.

    2008-01-01

    Tribocharging of particles is common in many processes including fine powder handling and mixing, printer toner transport and dust extraction. In a lunar environment, with its high vacuum and lack of water, electrostatic forces are an important factor to consider when designing and operating equipment. Dust mitigation and management is critical to safe and predictable performance of people and equipment. The extreme nature of lunar conditions makes it difficult and costly to carry out experiments on Earth which are necessary to better understand how particles gather and transfer charge between each other and with equipment surfaces. Discrete element modeling (DEM) provides an excellent virtual laboratory for studying tribocharging of particles, as well as for design of devices for dust mitigation and for other purposes related to handling and processing of lunar regolith. Theoretical and experimental work has been performed to incorporate screened Coulombic electrostatic forces into EDEM, a commercial DEM software package. The DEM software is used to model the trajectories of large numbers of particles for industrial particulate handling and processing applications, and can be coupled with other solvers and numerical models to calculate particle interaction with surrounding media and force fields. While the simple Coulombic force between two particles is well understood, its operation in an ensemble of particles is more complex. When the tribocharging of particles and surfaces due to frictional contact is also considered, it becomes necessary to account for longer-range particle interactions in response to electrostatic charging. The standard DEM algorithm accounts for particle mechanical properties and inertia as a function of particle shape and mass. If fluid drag is neglected, particle dynamics are governed by contact between particles, contact between particles and equipment surfaces, and gravity. Consideration of particle charge, tribocharging and electric field effects requires calculation of the forces due to these effects.
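
    The pair force being added can be written down compactly. The sketch below implements a screened Coulombic (Yukawa-type) force between two charged particles; the charges and screening length are illustrative values, not parameters from the EDEM work.

        # Hedged sketch: screened Coulomb (Yukawa) pair force,
        # F(r) = k q1 q2 / r^2 * exp(-r/L) * (1 + r/L), directed along r_vec.
        import numpy as np

        K_E = 8.9875517923e9                     # Coulomb constant, N m^2 / C^2

        def screened_coulomb_force(q1, q2, r_vec, screen_len):
            # Force on particle 1; r_vec points from particle 2 to particle 1.
            r = np.linalg.norm(r_vec)
            mag = K_E * q1 * q2 / r**2 * np.exp(-r / screen_len) * (1 + r / screen_len)
            return mag * r_vec / r               # repulsive for like charges

        f = screened_coulomb_force(1e-12, 1e-12, np.array([1e-3, 0.0, 0.0]), 5e-3)
        print(f)                                 # force vector in newtons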

  20. Addressing as low as reasonably achievable (ALARA) issues: investigation of worker collective external and extremity dose data

    DOE PAGES

    Cournoyer, Michael Edward; Costigan, Stephen Andrew; Schreiber, Stephen Bruce

    2017-03-17

    Plutonium emits both neutrons and photons, and when it is stored or handled inside a glovebox both are significant external radiation hazards. Doses to the extremities are usually dominated by gamma radiation in typical plutonium glovebox operations. Excess external dose can generate stochastic effects consisting of cancer and benign tumors in some organs. Direct doses from radiation sources external to the body are measured by thermoluminescent dosimeters (TLDs) placed on the glovebox worker between the neck and waist. Wrist dosimeters are used to assess externally penetrating radiation, including neutrons, and provide an estimate of neutron radiation exposure to the extremities. Both TLDs and wrist dosimeters are processed monthly for most glovebox workers. Here, worker collective extremity and external dose data have been analyzed to prevent and mitigate external radiation events through the use of Lean Manufacturing and Six Sigma business practices (LSS). Employing LSS, statistically significant variations (trends) are identified in worker collective extremity and external dose data. Finally, the research results presented in this paper are pivotal to the ultimate focus of this program, which is to minimize external radiation events.

  1. Addressing as low as reasonably achievable (ALARA) issues: investigation of worker collective external and extremity dose data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cournoyer, Michael Edward; Costigan, Stephen Andrew; Schreiber, Stephen Bruce

    Plutonium emits both neutrons and photons, and when it is stored or handled inside a glovebox both are significant external radiation hazards. Doses to the extremities are usually dominated by gamma radiation in typical plutonium glovebox operations. Excess external dose can generate stochastic effects consisting of cancer and benign tumors in some organs. Direct doses from radiation sources external to the body are measured by thermoluminescent dosimeters (TLDs) placed on the glovebox worker between the neck and waist. Wrist dosimeters are used to assess externally penetrating radiation, including neutrons, and provide an estimate of neutron radiation exposure to the extremities. Both TLDs and wrist dosimeters are processed monthly for most glovebox workers. Here, worker collective extremity and external dose data have been analyzed to prevent and mitigate external radiation events through the use of Lean Manufacturing and Six Sigma business practices (LSS). Employing LSS, statistically significant variations (trends) are identified in worker collective extremity and external dose data. Finally, the research results presented in this paper are pivotal to the ultimate focus of this program, which is to minimize external radiation events.

  2. Automation practices in large molecule bioanalysis: recommendations from group L5 of the global bioanalytical consortium.

    PubMed

    Ahene, Ago; Calonder, Claudio; Davis, Scott; Kowalchick, Joseph; Nakamura, Takahiro; Nouri, Parya; Vostiar, Igor; Wang, Yang; Wang, Jin

    2014-01-01

    In recent years, the use of automated sample handling instrumentation has come to the forefront of bioanalytical analysis in order to ensure greater assay consistency and throughput. Since robotic systems are becoming part of everyday analytical procedures, the need for consistent guidance across the pharmaceutical industry has become increasingly important. Pre-existing regulations do not go into sufficient detail in regard to how to handle the use of robotic systems for use with analytical methods, especially large molecule bioanalysis. As a result, Global Bioanalytical Consortium (GBC) Group L5 has put forth specific recommendations for the validation, qualification, and use of robotic systems as part of large molecule bioanalytical analyses in the present white paper. The guidelines presented can be followed to ensure that there is a consistent, transparent methodology that will ensure that robotic systems can be effectively used and documented in a regulated bioanalytical laboratory setting. This will allow for consistent use of robotic sample handling instrumentation as part of large molecule bioanalysis across the globe.

  3. CANISTER TRANSFER SYSTEM DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    B. Gorpani

    2000-06-23

    The Canister Transfer System receives transportation casks containing large and small disposable canisters, unloads the canisters from the casks, stores the canisters as required, loads them into disposal containers (DCs), and prepares the empty casks for re-shipment. Cask unloading begins with cask inspection, sampling, and lid bolt removal operations. The cask lids are removed and the canisters are unloaded. Small canisters are loaded directly into a DC, or are stored until enough canisters are available to fill a DC. Large canisters are loaded directly into a DC. Transportation casks and related components are decontaminated as required, and empty casks are prepared for re-shipment. One independent, remotely operated canister transfer line is provided in the Waste Handling Building System. The canister transfer line consists of a Cask Transport System, Cask Preparation System, Canister Handling System, Disposal Container Transport System, an off-normal canister handling cell with a transfer tunnel connecting the two cells, and Control and Tracking System. The Canister Transfer System operating sequence begins with moving transportation casks to the cask preparation area with the Cask Transport System. The Cask Preparation System prepares the cask for unloading and consists of cask preparation manipulator, cask inspection and sampling equipment, and decontamination equipment. The Canister Handling System unloads the canister(s) and places them into a DC. Handling equipment consists of a bridge crane hoist, DC loading manipulator, lifting fixtures, and small canister staging racks. Once the cask has been unloaded, the Cask Preparation System decontaminates the cask exterior and returns it to the Carrier/Cask Handling System via the Cask Transport System. After the DC is fully loaded, the Disposal Container Transport System moves the DC to the Disposal Container Handling System for welding. To handle off-normal canisters, a separate off-normal canister handling cell is located adjacent to the canister transfer cell and is interconnected to the transfer cell by means of the off-normal canister transfer tunnel. All canister transfer operations are controlled by the Control and Tracking System. The system interfaces with the Carrier/Cask Handling System for incoming and outgoing transportation casks. The system also interfaces with the Disposal Container Handling System, which prepares the DC for loading and subsequently seals the loaded DC. The system support interfaces are the Waste Handling Building System and other internal Waste Handling Building (WHB) support systems.

  4. Protection efficiency of a standard compliant EUV reticle handling solution

    NASA Astrophysics Data System (ADS)

    He, Long; Lystad, John; Wurm, Stefan; Orvek, Kevin; Sohn, Jaewoong; Ma, Andy; Kearney, Patrick; Kolbow, Steve; Halbmaier, David

    2009-03-01

    For successful implementation of extreme ultraviolet lithography (EUVL) technology for late-cycle insertion at 32 nm half-pitch (hp) and full introduction for 22 nm hp high volume production, the mask development infrastructure must be in place by 2010. The central element of the mask infrastructure is contamination-free reticle handling and protection. The industry has already developed and balloted an EUV pod standard for shipping, transporting, transferring, and storing EUV masks. We have previously demonstrated that the EUV pod reticle handling method represents the best approach to meeting EUVL high volume production requirements, based on the then state-of-the-art inspection capability of ~53 nm polystyrene latex (PSL) equivalent sensitivity. In this paper, we present our latest data showing that defect-free reticle handling is achievable down to 40 nm particle sizes, using the same EUV pod carriers as in the previous study and a recently established defect inspection capability of ~40 nm SiO2 equivalent sensitivity, the most advanced in the world. The EUV pod is a worthy solution to meet EUVL pilot line and pre-production exposure tool development requirements. We also discuss the technical challenges facing the industry in refining the EUV pod solution to meet 22 nm hp EUVL production requirements and beyond.

  5. Evolution of precipitation extremes in two large ensembles of climate simulations

    NASA Astrophysics Data System (ADS)

    Martel, Jean-Luc; Mailhot, Alain; Talbot, Guillaume; Brissette, François; Ludwig, Ralf; Frigon, Anne; Leduc, Martin; Turcotte, Richard

    2017-04-01

    Recent studies project significant changes in the future distribution of precipitation extremes due to global warming. It is likely that extreme precipitation intensity will increase in a future climate and that extreme events will be more frequent. In this work, annual maximum daily precipitation series from the Canadian Earth System Model (CanESM2) 50-member large ensemble (spatial resolution of 2.8° × 2.8°) and the Community Earth System Model (CESM1) 40-member large ensemble (spatial resolution of 1° × 1°) are used to investigate extreme precipitation over the historical (1980-2010) and future (2070-2100) periods. The use of these ensembles yields 1500 (30 years × 50 members) and 1200 (30 years × 40 members) simulated years over each period, respectively. These large datasets allow the computation of empirical daily extreme precipitation quantiles for large return periods. Using the CanESM2 and CESM1 large ensembles, extreme daily precipitation with return periods ranging from 2 to 100 years is computed in the historical and future periods to assess the impact of climate change. Results indicate that daily precipitation extremes generally increase in the future over most land grid points, and that these increases also affect the 100-year extreme daily precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety. Estimated increases associated with very extreme precipitation events (e.g. 100-year) will drastically change the likelihood and extent of flooding in a future climate. These results, although interesting, need to be extended to sub-daily durations relevant for urban flood protection and urban infrastructure design (e.g. sewer networks, culverts). Models and simulations at finer spatial and temporal resolution are therefore needed.
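
    With 1500 pooled years per period, return levels can be read directly from empirical quantiles instead of a fitted distribution, which is the approach described above; the sketch below uses synthetic annual maxima as a stand-in for the ensemble output.

        # Hedged sketch: empirical T-year return levels from pooled maxima.
        import numpy as np

        rng = np.random.default_rng(4)
        pooled_maxima = rng.gumbel(loc=40.0, scale=8.0, size=1500)  # 30 yr x 50 members

        def empirical_return_level(sample, T):
            # Quantile with annual exceedance probability 1/T, no fitted model.
            return np.quantile(sample, 1.0 - 1.0 / T)

        for T in (2, 20, 100):
            print(T, round(empirical_return_level(pooled_maxima, T), 1))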

  6. Development of Handling Qualities Criteria for Rotorcraft with Externally Slung Loads

    NASA Technical Reports Server (NTRS)

    Hoh, Roger H.; Heffley, Robert K.; Mitchell, David G.

    2006-01-01

    Piloted simulations were performed on the NASA-Ames Vertical Motion Simulator (VMS) to explore handling qualities issues for large cargo helicopters, focusing particularly on external slung load operations. This work was motivated by the need to include handling qualities criteria for cargo helicopters in an upgrade to the U.S. Army's rotorcraft handling qualities specification, Aeronautical Design Standard-33 (ADS-33E-PRF). From the VMS results, handling qualities criteria were developed for cargo helicopters carrying external slung loads in the degraded visual environment (DVE). If satisfied, these criteria provide assurance that the handling quality rating (HQR) will be 4 or better for operations in the DVE with a load mass ratio of 0.33 or less. For lighter loads, flying qualities were found to be less dependent on the load geometry, and therefore the significance of the criteria is lower. For heavier loads, meeting the criteria ensures the best possible handling qualities, albeit Level 2 for load mass ratios greater than 0.33.

  7. Evolution of Precipitation Extremes in Three Large Ensembles of Climate Simulations - Impact of Spatial and Temporal Resolutions

    NASA Astrophysics Data System (ADS)

    Martel, J. L.; Brissette, F.; Mailhot, A.; Wood, R. R.; Ludwig, R.; Frigon, A.; Leduc, M.; Turcotte, R.

    2017-12-01

    Recent studies indicate that the frequency and intensity of extreme precipitation will increase in a future climate due to global warming. In this study, we compare annual maximum precipitation series from three large ensembles of climate simulations at various spatial and temporal resolutions. The first two are at the global scale: the Canadian Earth System Model (CanESM2) 50-member large ensemble (CanESM2-LE) at 2.8° resolution and the Community Earth System Model (CESM1) 40-member large ensemble (CESM1-LE) at 1° resolution. The third ensemble is at the regional scale over both eastern North America and Europe: the Canadian Regional Climate Model (CRCM5) 50-member large ensemble (CRCM5-LE) at 0.11° resolution, driven at its boundaries by the CanESM2-LE. The CRCM5-LE is a new ensemble produced by the ClimEx project (http://www.climex-project.org), a Québec-Bavaria collaboration. Using these three large ensembles, changes in extreme precipitation between the historical (1980-2010) and future (2070-2100) periods are investigated. This yields 1500 (30 years × 50 members for CanESM2-LE and CRCM5-LE) and 1200 (30 years × 40 members for CESM1-LE) simulated years over each period. Using these large datasets, empirical daily (and, for the CRCM5-LE, sub-daily) extreme precipitation quantiles for return periods ranging from 2 to 100 years are computed. Results indicate that daily extreme precipitation will generally increase over most land grid points of both domains according to the three large ensembles. For the CRCM5-LE, the increase in sub-daily extreme precipitation is even larger than that in daily extreme precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety.

  8. MrEnt: an editor for publication-quality phylogenetic tree illustrations.

    PubMed

    Zuccon, Alessandro; Zuccon, Dario

    2014-09-01

    We developed MrEnt, a Windows-based, user-friendly software package that allows the production of complex, high-resolution, publication-quality phylogenetic trees in a few steps, directly from the analysis output. The program recognizes the standard Nexus tree format and the annotated tree files produced by BEAST and MrBayes. MrEnt combines in a single package a large suite of tree manipulation functions (e.g. handling of multiple trees, tree rotation, character mapping, node collapsing, compression of large clades, handling of time scales and error bars for chronograms) with drawing tools typical of standard graphic editors, including handling of graphic elements and images. The tree illustration can be printed or exported in several standard formats suitable for journal publication, PowerPoint presentation or Web publication. © 2014 John Wiley & Sons Ltd.

  9. Extreme-value dependence: An application to exchange rate markets

    NASA Astrophysics Data System (ADS)

    Fernandez, Viviana

    2007-04-01

    Extreme value theory (EVT) focuses on modeling the tail behavior of a loss distribution using only extreme values rather than the whole data set. For a sample of 10 countries with dirty/free float regimes, we investigate whether paired currencies exhibit a pattern of asymptotic dependence. That is, whether an extremely large appreciation or depreciation in the nominal exchange rate of one country might transmit to another. In general, after controlling for volatility clustering and inertia in returns, we do not find evidence of extreme-value dependence between paired exchange rates. However, for asymptotic-independent paired returns, we find that tail dependency of exchange rates is stronger under large appreciations than under large depreciations.

  10. Toward a Framework for Learner Segmentation

    ERIC Educational Resources Information Center

    Azarnoush, Bahareh; Bekki, Jennifer M.; Runger, George C.; Bernstein, Bianca L.; Atkinson, Robert K.

    2013-01-01

    Effectively grouping learners in an online environment is a highly useful task. However, datasets used in this task often have large numbers of attributes of disparate types and different scales, which traditional clustering approaches cannot handle effectively. Here, a unique dissimilarity measure based on the random forest, which handles the…

  11. Information Systems and Performance Measures in Schools.

    ERIC Educational Resources Information Center

    Coleman, James S.; Karweit, Nancy L.

    Large school systems face various administrative problems in handling scheduling and records, and in avoiding making red-tape casualties of students. The authors review a portion of the current use of computers to handle these problems and examine the range of activities for which computer processing could provide aid. Since automation always brings…

  12. A novel integrated chassis controller for full drive-by-wire vehicles

    NASA Astrophysics Data System (ADS)

    Song, Pan; Tomizuka, Masayoshi; Zong, Changfu

    2015-02-01

    In this paper, a systematic design with multiple hierarchical layers is adopted in the integrated chassis controller for full drive-by-wire vehicles. A reference model and the optimal preview acceleration driver model are utilised in the driver control layer to describe and realise the driver's anticipation of the vehicle's handling characteristics, respectively. Both the sliding mode control and terminal sliding mode control techniques are employed in the vehicle motion control (MC) layer to determine the MC efforts such that better tracking performance can be attained. In the tyre force allocation layer, a polygonal simplification method is proposed to deal with the constraints of the tyre adhesive limits efficiently and effectively, whereby the load transfer due to both roll and pitch, which directly affects these constraints, is also taken into account. By calculating the motor torque and steering angle of each wheel in the executive layer, the total workload of the four wheels is minimised during normal driving, whereas the MC efforts are maximised in extreme handling conditions. The proposed controller is validated through simulation to improve vehicle stability and handling performance in both open- and closed-loop manoeuvres.
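
    As a hedged, single-state sketch of the sliding-mode idea used in the MC layer (not the authors' actual control law: the yaw inertia, gains and disturbance below are invented for illustration, and a boundary-layer saturation replaces the pure sign function to limit chattering):

        import numpy as np

        # Illustrative yaw dynamics Iz * r_dot = M + d(t), with a sliding-mode
        # law driving the yaw rate r to a sinusoidal reference.
        Iz = 2500.0                              # yaw inertia, kg m^2 (assumed)
        dt, T = 0.001, 3.0
        t = np.arange(0.0, T, dt)
        r_ref = 0.3 * np.sin(1.5 * t)            # reference yaw rate, rad/s
        r_ref_dot = 0.45 * np.cos(1.5 * t)

        sat = lambda z: np.clip(z, -1.0, 1.0)    # boundary-layer saturation
        K, lam, phi = 4000.0, 8.0, 0.02          # illustrative gains
        r = 0.0
        for k in range(len(t)):
            s = r - r_ref[k]                                      # sliding surface
            M = Iz * (r_ref_dot[k] - lam * s) - K * sat(s / phi)  # control moment
            d = 500.0 * np.sin(10.0 * t[k])                       # disturbance
            r += dt * (M + d) / Iz                                # Euler step
        print(f"final tracking error: {abs(r - r_ref[-1]):.2e} rad/s")

    Terminal sliding mode control differs by adding a fractional-power term to the sliding surface, which yields finite-time rather than asymptotic convergence.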

  13. Handle grip span for optimising finger-specific force capability as a function of hand size.

    PubMed

    Lee, Soo-Jin; Kong, Yong-Ku; Lowe, Brian D; Song, Seongho

    2009-05-01

    Five grip spans (45 to 65 mm) were tested to evaluate the effects of handle grip span and user's hand size on maximum grip strength, individual finger force and subjective ratings of comfort, using a computerised digital dynamometer with independent finger force sensors. Forty-six males participated and were assigned to three hand size groups (small, medium, large) according to their hand length. In general, the 55- and 50-mm grip spans were rated as the most comfortable sizes and showed the largest grip strength (433.6 N and 430.8 N, respectively), whereas the 65-mm grip span was rated as the least comfortable size and showed the lowest grip strength. With regard to the interaction of grip span and hand size, small- and medium-hand participants rated the 50- to 55-mm grip spans as the most preferred and the 65-mm grip span as the least preferred, whereas large-hand participants rated the 55- to 60-mm grip spans as the most preferred and the 45-mm grip span as the least preferred. Normalised grip span (NGS) ratios (29% and 27%), the ratios of handle grip span to user's hand length, were obtained and applied to suggest handle grip spans that maximise subjective comfort as well as gripping force according to users' hand sizes. In the analysis of individual finger force, the middle finger showed the highest contribution (37.5%) to the total finger force, followed by the ring (28.7%), index (20.2%) and little (13.6%) fingers. In addition, each finger was observed to have a different optimal grip span for exerting maximum force, suggesting a bow-contoured handle (the grip span at the centre of the handle larger than at the ends) for two-handle hand tools. Thus, the grip spans of two-handle hand tools may be designed according to users' hand/finger anthropometrics to maximise subjective ratings and performance. The results of this study provide guidelines for hand tool designers and manufacturers on grip spans of two-handle tools that maximise handle comfort and performance.

  14. NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple datasets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes infeasible, and often requires additional work to publish results for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door to new ways of processing data on demand, alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely removes the need to preprocess and store the data as products. The system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data-wrangling problems, such as handling different file formats and data types or harmonising coordinate projections and temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards, such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
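
    Because the server exposes standard OGC interfaces, generic client libraries can consume it directly; below is a sketch using the third-party OWSLib package, where the endpoint URL and layer name are placeholders rather than actual NCI identifiers:

        from owslib.wms import WebMapService

        # Placeholder endpoint and layer; substitute a real OGC WMS service.
        wms = WebMapService("https://example.org/wms", version="1.1.1")
        print(sorted(wms.contents))  # layer identifiers advertised by the server

        img = wms.getmap(layers=["surface_temperature"],    # placeholder name
                         styles=[""],
                         srs="EPSG:4326",
                         bbox=(110.0, -45.0, 155.0, -10.0),  # lon/lat box
                         size=(800, 600),
                         format="image/png",
                         transparent=True)
        with open("map.png", "wb") as f:
            f.write(img.read())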

  15. Data Mining of Extremely Large Ad Hoc Data Sets to Produce Inverted Indices

    DTIC Science & Technology

    2016-06-01


  16. Finding Statistically Significant Communities in Networks

    PubMed Central

    Lancichinetti, Andrea; Radicchi, Filippo; Ramasco, José J.; Fortunato, Santo

    2011-01-01

    Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is still a great need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools from Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure for partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method has performance comparable to the best existing algorithms on artificial benchmark graphs. Several applications to real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks. PMID:21559480

  17. Handling Qualities of a Large Civil Tiltrotor in Hover using Translational Rate Command

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos A.; Theodore, Colin R.; Lawrence, Ben; Lindsey, James; Blanken, Chris

    2012-01-01

    A Translational Rate Command (TRC) control law has been developed to enable low-speed maneuvering of a large civil tiltrotor with minimal pitch changes by means of automatic nacelle angle deflections for longitudinal velocity control. The nacelle actuator bandwidth required to achieve Level 1 handling qualities in hover and the feasibility of additional longitudinal cyclic control to augment low-bandwidth nacelle actuation were investigated. A frequency-domain handling qualities criterion characterizing the TRC response in terms of bandwidth and phase delay was proposed and validated against a piloted simulation conducted on the NASA-Ames Vertical Motion Simulator. Seven experimental test pilots completed evaluations in the ADS-33E-PRF Hover Mission Task Element (MTE) for a matrix of nacelle actuator bandwidths, equivalent rise times and control response sensitivities, and longitudinal cyclic control allocations. Evaluated against this task, longitudinal phase delay places the Level 1 boundary around 0.4-0.5 s. Accordingly, Level 1 handling qualities were achieved either with a nacelle actuator bandwidth greater than 4 rad/s, or by employing longitudinal cyclic control to augment low-bandwidth nacelle actuation.

  18. Dynamical systems proxies of atmospheric predictability and mid-latitude extremes

    NASA Astrophysics Data System (ADS)

    Messori, Gabriele; Faranda, Davide; Caballero, Rodrigo; Yiou, Pascal

    2017-04-01

    Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. Many extremes (e.g. storms, heatwaves, cold spells, heavy precipitation) are tied to specific patterns of midlatitude atmospheric circulation. The ability to identify these patterns and use them to enhance the predictability of the extremes is therefore a topic of crucial societal and economic value. We propose a novel predictability pathway for extreme events, building upon recent advances in dynamical systems theory. We use two simple dynamical systems metrics - local dimension and persistence - to identify sets of similar large-scale atmospheric flow patterns which present a coherent temporal evolution. When these patterns correspond to weather extremes, they therefore afford particularly good forward predictability. We specifically test this technique on European winter temperatures, whose variability largely depends on the atmospheric circulation in the North Atlantic region. We find that our dynamical systems approach provides predictability of large-scale temperature extremes up to one week in advance.
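
    A minimal sketch of the local-dimension estimator commonly used with these metrics: exceedances of the negative log-distance to a chosen state are approximately exponentially distributed, and the reciprocal of their mean estimates the local dimension (the random trajectory and the 0.98 threshold below are illustrative choices, not the authors' data):

        import numpy as np

        rng = np.random.default_rng(2)
        # Stand-in trajectory: 5000 daily "circulation states" in 10 dimensions.
        X = rng.standard_normal((5000, 10))

        def local_dimension(X, i, q=0.98):
            # Exceedances of g = -log(distance to X[i]) above its q-quantile
            # are ~exponential; the local dimension is 1 / mean exceedance.
            dist = np.linalg.norm(X - X[i], axis=1)
            dist[i] = np.inf                     # exclude the self-distance
            g = -np.log(dist)
            u = np.quantile(g, q)
            return 1.0 / (g[g > u] - u).mean()

        dims = [local_dimension(X, i) for i in range(0, 5000, 500)]
        print(np.round(dims, 2))   # values near 10 for this isotropic cloud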

  19. Protecting personal information: Implications of the Protection of Personal Information (POPI) Act for healthcare professionals.

    PubMed

    Buys, M

    2017-10-31

    Careless handling of patient information in daily medical practice can result in Health Professions Council of South Africa sanction, breach of privacy lawsuits and, in extreme cases, serious monetary penalty or even imprisonment. This review will focus on the Protection of Personal Information (POPI) Act (No. 4 of 2013) and the implications thereof for healthcare professionals in daily practice. Recommendations regarding the safeguarding of information are made.

  20. Emergency-Evacuation Cart

    NASA Technical Reports Server (NTRS)

    Fedor, Otto H.; Owens, Lester J.

    1988-01-01

    Proposed cart designed to remove injured worker from vicinity of hazardous chemical spill. Self-propelled cart enables rescuer to move victim of industrial accident quickly away from toxic, flammable, explosive, corrosive, carcinogenic, asphyxiating, or extremely cold liquids. Intended for use where too dangerous for ambulances and other vehicles to approach accident site. Constructed of high-strength tubing, rides on bicycle wheels with balloon tires. Rescuer steers cart with handle at rear. Estimated mass of fully equipped vehicle is 650 lb.

  1. Extreme longevity in freshwater mussels revisited: sources of bias in age estimates derived from mark-recapture experiments

    Treesearch

    Wendell R. Haag

    2009-01-01

    There may be bias associated with mark–recapture experiments used to estimate age and growth of freshwater mussels. Using subsets of a mark–recapture dataset for Quadrula pustulosa, I examined how age and growth parameter estimates are affected by (i) the range and skew of the data and (ii) growth reduction due to handling. I compared predictions...

  2. High Altitude Supersonic Target (HAST), Phase 2

    DTIC Science & Technology

    1974-08-01

    consists of a padded cradle assembly and a tubular steel stand with lockable swivel casters on the front wheels. ... The recovery module lifting handle is ... needs no modification of the ordnance fired at it in order to function. With the HAST system a target will be provided to evaluate the most advanced ... Components were tested under environmental extremes. With the completion of the preflight readiness tests and with modifications incorporated during the ...

  3. Flight test experience with high-alpha control system techniques on the F-14 airplane

    NASA Technical Reports Server (NTRS)

    Gera, J.; Wilson, R. J.; Enevoldson, E. K.; Nguyen, L. T.

    1981-01-01

    Improved handling qualities of fighter aircraft at high angles of attack can be provided by various stability and control augmentation techniques. NASA and the U.S. Navy are conducting a joint flight demonstration of these techniques on an F-14 airplane. This paper reports on the flight test experience with a newly designed lateral-directional control system which suppresses such high angle of attack handling qualities problems as roll reversal, wing rock, and directional divergence while simultaneously improving departure/spin resistance. The technique of integrating a piloted simulation into the flight program was used extensively in this program. This technique had not been applied previously to high angle of attack testing and required the development of a valid model to simulate the test airplane at extremely high angles of attack.

  4. Evaluation of objectivity, reliability and criterion validity of the key indicator method for manual handling operations (KIM-MHO), draft 2007.

    PubMed

    Klußmann, André; Gebhardt, Hansjürgen; Rieger, Monika; Liebers, Falk; Steinberg, Ulf

    2012-01-01

    Upper extremity musculoskeletal symptoms and disorders are common in the working population, and their economic and social impact is considerable. Long-term, dynamic, repetitive exposure of the hand-arm system during manual handling operations (MHO), alone or in combination with static and postural effort, is a recognised cause of musculoskeletal symptoms and disorders. The assessment of these manual work tasks is crucial for estimating the health risks of exposed employees. For these work tasks, a new method for the assessment of working conditions was developed and a validation study was performed. The results suggest satisfactory criterion validity and moderate objectivity of the KIM-MHO draft 2007. The method was modified and evaluated again. It is planned to release a new version of KIM-MHO in spring 2012.

  5. A powerful and efficient set test for genetic markers that handles confounders

    PubMed Central

    Listgarten, Jennifer; Lippert, Christoph; Kang, Eun Yong; Xiang, Jing; Kadie, Carl M.; Heckerman, David

    2013-01-01

    Motivation: Approaches for testing sets of variants, such as a set of rare or common variants within a gene or pathway, for association with complex traits are important. In particular, set tests allow for aggregation of weak signal within a set, can capture interplay among variants and reduce the burden of multiple hypothesis testing. Until now, these approaches did not address confounding by family relatedness and population structure, a problem that is becoming more important as larger datasets are used to increase power. Results: We introduce a new approach for set tests that handles confounders. Our model is based on the linear mixed model and uses two random effects—one to capture the set association signal and one to capture confounders. We also introduce a computational speedup for two random-effects models that makes this approach feasible even for extremely large cohorts. Using this model with both the likelihood ratio test and score test, we find that the former yields more power while controlling type I error. Application of our approach to richly structured Genetic Analysis Workshop 14 data demonstrates that our method successfully corrects for population structure and family relatedness, whereas application of our method to a 15 000 individual Crohn’s disease case–control cohort demonstrates that it additionally recovers genes not recoverable by univariate analysis. Availability: A Python-based library implementing our approach is available at http://mscompbio.codeplex.com. Contact: jennl@microsoft.com or lippert@microsoft.com or heckerma@microsoft.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23599503

  6. Dendritic morphology of amygdala and hippocampal neurons in more and less predator stress responsive rats and more and less spontaneously anxious handled controls

    PubMed Central

    Adamec, Robert; Hebert, Mark; Blundell, Jacqueline; Mervis, Ronald F.

    2013-01-01

    We investigated the neurobiological bases of variation in response to predator stress (PS). Sixteen days after treatment (PS or handling), rats were grouped according to anxiety in the elevated plus maze (EPM). Acoustic startle was also measured. We examined the structure of dendritic trees of basolateral amygdala (BLA) output neurons (stellate and pyramidal cells) and of dorsal hippocampal (DHC) dentate granule cells of less anxious (LA) and more (extremely) anxious (MA) stressed animals (PSLA and PSMA). Handled controls (HC) which were less anxious (HCLA) and spontaneously more anxious (HCMA) equivalently to predator stressed subgroups were also studied. Golgi analysis revealed BLA output neurons of HCMA rats exhibited longer, more branched dendrites with higher spine density than the other groups of rats, which did not differ. Finally, spine density of DHC granule cells was equally depressed in HCMA and PSMA rats relative to HCLA and PSLA rats. Total dendritic length of BLA pyramidal and stellate cells (positive predictor) and DHC spine density (negative predictor) together accounted for 96% of the variance of anxiety of handled rats. DHC spine density was a negative predictor of PSMA and PSLA anxiety, accounting for 70% of the variance. Data are discussed in the context of morphological differences as phenotypic markers of a genetic predisposition to anxiety in handled controls, and a possible genetic vulnerability to predator stress expressed as reduced spine density in the DHC. Significance of findings for animal models of anxiety and hyperarousal comorbidities of PTSD are discussed. PMID:21925210

  7. The Vermicelli Handling Test: A Simple Quantitative Measure of Dexterous Forepaw Function in Rats

    PubMed Central

    Allred, Rachel P.; Adkins, DeAnna L.; Woodlee, Martin T.; Husbands, Lincoln C.; Maldonado, Mónica A.; Kane, Jacqueline R.; Schallert, Timothy; Jones, Theresa A.

    2008-01-01

    Loss of function in the hands occurs with many brain disorders, but there are few measures of skillful forepaw use in rats available to model these impairments that are both sensitive and simple to administer. Whishaw and Coles (1996) previously described the dexterous manner in which rats manipulate food items with their paws, including thin pieces of pasta. We set out to develop a measure of this food handling behavior that would be quantitative, easy to administer, sensitive to the effects of damage to sensory and motor systems of the CNS and useful for identifying the side of lateralized impairments. When rats handle 7 cm lengths of vermicelli, they manipulate the pasta by repeatedly adjusting the forepaw hold on the pasta piece. As operationally defined, these adjustments can be easily identified and counted by an experimenter without specialized equipment. After unilateral sensorimotor cortex (SMC) lesions, transient middle cerebral artery occlusion (MCAO) and striatal dopamine depleting (6-hydroxydopamine, 6-OHDA) lesions in adult rats, there were enduring reductions in adjustments made with the contralateral forepaw. Additional pasta handling characteristics distinguished between the lesion types. MCAO and 6-OHDA lesions increased the frequency of several identified atypical handling patterns. Severe dopamine depletion increased eating time and adjustments made with the ipsilateral forepaw. However, contralateral forepaw adjustment number most sensitively detected enduring impairments across lesion types. Because of its ease of administration and sensitivity to lateralized impairments in skilled forepaw use, this measure may be useful in rat models of upper extremity impairment. PMID:18325597

  8. Occupational health and safety aspects of animal handling in dairy production.

    PubMed

    Lindahl, Cecilia; Lundqvist, Peter; Hagevoort, G Robert; Lunner Kolstrup, Christina; Douphrate, David I; Pinzke, Stefan; Grandin, Temple

    2013-01-01

    Livestock handling in dairy production is associated with a number of health and safety issues. A large number of fatal and nonfatal injuries still occur when handling livestock. The many animal handling tasks on a dairy farm include moving cattle between different locations, vaccination, administration of medication, hoof care, artificial insemination, ear tagging, milking, and loading onto trucks. There are particular problems with bulls, which continue to cause considerable numbers of injuries and fatalities in dairy production. In order to reduce the number of injuries during animal handling on dairy farms, it is important to understand the key factors in human-animal interactions. These include handler attitudes and behavior, animal behavior, and fear in cows. Care when in close proximity to the animal is the key for safe handling, including knowledge of the flight zone, and use of the right types of tools and suitable restraint equipment. Thus, in order to create safe working conditions during livestock handling, it is important to provide handlers with adequate training and to establish sound safety management procedures on the farm.

  9. Identifying Heat Waves in Florida: Considerations of Missing Weather Data

    PubMed Central

    Leary, Emily; Young, Linda J.; DuClos, Chris; Jordan, Melissa M.

    2015-01-01

    Background: Using current climate models, regional-scale changes for Florida over the next 100 years are predicted to include warming over terrestrial areas and very likely increases in the number of high temperature extremes. No uniform definition of a heat wave exists. Most past research on heat waves has focused on evaluating the aftermath of known heat waves, with minimal consideration of missing exposure information. Objectives: To identify and discuss methods of handling and imputing missing weather data and how those methods can affect identified periods of extreme heat in Florida. Methods: In addition to ignoring missing data, temporal, spatial, and spatio-temporal models are described and utilized to impute missing historical weather data from 1973 to 2012 from 43 Florida weather monitors. Calculated thresholds are used to define periods of extreme heat across Florida. Results: Modeling of missing data and imputing missing values can affect the identified periods of extreme heat, through the missing data itself or through the computed thresholds. The differences observed are related to the amount of missingness during June, July, and August, the warmest months of the warm season (April through September). Conclusions: Missing data considerations are important when defining periods of extreme heat. Spatio-temporal methods are recommended for data imputation. A heat wave definition that incorporates information from all monitors is advised. PMID:26619198

  10. Identifying Heat Waves in Florida: Considerations of Missing Weather Data.

    PubMed

    Leary, Emily; Young, Linda J; DuClos, Chris; Jordan, Melissa M

    2015-01-01

    Using current climate models, regional-scale changes for Florida over the next 100 years are predicted to include warming over terrestrial areas and very likely increases in the number of high temperature extremes. No uniform definition of a heat wave exists. Most past research on heat waves has focused on evaluating the aftermath of known heat waves, with minimal consideration of missing exposure information. To identify and discuss methods of handling and imputing missing weather data and how those methods can affect identified periods of extreme heat in Florida. In addition to ignoring missing data, temporal, spatial, and spatio-temporal models are described and utilized to impute missing historical weather data from 1973 to 2012 from 43 Florida weather monitors. Calculated thresholds are used to define periods of extreme heat across Florida. Modeling of missing data and imputing missing values can affect the identified periods of extreme heat, through the missing data itself or through the computed thresholds. The differences observed are related to the amount of missingness during June, July, and August, the warmest months of the warm season (April through September). Missing data considerations are important when defining periods of extreme heat. Spatio-temporal methods are recommended for data imputation. A heat wave definition that incorporates information from all monitors is advised.
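
    As a generic illustration of threshold-based extreme-heat identification (the 95th-percentile/two-consecutive-day rule is one common convention, not necessarily the authors'; the NaN-aware percentile merely gestures at the missing-data issue, which the paper addresses far more carefully with spatio-temporal imputation):

        import numpy as np

        rng = np.random.default_rng(3)
        # Stand-in warm-season daily maximum temperatures (deg C), one monitor.
        tmax = 31.0 + 3.0 * rng.standard_normal(183)
        tmax[20:25] = np.nan                      # a gap of missing days

        threshold = np.nanpercentile(tmax, 95)    # ignores missing values
        hot = tmax > threshold                    # NaN compares as False

        # A heat wave here = a run of >= 2 consecutive days above threshold.
        waves, start = [], None
        for day, flag in enumerate(hot):
            if flag and start is None:
                start = day
            elif not flag and start is not None:
                if day - start >= 2:
                    waves.append((start, day - 1))
                start = None
        if start is not None and len(hot) - start >= 2:
            waves.append((start, len(hot) - 1))
        print(f"threshold = {threshold:.1f} C, heat waves (start, end): {waves}")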

  11. Habitat quality affects stress responses and survival in a bird wintering under extremely low ambient temperatures

    NASA Astrophysics Data System (ADS)

    Cīrule, Dina; Krama, Tatjana; Krams, Ronalds; Elferts, Didzis; Kaasik, Ants; Rantala, Markus J.; Mierauskas, Pranas; Luoto, Severi; Krams, Indrikis A.

    2017-12-01

    Animals normally respond to stressful environmental stimuli by releasing glucocorticoid hormones. We investigated whether baseline corticosterone (CORT), handling-induced corticosterone concentrations, and body condition indices of members of willow tit (Poecile montanus) groups differed between birds wintering in old-growth forests and in managed young forests, in mild weather conditions and during cold spells. Willow tits spend the winter season in non-kin groups in which dominant individuals typically claim priority of access to resources, while subordinate individuals may experience greater levels of stress and higher mortality, especially during cold spells. We captured birds to measure baseline CORT and levels of handling-induced CORT secretion 20 min after capture. Willow tits in the young forests had higher baseline CORT and a smaller increase in CORT in response to capture than individuals in the old forests. Baseline CORT was higher in females and juvenile birds compared to adult males, whereas handling-induced CORT secretion did not differ between birds of different ages. During cold spells, baseline CORT of willow tits increased and handling-induced CORT secretion decreased, especially in birds in young forests. Willow tits' survival was higher in the old forests, with dominant individuals surviving better than subordinates. Our results show that changes in CORT secretion reflect responses to habitat quality and climate harshness, indicating that young managed coniferous forests are a suboptimal habitat for the willow tit.

  12. Effect of a worktable position on head and shoulder posture and shoulder muscles in manual material handling.

    PubMed

    Kim, Min-Hee; Yoo, Won-Gyu

    2015-06-05

    According to recent research, manual work with high levels of static contraction, repetitive loads, or extreme working postures involving the neck and shoulder muscles carries an increased risk of neck and shoulder musculoskeletal disorders. We investigated the effects of a forward worktable position on head and shoulder angles and shoulder muscle activity in manual material handling tasks. The forward head and shoulder angles and the activity of the upper trapezius, levator scapulae, and middle deltoid muscles of 15 workers were measured during manual material handling in two tasks that required different forward head and shoulder angles. The second task required significantly larger forward head and shoulder angles. Upper trapezius and levator scapulae muscle activity was significantly higher in the second task than in the first, whereas middle deltoid activity did not differ significantly between the tasks. Based on these results, the forward head and shoulder angles adopted during manual work need to be considered when selecting the forward distance of a worktable from the body. High-level contractions of the neck and shoulder muscles are correlated with neck and shoulder pain. Therefore, the forward distance of a worktable can be an important factor in preventing neck and shoulder pain in manual material handling workers.

  13. A different perspective: anesthesia for extreme premature infants: is there an age limitation or how low should we go?

    PubMed

    Lönnqvist, Per-Arne

    2018-06-01

    This review puts into perspective the various challenges that face pediatric anesthesiologists as a result of the recently lowered limits of fetal viability; both medical and ethical considerations are highlighted. Issues reviewed include who should anesthetize these tiny babies; whether we can provide adequate and legal monitoring during the anesthetic; whether these immature babies need hypnosis and amnesia; and the moral/ethical implications of being involved in care with doubtful long-term outcome. There are currently insufficient research data to provide evidence-based guidelines for the anesthetic handling of extremely premature infants. Current practice relies on extrapolation from other patient groups and on attempting to preserve normal physiology. Thus, focused research initiatives within this specific field of anesthesia should be a priority. Furthermore, in-depth multiprofessional ethical discussions regarding the long-term outcome of aggressive care of extremely premature babies are urgently needed, including the new concepts of disability-free survival and number-needed-to-suffer.

  14. Information science team

    NASA Technical Reports Server (NTRS)

    Billingsley, F.

    1982-01-01

    Concerns are expressed about the data handling aspects of system design and about enabling technology for data handling and data analysis. The status, contributing factors, critical issues, and recommendations for investigation are listed for data handling, rectification and registration, and information extraction. Potential support for individual principal investigators, research tasks, systematic data system design, and system operation is outlined. The need for an airborne spectrometer-class instrument for fundamental research at high spectral and spatial resolution is indicated. Geographic information system formatting and labelling techniques, very large scale integration, and methods for providing multitype data sets must also be developed.

  15. Handling qualities criteria for the space shuttle orbiter during the terminal phase of flight

    NASA Technical Reports Server (NTRS)

    Stapleford, R. L.; Klein, R. H.; Hoh, R. H.

    1972-01-01

    It was found that large portions of the military handling qualities specification are directly applicable. However a number of additional and substitute criteria are recommended for areas not covered or inadequately covered in the military specification. Supporting pilot/vehicle analyses and simulation experiments were conducted and are described. Results are also presented of analytical and simulator evaluations of three specific interim Orbiter designs which provided a test of the proposed handling qualities criteria. The correlations between the analytical and experimental evaluations were generally excellent.

  16. Effect of bit wear on hammer drill handle vibration and productivity.

    PubMed

    Antonucci, Andrea; Barr, Alan; Martin, Bernard; Rempel, David

    2017-08-01

    The use of large electric hammer drills exposes construction workers to high levels of hand vibration that may lead to hand-arm vibration syndrome and other musculoskeletal disorders. The aim of this laboratory study was to investigate the effect of bit wear on drill handle vibration and drilling productivity (e.g., drilling time per hole). A laboratory test bench system was used with an 8.3 kg electric hammer drill and 1.9 cm concrete bit (a typical drill and bit used in commercial construction). The system automatically advanced the active drill into aged concrete block under feed force control to a depth of 7.6 cm while handle vibration was measured according to ISO standards (ISO 5349 and 28927). Bits were worn to 4 levels by consecutive hole drilling to 4 cumulative drilling depths: 0, 1,900, 5,700, and 7,600 cm. Z-axis handle vibration increased significantly (p<0.05) from 4.8 to 5.1 m/s² (ISO weighted) and from 42.7 to 47.6 m/s² (unweighted) when comparing a new bit to a bit worn to 1,900 cm of cumulative drilling depth. Handle vibration did not increase further with bits worn beyond 1,900 cm of cumulative drilling depth. Neither x- nor y-axis handle vibration was affected by bit wear. The time to drill a hole increased by 58% for the bit with 5,700 cm of cumulative drilling depth compared to a new bit. Bit wear led to a small but significant increase in both ISO-weighted and unweighted z-axis handle vibration. Perhaps more important, bit wear had a large effect on productivity. The effect on productivity will influence a worker's allowable daily drilling time if exposure to drill handle vibration is near the ACGIH Threshold Limit Value [1]. Construction contractors should implement a bit replacement program based on these findings.
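
    The link between handle vibration and allowable drilling time follows from the ISO 5349-1 daily exposure measure A(8) = a_hv * sqrt(T / 8 h). The sketch below uses the paper's weighted z-axis values as if they were the full vibration total value (a simplification) and the EU Vibration Directive daily exposure limit value of 5.0 m/s² in place of the ACGIH TLV:

        import math

        def a8(a_hv, hours):
            # ISO 5349-1 daily vibration exposure normalised to 8 hours.
            return a_hv * math.sqrt(hours / 8.0)

        def allowable_hours(a_hv, limit=5.0):
            # Exposure time at which A(8) reaches the limit (m/s^2).
            return 8.0 * (limit / a_hv) ** 2

        for a_hv, label in [(4.8, "new bit"), (5.1, "worn bit")]:
            print(f"{label}: A(8) after 2 h = {a8(a_hv, 2.0):.2f} m/s^2, "
                  f"time to reach limit = {allowable_hours(a_hv):.1f} h")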

  17. Evaluating the Large-Scale Environment of Extreme Events Using Reanalyses

    NASA Astrophysics Data System (ADS)

    Bosilovich, M. G.; Schubert, S. D.; Koster, R. D.; da Silva, A. M., Jr.; Eichmann, A.

    2014-12-01

    Extreme conditions and events have long been a concern in weather forecasting and national security. While some evidence indicates extreme weather will increase in global change scenarios, extremes are often related to the large-scale atmospheric circulation and occur infrequently. Reanalyses assimilate substantial amounts of weather data, and a primary strength of reanalysis data is the representation of the large-scale atmospheric environment. In this effort, we link the occurrences of extreme events or climate indicators to the underlying regional and global weather patterns. Now, with more than 30 years of data, reanalyses can include multiple cases of extreme events, and thereby identify commonality among the weather patterns to better characterize the large-scale to global environment linked to the indicator or extreme event. Since these features are certainly regionally dependent, and since indicators of climate are continually being developed, we outline various methods to analyze the reanalysis data and the development of tools to support regional evaluation of the data. Here, we provide some examples of both individual case studies and composite studies of similar events. For example, we compare the large-scale environment for Northeastern US extreme precipitation with that of the seasons with the highest mean precipitation. Likewise, southerly winds can be shown to be a major contributor to very warm days in the Northeast winter. While most of our development has involved NASA's MERRA reanalysis, we are also looking forward to MERRA-2, which includes several new features that greatly improve the representation of weather and climate, especially for the regions and sectors involved in the National Climate Assessment.
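
    A minimal sketch of the composite approach described above (synthetic fields stand in for a reanalysis variable such as MERRA sea-level pressure, and the top 5% of a regional indicator defines the extreme days):

        import numpy as np

        rng = np.random.default_rng(4)
        ndays, nlat, nlon = 3650, 20, 30
        slp = 1013.0 + 5.0 * rng.standard_normal((ndays, nlat, nlon))  # hPa
        precip = rng.gamma(shape=2.0, scale=3.0, size=ndays)           # mm/day

        # Extreme days: top 5% of the regional indicator time series.
        extreme = precip >= np.quantile(precip, 0.95)

        # Composite anomaly: mean field on extreme days minus climatology.
        composite_anomaly = slp[extreme].mean(axis=0) - slp.mean(axis=0)
        print(f"{extreme.sum()} extreme days; "
              f"max |anomaly| = {np.abs(composite_anomaly).max():.2f} hPa")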

  18. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    NASA Astrophysics Data System (ADS)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by if and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. Where model skill in reproducing these patterns is high, it can be inferred that extremes are being simulated for plausible physical reasons, boosting confidence in future projections of temperature extremes. Conversely, where model skill is identified to be lower, caution should be exercised in interpreting future projections.
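
    A hedged sketch of the self-organizing-map step using the third-party MiniSom package (the 4 x 5 node grid, training length and the random stand-in for flattened daily geopotential-height anomaly maps are all illustrative choices):

        import numpy as np
        from collections import Counter
        from minisom import MiniSom   # third-party: pip install minisom

        rng = np.random.default_rng(5)
        # 3650 daily anomaly maps on a 20 x 30 grid, flattened to vectors.
        fields = rng.standard_normal((3650, 600))

        som = MiniSom(4, 5, input_len=600, sigma=1.0, learning_rate=0.5,
                      random_seed=0)
        som.train_random(fields, num_iteration=10000)

        # Assign each day to its best-matching node (circulation pattern).
        labels = [som.winner(day) for day in fields]
        print("days per SOM pattern:", Counter(labels))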

  19. Control and applications of cooperating disparate robotic manipulators relevant to nuclear waste management

    NASA Technical Reports Server (NTRS)

    Lew, Jae Young; Book, Wayne J.

    1991-01-01

    Remote handling in nuclear waste management requires a robotic system with precise motion as well as a large workspace. The concept of a small arm mounted on the end of a large arm may satisfy such needs. However, such a serial configuration lacks the payload capacity that is crucial for handling massive objects, and it induces more flexibility in the structure. To overcome these problems, the topology of bracing the tip of the small arm (not the large arm) and placing an end effector in the middle of the chain is proposed in this paper. Control of these cooperating disparate manipulators is also accomplished in computer simulations. Thus, this robotic system can have the accuracy of the small arm while retaining the payload capacity and large workspace of the large arm.

  20. Repeated thermal stressor causes chronic elevation of baseline corticosterone and suppresses the physiological endocrine sensitivity to acute stressor in the cane toad (Rhinella marina).

    PubMed

    Narayan, Edward J; Hero, Jean-Marc

    2014-04-01

    Extreme environmental temperatures can impact the physiology and ecology of animals. The stress endocrine axis provides the necessary physiological stress response to acute (day-to-day) stressors. To date, there is no empirical evidence showing that exposure to an extreme thermal stressor can cause chronic stress in amphibians. Such exposure could also modulate the physiological endocrine sensitivity to acute stressors and have serious implications for stress coping in amphibians, particularly those living in fragmented and disease-prone environments. We addressed this important question using the cane toad (Rhinella marina) model from its introduced range in Queensland, Australia. We quantified their physiological endocrine sensitivity to a standard acute (capture and handling) stressor after exposing the cane toads to thermal shock at 35°C for 30 min daily for 34 days. Corticosterone (CORT) responses to the capture and handling protocol were measured at three sampling intervals (days 14, 24, and 34) to determine whether the physiological endocrine sensitivity was maintained or modulated over time. Two control groups (C1, baseline CORT measurement only; C2, acute handling only) and two temperature treatment groups (T1, daily thermal shock up to day 14 only, followed by a 20-day recovery phase; T2, daily thermal shock for 34 days) were used. Results showed that baseline CORT levels remained high on day 14 (the combined effect of capture, captivity and thermal stress) for both T1 and T2. Baseline CORT levels decreased for T1 once the thermal shock was removed after day 14 and returned to baseline by day 29. On the contrary, baseline CORT levels kept increasing for T2 over the 34 days of daily thermal shocks. Furthermore, the magnitudes of the acute CORT responses, or physiological endocrine sensitivity, were consistently high for both C1 and T1. However, acute CORT responses for T2 toads were dramatically reduced between days 24 and 34. These novel findings suggest that repeated exposure to an extreme thermal stressor can cause chronic stress and consequently suppress the physiological endocrine sensitivity to acute stressors (e.g. pathogenic diseases) in amphibians. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Identification of large-scale meteorological patterns associated with extreme precipitation in the US northeast

    NASA Astrophysics Data System (ADS)

    Agel, Laurie; Barlow, Mathew; Feldstein, Steven B.; Gutowski, William J.

    2018-03-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. The tropopause height provides a compact representation of the upper-tropospheric potential vorticity, which is closely related to the overall evolution and intensity of weather systems. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into the overall context of patterns for all days. Six tropopause patterns are identified through KMC for extreme day precipitation: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns show various degrees of anomalously strong moisture transport preceding, and upward motion during, extreme precipitation events.
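
    A hedged sketch of the KMC step with scikit-learn, using six clusters as in the paper (the random matrix stands in for one flattened tropopause-height map per extreme-precipitation day):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(6)
        # One flattened tropopause-height map per extreme day (stand-in data:
        # 110 extreme days, each a 20 x 20 grid).
        maps = rng.standard_normal((110, 400))

        km = KMeans(n_clusters=6, n_init=50, random_state=0).fit(maps)
        patterns = km.cluster_centers_.reshape(6, 20, 20)  # the six patterns
        print("days per pattern:", np.bincount(km.labels_))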

  2. Use of Health Belief Model Variables To Examine Self-Reported Food Handling Behaviors in a Sample of U.S. Adults Attending a Tailgate Event.

    PubMed

    Hanson, Jennifer A; Hughes, Susan M; Liu, Pei

    2015-12-01

    Unsafe food handling behaviors are common among consumers, and, given the venue, individuals attending a tailgating event may be at risk for foodborne illness. The objective of this study was to measure the association between Health Belief Model variables and self-reported usual food handling behaviors in a convenience sample of men and women at a tailgate event. Participants (n = 128) completed validated subscales for self-reported food handling behaviors (i.e., cross-contamination, sanitation), perceived threat of foodborne illness (i.e., perceived severity, perceived susceptibility), and safe food handling cues to action (i.e., media cues, educational cues). Perceived severity of foodborne illness was associated with safer behaviors related to sanitation (r = 0.40; P < 0.001) and cross-contamination (r = 0.33; P = 0.001). Perceived severity of foodborne illness was also associated with exposure to safe food handling media cues (r = 0.20; P = 0.027) but not with safe food handling educational cues. A large proportion of participants reported that they never or seldom (i) read newspaper or magazine articles about foodborne illness (65.6%); (ii) read brochures about safe ways to handle food (61.7%); (iii) see store displays that explain ways to handle food (51.6%); or (iv) read the "safe handling instructions" on packages of raw meat and poultry (46.9%). Perceived severity of foodborne illness was positively related to both dimensions of safe food handling as well as with safe food handling media cues. Except for the weak correlation between media cues and perceived severity, the relationships between safe food handling cues and perceived threat, as well as between safe food handling cues and behaviors, were nonsignificant. This finding may be due, in part, to the participants' overall low exposure to safe food handling cues. The overall results of this study reinforce the postulate that perceived severity of foodborne illness may influence food handling behaviors.

  3. Extreme Mean and Its Applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.

    1979-01-01

    Extreme value statistics obtained from normally distributed data are considered. An extreme mean is defined as the mean of a p-th probability truncated normal distribution. An unbiased estimate of this extreme mean and its large-sample distribution are derived. The distribution of this estimate, even for very large samples, is found to be nonnormal. Further, as the sample size increases, the variance of the unbiased estimate converges to the Cramer-Rao lower bound. The computer program used to obtain the density and distribution functions of the standardized unbiased estimate, and the confidence intervals of the extreme mean for any data, is included for ready application. An example is included to demonstrate the usefulness of extreme mean application.
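
    Reading "the mean of a p-th probability truncated normal distribution" as the mean above the p-th quantile (my interpretation of the abstract), the extreme mean has the closed form E[X | X > x_p] = mu + sigma * phi(z_p) / (1 - Phi(z_p)), with z_p the standard normal p-th quantile; a quick numerical check:

        import numpy as np
        from scipy.stats import norm

        def extreme_mean(p, mu=0.0, sigma=1.0):
            # Mean of a normal truncated below at its p-th quantile
            # (inverse Mills ratio formula).
            z = norm.ppf(p)
            return mu + sigma * norm.pdf(z) / (1.0 - norm.cdf(z))

        # Monte Carlo check of the closed form at p = 0.95.
        rng = np.random.default_rng(7)
        x = rng.standard_normal(2_000_000)
        p = 0.95
        print(extreme_mean(p), x[x > norm.ppf(p)].mean())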

  4. Automatic liquid handling for life science: a critical review of the current state of the art.

    PubMed

    Kong, Fanwei; Yuan, Liang; Zheng, Yuan F; Chen, Weidong

    2012-06-01

    Liquid handling plays a pivotal role in life science laboratories. In experiments such as gene sequencing, protein crystallization, antibody testing, and drug screening, liquid biosamples frequently must be transferred between containers of varying sizes and/or dispensed onto substrates of varying types. The sample volumes are usually small, at the micro- or nanoliter level, and the number of transferred samples can be huge when investigating large-scope combinatorial conditions. Under these conditions, liquid handling by hand is tedious, time-consuming, and impractical. Consequently, there is a strong demand for automated liquid-handling methods such as sensor-integrated robotic systems. In this article, we survey the current state of the art in automatic liquid handling, including technologies developed by both industry and research institutions. We focus on methods for dealing with small volumes at high throughput and point out challenges for future advancements.

  5. Evaluation in industry of a draft code of practice for manual handling.

    PubMed

    Ashby, Liz; Tappin, David; Bentley, Tim

    2004-05-01

    This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.

  6. Extreme-ultraviolet and electron beam lithography processing using water developable resist material

    NASA Astrophysics Data System (ADS)

    Takei, Satoshi

    2017-08-01

    In order to enable the use of pure water in the development process of extreme-ultraviolet and electron beam lithography, in place of the conventionally used tetramethylammonium hydroxide and organic solvents, a water-developable resist material was designed and developed. The resist material was derived from woody biomass with a beta-linked disaccharide unit, chosen for environmental, safety, ease-of-handling, and worker-health considerations. Dense 80 nm line patterning at an exposure dose of 22 μC/cm² and a CF4 etching selectivity of 1.8 with a hardmask layer were achieved under specific process conditions. This water-developable resist material is one of the most promising technologies for investigation in the production of medical device applications.

  7. Performance of GUNGEN Idea Generation Support Groupware: Lessons from Over A Few Hundred Trial Sessions

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Munemori, Jun

    GUNGEN-DXII, a new version of the GUNGEN groupware, allows users to process hundreds of qualitative data segments (phrases and sentences) and compose a coherent piece of text containing a number of emergent ideas. The idea generation process is guided by the KJ method, a leading idea generation technique in Japan. This paper describes the functions of GUNGEN supporting three major sub-activities of idea generation, namely brainstorming, idea clustering, and text composition, and also summarizes the results obtained from a few hundred trial sessions with the old and new GUNGEN systems in terms of qualitative and quantitative measures. The results show that sessions with GUNGEN yield intermediate and final products at least as good as those from the original paper-and-pencil KJ method sessions, in addition to the advantages of an online system, such as distance collaboration and digital storage of the products. Moreover, results from the new GUNGEN-DXII raise hope for enabling users to handle an extremely large number of qualitative data segments in the near future.

  8. Introduction. Computational aerodynamics.

    PubMed

    Tucker, Paul G

    2007-10-15

    The wide range of uses of computational fluid dynamics (CFD) for aircraft design is discussed along with its role in dealing with the environmental impact of flight. Enabling technologies, such as grid generation and turbulence models, are also considered along with flow/turbulence control. The large eddy simulation, Reynolds-averaged Navier-Stokes and hybrid turbulence modelling approaches are contrasted. The CFD prediction of numerous jet configurations occurring in aerospace are discussed along with aeroelasticity for aeroengine and external aerodynamics, design optimization, unsteady flow modelling and aeroengine internal and external flows. It is concluded that there is a lack of detailed measurements (for both canonical and complex geometry flows) to provide validation and even, in some cases, basic understanding of flow physics. Not surprisingly, turbulence modelling is still the weak link along with, as ever, a pressing need for improved (in terms of robustness, speed and accuracy) solver technology, grid generation and geometry handling. Hence, CFD, as a truly predictive and creative design tool, seems a long way off. Meanwhile, extreme practitioner expertise is still required and the triad of computation, measurement and analytic solution must be judiciously used.

  9. Reconstructing the Seismic Wavefield using Curvelets and Distributed Acoustic Sensing

    NASA Astrophysics Data System (ADS)

    Muir, J. B.; Zhan, Z.

    2017-12-01

    Distributed Acoustic Sensing (DAS) offers an opportunity to produce cost-effective and uniquely dense images of the surface seismic wavefield - DAS also produces extremely large data volumes that require innovative methods of data reduction and seismic parameter inversion to handle efficiently. We leverage DAS and the super-Nyquist sampling enabled by compressed sensing of the wavefield in the curvelet domain to produce accurate images of the horizontal velocity within a target region, using only short (1-10 minute) records of either active seismic sources or ambient seismic signals. Once the wavefield has been fully described, modern "tomographic" techniques, such as Helmholtz tomography or Wavefield Gradiometry, can be employed to determine seismic parameters of interest such as phase velocity. An additional practical benefit of employing a wavefield reconstruction step is that multiple heterogeneous forms of instrumentation can be naturally combined; in this study we therefore also explore the addition of three-component nodal seismic data into the reconstructed wavefield. We illustrate these techniques using both synthetic examples and data taken from the Brady Geothermal Field in Nevada during the PoroTomo (U. Wisconsin Madison) experiment of 2016.
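
    A hedged sketch of the reconstruction idea: recover a densely sampled wavefield snapshot from a random 20% subset of channels by iterative soft thresholding (ISTA) on transform coefficients, with the orthonormal FFT standing in for the curvelet frame the study actually uses:

        import numpy as np

        rng = np.random.default_rng(8)
        n = 512
        # True snapshot: a few spatial harmonics, hence sparse in Fourier.
        x_true = sum(np.cos(2 * np.pi * k * np.arange(n) / n)
                     for k in (7, 23, 61))

        mask = rng.random(n) < 0.2     # observe ~20% of the channels
        y = mask * x_true              # sub-sampled measurements

        def soft(z, t):
            # Complex soft-thresholding: shrink magnitudes by t.
            mag = np.maximum(np.abs(z), 1e-12)
            return z * np.maximum(1.0 - t / mag, 0.0)

        x = np.zeros(n)
        lam = 0.02
        for _ in range(300):               # ISTA, unit step (||mask|| <= 1)
            grad = mask * (mask * x - y)   # gradient of 0.5*||mask*x - y||^2
            coeffs = np.fft.fft(x - grad, norm="ortho")
            x = np.fft.ifft(soft(coeffs, lam), norm="ortho").real
        err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
        print(f"relative reconstruction error: {err:.3f}")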

  10. The Nexus of Extremism and Trafficking: Scourge of the World or So Much Hype?

    DTIC Science & Technology

    2013-10-01

    in AML/CFT," International Monetary Fund. Available at: www.imf.org/external/np/leg/amlcft/eng/aml1.htm (accessed April 6, 2012). 19. "Money...tional Law Enforcement Organizations' Interdiction Against Human Trafficking," thesis presented to the Graduate Council of Texas State University-San Marcos, December 2011, 18-21. Available at: http://repositories.tdl.org/txstate-ir/bitstream/handle/10529/ETD-TXSTATE-2011-12-299/BAILEY-THESIS.pdf

  11. Conference Video for Booth at SAE World Congress Experience Conference

    NASA Technical Reports Server (NTRS)

    Harkey, Ann Marie

    2017-01-01

    Contents: Publicly released videos on technology transfer items available for licensing from NASA. Includes: 1. Powder Handling Device for Analytical Instruments (Ames); 2. Fiber Optic Shape Sensing (FOSS) (Armstrong); 3. Robo-Glove (Johnson); 4. Modular Robotic Vehicle (Johnson); 5. Battery Management System (Johnson); 6. Active Response Gravity Offload System (ARGOS) (Johnson); 7. Contaminant Resistant Coatings for Extreme Environments (Langley); 8. Molecular Adsorber Coating (MAC) (Goddard); 9. Ultrasonic Stir Welding (Marshall). Also includes scenes from the International Space Station.

  12. The Effect of Yaw Coupling in Turning Maneuvers of Large Transport Aircraft

    NASA Technical Reports Server (NTRS)

    McNeill, Walter E.; Innis, Robert C.

    1965-01-01

    A study has been made, using a piloted moving simulator, of the effects of the yaw-coupling parameters N(sub p) and N(sub delta(sub a)) on the lateral-directional handling qualities of a large transport airplane at landing-approach airspeed. It is shown that the desirable combinations of these parameters tend to be more proverse when compared with values typical of current aircraft. Results of flight tests in a large variable-stability jet transport showed trends which were similar to those of the simulator data. Areas of minor disagreement, which were traced to differences in airplane geometry, indicate that pilot consciousness of side-acceleration forces can be an important factor in the handling qualities of future long-nosed transport aircraft.

  13. Tethered particle analysis of supercoiled circular DNA using peptide nucleic acid handles.

    PubMed

    Norregaard, Kamilla; Andersson, Magnus; Nielsen, Peter Eigil; Brown, Stanley; Oddershede, Lene B

    2014-09-01

    This protocol describes how to monitor individual naturally supercoiled circular DNA plasmids bound via peptide nucleic acid (PNA) handles between a bead and a surface. The protocol was developed for single-molecule investigation of the dynamics of supercoiled DNA, and it allows the investigation both of the dynamics of the molecule itself and of its interactions with a regulatory protein. Two bis-PNA clamps designed to bind with extremely high affinity to predetermined homopurine sequence sites in supercoiled DNA are prepared: one conjugated with digoxigenin for attachment to an anti-digoxigenin-coated glass cover slide, and one conjugated with biotin for attachment to a submicron-sized streptavidin-coated polystyrene bead. Plasmids are constructed, purified and incubated with the PNA handles. The dynamics of the construct are analyzed by tracking the tethered bead using video microscopy: less supercoiling results in more movement, and more supercoiling results in less movement. In contrast to other single-molecule methodologies, the current methodology allows DNA to be studied in its naturally supercoiled state with constant linking number and constant writhe. The protocol has potential for use in studying the influence of supercoils on the dynamics of DNA and its associated proteins, e.g., topoisomerase. The procedure takes ~4 weeks.
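
    The final analysis step - relating bead motion to supercoiling - reduces to simple statistics of the tracked positions. The sketch below is a hypothetical illustration, not part of the published protocol: it computes a drift-corrected RMS excursion of a tethered bead, together with a symmetry ratio of the kind commonly used in tethered-particle work to reject stuck or multiply tethered beads; all names and values are assumptions.

```python
import numpy as np

def tether_mobility(x, y, window=301):
    """Drift-corrected RMS excursion of a tethered bead from tracked
    (x, y) positions: smaller excursion means a more compact (more
    supercoiled) tether. Also returns a symmetry ratio (near 1 for a
    valid single tether; low values flag stuck or doubly tethered beads)."""
    kernel = np.ones(window) / window
    # Moving-average estimate of the slowly drifting anchor point.
    dx = x - np.convolve(x, kernel, mode="same")
    dy = y - np.convolve(y, kernel, mode="same")
    rms = np.sqrt(np.mean(dx ** 2 + dy ** 2))
    eigvals = np.linalg.eigvalsh(np.cov(dx, dy))   # ascending order
    symmetry = np.sqrt(eigvals[0] / eigvals[1])
    return rms, symmetry

# Toy usage on synthetic 2,000-frame tracks (positions in nm).
rng = np.random.default_rng(2)
x = rng.normal(0.0, 40.0, 2000)
y = rng.normal(0.0, 40.0, 2000)
print(tether_mobility(x, y))
```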

  14. An artificial perch to help Snail Kites handle an exotic Apple Snail

    USGS Publications Warehouse

    Pias, Kyle E.; Welch, Zach C.; Kitchens, Wiley M.

    2012-01-01

    In the United States, the Snail Kite (Rostrhamus sociabilis plumbeus) is a federally endangered species and restricted to the wetlands of south-central Florida where the current population numbers less than 1,500. The Snail Kite is an extreme dietary specialist, previously feeding almost exclusively on one species of snail, the Florida Apple Snail (Pomacea paludosa). Within the past decade, an exotic species of apple snail, the Island Apple Snail (Pomacea insularum), has become established on lakes in central Florida. Island Apple Snails are larger than the native Florida Apple Snails, and Snail Kites handle the exotic snails less efficiently. Juvenile Snail Kites, in particular, have lower daily energy balances while feeding on Island Apple Snails. An inexpensive, easy-to-construct platform was developed that would provide Snail Kites with a flat, stable surface on which to extract snails. The platform has the potential to reduce the difficulties Snail Kites experience when handling exotic snails, and may benefit the Snail Kite population as a whole. Initial observations indicate that Snail Kites use the platforms frequently, and snails extracted at the platforms are larger than snails extracted at other perches.

  15. Evaluating Environment, Erosion and Sedimentation Aspects in Coastal Area to Determine Priority Handling (A Case Study in Jepara Regency, northern Central Java, Indonesia)

    NASA Astrophysics Data System (ADS)

    Wahyudi, S. I.; Adi, H. P.

    2018-04-01

    Many coastal areas in northern Central Java, Indonesia, have been suffering damage. One of these areas is Jepara, where 7.6 kilometres of the 72-kilometre coastline have been damaged. The damage is mostly caused by coastal erosion, sedimentation, environmental degradation and tidal flooding. Several efforts have been made, such as replanting mangroves and building revetments and groins, but they have not been able to mitigate the coastal damage. The purposes of this study are to map the coastal damage, to analyze handling priority and to determine a coastal protection model. The methods used are identifying and plotting the coastal damage on a map, scoring each variable, and determining the handling priority and a suitable coastal protection model. Five levels of coastal damage are used in this study, namely light, medium, heavy, very heavy, and extremely heavy. Based on the priority assessment of coastal damage, follow-up is needed in the form of detailed design and implementation of soft structures, for example mangroves and sand nourishment, and hard structures, such as breakwaters, groins and revetments.
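
    The scoring-and-ranking step lends itself to a compact illustration. The sketch below is hypothetical: the variables, weights, and class boundaries are illustrative assumptions, not the values used in the study.

```python
# Minimal sketch of weighted scoring and priority ranking; weights and
# class boundaries here are illustrative assumptions only.
DAMAGE_CLASSES = ["light", "medium", "heavy", "very heavy", "extremely heavy"]

def damage_class(score):
    """Map a 0-100 composite score onto the five damage levels."""
    bounds = [20, 40, 60, 80]   # hypothetical class boundaries
    for level, bound in zip(DAMAGE_CLASSES, bounds):
        if score < bound:
            return level
    return DAMAGE_CLASSES[-1]

def prioritize(segments, weights):
    """Weighted composite score per coastal segment, sorted by
    descending handling priority."""
    ranked = []
    for name, scores in segments.items():
        total = sum(weights[k] * v for k, v in scores.items())
        ranked.append((total, name, damage_class(total)))
    return sorted(ranked, reverse=True)

weights = {"erosion": 0.4, "sedimentation": 0.3,
           "environment": 0.2, "tidal_flooding": 0.1}
segments = {
    "Segment-A": {"erosion": 85, "sedimentation": 60,
                  "environment": 70, "tidal_flooding": 40},
    "Segment-B": {"erosion": 30, "sedimentation": 80,
                  "environment": 45, "tidal_flooding": 55},
}
for total, name, level in prioritize(segments, weights):
    print(f"{name}: score={total:.1f} ({level})")
```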

  16. Solid Freeform Fabrication: An Enabling Technology for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M. B.; Hafley, Robert A.; Dicus, Dennis L.

    2002-01-01

    The emerging class of direct manufacturing processes known as Solid Freeform Fabrication (SFF) employs a focused energy beam and metal feedstock to build structural parts directly from computer aided design (CAD) data. Some variations on existing SFF techniques have potential for application in space for a variety of different missions. This paper focuses on three different applications, ranging from near to far term, to demonstrate the widespread potential of this technology for space-based applications. One application is the on-orbit construction of large space structures, on the order of tens of meters to a kilometer in size. Such structures are too large to launch intact even in a deployable design; their extreme size necessitates assembly or erection in space. A low-earth-orbiting satellite with an SFF system employing a high-energy beam for high deposition rates could be used to construct large space structures from feedstock launched from Earth. A second potential application is a small, multifunctional system that could be used by astronauts on long-duration human exploration missions to manufacture spare parts. Supportability of human exploration missions is essential, and an SFF system would provide the flexibility to repair or fabricate any part that may be damaged or broken during the mission. The system envisioned would also have machining and welding capabilities to increase its utility on a mission where mass and volume are extremely limited. A third example of an SFF application in space is a miniaturized automated system for structural health monitoring and repair. If damage is detected using a low-power beam scan, the beam power can be increased to perform repairs within the spacecraft or satellite structure without the need for human interaction or commands. Due to the low-gravity environment of all of these applications, wire feedstock is preferred to powder from a containment, handling, and safety standpoint. The energy beam may be either an electron beam or a laser, and the developments required for either energy source to achieve success in these applications are discussed.

  17. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of the large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research uses synoptic climatology as a tool to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture key patterns associated with extreme precipitation over Portland and in better interpreting projections of future climate at impact-relevant scales.
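
    The self-organizing maps step can be sketched briefly. The example below is a hypothetical illustration using the third-party minisom package (an assumption; the authors' toolchain is not stated): daily circulation-anomaly maps for extreme-precipitation days are flattened to vectors and clustered onto a small grid of characteristic patterns.

```python
import numpy as np
from minisom import MiniSom   # third-party package, assumed installed

# Hypothetical input: daily large-scale anomaly maps (e.g., sea-level
# pressure) for extreme-precipitation days, flattened to vectors.
n_days, ny, nx = 500, 20, 30
patterns = np.random.default_rng(1).standard_normal((n_days, ny * nx))

# A small 3x4 map yields 12 characteristic circulation patterns that
# span the within-composite variability.
som = MiniSom(3, 4, ny * nx, sigma=1.0, learning_rate=0.5, random_seed=1)
som.train_random(patterns, 5000)

# Assign each extreme day to its best-matching node (pattern type).
node_of_day = [som.winner(p) for p in patterns]
```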

  18. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    NASA Astrophysics Data System (ADS)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  19. Climate Impacts on Extreme Energy Consumption of Different Types of Buildings

    PubMed Central

    Li, Mingcai; Shi, Jun; Guo, Jun; Cao, Jingfu; Niu, Jide; Xiong, Mingming

    2015-01-01

    Exploring changes in building energy consumption and its relationship with climate can provide a basis for energy saving and carbon emission reduction. Heating and cooling energy consumption of different types of buildings during 1981-2010 in Tianjin, China, was simulated using TRNSYS software. Daily and hourly extreme energy consumption was determined by percentile methods, and the climate impact on extreme energy consumption was analyzed. The results showed that the number of days of extreme heating consumption decreased noticeably over the recent 30 years for the residential and large venue buildings, whereas days of extreme cooling consumption increased for the large venue building. No significant variation was found in the days of extreme energy consumption for the commercial building, although extreme heating energy consumption showed a decreasing trend. Daily extreme energy consumption for the large venue building had no relationship with climate parameters, whereas extreme energy consumption for the commercial and residential buildings was related to various climate parameters. Further multiple regression analysis suggested that heating energy consumption for the commercial building was affected by maximum temperature, dry bulb temperature, solar radiation and minimum temperature, which together explain 71.5% of the variation in daily extreme heating energy consumption. The daily extreme cooling energy consumption for the commercial building was related only to the wet bulb temperature (R² = 0.382). The daily extreme heating energy consumption for the residential building was affected by four climate parameters, but the dry bulb temperature had the main impact. The impact of climate on hourly extreme heating energy consumption had a 1-3 hour delay in all three types of buildings, but no delay was found in the impact of climate on hourly extreme cooling energy consumption for the selected buildings. PMID:25923205

  20. Climate impacts on extreme energy consumption of different types of buildings.

    PubMed

    Li, Mingcai; Shi, Jun; Guo, Jun; Cao, Jingfu; Niu, Jide; Xiong, Mingming

    2015-01-01

    Exploring changes in building energy consumption and its relationship with climate can provide a basis for energy saving and carbon emission reduction. Heating and cooling energy consumption of different types of buildings during 1981-2010 in Tianjin, China, was simulated using TRNSYS software. Daily and hourly extreme energy consumption was determined by percentile methods, and the climate impact on extreme energy consumption was analyzed. The results showed that the number of days of extreme heating consumption decreased noticeably over the recent 30 years for the residential and large venue buildings, whereas days of extreme cooling consumption increased for the large venue building. No significant variation was found in the days of extreme energy consumption for the commercial building, although extreme heating energy consumption showed a decreasing trend. Daily extreme energy consumption for the large venue building had no relationship with climate parameters, whereas extreme energy consumption for the commercial and residential buildings was related to various climate parameters. Further multiple regression analysis suggested that heating energy consumption for the commercial building was affected by maximum temperature, dry bulb temperature, solar radiation and minimum temperature, which together explain 71.5% of the variation in daily extreme heating energy consumption. The daily extreme cooling energy consumption for the commercial building was related only to the wet bulb temperature (R² = 0.382). The daily extreme heating energy consumption for the residential building was affected by four climate parameters, but the dry bulb temperature had the main impact. The impact of climate on hourly extreme heating energy consumption had a 1-3 hour delay in all three types of buildings, but no delay was found in the impact of climate on hourly extreme cooling energy consumption for the selected buildings.
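
    The multiple regression underlying the reported R² values is ordinary least squares of daily extreme consumption on several climate parameters. The sketch below is a hypothetical illustration with synthetic data, not the study's data; it shows the form of the fit and how an R² such as 0.715 or 0.382 would be computed.

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares of y on X (with intercept); returns the
    coefficient vector and the R-squared of the fit."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return beta, r2

# Hypothetical predictors (columns): maximum temperature, dry bulb
# temperature, solar radiation, minimum temperature; response: daily
# extreme heating consumption. Synthetic stand-in data, not the study's.
rng = np.random.default_rng(0)
X = rng.standard_normal((365, 4))
y = X @ np.array([-2.0, -1.5, -0.5, -1.0]) + rng.standard_normal(365)
beta, r2 = ols_fit(X, y)
print(f"intercept+coefficients={np.round(beta, 2)}, R^2={r2:.3f}")
```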

  1. Handling qualities of large flexible control-configured aircraft

    NASA Technical Reports Server (NTRS)

    Swaim, R. L.

    1979-01-01

    The approach taken in this analytical study of flexible-airplane longitudinal handling qualities was to parametrically vary the natural frequencies of two symmetric elastic modes to induce mode interactions with the rigid-body dynamics. Since the structure of the pilot model was unknown for such dynamic interactions, the optimal control pilot modeling method is applied in conjunction with a pilot rating method.

  2. 1997 Technology Applications Report,

    DTIC Science & Technology

    1997-01-01

    handle high-power loads at microwave radio frequencies, microwave vacuum tubes remain the chosen technology to amplify high power. Aria Microwave...structure called the active RF cavity amplifier (ARFCA). With this design, the amplifier handles high-power loads at radio and microwave frequencies...developed this technology using BMDO-funded modeling methods designed to simulate the dynamics of large space-based structures. Because it increases

  3. Detailed performance analysis of the A.A.D. - concept B

    NASA Technical Reports Server (NTRS)

    Sekar, R.; Tozzi, L.

    1983-01-01

    New concepts for engine performance improvement include the adoption of heat regeneration techniques, advanced methods to enhance combustion, and higher-efficiency air handling machinery, such as the positive-displacement helical screw expander and compressor. Each of these concepts plays a particular role in engine performance improvement. First, regeneration has great potential for achieving higher engine thermal efficiency through the recovery of waste energy. Although the concept itself is not new (the technique is used in gas turbines), its application to reciprocating internal combustion engines is quite unusual and presents conceptual difficulties. The second important area is better control of the combustion process in terms of heat transfer characteristics, combustion products, and heat release rate. The third area for performance improvement is the adoption of high-efficiency air handling machinery. In particular, the positive-displacement helical expander and compressor exhibit extremely high efficiency over a wide range of operating conditions.

  4. Clinically Significant Envenomation From Postmortem Copperhead (Agkistrodon contortrix).

    PubMed

    Emswiler, Michael P; Griffith, F Phillip; Cumpston, Kirk L

    2017-03-01

    Over 14,000 copperhead (Agkistrodon contortrix) bites were reported to United States poison centers between 1983 and 2008, and 1809 cases were reported to poison centers in 2014. The copperhead is primarily found in the southeastern United States and belongs to the pit viper subfamily Crotalinae, which also includes the water moccasin (Agkistrodon piscivorus) and rattlesnakes (Crotalus and Sistrurus genera). Postmortem rattlesnakes have been reported to cause clinically significant envenomation; we report a case of a postmortem copperhead causing clinically significant envenomation after an inadvertent puncture by the deceased copperhead's fang. The copperhead had been transected twice, leaving the snake in 3 separate pieces. While the snake's head was being handled, an inadvertent puncture occurred on the right index finger, followed by pain and swelling in the affected extremity necessitating antivenom administration. Care should be taken when handling deceased pit vipers due to the continued risk of envenomation. Copyright © 2017 Wilderness Medical Society. Published by Elsevier Inc. All rights reserved.

  5. 30. Engine controls and valve gear, looking aft on main ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. Engine controls and valve gear, looking aft on main (promenade) deck level. Threaded admission valve lift rods (two at immediate left of chronometer) permit adjustment of valve timing in lower and upper admission valves of cylinder (left rod controls lower valve, right rod upper valve). Valve rods are lifted by jaw-like "wipers" during operation. Exhaust valve lift rods and wipers are located to right of chronometer. Crank at extreme right drives valve wiper shaft when engaged to end of eccentric rod, shown under "Crank Indicator" dial. Pair of handles to immediate left of admission valve rods control condenser water valves; handles to right of exhaust valve rods control feedwater flow to boilers from pumps. Gauges indicate boiler pressure (left) and condenser vacuum (right); "Crank Indicator" on wall aids engineer in keeping engine crank off "dead-center" at stop so that engine may be easily restarted. - Steamboat TICONDEROGA, Shelburne Museum Route 7, Shelburne, Chittenden County, VT

  6. Respiratory alkalosis.

    PubMed

    Foster, G T; Vaziri, N D; Sassoon, C S

    2001-04-01

    Respiratory alkalosis is an extremely common and complicated problem affecting virtually every organ system in the body. This article reviews the various facets of this interesting problem. Respiratory alkalosis produces multiple metabolic abnormalities, from changes in potassium, phosphate, and calcium, to the development of a mild lactic acidosis. Renal handling of the above ions is also affected. The etiologies may be related to pulmonary or extrapulmonary disorders. Hyperventilation syndrome is a common etiology of respiratory alkalosis in the emergency department setting and is a diagnosis by exclusion. There are many cardiac effects of respiratory alkalosis, such as tachycardia, ventricular and atrial arrhythmias, and ischemic and nonischemic chest pain. In the lungs, vasodilation occurs, and in the gastrointestinal system there are changes in perfusion, motility, and electrolyte handling. Therapeutically, respiratory alkalosis is used for treatment of elevated intracranial pressure. Correction of a respiratory alkalosis is best performed by correcting the underlying etiology.

  7. Applying Physics to the Student's World

    NASA Astrophysics Data System (ADS)

    Blanton, Patricia

    2003-03-01

    It has been a challenging day. The electricity is off because snow and extreme winds have toppled trees into power lines all across my county. As I've been trying to devise a way to cook dinner, I've been thinking about how equipped we are to handle such emergencies. My husband, who has worked most of the day getting the portable generator going and figuring out how to hook up things to keep us warm and safe, made the comment, "You have to be a MacGyver if you are going to be a homeowner." I began wondering how well we are equipping our students with the ability to figure out how to make things work. We teach the physics principles so they can solve the "book problems," but are we helping them to understand the principles well enough to become real problem solvers? Are they prepared to handle situations when the "usual things" aren't working?

  8. New color-based tracking algorithm for joints of the upper extremities

    NASA Astrophysics Data System (ADS)

    Wu, Xiangping; Chow, Daniel H. K.; Zheng, Xiaoxiang

    2007-11-01

    To track the joints of the upper limbs of stroke sufferers for rehabilitation assessment, this paper proposes a new tracking algorithm that combines a color-based particle filter with a novel strategy for handling occlusions. Objects are represented by their color histogram models, and a particle filter is used to track the objects within a probabilistic framework. A Kalman filter, acting as a local optimizer, is integrated into the sampling stage of the particle filter to steer samples toward regions of high likelihood, so that fewer samples are required. A color clustering method and anatomical constraints are used to deal with the occlusion problem. Compared with the basic particle filtering method, experimental results show that the new algorithm reduces the number of samples, and hence the computational cost, and better handles complete occlusion lasting a few frames.
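
    The core predict-weight-resample cycle of such a color-based particle filter can be sketched as follows. This is a minimal hypothetical illustration, not the authors' implementation: it omits the Kalman-steered sampling and the color-clustering occlusion strategy, and weights particles by the Bhattacharyya similarity between each candidate patch's color histogram and a reference model; all names and parameters are assumptions.

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Normalized joint RGB histogram of an image patch."""
    hist, _ = np.histogramdd(patch.reshape(-1, 3).astype(float),
                             bins=(bins,) * 3, range=((0, 256),) * 3)
    return hist.ravel() / max(hist.sum(), 1.0)

def bhattacharyya(p, q):
    """Similarity of two normalized histograms (1 = identical)."""
    return float(np.sum(np.sqrt(p * q)))

def track_step(frame, particles, ref_hist, half=10, sigma=5.0, rng=None):
    """One predict-weight-resample cycle; particles holds (row, col)
    image coordinates of the tracked joint, one row per particle."""
    rng = rng or np.random.default_rng()
    # Predict: random-walk motion model (the paper steers this stage
    # with a Kalman filter; that refinement is omitted here).
    particles = particles + rng.normal(0.0, sigma, particles.shape)
    h, w = frame.shape[:2]
    particles[:, 0] = np.clip(particles[:, 0], half, h - half - 1)
    particles[:, 1] = np.clip(particles[:, 1], half, w - half - 1)
    # Weight: histogram similarity between each candidate patch and
    # the reference color model of the joint.
    weights = np.empty(len(particles))
    for i, (r, c) in enumerate(particles.astype(int)):
        patch = frame[r - half:r + half, c - half:c + half]
        weights[i] = bhattacharyya(color_histogram(patch), ref_hist)
    weights = weights + 1e-12
    weights /= weights.sum()
    estimate = weights @ particles        # weighted-mean state estimate
    # Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], estimate

# Typical use: initialize particles around the joint detected in frame 0,
# take ref_hist from that region, then call track_step on each new frame.
```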

  9. Translator for Optimizing Fluid-Handling Components

    NASA Technical Reports Server (NTRS)

    Landon, Mark; Perry, Ernest

    2007-01-01

    A software interface has been devised to facilitate optimization of the shapes of valves, elbows, fittings, and other components used to handle fluids under extreme conditions. This software interface translates data files generated by PLOT3D (a NASA grid-based plotting-and-data-display program) and by computational fluid dynamics (CFD) software into a format in which the files can be read by Sculptor, which is a shape-deformation-and-optimization program. Sculptor enables the user to interactively, smoothly, and arbitrarily deform the surfaces and volumes in two- and three-dimensional CFD models. Sculptor also includes design-optimization algorithms that can be used in conjunction with the arbitrary-shape-deformation components to perform automatic shape optimization. In the optimization process, the output of the CFD software is used as feedback while the optimizer strives to satisfy design criteria that could include, for example, improved values of pressure loss, velocity, flow quality, mass flow, etc.

  10. Effects of wearing knitted or rubber gloves on the transfer of Escherichia coli between hands and meat.

    PubMed

    Gill, C O; Jones, T

    2002-06-01

    On eight occasions, five volunteers each handled five pieces of meat with bare hands or while wearing dry or wet knitted gloves or rubber gloves after hands had been inoculated with Escherichia coli or after handling a piece of meat inoculated with E. coli. On each occasion, after all meat was handled, each piece of meat, glove, and hand were sampled to recover E. coli. When hands were inoculated, E. coli was recovered from all meat handled with bare hands, in lesser numbers from some pieces handled with knitted gloves, and from only one piece handled with rubber gloves. When pieces of inoculated meat were handled, the numbers of E. coli transferred to uninoculated meat from bare hands or rubber gloves decreased substantially with each successive piece of uninoculated meat, but decreases were small with knitted gloves. The findings indicate that, compared with bare hands, the use of knitted gloves could reduce the transfer of bacteria from hands to meat but could increase the transfer of bacteria between meat pieces, whereas the use of rubber gloves could largely prevent the first and greatly reduce the second type of bacteria transfer.

  11. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M; Ahrens, James P; Lo, Li - Ta

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software based ray tracing, software based rasterization and hardware accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU and CPU based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software based ray-tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  12. Review and synthesis of problems and directions for large scale geographic information system development

    NASA Technical Reports Server (NTRS)

    Boyle, A. R.; Dangermond, J.; Marble, D.; Simonett, D. S.; Tomlinson, R. F.

    1983-01-01

    Problems and directions for large scale geographic information system development were reviewed and the general problems associated with automated geographic information systems and spatial data handling were addressed.

  13. Remote Infrared Thermal Sensing of Sewer Voids, Four-Year Update

    NASA Astrophysics Data System (ADS)

    Weil, Gary J.

    1988-01-01

    When a sewer caves in, it often takes the street, sidewalks, and surrounding buildings along for the ride. These collapses endanger public health and safety. Repairing a sewer before such a cave-in is obviously the preferred approach; emergency repairs cost far more than prevention measures - often millions of dollars more. Many combined sewers in the St. Louis area, as in many of America's cities, are more than 125 years old and are subject to structural failure. In 1981 alone, St. Louis had 4,000 sewer collapses and an astronomical repair bill. These and similar problems have been described as "a crisis of national proportions." The question addressed by this paper is how to detect unseen problem areas in sewer systems before they give way. At present, progressive sewer administrations may use crawl crews to inspect sewers when problems are suspected. This can be extremely costly and dangerous, and a void around the outside of the sewer is often invisible from within; thus, even a crawl crew can fail to detect most voids. Infrared thermography has been found by sewer districts and independent evaluation engineering firms to be an extremely accurate method of finding sewer voids before they can cause expensive and dangerous problems. This technique uses a non-contact, remote sensing method, with the potential for surveying large areas quickly and efficiently. This paper reviews our initial paper presented to The International Society for Optical Engineering in October of 1983 and presents an update of our experience, both successes and failures, in several large-scale void detection projects. Infrared thermographic techniques of non-destructive testing will have major implications for cities and for the engineering profession because they promise to make the crisis of infrastructure repair and rehabilitation more manageable. Intelligent, systematic use of this relatively low-cost void detection method may revolutionize the way sewer problems are handled in the future.

  14. Applications of flight control system methods to an advanced combat rotorcraft

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Fletcher, Jay W.; Morris, Patrick M.; Tucker, George T.

    1989-01-01

    Advanced flight control system design, analysis, and testing methodologies developed at the Ames Research Center are applied in an analytical and flight test evaluation of the Advanced Digital Optical Control System (ADOCS) demonstrator. The primary objectives are to describe the knowledge gained about the implications of digital flight control system design for rotorcraft, and to illustrate the analysis of the resulting handling qualities in the context of the proposed new handling-qualities specification for rotorcraft. Topics covered in depth are digital flight control design and analysis methods, flight testing techniques, ADOCS handling-qualities evaluation results, and correlation of flight test results with analytical models and the proposed handling-qualities specification. The evaluation of the ADOCS demonstrator indicates desirable response characteristics based on equivalent damping and frequency, but undesirably large effective time delays (exceeding 240 msec in all axes). Piloted handling qualities are found to be desirable or adequate for all low, medium, and high pilot-gain tasks, but handling qualities are inadequate for ultra-high-gain tasks such as slope and running landings.

  15. Radiological analysis of plutonium glass batches with natural/enriched boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    2000-06-22

    The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include either natural boron or enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (a, n) reactions. The Americium-241 and plutonium present in the glass emit alpha particles (a). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B{sub 2}O{sub 3}. Boron-11 was found to strongly support the (a, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (a, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make handling these materials less difficult and will reduce radiation exposure to operating workers.

  16. Recent Extreme Marine Events at Southern Coast of Black Sea

    NASA Astrophysics Data System (ADS)

    Ozyurt Tarakcioglu, Gulizar; Cevdet Yalciner, Ahmet; Kirezci, Cagil; Baykal, Cuneyt; Gokhan Guler, Hasan; Erol, Onur; Zaytsev, Andrey; Kurkin, Andrey

    2015-04-01

    Utilization of the coastal areas of the Black Sea basin has increased in recent years with projects such as large commercial ports, international transportation hubs, gas and petrol pipelines, and tourist and recreational infrastructure along the surrounding shoreline. Although the Black Sea is a closed basin, extreme storms and storm surges have been observed with increasing frequency in recent years. Among these events, the February 1999, March 2013 and September 2014 storms that struck the southern coast of the Black Sea clearly showed that the increasing economic value of the coastal areas has increased the cost of damage and loss of property from natural hazards. The storm that occurred on February 19-20, 1999 was one of the most destructive storms of recent decades. The 1999 event (the 1999 southern Black Sea storm) caused destruction at all harbors and coastal protection structures along the Black Sea coast of Turkey. The complete destruction of the breakwater of Giresun Harbor and the damage to harbor structures and cargo handling equipment were the major impacts of the 1999 southern Black Sea storm. Similar coastal impacts were also observed during the September 24, 2014 storm 500 m east of Giresun Harbor. Although a considerable number of destructive storms have been observed on the southern coast of the Black Sea recently, data on these events are limited and widely scattered. In this study, a list of recent extreme marine events on the southern coast of the Black Sea is compiled, and related data such as wind speed, wave height, period, and type of damage are cataloged. Particular attention is focused on the 1999 and 2014 storm events. The meteorological and morphological characteristics that may explain the generation and coastal amplification of these storms are discussed. ACKNOWLEDGEMENTS: This study is partly supported by the Turkish-Russian Joint Research Grant Program of TUBITAK (Turkey) and RFBR (Russia), and TUBITAK Research Project 213M534.

  17. 40. BUILDING NO. 454, ORDNANCE FACILITY (BAG CHANGE FILLING PLANT), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    40. BUILDING NO. 454, ORDNANCE FACILITY (BAG CHANGE FILLING PLANT), DETAIL SOUTHEAST SIDE OF EXTERIOR ELECTRICAL EQUIPMENT ROOM, SHOWING DOOR TO SEWING ROOM NO. 3, VENTILATOR FAN (OVER DOOR), STEAM LINE (PIPE), SEWING MACHINE MOTOR IN OVERHEAD, ALARM BELL, EXPLOSION-PROOF SWITCH BOXES, GROUNDS ON DOORS, PULL ALARM HANDLE (EXTREME RIGHT; PULLEY CABLE CONDUCTED IN CONDUIT TO SWITCH INSIDE BUILDING. PULLEYS INSIDE ALL ELBOW JOINTS.) - Picatinny Arsenal, 400 Area, Gun Bag Loading District, State Route 15 near I-80, Dover, Morris County, NJ

  18. The use of hydro-dynamic models in the practice-oriented education of engineering students

    NASA Astrophysics Data System (ADS)

    Sziebert, J.; Zellei, L.; Tamás, E. A.

    2009-04-01

    Management tasks related to open channel flows have become rather comprehensive and multi-disciplinary, particularly with the growing predominance of nature-management considerations. The water regime of our rivers has reached extremes more and more frequently in the past decades. In order to develop and analyse alternative solutions and to handle and resolve conflicts of interest, we apply 1D hydro-dynamic models in education to explain processes and to improve the practical skills of our students.

  19. Use of Flowtran Simulation in Education

    ERIC Educational Resources Information Center

    Clark, J. Peter; Sommerfeld, Jude T.

    1976-01-01

    Describes the use in chemical engineering education of FLOWTRAN, a large steady-state simulator of chemical processes with extensive facilities for physical and thermodynamic data-handling and a large library of equipment modules, including cost estimation capability. (MLH)

  20. Handling a Large Collection of PDF Documents

    EPA Pesticide Factsheets

    You have several options for making a large collection of PDF documents more accessible to your audience: avoid uploading altogether, use multiple document pages, and use document IDs as anchors for direct links within a document page.

  1. Simulator study of flight characteristics of several large, dissimilar, cargo transport airplanes during approach and landing

    NASA Technical Reports Server (NTRS)

    Grantham, W. D.; Smith, P. M.; Deal, P. L.; Neely, W. R., Jr.

    1984-01-01

    A six-degree-of-freedom, ground-based simulator study was conducted to evaluate the low-speed flight characteristics of four dissimilar cargo transport airplanes. These characteristics are compared with those of a large, present-day (reference) transport configuration similar to the Lockheed C-5A airplane. The four very large transport concepts evaluated consist of single-fuselage, twin-fuselage, triple-fuselage, and span-loader configurations. The primary piloting task is the approach and landing operation. The results of this study indicate that all four concepts evaluated have unsatisfactory longitudinal and lateral-directional low-speed flight characteristics and that considerable stability and control augmentation would be required to improve these characteristics (handling qualities) to a satisfactory level. Through the use of rate-command/attitude-hold augmentation in the pitch and roll axes, and the use of several turn-coordination features, the handling qualities of all four large transports simulated were improved appreciably.

  2. Reducing variation in a rabbit vaccine safety study with particular emphasis on housing conditions and handling.

    PubMed

    Verwer, Cynthia M; van der Ark, Arno; van Amerongen, Geert; van den Bos, Ruud; Hendriksen, Coenraad F M

    2009-04-01

    This paper describes the results of a study of the effects of modified housing conditions and of conditioning and habituation to humans, using a rabbit model for monitoring whole-cell pertussis vaccine (pWCV)-induced adverse effects. The study was performed with reference to previous vaccine safety studies of pWCV in rabbits, in which results were difficult to interpret due to the large variation in experimental outcome, especially in the key parameter deep-body temperature (T(b)). Certain stressful laboratory conditions, as well as procedures involving humans, e.g. blood sampling, inoculation and cage-cleaning, were hypothesized to cause this large variation. The results of this study show that under modified housing conditions rabbits have normal circadian body temperatures. This allowed discrimination of pWCV-induced adverse effects, in which handled rabbits tended to show a dose-related increase in temperature after inoculation, with little variance, whereas non-handled rabbits did not. Effects of experimental and routine procedures on body temperature were significantly reduced under modified conditions and were within the normal T(b) range. Handled animals reacted less strongly, and with less variance, to experimental procedures such as blood sampling, injection and cage-cleaning than non-handled rabbits. Overall, handling had a positive effect on the behaviour of the animals. The data show that the housing modifications provide a more robust model for monitoring pWCV adverse effects. Furthermore, conditioning and habituation of rabbits to humans reduce the variation in experimental outcome, which might allow for a reduction in the number of animals used. In addition, this also reduces distress and thus contributes to refining this animal model.

  3. Thermodynamic evaluation of transonic compressor rotors using the finite volume approach

    NASA Technical Reports Server (NTRS)

    Moore, John; Nicholson, Stephen; Moore, Joan G.

    1986-01-01

    The development of a computational capability to handle viscous flow with an explicit time-marching method based on the finite volume approach is summarized. Emphasis is placed on the extensions to the computational procedure which allow the handling of shock induced separation and large regions of strong backflow. Appendices contain abstracts of papers and whole reports generated during the contract period.
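
    The explicit time-marching finite-volume idea is easiest to see in one dimension. The sketch below is a deliberately minimal illustration - linear advection with first-order upwind fluxes on a periodic grid - far simpler than the viscous transonic flows of the report, but it shows the conservative cell-average update on which the approach is built; all names and values are assumptions.

```python
import numpy as np

def advect_upwind(u, a=1.0, dx=0.01, cfl=0.8, steps=200):
    """Explicit time-marching finite-volume solver for 1-D linear
    advection u_t + a u_x = 0 with first-order upwind interface
    fluxes on a periodic grid of cell averages u."""
    dt = cfl * dx / abs(a)
    for _ in range(steps):
        flux = a * u
        # Upwind interface flux: for a > 0, interface i+1/2 takes the
        # value from cell i (the cell upwind of the interface).
        f_iface = flux if a > 0 else np.roll(flux, -1)
        # Conservative update: each cell average changes only by the
        # difference of the fluxes through its two interfaces.
        u = u - (dt / dx) * (f_iface - np.roll(f_iface, 1))
    return u

# A Gaussian pulse advecting around a periodic domain.
x = np.linspace(0.0, 1.0, 100, endpoint=False)
u = advect_upwind(np.exp(-200.0 * (x - 0.3) ** 2))
```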

  4. Procedures to handle inventory cluster plots that straddle two or more conditions

    Treesearch

    Jerold T. Hahn; Colin D. MacLean; Stanford L. Arner; William A. Bechtold

    1995-01-01

    We review the relative merits and field procedures for four basic plot designs to handle forest inventory plots that straddle two or more conditions, given that subplots will not be moved. A cluster design is recommended that combines fixed-area subplots and variable-radius plot (VRP) sampling. Each subplot in a cluster consists of a large fixed-area subplot for...

  5. 76 FR 58038 - Notice of Inventory Completion: U.S. Department of the Interior, Bureau of Indian Affairs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-19

    ... iron handle, 1 iron handle fragment, 1 iron bowl fragment, 2 iron keys, 1 iron hinge, 1 iron gun hammer, 2 iron gun pieces, 1 fish hook, 12 nails, 1 iron ring, 1 coffee mill, 1 possible iron file, 1 large iron tack, 4 iron rods, 3 unidentified iron fragments, 1 metal tube, 1 scissors fragment, 1 finial...

  6. Morphometric characteristics of the metacestode Echinococcus vogeli Rausch & Bernstein, 1972 in human infections from the northern region of Brazil.

    PubMed

    Almeida, F; Oliveira, F; Neves, R; Siqueira, N; Rodrigues-Silva, R; Daipert-Garcia, D; Machado-Silva, J R

    2015-07-01

    Polycystic echinococcosis, caused by the larval stage (metacestode) of the small-sized tapeworm, Echinococcus vogeli, is an emerging parasitic zoonosis of great public health concern in the humid tropical rainforests of South and Central America. Because morphological and morphometric characteristics of the metacestode are not well known, hydatid cysts from the liver and the mesentery were examined from patients following surgical procedures. Whole mounts of protoscoleces with rostellar hooks were examined under light and confocal laser scanning microscopy. Measurements were made of both large and small hooks, including the total area, total length, total width, blade area, blade length, blade width, handle area, handle length and handle width. The results confirmed the 1:1 arrangement of hooks in the rostellar pad and indicated, for the first time, that the morphometry of large and small rostellar hooks varies depending upon the site of infection. Light and confocal microscopy images displayed clusters of calcareous corpuscles in the protoscoleces. In conclusion, morphological features of large and small rostellar hooks of E. vogeli are adapted to a varied environment within the vertebrate host and such morphological changes in calcareous corpuscles occur at different stages in the maturation of metacestodes.

  7. In-hardware demonstration of model-independent adaptive tuning of noisy systems with arbitrary phase drift

    DOE PAGES

    Scheinker, Alexander; Baily, Scott; Young, Daniel; ...

    2014-08-01

    In this work, an implementation of a recently developed model-independent adaptive control scheme, for tuning uncertain and time varying systems, is demonstrated on the Los Alamos linear particle accelerator. The main benefits of the algorithm are its simplicity, its ability to handle an arbitrary number of components without increased complexity, and its extreme robustness to measurement noise, a property which is both analytically proven and demonstrated in the experiments performed. We report on the application of this algorithm for simultaneous tuning of two buncher radio frequency (RF) cavities, in order to maximize beam acceptance into the accelerating electromagnetic field cavities of the machine, with the tuning based only on a noisy measurement of the surviving beam current downstream from the two bunching cavities. The algorithm automatically responds to arbitrary phase shift of the cavity phases, automatically re-tuning the cavity settings and maximizing beam acceptance. Because it is model independent, it can be utilized for continuous adaptation to time-variation of a large system, such as due to thermal drift or damage to components, in which the remaining, functional components would be automatically re-tuned to compensate for the failing ones. We start by discussing the general model-independent adaptive scheme and how it may be digitally applied to a large class of multi-parameter uncertain systems, and then present our experimental results.
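
    A simplified discrete-time sketch in the spirit of the scheme described (not the authors' implementation) is shown below: each parameter dithers at its own frequency, the measured noisy cost modulates the dither phase, and on average the parameters drift toward settings that minimize the cost. The cost function and all gains here are hypothetical stand-ins.

```python
import numpy as np

def noisy_cost(theta, rng):
    """Hypothetical stand-in for a measured cost, e.g. the negative of
    surviving beam current; minimized at the (unknown) optimum."""
    optimum = np.array([0.7, -0.3])
    return float(np.sum((theta - optimum) ** 2) + 0.05 * rng.standard_normal())

rng = np.random.default_rng(0)
theta = np.zeros(2)                    # two cavity settings, e.g. phases
omega = np.array([60.0, 75.0])         # distinct dither frequencies
k, alpha, dt = 4.0, 0.05, 1e-3         # feedback gain, dither size, time step

for n in range(20000):
    C = noisy_cost(theta, rng)
    # Each parameter oscillates at its own frequency; the measured cost
    # modulates the phase, yielding average descent on C without a model.
    theta = theta + dt * np.sqrt(alpha * omega) * np.cos(omega * n * dt + k * C)

print(theta)   # drifts toward the optimum despite the measurement noise
```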

  8. Global Weirding? - Using Very Large Ensembles and Extreme Value Theory to assess Changes in Extreme Weather Events Today

    NASA Astrophysics Data System (ADS)

    Otto, F. E. L.; Mitchell, D.; Sippel, S.; Black, M. T.; Dittus, A. J.; Harrington, L. J.; Mohd Saleh, N. H.

    2014-12-01

    A shift in the distribution of socially relevant climate variables, such as daily minimum winter temperatures and daily precipitation extremes, has been attributed to anthropogenic climate change for various mid-latitude regions. However, while there are many process-based arguments suggesting a change in the shape of these distributions as well, attribution studies demonstrating this have not yet been undertaken. Here we use a very large initial-condition ensemble of ~40,000 members simulating the European winter 2013/2014 using the distributed computing infrastructure of the weather@home project. Two separate scenarios are used: (1) current climate conditions, and (2) a counterfactual scenario of a "world that might have been" without anthropogenic forcing. Focusing specifically on extreme events, we assess how the estimated parameters of the Generalized Extreme Value (GEV) distribution vary depending on variable type, sampling frequency (daily, monthly, …) and geographical region. We find that the location parameter changes for most variables but, depending on the region and variable, we also find significant changes in the scale and shape parameters. The very large ensemble furthermore allows us to assess whether such findings in the fitted GEV distributions are consistent with an empirical analysis of the model data, and whether the most extreme data still follow a known underlying distribution that in a small sample might otherwise be dismissed as an outlier. The ~40,000-member ensemble is simulated using 12 different SST patterns (1 'observed', and 11 best guesses of SSTs with no anthropogenic warming). The range in SSTs, along with the corresponding changes in the NAO and high-latitude blocking, informs on the dynamics governing some of these extreme events. While strong tele-connection patterns are not found in this particular experiment, the high number of simulated extreme events allows for a more thorough analysis of the dynamics than has been performed before. Combining extreme value theory with very large ensemble simulations therefore allows us to understand the dynamics of changes in extreme events, which is not possible using extreme value theory alone; it also shows in which cases statistics from smaller ensembles give results as valid as those from very large initial-condition ensembles.
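
    The per-scenario GEV analysis reduces to fitting the distribution to each ensemble and comparing parameters and exceedance probabilities. The sketch below is a hypothetical illustration with synthetic data (using scipy's genextreme, whose shape parameter c is the negative of the conventional ξ), not the weather@home output; the threshold and parameter values are invented.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical ensembles of winter-maximum daily precipitation (mm):
# "actual" conditions versus the counterfactual world without
# anthropogenic forcing. Real input would be weather@home output.
rng = np.random.default_rng(0)
samples = {
    "actual": genextreme.rvs(c=-0.1, loc=30.0, scale=8.0,
                             size=40000, random_state=rng),
    "natural": genextreme.rvs(c=-0.1, loc=27.0, scale=7.0,
                              size=40000, random_state=rng),
}

threshold = 60.0   # a fixed extreme event, e.g. 60 mm in a day
prob = {}
for name, data in samples.items():
    # Maximum-likelihood GEV fit; scipy's c equals -xi (the usual shape).
    c, loc, scale = genextreme.fit(data)
    prob[name] = genextreme.sf(threshold, c, loc=loc, scale=scale)
    print(f"{name}: shape={-c:.3f} loc={loc:.2f} scale={scale:.2f} "
          f"P(>{threshold:g} mm)={prob[name]:.5f}")

# Risk ratio: how much more probable the event is in the actual world.
print("risk ratio:", prob["actual"] / prob["natural"])
```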

  9. A dynamical systems approach to studying midlatitude weather extremes

    NASA Astrophysics Data System (ADS)

    Messori, Gabriele; Caballero, Rodrigo; Faranda, Davide

    2017-04-01

    Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. The ability to predict these events is therefore a topic of crucial importance. Here we propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We show that simple dynamical systems metrics can be used to identify sets of large-scale atmospheric flow patterns with similar spatial structure and temporal evolution on time scales of several days to a week. In regions where these patterns favor extreme weather, they afford a particularly good predictability of the extremes. We specifically test this technique on the atmospheric circulation in the North Atlantic region, where it provides predictability of large-scale wintertime surface temperature extremes in Europe up to 1 week in advance.
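
    One such simple dynamical systems metric is the local dimension obtained from the extreme-value statistics of recurrences of the atmospheric state. The sketch below is a hypothetical illustration of that general recipe, with all inputs assumed rather than taken from the study: days whose large-scale pattern has a low local dimension recur in similar form and are therefore more predictable.

```python
import numpy as np

def local_dimensions(states, q=0.98):
    """Local dimension of each state from extreme-value statistics of
    recurrences: with g = -log(distance to state i), the mean excess
    of g above its q-quantile estimates 1/d_i. Rows of `states` are
    days; columns are a flattened field such as SLP anomalies."""
    n = len(states)
    d = np.empty(n)
    for i in range(n):
        dist = np.linalg.norm(states - states[i], axis=1)
        g = -np.log(dist[dist > 0.0])       # drop the zero self-distance
        u = np.quantile(g, q)
        d[i] = 1.0 / np.mean(g[g > u] - u)
    return d

# Toy usage: 1,000 synthetic daily maps; low-dimension days indicate
# persistent, more predictable large-scale patterns.
fields = np.random.default_rng(3).standard_normal((1000, 400))
dims = local_dimensions(fields)
```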

  10. Knotty: Efficient and Accurate Prediction of Complex RNA Pseudoknot Structures.

    PubMed

    Jabbari, Hosna; Wark, Ian; Montemagno, Carlo; Will, Sebastian

    2018-06-01

    The computational prediction of RNA secondary structure by free energy minimization has become an important tool in RNA research. However, in practice, energy minimization is mostly limited to pseudoknot-free structures or rather simple pseudoknots, not covering many biologically important structures such as kissing hairpins. Algorithms capable of predicting sufficiently complex pseudoknots (for sequences of length n) used to have extreme complexities, e.g. Pknots (Rivas and Eddy, 1999) has O(n^6) time and O(n^4) space complexity. The algorithm CCJ (Chen et al., 2009) dramatically improves the asymptotic run time for predicting complex pseudoknots (handling almost all relevant pseudoknots, while being slightly less general than Pknots), but this came at the cost of large constant factors in space and time, which strongly limited its practical application (∼200 bases already require 256GB space). We present a CCJ-type algorithm, Knotty, that handles the same comprehensive pseudoknot class of structures as CCJ with improved space complexity of Θ(n^3 + Z); due to the applied technique of sparsification, the number of "candidates", Z, appears to grow significantly slower than n^4 on our benchmark set (which includes pseudoknotted RNAs up to 400 nucleotides). In terms of run time over this benchmark, Knotty clearly outperforms Pknots and the original CCJ implementation, CCJ 1.0; Knotty's space consumption fundamentally improves over CCJ 1.0, being on a par with the space-economic Pknots. By comparing to CCJ 2.0, our unsparsified Knotty variant, we demonstrate the isolated effect of sparsification. Moreover, Knotty employs the state-of-the-art energy model of "HotKnots DP09", which results in superior prediction accuracy over Pknots. Our software is available at https://github.com/HosnaJabbari/Knotty. will@tbi.unvie.ac.at. Supplementary data are available at Bioinformatics online.

  11. TrustRank: a Cold-Start tolerant recommender system

    NASA Astrophysics Data System (ADS)

    Zou, Haitao; Gong, Zhiguo; Zhang, Nan; Zhao, Wei; Guo, Jingzhi

    2015-02-01

    The explosive growth of the World Wide Web has led to the fast-advancing development of e-commerce techniques. Recommender systems, which use personalised information filtering techniques to generate a set of items suitable for a given user, have received considerable attention. User- and item-based algorithms are two popular techniques for the design of recommender systems. These two algorithms are known to have Cold-Start problems, i.e., they are unable to effectively handle Cold-Start users, who have an extremely limited number of purchase records. In this paper, we develop TrustRank, a novel recommender system which handles the Cold-Start problem by leveraging the user-trust networks which are commonly available for e-commerce applications. A user-trust network is formed by friendships or trust relationships that users specify among themselves. While it is straightforward to conjecture that a user-trust network is helpful for improving the accuracy of recommendations, a key challenge in using a user-trust network to help Cold-Start users is that these users also tend to have a very limited number of trust relationships. To address this challenge, we propose a pre-processing propagation of the Cold-Start users' trust network. In particular, by applying the personalised PageRank algorithm, we expand the friends of a given user to include others with purchase records similar to his/her original friends. To make this propagation algorithm scalable to a large number of users, as required by real-world recommender systems, we devise an iterative computation algorithm for the original personalised TrustRank which can incrementally compute trust vectors for Cold-Start users. We conduct extensive experiments to demonstrate the consistent improvement provided by our proposed algorithm over existing recommender algorithms in the accuracy of recommendations for Cold-Start users.
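
    The propagation step rests on personalised PageRank over the trust network, with the restart vector concentrated on the Cold-Start user's declared friends. The sketch below is a hypothetical illustration by straightforward power iteration, not the paper's incremental algorithm; the toy adjacency matrix and all parameter values are invented.

```python
import numpy as np

def personalized_pagerank(adj, seeds, alpha=0.85, n_iter=100):
    """Power iteration for personalized PageRank on a trust network.
    adj[i, j] = 1 if user i trusts user j; the restart mass is spread
    over the Cold-Start user's declared friends (the seeds)."""
    out_deg = adj.sum(axis=1, keepdims=True)
    # Row-stochastic transition matrix; users without outgoing trust
    # edges simply contribute no outgoing mass in this sketch.
    P = np.divide(adj, out_deg, out=np.zeros_like(adj), where=out_deg > 0)
    restart = np.zeros(len(adj))
    restart[seeds] = 1.0 / len(seeds)
    r = restart.copy()
    for _ in range(n_iter):
        r = alpha * (r @ P) + (1.0 - alpha) * restart
    return r

# Toy trust network: user 0 is Cold-Start and declares only user 1.
adj = np.array([[0, 1, 0, 0],
                [0, 0, 1, 1],
                [0, 0, 0, 1],
                [0, 1, 0, 0]], dtype=float)
scores = personalized_pagerank(adj, seeds=[1])
expanded = np.argsort(scores)[::-1]   # users 2 and 3 become candidate friends
```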

  12. Quantifying the effect of editor-author relations on manuscript handling times.

    PubMed

    Sarigöl, Emre; Garcia, David; Scholtes, Ingo; Schweitzer, Frank

    2017-01-01

    In this article we study to what extent the academic peer review process is influenced by social relations between the authors of a manuscript and the editor handling the manuscript. Taking the open access journal PlosOne as a case study, our analysis is based on a data set of more than 100,000 articles published between 2007 and 2015. Using available data on handling editor, submission and acceptance time of manuscripts, we study the question whether co-authorship relations between authors and the handling editor affect the manuscript handling time, i.e., the time taken between the submission and acceptance of a manuscript. Our analysis reveals (1) that editors handle papers co-authored by previous collaborators significantly more often than expected at random, and (2) that such prior co-author relations are significantly related to faster manuscript handling. Addressing the question whether these shorter manuscript handling times can be explained by the quality of publications, we study the number of citations and downloads which accepted papers eventually accumulate. Moreover, we consider the influence of additional (social) factors, such as the editor's experience, the topical similarity between authors and editors, as well as reciprocal citation relations between authors and editors. Our findings show that, even when correcting for other factors like time, experience, and performance, prior co-authorship relations have a large and significant influence on manuscript handling times, speeding up the editorial decision on average by 19 days.

  13. Pilot-Induced Oscillation Prediction With Three Levels of Simulation Motion Displacement

    NASA Technical Reports Server (NTRS)

    Schroeder, Jeffery A.; Chung, William W. Y.; Tran, Duc T.; Laforce, Soren; Bengford, Norman J.

    2001-01-01

    Simulator motion platform characteristics were examined to determine if the amount of motion affects pilot-induced oscillation (PIO) prediction. Five test pilots evaluated how susceptible 18 different sets of pitch dynamics were to PIOs under three different levels of simulation motion platform displacement: large, small, and none. The pitch dynamics were those of a previous in-flight experiment, some of which elicited PIOs. These in-flight results served as truth data for the simulation; as such, the in-flight experiment was replicated as closely as possible. Objective and subjective data were collected and analyzed. With large motion, PIO and handling qualities ratings matched the flight data more closely than with small motion or no motion. Also, regardless of the aircraft dynamics, large motion increased pilot confidence in assigning handling qualities ratings, reduced safety pilot trips, and lowered touchdown velocities. While both large and small motion provided a high-fidelity pitch rate cue, only large motion presented the pilot with a high-fidelity vertical acceleration cue.

  14. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. The ability to combine information from different geospatial collections is in increasing demand within the scientific community, and requires managing and manipulating data in different formats and performing operations such as map reprojection, resampling and other transformations. Due to the large data volumes inherent in these collections, storing multiple copies of them is infeasible, so such data manipulation must be performed on-the-fly using efficient, high-performance techniques. Ideally this should be done by a trusted data service using common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door to new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflows that handle different file formats and data types, or blend data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process into an independent service. An indexing service crawls data collections, either locally or remotely, extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY gives the user the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready access for users of the data via Web Map Services (WMS), Web Processing Services (WPS) or raw data arrays using Web Coverage Services (WCS). The presentation will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
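
    Because GSKY exposes standard OGC interfaces, a client needs nothing more than an HTTP GetMap request. The sketch below uses generic WMS 1.3.0 parameters; the endpoint URL, layer name and time slice are illustrative assumptions, not documented GSKY values.

```python
import requests

# Hypothetical GSKY endpoint and layer name; the request parameters
# themselves are standard OGC WMS 1.3.0.
WMS_URL = "https://gsky.example.org/ows"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "landsat8_nbar",          # illustrative layer name
    "CRS": "EPSG:4326",
    "BBOX": "-44.0,112.0,-10.0,154.0",  # lat/lon order for EPSG:4326 in 1.3.0
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2017-01-01T00:00:00Z",     # the server renders this slice on demand
}
resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("map.png", "wb") as f:
    f.write(resp.content)
```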

  15. Orbital maneuvering end effectors

    NASA Technical Reports Server (NTRS)

    Myers, W. Neill (Inventor); Forbes, John C. (Inventor); Barnes, Wayne L. (Inventor)

    1986-01-01

    This invention relates to an end effector device for grasping and maneuvering objects such as the berthing handles of a space telescope. The device includes a V-shaped capture window, defined by inclined surfaces in parallel face plates, which converges toward a retainer recess in which the handle is retained. A pivotal finger meshes with a pair of pivoted fingers which rotate in the opposite direction. The fingers rotate to pull a handle within the capture window into the recess, where latches lock the handle in place. To align the capture window, the plates may be cocked plus or minus five degrees on the base. Drive means is included in the form of a motor coupled with a harmonic drive speed reducer, which provides slow movement of the fingers at high torque so that large articles may be handled. The novelty of the invention is believed to reside in the combined intermeshing finger structure, drive means and harmonic drive speed reducer, which together provide the required maneuverability and strength.

  16. Improving Predictions and Management of Hydrological Extremes through Climate Services

    NASA Astrophysics Data System (ADS)

    van den Hurk, Bart; Wijngaard, Janet; Pappenberger, Florian; Bouwer, Laurens; Weerts, Albrecht; Buontempo, Carlo; Doescher, Ralf; Manez, Maria; Ramos, Maria-Helena; Hananel, Cedric; Ercin, Ertug; Hunink, Johannes; Klein, Bastian; Pouget, Laurent; Ward, Philip

    2016-04-01

    The EU Roadmap on Climate Services can be seen as the result of convergence between society's call for "actionable research" and the climate research community's provision of tailored data, information and knowledge. However, although weather and climate have clearly distinct definitions, there is a strong link between weather and climate services that has not been explored extensively. Stakeholders interviewed in the context of the Roadmap consider climate a far-distant, long-term feature that is difficult to account for in present-day decision making, which is dominated by daily experience of handling extreme events. It is argued that this experience is a rich source of inspiration for increasing society's resilience to an unknown future. A newly started European research project, IMPREX, is built on the notion that "experience in managing current day weather extremes is the best learning school to anticipate consequences of future climate". This paper illustrates possible ways to strengthen the link between information and services addressing weather and climate time scales by discussing the underlying concepts of IMPREX and its expected outcome.

  17. An instrument to aid tubal sterilization by laparoscopy.

    PubMed

    Siegler, A M

    1972-05-01

    A single-handled instrument, developed by Siegler for his two-incision technique, has broad biopsy capability. The shaft and handle are insulated to protect the operator from shock; the jaws rotate independently from the handle position; an O-ring seal in the cannula eliminates the need for external sealing devices for carbon dioxide maintenance; and either the cutting or coagulation power may be applied. The biopsy instrument can coagulate and biopsy both tubes without removing the forceps after treating one side since the jaws are large enough to accommodate both segments. The instrument is manufactured by the American Cystoscope Makers, Inc., Pelham Manor, New York.

  18. Changes in extremes due to half a degree warming in observations and models

    NASA Astrophysics Data System (ADS)

    Fischer, E. M.; Schleussner, C. F.; Pfleiderer, P.

    2017-12-01

    Assessing the climate impacts of half-a-degree warming increments is high on the post-Paris science agenda. Discriminating those effects is particularly challenging for climate extremes such as heavy precipitation and heat extremes, for which model uncertainties are generally large and internal variability is so important that it can easily offset or strongly amplify the forced local changes induced by half a degree of warming. Despite these challenges, we provide evidence for large-scale changes in the intensity and frequency of climate extremes due to half a degree of warming. We first assess the difference in extreme climate indicators in observational data between the 1960s-1970s and the recent past, two periods that differ by half a degree. We identify distinct differences in the global and continental-scale occurrence of heat and heavy precipitation extremes. We show that these observed changes in heavy precipitation and heat extremes broadly agree with simulated historical differences and are informative for the projected differences between 1.5 and 2°C of warming, despite the different radiative forcings. We therefore argue that evidence from the observational record can inform the debate about discernible climate impacts in the light of model uncertainty by providing a conservative estimate of the implications of 0.5°C of warming. A limitation of using the observational record arises from potential non-linearities in the response of climate extremes to a given level of warming. We test for such non-linearities in the response of heat and heavy precipitation extremes in a large ensemble of transient climate simulations. We further quantify differences between a time-window approach in a coupled-model large ensemble and time-slice experiments using prescribed SSTs performed in the context of the HAPPI-MIP project. We thereby provide several lines of evidence that half a degree of warming leads to substantial changes in the expected occurrence of heat and heavy precipitation extremes.
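
    The observational indicator comparison can be sketched in a few lines of NumPy: pick a percentile threshold from the earlier period and compare exceedance frequencies across the two periods. The synthetic temperature series below is an illustrative assumption, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative daily Tmax series for two periods separated by ~0.5 °C
# (stand-ins for the 1960s-1970s and the recent past).
tmax_early  = rng.normal(25.0, 5.0, size=20 * 365)
tmax_recent = rng.normal(25.5, 5.0, size=20 * 365)   # +0.5 °C mean shift

# Indicator: frequency of days above the early period's 99th percentile,
# i.e. how much more often "extreme heat" days occur after 0.5 °C warming.
thresh = np.percentile(tmax_early, 99)
f_early  = np.mean(tmax_early  > thresh)
f_recent = np.mean(tmax_recent > thresh)
print(f"exceedance frequency: {f_early:.4f} -> {f_recent:.4f} "
      f"(x{f_recent / f_early:.1f})")
```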

  19. Role of absorbing aerosols on hot extremes in India in a GCM

    NASA Astrophysics Data System (ADS)

    Mondal, A.; Sah, N.; Venkataraman, C.; Patil, N.

    2017-12-01

    Temperature extremes and heat waves in North-Central India during the summer months of March through June are known for causing significant impact in terms of human health, productivity and mortality. While greenhouse gas-induced global warming is generally believed to intensify the magnitude and frequency of such extremes, aerosols are usually associated with an overall cooling, by virtue of their dominant radiation scattering nature, in most world regions. Recently, large-scale atmospheric conditions leading to heat wave and extreme temperature conditions have been analysed for the North-Central Indian region. However, the role of absorbing aerosols, including black carbon and dust, is still not well understood, in mediating hot extremes in the region. In this study, we use 30-year simulations from a chemistry-coupled atmosphere-only General Circulation Model (GCM), ECHAM6-HAM2, forced with evolving aerosol emissions in an interactive aerosol module, along with observed sea surface temperatures, to examine large-scale and mesoscale conditions during hot extremes in India. The model is first validated with observed gridded temperature and reanalysis data, and is found to represent observed variations in temperature in the North-Central region and concurrent large-scale atmospheric conditions during high temperature extremes realistically. During these extreme events, changes in near surface properties include a reduction in single scattering albedo and enhancement in short-wave solar heating rate, compared to climatological conditions. This is accompanied by positive anomalies of black carbon and dust aerosol optical depths. We conclude that the large-scale atmospheric conditions such as the presence of anticyclones and clear skies, conducive to heat waves and high temperature extremes, are exacerbated by absorbing aerosols in North-Central India. Future air quality regulations are expected to reduce sulfate particles and their masking of GHG warming. It is concurrently important to mitigate emissions of warming black carbon particles, to manage future climate change-induced hot extremes.

  20. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned between the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, the need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology readiness date for the space station.

  1. Physics-based animation of large-scale splashing liquids, elastoplastic solids, and model-reduced flow

    NASA Astrophysics Data System (ADS)

    Gerszewski, Daniel James

    Physical simulation has become an essential tool in computer animation. As the use of visual effects increases, the need for simulating real-world materials increases. In this dissertation, we consider three problems in physics-based animation: large-scale splashing liquids, elastoplastic material simulation, and dimensionality reduction techniques for fluid simulation. Fluid simulation has been one of the greatest successes of physics-based animation, generating hundreds of research papers and a great many special effects over the last fifteen years. However, the animation of large-scale, splashing liquids remains challenging. We show that a novel combination of unilateral incompressibility, mass-full FLIP, and blurred boundaries is extremely well-suited to the animation of large-scale, violent, splashing liquids. Materials that incorporate both plastic and elastic deformations, also referred to as elastoplastic materials, are frequently encountered in everyday life. Methods for animating such common real-world materials are useful for effects practitioners and have been successfully employed in films. We describe a point-based method for animating elastoplastic materials. Our primary contribution is a simple method for computing the deformation gradient for each particle in the simulation. Given the deformation gradient, we can apply arbitrary constitutive models and compute the resulting elastic forces. Our method has two primary advantages: we do not store or compare to an initial rest configuration, and we work directly with the deformation gradient. The first advantage avoids poor numerical conditioning and the second naturally leads to a multiplicative model of deformation appropriate for finite deformations. One of the most significant drawbacks of physics-based animation is that ever-higher fidelity leads to an explosion in the number of degrees of freedom. This problem leads us to the consideration of dimensionality reduction techniques. We present several enhancements to model-reduced fluid simulation that allow improved simulation bases and two-way solid-fluid coupling. Specifically, we present a basis enrichment scheme that allows us to combine data-driven or artistically derived bases with more general analytic bases derived from Laplacian eigenfunctions. Additionally, we handle two-way solid-fluid coupling in a time-splitting fashion: we alternately timestep the fluid and rigid body simulators, while taking into account the effects of the fluid on the rigid bodies and vice versa. We employ the vortex panel method to handle solid-fluid coupling and use dynamic pressure to compute the effect of the fluid on rigid bodies. Taken together, these contributions have advanced the state of the art in physics-based animation and are practical enough to be used in production pipelines.
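
    The per-particle deformation gradient idea can be sketched as follows: estimate the local velocity gradient by a weighted least-squares fit over neighboring particles, then update the deformation gradient multiplicatively. This is a generic moving-least-squares construction under assumed Gaussian weights, not necessarily the dissertation's exact formulation.

```python
import numpy as np

def velocity_gradient(x, v, i, neighbors, h=1.0):
    """Weighted least-squares estimate of the velocity gradient at
    particle i from its neighbors' relative positions and velocities.

    x, v      : (n, 3) arrays of particle positions and velocities
    neighbors : indices of particles near particle i
    h         : kernel width of the (assumed) Gaussian weights
    """
    A = np.zeros((3, 3))
    B = np.zeros((3, 3))
    for j in neighbors:
        dx = x[j] - x[i]
        dv = v[j] - v[i]
        w = np.exp(-np.dot(dx, dx) / h**2)
        A += w * np.outer(dv, dx)
        B += w * np.outer(dx, dx)
    return A @ np.linalg.inv(B)   # solves  dv ~ grad_v @ dx  in the LSQ sense

def update_deformation_gradient(F, grad_v, dt):
    # Multiplicative update: no rest configuration is ever stored, and
    # finite deformations compose naturally, as the abstract argues.
    return (np.eye(3) + dt * grad_v) @ F
```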

  2. Countering Insider Threats - Handling Insider Threats Using Dynamic, Run-Time Forensics

    DTIC Science & Technology

    2007-10-01

    able to handle the security policy requirements of a large organization containing many decentralized and diverse users, while being easily managed... contained in the TIF folder. Searching for any text string and sorting is supported also. The cache index file of Internet Explorer is not changed... containing thousands of malware software signatures. Separate datasets can be created for various classifications of malware such as encryption software

  3. Registration of Large Motion Blurred Images

    DTIC Science & Technology

    2016-05-09

    in handling the dynamics of the capturing system, for example, a drone. CMOS sensors, used in recent times, when employed in these cameras produce two types of blur in the captured image when there is camera motion during exposure. However, contemporary CMOS sensors employ an electronic rolling shutter (RS

  4. A Process Research Framework: The International Process Research Consortium

    DTIC Science & Technology

    2006-12-01

    P-30 How should a process for collaborative development be formulated? The development at different companies...requires some process for the actual collaboration. How should it be handled? P-31 How do we handle change? Requirements change during development...source projects employ a single-site development model in which there is no large community of testers but rather a single-site small group

  5. 49 CFR 178.915 - General Large Packaging standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... Large Packagings intended for solid hazardous materials must be sift-proof and water-resistant. (b) All... materials, the internal pressure of the contents and the stresses of normal handling and transport. A Large... without gross distortion or failure and must be positioned so as to cause no undue stress in any part of...

  6. 77 FR 16559 - Large Power Transformers From Korea: Scheduling of the Final Phase of an Antidumping Investigation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-21

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 731-TA-1189 (Final)] Large Power Transformers... transformers, provided for in subheading 8504.23.00 of the Harmonized Tariff Schedule of the United States.\\1... merchandise as ``large liquid dielectric power transformers (LPTs) having a top power handling capacity...

  7. Lux in obscuro II: photon orbits of extremal AdS black holes revisited

    NASA Astrophysics Data System (ADS)

    Tang, Zi-Yu; Ong, Yen Chin; Wang, Bin

    2017-12-01

    A large class of spherically symmetric static extremal black hole spacetimes possesses a stable null photon sphere on their horizons. For the extremal Kerr-Newman family, the photon sphere only really coincides with the horizon in the sense clarified by Doran. The condition under which a photon orbit is stable on an asymptotically flat extremal Kerr-Newman black hole horizon has recently been clarified: a sufficiently large angular momentum destabilizes the photon orbit, whereas an electric charge tends to stabilize it. We investigated the effect of a negative cosmological constant on this observation, and found the same behavior for extremal Kerr-Newman-AdS black holes in (3+1) dimensions. In (2+1) dimensions, in the presence of an electric charge, the angular momentum never becomes large enough to destabilize the photon orbit. We comment on the instabilities of black hole spacetimes with a stable photon orbit.

  8. North American Extreme Temperature Events and Related Large Scale Meteorological Patterns: A Review of Statistical Methods, Dynamics, Modeling, and Trends

    NASA Technical Reports Server (NTRS)

    Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.; hide

    2015-01-01

    The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.

  9. Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce

    NASA Astrophysics Data System (ADS)

    Farhan Husain, Mohammad; Doshi, Pankil; Khan, Latifur; Thuraisingham, Bhavani

    Handling huge amounts of data scalably has long been a matter of concern, and the same is true for semantic web data. Current semantic web frameworks lack this ability. In this paper, we describe a framework that we built using Hadoop to store and retrieve large numbers of RDF triples. We describe our schema to store RDF data in the Hadoop Distributed File System. We also present our algorithms to answer a SPARQL query. We make use of Hadoop's MapReduce framework to actually answer the queries. Our results reveal that we can store huge amounts of semantic web data in Hadoop clusters built mostly from cheap commodity-class hardware and still answer queries fast enough. We conclude that ours is a scalable framework, able to handle large amounts of RDF data efficiently.
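
    The MapReduce query-answering idea can be sketched without a Hadoop cluster: the map phase emits a join key for each matching triple pattern, the shuffle groups intermediate pairs by key, and the reduce phase outputs bindings present on both sides. The toy triples and query below are illustrative; the actual framework runs this over HDFS-resident data.

```python
from collections import defaultdict

# Toy RDF triples (subject, predicate, object).
triples = [
    ("alice", "knows", "bob"),
    ("bob",   "knows", "carol"),
    ("alice", "livesIn", "paris"),
    ("carol", "livesIn", "paris"),
]

# SPARQL-like query:  ?x knows ?y . ?y livesIn "paris"
# Map phase: one pass over all triples, emitting join-key -> tagged binding.
def map_phase(ts):
    for s, p, o in ts:
        if p == "knows":
            yield o, ("left", s)       # join on ?y = object of 'knows'
        elif p == "livesIn" and o == "paris":
            yield s, ("right", None)   # join on ?y = subject of 'livesIn'

# Shuffle: group intermediate pairs by key (Hadoop does this between phases).
groups = defaultdict(list)
for key, val in map_phase(triples):
    groups[key].append(val)

# Reduce phase: a key present on both sides yields a query answer.
for y, vals in groups.items():
    lefts = [x for tag, x in vals if tag == "left"]
    if lefts and any(tag == "right" for tag, _ in vals):
        for x in lefts:
            print({"x": x, "y": y})
```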

  10. APPHi: Automated Photometry Pipeline for High Cadence Large Volume Data

    NASA Astrophysics Data System (ADS)

    Sánchez, E.; Castro, J.; Silva, J.; Hernández, J.; Reyes, M.; Hernández, B.; Alvarez, F.; García T.

    2018-04-01

    APPHi (Automated Photometry Pipeline) carries out aperture and differential photometry of TAOS-II project data. It is computationally efficient and can also be used with other astronomical wide-field image data. APPHi works with large volumes of data and handles both FITS and HDF5 formats. Due to the large number of stars that the software has to handle in an enormous number of frames, it is optimized to automatically find the best values for the parameters of the photometry, such as the mask size for the aperture, the size of the window for extracting a single star, and the count threshold for detecting a faint star. Although intended to work with TAOS-II data, APPHi can analyze any set of astronomical images and is a robust and versatile tool for performing stellar aperture and differential photometry.
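
    A minimal sketch of the aperture photometry that APPHi automates: sum the counts in a circular aperture and subtract a sky level estimated from a surrounding annulus. The radii and the differential-photometry note below are generic assumptions, not APPHi's tuned values.

```python
import numpy as np

def aperture_flux(image, x0, y0, r_ap=3.0, r_in=5.0, r_out=8.0):
    """Background-subtracted aperture photometry for one star.

    Sums counts inside a circular aperture of radius r_ap and subtracts
    the median sky estimated in an annulus r_in..r_out (a generic scheme;
    APPHi tunes such radii automatically per frame).
    """
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    sky = np.median(image[(r >= r_in) & (r < r_out)])
    ap = r < r_ap
    return image[ap].sum() - sky * ap.sum()

# Differential photometry: dividing the target flux by a comparison star's
# flux removes atmospheric/instrumental variations shared between frames.
# ratio = aperture_flux(img, *target_xy) / aperture_flux(img, *comp_xy)
```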

  11. Optimization of an organic memristor as an adaptive memory element

    NASA Astrophysics Data System (ADS)

    Berzina, Tatiana; Smerieri, Anteo; Bernabò, Marco; Pucci, Andrea; Ruggeri, Giacomo; Erokhin, Victor; Fontana, M. P.

    2009-06-01

    The combination of memory and signal handling characteristics of a memristor makes it a promising candidate for adaptive bioinspired information processing systems. This poses stringent requirements on the basic device, such as stability and reproducibility over a large number of training/learning cycles, and a large anisotropy in the fundamental control material parameter, in our case the electrical conductivity. In this work we report results on the improved performance of electrochemically controlled polymeric memristors, where optimization of the conducting polymer (polyaniline) in the active channel and better environmental control of the fabrication methods led to a large increase both in the absolute conductivity of the partially oxidized state of polyaniline and in the on-off conductivity ratio. These improvements are crucial for the application of the organic memristor to adaptive complex signal-handling networks.

  12. Polygenic determinants in extremes of high-density lipoprotein cholesterol

    PubMed Central

    Dron, Jacqueline S.; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A.; Robinson, John F.; McIntyre, Adam D.; Ban, Matthew R.; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J.; Lettre, Guillaume; Tardif, Jean-Claude

    2017-01-01

    HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. PMID:28870971

  13. Polygenic determinants in extremes of high-density lipoprotein cholesterol.

    PubMed

    Dron, Jacqueline S; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A; Robinson, John F; McIntyre, Adam D; Ban, Matthew R; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J; Lettre, Guillaume; Tardif, Jean-Claude; Hegele, Robert A

    2017-11-01

    HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. Copyright © 2017 by the American Society for Biochemistry and Molecular Biology, Inc.

  14. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    The cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory testing is mandatory. For adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems arising from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  15. North American extreme temperature events and related large scale meteorological patterns: A review of statistical methods, dynamics, modeling, and trends

    DOE PAGES

    Grotjahn, Richard; Black, Robert; Leung, Ruby; ...

    2015-05-22

    This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.

  16. Data Validation and Sharing in a Large Research Program

    EPA Science Inventory

    Appropriate data handling practices are important in the support of large research teams with shifting and competing priorities. Determining those best practices is an ongoing effort for the US EPA’s National Aquatic Resource Surveys. We focus on the well understood data ...

  17. Drivers and seasonal predictability of extreme wind speeds in the ECMWF System 4 and a statistical model

    NASA Astrophysics Data System (ADS)

    Walz, M. A.; Donat, M.; Leckebusch, G. C.

    2017-12-01

    As extreme wind speeds are responsible for large socio-economic losses in Europe, a skillful prediction would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare them to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain show significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not capture the potential predictability of extreme winds that exists in the real world, and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements of dynamical prediction skill through improved simulation of large-scale atmospheric dynamics.
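
    A sketch of the kind of verification used here: form a seasonal extreme-wind indicator by counting days above the 95th percentile, then correlate forecast counts against reanalysis counts year by year. The synthetic data below is an illustrative stand-in for System 4 and ERA-Interim.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative winter-season daily wind speeds at one grid cell:
# 20 years of reanalysis "truth" and a toy forecast correlated with it.
years, days = 20, 90
truth = rng.weibull(2.0, size=(years, days)) * 10
fcst = 0.6 * truth + 0.4 * rng.weibull(2.0, size=(years, days)) * 10

# Extreme indicator per season: count of days above the 95th percentile
# (the climatological percentile, pooled over all seasons).
def extreme_counts(x, q=95):
    return (x > np.percentile(x, q)).sum(axis=1)

n_t, n_f = extreme_counts(truth), extreme_counts(fcst)

# Skill as the interannual correlation of seasonal extreme-day counts.
r = np.corrcoef(n_t, n_f)[0, 1]
print(f"correlation of seasonal extreme-wind counts: {r:.2f}")
```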

  18. [Handling of misconduct in Swedish research must be improved. More strict investigative procedures, tougher punishments and withdrawals are required].

    PubMed

    Linder, Stig

    2015-12-15

    Scientific misconduct constitutes a severe threat to research. Procedures to handle misconduct must therefore be both efficient and precise. In Sweden, suspected cases of misconduct are handled by the universities themselves. Investigations are generally performed by appointed scientists, leading to unnecessary discussions of the validity of the conclusions made. Sweden has a Central Ethical Review Board but this is infrequently used by the universities. It is an absolute requirement for a university to withdraw incorrect publications from the literature but regulations in this area are lacking in Sweden. The extraordinarily strong legal status of graduate students at Swedish universities leads to slow and costly investigations. Even when found to be guilty of misconduct, students are allowed to defend their PhD theses. In conclusion, there is a large potential for improvement of the regulations and routines for handling scientific misconduct in Sweden.

  19. Ground-Handling Forces on a 1/40-scale Model of the U. S. Airship "Akron."

    NASA Technical Reports Server (NTRS)

    Silverstein, Abe; Gulick, B G

    1937-01-01

    This report presents the results of full-scale wind tunnel tests conducted to determine the ground-handling forces on a 1/40-scale model of the U. S. Airship "Akron." Ground-handling conditions were simulated by establishing a velocity gradient above a special ground board in the tunnel comparable with that encountered over a landing field. The tests were conducted at Reynolds numbers ranging from 5,000,000 to 19,000,000 at each of six angles of yaw between 0 degree and 180 degrees and at four heights of the model above the ground board. The ground-handling forces vary greatly with the angle of yaw and reach large values at appreciable angles of yaw. Small changes in height, pitch, or roll did not critically affect the forces on the model. In the range of Reynolds numbers tested, no significant variation of the forces with the scale was disclosed.

  20. Persistent pulmonary hypertension of the newborn.

    PubMed

    Nair, P M C; Bataclan, Maria Flordeliz A

    2004-06-01

    This article attempts to define a complicated, yet not rare disease of the neonate, which presents with extreme hypoxemia due to increased pulmonary vascular resistance, resulting in diversion of the pulmonary venous blood through persistent fetal channels, namely ductus arteriosus and foramen ovale. Pathophysiology, diagnostic approach and the various modalities of management are analyzed. Persistent pulmonary hypertension of the newborn is multi-factorial, which is reflected in the management as well. These babies are extremely labile to hypoxia and should be stabilized with minimum handling. One hundred percent oxygen and ventilation are the mainstay of treatment. The role of hyperventilation, alkalinization, various non-specific vasodilators such as tolazoline, magnesium sulphate, selective vasodilators such as inhaled nitric oxide, adenosine and the role of high frequency oscillatory ventilation and extra corporeal membrane oxygenation are discussed. With the newer modalities of management, the outlook has improved with mortality of less than 20% and fewer long-term deficits.

  1. Lock hopper values for coal gasification plant service

    NASA Technical Reports Server (NTRS)

    Schoeneweis, E. F.

    1977-01-01

    Although the operating principle of the lock hopper system is extremely simple, valve applications involving this service in coal gasification plants are extremely difficult. The difficulties center on the requirement of handling highly erosive pulverized coal or char (either in dry or slurry form) combined with the requirement of providing tight sealing against high-pressure, possibly very hot, gas. Operating pressures and temperatures in these applications typically range up to 1600 psi (110 bar) and 600°F (316°C), with certain process requirements going even higher. In addition, and of primary concern, is the need for reliable operation over long service periods with provision for practical and economical maintenance. Currently available data indicate a requirement for something on the order of 20,000 to 30,000 open-close cycles per year and a desire to operate at least that long without valve failure.

  2. Decision strategies for handling the uncertainty of future extreme rainfall under the influence of climate change.

    PubMed

    Gregersen, I B; Arnbjerg-Nielsen, K

    2012-01-01

    Several extraordinary rainfall events have occurred in Denmark within the last few years. For each event, problems in urban areas occurred as the capacity of the existing drainage systems were exceeded. Adaptation to climate change is necessary but also very challenging as urban drainage systems are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers for the initiation and implementation of the adaptation strategies is therefore the uncertainty when predicting the magnitude of the extreme rainfall in the future. This challenge is explored through the application and discussion of three different theoretical decision support strategies: the precautionary principle, the minimax strategy and Bayesian decision support. The reviewed decision support strategies all proved valuable for addressing the identified uncertainties, at best applied together as they all yield information that improved decision making and thus enabled more robust decisions.

  3. Extreme phenophase delays and their relationship with natural forcings in Beijing over the past 260 years.

    PubMed

    Liu, Yang; Zhang, Mingqing; Fang, Xiuqi

    2018-03-20

    By merging reconstructed phenological series from published articles with observations of the China Phenology Observation Network (CPON), the first blooming date of Amygdalus davidiana (FBA) in Beijing between 1741 and 2000 is reconstructed. The Butterworth method is used to remove multi-year variations, generating a phenological series of annual variations in the first blooming date of A. davidiana. Extreme delay years in the phenological series are identified using the percentage threshold method. The characteristics of the extreme delays and their correspondence with natural forcings are analysed. The main results are as follows. In the annual phenological series, the extreme delays mostly appeared in single years; only A.D. 1800-1801, 1816-1817 and 1983-1984 were events of two consecutive extreme years. Approximately 85% of the extreme delays occurred during the 1-2 years after large volcanic eruptions (VEI ≥ 4) on the eastern or western rim of the Pacific Ocean, and the same proportion of the extreme delays followed El Niño events. About 73% of the extreme delay years fall in the valleys of sunspot cycles or in the Dalton minimum period in the year itself or the previous year. According to the certainty factor (CF), large eruptions have the greatest influence on the extreme delays; sunspot activity is second, and ENSO is last. An extreme phenological delay year is most likely to occur after a large eruption, particularly in an El Niño year whose preceding several years were in the descending portion or valley of the sunspot phase.
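
    The two processing steps named in the abstract, Butterworth filtering to isolate annual variations and a percentage threshold to flag extreme delays, can be sketched with SciPy. The cutoff period, the flagged percentage, and the synthetic blooming-date series below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(2)

# Illustrative annual first-blooming-date series (day of year), 260 years,
# with a slow multidecadal component plus year-to-year noise.
years = np.arange(1741, 2001)
slow = 5 * np.sin(2 * np.pi * (years - 1741) / 60.0)
doy = 85 + slow + rng.normal(0, 4, size=years.size)

# Butterworth high-pass: remove variations slower than ~10 years so only
# the annual (interannual) component remains, as in the reconstruction.
b, a = butter(4, 1.0 / 10.0, btype="highpass", fs=1.0)  # fs = 1 sample/yr
annual = filtfilt(b, a, doy)

# Percentage-threshold method: flag the most-delayed 10% of years as
# extreme delays (the exact percentage used in the paper may differ).
thresh = np.percentile(annual, 90)
extreme_years = years[annual > thresh]
print(extreme_years[:10])
```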

  4. Extreme phenophase delays and their relationship with natural forcings in Beijing over the past 260 years

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Zhang, Mingqing; Fang, Xiuqi

    2018-03-01

    By merging reconstructed phenological series from published articles with observations of the China Phenology Observation Network (CPON), the first blooming date of Amygdalus davidiana (FBA) in Beijing between 1741 and 2000 is reconstructed. The Butterworth method is used to remove multi-year variations, generating a phenological series of annual variations in the first blooming date of A. davidiana. Extreme delay years in the phenological series are identified using the percentage threshold method. The characteristics of the extreme delays and their correspondence with natural forcings are analysed. The main results are as follows. In the annual phenological series, the extreme delays mostly appeared in single years; only A.D. 1800-1801, 1816-1817 and 1983-1984 were events of two consecutive extreme years. Approximately 85% of the extreme delays occurred during the 1-2 years after large volcanic eruptions (VEI ≥ 4) on the eastern or western rim of the Pacific Ocean, and the same proportion of the extreme delays followed El Niño events. About 73% of the extreme delay years fall in the valleys of sunspot cycles or in the Dalton minimum period in the year itself or the previous year. According to the certainty factor (CF), large eruptions have the greatest influence on the extreme delays; sunspot activity is second, and ENSO is last. An extreme phenological delay year is most likely to occur after a large eruption, particularly in an El Niño year whose preceding several years were in the descending portion or valley of the sunspot phase.

  5. Overburdened, Overwhelmed.

    ERIC Educational Resources Information Center

    Hardy, Lawrence

    2003-01-01

    Health professionals concerned about children's mental health say schools have become more stressful places and that many students cannot handle the pressure. Factors contributing to students' stress include high-stakes testing, fear of failure, parent pressure, and large impersonal schools. To combat the effects of a large school, Venice High…

  6. Haplotype Reconstruction in Large Pedigrees with Many Untyped Individuals

    NASA Astrophysics Data System (ADS)

    Li, Xin; Li, Jing

    Haplotypes, as they specify the linkage patterns between dispersed genetic variations, provide important information for understanding the genetics of human traits. However, haplotypes are not directly available from current genotyping platforms, and hence computational methods to recover such information have been investigated extensively. Two major computational challenges arising in current family-based disease studies are large family sizes and many ungenotyped family members. Traditional haplotyping methods can handle neither large families nor families with missing members. In this paper, we propose a method which addresses these issues by integrating multiple novel techniques. The method consists of three major components: pairwise identical-by-descent (IBD) inference, global IBD reconstruction and haplotype restoring. By reconstructing the global IBD of a family from pairwise IBD and then restoring the haplotypes based on the inferred IBD, this method can scale to large pedigrees and, more importantly, can handle families with missing members. Compared with existing methods, this method demonstrates much higher power to recover haplotype information, especially in families with many untyped individuals.

  7. THE PASSAGE OF MARKED IONS INTO TISSUES AND FLUIDS OF THE REPRODUCTIVE TRACT OF PREGNANT RABBITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, J.P.; Boursnell, J.C.; Lutwak-Mann, C.

    1959-10-31

    Rapid changes were demonstrated in the uptake of labeled ions both in the developing embryo and in the endometrium, mesodermal placental folds, and other closely associated tissues and fluids following the intravenous injection of labeled ions in pregnant rabbits. Phosphorus-32, sulfur-35, sodium-24, iodine-131, and potassium-42 were used as tracers. A number of new techniques were developed to obtain, weigh, and handle the extremely small samples. The influence of exogenous materials on the early development of fetuses is discussed briefly. (C.R.)

  8. X-ray diffraction on radioactive materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiferl, D.; Roof, R.B.

    1978-01-01

    X-ray diffraction studies on radioactive materials are discussed with the aim of providing a guide to new researchers in the field. Considerable emphasis is placed on the safe handling and loading of not-too-exotic samples. Special considerations such as the problems of film blackening by the gamma rays and changes induced by the self-irradiation of the sample are covered. Some modifications of common diffraction techniques are presented. Finally, diffraction studies on radioactive samples under extreme conditions are discussed, with primary emphasis on high-pressure studies involving diamond-anvil cells.

  9. Effects of tool handle dimension and workpiece orientation and size on wrist ulnar/radial torque strength, usability and discomfort in a wrench task.

    PubMed

    Dianat, Iman; Rahimi, Soleyman; Nedaei, Moein; Asghari Jafarabadi, Mohammad; Oskouei, Ali E

    2017-03-01

    The effects of tool handle dimensions (three modified wrench designs with 30-50 mm diameter cylindrical handles and a traditional design with a rectangular cross-section (5 mm × 25 mm) handle), workpiece orientation (vertical/horizontal) and workpiece size (small/large), as well as the user's hand size, on wrist ulnar/radial (U/R) torque strength, usability and discomfort, and the relationships between these variables, were evaluated in a maximum torque task using wrenches. The highest and lowest maximal wrist U/R torque strengths were recorded for the 30 mm diameter handle and the traditional wrench design, respectively. The prototype handle with a 30 mm diameter, together with the 40 mm diameter handle, was also better than the other designs, as these received higher usability ratings and caused less discomfort. The mean wrist torque strength exerted on a vertically oriented workpiece (in the sagittal plane) was 23.8% higher than that exerted on a horizontally oriented one (in the transverse plane). The user's hand size had no effect on torque exertions. Wrist torque strength and usability were negatively correlated with hand and finger discomfort ratings. The results are also discussed in terms of their implications for hand tool and workstation configuration in torque tasks involving wrenches. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Dynamical analysis of extreme precipitation in the US northeast based on large-scale meteorological patterns

    NASA Astrophysics Data System (ADS)

    Agel, Laurie; Barlow, Mathew; Colby, Frank; Binder, Hanin; Catto, Jennifer L.; Hoell, Andrew; Cohen, Judah

    2018-05-01

    Previous work has identified six large-scale meteorological patterns (LSMPs) of dynamic tropopause height associated with extreme precipitation over the Northeast US, with extreme precipitation defined as the top 1% of daily station precipitation. Here, we examine the three-dimensional structure of the tropopause LSMPs in terms of circulation and factors relevant to precipitation, including moisture, stability, and synoptic mechanisms associated with lifting. Within each pattern, the link between the different factors and extreme precipitation is further investigated by comparing the relative strength of the factors between days with and without the occurrence of extreme precipitation. The six tropopause LSMPs include two ridge patterns, two eastern US troughs, and two troughs centered over the Ohio Valley, with a strong seasonality associated with each pattern. Extreme precipitation in the ridge patterns is associated with both convective mechanisms (instability combined with moisture transport from the Great Lakes and Western Atlantic) and synoptic forcing related to Great Lakes storm tracks and embedded shortwaves. Extreme precipitation associated with eastern US troughs involves intense southerly moisture transport and strong quasi-geostrophic forcing of vertical velocity. Ohio Valley troughs are associated with warm fronts and intense warm conveyor belts that deliver large amounts of moisture ahead of storms, but little direct quasi-geostrophic forcing. Factors that show the largest difference between days with and without extreme precipitation include integrated moisture transport, low-level moisture convergence, warm conveyor belts, and quasi-geostrophic forcing, with the relative importance varying between patterns.

  11. Integrated Power Adapter: Isolated Converter with Integrated Passives and Low Material Stress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-09-01

    ADEPT Project: CPES at Virginia Tech is developing an extremely efficient power converter that could be used in power adapters for small, lightweight laptops and other types of mobile electronic devices. Power adapters convert electrical energy into useable power for an electronic device, and they currently waste a lot of energy when they are plugged into an outlet to power up. CPES at Virginia Tech is integrating high-density capacitors, new magnetic materials, high-frequency integrated circuits, and a constant-flux transformer to create its efficient power converter. The high-density capacitors enable the power adapter to store more energy. The new magnetic materials also increase energy storage, and they can be precisely dispensed using a low-cost ink-jet printer, which keeps costs down. The high-frequency integrated circuits can handle more power, and they can handle it more efficiently. And, the constant-flux transformer processes a consistent flow of electrical current, which makes the converter more efficient.

  12. Maximising Longwall performance and profitability by the introduction of moving car bunker systems into outbye haulage layouts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barlow, P.M.

    1996-12-31

    In order to maximize the profitability of Longwall mining operations, outbye materials handling systems should be cost-effectively engineered both to fully harmonize with cyclical surges in output and to maintain high levels of availability. With the introduction of new Longwall systems into existing mines, upgrading outbye haulage to handle peak production from the Longwall panel can prove extremely expensive and is often unnecessary. In addition, despite high levels of investment in new equipment, many mines are still failing to achieve planned gains in Longwall productivity due to persistent downtime on outbye haulage routes. This paper details the planning, production and engineering considerations necessary to maximize profitability by the introduction of moving car bunker systems to underground haulage layouts. It also examines case studies of recent installations within the North American coal mining industry and charts the significant success being achieved in improving profitability at these mines.

  13. Detection and recognition of uneaten fish food pellets in aquaculture using image processing

    NASA Astrophysics Data System (ADS)

    Liu, Huanyu; Xu, Lihong; Li, Dawei

    2015-03-01

    The waste of fish food has always been a serious problem in aquaculture. On one hand, leftover fish food represents a large waste for the aquaculture industry, because fish food accounts for a large proportion of the investment. On the other hand, leftover fish food may pollute the water and make fish sick. In general, the reason for fish food waste is that there is no feedback about the consumption of delivered fish food after feeding, so it is extremely difficult for fish farmers to determine the amount of feedstuff that should be delivered each time and the feeding intervals. In this paper, we propose an effective method using image processing techniques to solve this problem. During feeding events, we use an underwater camera with supplementary LED lights to obtain images of uneaten fish food pellets on the tank bottom. An algorithm is then developed to count the remaining pellets using adaptive Otsu thresholding and a linear-time component labeling algorithm. The proposed algorithm proves effective in handling non-uniform lighting, and highly accurate pellet counts are obtained in experiments.
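
    A sketch of this detection pipeline using OpenCV: flatten the non-uniform illumination, apply Otsu thresholding, and count components of plausible pellet size via linear-time connected-component labeling. The file name, blur kernel and area bounds are illustrative assumptions, and depending on pellet/background contrast the threshold may need inverting in practice.

```python
import cv2

# Load an underwater frame of the tank bottom (path is illustrative).
img = cv2.imread("tank_bottom.png", cv2.IMREAD_GRAYSCALE)

# Even out non-uniform LED illumination by subtracting a heavily blurred
# background estimate before thresholding.
background = cv2.GaussianBlur(img, (51, 51), 0)
flat = cv2.subtract(img, background)

# Otsu's method picks the global threshold automatically.
_, binary = cv2.threshold(flat, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Linear-time connected-component labeling; stats include blob areas.
n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary)

# Count blobs whose area is plausible for a single food pellet
# (area bounds are illustrative and would be calibrated per setup).
areas = stats[1:, cv2.CC_STAT_AREA]          # skip label 0 (background)
pellets = int(((areas > 20) & (areas < 400)).sum())
print(f"uneaten pellets detected: {pellets}")
```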

  14. Reduced-order surrogate models for Green's functions in black hole spacetimes

    NASA Astrophysics Data System (ADS)

    Galley, Chad; Wardell, Barry

    2016-03-01

    The fundamental nature of linear wave propagation in curved spacetime is encoded in the retarded Green's function (or propagator). Green's functions are useful tools because almost any field quantity of interest can be computed via convolution integrals with a source. In addition, perturbation theories involving nonlinear wave propagation can be expressed in terms of multiple convolutions of the Green's function. Recently, numerical solutions for propagators in black hole spacetimes have been found that are globally valid and accurate for computing physical quantities. However, the data generated is too large for practical use because the propagator depends on two spacetime points that must be sampled finely to yield accurate convolutions. I describe how to build a reduced-order model that can be evaluated as a substitute, or surrogate, for solutions of the curved spacetime Green's function equation. The resulting surrogate accurately and quickly models the original and out-of-sample data. I discuss applications of the surrogate, including self-consistent evolutions and waveforms of extreme mass ratio binaries. Green's function surrogate models provide a new and practical way to handle many old problems involving wave propagation and motion in curved spacetimes.
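
    The reduced-order idea can be sketched with a plain SVD: assemble snapshots of the Green's function over many source points, truncate the singular value decomposition, and keep only the basis plus smoothly varying coefficients. The analytic snapshot matrix below is an illustrative stand-in for the numerically computed propagator data.

```python
import numpy as np

# Snapshot matrix: each column samples the Green's function on a grid for
# one source point (an analytic stand-in for the numerical propagator data).
src = np.linspace(0.0, 1.0, 200)
grid = np.linspace(0.0, 1.0, 2000)[:, None]
snapshots = np.exp(-((grid - src) ** 2) / 0.01) * np.cos(40.0 * grid * src)

# Reduced basis: truncate the SVD where singular values drop below a
# relative tolerance, leaving far fewer columns than raw snapshots.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
rank = max(int((s / s[0] > 1e-6).sum()), 1)
basis = U[:, :rank]

# Surrogate = basis x coefficients. The coefficients vary smoothly with
# the source point, so they can be fit offline and evaluated quickly.
coeffs = basis.T @ snapshots
recon = basis @ coeffs
err = np.linalg.norm(recon - snapshots) / np.linalg.norm(snapshots)
print(f"rank-{rank} surrogate, relative reconstruction error {err:.2e}")
```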

  15. Real-Time Processing of Continuous Physiological Signals in a Neurocritical Care Unit on a Stream Data Analytics Platform.

    PubMed

    Bai, Yong; Sow, Daby; Vespa, Paul; Hu, Xiao

    2016-01-01

    Continuous high-volume and high-frequency brain signals such as intracranial pressure (ICP) and electroencephalographic (EEG) waveforms are commonly collected by bedside monitors in neurocritical care. While such signals often carry early signs of neurological deterioration, detecting these signs in real time with conventional data processing methods mainly designed for retrospective analysis has been extremely challenging. Such methods are not designed to handle the large volumes of waveform data produced by bedside monitors. In this pilot study, we address this challenge by building a prototype system using the IBM InfoSphere Streams platform, a scalable stream computing platform, to detect unstable ICP dynamics in real time. The system continuously receives electrocardiographic and ICP signals and analyzes ICP pulse morphology looking for deviations from a steady state. We also designed a Web interface to display in real time the result of this analysis in a Web browser. With this interface, physicians are able to ubiquitously check on the status of their patients and gain direct insight into and interpretation of the patient's state in real time. The prototype system has been successfully tested prospectively on live hospitalized patients.
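
    Independently of the IBM InfoSphere Streams API, the detection step can be pictured as a rolling-baseline test on a stream of per-beat ICP morphology metrics. The window length and z-score threshold below are illustrative assumptions, not the system's actual configuration.

      # Generic sketch of streaming deviation detection (not the InfoSphere
      # Streams API): flag beats that deviate from a rolling steady-state baseline.
      from collections import deque
      import statistics

      def detect_unstable(beat_metrics, window=300, z_thresh=4.0):
          """Yield (index, value) for beats deviating from the rolling baseline."""
          history = deque(maxlen=window)
          for i, x in enumerate(beat_metrics):
              if len(history) >= 30:                   # need a minimal baseline first
                  mu = statistics.fmean(history)
                  sd = statistics.pstdev(history) or 1e-9
                  if abs(x - mu) / sd > z_thresh:
                      yield i, x
                      continue                         # keep outliers out of the baseline
              history.append(x)

    In a deployment, such a generator would be fed directly from the bedside monitor interface, with its alerts forwarded to the Web display.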

  16. Modeling The Atmosphere In The Era Of Big Data From Extremely Wide Field-Of-View Telescopes

    NASA Astrophysics Data System (ADS)

    Gonzalez Quiles, Junellie; Nordin, Jakob

    2018-01-01

    Surveys like the Sloan Digital Sky Survey (SDSS), Pan-STARRS, and the Palomar Transient Factory (PTF) collect large amounts of data, which need to be processed and calibrated to correct for various factors. One of the limiting factors in obtaining high-quality data is the atmosphere, and it is therefore essential to find an appropriate calibration for atmospheric extinction. A physical atmospheric model is expected to be more effective than the photometric calibration currently used by PTF, because it can account for rapid atmospheric fluctuations and for objects of different colors. We focused on creating tools to model atmospheric extinction for the upcoming Zwicky Transient Facility (ZTF). To model the atmosphere, we created a program that combines input data with catalogue values and handles them efficiently. Then, using PTF data and the SDSS catalogue, we created several models to fit the data and tested the quality of the fits by chi-square minimization. This will allow us to optimize atmospheric extinction corrections for ZTF in the near future. A simplified stand-in for such a fit is sketched below.
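
    A simplified stand-in for the fitting step is a chi-square minimization of the classic photometric calibration model, in which the instrumental-minus-catalogue magnitude is a zero point plus extinction and color terms. The arrays below are assumed inputs; the ZTF-oriented physical atmospheric model is richer than this.

      # Simplified calibration fit: m_inst - m_cat = zp + k * airmass + c * color.
      import numpy as np
      from scipy.optimize import curve_fit

      def model(X, zp, k, c):
          airmass, color = X
          return zp + k * airmass + c * color

      airmass = np.array([1.0, 1.2, 1.5, 1.8, 2.1])        # assumed observations
      color   = np.array([0.3, 0.5, 0.1, 0.8, 0.4])        # e.g. g-r from SDSS
      dm      = np.array([0.21, 0.26, 0.31, 0.38, 0.43])   # m_inst - m_catalogue
      sigma   = np.full_like(dm, 0.02)                     # photometric errors

      popt, pcov = curve_fit(model, (airmass, color), dm, sigma=sigma,
                             absolute_sigma=True)
      chi2 = np.sum(((dm - model((airmass, color), *popt)) / sigma) ** 2)
      print("zp, k, color term:", popt, " chi2:", chi2)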

  17. Impact of an extreme climatic event on community assembly.

    PubMed

    Thibault, Katherine M; Brown, James H

    2008-03-04

    Extreme climatic events are predicted to increase in frequency and magnitude, but their ecological impacts are poorly understood. Such events are large, infrequent, stochastic perturbations that can change the outcome of entrained ecological processes. Here we show how an extreme flood event affected a desert rodent community that has been monitored for 30 years. The flood (i) caused catastrophic, species-specific mortality; (ii) eliminated the incumbency advantage of previously dominant species; (iii) reset long-term population and community trends; (iv) interacted with competitive and metapopulation dynamics; and (v) resulted in rapid, wholesale reorganization of the community. This and a previous extreme rainfall event were punctuational perturbations: they caused large, rapid population- and community-level changes that were superimposed on a background of more gradual trends driven by climate and vegetation change. Captured by chance through long-term monitoring, the impacts of such large, infrequent events provide unique insights into the processes that structure ecological communities.

  18. Uvf - Unified Volume Format: A General System for Efficient Handling of Large Volumetric Datasets.

    PubMed

    Krüger, Jens; Potter, Kristin; Macleod, Rob S; Johnson, Christopher

    2008-01-01

    With the continual increase in computing power, volumetric datasets with sizes ranging from only a few megabytes to petascale are generated thousands of times per day. Such data may come from an ordinary source such as simple everyday medical imaging procedures, while larger datasets may be generated from cluster-based scientific simulations or measurements of large-scale experiments. In computer science, an incredible amount of work worldwide is put into the efficient visualization of these datasets. As researchers in the field of scientific visualization, we often have to face the task of handling very large data from various sources. This data usually comes in many different data formats. In medical imaging, the DICOM standard is well established; however, most research labs use their own data formats to store and process data. To simplify the task of reading the many different formats used with all of the different visualization programs, we present a system for the efficient handling of many types of large scientific datasets (see Figure 1 for just a few examples). While primarily targeted at structured volumetric data, UVF can store just about any type of structured and unstructured data. The system is composed of a file format specification with a reference implementation of a reader. It is not only a common, easy-to-implement format but also allows for efficient rendering of most datasets without the need to convert the data in memory. A toy reader for a hypothetical format of this kind is sketched below.
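
    To illustrate the kind of access pattern such a container enables, the toy reader below parses a hypothetical volume layout (a small fixed header followed by raw voxels) and memory-maps the payload rather than loading it. This is emphatically not the actual UVF specification; the magic string, header fields, and layout are invented for illustration.

      # Illustrative reader for a *hypothetical* volume layout, NOT the UVF spec.
      import struct
      import numpy as np

      def read_volume(path):
          with open(path, "rb") as f:
              # hypothetical header: magic, version, dimensions, bytes per voxel
              magic, version, nx, ny, nz, bpv = struct.unpack("<4sIIIII", f.read(24))
              assert magic == b"VOL0" and bpv in (1, 2, 4)
              dtype = {1: np.uint8, 2: np.uint16, 4: np.float32}[bpv]
              offset = f.tell()
          # memory-map instead of loading: essential once volumes approach petascale
          return np.memmap(path, dtype=dtype, mode="r", offset=offset,
                           shape=(nz, ny, nx))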

  19. Circuit breaker lock out assembly

    DOEpatents

    Gordy, W.T.

    1983-05-18

    A lock out assembly for a circuit breaker which consists of a generally step-shaped unitary base with an aperture in the small portion of the step-shaped base and a roughly S shaped retaining pin which loops through the large portion of the step-shaped base. The lock out assembly is adapted to fit over a circuit breaker with the handle switch projecting through the aperture, and the retaining pin projecting into an opening of the handle switch, preventing removal.

  20. Circuit breaker lock out assembly

    DOEpatents

    Gordy, Wade T.

    1984-01-01

    A lock out assembly for a circuit breaker which consists of a generally step-shaped unitary base with an aperture in the small portion of the step-shaped base and a roughly "S" shaped retaining pin which loops through the large portion of the step-shaped base. The lock out assembly is adapted to fit over a circuit breaker with the handle switch projecting through the aperture, and the retaining pin projecting into an opening of the handle switch, preventing removal.

  1. High performance interconnection between high data rate networks

    NASA Technical Reports Server (NTRS)

    Foudriat, E. C.; Maly, K.; Overstreet, C. M.; Zhang, L.; Sun, W.

    1992-01-01

    The bridge/gateway system needed to interconnect a wide range of computer networks to support a wide range of user quality-of-service requirements is discussed. The bridge/gateway must handle a wide range of message types, including synchronous and asynchronous traffic; large, bursty messages; short, self-contained messages; time-critical messages; etc. It is shown that messages can be classified into three basic classes: synchronous messages and large and small asynchronous messages. The first two require call setup so that packet identification, buffer handling, etc., can be supported in the bridge/gateway; identification also enables resequencing and accommodates differences in packet size. The third class is for messages which do not require call setup. Resequencing hardware designed to handle two types of resequencing problems is presented. The first handles a virtual parallel circuit, which can scramble channel bytes. The second is effective in handling both synchronous and asynchronous traffic between networks with highly differing packet sizes and data rates (a software sketch of the resequencing idea is given below). The two other major needs for the bridge/gateway are congestion and error control. A dynamic, lossless congestion control scheme which can easily support effective error correction is presented. Results indicate that the congestion control scheme provides close to optimal capacity under congested conditions. Under conditions where errors may develop due to intervening networks which are not lossless, intermediate error recovery and correction takes one-third less time than equivalent end-to-end error correction under similar conditions.
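
    A software sketch of the resequencing idea for the virtual-parallel-circuit case: packets striped across channels arrive out of order and are released strictly in sequence-number order. The paper describes a hardware implementation; this pure-Python stand-in shows only the buffering logic.

      # Resequencing buffer: hold out-of-order packets, release in-order runs.
      import heapq

      class Resequencer:
          def __init__(self):
              self.next_seq = 0
              self.heap = []                      # min-heap of (seq, payload)

          def push(self, seq, payload):
              """Accept one packet; return the (possibly empty) in-order batch."""
              heapq.heappush(self.heap, (seq, payload))
              released = []
              while self.heap and self.heap[0][0] == self.next_seq:
                  released.append(heapq.heappop(self.heap)[1])
                  self.next_seq += 1
              return released

      r = Resequencer()
      print(r.push(1, "b"), r.push(0, "a"), r.push(2, "c"))  # [] ['a', 'b'] ['c']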

  2. Impact of a Single Unusually Large Rainfall Event on the Level of Risk Used for Infrastructure Design

    NASA Astrophysics Data System (ADS)

    Dhakal, N.; Jain, S.

    2013-12-01

    Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is commonly used to describe extreme rainfall events statistically. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century, and as a result the parameters of the GEV distribution have changed with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters, and consequently on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated from past rainfall records) for civil infrastructure? To answer these questions, we analyzed the sensitivity of the GEV distribution parameters and of the return periods to unusually large outlier events; a minimal numerical version of this experiment is sketched below. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter and a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional statistical methodology for extremes when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
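
    A minimal numerical version of this sensitivity experiment: fit a GEV to a series of annual maxima, append one outlier of hurricane scale, refit, and compare the return period of a fixed design rainfall. The data are synthetic stand-ins, and note that scipy's shape parameter c corresponds to minus the usual GEV shape ξ.

      # Effect of one outlier on a GEV-based return period (synthetic data).
      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(1)
      # 30 years of annual maximum daily rainfall (mm); c = -0.1 means xi = +0.1
      annual_max = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=30,
                                  random_state=rng)

      def return_period(sample, design_depth):
          c, loc, scale = genextreme.fit(sample)
          p_exceed = genextreme.sf(design_depth, c, loc=loc, scale=scale)
          return 1.0 / p_exceed

      design = 130.0                                         # mm, fixed design depth
      before = return_period(annual_max, design)
      after = return_period(np.append(annual_max, 180.0), design)  # add one outlier
      print(f"return period: {before:.0f} yr -> {after:.0f} yr")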

  3. Statistical Downscaling of Gusts During Extreme European Winter Storms Using Radial-Basis-Function Networks

    NASA Astrophysics Data System (ADS)

    Voigt, M.; Lorenz, P.; Kruschke, T.; Osinski, R.; Ulbrich, U.; Leckebusch, G. C.

    2012-04-01

    Winter storms and related gusts can cause extensive socio-economic damage. Knowledge about the occurrence and the small-scale structure of such events may help to make regional estimates of storm losses. For a high spatial and temporal representation, dynamical downscaling with regional climate models (RCMs) is a cost-intensive and time-consuming option and is therefore applicable only to a limited number of events. The current study explores a statistical downscaling methodology that derives small-scale structured gust fields from an extended set of large-scale structured events. Radial-basis-function (RBF) networks in combination with bidirectional Kohonen (BDK) maps are used to generate the gust fields at a spatial resolution of 7 km from the 6-hourly mean sea level pressure field of ECMWF reanalysis data. BDK maps are a kind of neural network that handles supervised classification problems; in this study they provide prototypes for the RBF network and give a first-order approximation of the output data. A further interpolation is done by the RBF network (a minimal stand-in for this step is sketched below). For the training process, the 50 most extreme storm events over the North Atlantic area from 1957 to 2011 are used, selected from the ECMWF reanalysis datasets ERA40 and ERA-Interim by an objective wind-based tracking algorithm. These events were downscaled dynamically with the DWD model chain GME → COSMO-EU. Different model parameters and their influence on the quality of the generated high-resolution gust fields are studied. It is shown that the statistical RBF network approach delivers reasonable results in modeling regional gust fields for untrained events.
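
    As a minimal stand-in for the RBF interpolation step, the sketch below learns a map from large-scale predictors to high-resolution gust fields and evaluates it for an unseen event. scipy's RBFInterpolator here replaces the study's custom BDK-initialized network, and all array shapes are assumptions.

      # RBF regression from coarse predictors to fine gust fields (toy shapes).
      import numpy as np
      from scipy.interpolate import RBFInterpolator

      n_events, n_features, n_gridpoints = 50, 20, 4000
      rng = np.random.default_rng(2)
      X_train = rng.standard_normal((n_events, n_features))    # MSLP-derived predictors
      Y_train = rng.standard_normal((n_events, n_gridpoints))  # 7-km gust fields

      rbf = RBFInterpolator(X_train, Y_train, kernel="thin_plate_spline",
                            smoothing=1.0)
      gusts_new = rbf(rng.standard_normal((1, n_features)))    # downscale unseen event
      print(gusts_new.shape)                                   # (1, 4000)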

  4. Downscaling Satellite Precipitation with Emphasis on Extremes: A Variational ℓ1-Norm Regularization in the Derivative Domain

    NASA Astrophysics Data System (ADS)

    Foufoula-Georgiou, E.; Ebtehaj, A. M.; Zhang, S. Q.; Hou, A. Y.

    2014-05-01

    The increasing availability of precipitation observations from space, e.g., from the Tropical Rainfall Measuring Mission (TRMM) and the forthcoming Global Precipitation Measuring (GPM) Mission, has fueled renewed interest in developing frameworks for downscaling and multi-sensor data fusion that can handle large data sets in computationally efficient ways while optimally reproducing desired properties of the underlying rainfall fields. Of special interest is the reproduction of extreme precipitation intensities and gradients, as these are directly relevant to hazard prediction. In this paper, we present a new formalism for downscaling satellite precipitation observations, which explicitly allows for the preservation of some key geometrical and statistical properties of spatial precipitation. These include sharp intensity gradients (due to high-intensity regions embedded within lower-intensity areas), coherent spatial structures (due to regions of slowly varying rainfall), and thicker-than-Gaussian tails of precipitation gradients and intensities. Specifically, we pose the downscaling problem as a discrete inverse problem and solve it via a regularized variational approach (variational downscaling) where the regularization term is selected to impose the desired smoothness in the solution while allowing for some steep gradients (called ℓ1-norm or total variation regularization). We demonstrate the duality between this geometrically inspired solution and its Bayesian statistical interpretation, which is equivalent to assuming a Laplace prior distribution for the precipitation intensities in the derivative (wavelet) space. When the observation operator is not known, we discuss the effect of its misspecification and explore a previously proposed dictionary-based sparse inverse downscaling methodology to indirectly learn the observation operator from a data base of coincidental high- and low-resolution observations. The proposed method and ideas are illustrated in case studies featuring the downscaling of a hurricane precipitation field.
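
    In the standard notation for such inverse problems (with y the coarse observation, H the observation operator, x the high-resolution field, L a derivative or wavelet operator, and λ a regularization weight), the variational downscaling problem described above takes the form:

      \hat{x} \;=\; \arg\min_{x}\; \tfrac{1}{2}\,\lVert y - Hx \rVert_{2}^{2} \;+\; \lambda\,\lVert Lx \rVert_{1}

    The ℓ1 penalty on Lx is what permits some steep gradients to survive; an ℓ2 penalty in its place would smear exactly the extremes the method aims to preserve.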

  5. Manufacturing blood ex vivo: a futuristic approach to deal with the supply and safety concerns

    PubMed Central

    Singh, Vimal K.; Saini, Abhishek; Tsuji, Kohichiro; Sharma, P. B.; Chandra, Ramesh

    2014-01-01

    Blood transfusions are routine in medical practice, and an established worldwide network of collection, processing, and storage centers provides these services. Global demand to raise both the collection and the supply of safe, adequate blood is extreme and growing. Moreover, various risks remain associated with donor-derived blood, and post-collection screening and processing methods place extreme constraints on the supply system, especially in underdeveloped countries. Logistic approaches to manufacturing erythrocytes ex vivo using modern tissue culture techniques have surfaced in the past few years. Several reports show the possibility of expanding RBCs (and even platelets/neutrophils) under tightly regulated conditions. In fact, ex vivo synthesis of a few units of clinical-grade RBCs from a single dose of starting material, such as umbilical cord blood (CB), has been well established. Many other sources are also being explored for the same purpose, such as embryonic stem cells and induced pluripotent stem cells. However, major concerns remain to be resolved before manufactured blood components can successfully replace the present system of donor-derived blood transfusion. The most important factors are the scale of RBC production from each donated unit within a limited time period and the cost of production; both issues must be handled carefully, since many recipients in developing countries cannot pay even for freely available donor-derived blood. Keeping these issues in mind, the present article focuses on the possibilities of ex vivo blood production and its use in the near future. PMID:25364733

  6. Manufacturing blood ex vivo: a futuristic approach to deal with the supply and safety concerns.

    PubMed

    Singh, Vimal K; Saini, Abhishek; Tsuji, Kohichiro; Sharma, P B; Chandra, Ramesh

    2014-01-01

    Blood transfusions are routine in medical practice, and an established worldwide network of collection, processing, and storage centers provides these services. Global demand to raise both the collection and the supply of safe, adequate blood is extreme and growing. Moreover, various risks remain associated with donor-derived blood, and post-collection screening and processing methods place extreme constraints on the supply system, especially in underdeveloped countries. Logistic approaches to manufacturing erythrocytes ex vivo using modern tissue culture techniques have surfaced in the past few years. Several reports show the possibility of expanding RBCs (and even platelets/neutrophils) under tightly regulated conditions. In fact, ex vivo synthesis of a few units of clinical-grade RBCs from a single dose of starting material, such as umbilical cord blood (CB), has been well established. Many other sources are also being explored for the same purpose, such as embryonic stem cells and induced pluripotent stem cells. However, major concerns remain to be resolved before manufactured blood components can successfully replace the present system of donor-derived blood transfusion. The most important factors are the scale of RBC production from each donated unit within a limited time period and the cost of production; both issues must be handled carefully, since many recipients in developing countries cannot pay even for freely available donor-derived blood. Keeping these issues in mind, the present article focuses on the possibilities of ex vivo blood production and its use in the near future.

  7. Downscaling Satellite Precipitation with Emphasis on Extremes: A Variational ℓ1-Norm Regularization in the Derivative Domain

    NASA Technical Reports Server (NTRS)

    Foufoula-Georgiou, E.; Ebtehaj, A. M.; Zhang, S. Q.; Hou, A. Y.

    2013-01-01

    The increasing availability of precipitation observations from space, e.g., from the Tropical Rainfall Measuring Mission (TRMM) and the forthcoming Global Precipitation Measuring (GPM) Mission, has fueled renewed interest in developing frameworks for downscaling and multi-sensor data fusion that can handle large data sets in computationally efficient ways while optimally reproducing desired properties of the underlying rainfall fields. Of special interest is the reproduction of extreme precipitation intensities and gradients, as these are directly relevant to hazard prediction. In this paper, we present a new formalism for downscaling satellite precipitation observations, which explicitly allows for the preservation of some key geometrical and statistical properties of spatial precipitation. These include sharp intensity gradients (due to high-intensity regions embedded within lower-intensity areas), coherent spatial structures (due to regions of slowly varying rainfall), and thicker-than-Gaussian tails of precipitation gradients and intensities. Specifically, we pose the downscaling problem as a discrete inverse problem and solve it via a regularized variational approach (variational downscaling) where the regularization term is selected to impose the desired smoothness in the solution while allowing for some steep gradients (called ℓ1-norm or total variation regularization). We demonstrate the duality between this geometrically inspired solution and its Bayesian statistical interpretation, which is equivalent to assuming a Laplace prior distribution for the precipitation intensities in the derivative (wavelet) space. When the observation operator is not known, we discuss the effect of its misspecification and explore a previously proposed dictionary-based sparse inverse downscaling methodology to indirectly learn the observation operator from a database of coincidental high- and low-resolution observations. The proposed method and ideas are illustrated in case studies featuring the downscaling of a hurricane precipitation field.

  8. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  9. Chemical Trends in Solid Alkali Pertechnetates.

    PubMed

    Weaver, Jamie; Soderquist, Chuck Z; Washton, Nancy M; Lipton, Andrew S; Gassman, Paul L; Lukens, Wayne W; Kruger, Albert A; Wall, Nathalie A; McCloy, John S

    2017-03-06

    Insight into the solid-state chemistry of pure technetium-99 (99Tc) oxides is required in the development of a robust immobilization and disposal system for nuclear waste stemming from the radiopharmaceutical industry, from the production of nuclear weapons, and from spent nuclear fuel. However, because of its radiotoxicity and the subsequent requirement of special facilities and handling procedures for research, only a few studies have been completed, many of which are over 20 years old. In this study, we report the synthesis of pure alkali pertechnetates (sodium, potassium, rubidium, and cesium) and analysis of these compounds by Raman spectroscopy, X-ray absorption spectroscopy (XANES and EXAFS), solid-state nuclear magnetic resonance (static and magic angle spinning), and neutron diffraction. The structures and spectral signatures of these compounds will aid in refining the understanding of 99Tc incorporation into and release from nuclear waste glasses. NaTcO4 shows aspects of the relatively higher electronegativity of the Na atom, resulting in large distortions of the pertechnetate tetrahedron and deshielding of the 99Tc nucleus relative to the aqueous TcO4−. At the other extreme, the large Cs and Rb atoms interact only weakly with the pertechnetate, have closer to perfect tetrahedral symmetry at the Tc atom, and have very similar vibrational spectra, even though the crystal structure of CsTcO4 is orthorhombic while that of RbTcO4 is tetragonal. Further trends are observed in the cell volume and quadrupolar coupling constant.

  10. Chemical Trends in Solid Alkali Pertechnetates

    DOE PAGES

    Weaver, Jamie; Soderquist, Chuck Z.; Washton, Nancy M.; ...

    2017-02-21

    Insight into the solid-state chemistry of pure technetium-99 (99Tc) oxides is required in the development of a robust immobilization and disposal system for nuclear waste stemming from the radiopharmaceutical industry, from the production of nuclear weapons, and from spent nuclear fuel. However, because of its radiotoxicity and the subsequent requirement of special facilities and handling procedures for research, only a few studies have been completed, many of which are over 20 years old. In this study, we report the synthesis of pure alkali pertechnetates (sodium, potassium, rubidium, and cesium) and analysis of these compounds by Raman spectroscopy, X-ray absorption spectroscopy (XANES and EXAFS), solid-state nuclear magnetic resonance (static and magic angle spinning), and neutron diffraction. The structures and spectral signatures of these compounds will aid in refining the understanding of 99Tc incorporation into and release from nuclear waste glasses. NaTcO4 shows aspects of the relatively higher electronegativity of the Na atom, resulting in large distortions of the pertechnetate tetrahedron and deshielding of the 99Tc nucleus relative to the aqueous TcO4−. At the other extreme, the large Cs and Rb atoms interact only weakly with the pertechnetate, have closer to perfect tetrahedral symmetry at the Tc atom, and have very similar vibrational spectra, even though the crystal structure of CsTcO4 is orthorhombic while that of RbTcO4 is tetragonal. Further trends are observed in the cell volume and quadrupolar coupling constant.

  11. Chemical Trends in Solid Alkali Pertechnetates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Jamie; Soderquist, Chuck Z.; Washton, Nancy M.

    Insight into the solid-state chemistry of pure technetium-99 (99Tc) oxides is required in the development of a robust immobilization and disposal system for nuclear waste stemming from the radiopharmaceutical industry, from the production of nuclear weapons, and from spent nuclear fuel. However, because of its radiotoxicity and the subsequent requirement of special facilities and handling procedures for research, only a few studies have been completed, many of which are over 20 years old. In this study, we report the synthesis of pure alkali pertechnetates (sodium, potassium, rubidium, and cesium) and analysis of these compounds by Raman spectroscopy, X-ray absorption spectroscopy (XANES and EXAFS), solid-state nuclear magnetic resonance (static and magic angle spinning), and neutron diffraction. The structures and spectral signatures of these compounds will aid in refining the understanding of 99Tc incorporation into and release from nuclear waste glasses. NaTcO4 shows aspects of the relatively higher electronegativity of the Na atom, resulting in large distortions of the pertechnetate tetrahedron and deshielding of the 99Tc nucleus relative to the aqueous TcO4−. At the other extreme, the large Cs and Rb atoms interact only weakly with the pertechnetate, have closer to perfect tetrahedral symmetry at the Tc atom, and have very similar vibrational spectra, even though the crystal structure of CsTcO4 is orthorhombic while that of RbTcO4 is tetragonal. Further trends are observed in the cell volume and quadrupolar coupling constant.

  12. Flight Dynamics Aspects of a Large Civil Tiltrotor Simulation Using Translational Rate Command

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben; Malpica, Carlos A.; Theodore, Colin R.; Decker, William A.; Lindsey, James E.

    2011-01-01

    An in-depth analysis of a Large Civil Tiltrotor simulation with a Translational Rate Command control law that uses automatic nacelle deflections for longitudinal velocity control and lateral cyclic for lateral velocity control is presented. Results from piloted real-time simulation experiments and offline time and frequency domain analyses are used to investigate the fundamental flight dynamic and control mechanisms of the control law. The baseline Translational Rate Command conferred handling qualities improvements over an attitude-command/attitude-hold control law, but in some scenarios there was a tendency to enter pilot-induced oscillation (PIO). Nacelle actuator rate limiting strongly influenced the PIO tendency, and reducing the rate limits degraded the handling qualities further. Counterintuitively, increasing rate limits also led to a worsening of the handling qualities ratings. This led to the identification of a nacelle rate to rotor longitudinal flapping coupling effect that induced undesired pitching motions proportional to the allowable amount of nacelle rate. A modification that applied a counteracting amount of longitudinal cyclic proportional to the nacelle rate significantly improved the handling qualities. The lateral axis of the Translational Rate Command conferred Level 1 handling qualities in a Lateral Reposition maneuver. Analysis of the influence of the modeling fidelity on the lateral flapping angles is presented. It is shown that the linear modeling approximation is likely to have under-predicted the side force and therefore under-predicted the lateral flapping at velocities above 15 ft/s. However, at lower velocities, where the side force modeling has a weaker influence, the accelerations commanded by the control law also significantly influenced the peak levels of lateral flapping achieved.

  13. Work-related stress: A survey of Indian anesthesiologists.

    PubMed

    Bakshi, Sumitra Ganesh; Divatia, Jigeeshu Vasishtha; Kannan, Sadhana; Myatra, Sheila Nainan

    2017-01-01

    Work-related stress is common among medical caregivers and concerns all perioperative care providers. Although anesthesiologists are known to experience stress, there are limited Indian data addressing this issue. This survey was conducted among Indian anesthesiologists to determine their awareness of work stress and their views regarding prevention programs. A survey questionnaire was distributed to delegates visiting the exhibits at the national anesthesiology conference in 2011. The questionnaire had ten questions on work patterns, five on work-related stress, and nine on opinions regarding the need and willingness to participate in stress-related programs. There were 1178 respondents. Forty-three percent were faculty in medical institutions, 26% were residents, and 25% were in freelance practice. Ninety-one percent of participants rated their stress as moderate to extreme. There was a significant correlation between the amount of stress and working for more than 8 h (P < 0.001), handling high-risk patients (P = 0.002), working on weekends (P = 0.002), and carrying work back home (P < 0.001). Forty-one percent of respondents were very satisfied professionally. Seventy-six percent of doctors agreed that the questionnaire had made them think about work stress. Eighty-four percent of participants felt the need for stress management programs, and 69% expressed their willingness to participate in them. The majority of participants rated their stress as moderate to extreme; stress was higher among anesthesiologists working long hours, working on weekends, and handling high-risk patients. A majority of participants felt the survey made them think about work-related stress and expressed their willingness to participate in stress management programs.

  14. Six centuries of geomagnetic intensity variations recorded by royal Judean stamped jar handles

    NASA Astrophysics Data System (ADS)

    Ben-Yosef, Erez; Millman, Michael; Shaar, Ron; Tauxe, Lisa; Lipschits, Oded

    2017-02-01

    Earth's magnetic field, one of the most enigmatic physical phenomena of the planet, is constantly changing on various time scales, from decades to millennia and longer. The reconstruction of geomagnetic field behavior in periods predating direct observations with modern instrumentation is based on geological and archaeological materials and has the twin challenges of (i) the accuracy of ancient paleomagnetic estimates and (ii) the dating of the archaeological material. Here we address the latter by using a set of storage jar handles (fired clay) stamped by royal seals as part of the ancient administrative system in Judah (Jerusalem and its vicinity). The typology of the stamp impressions, which corresponds to changes in the political entities ruling this area, provides excellent age constraints for the firing event of these artifacts. Together with rigorous paleomagnetic experimental procedures, this study yielded an unparalleled record of the geomagnetic field intensity during the eighth to second centuries BCE. The new record constitutes a substantial advance in our knowledge of past geomagnetic field variations in the southern Levant. Although it demonstrates a relatively stable and gradually declining field during the sixth to second centuries BCE, the new record provides further support for a short interval of extreme high values during the late eighth century BCE. The rate of change during this "geomagnetic spike" [defined as virtual axial dipole moment > 160 ZAm² (1 ZAm² = 10²¹ Am²)] is further constrained by the new data, which indicate an extremely rapid weakening of the field (losing ~27% of its strength over ca. 30 y).

  15. Properties of Extreme Precipitation and Their Uncertainties in 3-year GPM Precipitation Radar Data

    NASA Astrophysics Data System (ADS)

    Liu, N.; Liu, C.

    2017-12-01

    Extreme high precipitation rates are often related to flash floods and have devastating impacts on human society and the environment. To better understand these rare events, 3-year Precipitation Features (PFs) are defined by grouping the contiguous areas with nonzero near-surface precipitation derived using the Global Precipitation Measurement (GPM) Ku-band Precipitation Radar (KuPR). The properties of PFs with extreme precipitation rates greater than 20, 50, and 100 mm/hr, such as the geographical distribution, volumetric precipitation contribution, and seasonal and diurnal variations, are examined. In addition to the large seasonal and regional variations, the rare extreme precipitation rates often contribute a larger share of the local total precipitation. Extreme precipitation rates occur more often over land than over ocean. The challenges in retrieving extreme precipitation may stem from attenuation correction and from large uncertainties in the Z-R relationships used to convert near-surface radar reflectivity to precipitation rates; the textbook form of this conversion is sketched below. These potential uncertainties are examined by using collocated ground-based radar reflectivity and precipitation retrievals.
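
    The Z-R step referred to above has a textbook power-law form, Z = aR^b, with Z the radar reflectivity (mm⁶ m⁻³) and R the rain rate (mm/hr). The Marshall-Palmer coefficients used below (a = 200, b = 1.6) are the classic default; operational GPM retrievals use situation-dependent relations, which is one source of the uncertainty discussed.

      # Invert the classic Marshall-Palmer Z-R relation Z = a * R**b.
      def dbz_to_rainrate(dbz, a=200.0, b=1.6):
          z_linear = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity
          return (z_linear / a) ** (1.0 / b)

      for dbz in (30, 45, 55):
          print(dbz, "dBZ ->", round(dbz_to_rainrate(dbz), 1), "mm/hr")
      # 55 dBZ maps to roughly 100 mm/hr, the most extreme threshold above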

  16. The spatiotemporal changes in precipitation extremes over Canada and their connections to large-scale climate patterns

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Gan, T. Y.; Tan, X.

    2017-12-01

    In the past few decades, there have been more extreme climate events around the world, and Canada has also suffered from numerous extreme precipitation events. In this paper, trend analysis, change point analysis, probability distribution functions, principal component analysis, and wavelet analysis were used to investigate the spatial and temporal patterns of extreme precipitation in Canada. Ten extreme precipitation indices were calculated using long-term daily precipitation data from 164 gauging stations. Several large-scale climate patterns, such as the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), Pacific-North American pattern (PNA), and North Atlantic Oscillation (NAO), were selected to analyze the relationships between extreme precipitation and climate indices. Convective Available Potential Energy (CAPE), specific humidity, and surface temperature were employed to investigate the potential causes of the trends. The results show statistically significant positive trends for most indices, which indicate increasing extreme precipitation. The majority of indices display more increasing trends along the southern border of Canada, while decreasing trends dominate in the central Canadian Prairies (CP). In addition, strong connections are found between extreme precipitation and the climate indices, and the effects of each climate pattern differ by region. The seasonal CAPE, specific humidity, and temperature are found to be closely related to Canadian extreme precipitation.

  17. Time reducing exposure containing 18 fluorine fluorodeoxyglucose master vial dispensing in hot lab: Omega technique

    PubMed Central

    Rao, Vatturi Venkata Satya Prabhakar; Manthri, Ranadheer; Hemalatha, Pottumuthu; Kumar, Vuyyuru Navin; Azhar, Mohammad

    2016-01-01

    Hot lab dispensing of large doses of fluorine-18 fluorodeoxyglucose from master vials supplied by cyclotrons requires a high degree of skill. The presently practiced conventional method of fractionating from an inverted tiltable vial pig mounted on a metal frame has its own limitations, such as increased isotope handling time and exposure to the technologist. The innovative technique devised here markedly improves fractionating efficiency along with speed and precision while reducing dose exposure. PMID:27095872

  18. LACIE data-handling techniques

    NASA Technical Reports Server (NTRS)

    Waits, G. H. (Principal Investigator)

    1979-01-01

    Techniques implemented to facilitate processing of LANDSAT multispectral data between 1975 and 1978 are described. The data handled during the Large Area Crop Inventory Experiment (LACIE) and the storage mechanisms used for the various types of data are defined. The overall data flow, from the placing of the LANDSAT orders through the actual analysis of the data set, is discussed. An overview is provided of the status and tracking system that was developed and of the database maintenance and operational tasks. The archiving of the LACIE data is explained.

  19. Hazards and Safeguards of High Pressure Hydraulic Fatigue Testing

    DTIC Science & Technology

    1990-07-01

    The creation and transfer of hydraulic pressure at the 690-MPa (100,000-psi) level is in itself hazardous ... our hydraulic test systems using fluids capable of flow up to the test pressure. Up to 690 MPa (100,000 psi), synthetic oils especially formulated for ... HANDLING: Our most frequent injury problem has been in handling the large tubular specimens. These are inherently smooth, round, oil-coated, and heavy. For

  20. Medical-Information-Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, Sidney; Friedman, Carl A.; Frankowski, James W.

    1989-01-01

    Medical Information Management System (MIMS) computer program interactive, general-purpose software system for storage and retrieval of information. Offers immediate assistance where manipulation of large data bases required. User quickly and efficiently extracts, displays, and analyzes data. Used in management of medical data and handling all aspects of data related to care of patients. Other applications include management of data on occupational safety in public and private sectors, handling judicial information, systemizing purchasing and procurement systems, and analyses of cost structures of organizations. Written in Microsoft FORTRAN 77.

  1. Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane

    NASA Technical Reports Server (NTRS)

    Gera, Joseph; Bosworth, John T.

    1987-01-01

    This paper describes some novel flight tests and analysis techniques in the flight dynamics and handling qualities area. These techniques were utilized during the initial flight envelope clearance of the X-29A aircraft and were largely responsible for the completion of the flight controls clearance program without any incidents or significant delays. The resulting open-loop and closed-loop frequency responses and the time history comparison using flight and linear simulation data are discussed.

  2. General Monte Carlo reliability simulation code including common mode failures and HARP fault/error-handling

    NASA Technical Reports Server (NTRS)

    Platt, M. E.; Lewis, E. E.; Boehm, F.

    1991-01-01

    A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing the reliability of very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault/error-handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also a specialty. A plain Monte Carlo sketch of this style of reliability estimation is given below.
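
    A plain (non-variance-reduced) Monte Carlo sketch of the kind of model MC-HARP solves: a 2-of-3 system whose component lifetimes are Weibull distributed. MC-HARP adds the variance reduction techniques and HARP's fault/error-handling models; all parameter values here are illustrative.

      # Monte Carlo unreliability of a 2-of-3 system with Weibull lifetimes.
      import numpy as np

      def mc_unreliability(t_mission, shape, scale, n_components=3, n_ok_needed=2,
                           n_trials=200_000, seed=3):
          rng = np.random.default_rng(seed)
          # Weibull lifetimes: scale * W(shape), one per component per trial
          lifetimes = scale * rng.weibull(shape, size=(n_trials, n_components))
          survivors = (lifetimes > t_mission).sum(axis=1)
          return np.mean(survivors < n_ok_needed)   # P(system failed by t_mission)

      print(mc_unreliability(t_mission=1000.0, shape=1.5, scale=20_000.0))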

  3. Cold Trap Dismantling and Sodium Removal at a Fast Breeder Reactor - 12327

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, A.; Petrick, H.; Stutz, U.

    2012-07-01

    The first German prototype Fast Breeder Nuclear Reactor (KNK) is currently being dismantled after being the only operating fast breeder-type reactor in Germany. As this reactor type used sodium as a coolant in its primary and secondary circuits, seven cold traps containing various amounts of partially activated sodium needed to be disposed of as part of the dismantling. The combined difficulties of radioactive contamination and high chemical reactivity were handled by treating the cold traps differently depending on their size and the amount of sodium contained inside. Six small cold traps were processed on site by cutting them up into small parts using a band saw under a protective atmosphere. The sodium was then converted to sodium hydroxide using water. The remaining large cold trap could not be handled in the same way due to its dimensions (2.9 m x 1.1 m) and the declared amount of sodium inside (1,700 kg). It was therefore manually dismantled inside a large box filled with a protective atmosphere, while the resulting pieces were packaged for later burning in a special facility. The experience gained by KNK during this process may be advantageous for future dismantling projects in similar sodium-cooled reactors worldwide. The dismantling of a prototype fast breeder reactor presents the challenge not only of dismantling radioactive materials but also of handling sodium-contaminated or sodium-containing components. The treatment of sodium requires additional equipment and installations to ensure safe handling. Since it is not permitted to bring sodium into a repository, all sodium has to be neutralized, either through a controlled reaction with water or by incineration. The resulting components can then be disposed of as normal radioactive waste with no further conditions. The handling of sodium needs skilled and experienced workers to minimize the inherent risks. The example of the disposal of the large KNK cold trap shows that interaction with other, including foreign, decommissioning projects can provide solutions that were previously unknown. (authors)

  4. Development and evaluation of a multifaceted ergonomics program to prevent injuries associated with patient handling tasks.

    PubMed

    Nelson, Audrey; Matz, Mary; Chen, Fangfei; Siddharthan, Kris; Lloyd, John; Fragala, Guy

    2006-08-01

    Nurses have one of the highest rates of work-related musculoskeletal injury of any profession. Over the past 30 years, efforts to reduce work-related musculoskeletal disorders in nurses have been largely unsuccessful. The primary goal of this program was to create safer working environments for nursing staff who provide direct patient care. Our first objective was to design and implement a multifaceted program that successfully integrated evidence-based practice, technology, and safety improvement. The second objective was to evaluate the impact of the program on injury rate, lost and modified work days, job satisfaction, self-reported unsafe patient handling acts, level of support for the program, staff and patient acceptance, program effectiveness, costs, and return on investment. The intervention included six program elements: (1) Ergonomic Assessment Protocol, (2) Patient Handling Assessment Criteria and Decision Algorithms, (3) Peer Leader role, "Back Injury Resource Nurses", (4) State-of-the-art Equipment, (5) After Action Reviews, and (6) No Lift Policy. A pre-/post design without a control group was used to evaluate the effectiveness of a patient care ergonomics program on 23 high-risk units (19 nursing home care units and 4 spinal cord injury units) in 7 facilities. Injury rates, lost work days, modified work days, job satisfaction, staff and patient acceptance, program effectiveness, and program costs/savings were compared over two nine-month periods: pre-intervention (May 2001-January 2002) and post-intervention (March 2002-November 2002). Data were collected prospectively through surveys, weekly process logs, injury logs, and cost logs. The program elements resulted in a statistically significant decrease in the rate of musculoskeletal injuries as well as the number of modified duty days taken per injury. While the total number of lost workdays decreased by 18% post-intervention, this difference was not statistically significant. There were statistically significant increases in two subscales of job satisfaction: professional status and task requirements. Self-reports by nursing staff revealed a statistically significant decrease in the number of 'unsafe' patient handling practices performed daily. Nurses ranked program elements they deemed to be "extremely effective": equipment was rated as most effective (96%), followed by No Lift Policy (68%), peer leader education program (66%), ergonomic assessment protocol (59%), patient handling assessment criteria and decision algorithms (55%), and lastly after action reviews (41%). Perceived support and interest for the program started at a high level for managers and nursing staff and remained very high throughout the program implementation. Patient acceptance was moderate when the program started but increased to very high by the end of the program. Although the ease and success of program implementation initially varied between and within the facilities, after six months there was strong evidence of support at all levels. The initial capital investment for patient handling equipment was recovered in approximately 3.75 years based on annual post-intervention savings of over $200,000/year in workers' compensation expenses and cost savings associated with reduced lost and modified work days and worker compensation. This multi-faceted program resulted in an overall lower injury rate, fewer modified duty days taken per injury, and significant cost savings. The program was well accepted by patients, nursing staff, and administrators. Given the significant increases in two job satisfaction subscales (professional status and task requirements), it is possible that nurse recruitment and retention could be positively impacted.

  5. The persistence of the large volumes in black holes

    NASA Astrophysics Data System (ADS)

    Ong, Yen Chin

    2015-08-01

    Classically, black holes admit maximal interior volumes that grow asymptotically linearly in time. We show that such volumes remain large when Hawking evaporation is taken into account. Even if a charged black hole approaches the extremal limit during this evolution, its volume continues to grow; although an exactly extremal black hole does not have a "large interior". We clarify this point and discuss the implications of our results to the information loss and firewall paradoxes.

  6. GenomeVista

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poliakov, Alexander; Couronne, Olivier

    2002-11-04

    Aligning large vertebrate genomes that are structurally complex poses a variety of problems not encountered on smaller scales. Such genomes are rich in repetitive elements and contain multiple segmental duplications, which increases the difficulty of identifying true orthologous DNA segments in alignments. The sizes of the sequences make many alignment algorithms designed for comparing single proteins extremely inefficient when processing large genomic intervals. We integrated both local and global alignment tools and developed a suite of programs for automatically aligning large vertebrate genomes and identifying conserved non-coding regions in the alignments. Our method uses the BLAT local alignment program to find anchors on the base genome to identify regions of possible homology for a query sequence. These regions are postprocessed to find the best candidates, which are then globally aligned using the AVID global alignment program. In the last step, conserved non-coding segments are identified using VISTA. Our methods are fast, and the resulting alignments exhibit a high degree of sensitivity, covering more than 90% of known coding exons in the human genome. The GenomeVISTA software is a suite of Perl programs built on a MySQL database platform. The scheduler gets control data from the database, builds a queue of jobs, and dispatches them to a PC cluster for execution. The main program, running on each node of the cluster, processes individual sequences. A Perl library acts as an interface between the database and the above programs. The use of a separate library allows the programs to function independently of the database schema. The library also improves on the standard Perl MySQL database interface package by providing auto-reconnect functionality and improved error handling.

  7. Modeling complex chemical effects in turbulent nonpremixed combustion

    NASA Technical Reports Server (NTRS)

    Smith, Nigel S. A.

    1995-01-01

    Virtually all of the energy derived from the consumption of combustibles occurs in systems which utilize turbulent fluid motion. Since combustion is largely related to the mixing of fluids and mixing processes are orders of magnitude more rapid when enhanced by turbulent motion, efficiency criteria dictate that chemically powered devices necessarily involve fluid turbulence. Where combustion occurs concurrently with mixing at an interface between two reactive fluid bodies, this mode of combustion is called nonpremixed combustion. This is distinct from premixed combustion where flame-fronts propagate into a homogeneous mixture of reactants. These two modes are limiting cases in the range of temporal lag between mixing of reactants and the onset of reaction. Nonpremixed combustion occurs where this lag tends to zero, while premixed combustion occurs where this lag tends to infinity. Many combustion processes are hybrids of these two extremes with finite non-zero lag times. Turbulent nonpremixed combustion is important from a practical standpoint because it occurs in gas fired boilers, furnaces, waste incinerators, diesel engines, gas turbine combustors, and afterburners etc. To a large extent, past development of these practical systems involved an empirical methodology. Presently, efficiency standards and emission regulations are being further tightened (Correa 1993), and empiricism has had to give way to more fundamental research in order to understand and effectively model practical combustion processes (Pope 1991). A key element in effective modeling of turbulent combustion is making use of a sufficiently detailed chemical kinetic mechanism. The prediction of pollutant emission such as oxides of nitrogen (NO(x)) and sulphur (SO(x)) unburned hydrocarbons, and particulates demands the use of detailed chemical mechanisms. It is essential that practical models for turbulent nonpremixed combustion are capable of handling large numbers of 'stiff' chemical species equations.

  8. Sodium Handling Technology and Engineering Design of the Madison Dynamo Experiment.

    NASA Astrophysics Data System (ADS)

    Kendrick, R.; Forest, C. B.; O'Connell, R.; Wright, A.; Robinson, K.

    1998-11-01

    A new liquid metal MHD experiment is being constructed at the University of Wisconsin to test several key predictions of dynamo theory: magnetic instabilities driven by sheared flow, the effects of turbulence on current generation, and the back-reaction of the self-generated magnetic field on the fluid motion, which brings saturation. This presentation describes the engineering design of the experiment, which is a 0.5 m radius spherical vessel filled with liquid sodium at 150 degrees Celsius. The experiment is designed to achieve a magnetic Reynolds number in excess of 100, which requires approximately 80 hp of mechanical drive, producing sodium flow velocities of 15 m/s through impellers (a back-of-the-envelope check of this design point is given below). Handling liquid sodium offers a number of technical challenges, but routine techniques have been developed over the past several decades for safely handling large quantities for the fast breeder reactor. The handling strategy is discussed, technical details concerning seals and pressurization are presented, and safety elements are highlighted.
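
    A back-of-the-envelope check of the quoted design point, using the definition Rm = UL/η with magnetic diffusivity η = 1/(μ₀σ). The sodium conductivity at 150 °C is an assumed textbook value of roughly 1e7 S/m.

      # Magnetic Reynolds number for the quoted flow speed and vessel radius.
      import math

      mu0 = 4 * math.pi * 1e-7    # vacuum permeability (H/m)
      sigma = 1.0e7               # conductivity of Na at ~150 C (S/m), assumed value
      eta = 1.0 / (mu0 * sigma)   # magnetic diffusivity (m^2/s), ~0.08
      U, L = 15.0, 0.5            # flow speed (m/s) and vessel radius (m) from the text
      print("Rm =", U * L / eta)  # ~94, consistent with the stated goal Rm > 100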

  9. An Investigation of Large Aircraft Handling Qualities

    NASA Astrophysics Data System (ADS)

    Joyce, Richard D.

    An analytical technique for investigating transport aircraft handling qualities is exercised in a study using models of two such vehicles, a Boeing 747 and a Lockheed C-5A. Two flight conditions are employed for climb and directional tasks, and a third is included for a flare task. The analysis technique is based upon a "structural model" of the human pilot developed by Hess. The associated analysis procedure has been discussed previously in the literature, but centered almost exclusively on the characteristics of high-performance fighter aircraft. The handling qualities rating level (HQRL) and pilot induced oscillation tendencies rating level (PIORL) are predicted for nominal configurations of the aircraft and for "damaged" configurations where actuator rate limits are introduced as nonlinearities. It is demonstrated that the analysis can accommodate nonlinear pilot/vehicle behavior and do so in the context of specific flight tasks, yielding estimates of handling qualities, pilot-induced oscillation tendencies, and upper limits of task performance. A brief human-in-the-loop tracking study was performed to provide a limited validation of the pilot model employed.

  10. Ergonomics and comfort in lawn mower handle positioning: An evaluation of handle geometry.

    PubMed

    Lowndes, Bethany R; Heald, Elizabeth A; Hallbeck, M Susan

    2015-11-01

    Hand operation accompanied by any combination of large forces, awkward positions, and repetition may lead to upper limb injury or illness and may be exacerbated by vibration. Commercial lawn mowers expose operators to these factors during actuation of hand controls and are therefore a health concern. A nontraditional lawn mower control system may decrease upper limb illnesses and injuries through more neutral hand and body positioning. This study compared maximum grip strength in twelve different orientations (3 grip spans and 4 positions) and evaluated self-described comfortable handle positions. The results showed force differences between nontraditional (X) and both vertical (V) and pistol (P) positions (p < 0.0001) and among the different grip spans (p < 0.0001). Based on these results, recommended designs should incorporate a tilt between 45 and 70°, handle rotations between 48 and 78°, and reduced force requirements or decreased grip spans to improve user health and comfort. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  11. Parameter Estimations of Dynamic Energy Budget (DEB) Model over the Life History of a Key Antarctic Species: The Antarctic Sea Star Odontaster validus Koehler, 1906.

    PubMed

    Agüera, Antonio; Collard, Marie; Jossart, Quentin; Moreau, Camille; Danis, Bruno

    2015-01-01

    Marine organisms in Antarctica are adapted to an extreme ecosystem characterized by extremely stable temperatures and strong seasonality due to changes in day length. It is now largely accepted that Southern Ocean organisms are particularly vulnerable to global warming, with some regions already being challenged by a rapid increase of temperature. Climate change affects both the physical and biotic components of marine ecosystems and will have an impact on the distribution and population dynamics of Antarctic marine organisms. To predict and assess the effect of climate change on marine ecosystems, a more comprehensive knowledge of the life history and physiology of key species is urgently needed. In this study we estimate the Dynamic Energy Budget (DEB) model parameters for a key benthic Antarctic species, the sea star Odontaster validus, using available information from literature and experiments. The DEB theory is unique in capturing the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model allows for the inclusion of the different life history stages, and thus becomes a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. The DEB model presented here includes the estimation of reproduction handling rules for the development of simultaneous oocyte cohorts within the gonad. Additionally, it links the DEB model reserves to the pyloric caeca, an organ whose function has long been ascribed to energy storage. The model parameters describe the slowed-down metabolism of long-living animals that mature slowly. O. validus has a large reserve that, combined with low maintenance costs, allows it to withstand long periods of starvation. Gonad development is continuous, and individual cohorts developing within the gonads grow in biomass following a power function of the age of the cohort. The DEB model developed here for O. validus allowed us to increase our knowledge of the ecophysiology of this species, providing new insights into the role of food availability and temperature in its life cycle and reproduction strategy.
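
    The cohort growth rule can be written down directly. A minimal sketch of biomass as a power function of cohort age; the scale and exponent are placeholders, not the fitted DEB parameters:

```python
import numpy as np

def cohort_biomass(age_days, scale, exponent):
    """Biomass of an oocyte cohort growing as a power function of its age:
    biomass = scale * age**exponent."""
    return scale * np.power(age_days, exponent)

ages = np.array([30.0, 90.0, 180.0, 365.0])             # cohort ages [days]
print(cohort_biomass(ages, scale=0.01, exponent=1.5))   # illustrative values only
```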

  12. Holographic Adaptive Optics

    NASA Astrophysics Data System (ADS)

    Andersen, G.

    For the last two decades adaptive optics has been used as a technique for correcting imaging applications, directed energy/laser targeting, and laser communications systems affected by atmospheric turbulence. Typically these systems are bulky and limited to <10 kHz due to large computing overhead and limited photon efficiencies. Moreover, most use zonal wavefront sensors, which cannot easily handle extreme scintillation or unexpected obscuration of a pre-set aperture. Here we present a compact, lightweight adaptive optics system with the potential to operate at MHz rates. The system utilizes a hologram to perform an all-optical wavefront analysis that removes the need for any computer. Finally, the sensing is made on a modal basis, so it is largely insensitive to scintillation and obscuration. We have constructed a prototype device and will present experimental results from our research. The holographic adaptive optics system begins with the creation of a multiplexed hologram. This hologram is created by recording the maximum and minimum response functions of every actuator in the deformable mirror (DM) against a unique focused reference beam. When a wavefront of some arbitrary phase is incident on the processed hologram, a number of focal spots are created -- one pair for each actuator in the DM. The absolute phase error at each particular actuator location is simply related to the ratio of the intensity of each pair of spots. In this way an array of photodetectors gives a direct readout of phase error without the need for any calculations. The advantages of holographic adaptive optics are many. To begin with, the measurement of phase error is made all-optically, so the wavefront sensor directly controls the actuators in the DM without any computers. Using fast, photon-counting photodetectors allows for closed-loop correction limited only by the speed of the deformable mirror, which in the case of MEMS devices can be 100 kHz or more. All this can be achieved in an extremely compact and lightweight package, making it perfectly suited to applications such as UAV surveillance imagery and free-space optical communications systems. Lastly, since the correction is made on a modal basis instead of zonal, it is virtually insensitive to scintillation and obscuration.
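
    The spot-pair readout can be sketched in a few lines. The normalized intensity ratio below, mapped linearly onto a phase error, is an illustrative assumption; the actual mapping is fixed by the recorded maximum/minimum actuator response functions:

```python
import numpy as np

def phase_from_spot_pair(i_max, i_min, phi_scale=1.0):
    """Estimate per-actuator phase error from the intensities of its two
    holographically reconstructed focal spots, assuming a linear mapping of
    the normalized ratio onto phase (illustration only)."""
    r = (i_max - i_min) / (i_max + i_min)   # normalized ratio in [-1, 1]
    return r * phi_scale                    # phase error, e.g. in waves

# One photodetector pair per DM actuator (hypothetical readings):
i_max = np.array([0.90, 0.40, 0.55])
i_min = np.array([0.10, 0.60, 0.45])
print(phase_from_spot_pair(i_max, i_min))   # drives the DM with no reconstructor
```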

  13. AMS radiocarbon analyses from Lake Baikal, Siberia: Challenges of dating sediments from a large, oligotrophic lake

    USGS Publications Warehouse

    Colman, Steven M.; Jones, Glenn A.; Rubin, M.; King, J.W.; Peck, J.A.; Orem, W.H.

    1996-01-01

    A suite of 146 new accelerator-mass spectrometer (AMS) radiocarbon ages provides the first reliable chronology for late Quaternary sediments in Lake Baikal. In this large, highly oligotrophic lake, biogenic and authigenic carbonate are absent, and plant macrofossils are extremely rare. Total organic carbon (TOC) is therefore the primary material available for dating. Several problems are associated with the TOC ages. One is the mixture of carbon sources in TOC, not all of which are syndepositional in age. This problem manifests itself in apparent ages for the sediment surface that are greater than zero. However, because most of the organic carbon in Lake Baikal sediments is algal (autochthonous) in origin, this effect is limited to about 1000±500 years, which can be corrected, at least for young deposits. The other major problem with dating Lake Baikal sediments is the very low carbon content of glacial-age deposits, which makes them extremely susceptible to contamination with modern carbon. This problem can be minimized by careful sampling and handling procedures. The ages show almost an order of magnitude difference in sediment-accumulation rates among different sedimentary environments in Lake Baikal, from about 0.04 mm/year on isolated banks such as Academician Ridge, to nearly 0.3 mm/year in the turbidite depositional areas beneath the deep basin floors, such as the Central Basin. The new AMS ages clearly indicate that the dramatic increase in diatom productivity in the lake, as evidenced by increases in biogenic silica and organic carbon, began about 13 ka, in contrast to previous estimates of 7 ka for the age of this transition. Holocene net sedimentation rates may be less than, equal to, or greater than those in the late Pleistocene, depending on the site. This variability reflects the balance between variable terrigenous sedimentation and increased biogenic sedimentation during interglaciations. The ages reported here, and the temporal and spatial variation in sedimentation rates that they imply, provide opportunities for paleoenvironmental reconstructions at different time scales and resolutions.
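
    The surface-age correction amounts to subtracting the apparent age of the non-syndepositional carbon and propagating its uncertainty. A minimal sketch using the ~1000±500 yr offset quoted above (the measured age and its uncertainty are hypothetical):

```python
import math

def correct_toc_age(measured_yr, measured_unc_yr,
                    offset_yr=1000.0, offset_unc_yr=500.0):
    """Subtract the apparent surface age from a measured TOC radiocarbon age
    and combine the uncertainties in quadrature."""
    corrected = measured_yr - offset_yr
    unc = math.hypot(measured_unc_yr, offset_unc_yr)
    return corrected, unc

age, unc = correct_toc_age(9800.0, 60.0)   # hypothetical AMS result
print(f"corrected age: {age:.0f} +/- {unc:.0f} 14C yr BP")
```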

  14. Bacterial gastroenteritis

    MedlinePlus

    ... at picnics, school cafeterias, large social gatherings, or restaurants. Your food may get infected in many ways: ... handling or preparation may occur in grocery stores, restaurants, or homes. Food poisoning often occurs from eating ...

  15. Pose-Invariant Face Recognition via RGB-D Images.

    PubMed

    Sang, Gaoli; Li, Jing; Zhao, Qijun

    2016-01-01

    Three-dimensional (3D) face models can intrinsically handle the large-pose face recognition problem. In this paper, we propose a novel pose-invariant face recognition method via RGB-D images. By employing depth, our method is able to handle self-occlusion and deformation, both of which are challenging problems in two-dimensional (2D) face recognition. Texture images in the gallery can be rendered to the same view as the probe via depth. Meanwhile, depth is also used for similarity measurement via frontalization and symmetric filling. Finally, both texture and depth contribute to the final identity estimation. Experiments on the Bosphorus, CurtinFaces, Eurecom, and Kiwi databases demonstrate that the additional depth information improves the performance of face recognition with large pose variations and under even more challenging conditions.

  16. Tests of protective clothing for the safe handling of pressurized lamps

    NASA Technical Reports Server (NTRS)

    Ewashinka, J. G.

    1975-01-01

    Tests were made to find a clothing material combination for use in handling high-pressure lamps. Monofilament nylon, ballistic nylon, and ballistic felt grouped into various multilayer combinations, as well as chromed leather, were positioned around and 30 cm (12 in.) away from exploding high-pressure lamps of different manufacturers and wattages. The results are: (1) 5024 nylon/ballistic felt/5024 nylon in a layered configuration was not penetrated by fragments of lamps as large as 6.5 kW; (2) this layered combination is lightweight and pliable and offers greater mobility and comfort to the user than previous protective clothing; and (3) Lexan plastic 1.6 mm (1/16 in.) thick, to be used for face shield material, showed no penetration for lamps as large as 20 kW.

  17. sbtools: A package connecting R to cloud-based data for collaborative online research

    USGS Publications Warehouse

    Winslow, Luke; Chamberlain, Scott; Appling, Alison P.; Read, Jordan S.

    2016-01-01

    The adoption of high-quality tools for collaboration and reproducible research such as R and Github is becoming more common in many research fields. While Github and other version management systems are excellent resources, they were originally designed to handle code and scale poorly to large text-based or binary datasets. A number of scientific data repositories are coming online and are often focused on dataset archival and publication. To handle collaborative workflows using large scientific datasets, there is increasing need to connect cloud-based online data storage to R. In this article, we describe how the new R package sbtools enables direct access to the advanced online data functionality provided by ScienceBase, the U.S. Geological Survey’s online scientific data storage platform.

  18. New weight-handling device for commercial oil pressure balances

    NASA Astrophysics Data System (ADS)

    Woo, S. Y.; Choi, I. M.; Kim, B. S.

    2005-12-01

    This paper presents a new device to automatically handle a large number of weights for the calibration of a pressure gauge. This newly invented weight-handling device is made for use in conjunction with a commercial oil pressure balance. Although the pressure balance is essential as a calibration tool, its use has been generally tedious and labour intensive for a long time. In particular, the process of loading a different combination of weights on the top of a piston requires repetitious manual handling for every new measurement. This inevitably leaves the operator fatigued, and sometimes causes damage to the weights due to careless handling. The newly invented automatic weight-handling device can eliminate such tedious, error-prone and wear-inducing manual weight manipulation. The device consists of a stepping motor, a drive belt, a solenoid valve, three weight-lifting assemblies and three linear-motion guide assemblies. The weight-lifting assembly is composed of a pneumatic actuator, a solid-state switch and a metal finger. It has many advantages compared with the commercial automatic weight-handling device. Firstly, it is not necessary to lift all the weights off the piston in the weight selection process, as it is in the case of the commercial device. Thus it can prevent a permanent deformation of the weight carrier. Secondly, this new device can handle a larger number of weights than the commercial one. This is because the new device adopts a different method in retaining the remaining weights in place. Another advantage of this new device is that there is no possibility of the fingers touching the surface of the weights due to the oscillation of weights. Moreover it uses the general technology of a stepping motor, and is also made up of components that are easily obtainable in the market, thereby being very economical.
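
    The weight-selection step the device automates is, at its core, choosing a combination of masses for each target load. A toy greedy sketch of that bookkeeping (the weight set is hypothetical, and real pressure-balance software also accounts for piston mass, air buoyancy, and local gravity):

```python
def select_weights(target_kg, available_kg):
    """Greedily pick a combination of weights approximating a target load,
    largest weights first. Illustration only."""
    chosen, remaining = [], target_kg
    for w in sorted(available_kg, reverse=True):
        if w <= remaining + 1e-9:
            chosen.append(w)
            remaining -= w
    return chosen, remaining

stack = [10, 5, 2, 2, 1, 0.5, 0.2, 0.2, 0.1]   # hypothetical weight set [kg]
combo, residual = select_weights(13.8, stack)
print(combo, f"residual {residual:.3f} kg")
```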

  19. Simulation of Unique Pressure Changing Steps and Situations in Psa Processes

    NASA Technical Reports Server (NTRS)

    Ebner, Armin D.; Mehrotra, Amal; Knox, James C.; LeVan, Douglas; Ritter, James A.

    2007-01-01

    A more rigorous cyclic adsorption process simulator is being developed for use in the development and understanding of new and existing PSA processes. Unique features of this new version of the simulator that Ritter and co-workers have been developing for the past decade or so include: multiple adsorbent layers in each bed, pressure drop in the column, valves for entering and exiting flows and predicting real-time pressurization and depressurization rates, ability to account for choked flow conditions, ability to pressurize and depressurize simultaneously from both ends of the columns, ability to equalize between multiple pairs of columns, ability to equalize simultaneously from both ends of pairs of columns, and ability to handle the very large pressure ratios, and hence velocities, associated with deep vacuum systems. These changes to the simulator now provide unique opportunities to study the effects of novel pressure changing steps and extreme process conditions on the performance of virtually any commercial or developmental PSA process. This presentation will provide an overview of the cyclic adsorption process simulator equations and algorithms used in the new adaptation. It will focus primarily on the novel pressure changing steps and their effects on the performance of a PSA system that epitomizes the extremes of PSA process design and operation. This PSA process is a sorbent-based atmosphere revitalization (SBAR) system that NASA is developing for new manned exploration vehicles. This SBAR system consists of a 2-bed, 3-step, 3-layer system that operates between atmospheric pressure and the vacuum of space, evacuates from both ends of the column simultaneously, experiences choked flow conditions during pressure changing steps, and experiences a continuously changing feed composition as it removes metabolic CO2 and H2O from a closed and fixed volume, i.e., the spacecraft cabin. Important process performance indicators of this SBAR system are size, the corresponding CO2 and H2O removal efficiencies, and N2 and O2 loss rates. Results of the fundamental behavior of this PSA process during extreme operating conditions will be presented and discussed.
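
    The choked-flow check that such a simulator applies at each valve follows from compressible flow theory: flow chokes once the downstream-to-upstream pressure ratio drops below the critical ratio. A minimal sketch (the pressures are illustrative; the formula is the standard isentropic result):

```python
def is_choked(p_up, p_down, gamma=1.4):
    """True when flow through a valve/orifice is choked, i.e. when
    p_down/p_up < (2/(gamma+1))**(gamma/(gamma-1)), ~0.528 for gamma=1.4."""
    critical = (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))
    return p_down / p_up < critical

# A bed venting from near-atmospheric pressure to hard vacuum stays choked:
print(is_choked(p_up=101.3, p_down=0.1))   # kPa -> True
```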

  20. 49 CFR 178.930 - Standards for fiberboard Large Packagings.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... puncture resistance of 15 Joules (11 foot-pounds of energy) measured according to ISO 3036 (IBR, see § 171... in handling and transport. Where a detachable pallet is used, its top surface must be free from protrusions that might damage the Large Packaging. (3) Strengthening devices, such as timber supports to...

  5. 49 CFR 178.915 - General Large Packaging standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... intended for solid hazardous materials must be sift-proof and water-resistant. (b) All service equipment... internal pressure of the contents and the stresses of normal handling and transport. A Large Packaging... gross distortion or failure and must be positioned so as to cause no undue stress in any part of the...

  6. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…
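
    The operational method can be sketched with synthetic data: compress the background variables into principal components, then regress proficiency on them. The dimensions, the proficiency proxy, and the use of scikit-learn are illustrative assumptions, not the article's implementation:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
background = rng.normal(size=(2000, 300))   # synthetic questionnaire variables
theta = background[:, 0] - 0.5 * background[:, 1] \
    + rng.normal(scale=0.8, size=2000)      # stand-in for latent proficiency

pcs = PCA(n_components=20).fit_transform(background)  # conventional covariates
model = LinearRegression().fit(pcs, theta)
print(f"R^2 of the PC-based latent regression: {model.score(pcs, theta):.2f}")
```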

  7. Comprehensive numerical methodology for direct numerical simulations of compressible Rayleigh-Taylor instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.

    A comprehensive numerical methodology has been developed that handles the challenges introduced by the compressible nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extremely late times for a wide range of flow compressibility and variable density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
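
    For orientation, the classical inviscid, incompressible linear-stability result that these simulations generalize (a textbook reference point, not the paper's compressible analysis) is

\[
A = \frac{\rho_h - \rho_l}{\rho_h + \rho_l}, \qquad \sigma(k) = \sqrt{A g k},
\]

    where \(\rho_h\) and \(\rho_l\) are the heavy and light fluid densities, \(g\) the acceleration, and \(k\) the perturbation wavenumber. A small Atwood number \(A\) already implies slow growth, which the simulations show is reduced further by compressibility.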

  8. Airport baggage handling--where do human factors fit in the challenges that airports put on a baggage system?

    PubMed

    Lenior, O N M

    2012-01-01

    The challenges airports put on large baggage systems can be summarized as handling a high number of bags in a short period of time, in a limited space, with all sorts of disruptions, whilst complying with stringent regulations on security, sustainability, and health and safety. The aim of this company case study is to show, across the different project phases indicated in the system ergonomic approach, how the human factors specialist can play a major part in tackling these challenges. By describing different projects in terms of scope, organization, human factors topics covered, phases, and lessons learned, the importance of Human-Computer Interaction, automation, as well as manual handling and work organization in baggage handling is addressed.

  9. Using Planning, Scheduling and Execution for Autonomous Mars Rover Operations

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel M.; Chouinard, Caroline M.; Fisher, Forest W.; Castano, Rebecca; Judd, Michele J.; Nesnas, Issa A.

    2006-01-01

    With each new rover mission to Mars, rovers are traveling significantly longer distances. This increase in distance not only raises the opportunities for science data collection, but also amplifies the amount of environment and rover state uncertainty that must be handled in rover operations. This paper describes how planning, scheduling, and execution techniques can be used onboard a rover to autonomously generate and execute rover activities, and in particular to handle new science opportunities that have been identified dynamically. We also discuss some of the particular challenges we face in supporting autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations. Finally, we describe our experiences in testing this work using several Mars rover prototypes in a realistic environment.

  10. Springtime extreme moisture transport into the Arctic and its impact on sea ice concentration

    NASA Astrophysics Data System (ADS)

    Yang, Wenchang; Magnusdottir, Gudrun

    2017-05-01

    Recent studies suggest that springtime moisture transport into the Arctic can initiate sea ice melt that extends over a large area in the following summer and fall, which can help explain Arctic sea ice interannual variability. Yet the impact of an individual moisture transport event, especially the extreme ones, is unclear on synoptic to intraseasonal time scales, and this is the focus of the current study. In a daily data set, springtime extreme moisture transport into the Arctic is found to be concentrated over Atlantic longitudes. Lag composite analysis shows that these extreme events are accompanied by a substantial sea ice concentration reduction over the Greenland-Barents-Kara Seas that lasts around a week. Surface air temperature also becomes anomalously high over these seas, and cold to the west of Greenland as well as over the interior Eurasian continent. The blocking weather regime over the North Atlantic is mainly responsible for the extreme moisture transport, accounting for more than 60% of the total extreme days, while the negative North Atlantic Oscillation regime is hardly observed at all during the extreme transport days. These extreme moisture transport events appear to be preceded, by as much as 2 weeks, by eastward-propagating large-scale tropical convective forcing, though with great uncertainty due to a lack of statistical significance.

  11. Optical phased array configuration for an extremely large telescope.

    PubMed

    Meinel, Aden Baker; Meinel, Marjorie Pettit

    2004-01-20

    Extremely large telescopes are currently under consideration by several groups in several countries. Extrapolation of current technology up to 30 m indicates a cost of over $1 billion. Innovative concepts are being explored to find significant cost reductions. We explore the concept of an Optical Phased Array (OPA) telescope. Each element of the OPA is a separate Cassegrain telescope. Collimated beams from the array are sent via an associated set of delay lines to a central beam combiner. This array of small telescope elements offers the possibility of starting with a low-cost array of a few rings of elements, then adding structure and additional Cassegrain elements until the desired telescope diameter is attained. We address the salient features of such an extremely large telescope and its cost elements relative to more conventional options.

  12. Flood protection diversification to reduce probabilities of extreme losses.

    PubMed

    Zhou, Qian; Lambert, James H; Karvetski, Christopher W; Keisler, Jeffrey M; Linkov, Igor

    2012-11-01

    Recent catastrophic losses from floods require developing resilient approaches to flood risk protection. This article assesses how diversification of a system of coastal protections might decrease the probabilities of extreme flood losses. The study compares the performance of portfolios, each consisting of four types of flood protection assets, in a large region of dike rings. A parametric analysis suggests conditions under which diversification of the types of included flood protection assets decreases extreme flood losses. Increased return periods of extreme losses are associated with portfolios where the asset types have low correlations of economic risk. The effort highlights the importance of understanding correlations across asset types in planning for large-scale flood protection. It allows explicit integration of climate change scenarios in developing a flood mitigation strategy. © 2012 Society for Risk Analysis.
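
    The role of cross-asset correlation in the tail can be illustrated with a small Monte Carlo: for the same marginal losses, lowering the pairwise correlation thins the portfolio's extreme quantiles. The lognormal marginals and four-asset setup below are illustrative assumptions, not the article's dike-ring model:

```python
import numpy as np

rng = np.random.default_rng(1)

def tail_loss(corr, n_assets=4, n_years=100_000, quantile=0.999):
    """1-in-1000-year portfolio loss for equicorrelated lognormal annual
    losses across n_assets flood-protection asset types."""
    cov = np.full((n_assets, n_assets), corr) + (1.0 - corr) * np.eye(n_assets)
    z = rng.multivariate_normal(np.zeros(n_assets), cov, size=n_years)
    losses = np.exp(z).sum(axis=1)   # total loss per simulated year
    return np.quantile(losses, quantile)

for rho in (0.0, 0.5, 0.9):
    print(f"rho={rho}: 1000-year loss ~ {tail_loss(rho):.1f}")  # grows with rho
```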

  13. Automation of Technology for Cancer Research.

    PubMed

    van der Ent, Wietske; Veneman, Wouter J; Groenewoud, Arwin; Chen, Lanpeng; Tulotta, Claudia; Hogendoorn, Pancras C W; Spaink, Herman P; Snaar-Jagalska, B Ewa

    2016-01-01

    Zebrafish embryos can be obtained for research purposes in large numbers at low cost, and embryos develop externally in limited space, making them highly suitable for high-throughput cancer studies and drug screens. Non-invasive live imaging of various processes within the larvae is possible due to their transparency during development and a multitude of available fluorescent transgenic reporter lines. To perform high-throughput studies, handling large numbers of embryos and larvae is required. With such high numbers of individuals, even minute tasks may become time-consuming and arduous. In this chapter, an overview is given of the developments in the automation of various steps of large-scale zebrafish cancer research for discovering important cancer pathways and drugs for the treatment of human disease. The focus lies on various tools developed for cancer cell implantation, embryo handling and sorting, microfluidic systems for imaging and drug treatment, and image acquisition and analysis. Examples will be given of employment of these technologies within the fields of toxicology research and cancer research.

  14. Musculoskeletal injuries resulting from patient handling tasks among hospital workers.

    PubMed

    Pompeii, Lisa A; Lipscomb, Hester J; Schoenfisch, Ashley L; Dement, John M

    2009-07-01

    The purpose of this study was to evaluate musculoskeletal injuries and disorders resulting from patient handling prior to the implementation of a "minimal manual lift" policy at a large tertiary care medical center. We sought to define the circumstances surrounding patient handling injuries and to identify potential preventive measures. Human resources data were used to define the cohort and their time at work. Workers' compensation records (1997-2003) were utilized to identify work-related musculoskeletal claims, while the workers' description of injury was used to identify those that resulted from patient handling. Adjusted rate ratios were generated using Poisson regression. One-third (n = 876) of all musculoskeletal injuries resulted from patient handling activities. Most (83%) of the injury burden was incurred by inpatient nurses, nurses' aides and radiology technicians, while injury rates were highest for nurses' aides (8.8/100 full-time equivalent, FTEs) and smaller workgroups including emergency medical technicians (10.3/100 FTEs), patient transporters (4.3/100 FTEs), operating room technicians (3.1/100 FTEs), and morgue technicians (2.2/100 FTEs). Forty percent of injuries due to lifting/transferring patients may have been prevented through the use of mechanical lift equipment, while 32% of injuries resulting from repositioning/turning patients, pulling patients up in bed, or catching falling patients may not have been prevented by the use of lift equipment. The use of mechanical lift equipment could significantly reduce the risk of some patient handling injuries but additional interventions need to be considered that address other patient handling tasks. Smaller high-risk workgroups should not be neglected in prevention efforts.
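
    Rate ratios of this kind come from a Poisson regression with person-time as an offset. A minimal sketch with hypothetical counts and exposures (not the study's data), using statsmodels:

```python
import numpy as np
import statsmodels.api as sm

counts = np.array([88, 210, 14])             # injuries per workgroup (hypothetical)
fte = np.array([1000.0, 6000.0, 136.0])      # full-time-equivalent years at risk
group = np.array([[1, 0], [0, 0], [0, 1]])   # indicators vs. a reference group

X = sm.add_constant(group)
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.log(fte)).fit()       # log person-time as offset
print(np.exp(fit.params))                    # baseline rate and rate ratios
```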

  15. A Test-Length Correction to the Estimation of Extreme Proficiency Levels

    ERIC Educational Resources Information Center

    Magis, David; Beland, Sebastien; Raiche, Gilles

    2011-01-01

    In this study, the estimation of extremely large or extremely small proficiency levels, given the item parameters of a logistic item response model, is investigated. On one hand, the estimation of proficiency levels by maximum likelihood (ML), despite being asymptotically unbiased, may yield infinite estimates. On the other hand, with an…
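
    The infinite-estimate problem can be made concrete with a standard illustration (not taken from the article). Under a logistic item response model, the likelihood of a response pattern \(x\) is

\[
L(\theta) = \prod_{j=1}^{n} P_j(\theta)^{x_j}\,\bigl(1 - P_j(\theta)\bigr)^{1 - x_j},
\qquad
P_j(\theta) = \frac{1}{1 + e^{-a_j(\theta - b_j)}}.
\]

    If \(x_j = 1\) for every item, each factor \(P_j(\theta)\) is strictly increasing in \(\theta\), so \(L(\theta)\) has no finite maximizer and the ML estimate diverges to \(+\infty\); the all-zero pattern diverges to \(-\infty\) by the same reasoning.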

  16. Heterogeneous Sensitivity of Tropical Precipitation Extremes during Growth and Mature Phases of Atmospheric Warming

    NASA Astrophysics Data System (ADS)

    Parhi, P.; Giannini, A.; Lall, U.; Gentine, P.

    2016-12-01

    Assessing and managing risks posed by climate variability and change is challenging in the tropics, from both a socio-economic and a scientific perspective. Most of the vulnerable countries with a limited climate adaptation capability are in the tropics. However, climate projections, particularly of extreme precipitation, are highly uncertain there. The CMIP5 (Coupled Model Intercomparison Project Phase 5) inter-model range of extreme precipitation sensitivity to global temperature under climate change is much larger in the tropics than in the extra-tropics, ranging from nearly 0% to greater than 30% across models (O'Gorman 2012). The uncertainty is also large in historical gauge- or satellite-based observational records. These large uncertainties in the sensitivity of tropical precipitation extremes highlight the need to better understand how tropical precipitation extremes respond to warming. We hypothesize that one factor explaining the large uncertainty is differing sensitivities during different phases of warming. We consider the 'growth' and 'mature' phases of warming under the climate variability case, typically associated with an El Niño event. In the remote tropics (away from the tropical Pacific Ocean), the response of precipitation extremes during the two phases can proceed through different pathways: (i) a direct and fast-changing radiative forcing in an atmospheric column, acting top-down through tropospheric warming, and/or (ii) an indirect effect via changes in surface temperatures, acting bottom-up through surface water and energy fluxes. We also speculate that the insights gained here might be useful in interpreting the large sensitivity under climate change scenarios, since the physical mechanisms during the two warming phases in the climate variability case have some correspondence with increasing and stabilized greenhouse gas emission scenarios.

  17. Background levels in the Borexino detector

    NASA Astrophysics Data System (ADS)

    D'Angelo, Davide; Wurm, Michael; Borexino Collaboration

    2008-11-01

    The Borexino detector, designed and constructed for sub-MeV solar neutrino spectroscopy, has been taking data at the Gran Sasso Laboratory, Italy, since May 2007. The main physics objective of Borexino, based on elastic scattering of neutrinos in organic liquid scintillator, is the real-time flux measurement of the 862 keV mono-energetic neutrinos from 7Be, which sets extremely severe radio-purity requirements on the detector's design and handling. The first year of continuous data taking now provides evidence of the extremely low background levels achieved in the construction of the detector and in the purification of the target mass. Several analyses sense the presence of radioisotopes of the 238U and 232Th chains, of 85Kr, and of 210Po out of equilibrium with other radon daughters. Particular emphasis is given to the detection of the cosmic muon background, whose angular distributions have been obtained with the outer detector tracking algorithm, and to the possibility of tagging the muon-induced neutron background in the scintillator with the recently enhanced electronics setup.

  18. A longitudinal study of ice hockey in boys aged 8--12.

    PubMed

    MacNab, R B

    1979-03-01

    A group of fifteen boys (experimental or competitive) was studied over a five-year period of competitive ice hockey beginning at age 8. The subjects were members of a team which averaged 66 games per year, ranging from 50 at age 8 to 78 at age 12. In addition, they practiced twice a week with heavy stress on skating and individual puck-handling skills. A second group of eleven boys (control or less competitive) was studied from age 10 to 12. The latter subjects played an average of 25 games per year and practiced once a week. All subjects were measured each year on skating and puck control skills, fitness-performance tests, grip strength, and physical work capacity, as well as height and weight. The results demonstrate learning curves for skating and puck control tests which, while typical in nature, show extremely high levels of achievement. Fitness-performance, grip strength, and physical work capacity levels of the competitive group are extremely high in comparison with data from other countries.

  19. Numerical proof of stability of roll waves in the small-amplitude limit for inclined thin film flow

    NASA Astrophysics Data System (ADS)

    Barker, Blake

    2014-10-01

    We present a rigorous numerical proof based on interval arithmetic computations categorizing the linearized and nonlinear stability of periodic viscous roll waves of the KdV-KS equation, modeling weakly unstable flow of a thin fluid film on an incline in the small-amplitude KdV limit. The argument proceeds by verification of a stability condition derived by Bar and Nepomnyashchy and by Johnson, Noble, Rodrigues, and Zumbrun, involving inner products of various elliptic functions arising through the KdV equation. One key point in the analysis is a bootstrap argument balancing the extremely poor sup-norm bounds for these functions against the extremely good convergence properties of analytic interpolation in order to obtain a feasible computation time. Another is the way of handling analytic interpolation in several variables by a two-step process carving up the parameter space into manageable pieces for rigorous evaluation. These and other general aspects of the analysis should serve as blueprints for more general analyses of spectral stability.
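
    The heart of an interval arithmetic proof is that every operation returns an enclosure guaranteed to contain the exact result, so a verified sign or bound becomes rigorous. A toy sketch over exact rationals (a real computer-assisted proof adds outward-rounded floating point and interval enclosures of the elliptic functions involved):

```python
from fractions import Fraction

class Interval:
    """Minimal interval arithmetic over exact rationals: results of + and *
    are intervals guaranteed to enclose the true value."""
    def __init__(self, lo, hi):
        self.lo, self.hi = Fraction(lo), Fraction(hi)
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))
    def __repr__(self):
        return f"[{float(self.lo):.6f}, {float(self.hi):.6f}]"

x = Interval(Fraction(1, 3), Fraction(1, 3) + Fraction(1, 10**6))
y = Interval(-2, -1)
print(x * y + x)   # certified enclosure of the exact expression
```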

  20. Nonempirical Semilocal Free-Energy Density Functional for Matter under Extreme Conditions.

    PubMed

    Karasiev, Valentin V; Dufty, James W; Trickey, S B

    2018-02-16

    Realizing the potential for predictive density functional calculations of matter under extreme conditions depends crucially upon having an exchange-correlation (XC) free-energy functional accurate over a wide range of state conditions. Unlike the ground-state case, no such functional exists. We remedy that with systematic construction of a generalized gradient approximation XC free-energy functional based on rigorous constraints, including the free-energy gradient expansion. The new functional provides the correct temperature dependence in the slowly varying regime and the correct zero-T, high-T, and homogeneous electron gas limits. Its accuracy in the warm dense matter regime is attested by excellent agreement of the calculated deuterium equation of state with reference path integral Monte Carlo results at intermediate and elevated T. Pressure shifts for hot electrons in compressed static fcc Al and for low-density Al demonstrate the combined magnitude of thermal and gradient effects handled well by this functional over a wide T range.

  1. A comprehensive dose assessment of irradiated hand by iridium-192 source in industrial radiography.

    PubMed

    Hosseini Pooya, S M; Dashtipour, M R; Paydar, R; Mianji, F; Pourshahab, B

    2017-09-01

    Among the various incidents in industrial radiography, inadvertent handling of sources by hand is one of the most frequent, in which some parts of the hands may be locally exposed to high doses. An accurate assessment of extremity dose assists medical doctors in selecting appropriate treatments and in preventing expansion of the injury in the region. In this study, a phantom was designed to simulate the fisted hand of a radiographer holding a radioactive source. The local doses were measured using TLDs implanted in the phantom at different distances from a source. Furthermore, skin dose distribution was measured with Gafchromic films in the palm region of the phantom. The reliability of the measurements has been studied via analytical as well as Monte Carlo simulation methods. The results showed that the new phantom design can be used reliably in extremity dose assessments, particularly at points next to the source.

  2. Forecasting the value-at-risk of Chinese stock market using the HARQ model and extreme value theory

    NASA Astrophysics Data System (ADS)

    Liu, Guangqiang; Wei, Yu; Chen, Yongfei; Yu, Jiang; Hu, Yang

    2018-06-01

    Using intraday data of the CSI300 index, this paper discusses value-at-risk (VaR) forecasting of the Chinese stock market from the perspective of high-frequency volatility models. First, we measure the realized volatility (RV) with 5-minute high-frequency returns of the CSI300 index and then model it with the newly introduced heterogeneous autoregressive quarticity (HARQ) model, which can handle the time-varying coefficients of the HAR model. Second, we forecast the out-of-sample VaR of the CSI300 index by combining the HARQ model and extreme value theory (EVT). Finally, using several popular backtesting methods, we compare the VaR forecasting accuracy of HARQ model with other traditional HAR-type models, such as HAR, HAR-J, CHAR, and SHAR. The empirical results show that the novel HARQ model can beat other HAR-type models in forecasting the VaR of the Chinese stock market at various risk levels.
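
    The first step, realized volatility from 5-minute returns, is a one-line computation: the realized variance is the sum of squared intraday log returns. The synthetic price path below stands in for the CSI300 intraday data:

```python
import numpy as np

def realized_variance(prices_5min):
    """Daily realized variance: sum of squared 5-minute log returns."""
    r = np.diff(np.log(prices_5min))
    return np.sum(r ** 2)

rng = np.random.default_rng(42)
prices = 3500.0 * np.exp(np.cumsum(rng.normal(0.0, 0.001, 49)))  # one day, 48 returns
rv = realized_variance(prices)
print(f"RV = {rv:.6f}, daily vol ~ {np.sqrt(rv):.4f}")
```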

  3. Nonempirical Semilocal Free-Energy Density Functional for Matter under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Karasiev, Valentin V.; Dufty, James W.; Trickey, S. B.

    2018-02-01

    Realizing the potential for predictive density functional calculations of matter under extreme conditions depends crucially upon having an exchange-correlation (XC) free-energy functional accurate over a wide range of state conditions. Unlike the ground-state case, no such functional exists. We remedy that with systematic construction of a generalized gradient approximation XC free-energy functional based on rigorous constraints, including the free-energy gradient expansion. The new functional provides the correct temperature dependence in the slowly varying regime and the correct zero-T, high-T, and homogeneous electron gas limits. Its accuracy in the warm dense matter regime is attested by excellent agreement of the calculated deuterium equation of state with reference path integral Monte Carlo results at intermediate and elevated T. Pressure shifts for hot electrons in compressed static fcc Al and for low-density Al demonstrate the combined magnitude of thermal and gradient effects handled well by this functional over a wide T range.

  4. Finite element analysis of steady and transiently moving/rolling nonlinear viscoelastic structure. III - Impact/contact simulations

    NASA Technical Reports Server (NTRS)

    Nakajima, Yukio; Padovan, Joe

    1987-01-01

    In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling-load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving problems involving contact/impact-type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered. These include the rolling/sliding impact of tires with road obstructions.

  5. Vacuum jacketed composite propulsion feedlines for cryogenic launch and space vehicles, volume 1. [development of glass fiber composite for strength and protection from handling damage

    NASA Technical Reports Server (NTRS)

    Spond, D. E.; Laintz, D. J.; Hall, C. A.; Dulaigh, D. E.

    1974-01-01

    Thin metallic liners that provide leak-free service in cryogenic propulsion systems are overwrapped with a glass-fiber composite that provides strength and protection from handling damage. The resultant tube is lightweight, strong, and has a low thermal flux. The inside commodity flow line and the outside vacuum jacket were fabricated using this method. Several types of vacuum jackets were fabricated and tested at operating temperatures from 294 to 21 K (+70 to minus 423 F) and operating pressure up to 69 N/cm2 (100 psi). The primary objective of the program was to develop vacuum jacket concepts, using previously developed concepts for the inner line. All major program objectives were met resulting in a design concept that is adaptable to a wide range of aerospace vehicle requirements. Major items of development included convolution of thin metallic sections up to 46 cm (18 in.) in diameter, design and fabrication of an extremely lightweight tension membrane concept for the vacuum jacket, and analytical tools that predict the failure mode and levels.

  6. Particle protection capability of SEMI-compliant EUV-pod carriers

    NASA Astrophysics Data System (ADS)

    Huang, George; He, Long; Lystad, John; Kielbaso, Tom; Montgomery, Cecilia; Goodwin, Frank

    2010-04-01

    With the projected rollout of pre-production extreme ultraviolet lithography (EUVL) scanners in 2010, EUVL pilot line production will become a reality in wafer fabrication companies. Among EUVL infrastructure items that must be ready, EUV mask carriers remain critical. To keep non-pellicle EUV masks free from particle contamination, an EUV pod concept has been extensively studied. Early prototypes demonstrated nearly particle-free results at a 53 nm PSL equivalent inspection sensitivity during EUVL mask robotic handling, shipment, vacuum pump-purge, and storage. After the passage of SEMI E152, which specifies the EUV pod mechanical interfaces, standards-compliant EUV pod prototypes, including a production version inner pod and prototype outer pod, were built and tested. Their particle protection capability results are reported in this paper. A state-of-the-art blank defect inspection tool was used to quantify their defect protection capability during mask robotic handling, shipment, and storage tests. To ensure the availability of an EUV pod for 2010 pilot production, the progress and preliminary test results of pre-production EUV outer pods are reported as well.

  7. Characterizing differences in precipitation regimes of extreme wet and dry years: implications for climate change experiments.

    PubMed

    Knapp, Alan K; Hoover, David L; Wilcox, Kevin R; Avolio, Meghan L; Koerner, Sally E; La Pierre, Kimberly J; Loik, Michael E; Luo, Yiqi; Sala, Osvaldo E; Smith, Melinda D

    2015-02-03

    Climate change is intensifying the hydrologic cycle and is expected to increase the frequency of extreme wet and dry years. Beyond precipitation amount, extreme wet and dry years may differ in other ways, such as the number of precipitation events, event size, and the time between events. We assessed 1614 long-term (100 year) precipitation records from around the world to identify key attributes of precipitation regimes, besides amount, that distinguish statistically extreme wet from extreme dry years. In general, in regions where mean annual precipitation (MAP) exceeded 1000 mm, precipitation amounts in extreme wet and dry years differed from average years by ~40% and 30%, respectively. The magnitude of these deviations increased to >60% for dry years and to >150% for wet years in arid regions (MAP<500 mm). Extreme wet years were primarily distinguished from average and extreme dry years by the presence of multiple extreme (large) daily precipitation events (events >99th percentile of all events); these occurred twice as often in extreme wet years compared to average years. In contrast, these large precipitation events were rare in extreme dry years. Less important for distinguishing extreme wet from dry years were mean event size and frequency, or the number of dry days between events. However, extreme dry years were distinguished from average years by an increase in the number of dry days between events. These precipitation regime attributes consistently differed between extreme wet and dry years across 12 major terrestrial ecoregions from around the world, from deserts to the tropics. Thus, we recommend that climate change experiments and model simulations incorporate these differences in key precipitation regime attributes, as well as amount into treatments. This will allow experiments to more realistically simulate extreme precipitation years and more accurately assess the ecological consequences. © 2015 John Wiley & Sons Ltd.
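
    The year-classification and event-counting logic can be sketched as follows; the percentile thresholds and the synthetic gamma-distributed record are illustrative assumptions, not the study's 1614 station records:

```python
import numpy as np

def classify_years(daily_by_year, wet_q=0.9, dry_q=0.1, event_q=0.99):
    """Flag extreme wet/dry years from annual totals and count extreme daily
    events (wet-day amounts above the event_q percentile of all wet days)."""
    totals = np.array([year.sum() for year in daily_by_year])
    wet_days = np.concatenate(daily_by_year)
    big = np.quantile(wet_days[wet_days > 0], event_q)
    wet = totals >= np.quantile(totals, wet_q)
    dry = totals <= np.quantile(totals, dry_q)
    n_big = np.array([(year > big).sum() for year in daily_by_year])
    return wet, dry, n_big

rng = np.random.default_rng(7)   # synthetic 100-year daily record [mm]
record = [rng.gamma(0.4, 8.0, 365) * (rng.random(365) < 0.3) for _ in range(100)]
wet, dry, n_big = classify_years(record)
print(f"extreme events per wet year: {n_big[wet].mean():.1f}, "
      f"per dry year: {n_big[dry].mean():.1f}")
```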

  8. Multiresource inventories incorporating GIS, GPS, and database management systems

    Treesearch

    Loukas G. Arvanitis; Balaji Ramachandran; Daniel P. Brackett; Hesham Abd-El Rasol; Xuesong Du

    2000-01-01

    Large-scale natural resource inventories generate enormous data sets. Their effective handling requires a sophisticated database management system. Such a system must be robust enough to efficiently store large amounts of data and flexible enough to allow users to manipulate a wide variety of information. In a pilot project, related to a multiresource inventory of the...

  9. Correction Methods for Organic Carbon Artifacts when Using Quartz-Fiber Filters in Large Particulate Matter Monitoring Networks: The Regression Method and Other Options

    EPA Science Inventory

    Sampling and handling artifacts can bias filter-based measurements of particulate organic carbon (OC). Several measurement-based methods for OC artifact reduction and/or estimation are currently used in research-grade field studies. OC frequently is not artifact-corrected in larg...

  10. Functional studies in 79-year-olds. II. Upper extremity function.

    PubMed

    Lundgren-Lindquist, B; Sperling, L

    1983-01-01

    As part of the Gerontological and Geriatric Population Study of 79-year-old people in Göteborg, a representative subsample comprising 112 women and 93 men took part in a study of upper extremity function. Thirty-eight per cent of the women and 37% of the men had disorders in the upper extremities. The investigation included tests of co-ordination, static strength in the key-grip and the transversal volar grip, power capacity in opening jars and a bottle, basal movements of the upper extremities in personal hygiene and dressing activities, function in the kitchen (e.g. reaching shelves), and manual tasks including tests of pronation and supination of the forearm. In both the key-grip and the transversal volar grip, men showed a generally larger decrease in strength with age than women, compared with 70-year-olds in a previous population study. Significant correlations were found between strength in the key-grip and the performance time in the test of co-ordination. Women produced about 66% of the muscular force of the men when opening jars. Significant correlations were found between strength in the transversal volar grip and the maximal torque for opening the jars. Female and male subjects who were not capable of handling the electric plug in the manual ability test had significantly weaker key-grip strength. The importance of designing products and adapting the environment to correspond to the functional capacity of the elderly is emphasized.

  11. The effect of ergonomic laparoscopic tool handle design on performance and efficiency.

    PubMed

    Tung, Kryztopher D; Shorti, Rami M; Downey, Earl C; Bloswick, Donald S; Merryweather, Andrew S

    2015-09-01

    Many factors can affect a surgeon's performance in the operating room; these may include surgeon comfort, ergonomics of tool handle design, and fatigue. A laparoscopic tool handle designed with ergonomic considerations (pistol grip) was tested against a current market tool with a traditional pinch grip handle. The goal of this study is to quantify the impact that ergonomic design considerations have on surgeon performance. We hypothesized that there would be measurable differences in efficiency between the two tool handle designs when performing FLS surgical trainer tasks, in three categories: time to completion, technical skill, and subjective user ratings. The pistol grip incorporates an ergonomic interface intended to reduce contact stress points on the hand and fingers, promote a more neutral operating wrist posture, and reduce hand tremor and fatigue. The traditional pinch grip is a laparoscopic tool developed by Stryker Inc. that is widely used during minimally invasive surgery. Twenty-three participants (13 M, 10 F) with no existing upper extremity musculoskeletal disorders and no experience performing laparoscopic procedures were selected for this study. During a training session prior to testing, participants performed practice trials in a SAGES FLS trainer with both tools. During data collection, participants performed three evaluation tasks using both handle designs (order was randomized, and each trial was completed three times). The tasks consisted of FLS peg transfer, cutting, and suturing tasks. Feedback from test participants indicated that they significantly preferred the ergonomic pistol grip in every category (p < 0.05); most notably, participants experienced greater degrees of discomfort in their hands after using the pinch grip tool. Furthermore, participants completed cutting and peg transfer tasks in a shorter time (p < 0.05) with the pistol grip than with the pinch grip design; there was no significant difference between completion times for the suturing task. Finally, there was no significant interaction between tool type and errors made during trials. In summary, survey feedback showed a significant preference for, and lower pain during use of, the pistol grip tool, and the cutting and peg transfer tasks were completed significantly faster with it. Due to the high degree of variability in the error data, it was not possible to draw any meaningful conclusions about the effect of tool design on the number or degree of errors made.

  12. The Seasonal Predictability of Extreme Wind Events in the Southwest United States

    NASA Astrophysics Data System (ADS)

    Seastrand, Simona Renee

    Extreme wind events are a common phenomenon in the Southwest United States. Entities such as the United States Air Force (USAF) find the Southwest appealing for many reasons, primarily its expansive, unpopulated, and electronically unpolluted space for large-scale training and testing. However, wind events can create hazards for the USAF: surface wind gusts can impact the take-off and landing of all aircraft, can tip the airframes of large wing-surface aircraft during the performance of maneuvers close to the ground, and can even impact weapons systems. This dissertation comprises three sections intended to further our knowledge and understanding of wind events in the Southwest. The first section builds a climatology of wind events for seven locations in the Southwest during the twelve 3-month seasons of the year, and further examines the wind events in relation to terrain and the large-scale flow of the atmosphere. The second section builds upon the first by taking the wind events and generating mid-level composites for each of the twelve 3-month seasons. In the third section, teleconnections identified in the second section as consistent with the large-scale circulation were used as predictor variables to build a Poisson regression model for each of the twelve 3-month seasons. The purpose of this research is to increase our understanding of the climatology of extreme wind events, to increase our understanding of how the large-scale circulation influences extreme wind events, and to create a model to enhance predictability of extreme wind events in the Southwest. Knowledge from this work will help protect personnel and property associated with not only the USAF, but all those in the Southwest.

  13. A parallel metaheuristic for large mixed-integer dynamic optimization problems, with applications in computational biology

    PubMed Central

    Henriques, David; González, Patricia; Doallo, Ramón; Saez-Rodriguez, Julio; Banga, Julio R.

    2017-01-01

    Background We consider a general class of global optimization problems dealing with nonlinear dynamic models. Although this class is relevant to many areas of science and engineering, here we are interested in applying this framework to the reverse engineering problem in computational systems biology, which yields very large mixed-integer dynamic optimization (MIDO) problems. In particular, we consider the framework of logic-based ordinary differential equations (ODEs). Methods We present saCeSS2, a parallel method for the solution of this class of problems. This method is based on a parallel cooperative scatter search metaheuristic, with new mechanisms of self-adaptation and specific extensions to handle large mixed-integer problems. We have paid special attention to the avoidance of convergence stagnation using adaptive cooperation strategies tailored to this class of problems. Results We illustrate its performance with a set of three very challenging case studies from the domain of dynamic modelling of cell signaling. The simplest case study considers a synthetic signaling pathway and has 84 continuous and 34 binary decision variables. A second case study considers the dynamic modeling of signaling in liver cancer using high-throughput data, and has 135 continuous and 109 binary decision variables. The third case study is an extremely difficult problem related to breast cancer, involving 690 continuous and 138 binary decision variables. We report computational results obtained in different infrastructures, including a local cluster, a large supercomputer, and a public cloud platform. Interestingly, the results show how the cooperation of individual parallel searches modifies the systemic properties of the sequential algorithm, achieving superlinear speedups compared to an individual search (e.g. speedups of 15 with 10 cores), and significantly improving (by more than 60%) the performance with respect to a non-cooperative parallel scheme. The scalability of the method is also good (tests were performed using up to 300 cores). Conclusions These results demonstrate that saCeSS2 can be used to successfully reverse engineer large dynamic models of complex biological pathways. Further, these results open up new possibilities for other MIDO-based large-scale applications in the life sciences such as metabolic engineering, synthetic biology, and drug scheduling. PMID:28813442
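
    As a quick check on what "superlinear" means here: speedup is serial time over parallel time, and parallel efficiency is speedup per core, so an efficiency above 1 marks superlinear behaviour. The timings below are hypothetical, chosen to reproduce the quoted 15x on 10 cores:

```python
def speedup_and_efficiency(t_serial, t_parallel, cores):
    """Speedup S = T1/Tp and efficiency E = S/p; E > 1 is superlinear."""
    s = t_serial / t_parallel
    return s, s / cores

s, e = speedup_and_efficiency(t_serial=150.0, t_parallel=10.0, cores=10)
print(f"speedup {s:.0f}x on 10 cores, efficiency {e:.1f}")   # 15x, E = 1.5
```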

  14. A parallel metaheuristic for large mixed-integer dynamic optimization problems, with applications in computational biology.

    PubMed

    Penas, David R; Henriques, David; González, Patricia; Doallo, Ramón; Saez-Rodriguez, Julio; Banga, Julio R

    2017-01-01

    We consider a general class of global optimization problems dealing with nonlinear dynamic models. Although this class is relevant to many areas of science and engineering, here we are interested in applying this framework to the reverse engineering problem in computational systems biology, which yields very large mixed-integer dynamic optimization (MIDO) problems. In particular, we consider the framework of logic-based ordinary differential equations (ODEs). We present saCeSS2, a parallel method for the solution of this class of problems. This method is based on a parallel cooperative scatter search metaheuristic, with new mechanisms of self-adaptation and specific extensions to handle large mixed-integer problems. We have paid special attention to the avoidance of convergence stagnation using adaptive cooperation strategies tailored to this class of problems. We illustrate its performance with a set of three very challenging case studies from the domain of dynamic modelling of cell signaling. The simplest case study considers a synthetic signaling pathway and has 84 continuous and 34 binary decision variables. A second case study considers the dynamic modeling of signaling in liver cancer using high-throughput data, and has 135 continuous and 109 binary decision variables. The third case study is an extremely difficult problem related to breast cancer, involving 690 continuous and 138 binary decision variables. We report computational results obtained in different infrastructures, including a local cluster, a large supercomputer, and a public cloud platform. Interestingly, the results show how the cooperation of individual parallel searches modifies the systemic properties of the sequential algorithm, achieving superlinear speedups compared to an individual search (e.g. speedups of 15 with 10 cores), and significantly improving (by more than 60%) the performance with respect to a non-cooperative parallel scheme. The scalability of the method is also good (tests were performed using up to 300 cores). These results demonstrate that saCeSS2 can be used to successfully reverse engineer large dynamic models of complex biological pathways. Further, these results open up new possibilities for other MIDO-based large-scale applications in the life sciences such as metabolic engineering, synthetic biology, and drug scheduling.

  15. Biochemical and physiological responses of Carcinus maenas to temperature and the fungicide azoxystrobin.

    PubMed

    Rodrigues, Elsa Teresa; Moreno, António; Mendes, Tito; Palmeira, Carlos; Pardal, Miguel Ângelo

    2015-08-01

    Research on the effects of thermal stress is currently pertinent, as climate change is expected to cause more severe climate-driven events. Carcinus maenas, a recognised estuarine model organism, was selected to test the temperature-dependence of the toxicity of azoxystrobin, a widely applied fungicide. Crabs' responses were assessed after a 10-d acclimation at different temperatures (5°C, 22°C and 27°C), with the last 72 h including exposure to an environmental concentration of azoxystrobin. SOD and GST activities, mitochondrial oxygen consumption rates and protein content, as well as the Coupling Index, were determined. The hypothesis was that extreme temperatures (5°C and 27°C) and azoxystrobin would affect crabs' responses. Results showed statistically significant effects of temperature on SOD activity and on all measured oxygen consumption rates, whereas neither 30.3 μg L(-1) of azoxystrobin alone nor the combined exposure elicited responses in the crabs. Protein content at 5°C was statistically higher than at the control temperature (22°C). The Coupling Index decreased slightly at 5°C and drastically at 27°C. Regarding azoxystrobin effects, at 22°C this index decreased only slightly; at the extreme temperatures, however, it fell by 47% at 5°C and increased slightly at 27°C. These results provide evidence that crabs' responses for coping with low temperatures were more effective than their responses for coping with the high temperatures expected in future climate projections. Moreover, crabs are capable of handling environmental concentrations of azoxystrobin. However, the Coupling Index showed that combined stress factors unbalance crabs' natural capability to handle a single stressor. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Contribution of large-scale circulation anomalies to changes in extreme precipitation frequency in the United States

    NASA Astrophysics Data System (ADS)

    Yu, Lejiang; Zhong, Shiyuan; Pei, Lisi; Bian, Xindi; Heilman, Warren E.

    2016-04-01

    The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for severe flooding over a large region, little is known about how extreme precipitation events that cause flash flooding and occur at sub-daily time scales have changed over time. Here we use the observed hourly precipitation from the North American Land Data Assimilation System Phase 2 forcing datasets to determine trends in the frequency of extreme precipitation events of short (1 h, 3 h, 6 h, 12 h and 24 h) duration for the period 1979-2013. The results indicate an increasing trend in the central and eastern US. Over most of the western US, especially the Southwest and the Intermountain West, the trends are generally negative. These trends can be largely explained by the interdecadal variability of the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation (AMO), with the AMO making a greater contribution to the trends in both warm and cold seasons.
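
    As a rough illustration of this kind of analysis, the sketch below counts annual exceedances of a high percentile threshold for one accumulation duration and fits a linear trend. The synthetic data, threshold percentile, and function names are our own assumptions, not the NLDAS-2 workflow of the study.

    ```python
    import numpy as np

    def annual_exceedance_counts(hourly, years, duration_h=3, pct=99.0):
        # Rolling accumulation over the chosen duration, then count, per year,
        # how often it exceeds the all-years percentile threshold.
        accum = np.convolve(hourly, np.ones(duration_h), mode="valid")
        yrs = years[: accum.size]
        thresh = np.percentile(accum, pct)
        labels = np.unique(yrs)
        counts = np.array([(accum[yrs == y] > thresh).sum() for y in labels])
        return labels, counts

    rng = np.random.default_rng(1)
    n_years, hours = 35, 8760
    hourly = rng.gamma(0.1, 2.0, size=n_years * hours)      # synthetic record
    years = np.repeat(np.arange(1979, 1979 + n_years), hours)

    labels, counts = annual_exceedance_counts(hourly, years)
    slope, intercept = np.polyfit(labels, counts, 1)         # linear trend
    print(f"trend: {slope:.3f} exceedances per year")
    ```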

  17. Increasing climate whiplash in 21st century California

    NASA Astrophysics Data System (ADS)

    Swain, D. L.; Langenbrunner, B.; Neelin, J. D.; Hall, A. D.

    2017-12-01

    Temperate "Mediterranean" climate regimes across the globe are particularly susceptible to wide swings between drought and flood—of which California's rapid transition from record multi-year dryness between 2012-2016 to extreme wetness during 2016-2017 provides a dramatic example. The wide-ranging human and environmental impacts of this recent "climate whiplash" event in a highly-populated, economically critical, and biodiverse region highlight the importance of understanding weather and climate extremes at both ends of the hydroclimatic spectrum. Previous studies have examined the potential contribution of anthropogenic warming to recent California extremes, but findings to date have been mixed and primarily drought-focused. Here, we use specific historical California flood and drought events as thresholds for quantifying long-term changes in precipitation extremes using a large ensemble of multi-decadal climate model simulations (CESM-LENS). We find that greenhouse gas emissions are already responsible for a detectable increase in both wet and dry extremes across portions of California, and that increasing 21st century "climate whiplash" will likely yield large increases in the frequency of both rapid "dry-to-wet" transitions and severe flood events over a wide range of timescales. This projected intensification of California's hydrological cycle would seriously challenge the region's existing water storage, conveyance, and flood control infrastructure—even absent large changes in mean precipitation.

  18. Handling qualities of large flexible control-configured aircraft

    NASA Technical Reports Server (NTRS)

    Swaim, R. L.

    1980-01-01

    The effects on handling qualities of low-frequency symmetric elastic mode interaction with the rigid-body dynamics of a large flexible aircraft were analyzed by use of a mathematical pilot-modeling computer simulation. The optimal control model for a human pilot was extended so that the mode interaction effects on the pilot's control task could be assessed. Pilot ratings were determined for a longitudinal tracking task, with the undamped natural frequencies of the two lowest-frequency symmetric elastic modes varied parametrically to induce varying amounts of mode interaction. Relating the numerical performance index values associated with the frequency variations in several dynamic cases to a numerical Cooper-Harper pilot rating proved successful in discriminating when the mathematical pilot can or cannot separate rigid from elastic response in the tracking task.

  19. Climate Change and Hydrological Extreme Events - Risks and Perspectives for Water Management in Bavaria and Québec

    NASA Astrophysics Data System (ADS)

    Ludwig, R.

    2017-12-01

    It is not yet firmly established whether and how climate change contributes to the magnitude and frequency of hydrological extreme events, or how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High-performance computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high-resolution (12 km) data from the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to account for the assessment of both natural climate variability and climate change. The large ensemble, with several thousand model years, provides the potential to capture rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec at high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. This unique data situation makes it possible to establish a new method of 'virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data that reveal preferential triggers of hydrological extreme events. The presentation will highlight first results from the analysis of the large-scale ClimEx model ensemble, showing the current and future ratio of natural variability to climate change impacts on meteorological extreme events. Selected data from the ensemble are used to drive a hydrological model experiment to illustrate the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change.

  20. European Extremely Large Telescope: progress report

    NASA Astrophysics Data System (ADS)

    Tamai, R.; Spyromilio, J.

    2014-07-01

    The European Extremely Large Telescope is a project of the European Southern Observatory to build and operate a 40-m class optical near-infrared telescope. The telescope design effort is largely concluded and construction contracts are being placed with industry and academic/research institutes for the various components. The siting of the telescope in Northern Chile close to the Paranal site allows for an integrated operation of the facility providing significant economies. The progress of the project in various areas is presented in this paper and references to other papers at this SPIE meeting are made.

  1. Construction of a ratiometric fluorescent probe with an extremely large emission shift for imaging hypochlorite in living cells

    NASA Astrophysics Data System (ADS)

    Song, Xuezhen; Dong, Baoli; Kong, Xiuqi; Wang, Chao; Zhang, Nan; Lin, Weiying

    2018-01-01

    Hypochlorite is one of the important reactive oxygen species (ROS) and plays critical roles in many biologically vital processes. Herein, we present a unique ratiometric fluorescent probe (CBP) with an extremely large emission shift for detecting hypochlorite in living cells. Utilizing a positively charged α,β-unsaturated carbonyl group as the reaction site, the probe CBP itself exhibits near-infrared (NIR) fluorescence at 662 nm and displays strong blue fluorescence at 456 nm upon responding to hypochlorite. Notably, the extremely large emission shift of 206 nm enables precise measurement of the fluorescence peak intensities and their ratios. CBP showed high sensitivity, excellent selectivity, desirable performance at physiological pH, and low cytotoxicity. The bioimaging experiments demonstrate the biological applicability of CBP for the ratiometric imaging of hypochlorite in living cells.

  2. Constrained Maximum Likelihood Estimation for Model Calibration Using Summary-level Information from External Big Data Sources

    PubMed Central

    Chatterjee, Nilanjan; Chen, Yi-Hau; Maas, Paige; Carroll, Raymond J.

    2016-01-01

    Information from various public and private data sources of extremely large sample size is now increasingly available for research purposes. Statistical methods are needed for utilizing information from such big data sources while analyzing data from individual studies that may collect more detailed information required for addressing specific hypotheses of interest. In this article, we consider the problem of building regression models based on individual-level data from an “internal” study while utilizing summary-level information, such as information on parameters for reduced models, from an “external” big data source. We identify a set of very general constraints that link internal and external models. These constraints are used to develop a framework for semiparametric maximum likelihood inference that allows the distribution of covariates to be estimated using either the internal sample or an external reference sample. We develop extensions for handling complex stratified sampling designs, such as case-control sampling, for the internal study. Asymptotic theory and variance estimators are developed for each case. We use simulation studies and a real data application to assess the performance of the proposed methods in contrast to the generalized regression (GR) calibration methodology that is popular in the sample survey literature. PMID:27570323
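
    A minimal sketch of the core idea is to constrain an internal full-model maximum likelihood fit so that it reproduces the reduced-model parameters reported by the external source. The logistic toy problem below is ours: the data, the external estimates `theta_ext`, and the particular coding of the constraint are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    rng = np.random.default_rng(0)
    n = 500
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = rng.binomial(1, expit(0.5 * x1 + 0.8 * x2))
    theta_ext = np.array([0.0, 0.9])   # assumed external estimates for y ~ x1

    def neg_loglik(beta):
        # Full internal model: logit P(y=1) = b0 + b1*x1 + b2*x2.
        eta = beta[0] + beta[1] * x1 + beta[2] * x2
        return -np.sum(y * eta - np.log1p(np.exp(eta)))

    def reduced_score(beta):
        # Expected score of the reduced model (y ~ x1) at theta_ext, with the
        # expectation over y taken under the full model and the covariate
        # distribution taken empirically: the kind of constraint that links
        # the internal and external models.
        mu_full = expit(beta[0] + beta[1] * x1 + beta[2] * x2)
        mu_red = expit(theta_ext[0] + theta_ext[1] * x1)
        resid = mu_full - mu_red
        return np.array([resid.mean(), (resid * x1).mean()])

    fit = minimize(neg_loglik, x0=np.zeros(3),
                   constraints={"type": "eq", "fun": reduced_score})
    print("constrained MLE:", fit.x)
    ```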

  3. The Palomar Transient Factory: High Quality Realtime Data Processing in a Cost-Constrained Environment

    NASA Astrophysics Data System (ADS)

    Surace, J.; Laher, R.; Masci, F.; Grillmair, C.; Helou, G.

    2015-09-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky, primarily at a single wavelength (R-band), at a rate of 1000-3000 square degrees a night. The data are used to detect and study transient and moving objects such as gamma-ray bursts, supernovae and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system at IPAC handles realtime processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. This was developed under an extremely limited budget profile in an unusually agile development environment. Here we discuss the mechanics of this system and our overall development approach. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next-generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST), a major NSF facility scheduled to begin operations in the early 2020s.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sommer, T.; Melick, T.; Morrison, D.

    The objective of this DOE sponsored project was to successfully fire coal-water slurry in a fire-tube boiler that was designed for oil/gas firing and establish a data base that will be relevant to a large number of existing installations. Firing slurry in a fire-tube configuration is a very demanding application because of the extremely high heat release rates and the correspondingly low furnace volume where combustion can be completed. Recognizing that combustion efficiency is the major obstacle when firing slurry in a fire-tube boiler, the program was focused on innovative approaches for improving carbon burnout without major modifications to the boiler. The boiler system was successfully designed and operated to fire coal-water slurry for extended periods of time with few slurry related operational problems. The host facility was a 3.8 million Btu/hr Cleaver-Brooks fire-tube boiler located on the University of Alabama Campus. A slurry atomizer was designed that provided outstanding atomization and was not susceptible to pluggage. The boiler was operated for over 1000 hours and 12 shipments of slurry were delivered. The new equipment engineered for the coal-water slurry system consisted of the following: combustion air and slurry heaters; cyclone; baghouse; fly ash reinjection system; new control system; air compressor; CWS/gas burner and gas valve train; and storage tank and slurry handling system.

  5. The extrusion test and sensory perception revisited: Some comments on generality and the effect of measurement temperature.

    PubMed

    Brenner, Tom; Tomczyńska-Mleko, Marta; Mleko, Stanisław; Nishinari, Katsuyoshi

    2017-12-01

    Relations between sensory perception, extrusion and fracture in shear, extension and compression are examined. Gelatin-based gels are perceived as less firm and less hard than expected based on their mechanical properties compared to polysaccharide gels that have the same mechanical properties at room temperature but melt well above body temperature, underlying the importance of the measurement temperature for gels that melt during mastication. Correlations between parameters from extrusion and compression, extension and shear are verified using mixed polysaccharide gels. We previously reported a high correlation between several sensory attributes and parameters from an extrusion test. The extrusion test showed the most robust correlation, and could be used to assess samples at both extremes of the texture range with respect to elasticity, for example, both samples that could not be extended as their very low elasticity led to their fracture during handling, as well as samples that could not be fractured in compression. Here, we reexamine the validity of the relations reported. We demonstrate the generality of the relations between large deformation tests and extrusion, but the findings underscore the need to take into account the measurement temperature for samples that melt during mastication when correlating instrumental parameters with sensory perception. © 2017 Wiley Periodicals, Inc.

  6. Methane producing bacteria: Immunological characterization: Progress report, April 1, 1984--June 30, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conway de Macario, E.; Macario, A.J.L.; Wolin, M.J.

    1988-01-01

    A major contribution of this research has been a significant advance of the immunology of methanogens and other archaebacteria (e.g., extreme halophiles). The foundations have been laid to begin the immunologic study of microbes which are non-methanogens themselves but are important for the fermentation process. This work helped to make clear that bacterial immunology goes beyond the study of pathogens for man, animals, or plants. Immunology can be applied successfully to the study of isolates of importance to understand evolution, phylogeny, ecology, and bio-conversion systems, and to advance methanogenic biotechnology. Immunology holds considerable potential to aid in genetic and genetic engineering manipulations as well as in in situ handling of microbes relevant to methanogenesis. Thus, antibodies can help in the discovery of useful microbes, the generation of improved strains, the selection of desirable microorganisms, and in the monitoring and controlling of bioreactors. Immunologic work in this new field should generate knowledge and devices relevant to areas such as Biological Energy Research, Ecology of Microorganisms, and Environmental (Sanitary) Engineering. In this regard, this work has contributed a comprehensive antiserum bank, a large panel of calibrated polyclonal antibody probes, and techniques for producing and utilizing these probes in the study of methanogens and related bacteria. 67 refs.

  7. Modeling and replicating statistical topology and evidence for CMB nonhomogeneity

    PubMed Central

    Agami, Sarit

    2017-01-01

    Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301
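
    A hedged sketch of the general strategy is shown below: fit a simple parametric representation to a single observed persistence diagram, generate replicated diagrams from the fitted model, and use them for Monte Carlo hypothesis testing. The choice of uniform births and log-normal lifetimes, and the test statistic, are our illustrative assumptions rather than the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # One "observed" persistence diagram: (birth, death) points with death > birth.
    births = rng.uniform(0.0, 1.0, size=80)
    lifetimes = rng.lognormal(mean=-2.0, sigma=0.6, size=80)
    diagram = np.column_stack([births, births + lifetimes])

    def fit_params(diag):
        # Parametric representation: uniform births, log-normal lifetimes.
        life = diag[:, 1] - diag[:, 0]
        return diag[:, 0].min(), diag[:, 0].max(), np.log(life).mean(), np.log(life).std()

    def replicate(params, n_pts, rng):
        # Draw a replicated diagram from the fitted parametric model.
        lo, hi, mu, sd = params
        b = rng.uniform(lo, hi, size=n_pts)
        return np.column_stack([b, b + rng.lognormal(mu, sd, size=n_pts)])

    def max_lifetime(diag):
        return (diag[:, 1] - diag[:, 0]).max()

    # Replication permits conventional Monte Carlo testing even though only
    # a single diagram was observed.
    params = fit_params(diagram)
    null = [max_lifetime(replicate(params, len(diagram), rng)) for _ in range(999)]
    p_value = (1 + sum(s >= max_lifetime(diagram) for s in null)) / 1000.0
    print("Monte Carlo p-value for the longest-lived feature:", p_value)
    ```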

  8. The future point-of-care detection of disease and its data capture and handling.

    PubMed

    Lopez-Barbosa, Natalia; Gamarra, Jorge D; Osma, Johann F

    2016-04-01

    Point-of-care detection is a widely studied area that attracts effort and interest from a large number of fields and companies. However, there is also increased interest from the general public in this type of device, which has driven enormous changes in the design and conception of these developments and the way data is handled. Therefore, future point-of-care detection has to include communication with front-end technology, such as smartphones and networks, automation of manufacture, and the incorporation of concepts like the Internet of Things (IoT) and cloud computing. Three key examples, based on different sensing technology, are analyzed in detail on the basis of these items to highlight a route for the future design and development of point-of-care detection devices and their data capture and handling.

  9. Management of Intolerance to Casting the Upper Extremities in Claustrophobic Patients

    PubMed Central

    Nagura, Issei; Kanatani, Takako; Sumi, Masatoshi; Inui, Atsuyuki; Mifune, Yutaka; Kokubu, Takeshi; Kurosaka, Masahiro

    2014-01-01

    Introduction. Some patients show unusual responses to cast immobilization of the upper extremities without any objective findings. We hypothesized that their intolerance, with excessive anxiety toward casts, is due to claustrophobia triggered by cast immobilization. The aim of this study is to analyze the relevance of cast immobilization to feelings of claustrophobia and to determine how to manage such patients. Methods. Nine patients showed claustrophobic symptoms with their casts. They were assessed as to whether they were themselves aware of their claustrophobia. We further investigated alternatives to cast immobilization. Results. Seven of the nine cases, who were aware of their claustrophobic tendencies, either were given removable splints initially or had their casts converted to removable splints when they exhibited symptoms. The two patients who were unaware of their latent claustrophobic tendencies were identified when they showed claustrophobic symptoms similar to those of the previous patients soon after short arm cast application. We replaced their casts with removable splints. This resolved the issue in all cases. Conclusions. We should be aware of claustrophobia if patients show unusual responses to cast immobilization of the upper extremities without any objective findings; a removable splint is a practical alternative to a cast for continuing treatment successfully. PMID:25379544

  10. A flexible cure rate model for spatially correlated survival data based on generalized extreme value distribution and Gaussian process priors.

    PubMed

    Li, Dan; Wang, Xia; Dey, Dipak K

    2016-09-01

    Our present work proposes a new survival model in a Bayesian context to analyze right-censored survival data for populations with a surviving fraction, assuming that the log failure time follows a generalized extreme value distribution. Many applications require a more flexible modeling of covariate information than a simple linear or parametric form for all covariate effects. It is also necessary to include spatial variation in the model, since it is sometimes unexplained by the covariates considered in the analysis. Therefore, the nonlinear covariate effects and the spatial effects are incorporated into the systematic component of our model. Gaussian processes (GPs) provide a natural framework for modeling potentially nonlinear relationships and have recently become extremely powerful in nonlinear regression. Our proposed model adopts a semiparametric Bayesian approach by imposing a GP prior on the nonlinear structure of the continuous covariate. With the consideration of data availability and computational complexity, the conditionally autoregressive distribution is placed on the region-specific frailties to handle spatial correlation. The flexibility and gains of our proposed model are illustrated through analyses of simulated data examples as well as a dataset involving a colon cancer clinical trial from the state of Iowa. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
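
    One standard mixture-cure formulation consistent with this description, written in our own notation as a sketch rather than the paper's exact specification, is:

    ```latex
    % Mixture-cure survival with a GEV log failure time (notation ours).
    \begin{align*}
      S_{\mathrm{pop}}(t \mid x, s) &= \pi(x, s) + \bigl(1 - \pi(x, s)\bigr)\, S(t \mid x, s), \\
      \log T &\sim \mathrm{GEV}(\mu, \sigma, \xi), \\
      \mu &= \beta_0 + f(x) + \phi_s, \qquad
        f \sim \mathcal{GP}(0, k), \qquad \boldsymbol{\phi} \sim \mathrm{CAR}(\tau^2),
    \end{align*}
    ```

    where π is the cured (surviving) fraction, f is the Gaussian-process-distributed nonlinear effect of the continuous covariate, and φ_s is the conditionally autoregressive spatial frailty for region s.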

  11. Ground-based and in-flight simulator studies of flight characteristics of a twin-fuselage passenger transport airplane during approach and landing

    NASA Technical Reports Server (NTRS)

    Grantham, W. D.; Smith, P. M.; Neely, W. R., Jr.; Deal, P. L.; Yenni, K. R.

    1985-01-01

    Six-degree-of-freedom ground-based and in-flight simulator studies were conducted to evaluate the low-speed flight characteristics of a twin-fuselage passenger transport airplane and to compare these characteristics with those of a large, single-fuselage (reference) transport configuration similar to the Lockheed C-5A airplane. The primary piloting task was the approach and landing task. The results of this study indicated that the twin-fuselage transport concept had acceptable but unsatisfactory longitudinal and lateral-directional low-speed flight characteristics, and that stability and control augmentation would be required in order to improve the handling qualities. Through the use of rate-command/attitude-hold augmentation in the pitch and roll axes, and the use of several turn coordination features, the handling qualities of the simulated transport were improved appreciably. The in-flight test results showed excellent agreement with those of the six-degree-of-freedom ground-based simulator handling qualities tests. As a result of the in-flight simulation study, a roll-control-induced normal-acceleration criterion was developed. The handling qualities of the augmented twin-fuselage passenger transport airplane exhibited an improvement over the handling characteristics of the reference (single-fuselage) transport.

  12. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.0)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hukerikar, Saurabh; Engelmann, Christian

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Practical limits on power consumption in HPC systems will require future systems to embrace innovative architectures, increasing the levels of hardware and software complexity. The resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. These techniques must seek to improve resilience at reasonable overheads to power consumption and performance. While the HPC community has developed various solutions, application-level as well as system-based, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance and power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software ecosystems, which are expected to be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience based on the concept of resilience design patterns. A design pattern is a general, repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. The catalog of resilience design patterns provides designers with reusable design elements. We define a design framework that enhances our understanding of the important constraints and opportunities for solutions deployed at various layers of the system stack. The framework may be used to establish mechanisms and interfaces to coordinate flexible fault management across hardware and software components. The framework also enables optimization of the cost-benefit trade-offs among performance, resilience, and power consumption. The overall goal of this work is to enable a systematic methodology for the design and evaluation of resilience technologies in extreme-scale HPC systems that keep scientific applications running to a correct solution in a timely and cost-efficient manner in spite of frequent faults, errors, and failures of various types.
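
    To make the idea of a reusable resilience design element concrete, here is a toy rendering in Python. The detector/handler decomposition and the `checkpoint-restart` example are our illustration only, not the report's formal pattern taxonomy.

    ```python
    from dataclasses import dataclass
    from typing import Callable

    # A resilience pattern as a reusable element: a detector that recognizes
    # a fault event and a handler that contains or recovers from it.
    @dataclass
    class ResiliencePattern:
        name: str
        detect: Callable[[dict], bool]
        handle: Callable[[dict], dict]

        def apply(self, event: dict) -> dict:
            # Pass the event through unchanged if this pattern does not apply.
            return self.handle(event) if self.detect(event) else event

    checkpoint_restart = ResiliencePattern(
        name="checkpoint-restart",
        detect=lambda e: e.get("type") == "process-failure",
        handle=lambda e: {**e, "action": "restore-last-checkpoint"},
    )

    print(checkpoint_restart.apply({"type": "process-failure", "rank": 17}))
    ```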

  13. A large, benign prostatic cyst presented with an extremely high serum prostate-specific antigen level.

    PubMed

    Chen, Han-Kuang; Pemberton, Richard

    2016-01-08

    We report a case of a patient who presented with an extremely high serum prostate specific antigen (PSA) level and underwent radical prostatectomy for presumed prostate cancer. Surprisingly, the whole mount prostatectomy specimen showed only small volume, organ-confined prostate adenocarcinoma and a large, benign intraprostatic cyst, which was thought to be responsible for the PSA elevation. 2016 BMJ Publishing Group Ltd.

  14. The impact of milk handling procedures on Ostertagia ostertagi antibody ELISA test results.

    PubMed

    Vanderstichel, Raphaël; Dohoo, Ian; Stryhn, Henrik

    2010-04-19

    The impact of various milk handling stressors was analyzed using a commercially available enzyme-linked immunosorbent assay (ELISA) measuring Ostertagia ostertagi antibodies in milk from dairy cattle (Svanovir). An indirect ELISA makes it possible to estimate the milk production losses related to intestinal parasitism. The ELISA protocol recommends fresh defatted milk; however, milk collected through Dairy Herd Improvement (DHI) programs in North America undergoes several stressors, including heating and freezing, and is not defatted. Normalized optical density ratios (ODRs) were compared between fresh defatted milk and milk subjected to one or more stressors with a linear mixed model accounting for differences in variation between the fresh and the frozen samples. Concordance correlation coefficients were also analyzed for comparison with other similar studies. After accounting for random cow and container effects, the treatment factors interacted with each other (p<0.001). Biologically interesting contrasts were created to explain the interaction. The estimated difference in ODR between the milk samples handled according to the recommendations of the manufacturers of Svanovir and the whole milk samples subjected to the most extreme treatment (heated, frozen, thawed, and re-frozen for 4 weeks) was 0.062 (p<0.001). This difference represented less than 5% of the range and was thus considered biologically negligible. Frozen whole milk processed by DHI programs, the most likely method of collecting on-farm samples in North America, will likely yield reliable results for indirect ELISA tests, particularly Svanovir.
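
    The sketch below illustrates this style of analysis on simulated data: normalizing optical densities against plate controls (a common indirect-ELISA normalization; the exact Svanovir kit formula may differ) and fitting a linear mixed model with a random cow effect. All numbers, column names, and the single treatment contrast are illustrative assumptions.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def normalized_odr(od_sample, od_neg, od_pos):
        # Normalized optical density ratio against plate controls.
        return (od_sample - od_neg) / (od_pos - od_neg)

    rng = np.random.default_rng(3)
    cows = np.repeat(np.arange(30), 2)              # paired samples per cow
    treated = np.tile([0, 1], 30)                   # fresh vs. heated+frozen
    od = (0.8 + 0.3 * rng.standard_normal(30)[cows] # cow-level variation
          - 0.062 * treated                         # simulated treatment shift
          + 0.05 * rng.standard_normal(60))         # measurement noise

    df = pd.DataFrame({"odr": normalized_odr(od, 0.1, 1.4),
                       "treated": treated, "cow": cows})

    # Linear mixed model with a random intercept per cow.
    fit = smf.mixedlm("odr ~ treated", df, groups=df["cow"]).fit()
    print(fit.summary())
    ```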

  15. Ceramics for Molten Materials Containment, Transfer and Handling on the Lunar Surface

    NASA Technical Reports Server (NTRS)

    Standish, Evan; Stefanescu, Doru M.; Curreri, Peter A.

    2009-01-01

    As part of a project on Molten Materials Transfer and Handling on the Lunar Surface, molten-materials containment samples of various ceramics were tested to determine their performance in contact with a melt of lunar regolith simulant. The test temperature was 1600 C, with contact times ranging from 0 to 12 hours. Regolith simulant was pressed into cylinders of approximately 1.25 cm diameter x 1.25 cm height and then melted on ceramic substrates. The regolith-ceramic interface was examined after processing to determine the melt/ceramic interaction. It was found that the molten regolith wetted all oxide ceramics tested extremely well, which resulted in chemical reaction between the materials in each case. Alumina substrates were identified which withstood contact at the operating temperature of a molten regolith electrolysis cell (1600 C) for eight hours with little interaction or deformation. This represents an improvement over alumina grades currently in use and will provide a lifetime adequate for electrolysis experiments lasting 24 hours or more. Two types of non-oxide ceramics were also tested. It was found that they interacted to a limited degree with the melt, resulting in little corrosion. These ceramics, SiC and BN, were not wetted as well as the oxides by the melt, and so remain possible materials for molten regolith handling. Tests using longer holding periods and larger volumes of regolith are necessary to determine the ultimate performance of the tested ceramics.

  16. Differences among nursing homes in outcomes of a safe resident handling program

    PubMed Central

    Kurowski, Alicia; Gore, Rebecca; Buchholz, Bryan; Punnett, Laura

    2018-01-01

    A large nursing home corporation implemented a safe resident handling program (SRHP) in 2004-2007. We evaluated its efficacy over a 2-year period by examining differences among 5 centers in program outcomes and potential predictors of those differences. We observed nursing assistants (NAs), recording activities and body postures at 60-second intervals on personal digital assistants at baseline and at 3-month, 12-month, and 24-month follow-ups. The two outcomes computed were change in equipment use during resident handling and change in a physical workload index that estimated spinal loading due to body postures and handled loads. Potential explanatory factors were extracted from post-observation interviews, investigator surveys of the workforce, administrative data, and employee satisfaction surveys. The facility with the most positive outcome measures was associated with many positive changes in explanatory factors, and the facility with the fewest positive outcome measures experienced negative changes in the same factors. These findings suggest greater SRHP benefits where there was lower NA turnover and agency staffing; less time pressure; and better teamwork, staff communication, and supervisory support. PMID:22833329

  17. A large Great Britain-wide outbreak of STEC O157 phage type 8 linked to handling of raw leeks and potatoes.

    PubMed

    Launders, N; Locking, M E; Hanson, M; Willshaw, G; Charlett, A; Salmon, R; Cowden, J; Harker, K S; Adak, G K

    2016-01-01

    Between December 2010 and July 2011, 252 cases of STEC O157 PT8 stx1 + 2 infection were reported in England, Scotland and Wales. This was the largest outbreak of STEC reported in England and the second largest in the UK to date. Eighty cases were hospitalized, with two cases of haemolytic uraemic syndrome and one death reported. Routine investigative data were used to generate a hypothesis but the subsequent case-control study was inconclusive. A second, more detailed, hypothesis generation exercise identified consumption or handling of vegetables as a potential mode of transmission. A second case-control study demonstrated that cases were more likely than controls to live in households whose members handled or prepared leeks bought unwrapped [odds ratio (OR) 40, 95% confidence interval (CI) 2·08-769·4], and potatoes bought in sacks (OR 13·13, 95% CI 1·19-145·3). This appears to be the first outbreak of STEC O157 infection linked to the handling of leeks.

  18. Defining near misses: towards a sharpened definition based on empirical data about error handling processes.

    PubMed

    Kessels-Habraken, Marieke; Van der Schaaf, Tjerk; De Jonge, Jan; Rutte, Christel

    2010-05-01

    Medical errors in health care still occur frequently. Unfortunately, errors cannot be completely prevented and 100% safety can never be achieved. Therefore, in addition to error reduction strategies, health care organisations could also implement strategies that promote timely error detection and correction. Reporting and analysis of so-called near misses - usually defined as incidents without adverse consequences for patients - are necessary to gather information about successful error recovery mechanisms. This study establishes the need for a clearer and more consistent definition of near misses to enable large-scale reporting and analysis in order to obtain such information. Qualitative incident reports and interviews were collected on four units of two Dutch general hospitals. Analysis of the 143 accompanying error handling processes demonstrated that different incident types each provide unique information about error handling. Specifically, error handling processes underlying incidents that did not reach the patient differed significantly from those of incidents that reached the patient, irrespective of harm, because of successful countermeasures that had been taken after error detection. We put forward two possible definitions of near misses and argue that, from a practical point of view, the optimal definition may be contingent on organisational context. Both proposed definitions could yield large-scale reporting of near misses. Subsequent analysis could enable health care organisations to improve the safety and quality of care proactively by (1) eliminating failure factors before real accidents occur, (2) enhancing their ability to intercept errors in time, and (3) improving their safety culture. Copyright 2010 Elsevier Ltd. All rights reserved.

  19. Asymmetrical Responses of Ecosystem Processes to Positive Versus Negative Precipitation Extremes: a Replicated Regression Experimental Approach

    NASA Astrophysics Data System (ADS)

    Felton, A. J.; Smith, M. D.

    2016-12-01

    Heightened climatic variability due to atmospheric warming is forecast to increase the frequency and severity of climate extremes. In particular, changes to interannual variability in precipitation, characterized by increases in extreme wet and dry years, are likely to impact virtually all terrestrial ecosystem processes. However, to date experimental approaches have yet to explicitly test how ecosystem processes respond to multiple levels of climatic extremity, limiting our understanding of how ecosystems will respond to forecast increases in the magnitude of climate extremes. Here we report the results of a replicated regression experimental approach, in which we imposed 9 and 11 levels of growing season precipitation amount and extremity in mesic grassland during 2015 and 2016, respectively. Each level corresponded to a specific percentile of the long-term record, which produced a large gradient of soil moisture conditions that ranged from extreme wet to extreme dry. In both 2015 and 2016, asymptotic responses to water availability were observed for soil respiration. This asymmetry was driven in part by transitions between soil moisture versus temperature constraints on respiration as conditions became increasingly dry versus increasingly wet. In 2015, aboveground net primary production (ANPP) exhibited asymmetric responses to precipitation that largely mirrored those of soil respiration. In total, our results suggest that in this mesic ecosystem, these two carbon cycle processes were more sensitive to extreme drought than to extreme wet years. Future work will assess ANPP responses for 2016, soil nutrient supply and physiological responses of the dominant plant species. Future efforts are needed to compare our findings across a diverse array of ecosystem types, and in particular how the timing and magnitude of precipitation events may modify the response of ecosystem processes to increasing magnitudes of precipitation extremes.

  20. Heavy Tail Behavior of Rainfall Extremes across Germany

    NASA Astrophysics Data System (ADS)

    Castellarin, A.; Kreibich, H.; Vorogushyn, S.; Merz, B.

    2017-12-01

    Distributions are termed heavy-tailed if extreme values are more likely than would be predicted by probability distributions that have exponential asymptotic behavior. Heavy-tail behavior often leads to surprise, because historical observations can be a poor guide for the future. Heavy-tail behavior seems to be widespread for hydro-meteorological extremes, such as extreme rainfall and flood events. To date there have been only vague hints to explain under which conditions these extremes show heavy-tail behavior. We use an observational data set consisting of 11 climate variables at 1440 stations across Germany. This homogenized, gap-free data set covers 110 years (1901-2010) at daily resolution. We estimate the upper tail behavior, including its uncertainty interval, of daily precipitation extremes for the 1,440 stations at the annual and seasonal time scales. Different tail indicators are tested, including the shape parameter of the Generalized Extreme Value distribution, the upper tail ratio and the obesity index. In a further step, we explore to which extent the tail behavior can be explained by geographical and climate factors. A large number of characteristics is derived, such as station elevation, degree of continentality, aridity, measures for quantifying the variability of humidity and wind velocity, or event-triggering large-scale atmospheric situation. The link between the upper tail behavior and these characteristics is investigated via data mining methods capable of detecting non-linear relationships in large data sets. This exceptionally rich observational data set, in terms of number of stations, length of time series and number of explaining variables, allows insights into the upper tail behavior which is rarely possible given the typical observational data sets available.
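
    Two of the tail indicators mentioned can be sketched on synthetic annual maxima as follows. Note that SciPy's GEV shape parameter `c` equals −ξ, and the obesity index shown follows the common order-statistics definition (about 0.75 for an exponential distribution, larger for heavier tails). The data and constants are illustrative assumptions, not the study's station records.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(7)
    annual_maxima = rng.pareto(2.5, size=110) * 10 + 20   # synthetic 110-yr record

    # GEV shape: SciPy parameterizes with c = -xi, so xi > 0 (heavy tail) is c < 0.
    c, loc, scale = genextreme.fit(annual_maxima)
    xi = -c
    print(f"GEV shape xi = {xi:.2f}  (xi > 0 suggests a heavy tail)")

    def obesity_index(x, n_draws=20000, rng=rng):
        # Ob = P(X1 + X4 > X2 + X3) for the order statistics of four iid draws,
        # estimated here by resampling quadruples from the observed sample.
        quads = rng.choice(x, size=(n_draws, 4), replace=True)
        quads.sort(axis=1)
        return np.mean(quads[:, 0] + quads[:, 3] > quads[:, 1] + quads[:, 2])

    print(f"obesity index = {obesity_index(annual_maxima):.2f}")
    ```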

  1. Handling a Collection of PDF Documents

    EPA Pesticide Factsheets

    You have several options for making a large collection of PDF documents more accessible to your audience: avoid uploading altogether, use multiple document pages, and use document IDs as anchors for direct links within a document page.

  2. How senior entomologists can be involved in the annual meeting: organization and the coming together of a large event

    USDA-ARS?s Scientific Manuscript database

    The Annual Meeting for the Entomological Society of America is a large event where planning is started at the end of the previous years’ meeting. The President of the Society named the Program Committee Co-Chairs for Entomology 2017 at the 2015 Annual Meeting, so that they could handle the duties o...

  3. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
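
    A hedged sketch of the second ingredient, a small program that enforces a standardized template when pooling the researchers' Excel files, might look like the following. The column names and folder layout are hypothetical, not the actual ACuteTox template.

    ```python
    from pathlib import Path
    import pandas as pd

    REQUIRED = ["compound_id", "concentration_uM", "response_pct"]

    def load_study_files(folder):
        # Collect every Excel file that follows the agreed template, verify the
        # standardized columns, and pool the records for automated analysis.
        frames = []
        for path in sorted(Path(folder).glob("*.xlsx")):
            df = pd.read_excel(path)
            missing = [c for c in REQUIRED if c not in df.columns]
            if missing:
                raise ValueError(f"{path.name}: missing columns {missing}")
            df["source_file"] = path.name        # provenance for audit trails
            frames.append(df[REQUIRED + ["source_file"]])
        return pd.concat(frames, ignore_index=True)

    # Example usage (hypothetical folder):
    # pooled = load_study_files("acutetox_data/")
    # pooled.groupby(["compound_id", "concentration_uM"])["response_pct"].mean()
    ```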

  4. Multicategory Composite Least Squares Classifiers

    PubMed Central

    Park, Seo Young; Liu, Yufeng; Liu, Dacheng; Scholl, Paul

    2010-01-01

    Classification is a very useful statistical tool for information extraction. In particular, multicategory classification is commonly seen in various applications. Although binary classification problems are heavily studied, extensions to the multicategory case are much less so. In view of the increased complexity and volume of modern statistical problems, it is desirable to have multicategory classifiers that are able to handle problems with high dimensions and with a large number of classes. Moreover, it is necessary to have sound theoretical properties for the multicategory classifiers. In the literature, there exist several different versions of simultaneous multicategory Support Vector Machines (SVMs). However, the computation of the SVM can be difficult for large-scale problems, especially for problems with a large number of classes. Furthermore, the SVM cannot produce class probability estimation directly. In this article, we propose a novel efficient multicategory composite least squares classifier (CLS classifier), which utilizes a new composite squared loss function. The proposed CLS classifier has several important merits: efficient computation for problems with a large number of classes, asymptotic consistency, ability to handle high-dimensional data, and simple conditional class probability estimation. Our simulated and real examples demonstrate competitive performance of the proposed approach. PMID:21218128
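
    As a rough illustration of the least-squares flavor of such classifiers, the sketch below regresses one-hot class indicators on features with ridge regularization and normalizes the fitted scores into crude class probabilities. This is a generic baseline in the spirit of, but not identical to, the proposed composite squared loss.

    ```python
    import numpy as np

    class LeastSquaresClassifier:
        """Multicategory classification by regularized least squares on
        one-hot class indicators (a generic baseline, not the CLS loss)."""

        def fit(self, X, y, lam=1e-3):
            n, k = len(y), y.max() + 1
            Y = np.eye(k)[y]                            # one-hot targets
            Xb = np.hstack([X, np.ones((n, 1))])        # bias column
            A = Xb.T @ Xb + lam * np.eye(Xb.shape[1])   # ridge normal equations
            self.W = np.linalg.solve(A, Xb.T @ Y)
            return self

        def predict_proba(self, X):
            scores = np.hstack([X, np.ones((len(X), 1))]) @ self.W
            p = np.clip(scores, 1e-9, None)
            return p / p.sum(axis=1, keepdims=True)     # crude probabilities

        def predict(self, X):
            return self.predict_proba(X).argmax(axis=1)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 5))
    y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 1).astype(int)  # 3 classes
    clf = LeastSquaresClassifier().fit(X, y)
    print("train accuracy:", (clf.predict(X) == y).mean())
    ```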

  5. Head circumference growth among extremely preterm infants in Denmark has improved during the past two decades.

    PubMed

    Zachariassen, Gitte; Hansen, Bo Mølholm

    2015-07-01

    Treatment of extremely preterm and low birth weight infants is still evolving and improving. In this study, we evaluated whether growth has improved from birth to two years of corrected age (CA) among extremely low birth weight (BW) and preterm-born infants in Denmark. This was an observational study comparing head circumference (HC), weight and length growth in two Danish cohorts of extremely preterm (gestational age (GA) < 28 weeks) and extremely low birth weight (ELBW with a BW < 1,000 g) infants (A: 1994-1995 and B: 2004-2008). Infants in cohort A (n = 198) and B (n = 64) had a median GA and BW of 27 + 2 weeks and 948 g in A, and 27 + 3 weeks and 934 g in B. At discharge, infants in B compared with A had increased more in HC (p = 0.000), length (p = 0.008) and weight (p = 0.000). At two years CA, HC was still significantly larger in cohort B than A (p = 0.03), while no significant difference was recorded for length or weight. Growth during hospitalisation seems to have improved among extremely preterm and low birth weight infants from 1994-1995 to 2004-2008. This may be a result of improved nutrition in combination with improved intensive care during hospitalisation. Collection of data in the 2004-2008 cohort was supported by the Institute of Regional Health Services Research, the Egmont Foundation and the University of Southern Denmark. Collection of data from birth to two years of age in the 1994-1995 cohort was without financial support. For the 1994-1995 study, all eight regional Research Ethics Committees in Denmark at that time approved the study. The 2004-2008 study was approved by the Danish National Committee on Biomedical Research Ethics, and handling of data and registrations was approved by the Danish Data Protection Agency.

  6. A rational decision rule with extreme events.

    PubMed

    Basili, Marcello

    2006-12-01

    Risks induced by extreme events are characterized by small or ambiguous probabilities, catastrophic losses, or windfall gains. Through a new functional that mimics the restricted Bayes-Hurwicz criterion within the Choquet expected utility approach, it is possible to represent the decision maker's behavior when facing both risky (large, reliable probability) and extreme (small or ambiguous probability) events. A new formalization of the precautionary principle (PP) is presented, together with a new functional that encompasses both the extreme outcomes and the expectation of all possible results for every act.
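
    For orientation, one classical form that such a functional mimics, stated in our notation and not taken from the paper, blends expected utility on reliable events with a Hurwicz pessimism-optimism mix over the set E of extreme states:

    ```latex
    % Restricted Bayes--Hurwicz style evaluation of an act a (notation ours):
    % \lambda weighs the reliable expectation; \alpha is the pessimism index.
    \[
      V(a) \;=\; \lambda\, \mathbb{E}_{P}\bigl[u(a)\bigr]
        \;+\; (1-\lambda)\Bigl[\alpha \min_{s \in E} u(a, s)
        \;+\; (1-\alpha) \max_{s \in E} u(a, s)\Bigr].
    \]
    ```

    The paper's functional is a Choquet-expected-utility variant of this idea; the sketch above only conveys the balance between expectation on reliable events and the extreme outcomes.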

  7. Effects of climate extremes on the terrestrial carbon cycle: concepts, processes and potential future impacts

    PubMed Central

    Frank, Dorothea; Reichstein, Markus; Bahn, Michael; Thonicke, Kirsten; Frank, David; Mahecha, Miguel D; Smith, Pete; van der Velde, Marijn; Vicca, Sara; Babst, Flurin; Beer, Christian; Buchmann, Nina; Canadell, Josep G; Ciais, Philippe; Cramer, Wolfgang; Ibrom, Andreas; Miglietta, Franco; Poulter, Ben; Rammig, Anja; Seneviratne, Sonia I; Walz, Ariane; Wattenbach, Martin; Zavala, Miguel A; Zscheischler, Jakob

    2015-01-01

    Extreme droughts, heat waves, frosts, precipitation, wind storms and other climate extremes may impact the structure, composition and functioning of terrestrial ecosystems, and thus carbon cycling and its feedbacks to the climate system. Yet, the interconnected avenues through which climate extremes drive ecological and physiological processes and alter the carbon balance are poorly understood. Here, we review the literature on carbon cycle relevant responses of ecosystems to extreme climatic events. Given that impacts of climate extremes are considered disturbances, we assume the respective general disturbance-induced mechanisms and processes to also operate in an extreme context. The paucity of well-defined studies currently renders a quantitative meta-analysis impossible, but permits us to develop a deductive framework for identifying the main mechanisms (and coupling thereof) through which climate extremes may act on the carbon cycle. We find that ecosystem responses can exceed the duration of the climate impacts via lagged effects on the carbon cycle. The expected regional impacts of future climate extremes will depend on changes in the probability and severity of their occurrence, on the compound effects and timing of different climate extremes, and on the vulnerability of each land-cover type modulated by management. Although processes and sensitivities differ among biomes, based on expert opinion, we expect forests to exhibit the largest net effect of extremes due to their large carbon pools and fluxes, potentially large indirect and lagged impacts, and long recovery time to regain previous stocks. At the global scale, we presume that droughts have the strongest and most widespread effects on terrestrial carbon cycling. Comparing impacts of climate extremes identified via remote sensing vs. ground-based observational case studies reveals that many regions in the (sub-)tropics are understudied. Hence, regional investigations are needed to allow a global upscaling of the impacts of climate extremes on global carbon–climate feedbacks. PMID:25752680

  8. Effects of climate extremes on the terrestrial carbon cycle: concepts, processes and potential future impacts.

    PubMed

    Frank, Dorothea; Reichstein, Markus; Bahn, Michael; Thonicke, Kirsten; Frank, David; Mahecha, Miguel D; Smith, Pete; van der Velde, Marijn; Vicca, Sara; Babst, Flurin; Beer, Christian; Buchmann, Nina; Canadell, Josep G; Ciais, Philippe; Cramer, Wolfgang; Ibrom, Andreas; Miglietta, Franco; Poulter, Ben; Rammig, Anja; Seneviratne, Sonia I; Walz, Ariane; Wattenbach, Martin; Zavala, Miguel A; Zscheischler, Jakob

    2015-08-01

    Extreme droughts, heat waves, frosts, precipitation, wind storms and other climate extremes may impact the structure, composition and functioning of terrestrial ecosystems, and thus carbon cycling and its feedbacks to the climate system. Yet, the interconnected avenues through which climate extremes drive ecological and physiological processes and alter the carbon balance are poorly understood. Here, we review the literature on carbon cycle relevant responses of ecosystems to extreme climatic events. Given that impacts of climate extremes are considered disturbances, we assume the respective general disturbance-induced mechanisms and processes to also operate in an extreme context. The paucity of well-defined studies currently renders a quantitative meta-analysis impossible, but permits us to develop a deductive framework for identifying the main mechanisms (and coupling thereof) through which climate extremes may act on the carbon cycle. We find that ecosystem responses can exceed the duration of the climate impacts via lagged effects on the carbon cycle. The expected regional impacts of future climate extremes will depend on changes in the probability and severity of their occurrence, on the compound effects and timing of different climate extremes, and on the vulnerability of each land-cover type modulated by management. Although processes and sensitivities differ among biomes, based on expert opinion, we expect forests to exhibit the largest net effect of extremes due to their large carbon pools and fluxes, potentially large indirect and lagged impacts, and long recovery time to regain previous stocks. At the global scale, we presume that droughts have the strongest and most widespread effects on terrestrial carbon cycling. Comparing impacts of climate extremes identified via remote sensing vs. ground-based observational case studies reveals that many regions in the (sub-)tropics are understudied. Hence, regional investigations are needed to allow a global upscaling of the impacts of climate extremes on global carbon-climate feedbacks. © 2015 The Authors. Global Change Biology published by John Wiley & Sons Ltd.

  9. Programmable Automated Welding System (PAWS): Control of welding through software and hardware

    NASA Technical Reports Server (NTRS)

    Kline, Martin D.; Doyle, Thomas E.

    1994-01-01

    The ATD phase of the PAWS program ended in November 1992 and the follow-on ManTech program was started in September 1993. The system will be industrially hardened during the first year of this program. Follow-on years will focus upon the transition into specific end-user sites. These implementations will also expand the system into other welding processes (e.g. FCAW, GTAW, PAW). In addition, the architecture is being developed for application to other non-welding robotic processes (e.g. inspection, surface finishing). Future development is anticipated to encompass hardening for extreme environments, expanded exception handling techniques, and application to a range of manipulators.

  10. Plague.

    PubMed

    Cobbs, C Glenn; Chansolme, David H

    2004-07-01

    Plague is a disease that has been present for thousands of years and described since the earliest medical accounts. It occurs today worldwide, and may present in a variety of clinical forms. Bubonic disease, pneumonic plague, and septicemic plague are seen in addition to a number of other less common manifestations. As an agent of bioterrorism, Yersinia pestis could pose an extreme threat if released in the appropriate form and in the appropriate environment. Presumptive diagnosis may be made with readily available techniques, but laboratory handling of specimens requires special care. When there is a strong suspicion of plague, treatment should be instituted immediately, as delaying therapy will result in increased morbidity and mortality.

  11. Aerogel: From Aerospace to Apparel

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Aspen Systems Inc. developed an aerogel-manufacturing process that solved the handling problems associated with aerogel-based insulation products. Their aerogels can now be manufactured into blankets, thin sheets, beads, and molded parts, and may be transparent, translucent, or opaque. Aspen made the material effective for window and skylight insulation, non-flammable building insulation, and inexpensive firewall insulation that will withstand fires in homes and buildings, and also assist in the prevention of forest fires. Another Aspen product is Spaceloft(TM), an inexpensive, flexible blanket that incorporates a thin layer of aerogel embedded directly into the fabric. Spaceloft is incorporated into jackets intended for wear in extremely harsh conditions and activities, such as Antarctic expeditions.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Gang

    Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed to both the lack of spatial resolution in the models, and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configuration of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km – 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large scale circulation will be quantified both on the basis of the well tested preferred circulation regime approach, and very recently developed measures, the finite amplitude Wave Activity (FAWA) and its spectrum. The fine scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large scale measures as indicators of the probability of occurrence of the finer scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.

  13. Large uncertainties in observed daily precipitation extremes over land

    NASA Astrophysics Data System (ADS)

    Herold, Nicholas; Behrangi, Ali; Alexander, Lisa V.

    2017-01-01

    We explore uncertainties in observed daily precipitation extremes over the terrestrial tropics and subtropics (50°S-50°N) based on five commonly used products: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) dataset, the Global Precipitation Climatology Centre-Full Data Daily (GPCC-FDD) dataset, the Tropical Rainfall Measuring Mission (TRMM) multi-satellite research product (T3B42 v7), the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR), and the Global Precipitation Climatology Project's One-Degree Daily (GPCP-1DD) dataset. We use the precipitation indices R10mm and Rx1day, developed by the Expert Team on Climate Change Detection and Indices, to explore the behavior of "moderate" and "extreme" extremes, respectively. In order to assess the sensitivity of extreme precipitation to different grid sizes we perform our calculations on four common spatial resolutions (0.25° × 0.25°, 1° × 1°, 2.5° × 2.5°, and 3.75° × 2.5°). The impact of the chosen "order of operation" in calculating these indices is also determined. Our results show that moderate extremes are relatively insensitive to product and resolution choice, while extreme extremes can be very sensitive. For example, at 0.25° × 0.25° quasi-global mean Rx1day values vary from 37 mm in PERSIANN-CDR to 62 mm in T3B42. We find that the interproduct spread becomes prominent at resolutions of 1° × 1° and finer, thus establishing a minimum effective resolution at which observational products agree. Without improvements in interproduct spread, these exceedingly large observational uncertainties at high spatial resolution may limit the usefulness of model evaluations. As has been found previously, resolution sensitivity can be largely eliminated by applying an order of operation where indices are calculated prior to regridding. However, this approach is not appropriate when true area averages are desired (e.g., for model evaluations).
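
    The two index definitions and the order-of-operation effect discussed above are easy to make concrete. The sketch below computes R10mm and Rx1day from a daily precipitation array and contrasts index-then-regrid with regrid-then-index; the gamma-distributed rainfall, grid sizes, and 2x2 block-average regridding are illustrative assumptions, not the products' actual grids.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily precipitation (mm/day): one year on a 4 x 4 lat-lon grid.
pr = rng.gamma(shape=0.5, scale=8.0, size=(365, 4, 4))

def r10mm(daily):
    """R10mm: annual count of days with precipitation >= 10 mm."""
    return (daily >= 10.0).sum(axis=0)

def rx1day(daily):
    """Rx1day: annual maximum 1-day precipitation."""
    return daily.max(axis=0)

def coarsen(field2d, factor=2):
    """Toy block-average regridding to a coarser grid."""
    ny, nx = field2d.shape
    return field2d.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Order of operation A: compute the index on the native grid, then regrid.
index_then_regrid = coarsen(rx1day(pr))

# Order of operation B: regrid the daily fields first, then compute the index.
regrid_then_index = rx1day(np.stack([coarsen(day) for day in pr]))

# B averages daily fields before the maximum is taken, damping the extremes;
# that is one way the resolution sensitivity described above arises.
print(index_then_regrid.mean(), regrid_then_index.mean(), r10mm(pr).mean())
```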

  14. Handling of thermal paper: Implications for dermal exposure to bisphenol A and its alternatives

    PubMed Central

    Bernier, Meghan R.

    2017-01-01

    Bisphenol A (BPA) is an endocrine disrupting chemical used in a wide range of consumer products including photoactive dyes used in thermal paper. Recent studies have shown that dermal absorption of BPA can occur when handling these papers. Yet, regulatory agencies have largely dismissed thermal paper as a major source of BPA exposure. Exposure estimates provided by agencies such as the European Food Safety Authority (EFSA) are based on assumptions about how humans interact with this material, stating that ‘typical’ exposures for adults involve only one handling per day for short periods of time (<1 minute), with limited exposure surfaces (three fingertips). The objective of this study was to determine how individuals handle thermal paper in one common setting: a cafeteria providing short-order meals. We observed thermal paper handling in a college-aged population (n = 698 subjects) at the University of Massachusetts’ dining facility. We find that in this setting, individuals handle receipts for an average of 11.5 min, that >30% of individuals hold thermal paper with more than three fingertips, and >60% allow the paper to touch their palm. Only 11% of the participants we observed were consistent with the EFSA model for time of contact and dermal surface area. Mathematical modeling based on handling times we measured and previously published transfer coefficients, concentrations of BPA in paper, and absorption factors indicates that the most conservative estimated intake from handling thermal paper in this population is 51.1 ng/kg/day, similar to EFSA’s estimates of 59 ng/kg/day from dermal exposures. Less conservative estimates, using published data on concentrations in thermal paper and transfer rates to skin, indicate that exposures are likely significantly higher. Based on our observational data, we propose that the current models for estimating dermal BPA exposures are not consistent with normal human behavior and should be reevaluated. PMID:28570582
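
    The exposure arithmetic described above can be sketched in a few lines. Apart from the 11.5 min mean handling time reported in the study, every value below (transfer rate, absorbed fraction, body weight, events per day) is a placeholder assumption for illustration, not one of the study's published inputs.

```python
def dermal_intake_ng_per_kg_day(
    handling_time_min=11.5,        # mean receipt-handling time observed in the study
    transfer_ng_per_min=300.0,     # hypothetical BPA transfer rate, paper -> skin
    absorbed_fraction=0.1,         # hypothetical fraction absorbed through the skin
    handlings_per_day=1,           # assumed number of handling events per day
    body_weight_kg=70.0,           # reference adult body weight
):
    """Toy dermal-intake estimate (ng/kg/day) from handling thermal paper."""
    transferred = handling_time_min * transfer_ng_per_min * handlings_per_day
    return transferred * absorbed_fraction / body_weight_kg

# With these placeholder inputs: 11.5 * 300 * 0.1 / 70 ~= 4.9 ng/kg/day.
print(f"{dermal_intake_ng_per_kg_day():.1f} ng/kg/day")
```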

  15. Handling of thermal paper: Implications for dermal exposure to bisphenol A and its alternatives.

    PubMed

    Bernier, Meghan R; Vandenberg, Laura N

    2017-01-01

    Bisphenol A (BPA) is an endocrine disrupting chemical used in a wide range of consumer products including photoactive dyes used in thermal paper. Recent studies have shown that dermal absorption of BPA can occur when handling these papers. Yet, regulatory agencies have largely dismissed thermal paper as a major source of BPA exposure. Exposure estimates provided by agencies such as the European Food Safety Authority (EFSA) are based on assumptions about how humans interact with this material, stating that 'typical' exposures for adults involve only one handling per day for short periods of time (<1 minute), with limited exposure surfaces (three fingertips). The objective of this study was to determine how individuals handle thermal paper in one common setting: a cafeteria providing short-order meals. We observed thermal paper handling in a college-aged population (n = 698 subjects) at the University of Massachusetts' dining facility. We find that in this setting, individuals handle receipts for an average of 11.5 min, that >30% of individuals hold thermal paper with more than three fingertips, and >60% allow the paper to touch their palm. Only 11% of the participants we observed were consistent with the EFSA model for time of contact and dermal surface area. Mathematical modeling based on handling times we measured and previously published transfer coefficients, concentrations of BPA in paper, and absorption factors indicates that the most conservative estimated intake from handling thermal paper in this population is 51.1 ng/kg/day, similar to EFSA's estimates of 59 ng/kg/day from dermal exposures. Less conservative estimates, using published data on concentrations in thermal paper and transfer rates to skin, indicate that exposures are likely significantly higher. Based on our observational data, we propose that the current models for estimating dermal BPA exposures are not consistent with normal human behavior and should be reevaluated.

  16. Parallel Clustering Algorithm for Large-Scale Biological Data Sets

    PubMed Central

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    Background: The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Methods: Two types of parallel architectures are proposed in this paper to accelerate the similarity-matrix construction and the affinity propagation algorithm. A memory-shared architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. Results: A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies. PMID:24705246
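
    The memory-shared stage described above can be sketched as follows. Affinity propagation conventionally uses the negative squared Euclidean distance as the similarity, so each worker computes one row of that matrix; the pool size, toy data, and per-task shipping of the full array are illustrative simplifications rather than the paper's implementation.

```python
import numpy as np
from multiprocessing import Pool

def row_similarity(args):
    """One row of s(i, k) = -||x_i - x_k||^2, the usual AP similarity."""
    i, data = args
    diff = data - data[i]
    return i, -(diff * diff).sum(axis=1)

def similarity_matrix(data, processes=4):
    n = len(data)
    S = np.empty((n, n))
    with Pool(processes) as pool:
        tasks = ((i, data) for i in range(n))
        for i, row in pool.imap_unordered(row_similarity, tasks, chunksize=32):
            S[i] = row
    return S

if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(500, 20))  # toy expression data
    S = similarity_matrix(X)
    print(S.shape)  # (500, 500); rows were filled in parallel
```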

  17. Increasing precipitation volatility in twenty-first-century California

    NASA Astrophysics Data System (ADS)

    Swain, Daniel L.; Langenbrunner, Baird; Neelin, J. David; Hall, Alex

    2018-05-01

    Mediterranean climate regimes are particularly susceptible to rapid shifts between drought and flood—of which California's rapid transition from record multi-year dryness between 2012 and 2016 to extreme wetness during the 2016-2017 winter provides a dramatic example. Projected future changes in such dry-to-wet events, however, remain inadequately quantified, which we investigate here using the Community Earth System Model Large Ensemble of climate model simulations. Anthropogenic forcing is found to yield large twenty-first-century increases in the frequency of wet extremes, including a more than threefold increase in sub-seasonal events comparable to California's `Great Flood of 1862'. Smaller but statistically robust increases in dry extremes are also apparent. As a consequence, a 25% to 100% increase in extreme dry-to-wet precipitation events is projected, despite only modest changes in mean precipitation. Such hydrological cycle intensification would seriously challenge California's existing water storage, conveyance and flood control infrastructure.

  18. Updates to FuncLab, a Matlab based GUI for handling receiver functions

    NASA Astrophysics Data System (ADS)

    Porritt, Robert W.; Miller, Meghan S.

    2018-02-01

    Receiver functions are a versatile tool commonly used in seismic imaging. Depending on how they are processed, they can be used to image discontinuity structure within the crust or mantle or they can be inverted for seismic velocity either directly or jointly with complementary datasets. However, modern studies generally require large datasets which can be challenging to handle; therefore, FuncLab was originally written as an interactive Matlab GUI to assist in handling these large datasets. This software uses a project database to allow interactive trace editing, data visualization, H-κ stacking for crustal thickness and Vp/Vs ratio, and common conversion point stacking while minimizing computational costs. Since its initial release, significant advances have been made in the implementation of web services and changes in the underlying Matlab platform have necessitated a significant revision to the software. Here, we present revisions to the software, including new features such as data downloading via irisFetch.m, receiver function calculations via processRFmatlab, on-the-fly cross-section tools, interface picking, and more. In the descriptions of the tools, we present its application to a test dataset in Michigan, Wisconsin, and neighboring areas following the passage of USArray Transportable Array. The software is made available online at https://robporritt.wordpress.com/software.
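
    The H-κ stacking step mentioned above can be illustrated with a minimal sketch in the style of Zhu and Kanamori's grid search, simplified to the Ps and PpPs phases (the PsPs+PpSs multiple is omitted); the velocities, weights, grids, and synthetic receiver functions are illustrative assumptions, not FuncLab's defaults.

```python
import numpy as np

def hk_stack(rfs, times, slowness, vp=6.5,
             H_grid=np.linspace(25, 55, 121), k_grid=np.linspace(1.6, 1.9, 61),
             w=(0.7, 0.3)):
    """Grid-search stack over crustal thickness H (km) and Vp/Vs (kappa).
    rfs: (n_events, n_samples) radial receiver functions,
    times: (n_samples,) time axis (s), slowness: (n_events,) ray params (s/km)."""
    stack = np.zeros((len(H_grid), len(k_grid)))
    for r, p in zip(rfs, slowness):
        qp = np.sqrt(1.0 / vp**2 - p**2)          # vertical P slowness (s/km)
        for j, k in enumerate(k_grid):
            qs = np.sqrt((k / vp)**2 - p**2)      # vertical S slowness, Vs = vp/kappa
            t_ps = H_grid * (qs - qp)             # Ps delay for every trial H
            t_ppps = H_grid * (qs + qp)           # PpPs multiple delay
            stack[:, j] += w[0] * np.interp(t_ps, times, r) \
                         + w[1] * np.interp(t_ppps, times, r)
    i, j = np.unravel_index(stack.argmax(), stack.shape)
    return stack, H_grid[i], k_grid[j]

# Toy usage: synthetic RFs with Ps and PpPs pulses for H = 35 km, kappa = 1.75.
times = np.arange(0.0, 30.0, 0.1)
p = np.full(10, 0.06)
qp = np.sqrt(1 / 6.5**2 - 0.06**2); qs = np.sqrt((1.75 / 6.5)**2 - 0.06**2)
rf = np.exp(-(times - 35 * (qs - qp))**2 / 0.3) \
   + 0.5 * np.exp(-(times - 35 * (qs + qp))**2 / 0.3)
rfs = np.tile(rf, (10, 1))
_, H_best, k_best = hk_stack(rfs, times, p)
print(f"best H = {H_best:.1f} km, Vp/Vs = {k_best:.3f}")  # ~35 km, ~1.75
```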

  19. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft in final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  20. Effects of dynamic aeroelasticity on handling qualities and pilot rating

    NASA Technical Reports Server (NTRS)

    Swaim, R. L.; Yen, W.-Y.

    1978-01-01

    Pilot performance parameters, such as pilot ratings, tracking errors, and pilot comments, were recorded and analyzed for a longitudinal pitch tracking task on a large, flexible aircraft. The tracking task was programmed on a fixed-base simulator with a CRT attitude director display of pitch angle command, pitch angle, and pitch angle error. Parametric variations in the undamped natural frequencies of the two lowest frequency symmetric elastic modes were made to induce varying degrees of rigid body and elastic mode interaction. The results indicate that such mode interaction can drastically affect the handling qualities and pilot ratings of the task.

  1. Wind tunnel pressurization and recovery system

    NASA Technical Reports Server (NTRS)

    Pejack, Edwin R.; Meick, Joseph; Ahmad, Adnan; Lateh, Nordin; Sadeq, Omar

    1988-01-01

    The high-density, low-toxicity characteristics of refrigerant-12 (dichlorodifluoromethane) make it an ideal gas for wind tunnel testing. Present limitations on R-12 emissions, set to slow the rate of ozone deterioration, pose a difficult problem in the recovery and handling of large quantities of R-12. This preliminary design is a possible solution to the problem of R-12 handling in wind tunnel testing. The design incorporates cold-temperature condensation with secondary purification of the R-12/air mixture by adsorption. Also discussed is the use of Freon-22 as a suitable refrigerant for the 12-foot wind tunnel.

  2. Hydrostatic force used to handle outsized, heavy objects

    NASA Technical Reports Server (NTRS)

    Craft, G. W.; Starkey, A. W.

    1967-01-01

    A specially fitted barge is used to load and transport large, heavy objects to a dockside site. There the barge itself can lift, rotate, and position the objects. Typical functions are economically accomplished by water buoyancy.

  3. Efficient Power Network Analysis with Modeling of Inductive Effects

    NASA Astrophysics Data System (ADS)

    Zeng, Shan; Yu, Wenjian; Hong, Xianlong; Cheng, Chung-Kuan

    In this paper, an efficient method is proposed to accurately analyze large-scale power/ground (P/G) networks, where inductive parasitics are modeled with the partial reluctance. The method is based on frequency-domain circuit analysis and the technique of vector fitting [14], and obtains the time-domain voltage response at given P/G nodes. The frequency-domain circuit equation including partial reluctances is derived, and then solved with the GMRES algorithm with rescaling, preconditioning and recycling techniques. With the merit of sparsified reluctance matrix and iterative solving techniques for the frequency-domain circuit equations, the proposed method is able to handle large-scale P/G networks with complete inductive modeling. Numerical results show that the proposed method is orders of magnitude faster than HSPICE, several times faster than INDUCTWISE [4], and capable of handling the inductive P/G structures with more than 100,000 wire segments.
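
    A minimal sketch of the frequency-domain solve described above: a nodal system (G + jωC)v = i(ω) assembled for a toy RC ladder and solved with preconditioned GMRES. The circuit, the 1 GHz analysis frequency, and the Jacobi preconditioner are illustrative assumptions standing in for the paper's rescaling, preconditioning, and recycling scheme (and reluctance terms are omitted entirely).

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 2000                         # nodes in a toy RC ladder
g, g_gnd, c = 1.0, 0.05, 1e-12   # series S, leakage-to-ground S, capacitance F

main = np.full(n, 2 * g); main[0] = main[-1] = g
G = sp.diags([main + g_gnd, -g * np.ones(n - 1), -g * np.ones(n - 1)], [0, -1, 1])
C = sp.diags(np.full(n, c))

omega = 2 * np.pi * 1e9                        # analyze at 1 GHz
A = (G + 1j * omega * C).tocsc()               # (G + j*omega*C) v = i(omega)
b = np.zeros(n, dtype=complex); b[0] = 1e-3    # 1 mA injection at node 0

M = spla.aslinearoperator(sp.diags(1.0 / A.diagonal()))  # Jacobi preconditioner
v, info = spla.gmres(A, b, M=M, restart=50, maxiter=2000)
print("converged" if info == 0 else f"gmres info = {info}", abs(v[0]))
```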

  4. Multi-scale Modeling of Radiation Damage: Large Scale Data Analysis

    NASA Astrophysics Data System (ADS)

    Warrier, M.; Bhardwaj, U.; Bukkuru, S.

    2016-10-01

    Modification of materials in nuclear reactors due to neutron irradiation is a multiscale problem. These neutrons pass through materials creating several energetic primary knock-on atoms (PKA) which cause localized collision cascades creating damage tracks, defects (interstitials and vacancies) and defect clusters depending on the energy of the PKA. These defects diffuse and recombine throughout the whole duration of operation of the reactor, thereby changing the micro-structure of the material and its properties. It is therefore desirable to develop predictive computational tools to simulate the micro-structural changes of irradiated materials. In this paper we describe how statistical averages of the collision cascades from thousands of MD simulations are used to provide inputs to Kinetic Monte Carlo (KMC) simulations which can handle larger sizes, more defects and longer time durations. Use of unsupervised learning and graph optimization in handling and analyzing large scale MD data will be highlighted.
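
    The KMC stage referred to above can be illustrated with a minimal residence-time (BKL/Gillespie-type) loop for vacancy hopping on a cubic lattice; the attempt frequency, migration barrier, temperature, and hop distance below are illustrative assumptions, not MD-derived inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
kB = 8.617e-5                    # Boltzmann constant (eV/K)
nu0, Em, T = 1e13, 0.65, 600.0   # attempt frequency (1/s), barrier (eV), temperature (K)
a = 2.87e-10                     # hop distance (m)

rate = nu0 * np.exp(-Em / (kB * T))   # per-direction hop rate (Arrhenius)
moves = a * np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                      [0, -1, 0], [0, 0, 1], [0, 0, -1]])

pos, t = np.zeros(3), 0.0
for _ in range(100_000):
    pos += moves[rng.integers(6)]           # all six hops are equally likely here
    t += rng.exponential(1.0 / (6 * rate))  # residence time before the next hop

# Rough single-trajectory diffusivity from <r^2> = 6 D t (real analyses
# average many walkers; one walk gives only an order-of-magnitude check).
print(f"time {t:.3e} s, D ~ {(pos @ pos) / (6 * t):.3e} m^2/s")
```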

  5. Method of developing all-optical trinary JK, D-type, and T-type flip-flops using semiconductor optical amplifiers.

    PubMed

    Garai, Sisir Kumar

    2012-04-10

    To meet the demand of very fast and agile optical networks, the optical processors in a network system should have a very fast execution rate, large information handling, and large information storage capacities. Multivalued logic operations and multistate optical flip-flops are the basic building blocks for such fast running optical computing and data processing systems. In the past two decades, many methods of implementing all-optical flip-flops have been proposed. Most of these suffer from speed limitations because of the low switching response of active devices. The frequency encoding technique has been used because of its many advantages. It can preserve its identity throughout data communication irrespective of loss of light energy due to reflection, refraction, attenuation, etc. The action of polarization-rotation-based very fast switching of semiconductor optical amplifiers increases processing speed. At the same time, tristate optical flip-flops increase information handling capacity.

  6. A geochemical transport model for redox-controlled movement of mineral fronts in groundwater flow systems: A case of nitrate removal by oxidation of pyrite

    USGS Publications Warehouse

    Engesgaard, Peter; Kipp, Kenneth L.

    1992-01-01

    A one-dimensional prototype geochemical transport model was developed in order to handle simultaneous precipitation-dissolution and oxidation-reduction reactions governed by chemical equilibria. Total aqueous component concentrations are the primary dependent variables, and a sequential iterative approach is used for the calculation. The model was verified by analytical and numerical comparisons and is able to simulate sharp mineral fronts. At a site in Denmark, denitrification has been observed by oxidation of pyrite. Simulation of nitrate movement at this site showed a redox front movement rate of 0.58 m/yr, which agreed with calculations of others. It appears that the sequential iterative approach is the most practical for extension to multidimensional simulation and for handling large numbers of components and reactions. However, slow convergence may limit the size of redox systems that can be handled.
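
    The sequential (split-operator) approach described above can be sketched with a toy system: each step first advects the total aqueous concentrations, then re-equilibrates the chemistry cell by cell. A single solubility constraint stands in for the full precipitation-dissolution and redox chemistry, precipitation is one-way for brevity, and all parameter values are illustrative assumptions.

```python
import numpy as np

nx, dx, dt, v = 100, 1.0, 0.5, 1.0       # cells, spacing (m), step (yr), velocity (m/yr)
Ksp = 0.01                               # toy solubility product for mineral "AB"
A = np.zeros(nx)                         # component A, injected at the inlet
B = np.full(nx, 0.1)                     # component B, initially in the aquifer
mineral = np.zeros(nx)                   # precipitated AB per cell

for _ in range(150):
    # 1) Transport step: explicit upwind advection (CFL = v*dt/dx = 0.5).
    A[1:] -= v * dt / dx * (A[1:] - A[:-1]); A[0] = 0.2
    B[1:] -= v * dt / dx * (B[1:] - B[:-1]); B[0] = 0.0
    # 2) Chemistry step: where the ion product exceeds Ksp, precipitate x of
    #    each component so that (A - x)(B - x) = Ksp (smaller quadratic root).
    over = A * B > Ksp
    s = A[over] + B[over]
    x = (s - np.sqrt(s * s - 4.0 * (A[over] * B[over] - Ksp))) / 2.0
    A[over] -= x; B[over] -= x; mineral[over] += x

print("cells holding precipitate:", np.flatnonzero(mineral > 1e-9).size)
```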

  7. Proposal for a Web Encoding Service (WES) for Spatial Data Transactions

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    Web service utilization in Spatial Data Infrastructures (SDI) has been well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial data volumes for data delivery. This paper revisits the available web services in OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g., possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The proposed service could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.

  8. Predictability and possible earlier awareness of extreme precipitation across Europe

    NASA Astrophysics Data System (ADS)

    Lavers, David; Pappenberger, Florian; Richardson, David; Zsoter, Ervin

    2017-04-01

    Extreme hydrological events can cause large socioeconomic damages in Europe. In winter, a large proportion of these flood episodes are associated with atmospheric rivers, a region of intense water vapour transport within the warm sector of extratropical cyclones. When preparing for such extreme events, forecasts of precipitation from numerical weather prediction models or river discharge forecasts from hydrological models are generally used. Given the strong link between water vapour transport (integrated vapour transport IVT) and heavy precipitation, it is possible that IVT could be used to warn of extreme events. Furthermore, as IVT is located in extratropical cyclones, it is hypothesized to be a more predictable variable due to its link with synoptic-scale atmospheric dynamics. In this research, we firstly provide an overview of the predictability of IVT and precipitation forecasts, and secondly introduce and evaluate the ECMWF Extreme Forecast Index (EFI) for IVT. The EFI is a tool that has been developed to evaluate how ensemble forecasts differ from the model climate, thus revealing the extremeness of the forecast. The ability of the IVT EFI to capture extreme precipitation across Europe during winter 2013/14, 2014/15, and 2015/16 is presented. The results show that the IVT EFI is more capable than the precipitation EFI of identifying extreme precipitation in forecast week 2 during forecasts initialized in a positive North Atlantic Oscillation (NAO) phase. However, the precipitation EFI is superior during the negative NAO phase and at shorter lead times. An IVT EFI example is shown for storm Desmond in December 2015 highlighting its potential to identify upcoming hydrometeorological extremes.
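
    The transport quantity at the heart of the abstract, IVT, is conventionally the pressure integral of the horizontal vapour flux. A minimal sketch follows, with illustrative profile values rather than ECMWF data.

```python
import numpy as np

g = 9.81                                                  # m/s^2
p = np.array([1000, 925, 850, 700, 500, 300]) * 100.0     # pressure levels (Pa)
q = np.array([10.0, 8.5, 6.0, 3.0, 1.0, 0.2]) * 1e-3      # specific humidity (kg/kg)
u = np.array([15.0, 20.0, 22.0, 18.0, 12.0, 8.0])         # zonal wind (m/s)
v = np.array([10.0, 14.0, 15.0, 10.0, 6.0, 3.0])          # meridional wind (m/s)

def column_integral(f, p):
    """Trapezoidal integral in pressure, surface (high p) to top (low p)."""
    return -np.sum(np.diff(p) * (f[:-1] + f[1:]) / 2.0)

# IVT = (1/g) * sqrt( (int q*u dp)^2 + (int q*v dp)^2 )
ivt = np.hypot(column_integral(q * u, p), column_integral(q * v, p)) / g
print(f"IVT = {ivt:.0f} kg m^-1 s^-1")  # several hundred suggests a strong AR
```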

  9. Changes in extreme events and the potential impacts on human health.

    PubMed

    Bell, Jesse E; Brown, Claudia Langford; Conlon, Kathryn; Herring, Stephanie; Kunkel, Kenneth E; Lawrimore, Jay; Luber, George; Schreck, Carl; Smith, Adam; Uejio, Christopher

    2018-04-01

    Extreme weather and climate-related events affect human health by causing death, injury, and illness, as well as having large socioeconomic impacts. Climate change has caused changes in extreme event frequency, intensity, and geographic distribution, and will continue to be a driver for change in the future. Some of these events include heat waves, droughts, wildfires, dust storms, flooding rains, coastal flooding, storm surges, and hurricanes. The pathways connecting extreme events to health outcomes and economic losses can be diverse and complex. The difficulty in predicting these relationships comes from the local societal and environmental factors that affect disease burden. More information is needed about the impacts of climate change on public health and economies to effectively plan for and adapt to climate change. This paper describes some of the ways extreme events are changing and provides examples of the potential impacts on human health and infrastructure. It also identifies key research gaps to be addressed to improve the resilience of public health to extreme events in the future.

  10. Extreme temperatures and out-of-hospital coronary deaths in six large Chinese cities.

    PubMed

    Chen, Renjie; Li, Tiantian; Cai, Jing; Yan, Meilin; Zhao, Zhuohui; Kan, Haidong

    2014-12-01

    The seasonal trend of out-of-hospital coronary death (OHCD) and sudden cardiac death has been observed, but whether extreme temperature serves as a risk factor is rarely investigated. We therefore aimed to evaluate the impact of extreme temperatures on OHCDs in China. We obtained death records of 126,925 OHCDs from six large Chinese cities (Harbin, Beijing, Tianjin, Nanjing, Shanghai and Guangzhou) during the period 2009-2011. The short-term associations between extreme temperature and OHCDs were analysed with time-series methods in each city, using generalised additive Poisson regression models. We specified distributed lag non-linear models in studying the delayed effects of extreme temperature. We then applied Bayesian hierarchical models to combine the city-specific effect estimates. The associations between extreme temperature and OHCDs were almost U-shaped or J-shaped. The pooled relative risks (RRs) of extreme cold temperatures over lags 0-14 days, comparing the 1st and 25th centile temperatures, were 1.49 (95% posterior interval (PI) 1.26-1.76); the pooled RRs of extreme hot temperatures, comparing the 99th and 75th centile temperatures, were 1.53 (95% PI 1.27-1.84) for OHCDs. The RRs of extreme temperature on OHCD were higher among patients with coronary heart disease who were older, male and less educated. This multicity epidemiological study suggested that both extreme cold and hot temperatures pose significant risks for OHCD, and might have important public health implications for the prevention of OHCD or sudden cardiac death. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  11. Intra-seasonal Characteristics of Wintertime Extreme Cold Events over South Korea

    NASA Astrophysics Data System (ADS)

    Park, Taewon; Jeong, Jeehoon; Choi, Jahyun

    2017-04-01

    The present study reveals changes in the characteristics of extreme cold events over South Korea during boreal winter (November to March) in terms of the intra-seasonal variability of their frequency, duration, and atmospheric circulation pattern. Influences of large-scale variability such as Siberian High activity, the Arctic Oscillation (AO), and the Madden-Julian Oscillation (MJO) on extreme cold events are also investigated. In the early and late winter (November and March), the upper-tropospheric wave-train associated with the life cycle of extreme cold events tends to pass quickly over East Asia. In addition, compared with the other months, the Siberian High is weaker and strong negative AO phases occur less frequently, leading to events of weak amplitude and short duration. In midwinter (December to February), by contrast, an amplified Siberian High and strong negative AO occur more frequently, and the extreme cold events are mainly characterized by well-organized anticyclonic blocking around the Ural Mountains and the Subarctic. These large-scale circulation patterns make midwinter extreme cold events long-lasting and strong in amplitude. MJO phases 2-3, which provide favorable conditions for the amplification of extreme cold events, occur frequently from November to January, more than twice as often as in February and March. While the extreme cold events during March are the least frequent, weakest, and shortest owing to the weak influence of the above factors, the strong activity of these factors in January makes its events the most frequent, strongest, and longest of the boreal winter. Keywords: extreme cold event, wave-train, blocking, Siberian High, AO, MJO

  12. The Joint Statistics of California Temperature and Precipitation as a Function of the Large-scale State of the Climate

    NASA Astrophysics Data System (ADS)

    O'Brien, J. P.; O'Brien, T. A.

    2015-12-01

    Single climatic extremes have a strong and disproportionate effect on society and the natural environment. However, the joint occurrence of two or more concurrent extremes has the potential to negatively impact these areas of life in ways far greater than any single event could. California, USA, home to nearly 40 million people and the largest agricultural producer in the United States, is currently experiencing an extreme drought, which has persisted for several years. While drought is commonly thought of in terms of only precipitation deficits, above average temperatures co-occurring with precipitation deficits greatly exacerbate drought conditions. The 2014 calendar year in California was characterized both by extremely low precipitation and extremely high temperatures, which has significantly deepened the already extreme drought conditions leading to severe water shortages and wildfires. While many studies have shown the statistics of 2014 temperature and precipitation anomalies as outliers, none have demonstrated a connection with large-scale, long-term climate trends, which would provide useful relationships for predicting the future trajectory of California climate and water resources. We focus on understanding non-stationarity in the joint distribution of California temperature and precipitation anomalies in terms of large-scale, low-frequency trends in climate such as global mean temperature rise and oscillatory indices such as ENSO and the Pacific Decadal Oscillation among others. We consider temperature and precipitation data from the seven distinct climate divisions in California and employ a novel, high-fidelity kernel density estimation method to directly infer the multivariate distribution of temperature and precipitation anomalies conditioned on the large-scale state of the climate. We show that the joint distributions and associated statistics of temperature and precipitation are non-stationary and vary regionally in California. Further, we show that recurrence intervals of extreme concurrent events vary as a function of time and of teleconnections. This research has implications for predicting and forecasting future temperature and precipitation anomalies, which is critically important for city, water, and agricultural planning in California.
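
    The conditioning idea described above can be sketched with a Gaussian kernel density estimate fitted separately for two large-scale states. The synthetic anomalies, the simple ENSO split, and the evaluation point are illustrative assumptions, not the study's method or data.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
n = 500
enso = rng.normal(size=n)               # toy ENSO index, one value per winter
temp = 0.5 * enso + rng.normal(size=n)  # temperature anomaly (K)
prec = 0.8 * enso + rng.normal(size=n)  # precipitation anomaly (mm/day)

def joint_kde(mask):
    """2-D Gaussian KDE of (temperature, precipitation) for selected winters."""
    return gaussian_kde(np.vstack([temp[mask], prec[mask]]))

kde_pos = joint_kde(enso > 0.5)
kde_neg = joint_kde(enso < -0.5)

# Density of a hot-and-dry (drought-like) anomaly under each large-scale state.
point = np.array([[1.5], [-1.5]])       # +1.5 K, -1.5 mm/day
print("hot,dry | ENSO+:", kde_pos(point)[0])
print("hot,dry | ENSO-:", kde_neg(point)[0])
```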

  13. FDT 2.0: Improving scalability of the fuzzy decision tree induction tool - integrating database storage.

    PubMed

    Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W

    2014-12-01

    Effective machine-learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.
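
    The data-retention idea described above, keeping decisions in a database so the tool need not recompute them from scratch, can be sketched as a cache around a classifier. Here sqlite3 stands in for the MySQL backend and classify() is a placeholder for the fuzzy decision tree itself; none of this is FDT's actual API.

```python
import hashlib, json, sqlite3

def classify(features):
    """Placeholder decision function (the real tool runs fuzzy ID3 here)."""
    return "positive" if sum(features) > 0 else "negative"

conn = sqlite3.connect("fdt_cache.db")
conn.execute("CREATE TABLE IF NOT EXISTS decisions (key TEXT PRIMARY KEY, label TEXT)")

def classify_cached(features):
    key = hashlib.sha1(json.dumps(features).encode()).hexdigest()
    row = conn.execute("SELECT label FROM decisions WHERE key = ?", (key,)).fetchone()
    if row:                                   # decision retained from an earlier run
        return row[0]
    label = classify(features)
    conn.execute("INSERT INTO decisions VALUES (?, ?)", (key, label))
    conn.commit()
    return label

print(classify_cached([0.2, -0.1, 0.4]))      # computed, then stored
print(classify_cached([0.2, -0.1, 0.4]))      # served from the database
```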

  14. Accelerating root system phenotyping of seedlings through a computer-assisted processing pipeline.

    PubMed

    Dupuy, Lionel X; Wright, Gladys; Thompson, Jacqueline A; Taylor, Anna; Dekeyser, Sebastien; White, Christopher P; Thomas, William T B; Nightingale, Mark; Hammond, John P; Graham, Neil S; Thomas, Catherine L; Broadley, Martin R; White, Philip J

    2017-01-01

    There are numerous systems and techniques to measure the growth of plant roots. However, phenotyping large numbers of plant roots for breeding and genetic analyses remains challenging. One major difficulty is achieving high throughput and resolution at a reasonable cost per plant sample. Here we describe a cost-effective root phenotyping pipeline, on which we perform time and accuracy benchmarking to identify bottlenecks in such pipelines and strategies for their acceleration. Our root phenotyping pipeline was assembled with custom software and low-cost materials and equipment. Results show that sample preparation and the handling of samples during screening are the most time-consuming tasks in root phenotyping. Algorithms can be used to speed up the extraction of root traits from image data, but when applied to large numbers of images, there is a trade-off between processing time and the errors contained in the database. Scaling up root phenotyping to large numbers of genotypes will require not only automation of sample preparation and sample handling, but also efficient algorithms for error detection to allow more reliable replacement of manual interventions.

  15. Simulator study of flight characteristics of a large twin-fuselage cargo transport airplane during approach and landing

    NASA Technical Reports Server (NTRS)

    Grantham, W. D.; Deal, P. L.; Keyser, G. L., Jr.; Smith, P. M.

    1983-01-01

    A six-degree-of-freedom, ground-based simulator study was conducted to evaluate the low-speed flight characteristics of a twin-fuselage cargo transport airplane and to compare these characteristics with those of a large, single-fuselage (reference) transport configuration which was similar to the Lockheed C-5C airplane. The primary piloting task was the approach and landing. The results indicated that in order to achieve 'acceptable' low-speed handling qualities on the twin-fuselage concept, considerable stability and control augmentation was required, and although the augmented airplane could be landed safely under adverse conditions, the roll performance of the aircraft had to be improved appreciably before the handling qualities were rated as 'satisfactory.' These ground-based simulation results indicated that a value of t(phi=30) (the time required to bank 30 deg) of less than 6 sec should result in 'acceptable' roll response characteristics, and that when t(phi=30) is less than 3.8 sec, 'satisfactory' roll response should be attainable on such large and unusually configured aircraft as the subject twin-fuselage cargo transport concept.

  16. Contribution of large-scale circulation anomalies to changes in extreme precipitation frequency in the United States

    Treesearch

    Lejiang Yu; Shiyuan Zhong; Lisi Pei; Xindi (Randy) Bian; Warren E. Heilman

    2016-01-01

    The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for...

  17. A Projection of Changes in Landfalling Atmospheric River Frequency and Extreme Precipitation over Western North America from the Large Ensemble CESM Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson M.; Leung, Lai-Yung R.; Yoon, Jin-Ho

    Simulations from the Community Earth System Model Large Ensemble project are analyzed to investigate the impact of global warming on atmospheric rivers (ARs). The model has notable biases in simulating the subtropical jet position and the relationship between extreme precipitation and moisture transport. After accounting for these biases, the model projects an ensemble mean increase of 35% in the number of landfalling AR days between the last twenty years of the 20th and 21st centuries. However, the number of AR-associated extreme precipitation days increases by only 28%, because the moisture transport required to produce extreme precipitation also increases with warming. Internal variability introduces an uncertainty of ±8% and ±7% in the projected changes in AR days and associated extreme precipitation days. In contrast, accounting for model biases changes the projections by only about 1%. The significantly larger mean changes, compared with internal variability and the effects of model biases, highlight the robustness of AR responses to global warming.

  18. Large optical glass blanks for the ELT generation

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Petzold, Uwe; Dietrich, Volker; Wittmer, Volker; Rexius, Olga

    2016-07-01

    The upcoming extremely large telescope projects, such as the E-ELT, TMT, or GMT, require not only large amounts of mirror blank substrates but also sophisticated instrument setups. Common instrument components are atmospheric dispersion correctors that compensate for the varying atmospheric path length depending on the telescope inclination angle. These elements usually consist of optical glass blanks that have to be large due to the increased size of the focal beam of the extremely large telescopes. SCHOTT has long experience in producing and delivering large optical glass blanks for astronomical applications up to 1 m, in homogeneity grades up to H3 quality. The most common optical glass available in large formats is SCHOTT N-BK7, but other glass types such as F2 or LLF1 can also be produced in formats up to 1 m. The extremely large telescope projects partly demand atmospheric dispersion components in sizes beyond 1 m, up to about 1.5 m in diameter. The production of such large homogeneous optical glass blanks requires tight control of all process steps. To meet this future demand, SCHOTT initiated a research project to improve the large optical blank production process steps, from melting to annealing and measurement. Large optical glass blanks are measured in several sub-apertures that cover the total clear aperture of the application. With SCHOTT's new stitching software it is now possible to combine individual sub-aperture measurements into a total homogeneity map of the blank. First results will be demonstrated in this presentation.

  19. Extremely large magnetoresistance in a high-quality WTe2 grown by flux method

    NASA Astrophysics Data System (ADS)

    Tsumura, K.; Yano, R.; Kashiwaya, H.; Koyanagi, M.; Masubuchi, S.; Machida, T.; Namiki, H.; Sasagawa, T.; Kashiwaya, S.

    2018-03-01

    We have grown single crystals of WTe2 by a self-flux method and evaluated the quality of the crystals. A Hall bar-type device was fabricated from an as-exfoliated film on a Si substrate and longitudinal resistance Rxx was measured. Rxx increased with an applied perpendicular magnetic field without saturation and an extremely large magnetoresistance as high as 376,059 % was observed at 8.27 T and 1.7 K.

  20. Program Instrumentation and Trace Analysis

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Goldberg, Allen; Filman, Robert; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2002-01-01

    Several attempts have been made recently to apply techniques such as model checking and theorem proving to the analysis of programs. This reflects a current trend to analyze real software systems instead of just their designs, and includes our own effort to develop a model checker for Java, the Java PathFinder 1, one of the very first of its kind in 1998. However, model checking cannot handle very large programs without some kind of abstraction of the program. This paper describes a complementary, scalable technique to handle such large programs. Our interest is in the observation part of the equation: how much information can be extracted about a program from observing a single execution trace? Our intention is to develop a technology that can be applied automatically to large, full-size applications, with minimal modification to the code. We present a tool, Java PathExplorer (JPaX), for exploring execution traces of Java programs. The tool prioritizes scalability over completeness, and is directed towards detecting errors in programs, not proving correctness. One core element in JPaX is an instrumentation package that instruments Java bytecode files to log various events when executed. The instrumentation is driven by a user-provided script that specifies what information to log. Examples of instructions that such a script can contain are: 'report name and arguments of all called methods defined in class C, together with a timestamp'; 'report all updates to all variables'; and 'report all acquisitions and releases of locks'. In more complex instructions one can specify that certain expressions should be evaluated, and even that certain code should be executed, under various conditions. The instrumentation package can hence be seen as implementing aspect-oriented programming for Java, in the sense that one can add functionality to a Java program without explicitly changing the original code: one writes an aspect and compiles it into the original program using the instrumentation. Another core element of JPaX is an observation package that supports the analysis of the generated event stream. Two kinds of analysis are currently supported. In temporal analysis, the execution trace is evaluated against formulae written in temporal logic. We have implemented a temporal logic evaluator on finite traces using the Maude rewriting system from SRI International, USA. Temporal logic is defined in Maude by giving its syntax as a signature and its semantics as rewrite equations. The resulting semantics is extremely efficient and can handle event streams of hundreds of millions of events in a few minutes; furthermore, the implementation is very succinct. The second form of event stream analysis supported is error pattern analysis, where an execution trace is analyzed using various error detection algorithms that can identify error-prone programming practices that may potentially lead to errors in other executions. Two such algorithms focusing on concurrency errors have been implemented in JPaX, one for deadlocks and the other for data races. It is important to note that a deadlock or data race does not need to occur in order for its potential to be detected with these algorithms; this is what makes them very scalable in practice. The data race algorithm implemented is the Eraser algorithm from Compaq, adapted to Java. The tool is currently being applied to a code base for controlling a spacecraft by the developers of that software in order to evaluate its applicability.
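
    The Eraser lockset algorithm mentioned above is compact enough to sketch. Each shared variable keeps a candidate set of locks; every access intersects that set with the locks the accessing thread currently holds, and an empty result flags a race potential even if no race occurred on the observed trace. The event format is an illustrative assumption, not JPaX's actual log, and the initialization and read-sharing refinements of full Eraser are omitted.

```python
held = {}        # thread -> set of locks currently held
candidates = {}  # shared variable -> candidate lockset

def on_event(thread, op, target):
    locks = held.setdefault(thread, set())
    if op == "acquire":
        locks.add(target)
    elif op == "release":
        locks.discard(target)
    elif op == "access":
        if target not in candidates:
            candidates[target] = set(locks)   # first access initializes the lockset
        else:
            candidates[target] &= locks       # later accesses refine it
        if not candidates[target]:
            print(f"race potential on {target} (thread {thread})")

trace = [
    ("T1", "acquire", "L"), ("T1", "access", "x"), ("T1", "release", "L"),
    ("T2", "access", "x"),  # unprotected access empties the lockset -> warning
]
for event in trace:
    on_event(*event)
```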

  1. High-optical-power handling InGaAs photodiodes and balanced receivers for high-spurious free dynamic range (SFDR) analog photonic links

    NASA Astrophysics Data System (ADS)

    Joshi, Abhay M.; Wang, Xinde; Mohr, Dan; Becker, Donald; Patil, Ravikiran

    2004-08-01

    We have developed 20 mA or higher photocurrent-handling InGaAs photodiodes with 20 GHz bandwidth, and 10 mA or higher photocurrent-handling InGaAs photodiodes with >40 GHz bandwidth. These photodiodes have been thoroughly tested for reliability, including to the Bellcore GR-468 standard, and are built under an ISO 9001:2000 Quality Management System. These dual-depletion InGaAs/InP photodiodes are surface illuminated and yet handle such large photocurrents due to advanced band-gap engineering. They have broad wavelength coverage from 800 nm to 1700 nm, and thus can be used at several wavelengths such as 850 nm, 1064 nm, 1310 nm, 1550 nm, and 1620 nm. Furthermore, they exhibit very low polarization-dependent loss of 0.05 dB typical to 0.1 dB maximum. Using the above high-current-handling photodiodes, we have developed classical push-pull pair balanced photoreceivers for the 2 to 18 GHz EW system. These balanced photoreceivers boost the Spurious Free Dynamic Range (SFDR) by almost 3 dB by eliminating the laser RIN. Future research calls for designing an avalanche photodiode balanced pair to boost the SFDR by an additional 3 dB. These devices are a key enabling technology in meeting the SFDR requirements for several DoD systems.

  2. Investigating the intrinsic cleanliness of automated handling designed for EUV mask pod-in-pod systems

    NASA Astrophysics Data System (ADS)

    Brux, O.; van der Walle, P.; van der Donck, J. C. J.; Dress, P.

    2011-11-01

    Extreme Ultraviolet Lithography (EUVL) is the most promising solution for technology nodes of 16 nm (hp) and below. However, several unique EUV mask challenges must be resolved for a successful launch of the technology into the market. Uncontrolled introduction of particles and/or contamination into the EUV scanner significantly increases the risk of device yield loss and, potentially, scanner downtime. With the absence of a pellicle to protect the surface of the EUV mask, a zero-particle-adder regime between final clean and the point of exposure is critical for the active areas of the mask. A Dual Pod concept for handling EUV masks has been proposed by the industry as a means to minimize the risk of mask contamination during transport and storage. SuSS-HamaTech introduces MaskTrackPro InSync as a fully automated solution for handling EUV masks in and out of this Dual Pod system, and it therefore constitutes an interface between various tools inside the fab. The intrinsic cleanliness of each individual handling and storage step of the inner shell (EIP) of this Dual Pod and of the EUV mask inside the InSync tool has been investigated to confirm the capability for minimizing the risk of cross-contamination. An Entegris Dual Pod EUV-1000A-A110 was used for the qualification. Particle detection for the qualification procedure was performed with TNO's RapidNano particle scanner, qualified for particle sizes down to 50 nm (PSL equivalent). It has been shown that the target specification of <2 particles @ 60 nm per 25 cycles has been achieved. In cases where added particles were measured, the EIP was identified as a potential root cause of Ni particle generation; any direct Ni-Al contact has to be avoided to mitigate the risk of material abrasion.

  3. Monte Carlo Simulation of Nanoparticle Encapsulation in Flames

    NASA Technical Reports Server (NTRS)

    Sun, Z.; Huertas, J. I.; Axelbaum, R. L.

    1999-01-01

    Two critical challenges facing the application of flames for synthesis of nanopowder materials are: (1) overcoming formation of agglomerates and (2) ensuring that the highly reactive nanopowders that are synthesized in flames can be produced in such a manner that their purity is maintained during subsequent processing. Agglomerates are produced in flames because particle formation occurs in a high temperature and high number density environment. They are undesirable in most advanced applications of powders. For example, agglomerates have a deleterious effect on compaction density, leading to voids when nanopowders are consolidated. Efforts to avoid agglomeration in flames without substantially reducing particle number density and, consequently, production rate, have had limited success. Powder purity must also be maintained during subsequent handling of nanopowders and this poses a significant challenge for any synthesis route because nanopowders, particularly metals and non-oxide ceramic powders, are inherently reactive. Impurities acquired during handling of nanopowders have slowed the advancement of the nanostructured materials industry. One promising approach that has been proposed to address these problems is nano-encapsulation. In this approach, the core particles are encapsulated in a removable material while they are within the flame but before excessive agglomeration has occurred. Condensation can be very rapid so that core particles are trapped within the condensed material and agglomeration is limited. Nano-encapsulation also addresses the handling concerns for post-synthesis processing. Results have shown that when nano-encapsulated powders are exposed to atmosphere the core particles are protected from oxidation and/or hydrolysis. Thus, handling of the powders does not require extreme care. If, for example, at the time of consolidation the encapsulation material is removed by vacuum annealing, the resulting powder remains unagglomerated and free of impurities. In this work, we described a novel aerosol model that has been developed to simulate particle encapsulation in flames. The model will ultimately be coupled to a one-dimensional spherical flame code and compared to results from microgravity flame experiments.

  4. Belgian and Spanish consumption data and consumer handling practices for fresh fruits and vegetables useful for further microbiological and chemical exposure assessment.

    PubMed

    Jacxsens, L; Ibañez, I Castro; Gómez-López, V M; Fernandes, J Araujo; Allende, A; Uyttendaele, M; Huybrechts, I

    2015-04-01

    A consumer survey was organized in Spain and Belgium to obtain consumption data and to gain insight into consumer handling practices for fresh vegetables consumed raw or minimally processed (i.e., heads of leafy greens, bell peppers, tomatoes, fresh herbs, and precut and packed leafy greens) and fruits to be consumed without peeling (i.e., apples, grapes, strawberries, raspberries, other berries, fresh juices, and precut mixed fruit). This information can be used for microbiological and/or chemical food safety research. After extensive cleanup of rough databases for missing and extreme values and age correction, information from 583 respondents from Spain and 1,605 respondents from Belgium (18 to 65 years of age) was retained. Daily intake (grams per day) was calculated taking into account frequency and seasonality of consumption, and distributions were obtained that can be used in quantitative risk assessment for chemical hazards with chronic effects on human health. Data also were recalculated to obtain discrete distributions of consumption per portion and the corresponding frequency of consumption, which can be used in acute microbiological risk assessment or outbreak investigations. The ranked median daily consumption of fruits and vegetables was similar in Spain and Belgium: apple > strawberry > grapes > strawberries and raspberries; and tomatoes > leafy greens > bell peppers > fresh herbs. However, vegetable consumption was higher (in terms of both portion and frequency of consumption) in Spain than in Belgium, whereas the opposite was found for fruit consumption. Regarding consumer handling practices related to storage time and method, Belgian consumers less frequently stored their fresh produce in a refrigerator and did so for shorter times compared with Spanish consumers. Washing practices for lettuce heads and packed leafy greens also were different. The survey revealed differences between these two countries in consumption and consumer handling practices, which can have an impact on outcomes of future microbiological or chemical risk assessment studies.

  5. 77 FR 40494 - Procedures for the Handling of Retaliation Complaints Under Section 219 of the Consumer Product...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... cases involving important or novel legal issues, large numbers of employees, alleged violations that... computation of time before those tribunals and express filing deadlines as days rather than business days...

  6. The Application of Special Computing Techniques to Speed-Up Image Feature Extraction and Processing Techniques.

    DTIC Science & Technology

    1981-12-01

    ...processors has led to the possibility of implementing a large number of image processing functions in near real time, a result which is essential to establishing ... rapid image handling for near real-time interaction by a user at a display. For example, for a large resolution image, say ...

  7. Speech-Enabled Interfaces for Travel Information Systems with Large Grammars

    NASA Astrophysics Data System (ADS)

    Zhao, Baoli; Allen, Tony; Bargiela, Andrzej

    This paper introduces three grammar-segmentation methods capable of handling the large grammar issues associated with producing a real-time speech-enabled VXML bus travel application for London. Large grammars tend to produce relatively slow recognition interfaces and this work shows how this limitation can be successfully addressed. Comparative experimental results show that the novel last-word recognition based grammar segmentation method described here achieves an optimal balance between recognition rate, speed of processing and naturalness of interaction.

  8. ClimEx - Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec

    NASA Astrophysics Data System (ADS)

    Ludwig, Ralf; Baese, Frank; Braun, Marco; Brietzke, Gilbert; Brissette, Francois; Frigon, Anne; Giguère, Michel; Komischke, Holger; Kranzlmueller, Dieter; Leduc, Martin; Martel, Jean-Luc; Ricard, Simon; Schmid, Josef; von Trentini, Fabian; Turcotte, Richard; Weismueller, Jens; Willkofer, Florian; Wood, Raul

    2017-04-01

    The recent accumulation of extreme hydrological events in Bavaria and Québec has stimulated both scientific and societal interest. In addition to the challenges of improved prediction of such situations and the implications for the associated risk management, there is, as yet, no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events, or of how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High-performance computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high-resolution (12 km) data from the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to allow assessment of both natural climate variability and climate change. The large ensemble, with several thousand model years, provides the potential to catch rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec at high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. This specific data situation makes it possible to establish a new method of 'virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data that reveal preferential triggers of hydrological extreme events. The presentation will highlight first results from the analysis of the large-scale ClimEx model ensemble, showing the current and future ratio of natural variability and climate change impacts on meteorological extreme events. Selected data from the ensemble are used to drive a hydrological model experiment to illustrate the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change. [The authors acknowledge funding for the project from the Bavarian State Ministry for the Environment and Consumer Protection.]

  9. Texas passes first law for safe patient handling in America: landmark legislation protects health-care workers and patients from injury related to manual patient lifting.

    PubMed

    Hudson, Mary Anne

    2005-01-01

    On June 17, 2005, Texas Governor Rick Perry (R) signed into law Senate Bill 1525, making Texas the first state in the nation to require hospitals and nursing homes to implement safe patient handling and movement programs. Governor Perry is to be commended for this heroic first stand for safe patient handling in America. The landmark legislation will take effect January 1, 2006, requiring the establishment of policy to identify, assess, and develop methods of controlling the risk of injury to patients and nurses associated with lifting, transferring, repositioning, and movement of patients; evaluation of alternatives to manual lifting that reduce the risk of injury from patient lifting, including equipment and the patient care environment; restricting, to the extent feasible with existing equipment, manual handling of all or most of a patient's weight to emergency, life-threatening, or exceptional circumstances; and provision for refusal to perform patient handling tasks believed to involve unacceptable risks of injury to a patient or nurse. Manually lifting patients has been called deplorable, inefficient, dangerous to nurses, and painful and brutal to patients; manual lifting can cause needless suffering and injury to patients, with dangers including pain, bruising, skin tears, abrasions, tube dislodgement, dislocations, fractures, and being dropped by nursing staff during attempts to manually lift. Use of safe, secure, mechanical lift equipment and gentle friction-reducing devices for patient maneuvering tasks could eliminate such needless brutality. Research has proven that manual patient lifting is extremely hazardous to health-care workers, creating substantial risk of low-back injury, whether with one or two patient handlers. Studies on the use of mechanical patient lift equipment, by either nursing staff or lift teams, have proven repeatedly that most nursing staff back injury is preventable, leading to substantial savings to employers on medical and compensation costs. Because the health-care industry has relied on people to do the work of machines, nursing work remains the most dangerous occupation for disabling back injury. Back injury from patient lifting may be the single largest contributor to the nursing shortage, with perhaps 12% of nurses leaving or being terminated because of back injury. The US health-care industry has not kept pace with other industries, which provide mechanical lift equipment for lifting loads equivalent to the weight of patients, or with other countries, such as Australia and England, which are more advanced in their use of modern technology for patient lifting and in no-lifting practices in compliance with government regulations and nursing policies banning manual lifting. With Texas being the first state to succeed in passing legislation for safe patient handling, other states are working toward legislative protection against injury from manual patient lifting. California re-introduced safe patient handling legislation on February 17, 2005, with CA SB 363, Hospitals: Lift Teams, following the September 22, 2004, veto of CA AB 2532 by Governor Arnold Schwarzenegger, who said he believes existing statutory protection and workplace safety standards are sufficient to protect health care workers from injury. Massachusetts HB 2662, Relating to Safe Patient Handling in Certain Health Facilities, was introduced December 1, 2004. Ohio HB 67, signed March 21, 2005, by Governor Bob Taft (R), creates a program for interest-free loans to nursing homes for implementation of a no-manual-lift program. New York companion bills AB 7641 and SB 4029 were introduced in April 2005, calling for creation of a 2-year study to establish safe patient handling programs and collect data on nursing staff and patient injury with manual patient handling versus lift equipment, to determine best practices for improving health and safety of health-care workers and patients during patient handling. Washington State is planning re-introduction of safe patient handling legislation, after WA HB 1672, Relating to reducing injuries among patients and health care workers, stalled in committee in February 2005. Language from these state initiatives may be used as a model to assist other states with drafting safe patient handling legislation. Rapid enactment of a federal "Safe Patient Handling - No Manual Lift" mandate is essential and anticipated.

  10. [Principles of management of All-Russia Disaster Medicine Services].

    PubMed

    Sakhno, I I

    2000-11-01

    Experience from the liquidation of the consequences of the 1988 earthquake in Armenia showed that it is essential to establish a management system in regions struck by natural disaster, large accident, or catastrophe before the arrival of the main forces, in order to conduct reconnaissance and to receive the arriving units. This helps in making well-grounded decisions, setting tasks in time, and organizing and conducting emergency and rescue work. The article presents general material on the structure of the All-Russia Service of Disaster Medicine (ARSDM), the organization of management at all levels, and the interaction between the components of ARSDM and other subsystems of the Russian Service of Extreme Situations. Recommendations are given on organizing the management of ARSDM during the liquidation of the medical and sanitary consequences of large-scale extreme situations.

  11. Physically-based extreme flood frequency with stochastic storm transposition and paleoflood data on large watersheds

    NASA Astrophysics Data System (ADS)

    England, John F.; Julien, Pierre Y.; Velleux, Mark L.

    2014-03-01

    Traditionally, deterministic flood procedures such as the Probable Maximum Flood have been used for critical infrastructure design. Some Federal agencies now use hydrologic risk analysis to assess potential impacts of extreme events on existing structures such as large dams. Extreme flood hazard estimates and distributions are needed for these efforts, with very low annual exceedance probabilities (≤10⁻⁴, i.e., return periods >10,000 years). An integrated data-modeling hydrologic hazard framework for physically-based extreme flood hazard estimation is presented. Key elements include: (1) a physically-based runoff model (TREX) coupled with a stochastic storm transposition technique; (2) hydrometeorological information from radar and an extreme storm catalog; and (3) streamflow and paleoflood data for independently testing and refining runoff model predictions at internal locations. This new approach requires full integration of collaborative work in hydrometeorology, flood hydrology and paleoflood hydrology. An application on the 12,000 km² Arkansas River watershed in Colorado demonstrates that the size and location of extreme storms are critical factors in the analysis of basin-average rainfall frequency and flood peak distributions. Runoff model results are substantially improved by the availability and use of paleoflood nonexceedance data spanning the past 1000 years at critical watershed locations.
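
    A minimal Monte Carlo sketch of the stochastic storm transposition idea, under entirely invented geometry: catalog storms with Gaussian footprints are dropped at random positions in a transposition domain, and the basin-average depth is recorded to build a frequency curve. None of the numbers come from the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical setup: a square transposition domain (km) containing a circular
        # watershed; each catalog storm has a peak depth and a Gaussian spatial footprint.
        DOMAIN = 300.0
        BASIN_CENTER, BASIN_RADIUS = np.array([150.0, 150.0]), 60.0
        storm_catalog = [{"peak_mm": p, "radius_km": r}
                         for p, r in [(180, 40), (240, 25), (150, 60), (300, 20)]]

        def basin_avg_depth(storm, center):
            """Average the storm's radially decaying depth over points sampled in the basin."""
            theta = rng.uniform(0, 2 * np.pi, 400)
            rad = BASIN_RADIUS * np.sqrt(rng.uniform(0, 1, 400))
            pts = BASIN_CENTER + np.column_stack([rad * np.cos(theta), rad * np.sin(theta)])
            d = np.linalg.norm(pts - center, axis=1)
            return storm["peak_mm"] * np.exp(-0.5 * (d / storm["radius_km"]) ** 2).mean()

        # Monte Carlo: random storm, random transposed position -> basin-average rainfall.
        depths = np.array([
            basin_avg_depth(storm_catalog[rng.integers(len(storm_catalog))],
                            rng.uniform(0, DOMAIN, 2))
            for _ in range(20000)
        ])
        print("99.9th percentile basin-average depth: %.1f mm" % np.percentile(depths, 99.9))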

  12. The scaling of population persistence with carrying capacity does not asymptote in populations of a fish experiencing extreme climate variability.

    PubMed

    White, Richard S A; Wintle, Brendan A; McHugh, Peter A; Booker, Douglas J; McIntosh, Angus R

    2017-06-14

    Despite growing concerns regarding increasing frequency of extreme climate events and declining population sizes, the influence of environmental stochasticity on the relationship between population carrying capacity and time-to-extinction has received little empirical attention. While time-to-extinction increases exponentially with carrying capacity in constant environments, theoretical models suggest increasing environmental stochasticity causes asymptotic scaling, thus making minimum viable carrying capacity vastly uncertain in variable environments. Using empirical estimates of environmental stochasticity in fish metapopulations, we showed that increasing environmental stochasticity resulting from extreme droughts was insufficient to create asymptotic scaling of time-to-extinction with carrying capacity in local populations as predicted by theory. Local time-to-extinction increased with carrying capacity due to declining sensitivity to demographic stochasticity, and the slope of this relationship declined significantly as environmental stochasticity increased. However, recent 1-in-25-year extreme droughts were insufficient to extirpate populations with large carrying capacity. Consequently, large populations may be more resilient to environmental stochasticity than previously thought. The lack of carrying-capacity-related asymptotes in persistence under extreme climate variability reveals how small populations, affected by habitat loss or overharvesting, may be disproportionately threatened by increases in extreme climate events with global warming. © 2017 The Author(s).
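
    The scaling question can be mimicked with a bare-bones Ricker simulation combining demographic (Poisson) and environmental (stochastic growth-rate) noise; all parameters are illustrative, not the fitted metapopulation values, and runs are censored at the simulation horizon.

        import numpy as np

        rng = np.random.default_rng(2)

        def time_to_extinction(K, env_sd, r_mean=0.5, years=2000, reps=200):
            """Mean years until N hits 0 under demographic + environmental stochasticity."""
            times = []
            for _ in range(reps):
                n = K
                t = 0
                for t in range(1, years + 1):
                    r = rng.normal(r_mean, env_sd)                  # environmental noise
                    n = rng.poisson(n * np.exp(r * (1.0 - n / K)))  # Ricker + demographic noise
                    if n == 0:
                        break
                times.append(t)         # surviving runs are censored at `years`
            return float(np.mean(times))

        for K in (10, 50, 250, 1250):
            print(K, time_to_extinction(K, env_sd=0.4))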

  13. Real-Time Occlusion Handling in Augmented Reality Based on an Object Tracking Approach

    PubMed Central

    Tian, Yuan; Guan, Tao; Wang, Cheng

    2010-01-01

    To produce a realistic augmentation in Augmented Reality, the correct relative positions of real objects and virtual objects are very important. In this paper, we propose a novel real-time occlusion handling method based on an object tracking approach. Our method is divided into three steps: selection of the occluding object, object tracking and occlusion handling. The user selects the occluding object using an interactive segmentation method. The contour of the selected object is then tracked in the subsequent frames in real-time. In the occlusion handling step, all the pixels on the tracked object are redrawn on the unprocessed augmented image to produce a new synthesized image in which the relative position between the real and virtual object is correct. The proposed method has several advantages. First, it is robust and stable, since it remains effective when the camera is moved through large changes of viewing angles and volumes or when the object and the background have similar colors. Second, it is fast, since the real object can be tracked in real-time. Last, a smoothing technique provides seamless merging between the augmented and virtual object. Several experiments are provided to validate the performance of the proposed method. PMID:22319278
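
    The final compositing step is simple to express: wherever the tracked occluder's mask is set, the camera pixels are copied back over the rendered augmentation, so the real object ends up in front of the virtual one. The sketch below is a minimal numpy version with a synthetic frame and mask, not the authors' implementation.

        import numpy as np

        def handle_occlusion(frame, augmented, object_mask):
            """Redraw tracked real-object pixels on top of the rendered augmentation.

            frame:       H x W x 3 camera image (uint8)
            augmented:   H x W x 3 frame with the virtual object already composited
            object_mask: H x W bool mask of the tracked occluding object's pixels
            """
            out = augmented.copy()
            out[object_mask] = frame[object_mask]   # real object drawn in front
            return out

        # Toy example: a virtual gray square partially covering a tracked real object.
        frame = np.zeros((240, 320, 3), np.uint8)
        frame[100:200, 60:160] = (10, 120, 200)         # the real occluding object
        augmented = frame.copy()
        augmented[80:180, 120:260] = (128, 128, 128)    # naive virtual overlay
        mask = np.zeros((240, 320), bool)
        mask[100:200, 60:160] = True
        composited = handle_occlusion(frame, augmented, mask)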

  14. Differences among nursing homes in outcomes of a safe resident handling program.

    PubMed

    Kurowski, Alicia; Gore, Rebecca; Buchholz, Bryan; Punnett, Laura

    2012-01-01

    A large nursing home corporation implemented a safe resident handling program (SRHP) in 2004-2007. We evaluated its efficacy over a 2-year period by examining differences among 5 centers in program outcomes and potential predictors of those differences. We observed nursing assistants (NAs), recording activities and body postures at 60-second intervals on personal digital assistants at baseline and at 3-month, 12-month, and 24-month follow-ups. The two outcomes computed were change in equipment use during resident handling and change in a physical workload index that estimated spinal loading due to body postures and handled loads. Potential explanatory factors were extracted from post-observation interviews, investigator surveys of the workforce, administrative data, and employee satisfaction surveys. The facility with the most positive outcome measures showed many positive changes in these explanatory factors; the facility with the fewest positive outcome measures experienced negative changes in the same factors. These findings suggest greater SRHP benefits where there was lower NA turnover and agency staffing; less time pressure; and better teamwork, staff communication, and supervisory support. © 2012 American Society for Healthcare Risk Management of the American Hospital Association.

  15. Performance of the Magnetospheric Multiscale central instrument data handling

    NASA Astrophysics Data System (ADS)

    Klar, Robert A.; Miller, Scott A.; Brysch, Michael L.; Bertrand, Allison R.

    In order to study the fundamental physical processes of magnetic reconnection, particle acceleration and turbulence, the Magnetospheric Multiscale (MMS) mission employs a constellation of four identically configured observatories, each with a suite of complementary science instruments. Southwest Research Institute® (SwRI®) developed the Central Instrument Data Processor (CIDP) to handle the large data volume associated with these instruments. The CIDP is an integrated access point between the instruments and the spacecraft. It provides synchronization pulses, relays telecommands, and gathers instrument housekeeping telemetry. It collects science data from the instruments and stores them in mass memory for later playback to a ground station. This paper retrospectively examines the data handling performance realized by the CIDP implementation. It elaborates on some of the constraints on the hardware and software designs and the resulting effects on performance. For the hardware, it discusses the limitations of the front-end electronics input/output (I/O) architecture and associated mass memory buffering. For the software, it discusses the limitations of the Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP) implementation and the data structure choices for file management. It also describes design changes that improve data handling performance in newer designs.

  16. Improving Memory Error Handling Using Linux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlton, Michael Andrew; Blanchard, Sean P.; Debardeleben, Nathan A.

    As supercomputers continue to get faster and more powerful in the future, they will also have more nodes. If nothing is done, then the amount of memory in supercomputer clusters will soon grow large enough that memory failures will be unmanageable to deal with by manually replacing memory DIMMs. "Improving Memory Error Handling Using Linux" is a process-oriented method to solve this problem by using the Linux kernel to disable (offline) faulty memory pages containing bad addresses, preventing them from being used again by a process. The process of offlining memory pages simplifies error handling and results in reducing both hardware and manpower costs required to run Los Alamos National Laboratory (LANL) clusters. This process will be necessary for the future of supercomputing to allow the development of exascale computers. Without such memory error handling, it will not be feasible to manually replace the number of DIMMs that will fail daily on a machine consisting of 32-128 petabytes of memory. Testing reveals the process of offlining memory pages works and is relatively simple to use. As more and more testing is conducted, the entire process will be automated within the high-performance computing (HPC) monitoring software, Zenoss, at LANL.
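
    A minimal sketch of the underlying kernel interface, the sysfs soft-offline file provided when the kernel is built with CONFIG_MEMORY_FAILURE; the physical address is a placeholder, and the LANL automation around Zenoss is not reproduced here. Requires root.

        # Soft-offline a suspect page: the kernel migrates its contents and
        # retires the page frame so no process can map it again.
        SOFT_OFFLINE = "/sys/devices/system/memory/soft_offline_page"

        def soft_offline(phys_addr: int) -> None:
            """Write the page's physical address to the memory-failure sysfs file."""
            with open(SOFT_OFFLINE, "w") as f:
                f.write(f"{phys_addr:#x}")

        # e.g. soft_offline(0x1234000)  # hypothetical address from an EDAC/mcelog report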

  17. Modern Sorters for Soil Segregation on Large Scale Remediation Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shonka, J.J.; Kelley, J.E.; O'Brien, J.M.

    2008-01-15

    In the mid-1940s, Dr. C. Lapointe developed a Geiger-tube-based uranium ore scanner and picker to replace hand-cobbing. In the 1990s, a modern version of the Lapointe Picker for soil sorting was developed around the need to clean the Johnston Atoll of plutonium. It worked well with sand, but such systems are ineffective with soil, especially under wet conditions. Additionally, several other constraints limited throughput: slow-moving belts and thin layers of material on the belt, coupled with the use of multiple small detectors and small sorting gates, make these systems ineffective for high throughput. Soil sorting of clay-bearing soils and building debris requires a new look at both the material handling equipment and the radiation detection methodology. A new class of Super-Sorters has attained throughput one hundred times that of the old designs. Higher throughput means shorter schedules, which reduce costs substantially. The planning, cost, implementation, and other site considerations for these new Super-Sorters are discussed. Modern soil segregation was developed by Ed Bramlitt of the Defense Nuclear Agency for cleanup at Johnston Atoll. The process eventually became the Segmented Gate System (SGS). This system uses an array of small sodium iodide (NaI) detectors, each viewing a small volume (segment) that controls a gate. The volume in the gate is approximately one kg. This system works well when the material to be processed is sand; however, when the material is wet and sticky (soils with clays), the system has difficulty moving the material through the gates. Super-Sorters are a new class of machine designed to take advantage of high-throughput aggregate processing conveyors, large acquisition volumes, and large NaI detectors using gamma spectroscopy. By using commercially available material handling equipment, the system can attain processing rates of up to 400 metric tons/hr with spectrum acquisition approximately every 0.5 sec, so the acquisition volume is 50 kilograms or less. Smaller sorting volumes can be obtained with lower throughput or by re-sorting the diverted material. This equipment can also handle large objects. The use of spectroscopy systems allows several regions of interest to be set. Super-Sorters can bring waste processing charges down to less than $30/metric ton on smaller jobs and can save hundreds of dollars per metric ton in disposal charges. The largest effect on the overall project cost occurs during planning and implementation. The overall goal is reduction of the length of the project, which dictates the most efficient soil processing. With all sorting systems, the parameters that need to be accounted for are matrix type, soil feed rate, soil pre-processing, site conditions, and regulatory issues. The soil matrix and its ability to flow are extremely crucial to operations. It is also important to consider that as conditions change (i.e., moisture), the flowability of the soil matrix will change. Many soil parameters have to be considered: cohesive strength, internal and wall friction, permeability, and bulk density as a function of consolidating pressure. Clay-bearing soils have very low permeability and high cohesive strength, which makes them difficult to process, especially when wet. Soil feed speed depends on the equipment present and the ability to move the soil in the Super-Sorter processing area. When a Super-Sorter is running at 400 metric tons per hour, it is difficult to feed the system. As an example, front-end loaders with large buckets move approximately 5-10 metric tons of material per load, so 400 metric tons per hour would require 50-100 bucket-loads per hour. Because the flowability of the soil matrix is important, poor material is often pre-processed before it is added to the feed hopper of the 'survey' conveyor. This pre-processing can consist of a 'grizzly' to remove large objects from the soil matrix, followed by a screening plant to prepare the soil so that it feeds well. Hydrated lime can be added to improve material properties. Site conditions (site area, typical weather conditions, etc.) also play a large part in project planning. Downtime lengthens project schedule and costs. The system must be configured to handle weather conditions or other variables that affect throughput. The largest single factor in the project design is the regulatory environment. Before a sorter can be utilized, an averaging mass must be established by the regulator(s). There currently are no standards or guidelines in this area. The differences between acquisition mass and averaging mass are very important. The acquisition mass is defined by the acquisition time and the geometry of the detectors. The averaging mass can then be as small as the acquisition mass or as large as several hundred tons (the averaging mass is simply the sum of a number of acquisitions). It is important to define volumetric limits and any required point-source limits; Super-Sorters handle both types of limits simultaneously. The minimum detectable activity for Super-Sorters is a function of speed; for example, the detection confidence level for a 0.1 μCi point source of Ra-226 varies with the alarm point and the sorter process rate. The minimum detectable activity and diversion volume for a Super-Sorter are also functions of the acquisition mass. Measurements with a 0-15 kg acquisition mass showed diversion volumes of 20-30 kg for a point-source diversion. Soil Super-Sorters should be considered for every D&D project where it is desirable to reduce the waste stream. A volume reduction of 1:1000 can be gained for each pass through a modern sorter, resulting in significant savings in disposal costs.
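
    The gating decision itself reduces to inferring an activity concentration from region-of-interest counts for each acquisition mass and comparing it against a volumetric limit. The sketch below shows that logic with invented calibration factors and limits; a real system would also handle background subtraction, detection limits, and point-source criteria.

        from dataclasses import dataclass

        @dataclass
        class Acquisition:
            mass_kg: float
            roi_counts: dict    # net counts per region of interest, e.g. {"Ra-226": 412}

        # Hypothetical calibration: net ROI counts per (Bq/g) per kg per acquisition.
        CAL = {"Ra-226": 1.8, "Cs-137": 2.3}
        VOLUMETRIC_LIMIT = {"Ra-226": 0.2, "Cs-137": 0.1}   # Bq/g, illustrative only

        def divert(acq: Acquisition) -> bool:
            """Divert the acquisition volume if any ROI implies activity above its limit."""
            for nuclide, counts in acq.roi_counts.items():
                conc = counts / (CAL[nuclide] * acq.mass_kg)    # inferred Bq/g
                if conc > VOLUMETRIC_LIMIT[nuclide]:
                    return True
            return False

        print(divert(Acquisition(mass_kg=48.0, roi_counts={"Ra-226": 40.0})))  # True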

  18. Multi-objective optimization for model predictive control.

    PubMed

    Wojsznis, Willy; Mehta, Ashish; Wojsznis, Peter; Thiele, Dirk; Blevins, Terry

    2007-06-01

    This paper presents a technique of multi-objective optimization for Model Predictive Control (MPC) in which the objective function has three levels, in order of priority: handling constraints, maximizing economics, and maintaining control. The greatest weights are assigned dynamically to control or constraint variables that are predicted to be out of their limits. The weights assigned for economics have to outweigh those assigned for control objectives. Control variables (CVs) can be controlled at fixed targets or within one- or two-sided ranges around the targets. Manipulated variables (MVs) can have assigned targets too, which may be predefined values or current actual values. This MV functionality is extremely useful when economic objectives are not defined for some or all of the MVs. To achieve this complex operation, handle process outputs predicted to go out of limits, and guarantee a solution under any condition, the technique makes use of the priority structure, penalties on slack variables, and redefinition of the constraint and control model. An engineering implementation of this approach is shown in the MPC embedded in an industrial control system. The optimization and control of a distillation column, the standard Shell heavy oil fractionator (HOF) problem, is adequately achieved with this MPC.
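
    A single-step toy version of the priority structure can be written as one scalarized objective in which slack penalties on constraint violations dominate economics, which in turn dominates setpoint tracking. The gain matrix, weights, and prices below are invented for illustration; an industrial MPC solves a properly posed QP over a full prediction horizon.

        import numpy as np
        from scipy.optimize import minimize

        # Toy 2-MV, 2-CV steady-state gain model; weights reflect the priority
        # order (constraints >> economics >> control). All numbers are illustrative.
        G = np.array([[0.8, -0.3],
                      [0.2,  0.9]])
        y0, cv_target = np.array([1.0, 0.5]), np.array([1.2, 0.4])
        cv_hi = np.array([1.5, 0.6])
        W_CONSTRAINT, W_ECON, W_CONTROL = 1e6, 1e2, 1.0
        econ_price = np.array([-1.0, 0.5])          # cost per unit MV move

        def objective(u):
            y = y0 + G @ u
            slack = np.maximum(y - cv_hi, 0.0)      # soft constraint violations (slacks)
            return (W_CONSTRAINT * slack @ slack
                    + W_ECON * econ_price @ u
                    + W_CONTROL * np.sum((y - cv_target) ** 2))

        u_opt = minimize(objective, x0=np.zeros(2), bounds=[(-1, 1), (-1, 1)]).x
        print("MV moves:", u_opt.round(3))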

  19. Effect of Americium-241 Content on Plutonium Radiation Source Terms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    1998-12-28

    The management of excess plutonium by the US Department of Energy includes a number of storage and disposition alternatives. Savannah River Site (SRS) is supporting DOE with plutonium disposition efforts, including the immobilization of certain plutonium materials in a borosilicate glass matrix. Surplus plutonium inventories slated for vitrification include materials with elevated levels of Americium-241. The Am-241 content of plutonium materials generally reflects in-growth of the isotope due to decay of plutonium and is age-dependent. However, select plutonium inventories have Am-241 levels considerably above the age-based levels. Elevated levels of americium significantly impact radiation source terms of plutonium materials and will make handling of the materials more difficult. Plutonium materials are normally handled in shielded glove boxes, and the work entails both extremity and whole-body exposures. This paper reports results of an SRS analysis of plutonium materials source terms vs. the Americium-241 content of the materials. Data on the dependence and magnitude of source terms vs. Am-241 levels are presented and discussed. The investigation encompasses both vitrified and un-vitrified plutonium oxide (PuO2) batches.
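
    The age dependence of in-growth mentioned here follows the standard Bateman solution for the Pu-241 → Am-241 decay chain. The sketch below evaluates that formula with published half-lives; it is a generic decay calculation, not the SRS source-term analysis.

        import numpy as np

        LN2 = np.log(2.0)
        T_HALF_PU241, T_HALF_AM241 = 14.35, 432.6           # years
        l1, l2 = LN2 / T_HALF_PU241, LN2 / T_HALF_AM241

        def am241_ingrowth(n_pu241_0: float, t: np.ndarray) -> np.ndarray:
            """Bateman solution: Am-241 atoms present at time t from an initial
            Pu-241 inventory, accounting for both decays."""
            return n_pu241_0 * l1 / (l2 - l1) * (np.exp(-l1 * t) - np.exp(-l2 * t))

        t = np.array([5.0, 10.0, 20.0, 40.0])
        frac = am241_ingrowth(1.0, t)   # fraction of initial Pu-241 atoms now Am-241
        print(dict(zip(t, frac.round(3))))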

  20. Development and Testing of a ``Backlash-Free'' Gas-Tight High Precision Sample Handling Mechanism for Combined Science on the ExoMars 2018 Rover

    NASA Astrophysics Data System (ADS)

    Paul, R.; Redlich, D.; Richter, L.; Zuknik, K.-H.; Muhlbauer, Q.; Thiel, M.; Fowler, L.; Tattusch, T.; Weisz, H.; Musso, F.; Durrant, S.

    2015-09-01

    This paper presents the development and testing by OHB System AG of the Powdered Sample Handling Mechanism (PSHS), part of the rover of the European Space Agency 2018 ExoMars mission, a cooperative mission with Roscosmos including a scientific instrument contribution from NASA. The task of this mechanism is to flatten and position powdered Martian soil samples, allowing subsequent investigation of selected grains by different optical instruments and thus providing combined science in an ultra-clean environment. The exceptional sensitivity of these instruments imposes extremely challenging requirements with respect to positioning performance as well as cleanliness and contamination control. The impact of these design drivers is highlighted, focusing on specific mechanism features such as the pre-torque device that minimizes backlash and the dynamic feed-through that allows gas-tight encapsulation of an ultra-clean zone free of drive-train components. Subsequently, the results of the test campaign of an elegant breadboard under Mars-like conditions, as well as first QM test results, are described. Furthermore, the outcomes of combined tests with an optical instrument are reported.

  1. Uses for lunar crawler transporters

    NASA Astrophysics Data System (ADS)

    Kaden, Richard A.

    This article discusses state-of-the-art crawler transporters and expresses the need for additional research and development on lunar crawlers. The thrust of the paper illustrates how basic crawler technology has progressed to a point where extremely large modules can be shop-fabricated and moved to distant locations at considerable savings. Also, extremely heavy loads may be lifted by large crawler cranes and placed in designated locations. The Transi-Lift Crawler crane, with its traveling counterweight, is an attractive concept for lunar construction.

  2. EVA Swab Tool to Support Planetary Protection and Astrobiology Evaluations

    NASA Technical Reports Server (NTRS)

    Rucker, Michelle A.; Hood, Drew; Walker, Mary; Venkateswaran, Kasthuri J.; Schuerger, Andrew C.

    2018-01-01

    When we send humans to search for life on other planets, we'll need to know what we brought with us versus what may already be there. To ensure our crewed systems meet planetary protection requirements-and to protect our science from human contamination-we'll need to assess whether microorganisms may be leaking or venting from our spacecraft. Microbial sample collection outside of a pressurized spacecraft is complicated by temperature extremes, low pressures that preclude the use of laboratory standard (wetted) swabs, and operation either in bulky spacesuits or with robotic assistance. Engineers at the National Aeronautics and Space Administration (NASA) recently developed a swab kit for use in collecting microbial samples from the external surfaces of crewed spacecraft, including spacesuits. The Extravehicular Activity (EVA) Swab Kit consists of a single swab tool handle and an eight-canister sample caddy. The design team minimized development cost by re-purposing a heritage Space Shuttle tile repair handle that was designed to quickly snap into different tool attachments by engaging a mating device in each attachment. This allowed the tool handle to snap onto a fresh swab attachment much like popular shaving razor handles can snap onto a disposable blade cartridge. To disengage the handle from a swab, the user performs two independent functions, which can be done with a single hand. This dual operation mitigates the risk that a swab will be inadvertently released and lost in microgravity. Each swab attachment is fitted with commercially available foam swab tips, vendor-certified to be sterile for Deoxyribonucleic Acid (DNA). A microbial filter installed in the bottom of each sample container allows the container to outgas and repressurize without introducing microbial contaminants to internal void spaces. Extensive ground testing, post-test handling, and sample analysis confirmed the design is able to maintain sterile conditions as the canister moves between various pressure environments. To further minimize cost, the design team acquired extensive ground test experience in a relevant flight environment by piggy-backing onto suited crew training runs. These training runs allowed the project to validate tool interfaces with pressurized EVA gloves and collect user feedback on the tool design and function, as well as characterize baseline microbial data for different types of spacesuits. In general, test subjects found the EVA Swab Kit relatively straightforward to operate, but identified a number of design improvements that will be incorporated into the final design. Although originally intended to help characterize human forward contaminants, this tool has other potential applications, such as for collecting and preserving space-exposed materials to support astrobiology experiments.

  3. Dynamic optimization of walker-assisted FES-activated paraplegic walking: simulation and experimental studies.

    PubMed

    Nekoukar, Vahab; Erfanian, Abbas

    2013-11-01

    In this paper, we propose a musculoskeletal model of walker-assisted FES-activated paraplegic walking for the generation of muscle stimulation patterns and characterization of the causal relationships between muscle excitations, multi-joint movement, and handle reaction force (HRF). The model consists of the lower extremities, trunk, hands, and a walker. The simulation of walking is performed using particle swarm optimization to minimize the tracking errors from the desired trajectories for the lower extremity joints, to reduce the stimulations of the muscle groups acting around the hip, knee, and ankle joints, and to minimize the HRF. The results of the simulation studies using data recorded from healthy subjects performing walker-assisted walking indicate that the model-generated muscle stimulation patterns are in agreement with the EMG patterns that have been reported in the literature. The experimental results on two paraplegic subjects demonstrate that the proposed methodology can improve walking performance, reduce HRF, and increase walking speed when compared to the conventional FES-activated paraplegic walking. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
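
    A bare-bones global-best particle swarm optimizer of the kind used here fits in a few lines; the composite cost below only stands in for the paper's tracking-error, stimulation-effort, and HRF terms, with made-up weights and dimensions.

        import numpy as np

        rng = np.random.default_rng(3)

        def pso(cost, dim, n=30, iters=200, lo=-1.0, hi=1.0):
            """Bare-bones global-best particle swarm optimizer."""
            x = rng.uniform(lo, hi, (n, dim))
            v = np.zeros((n, dim))
            pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
            g = pbest[pbest_f.argmin()].copy()
            for _ in range(iters):
                r1, r2 = rng.random((2, n, dim))
                v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                f = np.array([cost(p) for p in x])
                better = f < pbest_f
                pbest[better], pbest_f[better] = x[better], f[better]
                g = pbest[pbest_f.argmin()].copy()
            return g

        # Placeholder composite cost: joint-angle tracking error + stimulation effort.
        def cost(u):
            track_err = np.sum((u - 0.3) ** 2)     # stand-in tracking term
            effort = 0.1 * np.sum(u ** 2)          # stand-in stimulation effort
            return track_err + effort

        print(pso(cost, dim=6).round(2))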

  4. Nonempirical Semilocal Free-Energy Density Functional for Matter under Extreme Conditions

    DOE PAGES

    Karasiev, Valentin V.; Dufty, James W.; Trickey, S. B.

    2018-02-14

    The potential for density functional calculations to predict the properties of matter under extreme conditions depends crucially upon having a non-empirical approximate free-energy functional valid over a wide range of state conditions. Unlike the ground-state case, no such free-energy exchange-correlation (XC) functional exists. We remedy that with systematic construction of a generalized gradient approximation XC free-energy functional based on rigorous constraints, including the free-energy gradient expansion. The new functional provides the correct temperature dependence in the slowly varying regime and the correct zero-T, high-T, and homogeneous electron gas limits. Application in Kohn-Sham calculations for hot electrons in a static fcc aluminum lattice demonstrates the combined magnitude of thermal and gradient effects handled by this functional. Its accuracy in the increasingly important warm dense matter regime is attested by excellent agreement of the calculated deuterium equation of state with reference path integral Monte Carlo results at intermediate and elevated temperatures and by low-density Al calculations over a wide T range.

  5. Nonempirical Semilocal Free-Energy Density Functional for Matter under Extreme Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karasiev, Valentin V.; Dufty, James W.; Trickey, S. B.

    The potential for density functional calculations to predict the properties of matter under extreme conditions depends crucially upon having a non-empirical approximate free-energy functional valid over a wide range of state conditions. Unlike the ground-state case, no such free-energy exchange-correlation (XC) functional exists. We remedy that with systematic construction of a generalized gradient approximation XC free-energy functional based on rigorous constraints, including the free-energy gradient expansion. The new functional provides the correct temperature dependence in the slowly varying regime and the correct zero-T, high-T, and homogeneous electron gas limits. Application in Kohn-Sham calculations for hot electrons in a static fcc aluminum lattice demonstrates the combined magnitude of thermal and gradient effects handled by this functional. Its accuracy in the increasingly important warm dense matter regime is attested by excellent agreement of the calculated deuterium equation of state with reference path integral Monte Carlo results at intermediate and elevated temperatures and by low-density Al calculations over a wide T range.

  6. Osteogenesis imperfecta: rehabilitation approach with infants and young children.

    PubMed

    Binder, H; Hawks, L; Graybill, G; Gerber, N L; Weintrob, J C

    1984-09-01

    A rehabilitation approach, consisting of initial handling and positioning followed by functional and formal strengthening exercises, was developed for the child with severe progressive osteogenesis imperfecta (OI). The program was developed because of the increased life expectancy for infants and children with severe progressive OI, combined with the lack of published reports dealing with their rehabilitation. The program can be followed easily by parents or therapists with regular monitoring by a physiatrist. The goals are to improve the life span as well as the quality of life of these children by preventing the following: (1) positional contractures and deformities, (2) muscle weakness and osteoporosis, and (3) malalignment of the lower extremity joints prohibiting weight-bearing. Implementation of the program requires full cooperation of the parents. The initial results in four children between the ages of 3 and 11 years are encouraging. The benefits of increased strength and mobility leading to more age-appropriate activities and behaviors outweigh the only observed negative result, that is, trauma-related lower extremity fractures in children with milder disease, and therefore greater mobility and higher activity levels.

  7. Flight Test Results from the NF-15B Intelligent Flight Control System (IFCS) Project with Adaptation to a Simulated Stabilator Failure

    NASA Technical Reports Server (NTRS)

    Bosworth, John T.; Williams-Hayes, Peggy S.

    2007-01-01

    Adaptive flight control systems have the potential to be more resilient to extreme changes in airplane behavior. Extreme changes could be a result of a system failure or of damage to the airplane. A direct adaptive neural-network-based flight control system was developed for the National Aeronautics and Space Administration NF-15B Intelligent Flight Control System airplane and subjected to an inflight simulation of a failed (frozen, unmovable) stabilator. Formation flight handling qualities evaluations were performed with and without neural network adaptation. The results of these flight tests are presented. Comparison with simulation predictions and analysis of the performance of the adaptation system are discussed. The performance of the adaptation system is assessed in terms of its ability to decouple the roll and pitch response and reestablish good onboard model tracking. Flight evaluation with the simulated stabilator failure and adaptation engaged showed that there was generally improvement in the pitch response; however, a tendency for roll pilot-induced oscillation was experienced. A detailed discussion of the cause of the mixed results is presented.

  8. Flight Test Results from the NF-15B Intelligent Flight Control System (IFCS) Project with Adaptation to a Simulated Stabilator Failure

    NASA Technical Reports Server (NTRS)

    Bosworth, John T.; Williams-Hayes, Peggy S.

    2010-01-01

    Adaptive flight control systems have the potential to be more resilient to extreme changes in airplane behavior. Extreme changes could be a result of a system failure or of damage to the airplane. A direct adaptive neural-network-based flight control system was developed for the National Aeronautics and Space Administration NF-15B Intelligent Flight Control System airplane and subjected to an inflight simulation of a failed (frozen, unmovable) stabilator. Formation flight handling qualities evaluations were performed with and without neural network adaptation. The results of these flight tests are presented. Comparison with simulation predictions and analysis of the performance of the adaptation system are discussed. The performance of the adaptation system is assessed in terms of its ability to decouple the roll and pitch response and reestablish good onboard model tracking. Flight evaluation with the simulated stabilator failure and adaptation engaged showed that there was generally improvement in the pitch response; however, a tendency for roll pilot-induced oscillation was experienced. A detailed discussion of the cause of the mixed results is presented.

  9. Time management and nectar flow: flower handling and suction feeding in long-proboscid flies (Nemestrinidae: Prosoeca).

    PubMed

    Karolyi, Florian; Morawetz, Linde; Colville, Jonathan F; Handschuh, Stephan; Metscher, Brian D; Krenn, Harald W

    2013-11-01

    A well-developed suction pump in the head represents an important adaptation for nectar-feeding insects, such as Hymenoptera, Lepidoptera and Diptera. This pumping organ creates a pressure gradient along the proboscis, which is responsible for nectar uptake. The extremely elongated proboscis of the genus Prosoeca (Nemestrinidae) evolved as an adaptation to feeding from long, tubular flowers. According to the functional constraint hypothesis, nectar uptake through a disproportionately elongated, straw-like proboscis increases flower handling time and consequently lowers the energy intake rate. Due to the conspicuous length variation of the proboscis of Prosoeca, individuals with longer proboscides are hypothesised to have longer handling times. To test this hypothesis, we used field video analyses of flower-visiting behaviour, detailed examinations of the suction pump morphology and correlations of proboscis length with body length and suction pump dimensions. Using a biomechanical framework described for nectar-feeding Lepidoptera in relation to proboscis length and suction pump musculature, we describe and contrast the system in long-proboscid flies. Flies with longer proboscides spent significantly more time drinking from flowers. In addition, proboscis length and body length showed a positive allometric relationship. Furthermore, adaptations of the suction pump included an allometric relationship between proboscis length and suction pump muscle volume and a combination of two pumping organs. Overall, the study gives detailed insight into the adaptations required for long-proboscid nectar feeding, and comparisons with other nectar-sucking insects allow further considerations of the evolution of the suction pump in insects with sucking mouthparts.

  10. Linear modeling of human hand-arm dynamics relevant to right-angle torque tool interaction.

    PubMed

    Ay, Haluk; Sommerich, Carolyn M; Luscher, Anthony F

    2013-10-01

    A new protocol was evaluated for identification of stiffness, mass, and damping parameters employing a linear model for human hand-arm dynamics relevant to right-angle torque tool use. Powered torque tools are widely used to tighten fasteners in manufacturing industries. While these tools increase accuracy and efficiency of tightening processes, operators are repetitively exposed to impulsive forces, posing risk of upper extremity musculoskeletal injury. A novel testing apparatus was developed that closely mimics biomechanical exposure in torque tool operation. Forty experienced torque tool operators were tested with the apparatus to determine model parameters and validate the protocol for physical capacity assessment. A second-order hand-arm model with parameters extracted in the time domain met model accuracy criterion of 5% for time-to-peak displacement error in 93% of trials (vs. 75% for frequency domain). Average time-to-peak handle displacement and relative peak handle force errors were 0.69 ms and 0.21%, respectively. Model parameters were significantly affected by gender and working posture. Protocol and numerical calculation procedures provide an alternative method for assessing mechanical parameters relevant to right-angle torque tool use. The protocol more closely resembles tool use, and calculation procedures demonstrate better performance of parameter extraction using time domain system identification methods versus frequency domain. Potential future applications include parameter identification for in situ torque tool operation and equipment development for human hand-arm dynamics simulation under impulsive forces that could be used for assessing torque tools based on factors relevant to operator health (handle dynamics and hand-arm reaction force).
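
    The underlying model is the single-degree-of-freedom system m·x'' + c·x' + k·x = F(t), and time-to-peak displacement, one of the paper's accuracy metrics, falls straight out of a simulation. The parameter values and force pulse below are placeholders, not the study's estimates.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative hand-arm parameters: mass (kg), damping (N·s/m), stiffness (N/m).
        m, c, k = 2.0, 60.0, 4000.0

        def handle_force(t):
            """Impulsive handle force: 100 N applied for the first 50 ms."""
            return 100.0 if t < 0.05 else 0.0

        def rhs(t, y):
            x, xdot = y
            return [xdot, (handle_force(t) - c * xdot - k * x) / m]

        sol = solve_ivp(rhs, (0, 0.3), [0.0, 0.0], max_step=1e-3)
        i_peak = np.argmax(sol.y[0])
        print(f"time to peak handle displacement: {sol.t[i_peak] * 1000:.1f} ms")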

  11. Time management and nectar flow: flower handling and suction feeding in long-proboscid flies (Nemestrinidae: Prosoeca)

    NASA Astrophysics Data System (ADS)

    Karolyi, Florian; Morawetz, Linde; Colville, Jonathan F.; Handschuh, Stephan; Metscher, Brian D.; Krenn, Harald W.

    2013-11-01

    A well-developed suction pump in the head represents an important adaptation for nectar-feeding insects, such as Hymenoptera, Lepidoptera and Diptera. This pumping organ creates a pressure gradient along the proboscis, which is responsible for nectar uptake. The extremely elongated proboscis of the genus Prosoeca (Nemestrinidae) evolved as an adaptation to feeding from long, tubular flowers. According to the functional constraint hypothesis, nectar uptake through a disproportionately elongated, straw-like proboscis increases flower handling time and consequently lowers the energy intake rate. Due to the conspicuous length variation of the proboscis of Prosoeca, individuals with longer proboscides are hypothesised to have longer handling times. To test this hypothesis, we used field video analyses of flower-visiting behaviour, detailed examinations of the suction pump morphology and correlations of proboscis length with body length and suction pump dimensions. Using a biomechanical framework described for nectar-feeding Lepidoptera in relation to proboscis length and suction pump musculature, we describe and contrast the system in long-proboscid flies. Flies with longer proboscides spent significantly more time drinking from flowers. In addition, proboscis length and body length showed a positive allometric relationship. Furthermore, adaptations of the suction pump included an allometric relationship between proboscis length and suction pump muscle volume and a combination of two pumping organs. Overall, the study gives detailed insight into the adaptations required for long-proboscid nectar feeding, and comparisons with other nectar-sucking insects allow further considerations of the evolution of the suction pump in insects with sucking mouthparts.

  12. America's container ports : freight hubs that connect our nation to global markets

    DOT National Transportation Integrated Search

    2009-06-01

    The U.S. marine transportation system continues to handle large volumes of domestic and international freight in support of the nation's economic activities. The demand for freight transportation responds to trends in global economic activity a...

  13. Special Advocate.

    ERIC Educational Resources Information Center

    Vander Weele, Maribeth

    1992-01-01

    Thomas Hehir, special education chief of Chicago Public Schools, is an evangelist for integrating children with disabilities into regular classrooms. By completely reorganizing a department viewed as a political patronage dumping ground, Hehir has made remarkable progress in handling the large number of children awaiting evaluation and placement in special…

  14. Large Prominence Eruption [video

    NASA Image and Video Library

    2014-10-07

    The STEREO (Behind) spacecraft captured this large prominence and coronal mass ejection as they erupted into space (Sept. 26, 2014). By combining images from three instruments, scientists can see the eruption itself (in extreme UV light) as well as follow its progression over the period of about 13 hours with its two coronagraphs. Credit: NASA/Goddard/STEREO

  15. Large Prominence Eruption (October 3, 2014)

    NASA Image and Video Library

    2017-12-08

    The STEREO (Behind) spacecraft captured this large prominence and coronal mass ejection as they erupted into space (Sept. 26, 2014). By combining images from three instruments, scientists can see the eruption itself (in extreme UV light) as well as follow its progression over the period of about 13 hours with its two coronagraphs. Credit: NASA/Goddard/STEREO

  16. The study on servo-control system in the large aperture telescope

    NASA Astrophysics Data System (ADS)

    Hu, Wei; Zhenchao, Zhang; Daxing, Wang

    2008-08-01

    Servo tracking is one of the crucial technologies that must be solved in the research and manufacture of large and extremely large astronomical telescopes. Addressing the control requirements of such telescopes, this paper designs a servo tracking control system for a large astronomical telescope. The system is organized as a master-slave distributed control system: the host computer sends steering instructions and receives the slave computer's operating status, while the slave computer implements the control algorithm and executes real-time control. The servo control uses direct-drive motors, with the direct torque control algorithm implemented on a DSP. This design not only improves control system performance but also greatly reduces the volume and cost of the control system, which is significant. Calculations and simulations show the design scheme to be sound, and the system can be applied to large astronomical telescopes.
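
    As a minimal sketch of the slave computer's real-time loop, with an invented rigid-axis model and gains, and a plain PI law standing in for the full direct torque control algorithm:

        # 1 kHz tracking loop: the host supplies a ramp steering command; the
        # slave-side controller computes torque and drives a crude axis model.
        KP, KI, DT = 200.0, 50.0, 0.001          # illustrative PI gains, control period

        def axis_model(vel, torque, inertia=2.0, friction=20.0, dt=DT):
            """Crude rigid-axis model: inertia plus viscous friction."""
            return vel + (torque - friction * vel) / inertia * dt

        pos, vel, integ, err = 0.0, 0.0, 0.0, 0.0
        for step in range(3000):                 # 3 s of tracking
            target = 0.5 * step * DT             # ramp command from the host (rad)
            err = target - pos
            integ += err * DT
            torque = KP * err + KI * integ       # PI law executed on the slave side
            vel = axis_model(vel, torque)
            pos += vel * DT
        print(f"tracking error after 3 s: {err:.2e} rad")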

  17. Amphibians and ultra high diluted thyroxine--further experiments and re-analysis of data.

    PubMed

    Endler, Peter Christian; Scherer-Pongratz, Waltraud; Harrer, Bernhard; Lingg, Gerhard; Lothaller, Harald

    2015-10-01

    A model of thyroxine and metamorphosis of highland amphibians is frequently mentioned as an example of experiments on extremely diluted substances in discussions around 'homeopathy'. The model was scrutinized by reanalysing the results of the initial researcher A, of a second researcher B, and of 5 external researchers C between 1990 and 2013. Rana temporaria larvae were taken from an alpine highland biotope. The test solution was thyroxine 10(-30) (T30x), tetra-iodo-thyronine sodium pentahydrate diluted with pure water in 26 steps of 1:10, agitated after each step. Analogously prepared water (W30x) was used as control. Tadpoles were observed from the 2-legged to the 4-legged stage. Experiments were performed in different years and at different times of season, and their duration could vary. Frequencies of 4-legged animals, effect sizes and areas under the curves (AUCs) were calculated, and regression analyses were performed to investigate possible correlations with year, season, duration, etc. Experiments were in line with animal protection guidelines. The total set of data A + B + C, as well as the subsets A (initial researcher, N=286+293), B (second centre, 965 + 965) and C (5 external researchers, 690 + 690), showed an effect of extremely diluted, agitated thyroxine opposite to that known for molecular thyroxine, i.e. test values were below control by 11.4% for A, 9.5% for B and 7.0% for C (p<0.001 for each of the subsets). The effect size (Cohen's d) was >0.8 (large) for both A and B and 0.74 (medium) for C. Although perfect reproducibility was not obtained, this paradoxical phenomenon was generally consistent across the different observations. Correlations were found between details of laboratory handling, as well as ambient temperature, and the size of the effects. Copyright © 2015 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.
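
    For reference, the reported effect sizes use the standard pooled-standard-deviation form of Cohen's d; the sketch below computes it on synthetic placeholder data with the subset-A sample sizes, not the study's measurements.

        import numpy as np

        def cohens_d(test, control):
            """Effect size: mean difference over the pooled standard deviation."""
            nx, ny = len(test), len(control)
            pooled_sd = np.sqrt(((nx - 1) * np.var(test, ddof=1)
                                 + (ny - 1) * np.var(control, ddof=1)) / (nx + ny - 2))
            return (np.mean(test) - np.mean(control)) / pooled_sd

        rng = np.random.default_rng(7)
        t30x = rng.normal(0.60, 0.10, 286)   # synthetic fraction of 4-legged animals
        w30x = rng.normal(0.68, 0.10, 293)
        print(f"Cohen's d = {cohens_d(t30x, w30x):.2f}")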

  18. Fracture of Carbon Nanotube - Amorphous Carbon Composites: Molecular Modeling

    NASA Technical Reports Server (NTRS)

    Jensen, Benjamin D.; Wise, Kristopher E.; Odegard, Gregory M.

    2015-01-01

    Carbon nanotubes (CNTs) are promising candidates for use as reinforcements in next generation structural composite materials because of their extremely high specific stiffness and strength. They cannot, however, be viewed as simple replacements for carbon fibers because there are key differences between these materials in areas such as handling, processing, and matrix design. It is impossible to know for certain that CNT composites will represent a significant advance over carbon fiber composites before these various factors have been optimized, which is an extremely costly and time intensive process. This work attempts to place an upper bound on CNT composite mechanical properties by performing molecular dynamics simulations on idealized model systems with a reactive forcefield that permits modeling of both elastic deformations and fracture. Amorphous carbon (AC) was chosen for the matrix material in this work because of its structural simplicity and physical compatibility with the CNT fillers. It is also much stiffer and stronger than typical engineering polymer matrices. Three different arrangements of CNTs in the simulation cell have been investigated: a single-wall nanotube (SWNT) array, a multi-wall nanotube (MWNT) array, and a SWNT bundle system. The SWNT and MWNT array systems are clearly idealizations, but the SWNT bundle system is a step closer to real systems in which individual tubes aggregate into large assemblies. The effect of chemical crosslinking on composite properties is modeled by adding bonds between the CNTs and AC. The balance between weakening the CNTs and improving fiber-matrix load transfer is explored by systematically varying the extent of crosslinking. It is, of course, impossible to capture the full range of deformation and fracture processes that occur in real materials with even the largest atomistic molecular dynamics simulations. With this limitation in mind, the simulation results reported here provide a plausible upper limit on achievable CNT composite properties and yield some insight on the influence of processing conditions on the mechanical properties of CNT composites.

  19. Variability of hydrological extreme events in East Asia and their dynamical control: a comparison between observations and two high-resolution global climate models

    NASA Astrophysics Data System (ADS)

    Freychet, N.; Duchez, A.; Wu, C.-H.; Chen, C.-A.; Hsu, H.-H.; Hirschi, J.; Forryan, A.; Sinha, B.; New, A. L.; Graham, T.; Andrews, M. B.; Tu, C.-Y.; Lin, S.-J.

    2017-02-01

    This work investigates the variability of extreme weather events (drought spells, DS15, and daily heavy rainfall, PR99) over East Asia. It particularly focuses on the large-scale atmospheric circulation associated with high occurrence of these extreme events. Two observational datasets (APHRODITE and PERSIANN) are compared with two high-resolution global climate models (HiRAM and HadGEM3-GC2) and an ensemble of other lower-resolution climate models from CMIP5. We first evaluate the performance of the high-resolution models. They both exhibit good skill in reproducing extreme events, especially when compared with CMIP5 results. Significant differences exist between the two observational datasets, highlighting the difficulty of obtaining a clear estimate of extreme events. The link between the variability of the extremes and the large-scale circulation is investigated, on monthly and interannual timescales, using composite and correlation analyses. Both extreme indices DS15 and PR99 are significantly linked to the low-level wind intensity over East Asia, i.e. the monsoon circulation. It is also found that DS15 events are strongly linked to the surface temperature over the Siberian region and to the land-sea pressure contrast, while PR99 events are linked to sea surface temperature anomalies over the West North Pacific. These results illustrate the importance of the monsoon circulation for extremes over East Asia. The dependence on surface temperature over the continent and on sea surface temperature raises the question of the extent to which these factors could affect the occurrence of extremes over tropical regions in future projections.
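
    Index definitions along these lines are straightforward to compute from a daily series. The sketch below assumes DS15 counts dry spells of at least 15 consecutive days below a 1 mm wet-day threshold and PR99 counts days above the 99th percentile of wet days; the paper's exact definitions may differ, and the data are synthetic.

        import numpy as np

        rng = np.random.default_rng(4)
        # Synthetic daily precipitation (mm) for 20 years.
        pr = rng.gamma(0.4, 8.0, size=20 * 365) * (rng.random(20 * 365) < 0.45)

        def ds15_count(pr, wet_thresh=1.0, min_len=15):
            """Number of dry spells of at least `min_len` consecutive days < wet_thresh."""
            dry = pr < wet_thresh
            edges = np.diff(np.concatenate([[0], dry.astype(int), [0]]))
            lengths = np.flatnonzero(edges == -1) - np.flatnonzero(edges == 1)
            return int((lengths >= min_len).sum())

        def pr99_days(pr, wet_thresh=1.0):
            """Days whose rainfall exceeds the 99th percentile of wet days."""
            wet = pr[pr >= wet_thresh]
            return int((pr > np.percentile(wet, 99)).sum())

        print("DS15 spells:", ds15_count(pr), "| PR99 days:", pr99_days(pr))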

  20. A Generalized Framework for Non-Stationary Extreme Value Analysis

    NASA Astrophysics Data System (ADS)

    Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.

    2017-12-01

    Empirical trends in climate variables including precipitation, temperature, and snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisal of time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid-evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). The new feature allows users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g. antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity-Duration-Frequency (IDF) curves that are widely used for risk assessment and infrastructure design. Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA accessible to a broader audience.
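
    The core of such a non-stationary analysis is a GEV whose location varies with the covariate. As a simple stand-in for NEVA's Bayesian MCMC machinery, the sketch below fits a linear-in-time location by maximum likelihood on synthetic data (note that scipy's shape parameter c equals minus the climatological ξ).

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import genextreme

        rng = np.random.default_rng(5)
        years = np.arange(60.0)
        # Synthetic annual maxima with a drifting location (trend = 0.15 per year).
        x = genextreme.rvs(c=-0.1, loc=30 + 0.15 * years, scale=5, random_state=rng)

        def nll(p):
            """Negative log-likelihood of a GEV with location mu0 + mu1 * t."""
            mu0, mu1, log_sigma, xi = p
            return -genextreme.logpdf(x, c=-xi, loc=mu0 + mu1 * years,
                                      scale=np.exp(log_sigma)).sum()

        fit = minimize(nll, x0=[x.mean(), 0.0, np.log(x.std()), 0.1],
                       method="Nelder-Mead")
        mu0, mu1, log_sigma, xi = fit.x
        # 100-year return level evaluated in the final year of the record:
        rl = genextreme.ppf(0.99, c=-xi, loc=mu0 + mu1 * years[-1],
                            scale=np.exp(log_sigma))
        print(f"fitted trend {mu1:.3f}/yr, 100-yr level in final year {rl:.1f}")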

  1. The effects of smartphone use on upper extremity muscle activity and pain threshold

    PubMed Central

    Lee, Minkyung; Hong, Yunkyung; Lee, Seunghoon; Won, Jinyoung; Yang, Jinjun; Park, Sookyoung; Chang, Kyu-Tae; Hong, Yonggeun

    2015-01-01

    [Purpose] The purpose of this study was to determine whether muscle activity and pressure-induced pain in the upper extremities are affected by smartphone use, and to compare the effects of phone handling with one hand and with both hands. [Subjects] The study subjects were asymptomatic women 20–22 years of age. [Methods] The subjects sat in a chair with their feet on the floor and the elbow flexed, holding a smartphone positioned on the thigh. Subsequently, the subjects typed the Korean anthem for 3 min, one-handed or with both hands. Each subject repeated the task three times, with a 5-min rest period between tasks to minimize fatigue. Electromyography (EMG) was used to record the muscle activity of the upper trapezius (UT), extensor pollicis longus (EPL), and abductor pollicis (AP) during phone operation. We also used a dolorimeter to measure the pressure-induced pain threshold in the UT. [Results] We observed higher muscle activity in the UT, AP, and EPL in one-handed smartphone use than in its two-handed use. The pressure-induced pain threshold of the UT was lower after use of the smartphone, especially after one-handed use. [Conclusion] Our results show that smartphone operation with one hand caused greater UT pain and induced increased upper extremity muscle activity. PMID:26180311

  2. Improving Predictions and Management of Hydrological Extremes

    NASA Astrophysics Data System (ADS)

    Wijngaard, Janet; Liggins, Felicity; Hurk, Bart vd; Lavers, David; Magnusson, Linus; Bouwer, Laurens; Weerts, Albrecht; Kjellström, Erik; Mañez, Maria; Ramos, Maria-Helena; Hananel, Cedric; Ercin, Ertug; Hunink, Johannes; Klein, Bastian; Pouget, Laurent; de Moel, Hans

    2017-04-01

    The EU Roadmap on Climate Services can be seen as a result of convergence between society's call for "actionable research" and the climate research community's provision of tailored data, information and knowledge. Although weather and climate have distinct definitions, a strong link between weather and climate services does exist; to date, however, this link has not been explored extensively. Stakeholders interviewed in the context of the Roadmap consider changes in our climate to be distant, long-term impacts that are difficult to consider in present-day decision making, a process usually dominated by their daily experience of handling adverse weather and extreme events. However, it can be argued that this experience is a rich source of inspiration for increasing society's resilience to an unknown future. The European research project IMPREX is built on the notion that "experience in managing present day weather extremes can help us anticipate the consequences of future climate variability and change". This presentation illustrates how IMPREX is building the link between the providers and users of information and services addressing both the weather and the climate timescales. For different stakeholders in key economic sectors, the needs and vulnerabilities in their daily practice are discussed, followed by an analysis of how weather and climate (W&C) services could contribute to the demands that arise from this. Examples of case studies showing the relevance of tailored W&C information in users' operations will be included.

  3. An index-flood model for deficit volumes assessment

    NASA Astrophysics Data System (ADS)

    Strnad, Filip; Moravec, Vojtěch; Hanel, Martin

    2017-04-01

    The estimation of return periods of hydrological extreme events and the evaluation of risks related to such events are objectives of many water resources studies. The aim of this study is to develop a statistical model for drought indices using extreme value theory and the index-flood method, and to use this model to estimate return levels of maximum deficit volumes of total runoff and baseflow. Deficit volumes for 133 catchments in the Czech Republic for the period 1901-2015, simulated by the hydrological model Bilan, are considered. The characteristics of the simulated deficit periods (severity, intensity and length) correspond well to those based on observed data. It is assumed that annual maximum deficit volumes in each catchment follow the generalized extreme value (GEV) distribution. The catchments are divided into three homogeneous regions based on long-term mean runoff, potential evapotranspiration and baseflow. In line with the index-flood method, it is further assumed that the deficit volumes within each homogeneous region are identically distributed after scaling with a site-specific factor. The goodness-of-fit of the statistical model is assessed with the Anderson-Darling statistic. For the estimation of the critical values of the test, several resampling strategies allowing for appropriate handling of years without drought are presented. Finally, the significance of trends in the deficit volumes is assessed by a likelihood ratio test.
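
    A hedged sketch of the index-flood step described above: each site's annual maxima are scaled by a site-specific index, pooled within a homogeneous region, and a single regional GEV is fitted, from which site return levels are recovered. The synthetic data and the choice of the site mean as the index are illustrative assumptions, not the authors' exact procedure.

```python
# Index-flood sketch: scale site maxima, pool within a region, fit one GEV.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical annual-maximum deficit volumes for 3 catchments, 50 years each
sites = [stats.genextreme.rvs(c=-0.15, loc=m, scale=0.3 * m, size=50,
                              random_state=rng) for m in (10.0, 25.0, 40.0)]

indices = [s.mean() for s in sites]            # site-specific scaling factors
pooled = np.concatenate([s / idx for s, idx in zip(sites, indices)])

c, loc, scale = stats.genextreme.fit(pooled)   # regional growth curve
T = 100                                        # return period in years
regional_q = stats.genextreme.ppf(1 - 1 / T, c, loc, scale)
site_return_levels = [idx * regional_q for idx in indices]
```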

  4. Minimum important differences for the patient-specific functional scale, 4 region-specific outcome measures, and the numeric pain rating scale.

    PubMed

    Abbott, J Haxby; Schmitt, John

    2014-08-01

    Multicenter, prospective, longitudinal cohort study. To investigate the minimum important difference (MID) of the Patient-Specific Functional Scale (PSFS), 4 region-specific outcome measures, and the numeric pain rating scale (NPRS) across 3 levels of patient-perceived global rating of change in a clinical setting. The MID varies depending on the external anchor defining patient-perceived "importance." The MID for the PSFS has not been established across all body regions. One thousand seven hundred eight consecutive patients with musculoskeletal disorders were recruited from 5 physical therapy clinics. The PSFS, NPRS, and 4 region-specific outcome measures-the Oswestry Disability Index, Neck Disability Index, Upper Extremity Functional Index, and Lower Extremity Functional Scale-were assessed at the initial and final physical therapy visits. Global rating of change was assessed at the final visit. MID was calculated for the PSFS and NPRS (overall and for each body region), and for each region-specific outcome measure, across 3 levels of change defined by the global rating of change (small, medium, large change) using receiver operating characteristic curve methodology. The MID for the PSFS (on a scale from 0 to 10) ranged from 1.3 (small change) to 2.3 (medium change) to 2.7 (large change), and was relatively stable across body regions. MIDs for the NPRS (-1.5 to -3.5), Oswestry Disability Index (-12), Neck Disability Index (-14), Upper Extremity Functional Index (6 to 11), and Lower Extremity Functional Scale (9 to 16) are also reported. We reported the MID for small, medium, and large patient-perceived change on the PSFS, NPRS, Oswestry Disability Index, Neck Disability Index, Upper Extremity Functional Index, and Lower Extremity Functional Scale for use in clinical practice and research.
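
    For readers unfamiliar with the receiver operating characteristic methodology mentioned above, the sketch below shows one common way an MID cutoff is derived from a global-rating-of-change anchor (the Youden index on the ROC curve). The data, the anchor dichotomization, and the use of scikit-learn are illustrative assumptions, not the authors' exact analysis.

```python
# ROC-based MID sketch: find the change score that best separates "improved"
# from "not improved" patients according to the anchor.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
improved = rng.integers(0, 2, size=200)           # anchor: 1 = improved
change = np.where(improved == 1,
                  rng.normal(2.5, 1.5, 200),      # hypothetical PSFS change scores
                  rng.normal(0.5, 1.5, 200))

fpr, tpr, thresholds = roc_curve(improved, change)
mid = thresholds[np.argmax(tpr - fpr)]            # Youden index picks the cutoff
```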

  5. Evaluating the ClimEx Single Model Large Ensemble in Comparison with EURO-CORDEX Results of Seasonal Means and Extreme Precipitation Indicators

    NASA Astrophysics Data System (ADS)

    von Trentini, F.; Schmid, F. J.; Braun, M.; Brisette, F.; Frigon, A.; Leduc, M.; Martel, J. L.; Willkofer, F.; Wood, R. R.; Ludwig, R.

    2017-12-01

    Meteorological extreme events seem to be becoming more frequent in the present and future, and the separation of natural climate variability from a clear climate change effect on these extreme events is of growing interest. Since there is only one realisation of historical events, observational data cannot supply the very long time series needed for a robust statistical analysis of natural variability. A new single model large ensemble (SMLE), developed for the ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec), is designed to overcome this lack of data by downscaling 50 members of CanESM2 (RCP 8.5) with the Canadian CRCM5 regional model (using the EURO-CORDEX grid specifications) for the period 1950-2099 in each member, resulting in 7500 years of simulated climate. This allows for a better probabilistic analysis of rare and extreme events than any preceding dataset. Besides seasonal sums, several extreme indicators such as R95pTOT, RX5day and others are calculated for the ClimEx ensemble and several EURO-CORDEX runs. This enables us to investigate the interaction between natural variability (as it appears in the CanESM2-CRCM5 members) and the climate change signal of those members for past, present and future conditions. Adding the EURO-CORDEX results, we can also assess the role of internal model variability (or natural variability) in climate change simulations. A first comparison shows similar magnitudes of variability of climate change signals between the ClimEx large ensemble and the CORDEX runs for some indicators, while for most indicators the spread of the SMLE is smaller than the spread of the different CORDEX models.
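
    As a pointer to what the named indicators measure, here is a toy computation of RX5day and R95pTOT from a synthetic daily precipitation series, following the usual ETCCDI-style definitions; the abstract itself does not spell these out, so the thresholds below are assumptions.

```python
# Toy computation of two extreme-precipitation indicators (ETCCDI-style).
import numpy as np

pr = np.random.default_rng(3).gamma(0.4, 6.0, size=365 * 30)  # daily precip, mm

# RX5day: maximum consecutive 5-day precipitation total
rx5day = np.convolve(pr, np.ones(5), mode="valid").max()

# R95pTOT: total precipitation on very wet days
# (days above the 95th percentile of wet days, wet day = >= 1 mm)
wet = pr[pr >= 1.0]
r95ptot = wet[wet > np.percentile(wet, 95)].sum()
```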

  6. Evaluating sub-seasonal skill in probabilistic forecasts of Atmospheric Rivers and associated extreme events

    NASA Astrophysics Data System (ADS)

    Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.

    2017-12-01

    Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and in Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events such as floods and droughts, so that actions to mitigate disastrous impacts can be taken with sufficient lead time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze the probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.
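
    A minimal sketch of the kind of probabilistic verification involved, using the Brier score and a climatology-referenced skill score; the forecast data are synthetic and this is not the authors' verification suite.

```python
# Brier score and Brier skill score for probabilistic AR-occurrence forecasts.
import numpy as np

rng = np.random.default_rng(8)
p_forecast = rng.uniform(0, 1, 500)                      # forecast probabilities
observed = (rng.uniform(0, 1, 500) < p_forecast).astype(float)  # AR occurred?

brier = np.mean((p_forecast - observed) ** 2)
brier_ref = np.mean((observed.mean() - observed) ** 2)   # climatology reference
bss = 1 - brier / brier_ref                              # > 0 beats climatology
```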

  7. Synoptic moisture pathways associated with mean and extreme precipitation over Canada for winter and spring

    NASA Astrophysics Data System (ADS)

    Tan, X.; Gan, T. Y. Y.; Chen, Y. D.

    2017-12-01

    Dominant synoptic moisture pathway patterns of vertically integrated water vapor transport (IVT) in winter and spring over western and eastern Canada were identified using the self-organizing map method. Large-scale meteorological patterns (LSMPs) were related to the variability in seasonal precipitation totals and in occurrences of precipitation extremes. Changes both in the occurrences of LSMPs and in the seasonal precipitation falling under those LSMPs were evaluated to attribute observed changes in seasonal precipitation totals and occurrences of precipitation extremes. Effects of large-scale climate anomalies on occurrences of LSMPs were also examined. Results show that the synoptic moisture pathways and LSMPs reflect the propagation of jet streams, with the location and direction of ridges and troughs, and the strength and centers of pressure lows and highs, varying considerably between LSMPs. Significant decreases in occurrences of the synoptic moisture pathway patterns associated with positive precipitation anomalies and more precipitation extremes in winter over western Canada resulted in decreases in seasonal precipitation and in occurrences of precipitation extremes. LSMPs producing a hot and dry climate and less (more) frequent precipitation extremes over the Canadian Prairies in winter and northwestern Canada in spring are more likely to occur in years with a negative phase of the PNA. Occurrences of LSMPs associated with a wet climate and frequent extreme precipitation events over southeastern Canada are associated with a positive phase of the NAO. In El Niño years or negative-PDO years, LSMPs associated with a dry climate and less frequent precipitation extremes over western Canada tend to occur.
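
    The self-organizing map step might look like the following sketch, where daily IVT maps are flattened and each day is assigned to a node on a small grid; minisom is an assumed open-source stand-in, since the abstract does not name the SOM software, and the data and grid size are hypothetical.

```python
# SOM-based classification of daily IVT fields into pathway patterns.
import numpy as np
from minisom import MiniSom

ivt = np.random.default_rng(4).random((500, 20 * 30))  # flattened daily IVT maps

som = MiniSom(3, 4, ivt.shape[1], sigma=1.0, learning_rate=0.5, random_seed=4)
som.train_random(ivt, 5000)                 # organize the 3x4 grid of nodes
labels = [som.winner(day) for day in ivt]   # assign each day to a pattern
```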

  8. The Influence of the Madden-Julian Oscillation (mjo) on Extreme Rainfall Over the Central and Southern Peruvian Andes

    NASA Astrophysics Data System (ADS)

    Heidinger, H.; Jones, C.; Carvalho, L. V.

    2015-12-01

    Extreme rainfall is important for the Andean region because of the large contribution of these events to seasonal totals and their consequent impacts on water resources for agriculture, water consumption, industry and hydropower generation, as well as on the occurrence of floods and landslides. Over the Central and Southern Peruvian Andes (CSPA), rainfall exceeding the 90th percentile contributed between 44% and 100% of the total Nov-Mar 1979-2010 rainfall. Additionally, precipitation from a large majority of stations in the CSPA exhibits statistically significant spectral peaks on intraseasonal timescales (20 to 70 days). The Madden-Julian Oscillation (MJO) is the most important intraseasonal mode of atmospheric circulation and moist convection in the tropics, and it modulates the occurrence of extreme weather events worldwide. Mechanisms explaining the relationships between the MJO and precipitation in the Peruvian Andes have not yet been properly described. The present study examines the relationships between the activity and phases of the MJO and the occurrence of extreme rainfall over the CSPA. We found that the frequency of extreme rainfall events increases in the CSPA when the MJO is active. MJO phases 5, 6 and 7 contribute to the overall occurrence of extreme rainfall events over the CSPA. However, how the MJO phases modulate extreme rainfall depends on the location of the stations. For instance, extreme precipitation (above the 90th percentile) at stations in the Amazon basin is slightly more sensitive to phases 2, 3 and 4; the frequency of extremes at stations in the Pacific basin increases in phases 5, 6 and 7; and phases 2, 3 and 7 modulate extreme precipitation at stations in the Titicaca basin. Greater variability among stations is observed when the 95th and 99th percentiles are used to identify extremes. Among the main mechanisms explaining the increase in extreme rainfall events in the Peruvian Andes is the intensification of easterly moisture flux anomalies, which are favored during certain phases of the MJO. Here the dynamical mechanisms linking the MJO to the occurrence of extreme rainfall at stations in the Peruvian Andes are investigated using composites of integrated moisture flux and geopotential height anomalies.
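
    A minimal sketch of the percentile-based conditioning described above: define extremes from the 90th percentile of wet days and compare their frequency across MJO phases. The data and the 1 mm wet-day cutoff are assumptions for illustration, not the authors' code.

```python
# Frequency of extreme-rainfall days conditional on MJO phase.
import numpy as np

rng = np.random.default_rng(5)
rain = rng.gamma(0.5, 8.0, size=4000)      # hypothetical daily station rainfall, mm
phase = rng.integers(1, 9, size=4000)      # hypothetical MJO phase labels 1..8

wet = rain >= 1.0
p90 = np.percentile(rain[wet], 90)         # extreme threshold from wet days
extreme = rain > p90

freq_by_phase = {p: extreme[phase == p].mean() for p in range(1, 9)}
```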

  9. Air quality as a constraint to the use of coal in California

    NASA Technical Reports Server (NTRS)

    Austin, T. C.

    1978-01-01

    Low-NOx burners, wet scrubbing systems, baghouses and ammonia injection systems are feasible for use on large combustion sources such as utility boilers. These devices, used in combination with coal handling techniques which minimize fugitive dust and coal transportation related emissions, should enable new power plants and large industrial boilers to burn coal without the adverse air quality impacts for which coal became notorious.

  10. Semi-Lagrangian particle methods for high-dimensional Vlasov-Poisson systems

    NASA Astrophysics Data System (ADS)

    Cottet, Georges-Henri

    2018-07-01

    This paper deals with the implementation of high-order semi-Lagrangian particle methods to handle high-dimensional Vlasov-Poisson systems. It is based on recent developments in the numerical analysis of particle methods, and the paper focuses on specific algorithmic features for handling large dimensions. The methods are tested with uniform particle distributions, in particular against a recent multi-resolution wavelet-based method, on a 4D plasma instability case and a 6D gravitational case. Conservation properties, accuracy and computational costs are monitored. The excellent accuracy/cost trade-off shown by the method opens new perspectives for accurate simulations of high-dimensional kinetic equations by particle methods.
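
    The paper's high-order particle method is far more involved, but the backward characteristic tracing at the heart of all semi-Lagrangian schemes can be illustrated with a toy 1D constant-velocity advection step; everything below is a didactic assumption, not the paper's algorithm.

```python
# Toy 1D semi-Lagrangian step: trace characteristics backwards, interpolate.
import numpy as np

nx, L, c, dt = 200, 1.0, 0.7, 0.01
x = np.linspace(0, L, nx, endpoint=False)
f = np.exp(-200 * (x - 0.5) ** 2)          # initial profile

for _ in range(100):
    x_dep = (x - c * dt) % L               # departure points (periodic domain)
    f = np.interp(x_dep, x, f, period=L)   # interpolate at departure points
```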

  11. Bacterial copper storage proteins.

    PubMed

    Dennison, Christopher; David, Sholto; Lee, Jaeick

    2018-03-30

    Copper is essential for most organisms as a cofactor for key enzymes involved in fundamental processes such as respiration and photosynthesis. However, copper also has toxic effects in cells, which is why eukaryotes and prokaryotes have evolved mechanisms for safe copper handling. A new family of bacterial proteins uses a Cys-rich four-helix bundle to safely store large quantities of Cu(I). The work leading to the discovery of these proteins, their properties and physiological functions, and how their presence potentially impacts the current views of bacterial copper handling and use are discussed in this review. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.

  12. Maximum-likelihood estimation of parameterized wavefronts from multifocal data

    PubMed Central

    Sakamoto, Julia A.; Barrett, Harrison H.

    2012-01-01

    A method for determining the pupil phase distribution of an optical system is demonstrated. Coefficients in a wavefront expansion were estimated using likelihood methods, where the data consisted of multiple irradiance patterns near focus. Proof-of-principle results were obtained in both simulation and experiment. Large-aberration wavefronts were handled in the numerical study. Experimentally, we discuss the handling of nuisance parameters. Fisher information matrices, Cramér-Rao bounds, and likelihood surfaces are examined. ML estimates were obtained by simulated annealing to deal with numerous local extrema in the likelihood function. Rapid processing techniques were employed to reduce the computational time. PMID:22772282
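
    As an illustration of the estimation strategy only (not the authors' code), the sketch below applies simulated annealing to a deliberately multimodal stand-in for the negative log-likelihood; scipy's dual_annealing and the toy objective are assumed choices.

```python
# Simulated annealing over a multimodal objective, standing in for the
# wavefront-coefficient likelihood surface described in the abstract.
import numpy as np
from scipy.optimize import dual_annealing

def neg_log_lik(coeffs):
    # toy surface with many local extrema; global minimum at the origin
    return np.sum(coeffs ** 2) + 3 * np.sum(np.sin(5 * coeffs) ** 2)

result = dual_annealing(neg_log_lik, bounds=[(-2, 2)] * 4, seed=0)
ml_estimate = result.x
```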

  13. MetAlign: interface-driven, versatile metabolomics tool for hyphenated full-scan mass spectrometry data preprocessing.

    PubMed

    Lommen, Arjen

    2009-04-15

    Hyphenated full-scan MS technology creates large amounts of data. A versatile, easy-to-handle automation tool that aids in data analysis is very important in handling such a data stream. MetAlign software, as described in this manuscript, handles a broad range of accurate-mass and nominal-mass GC/MS and LC/MS data. It is capable of automatic format conversions, accurate mass calculations, baseline corrections, peak-picking, saturation and mass-peak artifact filtering, as well as alignment of up to 1000 data sets. A 100- to 1000-fold data reduction is achieved. MetAlign software output is compatible with most multivariate statistics programs.
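
    Two of the steps MetAlign automates, baseline correction and peak-picking, can be sketched as follows; this toy pipeline is illustrative only and is not MetAlign's algorithm, and the filter widths and thresholds are arbitrary assumptions.

```python
# Toy baseline correction and peak-picking on a synthetic chromatogram.
import numpy as np
from scipy.signal import find_peaks
from scipy.ndimage import minimum_filter1d, uniform_filter1d

rng = np.random.default_rng(9)
t = np.arange(3000)
signal = 50 + 0.01 * t + rng.normal(0, 1, t.size)     # drifting baseline + noise
for center in (500, 1500, 2400):                      # add chromatographic peaks
    signal += 40 * np.exp(-0.5 * ((t - center) / 8.0) ** 2)

baseline = uniform_filter1d(minimum_filter1d(signal, 201), 201)
corrected = signal - baseline
peaks, _ = find_peaks(corrected, height=10, prominence=5)
```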

  14. Diagnostic studies of ensemble forecast "jumps"

    NASA Astrophysics Data System (ADS)

    Magnusson, Linus; Hewson, Tim; Ferranti, Laura; Rodwell, Mark

    2016-04-01

    During 2015 we saw exceptional consistency in successive seasonal forecasts produced at ECMWF for the winter period 2015/16, right across the globe. This winter was characterised by a well-predicted and unusually strong El Niño, and some have ascribed the consistency to that. For most of December this consistency was mirrored in the (separate) ECMWF monthly forecast system, which correctly predicted anomalously strong (mild) zonal flow over the North Atlantic and western Eurasia, even in forecasts for weeks 3 and 4. In monthly forecasts in general these weeks are often devoid of strong signals. However, in late December and early January strong signals, even in week 2, proved to be incorrect, most notably over the North Atlantic and Eurasian sectors. Indeed, on at least two occasions the outcome was beyond the ensemble forecast range over Scandinavia. In one of these, conditions flipped from extremely mild to extremely cold as a high-latitude block developed. Temperature prediction is very important to many customers, notably those dealing with renewable energy, because cold weather causes increased demand but also tends to coincide with reduced wind power production. So, understandably, jumps can cause consternation amongst some customer groups and are very difficult to handle operationally. This presentation will discuss the results of initial diagnostic investigations into what caused the "ensemble jumps", particularly at the week-two lead, though reference will also be made to a related shorter-range (day 3) jump that was important for flooding over the UK. Initial results suggest that an inability of the ECMWF model to correctly represent convective outbreaks over North America (which for winter-time were quite extreme) played an important role. Significantly, during this period an unusually large amount of upper-air data over North America was rejected or given low weight. These results bear similarities to previous diagnostic studies at ECMWF, wherein major convective outbreaks in spring and early summer over North America were shown to have a detrimental impact on forecast quality. The possible contributions of other factors will also be discussed; for example, we know that the ECMWF model exhibits different skill levels for different regime transitions. It will also be shown that the new higher-resolution ECMWF forecast system, then running in trial mode, performed somewhat better, at least for some of these cases.

  15. How Do Microphysical Processes Influence Large-Scale Precipitation Variability and Extremes?

    DOE PAGES

    Hagos, Samson; Ruby Leung, L.; Zhao, Chun; ...

    2018-02-10

    Convection permitting simulations using the Model for Prediction Across Scales-Atmosphere (MPAS-A) are used to examine how microphysical processes affect large-scale precipitation variability and extremes. An episode of the Madden-Julian Oscillation is simulated using MPAS-A with a refined region at 4-km grid spacing over the Indian Ocean. It is shown that cloud microphysical processes regulate the precipitable water (PW) statistics. Because of the non-linear relationship between precipitation and PW, PW exceeding a certain critical value (PWcr) contributes disproportionately to precipitation variability. However, the frequency of PW exceeding PWcr decreases rapidly with PW, so changes in microphysical processes that shift the column PW statistics relative to PWcr even slightly have large impacts on precipitation variability. Furthermore, precipitation variance and extreme precipitation frequency are approximately linearly related to the difference between the mean and critical PW values. Thus, observed precipitation statistics could be used to directly constrain model microphysical parameters, as this study demonstrates using radar observations from the DYNAMO field campaign.
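
    The notion of a critical PW value can be illustrated with a crude sketch: bin precipitation by column PW and locate where the conditional mean starts to rise sharply. The synthetic data and the simple pickup detector below are assumptions, not the authors' method.

```python
# Crude estimate of a critical PW from binned precipitation-PW statistics.
import numpy as np

rng = np.random.default_rng(6)
pw = rng.uniform(20, 75, 20000)                       # hypothetical column PW, mm
precip = np.where(pw > 55, 0.5 * (pw - 55) ** 2, 0.0) + rng.gamma(0.2, 1.0, 20000)

bins = np.arange(20, 76, 1.0)
which = np.digitize(pw, bins)
mean_p = np.array([precip[which == i].mean() for i in range(1, len(bins))])
pw_cr = bins[np.argmax(np.gradient(mean_p) > 1.0)]    # first sharp rise in slope
```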

  17. A Search for H(alpha) Emission in the Far Outer Discs of Extremely Large Spiral Galaxies

    NASA Astrophysics Data System (ADS)

    Rubin, Vera; Hunter, Deidre

    2007-08-01

    Little is known about the kinematics of galaxies far beyond the relatively bright regions sampled in radio or optical radial velocity studies. Most often, the velocities are obtained as part of large surveys, where the effort is made to obtain many rotation curves rather than to extend a single rotation curve as far as possible. Because the composition of dark matter remains unknown, it is important to devise observations that will help to constrain its properties. We propose to obtain ultra-deep Hα images (in the rest frame of the galaxy) for UGC 2885 and NGC 801, two extremely large Sc galaxies. We expect to detect Hα regions far beyond their nuclei and into the extreme outer disc, for which we will then obtain radial velocities. Increased knowledge of the kinematics of these galaxies will tighten the constraints on mass models and shed light on the properties of dark matter. Ultimately, we hope to learn more about the outermost parts of these galaxies, where disc and halo blend.

  18. A Large-Scale Multi-Hop Localization Algorithm Based on Regularized Extreme Learning for Wireless Networks.

    PubMed

    Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan

    2017-12-20

    A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms that are only applicable to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, a model relating hop counts to the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its own location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments at low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
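
    The regularized extreme learning step presumably resembles the following sketch: random, untrained hidden-layer weights and a ridge-regularized least-squares solve for the output weights. All names, data and hyperparameters here are hypothetical, not the authors' implementation.

```python
# Regularized extreme learning machine mapping hop counts to distances.
import numpy as np

rng = np.random.default_rng(7)
hops = rng.integers(1, 15, size=(300, 1)).astype(float)   # hop counts to anchors
dist = 40.0 * hops[:, 0] + rng.normal(0, 20, 300)         # true physical distances

n_hidden, lam = 50, 1e-2
W = rng.normal(size=(1, n_hidden))        # random input weights, never trained
b = rng.normal(size=n_hidden)
H = np.tanh(hops @ W + b)                 # hidden-layer activations

# ridge-regularized output weights: beta = (H^T H + lam I)^-1 H^T y
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ dist)
pred = np.tanh(hops @ W + b) @ beta       # predicted distances
```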

  19. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  20. Advanced space optics development in freeform optics design, ceramic polishing, rapid and extreme freeform polishing

    NASA Astrophysics Data System (ADS)

    Geyl, R.; Leplan, H.; Ruch, E.

    2017-09-01

    In this paper Safran-Reosc shares with the space community its recent work in the domain of space optics. Our main topic is a study of the advantages that freeform optical surfaces can offer to advanced space optics in terms of compactness and performance. We have separated smart and extreme freeforms in our design exploration work. Our second topic addresses the immediate question that follows: can we manufacture and test these freeform optics? We therefore present our freeform optics capability, report recent achievements in extreme aspheric optics polishing, and introduce the industrialisation process of polishing large off-axis optics for the primary mirror segments of the ESO Extremely Large Telescope. Thirdly, we present our R-SiC polishing layer technology for SiC material. This technique has been developed to reduce costs, risks and schedule in the manufacturing of advanced SiC optics for Vis and IR applications.
