Sample records for entire process takes

  1. Cabbage Patch Chemistry.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2000

    2000-01-01

    This activity takes students through the process of fermentation, which requires an entire month for the full reaction to take place. The reaction, catalyzed by bacterial enzymes, produces lactic acid from glucose. (SAH)

  2. Process for the reconstruction of three-dimensional images of an area of interest of an object comprising the combination of measurements over the entire object with measurements of an area of interest of said object, and appropriate installation

    DOEpatents

    Azevedo, Stephen; Grangeat, Pierre; Rizo, Philippe

    1995-01-01

    Process and installation making it possible to reconstitute precise images of an area of interest (2) of an object (1) by reducing the errors produced by the contribution of the complement of the object. A first series of measurements is carried out, where a conical beam (10) only takes in the area of interest of the object (2) and this is followed by a second series of measurements in which the beam takes in the entire object. A combination of the measurements of the two series is carried out in order to make them compatible and obtain a more accurate image of the area of interest (2).

  3. The electrical properties of zero-gravity processed immiscibles

    NASA Technical Reports Server (NTRS)

    Lacy, L. L.; Otto, G. H.

    1974-01-01

    When dispersed or mixed immiscibles are solidified on earth, a large amount of separation of the constituents takes place due to differences in densities. However, when the immiscibles are dispersed and solidified in zero-gravity, density separation does not occur, and unique composite solids can be formed with many new and promising electrical properties. By measuring the electrical resistivity and superconducting critical temperature, Tc, of zero-g processed Ga-Bi samples, it has been found that the electrical properties of such materials are entirely different from the basic constituents and the ground control samples. Our results indicate that space processed immiscible materials may form an entirely new class of electronic materials.

  4. SEPARATION OF INORGANIC SALTS FROM ORGANIC SOLUTIONS

    DOEpatents

    Katzin, L.I.; Sullivan, J.C.

    1958-06-24

    A process is described for recovering the nitrates of uranium and plutonium from solution in oxygen-containing organic solvents such as ketones or ethers. The solution of such salts dissolved in an oxygen-containing organic compound is contacted with an ion exchange resin whereby sorption of the entire salt on the resin takes place and then the salt-depleted liquid and the resin are separated from each other. The reaction seems to be based on an anion formation of the entire salt by complexing with the anion of the resin. Strong base or quaternary ammonium type resins can be used successfully in this process.

  5. A Change Process at German University--Innovation through Information and Communication Technologies?

    ERIC Educational Resources Information Center

    Zentel, Peter; Bett, Katja; Meister, Dorothee M.; Rinn, Ulrike; Wedekind, Joachim

    2004-01-01

    In this article, we describe the current situation of virtual universities in Germany and pursue the question of whether innovation processes are taking place throughout the entire higher education landscape. Our study shows that the integration of ICT [information and communication technologies] not only changes the medial characteristics of the…

  6. Risk-Taking Behavior in a Computerized Driving Task: Brain Activation Correlates of Decision-Making, Outcome, and Peer Influence in Male Adolescents.

    PubMed

    Vorobyev, Victor; Kwon, Myoung Soo; Moe, Dagfinn; Parkkola, Riitta; Hämäläinen, Heikki

    2015-01-01

    Increased propensity for risky behavior in adolescents, particularly in peer groups, is thought to reflect maturational imbalance between reward processing and cognitive control systems that affect decision-making. We used functional magnetic resonance imaging (fMRI) to investigate brain functional correlates of risk-taking behavior and effects of peer influence in 18-19-year-old male adolescents. The subjects were divided into low and high risk-taking groups using either personality tests or risk-taking rates in a simulated driving task. The fMRI data were analyzed for decision-making (whether to take a risk at intersections) and outcome (pass or crash) phases, and for the influence of peer competition. Personality test-based groups showed no difference in the amount of risk-taking (similarly increased during peer competition) and brain activation. When groups were defined by actual task performance, risk-taking activated two areas in the left medial prefrontal cortex (PFC) significantly more in low than in high risk-takers. In the entire sample, risky decision-specific activation was found in the anterior and dorsal cingulate, superior parietal cortex, basal ganglia (including the nucleus accumbens), midbrain, thalamus, and hypothalamus. Peer competition increased outcome-related activation in the right caudate head and cerebellar vermis in the entire sample. Our results suggest that the activation of the medial (rather than lateral) PFC and striatum is most specific to risk-taking behavior of male adolescents in a simulated driving situation, and reflect a stronger conflict and thus increased cognitive effort to take risks in low risk-takers, and reward anticipation for risky decisions, respectively. The activation of the caudate nucleus, particularly for the positive outcome (pass) during peer competition, further suggests enhanced reward processing of risk-taking under peer influence.

  7. Green Schools as High Performance Learning Facilities

    ERIC Educational Resources Information Center

    Gordon, Douglas E.

    2010-01-01

    In practice, a green school is the physical result of a consensus process of planning, design, and construction that takes into account a building's performance over its entire 50- to 60-year life cycle. The main focus of the process is to reinforce optimal learning, a goal very much in keeping with the parallel goals of resource efficiency and…

  8. Data Mining.

    ERIC Educational Resources Information Center

    Benoit, Gerald

    2002-01-01

    Discusses data mining (DM) and knowledge discovery in databases (KDD), taking the view that KDD is the larger view of the entire process, with DM emphasizing the cleaning, warehousing, mining, and visualization of knowledge discovery in databases. Highlights include algorithms; users; the Internet; text mining; and information extraction.…

  9. Finger-Writing Intervention Impacts the Spelling and Handwriting Skills of Children with Developmental Language Disorder: A Multiple Single-Case Study

    ERIC Educational Resources Information Center

    Van Reybroeck, Marie; Michiels, Nathalie

    2018-01-01

    Learning to use grapheme to phoneme correspondences (GPCs) provides a powerful mechanism for the foundation of reading skills in children. However, for some children, such as those with Developmental Language Disorder (DLD), the GPC learning process takes time, is laborious, and impacts the entire reading and spelling processes. The present study…

  10. Evaluating Teachers of Writing.

    ERIC Educational Resources Information Center

    Hult, Christine A., Ed.

    Describing the various forms evaluation can take, this book delineates problems in evaluating writing faculty and sets the stage for reconsidering the entire process to produce a fair, equitable, and appropriate system. The book discusses evaluation through real-life examples: evaluation of writing faculty by literature faculty, student…

  11. 77 FR 74876 - Agency Information Collection Activities; Existing Collection, Comments Requested; Federal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-18

    ... Office of Management and Budget, Office of Information and Regulatory Affairs, Attention Department of... 3,000 enrollments per year. The average response time for reading the directions for the Federal... = 100 hours. The entire process of reading the letter and completing both forms would take 15 minutes...

  12. Composition Theory: Taking It Apart and Putting It Back Together Again.

    ERIC Educational Resources Information Center

    Sydow, Debbie L.

    Although social constructivism is accepted as the guiding theory in Composition, the field's theoretical center of gravity, it neither accounts for nor explains the entire writing process. Two major challenges to social constructivism must also be considered in theoretical discussions: (1) the cognitive dimension…

  13. Behavioral, Cognitive and Social Science Research in the Military

    DTIC Science & Technology

    2000-08-01

    Education The objective of this project is to demonstrate empirically based cost-effective training strategies , with particular emphasis on how to best ... best to train soldiers for peacekeeping missions and how those techniques differ from training soldiers to take part in war. We want to understand...and formed a new strategic planning process for the entire S&T Program. The foundation of this process is the Defense S&T Strategy , which along with

  14. Accelerating Molecular Dynamic Simulation on Graphics Processing Units

    PubMed Central

    Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.

    2009-01-01

    We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337
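
    For orientation only, the following is a minimal NumPy sketch of one velocity-Verlet integration step with pairwise Lennard-Jones forces, the kind of integration loop mentioned in the record above; it is a CPU-side illustration in assumed reduced units, not the authors' GPU implementation, and the function and parameter names are made up for the example.

      import numpy as np

      def lj_forces(pos, eps=1.0, sigma=1.0):
          # Pairwise Lennard-Jones forces for a handful of particles (O(n^2), illustration only).
          n = len(pos)
          forces = np.zeros_like(pos)
          for i in range(n):
              for j in range(i + 1, n):
                  rij = pos[i] - pos[j]
                  d2 = np.dot(rij, rij)
                  sr6 = (sigma * sigma / d2) ** 3
                  # -dU/dr divided by r, multiplied by the separation vector
                  fij = 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / d2 * rij
                  forces[i] += fij
                  forces[j] -= fij
          return forces

      def velocity_verlet_step(pos, vel, forces, mass, dt):
          # Half kick, drift, force re-evaluation, half kick.
          vel_half = vel + 0.5 * dt * forces / mass
          pos_new = pos + dt * vel_half
          forces_new = lj_forces(pos_new)
          vel_new = vel_half + 0.5 * dt * forces_new / mass
          return pos_new, vel_new, forces_new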

  15. An Interdisciplinary Evaluation of Transactive Memory in Distributed Cyber Teams

    ERIC Educational Resources Information Center

    Mancuso, Vincent Francis

    2012-01-01

    In the modern workplace, collaboration is no longer only a face-to-face process. When working together, it is common for teams to rely on technology and operate across geographic, temporal and cultural boundaries. Most research, when looking at distributed teams, takes a uni-disciplinary perspective and fails to address the entire problem space.…

  16. Designing and Proposing Your Research Project. Concise Guides to Conducting Behavioral, Health, and Social Science Research Series

    ERIC Educational Resources Information Center

    Urban, Jennifer Brown; van Eeden-Moorefield, Bradley Matheus

    2017-01-01

    Designing your own study and writing your research proposal takes time, often more so than conducting the study. This practical, accessible guide walks you through the entire process. You will learn to identify and narrow your research topic, develop your research question, design your study, and choose appropriate sampling and measurement…

  17. Ways to Help Your Child through an Immunization: Visual Strategies for Autism and Other Developmental Disorders

    ERIC Educational Resources Information Center

    Hutchinson, Paula; Harvey, Vicki; Naugler, Krista

    2010-01-01

    Many people, whether old or young, male or female, typically developing or living with a disability, become quite anxious at the idea of a needle. They anticipate the possibility of pain, however brief, and try to avoid the experience. The reality is that any discomfort is usually very brief, and the entire process only takes a minute or two from…

  18. An Interactive Design Space Supporting Development of Vehicle Architecture Concept Models

    DTIC Science & Technology

    2011-01-01

    Denver, Colorado, USA IMECE2011-64510 AN INTERACTIVE DESIGN SPACE SUPPORTING DEVELOPMENT OF VEHICLE ARCHITECTURE CONCEPT MODELS Gary Osborne...early in the development cycle. Optimization taking place later in the cycle usually occurs at the detail design level, and tends to result in...architecture changes may be imposed, but such modifications are equivalent to a huge optimization cycle covering almost the entire design process, and

  19. The carbon chemistry of the moon.

    NASA Technical Reports Server (NTRS)

    Eglinton, G.; Maxwell, J. R.; Pillinger, C. T.

    1972-01-01

    The analysis of lunar samples has shown that the carbon chemistry of the moon is entirely different from the carbon chemistry of the earth. Lunar carbon chemistry is more closely related to cosmic physics than to conventional organic chemistry. Sources of carbon on the moon are considered, giving attention to meteorites and the solar wind. The approaches used in the analysis of the samples are discussed, taking into account the method of gas chromatography employed and procedures used by bioscience investigators in the study of the lunar fines. The presence of indigenous methane and carbide in the lunar fines was established. Reactions and processes taking place on the lunar surface are discussed.

  20. Institutional Case-Based Study on the Effect of Research Methods on Project Work in the Curriculum of Mechanical Engineering Programmes in Ghanaian Polytechnics

    ERIC Educational Resources Information Center

    Baffour-Awuah, Emmanuel

    2015-01-01

    Preparing students for Project Work (PROJ 1 and PROJ 2) requires them to go through Research Methods (RE) as part of the curriculum, though it takes centre stage in the entire preparation process. Knowledge of the relationships between the two could be a useful tool in improving the performance of students in the former. The purpose of the case…

  1. BELL ANNEALING FURNACES FOR LIGHT GAUGE PRODUCTS (LESS THAN 10/1000" ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    BELL ANNEALING FURNACES FOR LIGHT GAUGE PRODUCTS (LESS THAN 10/1000" THICKNESS). COILS INSIDE COVERING SHELLS ARE HEATED BY GAS-FIRED JETS TO TEMPERATURES OF 280-400 DEGREES CELSIUS, OVER 3-4 HOURS. AFTER COMPLETION OF THE HEATING CYCLE, COILS ARE COOLED SLOWLY TO BELOW 100 DEGREES CELSIUS BEFORE THE SHELL IS REMOVED AND THE COILS REMOVED. THE ENTIRE PROCESS TAKES 24 HOURS. - American Brass Foundry, 70 Sayre Street, Buffalo, Erie County, NY

  2. LMI design method for networked-based PID control

    NASA Astrophysics Data System (ADS)

    Souza, Fernando de Oliveira; Mozelli, Leonardo Amaral; de Oliveira, Maurício Carvalho; Palhares, Reinaldo Martinez

    2016-10-01

    In this paper, we propose a methodology for the design of networked PID controllers for second-order delayed processes using linear matrix inequalities. The proposed procedure takes into account time-varying delay on the plant, time-varying delays induced by the network, and packet dropouts. The design is carried out entirely using a continuous-time model of the closed-loop system, where time-varying delays are used to represent the sampling and holding that occur in a discrete-time digital PID controller.
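
    As a minimal illustration of what "design using linear matrix inequalities" means in practice, the sketch below checks the basic Lyapunov LMI (P positive definite, A'P + PA negative definite) for an invented stable second-order plant with CVXPY; this is only the simplest LMI building block under assumed example matrices, not the delay-dependent networked-PID conditions derived in the paper.

      import numpy as np
      import cvxpy as cp

      # Illustrative stable second-order plant (not from the paper).
      A = np.array([[0.0, 1.0],
                    [-2.0, -3.0]])

      P = cp.Variable((2, 2), symmetric=True)
      eps = 1e-6
      constraints = [P >> eps * np.eye(2),                 # P positive definite
                     A.T @ P + P @ A << -eps * np.eye(2)]  # Lyapunov inequality
      problem = cp.Problem(cp.Minimize(0), constraints)    # pure feasibility problem
      problem.solve()
      print(problem.status, P.value)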

  3. The Use of Computer Simulation Methods to Reach Data for Economic Analysis of Automated Logistic Systems

    NASA Astrophysics Data System (ADS)

    Neradilová, Hana; Fedorko, Gabriel

    2016-12-01

    Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they increase the efficiency and reliability of logistics processes. When evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore or underestimate this area, which is a mistake. One of the reasons the economic aspect is overlooked is that obtaining information for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining the data needed for a full-scale economic analysis.

  4. Are visual peripheries forever young?

    PubMed

    Burnat, Kalina

    2015-01-01

    The paper presents a concept of lifelong plasticity of peripheral vision. Central vision processing is accepted as critical and irreplaceable for normal perception in humans. While peripheral processing chiefly carries information about motion stimuli features and redirects foveal attention to new objects, it can also take over functions typical for central vision. Here I review the data showing the plasticity of peripheral vision found in functional, developmental, and comparative studies. Even though it is well established that afferent projections from central and peripheral retinal regions are not established simultaneously during early postnatal life, central vision is commonly used as a general model of development of the visual system. Based on clinical studies and visually deprived animal models, I describe how central and peripheral visual field representations separately rely on early visual experience. Peripheral visual processing (motion) is more affected by binocular visual deprivation than central visual processing (spatial resolution). In addition, our own experimental findings show the possible recruitment of coarse peripheral vision for fine spatial analysis. Accordingly, I hypothesize that the balance between central and peripheral visual processing, established in the course of development, is susceptible to plastic adaptations during the entire life span, with peripheral vision capable of taking over central processing.

  5. Contextual mediation of perceptions in hauntings and poltergeist-like experiences.

    PubMed

    Lange, R; Houran, J; Harte, T M; Havens, R A

    1996-06-01

    The content of perceived apparitions, e.g., bereavement hallucinations, cannot be explained entirely in terms of electromagnetically induced neurochemical processes. It was shown that contextual variables influential in hallucinatory and hypnotic states also structured reported haunting experiences. As predicted, high congruency was found between the experiential content and the nature of the contextual variables. Further, the number of contextual variables involved in an experience was related to the type of experience and the state of arousal preceding the experience. Based on these findings, we argue that a more complete explanation of haunting experiences should take into account both electromagnetically induced neurochemical processes and factors related to contextual mediation.

  6. The development of the tapeworm Diphyllobothrium latum (L. 1756) (Cestoda; Pseudophyllidea) in its definitive hosts, with special references to the growth patterns of D. dendriticum (Nitzsch, 1824) and D. ditremum (Creplin, 1827).

    PubMed

    Andersen, K

    1978-08-01

    When Diphyllobothrium latum develops from larva to adult in a definitive host, it first sheds the entire larval 'body' before growth of an adult strobila starts. This process of shedding off the entire larval abothrial extremity, piece by piece, takes about 48 h. By this time the larva has usually reached the anterior third of the small intestine of the host. D. dendriticum and D. ditremum develop quite differently, although exhibiting similar anterior migrations. In these two species the larvae develop directly into adults without the larval 'body' first being shed. The implications of the observed differences in growth pattern between these three species of Diphyllobothrium for the classification of diphyllobothriid cestodes are discussed briefly.

  7. Analysis of design characteristics of a V-type support using an advanced engineering environment

    NASA Astrophysics Data System (ADS)

    Gwiazda, A.; Banaś, W.; Sękala, A.; Cwikla, G.; Topolska, S.; Foit, K.; Monica, Z.

    2017-08-01

    Modern mining supports, for the entire period of their use, are an important part of the mining complex, which includes all the devices in the excavation during its normal use. Therefore, during the design of a support, it is an important task to choose its shape and to select its dimensions as well as its strength characteristics. According to the rules, the design process of a support must take into account, inter alia, the type and dimensions of the expected means of transport, the number and size of pipelines, and the type of additional equipment used in the excavation area. The support design must ensure the functionality of the excavation process and occupational safety, while maintaining the economic viability of the entire project. Among other things, it should ensure the selection of a support for the specific natural conditions. It is also important to take into consideration the economic characteristics of the project. The article presents an algorithm for an integrative approach, and its formalized description, that integrates the optimization of different construction characteristics of a V-type mining support. The paper includes an example of its application to developing the construction of this support. The paper also describes the results of the characteristics analysis and the changes that were introduced afterwards. The support models were prepared in a CAD-class computer environment (Siemens NX PLM), and the analyses were also conducted in this graphical design environment.

  8. Trazodone (Desyrel) and Pregnancy

    MedlinePlus

    ... birth defects. Can taking trazodone during my pregnancy cause pregnancy complications? One small study found no greater chance ... I need to take trazodone throughout my entire pregnancy. Will it cause withdrawal symptoms in my baby? Antidepressant use late ...

  9. Gene Expression in the Three-Spined Stickleback (Gasterosteus aculeatus) of Marine and Freshwater Ecotypes.

    PubMed

    Rastorguev, S M; Nedoluzhko, A V; Gruzdeva, N M; Boulygina, E S; Tsygankova, S V; Oshchepkov, D Y; Mazur, A M; Prokhortchouk, E B; Skryabin, K G

    2018-01-01

    The three-spined stickleback (Gasterosteus aculeatus) is a well-known model organism that is routinely used to explore microevolution and speciation, and the number of studies related to this fish has been growing recently. The main reason for the increased interest is the processes of freshwater adaptation taking place in natural populations of this species. Freshwater three-spined stickleback populations form when marine sticklebacks start spending their entire lifecycle in freshwater lakes and streams. Moreover, these freshwater populations acquire novel biological traits during their adaptation to a freshwater environment. The processes taking place in these populations are of great interest to evolutionary biologists. Here, we present differential gene expression profiling in G. aculeatus gills, performed in marine and freshwater populations of sticklebacks. In total, 2,982 genes differentially expressed between the marine and freshwater populations were discovered. We assumed that the differentially expressed genes were not distributed randomly along the stickleback chromosomes and that they are regularly observed in the "divergence islands" responsible for stickleback freshwater adaptation.

  10. Surgical quality assessment. A simplified approach.

    PubMed

    DeLong, D L

    1991-10-01

    The current approach to QA primarily involves taking action when problems are discovered and designing a documentation system that records the delivery of quality care. Involving the entire staff helps eliminate problems before they occur. By keeping abreast of current problems and soliciting input from staff members, QA at our hospital has improved dramatically. The cross-referencing of JCAHO and AORN standards on the assessment form and the single-sheet reporting form expedite the evaluation process and simplify record keeping. The bulletin board increases staff members' understanding of QA and boosts morale and participation. A sound and effective QA program does not require reorganizing an entire department, nor should it invoke negative connotations. Developing an effective QA program merely requires rethinking current processes. The program must meet the department's specific needs, and although many departments concentrate on documentation, auditing charts does not give a complete picture of the quality of care delivered. The QA committee must employ a variety of data collection methods on multiple indicators to ensure an accurate representation of the care delivered, and they must not overlook any issues that directly affect patient outcomes.

  11. 30 CFR 203.35 - What administrative steps must I take to use the RSV earned by a qualified phase 2 or phase 3...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... is located entirely or partly in water less than 200 meters deep, or before May 3, 2013, on a lease that is located entirely in water more than 200 meters but less than 400 meters deep, the MMS Regional... entirely in water more than 200 meters but less than 400 meters deep. You must provide a credible activity...

  12. Recertification in allergy and immunology: an historical review with special emphasis on the 1983 examination.

    PubMed

    Slavin, R G; Des Prez, L; Mansmann, H C; Meskauskas, J A; Pierson, W

    1985-03-01

    Recertification offers a method of evaluating a diplomate's cognitive knowledge of allergy and immunology. In 1983 candidates for the American Board of Allergy and Immunology recertification examination were offered the entire certifying examination but were informed that they would, for recertification purposes, be held responsible only for a subset of questions judged to be particularly clinically relevant. All 40 candidates elected to take the entire certifying examination. Differences between the performance of certifying and recertifying candidates on the recertifying questions were small. Except for the five-choice questions, the differences in performance between the two groups on the remaining questions were also small in an absolute sense. Recertification performance was not related to the time of original certification. Ninety-eight percent of the candidates completed a questionnaire after the examination. Ninety percent stated that they would encourage their colleagues to participate in the recertification process.

  13. Multicriterion problem of allocation of resources in the heterogeneous distributed information processing systems

    NASA Astrophysics Data System (ADS)

    Antamoshkin, O. A.; Kilochitskaya, T. R.; Ontuzheva, G. A.; Stupina, A. A.; Tynchenko, V. S.

    2018-05-01

    This study reviews the problem of resource allocation in heterogeneous distributed information processing systems, which may be formalized as a multicriterion multi-index problem with linear constraints of the transport type. The algorithms for solving this problem search for the entire set of Pareto-optimal solutions. For some classes of hierarchical systems, the procedure of checking a system of linear algebraic inequalities for consistency can be sped up significantly, either because they reduce to flow models or by applying other solution schemes (for strongly connected structures) that take into account the specifics of the hierarchies under consideration.
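
    As a small illustration of the "entire set of Pareto-optimal solutions" mentioned above, here is a sketch, under the assumption that candidate allocations have already been enumerated and scored, of a plain Pareto-dominance filter in Python; the sample cost matrix is invented for the example and is unrelated to the paper's transport-type constraints.

      import numpy as np

      def pareto_front(costs):
          # Indices of the non-dominated rows of `costs`, with all objectives minimized.
          costs = np.asarray(costs, dtype=float)
          keep = np.ones(len(costs), dtype=bool)
          for i in range(len(costs)):
              # Row i is dominated if some row is <= everywhere and strictly < somewhere.
              dominators = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
              if dominators.any():
                  keep[i] = False
          return np.where(keep)[0]

      # Four candidate allocations scored on two criteria (made-up numbers):
      scores = np.array([[1.0, 5.0], [2.0, 2.0], [3.0, 3.0], [2.0, 1.0]])
      print(pareto_front(scores))   # -> [0 3]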

  14. [Biotechnology in perspective].

    PubMed

    Brand, A

    1990-06-15

    Biotechnology is a collective term for a large number of manipulations of biological material. Fields of importance in livestock farming include: (1) manipulation of reproductive processes; (2) genetic manipulation of macro-(farm) animals and micro-organisms; and (3) manipulation of metabolism. Incorporating biotechnological findings into breeding-stock farming has repercussions in several fields, such as the relationship between producers and the ancillary and processing industries, service industries, consumers and society as a whole. The use of biotechnical findings will also require further automation and adaptation of farm management. Biotechnology opens up a new area and new prospects for farm animal husbandry. These can only be regarded as positive when they take sustainable development of the entire sector into account.

  15. Executive functioning in schizophrenia: Unique and shared variance with measures of fluid intelligence.

    PubMed

    Martin, A K; Mowry, B; Reutens, D; Robinson, G A

    2015-10-01

    Patients with schizophrenia often display deficits on tasks thought to measure "executive" processes. Recently, it has been suggested that reductions in fluid intelligence test performance entirely explain deficits reported for patients with focal frontal lesions on classical executive tasks. For patients with schizophrenia, it is unclear whether deficits on executive tasks are entirely accounted for by fluid intelligence and representative of a common general process, or best accounted for by distinct contributions to the cognitive profile of schizophrenia. In the current study, 50 patients with schizophrenia and 50 age-, sex- and premorbid-intelligence-matched controls were assessed using a broad neuropsychological battery, including tasks considered sensitive to executive abilities, namely the Hayling Sentence Completion Test (HSCT), word fluency, Stroop test, digit-span backwards, and spatial working memory. Fluid intelligence was measured using both the Matrix Reasoning subtest from the Wechsler Abbreviated Scale of Intelligence (WASI) and a composite score derived from a number of cognitive tests. Patients with schizophrenia were impaired on all cognitive measures compared with controls, except smell identification and the optimal betting and risk-taking measures from the Cambridge Gambling Task. After introducing fluid intelligence as a covariate, significant differences remained for HSCT suppression errors, and classical executive function tests such as the Stroop test and semantic/phonemic word fluency, regardless of which fluid intelligence measure was included. Fluid intelligence does not entirely explain impaired performance on all tests considered as reflecting "executive" processes. For schizophrenia, these measures should remain part of a comprehensive neuropsychological assessment alongside a measure of fluid intelligence.

  16. Whole-body 3D kinematics of bird take-off: key role of the legs to propel the trunk

    NASA Astrophysics Data System (ADS)

    Provini, Pauline; Abourachid, Anick

    2018-02-01

    Previous studies showed that birds primarily use their hindlimbs to propel themselves into the air in order to take off. Yet, it remains unclear how the different parts of their musculoskeletal system move to produce the necessary acceleration. To quantify the relative motions of the bones during the terrestrial phase of take-off, we used biplanar fluoroscopy in two species of birds, the diamond dove (Geopelia cuneata) and the zebra finch (Taeniopygia guttata). We obtained a detailed 3D kinematic analysis of the head, the trunk and the three long bones of the left leg. We found that the entire body assisted the production of the forces needed to take off, during two distinct but complementary phases. The first one, a relatively slow preparatory phase, started with a movement of the head and an alignment of the different groups of bones with the future take-off direction. It was associated with a pitch down of the trunk and a flexion of the ankle, of the hip and, to a lesser extent, of the knee. This crouching movement could contribute to the loading of the leg muscles and store elastic energy that could be released in the propulsive phase of take-off, during the extension of the leg joints. Combined with the fact that the head, together with the trunk, produced a forward momentum, the entire body assisted the production of the forces needed to take off. The second phase was faster, with mostly horizontal forward and vertical upward translation motions, synchronous with an extension of the entire lower articulated musculoskeletal system. It led to the propulsion of the bird into the air, with a fundamental role of the hip and ankle joints in moving the trunk upward and forward. Take-off kinematics were similar in both studied species, with a more pronounced crouching movement in the diamond dove, which can be related to its larger body mass compared with the zebra finch.

  17. [Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].

    PubMed

    Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco

    2014-01-01

    Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or by the non-random lack of some information in a subgroup of the population. The aim is to provide an overview of the IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and children's IQ at 7 years. This methodology corrects the analysis by weighting the observations with the probability of being selected. IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking account of it, we can make inferences about the entire target population starting from the non-missing observations alone. The procedure for the calculation is the following: first, we consider the entire study population and estimate the probability of non-missing information using a logistic regression model, where the response is the non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. Then the analysis is performed only on the non-missing observations using a weighted model. IPW is a technique that embeds the selection process in the analysis of the estimates, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO₂ on children's verbal IQ is stronger than the effect shown by the analysis performed without regard to the selection processes.
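
    The two-step procedure described above (a logistic model for the probability of being observed, then a weighted analysis of the observed subjects) can be sketched in a few lines; this is a generic illustration with statsmodels, not the authors' code, and the function and variable names are assumptions.

      import numpy as np
      import statsmodels.api as sm

      def ipw_weights(predictors, observed):
          # Step 1: logistic regression for the probability of being observed (non-missing);
          # the IPW weight of each subject is the inverse of that predicted probability.
          X = sm.add_constant(predictors)
          fit = sm.Logit(observed, X).fit(disp=False)
          return 1.0 / fit.predict(X)

      def weighted_outcome_model(outcome, exposure, weights, observed):
          # Step 2: weighted regression restricted to the non-missing observations.
          mask = observed.astype(bool)
          X = sm.add_constant(exposure[mask])
          return sm.WLS(outcome[mask], X, weights=weights[mask]).fit()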

  18. Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online

    PubMed Central

    Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary

    2018-01-01

    Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)–mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574

  19. Super Safety and Health Day at KSC

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Employees take a look at this NASCAR auto being displayed during Super Safety and Health Day at KSC. Safety Day is a full day of NASA-sponsored, KSC and 45th Space Wing events involving a number of health and safety related activities: Displays, vendors, technical paper sessions, panel discussions, a keynote speaker, etc. The entire Center and Wing stand down to participate in the planned events. Safety Day is held annually to proactively increase awareness in safety and health among the government and contractor workforce population. The first guiding principle at KSC is "Safety and Health First." KSC's number one goal is to "Assure sound, safe and efficient practices and processes are in place for privatized/commercialized launch site processing."

  20. KSC00pp1586

    NASA Image and Video Library

    2000-10-18

    Employees take a look at this NASCAR auto being displayed during Super Safety and Health Day at KSC. Safety Day is a full day of NASA-sponsored, KSC and 45th Space Wing events involving a number of health and safety related activities: Displays, vendors, technical paper sessions, panel discussions, a keynote speaker, etc. The entire Center and Wing stand down to participate in the planned events. Safety Day is held annually to proactively increase awareness in safety and health among the government and contractor workforce population. The first guiding principle at KSC is “Safety and Health First.” KSC’s number one goal is to “Assure sound, safe and efficient practices and processes are in place for privatized/commercialized launch site processing.

  2. Computation of output feedback gains for linear stochastic systems using the Zangwill-Powell method

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1977-01-01

    Because conventional optimal linear regulator theory results in a controller which requires the capability of measuring and/or estimating the entire state vector, it is of interest to consider procedures for computing controls which are restricted to be linear feedback functions of a lower dimensional output vector and which take into account the presence of measurement noise and process uncertainty. To this effect a stochastic linear model has been developed that accounts for process parameter and initial uncertainty, measurement noise, and a restricted number of measurable outputs. Optimization with respect to the corresponding output feedback gains was then performed for both finite and infinite time performance indices without gradient computation by using Zangwill's modification of a procedure originally proposed by Powell.

  3. Thermal Infrared Radiometric Calibration of the Entire Landsat 4, 5, and 7 Archive (1982-2010)

    NASA Technical Reports Server (NTRS)

    Schott, John R.; Hook, Simon J.; Barsi, Julia A.; Markham, Brian L.; Miller, Jonathan; Padula, Francis P.; Raqueno, Nina G.

    2012-01-01

    Landsat's continuing record of the thermal state of the earth's surface represents the only long-term (1982 to the present) global record with spatial scales appropriate for human-scale studies (i.e., tens of meters). Temperature drives many of the physical and biological processes that impact the global and local environment. As our knowledge of, and interest in, the role of temperature in these processes has grown, the value of Landsat data for monitoring trends and processes has also grown. The value of the Landsat thermal data archive will continue to grow as we develop more effective ways to study the long-term processes and trends affecting the planet. However, in order to take proper advantage of the thermal data, we need to be able to convert the data to surface temperatures. A critical step in this process is to have the entire archive completely and consistently calibrated into absolute radiance so that it can be atmospherically compensated to surface-leaving radiance and then to surface radiometric temperature. This paper addresses the methods and procedures that have been used to perform the radiometric calibration of the earliest sizable thermal data set in the archive (Landsat 4 data). The completion of this effort, along with the updated calibration of the earlier (1985-1999) Landsat 5 data, also reported here, concludes a comprehensive calibration of the Landsat thermal archive of data from 1982 to the present.
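
    The conversion chain sketched above (digital number to absolute radiance, then to a radiometric temperature) follows the standard Landsat thermal-band formulas; the snippet below is only an illustration of those formulas, stopping at at-sensor brightness temperature before the atmospheric compensation step mentioned in the abstract. The gain/offset would come from the scene's MTL metadata, and the K1/K2 defaults shown are the commonly published Landsat 5 TM band 6 constants rather than values taken from this paper.

      import numpy as np

      def dn_to_radiance(dn, radiance_mult, radiance_add):
          # At-sensor spectral radiance: L = ML * Qcal + AL (ML, AL from the scene metadata).
          return radiance_mult * np.asarray(dn, dtype=float) + radiance_add

      def radiance_to_brightness_temperature(radiance, k1=607.76, k2=1260.56):
          # Brightness temperature in kelvin: T = K2 / ln(K1 / L + 1).
          return k2 / np.log(k1 / radiance + 1.0)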

  4. Integration of Tuyere, Raceway and Shaft Models for Predicting Blast Furnace Process

    NASA Astrophysics Data System (ADS)

    Fu, Dong; Tang, Guangwu; Zhao, Yongfu; D'Alessio, John; Zhou, Chenn Q.

    2018-06-01

    A novel modeling strategy is presented for simulating the blast furnace ironmaking process. The physical and chemical phenomena involved take place across a wide range of length and time scales, and three models are developed to simulate different regions of the blast furnace, i.e., the tuyere model, the raceway model and the shaft model. This paper focuses on the integration of the three models to predict the entire blast furnace process. A mapping of outputs and inputs between models and an iterative scheme are developed to establish communication between the models. The effects of tuyere operation and burden distribution on blast furnace fuel efficiency are investigated numerically. The integration of different models provides a way to simulate the blast furnace realistically by improving the modeling resolution of local phenomena and minimizing the model assumptions.
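
    The "mapping of outputs and inputs between models and an iterative scheme" can be pictured as an under-relaxed fixed-point loop; the sketch below is a generic coupling skeleton in which the three model callables and the exchanged state vector are placeholders, not the actual tuyere/raceway/shaft codes integrated in the paper.

      import numpy as np

      def run_coupled(tuyere_model, raceway_model, shaft_model, initial_state,
                      relax=0.5, tol=1e-6, max_iter=100):
          # Pass boundary data around the model chain until the exchanged state stops changing.
          state = np.asarray(initial_state, dtype=float)
          for iteration in range(max_iter):
              blast = tuyere_model(state)            # tuyere model -> raceway inlet conditions
              raceway_out = raceway_model(blast)     # raceway model -> gas entering the shaft
              new_state = shaft_model(raceway_out)   # shaft model -> updated burden/gas state
              if np.linalg.norm(new_state - state) < tol:
                  return new_state, iteration
              state = (1.0 - relax) * state + relax * new_state   # damped (under-relaxed) update
          raise RuntimeError("model coupling did not converge")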

  5. Computation of output feedback gains for linear stochastic systems using the Zangwill-Powell Method

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1975-01-01

    Because conventional optimal linear regulator theory results in a controller which requires the capability of measuring and/or estimating the entire state vector, it is of interest to consider procedures for computing controls which are restricted to be linear feedback functions of a lower dimensional output vector and which take into account the presence of measurement noise and process uncertainty. To this effect a stochastic linear model has been developed that accounts for process parameter and initial uncertainty, measurement noise, and a restricted number of measurable outputs. Optimization with respect to the corresponding output feedback gains was then performed for both finite and infinite time performance indices without gradient computation by using Zangwill's modification of a procedure originally proposed by Powell. Results using a seventh order process show the proposed procedures to be very effective.
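
    A derivative-free search of this kind can be reproduced with SciPy's Powell method: the sketch below minimizes the steady-state quadratic cost of a static output-feedback gain for a small invented plant via a Lyapunov equation. It is a toy stand-in for the report's seventh-order stochastic model, and all matrices are assumptions made for the example.

      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov
      from scipy.optimize import minimize

      # Toy 2-state, 1-input, 1-output plant (illustrative values only).
      A = np.array([[0.0, 1.0], [-1.0, -0.5]])
      B = np.array([[0.0], [1.0]])
      C = np.array([[1.0, 0.0]])
      Q = np.eye(2)            # state weighting
      R = np.array([[1.0]])    # control weighting
      X0 = np.eye(2)           # initial-state / process uncertainty covariance

      def cost(k_flat):
          # Quadratic cost of the static output feedback u = -K y.
          K = k_flat.reshape(B.shape[1], C.shape[0])
          A_cl = A - B @ K @ C
          if np.max(np.linalg.eigvals(A_cl).real) >= 0.0:
              return 1e6                                 # penalize destabilizing gains
          W = Q + C.T @ K.T @ R @ K @ C
          P = solve_continuous_lyapunov(A_cl.T, -W)      # A_cl' P + P A_cl + W = 0
          return float(np.trace(P @ X0))

      result = minimize(cost, x0=np.zeros(1), method="Powell")   # gradient-free search
      print("gain:", result.x, "cost:", result.fun)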

  6. Thorns on My Tongue

    ERIC Educational Resources Information Center

    Reed, Malcolm

    2012-01-01

    How might we bear witness to the fluidity and fragility of identity work that takes place during classroom discussion? How do teachers and pupils play with the personal politics of positioning during our everyday interactions? The piece that follows is written as a story almost entirely in everyday dialogue. It takes a methodological turn towards…

  7. Melting dynamics of ice in the mesoscopic regime

    PubMed Central

    Citroni, Margherita; Fanetti, Samuele; Falsini, Naomi; Foggi, Paolo; Bini, Roberto

    2017-01-01

    How does a crystal melt? How long does it take for melt nuclei to grow? The melting mechanisms have been addressed by several theoretical and experimental works, covering a subnanosecond time window with sample sizes of tens of nanometers and thus suitable to determine the onset of the process but unable to unveil the following dynamics. On the other hand, macroscopic observations of phase transitions, with millisecond or longer time resolution, account for processes occurring at surfaces and time limited by thermal contact with the environment. Here, we fill the gap between these two extremes, investigating the melting of ice in the entire mesoscopic regime. A bulk ice Ih or ice VI sample is homogeneously heated by a picosecond infrared pulse, which delivers all of the energy necessary for complete melting. The evolution of melt/ice interfaces thereafter is monitored by Mie scattering with nanosecond resolution, for all of the time needed for the sample to reequilibrate. The growth of the liquid domains, over distances of micrometers, takes hundreds of nanoseconds, a time orders of magnitude larger than expected from simple H-bond dynamics. PMID:28536197

  8. The Performance of Economics Graduates over the Entire Curriculum: The Determinants of Success

    ERIC Educational Resources Information Center

    Swope, Kurtis J.; Schmitt, Pamela M.

    2006-01-01

    Most studies of the determinants of understanding in economics focus on performance in a single course or standardized exam. Taking advantage of a large data set available at the U.S. Naval Academy (USNA), the authors examined the performance of economics majors over an entire curriculum. They found that gender was not a significant predictor of…

  9. Cloud Surprises Discovered in Moving NASA EOSDIS Applications into Amazon Web Services… and #6 Will Shock You!

    NASA Astrophysics Data System (ADS)

    McLaughlin, B. D.; Pawloski, A. W.

    2017-12-01

    NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected is a number of issues that were beyond purely technical application re-architectures. From surprising network policy limitations, billing challenges in a government-based cost model, and obtaining certificates in an NASA security-compliant manner to working with multiple applications in a shared and resource-constrained AWS account, these have been the relevant challenges in taking advantage of a cloud model. And most surprising of all… well, you'll just have to wait and see the "gotcha" that caught our entire team off guard!

  10. STUDIES ON A-AVITAMINOSIS IN CHICKENS

    PubMed Central

    Seifried, Oskar

    1930-01-01

    1. The principal tissue changes in the respiratory tract of chickens caused by a vitamin A deficiency in the food are, first, an atrophy and degeneration of the lining mucous membrane epithelium as well as of the epithelium of the mucous membrane glands. This process is followed or accompanied by a replacement or substitution of the degenerating original epithelium of these parts by a squamous stratified keratinizing epithelium. This newly formed epithelium develops from the primitive columnar epithelium and divides and grows very rapidly. The process appears to be one of substitution rather than a metaplasia, and resembles the normal keratinization of the skin or even more closely the incomplete keratinization of the mucous membranes (e.g., the esophagus or certain parts of the tongue of chickens). In this connection findings have been described which not only afford an interesting insight into the complicated mechanism of keratinization, but also show probable relations between keratinization and the development of Guarnieri's inclusion bodies. Balloon and reticular degeneration of the upper layers of the new stratified epithelium has been frequently observed. All parts of the respiratory tract are about equally involved in the process; and the olfactory region as well, so that the sense of smell may be lost. The lesions, which first take place on the surface epithelium and then in the glands, show only minor differences. 2. The protective mechanism inherent in the mucous membranes of the entire respiratory tract is seriously damaged or even entirely destroyed by the degeneration of the ciliated cells at the surface and the lack of secretion with bactericidal properties. Secondary infections are frequently found, and nasal discharge and various kinds of inflammatory processes are common, including purulent ones, especially in the upper respiratory tract, communicating sinuses, eyes and trachea. The development of the characteristic histological process is not dependent upon the presence of these infections, since it also takes place in the absence of infection. 3. The specific histological lesions make it possible to differentiate between A-avitaminosis and some infectious diseases of the respiratory tract. These studies we hope will serve as a basis for further investigations on the relationship between A-avitaminosis and infection in general. PMID:19869784

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, E.C.; Killough, S.M.; Rowe, J.C.

    The purpose of the Smart Crane Ammunition Transfer System (SCATS) project is to demonstrate robotic/telerobotic controls technology for a mobile articulated crane for missile/munitions handling, delivery, and reload. Missile resupply and reload have been manually intensive operations up to this time. Currently, reload missiles are delivered by truck to the site of the launcher. A crew of four to five personnel reloads the missiles from the truck to the launcher using a hydraulic-powered crane. The missiles are handled carefully for the safety of the missiles and personnel. Numerous steps are required in the reload process and the entire reload operation can take over an hour for some missile systems. Recent US Army directives require the entire operation to be accomplished in a fraction of that time. Current development of SCATS is being based primarily on reloading Patriot missiles. This paper summarizes the current status of the SCATS project at the Oak Ridge National Laboratory (ORNL). Additional information on project background and requirements has been described previously (Bradley, et al., 1995).

  12. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire-workflow management in a developing country. Methods: The entire-workflow QA process management runs from patient registration to the end of the last treatment, including all steps of the entire radiotherapy process. The chart-check error rate is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6,000 patients before and after implementing the entire-workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information exporting to the OIS (Oncology Information System), treatment QA documents and QA of the treatment history. The error rate derived from the chart check decreased from 1.7% to 0.9% after our entire-workflow QA process was introduced. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed accordingly to prevent those errors. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience can be applicable to heavily loaded radiotherapy departments in developing countries.

  13. Planning of Educational Informatics Network (EIN)

    ERIC Educational Resources Information Center

    Çekiç, Osman; Kusçu, Meltem

    2017-01-01

    Planning is a concept that takes place all throughout one's life. People plan their education, work, private lives, and careers; in other words, they plan their entire lives to achieve specific desired outcomes. Not only do individuals plan which schools they will attend, but also where they should live and which jobs they should take. Similar to…

  14. Rotating Solids and Flipping Instruction

    ERIC Educational Resources Information Center

    Grypp, Lynette; Luebeck, Jennifer

    2015-01-01

    Technology is causing educators to rethink the entire notion of classroom learning, not only with respect to what learning should take place but also where it should take place. One such innovation is flipped instruction, broadly defined by Staker and Horn (2012) as an instructional model in which students learn partly through online delivery and…

  15. Creating "Metaphorical Spaces" in a Language Arts Classroom

    ERIC Educational Resources Information Center

    Wiseman, Angela M.

    2007-01-01

    This paper describes a collaborative relationship between a community member and an eighth grade English teacher that was documented through an ethnographic study during an entire school year. The community member taught a weekly poetry workshop where students are encouraged to take risks in their writing and also take a critical stance towards…

  16. Constructive Failure in Mathematics Classes

    ERIC Educational Resources Information Center

    Rowlett, Joel E.

    2011-01-01

    Great strides in the real world are usually accompanied by failure. Mathematics teachers should accept some failure as their students take risks during mathematical explorations. This is not to imply that students should fail an entire course, but they should have opportunities to take risks that may lead to failure, especially in the area of…

  17. Recommendations for safe vaccination in children at risk of allergic reactions to vaccine components

    PubMed

    2018-04-01

    Vaccines are one of the most important advances in medicine as a public health tool for the control of immunopreventable diseases. Occasionally, adverse reactions may occur. If a child has a reaction to a vaccine, it is likely to disrupt the child's immunization schedule, with risks to the child and the community. This establishes the importance of correctly diagnosing a possible allergy and defining the appropriate course of action. Allergic reactions to vaccines may be due to the immunogenic component, to residual proteins from the manufacturing process, or to antimicrobial agents, stabilizers, preservatives and any other element used in manufacturing. Vaccination should be a priority for the entire child population, so this document describes particular situations involving allergic children in order to minimize the risks of immunization and achieve safe vaccination.

  18. Medicare+Choice: what lies ahead?

    PubMed

    Layne, R Jeffrey

    2002-03-01

    Health plans have continued to exit the Medicare+Choice program in recent years, despite efforts of Congress and the Centers for Medicare and Medicaid Services (CMS) to reform the program. Congress and CMS therefore stand poised to make additional, substantial reforms to the program. CMS has proposed to consolidate its oversight of the program, extend the due date for Medicare+Choice plans to file their adjusted community rate proposals, revise risk-adjustment processes, streamline the marketing review process, enhance quality-improvement requirements, institute results-based performance assessment audits, coordinate policy changes to coincide with contracting cycles, expand its fall advertising campaign for the program, provide better employer-based Medicare options for beneficiaries, and take steps to minimize beneficiary costs. Congressional leaders have proposed various legislative remedies to improve the program, including creation of an entirely new pricing structure for the program based on a competitive bidding process.

  19. Microfluidic droplet-based liquid-liquid extraction.

    PubMed

    Mary, Pascaline; Studer, Vincent; Tabeling, Patrick

    2008-04-15

    We study microfluidic systems in which mass exchanges take place between moving water droplets, formed on-chip, and an external phase (octanol). Here, no chemical reaction takes place, and the mass exchanges are driven by a contrast in chemical potential between the dispersed and continuous phases. We analyze the case where the microfluidic droplets, occupying the entire width of the channel, extract a solute (fluorescein) from the external phase (extraction) and the opposite case, where droplets reject a solute (rhodamine) into the external phase (purification). Four flow configurations are investigated, based on straight or zigzag microchannels. In addition to the experimental work, we performed two-dimensional numerical simulations. In the experiments, we analyze the influence of different parameters on the process (channel dimensions, fluid viscosities, flow rates, drop size, droplet spacing, ...). Several regimes are singled out. In agreement with the mass transfer theory of Young et al. (Young, W.; Pumir, A.; Pomeau, Y. Phys. Fluids A 1989, 1, 462), we find that, after a short transient, the amount of matter transferred across the droplet interface grows as the square root of time, and the time it takes for the transfer process to be completed decreases as Pe^(-2/3), where Pe is the Peclet number based on droplet velocity and radius. The numerical simulations are in excellent agreement with the experiments. In practice, the transfer time ranges between a fraction of a second and a few seconds, which is much faster than conventional systems.
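
    As a quick illustration of the scalings quoted above, the sketch below evaluates a completion-time estimate of the form (R^2/D)*Pe^(-2/3), with Pe = U*R/D. The prefactor convention and all parameter values (droplet radius, velocity, diffusivity) are illustrative assumptions, not numbers taken from the paper.

    ```python
    # Hypothetical illustration of the scaling reported in the abstract:
    # transfer time ~ (R^2 / D) * Pe^(-2/3), with Pe = U * R / D.
    # Parameter values are illustrative guesses, not taken from the paper.

    def transfer_time(radius_m, velocity_m_s, diffusivity_m2_s):
        """Order-of-magnitude completion time for droplet liquid-liquid extraction."""
        peclet = velocity_m_s * radius_m / diffusivity_m2_s      # Peclet number
        diffusive_time = radius_m ** 2 / diffusivity_m2_s        # pure-diffusion time scale
        return diffusive_time * peclet ** (-2.0 / 3.0), peclet

    t, pe = transfer_time(radius_m=50e-6,          # 50 um droplet radius (assumed)
                          velocity_m_s=1e-3,       # 1 mm/s droplet velocity (assumed)
                          diffusivity_m2_s=5e-10)  # fluorescein-like diffusivity (assumed)
    print(f"Pe = {pe:.0f}, estimated transfer time = {t:.2f} s")
    # -> roughly a fraction of a second, consistent with the quoted
    #    "fraction of a second to a few seconds" range.
    ```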

  20. Global Adjoint Tomography: Combining Big Data with HPC Simulations

    NASA Astrophysics Data System (ADS)

    Bozdag, E.; Lefebvre, M. P.; Lei, W.; Peter, D. B.; Smith, J. A.; Komatitsch, D.; Tromp, J.

    2014-12-01

    The steady increase in data quality and in the number of global seismographic stations has substantially grown the amount of data available for constructing Earth models. Meanwhile, developments in the theory of wave propagation, numerical methods, and HPC systems have enabled unprecedented simulations of seismic wave propagation in realistic 3D Earth models, which allow more information to be extracted from the data, ultimately culminating in the use of entire three-component seismograms. Our aim is to take adjoint tomography further and image the entire planet, which is one of the extreme cases in seismology due to its intense computational requirements and the vast amount of high-quality seismic data that can potentially be assimilated in inversions. We have started low-resolution (T > 27 s, soon > 17 s) global inversions with 253 earthquakes for a transversely isotropic crust and mantle model on Oak Ridge National Laboratory's Cray XK7 "Titan" system. Recent improvements in our 3D solvers, such as the GPU version of the SPECFEM3D_GLOBE package, will allow us to perform higher-resolution (T > 9 s) and longer-duration (~180 min) simulations to take advantage of high-frequency body waves and major-arc surface waves, improving the imbalanced ray coverage that results from the uneven distribution of sources and receivers on the globe. Our initial results after 10 iterations already indicate several prominent features reported in high-resolution continental studies, such as major slabs (Hellenic, Japan, Bismarck, Sandwich, etc.) and enhancement of plume structures (the Pacific superplume, the Hawaii hot spot, etc.). Our ultimate goal is to assimilate seismic data from more than 6,000 earthquakes within the magnitude range 5.5 ≤ Mw ≤ 7.0. To take full advantage of this data set on ORNL's computational resources, we need a solid framework for managing big data sets during pre-processing (e.g., data requests and quality checks), gradient calculations, and post-processing (e.g., pre-conditioning and smoothing gradients); we address the bottlenecks in our global seismic workflow with ORNL's ADIOS libraries. We will present our "first generation" model and discuss challenges and future directions in global seismology.

  1. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 08: Retrospective Dose Accumulation Workflow in Head and Neck Cancer Patients Using RayStation 4.5.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Olive; Chan, Biu; Moseley, Joanne

    Purpose: We have developed a semi-automated dose accumulation workflow for Head and Neck Cancer (HNC) patients to evaluate volumetric and dosimetric changes that take place during radiotherapy. This work will be used to assess how dosimetric changes affect both toxicity and disease control, and hence inform the feasibility and design of a prospective HNC adaptive trial. Methods: RayStation 4.5.2 features deformable image registration (DIR), in which structures already defined on the planning CT image set can be deformably mapped onto cone-beam computed tomography (CBCT) images, accounting for daily treatment set-up shifts and changes in patient anatomy. The daily delivered dose can be calculated on each CBCT and mapped back to the planning CT to allow dose accumulation. The process is partially automated using Python scripts developed in collaboration with RaySearch. Results: To date we have performed dose accumulation on 18 HNC patients treated at our institution during 2013–2015 under REB approval. Our semi-automated process establishes clinical feasibility. Generally, dose accumulation for the entire treatment course of one case takes 60–120 minutes: importing all CBCTs requires 20–30 minutes, as each patient has 30 to 40 treated fractions; image registration and dose accumulation require 60–90 minutes. This is in contrast to the process without automated scripts, where dose accumulation alone would take 3–5 hours. Conclusions: We have developed a reliable workflow for retrospective dose tracking in HNC using RayStation. The process has been validated for HNC patients treated on both Elekta and Varian linacs with CBCTs acquired on XVI and OBI platforms, respectively.

  2. Geometrical modeling of complete dental shapes by using panoramic X-ray, digital mouth data and anatomical templates.

    PubMed

    Barone, Sandro; Paoli, Alessandro; Razionale, Armando Viviano

    2015-07-01

    In the field of orthodontic planning, the creation of a complete digital dental model to simulate and predict treatments is of utmost importance. Nowadays, orthodontists use panoramic radiographs (PAN) and dental crown representations obtained by optical scanning. However, these data do not contain any 3D information regarding tooth root geometries. A reliable orthodontic treatment should instead take into account entire geometrical models of dental shapes in order to better predict tooth movements. This paper presents a methodology to create complete 3D patient dental anatomies by combining digital mouth models and panoramic radiographs. The modeling process is based on using crown surfaces, reconstructed by optical scanning, and root geometries, obtained by adapting anatomical CAD templates over patient specific information extracted from radiographic data. The radiographic process is virtually replicated on crown digital geometries through the Discrete Radon Transform (DRT). The resulting virtual PAN image is used to integrate the actual radiographic data and the digital mouth model. This procedure provides the root references on the 3D digital crown models, which guide a shape adjustment of the dental CAD templates. The entire geometrical models are finally created by merging dental crowns, captured by optical scanning, and root geometries, obtained from the CAD templates. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. It Takes a Community

    ERIC Educational Resources Information Center

    Stephens, Missy

    2010-01-01

    Tuscaloosa, Alabama, created a prekindergarten program for at-risk children by bringing together the entire community. With donations and volunteer help, the city managed to fund 19 classrooms serving 275 children.

  4. Integrated Knowledge Translation: illustrated with outcome research in mental health.

    PubMed

    Preyde, Michele; Carter, Jeff; Penney, Randy; Lazure, Kelly; Vanderkooy, John; Chevalier, Pat

    2015-01-01

    Through this article the authors present a case summary of the early phases of research conducted with an Integrated Knowledge Translation (iKT) approach utilizing four factors: research question, research approach, feasibility, and outcome. iKT refers to an approach for conducting research in which community partners, referred to as knowledge users, are engaged in the entire research process. In this collaborative approach, knowledge users and researchers jointly devise the entire research agenda beginning with the development of the research question(s), determination of a feasible research design and feasible methods, interpretation of the results, dissemination of the findings, and the translation of knowledge into practice or policy decisions. Engaging clinical or community partners in the research enterprise can enhance the utility of the research results and facilitate its uptake. This collaboration can be a complex arrangement and flexibility may be required to accommodate the various configurations that the collaboration can take. For example, the research question can be jointly determined and refined; however, one person must take the responsibility for orchestrating the project, including preparing the proposal and application to the Research Ethics Board. This collaborative effort also requires the simultaneous navigation of barriers and facilitators to the research enterprise. Navigating these elements becomes part of the conduct of research with the potential for rewarding results, including an enriched work experience for clinical partners and investigators. One practice implication is that iKT may be considered of great utility to service providers due to its field friendly nature.

  5. gpuPOM: a GPU-based Princeton Ocean Model

    NASA Astrophysics Data System (ADS)

    Xu, S.; Huang, X.; Zhang, Y.; Fu, H.; Oey, L.-Y.; Xu, F.; Yang, G.

    2014-11-01

    Rapid advances in the performance of the graphics processing unit (GPU) have made the GPU a compelling solution for a range of scientific applications. However, most existing GPU acceleration work for climate models ports only parts of the code at certain hot spots and can achieve only limited speedup for the entire model. In this work, we take the mpiPOM (a parallel version of the Princeton Ocean Model) as our starting point and design and implement a GPU-based Princeton Ocean Model. By carefully considering the architectural features of state-of-the-art GPU devices, we rewrite the full mpiPOM model from the original Fortran version into a new Compute Unified Device Architecture C (CUDA-C) version. We apply several acceleration methods to further improve the performance of gpuPOM, including optimizing memory access on a single GPU, overlapping communication and boundary operations among multiple GPUs, and overlapping input/output (I/O) between the hybrid Central Processing Unit (CPU) and the GPU. Our experimental results indicate that the performance of gpuPOM on a workstation containing 4 GPUs is comparable to that of a powerful cluster with 408 CPU cores, while reducing energy consumption by a factor of 6.8.

  6. Logistic Principles Application for Managing the Extraction and Transportation of Solid Minerals

    NASA Astrophysics Data System (ADS)

    Tyurin, Alexey

    2017-11-01

    Reducing the cost of resources in solid mineral extraction is an urgent task. To solve it, the article proposes a logistic approach to managing all resources of a mining company, including the extraction processes, transport, mineral handling, and storage. Accounting for the uneven operation of mining and transport units and of the complexes for processing and loading coal into railroad cars makes it possible to identify shortcomings in the work of the entire enterprise and to reduce resource use at the planned production level. The article presents a mine planning model that takes into account the dynamics of production, transport stations, and the export of coal to consumers by rail, using the example of Nazarovo JSC «Razrez Sereul'skiy» in the Krasnoyarsk region. The use of rolling planning methods and data aggregation makes it possible to split the planning horizon (a month) into equal periods and to apply dynamic programming to build an optimal mining production program for the month, as sketched below. The proposed technique for defining the coal mining production program helps align the work of all enterprise units, optimize resources in all areas, establish a flexible relationship between producer and consumer, and take into account the irregularity of rail transport.
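
    The rolling-planning idea described above, splitting the monthly horizon into equal sub-periods and applying dynamic programming, can be sketched with a toy model. The demands, capacities, and costs below are invented, and the sketch is not the article's actual production model.

    ```python
    # Toy dynamic-programming sketch of building a monthly production programme over
    # equal sub-periods. All numbers (demands, costs, capacities) are hypothetical.

    from functools import lru_cache

    demand    = [30, 45, 25, 40]   # coal to ship by rail in each sub-period (kt), assumed
    capacity  = 50                 # maximum production per sub-period (kt), assumed
    unit_cost = 10.0               # production cost per kt, assumed
    hold_cost = 1.5                # stockpile holding cost per kt per sub-period, assumed
    max_stock = 60                 # stockpile capacity (kt), assumed

    @lru_cache(maxsize=None)
    def best_cost(period, stock):
        """Minimum cost of meeting demand from `period` onward, given the current stockpile."""
        if period == len(demand):
            return 0.0, ()
        best = (float("inf"), ())
        for produce in range(capacity + 1):
            next_stock = stock + produce - demand[period]
            if 0 <= next_stock <= max_stock:      # demand must be met, stockpile bounded
                tail_cost, tail_plan = best_cost(period + 1, next_stock)
                cost = produce * unit_cost + next_stock * hold_cost + tail_cost
                if cost < best[0]:
                    best = (cost, (produce,) + tail_plan)
        return best

    total_cost, plan = best_cost(0, 0)
    print(f"Optimal production per sub-period: {plan}, total cost: {total_cost:.1f}")
    ```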

  7. Official opening remarks: bird conservation in Canada

    Treesearch

    Michael S. W. Bradstreet

    1997-01-01

    It is my pleasure, as Executive Director of Bird Studies Canada (BSC), to take part in the opening ceremonies of the Second International Symposium: Biology and Conservation of Owls of the Northern Hemisphere. It is entirely appropriate that this symposium should take place in one of the colder cities in the Northern Hemisphere, in mid-winter, and it is fitting that...

  8. Reduced state feedback gain computation. [optimization and control theory for aircraft control

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1976-01-01

    Because application of conventional optimal linear regulator theory to flight controller design requires the capability of measuring and/or estimating the entire state vector, it is of interest to consider procedures for computing controls that are restricted to be linear feedback functions of a lower-dimensional output vector and that take into account the presence of measurement noise and process uncertainty. A stochastic linear model is therefore presented that accounts for aircraft parameter and initial uncertainty, measurement noise, turbulence, pilot command, and a restricted number of measurable outputs. Optimization with respect to the corresponding output feedback gains was performed for both finite- and infinite-time performance indices, without gradient computation, by using Zangwill's modification of a procedure originally proposed by Powell. Results using a seventh-order process show the proposed procedures to be very effective.
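
    A minimal modern analogue of the gradient-free gain optimization described above is sketched below: a static output-feedback gain for a toy second-order system is tuned with SciPy's derivative-free Powell method, standing in for the Zangwill/Powell procedure of the report. The system matrices, weights, and initial-state covariance are invented for illustration.

    ```python
    # Sketch only: derivative-free tuning of a static output-feedback gain K for a toy
    # continuous-time system, using SciPy's Powell method. Not the report's algorithm.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov
    from scipy.optimize import minimize

    A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # plant dynamics (assumed)
    B = np.array([[0.0], [1.0]])               # control input matrix (assumed)
    C = np.array([[1.0, 0.0]])                 # only the first state is measured
    Q = np.eye(2)                              # state weighting
    R = np.array([[0.5]])                      # control weighting
    X0 = np.eye(2)                             # initial-state covariance

    def cost(k_flat):
        K = k_flat.reshape(1, 1)               # output feedback u = -K y
        Acl = A - B @ K @ C
        if np.max(np.real(np.linalg.eigvals(Acl))) >= 0:
            return 1e6                         # penalize unstable closed loops
        W = Q + (K @ C).T @ R @ (K @ C)        # effective state weighting
        P = solve_continuous_lyapunov(Acl.T, -W)   # solves Acl' P + P Acl = -W
        return float(np.trace(P @ X0))         # infinite-horizon quadratic cost

    res = minimize(cost, x0=np.array([1.0]), method="Powell")
    print("optimal output-feedback gain:", res.x, "cost:", res.fun)
    ```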

  9. [Treatment of substance dependence by a bio-cognitive model based on behavioral pharmacology].

    PubMed

    Hori, Toru; Komiyama, Tokutaro; Harada, Seiichi; Matsumoto, Takenori

    2005-01-01

    We have introduced cognitive behavior therapy (CBT) into the treatment of patients with substance dependence; it involves disease education and focused group therapy, aiming to provide insight into drug-seeking and drug-taking behavior and to establish concrete countermeasures to prevent relapse. We have created a bio-cognitive model, based on biological aspects, to explain the pathology of substance dependence. 'Dependence' is a term in behavioral pharmacology defined as reinforced drug-seeking and drug-taking behavior. Changes in taking behavior are thought to occur through repetition of the reinforcing action of psychoactive substances in the reward system of the brain. Therefore, when the desire to take the drug is strong, it is hard for patients to control themselves, which makes it difficult for them to follow the thinking process required in CBT. In other words, when craving becomes strong, a chain of behavior happens spontaneously, without schema, involving automatic thoughts. We think that improvement of protracted withdrawal syndrome (PWS) and of overall frontal lobe function is important for learning to discern distortions of cognition. When PWS is improved, a conflict is more easily brought about in the process of drug-seeking and drug-taking behavior, and it becomes easier to execute the avoidance plans (coping skills) established in advance to cope with craving. We think that a goal of treatment is to discern drug-seeking and drug-taking behavior with natural emotion. Recovery from PWS and frontal lobe dysfunction takes a long time in serious dependence, so CBT must be repeated. Since cases introduced to treatment through involuntary admission, or cases limited to 1 to 3 months of admission treatment based on voluntary admission, are hard to treat, treatment that helps patients gain insight while carrying out repeated CBT using the bio-cognitive model, and that improves PWS, could be one possible treatment for the pathology of diversified substance dependence.

  10. Sifting Through SDO's AIA Cosmic Ray Hits to Find Treasure

    NASA Astrophysics Data System (ADS)

    Kirk, M. S.; Thompson, B. J.; Viall, N. M.; Young, P. R.

    2017-12-01

    The Solar Dynamics Observatory's Atmospheric Imaging Assembly (SDO AIA) has revolutionized solar imaging with its high temporal and spatial resolution, unprecedented spatial and temporal coverage, and seven EUV channels. Automated algorithms routinely clean these images to remove cosmic ray intensity spikes as a part of its preprocessing algorithm. We take a novel approach to survey the entire set of AIA "spike" data to identify and group compact brightenings across the entire SDO mission. The AIA team applies a de-spiking algorithm to remove magnetospheric particle impacts on the CCD cameras, but it has been found that compact, intense solar brightenings are often removed as well. We use the spike database to mine the data and form statistics on compact solar brightenings without having to process large volumes of full-disk AIA data. There are approximately 3 trillion "spiked pixels" removed from images over the mission to date. We estimate that 0.001% of those are of solar origin and removed by mistake, giving us a pre-segmented dataset of 30 million events. We explore the implications of these statistics and the physical qualities of the "spikes" of solar origin.

  11. Real-Time Three-Dimensional Cell Segmentation in Large-Scale Microscopy Data of Developing Embryos.

    PubMed

    Stegmaier, Johannes; Amat, Fernando; Lemon, William C; McDole, Katie; Wan, Yinan; Teodoro, George; Mikut, Ralf; Keller, Philipp J

    2016-01-25

    We present the Real-time Accurate Cell-shape Extractor (RACE), a high-throughput image analysis framework for automated three-dimensional cell segmentation in large-scale images. RACE is 55-330 times faster and 2-5 times more accurate than state-of-the-art methods. We demonstrate the generality of RACE by extracting cell-shape information from entire Drosophila, zebrafish, and mouse embryos imaged with confocal and light-sheet microscopes. Using RACE, we automatically reconstructed cellular-resolution tissue anisotropy maps across developing Drosophila embryos and quantified differences in cell-shape dynamics in wild-type and mutant embryos. We furthermore integrated RACE with our framework for automated cell lineaging and performed joint segmentation and cell tracking in entire Drosophila embryos. RACE processed these terabyte-sized datasets on a single computer within 1.4 days. RACE is easy to use, as it requires adjustment of only three parameters, takes full advantage of state-of-the-art multi-core processors and graphics cards, and is available as open-source software for Windows, Linux, and Mac OS. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. JANUS: A Compilation System for Balancing Parallelism and Performance in OpenVX

    NASA Astrophysics Data System (ADS)

    Omidian, Hossein; Lemieux, Guy G. F.

    2018-04-01

    Embedded systems typically do not have enough on-chip memory for an entire image buffer. Programming systems like OpenCV operate on entire image frames at each step, making them use excessive memory bandwidth and power. In contrast, the paradigm used by OpenVX is much more efficient; it uses image tiling, and the compilation system is allowed to analyze and optimize the operation sequence, specified as a compute graph, before doing any pixel processing. In this work, we are building a compilation system for OpenVX that can analyze and optimize the compute graph to take advantage of parallel resources in many-core systems or FPGAs. Using a database of prewritten OpenVX kernels, it automatically adjusts the image tile size and applies kernel duplication and coalescing to meet a defined area (resource) target or a specified throughput target. This allows a single compute graph to target implementations with a wide range of performance needs or capabilities, e.g. from handheld to datacenter, using minimal resources and power to reach the performance target.

  13. Alpha Matting with KL-Divergence Based Sparse Sampling.

    PubMed

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of the foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on the KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.

  14. Analysis of hospital costs as a basis for pricing services in Mali.

    PubMed

    Audibert, Martine; Mathonnat, Jacky; Pareil, Delphine; Kabamba, Raymond

    2007-01-01

    In a move to achieve better equity in the funding of access to health care, particularly for the poor, better efficiency of hospital functioning and a better financial balance, the analysis of hospital costs in Mali brings several key elements to improve the pricing of medical services. The method utilized is the classical step-down process, which takes into consideration the entire set of direct and indirect costs borne by the hospital. Although this approach does not allow the economic cost of consultations to be estimated, it is a useful contribution to assessing the financial activity of the hospital and improving its performance, financially speaking, through a more relevant user-fee policy. The study shows that there are possibilities for cross-subsidies within the hospital or within services which improve the recovery of some of the current costs. It also leads to several proposals for pricing care while taking into account the constraints, the level of the hospital, its specific conditions, and equity. Copyright (c) 2007 John Wiley & Sons, Ltd.
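
    For readers unfamiliar with the classical step-down method mentioned above, the toy sketch below allocates each support department's cost (direct cost plus any cost already received) onto the remaining departments in a fixed order. The departments, costs, and usage shares are hypothetical and are not data from the Mali study.

    ```python
    # Toy sketch of the classical step-down cost-allocation method.
    # Departments, costs, and usage shares are hypothetical.

    direct_costs = {                     # direct costs per department (monetary units)
        "administration": 100.0,         # support department, allocated first
        "laundry":         60.0,         # support department, allocated second
        "surgery":        300.0,         # final (revenue-producing) department
        "outpatient":     200.0,         # final department
    }

    # usage[i][j]: share of department i's services consumed by department j
    usage = {
        "administration": {"laundry": 0.2, "surgery": 0.5, "outpatient": 0.3},
        "laundry":        {"surgery": 0.7, "outpatient": 0.3},
    }

    support_order = ["administration", "laundry"]   # each support dept is allocated once
    full_costs = dict(direct_costs)

    for dept in support_order:
        pool = full_costs.pop(dept)                 # total cost to distribute (direct + received)
        for receiver, share in usage[dept].items():
            if receiver in full_costs:              # never allocate back to a closed department
                full_costs[receiver] += pool * share

    print(full_costs)   # full (direct + indirect) cost per final department
    ```

    In this example the full costs of the final departments sum to the hospital's total direct cost, which is the basic sanity check of any step-down allocation.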

  15. Big Data and High-Performance Computing in Global Seismology

    NASA Astrophysics Data System (ADS)

    Bozdag, Ebru; Lefebvre, Matthieu; Lei, Wenjie; Peter, Daniel; Smith, James; Komatitsch, Dimitri; Tromp, Jeroen

    2014-05-01

    Much of our knowledge of Earth's interior is based on seismic observations and measurements. Adjoint methods provide an efficient way of incorporating 3D full wave propagation in iterative seismic inversions to enhance tomographic images and thus our understanding of processes taking place inside the Earth. Our aim is to take adjoint tomography, which has been successfully applied to regional and continental scale problems, further to image the entire planet. This is one of the extreme imaging challenges in seismology, mainly due to the intense computational requirements and the vast amount of high-quality seismic data that can potentially be assimilated. We have started low-resolution inversions (T > 30 s and T > 60 s for body and surface waves, respectively) with a limited data set (253 carefully selected earthquakes and seismic data from permanent and temporary networks) on Oak Ridge National Laboratory's Cray XK7 "Titan" system. Recent improvements in our 3D global wave propagation solvers, such as a GPU version of the SPECFEM3D_GLOBE package, will enable us to perform higher-resolution (T > 9 s) and longer-duration (~180 min) simulations to take advantage of high-frequency body waves and major-arc surface waves, thereby improving the imbalanced ray coverage that results from the uneven global distribution of sources and receivers. Our ultimate goal is to use all earthquakes in the global CMT catalogue within the magnitude range of our interest and data from all available seismic networks. To take full advantage of computational resources, we need a solid framework to manage big data sets during numerical simulations, pre-processing (i.e., data requests and quality checks, processing data, window selection, etc.) and post-processing (i.e., pre-conditioning and smoothing kernels, etc.). We address the bottlenecks in our global seismic workflow, which mainly come from heavy I/O traffic during simulations and the pre- and post-processing stages, by defining new data formats for seismograms and for the outputs of our 3D solvers (i.e., meshes, kernels, seismic models, etc.) based on ORNL's ADIOS libraries. We will discuss our global adjoint tomography workflow on HPC systems as well as the current status of our global inversions.

  16. Turbulent mixing and removal of ozone within an Amazon rainforest canopy

    NASA Astrophysics Data System (ADS)

    Freire, L. S.; Gerken, T.; Ruiz-Plancarte, J.; Wei, D.; Fuentes, J. D.; Katul, G. G.; Dias, N. L.; Acevedo, O. C.; Chamecki, M.

    2017-03-01

    Simultaneous profiles of turbulence statistics and mean ozone mixing ratio are used to establish a relation between eddy diffusivity and ozone mixing within the Amazon forest. A one-dimensional diffusion model is proposed and used to infer mixing time scales from the eddy diffusivity profiles. Data and model results indicate that during daytime conditions, the upper (lower) half of the canopy is well (partially) mixed most of the time and that most of the vertical extent of the forest can be mixed in less than an hour. During nighttime, most of the canopy is predominantly poorly mixed, except for periods with bursts of intermittent turbulence. Even though turbulence is faster than chemistry during daytime, both processes have comparable time scales in the lower canopy layers during nighttime conditions. Non-chemical loss time scales (associated with stomatal uptake and dry deposition) for the entire forest are comparable to the turbulent mixing time scale in the lower canopy during the day and in the entire canopy during the night, indicating a tight coupling between turbulent transport and the dry deposition and stomatal uptake processes. Because the turbulent mixing time scale inside the canopy varies significantly with time of day and height, it is important to take it into account when studying chemical and biophysical processes in the forest environment. The method proposed here for estimating turbulent mixing time scales is a reliable alternative to currently used models, especially for situations in which the vertical distribution of the time scale is relevant.

  17. On Temperature Rise Within the Shear Bands in Bulk Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Bazlov, A. I.; Churyumov, A. Yu.; Buchet, M.; Louzguine-Luzgin, D. V.

    2018-05-01

    The room-temperature deformation process in a bulk metallic glass sample was studied using a hydraulic thermomechanical simulator. The temperature rise during each separate shear-band propagation event was measured at a high data-acquisition frequency by a thermocouple welded to the sample. Calculations showed that when a well-developed shear band propagates along the entire sample, the temperature inside the shear band should be close to the glass-transition temperature. It was also possible to resolve the temporal stress distribution, and a two-stage character of the stress drops was observed. The obtained results are compared with literature data obtained by infrared camera measurements and with the results of finite element modeling.

  18. On Temperature Rise Within the Shear Bands in Bulk Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Bazlov, A. I.; Churyumov, A. Yu.; Buchet, M.; Louzguine-Luzgin, D. V.

    2018-03-01

    The room-temperature deformation process in a bulk metallic glass sample was studied using a hydraulic thermomechanical simulator. The temperature rise during each separate shear-band propagation event was measured at a high data-acquisition frequency by a thermocouple welded to the sample. Calculations showed that when a well-developed shear band propagates along the entire sample, the temperature inside the shear band should be close to the glass-transition temperature. It was also possible to resolve the temporal stress distribution, and a two-stage character of the stress drops was observed. The obtained results are compared with literature data obtained by infrared camera measurements and with the results of finite element modeling.

  19. Laser cooling at resonance

    NASA Astrophysics Data System (ADS)

    Yudkin, Yaakov; Khaykovich, Lev

    2018-05-01

    We show experimentally that three-dimensional laser cooling of lithium atoms on the D2 line is possible when the laser light is tuned exactly to resonance with the dominant atomic transition. Qualitatively, this can be understood by applying simple Doppler cooling arguments to the specific hyperfine structure of the excited state of lithium, which is both dense and inverted. However, to build a quantitative theory, we must resort to a full model that takes into account both the entire atomic structure of all 24 Zeeman sublevels and the laser light polarization. Moreover, by means of Monte Carlo simulations, we show that coherent processes play an important role in establishing consistency between the theory and the experimental results.

  20. Vitrification of waste with continuous filling and sequential melting

    DOEpatents

    Powell, James R.; Reich, Morris

    2001-09-04

    A method of filling a canister with vitrified waste starting with a waste, such as high-level radioactive waste, that is cooler than its melting point. Waste is added incrementally to a canister forming a column of waste capable of being separated into an upper zone and a lower zone. The minimum height of the column is defined such that the waste in the lower zone can be dried and melted while maintaining the waste in the upper zone below its melting point. The maximum height of the column is such that the upper zone remains porous enough to permit evolved gases from the lower zone to flow through the upper zone and out of the canister. Heat is applied to the waste in the lower zone to first dry then to raise and maintain its temperature to a target temperature above the melting point of the waste. Then the heat is applied to a new lower zone above the melted waste and the process of adding, drying and melting the waste continues upward in the canister until the entire canister is filled and the entire contents are melted and maintained at the target temperature for the desired period. Cooling of the melted waste takes place incrementally from the bottom of the canister to the top, or across the entire canister surface area, forming a vitrified product.

  1. Etravirine

    MedlinePlus

    ... HIV in the blood. Although etravirine does not cure HIV, it may decrease your chance of developing acquired ... entire dose is taken.Etravirine helps to control HIV infection, but does not cure it. Continue to take etravirine even if you ...

  2. Comment on "falsification of the Atmospheric CO2 Greenhouse Effects Within the Frame of Physics"

    NASA Astrophysics Data System (ADS)

    Halpern, Joshua B.; Colose, Christopher M.; Ho-Stuart, Chris; Shore, Joel D.; Smith, Arthur P.; Zimmermann, Jörg

    In this journal, Gerhard Gerlich and Ralf D. Tscheuschner claim to have falsified the existence of an atmospheric greenhouse effect.1 Here, we show that their methods, logic, and conclusions are in error. Their most significant errors include trying to apply the Clausius statement of the Second Law of Thermodynamics to only one side of a heat transfer process rather than the entire process, and systematically ignoring most non-radiative heat flows applicable to the Earth's surface and atmosphere. They claim that radiative heat transfer from a colder atmosphere to a warmer surface is forbidden, ignoring the larger transfer in the other direction which makes the complete process allowed. Further, by ignoring heat capacity and non-radiative heat flows, they claim that radiative balance requires that the surface cool by 100 K or more at night, an obvious absurdity induced by an unphysical assumption. This comment concentrates on these two major points, while also taking note of some of Gerlich and Tscheuschner's other errors and misunderstandings.
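
    The role of surface heat capacity in that argument can be illustrated with a back-of-the-envelope integration: with a plausible heat capacity for the radiating soil layer and a typical downwelling longwave flux, overnight cooling comes out on the order of 10 K rather than 100 K. All numbers below are rough illustrative assumptions, not values from the comment or the original paper.

    ```python
    # Back-of-the-envelope sketch: overnight surface cooling when heat capacity and
    # downwelling longwave radiation are included. All numbers are rough assumptions.

    SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
    C_SURF = 2.0e5       # effective heat capacity of the radiating soil layer, J m^-2 K^-1 (assumed)
    LW_DOWN = 330.0      # downwelling longwave flux from the atmosphere, W m^-2 (assumed)
    EMISSIVITY = 0.95

    T = 288.0            # surface temperature at sunset, K
    dt = 60.0            # time step, s
    for _ in range(int(12 * 3600 / dt)):          # integrate over a 12-hour night
        net_loss = EMISSIVITY * SIGMA * T**4 - EMISSIVITY * LW_DOWN   # W m^-2
        T -= net_loss * dt / C_SURF

    print(f"surface temperature after 12 h: {T:.1f} K (cooled by {288.0 - T:.1f} K)")
    ```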

  3. Identifying the Root Causes of Wait States in Large-Scale Parallel Applications

    DOE PAGES

    Böhme, David; Geimer, Markus; Arnold, Lukas; ...

    2016-07-20

    Driven by growing application requirements and accelerated by current trends in microprocessor design, the number of processor cores on modern supercomputers is increasing from generation to generation. However, load or communication imbalance prevents many codes from taking advantage of the available parallelism, as delays of single processes may spread wait states across the entire machine. Moreover, when employing complex point-to-point communication patterns, wait states may propagate along far-reaching cause-effect chains that are hard to track manually and that complicate an assessment of the actual costs of an imbalance. Building on earlier work by Meira Jr. et al., we present a scalable approach that identifies program wait states and attributes their costs in terms of resource waste to their original cause. Ultimately, by replaying event traces in parallel both forward and backward, we can identify the processes and call paths responsible for the most severe imbalances even for runs with hundreds of thousands of processes.

  4. Identifying the Root Causes of Wait States in Large-Scale Parallel Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Böhme, David; Geimer, Markus; Arnold, Lukas

    Driven by growing application requirements and accelerated by current trends in microprocessor design, the number of processor cores on modern supercomputers is increasing from generation to generation. However, load or communication imbalance prevents many codes from taking advantage of the available parallelism, as delays of single processes may spread wait states across the entire machine. Moreover, when employing complex point-to-point communication patterns, wait states may propagate along far-reaching cause-effect chains that are hard to track manually and that complicate an assessment of the actual costs of an imbalance. Building on earlier work by Meira Jr. et al., we present a scalable approach that identifies program wait states and attributes their costs in terms of resource waste to their original cause. Ultimately, by replaying event traces in parallel both forward and backward, we can identify the processes and call paths responsible for the most severe imbalances even for runs with hundreds of thousands of processes.

  5. Dose rate prediction methodology for remote handled transuranic waste workers at the waste isolation pilot plant.

    PubMed

    Hayes, Robert

    2002-10-01

    An approach is described for estimating future dose rates to Waste Isolation Pilot Plant workers processing remote handled transuranic waste. The waste streams will come from the entire U.S. Department of Energy complex and can take on virtually any form arising from the processing sequences for defense-related production, radiochemistry, activation, and related work. For this reason, the average waste matrix from all generator sites is used to estimate the average radiation fields over the facility lifetime. Innovative new techniques were applied to estimate the expected radiation fields. Non-linear curve fitting techniques were used to predict exposure rate profiles from cylindrical sources using closed-form equations for lines and disks. This information becomes the basis for Safety Analysis Report dose rate estimates and for present and future ALARA design reviews when attempts are made to reduce worker doses.
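
    The kind of closed-form fit described above can be sketched as follows: synthetic survey data are fitted to the standard unshielded line-source expression with a single source-strength parameter. The canister length, the survey points, and the use of SciPy's curve_fit are illustrative assumptions, not the facility's actual methodology.

    ```python
    # Hedged sketch: fitting exposure rates around a cylindrical source to the standard
    # unshielded line-source expression  X(d) = k / (L*d) * 2*arctan(L / (2*d)),
    # where d is the perpendicular distance from the source midline, L the source length,
    # and k lumps together activity and the exposure-rate constant. Data are synthetic.

    import numpy as np
    from scipy.optimize import curve_fit

    L = 3.0  # source length in metres (assumed)

    def line_source(d, k):
        """Exposure rate at perpendicular distance d from the midpoint of a line source."""
        return k / (L * d) * 2.0 * np.arctan(L / (2.0 * d))

    # synthetic survey data: distances (m) and exposure rates (mR/h), invented for illustration
    distances = np.array([0.3, 0.5, 1.0, 2.0, 4.0])
    rates     = np.array([92.0, 50.0, 20.0, 6.5, 1.8])

    (k_fit,), _ = curve_fit(line_source, distances, rates, p0=[20.0])
    print(f"fitted source-strength parameter k = {k_fit:.1f}")
    print("predicted rate at 30 cm:", round(line_source(0.3, k_fit), 1), "mR/h")
    ```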

  6. Apparatus for Investigating Momentum and Energy Conservation With MBL and Video Analysis

    NASA Astrophysics Data System (ADS)

    George, Elizabeth; Vazquez-Abad, Jesus

    1998-04-01

    We describe the development and use of a laboratory setup that is appropriate for computer-aided student investigation of the principles of conservation of momentum and mechanical energy in collisions. The setup consists of two colliding carts on a low-friction track, with one of the carts (the target) attached to a spring, whose extension or compression takes the place of the pendulum's rise in the traditional ballistic pendulum apparatus. Position vs. time data for each cart are acquired either by using two motion sensors or by digitizing images obtained with a video camera. This setup allows students to examine the time history of momentum and mechanical energy during the entire collision process, rather than simply focusing on the before and after regions. We believe that this setup is suitable for helping students gain understanding as the processes involved are simple to follow visually, to manipulate, and to analyze.
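
    A minimal sketch of the student analysis enabled by this setup is shown below: positions from the motion sensors or video are differentiated numerically, and total momentum and kinetic energy are followed through the collision. The masses and the toy trajectories are invented for illustration.

    ```python
    # Sketch of the time-history analysis: numerically differentiate each cart's
    # position data, then track total momentum and kinetic energy through the collision.
    # Masses and sample trajectories are invented for illustration.

    import numpy as np

    m1, m2 = 0.5, 0.5          # cart masses in kg (assumed)

    t  = np.linspace(0.0, 1.0, 201)                   # time stamps from sensor / video frames
    x1 = np.where(t < 0.4, 0.3 * t, 0.12)             # incoming cart position, toy data (m)
    x2 = np.where(t < 0.4, 0.50, 0.50 + 0.3*(t-0.4))  # target cart position, toy data (m)

    v1 = np.gradient(x1, t)                           # finite-difference velocities
    v2 = np.gradient(x2, t)

    momentum = m1 * v1 + m2 * v2                      # total momentum vs. time
    kinetic  = 0.5 * m1 * v1**2 + 0.5 * m2 * v2**2    # total kinetic energy vs. time
    # The spring's potential energy, 0.5 * k * extension**2, would be added from the
    # measured spring extension to obtain the full mechanical energy.

    print("momentum before/after:", momentum[10].round(3), momentum[-10].round(3), "kg*m/s")
    print("kinetic energy before/after:", kinetic[10].round(4), kinetic[-10].round(4), "J")
    ```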

  7. Occurrence and Characteristics of a Rapid Exchange of Phosphate Oxygens Catalyzed by Sarcoplasmic Reticulum Vesicles

    DOE R&D Accomplishments Database

    Kanazawa, T.; Boyer, P. D.

    1972-01-01

    Sarcoplasmic reticulum vesicles isolated from skeletal muscle actively take up Ca²⁺ from the medium in the presence of Mg²⁺ and ATP. This transport is coupled to ATP hydrolysis catalyzed by the membrane-bound Ca²⁺, Mg²⁺-ATPase, which is activated by the concurrent presence of Ca²⁺ and Mg²⁺. Considerable information has accumulated that gives insight into the ATPase and its coupling to calcium transport. The hydrolysis of ATP by this enzyme occurs through a phosphorylated intermediate. Formation and decomposition of the intermediate show vectorial requirements for Ca²⁺ and Mg²⁺, suggesting an intimate involvement of the intermediate in the transport process. ATP synthesis from Pi and ADP coupled to outflow of Ca²⁺ from sarcoplasmic reticulum vesicles has recently been demonstrated. This indicates the reversibility of the entire process of calcium transport in sarcoplasmic reticulum vesicles.

  8. Implementation of standardization in clinical practice: not always an easy task.

    PubMed

    Panteghini, Mauro

    2012-02-29

    As soon as a new reference measurement system is adopted, clinical validation of correctly calibrated commercial methods should take place. Tracing back the calibration of routine assays to a reference system can actually modify the relation of analyte results to existing reference intervals and decision limits, and this may invalidate some of the clinical decision-making criteria currently used. To preserve the accumulated clinical experience, the quantitative relationship to the previous calibration system should be established and, if necessary, the clinical decision-making criteria should be adjusted accordingly. The implementation of standardization should take place in a concerted action of laboratorians, manufacturers, external quality assessment scheme organizers and clinicians. Dedicated meetings with manufacturers should be organized to discuss the process of assay recalibration, and studies should be performed to obtain convincing evidence that the standardization works, improving result comparability. Another important issue relates to the surveillance of the performance of standardized assays through the organization of appropriate analytical internal and external quality controls. Last but not least, measurement uncertainty that is fit for this purpose must be defined across the entire traceability chain, starting with the available reference materials, extending through the manufacturers and their processes for assignment of calibrator values, and ultimately to the final result reported to clinicians by laboratories.
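
    One simple way to establish the quantitative relationship to the previous calibration, as recommended above, is a split-sample comparison followed by regression, after which existing decision limits can be re-expressed on the new calibration. The sketch below uses ordinary least squares on invented data; in practice, Deming or Passing-Bablok regression, which allow for error in both methods, would usually be preferred.

    ```python
    # Hedged sketch of relating the previous and the newly standardized calibration and
    # of transferring a clinical decision limit. Split-sample results are invented.

    import numpy as np

    old = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4, 9.1])   # results with the previous calibration
    new = np.array([1.0, 2.2, 2.8, 4.3, 5.5, 6.8, 8.4])   # same samples, recalibrated assay

    slope, intercept = np.polyfit(old, new, 1)             # new ≈ slope * old + intercept
    old_decision_limit = 5.0                                # existing clinical cut-off (old units)
    new_decision_limit = slope * old_decision_limit + intercept

    print(f"new = {slope:.3f} * old + {intercept:.3f}")
    print(f"decision limit re-expressed on the new calibration: {new_decision_limit:.2f}")
    ```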

  9. 50 CFR 217.80 - Specified activity and specified geographical region.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... training team has two days to complete their entire evolution (i.e., detonation of five charges). If... utilized to complete the evolution. (b) The incidental take of marine mammals at Eglin Air Force Base...

  10. 50 CFR 217.80 - Specified activity and specified geographical region.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... training team has two days to complete their entire evolution (i.e., detonation of five charges). If... utilized to complete the evolution. (b) The incidental take of marine mammals at Eglin Air Force Base...

  11. 50 CFR 217.80 - Specified activity and specified geographical region.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... training team has two days to complete their entire evolution (i.e., detonation of five charges). If... utilized to complete the evolution. (b) The incidental take of marine mammals at Eglin Air Force Base...

  12. KENNEDY SPACE CENTER, FLA. - Congressman Tom Feeney (left) and Deputy Director Woodrow Whitlow Jr. take an air boat ride around Kennedy Space Center. During January and February, Congressman Feeney traveled the entire coastline of Florida’s 24th District, and concluded his walks March 1 in Brevard County. On his walks, he met with constituents and community leaders to discuss legislative issues that will be addressed by the 108th Congress. Feeney ended his beach walk at the KSC Visitor Complex main entrance.

    NASA Image and Video Library

    2004-03-01

    KENNEDY SPACE CENTER, FLA. - Congressman Tom Feeney (left) and Deputy Director Woodrow Whitlow Jr. take an air boat ride around Kennedy Space Center. During January and February, Congressman Feeney traveled the entire coastline of Florida’s 24th District, and concluded his walks March 1 in Brevard County. On his walks, he met with constituents and community leaders to discuss legislative issues that will be addressed by the 108th Congress. Feeney ended his beach walk at the KSC Visitor Complex main entrance.

  13. Mechanisms of Molecular Mimicry of Plant CLE Peptide Ligands by the Parasitic Nematode Globodera rostochiensis

    PubMed Central

    Guo, Yongfeng; Ni, Jun; Denver, Robert; Wang, Xiaohong; Clark, Steven E.

    2011-01-01

    Nematodes that parasitize plant roots cause huge economic losses and have few mechanisms for control. Many parasitic nematodes infect plants by reprogramming root development to drive the formation of feeding structures. How nematodes take control of plant development is largely unknown. Here, we identify two host factors involved in the function of a receptor ligand mimic, GrCLE1, secreted by the potato cyst nematode Globodera rostochiensis. GrCLE1 is correctly processed to an active form by host plant proteases. Processed GrCLE1 peptides bind directly to the plant CLE receptors CLV2, BAM1, and BAM2. Involvement of these receptors in the ligand-mimicking process is also supported by the fact that the ability of GrCLE1 peptides to alter plant root development in Arabidopsis (Arabidopsis thaliana) is dependent on these receptors. Critically, we also demonstrate that GrCLE1 maturation can be entirely carried out by plant factors and that the availability of CLE processing activity may be essential for successful ligand mimicry. PMID:21750229

  14. Mechanisms of molecular mimicry of plant CLE peptide ligands by the parasitic nematode Globodera rostochiensis.

    PubMed

    Guo, Yongfeng; Ni, Jun; Denver, Robert; Wang, Xiaohong; Clark, Steven E

    2011-09-01

    Nematodes that parasitize plant roots cause huge economic losses and have few mechanisms for control. Many parasitic nematodes infect plants by reprogramming root development to drive the formation of feeding structures. How nematodes take control of plant development is largely unknown. Here, we identify two host factors involved in the function of a receptor ligand mimic, GrCLE1, secreted by the potato cyst nematode Globodera rostochiensis. GrCLE1 is correctly processed to an active form by host plant proteases. Processed GrCLE1 peptides bind directly to the plant CLE receptors CLV2, BAM1, and BAM2. Involvement of these receptors in the ligand-mimicking process is also supported by the fact that the ability of GrCLE1 peptides to alter plant root development in Arabidopsis (Arabidopsis thaliana) is dependent on these receptors. Critically, we also demonstrate that GrCLE1 maturation can be entirely carried out by plant factors and that the availability of CLE processing activity may be essential for successful ligand mimicry.

  15. Adapting to wildfire: Moving beyond homeowner risk perceptions to taking action

    Treesearch

    Patricia Champ

    2017-01-01

    Champ’s presentation focused on how to get homeowners to take action to protect their properties from fire. She framed this challenge as a last-mile problem, which is a concept from the literature on supply chain. The last mile is the end of the supply chain where a product is transferred to the customer. The last mile is often the most difficult part of the entire...

  16. Succeeding at succession: the myth of Orestes.

    PubMed

    Eisold, Kenneth

    2008-11-01

    Although the myth of Oedipus seems an inevitable template for understanding succession in psychoanalysis, the myth of Orestes offers a more complex and promising view of the intergenerational transfer of leadership and authority, one that takes into account the entire community, not merely the individual leader. A closer look at the Aeschylus drama suggests three dimensions that need to be taken into account in managing succession: what are the mechanisms enabling the community to participate, what is the role of the unconscious irrational forces inevitably aroused in the process, and what are the wider social and economic issues that need to be addressed? This paper looks at the myth elaborated in the Greek drama, and then applies it to some of the current problems facing contemporary psychoanalytic institutions.

  17. The GAMCIT gamma ray burst detector

    NASA Technical Reports Server (NTRS)

    Mccall, Benjamin J.; Grunsfeld, John M.; Sobajic, Srdjan D.; Chang, Chinley Leonard; Krum, David M.; Ratner, Albert; Trittschuh, Jennifer E.

    1993-01-01

    The GAMCIT payload is a Get-Away-Special payload designed to search for high-energy gamma-ray bursts and any associated optical transients. This paper presents details on the design of the GAMCIT payload, in the areas of battery selection, power processing, electronics design, gamma-ray detection systems, and the optical imaging of the transients. The paper discusses the progress of the construction, testing, and specific design details of the payload. In addition, this paper discusses the unique challenges involved in bringing this payload to completion, as the project has been designed, constructed, and managed entirely by undergraduate students. Our experience will certainly be valuable to other student groups interested in taking on a challenging project such as a Get-Away-Special payload.

  18. The interaction of ultra-low-frequency pc3-5 waves with charged particles in Earth's magnetosphere

    NASA Astrophysics Data System (ADS)

    Zong, Qiugang; Rankin, Robert; Zhou, Xuzhi

    2017-12-01

    One of the most important issues in space physics is to identify the dominant processes that transfer energy from the solar wind to energetic particle populations in Earth's inner magnetosphere. Ultra-low-frequency (ULF) waves are an important consideration as they propagate electromagnetic energy over vast distances with little dissipation and interact with charged particles via drift resonance and drift-bounce resonance. ULF waves also take part in magnetosphere-ionosphere coupling and thus play an essential role in regulating energy flow throughout the entire system. This review summarizes recent advances in the characterization of ULF Pc3-5 waves in different regions of the magnetosphere, including ion and electron acceleration associated with these waves.
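
    For reference, the resonance conditions mentioned above are commonly written in the compact form below; this is a standard textbook statement, not a formula quoted from the review.

    ```latex
    % ULF wave of angular frequency \omega and azimuthal wave number m interacting with a
    % particle of bounce-averaged drift frequency \omega_d and bounce frequency \omega_b;
    % N = 0 corresponds to drift resonance, N = \pm 1, \pm 2, \ldots to drift-bounce resonance.
    \omega - m\,\omega_d = N\,\omega_b
    ```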

  19. A five-phase process model describing the return to sustainable work of persons who survived cancer: A qualitative study.

    PubMed

    Brusletto, Birgit; Torp, Steffen; Ihlebæk, Camilla Martha; Vinje, Hege Forbech

    2018-06-01

    We investigated persons who survived cancer (PSC) and their experiences in returning to sustainable work. Videotaped, qualitative, in-depth interviews with previous cancer patients were analyzed directly using Interpretative Phenomenological Analysis (IPA). Four men and four women aged 42-59 years participated. Mean time since last treatment was nine years. All participants had worked for more than 3 years when interviewed. An advisory team of seven members with diverse cancer experiences contributed as co-researchers. The entire trajectory from cancer diagnosis until achievement of sustainable work was analogous to a journey, and a process model comprising five phases was developed, covering personal situations, treatments, and work issues. The theme "return-to-work" (RTW) turned out to be difficult to separate from the entire journey that started at the time of diagnosis. PSCs were mainly concerned about fighting for life in phases 1 and 2. In phases 3 and 4, some participants had to adjust and make changes at work more than once over a period of 1-10 years before reaching sustainable work in phase 5. Overall, the ability to adapt to new circumstances, take advantage of emerging opportunities, and find meaningful occupational activities was crucial. Our process model may be useful as a tool when discussing the future working life of PSCs. Every individual's journey towards sustainable work was unique and contained distinct and long-lasting efforts and difficulties. The first attempt to RTW after cancer may not be persistent. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Integrated structure vacuum tube: A Concept

    NASA Technical Reports Server (NTRS)

    Dimeff, J.; Kerwin, W. J.

    1974-01-01

    Cathode emission is made to occur by heating the entire structure to 600 C, and a positive potential is applied to the anode with a negative potential on the grids. Electron flow takes place from the ring to the circular anode through the electric field produced by the grids.

  1. From the ORFeome concept to highly comprehensive, full-genome screening libraries.

    PubMed

    Rid, Raphaela; Abdel-Hadi, Omar; Maier, Richard; Wagner, Martin; Hundsberger, Harald; Hintner, Helmut; Bauer, Johann; Onder, Kamil

    2013-02-01

    Recombination-based cloning techniques have in recent times facilitated the establishment of genome-scale single-gene ORFeome repositories. Their further handling and downstream application in systematic fashion is, however, practically impeded by logistical and economic challenges. At this juncture, simultaneously transferring entire gene collections in compiled pool format could represent an advanced compromise between systematic ORFeome (an organism's entire set of protein-encoding open reading frames) projects and traditional random library approaches, but has not yet been considered in great detail. In our endeavor to merge the comprehensiveness of ORFeomes with a basically simple, streamlined, and easily executable single-tube design, we have produced five different pooled screening-ready libraries for both Staphylococcus aureus and Homo sapiens. By evaluating the parallel transfer efficiencies of differentially sized genes, from initial polymerase chain reaction (PCR) product amplification to entry and final destination library construction, via quantitative real-time PCR, we found that the complexity of the gene population is fairly stably maintained once an entry resource has been successfully established, and that no apparent size-selection bias (loss of large inserts) takes place. Recombinational transfer processes are hence robust enough for straightforwardly producing such pooled screening libraries.

  2. Gravitational sliding of the Mt. Etna massif along a sloping basement

    NASA Astrophysics Data System (ADS)

    Murray, John B.; van Wyk de Vries, Benjamin; Pitty, Andy; Sargent, Phil; Wooller, Luke

    2018-04-01

    Geological field evidence and laboratory modelling indicate that volcanoes constructed on slopes slide downhill. If this happens on an active volcano, then the movement will distort deformation data and thus potentially compromise interpretation. Our recent GPS measurements demonstrate that the entire edifice of Mt. Etna is sliding to the ESE, the overall direction of slope of its complex, rough sedimentary basement. We report methods of discriminating the sliding vector from other deformation processes and of measuring its velocity, which averaged 14 mm year-1 during four intervals between 2001 and 2012. Though sliding of one sector of a volcano due to flank instability is widespread and well-known, this is the first time basement sliding of an entire active volcano has been directly observed. This is important because the geological record shows that such sliding volcanoes are prone to devastating sector collapse on the downslope side, and whole volcano migration should be taken into account when assessing future collapse hazard. It is also important in eruption forecasting, as the sliding vector needs to be allowed for when interpreting deformation events that take place above the sliding basement within the superstructure of the active volcano, as might occur with dyke intrusion or inflation/deflation episodes.

  3. Formation of a cavitation cluster in the vicinity of a quasi-empty rupture

    NASA Astrophysics Data System (ADS)

    Bol'shakova, E. S.; Kedrinskiy, V. K.

    2017-09-01

    The presentation deals with one of the experimental and numerical models of a quasi-empty rupture in a magma melt. This rupture is formed in a layer of distilled cavitating liquid under shock loading, within the framework of a problem formulation using a small electromagnetic hydrodynamic shock tube. It is demonstrated that the rupture is shaped as a spherical segment, which retains its topology during the entire process of its evolution and collapse. The dynamic behavior of the quasi-empty rupture is analyzed, and growth of cavitation nuclei is found in the form of a boundary layer near the entire rupture interface. It is shown that rupture implosion is accompanied by the transformation of the bubble boundary layer into a cavitating cluster, which takes the form of a ring-shaped vortex floating upward to the free surface of the liquid layer. A p-κ mathematical model is formulated, and calculations are performed to investigate the implosion of a quasi-empty spherical cavity in the cavitating liquid, the generation of a shock wave by this cavity, and the growth of the bubble density in the cavitating cluster by five orders of magnitude.

  4. Understanding and Capturing People’s Mobile App Privacy Preferences

    DTIC Science & Technology

    2013-10-28

    The entire apps' metadata takes up about 500MB of storage space when stored in a MySQL database and all the binary files take approximately 300GB of... functionality that can decompile Dalvik bytecodes to Java source code faster than other decompilers. Given the scale of the app analysis we planned on... java libraries, such as parser, sql connectors, etc. Targeted Ads 137 admob, adwhirl, greystripe… Provided by mobile behavioral ads company to

  5. Air Force Members’ Guide for Reducing Cholesterol Ratio,

    DTIC Science & Technology

    1987-04-01

    this would limit the amount of cod liver oil you could take safely. You should take fish oil supplements that are high in EPA (eicosapentaenoic acid)... TABLE 1--Fatty Acid Content of Oils and Spreads... TABLE 2--Omega-3 Grams per 3 1/2 oz... entirely eliminate saturated fats. So, you need to try and have a balance of the fats in your diet. Fatty Acid Content of Oils and Spreads

  6. Development of High Throughput Process for Constructing 454 Titanium and Illumina Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshpande, Shweta; Hack, Christopher; Tang, Eric

    2010-05-28

    We have developed two processes with the Biomek FX robot to construct 454 Titanium and Illumina libraries in order to meet increasing library demands. All modifications in the library construction steps were made to enable the adaptation of the entire processes to the 96-well plate format. The key modifications include the shearing of DNA with the Covaris E210 and the enzymatic reaction clean-up and fragment size selection with SPRI beads and magnetic plate holders. The construction of 96 Titanium libraries takes about 8 hours from sheared DNA to ssDNA recovery. The processing of 96 Illumina libraries takes less time than the Titanium library process. Although both processes still require manual transfer of plates from the robot to other work stations such as thermocyclers, these robotic processes represent about a 12- to 24-fold increase in library capacity compared to the manual processes. To enable the sequencing of many libraries in parallel, we have also developed sets of molecular barcodes for both library types. The requirements for the 454 library barcodes include 10 bases, 40-60% GC, no consecutive identical bases, and no less than 3 base differences between barcodes. We have used 96 of the resulting 270 barcodes to construct libraries and pooled them to test the ability to accurately assign reads to the right samples. When allowing 1 base error in the 10-base barcodes, we could assign 99.6% of the total reads, and 100% of them were uniquely assigned. As for the Illumina barcodes, the requirements include 4 bases, balanced GC, and at least 2 base differences between barcodes. We have begun to assess the ability to assign reads after pooling different numbers of libraries. We will discuss the progress and the challenges of these scale-up processes.
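
    The 454 barcode design rules quoted above (10 bases, 40-60% GC, no two consecutive identical bases, at least 3 base differences between any pair) can be checked mechanically. The sketch below is a generic validator written for illustration; it is not the generator actually used for the 270 barcodes.

      # Hedged sketch: check a candidate set of 454-style barcodes against the
      # rules quoted in the abstract (10 bases, 40-60% GC, no consecutive repeated
      # base, pairwise Hamming distance >= 3). Example barcodes are made up.

      def valid_single(bc):
          if len(bc) != 10 or any(b not in "ACGT" for b in bc):
              return False
          gc = (bc.count("G") + bc.count("C")) / len(bc)
          if not (0.4 <= gc <= 0.6):
              return False
          return all(a != b for a, b in zip(bc, bc[1:]))  # no consecutive identical bases

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      def valid_set(barcodes, min_dist=3):
          if not all(valid_single(bc) for bc in barcodes):
              return False
          return all(hamming(a, b) >= min_dist
                     for i, a in enumerate(barcodes) for b in barcodes[i + 1:])

      candidates = ["ACGTACGTAC", "AGCTAGCTAG", "ACGTACGTAG"]  # last differs from first by 1 base
      print(valid_set(candidates))  # False: first and third are only 1 base apart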

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, E.C.; Killough, S.M.; Rowe, J.C.

    The purpose of the Smart Crane Ammunition Transfer System (SCATS) project is to demonstrate robotic/telerobotic controls technology for a mobile articulated crane for missile/munitions handling, delivery, and reload. Missile resupply and reload have been manually intensive operations up to this time. Currently, reload missiles are delivered by truck to the site of the launcher. A crew of four to five personnel reloads the missiles from the truck to the launcher using a hydraulic-powered crane. The missiles are handled carefully for the safety of the missiles and personnel. Numerous steps are required in the reload process, and the entire reload operation can take over 1 h for some missile systems. Recent U.S. Army directives require the entire operation to be accomplished in a fraction of that time. Current requirements for the development of SCATS are based primarily on reloading Patriot missiles. The planned development approach will integrate robotic control and sensor technology with a commercially available hydraulic articulated crane. SCATS is being developed with commercially available hardware as much as possible. Development plans include adding a three-degree-of-freedom end effector with a grapple to the articulating crane; closed-loop position control for the crane and end effector; digital microprocessor control of crane functions; a simplified operator interface; and operating modes which include rectilinear movement, obstacle avoidance, and partially automated operation. The planned development will include progressive technology demonstrations. Ultimate plans are for this technology to be transferred and utilized in the military fielding process.

  8. Enhancing professionalism among engineering students through involvements in technical societies.

    PubMed

    Ghosh, Sreejita; Samineni, Anvesh; Mandal, Subhamoy; Murari, Bhaskar Mohan

    2015-08-01

    A student chapter can be considered a miniature enterprise, albeit without the latter's major financial risks. Involvement in the student chapter of a professional society such as IEEE at the undergraduate level plays a pivotal role in a student's overall professional development by keeping students informed about the various career possibilities. A student chapter shapes hitherto naive students into industry-ready professionals and into suitable candidates for some of the best graduate schools worldwide. This assertion is discussed in depth taking the example of the IEEE EMBS Student Branch chapter of VIT University. It is described how the entire process, from the inception of an idea to its materialization into an activity, has shaped the volunteers and participants into better professionals.

  9. Long-term aging of Ag/a-C:H:O nanocomposite coatings in air and in aqueous environment

    NASA Astrophysics Data System (ADS)

    Drábik, Martin; Pešička, Josef; Biederman, Hynek; Hegemann, Dirk

    2015-04-01

    Nanocomposite coatings of silver particles embedded in a plasma polymer matrix possess interesting properties depending on their microstructure. The film microstructure is affected, among other factors, by the RF power supplied during deposition, as shown by transmission electron microscopy. The optical properties are characterized by UV-vis-NIR spectroscopy. An anomalous optical absorption peak from the Ag nanoparticles is observed and related to the microstructure of the nanocomposite films. Furthermore, the long-term aging of the coatings is studied in depth in ambient air and in aqueous environments. It is shown that the studied films are not entirely stable. The deposition conditions and the microstructure of the films affect the processes taking place during their aging in both environments.

  10. 30 CFR 203.44 - What administrative steps must I take to use the royalty suspension volume?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ultra-deep well on a lease that is located entirely in water more than 200 meters and less than 400 meters deep. (e) The MMS Regional Supervisor for Production and Development may extend the deadline for...

  11. Communication; A Scientific American Book.

    ERIC Educational Resources Information Center

    Scientific American, Inc., New York, NY.

    With present advances in communication technology, profound and qualitative changes in our civilization are taking place--in business and politics, in education, in entertainment, interpersonal relations, and the organization of society itself. In honor of the significance of such developments, an entire issue of "Scientific American" magazine…

  12. Concrete Masonry Designs: Educational Issue.

    ERIC Educational Resources Information Center

    Hertzberg, Randi, Ed.

    2001-01-01

    This special journal issue addresses concrete masonry in educational facilities construction. The issue's feature articles are: (1) "It Takes a Village To Construct a Massachusetts Middle School," describing a middle school constructed almost entirely of concrete masonry and modeled after a typical small New England village; (2)…

  13. The Early Years: Discovering through Deconstruction

    ERIC Educational Resources Information Center

    Ashbrook, Peggy

    2016-01-01

    Taking objects apart, including old electronics, product packaging, and living plants, helps children understand how things work. Documenting this "unbuilding" or "deconstructing" encourages children to first consider the entire object, then the parts, and finally, the purpose of the parts. This article provides a lesson based on…

  14. Characterization of femtosecond-laser pulse induced cell membrane nanosurgical attachment.

    PubMed

    Katchinskiy, Nir; Godbout, Roseline; Elezzabi, Abdulhakem Y

    2016-07-01

    This article provides insight into the mechanism of femtosecond laser nanosurgical attachment of cells. We have demonstrated that during the attachment of two retinoblastoma cells using sub-10 femtosecond laser pulses with an 800 nm central wavelength, the phospholipid molecules of both cells hemifuse and form one shared phospholipid bilayer at the attachment location. In order to verify the hypothesis that hemifusion takes place, transmission electron microscope images of the cell membranes of retinoblastoma cells were taken. It is shown that at the attachment interface, the two cell membranes coalesce and form a single membrane shared by both cells. Thus, further evidence is provided to support the hypothesis that the laser-induced ionization process leads to an ultrafast, reversible destabilization of the phospholipid layer of the cellular membrane, which results in cross-linking of the phospholipid molecules in each membrane. This process of hemifusion occurs throughout the entire penetration depth of the femtosecond laser pulse train. Thus, the attachment between the cells takes place across a large surface area, which affirms our findings of strong physical attachment between the cells. The femtosecond laser pulse hemifusion technique can potentially provide a platform for precise molecular manipulation of cellular membranes. Manipulation of the cellular membrane is an important procedure that could aid in studying diseases such as cancer, where the expression level of plasma proteins on the cell membrane is altered.

  15. Dissolution Mechanism for High Melting Point Transition Elements in Aluminum Melt

    NASA Astrophysics Data System (ADS)

    Lee, Young E.; Houser, Stephen L.

    When compacts of transition metal elements such as Mn, Fe, Cr, Ni, Ti, Cu, and Zn are added cold to an aluminum melt, the alloying process proceeds through a sequence of incubation, exothermic reactions that form intermetallic compounds, and dispersion of the alloying elements into the aluminum melt. Experiments with Cr compacts show that the incubation period is affected by the content of ingredient Al, the size of the compacts, and the size of the Cr particles. The incubation period becomes longer as the content of ingredient aluminum in the compact decreases, and this prolonged incubation period negatively impacts the dissolution of the alloying elements in aluminum. Once liquid aluminum forms at reaction sites, the exothermic reaction takes place quickly and significantly raises the temperature of the compacts. As a result, the compacts swell in volume into a sponge-like structure. This porous structure encourages the penetration of liquid aluminum from the melt. The compacts become mechanically weak, and the alloying elements are dispersed and entrained in the aluminum melt as discrete, small-sized units. When Cr compacts are deficient in aluminum, the unreacted Cr particles are encased by the intermetallic compounds in the dispersed particles. They are carried in the melt flow and continue the dissolution reaction in aluminum. The entire dissolution process of Cr compacts completes within 10 to 15 minutes with full recovery when the aluminum content in the compacts is 10 to 20%.

  16. Development of a real-time microchip PCR system for portable plant disease diagnosis.

    PubMed

    Koo, Chiwan; Malapi-Wight, Martha; Kim, Hyun Soo; Cifci, Osman S; Vaughn-Diaz, Vanessa L; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and accepted method for plant pathogen diagnosis; however, current conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can directly quantify the amplicon during DNA amplification without the need for post-processing and is thus more suitable for field operations; however, it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but unfortunately all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller to control the entire thermocycling operation with data acquisition capability, and a battery. The entire system is 25 × 16 × 8 cm³ in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, and other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate down to a detection limit of 5 ng/8 µl sample.
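
    No firmware details are given in the abstract; purely as a hedged illustration of what "a microcontroller to control the entire thermocycling operation" involves, the sketch below steps a heater through denaturation, annealing, and extension setpoints with simple on/off control. The toy thermal model, setpoints, and hold times are generic placeholders, not the authors' design.

      # Hedged sketch of a thermocycling control loop. The heater and sensor are
      # replaced by a toy first-order thermal model so the sketch runs stand-alone;
      # setpoints and hold times are generic PCR values, not the authors' protocol.

      PROFILE = [("denature", 95.0, 15.0), ("anneal", 58.0, 20.0), ("extend", 72.0, 30.0)]
      N_CYCLES = 3          # kept small for the demonstration
      DT = 0.1              # control-loop time step, seconds
      HYSTERESIS = 0.5      # deg C for simple on/off control

      class ToyChip:
          """Crude stand-in for the microchip: heats while the heater is on, cools toward ambient."""
          def __init__(self):
              self.temp = 25.0
          def step(self, heater_on):
              heating = 8.0 if heater_on else 0.0        # deg C per second while heating
              cooling = 0.05 * (self.temp - 25.0)        # passive loss to ambient
              self.temp += (heating - cooling) * DT
              return self.temp

      def hold(chip, setpoint, seconds):
          """Bang-bang control: ramp to the setpoint, then hold it for the requested time."""
          held = 0.0
          while held < seconds:
              heater_on = chip.temp < setpoint - HYSTERESIS
              temp = chip.step(heater_on)
              if abs(temp - setpoint) <= HYSTERESIS:
                  held += DT                             # count only time spent near the setpoint

      chip = ToyChip()
      for cycle in range(N_CYCLES):
          for name, setpoint, seconds in PROFILE:
              hold(chip, setpoint, seconds)
          # a real-time system would read the fluorescence detector here, once per cycle
      print(f"final chip temperature: {chip.temp:.1f} deg C")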

  17. Development of a Real-Time Microchip PCR System for Portable Plant Disease Diagnosis

    PubMed Central

    Kim, Hyun Soo; Cifci, Osman S.; Vaughn-Diaz, Vanessa L.; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C.; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and accepted method for plant pathogen diagnosis; however, current conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can directly quantify the amplicon during DNA amplification without the need for post-processing and is thus more suitable for field operations; however, it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but unfortunately all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller to control the entire thermocycling operation with data acquisition capability, and a battery. The entire system is 25 × 16 × 8 cm³ in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, and other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate down to a detection limit of 5 ng/8 µl sample. PMID:24349341

  18. Educational interventions targeted at minors in situations of grave social vulnerability and their families

    NASA Astrophysics Data System (ADS)

    de La Caba Collado, Mariangeles; Bartau Rojas, Isabel

    2010-10-01

    The aim of this article is to outline and assess an educational intervention programme targeted at improving the skills of families and the personal and social development of children living in situations of grave social vulnerability. The sample comprised 10 families during the first phase of the intervention and six during the second. The design, intervention and assessment process of this study was carried out in two phases over a period of a year and a half. For both phases, three different groups—of men/fathers, women/mothers and children—were established. Study variables (parenting skills and children's personal and social development) were evaluated before and after the intervention in every group, as well as during the entire process. The results, taking into account the improvements reported by all the participants (social workers, group monitors, fathers, mothers, children) show that inter-professional involvement and coordination at all phases of the intervention is vital in order to achieve small but significant improvements.

  19. Metal phytoremediation: General strategies, genetically modified plants and applications in metal nanoparticle contamination.

    PubMed

    Gomes, Maria Angélica da Conceição; Hauser-Davis, Rachel Ann; de Souza, Adriane Nunes; Vitória, Angela Pierre

    2016-12-01

    The accumulation of metals in different environmental compartments poses a risk to both the environment and biota health. In particular, the continuous increase of these elements in soil ecosystems is a major worldwide concern. Phytoremediation has been gaining more attention in this regard. This approach takes advantage of the unique and selective uptake capabilities of plant root systems and applies these natural processes alongside the translocation, bioaccumulation, and contaminant degradation abilities of the entire plant. Although it is a relatively recent technology, dating from the 1990s, it is already considered a green alternative solution to the problem of metal pollution, with great potential. This review focuses on the phytoremediation of metals from soil, sludge, wastewater and water, the different strategies applied, the biological and physico-chemical processes involved, and the advantages and limitations of each strategy. Special note is given to the use of transgenic species and the phytoremediation of metallic nanoparticles. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Development of the ARISTOTLE webware for cloud-based rarefied gas flow modeling

    NASA Astrophysics Data System (ADS)

    Deschenes, Timothy R.; Grot, Jonathan; Cline, Jason A.

    2016-11-01

    Rarefied gas dynamics are important for a wide variety of applications. Improving the ability of general users to predict these gas flows will enable optimization of current processes and discovery of future ones. Despite this potential, most rarefied simulation software is designed by and for experts in the community, which has resulted in low adoption of the methods outside the immediate RGD community. This paper outlines an ongoing effort to create a rarefied gas dynamics simulation tool that can be used by a general audience. The tool leverages a direct simulation Monte Carlo (DSMC) library that is available to the entire community and a web-based simulation process that will enable all users to take advantage of high performance computing capabilities. First, the DSMC library and simulation architecture are described. Then the DSMC library is used to predict a number of representative transient gas flows that are applicable to the rarefied gas dynamics community. The paper closes with a summary and future directions.

  1. A landsat data tiling and compositing approach optimized for change detection in the conterminous United States

    USGS Publications Warehouse

    Nelson, Kurtis; Steinwand, Daniel R.

    2015-01-01

    Annual disturbance maps are produced by the LANDFIRE program across the conterminous United States (CONUS). Existing LANDFIRE disturbance data from 1999 to 2010 are available and current efforts will produce disturbance data through 2012. A tiling and compositing approach was developed to produce bi-annual images optimized for change detection. A tiled grid of 10,000 × 10,000 30 m pixels was defined for CONUS and adjusted to consolidate smaller tiles along national borders, resulting in 98 non-overlapping tiles. Data from Landsat-5,-7, and -8 were re-projected to the tile extents, masked to remove clouds, shadows, water, and snow/ice, then composited using a cosine similarity approach. The resultant images were used in a change detection algorithm to determine areas of vegetation change. This approach enabled more efficient processing compared to using single Landsat scenes, by taking advantage of overlap between adjacent paths, and allowed an automated system to be developed for the entire process.
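
    The compositing rule is described only as a "cosine similarity approach"; the sketch below shows one plausible per-pixel reading, in which the observation whose band vector is most similar to a per-pixel reference spectrum (here the median across dates) is retained. The choice of the median reference and the array layout are assumptions for illustration, not the LANDFIRE implementation.

      # Hedged sketch of per-pixel compositing by cosine similarity. For each pixel,
      # the observation whose band vector best matches a per-pixel reference spectrum
      # (here, the median across dates) is kept. The median reference is an assumption
      # for illustration, not the LANDFIRE algorithm.
      import numpy as np

      def cosine_composite(stack, mask):
          """
          stack: (dates, bands, rows, cols) surface reflectance
          mask:  (dates, rows, cols) True where the observation is clear/usable
          returns a (bands, rows, cols) composite
          """
          data = np.where(mask[:, None], stack, np.nan)
          ref = np.nanmedian(data, axis=0)                      # per-pixel reference spectrum

          dot = np.nansum(data * ref[None], axis=1)             # (dates, rows, cols)
          norms = np.sqrt(np.nansum(data**2, axis=1)) * np.sqrt(np.nansum(ref**2, axis=0))
          sim = np.where(mask, dot / np.maximum(norms, 1e-12), -np.inf)

          best = np.argmax(sim, axis=0)                         # winning date index per pixel
          rows, cols = np.meshgrid(np.arange(stack.shape[2]), np.arange(stack.shape[3]),
                                   indexing="ij")
          return stack[best, :, rows, cols].transpose(2, 0, 1)

      # Tiny synthetic example: 3 dates, 4 bands, 2x2 pixels, one cloudy observation
      rng = np.random.default_rng(0)
      stack = rng.uniform(0.0, 0.5, size=(3, 4, 2, 2))
      mask = np.ones((3, 2, 2), dtype=bool)
      mask[1, 0, 0] = False                                     # pretend this pixel is cloudy
      print(cosine_composite(stack, mask).shape)                # (4, 2, 2)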

  2. An Efficient, Highly Flexible Multi-Channel Digital Downconverter Architecture

    NASA Technical Reports Server (NTRS)

    Goodhart, Charles E.; Soriano, Melissa A.; Navarro, Robert; Trinh, Joseph T.; Sigman, Elliott H.

    2013-01-01

    In this innovation, a digital downconverter has been created that produces a large (16 or greater) number of output channels of smaller bandwidths. Additionally, this design has the flexibility to tune each channel independently to anywhere in the input bandwidth and to cover a wide range of output bandwidths (from 32 MHz down to 1 kHz). Both the flexibility in channel frequency selection and the more than four orders of magnitude range in output bandwidths (decimation rates from 32 to 640,000) presented significant challenges to be solved. The solution involved breaking the digital downconversion process into a two-stage process. The first stage is a 2× oversampled filter bank that divides the whole input bandwidth, as a real input signal, into seven overlapping, contiguous channels represented with complex samples. Using the symmetry of the sine and cosine functions in a similar way to that of an FFT (fast Fourier transform), this downconversion is very efficient and gives seven channels fixed in frequency. An arbitrary number of smaller-bandwidth channels can be formed from second-stage downconverters placed after the first stage of downconversion. Because of the overlapping of the first stage, there is no gap in coverage of the entire input bandwidth. The input to any of the second-stage downconverting channels has a multiplexer that chooses one of the seven wideband channels from the first stage. These second-stage downconverters take up fewer resources because they operate at lower bandwidths than doing the entire downconversion process from the input bandwidth for each independent channel. These second-stage downconverters are each independent, with fine frequency tuning control, providing extreme flexibility in positioning the center frequency of a downconverted channel. Finally, these second-stage downconverters have flexible decimation factors over four orders of magnitude. The algorithm was developed to run in an FPGA (field programmable gate array) at input data sampling rates of up to 1,280 MHz. The current implementation takes a 1,280-MHz real input and first breaks it up into seven 160-MHz complex channels, each spaced 80 MHz apart. The eighth channel at baseband was not required for this implementation, which led to further optimization. Afterwards, 16 second-stage narrowband channels with independently tunable center frequencies and bandwidth settings are implemented. A future implementation in a larger Xilinx FPGA will hold up to 32 independent second-stage channels.
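
    As a small, hedged illustration of what a single second-stage downconverter channel does (mix the selected wideband channel to baseband, low-pass filter, then decimate), the sketch below processes one 160-MHz complex channel in floating point. The tuning frequency, decimation factor, and filter length are placeholders, not the FPGA design described above.

      # Hedged sketch of one second-stage digital downconverter channel: mix a
      # selected wideband complex channel to baseband, low-pass filter, decimate.
      # Rates, tuning frequency, and filter length are placeholder values, not the
      # FPGA implementation described in the abstract.
      import numpy as np
      from scipy.signal import firwin, lfilter

      FS_WIDEBAND = 160e6     # complex sample rate of one first-stage channel
      F_TUNE = 12.5e6         # desired channel center within the wideband channel
      DECIMATION = 64         # output rate = FS_WIDEBAND / DECIMATION = 2.5 MHz

      def second_stage_ddc(x, fs=FS_WIDEBAND, f_tune=F_TUNE, decim=DECIMATION):
          n = np.arange(len(x))
          mixed = x * np.exp(-2j * np.pi * f_tune / fs * n)         # shift channel to baseband
          taps = firwin(129, cutoff=0.8 * (fs / decim) / 2, fs=fs)  # anti-alias low-pass
          filtered = lfilter(taps, 1.0, mixed)
          return filtered[::decim]                                  # decimate

      # Example: a complex tone at F_TUNE should land near DC after downconversion
      t = np.arange(100_000) / FS_WIDEBAND
      x = np.exp(2j * np.pi * F_TUNE * t)
      y = second_stage_ddc(x)
      print(len(y), np.angle(y[1000] / y[999]))   # phase step ~0 => tone is near DC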

  3. Exact Solutions of Linear Reaction-Diffusion Processes on a Uniformly Growing Domain: Criteria for Successful Colonization

    PubMed Central

    Simpson, Matthew J

    2015-01-01

    Many processes during embryonic development involve transport and reaction of molecules, or transport and proliferation of cells, within growing tissues. Mathematical models of such processes usually take the form of a reaction-diffusion partial differential equation (PDE) on a growing domain. Previous analyses of such models have mainly involved solving the PDEs numerically. Here, we present a framework for calculating the exact solution of a linear reaction-diffusion PDE on a growing domain. We derive an exact solution for a general class of one-dimensional linear reaction-diffusion process on 0
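
    For context, models in this class are commonly written as a linear reaction-diffusion equation on a uniformly growing domain 0 < x < L(t); the form below is a hedged sketch in generic notation and is not necessarily the paper's exact formulation.

      \frac{\partial C}{\partial t} + \frac{\partial}{\partial x}\left( v\, C \right)
          = D \frac{\partial^2 C}{\partial x^2} + k\, C,
      \qquad 0 < x < L(t), \qquad
      v(x,t) = \frac{\dot{L}(t)}{L(t)}\, x,

    where uniform growth means the local tissue velocity v varies linearly with position, D is the diffusivity, and k is the linear reaction (proliferation or decay) rate.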

  4. Exact solutions of linear reaction-diffusion processes on a uniformly growing domain: criteria for successful colonization.

    PubMed

    Simpson, Matthew J

    2015-01-01

    Many processes during embryonic development involve transport and reaction of molecules, or transport and proliferation of cells, within growing tissues. Mathematical models of such processes usually take the form of a reaction-diffusion partial differential equation (PDE) on a growing domain. Previous analyses of such models have mainly involved solving the PDEs numerically. Here, we present a framework for calculating the exact solution of a linear reaction-diffusion PDE on a growing domain. We derive an exact solution for a general class of one-dimensional linear reaction-diffusion process on 0

  5. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has led to the need to elaborate and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process with human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easy to exploit, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out down the chain. Second, the amount of reprocessing often needs to be minimized in order to optimize the use of resources, which are of limited availability. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
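
    The first challenge above, propagating a human correction through all dependent processing, amounts to invalidating everything downstream of the corrected item in a dependency graph so that only those tasks are re-run. The sketch below is a generic illustration of that idea with an invented task graph; it is not the system described in the paper.

      # Hedged sketch: after a human correction to one task's output, mark exactly
      # the downstream-dependent results as stale so only they are reprocessed.
      # The dependency graph and task names are invented for illustration.
      from collections import deque

      # edges: task -> tasks that consume its output
      dependents = {
          "ingest":           ["extract_entities", "detect_language"],
          "extract_entities": ["link_entities"],
          "detect_language":  ["translate"],
          "link_entities":    ["report"],
          "translate":        ["report"],
          "report":           [],
      }

      def invalidate_downstream(corrected_task, graph):
          """Return the set of tasks that must be re-run after a correction."""
          stale, queue = set(), deque([corrected_task])
          while queue:
              task = queue.popleft()
              for child in graph.get(task, []):
                  if child not in stale:
                      stale.add(child)
                      queue.append(child)
          return stale

      # A human corrects the output of entity extraction: only its descendants rerun.
      print(invalidate_downstream("extract_entities", dependents))
      # {'link_entities', 'report'} -- 'detect_language' and 'translate' are untouched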

  6. A Parameterized Inversion Model for Soil Moisture and Biomass from Polarimetric Backscattering Coefficients

    NASA Technical Reports Server (NTRS)

    Truong-Loi, My-Linh; Saatchi, Sassan; Jaruwatanadilok, Sermsak

    2012-01-01

    A semi-empirical algorithm for the retrieval of soil moisture, root mean square (RMS) height and biomass from polarimetric SAR data is explained and analyzed in this paper. The algorithm is a simplification of the distorted Born model. It takes into account the physical scattering phenomena and has three major components: volume, double-bounce and surface. This simplified model uses the three backscattering coefficients (σHH, σHV and σVV) at low frequency (P-band). The inversion process uses the Levenberg-Marquardt non-linear least-squares method to estimate the structural parameters. The estimation process is explained in full in this paper, from initialization of the unknowns to retrievals. A sensitivity analysis is also carried out in which the initial values in the inversion process are varied randomly. The results show that the inversion process is not very sensitive to initial values, and the majority of the retrievals have a root-mean-square error lower than 5% for soil moisture, 24 Mg/ha for biomass and 0.49 cm for roughness, for a soil moisture of 40%, a roughness of 3 cm and biomass varying from 0 to 500 Mg/ha with a mean of 161 Mg/ha.
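
    The simplified distorted Born model itself is not reproduced here; the sketch below only shows the structure of a three-observable, three-parameter inversion with a Levenberg-Marquardt solver. The forward model is a made-up placeholder, and the parameter values echo the ranges quoted above purely for illustration.

      # Hedged sketch of the inversion structure: three observed backscattering
      # coefficients (sigma_HH, sigma_HV, sigma_VV) are fit with three unknowns
      # (soil moisture, RMS height, biomass) via Levenberg-Marquardt. The forward
      # model below is a made-up placeholder, NOT the simplified distorted Born
      # model used in the paper.
      import numpy as np
      from scipy.optimize import least_squares

      def forward_model(params):
          """Placeholder mapping (moisture, rms_height_cm, biomass) -> sigma in dB."""
          mv, s, b = params
          sigma_hh = -20.0 + 15.0 * mv + 2.0 * np.log1p(s) + 0.010 * b
          sigma_hv = -28.0 + 10.0 * mv + 1.0 * np.log1p(s) + 0.020 * b
          sigma_vv = -18.0 + 14.0 * mv + 2.5 * np.log1p(s) + 0.008 * b
          return np.array([sigma_hh, sigma_hv, sigma_vv])

      def invert(observed, x0=(0.2, 1.0, 100.0)):
          residuals = lambda p: forward_model(p) - observed
          result = least_squares(residuals, x0, method="lm")   # Levenberg-Marquardt
          return result.x

      true_params = np.array([0.40, 3.0, 161.0])   # moisture, RMS height (cm), biomass (Mg/ha)
      observed = forward_model(true_params)
      print(invert(observed))                      # recovers the parameters from noise-free data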

  7. ADDING REALISM TO NUCLEAR MATERIAL DISSOLVING ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, B.

    2011-08-15

    Two new criticality modeling approaches have greatly increased the efficiency of dissolver operations in H-Canyon. The first new approach takes credit for the linear, physical distribution of the mass throughout the entire length of the fuel assembly. This distribution of mass is referred to as the linear density. Crediting the linear density of the fuel bundles results in using lower fissile concentrations, which allows higher masses to be charged to the dissolver. Also, this approach takes credit for the fact that only part of the fissile mass is wetted at a time. There are multiple assemblies stacked on top of each other in a bundle. On average, only 50-75% of the mass (the bottom two or three assemblies) is wetted at a time. This means that only 50-75% (depending on operating level) of the mass is moderated and is contributing to the reactivity of the system. The second new approach takes credit for the progression of the dissolving process. Previously, dissolving analysis looked at a snapshot in time where the same fissile material existed both in the wells and in the bulk solution at the same time. The second new approach models multiple consecutive phases that simulate the fissile material moving from a high concentration in the wells to a low concentration in the bulk solution. This approach is more realistic and allows higher fissile masses to be charged to the dissolver.

  8. [Quality analysis of raw tobacco before redrying and sheet tobacco after redrying using online near-infrared spectroscopy].

    PubMed

    Tang, Zhao-qi; Liu, Ying; Shu, Ru-xin; Yang, Kai; Zhao, Long-lian; Zhang, Lu-da; Zhang, Ye-hui; Li, Jun-hui

    2014-12-01

    In this paper, online near-infrared spectra of raw tobacco before redrying and of sheet tobacco after redrying, from 7 different origins, were collected on a sorting and redrying production line dedicated to the "ZHONGHUA" brand. Using a projection model built from the online spectra of tobacco from different origins, together with variance and correlation analysis, we studied how the uniformity and similarity of quality characteristics change before and after redrying, which can support understanding of the quality of the tobacco material and of cigarette product formulations. This study shows that selecting about 10,000 spectra at equally spaced sampling times from the very large number of online near-infrared spectra is feasible and representative for modeling. After manual sorting, threshing, and redrying, the uniformity of the near-infrared spectra of tobacco from each origin increases by 10%-35%, showing that the homogeneity of the tobacco leaf is significantly improved. After redrying, the similarity relationships among origins also change significantly, decreasing overall; this shows that the quality differences associated with origin become more pronounced, which provides greater latitude for formulations, and it also shows that the production of high-quality Chinese cigarettes requires large amounts of financial and human resources for cured tobacco processing. Traditional chemical analysis takes a great deal of time and effort, which makes it difficult to control the entire processing chain. Near-infrared spectroscopy, with its rapid, non-destructive advantages, can not only achieve real-time detection and quality control but can also take full advantage of the spectral information created in the production process, making it a very promising online analytical detection technology in many industries, especially the agricultural and food processing industries.
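
    As a generic, hedged illustration of two steps described above (equal-interval subsampling of a very large spectral archive and a simple within-origin uniformity measure compared before and after redrying), the sketch below uses synthetic spectra; it is not the projection model used in the study.

      # Hedged sketch: equal-interval subsampling of a large set of online NIR
      # spectra, plus a simple within-origin uniformity measure (mean spectral
      # standard deviation) compared before vs. after redrying. Synthetic data;
      # this is not the projection model used in the study.
      import numpy as np

      def subsample_equally_spaced(spectra, n_keep=10_000):
          """Keep ~n_keep spectra at equally spaced acquisition-time indices."""
          idx = np.linspace(0, len(spectra) - 1, num=min(n_keep, len(spectra))).astype(int)
          return spectra[idx]

      def uniformity(spectra):
          """Lower value = more uniform: mean over wavelengths of the std across samples."""
          return float(np.std(spectra, axis=0).mean())

      rng = np.random.default_rng(1)
      wavelengths = 128
      before = rng.normal(1.0, 0.08, size=(20_000, wavelengths))   # raw tobacco, one origin
      after = rng.normal(1.0, 0.06, size=(20_000, wavelengths))    # redried sheet tobacco

      b = uniformity(subsample_equally_spaced(before))
      a = uniformity(subsample_equally_spaced(after))
      print(f"uniformity improvement: {(b - a) / b:.0%}")          # spread is lower after redrying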

  9. Research on cost control and management in high voltage transmission line construction

    NASA Astrophysics Data System (ADS)

    Xu, Xiaobin

    2017-05-01

    Cost control is of vital importance to construction enterprises. It is the key to the profitability of a transmission line project and is directly related to the survival and development of electric power construction enterprises. Because of the long construction routes, complex and changeable construction terrain, and large construction costs of transmission lines, it is difficult to exercise accurate and effective cost control over the implementation of an entire transmission line project. The cost control of a transmission line project is therefore a complicated and arduous task, and it is of great theoretical and practical significance to study cost control schemes for transmission line projects in a more scientific and efficient way. Based on the characteristics of transmission line construction projects, this paper analyzes the construction cost structure of transmission line projects and the current cost control problems, and demonstrates the necessity and feasibility of studying the cost control scheme of the transmission line project more accurately. In this way, a dynamic cyclic cost control process consisting of planning, implementation, feedback, correction, modification, and re-implementation is achieved, realizing accurate and effective cost control of the entire electric power transmission line project.

  10. Practical management of chemicals and hazardous wastes: An environmental and safety professional`s guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhre, W.L.

    This book was written to help the environmental and safety student learn about the field and to help the working professional manage hazardous material and waste issues. For example, one issue that will impact virtually all of these people mentioned is the upcoming environmental standardization movement. The International Standards Organization (ISO) is in the process of adding comprehensive environmental and hazardous waste management systems to their future certification requirements. Most industries worldwide will be working hard to achieve this new level of environmental management. This book presents many of the systems needed to receive certification. In order to properly manage hazardous waste, it is important to consider the entire life cycle, including when the waste was a useful chemical or hazardous material. Waste minimization is built upon this concept. Understanding the entire life cycle is also important in terms of liability, since many regulations hold generators responsible from cradle to grave. This book takes the life-cycle concept even further, in order to provide additional insight. The discussion starts with the conception of the chemical and traces its evolution into a waste and even past disposal. At this point the story continues into the afterlife, where responsibility still remains.

  11. PatternQuery: web application for fast detection of biomacromolecular structural patterns in the entire Protein Data Bank.

    PubMed

    Sehnal, David; Pravda, Lukáš; Svobodová Vařeková, Radka; Ionescu, Crina-Maria; Koča, Jaroslav

    2015-07-01

    Well defined biomacromolecular patterns such as binding sites, catalytic sites, specific protein or nucleic acid sequences, etc. precisely modulate many important biological phenomena. We introduce PatternQuery, a web-based application designed for detection and fast extraction of such patterns. The application uses a unique query language with Python-like syntax to define the patterns that will be extracted from datasets provided by the user, or from the entire Protein Data Bank (PDB). Moreover, the database-wide search can be restricted using a variety of criteria, such as PDB ID, resolution, and organism of origin, to provide only relevant data. The extraction generally takes a few seconds for several hundreds of entries, up to approximately one hour for the whole PDB. The detected patterns are made available for download to enable further processing, as well as presented in a clear tabular and graphical form directly in the browser. The unique design of the language and the provided service could pave the way towards novel PDB-wide analyses, which were either difficult or unfeasible in the past. The application is available free of charge at http://ncbr.muni.cz/PatternQuery. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Flights of Fancy: Imaginary Travels as Motivation for Reading, Writing, and Speaking German.

    ERIC Educational Resources Information Center

    Bryant, Keri L.; Pohl, Rosa Marie

    1994-01-01

    The article describes an innovative teaching project suitable for students at any age and all levels of German. The project, conducted entirely in German, includes writing, reading, and speaking, and promotes the skills of letter-writing, reading for content, note-taking, and oral presentation. (JL)

  13. 50 CFR 217.144 - Mitigation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the entire 30-minute pre-activity monitoring time period in order for the activity to begin. (v) BP... SPECIFIED ACTIVITIES Taking of Marine Mammals Incidental to Operation of Offshore Oil and Gas Facilities in the U.S. Beaufort Sea § 217.144 Mitigation. (a) When conducting the activities identified in § 217.140...

  14. 41 CFR 304-3.19 - Are there other situations when I may accept payment from a non-Federal source for my travel...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of Personnel Management at 5 CFR part 410). (b) Under 5 U.S.C. 7342 for travel taking place entirely... executive branch employees to accept gifts based on outside business employment relationships. (Note: You...

  15. 41 CFR 304-3.19 - Are there other situations when I may accept payment from a non-Federal source for my travel...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of Personnel Management at 5 CFR part 410). (b) Under 5 U.S.C. 7342 for travel taking place entirely... executive branch employees to accept gifts based on outside business employment relationships. (Note: You...

  16. 41 CFR 304-3.19 - Are there other situations when I may accept payment from a non-Federal source for my travel...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of Personnel Management at 5 CFR part 410). (b) Under 5 U.S.C. 7342 for travel taking place entirely... executive branch employees to accept gifts based on outside business employment relationships. (Note: You...

  17. 41 CFR 304-3.19 - Are there other situations when I may accept payment from a non-Federal source for my travel...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of Personnel Management at 5 CFR part 410). (b) Under 5 U.S.C. 7342 for travel taking place entirely... executive branch employees to accept gifts based on outside business employment relationships. (Note: You...

  18. Distance and Online Social Work Education: Novel Ethical Challenges

    ERIC Educational Resources Information Center

    Reamer, Frederic G.

    2013-01-01

    Digital technology has transformed social work education. Today's students can take individual courses and earn an entire degree without ever meeting their faculty members in person. Technological innovations such as videoconferencing, live online chat, asynchronous podcasts, and webinars enable social work educators to reach students whose…

  19. 14 CFR 302.711 - Conditions upon participation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... conference and for ninety (90) days after its termination, or until the Department takes public action with... read this entire instruction and promises to abide by it and advise any other participant to whom he or... for keeping conference matters confidential, every participant, as defined in paragraph (e) of this...

  20. The Measurement of Instructional Accomplishments.

    ERIC Educational Resources Information Center

    Fraley, Lawrence E.; Vargas, Ernest A.

    Instructional System Technology in recent years has been characterized by an increase in individualized instruction and the modularization of the curriculum. In traditional systems the learners are forced to take blocks of instruction the size of entire courses and these are much too large. The courses can now be broken down into conceptual…

  1. The canopy camera

    Treesearch

    Harry E. Brown

    1962-01-01

    The canopy camera is a device of new design that takes wide-angle, overhead photographs of vegetation canopies, cloud cover, topographic horizons, and similar subjects. Since the entire hemisphere is photographed in a single exposure, the resulting photograph is circular, with the horizon forming the perimeter and the zenith the center. Photographs of this type provide...

  2. "Dropbox" Brings Course Management Back to Teachers

    ERIC Educational Resources Information Center

    Niles, Thaddeus M.

    2013-01-01

    Course management software (CMS) allows teachers to deliver content electronically and manage collaborative coursework, either blending with face-to-face interactions or as the core of an entirely virtual classroom environment. CMS often takes the form of an electronic storehouse of course materials with which students can interact, a virtual…

  3. Prezi: A Different Way to Present

    ERIC Educational Resources Information Center

    Yee, Kevin; Hargis, Jace

    2010-01-01

    In this article, the author discusses Prezi and compares it to other forms of presentation software. Taking a completely different approach to the entire concept of software for presentations, Prezi stands alone as a unique and wholly viable competitor to PowerPoint. With a "Prezi", users display words, images, and videos without using…

  4. Wireless Takes Over

    ERIC Educational Resources Information Center

    Korzeniowski, Paul

    2008-01-01

    After starting off as a simple way to provide users with access to internet resources, WiFi networks have evolved to become an important communications cornerstone for a growing number of academic institutions. Not only have colleges been working to make these networks available across their entire campuses (or at least pretty close to 100 percent…

  5. Writing Research Papers. A Norton Guide. Third Edition.

    ERIC Educational Resources Information Center

    Walker, Melissa

    Designed to take students on a guided tour of entire research projects, from developing interests and focusing ideas to producing finished papers, through the experiences of five students, this book also provides exhaustive coverage of research mechanics. Checklists throughout the book (and indexed inside the front cover) allow students to review…

  6. 'Mars-shine'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] 'Mars-shine' Composite

    NASA's Mars Exploration Rover Spirit continues to take advantage of favorable solar power conditions to conduct occasional nighttime astronomical observations from the summit region of 'Husband Hill.'

    Spirit has been observing the martian moons Phobos and Deimos to learn more about their orbits and surface properties. This has included observing eclipses. On Earth, a solar eclipse occurs when the Moon's orbit takes it exactly between the Sun and Earth, casting parts of Earth into shadow. A lunar eclipse occurs when the Earth is exactly between the Sun and the Moon, casting the Moon into shadow and often giving it a ghostly orange-reddish color. This color is created by sunlight reflected through Earth's atmosphere into the shadowed region. The primary difference between terrestrial and martian eclipses is that Mars' moons are too small to completely block the Sun from view during solar eclipses.

    Recently, Spirit observed a 'lunar' eclipse on Mars. Phobos, the larger of the two martian moons, was photographed while slipping into the shadow of Mars. Jim Bell, the astronomer in charge of the rover's panoramic camera (Pancam), suggested calling it a 'Phobal' eclipse rather than a lunar eclipse as a way of identifying which of the dozens of moons in our solar system was being cast into shadow.

    With the help of the Jet Propulsion Laboratory's navigation team, the Pancam team planned instructions to Spirit for acquiring the views shown here of Phobos as it entered into a lunar eclipse on the evening of the rover's 639th martian day, or sol (Oct. 20, 2005) on Mars. This image is a time-lapse composite of eight Pancam images of Phobos moving across the martian sky. The entire eclipse lasted more than 26 minutes, but Spirit was able to observe only in the first 15 minutes. During the time closest to the shadow crossing, Spirit's cameras were programmed to take images every 10 seconds.

    In the first three images, Phobos was in sunlight, moving toward the upper right. After a 100-second delay while Spirit's computer processed the first three images, the rover then took the fourth image, showing Phobos just starting to enter the darkness of the martian shadow. At that point, an observer sitting on Phobos and looking back toward the Sun would have seen a spectacular sunset! In the fifth image, Phobos appeared like a crescent, almost completely shrouded in darkness.

    In the last three images, Phobos had slipped entirely into the shadow of Mars. However, as with our own Moon during lunar eclipses on Earth, it was not entirely dark. The small amount of light still visible from Phobos is a kind of 'Mars-shine' -- sunlight reflected through Mars' atmosphere and into the shadowed region.

    Rover scientists took some images later in the sequence to try to figure out if this 'Mars-shine' made Phobos colorful while in eclipse, but they'll need more time to complete the analysis because the signal levels are so low. Meanwhile, they will use the information on the timing of the eclipse to refine the orbital path of Phobos. The precise position of Phobos will be important to any future spacecraft taking detailed pictures of the moon or landing on its surface. In the near future it might be possible for one of the rovers to take images of a 'Deimal' eclipse to learn more about Mars' other enigmatic satellite, Deimos, as well.

  7. SWAp dynamics in a decentralized context: experiences from Uganda.

    PubMed

    Jeppsson, Anders

    2002-12-01

    This paper examines the role of the Ministry of Health (MoH) in Uganda in the process of developing a Sector-Wide Approach (SWAp) within the health sector. Power dynamics are integral to any understanding of development assistance, and SWAps bring with them new opportunities for the deployment of influence. The SWAp process has changed the interaction between the donors and the Government, and the perspective of this interaction has shifted from various technical areas to the entire health sector. It is argued that although the decentralization of the public sector has transferred considerable responsibilities and duties from the central level to the districts, significant power, defined as a social construct, has been generated by the MoH in the very process of developing SWAps. The MoH has been able to exercise significant influence on defining the content and boundaries of the SWAp process, as well as the direction it is taking. This development has largely followed blueprints drawn by donors. Through the institutional framework associated with SWAps, the MoH has redefined the interaction between the central level and the districts as well as between the MoH and the donors. While the SWAp process is now moving from the planning to the implementation phase in Uganda, we see a number of new, changing, ambiguous and contradictory strategies emerging.

  8. Analysis of Invasion Dynamics of Matrix-Embedded Cells in a Multisample Format.

    PubMed

    Van Troys, Marleen; Masuzzo, Paola; Huyck, Lynn; Bakkali, Karima; Waterschoot, Davy; Martens, Lennart; Ampe, Christophe

    2018-01-01

    In vitro tests of cancer cell invasion are the "first line" tools of preclinical researchers for screening the multitude of chemical compounds or cell perturbations that may aid in halting or treating cancer malignancy. In order to have predictive value or to contribute to designing personalized treatment regimes, these tests need to take into account the cancer cell environment and measure effects on invasion in sufficient detail. The in vitro invasion assays presented here are a trade-off between feasibility in a multisample format and mimicking the complexity of the tumor microenvironment. They allow testing multiple samples and conditions in parallel using 3D-matrix-embedded cells and deal with the heterogeneous behavior of an invading cell population in time. We describe the steps to take, the technical problems to tackle and useful software tools for the entire workflow: from the experimental setup to the quantification of the invasive capacity of the cells. The protocol is intended to guide researchers to standardize experimental set-ups and to annotate their invasion experiments in sufficient detail. In addition, it provides options for image processing and a solution for storage, visualization, quantitative analysis, and multisample comparison of acquired cell invasion data.

  9. Autonomous Quantum Error Correction with Application to Quantum Metrology

    NASA Astrophysics Data System (ADS)

    Reiter, Florentin; Sorensen, Anders S.; Zoller, Peter; Muschik, Christine A.

    2017-04-01

    We present a quantum error correction scheme that stabilizes a qubit by coupling it to an engineered environment which protects it against spin flips or phase flips. Our scheme uses always-on couplings that run continuously in time and operates in a fully autonomous fashion, without the need to perform measurements or feedback operations on the system. The correction of errors takes place entirely at the microscopic level through a built-in feedback mechanism. Our dissipative error correction scheme can be implemented in a system of trapped ions and can be used for improving high-precision sensing. We show that the enhanced coherence time that results from the coupling to the engineered environment translates into a significantly enhanced precision for measuring weak fields. In a broader context, this work constitutes a stepping stone towards the paradigm of self-correcting quantum information processing.

  10. Revenue cycle management, Part II.

    PubMed

    Crew, Matt

    2007-01-01

    The proper management of your revenue cycle requires the application of "best practices" and the continual monitoring and measuring of the entire cycle. The correct technology will enable you to gain the insight and efficiencies needed in the ever-changing healthcare economy. The revenue cycle is a process that begins when you negotiate payor contracts, set fees, and schedule appointments, and continues until claims are paid in full. Every single step in the cycle carries equal importance. Monitoring all phases and a commitment to continually communicating the results will allow you to achieve unparalleled success. In part I of this article, we explored the importance of contracting, scheduling, and case management as well as coding and clinical documentation. We will now take a closer look at the benefits that charge capture, claim submission, payment posting, accounts receivable follow-up, and reporting can bring to your practice.

  11. Neurotrophin Signaling via Long-Distance Axonal Transport

    NASA Astrophysics Data System (ADS)

    Chowdary, Praveen D.; Che, Dung L.; Cui, Bianxiao

    2012-05-01

    Neurotrophins are a family of target-derived growth factors that support survival, development, and maintenance of innervating neurons. Owing to the unique architecture of neurons, neurotrophins that act locally on the axonal terminals must convey their signals across the entire axon for subsequent regulation of gene transcription in the cell nucleus. This long-distance retrograde signaling, a motor-driven process that can take hours or days, has been a subject of intense interest. In the last decade, live-cell imaging with high sensitivity has significantly increased our capability to track the transport of neurotrophins, their receptors, and subsequent signals in real time. This review summarizes recent research progress in understanding neurotrophin-receptor interactions at the axonal terminal and their transport dynamics along the axon. We emphasize high-resolution studies at the single-molecule level and also discuss recent technical advances in the field.

  12. Bonding and nondestructive evaluation of graphite/PEEK composite and titanium adherends with thermoplastic adhesives

    NASA Technical Reports Server (NTRS)

    Hodges, W. T.; Tyeryar, J. R.; Berry, M.

    1985-01-01

    Bonded single overlap shear specimens were fabricated from Graphite/PEEK (Polyetheretherketone) composite adherends and titanium adherends. Six advanced thermoplastic adhesives were used for the bonding. The specimens were bonded by an electromagnetic induction technique producing high heating rates and high-strength bonds in a few minutes. This contrasts with conventionally heated presses or autoclaves that take hours to process comparable quality bonds. The Graphite/PEEK composites were highly resistant to delamination during the testing. This allowed the specimen to fail exclusively through the bondline, even at very high shear loads. Nondestructive evaluation of bonded specimens was performed ultrasonically by energizing the entire thickness of the material through the bondline and measuring acoustic impedance parameters. Destructive testing confirmed the unique ultrasonic profiles of strong and weak bonds, establishing a standard for predicting relative bond strength in subsequent specimens.

  13. The advantages and limitations of guideline adaptation frameworks.

    PubMed

    Wang, Zhicheng; Norris, Susan L; Bero, Lisa

    2018-05-29

    The implementation of evidence-based guidelines can improve clinical and public health outcomes by helping health professionals practice in the most effective manner, as well as assisting policy-makers in designing optimal programs. Adaptation of a guideline to suit the context in which it is intended to be applied can be a key step in the implementation process. Without taking the local context into account, certain interventions recommended in evidence-based guidelines may be infeasible under local conditions. Guideline adaptation frameworks provide a systematic way of approaching adaptation, and their use may increase transparency, methodological rigor, and the quality of the adapted guideline. This paper presents a number of adaptation frameworks that are currently available. We aim to compare the advantages and limitations of their processes, methods, and resource implications. These insights into adaptation frameworks can inform the future development of guidelines and systematic methods to optimize their adaptation. Recent adaptation frameworks show an evolution from adapting entire existing guidelines, to adapting specific recommendations extracted from an existing guideline, to constructing evidence tables for each recommendation that needs to be adapted. This is a move towards more recommendation-focused, context-specific processes and considerations. There are still many gaps in knowledge about guideline adaptation. Most of the frameworks reviewed lack any evaluation of the adaptation process and outcomes, including user satisfaction and resources expended. The validity, usability, and health impact of guidelines developed via an adaptation process have not been studied. Lastly, adaptation frameworks have not been evaluated for use in low-income countries. Despite the limitations in frameworks, a more systematic approach to adaptation based on a framework is valuable, as it helps to ensure that the recommendations stay true to the evidence while taking local needs into account. The utilization of frameworks in the guideline implementation process can be optimized by increasing the understanding and upfront estimation of resource and time needed, capacity building in adaptation methods, and increasing the adaptability of the source recommendation document.

  14. Influence of inductive heating on microstructure and material properties in roll forming processes

    NASA Astrophysics Data System (ADS)

    Guk, Anna; Kunke, Andreas; Kräusel, Verena; Landgrebe, Dirk

    2017-10-01

    The increasing demand in the automotive industry for sheet metal parts and profiles with enhanced mechanical properties, obtained by using high-strength and ultra-high-strength (UHS) steels, must be met by increasing the flexibility of tools and machines. This can be achieved by applying innovative technologies such as roll forming with integrated inductive heating. This process is similar to indirect press hardening and can be used to produce hardened profiles and profiles with graded properties in the longitudinal and transverse directions. The advantage is that hardened components are produced in a continuous process, and integrating the heating and quenching units into the profiling line increases flexibility, shortens the entire process chain, and minimizes the risk of springback. The process combines an inhomogeneous strain distribution over the strip width, caused by roll forming, with an inhomogeneous microstructure, caused by accelerated inductive heating to the austenitizing temperature. These two features directly influence the mechanical properties of the material during forming and hardening. The aim of this work is to investigate the influence of heating rates on microstructure evolution and mechanical properties in order to determine the process window. The results showed that the heating rate should be set to 110 K/s for economical integration of inductive heating into the roll forming process.

  15. Biogeography of plant invasions

    Treesearch

    Dean Pearson; Yvette Ortega

    2013-01-01

    The fact that most of our worst animal and weed pests come from other continents is no coincidence. Biological invasions are fundamentally a biogeographic phenomenon. That is to say, there is something rather significant about taking an organism from a specific evolutionary history and ecological context and casting it into an entirely new environment that can...

  16. Using Art to Teach the Abstract

    ERIC Educational Resources Information Center

    Barton, Sara

    2007-01-01

    Most students in America can graduate from high school without ever analyzing a piece of art. Perhaps these students will take an art history or an art appreciation course in college that may incorporate a few references to literature and history. Math or science connections will most likely remain entirely absent. Why do we treat art analysis…

  17. 21st Century Curriculum

    ERIC Educational Resources Information Center

    Fredette, Michelle

    2013-01-01

    In rural Kansas, five high school students take seats in their school's distance learning classroom for a Mandarin language class they share with students at four other schools around the state. In Atlanta, an entire class of kindergartners sits in front of a big screen, speaking to their teacher in China. The students laugh, raise their hands,…

  18. An Updated Perspective on Emergent Science

    ERIC Educational Resources Information Center

    Russell, Terry; McGuigan, Linda

    2017-01-01

    Our concern is to offer support to the entire spectrum of staff wishing to nurture the development of early years science, from unqualified personnel through to early years professionals who may hold any one of the plethora of relevant qualifications. We reflect on what form that science might take, bearing in mind criticisms of science education…

  19. Validating Lidar Depolarization Calibration using Solar Radiation Scattered by Ice Clouds

    NASA Technical Reports Server (NTRS)

    Liu, Zhao-Yang; McGill, Matthew; Hu, Yong-Xiang; Hostetter, Chris; Winker, David; Vaughan, Mark

    2004-01-01

    This letter proposes the use of solar background radiation scattered by ice clouds for validating space lidar depolarization calibration. The method takes advantage of the fact that the background light scattered by ice clouds is almost entirely unpolarized. The theory is examined with Cloud Physics Lidar (CPL) background light measurements.

  20. 77 FR 14565 - Information Collection Request Sent to the Office of Management and Budget (OMB) for Approval...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-12

    ... client evaluation helps refuge managers detect potential problems with guide services so that we can take... public record. Before including your address, phone number, email address, or other personal identifying information in your comment, you should be aware that your entire comment, including your personal identifying...

  1. Taking School Reform Success to "Scale": Governance and Leadership Issues in Two Restructuring Elementary Buildings.

    ERIC Educational Resources Information Center

    Goldman, Paul; Tindal, Gerald

    This paper explores the difficulties of extending good, workable educational ideas to entire schools or districts. Two restructured schools that participated in a 4-year collaborative project that involved multi-age primary classrooms, inclusion of special-needs students in regular classrooms, and increasing specificity in assessing student…

  2. Speaking up for Equity Takes Courage--But the Standards Have Your Back

    ERIC Educational Resources Information Center

    Lechtenberg, Kate; Phillips, Jeanie

    2018-01-01

    The National School Library Standards require school librarians to make equity a value that permeates the entire school library community. Creating displays to celebrate diversity is not enough. We cannot allow ourselves to approach diversity as a "social good," in which isolated programs serve marginalized students without challenging…

  3. Computational Simulation and Analysis of Mutations: Nucleotide Fixation, Allelic Age and Rare Genetic Variations in Population

    ERIC Educational Resources Information Center

    Qiu, Shuhao

    2015-01-01

    In order to investigate the complexity of mutations, a computational approach named Genome Evolution by Matrix Algorithms ("GEMA") has been implemented. GEMA models genomic changes, taking into account hundreds of mutations within each individual in a population. By modeling of entire human chromosomes, GEMA precisely mimics real…

  4. Get Ready for the Gamer Generation

    ERIC Educational Resources Information Center

    Carstens, Adam; Beck, John

    2005-01-01

    There is a new generation of workers taking over key positions in organizations and in classrooms. This generation is younger, yes, but they are also different in ways that will definitely change how business is done and how learning is accomplished. Research shows that the way they spent their formative years has given them an entirely different…

  5. All It Takes Is Leadership

    ERIC Educational Resources Information Center

    Papin, Tom; Houck, Treva

    2005-01-01

    The authors, as leaders in a public child welfare system, have teamed together and reached out to their private sector partners in a large, rural county in western Colorado. This effort was part of a comprehensive, community-wide effort to redesign and fundamentally improve the entire child welfare service delivery system. Across the country in…

  6. Community-Authored Resources for Education

    ERIC Educational Resources Information Center

    Tinker, Robert; Linn, Marcia; Gerard, Libby; Staudt, Carolyn

    2010-01-01

    Textbooks are resources for learning that provide the structure, content, assessments, and teacher guidance for an entire course. Technology can provide a far better resource that provides the same functions, but takes full advantage of computers. This resource would be much more than text on a screen. It would use the best technology; it would be…

  7. Learning Environments as Basis for Cognitive Achievements of Students in Basic Science Classrooms in Nigeria

    ERIC Educational Resources Information Center

    Atomatofa, Rachel; Okoye, Nnamdi; Igwebuike, Thomas

    2016-01-01

    The nature of classroom learning environments created by teachers had been considered very important for learning to take place effectively. This study investigated the effect of creating constructivist and transmissive learning environments on achievements of science students of different ability levels. 243 students formed the entire study…

  8. Down with Walls, Up with Malls: Taking Classes to the Shopping Centers.

    ERIC Educational Resources Information Center

    Duerden, Noel H.

    1980-01-01

    Learn and Shop, a concept of offering university credit courses by university faculty in shopping centers which was developed by Indiana University-Purdue University in Indianapolis, is described. The Learn and Shop curriculum permits individuals to earn a two-year associate degree in liberal arts entirely at shopping centers. (MLW)

  9. Singapore Math: Place Value, Computation & Number Sense. [CD-ROM

    ERIC Educational Resources Information Center

    Chen, Sandra

    2008-01-01

    "Singapore Math: Place Value, Computation & Number Sense" is a six-part presentation on CD-ROM that can be used by individual teachers or an entire school. The author takes primary to upper elementary grade teachers through place value skills with each of the computational operations: addition, subtraction, multiplication, and division. She gives…

  10. Writing in the Content Areas. Second Edition

    ERIC Educational Resources Information Center

    Benjamin, Amy

    2005-01-01

    Do you spend entirely too much time correcting your students' papers? Do your students' essays and term papers take side trips to nowhere? Is their writing riddled with mechanical errors? Do their lab reports and essays lack specificity and clarity? Writing in the Content Areas, Second Edition is for middle and high school content area teachers…

  11. RADIUM-226 AND POLONIUM-210 IN LEAF TOBACCO AND TOBACCO SOIL.

    PubMed

    TSO, T C; HALLDEN, N A; ALEXANDER, L T

    1964-11-20

    Contents of radium-226 and polonium-210 in leaf tobacco and tobacco-growing soils vary with the source. The differences may result from production locality, culture, and curing. The polonium seems to be not entirely derived from the radium; plants probably take it up from the soil or air.

  12. Taking the Metropolitan University to a Rural Community: The Role of a Needs Assessment Survey.

    ERIC Educational Resources Information Center

    Ahmed, Shamima

    1999-01-01

    When metropolitan universities refer to serving the entire metropolitan area, they often refer to rural fringes as well as concentrated urban populations. Working with rural communities requires somewhat different approaches to planning programs and understanding needs. Survey research helps the campus understand the perceptions and realities of…

  13. Taking a Systems View of Creativity: On the Right Path toward Understanding

    ERIC Educational Resources Information Center

    Hennessey, Beth A.

    2017-01-01

    In our 2010 "Annual Review" paper, Teresa Amabile and I argued that a virtual explosion of topics, viewpoints, and methodologies had muddied the investigative waters. Few, if any, "big" questions were being pursued by a critical mass of creativity researchers. Instead, investigators in one subfield seemed entirely unaware of…

  14. Adolescents' Commitment to Continuing Psychotropic Medication: A Preliminary Investigation of Considerations, Contradictions, and Correlates

    ERIC Educational Resources Information Center

    Moses, Tally

    2011-01-01

    This mixed-method study examines (1) the extent to which fifty adolescents receiving wraparound treatment and prescribed psychotropic medication for various psychiatric disorders report that they would continue taking medication if the decision was entirely their own (termed "medication commitment"); (2) their general subjective experiences with…

  15. Takes Two to Tango: Essential Practices of Highly Effective Transfer Partnerships

    ERIC Educational Resources Information Center

    Fink, John; Jenkins, Davis

    2017-01-01

    Objective: The objective of this study was to describe practices of 2- and 4-year institutional partnerships effective in supporting transfer student success. Method: Using student records from the National Student Clearinghouse (NSC) for the entire 2007 fall cohort of first-time-in-college community college students nationwide, researchers…

  16. A Smooth Road from Conventional Teaching to Distance Learning in Teacher Education

    ERIC Educational Resources Information Center

    Nishinosono, Haruo

    2002-01-01

    As a result of rapid developments in information and communication technology (ICT), dramatic changes are taking place in approaches to teaching and learning in university and school classrooms. These changes offer new opportunities to explore entirely new methods of instruction in teacher education. ICT provides alternatives to conventional…

  17. 24 CFR 203.421 - Allocation of Mutual Mortgage Insurance Fund income or loss.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... net income, or loss, the Commissioner shall allocate, after taking into account the actuarial status of the entire Mutual Mortgage Insurance Fund, such net income or such loss to the General Surplus Account and/or to the Participating Reserve Account as the Commissioner may determine to be in accord with...

  18. Teachers in the Lead: A District's Approach to Shared Leadership

    ERIC Educational Resources Information Center

    Stegall, David; Linton, Jayme

    2012-01-01

    Whether a principal builds a structure of shared decision making, shared leadership, or not, teachers will have ideas and conversations about what they feel may be more effective. These conversations impact the entire culture of a school. When teachers have the opportunity to take ownership of decision making and planning, the ultimate decisions…

  19. Lumping, Splitting and the Integration of Museum Studies with LIS

    ERIC Educational Resources Information Center

    Latham, Kiersten F.

    2015-01-01

    This paper is an attempt to support and promote education programs that cover the entire cultural heritage landscape (libraries, archives, museums) as an integrated, larger meta-discipline. By taking a larger picture approach, professionals who do the work of memory institutions can be more effective in their work, in the promotion of that…

  20. The inevitability of normative analysis.

    PubMed

    Sarkar, Sahotra

    2014-08-01

    Wilson et al. make the case for taking control of our future using evolutionary analysis. However, they are entirely silent on the ethical questions that must be addressed. This piece emphasizes this problem and notes that the relevant answers will require nontrivial analysis. This is where the humanities become relevant - in particular, philosophy and cultural anthropology.

  1. The hidden traps in decision making.

    PubMed

    Hammond, J S; Keeney, R L; Raiffa, H

    1998-01-01

    Bad decisions can often be traced back to the way the decisions were made--the alternatives were not clearly defined, the right information was not collected, the costs and benefits were not accurately weighed. But sometimes the fault lies not in the decision-making process but rather in the mind of the decision maker. The way the human brain works can sabotage the choices we make. John Hammond, Ralph Keeney, and Howard Raiffa examine eight psychological traps that are particularly likely to affect the way we make business decisions: The anchoring trap leads us to give disproportionate weight to the first information we receive. The status-quo trap biases us toward maintaining the current situation--even when better alternatives exist. The sunk-cost trap inclines us to perpetuate the mistakes of the past. The confirming-evidence trap leads us to seek out information supporting an existing predilection and to discount opposing information. The framing trap occurs when we misstate a problem, undermining the entire decision-making process. The overconfidence trap makes us overestimate the accuracy of our forecasts. The prudence trap leads us to be overcautious when we make estimates about uncertain events. And the recallability trap leads us to give undue weight to recent, dramatic events. The best way to avoid all the traps is awareness--forewarned is forearmed. But executives can also take other simple steps to protect themselves and their organizations from the various kinds of mental lapses. The authors show how to take action to ensure that important business decisions are sound and reliable.

  2. Kassiopeia: a modern, extensible C++ particle tracking package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furse, Daniel; Groh, Stefan; Trost, Nikolaus

    The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.

  3. A novel approach for supercapacitors degradation characterization

    NASA Astrophysics Data System (ADS)

    Oz, Alon; Gelman, Danny; Goren, Emanuelle; Shomrat, Neta; Baltianski, Sioma; Tsur, Yoed

    2017-07-01

    A novel approach to analyzing electrochemical impedance spectroscopy (EIS) data, based on evolutionary programming, has been utilized to characterize the operation mechanism and degradation processes of supercapacitors. This approach offers the ability to achieve a comprehensive study of supercapacitors via AC measurements alone. Commercial supercapacitors were examined during accelerated degradation. The microstructure of the electrode-electrolyte interface changes upon degradation; parasitic electrolyte reactions yield precipitates on the porous surface, which limit the access of electrolyte ions to the active area and thus reduce performance. EIS analysis using the Impedance Spectroscopy Genetic Programming (ISGP) technique enables identifying how the changing microstructure affects the operation mechanism of supercapacitors, in terms of the effective capacitance and time constant of each process. The most affected process is the transport of electrolyte ions in the porous electrode. Their access to the whole active area is hindered, which is shown in our analysis by the decrease of the capacitance gained in the transport process and the longer time it takes to penetrate the entire pore depth. Early failure detection is also demonstrated, in a way not readily possible via conventional indicators. The ISGP advanced analysis method has been verified using conventional and proven techniques: cyclic voltammetry and post-mortem measurements.

  4. On compensatory strategies and computational models: the case of pure alexia.

    PubMed

    Shallice, Tim

    2014-01-01

    The article is concerned with inferences from the behaviour of neurological patients to models of normal function. It takes the letter-by-letter reading strategy common in pure alexic patients as an example of the methodological problems involved in making such inferences that compensatory strategies produce. The evidence is discussed on the possible use of three ways the letter-by-letter reading process might operate: "reversed spelling"; the use of the phonological input buffer as a temporary holding store during word building; and the use of serial input to the visual word-form system entirely within the visual-orthographic domain such as in the model of Plaut [1999. A connectionist approach to word reading and acquired dyslexia: Extension to sequential processing. Cognitive Science, 23, 543-568]. The compensatory strategy used by, at least, one pure alexic patient does not fit with the third of these possibilities. On the more general question, it is argued that even if compensatory strategies are being used, the behaviour of neurological patients can be useful for the development and assessment of first-generation information-processing models of normal function, but they are not likely to be useful for the development and assessment of second-generation computational models.

  5. Kassiopeia: a modern, extensible C++ particle tracking package

    DOE PAGES

    Furse, Daniel; Groh, Stefan; Trost, Nikolaus; ...

    2017-05-16

    The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.

  6. Kassiopeia: a modern, extensible C++ particle tracking package

    NASA Astrophysics Data System (ADS)

    Furse, Daniel; Groh, Stefan; Trost, Nikolaus; Babutzka, Martin; Barrett, John P.; Behrens, Jan; Buzinsky, Nicholas; Corona, Thomas; Enomoto, Sanshiro; Erhard, Moritz; Formaggio, Joseph A.; Glück, Ferenc; Harms, Fabian; Heizmann, Florian; Hilk, Daniel; Käfer, Wolfgang; Kleesiek, Marco; Leiber, Benjamin; Mertens, Susanne; Oblath, Noah S.; Renschler, Pascal; Schwarz, Johannes; Slocum, Penny L.; Wandkowsky, Nancy; Wierman, Kevin; Zacher, Michael

    2017-05-01

    The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle’s state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
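
    The core computation described in this abstract, integrating a particle trajectory governed by an equation of motion through an electromagnetic field, can be illustrated independently of Kassiopeia itself. The Python sketch below is not Kassiopeia code (which is C++) and does not use its API; it simply integrates the nonrelativistic Lorentz-force equation with a fixed-step fourth-order Runge-Kutta scheme. The field configuration, particle energy, and step size are illustrative assumptions.

    ```python
    import numpy as np

    # Physical constants for an electron (SI units)
    Q = -1.602176634e-19   # charge [C]
    M = 9.1093837015e-31   # mass [kg]

    def em_field(r, t):
        """Illustrative static fields: uniform B along z, no E field."""
        E = np.zeros(3)
        B = np.array([0.0, 0.0, 1e-3])  # 1 mT
        return E, B

    def lorentz_rhs(state, t):
        """Equation of motion d(r, v)/dt for a charged particle (nonrelativistic)."""
        r, v = state[:3], state[3:]
        E, B = em_field(r, t)
        a = (Q / M) * (E + np.cross(v, B))
        return np.concatenate([v, a])

    def rk4_step(state, t, dt):
        """One classical fourth-order Runge-Kutta step."""
        k1 = lorentz_rhs(state, t)
        k2 = lorentz_rhs(state + 0.5 * dt * k1, t + 0.5 * dt)
        k3 = lorentz_rhs(state + 0.5 * dt * k2, t + 0.5 * dt)
        k4 = lorentz_rhs(state + dt * k3, t + dt)
        return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    # Track an electron with ~1 keV of kinetic energy injected along x
    v0 = np.sqrt(2 * 1e3 * 1.602176634e-19 / M)
    state = np.array([0.0, 0.0, 0.0, v0, 0.0, 0.0])
    dt, steps = 1e-11, 10_000
    trajectory = np.empty((steps, 3))
    for i in range(steps):
        trajectory[i] = state[:3]
        state = rk4_step(state, i * dt, dt)

    print("final position [m]:", trajectory[-1])
    ```

    In a full tracking package the right-hand side would also carry perturbing terms for continuous processes and would be interrupted by stochastic events (scattering, decay, surface interactions), but the integration loop above is the basic building block.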

  7. The Role of Change Agents in Technology Adoption Process

    NASA Astrophysics Data System (ADS)

    Gyampoh-Vidogah, Regina; Moreton, Robert

    Although the total or partial failure of Information Technology (IT) projects is well documented, such failures are not entirely technical in nature (Donohue et al., 2001). Project failures are often caused by a lack of attention to social factors. One study (2002) identified ethical issues, while others (1999; 2002) point to human factors, which in essence are the norms and culture of the implementation environment. On the influence of culture on project success, one author (2003) noted that the cultural problems are much bigger than the technical ones, adding: "The biggest hurdle is making people realise that information needs to be shared. It is only with this ethos of sharing information that take-up of technologies will be hastened." Consequently, research and debate about IT implementation is likely to continue until the development process is under better control (Nolan 1999). This state of constant evaluation is crucial because aborted IT projects are still commonplace. According to one survey (1998), 31% of all corporate technology development projects resulted in cancellation. Although, in broad terms, there seems to be ample evidence of the influence of non-technical factors on project failure, the dynamics of how this happens are not widely discussed. There are some pointers to the dynamics of the process in the literature.

  8. High Performance GPU-Based Fourier Volume Rendering.

    PubMed

    Abdellah, Marwan; Eldeib, Ayman; Sharawi, Amr

    2015-01-01

    Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N² log N) time complexity, it provides a faster alternative to spatial domain volume rendering algorithms that are O(N³) computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) became an attractive competent platform that can deliver giant computational raw power compared to the central processing unit (CPU) on a per-dollar-basis. The introduction of the compute unified device architecture (CUDA) technology enables embarrassingly-parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This proposed implementation can achieve a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together by taking advantage of executing the rendering pipeline entirely on recent GPU architectures.
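
    The projection-slice theorem that FVR relies on can be verified in a few lines of NumPy: the inverse 2D FFT of the central slice of a volume's 3D spectrum reproduces the attenuation-only (line-integral) projection of that volume along the perpendicular axis. The minimal sketch below covers only the axis-aligned case on a toy volume; a real FVR pipeline, GPU-accelerated or not, would resample arbitrarily oriented slices, and the array size here is an illustrative assumption.

    ```python
    import numpy as np

    # Illustrative test volume: random values standing in for CT data
    rng = np.random.default_rng(0)
    vol = rng.random((64, 64, 64))

    # Spatial-domain projection along axis 0 (the "ground truth" X-ray-style image)
    proj_spatial = vol.sum(axis=0)

    # Fourier-domain route (projection-slice theorem, axis-aligned case):
    # 1. take the 3D FFT of the volume,
    # 2. extract the central slice perpendicular to the projection axis,
    # 3. the inverse 2D FFT of that slice equals the projection.
    spectrum = np.fft.fftn(vol)
    central_slice = spectrum[0, :, :]          # the k_x = 0 plane
    proj_fourier = np.fft.ifftn(central_slice).real

    print("max abs difference:", np.max(np.abs(proj_spatial - proj_fourier)))
    ```

    The printed difference is at floating-point level, which is what makes the O(N² log N) spectral route a drop-in replacement for the O(N³) spatial accumulation.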

  9. Toward the integration of expert knowledge and instrumental data to control food processes: application to Camembert-type cheese ripening.

    PubMed

    Sicard, M; Perrot, N; Leclercq-Perlat, M-N; Baudrit, C; Corrieu, G

    2011-01-01

    Modeling the cheese ripening process remains a challenge because of its complexity. We still lack the knowledge necessary to understand the interactions that take place at different levels of scale during the process. However, information may be gathered from expert knowledge. Combining this expertise with knowledge extracted from experimental databases may allow a better understanding of the entire ripening process. The aim of this study was to elicit expert knowledge and to check its validity to assess the evolution of organoleptic quality during a dynamic food process: Camembert cheese ripening. Experiments on a pilot scale were carried out at different temperatures and relative humidities to obtain contrasting ripening kinetics. During these experiments, macroscopic evolution was evaluated from an expert's point of view and instrumental measurements were carried out to simultaneously monitor microbiological, physicochemical, and biochemical kinetics. A correlation of 76% was established between the microbiological, physicochemical, and biochemical data and the sensory phases measured according to expert knowledge, highlighting the validity of the experts' measurements. In the future, it is hoped that this expert knowledge may be integrated into food process models to build better decision-aid systems that will make it possible to preserve organoleptic qualities by linking them to other phenomena at the microscopic level. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  10. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    NASA Technical Reports Server (NTRS)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. Drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments while taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be extended to take in multiple assets and realistic parameter values in order to draw an efficient risk-return frontier for the entire set of FAA investment programs.
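
    To make the corporate-finance machinery referenced here concrete, the following is a minimal mean-variance sketch in Python: given assumed expected returns and a covariance matrix for three hypothetical programs, it traces a risk-return frontier by computing the closed-form minimum-variance portfolio at each target return. The numbers and the unconstrained (short positions allowed) formulation are illustrative assumptions, not figures or methods taken from the paper.

    ```python
    import numpy as np

    # Illustrative (made-up) expected annual returns and covariance for three programs
    mu = np.array([0.05, 0.08, 0.12])
    cov = np.array([[0.010, 0.002, 0.001],
                    [0.002, 0.020, 0.004],
                    [0.001, 0.004, 0.040]])

    def min_variance_weights(target_return):
        """Closed-form minimum-variance weights for a given target return
        (fully invested; short positions allowed for simplicity)."""
        inv = np.linalg.inv(cov)
        ones = np.ones_like(mu)
        a = ones @ inv @ ones
        b = ones @ inv @ mu
        c = mu @ inv @ mu
        d = a * c - b * b
        lam = (c - b * target_return) / d
        gam = (a * target_return - b) / d
        return inv @ (lam * ones + gam * mu)

    # Trace the efficient frontier between the lowest and highest program returns
    for r in np.linspace(mu.min(), mu.max(), 5):
        w = min_variance_weights(r)
        risk = np.sqrt(w @ cov @ w)
        print(f"target return {r:5.1%}  risk {risk:5.1%}  weights {np.round(w, 2)}")
    ```

    A full application along the lines the authors describe would add constraints (no short positions, budget caps) and program interdependencies, typically via a numerical optimizer rather than the closed form used here.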

  11. Understanding and becoming - the heart of the matter in nurse education.

    PubMed

    Sandvik, Ann-Helén; Eriksson, Katie; Hilli, Yvonne

    2015-03-01

    The aim of this study was to deepen the understanding of student nurses' processes of understanding and becoming nurses. The study is phenomenological-hermeneutic in design, comprising data from three focus group interviews in two Scandinavian countries. The process of student nurses' understanding and becoming a nurse emerged as a hermeneutical movement. A caring student-preceptor relationship and a growth-promoting preceptorship in a supportive and inclusive environment provide the frame within which the movement happens. The movement towards understanding and becoming is initiated as students, based on their level of knowledge, are given responsibility. In order to fulfil the responsibility imposed on them, students take their entire repertoire of knowledge into consideration. By tying these threads together, they found the basis for conscious action, and care is provided according to what the current situation requires. The experiences obtained are reflected on and integrated with earlier knowledge, which leads to enhanced understanding. Students form a new base to stand on. They show increased readiness for still more responsibility and action. This movement towards deeper understanding and becoming also affects the students ethically and deepens their ethical awareness. When one loop of understanding and becoming is closed, the process continues by passing into a new loop. This movement could be described as a hermeneutical spiral consisting of interconnected loops taking the students further and deeper in their process of understanding and becoming a nurse. The student-preceptor relationship and the ethos permeating it are decisive for students' learning both epistemologically and ontologically. Responsibility is the catalyst in students' understanding and becoming both intellectually and ethically. Understanding and becoming are ongoing processes of appropriation, thus altering students both professionally and personally. Understanding and becoming can be perceived as the heart of the matter in nurse education. © 2014 Nordic College of Caring Science.

  12. The life cycle of a mineral deposit: a teacher's guide for hands-on mineral education activities

    USGS Publications Warehouse

    Frank, Dave; Galloway, John; Assmus, Ken

    2005-01-01

    This teacher's guide defines what a mineral deposit is and how a mineral deposit is identified and measured, how the mineral resources are extracted, and how the mining site is reclaimed; how minerals and mineral resources are processed; and how we use mineral resources in our every day lives. Included are 10 activity-based learning exercises that educate students on basic geologic concepts; the processes of finding, identifying, and extracting the resources from a mineral deposit; and the uses of minerals. The guide is intended for K through 12 Earth science teachers and students and is designed to meet the National Science Content Standards as defined by the National Research Council (1996). To assist in the understanding of some of the geology and mineral terms, see the Glossary (appendix 1) and Minerals and Their Uses (appendix 2). The process of finding or exploring for a mineral deposit, extracting or mining the resource, recovering the resource, also known as beneficiation, and reclaiming the land mined can be described as the “life cycle” of a mineral deposit. The complete process is time consuming and expensive, requiring the use of modern technology and equipment, and may take many years to complete. Sometimes one entity or company completes the entire process from discovery to reclamation, but often it requires multiple groups with specialized experience working together. Mineral deposits are the source of many important commodities, such as copper and gold, used by our society, but it is important to realize that mineral deposits are a nonrenewable resource. Once mined, they are exhausted, and another source must be found. New mineral deposits are being continuously created by the Earth but may take millions of years to form. Mineral deposits differ from renewable resources, such as agricultural and timber products, which may be replenished within a few months to several years.

  13. Neurocognitive inefficacy of the strategy process.

    PubMed

    Klein, Harold E; D'Esposito, Mark

    2007-11-01

    The most widely used (and taught) protocols for strategic analysis, Strengths, Weaknesses, Opportunities, and Threats (SWOT) and Porter's (1980) Five Forces Framework for industry analysis, have been found to be insufficient as stimuli for strategy creation or even as a basis for further strategy development. We approach this problem from a neurocognitive perspective. We see profound incompatibilities between the cognitive process of deductive reasoning, channeled into the collective mind of strategists within the formal planning process through its tools of strategic analysis (i.e., rational technologies), and the essentially inductive reasoning process actually needed to address ill-defined, complex strategic situations. Thus, strategic analysis protocols that may appear to be, and indeed are, entirely rational and logical are not interpretable as such at the neuronal substrate level where thinking takes place. The analytical structure (or propositional representation) of these tools results in a mental dead end, the phenomenon known in cognitive psychology as functional fixedness. The difficulty lies in the inability of the brain to make out meaningful (i.e., strategy-provoking) stimuli from the mental images (or depictive representations) generated by strategic analysis tools. We propose decreasing dependence on these tools and conducting further research employing brain-imaging technology to explore complex data-handling protocols with richer mental representation and greater potential for strategy creation.

  14. Synthetic biology stretching the realms of possibility in wine yeast research.

    PubMed

    Jagtap, Umesh B; Jadhav, Jyoti P; Bapat, Vishwas A; Pretorius, Isak S

    2017-07-03

    It took several millennia to fully understand the scientific intricacies of the process through which grape juice is turned into wine. This yeast-driven fermentation process is still being perfected and advanced today. Motivated by ever-changing consumer preferences and the belief that the 'best' wine is yet to be made, numerous approaches are being pursued to improve the process of yeast fermentation and the quality of wine. Central to recent enhancements in winemaking processes and wine quality is the development of Saccharomyces cerevisiae yeast strains with improved robustness, fermentation efficiencies and sensory properties. The emerging science of Synthetic Biology, including genome engineering and DNA editing technologies, is taking yeast strain development into a totally new realm of possibility. The first example of how future wine strain development might be impacted by these new 'history-making' Synthetic Biology technologies is the de novo production of the raspberry ketone aroma compound, 4-[4-hydroxyphenyl]butan-2-one, in a wine yeast containing a synthetic DNA cassette. This article explores how this breakthrough and the imminent outcome of the international Yeast 2.0 (or Sc2.0) project, aimed at the synthesis of the entire genome of a laboratory strain of S. cerevisiae, might accelerate the design of improved wine yeasts. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Neurokernel: An Open Source Platform for Emulating the Fruit Fly Brain

    PubMed Central

    2016-01-01

    We have developed an open software platform called Neurokernel for collaborative development of comprehensive models of the brain of the fruit fly Drosophila melanogaster and their execution and testing on multiple Graphics Processing Units (GPUs). Neurokernel provides a programming model that capitalizes upon the structural organization of the fly brain into a fixed number of functional modules to distinguish between these modules’ local information processing capabilities and the connectivity patterns that link them. By defining mandatory communication interfaces that specify how data is transmitted between models of each of these modules regardless of their internal design, Neurokernel explicitly enables multiple researchers to collaboratively model the fruit fly’s entire brain by integration of their independently developed models of its constituent processing units. We demonstrate the power of Neurokernel’s model integration by combining independently developed models of the retina and lamina neuropils in the fly’s visual system and by demonstrating their neuroinformation processing capability. We also illustrate Neurokernel’s ability to take advantage of direct GPU-to-GPU data transfers with benchmarks that demonstrate scaling of Neurokernel’s communication performance both over the number of interface ports exposed by an emulation’s constituent modules and the total number of modules comprised by an emulation. PMID:26751378

  16. Monte Carlo simulation of efficient data acquisition for an entire-body PET scanner

    NASA Astrophysics Data System (ADS)

    Isnaini, Ismet; Obi, Takashi; Yoshida, Eiji; Yamaya, Taiga

    2014-07-01

    Conventional PET scanners image the whole body using many bed positions. An entire-body PET scanner with an extended axial FOV, which can capture whole-body uptake simultaneously and improve sensitivity, has therefore long been desired. Such a scanner would have to process a large amount of data effectively; as a result, it suffers high dead time in the multiplexed detector grouping process and produces many oblique lines of response. In this work, we study efficient data acquisition for the entire-body PET scanner using Monte Carlo simulation. The simulated entire-body PET scanner, based on depth-of-interaction detectors, has a 2016-mm axial field-of-view (FOV) and an 80-cm ring diameter. Because the entire-body PET scanner loses more single events at the grouping circuits than a conventional PET scanner, its NECR decreases. However, this single-event loss is mitigated by separating the axially arranged detectors into multiple groups: our choice of 3 axial groups was shown to increase the peak NECR by 41%. An appropriate choice of maximum ring difference (MRD) also maintains high sensitivity and a high peak NECR while reducing the data size. The extremely oblique lines of response in a large axial FOV do not contribute much to the performance of the scanner: the total sensitivity with full MRD was only 15% higher than that with about half the MRD, and the peak NECR saturated at about half the MRD. The entire-body PET scanner thus promises to provide a large axial FOV and sufficient performance without using the full data.

  17. China.

    PubMed

    1981-01-01

    China's 3rd national census will belong to the era of modern census taking. Over 6 million enumerators will be involved along with 29 computers for data processing. The 3-year budget exceeds the equivalent of $135 million. A pilot census was taken in the city and county of Wuxi in Jiangsu province south of Shanghai during June 1980. Additional pilot censuses are to be conducted in the provinces beginning early in 1981. The full count is scheduled for 1 year later, on July 1, 1982. Results will be processed and made available by 1984 so that planners can utilize them in drafting the 5-year development plan for 1985-1990. The censuses of 1953 and 1964 yielded little data by modern standards. The long-term objective of the Population Census Office is to build up a modern census-taking capability. This will provide data for the formulation of population and development policies, programs to implement those policies, and family planning activities. Another long-term objective is to extend the new data processing system to 399 prefectures and 2168 counties in China. The equipment will be subsequently used in related research activities. For the current census, a complete organization of census offices, census working teams, and census working groups will be established at successive administrative levels down to neighborhood (urban) and brigade (rural) levels, beginning early in 1981. The full census will cover 29 provinces of China. Approximately 6 million enumerators will each cover about 30-40 households. 2 models of computer and corresponding data entry systems are being used: 8 Wang VS 2200 systems and 21 IBM 4300 series systems from the U.S. The United Nations Fund for Population Activities is supplying equipment and technical assistance for the entire census amounting to more than $15 million. The Population Census Office will analyze and publish the census data.

  18. Who has the D? How clear decision roles enhance organizational performance.

    PubMed

    Rogers, Paul; Blenko, Marcia

    2006-01-01

    Decisions are the coin of the realm in business. But even in highly respected companies, decisions can get stuck inside the organization like loose change. As a result, the entire decision-making process can stall, usually at one of four bottlenecks: global versus local, center versus business unit, function versus function, and inside versus outside partners. Decision-making bottlenecks can occur whenever there is ambiguity or tension over who gets to decide what. For example, do marketers or product developers get to decide the features of a new product? Should a major capital investment depend on the approval of the business unit that will own it, or should headquarters make the final call? Which decisions can be delegated to an outsourcing partner, and which must be made internally? Bain consultants Paul Rogers and Marcia Blenko use an approach called RAPID (recommend, agree, perform, input, and decide) to help companies unclog their decision-making bottlenecks by explicitly defining roles and responsibilities. For example, British American Tobacco struck a new balance between global and local decision making to take advantage of the company's scale while maintaining its agility in local markets. At Wyeth Pharmaceuticals, a growth opportunity revealed the need to push more decisions down to the business units. And at the UK department-store chain John Lewis, buyers and sales staff clarified their decision roles in order to implement a new strategy for selling its salt and pepper mills. When revamping its decision-making process, a company must take some practical steps: Align decision roles with the most important sources of value, make sure that decisions are made by the right people at the right levels of the organization, and let the people who will live with the new process help design it.

  19. Strategy for development of the Polish electricity sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dybowski, J.

    1995-12-01

    This paper presents the strategy for development of the Polish electricity sector, dealing with specific problems that are common to all of East Central Europe. In 1990 Poland adopted a restructuring program for the entire energy sector. The very ambitious plans were changed several times, but the main direction of change was preserved. The most difficult period of the transformation is characterized by several contradictions that have to be balanced. Electricity prices should increase in order to cover the modernization and development program, but society is not able to bear this burden in such a short time. Furthermore, the new environmental protection standards force the growth of the capital investment program, which sooner or later has to be passed on through electricity prices. New economic mechanisms have to be introduced into the electricity sector to replace the old, ineffective, centrally planned ones. This process has to follow gradual management changes. Introduction of the new electricity market is also limited by these constraints. However, this process of change would not be possible without parallel governmental initiatives such as the preparation of a new energy law and a regulatory framework.

  20. Fresh old news from Ferenczi about the function of dreams: the dream as a Kur, as a treatment and as a Gyógyászat.

    PubMed

    Canesin Dal Molin, Eugênio

    2012-10-01

    This article discusses a text on the function of dreams and their relation to trauma. Ferenczi intended to present this material as a talk at the 12th International Congress of Psychoanalysis, which was to take place in Interlaken, Switzerland the same year that he wrote it (1931). The entire conference, however, was postponed, and parts of this communication's content appeared in other texts in which Ferenczi rethinks the concept of trauma and its clinical significance. In the present article the author makes use of the Freud/Ferenczi correspondence to contextualize Freud's Hungarian follower's originality regarding his theorizations about different aspects of the function of dreams. In the 1931 speech, as well as in this article, Ferenczi used a patient's dream work as a clinical example of a process in which traumatic experiences and unmastered sensory impressions can be repeated to achieve a better working-through for the dreamer. The process Ferenczi describes resembles an effort of self-treatment, of self-Kur. Copyright © 2012 Institute of Psychoanalysis.

  1. LINCS: Livermore's network architecture. [Octopus computing network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.

    1982-01-01

    Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, the reasons why we have designed our own protocols and why we are dissatisfied by the directions that current protocol standards are taking.

  2. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

    Inverters are widely used for industrial, office, and residential purposes. They support the development of alternative energy sources such as solar cells, wind turbines, and fuel cells by converting dc voltage to ac voltage. Inverters have been built with a variety of hardware and software combinations, such as pure analog circuits or various types of microcontrollers as the controller. With a pure analog circuit, modification is difficult because it requires changing the entire hardware. In a microcontroller-based (software) design, the calculations that generate the AC modulation are done in the microcontroller. This increases programming complexity and the amount of code downloaded to the microcontroller chip (the flash memory capacity of the microcontroller is limited). This paper discusses the design of a single-phase inverter using unipolar modulation of a sine wave and a triangular wave, computed outside the microcontroller using a data processing software application (Microsoft Excel). The results show that programming complexity was reduced and that the sampling resolution strongly influences the THD; a sampling resolution of half a degree is required to obtain the best THD (15.8%).
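
    As a rough illustration of the idea of precomputing the modulation data outside the microcontroller, the sketch below performs the usual unipolar sine-triangle comparison as a precomputed duty-cycle table (here in Python rather than Excel) that firmware could load into its PWM compare registers. The carrier frequency, fundamental frequency, modulation index, and file name are illustrative assumptions, not values from the paper.

    ```python
    import math

    F_FUND = 50.0         # fundamental output frequency [Hz]
    F_CARRIER = 10_000.0  # triangular carrier / PWM frequency [Hz]
    M_INDEX = 0.8         # modulation index (0..1)

    SAMPLES = int(F_CARRIER / F_FUND)   # one duty-cycle entry per carrier period

    # Duty-cycle lookup tables for the two inverter legs (unipolar modulation):
    # leg A follows +sin, leg B follows -sin; the load sees their difference,
    # so the effective output ripple is at twice the carrier frequency.
    leg_a, leg_b = [], []
    for k in range(SAMPLES):
        ref = M_INDEX * math.sin(2 * math.pi * k / SAMPLES)
        leg_a.append((1 + ref) / 2)   # compare value as a fraction of the PWM period
        leg_b.append((1 - ref) / 2)

    # Export as CSV, ready to paste into firmware as a constant array
    with open("spwm_table.csv", "w") as f:
        f.write("index,duty_leg_a,duty_leg_b\n")
        for k, (a, b) in enumerate(zip(leg_a, leg_b)):
            f.write(f"{k},{a:.4f},{b:.4f}\n")

    print(f"wrote {SAMPLES} entries covering one {F_FUND:.0f} Hz cycle")
    ```

    The microcontroller then only steps through the table and updates its compare registers, which is what keeps the on-chip code small.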

  3. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  4. The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1992-01-01

    The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.

  5. The origin, current diversity and future conservation of the modern lion (Panthera leo)

    PubMed Central

    Barnett, Ross; Yamaguchi, Nobuyuki; Barnes, Ian; Cooper, Alan

    2006-01-01

    Understanding the phylogeographic processes affecting endangered species is crucial both to interpreting their evolutionary history and to the establishment of conservation strategies. Lions provide a key opportunity to explore such processes; however, a lack of genetic diversity and shortage of suitable samples has until now hindered such investigation. We used mitochondrial control region DNA (mtDNA) sequences to investigate the phylogeographic history of modern lions, using samples from across their entire range. We find the sub-Saharan African lions are basal among modern lions, supporting a single African origin model of modern lion evolution, equivalent to the ‘recent African origin’ model of modern human evolution. We also find the greatest variety of mtDNA haplotypes in the centre of Africa, which may be due to the distribution of physical barriers and continental-scale habitat changes caused by Pleistocene glacial oscillations. Our results suggest that the modern lion may currently consist of three geographic populations on the basis of their recent evolutionary history: North African–Asian, southern African and middle African. Future conservation strategies should take these evolutionary subdivisions into consideration. PMID:16901830

  6. Process analysis of a molten carbonate fuel cell power plant fed with a biomass syngas

    NASA Astrophysics Data System (ADS)

    Tomasi, C.; Baratieri, M.; Bosio, B.; Arato, E.; Baggio, P.

    The coupling of renewable energy sources and innovative power generation technologies is of topical interest to meet demands for increased power generation and cleaner environmental performance. Accordingly, biomass is receiving considerable attention as a partial substitute for fossil fuels, as it is more environmentally friendly and provides a profitable way of disposing of waste. In addition, fuel cells are perceived as most promising electrical power generation systems. Today, many plants combining these two concepts are under study; they differ in terms of biomass type and/or power plant configuration. Even if the general feasibility of such applications has been demonstrated, there are still many associated problems to be resolved. This study examines a plant configuration based on a molten carbonate fuel cell (MCFC) and a recirculated fluidized-bed reactor which has been applied to the thermal conversion of many types of biomass. Process analysis is conducted by simulating the entire plant using a commercial code. In particular, an energy assessment is studied by taking account of the energy requirements of auxiliary equipment and the possibility of utilizing the exhaust gases for cogeneration.

  7. A Dividend in Food Safety

    NASA Technical Reports Server (NTRS)

    1991-01-01

    When NASA faced the problem of how and what to feed an astronaut in a sealed capsule under weightless conditions while planning for manned space missions, they enlisted the aid of The Pillsbury Company. There were two principal concerns: barring crumbs of food that might contaminate the spacecraft's atmosphere or float their way into sensitive instruments; and assuring absolute freedom from potentially catastrophic disease-producing bacteria and toxins. Pillsbury quickly solved the first concern, but the other part of the problem was not as easy. They found that, using standard methods, there was no way to be assured the food would be free of bacteria. It was concluded that the only way to succeed was to establish control over the entire process: the raw materials, the processing environment, and the people involved. Pillsbury developed the Hazard Analysis and Critical Control Point (HACCP) concept. HACCP is designed to prevent food safety problems rather than to catch them after they have occurred. Three other government agencies are taking preliminary steps toward extending HACCP to meat/poultry and seafood inspection operations. Today, Pillsbury plants are still operating under HACCP.

  8. Vesselness propagation: a fast interactive vessel segmentation method

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Dachille, Frank; Harris, Gordon J.; Yoshida, Hiroyuki

    2006-03-01

    With the rapid development of multi-detector computed tomography (MDCT), resulting in increasing temporal and spatial resolution of data sets, clinical use of computed tomographic angiography (CTA) is rapidly increasing. Analysis of vascular structures is much needed in CTA images; however, the basis of the analysis, vessel segmentation, can still be a challenging problem. In this paper, we present a fast interactive method for CTA vessel segmentation, called vesselness propagation. This method is a two-step procedure, with a pre-processing step and an interactive step. During the pre-processing step, a vesselness volume is computed by application of a CTA transfer function followed by a multi-scale Hessian filtering. At the interactive stage, the propagation is controlled interactively in terms of the priority of the vesselness. This method was used successfully in many CTA applications such as the carotid artery, coronary artery, and peripheral arteries. It takes less than one minute for a user to segment the entire vascular structure. Thus, the proposed method provides an effective way of obtaining an overview of vascular structures.
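    The interactive stage can be pictured as region growing ordered by vesselness. The following 2D sketch assumes a `propagate` helper, a synthetic vesselness map, and a fixed threshold; the real method operates on 3D CTA volumes after the Hessian-filtering pre-processing step.

```python
# A minimal 2D sketch of the interactive stage: region growing from a user-selected
# seed, always expanding the highest-vesselness pixel first (a priority queue keyed
# on vesselness). The vesselness map here is a synthetic stand-in for the
# Hessian-filtered volume of the pre-processing step.
import heapq
import numpy as np

def propagate(vesselness, seed, threshold):
    visited = np.zeros(vesselness.shape, dtype=bool)
    segmented = np.zeros(vesselness.shape, dtype=bool)
    heap = [(-vesselness[seed], seed)]
    visited[seed] = True
    while heap:
        neg_v, (r, c) = heapq.heappop(heap)
        if -neg_v < threshold:
            continue                      # below threshold: stop growing here
        segmented[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < vesselness.shape[0] and 0 <= nc < vesselness.shape[1] \
                    and not visited[nr, nc]:
                visited[nr, nc] = True
                heapq.heappush(heap, (-vesselness[nr, nc], (nr, nc)))
    return segmented

# Synthetic "vessel": a bright horizontal line in a noisy background.
v = np.random.default_rng(1).random((20, 20)) * 0.2
v[10, :] = 0.9
print(propagate(v, seed=(10, 0), threshold=0.5).sum())   # about 20 pixels of the line
```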

  9. Women in Physics in Germany, 2008

    NASA Astrophysics Data System (ADS)

    Kluge, Hanna

    2009-04-01

    The status of women in physics in Germany has not changed dramatically in the three years since the last IUPAP Women in Physics Conference was held in 2005. The salary of a woman remains approximately 25% lower than that of a man in a comparable professional position. The number of female professors is growing slowly. The proportion of women among those beginning to study physics is around 20%. There is, however, a noticeable increase in organization and societal acceptance of female physicists, and an increasing number of men taking part in this process. There is also increased acceptance and support of dual-career couples. The Helmholtz Alliance for "Physics at the Terascale" founded a dual-career option program. In 2008, the annual Conference of German Female Physicists (DPT) held in Muenster became an official conference of the DPG (German Physical Society). Various scientific groups working for equal opportunity have formed a "network of networks." At the DESY (German Electron Synchrotron), a group of women led by an equal opportunity officer is involved in the entire process of hiring new staff members in all positions, including directors.

  10. Mantle dynamics and seismic tomography

    PubMed Central

    Tanimoto, Toshiro; Lay, Thorne

    2000-01-01

    Three-dimensional imaging of the Earth's interior, called seismic tomography, has achieved breakthrough advances in the last two decades, revealing fundamental geodynamical processes throughout the Earth's mantle and core. Convective circulation of the entire mantle is taking place, with subducted oceanic lithosphere sinking into the lower mantle, overcoming the resistance to penetration provided by the phase boundary near 650-km depth that separates the upper and lower mantle. The boundary layer at the base of the mantle has been revealed to have complex structure, involving local stratification, extensive structural anisotropy, and massive regions of partial melt. The Earth's high Rayleigh number convective regime now is recognized to be much more interesting and complex than suggested by textbook cartoons, and continued advances in seismic tomography, geodynamical modeling, and high-pressure–high-temperature mineral physics will be needed to fully quantify the complex dynamics of our planet's interior. PMID:11035784

  11. Improving a workstation in an existing cab by means of a participatory approach: the case of subway operators' workstations.

    PubMed

    Bellemare, Marie; Beaugrand, Sylvie; Larue, Christian; Champoux, Danièle

    2009-01-01

    A study was conducted to identify possible solutions for redesigning a subway cab in order to improve the posture of drivers working in a restricted space. The approach used included the participation of a working group comprised of operations, maintenance, and engineering managers as well as several drivers. After 6 meetings in which different simulation techniques were used, the working group proposed changes for increasing the available space inside the cab and three seat designs. The involvement of the actors from the three departments affected by the changes, as well as the operators, throughout the process, was a determining factor in the advancement and acceptance of the projects. The fact that 400 cars are currently in service and must be modified means that it will take several years to implement the modifications in the entire fleet.

  12. The NASA program in Space Energy Conversion Research and Technology

    NASA Astrophysics Data System (ADS)

    Mullin, J. P.; Flood, D. J.; Ambrus, J. H.; Hudson, W. R.

    The considered Space Energy Conversion Program seeks advancement of basic understanding of energy conversion processes and improvement of component technologies, always in the context of the entire power subsystem. Activities in the program are divided among the traditional disciplines of photovoltaics, electrochemistry, thermoelectrics, and power systems management and distribution. In addition, a broad range of cross-disciplinary explorations of potentially revolutionary new concepts are supported under the advanced energetics program area. Solar cell research and technology are discussed, taking into account the enhancement of the efficiency of Si solar cells, GaAs liquid phase epitaxy and vapor phase epitaxy solar cells, the use of GaAs solar cells in concentrator systems, and the efficiency of a three junction cascade solar cell. Attention is also given to blanket and array technology, the alkali metal thermoelectric converter, a fuel cell/electrolysis system, and thermal to electric conversion.

  13. Real-time Simulation of Turboprop Engine Control System

    NASA Astrophysics Data System (ADS)

    Sheng, Hanlin; Zhang, Tianhong; Zhang, Yi

    2017-05-01

    Because of the complexity of turboprop engine control systems, real-time simulation is a technology that, while preserving real-time execution, effectively reduces development cost, shortens the development cycle, and averts testing risks. This paper takes RT-LAB as a platform and studies the real-time digital simulation of a turboprop engine control system. The architecture, working principles, and external interfaces of the RT-LAB real-time simulation platform are introduced first. Then, based on a turboprop engine model, the control laws of the propeller control loop and the fuel control loop are studied. Building on this, an integrated controller is designed in Matlab/Simulink that realizes control of the entire engine operating process, from start-up through maximum power to shutdown. Finally, the designed control system is simulated in real time on the RT-LAB platform; different regulating plans are tested and improved control performance is obtained.
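    As a rough illustration of what such a fuel control loop does, the sketch below closes a PI loop around a first-order engine-speed model. The gains, time constant, and speed target are invented for the example and do not come from the RT-LAB/Simulink design described above.

```python
# A minimal sketch of a fuel control loop: a PI regulator driving a first-order
# engine-speed model toward a commanded speed. Gains, time constant, and step size
# are illustrative assumptions, not parameters of the study's controller.
def simulate(setpoint_rpm=30000.0, kp=0.002, ki=0.004, tau=1.5, dt=0.01, t_end=10.0):
    speed, integ = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        err = setpoint_rpm - speed
        integ += err * dt
        fuel = kp * err + ki * integ                  # PI law -> fuel flow command
        # first-order engine response: d(speed)/dt = (gain * fuel - speed) / tau
        speed += dt * (400.0 * fuel - speed) / tau
    return speed

print(f"speed after 10 s: {simulate():.0f} rpm")      # approaches the 30000 rpm setpoint
```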

  14. The NASA program in Space Energy Conversion Research and Technology

    NASA Technical Reports Server (NTRS)

    Mullin, J. P.; Flood, D. J.; Ambrus, J. H.; Hudson, W. R.

    1982-01-01

    The considered Space Energy Conversion Program seeks advancement of basic understanding of energy conversion processes and improvement of component technologies, always in the context of the entire power subsystem. Activities in the program are divided among the traditional disciplines of photovoltaics, electrochemistry, thermoelectrics, and power systems management and distribution. In addition, a broad range of cross-disciplinary explorations of potentially revolutionary new concepts are supported under the advanced energetics program area. Solar cell research and technology are discussed, taking into account the enhancement of the efficiency of Si solar cells, GaAs liquid phase epitaxy and vapor phase epitaxy solar cells, the use of GaAs solar cells in concentrator systems, and the efficiency of a three junction cascade solar cell. Attention is also given to blanket and array technology, the alkali metal thermoelectric converter, a fuel cell/electrolysis system, and thermal to electric conversion.

  15. Brain stimulation reveals crucial role of overcoming self-centeredness in self-control

    PubMed Central

    Soutschek, Alexander; Ruff, Christian C.; Strombach, Tina; Kalenscher, Tobias; Tobler, Philippe N.

    2016-01-01

    Neurobiological models of self-control predominantly focus on the role of prefrontal brain mechanisms involved in emotion regulation and impulse control. We provide evidence for an entirely different neural mechanism that promotes self-control by overcoming bias for the present self, a mechanism previously thought to be mainly important for interpersonal decision-making. In two separate studies, we show that disruptive transcranial magnetic stimulation (TMS) of the temporo-parietal junction—a brain region involved in overcoming one’s self-centered perspective—increases the discounting of delayed and prosocial rewards. This effect of TMS on temporal and social discounting is accompanied by deficits in perspective-taking and does not reflect altered spatial reorienting and number recognition. Our findings substantiate a fundamental commonality between the domains of self-control and social decision-making and highlight a novel aspect of the neurocognitive processes involved in self-control. PMID:27774513

  16. Brain stimulation reveals crucial role of overcoming self-centeredness in self-control.

    PubMed

    Soutschek, Alexander; Ruff, Christian C; Strombach, Tina; Kalenscher, Tobias; Tobler, Philippe N

    2016-10-01

    Neurobiological models of self-control predominantly focus on the role of prefrontal brain mechanisms involved in emotion regulation and impulse control. We provide evidence for an entirely different neural mechanism that promotes self-control by overcoming bias for the present self, a mechanism previously thought to be mainly important for interpersonal decision-making. In two separate studies, we show that disruptive transcranial magnetic stimulation (TMS) of the temporo-parietal junction-a brain region involved in overcoming one's self-centered perspective-increases the discounting of delayed and prosocial rewards. This effect of TMS on temporal and social discounting is accompanied by deficits in perspective-taking and does not reflect altered spatial reorienting and number recognition. Our findings substantiate a fundamental commonality between the domains of self-control and social decision-making and highlight a novel aspect of the neurocognitive processes involved in self-control.

  17. Noise induced escape from a nonhyperbolic chaotic attractor of a periodically driven nonlinear oscillator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Zhen, E-mail: czkillua@icloud.com, E-mail: xbliu@nuaa.edu.cn; Li, Yang; Liu, Xianbin, E-mail: czkillua@icloud.com, E-mail: xbliu@nuaa.edu.cn

    2016-06-15

    Noise induced escape from the domain of attraction of a nonhyperbolic chaotic attractor in a periodically excited nonlinear oscillator is investigated. The general mechanism of the escape in the weak noise limit is studied in the continuous case, and the fluctuational path is obtained by statistical analysis. Selecting the primary homoclinic tangency as the initial condition, the action plot is presented by parametrizing the set of escape trajectories, and the global minimum gives rise to the optimal path. Results of both methods show good agreement. The entire process of escape is discussed in detail, step by step, using the fluctuational force. A structure of hierarchical heteroclinic crossings of stable and unstable manifolds of saddle cycles is found, and the escape is observed to take place through successive jumps through this deterministic hierarchical structure.
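    The "action plot" idea can be illustrated in one dimension: evaluate the Freidlin-Wentzell action S = (1/2)∫|ẋ − f(x)|² dt for a family of candidate escape paths and pick the minimum. The sketch below uses a double-well drift and a tanh path family as stand-ins for the periodically driven oscillator of the study.

```python
# A minimal one-dimensional sketch of the action-plot idea: the discretized
# Freidlin-Wentzell action is evaluated for a family of candidate escape paths,
# and the smallest value marks the optimal (most probable) path. The drift f(x)
# and the path family are illustrative assumptions, not the studied oscillator.
import numpy as np

def action(path, dt, drift):
    xdot = np.diff(path) / dt
    x_mid = 0.5 * (path[:-1] + path[1:])
    return 0.5 * np.sum((xdot - drift(x_mid)) ** 2) * dt

drift = lambda x: x - x**3            # double-well drift: escape from x = -1 toward x = +1
t = np.linspace(0.0, 10.0, 501)
dt = t[1] - t[0]
# Family of paths from the attractor at -1 over the barrier at 0, indexed by speed.
speeds = np.linspace(0.2, 2.0, 10)
actions = [action(np.tanh(s * (t - 5.0)), dt, drift) for s in speeds]
best = speeds[int(np.argmin(actions))]
print(f"minimum action {min(actions):.3f} at path-speed parameter {best:.2f}")
```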

  18. Modeling of agent-based complex network under cyber-violence

    NASA Astrophysics Data System (ADS)

    Huang, Chuanchao; Hu, Bin; Jiang, Guoyin; Yang, Ruixian

    2016-09-01

    Public opinion reversal arises frequently in modern society, due to the continual interactions between individuals and their surroundings. To explore the underlying mechanism of this interesting social phenomenon, we introduce a new model which takes into account the relationship between individual cognitive bias and the corresponding choice behavior. Experimental results show that the proposed model can provide an accurate description of the entire process of public opinion reversal in the internet environment, and that the distribution of cognitive bias serves as a measure of the reversal probability. In particular, the application to cyber violence, a typical example of public opinion reversal, suggests that public opinion is prone to be seriously affected by the spread of misleading and harmful information. Furthermore, our model is robust and can thus be applied in other empirical studies concerning sudden changes of public and personal opinion on the internet.
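    In the spirit of the model described above, the sketch below lets agents with heterogeneous cognitive bias repeatedly align with the population mean plus their own bias, producing a reversal of the initial majority. The fully mixed network, the update rule, and the bias distribution are assumptions for illustration, not the paper's model.

```python
# A minimal agent-based sketch: each agent carries a cognitive bias drawn from some
# distribution and, at every step, adopts the opinion favoured by the population
# average plus its own bias. Network, update rule, and bias distribution are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
n_agents, steps = 200, 60
opinion = np.where(rng.random(n_agents) < 0.7, 1.0, -1.0)   # initial majority holds +1
bias = rng.normal(loc=-0.4, scale=0.3, size=n_agents)        # biases pulling toward -1

for _ in range(steps):
    field = opinion.mean()                    # fully mixed "neighbourhood"
    opinion = np.sign(field + bias)
    opinion[opinion == 0] = 1.0

print(f"final share holding opinion +1: {(opinion > 0).mean():.2f}")   # majority reversed
```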

  19. The energy landscape of adenylate kinase during catalysis

    PubMed Central

    Kerns, S. Jordan; Agafonov, Roman V.; Cho, Young-Jin; Pontiggia, Francesco; Otten, Renee; Pachov, Dimitar V.; Kutter, Steffen; Phung, Lien A.; Murphy, Padraig N.; Thai, Vu; Alber, Tom; Hagan, Michael F.; Kern, Dorothee

    2014-01-01

    Kinases perform phosphoryl-transfer reactions in milliseconds; without enzymes, these reactions would take about 8000 years under physiological conditions. Despite extensive studies, a comprehensive understanding of kinase energy landscapes, including both chemical and conformational steps, is lacking. Here we scrutinize the microscopic steps in the catalytic cycle of adenylate kinase, through a combination of NMR measurements during catalysis, pre-steady-state kinetics, MD simulations, and crystallography of active complexes. We find that the Mg²⁺ cofactor activates two distinct molecular events, phosphoryl transfer (>10⁵-fold) and lid-opening (10³-fold). In contrast, mutation of an essential active-site arginine decelerates phosphoryl transfer 10³-fold without substantially affecting lid-opening. Our results highlight the importance of the entire energy landscape in catalysis and suggest that adenylate kinases have evolved to activate key processes simultaneously by precise placement of a single, charged and very abundant cofactor in a pre-organized active site. PMID:25580578

  20. Seismic analysis of the frame structure reformed by cutting off column and jacking based on stiffness ratio

    NASA Astrophysics Data System (ADS)

    Zhao, J. K.; Xu, X. S.

    2017-11-01

    The cutting-off-column and jacking technology is a method for increasing story height that has been widely used and has received much attention in engineering. The stiffness changes after the process of cutting off columns and jacking, which directly affects the overall seismic performance, so it is usually necessary to take seismic strengthening measures to enhance the stiffness. A five-story frame structure jacking project in Jinan High-tech Zone was taken as an example, and three finite element models were established: the frame before lifting, after lifting, and after strengthening. Based on the stiffness, dynamic time-history analysis was carried out to investigate the seismic performance under the EL-Centro seismic wave, the Taft seismic wave, and the Tianjin artificial seismic wave. The research can provide guidance for the design and construction of entire jack-lifting structures.
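    Dynamic time-history analysis of this kind can be illustrated on a single-degree-of-freedom stand-in using Newmark-beta integration. The sketch below assumes invented mass, stiffness, damping, and a synthetic ground-acceleration record rather than the project's finite element models or the cited seismic waves.

```python
# A minimal sketch of a dynamic time-history analysis: Newmark-beta (average
# acceleration) integration of a single-degree-of-freedom system under a synthetic
# ground-acceleration record. All structural parameters and the record are
# illustrative assumptions, not the Jinan project models.
import numpy as np

def newmark_sdof(m, k, zeta, ag, dt, beta=0.25, gamma=0.5):
    c = 2.0 * zeta * np.sqrt(k * m)                       # viscous damping coefficient
    u = v = 0.0
    a = (-m * ag[0] - c * v - k * u) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    peak = 0.0
    for agi in ag[1:]:
        p = (-m * agi
             + m * (u / (beta * dt**2) + v / (beta * dt) + a * (1 / (2 * beta) - 1))
             + c * (gamma * u / (beta * dt) + v * (gamma / beta - 1)
                    + a * dt * (gamma / (2 * beta) - 1)))
        u_new = p / keff
        a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - a * (1 / (2 * beta) - 1)
        v_new = v + dt * ((1 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        peak = max(peak, abs(u))
    return peak

t = np.arange(0.0, 20.0, 0.01)
ag = 1.5 * np.sin(2.0 * np.pi * 1.2 * t) * np.exp(-0.1 * t)   # synthetic record, m/s^2
print(f"peak displacement: {newmark_sdof(m=2.0e5, k=8.0e6, zeta=0.05, ag=ag, dt=0.01):.4f} m")
```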

  1. 2009 AERA Distinguished Lecture: Causes and Consequences of Cognitive Functioning across the Life Course

    ERIC Educational Resources Information Center

    Hauser, Robert M.

    2010-01-01

    Research on variation in cognitive abilities has focused largely on their genetic or experiential sources and on their economic consequences. This article takes a broader look at the consequences of cognitive ability--IQ--across the life course. Contrary to received wisdom, the effects of IQ on economic success are almost entirely mediated by…

  2. Advanced Math Equals Career Readiness. Math Works

    ERIC Educational Resources Information Center

    Achieve, Inc., 2013

    2013-01-01

    The equation is simple: No matter their background, students who take challenging math courses in high school get better jobs and earn more money throughout their entire lives. This paper stresses that: (1) Higher-level math opens doors for any and all postsecondary programs and keeps it open for advancement beyond entry-level jobs; and (2)…

  3. 32 CFR 644.114 - Acquisition by declaration of taking.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... amount for the entire interest holding to have added value, for operational or other reasons, because it... determination be made as to whether the value of growing crops should be added to the value of the land... purchase due to failure to reach an agreement with the owners as to value, inability to contact the owners...

  4. 32 CFR 644.114 - Acquisition by declaration of taking.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... amount for the entire interest holding to have added value, for operational or other reasons, because it... determination be made as to whether the value of growing crops should be added to the value of the land... purchase due to failure to reach an agreement with the owners as to value, inability to contact the owners...

  5. 32 CFR 644.114 - Acquisition by declaration of taking.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... amount for the entire interest holding to have added value, for operational or other reasons, because it... determination be made as to whether the value of growing crops should be added to the value of the land... purchase due to failure to reach an agreement with the owners as to value, inability to contact the owners...

  6. Creating an Amazing Montessori Toddler Home Environment

    ERIC Educational Resources Information Center

    Woo, Stephanie

    2014-01-01

    The author states that raising her twins the Montessori way has made her life easy. Imagine two 1-year-olds eating entire meals on their own, setting their own tables by 20 months, and becoming potty-trained before 2. These are not statistics found in just one household. Children raised the Montessori way can take care of themselves and their…

  7. How to Walk to School: Blueprint for a Neighborhood School Renaissance

    ERIC Educational Resources Information Center

    Edelberg, Jacqueline; Kurland, Susan

    2011-01-01

    "How to Walk to School" is the story--from the highs to the lows--of motivated neighborhood parents galvanizing and then organizing an entire community to take a leap of faith, transforming a challenged urban school into one of Chicago's best, virtually overnight. The fate of public education is not beyond our control. In "How to…

  8. Building a Green House in the Redwoods

    ERIC Educational Resources Information Center

    Enos, David

    2013-01-01

    Designing and building homes is normally the responsibility of a professionally trained workforce. So a lot of people are skeptical about the idea of a team of high school students taking on those tasks. Would a young, untrained group of students be able to undertake and successfully complete the construction of an entire home, from start to…

  9. Once She Makes It, She's There!: A Case Study

    ERIC Educational Resources Information Center

    Gal-Ezer, Judith; Vilner, Tamar; Zur, Ela

    2008-01-01

    Computer science is possibly one of the few remaining disciplines almost entirely dominated by men, especially university staff and in the hi-tech industries. This phenomenon prevails throughout the western world; in Israel it starts in high school, where only 30% of students who choose to take computer science as an elective are women, and…

  10. Methylene Blue-Ascorbic Acid: An Undergraduate Experiment in Kinetics.

    ERIC Educational Resources Information Center

    Snehalatha, K. C.; And Others

    1997-01-01

    Describes a laboratory exercise involving methylene blue and L-ascorbic acid in a simple clock reaction technique to illustrate the basic concepts of chemical kinetics. If stock solutions are supplied and each type of experiment takes no more than half an hour, the entire investigation can be completed in three practical sessions of three hours…

  11. Building A Help Desk System

    ERIC Educational Resources Information Center

    O'Shea, Sheryl

    2005-01-01

    Are you trying to support an entire school district full of computers with just a few hardware technicians? Do you have a non-technical person answering the phone? Are the teachers getting upset because of the length of time it takes to resolve their issues? Unfortunately, I had to answer yes to all three questions. Our school district includes…

  12. Students, Parents, and Teachers Say, "Take This Test and Shove It!"

    ERIC Educational Resources Information Center

    Spritzler, John

    2000-01-01

    Public school students, parents, and teachers are protesting "high stakes" standardized tests that bar many deserving students from promotion or graduation. A typical high stakes test is a state-mandated 10th-grade test that students must pass to graduate high school. They are called high stakes because a student's entire high school career rides…

  13. Adult Education in Portugal. Adult Education in Europe Studies and Documents No. 16.

    ERIC Educational Resources Information Center

    Melo, Alberto

    This report on Portuguese adult education is focused on the principles and practices adopted by the Directorate-General, due to adult education's present embryonic state. Basic statistics and a brief introduction appear first. Part I, The System of Adult Education, is presented as a succession of initiatives and takes practically the entire length…

  14. Examining Online College Cyber Cheating Methods and Prevention Measures

    ERIC Educational Resources Information Center

    Moten, James, Jr.; Fitterer, Alex; Brazier, Elise; Leonard, Jonathan; Brown, Avis

    2013-01-01

    Academic dishonesty in the online cheating environment of distance education learning has gained traction in the past decade. By a few simple keystrokes, students' can find a wide array of online services for hire to write research papers, complete homework assignments, or enroll on behalf of the student on record to take the entire online…

  15. The Dynamic Density Bottle: A Make-and-Take, Guided Inquiry Activity on Density

    ERIC Educational Resources Information Center

    Kuntzleman, Thomas S.

    2015-01-01

    An activity is described wherein students observe dynamic floating and sinking behavior of plastic pieces in various liquids. The liquids and solids are all contained within a plastic bottle; the entire assembly is called a "density bottle". After completing a series of experiments that guides students to think about the relative…

  16. Advice for the Sleep-Deprived

    ERIC Educational Resources Information Center

    Wolfe, Pat

    2005-01-01

    A research has uncovered that adolescent sleep patterns are influenced not so much by the activities of the young adults as by the changes taking place in the biological timing system of their brains. It is evident that teenagers are not getting the amount of sleep they require and suggestions are presented to help diminish if not entirely avoid…

  17. 26 CFR 1.162-25 - Deductions with respect to noncash fringe benefits.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... fringe benefit under a special accounting rule that allows the employer to treat the value of benefits... benefit and includes the entire value of the benefit in the employee's gross income without taking into... thereunder, the employee may deduct that value multiplied by the percentage of the total use of the vehicle...

  18. Healthy co-twins of patients with affective disorders show reduced risk-related activation of the insula during a monetary gambling task.

    PubMed

    Macoveanu, Julian; Miskowiak, Kamilla; Kessing, Lars V; Vinberg, Maj; Siebner, Hartwig R

    2016-01-01

    Healthy first-degree relatives of patients with affective disorders are at increased risk for affective disorders and express discrete structural and functional abnormalities in the brain reward system. However, value-based decision making is not well understood in these at-risk individuals. We investigated healthy monozygotic and dizygotic twins with or without a co-twin history of affective disorders (high-risk and low-risk groups, respectively) using functional MRI during a gambling task. We assessed group differences in activity related to gambling risk over the entire brain. We included 30 monozygotic and 37 dizygotic twins in our analysis. Neural activity in the anterior insula and ventral striatum increased linearly with the amount of gambling risk in the entire cohort. Individual neuroticism scores were positively correlated with the neural response in the ventral striatum to increasing gambling risk and negatively correlated with individual risk-taking behaviour. Compared with low-risk twins, the high-risk twins showed a bilateral reduction of risk-related activity in the middle insula extending into the temporal cortex with increasing gambling risk. Post hoc analyses revealed that this effect was strongest in dizygotic twins. The relatively old average age of the mono- and dizygotic twin cohort (49.2 yr) may indicate an increased resilience to affective disorders. The size of the monozygotic high-risk group was relatively small (n = 13). The reduced processing of risk magnitude in the middle insula may indicate a deficient integration of exteroceptive information related to risk-related cues with interoceptive states in individuals at familial risk for affective disorders. Impaired risk processing might contribute to increased vulnerability to affective disorders.

  19. The star formation history of early-type galaxies as a function of mass and environment

    NASA Astrophysics Data System (ADS)

    Clemens, M. S.; Bressan, A.; Nikolic, B.; Alexander, P.; Annibali, F.; Rampazzo, R.

    2006-08-01

    Using the third data release of the Sloan Digital Sky Survey (SDSS), we have rigorously defined a volume-limited sample of early-type galaxies in the redshift range 0.005 < z <= 0.1. We have defined the density of the local environment for each galaxy using a method which takes account of the redshift bias introduced by survey boundaries if traditional methods are used. At luminosities greater than our absolute r-band magnitude cut-off of -20.45, the mean density of environment shows no trend with redshift. We calculate the Lick indices for the entire sample and correct for aperture effects and velocity dispersion in a model-independent way. Although we find no dependence of redshift or luminosity on environment, we do find that the mean velocity dispersion, σ, of early-type galaxies in dense environments tends to be higher than in low-density environments. Taking account of this effect, we find that several indices show small but very significant trends with environment that are not the result of the correlation between indices and velocity dispersion. The statistical significance of the data is sufficiently high to reveal that models accounting only for α-enhancement struggle to produce a consistent picture of age and metallicity of the sample galaxies, whereas a model that also includes carbon enhancement fares much better. We find that early-type galaxies in the field are younger than those in environments typical of clusters but that neither metallicity, α-enhancement nor carbon enhancement are influenced by the environment. The youngest early-type galaxies in both field and cluster environments are those with the lowest σ. However, there is some evidence that the objects with the largest σ are slightly younger, especially in denser environments. Independent of environment both the metallicity and α-enhancement grow monotonically with σ. This suggests that the typical length of the star formation episodes which formed the stars of early-type galaxies decreases with σ. More massive galaxies were formed in faster bursts. We argue that the timing of the process of formation of early-type galaxies is determined by the environment, while the details of the process of star formation, which has built up the stellar mass, are entirely regulated by the halo mass. These results suggest that the star formation took place after the mass assembly and favours an anti-hierarchical model. In such a model, the majority of the mergers must take place before the bulk of the stars form. This can only happen if there exists an efficient feedback mechanism which inhibits the star formation in low-mass haloes and is progressively reduced as mergers increase the mass.
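    Defining a volume-limited sample rests on converting apparent to absolute magnitude via the distance modulus. The sketch below does this with a low-redshift Hubble-law distance (d ≈ cz/H0) and invented galaxy magnitudes; the study itself uses the full SDSS pipeline distances and corrections.

```python
# A minimal sketch of the absolute-magnitude cut behind a volume-limited sample:
# apparent r-band magnitude is converted to absolute magnitude through the distance
# modulus, using the low-redshift approximation d ~ cz/H0. H0 and the example
# galaxies are assumptions for illustration.
import numpy as np

H0 = 70.0          # km/s/Mpc, assumed
c = 299792.458     # km/s

def absolute_mag(m_r, z):
    d_mpc = c * z / H0                                   # luminosity distance, low-z approximation
    return m_r - 5.0 * np.log10(d_mpc * 1e6 / 10.0)      # distance modulus, 10 pc reference

# Keep galaxies brighter than the M_r cut of -20.45 within 0.005 < z <= 0.1.
m_r = np.array([14.8, 17.2, 16.9])
z = np.array([0.03, 0.09, 0.05])
M_r = absolute_mag(m_r, z)
keep = (M_r <= -20.45) & (z > 0.005) & (z <= 0.1)
print(np.round(M_r, 2), keep)
```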

  20. Triggering factor evolution and dynamic process simulation of the Formosa Highway dip-slope failure, northern Taiwan

    NASA Astrophysics Data System (ADS)

    Huang, Mei-Jen; Chiang, Yi-Lin; Chang, Ho-Shyang; Chang, Kuo-Jen

    2013-04-01

    Owing to its high seismicity and high annual rainfall, Taiwan experiences numerous landslides every year that severely affect the island. Accordingly, if new construction does not take this threat into account, tremendous disasters can occur. On April 25th 2010, a dip-slope failure on the Formosa Freeway caused four deaths; it resulted from artificial slope cutting and weakening of the rock-bolt supporting system. This research integrates high-resolution Digital Terrain Models (DTMs) and numerical simulation to evaluate the triggering mechanism and dynamic process of the landslide. First, to assess the landslide geometry, the morphology before and after the event is reconstructed from high-resolution DTMs derived from aerial photos, and the slid and deposited volumes are estimated accordingly. Only part of the surface of separation between the slide block and the slope is exposed; based on the exposed planar strata/sliding surface on the upper part of the slope, part of the plane is extrapolated to represent the entire sliding surface. From the DTMs, the slide block is approximately 0.15 million cubic meters, and the extrapolated planar surface serves as the sliding surface for the numerical models. For model preparation, the particle clusters produced by isotropic stress and the porosity are taken into account. To ensure that the generated particles cover the entire slid mass in the source area, the particle clusters representing the slide block are rotated, scaled and translated to the source area; particles are then eliminated if they lie outside the upper and lower surfaces given by the pre- and post-event DTMs. According to the geological map, the particle model of the slide block is divided into two parts: 1) the underlying interbedded sandstone and shale, which may soften with water, and 2) the assumed upper layer composed of sandstone. In addition, a layer of particles is set up to simulate the ground anchors. The advantage of combining DTMs with PFC3D is that the real terrain can be represented in the model and the complete landslide process can be simulated dynamically. Compared with continuum-mechanics analysis, which only provides the state of instability, the discrete element method provides the dynamic process of sliding, including trajectory, velocity change, sliding distance and the accumulation pattern after the landslide, and hence the area affected by the disaster. Results show that: 1) the peak and residual friction angles of the sliding surface should be smaller than 14 and 4 degrees, respectively, under the condition that 30% of the effective rock-bolt resistance remains; and 2) the maximum sliding speed could be as high as 15.34 m/s, which caused this hazardous event.
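    The particle-trimming step (keeping only particles between the post-event sliding surface and the pre-event ground surface) can be sketched as a simple geometric filter. The surfaces, particle cloud, and grid below are synthetic stand-ins, not the PFC3D model.

```python
# A minimal sketch of the particle-trimming step: particles seeded over the source
# area are kept only if they lie between the post-event surface (below) and the
# pre-event surface (above), both sampled from DTM grids. Everything here is a
# synthetic stand-in for illustration.
import numpy as np

rng = np.random.default_rng(3)
nx = ny = 50
x = y = np.arange(nx, dtype=float)
pre_dtm = 40.0 - 0.3 * x[:, None] + 0.0 * y[None, :]      # pre-event ground surface
post_dtm = pre_dtm - 5.0                                   # post-event (slid) surface

particles = np.column_stack([
    rng.uniform(0, nx - 1, 5000),     # x
    rng.uniform(0, ny - 1, 5000),     # y
    rng.uniform(0, 45, 5000),         # z
])
ix = particles[:, 0].astype(int)
iy = particles[:, 1].astype(int)
inside = (particles[:, 2] > post_dtm[ix, iy]) & (particles[:, 2] < pre_dtm[ix, iy])
print(f"{inside.sum()} of {len(particles)} particles kept as the slide block")
```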

  1. Barrel organ of plate tectonics - a new tool for outreach and education

    NASA Astrophysics Data System (ADS)

    Broz, Petr; Machek, Matěj; Šorm, Zdar

    2016-04-01

    Plate tectonics is the major geological concept explaining the dynamics and structure of Earth's outer shell, the lithosphere. In plate tectonic theory, processes in the Earth's lithosphere and its dynamics are driven by the relative motion and interaction of lithospheric plates. The geologically most active regions on Earth often correlate with lithospheric plate boundaries, so understanding processes at plate boundaries is important for explaining Earth-surface evolution, mountain building, volcanism and the origin of earthquakes. However, the processes associated with plate tectonics usually require long periods of time to take effect, so their entire cycles cannot be directly observed in nature by humans. This poses a challenge for scientists studying these processes, but also for teachers and popularizers trying to explain them to students and the general public. To overcome this problem, we developed a mechanical model of plate tectonics that demonstrates the most important processes associated with plate tectonics in real time. The mechanical model is a wooden box, more specifically a special type of barrel organ, with hand-painted backdrops on the front side. These backdrops are divided into several components representing geodynamic processes associated with plate tectonics: convective currents occurring in the mantle, sea-floor spreading, subduction of oceanic crust under continental crust, partial melting and volcanism associated with subduction, the formation of magmatic stripes, the ascent of a mantle plume through the mantle, volcanic activity associated with hot spots, and the formation and degradation of volcanic islands on a moving lithospheric plate. All components are set in motion by a handle controlled by a human operator, and the scene is illuminated with colored lights controlled automatically by an electric device embedded in the box. Operation of the model may be seen at www.geologyinexperiments.com, where additional pictures and details about the construction are available. This mechanical model is a unique outreach tool for presenting processes that normally take eons to unfold to students and the public in an easy and entertaining way, and for drawing their attention to one of the most important concepts in geology.

  2. Process Reengineering for Quality Improvement in ICU Based on Taylor's Management Theory.

    PubMed

    Tao, Ziqi

    2015-06-01

    Using methods including questionnaire-based surveys and control analysis, we analyzed the improvements in the efficiency of ICU rescue, service quality, and patients' satisfaction in Xuzhou Central Hospital after the implementation of fine management, with an attempt to further introduce the concept of fine management and implement the brand construction. Originating in Taylor's "Theory of Scientific Management" (1982), fine management uses programmed, standardized, digitalized, and informational approaches to ensure each unit of an organization is running with great accuracy, high efficiency, strong coordination, and at sustained duration (Wang et al., Fine Management, 2007). The nature of fine management is a process that breaks down strategy and goals and executes them; strategic planning takes place at every part of the process. Fine management demonstrates that everybody has a role to play in the management process, every area must be examined through the management process, and everything has to be managed (Zhang et al., The Experience of Hospital Nursing Precise Management, 2006). In other words, this kind of management theory demands that all people be involved in the entire process (Liu and Chen, Med Inf, 2007). As public hospital reform becomes more widespread, it is imperative to "build a unified and efficient public hospital management system" and "improve the quality of medical services" (Guidelines on the Pilot Reform of Public Hospitals, 2010). The execution of fine management is important in optimizing the medical process, improving medical services, and building a prestigious hospital brand.

  3. STAKEHOLDER INVOLVEMENT IN THE HEALTH TECHNOLOGY ASSESSMENT PROCESS IN LATIN AMERICA.

    PubMed

    Pichon-Riviere, Andres; Soto, Natalie; Augustovski, Federico; Sampietro-Colom, Laura

    2018-06-11

    Latin American countries are taking important steps to expand and strengthen universal health coverage, and health technology assessment (HTA) has an increasingly prominent role in this process. Participation of all relevant stakeholders has become a priority in this effort. Key issues in this area were discussed during the 2017 Latin American Health Technology Assessment International (HTAi) Policy Forum. The Forum included forty-one participants from Latin American HTA agencies; public, social security, and private insurance sectors; and the pharmaceutical and medical device industry. A background paper and presentations by invited experts and Forum members supported discussions. This study presents a summary of these discussions. Stakeholder involvement in HTA remains inconsistently implemented in the region and few countries have established formal processes. Participants agreed that stakeholder involvement is key to improve the HTA process, but the form and timing of such improvements must be adapted to local contexts. The legitimization of both HTA and decision-making processes was identified as one of the main reasons to promote stakeholder involvement; but to be successful, the entire system of assessment and decision making must be properly staffed and organized, and certain basic conditions must be met, including transparency in the HTA process and a clear link between HTA and decision making. Participants suggested a need for establishing clear rules of participation in HTA that would protect HTA producers and decision makers from potentially distorting external influences. Such rules and mechanisms could help foster trust and credibility among stakeholders, supporting actual involvement in HTA processes.

  4. EDITORIAL: Welcome to the 2008 volume

    NASA Astrophysics Data System (ADS)

    Puers, R.

    2008-01-01

    It is my pleasure to address these few lines to you all on the occasion of the start of the 2008 volume of Journal of Micromechanics and Microengineering, the journal's eighteenth year, and my eleventh year of service as Editor-in-Chief. As in previous years, I would like to take the opportunity to reflect on the achievements of the past year. The number of submissions to the journal continues to grow, to almost 800 in 2007. Importantly, the journal's ISI® impact factor remains at a solid 2.321. This is an achievement we can all be proud of. In 2007, an incredible 350 000 papers were downloaded, which clearly reflects the visibility and appreciation of our research work. These excellent results are entirely due to the fact that more of you are choosing to submit your high-quality work to the journal, and because more of you are also choosing to cite recent papers published within the journal. I would like to take this opportunity to thank each one of you: readers, authors and referees alike. To cope with the steadily increasing number of incoming papers, the review process had to be expanded. In 2007, more than 700 experts selected from 35 countries agreed to our requests to referee. In the name of the entire team, I would like to express my thanks to all our referees for their careful and well constructed reports, which are of paramount importance in maintaining the quality standards of Journal of Micromechanics and Microengineering. The average time to produce an individual report is a mere 19 days, contributing towards a very favourable overall processing time which is an attractive feature of the journal. Of course all this would not be possible without the constant hard work of the publishing, production and marketing staff in Bristol. In the name of the Editorial Board, contributing authors and readers, I wish to thank them for their support. Finally, I believe we have established a clear and distinct profile in the broad spectrum of journals in our field, and I hope we can expand this profile even further. This is not possible without the efforts of every individual researcher in our community. I therefore wish each of you a prosperous and adventurous 2008 in your quest to shift the frontiers in micromachining and microengineering. May health and prosperity be yours as you work to achieve these goals!

  5. Evolutionary games on cycles with strong selection

    NASA Astrophysics Data System (ADS)

    Altrock, P. M.; Traulsen, A.; Nowak, M. A.

    2017-02-01

    Evolutionary games on graphs describe how strategic interactions and population structure determine evolutionary success, quantified by the probability that a single mutant takes over a population. Graph structures, compared to the well-mixed case, can act as amplifiers or suppressors of selection by increasing or decreasing the fixation probability of a beneficial mutant. Properties of the associated mean fixation times can be more intricate, especially when selection is strong. The intuition is that fixation of a beneficial mutant happens fast in a dominance game, that fixation takes very long in a coexistence game, and that strong selection eliminates demographic noise. Here we show that these intuitions can be misleading in structured populations. We analyze mean fixation times on the cycle graph under strong frequency-dependent selection for two different microscopic evolutionary update rules (death-birth and birth-death). We establish exact analytical results for fixation times under strong selection and show that there are coexistence games in which fixation occurs in time polynomial in population size. Depending on the underlying game, we observe inherence of demographic noise even under strong selection if the process is driven by random death before selection for birth of an offspring (death-birth update). In contrast, if selection for an offspring occurs before random removal (birth-death update), then strong selection can remove demographic noise almost entirely.
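    A death-birth update on a cycle can be simulated directly to estimate a mutant's fixation probability. The sketch below assumes an illustrative 2x2 payoff matrix, selection intensity, and population size; it is a Monte Carlo stand-in, not the paper's exact analytical treatment.

```python
# A minimal sketch of a death-birth update on a cycle: a random individual dies and
# its two neighbours compete to fill the vacancy in proportion to payoff-derived
# fitness. Payoff matrix, selection intensity w, and population size are
# illustrative assumptions used to estimate a mutant's fixation probability.
import random

def payoff(strat, left, right, a, b, c, d):
    # 2x2 game played against the two cycle neighbours (0 = resident, 1 = mutant)
    def against(opp):
        if strat == 1:
            return a if opp == 1 else b
        return c if opp == 1 else d
    return against(left) + against(right)

def fixation_probability(n=20, a=3.0, b=1.0, c=2.0, d=2.0, w=0.9, runs=2000):
    fixed = 0
    for _ in range(runs):
        pop = [0] * n
        pop[random.randrange(n)] = 1                 # a single mutant
        while 0 < sum(pop) < n:
            i = random.randrange(n)                  # death-birth: random death first
            l, r = (i - 1) % n, (i + 1) % n
            fl = 1 - w + w * payoff(pop[l], pop[(l - 1) % n], pop[i], a, b, c, d)
            fr = 1 - w + w * payoff(pop[r], pop[i], pop[(r + 1) % n], a, b, c, d)
            pop[i] = pop[l] if random.random() < fl / (fl + fr) else pop[r]
        fixed += pop[0]                              # 1 only if the mutant took over
    return fixed / runs

print(f"estimated fixation probability: {fixation_probability():.3f}")
```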

  6. Predicting film dose to aid in cassette placement for radiation therapy portal verification film images.

    PubMed

    Keys, Richard A; Marks, James E; Haus, Arthur G

    2002-12-01

    EC film has improved portal localization images with better contrast and improved distinction of bony structures and air-tissue interfaces. A cassette with slower speed screens was used with EC film to image the treatment portal during the entire course of treatment (verification) instead of taking separate films after treatment. Measurements of film density vs source to film distance (SFD) were made using 15 and 25 cm thick water phantoms with both 6 and 18 MV photons from 1 to 40 cm past the phantom. A characteristic (H & D) curve was measured in air to compare dose to film density. Results show the reduction in radiation between patient and cassette more closely follows an "inverse cube law" rather than an inverse square law. Formulas to calculate radiation exposure to the film and the desired SFD were based on patient tumor dose, calculation of the exit dose, and the inverse cube relationship. A table of exposure techniques based on the SFD for a given tumor dose was evaluated and compared to conventional techniques. Although the film has a high contrast, there is enough latitude that excellent films can be achieved using a fixed SFD based simply on the tumor dose and beam energy. Patient diameter has a smaller effect. The benefits of imaging portal films during the entire treatment are greater reliability in the accuracy of the portal image, the ability to detect patient motion, and a reduction in the time needed to take portal images.
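    Under an inverse-cube fall-off, the distance needed to hit a target film exposure follows from dose(d) = dose_ref·(d_ref/d)³. The sketch below solves that relation with invented dose values; it is not the published exposure table or its formulas.

```python
# A minimal sketch of how an inverse-cube fall-off can set the cassette position:
# if the film would receive dose_ref at a reference source-to-film distance sfd_ref,
# the SFD needed to reach the film's target exposure follows from
# dose(sfd) = dose_ref * (sfd_ref / sfd)**3. All numbers are illustrative assumptions.
def required_sfd_cm(dose_ref_cgy, sfd_ref_cm, target_dose_cgy):
    return sfd_ref_cm * (dose_ref_cgy / target_dose_cgy) ** (1.0 / 3.0)

# Example: 4 cGy would reach the film at an SFD of 110 cm, and the film wants ~1 cGy.
sfd = required_sfd_cm(dose_ref_cgy=4.0, sfd_ref_cm=110.0, target_dose_cgy=1.0)
print(f"required source-to-film distance: about {sfd:.0f} cm")   # ~175 cm
```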

  7. Can We Play "Fun Gay"? Disjuncture and Difference, and the Precarious Mobilities of Millennial Queer Youth Narratives

    ERIC Educational Resources Information Center

    Bryson, Mary K.; MacIntosh, Lori B.

    2010-01-01

    This article takes up the complex project of unthinking neoliberal accounts of a progressive modernity. The authors position their anxieties about an "after" to queer as an affect modality productive of both an opportunity and an obligation to think critically about the move to delimit historically, and as a gesture to an entirely different…

  8. Re(Place) Your Typical Writing Assignment: An Argument for Place-Based Writing

    ERIC Educational Resources Information Center

    Jacobs, Elliot

    2011-01-01

    Place-based writing affords students an opportunity to write meaningfully about themselves, grounded in a place that they know. Place-based writing is versatile and can be additive--taking just a week or two within a semester of different projects--or transformative, if positioned as the theme for an entire course. If students can learn to write…

  9. Drilling side holes from a borehole

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr.

    1980-01-01

    Machine takes long horizontal stratum samples from confines of 21 cm bore hole. Stacked interlocking half cylindrical shells mate to form rigid thrust tube. Drive shaft and core storage device is flexible and retractable. Entire machine fits in 10 meter length of steel tube. Machine could drill drainage or ventilation holes in coal mines, or provide important information for geological, oil, and geothermal surveys.

  10. Why "Faith-Based" Is Here to Stay

    ERIC Educational Resources Information Center

    Daly, Lew

    2009-01-01

    By the time he left office, President Bush's faith-based initiative had become a kind of stand-in for his entire presidency. Whenever something went wrong on Bush's watch it was tarred as yet another "faith-based" policy. As the 2008 presidential election began to take shape, with the Democrats newly in charge of both the House and the Senate and…

  11. Writing Virtue and Indigenous Rights: Juan Bautista De Pomar and the "Relación de Texcoco"

    ERIC Educational Resources Information Center

    Espericueta, José

    2015-01-01

    In his "Relación de Texcoco," Juan Bautista de Pomar (c. 1535-90) takes a political and moral stance against Spanish colonialism in Texcoco and the entire viceroyalty of New Spain. Responding to the "Instrucción y memoria's" (1577) request for information about the history and cultural practices of local populations, Pomar…

  12. Building Blocks: Making Children Successful in the Early Years of School

    ERIC Educational Resources Information Center

    Maeroff, Gene I.

    2006-01-01

    A student's entire journey along the educational spectrum is affected by what occurs--and, crucially, by what does not occur--before the age of eight or nine. Yet early learning has never received the attention it deserves and needs. In his latest book, education expert Gene Maeroff takes a hard look at early learning and the primary grades of…

  13. The marine atmospheric boundary layer under strong wind conditions: Organized turbulence structure and flux estimates by airborne measurements

    NASA Astrophysics Data System (ADS)

    Brilouet, Pierre-Etienne; Durand, Pierre; Canut, Guylaine

    2017-02-01

    During winter, cold air outbreaks take place in the northwestern Mediterranean Sea. They are characterized by local strong winds (Mistral and Tramontane) which transport cold and dry continental air across a warmer sea. In such conditions, high values of surface sensible and latent heat flux are observed, which favor deep oceanic convection. The HyMeX/ASICS-MED field campaign was devoted to the study of these processes. Airborne measurements, gathered in the Gulf of Lion during the winter of 2013, allowed for the exploration of the mean and turbulent structure of the marine atmospheric boundary layer (MABL). A spectral analysis based on an analytical model was conducted on 181 straight and level runs. Profiles of the characteristic length scales and sharpness parameter of the vertical wind spectrum revealed larger eddies along the mean wind direction, associated with an organization of the turbulence field into longitudinal rolls. These were highlighted by boundary layer cloud bands on high-resolution satellite images. A one-dimensional description of the vertical exchanges is therefore a tricky issue. Since knowledge of the flux profile throughout the entire MABL is essential for the estimation of air-sea exchanges, a correction of eddy covariance turbulent fluxes was developed taking into account the systematic and random errors due to sampling and data processing. This allowed improved surface flux estimates, computed by extrapolation from the stacked levels. A comparison between those surface fluxes and bulk fluxes computed at a moored buoy revealed considerable differences, mainly regarding the latent heat flux under strong wind conditions.
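    The eddy-covariance flux that underlies these estimates is the covariance of vertical-wind and temperature fluctuations over a leg, scaled by air density and heat capacity. The sketch below computes it from synthetic correlated series; the actual ASICS-MED processing additionally corrects the systematic and random sampling errors mentioned above.

```python
# A minimal sketch of an eddy-covariance flux estimate from one flight leg:
# sensible heat flux H = rho * cp * cov(w', theta') after removing the leg mean.
# The synthetic time series, air density, and cp are illustrative assumptions,
# not the campaign's processing chain.
import numpy as np

rng = np.random.default_rng(7)
n = 20000                                   # samples along the leg
w = rng.normal(0.0, 0.8, n)                 # vertical wind fluctuations, m/s
theta = 288.0 + 0.25 * w + rng.normal(0.0, 0.3, n)   # correlated potential temperature, K

rho, cp = 1.2, 1004.0                       # kg/m^3 and J/(kg K), assumed
w_p = w - w.mean()
theta_p = theta - theta.mean()
H = rho * cp * np.mean(w_p * theta_p)       # sensible heat flux, W/m^2
print(f"sensible heat flux ~ {H:.0f} W/m^2")
```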

  14. Use of lean and six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center.

    PubMed

    Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude

    2011-07-01

    Operating rooms (ORs) are resource-intensive and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases; efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process, from the decision for surgery to discharge. Each process step was analyzed in 3 domains, i.e., personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and a reduction in the number of cases running past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferrable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  15. DigiMemo: Facilitating the Note Taking Process

    ERIC Educational Resources Information Center

    Kurt, Serhat

    2009-01-01

    Everyone takes notes daily for various reasons. Note taking is very popular in school settings and generally recognized as an effective learning strategy. Further, note taking is a complex process because it requires understanding, selection of information and writing. Some new technological tools may facilitate the note taking process. Among such…

  16. Applying policy and health effects of air pollution in South Korea: focus on ambient air quality standards

    PubMed Central

    Ha, Jongsik

    2014-01-01

    Objectives South Korea’s air quality standards are insufficient in terms of establishing a procedure for their management. The current system lacks a proper decision-making process and prior evidence is not considered. The purpose of this study is to propose a measure for establishing atmospheric environmental standards in South Korea that will take into consideration the health of its residents. Methods In this paper, the National Ambient Air Quality Standards (NAAQS) of the US was examined in order to suggest ways, which consider health effects, to establish air quality standards in South Korea. Up-to-date research on the health effects of air pollution was then reviewed, and tools were proposed to utilize the key results. This was done in an effort to ensure the reliability of the standards with regard to public health. Results This study showed that scientific research on the health effects of air pollution and the methodology used in the research have contributed significantly to establishing air quality standards. However, as the standards are legally binding, the procedure should take into account the effects on other sectors. Realistically speaking, it is impossible to establish standards that protect an entire population from air pollution. Instead, it is necessary to find a balance between what should be done and what can be done. Conclusions Therefore, establishing air quality standards should be done as part of an evidence-based policy that identifies the health effects of air pollution and takes into consideration political, economic, and social contexts. PMID:25300297

  17. Converting customer expectations into achievable results.

    PubMed

    Landis, G A

    1999-11-01

    It is not enough in today's environment to just meet customers' expectations--we must exceed them. Therefore, one must learn what constitutes expectations. These needs have expanded during the past few years from just manufacturing the product and looking at the outcome from a provincial standpoint. Now we must understand and satisfy the entire supply chain. To manage this process and satisfy the customer, the process now involves the supplier, the manufacturer, and the entire distribution system.

  18. Longitudinal Intravital Imaging of the Retina Reveals Long-term Dynamics of Immune Infiltration and Its Effects on the Glial Network in Experimental Autoimmune Uveoretinitis, without Evident Signs of Neuronal Dysfunction in the Ganglion Cell Layer

    PubMed Central

    Bremer, Daniel; Pache, Florence; Günther, Robert; Hornow, Jürgen; Andresen, Volker; Leben, Ruth; Mothes, Ronja; Zimmermann, Hanna; Brandt, Alexander U.; Paul, Friedemann; Hauser, Anja E.; Radbruch, Helena; Niesner, Raluca

    2016-01-01

    A hallmark of autoimmune retinal inflammation is the infiltration of the retina with cells of the innate and adaptive immune system, leading to detachment of the retinal layers and even to complete loss of the retinal photoreceptor layer. As the only optical system in the organism, the eye enables non-invasive longitudinal imaging studies of these local autoimmune processes and of their effects on the target tissue. Moreover, as a window to the central nervous system (CNS), the eye also reflects general neuroinflammatory processes taking place at various sites within the CNS. Histological studies in murine neuroinflammatory models, such as experimental autoimmune uveoretinitis (EAU) and experimental autoimmune encephalomyelitis, indicate that immune infiltration is initialized by effector CD4+ T cells, with the innate compartment (neutrophils, macrophages, and monocytes) contributing crucially to tissue degeneration that occurs at later phases of the disease. However, how the immune attack is orchestrated by various immune cell subsets in the retina and how the latter interact with the target tissue under in vivo conditions is still poorly understood. Our study addresses this gap with a novel approach for intravital two-photon microscopy, which enabled us to repeatedly track CD4+ T cells and LysM phagocytes during the entire course of EAU and to identify a specific radial infiltration pattern of these cells within the inflamed retina, starting from the optic nerve head. In contrast, highly motile CX3CR1+ cells display an opposite radial motility pattern, toward the optic nerve head. These inflammatory processes induce modifications of the microglial network toward an activated morphology, especially around the optic nerve head and main retinal blood vessels, but do not affect the neurons within the ganglion cell layer. Thanks to the new technology, non-invasive correlation of clinical scores of CNS-related pathologies with immune infiltrate behavior and subsequent tissue dysfunction is now possible. Hence, the new approach paves the way for deeper insights into the pathology of neuroinflammatory processes on a cellular basis, over the entire disease course. PMID:28066446

  19. Design and practice of a comprehensively functional integrated management information system for major construction

    NASA Astrophysics Data System (ADS)

    Liu, Yuling; Wang, Xiaoping; Zhu, Yuhui; Fei, Lanlan

    2017-08-01

    This paper introduces a Comprehensively Functional Integrated Management Information System designed for the Optical Engineering Major by the College of Optical Science and Engineering, Zhejiang University, which combines the functions of teaching, student learning, educational assessment, and management. The system consists of 5 modules: major overview, online curriculum, experiment teaching management, graduation project management, and teaching quality feedback. The major overview module introduces the development history, training program, curriculums, experiment syllabus, and teaching achievements of the optical engineering major at Zhejiang University. The Management Information System is convenient for students to learn in a mobile and personalized way. The online curriculum module makes it very easy for teachers to set up a website for a new curriculum. On the website, teachers can promptly help students with their questions about the curriculum and collect their homework online. The experiment teaching management module and the graduation project management module enable students to complete their experiments and graduation theses with the help of their supervisors. Before students take an experiment in the lab, they must pass the pre-experiment quiz in the corresponding module. After the experiment, students need to submit the experiment report to the web server. Moreover, the module contains video recordings of the experiment process, which are very helpful for improving the effect of the experiment education. The entire process of a student's graduation program, including project selection, mid-term inspection, biweekly progress reports, and the final thesis, is managed by the graduation project management module. The teaching quality feedback module helps teachers to know whether the education effect of a curriculum is good, and it also helps the administrators of the college to know whether the design of the syllabus is reasonable. The Management Information System shifts the object of management from education results to the entire education process and improves the efficiency of management. It provides an effective method to promote curriculum construction management through supervision and evaluation, which improves students' learning outcomes and the quality of curriculums. As a result, it clearly strengthens the quality system of education.

  20. Analysis and optimization of solid oxide fuel cell-based auxiliary power units using a generic zero-dimensional fuel cell model

    NASA Astrophysics Data System (ADS)

    Göll, S.; Samsun, R. C.; Peters, R.

    Fuel-cell-based auxiliary power units can help to reduce fuel consumption and emissions in transportation. For this application, the combination of solid oxide fuel cells (SOFCs) with upstream fuel processing by autothermal reforming (ATR) is seen as a highly favorable configuration. Notwithstanding the necessity to improve each single component, an optimized architecture of the fuel cell system as a whole must be achieved. To enable model-based analyses, a system-level approach is proposed in which the fuel cell system is modeled as a multi-stage thermo-chemical process using the "flowsheeting" environment PRO/II™. Therein, the SOFC stack and the ATR are characterized entirely by corresponding thermodynamic processes together with global performance parameters. The developed model is then used to achieve an optimal system layout by comparing different system architectures. A system with anode and cathode off-gas recycling was identified to have the highest electric system efficiency. Taking this system as a basis, the potential for further performance enhancement was evaluated by varying four parameters characterizing different system components. Using methods from the design and analysis of experiments, the effects of these parameters and of their interactions were quantified, leading to an overall optimized system with encouraging performance data.
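
    As a rough guide to the efficiency metric compared above, the electric system efficiency of a fuel-cell auxiliary power unit is conventionally defined as the net electric power output divided by the chemical energy flow of the fuel. A minimal sketch of this definition, assuming the lower heating value (LHV) is taken as the reference (the abstract does not state which reference the authors use), is

    $$ \eta_{\mathrm{el}} = \frac{P_{\mathrm{el,net}}}{\dot{m}_{\mathrm{fuel}} \cdot \mathrm{LHV}_{\mathrm{fuel}}} $$

    where P_el,net is the net electric power remaining after parasitic loads and the denominator is the fuel energy input rate. Anode off-gas recycling returns unconverted fuel to the stack, which typically raises fuel utilization and hence this efficiency, consistent with the result reported above.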

  1. Pressure Mapping and Efficiency Analysis of an EPPLER 857 Hydrokinetic Turbine

    NASA Astrophysics Data System (ADS)

    Clark, Tristan

    A conceptual energy ship is presented to provide renewable energy. The ship, driven by the wind, drags a hydrokinetic turbine through the water. The power generated is used to run electrolysis on board, and the resulting hydrogen is taken back to shore to be used as an energy source. The basin efficiency (power divided by thrust times velocity, P/(T·V)) of the Hydrokinetic Turbine (HKT) plays a vital role in this process. In order to extract the maximum allowable power from the flow, the blades need to be optimized. The structural analysis of the blade is important, as the blade will undergo high pressure loads from the water. A procedure for the analysis of a preliminary Hydrokinetic Turbine blade design is developed. The blade was designed by a non-optimized Blade Element Momentum Theory (BEMT) code. Six simulations were run, with varying mesh resolution, turbulence models, and flow region size. The procedure provides a detailed explanation of the entire process, from geometry and mesh generation to post-processing analysis tools. The efficiency results from the simulations are used to study the mesh resolution, flow region size, and turbulence models. The results are compared to the BEMT model design targets. Static pressure maps are created that can be used for structural analysis of the blades.
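
    To make the efficiency metric above concrete, the following minimal Python sketch computes the basin efficiency and the conventional power and thrust coefficients from simulated rotor outputs. All variable names and sample values are hypothetical illustrations, not data from the study.

    ```python
    import math

    RHO_WATER = 1025.0  # assumed sea-water density, kg/m^3

    def basin_efficiency(power_w, thrust_n, velocity_ms):
        """Basin efficiency: extracted power divided by (thrust x tow velocity)."""
        return power_w / (thrust_n * velocity_ms)

    def power_coefficient(power_w, radius_m, velocity_ms, rho=RHO_WATER):
        """Cp = P / (0.5 * rho * A * V^3) for a rotor of the given radius."""
        area = math.pi * radius_m ** 2
        return power_w / (0.5 * rho * area * velocity_ms ** 3)

    def thrust_coefficient(thrust_n, radius_m, velocity_ms, rho=RHO_WATER):
        """Ct = T / (0.5 * rho * A * V^2)."""
        area = math.pi * radius_m ** 2
        return thrust_n / (0.5 * rho * area * velocity_ms ** 2)

    if __name__ == "__main__":
        # Hypothetical CFD outputs for one simulation case.
        P, T, V, R = 70e3, 40e3, 3.0, 2.0
        print(f"basin efficiency   = {basin_efficiency(P, T, V):.3f}")
        print(f"power coefficient  = {power_coefficient(P, R, V):.3f}")
        print(f"thrust coefficient = {thrust_coefficient(T, R, V):.3f}")
    ```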

  2. Mosaic construction, processing, and review of very large electron micrograph composites

    NASA Astrophysics Data System (ADS)

    Vogt, Robert C., III; Trenkle, John M.; Harmon, Laurel A.

    1996-11-01

    A system of programs is described for acquisition, mosaicking, cueing and interactive review of large-scale transmission electron micrograph composite images. This work was carried out as part of a final-phase clinical analysis study of a drug for the treatment of diabetic peripheral neuropathy. More than 500 nerve biopsy samples were prepared, digitally imaged, processed, and reviewed. For a given sample, typically 1000 or more 1.5 megabyte frames were acquired, for a total of between 1 and 2 gigabytes of data per sample. These frames were then automatically registered and mosaicked together into a single virtual image composite, which was subsequently used to perform automatic cueing of axons and axon clusters, as well as review and marking by qualified neuroanatomists. Statistics derived from the review process were used to evaluate the efficacy of the drug in promoting regeneration of myelinated nerve fibers. This effort demonstrates a new, entirely digital capability for doing large-scale electron micrograph studies, in which all of the relevant specimen data can be included at high magnification, as opposed to simply taking a random sample of discrete locations. It opens up the possibility of a new era in electron microscopy--one which broadens the scope of questions that this imaging modality can be used to answer.
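
    Frame-to-frame registration of the kind described above is often implemented with phase correlation. The sketch below is a generic FFT-based translation estimator for two overlapping grayscale frames, offered only as an illustration of the idea, not as the system's actual algorithm.

    ```python
    import numpy as np

    def phase_correlation_shift(frame_a, frame_b):
        """Estimate the integer (dy, dx) translation that maps frame_b onto frame_a.

        Both inputs are 2-D float arrays of identical shape, e.g. the overlapping
        regions of two adjacent micrograph frames.
        """
        fa = np.fft.fft2(frame_a)
        fb = np.fft.fft2(frame_b)
        cross_power = fa * np.conj(fb)
        cross_power /= np.maximum(np.abs(cross_power), 1e-12)  # keep phase only
        correlation = np.fft.ifft2(cross_power).real
        peak = np.unravel_index(np.argmax(correlation), correlation.shape)
        # Peaks in the upper half of each axis correspond to negative shifts.
        shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, correlation.shape)]
        return tuple(shifts)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        base = rng.random((256, 256))
        shifted = np.roll(np.roll(base, 7, axis=0), -12, axis=1)  # known offset
        print(phase_correlation_shift(shifted, base))  # expected (7, -12)
    ```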

  3. Preparation of clinical-grade 89Zr-panitumumab as a positron emission tomography biomarker for evaluating epidermal growth factor receptor-targeted therapy

    PubMed Central

    Wei, Ling; Shi, Jianfeng; Afari, George; Bhattacharyya, Sibaprasad

    2014-01-01

    Panitumumab is a fully human monoclonal antibody approved for the treatment of epidermal growth factor receptor (EGFR) positive colorectal cancer. Recently, panitumumab has been radiolabeled with 89Zr and evaluated for its potential to be used as immuno-positron emission tomography (PET) probe for EGFR positive cancers. Interesting preclinical results published by several groups of researchers have prompted us to develop a robust procedure for producing clinical-grade 89Zr-panitumumab as an immuno-PET probe to evaluate EGFR-targeted therapy. In this process, clinical-grade panitumumab is bio-conjugated with desferrioxamine chelate and subsequently radiolabeled with 89Zr resulting in high radiochemical yield (>70%, n=3) and purity (>98%, n=3). All quality control (QC) tests were performed according to United States Pharmacopeia specifications. QC tests showed that 89Zr-panitumumab met all specifications for human injection. Herein, we describe a step-by-step method for the facile synthesis and QC tests of 89Zr-panitumumab for medical use. The entire process of bioconjugation, radiolabeling, and all QC tests will take about 5h. Because the synthesis is fully manual, two rapid, in-process QC tests have been introduced to make the procedure robust and error free. PMID:24448743

  4. DNA Dilemma: A Perspective on Current U.S. Patent and Trademark Office Philosophy Concerning Life Patents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franz, K.; Faletra, P.

    The lack of a solid set of criteria for determining patentability of subject matter - particularly subject matter dealing with life - has recently been of increasing public concern in the United States. Alarm for patent practices related to life systems ranges from patents being granted on biochemical processes and the knowledge of these processes to the patenting of entire organisms. One of the most volatile concerns is the patenting of human genes or parts of genes since this genetic material is the basic informational molecule for all life. Current patent law, legislated in 1952, has been interpreted by the U.S. Supreme Court to allow broad patents of DNA, biochemical processes, and what are generally considered 'inventions' of life systems. Several issues are addressed in this paper regarding the unsound reasoning underlying both the interpretation and execution of patent law. Lapses in logic provide a gateway for businesses and individuals to take patenting to an illogical and unworkable extreme. Patent Office disorder of this magnitude is unnecessary and has great potential for harming the mission that the patent office was designed to serve. Recently disclosed patent-granting guidelines suggest the United States Patent and Trademark Office is not upholding its Constitutional responsibility of promoting the progress of science.

  5. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  6. Charge Transfer Reactions

    NASA Astrophysics Data System (ADS)

    Dennerl, Konrad

    2010-12-01

    Charge transfer, or charge exchange, describes a process in which an ion takes one or more electrons from another atom. Investigations of this fundamental process have accompanied atomic physics from its very beginning, and have been extended to astrophysical scenarios already many decades ago. Yet one important aspect of this process, i.e. its high efficiency in generating X-rays, was only revealed in 1996, when comets were discovered as a new class of X-ray sources. This finding has opened up an entirely new field of X-ray studies, with great impact due to the richness of the underlying atomic physics, as the X-rays are not generated by hot electrons, but by ions picking up electrons from cold gas. While comets still represent the best astrophysical laboratory for investigating the physics of charge transfer, various studies have already spotted a variety of other astrophysical locations, within and beyond our solar system, where X-rays may be generated by this process. They range from planetary atmospheres, the heliosphere, the interstellar medium and stars to galaxies and clusters of galaxies, where charge transfer may even be observationally linked to dark matter. This review attempts to put the various aspects of the study of charge transfer reactions into a broader historical context, with special emphasis on X-ray astrophysics, where the discovery of cometary X-ray emission may have stimulated a novel look at our universe.

  7. The practical equity implications of advanced metering infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felder, Frank A.

    2010-07-15

    Reductions in advanced metering costs and the efficiency benefits of dynamic pricing make a compelling case to adopt both, particularly for industrial and commercial facilities. Regulators should seriously consider such policies for residential households as well. Regulators can take meaningful steps to mitigate, if not entirely offset, the possibility that some low-income ratepayers may have higher electricity bills with AM and DP. (author)

  8. Large Quantum Probability Backflow and the Azimuthal Angle-Angular Momentum Uncertainty Relation for an Electron in a Constant Magnetic Field

    ERIC Educational Resources Information Center

    Strange, P.

    2012-01-01

    In this paper we demonstrate a surprising aspect of quantum mechanics that is accessible to an undergraduate student. We discuss probability backflow for an electron in a constant magnetic field. It is shown that even for a wavepacket composed entirely of states with negative angular momentum the effective angular momentum can take on positive…

  9. Numerical Simulation Of Silicon-Ribbon Growth

    NASA Technical Reports Server (NTRS)

    Woda, Ben K.; Kuo, Chin-Po; Utku, Senol; Ray, Sujit Kumar

    1987-01-01

    Mathematical model, now in development, includes nonlinear effects and simulates growth of silicon ribbon from melt. Takes account of entire temperature and stress history of ribbon. Numerical simulations performed with the new model help in the search for temperature distribution, pulling speed, and other conditions favoring growth of wide, flat, relatively defect-free silicon ribbons for solar photovoltaic cells at economically attractive, high production rates. Also applicable to materials other than silicon.

  10. Naval Personnel Organization; A Cultural-Historical Approach

    DTIC Science & Technology

    1982-08-01

    qualifying them for promotion much earlier, no doubt, than if they had had to learn their trade entirely from on-the-job apprenticeships. The ... mechanical or mathematical model rather than one which takes into account the ability of human beings to interpret, misunderstand, adjust or violate ... one. They underwent training in reading, mathematics, geography, seamanship and gunnery at apprentice training schools before being sent into training

  11. Leading change: 3--implementation.

    PubMed

    Kerridge, Joanna

    The potential for all staff to contribute to service improvement, irrespective of discipline, role or function, is outlined in the 2011 NHS leadership framework. This advocates developing the skills of the entire workforce to create a climate of continuous service improvement. As nurses are often required to take the lead in managing change in clinical practice, this final article in a three-part series focuses on implementing and reviewing change.

  12. 41 CFR 304-3.19 - Are there other situations when I may accept payment from a non-Federal source for my travel...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false Are there other....19 Public Contracts and Property Management Federal Travel Regulation System PAYMENT OF TRAVEL... of Personnel Management at 5 CFR part 410). (b) Under 5 U.S.C. 7342 for travel taking place entirely...

  13. Disparities in Early Learning and Development:Lessons from the Early Childhood Longitudinal Study--Birth Cohort (ECLS-B)

    ERIC Educational Resources Information Center

    Halle, Tamara; Forry, Nicole; Hair, Elizabeth; Perper, Kate; Wandner, Laura; Wessel, Julia; Vick, Jessica

    2009-01-01

    Education and business leaders as well as the public at large have grown increasingly concerned about the achievement disparities that children from at-risk backgrounds manifest at a young age. Early childhood initiatives that take into account the entire preschool period of 0 to 5 years need a better understanding of the disparities which may be…

  14. Entire Photodamaged Chloroplasts Are Transported to the Central Vacuole by Autophagy

    PubMed Central

    2017-01-01

    Turnover of dysfunctional organelles is vital to maintain homeostasis in eukaryotic cells. As photosynthetic organelles, plant chloroplasts can suffer sunlight-induced damage. However, the process for turnover of entire damaged chloroplasts remains unclear. Here, we demonstrate that autophagy is responsible for the elimination of sunlight-damaged, collapsed chloroplasts in Arabidopsis thaliana. We found that vacuolar transport of entire chloroplasts, termed chlorophagy, was induced by UV-B damage to the chloroplast apparatus. This transport did not occur in autophagy-defective atg mutants, which exhibited UV-B-sensitive phenotypes and accumulated collapsed chloroplasts. Use of a fluorescent protein marker of the autophagosomal membrane allowed us to image autophagosome-mediated transport of entire chloroplasts to the central vacuole. In contrast to sugar starvation, which preferentially induced a distinct type of chloroplast-targeted autophagy that transports part of the stroma via the Rubisco-containing body (RCB) pathway, photooxidative damage induced chlorophagy without prior activation of RCB production. We further showed that chlorophagy is induced by chloroplast damage caused by either artificial visible light or natural sunlight. Thus, this report establishes that an autophagic process eliminates entire chloroplasts in response to light-induced damage. PMID:28123106

  15. Recent Progress on the Second Generation CMORPH: A Prototype Operational Processing System

    NASA Astrophysics Data System (ADS)

    Xie, Pingping; Joyce, Robert; Wu, Shaorong

    2016-04-01

    As reported at the EGU General Assembly of 2015, a conceptual test system was developed for the second generation CMORPH to produce global analyses of 30-min precipitation on a 0.05deg lat/lon grid over the entire globe from pole to pole through integration of information from satellite observations as well as numerical model simulations. The second generation CMORPH is built upon the Kalman Filter based CMORPH algorithm of Joyce and Xie (2011). Inputs to the system include both rainfall and snowfall rate retrievals from passive microwave (PMW) measurements aboard all available low earth orbit (LEO) satellites, precipitation estimates derived from infrared (IR) observations of geostationary (GEO) as well as LEO platforms, and precipitation simulations from numerical global models. Sub-systems were developed and refined to derive precipitation estimates from the GEO and LEO IR observations and to compute precipitating cloud motion vectors. The results were reported at the EGU of 2014 and the AGU 2015 Fall Meetings. In this presentation, we report our recent work on the construction of a prototype operational processing system for the second generation CMORPH. The second generation CMORPH prototype operational processing system takes in the passive microwave (PMW) retrievals of instantaneous precipitation rates from all available sensors, the full-resolution GEO and LEO IR data, as well as the hourly precipitation fields generated by the NOAA/NCEP Climate Forecast System (CFS) Reanalysis (CFSR). First, a combined field of PMW based precipitation retrievals (MWCOMB) is created on a 0.05deg lat/lon grid over the entire globe through inter-calibrating retrievals from various sensors against a common reference. For this experiment, the reference field is the GMI based retrievals with climatological adjustment against the TMI retrievals using data over the overlapping period. Precipitation estimates are then derived from the GEO and LEO IR data through calibration against the global MWCOMB and the CloudSat CPR based estimates. Meanwhile, precipitating cloud motion vectors are derived through the combination of vectors computed from the GEO IR based precipitation estimates and the CFSR precipitation with a 2DVAR technique. The prototype system is applied to generate integrated global precipitation estimates over the entire globe for a three-month period from June 1 to August 31 of 2015. Preliminary tests are conducted to optimize the performance of the system. Specific efforts are made to improve the computational efficiency of the system. The second generation CMORPH test products are compared to the first generation CMORPH and ground observations. Detailed results will be reported at the EGU.
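
    Inter-calibration of individual PMW sensors against a common reference, as described above, is often implemented with histogram (quantile) matching. The following sketch shows a generic empirical quantile-mapping adjustment in Python, as an illustration only; the parameter choices and names are assumptions, not the operational CMORPH code.

    ```python
    import numpy as np

    def quantile_mapping(target_sensor, reference, quantiles=np.linspace(0.01, 0.99, 99)):
        """Return a function that adjusts 'target_sensor' rain rates so that their
        distribution matches that of the 'reference' retrievals.

        Both inputs are 1-D arrays of collocated, rainy (> 0) precipitation rates.
        """
        src_q = np.quantile(target_sensor, quantiles)
        ref_q = np.quantile(reference, quantiles)

        def adjust(values):
            # Piecewise-linear mapping from source quantiles to reference quantiles;
            # values outside the fitted range are clipped to the end points.
            return np.interp(values, src_q, ref_q)

        return adjust

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        reference = rng.gamma(shape=0.8, scale=4.0, size=5000)      # reference-like rates (mm/h)
        biased = 0.7 * rng.gamma(shape=0.8, scale=4.0, size=5000)   # low-biased sensor
        adjust = quantile_mapping(biased, reference)
        corrected = adjust(biased)
        print(round(biased.mean(), 2), round(corrected.mean(), 2), round(reference.mean(), 2))
    ```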

  16. Deuterium-tritium pulse propulsion with hydrogen as propellant and the entire space-craft as a gigavolt capacitor for ignition

    NASA Astrophysics Data System (ADS)

    Winterberg, F.

    2013-08-01

    A deuterium-tritium (DT) nuclear pulse propulsion concept for fast interplanetary transport is proposed utilizing almost all the energy for thrust and without the need for a large radiator: By letting the thermonuclear micro-explosion take place in the center of a liquid hydrogen sphere with the radius of the sphere large enough to slow down and absorb the neutrons of the DT fusion reaction, heating the hydrogen to a fully ionized plasma at a temperature of ∼10^5 K. By using the entire spacecraft as a magnetically insulated gigavolt capacitor, igniting the DT micro-explosion with an intense GeV ion beam discharging the gigavolt capacitor, possible if the spacecraft has the topology of a torus.

  17. 26 CFR 1.924(d)-1 - Requirement that economic processes take place outside the United States.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 10 2011-04-01 2011-04-01 false Requirement that economic processes take place... Citizens of United States § 1.924(d)-1 Requirement that economic processes take place outside the United... any transaction only if economic processes with respect to such transaction take place outside the...

  18. Frequent use of opioids in patients with dementia and nursing home residents: A study of the entire elderly population of Denmark.

    PubMed

    Jensen-Dahm, Christina; Gasse, Christiane; Astrup, Aske; Mortensen, Preben Bo; Waldemar, Gunhild

    2015-06-01

    Pain is believed to be undertreated in patients with dementia; however, no larger studies have been conducted. The aim was to investigate prevalent use of opioids in elderly with and without dementia in the entire elderly population of Denmark. A register-based cross-sectional study in the entire elderly (≥65 years) population in 2010 was conducted. Opioid use among elderly with dementia (N = 35,455) was compared with elderly without (N = 870,645), taking age, sex, comorbidity, and living status into account. Nursing home residents (NHRs) used opioids most frequently (41%), followed by home-living patients with dementia (27.5%) and home-living patients without dementia (16.9%). Buprenorphine and fentanyl (primarily patches) were commonly used among NHRs (18.7%) and home-living patients with dementia (10.7%) but less often by home-living patients without dementia (2.4%). Opioid use in the elderly Danish population was frequent but particularly in patients with dementia and NHR, which may challenge patient safety and needs further investigation. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Generation of eggs from mouse embryonic stem cells and induced pluripotent stem cells.

    PubMed

    Hayashi, Katsuhiko; Saitou, Mitinori

    2013-08-01

    Oogenesis is an integrated process through which an egg acquires the potential for totipotency, a fundamental condition for creating new individuals. Reconstitution of oogenesis in a culture that generates eggs with proper function from pluripotent stem cells (PSCs) is therefore one of the key goals in basic biology as well as in reproductive medicine. Here we describe a stepwise protocol for the generation of eggs from mouse PSCs, such as embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs). ESCs and iPSCs are first induced into primordial germ cell-like cells (PGCLCs) that are in turn aggregated with somatic cells of female embryonic gonads, the precursors for adult ovaries. Induction of PGCLCs followed by aggregation with the somatic cells takes up to 8 d. The aggregations are then transplanted under the ovarian bursa, in which PGCLCs grow into germinal vesicle (GV) oocytes in ∼1 month. The PGCLC-derived GV oocytes can be matured into eggs in 1 d by in vitro maturation (IVM), and they can be fertilized with spermatozoa by in vitro fertilization (IVF) to obtain healthy and fertile offspring. This method provides an initial step toward reconstitution of the entire process of oogenesis in vitro.

  20. Markov model of fatigue of a composite material with the poisson process of defect initiation

    NASA Astrophysics Data System (ADS)

    Paramonov, Yu.; Chatys, R.; Andersons, J.; Kleinhofs, M.

    2012-05-01

    As a development of the model where only one weak microvolume (WMV) and only a pulsating cyclic loading are considered, in the current version of the model, we take into account the presence of several weak sites where fatigue damage can accumulate and a loading with an arbitrary (but positive) stress ratio. The Poisson process of initiation of WMVs is considered, whose rate depends on the size of a specimen. The cumulative distribution function (cdf) of the fatigue life of every individual WMV is calculated using the Markov model of fatigue. For the case where this function is approximated by a lognormal distribution, a formula for calculating the cdf of fatigue life of the specimen (modeled as a chain of WMVs) is obtained. Only a pulsating cyclic loading was considered in the previous version of the model. Now, using the modified energy method, a loading cycle with an arbitrary stress ratio is "transformed" into an equivalent cycle with some other stress ratio. In such a way, the entire probabilistic fatigue diagram for any stress ratio with a positive cycle stress can be obtained. Numerical examples are presented.
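
    A hedged sketch of the kind of specimen-level formula described above: if weak microvolumes are initiated by a Poisson process with specimen-size-dependent mean λ(L), each WMV fails independently with a lognormal fatigue-life cdf, and the specimen behaves as a series (weakest-link) chain, then

    $$ F_{\mathrm{spec}}(n) \;=\; 1 - \exp\!\big[-\lambda(L)\,F_{\mathrm{WMV}}(n)\big], \qquad F_{\mathrm{WMV}}(n) \;=\; \Phi\!\left(\frac{\ln n - \mu}{\sigma}\right), $$

    where n is the number of load cycles, Φ the standard normal cdf, and μ, σ the lognormal parameters. This is the standard weakest-link/Poisson construction and is given only to indicate the structure of the result; the exact formula derived by the authors may differ.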

  1. Aging in the colonial chordate, Botryllus schlosseri.

    PubMed

    Munday, Roma; Rodriguez, Delany; Di Maio, Alessandro; Kassmer, Susannah; Braden, Brian; Taketa, Daryl A; Langenbacher, Adam; De Tomaso, Anthony

    2015-01-30

    What mechanisms underlie aging? One theory, the wear-and-tear model, attributes aging to progressive deterioration in the molecular and cellular machinery which eventually leads to death through the disruption of physiological homeostasis. The second suggests that life span is genetically programmed, and aging may be derived from intrinsic processes which enforce a non-random, terminal time interval for the survivability of the organism. We are studying an organism that demonstrates both properties: the colonial ascidian, Botryllus schlosseri. Botryllus is a member of the Tunicata, the sister group to the vertebrates, and has a number of life history traits which make it an excellent model for studies on aging. First, Botryllus has a colonial life history, and grows by a process of asexual reproduction during which entire bodies, including all somatic and germline lineages, regenerate every week, resulting in a colony of genetically identical individuals. Second, previous studies of lifespan in genetically distinct Botryllus lineages suggest that a direct, heritable basis underlying mortality exists that is unlinked to reproductive effort and other life history traits. Here we will review recent efforts to take advantage of the unique life history traits of B. schlosseri and develop it into a robust model for aging research.

  2. Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plechac, Petr; Vlachos, Dionisios; Katsoulakis, Markos

    2013-09-05

    The overall objective of this project is to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems, whose performance relies on underlying multiscale mathematics. Our specific application lies at the heart of biofuels initiatives of DOE and entails modeling of catalytic systems, to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals. Specific goals include: (i) Development of rigorous spatio-temporal coarse-grained kinetic Monte Carlo (KMC) mathematics and simulation for microscopic processes encountered in biomass transformation. (ii) Development of hybrid multiscale simulation that links stochastic simulation to a deterministic partial differential equation (PDE) model for an entire reactor. (iii) Development of hybrid multiscale simulation that links KMC simulation with quantum density functional theory (DFT) calculations. (iv) Development of parallelization of models of (i)-(iii) to take advantage of Petaflop computing and enable real world applications of complex, multiscale models. In this NCE period, we continued addressing these objectives and completed the proposed work. Main initiatives, key results, and activities are outlined.
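
    For readers unfamiliar with the kinetic Monte Carlo machinery mentioned above, the sketch below implements one generic (Gillespie-type) KMC step: select an event with probability proportional to its rate and advance time by an exponentially distributed increment. It is a textbook illustration with assumed rates, not the coarse-grained scheme developed in the project.

    ```python
    import math
    import random

    def kmc_step(rates, rng=random):
        """One kinetic Monte Carlo step.

        rates : list of non-negative event rates (1/s).
        Returns (event_index, dt) where dt is the waiting time before the event.
        """
        total = sum(rates)
        if total <= 0.0:
            raise ValueError("no executable events")
        # Pick an event with probability rate_i / total.
        r = rng.random() * total
        cumulative, event = 0.0, 0
        for event, rate in enumerate(rates):
            cumulative += rate
            if r < cumulative:
                break
        # Advance the clock by an exponentially distributed waiting time.
        dt = -math.log(1.0 - rng.random()) / total
        return event, dt

    if __name__ == "__main__":
        # Toy surface process: assumed adsorption, desorption, and reaction rates.
        rates = [5.0, 1.0, 2.0]
        t, counts = 0.0, [0, 0, 0]
        for _ in range(10000):
            ev, dt = kmc_step(rates)
            t += dt
            counts[ev] += 1
        print(t, counts)  # event counts should be roughly proportional to the rates
    ```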

  3. MicroRNA, mRNA, and protein expression link development and aging in human and macaque brain

    PubMed Central

    Somel, Mehmet; Guo, Song; Fu, Ning; Yan, Zheng; Hu, Hai Yang; Xu, Ying; Yuan, Yuan; Ning, Zhibin; Hu, Yuhui; Menzel, Corinna; Hu, Hao; Lachmann, Michael; Zeng, Rong; Chen, Wei; Khaitovich, Philipp

    2010-01-01

    Changes in gene expression levels determine differentiation of tissues involved in development and are associated with functional decline in aging. Although development is tightly regulated, the transition between development and aging, as well as regulation of post-developmental changes, are not well understood. Here, we measured messenger RNA (mRNA), microRNA (miRNA), and protein expression in the prefrontal cortex of humans and rhesus macaques over the species' life spans. We find that few gene expression changes are unique to aging. Instead, the vast majority of miRNA and gene expression changes that occur in aging represent reversals or extensions of developmental patterns. Surprisingly, many gene expression changes previously attributed to aging, such as down-regulation of neural genes, initiate in early childhood. Our results indicate that miRNA and transcription factors regulate not only developmental but also post-developmental expression changes, with a number of regulatory processes continuing throughout the entire life span. Differential evolutionary conservation of the corresponding genomic regions implies that these regulatory processes, although beneficial in development, might be detrimental in aging. These results suggest a direct link between developmental regulation and expression changes taking place in aging. PMID:20647238

  4. Evolutionary dynamics of public goods games with diverse contributions in finite populations

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Wu, Bin; Chen, Xiaojie; Wang, Long

    2010-05-01

    The public goods game is a powerful metaphor for exploring the maintenance of social cooperative behavior in a group of interacting selfish players. Here we study the emergence of cooperation in the public goods games with diverse contributions in finite populations. The theory of stochastic processes is innovatively adopted to investigate the evolutionary dynamics of the public goods games involving a diversity of contributions. In the limit of rare mutations, the general stationary distribution of this stochastic process can be analytically approximated by means of diffusion theory. Moreover, we demonstrate that increasing the diversity of contributions greatly reduces the probability of finding the population in a homogeneous state full of defectors. This increase also raises the expectation of the total contribution in the entire population and thus promotes social cooperation. Furthermore, by investigating the evolutionary dynamics of optional public goods games with diverse contributions, we find that nonparticipation can assist players who contribute more in resisting invasion and taking over individuals who contribute less. In addition, numerical simulations are performed to confirm our analytical results. Our results may provide insight into the effect of diverse contributions on cooperative behaviors in the real world.
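
    A compact statement of the rare-mutation machinery referenced above, offered as standard background rather than as the authors' exact derivation: when mutations are rare, the population is almost always homogeneous, and the evolutionary dynamics reduce to a Markov chain over the homogeneous states whose transitions are governed by the fixation probabilities ρ_ij of a single j-mutant in a resident-i population,

    $$ T_{ij} \;=\; \frac{\mu}{M-1}\,\rho_{ij} \quad (i \neq j), \qquad T_{ii} \;=\; 1 - \sum_{j \neq i} T_{ij}, $$

    with M strategies and mutation rate μ → 0. The stationary distribution is the normalized left eigenvector of T for eigenvalue 1, and diffusion theory supplies approximations for the ρ_ij in finite populations.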

  5. Role of Multiple Atmospheric Reflections in Formation of Electron Distribution Function in the Diffuse Aurora Region. Chapter 9

    NASA Technical Reports Server (NTRS)

    Khazanov, George V.; Himwich, Elizabeth W.; Glocer, Alex; Sibeck, David G.

    2015-01-01

    The precipitation of high-energy magnetospheric electrons (E greater than 500-600 electronvolts) in the diffuse aurora contributes significant energy flux into Earth's ionosphere. In the diffuse aurora, precipitating electrons initially injected from the plasmasheet via wave-particle interaction processes degrade in the atmosphere toward lower energies and produce secondary electrons via impact ionization of the neutral atmosphere. These initially precipitating electrons of magnetospheric origin can be additionally reflected back into the magnetosphere by the two magnetically conjugated atmospheres, leading to a series of multiple reflections that can greatly influence the initially precipitating flux at the upper ionospheric boundary (700-800 kilometers) and the resultant population of secondary electrons and electrons cascading toward lower energies. We present the solution of the Boltzmann-Landau kinetic equation that uniformly describes the entire electron distribution function in the diffuse aurora, including the affiliated production of secondary electrons (E is less than or equal to 600 electronvolts) and their energy interplay in the magnetosphere and two conjugated ionospheres. This solution takes into account the role of multiple atmospheric reflections of the precipitated electrons that were initially moved into the loss cone via wave-particle interaction processes in Earth's plasmasheet.

  6. 2012 financial outlook: physicians and podiatrists.

    PubMed

    Schaum, Kathleen D

    2012-04-01

    Although the nationally unadjusted average Medicare allowable rates have not increased or decreased significantly, the new codes, the new coding regulations, the NCCI edits, and the Medicare contractors' local coverage determinations (LCDs) will greatly impact physicians' and podiatrists' revenue in 2012. Therefore, every wound care physician and podiatrist should take the time to update their charge sheets and their data entry systems with correct codes, units, and appropriate charges (that account for all the resources needed to perform each service or procedure). They should carefully read the LCDs that are pertinent to the work they perform. If the LCDs contain language that is unclear or incorrect, physicians and podiatrists should contact the Medicare contractor medical director and request a revision through the LCD Reconsideration Process. Medicare has stabilized the MPFS allowable rates for 2012; now physicians and podiatrists must do their part to implement the new coding, payment, and coverage regulations. To be sure that the entire revenue process is working properly, physicians and podiatrists should conduct quarterly, if not monthly, audits of their revenue cycle. Healthcare providers will maintain a healthy revenue cycle by conducting internal audits before outside auditors conduct audits that result in repayments that could have been prevented.

  7. Saguaro: a distributed operating system based on pools of servers. Annual report, 1 January 1984-31 December 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, G.R.

    1986-03-03

    Prototypes of components of the Saguaro distributed operating system were implemented and the design of the entire system refined based on the experience. The philosophy behind Saguaro is to support the illusion of a single virtual machine while taking advantage of the concurrency and robustness that are possible in a network architecture. Within the system, these advantages are realized by the use of pools of server processes and decentralized allocation protocols. Potential concurrency and robustness are also made available to the user through low-cost mechanisms to control placement of executing commands and files, and to support semi-transparent file replication and access. Another unique aspect of Saguaro is its extensive use of a type system to describe user data such as files and to specify the types of arguments to commands and procedures. This enables the system to assist in type checking and leads to a user interface in which command-specific templates are available to facilitate command invocation. A mechanism, channels, is also provided to enable users to construct applications containing general graphs of communication processes.

  8. Population size vs. social connectedness - A gene-culture coevolutionary approach to cumulative cultural evolution.

    PubMed

    Kobayashi, Yutaka; Ohtsuki, Hisashi; Wakano, Joe Y

    2016-10-01

    It has long been debated if population size is a crucial determinant of the level of culture. While empirical results are mixed, recent theoretical studies suggest that social connectedness between people may be a more important factor than the size of the entire population. These models, however, do not take into account evolutionary responses of learning strategies determining the mode of transmission and innovation and are hence not suitable for predicting the long-term implications of parameters of interest. In the present paper, to address this issue, we provide a gene-culture coevolution model, in which the microscopic learning process of each individual is explicitly described as a continuous-time stochastic process and time allocation to social and individual learning is allowed to evolve. We have found that social connectedness has a larger impact on the equilibrium level of culture than population size especially when connectedness is weak and population size is large. This result, combined with those of previous culture-only models, points to the importance of studying separate effects of population size and internal social structure to better understand spatiotemporal variation in the level of culture. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. ChromaStarPy: A Stellar Atmosphere and Spectrum Modeling and Visualization Lab in Python

    NASA Astrophysics Data System (ADS)

    Short, C. Ian; Bayer, Jason H. T.; Burns, Lindsey M.

    2018-02-01

    We announce ChromaStarPy, an integrated general stellar atmospheric modeling and spectrum synthesis code written entirely in python V. 3. ChromaStarPy is a direct port of the ChromaStarServer (CSServ) Java modeling code described in earlier papers in this series, and many of the associated JavaScript (JS) post-processing procedures have been ported and incorporated into CSPy so that students have access to ready-made data products. A python integrated development environment (IDE) allows a student in a more advanced course to experiment with the code and to graphically visualize intermediate and final results, ad hoc, as they are running it. CSPy allows students and researchers to compare modeled to observed spectra in the same IDE in which they are processing observational data, while having complete control over the stellar parameters affecting the synthetic spectra. We also take the opportunity to describe improvements that have been made to the related codes, ChromaStar (CS), CSServ, and ChromaStarDB (CSDB), that, where relevant, have also been incorporated into CSPy. The application may be found at the home page of the OpenStars project: http://www.ap.smu.ca/OpenStars/.

  10. Robots with language.

    PubMed

    Parisi, Domenico

    2010-01-01

    Trying to understand human language by constructing robots that have language necessarily implies an embodied view of language, where the meaning of linguistic expressions is derived from the physical interactions of the organism with the environment. The paper describes a neural model of language according to which the robot's behaviour is controlled by a neural network composed of two sub-networks, one dedicated to the non-linguistic interactions of the robot with the environment and the other one to processing linguistic input and producing linguistic output. We present the results of a number of simulations using the model and we suggest how the model can be used to account for various language-related phenomena such as disambiguation, the metaphorical use of words, the pervasive idiomaticity of multi-word expressions, and mental life as talking to oneself. The model implies a view of the meaning of words and multi-word expressions as a temporal process that takes place in the entire brain and has no clearly defined boundaries. The model can also be extended to emotional words if we assume that an embodied view of language includes not only the interactions of the robot's brain with the external environment but also the interactions of the brain with what is inside the body.
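
    A minimal sketch, assuming a feedforward reading of the two-sub-network idea described above: one branch maps sensory input toward motor output, a second branch maps linguistic input toward linguistic output, and the two share an internal layer so that language can modulate behaviour. All dimensions and names are hypothetical; the published model is recurrent and trained, which this sketch does not attempt to reproduce.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def layer(n_in, n_out):
        """Random weight matrix and zero bias for a dense layer (untrained, illustrative)."""
        return rng.normal(scale=0.1, size=(n_out, n_in)), np.zeros(n_out)

    # Non-linguistic branch: sensory input -> shared hidden layer -> motor output.
    W_sens, b_sens = layer(10, 16)
    # Linguistic branch: word input -> shared hidden layer -> word output.
    W_lang, b_lang = layer(8, 16)
    W_motor, b_motor = layer(16, 4)
    W_word, b_word = layer(16, 8)

    def forward(sensory, words):
        """Both branches write into the same hidden representation."""
        hidden = np.tanh(W_sens @ sensory + b_sens + W_lang @ words + b_lang)
        motor_out = np.tanh(W_motor @ hidden + b_motor)
        word_out = np.tanh(W_word @ hidden + b_word)
        return motor_out, word_out

    if __name__ == "__main__":
        sensory = rng.random(10)            # e.g. proprioceptive and visual features
        words = np.zeros(8); words[2] = 1.0  # one-hot linguistic input
        motor, speech = forward(sensory, words)
        print(motor.shape, speech.shape)
    ```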

  11. Atmospheric Spray Freeze-Drying: Numerical Modeling and Comparison With Experimental Measurements.

    PubMed

    Borges Sebastião, Israel; Robinson, Thomas D; Alexeenko, Alina

    2017-01-01

    Atmospheric spray freeze-drying (ASFD) represents a novel approach to dry thermosensitive solutions via sublimation. Tests conducted with second-generation ASFD equipment, developed for pharmaceutical applications, have focused initially on producing a light, fine, high-grade powder consistently and reliably. To better understand the heat and mass transfer physics and drying dynamics taking place within the ASFD chamber, 3 analytical models describing the key processes are developed and validated. First, by coupling the dynamics and heat transfer of single droplets sprayed into the chamber, the velocity, temperature, and phase change evolutions of these droplets are estimated for actual operational conditions. This model reveals that, under typical operational conditions, the sprayed droplets require less than 100 ms to freeze. Second, because understanding the heat transfer throughout the entire freeze-drying process is so important, a theoretical model is proposed to predict the time evolution of the chamber gas temperature. Finally, a drying model, calibrated with hygrometer measurements, is used to estimate the total time required to achieve a predefined final moisture content. Results from these models are compared with experimental data. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
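
    To make the sub-100 ms freezing figure above concrete, a back-of-the-envelope lumped-capacitance estimate (our illustration, not the paper's coupled model) for a small droplet cooled by the surrounding gas, taking the small-particle limit Nu = h d / k_g ≈ 2, gives a thermal time constant

    $$ \tau \;=\; \frac{\rho\, c_p\, V}{h\, A} \;=\; \frac{\rho\, c_p\, d}{6\,h} \;\approx\; \frac{\rho\, c_p\, d^{2}}{12\, k_g}. $$

    For an assumed 50 μm aqueous droplet (ρ ≈ 10^3 kg/m^3, c_p ≈ 4.2 kJ/(kg K), k_g ≈ 0.024 W/(m K)) this is of the order of 40 ms, consistent with the reported freezing times; latent heat release and internal gradients, which the paper's model resolves, are neglected in this estimate.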

  12. Weldability Characteristics of Sintered Hot-Forged AISI 4135 Steel Produced through P/M Route by Using Pulsed Current Gas Tungsten Arc Welding

    NASA Astrophysics Data System (ADS)

    Joseph, Joby; Muthukumaran, S.; Pandey, K. S.

    2016-01-01

    Present investigation is an attempt to study the weldability characteristics of sintered hot-forged plates of AISI 4135 steel produced through powder metallurgy (P/M) route using matching filler materials of ER80S B2. Compacts of homogeneously blended elemental powders corresponding to the above steel were prepared on a universal testing machine (UTM) by taking pre-weighed powder blend with a suitable die, punch and bottom insert assembly. Indigenously developed ceramic coating was applied on the entire surface of the compacts in order to protect them from oxidation during sintering. Sintered preforms were hot forged to flat, approximately rectangular plates, welded by pulsed current gas tungsten arc welding (PCGTAW) processes with aforementioned filler materials. Microstructural, tensile and hardness evaluations revealed that PCGTAW process with low heat input could produce weldments of good quality with almost nil defects. It was established that PCGTAW joints possess improved tensile properties compared to the base metal and it was mainly attributed to lower heat input, resulting in finer fusion zone grains and higher fusion zone hardness. Thus, the present investigation opens a new and demanding field in research.

  13. Aging in the colonial chordate, Botryllus schlosseri

    PubMed Central

    Munday, Roma; Rodriguez, Delany; Di Maio, Alessandro; Kassmer, Susannah; Braden, Brian; Taketa, Daryl A.; Langenbacher, Adam; De Tomaso, Anthony

    2015-01-01

    What mechanisms underlie aging? One theory, the wear-and-tear model, attributes aging to progressive deterioration in the molecular and cellular machinery which eventually leads to death through the disruption of physiological homeostasis. The second suggests that life span is genetically programmed, and aging may be derived from intrinsic processes which enforce a non-random, terminal time interval for the survivability of the organism. We are studying an organism that demonstrates both properties: the colonial ascidian, Botryllus schlosseri. Botryllus is a member of the Tunicata, the sister group to the vertebrates, and has a number of life history traits which make it an excellent model for studies on aging. First, Botryllus has a colonial life history, and grows by a process of asexual reproduction during which entire bodies, including all somatic and germline lineages, regenerate every week, resulting in a colony of genetically identical individuals. Second, previous studies of lifespan in genetically distinct Botryllus lineages suggest that a direct, heritable basis underlying mortality exists that is unlinked to reproductive effort and other life history traits. Here we will review recent efforts to take advantage of the unique life history traits of B. schlosseri and develop it into a robust model for aging research. PMID:26136620

  14. Belief revision and delusions: how do patients with schizophrenia take advice?

    PubMed

    Kaliuzhna, Mariia; Chambon, Valérian; Franck, Nicolas; Testud, Bérangère; Van der Henst, Jean-Baptiste

    2012-01-01

    The dominant cognitive model that accounts for the persistence of delusional beliefs in schizophrenia postulates that patients suffer from a general deficit in belief revision. It is generally assumed that this deficit is a consequence of impaired reasoning skills. However, the possibility that such inflexibility affects the entire system of a patient's beliefs has rarely been empirically tested. Using delusion-neutral material in a well-documented advice-taking task, the present study reports that patients with schizophrenia: 1) revise their beliefs, 2) take into account socially provided information to do so, 3) are not overconfident about their judgments, and 4) show less egocentric advice-discounting than controls. This study thus shows that delusional patients' difficulty in revising beliefs is more selective than had been previously assumed. The specificities of the task and the implications for a theory of delusion formation are discussed.

  15. Belief Revision and Delusions: How Do Patients with Schizophrenia Take Advice?

    PubMed Central

    Kaliuzhna, Mariia; Chambon, Valérian; Franck, Nicolas; Testud, Bérangère; Van der Henst, Jean-Baptiste

    2012-01-01

    The dominant cognitive model that accounts for the persistence of delusional beliefs in schizophrenia postulates that patients suffer from a general deficit in belief revision. It is generally assumed that this deficit is a consequence of impaired reasoning skills. However, the possibility that such inflexibility affects the entire system of a patient's beliefs has rarely been empirically tested. Using delusion-neutral material in a well-documented advice-taking task, the present study reports that patients with schizophrenia: 1) revise their beliefs, 2) take into account socially provided information to do so, 3) are not overconfident about their judgments, and 4) show less egocentric advice-discounting than controls. This study thus shows that delusional patients' difficulty in revising beliefs is more selective than had been previously assumed. The specificities of the task and the implications for a theory of delusion formation are discussed. PMID:22536329

  16. Validation of an in vitro digestive system for studying macronutrient decomposition in humans.

    PubMed

    Kopf-Bolanz, Katrin A; Schwander, Flurina; Gijs, Martin; Vergères, Guy; Portmann, Reto; Egger, Lotti

    2012-02-01

    The digestive process transforms nutrients and bioactive compounds contained in food to physiologically active compounds. In vitro digestion systems have proven to be powerful tools for understanding and monitoring the complex transformation processes that take place during digestion. Moreover, the investigation of the physiological effects of certain nutrients demands an in vitro digestive process that is close to human physiology. In this study, human digestion was simulated with a 3-step in vitro process that was validated in depth by choosing pasteurized milk as an example of a complex food matrix. The evolution and decomposition of the macronutrients was followed over the entire digestive process to the level of intestinal enterocyte action, using protein and peptide analysis by SDS-PAGE, reversed-phase HPLC, size exclusion HPLC, and liquid chromatography-MS. The mean peptide size after in vitro digestion of pasteurized milk was 5-6 amino acids (AA). Interestingly, mostly essential AA (93.6%) were released during in vitro milk digestion, a significantly different relative distribution compared to the total essential AA concentration of bovine milk (44.5%). All TG were degraded to FFA and monoacylglycerols. Herein, we present a human in vitro digestion model validated for its ability to degrade the macronutrients of dairy products comparable to physiological ranges. It is suited to be used in combination with a human intestinal cell culture system, allowing ex vivo bioavailability measurements and assessment of the bioactive properties of food components.

  17. 7 CFR 51.3416 - Classification of defects.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Maximum allowed for U.S. No. 2 processing Occurring outside of or not entirely confined to the vascular ring Internal Black Spot, Internal Discoloration, Vascular Browning, Fusarium Wilt, Net Necrosis, Other Necrosis, Stem End Browning 5% waste 10% waste. Occurring entirely within the vascular ring Hollow Heart or...

  18. Specific immunotherapy of experimental myasthenia by genetically engineered APCs: the "guided missile" strategy.

    PubMed

    Drachman, D B; Wu, J-M; Miagkov, A; Williams, M A; Adams, R N; Wu, B

    2003-09-01

    Although treatment of MG with general immunosuppressive agents is often effective, it has important drawbacks, including suppression of the immune system as a whole, with the risks of infection and neoplasia, and numerous other adverse side effects. Ideally, treatment of MG should eliminate the specific pathogenic autoimmune response to AChR, without otherwise suppressing the immune system or producing other adverse side effects. Although antibodies to AChR are directly responsible for the loss of AChRs at neuromuscular junctions in MG, the AChR antibody response is T cell-dependent, and immunotherapy directed at T cells can abrogate the autoantibody response, with resulting benefit. As in other autoimmune diseases, the T cell response in MG is highly heterogeneous. The design of specific immunotherapy must take this heterogeneity into account and target the entire repertoire of AChR-specific T cells. We describe our investigation of a novel strategy for specific immunotherapy of MG, involving gene transfer to convert antigen-presenting cells (APCs) to "guided missiles" that target AChR-specific T cells, and that induce apoptosis and elimination of those T cells. This strategy uses the ability of APCs from a given individual to present the entire spectrum of AChR epitopes unique for that individual, and thereby to target the entire repertoire of antigen-specific T cells of the same individual. Using viral vectors, we have genetically engineered the APCs to process and present the most important domain of the AChR molecule, and to express a "warhead" of Fas ligand (FasL) to eliminate the activated AChR-specific T cells with which they interact. Our results show that the APCs express the appropriate gene products, and effectively and specifically eliminate AChR-specific T cells by the Fas/FasL pathway, while sparing T cells of other specificities.

  19. Theoretical study of optical pump process in solid gain medium based on four-energy-level model

    NASA Astrophysics Data System (ADS)

    Ma, Yongjun; Fan, Zhongwei; Zhang, Bin; Yu, Jin; Zhang, Hongbo

    2018-04-01

    A semiclassical algorithm is explored for a four-energy-level model, aiming to find out the factors that affect the dynamical behavior during the pump process. The impacts of the pump intensity Ω_p, the non-radiative transition rate γ_43, and the decay rate of the electric dipole δ_14 are discussed in detail. The calculation results show that a large γ_43, a small δ_14, and strong pumping Ω_p are beneficial to establishing population inversion. Under strong pumping conditions, the entire pump process can be divided into four different phases, tentatively named the far-from-equilibrium process, the Rabi oscillation process, the quasi-dynamic-equilibrium process, and the 'equilibrium' process. The Rabi oscillation can slow the pumping process and cause some instability. Moreover, the duration of the entire process is negatively related to Ω_p and γ_43 whereas positively related to δ_14.
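
    For orientation, a simplified rate-equation picture of a four-level pumping scheme (the paper's treatment is semiclassical and also resolves the coherent Rabi dynamics, which simple rate equations cannot capture) can be written, under assumed notation, as

    $$ \frac{dN_1}{dt} = -W_p N_1 + \frac{N_2}{\tau_{21}}, \quad \frac{dN_4}{dt} = W_p N_1 - \gamma_{43} N_4, \quad \frac{dN_3}{dt} = \gamma_{43} N_4 - \frac{N_3}{\tau_{32}}, \quad \frac{dN_2}{dt} = \frac{N_3}{\tau_{32}} - \frac{N_2}{\tau_{21}}, $$

    where W_p ∝ Ω_p^2 is the pump rate from the ground level 1 to the pump level 4, γ_43 the fast non-radiative decay feeding the upper level 3, and τ_32, τ_21 the level lifetimes. In this picture a large γ_43 and strong pumping favor population inversion between levels 3 and 2, in line with the trends reported above.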

  20. Capturing molecular multimode relaxation processes in excitable gases based on decomposition of acoustic relaxation spectra

    NASA Astrophysics Data System (ADS)

    Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng

    2017-08-01

    Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
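
    The decomposition underlying the method above can be stated compactly: the frequency-dependent effective isochoric specific heat of a gas with N relaxing internal modes is, in the usual single-relaxation-time form (written here as a hedged summary of the standard result rather than the paper's exact notation),

    $$ C_v^{\mathrm{eff}}(\omega) \;=\; C_v^{\infty} \;+\; \sum_{i=1}^{N} \frac{C_i}{1 + \mathrm{i}\,\omega\,\tau_i}, $$

    so the multi-relaxation spectrum is a superposition of N single-relaxation contributions, each characterized by a relaxation strength C_i and relaxation time τ_i. Measuring absorption and sound speed at 2N frequencies then supplies, in principle, the 2N unknowns (C_i, τ_i) needed to reconstruct the entire process.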

  1. Pool and flow boiling in variable and microgravity

    NASA Technical Reports Server (NTRS)

    Merte, Herman, Jr.

    1994-01-01

    As is well known, boiling is an effective mode of heat transfer in that high heat flux levels are possible with relatively small temperature differences. Its optimal application requires that the process be adequately understood. A measure of the understanding of any physical event lies in the ability to predict its behavior in terms of the relevant parameters. Despite many years of research, the predictability of boiling is currently possible only for quite specialized circumstances, e.g., the critical heat flux and film boiling for the pool boiling case, and then only with special geometries. Variable gravity down to microgravity provides the opportunity to test this understanding, but possibly more important, by changing the dimensional and time scales involved it permits more detailed observations of elements involved in the boiling process, and perhaps discloses phenomena heretofore unknown. The focus here is on nucleate boiling although, as will be demonstrated below, under certain circumstances in microgravity it can take place concurrently with the dryout process. In the presence of earth gravity or forced convection effects, the latter process is usually referred to as film boiling. However, no vapor film as such forms with pool boiling in microgravity, only dryout. Initial results are presented here for pool boiling in microgravity, and were made possible at such an early date by the availability of the Get-Away-Specials (GAS). Also presented here are some results of ground testing of a flow loop for the study of low velocity boiling, eventually to take place also in microgravity. In the interim, variable buoyancy normal to the heater surface is achieved by rotation of the entire loop relative to earth gravity. Of course, this is at the expense of varying the buoyancy parallel to the heater surface. Two questions which must be resolved early in the study of flow boiling in microgravity are (1) the lower limits of liquid flow velocity where buoyancy effects become significant to the boiling process, and (2) the effect of lower liquid flow velocities on the critical heat flux when buoyancy is removed. Results of initial efforts in these directions are presented, albeit restricted currently to the ever-present earth gravity.

  2. Parallel In Situ Screening of Remediation Strategies for Improved Decision Making, Remedial Design, and Cost Savings

    DTIC Science & Technology

    2013-02-01

    [Report excerpt fragments] "... phosphorus, and vitamin B12. Additionally a reductant reacts directly with hexavalent chromium to reduce it to the trivalent state." "... being operated under continuous flow conditions in the laboratory. Entire assembly takes up approximately 5 sq. ft. in a fume hood." List-of-figures entries: Figure 5-3 (caption truncated); Figure 5-16, "Hexavalent Chromium detected in ISMA effluent post in situ incubation."

  3. An all-woman crew to Mars: a radical proposal

    NASA Technical Reports Server (NTRS)

    Landis, G. A.

    2000-01-01

    It is logical to propose that if a human mission is flown to Mars, it should be composed of an entirely female crew. On average, women have lower mass and take up less volume than men, and use proportionately fewer consumables. In addition, sociological research indicates that a female crew may have a preferable interpersonal dynamic, and be likely to choose non-confrontational approaches to solve interpersonal problems.

  4. Small intestinal function and dietary status in dermatitis herpetiformis.

    PubMed Central

    Gawkrodger, D J; McDonald, C; O'Mahony, S; Ferguson, A

    1991-01-01

    Small intestinal morphology and function were assessed in 82 patients with dermatitis herpetiformis, 51 of whom were taking a normal diet and 31 a gluten free diet. Methods used were histopathological evaluation of jejunal mucosal biopsy specimens, quantitation of intraepithelial lymphocytes, cellobiose/mannitol permeability test, tissue disaccharidase values, serum antigliadin antibodies, and formal assessment of dietary gluten content by a dietician. There was no correlation between dietary gluten intake and the degree of enteropathy in the 51 patients taking a normal diet, whereas biopsy specimens were normal in 24 of the 31 patients on a gluten free diet, all previously having been abnormal. Eighteen patients on gluten containing diets had normal jejunal histology and in seven of these all tests of small intestinal morphology and function were entirely normal. Intestinal permeability was abnormal and serum antigliadin antibodies were present in most patients with enteropathy. Studies of acid secretion in seven patients showed that hypochlorhydria or achlorhydria did not lead to abnormal permeability in the absence of enteropathy. This study shows that a combination of objective tests of small intestinal architecture and function will detect abnormalities in most dermatitis herpetiformis patients, including some with histologically normal jejunal biopsy specimens. Nevertheless there is a small group in whom all conventional intestinal investigations are entirely normal. PMID:2026337

  5. Salton Sea Scientific Drilling Program

    USGS Publications Warehouse

    Sass, J.H.

    1988-01-01

    The Salton Sea Scientific Drilling Program (SSSDP) was the first large-scale drilling project undertaken by the U.S. Continental Scientific Drilling Program. The objectives of the SSSDP were (1) to drill a deep well into the Salton Sea Geothermal Field in the Imperial Valley of California, (2) to retrieve a high percentage of core and cuttings along the entire depth of the well, (3) to obtain a comprehensive suite of geophysical logs, (4) to conduct flow tests at two depths (and to take fluid samples therefrom), and (5) to carry out several downhole experiments. These activities enabled the U.S. Geological Survey and cooperating agencies to study the physical and chemical processes involved in an active hydrothermal system driven by a molten-rock heat source. This program, originally conceived by Wilfred A. Elders, professor of geology at the University of California at Riverside, was coordinated under an inter-agency accord among the Geological Survey, the U.S. Department of Energy, and the National Science Foundation.

  6. Preparing for new business directions in competitive markets. The concept phase of an integrated business planning process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norton, G.; Wahlin, D.

    With deregulation, electric utilities face previously unknown threats as their franchise markets have been opened to competitors such as power marketers who want to wheel power into distribution grids, independent power producers, co-generation firms who want to take the biggest customers off the grid, and many others (including hostile utilities eyeing value in under-performing stocks). On the other hand, firms are now presented with opportunities they have not had in decades, if ever, to sell to customers in other, formerly protected franchise territories, invest in new businesses, or use internal expertise and experience to create new products and services for existing customers or entirely new customers and markets. Utilities are entering businesses as diverse as maintenance services, retail appliance stores, and telecommunications. This paper will discuss how to evaluate and plan for some of the opportunities available to the electric utilities as the result of changes in FERC, state utility regulations, and proposed legislation.

  7. Constructing compact and effective graphs for recommender systems via node and edge aggregations

    DOE PAGES

    Lee, Sangkeun; Kahng, Minsuk; Lee, Sang-goo

    2014-12-10

    Exploiting graphs for recommender systems has great potential to flexibly incorporate heterogeneous information for producing better recommendation results. As our baseline approach, we first introduce a naive graph-based recommendation method, which operates with a heterogeneous log-metadata graph constructed from user log and content metadata databases. Although the naive graph-based recommendation method is simple, it allows us to take advantage of heterogeneous information and shows promising flexibility and recommendation accuracy. However, it often leads to extensive processing time due to the sheer size of the graphs constructed from entire user log and content metadata databases. In this paper, we propose node and edge aggregation approaches to constructing compact and effective graphs called Factor-Item bipartite graphs by aggregating nodes and edges of a log-metadata graph. Furthermore, experimental results using real world datasets indicate that our approach can significantly reduce the size of graphs exploited for recommender systems without sacrificing the recommendation quality.
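
    To make the aggregation idea concrete, the sketch below collapses a toy user-log and metadata graph into a small factor-item bipartite structure and ranks items for a user through shared factors. The data, weighting scheme, and recommendation rule are illustrative assumptions, not the construction used by the authors.

```python
# Sketch: collapsing a heterogeneous user-log / metadata graph into a compact
# "Factor-Item" bipartite graph by aggregating metadata nodes (factors) and the
# edges that connect them to items. Names and structures are illustrative.
from collections import defaultdict

# (item, factor) pairs extracted from content metadata, e.g. genre or artist of a song
item_metadata = [
    ("song_a", "genre:jazz"), ("song_a", "artist:x"),
    ("song_b", "genre:jazz"), ("song_c", "artist:x"),
]
# user play log: (user, item, play_count)
play_log = [("u1", "song_a", 3), ("u1", "song_b", 1), ("u2", "song_c", 5)]

# Edge aggregation: weight of a factor-item edge = total plays of that item,
# accumulated over all users, so per-user log nodes disappear from the graph.
item_weight = defaultdict(int)
for _user, item, count in play_log:
    item_weight[item] += count

factor_item = defaultdict(dict)          # factor -> {item: weight}
for item, factor in item_metadata:
    factor_item[factor][item] = item_weight.get(item, 0)

# A naive recommendation for a user: rank items reachable through factors of items
# the user already played.
def recommend(user):
    played = {i for u, i, _ in play_log if u == user}
    scores = defaultdict(int)
    for factor, items in factor_item.items():
        if played & items.keys():
            for item, w in items.items():
                if item not in played:
                    scores[item] += w
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("u2"))   # suggests song_a via the shared factor "artist:x"
```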

  8. Labor-Management Cooperation in Illinois: How a Joint Union Company Team Is Improving Facility Safety.

    PubMed

    Mahan, Bruce; Maclin, Reggie; Ruttenberg, Ruth; Mundy, Keith; Frazee, Tom; Schwartzkopf, Randy; Morawetz, John

    2018-01-01

    This study of Afton Chemical Corporation's Sauget facility and its International Chemical Workers Union Council (ICWUC) Local 871C demonstrates how significant safety improvements can be made when committed leadership from both management and union work together, build trust, train the entire work force in U.S. Occupational Safety and Health Administration 10-hour classes, and communicate with their work force, both salaried and hourly. A key finding is that listening to the workers closest to production can lead to solutions, many of them more cost-efficient than top-down decision-making. Another is that making safety and health an authentic value is hard work, requiring time, money, and commitment. Third, union and management must both have leadership willing to take chances and learn to trust one another. Fourth, training must be for everyone and ongoing. Finally, health and safety improvements require dedicated funding. The result was resolution of more than one hundred safety concerns and an ongoing institutionalized process for continuing improvement.

  9. Apparatus to collect, classify, concentrate, and characterize gas-borne particles

    DOEpatents

    Rader, Daniel J.; Torczynski, John R.; Wally, Karl; Brockmann, John E.

    2002-01-01

    An aerosol lab-on-a-chip (ALOC) integrates one or more of a variety of aerosol collection, classification, concentration (enrichment), and characterization processes onto a single substrate or layered stack of such substrates. By taking advantage of modern micro-machining capabilities, an entire suite of discrete laboratory aerosol handling and characterization techniques can be combined in a single portable device that can provide a wealth of data on the aerosol being sampled. The ALOC offers parallel characterization techniques, and the close proximity of the various characterization modules helps ensure that the same aerosol is available to all devices (dramatically reducing sampling and transport errors). Micro-machine fabrication of the ALOC significantly reduces unit costs relative to existing technology, and enables the fabrication of small, portable ALOC devices, as well as the potential for rugged design to allow operation in harsh environments. Miniaturization also offers the potential of working with smaller particle sizes and lower pressure drops (leading to reduction of power consumption).

  10. What Fraction of Active Galaxies Actually Show Outflows?

    NASA Astrophysics Data System (ADS)

    Ganguly, Rajib; Brotherton, M. S.

    2007-12-01

    Outflows from active galactic nuclei (AGNs) seem to be common and are thought to be important from a variety of perspectives: as an agent of chemical enhancement of the interstellar and intergalactic media, as an agent of angular momentum removal from the accreting central engine, and as an agent limiting star formation in starbursting systems by blowing out gas and dust from the host galaxy. To understand these processes, we must determine what fraction of AGNs feature outflows and understand what forms they take. We examine recent surveys of outflows detected in ultraviolet absorption over the entire range of velocities and velocity widths (i.e., broad absorption lines, associated absorption lines, and high-velocity narrow absorption lines). While the fraction of specific forms of outflows depends on AGN properties, the overall fraction displaying outflows is fairly constant, approximately 60%, over many orders of magnitude in luminosity. We discuss implications of this result and ways to refine our understanding of outflows. We acknowledge support from the US National Science Foundation through grant AST 05-07781.

  11. Onshore and Offshore Outsourcing with Agility: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Kussmaul, Clifton

    This chapter reflects on a case study of an agile distributed project that ran for approximately three years (from spring 2003 to spring 2006). The project involved (a) a customer organization with key personnel distributed across the US, developing an application with rapidly changing requirements; (b) onshore consultants with expertise in project management, development processes, offshoring, and relevant technologies; and (c) an external offsite development team in a CMM-5 organization in southern India. This chapter is based on surveys and discussions with multiple participants. The several years since the project was completed allow greater perspective on both the strengths and weaknesses, since the participants can reflect on the entire life of the project and compare it to subsequent experiences. Our findings emphasize the potential for agile project management in distributed software development, and the importance of people and interactions, taking many small steps to find and correct errors, and matching the structures of the project and product to support implementation of agility.

  12. Cloud Surprises in Moving NASA EOSDIS Applications into Amazon Web Services

    NASA Technical Reports Server (NTRS)

    Mclaughlin, Brett

    2017-01-01

    NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected is a number of issues that were beyond purely technical application re-architectures. We ran into surprising network policy limitations, billing challenges in a government-based cost model, and difficulty in obtaining certificates in a NASA security-compliant manner. On the other hand, this approach has allowed us to move a number of applications from local hosting to the cloud in a matter of hours (yes, hours!!), and our CMR application now services 95% of granule searches and an astonishing 99% of all collection searches in under a second. And most surprising of all, well, you'll just have to wait and see the realization that caught our entire team off guard!

  13. METscout: a pathfinder exploring the landscape of metabolites, enzymes and transporters.

    PubMed

    Geffers, Lars; Tetzlaff, Benjamin; Cui, Xiao; Yan, Jun; Eichele, Gregor

    2013-01-01

    METscout (http://metscout.mpg.de) brings together metabolism and gene expression landscapes. It is a MySQL relational database linking biochemical pathway information with 3D patterns of gene expression determined by robotic in situ hybridization in the E14.5 mouse embryo. The sites of expression of ∼1500 metabolic enzymes and of ∼350 solute carriers (SLCs) were included and are accessible as single cell resolution images and in the form of semi-quantitative image abstractions. METscout provides several graphical web-interfaces allowing navigation through complex anatomical and metabolic information. Specifically, the database shows where in the organism each of the many metabolic reactions takes place and where SLCs transport metabolites. To link enzymatic reactions and transport, the KEGG metabolic reaction network was extended to include metabolite transport. This network, in conjunction with the spatial expression patterns of the network genes, allows for a tracing of metabolic reactions and transport processes across the entire body of the embryo.

  14. Key issues in life cycle assessment of ethanol production from lignocellulosic biomass: Challenges and perspectives.

    PubMed

    Singh, Anoop; Pant, Deepak; Korres, Nicholas E; Nizami, Abdul-Sattar; Prasad, Shiv; Murphy, Jerry D

    2010-07-01

    Progressive depletion of conventional fossil fuels with increasing energy consumption and greenhouse gas (GHG) emissions have led to a move towards renewable and sustainable energy sources. Lignocellulosic biomass is available in massive quantities and provides enormous potential for bioethanol production. However, to ascertain optimal biofuel strategies, it is necessary to take into account environmental impacts from cradle to grave. Life cycle assessment (LCA) techniques allow detailed analysis of material and energy fluxes on regional and global scales. This includes indirect inputs to the production process and associated wastes and emissions, and the downstream fate of products in the future. At the same time if not used properly, LCA can lead to incorrect and inappropriate actions on the part of industry and/or policy makers. This paper aims to list key issues for quantifying the use of resources and releases to the environment associated with the entire life cycle of lignocellulosic bioethanol production. Copyright 2009 Elsevier Ltd. All rights reserved.

  15. Drug-nutrient interactions in enteral feeding: a primary care focus.

    PubMed

    Varella, L; Jones, E; Meguid, M M

    1997-06-01

    Drug and nutrient interactions are complex and can take many forms, including malabsorption of either the drug or the nutrient component. Some drugs can stimulate or suppress appetite, whereas others can cause nausea and vomiting resulting in inadequate nutritional intake. Absorption of drugs is a complex process that can be affected by the physical characteristics of the gastrointestinal tract (GIT) as well. Depending on the physical properties of a drug, it may be absorbed in a limited area of the GIT or more diffusely along much of the entire length. Many diseases and conditions are also known to affect the GIT either directly or indirectly. Dietary factors also need to be considered when the "food" is an enteral formula. The widespread use of enteral tubes requires that consideration be given to patients receiving both enteral feedings and medication concurrently. The location of a tube in the gastrointestinal tract, as well as the problems involved in crushing and administering solid dosage forms, creates a unique set of problems.

  16. The energy landscape of adenylate kinase during catalysis

    DOE PAGES

    Kerns, S. Jordan; Agafonov, Roman V.; Cho, Young-Jin; ...

    2015-01-12

    Kinases perform phosphoryl-transfer reactions in milliseconds; without enzymes, these reactions would take about 8,000 years under physiological conditions. Despite extensive studies, a comprehensive understanding of kinase energy landscapes, including both chemical and conformational steps, is lacking. In this paper, we scrutinize the microscopic steps in the catalytic cycle of adenylate kinase, through a combination of NMR measurements during catalysis, pre-steady-state kinetics, molecular-dynamics simulations and crystallography of active complexes. We find that the Mg2+ cofactor activates two distinct molecular events: phosphoryl transfer (>10^5-fold) and lid opening (10^3-fold). In contrast, mutation of an essential active site arginine decelerates phosphoryl transfer 10^3-fold without substantially affecting lid opening. Finally, our results highlight the importance of the entire energy landscape in catalysis and suggest that adenylate kinases have evolved to activate key processes simultaneously by precise placement of a single, charged and very abundant cofactor in a preorganized active site.

  17. Safety modelling and testing of lithium-ion batteries in electrified vehicles

    NASA Astrophysics Data System (ADS)

    Deng, Jie; Bae, Chulheung; Marcicki, James; Masias, Alvaro; Miller, Theodore

    2018-04-01

    To optimize the safety of batteries, it is important to understand their behaviours when subjected to abuse conditions. Most early efforts in battery safety modelling focused on either one battery cell or a single field of interest such as mechanical or thermal failure. These efforts may not completely reflect the failure of batteries in automotive applications, where various physical processes can take place in a large number of cells simultaneously. In this Perspective, we review modelling and testing approaches for battery safety under abuse conditions. We then propose a general framework for large-scale multi-physics modelling and experimental work to address safety issues of automotive batteries in real-world applications. In particular, we consider modelling coupled mechanical, electrical, electrochemical and thermal behaviours of batteries, and explore strategies to extend simulations to the battery module and pack level. Moreover, we evaluate safety test approaches for an entire range of automotive hardware sets from cell to pack. We also discuss challenges in building this framework and directions for its future development.

  18. One Cold Fusion Speaker is One Too Many for a Future Energy Conference

    NASA Astrophysics Data System (ADS)

    Vallone, Thomas

    2001-04-01

    In 1998, a Conference on Future Energy (COFE) was scheduled to take place at the State Department Open Forum in April, 1999. Only one speaker, Ed Storms (formerly with Los Alamos Lab), was scheduled to talk about cold fusion as part of fourteen plenary lectures over a two-day period. However, the entire meeting was labeled a "cold fusion" conference by APS Spokesperson Bob Park who repeated the words four times in one 1999 What's New column. What transpired afterwards has become a part of the cold fusion suppression history, including several APS "pseudoscience" presentations mocking COFE scientists. A review of the actual COFE contents reveals the rational side of emerging energy technologies normally associated with the scientific process. The Park-related events display an opposite pattern of behavior ultimately designed to discredit the COFE organizer and deprive him of his livelihood (see APS News, March, 2000). The compiled record shows how the communication of scientific information becomes distorted by undue prejudice and unethical lobbying.

  19. Geovisualization for Smart Video Surveillance

    NASA Astrophysics Data System (ADS)

    Oves García, R.; Valentín, L.; Serrano, S. A.; Palacios-Alonso, M. A.; Sucar, L. Enrique

    2017-09-01

    Nowadays, with the emergence of smart cities and the creation of new sensors capable of connecting to the network, it is not only possible to monitor the entire infrastructure of a city, including roads, bridges, rail/subways, airports, communications, water, and power, but also to optimize its resources, plan its preventive maintenance and monitor security aspects while maximizing services for its citizens. In particular, the security aspect is one of the most important issues due to the need to ensure the safety of people. However, if we want to have a good security system, it is necessary to take into account the way that we are going to present the information. In order to show the amount of information generated by sensing devices in real time in an understandable way, several visualization techniques are proposed for both local visualization (involving individual sensing devices) and global visualization (involving the sensing devices as a whole). Taking into consideration that the information is produced and transmitted from a geographic location, the integration of a Geographic Information System to manage and visualize the behavior of the data becomes very relevant. With the purpose of facilitating the decision-making process in a security system, we have integrated the visualization techniques and the Geographic Information System to produce a smart security system, based on a cloud computing architecture, to show relevant information about a set of monitored areas with video cameras.

  20. Being an expert witness in geomorphology

    NASA Astrophysics Data System (ADS)

    Keller, Edward A.

    2015-02-01

    Gathering your own data and coming to your own conclusion through scientific research and discovery is the most important principle to remember when being an expert witness in geomorphology. You can only be questioned in deposition and trial in your area of expertise. You are qualified as an expert by education, knowledge, and experience. You will have absolutely nothing to fear from cross-examination if you are prepared and confident about your work. Being an expert witness requires good communication skills. When you make a presentation, speak clearly and avoid jargon, especially when addressing a jury. Keep in mind that when you take on a case that may eventually go to court as a lawsuit, the entire process, with appeals and so forth, can take several years. Therefore, being an expert may become a long-term commitment of your time and energy. You may be hired by either side in a dispute, but your job is the same - determine the scientific basis of the case and explain your scientific reasoning to the lawyers, the judge, and the jury. Your work, including pre-trial investigations, often determines what the case will be based on. The use of science in the discovery part of an investigation is demonstrated from a California case involving the Ventura River, where building of a flood control levee restricted flow to a narrower channel, increasing unit stream power as well as potential for bank erosion and landsliding.

  1. DeepFruits: A Fruit Detection System Using Deep Neural Networks

    PubMed Central

    Sa, Inkyu; Ge, Zongyuan; Dayoub, Feras; Upcroft, Ben; Perez, Tristan; McCool, Chris

    2016-01-01

    This paper presents a novel approach to fruit detection using deep convolutional neural networks. The aim is to build an accurate, fast and reliable fruit detection system, which is a vital element of an autonomous agricultural robotic platform; it is a key element for fruit yield estimation and automated harvesting. Recent work in deep neural networks has led to the development of a state-of-the-art object detector termed Faster Region-based CNN (Faster R-CNN). We adapt this model, through transfer learning, for the task of fruit detection using imagery obtained from two modalities: colour (RGB) and Near-Infrared (NIR). Early and late fusion methods are explored for combining the multi-modal (RGB and NIR) information. This leads to a novel multi-modal Faster R-CNN model, which achieves state-of-the-art results compared to prior work, with the F1 score (which takes into account both precision and recall performance) improving from 0.807 to 0.838 for the detection of sweet pepper. In addition to improved accuracy, this approach is also much quicker to deploy for new fruits, as it requires bounding box annotation rather than pixel-level annotation (annotating bounding boxes is approximately an order of magnitude quicker to perform). The model is retrained to perform the detection of seven fruits, with the entire process taking four hours to annotate and train the new model per fruit. PMID:27527168
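
    A common way to reproduce the transfer-learning step described above is to start from a pre-trained Faster R-CNN and replace its classification head for the new fruit classes. The sketch below uses torchvision (assumed to be version 0.13 or later, for the weights argument) and dummy RGB data only; the paper's RGB+NIR fusion and training schedule are not reproduced here.

```python
# Sketch of adapting a pre-trained Faster R-CNN detector to a new fruit class via
# transfer learning; illustrative only, not the authors' code.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

num_classes = 2  # background + one fruit class (e.g. sweet pepper)

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)  # new head

optimizer = torch.optim.SGD([p for p in model.parameters() if p.requires_grad],
                            lr=0.005, momentum=0.9, weight_decay=0.0005)

# One illustrative training step on dummy data; real training would iterate over a
# DataLoader of annotated bounding boxes for a few hours, as reported in the paper.
images = [torch.rand(3, 480, 640)]
targets = [{"boxes": torch.tensor([[100.0, 120.0, 220.0, 260.0]]),
            "labels": torch.tensor([1])}]
model.train()
loss_dict = model(images, targets)
loss = sum(loss_dict.values())
optimizer.zero_grad()
loss.backward()
optimizer.step()
print({k: round(v.item(), 3) for k, v in loss_dict.items()})
```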

  2. DeepFruits: A Fruit Detection System Using Deep Neural Networks.

    PubMed

    Sa, Inkyu; Ge, Zongyuan; Dayoub, Feras; Upcroft, Ben; Perez, Tristan; McCool, Chris

    2016-08-03

    This paper presents a novel approach to fruit detection using deep convolutional neural networks. The aim is to build an accurate, fast and reliable fruit detection system, which is a vital element of an autonomous agricultural robotic platform; it is a key element for fruit yield estimation and automated harvesting. Recent work in deep neural networks has led to the development of a state-of-the-art object detector termed Faster Region-based CNN (Faster R-CNN). We adapt this model, through transfer learning, for the task of fruit detection using imagery obtained from two modalities: colour (RGB) and Near-Infrared (NIR). Early and late fusion methods are explored for combining the multi-modal (RGB and NIR) information. This leads to a novel multi-modal Faster R-CNN model, which achieves state-of-the-art results compared to prior work, with the F1 score (which takes into account both precision and recall performance) improving from 0.807 to 0.838 for the detection of sweet pepper. In addition to improved accuracy, this approach is also much quicker to deploy for new fruits, as it requires bounding box annotation rather than pixel-level annotation (annotating bounding boxes is approximately an order of magnitude quicker to perform). The model is retrained to perform the detection of seven fruits, with the entire process taking four hours to annotate and train the new model per fruit.

  3. The Proell Effect: A Macroscopic Maxwell's Demon

    NASA Astrophysics Data System (ADS)

    Rauen, Kenneth M.

    2011-12-01

    Maxwell's Demon is a legitimate challenge to the Second Law of Thermodynamics when the "demon" is executed via the Proell effect. Thermal energy transfer according to the Kinetic Theory of Heat and Statistical Mechanics that takes place over distances greater than the mean free path of a gas circumvents the microscopic randomness that leads to macroscopic irreversibility. No information is required to sort the particles as no sorting occurs; the entire volume of gas undergoes the same transition. The Proell effect achieves quasi-spontaneous thermal separation without sorting by the perturbation of a heterogeneous constant volume system with displacement and regeneration. The classical analysis of the constant volume process, such as found in the Stirling Cycle, is incomplete and therefore incorrect. There are extra energy flows that classical thermodynamics does not recognize. When a working fluid is displaced across a regenerator with a temperature gradient in a constant volume system, complementary compression and expansion work takes place that transfers energy between the regenerator and the bulk gas volumes of the hot and cold sides of the constant volume system. Heat capacity at constant pressure applies instead of heat capacity at constant volume. The resultant increase in calculated, recyclable energy allows the Carnot Limit to be exceeded in certain cycles. Super-Carnot heat engines and heat pumps have been designed and a US patent has been awarded.

  4. Multi-Spacecraft Analysis with Generic Visualization Tools

    NASA Astrophysics Data System (ADS)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  5. Effects of Gyejibongnyeong-hwan on dysmenorrhea caused by blood stagnation: study protocol for a randomized controlled trial

    PubMed Central

    2012-01-01

    Background Gyejibongnyeong-hwan (GJBNH) is one of the most popular Korean medicine formulas for the menstrual pain of dysmenorrhea. The concept of blood stagnation in Korean medicine is considered the main factor causing abdominal pain, or cramps, during menstrual periods. To treat the symptoms, GJBNH is used to fluidify the stagnated blood and induce the blood flow to be smooth, reducing pain as the result. The purpose of this trial is to identify the efficacy of GJBNH in dysmenorrhea caused by blood stagnation. Methods This study is a multi-centre, randomised, double-blind, controlled trial with two parallel arms: the group taking GJBNH and the group taking placebo. 100 patients (women from age 18 to 35) will be enrolled in the trial. Through randomisation, 50 patients will be in the experimental arm and the other 50 patients will be in the control arm. At the second visit (baseline), all participants who were already screened as fulfilling the inclusion and exclusion criteria will be randomised into two groups. Each group will take the intervention three times per day during two menstrual cycles. After the treatment for two cycles, each patient will be followed up during their 3rd, 4th and 5th menstrual cycles. From the screening (Visit 1) through the second follow-up (Visit 6) the entire process will take 25 weeks. Discussion This trial will provide evidence for the effectiveness of GJBNH in treating periodical pain due to dysmenorrhea that is caused by blood stagnation. The primary outcome between the two groups will be measured by changes in the Visual Analogue Scale (VAS) of pain. The secondary outcome will be measured by the Blood Stagnation Scale, the Short-form McGill questionnaire and the COX menstrual symptom scale. Analysis of covariance (ANCOVA) and repeated measures ANOVA will be used to analyze the data. Trial registration Current Controlled Trials: ISRCTN30426947 PMID:22217258

  6. How long did it take for life to begin and evolve to cyanobacteria?

    NASA Technical Reports Server (NTRS)

    Lazcano, A.; Miller, S. L.

    1994-01-01

    There is convincing paleontological evidence showing that stromatolite-building phototactic prokaryotes were already in existence 3.5 x 10^9 years ago. Late accretion impacts may have killed off life on our planet as late as 3.8 x 10^9 years ago. This leaves only 300 million years to go from the prebiotic soup to the RNA world and to cyanobacteria. However, 300 million years should be more than sufficient time. All known prebiotic reactions take place in geologically rapid time scales, and very slow prebiotic reactions are not feasible because the intermediate compounds would have been destroyed due to the passage of the entire ocean through deep-sea vents every 10^7 years or in even less time. Therefore, it is likely that self-replicating systems capable of undergoing Darwinian evolution emerged in a period shorter than the destruction rates of its components (<5 million years). The time for evolution from the first DNA/protein organisms to cyanobacteria is usually thought to be very long. However, the similarities of many enzymatic reactions, together with the analysis of the available sequence data, suggest that a significant number of the components involved in basic biological processes are the result of ancient gene duplication events. Assuming that the rate of gene duplication of ancient prokaryotes was comparable to today's present values, the development of a filamentous cyanobacterial-like genome would require approximately 7 x 10^6 years--or perhaps much less. Thus, in spite of the many uncertainties involved in the estimates of time for life to arise and evolve to cyanobacteria, we see no compelling reason to assume that this process, from the beginning of the primitive soup to cyanobacteria, took more than 10 million years.

  7. Quality factors in interventional neuroradiology.

    PubMed

    Lasjaunias, P

    2001-01-01

    The interest we take in medical economics and strategy is like the one we take in politics: we may scorn politics, but it cannot be denied that it commands our entire life. For this reason, we must try to determine the conditions required to evaluate the quality of interventional neuroradiology, its operators, its practice, its advances, its teaching, and to maintain this quality. It is probably vital to the freedom of our future therapeutic decisions that we contribute effectively to this discussion before the standard is forced upon us by an exclusively economical or administrative logic. On the other hand, any advance can only be turned into progress if it is diffused and applied. There is no doubt that several levels of quality are acceptable, thus the best approach will be to look for and identify the minimum standard for quality or the limits of non-quality. We shall refrain from suggesting that the level of excellence at a given moment should be imposed upon all operators and constitute the standard level of practice. Practice is based on knowledge and competence. The most skilled surgical act cannot guarantee safe medical treatment if it is not supported by sufficient knowledge about the diseases and their symptoms. Mastery of the decision process requires a thorough vision of the therapeutic decision tree involved. Quality is a combination of global view and detailed analysis that allows a fuzzy management of the performance. Regardless of the plan chosen, open-mindedness should be kept to allow adaptation, correction or interruption of a given therapeutic process in view of unpredicted pieces of information. Such input is a predictable possibility that should be explained to the patient prior to starting the procedure. Dealing with human beings, the attitude along with the technical management will be of paramount importance in the overall quality assessment.

  8. Using Place-Based Independent Class Projects as a Means to Hone Research Skills and Prepare a Future Geospatial Workforce

    NASA Astrophysics Data System (ADS)

    Prakash, A.; Gens, R.; Cristobal, J.; Waigl, C. F.; Balazs, M. S.; Graham, P. R.; Butcher, C. E.; Sparrow, E. B.

    2015-12-01

    It is never too early to bring your own research into teaching. Considerable efforts have been made globally to introduce STEM research themes in K-12 environments. These efforts are laudable, as they help to create a STEM identity in students and get students excited to pursue higher education. The task of a post-secondary educator is to build on that excitement and ensure that the students who enter higher education come out knowledgeable, skilled, and employable. At the University of Alaska Fairbanks we have structured our geospatial curricula to include place-based, independent research projects in several semester-long classes. These class projects serve as mini capstone research experiences that take a student through the entire process of research, including: identifying a problem or need; building a hypothesis; formulating the science question; searching, acquiring, and processing data; analyzing and interpreting the research results; and presenting the outcomes in written and oral format to a peer group. Over a decade of experience has shown that students tend to engage and perform well when the research addresses an authentic problem they can relate to and take ownership of. Over 150 student-led class projects using a variety of freely available datasets have contributed not only to preparing the future workforce, but also to enhancing the research profile of UAF. We extended the same model to a summer internship program where graduate students who have gone through the experience of an in-class research project serve as mentors for undergraduate interns. Even the condensed time frame yields positive outcomes, including joint publications between faculty, staff, graduate students and undergraduate students in the peer-reviewed literature.

  9. Interactive dualism as a partial solution to the mind-brain problem for psychiatry.

    PubMed

    McLaren, N

    2006-01-01

    With the collapse of the psychoanalytic and the behaviorist models, and the failure of reductive biologism to account for mental life, psychiatry has been searching for a broad, integrative theory on which to base daily practice. The most recent attempt at such a model, Engel's 'biopsychosocial model', has been shown to be devoid of any scientific content, meaning that psychiatry, alone among the medical disciplines, has no recognised scientific basis. It is no coincidence that psychiatry is constantly under attack from all quarters. In order to develop, the discipline requires an integrative and interactive model which can take account of both the mental and the physical dimensions of human life, yet still remain within the materialist scientific ethos. This paper proposes an entirely new model of mind based in Chalmers' 'interactive dualism' which satisfies those needs. It attributes the causation of all behaviour to mental life, but proposes a split in the nature of mentality such that mind becomes a composite function with two, profoundly different aspects. Causation is assigned to a fast, inaccessible cognitive realm operating within the brain machinery while conscious experience is seen as the outcome of a higher order level of brain processing. The particular value of this model is that it immediately offers a practical solution to the mind-brain problem in that, while all information-processing takes place in the mental realm, it is not in the same order of abstraction as perception. This leads to a model of rational interaction which acknowledges both psyche and soma. It can fill the gap left by the demise of Engel's empty 'biopsychosocial model'.

  10. Neural mechanisms of human perceptual learning: electrophysiological evidence for a two-stage process.

    PubMed

    Hamamé, Carlos M; Cosmelli, Diego; Henriquez, Rodrigo; Aboitiz, Francisco

    2011-04-26

    Humans and other animals change the way they perceive the world due to experience. This process has been labeled as perceptual learning, and implies that adult nervous systems can adaptively modify the way in which they process sensory stimulation. However, the mechanisms by which the brain modifies this capacity have not been sufficiently analyzed. We studied the neural mechanisms of human perceptual learning by combining electroencephalographic (EEG) recordings of brain activity and the assessment of psychophysical performance during training in a visual search task. All participants improved their perceptual performance as reflected by an increase in sensitivity (d') and a decrease in reaction time. The EEG signal was acquired throughout the entire experiment revealing amplitude increments, specific and unspecific to the trained stimulus, in event-related potential (ERP) components N2pc and P3 respectively. P3 unspecific modification can be related to context or task-based learning, while N2pc may be reflecting a more specific attentional-related boosting of target detection. Moreover, bell and U-shaped profiles of oscillatory brain activity in gamma (30-60 Hz) and alpha (8-14 Hz) frequency bands may suggest the existence of two phases for learning acquisition, which can be understood as distinctive optimization mechanisms in stimulus processing. We conclude that there are reorganizations in several neural processes that contribute differently to perceptual learning in a visual search task. We propose an integrative model of neural activity reorganization, whereby perceptual learning takes place as a two-stage phenomenon including perceptual, attentional and contextual processes.

  11. Fins effectiveness and efficiency with position function of rhombus sectional area in unsteady condition

    NASA Astrophysics Data System (ADS)

    Nugroho, Tito Dwi; Purwadi, P. K.

    2017-01-01

    The function of a fin is to extend surfaces so that objects fitted with fins can reject heat to the surrounding environment, allowing the cooling process to take place more quickly. The purpose of this study is to calculate and determine the effect of (a) the convective heat transfer coefficient of the fluid and (b) the fin material on the fin efficiency and effectiveness under non-steady-state conditions. The studied fins are straight fins with a rhombus cross-sectional area that is a function of position x, where D1 is the short diagonal length, D2 is the long diagonal length, L is the fin length, and α is the fin tilt angle. The problem is solved by numerical computation, using an explicit finite-difference method. Initially, the fin has a uniform temperature equal to the base temperature, Ti = Tb; it is then suddenly exposed to a fluid environment at temperature T∞. The fin material is assumed to have uniform properties that do not change with temperature, and the fin does not change shape or volume during the process. The temperature of the fluid around the fin and the value of the convective heat transfer coefficient are constant, and there is no energy generation in the fin. Heat conduction in the fin takes place in only one direction, perpendicular to the fin base (the x-direction). The entire surface of the fin transfers heat to the surrounding fluid. The results show that (a) the greater the convective heat transfer coefficient h, the smaller the fin efficiency and effectiveness, and (b) under unsteady conditions, the efficiency and effectiveness are influenced by the density, specific heat, thermal conductivity, and thermal diffusivity of the fin material.
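
    The explicit finite-difference approach mentioned above can be sketched for a one-dimensional fin whose cross-sectional area and perimeter vary with position. The geometry, material properties, boundary conditions, and time step below are illustrative assumptions rather than the exact setup of the study.

```python
# Minimal sketch of an explicit finite-difference solution for unsteady 1D fin
# conduction with a position-dependent rhombus cross-section; illustrative only.
import numpy as np

L, N = 0.05, 51                 # fin length [m], number of nodes
dx = L / (N - 1)
x = np.linspace(0.0, L, N)

D1, D2 = 0.01, 0.02             # rhombus diagonals at the base [m] (assumed linear taper)
A = 0.5 * D1 * D2 * (1.0 - x / L + 1e-3)                  # cross-sectional area A(x)
P = 2.0 * np.sqrt(D1**2 + D2**2) * (1.0 - x / L + 1e-3)   # rhombus perimeter P(x)

k, rho, cp, h = 200.0, 2700.0, 900.0, 50.0   # aluminium-like properties (W/mK, kg/m3, J/kgK, W/m2K)
Tb, Tinf = 100.0, 25.0                        # base and fluid temperatures [C]
alpha = k / (rho * cp)
dt = 0.4 * dx**2 / alpha                      # time step within the explicit stability limit

T = np.full(N, Tb)                            # initial condition: fin at base temperature
for _ in range(2000):
    Tn = T.copy()
    # interior nodes: axial conduction (area variation neglected in this term for
    # simplicity) plus convection from the lateral surface
    cond = alpha * (Tn[2:] - 2.0 * Tn[1:-1] + Tn[:-2]) / dx**2
    conv = h * P[1:-1] / (rho * cp * A[1:-1]) * (Tinf - Tn[1:-1])
    T[1:-1] = Tn[1:-1] + dt * (cond + conv)
    T[0] = Tb                                  # base held at Tb
    T[-1] = T[-2]                              # insulated tip (zero gradient), an assumption

print("tip temperature after %.1f s: %.1f C" % (2000 * dt, T[-1]))
```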

  12. Flight Investigation of the Stability and Control Characteristics of a 1/4-Scale Model of a Tilt-Wing Vertical-Take-Off-and-Landing Aircraft

    NASA Technical Reports Server (NTRS)

    Tosti, Louis P.

    1959-01-01

    An experimental investigation has been conducted to determine the dynamic stability and control characteristics of a tilt-wing vertical-take-off-and-landing aircraft with the use of a remotely controlled 1/4-scale free-flight model. The model had two propellers with hinged (flapping) blades mounted on the wing which could be tilted up to an incidence angle of nearly 90 deg for vertical take-off and landing. The investigation consisted of hovering flights in still air, vertical take-offs and landings, and slow constant-altitude transitions from hovering to forward flight. The stability and control characteristics of the model were generally satisfactory except for the following characteristics. In hovering flight, the model had an unstable pitching oscillation of relatively long period which the pilots were able to control without artificial stabilization but which could not be considered entirely satisfactory. At very low speeds and angles of wing incidence on the order of 70 deg, the model experienced large nose-up pitching moments which severely limited the allowable center-of-gravity range.

  13. Active System for Electromagnetic Perturbation Monitoring in Vehicles

    NASA Astrophysics Data System (ADS)

    Matoi, Adrian Marian; Helerea, Elena

    Nowadays the electromagnetic environment is rapidly expanding in the frequency domain, and wireless services extend in terms of covered area. European electromagnetic compatibility regulations refer to limit values regarding emissions, as well as procedures for determining the susceptibility of the vehicle. The approval procedure for a series of cars is based on determining the emission/immunity levels for a few vehicles picked randomly from the entire series, assuming that the entire vehicle series is compliant. During immunity assessment, the vehicle is not subjected to real perturbation sources but exposed to electric/magnetic fields generated by laboratory equipment. Since the current approach only partially takes into account the real situation regarding perturbation sources, this paper proposes an active system for determining the electromagnetic parameters of a vehicle's environment that implements a logical diagram for measurement, satisfying the imposed requirements. This new and original solution is useful for EMC assessment of hybrid and electric vehicles.

  14. Autoregressive-model-based missing value estimation for DNA microarray time series data.

    PubMed

    Choong, Miew Keen; Charbit, Maurice; Yan, Hong

    2009-01-01

    Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations. Most existing algorithms are not able to deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic property of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective for the situation where a particular time point contains many missing values or where the entire time point is missing. Experiment results suggest that our proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
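
    The core idea of autoregressive imputation can be conveyed with a single toy time series: fit AR coefficients to the observed points and predict the missing one from its predecessors. ARLSimpute itself operates on many similar genes jointly; the order, data, and fitting details below are assumptions made only for illustration.

```python
# Simplified illustration of autoregressive imputation for a time series with a
# missing time point. This single-series sketch only conveys the AR idea; it is not
# the ARLSimpute algorithm, which also exploits local similarity across genes.
import numpy as np

def fit_ar(series, order):
    """Least-squares fit of AR(order) coefficients to a fully observed series."""
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def impute(series, missing_idx, order=2):
    """Predict one missing point from the preceding observed points."""
    observed = np.delete(series, missing_idx)   # fit on the remaining points (approximation)
    coeffs = fit_ar(observed, order)
    window = series[missing_idx - order:missing_idx]
    return float(window @ coeffs)

rng = np.random.default_rng(0)
t = np.arange(20, dtype=float)
expr = np.sin(0.5 * t) + 0.05 * rng.standard_normal(20)   # toy expression profile
estimate = impute(expr, missing_idx=10)
print("true %.3f, imputed %.3f" % (expr[10], estimate))
```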

  15. Adaptive Neuro-Fuzzy Modeling of UH-60A Pilot Vibration

    NASA Technical Reports Server (NTRS)

    Kottapalli, Sesi; Malki, Heidar A.; Langari, Reza

    2003-01-01

    Adaptive neuro-fuzzy relationships have been developed to model the UH-60A Black Hawk pilot floor vertical vibration. A 200 point database that approximates the entire UH-60A helicopter flight envelope is used for training and testing purposes. The NASA/Army Airloads Program flight test database was the source of the 200 point database. The present study is conducted in two parts. The first part involves level flight conditions and the second part involves the entire (200 point) database including maneuver conditions. The results show that a neuro-fuzzy model can successfully predict the pilot vibration. Also, it is found that the training phase of this neuro-fuzzy model takes only two or three iterations to converge for most cases. Thus, the proposed approach produces a potentially viable model for real-time implementation.

  16. Fort Riley Building Inventory and Evaluation, 1964-1974: Volume 1 of 2

    DTIC Science & Technology

    2017-08-07

    [Building inventory excerpt] Clubhouse (Sports Pro Shop), 5315, 1965; Chapel, 5320, 1965; Exchange Service Station (Exchange Auto Service), 7305, 1966; Special Weapons Training ... "... recognition as an important base of advanced military training. The schools offered theory and practical instruction in drill and firing practice, stable management, and horse training. Entire units, not individual men, were sent to Fort Riley to take part in the instruction the schools offered."

  17. Chinese Military Reforms: A Pessimistic Take

    DTIC Science & Technology

    2016-10-01

    [Report excerpt] ... organizational structure of the People's Liberation Army (PLA). One change has been the dismantling of the four "general departments" that formerly served ... the PLA Army (PLAA) and as a joint staff for the entire military. Most joint staff-type functions have been moved to the Central Military Commission (CMC), while a separate PLAA headquarters has been created, comparable to the headquarters of the PLA Navy, Air Force, and Rocket Force.

  18. New marketing strategy has CA hospital saying 'in with the old and out with the new'.

    PubMed

    1999-03-01

    Redesigning the marketing department can offer a sweet surprise of savings. Long Beach (CA) Memorial Hospital revamped its marketing operations as an afterthought to a hospitalwide reengineering effort. Little did they know they could slash close to $1 million out of their advertising and marketing budget and redesign job functions to meet the changes taking shape within the entire facility. Learn their cost-saving secrets.

  19. Conference summary: the Bologna-M16 Questions

    NASA Astrophysics Data System (ADS)

    Davies, M. B.

    Rather than attempt to summarise an entire week of excellent talks, I will instead take the material covered in this meeting as a starting point and from it produce a list of questions which cover a number of outstanding questions within the field of stellar cluster formation and evolution. I have five questions in total. Given the location (Bologna) and nature (Modest-16) of the meeting, I label my questions the Bologna-M16 Questions.

  20. A Multi-Strategy Gaming Environment.

    DTIC Science & Technology

    1982-03-01

    [Report excerpt, partially garbled by OCR] ... themselves to the game environment have been incorporated in various machine players. These, however, should not be considered as competitive ... could take only three "actions", each a strategy for the entire game. Subsequent Bayesian players were extensions of MSI: CALLER2 also observed which ... to test the Zadeh function against other learning strategies. ... Figure 4 shows the change in purse size vs. the game ...

  1. Expectation, information processing, and subjective duration.

    PubMed

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

    In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing-whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  2. Measuring the Autocorrelation Function of Nanoscale Three-Dimensional Density Distribution in Individual Cells Using Scanning Transmission Electron Microscopy, Atomic Force Microscopy, and a New Deconvolution Algorithm.

    PubMed

    Li, Yue; Zhang, Di; Capoglu, Ilker; Hujsak, Karl A; Damania, Dhwanil; Cherkezyan, Lusik; Roth, Eric; Bleher, Reiner; Wu, Jinsong S; Subramanian, Hariharan; Dravid, Vinayak P; Backman, Vadim

    2017-06-01

    Essentially all biological processes are highly dependent on the nanoscale architecture of the cellular components where these processes take place. Statistical measures, such as the autocorrelation function (ACF) of the three-dimensional (3D) mass-density distribution, are widely used to characterize cellular nanostructure. However, conventional methods of reconstruction of the deterministic 3D mass-density distribution, from which these statistical measures can be calculated, have been inadequate for thick biological structures, such as whole cells, due to the conflict between the need for nanoscale resolution and its inverse relationship with thickness after conventional tomographic reconstruction. To tackle the problem, we have developed a robust method to calculate the ACF of the 3D mass-density distribution without tomography. Assuming the biological mass distribution is isotropic, our method allows for accurate statistical characterization of the 3D mass-density distribution by ACF with two data sets: a single projection image by scanning transmission electron microscopy and a thickness map by atomic force microscopy. Here we present validation of the ACF reconstruction algorithm, as well as its application to calculate the statistics of the 3D distribution of mass-density in a region containing the nucleus of an entire mammalian cell. This method may provide important insights into architectural changes that accompany cellular processes.
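
    As a generic illustration of the ACF statistic itself (not the paper's STEM/AFM deconvolution algorithm), the sketch below computes the autocorrelation of a synthetic 3D density field via the Wiener-Khinchin relation and radially averages it under an isotropy assumption.

```python
# Generic illustration of the autocorrelation function (ACF) used to characterize a
# mass-density distribution, computed as the inverse FFT of the power spectrum.
# The random field below is synthetic; this is not the paper's reconstruction method.
import numpy as np

rng = np.random.default_rng(1)
density = rng.standard_normal((64, 64, 64))              # synthetic 3D density fluctuations
density -= density.mean()

power = np.abs(np.fft.fftn(density)) ** 2
acf = np.real(np.fft.ifftn(power)) / density.size        # so that acf[0,0,0] is the variance
acf /= acf[0, 0, 0]                                       # normalize to 1 at zero lag

# Radially average assuming statistical isotropy, so the ACF depends only on |r|.
idx = np.indices(density.shape)
r = np.sqrt(sum(np.minimum(i, s - i) ** 2 for i, s in zip(idx, density.shape)))
bins = np.arange(0, 32)
radial_acf = [acf[(r >= b) & (r < b + 1)].mean() for b in bins]
print("ACF at lag 0, 1, 2:", [round(v, 3) for v in radial_acf[:3]])
```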

  3. LHCb Online event processing and filtering

    NASA Astrophysics Data System (ADS)

    Alessio, F.; Barandela, C.; Brarda, L.; Frank, M.; Franek, B.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Köstner, S.; Moine, G.; Neufeld, N.; Somogyi, P.; Stoica, R.; Suman, S.

    2008-07-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards, these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel, files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles, this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed.

  4. Replacement processes in crystalline rocks

    NASA Astrophysics Data System (ADS)

    John, Timm; Putnis, Andrew

    2010-05-01

    A substantial question in metamorphism is what is the mechanism that dominates the conversion of one mineral assemblage to another in response to a change in the ambient physical and/or chemical conditions. Petrological, microstructural, and isotopic data indicate that aqueous fluids must be involved even in the reequilibration of large-scale systems. Fluid-mineral reactions take place by dissolution - precipitation processes, but converting one solid rock to another requires pervasive, either dominantly advective or diffusive fluid-mediated transport through the entire rock. The generation of reaction-induced porosity and the spatial and temporal coupling of dissolution and precipitation can account for fluid and element transport through rocks and the replacement of one mineral assemblage by another. To determine the mechanism of metamorphic reactions we refer to examples of interfaces and reaction textures which contain both the "before" (precursor) and "after" mineral assemblages - case studies where the process of conversion is frozen in. We will illustrate some aspects of the role of fluids in metamorphic reactions and discuss how reactive fluids can pervasively infiltrate a rock. The examples we will use are focussed on crystalline rocks and include reactions from the lower continental crust, the subducting oceanic crust, and the continental upper crust to show that except at very high-temperature conditions, essentially the same mechanisms are responsible for converting rocks to thermodynamically more stable mineral assemblages for given Pressure-Temperature-fluid composition (P-T-X) conditions.

  5. Tracer-based characterization of hyporheic exchange and benthic biolayers in streams

    NASA Astrophysics Data System (ADS)

    Knapp, Julia L. A.; González-Pinzón, Ricardo; Drummond, Jennifer D.; Larsen, Laurel G.; Cirpka, Olaf A.; Harvey, Judson W.

    2017-02-01

    Shallow benthic biolayers at the top of the streambed are believed to be places of enhanced biogeochemical turnover within the hyporheic zone. They can be investigated by reactive stream tracer tests with tracer recordings in the streambed and in the stream channel. Common in-stream measurements of such reactive tracers cannot localize where the processing primarily takes place, whereas isolated vertical depth profiles of solutes within the hyporheic zone are usually not representative of the entire stream. We present results of a tracer test where we injected the conservative tracer bromide together with the reactive tracer resazurin into a third-order stream and combined the recording of in-stream breakthrough curves with multidepth sampling of the hyporheic zone at several locations. The transformation of resazurin was used as an indicator of metabolism, and high-reactivity zones were identified from depth profiles. The results from our subsurface analysis indicate that the potential for tracer transformation (i.e., the reaction rate constant) varied with depth in the hyporheic zone. This highlights the importance of the benthic biolayer, which we found to be on average 2 cm thick in this study, ranging from one third to one half of the full depth of the hyporheic zone. The reach-scale approach integrated the effects of processes along the reach length, isolating hyporheic processes relevant for whole-stream chemistry and estimating effective reaction rates.

  6. The contamination of scientific literature: looking for an antidote

    NASA Astrophysics Data System (ADS)

    Liotta, Marcello

    2017-04-01

    Science may have very strong implications for society. Knowledge of the processes occurring around society provides a good basis for taking responsible decisions. This is particularly true in the field of geosciences. Earthquakes, volcanic eruptions, landslides, climate changes and many other natural phenomena still need to be further investigated. The role of the scientific community is to increase this knowledge. Each member can share his or her own ideas and data, thus allowing the entire scientific community to receive a valuable contribution. Such contributions often derive from research activities, which are expensive in terms of time and resources. Nowadays the sharing of scientific results occurs through publication in scientific journals. Reading the available scientific literature thus represents a unique opportunity to define the state of the art on a specific topic and to address research activities towards something new. When published results are obtained through a rigorous scientific process, they constitute a solid background to which each member can add ideas and evidence. By contrast, published results may be affected by scientific misconduct; such results constitute a labyrinth in which scientists lose time in the attempt to truly understand the natural processes. The normal scientific dialectic should unmask such results, thus avoiding literature contamination and making the scientific framework more stimulating. The scientific community should look for the best practice to reduce the risk of literature contamination.

  7. Infinite Systems of Interacting Chains with Memory of Variable Length—A Stochastic Model for Biological Neural Nets

    NASA Astrophysics Data System (ADS)

    Galves, A.; Löcherbach, E.

    2013-06-01

    We consider a new class of non-Markovian processes with a countable number of interacting components. At each time unit, each component can take two values, indicating whether or not it has a spike at that precise moment. The system evolves as follows. For each component, the probability of having a spike at the next time unit depends on the entire time evolution of the system after the last spike time of the component. This class of systems extends in a non-trivial way both the interacting particle systems, which are Markovian (Spitzer in Adv. Math. 5:246-290, 1970), and the stochastic chains with memory of variable length, which have finite state space (Rissanen in IEEE Trans. Inf. Theory 29(5):656-664, 1983). These features make it suitable to describe the time evolution of biological neural systems. We construct a stationary version of the process by using a probabilistic tool which is a Kalikow-type decomposition either in random environment or in space-time. This construction implies uniqueness of the stationary process. Finally we consider the case where the interactions between components are given by a critical directed Erdös-Rényi-type random graph with a large but finite number of components. In this framework we obtain an explicit upper bound for the correlation between successive inter-spike intervals which is compatible with previous empirical findings.
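
    For readers unfamiliar with this model class, the toy simulation below sketches the general idea for a small, finite network: each component's spiking probability at the next time step depends on the input accumulated since its own last spike, after which its memory is reset. The spiking function, synaptic weights, and network size are illustrative assumptions, not the construction or the random-graph interaction used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_steps = 20, 1000

# sparse random synaptic weights of a directed graph (illustrative choice)
mask = rng.random((n_neurons, n_neurons)) < 0.2
w = rng.normal(0.0, 0.4, size=(n_neurons, n_neurons)) * mask

def phi(u):
    """Spiking probability as a function of accumulated input (illustrative)."""
    return 1.0 / (1.0 + np.exp(-(u - 1.0)))

potential = np.zeros(n_neurons)            # input accumulated since each last spike
spikes = np.zeros((n_steps, n_neurons), dtype=int)

for t in range(1, n_steps):
    p = phi(potential)                     # depends only on history since last spike
    spikes[t] = rng.random(n_neurons) < p
    potential += w @ spikes[t]             # integrate presynaptic spikes
    potential[spikes[t] == 1] = 0.0        # memory of variable length: reset at spikes

# successive inter-spike intervals, e.g. to inspect their correlation
isi = [np.diff(np.flatnonzero(spikes[:, i])) for i in range(n_neurons)]
```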

  8. Measuring the Autocorrelation Function of Nanoscale Three-Dimensional Density Distribution in Individual Cells Using Scanning Transmission Electron Microscopy, Atomic Force Microscopy, and a New Deconvolution Algorithm

    PubMed Central

    Li, Yue; Zhang, Di; Capoglu, Ilker; Hujsak, Karl A.; Damania, Dhwanil; Cherkezyan, Lusik; Roth, Eric; Bleher, Reiner; Wu, Jinsong S.; Subramanian, Hariharan; Dravid, Vinayak P.; Backman, Vadim

    2018-01-01

    Essentially all biological processes are highly dependent on the nanoscale architecture of the cellular components where these processes take place. Statistical measures, such as the autocorrelation function (ACF) of the three-dimensional (3D) mass–density distribution, are widely used to characterize cellular nanostructure. However, conventional methods of reconstruction of the deterministic 3D mass–density distribution, from which these statistical measures can be calculated, have been inadequate for thick biological structures, such as whole cells, due to the conflict between the need for nanoscale resolution and its inverse relationship with thickness after conventional tomographic reconstruction. To tackle the problem, we have developed a robust method to calculate the ACF of the 3D mass–density distribution without tomography. Assuming the biological mass distribution is isotropic, our method allows for accurate statistical characterization of the 3D mass–density distribution by ACF with two data sets: a single projection image by scanning transmission electron microscopy and a thickness map by atomic force microscopy. Here we present validation of the ACF reconstruction algorithm, as well as its application to calculate the statistics of the 3D distribution of mass–density in a region containing the nucleus of an entire mammalian cell. This method may provide important insights into architectural changes that accompany cellular processes. PMID:28416035

  9. Tracer-based characterization of hyporheic exchange and benthic biolayers in streams

    USGS Publications Warehouse

    Knapp, Julia L.A.; González-Pinzón, Ricardo; Drummond, Jennifer D.; Larsen, Laurel G.; Cirpka, Olaf A.; Harvey, Judson W.

    2017-01-01

    Shallow benthic biolayers at the top of the streambed are believed to be places of enhanced biogeochemical turnover within the hyporheic zone. They can be investigated by reactive stream tracer tests with tracer recordings in the streambed and in the stream channel. Common in-stream measurements of such reactive tracers cannot localize where the processing primarily takes place, whereas isolated vertical depth profiles of solutes within the hyporheic zone are usually not representative of the entire stream. We present results of a tracer test where we injected the conservative tracer bromide together with the reactive tracer resazurin into a third-order stream and combined the recording of in-stream breakthrough curves with multidepth sampling of the hyporheic zone at several locations. The transformation of resazurin was used as an indicator of metabolism, and high-reactivity zones were identified from depth profiles. The results from our subsurface analysis indicate that the potential for tracer transformation (i.e., the reaction rate constant) varied with depth in the hyporheic zone. This highlights the importance of the benthic biolayer, which we found to be on average 2 cm thick in this study, ranging from one third to one half of the full depth of the hyporheic zone. The reach-scale approach integrated the effects of processes along the reach length, isolating hyporheic processes relevant for whole-stream chemistry and estimating effective reaction rates.

  10. A quantitative risk assessment of exposure to adventitious agents in a cell culture-derived subunit influenza vaccine.

    PubMed

    Gregersen, Jens-Peter

    2008-06-19

    A risk-assessment model has demonstrated the ability of a new cell culture-based vaccine manufacturing process to reduce the level of any adventitious agent to a million-fold below infectious levels. The cell culture-derived subunit influenza vaccine (OPTAFLU, Novartis Vaccines and Diagnostics) is produced using Madin-Darby canine kidney (MDCK) cells to propagate seasonal viral strains, as an alternative to embryonated chicken eggs. As only a limited range of mammalian viruses can grow in MDCK cells, similar to embryonated eggs, MDCK cells can act as an effective filter for a wide range of adventitious agents that might be introduced during vaccine production. However, the introduction of an alternative cell substrate (for example, MDCK cells) into a vaccine manufacturing process requires thorough investigations to assess the potential for adventitious agent risk in the final product, in the unlikely event that contamination should occur. The risk assessment takes into account the entire manufacturing process, from initial influenza virus isolation through to blending of the trivalent subunit vaccine, and worst-case residual titres for the final vaccine formulation have been calculated for >20 viruses or virus families. Maximum residual titres for all viruses tested were in the range of 10^-6 to 10^-16 infectious units per vaccine dose. Thus, the new cell culture-based vaccine manufacturing process can reduce any adventitious agent to a level that is unable to cause infection.
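
    The headline figures in such an assessment follow from combining step-wise clearance factors multiplicatively, i.e. additively in log10 units. The toy calculation below illustrates only that bookkeeping; the starting titre and the per-step log-reduction values are hypothetical placeholders, not numbers from the cited assessment.

```python
import math

# hypothetical worst-case titre introduced with the starting material (log10 units/dose)
start_titre_log10 = 6.0

# hypothetical log10 reduction factors attributed to individual process steps
log10_reductions = {
    "growth restriction by the cell substrate": 4.0,
    "inactivation / splitting step":            3.0,
    "purification step":                        2.5,
    "dilution into the final blend":            1.5,
}

residual_log10 = start_titre_log10 - sum(log10_reductions.values())
print(f"worst-case residual titre: 10^{residual_log10:.1f} infectious units per dose")
print(f"equivalently {math.pow(10.0, residual_log10):.1e} infectious units per dose")
```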

  11. RiboGalaxy: A browser based platform for the alignment, analysis and visualization of ribosome profiling data

    PubMed Central

    Michel, Audrey M.; Mullan, James P. A.; Velayudhan, Vimalkumar; O'Connor, Patrick B. F.; Donohue, Claire A.; Baranov, Pavel V.

    2016-01-01

    Ribosome profiling (ribo-seq) is a technique that uses high-throughput sequencing to reveal the exact locations and densities of translating ribosomes at the entire transcriptome level. The technique has become very popular since its inception in 2009. Yet experimentalists who generate ribo-seq data often have to rely on bioinformaticians to process and analyze their data. We present RiboGalaxy (http://ribogalaxy.ucc.ie), a freely available Galaxy-based web server for processing and analyzing ribosome profiling data with the visualization functionality provided by GWIPS-viz (http://gwips.ucc.ie). RiboGalaxy offers researchers a suite of tools specifically tailored for processing ribo-seq and corresponding mRNA-seq data. Researchers can take advantage of the published workflows which reduce the multi-step alignment process to a minimum of inputs from the user. Users can then explore their own aligned data as custom tracks in GWIPS-viz and compare their ribosome profiles to existing ribo-seq tracks from published studies. In addition, users can assess the quality of their ribo-seq data, determine the strength of the triplet periodicity signal, generate meta-gene ribosome profiles as well as analyze the relative impact of mRNA sequence features on local read density. RiboGalaxy is accompanied by extensive documentation and tips for helping users. In addition we provide a forum (http://gwips.ucc.ie/Forum) where we encourage users to post their questions and feedback to improve the overall RiboGalaxy service. PMID:26821742

  12. Hybrid neuro-heuristic methodology for simulation and control of dynamic systems over time interval.

    PubMed

    Woźniak, Marcin; Połap, Dawid

    2017-09-01

    Simulation and positioning are very important aspects of computer-aided engineering. For both, we can apply traditional methods or intelligent techniques; the difference between them lies in the way they process information. In the first case, to simulate an object in a particular state of action, we need to run the entire process in order to read the parameter values. This is not very convenient for objects whose simulation takes a long time, i.e. when the mathematical calculations are complicated. In the second case, an intelligent solution can efficiently support a dedicated way of simulating, which enables us to simulate the object only in the situations necessary for the development process. We present research results on an intelligent simulation and control model of an electric-drive vehicle. For a dedicated simulation method based on intelligent computation, in which an evolutionary strategy simulates the states of the dynamic model, an intelligent system based on a dedicated neural network is introduced to control co-working modules over the motion time interval. The presented experimental results show the implemented solution in a situation where a vehicle transports goods over an area with many obstacles, which provokes sudden changes in stability that may lead to destruction of the load. The applied neural network controller prevents destruction of the load by adjusting characteristics such as pressure, acceleration, and stiffness voltage to absorb adverse changes in the ground. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

    According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, the use of quantitative microbiological risk assessment is an appealing approach to link new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of a primo-contamination event of the milk, the fresh cheese or the process environment is simulated over time, space, and between products, accounting for the impact of management options, such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for the improvement and validation of the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
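
    As a companion to this record, the snippet below sketches only the final stage of such a quantitative risk assessment: converting a simulated distribution of doses at the moment of consumption into a probability of illness with an exponential dose-response model, risk = 1 - exp(-r * dose). Both the dose distribution and the r parameter are placeholders chosen for illustration, not values from the cited model.

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical distribution of L. monocytogenes doses at consumption (CFU per serving)
doses = rng.lognormal(mean=2.0, sigma=2.5, size=100_000)

r = 1e-10                                  # illustrative dose-response parameter
p_ill = 1.0 - np.exp(-r * doses)           # probability of illness per serving

print(f"mean risk per serving: {p_ill.mean():.2e}")
print(f"expected cases per million servings: {1e6 * p_ill.mean():.2f}")
```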

  14. Solid Waste Management in Greater Shillong Planning Area (GSPA) Using Spatial Multi-Criteria Decision Analysis for Site Suitability Assessment

    NASA Astrophysics Data System (ADS)

    Mipun, B. S.; Hazarika, R.; Mondal, M.; Mukhopadhyay, S.

    2015-04-01

    In Shillong city, the existing solid waste management system relies mainly on mobile waste bins (72%). About 12 percent of households burn the waste they generate, door-to-door collection serves about 5 percent, over 2 percent throw their waste into open spaces, and another 9 percent use waste bins located in the neighbourhood. The local headman handles about half of the households' waste, while the Municipality handles about 34 percent of households; about 10 percent of households are unaware of collection and disposal arrangements, and some NGOs handle about 5 percent of households' waste. Awareness of segregating waste into organic and non-biodegradable fractions stands at 64 percent, and a significant number of households do segregate. Within the Shillong Municipality Board (SMB) area, 45.91% (78.42 MT) of waste is collected; outside the SMB area the collection rate is 32.61% (45.99 MT); and across the entire GSPA the proportion of garbage collected is 41 percent. The only dumping ground in the GSPA is at Marten, Mawiong, and its capacity to hold garbage is decreasing due to the limited landfill area; the sanitary landfill site of 5.0 acres is not enough to meet the demand. Of the total GSPA area of 170.69 sq. km, only 25.67% is most suitable and 18.58% is unsuitable for setting up a new landfill. The eastern part of the GSPA is most suitable, fulfilling all of the criteria adopted in this study: land cover (vacant space), slope (<15%), proximity to road (400-800 m), distance from river (>2000 m) and elevation (1300-1500 m). The eastern part of the GSPA is therefore the most suitable landfill location.

  15. AVHRR channel selection for land cover classification

    USGS Publications Warehouse

    Maxwell, S.K.; Hoffer, R.M.; Chapman, P.L.

    2002-01-01

    Mapping land cover of large regions often requires processing of satellite images collected from several time periods at many spectral wavelength channels. However, manipulating and processing large amounts of image data increases the complexity and time, and hence the cost, that it takes to produce a land cover map. Very few studies have evaluated the importance of individual Advanced Very High Resolution Radiometer (AVHRR) channels for discriminating cover types, especially the thermal channels (channels 3, 4 and 5). Studies rarely perform a multi-year analysis to determine the impact of inter-annual variability on the classification results. We evaluated 5 years of AVHRR data using combinations of the original AVHRR spectral channels (1-5) to determine which channels are most important for cover type discrimination, yet stabilize inter-annual variability. Particular attention was placed on the channels in the thermal portion of the spectrum. Fourteen cover types over the entire state of Colorado were evaluated using a supervised classification approach on all two-, three-, four- and five-channel combinations for seven AVHRR biweekly composite datasets covering the entire growing season for each of 5 years. Results show that all three of the major portions of the electromagnetic spectrum represented by the AVHRR sensor are required to discriminate cover types effectively and stabilize inter-annual variability. Of the two-channel combinations, channels 1 (red visible) and 2 (near-infrared) had, by far, the highest average overall accuracy (72.2%), yet the inter-annual classification accuracies were highly variable. Including a thermal channel (channel 4) significantly increased the average overall classification accuracy by 5.5% and stabilized interannual variability. Each of the thermal channels gave similar classification accuracies; however, because of the problems in consistently interpreting channel 3 data, either channel 4 or 5 was found to be a more appropriate choice. Substituting the thermal channel with a single elevation layer resulted in equivalent classification accuracies and inter-annual variability.
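
    The exhaustive channel-combination evaluation described above is straightforward to reproduce in outline. The sketch below loops over every two- to five-channel subset and scores a generic supervised classifier by cross-validation; the random placeholder data, the choice of classifier, and the scoring scheme are stand-ins rather than the procedure used in the original study.

```python
from itertools import combinations

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# placeholder data: rows = pixels, columns = AVHRR channels 1-5, y = cover class
rng = np.random.default_rng(0)
X = rng.random((2000, 5))
y = rng.integers(0, 14, size=2000)          # 14 cover types, as in the study

results = {}
for k in range(2, 6):                       # all 2-, 3-, 4- and 5-channel subsets
    for subset in combinations(range(5), k):
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        results[subset] = cross_val_score(clf, X[:, subset], y, cv=5).mean()

best = max(results, key=results.get)
print("best channel combination (0-indexed):", best, "accuracy:", round(results[best], 3))
```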

  16. The Defense Life Cycle Management System as a Working Model for Academic Application

    ERIC Educational Resources Information Center

    Burian, Philip E.; Keffel, Leslie M.; Maffei, Francis R., III

    2011-01-01

    Performing the review and assessment of masters' level degree programs can be an overwhelming and challenging endeavor. Getting organized and mapping out the entire review and assessment process can be extremely helpful and more importantly provide a path for successfully accomplishing the review and assessment of the entire program. This paper…

  17. Some Memories Are Odder than Others: Judgments of Episodic Oddity Violate Known Decision Rules

    ERIC Educational Resources Information Center

    O'Connor, Akira R.; Guhl, Emily N.; Cox, Justin C.; Dobbins, Ian G.

    2011-01-01

    Current decision models of recognition memory are based almost entirely on one paradigm, single item old/new judgments accompanied by confidence ratings. This task results in receiver operating characteristics (ROCs) that are well fit by both signal-detection and dual-process models. Here we examine an entirely new recognition task, the judgment…

  18. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

    Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen an impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world with which researchers could interact. There are several limitations to the purely VR or AR application when taken within the context of remote planetary exploration. For example, in a purely VR environment, contents of the planet surface (e.g. rocks, terrain, or other features) should be created off-line from a multitude of images using image processing techniques to generate 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real-time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information, i.e., depth information. In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique will blend the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real-time into the virtual environment. Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.

  19. Unifying model of shoot gravitropism reveals proprioception as a central feature of posture control in plants

    PubMed Central

    Bastien, Renaud; Bohr, Tomas; Moulia, Bruno; Douady, Stéphane

    2013-01-01

    Gravitropism, the slow reorientation of plant growth in response to gravity, is a key determinant of the form and posture of land plants. Shoot gravitropism is triggered when statocysts sense the local angle of the growing organ relative to the gravitational field. Lateral transport of the hormone auxin to the lower side is then enhanced, resulting in differential gene expression and cell elongation causing the organ to bend. However, little is known about the dynamics, regulation, and diversity of the entire bending and straightening process. Here, we modeled the bending and straightening of a rod-like organ and compared it with the gravitropism kinematics of different organs from 11 angiosperms. We show that gravitropic straightening shares common traits across species, organs, and orders of magnitude. The minimal dynamic model accounting for these traits is not the widely cited gravisensing law but one that also takes into account the sensing of local curvature, what we describe here as a graviproprioceptive law. In our model, the entire dynamics of the bending/straightening response is described by a single dimensionless “bending number” B that reflects the ratio between graviceptive and proprioceptive sensitivities. The parameter B defines both the final shape of the organ at equilibrium and the timing of curving and straightening. B can be estimated from simple experiments, and the model can then explain most of the diversity observed in experiments. Proprioceptive sensing is thus as important as gravisensing in gravitropic control, and the B ratio can be measured as phenotype in genetic studies. PMID:23236182
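
    To make the role of the bending number more concrete, the sketch below numerically integrates a clamped, rod-like organ whose curvature evolves under a combined graviceptive and proprioceptive law of the general form dC/dt = -beta*A - gamma*C, with B = beta*L/gamma. This is a schematic reading of the model class described in the abstract; the exact formulation, sign conventions, growth terms, and boundary conditions of the published model may differ.

```python
import numpy as np

def straightening(B=5.0, n=200, steps=4000, dt=1e-3):
    """Toy graviproprioceptive relaxation of a tilted rod (length L = 1,
    time in units of the proprioceptive time constant 1/gamma)."""
    s = np.linspace(0.0, 1.0, n)               # arc length from base to tip
    ds = s[1] - s[0]
    curvature = np.zeros(n)
    base_tilt = np.pi / 2                       # initial inclination from vertical

    for _ in range(steps):
        # local inclination = base tilt + integrated curvature from base to s
        angle = base_tilt + np.cumsum(curvature) * ds
        # graviceptive drive (-B * angle) plus proprioceptive damping (-curvature)
        curvature += dt * (-B * angle - curvature)

    angle = base_tilt + np.cumsum(curvature) * ds
    return s, angle

s, angle = straightening(B=5.0)
print("tip inclination from vertical after relaxation:", round(float(angle[-1]), 3))
```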

  20. A novel quantitative model of cell cycle progression based on cyclin-dependent kinases activity and population balances.

    PubMed

    Pisu, Massimo; Concas, Alessandro; Cao, Giacomo

    2015-04-01

    Cell cycle regulates proliferative cell capacity under normal or pathologic conditions, and in general it governs all in vivo/in vitro cell growth and proliferation processes. Mathematical simulation by means of reliable and predictive models represents an important tool to interpret experiment results, to facilitate the definition of the optimal operating conditions for in vitro cultivation, or to predict the effect of a specific drug in normal/pathologic mammalian cells. Along these lines, a novel model of cell cycle progression is proposed in this work. Specifically, it is based on a population balance (PB) approach that allows one to quantitatively describe cell cycle progression through the different phases experienced by each cell of the entire population during its own life. The transition between two consecutive cell cycle phases is simulated by taking advantage of the biochemical kinetic model developed by Gérard and Goldbeter (2009) which involves cyclin-dependent kinases (CDKs) whose regulation is achieved through a variety of mechanisms that include association with cyclins and protein inhibitors, phosphorylation-dephosphorylation, and cyclin synthesis or degradation. This biochemical model properly describes the entire cell cycle of mammalian cells by maintaining a sufficient level of detail useful to identify check point for transition and to estimate phase duration required by PB. Specific examples are discussed to illustrate the ability of the proposed model to simulate the effect of drugs for in vitro trials of interest in oncology, regenerative medicine and tissue engineering. Copyright © 2015 Elsevier Ltd. All rights reserved.
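
    The population-balance backbone of such a model can be written compactly. The display below is a generic one-dimensional PB equation of the kind referred to in this abstract, with growth along an internal coordinate x (for example cell mass or cycle progress) and transition terms between consecutive phases; it is a schematic form only, not the specific system of equations used in the cited work.

```latex
\frac{\partial n_i(x,t)}{\partial t}
  + \frac{\partial}{\partial x}\bigl[G_i(x)\,n_i(x,t)\bigr]
  = k_{i-1\to i}(x)\,n_{i-1}(x,t)
  - k_{i\to i+1}(x)\,n_i(x,t)
  - \mu_i(x)\,n_i(x,t)
```

    Here n_i(x,t) is the number density of cells in cycle phase i, G_i the growth rate along the internal coordinate, and mu_i a death/removal rate; in the approach described above, the phase-transition rates k would be supplied by the CDK-cyclin kinetic model rather than being fixed constants.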

  1. Unifying model of shoot gravitropism reveals proprioception as a central feature of posture control in plants.

    PubMed

    Bastien, Renaud; Bohr, Tomas; Moulia, Bruno; Douady, Stéphane

    2013-01-08

    Gravitropism, the slow reorientation of plant growth in response to gravity, is a key determinant of the form and posture of land plants. Shoot gravitropism is triggered when statocysts sense the local angle of the growing organ relative to the gravitational field. Lateral transport of the hormone auxin to the lower side is then enhanced, resulting in differential gene expression and cell elongation causing the organ to bend. However, little is known about the dynamics, regulation, and diversity of the entire bending and straightening process. Here, we modeled the bending and straightening of a rod-like organ and compared it with the gravitropism kinematics of different organs from 11 angiosperms. We show that gravitropic straightening shares common traits across species, organs, and orders of magnitude. The minimal dynamic model accounting for these traits is not the widely cited gravisensing law but one that also takes into account the sensing of local curvature, what we describe here as a graviproprioceptive law. In our model, the entire dynamics of the bending/straightening response is described by a single dimensionless "bending number" B that reflects the ratio between graviceptive and proprioceptive sensitivities. The parameter B defines both the final shape of the organ at equilibrium and the timing of curving and straightening. B can be estimated from simple experiments, and the model can then explain most of the diversity observed in experiments. Proprioceptive sensing is thus as important as gravisensing in gravitropic control, and the B ratio can be measured as phenotype in genetic studies.

  2. Statistical procedures for determination and verification of minimum reporting levels for drinking water methods.

    PubMed

    Winslow, Stephen D; Pepich, Barry V; Martin, John J; Hallberg, George R; Munch, David J; Frebis, Christopher P; Hedrick, Elizabeth J; Krop, Richard A

    2006-01-01

    The United States Environmental Protection Agency's Office of Ground Water and Drinking Water has developed a single-laboratory quantitation procedure: the lowest concentration minimum reporting level (LCMRL). The LCMRL is the lowest true concentration for which future recovery is predicted to fall, with high confidence (99%), between 50% and 150%. The procedure takes into account precision and accuracy. Multiple concentration replicates are processed through the entire analytical method and the data are plotted as measured sample concentration (y-axis) versus true concentration (x-axis). If the data support an assumption of constant variance over the concentration range, an ordinary least-squares regression line is drawn; otherwise, a variance-weighted least-squares regression is used. Prediction interval lines of 99% confidence are drawn about the regression. At the points where the prediction interval lines intersect with data quality objective lines of 50% and 150% recovery, lines are dropped to the x-axis. The higher of the two values is the LCMRL. The LCMRL procedure is flexible because the data quality objectives (50-150%) and the prediction interval confidence (99%) can be varied to suit program needs. The LCMRL determination is performed during method development only. A simpler procedure for verification of data quality objectives at a given minimum reporting level (MRL) is also presented. The verification procedure requires a single set of seven samples taken through the entire method procedure. If the calculated prediction interval is contained within data quality recovery limits (50-150%), the laboratory performance at the MRL is verified.
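
    The LCMRL determination described here is essentially a calibration regression whose prediction band is intersected with the 50% and 150% recovery limits. The sketch below implements that logic for the constant-variance (ordinary least-squares) case with a hand-rolled 99% prediction interval; the spiking levels and measurements are fabricated, and the variance-weighted branch and other details of the official procedure are omitted.

```python
import numpy as np
from scipy import stats

# fabricated spiking study: true concentrations (x) and measured concentrations (y)
x = np.repeat([0.5, 1.0, 2.0, 4.0, 8.0], 4)
rng = np.random.default_rng(1)
y = x * rng.normal(1.0, 0.12, size=x.size) + rng.normal(0.0, 0.05, size=x.size)

# ordinary least-squares fit y = a + b * x
n = x.size
b, a = np.polyfit(x, y, 1)
s = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))
t99 = stats.t.ppf(0.995, df=n - 2)            # two-sided 99% prediction interval

grid = np.linspace(x.min(), x.max(), 2000)
se_pred = s * np.sqrt(1 + 1 / n + (grid - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
lower = a + b * grid - t99 * se_pred
upper = a + b * grid + t99 * se_pred

# LCMRL-like value: lowest concentration whose prediction band lies within 50-150% recovery
inside = (lower >= 0.5 * grid) & (upper <= 1.5 * grid)
lcmrl = grid[inside].min() if inside.any() else None
print("approximate LCMRL on this fabricated data:", lcmrl)
```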

  3. TakeCARE, a Video Bystander Program to Help Prevent Sexual Violence on College Campuses: Results of Two Randomized, Controlled Trials

    PubMed Central

    Jouriles, Ernest N.; McDonald, Renee; Rosenfield, David; Levy, Nicole; Sargent, Kelli; Caiozzo, Christina; Grych, John H.

    2015-01-01

    Objective The present research reports on two randomized controlled trials evaluating TakeCARE, a video bystander program designed to help prevent sexual violence on college campuses. Method In Study 1, students were recruited from psychology courses at two universities. In Study 2, first-year students were recruited from a required course at one university. In both studies, students were randomly assigned to view one of two videos: TakeCARE or a control video on study skills. Just before viewing the videos, students completed measures of bystander behavior toward friends and ratings of self-efficacy for performing such behaviors. The efficacy measure was administered again after the video, and both the bystander behavior measure and the efficacy measure were administered either one (Study 1) or two (Study 2) months later. Results In both studies, students who viewed TakeCARE, compared to students who viewed the control video, reported engaging in more bystander behavior toward friends and greater feelings of efficacy for performing such behavior. In Study 1, feelings of efficacy mediated effects of TakeCARE on bystander behavior; this result did not emerge in Study 2. Conclusions This research demonstrates that TakeCARE, a video bystander program, can positively influence bystander behavior toward friends. Given its potential to be easily distributed to an entire campus community, TakeCARE might be an effective addition to campus efforts to prevent sexual violence. PMID:27867694

  4. Navy Coalescence Test on Petroleum F-76 Fuel with FAME Additive at 1%

    DTIC Science & Technology

    2012-06-20

    sponsored studies have shown that in many countries there may be an undesirable concentration of Fatty Acid Methyl Ester (FAME) present in the F-76. This study was... DEFINITIONS: Turnover ..................amount of time it takes to flow the entire volume of fluid in a container, also

  5. Bispectral Inversion: The Construction of a Time Series from Its Bispectrum

    DTIC Science & Technology

    1988-04-13

    take the inverse transform. Since the goal is to compute a time series given its bispectrum, it would also be nice to stay entirely in the frequency...domain and be able to go directly from the bispectrum to the Fourier transform of the time series without the need to inverse transform continuous...the picture. The approximations arise from representing the bicovariance, which is the inverse transform of a continuous function, by the inverse discrete

  6. Russia After Putin

    DTIC Science & Technology

    2014-05-01

    was operating in the dark surrounded by “yes men.” Granted it would take time, but the handwriting was on the wall; the days of the Power Vertical...ideas across the entire expanse of the FSU. . . . Needless to say, the war on the Internet and independent media outlets presently underway in Russia...Russia’s Far East Remains ‘More Dead than Alive’ Experts Say,” Window on Eurasia, December 2, 2012. He notes that “within a radius of 1,000 kilometers

  7. Extra-bodily DNA sampling by the police.

    PubMed

    Gans, Jeremy

    2013-12-01

    Forensic investigators have statutory powers to take DNA samples directly from suspects' bodies in certain circumstances, but sometimes the powers fall short, legally or practically. Police may then look for samples that have become separated from their suspects for one reason or another. No jurisdiction currently bars or even regulates this practice, which is instead loosely governed by laws on property, consent and evidence. This article argues that this lack of regulation undermines the entire system of forensic procedure laws.

  8. International Standard Payload Rack volume

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Outer dimensions of the International Standard Payload Rack (ISPR) that will be used on the International Space Station (ISS) sets the envelope for scientists designing hardware for experiments in biological and physical sciences aboard ISS. The ISPR includes attachments to ISS utilities (electrical power, heating and cooling, data, fluids, vacuum, etc.) through standoffs that hold the racks in place in the lab modules. Usage will range from facilities that take entire racks to specialized drawers occupying a portion of a rack.

  9. Worldwide Emerging Environmental Issues Affecting the U.S. Military. October 2009

    DTIC Science & Technology

    2009-10-01

    nuclear arms. Deliberations will take at least until May 2010. Belgium has already banned cluster and depleted uranium munitions. [Related item: UN...dissolve in cold water than warm. Research carried out in the archipelago of Svalbard revealed that seawater could reach corrosive levels within 10 years... corrosively acidic by 2018; 50% by 2050; and entirely by the end of the century. 6.8.2 Food and Water Security The number of hungry people in the

  10. Building emotional resilience over 14 sessions of emotion focused therapy: Micro-longitudinal analyses of productive emotional patterns.

    PubMed

    Pascual-Leone, A; Yeryomenko, N; Sawashima, T; Warwar, S

    2017-05-04

    Pascual-Leone and Greenberg's sequential model of emotional processing has been used to explore process in over 24 studies. This line of research shows that emotional processing in good psychotherapy often follows a sequential order, supporting a saw-toothed pattern of change within individual sessions (progressing "2-steps-forward, 1-step-back"). However, one cannot assume that local in-session patterns are scalable across an entire course of therapy. Thus, the primary objective of this exploratory study was to consider how the sequential patterns identified by Pascual-Leone may apply across entire courses of treatment. Intensive emotion coding in two separate single-case designs was submitted for quantitative analyses of longitudinal patterns. Comprehensive coding in these cases involved recording observations for every emotional event in an entire course of treatment (using the Classification of Affective-Meaning States), which were then treated as a 9-point ordinal scale. Applying multilevel modeling to each of the two cases showed significant patterns of change over a large number of sessions, and those patterns were either nested at the within-session level or observed at the broader session-by-session level of change. Examining successful treatment cases showed several theoretically coherent kinds of temporal patterns, although not always in the same case. Clinical or methodological significance of this article: This is the first paper to demonstrate systematic temporal patterns of emotion over the course of an entire treatment. (1) The study offers a proof of concept that longitudinal patterns in the micro-processes of emotion can be objectively derived and quantified. (2) It also shows that patterns in emotion may be identified on the within-session level, as well as the session-by-session level of analysis. (3) Finally, observed processes over time support the ordered pattern of emotional states hypothesized in Pascual-Leone and Greenberg's (2007) model of emotional processing.

  11. Square2 - A Web Application for Data Monitoring in Epidemiological and Clinical Studies

    PubMed

    Schmidt, Carsten Oliver; Krabbe, Christine; Schössow, Janka; Albers, Martin; Radke, Dörte; Henke, Jörg

    2017-01-01

    Valid scientific inferences from epidemiological and clinical studies require high data quality. Data-generating departments therefore aim to detect data irregularities as early as possible in order to guide quality management processes. In addition, after the completion of data collections the obtained data quality must be evaluated. This can be challenging in complex studies due to a wide scope of examinations, numerous study variables, multiple examiners, devices, and examination centers. This paper describes a Java EE web application used to monitor and evaluate data quality in institutions with complex and multiple studies, named Square2. It uses the Java libraries Apache MyFaces 2, extended by BootsFaces for layout and style. RServe and REngine manage calls to R server processes. All study data and metadata are stored in PostgreSQL. R is the statistics backend and LaTeX is used for the generation of print-ready PDF reports. A GUI manages the entire workflow. Square2 covers all steps in the data monitoring workflow, including the setup of studies and their structure, the handling of metadata for data monitoring purposes, selection of variables, upload of data, statistical analyses, and the generation as well as inspection of quality reports. To take into account data protection issues, Square2 comprises an extensive user rights and roles concept.

  12. Edge control in a computer controlled optical surfacing process using a heterocercal tool influence function.

    PubMed

    Hu, Haixiang; Zhang, Xin; Ford, Virginia; Luo, Xiao; Qi, Erhui; Zeng, Xuefeng; Zhang, Xuejun

    2016-11-14

    The edge effect is regarded as one of the most difficult technical issues in a computer controlled optical surfacing (CCOS) process. Traditional opticians have to balance the consequences of the following two cases. Operating CCOS in a large overhang condition affects the accuracy of material removal, while in a small overhang condition, it achieves a more accurate performance, but leaves a narrow rolled-up edge, which takes time and effort to remove. In order to control the edge residuals in the latter case, we present a new concept of the 'heterocercal' tool influence function (TIF). Generated from compound motion equipment, this type of TIF can 'transfer' the material removal from the inner place to the edge, meanwhile maintaining the high accuracy and efficiency of CCOS. We call it the 'heterocercal' TIF, because of the inspiration from the heterocercal tails of sharks, whose upper lobe provides most of the explosive power. The heterocercal TIF was theoretically analyzed, and physically realized in CCOS facilities. Experimental and simulation results showed good agreement. It enables significant control of the edge effect and convergence of entire surface errors in large tool-to-mirror size-ratio conditions. This improvement will largely help manufacturing efficiency in some extremely large optical system projects, like the tertiary mirror of the Thirty Meter Telescope.
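
    In CCOS the predicted material removal is commonly modeled as the tool influence function convolved with the dwell-time map, which is the frame in which edge residuals of different TIF shapes are compared. The snippet below shows only that generic convolution step with a synthetic Gaussian TIF; it does not reproduce the compound-motion 'heterocercal' TIF introduced in this work.

```python
import numpy as np
from scipy.signal import fftconvolve

# synthetic uniform dwell-time map over a square part (seconds per grid cell)
dwell = np.ones((200, 200))

# illustrative axisymmetric Gaussian TIF (removal per unit dwell time)
yy, xx = np.mgrid[-15:16, -15:16]
tif = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * 5.0 ** 2))
tif /= tif.sum()

# predicted removal; 'same' mode makes the under-removal near the edges visible
removal = fftconvolve(dwell, tif, mode="same")
edge_deficit = 1.0 - removal[0, 100] / removal[100, 100]
print(f"relative removal deficit at the part edge: {edge_deficit:.1%}")
```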

  13. Melt impregnation as a post processing treatment for performance enhancement in high capacity 3D microporous tin-copper-nickel intermetallic anode for Li-ion battery supported by electrodeposited nickel scaffold: A structural study

    NASA Astrophysics Data System (ADS)

    Sengupta, Srijan; Patra, Arghya; Mitra, Arijit; Jena, Sambedan; Das, Karabi; Majumder, Subhasish Basu; Das, Siddhartha

    2018-05-01

    This paper communicates the stabilization of a Sn anode by impregnating it within the porous framework of a Ni scaffold. The impregnation is carried out by electrodeposition of Sn on Ni foam followed by heating at 300 °C for 1 h. The Ni foam was itself electrodeposited on a Cu foil prior to the deposition of Sn. The melting step leads to the formation of Ni–Sn and Cu–Sn intermetallics within the pores of the Ni scaffold. Sn–Cu/Ni intermetallics lithiate following the active-inactive strategy, in which the inactive Cu/Ni buffers the volume expansion while Sn lithiates. Furthermore, this entire process takes place within the Ni scaffold, which resists material pulverization and delamination and provides a better electronic pathway for charge transfer. This active-inactive Sn:Sn–Cu/Ni intermetallic within a protected Ni-scaffold assembly results in a 100th-cycle discharge capacity of 587.9 mAh/g at a rate of 500 mA/g (0.5 C) and superior rate capability, delivering 463 mAh/g at a rate of 2 A/g (2 C), while retaining structural integrity as compared to pure Sn electrodeposited (without heat treatment) on the nickel scaffold.

  14. More Than Filaments and Cores: Statistical Study of Structure Formation and Dynamics in Nearby Molecular Clouds

    NASA Astrophysics Data System (ADS)

    Chen, How-Huan; Goodman, Alyssa

    2018-01-01

    In the past decade, multiple attempts at understanding the connection between filaments and star-forming cores have been made using observations across the entire spectrum. However, the filaments and the cores are usually treated as predefined (and well-defined) entities, instead of structures that come in different sizes and shapes, with substantially different dynamics, and that are interconnected at different scales. In my dissertation, I present an array of studies using different statistical methods, including the dendrogram and the probability distribution function (PDF), of structures at different size scales within nearby molecular clouds. These structures are identified using observations of different density tracers, and where possible, in the multi-dimensional parameter space of key dynamic properties: the LSR velocity, the velocity dispersion, and the column density. The goal is to give an overview of structure formation in nearby star-forming clouds, as well as of the dynamics in these structures. I find that the overall statistical properties of a larger structure are often the summation/superposition of the sub-structures within, and that there can be significant variations due to local physical processes. I also find that the star formation process within molecular clouds could in fact take place in a non-monolithic manner, connecting potentially merging and/or transient structures at different scales.

  15. The use of ordered mixtures for improving the dissolution rate of low solubility compounds.

    PubMed

    Nyström, C; Westerberg, M

    1986-03-01

    The dissolution rate of micronized griseofulvin has been investigated, both for the agglomerated raw material and the material formulated as an ordered mixture, by means of the USP XX paddle method. During the experiments, which were performed at sink condition and constant temperature, the effects of adding a surfactant and of agitation were tested. The ordered mixture with sodium chloride gave a fast dissolution rate, practically independent of the test parameters. Micronized griseofulvin alone gave dissolution profiles that were improved by adding polysorbate 80 and by increased agitation, but the dissolution rates obtained were much lower than those for the ordered mixture. It was concluded that the rate limiting step in the dissolution of griseofulvin as the raw material is the penetration of the dissolution medium into the agglomerates. With an ordered mixture, these agglomerates were deaggregated during the mixing process, producing a system in which the entire external surface area of the primary particles was exposed to the dissolution medium. This conclusion was supported by calculation of the contact surface areas taking part in the dissolution process for the systems tested. The procedure developed in this study could be applied to preformulation work where a cohesive, low solubility drug of hydrophobic nature is to be formulated.

  16. Giant ripples on comet 67P/Churyumov–Gerasimenko sculpted by sunset thermal wind

    PubMed Central

    Jia, Pan; Andreotti, Bruno; Claudin, Philippe

    2017-01-01

    Explaining the unexpected presence of dune-like patterns at the surface of the comet 67P/Churyumov–Gerasimenko requires conceptual and quantitative advances in the understanding of surface and outgassing processes. We show here that vapor flow emitted by the comet around its perihelion spreads laterally in a surface layer, due to the strong pressure difference between zones illuminated by sunlight and those in shadow. For such thermal winds to be dense enough to transport grains—10 times greater than previous estimates—outgassing must take place through a surface porous granular layer, and that layer must be composed of grains whose roughness lowers cohesion consistently with contact mechanics. The linear stability analysis of the problem, entirely tested against laboratory experiments, quantitatively predicts the emergence of bedforms in the observed wavelength range and their propagation at the scale of a comet revolution. Although generated by a rarefied atmosphere, they are paradoxically analogous to ripples emerging on granular beds submitted to viscous shear flows. This quantitative agreement shows that our understanding of the coupling between hydrodynamics and sediment transport is able to account for bedform emergence in extreme conditions and provides a reliable tool to predict the erosion and accretion processes controlling the evolution of small solar system bodies. PMID:28223535

  17. Emergence of Virtual Reality as a Tool for Upper Limb Rehabilitation: Incorporation of Motor Control and Motor Learning Principles

    PubMed Central

    Weiss, Patrice L.; Keshner, Emily A.

    2015-01-01

    The primary focus of rehabilitation for individuals with loss of upper limb movement as a result of acquired brain injury is the relearning of specific motor skills and daily tasks. This relearning is essential because the loss of upper limb movement often results in a reduced quality of life. Although rehabilitation strives to take advantage of neuroplastic processes during recovery, results of traditional approaches to upper limb rehabilitation have not entirely met this goal. In contrast, enriched training tasks, simulated with a wide range of low- to high-end virtual reality–based simulations, can be used to provide meaningful, repetitive practice together with salient feedback, thereby maximizing neuroplastic processes via motor learning and motor recovery. Such enriched virtual environments have the potential to optimize motor learning by manipulating practice conditions that explicitly engage motivational, cognitive, motor control, and sensory feedback–based learning mechanisms. The objectives of this article are to review motor control and motor learning principles, to discuss how they can be exploited by virtual reality training environments, and to provide evidence concerning current applications for upper limb motor recovery. The limitations of the current technologies with respect to their effectiveness and transfer of learning to daily life tasks also are discussed. PMID:25212522

  18. Expanded Processing Techniques for EMI Systems

    DTIC Science & Technology

    2012-07-01

    possible to perform better target detection using physics-based algorithms and the entire data set, rather than simulating a simpler data set and mapping... Figure 4.25: Plots of simulated MetalMapper data for two oblate spheroidal targets

  19. An Optimization of Manufacturing Systems using a Feedback Control Scheduling Model

    NASA Astrophysics Data System (ADS)

    Ikome, John M.; Kanakana, Grace M.

    2018-03-01

    In complex production systems that involve multiple processes, unplanned disruptions often make the entire production system vulnerable to a number of problems, which leads to customer dissatisfaction. This is an ongoing problem that calls for research and methods to streamline the entire process, or for a model that addresses it. In response, we have developed a feedback scheduling model that can minimize some of these problems, and a number of experiments show that some of them can be eliminated if the correct remedial actions are implemented on time.

  20. A method for optimizing the cosine response of solar UV diffusers

    NASA Astrophysics Data System (ADS)

    Pulli, Tomi; Kärhä, Petri; Ikonen, Erkki

    2013-07-01

    Instruments measuring global solar ultraviolet (UV) irradiance at the surface of the Earth need to collect radiation from the entire hemisphere. Entrance optics with angular response as close as possible to the ideal cosine response are necessary to perform these measurements accurately. Typically, the cosine response is obtained using a transmitting diffuser. We have developed an efficient method based on a Monte Carlo algorithm to simulate radiation transport in the solar UV diffuser assembly. The algorithm takes into account propagation, absorption, and scattering of the radiation inside the diffuser material. The effects of the inner sidewalls of the diffuser housing, the shadow ring, and the protective weather dome are also accounted for. The software implementation of the algorithm is highly optimized: a simulation of 10^9 photons takes approximately 10 to 15 min to complete on a typical high-end PC. The results of the simulations agree well with the measured angular responses, indicating that the algorithm can be used to guide the diffuser design process. Cost savings can be obtained when simulations are carried out before diffuser fabrication as compared to a purely trial-and-error-based diffuser optimization. The algorithm was used to optimize two types of detectors, one with a planar diffuser and the other with a spherically shaped diffuser. The integrated cosine errors—which indicate the relative measurement error caused by the nonideal angular response under isotropic sky radiance—of these two detectors were calculated to be f2 = 1.4% and 0.66%, respectively.
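
    A drastically simplified version of this kind of photon Monte Carlo is sketched below: photons enter a plane-parallel diffuser slab at a given incidence angle, random-walk with exponentially distributed free paths and isotropic scattering, and are counted if they exit through the bottom face. The geometry, the optical constants, and the omission of the housing, shadow ring, and weather dome are all simplifying assumptions; this is not the published algorithm.

```python
import numpy as np

def transmitted_fraction(theta_deg, n_photons=20000, thickness=1.0,
                         mean_free_path=0.3, albedo=0.98, seed=0):
    """Fraction of photons incident at theta that leave through the bottom face
    of a scattering slab (isotropic scattering, exponential step lengths)."""
    rng = np.random.default_rng(seed)
    mu0 = np.cos(np.radians(theta_deg))       # initial downward direction cosine
    transmitted = 0
    for _ in range(n_photons):
        z, mu = 0.0, mu0                      # depth below the top face
        while True:
            z += mu * rng.exponential(mean_free_path)
            if z >= thickness:                # reaches the detector side
                transmitted += 1
                break
            if z < 0.0:                       # escapes back out of the top face
                break
            if rng.random() > albedo:         # absorbed inside the diffuser
                break
            mu = rng.uniform(-1.0, 1.0)       # isotropic re-scattering
    return transmitted / n_photons

# relative angular response compared with the ideal cosine law
ref = transmitted_fraction(0.0)
for ang in (0.0, 30.0, 60.0):
    rel = transmitted_fraction(ang) / (ref * np.cos(np.radians(ang)))
    print(f"{ang:4.0f} deg: response / cosine = {rel:.2f}")
```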

  1. A discriminative model-constrained graph cuts approach to fully automated pediatric brain tumor segmentation in 3-D MRI.

    PubMed

    Wels, Michael; Carneiro, Gustavo; Aplas, Alexander; Huber, Martin; Hornegger, Joachim; Comaniciu, Dorin

    2008-01-01

    In this paper we present a fully automated approach to the segmentation of pediatric brain tumors in multi-spectral 3-D magnetic resonance images. It is a top-down segmentation approach based on a Markov random field (MRF) model that combines probabilistic boosting trees (PBT) and lower-level segmentation via graph cuts. The PBT algorithm provides a strong discriminative observation model that classifies tumor appearance while a spatial prior takes into account the pair-wise homogeneity in terms of classification labels and multi-spectral voxel intensities. The discriminative model relies not only on observed local intensities but also on surrounding context for detecting candidate regions for pathology. A mathematically sound formulation for integrating the two approaches into a unified statistical framework is given. The proposed method is applied to the challenging task of detection and delineation of pediatric brain tumors. This segmentation task is characterized by a high non-uniformity of both the pathology and the surrounding non-pathologic brain tissue. A quantitative evaluation illustrates the robustness of the proposed method. Despite dealing with more complicated cases of pediatric brain tumors the results obtained are mostly better than those reported for current state-of-the-art approaches to 3-D MR brain tumor segmentation in adult patients. The entire processing of one multi-spectral data set does not require any user interaction, and takes less time than previously proposed methods.
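    To make the combination of a discriminative observation model with graph-cut optimization concrete, the following toy sketch segments a 2-D probability map by an s-t minimum cut, with unary terms from classifier probabilities and a constant pairwise smoothness term. It is only a simplified stand-in for the paper's method, which uses PBT class probabilities, multi-spectral 3-D data, and a richer MRF prior; the 4-neighbourhood, the `beta` weight, and the use of networkx are illustrative assumptions.

```python
import numpy as np
import networkx as nx

def binary_mrf_graphcut(prob_tumor, beta=2.0):
    """Toy binary MRF segmentation of a 2-D probability map via an s-t
    minimum cut.  prob_tumor[i, j] = P(voxel is tumor) from a classifier."""
    eps = 1e-6
    h, w = prob_tumor.shape
    G = nx.Graph()
    src, snk = "tumor", "background"
    for i in range(h):
        for j in range(w):
            p = np.clip(prob_tumor[i, j], eps, 1 - eps)
            # Unary terms: negative log-likelihoods as t-link capacities.
            G.add_edge(src, (i, j), capacity=-np.log(1 - p))   # cost of background label
            G.add_edge((i, j), snk, capacity=-np.log(p))       # cost of tumor label
            # Pairwise smoothness terms (4-neighbourhood n-links).
            if i + 1 < h:
                G.add_edge((i, j), (i + 1, j), capacity=beta)
            if j + 1 < w:
                G.add_edge((i, j), (i, j + 1), capacity=beta)
    _, (reachable, _) = nx.minimum_cut(G, src, snk)
    labels = np.zeros((h, w), dtype=bool)
    for node in reachable:
        if node != src:
            labels[node] = True
    return labels   # True where the voxel is assigned to the tumor class
```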

  2. GREENSCOPE Technical User’s Guide

    EPA Pesticide Factsheets

    GREENSCOPE’s methodology has been developed and its software tool designed such that it can be applied to an entire process, to a piece of equipment or process unit, or at the investigatory bench scale.

  3. Toward a virtual platform for materials processing

    NASA Astrophysics Data System (ADS)

    Schmitz, G. J.; Prahl, U.

    2009-05-01

Any production is based on materials eventually becoming components of a final product. Material properties, being determined by the microstructure of the material, are thus of utmost importance both for the productivity and reliability of processing during production and for the application and reliability of the product components. A sound prediction of material properties is therefore highly important. Such a prediction requires tracking of microstructure and properties evolution along the entire component life cycle starting from a homogeneous, isotropic and stress-free melt and eventually ending in failure under operational load. This article will outline ongoing activities at the RWTH Aachen University aiming at establishing a virtual platform for materials processing comprising a virtual, integrative numerical description of processes and of the microstructure evolution along the entire production chain and even extending further toward microstructure and properties evolution under operational conditions.

  4. Supreme Court of the United States Syllabus: Cleveland Board of Education Et Al. v. LaFleur Et Al. Certiorari to the United States Court of Appeals for the Sixth Circuit. No. 72-77, Argued October 15, 1973 -- Decided January 21, 1974.

    ERIC Educational Resources Information Center

    Supreme Court of the U. S., Washington, DC.

    School board rules for the Cleveland, Ohio, and the Chesterfield County, Virginia, districts required pregnant teachers to take unpaid maternity leave five months and four months respectively before expected childbirth. A date for eligibility for return to work was also arbitrarily set. This pamphlet contains the entire official text of the…

  5. Military Personnel: Visibility over Junior Enlisted Servicemember Access to Services on Bases Could Be Enhanced

    DTIC Science & Technology

    2015-05-01

dining facility to get dinner before it closed at 6:30 p.m. However, leadership at the installations we visited stated there are accommodations made to...hosts an informal dinner for the junior enlisted servicemembers where he discusses any concerns. In addition, one senior official we spoke with at...the entire lunch hour to get to the dining facility, eat, and return to work, which does not leave any time to take care of other tasks during lunch

  6. The acoustic low-degree modes of the Sun measured with 14 years of continuous GOLF & VIRGO measurements

    NASA Astrophysics Data System (ADS)

    García, R. A.; Salabert, D.; Ballot, J.; Sato, K.; Mathur, S.; Jiménez, A.

    2011-01-01

The helioseismic Global Oscillation at Low Frequency (GOLF) and the Variability of solar Irradiance and Gravity Oscillations (VIRGO) instruments onboard SoHO have been observing the Sun continuously for the last 14 years. In this preliminary work, we characterize the acoustic modes over the entire p-mode range in both Doppler velocity and luminosity, with special care for the low-frequency modes, taking advantage of the stability and the high duty cycle of space observations.

  7. Sino-American Relations in the 21st Century: Taking a Page from the Venezuelan Crisis of 1895

    DTIC Science & Technology

    2015-04-13

necessarily endorsed by the Joint Forces Staff College or the Department of Defense. This paper is entirely my own work except as documented in... AUTHOR: Joseph H. Wenckus, Lieutenant Colonel, U.S. Air Force ...transition theory applies. This paper posits that there are real similarities between the peaceful Anglo-American power transition of last century, and

  8. Environmental Impact Statement/Environmental Impact Report for the Disposal and Reuse of Mare Island Naval Shipyard Vallejo, California. Volume 1.

    DTIC Science & Technology

    1998-04-01

Valley (Kroeber & Heizer 1970). In 1972, the Bureau of Indian Affairs listed only 11 individuals claiming Patwin ancestry in the entire territory...facility from the dredge disposal area to the upland open space scenic resource area would render this facility visible from viewpoints with high...take. The COE probably would not issue a permit unless the USFWS rendered a "non-jeopardy" Biological Opinion, which would incorporate mitigations for

  9. Taking Trust to the Field: Pilot Study on Trust and Communication in Teams

    DTIC Science & Technology

    2005-02-23

    assault group the entirely wrong direction. Person 4 from other AG: Smith…you’re right the fuck out of it…on your nav… Smith: Point that way…until the...monitoring (McAllister, 1995). Other than to address the trust violation that just occurred, there appears little reason for this information to be...of trust-relevant communications. In starting this work, we were concerned that coders should have to use as little extrapolation or inference as

  10. Internal tibial torsion correction study. [measurements of strain for corrective rotation of stressed tibia

    NASA Technical Reports Server (NTRS)

    Cantu, J. M.; Madigan, C. M.

    1974-01-01

A quantitative study of internal torsion in the entire tibial bone was performed by using strain gauges to measure the amount of deformation occurring at different locations. Comparison of strain measurements with physical dimensions of the bone produced the modulus of rigidity and its behavior under increased torque. Computerized analysis of the stress distribution shows that more strain occurs near the torqued ends of the bones, where most of the twisting and fracturing also takes place.
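    The modulus of rigidity mentioned above follows from the elementary torsion relation G = T*L / (J*phi). The snippet below evaluates it for an idealized hollow circular shaft; this is a rough illustration only (a tibia is neither circular nor uniform, which is exactly why the study relied on strain-gauge measurements), and the numbers are hypothetical.

```python
import math

def shear_modulus_from_torsion(torque_nm, length_m, outer_r_m, inner_r_m, twist_rad):
    """Modulus of rigidity G = T*L / (J*phi) for a torqued shaft, with the
    polar moment of area J of a hollow circular cross-section."""
    J = math.pi * (outer_r_m**4 - inner_r_m**4) / 2.0
    return torque_nm * length_m / (J * twist_rad)

# Hypothetical numbers, not taken from the study.
print(shear_modulus_from_torsion(torque_nm=10.0, length_m=0.35,
                                 outer_r_m=0.012, inner_r_m=0.006,
                                 twist_rad=0.02))
```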

  11. Turmoil, Transition - Triumph? The Democratic Revolution in the Philippines

    DTIC Science & Technology

    1986-06-01

The entire lobby, mezzanine, bar, and stairways were jammed. We couldn’t even get close to the room in which she was speaking, and that was with her...went to the top of the Manila Hotel to take pictures of the throng. One immediately noticeable difference was the lobby. It was jammed! The place was...at the entry to the school. It was a real festival atmosphere, and though it was just eight in the morning the school was jammed. Each classroom in

  12. Translations on USSR Science and Technology, Physical Sciences and Technology, Number 15

    DTIC Science & Technology

    1977-07-26

systems which they developed and introduced. Over 8,000 Azneft’ oil wells are equipped with these automated systems. But, this equipment is...validity of information received in this manner and they are inclined to obtain it the old way. Drive over, take a look-- and that’s the entire method...determined by the reactive parameters of the circuit and the diode is hundredths of a nanosecond for the microwave switches. The switching

  13. Valuing the Accreditation Process

    ERIC Educational Resources Information Center

    Bahr, Maria

    2018-01-01

    The value of the National Association for Developmental Education (NADE) accreditation process is far-reaching. Not only do students and programs benefit from the process, but also the entire institution. Through data collection of student performance, analysis, and resulting action plans, faculty and administrators can work cohesively towards…

  14. The ambiguity of altruism in nursing: A qualitative study.

    PubMed

    Slettmyr, Anna; Schandl, Anna; Arman, Maria

    2017-01-01

For a long time, altruism was the basis for caring. Today, when society is more individualized, it is of interest to explore the meaning of altruism in nursing. In all, 13 nurses from a Swedish acute care setting participated in two focus group interviews performed as Socratic dialogues. Data were analyzed using a phenomenological hermeneutical method. Ethical considerations: Ethical issues were considered throughout the process according to established ethical principles. Informed consent was obtained from all participants, confidentiality regarding the data was guaranteed and quotations anonymized. Altruism created a sense of ambivalence and ambiguity, described as a rise of sovereign expressions of life caused by "the other's" need, but also an unwillingness to take unconditional responsibility for "the other." Society's expectations of altruism and nurses' perception of their work as a salaried job collide in modern healthcare. Nurses are not willing to fully respond to the ethical demand of the patients. In case of a disaster, when nurses' personal safety, life and health may be at risk, there might be reasons to question whether the healthcare organization would be able to fulfill its obligations of providing healthcare to an entire population.

  15. Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes. Part 3; A Recursive Maximum Likelihood Decoding

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Fossorier, Marc

    1998-01-01

The Viterbi algorithm is indeed a very simple and efficient method of implementing the maximum likelihood decoding. However, if we take advantage of the structural properties in a trellis section, other efficient trellis-based decoding algorithms can be devised. Recently, an efficient trellis-based recursive maximum likelihood decoding (RMLD) algorithm for linear block codes has been proposed. This algorithm is more efficient than the conventional Viterbi algorithm in both computation and hardware requirements. Most importantly, the implementation of this algorithm does not require the construction of the entire code trellis, only some special one-section trellises of relatively small state and branch complexities are needed for constructing path (or branch) metric tables recursively. At the end, there is only one table which contains only the most likely code-word and its metric for a given received sequence r = (r_1, r_2, ..., r_n). This algorithm basically uses the divide and conquer strategy. Furthermore, it allows parallel/pipeline processing of received sequences to speed up decoding.
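    As a point of reference for the conventional Viterbi decoding mentioned above, the sketch below runs a soft-decision Viterbi search on the syndrome (Wolf) trellis of a small binary linear block code. It is not the RMLD algorithm of the paper, which avoids building the full code trellis; it only illustrates the trellis-based maximum-likelihood baseline. The (7,4) Hamming parity-check matrix and the BPSK mapping are illustrative choices.

```python
import numpy as np

def wolf_trellis_viterbi(H, r):
    """Soft-decision ML decoding of a binary linear block code on its
    syndrome (Wolf) trellis.  States at depth i are the partial syndromes
    H[:, :i] @ c[:i] mod 2; valid codewords are paths ending at the
    all-zero syndrome.  Branch metric: squared Euclidean distance to the
    received sample r[i], assuming BPSK mapping 0 -> +1, 1 -> -1."""
    m, n = H.shape
    survivors = {tuple(np.zeros(m, dtype=int)): (0.0, [])}
    for i in range(n):
        nxt = {}
        for state, (metric, bits) in survivors.items():
            for bit in (0, 1):
                new_state = tuple((np.array(state) + bit * H[:, i]) % 2)
                branch = (r[i] - (1 - 2 * bit)) ** 2
                cand = (metric + branch, bits + [bit])
                if new_state not in nxt or cand[0] < nxt[new_state][0]:
                    nxt[new_state] = cand
        survivors = nxt
    # The ML codeword is the best surviving path at the zero syndrome.
    metric, bits = survivors[tuple(np.zeros(m, dtype=int))]
    return np.array(bits), metric

# (7,4) Hamming code parity-check matrix (one common convention).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
r = np.array([0.9, 1.1, -0.8, 1.2, -0.2, -1.0, -1.1])  # noisy BPSK samples
print(wolf_trellis_viterbi(H, r))
```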

  16. Managing occupational risk in creative practice: a new perspective for occupational health and safety.

    PubMed

    Oughton, Nicholas

    2013-01-01

There has been little recognition of the fact that creative production operates in a somewhat different environment and timeframe to that associated with traditional industries. This has resulted in the application of an orthodox, generic or "one size fits all" framework of Occupational Health and Safety (OHS) systems across all industries. With the rapid growth of "creative industry," certain challenges arise from the application of this "generic" strategy, mainly because the systems currently employed may not be entirely suitable for creative practice. Some OHS practitioners suggest that the current OHS paradigm is failing. This paper questions the appropriateness of applying a twentieth century OHS model in the present industrial context, and considers what framework will best provide for the well-being of creative workers and their enterprise in the twenty-first century. The paper questions the notion of "Risk" and the paradox associated with "Risk Management," particularly in the context of the creative process. Clearly, risk taking contributes to creative enterprise and effective risk management should accommodate both risk minimization and risk exploitation.

  17. Determination of Plutonium Isotope Ratios at Very Low Levels by ICP-MS using On-Line Electrochemically Modulated Separations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liezers, Martin; Lehn, Scott A; Olsen, Khris B

    2009-10-01

Electrochemically modulated separations (EMS) are shown to be a rapid and selective means of extracting and concentrating Pu from complex solutions prior to isotopic analysis by inductively coupled plasma mass spectrometry (ICP-MS). This separation is performed in a flow injection mode, on-line with the ICP-MS. A three-electrode, flow-by electrochemical cell is used to accumulate Pu at an anodized glassy carbon electrode by redox conversion of Pu(III) to Pu(IV&VI). The entire process takes place in 2% v/v (0.46M) HNO3. No redox chemicals or acid concentration changes are required. Plutonium accumulation and release is redox dependent and controlled by the applied cell potential. Thus large transient volumetric concentration enhancements can be achieved. Based on more negative U(IV) potentials relative to Pu(IV), separation of Pu from uranium is efficient, thereby eliminating uranium hydride interferences. EMS-ICP-MS isotope ratio measurement performance will be presented for femtogram to attogram level plutonium concentrations.

  18. Sample-based engine noise synthesis using an enhanced pitch-synchronous overlap-and-add method.

    PubMed

    Jagla, Jan; Maillard, Julien; Martin, Nadine

    2012-11-01

An algorithm for the real time synthesis of internal combustion engine noise is presented. Through the analysis of a recorded engine noise signal of continuously varying engine speed, a dataset of sound samples is extracted allowing the real time synthesis of the noise induced by arbitrary evolutions of engine speed. The sound samples are extracted from a recording spanning the entire engine speed range. Each sample is delimited so as to contain the sound emitted during one cycle of the engine plus the necessary overlap to ensure smooth transitions during the synthesis. The proposed approach, an extension of the PSOLA method introduced for speech processing, takes advantage of the specific periodicity of engine noise signals to locate the extraction instants of the sound samples. During the synthesis stage, the sound samples corresponding to the target engine speed evolution are concatenated with an overlap and add algorithm. It is shown that this method produces high quality audio restitution with a low computational load. It is therefore well suited for real time applications.
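    The core overlap-and-add step can be illustrated in a few lines of NumPy: window each extracted cycle and add it into the output with a partial overlap so the transitions stay smooth. This is a strongly simplified sketch of the PSOLA-style synthesis described above, not the authors' implementation; the Hann window, the 25% overlap and the synthetic sine "cycles" are illustrative assumptions.

```python
import numpy as np

def overlap_add(samples, overlap=0.25):
    """Toy overlap-and-add concatenation of engine-cycle sound samples.
    Each sample is Hann-windowed and added with `overlap` (fraction of its
    length) shared with the previous one, giving smooth transitions."""
    hops = [int(len(s) * (1.0 - overlap)) for s in samples]
    total = sum(hops[:-1]) + len(samples[-1])
    out = np.zeros(total)
    pos = 0
    for s, hop in zip(samples, hops):
        out[pos:pos + len(s)] += s * np.hanning(len(s))
        pos += hop
    return out

# Example: three pseudo "cycles" at slightly different fundamental frequencies.
fs = 8000
cycles = [np.sin(2 * np.pi * f * np.arange(int(fs / f * 4)) / fs)
          for f in (50.0, 55.0, 60.0)]
signal = overlap_add(cycles)
```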

  19. Evaluation of sensor arrays for engine oils using artificial oil alteration

    NASA Astrophysics Data System (ADS)

    Sen, Sedat; Schneidhofer, Christoph; Dörr, Nicole; Vellekoop, Michael J.

    2011-06-01

With respect to varying operation conditions, only sensors installed directly in the engine can detect the current oil condition, thereby making it possible to determine the right time for an oil change. Usually, a single parameter is not sufficient to obtain reliable information about the current oil condition. For this reason, appropriate sensor principles were evaluated for the design of sensor arrays for the measurement of critical lubricant parameters. In this contribution, we report on the development of a sensor array for engine oils using laboratory analyses of used engine oils for the correlation with sensor signals. The sensor array comprises the measurement of conductivity, permittivity, viscosity and temperature as well as oil corrosiveness as a consequence of acidification of the lubricant. As a key method, rapid evaluation of the sensors was done by short term simulation of entire oil change intervals based on artificial oil alteration. Thereby, the compatibility of the sensor array with the lubricant and the oil deterioration during the artificial alteration process was observed by the sensors and confirmed by additional laboratory analyses of the oil samples taken.

  20. The emergence of learning-teaching trajectories in education: a complex dynamic systems approach.

    PubMed

    Steenbeek, Henderien; van Geert, Paul

    2013-04-01

In this article we shall focus on learning-teaching trajectories - 'successful' as well as 'unsuccessful' ones - as emergent and dynamic phenomena resulting from the interactions in the entire educational context, in particular the interaction between students and teachers viewed as processes of intertwining self-, other- and co-regulation. The article provides a review of the educational research literature on action regulation in learning and teaching, and interprets this literature in light of the theory of complex dynamic systems. Based on this reinterpretation of the literature, two dynamic models are proposed, one focusing on the short-term dynamics of learning-teaching interactions as they take place in classrooms, the other focusing on the long-term dynamics of interactions in a network of variables encompassing concerns, evaluations, actions and action effects (such as learning) of students and teachers. The aim of presenting these models is to demonstrate, first, the possibility of transforming existing educational theory into dynamic models and, second, to provide some suggestions as to how such models can be used to further educational theory and practice.

  1. Study of the Cooldown and Warmup for the Eight Sectors of the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Liu, L.; Riddone, G.; Tavian, L.

    2004-06-01

The LHC cryogenic system is based on a five-point feed scheme with eight refrigerators serving the eight sectors of the LHC machine. The paper presents the simplified flow scheme of the eight sectors and the mathematical methods including the program flowchart and the boundary conditions to simulate the cooldown and warmup of these sectors. The methods take into account the effect of the pressure drop across the valves as well as the pressure evolution in the different headers of the cryogenic distribution line. The simulated pressure and temperature profiles of headers of the LHC sector during the cooldown and warmup are given, and the temperature evolution over the entire cooldown and warmup processes is presented. As a conclusion, the functions of the input temperature for the normal and fast cooldown and warmup, the cooldown and warmup time of each sector and the distributions of mass flow rates in each sector are summarized. The results indicate that it is possible to cool down any of the LHC sectors within 12.7 days in normal operation and 6.8 days in case of fast operation.

  2. Real-time earthquake data feasible

    NASA Astrophysics Data System (ADS)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity?Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Science's Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  3. Finding the numerical compensation in multiple criteria decision-making problems under fuzzy environment

    NASA Astrophysics Data System (ADS)

    Gupta, Mahima; Mohanty, B. K.

    2017-04-01

In this paper, we have developed a methodology to derive the level of compensation numerically in multiple criteria decision-making (MCDM) problems under a fuzzy environment. The degree of compensation is dependent on the tranquility and anxiety level experienced by the decision-maker while taking the decision. Higher tranquility leads to the higher realisation of the compensation whereas the increased level of anxiety reduces the amount of compensation in the decision process. This work determines the level of tranquility (or anxiety) using the concept of fuzzy sets and its various level sets. The concepts of indexing of fuzzy numbers, the risk barriers and the tranquility level of the decision-maker are used to derive the decision-maker's risk-prone or risk-averse attitude in each criterion. The aggregation of the risk levels in each criterion gives us the amount of compensation in the entire MCDM problem. Inclusion of the compensation leads us to model the MCDM problem as a binary integer programming (BIP) problem. The solution to the BIP gives us the compensatory decision for the MCDM problem. The proposed methodology is illustrated through a numerical example.

  4. Embedded Multimaterial Extrusion Bioprinting.

    PubMed

    Rocca, Marco; Fragasso, Alessio; Liu, Wanjun; Heinrich, Marcel A; Zhang, Yu Shrike

    2018-04-01

    Embedded extrusion bioprinting allows for the generation of complex structures that otherwise cannot be achieved with conventional layer-by-layer deposition from the bottom, by overcoming the limits imposed by gravitational force. By taking advantage of a hydrogel bath, serving as a sacrificial printing environment, it is feasible to extrude a bioink in freeform until the entire structure is deposited and crosslinked. The bioprinted structure can be subsequently released from the supporting hydrogel and used for further applications. Combining this advanced three-dimensional (3D) bioprinting technique with a multimaterial extrusion printhead setup enables the fabrication of complex volumetric structures built from multiple bioinks. The work described in this paper focuses on the optimization of the experimental setup and proposes a workflow to automate the bioprinting process, resulting in a fast and efficient conversion of a virtual 3D model into a physical, extruded structure in freeform using the multimaterial embedded bioprinting system. It is anticipated that further development of this technology will likely lead to widespread applications in areas such as tissue engineering, pharmaceutical testing, and organs-on-chips.

  5. Automatic and Direct Identification of Blink Components from Scalp EEG

    PubMed Central

    Kong, Wanzeng; Zhou, Zhanpeng; Hu, Sanqing; Zhang, Jianhai; Babiloni, Fabio; Dai, Guojun

    2013-01-01

Eye blink is an important and inevitable artifact during scalp electroencephalogram (EEG) recording. The main problem in EEG signal processing is how to identify eye blink components automatically with independent component analysis (ICA). Taking into account the fact that the eye blink as an external source has a higher sum of correlation with frontal EEG channels than all other sources due to both its location and significant amplitude, in this paper we propose a method based on a correlation index and the feature of power distribution to automatically detect eye blink components. Furthermore, we prove mathematically that the correlation between independent components and scalp EEG channels can be translated directly from the mixing matrix of ICA. This helps to simplify calculations and understand the implications of the correlation. The proposed method does not require selecting a template or thresholds in advance, and it works without simultaneously recording an electrooculography (EOG) reference. The experimental results demonstrate that the proposed method can automatically recognize eye blink components with a high accuracy on entire datasets from 15 subjects. PMID:23959240
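    A minimal version of the correlation-based selection described above can be put together with scikit-learn's FastICA: decompose the multichannel EEG, then rank components by their summed absolute correlation with the frontal channels. This sketch omits the paper's mixing-matrix shortcut and power-distribution feature, and the frontal channel indices are an assumption.

```python
import numpy as np
from sklearn.decomposition import FastICA

def detect_blink_components(eeg, frontal_idx, n_components=None, top_k=1):
    """Rank independent components by their summed absolute correlation
    with frontal channels; the highest-ranked ones are likely eye blinks.
    eeg: array of shape (n_channels, n_samples)."""
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(eeg.T).T          # (n_components, n_samples)
    scores = []
    for s in sources:
        corr = [abs(np.corrcoef(s, eeg[ch])[0, 1]) for ch in frontal_idx]
        scores.append(sum(corr))
    order = np.argsort(scores)[::-1]
    return order[:top_k], sources, ica
```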

  6. MMEJ-assisted gene knock-in using TALENs and CRISPR-Cas9 with the PITCh systems.

    PubMed

    Sakuma, Tetsushi; Nakade, Shota; Sakane, Yuto; Suzuki, Ken-Ichi T; Yamamoto, Takashi

    2016-01-01

    Programmable nucleases enable engineering of the genome by utilizing endogenous DNA double-strand break (DSB) repair pathways. Although homologous recombination (HR)-mediated gene knock-in is well established, it cannot necessarily be applied in every cell type and organism because of variable HR frequencies. We recently reported an alternative method of gene knock-in, named the PITCh (Precise Integration into Target Chromosome) system, assisted by microhomology-mediated end-joining (MMEJ). MMEJ harnesses independent machinery from HR, and it requires an extremely short homologous sequence (5-25 bp) for DSB repair, resulting in precise gene knock-in with a more easily constructed donor vector. Here we describe a streamlined protocol for PITCh knock-in, including the design and construction of the PITCh vectors, and their delivery to either human cell lines by transfection or to frog embryos by microinjection. The construction of the PITCh vectors requires only a few days, and the entire process takes ∼ 1.5 months to establish knocked-in cells or ∼ 1 week from injection to early genotyping in frog embryos.

  7. Dynamics of tunneling ionization using Bohmian mechanics

    NASA Astrophysics Data System (ADS)

    Douguet, Nicolas; Bartschat, Klaus

    2018-01-01

    Recent attoclock experiments and theoretical studies regarding the strong-field ionization of atoms by few-cycle infrared pulses revealed features that have attracted much attention. Here we investigate tunneling ionization and the dynamics of the electron probability using Bohmian mechanics. We consider a one-dimensional problem to illustrate the underlying mechanisms of the ionization process. It is revealed that in the major part of the below-the-barrier ionization regime, in an intense and short infrared pulse, the electron does not tunnel through the entire barrier, but rather starts already from the classically forbidden region. Moreover, we highlight the correspondence between the probability of locating the electron at a particular initial position and its asymptotic momentum. Bohmian mechanics also provides a natural definition of mean tunneling time and exit position, taking account of the time dependence of the barrier. Finally, we find that the electron can exit the barrier with significant kinetic energy, thereby corroborating the results of a recent study [N. Camus et al., Phys. Rev. Lett. 119, 023201 (2017), 10.1103/PhysRevLett.119.023201].
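    To illustrate how Bohmian trajectories are obtained from the quantum velocity field v = (hbar/m) Im(d ln psi / dx), the sketch below integrates trajectories for a freely spreading Gaussian packet, for which the exact scaling x(t) = x(0) * sigma(t)/sigma(0) is known and serves as a check. This is a toy free-particle example in arbitrary units, not the laser-driven barrier problem studied in the paper.

```python
import numpy as np

HBAR, M, SIGMA0 = 1.0, 1.0, 1.0     # arbitrary units, purely illustrative

def sigma(t):
    """Width of a freely spreading Gaussian packet that starts at rest."""
    return SIGMA0 * np.sqrt(1.0 + (HBAR * t / (2.0 * M * SIGMA0**2))**2)

def bohmian_velocity(x, t):
    """Bohmian velocity field for the free Gaussian packet, which reduces
    to the scaling field v(x, t) = x * sigma'(t) / sigma(t)."""
    dsigma = SIGMA0 * (HBAR / (2 * M * SIGMA0**2))**2 * t / np.sqrt(
        1.0 + (HBAR * t / (2.0 * M * SIGMA0**2))**2)
    return x * dsigma / sigma(t)

# Euler-integrate a fan of trajectories; they should track x0*sigma(t)/sigma0.
x0 = np.linspace(-2.0, 2.0, 9)
x, dt, t = x0.copy(), 0.001, 0.0
for _ in range(5000):
    x = x + dt * bohmian_velocity(x, t)
    t += dt
print(np.allclose(x, x0 * sigma(t) / SIGMA0, rtol=1e-2))
```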

  8. Large impacts of climatic warming on growth of boreal forests since 1960.

    PubMed

    Kauppi, Pekka E; Posch, Maximilian; Pirinen, Pentti

    2014-01-01

Boreal forests are sensitive to climatic warming, because low temperatures hold back ecosystem processes, such as the mobilization of nitrogen in soils. A greening of the boreal landscape has been observed using remote sensing, and the seasonal amplitude of CO2 in the northern hemisphere has increased, indicating warming effects on ecosystem productivity. However, field observations on responses of ecosystem productivity have been lacking on a large sub-biome scale. Here we report a significant increase in the annual growth of boreal forests in Finland in response to climatic warming, especially since 1990. This finding is obtained by linking meteorological records and forest inventory data on an area between 60° and 70° northern latitude. An additional increase in growth has occurred in response to changes in other drivers, such as forest management, nitrogen deposition and/or CO2 concentration. A similar warming impact can be expected in the entire boreal zone, where warming takes place. Given the large size of the boreal biome - more than ten million km^2 - important climate feedbacks are at stake, such as the future carbon balance, transpiration and albedo.

  9. Effects of aqueous humor hydrodynamics on human eye heat transfer under external heat sources.

    PubMed

    Tiang, Kor L; Ooi, Ean H

    2016-08-01

The majority of the eye models developed in the late 90s and early 00s considers only heat conduction inside the eye. This assumption is not entirely correct, since the anterior and posterior chambers are filled with aqueous humor (AH) that is constantly in motion due to thermally-induced buoyancy. In this paper, a three-dimensional model of the human eye is developed to investigate the effects AH hydrodynamics have on the human eye temperature under exposure to external heat sources. If the effects of AH flow are negligible, then future models can be developed without taking them into account, thus simplifying the modeling process. Two types of external thermal loads are considered: volumetric and surface irradiation. Results showed that heat convection due to AH flow contributes to nearly 95% of the total heat flow inside the anterior chamber. Moreover, the circulation inside the anterior chamber can cause an upward shift of the location of the hotspot. This can have significant consequences for our understanding of heat-induced cataractogenesis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  10. Identification of a killer by a definitive sneaker pattern and his beating instruments by their distinctive patterns.

    PubMed

    Zugibe, F T; Costello, J; Breithaupt, M

    1996-03-01

    A 39-year-old male service station attendant was found murdered on the floor of a gasoline service area by a passing motorist who had stopped for gas. The victim had been brutally beaten all over his entire body. After carefully examining the body and scene and taking selective photographs, special procedures were implemented in an attempt to preserve and transport the body without disturbing any items of evidence. In addition, specific evidentiary items were noted and collected for processing. The victim was meticulously examined externally at autopsy using a special protocol to locate clues that might assist in identifying a suspect or instrument of injury or death. Patterned impressions and subsequent DNA analysis proved successful in identifying the perpetrator of the crime and the instruments used in inflicting the beating. It is the purpose of this paper to show how a meticulous examination of the body for the presence of patterned injuries and critical studies of these patterns and impressions led to the identification of a killer and the instruments he used in a brutal beating.

  11. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  12. Development of a scintillating G-GEM detector for a 6-MeV X-band Linac for medical applications

    NASA Astrophysics Data System (ADS)

    Fujiwara, T.; Tanaka, S.; Mitsuya, Y.; Takahashi, H.; Tagi, K.; Kusano, J.; Tanabe, E.; Yamamoto, M.; Nakamura, N.; Dobashi, K.; Tomita, H.; Uesaka, M.

    2013-12-01

    We recently developed glass gas electron multipliers (G-GEMs) with an entirely new process using photo-etchable glass. The photo-etchable glass used for the substrate is called PEG3 (Hoya Corporation). Taking advantage of low outgassing material, we have envisioned a medical application of G-GEMs. A two-dimensional position-sensitive dosimetry system based on a scintillating gas detector is being developed for real-time dose distribution monitoring in X-ray radiation therapy. The dosimetry system consists of a chamber filled with an Ar/CF4 scintillating gas mixture, inside of which G-GEM structures are mounted. Photons produced by the excited Ar/CF4 gas molecules during the gas multiplication in the GEM holes are detected by a mirror-lens-CCD-camera system. We found that the intensity distribution of the measured light spot is proportional to the 2D dose distribution. In this work, we report on the first results from a scintillating G-GEM detector for a position-sensitive X-ray beam dosimeter.

  13. The governance of quality management in dutch health care: new developments and strategic challenges.

    PubMed

    Maarse, J A M; Ruwaard, D; Spreeuwenberg, C

    2013-01-01

    This article gives a brief sketch of quality management in Dutch health care. Our focus is upon the governance of guideline development and quality measurement. Governance is conceptualized as the structure and process of steering of quality management. The governance structure of guideline development in the Netherlands can be conceptualized as a network without central coordination. Much depends upon the self-initiative of stakeholders. A similar picture can be found in quality measurement. Special attention is given to the development of care standards for chronic disease. Care standards have a broader scope than guidelines and take an explicit patient perspective. They not only contain evidence-based and up-to-date guidelines for the care pathway but also contain standards for self-management. Furthermore, they comprise a set of indicators for measuring the quality of care of the entire pathway covered by the standard. The final part of the article discusses the mission, tasks and strategic challenges of the newly established National Health Care Institute (Zorginstituut Nederland), which is scheduled to be operative in 2013.

  14. Customerizing the clinical laboratory. Repositioning for enhanced service and a competitive advantage.

    PubMed

    Schuler, R S

    1989-01-01

    The call for excellence has never been louder, especially in the health-care industry. This call typically means increased service, i.e., faster, more accurate and, of course, friendlier service--all easier said than done, but qualities that make enhanced customer service so powerful. The excellent companies are learning that because it is so difficult to customerize, few competitors do so. Therefore, by devoting the time and effort necessary for customerization, they can move ahead of their competitors. But surpassing competitors by excellent service can be done inside of companies as well as outside. All units and departments have customers. The key to customerization inside is finding out what your customers want and behaving accordingly. The results go beyond enhanced customer satisfaction. They also include enhanced energy levels, reduced turnover, increased pride, and greater creativity for the newly customerized department. All it takes is an understanding of and dedication to customerization. Repositioning the existing department is critical to the success of any attempt to customerize. This article thoroughly describes customerization and the entire process of repositioning the clinical laboratory. One will not occur without the other.

  15. Hydrological simulation of Sperchios River basin in Central Greece using the MIKE SHE model and geographic information systems

    NASA Astrophysics Data System (ADS)

    Paparrizos, Spyridon; Maris, Fotios

    2017-05-01

The MIKE SHE model is able to simulate the entire stream flow, which includes direct and basic flow. Many models either do not simulate the basic flow or use simplistic methods to determine it. The MIKE SHE model takes into account a wide range of hydrological data. Since this study was directed towards the simulation of surface runoff and infiltration into the saturated and unsaturated zones, MIKE SHE is an appropriate model for reliable conclusions. In the current research, the MIKE SHE model was used to simulate runoff in the area of the Sperchios River basin. Meteorological data from eight rainfall stations within the Sperchios River basin were used as inputs. Vegetation as well as geological data was used to perform the calibration and validation of the physical processes of the model. Additionally, the ArcGIS program was used. The results indicated that the model was able to simulate the surface runoff satisfactorily, representing all the hydrological data adequately. Some minor differences appeared, which can be eliminated with the appropriate adjustments that can be decided by the researcher's experience.

  16. Ensembler: Enabling High-Throughput Molecular Simulations at the Superfamily Scale.

    PubMed

    Parton, Daniel L; Grinaway, Patrick B; Hanson, Sonya M; Beauchamp, Kyle A; Chodera, John D

    2016-06-01

The rapidly expanding body of available genomic and protein structural data provides a rich resource for understanding protein dynamics with biomolecular simulation. While computational infrastructure has grown rapidly, simulations on an omics scale are not yet widespread, primarily because software infrastructure to enable simulations at this scale has not kept pace. It should now be possible to study protein dynamics across entire (super)families, exploiting both available structural biology data and conformational similarities across homologous proteins. Here, we present a new tool for enabling high-throughput simulation in the genomics era. Ensembler takes any set of sequences - from a single sequence to an entire superfamily - and shepherds them through various stages of modeling and refinement to produce simulation-ready structures. This includes comparative modeling to all relevant PDB structures (which may span multiple conformational states of interest), reconstruction of missing loops, addition of missing atoms, culling of nearly identical structures, assignment of appropriate protonation states, solvation in explicit solvent, and refinement and filtering with molecular simulation to ensure stable simulation. The output of this pipeline is an ensemble of structures ready for subsequent molecular simulations using computer clusters, supercomputers, or distributed computing projects like Folding@home. Ensembler thus automates much of the time-consuming process of preparing protein models suitable for simulation, while allowing scalability up to entire superfamilies. A particular advantage of this approach can be found in the construction of kinetic models of conformational dynamics - such as Markov state models (MSMs) - which benefit from a diverse array of initial configurations that span the accessible conformational states to aid sampling. We demonstrate the power of this approach by constructing models for all catalytic domains in the human tyrosine kinase family, using all available kinase catalytic domain structures from any organism as structural templates. Ensembler is free and open source software licensed under the GNU General Public License (GPL) v2. It is compatible with Linux and OS X. The latest release can be installed via the conda package manager, and the latest source can be downloaded from https://github.com/choderalab/ensembler.

  17. Life-cycle evaluation of nitrogen-use in rice-farming systems: implications for economically-optimal nitrogen rates

    NASA Astrophysics Data System (ADS)

    Xia, Y.; Yan, X.

    2011-11-01

Nitrogen (N) fertilizer plays an important role in agricultural systems in terms of food yield. However, N application rates (NARs) are often overestimated over the rice (Oryza sativa L.) growing season in the Taihu Lake region of China. This is largely because negative externalities are not entirely included when evaluating the economically-optimal nitrogen rate (EONR): for example, only individual N losses are taken into account, or the inventory flows of reactive N are limited solely to the farming process when evaluating the environmental and economic effects of N fertilizer. This study integrates important material and energy flows resulting from N use into a rice agricultural inventory that constitutes the hub of the life-cycle assessment (LCA) method. An economic evaluation is used to determine an environmental and economic NAR for the Taihu Lake region. The analysis reveals that production and exploitation processes consume the largest proportion of resources, accounting for 77.2 % and 22.3 % of total resources, respectively. Regarding environmental impact, global warming creates the highest cost with contributions stemming mostly from fertilizer production and farming processes. The farming process incurs the biggest environmental impact of the three environmental impact categories considered, whereas transportation has a much smaller effect. When taking into account resource consumption and environmental cost, the marginal benefit of 1 kg rice would decrease from 2.4 to only 1.05 yuan. Accordingly, our current EONR has been evaluated at 187 kg N ha^-1 for a single rice-growing season. This could enhance profitability, as well as reduce the N losses associated with rice growing.

  18. Statistical Comparisons of Meso- and Small-Scale Field-Aligned Currents with Auroral Electron Acceleration Mechanisms from FAST Observations

    NASA Astrophysics Data System (ADS)

    Dombeck, J. P.; Cattell, C. A.; Prasad, N.; Sakher, A.; Hanson, E.; McFadden, J. P.; Strangeway, R. J.

    2016-12-01

Field-aligned currents (FACs) provide a fundamental driver and means of Magnetosphere-Ionosphere (M-I) coupling. These currents need to be supported by local physics along the entire field line generally with quasi-static potential structures, but also supporting the time-evolution of the structures and currents, producing Alfvén waves and Alfvénic electron acceleration. In regions of upward current, precipitating auroral electrons are accelerated earthward. These processes can result in ion outflow, changes in ionospheric conductivity, and affect the particle distributions on the field line, affecting the M-I coupling processes supporting the individual FACs and potentially the entire FAC system. The FAST mission was well suited to study both the FACs and the electron auroral acceleration processes. We present the results of the comparisons between meso- and small-scale FACs determined from FAST using the method of Peria et al. (2000), and our FAST auroral acceleration mechanism study when such identification is possible for the entire ~13-year FAST mission. We also present the latest results of the electron energy (and number) flux ionospheric input based on acceleration mechanism (and FAC characteristics) from our FAST auroral acceleration mechanism study.

  19. Bully Victimization: Selection and Influence Within Adolescent Friendship Networks and Cliques.

    PubMed

    Lodder, Gerine M A; Scholte, Ron H J; Cillessen, Antonius H N; Giletta, Matteo

    2016-01-01

Adolescents tend to form friendships with similar peers and, in turn, their friends further influence adolescents' behaviors and attitudes. Emerging work has shown that these selection and influence processes also might extend to bully victimization. However, no prior work has examined selection and influence effects involved in bully victimization within cliques, despite theoretical accounts emphasizing the importance of cliques in this regard. This study examined selection and influence processes in adolescence regarding bully victimization both at the level of the entire friendship network and the level of cliques. We used a two-wave design (5-month interval). Participants were 543 adolescents (50.1% male, Mage = 15.8) in secondary education. Stochastic actor-based models indicated that at the level of the larger friendship network, adolescents tended to select friends with similar levels of bully victimization as they themselves had. In addition, adolescent friends influenced each other in terms of bully victimization over time. Actor-Partner Interdependence models showed that similarities in bully victimization between clique members were not due to selection of clique members. For boys, average clique bully victimization predicted individual bully victimization over time (influence), but not vice versa. No influence was found for girls, indicating that different mechanisms may underlie friend influence on bully victimization for girls and boys. The differences in results at the level of the larger friendship network versus the clique emphasize the importance of taking the type of friendship ties into account in research on selection and influence processes involved in bully victimization.

  20. Three-dimensional reconstruction of teeth and jaws based on segmentation of CT images using watershed transformation.

    PubMed

    Naumovich, S S; Naumovich, S A; Goncharenko, V G

    2015-01-01

The objective of the present study was the development and clinical testing of a three-dimensional (3D) reconstruction method for the teeth and the bone tissue of the jaw on the basis of CT images of the maxillofacial region. 3D reconstruction was performed using specially designed original software based on watershed transformation. Computed tomograms in Digital Imaging and Communications in Medicine (DICOM) format obtained on multispiral CT and CBCT scanners were used for the creation of 3D models of the teeth and the jaws. The processing algorithm is realized as stepwise threshold image segmentation, with markers placed in multiplanar projection mode in areas corresponding to the teeth and the bone tissue. The developed software initially creates coarse 3D models of the entire dentition and the jaw. Then, dedicated procedures refine the model of the jaw and cut the dentition into separate teeth. The proper selection of the segmentation threshold is very important for CBCT images, which have low contrast and a high noise level. The developed semi-automatic algorithm for multispiral and cone beam computed tomogram processing allows 3D models of the teeth to be created, separating them from the bone tissue of the jaws. The software is easy to install in a dentist's workplace, has an intuitive interface and requires little processing time. The obtained 3D models can be used for solving a wide range of scientific and clinical tasks.
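    The marker-plus-watershed idea can be tried in two dimensions with scikit-image: threshold the CT slice to place background and tooth markers, use a gradient image as the relief, run the watershed, and label connected components to separate individual teeth. This is a simplified 2-D sketch, not the authors' software; the threshold values and the Sobel relief are assumptions that would need tuning on real data.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed

def segment_slice(ct_slice, tooth_threshold, bone_threshold):
    """Marker-based watershed segmentation of one CT slice.
    Returns a label image of individual connected 'tooth' regions and
    their count.  Thresholds are in CT intensity units."""
    markers = np.zeros(ct_slice.shape, dtype=int)
    markers[ct_slice < bone_threshold] = 1      # background / soft tissue seeds
    markers[ct_slice > tooth_threshold] = 2     # candidate tooth seeds
    elevation = sobel(ct_slice)                 # gradient image as the relief
    labels = watershed(elevation, markers)
    teeth_mask = labels == 2
    individual, n = ndi.label(teeth_mask)       # split mask into separate teeth
    return individual, n
```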

  1. The second laws of quantum thermodynamics.

    PubMed

    Brandão, Fernando; Horodecki, Michał; Ng, Nelly; Oppenheim, Jonathan; Wehner, Stephanie

    2015-03-17

    The second law of thermodynamics places constraints on state transformations. It applies to systems composed of many particles, however, we are seeing that one can formulate laws of thermodynamics when only a small number of particles are interacting with a heat bath. Is there a second law of thermodynamics in this regime? Here, we find that for processes which are approximately cyclic, the second law for microscopic systems takes on a different form compared to the macroscopic scale, imposing not just one constraint on state transformations, but an entire family of constraints. We find a family of free energies which generalize the traditional one, and show that they can never increase. The ordinary second law relates to one of these, with the remainder imposing additional constraints on thermodynamic transitions. We find three regimes which determine which family of second laws govern state transitions, depending on how cyclic the process is. In one regime one can cause an apparent violation of the usual second law, through a process of embezzling work from a large system which remains arbitrarily close to its original state. These second laws are relevant for small systems, and also apply to individual macroscopic systems interacting via long-range interactions. By making precise the definition of thermal operations, the laws of thermodynamics are unified in this framework, with the first law defining the class of operations, the zeroth law emerging as an equivalence relation between thermal states, and the remaining laws being monotonicity of our generalized free energies.
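    Schematically, the family of generalized free energies referred to above can be written in terms of Rényi divergences relative to the thermal state; a common form (for states block-diagonal in the energy basis, with conventions that may differ in detail from the paper) is:

```latex
% Generalized free energies built from Renyi divergences (schematic form).
F_\alpha(\rho) = k_B T \, D_\alpha\!\left(\rho \,\middle\|\, \rho_{\mathrm{th}}\right) - k_B T \ln Z,
\qquad
D_\alpha(\rho\|\sigma) = \frac{\operatorname{sgn}(\alpha)}{\alpha-1}
\ln \operatorname{Tr}\!\left(\rho^{\alpha}\sigma^{1-\alpha}\right),
```

    with the second laws stating that each F_alpha is non-increasing under thermal operations, F_alpha(rho_final) <= F_alpha(rho_initial), and alpha = 1 recovering the ordinary free energy.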

  2. The second laws of quantum thermodynamics

    PubMed Central

    Brandão, Fernando; Horodecki, Michał; Ng, Nelly; Oppenheim, Jonathan; Wehner, Stephanie

    2015-01-01

    The second law of thermodynamics places constraints on state transformations. It applies to systems composed of many particles, however, we are seeing that one can formulate laws of thermodynamics when only a small number of particles are interacting with a heat bath. Is there a second law of thermodynamics in this regime? Here, we find that for processes which are approximately cyclic, the second law for microscopic systems takes on a different form compared to the macroscopic scale, imposing not just one constraint on state transformations, but an entire family of constraints. We find a family of free energies which generalize the traditional one, and show that they can never increase. The ordinary second law relates to one of these, with the remainder imposing additional constraints on thermodynamic transitions. We find three regimes which determine which family of second laws govern state transitions, depending on how cyclic the process is. In one regime one can cause an apparent violation of the usual second law, through a process of embezzling work from a large system which remains arbitrarily close to its original state. These second laws are relevant for small systems, and also apply to individual macroscopic systems interacting via long-range interactions. By making precise the definition of thermal operations, the laws of thermodynamics are unified in this framework, with the first law defining the class of operations, the zeroth law emerging as an equivalence relation between thermal states, and the remaining laws being monotonicity of our generalized free energies. PMID:25675476

  3. Trapping in scale-free networks with hierarchical organization of modularity.

    PubMed

    Zhang, Zhongzhi; Lin, Yuan; Gao, Shuyang; Zhou, Shuigeng; Guan, Jihong; Li, Mo

    2009-11-01

    A wide variety of real-life networks share two remarkable generic topological properties: scale-free behavior and modular organization, and it is natural and important to study how these two features affect the dynamical processes taking place on such networks. In this paper, we investigate a simple stochastic process--trapping problem, a random walk with a perfect trap fixed at a given location, performed on a family of hierarchical networks that exhibit simultaneously striking scale-free and modular structure. We focus on a particular case with the immobile trap positioned at the hub node having the largest degree. Using a method based on generating functions, we determine explicitly the mean first-passage time (MFPT) for the trapping problem, which is the mean of the node-to-trap first-passage time over the entire network. The exact expression for the MFPT is calculated through the recurrence relations derived from the special construction of the hierarchical networks. The obtained rigorous formula corroborated by extensive direct numerical calculations exhibits that the MFPT grows algebraically with the network order. Concretely, the MFPT increases as a power-law function of the number of nodes with the exponent much less than 1. We demonstrate that the hierarchical networks under consideration have more efficient structure for transport by diffusion in contrast with other analytically soluble media including some previously studied scale-free networks. We argue that the scale-free and modular topologies are responsible for the high efficiency of the trapping process on the hierarchical networks.
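    For readers who want to experiment with trapping times numerically, the sketch below estimates the MFPT to a hub trap by direct random-walk simulation. It uses a Barabási-Albert graph from networkx as a convenient scale-free stand-in; the paper's exact generating-function results apply to a specific hierarchical modular network family, so the numbers obtained here are only qualitative.

```python
import random
import networkx as nx

def mean_first_passage_time(G, trap, n_walks=2000, rng=None):
    """Estimate the MFPT to `trap` by averaging unbiased random walks
    started from uniformly random non-trap nodes."""
    rng = rng or random.Random(1)
    nodes = [v for v in G if v != trap]
    total = 0
    for _ in range(n_walks):
        v = rng.choice(nodes)
        steps = 0
        while v != trap:
            v = rng.choice(list(G.neighbors(v)))
            steps += 1
        total += steps
    return total / n_walks

# Trap placed at the largest-degree hub, mirroring the setup in the paper.
G = nx.barabasi_albert_graph(512, 3, seed=0)
hub = max(G.degree, key=lambda kv: kv[1])[0]
print(mean_first_passage_time(G, hub))
```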

  4. Translating evidence-based guidelines to improve feedback practices: the interACT case study.

    PubMed

    Barton, Karen L; Schofield, Susie J; McAleer, Sean; Ajjawi, Rola

    2016-02-09

    There has been a substantial body of research examining feedback practices, yet the assessment and feedback landscape in higher education is described as 'stubbornly resistant to change'. The aim of this paper is to present a case study demonstrating how an entire programme's assessment and feedback practices were re-engineered and evaluated in line with evidence from the literature in the interACT (Interaction and Collaboration via Technology) project. Informed by action research the project conducted two cycles of planning, action, evaluation and reflection. Four key pedagogical principles informed the re-design of the assessment and feedback practices. Evaluation activities included document analysis, interviews with staff (n = 10) and students (n = 7), and student questionnaires (n = 54). Descriptive statistics were used to analyse the questionnaire data. Framework thematic analysis was used to develop themes across the interview data. InterACT was reported by students and staff to promote self-evaluation, engagement with feedback and feedback dialogue. Streamlining the process after the first cycle of action research was crucial for improving engagement of students and staff. The interACT process of promoting self-evaluation, reflection on feedback, feedback dialogue and longitudinal perspectives of feedback has clear benefits and should be transferable to other contexts. InterACT has involved comprehensive re-engineering of the assessment and feedback processes using educational principles to guide the design taking into account stakeholder perspectives. These principles and the strategies to enact them should be transferable to other contexts.

  5. 50 CFR 260.6 - Terms defined.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... preparation of the product from its raw state through each step in the entire process; or observe conditions... under the regulations in this part which has been preserved by any recognized commercial process..., or by fermentation. Quality. “Quality” means the inherent properties of any processed product which...

  6. Social outcomes due to the interplay between language competition and ideology struggle

    NASA Astrophysics Data System (ADS)

    Barreira da Silva Rocha, André

    2018-02-01

    I study the interplay between language competition and ideology struggle in a country where there are two competing languages. Language transition is governed by a three-state model similar to Minett-Wang (2008) and Heinsalu et al. (2014). In this class of models, I further assume that among monolinguals of one of the competing languages there is an ideology struggle between assimilationist individuals who accept to deal with foreign language speakers and nationalist individuals who oppose any form of foreign culture. Ideology transition follows a two-state model as in Abrams-Strogatz (2003). Depending on both ideology and language status, the possible equilibria show that when nationalism is introduced in the language competition model, complete assimilation might take place and one language disappears with the entire population becoming monolingual. On the other hand, when bilingualism emerges, it is associated with a segregated society with a bilingual group surviving in the long run together with an isolated monolingual group entirely composed of nationalist individuals. Another kind of segregation might also emerge in the equilibrium, in which two monolingual groups survive in the long run, one of them entirely composed of nationalists. In the latter case, both linguistic segregation and isolation are the negative social outcomes of ideology struggle.
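    The two-state Abrams-Strogatz dynamics cited above is easy to reproduce and gives a feel for the "complete assimilation" outcome: with unequal status, one language typically dies out. The sketch below integrates dx/dt = (1-x)*s*x^a - x*(1-s)*(1-x)^a with forward Euler. The three-state (bilingual) and ideology extensions of the paper are not included, and the parameter values are illustrative.

```python
import numpy as np

def abrams_strogatz(x0, s, a=1.31, t_max=200.0, dt=0.01):
    """Forward-Euler integration of the two-state Abrams-Strogatz model,
    where x is the fraction speaking language X and s its status."""
    x = x0
    xs = [x]
    for _ in range(int(t_max / dt)):
        dx = (1 - x) * s * x**a - x * (1 - s) * (1 - x)**a
        x += dt * dx
        xs.append(x)
    return np.array(xs)

# Starting above the unstable interior fixed point, X takes over completely.
trajectory = abrams_strogatz(x0=0.4, s=0.6)
print(trajectory[-1])   # close to 1: the population ends up monolingual in X
```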

  7. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  8. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  9. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  10. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  11. Pragmatics as Metacognitive Control

    PubMed Central

    Kissine, Mikhail

    2016-01-01

    The term “pragmatics” is often used to refer without distinction, on one hand, to the contextual selection of interpretation norms and, on the other hand, to the context-sensitive processes guided by these norms. Pragmatics in the first acception depends on language-independent contextual factors that can, but need not, involve Theory of Mind; in the second acception, pragmatics is a language-specific metacognitive process, which may unfold at an unconscious level without involving any mental state (meta-)representation. Distinguishing between these two kinds of ways context drives the interpretation of communicative stimuli helps dissolve the dispute between proponents of an entirely Gricean pragmatics and those who claim that some pragmatic processes do not depend on mind-reading capacities. According to the model defended in this paper, the typology of pragmatic processes is not entirely determined by a hierarchy of meanings, but by contextually set norms of interpretation. PMID:26834671

  12. Pragmatics as Metacognitive Control.

    PubMed

    Kissine, Mikhail

    2015-01-01

    The term "pragmatics" is often used to refer without distinction, on one hand, to the contextual selection of interpretation norms and, on the other hand, to the context-sensitive processes guided by these norms. Pragmatics in the first acception depends on language-independent contextual factors that can, but need not, involve Theory of Mind; in the second acception, pragmatics is a language-specific metacognitive process, which may unfold at an unconscious level without involving any mental state (meta-)representation. Distinguishing between these two kinds of ways context drives the interpretation of communicative stimuli helps dissolve the dispute between proponents of an entirely Gricean pragmatics and those who claim that some pragmatic processes do not depend on mind-reading capacities. According to the model defended in this paper, the typology of pragmatic processes is not entirely determined by a hierarchy of meanings, but by contextually set norms of interpretation.

  13. Water displacement mercury pump

    DOEpatents

    Nielsen, Marshall G.

    1985-01-01

    A water displacement mercury pump has a fluid inlet conduit and diffuser, a valve, a pressure cannister, and a fluid outlet conduit. The valve has a valve head which seats in an opening in the cannister. The entire assembly is readily insertable into a process vessel which produces mercury as a product. As the mercury settles, it flows into the opening in the cannister displacing lighter material. When the valve is in a closed position, the pressure cannister is sealed except for the fluid inlet conduit and the fluid outlet conduit. Introduction of a lighter fluid into the cannister will act to displace a heavier fluid from the cannister via the fluid outlet conduit. The entire pump assembly penetrates only a top wall of the process vessel, and not the sides or the bottom wall of the process vessel. This insures a leak-proof environment and is especially suitable for processing of hazardous materials.

  14. Water displacement mercury pump

    DOEpatents

    Nielsen, M.G.

    1984-04-20

    A water displacement mercury pump has a fluid inlet conduit and diffuser, a valve, a pressure cannister, and a fluid outlet conduit. The valve has a valve head which seats in an opening in the cannister. The entire assembly is readily insertable into a process vessel which produces mercury as a product. As the mercury settles, it flows into the opening in the cannister displacing lighter material. When the valve is in a closed position, the pressure cannister is sealed except for the fluid inlet conduit and the fluid outlet conduit. Introduction of a lighter fluid into the cannister will act to displace a heavier fluid from the cannister via the fluid outlet conduit. The entire pump assembly penetrates only a top wall of the process vessel, and not the sides or the bottom wall of the process vessel. This insures a leak-proof environment and is especially suitable for processing of hazardous materials.

  15. Rigor + Results = Impact: Measuring Impact with Integrity (Invited)

    NASA Astrophysics Data System (ADS)

    Davis, H. B.; Scalice, D.

    2013-12-01

    Are you struggling to measure and explain the impact of your EPO efforts? The NASA Astrobiology Institute (NAI) is using an evaluation process to determine the impact of its 15 EPO projects with over 200 activities. What is the current impact? How can it be improved in the future? We have developed a process that preserves autonomy at the project implementation level while still painting a picture of the entire portfolio. The impact evaluation process looks at an education/public outreach activity through its entire project cycle. Working with an external evaluator, education leads: 1) rate the quality/health of an activity in each stage of its cycle, and 2) determine the impact based on the results of the evaluation and the rigor of the methods used. The process has created a way to systematically codify a project's health and its impact, while offering support for improving both impact and how it is measured.

  16. Improve medical malpractice law by letting health care insurers take charge.

    PubMed

    Reinker, Kenneth S; Rosenberg, David

    2011-01-01

    This essay discusses unlimited insurance subrogation (UIS) as a means of improving the deterrence and compensation results of medical malpractice law. Under UIS, health care insureds could assign their entire potential medical malpractice claims to their first-party commercial and government insurers. UIS should improve deterrence by establishing first-party insurers as plaintiffs to confront liability insurers on the defense side, leading to more effective prosecution of meritorious claims and reducing meritless and unnecessary litigation. UIS should improve compensation outcomes by converting litigation cost- and risk- laden "tort insurance" into cheaper and enhanced first-party insurance. UIS also promises dynamic benefits through further reforms by contract between the first-party and liability insurers that would take charge of the system. No UIS-related costs are apparent that would outweigh these benefits. © 2011 American Society of Law, Medicine & Ethics, Inc.

  17. Genomic imprinting in Drosophila has properties of both mammalian and insect imprinting.

    PubMed

    Anaka, Matthew; Lynn, Audra; McGinn, Patrick; Lloyd, Vett K

    2009-02-01

    Genomic imprinting is a process that marks DNA, causing a change in gene or chromosome behavior, depending on the sex of the transmitting parent. In mammals, most examples of genomic imprinting affect the transcription of individual or small clusters of genes whereas in insects, genomic imprinting tends to silence entire chromosomes. This has been interpreted as evidence of independent evolutionary origins for imprinting. To investigate how these types of imprinting are related, we performed a phenotypic, molecular, and cytological analysis of an imprinted chromosome in Drosophila melanogaster. Analysis of this chromosome reveals that the imprint results in transcriptional silencing. Yet, the domain of transcriptional silencing is very large, extending at least 1.2 Mb and encompassing over 100 genes, and is associated with decreased somatic polytenization of the entire chromosome. We propose that repression of somatic replication in polytenized cells, as a secondary response to the imprint, acts to extend the size of the imprinted domain to an entire chromosome. Thus, imprinting in Drosophila has properties of both typical mammalian and insect imprinting which suggests that genomic imprinting in Drosophila and mammals is not fundamentally different; imprinting is manifest as transcriptional silencing of a few genes or silencing of an entire chromosome depending on secondary processes such as differences in gene density and polytenization.

  18. The sales learning curve.

    PubMed

    Leslie, Mark; Holloway, Charles A

    2006-01-01

    When a company launches a new product into a new market, the temptation is to immediately ramp up sales force capacity to gain customers as quickly as possible. But hiring a full sales force too early just causes the firm to burn through cash and fail to meet revenue expectations. Before it can sell an innovative product efficiently, the entire organization needs to learn how customers will acquire and use it, a process the authors call the sales learning curve. The concept of a learning curve is well understood in manufacturing. Employees transfer knowledge and experience back and forth between the production line and purchasing, manufacturing, engineering, planning, and operations. The sales learning curve unfolds similarly through the give-and-take between the company--marketing, sales, product support, and product development--and its customers. As customers adopt the product, the firm modifies both the offering and the processes associated with making and selling it. Progress along the manufacturing curve is measured by tracking cost per unit: The more a firm learns about the manufacturing process, the more efficient it becomes, and the lower the unit cost goes. Progress along the sales learning curve is measured in an analogous way: The more a company learns about the sales process, the more efficient it becomes at selling, and the higher the sales yield. As the sales yield increases, the sales learning process unfolds in three distinct phases--initiation, transition, and execution. Each phase requires a different size--and kind--of sales force and represents a different stage in a company's production, marketing, and sales strategies. Adjusting those strategies as the firm progresses along the sales learning curve allows managers to plan resource allocation more accurately, set appropriate expectations, avoid disastrous cash shortfalls, and reduce both the time and money required to turn a profit.

  19. Spatiotemporal patterns and source attribution of nitrogen pollution in a typical headwater agricultural watershed in Southeastern China.

    PubMed

    Chen, Wenjun; He, Bin; Nover, Daniel; Duan, Weili; Luo, Chuan; Zhao, Kaiyan; Chen, Wen

    2018-01-01

    Excessive nitrogen (N) discharge from agriculture causes widespread problems in aquatic ecosystems. Knowledge of spatiotemporal patterns and source attribution of N pollution is critical for nutrient management programs but is poorly studied in headwaters with various small water bodies and mini-point pollution sources. Taking a typical small watershed in the low mountains of Southeastern China as an example, N pollution and source attribution were studied for a multipond system around a village using the Hydrological Simulation Program-Fortran (HSPF) model. The results exhibited distinctive spatio-seasonal variations with an overall seriousness rank for the three indicators: total nitrogen (TN) > nitrate/nitrite nitrogen (NOx⁻-N) > ammonia nitrogen (NH3-N), according to the Chinese Surface Water Quality Standard. TN pollution was severe for the entire watershed, while NOx⁻-N pollution was significant for ponds and ditches far from the village, and the NH3-N concentrations were acceptable except for the ponds near the village in summer. Although food and cash crop production accounted for the largest source of N loads, we discovered that mini-point pollution sources, including animal feeding operations, rural residential sewage, and waste, together contributed as much as 47% of the TN and NH3-N loads in ponds and ditches. So, apart from eco-fertilizer programs and concentrated animal feeding operations, the importance of environmental awareness building for resource management is highlighted for small farmers in headwater agricultural watersheds. As a first attempt to incorporate multipond systems into the process-based modeling of nonpoint source (NPS) pollution, this work can inform other hydro-environmental studies on scattered and small water bodies. The results are also useful for water quality improvement for entire river basins.

  20. Simulations of fully deformed oscillating flux tubes

    NASA Astrophysics Data System (ADS)

    Karampelas, K.; Van Doorsselaere, T.

    2018-02-01

    Context. In recent years, a number of numerical studies have been focusing on the significance of the Kelvin-Helmholtz instability in the dynamics of oscillating coronal loops. This process enhances the transfer of energy into smaller scales, and has been connected with heating of coronal loops, when dissipation mechanisms, such as resistivity, are considered. However, the turbulent layer is expected near the outer regions of the loops. Therefore, the effects of wave heating are expected to be confined to the loop's external layers, leaving their denser inner parts without a heating mechanism. Aim. In the current work we aim to study the spatial evolution of wave heating effects from a footpoint driven standing kink wave in a coronal loop. Methods: Using the MPI-AMRVAC code, we performed ideal, three dimensional magnetohydrodynamic simulations of footpoint driven transverse oscillations of a cold, straight coronal flux tube, embedded in a hotter environment. We have also constructed forward models for our simulation using the FoMo code. Results: The developed transverse wave induced Kelvin-Helmholtz (TWIKH) rolls expand throughout the tube cross-section, and cover it entirely. This turbulence significantly alters the initial density profile, leading to a fully deformed cross section. As a consequence, the resistive and viscous heating rate both increase over the entire loop cross section. The resistive heating rate takes its maximum values near the footpoints, while the viscous heating rate peaks at the apex. Conclusions: We conclude that even a monoperiodic driver can spread wave heating over the whole loop cross section, potentially providing a heating source in the inner loop region. Despite the loop's fully deformed structure, forward modelling still shows the structure appearing as a loop. A movie attached to Fig. 1 is available at https://www.aanda.org

  1. Investigating and predicting landslides using a rainfall runoff model in Norway

    NASA Astrophysics Data System (ADS)

    Kråbøl, Eline; Skaugen, Thomas; Devoli, Graziella; Xu, Chong-Yu

    2016-04-01

    Landslides are amongst the most destructive natural hazards, causing damage to infrastructure, such as roads, railroads and houses, and can, in a worst-case scenario, take lives. A better understanding of the triggering processes of landslides is important as it enables us to perform better forecasts, improve mapping of zones with landslide risk and carry out mitigation measures. In this study, a parameter-parsimonious rainfall-runoff model, DDD (Distance Distribution Dynamics), is used to simulate the hydrological conditions for rainfall-induced landslide events. The model estimates the capacity of the subsurface reservoir at different levels of saturation and predicts overland flow. The subsurface in the DDD has a 2-D representation in that it calculates the saturated and unsaturated soil moisture along a hillslope representing the entire catchment in question. In this study, 50 landslide events in 10 catchments in Southern Norway are investigated. Characteristics of the subsurface states before, during and after the landslide are analysed for the whole catchment and at three points (lower, middle and upper part) of the hillslope. Preliminary results show that the hysteretic loops of storage and discharge follow complex clockwise and anti-clockwise patterns. Anti-clockwise loops occur more frequently, except for the middle part of the hillslope. In the upper part of the hillslope, anti-clockwise loops occur almost exclusively (94 %). Evaluated for the entire catchment, 57 % of the landslide events occurred at maximum saturation, while 77 % of the events occurred at saturation above 80 %. We found the majority of the landslide events to be associated with the rising limb and the top of the hysteretic curve, with 64 % and 17 %, respectively. Overland flow was found for 68 % of the events.
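
    The clockwise/anti-clockwise labelling of the storage-discharge hysteresis described above can be illustrated with a short sketch (an illustration only, not part of the DDD model; the variable names, the synthetic event and the signed-area criterion are assumptions): the sign of the area enclosed by the storage-discharge loop, computed with the shoelace formula, gives the loop direction.

        # Minimal sketch: classify the direction of a storage-discharge hysteresis loop.
        # Assumes storage (S) and discharge (Q) series sampled around one event; the
        # signed (shoelace) area is taken as positive for an anti-clockwise loop in the
        # S-Q plane. This convention is an assumption, not the paper's definition.
        import numpy as np

        def loop_direction(storage, discharge):
            s = np.asarray(storage, dtype=float)
            q = np.asarray(discharge, dtype=float)
            # Signed area of the closed polygon traced by (S, Q) over the event.
            area = 0.5 * np.sum(s * np.roll(q, -1) - np.roll(s, -1) * q)
            return "anti-clockwise" if area > 0 else "clockwise"

        # Synthetic event in which discharge lags storage.
        t = np.linspace(0.0, 2.0 * np.pi, 50)
        print(loop_direction(np.sin(t), np.sin(t - 0.5)))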

  2. FRESHEM - Fresh-saline groundwater distribution in Zeeland (NL) derived from airborne EM

    NASA Astrophysics Data System (ADS)

    Siemon, Bernhard; van Baaren, Esther; Dabekaussen, Willem; Delsman, Joost; Gunnik, Jan; Karaoulis, Marios; de Louw, Perry; Oude Essink, Gualbert; Pauw, Pieter; Steuer, Annika; Meyer, Uwe

    2017-04-01

    In a setting of predominantly saline surface waters, the availability of fresh water for agricultural purposes is not obvious in Zeeland, The Netherlands. Canals and ditches are mainly brackish to saline due to saline seepage, which originates from old marine deposits and salt-water transgressions during historical times. The only available fresh groundwater is present in the form of freshwater lenses floating on top of the saline groundwater. This fresh groundwater is vital for agricultural, industrial, ecological, water conservation and drinking water functions. An essential first step for managing this fresh groundwater properly is to know the present spatial fresh-brackish-saline groundwater distribution. As traditional salinity monitoring is labour-intensive, airborne electromagnetics (AEM), which is fast and can cover large areas in a short time, is an efficient alternative. A consortium of BGR, Deltares and TNO started FRESHEM Zeeland (FREsh Salt groundwater distribution by Helicopter ElectroMagnetic survey in the Province of Zeeland) in October 2014. Within 3x2 weeks of the first project year, the entire area of about 2000 km2 was surveyed using BGR's helicopter-borne geophysical system, totalling about 10,000 line-km. The HEM datasets of 17 subareas were carefully processed using advanced BGR in-house software and inverted to 2.5 million resistivity-depth models. Ground truthing demonstrated that the large-scale HEM results fit very well with small-scale ground EM data (ECPT). Based on this spatial resistivity distribution, a 3D voxel model of chloride concentration was derived for the entire province, taking into account geological model data (GeoTOP) for the lithology correction and local in-situ groundwater measurements for the translation of water conductivity to chloride concentration. The 3D voxel model enables stakeholders to implement spatial chloride concentration in their groundwater models.
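
    The final translation step, from inverted bulk resistivity to a chloride concentration, can be sketched roughly as below. This is a simplified illustration only: FRESHEM used lithology-dependent relations from the GeoTOP model and local groundwater samples, whereas the formation factor and the linear EC-to-chloride coefficients here are assumed placeholder values.

        # Rough sketch: bulk resistivity -> pore-water conductivity -> chloride.
        # The formation factor and the EC-to-chloride coefficients are illustrative
        # assumptions, not the calibrated, lithology-dependent values used in FRESHEM.

        def chloride_from_resistivity(rho_bulk_ohmm, formation_factor=4.0,
                                      a_mg_l_per_mS_cm=360.0, b_mg_l=-50.0):
            """Convert bulk resistivity (ohm.m) to an indicative chloride value (mg/l)."""
            # Pore-water resistivity via a simple formation-factor (Archie-type) correction.
            rho_water_ohmm = rho_bulk_ohmm / formation_factor
            # Pore-water electrical conductivity in mS/cm (1 S/m = 10 mS/cm).
            ec_water_mS_cm = 10.0 / rho_water_ohmm
            # Empirical linear translation of water EC to chloride concentration.
            return max(0.0, a_mg_l_per_mS_cm * ec_water_mS_cm + b_mg_l)

        # Example: a bulk resistivity of 2 ohm.m maps to a brackish-to-saline estimate.
        print(round(chloride_from_resistivity(2.0)))  # ~7150 mg/l with these assumptions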

  3. 36 CFR 1202.44 - How long will it take for NARA to process my request?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false How long will it take for NARA to process my request? 1202.44 Section 1202.44 Parks, Forests, and Public Property NATIONAL... Individual Access to Records § 1202.44 How long will it take for NARA to process my request? (a) NARA will...

  4. Certification of vapor phase hydrogen peroxide sterilization process for spacecraft application

    NASA Technical Reports Server (NTRS)

    Rohatgi, N.; Schubert, W.; Koukol, R.; Foster, T. L.; Stabekis, P. D.

    2002-01-01

    This paper describes the selection process and research activities JPL is planning to conduct for certification of hydrogen peroxide as a NASA approved technique for sterilization of various spacecraft parts/components and entire modern spacecraft.

  5. Automated drug identification system

    NASA Technical Reports Server (NTRS)

    Campen, C. F., Jr.

    1974-01-01

    System speeds up analysis of blood and urine and is capable of identifying 100 commonly abused drugs. System includes computer that controls entire analytical process by ordering various steps in specific sequences. Computer processes data output and has readout of identified drugs.

  6. Variable dynamic testbed vehicle : safety plan

    DOT National Transportation Integrated Search

    1997-02-01

    This safety document covers the entire safety process from inception to delivery of the Variable Dynamic Testbed Vehicle. In addition to addressing the process of safety on the vehicle , it should provide a basis on which to build future safety proce...

  7. From Physical Campus Space to a Full-view Figure: University Atlas Compiling Based on `Information Design' Concept

    NASA Astrophysics Data System (ADS)

    Song, Ge; Tang, Xi; Zhu, Feng

    2018-05-01

    Traditional university maps, which take the campus as their principal subject, mainly serve spatial localization and navigation. They do not take full advantage of the map medium, such as multi-scale representation and thematic geographical information visualization, and their inherent propaganda functions have not been fully developed. Therefore, we took East China Normal University (ECNU), located in Shanghai, as an example and integrated various information related to university propaganda needs (such as spatial patterns, history and culture, landscape ecology, disciplinary constructions, cooperation, social services, development plans and so on). We adopted the frontier knowledge of 'information design' as well as various kinds of information graphics and visualization solutions. As a result, we designed and compiled a prototype atlas of 'ECNU Impression' to provide a series of views of ECNU, which practiced a new model of 'narrative campus map'. This innovative propaganda product serves as a supplement to typical official presentations, offering official authority, data maturity, scientific rigour, dimensional diversity, and temporal integrity. The university atlas will become a usable medium for shaping the university's overall image.

  8. Role of river bank erosion in sediment budgets of catchments within the Loire river basin (France)

    NASA Astrophysics Data System (ADS)

    Gay, Aurore; Cerdan, Olivier; Poisvert, Cecile; Landemaine, Valentin

    2014-05-01

    Quantifying the volumes of sediments produced on hillslopes or in channels and transported or stored within river systems is necessary to establish sediment budgets. While research efforts on hillslope erosion processes have led to a relatively good understanding and quantification of local sources, in-channel processes remain poorly understood and are almost absent from global budgets. However, profound land-use changes and agricultural practices have altered river functioning and caused river bank instability and stream incision. During the past decades in France, river channelization has been performed extensively to allow for new agricultural practices to take place. Starting from a recent study on the quantification of sediment fluxes for catchments within the Loire river basin (Gay et al. 2013), our aim is to complete sediment budgets by taking into account various sources and sinks, both on hillslopes and within channels. The emphasis of this study is on river bank erosion and how bank erosion contributes to global budgets. A model of bank retreat is developed for the entire Loire river basin. In general, our results show that bank retreat is on average quite low, at approximately 1 cm yr-1. However, a strong variability exists within the study area, with channels displaying values of bank retreat up to ~10 cm yr-1. Our results corroborate those found by Landemaine et al. in 2013 on a small agricultural catchment. From this first step, the volumes of sediment eroded from banks and available for transport should be quantified and integrated into sediment budgets to allow for a better understanding of basin functioning. Gay A., Cerdan O., Delmas M., Desmet M., Variability of sediment yields in the Loire river basin (France): the role of small scale catchments (under review). Landemaine V., Gay A., Cerdan O., Salvador-Blanes S., Rodriguez S., Recent morphological evolution of a headwater stream in agricultural context after channelization in the Ligoire river (France) (in prep).

  9. Near-surface coherent structures explored by large eddy simulation of entire tropical cyclones.

    PubMed

    Ito, Junshi; Oizumi, Tsutao; Niino, Hiroshi

    2017-06-19

    Taking advantage of the huge computational power of a massively parallel supercomputer (the K computer), this study conducts large eddy simulations of entire tropical cyclones by employing a numerical weather prediction model, and explores near-surface coherent structures. The maximum near-surface wind changes little from that simulated in coarse-resolution runs. Three kinds of coherent structures appeared inside the boundary layer. The first is a Type-A roll, which is caused by an inflection-point instability of the radial flow and prevails outside the radius of maximum wind. The second is a Type-B roll that also appears to be caused by an inflection-point instability, but of both radial and tangential winds. Its roll axis is almost orthogonal to the Type-A roll. The third is a Type-C roll, which occurs inside the radius of maximum wind and only near the surface. It transports horizontal momentum in an up-gradient sense and causes the largest gusts.

  10. Degradation of buried ice and permafrost in the Veleta Cirque (Sierra Nevada, Spain) from 2006-2013

    NASA Astrophysics Data System (ADS)

    Gómez-Ortiz, A.; Oliva, M.; Salvador-Franch, F.; Salvà-Catarineu, M.; Palacios, D.; de Sanjosé-Blasco, J. J.; Tanarro-García, L. M.

    2014-04-01

    The Veleta cirque is located at the foot of the Veleta peak, one of the highest summits of the Sierra Nevada National Park (Southern Spain). This cirque was the source of a glacier valley during the Quaternary cold periods. During the Little Ice Age it sheltered a small glacier, the most southerly in Europe, for which written records have existed since the XVII century. This glacier still had ice residues until the mid-XX century. This ice is no longer visible, but a residue persists along with discontinuous permafrost trapped under strata of rock blocks that make up an incipient rock glacier. From 2006 to 2013, this rock glacier was monitored by measurement of the temperature of the active layer, the degree of snow cover on the ground, movements of the body of the rock glacier and geophysical prospection inside it. The results show that the relict ice and trapped permafrost have been steadily declining. The processes that explain this degradation occur in a chain, starting from the external radiation that affects the ground in summer, when temperatures are highest. In effect, once this radiation has steadily melted the snow on the ground, the thermal wave advances into the heart of the active layer, reaching the ceiling of the frozen mass, which it then degrades and melts. In this entire linked process, the circulation of melt waters fulfils a highly significant function, as they act as heat transmitters. The complementary nature of these processes explains the subsidence and continuous changes in the entire clastic pack and the melting of the frozen ceiling on which it rests. This happens in summer in just a few weeks. All these events, in particular the geomorphological ones, take place on the Sierra Nevada peaks within climate conditions that are at present unfavourable to the maintenance of snow on the ground in summer. These conditions could be related to recent variations in the climate, starting in the mid-XIX century and most markedly since the second half of the XX century. The work and results highlight the climate sensitivity of the Sierra Nevada peaks to the effects of climate change and its impact on the dynamics of ecosystems, and provide a benchmark for evaluating the current evolution of Mediterranean high-mountain landscapes.

  11. Assessing treatment integrity in cognitive-behavioral therapy: comparing session segments with entire sessions.

    PubMed

    Weck, Florian; Grikscheit, Florian; Höfling, Volkmar; Stangier, Ulrich

    2014-07-01

    The evaluation of treatment integrity (therapist adherence and competence) is a necessary condition to ensure the internal and external validity of psychotherapy research. However, the evaluation process is associated with high costs, because therapy sessions must be rated by experienced clinicians. It is debatable whether rating session segments is an adequate alternative to rating entire sessions. Four judges evaluated treatment integrity (i.e., therapist adherence and competence) in 84 randomly selected videotapes of cognitive-behavioral therapy for major depressive disorder, social anxiety disorder, and hypochondriasis (from three different treatment outcome studies). In each case, two judges provided ratings based on entire therapy sessions and two on session segments only (i.e., the middle third of the entire sessions). Interrater reliability of adherence and competence evaluations proved satisfactory for ratings based on segments and the level of reliability did not differ from ratings based on entire sessions. Ratings of treatment integrity that were based on entire sessions and session segments were strongly correlated (r=.62 for adherence and r=.73 for competence). The relationship between treatment integrity and outcome was comparable for ratings based on session segments and those based on entire sessions. However, significant relationships between therapist competence and therapy outcome were only found in the treatment of social anxiety disorder. Ratings based on segments proved to be adequate for the evaluation of treatment integrity. The findings demonstrate that session segments are an adequate and cost-effective alternative to entire sessions for the evaluation of therapist adherence and competence. Copyright © 2014. Published by Elsevier Ltd.

  12. Technical Writing: Process and Product. Third Edition.

    ERIC Educational Resources Information Center

    Gerson, Sharon J.; Gerson, Steven M.

    This book guides students through the entire writing process--prewriting, writing, and rewriting--developing an easy-to-use, step-by-step technique for writing the types of documents they will encounter on the job. It engages students in the writing process and encourages hands-on application as well as discussions about ethics, audience…

  13. A PROCESS FOR DEVELOPING AND EVALUATING INDICES OF FISH ASSEMBLAGE INTEGRITY

    EPA Science Inventory

    We describe a general process for developing an index of fish assemblage integrity, using the Willamette Valley of Oregon, U.S.A., as an example. Such an index is useful for assessing the effects of humans on entire fish assemblages, and the general process can be applied to any ...

  14. Theory-Driven Process Evaluation of a Complementary Feeding Trial in Four Countries

    ERIC Educational Resources Information Center

    Newman, Jamie E.; Garces, Ana; Mazariegos, Manolo; Hambidge, K. Michael; Manasyan, Albert; Tshefu, Antoinette; Lokangaka, Adrien; Sami, Neelofar; Carlo, Waldemar A.; Bose, Carl L.; Pasha, Omrana; Goco, Norman; Chomba, Elwyn; Goldenberg, Robert L.; Wright, Linda L.; Koso-Thomas, Marion; Krebs, Nancy F.

    2014-01-01

    We conducted a theory-driven process evaluation of a cluster randomized controlled trial comparing two types of complementary feeding (meat versus fortified cereal) on infant growth in Guatemala, Pakistan, Zambia and the Democratic Republic of Congo. We examined process evaluation indicators for the entire study cohort (N = 1236) using chi-square…

  15. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  16. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  17. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  18. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  19. Structural violence and the state: HIV and labour migration from Pakistan to the Persian Gulf.

    PubMed

    Qureshi, Ayaz

    2013-01-01

    This paper examines the biopolitics of HIV and labour migration from Pakistan (a country classified by UNAIDS as at 'high risk' of a generalised epidemic) to the countries of the Gulf Cooperation Council (GCC). The remittances by the labour migrants in the Gulf are an invaluable source of foreign exchange for Pakistan and a large number of households are entirely dependent upon them. At the same time, the National AIDS Control Programme regards Gulf migrants as a key risk factor for an HIV epidemic. The majority of HIV positive people in clinics comprise Gulf returnee migrants and their family members. This paper suggests that in the process of migrating, prospective migrants are subjected to structural violence that increases their HIV vulnerabilities. In this process, they are subjected to regimes of medical inspection, reduced to their certifiable labour power, inscribed with nationalist ideologies identifying HIV as a disease that strikes 'the other', and exposed to exploitation that increases their vulnerabilities. After migration, they are made to undergo compulsory periodic medical examinations in the GCC and, if found to be HIV positive, they are forcibly deported without papers, proper diagnosis or healthcare - only to return as 'failed subjects'. Taking a disaggregated view of the state, the paper argues that, in order to be effective, debates on structural violence and the HIV epidemic must make explicit the role of the state in producing migrants' vulnerabilities.

  20. Taking Advantage of Selective Change Driven Processing for 3D Scanning

    PubMed Central

    Vegara, Francisco; Zuccarello, Pedro; Boluda, Jose A.; Pardo, Fernando

    2013-01-01

    This article deals with the application of the principles of SCD (Selective Change Driven) vision to 3D laser scanning. Two experimental sets have been implemented: one with a classical CMOS (Complementary Metal-Oxide Semiconductor) sensor, and the other one with a recently developed CMOS SCD sensor for comparative purposes, both using the technique known as Active Triangulation. An SCD sensor only delivers the pixels that have changed most, ordered by the magnitude of their change since their last readout. The 3D scanning method is based on the systematic search through the entire image to detect pixels that exceed a certain threshold, showing the SCD approach to be ideal for this application. Several experiments for both capturing strategies have been performed to try to find the limitations in high speed acquisition/processing. The classical approach is limited by the sequential array acquisition, as predicted by the Nyquist–Shannon sampling theorem, and this has been experimentally demonstrated in the case of a rotating helix. These limitations are overcome by the SCD 3D scanning prototype achieving a significantly higher performance. The aim of this article is to compare both capturing strategies in terms of performance in the time and frequency domains, so they share all the static characteristics including resolution, 3D scanning method, etc., thus yielding the same 3D reconstruction in static scenes. PMID:24084110
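
    The contrast between the two readout strategies compared above can be captured in a small conceptual sketch (the threshold, pixel budget and ordering policy are assumptions for illustration, not the actual sensor interface): a classical sensor delivers the full frame, which is then searched for pixels above a threshold, whereas an SCD readout delivers only the pixels that changed, ordered by the magnitude of their change.

        # Conceptual sketch of classical full-frame readout versus Selective Change
        # Driven (SCD) readout. Threshold and pixel budget are illustrative assumptions.
        import numpy as np

        def classical_readout(frame, threshold):
            """Scan the entire image and return coordinates of pixels above the threshold."""
            ys, xs = np.nonzero(frame > threshold)
            return list(zip(ys.tolist(), xs.tolist()))

        def scd_readout(prev_frame, frame, max_pixels):
            """Return only the pixels that changed since the last readout,
            ordered by decreasing magnitude of change."""
            change = np.abs(frame.astype(int) - prev_frame.astype(int))
            order = np.argsort(change, axis=None)[::-1][:max_pixels]
            ys, xs = np.unravel_index(order, frame.shape)
            return [(y, x, int(frame[y, x]))
                    for y, x in zip(ys.tolist(), xs.tolist()) if change[y, x] > 0]

        # Example: a laser stripe moving one column between consecutive frames.
        prev = np.zeros((4, 6), dtype=np.uint8); prev[:, 2] = 255
        curr = np.zeros((4, 6), dtype=np.uint8); curr[:, 3] = 255
        print(classical_readout(curr, 128))  # full-frame search for the stripe
        print(scd_readout(prev, curr, 8))    # only the pixels that changed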

  1. Preparation and characterization of maghemite nanoparticles from mild steel for magnetically guided drug therapy.

    PubMed

    Kumar, Nitesh; Kulkarni, Kaustubh; Behera, Laxmidhar; Verma, Vivek

    2017-08-01

    Maghemite (γ-Fe2O3) nanoparticles for therapeutic applications are prepared from mild steel, but the existing synthesis technique is very cumbersome. The entire process takes around 100 days with multiple steps which lack proper understanding. In the current work, maghemite nanoparticles of cuboidal and spheroidal morphologies were prepared from mild steel chips by a novel, cost-effective oil reduction technique for magnetically guided intravascular drug delivery. The technique developed in this work yields isometric-sized γ-Fe2O3 nanoparticles in 6 h with higher saturation magnetization as compared to the existing similar solid-state synthesis route. Mass and heat flow kinetics during the heating and quenching steps were studied with the help of finite element simulations. Qualitative and quantitative analysis of the γ-Fe2O3 phase is performed with the help of x-ray diffraction, transmission electron microscopy and x-ray photoelectron spectroscopy. The mechanism for the α-Fe2O3 (haematite) to γ-Fe2O3 (maghemite) phase evolution during the synthesis process is also investigated. Maghemite (γ-Fe2O3) nanoparticles were prepared by a novel, cost-effective oil reduction technique as mentioned below in the figure. The raw materials included mild steel chips, which are among the most abundant engineering materials. These particles can be used as ideal nanocarriers for targeted drug delivery through the vascular network.

  2. Research on large spatial coordinate automatic measuring system based on multilateral method

    NASA Astrophysics Data System (ADS)

    Miao, Dongjing; Li, Jianshuan; Li, Lianfu; Jiang, Yuanlin; Kang, Yao; He, Mingzhao; Deng, Xiangrui

    2015-10-01

    To measure spatial coordinates accurately and efficiently over a large size range, an automatic manipulator-based measurement system based on the multilateral method is developed. This system is divided into two parts: the coordinate measurement subsystem consists of four laser tracers, and the trajectory generation subsystem is composed of a manipulator and a rail. To ensure that there is no laser beam break during the measurement process, an optimization function is constructed using the vectors between the laser tracers' measuring centers and the cat's eye reflector's measuring center, and an automatic orientation-adjustment algorithm for the reflector is proposed; with this algorithm, the laser tracers are always able to track the reflector during the entire measurement process. Finally, the proposed algorithm is validated by taking the calibration of a laser tracker as an example: the experiment is conducted in a 5 m × 3 m × 3.2 m range, and the algorithm is used to automatically plan the orientations of the reflector for the 24 given points. After improving the orientations of a few points with adverse angles, the final results are used to control the manipulator's motion. During the actual movement, no beam breaks occurred. The results show that the proposed algorithm helps the developed system measure spatial coordinates efficiently over a large range.
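
    One plausible way to realize the orientation-adjustment idea sketched above is shown below. This is a hedged illustration, not the authors' exact objective function: the reflector is pointed along the mean of the unit vectors towards the tracers, and the worst incidence angle is checked against an assumed acceptance half-angle so that no beam break occurs.

        # Hedged sketch of one possible orientation criterion (not the paper's exact
        # optimization function): point the cat's eye reflector along the mean of the
        # unit vectors towards the laser tracers and verify that every beam stays
        # inside an assumed acceptance half-angle.
        import numpy as np

        def reflector_orientation(reflector_pos, tracer_positions, acceptance_deg=60.0):
            p = np.asarray(reflector_pos, dtype=float)
            tracers = np.asarray(tracer_positions, dtype=float)
            # Unit vectors from the reflector centre towards each tracer centre.
            units = (tracers - p) / np.linalg.norm(tracers - p, axis=1, keepdims=True)
            # Candidate orientation: normalised mean direction of the incoming beams.
            normal = units.mean(axis=0)
            normal /= np.linalg.norm(normal)
            # Worst incidence angle for this orientation.
            worst = np.degrees(np.arccos(np.clip(units @ normal, -1.0, 1.0)).max())
            return normal, worst, worst <= acceptance_deg

        # Example: four tracers on the floor, reflector held 1 m above the origin.
        tracers = [(2.0, 2.0, 0.0), (3.0, 1.0, 0.0), (1.0, 3.0, 0.0), (2.5, 2.5, 0.0)]
        normal, worst_angle, ok = reflector_orientation((0.0, 0.0, 1.0), tracers)
        print(normal, round(worst_angle, 1), ok)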

  3. Multi-spectral observations of flares

    NASA Astrophysics Data System (ADS)

    Zuccarello, F.

    2016-11-01

    Observations show that during solar flares radiation can be emitted across the entire electromagnetic spectrum, spanning from gamma rays to radio waves. These emissions, related to the conversion of magnetic energy into other forms of energy (kinetic, thermal, waves) through magnetic reconnection, are due to different physical processes that can occur in different layers of the Sun. This means that flare observations need to be carried out using instruments operating in different wave-bands in order to achieve a complete picture of the processes going on. Taking into account that most of the radiative energy is emitted at optical and UV wavelengths, observations carried out from space need to be complemented by observations carried out from ground-based telescopes. Nowadays, the possibility of carrying out high temporal, spatial and spectral resolution observations from ground-based telescopes in coordinated campaigns with space-borne instruments (e.g., IRIS and HINODE) gives the opportunity to investigate the details of the flare emission at different wavelengths and can provide useful hints to understand these phenomena and compare observations with models. However, it is undoubted that sometimes pointing to the flaring region is not an easy task, due to the necessity of providing the target coordinates to satellites some hours in advance. Some problems arising from this issue will be discussed. Moreover, new projects related to flare catalogues and archives will be presented.

  4. Key Elements of a Low Voltage, Ultracompact Plasma Spectrometer

    NASA Technical Reports Server (NTRS)

    Scime, E. E.; Barrie, A.; Dugas, M.; Elliott, D.; Ellison, S.; Keesee, A. M.; Pollock, C. J.; Rager, A.; Tersteeg, J.

    2016-01-01

    Taking advantage of technological developments in wafer-scale processing over the past two decades, such as deep etching, 3-D chip stacking, and double-sided lithography, we have designed and fabricated the key elements of an ultracompact 1.5 cm³ plasma spectrometer that requires only low-voltage power supplies, has no microchannel plates, and has a high aperture area to instrument volume ratio. The initial design of the instrument targets the measurement of charged particles in the 3-20 keV range with a highly directional field of view and a 100% duty cycle; i.e., the entire energy range is continuously measured. In addition to reducing mass, size, and voltage requirements, the new design will affect the manufacturing process of plasma spectrometers, enabling large quantities of identical instruments to be manufactured at low individual unit cost. Such a plasma spectrometer is ideal for heliophysics plasma investigations, particularly for small satellite and multispacecraft missions. Two key elements of the instrument have been fabricated: the collimator and the energy analyzer. An initial collimator transparency of 20% with 3° × 3° angular resolution was achieved. The targeted 40% collimator transparency appears readily achievable. The targeted energy analyzer scaling factor of 1875 was achieved; i.e., 20 keV electrons were selected for only a 10.7 V bias voltage in the energy analyzer.
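
    The quoted analyzer constant follows directly from the reported numbers, assuming singly charged particles and the usual definition of the constant as particle energy divided by the charge times the bias voltage:

        k \approx \frac{E}{q V_\mathrm{bias}} = \frac{20\,000\ \mathrm{eV}}{e \times 10.7\ \mathrm{V}} \approx 1.9 \times 10^{3},

    which is consistent with the targeted scaling factor of 1875.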

  5. InSAR Deformation Time Series Processed On-Demand in the Cloud

    NASA Astrophysics Data System (ADS)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is generation of a deformation time series product, a series of images representing ground displacements over time, which can be computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time-consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data is acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.
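
    The fan-out/fan-in dependency pattern described above (serial up-front processing, parallel interferogram generation, then a single time-series step) maps naturally onto AWS Batch job dependencies. The sketch below is illustrative only: the job queue, job definition and job names are placeholders, not HyP3's actual configuration, and configured AWS credentials and a region are assumed.

        # Illustrative sketch of the fan-out/fan-in pattern with AWS Batch: one setup
        # job, N parallel interferogram jobs, and a time-series job that depends on all
        # of them. Queue, job definition and job names are placeholder assumptions, and
        # configured AWS credentials/region are assumed.
        import boto3

        batch = boto3.client("batch")

        def submit(name, depends_on=None, parameters=None):
            response = batch.submit_job(
                jobName=name,
                jobQueue="insar-queue",                # placeholder queue name
                jobDefinition="insar-job-definition",  # placeholder job definition
                dependsOn=[{"jobId": j} for j in (depends_on or [])],
                parameters=parameters or {},
            )
            return response["jobId"]

        # Serial up-front processing of the SLC stack.
        setup_id = submit("prepare-stack")

        # Interferograms for each image pair run in parallel once setup finishes.
        pairs = [("scene01", "scene02"), ("scene02", "scene03"), ("scene01", "scene03")]
        ifg_ids = [submit(f"ifg-{a}-{b}", depends_on=[setup_id],
                          parameters={"reference": a, "secondary": b})
                   for a, b in pairs]

        # The deformation time series is generated only after every interferogram is done.
        submit("time-series", depends_on=ifg_ids)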

  6. Governance Options to Enhance Ecosystem Services in Cocoa, Soy, Tropical Timber and Palm Oil Value Chains.

    PubMed

    Ingram, Verina; van den Berg, Jolanda; van Oorschot, Mark; Arets, Eric; Judge, Lucas

    2018-02-06

    Dutch policies have advocated sustainable commodity value chains, which have implications for the landscapes from which these commodities originate. This study examines governance and policy options for sustainability in terms of how ecosystem services are addressed in cocoa, soy, tropical timber and palm oil value chains with Dutch links. A range of policies addressing ecosystem services were identified, from market governance (certification, payments for ecosystem services) to multi-actor platforms (roundtables) and public governance (policies and regulations). An analysis of policy narratives and interviews identified if and how ecosystem services are addressed within value chains and policies; how the concept has been incorporated into value chain governance; and which governance options are available. The Dutch government was found to take a steering but indirect role in all the cases, primarily through supporting, financing, facilitating and partnering policies. Interventions mainly from end-of-chain stakeholders located in processing and consumption countries resulted in new market governance, notably voluntary sustainability standards. These have been successful in creating awareness of some ecosystem services and bringing stakeholders together. However, they have not fully addressed all ecosystem services or stakeholders, thus failing to increase the sustainability of value chains or of the landscapes of origin. We argue that chains sourced in tropical landscapes may be governed more effectively for sustainability if voluntary, market policy tools and governance arrangements have more integrated goals that take account of sourcing landscapes and impacts along the entire value chain. Given the international nature of these commodities, these findings have significance for debates on public-private approaches to value chain and landscape governance.

  7. Mating behaviour in a slave-making ant, Rossomyrmex minuchae (Hymenoptera, Formicidae).

    PubMed

    Ruano, Francisca; Tinaut, Alberto

    2005-07-01

    The mating behaviour of the ant Rossomyrmex minuchae, a rare, protected slave-making species in Spain, seems to be significantly affected by its particular life history and patchy habitat. The mating behaviour of the entire genus Rossomyrmex is virtually unknown. We present here the results of a 3-year study of mating behaviour in R. minuchae. Behavioural observations and limited nest excavations revealed that R. minuchae does not produce sexuals every year, the number of sexuals is low, and the sex ratio tends to be female biased. Females typically exhibit two distinct activity periods. The first, the mating period, takes place in early afternoon: the ants "call" near the natal nest, mate and then return to their nest. The second, the dispersal period, takes place in late afternoon: the mated females exit their nest and fly in search of a new, non-parasitized Proformica longiseta host nest. Males are highly active during the mating period, but will remain inactive in the dispersal period even if experimentally presented with virgin females. It appears that females are monogamous, while males are polygamous. When males are late arriving at the female calling site, the females will frequently congregate, presumably calling in chorus. The low reproductive efficiency exhibited by R. minuchae, coupled with the postulated low genetic variation in the population, as sisters may mate with the same male, could result in a low survival rate and risk of eventual extinction. The decrease in nest density observed during the 2004 season may be indicative of such a process.

  8. Scheduling time-critical graphics on multiple processors

    NASA Technical Reports Server (NTRS)

    Meyer, Tom W.; Hughes, John F.

    1995-01-01

    This paper describes an algorithm for the scheduling of time-critical rendering and computation tasks on single- and multiple-processor architectures, with minimal pipelining. It was developed to manage scientific visualization scenes consisting of hundreds of objects, each of which can be computed and displayed at thousands of possible resolution levels. The algorithm generates the time-critical schedule using progressive-refinement techniques; it always returns a feasible schedule and, when allowed to run to completion, produces a near-optimal schedule which takes advantage of almost the entire multiple-processor system.
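
    A greedy sketch of the progressive-refinement idea follows (an illustration only, not the authors' exact scheduler; the object names, costs and benefit scores are made up): every object starts at its cheapest level, which keeps the schedule feasible at all times, and upgrades are applied best-first until the per-frame time budget is exhausted.

        # Greedy sketch of progressive-refinement scheduling: every object starts at
        # its cheapest resolution level (always feasible), and upgrades are applied
        # best-benefit-per-cost first while the frame budget allows. The objects,
        # costs and benefit scores below are illustrative assumptions.
        import heapq

        def refine_schedule(objects, budget):
            """objects: {name: [(cost, benefit), ...]} with levels ordered cheapest first."""
            levels = {name: 0 for name in objects}
            spent = sum(opts[0][0] for opts in objects.values())  # baseline schedule cost
            heap = []
            for name, opts in objects.items():
                if len(opts) > 1:
                    extra, gain = opts[1][0] - opts[0][0], opts[1][1] - opts[0][1]
                    heapq.heappush(heap, (-gain / extra, name))
            while heap:
                _, name = heapq.heappop(heap)
                lvl, opts = levels[name], objects[name]
                extra = opts[lvl + 1][0] - opts[lvl][0]
                if spent + extra > budget:
                    continue  # this upgrade no longer fits the frame budget
                spent += extra
                levels[name] = lvl + 1
                if lvl + 2 < len(opts):
                    nxt = opts[lvl + 2][0] - opts[lvl + 1][0]
                    gain = opts[lvl + 2][1] - opts[lvl + 1][1]
                    heapq.heappush(heap, (-gain / nxt, name))
            return levels, spent

        objects = {"isosurface": [(2, 1), (5, 4), (11, 6)], "streamlines": [(1, 1), (4, 5)]}
        print(refine_schedule(objects, budget=10))  # -> ({'isosurface': 1, 'streamlines': 1}, 9)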

  9. CDF Top Physics

    DOE R&D Accomplishments Database

    Tartarelli, G. F.; CDF Collaboration

    1996-05-01

    The authors present the latest results on top physics obtained by the CDF experiment at the Fermilab Tevatron collider. The data sample used for these analyses (about 110 pb⁻¹) represents almost the entire statistics collected by CDF during four years (1992-95) of data taking. This large data sample has allowed detailed studies of top production and decay properties. The results discussed here include the determination of the top quark mass, the measurement of the production cross section, the study of the kinematics of the top events and a look at top decays.

  10. Simplifying Chandra aperture photometry with srcflux

    NASA Astrophysics Data System (ADS)

    Glotfelty, Kenny

    2014-11-01

    This poster will highlight some of the features of the srcflux script in CIAO. This script combines many threads and tools together to compute photometric properties for sources: counts, rates, various fluxes, and confidence intervals or upper limits. Beginning and casual X-ray astronomers greatly benefit from the simple interface: just specify the event file and a celestial location, while power-users and X-ray astronomy experts can take advantage of the all the parameters to automatically produce catalogs for entire fields. Current limitations and future enhancements of the script will also be presented.

  11. Performance simulation for the design of solar heating and cooling systems

    NASA Technical Reports Server (NTRS)

    Mccormick, P. O.

    1975-01-01

    Suitable approaches for evaluating the performance and the cost of a solar heating and cooling system are considered, taking into account the value of a computer simulation concerning the entire system in connection with the large number of parameters involved. Operational relations concerning the collector efficiency in the case of a new improved collector and a reference collector are presented in a graph. Total costs for solar and conventional heating, ventilation, and air conditioning systems as a function of time are shown in another graph.

  12. IYL project: pinky-powered photons

    NASA Astrophysics Data System (ADS)

    Dreyer, Elizabeth F. C.; Aku-Leh, Cynthia; Nees, John A.; Sala, Anca L.; Smith, Arlene; Jones, Timothy

    2016-09-01

    Pinky-powered Photons is an activity created by the Michigan Light Project during the International Year of Light to encourage creativity in learning about light. It is a low-cost project. Participants make and take home a colorful LED light powered entirely by their fingers. Younger visitors "package" the electrical element into their own creation while older visitors solder the electrical parts together and then create their own design. This paper will detail the learning objectives and outcomes of this project as well as how to implement it in an outreach event or classroom.

  13. Cognitive Design for Learning: Cognition and Emotion in the Design Process

    ERIC Educational Resources Information Center

    Hasebrook, Joachim

    2016-01-01

    We are so used to accepting new technologies as the driver of change and innovation in human-computer interfaces (HCI). In our research we focus on the development of innovations as a design process--or design, for short. We also refer to the entire process of creating innovations and putting them to use as "cognitive processes"--or…

  14. Mathematical model of rolling an elastic wheel over deformable support base

    NASA Astrophysics Data System (ADS)

    Volskaia, V. N.; Zhileykin, M. M.; Zakharov, A. Y.

    2018-02-01

    One of the main directions of economic growth in Russia remains the rapid development of the northern and northeastern regions, which constitute 60 percent of the country's territory. The further development of these territories requires new methods and technologies for solving transport and technological problems when off-road transportation of cargoes and people is conducted. One of the fundamental methods of trafficability prediction is simulation modelling of wheeled vehicle movement in different operating conditions. Both the deformable properties of tires and the physical and mechanical properties of the ground influence tractive performance: normal tire deflection and rut depth; variation of the contact patch area depending on the load and the air pressure in the tire; hysteresis losses in the tire material, which influence the rolling resistance due to friction processes between tire and ground in the contact patch; and the tangential reaction from the ground over the entire contact area. Nowadays there are two main trends in theoretical research on the interaction of a wheeled propulsion device with the ground: the analytical method, involving a mathematical description of the explored process, and the finite element method, based on computational modelling. Mathematical models of tire-ground interaction are used both in studies of the interaction of an individual wheeled propulsion device with the ground and in dynamic models of mobile vehicles operated in specific road and climate conditions. One of the most significant imperfections of these models is that they describe the interaction of the wheel with a flat deformable support base, whereas the profile of a real support base surface has unevenness whose height is commensurate with the radius of the wheel. The description of the processes taking place in the ground under the influence of the wheeled propulsion device using the finite element method is relatively new but has become the most widely applied lately. This method provides the most accurate description of the interaction process between a wheeled propulsion device and the ground; it also allows the stresses in the ground, the deformation of the ground and the tire, and the ground's compression to be determined. However, the high computational cost is an essential shortcoming of that method, so it is hard to use such models as part of a general motion model of multi-axis wheeled vehicles. The purpose of this research is the elaboration of a mathematical model of an elastic wheel rolling over a deformable rough support base, taking into account the contact patch deformation. A mathematical model of the rectilinear rolling of an elastic wheel over a rough deformable support base, taking into account the variation of the contact patch area, the variation in the direction of the radial and tangential reactions, and the load-bearing capacity of the ground, is developed. The effectiveness of the developed mathematical model is demonstrated by simulation.

  15. Management of a fire in the operating room.

    PubMed

    Kaye, Alan David; Kolinsky, Daniel; Urman, Richard D

    2014-04-01

    Operating room (OR) fires remain a significant source of liability for anesthesia providers and injury for patients, despite existing practice guidelines and other improvements in operating room safety. Factors contributing to OR fires are well understood and these occurrences are generally preventable. OR personnel must be familiar with the fire triad which consists of a fuel supply, an oxidizing agent, and an ignition source. Existing evidence shows that OR-related fires can result in significant patient complications and malpractice claims. Steps to reduce fires include taking appropriate safety measures before a patient is brought to the OR, taking proper preventive measures during surgery, and effectively managing fire and patient complications when they occur. Decreasing the incidence of fires should be a team effort involving the entire OR personnel, including surgeons, anesthesia providers, nurses, scrub technologists, and administrators. Communication and coordination among members of the OR team is essential to creating a culture of safety.

  16. Use of Spacecraft Command Language for Advanced Command and Control Applications

    NASA Technical Reports Server (NTRS)

    Mims, Tikiela L.

    2008-01-01

    The purpose of this work is to evaluate the use of SCL in building and monitoring command and control applications in order to determine its fitness for space operations. Approximately 24,325 lines of PCG2 code were converted to SCL, yielding a 90% reduction in the number of lines of code, as many of the functions and scripts utilized in SCL could be ported and reused. Automated standalone testing, simulating the actual production environment, was performed in order to generalize and gauge the relative time it takes for SCL to update and write a given display. The use of SCL rules, functions, and scripts allowed the creation of several test cases permitting measurement of the time it takes to update a given set of measurements when a globally existing CUI changes. It took the SCL system an average of 926.09 ticks to update the entire display of 323 measurements.

  17. Assistive Solutions in Practice: Experiences from AAL Pilot Regions in Austria.

    PubMed

    Ates, Nesrin; Aumayr, Georg; Drobics, Mario; Förster, Kristina Maria; Frauenberger, Christopher; Garschall, Markus; Kofler, Manfred; Krainer, Daniela; Kropf, Johannes; Majcen, Kurt; Oberzaucher, Johannes; Piazolo, Felix; Rzepka, Angelika; Sauskojus, Julia; Schneider, Cornelia; Stainer-Hochgatterer, Andreas; Sturm, Nadine; Waibel, Uli; Willner, Viktoria

    2017-01-01

    Since 2012, six AAL pilot regions have been launched in Austria. The main goal of these pilot regions is to evaluate the impact of AAL technologies in daily use, considering the entire value chain. Additionally, go-to-market strategies for assistive technologies based on the involvement of all relevant stakeholders are developed. This paper gives an overview of the specific objectives, approaches and status of all Austrian AAL pilot regions. Taking into account the experiences of the different pilot regions, specific challenges in establishing, implementing and sustaining pilot region projects are discussed and lessons learned are presented. Results show that careful planning of all project phases, taking into account available resources, is crucial for the successful implementation of an AAL pilot region. In particular, this applies to all activities related to the active involvement of end-users.

  18. Integrating LMINET with TAAM and SIMMOD: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Long, Dou; Stouffer-Coston, Virginia; Kostiuk, Peter; Kula, Richard; Yackovetsky, Robert (Technical Monitor)

    2001-01-01

    LMINET is a queuing-network air traffic simulation model covering 64 large airports and the entire National Airspace System in the United States. TAAM and SIMMOD are two widely used event-driven air traffic simulation models, applied mostly to airports. Based on our proposed Progressive Augmented window approach, TAAM and SIMMOD are integrated with LMINET through flight schedules. In the integration, the flight schedules are modified according to the flight delays reported by the other models. The benefit to the local simulation study is that TAAM or SIMMOD takes the modified schedule from LMINET, which accounts for the air traffic congestion and flight delays at the national network level. We demonstrate the value of the integrated models through case studies at Chicago O'Hare International Airport and Washington Dulles International Airport. Details of the integration are reported and future work for a full-blown integration is identified.
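
    The abstract states that the models are coupled through flight schedules that are modified by the delays each model reports. The sketch below illustrates such a schedule-exchange loop under that reading; the run_network_model and run_airport_model callables are hypothetical placeholders and do not correspond to any actual LMINET, TAAM, or SIMMOD interface.

    ```python
    # A minimal sketch of schedule-based coupling between a network-level model and
    # a local airport model, in the spirit described in the abstract: one model
    # reports delays, which are folded back into the schedule handed to the other.
    # The model interfaces here are hypothetical placeholders.

    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class Flight:
        flight_id: str
        scheduled_dep_min: float   # minutes after midnight

    def apply_delays(schedule, delays):
        """Shift each flight's scheduled time by the delay reported for it (minutes)."""
        return [replace(f, scheduled_dep_min=f.scheduled_dep_min + delays.get(f.flight_id, 0.0))
                for f in schedule]

    def couple(schedule, run_network_model, run_airport_model, n_iter=3):
        """Iteratively exchange delays between the network-level and local models."""
        for _ in range(n_iter):
            network_delays = run_network_model(schedule)      # e.g., national queuing model
            schedule = apply_delays(schedule, network_delays)  # hand modified schedule down
            airport_delays = run_airport_model(schedule)       # e.g., detailed airport model
            schedule = apply_delays(schedule, airport_delays)
        return schedule

    # Tiny demo with stub models: the "network" delays every flight by 5 min,
    # the "airport" adds 2 min to flights departing after 9:00.
    demo = [Flight("AA100", 480.0), Flight("UA200", 560.0)]
    net = lambda s: {f.flight_id: 5.0 for f in s}
    apt = lambda s: {f.flight_id: 2.0 for f in s if f.scheduled_dep_min > 540.0}
    print(couple(demo, net, apt, n_iter=1))
    ```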

  19. Single Day Construction of Multigene Circuits with 3G Assembly.

    PubMed

    Halleran, Andrew D; Swaminathan, Anandh; Murray, Richard M

    2018-05-18

    The ability to rapidly design, build, and test prototypes is of key importance to every engineering discipline. DNA assembly often serves as a rate limiting step of the prototyping cycle for synthetic biology. Recently developed DNA assembly methods such as isothermal assembly and type IIS restriction enzyme systems take different approaches to accelerate DNA construction. We introduce a hybrid method, Golden Gate-Gibson (3G), that takes advantage of modular part libraries introduced by type IIS restriction enzyme systems and isothermal assembly's ability to build large DNA constructs in single pot reactions. Our method is highly efficient and rapid, facilitating construction of entire multigene circuits in a single day. Additionally, 3G allows generation of variant libraries enabling efficient screening of different possible circuit constructions. We characterize the efficiency and accuracy of 3G assembly for various construct sizes, and demonstrate 3G by characterizing variants of an inducible cell-lysis circuit.

  20. Simultaneous detection of rotational and translational motion in optical tweezers by measurement of backscattered intensity.

    PubMed

    Roy, Basudev; Bera, Sudipta K; Banerjee, Ayan

    2014-06-01

    We describe a simple yet powerful technique of simultaneously measuring both translational and rotational motion of mesoscopic particles in optical tweezers by measuring the backscattered intensity on a quadrant photodiode (QPD). While the measurement of translational motion by taking the difference of the backscattered intensity incident on adjacent quadrants of a QPD is well known, we demonstrate that rotational motion can be measured very precisely by taking the difference between the diagonal quadrants. The latter measurement eliminates the translational component entirely and leads to a detection sensitivity of around 50 mdeg at S/N of 2 for angular motion of a driven microrod. The technique is also able to resolve the translational and rotational Brownian motion components of the microrod in an unperturbed trap and can be very useful in measuring translation-rotation coupling of micro-objects induced by hydrodynamic interactions.
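
    The quadrant combinations described above (adjacent-quadrant differences for translation, diagonal-quadrant difference for rotation) can be written down directly. The sketch below does so; the quadrant labeling and the normalization by total intensity are assumptions for illustration rather than the authors' exact convention.

    ```python
    # A minimal sketch of the quadrant-photodiode (QPD) signal combinations described
    # in the abstract: translation from differences of adjacent quadrants, rotation
    # from the difference of the diagonal quadrants. Quadrant labeling (q1..q4,
    # counter-clockwise from top-right) and normalization by total intensity are
    # assumptions for illustration, not necessarily the authors' exact convention.

    def qpd_signals(q1, q2, q3, q4):
        """Return normalized x, y (translation) and diagonal (rotation) signals."""
        total = q1 + q2 + q3 + q4
        x_signal = ((q1 + q4) - (q2 + q3)) / total    # left-right imbalance
        y_signal = ((q1 + q2) - (q3 + q4)) / total    # top-bottom imbalance
        rot_signal = ((q1 + q3) - (q2 + q4)) / total  # diagonal difference, insensitive
                                                      # to pure translation of the spot
        return x_signal, y_signal, rot_signal

    # Example: a backscattered spot shifted slightly toward quadrant 1.
    print(qpd_signals(0.30, 0.24, 0.22, 0.24))
    ```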

  1. NASA SETI microwave observing project: Sky Survey element

    NASA Technical Reports Server (NTRS)

    Klein, M. J.

    1991-01-01

    The SETI Sky Survey Observing Program is one of two complementary strategies that NASA plans to use in its microwave Search for Extraterrestrial Intelligence (SETI). The primary objective of the sky survey is to search the entire sky over the frequency range of 1.0 to 10.0 GHz for evidence of narrow-band signals of extraterrestrial intelligent origin. Frequency resolutions of 30 Hz or narrower will be used across the entire band. Spectrum analyzers with upwards of ten million channels are required to keep the survey time to approximately 6 years. Data rates in excess of 10 megabits per second will be generated in the data-taking process. Sophisticated data processing techniques will be required to determine the ever-changing receiver baselines, and to detect and archive potential SETI signals. Existing radio telescopes, including several of NASA's Deep Space Network (DSN) 34-meter antennas located at Goldstone, CA, and Tidbinbilla, Australia, will be used for the observations. JPL has primary responsibility to develop and carry out the sky survey. In order to lay the foundation for the full-scale SETI Sky Survey, a prototype system is being developed at JPL. The system will be installed at the new 34-m high-efficiency antenna at the Deep Space Station (DSS) 13 research and development station, Goldstone, CA, where it will be used to initiate the observational phase of the NASA SETI Sky Survey. It is anticipated that the early observations will be useful for testing signal detection algorithms, scan strategies, and radio frequency interference rejection schemes. The SETI-specific elements of the prototype system are: (1) the Wide Band Spectrum Analyzer (WBSA), a 2-million-channel fast Fourier transform (FFT) spectrum analyzer which covers an instantaneous bandpass of 40 MHz; (2) the signal detection processor; and (3) the SETI Sky Survey Manager, a network-based C-language environment that provides observatory control and performs data acquisition and analysis. A high-level description of the prototype hardware and software systems will be given and the current status of the system development will be reported.
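
    A rough back-of-envelope calculation shows why multi-million-channel analyzers go hand in hand with a multi-year survey. The band (1.0 to 10.0 GHz), the 30 Hz resolution, and the 40 MHz, 2-million-channel prototype analyzer come from the abstract; the number of sky pointings and the dwell time per tuning below are purely illustrative assumptions, not mission parameters.

    ```python
    # A back-of-envelope sketch of the survey-time bookkeeping for a 1-10 GHz,
    # 30 Hz-resolution sky survey. Band, resolution, and the 40 MHz prototype
    # bandpass come from the abstract; pointing count and dwell time are ASSUMED
    # illustrative values, not NASA/JPL figures.

    band_hz       = 9.0e9     # 1.0 to 10.0 GHz
    resolution_hz = 30.0      # channel width from the abstract
    instant_bw_hz = 40.0e6    # prototype WBSA instantaneous bandpass

    channels = instant_bw_hz / resolution_hz   # channels needed per tuning
    tunings  = band_hz / instant_bw_hz         # frequency settings per sky pointing

    pointings = 3.0e5   # ASSUMED number of antenna pointings to tile the sky
    dwell_s   = 2.0     # ASSUMED integration time per tuning per pointing

    survey_seconds = pointings * tunings * dwell_s
    print(f"channels per tuning ~ {channels:,.0f}")
    print(f"total survey time   ~ {survey_seconds / 3.15e7:.1f} years (under these assumptions)")
    ```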

  2. Understanding the ignition mechanism of high-pressure spray flames

    DOE PAGES

    Dahms, Rainer N.; Paczko, Günter A.; Skeen, Scott A.; ...

    2016-10-25

    A conceptual model for turbulent ignition in high-pressure spray flames is presented. The model is motivated by first-principles simulations and optical diagnostics applied to the Sandia n-dodecane experiment. The Lagrangian flamelet equations are combined with full LLNL kinetics (2755 species; 11,173 reactions) to resolve all time and length scales and chemical pathways of the ignition process at engine-relevant pressures and turbulence intensities unattainable using classic DNS. The first-principles value of the flamelet equations is established by a novel chemical explosive mode-diffusion time scale analysis of the fully-coupled chemical and turbulent time scales. Contrary to conventional wisdom, this analysis reveals that the high Damköhler number limit, a key requirement for the validity of the flamelet derivation from the reactive Navier–Stokes equations, applies during the entire ignition process. Corroborating Rayleigh-scattering and formaldehyde-PLIF measurements, together with simultaneous schlieren imaging of mixing and combustion, are presented. Our combined analysis establishes a characteristic temporal evolution of the ignition process. First, a localized first-stage ignition event consistently occurs in the highest-temperature mixture regions. This initiates, owing to the intense scalar dissipation, a turbulent cool flame wave propagating from this ignition spot through the entire flow field. This wave significantly decreases the ignition delay of lower-temperature mixture regions in comparison to their homogeneous reference. This explains the experimentally observed formaldehyde formation across the entire spray head prior to high-temperature ignition, which consistently occurs first in a broad range of rich mixture regions. There, the combination of first-stage ignition delay, shortened by the cool flame wave, and the subsequent delay until second-stage ignition becomes minimal. A turbulent flame subsequently propagates rapidly through the entire mixture over time scales consistent with experimental observations. As a result, we demonstrate that the neglect of turbulence-chemistry interactions fundamentally fails to capture the key features of this ignition process.

  3. Annotating images by mining image search results.

    PubMed

    Wang, Xin-Jing; Zhang, Lei; Li, Xirong; Ma, Wei-Ying

    2008-11-01

    Although it has been studied for years by the computer vision and machine learning communities, image annotation is still far from practical. In this paper, we propose a novel attempt at model-free image annotation, which is a data-driven approach that annotates images by mining their search results. Some 2.4 million images with their surrounding text are collected from a few photo forums to support this approach. The entire process is formulated in a divide-and-conquer framework where a query keyword is provided along with the uncaptioned image to improve both the effectiveness and efficiency. This is helpful when the collected data set is not dense everywhere. In this sense, our approach contains three steps: 1) the search process to discover visually and semantically similar search results, 2) the mining process to identify salient terms from textual descriptions of the search results, and 3) the annotation rejection process to filter out noisy terms yielded by Step 2. To ensure real-time annotation, two key techniques are leveraged: one is to map the high-dimensional image visual features into hash codes, the other is to implement the approach as a distributed system, of which the search and mining processes are provided as Web services. As a typical result, the entire process finishes in less than 1 second. Since no training data set is required, our approach enables annotating with unlimited vocabulary and is highly scalable and robust to outliers. Experimental results on both real Web images and a benchmark image data set show the effectiveness and efficiency of the proposed algorithm. It is also worth noting that, although the entire approach is illustrated within the divide-and-conquer framework, a query keyword is not crucial to our current implementation. We provide experimental results to prove this.
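
    As a rough illustration of steps 2 and 3 described above (mining salient terms from the surrounding text of search results, then rejecting noisy terms), the sketch below uses plain document-frequency scoring with a support threshold. This is an illustrative stand-in, not the paper's actual search, hashing, or mining algorithms.

    ```python
    # A minimal sketch of the three-step flow described in the abstract: (1) retrieve
    # visually/semantically similar results, (2) mine salient terms from their
    # surrounding text, (3) reject noisy terms. The scoring here is plain document
    # frequency with a stop-word filter and a support threshold - an illustrative
    # stand-in, not the paper's actual implementation.

    from collections import Counter

    STOP_WORDS = {"the", "a", "an", "of", "and", "in", "on", "with", "is", "at"}

    def mine_salient_terms(result_texts, min_support=0.5):
        """Return terms that appear in at least `min_support` of the search results."""
        doc_freq = Counter()
        for text in result_texts:
            terms = {w for w in text.lower().split() if w not in STOP_WORDS}
            doc_freq.update(terms)
        n = len(result_texts)
        return {t for t, c in doc_freq.items() if c / n >= min_support}

    def reject_noise(terms, query_keyword):
        """Step 3 (very simplified): drop the query itself and single-character tokens."""
        return sorted(t for t in terms if t != query_keyword.lower() and len(t) > 1)

    # Example with toy "surrounding text" of retrieved images for the query "beach".
    results = ["sunset over the beach with palm trees",
               "palm trees and white sand beach",
               "city skyline at night"]
    print(reject_noise(mine_salient_terms(results), "beach"))
    ```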

  4. Level 2 Perspective Taking Entails Two Processes: Evidence from PRP Experiments

    ERIC Educational Resources Information Center

    Janczyk, Markus

    2013-01-01

    In many situations people need to mentally adopt the (spatial) perspective of other persons, an ability that is referred to as "Level 2 perspective taking." Its underlying processes have been ascribed to mental self-rotation that can be dissociated from mental object-rotation. Recent findings suggest that perspective taking/self-rotation…

  5. A Decade of Friction Stir Welding R and D at NASA's Marshall Space Flight Center and a Glance into the Future

    NASA Technical Reports Server (NTRS)

    Ding, Jeff; Carter, Bob; Lawless, Kirby; Nunes, Arthur; Russell, Carolyn; Suites, Michael; Schneider, Judy

    2006-01-01

    Welding at NASA's Marshall Space Flight Center (MSFC), Huntsville, Alabama, has taken a new direction over the last 10 years. Fusion welding processes, namely variable polarity plasma arc (VPPA) and tungsten inert gas (TIG), were once the cornerstone of welding development in the center's welding laboratories, located in the part of MSFC known as the National Center for Advanced Manufacturing (NCM). Developed specifically to support the Shuttle Program's External Tank and, later, International Space Station manufacturing programs, VPPA in particular was viewed as the paragon of welding processes for joining aluminum alloys. Much has changed since 1994, however, when NASA's Jeff Ding brought the FSW process to the agency. Although, at that time, FSW was little more than a "lab curiosity", NASA researchers started investigating where the FSW process would best fit NASA manufacturing programs. A laboratory FSW system was procured and the first welds were made in the fall of 1995. The small initial investment NASA made in the first FSW system has certainly paid off for the agency in terms of cost savings, hardware quality and recognition. FSW is now a part of Shuttle External Tank (ET) production and the preferred weld process for the manufacturing of components for the new Crew Launch Vehicle (CLV) and Heavy Lift Launch Vehicle (HLLV) that will take this country back to the moon. It is one of the solid-state welding processes being considered for on-orbit space welding and repair, and is of considerable interest for Department of Defense (DoD) manufacturing programs. MSFC involvement in these and other programs makes NASA a driving force in this country's development of FSW and other solid-state welding technologies. Now, a decade later, almost all of the ongoing welding R&D at MSFC focuses on FSW and other, more advanced solid-state welding processes.

  6. Learning from Nature - Mapping of Complex Hydrological and Geomorphological Process Systems for More Realistic Modelling of Hazard-related Maps

    NASA Astrophysics Data System (ADS)

    Chifflard, Peter; Tilch, Nils

    2010-05-01

    Introduction Hydrological or geomorphological processes in nature are often very diverse and complex. This is partly due to the regional characteristics which vary over time and space, as well as changeable process-initiating and -controlling factors. Despite being aware of this complexity, such aspects are usually neglected in the modelling of hazard-related maps due to several reasons. But particularly when it comes to creating more realistic maps, this would be an essential component to consider. The first important step towards solving this problem would be to collect data relating to regional conditions which vary over time and geographical location, along with indicators of complex processes. Data should be acquired promptly during and after events, and subsequently digitally combined and analysed. Study area In June 2009, considerable damage occurred in the residential area of Klingfurth (Lower Austria) as a result of great pre-event wetness and repeatedly heavy rainfall, leading to flooding, debris flow deposit and gravitational mass movement. One of the causes is the fact that the meso-scale watershed (16 km²) of the Klingfurth stream is characterised by adverse geological and hydrological conditions. Additionally, the river system network with its discharge concentration within the residential zone contributes considerably to flooding, particularly during excessive rainfall across the entire region, as the flood peaks from different parts of the catchment area are superposed. First results of mapping Hydro(geo)logical surveys across the entire catchment area have shown that - over 600 gravitational mass movements of various type and stage have occurred. 516 of those have acted as a bed load source, while 325 mass movements had not reached the final stage yet and could thus supply bed load in the future. It should be noted that large mass movements in the initial or intermediate stage were predominately found in clayey-silty areas and weathered material, where the fluvial bank erosion only plays a minor role as an initiating factor. On the other hand, fluvial bank erosion does appear to be a cause of smaller mass movements in their final stage which develop spontaneously, most noticeably in regions of gravel-rich soils (coarse-grained) and of shallow weathered material (several decimetres). - numerous marks of surface runoff were found over the entire catchment area to a greatly variable extent and intensity. In the more eastern parts of the catchment, these signs can be linked especially to anthropogenic concentrated inputs of surface discharge e.g. drainage system of streets. Their spread is limited, but usually associated with huge erosion channels of up to 2 m depth. In the western parts of the catchment, however, signs of surface discharge are more commonly found in forests. Depending on their location, they can be a result of an up-hill infiltration surplus in areas of fields and pastures, or an infiltration surplus in the forest itself. In many places, rapid interflow through biologically-created macropores takes place, which often re-emerges at the surface in the form of return flow. In general, it is noticeable that marks of surface runoff often terminate at the scarps of landslides, which were not caused by fluvial bank erosion. The excess water produces a strong local saturation of the ground, which gives a higher landslide-susceptibility of the embankment. 
Future work: Based on the acquired field knowledge, it was possible to distinguish areas of different heterogeneity/homogeneity of the dominant process chains for several micro-scale parts of the catchment area. Subsequently, conceptual slope profiles should be derived from the detailed field data, and these should include information on the dominant and complex process systems. This forms an essential starting point for realistically considering relevant hazard-related processes as part of process-oriented modelling.

  7. City public service learns to speed read. [Computerized routing system for meter reading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aitken, E.L.

    1994-02-01

    City Public Service (CPS) of San Antonio, TX is a municipally owned utility that serves a densely populated 1,566 square miles in and around San Antonio. CPS's service area is divided into 21 meter reading districts, each of which is broken down into no more than 99 regular routes. Every day, a CPS employee reads one of the districts, following one or more routes. In 1991, CPS began using handheld computers to record reads for regular routes, which are stored on the devices themselves. In contrast, rereads and final reads occur at random throughout the service area. Because they change every day, the process of creating routes that can be loaded onto a handheld device is difficult. Until recently, rereads and final reads were printed on paper orders, and route schedulers would spend close to two hours sorting the paper orders into routes. Meter readers would then hand-sequence the orders on their routes, often using a city map, before taking them into the field in stacks. When the meter readers returned, their completed orders had to be separated by type of reread, and then keyed into the mainframe computer before bill processing could begin. CPS's data processing department developed a computerized routing system of its own that saves time and labor, as well as paper. The system eliminates paper orders entirely, enabling schedulers to create reread and final read routes graphically on a PC. Information no longer needs to be keyed from hard copy, reducing the margin of error and streamlining bill processing by incorporating automated data transfer between systems.

  8. Two-stage Energy Release Process of a Confined Flare with Double HXR Peaks

    NASA Astrophysics Data System (ADS)

    Ning, Hao; Chen, Yao; Wu, Zhao; Su, Yang; Tian, Hui; Li, Gang; Du, Guohui; Song, Hongqiang

    2018-02-01

    A complete understanding of the onset and subsequent evolution of confined flares has not been achieved. Earlier studies mainly analyzed disk events so as to reveal their magnetic topology and the cause of confinement. In this study, taking advantage of a tandem of instruments working at different wavelengths of X-rays, EUVs, and microwaves, we present dynamic details about a confined flare observed on the northwestern limb of the solar disk on 2016 July 24. The entire dynamic evolutionary process starting from its onset is consistent with a loop–loop interaction scenario. The X-ray profiles manifest an intriguing double-peak feature. From the spectral fitting, it has been found that the first peak is nonthermally dominated, while the second peak is mostly multithermal with a hot (∼10 MK) and a super-hot (∼30 MK) component. This double-peak feature is unique in that the two peaks are clearly separated by 4 minutes and the second peak reaches up to 25–50 keV. In addition, at energy bands above 3 keV, the X-ray fluxes decline significantly between the two peaks. This, together with other available imaging and spectral data, manifests a two-stage energy release process. A comprehensive analysis is carried out to investigate the nature of this two-stage process. We conclude that the second stage, with the hot and super-hot sources, mainly involves direct heating through a loop–loop reconnection at a relatively high altitude in the corona. The uniqueness of the event characteristics and the complete dataset make the study a nice addition to the present literature on solar flares.

  9. Impact of tool wear on cross wedge rolling process stability and on product quality

    NASA Astrophysics Data System (ADS)

    Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric

    2017-10-01

    Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetrical shape with an accurate distribution of material. This preform is forged into shape in a forging die. In order to improve CWR tool lifecycle and product quality, it is essential to understand tool wear evolution and how the physical phenomena of the CWR process change as the tool geometry evolves under wear. Numerical simulations are necessary to understand CWR tool wear behavior. Nevertheless, if the simulations are performed with the CAD geometry of the tool, the results are limited. To address this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower roll) at two different states: (1) before the start of the lifecycle and (2) at the end of the lifecycle. The tools were 3D measured with an ATOS Triple Scan by GOM® using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of the tool. Each 3D point cloud was digitized and converted into STL format. The geometry of the tools in STL format was the input for the 3D simulations. Both simulations were compared. Defects in products obtained in simulation were compared to the main defects of products found industrially. The two main defects are: (a) surface defects on the preform that are not fixed in the die forging operation; and (b) a bent (no longer straight) preform, with two possible impacts: either the robot cannot grab it to take it to the forging stage, or an unfilled section results in the forging operation.

  10. A Cognitive Neural Architecture Able to Learn and Communicate through Natural Language.

    PubMed

    Golosio, Bruno; Cangelosi, Angelo; Gamotina, Olesya; Masala, Giovanni Luca

    2015-01-01

    Communicative interactions involve a kind of procedural knowledge that is used by the human brain for processing verbal and nonverbal inputs and for language production. Although considerable work has been done on modeling human language abilities, it has been difficult to bring them together into a comprehensive tabula rasa system compatible with current knowledge of how verbal information is processed in the brain. This work presents a cognitive system, entirely based on a large-scale neural architecture, which was developed to shed light on the procedural knowledge involved in language elaboration. The main component of this system is the central executive, which is a supervising system that coordinates the other components of the working memory. In our model, the central executive is a neural network that takes as input the neural activation states of the short-term memory and yields as output mental actions, which control the flow of information among the working memory components through neural gating mechanisms. The proposed system is capable of learning to communicate through natural language starting from tabula rasa, without any a priori knowledge of the structure of phrases, the meaning of words, or the role of the different classes of words, only by interacting with a human through a text-based interface, using an open-ended incremental learning process. It is able to learn nouns, verbs, adjectives, pronouns and other word classes, and to use them in expressive language. The model was validated on a corpus of 1587 input sentences, based on the literature on early language assessment at the level of about a 4-year-old child, and produced 521 output sentences, expressing a broad range of language processing functionalities.
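
    The abstract describes the central executive as a network that maps short-term-memory activation states to mental actions, which in turn gate information flow among working-memory components. The sketch below illustrates only that gating idea; the layer sizes, softmax action selection, and component names are assumptions, not the architecture used in the paper.

    ```python
    # A minimal sketch of the gating idea described in the abstract: a "central
    # executive" network reads the activation state of short-term memory and emits a
    # mental action that opens or closes gates between working-memory components.
    # Layer sizes, the softmax action choice, and the component names are
    # illustrative assumptions, not the architecture used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    N_STM, N_ACTIONS = 16, 3
    ACTIONS = ["copy_phonological_to_semantic", "retrieve_from_long_term", "produce_word"]

    # Hypothetical trained weights of the central-executive network (random here).
    W = rng.normal(scale=0.1, size=(N_ACTIONS, N_STM))

    def central_executive(stm_activations):
        """Map short-term-memory activations to a distribution over mental actions."""
        logits = W @ stm_activations
        probs = np.exp(logits - logits.max())
        return probs / probs.sum()

    def gate(source, probs, action_index, threshold=0.5):
        """Neural gating: pass the source signal on only if the action is selected."""
        return source if probs[action_index] > threshold else np.zeros_like(source)

    stm = rng.random(N_STM)                  # current short-term memory state
    p = central_executive(stm)
    print(dict(zip(ACTIONS, np.round(p, 2))))
    signal = rng.random(4)                   # toy content of one working-memory buffer
    print(gate(signal, p, action_index=0))   # passed through only if that action wins
    ```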

  11. LANDSAT D data transmission and dissemination study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    An assessment of the quantity of data processed by the system is presented, and the various methods for transmission within the system are investigated. Various methods of data storage are considered. It is concluded that the entire processing system should be located in White Sands, New Mexico.

  12. Mathematical model of whole-process calculation for bottom-blowing copper smelting

    NASA Astrophysics Data System (ADS)

    Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song

    2017-11-01

    The distribution law of materials in smelting products is key to cost accounting and contaminant control. However, the distribution law is difficult to determine quickly and accurately by sampling and analysis alone. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control index constraints in copper bottom-blowing smelting. A simulation of the entire bottom-blowing copper smelting process was built on the self-developed MetCal software platform, and a whole-process simulation for an enterprise in China was then conducted. Results indicated that the quantity and composition of unknown materials, as well as heat balance information, can be quickly calculated using the model. Comparison with production data revealed that the model can basically reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.
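
    As a minimal illustration of the element-conservation constraint such models are built on, the sketch below solves a tiny linear balance for two unknown stream masses from hypothetical assays. The species, streams, and numbers are invented for illustration and are not the paper's data or the MetCal implementation.

    ```python
    # A minimal sketch of the element-conservation idea behind a whole-process
    # balance model: given assays of known streams, solve a linear system for the
    # unknown stream quantities. The two-element, two-unknown example (Cu and Fe
    # split between matte and slag) is entirely hypothetical.

    import numpy as np

    feed_mass = 100.0                      # t of concentrate (hypothetical)
    feed_assay = {"Cu": 0.25, "Fe": 0.28}  # mass fractions (hypothetical)

    # Unknowns: x = [matte mass, slag mass].
    # Rows: Cu balance, Fe balance.  Columns: assays of matte and slag (hypothetical).
    A = np.array([[0.55, 0.02],    # Cu fraction in matte, slag
                  [0.18, 0.45]])   # Fe fraction in matte, slag
    b = np.array([feed_mass * feed_assay["Cu"],
                  feed_mass * feed_assay["Fe"]])

    matte, slag = np.linalg.solve(A, b)
    print(f"matte ~ {matte:.1f} t, slag ~ {slag:.1f} t")
    ```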

  13. Methodological challenges when doing research that includes ethnic minorities: a scoping review.

    PubMed

    Morville, Anne-Le; Erlandsson, Lena-Karin

    2016-11-01

    There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process: defining and recruiting samples, conceptual understanding, the lack of appropriate instruments, data collection through interpreters, and data analysis. In order to avoid excluding ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.

  14. A combined electron beam/optical lithography process step for the fabrication of sub-half-micron-gate-length MMIC chips

    NASA Technical Reports Server (NTRS)

    Sewell, James S.; Bozada, Christopher A.

    1994-01-01

    Advanced radar and communication systems rely heavily on state-of-the-art microelectronics. Systems such as the phased-array radar require many transmit/receive (T/R) modules which are made up of many millimeter wave - microwave integrated circuits (MMIC's). The heart of a MMIC chip is the Gallium Arsenide (GaAs) field-effect transistor (FET). The transistor gate length is the critical feature that determines the operating frequency of the radar system. A smaller gate length will typically result in a higher frequency. In order to make a phased array radar system economically feasible, manufacturers must be capable of producing very large quantities of small-gate-length MMIC chips at a relatively low cost per chip. This requires the processing of a large number of wafers with a large number of chips per wafer, minimum processing time, and a very high chip yield. One of the bottlenecks in the fabrication of MMIC chips is the transistor gate definition. The definition of sub-half-micron gates for GaAs-based field-effect transistors is generally performed by direct-write electron beam lithography (EBL). Because of the throughput limitations of EBL, the gate-layer fabrication is conventionally divided into two lithographic processes where EBL is used to generate the gate fingers and optical lithography is used to generate the large-area gate pads and interconnects. As a result, two complete sequences of resist application, exposure, development, metallization and lift-off are required for the entire gate structure. We have baselined a hybrid process, referred to as EBOL (electron beam/optical lithography), in which a single application of a multi-level resist is used for both exposures. The entire gate structure (gate fingers, interconnects and pads) is then formed with a single metallization and lift-off process. The EBOL process thus retains the advantages of the high-resolution E-beam lithography and the high throughput of optical lithography while essentially eliminating an entire lithography/metallization/lift-off process sequence. This technique has been proven to be reliable for both trapezoidal and mushroom gates and has been successfully applied to metal-semiconductor and high-electron-mobility field-effect transistor (MESFET and HEMT) wafers containing devices with gate lengths down to 0.10 micron and 75 x 75 micron gate pads. The yields and throughput of these wafers have been very high with no loss in device performance. We will discuss the entire EBOL process technology including the multilayer resist structure, exposure conditions, process sensitivities, metal edge definition, device results, comparison to the standard gate-layer process, and its suitability for manufacturing.

  15. A combined electron beam/optical lithography process step for the fabrication of sub-half-micron-gate-length MMIC chips

    NASA Astrophysics Data System (ADS)

    Sewell, James S.; Bozada, Christopher A.

    1994-02-01

    Advanced radar and communication systems rely heavily on state-of-the-art microelectronics. Systems such as the phased-array radar require many transmit/receive (T/R) modules which are made up of many millimeter wave - microwave integrated circuits (MMIC's). The heart of a MMIC chip is the Gallium Arsenide (GaAs) field-effect transistor (FET). The transistor gate length is the critical feature that determines the operating frequency of the radar system. A smaller gate length will typically result in a higher frequency. In order to make a phased array radar system economically feasible, manufacturers must be capable of producing very large quantities of small-gate-length MMIC chips at a relatively low cost per chip. This requires the processing of a large number of wafers with a large number of chips per wafer, minimum processing time, and a very high chip yield. One of the bottlenecks in the fabrication of MMIC chips is the transistor gate definition. The definition of sub-half-micron gates for GaAs-based field-effect transistors is generally performed by direct-write electron beam lithography (EBL). Because of the throughput limitations of EBL, the gate-layer fabrication is conventionally divided into two lithographic processes where EBL is used to generate the gate fingers and optical lithography is used to generate the large-area gate pads and interconnects. As a result, two complete sequences of resist application, exposure, development, metallization and lift-off are required for the entire gate structure. We have baselined a hybrid process, referred to as EBOL (electron beam/optical lithography), in which a single application of a multi-level resist is used for both exposures. The entire gate structure (gate fingers, interconnects and pads) is then formed with a single metallization and lift-off process. The EBOL process thus retains the advantages of the high-resolution E-beam lithography and the high throughput of optical lithography while essentially eliminating an entire lithography/metallization/lift-off process sequence. This technique has been proven to be reliable for both trapezoidal and mushroom gates and has been successfully applied to metal-semiconductor and high-electron-mobility field-effect transistor (MESFET and HEMT) wafers containing devices with gate lengths down to 0.10 micron and 75 x 75 micron gate pads. The yields and throughput of these wafers have been very high with no loss in device performance. We will discuss the entire EBOL process technology including the multilayer resist structure, exposure conditions, process sensitivities, metal edge definition, device results, comparison to the standard gate-layer process, and its suitability for manufacturing.

  16. Remaining gaps for "safe" CO2 storage: the INGV CO2GAPS vision of "learning by doing" monitoring geogas leakage, reservoirs contamination/mixing and induced/triggered seismicity

    NASA Astrophysics Data System (ADS)

    Quattrocchi, F.; Vinciguerra, S.; Chiarabba, C.; Boschi, E.; Anselmi, M.; Burrato, P.; Buttinelli, M.; Cantucci, B.; Cinti, D.; Galli, G.; Improta, L.; Nazzari, M.; Pischiutta, M.; Pizzino, L.; Procesi, M.; Rovelli, A.; Sciarra, A.; Voltattorni, N.

    2012-12-01

    The CO2GAPS project proposed by INGV is intended to build up a European proposal for a new kind of research strategy in the field of geogas storage. The aim of the project would be to fill key gaps concerning the main risks associated with CO2 storage and their implications for the entire Carbon Capture and Storage (CCS) process, which are: i) geogas leakage into soils and shallow aquifers, up to indoor seepage; ii) contamination/mixing of reservoirs by hydrocarbons and heavy metals; iii) induced or triggered seismicity and microseismicity, especially on blind seismogenic faults. In order to address such risks and ease public acceptance of CCS, a new research approach is needed, based on: i) better multi-disciplinary and "site-specific" risk assessment; ii) the development of more reliable multi-disciplinary monitoring protocols. In this view, robust pre-injection baselines (seismicity and degassing) as well as identification and discrimination criteria for potential anomalies are mandatory. CO2 injection dynamic modelling presently does not consider reservoir geomechanical properties during large-scale reactive mass-transport simulations. Complex simulations of the simultaneous physico-chemical processes involving CO2-rich plumes, which move, react and help to crack the reservoir rocks, are not yet fully performed. These activities should not be carried out only by the oil-gas/electric companies: the know-how gained should be shared among CCS industrial operators and research institutions, with government support and oversight, flanked by a transparent and "peer-reviewed" scientific popularization process. In this context, a preliminary and reliable 3D modelling of the entire "storage complex" as defined by the European Directive 31/2009 is strictly necessary, taking into account the above-mentioned geological, geochemical and geophysical risks. New scientific results could also highlight the opportunities recently shown by strategic research on the synergies among uses of underground space (e.g. CH4 and CO2 storage and deep geothermics) for energy supply purposes. The CO2GAPS approach would merge geomechanical and geochemical data with seismic velocity and anisotropy properties of the crust, induced seismicity data, gravimetry, EM techniques, and "early alarm" procedures for leakage/crack detection in shallow geo-spheres (e.g. abandoned wells, naturally seismic and degassing zones). Moreover, a full merging of those data is necessary for reliable 3D Earth modelling and the subsequent reactive transport simulations. The CO2GAPS vision would apply and verify these approaches at several selected European sites, also taking into account complex systems such as "inland" active faulted blocks close to potential off-shore CO2 storage sites, fault-prone ECBM areas, "inland" injection test sites and natural faulted CO2 analogues. These activities focus on the study of the long-term fate of stored CO2, leakage mechanisms through the cap-rock and/or abandoned wells, well cement reactivity, the effects of impurities in the CO2 streams and their removal costs, the use of tracers, and the role of biota.

  17. Thickness dependent charge transfer states and dark carriers density in vacuum deposited small molecule organic photocell

    NASA Astrophysics Data System (ADS)

    Shekhar, Himanshu; Tzabari, Lior; Solomeshch, Olga; Tessler, Nir

    2016-10-01

    We have investigated the influence of the active layer thickness on the balance of the internal mechanisms affecting the efficiency of copper phthalocyanine - fullerene (C60) based vacuum-deposited bulk heterojunction organic photocells. We fabricated a range of devices in which the thickness of the active layer varied from 40 to 120 nm and assessed their performance using optical and electrical characterization techniques. As reported previously for phthalocyanine:C60, the performance of the device is highly dependent on the active layer thickness, and of all the thicknesses we tried, the device with the 40 nm active layer showed the best solar cell characteristic parameters. Using the transfer-matrix-based optical model, which includes interference effects, we calculated the optical power absorbed in the active layers over the entire absorption band and found that this cannot explain the trend with thickness. Measurement of the cell quantum efficiency as a function of light intensity showed that the relative weight of the device's internal processes changes when going from a 40 nm to a 120 nm thick active layer. Electrical modeling of the device, which takes the different internal processes into account, allowed us to quantify the changes in the processes affecting the generation-recombination balance. Sub-gap external quantum efficiency measurements and morphological analysis of the film surfaces agree with the model's results. We found that as the thickness grows, the density of charge transfer states and of dark carriers increases and the uniformity in the vertical direction is reduced.
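
    The transfer-matrix optical model mentioned above has a compact standard form at normal incidence. The sketch below computes reflectance and transmittance of a thin-film stack from layer characteristic matrices; the layer sequence and complex refractive indices are hypothetical placeholders, not the actual CuPc:C60 device stack or its measured optical constants.

    ```python
    # A minimal sketch of a normal-incidence transfer-matrix calculation of the kind
    # mentioned in the abstract (an optical model including interference in a
    # thin-film stack). All layer thicknesses and complex refractive indices below
    # are hypothetical placeholders.

    import numpy as np

    def layer_matrix(n_complex, d_nm, wavelength_nm):
        """Characteristic matrix of one homogeneous layer at normal incidence."""
        delta = 2 * np.pi * n_complex * d_nm / wavelength_nm   # phase thickness
        eta = n_complex                                         # normalized optical admittance
        return np.array([[np.cos(delta), 1j * np.sin(delta) / eta],
                         [1j * eta * np.sin(delta), np.cos(delta)]])

    def reflectance_transmittance(layers, n_in, n_sub, wavelength_nm):
        """R and T of a stack of (n, d) layers between incident medium and exit medium."""
        M = np.eye(2, dtype=complex)
        for n_c, d in layers:
            M = M @ layer_matrix(n_c, d, wavelength_nm)
        B, C = M @ np.array([1.0, n_sub])
        denom = n_in * B + C
        R = abs((n_in * B - C) / denom) ** 2
        T = 4 * n_in * n_sub.real / abs(denom) ** 2
        return R, T

    # Hypothetical stack: glass / transparent electrode / absorbing blend / metal, at 600 nm.
    layers = [(1.9 + 0.01j, 120.0),     # ITO-like electrode
              (2.0 + 0.30j, 80.0),      # absorbing blend (k > 0 gives absorption)
              (1.2 + 7.00j, 100.0)]     # metal back contact
    R, T = reflectance_transmittance(layers, n_in=1.5, n_sub=complex(1.0), wavelength_nm=600.0)
    print(f"R = {R:.3f}, T = {T:.3f}, absorbed fraction ~ {1 - R - T:.3f}")
    ```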

  18. Distinct cytoskeleton populations and extensive crosstalk control Ciona notochord tubulogenesis.

    PubMed

    Dong, Bo; Deng, Wei; Jiang, Di

    2011-04-01

    Cell elongation is a fundamental process that allows cells and tissues to adopt new shapes and functions. During notochord tubulogenesis in the ascidian Ciona intestinalis, a dramatic elongation of individual cells takes place that lengthens the notochord and, consequently, the entire embryo. We find a novel dynamic actin- and non-muscle myosin II-containing constriction midway along the anteroposterior aspect of each notochord cell during this process. Both actin polymerization and myosin II activity are required for the constriction and cell elongation. Discontinuous localization of myosin II in the constriction indicates that the actomyosin network produces local contractions along the circumference. This reveals basal constriction by the actomyosin network as a novel mechanism for cell elongation. Following elongation, the notochord cells undergo a mesenchymal-epithelial transition and form two apical domains at opposite ends. Extracellular lumens then form at the apical surfaces. We show that cortical actin and Ciona ezrin/radixin/moesin (ERM) are essential for lumen formation and that a polarized network of microtubules, which contributes to lumen development, forms in an actin-dependent manner at the apical cortex. Later in notochord tubulogenesis, when notochord cells initiate a bi-directional crawling movement on the notochordal sheath, the microtubule network rotates 90° and becomes organized as parallel bundles extending towards the leading edges of tractive lamellipodia. This process is required for the correct organization of actin-based protrusions and subsequent lumen coalescence. In summary, we establish the contribution of the actomyosin and microtubule networks to notochord tubulogenesis and reveal extensive crosstalk and regulation between these two cytoskeleton components.

  19. Variations in understanding the drug-prescribing process: a qualitative study among Swedish GPs.

    PubMed

    Rahmner, Pia Bastholm; Gustafsson, Lars L; Larsson, Jan; Rosenqvist, Urban; Tomson, Göran; Holmström, Inger

    2009-04-01

    A majority of doctor-patient meetings result in the patient getting a prescription. This underlines the need for a high-quality prescription process. While studies have been made on single therapeutic drug groups, a complete study of physicians' general thought process encompassing the prescription of all drugs still remains to be made. To identify variations in ways of understanding drug prescribing among GPs, a descriptive qualitative study was conducted with 20 Swedish physicians. Informants were recruited purposively and their understandings of prescribing were studied in semi-structured interviews. Data were analysed using a phenomenographic approach. Five categories were identified, as follows: (A) the GP prescribed safe, reliable and well-documented drugs for obvious complaints; (B) the GP sought to convince the patient of the most effective drug treatment; (C) the GP chose the best drug treatment taking into consideration the patient's entire life situation; (D) the GP used clinical judgement and close follow-up to minimize unnecessary drug prescribing; and (E) the GP prescribed drugs which are cheap for society and environmentally friendly. The categories are interrelated but have different foci: the biomedical, the patient and society. Each GP held more than one view, but none included all five. The findings also indicate that complexity increases when a drug is prescribed for primary or secondary prevention. GPs understand prescribing differently despite similar external circumstances. The most significant factor influencing prescribing behaviour was the physician's approach to the patient relationship. GPs may need to reflect on the difficulties they face while prescribing in order to enhance their understanding.

  20. [Current status of thoracoscopic surgery for thoracic and lumbar spine. Part 2: treatment of the thoracic disc hernia, spinal deformities, spinal tumors, infections and miscellaneous].

    PubMed

    Verdú-López, Francisco; Beisse, Rudolf

    2014-01-01

    Thoracoscopic surgery or video-assisted thoracic surgery (VATS) of the thoracic and lumbar spine has evolved greatly since it appeared less than 20 years ago. It is currently used in a large number of processes and injuries. The aim of this article, in its two parts, is to review the current status of VATS of the thoracic and lumbar spine in its entire spectrum. After reviewing the current literature, we developed each of the large groups of indications where VATS takes place, one by one. This second part reviews and discusses the management, treatment and specific thoracoscopic technique in thoracic disc herniation, spinal deformities, tumour pathology, infections of the spine and other possible indications for VATS. Thoracoscopic surgery is in many cases an alternative to conventional open surgery. The transdiaphragmatic approach has made endoscopic treatment of many thoracolumbar junction processes possible, thus widening the spectrum of therapeutic indications. These include the treatment of spinal deformities, spinal tumours, infections and other pathological processes, as well as the reconstruction of injured spinal segments and decompression of the spinal canal if lesion placement is favourable to antero-lateral approach. Good clinical results of thoracoscopic surgery are supported by growing experience reflected in a large number of articles. The degree of complications in thoracoscopic surgery is comparable to open surgery, with benefits in regard to morbidity of the approach and subsequent patient recovery. Copyright © 2012 Sociedad Española de Neurocirugía. Published by Elsevier España. All rights reserved.
