Sample records for entire development process

  1. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire workflow management in a developing country. Methods: The entire workflow QA process management starts from patient registration to the end of the last treatment, including all steps through the entire radiotherapy process. The error rate of chart check is used to evaluate the entire workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6000 patients before and after implementing the entire workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information exporting to the OIS (Oncology Information System), documents of treatment QA, and QA of the treatment history. The error rate derived from the chart check decreased from 1.7% to 0.9% after implementing the entire workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed accordingly to prevent those errors. Conclusion: The entire workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience can be applicable to heavily loaded radiotherapy departments in developing countries.

  2. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction: Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose: To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method: Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions: Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  3. An Optimization of Manufacturing Systems using a Feedback Control Scheduling Model

    NASA Astrophysics Data System (ADS)

    Ikome, John M.; Kanakana, Grace M.

    2018-03-01

    In complex production systems that involve multiple processes, unplanned disruptions often make the entire production system vulnerable to a number of problems, which leads to customer dissatisfaction. This has been an ongoing problem that requires research and methods to streamline the entire process, or a model that addresses it. In response, we have developed a feedback scheduling model that can minimize some of these problems; a number of experiments show that several of them can be eliminated if the correct remedial actions are implemented on time.

  4. A PROCESS FOR DEVELOPING AND EVALUATING INDICES OF FISH ASSEMBLAGE INTEGRITY

    EPA Science Inventory

    We describe a general process for developing an index of fish assemblage integrity, using the Willamette Valley of Oregon, U.S.A., as an example. Such an index is useful for assessing the effects of humans on entire fish assemblages, and the general process can be applied to any ...

  5. Cell-accurate optical mapping across the entire developing heart.

    PubMed

    Weber, Michael; Scherf, Nico; Meyer, Alexander M; Panáková, Daniela; Kohl, Peter; Huisken, Jan

    2017-12-29

    Organogenesis depends on orchestrated interactions between individual cells and morphogenetically relevant cues at the tissue level. This is true for the heart, whose function critically relies on well-ordered communication between neighboring cells, which is established and fine-tuned during embryonic development. For an integrated understanding of the development of structure and function, we need to move from isolated snap-shot observations of either microscopic or macroscopic parameters to simultaneous and, ideally continuous, cell-to-organ scale imaging. We introduce cell-accurate three-dimensional Ca2+-mapping of all cells in the entire electro-mechanically uncoupled heart during the looping stage of live embryonic zebrafish, using high-speed light sheet microscopy and tailored image processing and analysis. We show how myocardial region-specific heterogeneity in cell function emerges during early development and how structural patterning goes hand-in-hand with functional maturation of the entire heart. Our method opens the way to systematic, scale-bridging, in vivo studies of vertebrate organogenesis by cell-accurate structure-function mapping across entire organs.

  6. Cell-accurate optical mapping across the entire developing heart

    PubMed Central

    Meyer, Alexander M; Panáková, Daniela; Kohl, Peter

    2017-01-01

    Organogenesis depends on orchestrated interactions between individual cells and morphogenetically relevant cues at the tissue level. This is true for the heart, whose function critically relies on well-ordered communication between neighboring cells, which is established and fine-tuned during embryonic development. For an integrated understanding of the development of structure and function, we need to move from isolated snap-shot observations of either microscopic or macroscopic parameters to simultaneous and, ideally continuous, cell-to-organ scale imaging. We introduce cell-accurate three-dimensional Ca2+-mapping of all cells in the entire electro-mechanically uncoupled heart during the looping stage of live embryonic zebrafish, using high-speed light sheet microscopy and tailored image processing and analysis. We show how myocardial region-specific heterogeneity in cell function emerges during early development and how structural patterning goes hand-in-hand with functional maturation of the entire heart. Our method opens the way to systematic, scale-bridging, in vivo studies of vertebrate organogenesis by cell-accurate structure-function mapping across entire organs. PMID:29286002

  7. Why Can't a Computer Be More Like a Brain?

    ERIC Educational Resources Information Center

    Lerner, Eric J.

    1984-01-01

    Engineers seeking to develop intelligent computers have looked to studies of the human brain in hope of imitating its processes. A theory (known as cooperative action) that the brain processes information with electromagnetic waves may inspire engineers to develop entirely new types of computers. (JN)

  8. GREENSCOPE Technical User’s Guide

    EPA Pesticide Factsheets

    GREENSCOPE’s methodology has been developed and its software tool designed such that it can be applied to an entire process, to a piece of equipment or process unit, or at the investigatory bench scale.

  9. Distributed Group Design Process: Lessons Learned.

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ganesan, Radha

    A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…

  10. Instructional Systems Development

    ERIC Educational Resources Information Center

    Watson, Russell

    The United States Army, confronted with sophisticated defense machinery and entry level soldiers with low educational backgrounds, selected a systems approach to training that was developed in 1975 by Florida State University. Instructional Systems Development (ISD), a five-phase process encompassing the entire educational environment, is…

  11. Capturing molecular multimode relaxation processes in excitable gases based on decomposition of acoustic relaxation spectra

    NASA Astrophysics Data System (ADS)

    Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng

    2017-08-01

    Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
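    The core idea of this record, that a multimode spectrum is the sum of N interior single-relaxation processes whose strengths and relaxation times can be recovered from measurements at 2N frequencies, can be illustrated with a small fitting sketch. The Debye-style line shape, the parameter values, and the function names below are illustrative assumptions rather than the authors' exact formulation.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def single_relaxation(f, strength, tau):
        # Dimensionless absorption-per-wavelength of one relaxation process
        # (Debye-style line shape, assumed here for illustration).
        wt = 2.0 * np.pi * f * tau
        return strength * wt / (1.0 + wt**2)

    def multi_relaxation(f, params):
        # Sum of N single-relaxation processes; params = [s1, tau1, s2, tau2, ...].
        return sum(single_relaxation(f, s, t)
                   for s, t in zip(params[0::2], params[1::2]))

    # Synthetic "measurements" at 2N frequencies for N = 2 interior processes.
    true_params = np.array([0.8, 1e-6, 0.3, 1e-4])   # strengths and times
    freqs = np.logspace(3, 7, 4)                     # 2N = 4 frequencies (Hz)
    measured = multi_relaxation(freqs, true_params)

    # Recover the N strengths and relaxation times from the 2N measurements.
    fit = least_squares(lambda p: multi_relaxation(freqs, p) - measured,
                        x0=np.array([0.5, 1e-5, 0.5, 1e-3]),
                        bounds=(1e-12, np.inf))
    print(fit.x)  # should approach true_params (up to local-minimum caveats)
    ```

    With 2N data points and 2N unknowns (N strengths, N times), the system is exactly determined, which is the counting argument behind requiring measurements at 2N frequencies.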

  12. Re-Engineering the Curriculum at a Rural Institution: Reflection on the Process of Development

    ERIC Educational Resources Information Center

    Naude, A.; Wium, A. M.; du Plessis, S.

    2011-01-01

    The Department of Speech-Language Pathology and Audiology at the University of Limpopo (Medunsa Campus) redesigned their curriculum at the beginning of 2010. The template that was developed shows the horizontal and vertical integration of outcomes. Although the outcomes of the entire process met the requirements of the Health Professions Council…

  13. The "National Medium- and Long-Term Educational Reform and Development Guideline (2010-20)": Expectations, Strategies, and Significance

    ERIC Educational Resources Information Center

    Xiaobing, Sun

    2012-01-01

    This paper starts out by describing the research and drafting processes of the "National Medium- and Long-Term Educational Reform and Development Guideline" (2010-20) (hereafter abbreviated as the "Guideline") and analyzes a series of core concepts that ran through the entire process of researching and drafting the…

  14. Technical Writing: Process and Product. Third Edition.

    ERIC Educational Resources Information Center

    Gerson, Sharon J.; Gerson, Steven M.

    This book guides students through the entire writing process--prewriting, writing, and rewriting--developing an easy-to-use, step-by-step technique for writing the types of documents they will encounter on the job. It engages students in the writing process and encourages hands-on application as well as discussions about ethics, audience…

  15. Cognitive Design for Learning: Cognition and Emotion in the Design Process

    ERIC Educational Resources Information Center

    Hasebrook, Joachim

    2016-01-01

    We are so used to accepting new technologies as the driver of change and innovation in human computer interfaces (HCI). In our research we focus on the development of innovations as a design process--or design, for short. We also refer to the entire process of creating innovations and putting them to use as "cognitive processes"--or…

  16. Use of lean and six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center.

    PubMed

    Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude

    2011-07-01

    Operating rooms (ORs) are resource-intense and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, i.e., personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferable to other specialties.

  17. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    NASA Astrophysics Data System (ADS)

    Kumlander, Deniss

    The globalization of companies' operations and competition between software vendors demand improved quality of delivered software and decreased overall cost. These same forces, in fact, introduce many problems into the software development process, as they produce distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity, in order to bridge communication and workflow gaps by managing the entire communication process rather than concentrating purely on the communication result.

  18. Discussion on the Development of Green Chemistry and Chemical Engineering

    NASA Astrophysics Data System (ADS)

    Zhang, Yunshen

    2017-11-01

    The chemical industry plays a vital role in the development process of the national economy. However, in view of the special nature of the chemical industry, a large number of poisonous and harmful substances pose a great threat to the ecological environment and human health in the entire process of raw material acquisition, production, transportation, product manufacturing, and final practical application. Therefore, it is a general trend to promote the development of chemistry and chemical engineering towards a greener environment. This article will focus on some basic problems that occur in the development process of green chemistry and chemical engineering.

  19. Online Student Orientation in Higher Education: A Developmental Study

    ERIC Educational Resources Information Center

    Cho, Moon-Heum

    2012-01-01

    Although orientation for online students is important to their success, little information about how to develop an online student orientation (OSO) has appeared in the literature; therefore, the purpose of this article was to describe the entire process of developing an OSO. This article describes the analysis, design, development, and evaluation…

  20. Expanding lean thinking to the product and process design and development within the framework of sustainability

    NASA Astrophysics Data System (ADS)

    Sorli, M.; Sopelana, A.; Salgado, M.; Pelaez, G.; Ares, E.

    2012-04-01

    Companies require tools to change towards a new way of developing and producing innovative products, manufactured with consideration of the economic, social and environmental impact along the product life cycle. By translating Lean principles in Product Development (PD) from the design stage onward, along the entire product life cycle, the aim is to address both sustainability and environmental issues. The drivers of a sustainable culture within lean PD have been identified, and a baseline for future research on the development of appropriate tools and techniques has been provided. This research provides industry with a framework that balances environmental and sustainability factors with lean principles, to be considered and incorporated from the beginning of product design and development, covering the entire product life cycle.

  1. Quantum efficiency and dark current evaluation of a backside illuminated CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Vereecke, Bart; Cavaco, Celso; De Munck, Koen; Haspeslagh, Luc; Minoglou, Kyriaki; Moore, George; Sabuncuoglu, Deniz; Tack, Klaas; Wu, Bob; Osman, Haris

    2015-04-01

    We report on the development and characterization of monolithic backside illuminated (BSI) imagers at imec. Different surface passivation, anti-reflective coatings (ARCs), and anneal conditions were implemented, and their effects on dark current (DC) and quantum efficiency (QE) are analyzed. Two different single-layer ARC materials were developed for visible light and near-UV applications, respectively. QE above 75% over the entire visible spectrum range from 400 to 700 nm is measured. In the spectral range from 260 to 400 nm wavelength, QE values above 50% over the entire range are achieved. A new technique, high pressure hydrogen anneal at 20 atm, was applied on photodiodes, and an improvement in DC of 30% was observed for the BSI imager with HfO2 as ARC as well as for the front side imager. The entire BSI process was developed on 200 mm wafers and evaluated on test diode structures. The know-how is then transferred to real imager sensor arrays.

  2. The Montana experience

    NASA Technical Reports Server (NTRS)

    Dundas, T. R.

    1981-01-01

    The development and capabilities of the Montana geodata system are discussed. The system is entirely dependent on the state's central data processing facility which serves all agencies and is therefore restricted to batch mode processing. The computer graphics equipment is briefly described along with its application to state lands and township mapping and the production of water quality interval maps.

  3. Management of major system programs and projects. Handbook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Handbook establishes the detailed policies and processes for implementing NMI 7120.4, 'Management of Major System Programs and Projects'. It constitutes a comprehensive source of the specific policies and processes governing management of major development programs/projects and is intended as a resource to the entire program/project management (PPM) community.

  4. Extending the Marine Microcosm Laboratory

    ERIC Educational Resources Information Center

    Ryswyk, Hal Van; Hall, Eric W.; Petesch, Steven J.; Wiedeman, Alice E.

    2007-01-01

    The traditional range of marine microcosm laboratory experiments is presented as an ideal environment in which to teach the entire analysis process. The microcosm lab provides a student-centered approach with opportunities for collaborative learning and for developing critical communication skills.

  5. Production Techniques for Computer-Based Learning Material.

    ERIC Educational Resources Information Center

    Moonen, Jef; Schoenmaker, Jan

    Experiences in the development of educational software in the Netherlands have included the use of individual and team approaches, the determination of software content and how it should be presented, and the organization of the entire development process, from experimental programs to prototype to final product. Because educational software is a…

  6. Understanding Personality Development: An Integrative State Process Model

    ERIC Educational Resources Information Center

    Geukes, Katharina; van Zalk, Maarten; Back, Mitja D.

    2018-01-01

    While personality is relatively stable over time, it is also subject to change across the entire lifespan. On a macro-analytical level, empirical research has identified patterns of normative and differential development that are affected by biological and environmental factors, specific life events, and social role investments. On a…

  7. Augmenting SCA project management and automation framework

    NASA Astrophysics Data System (ADS)

    Iyapparaja, M.; Sharma, Bhanupriya

    2017-11-01

    In daily life we need to keep records of things in order to manage them in a more efficient and proper way. Our company manufactures semiconductor chips and sells them to buyers. Sometimes it manufactures the entire product, sometimes only part of it, and sometimes it sells the intermediate product obtained during manufacturing; for better management of the entire process there is therefore a need to keep track of all the entities involved. Materials and Methods: To overcome this problem, the need arose to develop frameworks for the maintenance of the project and for automation testing. The project management framework provides an architecture which supports managing the project by maintaining records of all requirements, the test cases created for testing each unit of the software, and defects raised in past years. Through this, the quality of the project can be maintained. Results: The automation framework provides the architecture which supports the development and implementation of automation test scripts for the software testing process. Conclusion: To implement the project management framework, the HP product Application Lifecycle Management is used, which provides a central repository to maintain the project.

  8. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  9. Rigor + Results = Impact: Measuring Impact with Integrity (Invited)

    NASA Astrophysics Data System (ADS)

    Davis, H. B.; Scalice, D.

    2013-12-01

    Are you struggling to measure and explain the impact of your EPO efforts? The NASA Astrobiology Institute (NAI) is using an evaluation process to determine the impact of its 15 EPO projects with over 200 activities. What is the current impact? How can it be improved in the future? We have developed a process that preserves autonomy at the project implementation level while still painting a picture of the entire portfolio. The impact evaluation process looks at an education/public outreach activity through its entire project cycle. Working with an external evaluator, education leads: 1) rate the quality/health of an activity in each stage of its cycle, and 2) determine the impact based on the results of the evaluation and the rigor of the methods used. The process has created a way to systematically codify a project's health and its impact, while offering support for improving both impact and how it is measured.

  10. Analysis of stress-strain relationships in silicon ribbon

    NASA Technical Reports Server (NTRS)

    Dillon, O. W., Jr.

    1984-01-01

    An analysis of stress-strain relationships in silicon ribbon is presented. A model to represent the entire process, a dynamical Transit Analysis, is developed. It is found that knowledge of past strain history is significant in modeling activities.

  11. 78 FR 32424 - Notice of Issuance of Final Determination Concerning Monochrome Laser Printers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... for manufacture in the U.S. and subsequent sale to U.S. government agencies. Ricoh states that it developed the SP52000-series printers in Japan, and that the entire engineering, development, design and..., Ltd. At the initial stage of the printers production process, individual parts are assembled into...

  12. Research and Development: A Complex Relationship Part I [and] Part II.

    ERIC Educational Resources Information Center

    Pollard, John Douglas Edward

    Part 1 of this document describes the background, format, and early groundwork that went into the development of a test sponsored entirely by private enterprise. The discipline of a financial bottom line imposes special pressures but also offers new opportunities. This private enterprise model is a multi-constructional process where…

  13. An Integrated Model of Choices and Response Times in Absolute Identification

    ERIC Educational Resources Information Center

    Brown, Scott D.; Marley, A. A. J.; Donkin, Christopher; Heathcote, Andrew

    2008-01-01

    Recent theoretical developments in the field of absolute identification have stressed differences between relative and absolute processes, that is, whether stimulus magnitudes are judged relative to a shorter term context provided by recently presented stimuli or a longer term context provided by the entire set of stimuli. The authors developed a…

  14. Second Aerospace Environmental Technology Conference

    NASA Technical Reports Server (NTRS)

    Whitaker, A. F. (Editor); Clark-Ingram, M. (Editor)

    1997-01-01

    The mandated elimination of CFC'S, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application, verification, compliant coatings including corrosion protection system and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards.

  15. Second Aerospace Environmental Technology Conference

    NASA Technical Reports Server (NTRS)

    Whitaker, A. F.; Clark-Ingram, M.; Hessler, S. L.

    1997-01-01

    The mandated elimination of CFC's, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verifications, compliant coatings including corrosion protection systems, and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards.

  16. The development of the tapeworm Diphyllobothrium latum (L. 1756) (Cestoda; Pseudophyllidea) in its definitive hosts, with special references to the growth patterns of D. dendriticum (Nitzsch, 1824) and D. ditremum (Creplin, 1827).

    PubMed

    Andersen, K

    1978-08-01

    When Diphyllobothrium latum develops from larva to adult in a definitive host, it first sheds the entire larval 'body' before growth of an adult strobila starts. This process of shedding the entire larval abothrial extremity, piece by piece, takes about 48 h. By this time the larva has usually reached the anterior third of the small intestine of the host. D. dendriticum and D. ditremum develop quite differently, although exhibiting similar anterior migrations. In these two species the larvae develop directly into adults without the larval 'body' first being shed. The implications of the observed differences in growth pattern between these three species of Diphyllobothrium for the classification of diphyllobothriid cestodes are discussed briefly.

  17. Lakes and ponds recreation management: a state-wide application of the visitor impact management process

    Treesearch

    Jerry J. Vaske; Rodney R. Zwick; Maureen P. Donnelly

    1992-01-01

    The Visitor Impact Management (VIM) process is designed to identify unacceptable changes occurring as a result of visitor use and to develop management strategies to keep visitor impacts within acceptable levels. All previous attempts to apply the VIM planning framework have concentrated on specific resources. This paper expands this focus to an entire state. Based on...

  18. Effect of Humidity of Poly-Cereal Flour Mixture and Screw Rotation Rate on Efficiency of Extrusion Process

    ERIC Educational Resources Information Center

    Ospanov, Abdymanap A.; Muslimov, Nurzhan Zh.; Timurbekova, Aigul A.; Jumabekova, Gulnara B.

    2016-01-01

    The food industry is an important constituent of a country's economy, which provides the population with food. The development of the food industry and the supply of food products to the entire population requires improving food-manufacturing technologies, such as the process for production of extruded poly-cereal food products using…

  19. National policy and military doctrine: development of a nuclear concept of land warfare, 1949-1964

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bockar, D.

    In the thirty years that battlefield nuclear weapons have been available, no one has originated an idea of how they might be used as an entirely new weapon. New weapons are routinely introduced into existing combat organizations before an appropriate tactical concept has been invented. But never before in history has a new weapon been deployed on so massive a scale without a tactical concept that exploited the radical implications of its novel technology for traditional warfare. This study is an attempt to understand the problem of developing a persuasive tactical concept for nuclear weapons. The process of assimilation by which military organizations integrate new weapons with existing weapons in novel tactical and organizational concepts has an intellectual, an institutional, and a political dimension. The principle of civilian control, however, makes the process by which weapons are assimilated part of the process by which national security policies are made. In peacetime the military's formulation of doctrine is almost entirely consequent upon the world view, the methodological and managerial assumptions, and the domestic policy concerns of political authority.

  20. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  1. Financial issues for commercial space ventures: Paying for the dreams

    NASA Technical Reports Server (NTRS)

    Egan, J. J.

    1984-01-01

    Various financial issues involved in commercial space enterprise are discussed. Particular emphasis is placed on the materials processing area: the current state of business plan and financial developments, what is needed for enhanced probability of success of future materials development efforts in attracting financial backing, and finally, the risks involved in this entire business area.

  2. General purpose molecular dynamics simulations fully implemented on graphics processing units

    NASA Astrophysics Data System (ADS)

    Anderson, Joshua A.; Lorenz, Chris D.; Travesset, A.

    2008-05-01

    Graphics processing units (GPUs), originally developed for rendering real-time effects in computer games, now provide unprecedented computational power for scientific applications. In this paper, we develop a general purpose molecular dynamics code that runs entirely on a single GPU. It is shown that our GPU implementation provides performance equivalent to that of a fast 30-processor-core distributed memory cluster. Our results show that GPUs already provide an inexpensive alternative to such clusters; we also discuss the implications for the future.
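    As a rough illustration of the algorithmic core that such a code offloads to the GPU, the sketch below implements all-pairs Lennard-Jones forces with velocity-Verlet integration in plain NumPy. It shows the algorithm only; it is not the authors' implementation, and all parameter values are illustrative.

    ```python
    import numpy as np

    def lj_forces(pos, eps=1.0, sigma=1.0):
        # All-pairs Lennard-Jones forces, O(N^2); production GPU codes use
        # neighbor lists and evaluate this kernel one particle per thread.
        diff = pos[:, None, :] - pos[None, :, :]         # (N, N, 3) displacements
        r2 = np.sum(diff**2, axis=-1)
        np.fill_diagonal(r2, np.inf)                     # exclude self-interaction
        inv_r6 = (sigma**2 / r2) ** 3
        coeff = 24.0 * eps * (2.0 * inv_r6**2 - inv_r6) / r2
        return np.sum(coeff[:, :, None] * diff, axis=1)  # (N, 3) net forces

    def velocity_verlet(pos, vel, dt=1e-3, steps=100, mass=1.0):
        # Standard velocity-Verlet time integration.
        f = lj_forces(pos)
        for _ in range(steps):
            vel += 0.5 * dt * f / mass
            pos += dt * vel
            f = lj_forces(pos)
            vel += 0.5 * dt * f / mass
        return pos, vel

    # 4 x 4 x 4 lattice of 64 particles, spaced 1.5 sigma apart.
    grid = np.arange(4) * 1.5
    pos = np.array(np.meshgrid(grid, grid, grid, indexing="ij"),
                   dtype=float).reshape(3, -1).T.copy()
    vel = np.zeros_like(pos)
    pos, vel = velocity_verlet(pos, vel)
    ```

    The force loop dominates the cost and is embarrassingly parallel across particles, which is why mapping it onto thousands of GPU threads can match a small CPU cluster.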

  3. Methodological challenges when doing research that includes ethnic minorities: a scoping review.

    PubMed

    Morville, Anne-Le; Erlandsson, Lena-Karin

    2016-11-01

    There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process, from defining and recruiting samples, through conceptual understanding, the lack of appropriate instruments, and data collection using interpreters, to analyzing data. In order to avoid excluding ethnic minorities from adequate occupational therapy research and interventions, methods must be developed for the entire research process. This is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.

  4. Training for Skill in Fault Diagnosis

    ERIC Educational Resources Information Center

    Turner, J. D.

    1974-01-01

    The Knitting, Lace and Net Industry Training Board has developed a training innovation called fault diagnosis training. The entire training process concentrates on teaching based on the experiences of troubleshooters or any other employees whose main tasks involve fault diagnosis and rectification. (Author/DS)

  5. DEVELOPMENT OF THE VIRTUAL BEACH MODEL, PHASE 1: AN EMPIRICAL MODEL

    EPA Science Inventory

    With increasing attention focused on the use of multiple linear regression (MLR) modeling of beach fecal bacteria concentration, the validity of the entire statistical process should be carefully evaluated to assure satisfactory predictions. This work aims to identify pitfalls an...
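    For context, the MLR modeling referred to here is ordinary least-squares regression of (typically log-transformed) bacteria concentration on environmental covariates. The sketch below uses synthetic data; the covariates and coefficients are assumptions for illustration, not Virtual Beach outputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 50
    # Illustrative covariates: turbidity, 24 h rainfall, wave height (scaled 0-1).
    X = rng.uniform(0.0, 1.0, size=(n, 3))
    true_beta = np.array([1.2, 0.8, -0.5])
    log_conc = 2.0 + X @ true_beta + rng.normal(0.0, 0.1, n)  # synthetic response

    # Ordinary least squares with an intercept column.
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, log_conc, rcond=None)
    print(beta)  # approximately [2.0, 1.2, 0.8, -0.5]

    # Predict log concentration for a new set of conditions.
    x_new = np.array([1.0, 0.3, 0.7, 0.2])  # intercept term plus covariates
    print(x_new @ beta)
    ```

    Evaluating the "entire statistical process" the abstract mentions would add covariate selection and out-of-sample validation on top of a fit like this one.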

  6. Aerospace Environmental Technology Conference

    NASA Technical Reports Server (NTRS)

    Whitaker, A. F. (Editor)

    1995-01-01

    The mandated elimination of CFC's, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verifications, compliant coatings including corrosion protection systems, and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards. The Executive Summary of this Conference is published as NASA CP-3297.

  7. Mathematical model of whole-process calculation for bottom-blowing copper smelting

    NASA Astrophysics Data System (ADS)

    Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song

    2017-11-01

    The distribution law of materials in smelting products is key to cost accounting and contaminant control. However, the distribution law is difficult to determine quickly and accurately by mere sampling and analysis. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control index constraints in copper bottom-blowing smelting. A simulation of the entire process of bottom-blowing copper smelting was established using the self-developed MetCal software platform. A whole-process simulation for an enterprise in China was then conducted. Results indicated that the quantity and composition of unknown materials, as well as heat balance information, can be quickly calculated using the model. Comparison with production data revealed that the model can basically reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.
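    The material-balance side of such a model reduces, in its simplest form, to element-conservation equations solved for unknown stream masses. The streams, elements, and assay values in the sketch below are illustrative assumptions, not data from the paper or from MetCal.

    ```python
    import numpy as np

    # Mass fraction of each element (rows: Cu, Fe, S) in the two unknown output
    # streams (columns: matte, slag); assumed assay values.
    A = np.array([
        [0.55, 0.01],   # Cu
        [0.15, 0.35],   # Fe
        [0.22, 0.01],   # S
    ])

    feed_mass = 100.0                          # t/h of concentrate (illustrative)
    feed_assay = np.array([0.25, 0.28, 0.30])  # Cu, Fe, S mass fractions in feed
    element_in = feed_mass * feed_assay        # t/h of each element entering

    # Three conservation equations, two unknown stream masses: solve in the
    # least-squares sense (a full balance would add gas and dust streams).
    masses, residual, *_ = np.linalg.lstsq(A, element_in, rcond=None)
    print(dict(zip(["matte", "slag"], masses)))
    ```

    A whole-process model chains balances like this across smelting, converting, and refining stages, with the outputs of one stage feeding the next.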

  8. Decision Science Challenges for C2 Agility

    DTIC Science & Technology

    2014-06-01

    decision-making effectiveness, but also the adaptive capacities needed to assure the resilience of the decision-making process itself. New methods are needed to help...of the literature on human biases and limitations, and hence it has been formative of entire programs of research and development on

  9. Development of hydrogen peroxide technique for bioburden reduction

    NASA Astrophysics Data System (ADS)

    Rohatgi, N.; Schwartz, L.; Stabekis, P.; Barengoltz, J.

    In order to meet the National Aeronautics and Space Administration (NASA) Planetary Protection microbial reduction requirements for Mars in-situ life detection and sample return missions, entire planetary spacecraft (including planetary entry probes and planetary landing capsules) may have to be exposed to a qualified sterilization process. Presently, dry heat is the only NASA approved sterilization technique available for spacecraft application. However, with the increasing use of various man-made materials, highly sophisticated electronic circuit boards, and sensors in a modern spacecraft, compatibility issues may render this process unacceptable to design engineers and thus impractical to achieve terminal sterilization of the entire spacecraft. An alternative vapor phase hydrogen peroxide sterilization process, which is currently used in various industries, has been selected for further development. Strategic Technology Enterprises, Incorporated (STE), a subsidiary of STERIS Corporation, under a contract from the Jet Propulsion Laboratory (JPL) is developing systems and methodologies to decontaminate spacecraft using vaporized hydrogen peroxide (VHP) technology. The VHP technology provides an effective, rapid and low temperature means for inactivation of spores, mycobacteria, fungi, viruses and other microorganisms. The VHP application is a dry process affording excellent material compatibility with many of the components found in spacecraft such as polymers, paints and electronic systems. Furthermore, the VHP process has innocuous residuals as it decomposes to water vapor and oxygen. This paper will discuss the approach that is being used to develop this technique and will present lethality data that have been collected to establish deep vacuum VHP sterilization cycles. In addition, the application of this technique to meet planetary protection requirements will be addressed.

  10. An Interactive Design Space Supporting Development of Vehicle Architecture Concept Models

    DTIC Science & Technology

    2011-01-01

    Gary Osborne...early in the development cycle. Optimization taking place later in the cycle usually occurs at the detail design level, and tends to result in...architecture changes may be imposed, but such modifications are equivalent to a huge optimization cycle covering almost the entire design process, and

  11. The impact of modifying antenna size of photosystem II on canopy photosynthetic efficiency – development of a new canopy photosynthesis model scaling from metabolism to canopy level processes

    USDA-ARS?s Scientific Manuscript database

    Canopy photosynthesis describes photosynthesis of an entire crop field and positively correlates with biomass production. Much effort in crop breeding has focused on improving canopy architecture and hence light distribution inside the canopy. Here, we develop a new integrated canopy photosynthesis ...

  12. Advances in Polyhydroxyalkanoate (PHA) Production.

    PubMed

    Koller, Martin

    2017-11-02

    This editorial paper provides a synopsis of the contributions to the Bioengineering special issue "Advances in Polyhydroxyalkanoate (PHA) Production". It illustrates the embedding of the issue's individual research articles in the current global research and development landscape related to polyhydroxyalkanoates (PHA). The article shows how these articles are interrelated to each other, reflecting the entire PHA process chain including strain selection, metabolic and genetic considerations, feedstock evaluation, fermentation regimes, process engineering, and polymer processing towards high-value marketable products.

  13. Improving Standoff Bombing Capacity in the Face of Anti-Access Area Denial Threats

    DTIC Science & Technology

    2015-02-01

    potential U.S. adversaries possess or are in the process of acquiring these so-called advanced anti-access/area denial (A2/AD) defense systems, defined in...due to some aircraft being not mission capable due to scheduled and unscheduled maintenance. Air Force is in the process of developing a...Timson, and Christopher Mouton. They were invaluable to me and provided me with incredible support throughout the entire research process. Jim was an

  14. Tetraethylene glycol promoted two-step, one-pot rapid synthesis of indole-3-[1-11C]acetic acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Sojeong; Qu, Wenchao; Alexoff, David L.

    2014-12-12

    An operationally friendly, two-step, one-pot process has been developed for the rapid synthesis of carbon-11 labeled indole-3-acetic acid ([1-11C]IAA or [1-11C]auxin). By replacing an aprotic polar solvent with tetraethylene glycol, nucleophilic [11C]cyanation and alkaline hydrolysis reactions were performed consecutively in a single pot without a time-consuming intermediate purification step. The entire production time for this updated procedure is 55 min, which dramatically simplifies the entire synthesis and reduces the starting radioactivity required for a whole plant imaging study.

  15. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes.

    PubMed

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-04-01

    Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a three-dimensional functional-structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed.
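    The genotype-to-parameter link described here can be made concrete with a small sketch: a model parameter for a given line is a baseline plus main-effect contributions from each QTL plus pairwise epistatic terms. The effect sizes, the -1/+1 genotype coding, and the function name below are illustrative assumptions, not values from QTLNetwork or the paper.

    ```python
    import itertools
    import numpy as np

    def trait_parameter(genotype, baseline, main_effects, epistasis):
        # Predict a model parameter (e.g. an organ extension rate) from QTLs.
        #   genotype:     array of -1/+1 codes, one per QTL locus
        #   main_effects: additive effect of each locus
        #   epistasis:    {(i, j): effect} for interacting locus pairs
        value = baseline + np.dot(main_effects, genotype)
        for (i, j), effect in epistasis.items():
            value += effect * genotype[i] * genotype[j]
        return value

    baseline = 1.0                         # cm/day, illustrative
    main_effects = np.array([0.12, -0.05, 0.08])
    epistasis = {(0, 2): 0.04}             # one epistatic locus pair

    # Evaluate all 2^3 homozygous genotypes of a small mapping population.
    for g in itertools.product([-1, 1], repeat=3):
        value = trait_parameter(np.array(g), baseline, main_effects, epistasis)
        print(g, round(value, 3))
    ```

    In the paper's setup, parameters computed this way feed the FSPM's growth rules, so a change at one locus propagates through the physiological processes to the whole simulated phenotype.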

  16. A functional–structural model of rice linking quantitative genetic information with morphological development and physiological processes

    PubMed Central

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-01-01

    Background and Aims: Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype–phenotype model, we present here a three-dimensional functional–structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. Methods: The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Key Results: Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. Conclusions: We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed. PMID:21247905

  17. Novel technologies for decontamination of fresh and minimally processed fruits and vegetables

    USDA-ARS?s Scientific Manuscript database

    The complex challenges facing producers and processors of fresh produce require creative applications of conventional treatments and innovative approaches to develop entirely novel treatments. The varied nature of fresh and fresh-cut produce demands solutions that are adapted and optimized for each ...

  18. Interactive Videodisc Design and Production Workshop Guide.

    ERIC Educational Resources Information Center

    Campbell, J. Olin; And Others

    This "how to" workshop guide provides an overview of the entire videodisc authoring and production process through six individual modules. Focusing on project planning, the first module provides guidelines, procedures, and job aids to help each instructional development team member effectively use the videodisc medium. The second module…

  19. Educating Teachers for Intercultural Education

    ERIC Educational Resources Information Center

    Ermenc, Klara Skubic

    2015-01-01

    The paper begins with a short overview of the development of intercultural education and proposes a definition of interculturality in education as a pedagogical principle that guides the entire process of planning, implementing, and evaluating education at the systemic, curricular, school, and classroom levels to enable recognition and empowerment…

  20. Development and implementation of an interdisciplinary plan of care.

    PubMed

    Lewis, Cynthia; Hoffmann, Mary Lou; Gard, Angela; Coons, Jacqueline; Bichinich, Pat; Euclid, Jeff

    2005-01-01

    In January 2002 Aurora Health Care Metro Region chartered an interdisciplinary team to develop a process and structure for patient-centered interdisciplinary care planning. This unique endeavor created a process that includes the patient, family, and all clinical disciplines involved in planning and providing care to patients from system point of entry throughout the entire acute care episode. The interdisciplinary plan of care (IPOC) demonstrates the integration of prioritized problems, outcomes, and measurement toward goal attainment. This article focuses on the journey of this team to the successful implementation of an IPOC.

  1. Adaptive Process Controls and Ultrasonics for High Temperature PEM MEA Manufacture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walczyk, Daniel F.

    2015-08-26

    The purpose of this 5-year DOE-sponsored project was to address major process bottlenecks associated with fuel cell manufacturing. New technologies were developed to significantly reduce pressing cycle time for high temperature PEM membrane electrode assembly (MEA) through the use of novel, robust ultrasonic (U/S) bonding processes along with low temperature (<100°C) PEM MEAs. In addition, greater manufacturing uniformity and performance was achieved through (a) an investigation into the causes of excessive variation in ultrasonically and thermally bonded MEAs using more diagnostics applied during the entire fabrication and cell build process, and (b) development of rapid, yet simple quality control measurement techniques for use by industry.

  2. Advances in Polyhydroxyalkanoate (PHA) Production

    PubMed Central

    2017-01-01

    This editorial paper provides a synopsis of the contributions to the Bioengineering special issue “Advances in Polyhydroxyalkanoate (PHA) Production”. It illustrates the embedding of the issue’s individual research articles in the current global research and development landscape related to polyhydroxyalkanoates (PHA). The article shows how these articles are interrelated to each other, reflecting the entire PHA process chain including strain selection, metabolic and genetic considerations, feedstock evaluation, fermentation regimes, process engineering, and polymer processing towards high-value marketable products. PMID:29099065

  3. Mathematics and Experiential Learning--Are They Compatible?

    ERIC Educational Resources Information Center

    Davidovitch, Nitza; Yavich, Roman; Keller, Nelly

    2014-01-01

    In the process of experiential learning, students acquire skills and values as the consequence of a direct experience. Experiential learning draws on senses, emotions, and cognition and appeals to learners' entire being. Such learning, by nature, enables the development of a variety of capabilities, such as planning, teamwork, coping with…

  4. Smart in Everything Except School.

    ERIC Educational Resources Information Center

    Getman, G. N.

    This book focuses on the prevention of academic failure through attention to developmental processes (especially the development of essential visual skills) within the individual learner. A distinction is made between sight and vision, with vision involving the entire person and his/her learning experiences. The first chapter examines "The Dynamics of the…

  5. User-Centered Design of Online Learning Communities

    ERIC Educational Resources Information Center

    Lambropoulos, Niki, Ed.; Zaphiris, Panayiotis, Ed.

    2007-01-01

    User-centered design (UCD) is gaining popularity in both the educational and business sectors. This is due to the fact that UCD sheds light on the entire process of analyzing, planning, designing, developing, using, evaluating, and maintaining computer-based learning. "User-Centered Design of Online Learning Communities" explains how…

  6. Software Acquisition: Evolution, Total Quality Management, and Applications to the Army Tactical Missile System

    DTIC Science & Technology

    1992-06-01

    presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Software Development, Software Acquisition, Total Quality Management (TQM), Army Tactical Missile

  7. 40 CFR 86.1824-01 - Durability demonstration procedures for evaporative emissions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... their full useful life. The manufacturer shall use good engineering judgment in determining this process... actual use over its full useful life. The manufacturer shall use good engineering judgement in developing... hardware and software) must be installed and operating for the entire mileage accumulation period. (ii...

  8. Synthesizing genetic divergence and climate modeling to inform conservation planning for ponderosa pine

    Treesearch

    Kevin M. Potter; Douglas J. Shinneman; Robert E. Means; Valerie D. Hipkins; Mary Frances Mahalovich

    2017-01-01

    Geological, climatological and ecological processes partially or entirely isolate evolutionary lineages within tree species. These lineages may develop adaptations to different local environmental conditions, and may eventually evolve into distinct forms or species. Isolation also can reduce adaptive genetic variation within populations of a species, potentially...

  9. Aerospace Environmental Technology Conference: Executive summary

    NASA Technical Reports Server (NTRS)

    Whitaker, A. F. (Editor)

    1995-01-01

    The mandated elimination of CFCs, Halons, TCA, and other ozone-depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verification; compliant coatings, including corrosion protection systems, and removal techniques; chemical propulsion effects on the environment; and the initiation of modifications to relevant processing and manufacturing specifications and standards. The papers from this conference are being published in a separate volume as NASA CP-3298.

  10. A new look at low-energy nuclear reaction research.

    PubMed

    Krivit, Steven B; Marwan, Jan

    2009-10-01

    This paper presents a new look at low-energy nuclear reaction research, a field that has developed from one of the most controversial subjects in science, "cold fusion." Early in the history of this controversy, beginning in 1989, a strong polarity existed; many scientists fiercely defended the claim of new physical effects as well as a new process in which like-charged atomic nuclei overcome the Coulomb barrier at normal temperatures and pressures. Many other scientists considered the entire collection of physical observations, along with the "cold fusion" hypothesis, to be entirely a mistake. Twenty years later, some people who had dismissed the field in its entirety are considering the validity of at least some of the reported experimental phenomena. In addition, some researchers in the field are wondering whether the underlying phenomena may be not a fusion process but a neutron capture/absorption process. In 2002, a related tabletop form of thermonuclear fusion was discovered in the field of acoustic inertial confinement fusion. We briefly review some of this work as well.

  11. From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

    PubMed

    Scandurra, I; Hägglund, M; Koch, S

    2008-08-01

    This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems, based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited, and a holistic view of the entire care process was obtained. Health informatics and usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs and to transform these directly into requirements specifications. Consequently, the method was perceived to expedite the entire ICT implementation process.

  12. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1996-01-01

    The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.

  13. Examination of the U.S. Air Force's Science, Technology, Engineering, and Mathematics (STEM) Workforce Needs in the Future and Its Strategy to Meet Those Needs

    ERIC Educational Resources Information Center

    National Academies Press, 2010

    2010-01-01

    The Air Force requires technical skills and expertise across the entire range of activities and processes associated with the development, fielding, and employment of air, space, and cyber operational capabilities. The growing complexity of both traditional and emerging missions is placing new demands on education, training, career development,…

  14. 78 FR 32427 - Notice of Issuance of Final Determination Concerning Multifunctional Digital Imaging Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... manufacture different types of Controller units. Ricoh considers the manufacturing of the Controller unit... components and subassemblies of the MFPs from China and the Philippines for manufacture in the U.S. and..., and that the entire engineering, development, design and artwork processes for the MFPs took place in...

  15. Comprehensive Experiment--Clinical Biochemistry: Determination of Blood Glucose and Triglycerides in Normal and Diabetic Rats

    ERIC Educational Resources Information Center

    Jiao, Li; Xiujuan, Shi; Juan, Wang; Song, Jia; Lei, Xu; Guotong, Xu; Lixia, Lu

    2015-01-01

    For second year medical students, we redesigned an original laboratory experiment and developed a combined research-teaching clinical biochemistry experiment. Using an established diabetic rat model to detect blood glucose and triglycerides, the students participate in the entire experimental process, which is not normally experienced during a…

  16. Controlling Curriculum Redesign with a Process Improvement Model

    ERIC Educational Resources Information Center

    Drinka, Dennis; Yen, Minnie Yi-Miin

    2008-01-01

    A portion of the curriculum for a Management Information Systems degree was redesigned to enhance the experiential learning of students by focusing it on a three-semester community-based system development project. The entire curriculum was then redesigned to have a project-centric focus with each course in the curriculum contributing to the…

  17. Space Coatings for Industry

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Ball Aerospace developed entirely new space lubrication technologies. A new family of dry lubricants emerged from Apollo, specifically designed for long life in space, together with processes for applying them to spacecraft components in microscopically thin coatings. The lubricants worked successfully on seven Orbiting Solar Observatory flights over the span of a decade and attracted the attention of other contractors, which became Ball customers. The company has developed several hundred variations of the original OSO technology, generally designed to improve the quality and useful life of a wide range of products or to improve the efficiency of the industrial processes by which such products are manufactured.

  18. Managing distribution changes in time series prediction

    NASA Astrophysics Data System (ADS)

    Matias, J. M.; Gonzalez-Manteiga, W.; Taboada, J.; Ordonez, C.

    2006-07-01

    When a problem is modeled statistically, a single distribution model is usually postulated that is assumed to be valid for the entire space. Nonetheless, this practice may be somewhat unrealistic in certain application areas, in which the conditions of the process that generates the data may change; as far as we are aware, however, no techniques have been developed to tackle this problem. This article proposes a technique for modeling and predicting this change in time series with a view to improving estimates and predictions. The technique is applied, among other models, to the recently proposed hypernormal distribution. When tested on real data from a range of stock market indices, the technique produces better results than when a single distribution model is assumed to be valid for the entire period of time studied. Moreover, when a global model is postulated, it is highly recommended to select the hypernormal distribution parameter in the same likelihood maximization process.
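
    A minimal sketch of the underlying idea, assuming a plain Gaussian as a stand-in for the paper's hypernormal distribution (its exact form is not given here): compare one global maximum-likelihood fit against window-local fits; a large likelihood gain suggests the generating distribution changes over the period studied.

    ```python
    # Minimal sketch: Gaussian stand-in for the hypernormal model.
    import numpy as np

    def gauss_loglik(x, mu, sigma):
        # Gaussian log-likelihood of sample x under N(mu, sigma^2)
        return np.sum(-0.5 * np.log(2.0 * np.pi * sigma**2)
                      - (x - mu) ** 2 / (2.0 * sigma**2))

    def windowed_gain(series, window=100):
        x = np.asarray(series, float)
        x = x[: (len(x) // window) * window]   # trim to whole windows
        global_ll = gauss_loglik(x, x.mean(), x.std())
        local_ll = sum(gauss_loglik(w, w.mean(), w.std())
                       for w in x.reshape(-1, window))
        return local_ll - global_ll            # > 0 favors local fits
    ```

    In practice the extra parameters of the windowed fits would be penalized (e.g., via AIC) before declaring a distribution change.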

  19. Internet-based data warehousing

    NASA Astrophysics Data System (ADS)

    Boreisha, Yurii

    2001-10-01

    In this paper, we consider the process of data warehouse creation and population using the latest Internet and database access technologies. The logical three-tier model is applied. This approach allows development of an enterprise schema by analyzing the various processes in the organization and extracting the relevant entities and relationships from them. Integration with local schemas and population of the data warehouse are done through the corresponding user, business, and data services components. The hierarchy of these components is used to hide the entire complex online analytical processing functionality from the data warehouse users.
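
    A toy sketch of the three-tier layering just described (entity names and data are hypothetical): each tier talks only to the tier below it, so warehouse users never see the OLAP machinery directly.

    ```python
    # Hypothetical three-tier layering: user, business, and data services.
    class DataService:
        """Tier 3: extracts records from the local source schemas."""
        def fetch(self, entity):
            return [{"entity": entity, "value": 1},
                    {"entity": entity, "value": 2}]

    class BusinessService:
        """Tier 2: applies integration and consolidation rules."""
        def __init__(self):
            self.data = DataService()

        def consolidated(self, entity):
            return sum(r["value"] for r in self.data.fetch(entity))

    class UserService:
        """Tier 1: the only interface warehouse users see."""
        def __init__(self):
            self.biz = BusinessService()

        def report(self, entity):
            return f"{entity}: {self.biz.consolidated(entity)}"

    print(UserService().report("sales"))  # -> sales: 3
    ```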

  20. High temperature aircraft research furnace facilities

    NASA Technical Reports Server (NTRS)

    Smith, James E., Jr.; Cashon, John L.

    1992-01-01

    Focus is on the design, fabrication, and development of the High Temperature Aircraft Research Furnace Facilities (HTARFF). The HTARFF was developed to process electrically conductive materials with high melting points in a low gravity environment. The basic principle of operation is to accurately translate a high temperature arc-plasma gas front as it orbits around a cylindrical sample, thereby making it possible to precisely traverse the entire surface of a sample. The furnace utilizes the gas-tungsten-arc-welding (GTAW) process, also commonly referred to as Tungsten-Inert-Gas (TIG). The HTARFF was developed to further research efforts in the areas of directional solidification, float-zone processing, welding in a low-gravity environment, and segregation effects in metals. The furnace is intended for use aboard the NASA-JSC Reduced Gravity Program KC-135A Aircraft.

  1. Do network relationships matter? Comparing network and instream habitat variables to explain densities of juvenile coho salmon (Oncorhynchus kisutch) in mid-coastal Oregon, USA

    Treesearch

    Rebecca L. Flitcroft; Kelly M. Burnett; Gordon H. Reeves; Lisa M. Ganio

    2012-01-01

    Aquatic ecologists are working to develop theory and techniques for analysis of dynamic stream processes and communities of organisms. Such work is critical for the development of conservation plans that are relevant at the scale of entire ecosystems. The stream network is the foundation upon which stream systems are organized. Natural and human disturbances in streams...

  2. From bench to FDA to bedside: US regulatory trends for new stem cell therapies.

    PubMed

    Knoepfler, Paul S

    2015-03-01

    The phrase "bench-to-bedside" is commonly used to describe the translation of basic discoveries such as those on stem cells to the clinic for therapeutic use in human patients. However, there is a key intermediate step in between the bench and the bedside involving governmental regulatory oversight such as by the Food and Drug Administration (FDA) in the United States (US). Thus, it might be more accurate in most cases to describe the stem cell biological drug development process in this way: from bench to FDA to bedside. The intermediate development and regulatory stage for stem cell-based biological drugs is a multifactorial, continually evolving part of the process of developing a biological drug such as a stem cell-based regenerative medicine product. In some situations, stem cell-related products may not be classified as biological drugs in which case the FDA plays a relatively minor role. However, this middle stage is generally a major element of the process and is often colloquially referred to in an ominous way as "The Valley of Death". This moniker seems appropriate because it is at this point, and in particular in the work that ensues after Phase 1, clinical trials that most drug product development is terminated, often due to lack of funding, diseases being refractory to treatment, or regulatory issues. Not surprisingly, workarounds to deal with or entirely avoid this difficult stage of the process are evolving both inside and outside the domains of official regulatory authorities. In some cases these efforts involve the FDA invoking new mechanisms of accelerating the bench to beside process, but in other cases these new pathways bypass the FDA in part or entirely. Together these rapidly changing stem cell product development and regulatory pathways raise many scientific, ethical, and medical questions. These emerging trends and their potential consequences are reviewed here. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Problem Solving Processes and Video Games: The Sim City Creator Case

    ERIC Educational Resources Information Center

    Monjelat, Natalia; Mendez-Zaballos, Laura; Lacasa, Pilar

    2012-01-01

    Introduction: Video games have proven to be a valuable resource for working on different school subjects and topics. Beyond specific content, they could help to develop different abilities, like problem solving. However, little has been studied on this topic, and many of the studies followed a perspective not entirely compatible with an educational…

  4. Predicting bending stiffness of randomly oriented hybrid panels

    Treesearch

    Laura Moya; William T.Y. Tze; Jerrold E. Winandy

    2010-01-01

    This study was conducted to develop a simple model to predict the bending modulus of elasticity (MOE) of randomly oriented hybrid panels. The modeling process involved three modules: the behavior of a single layer was computed by applying micromechanics equations, layer properties were adjusted for densification effects, and the entire panel was modeled as a three-...

  5. Determining the 95% limit of detection for waterborne pathogen analyses from primary concentration to qPCR

    USDA-ARS?s Scientific Manuscript database

    The limit of detection (LOD) for qPCR-based analyses is not consistently defined or determined in studies on waterborne pathogens. Moreover, the LODs reported often reflect the qPCR assay rather than the entire sample process. Our objective was to develop a method to determine the 95% LOD (lowest co...
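
    One common way to make an entire-process 95% LOD concrete (an illustrative approach, not necessarily the authors' exact procedure, since the abstract is truncated) is to fit a logistic detection curve to detect/non-detect outcomes from replicate spiked samples carried through the full workflow, then invert the curve at a detection probability of 0.95.

    ```python
    # Illustrative 95% LOD estimate via logistic regression on
    # detect/non-detect calls; concentrations are on a log10 scale.
    import numpy as np
    from scipy.optimize import minimize

    def fit_lod(log10_conc, detected, p=0.95):
        x = np.asarray(log10_conc, float)
        y = np.asarray(detected, float)

        def nll(params):  # Bernoulli negative log-likelihood, stable form
            a, b = params
            z = a + b * x
            return np.sum(np.logaddexp(0.0, z) - y * z)

        a, b = minimize(nll, x0=[0.0, 1.0]).x
        logit_p = np.log(p / (1.0 - p))   # logit(p) = a + b*log10(LOD)
        return 10 ** ((logit_p - a) / b)

    # Hypothetical replicate calls at 10-fold dilutions (1..1000 copies):
    lod95 = fit_lod([0, 0, 1, 1, 2, 2, 3, 3], [0, 0, 0, 1, 1, 1, 1, 1])
    ```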

  6. A Resource Guide for the Maryland Plan's Group Project and Line Production.

    ERIC Educational Resources Information Center

    Day, Gerald F.

    This guide was developed for teachers who are using the Maryland Plan's group processes--the group project and line production methods. The guide is divided into four sections. The first section is an overview of the entire Maryland Plan. It describes the program which provides high school industrial arts students, from seventh grade through ninth…

  7. A guide to LIDAR data acquisition and processing for the forests of the Pacific Northwest.

    Treesearch

    Demetrios Gatziolis; Hans-Erik Andersen

    2008-01-01

    Light detection and ranging (LIDAR) is an emerging remote-sensing technology with promising potential to assist in mapping, monitoring, and assessment of forest resources. Continuous technological advancement and substantial reductions in data acquisition cost have enabled acquisition of laser data over entire states and regions. These developments have triggered an...

  8. [Design of medical devices management system supporting full life-cycle process management].

    PubMed

    Su, Peng; Zhong, Jianping

    2014-03-01

    Based on an analysis of the present status of medical devices management, this paper optimized the management process and developed a medical devices management system with Web technologies. Information technology is used to dynamically track the state of use of medical devices over their entire life cycle. Through closed-loop management, with pre-event budgeting, mid-event control, and after-event analysis, the system improved the fine-grained management of medical devices, optimized asset allocation, and promoted the effective operation of devices.

  9. Turbofan forced mixer-nozzle internal flowfield. Volume 2: Computational fluid dynamic predictions

    NASA Technical Reports Server (NTRS)

    Werle, M. J.; Vasta, V. N.

    1982-01-01

    A general program was conducted to develop and assess a computational method for predicting the flow properties in a turbofan forced mixer duct. A detailed assessment of the resulting computer code is presented. It was found that the code provided excellent predictions of the kinematics of the mixing process throughout the entire length of the mixer nozzle. The thermal mixing process between the hot core and cold fan flows was found to be well represented in the low-speed portion of the flowfield.

  10. Ballistic Signature Identification System Study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The first phase of a research project directed toward development of a high speed automatic process to be used to match gun barrel signatures imparted to fired bullets was documented. An optical projection technique has been devised to produce and photograph a planar image of the entire signature, and the phototransparency produced is subjected to analysis using digital Fourier transform techniques. The success of this approach appears to be limited primarily by the accuracy of the photographic step since no significant processing limitations have been encountered.

  11. Rapid Processing of Radio Interferometer Data for Transient Surveys

    NASA Astrophysics Data System (ADS)

    Bourke, S.; Mooley, K.; Hallinan, G.

    2014-05-01

    We report on a software infrastructure and pipeline developed to process large radio interferometer datasets. The pipeline is implemented using a radical redesign of the AIPS processing model. An infrastructure we have named AIPSlite is used to spawn, at runtime, minimal AIPS environments across a cluster. The pipeline then distributes and processes its data in parallel. The system is entirely free of the traditional AIPS distribution and is self-configuring at runtime. This software has so far been used to process an EVLA Stripe 82 transient survey and the data for the JVLA-COSMOS project, and has been used to process most of the EVLA L-Band data archive, imaging each integration to search for short-duration transients.
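
    The parallel pattern is essentially map-style distribution of independent imaging jobs across workers. A minimal stand-in using Python's multiprocessing (the real pipeline spawns AIPSlite environments, whose API is not reproduced here):

    ```python
    # Map-style distribution of independent imaging jobs; multiprocessing
    # stands in for AIPSlite's runtime-spawned per-node AIPS environments.
    from multiprocessing import Pool

    def image_integration(integration_id):
        # Placeholder for: spawn a minimal AIPS environment on a worker,
        # calibrate and image one integration, return transient candidates.
        return integration_id, []

    if __name__ == "__main__":
        with Pool(processes=8) as pool:
            results = pool.map(image_integration, range(1000))
    ```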

  12. Safeguards Approaches for Black Box Processes or Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz-Marcano, Helly; Gitau, Ernest TN; Hockert, John

    2013-09-25

    The objective of this study is to determine whether a safeguards approach can be developed for “black box” processes or facilities. These are facilities where a State or operator may limit IAEA access to specific processes or portions of a facility; in other cases, the IAEA may be prohibited access to the entire facility. The determination of whether a black box process or facility is safeguardable is dependent upon the details of the process type, design, and layout; the specific limitations on inspector access; and the restrictions placed upon the design information that can be provided to the IAEA. This analysis identified the necessary conditions for safeguardability of black box processes and facilities.

  13. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  14. Transcriptome analysis on the exoskeleton formation in early developmental stages and reconstruction scenario in growth-moulting in Litopenaeus vannamei.

    PubMed

    Gao, Yi; Wei, Jiankai; Yuan, Jianbo; Zhang, Xiaojun; Li, Fuhua; Xiang, Jianhai

    2017-04-24

    Exoskeleton construction is an important issue in shrimp. To better understand the molecular mechanisms of exoskeleton formation, development and reconstruction, the transcriptome of the entire developmental process in Litopenaeus vannamei, including nine early developmental stages and eight adult-moulting stages, was sequenced and analysed using Illumina RNA-seq technology. A total of 117,539 unigenes were obtained, with 41.2% of unigenes predicted to contain the full-length coding sequence. Gene Ontology, Clusters of Orthologous Groups (COG) and Kyoto Encyclopedia of Genes and Genomes (KEGG) analysis and functional annotation of all unigenes gave a better understanding of the exoskeleton developmental process in L. vannamei. As a result, more than six hundred unigenes related to exoskeleton development were identified both in the early developmental stages and in adult-moulting. A cascade of sequential expression events of exoskeleton-related genes was summarized, including exoskeleton formation, regulation, synthesis, degradation, mineral absorption/reabsorption, calcification and hardening. This new insight into major transcriptional events provides a deeper understanding of exoskeleton formation and reconstruction in L. vannamei. In conclusion, this is the first study to characterize integrated transcriptomic profiles covering the entire exoskeleton development from zygote to adult-moulting in a crustacean, and these findings will serve as significant references for exoskeleton developmental biology and aquaculture research.

  15. LANDSAT information for state planning

    NASA Technical Reports Server (NTRS)

    Faust, N. L.; Spann, G. W.

    1977-01-01

    The transfer of remote sensing technology for the digital processing of LANDSAT data to state and local agencies in Georgia and other southeastern states is discussed. The project consists of a series of workshops, seminars, and demonstration efforts, and the transfer of NASA-developed hardware concepts and computer software to state agencies. Throughout the multi-year effort, digital processing techniques, including classification algorithms, have been emphasized. Software for LANDSAT data rectification and processing has been developed and/or transferred. A hardware system is available at EES (engineering experiment station) to allow user-interactive processing of LANDSAT data. Seminars and workshops emphasize the digital approach to LANDSAT data utilization and the system improvements scheduled for LANDSATs C and D. Results of the project indicate a substantially increased awareness of the utility of digital LANDSAT processing techniques among the agencies contacted throughout the southeast. In Georgia, several agencies have jointly funded a program to map the entire state using digitally processed LANDSAT data.

  16. Efficient material decomposition method for dual-energy X-ray cargo inspection system

    NASA Astrophysics Data System (ADS)

    Lee, Donghyeon; Lee, Jiseoc; Min, Jonghwan; Lee, Byungcheol; Lee, Byeongno; Oh, Kyungmin; Kim, Jaehyun; Cho, Seungryong

    2018-03-01

    Dual-energy X-ray inspection systems are widely used today because they provide both X-ray attenuation contrast of the imaged object and its material information. Material decomposition capability allows a higher detection sensitivity for potential targets, for example purposely loaded impurities in agricultural product inspections and threats in security scans. Dual-energy X-ray transmission data can be transformed into two basis material thickness data, and the transformation accuracy relies heavily on the calibration of the material decomposition process. The calibration process in general can be laborious and time consuming. Moreover, a conventional calibration method is often challenged by the nonuniform spectral characteristics of the X-ray beam across the entire field-of-view (FOV). In this work, we developed an efficient material decomposition calibration process for a linear accelerator (LINAC) based high-energy X-ray cargo inspection system. We also proposed a multi-spot calibration method to improve the decomposition performance throughout the entire FOV. Experimental validation of the proposed method has been demonstrated using a cargo inspection system that supports 6 MV and 9 MV dual-energy imaging.
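
    At its simplest, two-basis decomposition solves a 2x2 linear system per pixel relating the high- and low-energy log attenuations to two basis-material thicknesses. A hedged sketch (the coefficients below are hypothetical, not the paper's calibration values):

    ```python
    # Minimal two-basis decomposition; the paper's multi-spot calibration
    # effectively refits these coefficients at several positions across
    # the FOV to handle the beam's nonuniform spectrum.
    import numpy as np

    MU = np.array([[0.045, 0.310],   # 9 MV beam: [mu_basis1, mu_basis2] (1/cm)
                   [0.052, 0.420]])  # 6 MV beam (values hypothetical)

    def decompose(i_hi, i_lo, i0_hi=1.0, i0_lo=1.0):
        rhs = np.array([-np.log(i_hi / i0_hi), -np.log(i_lo / i0_lo)])
        return np.linalg.solve(MU, rhs)  # thicknesses [t1, t2] in cm

    t1, t2 = decompose(i_hi=0.62, i_lo=0.55)
    ```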

  17. Clinical genomics information management software linking cancer genome sequence and clinical decisions.

    PubMed

    Watt, Stuart; Jiao, Wei; Brown, Andrew M K; Petrocelli, Teresa; Tran, Ben; Zhang, Tong; McPherson, John D; Kamel-Reid, Suzanne; Bedard, Philippe L; Onetto, Nicole; Hudson, Thomas J; Dancey, Janet; Siu, Lillian L; Stein, Lincoln; Ferretti, Vincent

    2013-09-01

    Using sequencing information to guide clinical decision-making requires coordination of a diverse set of people and activities. In clinical genomics, the process typically includes sample acquisition, template preparation, genome data generation, analysis to identify and confirm variant alleles, interpretation of clinical significance, and reporting to clinicians. We describe a software application developed within a clinical genomics study, to support this entire process. The software application tracks patients, samples, genomic results, decisions and reports across the cohort, monitors progress and sends reminders, and works alongside an electronic data capture system for the trial's clinical and genomic data. It incorporates systems to read, store, analyze and consolidate sequencing results from multiple technologies, and provides a curated knowledge base of tumor mutation frequency (from the COSMIC database) annotated with clinical significance and drug sensitivity to generate reports for clinicians. By supporting the entire process, the application provides deep support for clinical decision making, enabling the generation of relevant guidance in reports for verification by an expert panel prior to forwarding to the treating physician. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. The Developing Purposes of Low-Income College Students in China's Elite Universities: A Longitudinal Case Study of How Socioeconomic Background and University Culture Interact to Influence the Development and Realization of Low Income College Students' Goals

    ERIC Educational Resources Information Center

    Zhao, Wanxia

    2014-01-01

    In the process of China's transformation from a socialist to a post-socialist society, China's entire system of education has experienced breathtaking expansion and reform. In this context, first-tier universities increasingly accept students from more financially well off backgrounds. While second-tier universities are inclined to accept more…

  19. An investigation into creative design methodologies for textiles and fashion

    NASA Astrophysics Data System (ADS)

    Gault, Alison

    2017-10-01

    Understanding market intelligence, trends, influences, and personal approaches is essential for design students developing their ideas in textiles and fashion. Identifying different personal approaches, including visual, process-led, or concept-driven, by employing creative methodologies is key to developing a brief. A series of ideas or themes starts to emerge and, through the design process, serves to underpin and inform an entire collection. These investigations ensure that the design collections are able to produce a diverse range of outcomes. Following key structures and coherent stages in the design process creates authentic collections in textiles and fashion. A range of undergraduate students presented their design portfolios (180), and the methodologies employed were mapped against success at module level, industry response, and graduate employment.

  20. Development of an imaging system for the detection of alumina on turbine blades

    NASA Astrophysics Data System (ADS)

    Greenwell, S. J.; Kell, J.; Day, J. C. C.

    2014-03-01

    An imaging system capable of detecting alumina on turbine blades by acquiring LED-induced fluorescence images has been developed. Acquiring fluorescence images at adjacent spectral bands allows the system to distinguish alumina from fluorescent surface contaminants. Repair and overhaul processes require that alumina is entirely removed from the blades by grit blasting and chemical stripping. The capability of the system to detect alumina has been investigated with two series of turbine blades provided by Rolls-Royce plc. The results illustrate that the system provides a superior inspection method to visual assessment when ascertaining whether alumina is present on turbine blades during repair and overhaul processes.
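
    The adjacent-band idea can be made concrete with a simple band-ratio classifier (an illustrative sketch, not the authors' algorithm; the threshold is hypothetical): alumina's narrow fluorescence line is bright in the on-band image but weak in the adjacent off-band image, while broadband contaminants fluoresce comparably in both.

    ```python
    # Band-ratio discrimination sketch for narrow-line fluorescence.
    import numpy as np

    def alumina_mask(img_on_band, img_off_band, ratio_threshold=2.0):
        """Return a boolean mask of pixels likely to contain alumina."""
        off = np.maximum(img_off_band.astype(float), 1e-6)  # avoid /0
        return img_on_band.astype(float) / off > ratio_threshold
    ```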

  1. Application of advanced multidisciplinary analysis and optimization methods to vehicle design synthesis

    NASA Technical Reports Server (NTRS)

    Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.

  2. The Need for V&V in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to entire' domain or product line rather than a critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. engineering. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for activities.

  3. The complex leaves of the monkey's comb (Amphilophium crucigerum, Bignoniaceae): a climbing strategy without glue.

    PubMed

    Seidelmann, Katrin; Melzer, Björn; Speck, Thomas

    2012-11-01

    Monkey's comb (Amphilophium crucigerum) is a widely spread neotropical leaf climber that develops attachment pads for anchorage. A single complex leaf of the species comprises a basal pair of foliate, assimilating leaflets and apical, attaching leaflet tendrils. This study aims to analyze these leaves and their ontogenetic development for a better understanding of the attachment process, the form-structure-function relationships involved, and the overall maturation of the leaves. Thorough morphometrical, morphological, and anatomical analyses incorporated high-resolution microscopy, various staining techniques, SEM, and photographic recordings over the entire ontogenetic course of leaf development. The foliate, assimilating leaflets and the anchorage of the more apical leaflet tendrils acted independently of each other. Attachment was achieved by coiling of the leaflet tendrils and/or development of attachment pads at the tendril apices that grow opportunistically into gaps and fissures of the substrate. In contact zones with the substrate, the cells of the pads differentiate into a vessel element-like tissue. During the entire attachment process of the plant, no glue was excreted. The complex leaves of monkey's comb are highly differentiated organs with specialized leaf parts whose functions, photosynthesis or attachment, work independently of each other. The function of attachment includes the coiling and maturation process of the leaflet tendrils and the formation of attachment pads, resulting in a biomechanically sound and persistent anchorage of the plant without the need for glue excretion. This kind of glue-less attachment is of interest not only in the framework of analyzing the functional variety of attachment structures evolved in climbing plants, but also for the development of innovative biomimetic attachment structures for manifold technical applications.

  4. Review of the Application of Green Building and Energy Saving Technology

    NASA Astrophysics Data System (ADS)

    Tong, Zhineng

    2017-12-01

    The use of energy-saving technologies in green buildings should run through the entire process of building design, construction, and use, enabling green energy-saving technologies to achieve maximum effectiveness in construction. The aims are to realize the sustainable development of green building, reduce energy consumption, reduce human interference with the natural environment, and make "green" buildings suitable for people to live in.

  5. Responding to the Concerns of Student Cultural Groups: Redesigning Spaces for Cultural Centers

    ERIC Educational Resources Information Center

    McDowell, Anise Mazone; Higbee, Jeanne L.

    2014-01-01

    This paper describes the engagement of a student committee in redesigning an entire floor of a university union to accommodate student cultural centers and provide space in a fair and equitable manner. The reorganization focused on the process as well as the task of allocating space, with an emphasis on the opportunity to foster the development of…

  6. Designing and Proposing Your Research Project. Concise Guides to Conducting Behavioral, Health, and Social Science Research Series

    ERIC Educational Resources Information Center

    Urban, Jennifer Brown; van Eeden-Moorefield, Bradley Matheus

    2017-01-01

    Designing your own study and writing your research proposal takes time, often more so than conducting the study. This practical, accessible guide walks you through the entire process. You will learn to identify and narrow your research topic, develop your research question, design your study, and choose appropriate sampling and measurement…

  7. Bayridge Secondary School: A Case Study of the Planning and Implementation of Educational Change.

    ERIC Educational Resources Information Center

    Eastabrook, Glen; And Others

    This is an account of the planning and implementation processes of a new secondary school (Bayridge Secondary School), located in a suburban area of a medium-sized city in Ontario, Canada. This report traces the planning and development of the school's goals, which included involvement of the entire school community, from 1970 through 1974. The…

  8. Following the Yellow Brick Road to Simplified Link Management

    ERIC Educational Resources Information Center

    Engard, Nicole C.

    2005-01-01

    Jenkins Law Library is the oldest law library in America, and has a reputation for offering great content not only to local attorneys, but also to the entire legal research community. In this article, the author, who is Web manager at Jenkins, describes the development of an automated process by which research links can be added to the database so…

  9. Theoretical analysis of the axial growth of nanowires starting with a binary eutectic droplet via vapor-liquid-solid mechanism

    NASA Astrophysics Data System (ADS)

    Liu, Qing; Li, Hejun; Zhang, Yulei; Zhao, Zhigang

    2018-06-01

    A series of theoretical analyses is carried out for the axial vapor-liquid-solid (VLS) growth of nanowires starting with a binary eutectic droplet. The growth model, which considers the entire process of axial VLS growth, is a development of approaches established in previous studies. In this model, both steady- and unsteady-state growth are considered. The amount of solute species in a variable liquid droplet, the nanowire length, radius, growth rate, and all other parameters during the entire axial growth process are treated as functions of growth time. The model provides theoretical predictions for the formation of the nanowire shape and for the length-radius and growth rate-radius dependences. The model also suggests that the initial growth of a single nanowire is significantly affected by the Gibbs-Thomson effect due to the shape change. The model was applied to available experimental data for Si and Ge nanowires grown from Au-Si and Au-Ge systems, respectively, reported by other works. The calculations with the proposed model are in satisfactory agreement with the experimental results of the previous works.
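
    For orientation, the textbook form of the Gibbs-Thomson correction that such models build on (an illustrative standard form; the paper's exact equations are not reproduced here):

    ```latex
    % Illustrative Gibbs-Thomson correction for VLS growth (notation assumed)
    \Delta\mu(r) = \Delta\mu_\infty - \frac{2\gamma\Omega}{r},
    \qquad
    \frac{dL}{dt} \propto \Delta\mu(r)
    ```

    where Δμ∞ is the bulk supersaturation, γ the surface energy, Ω the atomic volume, and r the wire radius; thinner wires see a reduced driving force, which is why the initial growth stage is radius-sensitive.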

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, E.C.; Killough, S.M.; Rowe, J.C.

    The purpose of the Smart Crane Ammunition Transfer System (SCATS) project is to demonstrate robotic/telerobotic controls technology for a mobile articulated crane for missile/munitions handling, delivery, and reload. Missile resupply and reload have been manually intensive operations up to this time. Currently, reload missiles are delivered by truck to the site of the launcher. A crew of four to five personnel reloads the missiles from the truck to the launcher using a hydraulic-powered crane. The missiles are handled carefully for the safety of the missiles and personnel. Numerous steps are required in the reload process and the entire reload operation can take over 1 h for some missile systems. Recent U.S. Army directives require the entire operation to be accomplished in a fraction of that time. Current requirements for the development of SCATS are being based primarily on reloading Patriot missiles. The planned development approach will integrate robotic control and sensor technology with a commercially available hydraulic articulated crane. SCATS is being developed with commercially available hardware as much as possible. Development plans include adding a 3-D.F. end effector with a grapple to the articulating crane; closed-loop position control for the crane and end effector; digital microprocessor control of crane functions; simplified operator interface; and operating modes which include rectilinear movement, obstacle avoidance, and partial automated operation. The planned development will include progressive technology demonstrations. Ultimate plans are for this technology to be transferred and utilized in the military fielding process.

  11. Thermal design of the space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Bachrtel, F. D.; Vaniman, J. L.; Stuckey, J. M.; Gray, C.; Widofsky, B.

    1985-01-01

    The shuttle external tank thermal design presents many challenges in meeting the stringent requirements established by the structures, main propulsion systems, and Orbiter elements. The selected thermal protection design had to meet these requirements and provide ease of application and suitability for mass production, considering low weight, cost, and high reliability. This development led to a spray-on foam insulation (SOFI) which covers the entire tank. The need and design for a SOFI material with a dual role of cryogenic insulation and ablator, and the development of the SOFI-over-SLA concept for high-heating areas, are discussed. Further issues of minimum surface ice/frost, no debris, and the development of the TPS spray process, considering the required quality and process control, are examined.

  12. Don’t fence us in

    USGS Publications Warehouse

    Oliver, J.

    1991-01-01

    When I was a graduate student around 1950, I used to read the entire Bulletin of the Seismological Society of America. It was a powerful and inspiring educational experience, with an effect quite different from that of the more usual process of looking up a few articles in the chain of references on a subject of current interest. Reading the entire journal reveals how ideas, techniques, and seismologists appear and evolve. It is likely the best substitute for a firsthand personal experience with the early development of the field. And in spite of, or perhaps because of, the missteps, the wasted effort, and the lack of sophistication that those first volumes reveal, the reader can sense the opportunity and be inspired by the vibrancy of the young subject.

  13. The intricate mechanisms of neurodegeneration in prion diseases

    PubMed Central

    Soto, Claudio; Satani, Nikunj

    2010-01-01

    Prion diseases are a group of infectious neurodegenerative diseases with an entirely novel mechanism of transmission, involving a protein-only infectious agent that propagates the disease by transmitting protein conformational changes. The disease results from extensive and progressive brain degeneration. The molecular mechanisms involved in neurodegeneration are not entirely known but involve multiple processes operating simultaneously and synergistically in the brain, including spongiform degeneration, synaptic alterations, brain inflammation, neuronal death and the accumulation of protein aggregates. Here, we review the pathways implicated in prion-induced brain damage and put the pieces together into a possible model of neurodegeneration in prion disorders. A more comprehensive understanding of the molecular basis of brain degeneration is essential to develop a much needed therapy for these devastating diseases. PMID:20889378

  14. Writing instrument interfaces with xf/tktcl

    NASA Technical Reports Server (NTRS)

    Henden, A. A.

    1992-01-01

    Tcl is an embedded control language written in C, running primarily under Unix and with an interpreted C look-and-feel. Tk is an X11 toolkit based on tcl. Xf is an application builder for tk. The entire package is public domain and available from sprite.berkeley.edu. This paper discusses the use of tk to develop a user interface for OSIRIS, an infrared camera/spectrograph now operational on the OSU Perkins 1.8m telescope. The good and bad features of the development process are described.
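
    For flavor, a minimal Tk panel in Python's Tkinter, which binds the same Tk toolkit the paper drives through tcl and xf (the widget names and action are hypothetical, not OSIRIS's actual interface):

    ```python
    # Minimal Tk-style instrument panel via Tkinter (illustrative only).
    import tkinter as tk

    root = tk.Tk()
    root.title("Instrument panel (illustrative)")
    status = tk.StringVar(value="idle")
    tk.Label(root, textvariable=status).pack(padx=10, pady=5)
    tk.Button(root, text="Expose",
              command=lambda: status.set("exposing...")).pack(padx=10, pady=5)
    root.mainloop()
    ```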

  15. Work unit compensation.

    PubMed

    Sodano, M J

    1991-01-01

    The author describes an innovative "work unit compensation" system that acts as an adjunct to existing personnel payment structures. The process, developed as a win-win alternative for both employees and their institution, includes a reward system for the entire department and ensures a team atmosphere. The Community Medical Center in Toms River, New Jersey developed the plan, which sets four basic goals: to be fair, economical, lasting, and transferable (FELT). The plan has proven to be a useful tool in the retention and recruitment of qualified personnel.

  16. A CAD approach to magnetic bearing design

    NASA Technical Reports Server (NTRS)

    Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.

    1988-01-01

    A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.

  17. The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eden, H.F.; Mooers, C.N.K.

    1990-06-01

    The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.
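
    As a reference point for the observation-model coupling, one of the simplest data-assimilation schemes is Newtonian relaxation ("nudging"); this is shown only to illustrate the idea, not as the program's specified method:

    ```latex
    % Nudging: relax the model state u toward observations with timescale tau
    \frac{\partial u}{\partial t} = F(u) + \frac{u_{\mathrm{obs}} - u}{\tau}
    ```

    where F(u) is the model dynamics, u_obs the observed field, and τ the relaxation timescale.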

  18. Dosage compensation, the origin and the afterlife of sex chromosomes.

    PubMed

    Larsson, Jan; Meller, Victoria H

    2006-01-01

    Over the past 100 years, Drosophila has been developed into an outstanding model system for the study of evolutionary processes. A fascinating aspect of evolution is the differentiation of sex chromosomes. Organisms with highly differentiated sex chromosomes, such as the mammalian X and Y, must compensate for the imbalance in gene dosage that this creates. The need to adjust the expression of sex-linked genes is a potent force driving the rise of regulatory mechanisms that act on an entire chromosome. This review will contrast the process of dosage compensation in Drosophila with the divergent strategies adopted by other model organisms. While the machinery of sex chromosome compensation is different in each instance, all share the ability to direct chromatin modifications to an entire chromosome. This review will also explore the idea that chromosome-targeting systems are sometimes adapted for other purposes. This appears to be the likely source of a chromosome-wide targeting system displayed by the Drosophila fourth chromosome.

  19. A combined electron beam/optical lithography process step for the fabrication of sub-half-micron-gate-length MMIC chips

    NASA Technical Reports Server (NTRS)

    Sewell, James S.; Bozada, Christopher A.

    1994-01-01

    Advanced radar and communication systems rely heavily on state-of-the-art microelectronics. Systems such as the phased-array radar require many transmit/receive (T/R) modules which are made up of many millimeter-wave/microwave integrated circuits (MMICs). The heart of an MMIC chip is the Gallium Arsenide (GaAs) field-effect transistor (FET). The transistor gate length is the critical feature that determines the operating frequency of the radar system. A smaller gate length will typically result in a higher frequency. In order to make a phased-array radar system economically feasible, manufacturers must be capable of producing very large quantities of small-gate-length MMIC chips at a relatively low cost per chip. This requires the processing of a large number of wafers with a large number of chips per wafer, minimum processing time, and a very high chip yield. One of the bottlenecks in the fabrication of MMIC chips is the transistor gate definition. The definition of sub-half-micron gates for GaAs-based field-effect transistors is generally performed by direct-write electron beam lithography (EBL). Because of the throughput limitations of EBL, the gate-layer fabrication is conventionally divided into two lithographic processes where EBL is used to generate the gate fingers and optical lithography is used to generate the large-area gate pads and interconnects. As a result, two complete sequences of resist application, exposure, development, metallization and lift-off are required for the entire gate structure. We have baselined a hybrid process, referred to as EBOL (electron beam/optical lithography), in which a single application of a multi-level resist is used for both exposures. The entire gate structure (gate fingers, interconnects, and pads) is then formed with a single metallization and lift-off process. The EBOL process thus retains the advantages of the high-resolution E-beam lithography and the high throughput of optical lithography while essentially eliminating an entire lithography/metallization/lift-off process sequence. This technique has been proven to be reliable for both trapezoidal and mushroom gates and has been successfully applied to metal-semiconductor and high-electron-mobility field-effect transistor (MESFET and HEMT) wafers containing devices with gate lengths down to 0.10 micron and 75 x 75 micron gate pads. The yields and throughput of these wafers have been very high with no loss in device performance. We will discuss the entire EBOL process technology including the multilayer resist structure, exposure conditions, process sensitivities, metal edge definition, device results, comparison to the standard gate-layer process, and its suitability for manufacturing.

  1. Monte Carlo simulation of efficient data acquisition for an entire-body PET scanner

    NASA Astrophysics Data System (ADS)

    Isnaini, Ismet; Obi, Takashi; Yoshida, Eiji; Yamaya, Taiga

    2014-07-01

    Conventional PET scanners can image the whole body using many bed positions. An entire-body PET scanner with an extended axial FOV, which can trace whole-body uptake at the same time and dynamically improve sensitivity, has therefore been desired. The entire-body PET scanner would have to process a large amount of data effectively; as a result, it has high dead time at the multiplex detector grouping process. The entire-body PET scanner also has many oblique lines of response. In this work, we study efficient data acquisition for the entire-body PET scanner using Monte Carlo simulation. The simulated entire-body PET scanner, based on depth-of-interaction detectors, has a 2016-mm axial field-of-view (FOV) and an 80-cm ring diameter. Since the entire-body PET scanner has higher single data loss than a conventional PET scanner at the grouping circuits, its NECR decreases. However, single data loss is mitigated by separating the axially arranged detectors into multiple parts. Our choice of 3 groups of axially arranged detectors was shown to increase the peak NECR by 41%. An appropriate choice of maximum ring difference (MRD) will also maintain high sensitivity and high peak NECR while reducing the data size. The extremely oblique lines of response for the large axial FOV do not contribute much to the performance of the scanner: the total sensitivity with full MRD was only 15% higher than that with about half MRD, and the peak NECR saturated at about half MRD. The entire-body PET scanner promises to provide a large axial FOV and to have sufficient performance without using the full data.
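
    The data-size effect of capping the MRD can be illustrated with a purely geometric count of ring pairs (the ring count below is hypothetical for a 2016-mm axial FOV, and per-LOR sensitivity weighting is ignored):

    ```python
    # Fraction of ring-pair combinations kept under an MRD cap (geometric
    # sketch only; not the paper's sensitivity or NECR calculation).
    n_rings = 504  # hypothetical

    def kept_fraction(mrd, n=n_rings):
        kept = sum(1 for i in range(n) for j in range(n) if abs(i - j) <= mrd)
        return kept / (n * n)

    print(kept_fraction(mrd=n_rings // 2))  # the "about half MRD" case
    ```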

  2. Reengineering outcomes management: an integrated approach to managing data, systems, and processes.

    PubMed

    Neuman, K; Malloch, K; Ruetten, V

    1999-01-01

    The integration of outcomes management into organizational reengineering projects is often overlooked or marginalized in proportion to the entire project. Incorporation of an integrated outcomes management program strengthens the overall quality of reengineering projects and enhances their sustainability. This article presents a case study in which data, systems, and processes were reengineered to form an effective Outcomes Management program as a component of the organization's overall project. The authors describe eight steps to develop and monitor an integrated outcomes management program. An example of an integrated report format is included.

  3. Cyber-Informed Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert S.; Benjamin, Jacob; Wright, Virginia L.

    A continuing challenge for engineers who utilize digital systems is to understand the impact of cyber-attacks across the entire product and program lifecycle. This is a challenge due to the evolving nature of cyber threats, which may impact the design, development, deployment, and operational phases of all systems. Cyber-Informed Engineering is the process by which engineers are made aware of how to use their engineering knowledge to improve cyber security, both in the processes by which they architect and design components and in the services and security of the components themselves.

  4. JSOU and NDIA SO/LIC Division Essays (2007)

    DTIC Science & Technology

    2007-04-01

    Create several content-rich Darknet environments—a private virtual network where users connect only to people they trust—that offer e-mail, file...chat rooms, and Darknets). Moon: Cyber-Herding [figure labels: Cyber-Herding Nodes and Relationship Network; Gatherer; Construction; Demolition; Structure of Cyber-Herding] ...the extremist messages, concentrating Web sites, and developing Darknets. A visual illustration of the entire process follows Phase 7. Phase 5

  5. Visual Purple, the Next Generation Crisis Management Decision Training Tool

    DTIC Science & Technology

    2001-09-01

    talents of professional Hollywood screenwriters during the scripting and writing process of the simulations. Additionally, cinematic techniques learned...cultural, and language experts for research development. Additionally, GTA provides country-specific support in script writing and cinematic resources as...The result is an entirely new dimension of realism that traditional exercises often fail to capture. The scenario requires the participant to make the

  6. Guidance and Control Software,

    DTIC Science & Technology

    1980-05-01

    commitments of function, cost, and schedule. The phrase "software engineering" was intended to contrast with the phrase "computer science"; the latter aims...the software problems of cost, delivery schedule, and quality were gradually being recognized at the highest management levels. Thus, in a project... schedule dates. Although the analysis of software problems indicated that the entire software development process (figure 1) needed new methods, only

  7. The complete process of large elastic-plastic deflection of a cantilever

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoqiang; Yu, Tongxi

    1986-11-01

    An extension of the Elastica theory is developed to study the large deflection of an elastic-perfectly plastic horizontal cantilever beam subjected to a vertical concentrated force at its tip. The entire process is divided into four stages: I. elastic deformation in the whole cantilever; II. loading and development of the plastic region; III. unloading in the plastic region; and IV. reverse loading. Solutions for stages I and II are presented in closed form. A combination of closed-form solution and numerical integration is presented for stage III. Finally, stage IV is studied qualitatively. Computed results are given and compared with those from small-deflection theory and from the Elastica theory.
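
    As a rough illustration of the stage I (fully elastic) problem, the sketch below integrates the inextensible-elastica boundary-value problem for a tip-loaded cantilever and compares the tip deflection with the small-deflection result PL^3/3EI. It is a generic elastica computation under assumed parameters, not the authors' closed-form solution.

        # Elastica BVP for an elastic cantilever with a vertical tip load P:
        #   EI * dtheta/ds = P * (x_tip - x(s))  =>  EI * theta'' = -P * cos(theta)
        # with theta(0) = 0 (clamped base) and theta'(L) = 0 (no moment at tip).
        # Generic sketch with assumed parameters, not the paper's solution.
        import numpy as np
        from scipy.integrate import solve_bvp

        E, I, L, P = 200e9, 1e-10, 1.0, 15.0  # assumed: EI = 20 N m^2, 1 m beam, 15 N

        def rhs(s, y):
            theta, dtheta = y
            return np.vstack([dtheta, -(P / (E * I)) * np.cos(theta)])

        def bc(y0, yL):
            return np.array([y0[0], yL[1]])  # theta(0) = 0, theta'(L) = 0

        s = np.linspace(0.0, L, 200)
        sol = solve_bvp(rhs, bc, s, np.zeros((2, s.size)))
        theta = sol.sol(s)[0]

        # Tip deflection: integrate sin(theta) along the arc length.
        delta_large = np.trapz(np.sin(theta), s)
        delta_small = P * L**3 / (3 * E * I)
        print(f"large-deflection tip drop: {delta_large:.4f} m")
        print(f"small-deflection estimate: {delta_small:.4f} m")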

  8. Temperature distribution of thick thermoset composites

    NASA Astrophysics Data System (ADS)

    Guo, Zhan-Sheng; Du, Shanyi; Zhang, Boming

    2004-05-01

    The development of the temperature distribution in thick polymer-matrix laminates during an autoclave vacuum bag process was measured and compared with numerically calculated results. A finite element formulation of the transient heat transfer problem was derived for polymer-matrix composite materials from the heat transfer differential equations, including the internal heat generation produced by exothermic chemical reactions. Software based on a general-purpose finite element package was developed for numerical simulation of the entire curing process. From the experimental and numerical results, it was found that the measured temperature profiles were in good agreement with the numerical ones, and that conventional cure cycles recommended by prepreg manufacturers for thin laminates should be modified to prevent temperature overshoot.
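
    The coupled behaviour can be sketched with a one-dimensional explicit finite-difference model: conduction through the laminate thickness plus an Arrhenius cure-rate source term. All material constants and the cure-kinetics parameters below are assumed, order-of-magnitude values, not the paper's measured properties.

        # 1D transient heat conduction with an exothermic cure source:
        #   rho*c * dT/dt = k * d2T/dz2 + rho * H_r * d(alpha)/dt
        #   d(alpha)/dt = A * exp(-Ea / (R*T)) * (1 - alpha)**n
        # Explicit scheme; all constants are assumed, illustrative values only.
        import numpy as np

        rho, c, k = 1500.0, 1200.0, 0.5               # density, heat capacity, conductivity
        H_r, A, Ea, n_ord = 3.0e5, 5.0e4, 7.0e4, 1.5  # cure enthalpy and kinetics (assumed)
        R = 8.314
        thick, nz = 0.02, 41                          # 20-mm-thick laminate
        dz = thick / (nz - 1)
        diff = k / (rho * c)
        dt = 0.4 * dz**2 / diff                       # explicit stability limit

        T = np.full(nz, 300.0)                        # initial temperature (K)
        a = np.zeros(nz)                              # degree of cure
        T_wall = lambda t: 300.0 + min(t * 0.05, 80.0)  # simple autoclave ramp-hold

        t = 0.0
        for _ in range(int(3600 / dt)):               # simulate one hour
            da = A * np.exp(-Ea / (R * T)) * (1.0 - a) ** n_ord * dt
            da = np.minimum(da, 1.0 - a)
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
            T = T + dt * diff * lap + (H_r / c) * da  # conduction + exotherm
            a = a + da
            T[0] = T[-1] = T_wall(t)                  # autoclave-controlled surfaces
            t += dt
        print(f"centre-minus-wall temperature difference: {T[nz // 2] - T_wall(t):.1f} K")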

  9. Optical surface analysis: a new technique for the inspection and metrology of optoelectronic films and wafers

    NASA Astrophysics Data System (ADS)

    Bechtler, Laurie; Velidandla, Vamsi

    2003-04-01

    In response to demand for higher volumes and greater product capability, integrated optoelectronic device processing is rapidly increasing in complexity, benefiting from techniques developed for conventional silicon integrated circuit processing. The needs for high product yield and low manufacturing cost are also similar to those of the silicon wafer processing industry. This paper discusses the design and use of an automated inspection instrument called the Optical Surface Analyzer (OSA) to evaluate two critical production issues in optoelectronic device manufacturing: (1) film thickness uniformity, and (2) defectivity at various process steps. The OSA measurement instrument is better suited to photonics process development than most equipment developed for conventional silicon wafer processing in two important ways: it can handle both transparent and opaque substrates (unlike most inspection and metrology tools), and it is a full-wafer inspection method that captures defects and film variations over the entire substrate surface (unlike most film thickness measurement tools). Measurement examples are provided in the paper for a variety of films and substrates used in optoelectronics manufacturing.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkoske, Kyle; Nielsen, Michelle; Brown, Erika

    The Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists' (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) have worked together on the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment and technologies, which outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. Early community engagement and uptake survey data showed that 70% of Canadian centres are part of this process and that the data in the guideline documents reflect, and are influencing, the way Canadian radiation treatment centres run their technical quality control programs. As the TQC development framework matured as a cross-country initiative, guidance documents have been developed for many clinical technologies. Recently, new TQC documents have been initiated for Gamma Knife and CyberKnife technologies, for which the entire communities within Canada are involved in the review process. At the same time, QARSAC reviewed the suite as a whole for the first time and found that some tests and tolerances overlapped across multiple documents, since single tests could pertain to multiple quality control areas. The work to streamline the entire suite has improved its usability while keeping the integrity of single quality control areas. The suite will be published in the JACMP in the coming year.

  11. An intelligent factory-wide optimal operation system for continuous production process

    NASA Astrophysics Data System (ADS)

    Ding, Jinliang; Chai, Tianyou; Wang, Hongfeng; Wang, Junwei; Zheng, Xiuping

    2016-03-01

    In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; the system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into one framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.
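
    A schematic sketch of the closed-loop idea described above: a linear performance predictor identified from logged operational data (feed-forward and prediction), plus an integral bias update against plant measurements (feedback and dynamic tuning). The "plant", the setpoint names, and the dynamics are hypothetical stand-ins, not the authors' system.

        # Schematic closed-loop operation: a data-driven linear model predicts a
        # plant-wide performance index from unit setpoints; feed-forward picks the
        # best candidate setpoints and integral feedback trims model mismatch.
        # The "plant" below is a hypothetical stand-in, not a real process model.
        import numpy as np

        rng = np.random.default_rng(0)
        true_w = np.array([2.0, -1.0, 0.5])            # hidden plant response (unknown)

        def plant(u):                                  # measured performance index
            return true_w @ u + rng.normal(0, 0.05)

        # 1) Identify a linear predictor from historical operational data.
        U = rng.uniform(-1, 1, size=(200, 3))          # logged setpoints
        y = np.array([plant(u) for u in U])            # logged performance
        w_hat, *_ = np.linalg.lstsq(U, y, rcond=None)

        # 2) Closed loop: predict, pick best candidate, then trim the predictor.
        target, u, bias, ki = 1.5, np.zeros(3), 0.0, 0.2
        for step in range(30):
            cand = np.clip(u + rng.normal(0, 0.1, size=(20, 3)), -1, 1)
            pred = cand @ w_hat + bias                 # performance prediction
            u = cand[np.argmin(np.abs(pred - target))]
            meas = plant(u)
            bias += ki * (meas - (u @ w_hat + bias))   # feedback bias update
        print(f"final setpoints {np.round(u, 2)}, tracking error {target - meas:+.3f}")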

  12. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    PubMed

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.

  13. [Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].

    PubMed

    Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco

    2014-01-01

    Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or by the non-random lack of some information in a subgroup of the population. The aim here is to provide an overview of the IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and IQ in 7-year-old children. The methodology corrects the analysis by weighting the observations by the inverse of the probability of being selected. IPW is based on the assumption that individual information predicting the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking it into account, inferences about the entire target population can be made from the non-missing observations alone. The procedure is as follows: first, considering the entire population at study, the probability of non-missing information is estimated using a logistic regression model, where the response is non-missingness and the covariates are its possible predictors; the weight of each subject is given by the inverse of the predicted probability. The analysis is then performed only on the non-missing observations, using a weighted model. IPW is a technique that embeds the selection process in the analysis of the estimates, but its effectiveness in "correcting" selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO₂ on children's verbal intelligence quotient is stronger than the effect estimated by the analysis performed without regard to the selection processes.
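
    A minimal sketch of the two-step procedure on synthetic data: a logistic model for the probability of being observed, then a weighted outcome regression on the complete cases. The variable names and the data-generating process are hypothetical, chosen so that selection depends on a baseline covariate and the exposure.

        # Inverse probability weighting in two steps, on synthetic data:
        # 1) model P(observed | baseline covariates) with logistic regression,
        # 2) fit the outcome model on complete cases, weighted by 1/P.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        n = 5000
        ses = rng.normal(size=n)                  # hypothetical baseline predictor
        no2 = rng.normal(size=n)                  # exposure, known for everyone
        iq = 100 - 2.0 * no2 + 3.0 * ses + rng.normal(0, 5, n)

        # Selection (e.g., follow-up attendance) depends on ses and exposure,
        # so the complete-case estimate of the NO2 effect is biased.
        p_obs = 1 / (1 + np.exp(-(0.5 + 1.5 * ses - 1.0 * no2)))
        observed = rng.random(n) < p_obs

        # Step 1: estimate the non-missingness probability for everyone.
        Z = sm.add_constant(np.column_stack([ses, no2]))
        sel = sm.Logit(observed.astype(float), Z).fit(disp=0)
        w = 1.0 / sel.predict(Z)[observed]        # IPW weights for complete cases

        # Step 2: weighted outcome model on the observed subset only.
        X = sm.add_constant(no2[observed])
        naive = sm.OLS(iq[observed], X).fit()
        ipw = sm.WLS(iq[observed], X, weights=w).fit()
        print(f"naive NO2 effect: {naive.params[1]:+.2f}")
        print(f"IPW   NO2 effect: {ipw.params[1]:+.2f}  (true -2.00)")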

  14. An Exploration of Software-Based GNSS Signal Processing at Multiple Frequencies

    NASA Astrophysics Data System (ADS)

    Pasqual Paul, Manuel; Elosegui, Pedro; Lind, Frank; Vazquez, Antonio; Pankratius, Victor

    2017-01-01

    The Global Navigation Satellite System (GNSS; i.e., GPS, GLONASS, Galileo, and other constellations) has recently grown into numerous areas that go far beyond the traditional scope in navigation. In the geosciences, for example, high-precision GPS has become a powerful tool for a myriad of geophysical applications such as in geodynamics, seismology, paleoclimate, cryosphere, and remote sensing of the atmosphere. Positioning with millimeter-level accuracy can be achieved through carrier-phase-based, multi-frequency signal processing, which mitigates various biases and error sources such as those arising from ionospheric effects. Today, however, most receivers with multi-frequency capabilities are highly specialized hardware receiving systems with proprietary and closed designs, limited interfaces, and significant acquisition costs. This work explores alternatives that are entirely software-based, using Software-Defined Radio (SDR) receivers as a way to digitize the entire spectrum of interest. It presents an overview of existing open-source frameworks and outlines the next steps towards converting GPS software receivers from single-frequency to dual-frequency, geodetic-quality systems. In the future, this development will lead to a more flexible multi-constellation GNSS processing architecture that can be easily reused in different contexts, as well as to further miniaturization of receivers.
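
    To illustrate the dual-frequency processing mentioned above: first-order ionospheric delay scales as 1/f², so the standard ionosphere-free linear combination of observations on two carriers cancels it. A minimal sketch using the GPS L1/L2 frequencies, with made-up observable values:

        # Ionosphere-free combination of dual-frequency pseudoranges.
        # First-order ionospheric delay scales with 1/f^2, so
        #   P_IF = (f1^2 * P1 - f2^2 * P2) / (f1^2 - f2^2)
        # cancels it. GPS L1/L2 frequencies; the pseudoranges are illustrative.
        F1, F2 = 1575.42e6, 1227.60e6  # Hz

        def iono_free(p1: float, p2: float) -> float:
            g1, g2 = F1**2, F2**2
            return (g1 * p1 - g2 * p2) / (g1 - g2)

        rho = 22_000_000.0             # hypothetical geometric range (m)
        i1 = 5.0                       # hypothetical L1 ionospheric delay (m)
        i2 = i1 * (F1 / F2) ** 2       # corresponding L2 delay
        print(iono_free(rho + i1, rho + i2) - rho)  # ~0: delay removed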

  15. Healthcare under siege: Geopolitics of medical service provision in the Gaza Strip.

    PubMed

    Smith, Ron J

    2015-12-01

    Siege, a process of political domination aimed at isolating an entire population, represents a unique threat to healthcare provision. This study is a qualitative examination of the impacts of siege on the practices and systems that underlie health in Gaza. Data are from participant observation conducted over a period of six years (2009-2014), along with over 20 interviews with doctors and health administrators in the Non-Governmental Organisation (NGO), Governmental, and United Nations sectors. Analyses were informed by two connected theories. First, the theory of surplus population was used, an idea that builds on Marx's conception of primitive accumulation and Harvey's accumulation by dispossession. Second, Roy's theory of de-development was used, particularly as it connects to neoliberal trends in the organizing and financing of healthcare systems. Findings indicate that siege impinges on effective healthcare provision through two central, intertwined processes: withholding materials and resources, and undermining healthcare at a systems level. These strains pose considerable threats to healthcare, particularly within the Ministry of Health but also within and among other entities in Gaza that deliver care. The strategies of de-development described by participants reflect the ways the population is codified as a surplus population. Gazan society is continually divested of the underpinnings necessary for a well-functioning sovereign health care infrastructure. Instead of a self-governing, independent system, this analysis of health care structures in Gaza reveals a system continually at risk of being composed entirely of captive consumers who are wholly dependent on Israel, international bodies, and the aid industry for goods and services. This study points to the importance of foregrounding the geopolitical context in the analysis of medical service delivery within conflict settings. Findings also highlight the importance of advocating for sovereignty and self-determination as related to health systems.

  16. Combined process "helical rolling-pressing" and its effect on the microstructure of ferrous and non-ferrous materials

    NASA Astrophysics Data System (ADS)

    Naizabekov, Abdrakhman; Lezhnev, Sergey; Arbuz, Alexandr; Panin, Evgeniy

    2018-02-01

    Ultrafine-grained materials are among the most promising structural and functional materials. However, the known methods of producing them are not sufficiently powerful or technologically mature for profitable industrial application. Development of the combined "helical rolling-pressing" process is an attempt to bring ultrafine-grained materials production technology to industry. Combining intense surface processing by helical rolling with processing of the entire workpiece cross section in an equal-channel angular matrix, together with intense torsional deformation between the rolls and the matrix, increases the degree of deformation per pass and allows the disadvantages of each method, when used separately, to be mutually compensated. This paper describes the development of a laboratory stand and a study of the influence of the combined "helical rolling-pressing" process on the microstructure of tool steel, technical copper, and high-alloy stainless high-temperature steel.

  17. Access NASA Satellite Global Precipitation Data Visualization on YouTube

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Su, J.; Acker, J. G.; Huffman, G. J.; Vollmer, B.; Wei, J.; Meyer, D. J.

    2017-12-01

    Since the satellite era began, NASA has collected a large volume of Earth science observations for research and applications around the world. Satellite data at the 12 NASA data centers can also be used for STEM activities covering disaster events, climate change, etc. However, accessing satellite data can be a daunting task for non-professional users such as teachers and students because of unfamiliar terminology, disciplines, data formats, data structures, computing resources, processing software, programming languages, etc. Over the years, many efforts have been made to improve satellite data access, but barriers still exist for non-professionals. In this presentation, we describe our latest activity, which uses the popular online video sharing web site, YouTube, to access visualizations of global precipitation datasets at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC). With YouTube, users can access and visualize a large volume of satellite data without needing to learn new software or download data. The dataset in this activity is the 3-hourly TRMM (Tropical Rainfall Measuring Mission) Multi-satellite Precipitation Analysis (TMPA). The video is built from over 50,000 data files collected from 1998 onward, covering the zone between 50°N and 50°S. The YouTube video lasts 36 minutes for the entire dataset record (over 19 years). Since the time stamp is on each frame of the video, users can begin at any time by dragging the time progress bar. This precipitation animation allows viewing precipitation events and processes (e.g., hurricanes, fronts, atmospheric rivers, etc.) on a global scale. The next plan is to develop a similar animation for the GPM (Global Precipitation Measurement) Integrated Multi-satellitE Retrievals for GPM (IMERG). IMERG provides precipitation with near-global (60°N-S) coverage at a half-hourly time interval, showing more detail on precipitation processes and development compared to the 3-hourly TMPA product. The entire video will contain more than 330,000 files and will last 3.6 hours. Future plans include development of fly-over videos for orbital data for an entire satellite mission or project. All videos will be uploaded and available at the GES DISC site on YouTube (https://www.youtube.com/user/NASAGESDISC).

  18. A design of experiment approach for efficient multi-parametric drug testing using a Caenorhabditis elegans model.

    PubMed

    Letizia, M C; Cornaglia, M; Tranchida, G; Trouillon, R; Gijs, M A M

    2018-01-22

    When studying the drug effectiveness towards a target model, one should distinguish the effects of the drug itself and of all the other factors that could influence the screening outcome. This comprehensive knowledge is crucial, especially when model organisms are used to study the drug effect at a systemic level, as a higher number of factors can influence the drug-testing outcome. Covering the entire experimental domain and studying the effect of the simultaneous change in several factors would require numerous experiments, which are costly and time-consuming. Therefore, a design of experiment (DoE) approach in drug-testing is emerging as a robust and efficient method to reduce the use of resources, while maximizing the knowledge of the process. Here, we used a 3-factor-Doehlert DoE to characterize the concentration-dependent effect of the drug doxycycline on the development duration of the nematode Caenorhabditis elegans. To cover the experimental space, 13 experiments were designed and performed, where different doxycycline concentrations were tested, while also varying the temperature and the food amount, which are known to influence the duration of C. elegans development. A microfluidic platform was designed to isolate and culture C. elegans larvae, while testing the doxycycline effect with full control of temperature and feeding over the entire development. Our approach allowed predicting the doxycycline effect on C. elegans development in the complete drug concentration/temperature/feeding experimental space, maximizing the understanding of the effect of this antibiotic on the C. elegans development and paving the way towards a standardized and optimized drug-testing process.
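
    To illustrate the response-surface side of such a DoE, the sketch below fits a standard quadratic model with interactions to 13 runs in three coded factors; the design points and responses are random placeholders, not the paper's Doehlert matrix or measured development times.

        # Fit a quadratic response surface y = b0 + sum(bi*xi) + sum(bij*xi*xj)
        # to 13 runs in 3 coded factors (drug conc., temperature, food amount).
        # The design points and responses below are random placeholders, not the
        # paper's Doehlert matrix or measurements.
        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.uniform(-1, 1, size=(13, 3))           # coded factor settings
        y = 50 + 6*X[:, 0] - 3*X[:, 1] + 2*X[:, 0]*X[:, 1] + rng.normal(0, 0.5, 13)

        def quad_terms(X):
            x1, x2, x3 = X.T
            return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                    x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

        beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
        # Predict anywhere in the coded experimental space:
        x_new = np.array([[0.5, -0.2, 0.0]])
        print(f"predicted response: {quad_terms(x_new) @ beta}")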

  19. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.

  20. Using ISI Web of Science to Compare Top-Ranked Journals to the Citation Habits of a "Real World" Academic Department

    ERIC Educational Resources Information Center

    Cusker, Jeremy

    2012-01-01

    Quantitative measurements can be used to yield lists of top journals for individual fields. However, these lists represent assessments of the entire "universe" of citation. A much more involved process is needed if the goal is to develop a nuanced picture of what a specific group of authors, such as an academic department, is citing. This article…

  1. Ways to Help Your Child through an Immunization: Visual Strategies for Autism and Other Developmental Disorders

    ERIC Educational Resources Information Center

    Hutchinson, Paula; Harvey, Vicki; Naugler, Krista

    2010-01-01

    Many people, whether old or young, male or female, typically developing or living with a disability, become quite anxious at the idea of a needle. They anticipate the possibility of pain, however brief, and try to avoid the experience. The reality is that any discomfort is usually very brief, and the entire process only takes a minute or two from…

  2. Contingency plans for chromium utilization. Publication NMAB-335

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The United States depends entirely on foreign sources for the critical material, chromium, making it very vulnerable to supply disruptions. The effectiveness of programs such as stockpiling, conservation, and research and development for substitutes to reduce the impact of disruption of imports of chromite and ferrochromium are discussed. Alternatives for decreasing chromium consumption also are identified for chromium-containing materials in the areas of design, processing, and substitution.

  3. Muscle fatigue evaluation of astronaut upper limb based on sEMG and subjective assessment

    NASA Astrophysics Data System (ADS)

    Zu, Xiaoqi; Zhou, Qianxiang; Li, Yun

    2012-07-01

    All movements are driven by muscle contraction, which can easily lead to muscle fatigue. Evaluation of muscle fatigue is a hot topic in the area of astronaut life support training and rehabilitation. If a muscle becomes fatigued, work efficiency may be reduced and psychological performance affected. Therefore it is necessary to develop an accurate and usable method for evaluating muscle fatigue of the astronaut upper limb. In this study, we developed a method based on surface electromyography (sEMG) and subjective assessment (Borg scale) to evaluate local muscle fatigue. Fifteen healthy young male subjects participated in the experiment. They performed isometric muscle contractions of the upper limb. sEMG of the biceps brachii was recorded during the entire contraction process, and Borg scale ratings of muscle fatigue were collected at set times. The sEMG signals were divided into several segments, and the mean energy of each segment was calculated by the one-twelfth-octave band method. Equations were derived based on the relationship between the mean energy of the sEMG and the Borg scale. The results showed that a cubic curve could describe the degree of local muscle fatigue and could be used to evaluate and monitor local muscle fatigue during the entire process.
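
    A minimal sketch of the final step, fitting a cubic relationship between sEMG mean energy and the Borg rating; the sample values are invented for illustration, not the study's measurements.

        # Fit a cubic curve relating sEMG mean energy to the Borg fatigue rating.
        # The data points are invented placeholders, not the study's measurements.
        import numpy as np

        borg = np.array([6, 9, 11, 13, 15, 17, 19])             # Borg RPE ratings
        energy = np.array([1.0, 1.3, 1.9, 2.8, 4.1, 5.9, 8.2])  # mean sEMG energy (a.u.)

        coeffs = np.polyfit(energy, borg, deg=3)   # cubic: borg ~ f(energy)
        fit = np.poly1d(coeffs)

        r = np.corrcoef(borg, fit(energy))[0, 1]
        print(f"cubic coefficients: {np.round(coeffs, 3)}")
        print(f"correlation of fit: r = {r:.3f}")
        print(f"estimated Borg at energy 3.0: {fit(3.0):.1f}")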

  4. The Core Flight System (cFS) Community: Providing Low Cost Solutions for Small Spacecraft

    NASA Technical Reports Server (NTRS)

    McComas, David; Wilmot, Jonathan; Cudmore, Alan

    2016-01-01

    In February 2015 the NASA Goddard Space Flight Center (GSFC) completed the open source release of the entire Core Flight Software (cFS) suite. After the open source release a multi-NASA center Configuration Control Board (CCB) was established that has managed multiple cFS product releases. The cFS was developed and is being maintained in compliance with the NASA Class B software development process requirements and the open source release includes all Class B artifacts. The cFS is currently running on three operational science spacecraft and is being used on multiple spacecraft and instrument development efforts. While the cFS itself is a viable flight software (FSW) solution, we have discovered that the cFS community is a continuous source of innovation and growth that provides products and tools that serve the entire FSW lifecycle and future mission needs. This paper summarizes the current state of the cFS community, the key FSW technologies being pursued, the development/verification tools and opportunities for the small satellite community to become engaged. The cFS is a proven high quality and cost-effective solution for small satellites with constrained budgets.

  5. Integration of Tuyere, Raceway and Shaft Models for Predicting Blast Furnace Process

    NASA Astrophysics Data System (ADS)

    Fu, Dong; Tang, Guangwu; Zhao, Yongfu; D'Alessio, John; Zhou, Chenn Q.

    2018-06-01

    A novel modeling strategy is presented for simulating the blast furnace iron-making process. The physical and chemical phenomena involved take place across a wide range of length and time scales, so three models are developed to simulate different regions of the blast furnace: the tuyere model, the raceway model, and the shaft model. This paper focuses on the integration of the three models to predict the entire blast furnace process. Mapping of outputs and inputs between models and an iterative scheme are developed to establish communication between the models. The effects of tuyere operation and burden distribution on blast furnace fuel efficiency are investigated numerically. The integration of different models provides a way to realistically simulate the blast furnace by improving the modeling resolution of local phenomena and minimizing the model assumptions.
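
    The integration strategy, mapping each model's outputs to the next model's inputs and iterating until the exchanged interface variables converge, can be sketched generically; the three model functions below are trivial stand-ins, not the tuyere/raceway/shaft physics.

        # Generic fixed-point coupling of three submodels: each model's outputs
        # feed the next model's inputs, and the chain is iterated until the
        # exchanged interface variables stop changing. The model functions are
        # trivial stand-ins, not the actual tuyere/raceway/shaft physics.

        def tuyere(blast):             # blast parameters -> raceway boundary conditions
            return {"gas_T": 2200.0 + 2.0 * blast["o2_enrich"]}

        def raceway(bc, shaft_state):  # raceway -> gas flow entering the shaft
            return {"gas_flow": 0.9 * bc["gas_T"] / (1.0 + shaft_state["perm"])}

        def shaft(gas):                # shaft -> burden permeability fed back upstream
            return {"perm": 0.5 + 1e-4 * gas["gas_flow"]}

        blast = {"o2_enrich": 4.0}
        state = {"perm": 0.5}          # initial guess at the model interface
        for it in range(100):
            gas = raceway(tuyere(blast), state)
            new_state = shaft(gas)
            if abs(new_state["perm"] - state["perm"]) < 1e-8:  # converged?
                break
            state = new_state
        print(f"converged in {it} iterations: {state} {gas}")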

  6. Developmental and adult-specific processes contribute to de novo neuromuscular regeneration in the lizard tail.

    PubMed

    Tokuyama, Minami A; Xu, Cindy; Fisher, Rebecca E; Wilson-Rawls, Jeanne; Kusumi, Kenro; Newbern, Jason M

    2018-01-15

    Peripheral nerves exhibit robust regenerative capabilities in response to selective injury among amniotes, but the regeneration of entire muscle groups following volumetric muscle loss is limited in birds and mammals. In contrast, lizards possess the remarkable ability to regenerate extensive de novo muscle after tail loss. However, the mechanisms underlying reformation of the entire neuromuscular system in the regenerating lizard tail are not completely understood. We have tested whether regeneration of the peripheral nerve and neuromuscular junctions (NMJs) recapitulates processes observed during normal neuromuscular development in the green anole, Anolis carolinensis. Our data confirm robust axonal outgrowth during early stages of tail regeneration and subsequent NMJ formation within weeks of autotomy. Interestingly, NMJs are overproduced, as evidenced by a persistently increased NMJ density at 120 and 250 days post autotomy (DPA). Substantial Myelin Basic Protein (MBP) expression could also be detected along regenerating nerves, indicating that the ability of Schwann cells to myelinate newly formed axons remained intact. Overall, our data suggest that the mechanisms of de novo nerve and NMJ reformation parallel, in part, those observed during neuromuscular development. However, the prolonged increase in NMJ number and aberrant muscle differentiation hint at processes specific to the adult response. An examination of the coordinated exchange between peripheral nerves, Schwann cells, and newly synthesized muscle of the regenerating neuromuscular system may assist in the identification of candidate molecules that promote neuromuscular recovery in organisms incapable of a robust regenerative response. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Examination and evaluation of the use of screen heaters for the measurement of the high temperature pyrolysis kinetics of polyethene and polypropene

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westerhout, R.W.J.; Balk, R.H.P.; Meijer, R.

    1997-08-01

    A screen heater with a gas sweep was developed and applied to study the pyrolysis kinetics of low-density polyethene (LDPE) and polypropene (PP) at temperatures ranging from 450 to 530 C. The aim of this study was to examine the applicability of screen heaters for measuring these kinetics. On-line measurement of the rate of volatiles formation using a hydrocarbon analyzer was applied to enable the determination of the conversion rate over the entire conversion range on the basis of a single experiment. Another important feature of the screen heater used in this study is the possibility of measuring pyrolysis kinetics under nearly isothermal conditions. The kinetic constants for LDPE and PP pyrolysis were determined, using a first-order model to describe the conversion rate in the 70-90% conversion range and the random chain dissociation model for the entire conversion range. In addition to the experimental work, two single-particle models were developed, both incorporating a mass balance and a (coupled) enthalpy balance, which were used to assess the influence of internal and external heat transfer processes on the pyrolysis process. The first model assumes a variable density and constant volume during the pyrolysis process, whereas the second model assumes a constant density and a variable volume. An important feature of these models is that they can accommodate kinetic models for which no analytical representation of the pyrolysis kinetics is available.
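
    The first-order description used for the 70-90% conversion range can be sketched as follows; the Arrhenius parameters are assumed round numbers, not the paper's fitted constants.

        # First-order pyrolysis kinetics with an Arrhenius rate constant:
        #   dX/dt = k(T) * (1 - X),   k(T) = k0 * exp(-Ea / (R*T))
        # Closed-form isothermal solution: X(t) = 1 - exp(-k*t).
        # k0 and Ea are assumed round numbers, not the paper's fitted constants.
        import numpy as np

        R = 8.314
        k0, Ea = 1.0e13, 2.0e5         # 1/s, J/mol (illustrative)

        def k(T_celsius: float) -> float:
            return k0 * np.exp(-Ea / (R * (T_celsius + 273.15)))

        for T in (450, 490, 530):      # the isothermal range studied
            t90 = -np.log(1 - 0.9) / k(T)   # time to 90% conversion
            print(f"T = {T} C: k = {k(T):.3e} 1/s, t(90%) = {t90:.1f} s")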

  8. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

    This year, the process industry has reached an important milestone in process safety: the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources, and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  9. Musical rhythm and reading development: does beat processing matter?

    PubMed

    Ozernov-Palchik, Ola; Patel, Aniruddh D

    2018-05-20

    There is mounting evidence for links between musical rhythm processing and reading-related cognitive skills, such as phonological awareness. This may be because music and speech are rhythmic: both involve processing complex sound sequences with systematic patterns of timing, accent, and grouping. Yet, there is a salient difference between musical and speech rhythm: musical rhythm is often beat-based (based on an underlying grid of equal time intervals), while speech rhythm is not. Thus, the role of beat-based processing in the reading-rhythm relationship is not clear. Is there a distinct relation between beat-based processing mechanisms and reading-related language skills, or is the rhythm-reading link entirely due to shared mechanisms for processing non-beat-based aspects of temporal structure? We discuss recent evidence for a distinct link between beat-based processing and early reading abilities in young children, and suggest experimental designs that would allow one to further methodically investigate this relationship. We propose that beat-based processing taps into a listener's ability to use rich contextual regularities to form predictions, a skill important for reading development. © 2018 New York Academy of Sciences.

  10. Physical and mechanical metallurgy of NiAl

    NASA Technical Reports Server (NTRS)

    Noebe, Ronald D.; Bowman, Randy R.; Nathal, Michael V.

    1994-01-01

    Considerable research has been performed on NiAl over the last decade, with an exponential increase in effort occurring over the last few years. This is due to interest in this material for electronic, catalytic, coating and especially high-temperature structural applications. This report uses this wealth of new information to develop a complete description of the properties and processing of NiAl and NiAl-based materials. Emphasis is placed on the controlling fracture and deformation mechanisms of single and polycrystalline NiAl and its alloys over the entire range of temperatures for which data are available. Creep, fatigue, and environmental resistance of this material are discussed. In addition, issues surrounding alloy design, development of NiAl-based composites, and materials processing are addressed.

  11. Approximate Model of Zone Sedimentation

    NASA Astrophysics Data System (ADS)

    Dzianik, František

    2011-12-01

    The process of zone sedimentation is affected by many factors that cannot be expressed analytically. For this reason, zone settling is evaluated in practice experimentally or through an empirical mathematical description of the process. The paper presents the development of an approximate model of zone settling, i.e., a general function that properly approximates the behaviour of the settling process over its entire range and under various conditions. Furthermore, the specification of the model parameters by regression analysis of settling test results is shown. The suitability of the model is assessed using graphical dependencies and statistical correlation coefficients. The approximate model could also be useful in simplifying the process design of continuous settling tanks and thickeners.
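
    As an illustration of fitting an empirical settling function to batch-test data, the sketch below fits an exponential interface-height model by nonlinear regression; both the functional form and the data points are assumptions for illustration, not the paper's model.

        # Fit an empirical zone-settling curve to batch test data: interface height
        #   H(t) = H_inf + (H0 - H_inf) * exp(-k * t)
        # The exponential form and the data points are illustrative assumptions.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.array([0, 5, 10, 20, 30, 45, 60, 90], float)             # min
        H = np.array([1.00, 0.82, 0.68, 0.48, 0.36, 0.27, 0.23, 0.21])  # m

        def model(t, H_inf, H0, k):
            return H_inf + (H0 - H_inf) * np.exp(-k * t)

        (H_inf, H0, k), _ = curve_fit(model, t, H, p0=[0.2, 1.0, 0.05])
        resid = H - model(t, H_inf, H0, k)
        r2 = 1 - np.sum(resid**2) / np.sum((H - H.mean())**2)
        print(f"H_inf = {H_inf:.3f} m, H0 = {H0:.3f} m, "
              f"k = {k:.4f} 1/min, R^2 = {r2:.4f}")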

  12. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, through acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving production and fulfillment, to evaluating results is currently performed by experienced, highly trained staff. Presented is a solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from the desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted will be how the complexity of running a targeted campaign is hidden from the user through these technologies, all while providing the benefits of a professionally managed campaign.

  13. Space station automation study: Automation requirements derived from space manufacturing concepts, volume 2

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Automation requirements were developed for two manufacturing concepts: (1) Gallium Arsenide Electroepitaxial Crystal Production and Wafer Manufacturing Facility, and (2) Gallium Arsenide VLSI Microelectronics Chip Processing Facility. A functional overview of the ultimate design concept incorporating the two manufacturing facilities on the space station is provided. The concepts were selected to facilitate an in-depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, sensors, and artificial intelligence. While the cost-effectiveness of these facilities was not analyzed, both appear entirely feasible for the year 2000 timeframe.

  14. The aging-disease false dichotomy: understanding senescence as pathology

    PubMed Central

    Gems, David

    2015-01-01

    From a biological perspective aging (senescence) appears to be a form of complex disease syndrome, though this is not the traditional view. This essay aims to foster a realistic understanding of aging by scrutinizing ideas old and new. The conceptual division between aging-related diseases and an underlying, non-pathological aging process underpins various erroneous traditional ideas about aging. Among biogerontologists, another likely error involves the aspiration to treat the entire aging process, which recent advances suggest is somewhat utopian. It also risks neglecting a more modest but realizable goal: to develop preventative treatments that partially protect against aging. PMID:26136770

  15. Mask fabrication process

    DOEpatents

    Cardinale, Gregory F.

    2000-01-01

    A method for fabricating masks and reticles useful for projection lithography systems. An absorber layer is conventionally patterned using a pattern-and-etch process. Following the patterning step, the entire surface of the remaining top patterning photoresist layer, as well as that portion of an underlying protective photoresist layer where absorber material has been etched away, is exposed to UV radiation. The UV-exposed regions of the protective photoresist layer and the top patterning photoresist layer are then removed by solution development, thereby eliminating the need for an oxygen plasma etch and strip and the chance of damaging the surface of the substrate or coatings.

  16. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, but also impacts other aspects of software engineering. Other affected areas include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process in order to adapt V&V to reuse-based software engineering.

  17. Aging Biology and Novel Targets for Drug Discovery

    PubMed Central

    McLachlan, Andrew J.; Quinn, Ronald J.; Simpson, Stephen J.; de Cabo, Rafael

    2012-01-01

    Despite remarkable technological advances in genetics and drug screening, the discovery of new pharmacotherapies has slowed and new approaches to drug development are needed. Research into the biology of aging is generating many novel targets for drug development that may delay all age-related diseases and be used long term by the entire population. Drugs that successfully delay the aging process will clearly become “blockbusters.” To date, the most promising leads have come from studies of the cellular pathways mediating the longevity effects of caloric restriction (CR), particularly target of rapamycin and the sirtuins. Similar research into pathways governing other hormetic responses that influence aging is likely to yield even more targets. As aging becomes a more attractive target for drug development, there will be increasing demand to develop biomarkers of aging as surrogate outcomes for the testing of the effects of new agents on the aging process. PMID:21693687

  18. Converting customer expectations into achievable results.

    PubMed

    Landis, G A

    1999-11-01

    It is not enough in today's environment to just meet customers' expectations; we must exceed them. Therefore, one must learn what constitutes expectations. These needs have expanded during the past few years from just manufacturing the product and looking at the outcome from a provincial standpoint. Now we must understand and satisfy the entire supply chain. Managing this process and satisfying the customer now involves the supplier, the manufacturer, and the entire distribution system.

  19. Framework for Development of Object-Oriented Software

    NASA Technical Reports Server (NTRS)

    Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan

    2004-01-01

    The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.

  20. Shielded loaded bowtie antenna incorporating the presence of paving structure for improved GPR pipe detection

    NASA Astrophysics Data System (ADS)

    Seyfried, Daniel; Jansen, Ronald; Schoebel, Joerg

    2014-12-01

    In civil engineering, Ground Penetrating Radar is becoming an increasingly important tool for nondestructive testing and exploration of the underground. For example, detecting existing utility pipe networks prior to construction work, or locating damaged spots beneath a paved street, is a highly advantageous application. However, different surface conditions as well as ground bounce reflection and antenna cross-talk may seriously affect the detection capability of the entire radar system. Therefore, proper antenna design is an essential part of obtaining radar data of high quality. In this paper we redesign a given loaded bowtie antenna in order to reduce strong and unwanted signal contributions such as ground bounce reflection and antenna cross-talk. During the optimization process we also review all parameters of our existing antenna in order to maximize energy transfer into the ground. The entire process, incorporating appropriate simulations along with measurements on our GPR test site, where different types of pipes and cables are buried for testing and developing radar hardware and software algorithms under quasi-real conditions, is described in this paper.

  1. Surgical quality assessment. A simplified approach.

    PubMed

    DeLong, D L

    1991-10-01

    The current approach to QA primarily involves taking action when problems are discovered and designing a documentation system that records the delivery of quality care. Involving the entire staff helps eliminate problems before they occur. By keeping abreast of current problems and soliciting input from staff members, our hospital's QA program has improved dramatically. The cross-referencing of JCAHO and AORN standards on the assessment form and the single-sheet reporting form expedite the evaluation process and simplify record keeping. The bulletin board increases staff members' understanding of QA and boosts morale and participation. A sound and effective QA program does not require reorganizing an entire department, nor should it invoke negative connotations. Developing an effective QA program merely requires rethinking current processes. The program must meet the department's specific needs, and although many departments concentrate on documentation, auditing charts does not give a complete picture of the quality of care delivered. The QA committee must employ a variety of data collection methods on multiple indicators to ensure an accurate representation of the care delivered, and it must not overlook any issues that directly affect patient outcomes.

  2. Measurement of Meteor Impact Experiments Using Three-Component Particle Image Velocimetry

    NASA Technical Reports Server (NTRS)

    Heineck, James T.; Schultz, Peter H.

    2002-01-01

    The study of hypervelocity impacts has been aggressively pursued for more than 30 years at Ames as a way to simulate meteoritic impacts. Development of experimental methods, coupled with new perspectives over this time, has greatly improved the understanding of the basic physics and phenomenology of the impact process. These fundamental discoveries have led to novel methods for identifying impact craters and features in craters on both Earth and other planetary bodies. Work done at the Ames Vertical Gun Range led to the description of the mechanics of the Chicxulub crater (a.k.a. the K-T crater) on the Yucatan Peninsula, widely considered to be the smoking-gun impact that brought an end to the dinosaur era. This is the first attempt to apply three-component particle image velocimetry (3-D PIV) to measure the trajectory of the entire ejecta curtain simultaneously with the fluid structure resulting from impact dynamics. The science learned in these experiments will build understanding of the entire impact process by simultaneously measuring both ejecta and atmospheric mechanics.

  3. End-to-end commissioning demonstration of the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    Acton, D. Scott; Towell, Timothy; Schwenker, John; Shields, Duncan; Sabatke, Erin; Contos, Adam R.; Hansen, Karl; Shi, Fang; Dean, Bruce; Smith, Scott

    2007-09-01

    The one-meter Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate the design and implementation of the wavefront sensing and control (WFSC) capabilities of the James Webb Space Telescope (JWST). We have recently conducted an "end-to-end" demonstration of the flight commissioning process on the TBT. This demonstration started with the Primary Mirror (PM) segments and the Secondary Mirror (SM) in random positions, traceable to the worst-case flight deployment conditions. The commissioning process detected and corrected the deployment errors, resulting in diffraction-limited performance across the entire science FOV. This paper will describe the commissioning demonstration and the WFSC algorithms used at each step in the process.

  4. Development of a real-time microchip PCR system for portable plant disease diagnosis.

    PubMed

    Koo, Chiwan; Malapi-Wight, Martha; Kim, Hyun Soo; Cifci, Osman S; Vaughn-Diaz, Vanessa L; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and accepted method for plant pathogen diagnosis; however, current conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can directly quantify the amplicon during DNA amplification without the need for post-processing and is thus more suitable for field operations; however, it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but unfortunately all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller to control the entire thermocycling operation with data acquisition capability, and a battery. The entire system is 25 × 16 × 8 cm3 in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh of power. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, or other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate to a detection limit of 5 ng/8 µl sample.
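
    The thermocycling control loop at the heart of such a system can be sketched as a simple state machine stepping through denaturation/annealing/extension temperature targets with on/off heater control; the stage settings, ramp rates, and cycle count below are generic PCR values and assumed chip dynamics, not the published system's firmware or calibration.

        # Sketch of a thermocycler control loop: step through the PCR temperature
        # stages and drive a heater with simple on/off (bang-bang) control. The
        # stage settings and first-order thermal model are generic assumptions.
        STAGES = [("denature", 95.0, 15), ("anneal", 58.0, 20), ("extend", 72.0, 30)]
        N_CYCLES, DT = 35, 0.1               # cycles, control period (s)
        HEAT_RATE, COOL_RATE = 4.0, 2.5      # deg C per s (assumed chip dynamics)

        def run_pcr():
            temp, t_total = 25.0, 0.0
            for cycle in range(N_CYCLES):
                for name, target, hold_s in STAGES:
                    held = 0.0
                    while held < hold_s:
                        if temp < target - 0.5:
                            temp += HEAT_RATE * DT      # heater on
                        elif temp > target + 0.5:
                            temp -= COOL_RATE * DT      # passive/active cooling
                        else:
                            held += DT                  # within band: count hold time
                        t_total += DT
            return t_total

        print(f"estimated run time: {run_pcr() / 60:.1f} min for {N_CYCLES} cycles")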

  5. Development of a Real-Time Microchip PCR System for Portable Plant Disease Diagnosis

    PubMed Central

    Kim, Hyun Soo; Cifci, Osman S.; Vaughn-Diaz, Vanessa L.; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C.; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and accepted method for plant pathogen diagnosis; however, current conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can directly quantify the amplicon during DNA amplification without the need for post-processing and is thus more suitable for field operations; however, it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but unfortunately all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller to control the entire thermocycling operation with data acquisition capability, and a battery. The entire system is 25×16×8 cm3 in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh of power. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, or other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate to a detection limit of 5 ng/8 µl sample. PMID:24349341

  6. Extending the granularity of representation and control for the MIL-STD CAIS 1.0 node model

    NASA Technical Reports Server (NTRS)

    Rogers, Kathy L.

    1986-01-01

    The Common APSE (Ada Program Support Environment) Interface Set (CAIS) (DoD85) node model provides an excellent baseline for interfaces in a single-host development environment. To encompass the entire spectrum of computing, however, the CAIS model should be extended in four areas. It should provide the interface between the engineering workstation and the host system throughout the entire lifecycle of the system. It should provide a basis for the communication and integration functions needed by distributed host environments. It should provide common interfaces for communications mechanisms to and among target processors. And it should provide facilities for integration, validation, and verification of test beds extending to distributed systems on geographically separate processors with heterogeneous instruction set architectures (ISAs). Additions to the PROCESS NODE model to extend the CAIS into these four areas are proposed.

  7. PrimerDesign-M: A multiple-alignment based multiple-primer design tool for walking across variable genomes

    DOE PAGES

    Yoon, Hyejin; Leitner, Thomas

    2014-12-17

    Analyses of entire viral genomes or mtDNA require the comprehensive design of many primers across the genome. In addition, simultaneous optimization of several DNA primer design criteria may improve overall experimental efficiency and downstream bioinformatic processing. To achieve these goals, we developed PrimerDesign-M. It includes several options for multiple-primer design, allowing researchers to efficiently design walking primers that cover long DNA targets, such as entire HIV-1 genomes, and it optimizes primers simultaneously against the genetic diversity in multiple alignments and the experimental design constraints given by the user. PrimerDesign-M can also design primers that include DNA barcodes and minimize primer dimerization. PrimerDesign-M finds optimal primers for highly variable DNA targets and facilitates design flexibility by suggesting alternative designs to adapt to experimental conditions.
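
    As a toy illustration of one ingredient of such multi-criteria design, the Python sketch below scores fixed-length windows of a multiple alignment by consensus GC content and column conservation. The thresholds, scoring rule, and tiny alignment are assumptions for illustration only and are far simpler than what PrimerDesign-M implements (melting temperature, dimerization, barcodes, etc.).

      # Score alignment windows as primer candidates: keep windows whose
      # consensus GC fraction is in range, rank them by mean conservation.
      from collections import Counter

      def column_conservation(column):
          """Fraction of sequences agreeing with the consensus base (gaps count against)."""
          counts = Counter(b for b in column if b != "-")
          return (counts.most_common(1)[0][1] / len(column)) if counts else 0.0

      def score_windows(alignment, k=20, min_gc=0.4, max_gc=0.6):
          """Yield (start, mean conservation) for windows passing the GC filter."""
          n = len(alignment[0])
          for start in range(n - k + 1):
              cols = [[seq[start + i] for seq in alignment] for i in range(k)]
              consensus = "".join(Counter(c).most_common(1)[0][0] for c in cols)
              gc = (consensus.count("G") + consensus.count("C")) / k
              if min_gc <= gc <= max_gc:
                  cons = sum(column_conservation(c) for c in cols) / k
                  yield start, cons

      alignment = ["ACGTACGTGGCCACGTACGTACGTA",
                   "ACGTACGTGGCCACGTACGAACGTA",
                   "ACGTACGTGGCCACGTACGTACGTA"]
      best = max(score_windows(alignment, k=12), key=lambda t: t[1])
      print("best window start:", best[0], "conservation:", round(best[1], 3))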

  8. Time-Resolved Electronic Relaxation Processes in Self-Organized Quantum Dots

    DTIC Science & Technology

    2005-05-16

    …in a quantum dot infrared photodetector,” paper CthM11, presented at CLEO, Baltimore, 2003. K. Kim, T. Norris, J. Singh, P. Bhattacharya… nanostructures have been equally spectacular. Following the development of quantum-well infrared photodetectors in the late 1980s and early 90s… [4]. The quantum cascade laser is of course the best known of the new devices, as it constitutes an entirely new concept in semiconductor laser…

  9. Disruptive and Sustaining Technology Development Approaches in Defense Acquisition

    DTIC Science & Technology

    2014-04-30

    …feature for the emerging personal computer market. Disruptive innovation also operates on the scale of an entire market. The story of Eastman Kodak… quality pictures, it was only available to those with expertise in, and desire to, chemically process the film. The Kodak box camera took lower quality… into a niche market. A century later the scenario repeated itself in amateur photography. Kodak had become locked into their century-old business…

  10. Proposal of Heuristic Algorithm for Scheduling of Print Process in Auto Parts Supplier

    NASA Astrophysics Data System (ADS)

    Matsumoto, Shimpei; Okuhara, Koji; Ueno, Nobuyuki; Ishii, Hiroaki

    We consider the print process within the manufacturing operations of an auto parts supplier as a practical problem. The purpose of this research is to apply our scheduling technique, developed at the university, to an actual print process in a mass-customization environment. Rationalization of the print process depends on the lot sizing. The manufacturing lead time of the print process is long, and in the present method production is planned according to workers' experience and intuition, so the construction of an efficient production system is an urgent problem. Therefore, in this paper, in order to shorten the entire manufacturing lead time and to reduce stock, we re-examine the usual lot-sizing rule using a heuristic technique, and we propose an improved method that can plan a more efficient schedule.
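
    For context, the classic Silver-Meal rule below shows the flavor of heuristic lot-sizing techniques of the kind being re-examined: extend the current lot over future periods while the average of setup plus holding cost per period keeps falling. This is a standard textbook heuristic with made-up numbers, not the authors' proposed method.

      # Silver-Meal-style lot sizing: group period demands into lots while the
      # average per-period cost (setup + holding) keeps decreasing.
      def silver_meal(demand, setup_cost, holding_cost):
          """Return lot sizes per period for the given per-period demand."""
          lots = [0] * len(demand)
          t = 0
          while t < len(demand):
              best_avg, span = float("inf"), 1
              cost = setup_cost
              for j in range(t, len(demand)):
                  cost += holding_cost * (j - t) * demand[j]  # carry demand[j] for j-t periods
                  avg = cost / (j - t + 1)
                  if avg > best_avg:
                      break  # average per-period cost started rising: stop extending
                  best_avg, span = avg, j - t + 1
              lots[t] = sum(demand[t:t + span])
              t += span
          return lots

      print(silver_meal([80, 100, 125, 100, 50], setup_cost=100, holding_cost=1))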

  11. Entire Photodamaged Chloroplasts Are Transported to the Central Vacuole by Autophagy

    PubMed Central

    2017-01-01

    Turnover of dysfunctional organelles is vital to maintain homeostasis in eukaryotic cells. As photosynthetic organelles, plant chloroplasts can suffer sunlight-induced damage. However, the process for turnover of entire damaged chloroplasts remains unclear. Here, we demonstrate that autophagy is responsible for the elimination of sunlight-damaged, collapsed chloroplasts in Arabidopsis thaliana. We found that vacuolar transport of entire chloroplasts, termed chlorophagy, was induced by UV-B damage to the chloroplast apparatus. This transport did not occur in autophagy-defective atg mutants, which exhibited UV-B-sensitive phenotypes and accumulated collapsed chloroplasts. Use of a fluorescent protein marker of the autophagosomal membrane allowed us to image autophagosome-mediated transport of entire chloroplasts to the central vacuole. In contrast to sugar starvation, which preferentially induced a distinct type of chloroplast-targeted autophagy that transports part of the stroma via the Rubisco-containing body (RCB) pathway, photooxidative damage induced chlorophagy without prior activation of RCB production. We further showed that chlorophagy is induced by chloroplast damage caused by either artificial visible light or natural sunlight. Thus, this report establishes that an autophagic process eliminates entire chloroplasts in response to light-induced damage. PMID:28123106

  12. Traceability System For Agricultural Productsbased on Rfid and Mobile Technology

    NASA Astrophysics Data System (ADS)

    Sugahara, Koji

    In agriculture, food traceability systems and risk management systems must be established and integrated in order to improve food safety across the entire food chain. An integrated traceability system for agricultural products was developed, based on RFID and mobile computing technology. In order to identify individual products efficiently during distribution, small RFID tags with unique IDs and handheld RFID readers were applied. During distribution, the RFID tags are checked using the readers, and transit records of the products are stored to the database via wireless LAN. Regarding agricultural production, recent incidents of pesticide misuse affect consumer confidence in food safety. The Navigation System for Appropriate Pesticide Use (Nouyaku-navi) was developed, which is available in the fields via Internet cell phones. Based on it, agricultural risk management systems have been developed. These systems collaborate with traceability systems and can be applied to process control and risk management in agriculture.
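
    A minimal sketch of the transit-record idea, in which readers report (tag ID, location) events that accumulate into a traceable history, might look like the following Python/SQLite fragment. The schema, field names, and sample data are hypothetical, not taken from the described system.

      # Hypothetical transit-record store: each RFID read appends one row;
      # tracing a product is a query ordered by read time.
      import sqlite3, datetime

      db = sqlite3.connect("traceability.db")
      db.execute("""CREATE TABLE IF NOT EXISTS transit (
          tag_id   TEXT NOT NULL,   -- unique ID stored on the RFID tag
          location TEXT NOT NULL,   -- reader's station on the distribution chain
          read_at  TEXT NOT NULL)""")

      def record_read(tag_id, location):
          """Called whenever a reader checks a tag; stores one transit record."""
          db.execute("INSERT INTO transit VALUES (?, ?, ?)",
                     (tag_id, location, datetime.datetime.now().isoformat()))
          db.commit()

      def trace(tag_id):
          """Reconstruct a product's path from its accumulated transit records."""
          return db.execute("SELECT location, read_at FROM transit "
                            "WHERE tag_id = ? ORDER BY read_at", (tag_id,)).fetchall()

      record_read("JP-0001", "packing house")
      record_read("JP-0001", "wholesale market")
      print(trace("JP-0001"))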

  13. The implementation of microstructural and heat treatment models to development of forming technology of critical aluminum-alloy parts

    NASA Astrophysics Data System (ADS)

    Biba, Nikolay; Alimov, Artem; Shitikov, Andrey; Stebunov, Sergei

    2018-05-01

    The demand for high-performance and energy-efficient transportation systems has boosted interest in lightweight design solutions. To achieve maximum weight reductions, it is not enough just to replace steel parts with their aluminium analogues; it is necessary to change the entire concept of vehicle design. In this case we must develop methods for manufacturing a variety of critical parts with unusual and difficult-to-produce shapes. The mechanical properties of the material in these parts must also be optimised and tightly controlled to provide the best distribution within the part volume. The only way to achieve these goals is to implement technology development methods based on simulation of the entire manufacturing chain, from preparing a billet through the forming operations and heat treatment of the product. The paper presents an approach to such technology development. The simulation of the technological chain starts with extruding a round billet. Depending on the extrusion process parameters, the billet can have different levels of material workout and variation of grain size throughout the volume. After extrusion, the billet is formed into the required shape in a forging process. The main requirements at this stage are to obtain the near-net shape of the product without defects and to provide a proper configuration of grain flow that strengthens the product in the most critical direction. The product then undergoes solution treatment, quenching, and ageing. The simulation of all these stages is performed by the QForm FEM code, which provides thermo-mechanically coupled simulation of material deformation during extrusion and forging. To provide microstructure and heat treatment simulation, special subroutines have been developed by the authors. The proposed approach is illustrated by an industrial case study.

  14. AVE-SESAME program for the REEDA System

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.

    1981-01-01

    The REEDA system software was modified and improved to process the AVE-SESAME severe storm data. A random access file system for the AVE storm data was designed, tested, and implemented. The AVE/SESAME software was modified to incorporate the random access file input and to interface with new graphics hardware/software now available on the REEDA system. Software was developed to graphically display the AVE/SESAME data in the convention normally used by severe storm researchers. Software was converted to the AVE/SESAME software systems and interfaced with existing graphics hardware/software available on the REEDA system. Software documentation was provided for existing AVE/SESAME programs, outlining functional flow charts and interactive questions. All AVE/SESAME data sets in random access format were processed to allow the developed software to access the entire AVE/SESAME data base. The existing software was modified to allow for processing of different AVE/SESAME data set types, including satellite, surface, and radar data.

  15. Bio-inspired piezoelectric artificial hair cell sensor fabricated by powder injection molding

    NASA Astrophysics Data System (ADS)

    Han, Jun Sae; Oh, Keun Ha; Moon, Won Kyu; Kim, Kyungseop; Joh, Cheeyoung; Seo, Hee Seon; Bollina, Ravi; Park, Seong Jin

    2015-12-01

    A piezoelectric artificial hair cell sensor was fabricated by the powder injection molding process in order to make an acoustic vector hydrophone. The entire powder injection molding process was developed and optimized for PMN-PZT ceramic powder. The artificial hair cell sensor, which consists of a high-aspect-ratio hair cell and three rectangular mechanoreceptors, was precisely fabricated through the developed powder injection molding process. The fabricated sensor reaches 98% of the theoretical density and 85% of the reference dielectric property of PMN-PZT ceramic powder. With regard to homogeneity, the three rectangular mechanoreceptors have the same dimensions within a 3 μm tolerance and an 8% deviation in dielectric property. Packaged vector hydrophones measure underwater acoustic signals from 500 to 800 Hz with a sensitivity of -212 dB. The directivity of the vector hydrophone was acquired at 600 Hz by analyzing the phase differences of the electrical signals.

  16. The formation of blobs from a pure interchange process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, P., E-mail: pzhu@ustc.edu.cn; Department of Engineering Physics, University of Wisconsin-Madison, Madison, Wisconsin 53706; Sovinec, C. R.

    2015-02-15

    In this work, we focus on examining a pure interchange process in a shear-less slab configuration as a prototype mechanism for blob formation. We employ full magnetohydrodynamic simulations to demonstrate that blob-like structures can emerge through the nonlinear development of a pure interchange instability originating from a pedestal-like transition region. In the early nonlinear stage, filamentary structures develop and extend in the direction of the effective gravity. The blob-like structures appear when the radially extending filaments break off and disconnect from the core plasma. The morphology and the dynamics of these filaments and blobs vary dramatically, with a sensitive dependence on the dissipation mechanisms in the system and the initial perturbation. Despite the complexity in morphology and dynamics, the nature of the entire blob formation process in the shear-less slab configuration remains strictly interchange, without involving any change in magnetic topology.

  17. Dynamics of biochemical processes and redox conditions in geochemically linked landscapes of oligotrophic bogs

    NASA Astrophysics Data System (ADS)

    Inisheva, L. I.; Szajdak, L.; Sergeeva, M. A.

    2016-04-01

    The biological activity in oligotrophic peatlands at the margins of the Vasyugan Mire has been studied. It is found that differently directed biochemical processes manifest themselves in the entire peat profile down to the underlying mineral substrate, and their activity is highly variable. It is argued that the notion of active and inert layers in peat soils is only applicable to the description of their water regime. The degree of biochemical activity is specified by the physical soil properties. As a result of the biochemical processes, a micromosaic aerobic-anaerobic medium develops under the surface waterlogged layer of peat deposits. This layer contains the gas phase, including oxygen. It is concluded that the organic and mineral parts of peat bogs represent a single functional system of a genetic peat profile with a clear record of the history of its development.

  18. Indicator methods to evaluate the hygienic performance of industrial scale operating Biowaste Composting Plants.

    PubMed

    Martens, Jürgen

    2005-01-01

    The hygienic performance of biowaste composting plants in ensuring the quality of compost is of high importance. Existing compost quality assurance systems reflect this importance through intensive testing of hygienic parameters. In many countries, compost quality assurance systems are under construction, and it is necessary to check and optimize the methods used to state the hygienic performance of composting plants. A set of indicator methods to evaluate the hygienic performance of normally operating biowaste composting plants was developed. The indicator methods were developed by investigating temperature measurements from indirect process tests at 23 composting plants belonging to 11 design types of the Hygiene Design Type Testing System of the German Compost Quality Association (BGK e.V.). The presented indicator methods are the grade of hygienization, the basic curve shape, and the hygienic risk area. The temperature courses of single plants are not normally distributed, but they were grouped by cluster analysis into normally distributed subgroups; this was a precondition for developing the indicator methods. For each plant, the grade of hygienization was calculated through transformation into the standard normal distribution. It gives the percentage of the entire data set that meets the legal temperature requirements. The hygienization grade differs widely within the design types and falls below 50% for about one fourth of the plants. The subgroups are divided visually into basic curve shapes, which stand for different process courses. For each plant, the composition of the entire data set out of the various basic curve shapes can be used as an indicator of the basic process conditions. Some basic curve shapes indicate abnormal process courses, which can be remedied through process optimization. A hygienic risk area concept using the 90% range of variation of the normal temperature courses was introduced. Comparing the design-type range of variation with the legal temperature defaults revealed hygienic risk areas along the temperature courses, which could be minimized through process optimization. The hygienic risk areas of four design types show a suboptimal hygienic performance.
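
    A minimal sketch of the grade-of-hygienization computation as described, namely fitting a normal distribution to a plant's temperature readings and reporting the share expected to meet the legal requirement, is given below. The 55 °C threshold and the sample readings are assumptions for illustration, since the actual legal defaults depend on the regulation.

      # Grade of hygienization via transformation into the standard normal
      # distribution: percent of (assumed normal) readings at or above threshold.
      import math, statistics

      def grade_of_hygienization(temps, threshold=55.0):
          """Percent of the fitted normal distribution at or above `threshold`."""
          mu = statistics.mean(temps)
          sigma = statistics.stdev(temps)
          z = (threshold - mu) / sigma           # transform into standard normal
          share_above = 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
          return 100.0 * share_above

      readings = [52.1, 58.4, 61.0, 55.3, 49.8, 63.2, 57.7, 54.0]
      print(f"{grade_of_hygienization(readings):.1f}% of readings meet the threshold")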

  19. Writing next-generation display photomasks

    NASA Astrophysics Data System (ADS)

    Sandstrom, Tor; Wahlsten, Mikael; Park, Youngjin

    2016-10-01

    Recent years have seen fast technical development within the display area. Displays get ever higher pixel density and the pixels get smaller. Current displays have over 800 PPI, and market forces will eventually drive densities of 2000 PPI or higher. The transistor backplanes also get more complex: OLED displays require 4-7 transistors per pixel instead of the typical 1-2 transistors used for LCDs, and they are significantly more sensitive to errors. New large-area mask writers have been developed for masks used in high-volume production of screens for state-of-the-art smartphones. Redesigned laser optics with higher NA and lower aberrations improve resolution and CD uniformity and reduce mura effects. The number of beams has been increased to maintain throughput despite the higher writing resolution. OLED displays are highly sensitive to placement errors, and registration in the writers has been improved. To verify the registration of produced masks, a separate metrology system has been developed. The metrology system is self-calibrated to high accuracy. The calibration is repeatable across machines and sites using Z-correction. The repeatability of the coordinate system makes it possible to standardize the coordinate system across an entire supply chain, or indeed across the entire industry. In-house metrology is a commercial necessity for a high-end mask shop, but the users of the masks, the panel makers, would also benefit from having in-house metrology. It would act as the reference for their mask suppliers, give better predictive and post-mortem diagnostic power for the panel process, and the metrology could be used to characterize and improve the entire production loop from data to panel.

  20. The Nasa-Isro SAR Mission Science Data Products and Processing Workflows

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Agram, P. S.; Lavalle, M.; Cohen, J.; Buckley, S.; Kumar, R.; Misra-Ray, A.; Ramanujam, V.; Agarwal, K. M.

    2017-12-01

    The NASA-ISRO SAR (NISAR) Mission is currently in the development phase and in the process of specifying its suite of data products and algorithmic workflows, responding to inputs from the NISAR Science and Applications Team. NISAR will provide raw data (Level 0), full-resolution complex imagery (Level 1), and interferometric and polarimetric image products (Level 2) for the entire data set, in both natural radar and geocoded coordinates. NASA and ISRO are coordinating the formats, metadata layers, and algorithms for these products, for both the NASA-provided L-band radar and the ISRO-provided S-band radar. Higher-level products will also be generated for the purpose of calibration and validation over large areas of Earth, including tectonic plate boundaries, ice sheets and sea ice, and areas of ecosystem disturbance and change. This level of comprehensive product generation is unprecedented for SAR missions and leads to storage and processing challenges for the production system and the archive center. Further, recognizing the potential to support applications that require low-latency product generation and delivery, the NISAR team is optimizing the entire end-to-end ground data system for such response, including exploring the advantages of cloud-based processing, algorithmic acceleration using GPUs, and on-demand processing schemes that minimize computational and transport costs but allow rapid delivery to science and applications users. This paper reviews the current products and workflows and discusses the scientific and operational trade space of mission capabilities.

  1. Thermal Infrared Radiometric Calibration of the Entire Landsat 4, 5, and 7 Archive (1982-2010)

    NASA Technical Reports Server (NTRS)

    Schott, John R.; Hook, Simon J.; Barsi, Julia A.; Markham, Brian L.; Miller, Jonathan; Padula, Francis P.; Raqueno, Nina G.

    2012-01-01

    Landsat's continuing record of the thermal state of the earth's surface represents the only long-term (1982 to the present) global record with spatial scales appropriate for human-scale studies (i.e., tens of meters). Temperature drives many of the physical and biological processes that impact the global and local environment. As our knowledge of, and interest in, the role of temperature in these processes have grown, the value of Landsat data to monitor trends and processes has also grown. The value of the Landsat thermal data archive will continue to grow as we develop more effective ways to study the long-term processes and trends affecting the planet. However, in order to take proper advantage of the thermal data, we need to be able to convert the data to surface temperatures. A critical step in this process is to have the entire archive completely and consistently calibrated into absolute radiance so that it can be atmospherically compensated to surface-leaving radiance and then to surface radiometric temperature. This paper addresses the methods and procedures that have been used to perform the radiometric calibration of the earliest sizable thermal data set in the archive (Landsat 4 data). The completion of this effort, along with the updated calibration of the earlier (1985-1999) Landsat 5 data, also reported here, concludes a comprehensive calibration of the Landsat thermal archive of data from 1982 to the present.
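
    For readers unfamiliar with the calibration step, the standard two-stage conversion runs from digital number to at-sensor spectral radiance (a linear gain/bias) and then to at-sensor brightness temperature via the inverted Planck-style equation T = K2 / ln(K1/L + 1). A small worked example follows; the gain and bias are placeholders, while K1 and K2 are the published Landsat 5 TM band 6 constants, quoted for illustration (atmospheric compensation, discussed above, is a further step).

      # Digital number -> at-sensor radiance -> at-sensor brightness temperature.
      import math

      def dn_to_radiance(dn, gain, bias):
          """L = gain * DN + bias, in W/(m^2 sr um)."""
          return gain * dn + bias

      def radiance_to_brightness_temp(L, K1=607.76, K2=1260.56):
          """Invert the Planck-style calibration equation: T = K2 / ln(K1/L + 1)."""
          return K2 / math.log(K1 / L + 1.0)

      # hypothetical calibration slope/offset for a thermal-band scene
      L = dn_to_radiance(dn=140, gain=0.055, bias=1.18)
      print(f"radiance {L:.2f} W/(m^2 sr um) -> {radiance_to_brightness_temp(L):.1f} K")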

  2. Bio-markers: traceability in food safety issues.

    PubMed

    Raspor, Peter

    2005-01-01

    Research and practice are focusing on the development, validation, and harmonization of technologies and methodologies to ensure a complete traceability process throughout the food chain. The main goals are: scale-up, implementation, and validation of methods in whole food chains; assurance of authenticity; validity of labelling; and application of HACCP (hazard analysis and critical control point) to the entire food chain. The current review sums up the scientific and technological basis for ensuring complete traceability. Tracing and tracking (traceability) of foods are complex processes because of the (bio)markers, technical solutions, and differing circumstances in the technologies that produce various foods (processed, semi-processed, or raw). Since food is produced for human or animal consumption, we need suitable markers that are stable and traceable all along the production chain. Specific biomarkers can have a function in technology and in nutrition. Such an approach would make this development faster and more comprehensive, and would make it possible to monitor food effects with the same set of biomarkers in the consumer. This would help to develop and implement food safety standards based on the real physiological function of particular food components.

  3. Mechanisms of Molecular Mimicry of Plant CLE Peptide Ligands by the Parasitic Nematode Globodera rostochiensis

    PubMed Central

    Guo, Yongfeng; Ni, Jun; Denver, Robert; Wang, Xiaohong; Clark, Steven E.

    2011-01-01

    Nematodes that parasitize plant roots cause huge economic losses and have few mechanisms for control. Many parasitic nematodes infect plants by reprogramming root development to drive the formation of feeding structures. How nematodes take control of plant development is largely unknown. Here, we identify two host factors involved in the function of a receptor ligand mimic, GrCLE1, secreted by the potato cyst nematode Globodera rostochiensis. GrCLE1 is correctly processed to an active form by host plant proteases. Processed GrCLE1 peptides bind directly to the plant CLE receptors CLV2, BAM1, and BAM2. Involvement of these receptors in the ligand-mimicking process is also supported by the fact that the ability of GrCLE1 peptides to alter plant root development in Arabidopsis (Arabidopsis thaliana) is dependent on these receptors. Critically, we also demonstrate that GrCLE1 maturation can be entirely carried out by plant factors and that the availability of CLE processing activity may be essential for successful ligand mimicry. PMID:21750229

  4. Mechanisms of molecular mimicry of plant CLE peptide ligands by the parasitic nematode Globodera rostochiensis.

    PubMed

    Guo, Yongfeng; Ni, Jun; Denver, Robert; Wang, Xiaohong; Clark, Steven E

    2011-09-01

    Nematodes that parasitize plant roots cause huge economic losses and have few mechanisms for control. Many parasitic nematodes infect plants by reprogramming root development to drive the formation of feeding structures. How nematodes take control of plant development is largely unknown. Here, we identify two host factors involved in the function of a receptor ligand mimic, GrCLE1, secreted by the potato cyst nematode Globodera rostochiensis. GrCLE1 is correctly processed to an active form by host plant proteases. Processed GrCLE1 peptides bind directly to the plant CLE receptors CLV2, BAM1, and BAM2. Involvement of these receptors in the ligand-mimicking process is also supported by the fact that the ability of GrCLE1 peptides to alter plant root development in Arabidopsis (Arabidopsis thaliana) is dependent on these receptors. Critically, we also demonstrate that GrCLE1 maturation can be entirely carried out by plant factors and that the availability of CLE processing activity may be essential for successful ligand mimicry.

  5. High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. Moreover, a range of both polar and non-polar chemotypes are instantaneously detected. The result is identification and species level classification based on the entire DART-MS spectrum. In this paper, we illustrate how the method can be used to: (1) distinguish between endangered woods regulated by the Convention for the International Trade of Endangered Flora and Fauna (CITES) treaty; (2) assess the origin and by extension the properties of biodiesel feedstocks; (3) determine insect species from analysis of puparial casings; (4) distinguish between psychoactive plants products; and (5) differentiate between Eucalyptus species. An advantage of the hierarchical clustering approach to processing of the DART-MS derived fingerprint is that it shows both similarities and differences between species based on their chemotypes. Furthermore, full knowledge of the identities of the constituents contained within the small molecule profile of analyzed samples is not required.
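
    A minimal sketch of the unsupervised hierarchical clustering step applied to spectral fingerprints might look like the following; the tiny binned "spectra", the cosine metric, and average linkage are illustrative assumptions, not the authors' exact settings.

      # Hierarchical clustering of normalized, binned mass spectral fingerprints.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      # rows = samples, columns = intensity in fixed m/z bins (normalized)
      spectra = np.array([
          [0.9, 0.1, 0.0, 0.3],   # species A, replicate 1
          [0.8, 0.2, 0.1, 0.3],   # species A, replicate 2
          [0.1, 0.7, 0.9, 0.0],   # species B, replicate 1
          [0.0, 0.8, 0.8, 0.1],   # species B, replicate 2
      ])
      labels = ["A1", "A2", "B1", "B2"]

      Z = linkage(pdist(spectra, metric="cosine"), method="average")
      clusters = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into 2 groups
      for name, c in zip(labels, clusters):
          print(name, "-> cluster", c)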

  6. High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    DOE PAGES

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; ...

    2015-07-09

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. Moreover, a range of both polar and non-polar chemotypes are instantaneously detected. The result is identification and species level classification based on the entire DART-MS spectrum. In this paper, we illustrate how the method can be used to: (1) distinguish between endangered woods regulated by the Convention for the International Trade of Endangered Flora and Fauna (CITES) treaty; (2) assess the origin and by extension the properties of biodiesel feedstocks; (3) determine insect species from analysis of puparial casings; (4) distinguish between psychoactive plants products; and (5) differentiate between Eucalyptus species. An advantage of the hierarchical clustering approach to processing of the DART-MS derived fingerprint is that it shows both similarities and differences between species based on their chemotypes. Furthermore, full knowledge of the identities of the constituents contained within the small molecule profile of analyzed samples is not required.

  7. A High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    PubMed Central

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.

    2015-01-01

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. A range of both polar and non-polar chemotypes are instantaneously detected. The result is identification and species level classification based on the entire DART-MS spectrum. Here, we illustrate how the method can be used to: (1) distinguish between endangered woods regulated by the Convention for the International Trade of Endangered Flora and Fauna (CITES) treaty; (2) assess the origin and by extension the properties of biodiesel feedstocks; (3) determine insect species from analysis of puparial casings; (4) distinguish between psychoactive plants products; and (5) differentiate between Eucalyptus species. An advantage of the hierarchical clustering approach to processing of the DART-MS derived fingerprint is that it shows both similarities and differences between species based on their chemotypes. Furthermore, full knowledge of the identities of the constituents contained within the small molecule profile of analyzed samples is not required. PMID:26156000

  8. A High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    NASA Astrophysics Data System (ADS)

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.

    2015-07-01

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. A range of both polar and non-polar chemotypes are instantaneously detected. The result is identification and species level classification based on the entire DART-MS spectrum. Here, we illustrate how the method can be used to: (1) distinguish between endangered woods regulated by the Convention for the International Trade of Endangered Flora and Fauna (CITES) treaty; (2) assess the origin and by extension the properties of biodiesel feedstocks; (3) determine insect species from analysis of puparial casings; (4) distinguish between psychoactive plants products; and (5) differentiate between Eucalyptus species. An advantage of the hierarchical clustering approach to processing of the DART-MS derived fingerprint is that it shows both similarities and differences between species based on their chemotypes. Furthermore, full knowledge of the identities of the constituents contained within the small molecule profile of analyzed samples is not required.

  9. Policymaking in European healthy cities.

    PubMed

    de Leeuw, Evelyne; Green, Geoff; Spanswick, Lucy; Palmer, Nicola

    2015-06-01

    This paper assesses policy development in, with, and for Healthy Cities in the European Region of the World Health Organization. Materials for the assessment were sourced through case studies, a questionnaire, and statistical databases. They were compiled in a realist synthesis methodology, applying theory-based evaluation principles. Non-response analyses were applied to ascertain how representative the high response rates were of the entire network of Healthy Cities in Europe. Further measures of reliability and validity were applied, and it was found that our material was indicative of the entire network. European Healthy Cities are successful in developing local health policy across many sectors within and outside government. They were also successful in addressing 'wicked' problems around equity, governance, and participation in themes such as Healthy Urban Planning. It appears that strong local leadership for policy change is driven by international collaboration and the stewardship of the World Health Organization. The processes enacted by WHO, structuring membership of the Healthy City Network (designation) and the guidance on particular themes, are identified as being important for the success of local policy development.

  10. 7 CFR 51.3416 - Classification of defects.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Maximum allowed for U.S. No. 2 processing Occurring outside of or not entirely confined to the vascular ring Internal Black Spot, Internal Discoloration, Vascular Browning, Fusarium Wilt, Net Necrosis, Other Necrosis, Stem End Browning 5% waste 10% waste. Occurring entirely within the vascular ring Hollow Heart or...

  11. Theoretical study of optical pump process in solid gain medium based on four-energy-level model

    NASA Astrophysics Data System (ADS)

    Ma, Yongjun; Fan, Zhongwei; Zhang, Bin; Yu, Jin; Zhang, Hongbo

    2018-04-01

    A semiclassical algorithm is applied to a four-energy-level model, aiming to identify the factors that affect the dynamic behavior during the pump process. The impacts of the pump intensity Ω_p, the non-radiative transition rate γ_43, and the decay rate of the electric dipole δ_14 are discussed in detail. The calculation results show that a large γ_43, a small δ_14, and strong pumping Ω_p are beneficial to establishing population inversion. Under strong pumping conditions, the entire pump process can be divided into four different phases, tentatively named the far-from-equilibrium process, the Rabi oscillation process, the quasi-dynamic-equilibrium process, and the 'equilibrium' process. The Rabi oscillation can slow the pumping process and cause some instability. Moreover, the duration of the entire process is negatively related to Ω_p and γ_43 and positively related to δ_14.
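
    To see where the Rabi-oscillation phase comes from, the following reduced sketch integrates on-resonance two-level Bloch equations with dipole damping. It is a deliberate caricature of the four-level model (all symbols and values are illustrative), but it reproduces the qualitative point that strong pumping drives oscillatory inversion while a larger dipole decay rate damps the oscillation.

      # Two-level caricature: dv/dt = -delta*v + omega*w, dw/dt = -omega*v,
      # where w is the population inversion and v the coherence.
      import numpy as np

      def pump(omega=5.0, delta=0.5, t_end=10.0, dt=1e-3):
          """Forward-Euler integration of the damped Rabi system."""
          steps = int(t_end / dt)
          v, w = 0.0, -1.0          # start with all population in the lower level
          trace = np.empty(steps)
          for i in range(steps):
              dv = -delta * v + omega * w
              dw = -omega * v
              v += dt * dv
              w += dt * dw
              trace[i] = w
          return trace

      for delta in (0.2, 2.0):
          w = pump(delta=delta)
          print(f"delta={delta}: inversion swings between {w.min():.2f} and {w.max():.2f}")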

  12. Institutional plan. Fiscal year, 1997--2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-10-01

    The Institutional Plan is the culmination of Argonne's annual planning cycle. The document outlines what Argonne National Laboratory (ANL) regards as the optimal development of programs and resources in the context of national research and development needs, the missions of the Department of Energy and Argonne National Laboratory, and pertinent resource constraints. It is the product of ANL's internal planning process and extensive discussions with DOE managers. Strategic planning is important for all of Argonne's programs, and coordination of planning for the entire institution is crucial. This Institutional Plan will increasingly reflect the planning initiatives that have recently been implemented.

  13. Fragment-based Quantum Mechanical/Molecular Mechanical Simulations of Thermodynamic and Kinetic Process of the Ru2+-Ru3+ Self-Exchange Electron Transfer.

    PubMed

    Zeng, Xiancheng; Hu, Xiangqian; Yang, Weitao

    2012-12-11

    A fragment-based fractional number of electrons (FNE) approach is developed to study entire electron transfer (ET) processes from the electron donor region to the acceptor region in the condensed phase. Both regions are described by the density-fragment interaction (DFI) method, while the FNE serves as an efficient ET order parameter for simulating the electron transfer process. In association with the QM/MM energy expression, the DFI-FNE method is demonstrated to describe ET processes robustly, with the Ru2+-Ru3+ self-exchange ET as a proof-of-concept example. The method allows for systematic calculations of redox free energies, reorganization energies, electronic couplings, and absolute ET rate constants within the Marcus regime.
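
    For reference, the Marcus-regime rate expression the abstract alludes to is k_ET = (2π/ħ) |H_AB|² (4πλk_BT)^(-1/2) exp[-(ΔG° + λ)² / (4λk_BT)], which for a self-exchange reaction has ΔG° = 0. The small Python calculation below evaluates it; the coupling and reorganization energy are illustrative placeholders, not the paper's computed values.

      # Marcus nonadiabatic ET rate, all energies in eV.
      import math

      HBAR = 6.5821e-16   # eV s
      KB   = 8.6173e-5    # eV/K

      def marcus_rate(h_ab, lam, dg=0.0, T=300.0):
          """k_ET = (2*pi/hbar) * H_AB^2 / sqrt(4*pi*lam*kB*T) * exp(-(dG+lam)^2 / (4*lam*kB*T))."""
          prefactor = (2.0 * math.pi / HBAR) * h_ab**2
          fc = math.exp(-(dg + lam) ** 2 / (4.0 * lam * KB * T)) / math.sqrt(
              4.0 * math.pi * lam * KB * T)
          return prefactor * fc   # s^-1

      # self-exchange: dG = 0; try a weak coupling and a ~2 eV reorganization energy
      print(f"k_ET ~ {marcus_rate(h_ab=0.005, lam=2.0):.3e} s^-1")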

  14. Romania program targets methanol and Fischer-Tropsch research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-03-01

    Currently, the organic chemical industry and the petrochemical and engine fuels industry in Romania are entirely based on hydrocarbons from oil. To reduce the oil dependence of this sector and to ensure the stipulated growth rate of 8-9%, research and development programs have been set up with a view to diversifying raw materials. In research on hydrocarbons from alcohol conversion, three process variants are known: olefins from methanol, gasolines from methanol, and combined gasolines and aromatic hydrocarbons from methanol. The Romanian process of methanol conversion to hydrocarbons is very flexible, with all the variants mentioned being carried out in the same plant by modifying the catalysts. In research on hydrocarbons from synthesis gas, a modern process is being developed for gasification of brown coal in a fluidized bed, under pressure, in the presence of oxygen and water vapors. In the field of carbon oxide hydrogenation, studies have been carried out on selective Fischer-Tropsch processes in which the reaction products are high-value hydrocarbon fractions.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objective of the contract is to consolidate the advances made during the previous contract in the conversion of syngas to motor fuels using Molecular Sieve-containing catalysts and to demonstrate the practical utility and economic value of the new catalyst/process systems with appropriate laboratory runs. Work on the program is divided into the following six tasks: (1) preparation of a detailed work plan covering the entire performance of the contract; (2) preliminary techno-economic assessment of the UCC catalyst/process system; (3) optimization of the most promising catalyst developed under prior contract; (4) optimization of the UCC catalyst system in a manner that will give it the longest possible service life; (5) optimization of a UCC process/catalyst system based upon a tubular reactor with a recycle loop containing the most promising catalyst developed under Tasks 3 and 4 studies; and (6) economic evaluation of the optimal performance found under Task 5 for the UCC process/catalyst system. Progress reports are presented for tasks 2 through 5. 232 figs., 19 tabs.

  16. The process development of laser surface modification of commercially pure titanium (Grade 2) with rhenium

    NASA Astrophysics Data System (ADS)

    Kobiela, K.; Smolina, I.; Dziedzic, R.; Szymczyk, P.; Kurzynowski, T.; Chlebus, E.

    2016-12-01

    The paper presents the results of the process development of laser surface modification of commercially pure titanium with rhenium. The criterion for a successful/optimal process is repeatable surface geometry, characterized by a predictable and repeatable chemical composition over the entire surface as well as special mechanical properties (hardness and wear resistance). The analysis of surface geometry included measurements of laser penetration depth and the heat-affected zone (HAZ), the width of a single track, and the width of a clad. A diode laser installed on an industrial robot carried out the laser treatment; this solution made possible the continuous supply of powder to the substrate during the process. The aim of the investigation is to find out whether the tribological characteristics of the surface can be improved by rhenium alloying. The verification of the surface (tribological) properties included geometry measurements, microstructure observation, hardness tests, and evaluation of wear resistance.

  17. Multi-scale process and supply chain modelling: from lignocellulosic feedstock to process and products

    PubMed Central

    Hosseini, Seyed Ali; Shah, Nilay

    2011-01-01

    There is a large body of literature regarding the choice and optimization of different processes for converting feedstock to bioethanol and bio-commodities; moreover, there has been some reasonable technological development in bioconversion methods over the past decade. However, the eventual cost and other important metrics relating to sustainability of biofuel production will be determined not only by the performance of the conversion process, but also by the performance of the entire supply chain from feedstock production to consumption. Moreover, in order to ensure world-class biorefinery performance, both the network and the individual components must be designed appropriately, and allocation of resources over the resulting infrastructure must effectively be performed. The goal of this work is to describe the key challenges in bioenergy supply chain modelling and then to develop a framework and methodology to show how multi-scale modelling can pave the way to answer holistic supply chain questions, such as the prospects for second generation bioenergy crops. PMID:22482032

  18. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: look at what you do NOT want software to do along with what you want it to do, and assume things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This document is a review journal that covers significant developments in the field of nuclear safety. Its scope includes the analysis and control of hazards associated with nuclear energy, operations involving fissionable materials, and the products of nuclear fission and their effects on the environment. Primary emphasis is on safety in reactor design, construction, and operation; however, the safety aspects of the entire fuel cycle, including fuel fabrication, spent-fuel processing, nuclear waste disposal, handling of radioisotopes, and environmental effects of these operations, are also treated.

  20. The Neurolab mission and biomedical engineering: a partnership for the future.

    PubMed

    Liskowsky, D R; Frey, M A; Sulzman, F M; White, R J; Likowsky, D R

    1996-01-01

    Over the last five years, with the advent of flights of U.S. Shuttle/Spacelab missions dedicated entirely to life sciences research, the opportunities for conducting serious studies that use a fully outfitted space laboratory to better understand basic biological processes have increased. The last of this series of Shuttle/Spacelab missions, currently scheduled for 1998, is dedicated entirely to neuroscience and behavioral research. The mission, named Neurolab, includes a broad range of experiments that build on previous research efforts, as well as studies related to less mature areas of space neuroscience. The Neurolab mission provides the global scientific community with the opportunity to use the space environment for investigations that exploit microgravity to increase our understanding of basic processes in neuroscience. The results from this premier mission should lead to a significant advancement in the field as a whole and to the opening of new lines of investigation for future research. Experiments under development for this mission will utilize human subjects as well as a variety of other species. The capacity to carry out detailed experiments on both human and animal subjects in space allows a diverse complement of studies that investigate functional changes and their underlying molecular, cellular, and physiological mechanisms. In order to conduct these experiments, a wide array of biomedical instrumentation will be used, including some instruments and devices being developed especially for the mission.

  1. The Neurolab mission and biomedical engineering: a partnership for the future

    NASA Technical Reports Server (NTRS)

    Liskowsky, D. R.; Frey, M. A.; Sulzman, F. M.; White, R. J.; Likowsky, D. R.

    1996-01-01

    Over the last five years, with the advent of flights of U.S. Shuttle/Spacelab missions dedicated entirely to life sciences research, the opportunities for conducting serious studies that use a fully outfitted space laboratory to better understand basic biological processes have increased. The last of this series of Shuttle/Spacelab missions, currently scheduled for 1998, is dedicated entirely to neuroscience and behavioral research. The mission, named Neurolab, includes a broad range of experiments that build on previous research efforts, as well as studies related to less mature areas of space neuroscience. The Neurolab mission provides the global scientific community with the opportunity to use the space environment for investigations that exploit microgravity to increase our understanding of basic processes in neuroscience. The results from this premier mission should lead to a significant advancement in the field as a whole and to the opening of new lines of investigation for future research. Experiments under development for this mission will utilize human subjects as well as a variety of other species. The capacity to carry out detailed experiments on both human and animal subjects in space allows a diverse complement of studies that investigate functional changes and their underlying molecular, cellular, and physiological mechanisms. In order to conduct these experiments, a wide array of biomedical instrumentation will be used, including some instruments and devices being developed especially for the mission.

  2. Using machine learning to explore the long-term evolution of GRS 1915+105

    NASA Astrophysics Data System (ADS)

    Huppenkothen, Daniela; Heil, Lucy M.; Hogg, David W.; Mueller, Andreas

    2017-04-01

    Among the population of known Galactic black hole X-ray binaries, GRS 1915+105 stands out in multiple ways. It has been in continuous outburst since 1992, and has shown a wide range of different states that can be distinguished by their timing and spectral properties. These states, also observed in IGR J17091-3624, have in the past been linked to accretion dynamics. Here, we present the first comprehensive study into the long-term evolution of GRS 1915+105, using the entire data set observed with Rossi X-ray Timing Explorer over its 16-yr lifetime. We develop a set of descriptive features allowing for automatic separation of states, and show that supervised machine learning in the form of logistic regression and random forests can be used to efficiently classify the entire data set. For the first time, we explore the duty cycle and time evolution of states over the entire 16-yr time span, and find that the temporal distribution of states has likely changed over the span of the observations. We connect the machine classification with physical interpretations of the phenomenology in terms of chaotic and stochastic processes.
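
    As a sketch of the supervised step described above, the fragment below trains a random forest on per-observation summary features and scores it on held-out data; the two features and the synthetic "state" populations are stand-ins for the paper's descriptive timing and spectral features.

      # Toy state classification with a random forest, in the spirit of the
      # described workflow. Feature names and data are assumptions.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 400
      # columns: fractional rms amplitude, soft/hard hardness ratio
      X = np.vstack([
          rng.normal([0.25, 0.8], 0.05, size=(n // 2, 2)),   # one state's population
          rng.normal([0.10, 1.6], 0.05, size=(n // 2, 2)),   # another state's population
      ])
      y = np.array(["chi"] * (n // 2) + ["rho"] * (n // 2))

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))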

  3. Processes of contaminant accumulation in an Arctic beluga whale population

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hickie, B.E.; Muir, D.; Kingsley, M.

    1995-12-31

    As long-lived top predators in marine food chains, marine mammals accumulate high levels of persistent organic contaminants. While arctic marine mammal contaminant concentrations are lower than those from temperate regions, levels are sufficiently high to be a health concern for people who rely on marine mammals as food. Monitoring programs developed to address this problem and to define spatial and temporal trends are often difficult to interpret, since tissue contaminant concentrations vary with species, age, sex, reproductive effort, and condition (i.e., blubber thickness). It can be difficult to relate contaminant concentrations in other environmental compartments to those in marine mammals, since their residues reflect exposure over their entire life, often 20 to 30 years. Contaminant accumulation models for marine mammals enable us to better understand the importance of, and interaction between, factors affecting contaminant accumulation, and can provide a dynamic framework for interpreting contaminant monitoring data. The authors developed two models for the beluga whale (Delphinapterus leucas): one provides a detailed view of processes at the individual level, the other examines population-based processes. The models quantify uptake, release, and disposition of organic contaminants over the entire lifespan by incorporating all aspects of life history. These models are used together to examine the impact of a variety of factors on the patterns and variability of PCBs found in the West Greenland beluga population (sample size: 696, 729). Factors examined include: energetics, growth, birth rate, lactation, contaminant assimilation and clearance rates, and dietary contaminant concentrations. Results are discussed in relation to the use of marine mammals for monitoring contaminant trends.
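
    An individual-level accumulation model of the general kind described reduces, in its simplest form, to dietary uptake minus first-order clearance with growth dilution. The toy integration below illustrates that structure; the growth curve, uptake rate, and elimination rate are invented for illustration and are far simpler than the life-history detail the authors model (lactation, reproduction, energetics).

      # One-compartment lifetime burden: dB/dt = uptake*M(t) - k_elim*B,
      # with tissue concentration C = B / M diluted by body growth.
      import numpy as np

      def simulate(years=30.0, dt=0.01, uptake=50.0, k_elim=0.05):
          """uptake in ug per kg body mass per year; k_elim in 1/year."""
          t = np.arange(0.0, years, dt)
          mass = 1500.0 * (1.0 - 0.9 * np.exp(-0.3 * t))   # toy growth curve, kg
          B = 0.0
          conc = np.empty_like(t)
          for i, m in enumerate(mass):
              B += dt * (uptake * m - k_elim * B)
              conc[i] = B / m
          return t, conc

      t, conc = simulate()
      print(f"concentration at age 5: {conc[int(5 / 0.01)]:.0f} ug/kg, "
            f"at age 30: {conc[-1]:.0f} ug/kg")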

  4. Real-Time Three-Dimensional Cell Segmentation in Large-Scale Microscopy Data of Developing Embryos.

    PubMed

    Stegmaier, Johannes; Amat, Fernando; Lemon, William C; McDole, Katie; Wan, Yinan; Teodoro, George; Mikut, Ralf; Keller, Philipp J

    2016-01-25

    We present the Real-time Accurate Cell-shape Extractor (RACE), a high-throughput image analysis framework for automated three-dimensional cell segmentation in large-scale images. RACE is 55-330 times faster and 2-5 times more accurate than state-of-the-art methods. We demonstrate the generality of RACE by extracting cell-shape information from entire Drosophila, zebrafish, and mouse embryos imaged with confocal and light-sheet microscopes. Using RACE, we automatically reconstructed cellular-resolution tissue anisotropy maps across developing Drosophila embryos and quantified differences in cell-shape dynamics in wild-type and mutant embryos. We furthermore integrated RACE with our framework for automated cell lineaging and performed joint segmentation and cell tracking in entire Drosophila embryos. RACE processed these terabyte-sized datasets on a single computer within 1.4 days. RACE is easy to use, as it requires adjustment of only three parameters, takes full advantage of state-of-the-art multi-core processors and graphics cards, and is available as open-source software for Windows, Linux, and Mac OS.

  5. Automated ammunition logistics for the Crusader program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speaks, D.M.; Kring, C.T.; Lloyd, P.D.

    1997-03-01

    The US Army's next generation artillery system is called the Crusader. A self-propelled howitzer and a resupply vehicle constitute the Crusader system, which will be designed for improved mobility, increased firepower, and greater survivability than current generation vehicles. The Army's Project Manager, Crusader, gave Oak Ridge National Laboratory (ORNL) the task of developing and demonstrating a concept for the resupply vehicle. The resupply vehicle is intended to sustain the howitzer with ammunition and fuel and will significantly increase capabilities over those of current resupply vehicles. Ammunition is currently processed and transferred almost entirely by hand. ORNL identified and evaluated various concepts for automated upload, processing, storage, docking and delivery. Each of the critical technologies was then developed separately and demonstrated on discrete test platforms. An integrated technology demonstrator, incorporating each of the individual technology components to realistically simulate performance of the selected vehicle concept, was developed and successfully demonstrated for the Army.

  6. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment - test construction, test implementation, results evaluation, storage and maintenance of the developed database, and its statistical processing, analysis, and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for automating certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis, and interpretation.
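
    In this set-relational spirit, a personality test can be sketched as finite sets plus a relation assigning keyed items to scales, with scoring expressed as selection and aggregation over that relation. The Python fragment below is a hypothetical miniature with invented sets and data, not the article's actual model.

      # Finite sets of items and scales, a keyed item-to-scale relation,
      # and raw-score evaluation as selection + aggregation over it.
      Items  = {1, 2, 3, 4}                   # questionnaire item numbers
      Scales = {"extraversion", "stability"}

      # relation R subset of Items x Scales x {+1, -1}: item, scale, scoring key
      R = {(1, "extraversion", +1), (2, "extraversion", -1),
           (3, "stability",    +1), (4, "stability",    +1)}

      answers = {1: True, 2: False, 3: True, 4: False}  # one respondent's record

      def raw_score(scale):
          """Select tuples of R by scale, then aggregate the keyed answers."""
          return sum(key * (1 if answers[item] else 0)
                     for item, s, key in R if s == scale)

      for s in sorted(Scales):
          print(s, raw_score(s))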

  7. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
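
    One simple way to make the optimization view concrete is a knapsack-style selection: choose assurance activities to maximize expected risk reduction within a fixed budget. The greedy sketch below (benefit per unit cost) and all its numbers are illustrative, not the algorithm of the reported tool.

      # Greedy budget allocation over candidate assurance activities.
      def select_activities(activities, budget):
          """Greedy by risk-reduction per unit cost; returns (chosen, total_reduction)."""
          chosen, total = [], 0.0
          for name, cost, reduction in sorted(
                  activities, key=lambda a: a[2] / a[1], reverse=True):
              if cost <= budget:
                  budget -= cost
                  chosen.append(name)
                  total += reduction
          return chosen, total

      activities = [  # (activity, cost in person-days, expected risk reduction)
          ("code inspection",      10, 0.30),
          ("unit tests",           15, 0.35),
          ("design review",         8, 0.20),
          ("performance analysis", 12, 0.15),
          ("traceability matrix",   5, 0.08),
      ]
      print(select_activities(activities, budget=30))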

  8. Software for Analyzing Laminar-to-Turbulent Flow Transitions

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software. This is achieved by combining two prior programs in an innovative manner.

  9. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  10. System Engineering Infrastructure Evolution Galileo IOV and the Steps Beyond

    NASA Astrophysics Data System (ADS)

    Eickhoff, J.; Herpel, H.-J.; Steinle, T.; Birn, R.; Steiner, W.-D.; Eisenmann, H.; Ludwig, T.

    2009-05-01

    The trend toward increasingly constrained financial budgets in satellite engineering requires permanent optimization of the S/C system engineering processes and infrastructure. In recent years, Astrium has built up a system simulation infrastructure, the "Model-based Development & Verification Environment" (MDVE), which is meanwhile well known all over Europe and established as Astrium's standard approach for ESA and DLR projects, and now even for the EU/ESA project Galileo IOV. The key feature of the MDVE/FVE approach is to provide an entire S/C simulation (with a fully featured OBC simulation) already in early phases, so that OBSW code tests can start on a simulated S/C, with hardware then added in the loop step by step up to an entire "Engineering Functional Model (EFM)" or "FlatSat". The subsequent enhancements to this simulator infrastructure with respect to spacecraft design data handling are reported in the following sections.

  11. Updated CCPS Investigation Guidelines book.

    PubMed

    Philley, J; Pearson, K; Sepeda, A

    2003-11-14

    Incident investigation standards and performance criteria continue to improve. In recognition, the Center for Chemical Process Safety (CCPS) undertook a major project to upgrade and update the Incident Investigation Guidelines originally published in 1992. These significantly expanded guidelines provide a practical resource for effective investigation of process-related incidents, and reflect changes in good practices and expectations of regulators. This paper highlights the content of the new guidelines with special emphasis on what is new and improved. Entirely new chapters address the topics of legal considerations, the near-miss event, and continuous improvement of the investigation system. The objective of the guidelines is to allow chemical process organizations to develop and implement an incident investigation management system that is effective in identifying underlying causes.

  12. Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover

    NASA Technical Reports Server (NTRS)

    Dangelo, K. R.

    1974-01-01

    A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process which uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least-squares approximation is used to stochastically determine the parameters which describe the modeled surface. A complete error analysis of the modeling procedure is included, which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process, which includes the acquisition of data points, the two-step modeling process, and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
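
    The two-step use of height and gradient data maps naturally onto ordinary least squares: gradient measurements contribute extra linear equations in the same surface coefficients. The following Python sketch illustrates that reading with synthetic data standing in for the vehicle's measurements; it is not the original program.

      import numpy as np

      # Fit z(x, y) = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2 to both
      # height and gradient observations in one least-squares system.
      rng = np.random.default_rng(0)
      x, y = rng.uniform(-1, 1, 20), rng.uniform(-1, 1, 20)
      z  = 0.5 + 0.3 * x - 0.2 * y + 0.1 * x * y    # synthetic "terrain"
      zx = 0.3 + 0.1 * y                            # its dz/dx
      zy = -0.2 + 0.1 * x                           # its dz/dy

      one, zero = np.ones_like(x), np.zeros_like(x)
      A_h  = np.column_stack([one,  x,    y,    x**2,  x * y, y**2])   # height rows
      A_gx = np.column_stack([zero, one,  zero, 2 * x, y,     zero])   # dz/dx rows
      A_gy = np.column_stack([zero, zero, one,  zero,  x,     2 * y])  # dz/dy rows
      A = np.vstack([A_h, A_gx, A_gy])
      b = np.concatenate([z, zx, zy])
      coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
      print(np.round(coeffs, 3))   # recovers [0.5, 0.3, -0.2, 0, 0.1, 0]

    Stacking the gradient rows weights the fit toward matching slopes, which is precisely the stated goal of improving the accuracy of the model's gradient.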

  13. SWAp dynamics in a decentralized context: experiences from Uganda.

    PubMed

    Jeppsson, Anders

    2002-12-01

    This paper examines the role of the Ministry of Health (MoH) in Uganda in the process of developing a Sector-Wide Approach (SWAp) within the health sector. Power dynamics are integral to any understanding of development assistance, and SWAps bring with them new opportunities for the deployment of influence. The SWAp process has changed the interaction between the donors and the Government, and the perspective of this interaction has shifted from various technical areas to the entire health sector. It is argued that although the decentralization of the public sector has transferred considerable responsibilities and duties from the central level to the districts, significant power, defined as a social construct, has been generated by the MoH in the very process of developing SWAps. The MoH has been able to exercise significant influence on defining the content and boundaries of the SWAp process, as well as the direction it is taking. This development has largely followed blueprints drawn by donors. Through the institutional framework associated with SWAps, the MoH has redefined the interaction between the central level and the districts as well as between the MoH and the donors. While the SWAp process is now moving from the planning to the implementation phase in Uganda, we see a number of new, changing, ambiguous and contradictory strategies emerging.

  14. The electrical properties of zero-gravity processed immiscibles

    NASA Technical Reports Server (NTRS)

    Lacy, L. L.; Otto, G. H.

    1974-01-01

    When dispersed or mixed immiscibles are solidified on earth, a large amount of separation of the constituents takes place due to differences in densities. However, when the immiscibles are dispersed and solidified in zero-gravity, density separation does not occur, and unique composite solids can be formed with many new and promising electrical properties. By measuring the electrical resistivity and superconducting critical temperature, Tc, of zero-g processed Ga-Bi samples, it has been found that the electrical properties of such materials are entirely different from the basic constituents and the ground control samples. Our results indicate that space processed immiscible materials may form an entirely new class of electronic materials.

  15. Process Engineering Technology Center Initiative

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.

    2001-01-01

    NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC, because the entire facility is practically a laboratory when observed from a macro-level perspective. However, it becomes necessary, at times, to show how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed as a place with a centralized focus on PE projects, where KSC's PE capabilities can be showcased, and where new PE technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head-mounted displays to deliver work instructions; the other will look into developing simulation models that can be assembled into one to create a hierarchical model.

  16. Process Engineering Technology Center Initiative

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.

    2002-01-01

    NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC, because the entire facility is practically a laboratory when observed from a macro-level perspective. However, it becomes necessary, at times, to show how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed as a place with a centralized focus on PE projects, where KSC's PE capabilities can be showcased, and where new PE technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head-mounted displays to deliver work instructions; the other will look into developing simulation models that can be assembled into one to create a hierarchical model.

  17. An engineering code to analyze hypersonic thermal management systems

    NASA Technical Reports Server (NTRS)

    Vangriethuysen, Valerie J.; Wallace, Clark E.

    1993-01-01

    Thermal loads on current and future aircraft are increasing and, as a result, are stressing the energy collection, control, and dissipation capabilities of current thermal management systems and technology. The thermal loads for hypersonic vehicles will be no exception. In fact, with their projected high heat loads and fluxes, hypersonic vehicles are a prime example of systems that will require thermal management systems (TMS) that have been optimized and integrated with the entire vehicle to the maximum extent possible during the initial design stages. This will be necessary not only to meet operational requirements, but also to satisfy weight and performance constraints so that the vehicle can take off and complete its mission successfully. To meet this challenge, the TMS can no longer be two or more entirely independent systems, nor can thermal management be an afterthought in the design process, the pervasive approach in the past. Instead, a TMS that is integrated throughout the entire vehicle and subsequently optimized will be required. To accomplish this, a method that iteratively optimizes the TMS throughout the vehicle will be not only highly desirable but advantageous, in order to reduce the man-hours normally required to conduct the necessary tradeoff studies and comparisons. A thermal management engineering computer code that is under development and being managed at Wright Laboratory, Wright-Patterson AFB, is discussed. The primary goal of the code is to aid in the development of a hypersonic vehicle TMS that has been optimized and integrated on a total-vehicle basis.

  18. Whole-animal imaging with high spatio-temporal resolution

    NASA Astrophysics Data System (ADS)

    Chhetri, Raghav; Amat, Fernando; Wan, Yinan; Höckendorf, Burkhard; Lemon, William C.; Keller, Philipp J.

    2016-03-01

    We developed isotropic multiview (IsoView) light-sheet microscopy in order to image fast cellular dynamics, such as cell movements in an entire developing embryo or neuronal activity throughout an entire brain or nervous system, with high resolution in all dimensions, high imaging speeds, good physical coverage and low photo-damage. To achieve high temporal resolution and high spatial resolution at the same time, IsoView microscopy rapidly images large specimens via simultaneous light-sheet illumination and fluorescence detection along four orthogonal directions. In a post-processing step, these four views are then combined by means of high-throughput multiview deconvolution to yield images with a system resolution of ≤ 450 nm in all three dimensions. Using IsoView microscopy, we performed whole-animal functional imaging of Drosophila embryos and larvae at a spatial resolution of 1.1-2.5 μm and at a temporal resolution of 2 Hz for up to 9 hours. We also performed whole-brain functional imaging in larval zebrafish and multicolor imaging of fast cellular dynamics across entire, gastrulating Drosophila embryos with isotropic, sub-cellular resolution. Compared with conventional (spatially anisotropic) light-sheet microscopy, IsoView microscopy improves spatial resolution at least sevenfold and decreases resolution anisotropy at least threefold. Compared with existing high-resolution light-sheet techniques, such as lattice light-sheet microscopy or diSPIM, IsoView microscopy effectively doubles the penetration depth and provides subsecond temporal resolution for specimens 400-fold larger than could previously be imaged.
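
    Multiview deconvolution of this kind is typically an iterative Richardson-Lucy scheme in which each view's point-spread function refines a shared estimate. The 1-D Python toy below illustrates only that general idea, with invented Gaussian PSFs; it is not the IsoView pipeline.

      import numpy as np
      from scipy.ndimage import gaussian_filter1d

      # Two "views" of the same object, blurred by different Gaussian PSFs.
      truth = np.zeros(200)
      truth[60], truth[120] = 1.0, 0.7
      sigmas = [1.0, 4.0]
      views = [gaussian_filter1d(truth, s) for s in sigmas]

      est = np.full_like(truth, truth.mean() + 1e-3)    # positive start
      for _ in range(50):
          for v, s in zip(views, sigmas):
              blurred = gaussian_filter1d(est, s) + 1e-12
              est *= gaussian_filter1d(v / blurred, s)  # RL multiplicative update
      print(est.argmax())   # brightest peak recovered near index 60

    Each view contributes most where its PSF is sharpest, which gives the intuition for why combining orthogonal views improves resolution isotropy.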

  19. Developmental biology of the pancreas: a comprehensive review.

    PubMed

    Gittes, George K

    2009-02-01

    Pancreatic development represents a fascinating process in which two morphologically distinct tissue types must derive from one simple epithelium. These two tissue types, exocrine (including acinar cells, centro-acinar cells, and ducts) and endocrine cells, serve disparate functions and have entirely different morphology. In addition, the endocrine tissue must become disconnected from the epithelial lining during its development. The pancreatic development field has exploded in recent years, and numerous published reviews have dealt specifically with only recent findings, or specifically with certain aspects of pancreatic development. Here I wish to present a more comprehensive review of all aspects of pancreatic development, though there is still not room for discussion of stem cell differentiation to pancreas, nor of post-natal regeneration phenomena, two important fields closely related to pancreatic development.

  20. Magnetically Enhanced Solid-Liquid Separation

    NASA Astrophysics Data System (ADS)

    Rey, C. M.; Keller, K.; Fuchs, B.

    2005-07-01

    DuPont is developing an entirely new method of solid-liquid filtration involving the use of magnetic fields and magnetic field gradients. The new hybrid process, termed Magnetically Enhanced Solid-Liquid Separation (MESLS), is designed to improve the de-watering kinetics and reduce the residual moisture content of solid particulates mechanically separated from liquid slurries. Gravitation, pressure, temperature, centrifugation, and fluid dynamics have dictated traditional solid-liquid separation for the past 50 years. The introduction of an external field (i.e., the magnetic field) offers the promise of manipulating particle behavior in an entirely new manner, which leads to increased process efficiency. Traditional solid-liquid separation typically consists of two primary steps. The first is a mechanical step in which the solid particulate is separated from the liquid using, e.g., gas pressure through a filter membrane, centrifugation, etc. The second step is a thermal drying process, which is required due to imperfect mechanical separation. The thermal drying process is 100-200 times less energy efficient than the mechanical step. Since enormous volumes of materials are processed each year, more efficient mechanical solid-liquid separation can be leveraged into dramatic reductions in overall energy consumption by reducing downstream drying requirements. Using DuPont's MESLS process, initial test results showed four very important effects of the magnetic field on the solid-liquid filtration process: 1) reduction of the time to reach gas breakthrough, 2) less loss of solid into the filtrate, 3) reduction of the (solids) residual moisture content, and 4) acceleration of the de-watering kinetics. These test results and their potential impact on future commercial solid-liquid filtration are discussed. New applications can be found in mining, chemical and bioprocesses.

  1. The Effectiveness of Full Day School System for Students’ Character Building

    NASA Astrophysics Data System (ADS)

    Benawa, A.; Peter, R.; Makmun, S.

    2018-01-01

    The study aims to show that the full day school delivered at Marsudirini Elementary School in Bogor is effective for students' character building. The study focused on the implementation of the full day school system. The qualitative research method applied in the study is characteristic evaluation, involving non-participant observation, interviews, and documentation analysis. The result of this study concludes that the full day school system is significantly effective for elementary students' character building. The full day school system embraced the entire set of relevant processes based on the character building standard. The synergy of comprehensive components in the instructional process at the full day school influenced the building of the students' character effectively and efficiently. A relationship emerged between the instructional development process in the full day school system and the character building of the students. By developing the instructional process in a systemic and systematic way within the full day school system, the support of stakeholders (leaders, human resources, students, parents) and of other components (learning resources, facilities, budget) makes a potent and expeditious contribution to character building among the students.

  2. Recent developments in turbomachinery component materials and manufacturing challenges for aero engine applications

    NASA Astrophysics Data System (ADS)

    Srinivas, G.; Raghunandana, K.; Satish Shenoy, B.

    2018-02-01

    In recent years, performance enhancement of turbomachinery materials has played a vital role in aircraft air-breathing engines such as the turbojet, turboprop, turboshaft, and turbofan. Transonic-flow engines in particular require highly sophisticated materials that can sustain the entire thrust created by the engine. The main objective of this paper is to give an overview of present cost-effective and technologically capable processes for turbomachinery component materials. The main focus is on electro-physical, photonic additive/removal, and electro-chemical processes for the manufacture of turbomachinery parts. Aeronautical propulsion-based technologies are reviewed thoroughly with respect to surface reliability, geometric precision, and material removal and deposition rates for difficult-to-cut dedicated steels, titanium- and nickel-based alloys, and highly strengthened composite materials. The paper covers past aeronautical and propulsion manufacturing technologies, current sophisticated technologies, and future challenging material processing techniques. It also gives a brief description of the shaping and coating processes for turbomachinery components in aeromechanical applications.

  3. STUDIES ON A-AVITAMINOSIS IN CHICKENS

    PubMed Central

    Seifried, Oskar

    1930-01-01

    1. The principal tissue changes in the respiratory tract of chickens caused by a vitamin A deficiency in the food are, first, an atrophy and degeneration of the lining mucous membrane epithelium as well as of the epithelium of the mucous membrane glands. This process is followed or accompanied by a replacement or substitution of the degenerating original epithelium of these parts by a squamous stratified keratinizing epithelium. This newly formed epithelium develops from the primitive columnar epithelium and divides and grows very rapidly. The process appears to be one of substitution rather than a metaplasia, and resembles the normal keratinization of the skin or even more closely the incomplete keratinization of the mucous membranes (e.g., the esophagus or certain parts of the tongue of chickens). In this connection findings have been described which not only afford an interesting insight into the complicated mechanism of keratinization, but also show probable relations between keratinization and the development of Guarnieri's inclusion bodies. Balloon and reticular degeneration of the upper layers of the new stratified epithelium has been frequently observed. All parts of the respiratory tract are about equally involved in the process, and the olfactory region as well, so that the sense of smell may be lost. The lesions, which first take place on the surface epithelium and then in the glands, show only minor differences. 2. The protective mechanism inherent in the mucous membranes of the entire respiratory tract is seriously damaged or even entirely destroyed by the degeneration of the ciliated cells at the surface and the lack of secretion with bactericidal properties. Secondary infections are frequently found, and nasal discharge and various kinds of inflammatory processes are common, including purulent ones, especially in the upper respiratory tract, communicating sinuses, eyes and trachea. The development of the characteristic histological process is not dependent upon the presence of these infections, since it also takes place in the absence of infection. 3. The specific histological lesions make it possible to differentiate between A-avitaminosis and some infectious diseases of the respiratory tract. These studies we hope will serve as a basis for further investigations on the relationship between A-avitaminosis and infection in general. PMID:19869784

  4. The economics of new drugs: can we afford to make progress in a common disease?

    PubMed

    Hirsch, Bradford R; Schulman, Kevin A

    2013-01-01

    The concept of personalized medicine is beginning to come to fruition, but the cost of drug development is untenable today. To identify new initiatives that would support a more sustainable business model, the economics of drug development are analyzed, including the cost of drug development, the cost of capital, target market size, returns to innovators at the product and firm levels, and, finally, product pricing. We argue that a quick fix is not available. Instead, a rethinking of the entire pharmaceutical development process is needed, from the way that clinical trials are conducted, to the role of biomarkers in segmenting markets, to the use of grant support and conditional approval to decrease the cost of capital. In aggregate, the opportunities abound.

  5. Nuclear Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silver, E G

    This document is a review journal that covers significant developments in the field of nuclear safety. Its scope includes the analysis and control of hazards associated with nuclear energy, operations involving fissionable materials, and the products of nuclear fission and their effects on the environment. Primary emphasis is on safety in reactor design, construction, and operation; however, the safety aspects of the entire fuel cycle, including fuel fabrication, spent-fuel processing, nuclear waste disposal, handling of radioisotopes, and environmental effects of these operations, are also treated.

  6. Developing Best Practices for Capturing As-Built Building Information Models (BIM) for Existing Facilities

    DTIC Science & Technology

    2010-08-01

    students conducting the data capture and data entry, an analytical method known as the Task Load Index (NASA TLX Version 2.0) was used. This method was ... published by the NASA Ames Research Center in December 2003. The entire report can be found at: http://humansystems.arc.nasa.gov/groups/TLX ... completion of each task in the survey process, surveyors were required to complete a NASA TLX form to report their assessment of the workload for ...
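
    For reference, the published NASA TLX procedure combines six subscale ratings (0-100) with weights obtained from 15 pairwise comparisons, the weights summing to 15. A minimal Python sketch follows; the ratings and weights are invented, and only the formula is from the TLX manual.

      # NASA TLX overall workload: weighted mean of six subscale ratings.
      ratings = {"mental": 70, "physical": 20, "temporal": 60,
                 "performance": 40, "effort": 65, "frustration": 30}
      weights = {"mental": 4, "physical": 1, "temporal": 3,
                 "performance": 2, "effort": 4, "frustration": 1}
      assert sum(weights.values()) == 15        # from 15 pairwise comparisons
      tlx = sum(ratings[k] * weights[k] for k in ratings) / 15.0
      print(f"weighted TLX workload: {tlx:.1f}")   # ~56.7 here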

  7. Space station automation study. Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The two manufacturing concepts developed represent innovative, technologically advanced manufacturing schemes. The concepts were selected to facilitate an in-depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, and artificial intelligence. While the cost effectiveness of these facilities has not been analyzed as part of this study, both appear entirely feasible for the year 2000 timeframe. The growing demand for high quality gallium arsenide microelectronics may warrant the ventures.

  8. Establishment of an Off-Highway Vehicle (OHV) Program at Arnold Air Force Base, Tennessee Final Environmental Assessment

    DTIC Science & Technology

    2010-05-01

    adverse impacts. This process was applied to the entire OHV area for the following resource areas: geomorphology and soils, water quality and ... interaction with highly erodible soils. If such areas are utilized, operational constraints would be implemented that would minimize impacts in these areas ... such as restricted use in wet soils and speed limits. At the motocross area, the riding track would be developed based on constraints associated

  9. Determination of Absolute Configuration of Secondary Alcohols Using Thin-Layer Chromatography

    PubMed Central

    Wagner, Alexander J.; Rychnovsky, Scott D.

    2013-01-01

    A new implementation of the Competing Enantioselective Conversion (CEC) method was developed to qualitatively determine the absolute configuration of enantioenriched secondary alcohols using thin-layer chromatography. The entire process for the method requires approximately 60 min and utilizes micromole quantities of the secondary alcohol being tested. A number of synthetically relevant secondary alcohols are presented. Additionally, 1H NMR spectroscopy was conducted on all samples to provide evidence of reaction conversion that supports the qualitative method presented herein. PMID:23593963

  10. Method of examining microcircuit patterns

    NASA Technical Reports Server (NTRS)

    Suszko, S. F. (Inventor)

    1986-01-01

    Examination of microstructures of LSI and VLSI devices is facilitated by employing a method in which the device is photographed through a darkfield illumination optical microscope and the resulting negative subjected to inverse processing to form a positive on a photographic film. The film is then developed to form photographic prints or transparencies which clearly illustrate the structure of the device. The entire structure of a device may be examined by alternately photographing the device and selectively etching layers of the device in order to expose underlying layers.

  11. Books, children, dogs, artists: library programs for the entire family.

    PubMed

    Haver, Mary Katherine

    2014-01-01

    The promotion of library resources and services is a continuous process for all libraries, especially hospital family resource center libraries. Like public libraries, a family resource center can utilize programs as a pathway for connecting with and developing awareness of library resources and services available to patient families. This column describes the programs currently offered for All Children's Hospital Johns Hopkins Medicine patient families, marketing initiatives to promote these programs, and utilization of grant funding to supplement a program.

  12. Perfect relativistic magnetohydrodynamics around black holes in horizon penetrating coordinates

    NASA Astrophysics Data System (ADS)

    Cherubini, Christian; Filippi, Simonetta; Loppini, Alessandro; Moradi, Rahim; Ruffini, Remo; Wang, Yu; Xue, She-Sheng

    2018-03-01

    Plasma accretion processes onto black holes represent a central problem for relativistic astrophysics. In this context, we specifically revisit here the classical Ruffini-Wilson work, which analytically modeled, via geodesic equations, the accretion of perfect magnetized plasma onto a rotating Kerr black hole. Introducing the horizon-penetrating coordinates found by Doran 25 years later, we revisit the entire approach, studying the Maxwell invariants, the electric and magnetic fields, the volumetric charge density, and the total electromagnetic energy. We finally discuss the physical implications of this analysis.
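
    For readers wanting the background geometry, the Kerr line element in the familiar Boyer-Lindquist coordinates is reproduced below as a LaTeX snippet (with G = c = 1); the horizon-penetrating Doran form used in the paper follows from it by a coordinate transformation and is not reproduced here.

      % Kerr metric, Boyer-Lindquist coordinates, G = c = 1
      \[
        ds^2 = -\Bigl(1 - \frac{2Mr}{\Sigma}\Bigr)\,dt^2
               - \frac{4Mar\sin^2\theta}{\Sigma}\,dt\,d\phi
               + \frac{\Sigma}{\Delta}\,dr^2 + \Sigma\,d\theta^2
               + \Bigl(r^2 + a^2 + \frac{2Ma^2 r\sin^2\theta}{\Sigma}\Bigr)\sin^2\theta\,d\phi^2,
      \]
      \[
        \Sigma \equiv r^2 + a^2\cos^2\theta, \qquad \Delta \equiv r^2 - 2Mr + a^2 .
      \]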

  13. Wnt affects symmetry and morphogenesis during post-embryonic development in colonial chordates.

    PubMed

    Di Maio, Alessandro; Setar, Leah; Tiozzo, Stefano; De Tomaso, Anthony W

    2015-01-01

    Wnt signaling is one of the earliest and most highly conserved regulatory pathways for the establishment of the body axes during regeneration and early development. In regeneration, body axis determination occurs independently of tissue rearrangement and early developmental cues. Modulation of Wnt signaling in either process has been shown to result in unusual body axis phenotypes. Botryllus schlosseri is a colonial ascidian that can regenerate its entire body through asexual budding. This process leads to an adult body via a stereotypical developmental pathway (called blastogenesis), without proceeding through any embryonic developmental stages. In this study, we describe the role of the canonical Wnt pathway during the early stages of asexual development. We characterized expression of three Wnt ligands (Wnt2B, Wnt5A, and Wnt9A) by in situ hybridization and qRT-PCR. Chemical manipulation of the pathway resulted in atypical budding due to the duplication of the A/P axes, supernumerary budding, and loss of the overall cell apical-basal polarity. Our results suggest that Wnt signaling is used for equivalent developmental processes both during embryogenesis and during asexual development in an adult organism, suggesting that the patterning mechanisms driving morphogenesis are conserved, independent of embryonic or regenerative development.

  14. Finite Element Simulation of Residual Stress Development in Thermally Sprayed Coatings

    NASA Astrophysics Data System (ADS)

    Elhoriny, Mohamed; Wenzelburger, Martin; Killinger, Andreas; Gadow, Rainer

    2017-04-01

    The coating buildup process of Al2O3/TiO2 ceramic powder deposited on stainless-steel substrate by atmospheric plasma spraying has been simulated by creating thermomechanical finite element models that utilize element death and birth techniques in ANSYS commercial software and self-developed codes. The simulation process starts with side-by-side deposition of coarse subparts of the ceramic layer until the entire coating is created. Simultaneously, the heat flow into the material, thermal deformation, and initial quenching stress are computed. The aim is to be able to predict—for the considered spray powder and substrate material—the development of residual stresses and to assess the risk of coating failure. The model allows the prediction of the heat flow, temperature profile, and residual stress development over time and position in the coating and substrate. The proposed models were successfully run and the results compared with actual residual stresses measured by the hole drilling method.
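
    The element death and birth technique is, mechanically, a stiffness-scaling trick: "dead" elements stay in the mesh but contribute almost no stiffness until activated. The toy 1-D Python sketch below shows only that bookkeeping, with a spring chain activated layer by layer; the kill factor and loads are invented, and this is not the authors' ANSYS model.

      import numpy as np

      n_el, k_el, kill = 5, 1.0e6, 1.0e-6   # elements, stiffness (N/m), kill factor
      load = 100.0                          # N at the free end

      def tip_displacement(active):
          n = n_el + 1
          K = np.zeros((n, n))
          for e in range(n_el):
              k = k_el * (1.0 if active[e] else kill)   # EKILL-style scaling
              K[e:e + 2, e:e + 2] += k * np.array([[1, -1], [-1, 1]])
          f = np.zeros(n)
          f[-1] = load
          u = np.zeros(n)
          u[1:] = np.linalg.solve(K[1:, 1:], f[1:])     # node 0 clamped
          return u[-1]

      for born in range(1, n_el + 1):       # "birth" of one layer at a time
          print(born, "layers active:", tip_displacement([e < born for e in range(n_el)]))

    Dead elements transmit essentially no load, so the tip compliance collapses to the expected series-spring value only once every layer has been born.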

  15. Structural Optimisation Of Payload Fairings

    NASA Astrophysics Data System (ADS)

    Santschi, Y.; Eaton, N.; Verheyden, S.; Michaud, V.

    2012-07-01

    RUAG Space are developing materials and processing technologies for manufacture of the Next Generation Launcher (NGL) payload fairing, together with the Laboratory of Polymer and Composite Technology at the EPFL, in a project running under the ESA Future Launchers Preparatory Program (FLPP). In this paper the general aims and scope of the project are described; details of the results obtained will be presented at a later stage. RUAG Space design, develop and manufacture fairings for the European launch vehicles Ariane 5 and VEGA using well-proven composite materials and production methods which provide an adequate cost/performance ratio for these applications. However, the NGL shall make full use of innovations in materials and process technologies to achieve a gain in performance at a much reduced overall manufacturing cost. The NGL is scheduled to become operational in 2025, with actual development beginning in 2014. In the current project the basic technology is being developed and validated, in readiness for application in the NGL. For this new application, an entirely new approach to fairing manufacture is being evaluated.

  16. Hybrid-PIC Computer Simulation of the Plasma and Erosion Processes in Hall Thrusters

    NASA Technical Reports Server (NTRS)

    Hofer, Richard R.; Katz, Ira; Mikellides, Ioannis G.; Gamero-Castano, Manuel

    2010-01-01

    HPHall software simulates and tracks the time-dependent evolution of the plasma and erosion processes in the discharge chamber and near-field plume of Hall thrusters. HPHall is an axisymmetric solver that employs a hybrid fluid/particle-in-cell (Hybrid-PIC) numerical approach. HPHall, originally developed by MIT in 1998, was upgraded to HPHall-2 by the Polytechnic University of Madrid in 2006. The Jet Propulsion Laboratory has continued the development of HPHall-2 through upgrades to the physical models employed in the code, and the addition of entirely new ones. Primary among these are the inclusion of a three-region electron mobility model that more accurately depicts the cross-field electron transport, and the development of an erosion sub-model that allows for the tracking of the erosion of the discharge chamber wall. The code is being developed to provide NASA science missions with a predictive tool of Hall thruster performance and lifetime that can be used to validate Hall thrusters for missions.
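
    In a hybrid scheme the heavy species are advanced as particles, and the standard workhorse for that step is the Boris integrator. Below is a self-contained Python sketch of a single xenon ion in prescribed constant E and B fields; the field values are invented, and in HPHall the fields come from the coupled electron-fluid solution rather than being fixed.

      import numpy as np

      q_m = 7.3e5                        # approx. charge-to-mass of Xe+ (C/kg)
      E = np.array([0.0, 0.0, 2.0e4])    # V/m, accelerating field (illustrative)
      B = np.array([0.0, 1.5e-2, 0.0])   # T, transverse field (illustrative)
      dt = 1.0e-9

      x, v = np.zeros(3), np.zeros(3)
      for _ in range(1000):              # Boris push: half E kick, B rotation, half E kick
          v_minus = v + 0.5 * q_m * E * dt
          t = 0.5 * q_m * B * dt
          s = 2.0 * t / (1.0 + t @ t)
          v_prime = v_minus + np.cross(v_minus, t)
          v = v_minus + np.cross(v_prime, s) + 0.5 * q_m * E * dt
          x += v * dt
      print(x, v)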

  17. A conifer-friendly high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis

    NASA Astrophysics Data System (ADS)

    Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.

    2012-12-01

    Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while limiting labor requirements to 2-3 days per batch of samples.

  18. Expectation, information processing, and subjective duration.

    PubMed

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

    In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing, that is, whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker-accumulator and coding-efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  19. Advances in the production of freeform optical surfaces

    NASA Astrophysics Data System (ADS)

    Tohme, Yazid E.; Luniya, Suneet S.

    2007-05-01

    Recent market demands for free-form optics have challenged the industry to find new methods and techniques to manufacture free-form optical surfaces with a high level of accuracy and reliability. Production techniques are becoming a mix of multi-axis single-point diamond machining centers or deterministic ultra-precision grinding centers coupled with capable measurement systems to accomplish the task. It has been determined that a complex software tool is required to seamlessly integrate all aspects of the manufacturing process chain. Advances in computational power and improved performance of computer-controlled precision machinery have driven the use of such software programs to measure, visualize, analyze, produce, and re-validate the 3D free-form design, thus making the manufacture of such complex surfaces a viable task. Consolidating the entire production cycle in a comprehensive software tool that can interact with all systems in the design, production, and measurement phases will enable manufacturers to solve these complex challenges, providing improved product quality, simplified processes, and enhanced performance. The work presented here describes the latest advancements in developing such a software package for the entire fabrication process chain for aspheric and free-form shapes. It applies a rational B-spline based kernel to transform an optical design in the form of a parametric definition (optical equation), a standard CAD format, or a cloud of points into a central format that drives the simulation. This software tool creates a closed loop for the fabrication process chain, integrating surface analysis and compensation, tool path generation, and measurement analysis in one package.
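
    A minimal Python sketch of the central-format idea: a cloud of points on a freeform surface is fitted with a bivariate B-spline, which can then be evaluated anywhere to drive toolpaths or metrology sampling. SciPy's plain (non-rational) B-spline stands in here for the rational B-spline kernel described above, and the surface itself is invented.

      import numpy as np
      from scipy.interpolate import bisplrep, bisplev

      x, y = np.meshgrid(np.linspace(-1, 1, 15), np.linspace(-1, 1, 15))
      x, y = x.ravel(), y.ravel()
      z = 0.2 * x**2 - 0.1 * y**2 + 0.05 * x * y   # toy freeform sag

      tck = bisplrep(x, y, z, s=1e-6)              # fit a smoothing B-spline
      print(bisplev(0.3, -0.4, tck))               # evaluate at one (x, y); ~ -0.004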

  20. Reliability of clinical guideline development using mail-only versus in-person expert panels.

    PubMed

    Washington, Donna L; Bernstein, Steven J; Kahan, James P; Leape, Lucian L; Kamberg, Caren J; Shekelle, Paul G

    2003-12-01

    Clinical practice guidelines quickly become outdated. One reason they might not be updated as often as needed is the expense of collecting expert judgment regarding the evidence. The RAND-UCLA Appropriateness Method is one commonly used method for collecting expert opinion. We tested whether a less expensive, mail-only process could substitute for the standard in-person process normally used. We performed a 4-way replication of the appropriateness panel process for coronary revascularization and hysterectomy, conducting 3 panels using the conventional in-person method and 1 panel entirely by mail. All indications were classified as inappropriate or not (to evaluate overuse), and coronary revascularization indications were classified as necessary or not (to evaluate underuse). Kappa statistics were calculated for the comparison in ratings from the 2 methods. Agreement beyond chance between the 2 panel methods ranged from moderate to substantial. The kappa statistic to detect overuse was 0.57 for coronary revascularization and 0.70 for hysterectomy. The kappa statistic to detect coronary revascularization underuse was 0.76. There were no cases in which coronary revascularization was considered inappropriate by 1 method, but necessary or appropriate by the other. Three of 636 (0.5%) hysterectomy cases were categorized as inappropriate by 1 method but appropriate by the other. The reproducibility of the overuse and underuse assessments from the mail-only compared with the conventional in-person conduct of expert panels in this application was similar to the underlying reproducibility of the process. This suggests a potential role for updating guidelines using an expert judgment process conducted entirely through the mail.
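
    The kappa statistics quoted above measure agreement beyond chance between the two panel methods. For reference, here is a small Python function computing Cohen's kappa from a 2x2 agreement table; the counts below are invented, not the study's data.

      import numpy as np

      def cohen_kappa(table):
          # Observed agreement corrected for the agreement expected by chance.
          t = np.asarray(table, dtype=float)
          n = t.sum()
          po = np.trace(t) / n
          pe = (t.sum(axis=0) @ t.sum(axis=1)) / n**2
          return (po - pe) / (1.0 - pe)

      # Rows: in-person rating (appropriate / inappropriate); columns: mail-only.
      print(round(cohen_kappa([[40, 5], [6, 49]]), 2))   # -> 0.78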

  1. BPM 3.0

    NASA Astrophysics Data System (ADS)

    Scheer, August-Wilhelm; Klueckmann, Joerg

    Business Process Management (BPM) is an established management discipline. Since today’s organizations expect every employee to think and act like an entrepreneur, i.e., like a manager, BPM is also increasingly becoming part of everyday operations. But merely adopting a process-based approach across the enterprise is not enough to enable BPM at every level. What is needed is a combination of organizational forms and technologies that support distributed BPM initiatives while simultaneously consolidating them company-wide. Every employee must be empowered to model and optimize their own processes. At the same time, the entire BPM community needs a platform that brings together all the individual initiatives. This is the only way to leverage the full potential of process-oriented management. In the following article, the authors describe the trends in BPM development that are turning users into process managers and supporting the creation of a BPM community.

  2. Development and expansion of high-quality control region databases to improve forensic mtDNA evidence interpretation.

    PubMed

    Irwin, Jodi A; Saunier, Jessica L; Strouss, Katharine M; Sturk, Kimberly A; Diegoli, Toni M; Just, Rebecca S; Coble, Michael D; Parson, Walther; Parsons, Thomas J

    2007-06-01

    In an effort to increase the quantity, breadth and availability of mtDNA databases suitable for forensic comparisons, we have developed a high-throughput process to generate approximately 5000 control region sequences per year from regional US populations, global populations from which the current US population is derived and global populations currently under-represented in available forensic databases. The system utilizes robotic instrumentation for all laboratory steps from pre-extraction through sequence detection, and a rigorous eight-step, multi-laboratory data review process with entirely electronic data transfer. Over the past 3 years, nearly 10,000 control region sequences have been generated using this approach. These data are being made publicly available and should further address the need for consistent, high-quality mtDNA databases for forensic testing.

  3. Linkages and feedbacks in orogenic systems: An introduction

    USGS Publications Warehouse

    Thigpen, J. Ryan; Law, Richard D.; Merschat, Arthur J.; Stowell, Harold

    2017-01-01

    Orogenic processes operate at scales ranging from the lithosphere to grain-scale, and are inexorably linked. For example, in many orogens, fault and shear zone architecture controls distribution of heat advection along faults and also acts as the primary mechanism for redistribution of heat-producing material. This sets up the thermal structure of the orogen, which in turn controls lithospheric rheology, the nature and distribution of deformation and strain localization, and ultimately, through localized mechanical strengthening and weakening, the fundamental shape of the developing orogenic wedge (Fig. 1). Strain localization establishes shear zone and fault geometry, and it is the motion on these structures, in conjunction with climate, that often focuses erosional and exhumational processes. This climatic focusing effect can even drive development of asymmetry at the scale of the entire wedge (Willett et al., 1993).

  4. NSF's Perspective on Space Weather Research for Building Forecasting Capabilities

    NASA Astrophysics Data System (ADS)

    Bisi, M. M.; Pulkkinen, A. A.; Bisi, M. M.; Pulkkinen, A. A.; Webb, D. F.; Oughton, E. J.; Azeem, S. I.

    2017-12-01

    Space weather research at the National Science Foundation (NSF) is focused on scientific discovery and on deepening knowledge of the Sun-Geospace system. Maturation of this knowledge base is a prerequisite for the development of improved space weather forecast models and for the accurate assessment of potential mitigation strategies. Progress in space weather forecasting requires advancing in-depth understanding of the underlying physical processes, developing better instrumentation and measurement techniques, and capturing the advancements in understanding in large-scale physics-based models that span the entire chain of events from the Sun to the Earth. This presentation will provide an overview of current and planned programs pertaining to space weather research at NSF and discuss the recommendations of the Geospace Section portfolio review panel within the context of space weather forecasting capabilities.

  5. Genome-wide association study reveals multiple loci associated with primary tooth development during infancy.

    PubMed

    Pillas, Demetris; Hoggart, Clive J; Evans, David M; O'Reilly, Paul F; Sipilä, Kirsi; Lähdesmäki, Raija; Millwood, Iona Y; Kaakinen, Marika; Netuveli, Gopalakrishnan; Blane, David; Charoen, Pimphen; Sovio, Ulla; Pouta, Anneli; Freimer, Nelson; Hartikainen, Anna-Liisa; Laitinen, Jaana; Vaara, Sarianna; Glaser, Beate; Crawford, Peter; Timpson, Nicholas J; Ring, Susan M; Deng, Guohong; Zhang, Weihua; McCarthy, Mark I; Deloukas, Panos; Peltonen, Leena; Elliott, Paul; Coin, Lachlan J M; Smith, George Davey; Jarvelin, Marjo-Riitta

    2010-02-26

    Tooth development is a highly heritable process which relates to other growth and developmental processes, and which interacts with the development of the entire craniofacial complex. Abnormalities of tooth development are common, with tooth agenesis being the most common developmental anomaly in humans. We performed a genome-wide association study of time to first tooth eruption and number of teeth at one year in 4,564 individuals from the 1966 Northern Finland Birth Cohort (NFBC1966) and 1,518 individuals from the Avon Longitudinal Study of Parents and Children (ALSPAC). We identified 5 loci at P<5×10−8, and 5 with suggestive association (P<5×10−6). The loci included several genes with links to tooth and other organ development (KCNJ2, EDA, HOXB2, RAD51L1, IGF2BP1, HMGA2, MSRB3). Genes at four of the identified loci are implicated in the development of cancer. A variant within the HOXB gene cluster was associated with occlusion defects requiring orthodontic treatment by age 31 years.

  6. Genome-Wide Association Study Reveals Multiple Loci Associated with Primary Tooth Development during Infancy

    PubMed Central

    Sipilä, Kirsi; Lähdesmäki, Raija; Millwood, Iona Y.; Kaakinen, Marika; Netuveli, Gopalakrishnan; Blane, David; Charoen, Pimphen; Sovio, Ulla; Pouta, Anneli; Freimer, Nelson; Hartikainen, Anna-Liisa; Laitinen, Jaana; Vaara, Sarianna; Glaser, Beate; Crawford, Peter; Timpson, Nicholas J.; Ring, Susan M.; Deng, Guohong; Zhang, Weihua; McCarthy, Mark I.; Deloukas, Panos; Peltonen, Leena

    2010-01-01

    Tooth development is a highly heritable process which relates to other growth and developmental processes, and which interacts with the development of the entire craniofacial complex. Abnormalities of tooth development are common, with tooth agenesis being the most common developmental anomaly in humans. We performed a genome-wide association study of time to first tooth eruption and number of teeth at one year in 4,564 individuals from the 1966 Northern Finland Birth Cohort (NFBC1966) and 1,518 individuals from the Avon Longitudinal Study of Parents and Children (ALSPAC). We identified 5 loci at P<5×10−8, and 5 with suggestive association (P<5×10−6). The loci included several genes with links to tooth and other organ development (KCNJ2, EDA, HOXB2, RAD51L1, IGF2BP1, HMGA2, MSRB3). Genes at four of the identified loci are implicated in the development of cancer. A variant within the HOXB gene cluster was associated with occlusion defects requiring orthodontic treatment by age 31 years. PMID:20195514

  7. Modelling the EDLC-based Power Supply Module for a Maneuvering System of a Nanosatellite

    NASA Astrophysics Data System (ADS)

    Kumarin, A. A.; Kudryavtsev, I. A.

    2018-01-01

    The development of a model of the power supply module of a maneuvering system for a nanosatellite is described. The module is based on an EDLC battery as an energy buffer, and the choice of EDLC is described. Experiments were conducted to provide data for the model. The power supply module is simulated for both the charging and discharging of the battery. The difference between simulation and experiment does not exceed 0.5% for charging and 10% for discharging. The developed model can be used in early design and to adjust charger and load parameters, and it can be expanded to represent the entire power system.
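
    A minimal sketch of such an energy-buffer model in Python: a capacitance with an equivalent series resistance (ESR), charged at constant current and discharged into a constant-power load. All parameter values are invented and would be replaced by measured EDLC data.

      C, R_esr = 10.0, 0.05          # farads, ohms (illustrative)
      dt = 0.01                      # s, integration step

      v = 0.0                        # internal capacitor voltage
      for _ in range(2000):          # constant-current charge at 2 A for 20 s
          v += 2.0 * dt / C
      print("after charge:", v, "V; terminal:", v + 2.0 * R_esr, "V")

      P, t = 5.0, 0.0                # W, constant-power discharge
      while v > 1.0:                 # stop at the minimum usable voltage
          i = P / v                  # load current at the present voltage
          v -= i * dt / C
          t += dt
      print("discharge time to 1 V:", round(t, 2), "s")

    One plausible reading of the reported accuracy gap is that constant-power discharge, where current rises as the voltage sags, is harder to model than constant-current charging.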

  8. Inequality and mortality: demographic hypotheses regarding advanced and peripheral capitalism.

    PubMed

    Gregory, J W; Piché, V

    1983-01-01

    This paper analyzes mortality differences between social classes and between advanced and peripheral regions of the world economy. The demographic analysis of mortality is integrated with the study of political economy, which emphasizes the entire process of social reproduction. As part of this dialectic model, both the struggle of the working class to improve health and the interest of capital in maximizing profits are examined. Data from Québec and Upper Volta are used to illustrate the hypothesis that substantially higher mortality rates exist for the working class compared with the bourgeoisie and in the less developed peripheral regions compared with the more developed regions.

  9. Spline function approximation techniques for image geometric distortion representation. [for registration of multitemporal remote sensor imagery

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.

    1975-01-01

    Least-squares approximation techniques were developed for use in computer-aided correction of spatial image distortions for registration of multitemporal remote-sensor imagery. Polynomials were first used to define image distortion over the entire two-dimensional image space. Spline functions were then investigated to determine whether a combination of lower-order polynomials could approximate a higher-order distortion with less computational difficulty. Algorithms for generating approximating functions were developed and applied to the description of image distortion in aircraft multispectral scanner imagery. Other applications of the techniques were suggested for earth resources data processing areas other than geometric distortion representation.
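
    A compact Python sketch of the polynomial stage: control points relate distorted image coordinates to reference coordinates, and ordinary least squares recovers the low-order warp (a spline version would fit such polynomials piecewise over image patches). The distortion coefficients here are invented.

      import numpy as np

      rng = np.random.default_rng(1)
      x, y = rng.uniform(0, 100, 30), rng.uniform(0, 100, 30)   # control points
      xr = 3.0 + 1.01 * x - 0.02 * y + 1e-4 * x * y             # "true" warp in x
      yr = -2.0 + 0.015 * x + 0.99 * y                          # "true" warp in y

      A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
      cx, *_ = np.linalg.lstsq(A, xr, rcond=None)
      cy, *_ = np.linalg.lstsq(A, yr, rcond=None)
      print(np.round(cx, 4))   # recovers [3, 1.01, -0.02, 0.0001, 0, 0]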

  10. The Defense Life Cycle Management System as a Working Model for Academic Application

    ERIC Educational Resources Information Center

    Burian, Philip E.; Keffel, Leslie M.; Maffei, Francis R., III

    2011-01-01

    Performing the review and assessment of masters' level degree programs can be an overwhelming and challenging endeavor. Getting organized and mapping out the entire review and assessment process can be extremely helpful and more importantly provide a path for successfully accomplishing the review and assessment of the entire program. This paper…

  11. Some Memories Are Odder than Others: Judgments of Episodic Oddity Violate Known Decision Rules

    ERIC Educational Resources Information Center

    O'Connor, Akira R.; Guhl, Emily N.; Cox, Justin C.; Dobbins, Ian G.

    2011-01-01

    Current decision models of recognition memory are based almost entirely on one paradigm, single item old/new judgments accompanied by confidence ratings. This task results in receiver operating characteristics (ROCs) that are well fit by both signal-detection and dual-process models. Here we examine an entirely new recognition task, the judgment…

  12. Georgia resource assessment project: Institutionalizing LANDSAT and geographic data base techniques

    NASA Technical Reports Server (NTRS)

    Pierce, R. R.; Rado, B. Q.; Faust, N.

    1981-01-01

    Digital data from LANDSAT for each 1.1-acre cell in Georgia were processed and the land cover conditions were categorized. Several test cases were completed, and an operational hardware and software processing capability was established at the Georgia Institute of Technology. The operational capability was developed to process the entire state (60,000 sq. miles and 14 LANDSAT scenes) in a cooperative project between eleven divisions and agencies at the regional, state, and federal levels. Products were developed for State agencies in both mapped and statistical formats. A computerized geographic data base was developed for management programs. To a large extent, the applications of the data base evolved as users of LANDSAT information requested that other data (i.e., soils, slope, land use, etc.) be made compatible with LANDSAT for management programs. To date, geographic data bases incorporating LANDSAT and other spatial data deal with elements of the municipal solid waste management program and reservoir management for the Corps of Engineers. LANDSAT data are also being used for applications in wetland, wildlife, and forestry management.
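
    The per-cell categorization step can be pictured as nearest-centroid classification of each cell's multiband spectral vector. The Python toy below shows that idea only; the band values and class centroids are invented, not Georgia project data.

      import numpy as np

      centroids = {"water":  [20, 15, 10,  5],
                   "forest": [30, 25, 60, 40],
                   "urban":  [60, 55, 50, 45]}
      names = list(centroids)
      C = np.array([centroids[n] for n in names], dtype=float)

      pixels = np.array([[22, 16, 12, 6],      # cells to classify (4 bands each)
                         [58, 50, 52, 44]], dtype=float)
      d = ((pixels[:, None, :] - C[None, :, :]) ** 2).sum(-1)   # squared distances
      for p, k in zip(pixels, d.argmin(axis=1)):
          print(p, "->", names[k])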

  13. PlotXY: A High Quality Plotting System for the Herschel Interactive Processing Environment (HIPE) and the Astronomical Community

    NASA Astrophysics Data System (ADS)

    Panuzzo, P.; Li, J.; Caux, E.

    2012-09-01

    The Herschel Interactive Processing Environment (HIPE) was developed by the European Space Agency (ESA), in collaboration with NASA and the Herschel Instrument Control Centres, to provide the astronomical community a complete environment to process and analyze the data gathered by the Herschel Space Observatory. One of the most important components of HIPE is the plotting system (named PlotXY) that we present here. With PlotXY it is possible to easily produce high-quality, publication-ready 2D plots. It provides a long list of features, with fully configurable components and interactive zooming. The entire code of HIPE is written in Java and is open source, released under the GNU Lesser General Public License version 3. A new version of PlotXY is being developed to be independent of the HIPE code base; it is available to the software development community for inclusion in other projects at the URL http://code.google.com/p/jplot2d/.

  14. AMCC casting development. Volume 1: Executive Summary

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Advanced Combustion Chamber Casting (AMCC) has been a technically challenging part due to its size, configuration, and alloy type. The height and weight of the wax pattern assembly necessitated the development of a hollow gating system to ensure structural integrity of the shell throughout the investment process. The complexity in the jacket area of the casting required the development of an innovative casting technology that PCC has termed 'TGC', or Thermal Gradient Control. This method of setting up thermal gradients in the casting during solidification represents a significant process improvement for PCC and has been successfully implemented on other programs. Metallurgical integrity of the final four castings was very good. Only the areas of the parts that utilized 'TGC Shape & Location System #2' showed any significant areas of microshrinkage when evaluated by non-destructive tests. Alumina oxides detected by FPI on the 'float' surfaces (top side surfaces of the casting during solidification) of the parts were almost entirely less than the acceptance criterion of .032 inches in diameter. Destructive chem-mill of the castings was required to determine the effect of the process variables used during the processing of these last four parts (with the exception of the 'Shape & Location of TGC' variable).

  15. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable, and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be below 4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.
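
    The recovery figures quoted above follow from simple spike-recovery arithmetic. As a minimal sketch in Python, with hypothetical concentration readings (none of the numbers below come from the paper):

    ```python
    # Spike recovery: how much of a known added amount the method reports back.
    def recovery_percent(measured_spiked, measured_unspiked, spiked_amount):
        return 100.0 * (measured_spiked - measured_unspiked) / spiked_amount

    # Hypothetical example: a fruit headspace reading of 0.52 ppm after spiking
    # 0.50 ppm of ethylene into a sample that read 0.03 ppm on its own.
    print(f"{recovery_percent(0.52, 0.03, 0.50):.1f}%")  # -> 98.0%
    ```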

  16. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    PubMed

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable, and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be below 4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  17. [The Balanced Scorecard as a management tool in a public health organization].

    PubMed

    Villalbí, Joan R; Villalbí, Joan; Guix, Joan; Casas, Conrad; Borrell, Carme; Duran, Júlia; Artazcoz, Lucía; Camprubí, Esteve; Cusí, Meritxell; Rodríguez-Montuquín, Pau; Armengol, Josep M; Jiménez, Guy

    2007-01-01

    The Balanced Scorecard is a tool for strategic planning in business. We present our experience after introducing this instrument in a public health agency to align daily management practice with strategic objectives. Our management team required deep discussions, with external support, to clarify the concepts behind the Balanced Scorecard, adapt them to a public organization in the health field distinct from the business sector in which the Balanced Scorecard was designed, and adopt this instrument as a management tool. This process led to the definition of the Balanced Scorecard by our Management Committee in 2002, the subsequent evaluation of the degree to which its objectives had been reached, and its periodic redefinition. In addition, second-level Balanced Scorecards were defined for different divisions and services within the agency. The adoption of the Balanced Scorecard by the management team required a prior effort to clarify who the stakeholders and the clients of a public health organization are. The agency's activity and production were also analyzed and a key processes model was defined. Although it is hard to attribute specific changes to a single cause, we believe several improvements in management can be ascribed, at least in part, to the use of the Balanced Scorecard. The systematic use of the Balanced Scorecard produced greater cohesion in the management team and the entire organization and brought the strategic objectives closer to daily management operations. The organization is more attentive to its clients, has taken steps to improve its most complex cross-sectional processes, and has developed further actions for the development and growth of its officers and its entire personnel. At the same time, its management team is more in tune with the needs of the agency's administrative bodies that compose its governing board.

  18. Sohbrit: Autonomous COTS System for Satellite Characterization

    NASA Astrophysics Data System (ADS)

    Blazier, N.; Tarin, S.; Wells, M.; Brown, N.; Nandy, P.; Woodbury, D.

    As technology continues to improve, driving down the cost of commercial astronomical products while increasing their capabilities, manpower to run observations has become the limiting factor in acquiring continuous and repeatable space situational awareness data. Sandia National Laboratories set out to automate a testbed composed entirely of commercial off-the-shelf (COTS) hardware for space object characterization (SOC), focusing on satellites in geosynchronous orbit. Using an entirely autonomous system allows collection parameters such as target illumination and nightly overlap to be accounted for routinely; this enables repeatable development of target light curves to establish patterns of life in a variety of spectral bands. The system, known as Sohbrit, is responsible for autonomously creating an optimized schedule, checking the weather, opening the observatory dome, aligning and focusing the telescope, executing the schedule by slewing to each target and imaging it in a number of spectral bands (e.g., B, V, R, I, wide-open) via a filter wheel, closing the dome at the end of observations, processing the data, and storing/disseminating the data for exploitation via the web. Sohbrit must handle various situations such as weather outages and focus changes due to temperature shifts and optical seeing variations without human interaction. Sohbrit can collect large volumes of data nightly due to its high level of automation. To store and disseminate these large quantities of data, we utilize a cloud-based big data architecture called Firebird, which exposes the data to the community for use by developers and analysts. Sohbrit is the first COTS system we are aware of to automate the full process of multispectral geosynchronous characterization, from scheduling all the way to processed, disseminated data. In this paper we discuss design decisions and issues encountered and overcome during implementation, and show results produced by Sohbrit.
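
    The abstract effectively specifies a nightly control loop. The Python sketch below captures its shape; every interface name here is hypothetical, since the actual Sohbrit software interfaces are not published in this abstract:

    ```python
    FILTERS = ["B", "V", "R", "I", "wide-open"]       # the bands named above

    def run_night(schedule, weather, dome, telescope, camera, archive):
        """One autonomous night: open, observe, close -- no human in the loop."""
        dome.open()
        telescope.align_and_focus()
        for target in schedule:                       # precomputed optimized schedule
            if not weather.is_clear():                # weather outages end the night
                break
            telescope.slew_to(target)
            for band in FILTERS:                      # multispectral light-curve points
                camera.set_filter(band)
                frame = camera.expose(target.exposure_s)
                archive.store(target, band, frame)    # later disseminated via the web
        dome.close()
    ```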

  19. [Biotechnology in perspective].

    PubMed

    Brand, A

    1990-06-15

    Biotechnology is a collective term for a large number of manipulations of biological material. Fields of importance in livestock farming include: (1) manipulation of reproductive processes; (2) genetic manipulation of macro-organisms (farm animals) and micro-organisms; and (3) manipulation of metabolism. Integrating biotechnological findings into breeding-stock farming has repercussions in several fields, such as the relationships between producers and the ancillary and processing industries, service industries, consumers, and society as a whole. The use of biotechnological findings will also require further automation and adaptation of farm management. Biotechnology opens up a new area and new prospects for farm animal husbandry. These can only be regarded as positive when they take the sustainable development of the entire sector into account.

  20. The spatial and temporal organization of ubiquitin networks

    PubMed Central

    Grabbe, Caroline; Husnjak, Koraljka; Dikic, Ivan

    2013-01-01

    In the past decade, the diversity of signals generated by the ubiquitin system has emerged as a dominant regulator of biological processes and propagation of information in the eukaryotic cell. A wealth of information has been gained about the crucial role of spatial and temporal regulation of ubiquitin species of different lengths and linkages in the nuclear factor-κB (NF-κB) pathway, endocytic trafficking, protein degradation and DNA repair. This spatiotemporal regulation is achieved through sophisticated mechanisms of compartmentalization and sequential series of ubiquitylation events and signal decoding, which control diverse biological processes not only in the cell but also during the development of tissues and entire organisms. PMID:21448225

  1. Teaching by research at undergraduate schools: an experience

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.

    1997-12-01

    In this communication I report on a pedagogical experience undertaken in the 1995 Image Processing class of the Applied Physics course at the University of Minho. The learning process always requires the active, critical participation of the student in an essentially personal experience that should and must be rewarding and fulfilling. To us scientists, virtually nothing gives more pleasure and fulfillment than the research process. Furthermore, it is our main way to improve our, and I stress our, knowledge. Thus I decided to center my undergraduate students' learning of the basics of digital image processing on a simple applied research program. The proposed project was to develop an inspection process to be introduced in a generic production line: measuring the transverse distance between an object and the edge of the conveyor belt on which it is transported. The proposed measurement method was optical triangulation combined with shadow analysis. The students were given almost complete liberty and responsibility. I limited myself to assessing the development of the project, orienting the students and pointing out different or pertinent points of view only when strictly necessary.
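
    For readers unfamiliar with the measurement principle, here is a worked sketch of the basic optical-triangulation relation under the usual pinhole-camera assumptions; the numbers are invented for illustration and do not come from the report:

    ```python
    # Range z from a camera/illuminator pair with baseline B: z = B * f / d,
    # where f is the focal length and d the offset of the spot on the sensor.
    def triangulated_range(baseline_m, focal_mm, offset_mm):
        return baseline_m * focal_mm / offset_mm

    # 10 cm baseline, 16 mm lens, spot displaced 0.8 mm on the image plane:
    print(f"{triangulated_range(0.10, 16.0, 0.8):.2f} m")  # -> 2.00 m
    ```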

  2. Are visual peripheries forever young?

    PubMed

    Burnat, Kalina

    2015-01-01

    The paper presents a concept of lifelong plasticity of peripheral vision. Central vision processing is accepted as critical and irreplaceable for normal perception in humans. While peripheral processing chiefly carries information about motion stimuli features and redirects foveal attention to new objects, it can also take over functions typical for central vision. Here I review the data showing the plasticity of peripheral vision found in functional, developmental, and comparative studies. Even though it is well established that afferent projections from central and peripheral retinal regions are not established simultaneously during early postnatal life, central vision is commonly used as a general model of development of the visual system. Based on clinical studies and visually deprived animal models, I describe how central and peripheral visual field representations separately rely on early visual experience. Peripheral visual processing (motion) is more affected by binocular visual deprivation than central visual processing (spatial resolution). In addition, our own experimental findings show the possible recruitment of coarse peripheral vision for fine spatial analysis. Accordingly, I hypothesize that the balance between central and peripheral visual processing, established in the course of development, is susceptible to plastic adaptations during the entire life span, with peripheral vision capable of taking over central processing.

  3. Early development of the circumferential axonal pathway in mouse and chick spinal cord.

    PubMed

    Holley, J A

    1982-03-10

    The early development of the circumferential axonal pathway in the brachial and lumbar spinal cord of mouse and chick embryos was studied by scanning and transmission electron microscopy. The cellular processes which comprise this pathway grow in the transverse plane and along the lateral margin of the marginal zone (i.e., circumferentially oriented), as typified by the early embryonic commissural axons. The first formative event observed was in the ventrolateral margin of the primitive spinal cord ventricular zone. Cellular processes were found near the external limiting membrane that appeared to grow a variable distance either dorsally or ventrally. Later in development, presumptive motor column neurons migrated into the ventrolateral region, distal to these early circumferentially oriented processes. Concurrently, other circumferentially oriented perikarya and processes appeared along the dorsolateral margin. Due to their aligned sites of origin and parallel growth, the circumferential processes formed a more or less continuous line or pathway, which in about 10% of the scanned specimens could be followed along the entire lateral margin of the embryonic spinal cord. Several specimens later in development had two sets of aligned circumferential processes in the ventral region. Large numbers of circumferential axons were then found to follow the preformed pathway by fasciculation, after the primitive motor column had become established. Since the earliest circumferential processes appeared to differentiate into axons and were found nearly 24 hours prior to growth of most circumferential axons, their role in guidance as pioneering axons was suggested.

  4. Fetal programming and environmental exposures ...

    EPA Pesticide Factsheets

    Fetal programming is an enormously complex process that relies on numerous environmental inputs from uterine tissue, the placenta, the maternal blood supply, and other sources. Recent evidence has made clear that the process is not based entirely on genetics, but rather on a delicate series of interactions between genes and the environment. It is likely that epigenetic ("above the genome") changes are responsible for modifying gene expression in the developing fetus, and these modifications can have long-lasting health impacts. Determining which epigenetic regulators are most vital in embryonic development will improve pregnancy outcomes and our ability to treat and prevent disorders that emerge later in life. "Fetal Programming and Environmental Exposures: Implications for Prenatal Care and Preterm Birth" began with a keynote address by Frederick vom Saal, who explained that low-level exposure to endocrine disrupting chemicals (EDCs) perturbs hormone systems in utero and can have negative effects on fetal development. vom Saal presented data on the EDC bisphenol A (BPA), an estrogen-mimicking compound found in many plastics. He suggested that low-dose exposure to EDCs can alter the development process and enhance the chances of acquiring adult diseases, such as breast cancer and diabetes, and even developmental disorders such as attention deficit hyperactivity disorder (ADHD).

  5. Neurokernel: An Open Source Platform for Emulating the Fruit Fly Brain

    PubMed Central

    2016-01-01

    We have developed an open software platform called Neurokernel for collaborative development of comprehensive models of the brain of the fruit fly Drosophila melanogaster and their execution and testing on multiple Graphics Processing Units (GPUs). Neurokernel provides a programming model that capitalizes upon the structural organization of the fly brain into a fixed number of functional modules to distinguish between these modules' local information processing capabilities and the connectivity patterns that link them. By defining mandatory communication interfaces that specify how data is transmitted between models of each of these modules regardless of their internal design, Neurokernel explicitly enables multiple researchers to collaboratively model the fruit fly's entire brain by integrating their independently developed models of its constituent processing units. We demonstrate the power of Neurokernel's model integration by combining independently developed models of the retina and lamina neuropils in the fly's visual system and by demonstrating their neuroinformation processing capability. We also illustrate Neurokernel's ability to take advantage of direct GPU-to-GPU data transfers with benchmarks that demonstrate scaling of Neurokernel's communication performance, both over the number of interface ports exposed by an emulation's constituent modules and over the total number of modules in an emulation. PMID:26751378
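
    The integration idea is easy to illustrate: independent models exchange data only through declared ports. The toy Python below mimics that concept only; it is not Neurokernel's actual API:

    ```python
    class Module:
        """One functional unit (e.g., retina, lamina) with declared ports."""
        def __init__(self, name, in_ports, out_ports):
            self.name, self.in_ports, self.out_ports = name, in_ports, out_ports
            self.inbox = {}

        def step(self):
            """Local processing; each lab overrides this with its own model."""
            return {port: 0.0 for port in self.out_ports}

    def run(modules, wiring, steps=10):
        """wiring maps (src_module, src_port) -> (dst_module, dst_port)."""
        by_name = {m.name: m for m in modules}
        for _ in range(steps):
            outputs = {m.name: m.step() for m in modules}
            for (src, sp), (dst, dp) in wiring.items():
                by_name[dst].inbox[dp] = outputs[src][sp]

    retina = Module("retina", [], ["R1"])
    lamina = Module("lamina", ["R1"], [])
    run([retina, lamina], {("retina", "R1"): ("lamina", "R1")})
    ```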

  6. The study of develop optimization to control various resist defect in Photomask fabrication

    NASA Astrophysics Data System (ADS)

    Lim, JongHoon; Kim, ByungJu; Son, JaeSik; Park, EuiSang; Kim, SangPyo; Yim, DongGyu

    2015-07-01

    Reducing pattern sizes on photomasks is an inevitable trend driven by the minimization of chip size, and it makes defect control a major challenge for the photomask industry. Defects below a certain size that were not a problem at previous technology nodes are becoming an issue as patterns get smaller. Therefore, the acceptable tolerance levels for defect size and quantity have been dramatically reduced. Because defects on a photomask can be sources of repeating defects on the wafer, defects smaller than 200 nm can no longer be ignored. Generally, most defects are generated during the develop and etch processes. It is especially difficult to find the root cause of defects formed during the develop process because of their various types and very small size. In this paper, we studied how these small defects can be eliminated by analyzing the defects and tuning the develop process. There are three types of resist defects, named as follows. The first is the 'popcorn' defect, which mainly occurs in negative resist and appears on dark features. The second is the 'frog eggs' defect, which occurs in the second process of HTPSM and appears in wide space areas. The last is the 'spot' defect, which also appears in wide space areas. These defects generally appear over the entire area of a plate, typically numbering several hundred. The original source is thought to be the surface's hydrophilic state before the develop process, or an incompatibility between resist and developer. This study shows that optimizing the develop process can be a good solution for some resist defects.

  7. Equalizer design techniques for dispersive cables with application to the SPS wideband kicker

    NASA Astrophysics Data System (ADS)

    Platt, Jason; Hofle, Wolfgang; Pollock, Kristin; Fox, John

    2017-10-01

    A wide-band vertical instability feedback control system in development at CERN requires 1-1.5 GHz of bandwidth for the entire processing chain, from the beam pickups through the feedback signal digital processing to the back-end power amplifiers and kicker structures. Dispersive effects in cables, amplifiers, pickup and kicker elements can result in distortions in the time-domain signal as it proceeds through the processing system, and deviations from linear phase response reduce the allowable bandwidth for the closed-loop feedback system. We have developed an analog equalizer circuit that compensates for these dispersive effects. Here we present a design technique for the construction of an analog equalizer that incorporates the effect of parasitic circuit elements in the equalizer to increase the fidelity of the implemented equalizer. Finally, we show results from the measurement of an assembled back-end equalizer that corrects for dispersive elements in the cables over a bandwidth of 10-1000 MHz.
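
    A numerical sketch of the underlying design problem may help: measure a transfer function, find its deviation from linear phase, and invert it. The first-order response below is a stand-in for a real dispersive cable; only the 10-1000 MHz band is taken from the abstract:

    ```python
    import numpy as np

    f = np.linspace(10e6, 1000e6, 1000)          # the 10-1000 MHz band above
    H_cable = 1.0 / (1 + 1j * f / 600e6)         # toy dispersive response

    phase = np.unwrap(np.angle(H_cable))
    slope, intercept = np.polyfit(f, phase, 1)   # best-fit linear phase
    deviation = phase - (slope * f + intercept)  # what the equalizer must undo

    H_eq = 1.0 / H_cable                         # ideal equalizer, pre-parasitics
    print(f"peak deviation from linear phase: "
          f"{np.degrees(np.abs(deviation)).max():.1f} deg")
    ```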

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objective of the contract is to consolidate the advances made during the previous contract in the conversion of syngas to motor fuels using Molecular Sieve-containing catalysts and to demonstrate the practical utility and economic value of the new catalyst/process systems with appropriate laboratory runs. Work on the program is divided into the following six tasks: (1) preparation of a detailed work plan covering the entire performance of the contract; (2) techno-economic studies that will supplement those presently being carried out by MITRE; (3) optimization of the most promising catalysts developed under the prior contract; (4) optimization of the UCC catalyst system in a manner that will give it the longest possible service life; (5) optimization of a UCC process/catalyst system based upon a tubular reactor with a recycle loop containing the most promising catalyst developed under the Task 3 and 4 studies; and (6) economic evaluation of the optimal performance found under Task 5 for the UCC process/catalyst system. Progress reports are presented for Tasks 1, 3, 4, and 5.

  9. Optimum random and age replacement policies for customer-demand multi-state system reliability under imperfect maintenance

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Luan; Chang, Chin-Chih; Sheu, Dwan-Fang

    2016-04-01

    This paper proposes the generalised random and age replacement policies for a multi-state system composed of multi-state elements. The degradation of the multi-state element is assumed to follow the non-homogeneous continuous time Markov process which is a continuous time and discrete state process. A recursive approach is presented to efficiently compute the time-dependent state probability distribution of the multi-state element. The state and performance distribution of the entire multi-state system is evaluated via the combination of the stochastic process and the Lz-transform method. The concept of customer-centred reliability measure is developed based on the system performance and the customer demand. We develop the random and age replacement policies for an aging multi-state system subject to imperfect maintenance in a failure (or unacceptable) state. For each policy, the optimum replacement schedule which minimises the mean cost rate is derived analytically and discussed numerically.
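
    In the time-homogeneous special case, the state probabilities of such a continuous-time, discrete-state process follow from the generator matrix via the matrix exponential, p(t) = p(0) exp(Qt). A minimal Python sketch with an invented 3-state generator (not the paper's recursive algorithm, which targets the same quantity for the non-homogeneous case):

    ```python
    import numpy as np
    from scipy.linalg import expm

    Q = np.array([[-0.10,  0.08,  0.02],   # perfect -> degraded / failed
                  [ 0.00, -0.05,  0.05],   # degraded -> failed
                  [ 0.00,  0.00,  0.00]])  # failed is absorbing
    p0 = np.array([1.0, 0.0, 0.0])         # element starts as-good-as-new

    for t in (10, 50, 100):
        print(t, p0 @ expm(Q * t))         # state distribution at time t
    ```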

  10. The economical utilization of geothermal energy

    NASA Astrophysics Data System (ADS)

    Rose, G.

    1982-12-01

    The geothermal energy stored in hot dry rock could theoretically be utilized for the generation of power. The hot-dry-rock procedure can provide a flow of hot water, and the considered binary system can transform the obtained thermal energy into electrical energy. The system makes use of a Rankine cycle with a working fluid having a low boiling point; heat from the hot water is transferred to the working fluid. The present investigation is concerned with the development of a method for the calculation of the entire process. The results obtained with the computational method are to provide a basis for the determination of the operational characteristics. The developed method is used for the study of a process based on the use of carbon dioxide as the working fluid. The economics of using the hot-dry-rock process with the binary system is also investigated. It is found that the considered procedure is not economical; economical operation requires, in particular, hot water supplied at a much lower cost.
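
    The scale of such a calculation can be sketched in a few lines. All numbers below are illustrative, not the study's values:

    ```python
    # Back-of-envelope: electric power from a hot-dry-rock water loop feeding
    # a low-boiling-point binary (Rankine) cycle.
    water_flow_kg_s = 50.0        # hot water delivered by the wells
    cp_water = 4186.0             # J/(kg K)
    T_in, T_out = 180.0, 80.0     # degC across the heat exchanger
    cycle_efficiency = 0.12       # plausible for a low-temperature binary plant

    heat_W = water_flow_kg_s * cp_water * (T_in - T_out)
    print(f"thermal: {heat_W / 1e6:.1f} MW, "
          f"electric: {heat_W * cycle_efficiency / 1e6:.1f} MW")
    ```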

  11. Description and Evaluation of Chiral Interactive Sites on Bonded Cyclodextrin Stationary Phases for Liquid Chromatography

    NASA Astrophysics Data System (ADS)

    Beesley, Thomas E.

    Development of chiral separations has been essential to the drug discovery and development process. The solubility requirements for a number of methods and/or the mobile phase requirements for application of certain detection systems have opened up many opportunities for cyclodextrin-based CSPs for liquid chromatography. Even though a few chiral stationary phases cover a wide area of enantioselectivity, they do not meet the entire needs of the industry. Cyclodextrin phases offer some unique mechanisms and opportunities to resolve chiral separation problems especially in the aqueous reversed-phase and non-aqueous polar organic modes. This chapter addresses the need to understand the chiral stationary phase structure, the mechanisms at work, and the role mobile phase composition plays in driving those mechanisms to produce enantioselectivity. In addition, the development of certain derivatives has played an essential part in expanding that basic role for certain chiral separations. What these derivatives contribute in concert with the basic structure is a critical part of the understanding to the effective use of these phases. During this study it was determined that the role of steric hindrance has been vastly underestimated, both to the extent that it has occurred and to its effectiveness for obtaining enantioselectivity. References to the entire 20-year history of the cyclodextrin phase development and application literature up to this current date have been reviewed and incorporated.

  12. Processing Satellite Images on Tertiary Storage: A Study of the Impact of Tile Size on Performance

    NASA Technical Reports Server (NTRS)

    Yu, JieBing; DeWitt, David J.

    1996-01-01

    Before raw data from a satellite can be used by an Earth scientist, it must first undergo a number of processing steps including basic processing, cleansing, and geo-registration. Processing actually expands the volume of data collected by a factor of 2 or 3, and the original data is never deleted. Thus processing and storage requirements can exceed 2 terabytes/day. Once processed data is ready for analysis, a series of algorithms (typically developed by the Earth scientists) is applied to a large number of images in a data set. The focus of this paper is how best to handle such images stored on tape using the following assumptions: (1) all images of interest to a scientist are stored on a single tape, (2) images are accessed and processed in the order that they are stored on tape, and (3) the analysis requires access to only a portion of each image and not the entire image.
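
    The trade-off behind assumption (3) is easy to quantify: larger tiles mean fewer tape reads but more unused pixels per request. A minimal cost sketch in Python, with illustrative sizes (and the tile-aligned best case, for brevity):

    ```python
    import math

    def tiles_touched(w, h, tile):
        """Tiles a w x h pixel subregion overlaps (tile-aligned best case)."""
        return math.ceil(w / tile) * math.ceil(h / tile)

    region = 1_000                # scientist wants a 1000 x 1000 cutout
    for tile in (256, 512, 1024, 2048):
        n = tiles_touched(region, region, tile)
        read = n * tile * tile    # pixels actually pulled off tape
        waste = 100 * (read - region ** 2) / read
        print(f"tile={tile:4d}: {n:3d} tiles read, {waste:.0f}% wasted I/O")
    ```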

  13. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    NASA Astrophysics Data System (ADS)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-05-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.
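
    The analysis described is, at heart, a multilevel (mixed-effects) regression with modellers as the grouping factor. A sketch of that kind of model in Python on synthetic data (the variable names mirror the constructs above, but nothing here is the study's data):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "modeller": np.repeat(np.arange(20), 5),   # 20 modellers x 5 models each
        "relational_integration": rng.normal(size=100),
    })
    df["quality"] = 0.4 * df["relational_integration"] + rng.normal(size=100)

    fit = smf.mixedlm("quality ~ relational_integration", df,
                      groups="modeller").fit()
    print(fit.summary())                           # fixed effect ~ 0.4
    ```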

  14. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    PubMed Central

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  15. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  16. Positive results of clinical educational support in situations of psychological distress.

    PubMed

    Tavormina, Dominique

    2014-11-01

    Education is a complex process that involves the individual over the course of his entire life and leads to the maturation and overall development of his personality. The educational process involves the complete growth of each person and develops the infinite potential that every child possesses from birth. Education is also a necessity for the human being, as only adequate environmental stimulation causes the mental processes to begin. In fact, the higher intellectual functions, such as language, thought, and memory, emerge only from the social and educational experiences of the child. Educational intervention creates experiences and learning that allow the person to change by improving the efficiency of synaptic connections. Clinical pedagogy developed in Italy in the last decades of the twentieth century with the aim of researching and experimenting with educational approaches suitable for different situations, in order to provide each subject with appropriate development opportunities. Clinical pedagogical support is offered in the form of artistic or bodily activities and represents for the individual a positive environment that allows the development of different brain areas and the potential inherent in them. The various methods are suitable for any situation of existential discomfort, understood as a moment of personal growth.

  17. Access NASA Satellite Global Precipitation Data Visualization on YouTube

    NASA Technical Reports Server (NTRS)

    Liu, Z.; Su, J.; Acker, J.; Huffman, G.; Vollmer, B.; Wei, J.; Meyer, D.

    2017-01-01

    Since the satellite era began, NASA has collected a large volume of Earth science observations for research and applications around the world. The collected and archived satellite data at 12 NASA data centers can also be used for STEM education and activities around topics such as disaster events and climate change. However, accessing satellite data can be a daunting task for non-professional users such as teachers and students because of unfamiliarity with terminology, disciplines, data formats, data structures, computing resources, processing software, programming languages, etc. Over the years, many efforts including tools, training classes, and tutorials have been developed to improve satellite data access, but barriers still exist for non-professionals. In this presentation, we present our latest activity, which uses the very popular online video-sharing web site YouTube (https://www.youtube.com/) for accessing visualizations of our global precipitation datasets at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC). With YouTube, users can access and visualize a large volume of satellite data without needing to learn new software or download data. The dataset in this activity is a one-month animation of the GPM (Global Precipitation Measurement) Integrated Multi-satellite Retrievals for GPM (IMERG). IMERG provides precipitation with near-global (60 deg. N-S) coverage at a half-hourly time interval, providing more detail on precipitation processes and development than the 3-hourly TRMM (Tropical Rainfall Measuring Mission) Multisatellite Precipitation Analysis (TMPA, 3B42) product. When the retro-processing of IMERG over the TRMM era is finished in 2018, the full animation will be built from more than 330,000 files and will last 3.6 hours. Future plans include development of flyover videos of orbital data for an entire satellite mission or project. All videos, including the one-month animation, will be uploaded and available at the GES DISC site on YouTube (https://www.youtube.com/user/NASAGESDISC).
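
    The quoted figures are internally consistent, which is worth a quick check (assuming one IMERG file per video frame, which the abstract implies but does not state):

    ```python
    files = 330_000                  # half-hourly IMERG files, TRMM era onward
    video_seconds = 3.6 * 3600       # the 3.6-hour animation quoted above
    print(f"{files / video_seconds:.1f} frames per second")   # ~25.5 fps
    print(f"{files / 48 / 365.25:.1f} years of data")         # 48 files per day
    ```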

  18. Telemedicine utilization to support the management of the burns treatment involving patient pathways in both developed and developing countries: a case study.

    PubMed

    Syed-Abdul, Shabbir; Scholl, Jeremiah; Chen, Chiehfeng Cliff; Santos, Martinho D P S; Jian, Wen-Shan; Liou, Der-Ming; Li, Yu-Chuan

    2012-01-01

    This case study reports on the utilization of telemedicine to support the management of burn treatment in the islands of Sao Tome and Principe by a Taipei Medical University-affiliated hospital in Taiwan. The authors share experiences with the use of telemedicine to support the treatment of burn victims from a low-income country who receive reconstructive surgery in a developed country. Throughout the entire care process, telemedicine was used not only to provide expert advice from a distance but also to help establish and maintain the doctor-patient relationship, to keep patients in contact with their families, and to help educate and consult the medical personnel physically present in Sao Tome and Principe. This case study presents the details of how this process has been conducted to date, what was learned from this process, and issues that should be considered to improve this process in the future. The authors plan to create instructional videos and post them on YouTube to aid clinical workers providing similar treatment during the acute care and rehabilitation process, and also to support eLearning in the many situations where it is otherwise not possible to use videoconferencing to establish real-time contact between doctors at the local site and remote specialists.

  19. The Palomar Transient Factory: High Quality Realtime Data Processing in a Cost-Constrained Environment

    NASA Astrophysics Data System (ADS)

    Surace, J.; Laher, R.; Masci, F.; Grillmair, C.; Helou, G.

    2015-09-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky, primarily at a single wavelength (R-band), at a rate of 1000-3000 square degrees a night. The data are used to detect and study transient and moving objects such as gamma ray bursts, supernovae, and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system at IPAC handles realtime processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. This was developed under an extremely limited budget profile in an unusually agile development environment. Here we discuss the mechanics of this system and our overall development approach. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST), a major NSF facility scheduled to begin operations in the early 2020s.

  20. Pelvis of Gargoyleosaurus (Dinosauria: Ankylosauria) and the Origin and Evolution of the Ankylosaur Pelvis

    PubMed Central

    Carpenter, Kenneth; DiCroce, Tony; Kinneer, Billy; Simon, Robert

    2013-01-01

    Discovery of a pelvis attributed to the Late Jurassic armor-plated dinosaur Gargoyleosaurus sheds new light on the origin of the peculiar non-vertical, broad, flaring pelvis of ankylosaurs. It further substantiates separation of the two ankylosaurs from the Morrison Formation of the western United States, Gargoyleosaurus and Mymoorapelta. Although horizontally oriented and lacking the medial curve of the preacetabular process seen in Mymoorapelta, the new ilium shows little of the lateral flaring seen in the pelvis of Cretaceous ankylosaurs. Comparison with the basal thyreophoran Scelidosaurus demonstrates that the ilium in ankylosaurs did not develop entirely by lateral rotation as is commonly believed. Rather, the preacetabular process rotated medially and ventrally and the postacetabular process rotated in opposition, i.e., lateral and ventrally. Thus, the dorsal surfaces of the preacetabular and postacetabular processes are not homologous. In contrast, a series of juvenile Stegosaurus ilia show that the postacetabular process rotated dorsally ontogenetically. Thus, the pelvis of the two major types of Thyreophora most likely developed independently. Examination of other ornithischians show that a non-vertical ilium had developed independently in several different lineages, including ceratopsids, pachycephalosaurs, and iguanodonts. Therefore, a separate origin for the non-vertical ilium in stegosaurs and ankylosaurs does have precedent. PMID:24244573

  1. Building emotional resilience over 14 sessions of emotion focused therapy: Micro-longitudinal analyses of productive emotional patterns.

    PubMed

    Pascual-Leone, A; Yeryomenko, N; Sawashima, T; Warwar, S

    2017-05-04

    Pascual-Leone and Greenberg's sequential model of emotional processing has been used to explore therapeutic process in over 24 studies. This line of research shows that emotional processing in good psychotherapy often follows a sequential order, supporting a saw-toothed pattern of change within individual sessions (progressing "2-steps-forward, 1-step-back"). However, one cannot assume that local in-session patterns are scalable across an entire course of therapy. Thus, the primary objective of this exploratory study was to consider how the sequential patterns identified by Pascual-Leone may apply across entire courses of treatment. Intensive emotion coding from two separate single-case designs was submitted for quantitative analyses of longitudinal patterns. Comprehensive coding in these cases involved recording observations for every emotional event in an entire course of treatment (using the Classification of Affective-Meaning States), which were then treated as a 9-point ordinal scale. Applying multilevel modeling to each of the two cases showed significant patterns of change over a large number of sessions, and those patterns were either nested at the within-session level or observed at the broader session-by-session level of change. Examining successful treatment cases showed several theoretically coherent kinds of temporal patterns, although not always in the same case. Clinical or methodological significance of this article: This is the first paper to demonstrate systematic temporal patterns of emotion over the course of an entire treatment. (1) The study offers a proof of concept that longitudinal patterns in the micro-processes of emotion can be objectively derived and quantified. (2) It also shows that patterns in emotion may be identified on the within-session level, as well as the session-by-session level of analysis. (3) Finally, the observed processes over time support the ordered pattern of emotional states hypothesized in Pascual-Leone and Greenberg's (2007) model of emotional processing.

  2. Limits of thermochemical and photochemical syntheses of gaseous fuels: a finite-time thermodynamic analysis. Annual report, September 1983-February, 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, R.S.

    The objectives of this project are to develop methods for the evaluation of syntheses of gaseous fuels in terms of their optimum possible performance, particularly when they are required to supply those fuels at nonzero rates. The first objective is entirely in the tradition of classical thermodynamics: to set limits on the performance of processes, given the characteristics and constraints that define them. The new element which this project introduces is the capability to set limits more realistic than those from classical thermodynamics, by including the influence of the rate or duration of a process on its performance. The development of these analyses is a natural step in the evolution represented by the evaluative papers of Appendix IV, e.g., by Funk et al., Abraham, Shinnar, Bilgen and Fletcher. A second objective is to determine how any given process should be carried out, within its constraints, in order to yield its optimum performance, and to use this information whenever possible to help guide the design of that process.
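
    The flavor of a finite-time limit can be shown in one formula: the classic Curzon-Ahlborn efficiency of a heat engine operated at maximum power, eta = 1 - sqrt(Tc/Th), which always sits below the zero-rate Carnot bound 1 - Tc/Th. The temperatures below are illustrative:

    ```python
    import math

    Th, Tc = 800.0, 300.0   # hot and cold reservoir temperatures, kelvin
    print(f"Carnot (zero rate):         {1 - Tc / Th:.3f}")             # 0.625
    print(f"Curzon-Ahlborn (max power): {1 - math.sqrt(Tc / Th):.3f}")  # 0.388
    ```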

  3. The JCSG high-throughput structural biology pipeline.

    PubMed

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  4. Process for preparing superconducting film having substantially uniform phase development

    DOEpatents

    Bharacharya, Raghuthan; Parilla, Philip A.; Blaugher, Richard D.

    1995-01-01

    A process for preparing a superconducting film, such as a thallium-barium-calcium-copper oxide superconducting film, having substantially uniform phase development. The process comprises providing an electrodeposition bath having one or more soluble salts of one or more respective potentially superconducting metals in respective amounts adequate to yield a superconducting film upon subsequent appropriate treatment. Should all of the metals required for producing a superconducting film not be made available in the bath, such metals can be a part of the ambient during a subsequent annealing process. A soluble silver salt in an amount between about 0.1% and about 4.0% by weight of the provided other salts is also provided to the bath, and the bath is electrically energized to thereby form a plated film. The film is annealed in ambient conditions suitable to cause formation of a superconductor film. Doping with silver reduces the temperature at which the liquid phase appears during the annealing step, initiates a liquid phase throughout the entire volume of deposited material, and influences the nucleation and growth of the deposited material.

  5. Process for preparing superconducting film having substantially uniform phase development

    DOEpatents

    Bharacharya, R.; Parilla, P.A.; Blaugher, R.D.

    1995-12-19

    A process is disclosed for preparing a superconducting film, such as a thallium-barium-calcium-copper oxide superconducting film, having substantially uniform phase development. The process comprises providing an electrodeposition bath having one or more soluble salts of one or more respective potentially superconducting metals in respective amounts adequate to yield a superconducting film upon subsequent appropriate treatment. Should all of the metals required for producing a superconducting film not be made available in the bath, such metals can be a part of the ambient during a subsequent annealing process. A soluble silver salt in an amount between about 0.1% and about 4.0% by weight of the provided other salts is also provided to the bath, and the bath is electrically energized to thereby form a plated film. The film is annealed in ambient conditions suitable to cause formation of a superconductor film. Doping with silver reduces the temperature at which the liquid phase appears during the annealing step, initiates a liquid phase throughout the entire volume of deposited material, and influences the nucleation and growth of the deposited material. 3 figs.

  6. A systems engineering perspective on the human-centered design of health information systems.

    PubMed

    Samaras, George M; Horst, Richard L

    2005-02-01

    The discipline of systems engineering has, over the past five decades, used a structured systematic approach to managing the "cradle to grave" development of products and processes. While elements of this approach are typically used to guide the development of information systems that instantiate a significant user interface, it appears to be rare for the entire process to be implemented. In fact, a number of authors have put forth development lifecycle models that are subsets of the classical systems engineering method but fail to include steps such as incremental hazard analysis and post-deployment corrective and preventative actions. Given that most health information systems have safety implications, we argue that the design and development of such systems would benefit from implementing this systems engineering approach in full. Particularly with regard to bringing a human-centered perspective to the formulation of system requirements and the configuration of effective user interfaces, this classical systems engineering method provides an excellent framework for incorporating human factors (ergonomics) knowledge and integrating ergonomists in the interdisciplinary development of health information systems.

  7. Expanded Processing Techniques for EMI Systems

    DTIC Science & Technology

    2012-07-01

    It is possible to perform better target detection using physics-based algorithms and the entire data set, rather than simulating a simpler data set and mapping... [Figure 4.25: Plots of simulated MetalMapper data for two oblate spheroidal targets]

  8. The longevity of habitable planets and the development of intelligent life

    NASA Astrophysics Data System (ADS)

    Simpson, Fergus

    2017-07-01

    Why did the emergence of our species require a timescale similar to the entire habitable period of our planet? Our late appearance has previously been interpreted by Carter (2008) as evidence that observers typically require a very long development time, implying that intelligent life is a rare occurrence. Here we present an alternative explanation, which simply asserts that many planets possess brief periods of habitability. We also propose that the rate-limiting step for the formation of observers is the enlargement of species from an initially microbial state. In this scenario, the development of intelligent life is a slow but almost inevitable process, greatly enhancing the prospects of future search for extra-terrestrial intelligence (SETI) experiments such as the Breakthrough Listen project.
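
    The selection effect proposed here can be made concrete with a toy Monte Carlo: when the typical emergence time exceeds the habitable window, the surviving cases cluster late in their windows. All distributions and parameters below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    W = 10.0                               # habitable window (arbitrary units)
    T = rng.gamma(4, 3.0, size=1_000_000)  # 4 slow steps; mean total 12 > W

    ok = T < W                             # observers arise only in these runs
    print(f"planets producing observers: {ok.mean():.2f}")
    print(f"median T/W given observers:  {np.median(T[ok]) / W:.2f}")  # well above 0.5
    ```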

  9. Development Of A Web Service And Android 'APP' For The Distribution Of Rainfall Data. A Bottom-Up Remote Sensing Data Mining And Redistribution Project In The Age Of The 'Web 2.0'

    NASA Astrophysics Data System (ADS)

    Mantas, Vasco M.; Pereira, A. J. S. C.; Liu, Zhong

    2013-12-01

    A project was devised to develop a set of freely available applications and web services that can (1) simplify access to TOVAS data from mobile devices and (2) support the development of new datasets through data repackaging and mash-up. The bottom-up approach enables the multiplication of new services, often of limited direct interest to the organizations that produce the original global datasets, but significant to small, local users. Through this multiplication of services, the development cost is transferred to the intermediate or end users and the entire process is made more efficient, even allowing new players to use the data in innovative ways.
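
    As a sketch of the repackaging idea: a thin service fetches a rainfall time series upstream and re-serves it in a mobile-friendly form. The endpoint URL and query parameters below are placeholders, not the real TOVAS interface:

    ```python
    import json
    import urllib.request

    SOURCE = "https://example.org/tovas/timeseries"   # placeholder endpoint

    def rainfall_json(lat, lon, start, end):
        url = f"{SOURCE}?lat={lat}&lon={lon}&start={start}&end={end}"
        with urllib.request.urlopen(url) as resp:     # fetch the upstream series
            raw = resp.read().decode()
        # ...parse the upstream format here, then repackage for the app:
        return json.dumps({"lat": lat, "lon": lon, "series": raw})
    ```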

  10. SEPARATION OF INORGANIC SALTS FROM ORGANIC SOLUTIONS

    DOEpatents

    Katzin, L.I.; Sullivan, J.C.

    1958-06-24

    A process is described for recovering the nitrates of uranium and plutonium from solution in oxygen-containing organic solvents such as ketones or ethers. The solution of such salts dissolved in an oxygen-containing organic compound is contacted with an ion exchange resin whereby sorption of the entire salt on the resin takes place and then the salt-depleted liquid and the resin are separated from each other. The reaction seems to be based on an anion formation of the entire salt by complexing with the anion of the resin. Strong base or quaternary ammonium type resins can be used successfully in this process.

  11. Paper-based microreactor array for rapid screening of cell signaling cascades.

    PubMed

    Huang, Chia-Hao; Lei, Kin Fong; Tsang, Ngan-Ming

    2016-08-07

    Investigation of cell signaling pathways is important for the study of the pathogenesis of cancer. However, the operations involved in these studies are time consuming and labor intensive, which may hamper the development of effective therapeutic strategies. In this work, gel-free cell culture and a subsequent immunoassay have been successfully integrated and conducted in a paper-based microreactor array. Study of the activation level of different kinases of cells stimulated by different conditions, i.e., IL-6 stimulation, starvation, and hypoxia, was demonstrated. Moreover, rapid screening of cell signaling cascades after stimulation with HGF, doxorubicin, and UVB irradiation, respectively, was conducted to simultaneously screen 40 kinases and transcription factors. Activation of multiple signaling pathways could be identified, and the correlation between signaling pathways was discussed to provide further information for investigating the entire signaling network. The present technique integrates most of the tedious operations on a single paper substrate, reduces sample and reagent consumption, and shortens the time required by the entire process. Therefore, it provides a first-tier rapid screening tool for the study of complicated signaling cascades. It is expected that the technique can be developed into a routine protocol for conventional biological research laboratories.

  12. A new clarification method to visualize biliary degeneration during liver metamorphosis in sea lamprey (Petromyzon marinus)

    USGS Publications Warehouse

    Chung-Davidson, Yu-Wen; Davidson, Peter J.; Scott, Anne M.; Walaszczyk, Erin J.; Brant, Cory O.; Buchinger, Tyler; Johnson, Nicholas S.; Li, Weiming

    2014-01-01

    Biliary atresia is a rare disease of infancy, with an estimated 1 in 15,000 frequency in the southeast United States, but more common in East Asian countries, with a reported frequency of 1 in 5,000 in Taiwan. Although much is known about the management of biliary atresia, its pathogenesis is still elusive. The sea lamprey (Petromyzon marinus) provides a unique opportunity to examine the mechanism and progression of biliary degeneration. Sea lamprey develop through three distinct life stages: larval, parasitic, and adult. During the transition from larvae to parasitic juvenile, sea lamprey undergo metamorphosis with dramatic reorganization and remodeling in external morphology and internal organs. In the liver, the entire biliary system is lost, including the gall bladder and the biliary tree. A newly-developed method called “CLARITY” was modified to clarify the entire liver and the junction with the intestine in metamorphic sea lamprey. The process of biliary degeneration was visualized and discerned during sea lamprey metamorphosis by using laser scanning confocal microscopy. This method provides a powerful tool to study biliary atresia in a unique animal model.

  13. A low-frequency near-field interferometric-TOA 3-D Lightning Mapping Array

    NASA Astrophysics Data System (ADS)

    Lyu, Fanchao; Cummer, Steven A.; Solanki, Rahulkumar; Weinert, Joel; McTague, Lindsay; Katko, Alex; Barrett, John; Zigoneanu, Lucian; Xie, Yangbo; Wang, Wenqi

    2014-11-01

    We report on the development of an easily deployable LF near-field interferometric-time of arrival (TOA) 3-D Lightning Mapping Array applied to imaging of entire lightning flashes. An interferometric cross-correlation technique is applied in our system to compute windowed two-sensor time differences with submicrosecond time resolution before TOA is used for source location. Compared to previously reported LF lightning location systems, our system captures many more LF sources. This is due mainly to the improved mapping of continuous lightning processes by using this type of hybrid interferometry/TOA processing method. We show with five station measurements that the array detects and maps different lightning processes, such as stepped and dart leaders, during both in-cloud and cloud-to-ground flashes. Lightning images mapped by our LF system are remarkably similar to those created by VHF mapping systems, which may suggest some special links between LF and VHF emission during lightning processes.
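
    The hybrid step described above reduces, per sensor pair, to locating the peak of a windowed cross-correlation. A synthetic-signal sketch in Python (parameters invented; 1 MHz sampling gives the microsecond-scale resolution mentioned):

    ```python
    import numpy as np

    fs = 1_000_000                        # 1 MHz sampling -> 1 us per sample
    rng = np.random.default_rng(1)
    pulse = rng.normal(size=200)          # stand-in for an LF sferic waveform
    delay = 37                            # true inter-sensor offset, samples

    a = np.concatenate([np.zeros(100), pulse, np.zeros(300)])
    b = np.concatenate([np.zeros(100 + delay), pulse, np.zeros(300 - delay)])

    xc = np.correlate(b, a, mode="full")  # peak lag = time difference of arrival
    lag = np.argmax(xc) - (len(a) - 1)
    print(f"estimated delay: {lag / fs * 1e6:.0f} us")  # -> 37 us
    ```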

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objective of the contract is to consolidate the advances made during the previous contract in the conversion of syngas to motor fuels using Molecular Sieve-containing catalysts and to demonstrate the practical utility and economic value of the new catalyst/process systems with appropriate laboratory runs. Work on the program is divided into the following six tasks: (1) preparation of a detailed work plan covering the entire performance of the contract; (2) preliminary techno-economic assessment of the UCC catalyst/process system; (3) optimization of the most promising catalysts developed under the prior contract; (4) optimization of the UCC catalyst system in a manner that will give it the longest possible service life; (5) optimization of a UCC process/catalyst system based upon a tubular reactor with a recycle loop; and (6) economic evaluation of the optimal performance found under Task 5 for the UCC process/catalyst system. Accomplishments are reported for Tasks 2 through 5.

  15. A social pedagogy approach to residential care: balancing education and placement in the development of an innovative child welfare residential program in Ontario, Canada.

    PubMed

    Gharabaghi, Kiaras; Groskleg, Ron

    2010-01-01

    This paper chronicles the exploration and development of a residential program of the child welfare authority of Renfrew County in Ontario, Canada. Recognizing that virtually its entire population of youth in care was failing to achieve positive outcomes in education, Renfrew County Family and Children Services embarked on a program development process that included many unique elements within the Ontario child welfare context. This process introduced the theoretical framework of social pedagogy to the provision of residential care, and it replaced the idea of psychotherapy as the primary agent of change for youth with the concept of living and learning. The result is a template for the Ottawa River Academy, a living and learning program for youth in care that exemplifies the possibilities embedded in creative thought, attention to research and evidence, and a preparedness to transcend traditional assumptions with respect to service designs and business models for residential care in child welfare.

  16. Micro- and meso-scale simulations of magnetospheric processes related to the aurora and substorm morphology

    NASA Technical Reports Server (NTRS)

    Swift, Daniel W.

    1991-01-01

    The primary methodology during the grant period has been the use of micro or meso-scale simulations to address specific questions concerning magnetospheric processes related to the aurora and substorm morphology. This approach, while useful in providing some answers, has its limitations. Many of the problems relating to the magnetosphere are inherently global and kinetic. Effort during the last year of the grant period has increasingly focused on development of a global-scale hybrid code to model the entire, coupled magnetosheath-magnetosphere-ionosphere system. In particular, numerical procedures for curvilinear coordinate generation and exactly conservative differencing schemes for hybrid codes in curvilinear coordinates have been developed. The new computer algorithms and the massively parallel computer architectures now make this global code a feasible proposition. Support provided by this project has played an important role in laying the groundwork for the eventual development of a global-scale code to model and forecast magnetospheric weather.

  17. The practical use of simplicity in developing ground water models

    USGS Publications Warehouse

    Hill, M.C.

    2006-01-01

    The advantages of starting with simple models and building complexity slowly can be significant in the development of ground water models. In many circumstances, simpler models are characterized by fewer defined parameters and shorter execution times. In this work, the number of parameters is used as the primary measure of simplicity and complexity; the advantages of shorter execution times also are considered. The ideas are presented in the context of constructing ground water models but are applicable to many fields. Simplicity first is put in perspective as part of the entire modeling process using 14 guidelines for effective model calibration. It is noted that neither very simple nor very complex models generally produce the most accurate predictions and that determining the appropriate level of complexity is an ill-defined process. It is suggested that a thorough evaluation of observation errors is essential to model development. Finally, specific ways are discussed to design useful ground water models that have fewer parameters and shorter execution times.
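    One common way to make the parameter-count trade-off concrete, though not a device used in the paper itself, is an information criterion such as AIC, which penalizes each added parameter unless it buys a real reduction in misfit. A minimal sketch, with synthetic data and polynomial models standing in for ground water models of increasing complexity:

    ```python
    import numpy as np

    def aic(residuals, n_params):
        """AIC for a least-squares model: n*ln(SSE/n) + 2k.  More parameters
        must buy a real drop in misfit to be worth carrying."""
        n = len(residuals)
        sse = float(np.sum(np.asarray(residuals) ** 2))
        return n * np.log(sse / n) + 2 * n_params

    # Noisy observations of a mildly curved profile, fit with models of
    # increasing parameter count; an intermediate complexity wins.
    rng = np.random.default_rng(2)
    x = np.linspace(0.0, 1.0, 40)
    obs = 1.0 + 0.5 * x - 0.3 * x**2 + 0.05 * rng.standard_normal(x.size)
    for k in (1, 2, 3, 6, 10):                    # number of parameters
        coef = np.polyfit(x, obs, k - 1)
        res = obs - np.polyval(coef, x)
        print(f"{k:2d} parameters -> AIC {aic(res, k):8.2f}")
    ```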

  18. [Significance of re-evaluation and development of Chinese herbal drugs].

    PubMed

    Gao, Yue; Ma, Zengchun; Zhang, Boli

    2012-01-01

    Research on new herbal drugs involves both developing new herbal drugs and re-evaluating old ones, and it should be grounded in the theory of traditional Chinese medicine (TCM). Current development of famous TCM preparations focuses on the manufacturing process, quality control standards, the material basis, and clinical research, but systematic management of safety evaluation is deficient, and a sound system for the safety assessment of TCM has not yet been established. This paper discusses the causes of safety problems, safety risks, target organs of toxicity, weak links in safety evaluation, and ideas for safety evaluation. Toxicology research on Chinese herbal drugs should be conducted under the standard of good laboratory practice (GLP), and the characteristics of Chinese herbal drugs need to be fully integrated into safety evaluation. Safety research must be integrated throughout the entire process of new drug development, and the safety of famous Chinese medicines deserves greater attention in the future.

  19. Immobilization of concanavalin A receptors during differentiation of neuroblastoma cells.

    PubMed

    Fishman, M C; Dragsten, P R; Spector, I

    1981-04-30

    Neuroblastoma cells serve as a useful model of neuronal development because compounds such as dimethyl sulphoxide (DMSO) and dibutyryl cyclic AMP cause them to undergo a process of controlled differentiation in tissue culture, during which they can extend long processes, develop characteristic excitability mechanisms, synthesize neurotransmitters and form synapses. We have used the technique of fluorescence photobleaching recovery to study the lateral mobility of cell-surface constituents during the differentiation of neuroblastoma clone N1E-115 cells. The concanavalin A (Con A) binding sites appear as discrete patches distributed over the entire cell surface and exhibit lateral mobility in undifferentiated cells comparable with that of surface glycoproteins of other cells. After induction of differentiation, however, the vast majority of Con A binding sites become immobilized, and we present data which suggest that the mechanism of this immobilization may involve linkage to the internal actin network.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Hong.

    A new technique was developed and demonstrated for combining carbon fibers with aromatic thermoplastic matrices to form a high-quality towpreg. The developed technique utilizes an in-situ electrochemical process (electrochemical polymerization, ECP) to create the entire polymer matrix surrounding the fiber array by direct polymerization of monomer. Poly-paraxylylene (PPX) and derivatives are successfully polymerized in-situ on carbon fiber surfaces through ECP. A PPX/carbon-fiber towpreg with 40 vol % of matrix is achieved in a fairly short reaction time with a high polymer-coating efficiency. Vapor deposition polymerization (VDP) was also studied, and PPX/carbon-fiber towpreg was made successfully by this process. A comparison between ECP and VDP was conducted. A study on electrochemical oxidation (ECO) of carbon fibers was also performed; the ECO treatment may be suitable for carbon fibers incorporated in composites with high-temperature curing resins and thermoplastic matrices.

  1. Open Science CBS Neuroimaging Repository: Sharing ultra-high-field MR images of the brain.

    PubMed

    Tardif, Christine Lucas; Schäfer, Andreas; Trampel, Robert; Villringer, Arno; Turner, Robert; Bazin, Pierre-Louis

    2016-01-01

    Magnetic resonance imaging at ultra high field opens the door to quantitative brain imaging at sub-millimeter isotropic resolutions. However, novel image processing tools to analyze these new rich datasets are lacking. In this article, we introduce the Open Science CBS Neuroimaging Repository: a unique repository of high-resolution and quantitative images acquired at 7 T. The motivation for this project is to increase interest for high-resolution and quantitative imaging and stimulate the development of image processing tools developed specifically for high-field data. Our growing repository currently includes datasets from MP2RAGE and multi-echo FLASH sequences from 28 and 20 healthy subjects respectively. These datasets represent the current state-of-the-art in in-vivo relaxometry at 7 T, and are now fully available to the entire neuroimaging community.

  2. Cirrus clouds. I - A cirrus cloud model. II - Numerical experiments on the formation and maintenance of cirrus

    NASA Technical Reports Server (NTRS)

    Starr, D. O'C.; Cox, S. K.

    1985-01-01

    A simplified cirrus cloud model is presented which may be used to investigate the role of various physical processes in the life cycle of a cirrus cloud. The model is a two-dimensional, time-dependent, Eulerian numerical model where the focus is on cloud-scale processes. Parametrizations are developed to account for phase changes of water, radiative processes, and the effects of microphysical structure on the vertical flux of ice water. The results of a simulation of a thin cirrostratus cloud are given. The results of numerical experiments performed with the model are described in order to demonstrate the important role of cloud-scale processes in determining the cloud properties maintained in response to larger scale forcing. The effects of microphysical composition and radiative processes are considered, as well as their interaction with thermodynamic and dynamic processes within the cloud. It is shown that cirrus clouds operate in an entirely different manner than liquid phase stratiform clouds.

  3. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activity has centered on the automation of manufacturing quality control and inspection functions, and several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increases is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes whose control parameters are critical and whose yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product becomes a significantly larger share of the total build cycle, and the urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements in components, assemblies, and systems such as microprocessors, microcomputers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost effective and justifiable.

  4. Developing neuronal networks: Self-organized criticality predicts the future

    NASA Astrophysics Data System (ADS)

    Pu, Jiangbo; Gong, Hui; Li, Xiangning; Luo, Qingming

    2013-01-01

    Self-organized criticality emerging in neural activity is one of the key concepts used to describe the formation and function of developing neuronal networks. The relationship between critical dynamics and neural development is both theoretically and experimentally appealing. However, whereas it is well known that cortical networks exhibit a rich repertoire of activity patterns at different stages of in vitro maturation, how dynamical activity patterns evolve across the entire course of neural development remains unclear. Here we show that a series of metastable network states emerged in the developing and "aging" process of hippocampal networks cultured from dissociated rat neurons. The unidirectional sequence of state transitions could be observed only in networks showing power-law scaling in the distribution of neuronal avalanches. Our data suggest that self-organized criticality may guide spontaneous activity into a sequential succession of homeostatically regulated transient patterns during development, which may help to predict the tendency of neural development at early ages.
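    The power-law test used to separate critical from non-critical networks can be illustrated with the standard maximum-likelihood exponent estimate of Clauset, Shalizi, and Newman (2009); the paper's actual fitting procedure is not given in the abstract, so the following is a generic sketch with synthetic avalanche sizes:

    ```python
    import numpy as np

    def power_law_alpha(sizes, x_min=1.0):
        """Continuous maximum-likelihood exponent estimate (Clauset, Shalizi &
        Newman 2009): alpha = 1 + n / sum(ln(x / x_min)) for x >= x_min."""
        x = np.asarray(sizes, dtype=float)
        x = x[x >= x_min]
        return 1.0 + x.size / np.sum(np.log(x / x_min))

    # Synthetic avalanche sizes with alpha = 1.5, the canonical critical-
    # branching exponent for neuronal avalanche size distributions.
    rng = np.random.default_rng(1)
    u = rng.random(20_000)
    sizes = (1.0 - u) ** (-1.0 / 0.5)    # inverse-CDF sampling, x_min = 1
    print(power_law_alpha(sizes))        # ~ 1.5
    ```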

  5. Toward a virtual platform for materials processing

    NASA Astrophysics Data System (ADS)

    Schmitz, G. J.; Prahl, U.

    2009-05-01

    Any production is based on materials eventually becoming components of a final product. Material properties being determined by the microstructure of the material thus are of utmost importance both for productivity and reliability of processing during production and for application and reliability of the product components. A sound prediction of materials properties therefore is highly important. Such a prediction requires tracking of microstructure and properties evolution along the entire component life cycle starting from a homogeneous, isotropic and stress-free melt and eventually ending in failure under operational load. This article will outline ongoing activities at the RWTH Aachen University aiming at establishing a virtual platform for materials processing comprising a virtual, integrative numerical description of processes and of the microstructure evolution along the entire production chain and even extending further toward microstructure and properties evolution under operational conditions.

  6. History of the Great Patriotic War of the Soviet Union, 1941-1945. Volume 6. Results of the Great Patriotic War,

    DTIC Science & Technology

    1982-10-06

    Under the peculiar conditions of the Second World War this Leninist prediction/forecast was completely justified. The process of the nonuniform development of... and resourcefulness were required of the heroines Ye. G. Mazanik, N. V. Troyan and M. B. Osipova in order to penetrate into the very lair of Hitler's deputy in... at the front, in the rear, in the lair of the enemy - to give their entire energy to the work of victory was the decisive force which gave rise to the mass...

  7. The SCIAMACHY Consolidated Level 0 Data Set

    NASA Astrophysics Data System (ADS)

    Gottwald, Manfred; Krieg, Eckhart; Reissig, Katja; How, John; Brizzi, Gabriele; Dehn, Agelika; Fehr, Thorsten

    2013-12-01

    By the end of the ENVISAT mission, SCIAMACHY had executed 52867 orbits, in most of which it acquired measurement data. SCIAMACHY's complex measurement schemes are best reflected in the consolidated level 0 (cL0) products, which form the basis for level 0-1b and level 1b-2 processing whenever the highest precision is required. It was therefore of paramount importance to develop a cL0 data archive covering the entire in-orbit mission lifetime that is as complete as possible and contains quality-controlled measurement data.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, L.M.

    This paper describes the role of the Monsanto Chemical Company in the cleanup of a Superfund site in Galveston, Texas. Although other companies had sent waste to the site over an extended period of time, Monsanto was charged with the entire cost. Monsanto responded by identifying other site users and determining the extent of their liability through chemical analysis of the wastes. They took the lead in organizing the other users and developing an effective cleanup process at a cost much less than the EPA's estimates. They also helped to improve industry's relations with the community.

  9. Tracking data in the office environment.

    PubMed

    Erickson, Ty B

    2010-09-01

    Data tracking in the office setting focuses on a narrow spectrum of the entire patient safety arena; however, when properly executed, data tracking increases staff members' awareness of the importance of patient safety. Data tracking is also a high-volume activity and thereby continues to loop back on the consciousness of providers in all aspects of their practice. Improvement in data tracking will improve collateral areas of patient safety such as proper medication usage, legibility of written communication, effective delegation of patient safety initiatives, and a collegial effort at developing teams for safety design processes.

  10. Metal-Insulator-Metal Diode Process Development for Energy Harvesting Applications

    DTIC Science & Technology

    2010-04-01

    Sputter tool deposition recipes (DC magnetron): Pt 1640 Å / TiO2 1000 Å / Ti 2000 Å stacks at 500 °C and 300 °C. ... thin films were sputtered onto silicon substrates with silicon dioxide overlayers. I-V measurements were taken using an electrical characterization... deposition of the entire MIM material stack to be done without breaking the vacuum within a multi-material system DC sputtering tool. A CAD layout of a MIM...

  11. Solar energy to biofuels.

    PubMed

    Agrawal, Rakesh; Singh, Navneet R

    2010-01-01

    In a solar economy, sustainably available biomass holds the potential to be an excellent nonfossil source of high energy density transportation fuel. However, if sustainably available biomass cannot supply the liquid fuel need for the entire transport sector, alternatives must be sought. This article reviews biomass to liquid fuel conversion processes that treat biomass primarily as a carbon source and boost liquid fuel production substantially by using supplementary energy that is recovered from solar energy at much higher efficiencies than the biomass itself. The need to develop technologies for an energy-efficient future sustainable transport sector infrastructure that will use different forms of energy, such as electricity, H2, and heat, in a synergistic interaction with each other is emphasized. An enabling template for such a future transport infrastructure is presented. An advantage of the use of such a template is that it reduces the land area needed to propel an entire transport sector. Also, some solutions for the transition period that synergistically combine biomass with fossil fuels are briefly discussed.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, E.C.; Killough, S.M.; Rowe, J.C.

    The purpose of the Smart Crane Ammunition Transfer System (SCATS) project is to demonstrate robotic/telerobotic controls technology for a mobile articulated crane for missile/munitions handling, delivery, and reload. Missile resupply and reload have been manually intensive operations up to this time. Currently, reload missiles are delivered by truck to the site of the launcher. A crew of four to five personnel reloads the missiles from the truck to the launcher using a hydraulic-powered crane. The missiles are handled carefully for the safety of the missiles and personnel. Numerous steps are required in the reload process, and the entire reload operation can take over an hour for some missile systems. Recent US Army directives require the entire operation to be accomplished in a fraction of that time. Current development of SCATS is based primarily on reloading Patriot missiles. This paper summarizes the current status of the SCATS project at the Oak Ridge National Laboratory (ORNL). Additional information on project background and requirements has been described previously (Bradley et al., 1995).

  13. Study of Mechanical Properties and Characterization of Pipe Steel welded by Hybrid (Friction Stir Weld + Root Arc Weld) Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Yong Chae; Sanderson, Samuel; Mahoney, Murray

    Friction stir welding (FSW) has recently attracted attention as an alternative construction process for gas/oil transportation applications due to advantages compared to fusion welding techniques. A significant advantage is the ability of FSW to weld the entire or nearly the entire wall thickness in a single pass, while fusion welding requires multiple passes. However, when FSW is applied to a pipe or tube geometry, an internal back support anvil is required to resist the plunging forces exerted during FSW. Unfortunately, it may not be convenient or economical to use internal backing support due to limited access for some applications. To overcome this issue, ExxonMobil recently developed a new concept combining root arc welding and FSW: a root arc weld is made prior to FSW that supports the normal loads associated with FSW. In the present work, mechanical properties of a FSW + root arc welded pipe steel are reported, including microstructure and microhardness.

  14. Porous polycarbene-bearing membrane actuator for ultrasensitive weak-acid detection and real-time chemical reaction monitoring.

    PubMed

    Sun, Jian-Ke; Zhang, Weiyi; Guterman, Ryan; Lin, Hui-Juan; Yuan, Jiayin

    2018-04-30

    Soft actuators that integrate ultrasensitivity with the capability to interact simultaneously with multiple stimuli throughout an entire event call for a high level of structural complexity, adaptability, and/or multi-responsiveness, which is a great challenge. Here, we develop a porous polycarbene-bearing membrane actuator built up from ionic complexation between a poly(ionic liquid) and trimesic acid (TA). The actuator features two concurrent structure gradients, i.e., an electrostatic complexation (EC) degree and a density distribution of a carbene–NH3 adduct (CNA) along the membrane cross-section. The membrane actuator shows the highest sensitivity among state-of-the-art soft proton actuators toward acetic acid, at the 10⁻⁶ mol L⁻¹ (M) level in aqueous media. Through competing actuation of the two gradients, it is capable of monitoring an entire process of proton-involved chemical reactions that comprises multiple stimuli and operational steps. The present achievement constitutes a significant step toward real-life application of soft actuators in chemical sensing and reaction technology.

  15. Fast title extraction method for business documents

    NASA Astrophysics Data System (ADS)

    Katsuyama, Yutaka; Naoi, Satoshi

    1997-04-01

    Conventional electronic document filing systems are inconvenient because the user must specify the keywords of each document for later searches. To solve this problem, automatic keyword extraction methods using natural language processing and character recognition have been developed. However, these methods are slow, especially for Japanese documents. To develop a practical electronic document filing system, we focused on the extraction of keyword areas from a document by image processing. Our fast title extraction method can automatically extract titles as keywords from business documents. All character strings are evaluated for title similarity by accumulating rating points, which we classified into four items: character string size, position of character strings, relative position among character strings, and string attribution. Finally, the character string that has the highest rating is selected as the title area, and character recognition is carried out on the selected area only. The method is fast because the recognition step must handle a small number of patterns in the restricted area, not the entire document. The mean performance of this method is an accuracy of about 91 percent and a processing time of 1.8 s per document in an examination of 100 Japanese business documents.
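    The four-item rating scheme can be sketched as a weighted score over candidate strings; the feature ranges, weights, and the CandidateString fields below are hypothetical stand-ins for the rating points described in the abstract:

    ```python
    from dataclasses import dataclass

    @dataclass
    class CandidateString:
        text: str
        char_height: float   # mean character size in pixels
        y_pos: float         # vertical position, 0.0 = top of page
        dist_to_prev: float  # gap to the preceding string (relative units)
        is_underlined: bool  # one possible "string attribution" feature

    # Hypothetical weights for the four rating items named in the abstract.
    W_SIZE, W_POS, W_REL, W_ATTR = 0.4, 0.3, 0.2, 0.1

    def title_score(c: CandidateString) -> float:
        size_pts = min(c.char_height / 40.0, 1.0)  # bigger type scores higher
        pos_pts = max(0.0, 1.0 - c.y_pos)          # near the top scores higher
        rel_pts = min(c.dist_to_prev, 1.0)         # isolated lines score higher
        attr_pts = 1.0 if c.is_underlined else 0.0
        return (W_SIZE * size_pts + W_POS * pos_pts
                + W_REL * rel_pts + W_ATTR * attr_pts)

    def pick_title(candidates):
        # Only the winning region is passed to character recognition,
        # which is what makes the overall method fast.
        return max(candidates, key=title_score)

    body = CandidateString("Dear Sir ...", 12, 0.30, 0.1, False)
    head = CandidateString("Quarterly Sales Report", 36, 0.05, 0.8, True)
    print(pick_title([body, head]).text)   # -> "Quarterly Sales Report"
    ```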

  16. Material quality development during the automated tow placement process

    NASA Astrophysics Data System (ADS)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10--100mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal subjecting the material to multiple heating and cooling rates approaching 1000°C/sec. The requirement for the ATP process is to achieve the same quality in seconds (low void content, full translation of mechanical properties and degree of bonding and minimal warpage) as the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include a steady state heat transfer simulation, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties) and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials related to these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.
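    The degree-of-bonding part of such a model is often written, in the composites literature (e.g., Loos-Springer-type healing models), as the product of a degree of intimate contact and a non-isothermal healing integral raised to the one-quarter power. The sketch below follows that generic form; the Arrhenius weld-time law, melt temperature, and all parameter values are illustrative assumptions, not values from the dissertation:

    ```python
    import numpy as np

    def degree_of_bonding(T, t, D_ic, tw_ref=1.0, T_ref=650.0,
                          Ea_over_R=5000.0, T_melt=615.0):
        """Db = Dic * Dh with the non-isothermal healing integral
            Dh = [ integral dt' / tw(T(t')) ]**(1/4),
        accumulated only while the interface is molten.  tw(T) is an assumed
        Arrhenius-type weld (reptation) time; every value here is illustrative."""
        T = np.asarray(T, dtype=float)
        t = np.asarray(t, dtype=float)
        tw = tw_ref * np.exp(Ea_over_R * (1.0 / T - 1.0 / T_ref))
        integrand = np.where(T > T_melt, 1.0 / tw, 0.0)
        integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))
        Dh = min(integral ** 0.25, 1.0)
        return D_ic * Dh

    # One tow pass: a sharp spike above the melt temperature, then rapid
    # cooling -- the repeated heat/cool history the abstract describes.
    t = np.linspace(0.0, 2.0, 400)                          # s
    T = 450.0 + 250.0 * np.exp(-(((t - 0.3) / 0.08) ** 2))  # peaks near 700 K
    print(degree_of_bonding(T, t, D_ic=0.9))
    ```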

  17. Educational Technology in the Third World: A General Systems Perspective. Report Number 9.

    ERIC Educational Resources Information Center

    Awa, Njoku; And Others

    Change is more advantageous if it incorporates an entire society rather than an element of that society, and the goals of educational development will be most fruitfully realized with a minimum of harmful effects if programs of instructional development are made to focus on the entire culture, rather than on a single social institution. A systems…

  18. The Effect of Radiation on Selected Photographic Film

    NASA Technical Reports Server (NTRS)

    Slater, Richard; Kinard, John; Firsov, Ivan

    2000-01-01

    We conducted this film test to evaluate several manufacturers' photographic films for their ability to acquire imagery on the International Space Station. We selected 25 motion picture, photographic slide, and negative films from three different film manufacturers. We based this selection on the fact that their films ranked highest in other similar film tests, and on their general acceptance by the international community. This test differed from previous tests because the entire evaluation process leading up to the final selection was based on information derived after the original flight film was scanned to a digital file. Previously conducted tests were evaluated entirely based on 8 x 10s that were produced from the film either directly or through the internegative process. This new evaluation procedure provided accurate quantitative data on granularity and contrast from the digital data. This test did not try to define which film was best visually. This is too often based on personal preference. However, the test results did group the films by good, marginal, and unacceptable. We developed, and included in this report, a template containing quantitative, graphical, and visual information for each film. These templates should be sufficient for comparing the different films tested and subsequently selecting a film or films to be used for experiments and general documentation on the International Space Station.

  19. Valuing the Accreditation Process

    ERIC Educational Resources Information Center

    Bahr, Maria

    2018-01-01

    The value of the National Association for Developmental Education (NADE) accreditation process is far-reaching. Not only do students and programs benefit from the process, but also the entire institution. Through data collection of student performance, analysis, and resulting action plans, faculty and administrators can work cohesively towards…

  20. Laser-zone Growth in a Ribbon-to-ribbon (RTR) Process Silicon Sheet Growth Development for the Large Area Silicon Sheet Task of the Low Cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Baghdadi, A.; Gurtler, R. W.; Legge, R.; Sopori, B.; Rice, M. J.; Ellis, R. J.

    1979-01-01

    A technique for growing limited-length ribbons continually was demonstrated. This Rigid Edge technique can be used to recrystallize about 95% of the polyribbon feedstock. A major advantage of this method is that only a single, constant-length silicon ribbon is handled throughout the entire process sequence; this may be accomplished using cassettes similar to those presently in use for processing Czochralski wafers. Thus a transition from Cz to ribbon technology can be smoothly effected. The maximum size being considered, 3 inches x 24 inches, is half a square foot and will generate 6 watts at 12% efficiency under 1 sun. Silicon dioxide has been demonstrated as an effective, practical diffusion barrier for use during polyribbon formation.
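    As a rough check on those figures (assuming the standard 1-sun insolation of about 1000 W/m²): half a square foot is roughly 0.046 m², which intercepts about 46 W of sunlight, and 12% of 46 W is about 5.6 W, consistent with the quoted 6 W.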

  1. Friction Stir Welding and NASA

    NASA Technical Reports Server (NTRS)

    Horton, K Renee

    2016-01-01

    Friction stir welding (FSW) is a solid-state welding process with potential advantages for aerospace and automotive industries dealing with light alloys. Self-reacting friction stir welding (SR-FSW) is one variation of the FSW process being developed at the National Aeronautics and Space Administration (NASA) for use in the fabrication of propellant tanks and other areas of the Space Launch System (SLS). NASA's SLS is an advanced, heavy-lift launch vehicle which will provide an entirely new capability for science and human exploration beyond Earth's orbit, giving the nation a safe, affordable, and sustainable means of reaching beyond our current limits and opening new doors of discovery from the unique vantage point of space. This talk will elaborate on the SR-FSW process and its usage on the current Space Launch System program at NASA.

  2. Computation of output feedback gains for linear stochastic systems using the Zangwill-Powell Method

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1975-01-01

    Because conventional optimal linear regulator theory results in a controller which requires the capability of measuring and/or estimating the entire state vector, it is of interest to consider procedures for computing controls which are restricted to be linear feedback functions of a lower dimensional output vector and which take into account the presence of measurement noise and process uncertainty. To this effect a stochastic linear model has been developed that accounts for process parameter and initial uncertainty, measurement noise, and a restricted number of measurable outputs. Optimization with respect to the corresponding output feedback gains was then performed for both finite and infinite time performance indices without gradient computation by using Zangwill's modification of a procedure originally proposed by Powell. Results using a seventh order process show the proposed procedures to be very effective.
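    The gradient-free optimization described here can be illustrated with a modern descendant of the same idea: SciPy's Powell method applied to a Monte-Carlo estimate of the quadratic cost under static output feedback. The plant matrices, noise levels, and horizon below are toy values, not the seventh-order process of the report:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy discrete-time plant: x_{k+1} = A x_k + B u_k + w_k, with a noisy,
    # lower-dimensional output y_k = C x_k + v_k and control u_k = -K y_k.
    A = np.array([[1.0, 0.1],
                  [0.0, 0.95]])
    B = np.array([[0.0],
                  [0.1]])
    C = np.array([[1.0, 0.0]])        # only the first state is measurable
    Q, R = np.eye(2), 0.1 * np.eye(1)

    def avg_cost(K_flat):
        """Monte-Carlo estimate of the quadratic cost under output feedback.
        A fixed seed (common random numbers) makes the objective deterministic,
        which suits a direct-search method; unstable gains blow the cost up
        and are steered away from automatically."""
        rng = np.random.default_rng(0)
        K = K_flat.reshape(1, 1)
        total = 0.0
        for _ in range(8):                                   # independent runs
            x = rng.standard_normal(2)                       # initial uncertainty
            for _ in range(400):                             # time steps
                y = C @ x + 0.05 * rng.standard_normal(1)    # measurement noise
                u = -K @ y
                total += x @ Q @ x + u @ R @ u
                x = A @ x + B @ u + 0.02 * rng.standard_normal(2)  # process noise
        return total / (8 * 400)

    # Powell's direct-search method needs no gradient, in the spirit of the
    # Zangwill-Powell procedure used in the report.
    res = minimize(avg_cost, x0=np.zeros(1), method="Powell")
    print("output feedback gain:", res.x, "average cost:", res.fun)
    ```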

  3. 33 CFR 329.8 - Improved or natural conditions of the waterbody.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... therefore navigable in law from that time forward. The changes in engineering practices or the coming of new... entirely reasonable in a thickly populated, highly developed industrial region may have been entirely too...

  4. 33 CFR 329.8 - Improved or natural conditions of the waterbody.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... therefore navigable in law from that time forward. The changes in engineering practices or the coming of new... entirely reasonable in a thickly populated, highly developed industrial region may have been entirely too...

  5. 33 CFR 329.8 - Improved or natural conditions of the waterbody.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... therefore navigable in law from that time forward. The changes in engineering practices or the coming of new... entirely reasonable in a thickly populated, highly developed industrial region may have been entirely too...

  6. 33 CFR 329.8 - Improved or natural conditions of the waterbody.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... therefore navigable in law from that time forward. The changes in engineering practices or the coming of new... entirely reasonable in a thickly populated, highly developed industrial region may have been entirely too...

  7. 33 CFR 329.8 - Improved or natural conditions of the waterbody.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... therefore navigable in law from that time forward. The changes in engineering practices or the coming of new... entirely reasonable in a thickly populated, highly developed industrial region may have been entirely too...

  8. Demonstration of the James Webb Space Telescope commissioning on the JWST testbed telescope

    NASA Astrophysics Data System (ADS)

    Acton, D. Scott; Towell, Timothy; Schwenker, John; Swensen, John; Shields, Duncan; Sabatke, Erin; Klingemann, Lana; Contos, Adam R.; Bauer, Brian; Hansen, Karl; Atcheson, Paul D.; Redding, David; Shi, Fang; Basinger, Scott; Dean, Bruce; Burns, Laura

    2006-06-01

    The one-meter Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate the design and implementation of the wavefront sensing and control (WFS&C) capabilities of the James Webb Space Telescope (JWST). The TBT is used to develop and verify the WFS&C algorithms, check the communication interfaces, validate the WFS&C optical components and actuators, and provide risk reduction opportunities for test approaches for later full-scale cryogenic vacuum testing of the observatory. In addition, the TBT provides a vital opportunity to demonstrate the entire WFS&C commissioning process. This paper describes recent WFS&C commissioning experiments that have been performed on the TBT.

  9. A multiresolution method for climate system modeling: application of spherical centroidal Voronoi tessellations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringler, Todd; Ju, Lili; Gunzburger, Max

    2008-11-14

    During the next decade and beyond, climate system models will be challenged to resolve scales and processes that are far beyond their current scope. Each climate system component has its prototypical example of an unresolved process that may strongly influence the global climate system, ranging from eddy activity within ocean models, to ice streams within ice sheet models, to surface hydrological processes within land system models, to cloud processes within atmosphere models. These new demands will almost certainly result in the development of multiresolution schemes that are able, at least regionally, to faithfully simulate these fine-scale processes. Spherical centroidal Voronoi tessellations (SCVTs) offer one potential path toward the development of robust, multiresolution climate system model components. SCVTs allow for the generation of high-quality Voronoi diagrams and Delaunay triangulations through the use of an intuitive, user-defined density function. In each of the examples provided, this method results in high-quality meshes where the quality measures are guaranteed to improve as the number of nodes is increased. Real-world examples are developed for the Greenland ice sheet and the North Atlantic ocean, and idealized examples are developed for ocean-ice shelf interaction and for regional atmospheric modeling. In addition to defining, developing, and exhibiting SCVTs, we pair this mesh generation technique with a previously developed finite-volume method. Our numerical example is based on the nonlinear shallow water equations spanning the entire surface of the sphere and is used to elucidate both the potential benefits of this multiresolution method and the challenges ahead.
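    The core of CVT generation can be sketched in the plane (the paper works on the sphere) with a probabilistic Lloyd iteration: generators repeatedly move to the density-weighted centroids of their Voronoi regions, here approximated with Monte-Carlo sample points drawn from the user-defined density. The density function and all parameters below are illustrative:

    ```python
    import numpy as np

    def cvt(n_gen, density, n_samples=200_000, iters=50, seed=0):
        """Planar centroidal Voronoi tessellation by probabilistic Lloyd
        iteration: each generator moves to the centroid of the sample points
        in its Voronoi cell.  Because the points are drawn in proportion to
        `density`, the plain mean is the density-weighted centroid."""
        rng = np.random.default_rng(seed)
        pts = rng.random((n_samples, 2))                  # unit square
        keep = rng.random(n_samples) < density(pts) / density(pts).max()
        pts = pts[keep]                                   # rejection sampling
        gen = pts[rng.choice(len(pts), n_gen, replace=False)]
        for _ in range(iters):
            d2 = ((pts[:, None, :] - gen[None, :, :]) ** 2).sum(axis=2)
            lbl = d2.argmin(axis=1)                       # nearest generator
            for j in range(n_gen):
                cell = pts[lbl == j]
                if len(cell):
                    gen[j] = cell.mean(axis=0)            # move to cell centroid
        return gen

    # A density peaked at the domain center, mimicking the user-defined
    # density function that concentrates mesh resolution regionally.
    rho = lambda p: 1.0 + 20.0 * np.exp(-40.0 * ((p - 0.5) ** 2).sum(axis=1))
    print(cvt(64, rho)[:5])
    ```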

  10. Ship and satellite bio-optical research in the California Bight

    NASA Technical Reports Server (NTRS)

    Smith, R. C.; Baker, K. S.

    1982-01-01

    Mesoscale biological patterns and processes in productive coastal waters were studied. The physical and biological processes leading to chlorophyll variability were investigated. The ecological and evolutionary significance of this variability, and its relation to the prediction of fish recruitment and marine mammal distributions was studied. Seasonal primary productivity (using chlorophyll as an indication of phytoplankton biomass) for the entire Southern California Bight region was assessed. Complementary and contemporaneous ship and satellite (Nimbus 7-CZCS) bio-optical data from the Southern California Bight and surrounding waters were obtained and analyzed. These data were also utilized for the development of multi-platform sampling strategies and the optimization of algorithms for the estimation of phytoplankton biomass and primary production from satellite imagery.

  11. Computation of output feedback gains for linear stochastic systems using the Zangwill-Powell method

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1977-01-01

    Because conventional optimal linear regulator theory results in a controller which requires the capability of measuring and/or estimating the entire state vector, it is of interest to consider procedures for computing controls which are restricted to be linear feedback functions of a lower dimensional output vector and which take into account the presence of measurement noise and process uncertainty. To this effect a stochastic linear model has been developed that accounts for process parameter and initial uncertainty, measurement noise, and a restricted number of measurable outputs. Optimization with respect to the corresponding output feedback gains was then performed for both finite and infinite time performance indices without gradient computation by using Zangwill's modification of a procedure originally proposed by Powell.

  12. Inactivation of viruses in bubbling processes utilized for personal bioaerosol monitoring.

    PubMed

    Agranovski, I E; Safatov, A S; Borodulin, A I; Pyankov, O V; Petrishchenko, V A; Sergeev, A N; Agafonov, A P; Ignatiev, G M; Sergeev, A A; Agranovski, V

    2004-12-01

    A new personal bioaerosol sampler has recently been developed and evaluated for sampling of viable airborne bacteria and fungi under controlled laboratory conditions and in the field. The operational principle of the device is based on the passage of air through a porous medium immersed in liquid. This process leads to the formation of bubbles within the filter as the carrier gas passes through and thus provides effective mechanisms for aerosol removal. As demonstrated in previous studies, the culturability of sampled bacteria and fungi remained high for the entire 8-h sampling period. The present study is the first step in the evaluation of the new sampler for monitoring of viable airborne viruses; it focuses on the inactivation rate of viruses in the bubbling process during 4 h of continuous operation. Four microbes were used in this study: influenza, measles, mumps, and vaccinia viruses. It was found that the use of distilled water as the collection fluid was associated with a relatively high decay rate. A significant improvement was achieved by utilizing virus maintenance fluid prepared from Hank's solution with appropriate additives: the survival rates of the influenza, measles, and mumps viruses were increased by 1.4 log, 0.83 log, and 0.82 log, respectively, after the first hour of operation compared to bubbling through sterile water, and the same trend was observed throughout the entire 4-h experiment. Only for the robust vaccinia virus was no significant difference observed.

  13. Inactivation of Viruses in Bubbling Processes Utilized for Personal Bioaerosol Monitoring

    PubMed Central

    Agranovski, I. E.; Safatov, A. S.; Borodulin, A. I.; Pyankov, O. V.; Petrishchenko, V. A.; Sergeev, A. N.; Agafonov, A. P.; Ignatiev, G. M.; Sergeev, A. A.; Agranovski, V.

    2004-01-01

    A new personal bioaerosol sampler has recently been developed and evaluated for sampling of viable airborne bacteria and fungi under controlled laboratory conditions and in the field. The operational principle of the device is based on the passage of air through a porous medium immersed in liquid. This process leads to the formation of bubbles within the filter as the carrier gas passes through and thus provides effective mechanisms for aerosol removal. As demonstrated in previous studies, the culturability of sampled bacteria and fungi remained high for the entire 8-h sampling period. The present study is the first step in the evaluation of the new sampler for monitoring of viable airborne viruses; it focuses on the inactivation rate of viruses in the bubbling process during 4 h of continuous operation. Four microbes were used in this study: influenza, measles, mumps, and vaccinia viruses. It was found that the use of distilled water as the collection fluid was associated with a relatively high decay rate. A significant improvement was achieved by utilizing virus maintenance fluid prepared from Hank's solution with appropriate additives: the survival rates of the influenza, measles, and mumps viruses were increased by 1.4 log, 0.83 log, and 0.82 log, respectively, after the first hour of operation compared to bubbling through sterile water, and the same trend was observed throughout the entire 4-h experiment. Only for the robust vaccinia virus was no significant difference observed. PMID:15574888

  14. The development of a hydrologic-hydraulic representation of an urbanscape: the case study of Nashville, Tennessee

    NASA Astrophysics Data System (ADS)

    Sedlar, F.; Ivanov, V. Y.; Shao, J.; Narayan, U.; Nardi, F.; Adams, T. E.; Merwade, V.; Wright, D. B.; Kim, J.; Fatichi, S.; Rakhmatulina, E.

    2013-12-01

    Incorporating elevation data into coupled hydraulic and hydrologic models with the use of triangulated irregular networks (TINs) provides a detailed and highly customizable representation of the original domain. Until recently the resolution of such digital elevation models was 1 or 1/3 arc second (10-30 meters). Aided by the use of LiDAR, digital elevation models are now available at 1/9 arc second resolution (1-3 meters). With elevation data at this resolution, watershed details that are overlooked at 10-30 meter resolution can now be resolved and incorporated into the TIN; for urban flood modeling this implies that street-level features can be resolved. However, to provide a useful picture of flooding as a whole, these data need to be integrated at a citywide scale. To demonstrate the feasibility, process, and capabilities of generating such a detailed and large-scale TIN, we present a case study of Nashville, TN, USA, during the May 1-2, 2010 flooding, a 1,000-year storm event. With the use of ArcGIS, HEC-RAS, Triangle, and additionally developed processing methodologies, an approach is developed to generate a hydrologically relevant and detailed TIN of the entire urbanscape of Nashville. This TIN incorporates three separate aspects: the watershed, the floodplain, and the city. The watershed component contains the elevation data for the delineated watershed, roughly 1,000 km2 at 1-3 meter resolution. The floodplain encompasses over 300 channel cross sections of the Cumberland River and a delineated floodplain. The city element comprises over 500,000 buildings and all major roadways within the watershed. Once generated, the triangulation of the TIN is optimized with the Triangle software for input to the coupled hydraulic and hydrologic model, tRIBS-OFM: hydrologically relevant areas such as the floodplain are densified, and constraints are set on the minimum triangle area for the entire TIN (see the sketch below). Upon running the coupled model with the appropriate forcings, the spatial dynamics of the flooding are then resolved at street level across the entire city. The analysis capabilities afforded at this resolution and across such a large area will facilitate urban flood predictions coupled with hydrologic forecasts, as well as a better understanding of the spatial dynamics of urban flooding.
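    For readers unfamiliar with the Triangle software named above, the following minimal sketch (using the Python `triangle` bindings; the square domain and "channel" segment are toy stand-ins for the Nashville geometry) shows the kind of quality- and area-constrained triangulation such a pipeline relies on:

    ```python
    # pip install triangle   (Python bindings to Shewchuk's Triangle)
    import numpy as np
    import triangle

    # A toy square domain standing in for the watershed boundary, with a
    # diagonal segment standing in for a constrained channel cross-section.
    domain = {
        "vertices": np.array(
            [[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0],  # boundary
             [0.2, 0.2], [0.8, 0.8]]                           # channel endpoints
        ),
        "segments": np.array([[0, 1], [1, 2], [2, 3], [3, 0], [4, 5]]),
    }

    # 'p' = respect the input segments (PSLG), 'q30' = no angle below 30 deg,
    # 'a0.01' = no triangle larger than 0.01 -- the same kinds of quality and
    # area controls applied to the Nashville TIN at city scale.
    tin = triangle.triangulate(domain, "pq30a0.01")
    print(len(tin["vertices"]), "nodes,", len(tin["triangles"]), "triangles")
    ```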

  15. Statistical Comparisons of Meso- and Small-Scale Field-Aligned Currents with Auroral Electron Acceleration Mechanisms from FAST Observations

    NASA Astrophysics Data System (ADS)

    Dombeck, J. P.; Cattell, C. A.; Prasad, N.; Sakher, A.; Hanson, E.; McFadden, J. P.; Strangeway, R. J.

    2016-12-01

    Field-aligned currents (FACs) provide a fundamental driver and means of Magnetosphere-Ionosphere (M-I) coupling. These currents need to be supported by local physics along the entire field line, generally with quasi-static potential structures, but also supporting the time-evolution of the structures and currents, producing Alfvén waves and Alfvénic electron acceleration. In regions of upward current, precipitating auroral electrons are accelerated earthward. These processes can result in ion outflow, changes in ionospheric conductivity, and affect the particle distributions on the field line, affecting the M-I coupling processes supporting the individual FACs and potentially the entire FAC system. The FAST mission was well suited to study both the FACs and the electron auroral acceleration processes. We present the results of the comparisons between meso- and small-scale FACs determined from FAST using the method of Peria et al. (2000) and our FAST auroral acceleration mechanism study when such identification is possible for the entire ~13-year FAST mission. We also present the latest results of the electron energy (and number) flux ionospheric input based on acceleration mechanism (and FAC characteristics) from our FAST auroral acceleration mechanism study.

  16. 50 CFR 260.6 - Terms defined.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... preparation of the product from its raw state through each step in the entire process; or observe conditions... under the regulations in this part which has been preserved by any recognized commercial process..., or by fermentation. Quality. “Quality” means the inherent properties of any processed product which...

  17. Synthetic biology stretching the realms of possibility in wine yeast research.

    PubMed

    Jagtap, Umesh B; Jadhav, Jyoti P; Bapat, Vishwas A; Pretorius, Isak S

    2017-07-03

    It took several millennia to fully understand the scientific intricacies of the process through which grape juice is turned into wine. This yeast-driven fermentation process is still being perfected and advanced today. Motivated by ever-changing consumer preferences and the belief that the 'best' wine is yet to be made, numerous approaches are being pursued to improve the process of yeast fermentation and the quality of wine. Central to recent enhancements in winemaking processes and wine quality is the development of Saccharomyces cerevisiae yeast strains with improved robustness, fermentation efficiencies and sensory properties. The emerging science of Synthetic Biology - including genome engineering and DNA editing technologies - is taking yeast strain development into a totally new realm of possibility. The first example of how future wine strain development might be impacted by these new 'history-making' Synthetic Biology technologies is the de novo production of the raspberry ketone aroma compound, 4-[4-hydroxyphenyl]butan-2-one, in a wine yeast containing a synthetic DNA cassette. This article explores how this breakthrough and the imminent outcome of the international Yeast 2.0 (or Sc2.0) project, aimed at the synthesis of the entire genome of a laboratory strain of S. cerevisiae, might accelerate the design of improved wine yeasts.

  18. A LEAN approach toward automated analysis and data processing of polymers using proton NMR spectroscopy.

    PubMed

    de Brouwer, Hans; Stegeman, Gerrit

    2011-02-01

    To maximize utilization of expensive laboratory instruments and to make the most effective use of skilled human resources, the entire chain of data processing, calculation, and reporting needed to transform raw NMR data into meaningful results was automated. The LEAN process improvement tools were used to identify non-value-added steps in the existing process. These steps were eliminated using an in-house developed software package, which allowed us to meet the key requirement of improving quality and reliability compared with the existing process while freeing up valuable human resources and increasing productivity. Reliability and quality were improved by the consistent data treatment performed by the software and the uniform administration of results. Automating a single NMR spectrometer led to a reduction in operator time of 35%, a doubling of the annual sample throughput from 1400 to 2800, and a reduction of the turnaround time from 6 days to less than 2.

  19. Pragmatics as Metacognitive Control

    PubMed Central

    Kissine, Mikhail

    2016-01-01

    The term “pragmatics” is often used to refer without distinction, on one hand, to the contextual selection of interpretation norms and, on the other hand, to the context-sensitive processes guided by these norms. Pragmatics in the first sense depends on language-independent contextual factors that can, but need not, involve Theory of Mind; in the second sense, pragmatics is a language-specific metacognitive process, which may unfold at an unconscious level without involving any mental state (meta-)representation. Distinguishing between these two ways in which context drives the interpretation of communicative stimuli helps dissolve the dispute between proponents of an entirely Gricean pragmatics and those who claim that some pragmatic processes do not depend on mind-reading capacities. According to the model defended in this paper, the typology of pragmatic processes is not entirely determined by a hierarchy of meanings, but by contextually set norms of interpretation. PMID:26834671

  20. Pragmatics as Metacognitive Control.

    PubMed

    Kissine, Mikhail

    2015-01-01

    The term "pragmatics" is often used to refer without distinction, on one hand, to the contextual selection of interpretation norms and, on the other hand, to the context-sensitive processes guided by these norms. Pragmatics in the first acception depends on language-independent contextual factors that can, but need not, involve Theory of Mind; in the second acception, pragmatics is a language-specific metacognitive process, which may unfold at an unconscious level without involving any mental state (meta-)representation. Distinguishing between these two kinds of ways context drives the interpretation of communicative stimuli helps dissolve the dispute between proponents of an entirely Gricean pragmatics and those who claim that some pragmatic processes do not depend on mind-reading capacities. According to the model defended in this paper, the typology of pragmatic processes is not entirely determined by a hierarchy of meanings, but by contextually set norms of interpretation.

  1. Water displacement mercury pump

    DOEpatents

    Nielsen, Marshall G.

    1985-01-01

    A water displacement mercury pump has a fluid inlet conduit and diffuser, a valve, a pressure cannister, and a fluid outlet conduit. The valve has a valve head which seats in an opening in the cannister. The entire assembly is readily insertable into a process vessel which produces mercury as a product. As the mercury settles, it flows into the opening in the cannister displacing lighter material. When the valve is in a closed position, the pressure cannister is sealed except for the fluid inlet conduit and the fluid outlet conduit. Introduction of a lighter fluid into the cannister will act to displace a heavier fluid from the cannister via the fluid outlet conduit. The entire pump assembly penetrates only a top wall of the process vessel, and not the sides or the bottom wall of the process vessel. This ensures a leak-proof environment and is especially suitable for processing of hazardous materials.

  2. Water displacement mercury pump

    DOEpatents

    Nielsen, M.G.

    1984-04-20

    A water displacement mercury pump has a fluid inlet conduit and diffuser, a valve, a pressure cannister, and a fluid outlet conduit. The valve has a valve head which seats in an opening in the cannister. The entire assembly is readily insertable into a process vessel which produces mercury as a product. As the mercury settles, it flows into the opening in the cannister displacing lighter material. When the valve is in a closed position, the pressure cannister is sealed except for the fluid inlet conduit and the fluid outlet conduit. Introduction of a lighter fluid into the cannister will act to displace a heavier fluid from the cannister via the fluid outlet conduit. The entire pump assembly penetrates only a top wall of the process vessel, and not the sides or the bottom wall of the process vessel. This ensures a leak-proof environment and is especially suitable for processing of hazardous materials.

  3. Getting from A to IRB: developing an institutional review board at a historically black university.

    PubMed

    Howard, Daniel L; Boyd, Carlton L; Nelson, Daniel K; Godley, Paul

    2010-03-01

    Shaw University, the oldest historically black college or university in the southern USA, recently partnered with the University of North Carolina at Chapel Hill, a major research institution in North Carolina, to further develop Shaw's research infrastructure. One aim of the partnership involved establishing a human research ethics committee and an accompanying administrative structure and research ethics education program. This paper describes the process of developing an entire human research protection program de novo through collaboration with and mentoring by the members of the human research protection program at a nearby major research institution. This paper provides a detailed description of the aims, procedures, accomplishments, and challenges involved in such a project, which may serve as a useful model for other primarily teaching institutions wishing to develop research infrastructure and ethical capacity.

  4. Getting From A to IRB: Developing an Institutional Review Board at a Historically Black University

    PubMed Central

    Howard, Daniel L.; Boyd, Carlton L.; Nelson, Daniel K.; Godley, Paul

    2011-01-01

    Shaw University, the oldest historically black college or university in the southern USA, recently partnered with the University of North Carolina at Chapel Hill, a major research institution in North Carolina, to further develop Shaw’s research infrastructure. One aim of the partnership involved establishing a human research ethics committee and an accompanying administrative structure and research ethics education program. This paper describes the process of developing an entire human research protection program de novo through collaboration with and mentoring by the members of the human research protection program at a nearby major research institution. This paper provides a detailed description of the aims, procedures, accomplishments, and challenges involved in such a project, which may serve as a useful model for other primarily teaching institutions wishing to develop research infrastructure and ethical capacity. PMID:20235865

  5. Development and evaluation of a physics-based windblown ...

    EPA Pesticide Factsheets

    A new windblown dust emission treatment was incorporated in the Community Multiscale Air Quality (CMAQ) modeling system. This new model treatment has been built upon previously developed physics-based parameterization schemes from the literature. A distinct and novel feature of this scheme, however, is the incorporation of a newly developed dynamic relation for the surface roughness length relevant to small-scale dust generation processes. Through this implementation, the effect of nonerodible elements on the local flow acceleration, drag partitioning, and surface coverage protection is modeled in a physically based and consistent manner. Careful attention is paid in integrating the new windblown dust treatment in the CMAQ model to ensure that the required input parameters are correctly configured. To test the performance of the new dust module in CMAQ, the entire year 2011 is simulated for the continental United States, with particular emphasis on the southwestern United States (SWUS), where windblown dust concentrations are relatively large. Overall, the model shows good performance, with the daily mean bias of soil concentrations fluctuating in the range of ±1 µg m⁻³ for the entire year. Springtime soil concentrations are in quite good agreement (normalized mean bias of 8.3%) with observations, while moderate to high underestimation of soil concentration is seen in the summertime; the latter is attributed to the issue of representing convective dust storms…
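    The kernel of physics-based windblown dust schemes of this family is a horizontal saltation flux driven by the friction velocity above a threshold; a common form is the White (1979)-type expression used by Marticorena and Bergametti (1995). The sketch below implements that generic kernel only: CMAQ's actual treatment layers drag partitioning and the dynamic roughness length on top of it, and the constants here are illustrative.

    ```python
    import numpy as np

    RHO_AIR = 1.2   # air density, kg m^-3
    G = 9.81        # gravity, m s^-2

    def saltation_flux(u_star, u_star_t, c=1.0):
        """White (1979)-type horizontal saltation flux,
            Q = c * (rho/g) * u*^3 * (1 - R) * (1 + R)^2,  R = u*t / u*,
        active only when the friction velocity exceeds its threshold."""
        u_star = np.asarray(u_star, dtype=float)
        r = u_star_t / np.maximum(u_star, 1e-12)
        q = c * (RHO_AIR / G) * u_star**3 * (1.0 - r) * (1.0 + r) ** 2
        return np.where(u_star > u_star_t, q, 0.0)

    # Flux switches on sharply above an (illustrative) 0.35 m/s threshold.
    print(saltation_flux([0.2, 0.35, 0.5, 0.8], u_star_t=0.35))
    ```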

  6. Genomic imprinting in Drosophila has properties of both mammalian and insect imprinting.

    PubMed

    Anaka, Matthew; Lynn, Audra; McGinn, Patrick; Lloyd, Vett K

    2009-02-01

    Genomic imprinting is a process that marks DNA, causing a change in gene or chromosome behavior, depending on the sex of the transmitting parent. In mammals, most examples of genomic imprinting affect the transcription of individual or small clusters of genes whereas in insects, genomic imprinting tends to silence entire chromosomes. This has been interpreted as evidence of independent evolutionary origins for imprinting. To investigate how these types of imprinting are related, we performed a phenotypic, molecular, and cytological analysis of an imprinted chromosome in Drosophila melanogaster. Analysis of this chromosome reveals that the imprint results in transcriptional silencing. Yet, the domain of transcriptional silencing is very large, extending at least 1.2 Mb and encompassing over 100 genes, and is associated with decreased somatic polytenization of the entire chromosome. We propose that repression of somatic replication in polytenized cells, as a secondary response to the imprint, acts to extend the size of the imprinted domain to an entire chromosome. Thus, imprinting in Drosophila has properties of both typical mammalian and insect imprinting which suggests that genomic imprinting in Drosophila and mammals is not fundamentally different; imprinting is manifest as transcriptional silencing of a few genes or silencing of an entire chromosome depending on secondary processes such as differences in gene density and polytenization.

  7. Cleave and couple: toward fully sustainable catalytic conversion of lignocellulose to value added building blocks and fuels.

    PubMed

    Sun, Zhuohua; Barta, Katalin

    2018-06-21

    The structural complexity of lignocellulose offers unique opportunities for the development of entirely new, energy efficient and waste-free pathways in order to obtain valuable bio-based building blocks. Such sustainable catalytic methods - specifically tailored to address the efficient conversion of abundant renewable starting materials - are necessary to successfully compete, in the future, with fossil-based multi-step processes. In this contribution we give a summary of recent developments in this field and describe our "cleave and couple" strategy, where "cleave" refers to the catalytic deconstruction of lignocellulose to aromatic and aliphatic alcohol intermediates, and "couple" involves the development of novel, sustainable transformations for the formation of C-C and C-N bonds in order to obtain a range of attractive products from lignocellulose.

  8. Developing a framework for energy technology portfolio selection

    NASA Astrophysics Data System (ADS)

    Davoudpour, Hamid; Ashrafi, Maryam

    2012-11-01

    Today, the increased consumption of energy worldwide, together with the risk of rapid exhaustion of fossil resources, has forced industrial firms and organizations to adopt energy technology portfolio management, viewed both as a process of diversifying energy sources and of making optimal use of the sources available. Furthermore, the rapid development of technologies, their increasing complexity and variety, and market dynamics have made the task of technology portfolio selection difficult. Given the high level of competitiveness, organizations need to strategically allocate their limited resources to the best subset of possible candidates. This paper presents the results of developing a mathematical model for energy technology portfolio selection at an R&D center that maximizes support of the organization's strategy and values. The model balances the cost and benefit of the entire portfolio.
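
    The abstract does not give the model's exact formulation; as a rough illustration only, the core selection problem can be sketched as a 0/1 knapsack that maximizes total strategic benefit under a budget constraint. All candidate technologies and numbers below are invented:

      # Minimal sketch of an energy technology portfolio selection model:
      # choose the subset of candidate technologies that maximizes total
      # strategic benefit subject to a budget constraint (a 0/1 knapsack).

      def select_portfolio(candidates, budget):
          """candidates: list of (name, cost, benefit); returns best subset."""
          best_value, best_subset = 0.0, []
          n = len(candidates)
          for mask in range(1 << n):  # enumerate all subsets (fine for small n)
              cost = benefit = 0.0
              subset = []
              for i in range(n):
                  if mask & (1 << i):
                      name, c, b = candidates[i]
                      cost += c
                      benefit += b
                      subset.append(name)
              if cost <= budget and benefit > best_value:
                  best_value, best_subset = benefit, subset
          return best_subset, best_value

      candidates = [("solar-thermal", 4.0, 7.0), ("biogas", 3.0, 5.0),
                    ("fuel-cell", 5.0, 8.0), ("micro-hydro", 2.0, 3.0)]
      print(select_portfolio(candidates, budget=9.0))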

  9. Regenerative medicine in kidney disease: where we stand and where to go.

    PubMed

    Borges, Fernanda T; Schor, Nestor

    2017-07-22

    The kidney is a complex organ with more than 20 types of specialized cells that play an important role in maintaining the body's homeostasis. The epithelial tubular cell is formed during embryonic development and has little proliferative capacity under physiological conditions, but after acute injury the kidney does have regenerative capacity. However, after repetitive or severe lesions, it may undergo a maladaptation process that predisposes it to chronic kidney injury. Regenerative medicine includes various repair and regeneration techniques, and these have gained increasing attention in the scientific literature. In the future, not only will these techniques contribute to the repair and regeneration of the human kidney, but probably also to the construction of an entire organ. New mechanisms studied for kidney regeneration and repair include circulating stem cells such as mesenchymal stromal/stem cells and their paracrine mechanisms of action; renal progenitor stem cells; the leading role of tubular epithelial cells in the tubular repair process; the study of zebrafish larvae to understand the process of nephron development; kidney scaffolds and their repopulation; and, finally, the development of organoids. This review elucidates where we are in terms of current scientific knowledge regarding these mechanisms and the promises of future scientific perspectives.

  10. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    PubMed

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several developed and improved combinatorial techniques for optimizing the processing conditions and material properties of organic thin films. The combinatorial approach allows multi-variable dependencies to be investigated and is well suited to studying organic thin films for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate how combinations of composition and processing gradients can be applied to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is then carried out in very small areas arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries make it possible to identify precise trends for the optimization of multi-variable-dependent processes, which we demonstrate on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning: the reported combinatorial techniques can be transferred to other multi-variable-dependent processes and used to investigate and optimize thin-film layers and devices for optical, electro-optical, and electronic applications.
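
    As a rough numerical illustration of the library layout described above (two orthogonal gradients plus a third parameter arranged matrix-like), the following sketch builds a toy 12 x 12 ternary library; all parameter names and values are invented, not the paper's conditions:

      # Minimal sketch of a ternary combinatorial library: composition varies
      # left-to-right, temperature bottom-to-top, and a third parameter
      # (exposure dose) repeats matrix-like in small column groups.
      import numpy as np

      nx = ny = 12
      composition = np.tile(np.linspace(0.0, 1.0, nx), (ny, 1))               # x gradient
      temperature = np.tile(np.linspace(60.0, 120.0, ny)[:, None], (1, nx))   # y gradient
      doses = np.array([5.0, 10.0, 15.0])                                     # third parameter levels
      exposure = np.tile(doses[np.arange(nx) % 3], (ny, 1))                   # matrix-like repetition
      library = np.stack([composition, temperature, exposure])
      print(library.shape)   # (3, 12, 12): each site samples one (c, T, dose) triple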

  11. A Versatile Mounting Method for Long Term Imaging of Zebrafish Development.

    PubMed

    Hirsinger, Estelle; Steventon, Ben

    2017-01-26

    Zebrafish embryos offer an ideal experimental system to study complex morphogenetic processes due to their ease of accessibility and optical transparency. In particular, posterior body elongation is an essential process in embryonic development by which multiple tissue deformations act together to direct the formation of a large part of the body axis. In order to observe this process by long-term time-lapse imaging, it is necessary to utilize a mounting technique that provides sufficient support to maintain samples in the correct orientation during transfer to the microscope and acquisition. In addition, the mounting must also provide sufficient freedom of movement for the outgrowth of the posterior body region without affecting its normal development. Finally, there must be a certain degree of versatility in the mounting method to allow imaging on diverse imaging set-ups. Here, we present a mounting technique for imaging the development of posterior body elongation in the zebrafish D. rerio. This technique involves mounting embryos such that the head and yolk sac regions are almost entirely included in agarose, while leaving the posterior body region free to elongate and develop normally. We show how this can be adapted for upright, inverted and vertical light-sheet microscopy set-ups. While this protocol focuses on mounting embryos for imaging of the posterior body, it could easily be adapted for the live imaging of multiple other aspects of zebrafish development.

  12. Artificial neural networks to model formulation-property correlations in the process of inline-compounding on an injection moulding machine

    NASA Astrophysics Data System (ADS)

    Moritzer, Elmar; Müller, Ellen; Martin, Yannick; Kleeschulte, Rainer

    2015-05-01

    Today the global market poses great challenges for industrial product development. Complexity, diversity of variants, flexibility and individuality are just some of the features that products have to offer today. In addition, product series have shorter lifetimes. Because of their high capacity for adaptation, polymers are increasingly able to displace traditional materials such as wood, glass and metals from various fields of application. Polymers can only be used to substitute for other materials, however, if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important. Integrating the compounding step into the injection moulding process permits a more efficient and faster development process for a new polymer formulation, making it possible to create new product-specific materials. This process is called inline-compounding on an injection moulding machine. The entire process sequence is supported by software from Bayer Technology called Product Design Workbench (PDWB), which provides assistance in all the individual steps from data management, via analysis and model compilation, through to the optimization of the formulation and the design of experiments. The software is based on artificial neural networks and can model the formulation-property correlations, thus enabling different formulations to be optimized. In the study presented here, the workflow and the modelling with the software are described.
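
    PDWB itself is proprietary and its interfaces are not described in the abstract; the following sketch only illustrates the general idea of fitting a neural network to formulation-property data, using scikit-learn and synthetic placeholder data:

      # Minimal sketch of modelling a formulation-property correlation with
      # an artificial neural network. The formulation inputs and the target
      # property are random/synthetic placeholders, not the paper's data.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, size=(200, 3))   # e.g. filler, additive, pigment fractions
      y = 50 + 30 * X[:, 0] - 20 * X[:, 1] ** 2 + 5 * X[:, 2]  # synthetic property

      model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
      model.fit(X[:150], y[:150])            # train on 150 formulations
      print("held-out R^2:", model.score(X[150:], y[150:]))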

  13. Frontal slab composite magnetic resonance neurography of the brachial plexus: implications for infraclavicular block approaches.

    PubMed

    Raphael, David T; McIntee, Diane; Tsuruda, Jay S; Colletti, Patrick; Tatevossian, Ray

    2005-12-01

    Magnetic resonance neurography (MRN) is an imaging method by which nerves can be selectively highlighted. Using commercial software, the authors explored a variety of approaches to develop a three-dimensional volume-rendered MRN image of the entire brachial plexus and used it to evaluate the accuracy of infraclavicular block approaches. With institutional review board approval, MRN of the brachial plexus was performed in 10 volunteer subjects. MRN imaging was performed on a GE 1.5-tesla magnetic resonance scanner (General Electric Healthcare Technologies, Waukesha, WI) using a phased array torso coil. Coronal STIR and T1 oblique sagittal sequences of the brachial plexus were obtained. Multiple software programs were explored for enhanced display and manipulation of the composite magnetic resonance images. The authors developed a frontal slab composite approach that allows single-frame reconstruction of a three-dimensional volume-rendered image of the entire brachial plexus. Automatic segmentation was supplemented by manual segmentation in nearly all cases. For each of three infraclavicular approaches (posteriorly directed needle below midclavicle, infracoracoid, or caudomedial to coracoid), the targeting error was measured as the distance from the MRN plexus midpoint to the approach-targeted site. Composite frontal slabs (coronal views), which are single-frame three-dimensional volume renderings from image-enhanced two-dimensional frontal view projections of the underlying coronal slices, were created. The targeting errors (mean ± SD) for the three approaches (midclavicle, infracoracoid, and caudomedial to coracoid) were 0.43 ± 0.67, 0.99 ± 1.22, and 0.65 ± 1.14 cm, respectively. Image-processed three-dimensional volume-rendered MRN scans, which allow visualization of the entire brachial plexus within a single composite image, have educational value in illustrating the complexity and individual variation of the plexus. Suggestions for improved guidance during infraclavicular block procedures are presented.

  14. Certification of vapor phase hydrogen peroxide sterilization process for spacecraft application

    NASA Technical Reports Server (NTRS)

    Rohatgi, N.; Schubert, W.; Koukol, R.; Foster, T. L.; Stabekis, P. D.

    2002-01-01

    This paper describes the selection process and research activities JPL is planning to conduct for certification of hydrogen peroxide as a NASA approved technique for sterilization of various spacecraft parts/components and entire modern spacecraft.

  15. Automated drug identification system

    NASA Technical Reports Server (NTRS)

    Campen, C. F., Jr.

    1974-01-01

    System speeds up analysis of blood and urine and is capable of identifying 100 commonly abused drugs. System includes computer that controls entire analytical process by ordering various steps in specific sequences. Computer processes data output and has readout of identified drugs.

  16. Variable dynamic testbed vehicle : safety plan

    DOT National Transportation Integrated Search

    1997-02-01

    This safety document covers the entire safety process from inception to delivery of the Variable Dynamic Testbed Vehicle. In addition to addressing the process of safety on the vehicle , it should provide a basis on which to build future safety proce...

  17. A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases.

    PubMed

    Pérez, Joaquín; Iturbide, Emmanuel; Olivares, Víctor; Hidalgo, Miguel; Martínez, Alicia; Almanza, Nelva

    2015-11-01

    It is known that the data preparation phase is the most time-consuming in the data mining process, using from 50% up to 70% of the total project time. Currently, data mining methodologies are general-purpose, and one of their limitations is that they do not provide guidance about which particular tasks to carry out in a specific domain. This paper presents a new data preparation methodology oriented to the epidemiological domain, in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is fourteen specialized tasks concerning this domain. To validate the proposed methodology, we developed a data mining system and applied the entire process to real mortality databases. The results were encouraging: the use of the methodology reduced some of the time-consuming tasks, and the data mining system revealed previously unknown and potentially useful patterns for the public health services in Mexico.
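
    As a loose illustration of the kind of General Data Preparation tasks such a methodology prescribes (the paper's fourteen specialized tasks are not reproduced here), a toy mortality table can be deduplicated, imputed and recoded as follows; all column names and values are invented:

      # Minimal sketch of generic data preparation steps on a mortality table.
      import pandas as pd

      df = pd.DataFrame({
          "age": [64, 64, None, 80],
          "sex": ["M", "M", "F", "F"],
          "cause_icd10": ["I21", "I21", "E11", "J44"],
      })
      df = df.drop_duplicates()                         # remove duplicate records
      df["age"] = df["age"].fillna(df["age"].median())  # impute missing ages
      df["sex"] = df["sex"].map({"M": 0, "F": 1})       # recode category for mining
      print(df)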

  18. Development of an automated ammunition processing system for battlefield use

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speaks, D.M.; Chesser, J.B.; Lloyd, P.D.

    1995-03-01

    The Future Armored Resupply Vehicle (FARV) will be the companion ammunition resupply vehicle to the Advanced Field Artillery System (AFAS). These systems are currently being investigated by the US Army for future acquisition. The FARV will sustain the AFAS with ammunition and fuel and will significantly increase capabilities over current resupply vehicles. Currently ammunition is transferred to field artillery almost entirely by hand. The level of automation to be included into the FARV is still under consideration. At the request of the US Army's Project Manager, AFAS/FARV, Oak Ridge National Laboratory (ORNL) identified and evaluated various concepts for the automated upload, processing, storage, and delivery equipment for the FARV. ORNL, working with the sponsor, established basic requirements and assumptions for concept development and the methodology for concept selection. A preliminary concept has been selected, and the associated critical technologies have been identified. ORNL has provided technology demonstrations of many of these critical technologies. A technology demonstrator which incorporates all individual components into a total process demonstration is planned for late FY 1995.

  19. Automatic conversational scene analysis in children with Asperger syndrome/high-functioning autism and typically developing peers.

    PubMed

    Tavano, Alessandro; Pesarin, Anna; Murino, Vittorio; Cristani, Marco

    2014-01-01

    Individuals with Asperger syndrome/High Functioning Autism fail to spontaneously attribute mental states to the self and others, a life-long phenotypic characteristic known as mindblindness. We hypothesized that mindblindness would affect the dynamics of conversational interaction. Using generative models, in particular Gaussian mixture models and observed influence models, conversations were coded as interacting Markov processes, operating on novel speech/silence patterns, termed Steady Conversational Periods (SCPs). SCPs assume that whenever an agent's process changes state (e.g., from silence to speech), it causes a general transition of the entire conversational process, forcing inter-actant synchronization. SCPs fed into observed influence models, which captured the conversational dynamics of children and adolescents with Asperger syndrome/High Functioning Autism, and age-matched typically developing participants. Analyzing the parameters of the models by means of discriminative classifiers, the dialogs of patients were successfully distinguished from those of control participants. We conclude that meaning-free speech/silence sequences, reflecting inter-actant synchronization, at least partially encode typical and atypical conversational dynamics. This suggests a direct influence of theory of mind abilities onto basic speech initiative behavior.
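
    The authors' exact SCP implementation is not given in the abstract; under the stated reading that any state change by either speaker triggers a general transition, SCP extraction from two binary speech/silence tracks on a common time grid can be sketched as:

      # Minimal sketch of extracting Steady Conversational Periods (SCPs):
      # an SCP is a maximal run in which neither speaker changes state; any
      # state change by either speaker starts a new SCP. This framing is our
      # reading of the abstract, not the authors' code.

      def steady_periods(track_a, track_b):
          periods, start = [], 0
          for t in range(1, len(track_a)):
              if track_a[t] != track_a[t - 1] or track_b[t] != track_b[t - 1]:
                  periods.append((start, t, (track_a[start], track_b[start])))
                  start = t
          periods.append((start, len(track_a), (track_a[start], track_b[start])))
          return periods  # list of (begin, end, joint speech/silence state)

      a = [0, 0, 1, 1, 1, 0, 0, 0]   # speaker A: 1 = speech, 0 = silence
      b = [1, 1, 1, 0, 0, 0, 1, 1]   # speaker B
      for begin, end, state in steady_periods(a, b):
          print(f"frames {begin}-{end}: joint state {state}")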

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mechalakos, J.

    The process of converting to an electronic chart for radiation therapy can be daunting. It requires a dedicated committee to first research and choose appropriate software, to review the entire documentation policy and workflow of the clinic, and to convert this system to electronic form or, if necessary, redesign it to conform more easily to the electronic process. Those making the conversion, and those who already use electronic charting, would benefit from the shared experience of those who have been through the process before. Therefore, TG262 was convened to provide guidance on electronic charting for external beam radiation therapy and brachytherapy. This course will present the results of an internal survey of task group members on EMR practices in external beam radiation therapy, and will discuss important issues in EMR development and structure for both EBRT and brachytherapy. Learning objectives: be familiarized with common practices and pitfalls in the development and maintenance of an electronic chart in radiation oncology; with important issues related to electronic charting in external beam radiation therapy; and with important issues related to electronic charting in brachytherapy.

  1. Lysine Fermentation: History and Genome Breeding.

    PubMed

    Ikeda, Masato

    Lysine fermentation by Corynebacterium glutamicum was developed in 1958 by Kyowa Hakko Kogyo Co. Ltd. (current Kyowa Hakko Bio Co. Ltd.) and is the second oldest amino acid fermentation process after glutamate fermentation. The fundamental mechanism of lysine production, discovered in the early stages of the process's history, gave birth to the concept known as "metabolic regulatory fermentation," which is now widely applied to metabolite production. After the development of rational metabolic engineering, research on lysine production first highlighted the need for engineering of the central metabolism from the viewpoints of precursor supply and NADPH regeneration. Furthermore, the existence of active export systems for amino acids was first demonstrated for lysine in C. glutamicum, and this discovery has resulted in the current recognition of such exporters as an important consideration in metabolite production. Lysine fermentation is also notable as the first process to which genomics was successfully applied to improve amino acid production. The first global "genome breeding" strategy was developed using a lysine producer as a model; this has since led to new lysine producers that are more efficient than classical industrial producers. These advances in strain development technology, combined with recent systems-level approaches, have almost achieved the optimization of entire cellular systems as cell factories for lysine production. In parallel, continuous improvement of the process has resulted not only in fermentation processes with a reduced load on downstream processing but also in the commercialization of various product forms according to their intended uses. Today, lysine fermentation meets a global demand of more than 2 million metric tons per year.

  2. Modelling Coastal Cliff Recession Based on the GIM-DDD Method

    NASA Astrophysics Data System (ADS)

    Gong, Bin; Wang, Shanyong; Sloan, Scott William; Sheng, Daichao; Tang, Chun'an

    2018-04-01

    The unpredictable and instantaneous collapse behaviour of coastal rocky cliffs may cause damage that extends significantly beyond the area of failure. Gravitational movements that occur during coastal cliff recession involve two major stages: the small deformation stage and the large displacement stage. In this paper, a method of simulating the entire progressive failure process of coastal rocky cliffs is developed based on the gravity increase method (GIM), the rock failure process analysis method and the discontinuous deformation analysis method, and it is referred to as the GIM-DDD method. The small deformation stage, which includes crack initiation, propagation and coalescence processes, and the large displacement stage, which includes block translation and rotation processes during the rocky cliff collapse, are modelled using the GIM-DDD method. In addition, acoustic emissions, stress field variations, crack propagation and failure mode characteristics are further analysed to provide insights that can be used to predict, prevent and minimize potential economic losses and casualties. The calculation and analytical results are consistent with previous studies, which indicate that the developed method provides an effective and reliable approach for performing rocky cliff stability evaluations and coastal cliff recession analyses and has considerable potential for improving the safety and protection of seaside cliff areas.
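
    The coupled rock-failure and discontinuous-deformation mechanics are far beyond a snippet, but the driver loop of the gravity increase method (GIM) can be sketched as follows; the stability check and all numbers are hypothetical stand-ins, not the paper's model:

      # Minimal sketch of a gravity increase method (GIM) driver loop: scale
      # gravity until the cliff model first reports failure; the failing
      # multiplier approximates the factor of safety.

      def gravity_increase(cliff_is_stable, g0=9.81, dg=0.05, g_max=10.0):
          factor = 1.0
          while factor <= g_max:
              if not cliff_is_stable(factor * g0):
                  return factor        # first multiplier that triggers failure
              factor += dg
          return None                  # stable over the whole sweep

      # Toy strength criterion standing in for the coupled simulation:
      # fails once the gravity-driven stress exceeds a fixed resistance.
      print(gravity_increase(lambda g: g * 1800 * 30 < 6.0e5))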

  3. Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plechac, Petr; Vlachos, Dionisios; Katsoulakis, Markos

    2013-09-05

    The overall objective of this project is to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems, whose performance relies on underlying multiscale mathematics. Our specific application lies at the heart of biofuels initiatives of DOE and entails modeling of catalytic systems, to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals. Specific goals include: (i) Development of rigorous spatio-temporal coarse-grained kinetic Monte Carlo (KMC) mathematics and simulation for microscopic processes encountered in biomass transformation. (ii) Development of hybrid multiscale simulation that links stochastic simulation to a deterministic partial differential equation (PDE) model for an entire reactor. (iii) Development of hybrid multiscale simulation that links KMC simulation with quantum density functional theory (DFT) calculations. (iv) Development of parallelization of models of (i)-(iii) to take advantage of Petaflop computing and enable real world applications of complex, multiscale models. In this NCE period, we continued addressing these objectives and completed the proposed work. Main initiatives, key results, and activities are outlined.
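
    As a loose illustration of goal (i), a minimal kinetic Monte Carlo (Gillespie-type) loop for a toy adsorption/desorption lattice is sketched below; the rates and system size are invented and unrelated to the project's models:

      # Minimal kinetic Monte Carlo sketch: two competing events (adsorption
      # onto empty sites, desorption from occupied sites) selected in
      # proportion to their rates, with exponentially distributed waiting times.
      import numpy as np

      rng = np.random.default_rng(0)
      n_sites, occupied = 100, 0
      k_ads, k_des, t = 1.0, 0.5, 0.0
      for _ in range(10000):
          rates = np.array([k_ads * (n_sites - occupied), k_des * occupied])
          total = rates.sum()
          t += rng.exponential(1.0 / total)                       # time to next event
          occupied += 1 if rng.random() < rates[0] / total else -1  # pick event
      print(f"coverage {occupied / n_sites:.2f} after t = {t:.1f}")  # ~ 2/3 at equilibrium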

  4. MicroRNA, mRNA, and protein expression link development and aging in human and macaque brain

    PubMed Central

    Somel, Mehmet; Guo, Song; Fu, Ning; Yan, Zheng; Hu, Hai Yang; Xu, Ying; Yuan, Yuan; Ning, Zhibin; Hu, Yuhui; Menzel, Corinna; Hu, Hao; Lachmann, Michael; Zeng, Rong; Chen, Wei; Khaitovich, Philipp

    2010-01-01

    Changes in gene expression levels determine differentiation of tissues involved in development and are associated with functional decline in aging. Although development is tightly regulated, the transition between development and aging, as well as regulation of post-developmental changes, are not well understood. Here, we measured messenger RNA (mRNA), microRNA (miRNA), and protein expression in the prefrontal cortex of humans and rhesus macaques over the species' life spans. We find that few gene expression changes are unique to aging. Instead, the vast majority of miRNA and gene expression changes that occur in aging represent reversals or extensions of developmental patterns. Surprisingly, many gene expression changes previously attributed to aging, such as down-regulation of neural genes, initiate in early childhood. Our results indicate that miRNA and transcription factors regulate not only developmental but also post-developmental expression changes, with a number of regulatory processes continuing throughout the entire life span. Differential evolutionary conservation of the corresponding genomic regions implies that these regulatory processes, although beneficial in development, might be detrimental in aging. These results suggest a direct link between developmental regulation and expression changes taking place in aging. PMID:20647238

  5. Historic Frontier Processes active in Future Space-Based Mineral Extraction

    NASA Astrophysics Data System (ADS)

    Gray, D. M.

    2000-01-01

    The forces that shaped historic mining frontiers are in many cases not bound by geographic or temporal limits. The forces that helped define historic frontiers are active in today's physical and virtual frontiers, and will be present in future space-based frontiers. While frontiers derived from position and technology are primarily economic in nature, non-economic conditions affect the success or failure of individual frontier endeavors, local "mining camps" and even entire frontiers. Frontiers can be defined as the line of activity that divides the established markets and infrastructure of civilization from the unclaimed resources and potential wealth of a wilderness. At the frontier line, ownership of resources is established. The resource can then be developed using capital, energy and information. In a mining setting, the resource is concentrated for economic shipment to the markets of civilization. Profits from the sale of the resource are then used to fund further development of the resource and/or pay investors. Both positional and technical frontiers develop as a series of generations. The profits from each generation of development provide the capital and/or investment incentive for the next round of development. Without profit, the self-replicating process of frontiers stops.

  6. Comparative study of thermochemical processes for hydrogen production from biomass fuels.

    PubMed

    Biagini, Enrico; Masoni, Lorenzo; Tognotti, Leonardo

    2010-08-01

    Different thermochemical configurations (gasification, combustion, electrolysis and syngas separation) are studied for producing hydrogen from biomass fuels. The aim is to provide data for the production unit and the subsequent optimization of the "hydrogen chain" (from energy source selection to hydrogen utilization) in the frame of the Italian project "Filiera Idrogeno". The project focuses on a regional scale (Tuscany, Italy), renewable energies and automotive hydrogen. Decentralized, small production plants are required to solve the logistics problems of biomass supply and to match the limited hydrogen infrastructure. Different options (gasification with air, oxygen or steam/oxygen mixtures, combustion, electrolysis) and conditions (varying the ratios of biomass and gas input) are studied by developing process models with uniform hypotheses so that the results can be compared. The results obtained in this work concern the operating parameters, process efficiencies, and material and energy needs, and are fundamental for optimizing the entire hydrogen chain. Copyright 2010 Elsevier Ltd. All rights reserved.

  7. IMM estimator with out-of-sequence measurements

    NASA Astrophysics Data System (ADS)

    Bar-Shalom, Yaakov; Chen, Huimin

    2004-08-01

    In multisensor tracking systems that operate in a centralized information processing architecture, measurements from the same target obtained by different sensors can arrive at the processing center out of sequence. In order to avoid either a delay in the output or the need for reordering and reprocessing an entire sequence of measurements, such measurements have to be processed as out-of-sequence measurements (OOSM). Recent work developed procedures for incorporating OOSMs into a Kalman filter (KF). Since the state-of-the-art tracker for real (maneuvering) targets is the Interacting Multiple Model (IMM) estimator, this paper presents the algorithm for incorporating OOSMs into an IMM estimator. Both data association and estimation are considered. Simulation results are presented for two realistic problems using measurements from two airborne GMTI sensors. It is shown that the proposed algorithm for incorporating OOSMs into an IMM estimator yields practically the same performance as the reordering and in-sequence reprocessing of the measurements.
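
    The paper's OOSM update equations are not reproduced in the abstract; for contrast, the expensive baseline that such algorithms avoid (buffering, re-sorting and re-running the filter) can be sketched with a scalar Kalman filter and invented noise levels:

      # Minimal sketch of the naive alternative that OOSM algorithms avoid:
      # keep all timestamped measurements, re-sort when a late one arrives,
      # and re-run the whole Kalman filter. Scalar random-walk state model
      # with a fixed per-step process noise, purely for illustration.
      import numpy as np

      def kf_run(z_sorted, q=0.01, r=1.0):
          x, p = 0.0, 1e3                          # diffuse prior
          for _, z in z_sorted:
              p += q                               # predict (random-walk state)
              k = p / (p + r)                      # Kalman gain
              x, p = x + k * (z - x), (1 - k) * p  # measurement update
          return x, p

      buffer = [(0.0, 1.1), (1.0, 0.9), (3.0, 1.2)]   # (time, measurement)
      buffer.append((2.0, 1.0))                       # late, out-of-sequence arrival
      print(kf_run(sorted(buffer)))                   # full reprocessing from scratch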

  8. Advanced Biofuels and Beyond: Chemistry Solutions for Propulsion and Production.

    PubMed

    Leitner, Walter; Klankermayer, Jürgen; Pischinger, Stefan; Pitsch, Heinz; Kohse-Höinghaus, Katharina

    2017-05-08

    Sustainably produced biofuels, especially when they are derived from lignocellulosic biomass, are being discussed intensively for future ground transportation. Traditionally, research activities focus on the synthesis process, while leaving their combustion properties to be evaluated by a different community. This Review adopts an integrative view of engine combustion and fuel synthesis, focusing on chemical aspects as the common denominator. It will be demonstrated that a fundamental understanding of the combustion process can be instrumental to derive design criteria for the molecular structure of fuel candidates, which can then be targets for the analysis of synthetic pathways and the development of catalytic production routes. With such an integrative approach to fuel design, it will be possible to improve systematically the entire system, spanning biomass feedstock, conversion process, fuel, engine, and pollutants with a view to improve the carbon footprint, increase efficiency, and reduce emissions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. On-line photolithography modeling using spectrophotometry and Prolith/2

    NASA Astrophysics Data System (ADS)

    Engstrom, Herbert L.; Beacham, Jeanne E.

    1994-05-01

    Spectrophotometry has been applied to optimizing photolithography processes in semiconductor manufacturing. For many years thin film measurement systems have been used in manufacturing for controlling film deposition processes. The combination of film thickness mapping with photolithography modeling has expanded the applications of this technology. Experimental measurements of dose-to-clear, the minimum light exposure dose required to fully develop a photoresist, are described. It is shown how dose-to-clear and photoresist contrast may be determined rapidly and conveniently from measurements of a dose exposure matrix on a monitor wafer. Such experimental measurements may underestimate the dose-to-clear because of thickness variations of the photoresist and underlying layers on the product wafer. Online modeling of the photolithographic process together with film thickness maps of the entire wafer can overcome this problem. Such modeling also provides maps of dose-to-clear and resist linewidth that can be used to estimate and optimize yield.
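
    The abstract does not give the extraction procedure in detail; one common way to estimate dose-to-clear and contrast from a dose exposure matrix is to fit normalized remaining thickness against log dose and extrapolate to zero, sketched here with invented measurements:

      # Minimal sketch: fit normalized remaining resist thickness against
      # log10(dose) in its linear region; the contrast is the slope magnitude
      # and the dose-to-clear is where the fit crosses zero thickness.
      import numpy as np

      dose = np.array([40.0, 50.0, 60.0, 70.0, 80.0])       # mJ/cm^2 (illustrative)
      thickness = np.array([0.80, 0.55, 0.34, 0.15, 0.02])  # normalized remaining

      slope, intercept = np.polyfit(np.log10(dose), thickness, 1)
      gamma = -slope                     # positive-tone resist contrast
      e0 = 10 ** (-intercept / slope)    # dose where fitted thickness hits zero
      print(f"contrast = {gamma:.2f}, dose-to-clear = {e0:.1f} mJ/cm^2")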

  10. Long term trending of engineering data for the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Cox, Ross M.

    1993-01-01

    A major goal in spacecraft engineering analysis is the detection of component failures before the fact. Trending is the process of monitoring subsystem states to discern unusual behaviors. This involves reducing vast amounts of data about a component or subsystem into a form that helps humans discern underlying patterns and correlations. A long term trending system has been developed for the Hubble Space Telescope. Besides processing the data for 988 distinct telemetry measurements each day, it produces plots of 477 important parameters for the entire 24 hours. Daily updates to the trend files also produce 339 thirty day trend plots each month. The total system combines command procedures to control the execution of the C-based data processing program, user-written FORTRAN routines, and commercial off-the-shelf plotting software. This paper includes a discussion of the performance of the trending system and of its limitations.
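
    The original command-procedure/FORTRAN/C implementation is not shown in the abstract; the daily-reduction and thirty-day-trend idea can be sketched in modern terms with pandas, using an invented telemetry channel:

      # Minimal sketch of telemetry trending: reduce hourly raw samples to
      # daily statistics, then derive a 30-day rolling trend.
      import numpy as np
      import pandas as pd

      idx = pd.date_range("1993-01-01", periods=90 * 24, freq="h")
      raw = pd.DataFrame(
          {"battery_temp": 20 + np.random.default_rng(0).normal(0, 0.5, len(idx))},
          index=idx,
      )

      daily = raw.resample("D").agg(["mean", "min", "max"])        # daily trend file
      monthly_trend = daily[("battery_temp", "mean")].rolling(30).mean()
      print(monthly_trend.dropna().tail())                         # 30-day trend values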

  11. The ISMARA client

    PubMed Central

    Ioannidis, Vassilios; van Nimwegen, Erik; Stockinger, Heinz

    2016-01-01

    ISMARA (ismara.unibas.ch) automatically infers the key regulators and regulatory interactions from high-throughput gene expression or chromatin state data. However, given the large sizes of current next generation sequencing (NGS) datasets, data uploading times are a major bottleneck. Additionally, for proprietary data, users may be uncomfortable with uploading entire raw datasets to an external server. Both these problems could be alleviated by providing a means by which users could pre-process their raw data locally, transferring only a small summary file to the ISMARA server. We developed a stand-alone client application that pre-processes large input files (RNA-seq or ChIP-seq data) on the user's computer for performing ISMARA analysis in a completely automated manner, including uploading of small processed summary files to the ISMARA server. This reduces file sizes by up to a factor of 1000, and upload times from many hours to mere seconds. The client application is available from ismara.unibas.ch/ISMARA/client. PMID:28232860

  12. VAMPnets for deep learning of molecular kinetics.

    PubMed

    Mardt, Andreas; Pasquali, Luca; Wu, Hao; Noé, Frank

    2018-01-02

    There is an increasing demand for computing the relevant structures, equilibria, and long-timescale kinetics of biomolecular processes, such as protein-drug binding, from high-throughput molecular dynamics simulations. Current methods employ transformation of simulated coordinates into structural features, dimension reduction, clustering the dimension-reduced data, and estimation of a Markov state model or related model of the interconversion rates between molecular structures. This handcrafted approach demands a substantial amount of modeling expertise, as poor decisions at any step will lead to large modeling errors. Here we employ the variational approach for Markov processes (VAMP) to develop a deep learning framework for molecular kinetics using neural networks, dubbed VAMPnets. A VAMPnet encodes the entire mapping from molecular coordinates to Markov states, thus combining the whole data processing pipeline in a single end-to-end framework. Our method performs as well as or better than state-of-the-art Markov modeling methods and provides easily interpretable few-state kinetic models.
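
    The abstract does not spell out the training objective; the VAMP-2 score that a VAMPnet maximizes can be sketched from time-lagged feature matrices as follows, with random placeholder data standing in for trained network outputs:

      # Minimal sketch of the VAMP-2 score: given mean-free features chi(x_t)
      # and chi(x_{t+tau}), the score is 1 plus the squared Frobenius norm of
      # C00^(-1/2) C01 C11^(-1/2) (the "+1" counts the constant singular function).
      import numpy as np

      def vamp2_score(chi_t, chi_tau, eps=1e-10):
          chi_t = chi_t - chi_t.mean(0)
          chi_tau = chi_tau - chi_tau.mean(0)
          n = len(chi_t)
          c00 = chi_t.T @ chi_t / n          # instantaneous covariance at t
          c11 = chi_tau.T @ chi_tau / n      # instantaneous covariance at t+tau
          c01 = chi_t.T @ chi_tau / n        # time-lagged cross-covariance

          def inv_sqrt(c):
              w, v = np.linalg.eigh(c)
              w = np.maximum(w, eps)         # guard against singular covariances
              return v @ np.diag(w ** -0.5) @ v.T

          k = inv_sqrt(c00) @ c01 @ inv_sqrt(c11)
          return 1.0 + np.sum(k ** 2)

      rng = np.random.default_rng(0)
      chi_t = rng.standard_normal((1000, 4))                       # placeholder features
      print(vamp2_score(chi_t, np.roll(chi_t, -1, axis=0)))        # lag-1 pairs (wraparound ignored)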

  13. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
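
    As a rough illustration of the tailored-interface idea (not the authors' code), a single parameter of a toy linear-reservoir response can be exposed through an IPython/Jupyter widget in a few lines; the model and parameter names are invented:

      # Minimal sketch of a notebook-based interface: one slider drives one
      # model parameter of a toy hydrologic response, replacing a monolithic GUI.
      import numpy as np
      import matplotlib.pyplot as plt
      from ipywidgets import interact, FloatSlider

      rain = np.zeros(100)
      rain[10:20] = 5.0                        # synthetic storm input

      def plot_discharge(k=0.1):
          """Linear reservoir: storage S gains rain, drains at rate k; Q = k*S."""
          s, q = 0.0, []
          for r in rain:
              s += r - k * s
              q.append(k * s)
          plt.plot(q)
          plt.xlabel("time step")
          plt.ylabel("discharge")
          plt.show()

      interact(plot_discharge, k=FloatSlider(min=0.01, max=0.5, step=0.01, value=0.1))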

  14. Guidelines of the Design of Electropyrotechnic Firing Circuit for Unmanned Flight and Ground Test Projects

    NASA Technical Reports Server (NTRS)

    Gonzalez, Guillermo A.; Lucy, Melvin H.; Massie, Jeffrey J.

    2013-01-01

    The NASA Langley Research Center, Engineering Directorate, Electronic System Branch, is responsible for providing pyrotechnic support capabilities to Langley Research Center unmanned flight and ground test projects. These capabilities include device selection, procurement, testing, problem solving, firing system design, fabrication and testing; ground support equipment design, fabrication and testing; and checkout procedures and procedures training for pyro technicians. This technical memorandum will serve as a guideline for the design, fabrication and testing of electropyrotechnic firing systems. The guidelines will discuss the entire process, beginning with requirements definition and ending with development and execution.

  15. Preface: Special Topic on Nuclear Quantum Effects

    NASA Astrophysics Data System (ADS)

    Tuckerman, Mark; Ceperley, David

    2018-03-01

    Although the observable universe strictly obeys the laws of quantum mechanics, in many instances, a classical description that either ignores quantum effects entirely or accounts for them at a very crude level is sufficient to describe a wide variety of phenomena. However, when this approximation breaks down, as is often the case for processes involving light nuclei, a full quantum treatment becomes indispensable. This Special Topic in The Journal of Chemical Physics showcases recent advances in our understanding of nuclear quantum effects in condensed phases as well as novel algorithmic developments and applications that have enhanced the capability to study these effects.

  16. Numerical simulation of plasma response to externally applied resonant magnetic perturbation on the J-TEXT tokamak

    NASA Astrophysics Data System (ADS)

    Bicheng, LI; Zhonghe, JIANG; Jian, LV; Xiang, LI; Bo, RAO; Yonghua, DING

    2018-05-01

    Nonlinear magnetohydrodynamic (MHD) simulations of an equilibrium on the J-TEXT tokamak with applied resonant magnetic perturbations (RMPs) are performed with NIMROD (non-ideal MHD with rotation, open discussion). Numerical simulation of the plasma response to RMPs has been developed to investigate the magnetic topology, the plasma density and the rotation profile. The results indicate that the applied RMPs alone can excite the 2/1 mode as well as the 3/1 mode through toroidal mode coupling, and ultimately change the density profile through particle transport. At the same time, plasma rotation plays an important role throughout the entire evolution process.

  17. Postacute Care in Cancer Rehabilitation.

    PubMed

    Guo, Ying; Fu, Jack B; Guo, Hong; Camp, Jennifer; Shin, Ki Y; Tu, Shi-Ming; Palmer, Lynn J; Yadav, Rajesh

    2017-02-01

    Acute care is usually associated with disease progression, treatments for cancer, and medical comorbidities. Patients with cancer may develop sudden functional deficits that require rehabilitation. Some of these patients benefit from acute rehabilitation, while others benefit from subacute rehabilitation. Continuing care for these patients after acute rehabilitation has not been well described. Three studies are presented to demonstrate that cancer rehabilitation is a continuous process. Rehabilitation professionals should know how to detect fall risk, monitor symptoms, and render symptom management. Patients with cancer often require rehabilitation services during their entire disease trajectory. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Preface: Special Topic on Nuclear Quantum Effects.

    PubMed

    Tuckerman, Mark; Ceperley, David

    2018-03-14

    Although the observable universe strictly obeys the laws of quantum mechanics, in many instances, a classical description that either ignores quantum effects entirely or accounts for them at a very crude level is sufficient to describe a wide variety of phenomena. However, when this approximation breaks down, as is often the case for processes involving light nuclei, a full quantum treatment becomes indispensable. This Special Topic in The Journal of Chemical Physics showcases recent advances in our understanding of nuclear quantum effects in condensed phases as well as novel algorithmic developments and applications that have enhanced the capability to study these effects.

  19. Assessing treatment integrity in cognitive-behavioral therapy: comparing session segments with entire sessions.

    PubMed

    Weck, Florian; Grikscheit, Florian; Höfling, Volkmar; Stangier, Ulrich

    2014-07-01

    The evaluation of treatment integrity (therapist adherence and competence) is a necessary condition to ensure the internal and external validity of psychotherapy research. However, the evaluation process is associated with high costs, because therapy sessions must be rated by experienced clinicians. It is debatable whether rating session segments is an adequate alternative to rating entire sessions. Four judges evaluated treatment integrity (i.e., therapist adherence and competence) in 84 randomly selected videotapes of cognitive-behavioral therapy for major depressive disorder, social anxiety disorder, and hypochondriasis (from three different treatment outcome studies). In each case, two judges provided ratings based on entire therapy sessions and two on session segments only (i.e., the middle third of the entire sessions). Interrater reliability of adherence and competence evaluations proved satisfactory for ratings based on segments and the level of reliability did not differ from ratings based on entire sessions. Ratings of treatment integrity that were based on entire sessions and session segments were strongly correlated (r=.62 for adherence and r=.73 for competence). The relationship between treatment integrity and outcome was comparable for ratings based on session segments and those based on entire sessions. However, significant relationships between therapist competence and therapy outcome were only found in the treatment of social anxiety disorder. Ratings based on segments proved to be adequate for the evaluation of treatment integrity. The findings demonstrate that session segments are an adequate and cost-effective alternative to entire sessions for the evaluation of therapist adherence and competence. Copyright © 2014. Published by Elsevier Ltd.

  20. A methodology to select a wire insulation for use in habitable spacecraft.

    PubMed

    Paulos, T; Apostolakis, G

    1998-08-01

    This paper investigates electrical overheating events aboard a habitable spacecraft. The wire insulation involved in these failures plays a major role in the entire event scenario, from threat development to detection and damage assessment. Ideally, if models of wire overheating events in microgravity existed, the various wire insulations under consideration could be compared quantitatively. However, these models do not exist. In this paper, a methodology is developed that can be used to select a wire insulation that is best suited for use in a habitable spacecraft. The results of this study show that, based upon the Analytic Hierarchy Process, the simplifying assumptions made, the criteria selected, and the data used in the analysis, Tefzel is better suited than Teflon for use in a habitable spacecraft.
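
    The paper's comparison matrices are not reproduced in the abstract; the core AHP weighting step can be sketched as follows, with illustrative judgments and alternative scores that are not the study's data:

      # Minimal sketch of the Analytic Hierarchy Process: derive criterion
      # weights from a pairwise comparison matrix via its principal
      # eigenvector, then score the insulation alternatives.
      import numpy as np

      # Pairwise comparisons among three illustrative criteria, e.g.
      # flammability, toxicity, mechanical robustness (Saaty 1-9 scale,
      # with A[i, j] = 1 / A[j, i]).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      w, v = np.linalg.eig(A)
      principal = np.argmax(w.real)                 # Perron (largest) eigenvalue
      weights = np.abs(v[:, principal].real)
      weights /= weights.sum()                      # normalized criterion weights

      # Per-criterion scores for the two alternatives (rows), illustrative only.
      scores = np.array([[0.7, 0.6, 0.8],   # Tefzel
                         [0.6, 0.7, 0.5]])  # Teflon
      print(dict(zip(["Tefzel", "Teflon"], scores @ weights)))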

  1. Theory-Driven Process Evaluation of a Complementary Feeding Trial in Four Countries

    ERIC Educational Resources Information Center

    Newman, Jamie E.; Garces, Ana; Mazariegos, Manolo; Hambidge, K. Michael; Manasyan, Albert; Tshefu, Antoinette; Lokangaka, Adrien; Sami, Neelofar; Carlo, Waldemar A.; Bose, Carl L.; Pasha, Omrana; Goco, Norman; Chomba, Elwyn; Goldenberg, Robert L.; Wright, Linda L.; Koso-Thomas, Marion; Krebs, Nancy F.

    2014-01-01

    We conducted a theory-driven process evaluation of a cluster randomized controlled trial comparing two types of complementary feeding (meat versus fortified cereal) on infant growth in Guatemala, Pakistan, Zambia and the Democratic Republic of Congo. We examined process evaluation indicators for the entire study cohort (N = 1236) using chi-square…

  2. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  3. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  4. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  5. 40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...

  6. Development of the e-Baby serious game with regard to the evaluation of oxygenation in preterm babies: contributions of the emotional design.

    PubMed

    Fonseca, Luciana Mara Monti; Dias, Danielle Monteiro Vilela; Góes, Fernanda Dos Santos Nogueira; Seixas, Carlos Alberto; Scochi, Carmen Gracinda Silvan; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2014-09-01

    The present study aimed to describe the development process of a serious game that enables users to evaluate the respiratory process in a preterm infant based on an emotional design model. The e-Baby serious game was built to feature the simulated environment of an incubator, in which the user performs a clinical evaluation of the respiratory process in a virtual preterm infant. The user learns about the preterm baby's history, chooses the tools for the clinical evaluation, evaluates the baby, and determines whether his/her evaluation is appropriate. The e-Baby game presents phases that contain respiratory process impairments of higher or lower complexity in the virtual preterm baby. Included links give the user the option of recording the entire evaluation procedure and sharing his/her performance on a social network. e-Baby integrates a Clinical Evaluation of the Preterm Baby course in the Moodle virtual environment. This game, which evaluates the respiratory process in preterm infants, could support a more flexible, attractive, and interactive teaching and learning process that includes simulations with features very similar to neonatal unit realities, thus allowing more appropriate training for clinical oxygenation evaluations in at-risk preterm infants. e-Baby allows advanced user-technology-educational interactions because it requires active participation in the process and is emotionally integrated.

  7. [Pre-verbality in focusing and the need for self check. An attempt at "focusing check"].

    PubMed

    Masui, T; Ikemi, A; Murayama, S

    1983-06-01

    Though the Focusing process is not entirely non-verbal, in Focusing careful attention is paid by the Focuser and the Listener to the pre-verbal experiential process. In other words, Focusing involves attending to the felt sense that is not easily expressed in words immediately. Hence, during the process of learning to Focus, the Focusing teacher attempts to communicate to the student experiences of Focusing that are not easily conveyed in words. Because of this difficulty, the Focusing student may (and quite frequently does) confuse the experiential process in Focusing with other processes. Often, the felt sense can be confused with other phenomena such as "autogenic discharge". Also, the Focuser may not stay with the felt sense and may drift into "free association", or certain processes in "meditation" can be confused with Focusing. Therefore, there is a need for a "check" by which the Focusing student can confirm the Focusing experience for himself. For the Focusing student, such a "check" serves not only to confirm the Focusing process, but also as an aid to learning Focusing. We report here a "Focusing Check" that we developed by translating Eugene Gendlin's "Focusing Check" and making several modifications so that it would be more understandable to Japanese users. Along with the "Focusing Check" we developed, the authors discuss the need for such a check.

  8. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
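
    The MUST analyses themselves are not shown in the abstract; one classic static check of the kind mentioned, detecting potential deadlock as a cycle in a wait-for graph between concurrent processes, can be sketched as:

      # Minimal sketch of static deadlock detection: a cycle in the wait-for
      # graph (process -> processes it waits on) signals a possible circular
      # wait. The graph below is an illustrative stand-in, not MUST's analysis.

      def has_cycle(graph):
          visiting, done = set(), set()

          def dfs(node):
              if node in visiting:
                  return True            # back edge -> cycle -> possible deadlock
              if node in done:
                  return False
              visiting.add(node)
              if any(dfs(nxt) for nxt in graph.get(node, [])):
                  return True
              visiting.discard(node)
              done.add(node)
              return False

          return any(dfs(n) for n in list(graph))

      waits_for = {"P1": ["P2"], "P2": ["P3"], "P3": ["P1"]}   # P1 -> P2 -> P3 -> P1
      print(has_cycle(waits_for))                              # True: circular wait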

  9. Approaches for in silico finishing of microbial genome sequences

    PubMed Central

    Kremer, Frederico Schmitt; McBride, Alan John Alexander; Pinto, Luciano da Silva

    2017-01-01

    The introduction of next-generation sequencing (NGS) had a significant effect on the availability of genomic information, leading to an increase in the number of sequenced genomes from a large spectrum of organisms. Unfortunately, due to the limitations implied by the short-read sequencing platforms, most of these newly sequenced genomes remained as “drafts”, incomplete representations of the whole genetic content. The previous genome sequencing studies indicated that finishing a genome sequenced by NGS, even bacteria, may require additional sequencing to fill the gaps, making the entire process very expensive. As such, several in silico approaches have been developed to optimize the genome assemblies and facilitate the finishing process. The present review aims to explore some free (open source, in many cases) tools that are available to facilitate genome finishing. PMID:28898352

  10. Coordination and establishment of centralized facilities and services of the University of Alaska ERTS survey of the Alaskan environment

    NASA Technical Reports Server (NTRS)

    Belon, A. E. (Principal Investigator); Miller, J. M.

    1973-01-01

    The author has identified the following significant results. The objective of this project is to provide a focus for the entire University of Alaska ERTS-1 effort (12 projects covering 10 disciplines and involving 8 research institutes and science departments). Activities have been concentrated on the implementation of the project's three primary functions: (1) coordination and management of the U of A ERTS-1 program, including management of the flow of data and data products; (2) acquisition, installation, test, operation, and maintenance of centralized facilities for processing ERTS-1, aircraft, and ground truth data; and (3) development of photographic and digital techniques for processing and interpreting ERTS-1 and aircraft data. With minor exceptions these three functions are now well-established and working smoothly.

  11. Approaches for in silico finishing of microbial genome sequences.

    PubMed

    Kremer, Frederico Schmitt; McBride, Alan John Alexander; Pinto, Luciano da Silva

    The introduction of next-generation sequencing (NGS) had a significant effect on the availability of genomic information, leading to an increase in the number of sequenced genomes from a large spectrum of organisms. Unfortunately, due to the limitations implied by the short-read sequencing platforms, most of these newly sequenced genomes remained as "drafts", incomplete representations of the whole genetic content. The previous genome sequencing studies indicated that finishing a genome sequenced by NGS, even bacteria, may require additional sequencing to fill the gaps, making the entire process very expensive. As such, several in silico approaches have been developed to optimize the genome assemblies and facilitate the finishing process. The present review aims to explore some free (open source, in many cases) tools that are available to facilitate genome finishing.

  12. RTD-based Material Tracking in a Fully-Continuous Dry Granulation Tableting Line.

    PubMed

    Martinetz, M C; Karttunen, A-P; Sacher, S; Wahl, P; Ketolainen, J; Khinast, J G; Korhonen, O

    2018-06-06

    Continuous manufacturing (CM) offers quality and cost-effectiveness benefits over the currently dominant batch processing. One challenge that needs to be addressed when implementing CM is traceability of materials through the process, which is needed for the batch/lot definition and control strategy. In this work the residence time distributions (RTD) of single unit operations (blender, roller compactor and tablet press) of a continuous dry granulation tableting line were captured with NIR-based methods at selected mass flow rates to create training data. RTD models for continuously operated unit operations and for the entire line were developed based on transfer functions. For the semi-continuously operated bucket conveyor and pneumatic transport, an assumption based on the operation frequency was used. For validation of the parametrized process model, a pre-defined API step change and its propagation through the manufacturing line was computed and compared to multi-scale experimental runs conducted with the fully assembled, continuously operated manufacturing line. This novel approach showed very good predictive power at the selected mass flow rates for a complete continuous dry granulation line. Furthermore, it demonstrates the capability of process simulation as a tool to support development and control of pharmaceutical manufacturing processes. Copyright © 2018. Published by Elsevier B.V.
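
    As a toy illustration of the transfer-function approach, each continuously operated unit can be modeled by a tanks-in-series residence time distribution and the API step change propagated by convolution; the (tau, n) parameters below are invented, not the paper's fitted values:

        import numpy as np
        from math import factorial

        # Illustrative tanks-in-series RTD; tau = mean residence time, n = tank count.
        def tanks_in_series(t, tau, n):
            return (t ** (n - 1)) / (factorial(n - 1) * (tau / n) ** n) * np.exp(-n * t / tau)

        dt = 0.1                                     # s, time step
        t = np.arange(0.0, 600.0, dt)
        feed = np.where(t >= 60.0, 1.0, 0.0)         # API step change at t = 60 s

        # Hypothetical (tau, n) for blender, roller compactor, tablet press in series.
        response = feed
        for tau, n in [(30.0, 3), (45.0, 5), (20.0, 2)]:
            E = tanks_in_series(t, tau, n)
            response = np.convolve(response, E)[: t.size] * dt   # output = input * E

        # 'response' now approximates the step propagation through the entire line.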

  13. Sound envelope processing in the developing human brain: A MEG study.

    PubMed

    Tang, Huizhen; Brock, Jon; Johnson, Blake W

    2016-02-01

    This study investigated auditory cortical processing of linguistically-relevant temporal modulations in the developing brains of young children. Auditory envelope following responses to white noise amplitude modulated at rates of 1-80 Hz in healthy children (aged 3-5 years) and adults were recorded using a paediatric magnetoencephalography (MEG) system and a conventional MEG system, respectively. For children, there were envelope following responses to slow modulations but no significant responses to rates higher than about 25 Hz, whereas adults showed significant envelope following responses to almost the entire range of stimulus rates. Our results show that the auditory cortex of preschool-aged children has a sharply limited capacity to process rapid amplitude modulations in sounds, as compared to the auditory cortex of adults. These neurophysiological results are consistent with previous psychophysical evidence for a protracted maturational time course for auditory temporal processing. The findings are also in good agreement with current linguistic theories that posit a perceptual bias for low frequency temporal information in speech during language acquisition. These insights also have clinical relevance for our understanding of language disorders that are associated with difficulties in processing temporal information in speech. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
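
    For orientation, a hedged sketch of the measurement idea (not the authors' pipeline): an envelope following response shows up as spectral power at the stimulus modulation rate, so its strength can be read off the response spectrum. All signals below are simulated:

        import numpy as np

        fs = 1000.0                          # Hz, assumed sample rate
        mod_rate = 4.0                       # Hz, envelope modulation rate
        t = np.arange(0.0, 10.0, 1.0 / fs)

        carrier = np.random.randn(t.size)                          # white noise
        stimulus = (1.0 + np.sin(2 * np.pi * mod_rate * t)) * carrier
        # Crude stand-in for a neural response that follows the envelope:
        response = np.abs(stimulus) + 0.5 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(response - response.mean())) ** 2
        freqs = np.fft.rfftfreq(response.size, 1.0 / fs)
        k = np.argmin(np.abs(freqs - mod_rate))
        floor = np.median(spectrum[max(k - 10, 1): k + 10])        # local noise floor
        print("envelope-following SNR at %g Hz: %.1f" % (mod_rate, spectrum[k] / floor))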

  14. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.

  15. Floral development at multiple spatial scales in Polygonum jucundum (Polygonaceae), a distylous species with broadly open flowers.

    PubMed

    Huang, Lan-Jie; Fu, Wen-Long; Wang, Xiao-Fan

    2014-01-01

    Distyly, a special polymorphism, has evolved in many groups of angiosperms and has attracted attention since Darwin's time. Developmental studies on distylous taxa have helped us to understand the evolutionary process of this polymorphism, but most of these studies focus on species with narrowly tubular corollas. Here, we studied the floral development of Polygonum jucundum, a distylous species with broadly open flowers, at multiple spatial scales. Results showed that the difference in stigma height between flowers of the two morphs was caused by differences in style growth throughout the entire floral development process. The observed difference in anther heights between the two morphs arose because the filaments grew faster in short-styled (SS) than in long-styled (LS) flowers in the later stages of floral development. In addition, styles were longer in LS flowers than in SS flowers because of faster cell division in the early stages of floral development, whereas SS flowers had longer filaments than LS flowers primarily because of greater cell elongation. These results indicate that floral development in P. jucundum differs from that of the distylous taxa with floral tubes examined in previous studies. Further, we conclude that the presence of distyly in species with open flowers is a result of convergent evolution.

  16. Reconceptualizing children's complex discharge with health systems theory: novel integrative review with embedded expert consultation and theory development.

    PubMed

    Noyes, Jane; Brenner, Maria; Fox, Patricia; Guerin, Ashleigh

    2014-05-01

    To report a novel review to develop a health systems model of successful transition of children with complex healthcare needs from hospital to home. Children with complex healthcare needs commonly experience an expensive, ineffectual and prolonged nurse-led discharge process. Children gain no benefit from prolonged hospitalization and are exposed to significant harm. Research to enable intervention development and process evaluation across the entire health system is lacking. Novel mixed-method integrative review informed by health systems theory. Data sources: CINAHL, PsychInfo, EMBASE, PubMed, citation searching, personal contact. Review methods: informed by consultation with experts. English-language studies, opinion/discussion papers reporting research, best practice and experiences of children, parents and healthcare professionals, and purposively selected policies/guidelines from 2002 to December 2012 were abstracted using Framework synthesis, followed by iterative theory development. Seven critical factors derived from thirty-four sources across five health system levels explained successful discharge (new programme theory). All seven factors are required in an integrated care pathway, with a dynamic communication loop to facilitate effective discharge (new programme logic). Current health system responses were frequently static and critical success factors were commonly absent, thereby explaining ineffectual discharge. The novel evidence-based model, which reconceptualizes 'discharge' as a highly complex longitudinal health system intervention, makes a significant contribution to global knowledge to drive practice development. Research is required to develop process and outcome measures at different time points in the discharge process and future trials are needed to determine the effectiveness of integrated health system discharge models. © 2013 John Wiley & Sons Ltd.

  17. Longitudinal Processing Speed Impairments in Males with Autism and the Effects of White Matter Microstructure

    PubMed Central

    Travers, Brittany G.; Bigler, Erin D.; Tromp, Do P. M.; Adluru, Nagesh; Froehlich, Alyson L.; Ennis, Chad; Lange, Nicholas; Nielsen, Jared A.; Prigge, Molly B. D.; Alexander, Andrew L.; Lainhart, Janet E.

    2014-01-01

    The present study used an accelerated longitudinal design to examine group differences and age-related changes in processing speed in 81 individuals with Autism Spectrum Disorder (ASD) compared to 56 age-matched individuals with typical development (ages 6–39 years). Processing speed was assessed using the Wechsler Intelligence Scale for Children-3rd edition (WISC-III) and the Wechsler Adult Intelligence Scale-3rd edition (WAIS-III). Follow-up analyses examined processing speed subtest performance and relations between processing speed and white matter microstructure (as measured with diffusion tensor imaging [DTI] in a subset of these participants). After controlling for full scale IQ, the present results show that processing speed index standard scores were on average 12 points lower in the group with ASD compared to the group with typical development. There were, however, no significant group differences in standard score age-related changes within this age range. For subtest raw scores, the group with ASD demonstrated robustly slower processing speeds in the adult versions of the IQ test (i.e., WAIS-III) but not in the child versions (WISC-III), even though age-related changes were similar in both the ASD and typically developing groups. This pattern of results may reflect difficulties that become increasingly evident in ASD on more complex measures of processing speed. Finally, DTI measures of whole-brain white matter microstructure suggested that fractional anisotropy (but not mean diffusivity, radial diffusivity, or axial diffusivity) made significant but small-sized contributions to processing speed standard scores across our entire sample. Taken together, the present findings suggest that robust decreases in processing speed may be present in ASD, more pronounced in adulthood, and partially attributable to white matter microstructural integrity. PMID:24269298

  18. The Chandra Source Catalog: Algorithms

    NASA Astrophysics Data System (ADS)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.
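
    As one concrete illustration, the encircled-energy correction amounts to integrating a model PSF inside the extraction aperture. A minimal sketch, with a synthetic Gaussian standing in for the per-source ray-trace model:

        import numpy as np

        def encircled_energy(psf, x0, y0, radius):
            """Fraction of total PSF energy inside a circular aperture."""
            y, x = np.indices(psf.shape)
            inside = (x - x0) ** 2 + (y - y0) ** 2 <= radius ** 2
            return psf[inside].sum() / psf.sum()

        # Synthetic Gaussian PSF in place of a ray-traced point spread function.
        y, x = np.indices((101, 101))
        psf = np.exp(-((x - 50) ** 2 + (y - 50) ** 2) / (2.0 * 4.0 ** 2))

        ee = encircled_energy(psf, 50, 50, radius=6)
        aperture_counts = 480.0                      # hypothetical measured counts
        print("EE = %.3f -> corrected counts = %.1f" % (ee, aperture_counts / ee))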

  19. Strategy for development of the Polish electricity sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dybowski, J.

    1995-12-01

    This paper presents the strategy for development of the Polish electricity sector, dealing with specific problems which are common to all of East Central Europe. In 1990 Poland adopted a restructuring program for the entire energy sector. Very ambitious plans were changed several times, but the main direction of change was preserved. The most difficult period of transformation is marked by several contradictions which have to be balanced. Electricity prices should increase in order to cover the modernization and development program, but society is not able to bear this burden in such a short time. Furthermore, the new environmental protection standards force the growth of the capital investment program, which sooner or later has to be passed through to electricity prices. New economic mechanisms have to be introduced to the electricity sector to replace the old, ineffective, centrally planned ones. This process has to follow slow management changes. Also, introduction of a new electricity market is limited by those constraints. However, this process of change would not be possible without parallel governmental initiatives such as the preparation of a new energy law and regulatory framework.

  20. Space Shuttle propulsion parameter estimation using optimal estimation techniques, volume 1

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The mathematical developments and their computer program implementation for the Space Shuttle propulsion parameter estimation project are summarized. The estimation approach chosen is extended Kalman filtering with a modified Bryson-Frazier smoother. Its use here is motivated by the objective of obtaining better estimates than those available from filtering alone and of eliminating the lag associated with filtering. The estimation technique uses as the dynamical process the six-degree-of-freedom equations of motion, resulting in twelve state-vector elements. In addition to these are mass and solid propellant burn depth as the "system" state elements. The "parameter" state elements can include deviations of aerodynamic coefficients, inertia, center of gravity, atmospheric winds, etc., from reference values. Propulsion parameter state elements have been included not as the options just discussed, but as the main parameter states to be estimated. The mathematical developments were completed for all these parameters. Since the system dynamics and measurement processes are non-linear functions of the states, the mathematical developments are taken up almost entirely by the linearization of these equations as required by the estimation algorithms.
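
    For orientation, a single predict/update cycle of a generic extended Kalman filter looks like the following two-state toy sketch (not the twelve-plus-state Shuttle formulation; the modified Bryson-Frazier smoother would afterwards run backward over the stored filter quantities):

        import numpy as np

        def ekf_step(x, P, z, f, F, h, H, Q, R):
            """One EKF cycle: nonlinear dynamics f, measurement h, Jacobians F, H."""
            # Predict through the linearized dynamics.
            x_pred = f(x)
            F_k = F(x)
            P_pred = F_k @ P @ F_k.T + Q
            # Update with the linearized measurement model.
            H_k = H(x_pred)
            S = H_k @ P_pred @ H_k.T + R
            K = P_pred @ H_k.T @ np.linalg.inv(S)
            x_new = x_pred + K @ (z - h(x_pred))
            P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
            return x_new, P_new

        # Toy system: position/velocity with a drag-like nonlinearity.
        dt = 0.1
        f = lambda x: np.array([x[0] + dt * x[1], x[1] - dt * 0.1 * x[1] ** 2])
        F = lambda x: np.array([[1.0, dt], [0.0, 1.0 - 2 * dt * 0.1 * x[1]]])
        h = lambda x: np.array([x[0]])                 # position is measured
        H = lambda x: np.array([[1.0, 0.0]])
        Q, R = np.eye(2) * 1e-4, np.eye(1) * 1e-2

        x, P = np.array([0.0, 1.0]), np.eye(2)
        x, P = ekf_step(x, P, z=np.array([0.12]), f=f, F=F, h=h, H=H, Q=Q, R=R)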

  1. Numerical simulation of advective-dispersive multisolute transport with sorption, ion exchange and equilibrium chemistry

    USGS Publications Warehouse

    Lewis, F.M.; Voss, C.I.; Rubin, Jacob

    1986-01-01

    A model was developed that can simulate the effect of certain chemical and sorption reactions occurring simultaneously among solutes involved in advective-dispersive transport through porous media. The model is based on a methodology that utilizes physical-chemical relationships in the development of the basic solute mass-balance equations; however, the form of these equations allows their solution to be obtained by methods that do not depend on the chemical processes. The chemical environment is governed by the condition of local chemical equilibrium, and may be defined either by the linear sorption of a single species and two soluble complexation reactions which also involve that species, or by binary ion exchange and one complexation reaction involving a common ion. Partial differential equations that describe solute mass balance entirely in the liquid phase are developed for each tenad (a chemical entity whose total mass is independent of the reaction process) in terms of its total dissolved concentration. These equations are solved numerically in two dimensions through the modification of an existing groundwater flow/transport computer code. (Author's abstract)
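
    In the standard notation for such models (a hedged reconstruction for orientation only; the abstract itself gives no equations), the mass balance for the total dissolved concentration $C_k$ of tenad $k$ takes the advective-dispersive form

        \frac{\partial (\theta C_k)}{\partial t} + \frac{\partial (\rho_b \bar{C}_k)}{\partial t}
          = \nabla \cdot \left( \theta \mathbf{D} \nabla C_k \right) - \nabla \cdot \left( \mathbf{q}\, C_k \right),

    where $\theta$ is the volumetric water content, $\rho_b$ the bulk density, $\bar{C}_k$ the sorbed concentration fixed by the local-equilibrium chemistry, $\mathbf{D}$ the dispersion tensor, and $\mathbf{q}$ the Darcy flux.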

  2. Developing a workstation-based, real-time simulation for rapid handling qualities evaluations during design

    NASA Technical Reports Server (NTRS)

    Anderson, Frederick; Biezad, Daniel J.

    1994-01-01

    This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the Aircraft SYNThesis (ACSYNT) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are then linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students who view it as a significant advance over prior methods.

  3. Phase separation like dynamics during Myxococcus xanthus fruiting body formation

    NASA Astrophysics Data System (ADS)

    Liu, Guannan; Thutupalli, Shashi; Wigbers, Manon; Shaevitz, Joshua

    2015-03-01

    Collective motion exists in many living organisms as an advantageous strategy that helps the entire group with predation, foraging, and survival. However, the principles of self-organization underlying such collective motions remain unclear. During various developmental stages of the soil-dwelling bacterium Myxococcus xanthus, different types of collective motions are observed. In particular, when starved, M. xanthus cells eventually aggregate together to form 3-dimensional structures (fruiting bodies), inside which cells sporulate in response to the stress. We study the fruiting body formation process as an out-of-equilibrium phase separation process. As local cell density increases, the aggregation dynamics of M. xanthus cells switch from a spatio-temporally random process, resembling nucleation and growth, to an emergent pattern formation process similar to spinodal decomposition. By employing high-resolution microscopy and a video analysis system, we are able to track the motion of single cells within motile collective groups, while separately tuning local cell density, cell velocity and reversal frequency, probing the multi-dimensional phase space of M. xanthus development.

  4. Development of a data management front-end for use with a LANDSAT-based information system

    NASA Technical Reports Server (NTRS)

    Turner, B. J.

    1982-01-01

    The development and implementation of a data management front-end system for use with a LANDSAT-based information system that facilitates the processing of both LANDSAT and ancillary data was examined. The final tasks, reported on here, involved: (1) the implementation of the VICAR image processing software system at Penn State and the development of a user-friendly front-end for this system; (2) the implementation of JPL-developed software, based on VICAR, for mosaicking LANDSAT scenes; (3) the creation and storage of a mosaic of 1981 summer LANDSAT data for the entire state of Pennsylvania; (4) demonstrations of the defoliation assessment procedure for Perry and Centre Counties, and presentation of the results at the 1982 National Gypsy Moth Review Meeting; and (5) the training of Pennsylvania Bureau of Forestry personnel in the use of the defoliation analysis system.

  5. Facilitating the Information Exchange Using a Modular Electronic Discharge Summary.

    PubMed

    Denecke, Kerstin; Dittli, Pascal A; Kanagarasa, Niveadha; Nüssli, Stephan

    2018-01-01

    Discharge summaries are a standard communication tool delivering important clinical information from inpatient to ambulatory care. Ensuring high quality, correctness and completeness makes the generation process time-consuming, and it also requires contributions from multiple persons. This is problematic, since the primary care provider needs the information from the discharge summary to continue the intended treatment. To address this challenge, we developed a concept for exchanging a modular electronic discharge summary. Through a literature review and interviews with multiple stakeholders, we analysed existing processes and derived requirements for an improved communication of the discharge summary. In this paper, we suggest a concept of a modular electronic discharge summary that is exchanged through the electronic patient dossier in CDA CH level 2 documents. By 2020, all Swiss hospitals will be obliged to connect to the electronic patient dossier. Our concept allows the primary care side to access already completed modules of the discharge summary before the entire report is finalised. The data is automatically merged with the local patient record on the physician side and prepared for integration into the practice information system. Our concept offers the opportunity not only to improve the information exchange between hospital and primary care, but also to provide a potential use case and demonstrate a benefit of the electronic patient dossier for primary care providers, who are so far not obliged to connect to the patient dossier in Switzerland.

  6. Mitochondrial redox and pH signaling occurs in axonal and synaptic organelle clusters.

    PubMed

    Breckwoldt, Michael O; Armoundas, Antonis A; Aon, Miguel A; Bendszus, Martin; O'Rourke, Brian; Schwarzländer, Markus; Dick, Tobias P; Kurz, Felix T

    2016-03-22

    Redox switches are important mediators in neoplastic, cardiovascular and neurological disorders. We recently identified spontaneous redox signals in neurons at the single mitochondrion level where transients of glutathione oxidation go along with shortening and re-elongation of the organelle. We now have developed advanced image and signal-processing methods to re-assess and extend previously obtained data. Here we analyze redox and pH signals of entire mitochondrial populations. In total, we quantified the effects of 628 redox and pH events in 1797 mitochondria from intercostal axons and neuromuscular synapses using optical sensors (mito-Grx1-roGFP2; mito-SypHer). We show that neuronal mitochondria can undergo multiple redox cycles exhibiting markedly different signal characteristics compared to single redox events. Redox and pH events occur more often in mitochondrial clusters (median cluster size: 34.1 ± 4.8 μm²). Local clusters possess higher mitochondrial densities than the rest of the axon, suggesting morphological and functional inter-mitochondrial coupling. We find that cluster formation is redox sensitive and can be blocked by the antioxidant MitoQ. In a nerve crush paradigm, mitochondrial clusters form sequentially adjacent to the lesion site and oxidation spreads between mitochondria. Our methodology combines optical bioenergetics and advanced signal processing and allows quantitative assessment of entire mitochondrial populations.

  7. JunoCam Images of Jupiter: A Juno Citizen Science Experiment

    NASA Astrophysics Data System (ADS)

    Hansen, Candice; Ravine, Michael; Bolton, Scott; Caplinger, Mike; Eichstadt, Gerald; Jensen, Elsa; Momary, Thomas W.; Orton, Glenn S.; Rogers, John

    2017-10-01

    The Juno mission to Jupiter carries a visible imager on its payload primarily for outreach. The vision of JunoCam’s outreach plan was for the public to participate in, not just observe, a science investigation. Four webpage components were developed for uploading and downloading comments and images, following the steps a traditional imaging team would take: Planning, Discussion, Voting, and Processing, hosted at https://missionjuno.swri.edu/junocam. Lightly processed and raw JunoCam data are posted. JunoCam images through broadband red, green and blue filters and a narrowband methane filter centered at 889 nm mounted directly on the detector. JunoCam is a push-frame imager with a 58 deg wide field of view covering a 1600-pixel width, and builds the second dimension of the image as the spacecraft rotates. This design enables capture of the entire pole of Jupiter in a single image at low emission angle when Juno is ~1 hour from perijove (closest approach). At perijove the wide field of view images are high-resolution while still capturing entire storms, e.g. the Great Red Spot. The public is invited to download JunoCam images, process them, and then upload their products. Over 2000 images have been uploaded to the JunoCam public image gallery. Contributions range from scientific quality to artful whimsy. Artistic works are inspired by Van Gogh and Monet. Works of whimsy include how Jupiter might look through the viewport of the Millennium Falcon, or to an angel perched on a lookout, or through a kaleidoscope. Citizen scientists have also engaged in serious quantitative analysis of the images, mapping images to storms and disruptions of the belts and zones that have been tracked from Earth. They are developing a phase function for Jupiter that allows the images to be flattened from the subsolar point to the terminator, and studying high hazes. Citizen scientists are also developing time-lapse movies, measuring wind flow, tracking circulation patterns in the circumpolar cyclones, and looking for lightning flashes. This effort has engaged the public, with a range of personal interests and considerable artistic and analytic talents. In return, we count our diverse public as partners in this endeavor.

  8. Understanding the ignition mechanism of high-pressure spray flames

    DOE PAGES

    Dahms, Rainer N.; Paczko, Günter A.; Skeen, Scott A.; ...

    2016-10-25

    A conceptual model for turbulent ignition in high-pressure spray flames is presented. The model is motivated by first-principles simulations and optical diagnostics applied to the Sandia n-dodecane experiment. The Lagrangian flamelet equations are combined with full LLNL kinetics (2755 species; 11,173 reactions) to resolve all time and length scales and chemical pathways of the ignition process at engine-relevant pressures and turbulence intensities unattainable using classic DNS. The first-principles value of the flamelet equations is established by a novel chemical explosive mode-diffusion time scale analysis of the fully-coupled chemical and turbulent time scales. Contrary to conventional wisdom, this analysis reveals that the high Damköhler number limit, a key requirement for the validity of the flamelet derivation from the reactive Navier–Stokes equations, applies during the entire ignition process. Corroborating Rayleigh-scattering and formaldehyde PLIF measurements, with simultaneous schlieren imaging of mixing and combustion, are presented. Our combined analysis establishes a characteristic temporal evolution of the ignition process. First, a localized first-stage ignition event consistently occurs in the highest-temperature mixture regions. Owing to the intense scalar dissipation, this initiates a turbulent cool flame wave propagating from the ignition spot through the entire flow field. This wave significantly decreases the ignition delay of lower-temperature mixture regions in comparison to their homogeneous reference. This explains the experimentally observed formaldehyde formation across the entire spray head prior to high-temperature ignition, which consistently occurs first in a broad range of rich mixture regions. There, the combination of first-stage ignition delay, shortened by the cool flame wave, and the subsequent delay until second-stage ignition becomes minimal. A turbulent flame subsequently propagates rapidly through the entire mixture over time scales consistent with experimental observations. As a result, we demonstrate that the neglect of turbulence-chemistry interactions fundamentally fails to capture the key features of this ignition process.

  9. Soils as relative-age dating tools

    USGS Publications Warehouse

    Markewich, Helaine Walsh; Pavich, Milan J.; Wysocki, Douglas A.

    2017-01-01

    Soils develop at the earth's surface via multiple processes that act through time. Barring burial or disturbance, soil genetic horizons form progressively and reflect the balance among formation processes, surface age, and original substrate composition. Soil morphology provides a key link between process and time (soil age), enabling soils to serve as both relative and numerical dating tools for geomorphic studies and landscape evolution. Five major factors define the contemporary state of all soils: climate, organisms, topography, parent material, and time. Soils developed on similar landforms and parent materials within a given landscape comprise what we term a soil/landform/substrate complex. Soils on such complexes that differ in development as a function of time represent a soil chronosequence. In a soil chronosequence, time constitutes the only independent formation factor; the other factors act through time. Time dictates the variations in soil development or properties (field- or laboratory-measured) on a soil/landform/substrate complex. Using a dataset within the chronosequence model, we can also formulate various soil development indices based upon one or a combination of soil properties, either for individual soil horizons or for an entire profile. When we evaluate soil data or soil indices mathematically, the resulting equation creates a chronofunction. Chronofunctions help quantify processes and mechanisms involved in soil development, and relate them mathematically to time. These rigorous kinds of comparisons among and within soil/landform complexes constitute an important tool for relative-age dating. After determining one or more absolute ages for a soil/landform complex, we can calculate quantitative soil-formation and/or landform-development rates. Multiple dates for several complexes allow rate calculations for soil/landform-chronosequence development and soil-chronofunction calibration.
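
    As a toy example of chronofunction calibration, a logarithmic form can be fitted to numerically dated complexes and then inverted for relative-age dating; all ages and index values below are invented:

        import numpy as np

        # Hypothetical dated soil/landform complexes: age (years) vs. a
        # profile development index (PDI) measured on each profile.
        ages = np.array([1e3, 5e3, 2e4, 1e5, 5e5])
        pdi  = np.array([8.0, 14.5, 21.0, 30.5, 39.0])

        # Fit the chronofunction PDI = a + b * log10(t).
        b, a = np.polyfit(np.log10(ages), pdi, 1)
        print("chronofunction: PDI = %.2f + %.2f * log10(t)" % (a, b))

        # Invert the chronofunction to estimate the age of an undated soil.
        pdi_undated = 25.0
        t_est = 10 ** ((pdi_undated - a) / b)
        print("estimated age: %.0f years" % t_est)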

  10. Annotating images by mining image search results.

    PubMed

    Wang, Xin-Jing; Zhang, Lei; Li, Xirong; Ma, Wei-Ying

    2008-11-01

    Although it has been studied for years by the computer vision and machine learning communities, image annotation is still far from practical. In this paper, we propose a novel attempt at model-free image annotation, which is a data-driven approach that annotates images by mining their search results. Some 2.4 million images with their surrounding text are collected from a few photo forums to support this approach. The entire process is formulated in a divide-and-conquer framework where a query keyword is provided along with the uncaptioned image to improve both effectiveness and efficiency. This is helpful when the collected data set is not dense everywhere. In this sense, our approach contains three steps: 1) the search process to discover visually and semantically similar search results; 2) the mining process to identify salient terms from textual descriptions of the search results; and 3) the annotation rejection process to filter out noisy terms yielded by step 2. To ensure real-time annotation, two key techniques are leveraged: one is to map the high-dimensional image visual features into hash codes; the other is to implement the approach as a distributed system, of which the search and mining processes are provided as Web services. As a typical result, the entire process finishes in less than 1 second. Since no training data set is required, our approach enables annotation with an unlimited vocabulary and is highly scalable and robust to outliers. Experimental results on both real Web images and a benchmark image data set show the effectiveness and efficiency of the proposed algorithm. It is also worth noting that, although the entire approach is illustrated within the divide-and-conquer framework, a query keyword is not crucial to our current implementation. We provide experimental results to prove this.
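
    A much-simplified sketch of the three-step flow (the search step is mocked, and the salience score is a plain frequency ratio rather than the paper's formulation):

        from collections import Counter

        def annotate(query_keyword, search_results, background_freq, top_k=5):
            """search_results: surrounding-text strings of visually similar images;
            background_freq: corpus-wide term frequencies (assumed given)."""
            # Mining step: score terms that are frequent here but rare overall.
            tf = Counter(w for doc in search_results for w in doc.lower().split())
            scored = {w: tf[w] / background_freq.get(w, 1)
                      for w in tf if w != query_keyword.lower()}
            candidates = sorted(scored, key=scored.get, reverse=True)
            # Rejection step (crude): drop terms seen in only one result.
            kept = [w for w in candidates
                    if sum(w in d.lower() for d in search_results) > 1]
            return kept[:top_k]

        # Mocked search results for an uncaptioned beach photo.
        docs = ["sunset over the beach", "beach sunset in hawaii", "red sunset clouds"]
        print(annotate("sea", docs, background_freq={"the": 500, "in": 400, "over": 300}))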

  11. The evolution and future of minimalism in neurological surgery.

    PubMed

    Liu, Charles Y; Wang, Michael Y; Apuzzo, Michael L J

    2004-11-01

    The evolution of the field of neurological surgery has been marked by a progressive minimalism. This has been evident in the development of an entire arsenal of modern neurosurgical enterprises, including microneurosurgery, neuroendoscopy, stereotactic neurosurgery, endovascular techniques, radiosurgical systems, intraoperative and navigational devices, and in the last decade, cellular and molecular adjuvants. In addition to reviewing the major developments and paradigm shifts in the cyclic reinvention of the field as it currently stands, this paper attempts to identify forces and developments that are likely to fuel the irresistible escalation of minimalism into the future. These forces include discoveries in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.

  12. [Preliminary studies on critical control point of traceability system in wolfberry].

    PubMed

    Liu, Sai; Xu, Chang-Qing; Li, Jian-Ling; Lin, Chen; Xu, Rong; Qiao, Hai-Li; Guo, Kun; Chen, Jun

    2016-07-01

    As a traditional Chinese medicine, wolfberry (Lycium barbarum) has a long cultivation history and a good industrial development foundation. With the development of wolfberry production, the expansion of cultivation area, and the increased attention of governments and consumers to food safety, the quality and safety requirements for wolfberry have become more demanding. A quality tracing and traceability system covering the entire production process is an important technological tool for protecting wolfberry safety and maintaining the sustained, healthy development of the wolfberry industry. Thus, this article analyzes wolfberry quality management as currently practiced, discusses the sources of safety hazards according to HACCP (hazard analysis and critical control point) principles and GAP (good agricultural practice for Chinese crude drugs), and provides a reference for a wolfberry traceability system. Copyright© by the Chinese Pharmaceutical Association.

  13. Toward the Modularization of Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Raskin, R. G.

    2009-12-01

    Decision support systems are typically developed entirely from scratch without the use of modular components. This “stovepiped” approach is inefficient and costly because it prevents a developer from leveraging the data, models, tools, and services of other developers. Even when a decision support component is made available, it is difficult to know what problem it solves, how it relates to other components, or even that the component exists. The Spatial Decision Support (SDS) Consortium was formed in 2008 to organize the body of knowledge in SDS within a common portal. The portal identifies the canonical steps in the decision process and enables decision support components to be registered, categorized, and searched. This presentation describes how a decision support system can be assembled from modular models, data, tools and services, based on the needs of the Earth science application.

  14. A five-phase process model describing the return to sustainable work of persons who survived cancer: A qualitative study.

    PubMed

    Brusletto, Birgit; Torp, Steffen; Ihlebæk, Camilla Martha; Vinje, Hege Forbech

    2018-06-01

    We investigated persons who survived cancer (PSC) and their experiences in returning to sustainable work. Videotaped, qualitative, in-depth interviews with previous cancer patients were analyzed directly using "Interpretative Phenomenological Analysis" (IPA). Four men and four women aged 42-59 years participated. Mean time since last treatment was nine years. All participants had worked for more than 3 years when interviewed. An advisory team of seven members with diverse cancer experiences contributed as co-researchers. The entire trajectory from cancer diagnosis until achievement of sustainable work was analogous to a journey, and a process model comprising five phases was developed, encompassing personal situations, treatments, and work issues. The theme "return-to-work" (RTW) turned out to be difficult to separate from the entire journey that started at the time of diagnosis. PSCs were mainly concerned about fighting for life in phases 1 and 2. In phases 3 and 4, some participants had to adjust and make changes at work more than once over a period of 1-10 years before reaching sustainable work in phase 5. Overall, the ability to adapt to new circumstances, take advantage of emerging opportunities, and find meaningful occupational activities was crucial. Our process model may be useful as a tool when discussing the future working life of PSCs. Every individual's journey towards sustainable work was unique, and contained distinct and long-lasting efforts and difficulties. The first RTW attempt after cancer may not be the one that lasts. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Dashboard visualizations: Supporting real-time throughput decision-making.

    PubMed

    Franklin, Amy; Gantela, Swaroop; Shifarraw, Salsawit; Johnson, Todd R; Robinson, David J; King, Brent R; Mehta, Amit M; Maddow, Charles L; Hoot, Nathan R; Nguyen, Vickie; Rubio, Adriana; Zhang, Jiajie; Okafor, Nnaemeka G

    2017-07-01

    Providing timely and effective care in the emergency department (ED) requires the management of individual patients as well as the flow and demands of the entire department. Strategic changes to work processes, such as adding a flow coordination nurse or a physician in triage, have demonstrated improvements in throughput times. However, such global strategic changes do not address the real-time, often opportunistic workflow decisions of individual clinicians in the ED. We believe that real-time representation of the status of the entire emergency department and each patient within it through information visualizations will better support in-the-moment clinical decision-making and provide for rapid intervention to improve ED flow. This notion is based on previous work where we found that clinicians' workflow decisions were often based on an in-the-moment local perspective, rather than a global perspective. Here, we discuss the challenges of designing and implementing visualizations for the ED through a discussion of the development of our prototype Throughput Dashboard and the potential it holds for supporting real-time decision-making. Copyright © 2017. Published by Elsevier Inc.

  16. Structural health monitoring of a reinforced concrete building during the severe typhoon Vicente in 2012.

    PubMed

    Kuok, Sin-Chi; Yuen, Ka-Veng

    2013-01-01

    The goal of this study is to investigate the structural performance of a reinforced concrete building under the influence of a severe typhoon. For this purpose, full-scale monitoring of a 22-story reinforced concrete building was conducted during the entire passage of the severe typhoon "Vicente." Vicente was the eighth tropical storm to develop in the Western North Pacific Ocean and the South China Sea in 2012. Moreover, it was the strongest and most devastating typhoon to strike Macao since 1999. The typhoon-affected period lasted more than 70 hours, and the typhoon eye region covered Macao for around one hour. The wind and structural response measurements were acquired throughout the entire typhoon-affected period. The wind characteristics were analyzed using the measured wind data, including the wind speed and wind direction time histories. In addition, the structural response measurements of the monitored building were utilized for modal identification using the Bayesian spectral density approach. Detailed analysis of the field data and the typhoon-generated effects on the structural performance are discussed.

  17. Improved cyberinfrastructure for integrated hydrometeorological predictions within the fully-coupled WRF-Hydro modeling system

    NASA Astrophysics Data System (ADS)

    gochis, David; hooper, Rick; parodi, Antonio; Jha, Shantenu; Yu, Wei; Zaslavsky, Ilya; Ganapati, Dinesh

    2014-05-01

    The community WRF-Hydro system is currently being used in a variety of flood prediction and regional hydroclimate impacts assessment applications around the world. Despite its increasingly wide use, certain cyberinfrastructure bottlenecks exist in the setup, execution and post-processing of WRF-Hydro model runs. These bottlenecks result in wasted time, labor, data transfer bandwidth and computational resources. Appropriate development and use of cyberinfrastructure to set up and manage WRF-Hydro modeling applications will streamline the entire workflow of hydrologic model predictions. This talk will present recent advances in the development and use of new open-source cyberinfrastructure tools for the WRF-Hydro architecture. These tools include new web-accessible pre-processing applications, supercomputer job management applications and automated verification and visualization applications. The tools will be described in turn and then demonstrated in a set of flash flood use cases for recent destructive flood events in the U.S. and in Europe. Throughout, an emphasis is placed on the implementation and use of community data standards for data exchange.

  18. The use of benthic indicators in Europe: from the Water Framework Directive to the Marine Strategy Framework Directive.

    PubMed

    Van Hoey, Gert; Borja, Angel; Birchenough, Silvana; Buhl-Mortensen, Lene; Degraer, Steven; Fleischer, Dirk; Kerckhof, Francis; Magni, Paolo; Muxika, Iñigo; Reiss, Henning; Schröder, Alexander; Zettler, Michael L

    2010-12-01

    The Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD) are the European umbrella regulations for water systems. It is a challenge for the scientific community to translate the principles of these directives into realistic and accurate approaches. The aim of this paper, conducted by the Benthos Ecology Working Group of ICES, is to describe how the principles have been translated, what the challenges were, and the best way forward. We have tackled the following principles: the ecosystem-based approach, the development of benthic indicators, the definition of 'pristine' or sustainable conditions, the detection of pressures, and the development of monitoring programs. We concluded that testing and integrating the different approaches was facilitated during the WFD process, which led to further insights and improvements upon which the MSFD can rely. Expert involvement in the entire implementation process proved to be of vital importance. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Strategic Protein Target Analysis for Developing Drugs to Stop Dental Caries

    PubMed Central

    Horst, J.A.; Pieper, U.; Sali, A.; Zhan, L.; Chopra, G.; Samudrala, R.; Featherstone, J.D.B.

    2012-01-01

    Dental caries is the most common disease to cause irreversible damage in humans. Several therapeutic agents are available to treat or prevent dental caries, but none besides fluoride has significantly influenced the disease burden globally. Etiologic mechanisms of the mutans group streptococci and specific Lactobacillus species have been characterized to various degrees of detail, from identification of physiologic processes to specific proteins. Here, we analyze the entire Streptococcus mutans proteome for potential drug targets by investigating their uniqueness with respect to non-cariogenic dental plaque bacteria, quality of protein structure models, and the likelihood of finding a drug for the active site. Our results suggest specific targets for rational drug discovery, including 15 known virulence factors, 16 proteins for which crystallographic structures are available, and 84 previously uncharacterized proteins, with various levels of similarity to homologs in dental plaque bacteria. This analysis provides a map to streamline the process of clinical development of effective multispecies pharmacologic interventions for dental caries. PMID:22899687

  20. Educational interventions targeted at minors in situations of grave social vulnerability and their families

    NASA Astrophysics Data System (ADS)

    de La Caba Collado, Mariangeles; Bartau Rojas, Isabel

    2010-10-01

    The aim of this article is to outline and assess an educational intervention programme targeted at improving the skills of families and the personal and social development of children living in situations of grave social vulnerability. The sample comprised 10 families during the first phase of the intervention and six during the second. The design, intervention and assessment process of this study was carried out in two phases over a period of a year and a half. For both phases, three different groups—of men/fathers, women/mothers and children—were established. Study variables (parenting skills and children's personal and social development) were evaluated before and after the intervention in every group, as well as during the entire process. The results, taking into account the improvements reported by all the participants (social workers, group monitors, fathers, mothers, children), show that inter-professional involvement and coordination at all phases of the intervention is vital in order to achieve small but significant improvements.

  1. Some applications of mathematics in golf.

    PubMed

    Otto, S R

    2017-08-01

    At its core, like many other sports, golf is a game of integers. The minimization of the number of strokes played is generally what determines the winner, whether each of these are associated with the shortest of putts or the longest of drives. The outcomes of these shots are influenced by very slight changes, but hopefully in a deterministic sense. Understanding the mechanics of golf necessitates the development of models and this is coupled more often than not to the use of statistics. In essence, the individual aspects of the sport can be modelled adequately via fairly simplistic models, but the presence of a human at one end of the kinematic chain has a significant impact on the variability of the entire process. In this paper, we will review some of the ways that mathematics has been used to develop the understanding of the physical processes involved in the sport, including some of the analysis which is exploited within the Equipment Rules. We will also discuss some of the future challenges.

  2. High resolution ultrasonic spectroscopy system for nondestructive evaluation

    NASA Technical Reports Server (NTRS)

    Chen, C. H.

    1991-01-01

    With increased demand for high resolution ultrasonic evaluation, computer based systems or workstations become essential. The ultrasonic spectroscopy method of nondestructive evaluation (NDE) was used to develop a high resolution ultrasonic inspection system supported by modern signal processing, pattern recognition, and neural network technologies. The completed basic system consists of a 386/20 MHz PC (IBM AT compatible), a pulser/receiver, a digital oscilloscope with serial and parallel communications to the computer, an immersion tank with motor control of X-Y axis movement, and the supporting software package, IUNDE, for interactive ultrasonic evaluation. Although the hardware components are commercially available, the software development is entirely original. By integrating signal processing, pattern recognition, maximum entropy spectral analysis, and artificial neural network functions into the system, many NDE tasks can be performed. The high resolution graphics capability provides visualization of complex NDE problems. The phase 3 efforts involve intensive marketing of the software package and collaborative work with industrial sectors.

  3. The interactive digital video interface

    NASA Technical Reports Server (NTRS)

    Doyle, Michael D.

    1989-01-01

    A frequent complaint in the computer-oriented trade journals is that current hardware technology is progressing so quickly that software developers cannot keep up. An example of this phenomenon can be seen in the field of microcomputer graphics. To exploit the advantages of new mechanisms of information storage and retrieval, new approaches must be taken towards incorporating existing programs as well as developing entirely new applications. A particular area of need is the correlation of discrete image elements to textual information. The interactive digital video (IDV) interface embodies a new concept in software design which addresses these needs. The IDV interface is a patented, device- and language-independent process for identifying image features on a digital video display which allows a number of different processes to be keyed to that identification. Its capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. Sophisticated interrelationships can be set up between images, text, and program control mechanisms.

  4. Rapid adhesive bonding concepts

    NASA Technical Reports Server (NTRS)

    Stein, B. A.; Tyeryar, J. R.; Hodges, W. T.

    1984-01-01

    Adhesive bonding in the aerospace industry typically utilizes autoclaves or presses which have considerable thermal mass. As a consequence, the rates of heatup and cooldown of the bonded parts are limited and the total time and cost of the bonding process is often relatively high. Many of the adhesives themselves do not inherently require long processing times. Bonding could be performed rapidly if the heat were concentrated in the bond lines or at least in the adherends. Rapid adhesive bonding concepts were developed to utilize induction heating techniques to provide heat directly to the bond line and/or adherends without heating the entire structure, supports, and fixtures of a bonding assembly. Bonding times for specimens are cut by a factor of 10 to 100 compared to standard press bonding. The development of rapid adhesive bonding for lap shear specimens (per ASTM D1002 and D3163), for aerospace panel bonding, and for the field repair needs of metallic and advanced fiber-reinforced polymeric matrix composite structures is reviewed.

  5. [The significance of meat quality in marketing].

    PubMed

    Kallweit, E

    1994-07-01

    Food quality in general, and meat quality in particular, is evaluated not only by means of objective quality traits; the entire production process is gaining more attention from the modern consumer. In response to this development, quality programs were developed to define the majority of the processes in all production and marketing steps, which are in turn linked by contracts. Not all of these items are quality-relevant; some are concessions to ethical principles (animal welfare etc.). This is demonstrated by the example of Scharrel pork production. Price differentiation in the pork market is still influenced predominantly by quantitative carcass traits. On the European market, quality programs are still of minor significance. Premiums paid for high quality standards are more or less offset by higher production costs and the lower lean-meat percentages that must be expected in stress-susceptible strains. The high effort required to establish quality programs, however, helps to improve the general quality level and secures market shares for local producers.

  6. A landsat data tiling and compositing approach optimized for change detection in the conterminous United States

    USGS Publications Warehouse

    Nelson, Kurtis; Steinwand, Daniel R.

    2015-01-01

    Annual disturbance maps are produced by the LANDFIRE program across the conterminous United States (CONUS). Existing LANDFIRE disturbance data from 1999 to 2010 are available and current efforts will produce disturbance data through 2012. A tiling and compositing approach was developed to produce bi-annual images optimized for change detection. A tiled grid of 10,000 × 10,000 30 m pixels was defined for CONUS and adjusted to consolidate smaller tiles along national borders, resulting in 98 non-overlapping tiles. Data from Landsat-5, -7, and -8 were re-projected to the tile extents, masked to remove clouds, shadows, water, and snow/ice, then composited using a cosine similarity approach. The resultant images were used in a change detection algorithm to determine areas of vegetation change. This approach enabled more efficient processing compared to using single Landsat scenes by taking advantage of overlap between adjacent paths, and allowed an automated system to be developed for the entire process.
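
    A hedged sketch of the compositing idea: for each pixel, keep the observation whose band vector is most cosine-similar to the per-pixel mean of all clear observations (the details here are assumptions, not the LANDFIRE production code):

        import numpy as np

        def composite(stack):
            """stack: (n_obs, n_bands, rows, cols) array of clear observations."""
            mean = stack.mean(axis=0)                               # (bands, r, c)
            dot = (stack * mean[None]).sum(axis=1)                  # (n_obs, r, c)
            norm = (np.sqrt((stack ** 2).sum(axis=1))
                    * np.sqrt((mean ** 2).sum(axis=0))[None])
            cos = dot / np.where(norm == 0, 1, norm)                # cosine similarity
            best = cos.argmax(axis=0)                               # winning obs per pixel
            rows, cols = np.indices(best.shape)
            return stack[best, :, rows, cols].transpose(2, 0, 1)    # (bands, r, c)

        # Toy stack: 4 observations, 6 bands, 100 x 100 pixel tile chip.
        comp = composite(np.random.rand(4, 6, 100, 100))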

  7. Recent Progress on the Second Generation CMORPH: A Prototype Operational Processing System

    NASA Astrophysics Data System (ADS)

    Xie, Pingping; Joyce, Robert; Wu, Shaorong

    2016-04-01

    As reported at the EGU General Assembly of 2015, a conceptual test system was developed for the second generation CMORPH to produce global analyses of 30-min precipitation on a 0.05deg lat/lon grid over the entire globe from pole to pole through integration of information from satellite observations as well as numerical model simulations. The second generation CMORPH is built upon the Kalman filter-based CMORPH algorithm of Joyce and Xie (2011). Inputs to the system include both rainfall and snowfall rate retrievals from passive microwave (PMW) measurements aboard all available low earth orbit (LEO) satellites, precipitation estimates derived from infrared (IR) observations of geostationary (GEO) as well as LEO platforms, and precipitation simulations from numerical global models. Sub-systems were developed and refined to derive precipitation estimates from the GEO and LEO IR observations and to compute precipitating cloud motion vectors. The results were reported at the 2014 EGU General Assembly and the 2015 AGU Fall Meeting. In this presentation, we report our recent work on the construction of a prototype operational processing system for the second generation CMORPH. The second generation CMORPH prototype operational processing system takes in the PMW retrievals of instantaneous precipitation rates from all available sensors, the full-resolution GEO and LEO IR data, as well as the hourly precipitation fields generated by the NOAA/NCEP Climate Forecast System Reanalysis (CFSR). First, a combined field of PMW-based precipitation retrievals (MWCOMB) is created on a 0.05deg lat/lon grid over the entire globe through inter-calibrating retrievals from various sensors against a common reference. For this experiment, the reference field is the GMI-based retrievals, with climatological adjustment against the TMI retrievals using data over the overlapping period. Precipitation estimates are then derived from the GEO and LEO IR data through calibration against the global MWCOMB and the CloudSat CPR-based estimates. At the same time, precipitating cloud motion vectors are derived by combining, with a 2DVAR technique, vectors computed from the GEO IR-based precipitation estimates and the CFSR precipitation. The prototype system is applied to generate integrated precipitation estimates over the entire globe for a three-month period from June 1 to August 31 of 2015. Preliminary tests are conducted to optimize the performance of the system, with specific efforts made to improve its computational efficiency. The second generation CMORPH test products are compared to the first generation CMORPH and ground observations. Detailed results will be reported at the EGU.
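
    For orientation, a toy version of the morph-and-blend idea: advect the last PMW analysis along motion vectors, then weight it against a newer IR estimate by error variance, Kalman-filter style. The whole-grid shifts, grid size, and variances below are simplifications, not the CMORPH implementation:

        import numpy as np

        def morph_and_blend(pmw, ir, u, v, var_pmw, var_ir):
            """Shift the PMW field by (u, v) grid cells, then variance-weight it
            against the IR estimate (smaller variance -> larger weight)."""
            advected = np.roll(np.roll(pmw, u, axis=1), v, axis=0)
            w = var_ir / (var_pmw + var_ir)      # weight on the advected PMW field
            return w * advected + (1 - w) * ir

        # Toy 0.5-deg global grid, mm/h; real CMORPH works at 0.05 deg.
        pmw = np.random.gamma(2.0, 1.0, (360, 720))
        ir = np.random.gamma(2.0, 1.0, (360, 720))
        analysis = morph_and_blend(pmw, ir, u=2, v=1, var_pmw=0.4, var_ir=1.0)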

  8. New tree-measurement concepts: height accumulation, giant tree, taper and shape

    Treesearch

    L. R. Grosenbaugh

    1954-01-01

    An entirely new concept of tree measurement was announced by the author in 1948 (11). Since the original theory and applications have subsequently been broadened considerably, it seems advisable to publish the entire development in readily usable form, along with other material helpful in tree measurement.

  9. Mary Bidwell Breed: The Educator as Dean.

    ERIC Educational Resources Information Center

    Fley, Jo Ann; Jaramillo, George R.

    1979-01-01

    Mary Bidwell Breed predicted that midwestern universities would probably "pass through a stage of educational development in which the liberal arts are entirely feminized, the men are entirely commercialized." We can appreciate how close she came to pinpointing trends which did not begin to be reversed until sixty years later.…

  10. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as the standard deviation of calibrated radiance spectra from multiple scans. To obtain an estimate of the FPA performance, we developed an efficient method of generating pixel performance assessments. In addition, a random pixel selection scheme was developed based on the pixel performance evaluation. This allows us to perform the calibration procedures on a random pixel population that is a good statistical representation of the entire FPA. The design and implementation of each individual component will be discussed in detail.
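    The phase-correction step admits a compact illustration. The sketch below follows the classic Mertz approach under stated assumptions (ZPD near the interferogram peak, a double-sided low-resolution segment available); it is not the GIFTS calibration code, and all names are illustrative.

        import numpy as np

        def phase_correct(ifg, n_low=256):
            """Estimate a low-resolution phase and remove it from the spectrum."""
            n = ifg.size
            center = int(np.argmax(np.abs(ifg)))         # assume ZPD at the peak
            seg = ifg[center - n_low // 2:center + n_low // 2] * np.hanning(n_low)
            phase_low = np.unwrap(np.angle(np.fft.rfft(np.fft.ifftshift(seg))))
            grid_low = np.linspace(0.0, 1.0, phase_low.size)
            grid_full = np.linspace(0.0, 1.0, n // 2 + 1)
            phase = np.interp(grid_full, grid_low, phase_low)
            spec = np.fft.rfft(np.roll(ifg, -center))    # rotate ZPD to sample 0
            return np.real(spec * np.exp(-1j * phase))   # phase-corrected spectrum

    The NESR estimate described above then reduces to the per-wavenumber standard deviation of such calibrated spectra across repeated scans.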

  11. LANDSAT D data transmission and dissemination study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    An assessment of the quantity of data processed by the system is presented, and various methods for transmission within the system are investigated. Various methods of data storage are also considered. It is concluded that the entire processing system should be located in White Sands, New Mexico.

  12. Study on Dynamic Development of Three-dimensional Weld Pool Surface in Stationary GTAW

    NASA Astrophysics Data System (ADS)

    Huang, Jiankang; He, Jing; He, Xiaoying; Shi, Yu; Fan, Ding

    2018-04-01

    The weld pool contains abundant information about the welding process. In particular, the type of the weld pool surface shape, i.e., convex or concave, is determined by the weld penetration. To detect it, an innovative laser-vision-based sensing method is employed to observe the weld pool surface in gas tungsten arc welding (GTAW). A low-power laser dot pattern is projected onto the entire weld pool surface; its reflection is intercepted by a screen and captured by a camera. The dynamic development of the weld pool surface can then be detected. Through observation and analysis, the change of the reflected laser-dot pattern was found to correlate closely with the shape of the weld pool surface, and hence with the penetration of the weld pool during the welding process. A mathematical model was proposed to relate the incident ray, the reflected ray, the screen, and the weld pool surface based on structured-laser specular reflection. The dynamic variation of the weld pool surface and its corresponding laser-dot pattern were simulated and analyzed. Combining the experimental data and the mathematical analysis, the results show that the reflected laser-dot pattern is closely correlated with the development of the weld pool, in particular the weld penetration. The concavity of the pool surface was found to increase rapidly after the surface shape changed from convex to concave during the stationary GTAW process.
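    The geometric core of such a model is the law of specular reflection, r = d - 2(d·n)n, which ties each incident ray direction d, the local surface normal n, and the dot position on the screen together. A minimal sketch (names and the planar-screen assumption are ours, not the paper's notation):

        import numpy as np

        def reflect(d, n):
            """Specular reflection of ray direction d about unit surface normal n."""
            d = d / np.linalg.norm(d)
            n = n / np.linalg.norm(n)
            return d - 2.0 * np.dot(d, n) * n

        def dot_on_screen(p, d, screen_point, screen_normal):
            """Intersect a reflected ray (origin p, direction d) with the screen plane."""
            t = np.dot(screen_point - p, screen_normal) / np.dot(d, screen_normal)
            return p + t * d

    Sweeping the surface normals from convex to concave spreads or contracts the projected dot pattern, which is the signature the authors correlate with penetration.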

  13. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage: processes tend to be interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large numbers of extensive simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the processing and reduction of observational data.
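    The basic farming loop such a monitor automates can be sketched in a few lines. This is a minimal illustration in the spirit of the tool, not its actual implementation; command strings and the restart policy are assumptions.

        import subprocess
        import time

        def farm(commands, max_restarts=3, poll_s=10):
            """Keep a set of simulations running; restart any that exit abnormally."""
            procs = {cmd: subprocess.Popen(cmd, shell=True) for cmd in commands}
            restarts = {cmd: 0 for cmd in commands}
            while procs:
                for cmd, p in list(procs.items()):
                    code = p.poll()
                    if code is None:
                        continue                     # still running
                    del procs[cmd]
                    if code != 0 and restarts[cmd] < max_restarts:
                        restarts[cmd] += 1           # crashed: restart the crop
                        procs[cmd] = subprocess.Popen(cmd, shell=True)
                time.sleep(poll_s)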

  14. New methodology for dynamic lot dispatching

    NASA Astrophysics Data System (ADS)

    Tai, Wei-Herng; Wang, Jiann-Kwang; Lin, Kuo-Cheng; Hsu, Yi-Chin

    1994-09-01

    This paper presents a new dynamic dispatching rule to improve delivery. The dynamic dispatching rule, named `SLACK and OTD (on-time delivery)', is developed to focus on due date and target cycle time in an IC manufacturing environment. The idea uses the traditional SLACK policy to control the long-term due date and a new OTD policy to reflect the short-term stage queue time. Through fuzzy theory, these two policies are combined into a dispatching controller that defines lot priority across the entire production line. The system also automatically updates lot priorities according to the current line situation. Previously, wafer dispatching was controlled by a critical ratio, which led to low customer satisfaction; moreover, the overall slack time in the front end of the process was greater than in the rear end, revealing that the machines in the rear end were overloaded by rush orders. When SLACK and OTD are used, due date control gradually improves: a wafer with either a long stage queue time or an urgent due date is pushed through the overall production line instead of being jammed in the front end. A demand-pull system is also developed to satisfy not only due dates but also the quantity of monthly demand. The SLACK and OTD rule has been implemented at Taiwan Semiconductor Manufacturing Company for eight months with beneficial results. In order to clearly monitor the SLACK and OTD policy, a method called the box chart is used to simulate the entire production system. From the box chart, we can not only monitor the result of the decision policy but also display the production situation on a density figure. The production cycle time and delivery situation can also be investigated.
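    The combination of the two policies into a single priority can be illustrated with a simple weighted blend. The membership functions and weights below are assumptions for illustration; the paper's fuzzy controller is not specified here.

        def lot_priority(slack_hours, stage_queue_hours,
                         slack_scale=48.0, queue_scale=4.0, w=0.5):
            """Blend due-date urgency (SLACK) and stage queue-time urgency (OTD).

            Returns a dispatch priority in [0, 1]; higher dispatches first.
            Scales and the 50/50 weighting are illustrative only.
            """
            u_slack = max(0.0, min(1.0, 1.0 - slack_hours / slack_scale))
            u_queue = max(0.0, min(1.0, stage_queue_hours / queue_scale))
            return w * u_slack + (1.0 - w) * u_queue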

  15. Developing Subdomain Allocation Algorithms Based on Spatial and Communicational Constraints to Accelerate Dust Storm Simulation

    PubMed Central

    Gui, Zhipeng; Yu, Manzhu; Yang, Chaowei; Jiang, Yunfeng; Chen, Songqing; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Hassan, Mohammed Anowarul; Jin, Baoxuan

    2016-01-01

    Dust storms have serious, disastrous impacts on the environment, human health, and assets. The development and application of dust storm models have contributed significantly to better understanding and predicting the distribution, intensity and structure of dust storms. However, dust storm simulation is a data- and computing-intensive process. To improve computing performance, high performance computing has been widely adopted by dividing the entire study area into multiple subdomains and allocating each subdomain to different computing nodes in a parallel fashion. Inappropriate allocation may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, allocation is a key factor that may impact the efficiency of the parallel process. An allocation algorithm is expected to consider the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire simulation. This research introduces three algorithms to optimize the allocation by considering spatial and communicational constraints: 1) an Integer Linear Programming (ILP) based algorithm from a combinatorial optimization perspective; 2) a K-Means and Kernighan-Lin combined heuristic algorithm (K&K) integrating geometric and coordinate-free methods by merging local and global partitioning; 3) an automatic seeded region growing based geometric and local partitioning algorithm (ASRG). The performance and effectiveness of the three algorithms are compared based on different factors. Further, we adopt the K&K algorithm as the demonstration algorithm for the experiment of dust model simulation with the non-hydrostatic mesoscale model (NMM-dust) and compare the performance with the MPI default sequential allocation. The results demonstrate that the K&K method significantly improves the simulation performance with better subdomain allocation. This method can also be adopted for other relevant atmospheric and numerical modeling. PMID:27044039
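    The geometric first stage of the K&K algorithm, grouping grid cells into spatially compact subdomains, can be sketched with off-the-shelf K-Means; the Kernighan-Lin refinement that rebalances communication cost is omitted here, and all names are illustrative.

        import numpy as np
        from scipy.cluster.vq import kmeans2

        def allocate_subdomains(cell_xy, n_nodes, seed=0):
            """Assign each grid cell (x, y) to one of n_nodes computing nodes."""
            _, labels = kmeans2(cell_xy.astype(float), n_nodes,
                                minit='++', seed=seed)
            return labels    # labels[i] = node assigned to cell i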

  16. Classification of video sequences into chosen generalized use classes of target size and lighting level.

    PubMed

    Leszczuk, Mikołaj; Dudek, Łukasz; Witkowski, Marcin

    The VQiPS (Video Quality in Public Safety) Working Group, supported by the U.S. Department of Homeland Security, has been developing a user guide for public safety video applications. According to VQiPS, five parameters have particular importance in influencing the ability to achieve a recognition task: usage time-frame, discrimination level, target size, lighting level, and level of motion. These parameters form what are referred to as Generalized Use Classes (GUCs). The aim of our research was to develop algorithms that would automatically assist the classification of input sequences into one of the GUCs. The target size and lighting level parameters were addressed. The experiment described reveals the experts' ambiguity and hesitation during the manual target size determination process. However, the automatic methods developed for target size classification make it possible to determine GUC parameters with 70% compliance to the end-users' opinion. Lighting levels of the entire sequence can be classified with an efficiency reaching 93%. To make the algorithms available for use, a test application has been developed. It is able to process video files and display classification results, with a very simple user interface requiring only minimal user interaction.
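    A lighting-level classifier of the kind described can be as simple as thresholding mean luma. The threshold below is an assumed illustration, not the VQiPS criterion.

        import numpy as np

        def lighting_class(frames, dark_thresh=60.0):
            """Classify a sequence's lighting level from its mean luma.

            frames: iterable of (H, W) 8-bit grayscale frames; the 60/255
            cutoff is a placeholder, not the working group's value.
            """
            mean_luma = float(np.mean([f.mean() for f in frames]))
            return 'low light' if mean_luma < dark_thresh else 'bright'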

  17. Design of structure and simulation of the three-zone gasifier of dense layer of the inverted process

    NASA Astrophysics Data System (ADS)

    Zagrutdinov, R. Sh; Negutorov, V. N.; Maliykhin, D. G.; Nikishanin, M. S.; Senachin, P. K.

    2017-11-01

    Experts of LLC “New Energy Technologies” have developed gasifier designs implementing the three-zone gasification method that satisfy the following conditions: 1) the generated gas must be free from tar, soot and hydrocarbons, with a given CO/H2 ratio; 2) the fuel source can be drawn from a wide range of low-grade, low-value solid fuels, including biomass and various kinds of carbonaceous wastes; 3) the gasifiers must be highly reliable in operation, must not require qualified operating personnel, must be relatively inexpensive to produce, and must use steam-air blowing instead of expensive steam-oxygen blowing; 4) the line of standard sizes should be sufficiently wide (with a single-unit fuel capacity from 1 to 50-70 MW). Two models of gas generators of the inverted gasification process with three combustion zones operating under pressure have been adopted for design: 1) a gas generator with a remote combustion chamber, type GOP-VKS (two-block version), and 2) a gas generator with a common combustion chamber, type GOP-OK (single-block version), which is an almost ideal model for increasing unit capacity. Various schemes have been worked out for preparing briquettes from practically the entire spectrum of low-grade fuel: high-ash and high-moisture coals, peat and biomass, including all types of waste - solid household waste and crop, livestock, and poultry residues. The gas generators gasify cylindrical briquettes with a diameter of 20-25 mm and a length of 25-35 mm. A mathematical model and computer code have been developed for numerical simulation of synthesis gas generation in a dense-layer, inverted-process gasifier under steam-air blast, including: continuity equations for the 8 gas-phase components and for the solid phase; a heat balance equation for the entire heterogeneous system; the Darcy law equation (for porous media); an equation of state for the 8 gas-phase components; equations for the rates of 3 gas-phase and 4 heterogeneous reactions; a macrokinetic law of coke combustion; and other equations and boundary conditions.
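    As a hedged sketch of the kind of balance such a model solves (our generic notation, not the authors' exact formulation), the continuity equation for gas-phase component i in a porous bed with Darcy filtration reads

        \frac{\partial (\varepsilon \rho_i)}{\partial t}
          + \nabla \cdot (\rho_i \mathbf{u}) = \sum_{j} \nu_{ij} \dot{R}_j ,
        \qquad i = 1, \dots, 8,
        \qquad \mathbf{u} = -\frac{k}{\mu} \nabla p ,

    where \varepsilon is the bed porosity, \rho_i the partial density of component i, \mathbf{u} the filtration velocity from Darcy's law, and \nu_{ij} \dot{R}_j the production or consumption of component i by reaction j; the solid-phase equations and the system-wide heat balance close the set.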

  18. Development of Individually Addressable Micro-Mirror-Arrays for Space Applications

    NASA Technical Reports Server (NTRS)

    Dutta, Sanghamitra B.; Ewin, Audrey J.; Jhabvala, Murzy; Kotecki, Carl A.; Kuhn, Jonathan L.; Mott, D. Brent

    2000-01-01

    We have been developing a 32 x 32 prototype array of individually addressable Micro-Mirrors capable of operating at cryogenic temperature for Earth and Space Science applications. Micro-Mirror-Array technology has the potential to revolutionize imaging and spectroscopy systems for NASA's missions of the 21st century. They can be used as programmable slits for the Next Generation Space Telescope, as smart sensors for a steerable spectrometer, as neutral density filters for bright scene attenuation, etc. The entire fabrication process is carried out in the Detector Development Laboratory at NASA GSFC. The fabrication process is low-temperature compatible and involves integration of conventional CMOS technology and the surface micro-machining used in MEMS. Aluminum is used as the mirror material and is built on a silicon substrate containing the CMOS address circuit. The mirrors are 100 microns x 100 microns in area and deflect by +/- 10 deg induced by electrostatic actuation between two parallel plate capacitors. A pair of thin aluminum torsion straps allows the mirrors to tilt. Finite-element analysis and closed-form solutions using electrostatic and mechanical torque for mirror operation were developed, and the results were compared with laboratory performance. The results agree well both at room temperature and at cryogenic temperature. The development demonstrates the first cryogenic operation of two-dimensional Micro-Mirrors with bi-state operation. Larger arrays will be developed to meet requirements for different science applications. Theoretical analysis, the fabrication process, laboratory test results and different science applications will be described in detail.
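    The torque balance behind the electrostatic tilt can be sketched numerically: a parallel-plate electrostatic torque against a linear torsion-strap stiffness. All parameter values and the lever-arm approximation are illustrative, not the device's actual closed-form or FEA model.

        import numpy as np
        from scipy.optimize import brentq

        EPS0 = 8.854e-12  # vacuum permittivity, F/m

        def tilt_angle(V, k_theta, area, gap, r_eff, samples=2000):
            """Equilibrium tilt (rad) of a torsion-hinged mirror, or None at pull-in."""
            def net_torque(theta):
                d = gap - r_eff * theta                          # local electrode gap
                elec = 0.5 * EPS0 * area * V**2 / d**2 * r_eff   # electrostatic torque
                return elec - k_theta * theta                    # minus spring torque
            thetas = np.linspace(1e-9, 0.9 * gap / r_eff, samples)
            vals = [net_torque(t) for t in thetas]
            for a, b, fa, fb in zip(thetas, thetas[1:], vals, vals[1:]):
                if fa > 0 >= fb:             # first crossing: stable equilibrium
                    return brentq(net_torque, a, b)
            return None                      # no balance point: mirror snaps down

    Past the pull-in voltage no stable crossing exists and the mirror snaps against its stop, which is what makes robust bi-state operation possible.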

  19. Tunable high-refractive index hybrid for solution-processed light management devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bachevillier, Stefan

    2016-10-01

    Following the use of highly efficient but expensive inorganic optical materials, solution-processable polymers and hybrids have drawn increasing interest. Our group has recently developed a novel polymer-based hybrid optical material from titanium oxide hydrate exhibiting an outstanding set of optical and material properties. Firstly, its low cost, processability and cross-linked state are particularly attractive for many applications. Moreover, a high refractive index can be repeatedly achieved while optical losses stay considerably low over the entire visible and near-infrared wavelength regime. Indeed, the formation of inorganic nanoparticles, usually present in nanocomposites, is avoided by a specific formulation process. Even more remarkably, the refractive index can be tuned by changing the inorganic content, using different titanium precursors, or via a low-temperature curing process. Part of our work is focused on the reliable optical characterization of these properties; in particular, a microscope-based setup allowing in-situ measurement and sample mapping has been developed. Our efforts are also concentrated on various applications of these exceptional properties. This hybrid material is tailored for photonic devices, with a specific emphasis on the production of highly efficient, solution-processable Distributed Bragg Reflectors (DBRs) and anti-reflection coatings. Furthermore, waveguides can be fabricated from thin films along with in-coupling and out-coupling structures. These light-management structures are particularly suited to organic photovoltaic cells (OPVs) and organic light emitting diodes (OLEDs).

  20. Signal processing and analyzing works of art

    NASA Astrophysics Data System (ADS)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
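    The thread-density measurement reduces to finding the dominant spatial frequency in a small window of the x-ray. A minimal sketch of that "short-space" spectral step (the window handling, units and cutoff are our assumptions, not the authors' algorithm):

        import numpy as np

        def thread_density(patch, dpi):
            """Estimate thread density (threads/cm) in one x-ray patch."""
            profile = patch.mean(axis=0) - patch.mean()   # average out brushwork
            spec = np.abs(np.fft.rfft(profile * np.hanning(profile.size)))
            freqs = np.fft.rfftfreq(profile.size, d=2.54 / dpi)  # cycles per cm
            spec[freqs < 2.0] = 0.0          # suppress slow illumination trends
            return freqs[np.argmax(spec)]    # spectral peak = thread density

    Repeating this over a grid of patches yields the density maps whose manufacturing-induced variations form the weave "fingerprint" used for matching.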

  1. Open Source Seismic Software in NOAA's Next Generation Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    Hellman, S. B.; Baker, B. I.; Hagerty, M. T.; Leifer, J. M.; Lisowski, S.; Thies, D. A.; Donnelly, B. K.; Griffith, F. P.

    2014-12-01

    The Tsunami Information Technology Modernization (TIM) project, spearheaded by the National Oceanic and Atmospheric Administration, updates the United States' Tsunami Warning System software currently employed at the Pacific Tsunami Warning Center (Ewa Beach, Hawaii) and the National Tsunami Warning Center (Palmer, Alaska). This entirely open source software project will integrate various seismic processing utilities with the National Weather Service Weather Forecast Office's core software, AWIPS2. For the real-time and near-real-time seismic processing aspect of this project, NOAA has elected to integrate the open source portions of GFZ's SeisComP 3 (SC3) processing system into AWIPS2. To provide better tsunami threat assessments, we are developing open source tools for magnitude estimation (e.g., moment magnitude, energy magnitude, surface wave magnitude), detection of slow earthquakes with the Theta discriminant, moment tensor inversions (e.g., W-phase and teleseismic body waves), finite fault inversions, and array processing. With our reliance on common data formats such as QuakeML and seismic community standard messaging systems, all new facilities introduced into AWIPS2 and SC3 will be available as stand-alone tools or could easily be integrated into other real-time seismic monitoring systems such as Earthworm, Antelope, etc. Additionally, we have developed a template-based design paradigm so that a developer or scientist can efficiently create upgrades, replacements, and/or new metrics for the seismic data processing with only a cursory knowledge of the underlying SC3.
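    Because the system standardizes on QuakeML, event data can be consumed by any compliant reader. A brief sketch using the open-source ObsPy library (a common choice, not necessarily the TIM project's own tooling; the file path is a placeholder):

        # Read a QuakeML file and print basic origin/magnitude information.
        from obspy import read_events

        catalog = read_events('event.xml')   # placeholder QuakeML path
        for event in catalog:
            origin = event.preferred_origin() or event.origins[0]
            mag = event.preferred_magnitude() or event.magnitudes[0]
            print(origin.time, origin.latitude, origin.longitude,
                  mag.mag, mag.magnitude_type)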

  2. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  3. Planetary image conversion task

    NASA Technical Reports Server (NTRS)

    Martin, M. D.; Stanley, C. L.; Laughlin, G.

    1985-01-01

    The Planetary Image Conversion Task group processed 12,500 magnetic tapes containing raw imaging data from JPL planetary missions and produced an image data base in consistent format on 1200 fully packed 6250-bpi tapes. The output tapes will remain at JPL. A copy of the entire tape set was delivered to the US Geological Survey, Flagstaff, Ariz. A secondary task converted computer datalogs, which had been stored in project-specific MARK IV File Management System data types and structures, to flat-file, text format that is processable on any modern computer system. The conversion processing took place at JPL's Image Processing Laboratory on an IBM 370-158 with existing software modified slightly to meet the needs of the conversion task. More than 99% of the original digital image data was successfully recovered by the conversion task. However, processing data tapes recorded before 1975 was destructive. This discovery is of critical importance to facilities responsible for maintaining digital archives, since normal periodic random sampling techniques would be unlikely to detect this phenomenon, and entire data sets could be wiped out in the act of generating seemingly positive sampling results. Recommended follow-on activities are also included.

  4. Nonisothermal glass molding for the cost-efficient production of precision freeform optics

    NASA Astrophysics Data System (ADS)

    Vu, Anh-Tuan; Kreilkamp, Holger; Dambon, Olaf; Klocke, Fritz

    2016-07-01

    Glass molding has become a key replication-based technology to satisfy intensively growing demands for complex precision optics in today's photonic market. However, the state-of-the-art replicative technologies are still limited, mainly due to their inability to meet the requirements of mass production. This paper introduces a newly developed nonisothermal glass molding process in which a complex-shaped optic is produced in a very short process cycle. The innovative molding technology promises cost-efficient production because of increased mold lifetime, lower energy consumption, and high throughput from a fast process chain. At this early stage of the process development, the research focuses on integrating finite element simulation into the process chain to reduce time and labor-intensive cost. By virtue of numerical modeling, defects including chill ripples and glass sticking in the nonisothermal molding process can be predicted and their consequent effects avoided. In addition, the influences of process parameters and glass preforms on surface quality, form accuracy, and residual stress are discussed. A series of experiments was carried out to validate the simulation results. The successful modeling therefore provides a systematic strategy for glass preform design, mold compensation, and optimization of the process parameters. In conclusion, the integration of simulation into the entire nonisothermal glass molding process chain will significantly increase manufacturing efficiency as well as reduce the time-to-market for the mass production of complex precision yet low-cost glass optics.

  5. Dynamics of asexual reproduction in planarians

    NASA Astrophysics Data System (ADS)

    Schoetz, Eva-Maria; Lincoln, Bryan; Quinodoz, Sofia

    2011-03-01

    Planaria research is experiencing a resurgence due to the development of molecular tools, the Planarian genome project and database resources. Despite the resulting progress in planarian biology research, an extensive study of their physical properties remains to be undertaken. We developed a method to collect a large amount of data on the dynamics of clonal reproduction in the freshwater planarian S. mediterranea. The capability of planarians to regenerate an entire organism from a minuscule body part is based on a homogeneously distributed stem cell population that comprises 25-30% of all cells. Due to this stem cell contingent, planarians can reproduce spontaneously by dividing into a larger head piece and a smaller tail piece, which will then rebuild the missing body parts, including a central nervous system, within about a week. Time-lapse imaging allows us to characterize the fission process in detail, revealing the stages of the process as well as capturing the nature of the rupture itself. A traction force measurement setup is being developed to allow us to quantify the forces planarians exert on the substrate during reproduction, a macroscopic analog of the Traction Force Microscopy setups used to determine local cellular forces. We are particularly interested in the molecular processes during division and the interplay between tissue mechanics and cell signaling.

  6. Framework for Site Characterization for Monitored Natural Attenuation of Volatile Organic Compounds in Ground Water

    EPA Science Inventory

    Monitored Natural Attenuation (MNA) is unique among remedial technologies in relying entirely on natural processes to achieve site-specific objectives. Site characterization is essential to provide site-specific data and interpretations for the decision-making process (i.e., to ...

  7. Evaluating Amtrak's S2S: Are Recorded Injury Rates Showing Actual Injury Rates?

    DOT National Transportation Integrated Search

    2017-08-01

    Since 2009, Amtrak has been engaged in unprecedented efforts to advance its safety processes and improve the safety culture of the entire corporation, including establishing a peer-to-peer feedback process, known as the Safe-2-Safer program. FRA is c...

  8. Metrology needs for the semiconductor industry over the next decade

    NASA Astrophysics Data System (ADS)

    Melliar-Smith, Mark; Diebold, Alain C.

    1998-11-01

    Metrology will continue to be a key enabler for the development and manufacture of future generations of integrated circuits. During 1997, the Semiconductor Industry Association renewed the National Technology Roadmap for Semiconductors (NTRS) through the 50 nm technology generation and for the first time included a Metrology Roadmap (1). Meeting the needs described in the Metrology Roadmap will be both a technological and financial challenge. In an ideal world, metrology capability would be available at the start of process and tool development, and silicon suppliers would have 450 mm wafer-capable metrology tools in time for development of that wafer size. Unfortunately, a majority of the metrology suppliers are small companies that typically cannot afford the additional two-to-three-year wait for return on R&D investment. Therefore, the success of the semiconductor industry demands that we expand cooperation between NIST, SEMATECH, the National Labs, SRC, and the entire community. In this paper, we will discuss several critical metrology topics including the role of sensor-based process control, in-line microscopy, focused measurements for transistor and interconnect fabrication, and development needs. Improvements in in-line microscopy must extend existing critical dimension measurements to the 100 nm generation, and new methods may be required for sub-100 nm generations. Through development, existing dielectric thickness and dopant dose and junction metrology methods can be extended to 100 nm, but new and possibly in-situ methods are needed beyond 100 nm. Interconnect process control will undergo change before 100 nm due to the introduction of copper metallization, low dielectric constant interlevel dielectrics, and Damascene process flows.

  9. From the Big Bang to sustainable societies.

    PubMed

    Eriksson, K E; Robèrt, K H

    1991-01-01

    A series of events in the history of cosmos has created the prerequisites for life on Earth. With respect to matter, the earth is a closed system. However, it receives light from the sun and emits infrared radiation into space. The difference in thermodynamic potential between these two flows has provided the physical conditions for self-organization. The transformation of lifeless matter into modern life forms, with their high degree of order and complexity, has occurred in the context of the earth's natural cycles, including the water cycle and the biochemical cycles between plants and animals. Primary production units, the cells of green plants, can use the thermodynamic potential of the energy balance in a very direct way, i.e. in photosynthesis. Plant cells are unique in their ability to synthesize more structure than is broken down elsewhere in the biosphere. The perpetuation of this process requires the recycling of wastes. However, modern industrial societies are obsessed with the supply side, ignoring the principle of matter's conservation and neglecting to plan for the entire material flow. As a result there has been an accumulation of both visible and invisible garbage (pollution), which disturbs the biosphere and reduces stocks of natural resources. Furthermore, due to complexity and delay mechanisms, we usually cannot predict time parameters for the resulting socio-economic consequences or the development of disease. To continue along this path of folly is not compatible with the maintenance of wealth, nor with the health of humans or the biosphere. Rather than address the millions of environmental problems one at a time, we need to approach them at the systemic level. It is essential to convert to human life-styles and forms of societal organization that are based on cyclic processes compatible with the earth's natural cycles. The challenge to the developed countries is not only to decrease their own emissions of pollutants but to develop the cyclic technology and life styles needed by the entire human community.

  10. Economic development and population policy in Bangladesh.

    PubMed

    Khan, M R

    1984-09-01

    This paper deals with Bangladesh's growth rate and the policy implications for its economy. Despite its obvious influence on the economy, population has never been integrated as an endogenous variable in any planning model. Development planning is mostly supported by donor agencies, involving little micro-level planning and practically no trickle-down effect. This paper examines the interaction of population and other development variables in the country's planning process. Much of the rural population consists of landless farmers and sharecroppers, so that the land ownership pattern contributes to low productivity. Population increase is making the rural masses even poorer. This process is further compounded by increasing foreign aid dependence, adverse terms of trade in the international market, low savings and investments, and the rural sector's worsening terms of trade. During 1950-1970, real per capita gross domestic product (GDP) increased at a rate of only 1% per annum, and real growth of GDP fell behind the population growth rate. A cost-benefit analysis of fertility reduction is needed. The cost-benefit ratio of most countries varies between 1:10 and 1:30; for Bangladesh it is 1:16. Macro-model studies indicate that the higher the fertility reduction and the shorter the period of required decline, the higher will be the benefits in terms of gains in per capita income. There is, however, a contradiction between national and household interests. The latter's decision to have more children has a negative spillover effect, which nullifies the gains of the community. The national family planning program suffered a serious setback during and after the liberation of Bangladesh, mainly due to lack of administrative leadership and support. In order for the population growth rate to be checked and the quality of life to be increased for the entire population, the family planning program must be revitalized by mobilizing the entire government machinery and involving the people at the grass-roots level.

  11. Changing sediment budget of the Mekong: Cumulative threats and management strategies for a large river basin.

    PubMed

    Kondolf, G Mathias; Schmitt, Rafael J P; Carling, Paul; Darby, Steve; Arias, Mauricio; Bizzi, Simone; Castelletti, Andrea; Cochrane, Thomas A; Gibson, Stanford; Kummu, Matti; Oeurng, Chantha; Rubin, Zan; Wild, Thomas

    2018-06-01

    Two decades after the construction of the first major dam, the Mekong basin and its six riparian countries have seen rapid economic growth and development of the river system. Hydropower dams, aggregate mines, flood-control dykes, and groundwater-irrigated agriculture have all provided short-term economic benefits throughout the basin. However, it is becoming evident that anthropic changes are significantly affecting the natural functioning of the river and its floodplains. We now ask if these changes are risking major adverse impacts for the 70 million people living in the Mekong Basin. Many livelihoods in the basin depend on ecosystem services that will be strongly impacted by alterations of the sediment transport processes that drive river and delta morpho-dynamics, which underpin a sustainable future for the Mekong basin and Delta. Drawing upon ongoing and recently published research, we provide an overview of key drivers of change (hydropower development, sand mining, dyking and water infrastructures, climate change, and accelerated subsidence from pumping) for the Mekong's sediment budget, and their likely individual and cumulative impacts on the river system. Our results quantify the degree to which the Mekong delta, which receives the impacts from the entire connected river basin, is increasingly vulnerable in the face of declining sediment loads, rising seas and subsiding land. Without concerted action, it is likely that nearly half of the Delta's land surface will be below sea level by 2100, with the remaining areas impacted by salinization and frequent flooding. The threat to the Delta can be understood only in the context of processes in the entire river basin. The Mekong River case can serve to raise awareness of how the connected functions of river systems in general depend on undisturbed sediment transport, thereby informing planning for other large river basins currently embarking on rapid economic development.

  12. The Paperless Solution

    NASA Technical Reports Server (NTRS)

    2001-01-01

    REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for the entire SBIR program process at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.

  13. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale behind their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found that the experts' modeling processes followed the linear sequence built into the modeling program, with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered and represented by specialized technical terms. Based on the above findings, we make suggestions for improving model-based science teaching and learning using Model-It.

  14. Fast interrupt platform for extended DOS

    NASA Technical Reports Server (NTRS)

    Duryea, T. W.

    1995-01-01

    Extended DOS offers the unique combination of a simple operating system which allows direct access to the interrupt tables, 32 bit protected mode access to 4096 MByte address space, and the use of industry standard C compilers. The drawback is that fast interrupt handling requires both 32 bit and 16 bit versions of each real-time process interrupt handler to avoid mode switches on the interrupts. A set of tools has been developed which automates the process of transforming the output of a standard 32 bit C compiler to 16 bit interrupt code which directly handles the real mode interrupts. The entire process compiles one set of source code via a make file, which boosts productivity by making the management of the compile-link cycle very simple. The software components are in the form of classes written mostly in C. A foreground process written as a conventional application which can use the standard C libraries can communicate with the background real-time classes via a message passing mechanism. The platform thus enables the integration of high performance real-time processing into a conventional application framework.

  15. Characterization of Calcite Mineral Precipitation Process by EICP in Porous Media

    NASA Astrophysics Data System (ADS)

    Kim, D.; Mahabadi, N.; Hall, C.; Jang, J.; van Paassen, L. A.

    2017-12-01

    One of the most prevalent ground improvement techniques is the injection of synthetic materials, such as cement grout or silicates, into the pore space to create cementing bonds between soil particles. Besides these traditional ground improvement methods, several biological processes have been developed to improve soil properties. Enzyme induced carbonate precipitation (EICP) is a biological process in which urea hydrolyzes into ammonia and inorganic carbon and promotes carbonate mineral precipitation. Different morphologies and patterns of calcite mineral precipitation, such as particle surface coating, pore filling, and soil particle bonding, have been observed in previous studies. Most previous research has detected precipitated minerals after the completion of the treatment using SEM (Scanning Electron Microscope) imaging and XRD (X-ray Diffractometer) structural analysis. In this research, an EICP reaction medium is injected into a microfluidic chip to observe the entire process of carbonate precipitation through several cycles of EICP treatment in the porous medium. Once the process of mineral precipitation is completed, water is injected into the microfluidic chip at different flow rates to evaluate the stability of the carbonates during fluid flow injection.

  16. High-throughput continuous hydrothermal synthesis of an entire nanoceramic phase diagram.

    PubMed

    Weng, Xiaole; Cockcroft, Jeremy K; Hyett, Geoffrey; Vickers, Martin; Boldrin, Paul; Tang, Chiu C; Thompson, Stephen P; Parker, Julia E; Knowles, Jonathan C; Rehman, Ihtesham; Parkin, Ivan; Evans, Julian R G; Darr, Jawwad A

    2009-01-01

    A novel High-Throughput Continuous Hydrothermal (HiTCH) flow synthesis reactor was used to make directly and rapidly a 66-sample nanoparticle library (an entire phase diagram) of nanocrystalline Ce(x)Zr(y)Y(z)O(2-delta) in less than 12 h. High resolution powder X-ray diffraction (PXRD) data were obtained for the entire heat-treated library (at 1000 degrees C/1 h) using the new robotic beamline I11, located at Diamond Light Source (DLS), allowing Rietveld-quality data collection for all 66 samples in less than a day. Consequently, the authors rapidly mapped out phase behavior and sintering behavior for the entire library. Out of the 66-sample heat-treated library, the PXRD data suggest that 43 samples possess the fluorite structure, of which 30 (out of 36) are ternary compositions. The speed, quantity and quality of data obtained by our new approach offer an exciting new development which will allow structure-property relationships to be accessed for nanoceramics in much shorter time periods.
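    The library size follows directly from sampling the ternary composition space on a 10% grid: the compositions x + y + z = 1 in 0.1 steps number exactly 66. A small sketch (function and variable names are ours):

        from itertools import product

        def ternary_grid(step=0.1):
            """All (x, y, z) fractions with x + y + z = 1 on a given step."""
            n = round(1.0 / step)
            return [(i / n, j / n, (n - i - j) / n)
                    for i, j in product(range(n + 1), repeat=2) if i + j <= n]

        assert len(ternary_grid()) == 66   # matches the 66-sample library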

  17. DRAGON Grid: A Three-Dimensional Hybrid Grid Generation Code Developed

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    2000-01-01

    Because grid generation can consume 70 percent of the total analysis time for a typical three-dimensional viscous flow simulation of a practical engineering device, payoffs from research and development could reduce costs and increase throughput considerably. In this study, researchers at the NASA Glenn Research Center at Lewis Field developed a new hybrid grid approach with the advantages of flexibility, high-quality grids suitable for an accurate resolution of viscous regions, and a low memory requirement. These advantages will, in turn, reduce analysis time and increase accuracy. They result from an innovative combination of structured and unstructured grids to represent the geometry and the computational domain. The present approach makes use of the respective strengths of both the structured and unstructured grid methods, while minimizing their weaknesses. First, the Chimera grid generates high-quality, mostly orthogonal meshes around individual components. This process is flexible and can be done easily. Normally, these individual grids are required to overlap each other so that the solution on one grid can communicate with another. However, when this communication is carried out via a nonconservative interpolation procedure, a spurious solution can result. Current research is aimed at entirely eliminating this undesired interpolation by directly replacing arbitrary grid overlapping with a nonstructured grid called a DRAGON grid, which uses the same set of conservation laws over the entire region, thus ensuring conservation everywhere. The DRAGON grid is shown for a typical film-cooled turbine vane with 33 holes and 3 plenum compartments. There are structured grids around each geometrical entity and unstructured grids connecting them. In fiscal year 1999, Glenn researchers developed and tested the three-dimensional DRAGON grid-generation tools. A flow solver suitable for the DRAGON grid has been developed, and a series of validation tests are underway.

  18. Mapping permafrost change hot-spots with Landsat time-series

    NASA Astrophysics Data System (ADS)

    Grosse, G.; Nitze, I.

    2016-12-01

    Recent and projected future climate warming strongly affects permafrost stability over large parts of the terrestrial Arctic, with local, regional and global scale consequences. The monitoring and quantification of permafrost and associated land surface changes in these areas are crucial for the analysis of hydrological and biogeochemical cycles as well as vegetation and ecosystem dynamics. However, detailed knowledge of the spatial distribution and the temporal dynamics of these processes is scarce, and key locations of permafrost landscape dynamics may remain unnoticed. As part of the ERC-funded PETA-CARB and ESA GlobPermafrost projects, we developed an automated processing chain based on data from the entire Landsat archive (excluding MSS) for the detection of permafrost-change-related processes and hotspots. The automated method enables us to analyze thousands of Landsat scenes, which allows for a multi-scaled spatio-temporal analysis at 30 meter spatial resolution. All necessary processing steps are carried out automatically with minimal user interaction, including data extraction, masking, reprojection, subsetting, data stacking, and calculation of multi-spectral indices. These indices, e.g. Landsat Tasseled Cap and NDVI among others, are used as proxies for land surface conditions, such as vegetation status, moisture or albedo. Finally, a robust trend analysis is applied to each multi-spectral index and each pixel over the entire observation period of up to 30 years, from 1985 to 2015, depending on data availability. Large transects of around 2 million km² across different permafrost types in Siberia and North America have been processed. Permafrost-related or permafrost-influencing landscape dynamics were detected within the trend analysis, including thermokarst lake dynamics, fires, thaw slumps, and coastal dynamics. The produced datasets will be distributed to the community as part of the ERC PETA-CARB and ESA GlobPermafrost projects. Users are encouraged to provide feedback and ground truth data for continuous improvement of our methodology and datasets, which will lead to a better understanding of the spatial and temporal distribution of changes within the vulnerable permafrost zone.
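    The per-pixel robust trend step can be illustrated with a Theil-Sen estimator over the index time series. A minimal sketch under assumed array shapes, not the PETA-CARB production chain:

        import numpy as np
        from scipy.stats import theilslopes

        def pixel_trends(index_stack, years, min_obs=5):
            """Theil-Sen slope per pixel of a (n_years, H, W) index stack."""
            _, h, w = index_stack.shape
            slopes = np.full((h, w), np.nan)
            for i in range(h):
                for j in range(w):
                    y = index_stack[:, i, j]
                    ok = ~np.isnan(y)
                    if ok.sum() >= min_obs:           # need enough clear looks
                        slopes[i, j] = theilslopes(y[ok], years[ok])[0]
            return slopes

    The median-of-pairwise-slopes estimator is robust to the residual cloud and snow outliers that inevitably survive masking over a 30-year stack.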

  19. Innovative vitrification for soil remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetta, N.W.; Patten, J.S.; Hart, J.G.

    1995-12-01

    The objective of this DOE demonstration program is to validate the performance and operation of the Vortec Cyclone Melting System (CMS{trademark}) for the processing of LLW-contaminated soils found at DOE sites. This DOE vitrification demonstration project has successfully progressed through the first two phases. Phase 1 consisted of pilot-scale testing with surrogate wastes and the conceptual design of a process plant operating at a generic DOE site. The objective of Phase 2, which is scheduled to be completed by the end of FY 95, is to develop a definitive process plant design for the treatment of wastes at a specific DOE facility. During Phase 2, a site-specific design was developed for the processing of LLW soils and muds containing TSCA organics and RCRA metal contaminants. Phase 3 will consist of a full-scale demonstration at the DOE gaseous diffusion plant located in Paducah, KY. Several DOE sites were evaluated for potential application of the technology. Paducah was selected for the demonstration program because of its urgent waste remediation needs as well as its strong management and cost-sharing financial support for the project. During Phase 2, the basic vitrification process design was modified to meet the specific needs of the new waste streams available at Paducah. The system design developed for Paducah has significantly enhanced the processing capabilities of the Vortec vitrification process. The overall system design now includes the capability to shred entire drums and drum packs containing mud, concrete, plastics and PCBs as well as bulk waste materials. This enhanced processing capability will substantially expand the total DOE waste remediation applications of the technology.

  20. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
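    The comparison-plot step lends itself to full automation once simulation truth and flight-software telemetry share a timebase. A generic sketch of that verification plot (file layout and names are assumptions; the MAP tooling is not described at this level of detail):

        import numpy as np
        import matplotlib.pyplot as plt

        def comparison_plot(t_sim, y_sim, t_fsw, y_fsw, name, outdir='.'):
            """Overlay sim vs. flight-software signals and plot their difference."""
            y_fsw_i = np.interp(t_sim, t_fsw, y_fsw)   # resample onto sim times
            fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
            ax1.plot(t_sim, y_sim, label='HiFi sim')
            ax1.plot(t_sim, y_fsw_i, '--', label='flight SW')
            ax1.set_ylabel(name)
            ax1.legend()
            ax2.plot(t_sim, y_fsw_i - y_sim)
            ax2.set_ylabel('difference')
            ax2.set_xlabel('time (s)')
            fig.savefig(f'{outdir}/{name}_compare.png')
            plt.close(fig)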

  1. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  2. Spaceborne sensors (1983-2000 AD): A forecast of technology

    NASA Technical Reports Server (NTRS)

    Kostiuk, T.; Clark, B. P.

    1984-01-01

    A technical review and forecast of space technology as it applies to spaceborne sensors for future NASA missions is presented. A format for categorization of sensor systems covering the entire electromagnetic spectrum, including particles and fields, is developed. Major generic sensor systems are related to their subsystems, components, and to basic research and development. General supporting technologies such as cryogenics, optical design, and data processing electronics are addressed where appropriate. The dependence of many classes of instruments on common components, basic R&D and support technologies is also illustrated. A forecast of important system designs and instrument and component performance parameters is provided for the 1983-2000 AD time frame. Some insight into the scientific and applications capabilities and goals of the sensor systems is also given.

  3. A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.

    1997-01-01

    This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.

  4. Reconfigurable, Cognitive Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.
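
    Of the cognitive capabilities named above, spectrum sensing is the most self-contained to illustrate. Below is a minimal energy-detection sketch in Python; the sample rate, signal, and threshold rule are illustrative assumptions, not Intelligent Automation's implementation.

```python
# Minimal energy-detection spectrum sensing: compare per-bin power against a
# robust noise-floor estimate. All parameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
fs = 1e6                       # sample rate (Hz), assumed
n = 4096
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
tone = 0.5 * np.exp(2j * np.pi * 100e3 / fs * np.arange(n))  # occupant at 100 kHz
x = noise + tone

psd = np.abs(np.fft.fft(x)) ** 2 / n     # periodogram
noise_floor = np.median(psd)             # robust noise estimate
occupied = psd > 10 * noise_floor        # flag bins ~10 dB above the floor
freqs = np.fft.fftfreq(n, d=1 / fs)
print("occupied bins near:", freqs[occupied][:5], "Hz")
```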

  5. Title TBA: Revising the Abstract Submission Process.

    PubMed

    Tibon, Roni; Open Science Committee, Cbu; Henson, Richard

    2018-04-01

    Academic conferences are among the most prolific scientific activities, yet the current abstract submission and review process has serious limitations. We propose a revised process that would address these limitations, achieve some of the aims of Open Science, and stimulate discussion throughout the entire lifecycle of the scientific work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. 76 FR 56466 - Notice of Intent to Prepare an Environmental Document and Proposed Plan Amendment for the West...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-13

    ... Vehicle Access Element of the CDCA Plan for the WEMO area; and (2) Alternative processes for designating.... Identification of the process and decision criteria that should be used to designate routes in the sub-regional... analysis, and guide the entire process from plan decision-making to route designation review in order to...

  7. Uncertainty propagation from raw data to final results. [ALEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, N.M.

    1985-01-01

    Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure. Propagation of experimental uncertainties through that reduction process has sometimes been perceived as even more difficult, if not impossible. At the Oak Ridge Electron Linear Accelerator, a computer code ALEX has been developed to assist in the propagation process. The purpose of ALEX is to carefully and correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is needed for the data reduction itself. The theoretical method used in ALEX is described, with emphasis on transmission measurements. Application to the natural iron and natural nickel measurements of D.C. Larson is shown.
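
    The complete covariance matrix that such a code produces follows from the standard first-order propagation rule; the formula below is that general rule, not a claim about ALEX's exact internals.

```latex
% First-order (linearized) covariance propagation. For reduced data d = f(r)
% with raw-data covariance V_r and Jacobian J of the reduction f:
\[
  V_d \;=\; J \, V_r \, J^{\mathsf{T}},
  \qquad
  J_{ij} \;=\; \frac{\partial f_i}{\partial r_j},
\]
% so off-diagonal terms of V_d capture the correlations introduced when
% several reduced points share normalization, background, or dead-time
% corrections inside the reduction f.
```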

  8. Reduced state feedback gain computation. [optimization and control theory for aircraft control

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1976-01-01

    Because application of conventional optimal linear regulator theory to flight controller design requires the capability of measuring and/or estimating the entire state vector, it is of interest to consider procedures for computing controls which are restricted to be linear feedback functions of a lower dimensional output vector and which take into account the presence of measurement noise and process uncertainty. Therefore, a stochastic linear model that was developed is presented which accounts for aircraft parameter and initial uncertainty, measurement noise, turbulence, pilot command and a restricted number of measurable outputs. Optimization with respect to the corresponding output feedback gains was performed for both finite and infinite time performance indices without gradient computation by using Zangwill's modification of a procedure originally proposed by Powell. Results using a seventh order process show the proposed procedures to be very effective.
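
    As a sketch of the approach described (derivative-free optimization of output feedback gains against a quadratic cost), the fragment below uses SciPy's Powell method as a stand-in for Zangwill's modification; the second-order plant, weights, and horizon are hypothetical, not the seventh-order aircraft process of the paper.

```python
# Output-feedback gain optimization without gradients: u = -K y with y = C x,
# minimizing a finite-horizon quadratic cost by Powell's method. The model
# here is an illustrative two-state plant, not the paper's aircraft process.
import numpy as np
from scipy.optimize import minimize

A = np.array([[0.0, 1.0], [-1.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])            # only one measurable output
Q, R, dt, N = np.eye(2), np.eye(1), 0.01, 2000
x0 = np.array([1.0, 0.0])

def cost(k_flat):
    """Finite-horizon quadratic cost of the closed loop under gain K."""
    K = k_flat.reshape(1, 1)
    x, J = x0.copy(), 0.0
    for _ in range(N):                # Euler-discretized closed-loop rollout
        u = -K @ (C @ x)
        J += (x @ Q @ x + u @ R @ u) * dt
        x = x + (A @ x + B @ u) * dt
    return J

res = minimize(cost, x0=np.zeros(1), method="Powell")
print("optimized output-feedback gain:", res.x, "cost:", res.fun)
```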

  9. Information Transparency in Education: Three Sides of a Two-Sided Process

    ERIC Educational Resources Information Center

    Mertsalova, T. A.

    2015-01-01

    Information transparency is the result of informational globalization and the avalanche of information and communication technologies: thus, these processes are natural for the whole modern society. Statistics show that during the past several years the transparency situation not just in education but in the entire society has expanded…

  10. Fetal programming and environmental exposures: Implications for prenatal care and preterm birth

    EPA Science Inventory

    Fetal programming is an enormously complex process that relies on numerous environmental inputs from uterine tissue, the placenta, the maternal blood supply, and other sources. Recent evidence has made clear that the process is not based entirely on genetics, but rather on a deli...

  11. Paths for Future Population Aging.

    ERIC Educational Resources Information Center

    Grigsby, Jill S.

    Population aging refers to an entire age structure becoming older. The age structure of a population is the result of three basic processes: fertility, mortality, and migration. Age structures reflect both past effects and current patterns of these processes. At the town, city, or regional level, migration becomes an important factor in raising…

  12. 9 CFR 381.300 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of air from a retort before the start of process timing. (x) Water activity. The ratio of the water vapor pressure of the product to the vapor pressure of pure water at the same temperature. ... throughout the entire thermal process. (d) Canned product. A poultry food product with a water activity above...

  13. 9 CFR 381.300 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of air from a retort before the start of process timing. (x) Water activity. The ratio of the water vapor pressure of the product to the vapor pressure of pure water at the same temperature. ... throughout the entire thermal process. (d) Canned product. A poultry food product with a water activity above...

  14. 9 CFR 381.300 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of air from a retort before the start of process timing. (x) Water activity. The ratio of the water vapor pressure of the product to the vapor pressure of pure water at the same temperature. ... throughout the entire thermal process. (d) Canned product. A poultry food product with a water activity above...

  15. 9 CFR 381.300 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of air from a retort before the start of process timing. (x) Water activity. The ratio of the water vapor pressure of the product to the vapor pressure of pure water at the same temperature. ... throughout the entire thermal process. (d) Canned product. A poultry food product with a water activity above...

  16. 9 CFR 381.300 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of air from a retort before the start of process timing. (x) Water activity. The ratio of the water vapor pressure of the product to the vapor pressure of pure water at the same temperature. ... throughout the entire thermal process. (d) Canned product. A poultry food product with a water activity above...
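
    The water-activity definition quoted in the entries above is a simple ratio; written as a formula, with the bar indicating that both vapor pressures are taken at the same temperature:

```latex
\[
  a_w \;=\; \left. \frac{p_{\text{product}}}{p_{\text{pure water}}} \right|_{T}
\]
```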

  17. Students Matter: Quality Measurements in Online Courses

    ERIC Educational Resources Information Center

    Unal, Zafer; Unal, Aslihan

    2016-01-01

    Quality Matters (QM) is a peer review process designed to certify the quality of online courses and online components. It has generated widespread interest and received national recognition for its peer-based approach to quality assurance and continuous improvement in online education. While the entire QM online course design process is…

  18. Oxygen ion-beam microlithography

    DOEpatents

    Tsuo, Y.S.

    1991-08-20

    A method of providing and developing a resist on a substrate for constructing integrated circuit (IC) chips includes the following steps: depositing a thin film of amorphous silicon or hydrogenated amorphous silicon on the substrate and exposing portions of the amorphous silicon to low-energy oxygen ion beams to oxidize the amorphous silicon at those selected portions. The nonoxidized portions are then removed by etching with RF-excited hydrogen plasma. Components of the IC chip can then be constructed through the removed portions of the resist. The entire process can be performed in an in-line vacuum production system having several vacuum chambers. Nitrogen or carbon ion beams can also be used. 5 figures.

  19. Correlation Dimension Estimates of Global and Local Temperature Data.

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    1995-11-01

    The author has attempted to detect the presence of low-dimensional deterministic chaos in temperature data by estimating the correlation dimension with the Hill estimate recently developed by Mikosch and Wang. There is no convincing evidence of low dimensionality in either the global dataset (Southern Hemisphere monthly average temperatures from 1858 to 1984) or the local dataset (daily minimum temperatures at Auckland, New Zealand). Any apparent reduction in the dimension estimates appears to be due largely, if not entirely, to effects of statistical bias; yet neither series is a purely random stochastic process. The dimension of the climatic attractor may be significantly larger than 10.
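
    For orientation, a Hill-type estimator of correlation dimension applies the classical Hill estimate to the smallest interpoint distances of the reconstructed attractor; the form below is the generic version (the Mikosch-Wang estimate cited above is of this family, though its bias corrections may differ).

```latex
% If the correlation integral scales as C(r) ~ r^D for small r, and
% r_{(1)} <= ... <= r_{(k+1)} are the k+1 smallest ordered interpoint
% distances, a Hill-type estimate of the dimension D is
\[
  \hat{D}_k \;=\; \left[ \frac{1}{k} \sum_{i=1}^{k}
      \ln \frac{r_{(k+1)}}{r_{(i)}} \right]^{-1},
\]
% whose statistical bias grows with embedding dimension -- the effect the
% abstract cites for apparent reductions in the estimated dimension.
```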

  20. Parallel Fixed Point Implementation of a Radial Basis Function Network in an FPGA

    PubMed Central

    de Souza, Alisson C. D.; Fernandes, Marcelo A. C.

    2014-01-01

    This paper proposes a parallel fixed point radial basis function (RBF) artificial neural network (ANN), implemented in a field programmable gate array (FPGA) trained online with a least mean square (LMS) algorithm. The processing time and occupied area were analyzed for various fixed point formats. The problems of precision of the ANN response for nonlinear classification using the XOR gate and interpolation using the sine function were also analyzed in a hardware implementation. The entire project was developed using the System Generator platform (Xilinx), with a Virtex-6 xc6vcx240t-1ff1156 as the target FPGA. PMID:25268918
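
    The underlying algorithm (a Gaussian RBF layer whose output weights are trained online with LMS) is compact enough to state as a floating-point reference; the sketch below uses the XOR task mentioned in the abstract, with centers, width, and step size chosen as illustrative assumptions. The paper's actual contribution, the parallel fixed-point FPGA realization, is not reproduced here.

```python
# Floating-point reference of an RBF network trained online by LMS, shown on
# the XOR problem. Centers, width, and learning rate are assumptions.
import numpy as np

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
d = np.array([0., 1., 1., 0.])                   # XOR targets
centers = X.copy()                               # one center per input corner
sigma, mu = 0.5, 0.1                             # RBF width and LMS step size
w = np.zeros(len(centers))

def phi(x):
    """Gaussian RBF activations for one input vector."""
    return np.exp(-np.sum((centers - x) ** 2, axis=1) / (2 * sigma ** 2))

for epoch in range(200):                         # online LMS updates
    for x, target in zip(X, d):
        h = phi(x)
        e = target - w @ h                       # instantaneous error
        w += mu * e * h                          # LMS rule: w <- w + mu*e*phi

print([round(float(w @ phi(x)), 3) for x in X])  # approaches [0, 1, 1, 0]
```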

  1. Medical Device for Automated Prick Test Reading.

    PubMed

    Justo, Xabier; Diaz, Inaki; Gil, Jorge Juan; Gastaminza, Gabriel

    2018-05-01

    Allergy tests are routinely performed in most hospitals every day. However, measuring the outcomes of these tests is still a very laborious manual task, and current methods and systems lack precision and repeatability. This paper presents a novel mechatronic system that is able to scan a patient's entire arm and provide allergists with precise measurements of wheals for diagnosis. The device is based on 3-D laser technology, and specific algorithms have been developed to process the information gathered. This system aims to automate the reading of skin prick tests and make gains in speed, accuracy, and reliability. Several experiments have been performed to evaluate the performance of the system.

  2. Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Barrett, Adam B.; Seth, Anil K.

    2009-12-01

    Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction in a similarly wide field. While it has been recognized that the two concepts must be related, the exact relationship has until now not been formally described. Here we show that for Gaussian variables, Granger causality and transfer entropy are entirely equivalent, thus bridging autoregressive and information-theoretic approaches to data-driven causal inference.
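
    The equivalence can be stated compactly. With variances denoting prediction errors from the restricted (past of X only) and full (past of X and Y) autoregressions:

```latex
\[
  F_{Y \to X} \;=\; \ln
    \frac{\operatorname{var}\!\bigl(X_t \mid X^{-}\bigr)}
         {\operatorname{var}\!\bigl(X_t \mid X^{-}, Y^{-}\bigr)},
  \qquad
  T_{Y \to X} \;=\; \tfrac{1}{2}\, F_{Y \to X}
  \quad \text{for jointly Gaussian } X, Y,
\]
% where X^- and Y^- denote the past values of each process.
```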

  3. Modulation of eukaryotic cell apoptosis by members of the bacterial order Actinomycetales.

    PubMed

    Barry, Daniel P; Beaman, Blaine L

    2006-10-01

    Apoptosis, or programmed cell death, is normally responsible for the orderly elimination of aged or damaged cells, and is a necessary part of the homeostasis and development of multicellular organisms. Some pathogenic bacteria can disrupt this process by triggering excess apoptosis or by preventing it when appropriate. Either event can lead to disease. There has been extensive research into the modulation of host cell death by microorganisms, and several reviews have been published on the phenomenon. Rather than covering the entire field, this review focuses on the dysregulation of host cell apoptosis by members of the order Actinomycetales, containing the genera Corynebacterium, Mycobacterium, Rhodococcus, and Nocardia.

  4. Pathogenesis of Taenia solium taeniasis and cysticercosis.

    PubMed

    Gonzales, I; Rivera, J T; Garcia, H H

    2016-03-01

    Taenia solium infections (taeniasis/cysticercosis) are a major scourge to most developing countries. Neurocysticercosis, the infection of the human nervous system by the cystic larvae of this parasite, has a protean array of clinical manifestations varying from entirely asymptomatic infections to aggressive, lethal courses. The diversity of clinical manifestations reflects a series of contributing factors which include the number, size and location of the invading parasites, and particularly the inflammatory response of the host. This manuscript reviews the different presentations of T. solium infections in the human host with a focus on the mechanisms or processes responsible for their clinical expression. © 2016 John Wiley & Sons Ltd.

  5. Sequence search on a supercomputer.

    PubMed

    Gotoh, O; Tagashira, Y

    1986-01-10

    A set of programs was developed for searching nucleic acid and protein sequence data bases for sequences similar to a given sequence. The programs, written in FORTRAN 77, were optimized for vector processing on a Hitachi S810-20 supercomputer. A search of a 500-residue protein sequence against the entire PIR data base Ver. 1.0 (1) (0.5 M residues) is carried out in a CPU time of 45 sec. About 4 min is required for an exhaustive search of a 1500-base nucleotide sequence against all mammalian sequences (1.2M bases) in Genbank Ver. 29.0. The CPU time is reduced to about a quarter with a faster version.
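
    The speedup comes from vectorizing the innermost scoring loop. As a rough illustration of that pattern, the Python fragment below scores a query against every window of a database sequence in one vectorized comparison; NumPy stands in for the vector pipelines, and simple identity counting is an assumption rather than the programs' actual scoring scheme.

```python
# Vectorized window scoring: count identities between a query and every
# equal-length window of a database sequence in a single comparison.
import numpy as np

def window_scores(query, db):
    """Identity count of `query` against each window of `db`."""
    q = np.frombuffer(query.encode(), dtype=np.uint8)
    d = np.frombuffer(db.encode(), dtype=np.uint8)
    m = len(q)
    # all windows as an (n-m+1, m) view, compared to the query in one shot
    windows = np.lib.stride_tricks.sliding_window_view(d, m)
    return (windows == q).sum(axis=1)

scores = window_scores("GATTACA", "ACGATTACAGGGATTTACA")
print(scores.argmax(), scores.max())  # best-matching window offset and score
```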

  6. Synthetic Genome Recoding: New genetic codes for new features

    PubMed Central

    Kuo, James; Stirling, Finn; Lau, Yu Heng; Shulgina, Yekaterina; Way, Jeffrey C.; Silver, Pamela A.

    2018-01-01

    Full genome recoding, or rewriting codon meaning, through chemical synthesis of entire bacterial chromosomes has become feasible in the past several years. Recoding an organism can impart new properties including non-natural amino acid incorporation, virus resistance, and biocontainment. The estimated cost of construction that includes DNA synthesis, assembly by recombination, and troubleshooting, is now comparable to costs of early stage development of drugs or other high-tech products. Here we discuss several recently published assembly methods and provide some thoughts on the future, including how synthetic efforts might benefit from analysis of natural recoding processes and organisms that use alternative genetic codes. PMID:28983660

  7. Oxygen ion-beam microlithography

    DOEpatents

    Tsuo, Y. Simon

    1991-01-01

    A method of providing and developing a resist on a substrate for constructing integrated circuit (IC) chips includes the following steps: depositing a thin film of amorphous silicon or hydrogenated amorphous silicon on the substrate and exposing portions of the amorphous silicon to low-energy oxygen ion beams to oxidize the amorphous silicon at those selected portions. The nonoxidized portions are then removed by etching with RF-excited hydrogen plasma. Components of the IC chip can then be constructed through the removed portions of the resist. The entire process can be performed in an in-line vacuum production system having several vacuum chambers. Nitrogen or carbon ion beams can also be used.

  8. Nanoscale resonant-cavity-enhanced germanium photodetectors with lithographically defined spectral response for improved performance at telecommunications wavelengths.

    PubMed

    Balram, Krishna C; Audet, Ross M; Miller, David A B

    2013-04-22

    We demonstrate the use of a subwavelength planar metal-dielectric resonant cavity to enhance the absorption of germanium photodetectors at wavelengths beyond the material's direct absorption edge, enabling high responsivity across the entire telecommunications C and L bands. The resonant wavelength of the detectors can be tuned linearly by varying the width of the Ge fin, allowing multiple detectors, each resonant at a different wavelength, to be fabricated in a single-step process. This approach is promising for the development of CMOS-compatible devices suitable for integrated, high-speed, and energy-efficient photodetection at telecommunications wavelengths.

  9. Parameter estimating state reconstruction

    NASA Technical Reports Server (NTRS)

    George, E. B.

    1976-01-01

    Parameter estimation is considered for systems whose entire state cannot be measured. Linear observers are designed to recover the unmeasured states to a sufficient accuracy to permit the estimation process. There are three distinct dynamics that must be accommodated in the system design: the dynamics of the plant, the dynamics of the observer, and the system updating of the parameter estimation. The latter two are designed to minimize interaction of the involved systems. These techniques are extended to weakly nonlinear systems. The application to a simulation of a space shuttle POGO system test is of particular interest. A nonlinear simulation of the system is developed, observers designed, and the parameters estimated.
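
    A minimal sketch of the observer stage described above: a discrete-time Luenberger observer is a copy of the plant driven by the output error, with its gain placed so the reconstruction error decays faster than the plant dynamics. The plant, poles, and noise level below are illustrative assumptions, and the downstream parameter-estimation step is omitted.

```python
# Discrete-time Luenberger observer: xhat <- A xhat + B u + L (y - C xhat).
# Plant, observer poles, and noise are illustrative stand-ins.
import numpy as np
from scipy.signal import place_poles

A = np.array([[1.0, 0.1], [0.0, 0.98]])     # discrete-time plant
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])                  # only x1 is measured

# Observer gain via the dual pole-placement problem: eig(A - L C) = poles
L = place_poles(A.T, C.T, [0.5, 0.6]).gain_matrix.T

rng = np.random.default_rng(1)
x, xhat = np.array([1.0, -0.5]), np.zeros(2)
for k in range(200):
    u = np.array([0.1 * np.sin(0.05 * k)])  # persistent excitation input
    y = C @ x + 0.001 * rng.standard_normal(1)
    xhat = A @ xhat + (B @ u) + (L @ (y - C @ xhat))  # observer update
    x = A @ x + B @ u                                 # true plant update

print("true state:", x, "reconstructed:", xhat)       # should nearly agree
```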

  10. User's instructions for the Guyton circulatory dynamics model using the Univac 1110 batch and demand processing (with graphic capabilities)

    NASA Technical Reports Server (NTRS)

    Archer, G. T.

    1974-01-01

    The model presents a systems analysis of human circulatory regulation based almost entirely on experimental data and cumulative present knowledge of the many facets of the circulatory system. The model itself consists of eighteen different major systems that enter into circulatory control. These systems are grouped into sixteen distinct subprograms that are melded together to form the total model. The model develops circulatory and fluid regulation in a simultaneous manner. Thus, the effects of hormonal and autonomic control, electrolyte regulation, and excretory dynamics are all important and are all included in the model.

  11. Application of experimental design in geothermal resources assessment of Ciwidey-Patuha, West Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Ashat, Ali; Pratama, Heru Berian

    2017-12-01

    Successful assessment of the size of the Ciwidey-Patuha geothermal field required integrated analysis of data from all aspects of the field to determine the optimum capacity to be installed. Resource assessment involves significant uncertainty in subsurface information and multiple development scenarios for this field. Therefore, this paper applies an experimental design approach to the geothermal numerical simulation of Ciwidey-Patuha to generate probabilistic resource assessment results. This process assesses the impact of the evaluated parameters on resources and the interactions between these parameters. This methodology successfully estimated the maximum resources with a polynomial function covering the entire range of possible values of the important reservoir parameters.
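
    The workflow the abstract describes (design the simulation runs, fit a polynomial proxy, then sample the proxy for probabilistic estimates) can be sketched compactly. In the fragment below the "simulator", parameters, and ranges are stand-ins, not the Ciwidey-Patuha model.

```python
# Experimental-design sketch: factorial runs of a (stand-in) simulator, a
# polynomial proxy fit, and Monte Carlo sampling of the proxy for P10/P50/P90.
import itertools
import numpy as np

def simulator(porosity, thickness):
    """Hypothetical stand-in for the numerical reservoir simulation (MWe)."""
    return 400.0 * porosity * thickness / 1000.0

# Two-level full factorial design over the uncertain parameters
levels = {"porosity": (0.05, 0.15), "thickness": (800.0, 1500.0)}
runs = list(itertools.product(*levels.values()))
y = np.array([simulator(p, h) for p, h in runs])

# Fit the polynomial proxy: MW = b0 + b1*p + b2*h + b3*p*h
X = np.array([[1.0, p, h, p * h] for p, h in runs])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Monte Carlo the cheap proxy instead of the expensive simulator
rng = np.random.default_rng(42)
p = rng.uniform(*levels["porosity"], 10000)
h = rng.uniform(*levels["thickness"], 10000)
mw = b[0] + b[1] * p + b[2] * h + b[3] * p * h
print("P10/P50/P90 resource (MWe):", np.percentile(mw, [10, 50, 90]).round(1))
```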

  12. Persulfidation proteome reveals the regulation of protein function by hydrogen sulfide in diverse biological processes in Arabidopsis.

    PubMed

    Aroca, Angeles; Benito, Juan M; Gotor, Cecilia; Romero, Luis C

    2017-10-13

    Hydrogen sulfide-mediated signaling pathways regulate many physiological and pathophysiological processes in mammalian and plant systems. The molecular mechanism by which hydrogen sulfide exerts its action involves the post-translational modification of cysteine residues to form a persulfidated thiol motif, a process called protein persulfidation. We have developed a comparative and quantitative proteomic analysis approach for the detection of endogenous persulfidated proteins in wild-type Arabidopsis and L-CYSTEINE DESULFHYDRASE 1 mutant leaves using the tag-switch method. The 2,015 persulfidated proteins identified were isolated from plants grown under controlled conditions, indicating that at least 5% of the entire Arabidopsis proteome may undergo persulfidation under baseline conditions. Bioinformatic analysis revealed that persulfidated cysteines participate in a wide range of biological functions, regulating important processes such as carbon metabolism, plant responses to abiotic and biotic stresses, plant growth and development, and RNA translation. Quantitative analysis in both genetic backgrounds reveals that protein persulfidation is mainly involved in primary metabolic pathways such as the tricarboxylic acid cycle, glycolysis, and the Calvin cycle, suggesting that this protein modification is a new regulatory component in these pathways. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  13. Development and Research on the Mechanism of Novel Mist Etching Method for Oxide Thin Films

    NASA Astrophysics Data System (ADS)

    Kawaharamura, Toshiyuki; Hirao, Takashi

    2012-03-01

    A novel etching process using an etchant mist was developed and applied to oxide thin films such as zinc oxide (ZnO), zinc magnesium oxide (ZnMgO), and indium tin oxide (ITO). This process was shown to allow precise control of the etching characteristics at a reasonable etching rate (for example, in the range of 10-100 nm/min) and to realize fine patterns of high accuracy for ZnO and ZnMgO, which is usually very difficult with conventional wet etching processes. The mist etching process was found to apply similarly and successfully to ITO. The mechanism of mist etching was studied by examining the etching-temperature dependence of pattern accuracy, and it was shown to differ from that of conventional liquid-phase spray etching. It was ascertained that fine pattern etching was attained using mist droplets completely (or partly) gasified by the heat applied to the substrate. This technique was applied to the fabrication of a ZnO thin-film transistor (TFT) with a ZnO active channel length of 4 µm. The electrical properties of the TFT were found to be excellent, with fine uniformity over the entire 4-in. wafer.

  14. The process of moving from a regionally based cervical cytology biobank to a national infrastructure.

    PubMed

    Perskvist, Nasrin; Norlin, Loreana; Dillner, Joakim

    2015-04-01

    This article addresses the important issue of the standardization of the biobank process. It reports on i) the implementation of standard operating procedures for the processing of liquid-based cervical cells, ii) the standardization of storage conditions, and iii) the ultimate establishment of nationwide standardized biorepositories for cervical specimens. Given the differences in the infrastructure and healthcare systems of various county councils in Sweden, these efforts were designed to develop standardized methods of biobanking across the nation. The standardization of cervical sample processing and biobanking is an important and widely acknowledged issue. Efforts to address these concerns will facilitate better patient care and improve research based on retrospective and prospective collections of patient samples and cohorts. The successful nationalization of the Cervical Cytology Biobank in Sweden is based on three vital issues: i) the flexibility of the system to adapt to other regional systems, ii) the development of the system based on national collaboration between the university and the county councils, and iii) stable governmental financing by the provider, the Biobanking and Molecular Resource Infrastructure of Sweden (BBMRI.se). We will share our experiences with biorepository communities to promote understanding of and advances in opportunities to establish a nationalized biobank which covers the healthcare of the entire nation.

  15. Methods and approaches to support Indigenous water planning: An example from the Tiwi Islands, Northern Territory, Australia

    NASA Astrophysics Data System (ADS)

    Hoverman, Suzanne; Ayre, Margaret

    2012-12-01

    Indigenous land owners of the Tiwi Islands, Northern Territory, Australia have begun the first formal freshwater allocation planning process in Australia conducted entirely within Indigenous lands and waterways. The process is managed by the Northern Territory government agency responsible for water planning, the Department of Natural Resources, Environment, The Arts and Sport, in partnership with the Tiwi Land Council, the principal representative body for Tiwi Islanders on matters of land and water management and governance. Participatory planning methods ('tools') were developed to facilitate community participation in Tiwi water planning. The tools, selected for their potential to generate involvement in the planning process, needed both to incorporate Indigenous knowledge of water use and management and to raise awareness in the Indigenous community of Western science and water resources management. In consultation with the water planner and Tiwi Land Council officers, the researchers selected four main tools to develop, trial and evaluate. Results demonstrate that the tools provided mechanisms which acknowledge traditional management systems, improve community engagement, and build confidence in the water planning process. The researchers found that participatory planning approaches supported Tiwi natural resource management institutions both in determining appropriate institutional arrangements and clarifying roles and responsibilities in the Islands' Water Management Strategy.

  16. Computational Chemistry in the Pharmaceutical Industry: From Childhood to Adolescence.

    PubMed

    Hillisch, Alexander; Heinrich, Nikolaus; Wild, Hanno

    2015-12-01

    Computational chemistry within the pharmaceutical industry has grown into a field that proactively contributes to many aspects of drug design, including target selection and lead identification and optimization. While methodological advancements have been key to this development, organizational developments have been crucial to our success as well. In particular, the interaction between computational and medicinal chemistry and the integration of computational chemistry into the entire drug discovery process have been invaluable. Over the past ten years we have shaped and developed a highly efficient computational chemistry group for small-molecule drug discovery at Bayer HealthCare that has significantly impacted the clinical development pipeline. In this article we describe the setup and tasks of the computational group and discuss external collaborations. We explain what we have found to be the most valuable and productive methods and discuss future directions for computational chemistry method development. We share this information with the hope of igniting interesting discussions around this topic. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Dental development of Didelphis albiventris (Marsupialia): I--incisors and canines.

    PubMed

    Fonseca, C T; Alves, J B

    2006-02-01

    The formation of incisors and canines in marsupials of D. albiventris was studied at various stages of development. Seventy-six specimens, with ages varying from 0 to 100 days, were used in this investigation. Serial sections of the maxilla were obtained in the transverse plane and stained with hematoxylin and eosin. Histological analyses were made to verify the pattern of teeth development, as well as their chronology of eruption. The period of time from birth to 100 days comprised the entire process of teeth development, from epithelial bud formation to early eruption of the teeth. Oral epithelium thickening gave rise to the functional incisors and canines. In addition, a secondary dental lamina emerged in different phases of development in the outer epithelium of incisors and canines, which degenerated when it reached the bud stage. No evidence of deciduous dentition was observed. The results of this investigation suggest that secondary dental lamina represents remnants of a primitive condition in which secondary dentition used to be present.

  18. 50 CFR 679.30 - General CDQ regulations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... visual representation of the qualified applicant's entire organizational structure, including all... narrative description of how the CDQ group intends to harvest and process its CDQ allocations, including a...

  19. Navigating Institutions and Institutional Leadership to Address Sexual Violence

    ERIC Educational Resources Information Center

    Sisneros, Kathy; Rivera, Monica

    2018-01-01

    Using an institutional example, this chapter offers strategies to effectively navigate institutional culture, processes, and structures to engage the entire campus community in addressing sexual violence.

  20. Cabbage Patch Chemistry.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2000

    2000-01-01

    This activity takes students through the process of fermentation. Requires an entire month for the full reaction to take place. The reaction, catalyzed by bacterial enzymes, produces lactic acid from glucose. (SAH)
