25 CFR 15.11 - What are the basic steps of the probate process?
Code of Federal Regulations, 2010 CFR
2010-04-01
... Title 25 (Indians), §15.11, Bureau of Indian Affairs, Department of the Interior, Probate of Indian estates: What are the basic steps of the probate process? The basic steps of the probate process are: (a) We learn...
Processing of zero-derived words in English: an fMRI investigation.
Pliatsikas, Christos; Wheeldon, Linda; Lahiri, Aditi; Hansen, Peter C
2014-01-01
Derivational morphological processes allow us to create new words from base forms (e.g., deriving the noun punishment from the verb punish). The number of steps from the basic units to derived words often varies (e.g., nationality
Three basic principles of success.
Levin, Roger
2003-06-01
Basic business principles all but ensure success when they are followed consistently. Putting strategies, objectives and tactics in place is the first step toward being able to document systems, initiate scripting and improve staff training. Without these basic steps of systems, scripting and training, the practice's performance would be hit or miss at best. More importantly, applying business principles ensures that limited practice resources are dedicated to the achievement of the strategy. By following this simple, three-step process, a dental practice can significantly enhance both financial success and dentist and staff satisfaction.
Conceptual analysis of Physiology of vision in Ayurveda.
Balakrishnan, Praveen; Ashwini, M J
2014-07-01
The process by which the world outside is seen is termed the visual process, or physiology of vision. There are three phases in this visual process: refraction of light, conversion of light energy into electrical impulses, and finally peripheral and central neurophysiology. With the advent of modern instruments, the step-by-step biochemical changes occurring at each level of the visual process have been deciphered. Many investigations have emerged to track these changes and help diagnose the exact nature of disease. Ayurveda has described this physiology of vision based on the functions of vata and pitta. The philosophical textbook of ayurveda, Tarka Sangraha, gives certain basic facts of the visual process. This article discusses the second and third phases of the visual process. A step-by-step analysis of the visual process through the lens of ayurveda, amalgamated with the basics of philosophy from Tarka Sangraha, is presented to generate a concrete idea of the physiology and thereby interpret the pathology on the grounds of ayurveda, based on investigative reports.
77 FR 67340 - National Fire Codes: Request for Comments on NFPA's Codes and Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... the process. The Code Revision Process contains four basic steps that are followed for developing new documents as well as revising existing documents. Step 1: Public Input Stage, which results in the First Draft Report (formerly ROP); Step 2: Comment Stage, which results in the Second Draft Report (formerly...
Possibly All of that and Then Some: Scalar Implicatures Are Understood in Two Steps
ERIC Educational Resources Information Center
Tomlinson, John M., Jr.; Bailey, Todd M.; Bott, Lewis
2013-01-01
Scalar implicatures often incur a processing cost in sentence comprehension tasks. We used a novel mouse-tracking technique in a sentence verification paradigm to test different accounts of this effect. We compared a two-step account, in which people access a basic meaning and then enrich the basic meaning to form the scalar implicature, against a…
ERIC Educational Resources Information Center
Research for Better Schools, Inc., Philadelphia, PA.
The process for providing a "thorough and efficient" (T & E) education according to New Jersey statutes and regulations involves six basic steps. This document suggests procedures for handling the fifth step, educational program evaluation. Processes discussed include committee formation, evaluation planning, action plan…
Imaging through Fog Using Polarization Imaging in the Visible/NIR/SWIR Spectrum
2017-01-11
few haze effects as possible. One post-processing step on the image completes the image dehazing. [Figure 6: basic architecture of the...; Figure 7: basic architecture of post-processing techniques to recover a dehazed image from a raw image.] This first study was limited to the
Bibliographic Instruction in a Step-by-Step Approach.
ERIC Educational Resources Information Center
Soash, Richard L.
1992-01-01
Describes an information search process based on Kuhlthau's model that was used to teach bibliographic research to ninth grade students. A research test to ensure that students are familiar with basic library skills is presented, forms for helping students narrow the topic and evaluate materials are provided, and a research process checklist is…
A Selection Method That Succeeds!
ERIC Educational Resources Information Center
Weitman, Catheryn J.
Provided a structured selection method is carried out, it is possible to find quality early childhood personnel. The hiring process involves five definite steps, each of which establishes a base for the next. A needs assessment formulating basic minimal qualifications is the first step. The second step involves review of current job descriptions…
A Reference Unit on Home Vegetable Gardening.
ERIC Educational Resources Information Center
McCully, James S., Comp.; And Others
Designed to provide practical, up-to-date, basic information on home gardening for vocational agriculture students with only a limited knowledge of vegetable gardening, this reference unit includes step-by-step procedures for planning, planting, cultivating, harvesting, and processing vegetables in a small plot. Topics covered include plot…
Converting baker's waste into alcohol. Revised final progress report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halsey, R.; Wilson, P.B.
All types of baker's waste (including waste from candy manufacturers) can be converted into alcohol to be used as a fuel. All types of waste at any stage in the process can be converted, such as: basic ingredients (including floor sweepings); dry mixes (including floor sweepings); dough at any stage; partially or fully cooked products; and day-old returned products. The basic steps are the same; only the initial preparation will vary slightly, in the amount of water to be added and the amount and type of nutrients (if any) to be added. The basic steps are: slurrying; liquefying, to put the starch into a liquid state; saccharifying, to convert the starch into fermentable sugars; fermentation, to convert the sugars into alcohol; and distillation, to separate the alcohol from the mash. Each step is discussed in detail along with problems that may arise. Directions are given, and materials (enzymes, yeast, etc.) and equipment are described briefly.
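The theoretical alcohol yield implied by these steps follows from standard stoichiometry alone. The Python sketch below is an illustrative back-of-envelope calculation, not part of the report; the batch size and starch figure are assumptions.

```python
# Theoretical ethanol yield from the starch in a batch of baker's waste.
# Stoichiometry is standard; the 100 kg starch figure is an assumption.
starch_kg = 100.0                         # starch content of the batch (assumed)
glucose_kg = starch_kg * 180.16 / 162.14  # hydrolysis adds water: ~1.11 kg/kg
ethanol_kg = glucose_kg * 0.511           # C6H12O6 -> 2 C2H5OH + 2 CO2
litres = ethanol_kg / 0.789               # ethanol density ~0.789 kg/L
print(f"theoretical maximum: {ethanol_kg:.1f} kg (~{litres:.0f} L) ethanol")
```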
Gaia DR2 documentation Chapter 3: Astrometry
NASA Astrophysics Data System (ADS)
Hobbs, D.; Lindegren, L.; Bastian, U.; Klioner, S.; Butkevich, A.; Stephenson, C.; Hernandez, J.; Lammers, U.; Bombrun, A.; Mignard, F.; Altmann, M.; Davidson, M.; de Bruijne, J. H. J.; Fernández-Hernández, J.; Siddiqui, H.; Utrilla Molina, E.
2018-04-01
This chapter of the Gaia DR2 documentation describes the models and processing steps used for the astrometric core solution, namely the Astrometric Global Iterative Solution (AGIS). The inputs to this solution rely heavily on the basic observables (or astrometric elementaries), which have been pre-processed and discussed in Chapter 2 and whose results were published in Fabricius et al. (2016). The models consist of reference systems and time scales; assumed linear stellar motion and relativistic light deflection; and fundamental constants and the transformation of coordinate systems. Higher-level inputs, such as planetary and solar-system ephemerides, Gaia tracking and orbit information, initial quasar catalogues, and BAM data, are all needed for the processing described here. The astrometric calibration models are outlined, followed by the detailed processing steps that give AGIS its name. We also present a basic quality assessment and validation of the scientific results (for details, see Lindegren et al. 2018).
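The "iterative" in AGIS refers to a block-iterative least-squares scheme that alternately updates groups of unknowns. The Python toy below sketches only that idea, alternating between per-source parameters and one shared calibration coefficient; it is not the AGIS algorithm itself, and all names and numbers are invented for illustration.

```python
import numpy as np

# Toy block-iterative least squares in the spirit of AGIS: alternate
# between solving per-source parameters and a shared calibration term.
rng = np.random.default_rng(1)
n_obs, n_src = 2000, 50
src = rng.integers(0, n_src, n_obs)    # source index of each observation
x = rng.uniform(-1.0, 1.0, n_obs)      # known calibration regressor
s_true = rng.normal(size=n_src)        # per-source parameter
c_true = 0.3                           # shared calibration coefficient
y = s_true[src] + c_true * x + 0.01 * rng.normal(size=n_obs)

s, c = np.zeros(n_src), 0.0
for _ in range(20):
    for k in range(n_src):             # source block update
        s[k] = np.mean(y[src == k] - c * x[src == k])
    c = np.sum(x * (y - s[src])) / np.sum(x * x)  # calibration block update
print(f"recovered calibration c = {c:.3f} (true {c_true})")
```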
Supercritical Fluid Spray Application Process for Adhesives and Primers
2003-03-01
The basic scheme of the SFE process consists of three steps. A solvent, typically carbon dioxide, is first heated and pressurized to a supercritical...passivation step to remove contaminants and prevent recontamination. Bok et al. (25) describe a pressure-pulsation mechanism to stimulate improved...in as a liquid, and then it is heated above its critical temperature to become a supercritical fluid. The sample is injected and dissolved into
Dynamic Modeling of the Main Blow in Basic Oxygen Steelmaking Using Measured Step Responses
NASA Astrophysics Data System (ADS)
Kattenbelt, Carolien; Roffel, B.
2008-10-01
In the control and optimization of basic oxygen steelmaking, it is important to have an understanding of the influence of control variables on the process. However, important process variables such as the composition of the steel and slag cannot be measured continuously. The decarburization rate and the accumulation rate of oxygen, which can be derived from the generally measured waste gas flow and composition, are an indication of changes in steel and slag composition. The influence of the control variables on the decarburization rate and the accumulation rate of oxygen can best be determined in the main blow period. In this article, the measured step responses of the decarburization rate and the accumulation rate of oxygen to step changes in the oxygen blowing rate, lance height, and the addition rate of iron ore during the main blow are presented. These measured step responses are subsequently used to develop a dynamic model for the main blow. The model consists of an iron oxide and a carbon balance and an additional equation describing the influence of the lance height and the oxygen blowing rate on the decarburization rate. With this simple dynamic model, the measured step responses can be explained satisfactorily.
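Turning measured step responses like these into model parameters typically starts with fitting a simple response form. The Python sketch below fits a first-order step response to synthetic data; it is a generic illustration of the procedure, not the authors' model, and all numbers are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def step_response(t, gain, tau):
    # first-order response to a unit step change in an input
    return gain * (1.0 - np.exp(-t / tau))

t = np.linspace(0.0, 120.0, 60)          # seconds after the step change
y = (step_response(t, 0.8, 25.0)
     + 0.02 * np.random.default_rng(0).normal(size=t.size))  # synthetic data
(gain, tau), _ = curve_fit(step_response, t, y, p0=(1.0, 10.0))
print(f"fitted gain = {gain:.2f}, time constant = {tau:.1f} s")
```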
Reading Assessment: A Primer for Teachers and Tutors.
ERIC Educational Resources Information Center
Caldwell, JoAnne Schudt
This primer provides the basic information that teachers and tutors need to get started on the complex process of reading assessment. Designed for maximum utility in today's standards-driven classroom, the primer presents simple, practical assessment strategies that are based on theory and research. It takes teachers step by step through learning…
ERIC Educational Resources Information Center
National Library of Australia, Canberra.
This manual is designed to provide an introduction and basic guide to the use of IBM's Advanced Text Management System (ATMS), the text processing system to be used for the creation of Australian data bases within AUSINET. Instructions are provided for using the system to enter, store, retrieve, and modify data, which may then be displayed at the…
WaveNet: A Web-Based Metocean Data Access, Processing, and Analysis Tool. Part 3 - CDIP Database
2014-06-01
and Analysis Tool; Part 3 – CDIP Database, by Zeki Demirbilek, Lihwa Lin, and Derek Wilson. PURPOSE: This Coastal and Hydraulics Engineering...Technical Note (CHETN) describes coupling of the Coastal Data Information Program (CDIP) database to WaveNet, the first module of MetOcnDat (Meteorological...provides a step-by-step procedure to access, process, and analyze wave and wind data from the CDIP database. BACKGROUND: WaveNet addresses a basic
Purification and Refolding of Overexpressed Human Basic Fibroblast Growth Factor in Escherichia coli
Alibolandi, Mona; Mirzahoseini, Hasan
2011-01-01
This work describes the integration of expanded bed adsorption (EBA) and adsorptive protein refolding operations used to recover purified and biologically active human basic fibroblast growth factor from inclusion bodies expressed in E. coli. Insoluble overexpressed human basic fibroblast growth factor was purified on CM Hyper Z matrix by expanded bed adsorption after isolation and solubilization in 8 M urea. The adsorption was performed in an expanded bed without clarification steps such as centrifugation. Column refolding was done by elimination of urea and elution with NaCl. The human basic fibroblast growth factor was obtained as a highly purified soluble monomer whose behavior in circular dichroism and fluorescence spectroscopy was similar to that of the native protein. A total of 92.52% of the available human basic fibroblast growth factor was recovered as biologically active, purified protein using this purification and refolding process. This is the first procedure describing high-throughput purification and refolding of human basic fibroblast growth factor in one step, and it is likely to have the greatest benefit for proteins that tend to aggregate when refolded by dilution. PMID:21837279
Measure Guideline. Replacing Single-Speed Pool Pumps with Variable Speed Pumps for Energy Savings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, A.; Easley, S.
2012-05-01
This measure guideline evaluates potential energy savings from replacing traditional single-speed pool pumps with variable speed pool pumps, and provides a basic cost comparison between continued use of traditional pumps versus new pumps. A simple step-by-step process for inspecting the pool area and installing a new pool pump follows.
Measure Guideline: Replacing Single-Speed Pool Pumps with Variable Speed Pumps for Energy Savings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, A.; Easley, S.
2012-05-01
The report evaluates potential energy savings from replacing traditional single-speed pool pumps with variable speed pool pumps, and provides a basic cost comparison between continued use of traditional pumps versus new pumps. A simple step-by-step process for inspecting the pool area and installing a new pool pump follows.
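The physical basis of the savings described in this guideline is the pump affinity law: shaft power scales roughly with the cube of speed, so running slower for longer moves the same water with far less energy. A rough Python illustration, with all prices and run times assumed:

```python
# Pump affinity law: power ~ speed**3. Running at half speed for twice
# as long moves roughly the same water for ~1/4 of the energy.
full_kw, price = 1.5, 0.12                    # pump kW and $/kWh (assumed)
single_kwh = full_kw * 8.0 * 365              # single-speed: 8 h/day
variable_kwh = full_kw * 0.5**3 * 16.0 * 365  # half speed, 16 h/day
print(f"single-speed:   {single_kwh:.0f} kWh/yr (${single_kwh * price:.0f})")
print(f"variable-speed: {variable_kwh:.0f} kWh/yr (${variable_kwh * price:.0f})")
```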
Pozzolanic filtration/solidification of radionuclides in nuclear reactor cooling water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Englehardt, J.D.; Peng, C.
1995-12-31
Laboratory studies to investigate the feasibility of one- and two-step processes for precipitating/coprecipitating radionuclides from nuclear reactor cooling water, filtering with pozzolanic filter aid, and solidifying are reported in this paper. In the one-step process, ferrocyanide salt and excess lime are added ahead of the filter, and the resulting filter cake solidifies by a pozzolanic reaction. The two-step process involves addition of solidifying agents subsequent to filtration. It was found that high-surface-area diatomaceous synthetic calcium silicate powders, sold commercially as functional fillers and carriers, adsorb nickel isotopes from solution at neutral and slightly basic pH. Addition of the silicates to cooling water allowed removal of the tested metal isotopes (nickel, iron, manganese, cobalt, and cesium) simultaneously at neutral to slightly basic pH. The lime-to-diatomite ratio was the most influential compositional variable for final strength, with higher lime ratios giving higher strength. Diatomaceous earth filter aids manufactured without sodium fluxes exhibited higher pozzolanic activity. Pozzolanic filter cake solidified with sodium silicate and a ratio of 0.45 parts lime to 1 part diatomite had compressive strength ranging from 470 to 595 psi at a 90% confidence level. Leachability indices of all tested metals in the solidified waste were acceptable. In light of the typical requirement of removing iron and the desirability of control over process pH, a two-step process involving addition of Portland cement to the filter cake may be most generally applicable.
[The emphases and basic procedures of genetic counseling in psychotherapeutic model].
Zhang, Yuan-Zhi; Zhong, Nanbert
2006-11-01
The emphases and basic procedures of genetic counseling in the psychotherapeutic model differ from those in older models. In the psychotherapeutic model, genetic counseling focuses not only on counselees' genetic disorders and birth defects, but also on their psychological problems. "Client-centered therapy," as termed by Carl Rogers, plays an important role in the genetic counseling process. The basic procedure of the psychotherapeutic model of genetic counseling includes 7 steps: initial contact, introduction, agendas, inquiry of family history, presenting information, closing the session, and follow-up.
Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project
ERIC Educational Resources Information Center
Bolstad, Rachel
2016-01-01
This report evaluates a game-coding workshop offered to young people and adults in seven public libraries around New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…
ERIC Educational Resources Information Center
Stodden, Robert A.; Boone, Rosalie
1986-01-01
Discusses the role of teachers in providing vocational assessment to disabled students. Steps in this process include (1) establish planning team and conduct information search, (2) define purpose, (3) establish basic considerations, (4) formulate assessment model, (5) establish implementation focus, and (6) pilot test and evaluate assessment…
ERIC Educational Resources Information Center
Wolfe, Lisa A.
This book provides an introduction to basic communications concepts, a step-by-step process for developing and implementing a library public relations/communications plan, and descriptions of effective library communications tools and strategies. Detailed examples of the application of solid communications planning in school, public, academic, and…
A Guide to Program Planning Vol. II.
ERIC Educational Resources Information Center
Allen, Earl, Sr.
This booklet is a simplified guide for program planning and is intended to complement a somewhat lengthier companion booklet on program evaluation. It spells out in outline fashion the basic elements and steps involved in the planning process. Brief sections focus in turn on different phases of the planning process, including problem…
1985-01-01
envisionment) produced by GIZMO. In the envisionment, I_s indicates the set of quantity-conditioned individuals that exists during a situation...envisionment step by step. In START, the initial state, GIZMO deduces that heat flow occurs, since there is assumed to be a temperature difference between the...stove. GIZMO implements the basic operations of qualitative process theory, including an envisioner for making predictions and a program for
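Qualitative process theory's flagship example, visible in this fragment, is deducing that heat flow is active whenever two connected bodies differ in temperature. A minimal Python sketch of that single rule (not GIZMO itself; all names are invented):

```python
# One qualitative-process rule: heat flow is active between two connected
# bodies when a temperature difference exists (the stove/pot example).
def heat_flow_active(t_source: float, t_dest: float, connected: bool = True) -> bool:
    return connected and t_source > t_dest

state = {"stove": 400.0, "pot": 300.0}
if heat_flow_active(state["stove"], state["pot"]):
    # qualitative influences: T(pot) increases, T(stove) decreases
    print("heat-flow active: stove -> pot")
```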
10 Tips for Getting Grants to Keep Your Library Afloat
ERIC Educational Resources Information Center
McCune, Bonnie
2007-01-01
In this article, the author, taking the view of the foundation or the grant giver, shares the basic steps on how funders assess requests for grants. Each of these steps may be waived if a key member of the selection process, such as a board member, has a personal interest in the application. The following are 10 tips from the funders' point of…
Information Fluxes as Concept for Categorizations of Life
NASA Astrophysics Data System (ADS)
Hildenbrand, Georg; Hausmann, M.
2012-05-01
Definitions of life are controversially discussed; however, they mostly depend on bio-evolutionarily driven arguments. Here, we propose a systematic, theoretical approach to the question of what life is, by categorization and classification of different levels of life. This approach is mainly based on the analysis of the information flux occurring in systems suspected of being alive, and on the analysis of their power of environmental control. In a first step, we show that all biological definitions of life can be derived from basic physical principles of entropy (the number of possible states of a thermodynamic system) and of the energy needed for controlling entropic development. In a next step, we discuss how any process generating information flux, regardless of its materialization, is defined and related to classical definitions of life. In a third step, we resume the proposed classification scheme in its most basic form, looking only for the existence of data storage, its processing, and its environmental control. We include a short discussion of how the materialization of information fluxes can take place depending on the special properties of the four basic physical forces. Having done all this, we are able to put a classification catalogue in everyone's hands with which one can categorize the kind of life one is talking about, thus overcoming the obstacles arising from the simple-appearing question of whether something is alive or not. On its most basic level as presented here, our scheme offers a categorization for fire, crystals, prions, viruses, spores, up to cells and even tardigrada and cryostases.
The two-step assemblies of a basic-amino-acid-rich peptide with a highly charged polyoxometalate.
Zhang, Teng; Li, Hong-Wei; Wu, Yuqing; Wang, Yizhan; Wu, Lixin
2015-06-15
Two-step assembly of a peptide from HPV16 L1 with a highly charged europium-substituted polyoxometalate (POM) cluster, accompanied by a great luminescence enhancement of the inorganic polyanions, is reported. The mechanism is discussed in detail by analyzing the thermodynamic parameters from isothermal titration calorimetry (ITC), time-resolved fluorescence, and NMR spectra. By comparing the actions of peptide analogues, a binding process and model are proposed. The driving forces in each binding step are clarified, and the initial POM aggregation and the basic sequence and hydrophobic C terminus of the peptide are revealed to contribute essentially to the two-step assembly. The present study demonstrates both a meaningful preparation for bioinorganic materials and a strategy of using POMs to modulate the assembly of peptides and even proteins, which could be extended to other proteins and/or viruses by using peptides and POMs with similar properties. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Some Reflections on Strategic Planning in Public Libraries.
ERIC Educational Resources Information Center
Palmour, Vernon E.
1985-01-01
Presents the Public Library Association's planning model for strategic planning in public libraries. The development of the model is explained, the basic steps of the planning process are described, and improvements to the model are suggested. (CLB)
Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.
LaBerge, Jeanne M; Andriole, Katherine P
2003-12-01
This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored.
5 CFR 531.405 - Waiting periods for within-grade increase.
Code of Federal Regulations, 2011 CFR
2011-01-01
... step 4-260 days of creditable service in a pay status over a period of not less than 52 calendar weeks... step 4-52 calendar weeks of creditable service; (ii) Rate of basic pay equal to or greater than the rate of basic pay at step 4 and less than the rate of basic pay at step 7-104 calendar weeks of...
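The tiers visible in this excerpt map an employee's rate of basic pay, relative to the step 4 and step 7 rates, onto 52-, 104-, and 156-week waiting periods. A hypothetical Python helper expressing just those visible tiers (the 260-day pay-status condition and the rule's other details are omitted):

```python
def waiting_weeks(pay: float, step4_rate: float, step7_rate: float) -> int:
    """Calendar weeks of creditable service, per the tiers in the excerpt."""
    if pay < step4_rate:
        return 52
    if pay < step7_rate:
        return 104
    return 156
```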
A cascade reaction network mimicking the basic functional steps of adaptive immune response
NASA Astrophysics Data System (ADS)
Han, Da; Wu, Cuichen; You, Mingxu; Zhang, Tao; Wan, Shuo; Chen, Tao; Qiu, Liping; Zheng, Zheng; Liang, Hao; Tan, Weihong
2015-10-01
Biological systems use complex ‘information-processing cores’ composed of molecular networks to coordinate their external environment and internal states. An example of this is the acquired, or adaptive, immune system (AIS), which is composed of both humoral and cell-mediated components. Here we report the step-by-step construction of a prototype mimic of the AIS that we call an adaptive immune response simulator (AIRS). DNA and enzymes are used as simple artificial analogues of the components of the AIS to create a system that responds to specific molecular stimuli in vitro. We show that this network of reactions can function in a manner that is superficially similar to the most basic responses of the vertebrate AIS, including reaction sequences that mimic both humoral and cellular responses. As such, AIRS provides guidelines for the design and engineering of artificial reaction networks and molecular devices.
Generation of Antibunched Light by Excited Molecules in a Microcavity Trap
NASA Technical Reports Server (NTRS)
DeMartini, F.; DiGiuseppe, G.; Marrocco, M.
1996-01-01
The active microcavity is adopted as an efficient source of nonclassical light. With this device, excited by a mode-locked laser at a rate of 100 MHz, single photons are generated over a single field mode with a nonclassical sub-Poissonian distribution. The process of adiabatic recycling within a multi-step Franck-Condon molecular optical-pumping mechanism, characterized in our case by a quantum efficiency very close to one, implies a pump self-regularization process leading to a striking n-squeezing effect. By replication of the basic single-atom excitation process, a beam of quantum photons (Fock states) can be created. The new process represents a significant advance in the modern fields of basic quantum-mechanical investigation, quantum communication and quantum cryptography.
Bench to bedside: integrating advances in basic science into daily clinical practice.
McGoldrick, Rory B; Hui, Kenneth; Chang, James
2014-08-01
This article focuses on the initial steps of commercial development of a patentable scientific discovery from an academic center through to marketing a clinical product. The basics of partnering with a technology transfer office (TTO) and the complex process of patenting are addressed, followed by a discussion of marketing and licensing the patent to a company, as well as starting a company. Finally, the authors address the basic principles of obtaining clearance from the Food and Drug Administration, production in a good manufacturing practice (GMP) facility, and bringing the product to clinical trial. Published by Elsevier Inc.
Strategic Information Systems Planning.
ERIC Educational Resources Information Center
Rowley, Jennifer
1995-01-01
Strategic Information Systems Planning (SISP) is the process of establishing a program for implementation and use of information systems in ways that will optimize effectiveness of information resources and use them to support the objectives of the organization. Basic steps in SISP methodology are outlined. (JKP)
ERIC Educational Resources Information Center
Wichowski, Chester
1979-01-01
The zero-based budgeting approach is designed to achieve the greatest benefit with the fewest undesirable consequences. Seven basic steps make up the zero-based decision-making process: (1) identifying program goals, (2) classifying goals, (3) identifying resources, (4) reviewing consequences, (5) developing decision packages, (6) implementing a…
5 CFR 531.405 - Waiting periods for within-grade increase.
Code of Federal Regulations, 2010 CFR
2010-01-01
... step 4-52 calendar weeks of creditable service; (ii) Rate of basic pay equal to or greater than the... creditable service; and (iii) Rate of basic pay equal to or greater than the rate of basic pay at step 7-156... step 4-260 days of creditable service in a pay status over a period of not less than 52 calendar weeks...
25 CFR 166.220 - What are the basic steps for acquiring a permit through negotiation?
Code of Federal Regulations, 2010 CFR
2010-04-01
... Title 25 (Indians), §166.220, Bureau of Indian Affairs, Department of the Interior, Land and Water, Grazing Permits, Permit Requirements, Obtaining a Permit: What are the basic steps for acquiring a permit through negotiation? The...
Image reconstruction: an overview for clinicians.
Hansen, Michael S; Kellman, Peter
2015-03-01
Image reconstruction plays a critical role in the clinical use of magnetic resonance imaging (MRI). The MRI raw data is not acquired in image space and the role of the image reconstruction process is to transform the acquired raw data into images that can be interpreted clinically. This process involves multiple signal processing steps that each have an impact on the image quality. This review explains the basic terminology used for describing and quantifying image quality in terms of signal-to-noise ratio and point spread function. In this context, several commonly used image reconstruction components are discussed. The image reconstruction components covered include noise prewhitening for phased array data acquisition, interpolation needed to reconstruct square pixels, raw data filtering for reducing Gibbs ringing artifacts, Fourier transforms connecting the raw data with image space, and phased array coil combination. The treatment of phased array coils includes a general explanation of parallel imaging as a coil combination technique. The review is aimed at readers with no signal processing experience and should enable them to understand what role basic image reconstruction steps play in the formation of clinical images and how the resulting image quality is described. © 2014 Wiley Periodicals, Inc.
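Several of the reconstruction components listed in this review chain together in only a few lines. The numpy sketch below applies a raw-data (apodization) filter, an inverse Fourier transform, and a root-sum-of-squares coil combination; the Hann window and RSS combination are common textbook choices used for illustration, not necessarily the ones the review recommends.

```python
import numpy as np

def basic_recon(kspace: np.ndarray) -> np.ndarray:
    """kspace: complex raw data of shape (coils, ny, nx)."""
    ny, nx = kspace.shape[-2:]
    # raw-data filter (apodization) to reduce Gibbs ringing artifacts
    window = np.hanning(ny)[:, None] * np.hanning(nx)[None, :]
    filtered = kspace * window
    # Fourier transform connecting the raw data with image space
    imgs = np.fft.fftshift(
        np.fft.ifft2(np.fft.ifftshift(filtered, axes=(-2, -1)), axes=(-2, -1)),
        axes=(-2, -1))
    # root-sum-of-squares phased-array coil combination
    return np.sqrt((np.abs(imgs) ** 2).sum(axis=0))
```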
ERIC Educational Resources Information Center
Green, Connie
1997-01-01
Describes a classroom unit that provides preschoolers with hands-on experience, using common dirt as a way to develop scientific thinking and foster an appreciation of biology, ecology, and the natural world. Focuses on practicing the basic steps in the scientific process, including prediction, observation, documentation, conclusions, and…
Sharpe, Patricia A; Brandt, Heather M; McCree, Donna H; Owl-Myers, Elizabeth; Taylor, Betty; Mullins, Glenda
2013-07-01
Participatory formative research guided the creation of a culturally tailored educational brochure about human papillomavirus (HPV) at an American Indian women's clinic. A review of existing educational materials and in-depth interviews were conducted. Nine steps for creating health communications messages that were patterned after National Cancer Institute guidelines guided the brochure development process. Of 95 women tested for HPV, 41% were positive, 32 (34%) agreed to the in-depth interview, and 9 agreed to the pretesting interview. Mean age was 41 years. Interviews revealed key themes concerning emotional reactions to abnormal Pap test results and HPV; need for basic information about HPV, Pap tests, and results; concerns about HPV stigma, sexual transmission, and communication with sexual partner; and the preferred source and format for HPV educational materials. A literature review revealed 12 areas of basic HPV content. A participatory process successfully engaged nursing staff and patients in creating culturally appropriate brochures for clinic use. This article provides specific steps for creating culturally tailored patient education materials.
SeaWiFS Science Algorithm Flow Chart
NASA Technical Reports Server (NTRS)
Darzi, Michael
1998-01-01
This flow chart describes the baseline science algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Data Processing System (SDPS). As such, it includes only processing steps used in the generation of the operational products that are archived by NASA's Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC). It is meant to provide the reader with a basic understanding of the scientific algorithm steps applied to SeaWiFS data. It does not include non-science steps, such as format conversions, and places the greatest emphasis on the geophysical calculations of the level-2 processing. Finally, the flow chart reflects the logic sequences and the conditional tests of the software so that it may be used to evaluate the fidelity of the implementation of the scientific algorithm. In many cases however, the chart may deviate from the details of the software implementation so as to simplify the presentation.
Kinetic concepts of thermally stimulated reactions in solids
NASA Astrophysics Data System (ADS)
Vyazovkin, Sergey
Historical analysis suggests that the basic kinetic concepts of reactions in solids were inherited from homogeneous kinetics. These concepts rest upon the assumption of a single-step reaction that disagrees with the multiple-step nature of solid-state processes. The inadequate concepts inspire such unjustified anticipations of kinetic analysis as evaluating constant activation energy and/or deriving a single-step reaction mechanism for the overall process. A more adequate concept is that of the effective activation energy, which may vary with temperature and extent of conversion. The adequacy of this concept is illustrated by literature data as well as by experimental data on the thermal dehydration of calcium oxalate monohydrate and thermal decomposition of calcium carbonate, ammonium nitrate and 1,3,5,7- tetranitro-1,3,5,7-tetrazocine.
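The effective (conversion-dependent) activation energy concept is commonly made operational through isoconversional analysis; for instance, the Friedman relation evaluates E_alpha from the slope of ln(dα/dt) versus 1/T at a fixed extent of conversion α across several temperature programs. In LaTeX form (a standard relation in this field, not a formula quoted from the abstract):

```latex
\ln\left(\frac{d\alpha}{dt}\right)_{\alpha}
  = \ln\left[A_{\alpha}\, f(\alpha)\right] - \frac{E_{\alpha}}{R\, T_{\alpha}}
```

Plotting ln(dα/dt)_α against 1/T_α at each conversion level yields −E_α/R as the slope, so a variation of E_α with α signals a multi-step process.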
Selection of species and sampling areas: The importance of inference
Paul Stephen Corn
2009-01-01
Inductive inference, the process of drawing general conclusions from specific observations, is fundamental to the scientific method. Platt (1964) termed conclusions obtained through rigorous application of the scientific method "strong inference" and noted the following basic steps: generating alternative hypotheses; devising experiments, the...
ERIC Educational Resources Information Center
Gouran, Dennis S.
2009-01-01
This article presents the author's response to Professor Hewes's "The Influence of Communication Processes on Group Outcomes: Antithesis and Thesis." The author believes that Hewes could have been more helpful to the reader and to those who are apt to find inspiration in the steps he has taken in his essay to promote a "return to basic theorizing…
Basic Casting from A to Z. Student's Instruction Booklet.
ERIC Educational Resources Information Center
Zebco, Tulsa, OK.
A profusely illustrated student instruction booklet contains step-by-step directions and diagrams for learning four basic casting techniques. Separate sections cover basic spin-casting, spinning, bait-casting, and fly-casting. Each section details recommended equipment (reel, rod, line, plug, tackle, lures, leaders, flies), describes specific…
Learning about Tasks Computers Can Perform. ERIC Digest.
ERIC Educational Resources Information Center
Brosnan, Patricia A.
Knowing what different kinds of computer equipment can do is the first step in choosing the computer that is right for you. This digest describes a developmental progression of computer capabilities. First, the three basic software programs (word processing, spreadsheets, and database programs) are discussed using examples. Next, an explanation of…
Public Fire Education. First Edition. IFSTA 606.
ERIC Educational Resources Information Center
Laughlin, Jerry W., Ed.; And Others
This manual was developed to give the firefighter basic knowledge concerning the problem of reaching the public with an educational program. Focusing on fire education planning, the first of seven chapters presents a five-step planning process that involves identification, selection, design, implementation, and evaluation. Chapter 2 presents a…
The New Guide to Utility Ratemaking.
ERIC Educational Resources Information Center
American Gas Association, Arlington, VA. Educational Services.
This booklet focuses on state regulations of gas, electricity, water, and telephone services. Section 1 describes the basic steps in a rate case, procedures followed, and key terms used in explaining these processes. Included information highlights preparing and tracking a rate case in terms of: (1) preliminary events; (2) the staff's position and…
ERIC Educational Resources Information Center
Work, Kirsten A.; Gibbs, Melissa A.; Friedman, Erich J.
2015-01-01
We describe a card game that helps introductory biology students understand the basics of the immune response to pathogens. Students simulate the steps of the immune response with cards that represent the pathogens and the cells and molecules mobilized by the immune system. In the process, they learn the similarities and differences between the…
Translation: Elements of a Craft.
ERIC Educational Resources Information Center
Heiderson, Mazin A.
An overview of the skills, techniques, tools, and compensation of language translators and interpreters is offered. It begins with a definition of translation and a brief history of translation in the western world. Basic principles of translation dating back to Roman writers are also outlined. A five-step process in producing a good translation…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisher, L.M.
Presently some methods of HTS-conductor processing are under study in the authors' laboratory; "powder-in-tube" (PIT), "jelly-roll", and electrophoresis are among them. The PIT process has been developed predominantly, in view of the achieved J_c values; Bi-2223 phase was used as the core material for these tapes. Since the main purpose of the task order was to enhance the development of long-length high-temperature superconductor tapes, the authors considered it reasonable to pursue the perfection of the PIT process step by step, or tape by tape. To realize this, they have chosen, while keeping the basic scheme of the PIT process stable, to vary several technological parameters, which are as follows: (1) type of initial powder; (2) sheath material; (3) tape construction (filament number, cross section, etc.); and (4) processing regimes. This report covers the fabrication process and the characteristics of the produced conductors.
An analysis of hydrogen production via closed-cycle schemes. [thermochemical processing from water
NASA Technical Reports Server (NTRS)
Chao, R. E.; Cox, K. E.
1975-01-01
A thermodynamic analysis and state-of-the-art review of three basic schemes for the production of hydrogen from water (electrolysis, thermal water-splitting, and multi-step thermochemical closed cycles) are presented. Criteria for work-saving thermochemical closed-cycle processes are established, and several schemes are reviewed in light of such criteria. An economic analysis is also presented in the context of energy costs.
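The thermodynamic yardstick common to all three schemes is the water-splitting reaction itself; the figures below are standard handbook values at 298 K, not numbers taken from this analysis:

```latex
\mathrm{H_2O(l)} \rightarrow \mathrm{H_2(g)} + \tfrac{1}{2}\,\mathrm{O_2(g)},
\qquad \Delta H^{\circ} \approx 286\ \mathrm{kJ\,mol^{-1}},
\qquad \Delta G^{\circ} \approx 237\ \mathrm{kJ\,mol^{-1}}
```

Electrolysis must supply at least ΔG° as work, while a work-saving thermochemical cycle aims to deliver the remaining TΔS° ≈ 49 kJ/mol as heat rather than electricity.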
5 CFR 531.406 - Creditable service.
Code of Federal Regulations, 2010 CFR
2010-01-01
... pay is equal to or greater than the rate of basic pay for step 4 of the applicable grade and less than... period for an employee whose rate of basic pay is equal to or greater than the rate of basic pay for step....406 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY UNDER THE...
Research in the perianesthesia setting: the basics of getting started.
Myers, Gina; Kosinski, Michele
2005-02-01
Research can be defined as a process that systematically investigates a situation with the objective of expanding the existing knowledge of a profession. Research asks the question "Does what we do as nurses help or hinder?" The purpose of this article is to provide a brief history of nursing research and to review basic research methods. In addition, examples of potential research projects focused in the perianesthesia practice arena will be explored. Practical steps will be outlined to guide novice nurse researchers in their early endeavors.
Transit scheduling: Basic and advanced manuals. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pine, R.; Niemeyer, J.; Chisholm, R.
1998-12-01
This manual will be of interest to new transit schedulers, experienced schedulers, transit planners, operating staff, and others who need to be conversant with the scheduling process. The materials clearly describe all steps in the bus and light rail scheduling process. Under TCRP Project A-11, Transit Scheduling: A Manual with Materials, research was undertaken by Transportation Management and Design of Solana Beach, California, to prepare a transit scheduling manual that incorporates modern training techniques for bus and light rail transit scheduling. The manual consists of two sections: a basic treatment and an advanced section. The basic-level section is in an instructional format designed primarily for novice schedulers and other transit staff. The advanced section covers more complex scheduling requirements. Each section may be used sequentially or independently and is designed to integrate with agency apprenticeship and on-the-job training.
ERIC Educational Resources Information Center
Seguin, Barbara; Swanson, Lois
The Virginia STEPS (Student/Teacher Education Planning System) was developed to enable adult basic education (ABE) students to become independent learners responsible for planning, carrying out, evaluating, and making adjustments in their education. ABE instructors at Blackhawk Technical College in Wisconsin have adapted the STEPS model to make…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qiu, J; Washington University in St Louis, St Louis, MO; Li, H. Harlod
Purpose: In 2D RT patient setup images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and thus are inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal; 2) high-pass filtering by subtracting the Gaussian-smoothed result; and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip-limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
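A minimal Python/OpenCV sketch of the three processing steps described here; the parameter values, and the coarse grid search standing in for the interior-point optimization described in the abstract, are illustrative assumptions.

```python
import cv2
import numpy as np

def enhance(img: np.ndarray, w: float, clip: float, tiles: int,
            sigma: float = 5.0) -> np.ndarray:
    f = img.astype(np.float32)
    f[f < np.percentile(f, 1)] = 0.0                 # 1) background/noise removal
    hp = f - w * cv2.GaussianBlur(f, (0, 0), sigma)  # 2) subtract Gaussian-smoothed copy
    hp = cv2.normalize(hp, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=(tiles, tiles))
    return clahe.apply(hp)                           # 3) CLAHE equalization

def entropy(img: np.ndarray) -> float:
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def auto_enhance(img: np.ndarray) -> np.ndarray:
    # grid search over (weighting factor, clip limit, block count),
    # maximizing the entropy of the processed result
    grid = [(w, clip, tiles) for w in (0.5, 0.8, 1.0)
            for clip in (1.0, 2.0, 4.0) for tiles in (4, 8, 16)]
    best = max(grid, key=lambda p: entropy(enhance(img, *p)))
    return enhance(img, *best)
```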
How To Create and Conduct a Memory Enhancement Program.
ERIC Educational Resources Information Center
Meyer, Genevieve R.; Ober-Reynolds, Sharman
This report describes Memory Enhancement Group workshops which have been conducted at the Senior Health and Peer Counseling Center in Santa Monica, California and gives basic data regarding outcomes of the workshops. It provides a model of memory as a three-step process of registration or becoming aware, consolidation, and retrieval. It presents…
A Method of Synthesizing Large Bodies of Knowledge in the Social Sciences.
ERIC Educational Resources Information Center
Thiemann, Francis C.
Employing concepts of formal symbolic logic, the philosophy of science, computer technology, and the work of Hans Zetterberg, a format is suggested for synthesizing and increasing use of the rapidly expanding knowledge of the social sciences. Steps in the process include formulating basic propositions, utilizing computers to establish sets, and…
Shao-Yuan Leu; J.Y. Zhu
2013-01-01
Enzymatic saccharification of cellulose is a key step in conversion of plant biomass to advanced biofuel and chemicals. Many substrate-related factors affect saccharification. Rather than examining the role of each individual factor on overall saccharification efficiency, this study examined how each factor affects the three basic processes of a heterogeneous...
Implementing a Redesign Strategy: Lessons from Educational Change.
ERIC Educational Resources Information Center
Basom, Richard E., Jr.; Crandall, David P.
The effective implementation of school redesign, based on a social systems approach, is discussed in this paper. A basic assumption is that the interdependence of system elements has implications for a complex change process. Seven barriers to redesign and five critical issues for successful redesign strategy are presented. Seven linear steps for…
Behavioral Talk-Write as a Method for Teaching Technical Editing.
ERIC Educational Resources Information Center
Gilbertsen, Michael; Killingsworth, M. Jimmie
1987-01-01
Presents a process-oriented method for teachers of stylistic editing workshops that allows them to (1) focus on individual students, (2) start with students' basic repertory of responses and build from there, (3) work with freely emitted behavior, (4) ensure frequent and brief responses, and (5) achieve desired behavior through sequential steps.…
A Guide to Networking for K-12 Schools.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
The purpose of this guide is to provide basic networking information and planning assistance for technology coordinators and others involved in building networks for K-12 schools. The information in this guide focuses on the first few steps in the networking process. It reviews planning considerations and network design issues facing educators who…
Pre-School Students' Informal Acquisitions Regarding the Concepts of Point and Straight Line
ERIC Educational Resources Information Center
Orbay, Keziban; Develi, Mehmet Hikmet
2015-01-01
This study aimed to investigate the informal cognitive structures regarding "point" and "straight line"--two basic and undefined terms of geometry--in children registered in preschool, the step preceding the formal in-class education process. The study was conducted with the participation of 50 children enrolled in nursery,
Budgeting: The Basics and Beyond. Learn at Home.
ERIC Educational Resources Information Center
Prochaska-Cue, Kathy; Sugden, Marilyn
Designed as an at-home course to help users develop a realistic budget plan and set up a workable record-keeping system, these course materials provide practical tips, ideas, and suggestions for budgeting. The course begins with a nine-step budgeting process which emphasizes communicating among family members, considering personal or family…
Composting. Sludge Treatment and Disposal Course #166. Instructor's Guide [and] Student Workbook.
ERIC Educational Resources Information Center
Arasmith, E. E.
Composting is a lesson developed for a sludge treatment and disposal course. The lesson discusses the basic theory of composting and the basic operation, in a step-by-step sequence, of the two typical composting procedures: windrow and forced air static pile. The lesson then covers basic monitoring and operational procedures. The instructor's…
Determining the Number of Clusters in a Data Set Without Graphical Interpretation
NASA Technical Reports Server (NTRS)
Aguirre, Nathan S.; Davies, Misty D.
2011-01-01
Cluster analysis is a data mining technique meant to simplify the process of classifying data points. The basic clustering process requires as input the data points and the desired number of clusters. The clustering algorithm then picks C starting points for the clusters, which can be either random spatial points or random data points. It then assigns each data point to the nearest C point, where "nearest" usually means Euclidean distance, but some algorithms use another criterion. The next step is determining whether the clustering arrangement thus found is within a certain tolerance. If it falls within this tolerance, the process ends. Otherwise the C points are adjusted based on how many data points are in each cluster, and the steps repeat until the algorithm converges.
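The loop described here is essentially Lloyd's k-means algorithm. A minimal numpy sketch under those stated assumptions (random data points as starting centers, Euclidean distance, a convergence tolerance):

```python
import numpy as np

def kmeans(points: np.ndarray, c: int, tol: float = 1e-4, seed: int = 0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=c, replace=False)]
    while True:
        # assign each data point to the nearest center (Euclidean distance)
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # adjust each center toward the mean of the points assigned to it
        new = np.array([points[labels == k].mean(axis=0) if (labels == k).any()
                        else centers[k] for k in range(c)])
        if np.linalg.norm(new - centers) < tol:   # within tolerance: stop
            return labels, new
        centers = new
```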
Yamada, Minoru; Aoyama, Tomoki; Nakamura, Masatoshi; Tanaka, Buichi; Nagai, Koutatsu; Tatematsu, Noriatsu; Uemura, Kazuki; Nakamura, Takashi; Tsuboyama, Tadao; Ichihashi, Noriaki
2011-01-01
The purpose of this study was to examine whether the Nintendo Wii Fit program could be used for fall risk assessment in healthy, community-dwelling older adults. Forty-five community-dwelling older women participated in this study. The "Basic Step" and "Ski Slalom" modules were selected from the Wii Fit game program. The following 5 physical performance tests were performed: the 10-m walk test under single- and dual-task conditions, the Timed Up and Go test under single- and dual-task conditions, and the Functional Reach test. Compared with the faller group, the nonfaller group showed a significant difference in the Basic Step (P < .001) and a nonsignificant difference in the Ski Slalom (P = .453). The discriminating criterion between the 2 groups was a score of 111 points on the Basic Step (P < .001). The Basic Step showed statistically significant, moderate correlations with the dual-task lag of walking (r = -.547) and the dual-task lag of the Timed Up and Go test (r = -.688). These results suggest that game-based fall risk assessment using the Basic Step has high generality and is useful in community-dwelling older adults. Copyright © 2011 Mosby, Inc. All rights reserved.
Refining of metallurgical-grade silicon
NASA Technical Reports Server (NTRS)
Dietl, J.
1986-01-01
A basic requirement of large-scale solar cell fabrication is to provide low-cost base material. Unconventional refining of metallurgical-grade silicon represents one of the most promising ways of silicon meltstock processing. The refining concept is based on an optimized combination of metallurgical treatments. Commercially available crude silicon, in this sequence, requires a first pyrometallurgical step of slagging or, alternatively, solvent extraction by aluminum. After grinding and leaching, high-purity quality is gained as an advanced stage of refinement. To reach solar-grade quality, a final pyrometallurgical step is needed: liquid-gas extraction.
Noise Enhances Action Potential Generation in Mouse Sensory Neurons via Stochastic Resonance.
Onorato, Irene; D'Alessandro, Giuseppina; Di Castro, Maria Amalia; Renzi, Massimiliano; Dobrowolny, Gabriella; Musarò, Antonio; Salvetti, Marco; Limatola, Cristina; Crisanti, Andrea; Grassi, Francesca
2016-01-01
Noise can enhance perception of tactile and proprioceptive stimuli by stochastic resonance processes. However, the mechanisms underlying this general phenomenon remain to be characterized. Here we studied how externally applied noise influences action potential firing in mouse primary sensory neurons of dorsal root ganglia, modelling a basic process in sensory perception. Since noisy mechanical stimuli may cause stochastic fluctuations in receptor potential, we examined the effects of sub-threshold depolarizing current steps with superimposed random fluctuations. We performed whole cell patch clamp recordings in cultured neurons of mouse dorsal root ganglia. Noise was added either before and during the step, or during the depolarizing step only, to focus onto the specific effects of external noise on action potential generation. In both cases, step + noise stimuli triggered significantly more action potentials than steps alone. The normalized power norm had a clear peak at intermediate noise levels, demonstrating that the phenomenon is driven by stochastic resonance. Spikes evoked in step + noise trials occur earlier and show faster rise time as compared to the occasional ones elicited by steps alone. These data suggest that external noise enhances, via stochastic resonance, the recruitment of transient voltage-gated Na channels, responsible for action potential firing in response to rapid step-wise depolarizing currents.
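The core observation, noise pushing a sub-threshold step over the firing threshold, can be reproduced with a toy leaky integrate-and-fire cell. The Python sketch below is a generic illustration with made-up parameters, not the authors' recording or analysis protocol:

```python
import numpy as np

def lif_spike_count(step_amp: float, noise_sd: float, seed: int = 0) -> int:
    """Spikes of a leaky integrate-and-fire unit during a 0.6 s current step."""
    rng = np.random.default_rng(seed)
    dt, tau, v_th = 1e-4, 0.02, 1.0
    v, spikes = 0.0, 0
    for _ in range(6000):                        # step window only
        drive = step_amp + noise_sd * rng.standard_normal()
        v += dt / tau * (-v + drive)             # leaky integration
        if v >= v_th:                            # threshold crossing: spike
            spikes += 1
            v = 0.0                              # reset
    return spikes

# a sub-threshold step (0.9 < threshold 1.0) fires only once noise is added
print([lif_spike_count(0.9, sd) for sd in (0.0, 0.5, 1.0, 2.0)])
```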
Toth, Tibor Istvan; Grabowska, Martyna; Schmidt, Joachim; Büschges, Ansgar; Daun-Gruhn, Silvia
2013-01-01
Stop and start of stepping are two basic actions of the musculo-skeletal system of a leg. Although they are basic phenomena, they require the coordinated activities of the leg muscles. However, little is known of the details of how these activities are generated by the interactions between the local neuronal networks controlling the fast and slow muscle fibres at the individual leg joints. In the present work, we aim at uncovering some of those details using a suitable neuro-mechanical model. It is an extension of the model in the accompanying paper and now includes all three antagonistic muscle pairs of the main joints of an insect leg, together with their dedicated neuronal control, as well as common inhibitory motoneurons and the residual stiffness of the slow muscles. This model enabled us to study putative processes of intra-leg coordination during stop and start of stepping. We also made use of the effects of sensory signals encoding the position and velocity of the leg joints. Where experimental observations are available, the corresponding simulation results are in good agreement with them. Our model makes detailed predictions as to the coordination processes of the individual muscle systems both at stop and start of stepping. In particular, it reveals a possible role of the slow muscle fibres at stop in accelerating the convergence of the leg to its steady-state position. These findings lend our model physiological relevance and can therefore be used to elucidate details of the stop and start of stepping in insects, and perhaps in other animals, too. PMID:24278108
The contribution of temporary storage and executive processes to category learning.
Wang, Tengfei; Ren, Xuezhu; Schweizer, Karl
2015-09-01
Three distinctly different working memory processes, temporary storage, mental shifting, and inhibition, were proposed to account for individual differences in category learning. A sample of 213 participants completed a classic category learning task and two working memory tasks that were experimentally manipulated to tap specific working memory processes. Fixed-links models were used to decompose data of the category learning task into two independent components representing basic performance and improvement in performance in category learning. Processes of working memory were also represented by fixed-links models. In the next step, the three working memory processes were linked to components of category learning. Results from the modeling analyses indicated that temporary storage had a significant effect on basic performance and shifting had a moderate effect on improvement in performance. In contrast, inhibition showed no effect on any component of the category learning task. These results suggest that temporary storage and the shifting process play different roles in the course of acquiring new categories. Copyright © 2015 Elsevier B.V. All rights reserved.
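The decomposition at the heart of the fixed-links approach can be imitated with ordinary regression: each person's block-by-block learning scores are projected onto fixed loadings (a constant for basic performance, a linear trend for improvement), and the resulting person-level components are related to a working memory score. The sketch below uses simulated data; the actual study estimated these components as latent variables in a structural equation model, so this is only a schematic of the idea.

    import numpy as np

    rng = np.random.default_rng(7)
    n, blocks = 213, 6
    t = np.arange(blocks)
    basic = rng.normal(0.6, 0.1, n)              # latent basic performance
    gain = rng.normal(0.05, 0.02, n)             # latent improvement rate
    scores = basic[:, None] + gain[:, None] * t + rng.normal(0, 0.05, (n, blocks))

    X = np.column_stack([np.ones(blocks), t])    # fixed loadings: 1 and t
    coef, *_ = np.linalg.lstsq(X, scores.T, rcond=None)
    est_basic, est_gain = coef                   # per-person component estimates

    storage = basic * 2 + rng.normal(0, 0.15, n) # simulated WM storage score
    print("r(storage, basic):", round(np.corrcoef(storage, est_basic)[0, 1], 2))
    print("r(storage, gain): ", round(np.corrcoef(storage, est_gain)[0, 1], 2))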
You Can't Come to My Birthday Party! Conflict Resolution with Young Children.
ERIC Educational Resources Information Center
Evans, Betsy
Noting that many teachers and parents are baffled by the repetitiveness of young children's conflict and by their own reaction to it, this book describes how adults can help children find alternatives to hurtful words and fighting by settling differences through a six-step mediation process based on several basic adult-child interaction…
How We Make Energy Work: Grades 4, 5, 6 Science.
ERIC Educational Resources Information Center
National Science Teachers Association, Washington, DC.
This packet of units is designed to focus on the technological aspects of energy. Four units are presented, with from 1-4 lessons included in each unit. Units include: (1) basic concepts and applications of energy; (2) steps and processes of energy production and transmission; (3) fuel acquisition; and (4) energy futures and application of…
Generalizations Related to Concepts Important for Youth Orientation to the World of Work.
ERIC Educational Resources Information Center
Warren, Mary A.; And Others
A basic first step in building a curriculum contributing to the orientation of youth to world of work is identification of concepts important to that orientation. In this study, the generalizations within the concept framework were identified through a developmental process of analysis and synthesis, including a review of current literature, a…
ERIC Educational Resources Information Center
Warger, Cynthia
This digest offers guidelines to help teachers prepare students with disabilities to succeed on state and district writing assessments. Teachers are urged to use the three principles of effective writing instruction: (1) use a basic framework of planning, writing, and revision; (2) instruct students in steps of the writing process and the features…
Free energy of steps using atomistic simulations
NASA Astrophysics Data System (ADS)
Freitas, Rodrigo; Frolov, Timofey; Asta, Mark
The properties of solid-liquid interfaces are known to play critical roles in solidification processes. Particular importance is given to thermodynamic quantities that describe the equilibrium state of these surfaces. For example, in the solid-liquid-vapor heteroepitaxial growth of semiconductor nanowires, the crystal nucleation process on the faceted solid-liquid interface is influenced by the solid-liquid and vapor-solid interfacial free energies, and also by the free energies of the associated steps at these faceted interfaces. Crystal-growth theories and mesoscale simulation methods depend on quantitative information about these properties, which are often poorly characterized by experimental measurements. In this work we propose an extension of the capillary fluctuation method for calculating the free energy of steps on faceted crystal surfaces. From equilibrium atomistic simulations of steps on (111) surfaces of copper, we accurately computed the step free energy for different step orientations. We show that the step free energy remains finite at all temperatures up to the melting point and that the results agree with the more established method of thermodynamic integration when finite size effects are taken into account. The research of RF and MA at UC Berkeley was supported by the US National Science Foundation (Grant No. DMR-1105409). TF acknowledges support through a postdoctoral fellowship from the Miller Institute for Basic Research in Science.
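For reference, the capillary fluctuation method infers step properties from the equilibrium spectrum of step-position fluctuations. In one common convention (an assumption here, since prefactors vary with the Fourier normalization), the Fourier amplitudes A(k) of the displacement of a step of projected length L satisfy

    \langle |A(k)|^2 \rangle = \frac{k_B T}{L \, \tilde{\beta} \, k^2},
    \qquad
    \tilde{\beta}(\theta) = \beta(\theta) + \frac{d^2 \beta}{d \theta^2},

where \beta(\theta) is the orientation-dependent step free energy per unit length and \tilde{\beta} is the step stiffness; fitting the measured spectrum against 1/k^2 yields the stiffness, from which \beta(\theta) can be reconstructed.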
Editorial Commentary: A Model for Shoulder Rotator Cuff Repair and for Basic Science Investigations.
Brand, Jefferson C
2018-04-01
"Breaking the fourth wall" is a theater convention where the narrator or character speaks directly to the audience. As an Assistant Editor-in-Chief, as I comment on a recent basic science study investigating rotator cuff repair, I break the fourth wall and articulate areas of basic science research excellence that align with the vision that we hold for our journal. Inclusion of a powerful video strengthens the submission. We prefer to publish clinical videos in our companion journal, Arthroscopy Techniques, and encourage basic science video submissions to Arthroscopy. Basic science research requires step-by-tedious-step analogous to climbing a mountain. Establishment of a murine rotator cuff repair model was rigorous and research intensive, biomechanically, radiographically, histologically, and genetically documented, a huge step toward the bone-to-tendon healing research summit. This research results in a model for both rotator cuff repair and the pinnacle of quality, basic science research. Copyright © 2018 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Luo, Y.; Nissen-Meyer, T.; Morency, C.; Tromp, J.
2008-12-01
Seismic imaging in the exploration industry is often based upon ray-theoretical migration techniques (e.g., Kirchhoff) or other ideas which neglect some fraction of the seismic wavefield (e.g., wavefield continuation for acoustic-wave first arrivals) in the inversion process. In a companion paper we discuss the possibility of solving the full physical forward problem (i.e., including visco- and poroelastic, anisotropic media) using the spectral-element method. With such a tool at hand, we can readily apply the adjoint method to tomographic inversions, i.e., iteratively improving an initial 3D background model to fit the data. In the context of this inversion process, we draw connections between kernels in adjoint tomography and basic imaging principles in migration. We show that the images obtained by migration are nothing but particular kinds of adjoint kernels (mainly density kernels). Migration is basically a first step in the iterative inversion process of adjoint tomography. We apply the approach to basic 2D problems involving layered structures, overthrusting faults, topography, salt domes, and poroelastic regions.
Billoir, Elise; Denis, Jean-Baptiste; Cammeau, Natalie; Cornu, Marie; Zuliani, Veronique
2011-02-01
To assess the impact of the manufacturing process on the fate of Listeria monocytogenes, we built a generic probabilistic model intended to simulate the successive steps in the process. Contamination evolution was modeled in the appropriate units (breasts, dice, and then packaging units through the successive steps in the process). To calibrate the model, parameter values were estimated from industrial data, from the literature, and based on expert opinion. By means of simulations, the model was explored using a baseline calibration and alternative scenarios, in order to assess the impact of changes in the process and of accidental events. The results are reported as contamination distributions and as the probability that the product will be acceptable with regard to the European regulatory safety criterion. Our results are consistent with data provided by industrial partners and highlight that tumbling is a key step for the distribution of the contamination at the end of the process. Process chain models could provide an important added value for risk assessment models that basically consider only the outputs of the process in their risk mitigation strategies. Moreover, a model calibrated to correspond to a specific plant could be used to optimize surveillance. © 2010 Society for Risk Analysis.
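The modeling idea lends itself to a compact Monte Carlo sketch. Below, contamination is propagated through successive process steps as log10 changes; the step names echo the abstract, but every numerical parameter is invented for illustration rather than taken from the authors' industrial calibration. The 100 CFU/g threshold is the European ready-to-eat safety criterion for L. monocytogenes.

    import numpy as np

    rng = np.random.default_rng(1)
    n_lots = 100_000
    steps = {            # mean, sd of log10 CFU/g change per step (assumed)
        "tumbling":  (0.8, 0.6),
        "dicing":    (0.2, 0.3),
        "packaging": (0.1, 0.2),
    }

    log_conc = rng.normal(-1.0, 0.8, n_lots)   # initial log10 CFU/g (assumed)
    for name, (mu, sd) in steps.items():
        log_conc += rng.normal(mu, sd, n_lots) # contamination change per step

    # EU criterion for L. monocytogenes in ready-to-eat food: 100 CFU/g
    p_ok = np.mean(log_conc <= 2.0)
    print(f"P(product meets the 100 CFU/g criterion) = {p_ok:.3f}")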
Fine Tuning Cell Migration by a Disintegrin and Metalloproteinases
Theodorou, K.
2017-01-01
Cell migration is an instrumental process involved in organ development, tissue homeostasis, and various physiological processes and also in numerous pathologies. Both basic cell migration and migration towards chemotactic stimulus consist of changes in cell polarity and cytoskeletal rearrangement, cell detachment from, invasion through, and reattachment to their neighboring cells, and numerous interactions with the extracellular matrix. The different steps of immune cell, tissue cell, or cancer cell migration are tightly coordinated in time and place by growth factors, cytokines/chemokines, adhesion molecules, and receptors for these ligands. This review describes how a disintegrin and metalloproteinases interfere with several steps of cell migration, either by proteolytic cleavage of such molecules or by functions independent of proteolytic activity. PMID:28260841
Recovery and purification process development for monoclonal antibody production
Ma, Junfen; Winter, Charles; Bayer, Robert
2010-01-01
Hundreds of therapeutic monoclonal antibodies (mAbs) are currently in development, and many companies have multiple antibodies in their pipelines. Current methodology used in recovery processes for these molecules is reviewed here. Basic unit operations such as harvest, Protein A affinity chromatography and additional polishing steps are surveyed. Alternative processes such as flocculation, precipitation and membrane chromatography are discussed. We also cover platform approaches to purification methods development, use of high throughput screening methods, and offer a view on future developments in purification methodology as applied to mAbs. PMID:20647768
Identifying Opportunities for Vertical Integration of Biochemistry and Clinical Medicine.
Wendelberger, Karen J.; Burke, Rebecca; Haas, Arthur L.; Harenwattananon, Marisa; Simpson, Deborah
1998-01-01
Objectives: Retention of basic science knowledge, as judged by National Board of Medical Examiners' (NBME) data, suffers due to lack of apparent relevance and isolation of instruction from clinical application, especially in biochemistry. However, the literature reveals no systematic process for identifying key biochemical concepts and associated clinical conditions. This study systematically identified difficult biochemical concepts and their common clinical conditions as a critical step towards enhancing relevance and retention of biochemistry. Methods: A multi-step/multiple-stakeholder process was used to: (1) identify important biochemistry concepts; (2) determine students' perceptions of concept difficulty; (3) assess biochemistry faculty, student, and clinical teaching scholars' perceived relevance of identified concepts; and (4) identify associated common clinical conditions for relevant and difficult concepts. Surveys and a modified Delphi process were used to gather data, subsequently analyzed using SPSS for Windows. Results: Sixteen key biochemical concepts were identified. Second year medical students rated 14/16 concepts as extremely difficult while fourth year students rated nine concepts as moderately to extremely difficult. On average, each teaching scholar generated common clinical conditions for 6.2 of the 16 concepts, yielding a set of seven critical concepts and associated clinical conditions. Conclusions: Key stakeholders in the instructional process struggle to identify biochemistry concepts that are critical, difficult to learn and associated with common clinical conditions. However, through a systematic process beginning with identification of concepts and associated clinical conditions, relevance of basic science instruction can be enhanced.
VICAR image processing system guide to system use
NASA Technical Reports Server (NTRS)
Seidman, J. B.
1977-01-01
The functional characteristics and operating requirements of the VICAR (Video Image Communication and Retrieval) system are described. An introduction to the system describes the functional characteristics and the basic theory of operation. A brief description of the data flow as well as tape and disk formats is also presented. A formal presentation of the control statement formats is given along with a guide to usage of the system. The guide provides a step-by-step reference to the creation of a VICAR control card deck. Simple examples are employed to illustrate the various options and the system response thereto.
Abuhamad, Alfred; Zhao, Yili; Abuhamad, Sharon; Sinkovskaya, Elena; Rao, Rashmi; Kanaan, Camille; Platt, Lawrence
2016-01-01
This study aims to validate the feasibility and accuracy of a new standardized six-step approach to the performance of the focused basic obstetric ultrasound examination, and compare the new approach to the regular approach performed in the scheduled obstetric ultrasound examination. A new standardized six-step approach to the performance of the focused basic obstetric ultrasound examination, to evaluate fetal presentation, fetal cardiac activity, presence of multiple pregnancy, placental localization, amniotic fluid volume evaluation, and biometric measurements, was prospectively performed on 100 pregnant women between 18+0 and 27+6 weeks of gestation and another 100 pregnant women between 28+0 and 36+6 weeks of gestation. The agreement of findings for each of the six steps of the standardized six-step approach was evaluated against the regular approach. In all ultrasound examinations performed, substantial to perfect agreement (Kappa value between 0.64 and 1.00) was observed between the new standardized six-step approach and the regular approach. The new standardized six-step approach to the focused basic obstetric ultrasound examination can be performed successfully and accurately between 18+0 and 36+6 weeks of gestation. This standardized approach can be of significant benefit to limited resource settings and in point of care obstetric ultrasound applications. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
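Agreement in such validation studies is typically quantified with Cohen's kappa, which rescales observed agreement by the agreement expected by chance. A minimal computation, with made-up paired findings for one of the six steps (fetal presentation), might look like this:

    from collections import Counter

    def cohens_kappa(a, b):
        # kappa = (observed agreement - chance agreement) / (1 - chance)
        n = len(a)
        labels = sorted(set(a) | set(b))
        p_obs = sum(x == y for x, y in zip(a, b)) / n
        ca, cb = Counter(a), Counter(b)
        p_exp = sum(ca[l] * cb[l] for l in labels) / n**2
        return (p_obs - p_exp) / (1 - p_exp)

    # Invented paired findings: six-step approach vs regular approach
    six_step = ["cephalic"] * 88 + ["breech"] * 10 + ["cephalic"] * 2
    regular  = ["cephalic"] * 88 + ["breech"] * 10 + ["breech"] * 2
    print(f"kappa = {cohens_kappa(six_step, regular):.2f}")   # ~0.90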
ERIC Educational Resources Information Center
Busch, Phyllis S.
1985-01-01
Provides directions for basic science experiments which demonstrate the rain cycle, fundamentals of cloud formation, and testing for the presence of acidity in local rainwater. Describes materials required, step-by-step instructions, and discussion topics. (NEC)
Antiviral drug research proposal activity.
Injaian, Lisa; Smith, Ann C; Shipley, Jennifer German; Marbach-Ad, Gili; Fredericksen, Brenda
2011-01-01
The development of antiviral drugs provides an excellent example of how basic and clinical research must be used together in order to achieve the final goal of treating disease. A Research Oriented Learning Activity was designed to help students to better understand how basic and clinical research can be combined toward a common goal. Through this project students gained a better understanding of the process of scientific research and increased their information literacy in the field of virology. The students worked as teams to research the many aspects involved in the antiviral drug design process, with each student becoming an "expert" in one aspect of the project. The Antiviral Drug Research Proposal (ADRP) culminated with students presenting their proposals to their peers and local virologists in a poster session. Assessment data showed increased student awareness and knowledge of the research process and the steps involved in the development of antiviral drugs as a result of this activity.
Alfadeel, Mona A; Hamid, Yassin H M; El Fadeel, Ogail Ata; Salih, Karimeldin M A
2015-01-01
The objectives of this study are to identify the availability of the service logistics in basic public schools (structure as a quality concept), to assess steps of physical examination according to the ministry of health guidelines (process as a quality concept), and to measure satisfaction of service consumers (pupils) and service providers (teachers and doctors). The study involved seven localities in Sudan using questionnaires and observations. The structure, in the form of material and human resources, was not well maintained; equally, the process and procedure of medical examination did not fit well with the rules of quality. However, the satisfaction level was within the accepted level. As far as structure, process and outcome are concerned, we are still below the standards of developed countries for many reasons, but the level of satisfaction in the present study is more or less similar to that in other studies.
Antiviral Drug Research Proposal Activity †
Injaian, Lisa; Smith, Ann C.; Shipley, Jennifer German; Marbach-Ad, Gili; Fredericksen, Brenda
2011-01-01
The development of antiviral drugs provides an excellent example of how basic and clinical research must be used together in order to achieve the final goal of treating disease. A Research Oriented Learning Activity was designed to help students to better understand how basic and clinical research can be combined toward a common goal. Through this project students gained a better understanding of the process of scientific research and increased their information literacy in the field of virology. The students worked as teams to research the many aspects involved in the antiviral drug design process, with each student becoming an “expert” in one aspect of the project. The Antiviral Drug Research Proposal (ADRP) culminated with students presenting their proposals to their peers and local virologists in a poster session. Assessment data showed increased student awareness and knowledge of the research process and the steps involved in the development of antiviral drugs as a result of this activity. PMID:23653735
NASA Technical Reports Server (NTRS)
Duxbury, J. H.
1983-01-01
JPL's Scientific Data Analysis System (SDAS), which will process IRAS data and produce a catalogue of perhaps a million infrared sources in the sky, as well as other information for astronomical records, is described. The purposes of SDAS are discussed, and the major SDAS processors are shown in block diagram. The catalogue processing is addressed, mentioning the basic processing steps which will be applied to raw detector data. Signal reconstruction and conversion to astrophysical units, source detection, source confirmation, data management, and survey data products are considered in detail.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osmanlioglu, Ahmet Erdal
Pre-treatment of radioactive waste is the first step in the waste management program that occurs after waste generation from various applications in Turkey. Pre-treatment and characterization practices are carried out in the Radioactive Waste Management Unit (RWMU) at Cekmece Nuclear Research and Training Center (CNRTC) in Istanbul. This facility has been assigned to take all low-level radioactive wastes generated by nuclear applications in Turkey. The wastes are generated from research and nuclear applications mainly in medicine, biology, agriculture, quality control in metal processing and construction industries. These wastes are classified as low-level radioactive wastes. Pre-treatment practices cover several steps. In this paper, the main steps of pre-treatment and characterization are presented. Basically these are: collection, segregation, chemical adjustment, size reduction and decontamination operations. (author)
ERIC Educational Resources Information Center
Burns, E. Robert; Garrett, Judy
2015-01-01
Correlates of achievement in the basic science years in medical school and on the Step 1 of the United States Medical Licensing Examination® (USMLE®), (Step 1) in relation to preadmission variables have been the subject of considerable study. Preadmissions variables such as the undergraduate grade point average (uGPA) and Medical College Admission…
ERIC Educational Resources Information Center
Secor, John R.
Often when total quality management (TQM) does not live up to expectations, that failure is a sign that implementation of TQM was simply fashionable "management hype" or "window dressing" without strong organizational underpinnings. TQM can have staying power when it is backed up by leadership basics of training people…
Tests That Work: Designing and Delivering Fair and Practical Measurement Tools in the Workplace.
ERIC Educational Resources Information Center
Westgaard, Odin
This guide shows organization managers how to use tests to assess skills and values in the workplace, as well as how to develop good, fair tests without needing any other resources. Part 1, chapters 1 through 5, presents basic information about tests and their practical applications. Part 2 describes the 15 steps of the testing process. The…
TagDust2: a generic method to extract reads from sequencing data.
Lassmann, Timo
2015-01-28
Arguably the most basic step in the analysis of next generation sequencing (NGS) data involves the extraction of mappable reads from the raw reads produced by sequencing instruments. The presence of barcodes, adaptors and artifacts subject to sequencing errors makes this step non-trivial. Here I present TagDust2, a generic approach utilizing a library of hidden Markov models (HMM) to accurately extract reads from a wide array of possible read architectures. TagDust2 extracts more reads of higher quality compared to other approaches. Processing of multiplexed single end, paired end and libraries containing unique molecular identifiers is fully supported. Two additional post-processing steps are included to exclude known contaminants and filter out low complexity sequences. Finally, TagDust2 can automatically detect the library type of sequenced data from a predefined selection. Taken together, TagDust2 is a feature-rich, flexible and adaptive solution to go from raw to mappable NGS reads in a single step. The ability to recognize and record the contents of raw reads will help to automate and demystify the initial, and often poorly documented, steps in NGS data analysis pipelines. TagDust2 is freely available at: http://tagdust.sourceforge.net.
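TagDust2 itself implements the architecture match with a library of HMMs; as rough intuition for what "extracting reads from a read architecture" means, here is a deliberately simplified sketch that splits reads into barcode, insert, and adaptor with a fixed mismatch tolerance. The barcode table is invented, and only the adaptor string is the standard Illumina sequence; this is not TagDust2's algorithm or interface.

    # Toy read-architecture extraction: barcode + insert + adaptor.
    # TagDust2 handles errors probabilistically with HMMs; this sketch
    # only allows a fixed Hamming-distance tolerance.
    BARCODES = {"ACGT": "sample1", "TGCA": "sample2"}   # invented
    ADAPTOR = "AGATCGGAAGAGC"                           # Illumina adaptor start

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def extract(read, max_mm=1):
        bc, rest = read[:4], read[4:]
        sample = next((s for b, s in BARCODES.items()
                       if hamming(bc, b) <= max_mm), None)
        if sample is None:
            return None                    # unextractable: barcode unmatched
        for i in range(len(rest) - len(ADAPTOR) + 1):
            if hamming(rest[i:i + len(ADAPTOR)], ADAPTOR) <= max_mm:
                return sample, rest[:i]    # mappable insert, adaptor trimmed
        return sample, rest                # no adaptor found: keep full read

    print(extract("ACGTTTTTGGGGCCCCAGATCGGAAGAGC"))  # ('sample1', 'TTTTGGGGCCCC')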
Addressing problems of employee performance.
McConnell, Charles R
2011-01-01
Employee performance problems are essentially of 2 kinds: those that are motivational in origin and those resulting from skill deficiencies. Both kinds of problems are the province of the department manager. Performance problems differ from problems of conduct in that traditional disciplinary processes ordinarily do not apply. Rather, performance problems are addressed through educational and remedial processes. The manager has a basic responsibility in ensuring that everything reasonable is done to help each employee succeed. There are a number of steps the manager can take to address employee performance problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudsen, J.K.; Smith, C.L.
The steps involved to incorporate parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP model uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that are comprised of more than one component, which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
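A minimal sketch of the propagation step might look as follows, with lognormal basic-event distributions parameterized by a median and an error factor (EF = p95/p50, so sigma = ln(EF)/1.645, a standard PRA convention) and a rare-event cut-set approximation for the top event. The fault tree and all numbers below are invented for illustration and are not the NRC ASP models.

    import numpy as np

    rng = np.random.default_rng(42)
    # basic event -> (median probability, error factor), all assumed values
    events = {"A": (1e-3, 3.0), "B": (5e-4, 5.0), "C": (2e-3, 3.0)}
    cut_sets = [("A", "B"), ("C",)]          # toy top event = A*B + C

    n = 100_000
    samples = {e: m * np.exp(np.log(ef) / 1.645 * rng.standard_normal(n))
               for e, (m, ef) in events.items()}

    # rare-event approximation: P(top) ~ sum of cut-set products
    top = sum(np.prod([samples[e] for e in cs], axis=0) for cs in cut_sets)
    print(f"mean {top.mean():.2e}, 5th {np.percentile(top, 5):.2e}, "
          f"95th {np.percentile(top, 95):.2e}")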
Hand and goods judgment algorithm based on depth information
NASA Astrophysics Data System (ADS)
Li, Mingzhu; Zhang, Jinsong; Yan, Dan; Wang, Qin; Zhang, Ruiqi; Han, Jing
2016-03-01
A tablet computer with a depth camera and a color camera is mounted on a traditional shopping cart, and the inside of the cart is observed by the two cameras. In the shopping cart monitoring field, it is very important to determine whether a customer is putting goods into, or taking them out of, the shopping cart. This paper establishes a basic framework for judging whether a hand is empty. It includes a hand extraction process based on depth information, a skin color model built with WPCA (Weighted Principal Component Analysis), an algorithm for judging handheld products based on motion and skin color information, and a statistical process. In this framework, the first step ensures the integrity of the hand information and effectively avoids the influence of sleeves and other clutter; the second step accurately extracts skin color and eliminates interference from similar colors, is little affected by lighting, and has the advantages of fast computation and high efficiency; and the third step greatly reduces noise interference and improves accuracy.
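As a rough illustration of the first two steps of this framework, the sketch below gates a synthetic depth frame to isolate a hand region and classifies pixels against a skin color model via Mahalanobis distance. The frames, the skin statistics, and the decision threshold are all invented stand-ins; the paper's WPCA weighting and motion cues are omitted.

    import numpy as np

    rng = np.random.default_rng(5)
    depth = rng.uniform(0.2, 1.5, (120, 160))     # metres, synthetic frame
    color = rng.uniform(0, 255, (120, 160, 3))    # synthetic RGB frame

    hand_mask = (depth > 0.3) & (depth < 0.6)     # depth gate for the hand

    skin_mean = np.array([180.0, 120.0, 100.0])   # assumed skin RGB mean
    skin_cov_inv = np.linalg.inv(np.diag([400.0, 300.0, 300.0]))
    d = color - skin_mean
    maha = np.einsum("...i,ij,...j->...", d, skin_cov_inv, d)
    skin_mask = maha < 9.0                        # within ~3 sigma of skin

    # crude decision: a mostly-skin hand region suggests an empty hand
    skin_fraction = (hand_mask & skin_mask).sum() / max(hand_mask.sum(), 1)
    print("empty hand" if skin_fraction > 0.7 else "hand holding goods")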
Basic Technology Tools for Administrators: Preparing for the New Millennium.
ERIC Educational Resources Information Center
Aguilera, Raymond; Hendricks, Joen M.
This paper suggests activities for school administrators to learn basic technology tools. Step-by-step instructions are provided for browsing and using the Internet, organizing favorite World Wide Web sites, and organizing Internet bookmarks. Interesting job search, legal, and professional organization Web sites for administrators are listed. A…
Testing for multigroup equivalence of a measuring instrument: a walk through the process.
Byrne, Barbara M
2008-11-01
This article presents an overview and application of the steps taken in testing for the equivalence of a measuring instrument across one or more groups. Following a basic description of, and rationale underlying these steps, the process is illustrated with data comprising response scores to four nonacademic subscales (Physical SC [Ability], Physical SC [Appearance], Social SC [Peers], and Social SC [Parents]) of the Self Description Questionnaire-I for Australian (N = 497) and Nigerian (N = 439) adolescents. All tests for validity and equivalence are based on the analysis of covariance structures within the framework of CFA models using the EQS 6 program. Prospective impediments to equivalence are suggested and additional caveats proposed in the special case where the groups under study represent different cultures.
Splendidly blended: a machine learning set up for CDU control
NASA Astrophysics Data System (ADS)
Utzny, Clemens
2017-06-01
As the concepts of machine learning and artificial intelligence continue to grow in importance in internet-related applications, they are still in their infancy when it comes to process control within the semiconductor industry. Especially the branch of mask manufacturing presents a challenge to the concepts of machine learning, since the business process intrinsically induces pronounced product variability against a background of small plate numbers. In this paper we present the architectural set up of a machine learning algorithm which successfully deals with the demands and pitfalls of mask manufacturing. A detailed motivation of this basic set up is given, followed by an analysis of its statistical properties. The machine learning set up for mask manufacturing involves two learning steps: an initial step which identifies and classifies the basic global CD patterns of a process. These results form the basis for the extraction of an optimized training set via balanced sampling. A second learning step uses this training set to obtain the local as well as global CD relationships induced by the manufacturing process. Using two production-motivated examples, we show how this approach is flexible and powerful enough to deal with the exacting demands of mask manufacturing. In one example we show how dedicated covariates can be used in conjunction with increased spatial resolution of the CD map model in order to deal with pathological CD effects at the mask boundary. The other example shows how the model set up enables strategies for dealing with tool-specific CD signature differences. In this case the balanced sampling enables a process control scheme which allows usage of the full tool park within the specified tight tolerance budget. Overall, this paper shows that the current rapid development of machine learning algorithms can be successfully used within the context of semiconductor manufacturing.
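The two-step structure can be illustrated compactly: cluster plates by their global CD pattern, draw a class-balanced training set across the clusters, and fit a CD model on that set. The sketch below uses synthetic data and scikit-learn and is only a schematic of the approach, not the production algorithm.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 16))     # per-plate CD pattern features (synthetic)
    y = X @ rng.normal(size=16) + rng.normal(scale=0.1, size=300)  # CD target

    # Step 1: identify basic global CD pattern classes
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

    # Step 2: balanced sampling -- same number of plates from every class
    per_class = min(np.bincount(labels))
    idx = np.concatenate([rng.choice(np.where(labels == k)[0], per_class,
                                     replace=False) for k in range(3)])
    model = Ridge().fit(X[idx], y[idx])
    print("train R^2 on balanced set:", round(model.score(X[idx], y[idx]), 3))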
A comparison between atmospheric/humidity and vacuum cyanoacrylate fuming of latent fingermarks.
Farrugia, Kevin J; Fraser, Joanna; Friel, Lauren; Adams, Duncan; Attard-Montalto, Nicola; Deacon, Paul
2015-12-01
A number of pseudo-operational trials were set up to compare the atmospheric/humidity and vacuum cyanoacrylate fuming processes on plastic carrier bags. The fuming processes were compared using two-step cyanoacrylate fuming with basic yellow 40 (BY40) staining and a one-step fluorescent cyanoacrylate fuming, Lumicyano 4%. Preliminary work using planted fingermarks and split depletions were performed to identify the optimum vacuum fuming conditions. The first pseudo-operational trial compared the different fuming conditions (atmospheric/humidity vs. vacuum) for the two-step process where an additional 50% more marks were detected with the atmospheric/humidity process. None of the marks by the vacuum process could be observed visually; however, a significant number of marks were detected by fluorescence after BY40 staining. The second trial repeated the same work in trial 1 using the one-step cyanoacrylate process, Lumicyano at a concentration of 4%. Trial 2 provided comparable results to trial 1 and all the items were then re-treated with Lumicyano 4% at atmospheric/humidity conditions before dyeing with BY40 to provide the sequences of process A (Lumicyano 4% atmospheric-Lumicyano 4% atmospheric-BY40) and process B (Lumicyano 4% vacuum-Lumicyano 4% atmospheric-BY40). The number of marks (visual and fluorescent) was counted after each treatment with a substantial increase in the number of detected marks in the second and third treatments of the process. The increased detection rate after the double Lumicyano process was unexpected and may have important implications. Trial 3 was performed to investigate whether the amount of cyanoacrylate and/or fuming time had an impact on the results observed in trial 2 whereas trial 4 assessed if the double process using conventional cyanoacrylate, rather than Lumicyano 4%, provided an increased detection rate. Trials 3 and 4 confirmed that doubling the amount of Lumicyano 4% cyanoacrylate and fuming time produced a lower detection rate than the double process with Lumicyano 4%. Furthermore, the double process with conventional cyanoacrylate did not provide any benefit. Scanning electron microscopy was also performed to investigate the morphology of the cyanoacrylate polymer under different conditions. The atmospheric/humidity process appears to be superior to the vacuum process for both the two-step and one-step cyanoacrylate fuming, although the two-step process performed better in comparison to the one-step process under vacuum conditions. Nonetheless, the use of vacuum cyanoacrylate fuming may have certain operational advantages and its use does not adversely affect subsequent cyanoacrylate fuming with atmospheric/humidity conditions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Thin layer imaging process for microlithography using radiation at strongly attenuated wavelengths
Wheeler, David R.
2004-01-06
A method for patterning of resist surfaces which is particularly advantageous for systems having low photon flux and highly energetic, strongly attenuated radiation. A thin imaging layer is created with uniform silicon distribution in a bilayer format. An image is formed by exposing selected regions of the silylated imaging layer to radiation. The radiation incident upon the silylated resist material results in acid generation which either catalyzes cleavage of Si-O bonds to produce moieties that are volatile enough to be driven off in a post exposure bake step or produces a resist material where the exposed portions of the imaging layer are soluble in a basic solution, thereby desilylating the exposed areas of the imaging layer. The process is self limiting due to the limited quantity of silyl groups within each region of the pattern. Following the post exposure bake step, an etching step, generally an oxygen plasma etch, removes the resist material from the de-silylated areas of the imaging layer.
ERIC Educational Resources Information Center
Vidacovich, Courtney
2015-01-01
Current teaching standards and practices are dictated, at least in part, by state- and district-mandated standardized tests. Yet, despite being surrounded by data, teachers receive only basic trainings on how to use assessments. In reality, teachers use data and assessments daily--even minute by minute--through the assessment process, which uses…
The Art of Showing Art. Revised and Updated.
ERIC Educational Resources Information Center
Reeve, James K.
This book focuses attention on art object collections and how to display them. Designing the effective placement of objects is an easily learned art. Starting with the basics, the book takes the reader step by step through a systematic method to solutions for display problems. The first chapter covers basic concepts of display including…
Teaching BASIC. A Step by Step Guide.
ERIC Educational Resources Information Center
Allen, M. F.
This three-chapter guide provides simple explanations about BASIC programming for a teacher to use in a classroom situation, and suggests procedures for a "hands-on" course. Numerous examples are presented of the questions, problems, and level of understanding to expect from first-time, adult users (ages 13 and up). The course materials…
Sequence-dependent base pair stepping dynamics in XPD helicase unwinding
Qi, Zhi; Pugh, Robert A; Spies, Maria; Chemla, Yann R
2013-01-01
Helicases couple the chemical energy of ATP hydrolysis to directional translocation along nucleic acids and transient duplex separation. Understanding helicase mechanism requires that the basic physicochemical process of base pair separation be understood. This necessitates monitoring helicase activity directly, at high spatio-temporal resolution. Using optical tweezers with single base pair (bp) resolution, we analyzed DNA unwinding by XPD helicase, a Superfamily 2 (SF2) DNA helicase involved in DNA repair and transcription initiation. We show that monomeric XPD unwinds duplex DNA in 1-bp steps, yet exhibits frequent backsteps and undergoes conformational transitions manifested in 5-bp backward and forward steps. Quantifying the sequence dependence of XPD stepping dynamics with near base pair resolution, we provide the strongest and most direct evidence thus far that forward, single-base pair stepping of a helicase utilizes the spontaneous opening of the duplex. The proposed unwinding mechanism may be a universal feature of DNA helicases that move along DNA phosphodiester backbones. DOI: http://dx.doi.org/10.7554/eLife.00334.001 PMID:23741615
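Extracting discrete steps from such noisy single-molecule traces is itself a basic analysis problem. The toy detector below scores each point of a simulated 1-bp staircase by the difference of means in two flanking windows and keeps local score maxima above a threshold; real step detection on optical-tweezers data uses more careful statistics and instrument calibration, so this is only a schematic.

    import numpy as np

    rng = np.random.default_rng(3)
    true = np.repeat([0.0, 1.0, 2.0, 3.0], 200)    # 1-bp staircase (bp)
    trace = true + rng.normal(scale=0.3, size=true.size)

    w = 50                                         # half-window (samples)
    # mean difference between the windows flanking each point
    score = np.array([abs(trace[i:i + w].mean() - trace[i - w:i].mean())
                      for i in range(w, trace.size - w)])
    cand = np.flatnonzero(score > 0.5)
    # keep candidates that are the local maximum of the score
    steps = [i + w for i in cand
             if score[i] == score[max(0, i - w):i + w].max()]
    print("detected steps near:", steps)           # expect ~200, 400, 600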
A systematic writing program as a tool in the grief process: part 1.
Furnes, Bodil; Dysvik, Elin
2010-12-06
The basic aim of this paper is to suggest a flexible and individualized writing program as a tool for use during the grief process of bereaved adults. An open, qualitative approach following distinct steps was taken to gain a broad perspective on the grief and writing processes, as a platform for the writing program. Following several systematic methodological steps, we arrived at suggestions for the initiation of a writing program and its structure and substance, with appropriate guidelines. We believe that open and expressive writing, including free writing and focused writing, may have beneficial effects on a person experiencing grief. These writing forms may be undertaken and systematized through a writing program, with participation in a grief writing group and with diary writing, to achieve optimal results. A structured writing program might be helpful in promoting thought activities and as a tool to increase the coherence and understanding of individuals in the grief process. Our suggested program may also be a valuable guide to future program development and research.
NASA Astrophysics Data System (ADS)
Maes, Pieter-Jan; Amelynck, Denis; Leman, Marc
2012-12-01
In this article, a computational platform is presented, entitled "Dance-the-Music", that can be used in a dance educational context to explore and learn the basics of dance steps. By introducing a method based on spatiotemporal motion templates, the platform makes it possible to train basic step models from sequentially repeated dance figures performed by a dance teacher. Movements are captured with an optical motion capture system. The teachers' models can be visualized from a first-person perspective to instruct students how to perform the specific dance steps in the correct manner. Moreover, recognition algorithms, based on a template matching method, can determine the quality of a student's performance in real time by means of multimodal monitoring techniques. The results of an evaluation study suggest that Dance-the-Music is effective in helping dance students master the basics of dance figures.
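The template matching idea can be illustrated with a one-dimensional trajectory: normalize the teacher's template and the student's performance, then score their correlation. Real systems compare full-body motion capture data and typically align tempo (e.g., with dynamic time warping); the sketch below is only a schematic with synthetic signals.

    import numpy as np

    def match_score(template, performance):
        # normalized cross-correlation: 1.0 = identical trajectory shape
        t = (template - template.mean()) / template.std()
        p = (performance - performance.mean()) / performance.std()
        return float(np.dot(t, p) / t.size)

    t_axis = np.linspace(0, 2 * np.pi, 100)
    teacher = np.sin(t_axis)                   # template: one step cycle
    good = np.sin(t_axis) + np.random.default_rng(0).normal(0, 0.1, 100)
    bad = np.cos(t_axis)                       # out-of-phase performance

    print(f"good performance: {match_score(teacher, good):.2f}")   # ~0.99
    print(f"poor performance: {match_score(teacher, bad):.2f}")    # ~0.00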
Process for the combined removal of SO2 and NOx from flue gas
Chang, Shih-Ger; Liu, David K.; Griffiths, Elizabeth A.; Littlejohn, David
1988-01-01
The present invention in one aspect relates to a process for the simultaneous removal of NOx and SO2 from a fluid stream comprising mixtures thereof, and in another aspect relates to the separation, use and/or regeneration of various chemicals contaminated or spent in the process, and which includes the steps of: (A) contacting the fluid stream at a temperature of between about 105° and 180°C with a liquid aqueous slurry or solution comprising an effective amount of an iron chelate of an amino acid moiety having at least one -SH group; (B) separating the fluid stream from the particulates formed in step (A) comprising the chelate of the amino acid moiety and fly ash; (C) washing and separating the particulates of step (B) with an aqueous solution having a pH value of between about 5 to 8; (D) subsequently washing and separating the particulates of step (C) with a strongly acidic aqueous solution having a pH value of between about 1 to 3; (E) washing and separating the particulates of step (D) with a basic aqueous solution having a pH value of between about 9 to 12; (F) optionally adding additional amino acid moiety, iron (II) and alkali to the aqueous liquid from step (D) to produce an aqueous solution or slurry similar to that in step (A) having a pH value of between about 4 to 12; and (G) recycling the aqueous slurry of step (F) to the contacting zone of step (A). Steps (D) and (E) can be carried out in the reverse sequence; however, the preferred order is (D) and then (E). In another preferred embodiment the present invention provides a process for the removal of NOx, SO2 and particulates from a fluid stream which includes the steps of (A) injecting into a reaction zone an aqueous solution itself comprising (i) an amino acid moiety selected from those described above; (ii) iron (II) ion; and (iii) an alkali, wherein the aqueous solution has a pH of between about 4 and 11; followed by solids separation and washing as described in steps (B), (C), (D) and (E) above. The overall process is useful for reducing acid rain components from combustion gas sources.
Serveau, Carole; Boulangé, Alain; Lecaille, Fabien; Gauthier, Francis; Authié, Edith; Lalmanach, Gilles
2003-06-01
Congopain, the major cysteine protease from Trypanosoma congolense, is synthesized as an inactive zymogen, and further converted into its active form after removal of the proregion, most probably via an autocatalytic mechanism. Processing of recombinant procongopain occurs via an apparent one-step or a multistep mechanism depending on the ionic strength. The auto-activation is pH-dependent, with an optimum at pH 4.0, and no activation observed at pH 6.0. After addition of dextran sulfate (10 microg/ml), an approx. 20-fold increase of processing (expressed as enzymatic activity) is observed. Furthermore, in the presence of dextran sulfate, procongopain can be processed at pH 8.0, an unusual feature among papain-like enzymes. Detection of procongopain and trypanosomal enzymatic activity in the plasma of T. congolense-infected cattle, together with the capacity of procongopain to be activated at weakly basic pH, suggest that procongopain may be extracellularly processed in the presence of blood vessel glycosaminoglycans, supporting the hypothesis that congopain acts as a pathogenic factor in host-parasite relationships.
Simulation Activity for Initiating Thinking about Year Around School Plans.
ERIC Educational Resources Information Center
Sagness, Richard L.
This document presents a three-step model for providing teachers with basic information on year-round or extended school plans. As a first step, descriptions of basic plans are given. The four used in this paper are Staggered Quarter for All, Full 48-week School Year for All, Voluntary Summer Program, and A Summer Program for Professional…
The Basic Organizing/Optimizing Training Scheduler (BOOTS): User's Guide. Technical Report 151.
ERIC Educational Resources Information Center
Church, Richard L.; Keeler, F. Laurence
This report provides the step-by-step instructions required for using the Navy's Basic Organizing/Optimizing Training Scheduler (BOOTS) system. BOOTS is a computerized tool designed to aid in the creation of master training schedules for each Navy recruit training command. The system is defined in terms of three major functions: (1) data file…
Materials and manufacturing processes for increased life/reliability. [of turbine wheels
NASA Technical Reports Server (NTRS)
Duttweiler, R. E.
1977-01-01
Improvements in both quality and durability of disk raw material for both military and commercial engines necessitated an entirely new concept in raw material process control which imposes careful selection, screening and sampling of the basic alloy ingredients, followed by careful monitoring of the melting parameters in all phases of the vacuum melting sequence. Special care is taken to preclude solidification conditions that produce adverse levels of segregation. Melt furnaces are routinely cleaned and inspected for contamination. Ingots are also cleaned and inspected before entering the final melt step.
The use of quizStar application for online examination in basic physics course
NASA Astrophysics Data System (ADS)
Kustijono, R.; Budiningarti, H.
2018-03-01
The purpose of the study is to produce an online Basic Physics exam system using the QuizStar application. This is research and development following the ADDIE model, whose steps are: 1) analysis; 2) design; 3) development; 4) implementation; 5) evaluation. System feasibility is reviewed in terms of validity, practicality, and effectiveness. The subjects of the research are 60 Physics Department students of Universitas Negeri Surabaya. The data analysis used is descriptive statistics. The validity, practicality, and effectiveness scores are measured using a Likert scale, and the system is considered feasible if the total score across all aspects is ≥ 61%. The results for the online test system developed with QuizStar are: 1) it is conceptually feasible to use; 2) it can be implemented in the Basic Physics assessment process, and the existing constraints can be overcome; 3) students' responses to system usage are in a good category. The results lead to the conclusion that the QuizStar application is eligible to be used for an online Basic Physics exam system.
From Recombinant Expression to Crystals: A Step-by-Step Guide to GPCR Crystallography.
Shukla, Arun K; Kumari, Punita; Ghosh, Eshan; Nidhi, Kumari
2015-01-01
G protein-coupled receptors (GPCRs) are the primary targets of drugs prescribed for many human pathophysiological conditions such as hypertension, allergies, schizophrenia, asthma, and various types of cancer. High-resolution structure determination of GPCRs has been a key focus area in GPCR biology to understand the basic mechanism of their activation and signaling and to materialize the long-standing dream of structure-based drug design on these versatile receptors. There has been tremendous effort on this front in the past two decades, and it has culminated in crystal structures of 27 different receptors so far. The recent progress in crystallization and structure determination of GPCRs has been driven by innovation and cutting-edge developments at every step involved in the process of crystallization. Here, we present a step-by-step description of the various steps involved in GPCR crystallization, starting from recombinant expression to obtaining diffracting crystals. We also discuss the next frontiers in GPCR biology that are likely to be a primary focus for crystallography efforts in the next decade or so. © 2015 Elsevier Inc. All rights reserved.
Brief Lags in Interrupted Sequential Performance: Evaluating a Model and Model Evaluation Method
2015-01-05
rehearsal mechanism in the model. To evaluate the model we developed a simple new goodness-of-fit test based on analysis of variance that offers an...repeated step). Sequential constraints are common in medicine, equipment maintenance, computer programming and technical support, data analysis...legal analysis, accounting, and many other home and workplace environments. Sequential constraints also play a role in such basic cognitive processes
ERIC Educational Resources Information Center
Bhola, H. S.
Addressed to professionals involved in program evaluation, this working paper covers various aspects of evaluation planning, including the following: planning as a sociotechnical process, steps in evaluation planning, program planning and implementation versus evaluation planning and implementation, the literacy system and its subsystems, and some…
van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette
2015-08-13
It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders, so that the technology reflects their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods, whose practical application is demonstrated using Infectionmanager as an example case. In this paper, we aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach for implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and starts an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfying success and uptake of the eHealth technology.
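Of the Step 2 methods, the analytic hierarchy process is the most algorithmic: stakeholder priorities are obtained as the principal eigenvector of a pairwise comparison matrix, together with a consistency check. A minimal sketch, using an invented three-stakeholder comparison matrix on Saaty's 1-9 scale, follows.

    import numpy as np

    # Invented pairwise comparisons, e.g. physicians vs nurses vs IT staff;
    # A[i, j] = how much more important stakeholder i is than j.
    A = np.array([[1,   3,   5],
                  [1/3, 1,   2],
                  [1/5, 1/2, 1]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                        # priority weights

    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)   # consistency index
    cr = ci / 0.58                      # Saaty's random index RI = 0.58 for n = 3
    print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))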
NASA Astrophysics Data System (ADS)
Wan, Jiangping; Jones, James D.
2013-11-01
The Warfield version of systems science supports a wide variety of application areas, and is useful to practitioners who use the work program of complexity (WPOC) tool. In this article, WPOC is applied to information technology service management (ITSM) for managing the complexity of projects. In discussing the application of WPOC to ITSM, we walk through several steps of WPOC. The discovery step of WPOC consists of a description process and a diagnosis process. During the description process, 52 risk factors are identified, which are then narrowed to 20 key risk factors. All of this is done by interviews and surveys. Root risk factors (the most basic risk factors) consist of 11 kinds of common 'mindbugs' which are selected from an interpretive structural model. This is achieved by empirical analysis of 25 kinds of mindbugs. (A lesser aim of this research is to affirm that these mindbugs developed from a Western mindset have corresponding relevance in a completely different culture: the People's Republic of China.) During the diagnosis process, the relationships among the root risk factors in the implementation of the ITSM project are identified. The resolution step of WPOC consists of a design process and an implementation process. During the design process, issues related to the ITSM application are compared to both e-Government operation and maintenance, and software process improvement. The ITSM knowledge support structure is also designed at this time. During the implementation process, 10 keys to the successful implementation of ITSM projects are identified.
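The interpretive structural model mentioned above is built from pairwise influence judgments: a transitive closure of the influence matrix yields the reachability matrix, which is then partitioned into hierarchy levels. The sketch below runs this procedure on an invented four-factor example; the factor names and links are hypothetical, not the study's 11 mindbugs.

    import numpy as np

    factors = ["habit", "overconfidence", "poor communication", "schedule slip"]
    adj = np.array([[1, 1, 0, 0],      # adj[i, j] = 1 if factor i influences j
                    [0, 1, 1, 0],
                    [0, 0, 1, 1],
                    [0, 0, 0, 1]])

    R = adj.copy()
    for _ in range(len(factors)):      # Warshall-style transitive closure
        R = ((R @ R) > 0).astype(int) | R

    # level partitioning: a factor is top-level when its reachability set
    # is contained in its antecedent set
    remaining, level = set(range(len(factors))), 1
    while remaining:
        reach = {i: {j for j in remaining if R[i][j]} for i in remaining}
        ante = {i: {j for j in remaining if R[j][i]} for i in remaining}
        top = [i for i in remaining if reach[i] & ante[i] == reach[i]]
        print(f"level {level}:", [factors[i] for i in top])
        remaining -= set(top)
        level += 1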
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed clustering of activities that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
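The Markov-process side of such an analysis amounts to estimating a transition matrix over costing activities from observed sequences. A minimal sketch with invented activity labels and sessions:

    import numpy as np

    acts = ["decompose", "size", "effort", "review"]
    idx = {a: i for i, a in enumerate(acts)}
    sessions = [["decompose", "size", "effort", "review"],
                ["decompose", "effort", "review"],
                ["decompose", "size", "effort", "effort", "review"]]

    counts = np.zeros((len(acts), len(acts)))
    for seq in sessions:
        for a, b in zip(seq, seq[1:]):       # tally observed transitions
            counts[idx[a], idx[b]] += 1

    row = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, row, out=np.zeros_like(counts), where=row > 0)
    print(np.round(P, 2))                    # row i = P(next | activity i)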
Onda, Mitsuko; Takagaki, Nobumasa
2018-01-01
Osaka University of Pharmaceutical Sciences has included an evidence-based medicine (EBM) exercise in the introductory education for clinical practice for 4th-year pharmacy students since 2015. The purpose of this exercise is to learn the process of practice and basic concepts of EBM, especially to cultivate the practical ability to solve patients' problems and answer their questions. Additionally, in 2016, we attempted flipped teaching. The students are instructed to review the basic knowledge necessary for active learning in this exercise by watching video teaching materials and to bring reports summarizing the contents on the flipped teaching days. The program includes short lectures [overview of EBM, document retrieval, randomized controlled trials (RCTs), and systematic review], exercises [patient, intervention, comparison, outcome (PICO) structuring, critical appraisal of papers in small groups with tutors], and presentations. The program includes: step 1, PICO structuring based on scenarios; step 2, critical appraisal of English-language papers on RCTs using evaluation worksheets; and step 3, reviewing the results of the PICO exercise with patients. The results of the review are shared among groups through general discussion. In this symposium, I discuss students' attitudes, the effectiveness of small group discussions using flipped teaching, and future challenges to be addressed in this program.
Burns, E Robert; Garrett, Judy
2015-01-01
Correlates of achievement in the basic science years in medical school and on the Step 1 of the United States Medical Licensing Examination® (USMLE®), (Step 1) in relation to preadmission variables have been the subject of considerable study. Preadmissions variables such as the undergraduate grade point average (uGPA) and Medical College Admission Test® (MCAT®) scores, solely or in combination, have previously been found to be predictors of achievement in the basic science years and/or on the Step 1. The purposes of this retrospective study were to: (1) determine if our statistical analysis confirmed previously published relationships between preadmission variables (MCAT, uGPA, and applicant pool size), and (2) study correlates of the number of failures in five M1 courses with those preadmission variables and failures on Step 1. Statistical analysis confirmed previously published relationships between all preadmission variables. Only one course, Microscopic Anatomy, demonstrated significant correlations with all variables studied including the Step 1 failures. Physiology correlated with three of the four variables studied, but not with the Step 1 failures. Analyses such as these provide a tool by which administrators will be able to identify what courses are or are not responding in appropriate ways to changes in the preadmissions variables that signal student performance on the Step 1. © 2014 American Association of Anatomists.
Naik, Aanand Dinkar; Rao, Raghuram; Petersen, Laura Ann
2008-01-01
Diagnostic errors are poorly understood despite being a frequent cause of medical errors. Recent efforts have aimed to advance the "basic science" of diagnostic error prevention by tracing errors to their most basic origins. Although a refined theory of diagnostic error prevention will take years to formulate, we focus on communication breakdown, a major contributor to diagnostic errors and an increasingly recognized preventable factor in medical mishaps. We describe a comprehensive framework that integrates the potential sources of communication breakdowns within the diagnostic process and identifies vulnerable steps in the diagnostic process where various types of communication breakdowns can precipitate error. We then discuss potential information technology-based interventions that may have efficacy in preventing one or more forms of these breakdowns. These possible intervention strategies include using new technologies to enhance communication between health providers and health systems, improve patient involvement, and facilitate management of information in the medical record. PMID:18373151
Quality initiatives: planning, setting up, and carrying out radiology process improvement projects.
Tamm, Eric P; Szklaruk, Janio; Puthooran, Leejo; Stone, Danna; Stevens, Brian L; Modaro, Cathy
2012-01-01
In the coming decades, those who provide radiologic imaging services will be increasingly challenged by the economic, demographic, and political forces affecting healthcare to improve their efficiency, enhance the value of their services, and achieve greater customer satisfaction. It is essential that radiologists master and consistently apply basic process improvement skills that have allowed professionals in many other fields to thrive in a competitive environment. The authors provide a step-by-step overview of process improvement from the perspective of a radiologic imaging practice by describing their experience in conducting a process improvement project: to increase the daily volume of body magnetic resonance imaging examinations performed at their institution. The first step in any process improvement project is to identify and prioritize opportunities for improvement in the work process. Next, an effective project team must be formed that includes representatives of all participants in the process. An achievable aim must be formulated, appropriate measures selected, and baseline data collected to determine the effects of subsequent efforts to achieve the aim. Each aspect of the process in question is then analyzed by using appropriate tools (eg, flowcharts, fishbone diagrams, Pareto diagrams) to identify opportunities for beneficial change. Plans for change are then established and implemented with regular measurements and review followed by necessary adjustments in course. These so-called PDSA (planning, doing, studying, and acting) cycles are repeated until the aim is achieved or modified and the project closed.
Khan, Samina Mohsin; Mahmood, Raja Amjad
2016-01-01
Pakistani nationals who obtain foreign basic or additional medical/dental qualifications must register with the Pakistan Medical and Dental Council (PMDC) to serve in the health delivery system, after passing the National Examination Board (NEB) examinations conducted by outsourced universities. The aim was to analyse the results of the NEB examinations held for accreditation by different universities from 2010 to 2015, and the factors that influenced them. Register-based data were collected from the NEB section of the PMDC for examinations conducted in 2010, 2013, 2014, and 2015. The examination comprises three parts: Steps I and II are written assessments composed of both multiple-choice and subjective questions, and Step III is an oral assessment (viva voce). All examinations were held in Islamabad and conducted by the outsourced universities. The percentage of candidates who passed Steps I, II, and III varies across universities, and the pass rate remains lower in Steps I and II than in Step III. Overall, the Step III results show a wide gradient, ranging from 50% to 98% among the different universities. Likewise, the lowest pass rates in Steps I and II were 4% and 9%, respectively, compared with 50% in Step III. The wide gradient in NEB examination results among universities reflects a lack of uniformity and standardization in the assessment process. In addition, the examination venues raise issues of accessibility and cost-effectiveness. The PMDC needs to institutionalize a robust and vigorous accreditation system with built-in evaluation indicators to steer strategies for improving the accreditation process.
Birmingham, Jackie
2004-01-01
Discharge planning is a legally mandated function for hospitals and is one of the "basic" hospital roles as outlined in Medicare's Conditions of Participation. This article will define discharge planning; describe the steps in the discharge planning process; list rules and regulations that influence discharge planning in hospitals; and compare hospital-based actions with payer-based actions when planning discharges. Case managers who work for payers interact with hospital-based case managers to facilitate the discharge planning process for patients. Those who form this patient-provider-payer triangle will benefit by reviewing the dynamics of the discharge planning process.
Development of an E-mail Application Seemit and its Utilization in an Information Literacy Course
NASA Astrophysics Data System (ADS)
Kita, Toshihiro; Miyazaki, Makoto; Nakano, Hiroshi; Sugitani, Kenichi; Akiyama, Hidenori
We have developed a simple e-mail application named Seemit which is designed for use in information literacy courses. It has the necessary and sufficient functionality of an e-mail application, and it was developed so that the basic operations and mechanisms of e-mail transfer can be learned easily. It is equipped with a function that automatically configures the user's SMTP/POP servers, e-mail address, etc. The process of transferring e-mail via SMTP and POP can be demonstrated step by step, showing the actual messages passed during the client-server interaction. We have used Seemit in a university-wide information literacy course that enrolls about 1800 students.
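To make the mechanism concrete, here is a minimal sketch of the kind of step-by-step client-server dialogue Seemit demonstrates, written with Python's standard smtplib and poplib; the host names, addresses, and credentials are placeholders, not details of the actual system.

```python
import smtplib
import poplib

# --- SMTP: hand a message to the mail server, echoing each protocol step ---
msg = ("From: student@example.ac.jp\r\n"
       "To: teacher@example.ac.jp\r\n"
       "Subject: test\r\n\r\nHello.\r\n")
with smtplib.SMTP("smtp.example.ac.jp", 25) as smtp:   # placeholder host
    smtp.set_debuglevel(1)   # print the actual SMTP commands and replies
    smtp.sendmail("student@example.ac.jp", ["teacher@example.ac.jp"], msg)

# --- POP3: fetch mail back, one protocol command at a time ---
pop = poplib.POP3("pop.example.ac.jp")                 # placeholder host
pop.user("student")
pop.pass_("password")                                  # placeholder credentials
count, _ = pop.stat()                                  # messages in the mailbox
if count:
    for line in pop.retr(count)[1]:   # RETR returns (response, lines, octets)
        print(line.decode())
pop.quit()
```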
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, E.S.
1988-01-01
An introduction to geophysical methods used to explore for natural resources and to survey the earth's geology is presented in this volume. It is suitable for second- and third-year undergraduate students majoring in geology or engineering, and for professional engineers and earth scientists without formal instruction in geophysics. The author assumes the reader is familiar with geometry, algebra, and trigonometry. Geophysical exploration includes seismic refraction and reflection surveying, electrical resistivity and electromagnetic field surveying, and geophysical well logging. Surveying operations are described in step-by-step procedures and are illustrated by practical examples. Computer-based methods of processing and interpreting data, as well as graphical methods, are introduced.
Transportation systems evaluation methodology development and applications, phase 3
NASA Technical Reports Server (NTRS)
Kuhlthau, A. R.; Jacobson, I. D.; Richards, L. C.
1981-01-01
Transportation systems, or proposed changes to current systems, are evaluated. Four principal evaluation criteria are incorporated in the process: operating performance characteristics as viewed by potential users; decisions based on the perceived impacts of the system; estimates of what is required to reduce the system to practice; and predictions of the ability of the concept to attract financial support. The method uses a series of matrix multiplications in which the various matrices represent evaluations, in a logical sequence, of the various discrete steps in a management decision process. One or more alternatives are compared with the current situation, and the result provides a numerical rating of the desirability of each alternative relative to the norm and to each other. The steps in the decision process are isolated so that the contribution of each to the final result is readily analyzed. The approach protects against bias on the part of the evaluators, and system parameters that are basically qualitative in nature can easily be included.
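As a rough numeric illustration of the chained-matrix idea (the report's actual matrices, criteria, and weights are not reproduced here), a sketch might look like this:

```python
import numpy as np

# rows = alternatives: current system (the norm), alternative A, alternative B
performance = np.array([[0.6, 0.5],    # step 1: scores on two user-perceived
                        [0.8, 0.7],    #         performance measures
                        [0.7, 0.9]])
impact = np.array([[0.5],              # step 2: weight of each measure in the
                   [0.9]])             #         perceived-impact decision
ratings = performance @ impact         # chain the evaluation steps
ratings = ratings / ratings[0]         # normalize to the current situation
print(ratings.ravel())                 # values > 1 beat the norm
```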
Fundamental Fractal Antenna Design Process
NASA Astrophysics Data System (ADS)
Zhu, L. P.; Kim, T. C.; Kakas, G. D.
2017-12-01
Antenna designers are always looking for new ideas to push the envelope, seeking smaller volume while striving for wider bandwidth and higher antenna gain. One proposed method of increasing bandwidth or shrinking antenna size is the use of fractal geometry, which gives rise to fractal antennas. Fractals are shapes whose structure looks the same whether one zooms in or out. This work presents the design of a new type of antenna based on fractal geometry, using Design of Experiments (DOE) within the fractal antenna design process. It investigates conformal fractal antenna designs in terms of pattern, dimensions, and size while maintaining or improving antenna performance. The research shows an antenna designer how to establish the basic requirements of a fractal antenna through a step-by-step process, and how to optimize the antenna design by comparing model predictions, laboratory measurements, and actual antenna-pattern results from the compact range.
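As a taste of the geometry involved (the paper's own antenna shapes and DOE settings are not reproduced), the sketch below generates a Koch-curve wire path, one of the classic fractals studied for antennas:

```python
import numpy as np

def koch(p0, p1, depth):
    """Recursively replace a segment with the 4-segment Koch generator."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    if depth == 0:
        return [p0, p1]
    a = p0 + (p1 - p0) / 3.0
    b = p0 + 2.0 * (p1 - p0) / 3.0
    d = b - a
    # apex of the equilateral bump: a->b rotated by +60 degrees
    apex = a + np.array([0.5 * d[0] - np.sqrt(3) / 2 * d[1],
                         np.sqrt(3) / 2 * d[0] + 0.5 * d[1]])
    pts = []
    for s, e in [(p0, a), (a, apex), (apex, b), (b, p1)]:
        pts.extend(koch(s, e, depth - 1)[:-1])  # drop shared endpoints
    pts.append(p1)
    return pts

path = koch((0, 0), (1, 0), depth=3)
# wire length grows as (4/3)**depth while the footprint stays fixed -- the
# property that lets a fractal antenna pack more electrical length into the
# same volume.
print(len(path), "points; relative wire length =", (4 / 3) ** 3)
```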
Design review - A tool for all seasons.
NASA Technical Reports Server (NTRS)
Liberman, D. S.
1972-01-01
The origins of design review are considered together with questions of definitions. The main characteristics which distinguish the concept of design review discussed from the basic master-apprentice relationship include competence, objectivity, formality, and a systematic approach. Preliminary, major, and final reviews are the steps used in the management of the design and development process in each company. It is shown that the design review is generically a systems engineering milestone review with certain unique characteristics.
The Pd-Catalyzed Conversion of Aryl Chlorides, Triflates, and Nonaflates to Nitroaromatics
Fors, Brett P.; Buchwald, Stephen L.
2009-01-01
An efficient Pd catalyst for the transformation of aryl chlorides, triflates and nonaflates to nitroaromatics has been developed. This reaction proceeds under weakly basic conditions and displays a broad scope and excellent functional group compatibility. Moreover, this method allows for the synthesis of aromatic nitro compounds that cannot be accessed efficiently via other nitration protocols. Mechanistic insight into the transmetalation step of the catalytic process is also reported. PMID:19737014
ERIC Educational Resources Information Center
Cer, Erkan
2015-01-01
Children's books of literary quality may affect gender perception, self-perception, and the social roles of children through the illustrations, main characters, and situations that they reflect. The Turkish Ministry of Education suggested a list consisting of stories, novels, poetry, and puns under the title "100 Basic Works of…
Observing the sick child: part 2b. Respiratory palpation.
Aylott, Marion
2007-02-01
Assessment is a major nursing role, and expanding assessment techniques, traditionally seen as the remit of the medical profession, can assist nursing assessment and the provision of appropriate care. This is the third article in a series of five articles. Parts 1 and 2a provided a practical critical review of the validity and reliability of basic respiratory assessment focusing on measurement of respiratory rate, rhythm and depth. This article provides a practical step-by-step introduction to the theory and practice of advanced respiratory assessment using palpation. Next month we will build on these skills and provide a practical step-by-step introduction to using auscultation in order to augment basic respiratory assessment skills.
Non-Contact Conductivity Measurement for Automated Sample Processing Systems
NASA Technical Reports Server (NTRS)
Beegle, Luther W.; Kirby, James P.
2012-01-01
A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
Yeung, Celine; McMillan, Catherine; Saun, Tomas J; Sun, Kimberly; D'hondt, Veerle; von Schroeder, Herbert P; Martou, Glykeria; Lee, Matthew; Liao, Elizabeth; Binhammer, Paul
To describe the development of cognitive task analysis (CTA)-based multimedia educational videos for surgical trainees in plastic surgery. A needs assessment survey was used to identify 5 plastic surgery skills on which to focus the educational videos. Three plastic surgeons were video-recorded performing each skill while describing the procedure, and were interviewed with probing questions. Three medical student reviewers coded transcripts, categorized each step as "action," "decision," or "assessment," and created a cognitive demands table (CDT) for each skill. The CDTs were combined into 1 table that was reviewed by the surgeons performing each skill to ensure accuracy. The final CDTs were compared against each surgeon's original transcripts. The total number of steps identified, the percentage of steps shared, and the average percentage of steps omitted were calculated. Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada, an urban tertiary care teaching center. Canadian junior plastic surgery residents (n = 78) were sent a needs assessment survey. Four plastic surgeons and 1 orthopedic surgeon performed the skills. Twenty-eight residents responded to the survey (36%). Subcuticular suturing, horizontal and vertical mattress suturing, hand splinting, digital nerve block, and excisional biopsy were the skills most frequently ranked by residents (>80%) as ones that students should be able to perform before entering residency. The number of steps identified through CTA ranged from 12 to 29. The percentage of steps shared by all 3 surgeons for each skill ranged from 30% to 48%, while the average percentage of steps omitted by each surgeon ranged from 27% to 40%. Instructional videos for basic surgical skills may be generated using CTA to help experts provide comprehensive descriptions of a procedure. A CTA-based educational tool may give trainees access to a broader, objective body of knowledge, allowing them to learn decision-making processes before entering the operating room. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Real-time PCR (qPCR) primer design using free online software.
Thornton, Brenda; Basu, Chhandak
2011-01-01
Real-time PCR (quantitative PCR or qPCR) has become the preferred method for validating results obtained from assays which measure gene expression profiles. The process uses reverse transcription polymerase chain reaction (RT-PCR), coupled with fluorescent chemistry, to measure variations in transcriptome levels between samples. The four most commonly used fluorescent chemistries are SYBR® Green dyes and TaqMan®, Molecular Beacon or Scorpion probes. SYBR® Green is very simple to use and cost efficient. As SYBR® Green dye binds to any double-stranded DNA product, its success depends greatly on proper primer design. Many types of online primer design software are available, which can be used free of charge to design desirable SYBR® Green-based qPCR primers. This laboratory exercise is intended for those who have a fundamental background in PCR. It addresses the basic fluorescent chemistries of real-time PCR, the basic rules and pitfalls of primer design, and provides a step-by-step protocol for designing SYBR® Green-based primers with free, online software. Copyright © 2010 Wiley Periodicals, Inc.
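As a minimal illustration of the kind of rule checking such software automates, the sketch below applies two textbook heuristics, GC content and the Wallace melting-temperature estimate Tm = 2(A+T) + 4(G+C); the primer sequence is hypothetical and the acceptable ranges are common defaults, not values prescribed by the article.

```python
def primer_stats(seq):
    """Return GC percentage and the Wallace-rule Tm estimate for a primer."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 100.0 * gc / len(seq), 2 * at + 4 * gc

fwd = "AGCGTTGCTAGGACCTGTAA"   # hypothetical 20-mer forward primer
gc_pct, tm = primer_stats(fwd)
ok = 40 <= gc_pct <= 60 and 50 <= tm <= 65 and 18 <= len(fwd) <= 24
print(f"GC = {gc_pct:.0f}%, Tm ~ {tm} C, acceptable = {ok}")
```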
"Small Steps, Big Rewards": You Can Prevent Type 2 Diabetes
... onset. Those are the basic facts of "Small Steps. Big Rewards: Prevent Type 2 Diabetes," created by ...
NASA Astrophysics Data System (ADS)
Bauhuis, Gerard J.; Mulder, Peter; Haverkamp, Erik J.; Schermer, John J.; Nash, Lee J.; Fulgoni, Dominic J. F.; Ballard, Ian M.; Duggan, Geoffrey
2010-10-01
The epitaxial lift-off (ELO) technique has been combined with inverted III-V PV cell epitaxial growth with the aim of employing thin-film PV cells in HCPV systems. In a stepwise approach to the realization of an inverted triple junction on a MELO platform, we first grew a GaAs single-junction PV cell to establish the basic layer-release process and cell-processing steps, followed by the growth, fabrication and testing of an inverted InGaP/GaAs dual-junction structure.
Angelopoulou, A; Efthimiadou, E K; Boukos, N; Kordas, G
2014-05-01
In this work, hybrid microspheres were prepared in a two-step process combining emulsifier-free emulsion polymerization and the sol-gel coating method. In the first step, polystyrene (St) and poly(methyl methacrylate) (PMMA) microspheres were prepared as sacrificial templates, and in the second step a silanol shell was fabricated. Functionalizing the surface of the hybrid microspheres with silane analogs (APTES, TEOS) enhanced their surface properties. Hollow microspheres were obtained either in an additional template-dissolution step or during the coating process. The microspheres' surface interactions and size distribution were optimized by treatment in simulated body fluids, which allowed in vitro prediction of bioactivity. The bioassay test indicated that the induced hydroxyapatite resembled naturally occurring bone apatite in structure. The drug doxorubicin (DOX) was used as a model entity for the evaluation of drug loading and release. The drug release study was performed under two different pH conditions: acidic (pH=4.5), close to the cancer cell environment, and slightly basic (pH=7.4), resembling the orthopedic environment. The results of the present study indicate promising hybrid microspheres for potential application as drug delivery vehicles with dual orthopedic functionality in bone defects, bone inflammation, bone cancer and bone repair. Copyright © 2014 Elsevier B.V. All rights reserved.
Open Science: a first step towards Science Communication
NASA Astrophysics Data System (ADS)
Grigorov, Ivo; Tuddenham, Peter
2015-04-01
As Earth Science communicators gear up to adopt new tools and captivating approaches to engage citizen scientists, budding entrepreneurs, policy makers and the public in general, researchers have the responsibility, and opportunity, to fully adopt Open Science principles and capitalize on their full societal impact and engagement. Open Science is about removing all barriers to basic research, whatever its format, so that it can be freely used, re-used and re-hashed, thus fueling discourse and accelerating the generation of innovative ideas. The concept is central to the EU's Responsible Research and Innovation philosophy, and removing barriers to basic research measurably contributes to engaging citizen scientists in the research process, sets the scene for co-creation of solutions to societal challenges, and raises the general science literacy of the public. Despite this potential, only 50% of today's basic research is freely available. Open Science can be the first, passive step of communicating marine research outside academia. Full and unrestricted access to our knowledge, including data, software code and scientific publications, is not just an ethical obligation, but also gives solid credibility to a more sophisticated communication strategy for engaging society. The presentation will demonstrate how Open Science complements a coherent communication strategy for placing marine research in a societal context, and how it underpins an effective integration of Ocean & Earth Literacy principles in standard education, as well as mobilizing citizen marine scientists, thus making marine science Open Science.
Basic Cake Decorating Workbook.
ERIC Educational Resources Information Center
Bogdany, Mel
Included in this student workbook for basic cake decorating are the following: (1) Drawings of steps in a basic way to ice a layer cake, how to make a paper cone, various sizes of flower nails, various sizes and types of tin pastry tubes, and special rose tubes; (2) recipes for basic decorating icings (buttercream, rose paste, and royal icing);…
Fast auto-focus scheme based on optical defocus fitting model
NASA Astrophysics Data System (ADS)
Wang, Yeru; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting; Cen, Min
2018-04-01
An optical defocus fitting model-based (ODFM) auto-focus scheme is proposed. Starting from basic optical defocus principles, an optical defocus fitting model is derived to approximate the potential-focus position. With this accurate modelling, the proposed auto-focus scheme can make the stepping motor approach the focal plane more accurately and rapidly. Two fitting positions are first determined for an arbitrary initial stepping-motor position. Three images (the initial image and two fitting images) at these positions are then collected to estimate the potential-focus position with the proposed ODFM method. Around the estimated potential-focus position, two reference images are recorded. The auto-focus procedure is then completed by processing these two reference images and the potential-focus image to confirm the in-focus position using a contrast-based method. Experimental results show that the proposed scheme can complete auto-focus within only 5 to 7 steps with good performance even under low-light conditions.
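The paper's optical defocus fitting model itself is not reproduced in the abstract; the simplified sketch below conveys the model-based idea by fitting a parabola to three (position, sharpness) samples and jumping to the estimated peak, with invented numbers.

```python
import numpy as np

def sharpness(img):
    """A simple contrast metric: variance of the image intensities."""
    return float(np.var(img))

def predict_focus(positions, scores):
    """Fit s(p) = a p^2 + b p + c and return the vertex position."""
    a, b, _ = np.polyfit(positions, scores, 2)
    return -b / (2.0 * a)

# sharpness of the initial image and the two fitting images (illustrative)
pos = np.array([0.0, 10.0, 20.0])      # stepping-motor positions
scr = np.array([0.31, 0.62, 0.40])     # measured contrast at each position
print("estimated in-focus position:", predict_focus(pos, scr))  # ~10.8
```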
Neural networks for vertical microcode compaction
NASA Astrophysics Data System (ADS)
Chu, Pong P.
1992-09-01
Neural networks provide an alternative way to solve complex optimization problems. Instead of performing a program of instructions sequentially as in a traditional computer, a neural network model explores many competing hypotheses simultaneously using its massively parallel net. The paper shows how to use the neural network approach to perform vertical microcode compaction for a micro-programmed control unit. The compaction procedure includes two basic steps. The first step determines the compatibility classes and the second step selects a minimal subset to cover the control signals. Since the selection process is an NP-complete problem, finding an optimal solution is impractical. In this study, we employ a customized neural network to obtain the minimal subset. We first formalize this problem, then define an "energy function" and map it to a two-layer fully connected neural network. The modified network has two types of neurons and can always obtain a valid solution.
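The abstract does not give the energy function explicitly; one standard way to encode this kind of covering/partitioning objective in a Hopfield-style network is:

```latex
% A generic Hopfield-style energy for selecting compatibility classes;
% this is a common textbook formulation, not necessarily the paper's.
\[
  E \;=\; A \sum_{s} \Bigl( 1 - \sum_{c} a_{sc}\, x_c \Bigr)^{2}
       \;+\; B \sum_{c} x_c ,
\]
% where x_c \in \{0, 1\} selects compatibility class c, a_{sc} = 1 if class c
% contains control signal s, the A-term forces each signal into exactly one
% selected class, and the B-term drives the selection toward a minimal subset.
```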
Defining and incorporating basic nursing care actions into the electronic health record.
Englebright, Jane; Aldrich, Kelly; Taylor, Cathy R
2014-01-01
To develop a definition of basic nursing care for the hospitalized adult patient and drive uptake of that definition through the implementation of an electronic health record. A team of direct care nurses, assisted by subject matter experts, analyzed nursing theory and regulatory requirements related to basic nursing care. The resulting list of activities was coded using the Clinical Care Classification (CCC) system and incorporated into the electronic health record system of a 170-bed community hospital. Nine basic nursing care activities were identified as a result of analyzing nursing theory and regulatory requirements in the framework of a hypothetical "well" patient. One additional basic nursing care activity was identified following the pilot implementation in the electronic health record. The pilot hospital has successfully passed a post-implementation regulatory review with no recommendations related to the documentation of basic patient care. This project demonstrated that it is possible to define the concept of basic nursing care and to distinguish it from the interdisciplinary, problem-focused plan of care. The use of the electronic health record can help clarify, document, and communicate basic care elements and improve uptake among nurses. This project to define basic nursing care activities and incorporate into the electronic health record represents a first step in capturing meaningful data elements. When fully implemented, these data could be translated into knowledge for improving care outcomes and collaborative processes. © 2013 Sigma Theta Tau International.
A process for the preparation of cysteine from cystine
Chang, Shih-Ger; Liu, David K.; Griffiths, Elizabeth A.; Littlejohn, David
1989-01-01
The present invention in one aspect relates to a process for the simultaneous removal of NOx and SO2 from a fluid stream comprising mixtures thereof, and in another aspect relates to the separation, use and/or regeneration of various chemicals contaminated or spent in the process, which includes the steps of: (A) contacting the fluid stream at a temperature of between about 105 and 180 °C with a liquid aqueous slurry or solution comprising an effective amount of an iron chelate of an amino acid moiety having at least one -SH group; (B) separating the fluid stream from the particulates formed in step (A) comprising the chelate of the amino acid moiety and fly ash; (C) washing and separating the particulates of step (B) with an aqueous solution having a pH value of between about 5 to 8; (D) subsequently washing and separating the particulates of step (C) with a strongly acidic aqueous solution having a pH value of between about 1 to 3; (E) washing and separating the particulates of step (D) with a basic aqueous solution having a pH value of between about 9 to 12; (F) optionally adding additional amino acid moiety, iron (II) and alkali to the aqueous liquid from step (D) to produce an aqueous solution or slurry similar to that in step (A) having a pH value of between about 4 to 12; and (G) recycling the aqueous slurry of step (F) to the contacting zone of step (A). Steps (D) and (E) can be carried out in the reverse sequence; however, the preferred order is (D) and then (E). In a preferred embodiment the present invention provides an improved process for the preparation (regeneration) of cysteine from cystine, which includes reacting an aqueous solution of cystine at a pH of between about 9 to 13 with a reducing agent selected from hydrogen sulfide or alkali metal sulfides, sulfur dioxide, an alkali metal sulfite or mixtures thereof, for a time and at a temperature effective to cleave and reduce the cystine to cysteine, with subsequent recovery of the cysteine. In another preferred embodiment the present invention provides a process for the removal of NOx, SO2 and particulates from a fluid stream which includes the steps of (A) injecting into a reaction zone an aqueous solution itself comprising (i) an amino acid moiety selected from those described above; (ii) iron (II) ion; and (iii) an alkali, wherein the aqueous solution has a pH of between about 4 and 11; followed by solids separation and washing as described in steps (B), (C), (D) and (E) above. The overall process is useful for reducing acid rain components from combustion gas sources.
Contreras-López, Orlando; Moyano, Tomás C; Soto, Daniela C; Gutiérrez, Rodrigo A
2018-01-01
The rapid increase in the availability of transcriptomics data generated by RNA sequencing represents both a challenge and an opportunity for biologists without bioinformatics training. The challenge is handling, integrating, and interpreting these data sets. The opportunity is to use this information to generate testable hypotheses to understand molecular mechanisms controlling gene expression and biological processes (Fig. 1). A successful strategy for generating tractable hypotheses from transcriptomics data has been to build undirected network graphs based on patterns of gene co-expression. Many examples of new hypotheses derived from network analyses can be found in the literature, spanning different organisms including plants and specific fields such as root developmental biology. In order to make the process of constructing a gene co-expression network more accessible to biologists, here we provide step-by-step instructions using published RNA-seq experimental data obtained from a public database. Similar strategies have been used in previous studies to advance root developmental biology. This guide includes basic instructions for the operation of widely used open-source platforms such as Bio-Linux, R, and Cytoscape. Even though the data we used in this example were obtained from Arabidopsis thaliana, the workflow developed in this guide can easily be adapted to work with RNA-seq data from any organism.
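A compressed sketch of the core computational step (correlate expression profiles, threshold, build the graph) is shown below with synthetic data; the 0.8 cutoff and the toy matrix are illustrative assumptions, while the guide itself works with real normalized RNA-seq counts.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
expr = rng.random((50, 12))              # toy data: 50 genes x 12 samples
genes = [f"gene{i}" for i in range(50)]

corr = np.corrcoef(expr)                 # pairwise Pearson correlations
g = nx.Graph()
g.add_nodes_from(genes)
thr = 0.8                                # co-expression cutoff (assumption)
for i in range(len(genes)):
    for j in range(i + 1, len(genes)):
        if abs(corr[i, j]) >= thr:
            g.add_edge(genes[i], genes[j], weight=float(corr[i, j]))
print(g.number_of_nodes(), "genes,", g.number_of_edges(), "edges")
```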
Space-time dynamics of Stem Cell Niches: a unified approach for Plants.
Pérez, Maria Del Carmen; López, Alejandro; Padilla, Pablo
2013-06-01
Many complex systems cannot be analyzed using traditional mathematical tools, due to their irreducible nature. This makes it necessary to develop models that can be implemented computationally to simulate their evolution. Examples of these models are cellular automata, evolutionary algorithms, complex networks, agent-based models, symbolic dynamics and dynamical systems techniques. We review some representative approaches to model the stem cell niche in Arabidopsis thaliana and the basic biological mechanisms that underlie its formation and maintenance. We propose a mathematical model based on cellular automata for describing the space-time dynamics of the stem cell niche in the root. By making minimal assumptions about the cell communication processes documented in experiments, we classify the basic developmental features of the stem cell niche, including its basic structural architecture, and suggest that they could be understood as the result of generic mechanisms given by short- and long-range signals. This could be a first step in understanding why different stem cell niches share similar topologies, not only in plants, and in understanding how this organization arises as a robust consequence of the way information is processed by the cells, to some extent independently of the detailed features of the signaling mechanism.
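In the spirit of the proposed model (the authors' exact rules are not reproduced here), the toy cellular automaton below combines a short-range neighbor signal with a long-range signal from a fixed source; a stable central "niche" of stem-state cells emerges.

```python
import numpy as np

n_cells, n_steps = 40, 30
state = np.zeros(n_cells, dtype=int)     # 0 = differentiated, 1 = stem
state[n_cells // 2] = 1                  # seed cell at the niche position

for _ in range(n_steps):
    neighbor = np.convolve(state, [1, 0, 1], mode="same")   # short-range signal
    long_range = np.exp(-np.abs(np.arange(n_cells) - n_cells // 2) / 5.0)
    state = ((neighbor >= 1) & (long_range > 0.5)).astype(int)
    state[n_cells // 2] = 1              # the source cell stays stem

print("".join("S" if s else "." for s in state))  # a stable central niche
```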
The OSHA standard setting process: role of the occupational health nurse.
Klinger, C S; Jones, M L
1994-08-01
1. Occupational health nurses are the health professionals most often involved with the worker who suffers as a result of ineffective or non-existent safety and health standards. 2. Occupational health nurses are familiar with health and safety standards, but may not understand or participate in the rulemaking process used to develop them. 3. Knowing the eight basic steps of rulemaking and actively participating in the process empowers occupational health nurses to influence national policy decisions affecting the safety and health of millions of workers. 4. By actively participating in rulemaking activities, occupational health nurses also improve the quality of occupational health nursing practice and enhance the image of the nursing profession.
NASA Astrophysics Data System (ADS)
Protsyuk, Yu. I.; Andruk, V. N.; Kazantseva, L. V.
The paper discusses and illustrates the steps of basic processing of digitized images of astro negatives. Software for obtaining rectangular coordinates and photometric values of objects on photographic plates was created in the LINUX/MIDAS/ROMAFOT environment. The program can automatically process a specified number of files in FITS format with sizes up to 20000 x 20000 pixels. Other programs were written in FORTRAN and PASCAL and run under LINUX or WINDOWS. They were used for: identification of stars; separation and exclusion of diffraction satellites and double and triple exposures; elimination of image defects; and reduction to the equatorial coordinates and magnitudes of reference catalogs.
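As an illustration of the final reduction step, the sketch below solves a six-constant linear plate model by least squares, mapping measured plate (x, y) positions to standard coordinates of reference-catalog stars; the coordinates are synthetic, and a production pipeline may use higher-order terms.

```python
import numpy as np

# measured plate positions (pixels) and catalog standard coordinates (deg)
xy = np.array([[100.0, 200.0], [1500.0, 250.0], [800.0, 1900.0],
               [300.0, 1200.0], [1700.0, 1600.0]])
xi_eta = np.array([[0.010, 0.020], [0.150, 0.024], [0.081, 0.190],
                   [0.031, 0.120], [0.171, 0.161]])

A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(xy))])
coef_xi, *_ = np.linalg.lstsq(A, xi_eta[:, 0], rcond=None)   # xi = ax + by + c
coef_eta, *_ = np.linalg.lstsq(A, xi_eta[:, 1], rcond=None)  # eta = dx + ey + f

def to_standard(x, y):
    """Apply the six plate constants to a measured position."""
    v = np.array([x, y, 1.0])
    return float(coef_xi @ v), float(coef_eta @ v)

print(to_standard(900.0, 900.0))
```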
Non-aqueous Electrode Processing and Construction of Lithium-ion Coin Cells.
Stein, Malcolm; Chen, Chien-Fan; Robles, Daniel J; Rhodes, Christopher; Mukherjee, Partha P
2016-02-01
Research into new and improved materials to be utilized in lithium-ion batteries (LIB) necessitates an experimental counterpart to any computational analysis. Testing of lithium-ion batteries in an academic setting has taken on several forms, but at the most basic level lies the coin cell construction. In traditional LIB electrode preparation, a multi-phase slurry composed of active material, binder, and conductive additive is cast out onto a substrate. An electrode disc can then be punched from the dried sheet and used in the construction of a coin cell for electrochemical evaluation. Utilization of the potential of the active material in a battery is critically dependent on the microstructure of the electrode, as an appropriate distribution of the primary components is crucial to ensuring optimal electrical conductivity, porosity, and tortuosity, such that electrochemical and transport interactions are optimized. Processing steps ranging from the combination of dry powder, wet mixing, and drying can all critically affect the multi-phase interactions that influence the microstructure formation. Electrochemical probing necessitates the construction of electrodes and coin cells with the utmost care and precision. This paper aims at providing a step-by-step guide to non-aqueous electrode processing and coin cell construction for lithium-ion batteries within an academic setting, with emphasis on deciphering the influence of drying and calendering.
Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette
2015-01-01
Background It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders, which makes the technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step in developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. Objective This paper demonstrates the potential of several stakeholder-oriented analysis methods and their practical application, using Infectionmanager as an example case. We aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. Methods We divided business modeling into 4 main research steps. For stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analysis, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Results Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. Conclusions The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic, multidisciplinary, co-creative approach to implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfactory success and uptake of the eHealth technology. PMID:26272510
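As a small numeric illustration of the ranking/analytic hierarchy process step named in the methods (the study's own judgments are not reproduced), stakeholder priorities can be derived from a pairwise comparison matrix via its principal eigenvector:

```python
import numpy as np

# pairwise judgments: how much more important stakeholder i is than j
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(M)
w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
w /= w.sum()                 # normalized priority weights
print(np.round(w, 3))        # roughly [0.65, 0.23, 0.12]
```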
Fundamentals and applications of electrochemistry
NASA Astrophysics Data System (ADS)
McEvoy, A. J.
2013-06-01
The Voltaic pile, invented here on Lake Como 200 years ago, was a crucial step in the development of electrical engineering. For the first time a controlled and reliable source of electric current was available. The science of electrochemistry developed rapidly and is now a key contributor, not just to energy technology but also, for example, to metallurgy and industrial processes. The basic concepts of electrochemistry are presented, with the practical examples of its application in fuel cells, and with the perspective of the history of the subject.
Hybrid finite element and Brownian dynamics method for diffusion-controlled reactions.
Bauler, Patricia; Huber, Gary A; McCammon, J Andrew
2012-04-28
Diffusion is often the rate determining step in many biological processes. Currently, the two main computational methods for studying diffusion are stochastic methods, such as Brownian dynamics, and continuum methods, such as the finite element method. This paper proposes a new hybrid diffusion method that couples the strengths of each of these two methods. The method is derived for a general multidimensional system, and is presented using a basic test case for 1D linear and radially symmetric diffusion systems.
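To illustrate the stochastic half of such a hybrid, the sketch below runs free one-dimensional Brownian dynamics and checks the particle spread against the analytic diffusion result 2Dt; the parameters are arbitrary, and the paper's coupled finite element machinery is not shown.

```python
import numpy as np

D, dt, n_steps, n_part = 1.0, 1e-3, 1000, 100_000
rng = np.random.default_rng(1)

x = np.zeros(n_part)                     # all particles start at the origin
for _ in range(n_steps):
    x += np.sqrt(2.0 * D * dt) * rng.standard_normal(n_part)  # BD update

t = n_steps * dt
print("sample variance:", x.var())       # should approach the analytic value
print("analytic 2*D*t :", 2.0 * D * t)
```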
Performance Limiting Flow Processes in High-State Loading High-Mach Number Compressors
2008-03-13
Technical Background: A strong incentive exists to reduce airfoil count in aircraft engines ... (Advanced Turbine Engine). A basic constraint on blade reduction is seen from the Euler turbine equation, which shows that, although a design can be carried ... on the vane to rotor blade ratio of 8:11). Within the MSU Turbo code, specifying a small number of time steps requires more iteration at each time step.
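For reference, the Euler turbine (work) equation invoked above can be written in its standard turbomachinery form (notation generic, not taken from the report):

```latex
\[
  \dot{W} \;=\; \dot{m}\,\bigl( U_2\, c_{\theta 2} - U_1\, c_{\theta 1} \bigr),
\]
% where U is the blade speed and c_theta the tangential (swirl) velocity at
% rotor inlet (1) and exit (2): for a fixed stage work, fewer airfoils must
% each carry more flow turning, which bounds how far blade count can drop.
```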
Electrophoresis experiments in microgravity
NASA Technical Reports Server (NTRS)
Snyder, Robert S.; Rhodes, Percy H.
1991-01-01
The use of the microgravity environment to separate and purify biological cells and proteins has been a major activity since the beginning of the NASA Microgravity Science and Applications program. Purified populations of cells are needed for research, transplantation and analysis of specific cell constituents. Protein purification is a necessary step in research areas such as genetic engineering where the new protein has to be separated from the variety of other proteins synthesized from the microorganism. Sufficient data are available from the results of past electrophoresis experiments in space to show that these experiments were designed with incomplete knowledge of the fluid dynamics of the process including electrohydrodynamics. However, electrophoresis is still an important separation tool in the laboratory and thermal convection does limit its performance. Thus, there is a justification for electrophoresis but the emphasis of future space experiments must be directed toward basic research with model experiments to understand the microgravity environment and fluid analysis to test the basic principles of the process.
Segmentation of human brain using structural MRI.
Helms, Gunther
2016-04-01
Segmentation of human brain using structural MRI is a key step of processing in imaging neuroscience. The methods have undergone a rapid development in the past two decades and are now widely available. This non-technical review aims at providing an overview and basic understanding of the most common software. Starting with the basis of structural MRI contrast in brain and imaging protocols, the concepts of voxel-based and surface-based segmentation are discussed. Special emphasis is given to the typical contrast features and morphological constraints of cortical and sub-cortical grey matter. In addition to the use for voxel-based morphometry, basic applications in quantitative MRI, cortical thickness estimations, and atrophy measurements as well as assignment of cortical regions and deep brain nuclei are briefly discussed. Finally, some fields for clinical applications are given.
A framework for farmland parcels extraction based on image classification
NASA Astrophysics Data System (ADS)
Liu, Guoying; Ge, Wenying; Song, Xu; Zhao, Hongdan
2018-03-01
It is very important for the government to build an accurate national basic cultivated land database, and farmland parcel extraction is one of the basic steps in this work. In the past, however, people had to spend much time determining whether an area was a farmland parcel or not, since remote sensing images could be understood only through visual interpretation. To overcome this problem, this study proposes a method for extracting farmland parcels by means of image classification. In the proposed method, the farmland areas and ridge areas of the classification map are semantically processed independently and the results are fused to form the final farmland parcels. Experiments on high-spatial-resolution remote sensing images have shown the effectiveness of the proposed method.
Improving preanalytic processes using the principles of lean production (Toyota Production System).
Persoon, Thomas J; Zaleski, Sue; Frerichs, Janice
2006-01-01
The basic technologies used in preanalytic processes for chemistry tests have been mature for a long time, and improvements in preanalytic processes have lagged behind improvements in analytic and postanalytic processes. We describe our successful efforts to improve chemistry test turnaround time from a central laboratory by improving preanalytic processes, using existing resources and the principles of lean production. Our goal is to report 80% of chemistry tests in less than 1 hour and to no longer recognize a distinction between expedited and routine testing. We used principles of lean production (the Toyota Production System) to redesign preanalytic processes. The redesigned preanalytic process has fewer steps and uses 1-piece flow to move blood samples through the accessioning, centrifugation, and aliquoting processes. Median preanalytic processing time was reduced from 29 to 19 minutes, and the laboratory met the goal of reporting 80% of chemistry results in less than 1 hour for 11 consecutive months.
A Survey of Terrestrial Approaches to the Challenge of Lunar Dust Containment
NASA Technical Reports Server (NTRS)
Aguilera, Tatiana; Perry, Jay L.
2009-01-01
Numerous technical challenges exist to successfully extend lunar surface exploration beyond the tantalizing first steps of Apollo. Among these is the challenge of lunar dust intrusion into the cabin environment. Addressing this challenge includes the design of barriers to intrusion as well as techniques for removing the dust from the cabin atmosphere. Opportunities exist for adapting approaches employed in dusty industrial operations and pristine manufacturing environments to cabin environmental quality maintenance applications. A survey of process technologies employed by the semiconductor, pharmaceutical, food processing, and mining industries offers insight into basic approaches that may be suitable for adaptation to lunar surface exploration applications.
A biotechnological project with a gamma radiation source of 100,000 Ci
NASA Astrophysics Data System (ADS)
Lombardo, J. H.; Smolko, E. E.
A project for the production of radiovaccines and other bio-medical products is presented which includes a radiation facility provided with a gamma-ray source equivalent to 100,000 Ci of Co-60. The whole process incorporates novel basic features in the virus production and inactivation steps. The former is carried out in animals previously subjected to immunodepression through electromagnetic radiation. The latter is performed at low temperatures using either electromagnetic or particle radiation. A vaccine manufacturing process is shown to illustrate the utilization of ionizing radiation to obtain a foot-and-mouth disease virus (FMDV) vaccine with good antigenic quality and low cost.
Planning for the next influenza pandemic: using the science and art of logistics.
Cupp, O Shawn; Predmore, Brad G
2011-01-01
The complexities and challenges for healthcare providers and their efforts to provide fundamental basic items to meet the logistical demands of an influenza pandemic are discussed in this article. The supply chain, planning, and alternatives for inevitable shortages are some of the considerations associated with this emergency mass critical care situation. The planning process and support for such events are discussed in detail with several recommendations obtained from the literature and the experience from recent mass casualty incidents (MCIs). The first step in this planning process is the development of specific triage requirements during an influenza pandemic. The second step is identification of logistical resources required during such a pandemic, which are then analyzed within the proposed logistics science and art model for planning purposes. Resources highlighted within the model include allocation and use of work force, bed space, intensive care unit assets, ventilators, personal protective equipment, and oxygen. The third step is using the model to discuss in detail possible workarounds, suitable substitutes, and resource allocation. An examination is also made of the ethics surrounding palliative care within the construction of an MCI and the factors that will inevitably determine rationing and prioritizing of these critical assets to palliative care patients.
ERIC Educational Resources Information Center
Gay, John; And Others
The basic premise of this text is that, in addition to the presentation of basic cognitive and affective information, health education should go one step further by assisting students in developing decision-making skills. The text begins by offering the student a basic foundation of what is meant by health and how this meaning applies to the world,…
Gross, Andreas J; Herrmann, Thomas R W
2007-06-01
The development of laser technology, from the cradle of modern physics in 1900 with Planck to its latest medical frontiers, is an exciting example of how basic physics finds its way into clinical practice. This article credits the protagonists and their contributions to the steps in this development. The competition between the different research groups finally led to the award of the Nobel Prize to Townes, Basov and Prokhorov in 1964 for the scientific basis of quantum electronics, which led to the construction of oscillators and amplifiers based on the laser-maser principle. Forty-three years after Einstein's first theories, Maiman introduced the first ruby laser for commercial use. This marked the key step for laser application and pioneered fruitful cooperation between basic and clinical science. The pioneers of lasers in clinical urology were Parsons in 1966, with studies in canine bladders, and Mulvany in 1968, with experiments in calculi fragmentation. The central technological component for the triumphal procession of lasers in urology is the endoscope. Lasers are therefore now widely used, being the tool of choice in some areas, such as endoscopic lithotripsy of stones or endoluminal organ-preserving tumor ablation. Furthermore, they offer promising alternatives for the treatment of benign prostatic hyperplasia.
Steps Toward Effective Production of Speech (STEPS): No. 7--How to Take Care of Glasses.
ERIC Educational Resources Information Center
Sheeley, Eugene C.; McQuiddy, Doris
This guide, one of a series of booklets developed by Project STEPS (Steps Toward Effective Production of Speech), presents guidelines for parents of deaf-blind children regarding the care of eyeglasses. Basic concerns with glasses and contact lenses are noted and parents are advised to perform the following daily tasks: checking the frames,…
Porous single-phase NiTi processed under Ca reducing vapor for use as a bone graft substitute.
Bertheville, Bernard
2006-03-01
Porous nickel-titanium alloys (NiTi, nitinol) have recently attracted attention in clinical surgery because they are a very interesting alternative to the more brittle and less machinable conventional porous Ca-based ceramics. The main remaining limitations come from the chemical inhomogeneity of the as-processed porous nickel-titanium alloys, which always contain undesired secondary Ti- and Ni-rich phases. These are known to weaken the NiTi products, to favor their cavitation corrosion and to decrease their biocompatibility. Elemental nickel must also be avoided because it can give rise to several adverse tissue reactions. Therefore, the synthesis of porous single-phase NiTi alloys using a basic single-step sintering procedure is an important step towards the processing of safe implant materials. The sintering process used in this work is based on a vapor-phase calciothermic reduction operating during the NiTi compound formation. The as-processed porous nickel-titanium microstructure is single-phase and shows a uniformly open pore distribution, with a porosity of about 53% and pore diameters in the range of 20-100 µm. Furthermore, due to the process, fine CaO layers grow on the NiTi outer and inner surfaces, acting as possible promoting agents for the ingrowth of bone cells at the implantation site.
Dubois, Eline Agnès; Franson, Kari Lanette
2009-09-01
Basic sciences can be integrated into the medical school curriculum via e-learning. The process of integrating a basic science in this manner resembles a curricular change. The change usually begins with an idea for using e-learning to teach a basic science and establishing the need for the innovation. In the planning phase, learning outcomes are formulated and a prototype of the program is developed based on the desired requirements. A realistic concept is formed after considering the limitations of the current institute. Next, a project team is assembled to develop the program and plan its integration. Incorporation of the e-learning program is facilitated by a well-developed and communicated integration plan. Various course coordinators are contacted to determine content of the e-learning program as well as establish assessment. Linking the e-learning program to existing course activities and thereby applying the basic science into the clinical context enhances the degree of integration. The success of the integration is demonstrated by a positive assessment of the program including favourable cost-benefit analysis and improved student performance. Lastly, when the program becomes institutionalised, continuously updating content and technology (when appropriate), and evaluating the integration contribute to the prolonged survival of the e-learning program.
Thermal Model Development for Ares I-X
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; DelCorso, Joe
2008-01-01
Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group, and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols, and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step procedure to efficiently integrate one model into another. Model logic was used extensively to create scenarios and timelines for avionics and airflow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.
A Fuzzy Goal Programming for a Multi-Depot Distribution Problem
NASA Astrophysics Data System (ADS)
Nunkaew, Wuttinan; Phruksaphanrat, Busaba
2010-10-01
A fuzzy goal programming model for solving a Multi-Depot Distribution Problem (MDDP) is proposed in this research. The proposed model is applied in the first step of the Assignment First-Routing Second (AFRS) approach. In practice, a basic transportation model is first chosen to solve this kind of problem in the assignment step, after which the Vehicle Routing Problem (VRP) model is used to compute the delivery cost in the routing step. However, the basic transportation model considers only the depot-to-customer relationship; the customer-to-customer relationship should also be considered, since it comes into play in the routing step. Both relationships are handled here using Preemptive Fuzzy Goal Programming (P-FGP). The first fuzzy goal is set on the total transportation cost and the second on a satisfactory level of the overall independence value. A case study is used to describe the effectiveness of the proposed model. Results from the proposed model are compared with the basic transportation model that had previously been used in the company. The proposed model can reduce the actual delivery cost in the routing step owing to the better result in the assignment step. Defining fuzzy goals by membership functions is more realistic than using crisp values. Furthermore, the flexibility to adjust goals and the satisfaction level acceptable to the decision maker is increased, and an optimal solution can still be obtained.
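For concreteness, a linear membership function of the kind typically used for such fuzzy goals can be written as follows (symbols generic, not taken from the paper); for the cost goal, satisfaction falls from 1 to 0 as the goal value Z overshoots its aspiration level g by more than the tolerance t:

```latex
\[
  \mu(Z) \;=\;
  \begin{cases}
    1, & Z \le g, \\[2pt]
    \dfrac{(g + t) - Z}{t}, & g < Z < g + t, \\[6pt]
    0, & Z \ge g + t.
  \end{cases}
\]
```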
Intrafirm planning and mathematical modeling of owner's equity in industrial enterprises
NASA Astrophysics Data System (ADS)
Ponomareva, S. V.; Zheleznova, I. V.
2018-05-01
The article aims to review different approaches to intrafirm planning of owner's equity in industrial enterprises. Since charter capital, additional capital, and reserve capital do not change in the course of enterprise activity, the main interest lies in share repurchases from shareholders and retained earnings within the owner's equity of the enterprise. To study the effect of share repurchases on the activities of the enterprise, we use mathematical methods such as event studies and econometric modeling. The article describes a step-by-step algorithm for carrying out an event study and justifies the choice of a Logit model for the econometric analysis. It presents the basic results of a regression analysis of the effect of share repurchases on key financial indicators in industrial enterprises.
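One common form of event study uses market-model abnormal returns; whether the article uses exactly this form is not stated, so the following numpy sketch with synthetic returns only illustrates the general technique.

```python
import numpy as np

# Illustrative market-model event study on synthetic daily returns.
rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 140)                # market returns
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.005, 140)

est, event = slice(0, 120), slice(120, 140)           # estimation / event windows
beta, alpha = np.polyfit(market[est], stock[est], 1)  # OLS market model

abnormal = stock[event] - (alpha + beta * market[event])
car = abnormal.cumsum()                               # cumulative abnormal return
print(f"alpha={alpha:.5f}  beta={beta:.3f}  CAR={car[-1]:.4f}")
```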
Development of a Launch Vehicle Manufacturing Process. Chapter 4
NASA Technical Reports Server (NTRS)
Vickers, John; Munafo, Paul M. (Technical Monitor)
2002-01-01
One of the goals of this chapter is to provide sufficient information so that you can develop a manufacturing process for a potential launch vehicle. With the variety of manufacturing options available, you might ask how this can possibly be done in the span of a single chapter. Actually, it will be quite simple because a basic manufacturing process is nothing more than a set of logical steps that are iterated until they produce a desired product. Although these statements seem simple and logical, don't let this simplicity fool you. Manufacturing problems with launch vehicles and their subassemblies have been the primary cause of project failures because the vehicle concept delivered to the manufacturing floor could not be built as designed.
Applying Machine Learning to Star Cluster Classification
NASA Astrophysics Data System (ADS)
Fedorenko, Kristina; Grasha, Kathryn; Calzetti, Daniela; Mahadevan, Sridhar
2016-01-01
Catalogs describing populations of star clusters are essential in investigating a range of important issues, from star formation to galaxy evolution. Star cluster catalogs are typically created in a two-step process: in the first step, a catalog of sources is automatically produced; in the second step, each of the extracted sources is visually inspected by 3-to-5 human classifiers and assigned a category. Classification by humans is labor-intensive and time-consuming; it creates a bottleneck and substantially slows down progress in star cluster research. We seek to automate the process of labeling star clusters (the second step) by applying supervised machine learning techniques. This will provide fast, objective, and reproducible classification. Our data are HST (WFC3 and ACS) images of galaxies in the distance range of 3.5-12 Mpc, with a few thousand star clusters already classified by humans as part of the LEGUS (Legacy ExtraGalactic UV Survey) project. The classification is based on four labels (Class 1 - symmetric, compact cluster; Class 2 - concentrated object with some degree of asymmetry; Class 3 - multiple-peak system, diffuse; and Class 4 - spurious detection). We start by looking at basic machine learning methods such as decision trees. We then proceed to evaluate the performance of more advanced techniques, focusing on convolutional neural networks and other deep learning methods. We analyze the results and suggest several directions for further improvement.
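A minimal version of the "basic machine learning methods such as decision trees" step can be sketched with scikit-learn; the features below are synthetic stand-ins, since the real LEGUS inputs are image cutouts rather than tabular features.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for per-source features (e.g. concentration index,
# asymmetry, number of peaks); purely illustrative.
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 3))
y = rng.integers(1, 5, size=2000)        # classes 1-4 as in LEGUS

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))  # ~chance on random data
```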
Defining care products to finance health care in the Netherlands.
Westerdijk, Machiel; Zuurbier, Joost; Ludwig, Martijn; Prins, Sarah
2012-04-01
A case-mix project started in the Netherlands with the primary goal of defining a complete set of health care products for hospitals. The definition of the product structure was completed 4 years later, and the results are currently being used for billing purposes. This paper focuses on the methodology and techniques that were developed and applied in order to define the case-mix product structure. The central research question was how to develop a manageable product structure, i.e., a limited set of hospital products with acceptable cost homogeneity. For this purpose, a data warehouse with approximately 1.5 million patient records from 27 hospitals was built up over a period of 3 years. The data associated with each patient consist of a large number of a priori independent parameters describing the resource utilization in different stages of the treatment process, e.g., activities in the operating theatre, the lab, and the radiology department. Because of the complexity of the database, it was necessary to apply advanced data analysis techniques. The full analysis process, which starts from the database and ends with a product definition, consists of four basic analysis steps. Each of these steps has revealed interesting insights; this paper describes each step in some detail and presents its major results. The final structure consists of 687 product groups for 24 medical specialties, used for billing purposes.
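The paper's four analysis steps are not spelled out here, but one plausible ingredient, grouping resource-utilization profiles and checking the cost homogeneity of each group, can be sketched generically. The data, cluster count, and use of k-means are all assumptions for illustration, not the project's actual method.

```python
import numpy as np
from sklearn.cluster import KMeans

# Group synthetic patient records by resource-utilization profiles and
# check cost homogeneity per group via the coefficient of variation (CV).
rng = np.random.default_rng(7)
profiles = rng.gamma(2.0, 1.0, size=(5000, 6))   # invented utilization data
costs = profiles @ np.array([200, 150, 80, 300, 50, 120.0])

groups = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(profiles)
for g in range(8):
    c = costs[groups == g]
    print(f"group {g}: n={c.size:4d}  cost CV={c.std() / c.mean():.2f}")
```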
Contributions of Basic Sciences to Science of Education. Studies in Educational Administration.
ERIC Educational Resources Information Center
Lall, Bernard M.
The science of education has been influenced by the basic sciences to the extent that educational research now has been able to modernize its approach by accepting and using the basic scientific methodology and experimental techniques. Using primarily the same steps of scientific investigations, education today holds a place of much greater esteem…
ERIC Educational Resources Information Center
Evaluation and Training Inst., Los Angeles, CA.
This handbook was produced as a result of a project that studied California community college programs that teach basic skills in vocational education programs. The project included a literature review, a telephone survey, and 12 site visits. The handbook contains four sections: (1) steps for integrating basic skills and vocational instruction;…
A sandpile model of grain blocking and consequences for sediment dynamics in step-pool streams
NASA Astrophysics Data System (ADS)
Molnar, P.
2012-04-01
Coarse grains (cobbles to boulders) are set in motion in steep mountain streams by floods with sufficient energy to erode the particles locally and transport them downstream. During transport, grains are often blocked and form width-spanning structures called steps, separated by pools. The step-pool system is a transient, self-organizing and self-sustaining structure. The temporary storage of sediment in steps and the release of that sediment in avalanche-like pulses when steps collapse lead to complex nonlinear threshold-driven dynamics in sediment transport, which have been observed in laboratory experiments (e.g., Zimmermann et al., 2010) and in the field (e.g., Turowski et al., 2011). The basic question in this paper is whether the emergent statistical properties of sediment transport in step-pool systems may be linked to the transient state of the bed, i.e. sediment storage and morphology, and to the dynamics of sediment input. The hypothesis is that this state, in which sediment-transporting events due to the collapse and rebuilding of steps of all sizes occur, is analogous to a critical state in self-organized open dissipative dynamical systems (Bak et al., 1988). To explore the process of self-organization, a cellular automaton sandpile model is used to simulate the processes of grain blocking and hydraulically-driven step collapse in a 1-d channel. Particles are injected at the top of the channel and are allowed to travel downstream based on various local threshold rules, with the travel distance drawn from a chosen probability distribution. In sandpile modelling this is a simple 1-d limited non-local model; however, it has been shown to have nontrivial dynamical behaviour (Kadanoff et al., 1989), and it captures the essence of stochastic sediment transport in step-pool systems. The numerical simulations are used to illustrate the differences between input and output sediment transport rates, mainly focussing on the magnification of intermittency and variability in the system response by the processes of grain blocking and step collapse. The temporal correlation in input and output rates and the number of grains stored in the system at any given time are quantified by spectral analysis and statistics of long-range dependence. Although the model is only a conceptual representation of the real processes of step formation and collapse, connections will be made between the modelling results and some field and laboratory data on step-pool systems. The main focus of the discussion will be to demonstrate how, even in such a simple model, the processes of grain blocking and step collapse may impact the sediment transport rates to the point that certain changes in input are no longer visible, along the lines of "shredding the signals" proposed by Jerolmack and Paola (2010). The consequences are that the notions of stability and equilibrium, the attribution of cause and effect, and the timescales of process and form in step-pool systems, and perhaps in many other fluvial systems, may have very limited applicability.
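A minimal sketch of such a 1-d limited non-local sandpile, with grain blocking, a storage threshold, and geometrically distributed travel distances (all parameter choices invented, not the paper's), might look like the following; despite a constant input of one grain per step, the output flux is intermittent and pulsed.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, threshold, n_steps = 100, 4, 5000
storage = np.zeros(n_cells, dtype=int)     # grains stored in each step/pool cell
output = np.zeros(n_steps, dtype=int)      # grains leaving the reach per time step

for t in range(n_steps):
    storage[0] += 1                        # constant input: one grain per step
    for i in range(n_cells):
        if storage[i] > threshold:         # local threshold exceeded: step collapses
            pulse, storage[i] = storage[i], 0
            j = i + rng.geometric(p=0.5)   # random downstream travel distance
            if j < n_cells:
                storage[j] += pulse        # grains blocked again, rebuilding a step
            else:
                output[t] += pulse         # pulse exits the reach
print("mean output rate:", output.mean(), "grains still stored:", storage.sum())
```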
Multivariate assessment of event-related potentials with the t-CWT method.
Bostanov, Vladimir
2015-11-05
Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
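The core idea, wavelet-transforming single-trial ERPs and locating the time/scale point where Student's t best separates two conditions, can be sketched as follows. This is a conceptual illustration on synthetic data, not the published t-CWT implementation (which the authors distribute as MATLAB/Octave code).

```python
import numpy as np
from scipy.stats import ttest_ind

def ricker(points, a):
    # Ricker ("Mexican hat") wavelet with the usual normalization
    t = np.arange(points) - (points - 1) / 2
    A = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return A * (1 - (t / a) ** 2) * np.exp(-(t / a) ** 2 / 2)

def cwt_row(x, a):
    # one row of a continuous wavelet transform at scale a
    return np.convolve(x, ricker(min(10 * a, len(x)), a), mode="same")

rng = np.random.default_rng(0)
n_trials, n_samples = 40, 256
cond_a = rng.normal(size=(n_trials, n_samples))     # condition A single trials
cond_b = rng.normal(size=(n_trials, n_samples))     # condition B single trials
cond_b[:, 100:140] += 0.8                           # a simulated ERP difference

t_max, best = 0.0, None
for a in (2, 4, 8, 16):                             # scales to scan
    wa = np.array([cwt_row(x, a) for x in cond_a])
    wb = np.array([cwt_row(x, a) for x in cond_b])
    t = ttest_ind(wa, wb, axis=0).statistic         # t-value per time point
    i = int(np.abs(t).argmax())
    if abs(t[i]) > t_max:
        t_max, best = abs(t[i]), (a, i)
print(f"max |t| = {t_max:.2f} at (scale, sample) = {best}")
```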
Long-term effects of cannabis on oculomotor function in humans.
Huestegge, L; Radach, R; Kunert, H J
2009-08-01
Cannabis is known to affect human cognitive and visuomotor skills directly after consumption. Some studies even point to rather long-lasting effects, especially after chronic tetrahydrocannabinol (THC) abuse. However, it is still unknown whether long-term effects on basic visual and oculomotor processing may exist. In the present study, the performance of 20 healthy long-term cannabis users without acute THC intoxication and 20 control subjects were examined in four basic visuomotor paradigms to search for specific long-term impairments. Subjects were asked to perform: 1) reflexive saccades to visual targets (prosaccades), including gap and overlap conditions, 2) voluntary antisaccades, 3) memory-guided saccades and 4) double-step saccades. Spatial and temporal parameters of the saccades were subsequently analysed. THC subjects exhibited a significant increase of latency in the prosaccade and antisaccade tasks, as well as prolonged saccade amplitudes in the antisaccade and memory-guided task, compared with the control subjects. The results point to substantial and specific long-term deficits in basic temporal processing of saccades and impaired visuo-spatial working memory. We suggest that these impairments are a major contributor to degraded performance of chronic users in a vital everyday task like visual search, and they might potentially also affect spatial navigation and reading.
Components of the Engulfment Machinery Have Distinct Roles in Corpse Processing
Meehan, Tracy L.; Joudi, Tony F.; Timmons, Allison K.; Taylor, Jeffrey D.; Habib, Corey S.; Peterson, Jeanne S.; Emmanuel, Shanan; Franc, Nathalie C.; McCall, Kimberly
2016-01-01
Billions of cells die in our bodies on a daily basis and are engulfed by phagocytes. Engulfment, or phagocytosis, can be broken down into five basic steps: attraction of the phagocyte, recognition of the dying cell, internalization, phagosome maturation, and acidification. In this study, we focus on the last two steps, which can collectively be considered corpse processing, in which the engulfed material is degraded. We use the Drosophila ovarian follicle cells as a model for engulfment of apoptotic cells by epithelial cells. We show that engulfed material is processed using the canonical corpse processing pathway involving the small GTPases Rab5 and Rab7. The phagocytic receptor Draper is present on the phagocytic cup and on nascent, phosphatidylinositol 3-phosphate (PI(3)P)- and Rab7-positive phagosomes, whereas integrins are maintained on the cell surface during engulfment. Due to the difference in subcellular localization, we investigated the role of Draper, integrins, and downstream signaling components in corpse processing. We found that some proteins were required for internalization only, while others had defects in corpse processing as well. This suggests that several of the core engulfment proteins are required for distinct steps of engulfment. We also performed double mutant analysis and found that combined loss of draper and αPS3 still resulted in a small number of engulfed vesicles. Therefore, we investigated another known engulfment receptor, Crq. We found that loss of all three receptors did not inhibit engulfment any further, suggesting that Crq does not play a role in engulfment by the follicle cells. A more complete understanding of how the engulfment and corpse processing machinery interact may enable better understanding and treatment of diseases associated with defects in engulfment by epithelial cells. PMID:27347682
Farrugia, Kevin J; Deacon, Paul; Fraser, Joanna
2014-03-01
There are a number of studies discussing recent developments of one-step fluorescent cyanoacrylate processes. This study is a pseudo-operational trial comparing an example of a one-step fluorescent cyanoacrylate product, Lumicyano™, with the two recommended techniques for plastic carrier bags: cyanoacrylate fuming followed by basic yellow 40 (BY40) dyeing, and powder suspensions. 100 plastic carrier bags were collected from the place of work, and the items were treated as found, without any additional fingermark deposition. The bags were split into three groups, and after treatment the three techniques detected comparable numbers of fingermarks (an average of 300 each). The items treated with Lumicyano™ were sequentially processed with BY40, and an additional 43 new fingermarks were detected. Lumicyano™ appears to be a suitable technique for the development of fingermarks on plastic carrier bags, and it can help save lab space and time as it does not require dyeing or drying procedures. Furthermore, contrary to other one-step cyanoacrylate products, existing cyanoacrylate cabinets do not require any modification for the treatment of articles with Lumicyano™. To date, there are few peer-reviewed articles in the literature on trials of Lumicyano™, and this study aims to help fill this gap. © 2013.
Nojavan, Saeed; Moharami, Arezoo; Fakhari, Ali Reza
2012-08-01
In this work, a two-step hollow-fiber-based liquid-phase microextraction procedure was evaluated for extraction of the zwitterionic cetirizine (CTZ) and the basic hydroxyzine (HZ) from human plasma. In the first extraction step, the sample pH was adjusted to 5.0 in order to promote liquid-phase microextraction of the zwitterionic CTZ. In the second step, the sample pH was raised to 11.0 for extraction of the basic HZ. In this procedure, the extraction times for the first and second steps were 30 and 20 min, respectively. Owing to the high ratio between the volumes of the donor and acceptor phases, CTZ and HZ were enriched by factors of 280 and 355, respectively. The linearity of the analytical method was verified for both compounds in the range of 10-500 ng mL(-1) (R(2) > 0.999). The limit of quantification (S/N = 10) was 10 ng mL(-1) for both CTZ and HZ, while the limit of detection (S/N = 3) was 3 ng mL(-1) for both compounds. Intraday and interday relative standard deviations (RSDs, n = 6) were in the range of 6.5-16.2%. This procedure enabled CTZ and HZ to be analyzed simultaneously by capillary electrophoresis. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
MolIDE: a homology modeling framework you can click with.
Canutescu, Adrian A; Dunbrack, Roland L
2005-06-15
Molecular Integrated Development Environment (MolIDE) is an integrated application designed to provide homology modeling tools and protocols under a uniform, user-friendly graphical interface. Its main purpose is to combine the most frequent modeling steps in a semi-automatic, interactive way, guiding the user from the target protein sequence to the final three-dimensional protein structure. The typical basic homology modeling process is composed of building sequence profiles of the target sequence family, secondary structure prediction, sequence alignment with PDB structures, assisted alignment editing, side-chain prediction and loop building. All of these steps are available through a graphical user interface. MolIDE's user-friendly and streamlined interactive modeling protocol allows the user to focus on the important modeling questions while hiding the raw data generation and conversion steps. MolIDE was designed from the ground up as an open-source, cross-platform, extensible framework, which allows developers to integrate additional third-party programs into MolIDE. http://dunbrack.fccc.edu/molide/molide.php rl_dunbrack@fccc.edu.
Generalized Models for Rock Joint Surface Shapes
Du, Shigui; Hu, Yunjin; Hu, Xiaofei
2014-01-01
Generalized models of joint surface shapes are the foundation for mechanism studies on the mechanical effects of rock joint surface shapes. Based on extensive field investigations of rock joint surface shapes, generalized models for three levels of shape, named macroscopic outline, surface undulating shape, and microcosmic roughness, were established through statistical analyses of 20,078 rock joint surface profiles. The relative amplitude of the profile curves was used as the borderline dividing the different levels of shape. The results show that the macroscopic outline has three basic forms (planar, arc-shaped, and stepped); the surface undulating shape has three basic forms (planar, undulating, and stepped); and the microcosmic roughness has two basic forms (smooth and rough). PMID:25152901
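Assuming "relative amplitude" means the peak-to-trough height of the detrended profile divided by its projected length (the paper's exact definition is not given here), a small numeric illustration could be:

```python
import numpy as np

# Hypothetical relative amplitude of a joint surface profile; the profile
# and the definition used are assumptions for illustration.
x = np.linspace(0.0, 2.0, 500)                     # profile length (m)
y = 0.02 * np.sin(4 * np.pi * x) + 0.005 * x       # synthetic profile heights

detrended = y - np.polyval(np.polyfit(x, y, 1), x) # remove the mean trend line
relative_amplitude = (detrended.max() - detrended.min()) / (x[-1] - x[0])
print(f"relative amplitude = {relative_amplitude:.4f}")
```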
Boisvert, Maude; Bouchard-Lévesque, Véronique; Fernandes, Sandra
2014-01-01
Nuclear targeting of capsid proteins (VPs) is important for genome delivery and precedes assembly in the replication cycle of porcine parvovirus (PPV). Clusters of basic amino acids, corresponding to potential nuclear localization signals (NLS), were found only in the unique region of VP1 (VP1up, for VP1 unique part). Of the five identified basic regions (BR), three were important for nuclear localization of VP1up: BR1 was a classic Pat7 NLS, and the combination of BR4 and BR5 was a classic bipartite NLS. These NLS were essential for viral replication. VP2, the major capsid protein, lacked these NLS and contained no region with more than two basic amino acids in proximity. However, three regions of basic clusters were identified in the folded protein, assembled into a trimeric structure. Mutagenesis experiments showed that only one of these three regions was involved in VP2 transport to the nucleus. This structural NLS, termed the nuclear localization motif (NLM), is located inside the assembled capsid and thus can be used to transport trimers to the nucleus in late steps of infection but not for virions in initial infection steps. The two NLS of VP1up are located in the N-terminal part of the protein, externalized from the capsid during endosomal transit, exposing them for nuclear targeting during early steps of infection. Globally, the determinants of nuclear transport of structural proteins of PPV were different from those of closely related parvoviruses. IMPORTANCE: Most DNA viruses use the nucleus for their replication cycle. Thus, structural proteins need to be targeted to this cellular compartment at two distinct steps of the infection: in early steps to deliver viral genomes to the nucleus and in late steps to assemble new viruses. Nuclear targeting of proteins depends on the recognition of a stretch of basic amino acids by cellular transport proteins. This study reports the identification of two classic nuclear localization signals in the minor capsid protein (VP1) of porcine parvovirus. The major protein (VP2) nuclear localization was shown to depend on a complex structural motif. This motif can be used as a strategy by the virus to avoid transport of incorrectly folded proteins and to selectively import assembled trimers into the nucleus. Structural nuclear localization motifs can also be important for nuclear proteins without a classic basic amino acid stretch, including multimeric cellular proteins. PMID:25078698
Reporting Experiments in Homeopathic Basic Research (REHBaR).
Stock-Schröer, Beate
2015-10-01
The aim of this study was to develop a criteria catalogue serving as a guideline for authors to improve the quality of Reporting Experiments in Homeopathic Basic Research (REHBaR). The main focus was on biochemical and biological experiments. So far, no guideline had been available for scientists and authors in this field, unlike the criteria catalogues common in clinical research. A Delphi process was conducted among experts who had published experimental work in this field within the last five years. The process included a total of five rounds: three rounds of adjusting and phrasing plus two consensus conferences. A checklist of 23 items was produced, augmented with detailed examples of how to handle each item while compiling a publication. Background, objectives, and possible hypotheses must be given in the 'introduction'. The 'materials and methods' section is the most important part, where a detailed description of the chosen controls, object of investigation, experimental setup, replication, parameters, intervention, allocation, blinding, and statistical methods is mandatory. The 'results' section needs sufficient details on the analysed data, descriptive as well as inferential. Moreover, authors should discuss their results and interpret them in the context of current evidence. REHBaR was compiled for authors preparing their manuscripts and for use by scientific journals in the reviewing process. Reporting experiments in basic research in homeopathy is an important issue for establishing the quality and validity of the results gained. A guideline for REHBaR seemed to be the first step towards a commitment on what information needs to be given in a paper. More than that, the catalogue can serve as a statement of what the standards in good basic research should be. Copyright © 2015 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.
Impact of modellers' decisions on hydrological a priori predictions
NASA Astrophysics Data System (ADS)
Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.
2013-07-01
The purpose of this paper is to stimulate a re-thinking of how we, the catchment hydrologists, could become reliable forecasters. A group of catchment modellers predicted the hydrological response of a man-made 6 ha catchment in its initial phase (Chicken Creek) without having access to the observed records. They used conceptually different model families, and their modelling experience differed largely. The prediction exercise was organized in three steps: (1) for the 1st prediction, modellers received a basic data set describing the internal structure of the catchment (somewhat more complete than usually available for a priori predictions in ungauged catchments); they did not obtain time series of stream flow, soil moisture or groundwater response. (2) Before the 2nd, improved prediction they inspected the catchment on-site and attended a workshop where the modellers presented and discussed their first attempts. (3) For their improved 3rd prediction they were offered additional data by charging them pro forma with the costs of obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step 1. Here, we detail the modellers' decisions in accounting for the various processes based on what they learned during the field visit (step 2) and add the final outcome of step 3, when the modellers made use of additional data. We document the prediction progress as well as the learning process resulting from the availability of added information. For the 2nd and 3rd steps, the progress in prediction quality could be evaluated in relation to individual modelling experience and the costs of added information. We learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as added data for improving the parameters needed to satisfy model requirements.
SIGKit: a New Data-based Software for Learning Introductory Geophysics
NASA Astrophysics Data System (ADS)
Zhang, Y.; Kruse, S.; George, O.; Esmaeili, S.; Papadimitrios, K. S.; Bank, C. G.; Cadmus, A.; Kenneally, N.; Patton, K.; Brusher, J.
2016-12-01
Students of diverse academic backgrounds take introductory geophysics courses to learn the theory of a variety of measurement and analysis methods, with the expectation of being able to apply their basic knowledge to real data. Ideally, such data are collected in field courses and also used in lecture-based courses, because they provide a critical context for better learning and understanding of geophysical methods. Each method requires a separate software package for the data processing steps, and the complexity and variety of professional software make the path from data processing to data interpretation a strenuous learning process for students and a challenging teaching task for instructors. SIGKit (Student Investigation of Geophysics Toolkit), being developed as a collaboration between the University of South Florida, the University of Toronto, and MathWorks, intends to address these shortcomings by showing the most essential processing steps and allowing students to visualize the underlying physics of the various methods. It is based on MATLAB software, offered as an easy-to-use graphical user interface, and packaged so it can run as an executable in the classroom and the field, even on computers without MATLAB licenses. An evaluation of the software based on student feedback from focus-group interviews and think-aloud observations helps drive its development and refinement. The toolkit provides a logical gateway into the more sophisticated and costly software students will encounter later in their training and careers by combining essential visualization, modeling, processing, and analysis steps for seismic, GPR, magnetics, gravity, resistivity, and electromagnetic data.
Basic Information about EPA ExpoBox
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases,
Firnkorn, D; Ganzinger, M; Muley, T; Thomas, M; Knaup, P
2015-01-01
Joint data analysis is a key requirement in medical research networks. Data are available in heterogeneous formats at each network partner, and their harmonization is often rather complex. The objective of our paper is to provide a generic approach for the harmonization process in research networks. We applied the process when harmonizing data from three sites for the Lung Cancer Phenotype Database within the German Center for Lung Research. We developed a spreadsheet-based solution as a tool to support the harmonization process for lung cancer data, and a data integration procedure based on Talend Open Studio. The harmonization process consists of eight steps describing a systematic approach for defining and reviewing source data elements and standardizing common data elements. The steps for defining common data elements and harmonizing them with local data definitions are repeated until consensus is reached. Application of this process for building the phenotype database led to a common basic data set on lung cancer with 285 structured parameters. The Lung Cancer Phenotype Database was realized as an i2b2 research data warehouse. Data harmonization is a challenging task requiring informatics skills as well as domain knowledge. Our approach facilitates data harmonization by providing guidance through a uniform process that can be applied in a wide range of projects.
ERIC Educational Resources Information Center
Patton, Patricia L.; And Others
This Spanish version of "How to Work and Live in the Real World: Basic Steps for Youth with Handicaps and Their Parents and Teachers" is for young people with handicaps who are getting ready to graduate from high school and begin working and living in the adult world. The booklet places a special focus on individuals with cultural…
Tryptophan depletion decreases the recognition of fear in female volunteers.
Harmer, C J; Rogers, R D; Tunbridge, E; Cowen, P J; Goodwin, G M
2003-06-01
Serotonergic processes have been implicated in the modulation of fear conditioning in humans, postulated to occur at the level of the amygdala. The processing of other fear-relevant cues, such as facial expressions, has also been associated with amygdala function, but an effect of serotonin depletion on these processes had not been assessed. The present study investigated the effects of reducing serotonin function, using acute tryptophan depletion, on the recognition of basic facial expressions of emotion in healthy male and female volunteers. A double-blind between-groups design was used, with volunteers randomly allocated to receive an amino acid drink specifically lacking tryptophan or a control drink containing a balanced mixture of amino acids. Participants were given a facial expression recognition task 5 h after drink administration. This task featured examples of six basic emotions (fear, anger, disgust, surprise, sadness and happiness) that had been morphed between each full emotion and neutral in 10% steps. As a control, volunteers were given a famous-face classification task matched in terms of response selection and difficulty level. Tryptophan depletion significantly impaired the recognition of fearful facial expressions in female, but not male, volunteers. This effect was specific, since recognition of the other basic emotions was comparable in the two groups. There was also no effect of tryptophan depletion on the classification of famous faces or on subjective state ratings of mood or anxiety. These results confirm a role for serotonin in the processing of fear-related cues and, in line with previous findings, suggest greater effects of tryptophan depletion in female volunteers. Although acute tryptophan depletion does not typically affect mood in healthy subjects, the present results suggest that subtle changes in the processing of emotional material may occur with this manipulation of serotonin function.
Solving the puzzle of pluripotent stem cell-derived cardiomyocyte maturation: piece by piece.
Lundy, David J; Lee, Desy S; Hsieh, Patrick C H
2017-03-01
There is a growing need for in vitro models which can serve as platforms for drug screening and basic research. Human adult cardiomyocytes cannot be readily obtained or cultured, and so pluripotent stem cell-derived cardiomyocytes appear to be an attractive option. Unfortunately, these cells are structurally and functionally immature, more comparable to foetal than adult cardiomyocytes. A recent study by Ruan et al. provides new insights into accelerating the maturation process and takes us a step closer to solving the puzzle of pluripotent stem cell-derived cardiomyocyte maturation.
The spinal cord: a review of functional neuroanatomy.
Bican, Orhan; Minagar, Alireza; Pruitt, Amy A
2013-02-01
The spinal cord controls the voluntary muscles of the trunk and limbs and receives sensory input from these areas. It extends from the medulla oblongata to the lower border of the first lumbar vertebra. A basic knowledge of spinal cord anatomy is essential for interpretation of clinical signs and symptoms and for understanding of pathologic processes involving the spinal cord. In this article, anatomic structures are correlated with relevant clinical signs and symptoms and a step-wise approach to spinal cord diagnosis is outlined. Copyright © 2013 Elsevier Inc. All rights reserved.
Parametric Modeling as a Technology of Rapid Prototyping in Light Industry
NASA Astrophysics Data System (ADS)
Tomilov, I. N.; Grudinin, S. N.; Frolovsky, V. D.; Alexandrov, A. A.
2016-04-01
The paper deals with the parametric modeling method of virtual mannequins for the purposes of design automation in the clothing industry. The described approach includes the steps of generating a basic model from the initial one (obtained by 3D scanning), parameterizing it, and deforming it. The complex surfaces are represented by a wireframe model. The modeling results are evaluated with a set of similarity factors, and the deformed models are compared with their virtual prototypes. The results of the modeling are estimated by the standard deviation factor.
NASA Technical Reports Server (NTRS)
Hazelton, Lyman R., Jr.
1990-01-01
Some of the logical components of a rule based planning and scheduling system are described. The researcher points out a deficiency in the conventional truth maintenance approach to this class of problems and suggests a new mechanism which overcomes the problem. This extension of the idea of justification truth maintenance may seem at first to be a small philosophical step. However, it embodies a process of basic human reasoning which is so common and automatic as to escape conscious detection without careful introspection. It is vital to any successful implementation of a rule based planning reasoner.
Conversion of ammonia into hydrogen and nitrogen by reaction with a sulfided catalyst
Matthews, Charles W.
1977-01-01
A method is provided for removing ammonia from the sour water stream of a coal gasification process. The basic steps comprise stripping the ammonia from the sour water; heating the stripped ammonia to a temperature between 400° and 1,000°F; passing the gaseous ammonia through a reactor containing a sulfided catalyst to produce elemental hydrogen and nitrogen; and scrubbing the reaction product to obtain an ammonia-free gas. The residual equilibrium ammonia produced by the reactor is recycled into the stripper. The ammonia-free gas may be advantageously treated in a Claus process to recover elemental sulfur. Iron sulfide or cobalt molybdenum sulfide catalysts are used.
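For reference, the overall stoichiometry of the catalytic decomposition step is:

```latex
2\,\mathrm{NH_3} \;\longrightarrow\; \mathrm{N_2} + 3\,\mathrm{H_2}
```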
Constraints on Fluctuations in Sparsely Characterized Biological Systems.
Hilfinger, Andreas; Norman, Thomas M; Vinnicombe, Glenn; Paulsson, Johan
2016-02-05
Biochemical processes are inherently stochastic, creating molecular fluctuations in otherwise identical cells. Such "noise" is widespread but has proven difficult to analyze because most systems are sparsely characterized at the single cell level and because nonlinear stochastic models are analytically intractable. Here, we exactly relate average abundances, lifetimes, step sizes, and covariances for any pair of components in complex stochastic reaction systems even when the dynamics of other components are left unspecified. Using basic mathematical inequalities, we then establish bounds for whole classes of systems. These bounds highlight fundamental trade-offs that show how efficient assembly processes must invariably exhibit large fluctuations in subunit levels and how eliminating fluctuations in one cellular component requires creating heterogeneity in another.
Calculating with light using a chip-scale all-optical abacus.
Feldmann, J; Stegmaier, M; Gruhler, N; Ríos, C; Bhaskaran, H; Wright, C D; Pernice, W H P
2017-11-02
Machines that simultaneously process and store multistate data at one and the same location can provide a new class of fast, powerful and efficient general-purpose computers. We demonstrate the central element of an all-optical calculator, a photonic abacus, which provides multistate compute-and-store operation by integrating functional phase-change materials with nanophotonic chips. With picosecond optical pulses we perform the fundamental arithmetic operations of addition, subtraction, multiplication, and division, including a carryover into multiple cells. This basic processing unit is embedded into a scalable phase-change photonic network and addressed optically through a two-pulse random access scheme. Our framework provides first steps towards light-based non-von Neumann arithmetic.
Political Regime and Human Capital: A Cross-Country Analysis
ERIC Educational Resources Information Center
Klomp, Jeroen; de Haan, Jakob
2013-01-01
We examine the relationship between different dimensions of the political regime in place and human capital using a two-step structural equation model. In the first step, we employ factor analysis on 16 human capital indicators to construct two new human capital measures (basic and advanced human capital). In the second step, we estimate the…
How To Prepare Effective Overhead Projector Presentations: One Picture Is Worth a Thousand Words.
ERIC Educational Resources Information Center
National Audio-Visual Supply, East Rutherford, NJ.
Designed to help create effective presentations, this guide describes the basic techniques and provides hints for producing professional, attention-getting overhead transparencies in a step-by-step procedure format. Eight topics are addressed in the guide: (1) eight steps to a successful meeting presentation; (2) advantages of overhead projection;…
Basic Steps to Using the Energy Savings Plus Health Guidelines
The Energy Savings Plus Health Guide equips school districts to integrate indoor air quality protections into school energy efficiency retrofits and other building upgrade projects. This document describes the steps to using the Energy Savings Plus Health Guide.
Reverse engineering of aircraft wing data using a partial differential equation surface model
NASA Astrophysics Data System (ADS)
Huband, Jacalyn Mann
Reverse engineering is a multi-step process used in industry to determine a production representation of an existing physical object. This representation is in the form of mathematical equations that are compatible with computer-aided design and computer-aided manufacturing (CAD/CAM) equipment. The four basic steps of the reverse engineering process are data acquisition, data separation, surface or curve fitting, and CAD/CAM production. The surface fitting step determines the design representation of the object and is thus critical to the success or failure of the reverse engineering process. Although the surface fitting methods described in the literature are used to model a variety of surfaces, they are not suitable for reversing aircraft wings. In this dissertation, we develop and demonstrate a new strategy for reversing a mathematical representation of an aircraft wing. The basis of our strategy is to take an aircraft design model and determine whether an inverse model can be derived. A candidate design model for this research is the partial differential equation (PDE) surface model, proposed by Bloor and Wilson and used in the Rapid Airplane Parameter Input Design (RAPID) tool at the NASA-LaRC Geolab. There are several basic mathematical problems involved in reversing the PDE surface model: (i) deriving a computational approximation of the surface function; (ii) determining a radial parametrization of the wing; (iii) choosing mathematical models or classes of functions for representation of the boundary functions; (iv) fitting the boundary data points by the chosen boundary functions; and (v) simultaneously solving for the axial parametrization and the derivative boundary functions. The study of the techniques to solve the above mathematical problems has culminated in a reverse PDE surface model and two reverse PDE surface algorithms. One reverse PDE surface algorithm recovers engineering design parameters for the RAPID tool from aircraft wing data, and the other generates a PDE surface model with spline boundary functions from an arbitrary set of grid points. Our numerical tests show that the reverse PDE surface model and the reverse PDE surface algorithms can be used for the reverse engineering of aircraft wing data.
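Sub-problem (iv), fitting boundary data points with chosen boundary functions, can be illustrated in miniature with a smoothing-spline fit; the data and smoothing factor below are invented, and the dissertation's actual boundary function classes may differ.

```python
import numpy as np
from scipy.interpolate import splev, splrep

x = np.linspace(0.0, 1.0, 60)                           # chordwise parameter
rng = np.random.default_rng(5)
y = 0.1 * np.sin(np.pi * x) + rng.normal(0, 0.002, 60)  # noisy boundary samples

tck = splrep(x, y, s=60 * 0.002 ** 2)                   # smoothing-spline fit
x_fine = np.linspace(0.0, 1.0, 400)
boundary = splev(x_fine, tck)                           # fitted boundary function
print("max residual at the data points:", float(np.abs(splev(x, tck) - y).max()))
```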
TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0
NASA Technical Reports Server (NTRS)
Ortiz, C. J.
1994-01-01
The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.
Safe and private pedestrian detection by a low-cost fiber optic specklegram
NASA Astrophysics Data System (ADS)
Rodriguez-Cuevas, A.; Rodriguez-Cobo, L.; Lomer, M.; Lopez-Higuera, J. M.
2017-04-01
Surveillance has become more and more important in recent years. In many cities, cameras have been set up to watch over parks, streets, roads, facilities and so on; however, this is raising concerns about privacy. In this work, an alternative surveillance method offering both security and privacy has been proposed and tested. The system, based on fiber-optic specklegram technology and consisting of an optical fiber, a coherent light source and a photodetector, was placed under a carpet to detect people walking over it, and its step-counting accuracy was measured. Results suggest that, using low-exposed geometries along the carpet and basic processing methods, it is possible to detect the number of steps taken by the person walking over the carpet with more than 95% accuracy.
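A step counter in this spirit can be sketched by thresholding the smoothed activity of the differentiated photodetector signal. All signal parameters and the detection rule below are assumptions for illustration, not the paper's processing method.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, seconds = 1000, 10                                # sample rate (Hz), duration
signal = rng.normal(0, 0.01, fs * seconds)            # photodetector baseline noise
for t0 in (1.5, 3.2, 5.0, 6.8, 8.4):                  # five simulated footsteps
    i = int(t0 * fs)
    signal[i:i + 200] += 0.5 * np.sin(np.linspace(0, 40 * np.pi, 200))

activity = np.abs(np.diff(signal))                    # short-term signal activity
window = fs // 10
envelope = np.convolve(activity, np.ones(window) / window, mode="same")
above = envelope > 5 * np.median(envelope)            # assumed detection rule
steps = int(np.sum(np.diff(above.astype(int)) == 1))  # count rising edges
print("detected steps:", steps)                       # expect 5
```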
Information systems in healthcare - state and steps towards sustainability.
Lenz, R
2009-01-01
To identify core challenges and first steps on the way to sustainable information systems in healthcare, recent articles on healthcare information technology and related articles from medical informatics and computer science were reviewed and analyzed, and core challenges that could not be solved over the years were identified. The two core problem areas are process integration, meaning effectively embedding IT systems into routine workflows, and systems integration, meaning reducing the effort required to interconnect independently developed IT components. Standards for systems integration have improved a lot, but their usefulness is limited where system evolution is needed. Sustainable healthcare information systems should be based on system architectures that support system evolution and avoid costly system replacements every five to ten years. Some basic principles for the design of such systems are separation of concerns, loose coupling, deferred systems design, and service-oriented architectures.
Workplace Basics: The Skills Employers Want.
ERIC Educational Resources Information Center
Carnevale, Anthony P.; And Others
1989-01-01
Identifies the basic skills needed by workers to function in today's high technology workplace. Examines ways of training employees in learning and communication skills, adaptability, personal management, group effectiveness, and organizational leadership. Describes the eight-step training approach used by Mazda Motor Manufacturing Corporation.…
Development and acceleration of unstructured mesh-based cfd solver
NASA Astrophysics Data System (ADS)
Emelyanov, V.; Karpenko, A.; Volkov, K.
2017-06-01
The study was undertaken as part of a larger effort to establish a common computational fluid dynamics (CFD) code for simulation of internal and external flows and involves some basic validation studies. The governing equations are solved with a finite volume code on unstructured meshes. The computational procedure involves reconstruction of the solution in each control volume and extrapolation of the unknowns to find the flow variables on the faces of the control volume, solution of the Riemann problem for each face of the control volume, and evolution of the solution over the time step. The nonlinear CFD solver works in an explicit time-marching fashion, based on a three-step Runge-Kutta stepping procedure. Convergence to a steady state is accelerated by the use of a geometric technique and by the application of Jacobi preconditioning for high-speed flows, with a separate low-Mach-number preconditioning method for use with low-speed flows. The CFD code is implemented on graphics processing units (GPUs). The speedup of the GPU solution with respect to the solution on central processing units (CPUs) is compared for different meshes and different methods of distributing the input data into blocks. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
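The time-stepping skeleton, an explicit three-stage Runge-Kutta update applied to a finite-volume residual, can be sketched on a deliberately simple problem: 1-d linear advection on a periodic uniform mesh with upwind fluxes. The production solver's unstructured reconstruction and Riemann machinery are far richer, and the stage coefficients below are one common choice rather than necessarily the paper's.

```python
import numpy as np

n, a, cfl = 200, 1.0, 0.4
dx = 1.0 / n
dt = cfl * dx / a
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.exp(-200 * (x - 0.3) ** 2)       # initial profile
mass0 = u.sum()

def residual(u):
    flux = a * u                        # first-order upwind flux (a > 0)
    return -(flux - np.roll(flux, 1)) / dx

for _ in range(400):
    u0 = u.copy()
    for alpha in (1/3, 1/2, 1.0):       # three-stage RK coefficients
        u = u0 + alpha * dt * residual(u)

print("mass conserved:", bool(np.isclose(u.sum(), mass0)))
```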
NASA Astrophysics Data System (ADS)
Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki
2014-01-01
A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
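In EKV-style formulations the inversion-level parameter is the inversion coefficient IC = ID/Ispec with Ispec = 2 n β UT², and a device is conventionally called weakly inverted for IC < 0.1 and strongly inverted for IC > 10. The numbers in this sketch are illustrative, not taken from the paper.

```python
# Illustrative inversion-coefficient calculation; parameter values invented.
n, beta, UT = 1.3, 4e-4, 0.0259      # slope factor, transfer parameter (A/V^2), thermal voltage (V)
I_spec = 2 * n * beta * UT ** 2      # specific current, here about 0.7 uA

def inversion_level(i_d):
    ic = i_d / I_spec
    region = "weak" if ic < 0.1 else ("strong" if ic > 10 else "moderate")
    return ic, region

for i_d in (10e-9, 1e-6, 100e-6):    # drain currents in amperes
    ic, region = inversion_level(i_d)
    print(f"ID={i_d:.0e} A -> IC={ic:.2f} ({region} inversion)")
```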
Pinto, Vitor Laerte; Cerbino Neto, José; Penna, Gerson Oliveira
2014-12-01
Health surveillance (HS) is one of the key components of the Brazilian Unified Health System (SUS). This article describes recent changes in health surveillance funding models and the role these changes have had in the reorganization and decentralization of health actions. Federal law no. 8.080 of 1990 defined health surveillance as a fundamental pillar of the SUS, and an exclusive fund with equitable distribution criteria was created in the Basic Operational Norm of 1996 to pay for health surveillance actions. This step facilitated the decentralization of health care at the municipal level, giving local authorities autonomy to plan and provide services. The Health Pact of 2006 and its regulation under federal decree No. 3252 in 2009 bolstered the processes of decentralization, regionalization and integration of health care. Further changes in the basic concepts of health surveillance around the world and in the funding policies negotiated by different spheres of government in Brazil have been catalysts for the process of HS institutionalization in recent years.
Reducing violent injuries: priorities for pediatrician advocacy.
Dolins, J C; Christoffel, K K
1994-10-01
A basic framework for developing an advocacy plan must systematically break down the large task of policy development and implementation into manageable components. The basic framework described in detail in this paper includes three steps: (1) setting policy objectives by narrowing the scope of policy, by reviewing policy options, and by examining options against selected criteria; (2) developing strategies for educating the public and for approaching legislative/regulatory bodies; and (3) evaluating the effectiveness of the advocacy action plan as a process and as an agent for change. To illustrate the variety of ways in which pediatricians can be involved in the policy process to reduce violent injuries among children and adolescents, we apply this systematic approach to three priority areas. Prohibiting the use of corporal punishment in schools is intended to curb the institutionalized legitimacy of violence that has been associated with future use of violence. Efforts to remove handguns from the environments of children and adolescents are aimed at reducing the numbers of firearm injuries inflicted upon and by minors. Comprehensive treatment of adolescent victims of assault is intended to decrease the reoccurrence of violent injuries.
The Raptor Real-Time Processing Architecture
NASA Astrophysics Data System (ADS)
Galassi, M.; Starr, D.; Wozniak, P.; Borozdin, K.
The primary goal of Raptor is ambitious: to identify interesting optical transients from very wide field of view telescopes in real time, and then to quickly point the higher resolution Raptor "fovea" cameras and spectrometer to the location of the optical transient. The most interesting of Raptor's many applications is the real-time search for orphan optical counterparts of Gamma Ray Bursts. The sequence of steps (data acquisition, basic calibration, source extraction, astrometry, relative photometry, the smarts of transient identification and elimination of false positives, telescope pointing feedback, etc.) is implemented with a "component" approach. All basic elements of the pipeline functionality have been written from scratch or adapted (as in the case of SExtractor for source extraction) to form a consistent modern API operating on memory-resident images and source lists. The result is a pipeline which meets our real-time requirements and which can easily operate as a monolithic or distributed processing system. Finally, the Raptor architecture is entirely based on free software (sometimes referred to as "open source" software). In this paper we also discuss the interplay between various free software technologies in this type of astronomical problem.
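The component approach can be caricatured as a chain of small functions over an in-memory frame record; the stage names and fields below are invented for illustration and are not Raptor's actual API.

```python
# Toy sketch of a component-style pipeline over memory-resident data.
def calibrate(frame):
    frame["calibrated"] = True
    return frame

def extract_sources(frame):
    frame["sources"] = [(10.2, 33.1), (87.5, 4.0)]   # (x, y) detections
    return frame

def astrometry(frame):
    frame["wcs_solved"] = True
    return frame

def find_transients(frame):
    reference = set()                                # empty reference catalog
    frame["transients"] = [s for s in frame["sources"] if s not in reference]
    return frame

pipeline = [calibrate, extract_sources, astrometry, find_transients]
frame = {"pixels": "raw CCD data"}
for stage in pipeline:
    frame = stage(frame)
print(frame["transients"])
```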
ERIC Educational Resources Information Center
Suveren-Erdogan, Ceren; Suveren, Sibel
2018-01-01
The aim of this study is to enable basic posture exercises to be included in the basic training of visually impaired individuals as a step towards learning more difficult movements, to guide instructors in making efficient progress in a short time, and to help a greater number of disabled individuals benefit from these studies. Method: 15…
Spin Choreography: Basic Steps in High Resolution NMR (by Ray Freeman)
NASA Astrophysics Data System (ADS)
Minch, Michael J.
1998-02-01
There are three orientations that NMR courses may take. The traditional molecular structure course focuses on the interpretation of spectra and the use of chemical shifts, coupling constants, and nuclear Overhauser effects (NOE) to sort out subtle details of structure and stereochemistry. Courses can also focus on the fundamental quantum mechanics of observable NMR parameters and processes such as spin-spin splitting and relaxation. More recently there are courses devoted to the manipulation of nuclear spins and the basic steps of one- and two-dimensional NMR experiments. Freeman's book is directed towards the latter audience. Modern NMR methods offer a myriad of ways to extract information about molecular structure and motion by observing the behavior of nuclear spins under a variety of conditions. In Freeman's words: "We can lead the spins through an intricate dance, carefully programmed in advance, to enhance, simplify, correlate, decouple, edit or assign NMR spectra." This is a carefully written, well-illustrated account of how this dance is choreographed by pulse programming, double resonance, and gradient effects. Although well written, this book is not an easy read; every word counts. It is recommended for graduate courses that emphasize the fundamentals of magnetic resonance. It is not a text on interpretation of spectra.
The OpenPicoAmp: An Open-Source Planar Lipid Bilayer Amplifier for Hands-On Learning of Neuroscience
Shlyonsky, Vadim; Dupuis, Freddy; Gall, David
2014-01-01
Understanding the electrical biophysical properties of the cell membrane can be difficult for neuroscience students, as it often relies solely on lectures about theoretical models without practical hands-on experiments. To address this issue, we developed an open-source lipid bilayer amplifier, the OpenPicoAmp, which is appropriate for use in introductory courses in biophysics or neurosciences at the undergraduate level dealing with the electrical properties of the cell membrane. The amplifier is designed using the common lithographic printed circuit board fabrication process and off-the-shelf electronic components. In addition, we propose a specific design for experimental chambers allowing the insertion of a commercially available polytetrafluoroethylene film. We provide complete documentation for building the amplifier and the experimental chamber. A student hand-out giving step-by-step instructions for performing a recording is also included. Our experimental setup can be used in basic experiments in which students monitor bilayer formation by capacitance measurement and record unitary currents produced by ionic channels such as gramicidin A dimers. Used in combination with a low-cost data acquisition board, this system provides a complete solution for hands-on lessons, thereby improving the effectiveness of teaching basic neurosciences or biophysics. PMID:25251830
Vasculitic wheel - an algorithmic approach to cutaneous vasculitides.
Ratzinger, Gudrun; Zelger, Bettina Gudrun; Carlson, J Andrew; Burgdorf, Walter; Zelger, Bernhard
2015-11-01
Previous classifications of vasculitides suffer from several defects. First, classifications may follow different principles including clinicopathologic findings, etiology, pathogenesis, prognosis, or therapeutic options. Second, authors fail to distinguish between vasculitis and coagulopathy. Third, vasculitides are systemic diseases. Organ-specific variations make morphologic findings difficult to compare. Fourth, subtle changes are recognized in the skin, but may be asymptomatic in other organs. Our aim was to use the skin and subcutis as a model and the clinicopathologic correlation as the basic process for classification. We use an algorithmic approach with pattern analysis, which allows for consistent reporting of microscopic findings. We first differentiate between small and medium vessel vasculitis. In the second step, we differentiate the subtypes of small (capillaries versus postcapillary venules) and medium-sized (arterioles/arteries versus veins) vessels. In the final step, we differentiate, according to the predominant cell type, into leukocytoclastic and/or granulomatous vasculitis. Starting from leukocytoclastic vasculitis as a central reaction pattern of cutaneous small/medium vessel vasculitides, its relations or variations may be arranged around it like spokes of a wheel around the hub. This may help establish some basic order in this rather complex realm of cutaneous vasculitides, leading to a better understanding in a complicated field. © 2015 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.
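The three-step algorithm described above is, in effect, a small decision tree, which can be sketched as follows. This is a minimal illustration of the branching logic only, with invented labels; it is not a clinical tool and not the authors' published wheel.

```python
# Minimal sketch of the three-step branching described in the abstract.
def classify_vasculitis(vessel_size: str, vessel_subtype: str, cell_type: str) -> str:
    # Step 1: small versus medium vessel vasculitis
    if vessel_size not in ("small", "medium"):
        raise ValueError("step 1: expected 'small' or 'medium' vessel")
    # Step 2: subtype within the size class
    subtypes = {"small": ("capillary", "postcapillary venule"),
                "medium": ("arteriole/artery", "vein")}
    if vessel_subtype not in subtypes[vessel_size]:
        raise ValueError("step 2: subtype inconsistent with vessel size")
    # Step 3: predominant cell type
    if cell_type not in ("leukocytoclastic", "granulomatous"):
        raise ValueError("step 3: expected leukocytoclastic or granulomatous")
    return f"{vessel_size}-vessel ({vessel_subtype}) {cell_type} vasculitis"

print(classify_vasculitis("small", "postcapillary venule", "leukocytoclastic"))
```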
NASA Technical Reports Server (NTRS)
Kasahara, Hironori; Honda, Hiroki; Narita, Seinosuke
1989-01-01
Parallel processing of real-time dynamic systems simulation on a multiprocessor system named OSCAR is presented. In the simulation of dynamic systems, generally, the same calculations are repeated at every time step. However, Do-all or Do-across techniques cannot be applied to parallel processing of the simulation, since there exist data dependencies from the end of one iteration to the beginning of the next, and furthermore data input and data output are required every sampling time period. Therefore, parallelism inside the calculation required for a single time step, or a large basic block consisting of arithmetic assignment statements, must be used. In the proposed method, near-fine-grain tasks, each of which consists of one or more floating point operations, are generated to extract the parallelism from the calculation and are assigned to processors by optimal static scheduling at compile time, in order to reduce the large run-time overhead caused by the use of near-fine-grain tasks. The practicality of the scheme is demonstrated on OSCAR (Optimally SCheduled Advanced multiprocessoR), which has been developed to exploit the advantageous features of static scheduling algorithms to the maximum extent.
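To make the compile-time scheduling idea concrete, here is a minimal sketch of static list scheduling of dependent fine-grain tasks onto processors. The earliest-finish-time heuristic, the task graph, and the costs are illustrative assumptions, not OSCAR's actual scheduling algorithm.

```python
# Static list scheduling sketch: assign dependent tasks to processors at
# "compile time" so no run-time scheduling overhead is incurred.
tasks = {"a": 1.0, "b": 2.0, "c": 1.0, "d": 2.0}   # task -> cost (time units)
deps  = {"c": ["a", "b"], "d": ["c"]}               # task -> predecessors

def schedule(n_proc: int = 2):
    finish = {}                    # task -> finish time
    proc_free = [0.0] * n_proc     # next free time per processor
    order = []
    pending = list(tasks)
    while pending:
        # pick a ready task (all predecessors already scheduled)
        t = next(t for t in pending if all(p in finish for p in deps.get(t, [])))
        p = min(range(n_proc), key=lambda i: proc_free[i])   # earliest-free CPU
        start = max([proc_free[p]] + [finish[q] for q in deps.get(t, [])])
        finish[t] = proc_free[p] = start + tasks[t]
        order.append((t, p, start))
        pending.remove(t)
    return order

print(schedule())   # [(task, processor, start time), ...] fixed before run time
```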
Computational Phenotyping in Psychiatry: A Worked Example.
Schwartenbeck, Philipp; Friston, Karl
2016-01-01
Computational psychiatry is a rapidly emerging field that uses model-based quantities to infer the behavioral and neuronal abnormalities that underlie psychopathology. If successful, this approach promises key insights into (pathological) brain function as well as a more mechanistic and quantitative approach to psychiatric nosology, structuring therapeutic interventions and predicting response and relapse. The basic procedure in computational psychiatry is to build a computational model that formalizes a behavioral or neuronal process. Measured behavioral (or neuronal) responses are then used to infer the model parameters of a single subject or a group of subjects. Here, we provide an illustrative overview of this process, starting from the modeling of choice behavior in a specific task, simulating data, and then inverting that model to estimate group effects. Finally, we illustrate cross-validation to assess whether between-subject variables (e.g., diagnosis) can be recovered successfully. Our worked example uses a simple two-step maze task and a model of choice behavior based on (active) inference and Markov decision processes. The procedural steps and routines we illustrate are not restricted to a specific field of research or particular computational model but can, in principle, be applied in many domains of computational psychiatry.
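The simulate-then-invert workflow described above can be shown in miniature. The sketch below simulates choices from a softmax agent and then recovers its inverse temperature by maximum likelihood; the two-action task and grid-search estimator are simplifying assumptions standing in for the paper's Markov-decision-process model.

```python
# Toy simulate-then-invert example: generate choices from a known model,
# then estimate the generating parameter from the simulated data.
import math, random

random.seed(0)
values = [0.2, 0.8]                       # latent values of two actions

def p_choose_1(beta: float) -> float:     # softmax probability of action 1
    z = [math.exp(beta * v) for v in values]
    return z[1] / sum(z)

true_beta = 3.0
choices = [1 if random.random() < p_choose_1(true_beta) else 0
           for _ in range(500)]           # simulated behavioral data

def neg_log_lik(beta: float) -> float:    # model inversion target
    p1 = p_choose_1(beta)
    return -sum(math.log(p1 if c else 1 - p1) for c in choices)

betas = [b / 10 for b in range(1, 100)]
est = min(betas, key=neg_log_lik)         # grid-search maximum likelihood
print(f"true beta = {true_beta}, estimated beta = {est:.1f}")
```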
Self-assembly kinetics of microscale components: A parametric evaluation
NASA Astrophysics Data System (ADS)
Carballo, Jose M.
The goal of the present work is to develop and evaluate a parametric model of a basic microscale self-assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging. Recent research efforts have proposed adapting nanoscale self-assembly (SA) processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly, capillary) to spontaneously assemble components. However, there are challenges to implementing microscale SA as a commercial process. The existing lack of design tools prevents simple process optimization. Previous efforts have characterized specific aspects of the SA process, but existing microscale SA models do not characterize the inter-component interactions. All existing models have simplified the outcome of SA interactions to an experimentally derived value specific to a particular configuration, instead of evaluating the outcome as a function of component-level parameters (such as speed, geometry, bonding energy and direction). The present study parameterizes the outcome of interactions and evaluates the effect of key parameters, closing a gap in existing microscale SA models and adding a key piece towards a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for the probability of assembly of basic SA interactions. A basic SA interaction is defined as the event in which a single part arrives on an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation and incidence angle for the component and the assembly site. Secondly, an experimental SA system was designed and implemented to create individual SA interactions while controlling process parameters independently. SA experiments measured the outcome of SA interactions while studying the independent effects of each parameter. As a first step towards a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low kinetic energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. The results from this work indicate that SA could be modeled as an energy-based process due to the small path-dependence effects. Assembly probability is linearly related to the orientation probability; the proportionality constant is based on the area fraction of the sites with an amplification factor. This amplification factor accounts for the ability of capillary forces to align parts with only very small areas of contact when they have low kinetic energy. The results provide unprecedented insight into SA interactions. The present study is a key step towards completing a basic model of a general SA process. Moreover, the outcome of this work can complement existing SA process models in order to create a complete design tool for microscale SA systems. In addition to SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that a major contributor to experimental variation is the stochastic nature of experimental SA interactions and the limited sample size of the experiments.
Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize the uncertainty in future SA experiments.
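The Monte Carlo treatment of a single part-site interaction can be sketched in a few lines. Here the assembly criterion (alignment drawn from an area fraction scaled by an amplification factor, plus a kinetic-versus-binding energy test) and all parameter values are illustrative assumptions, not the dissertation's fitted model.

```python
# Monte Carlo sketch of basic SA interactions under an energy-based picture.
import random
random.seed(1)

def interact(kinetic_e, binding_e, site_area_fraction, amplification) -> bool:
    # orientation: chance the part lands aligned with the site, amplified by
    # the capillary force's ability to pull in nearly-aligned parts
    aligned = random.random() < site_area_fraction * amplification
    # the capillary bond holds only if kinetic energy is below binding energy
    return aligned and kinetic_e < binding_e

def assembly_yield(n: int = 10_000, **kw) -> float:
    return sum(interact(**kw) for _ in range(n)) / n

print(assembly_yield(kinetic_e=0.5, binding_e=1.0,
                     site_area_fraction=0.3, amplification=2.0))
```

Repeating such runs with a limited number of trials per condition also reproduces the point made above: much of the run-to-run scatter is simply the binomial noise of small samples.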
ERIC Educational Resources Information Center
Sells, Scott P.
A model for treating difficult adolescents and their families is presented. Part 1 offers six basic assumptions about the causes of severe behavioral problems and presents the treatment model with guidelines necessary to address each of these six causes. Case examples highlight and clarify major points within each of the 15 procedural steps of the…
Abdel-Wahab, May; Rengan, Ramesh; Curran, Bruce; Swerdloff, Stuart; Miettinen, Mika; Field, Colin; Ranjitkar, Sunita; Palta, Jatinder; Tripuraneni, Prabhakar
2010-02-01
To describe the processes and benefits of Integrating the Healthcare Enterprise in Radiation Oncology (IHE-RO). The IHE-RO process includes five basic steps. The first step is to identify common interoperability issues encountered in radiation treatment planning and the delivery process. IHE-RO committees then partner with vendors to develop solutions (integration profiles) to interoperability problems. The broad application of these integration profiles across a variety of vendor platforms is tested annually at the Connectathon event. Seamless integration and transfer of patient data are then demonstrated by vendors to potential users at the public demonstration event. Finally, institutions can incorporate completed integration profiles into requests for proposals and vendor contracts when purchasing new equipment. Vendors can publish IHE integration statements to document the integration profiles supported by their products. As a result, users can reference integration profiles in requests for proposals, simplifying the systems acquisition process. These IHE-RO solutions are now available in many of the commercial radiation oncology-related treatment planning, delivery, and information systems. They are also implemented at cancer care sites around the world. IHE-RO serves an important purpose for the radiation oncology community at large. Copyright 2010 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ranjan, Alok, E-mail: alok.ranjan@us.tel.com; Wang, Mingmei; Sherpa, Sonam D.
2016-05-15
Atomic or layer-by-layer etching of silicon exploits temporally segregated self-limiting adsorption and material removal steps to mitigate the problems associated with continuous or quasicontinuous (pulsed) plasma processes: selectivity loss, damage, and loss of profile control. Successful implementation of atomic layer etching requires careful choice of the plasma parameters for the adsorption and desorption steps. This paper illustrates how process parameters can be arrived at through basic scaling exercises, modeling and simulation, and fundamental experimental tests of their predictions. Using chlorine and argon plasma in a radial line slot antenna plasma source as a platform, the authors illustrate how cycle time, ion energy, and radical-to-ion ratio can be manipulated to manage the deviation from ideality when cycle times are shortened or purges are incomplete. Cell-based Monte Carlo feature scale modeling is used to illustrate profile outcomes. Experimental results of atomic layer etching processes are illustrated on silicon line and space structures such that profiles free of iso-dense bias and aspect-ratio dependence are produced. Experimental results also illustrate the profile control margin as processes move from atomic layer to multilayer-by-layer etching. The consequence of not controlling contamination (e.g., oxygen) is shown to result in deposition and roughness generation.
Optical observations of electrical activity in cloud discharges
NASA Astrophysics Data System (ADS)
Vayanganie, S. P. A.; Fernando, M.; Sonnadara, U.; Cooray, V.; Perera, C.
2018-07-01
The temporal variation of the luminosity of seven natural cloud-to-cloud lightning channels was studied, and the results are presented. The channels were recorded using a high-speed video camera at 5000 fps (frames per second) with a pixel resolution of 512 × 512 at three locations in Sri Lanka in the tropics. The luminosity variation of the channel with time was obtained by analyzing the image sequences. The recorded video frames, together with the luminosity variation, were studied to understand the cloud discharge process. Image analysis techniques were also used to understand the characteristics of the channels. Cloud flashes show more luminosity variability than ground flashes. Most of the time a cloud flash starts with a leader that does not have a stepping process. The channel width and the standard deviation of the intensity variation across the channel were obtained for each cloud flash. The brightness variation across the channel shows a Gaussian distribution. The average time duration of the cloud flashes that start with a non-stepped leader was 180.83 ms. The identified characteristics were matched against existing models to understand the process of cloud flashes. The fact that cloud discharges are not confined to a single process has been further confirmed by this study. The observations show that a cloud flash is a basic lightning discharge which transfers charge between two charge centers without using one specific mechanism.
Automatic Registration of Terrestrial Laser Scanner Point Clouds Using Natural Planar Surfaces
NASA Astrophysics Data System (ADS)
Theiler, P. W.; Schindler, K.
2012-07-01
Terrestrial laser scanners have become a standard piece of surveying equipment, used in diverse fields like geomatics, manufacturing and medicine. However, the processing of today's large point clouds is time-consuming, cumbersome and not automated enough. A basic step of post-processing is the registration of scans from different viewpoints. At present this is still done using artificial targets or tie points, mostly by manual clicking. The aim of this registration step is a coarse alignment, which can then be improved with the existing algorithm for fine registration. The focus of this paper is to provide such a coarse registration in a fully automatic fashion, and without placing any target objects in the scene. The basic idea is to use virtual tie points generated by intersecting planar surfaces in the scene. Such planes are detected in the data with RANSAC and optimally fitted using least squares estimation. Due to the huge amount of recorded points, planes can be determined very accurately, resulting in well-defined tie points. Given two sets of potential tie points recovered in two different scans, registration is performed by searching for the assignment which preserves the geometric configuration of the largest possible subset of all tie points. Since exhaustive search over all possible assignments is intractable even for moderate numbers of points, the search is guided by matching individual pairs of tie points with the help of a novel descriptor based on the properties of a point's parent planes. Experiments show that the proposed method is able to successfully coarse register TLS point clouds without the need for artificial targets.
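The virtual-tie-point construction has a compact numerical core: three fitted planes, each written as n·x = d, meet in a single point found by one linear solve. A minimal sketch with made-up plane parameters (not the paper's data):

```python
# Intersect three fitted planes n·x = d to obtain a virtual tie point.
import numpy as np

# each row is a plane's unit normal n, with offset d, e.g. from RANSAC +
# least-squares fits to wall/floor points (values here are invented)
N = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
d = np.array([2.0, -1.0, 0.5])

# a unique intersection exists only if the normals are linearly independent;
# near-parallel planes make this system ill-conditioned
tie_point = np.linalg.solve(N, d)
print(tie_point)   # -> [ 2.  -1.   0.5]
```

Because each plane is fitted to a huge number of scanned points, the normals and offsets are very accurate, which is why the resulting tie points are well defined even though no physical target exists at that location.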
Business model for sensor-based fall recognition systems.
Fachinger, Uwe; Schöpke, Birte
2014-01-01
AAL systems require, in addition to sophisticated and reliable technology, adequate business models for their launch and sustainable establishment. This paper presents the basic features of alternative business models for a sensor-based fall recognition system which was developed within the context of the "Lower Saxony Research Network Design of Environments for Ageing" (GAL). The models were developed parallel to the R&D process with successive adaptation and concretization. An overview of the basic features (i.e. nine partial models) of the business model is given and the mutual exclusive alternatives for each partial model are presented. The partial models are interconnected and the combinations of compatible alternatives lead to consistent alternative business models. However, in the current state, only initial concepts of alternative business models can be deduced. The next step will be to gather additional information to work out more detailed models.
NASA Technical Reports Server (NTRS)
Bart, Timothy J.; Kutler, Paul (Technical Monitor)
1998-01-01
Chapter 1 briefly reviews several related topics associated with the symmetrization of systems of conservation laws and quasi-conservation laws: (1) Basic Entropy Symmetrization Theory; (2) Symmetrization and eigenvector scaling; (3) Symmetrization of the compressible Navier-Stokes equations; and (4) Symmetrization of the quasi-conservative form of the magnetohydrodynamic (MHD) equations. Chapter 2 describes one of the best known tools employed in the study of differential equations, the maximum principle: any function f(x) which satisfies the inequality f''(x) > 0 on the interval [a,b] attains its maximum value at one of the endpoints of the interval. Chapter 3 examines upwind finite volume schemes for scalar and system conservation laws. The basic tasks in the upwind finite volume approach have already been presented: reconstruction, flux evaluation, and evolution. By far, the most difficult task in this process is the reconstruction step.
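The reconstruction/flux-evaluation/evolution sequence can be seen in its simplest form in a first-order upwind scheme for the scalar advection law u_t + a u_x = 0, where the reconstruction is just the piecewise-constant cell average. A minimal sketch on a periodic domain (the grid, CFL number, and initial data are arbitrary choices for illustration):

```python
# First-order upwind finite volume step for u_t + a u_x = 0 with a > 0.
import numpy as np

a, dx, dt = 1.0, 0.1, 0.05                     # CFL number a*dt/dx = 0.5
u = np.where(np.arange(50) < 25, 1.0, 0.0)     # step initial data (cell averages)

def upwind_step(u):
    # flux evaluation: for a > 0 the upwind flux at each left cell face is a*u
    # of the cell to the left (piecewise-constant reconstruction is implicit)
    flux = a * u
    # evolution: conservative update on a periodic domain
    return u - (dt / dx) * (flux - np.roll(flux, 1))

for _ in range(20):
    u = upwind_step(u)
print(u.round(2))   # the step has advected right and smeared (1st-order diffusion)
```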
van Egmond, Marieke Christina; Navarrete Berges, Andrés; Omarshah, Tariq; Benton, Jennifer
2017-06-01
An emerging field of research is beginning to examine the ways in which socioeconomic disparities affect emotional, cognitive, and social processes. In this study, we took a two-step approach to examining the role that resource scarcity plays in the predictive power of intrinsic motivation on school attendance, as well as its influence on the precursors of intrinsic motivation: the psychological needs of relatedness, autonomy, and competence. Results revealed that intrinsic motivation predicts school attendance even under conditions of extreme adversity. The satisfaction of the basic needs is more important for participants who are exposed to severe rather than mild levels of deprivation. Our findings illustrate ecological effects on the mechanism underlying goal-directed behavior. They provide evidence in favor of self-determination theory's depiction of humans as active, growth-oriented organisms and for the potential of psychological interventions to reduce poverty.
Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.
Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A
2016-04-01
The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
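As a flavor of what a DES model involves, the sketch below simulates a single scanner serving patients in arrival order from a time-ordered event queue. The exponential arrival and service times are illustrative assumptions, not a validated radiology workflow; a real study would more likely use a dedicated DES package.

```python
# Minimal discrete-event simulation: one scanner, FIFO patient queue.
import heapq, random
random.seed(2)

events = []                                   # (time, kind) priority queue
t = 0.0
for _ in range(20):                           # arrivals ~ every 10 min on average
    t += random.expovariate(1 / 10)
    heapq.heappush(events, (t, "arrive"))

busy_until = 0.0
waits = []
while events:
    t, kind = heapq.heappop(events)           # advance clock to next event
    if kind == "arrive":
        start = max(t, busy_until)            # wait if the scanner is busy
        waits.append(start - t)
        busy_until = start + random.expovariate(1 / 8)   # ~8 min scan

print(f"mean wait: {sum(waits) / len(waits):.1f} min")
```

Varying the arrival rate, service time, or number of scanners in such a model is exactly the kind of what-if analysis the article advocates for radiology workflows.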
Michman, Elisheva; Agranat, Israel
2016-01-01
The role of elementary stereochemistry is illustrated in the patent litigations of the blockbuster antidepressant drug escitalopram oxalate. An undergraduate student of organic chemistry would recognize the stereochemical courses of the intramolecular SN2 and SN1 reactions of the single-enantiomer (S)-diol intermediate in the synthesis of the blockbuster antidepressant drug escitalopram oxalate: retention of configuration of the chiral carbon atom under basic conditions and racemization under acidic conditions, respectively. He/she, in searching for a stereoselective ring-closure reaction of the enantiomeric diol, will think of an SN2 reaction in a basic medium. From these points of view, the process claim in the enantiomer patents of escitalopram is obvious/lacks an inventive step. An organic chemistry examination problem based on this scenario is offered. © 2015 Wiley Periodicals, Inc.
How to write effective business letters: scribing information for pharmacists.
Hamilton, C W
1993-11-01
Pharmacists frequently write letters but lack specific training on how to do it well. This review summarizes strategies for improving business correspondence, emphasizes basic writing guidelines, and offers practical advice for pharmacists. The first steps for effective communication are careful planning and identifying the main message to be conveyed. The purpose for writing should be stated in the opening paragraph of the letter. To ensure a successful outcome, actions needed should be clearly summarized and visually highlighted. The tone of the letter should reflect a reasonable speech pattern, not the cryptic writing found in many scientific papers. The layout of the letter should be inviting, which is readily achievable through judicious use of word processing. Many delivery options are available, such as traditional postal services, express mail, and facsimile transmission. Readers are encouraged to test these basic writing principles and decide for themselves whether these recommendations affect the success of business correspondence.
Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Emmert-Buck, Michael R
2005-01-01
Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of pancreatic malignancy and other biological phenomena. This chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification. High-quality tissue microdissection does not necessarily mean high-quality samples to analyze. The quality of biomaterials obtained for analysis is highly dependent on steps upstream and downstream from tissue microdissection. We provide protocols for each of these steps, and encourage you to improve upon these. It is worth the effort of every laboratory to optimize and document its technique at each stage of the process, and we provide a starting point for those willing to spend the time to optimize. In our view, poor documentation of tissue and cell type of origin and the use of nonoptimized protocols is a source of inefficiency in current life science research. Even incremental improvement in this area will increase productivity significantly.
Measured Plume Dispersion Parameters Over Water. Volume 1.
1984-09-01
meteorological parameters were continuously monitored at various locations. Tracer gas concentrations were measured by a variety of methods at... In addition, this step added a header to the data set containing a variety of averaged meteorological quantities. The basic procedure in this step was
A Three-Step Approach to Veterinary Medical Education
ERIC Educational Resources Information Center
Kavanaugh, J. F.
1976-01-01
A formal education plan with two admission steps is outlined. Animal agriculture and the basic sciences are combined in a two-year middle stage. The medical education (third stage) that specifically addresses pathology and the clinical sciences encompasses three years. (Author/LBH)
Asadi, Sakine; Nojavan, Saeed
2016-06-07
In the present work, acidic and basic drugs were simultaneously extracted by a novel method of high efficiency, herein referred to as two-step voltage dual electromembrane extraction (TSV-DEME). After optimization of effective parameters such as the composition of the organic liquid membrane, the pH values of the donor and acceptor solutions, and the voltage and duration of each step, the figures of merit of the method were investigated in pure water, human plasma, wastewater, and breast milk samples. Simultaneous extraction of acidic and basic drugs was accomplished by applying potentials of 150 V and 400 V for 6 min and 19 min as the first and second steps, respectively. The model compounds were extracted from 4 mL of sample solution (pH = 6) into 20 μL of each acceptor solution (32 mM NaOH for acidic drugs and 32 mM HCl for basic drugs). 1-Octanol was immobilized within the pores of a porous polypropylene hollow fiber as the supported liquid membrane (SLM) for acidic drugs, and 2-ethylhexanol as the SLM for basic drugs. The proposed TSV-DEME technique provided good linearity, with correlation coefficients ranging from 0.993 to 0.998 over a concentration range of 1-1000 ng mL(-1). The limits of detection of the drugs ranged from 0.3 to 1.5 ng mL(-1), while the corresponding repeatability ranged from 7.7 to 15.5% (n = 4). The proposed method was further compared to simple dual electromembrane extraction (DEME), indicating significantly higher recoveries for the TSV-DEME procedure (38.1-68%) compared to the simple DEME procedure (17.7-46%). Finally, the optimized TSV-DEME was applied to extract and quantify the model compounds in breast milk, wastewater, and plasma samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Basic Internet Software Toolkit.
ERIC Educational Resources Information Center
Buchanan, Larry
1998-01-01
Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…
Inculcating Quality Concepts in the US Air Force: Right Music, Wrong Step
1994-04-01
TITLE: Inculcating Quality Concepts in the US Air Force: Right Music, Wrong Step. AUTHOR: Barbara A. Kucharczyk, Lieutenant Colonel, USAF. In its...perceived attitudinal backlash. While basic quality concepts are certainly the right music, many Air Force members are dancing the wrong step. Why? This
A macrosonic system for industrial processing
Gallego-Juarez; Rodriguez-Corral; Riera-Franco de Sarabia E; Campos-Pozuelo; Vazquez-Martinez; Acosta-Aparicio
2000-03-01
The development of high-power applications of sonic and ultrasonic energy in industrial processing requires a great variety of practical systems with characteristics that depend on the effect to be exploited. Nevertheless, the majority of systems basically consist of a treatment chamber and one or several transducers coupled to it. Therefore, the feasibility of the application mainly depends on the efficiency of the transducer-chamber system. This paper deals with a macrosonic system essentially consisting of a high-power transducer with a double stepped-plate radiator coupled to a chamber of square section. The radiator, which has a rectangular shape, is placed on one face of the chamber in order to drive the fluid volume inside. The stepped profile of the radiator allows piston-like radiation to be obtained. The radiation from the back face of the radiator is also applied to the chamber by using adequate reflectors. Transducer-chamber systems for sonic and ultrasonic frequencies have been developed with power capacities up to about 5 kW for the treatment of fluid volumes of several cubic meters. The characteristics of these systems are presented in this paper.
Chow, Grant V; Hayashi, Jennifer; Hirsch, Glenn A; Christmas, Colleen
2011-04-01
Weather emergencies present a multifaceted challenge to residents and residency programs. Both the individual trainee and program may be pushed to the limits of physical and mental strain, potentially jeopardizing core competencies of patient care and professionalism. Although daunting, the task of preparing for these events should be a methodical process integrated into every residency training program. The core elements of emergency preparation with regard to inpatient services include identifying and staffing critical positions, motivating residents to consider the needs of the group over those of the individual, providing for basic needs, and planning activities in order to preserve team morale and facilitate recovery. The authors outline a four-step process in preparing a residency program for an anticipated short-term weather emergency. An example worksheet for emergency planning is included. With adequate preparation, residency training programs can maintain the highest levels of patient care, professionalism, and esprit de corps during weather emergencies. When managed effectively, emergencies may present an opportunity for professional growth and a sense of unity for those involved.
Analysis of ChIP-seq Data in R/Bioconductor.
de Santiago, Ines; Carroll, Thomas
2018-01-01
The development of novel high-throughput sequencing methods for ChIP (chromatin immunoprecipitation) has provided a very powerful tool to study gene regulation in multiple conditions at unprecedented resolution and scale. Proactive quality control and appropriate data analysis techniques are of critical importance to extract the most meaningful results from the data. Over the last few years, an array of R/Bioconductor tools has been developed, allowing researchers to process and analyze ChIP-seq data. This chapter provides an overview of the methods available to analyze ChIP-seq data, based primarily on software packages from the open-source Bioconductor project. Protocols described in this chapter cover basic steps including data alignment, peak calling, quality control and data visualization, as well as more complex methods such as the identification of differentially bound regions and functional analyses to annotate regulatory regions. The steps in the data analysis process are demonstrated on publicly available data sets and serve as a demonstration of the computational procedures routinely used for the analysis of ChIP-seq data in R/Bioconductor, from which readers can construct their own analysis pipelines.
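To convey the core idea behind the peak-calling step named above, here is a toy sketch in Python (the chapter itself works in R/Bioconductor): windows where ChIP coverage greatly exceeds matched input coverage become candidate peaks. The synthetic coverage, window size, and fold-change threshold are all illustrative assumptions, not a real peak caller.

```python
# Toy peak calling: compare ChIP coverage to input coverage in sliding windows.
import numpy as np

chip = np.random.default_rng(0).poisson(5, 1000)    # synthetic ChIP coverage
chip[400:420] += 40                                 # implant an enriched region
input_ = np.random.default_rng(1).poisson(5, 1000)  # synthetic input/control

win = 20
enrich = np.array([chip[i:i + win].sum() / max(input_[i:i + win].sum(), 1)
                   for i in range(0, 1000 - win)])  # fold enrichment per window
peaks = np.where(enrich > 3)[0]                     # crude fold-change threshold
print(peaks[:5])                                    # window starts near 400
```

Real analyses replace each piece with statistically principled Bioconductor machinery (background models, multiple-testing control, annotation of the resulting regions).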
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L
2015-02-01
Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real-time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the underlying data structure for such a CBPS. The objective of the research effort is to develop guidance on how to design both the user interface and the underlying schema. This paper describes the results and insights gained from the research activities conducted to date.
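The attribute-driven step idea can be illustrated with a few lines of Python. The XML below is invented for illustration and is not the INL schema; it simply shows how a step's `type` attribute could tell a CBPS what functionality to generate.

```python
# Hypothetical procedure step as XML: element attributes drive CBPS behavior.
import xml.etree.ElementTree as ET

step_xml = """
<step id="3.1" type="decision">
  <instruction>Verify pump discharge pressure is within band.</instruction>
  <reference doc="P&amp;ID-102"/>
  <input kind="yes-no" target="pressure_ok"/>
</step>
"""

step = ET.fromstring(step_xml)
if step.get("type") == "decision":            # attribute determines functionality
    print(step.findtext("instruction"))       # display referential information
    print("prompt user:", step.find("input").get("kind"))   # request a decision
```

Because the behavior comes from parsing attributes rather than from hard-coded logic, new procedures can be authored without reprogramming the system, which is the point the abstract makes.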
Maize - GO annotation methods, evaluation, and review (Maize-GAMER)
USDA-ARS?s Scientific Manuscript database
Making a genome sequence accessible and useful involves three basic steps: genome assembly, structural annotation, and functional annotation. The quality of data generated at each step influences the accuracy of inferences that can be made, with high-quality analyses produce better datasets resultin...
Endobronchial valves for bronchopleural fistula: pitfalls and principles.
Gaspard, Dany; Bartter, Thaddeus; Boujaoude, Ziad; Raja, Haroon; Arya, Rohan; Meena, Nikhil; Abouzgheib, Wissam
2017-01-01
Placement of endobronchial valves for bronchopleural fistula (BPF) is not always straightforward. A simple guide to the steps for an uncomplicated procedure does not encompass pitfalls that need to be understood and overcome to maximize the efficacy of this modality. The objective of this study was to discuss examples of difficult cases for which the placement of endobronchial valves was not straightforward and required alterations in the usual basic steps. Subsequently, we aimed to provide guiding principles for a successful procedure. Six illustrative cases were selected to demonstrate issues that can arise during endobronchial valve placement. In each case, a real or apparent lack of decrease in airflow through a BPF was diagnosed and addressed. We have used the selected problem cases to illustrate principles, with the goal of helping to increase the success rate for endobronchial valve placement in the treatment of BPF. This series demonstrates issues that complicate effective placement of endobronchial valves for BPF. These issues form the basis for troubleshooting steps that complement the basic procedural steps.
Retraining walking adaptability following incomplete spinal cord injury.
Fox, Emily J; Tester, Nicole J; Butera, Katie A; Howland, Dena R; Spiess, Martina R; Castro-Chapman, Paula L; Behrman, Andrea L
2017-01-01
Functional walking requires the ability to modify one's gait pattern to environmental demands and task goals-gait adaptability. Following incomplete spinal cord injury (ISCI), gait rehabilitation such as locomotor training (Basic-LT) emphasizes intense, repetitive stepping practice. Rehabilitation approaches focusing on practice of gait adaptability tasks have not been established for individuals with ISCIs but may promote recovery of higher level walking skills. The primary purpose of this case series was to describe and determine the feasibility of administering a gait adaptability retraining approach-Adapt-LT-by comparing the dose and intensity of Adapt-LT to Basic-LT. Three individuals with ISCIs (>1 year, AIS C or D) completed three weeks each (15 sessions) of Basic-LT and Adapt-LT. Interventions included practice on a treadmill with body weight support and practice overground (≥30 mins total). Adapt-LT focused on speed changes, obstacle negotiation, and backward walking. Training parameters (step counts, speeds, perceived exertion) were compared and outcomes assessed pre and post interventions. Based on completion of the protocol and similarities in training parameters in the two interventions, it was feasible to administer Adapt-LT with a similar dosage and intensity as Basic-LT. Additionally, the participants demonstrated gains in walking function and balance following each training type. Rehabilitation that includes stepping practice with adaptability tasks is feasible for individuals with ISCIs. Further investigation is needed to determine the efficacy of Adapt-LT.
Processing Satellite Images on Tertiary Storage: A Study of the Impact of Tile Size on Performance
NASA Technical Reports Server (NTRS)
Yu, JieBing; DeWitt, David J.
1996-01-01
Before raw data from a satellite can be used by an Earth scientist, it must first undergo a number of processing steps including basic processing, cleansing, and geo-registration. Processing actually expands the volume of data collected by a factor of 2 or 3, and the original data is never deleted. Thus processing and storage requirements can exceed 2 terabytes/day. Once processed data is ready for analysis, a series of algorithms (typically developed by the Earth scientists) is applied to a large number of images in a data set. The focus of this paper is how best to handle such images stored on tape using the following assumptions: (1) all images of interest to a scientist are stored on a single tape, (2) images are accessed and processed in the order that they are stored on tape, and (3) the analysis requires access to only a portion of each image and not the entire image.
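Under assumptions (2) and (3), tiling lets the reader fetch only the tiles that overlap a region of interest, in the order they sit on tape. A minimal sketch of that access pattern (the tile and image sizes are hypothetical, not the paper's experimental values):

```python
# Which tiles of a tiled image intersect a region of interest, in tape order?
TILE = 512                     # hypothetical tile edge length (pixels)
IMAGE = 4096                   # square image edge length (pixels)
tiles_per_row = IMAGE // TILE

def tiles_for_roi(x0, y0, x1, y1):
    """Row-major (i.e., storage-order) tile indices intersecting the ROI."""
    cols = range(x0 // TILE, (x1 - 1) // TILE + 1)
    rows = range(y0 // TILE, (y1 - 1) // TILE + 1)
    return sorted(r * tiles_per_row + c for r in rows for c in cols)

print(tiles_for_roi(1000, 1000, 1600, 1200))   # small sub-image -> few tiles read
```

Smaller tiles reduce wasted I/O per image but increase the number of tape reads and per-tile overhead, which is exactly the tile-size trade-off the paper studies.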
Collagen Quantification in Tissue Specimens.
Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I
2017-01-01
Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.
Efficient green methanol synthesis from glycerol
NASA Astrophysics Data System (ADS)
Haider, Muhammad H.; Dummer, Nicholas F.; Knight, David W.; Jenkins, Robert L.; Howard, Mark; Moulijn, Jacob; Taylor, Stuart H.; Hutchings, Graham J.
2015-12-01
The production of biodiesel from the transesterification of plant-derived triglycerides with methanol has been commercialized extensively. Impure glycerol is obtained as a by-product at roughly one-tenth the mass of the biodiesel. Utilization of this crude glycerol is important in improving the viability of the overall process. Here we show that crude glycerol can be reacted with water over very simple basic or redox oxide catalysts to produce methanol in high yields, together with other useful chemicals, in a one-step low-pressure process. Our discovery opens up the possibility of recycling the crude glycerol produced during biodiesel manufacture. Furthermore, we show that molecules containing at least two hydroxyl groups can be converted into methanol, which demonstrates some aspects of the generality of this new chemistry.
Bioreactor concepts for cell culture-based viral vaccine production.
Gallo-Ramírez, Lilí Esmeralda; Nikolay, Alexander; Genzel, Yvonne; Reichl, Udo
2015-01-01
Vaccine manufacturing processes are designed to meet present and upcoming challenges associated with a growing vaccine market and to include multi-use facilities offering a broad portfolio and faster reaction times in case of pandemics and emerging diseases. The final products, from whole viruses to recombinant viral proteins, are very diverse, making standard process strategies hardly universally applicable. Numerous factors such as cell substrate, virus strain or expression system, medium, cultivation system, cultivation method, and scale need consideration. Reviewing options for efficient and economical production of human vaccines, this paper discusses basic factors relevant for viral antigen production in mammalian cells, avian cells and insect cells. In addition, bioreactor concepts, including static systems, single-use systems, stirred tanks and packed-beds are addressed. On this basis, methods towards process intensification, in particular operational strategies, the use of perfusion systems for high product yields, and steps to establish continuous processes are introduced.
Projector Center: Replication, Transcription, and Translation.
ERIC Educational Resources Information Center
Ruth, Edward B.
1984-01-01
Describes the use of a chart that systematically summarizes three basic steps that involve DNA and its decoding in both eukaryotic and prokaryotic cells: replication, transcription, and translation. Indicates that the chart (mounted on a transparency) does an adequate job of conveying basic information about nucleic acids to students. (DH)
Basic Emergency Medical Technician Skills Manual.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This manual was developed to help students preparing to become emergency medical technicians (EMTs) learn standardized basic skills in the field. The manual itemizes the steps and performance criteria of each required skill and uses an accompanying videotape series (not included) to enhance the educational experience. The five units of the manual,…
Project Logic Handbook: Computer Literacy through BASIC.
ERIC Educational Resources Information Center
Huber, Leonard; And Others
This handbook for teachers offers guidance on introducing computer literacy into elementary and secondary classrooms. It includes a list of computer concepts exemplified by each step in learning to write programs in BASIC Programming Language and the objectives for the elementary and secondary activities; suggestions for using computers in…
BASIC STEPS IN DESIGNING SCIENCE LABORATORIES.
ERIC Educational Resources Information Center
WHITNEY, FRANK L.
Planners of current university laboratories often make the same mistakes made by industrial laboratories 20 years ago. This can be remedied by increased communication between scientists and designers in seminars defining the basic needs of a particular laboratory situation. Electronic and mechanical equipment account for over 50 per cent of total…
McGarvey, Daniel J.; Falke, Jeffrey A.; Li, Hiram W.; Li, Judith; Hauer, F. Richard; Lamberti, G.A.
2017-01-01
Methods to sample fishes in stream ecosystems and to analyze the raw data, focusing primarily on assemblage-level (all fish species combined) analyses, are presented in this chapter. We begin with guidance on sample site selection, permitting for fish collection, and information-gathering steps to be completed prior to conducting fieldwork. Basic sampling methods (visual surveying, electrofishing, and seining) are presented with specific instructions for estimating population sizes via visual, capture-recapture, and depletion surveys, in addition to new guidance on environmental DNA (eDNA) methods. Steps to process fish specimens in the field including the use of anesthesia and preservation of whole specimens or tissue samples (for genetic or stable isotope analysis) are also presented. Data analysis methods include characterization of size-structure within populations, estimation of species richness and diversity, and application of fish functional traits. We conclude with three advanced topics in assemblage-level analysis: multidimensional scaling (MDS), ecological networks, and loop analysis.
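Of the assemblage-level analyses named above, diversity estimation is easy to make concrete. A small sketch computing species richness and Shannon diversity from counts (the species names and counts are hypothetical):

```python
# Species richness and Shannon diversity (H') from assemblage counts.
import math

counts = {"sculpin": 42, "dace": 17, "redside shiner": 8, "trout": 3}
n = sum(counts.values())

# H' = -sum(p_i * ln(p_i)) over species proportions p_i
H = -sum((c / n) * math.log(c / n) for c in counts.values())
print(f"species richness = {len(counts)}, Shannon H' = {H:.2f}")
```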
NASA Astrophysics Data System (ADS)
Iranmanesh, P.; Tabatabai Yazdi, Sh.; Mehran, M.; Saeednia, S.
2018-03-01
In this work, well-dispersed NiFe2O4 nanoparticles with diameters below 10 nm, good crystallinity and excellent magnetic properties were synthesized via a simple one-step, capping-agent-free coprecipitation route from metal chlorides. Ammonia was used as the precipitating agent and also as the controller of solution basicity. The effect of pH during the coprecipitation process was investigated in detail through microstructural, optical and magnetic characterizations of the synthesized particles using X-ray diffraction, transmission electron microscopy, Fourier transform infrared and UV-vis spectroscopy, and vibrating sample magnetometry. The results showed that the particle size, the departure from the inverse spinel structure, the band gap value and the magnetization of the Ni ferrite samples increase with pH from 9 to 11, indicating more pronounced surface effects in the smaller nanoparticles.
Marchwinski, Jay
2002-01-01
Every organization has a process for reviewing requests against their own guidelines, which are not always well defined. Normally, all requests must be able to demonstrate an adequate financial return on any investment of dollars and/or resources. Quality and service improvement benefits sometimes are accepted, but often must be translated into documentable dollars and cents. In this article, the author shows how understanding the basic concepts of revenue and expense is a good first step toward improving your chances of gaining approval. Forming an effective two-way partnership with the staff in your finance department will improve your chances even more.
Preparing images for publication: part 2.
Bengel, Wolfgang; Devigus, Alessandro
2006-08-01
The transition from conventional to digital photography presents many advantages for authors and photographers in the field of dentistry, but also many complexities and potential problems. No uniform procedures for authors and publishers exist at present for producing high-quality dental photographs. This two-part article aims to provide guidelines for preparing images for publication and improving communication between these two parties. Part 1 provided information about basic color principles, factors that can affect color perception, and digital color management. Part 2 describes the camera setup, discusses how to take a photograph suitable for publication, and outlines steps for the image editing process.
Signal and image processing for early detection of coronary artery diseases: A review
NASA Astrophysics Data System (ADS)
Mobssite, Youness; Samir, B. Belhaouari; Mohamad Hani, Ahmed Fadzil B.
2012-09-01
Today, biomedical signal- and image-based detection is a basic step in diagnosing heart diseases, in particular coronary artery diseases. The goal of this work is to provide non-invasive early detection of coronary artery diseases, relying on the analysis of images and ECG signals as a combined approach to extract features and to further classify and quantify the severity of DCAD by using the B-splines method. The aim is to create a prototype of biomedical imaging screening for coronary arteries to help cardiologists decide on the kind of treatment needed to reduce or control the risk of heart attack.
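The B-spline idea can be sketched briefly: fit a smoothing spline to a noisy 1-D signal and use the smooth curve (or its coefficients) as features. The signal below is synthetic, not a real ECG, and the smoothing factor is an arbitrary choice; this is a generic illustration, not the authors' feature-extraction pipeline.

```python
# Fit a smoothing B-spline to a noisy signal and recover a smooth curve.
import numpy as np
from scipy.interpolate import splrep, splev

t = np.linspace(0, 1, 200)
rng = np.random.default_rng(0)
signal = np.sin(8 * np.pi * t) + 0.2 * rng.normal(size=t.size)  # noisy "ECG"

tck = splrep(t, signal, s=4.0)    # knots + B-spline coefficients; s sets smoothing
smooth = splev(t, tck)            # evaluate the fitted spline

print(f"mean |residual| = {np.abs(signal - smooth).mean():.3f}")
```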
[Development of a High Power Green Laser Therapeutic Equipment for Hyperplasia of Prostate].
Liang, Jie; Kang, Hongxiang; Shen, Benjian; Zhao, Lusheng; Wu, Xinshe; Chen, Peng; Chang, Aihong; Guo, Hua; Guo, Jiayu
2015-09-01
The basic theory of high-power green laser equipment for prostate hyperplasia therapy and the components of the system developed are introduced. Considering the requirements of clinical therapy, the working process of the high-power green laser apparatus is designed, and stable laser output at 120 W is achieved. The controlling hardware and application software are developed, and a safety step is designed. The manufactured high-power green laser apparatus, characterized by stable output, multiple functions and a friendly interface, provides a domestically produced instrument option for prostate hyperplasia therapy.
Ion beams in radiotherapy - from tracks to treatment planning
NASA Astrophysics Data System (ADS)
Krämer, M.; Scifoni, E.; Wälzlein, C.; Durante, M.
2012-07-01
Several dozen clinical sites around the world apply beams of fast light ions for radiotherapeutical purposes. Thus there is a vested interest in the various physical and radiobiological processes governing the interaction of ion beams with matter, specifically living systems. We discuss the various modelling steps which lead from basic interactions to the application in actual patient treatment planning. The nano- and microscopic scale is covered by sample calculations with our TRAX code. On the macroscopic scale we feature the TRiP98 treatment planning system, which was clinically used in GSI's radiotherapy pilot project.
[Advancements of computer chemistry in separation of Chinese medicine].
Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei
2011-12-01
Separation techniques for Chinese medicine are not only key techniques in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and find regularities in the Chinese medicine system, which is full of complicated data. This paper analyzed the applicability, key technologies, basic modes and common algorithms of computer chemistry applied to the separation of Chinese medicine, introduced the mathematical models and parameter-setting methods of extraction kinetics, investigated several problems based on traditional Chinese medicine membrane processing, and forecast the application prospects.
From Diapers to Dating. A Parent's Guide to Raising Sexually Healthy Children.
ERIC Educational Resources Information Center
Haffner, Debra W.
This step-by-step program for raising sexually healthy children helps parents provide accurate information and communicate their own values to their children. Chapter 1, "The Basics"; includes "Sexually Healthy Families"; "The Key: Finding Teachable Moments"; and "Guidelines for Communication." Chapter 2,…
Audiovisual Fundamentals; Basic Equipment Operation and Simple Materials Production.
ERIC Educational Resources Information Center
Bullard, John R.; Mether, Calvin E.
A guide illustrated with simple sketches explains the functions and step-by-step uses of audiovisual (AV) equipment. Principles of projection, audio, AV equipment, lettering, limited-quantity and quantity duplication, and materials preservation are outlined. Apparatus discussed include overhead, opaque, slide-filmstrip, and multiple-loading slide…
Conquering technophobia: preparing faculty for today.
Richard, P L
1997-01-01
The constantly changing world of technology creates excitement and an obligation for faculty of schools of nursing to address computer literacy in the curricula at all levels. The initial step in the process of meeting the goals was to assist the faculty in becoming computer literate so that they could foster and encourage the same in the students. The implementation of The Cure for Technophobia included basic and advanced computer skills designed to assist the faculty in becoming comfortable and competent computer users. The applications addressed included: introduction to windows, electronic mail, word processing, presentation and database applications, library on-line searches of literature databases, introduction to internet browsers and a computerized testing program. Efforts were made to overcome barriers to computer literacy and promote the learning process. Familiar, competent, computer literate individuals were used to conduct the classes to accomplish this goal.
NASA Technical Reports Server (NTRS)
Gagliani, J.; Lee, R.; Sorathia, U. A.; Wilcoxson, A. L.
1980-01-01
A terpolyimide precursor was developed which can be foamed by microwave methods and yields foams possessing the best seating properties. A continuous process, based on spray drying techniques, permits production of polyimide powder precursors in large quantities. The constrained rise foaming process permits fabrication of rigid foam panels with improved mechanical properties and almost unlimited density characteristics. Polyimide foam core rigid panels were produced by this technique with woven fiberglass fabric bonded to each side of the panel in a one step microwave process. The fire resistance of polyimide foams was significantly improved by the addition of ceramic fibers to the powder precursors. Foams produced from these compositions are flexible, possess good acoustical attenuation and meet the minimum burnthrough requirements when impinged by high flux flame sources.
Yang, Xue; Li, Xue-You; Li, Jia-Guo; Ma, Jun; Zhang, Li; Yang, Jan; Du, Quan-Ye
2014-02-01
The fast Fourier transform (FFT) is a basic approach to remote sensing image processing. With the improvement of remote sensing image capture, featuring hyperspectral, high spatial resolution and high temporal resolution data, using FFT technology to process huge remote sensing images efficiently has become a critical step and a research hotspot in current image processing technology. The FFT, one of the basic algorithms of image processing, can be used for stripe noise removal, image compression, image registration, etc. The CUFFT library is a GPU-based FFT library, while FFTW is an FFT library developed for the CPU on the PC platform and is currently the fastest CPU-based FFT library. However, both share a common problem: once the available video memory or main memory is smaller than the image, out-of-memory errors or memory overflow occur when either method is used for image FFT arithmetic. To address this problem, a Huge Remote Fast Fourier Transform (HRFFT) algorithm based on the GPU and partitioning technology is proposed in this paper. By improving the FFT algorithm in the CUFFT library, the problem of out-of-memory errors and memory overflow is solved. Moreover, the method is validated by experiments on CCD images from the HJ-1A satellite. When applied to practical image processing, it improves the quality of the results and speeds up processing, saving computation time.
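A 2D FFT factors into 1D FFTs along the rows and then along the columns, which is what makes partition-based processing of oversized images possible in the first place. As a rough illustration of that decomposition (not the paper's HRFFT implementation, which runs on the GPU via CUFFT), here is a numpy sketch that streams blocks through memory-mapped files; file names, shapes, and the chunk size are illustrative assumptions.

```python
import numpy as np

def blockwise_fft2(src, dst, chunk=1024):
    """2D FFT via row-column decomposition.

    src, dst: 2D arrays (e.g. np.memmap); dst must be complex.
    Only `chunk` rows or columns are held in RAM at a time, so the
    full image never has to fit in memory at once.
    """
    n_rows, n_cols = src.shape
    # Pass 1: 1D FFT along each row, block by block.
    for r in range(0, n_rows, chunk):
        dst[r:r + chunk, :] = np.fft.fft(src[r:r + chunk, :], axis=1)
    # Pass 2: 1D FFT along each column of the intermediate result.
    for c in range(0, n_cols, chunk):
        dst[:, c:c + chunk] = np.fft.fft(dst[:, c:c + chunk], axis=0)
    return dst

# Illustrative usage with memory-mapped scratch storage:
src = np.memmap('image.dat', dtype=np.float32, mode='r', shape=(16384, 16384))
dst = np.memmap('spectrum.dat', dtype=np.complex64, mode='w+', shape=(16384, 16384))
blockwise_fft2(src, dst)
```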
Du, Yuncheng; Budman, Hector M; Duever, Thomas A
2017-06-01
Accurate and fast quantitative analysis of living cells from fluorescence microscopy images is useful for evaluating experimental outcomes and cell culture protocols. An algorithm is developed in this work to automatically segment and distinguish apoptotic cells from normal cells. The algorithm involves three steps consisting of two segmentation steps and a classification step. The segmentation steps are: (i) a coarse segmentation, combining a range filter with a marching square method, is used as a prefiltering step to provide the approximate positions of cells within a two-dimensional matrix used to store cells' images and the count of the number of cells for a given image; and (ii) a fine segmentation step using the Active Contours Without Edges method is applied to the boundaries of cells identified in the coarse segmentation step. Although this basic two-step approach provides accurate edges when the cells in a given image are sparsely distributed, the occurrence of clusters of cells in high cell density samples requires further processing. Hence, a novel algorithm for clusters is developed to identify the edges of cells within clusters and to approximate their morphological features. Based on the segmentation results, a support vector machine classifier that uses three morphological features: the mean value of pixel intensities in the cellular regions, the variance of pixel intensities in the vicinity of cell boundaries, and the lengths of the boundaries, is developed for distinguishing apoptotic cells from normal cells. The algorithm is shown to be efficient in terms of computational time, quantitative analysis, and differentiation accuracy, as compared with the use of the active contours method without the proposed preliminary coarse segmentation step.
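As a rough sketch of the coarse-segmentation step described above, the following Python fragment combines a range filter (local maximum minus local minimum) with a marching-squares contour trace; scikit-image's find_contours implements marching squares. The window size, threshold, and minimum contour length are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter
from skimage import measure

def coarse_segment(img, window=7, thresh=0.1, min_len=20):
    """Return approximate cell contours from a fluorescence image."""
    img = img.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-12)   # normalize to [0, 1]
    # Range filter: local max minus local min highlights textured cell regions.
    rng = maximum_filter(img, size=window) - minimum_filter(img, size=window)
    mask = rng > thresh
    # Marching squares traces iso-contours of the mask at level 0.5, giving
    # approximate cell positions and a count of cells in the image.
    contours = measure.find_contours(mask.astype(float), 0.5)
    return [c for c in contours if len(c) > min_len]   # drop tiny specks

# cells = coarse_segment(frame); len(cells) approximates the cell count
```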
Using the TSAR electromagnetic modeling system
NASA Astrophysics Data System (ADS)
Pennock, S. T.; Laguna, G. W.
1993-09-01
A new user, upon receipt of the TSAR EM modeling system, may be overwhelmed by the number of software packages to learn and the number of manuals associated with those packages. This is a document to describe the creation of a simple TSAR model, beginning with an MGED solid and continuing the process through final results from TSAR. It is not intended to be a complete description of all the parts of the TSAR package. Rather, it is intended simply to touch on all the steps in the modeling process and to take a new user through the system from start to finish. There are six basic parts to the TSAR package. The first, MGED, is part of the BRL-CAD package and is used to create a solid model. The second part, ANASTASIA, is the program used to sample the solid model and create a finite-difference mesh. The third program, IMAGE, lets the user view the mesh itself and verify its accuracy. If everything about the mesh is correct, the process continues to the fourth step, SETUP-TSAR, which creates the parameter files for compiling TSAR and the input file for running a particular simulation. The fifth step is actually running TSAR, the field modeling program. Finally, the output from TSAR is placed into SIG, B2RAS or another program for post-processing and plotting. Each of these steps will be described below. The best way to learn to use the TSAR software is to actually create and run a simple test problem. As an example of how to use the TSAR package, let's create a sphere with a rectangular internal cavity, with conical and cylindrical penetrations connecting the outside to the inside, and find the electric field inside the cavity when the object is exposed to a Gaussian plane wave. We will begin with the solid modeling software, MGED, a part of the BRL-CAD modeling release.
Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens
2014-07-07
The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling with the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to completion of preceding liquid transfer event, i.e. completely independent of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.
Using the TSAR Electromagnetic modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pennock, S.T.; Laguna, G.W.
1993-09-01
A new user, upon receipt of the TSAR EM modeling system, may be overwhelmed by the number of software packages to learn and the number of manuals associated with those packages. This is a document to describe the creation of a simple TSAR model, beginning with an MGED solid and continuing the process through final results from TSAR. It is not intended to be a complete description of all the parts of the TSAR package. Rather, it is intended simply to touch on all the steps in the modeling process and to take a new user through the system from start to finish. There are six basic parts to the TSAR package. The first, MGED, is part of the BRL-CAD package and is used to create a solid model. The second part, ANASTASIA, is the program used to sample the solid model and create a finite-difference mesh. The third program, IMAGE, lets the user view the mesh itself and verify its accuracy. If everything about the mesh is correct, the process continues to the fourth step, SETUP-TSAR, which creates the parameter files for compiling TSAR and the input file for running a particular simulation. The fifth step is actually running TSAR, the field modeling program. Finally, the output from TSAR is placed into SIG, B2RAS or another program for post-processing and plotting. Each of these steps will be described below. The best way to learn to use the TSAR software is to actually create and run a simple test problem. As an example of how to use the TSAR package, let's create a sphere with a rectangular internal cavity, with conical and cylindrical penetrations connecting the outside to the inside, and find the electric field inside the cavity when the object is exposed to a Gaussian plane wave. We will begin with the solid modeling software, MGED, a part of the BRL-CAD modeling release.
ERIC Educational Resources Information Center
Gandy, Robyn A.; Herial, Nabeel A.; Khuder, Sadik A.; Metting, Patricia J.
2008-01-01
This paper studies student performance predictions based on the United States Medical Licensure Exam (USMLE) Step 1. Subjects were second-year medical students from academic years of 2002 through 2006 (n = 711). Three measures of basic science knowledge (two curricular and one extracurricular) were evaluated as predictors of USMLE Step 1 scores.…
The book concerns Soviet and foreign experience in the design and use of synchronizers in the step gear boxes of transport vehicles. Side by side with a description of the basic steps in the development of gear engagement mechanisms and of the designs used in synchronizers of domestic and foreign manufacture, much attention is devoted in this work to the theory of gear engagement in gear boxes equipped with synchronizers, and to figuring out…
Process development and tooling design for intrinsic hybrid composites
NASA Astrophysics Data System (ADS)
Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.
2017-09-01
Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for use in crash-relevant structures. In the current state of the art, hybrid parts are mainly made in separate, subsequent forming and joining processes. By using the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step, shortening the overall processing time and thereby reducing manufacturing costs. The investigated hybrid part is made from continuous fibre reinforced plastic (FRP), into which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit. The form fit elements are intrinsically generated during the forming process. This contribution describes the development of the forming process and the design of the forming tool for the single-step production of a hybrid part. To this end a forming tool is developed which combines the thermoforming and the metal forming processes. The main challenge in designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies. On the basis of these investigations a control concept for steering the motion axes and the tool temperature is developed. Forming tests are carried out with the developed tool and the manufactured parts are analysed by computed tomography (CT) scans.
Ladd Effio, Christopher; Hahn, Tobias; Seiler, Julia; Oelmeier, Stefan A; Asen, Iris; Silberer, Christine; Villain, Louis; Hubbuch, Jürgen
2016-01-15
Recombinant protein-based virus-like particles (VLPs) are steadily gaining in importance as innovative vaccines against cancer and infectious diseases. Multiple VLPs are currently evaluated in clinical phases requiring a straightforward and rational process design. To date, there is no generic platform process available for the purification of VLPs. In order to accelerate and simplify VLP downstream processing, there is a demand for novel development approaches, technologies, and purification tools. Membrane adsorbers have been identified as promising stationary phases for the processing of bionanoparticles due to their large pore sizes. In this work, we present the potential of two strategies for designing VLP processes following the basic tenet of 'quality by design': High-throughput experimentation and process modeling of an anion-exchange membrane capture step. Automated membrane screenings allowed the identification of optimal VLP binding conditions yielding a dynamic binding capacity of 5.7 mg/mL for human B19 parvovirus-like particles derived from Spodoptera frugiperda Sf9 insect cells. A mechanistic approach was implemented for radial ion-exchange membrane chromatography using the lumped-rate model and stoichiometric displacement model for the in silico optimization of a VLP capture step. For the first time, process modeling enabled the in silico design of a selective, robust and scalable process with minimal experimental effort for a complex VLP feedstock. The optimized anion-exchange membrane chromatography process resulted in a protein purity of 81.5%, a DNA clearance of 99.2%, and a VLP recovery of 59%. Copyright © 2015 Elsevier B.V. All rights reserved.
A Two-Step Approach to Analyze Satisfaction Data
ERIC Educational Resources Information Center
Ferrari, Pier Alda; Pagani, Laura; Fiorio, Carlo V.
2011-01-01
In this paper a two-step procedure based on Nonlinear Principal Component Analysis (NLPCA) and Multilevel models (MLM) for the analysis of satisfaction data is proposed. The basic hypothesis is that observed ordinal variables describe different aspects of a latent continuous variable, which depends on covariates connected with individual and…
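A minimal sketch of the two-step idea, with plain PCA standing in for NLPCA (an admitted simplification) and statsmodels providing the multilevel model; the file and column names are invented for illustration.

```python
import pandas as pd
from sklearn.decomposition import PCA
import statsmodels.formula.api as smf

df = pd.read_csv('satisfaction.csv')        # items q1..q5 plus covariates

# Step 1: a single latent satisfaction score from the ordinal items
# (the paper uses NLPCA; ordinary PCA on standardized items stands in here).
items = df[['q1', 'q2', 'q3', 'q4', 'q5']]
df['score'] = PCA(n_components=1).fit_transform(
    (items - items.mean()) / items.std()).ravel()

# Step 2: multilevel model with individuals nested in groups (e.g. schools),
# regressing the latent score on individual-level covariates.
mlm = smf.mixedlm('score ~ age + income', df, groups=df['school'])
print(mlm.fit().summary())
```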
Programming--Translating Assessment into Action.
ERIC Educational Resources Information Center
Pyfer, Jean L.
Four steps should be taken in developing correct physical education programs for handicapped students. Assessment of student need is the first step, and a growing number of assessment instruments are available which provide information on basic locomotor skills and patterns and on physical and motor fitness. They are comprised of items that sample…
A facile synthesis of the basic steroidal skeleton using a Pauson-Khand reaction as a key step.
Kim, Do Han; Kim, Kwang; Chung, Young Keun
2006-10-13
A high-yield synthesis of steroid-type molecules under mild reaction conditions is achieved in two steps involving nucleophilic addition of alkynyl cerium reagent to an easily enolizable carbonyl compound (beta-tetralone) followed by an intramolecular Pauson-Khand reaction.
An Analysis of Instruction-Cached SIMD Computer Architecture
1993-12-01
[OCR residue from the report: a figure of the assemble/simulate/schedule/verify tool flow, plus table-of-contents fragments (Step 4: Schedule Cache Blocks; Step 5: Store Cache Blocks; Scheduler; Basic Block Definition).]
[Introduction to Exploratory Factor Analysis (EFA)].
Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón
2012-03-01
Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. The objectives are to present the main applications of this technique in a clear and concise manner, to determine the basic requirements for its use while providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its execution so as not to produce erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology needed to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
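For readers who want to follow the review's step-by-step methodology in code, here is a minimal sketch using the third-party Python package factor_analyzer (my choice, not the authors'): check the basic requirements (Bartlett's sphericity test and KMO sampling adequacy), then derive and rotate the factors. The data file, factor count, and cut-offs are illustrative.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

X = pd.read_csv('items.csv')          # respondents x questionnaire items

# Basic requirements: the correlation matrix must differ from identity
# (Bartlett) and sampling adequacy should be acceptable (KMO > ~0.6).
chi2, p = calculate_bartlett_sphericity(X)
_, kmo_total = calculate_kmo(X)
if p >= 0.05 or kmo_total < 0.6:
    print('warning: data may be unsuitable for EFA')

# Factor derivation with an orthogonal (varimax) rotation; the loadings
# matrix is what gets interpreted.
fa = FactorAnalyzer(n_factors=3, rotation='varimax')
fa.fit(X)
print(pd.DataFrame(fa.loadings_, index=X.columns))
```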
ERIC Educational Resources Information Center
Talpur, Mir Aftab Hussain; Napiah, Madzlan; Chandio, Imtiaz Ahmed; Memon, Irfan Ahmed
2014-01-01
Rural subregions of developing countries suffer from many physical and socioeconomic problems, including a scarcity of basic education institutions. This shortage has increased the distance between rural localities and education institutions. Hence, to curb this problem, this research aims to deal with the basic…
VAKT for Basic Subtraction Facts.
ERIC Educational Resources Information Center
Thornton, Carol A.; Toohey, Margaret A.
Guidelines are presented for modifying basic instruction of subtraction facts for elementary level learning disabled students. A detailed case study is used to illustrate a five-step structured program: (1) find a way to work it out; (2) add to check; (3) learn the partner facts; (4) study families of facts; (5) review and practice. The selection…
TACT: A Set of MSC/PATRAN- and MSC/NASTRAN- based Modal Correlation Tools
NASA Technical Reports Server (NTRS)
Marlowe, Jill M.; Dixon, Genevieve D.
1998-01-01
This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.
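Two of the correlation measures mentioned here, frequency percent difference and cross-orthogonality, reduce to short matrix computations. The numpy sketch below shows the textbook forms (TACT computes cross-orthogonality through NASTRAN itself); the inputs are assumed to be mass-normalized mode-shape matrices and a reduced mass matrix.

```python
import numpy as np

def freq_percent_diff(f_test, f_fem):
    """Percent frequency difference of analysis relative to test."""
    f_test, f_fem = np.asarray(f_test), np.asarray(f_fem)
    return 100.0 * (f_fem - f_test) / f_test

def cross_orthogonality(phi_test, phi_fem, M):
    """XOR = Phi_test^T * M * Phi_fem for mass-normalized mode shapes.

    Diagonal terms near 1.0 and off-diagonal terms near 0.0 indicate
    good test/analysis correlation for the paired modes."""
    return phi_test.T @ M @ phi_fem
```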
Water Infrastructure Needs and Investment: Review and Analysis of Key Issues
2008-11-24
the Rural Development Act of 1972, as amended (7 U.S.C. § 1926). The purpose of these USDA programs is to provide basic amenities, alleviate health… nonregulatory costs (e.g., routine replacement of basic infrastructure). Wastewater Needs. The most recent wastewater survey, conducted in 2004 and issued… 1.6 billion just to implement the most basic steps needed to improve security (such as better controlling access to facilities with fences, locks
Convertini, Livia S.; Quatraro, Sabrina; Ressa, Stefania; Velasco, Annalisa
2015-01-01
Background. Even though the interpretation of natural language messages is generally conceived as the result of a conscious processing of the message content, the influence of unconscious factors is also well known. What is still insufficiently known is the way such factors work. We have tackled interpretation assuming it is a process, whose basic features are the same for the whole humankind, and employing a naturalistic approach (careful observation of phenomena in conditions the closest to “natural” ones, and precise description before and independently of data statistical analysis). Methodology. Our field research involved a random sample of 102 adults. We presented them with a complete real world-like case of written communication using unabridged message texts. We collected data (participants’ written reports on their interpretations) in controlled conditions through a specially designed questionnaire (closed and opened answers); then, we treated it through qualitative and quantitative methods. Principal Findings. We gathered some evidence that, in written message interpretation, between reading and the attribution of conscious meaning, an intermediate step could exist (we named it “disassembling”) which looks like an automatic reaction to the text words/expressions. Thus, the process of interpretation would be a discontinuous sequence of three steps having different natures: the initial “decoding” step (i.e., reading, which requires technical abilities), disassembling (the automatic reaction, an unconscious passage) and the final conscious attribution of meaning. If this is true, words and expressions would firstly function like physical stimuli, before being taken into account as symbols. Such hypothesis, once confirmed, could help explaining some links between the cultural (human communication) and the biological (stimulus-reaction mechanisms as the basis for meanings) dimension of humankind. PMID:26528419
Teaching Mathematical Modelling for Earth Sciences via Case Studies
NASA Astrophysics Data System (ADS)
Yang, Xin-She
2010-05-01
Mathematical modelling is becoming crucially important for earth sciences because the modelling of complex systems such as geological, geophysical and environmental processes requires mathematical analysis, numerical methods and computer programming. However, a substantial fraction of earth science undergraduates and graduates may not have sufficient skills in mathematical modelling, which is due to either limited mathematical training or lack of appropriate mathematical textbooks for self-study. In this paper, we describe a detailed case-study-based approach for teaching mathematical modelling. We illustrate how essential mathematical skills can be developed for students with limited training in secondary mathematics so that they are confident in dealing with real-world mathematical modelling at university level. We have chosen various topics such as Airy isostasy, the greenhouse effect, sedimentation and Stokes' flow, free-air and Bouguer gravity, Brownian motion, rain-drop dynamics, impact cratering, heat conduction and cooling of the lithosphere as case studies; and we use these step-by-step case studies to teach exponentials, logarithms, spherical geometry, basic calculus, complex numbers, Fourier transforms, ordinary differential equations, vectors and matrix algebra, partial differential equations, geostatistics and basic numerical methods. Implications of teaching university mathematics to earth scientists for tomorrow's classroom will also be discussed. References: 1) D. L. Turcotte and G. Schubert, Geodynamics, 2nd Edition, Cambridge University Press (2002). 2) X. S. Yang, Introductory Mathematics for Earth Scientists, Dunedin Academic Press (2009).
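As a taste of the case-study style, one of the listed topics (cooling of the lithosphere) can be worked in a few lines: the half-space cooling solution is T(z,t) = Ts + (Tm - Ts) erf(z / (2 sqrt(kappa t))), a standard result found in Turcotte and Schubert. The parameter values below are typical textbook numbers, not taken from this paper.

```python
import numpy as np
from scipy.special import erf

Ts, Tm = 0.0, 1300.0            # surface and mantle temperatures, deg C
kappa = 1.0e-6                  # thermal diffusivity, m^2/s
t = 50e6 * 3.15e7               # plate age: 50 Myr expressed in seconds
z = np.linspace(0.0, 150e3, 7)  # depth, m

# Half-space cooling: error-function temperature profile with depth.
T = Ts + (Tm - Ts) * erf(z / (2.0 * np.sqrt(kappa * t)))
for zi, Ti in zip(z / 1e3, T):
    print(f"z = {zi:6.1f} km   T = {Ti:7.1f} C")
```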
A Comparison of Five Numerical Weather Prediction Analysis Climatologies in Southern High Latitudes.
NASA Astrophysics Data System (ADS)
Connolley, William M.; Harangozo, Stephen A.
2001-01-01
In this paper, numerical weather prediction analyses from four major centers are compared: the Australian Bureau of Meteorology (ABM), the European Centre for Medium-Range Weather Forecasts (ECMWF), the U.S. National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR), and The Met. Office (UKMO). Two of the series, the ECMWF reanalysis (ERA) and the NCEP-NCAR reanalysis (NNR), are 'reanalyses'; that is, the data have recently been processed through a consistent, modern analysis system. The other three (ABM, ECMWF operational (EOP), and UKMO) are archived from operational analyses. The primary focus in this paper is on the period of 1979-93, the period used for the reanalyses, and on climatology. However, ABM and NNR are also compared for the period before 1979, for which the evidence tends to favor NNR. The authors are concerned with basic variables (mean sea level pressure, height of the 500-hPa surface, and near-surface temperature) that are available from the basic analysis step, rather than more derived quantities (such as precipitation), which are available only from the forecast step. Direct comparisons against station observations, intercomparisons of the spatial pattern of the analyses, and intercomparisons of the temporal variation indicate that ERA, EOP, and UKMO are best for sea level pressure; that UKMO and EOP are best for 500-hPa height; and that none of the analyses perform well for near-surface temperature.
Theory and applications for optimization of every part of a photovoltaic system
NASA Technical Reports Server (NTRS)
Redfield, D.
1978-01-01
A general method is presented for quantitatively optimizing the design of every part and fabrication step of an entire photovoltaic system, based on the criterion of minimum cost/Watt for the system output power. It is shown that no element or process step can be optimized properly by considering only its own cost and performance. Moreover, a fractional performance loss at any fabrication step within the cell or array produces the same fractional increase in the cost/Watt of the entire array, but not of the full system. One general equation is found to be capable of optimizing all parts of a system, although the cell and array steps are basically different from the power-handling elements. Applications of this analysis are given to show (1) when Si wafers should be cut to increase their packing fraction; and (2) what the optimum dimensions for solar cell metallizations are. The optimum shadow fraction of the fine grid is shown to be independent of metal cost and resistivity as well as cell size. The optimum thicknesses of both the fine grid and the bus bar are substantially greater than the values in general use, and the total array cost has a major effect on these values. By analogy, this analysis is adaptable to other solar energy systems.
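The claim that a fractional performance loss at any fabrication step raises the array's cost/Watt by the same fraction follows because the losses enter the output power multiplicatively. A toy numeric check (all figures invented):

```python
# Cost/Watt of an array whose ideal output is degraded by a chain of
# fractional losses, one per fabrication step.
ideal_power_w = 100.0
array_cost = 400.0
losses = [0.02, 0.05, 0.01]              # fractional loss at each step

power = ideal_power_w
for loss in losses:
    power *= (1.0 - loss)
print(array_cost / power)                # baseline cost/Watt

# An extra 1% loss at *any* single step scales cost/Watt by 1/(1 - 0.01),
# i.e. about +1%, independent of which step it occurs at:
print((array_cost / (power * 0.99)) / (array_cost / power))   # ~1.0101
```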
Ab Initio ONIOM-Molecular Dynamics (MD) Study on the Deamination Reaction by Cytidine Deaminase
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsubara, Toshiaki; Dupuis, Michel; Aida, Misako
2007-08-23
We applied the ONIOM-molecular dynamics (MD) method to the hydrolytic deamination of cytidine by cytidine deaminase, which is an essential step of the activation process of the anticancer drug inside the human body. The direct MD simulations were performed for the realistic model of cytidine deaminase, calculating the energy and its gradient by the ab initio ONIOM method on the fly. The ONIOM-MD calculations including the thermal motion show that the neighboring amino acid residue is an important factor of the environmental effects and significantly affects not only the geometry and energy of the substrate trapped in the pocket of the active site but also the elementary step of the catalytic reaction. We successfully simulate the second half of the catalytic cycle, which has been considered to involve the rate-determining step, and reveal that the rate-determining step is the release of the NH3 molecule. TM and MA were supported in part by grants from the Ministry of Education, Culture, Sports, Science and Technology of Japan. MD was supported by the Division of Chemical Sciences, Office of Basic Energy Sciences, and by the Office of Biological and Environmental Research of the U.S. Department of Energy DOE. Battelle operates Pacific Northwest National Laboratory for DOE.
Lau, Dennis H; Volders, Paul G A; Kohl, Peter; Prinzen, Frits W; Zaza, Antonio; Kääb, Stefan; Oto, Ali; Schotten, Ulrich
2015-05-01
Cardiac electrophysiology has evolved into an important subspecialty in cardiovascular medicine. This is in part due to the significant advances made in our understanding and treatment of heart rhythm disorders following more than a century of scientific discoveries and research. More recently, the rapid development of technology in cellular electrophysiology, molecular biology, genetics, computer modelling, and imaging have led to the exponential growth of knowledge in basic cardiac electrophysiology. The paradigm of evidence-based medicine has led to a more comprehensive decision-making process and most likely to improved outcomes in many patients. However, implementing relevant basic research knowledge in a system of evidence-based medicine appears to be challenging. Furthermore, the current economic climate and the restricted nature of research funding call for improved efficiency of translation from basic discoveries to healthcare delivery. Here, we aim to (i) appraise the broad challenges of translational research in cardiac electrophysiology, (ii) highlight the need for improved strategies in the training of translational electrophysiologists, and (iii) discuss steps towards building a favourable translational research environment and culture. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
Williams, Gareth J.; Hammel, Michal; Radhakrishnan, Sarvan Kumar; Ramsden, Dale; Lees-Miller, Susan P.; Tainer, John A.
2014-01-01
Non-homologous end joining (NHEJ) is the major pathway for repair of DNA double-strand breaks (DSBs) in human cells. NHEJ is also needed for V(D)J recombination and the development of T and B cells in vertebrate immune systems, and acts in both the generation and prevention of non-homologous chromosomal translocations, a hallmark of genomic instability and many human cancers. X-ray crystal structures, cryo-electron microscopy envelopes, and small angle X-ray scattering (SAXS) solution conformations and assemblies are defining most of the core protein components for NHEJ: Ku70/Ku80 heterodimer; the DNA dependent protein kinase catalytic subunit (DNA-PKcs); the structure-specific endonuclease Artemis along with polynucleotide kinase/phosphatase (PNKP), aprataxin and PNKP related protein (APLF); the scaffolding proteins XRCC4 and XLF (XRCC4-like factor); DNA polymerases, and DNA ligase IV (Lig IV). The dynamic assembly of multi-protein NHEJ complexes at DSBs is regulated in part by protein phosphorylation. The basic steps of NHEJ have been biochemically defined to require: 1) DSB detection by the Ku heterodimer with subsequent DNA-PKcs tethering to form the DNA-PKcs-Ku-DNA complex (termed DNA-PK), 2) lesion processing, and 3) DNA end ligation by Lig IV, which functions in complex with XRCC4 and XLF. The current integration of structures by combined methods is resolving puzzles regarding the mechanisms, coordination and regulation of these three basic steps. Overall, structural results suggest the NHEJ system forms a flexing scaffold with the DNA-PKcs HEAT repeats acting as compressible macromolecular springs suitable to store and release conformational energy to apply forces to regulate NHEJ complexes and the DNA substrate for DNA end protection, processing, and ligation. PMID:24656613
In Vitro Characterization of the Two-Stage Non-Classical Reassembly Pathway of S-Layers
Breitwieser, Andreas; Iturri, Jagoba; Toca-Herrera, Jose-Luis; Sleytr, Uwe B.; Pum, Dietmar
2017-01-01
The recombinant bacterial surface layer (S-layer) protein rSbpA of Lysinibacillus sphaericus CCM 2177 is an ideal model system to study non-classical nucleation and growth of protein crystals at surfaces since the recrystallization process may be separated into two distinct steps: (i) adsorption of S-layer protein monomers on silicon surfaces is completed within 5 min and the amount of bound S-layer protein sufficient for the subsequent formation of a closed crystalline monolayer; (ii) the recrystallization process is triggered—after washing away the unbound S-layer protein—by the addition of a CaCl2 containing buffer solution, and completed after approximately 2 h. The entire self-assembly process including the formation of amorphous clusters, the subsequent transformation into crystalline monomolecular arrays, and finally crystal growth into extended lattices was investigated by quartz crystal microbalance with dissipation (QCM-D) and atomic force microscopy (AFM). Moreover, contact angle measurements showed that the surface properties of S-layers change from hydrophilic to hydrophobic as the crystallization proceeds. This two-step approach is new in basic and application driven S-layer research and, most likely, will have advantages for functionalizing surfaces (e.g., by spray-coating) with tailor-made biological sensing layers. PMID:28216572
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations
NASA Astrophysics Data System (ADS)
Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
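ExcelAutomat itself is VBA living inside a workbook, but the batch step it automates (walk a directory of output files, pull one parameter from each, compile a table) is easy to picture in code. The Python sketch below is an analogous stand-alone version, not part of the tool; the "SCF Done" pattern targets Gaussian-style log files, and the paths are invented.

```python
import csv
import glob
import re

# Matches Gaussian-style lines such as: SCF Done:  E(RHF) =  -76.0107465 ...
pattern = re.compile(r'SCF Done:\s+E\([^)]+\)\s*=\s*(-?\d+\.\d+)')

rows = []
for path in sorted(glob.glob('jobs/*.log')):
    with open(path) as f:
        matches = pattern.findall(f.read())
    # Keep the last SCF energy in the file (the final cycle), if any.
    rows.append((path, matches[-1] if matches else 'NOT CONVERGED'))

with open('energies.csv', 'w', newline='') as f:
    csv.writer(f).writerows([('file', 'scf_energy_hartree'), *rows])
```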
A Christian faith-based recovery theory: understanding God as sponsor.
Timmons, Shirley M
2012-12-01
This article reports the development of a substantive theory to explain an evangelical Christian-based process of recovery from addiction. Faith-based, 12-step, mutual aid programs can improve drug abstinence by offering: (a) an intervention option alone and/or in conjunction with secular programs and (b) an opportunity for religious involvement. Although literature on religion, spirituality, and addiction is voluminous, traditional 12-step programs fail to explain the mechanism that underpins the process of Christian-based recovery (CR). This pilot study used grounded theory to explore and describe the essence of recovery of 10 former crack cocaine-addicted persons voluntarily enrolled in a CR program. Data were collected from in-depth interviews during 4 months of 2008. Audiotapes were transcribed verbatim, and the constant comparative method was used to analyze data resulting in the basic social process theory, understanding God as sponsor. The theory was determined through writing theoretical memos that generated key elements that allow persons to recover: acknowledging God-centered crises, communicating with God, and planning for the future. Findings from this preliminary study identifies important factors that can help persons in recovery to sustain sobriety and program administrators to benefit from theory that guides the development of evidence-based addiction interventions.
Albadarin, Ahmad B; Mangwandi, Chirangano
2015-12-01
The biosorption of the anionic dye Alizarin Red S (ARS) and the cationic dye methylene blue (MB) onto olive stone (OS) biomass has been investigated as a function of contact time, initial concentration and solution pH. Equilibrium biosorption isotherms in single and binary systems and kinetics in batch mode were also examined. The kinetic data of the two dyes were better described by the pseudo-second-order model. At low concentration, ARS dye appeared to follow a two-step diffusion process, while MB dye followed a three-step diffusion process. The biosorption experimental data for ARS and MB dyes were well described by the Redlich-Peterson isotherm. The maximum biosorption of ARS dye, qmax = 16.10 mg/g, was obtained at pH 3.28, and the maximum biosorption of MB dye, qmax = 13.20 mg/g, was observed at basic pH values. In the binary system, the results indicated that the MB dye diffuses into the biosorbent particle first and occupies the biosorption sites, forming a monodentate complex; the ARS dye enters afterwards and can bind only to unoccupied sites, forming a tridentate complex with OS active sites. Copyright © 2015 Elsevier Ltd. All rights reserved.
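The pseudo-second-order model mentioned above has the closed form q(t) = k qe^2 t / (1 + k qe t), so fitting it to batch uptake data is a one-call job with scipy. The data points below are invented placeholders, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def pso(t, qe, k):
    """Pseudo-second-order uptake: q(t) = k*qe^2*t / (1 + k*qe*t)."""
    return k * qe**2 * t / (1.0 + k * qe * t)

t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 120.0])   # contact time, min
q = np.array([4.1, 7.0, 10.2, 12.8, 14.0, 15.5])     # uptake, mg/g (placeholder)

(qe, k), _ = curve_fit(pso, t, q, p0=[q.max(), 0.01])
print(f"qe = {qe:.2f} mg/g, k = {k:.4f} g/(mg min)")
```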
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.
Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
The thermodynamics of pyrochemical processes for liquid metal reactor fuel cycles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, I.
1987-01-01
The thermodynamic basis for pyrochemical processes for the recovery and purification of fuel for the liquid metal reactor fuel cycle is described. These processes involve the transport of the uranium and plutonium from one liquid alloy to another through a molten salt. The processes discussed use liquid alloys of cadmium, zinc, and magnesium and molten chloride salts. The oxidation-reduction steps are done either chemically by the use of an auxiliary redox couple or electrochemically by the use of an external electrical supply. The same basic thermodynamics apply to both the salt transport and the electrotransport processes. Large deviations from ideal solution behavior of the actinides and lanthanides in the liquid alloys have a major influence on the solubilities and the performance of both the salt transport and electrotransport processes. Separation of plutonium and uranium from each other and decontamination from the more noble fission product elements can be achieved using both transport processes. The thermodynamic analysis is used to make process design computations for different process conditions.
Gonzalez-Sanchez, M Beatriz; Lopez-Valeiras, Ernesto; Morente, Manuel M; Fernández Lago, Orlando
2013-10-01
Current economic conditions and budget constraints in publicly funded biomedical research have brought about a renewed interest in analyzing the cost and economic viability of research infrastructures. However, there are no proposals for specific cost accounting models for these types of organizations in the international scientific literature. The aim of this paper is to present the basis of a cost analysis model useful for any biobank regardless of the human biological samples that it stores for biomedical research. The development of a unique cost model for biobanks can be a complicated task due to the diversity of the biological samples they store. Different types of samples (DNA, tumor tissues, blood, serum, etc.) require different production processes. Nonetheless, the common basic steps of the production process can be identified. Thus, the costs incurred in each step can be analyzed in detail to provide cost information. Six stages and four cost objects were obtained by taking the production processes of biobanks belonging to the Spanish National Biobank Network as a starting point. Templates and examples are provided to help managers to identify and classify the costs involved in their own biobanks to implement the model. The application of this methodology will provide accurate information on cost objects, along with useful information to give an economic value to the stored samples, to analyze the efficiency of the production process and to evaluate the viability of some sample collections.
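The roll-up the authors describe (identify the common production stages, attach costs, derive a value per stored sample) can be illustrated with a toy computation; the stage names and figures below are invented, not the paper's templates.

```python
# Per-batch costs for common stages of a biobank workflow (invented figures).
stage_costs = {
    'reception':       120.0,   # EUR per batch
    'processing':      340.0,
    'quality_control':  90.0,
    'storage_year':     60.0,   # EUR per batch per year of storage
}
samples_per_batch = 50
years_stored = 5

per_sample = (stage_costs['reception']
              + stage_costs['processing']
              + stage_costs['quality_control']
              + years_stored * stage_costs['storage_year']) / samples_per_batch
print(f"cost (economic value) per stored sample: {per_sample:.2f} EUR")
```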
Das, Anup Kumar; Mandal, Vivekananda; Mandal, Subhash C
2014-01-01
Extraction forms the very basic step in research on natural products for drug discovery. A poorly optimised and planned extraction methodology can jeopardise the entire mission. The objective is to provide a vivid picture of different chemometric tools and of planning for process optimisation and method development in the extraction of botanical material, with emphasis on microwave-assisted extraction (MAE). A review of studies involving the application of chemometric tools in combination with MAE of botanical materials was undertaken in order to discover the significant extraction factors. To optimise a response by fine-tuning those factors, experimental design or statistical design of experiments (DoE), a core area of study in chemometrics, was then used for statistical analysis and interpretation. In this review, a brief explanation of the different aspects and methodologies related to MAE of botanical materials that have been subjected to experimental design is presented, along with some general chemometric tools and the steps involved in the practice of MAE. A detailed study of the various factors and responses involved in the optimisation is also presented. This article will assist in obtaining a better insight into the chemometric strategies of process optimisation and method development, which will in turn improve the decision-making process in selecting influential extraction parameters. Copyright © 2013 John Wiley & Sons, Ltd.
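As a concrete instance of the DoE planning the review surveys, the snippet below builds a two-level full factorial design for three typical MAE factors; the factor names and levels are invented, and in practice the run order would be randomized before the extractions are performed.

```python
import itertools
import pandas as pd

levels = {
    'power_W':     (300, 600),
    'time_min':    (5, 15),
    'solvent_pct': (50, 80),
}
# 2^3 = 8 runs covering every combination of low/high factor levels.
design = pd.DataFrame(list(itertools.product(*levels.values())),
                      columns=list(levels))
print(design)

# After measuring the extraction yield of each run, a linear model with
# interactions identifies the influential factors, e.g. with statsmodels:
#   smf.ols('extraction_yield ~ power_W * time_min * solvent_pct',
#           data=design).fit().summary()
```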
Ginosar, Daniel M.; Fox, Robert V.
2005-05-03
A process for producing alkyl esters useful in biofuels and lubricants by transesterifying glyceride- or esterifying free fatty acid-containing substances in a single critical phase medium is disclosed. The critical phase medium provides increased reaction rates, decreases the loss of catalyst or catalyst activity and improves the overall yield of desired product. The process involves the steps of dissolving an input glyceride- or free fatty acid-containing substance with an alcohol or water into a critical fluid medium; reacting the glyceride- or free fatty acid-containing substance with the alcohol or water input over either a solid or liquid acidic or basic catalyst and sequentially separating the products from each other and from the critical fluid medium, which critical fluid medium can then be recycled back in the process. The process significantly reduces the cost of producing additives or alternatives to automotive fuels and lubricants utilizing inexpensive glyceride- or free fatty acid-containing substances, such as animal fats, vegetable oils, rendered fats, and restaurant grease.
NASA Astrophysics Data System (ADS)
Leier, André; Marquez-Lago, Tatiana T.; Burrage, Kevin
2008-05-01
The delay stochastic simulation algorithm (DSSA) by Barrio et al. [PLoS Comput. Biol. 2, e117 (2006)] was developed to simulate delayed processes in cell biology in the presence of intrinsic noise, that is, when there are small-to-moderate numbers of certain key molecules present in a chemical reaction system. These delayed processes can faithfully represent complex interactions and mechanisms that imply a number of spatiotemporal processes often not explicitly modeled, such as transcription and translation, which are basic to the modeling of cell signaling pathways. However, for systems with widely varying reaction rate constants or large numbers of molecules, the simulation time steps of both the stochastic simulation algorithm (SSA) and the DSSA can become very small, causing considerable computational overheads. In order to overcome the limit of small step sizes, various τ-leap strategies have been suggested for improving the computational performance of the SSA. In this paper, we present a binomial τ-DSSA method that extends the τ-leap idea to the delay setting and avoids drawing insufficient numbers of reactions, a common shortcoming of existing binomial τ-leap methods that becomes evident when dealing with complex chemical interactions. The resulting inaccuracies are most evident in the delayed case, even when considering reaction products as potential reactants within the same time step in which they are produced. Moreover, we extend the framework to account for multicellular systems with different degrees of intercellular communication. We apply these ideas to two important genetic regulatory models, namely, the hes1 gene, implicated as a molecular clock, and a Her1/Her7 model for coupled oscillating cells.
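To make the setting concrete, here is a minimal exact DSSA in the spirit of Barrio et al. for a single species with delayed production (the product appears a fixed time tau after initiation) and first-order decay; the binomial τ-DSSA of this paper accelerates exactly this kind of simulation. The rate constants and delay are invented.

```python
import heapq
import math
import random

def dssa(x0=10, k_prod=2.0, k_dec=0.1, tau=5.0, t_end=100.0, seed=1):
    """Exact delay SSA for: 0 -> X (delayed by tau), X -> 0 (immediate)."""
    random.seed(seed)
    t, x = 0.0, x0
    pending = []                 # completion times of initiated productions
    traj = [(t, x)]
    while t < t_end:
        a_prod, a_dec = k_prod, k_dec * x          # propensities
        a0 = a_prod + a_dec
        dt = -math.log(random.random()) / a0       # next SSA firing time
        if pending and pending[0] <= t + dt:
            # A previously initiated delayed product completes first.
            t = heapq.heappop(pending)
            x += 1
        else:
            t += dt
            if t >= t_end:
                break
            if random.random() * a0 < a_prod:
                heapq.heappush(pending, t + tau)   # schedule delayed product
            else:
                x -= 1                             # immediate decay
        traj.append((t, x))
    return traj

print(dssa()[-5:])
```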
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cermelli, Paolo; Jabbour, Michel E.; Department of Mathematics, University of Kentucky, Lexington, Kentucky 40506-0027
A thermodynamically consistent continuum theory for single-species, step-flow epitaxy that extends the classical Burton-Cabrera-Frank (BCF) framework is derived from basic considerations. In particular, an expression for the step chemical potential is obtained that contains two energetic contributions, one from the adjacent terraces in the form of the jump in the adatom grand canonical potential, and the other from the monolayer of crystallized adatoms that underlies the upper terrace in the form of the nominal bulk chemical potential, thus generalizing the classical Gibbs-Thomson relation to the dynamic, dissipative setting of step-flow growth. The linear stability analysis of the resulting quasistatic free-boundary problem for an infinite train of equidistant rectilinear steps yields explicit (i.e., analytical) criteria for the onset of step bunching in terms of the basic physical and geometric parameters of the theory. It is found that, in contrast with the predictions of the classical BCF model, both in the absence as well as in the presence of desorption, a growth regime exists for which step bunching occurs, except possibly in the dilute limit where the train is always stable to step bunching. In the present framework, the onset of one-dimensional instabilities is directly attributed to the energetic influence on the migrating steps of the adjacent terraces. Hence the theory provides a "minimalist" alternative to existing theories of step bunching and should be relevant to, e.g., molecular beam epitaxy of GaAs, where the equilibrium adatom density is shown by Tersoff, Johnson, and Orr [Phys. Rev. Lett. 78, 282 (1997)] to be extremely high.
Simulated Exercise Physiology Laboratories.
ERIC Educational Resources Information Center
Morrow, James R., Jr.; Pivarnik, James M.
This book consists of a lab manual and computer disks for either Apple or IBM hardware. The lab manual serves as a "tour guide" for the learner going through the various lab experiences. The manual contains definitions, proper terminology, and other basic information about physiological principles. It is organized so a step-by-step procedure may be…
Effluent Monitoring Procedures: Basic Parameters for Municipal Effluents. Staff Guide.
ERIC Educational Resources Information Center
Environmental Protection Agency, Washington, DC. Office of Water Programs.
This is one of several short-term courses developed to assist in the training of waste water treatment plant operational personnel in the tests, measurements, and report preparation required for compliance with their NPDES Permits. This Staff Guide provides step-by-step guidelines on course planning, development and implementation involving…
Self-Monitoring Procedures: Basic Parameters for Municipal Effluents. Student Reference Manual.
ERIC Educational Resources Information Center
Environmental Protection Agency, Washington, DC. Office of Water Programs.
This is one of several short-term courses developed to assist in the training of waste water treatment plant operational personnel in the tests, measurements, and report preparation required for compliance with their NPDES Permits. The Student Reference Manual provides step-by-step procedures for laboratory application of equipment operating…
Procedures for Conducting Installation Compatible Use Zone (ICUZ) studies
1988-08-01
Getting to Yes (Penguin Books, New York, 1986). This book is a national bestseller on negotiation. It provides a concise, step-by-step, proven strategy for…
Widening Horizons: A Guide to Organizing Field Trips for Adult Students. Final Report.
ERIC Educational Resources Information Center
Lutheran Social Mission Society, Philadelphia, PA. Lutheran Settlement House.
Based on a successful program for women conducted by Lutheran Settlement House in Philadelphia, this guide outlines step-by-step procedures for conducting educational field trips for students in adult basic education programs. The guide offers suggestions for identification of cultural, historical, and social resources that would provide valuable…
A Teacher's Project Guide to the Internet.
ERIC Educational Resources Information Center
Crotchett, Kevin R.
This book is a step-by-step guide to the Internet, suggesting creative K-12 classroom projects ranging from one of the most basic Internet functions, e-mail, to one of the more difficult, writing World Wide Web homepages. Intervening chapters describe projects using Usenet Newsgroups, File Transfer Protocols (FTP), Gopher, Veronica, the World Wide…
Seven Basic Steps to Solving Ethical Dilemmas in Special Education: A Decision-Making Framework
ERIC Educational Resources Information Center
Stockall, Nancy; Dennis, Lindsay R.
2015-01-01
This article presents a seven-step framework for decision making to solve ethical issues in special education. The authors developed the framework from the existing literature and theoretical frameworks of justice, critique, care, and professionalism. The authors briefly discuss each theoretical framework and then describe the decision-making…
A Curriculum Package for Implementing Instruction in Electricity Fundamentals/House Wiring.
ERIC Educational Resources Information Center
Murphy, Brian P.
This curriculum guide is designed for instructors of secondary industrial arts, vocational, and apprenticeship programs. The material is presented in two sections. Section I provides step-by-step instructions on how to present basic electrical circuit concepts with the use of a simply-made breadboard. Included in this section is the following…
An Analysis of the Carpentry Occupation.
ERIC Educational Resources Information Center
McKinney, Oral O.; And Others
The general purpose of the occupational analysis is to provide workable, basic information dealing with the many and varied duties performed in the carpentry occupation. The analysis starts with the progress of a house from the first study of the blueprints to the laying out of the excavations and continuing step-by-step until the interior finish…
Carnivalesque Enactment at the Children's Medical Centre of Rabin Hospital.
ERIC Educational Resources Information Center
Lev-Aladgem, Shulamith
2000-01-01
Describes the basic characteristics of the "carnivalesque enactment" and its therapeutic potential. Explains a case study of the drama project at the Rabin Children's Medical Centre, how the carnivalesque enactment was developed step by step, and the kind of effect it stimulated among the children. Suggests new theatrical experiments with…
Cost Accounting and Accountability: One Approach.
ERIC Educational Resources Information Center
Gingold, William
This paper outlines an approach designed to provide an accurate and efficient cost accounting system for use in schools and other social service organizations. In his discussion, the author presents a detailed step-by-step description of how to establish, plan, and operate the system. The basic element of the system is the Daily Event Record…
Food on Campus: A Recipe for Action.
ERIC Educational Resources Information Center
Kinsella, Susan
Really good food can be served in any school, and this step-by-step guide contains the basics of understanding and reforming food service: detailed explanations of how food services are run; guidelines for rating the food service; the wholesome, good-tasting foods students really like to eat yet are affordable and manageable. Included are plans…
Mizota, Tomoko; Kurashima, Yo; Poudel, Saseem; Watanabe, Yusuke; Shichinohe, Toshiaki; Hirano, Satoshi
2018-07-01
Despite its advantages, few trainees outside of North America have access to simulation training. We hypothesized that a stepwise training method using tele-mentoring system would be an efficient technique for training in basic laparoscopic skills. Residents were randomized into two groups and trained to proficiency in intracorporeal suturing. The stepwise group (SG) practiced the task step-by-step, while the other group practiced comprehensively (CG). Each participant received weekly coaching via two-way web conferencing software. The duration of the coaching sessions and self-practice time were compared between the two groups. Twenty residents from 15 institutions participated, and all achieved proficiency. Coaching sessions using tele-mentoring system were completed without difficulties. The SG required significantly shorter coaching time per session than the CG (p = .002). There was no significant difference in self-practice time. The stepwise training method with the tele-mentoring system appears to make efficient use of surgical trainees' and trainers' time. Copyright © 2017 Elsevier Inc. All rights reserved.
Endobronchial valves for bronchopleural fistula: pitfalls and principles
Gaspard, Dany; Bartter, Thaddeus; Boujaoude, Ziad; Raja, Haroon; Arya, Rohan; Meena, Nikhil; Abouzgheib, Wissam
2016-01-01
Background: Placement of endobronchial valves for bronchopleural fistula (BPF) is not always straightforward. A simple guide to the steps for an uncomplicated procedure does not encompass pitfalls that need to be understood and overcome to maximize the efficacy of this modality. Objectives: The objective of this study was to discuss examples of difficult cases for which the placement of endobronchial valves was not straightforward and required alterations in the usual basic steps. Subsequently, we aimed to provide guiding principles for a successful procedure. Methods: Six illustrative cases were selected to demonstrate issues that can arise during endobronchial valve placement. Results: In each case, a real or apparent lack of decrease in airflow through a BPF was diagnosed and addressed. We have used the selected problem cases to illustrate principles, with the goal of helping to increase the success rate for endobronchial valve placement in the treatment of BPF. Conclusions: This series demonstrates issues that complicate effective placement of endobronchial valves for BPF. These issues form the basis for troubleshooting steps that complement the basic procedural steps. PMID:27742781
ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.
Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus
2011-12-01
The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
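The pipeline the abstract describes — artifact detection, correction, then time-domain HRV statistics — can be sketched in a few lines. The snippet below is a minimal stand-in, not ARTiiFACT's actual code or API: it flags intervals that deviate strongly from the series median, patches them by interpolation, and computes SDNN and RMSSD. The 30% threshold and all function names are invented for illustration.

    # Minimal sketch (not ARTiiFACT's API): flag IBI artifacts with a simple
    # percentage-deviation rule, patch them by linear interpolation, and
    # compute two basic time-domain HRV measures.
    import numpy as np

    def clean_and_score(ibi_ms, max_dev=0.3):
        """ibi_ms: 1-D sequence of interbeat intervals in milliseconds."""
        ibi = np.asarray(ibi_ms, dtype=float)
        # An interval is suspect if it deviates > max_dev (30%) from the
        # series median -- a deliberately crude stand-in for a robust
        # detection algorithm.
        med = np.median(ibi)
        bad = np.abs(ibi - med) > max_dev * med
        good_idx = np.flatnonzero(~bad)
        ibi[bad] = np.interp(np.flatnonzero(bad), good_idx, ibi[good_idx])
        sdnn = ibi.std(ddof=1)                       # overall variability
        rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))  # beat-to-beat variability
        return ibi, sdnn, rmssd

    ibi = [812, 805, 1630, 798, 820, 790, 410, 815]  # two obvious artifacts
    _, sdnn, rmssd = clean_and_score(ibi)
    print(f"SDNN={sdnn:.1f} ms, RMSSD={rmssd:.1f} ms")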
Scientific data processing for Hipparcos
NASA Astrophysics Data System (ADS)
van der Marel, H.
1988-04-01
The scientific aims of the ESA Hipparcos astrometric satellite are reviewed, and the fundamental principles and practical implementation of the data-analysis and data-reduction procedures are discussed in detail. Hipparcos is to determine the positions and proper motions of a catalog of 110,000 stars to a limit of 12 mag with an accuracy of a few milliarcseconds, and to obtain photometric observations of 400,000 stars (the Tycho mission). Consideration is given to the organization of the data-processing consortia FAST, NDAC, and TDAC; the basic problems of astrometry; the measurement principle; the large amounts of data to be generated during the 2.5-year mission; and the three-step iterative method to be applied (positional reconstruction and reduction to a reference great circle, spherical reconstruction, and extraction of the astrometric parameters). Diagrams and a flow chart are provided.
Back to Basics: Preventing Surgical Fires.
Spruce, Lisa
2016-09-01
When fires occur in the OR, they are devastating and potentially fatal to both patients and health care workers. Fires can be prevented by understanding the fire triangle and methods of reducing fire risk, conducting fire risk assessments, and knowing how to respond if a fire occurs. This Back to Basics article addresses the basics of fire prevention and the steps that can be taken to prevent fires from occurring. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Elementary Accounting. A Programed Text. Revised. Edition Code-3.
ERIC Educational Resources Information Center
Army Finance School, Fort Benjamin Harrison, IN.
This programed text is designed to teach the basic elements of the double entry system of accounting, including basic terms, procedures, definitions, and principles used. The text consists of frames, which are sequenced instructional steps and, in most cases, are composed of two parts. The first part states a fact or relates information and asks a…
1991-12-01
effective (19:15). Figure 2 details a flowchart of the basic steps in prototyping. The basic concept behind prototyping is to quickly produce a working… One approach to overcoming this is to structure the document relative to the experience level of the user (14:49). A "novice" or beginner would…
ERIC Educational Resources Information Center
Rodriguez, Emilio; Vicente, Miguel Angel
2002-01-01
Presents a 10-hour chemistry experiment using copper sulfate that has three steps: (1) purification of an ore containing copper sulfate and insoluble basic copper sulfates; (2) determination of the number of water molecules in hydrated copper sulfate; and (3) recovery of metallic copper from copper sulfate. (Author/YDS)
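For step (2), the number of water molecules n in CuSO4·nH2O follows from the mass lost on heating. With illustrative (invented) masses of 2.50 g hydrated salt and 1.60 g anhydrous residue:

    n = \frac{(m_{\mathrm{hyd}} - m_{\mathrm{anh}})/M_{\mathrm{H_2O}}}{m_{\mathrm{anh}}/M_{\mathrm{CuSO_4}}}
      = \frac{(2.50 - 1.60)/18.02}{1.60/159.61}
      \approx \frac{0.0499}{0.0100} \approx 5,

consistent with the pentahydrate CuSO4·5H2O.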
ERIC Educational Resources Information Center
Webb, Mel
A comprehensive student basic skills assessment program was developed at St. Louis Community College (SLCC) at Florissant Valley to appraise student readiness to take courses, gather information for counseling and advising, diagnose student problems, and evaluate program efficiency and effectiveness. The steps taken in developing the program were:…
ERIC Educational Resources Information Center
Fishburne, R. P., Jr.; Mims, Diane M.
An experimental Basic Electricity and Electronics course (BE/E) utilizing a lock-step, instructor presentation methodology was developed and evaluated at the Service School Command, Great Lakes. The study, directed toward the training of lower mental group, school nonqualified personnel, investigated comparative data on test performance, attitude,…
ERIC Educational Resources Information Center
Pissanos, Becky W.; And Others
1983-01-01
Step-wise linear regressions were used to relate children's age, sex, and body composition to performance on basic motor abilities including balance, speed, agility, power, coordination, and reaction time, and to health-related fitness items including flexibility, muscle strength and endurance, and cardiovascular function. Eighty subjects were in…
NASA Astrophysics Data System (ADS)
Liu, Likun
2018-01-01
In the field of remote sensing image processing, segmentation is a preliminary step for later analysis, semi-automatic human interpretation, and fully automatic machine recognition and learning. Since 2000, the object-oriented remote sensing image processing method and its underlying ideas have prevailed. The core of the approach is the Fractal Net Evolution Approach (FNEA) multi-scale segmentation algorithm. This paper focuses on the study and improvement of that algorithm: it analyzes existing segmentation algorithms and selects the watershed algorithm as the optimal initialization. The algorithm is then modified by adjusting an area parameter, and further by combining the area parameter with a heterogeneity parameter. Several experiments are carried out to show that the modified FNEA algorithm achieves better segmentation results than a traditional pixel-based method (an FCM algorithm based on neighborhood information) and than the plain combination of FNEA and watershed.
Process service quality evaluation based on Dempster-Shafer theory and support vector machine.
Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei
2017-01-01
Human involvement influences traditional service quality evaluations, resulting in low accuracy, poor reliability, and weak predictability. This paper proposes a method, called SVMs-DS, that employs support vector machines (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process, handling a high number of input features with a small sampling data set. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, the basic probability assignments (BPAs) are constructed, which support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated by the Dempster rules; the decision threshold used to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
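To make the fusion step concrete, here is a hedged sketch (not the paper's code) of Dempster's rule applied to the outputs of three classifiers: each probability is discounted into a basic probability assignment over {good, bad} plus residual ignorance, and the three BPAs are combined pairwise. The discount factor, probabilities, and function names are all invented.

    # Toy SVMs-DS-style fusion: turn each classifier's P(good) into a BPA,
    # then fuse the BPAs with Dempster's rule of combination.
    from itertools import product

    FRAME = frozenset({"good", "bad"})

    def bpa_from_prob(p_good, discount=0.9):
        # Discounting reserves mass for the full frame (ignorance).
        return {frozenset({"good"}): discount * p_good,
                frozenset({"bad"}): discount * (1 - p_good),
                FRAME: 1 - discount}

    def combine(m1, m2):
        fused, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass on empty intersections
        return {k: v / (1 - conflict) for k, v in fused.items()}

    svm_outputs = [0.8, 0.7, 0.55]           # pretend P(good) from three SVMs
    m = bpa_from_prob(svm_outputs[0])
    for p in svm_outputs[1:]:
        m = combine(m, bpa_from_prob(p))
    print({tuple(sorted(k)): round(v, 3) for k, v in m.items()})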
NASA Astrophysics Data System (ADS)
Weidinger, Peter; Günther, Kay; Fitzel, Martin; Logvinov, Ruslan; Ilin, Alexander; Ploshikhin, Vasily; Hugger, Florian; Mann, Vincent; Roth, Stephan; Schmidt, Michael
The necessity for weight reduction in motor vehicles in order to save fuel consumption pushes automotive suppliers to use materials of higher strength. Due to their excellent crash behavior, high strength steels are increasingly applied in various structures. In this paper some predevelopment steps for a material change from a micro-alloyed to dual phase and complex phase steels in a T-joint assembly are presented. Initially the general weldability of the materials regarding pore formation, hardening in the heat affected zone and hot cracking susceptibility is discussed. After this basic investigation, the computer-aided design optimization of a clamping device is shown, in which influences of the clamping jaw, the welding position and the clamping forces upon weld quality are presented. Finally, experimental results of the welding process are presented, which validate the numerical simulation.
Multistep integration formulas for the numerical integration of the satellite problem
NASA Technical Reports Server (NTRS)
Lundberg, J. B.; Tapley, B. D.
1981-01-01
The use of two Class 2 (fixed-mesh, fixed-order, multistep) integration packages of the PECE type for the numerical integration of the second-order, nonlinear, ordinary differential equation of the satellite orbit problem is examined. These two methods are referred to as the general and the second sum formulations. The derivation of the basic equations which characterize each formulation and the role of the basic equations in the PECE algorithm are discussed. Possible starting procedures are examined which may be used to supply the initial set of values required by the fixed-mesh multistep integrators. The results of the general and second sum integrators are compared to the results of various fixed-step and variable-step integrators.
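A PECE step pairs an explicit Adams-Bashforth predictor with an implicit Adams-Moulton corrector, each followed by a derivative evaluation. The toy below (second order, fixed step, RK4 as the starting procedure, mu = 1 units) applies the pattern to the planar two-body problem; it is a sketch of the general idea, not of the two packages the abstract compares, and all names are invented.

    # Fixed-step, fixed-order PECE integrator: AB2 predictor, AM2
    # (trapezoidal) corrector, applied to a circular Kepler orbit.
    import numpy as np

    def accel(r):
        return -r / np.linalg.norm(r) ** 3            # mu = 1 units

    def f(y):                                         # y = [x, y, vx, vy]
        return np.concatenate([y[2:], accel(y[:2])])

    def rk4_step(y, h):                               # starter for the multistep
        k1 = f(y); k2 = f(y + h/2*k1); k3 = f(y + h/2*k2); k4 = f(y + h*k3)
        return y + h/6*(k1 + 2*k2 + 2*k3 + k4)

    def pece(y0, h, n):
        ys = [y0, rk4_step(y0, h)]
        fs = [f(ys[0]), f(ys[1])]
        for _ in range(n - 1):
            yp = ys[-1] + h/2*(3*fs[-1] - fs[-2])     # Predict + Evaluate
            yc = ys[-1] + h/2*(f(yp) + fs[-1])        # Correct + Evaluate
            ys.append(yc); fs.append(f(yc))
        return np.array(ys)

    orbit = pece(np.array([1.0, 0.0, 0.0, 1.0]), h=0.01, n=2000)
    print(orbit[-1])   # circular orbit: radius should stay near 1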
Preparation and characterization of silica xerogels as carriers for drugs.
Czarnobaj, K
2008-11-01
The aim of the present study was to utilize the sol-gel method to synthesize different forms of xerogel matrices for drugs and to investigate how the synthesis conditions and solubility of drugs influence the change of the profile of drug release and the structure of the matrices. Silica xerogels doped with drugs were prepared by the sol-gel method from a hydrolyzed tetraethoxysilane (TEOS) solution containing two model compounds: diclofenac diethylamine (DD), a water-soluble drug, or ibuprofen (IB), a water-insoluble drug. Two procedures were used for the synthesis of sol-gel derived materials: a one-step procedure (the sol-gel reaction was carried out under acidic or basic conditions) and a two-step procedure (first, hydrolysis of TEOS was carried out under acidic conditions, and then condensation of silanol groups was carried out under basic conditions) in order to obtain samples with altered microstructures. In vitro release studies of drugs revealed a similar release profile in two steps: an initial diffusion-controlled release followed by a slower release rate. In all the cases studied, the released amount of DD was higher and the release time was shorter compared with IB for the same type of matrices. The released amount of drugs from two-step prepared xerogels was always lower than that from one-step base-catalyzed xerogels. One-step acid-catalyzed xerogels proved unsuitable as the carriers for the examined drugs.
Denmark, Scott E; Kobayashi, Tetsuya
2003-06-27
The palladium- and copper-catalyzed cross-coupling reactions of cyclic silyl ethers with aryl iodides are reported. Silyl ethers 3 were readily prepared by intramolecular silylformylation of homopropargyl silyl ethers 2 under a carbon monoxide atmosphere. The reaction of cyclic silyl ethers 3 with various aryl iodides 7 in the presence of [(allyl)PdCl]2, CuI, a hydrosilane, and KF·2H2O in DMF at room temperature provided the α,β-unsaturated aldehyde coupling products 8 in high yields. The need for copper in this process suggested that transmetalation from silicon to copper is an important step in the mechanism. Although siloxane 3 and the product 8 are not stable under basic conditions, KF·2H2O provided the appropriate balance of reactivity toward silicon and reduced basicity. The addition of a hydrosilane to [(allyl)PdCl]2 was needed to reduce the palladium(II) to the active palladium(0) form.
Krishnan, Kannan; Carrier, Richard
2017-07-03
The consideration of inhalation and dermal routes of exposures in developing guideline values for drinking water contaminants is important. However, there is no guidance for determining the eligibility of a drinking water contaminant for its multiroute exposure potential. The objective of the present study was to develop a 4-step framework to screen chemicals for their dermal and inhalation exposure potential in the process of developing guideline values. The proposed framework emphasizes the importance of considering basic physicochemical properties prior to detailed assessment of dermal and inhalation routes of exposure to drinking water contaminants in setting guideline values.
NASA Technical Reports Server (NTRS)
1987-01-01
In a complex computer environment there is ample opportunity for error, a mistake by a programmer, or a software-induced undesirable side effect. In insurance, errors can cost a company heavily, so protection against inadvertent change is a must for the efficient firm. The data processing center at Transport Life Insurance Company has taken a step to guard against accidental changes by adopting a software package called EQNINT (Equations Interpreter Program). EQNINT cross checks the basic formulas in a program against the formulas that make up the major production system. EQNINT assures that formulas are coded correctly and helps catch errors before they affect the customer service or its profitability.
Chemicals and Structural Foams to Neutralize or Defeat Anti-Personnel Mines
1990-10-01
first-level goals in LD. This shows the basic approach used for this analysis. [Figure: overall goal "Select Best Foam System," broken into subgoals "Best Foam Product" and "Best Delivery"…] …pouring back and forth three times would have three steps for that part of the process, plus any other motions, such as pulling off the lid, and… [Figure residue: illustrations of a typical tilt-rod AP mine, a typical pull-firing-pin device, and a pressure-sensitive plastic-cased mine.]
Crop Identification Technology Assessment for Remote Sensing (CITARS). Volume 1: Task design plan
NASA Technical Reports Server (NTRS)
Hall, F. G.; Bizzell, R. M.
1975-01-01
A plan for quantifying the crop identification performances resulting from the remote identification of corn, soybeans, and wheat is described. Steps for the conversion of multispectral data tapes to classification results are specified. The crop identification performances resulting from the use of several basic types of automatic data processing techniques are compared and examined for significant differences. The techniques are evaluated also for changes in geographic location, time of the year, management practices, and other physical factors. The results of the Crop Identification Technology Assessment for Remote Sensing task will be applied extensively in the Large Area Crop Inventory Experiment.
NASA Astrophysics Data System (ADS)
Pour Yousefian Barfeh, Davood; Ebron, Jonalyn G.; Pabico, Jaderick P.
2018-02-01
In this study the researchers focus on the essence of Insertion Sort and propose a sorter in Membrane Computing. The research shows how a theoretical computing device such as Membrane Computing can perform a basic operation such as sorting. To this end, the researchers introduce a conditional reproduction rule such that each membrane can reproduce another membrane having the same structure as the original membrane. They use the functionality of a comparator P system as a basis, in which two multisets are compared and then stored in two adjacent membranes. Finally, the researchers present the sorting process as a collection of transactions implemented in four levels, where each level has different steps.
NASA Astrophysics Data System (ADS)
Wen, Di; Ding, Xiaoqing
2003-12-01
In this paper we propose a general framework for character segmentation in complex multilingual documents, which is an endeavor to combine the traditionally separated segmentation and recognition processes into a cooperative system. The framework contains three basic steps: Dissection, Local Optimization and Global Optimization, which are designed to fuse various properties of the segmentation hypotheses hierarchically into a composite evaluation to decide the final recognition results. Experimental results show that this framework is general enough to be applied to a variety of documents. A sample system based on this framework to recognize Chinese, Japanese and Korean documents is described, and its experimental performance is reported.
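The Global Optimization step is essentially a best-path search over segmentation hypotheses. A minimal sketch under that reading (not the paper's system): given candidate cut points and a per-segment recognition score — here a made-up width preference standing in for a real character classifier — dynamic programming picks the highest-scoring segmentation.

    # Best segmentation of a text line via dynamic programming over cuts.
    def best_segmentation(cuts, score):
        # cuts: sorted positions; cuts[0] = line start, cuts[-1] = line end
        n = len(cuts)
        best = [float("-inf")] * n
        back = [0] * n
        best[0] = 0.0
        for j in range(1, n):
            for i in range(j):
                s = best[i] + score(cuts[i], cuts[j])
                if s > best[j]:
                    best[j], back[j] = s, i
        path, j = [n - 1], n - 1
        while j:                          # walk the backpointers to the start
            j = back[j]
            path.append(j)
        order = path[::-1]
        return [(cuts[a], cuts[b]) for a, b in zip(order, order[1:])]

    # Example: prefer segments close to 10 px wide (a crude "recognizability")
    print(best_segmentation([0, 6, 10, 21, 30], lambda a, b: -abs((b - a) - 10)))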
Optimizing DNA nanotechnology through coarse-grained modeling: a two-footed DNA walker.
Ouldridge, Thomas E; Hoare, Rollo L; Louis, Ard A; Doye, Jonathan P K; Bath, Jonathan; Turberfield, Andrew J
2013-03-26
DNA has enormous potential as a programmable material for creating artificial nanoscale structures and devices. For more complex systems, however, rational design and optimization can become difficult. We have recently proposed a coarse-grained model of DNA that captures the basic thermodynamic, structural, and mechanical changes associated with the fundamental process in much of DNA nanotechnology, the formation of duplexes from single strands. In this article, we demonstrate that the model can provide powerful insight into the operation of complex nanotechnological systems through a detailed investigation of a two-footed DNA walker that is designed to step along a reusable track, thereby offering the possibility of optimizing the design of such systems. We find that applying moderate tension to the track can have a large influence on the operation of the walker, providing a bias for stepping forward and helping the walker to recover from undesirable overstepped states. Further, we show that the process by which spent fuel detaches from the walker can have a significant impact on the rebinding of the walker to the track, strongly influencing walker efficiency and speed. Finally, using the results of the simulations, we propose a number of modifications to the walker to improve its operation.
Development of a TiAl Alloy by Spark Plasma Sintering
NASA Astrophysics Data System (ADS)
Couret, Alain; Voisin, Thomas; Thomas, Marc; Monchoux, Jean-Philippe
2017-12-01
Spark plasma sintering (SPS) is a consolidated powder metallurgy process for which the powder sintering is achieved through an applied electric current. The present article aims to describe the method we employed to develop a TiAl-based alloy adjusted for this SPS process. Owing to its enhanced mechanical properties, this alloy was found to fully match the industrial specifications for the aeronautic and automotive industries, which require a high strength at high temperature and a reasonably good ductility at room temperature. A step-by-step method was followed for this alloy development. Starting from a basic study on the as-SPSed GE alloy (Ti-48Al-2Cr-2Nb) in which the influence of the microstructure was studied, the microstructure-alloy composition relationships were then investigated to increase the mechanical properties. As a result of this study, we concluded that tungsten had to be the major alloying element to improve the resistance at high temperature and a careful addition of boron would serve the properties at room temperature. Thus, we developed the IRIS alloy (Ti-48Al-2W-0.08B). Its microstructure and mechanical properties are described here.
NASA Astrophysics Data System (ADS)
Liang, Yong-Chao; Liu, Rang-Su; Xie, Quan; Tian, Ze-An; Mo, Yun-Fei; Zhang, Hai-Tao; Liu, Hai-Rong; Hou, Zhao-Yang; Zhou, Li-Li; Peng, Ping
2017-02-01
To investigate the structural evolution and hereditary mechanism of icosahedral nano-clusters formed during rapid solidification, a molecular dynamics (MD) simulation study has been performed for a system consisting of 10^7 atoms of liquid Mg70Zn30 alloy. Adopting the Honeycutt-Anderson (HA) bond-type index method and the cluster type index method (CTIM-3) to analyse the microstructures in the system, it is found that for all the nano-clusters containing 2~8 icosahedral clusters there are 62 kinds of geometrical structures, and these can be classified, by the configurations of the central atoms of the basic clusters they contain, into four types: chain-like, triangle-tailed, quadrilateral-tailed and pyramidal-tailed. The evolution of icosahedral nano-clusters proceeds by perfect heredity and replacement heredity; perfect heredity emerges when the temperature is slightly below Tm, then increases rapidly and far exceeds replacement heredity at Tg. For replacement heredity, there are three major modes: replacement by a triangle (3 atoms), a quadrangle (4 atoms) or a pentagonal pyramid (6 atoms), rather than by single atoms step by step during the rapid solidification process.
Kitamura, Kazuo; Tokunaga, Makio; Esaki, Seiji; Iwane, Atsuko Hikikoshi; Yanagida, Toshio
2005-01-01
We have previously measured the process of displacement generation by a single head of muscle myosin (S1) using scanning probe nanometry. Given that the myosin head was rigidly attached to a fairly large scanning probe, it was assumed to stably interact with an underlying actin filament without diffusing away as would be the case in muscle. The myosin head has been shown to step back and forth stochastically along an actin filament with actin monomer repeats of 5.5 nm and to produce a net movement in the forward direction. The myosin head underwent 5 forward steps to produce a maximum displacement of 30 nm per ATP at low load (<1 pN). Here, we measured the steps over a wide range of forces up to 4 pN. The size of the steps (∼5.5 nm) did not change as the load increased whereas the number of steps per displacement and the stepping rate both decreased. The rate of the 5.5-nm steps at various force levels produced a force-velocity curve of individual actomyosin motors. The force-velocity curve from the individual myosin heads was comparable to that reported in muscle, suggesting that the fundamental mechanical properties in muscle are basically due to the intrinsic stochastic nature of individual actomyosin motors. In order to explain multiple stochastic steps, we propose a model arguing that the thermally-driven step of a myosin head is biased in the forward direction by a potential slope along the actin helical pitch resulting from steric compatibility between the binding sites of actin and a myosin head. Furthermore, computer simulations show that multiple cooperating heads undergoing stochastic steps generate a long (>60 nm) sliding distance per ATP between actin and myosin filaments, i.e., the movement is loosely coupled to the ATPase cycle as observed in muscle. PMID:27857548
Elements of a Science of Education
ERIC Educational Resources Information Center
Kalantzis, Mary
2006-01-01
Education has become a domain of considerable ideological division. Today the mantra is freedom and choice, yet at the same time, a push to "back to basics" is observed. This author attempts to trace the contours of this division by taking two steps back from the contemporary fray. One step is to situate present day discussions in a larger…
ERIC Educational Resources Information Center
Sherman, Lee
1999-01-01
A nationally acclaimed antiviolence program, Second Step teaches three basic skills needed for living peacefully in society: empathy, impulse control, and anger management. In Bethel (Alaska), where student gunfire killed a student and principal in 1997, Second Step is used enthusiastically, having been modified to fit Yupik Eskimo culture and…
ERIC Educational Resources Information Center
Calderon, Margarita Espino; Minaya-Rowe, Liliana
This book provides school administrators, teachers, and parents with the basic knowledge necessary for planning and implementing effective two-way bilingual programs. It offers essential elements to help students gain literacy in two languages, increase cross-cultural understanding, and meet high levels of achievement in all core academic areas.…
Team Nutrition's Teacher Handbook: Tips, Tools, and Jewels for Busy Educators.
ERIC Educational Resources Information Center
Shepherd, Sandra K.; Whitehead, Constance S.
This teacher support manual helps elementary educators teach proper nutrition to students in pre-K through grade 5. It provides a summary of all the background and tools teachers will need to do what they want with the Team Nutrition/Scholastic curricula. There is brief background information on nutrition basics; step-by-step instructions for…
Better Speeches in Ten Simple Steps. Revised 2nd Edition.
ERIC Educational Resources Information Center
Robinson, James W.
Acknowledging that the fear of public speaking is widespread, this book guides the reader/communicator through 10 steps of writing and presenting a speech. The book is based on the idea that solid preparation and following a few basic rules makes public speaking easier and the finished product more dynamic. The book is divided into the following…
An empirical investigation of sparse distributed memory using discrete speech recognition
NASA Technical Reports Server (NTRS)
Danforth, Douglas G.
1990-01-01
Presented here is a step by step analysis of how the basic Sparse Distributed Memory (SDM) model can be modified to enhance its generalization capabilities for classification tasks. Data is taken from speech generated by a single talker. Experiments are used to investigate the theory of associative memories and the question of generalization from specific instances.
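For readers unfamiliar with the basic SDM model, a toy Kanerva-style memory fits in a screenful. This is my own illustration, not Danforth's setup; all sizes (256-bit words, 2000 hard locations, Hamming radius 112) are invented.

    # Minimal Sparse Distributed Memory: binary addresses activate all hard
    # locations within a Hamming radius; writes add a bipolar copy of the
    # data to the activated counters, reads sum and threshold them.
    import numpy as np
    rng = np.random.default_rng(0)

    N, M, RADIUS = 256, 2000, 112      # word size, locations, activation radius
    hard_addr = rng.integers(0, 2, size=(M, N))
    counters = np.zeros((M, N), dtype=int)

    def activated(addr):
        return np.count_nonzero(hard_addr != addr, axis=1) <= RADIUS

    def write(addr, data):
        counters[activated(addr)] += 2 * data - 1    # 0/1 -> -1/+1

    def read(addr):
        return (counters[activated(addr)].sum(axis=0) > 0).astype(int)

    word = rng.integers(0, 2, size=N)
    write(word, word)                  # autoassociative store
    noisy = word.copy()
    noisy[rng.choice(N, 20, replace=False)] ^= 1     # flip 20 bits
    print("recovered bits:", np.count_nonzero(read(noisy) == word), "/", N)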
ERIC Educational Resources Information Center
Ettinger, Blanche; Perfetto, Edda
Using a developmental, hands-on approach, this text/workbook helps students master the basic English skills that are essential to write effective business correspondence, to recognize language errors, and to develop decision-making and problem-solving skills. Its step-by-step focus and industry-specific format encourage students to review,…
ERIC Educational Resources Information Center
Cihak, David F.; Bowlin, Tammy
2009-01-01
The researchers examined the use of video modeling by means of a handheld computer as an alternative instructional delivery system for learning basic geometry skills. Three high school students with learning disabilities participated in this study. Through video modeling, teacher-developed video clips showing step-by-step problem solving processes…
The Politics of Diversifying Basic Education Delivery: A Comparative Analysis from East Africa
ERIC Educational Resources Information Center
Hoppers, Wim
2011-01-01
This article addresses the politics and policy-making of alternative forms of basic education in the context of EFA, with an emphasis on non-formal education (NFE) for school-age children. It explores how policy-makers interpret the relevance of such alternatives, the steps that are taken to implement the reforms and factors that play a role in…
X-Ray Diffraction and the Discovery of the Structure of DNA
ERIC Educational Resources Information Center
Crouse, David T.
2007-01-01
A method is described for teaching the analysis of X-ray diffraction of DNA through a series of steps utilizing the original methods used by James Watson, Francis Crick, Maurice Wilkins and Rosalind Franklin. The X-ray diffraction pattern led to the conclusion of the basic helical structure of DNA and its dimensions while basic chemical principles…
Maio, Nunziata; Rouault, Tracey. A.
2014-01-01
Iron-sulfur (Fe-S) clusters are ancient, ubiquitous cofactors composed of iron and inorganic sulfur. The combination of the chemical reactivity of iron and sulfur, together with many variations of cluster composition, oxidation states and protein environments, enables Fe-S clusters to participate in numerous biological processes. Fe-S clusters are essential to redox catalysis in nitrogen fixation, mitochondrial respiration and photosynthesis, to regulatory sensing in key metabolic pathways (i.e., cellular iron homeostasis and oxidative stress response), and to the replication and maintenance of the nuclear genome. Fe-S cluster biogenesis is a multistep process that involves a complex sequence of catalyzed protein-protein interactions and coupled conformational changes between the components of several dedicated multimeric complexes. Intensive studies of the assembly process have clarified key points in the biogenesis of Fe-S proteins. However several critical questions still remain, such as: what is the role of frataxin? Why do some defects of Fe-S cluster biogenesis cause mitochondrial iron overload? How are specific Fe-S recipient proteins recognized in the process of Fe-S transfer? This review focuses on the basic steps of Fe-S cluster biogenesis, drawing attention to recent advances achieved on the identification of molecular features that guide selection of specific subsets of nascent Fe-S recipients by the cochaperone HSC20. Additionally, it outlines the distinctive phenotypes of human diseases due to mutations in the components of the basic pathway. PMID:25245479
Gravitational wave signals of electroweak phase transition triggered by dark matter
NASA Astrophysics Data System (ADS)
Chao, Wei; Guo, Huai-Ke; Shu, Jing
2017-09-01
We study in this work a scenario that the universe undergoes a two step phase transition with the first step happened to the dark matter sector and the second step being the transition between the dark matter and the electroweak vacuums, where the barrier between the two vacuums, that is necessary for a strongly first order electroweak phase transition (EWPT) as required by the electroweak baryogenesis mechanism, arises at the tree-level. We illustrate this idea by working with the standard model (SM) augmented by a scalar singlet dark matter and an extra scalar singlet which mixes with the SM Higgs boson. We study the conditions for such pattern of phase transition to occur and especially for the strongly first order EWPT to take place, as well as its compatibility with the basic requirements of a successful dark matter, such as observed relic density and constraints of direct detections. We further explore the discovery possibility of this pattern EWPT by searching for the gravitational waves generated during this process in spaced based interferometer, by showing a representative benchmark point of the parameter space that the generated gravitational waves fall within the sensitivity of eLISA, DECIGO and BBO.
Using an Automated 3D-tracking System to Record Individual and Shoals of Adult Zebrafish
Maaswinkel, Hans; Zhu, Liqun; Weng, Wei
2013-01-01
Like many aquatic animals, zebrafish (Danio rerio) moves in a 3D space. It is thus preferable to use a 3D recording system to study its behavior. The presented automatic video tracking system accomplishes this by using a mirror system and a calibration procedure that corrects for the considerable error introduced by the transition of light from water to air. With this system it is possible to record both single and groups of adult zebrafish. Before use, the system has to be calibrated. The system consists of three modules: Recording, Path Reconstruction, and Data Processing. The step-by-step protocols for calibration and using the three modules are presented. Depending on the experimental setup, the system can be used for testing neophobia, white aversion, social cohesion, motor impairments, novel object exploration etc. It is especially promising as a first-step tool to study the effects of drugs or mutations on basic behavioral patterns. The system provides information about vertical and horizontal distribution of the zebrafish, about the xyz-components of kinematic parameters (such as locomotion, velocity, acceleration, and turning angle) and it provides the data necessary to calculate parameters for social cohesion when testing shoals. PMID:24336189
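The water-to-air error the calibration corrects can be illustrated with the flat-interface approximation (a deliberate simplification of the full calibration): refraction makes submerged points look shallower by roughly the refractive index of water, and ray angles follow Snell's law. The function names and example values are invented.

    # Back-of-the-envelope refraction correction at a flat water surface.
    import math

    N_WATER = 1.33

    def apparent_to_true_depth(d_apparent):
        # Near-vertical viewing: objects look shallower by a factor ~n,
        # so naive triangulation underestimates depth by ~25%.
        return d_apparent * N_WATER

    def water_angle(theta_air_deg, n=N_WATER):
        # Snell's law: sin(theta_air) = n * sin(theta_water)
        return math.degrees(math.asin(math.sin(math.radians(theta_air_deg)) / n))

    print(apparent_to_true_depth(10.0))  # "looks" 10 cm deep -> ~13.3 cm deep
    print(water_angle(45.0))             # 45 deg in air -> ~32 deg in water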
Chemical facility vulnerability assessment project.
Jaeger, Calvin D
2003-11-14
Sandia National Laboratories, under the direction of the Office of Science and Technology, National Institute of Justice, conducted the chemical facility vulnerability assessment (CFVA) project. The primary objective of this project was to develop, test and validate a vulnerability assessment methodology for chemical facilities (VAM-CF) for determining the security of chemical facilities against terrorist or criminal attacks. The project also included a report to the Department of Justice for Congress that, in addition to describing the VAM-CF, addressed general observations related to security practices, threats and risks at chemical facilities and in chemical transport. In developing the VAM-CF, Sandia leveraged the experience gained from the use and development of VAs in other areas and the input from the chemical industry and Federal agencies. The VAM-CF is a systematic, risk-based approach where risk is a function of the severity of consequences of an undesired event, the attack potential, and the likelihood of adversary success in causing the undesired event. For the purpose of the VAM-CF analyses, risk is a function of S, L(A), and L(AS), where S is the severity of consequence of an event, L(A) is the attack potential, and L(AS) is the likelihood of adversary success in causing a catastrophic event. The VAM-CF consists of 13 basic steps. It involves an initial screening step, which helps to identify and prioritize facilities for further analysis; this step is similar to the prioritization approach developed by the American Chemistry Council (ACC). Other steps help to determine the components of the risk equation and ultimately the risk. The VAM-CF process involves identifying the hazardous chemicals and processes at a chemical facility. It helps chemical facilities to focus their attention on the most critical areas. The VAM-CF is not a quantitative analysis but, rather, compares relative security risks. If the risks are deemed too high, recommendations are developed for measures to reduce the risk. This paper briefly discusses the CFVA project and the VAM-CF process.
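The abstract only says risk is "a function of" S, L(A) and L(AS); the multiplicative reading below is one common convention, shown purely to make the relative-ranking idea concrete. All facility names and scores are invented.

    # Toy relative-risk ranking in the spirit of VAM-CF's screening step.
    facilities = {
        # name: (severity S, attack potential L_A, adversary success L_AS), each 0-1
        "chlorine storage":   (0.9, 0.4, 0.7),
        "ammonia pipeline":   (0.6, 0.3, 0.5),
        "warehouse solvents": (0.3, 0.2, 0.4),
    }
    ranked = sorted(facilities.items(),
                    key=lambda kv: -(kv[1][0] * kv[1][1] * kv[1][2]))
    for name, (s, la, las) in ranked:
        print(f"{name:20s} relative risk = {s * la * las:.3f}")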
Parametric Grid Information in the DOE Knowledge Base: Data Preparation, Storage, and Access
DOE Office of Scientific and Technical Information (OSTI.GOV)
HIPP,JAMES R.; MOORE,SUSAN G.; MYERS,STEPHEN C.
The parametric grid capability of the Knowledge Base provides an efficient, robust way to store and access interpolatable information which is needed to monitor the Comprehensive Nuclear Test Ban Treaty. To meet both the accuracy and performance requirements of operational monitoring systems, we use a new approach which combines the error estimation of kriging with the speed and robustness of Natural Neighbor Interpolation (NNI). The method involves three basic steps: data preparation (DP), data storage (DS), and data access (DA). The goal of data preparation is to process a set of raw data points to produce a sufficient basis for accurate NNI of value and error estimates in the Data Access step. This basis includes a set of nodes and their connectedness, collectively known as a tessellation, and the corresponding values and errors that map to each node, which we call surfaces. In many cases, the raw data point distribution is not sufficiently dense to guarantee accurate error estimates from the NNI, so the original data set must be densified using a newly developed interpolation technique known as Modified Bayesian Kriging. Once appropriate kriging parameters have been determined by variogram analysis, the optimum basis for NNI is determined in a process we call mesh refinement, which involves iterative kriging, new node insertion, and Delaunay triangle smoothing. The process terminates when an NNI basis has been calculated which will fit the kriged values within a specified tolerance. In the data storage step, the tessellations and surfaces are stored in the Knowledge Base, currently in a binary flatfile format but perhaps in the future in a spatially-indexed database. Finally, in the data access step, a client application makes a request for an interpolated value, which triggers a data fetch from the Knowledge Base through the libKBI interface, a walking-triangle search for the containing triangle, and finally the NNI interpolation.
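A self-contained sketch of the kriging half of this pipeline (the Natural Neighbor interpolation and mesh-refinement steps are omitted, and this is ordinary rather than Modified Bayesian Kriging): interpolation of scattered values under an assumed exponential variogram, returning both an estimate and its error variance. The variogram parameters would normally come from the variogram analysis described above; here they are invented.

    # Ordinary kriging with an exponential variogram (no nugget).
    import numpy as np

    def variogram(h, sill=1.0, corr_len=2.0):
        return sill * (1.0 - np.exp(-h / corr_len))

    def ordinary_krige(pts, vals, x0):
        n = len(pts)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = variogram(d)
        A[-1, -1] = 0.0                   # Lagrange-multiplier row/column
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(pts - x0, axis=1))
        w = np.linalg.solve(A, b)
        estimate = w[:n] @ vals
        error_var = w @ b                 # sum_i w_i * gamma_i0 + mu
        return estimate, error_var

    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    vals = np.array([1.0, 2.0, 2.0, 3.0])
    print(ordinary_krige(pts, vals, np.array([0.5, 0.5])))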
P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)
NASA Astrophysics Data System (ADS)
Kropp, Derek L.
2009-05-01
One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.
Managing the replacement cycle of laser inventory.
Davis, C E
1992-01-01
Medical lasers are quickly moving into the replacement phase of technology management. Barnes Hospital (St. Louis, MO) is using its laser team to define a process of planned laser replacement using the experience gained from traditional medical equipment replacement cycles, quality improvement principles and tools, and other formalized interdisciplinary teams. The process described in this paper has six basic steps: (1) A decision is made to request a replacement laser. (2) An appropriation request form is completed and submitted with the clinical and/or technical justifications. (3) Those requests initiated outside of the Clinical Engineering Department are reviewed by the Clinical Engineer/Medical Laser Safety Officer (CE/MLSO). (4) The CE/MLSO presents the requests to the hospital Laser Committee, and (5) then to the Laser Users' Group. (6) Finally, an Expenditure Authorization Committee reviews all capital expense requests, including those for replacement lasers, and allocates funds for the next fiscal year. This paper illustrates and evaluates the process, using an example from the review process for 1993 equipment purchases at Barnes Hospital.
Corbara, F; Di Cristofaro, E
1996-01-01
The concept of Quality is particularly up to date and not a new one for the Journal. The need for better Quality is a must in medical care as well. Quality does not mean additional costs and an excessive burden for co-workers; on the contrary, initial costs can be compensated for through a more rational utilisation of resources. The consequent better service for the patient results in an improved working environment, with high profits. Fundamental requirements for reaching concrete results are: 1) the convinced involvement of all levels (division, service, laboratory) in the idea, so that the different groups act in synergism towards common goals; 2) the knowledge of appropriate methods. The Authors examine this last point with a deep analysis of the techniques involved in Company Wide Quality Control (C.W.Q.C.), or Total Quality. The improvement process has to be continuous and proceed in small steps, each step consisting of 4 different phases represented by the PDCA cycle, or Deming wheel, where: P = PLAN, which means plan before acting; D = DO, perform what has been planned; C = CHECK, verify the results; A = ACT, standardize if the results are positive, repeat the process if negative. Each process of improvement implies a prior precise definition of a project, i.e. a problem whose solution has been planned. The project must always presume: a specific subject; a goal; one or more people to reach it; and a limited time to work it out. The most effective way to improve Quality is performing projects; step-by-step improvement is synonymous with the performance of many projects. A brilliant way to produce many projects remains their "industrialization", which can be reached by means of 6 basic criteria: 1) full involvement of the Direction (top management); 2) potential co-working in the projects of all employees; 3) employment of simple instruments; 4) respect of a few procedural formalities; 5) rewarding of personnel; 6) continuous promotion of the concepts of quality and ongoing improvement. The Authors describe, for each of the previous criteria, the approach and the best operative techniques according to C.W.Q.C.
Automatic Detection of Clouds and Shadows Using High Resolution Satellite Image Time Series
NASA Astrophysics Data System (ADS)
Champion, Nicolas
2016-06-01
Detecting clouds and their shadows is one of the primary steps to perform when processing satellite images, because clouds and shadows may alter the quality of some products such as large-area orthomosaics. The main goal of this paper is to present the automatic method developed at IGN-France for detecting clouds and shadows in a sequence of satellite images. In our work, surface reflectance ortho-images are used; they were processed from the initial satellite images using dedicated software. The cloud detection step consists of a region-growing algorithm. Seeds are first extracted: for each input ortho-image to process, we select the other ortho-images of the sequence that intersect it, and the pixels of the input ortho-image are labelled seeds if the difference of reflectance (in the blue channel) with the overlapping ortho-images is bigger than a given threshold. Clouds are then delineated using a region-growing method based on a radiometric and homogeneity criterion. Regarding shadow detection, our method is based on the idea that a shadow pixel is darker when compared with the other images of the time series. The detection is basically composed of three steps. Firstly, we compute a synthetic ortho-image covering the whole study area; its pixels have a value corresponding to the median value of all input reflectance ortho-images intersecting at that pixel location. Secondly, for each input ortho-image, a pixel is labelled shadow if the difference of reflectance (in the NIR channel) with the synthetic ortho-image is below a given threshold. Eventually, an optional region-growing step may be used to refine the results. Note that pixels labelled clouds during the cloud detection are not used for computing the median value in the first step; additionally, the NIR input data channel is used to perform the shadow detection because it appeared to better discriminate shadow pixels. The method was tested on time series of Landsat 8 and Pléiades-HR images, and our first experiments show the feasibility of automating the detection of shadows and clouds in satellite image sequences.
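A condensed numpy rendering of the shadow step as described (the threshold, array shapes and test data are invented, and the optional region-growing refinement is omitted):

    # Median-composite shadow detection: flag pixels much darker than the
    # per-pixel median of the NIR time series, ignoring cloudy pixels.
    import numpy as np

    def detect_shadows(nir_stack, cloud_masks, thresh=0.08):
        """nir_stack: (T, H, W) NIR reflectances; cloud_masks: (T, H, W) bool."""
        stack = np.where(cloud_masks, np.nan, nir_stack)   # drop cloudy pixels
        median = np.nanmedian(stack, axis=0)               # synthetic ortho-image
        return (median - nir_stack) > thresh               # (T, H, W) shadow masks

    T, H, W = 6, 4, 4
    rng = np.random.default_rng(1)
    nir = rng.uniform(0.2, 0.3, (T, H, W))
    nir[2, 1, 1] = 0.05                                    # one dark (shadowed) pixel
    masks = detect_shadows(nir, np.zeros((T, H, W), bool))
    print(np.argwhere(masks))                              # -> [[2 1 1]]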
Improving health care, Part 4: Concepts for improving any clinical process.
Batalden, P B; Mohr, J J; Nelson, E C; Plume, S K
1996-10-01
One promising method for streamlining the generation of "good ideas" is to formulate what are sometimes called change concepts: general notions or approaches to change found useful in developing specific ideas for changes that lead to improvement. For example, in current efforts to reduce health care costs by discounting provider charges, the underlying generic concept is "reducing health care costs," and the specific idea is "discounting provider charges." Short-term gains in health care cost reduction can occur by pursuing discounts. After some time, however, limits to such reduction in costs are experienced. Persevering and continuing to travel down the "discounting provider charges" path is less likely to produce further substantial improvement than returning to the basic concept of "reducing health care costs." An interdisciplinary team aiming to reduce costs while improving quality of care for patients in need of hip joint replacement generated ideas for changing "what's done (process) to get better results." After team members wrote down their improvement ideas, they deduced the underlying change concepts and used them to generate even more ideas for improvement. Such change concepts include reordering the sequence of steps (preadmission physical therapy "certification"), eliminating failures at hand-offs between steps (transfer of information from physician's office to hospital), and eliminating a step (epidural pain control). Learning about making change, encouraging change, managing the change within and across organizations, and learning from the changes tested will characterize the sustainable, thriving health systems of the future.
The physics of lipid droplet nucleation, growth and budding.
Thiam, Abdou Rachid; Forêt, Lionel
2016-08-01
Lipid droplets (LDs) are intracellular oil-in-water emulsion droplets, covered by a phospholipid monolayer and mainly present in the cytosol. Despite their important role in cellular metabolism and a growing number of newly identified functions, the LD formation mechanism from the endoplasmic reticulum remains poorly understood. To form a LD, the oil molecules synthesized in the ER accumulate between the monolayer leaflets and induce deformation of the membrane. This formation process works through three steps: nucleation, growth and budding, exactly as in phase separation and dewetting phenomena. These steps involve sequential biophysical membrane remodeling mechanisms, for which we present the basic tools of statistical physics, membrane biophysics, and soft matter science underlying them. We aim to highlight relevant factors that could control LD formation size, site and number through this physics description. Emphasis is given to the currently underestimated contribution of molecular interactions between lipids, which can favor an energetically costless mechanism of LD formation. Copyright © 2016 Elsevier B.V. All rights reserved.
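The nucleation step is conventionally framed with classical nucleation theory, which is presumably the baseline such analyses build on (the membrane-bound, monolayer-covered setting of LDs modifies it): for a droplet of radius r, interfacial tension γ, molecular density n, and chemical-potential gain Δμ per molecule,

    \Delta G(r) = 4\pi\gamma r^{2} - \tfrac{4}{3}\pi r^{3}\, n\,\Delta\mu,
    \qquad r^{*} = \frac{2\gamma}{n\,\Delta\mu},
    \qquad \Delta G^{*} = \frac{16\pi\gamma^{3}}{3\,(n\,\Delta\mu)^{2}},

so droplets that fluctuate past the critical radius r* grow spontaneously, while smaller ones redissolve.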
BEARKIMPE-2: A VBA Excel program for characterizing granular iron in treatability studies
NASA Astrophysics Data System (ADS)
Firdous, R.; Devlin, J. F.
2014-02-01
The selection of a suitable kinetic model to investigate the reaction rate of a contaminant with granular iron (GI) is essential to optimizing permeable reactive barrier (PRB) performance in terms of reactivity. The newly developed Kinetic Iron Model (KIM) determines the surface rate constant (k) and the sorption parameters (Cmax and J), which it was not previously possible to identify uniquely. The code, written in Visual Basic (VBA) within Microsoft Excel, was adapted from the earlier command-line FORTRAN codes BEARPE and KIMPE. The program is organized with several user-interface screens (UserForms) that guide the user step by step through the analysis. BEARKIMPE-2 uses a non-linear optimization algorithm to calculate transport and chemical kinetic parameters; both reactive and non-reactive sites are considered. A demonstration of the functionality of BEARKIMPE-2 with three nitroaromatic compounds showed that the differences in reaction rates for these compounds could be attributed to differences in their sorption behavior rather than their propensities to accept electrons in the reduction process.
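A much-simplified stand-in for the kind of nonlinear fit BEARKIMPE-2 automates: estimating a first-order rate constant from batch-test concentrations with scipy's curve_fit. The real KIM also resolves the sorption parameters (Cmax, J) and transport, which this toy ignores; the data here are synthetic.

    # Fit C(t) = C0 * exp(-k t) to noisy synthetic batch-test data.
    import numpy as np
    from scipy.optimize import curve_fit

    def first_order(t, c0, k):
        return c0 * np.exp(-k * t)

    t = np.linspace(0, 8, 9)                       # hours
    rng = np.random.default_rng(2)
    c_obs = first_order(t, 1.0, 0.45) + rng.normal(0, 0.02, t.size)

    (c0_hat, k_hat), cov = curve_fit(first_order, t, c_obs, p0=(1.0, 0.1))
    print(f"k = {k_hat:.3f} 1/h  (+/- {np.sqrt(cov[1, 1]):.3f})")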
NASA Astrophysics Data System (ADS)
Omar, M. A.; Parvataneni, R.; Zhou, Y.
2010-09-01
The proposed manuscript describes the implementation of a two-step processing procedure composed of self-referencing and Principal Component Thermography (PCT). The combined approach enables the processing of thermograms from transient (flash), steady (halogen) and selective (induction) thermal perturbations. The research first discusses the three basic processing schemes typically applied in thermography, namely mathematical-transformation-based processing, curve-fitting processing, and direct contrast-based calculations. The proposed algorithm utilizes the self-referencing scheme to create a sub-sequence that contains the maximum contrast information and also to compute the anomalies' depth values. The Principal Component Thermography step then operates on the sub-sequence frames by re-arranging their data content (pixel values) spatially and temporally and highlighting the data variance. The PCT is mainly used as a mathematical means to enhance the defects' contrast, thus enabling retrieval of their shape and size. The results show that the proposed combined scheme is effective in processing multiple-size defects in a sandwich steel structure in real time (<30 Hz) and with full spatial coverage, without the need for a priori defect-free areas.
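A bare-bones PCT pass over a thermogram sub-sequence (assumed already selected by the self-referencing step) can be written directly with the SVD; the shapes, the standardization choice, and the fake defect below are invented for illustration.

    # Principal Component Thermography sketch: flatten frames, standardize,
    # SVD; the leading spatial components concentrate the defect contrast.
    import numpy as np

    def pct(frames, n_components=3):
        """frames: (T, H, W) thermogram sub-sequence -> (n, H, W) EOF images."""
        T, H, W = frames.shape
        A = frames.reshape(T, H * W).T.astype(float)      # pixels x time
        A -= A.mean(axis=0)
        A /= A.std(axis=0) + 1e-12                        # standardize columns
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return U[:, :n_components].T.reshape(n_components, H, W)

    frames = np.random.default_rng(3).normal(size=(20, 32, 32))
    frames[5:, 10:14, 10:14] += 2.0    # fake subsurface defect signature
    eofs = pct(frames)
    print(eofs.shape)                  # (3, 32, 32) spatial component images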
NASA Technical Reports Server (NTRS)
Maynard, N. C. (Editor)
1979-01-01
Significant deficiencies exist in the present understanding of the basic physical processes taking place within the middle atmosphere (the region between the tropopause and the mesopause), and in the knowledge of the variability of many of the primary parameters that regulate Middle Atmosphere Electrodynamics (MAE). Knowledge of the electrical properties, i.e., electric fields, plasma characteristics, conductivity and currents, and the physical processes that govern them is of fundamental importance to the physics of the region. Middle atmosphere electrodynamics may play a critical role in the electrodynamical aspects of solar-terrestrial relations. As a first step, the Workshop on the Role of the Electrodynamics of the Middle Atmosphere on Solar-Terrestrial Coupling was held to review the present status and define recommendations for future MAE research.
Libyan nationalizations: TOPCO/CALASIATIC vs Libya arbitration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Von Mehren, R.B.; Kourides, P.N.
1979-01-01
Nine international oil companies operating in Libya were informed in 1973 and early 1974 that their interests and properties would be nationalized. This event followed four years after a military takeover of the Libyan government by Colonel Muammar el-Qadhafi, whose actions led to a major international arbitration. This article describes the background of the Libyan nationalization, the steps toward arbitration, the arbitration proceeding, the awards of the Sole Arbitrator, and the significance of those awards. The TOPCO/CALASIATIC vs Libya arbitration not only provides an excellent example of the process of arbitration, but also confirms the effectiveness of the process in leading to eventual settlement of the dispute. Basic fundamental principles of law were considered, articulated, and reaffirmed throughout the process, adding precedent to the small body of international case law. 38 references.
Enarson, C; Cariaga-Lo, L
2001-11-01
The results of the United States Medical Licensing Examination Step 1 and Step 2 examinations are reported for students enrolled in problem-based and traditional lecture-based curricula over a seven-year period at a single institution. There were no statistically significant differences in mean scores on either examination over the seven-year period as a whole. There were statistically significant main effects noted by cohort year and curricular track for both the Step 1 and Step 2 examinations. These results support the general, long-term effectiveness of problem-based learning with respect to basic and clinical science knowledge acquisition. This paper reports the United States Medical Licensing Examination Step 1 and Step 2 results for students enrolled in problem-based and traditional lecture-based learning curricula over the seven-year period (1992-98) in order to evaluate the adequacy of each curriculum in supporting students' learning of the basic and clinical sciences. Six hundred and eighty-nine students who took the United States Medical Licensing Examination Step 1 and 540 students who took Step 2 for the first time over the seven-year period were included in the analyses. T-test analyses were utilized to compare students' Step 1 and Step 2 performance by curriculum group. Mean Step 1 scores over the seven-year period were 214 for Traditional Curriculum students and 208 for Parallel Curriculum students (t-value = 1.32, P = 0.21). Mean Step 2 scores over the seven-year period were 208 for Traditional Curriculum students and 206 for Parallel Curriculum students (t-value = 1.08, P = 0.30). Statistically significant main effects were noted by cohort year and curricular track for both the Step 1 and Step 2 examinations. The totality of experience in both groups, although differing by curricular type, may be similar enough that comparable scores are what should be expected. These results should reassure curriculum planners and faculty that problem-based learning can provide students with the knowledge needed for the subsequent phases of their medical education.
Dunn, Thomas M; Dalton, Alice; Dorfman, Todd; Dunn, William W
2004-01-01
This study was designed as a first step in determining whether emergency medical technician (EMT)-Basics are capable of using a protocol that allows for selective immobilization of the cervical spine. Such protocols are coming into use at the advanced life support level and could be beneficial when used by basic life support providers. A convenience sample of participants (n=95) from 11 emergency medical services agencies and one college class participated in the study. All participants evaluated six patients in written scenarios and decided which should be placed into spinal precautions according to a selective spinal immobilization protocol. Systems without an existing selective spinal immobilization protocol received a one-hour continuing education lecture regarding the topic. College students received a similar lecture written so laypersons could understand the protocol. All participants showed proficiency when applying a selective immobilization protocol to patients in paper-based scenarios. Furthermore, EMT-Basics performed at the same level as paramedics when following the protocol. Statistical analysis revealed no significant differences between EMT-Basics and paramedics. A follow-up group of college students (added to have a non-EMS comparison group) also performed as well as paramedics when making decisions to use spinal precautions. Differences between college students and paramedics were also statistically insignificant. The results suggest that EMT-Basics are as accurate as paramedics when making decisions regarding selective immobilization of the cervical spine during paper-based scenarios. That laypersons are also proficient when using the protocol could indicate that it is extremely simple to follow. This study is a first step toward the necessary additional studies evaluating the efficacy of EMT-Basics using selective immobilization as a regular practice.
Building the Service-Based Library Web Site: A Step-by-Step Guide to Design and Options.
ERIC Educational Resources Information Center
Garlock, Kristen L.; Piontek, Sherry
The World Wide Web, with its captivating multimedia features and hypertext capabilities, has brought millions of new users to the Internet. Library staff who could create a home page on the Web could present basic information about the library and its services, showcase its resources, create links to quality material inside and outside the…
DOT National Transportation Integrated Search
1989-01-01
This manual provides basic background information and step-by-step procedures for conducting traffic conflict surveys at signalized and unsignalized intersections. The manual was prepared as a training aid and reference source for persons who are ass...
Challenging Behavior Step-by-Step Sifting: Part 4--Critical Needs
ERIC Educational Resources Information Center
Duffy, Roslyn Ann
2010-01-01
What causes challenging behavior and what can adults do about it? That is a basic question parents and caregivers face every day. Some needs are easy to meet, others take more work, and some require outside help. This article is the fourth and final segment of a multi-part series about dealing with Challenging Behavior, both at home and school. The…
Learning To Use the World Wide Web. Academic Edition.
ERIC Educational Resources Information Center
Ackerman, Ernest
This book emphasizes how to use Netscape Navigator to access the World Wide Web and associated resources and services in a step-by-step, organized manner. Chapters include -- Chapter 1: Introduction to the World Wide Web and the Internet; Chapter 2: Using a Web Browser; Chapter 3: The Basics of Electronic Mail and Using Netscape Email; Chapter 4:…
Impact of modellers' decisions on hydrological a priori predictions
NASA Astrophysics Data System (ADS)
Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.
2014-06-01
In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the records needed for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their predictions in three steps, with information added prior to each following step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models, and their modelling experience differed largely. Hence, they encountered two problems: (i) simulating discharge for an ungauged catchment and (ii) using models that were developed for catchments that are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response, and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and the costs of added information. In this qualitative analysis of a statistically small number of predictions we learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing), and (iii) that added process understanding can be as efficient as added data for improving the parameters needed to satisfy model requirements.
Development of biology student worksheets to facilitate science process skills of student
NASA Astrophysics Data System (ADS)
Rahayu, Y. S.; Pratiwi, R.; Indana, S.
2018-01-01
This research aims to describe the development of Biology student worksheets to facilitate students' science process skills and, at the same time, their thinking skills in senior high school; the worksheets are equipped with Assessment Sheets. The worksheet development follows a cycle that includes the phases of analysis, planning, design, development, implementation, and evaluation and revision. Evaluation and revision are ongoing activities conducted in each phase of the development cycle: after the results of each phase's activities are evaluated and revised, development continues to the next phase. Based on the test results for grades X, XI, and XII in St. Agnes Surabaya high school, some important findings were obtained. The findings are as follows. (1) The developed biology student worksheets could be used to facilitate students' thinking ability, in particular integrated process skills, including the components of formulating the problem, formulating hypotheses, determining the study variables, formulating operational definitions of variables, determining the steps of the research, planning data tables, organizing data in the form of tables/charts, and drawing conclusions. (2) The developed worksheets could also facilitate the development of students' social interaction, such as working together, listening to and respecting the opinions of others, assembling equipment and materials, discussing and sharing information, and upgrading hands-on activity skills. (3) The developed worksheets could basically be implemented with step-by-step teacher guidance, especially for students who have never used a similar worksheet. Guidance is needed at the beginning, especially for worksheets that require special skills or understanding of specific concepts as a prerequisite, such as using a microscope, determining the heart rate, or understanding the mechanism of specific indicators.
The potential role of telocytes in Tissue Engineering and Regenerative Medicine.
Boos, Anja M; Weigand, Annika; Brodbeck, Rebekka; Beier, Justus P; Arkudas, Andreas; Horch, Raymund E
2016-07-01
Research and ideas for potential applications in the field of Tissue Engineering (TE) and Regenerative Medicine (RM) have been constantly increasing over recent years, basically driven by the fundamental human dream of repairing and regenerating lost tissue and organ functions. The basic idea of TE is to combine cells with putative stem cell properties with extracellular matrix components, growth factors and supporting matrices to achieve independently growing tissue. As a side effect, in the past years, more insights have been gained into cell-cell interaction and how to manipulate cell behavior. However, to date the ideal cell source has still to be found. Apart from commonly known various stem cell sources, telocytes (TC) have recently attracted increasing attention because they might play a potential role for TE and RM. It becomes increasingly evident that TC provide a regenerative potential and act in cellular communication through their network-forming telopodes. While TE in vitro experiments can be the first step, the key for elucidating their regenerative role will be the investigation of the interaction of TC with the surrounding tissue. For later clinical applications further steps have to include an upscaling process of vascularization of engineered tissue. Arteriovenous loop models to vascularize such constructs provide an ideal platform for preclinical testing of future therapeutic concepts in RM. The following review article should give an overview of what is known so far about the potential role of TC in TE and RM. Copyright © 2016 Elsevier Ltd. All rights reserved.
Practical ways to facilitate ergonomics improvements in occupational health practice.
Kogi, Kazutaka
2012-12-01
Recent advances in participatory programs for improving workplace conditions are discussed to examine practical ways to facilitate ergonomics improvements. Participatory training programs are gaining importance, particularly in promoting occupational health and safety in small-scale workplaces. These programs have led to many improvements that can reduce work-related risks in varied situations. Recent experiences in participatory action-oriented training programs in small workplaces and agriculture are reviewed. The emphasis of the review is on training steps, types of improvements achieved, and the use of action tools by trainers and training participants. Immediate improvements in multiple technical areas are targeted, including materials handling, workstation design, physical environment, welfare facilities, and work organization. In facilitating ergonomics improvements in each local situation, it is important to focus on (a) building on local good practices; (b) applying practical, simple improvements that apply the basic principles of ergonomics; and (c) developing action-oriented toolkits for direct use by workers and managers. This facilitation process is effective when locally designed action toolkits are used by trainers, including local good examples, action checklists, and illustrated how-to guides. Intervention studies demonstrate the effectiveness of participatory steps that use these toolkits in promoting good practices and reducing work-related risks. In facilitating ergonomics improvements in small-scale workplaces, it is important to focus on practical, low-cost improvements that build on local good practices. The use of action-oriented toolkits reflecting basic ergonomics principles is helpful. The promotion of the intercountry networking of positive experiences in participatory training is suggested.
ERIC Educational Resources Information Center
Jiménez-Salas, Zacarías; Campos-Góngora, Eduardo; González-Martínez, Blanca E.; Tijerina-Sáenz, Alexandra; Escamilla-Méndez, Angélica D.; Ramírez-López, Erik
2017-01-01
Over the past few years, a new research field has emerged, focusing on the social-scientific criteria for the study of opinions toward genetically modified foods (GMFs), since these may be limiting factors for the success or failure of these products. Basic education is the first step in the Mexican education system, and teachers may wield an…
Gynecologic oncology group strategies to improve timeliness of publication.
Bialy, Sally; Blessing, John A; Stehman, Frederick B; Reardon, Anne M; Blaser, Kim M
2013-08-01
The Gynecologic Oncology Group (GOG) is a multi-institution cooperative group funded by the National Cancer Institute to conduct clinical trials encompassing clinical and basic scientific research in gynecologic malignancies. These results are disseminated via publication in peer-reviewed journals. This process requires collaboration of numerous investigators located in diverse cancer research centers. Coordination of manuscript development is positioned within the Statistical and Data Center (SDC), thus allowing the SDC personnel to manage the process and refine strategies to promote earlier dissemination of results. A major initiative to improve timeliness utilizing the assignment, monitoring, and enforcement of deadlines for each phase of manuscript development is the focus of this investigation. The objective was to document improvement in timeliness, via comparison of deadline compliance and time to journal submission, attributable to expanded administrative and technologic initiatives implemented in 2006. Major steps in the publication process include generation of the first draft by the First Author and submission to the SDC, Co-author review, editorial review by the Publications Subcommittee, response to journal critique, and revision. Associated with each step are the responsibilities of the First Author to write or revise, the collaborating Biostatistician to perform analysis and interpretation, and the assigned SDC Clinical Trials Editorial Associate to format/revise according to journal requirements. Upon the initiation of each step, a deadline for completion is assigned. In order to improve efficiency, a publications database was developed to track potential steps in manuscript development that enables the SDC Director of Administration and the Publications Subcommittee Chair to assign, monitor, and enforce deadlines. They, in turn, report progress to Group Leadership through the Operations Committee. The success of the strategies utilized to improve the GOG publication process was assessed by comparing the timeliness of each potential step in the development of primary Phase II manuscripts during 2003-2006 versus 2007-2010. Improvement was noted in 10 of 11 identified steps, resulting in a cumulative average improvement of 240 days from notification of data maturity to the First Author through first submission to a journal. Moreover, the average time to journal acceptance has improved by an average of 346 days. The investigation is based on only Phase II trials to ensure comparability of manuscript complexity. Nonetheless, the procedures employed are applicable to the development of any clinical trials manuscript. The assignment, monitoring, and enforcement of deadlines for all stages of manuscript development have resulted in increased efficiency and timeliness. The positioning and support of manuscript development within the SDC provide a valuable resource to authors in meeting assigned deadlines, accomplishing peer review, and complying with journal requirements.
Large-scale crystallization of proteins for purification and formulation.
Hekmat, Dariusch
2015-07-01
For about 170 years, salts have been used to create supersaturated solutions and crystallize proteins. The dehydrating effect of salts as well as their kosmotropic or chaotropic character was revealed. Even the suitability of organic solvents for crystallization was recognized early on. Interestingly, what was performed in those early times is still practiced today. A lot of effort was put into understanding the underlying physico-chemical interaction mechanisms leading to protein crystallization. However, it was understood that even the solvation of proteins is a highly complex process, not to mention the intricate interrelation of electrostatic and hydrophobic interactions taking place. Although many basic questions are still unanswered, preparative protein crystallization was attempted, as illustrated in the presented case studies. Due to the highly variable nature of crystallization, individual design of the crystallization process is needed in every single case. It was shown that preparative crystallization from impure protein solutions as a capture step is possible after applying adequate pre-treatment procedures like precipitation or extraction. Protein crystallization can replace one or more chromatography steps. It was further shown that crystallization can serve as an attractive alternative means for formulation of therapeutic proteins. Crystalline proteins can offer enhanced purity and enable highly concentrated doses of the active ingredient. Easy scalability of the proposed protein crystallization processes was shown using the maximum local energy dissipation as a suitable scale-up criterion. Molecular modeling and target-oriented protein engineering may allow protein crystallization to become part of a platform purification process in the near future.
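The cited scale-up criterion can be made concrete with a standard stirred-vessel relation (a hedged sketch using textbook conventions; the symbols and the peak-to-mean factor are not taken from this paper):

\[
\bar{\varepsilon} = \frac{P}{\rho V} = \frac{N_P\, n^3 d^5}{V}, \qquad \varepsilon_{\max} \approx \phi\, \bar{\varepsilon} = \text{const. across scales},
\]

where $P$ is the impeller power input, $\rho$ the suspension density, $V$ the filled volume, $N_P$ the impeller power number, $n$ the stirring rate, $d$ the impeller diameter, and $\phi$ a geometry-dependent ratio of maximum to mean energy dissipation. Holding $\varepsilon_{\max}$ constant while scaling up limits shear stress on the growing crystals.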
Step Bunching: Influence of Impurities and Solution Flow
NASA Technical Reports Server (NTRS)
Chernov, A. A.; Vekilov, P. G.; Coriell, S. R.; Murray, B. T.; McFadden, G. B.
1999-01-01
Step bunching results in striations even at relatively early stages of its development and in inclusions of mother liquor at the later stages. Therefore, eliminating step bunching is crucial for high crystal perfection. At least five major effects causing and influencing step bunching are known: (1) basic morphological instability of stepped interfaces, caused by the concentration gradient in the solution normal to the face and by the redistribution of solute tangential to the interface; this redistribution enhances occasional perturbations in step density due to various types of noise; (2) aggravation of the above basic instability by solution flowing tangentially to the face in the same direction as the steps, or stabilization of an equidistant step train if these flows are antiparallel; (3) enhanced bunching at supersaturations where step velocity v increases with relative supersaturation s much faster than linearly; this v(s) dependence is believed to be associated with impurities, and impurities whose adsorption time is comparable with the time needed to deposit one lattice layer may also be responsible for bunching; (4) very intensive solution flow stabilizes the growing interface even at parallel solution and step flows; (5) macrosteps were observed to nucleate at crystal corners and edges. Numerical simulations, assuming step-step interactions via surface diffusion, also show that step bunching may be induced by random step nucleation at the facet edge and by a discontinuity in the step density (a ridge) somewhere in the middle of a face. The corresponding bunching patterns reproduce the ones observed in experiment. The nature of step bunching generated at corners and edges and by dislocation step sources, as well as the relative importance of and interrelations between mechanisms 1-5, is not clear from either experimental or theoretical standpoints. Furthermore, several laws controlling the evolution of existing step bunches have been suggested, though unambiguous conclusions are still missing. Addressing these issues is the major goal of the present project. The report presents the theory addressing the above problem and the experimental methods, together with several figures, including: (1) the spatial wave numbers at which the system is neutrally stable as a function of growth velocity for linear kinetics and supersaturation for nonlinear kinetics; (2) a schematic of the experiment in which a lysozyme crystal grows under conditions of natural convection; and (3) fluctuations in time, t, of the normal growth rate, R(t), and vicinal slope, p(t), and Fourier spectra of R(t); discussions and conclusions follow.
Physiological Responses and Hedonics During Prolonged Physically Interactive Videogame Play.
Santo, Antonio S; Barkley, Jacob E; Hafen, Paul S; Navalta, James
2016-04-01
This study was designed to assess physiologic responses and hedonics (i.e., liking) during prolonged physically interactive videogame play. Participants (n = 24) completed three 30-minute videogame conditions on separate days in a random order. During two of the conditions participants played physically interactive videogames (Nintendo of America, Inc. [Redmond, WA] "Wii™ Fit" "Basic Run" and "Basic Step"). During the third condition participants played a traditional/sedentary game ("Tanks!"), which required minimal physical movement for gameplay. Oxygen consumption (VO2) was assessed using indirect calorimetry throughout each condition and averaged every 5 minutes. Liking was assessed via visual analog scale at the 15- and 30-minute time points during each condition. Mean VO2 was significantly (P < 0.001) greater during "Basic Run" (16.14 ± 5.8 mL/kg/minute, 4.6 ± 1.7 metabolic equivalents [METs]) than either "Basic Step" (11.4 ± 1.7 mL/kg/minute, 3.3 ± 0.5 METs) or the traditional/sedentary videogame (5.39 ± 1.0 mL/kg/minute, 1.5 ± 0.1 METs). "Basic Step" was also greater (P < 0.001) than the traditional/sedentary game. VO2 did not significantly (P = 0.25) fluctuate across the 30-minute session for any game. In other words, participants maintained a consistent physiologic intensity throughout each 30-minute condition. There were no differences (P ≥ 0.20) across gaming conditions or time for liking. Participants achieved and maintained moderate-intensity physical activity (≥3.0 METs) during both 30-minute physically interactive videogame conditions. Furthermore, because liking was similar across all gaming conditions, participants may be willing to substitute the physically interactive videogames in place of the traditional/sedentary game.
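For orientation, the MET values reported above are consistent with the standard convention that one MET equals a resting oxygen uptake of 3.5 mL/kg/minute (a conventional definition, not stated in the abstract):

\[
\text{METs} = \frac{\dot{V}\mathrm{O}_2\ [\text{mL/kg/min}]}{3.5}, \qquad \text{e.g.}\ \frac{16.14}{3.5} \approx 4.6 \ \text{for "Basic Run".}
\]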
Study on the synthesis process of tetracaine hydrochloride
NASA Astrophysics Data System (ADS)
Li, Wenli; Zhao, Jie; Cui, Yujie
2017-05-01
Tetracaine hydrochloride is a long-acting ester-type local anesthetic, usually present in the form of a hydrochloride salt. Firsleb first synthesized tetracaine by experiment in 1928, and it is one of the recognized clinically potent anesthetics. The drug has the advantages of stable physical and chemical properties, rapid onset, and long duration of action. Tetracaine is used for ophthalmic surface anesthesia as one of the main local anesthetics, as well as for conduction block anesthesia, mucosal surface anesthesia, and epidural anesthesia. So far, research has mainly addressed its clinical applications, and comparatively little work has been done on its synthetic technology. The cost of the existing production process is generally high and the yield is low; in addition, the reaction time is long and the reaction conditions are harsh. In this paper, a new synthetic method is proposed for the synthesis of tetracaine hydrochloride. The reaction route has the advantages of few steps, high yield, short reaction time, and mild reaction conditions. Inexpensive p-nitrobenzoic acid was selected as the raw material. After esterification with ethanol and reaction with n-butyraldehyde (the reaction process includes nitro reduction, aldol condensation, and hydrogenation reduction), the intermediate was transesterified with dimethylaminoethanol under basic conditions. Finally, the pH value was adjusted in ethanol solvent. After four reaction steps, the crude tetracaine hydrochloride was obtained.
Taking advantage of ground data systems attributes to achieve quality results in testing software
NASA Technical Reports Server (NTRS)
Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.
1994-01-01
During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.
NASA Astrophysics Data System (ADS)
Latkowski, S.; van Veldhoven, P. J.; Hänsel, A.; D'Agostino, D.; Rabbani-Haghighi, H.; Docter, B.; Bhattacharya, N.; Thijs, P. J. A.; Ambrosius, H. P. M. M.; Smit, M. K.; Williams, K. A.; Bente, E. A. J. M.
2017-02-01
In this paper, a generic monolithic photonic integration technology platform and tunable laser devices for gas sensing applications at 2 μm will be presented. The basic set of long-wavelength optical functions which is fundamental for a generic photonic integration approach is realized using planar, butt-joint, active-passive integration on an indium phosphide substrate with active components based on strained InGaAs quantum wells. Using this limited set of basic building blocks, a novel-geometry, widely tunable laser source was designed and fabricated within the first long-wavelength multiproject wafer run. The fabricated laser operates around 2027 nm, covers a record tuning range of 31 nm and is successfully employed in absorption measurements of carbon dioxide. These results demonstrate a fully functional long-wavelength photonic integrated circuit that operates at these wavelengths. Moreover, the process steps and material system used for the long-wavelength technology are almost identical to the ones used in the technology process at 1.5 μm, which makes it straightforward and hassle-free to transfer to the photonic foundries with existing fabrication lines. The changes from the 1550 nm technology and the trade-offs made in the building block design and layer stack will be discussed.
The effects of distributed life cycles on the dynamics of viral infections.
Campos, Daniel; Méndez, Vicenç; Fedotov, Sergei
2008-09-21
We explore the role of cellular life cycles for viruses and host cells in an infection process. For this purpose, we derive a generalized version of the basic model of virus dynamics (Nowak, M.A., Bangham, C.R.M., 1996. Population dynamics of immune responses to persistent viruses. Science 272, 74-79) from a mesoscopic description. In its final form the model can be written as a set of Volterra integrodifferential equations. We consider the role of distributed lifespans and an intracellular (eclipse) phase. These processes are implemented by means of probability distribution functions. The basic reproductive ratio R0 of the infection is properly defined in terms of such distributions by using an analysis of the equilibrium states and their stability. It is concluded that the introduction of distributed delays can strongly modify both the value of R0 and the predictions for the virus loads, so the effects on the infection dynamics are of major importance. We also show how the model presented here can be applied to some simple situations where direct comparison with experiments is possible. Specifically, phage-bacteria interactions are analyzed. The dynamics of the eclipse phase for phages is characterized analytically, which allows us to compare the performance of three different fittings proposed before for the one-step growth curve.
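For context, the undelayed basic model of virus dynamics that the paper generalizes can be written as the standard three-variable system (a sketch of the cited Nowak-Bangham formulation; the paper's version replaces these point rates with probability distribution functions):

\[
\frac{dx}{dt} = \lambda - d\,x - \beta x v, \qquad \frac{dy}{dt} = \beta x v - a\,y, \qquad \frac{dv}{dt} = k\,y - u\,v,
\]

with uninfected cells $x$, infected cells $y$, and free virus $v$; the basic reproductive ratio is then $R_0 = \beta \lambda k / (a\,d\,u)$, and the distributed life cycles modify both this expression and the predicted virus loads.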
Image editing with Adobe Photoshop 6.0.
Caruso, Ronald D; Postel, Gregory C
2002-01-01
The authors introduce Photoshop 6.0 for radiologists and demonstrate basic techniques of editing gray-scale cross-sectional images intended for publication and for incorporation into computerized presentations. For basic editing of gray-scale cross-sectional images, the Tools palette and the History/Actions palette pair should be displayed. The History palette may be used to undo a step or series of steps. The Actions palette is a menu of user-defined macros that save time by automating an action or series of actions. Converting an image to 8-bit gray scale is the first editing function. Cropping is the next action. Both decrease file size. Use of the smallest file size necessary for the purpose at hand is recommended. Final file size for gray-scale cross-sectional neuroradiologic images (8-bit, single-layer TIFF [tagged image file format] at 300 pixels per inch) intended for publication varies from about 700 Kbytes to 3 Mbytes. Final file size for incorporation into computerized presentations is about 10-100 Kbytes (8-bit, single-layer, gray-scale, high-quality JPEG [Joint Photographic Experts Group]), depending on source and intended use. Editing and annotating images before they are inserted into presentation software is highly recommended, both for convenience and flexibility. Radiologists should find that image editing can be carried out very rapidly once the basic steps are learned and automated. Copyright RSNA, 2002
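The same basic workflow (8-bit grayscale conversion, cropping, and purpose-sized export) can also be scripted; the following is a minimal sketch using the Python Pillow library, with file names and the crop box as illustrative assumptions rather than values from the article:

from PIL import Image

# Convert to 8-bit grayscale ("L" mode), mirroring the first editing step.
img = Image.open("axial_image.tif")      # hypothetical source file
gray = img.convert("L")

# Crop to the region of interest; the box (left, upper, right, lower)
# is a placeholder and would be chosen per image.
cropped = gray.crop((100, 80, 612, 592))

# Publication version: single-layer TIFF at 300 pixels per inch.
cropped.save("figure_print.tif", format="TIFF", dpi=(300, 300))

# Presentation version: high-quality JPEG at a much smaller file size.
cropped.save("figure_slide.jpg", format="JPEG", quality=90)

Both the grayscale conversion and the crop reduce file size, in line with the recommendation to use the smallest file adequate for the purpose at hand.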
Performance measures from the explorer platform berthing experiment
NASA Technical Reports Server (NTRS)
Leake, Stephen
1993-01-01
The Explorer Platform is a Modular Mission Spacecraft: it has several subunits that are designed to be replaced on orbit. The Goddard Space Flight Center Robotics Lab undertook an experiment to evaluate various robotic approaches to replacing one of the units, a large (approximately 1 meter by 1 meter by 0.5 meter) power box. The hardware consists of a Robotics Research Corporation K-1607 (RRC) manipulator mounted on a large gantry robot, a Kraft handcontroller for teleoperation of the RRC, a Lightweight Servicing Tool (LST) mounted on the RRC, and an Explorer Platform mockup (EP) with a removable box (MMS) that has fixtures that mate with the LST. Sensors include a wrist wrench sensor on the RRC and Capaciflectors mounted on the LST and the MMS. There are also several cameras, but no machine vision is used. The control system for the RRC is entirely written by Goddard; it consists of Ada code on three Multibus I 386/387 CPU boards doing the real-time robot control, and C on a 386 PC processing Capaciflector data. The gantry is not moved during this experiment. The task is the exchange of the MMS; it is removed and replaced. This involves four basic steps: mating the LST to the MMS, demating the MMS from the EP, mating the MMS to the EP, and demating the LST from the MMS. Each of the mating steps must be preceded by an alignment to bring the mechanical fixtures within their capture range. Two basic approaches to alignment are explored: teleoperation with the operator viewing through cameras, and Capaciflector-based autonomy. To evaluate the two alignment approaches, several runs were performed with each approach and the final pose was recorded. Comparing this to the ideal alignment pose gives accuracy and repeatability data. In addition, the wrenches exerted during the mating tasks were recorded; this gives information on how the alignment step affects the mating step. There are also two approaches to mating: teleoperation, and impedance-based autonomy. The wrench data taken during mating using these two approaches are used to evaluate them. Section 2 describes the alignment results, Section 3 describes the mating results, and finally Section 4 gives some conclusions.
Third Party TMDL Development Toolkit
The Water Environment Federation's toolkit provides the basic steps by which an organization or group other than the lead water quality agency takes responsibility for developing the TMDL document and supporting analysis.
Tost, H; Meyer-Lindenberg, A; Ruf, M; Demirakça, T; Grimm, O; Henn, F A; Ende, G
2005-02-01
Modern neuroimaging techniques such as magnetic resonance imaging (MRI) and positron emission tomography (PET) have contributed tremendously to our current understanding of psychiatric disorders in the context of functional, biochemical and microstructural alterations of the brain. Since the mid-nineties, functional MRI has provided major insights into the neurobiological correlates of signs and symptoms in schizophrenia. The current paper reviews important fMRI studies of the past decade in the domains of motor, visual, auditory, attentional and working memory function. Special emphasis is given to new methodological approaches, such as the visualisation of medication effects and the functional characterisation of risk genes.
Image processing for grazing incidence fast atom diffraction
NASA Astrophysics Data System (ADS)
Debiossac, Maxime; Roncin, Philippe
2016-09-01
Grazing incidence fast atom diffraction (GIFAD, or FAD) has developed as a surface-sensitive technique. Compared with thermal-energy helium diffraction (TEAS or HAS), GIFAD is less sensitive to thermal decoherence but also more demanding in terms of surface coherence, the mean distance between defects. Such high-quality surfaces can be obtained from freshly cleaved crystals or in a molecular beam epitaxy (MBE) chamber where a GIFAD setup has been installed, allowing in situ operation. Based on recent publications by Atkinson et al. (2014) and Debiossac et al. (2014), the paper describes in detail the basic steps needed to measure the relative intensities of the diffraction spots. Care is taken to outline the underlying physical assumptions.
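The abstract names the reduction steps without spelling out the algorithm; a common approach for diffraction images of this kind (a hedged sketch, with synthetic data standing in for a detector frame) is local-maximum peak detection followed by background-subtracted window integration:

import numpy as np
from scipy.ndimage import maximum_filter, median_filter

def spot_intensities(frame, threshold=5.0, window=7):
    """Locate diffraction spots as local maxima above a smooth
    background estimate, then integrate counts in a small window."""
    background = median_filter(frame, size=21)   # slowly varying background
    signal = frame - background
    peaks = (maximum_filter(signal, size=window) == signal) & (signal > threshold)
    half = window // 2
    results = []
    for r, c in zip(*np.nonzero(peaks)):
        win = signal[max(r - half, 0):r + half + 1, max(c - half, 0):c + half + 1]
        results.append(((r, c), win.sum()))
    total = sum(i for _, i in results) or 1.0
    return [((r, c), i / total) for (r, c), i in results]  # relative intensities

# Synthetic example: two Gaussian spots on a noisy background.
y, x = np.mgrid[0:128, 0:128]
frame = (50 * np.exp(-((x - 40)**2 + (y - 64)**2) / 8.0)
         + 25 * np.exp(-((x - 90)**2 + (y - 64)**2) / 8.0)
         + np.random.default_rng(0).normal(1.0, 0.5, (128, 128)))
print(spot_intensities(frame))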
Educational Video Recording and Editing for The Hand Surgeon
Rehim, Shady A.; Chung, Kevin C.
2016-01-01
Digital video recordings are increasingly used across various medical and surgical disciplines including hand surgery for documentation of patient care, resident education, scientific presentations and publications. In recent years, the introduction of sophisticated computer hardware and software technology has simplified the process of digital video production and improved means of disseminating large digital data files. However, the creation of high quality surgical video footage requires basic understanding of key technical considerations, together with creativity and sound aesthetic judgment of the videographer. In this article we outline the practical steps involved with equipment preparation, video recording, editing and archiving as well as guidance for the choice of suitable hardware and software equipment. PMID:25911212
Parallel optoelectronic trinary signed-digit division
NASA Astrophysics Data System (ADS)
Alam, Mohammad S.
1999-03-01
The trinary signed-digit (TSD) number system has been found to be very useful for parallel addition and subtraction of any arbitrary length operands in constant time. Using the TSD addition and multiplication modules as the basic building blocks, we develop an efficient algorithm for performing parallel TSD division in constant time. The proposed division technique uses one TSD subtraction and two TSD multiplication steps. An optoelectronic correlator based architecture is suggested for implementation of the proposed TSD division algorithm, which fully exploits the parallelism and high processing speed of optics. An efficient spatial encoding scheme is used to ensure better utilization of space bandwidth product of the spatial light modulators used in the optoelectronic implementation.
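To make the number system concrete: a trinary signed digit takes values {-1, 0, 1} (balanced ternary). The Python sketch below shows TSD addition and subtraction using a simple ripple carry for clarity; the paper's point is that signed-digit redundancy permits constant-time, carry-free parallel addition, and the division routine built from one subtraction and two multiplication steps is not reproduced here:

def tsd_to_int(digits):
    """digits: most-significant first, each in {-1, 0, 1} (balanced ternary)."""
    value = 0
    for d in digits:
        value = 3 * value + d
    return value

def tsd_add(a, b):
    """Add two TSD numbers given as digit lists; ripple-carry for clarity."""
    n = max(len(a), len(b)) + 1
    a = [0] * (n - len(a)) + a          # pad to equal length, room for carry
    b = [0] * (n - len(b)) + b
    out, carry = [], 0
    for da, db in zip(reversed(a), reversed(b)):
        s = da + db + carry             # s in [-3, 3]
        carry = 1 if s >= 2 else -1 if s <= -2 else 0
        out.append(s - 3 * carry)       # digit folded back into {-1, 0, 1}
    return list(reversed(out))

def tsd_sub(a, b):
    """TSD subtraction is addition of the digitwise negation."""
    return tsd_add(a, [-d for d in b])

# 7 = (1,-1,1) since 9-3+1, and 5 = (1,-1,-1) since 9-3-1:
x, y = [1, -1, 1], [1, -1, -1]
print(tsd_to_int(tsd_add(x, y)), tsd_to_int(tsd_sub(x, y)))   # 12 2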
Crew interface with a telerobotic control station
NASA Technical Reports Server (NTRS)
Mok, Eva
1987-01-01
A method for apportioning crew-telerobot tasks has been derived to facilitate the design of a crew-friendly telerobot control station. To identify the most appropriate state-of-the-art hardware for the control station, task apportionment must first be conducted to identify if an astronaut or a telerobot is best to execute the task and which displays and controls are required for monitoring and performance. Basic steps that comprise the task analysis process are: (1) identify space station tasks; (2) define tasks; (3) define task performance criteria and perform task apportionment; (4) verify task apportionment; (5) generate control station requirements; (6) develop design concepts to meet requirements; and (7) test and verify design concepts.
Martini, Marinna A.; Sherwood, Chris; Horwitz, Rachel; Ramsey, Andree; Lightsom, Fran; Lacy, Jessie; Xu, Jingping
2006-01-01
3. preserving minimally processed and partially processed versions of data sets. STG usually deploys ADV and PCADP probes configured as downward-looking, mounted on bottom tripods, with the objective of measuring high-resolution near-bed currents. The velocity profiles are recorded with minimal internal data processing. Also recorded are parameters such as temperature, conductivity, optical backscatter, light transmission, and high-frequency pressure. Sampling consists of high-frequency (1–10 Hz) bursts of long duration (5–30 minutes) at regular and recurring intervals over deployments of 1 to 6 months. The result is very large data files, often 500 MB per Hydra per deployment, in Sontek's compressed binary format. This section introduces the Hydratools toolbox and provides information about the history of the system's development. The USGS philosophy regarding data quality is discussed to provide an understanding of the motivation for creating the system. General information about the following topics is also discussed: hardware and software required for the system, basic processing steps, limitations of program usage, and features that are unique to the programs.
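As an illustration of the kind of basic reduction such burst-sampled records require (a generic sketch, not the Hydratools code itself; sampling rate, burst length, and array shapes are assumed for the example), burst means and standard deviations can be computed like this:

import numpy as np

# Assumed layout: east velocity sampled at 10 Hz in 20-minute bursts,
# i.e. 12000 samples per burst, for some number of bursts.
rng = np.random.default_rng(1)
fs, burst_minutes, n_bursts = 10, 20, 5
samples = fs * 60 * burst_minutes
u = rng.normal(0.05, 0.02, size=(n_bursts, samples))  # stand-in data, m/s

burst_mean = u.mean(axis=1)   # one mean current per burst
burst_std = u.std(axis=1)     # wave/turbulence variability per burst
print(burst_mean, burst_std)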
ERIC Educational Resources Information Center
White, Jim; Alexander, Larry
This student activity kit consists of a programmed, self-instructional learning guide and an accompanying instructor's manual for use in teaching trade and industrial education students how to make an adjustable C-clamp. The student guide contains step-by-step instructions in the following areas: basic layout principles; use of a hack saw, file,…
Operational procedures for ground station operation: ATS-3 Hawaii-Ames satellite link experiment
NASA Technical Reports Server (NTRS)
Nishioka, K.; Gross, E. H.
1979-01-01
Hardware description and operational procedures for the ATS-3 Hawaii-Ames satellite computer link are presented in basic step-by-step instructions. Transmit and receive channels and frequencies are given. Details are provided, from switch settings for activating the station to the sequence in which switches are turned on. Methods and procedures for troubleshooting common problems encountered with communication stations are also provided.
Setting Up Git Software Tool on Linux | High-Performance Computing | NREL
Before you can get started using the github.nrel.gov git repos, you'll have to do some basic configuration on your system. You may already have secure shell (SSH) keys created on those systems; if this is the case, see the linked documentation for more information on using them with git. Steps - Using a Remote Git Repository: after completing the basic configuration, you have everything needed for using git with a remote repository.
Tran, Phuong Ha-Lien; Tran, Thao Truong-Dinh; Lee, Kyoung-Ho; Kim, Dong-Jin; Lee, Beom-Jin
2010-05-01
Although the solid dispersion method has been known to increase the dissolution rate of poorly water-soluble drugs by dispersing them in hydrophilic carriers, one obstacle of the solid dispersion method is its limited solubilization capacity, especially for pH-dependent soluble drugs. pH-modified solid dispersion, in which pH modifiers are incorporated, may be a useful method for increasing the dissolution rate of weakly acidic or basic drugs. Relevant research, including the most recent reports, is surveyed in this review. How does the inclusion of pH modifiers in the solid dispersion system change drug structural behaviors, molecular interactions, microenvironmental pH, and/or the release rate of the pH modifiers, in relation to the enhanced dissolution of weakly acidic or weakly basic drugs with poor water solubility? These questions have been investigated to determine the dissolution-modulating mechanism of pH modifiers in solid dispersions containing weakly acidic or basic drugs. It is believed that step-by-step mechanistic approaches could provide the ultimate solution for solubilizing several poorly water-soluble drugs with pH-dependent solubility from a solid dispersion system, as well as provide ideas for developing future dosage systems.
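The pH dependence that motivates this approach can be made explicit with a standard ionization relation (a textbook expression, not taken from the review itself): for a weakly basic drug of intrinsic solubility $S_0$ and ionization constant $pK_a$, the total solubility rises steeply as the microenvironmental pH drops below the $pK_a$:

\[
S_{\text{total}} = S_0 \left( 1 + 10^{\,pK_a - pH} \right),
\]

so an acidic pH modifier that lowers the microenvironmental pH by one unit near the dissolving drug can raise local solubility roughly tenfold; the mirrored relation, $S_0(1 + 10^{\,pH - pK_a})$, applies to weak acids with alkaline modifiers.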
Siedenburg, J
2009-04-01
Common Rules for Aviation Safety had been developed under the aegis of the Joint Aviation Authorities in the 1990s. In 2002 the Basic Regulation 1592/2002 was the founding document of a new entity, the European Aviation Safety Agency. Areas of activity were Certification and Maintenance of aircraft. On 18 March the new Basic Regulation 216/2008, repealing the original Basic Regulation, was published and applicable from 08 April on. The included Essential Requirements extended the competencies of EASA inter alia to Pilot Licensing and Flight Operations. The future aeromedical requirements will be included as Annex II in another Implementing Regulation on Personnel Licensing. The detailed provisions will be published as guidance material. The proposals for these provisions were published on 05 June 2008 as NPA 2008-17c. After public consultation, processing of comments and final adoption, the new proposals may be applicable from the second half of 2009 on. A transition period of four years will apply. Whereas the provisions are based on Joint Aviation Requirement-Flight Crew Licensing (JAR-FCL) 3, a new Light Aircraft Pilot Licence (LAPL) project and the details of the associated medical certification regarding general practitioners will be something new in aviation medicine. This paper consists of 6 sections. The introduction outlines the idea of international aviation safety. The second section describes the development of the Joint Aviation Authorities (JAA), the first step to common rules for aviation safety in Europe. The third section encompasses a major change as the next step: the foundation of the European Aviation Safety Agency (EASA) and the development of its rules. The fourth section provides an outline of the new medical requirements. Section five emphasizes the new concept of a Leisure Pilot Licence. The last section gives an outlook on ongoing rulemaking activities and the opportunities of the public to participate in them.
Effective constitutive relations for large repetitive frame-like structures
NASA Technical Reports Server (NTRS)
Nayfeh, A. H.; Hefzy, M. S.
1981-01-01
Effective mechanical properties for large repetitive framelike structures are derived using combinations of strength-of-materials and orthogonal transformation techniques. Symmetry considerations are used to identify independent property constants. The actual values of these constants are constructed according to a building-block format carried out in three consecutive steps: (1) all basic planar lattices are identified; (2) effective continuum properties are derived for each of these planar basic grids using matrix structural analysis methods; and (3) orthogonal transformations are used to determine the contribution of each basic set to the overall effective continuum properties of the structure.
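Step (3) relies on the standard transformation rule for anisotropic elastic constants (shown here in index notation as a general reminder; the paper's specific lattice geometries are not reproduced): the effective stiffness of each rotated planar grid follows from the fourth-order tensor transformation

\[
C'_{ijkl} = a_{ip}\, a_{jq}\, a_{kr}\, a_{ls}\, C_{pqrs},
\]

where $a_{ij}$ are the direction cosines of the orthogonal transformation relating a basic grid's axes to the structure's global axes; the overall effective continuum properties then follow by summing the transformed contributions of the basic sets.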
Keller, Tobias C; Rodrigues, Elodie G; Pérez-Ramírez, Javier
2014-06-01
High-silica zeolites have been reported recently as efficient catalysts for liquid- and gas-phase condensation reactions because of the presence of a complementary source of basicity compared to Al-rich basic zeolites. Herein, we describe the controlled generation of these active sites on silica-rich FAU, BEA, and MFI zeolites. Through the application of a mild base treatment in aqueous Na2CO3, alkali-metal-coordinating defects are generated within the zeolite whereas the porous properties are fully preserved. The resulting catalysts were applied in the gas-phase condensation of propanal at 673 K as a model reaction for the catalytic upgrading of pyrolysis oil, for which an up to 20-fold increased activity compared to the unmodified zeolites was attained. The moderate basicity of these new sites leads to a coke resistance superior to traditional base catalysts such as CsX and MgO, and comparable activity and excellent selectivity is achieved for the condensation pathways. Through strategic acid and base treatments and the use of magic-angle spinning NMR spectroscopy, the nature of the active sites was investigated, which supports the theory of siloxy sites as basic centers. This contribution represents a key step in the understanding and design of high-silica base catalysts for the intermediate deoxygenation of crude bio-oil prior to the hydrotreating step for the production of second-generation biofuels. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Developing Connectivist Schemas for Geological and Geomorphological Education
NASA Astrophysics Data System (ADS)
Whalley, B.
2012-12-01
Teaching geology is difficult; students need to grasp changes through time across three spatial dimensions. Furthermore, the scales and rates of change in four dimensions may vary over several orders of magnitude. Geological explanations incorporate ideas from physics, chemistry, biology and engineering; lectures and textbooks provide a basic framework, but they need to be amplified by laboratories and fieldwork involving active student participation and engagement. Being shown named 'things' is only a start toward inculcating geological thinking, which requires both wide and focused viewpoints. Kastens and Ishikawa (2006) suggested five aspects of thinking geologically, summarised as: 1. observing, describing, recording, and communicating geological entities (i.e. basic cognitive skills); 2. (mentally) manipulating these entities; 3. interpreting them via causal relationships; 4. predicting other aspects using the basic knowledge (to create new knowledge); 5. using cognitive strategies to develop new ways of interpreting gained knowledge. These steps can be used to follow the sequence from 'known' through 'need to know' to using knowledge to gain better geological explanation, taken as enquiry-based or problem-solving modes of education. These follow ideas from Dewey through Sternberg's 'thinking styles' and Siemens' connectivist approaches. Implementation of this basic schema needs to be structured for students in a complex geological world in line with Edelson's (2006) 'learning for' framework. In a geomorphological setting, this has been done by showing students how to interpret a landscape (landform, section, etc.) and practice their skills, and thus gain confidence, with a tutor at hand. A web-based device, 'Virtorial', provides scenarios for students to practice interpretation (or even to be assessed with). A cognitive tool is provided for landscape interpretation by division into the recognition of 'Materials' (rock, sediments, etc.), 'Processes' (slope, glacial processes, etc.) and 'Geometry' (what it looks like). These components provide basic metadata for any landform in a landscape. Thus, the recognition of a landform means much more than naming a feature; the metadata provide contexts that can be used for interpretation in the field or laboratory, individually or in discussion groups, and in distance or field learning environments.
[Community health in primary health care teams: a management objective].
Nebot Adell, Carme; Pasarin Rua, Maribel; Canela Soler, Jaume; Sala Alvarez, Clara; Escosa Farga, Alex
2016-12-01
To describe the process of development of community health in a territory where the Primary Health Care board decided to include it in its roadmap as a strategic line. Evaluative research using qualitative techniques, including SWOT analysis on community health, in a two-step study. Primary care teams (PCT) of the Catalan Health Institute in Barcelona city: the 24 PCT belonging to the Muntanya-Dreta Primary Care Service, with 904 professionals serving 557,430 inhabitants. Application of qualitative methodology using SWOT analysis in two steps. Step 1: setting up a core group consisting of local PCT professionals; collecting the community projects across the territory; SWOT analysis. Step 2: from the needs identified in the previous phase, a plan was developed, including a set of training activities in community health: basic, advanced, and a workshop to exchange experiences among the PCTs. A total of 80 team professionals received specific training in the 4 workshops held, one of them at an advanced level. Two workshops were held to exchange experiences, with 165 representatives from the local teams and 22 PCTs presenting their practices. By 2013, 6 of the 24 PCTs had had a community diagnosis performed. Community health has achieved a good level of development in some areas, but this is not the general situation in the health care system. Its progression depends on the management support available, the local community dynamics, and the scope of the Primary Health Care. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
Satagopam, Venkata; Gu, Wei; Eifes, Serge; Gawron, Piotr; Ostaszewski, Marek; Gebel, Stephan; Barbosa-Silva, Adriano; Balling, Rudi; Schneider, Reinhard
2016-01-01
Translational medicine is a domain turning results of basic life science research into new tools and methods in a clinical environment, for example, as new diagnostics or therapies. Nowadays, the process of translation is supported by large amounts of heterogeneous data ranging from medical data to a whole range of -omics data. It is not only a great opportunity but also a great challenge, as translational medicine big data is difficult to integrate and analyze, and requires the involvement of biomedical experts for the data processing. We show here that visualization and interoperable workflows, combining multiple complex steps, can address at least parts of the challenge. In this article, we present an integrated workflow for exploration, analysis, and interpretation of translational medicine data in the context of human health. Three Web services—tranSMART, a Galaxy Server, and a MINERVA platform—are combined into one big data pipeline. Native visualization capabilities enable the biomedical experts to get a comprehensive overview and control over separate steps of the workflow. The capabilities of tranSMART enable a flexible filtering of multidimensional integrated data sets to create subsets suitable for downstream processing. A Galaxy Server offers visually aided construction of analytical pipelines, with the use of existing or custom components. A MINERVA platform supports the exploration of health and disease-related mechanisms in a contextualized analytical visualization system. We demonstrate the utility of our workflow by illustrating its subsequent steps using an existing data set, for which we propose a filtering scheme, an analytical pipeline, and a corresponding visualization of analytical results. The workflow is available as a sandbox environment, where readers can work with the described setup themselves. Overall, our work shows how visualization and interfacing of big data processing services facilitate exploration, analysis, and interpretation of translational medicine data. PMID:27441714
NASA Astrophysics Data System (ADS)
Bauer, Joschka; Schaal, Daniel; Eisoldt, Lukas; Schweimer, Kristian; Schwarzinger, Stephan; Scheibel, Thomas
2016-09-01
Dragline silk is the most prominent amongst spider silks and comprises two types of major ampullate spidroins (MaSp) differing in their proline content. In the natural spinning process, the conversion of soluble MaSp into a tough fiber is, amongst other factors, triggered by dimerization and conformational switching of their helical amino-terminal domains (NRN). Both processes are induced by protonation of acidic residues upon acidification along the spinning duct. Here, the structure and monomer-dimer equilibrium of the domain NRN1 of Latrodectus hesperus MaSp1 and variants thereof have been investigated, and the key residues for both could be identified. Changes in ionic composition and strength within the spinning duct enable electrostatic interactions between the acidic and basic poles of two monomers, which prearrange into an antiparallel dimer. Upon naturally occurring acidification this dimer is stabilized by protonation of residue E114. A conformational change is independently triggered by protonation of clustered acidic residues (D39, E76, E81). Such a step-by-step mechanism allows controlled spidroin assembly in a pH- and salt-sensitive manner, preventing premature aggregation of spider silk proteins in the gland and at the same time ensuring fast and efficient dimer formation and stabilization on demand in the spinning duct.
Current Practices in Teaching Introductory Epidemiology: How We Got Here, Where to Go
Keyes, Katherine M.; Galea, Sandro
2014-01-01
The number of students and disciplines requiring basic instruction in epidemiologic methods is growing. As a field, we now have a lexicon of epidemiologic terminology and particular methods that have developed and become canonical through the historical development of the field. Yet, many of our basic concepts remain elusive to some students, particularly those not pursuing a career in epidemiology. Further, disagreement and redundancy across basic terms limit their utility in teaching epidemiology. Many approaches to teaching epidemiology generally start with labeling key concepts and then move on to explain them. We submit that an approach grounded not in labels but in foundational concepts may offer a useful adjunct to introductory epidemiology education. We propose 7 foundational steps in conducting an epidemiologic study and provide examples of how these steps can be operationalized, using simple graphics that articulate how populations are defined, samples are selected, and individuals are followed to count cases. A reorganization of introductory epidemiology around core first principles may be an effective way forward for educating the next generation of public health scientists. PMID:25190677
NASA Astrophysics Data System (ADS)
Kweon, Hyunkyu; Choi, Sungdae; Kim, Youngsik; Nam, Kiho
Micro UTMs (Universal Testing Machines) are becoming increasingly popular for testing the mechanical properties of MEMS materials, metal thin films, and micro-molecule materials [1-2]. New miniature testing machines that can perform in-process measurement in SEM, TEM, and SPM are also needed. In this paper, a new micro UTM with a precision positioning system that can be used in SEM, TEM, and SPM is proposed. A bimorph-type PZT precision actuator is used in the fine positioning stage, and coarse positioning is implemented by a step motor. The size, load output, and displacement output of the bimorph-type UTM are 109×64×22 mm, about 35 g, and 0.4 mm, respectively, and the displacement output is controlled in block digital form. The results of the analysis and the basic properties of the positioning system and the UTM system are presented. In addition, experimental results of in-process measurement during tensile loading in SEM and AFM are shown.
NASA Astrophysics Data System (ADS)
Gupta, Shubhank; Panda, Aditi; Naskar, Ruchira; Mishra, Dinesh Kumar; Pal, Snehanshu
2017-11-01
Steels are alloys of iron and carbon, widely used in construction and other applications. The evolution of steel microstructure through various heat treatment processes is an important factor in controlling the properties and performance of steel. Extensive experimentation has been performed to enhance the properties of steel by customizing heat treatment processes. However, experimental analyses are always associated with high resource requirements in terms of cost and time. As an alternative solution, we propose an image processing-based technique for the refinement of raw plain carbon steel microstructure images into a digital form usable in experiments related to heat treatment processes of steel in diverse applications. The proposed work follows the conventional steps practiced by materials engineers in the manual refinement of steel images, and it appropriately utilizes basic image processing techniques (including filtering, segmentation, opening, and clustering) to automate the whole process. The proposed refinement of steel microstructure images aims to enable computer-aided simulations of heat treatment of plain carbon steel in a timely and cost-efficient manner; hence, it is beneficial for the materials and metallurgy industry. Our experimental results prove the efficiency and effectiveness of the proposed technique.
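A minimal sketch of the four named operations applied to a micrograph with OpenCV is shown below; the file name, kernel sizes, and cluster count are illustrative assumptions, not the authors' settings.

```python
# Hedged sketch of filtering, segmentation, opening, and clustering on a
# plain carbon steel micrograph; parameters are placeholders.
import cv2
import numpy as np

img = cv2.imread("micrograph.png", cv2.IMREAD_GRAYSCALE)

# 1. Filtering: suppress acquisition noise before thresholding.
smooth = cv2.GaussianBlur(img, (5, 5), 0)

# 2. Segmentation: Otsu's threshold separates phases (e.g. ferrite/pearlite).
_, binary = cv2.threshold(smooth, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 3. Opening: remove small artifacts while preserving grain shapes.
kernel = np.ones((3, 3), np.uint8)
opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

# 4. Clustering: k-means on intensity as a multi-phase alternative to Otsu.
pixels = smooth.reshape(-1, 1).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(pixels, 3, None, criteria, 5,
                                cv2.KMEANS_RANDOM_CENTERS)
phase_map = labels.reshape(img.shape)
```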
Okada, Takayuki
2013-01-01
The author suggested that it is essential for lawyers and psychiatrists to have a common understanding of the mutual division of roles between them when determining criminal responsibility (CR) and, for this purpose, proposed an 8-step structured CR decision-making process. The 8 steps are: (1) gathering of information related to mental function and condition, (2) recognition of mental function and condition, (3) psychiatric diagnosis, (4) description of the relationship between psychiatric symptom or psychopathology and index offense, (5) focus on capacities of differentiation between right and wrong and behavioral control, (6) specification of elements of cognitive/volitional prong in legal context, (7) legal evaluation of degree of cognitive/volitional prong, and (8) final interpretation of CR as a legal conclusion. The author suggested that the CR decision-making process should proceed not in a step-like pattern from (1) to (2) to (3) to (8), but in a step-like pattern from (1) to (2) to (4) to (5) to (6) to (7) to (8), and that not steps after (5), which require the interpretation or the application of section 39 of the Penal Code, but Step (4), must be the core of psychiatric expert evidence. When explaining the relationship between the mental disorder and offense described in Step (4), the Seven Focal Points (7FP) are often used. The author urged basic precautions to prevent the misuse of 7FP, which are: (a) the priority of each item is not equal and the relative importance differs from case to case; (b) each item is not exclusively independent; there may be overlap between items; (c) the criminal responsibility shall not be judged because one item is applicable or because a number of items are applicable, i.e., 7FP are not "criteria"; for example, the aim is not to decide such things as 'the motive is understandable' or 'the conduct is appropriate', but should be to describe how psychopathological factors affected the offense specifically in the context of understandability of motive or appropriateness of conduct; (d) it is essential to evaluate each item from a neutral point of view rather than only from one perspective, for example, looking at the case from the aspects of both comprehensibility and incomprehensibility of motive or from aspects of both oriented, purposeful, organized behavior and disoriented, purposeless, disorganized behavior during the offense; (e) depending on the case, there are some items that do not require any consideration (there are some cases in which there are fewer than seven items); (f) 7FP are not exhaustive and there are instances in which, depending on the case, there should be a focus on points that are not included in these.
Digital logic optimization using selection operators
NASA Technical Reports Server (NTRS)
Whitaker, Sterling R. (Inventor); Miles, Lowell H. (Inventor); Cameron, Eric G. (Inventor); Gambles, Jody W. (Inventor)
2004-01-01
According to the invention, a digital design method for manipulating a digital circuit netlist is disclosed. In one step, a first netlist is loaded. The first netlist is comprised of first basic cells that are comprised of first kernel cells. The first netlist is manipulated to create a second netlist. The second netlist is comprised of second basic cells that are comprised of second kernel cells. A percentage of the first and second kernel cells are selection circuits. There is less chip area consumed in the second basic cells than in the first basic cells. The second netlist is stored. In various embodiments, the percentage could be 2% or more, 5% or more, 10% or more, 20% or more, 30% or more, or 40% or more.
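The hierarchy in the claim (a netlist of basic cells, each built from kernel cells, a percentage of which are selection circuits) can be pictured with a toy data structure; everything below, including cell names and relative areas, is illustrative and not taken from the patent.

```python
# Toy sketch of the claimed hierarchy: netlist -> basic cells -> kernel
# cells, where some kernels are selection circuits (e.g. multiplexers).
# All names and area figures are hypothetical.
from dataclasses import dataclass, field

@dataclass
class KernelCell:
    kind: str          # e.g. "mux2" (a selection circuit) or "nand2"
    area: float        # relative chip area

@dataclass
class BasicCell:
    kernels: list = field(default_factory=list)
    def area(self) -> float:
        return sum(k.area for k in self.kernels)

def selection_percentage(cells) -> float:
    kernels = [k for c in cells for k in c.kernels]
    return 100.0 * sum(k.kind.startswith("mux") for k in kernels) / len(kernels)

first = [BasicCell([KernelCell("nand2", 1.0)] * 6)]               # first netlist
second = [BasicCell([KernelCell("mux2", 0.8), KernelCell("nand2", 1.0)] * 2)]

assert selection_percentage(second) >= 40.0   # the "40% or more" embodiment
assert sum(c.area() for c in second) < sum(c.area() for c in first)  # less area
```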
The neurophysiology of sexual arousal.
Schober, Justine M; Pfaff, Donald
2007-09-01
Our understanding of the process and initiation of sexual arousal is being enhanced by both animal and human studies, inclusive of basic science principles and research on clinical outcomes. Sexual arousal is dependent on neural (sensory and cognitive) factors, hormonal factors, genetic factors and, in the human case, the complex influences of culture and context. Sexual arousal activates the cognitive and physiologic processes that can eventually lead to sexual behavior. Sexual arousal comprises a particular subset of central nervous system arousal functions which depend on primitive, fundamental arousal mechanisms that cause generalized brain activity, but are manifest in a sociosexual context. The neurophysiology of sexual arousal is seen as a bidirectional system universal to all vertebrates. The following review includes known neural and genomic mechanisms of a hormone-dependent circuit for simple sex behavior. New information about hormone effects on causal steps related to sex hormones' nuclear receptor isoforms expressed by hypothalamic neurons continues to enrich our understanding of this neurophysiology.
Characterization and Degradation of Pectic Polysaccharides in Cocoa Pulp.
Meersman, Esther; Struyf, Nore; Kyomugasho, Clare; Jamsazzadeh Kermani, Zahra; Santiago, Jihan Santanina; Baert, Eline; Hemdane, Sami; Vrancken, Gino; Verstrepen, Kevin J; Courtin, Christophe M; Hendrickx, Marc; Steensels, Jan
2017-11-08
Microbial fermentation of the viscous pulp surrounding cocoa beans is a crucial step in chocolate production. During this process, the pulp is degraded, after which the beans are dried and shipped to factories for further processing. Despite its central role in chocolate production, pulp degradation, which is assumed to be a result of pectin breakdown, has not been thoroughly investigated. Therefore, this study provides a comprehensive physicochemical analysis of cocoa pulp, focusing on pectic polysaccharides, and the factors influencing its degradation. Detailed analysis reveals that pectin in cocoa pulp largely consists of weakly bound substances, and that both temperature and enzyme activity play a role in its degradation. Furthermore, this study shows that pulp degradation by an indigenous yeast fully relies on the presence of a single gene (PGU1), encoding an endopolygalacturonase. Apart from their basic scientific value, these new insights could propel the selection of microbial starter cultures for more efficient pulp degradation.
Insar Unwrapping Error Correction Based on Quasi-Accurate Detection of Gross Errors (quad)
NASA Astrophysics Data System (ADS)
Kang, Y.; Zhao, C. Y.; Zhang, Q.; Yang, C. S.
2018-04-01
Unwrapping error is a common error in InSAR processing that can seriously degrade the accuracy of monitoring results. Based on a gross error correction method, quasi-accurate detection (QUAD), a method for the automatic correction of unwrapping errors is established in this paper. The method identifies and corrects unwrapping errors by establishing a functional model between the true errors and the interferograms. The basic principle and processing steps are presented. The method is then compared with the L1-norm method on simulated data. Results show that both methods can effectively suppress unwrapping errors when the proportion of unwrapping errors is low, and that the two methods complement each other when the proportion is relatively high. Finally, the method is tested on real SAR data for phase unwrapping error correction. Results show that the new method can correct phase unwrapping errors successfully in practical applications.
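In outline, and in our notation rather than the paper's: unwrapping errors enter the unwrapped phase as integer cycle jumps, so the correction can be posed as gross-error detection in the observation equations:

```latex
\[
  \phi_{\mathrm{unw}} \;=\; \phi_{\mathrm{true}} + 2\pi k, \quad k \in \mathbb{Z},
  \qquad\qquad
  \mathbf{v} \;=\; A\,\hat{\mathbf{x}} - \mathbf{l},
\]
```

where QUAD selects a quasi-accurate subset of the interferometric observations l to estimate x̂, flags residuals v lying near multiples of 2π as gross errors, and corrects the affected observations by subtracting the corresponding 2πk.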
Flow chemistry vs. flow analysis.
Trojanowicz, Marek
2016-01-01
The flow mode of conducting chemical syntheses facilitates chemical processes through the use of on-line analytical monitoring of the reactions taking place, the application of solid-supported reagents to minimize downstream processing, and computerized control systems to perform multi-step sequences. These are exactly the attributes of flow analysis, which has held a solid place in modern analytical chemistry over the last several decades. The following review, based on 131 references to original papers as well as pre-selected reviews, presents basic aspects, selected instrumental achievements, and developmental directions of the rapidly growing field of continuous-flow chemical synthesis. Interestingly, many of these might potentially be employed in the development of new methods in flow analysis too. In this paper, examples of the application of flow analytical measurements for on-line monitoring of flow syntheses are indicated, and perspectives for a wider application of real-time analytical measurements are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
A template for integrated community sustainability planning.
Ling, Christopher; Hanna, Kevin; Dale, Ann
2009-08-01
This article describes a template for implementing an integrated community sustainability plan. The template emphasizes community engagement and outlines the components of a basic framework for integrating ecological, social and economic dynamics into a community plan. The framework is a series of steps that support a sustainable community development process. While it reflects the Canadian experience, the tools and techniques have applied value for a range of environmental planning contexts around the world. The research is case study based and draws from a diverse range of communities representing many types of infrastructure, demographics, and ecological and geographical contexts. A critical path for moving local governments to sustainable community development is the creation and implementation of integrated planning approaches. To be effective and actually implemented, the requisite shift to sustainability demands active community engagement processes, political will, a commitment to political and administrative accountability, and measurement.
Winett, Richard A; Anderson, Eileen S; Wojcik, Janet R; Winett, Sheila G; Moore, Shane; Blake, Chad
2011-03-01
Theory-based, efficacious, long-term, completely Internet-based interventions are needed to induce favorable shifts in health behaviors and prevent weight gain. To assess nutrition, physical activity, and, secondarily, body weight outcomes in the tailored, social cognitive theory Guide to Health (WB-GTH) program, with all recruitment, assessment, and intervention performed on the Internet. The focus of the efficacy study was engaged participants who completed 3 or more program modules plus baseline, 6-month post, and 16-month follow-up assessments (n = 247). To be eligible, participants needed to be between 18 and 63 years of age, have a BMI between 23 and 39, and be sedentary to low-active but otherwise healthy. Participants had a mean age of 45.5 (10.3) years; 86.2% were female, 8.5% were from minority groups, mean education was 17.5 (3.0) years, and median annual household income was about $85k. Nevertheless, about 83% were overweight or obese, and about 75% were sedentary (i.e., <5,000 steps/day) or had low levels of activity (i.e., 5,000-7,499 steps/day). Participants were randomized to the WB-GTH-Basic intervention or the WB-GTH-Enhanced intervention. Content, overall target behaviors, program goals, and strategies were the same in the two interventions, with the difference that Basic included a generic feedback and planning approach and Enhanced included a highly tailored planning and feedback approach. At assessments, participants reported pedometer step counts to assess physical activity and body weight from a scale provided, and fruit and vegetable (F&V) servings were assessed from food frequency questionnaires completed online. Participants in both Basic and Enhanced at follow-up increased physical activity by about 1,400 steps/day, lost about 3% of body weight, and increased F&V by about 1.5 servings/day. There was evidence that the least physically active, those who were obese, and those with the poorest nutrition made greater long-term improvements. Given similar outcomes for Basic and Enhanced, a relatively simple, entirely Internet-based program can help people improve health behaviors and prevent weight gain.
Effect of Profilin on Actin Critical Concentration: A Theoretical Analysis
Yarmola, Elena G.; Dranishnikov, Dmitri A.; Bubb, Michael R.
2008-01-01
To explain the effect of profilin on actin critical concentration in a manner consistent with thermodynamic constraints and available experimental data, we built a thermodynamically rigorous model of actin steady-state dynamics in the presence of profilin. We analyzed previously published mechanisms theoretically and experimentally and, based on our analysis, suggest a new explanation for the effect of profilin. It is based on a general principle of indirect energy coupling. The fluctuation-based process of exchange diffusion indirectly couples the energy of ATP hydrolysis to actin polymerization. Profilin modulates this coupling, producing two basic effects. The first is based on the acceleration of exchange diffusion by profilin, which indicates, paradoxically, that a faster rate of actin depolymerization promotes net polymerization. The second is an affinity-based mechanism similar to the one suggested in 1993 by Pantaloni and Carlier although based on indirect rather than direct energy coupling. In the model by Pantaloni and Carlier, transformation of chemical energy of ATP hydrolysis into polymerization energy is regulated by direct association of each step in the hydrolysis reaction with a corresponding step in polymerization. Thus, hydrolysis becomes a time-limiting step in actin polymerization. In contrast, indirect coupling allows ATP hydrolysis to lag behind actin polymerization, consistent with experimental results. PMID:18835900
NASA Astrophysics Data System (ADS)
Komolov, Vladimir L.; Gruzdev, Vitaly E.; Przhibelskii, Sergey G.; Smirnov, Dmitry S.
2012-12-01
Damage of a spherical metal nanoparticle by femtosecond laser pulses is analyzed by splitting the overall process into two steps. The fast step involves electron photoemission from the nanoparticle; it takes place during the direct action of a laser pulse, and its rate is evaluated as a function of laser and particle parameters by two approaches. The results suggest the formation of a significant positive charge on the nanoparticle due to the photoemission. The next step involves ion emission, which removes the excess positive charge and modifies the particle structure; it is delayed with respect to the photoemission and is analyzed by a simple analytical model and modified molecular dynamics. The obtained energy distribution suggests the generation of fast ions capable of penetrating into the surrounding material and generating defects next to the nanoparticle. The modeling is extended to the case of a nanoparticle on a solid surface to understand the basic mechanism of surface laser damage initiated by nano-contamination. Simulations predict embedding of the emitted ions into the substrate within a spot whose size significantly exceeds the original particle size. We discuss the relation of these effects to the problem of bulk and surface laser-induced damage of optical materials by single and multiple ultrashort laser pulses.
Cor, M Ken
Interpreting results from quantitative research can be difficult when measures of concepts are constructed poorly, something that can limit measurement validity. Social science steps for defining concepts, guidelines for limiting construct-irrelevant variance when writing self-report questions, and techniques for conducting basic item analysis are reviewed to inform the design of instruments to measure social science concepts in pharmacy education research. Based on a review of the literature, four main recommendations emerge: (1) employ a systematic process of conceptualization to derive nominal definitions; (2) write exact and detailed operational definitions for each concept; (3) when creating self-report questionnaires, write statements and select scales to avoid introducing construct-irrelevant variance (CIV); and (4) use basic item analysis results to inform instrument revision. Employing the recommendations that emerge from this review will strengthen arguments to support measurement validity, which in turn will support the defensibility of interpretations of study findings. An example from pharmacy education research is used to contextualize the concepts introduced. Copyright © 2017 Elsevier Inc. All rights reserved.
Raptor -- Mining the Sky in Real Time
NASA Astrophysics Data System (ADS)
Galassi, M.; Borozdin, K.; Casperson, D.; McGowan, K.; Starr, D.; White, R.; Wozniak, P.; Wren, J.
2004-06-01
The primary goal of Raptor is ambitious: to identify interesting optical transients from very wide field of view telescopes in real time, and then to quickly point the higher resolution Raptor ``fovea'' cameras and spectrometer to the location of the optical transient. The most interesting of Raptor's many applications is the real-time search for orphan optical counterparts of Gamma Ray Bursts. The sequence of steps (data acquisition, basic calibration, source extraction, astrometry, relative photometry, the smarts of transient identification and elimination of false positives, telescope pointing feedback...) is implemented with a ``component'' approach. All basic elements of the pipeline functionality have been written from scratch or adapted (as in the case of SExtractor for source extraction) to form a consistent modern API operating on memory-resident images and source lists. The result is a pipeline which meets our real-time requirements and which can easily operate as a monolithic or distributed processing system. Finally: the Raptor architecture is entirely based on free software (sometimes referred to as "open source" software). In this paper we also discuss the interplay between various free software technologies in this type of astronomical problem.
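The component approach can be pictured as a chain of swappable stages over memory-resident data. The sketch below uses stub functions and names of our own invention, not Raptor's actual API.

```python
# Structural sketch of a component-style pipeline; every stage is a stub
# standing in for a real component (calibration, extraction, etc.).
def calibrate(d): d["calibrated"] = True; return d
def extract_sources(d): d["sources"] = [{"x": 10.2, "y": 33.8, "mag": 14.1}]; return d
def solve_astrometry(d): d["wcs"] = "ra/dec attached"; return d
def relative_photometry(d): d["zeropoint"] = 25.0; return d
def find_transients(d): d["candidates"] = d["sources"]; return d
def reject_false_positives(d):
    d["alerts"] = [s for s in d["candidates"] if s["mag"] < 18]
    return d

PIPELINE = [calibrate, extract_sources, solve_astrometry,
            relative_photometry, find_transients, reject_false_positives]

def process(frame):
    for stage in PIPELINE:   # swappable stages allow the same code to run
        frame = stage(frame) # as a monolithic or a distributed system
    return frame

print(process({"image": "memory-resident pixels"})["alerts"])
```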
Efficient numerical modeling of the cornea, and applications
NASA Astrophysics Data System (ADS)
Gonzalez, L.; Navarro, Rafael M.; Hdez-Matamoros, J. L.
2004-10-01
Corneal topography has been shown to be an essential tool in the ophthalmology clinic, both in diagnosis and in custom treatments (refractive surgery, keratoplasty), and it also has strong potential in optometry. The post-processing and analysis of corneal elevation, or local curvature data, is a necessary step to refine the data and to extract relevant information for the clinician. In this context, a parametric cornea model is proposed, consisting of a surface described mathematically by two terms: a general ellipsoid corresponding to a regular base surface, expressed by a general quadric term located at an arbitrary position and orientation in 3D space, and a second term, described by a Zernike polynomial expansion, which accounts for irregularities and departures from the basic geometry. The model has been validated, achieving a better fit to experimental data than previous models. Among other potential applications, we present here the determination of the optical axis of the cornea by transforming the general quadric to its canonical form. This has permitted us to perform 3D registration of corneal topographical maps to improve the signal-to-noise ratio. Other basic and clinical applications are also explored.
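Schematically, and in our own notation, the proposed two-term surface combines a general quadric with a Zernike residual:

```latex
\[
  \underbrace{\mathbf{x}^{\mathsf T} A\,\mathbf{x} + \mathbf{b}^{\mathsf T}\mathbf{x} + c = 0}_{\text{regular base surface (general quadric)}}
  \qquad \text{plus} \qquad
  \Delta z(\rho,\theta) \;=\; \sum_{n=0}^{N} \sum_{m=-n}^{n} c_n^m\, Z_n^m(\rho,\theta),
\]
```

Bringing the quadric matrix A to its canonical form by diagonalization yields the corneal optical axis used for the 3D registration of topographical maps.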
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
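The flow of the first claim (collect step parameters, compile step data, sum them into process metrics, present to the user) can be sketched as follows; the metric choices and numbers are illustrative, not taken from the patent.

```python
# Hedged sketch: compiling per-step data and summing it into process
# metrics. Cycle times and the value-added flag are hypothetical inputs.
steps = [  # user-supplied manufacturing process step parameters
    {"name": "stamp", "cycle_s": 42.0, "value_added": True},
    {"name": "queue", "cycle_s": 600.0, "value_added": False},
    {"name": "weld", "cycle_s": 75.0, "value_added": True},
]

total = sum(s["cycle_s"] for s in steps)  # summation of compiled step data
va = sum(s["cycle_s"] for s in steps if s["value_added"])
metrics = {"total_cycle_s": total, "value_added_ratio": va / total}
print(metrics)  # presented to the user to weigh batch vs. lean techniques
```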
Classifying seismic waveforms from scratch: a case study in the alpine environment
NASA Astrophysics Data System (ADS)
Hammer, C.; Ohrnberger, M.; Fäh, D.
2013-01-01
Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step for successfully analyzing those data is the correct detection of various event types. However, visual scanning is a time-consuming task. Applying standard techniques for detection like the STA/LTA trigger still requires manual control for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows the classification to start from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action allows classifier properties to be learned from a single waveform example and some hours of background recording. Besides reducing the required workload, this also enables the detection of very rare events. Especially the latter feature provides a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system using a data set from the Swiss Seismological Survey, achieving very high recognition rates. In detail, we document all refinements of the classifier, providing a step-by-step guide for the fast setup of a well-working classification system.
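A minimal sketch of the one-model-per-class idea, learning one HMM from a single event example and one from background recording, using the community hmmlearn package and synthetic placeholder features:

```python
# Hedged sketch: per-class hidden Markov models for stream classification.
# Features and data are synthetic stand-ins for real seismic features.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, size=(2000, 3))   # hours of feature vectors
event_example = rng.normal(2.0, 1.0, size=(60, 3))  # one waveform of interest

bg_model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=20)
bg_model.fit(background)
ev_model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
ev_model.fit(event_example)                          # learned from one example

window = rng.normal(2.0, 1.0, size=(60, 3))          # incoming stream window
label = "event" if ev_model.score(window) > bg_model.score(window) else "background"
print(label)
```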
Adult Dental Trauma: What Should the Dental Practitioner Know?
Chauhan, Ravi; Rasaratnam, Lakshmi; Alani, Aws; Djemal, Serpil
2016-05-01
The management of adult dental trauma can be a daunting challenge for practitioners at any level. As with medical emergencies, initial management can have a large influence on prognosis. It is important that practitioners understand the basic principles of managing the acute presentations of dental trauma. This article aims to illustrate a step-by-step approach in order to improve management within general dental practice for better patient outcomes.
ERIC Educational Resources Information Center
Acar, Ebru Aktan; Çetin, Hilal
2017-01-01
The study features two basic steps. The first step of the research aims to develop a scale to measure the attitude of preschool teachers towards the Persona Dolls Approach and to verify its validity/reliability through a general survey. The cohort employed in the research was drawn from a pool of preschool teachers working in and around the cities…
Developing Brain Vital Signs: Initial Framework for Monitoring Brain Function Changes Over Time
Ghosh Hajra, Sujoy; Liu, Careesa C.; Song, Xiaowei; Fickling, Shaun; Liu, Luke E.; Pawlowski, Gabriela; Jorgensen, Janelle K.; Smith, Aynsley M.; Schnaider-Beeri, Michal; Van Den Broek, Rudi; Rizzotti, Rowena; Fisher, Kirk; D'Arcy, Ryan C. N.
2016-01-01
Clinical assessment of brain function relies heavily on indirect behavior-based tests. Unfortunately, behavior-based assessments are subjective and therefore susceptible to several confounding factors. Event-related brain potentials (ERPs), derived from electroencephalography (EEG), are often used to provide objective, physiological measures of brain function. Historically, ERPs have been characterized extensively within research settings, with limited but growing clinical applications. Over the past 20 years, we have developed clinical ERP applications for the evaluation of functional status following serious injury and/or disease. This work has identified an important gap: the need for a clinically accessible framework to evaluate ERP measures. Crucially, this enables baseline measures before brain dysfunction occurs, and might enable the routine collection of brain function metrics in the future much like blood pressure measures today. Here, we propose such a framework for extracting specific ERPs as potential “brain vital signs.” This framework enabled the translation/transformation of complex ERP data into accessible metrics of brain function for wider clinical utilization. To formalize the framework, three essential ERPs were selected as initial indicators: (1) the auditory N100 (Auditory sensation); (2) the auditory oddball P300 (Basic attention); and (3) the auditory speech processing N400 (Cognitive processing). First step validation was conducted on healthy younger and older adults (age range: 22–82 years). Results confirmed specific ERPs at the individual level (86.81–98.96%), verified predictable age-related differences (P300 latency delays in older adults, p < 0.05), and demonstrated successful linear transformation into the proposed brain vital sign (BVS) framework (basic attention latency sub-component of BVS framework reflects delays in older adults, p < 0.05). The findings represent an initial critical step in developing, extracting, and characterizing ERPs as vital signs, critical for subsequent evaluation of dysfunction in conditions like concussion and/or dementia. PMID:27242415
Climatological Processing of Radar Data for the TRMM Ground Validation Program
NASA Technical Reports Server (NTRS)
Kulie, Mark; Marks, David; Robinson, Michael; Silberstein, David; Wolff, David; Ferrier, Brad; Amitai, Eyal; Fisher, Brad; Wang, Jian-Xin; Augustine, David;
2000-01-01
The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November 1997. The main purpose of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented. The primary goal of TRMM GV is to provide basic validation of satellite-derived precipitation measurements over monthly climatologies for the following primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, research analysts at NASA Goddard Space Flight Center (GSFC) generate standardized TRMM GV products using quality-controlled ground-based radar data from the four primary GV sites as input. This presentation will provide an overview of the TRMM GV climatological processing system. A description of the data flow between the primary GV sites, NASA GSFC, and the TRMM Science and Data Information System (TSDIS) will be presented. The radar quality control algorithm, which features eight adjustable height and reflectivity parameters, and its effect on monthly rainfall maps will be described. The methodology used to create monthly, gauge-adjusted rainfall products for each primary site will also be summarized. The standardized monthly rainfall products are developed in discrete, modular steps with distinct intermediate products. These developmental steps include: (1) extracting radar data over the locations of rain gauges, (2) merging rain gauge and radar data in time and space with user-defined options, (3) automated quality control of radar and gauge merged data by tracking accumulations from each instrument, and (4) deriving Z-R relationships from the quality-controlled merged data over monthly time scales. A summary of recently reprocessed official GV rainfall products available for TRMM science users will be presented. Updated basic standardized product results and trends involving monthly accumulation, Z-R relationship, and gauge statistics for each primary GV site will also be displayed.
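As a toy illustration of step (4), a monthly Z-R power law Z = aR^b can be fitted by log-log least squares from matched radar-gauge pairs. This is a minimal sketch with placeholder data, not the operational GV code.

```python
# Hedged sketch: fitting Z = a * R**b from quality-controlled merged
# radar-gauge pairs; the sample values below are placeholders.
import numpy as np

z_dbz = np.array([28.0, 35.5, 41.2, 30.1, 38.7])  # radar reflectivity (dBZ)
r_mmh = np.array([1.2, 5.8, 18.4, 2.1, 10.3])     # gauge rain rate (mm/h)

z_lin = 10.0 ** (z_dbz / 10.0)        # dBZ to linear units (mm^6 m^-3)
mask = r_mmh > 0                      # log-log fit needs positive rain rates
b, log_a = np.polyfit(np.log10(r_mmh[mask]), np.log10(z_lin[mask]), 1)
a = 10.0 ** log_a
print(f"Z = {a:.1f} * R^{b:.2f}")     # cf. the classic Z = 200 R^1.6
```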
Commercialization of basic research from within the university and return of value to the public
Hammerstedt, Roy H; Blach, Edward L
2008-01-01
The responsibility to return “value” to those who support basic research is an obligatory part of accepting funds to support the research. This reality should, but now does not, impact planning and execution of all basic research from its earliest stages. Universities are becoming ever more important in their role in the accelerating quest of a national goal of transition to a “knowledge based economy.” As such, the complex organizational format of a university, laden with entrenched procedures of questionable utility, should be adjusted to identify the means to commercialize the small subset of projects that appear suitable for further development. Of special concern is the growing tendency to encourage academic “innovators” to develop spin-out companies “on the side.” While seductive in perceived simplicity, this is a difficult step and we believe that most such individuals are ill-suited to these activities. Not because of technical ability but because of lack of relevant management experience. We attempt to address that situation through a brief listing of some reasons why people “do research” and outline phases (steps) in moving from concept to application, including an overview of start-up and funding early-stage spin-outs. A discussion of the limits to applying results of basic research to enhancing sperm fertility in commodity and companion animals and humans is provided. Hurdles are so daunting that there is concern as to why anyone would attempt to translate basic observations into practical solutions; which in turn raises the question of why funding agencies should fund basic studies in the first place. PMID:18164880
Workbook, Basic Mathematics and Wastewater Processing Calculations.
ERIC Educational Resources Information Center
New York State Dept. of Environmental Conservation, Albany.
This workbook serves as a self-learning guide to basic mathematics and treatment plant calculations and also as a reference and source book for the mathematics of sewage treatment and processing. In addition to basic mathematics, the workbook discusses processing and process control, laboratory calculations and efficiency calculations necessary in…
The Disestablishment of U.S. Joint Forces Command: A Step Backward in Jointness
2011-06-01
and the less than stellar 1983 rescue of a few medical students from the island of Grenada (Operation Urgent Fury), Congress passed the Goldwater...an opportunity for dialogue and a source of basic joint doctrine, it was widely ignored despite increasing tension and preparation for the Second... basic unity of effort appeared lacking even with dialogue like the Key West Agreements. Dr. William Niskanen, with the CATO Institute, described the
Assessment of molecular contamination in mask pod
NASA Astrophysics Data System (ADS)
Foray, Jean Marie; Dejaune, Patrice; Sergent, Pierre; Gough, Stuart; Cheung, D.; Davenet, Magali; Favre, Arnaud; Rude, C.; Trautmann, T.; Tissier, Michel; Fontaine, H.; Veillerot, M.; Avary, K.; Hollein, I.; Lerit, R.
2008-04-01
Context/study motivation: Contamination, and especially Airborne Molecular Contamination (AMC), is a critical issue for mask material flow, with a severe and fairly unpredictable risk of induced contamination and damage, especially for 193 nm lithography. It is therefore essential to measure, to understand, and then to try to reduce AMC in the mask environment. Mask material flow was studied in a global approach by a pool of European partners, especially within the frame of the European MEDEA+ project known as "MUSCLE". This paper deals first with results and assessment of the mask pod environment in terms of molecular contamination, and then with preliminary studies to reduce mask pod influence and contamination due to material outgassing. Approach and techniques: A specific assessment of environmental/molecular contamination along the supply chain was performed by all partners. After previous work presented at EMLC 07, further studies were performed on real-time contamination measurement of pods at different site locations (including a mask manufacturing site, blank manufacturing sites, and an IC fab). The studies were linked to the main critical issues: cleaning, storage, handling, materials, and processes. Contamination measurement campaigns were carried out along the mask supply chain using a specific Adixen analyzer to monitor organic contaminants (ppb level) in mask pods in real time. Key results are presented: VOC, AMC and humidity levels for different kinds of mask carriers, the impact of basic cleaning on pod outgassing measurements (VOC, NH3), and process influence on pod contamination... In a second step, preliminary pod conditioning studies for a better pod environment were performed based on an Adixen vacuum process. Process influence was experimentally measured in terms of molecular outgassing from mask pods. Several AMC experimental characterization methods were carried out, leading to results on a wide range of organic and inorganic contaminants: in-line techniques based on Adixen humidity, VOC and organic sensors, together with off-line techniques already used in the extensive previous mask pod benchmark (TD-GCMS and ion chromatography). Humidity and VOC levels from mask carriers showed a significant reduction after the Adixen pod conditioning process. Focus was placed on an optimized vacuum step (for AMC) after the particle carrier cleaning cycle. Based upon these key results, new procedures, as well as guidelines for mask carrier cleaning optimization, are proposed to improve pod contamination control. Summary results/next steps: This paper reports molecular contamination measurement campaigns performed by a pool of European partners along the mask supply chain. They allowed us to investigate, identify, and quantify critical molecular contamination in mask pods, as well as VOC and humidity issues, depending on location, use, and carrier type. Preliminary studies highlight initial process solutions for pod conditioning that are being used for short-term industrialization and will be industrialized further.
NASA Technical Reports Server (NTRS)
Exum, Daniel
1996-01-01
AMB-21 is a new polymer developed by Mr. Ray Vannucci, NASA LeRC, as a noncarcinogenic polyimide matrix which may be suitable for fabricating composite parts by the Resin Transfer Molding (RTM) process. The polyimide for this project was prepared at the Center of Composite Materials Research at N.C. A&T State University because it is not currently an item of commerce. The RTM process is especially suitable for producing geometrically complex composite parts at low cost. Because of their high melting point and very high viscosity at the time of processing, polyimides have not been extensively used in the RTM process. The process for preparing AMB-21, as well as the process for fabricating composite plates, is described. The basic fabrication process consists of injecting a solvent solution of AMB-21 into a carbon fiber preform, evaporating the solvent, imidizing the polyimide, and vacuum/compression molding the impregnated preform. All of the above molding steps are performed in a specially designed RTM mold, which is also described. The results of this process have been inconsistent: whereas some experiments have resulted in reasonably sound panels, others have not. Further refinement is required to establish a reliable process.
Implementing Performance Assessment in the Classroom.
ERIC Educational Resources Information Center
Brualdi, Amy
1999-01-01
Provides advice on implementing performance assessment in the classroom. Outlines the basic steps from defining the purpose of the assessment to giving the student feedback. Advice is also given about scoring rubrics. (SLD)
Parallel algorithms for boundary value problems
NASA Technical Reports Server (NTRS)
Lin, Avi
1990-01-01
A general approach to solving boundary value problems numerically in a parallel environment is discussed. The basic algorithm consists of two steps: the local step, where all the P available processors work in parallel, and the global step, where one processor solves a tridiagonal linear system of order P. The main advantages of this approach are twofold. First, the suggested approach is very flexible, especially in the local step, and thus the algorithm can be used with any number of processors and with SIMD or MIMD machines. Secondly, the communication complexity is very small, and thus the approach can be used just as easily with shared-memory machines. Several examples of using this strategy are discussed.
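For a linear two-point boundary value problem split over P subdomains, the two-step structure can be sketched as follows (our notation, not necessarily the paper's):

```latex
\[
  u_p(x) \;=\; u_p^{(0)}(x) + \alpha_p\, v_p(x) + \beta_p\, w_p(x),
  \qquad p = 1,\dots,P \quad \text{(local step, fully parallel)}
\]
\[
  T\,\mathbf{c} \;=\; \mathbf{g}, \qquad
  \mathbf{c} = (\alpha_1,\beta_1,\dots,\alpha_P,\beta_P)^{\mathsf T},
  \qquad \text{(global step, tridiagonal system of order } O(P)\text{)}
\]
```

Each processor independently computes one particular and two homogeneous local solutions; continuity of u and u' at the P-1 interfaces then supplies the equations of the small global system, which a single processor solves before the local solutions are assembled.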
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebberts, Blaine D.; Zelinsky, Ben D.; Karnezis, Jason P.
We successfully implemented and institutionalized an adaptive management (AM) process for the Columbia Estuary Ecosystem Restoration Program, which is a large-scale restoration program focused on improving ecosystem conditions in the 234-km lower Columbia River and estuary. For our purpose, “institutionalized” means the AM process and restoration program are embedded in the work flow of the implementing agencies and affected parties. While plans outlining frameworks, processes, or approaches to AM of ecosystem restoration programs are commonplace, establishment for the long term is not. This paper presents the basic AM framework and explains how AM was implemented and institutionalized. Starting with a common goal, the approach we pursued included a well-understood governance and decision-making structure, routine coordination and communication activities, data and information sharing, commitment from partners and upper agency management to the AM process, and meaningful cooperation among program managers and partners. The overall approach and steps to implement and institutionalize AM for ecosystem restoration explained here are applicable to situations where it has been less than successful or, as in our case, the restoration program is just getting started.
NASA Astrophysics Data System (ADS)
Rutigliani, Vito; Lorusso, Gian Francesco; De Simone, Danilo; Lazzarino, Frederic; Rispens, Gijsbert; Papavieros, George; Gogolides, Evangelos; Constantoudis, Vassilios; Mack, Chris A.
2018-03-01
Power spectral density (PSD) analysis is playing an increasingly critical role in the understanding of line-edge roughness (LER) and linewidth roughness (LWR) in a variety of applications across the industry. It is an essential step in obtaining an unbiased LWR estimate, as well as an extremely useful tool for process and material characterization. However, the PSD estimate can be affected by both random and systematic artifacts caused by image acquisition and measurement settings, which can irremediably alter its information content. In this paper, we report on the impact of various setting parameters (smoothing image processing filters, pixel size, and SEM noise levels) on the PSD estimate. We also discuss the use of the PSD analysis tool in a variety of cases. Looking beyond the basic roughness estimate, we use PSD and autocorrelation analysis to characterize resist blur [1] as well as low- and high-frequency roughness content, and we apply this technique to guide EUV material stack selection. Our results clearly indicate that, if properly used, PSD methodology is a very sensitive tool for investigating material and process variations.
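A minimal sketch of an edge PSD estimate with a simple noise-floor subtraction toward an unbiased roughness value is shown below. The data are synthetic, the normalization follows one common convention, and the noise-floor heuristic is ours rather than the authors' procedure.

```python
# Hedged sketch: average periodogram of many measured edges, then subtract
# a white SEM-noise floor estimated from the highest frequencies.
import numpy as np

dx = 1.0                                   # pixel size along the line (nm)
edges = np.random.default_rng(1).normal(0.0, 1.5, size=(50, 512))  # 50 edges

e = edges - edges.mean(axis=1, keepdims=True)
spec = np.abs(np.fft.rfft(e, axis=1)) ** 2 * dx / e.shape[1]  # periodograms
psd = spec.mean(axis=0)                    # averaging reduces estimator noise
freq = np.fft.rfftfreq(e.shape[1], d=dx)

noise_floor = psd[-len(psd) // 8 :].mean() # white noise dominates high f
sigma2_biased = 2 * np.trapz(psd, freq)    # one-sided integral (convention)
sigma2_unbiased = 2 * np.trapz(np.clip(psd - noise_floor, 0, None), freq)
print(sigma2_biased, sigma2_unbiased)
```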
Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeiffer, M., E-mail: mpfeiffer@irs.uni-stuttgart.de; Nizenkov, P., E-mail: nizenkov@irs.uni-stuttgart.de; Mirza, A., E-mail: mirza@irs.uni-stuttgart.de
2016-02-15
Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn’s Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated, where good agreement for the relaxation time according to the Landau-Teller expression is found for both methods, the established prohibiting double relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Consequently, two numerical methods used for sampling of energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and the comparison to experimental measurements of a hypersonic, carbon-dioxide flow around a flat-faced cylinder.
A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems
NASA Technical Reports Server (NTRS)
Bindschadler, Duane L.; Boyles, Carole A.
2008-01-01
The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.
Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases
NASA Astrophysics Data System (ADS)
Pfeiffer, M.; Nizenkov, P.; Mirza, A.; Fasoulas, S.
2016-02-01
Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn's Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated, where good agreement for the relaxation time according to the Landau-Teller expression is found for both methods, the established prohibiting double relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Consequently, two numerical methods used for sampling of energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and the comparison to experimental measurements of a hypersonic, carbon-dioxide flow around a flat-faced cylinder.
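A generic random-walk Metropolis sampler of the kind proposed, drawing multi-dimensional energy samples in one sweep, can be sketched as follows. The target density here is a simple placeholder, not the paper's vibrational model.

```python
# Hedged sketch: random-walk Metropolis over a multi-dimensional energy
# vector; the exponential target below stands in for the real distribution.
import numpy as np

def metropolis(logp, x0, steps=10_000, scale=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    lp = logp(x)
    out = np.empty((steps, x.size))
    for i in range(steps):
        prop = x + rng.normal(0.0, scale, size=x.shape)  # random-walk proposal
        lp_prop = logp(prop)
        if np.log(rng.random()) < lp_prop - lp:          # accept/reject
            x, lp = prop, lp_prop
        out[i] = x
    return out

# Placeholder target: two modes with Boltzmann-like reduced energies e >= 0.
logp = lambda e: -e.sum() if np.all(e >= 0) else -np.inf
samples = metropolis(logp, x0=[1.0, 1.0])
print(samples[2000:].mean(axis=0))  # ~[1, 1] for unit exponential marginals
```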
Food Safety: MedlinePlus Health Topic
Basic Airline Services to Improve Customer Satisfaction Act
Sen. Landrieu, Mary L. [D-LA
2011-11-18
Senate - 11/18/2011 Read twice and referred to the Committee on Commerce, Science, and Transportation.
Assessment of turbulent models for scramjet flowfields
NASA Technical Reports Server (NTRS)
Sindir, M. M.; Harsha, P. T.
1982-01-01
The behavior of several turbulence models applied to the prediction of scramjet combustor flows is described. These models include the basic two-equation model, the multiple-dissipation-length-scale variant of the two-equation model, and the algebraic stress model (ASM). Predictions were made of planar backward-facing step flows and axisymmetric sudden-expansion flows using each of these approaches. The formulation of each of these models is discussed, and the application of the different approaches to supersonic flows is described. A modified version of the ASM is found to provide the best prediction of the planar backward-facing step flow in the region near the recirculation zone, while the basic ASM provides the best results downstream of the recirculation. Aspects of the interaction of numerical modeling and turbulence modeling, as they affect the assessment of turbulence models, are discussed.
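For reference, the basic two-equation (k-ε) model referred to above is commonly written in its standard high-Reynolds-number form (a textbook statement, not reproduced from the report):

```latex
\begin{align*}
\frac{\partial (\rho k)}{\partial t}
  + \frac{\partial (\rho u_j k)}{\partial x_j}
  &= \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)
     \frac{\partial k}{\partial x_j}\right] + P_k - \rho\varepsilon, \\
\frac{\partial (\rho \varepsilon)}{\partial t}
  + \frac{\partial (\rho u_j \varepsilon)}{\partial x_j}
  &= \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)
     \frac{\partial \varepsilon}{\partial x_j}\right]
     + C_{\varepsilon 1}\,\frac{\varepsilon}{k}\,P_k
     - C_{\varepsilon 2}\,\rho\,\frac{\varepsilon^2}{k},
\qquad
\mu_t = \rho\, C_\mu \frac{k^2}{\varepsilon},
\end{align*}
```

with the usual constants C_µ = 0.09, C_ε1 = 1.44, C_ε2 = 1.92, σ_k = 1.0, and σ_ε = 1.3; the multiple-dissipation-length-scale variant and the ASM modify the dissipation treatment and replace the eddy-viscosity closure with algebraic stress relations, respectively.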
Al-Ghobashy, Medhat A; Williams, Martin A K; Brophy, Brigid; Laible, Götz; Harding, David R K
2009-06-01
Downstream purification of a model recombinant protein (human myelin basic protein) from the milk of transgenic cows is described. The recombinant protein was expressed as a His-tagged fusion protein in the milk of transgenic cows and was found associated with the casein micellar phase. While difficulties in obtaining good recoveries were found when employing conventional micelle disruption procedures, direct capture using the cation exchanger SP Sepharose Big Beads was found successful in the extraction of the recombinant protein. Early breakthrough suggested a slow release of the recombinant protein from the micelles and dictated micelle disruption in order to obtain good yields. A new approach for deconstruction of the calcium core of the casein micelles, employing the interaction between the micellar calcium and the active sites of the cation exchanger resin, was developed. Milk samples were loaded onto the column in aliquots, with a column washing step after each aliquot. This sequential loading approach successfully liberated the recombinant protein from the micelles and was found superior to the conventional sample loading approach. It increased the recovery by more than 25%, reduced fouling due to milk components, and improved the column's hydrodynamic properties compared to the conventional sample loading approach. Hardware and software modifications to the chromatography system were necessary in order to keep the whole process automated. A second purification step using a Ni2+ affinity column was used to isolate the recombinant protein at a purity of more than 90% and a recovery of 78%.
Inorganic and Protein Crystal Assembly in Solutions
NASA Technical Reports Server (NTRS)
Chernov, A. A.
2005-01-01
The basic kinetic and thermodynamic concepts of crystal growth will be revisited in view of recent AFM and interferometric findings. These concepts are as follows: 1) The Kossel crystal model that allows only one kink type on the crystal surface. The modern theory is developed overwhelmingly for the Kossel model; 2) Presumption that intensive step fluctuations maintain kink density sufficiently high to allow applicability of the Gibbs-Thomson law; 3) Common experience that unlimited step bunching (morphological instability) during layer growth from solutions and supercooled melts always takes place if the step flow direction coincides with that of the fluid.
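For context, the Gibbs-Thomson law referenced in point 2 is commonly written, for a two-dimensional island of radius r bounded by a step, as (standard form, our notation):

```latex
\[
  C_e(r) \;=\; C_e(\infty)\,\exp\!\left(\frac{\gamma\,\Omega}{r\,k_B T}\right),
\]
```

where γ is the step free energy per unit length, Ω the molecular area, and C_e the equilibrium concentration. Its applicability presumes kink densities high enough for step edges to equilibrate, which is precisely the presumption being revisited here.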
Microbial diversity in a bagasse-based compost prepared for the production of Agaricus brasiliensis
Silva, Cristina Ferreira; Azevedo, Raquel Santos; Braga, Claudia; da Silva, Romildo; Dias, Eustáquio Souza; Schwan, Rosane Freitas
2009-01-01
Edible mushrooms are renowned for their nutritional and medicinal properties and are thus of considerable commercial importance. Mushroom production depends on the chemical composition of the basic substrates and additional supplements employed in the compost as well as on the method of composting. In order to minimise the cost of mushroom production, considerable interest has been shown in the use of agro-industrial residues in the preparation of alternative compost mixtures. However, the interaction of the natural microbiota present in agricultural residues during the composting process greatly influences the subsequent colonisation by the mushroom. The aim of the present study was to isolate and identify the microbiota present in a sugar cane bagasse and coast-cross straw compost prepared for the production of Agaricus brasiliensis. Composting lasted for 14 days, during which time the substrates and additives were mixed every 2 days, and this was followed by a two-step steam pasteurisation (55 - 65°C; 15 h each step). Bacteria (mainly Bacillus and Paenibacillus spp. and members of the Enterobacteriaceae) were the predominant micro-organisms present throughout the composting process, with an average population density of 3 x 10^8 CFU/g. Actinomycetes, and especially members of the genus Streptomyces, were well represented with a population density of 2 - 3 x 10^8 CFU/g. The filamentous fungi, however, exhibited much lower population densities and were less diverse than the other micro-organisms, although Aspergillus fumigatus was present during the whole composting process and after pasteurisation. PMID:24031404
Feng, Nianhua; Han, Qin; Li, Jing; Wang, Shihua; Li, Hongling; Yao, Xinglei; Zhao, Robert Chunhua
2014-03-01
Neural stem cells (NSCs) are ideal candidates in stem cell-based therapy for neurodegenerative diseases. However, it is not feasible to obtain a sufficient quantity of NSCs for clinical application. Generation of NSCs from human adipose-derived mesenchymal stem cells (hAD-MSCs) will provide a solution to this problem. Currently, the differentiation of hAD-MSCs into highly purified NSCs with biological functions is rarely reported. In our study, we established a three-step NSC-inducing protocol, in which hAD-MSCs were induced to generate NSCs with high purity after being sequentially cultured in the pre-inducing medium (Step 1), the N2B27 medium (Step 2), and the N2B27 medium supplemented with basic fibroblast growth factor and epidermal growth factor (Step 3). These hAD-MSC-derived NSCs (adNSCs) can form neurospheres and highly express Sox1, Pax6, Nestin, and Vimentin; the proportion was 96.1% ± 1.3%, 96.8% ± 1.7%, 96.2% ± 1.3%, and 97.2% ± 2.5%, respectively, as detected by flow cytometry. These adNSCs can further differentiate into astrocytes, oligodendrocytes, and functional neurons, which were able to generate tetrodotoxin-sensitive sodium currents. Additionally, we found that the neural differentiation of hAD-MSCs was significantly suppressed by Sox1 interference; what's more, Step 1 was a key step for the subsequent induction, probably because it was associated with the initiation and nuclear translocation of Sox1, an important transcription factor for neural development. Finally, we observed that bone morphogenetic protein signaling was inhibited and Wnt/β-catenin signaling was activated during the inducing process, and both signals were related to Sox1 expression. In conclusion, we successfully established a three-step inducing protocol to derive NSCs from hAD-MSCs with high purity through Sox1 activation. These findings may make it possible to acquire enough autologous transplantable NSCs for the therapy of neurodegenerative diseases in the clinic.
Evaluation of a newly developed media-supported 4-step approach for basic life support training
2012-01-01
Objective The quality of external chest compressions (ECC) is of primary importance within basic life support (BLS). Recent guidelines delineate the so-called “4-step approach” for teaching practical skills within resuscitation training guided by a certified instructor. The objective of this study was to evaluate whether a “media-supported 4-step approach” for BLS training leads to equal practical performance compared to the standard 4-step approach. Materials and methods After baseline testing, 220 laypersons were either trained using the widely accepted method for resuscitation training (4-step approach) or using a newly created “media-supported 4-step approach”, both of equal duration. In this approach, steps 1 and 2 were ensured via a standardised self-produced podcast, which included all of the information regarding the BLS algorithm and resuscitation skills. Participants were tested on manikins in the same mock cardiac arrest single-rescuer scenario prior to intervention, after one week and after six months with respect to ECC performance, and participants were surveyed about the approach. Results Participants (age 23 ± 11, 69% female) reached comparable practical ECC performance in both groups, with no statistical difference. Even after six months, there was no difference detected in the quality of the initial assessment algorithm or delay concerning initiation of CPR. Overall, at least 99% of the intervention group (n = 99; mean 1.5 ± 0.8; 6-point Likert scale: 1 = completely agree, 6 = completely disagree) agreed that the video provided an adequate introduction to BLS skills. Conclusions The “media-supported 4-step approach” leads to comparable practical ECC performance compared to standard teaching, even with respect to retention of skills. Therefore, this approach could be useful in special educational settings where, for example, instructors’ resources are sparse or large-group sessions have to be prepared. PMID:22647148
W. Devine; C. Aubry; J. Miller; K. Potter; A. Bower
2012-01-01
This guide provides a step-by-step description of the methodology used to apply the Forest Tree Genetic Risk Assessment System (ForGRAS; Potter and Crane 2010) to the tree species of the Pacific Northwest in a recent climate change vulnerability assessment (Devine et al. 2012). We describe our modified version of the ForGRAS model, and we review the model's basic...
Hsiao, Ya-Shan; Narhe, Bharat D; Chang, Ying-Sheng; Sun, Chung-Ming
2013-10-14
A one-pot, two-step synthesis of imidazo[1,2-a]benzimidazoles has been achieved by a three-component reaction of 2-aminobenzimidazoles with an aromatic aldehyde and an isocyanide. The condensation of 2-aminobenzimidazole with an aldehyde is run under microwave activation and basic conditions to generate an imine intermediate, which then undergoes [4 + 1] cycloaddition with an isocyanide.
Basic Lessons in ORA and AutoMap 2011
2011-06-13
A small legend also appears. Below is a screen capture showing the visualization of the agent x event graph from the Stargate Summit Meta-Network. ... The visualization displays the connections between all items in the Stargate Summit Meta-Network. The red circles represent the agents. ... It takes the examples I used for the Stargate dataset. ... Lessons 201-207: a step-by-step run-through of creating the Meta-Network from ...
Basic Lessons in *ORA and Automap 2009
2009-06-01
... screen capture showing the visualization of the agent x event graph from the Stargate Summit Meta-Network. The visualization displays the connections ... for the Stargate dataset. ... Lessons 201-207: a step-by-step run-through of creating the Meta-Network, from working with Excel to exporting data to ... For the purpose of creating the Stargate Meta-Network, the two-episode story arc (Summit / Last Stand) was chosen as the basis for all the nodes.
On the ability of consumer electronics microphones for environmental noise monitoring.
Van Renterghem, Timothy; Thomas, Pieter; Dominguez, Frederico; Dauwe, Samuel; Touhafi, Abdellah; Dhoedt, Bart; Botteldooren, Dick
2011-03-01
The massive production of microphones for consumer electronics, and the shift from dedicated processing hardware to PC-based systems, open the way to build affordable, extensive noise measurement networks. Applications include, for example, noise-limit and urban-soundscape monitoring, and the validation of calculated noise maps. Microphones are the critical components of such a network. Therefore, in a first step, some basic characteristics of 8 microphones, distributed over a wide range of price classes, were measured in a standardized way in an anechoic chamber. In a next step, a thorough evaluation was made of the suitability of these microphones for environmental noise monitoring. This was done during a continuous half-year outdoor experiment, characterized by a wide variety of meteorological conditions. While some microphones failed during the course of this test, it was shown that it is possible to identify cheap microphones that correlate highly with the reference microphone during the full test period. When the deviations are expressed in total A-weighted (road traffic) noise levels, values of less than 1 dBA are obtained, over and above the deviation among the reference microphones themselves.
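The deviation figures above fold the measured spectra into a single A-weighted level. As a rough illustration of that step, the sketch below applies the standard IEC 61672 A-weighting curve to a set of octave-band levels and sums them energetically into a total dBA value; the band levels are hypothetical placeholders, not measurements from the study.

```python
import math

def a_weighting_db(f_hz: float) -> float:
    """Standard A-weighting gain in dB at frequency f_hz (IEC 61672)."""
    f2 = f_hz ** 2
    ra = (12194.0 ** 2 * f2 ** 2) / (
        (f2 + 20.6 ** 2)
        * math.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2))
        * (f2 + 12194.0 ** 2)
    )
    return 20.0 * math.log10(ra) + 2.0  # +2.0 dB normalizes the gain to 0 dB at 1 kHz

def total_a_level(band_levels_db, band_centers_hz):
    """Energetically sum octave-band SPLs after A-weighting into a total dBA."""
    total = sum(10.0 ** ((level + a_weighting_db(f)) / 10.0)
                for level, f in zip(band_levels_db, band_centers_hz))
    return 10.0 * math.log10(total)

# Hypothetical road-traffic-like octave-band spectrum (dB SPL):
centers = [63, 125, 250, 500, 1000, 2000, 4000, 8000]
levels = [70, 68, 65, 63, 62, 58, 52, 45]
print(round(total_a_level(levels, centers), 1), "dBA")
```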
Finset, Arnstein; Mjaaland, Trond A
2009-03-01
To present a model of the medical consultation as a value chain, and to apply a neurobehavioral perspective to analyze each element in the chain with relevance for emotion regulation. Current knowledge on four elements in medical consultations and neuroscientific evidence on corresponding basic processes are selectively reviewed. The four elements of communication behaviour presented as steps in a value chain model are: (1) establishing rapport, (2) patient disclosure of emotional cues and concerns, (3) the doctor's expression of empathy, and (4) positive reappraisal of concerns. The metaphor of the value chain, with its emphasis on goal orientation, helps to explain the impact of each communicative element on the outcome of the consultation. Added value at each step is proposed in terms of effects on outcome indicators, in this case the patient's affect regulation. Neurobehavioral mechanisms are suggested to explain the association between communication behaviour and affect-regulation outcome. The value chain metaphor and the emphasis on behaviour-outcome-mechanism associations may be of interest as conceptualizations for communication skills training.
Statistical methods of estimating mining costs
Long, K.R.
2011-01-01
Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
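"Taylor's Rule" is a power law, so re-estimating it amounts to an ordinary least-squares fit in log-log space. The sketch below illustrates that procedure on synthetic tonnage/rate pairs; the coefficients and data are placeholders, not the values estimated in the study.

```python
import numpy as np

# Minimal sketch: refit a power law  rate = a * tonnage^b  ("Taylor's Rule")
# by ordinary least squares in log-log space. The tonnage/rate pairs below
# are synthetic placeholders, not data from the paper.
rng = np.random.default_rng(0)
tonnage = 10.0 ** rng.uniform(5, 9, 40)                         # ore tonnage (t)
rate = 0.014 * tonnage ** 0.75 * 10 ** rng.normal(0, 0.1, 40)   # operating rate (t/day)

b, log_a = np.polyfit(np.log10(tonnage), np.log10(rate), 1)
print(f"rate = {10 ** log_a:.4f} * tonnage^{b:.2f}")
```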
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neufeld, R. D.; Bern, J.; Erdogan, H.
1979-11-15
Activities are underway to investigate basic phenomena that would assist demonstration- and commercial-sized coal conversion facilities in the environmentally acceptable disposal of process solid waste residuals. The approach taken is to consider only those residuals coming from the conversion technology itself, i.e., from the gasification, liquefaction, and hot clean-up steps, as well as residuals from the wastewater treatment train. Residuals from the coal mining and coal grinding steps will not be considered in detail, since those materials are being handled in some manner in the private sector. Laboratory evaluations have been conducted on solid waste samples of fly ash from an existing Capman gasifier. ASTM-A and EPA-EP leaching procedures have been completed on sieved size fractions of the above wastes. Data indicate that smaller size fractions pose greater contamination potential than do larger size particles, with a transition zone occurring at particle sizes of about 0.05 inches in diameter. Ames testing of such residuals is reported. Similar studies are under way with samples of H-Coal solid waste residuals.
Kircher, Stefan; Wellmer, Frank; Nick, Peter; Rügner, Alexander; Schäfer, Eberhard; Harter, Klaus
1999-01-01
In plants, light perception by photoreceptors leads to differential expression of an enormous number of genes. An important step in differential gene expression is the regulation of transcription factor activities. To understand these processes in light signal transduction, we analyzed the three well-known members of the common plant regulatory factor (CPRF) family from parsley (Petroselinum crispum). Here, we demonstrate that these CPRFs, which belong to the basic-region leucine-zipper (bZIP) domain-containing transcription factors, are differentially distributed within parsley cells, indicating different regulatory functions within the regulatory networks of the plant cell. In particular, we show by cell fractionation and immunolocalization approaches that CPRF2 is transported from the cytosol into the nucleus upon irradiation due to the action of phytochrome photoreceptors. Two NH2-terminal domains responsible for the cytoplasmic localization of CPRF2 in the dark were characterized by deletion analysis using a set of CPRF2-green fluorescent protein (GFP) gene fusion constructs transiently expressed in parsley protoplasts. We suggest that light-induced nuclear import of CPRF2 is an essential step in phytochrome signal transduction. PMID:9922448
Portable and Error-Free DNA-Based Data Storage.
Yazdi, S M Hossein Tabatabaei; Gabrys, Ryan; Milenkovic, Olgica
2017-07-10
DNA-based data storage is an emerging nonvolatile memory technology of potentially unprecedented density, durability, and replication efficiency. The basic system implementation steps include synthesizing DNA strings that contain user information and subsequently retrieving them via high-throughput sequencing technologies. Existing architectures enable reading and writing but do not offer random-access and error-free data recovery from low-cost, portable devices, which is crucial for making the storage technology competitive with classical recorders. Here we show for the first time that a portable, random-access platform may be implemented in practice using nanopore sequencers. The novelty of our approach is to design an integrated processing pipeline that encodes data to avoid costly synthesis and sequencing errors, enables random access through addressing, and leverages efficient portable sequencing via new iterative alignment and deletion error-correcting codes. Our work represents the only known random access DNA-based data storage system that uses error-prone nanopore sequencers, while still producing error-free readouts with the highest reported information rate/density. As such, it represents a crucial step towards practical employment of DNA molecules as storage media.
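The addressing idea can be illustrated with a toy encoder: map two bits to each nucleotide and prefix every block with a fixed-length address. This sketch deliberately omits the constrained and deletion-correcting codes the actual pipeline relies on; the block sizes and helper names are hypothetical.

```python
# Toy sketch of address-based DNA encoding: 2 bits per nucleotide plus a
# fixed-length block address as a prefix. Real systems, including the one
# described above, add run-length constraints and error-correcting codes,
# which are omitted here.
BASE = "ACGT"

def bits_to_dna(bits: str) -> str:
    """Map each pair of bits to one nucleotide (00->A, 01->C, 10->G, 11->T)."""
    return "".join(BASE[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2))

def encode_blocks(payload: bytes, block_bytes: int = 4, addr_nt: int = 4):
    """Split payload into blocks, each stored as: address prefix + data."""
    strands = []
    for offset in range(0, len(payload), block_bytes):
        chunk = payload[offset:offset + block_bytes]
        data_bits = "".join(f"{b:08b}" for b in chunk)
        addr_bits = f"{offset // block_bytes:0{2 * addr_nt}b}"
        strands.append(bits_to_dna(addr_bits) + bits_to_dna(data_bits))
    return strands

for strand in encode_blocks(b"DNA storage!"):
    print(strand)
```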
Sakurai, Yasuteru
2015-01-01
Ebola virus is an enveloped virus with a filamentous structure that causes severe hemorrhagic fever in human and nonhuman primates. Host cell entry is the first essential step in the viral life cycle and has been extensively studied as a therapeutic target. The viral factor of cell entry is the surface glycoprotein (GP), the only viral protein essential for this step, together with the unique particle structure. The virus also interacts with many host factors to successfully enter host cells. Ebola virus first binds to cell surface proteins and internalizes into cells, followed by trafficking through endosomal vesicles to intracellular acidic compartments. There, host proteases process GPs, which can then interact with an intracellular receptor. Then, under appropriate circumstances, the viral and endosomal membranes are fused, a step enhanced by major structural changes of GPs, to complete host cell entry. Recently, basic research on the mechanism of Ebola virus infection has progressed markedly, aided largely by the identification of host factors and detailed structural analyses of GPs. This article highlights the mechanism of Ebola virus host cell entry, including recent findings.
NASA Technical Reports Server (NTRS)
Cotton, William B.; Hilb, Robert; Koczo, Stefan, Jr.; Wing, David J.
2016-01-01
A set of five developmental steps building from the NASA TASAR (Traffic Aware Strategic Aircrew Requests) concept is described, each step providing incrementally more efficiency and capacity benefits to airspace system users and service providers, culminating in a Full Airborne Trajectory Management capability. For each of these steps, the incremental Operational Hazards and Safety Requirements are identified for later use in future formal safety assessments intended to lead to certification and operational approval of the equipment and the associated procedures. Two established safety assessment methodologies that are compliant with the FAA's Safety Management System were used, leading to Failure Effects Classifications (FEC) for each of the steps. The most likely FEC for the first three steps, Basic TASAR, Digital TASAR, and 4D TASAR, is "No effect". For step four, Strategic Airborne Trajectory Management, the likely FEC is "Minor". For Full Airborne Trajectory Management (Step 5), the most likely FEC is "Major".
Chen, Wentao; Dong, Jiajia; Li, Suhua; Liu, Yu; Wang, Yujia; Yoon, Leonard; Wu, Peng; Sharpless, K Barry; Kelly, Jeffery W
2016-01-26
Tyrosine O-sulfation is a common protein post-translational modification that regulates many biological processes, including leukocyte adhesion and chemotaxis. Many peptides with therapeutic potential contain one or more sulfotyrosine residues. We report a one-step synthesis for Fmoc-fluorosulfated tyrosine. An efficient Fmoc-based solid-phase peptide synthetic strategy is then introduced for incorporating the fluorosulfated tyrosine residue into peptides of interest. Standard simultaneous peptide-resin cleavage and removal of the acid-labile side-chain protecting groups affords the crude peptides containing fluorosulfated tyrosine. Basic ethylene glycol, serving both as solvent and reactant, transforms the fluorosulfated tyrosine peptides into sulfotyrosine peptides in high yield. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Lee, B S; Choi, S; Yoon, J H; Park, J Y; Won, M S
2012-02-01
A magnet system for a 28 GHz electron cyclotron resonance ion source is being developed by the Korea Basic Science Institute. The configuration of the magnet system consists of 3 solenoid coils for a mirror magnetic field and 6 racetrack coils for a hexapole magnetic field. They can generate axial magnetic fields of 3.6 T at the beam injection part and 2.2 T at the extraction part. A radial magnetic field of 2.1 T is achievable at the plasma chamber wall. A step-type winding process was employed in fabricating the hexapole coil. The winding technique was confirmed through repeated cooling tests. The superconducting magnets and cryostat system are currently being manufactured.
Craig, Sandra
2011-01-01
Carbohydrates in various forms play a vital role in numerous critical biological processes. The detection of such saccharides can give insight into the progression of diseases such as cancer. Boronic acids react with 1,2- and 1,3-diols of saccharides in non-aqueous or basic aqueous media. Herein, we describe the design, synthesis and evaluation of three bisboronic acid fluorescent probes, each requiring about ten linear steps in its synthesis. Among the compounds evaluated, 9b was shown to selectively label HepG2, a liver carcinoma cell line, within a concentration range of 0.5–10 μM, in comparison to COS-7, a normal fibroblast cell line. PMID:22177855
Designed electromagnetic pulsed therapy: clinical applications.
Gordon, Glen A
2007-09-01
First reduced to science by Maxwell in 1865, electromagnetic technology as therapy received little interest from basic scientists or clinicians until the 1980s. It now promises applications that include mitigation of inflammation (electrochemistry) and stimulation of classes of genes following the onset of illness and injury (electrogenomics). The use of electromagnetism to stop inflammation and restore tissue follows a logical sequence: stop the inflammation, then upregulate classes of restorative gene loci to initiate healing. Studies in the fields of MRI and NMR have aided the understanding of the cell response to low-energy EMF inputs via electromagnetically responsive elements. By understanding how proteins process information to direct energy, we can maximize the technology to aid restorative intervention, a promising step forward over current paradigms of therapy.
3D Product Development for Loose-Fitting Garments Based on Parametric Human Models
NASA Astrophysics Data System (ADS)
Krzywinski, S.; Siegmund, J.
2017-10-01
Researchers and commercial suppliers worldwide pursue the objective of achieving a more transparent garment construction process that is computationally linked to a virtual body, in order to save development costs over the long term. The current aim is not to transfer the complete pattern-making step to a 3D design environment, but to work out basic constructions in 3D that provide an excellent fit, owing to their accurate construction and morphological pattern grading (automatic change of sizes in 3D) with respect to sizes and body types. After computer-aided derivation of the 2D pattern parts, these can be made available to the industry as a basis on which to create more fashionable variations.
Onishchenko, G G; Bragina, I V; Ezhlova, E B; Demina, V P; Gorskiĭ, A A; Gus'kov, A S; Aksenova, O I; Ivanov, G E; Klindukhov, V P; Nikolaevich, P N; Grechanaia, T B; Kulichenko, A N; Maletskaia, O V; Manin, E A; Parkhomenko, V V; Kulichenko, O A
2015-01-01
The paper generalizes the experience of forming a system of protection against biological threats and of ensuring sanitary and epidemiological welfare during preparation for the XXII Olympic Winter Games and XI Paralympic Winter Games of 2014 in Sochi. The basic steps in creating this system since 2007, and the participation and role of Rospotrebnadzor in this process, are shown. The paper deals with such questions as the interaction of governmental and administrative structures with federal agencies, the development of a regulatory framework governing the safety system of the Olympic Games, the development of algorithms for information exchange and management decisions, and biological safety in the developing infrastructure in Sochi.
Comulang: towards a collaborative e-learning system that supports student group modeling.
Troussas, Christos; Virvou, Maria; Alepis, Efthimios
2013-01-01
This paper describes an e-learning system that is expected to further enhance the educational process in computer-based tutoring systems by incorporating collaboration between students and work in groups. The resulting system is called "Comulang", and a multiple-language learning system is used as a test bed for its effectiveness. Collaboration is supported by a user modeling module that is responsible for the initial creation of student clusters, from which, as a next step, working groups of students are created. A machine learning clustering algorithm works towards group formation, so that cooperation between students from different clusters is achieved. One of the resulting system's basic aims is to provide efficient student groups whose limitations and capabilities are well balanced.
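One plausible reading of the clustering-then-grouping step is sketched below: cluster student-model feature vectors with k-means, then assemble working groups round-robin so that each group draws members from different clusters. The features, counts, and parameters are hypothetical, not Comulang's actual student model.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch of the group-formation idea: cluster student-model feature vectors,
# then build groups that mix members from different clusters.
rng = np.random.default_rng(1)
features = rng.random((30, 3))   # e.g. vocabulary, grammar, participation scores

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Round-robin across clusters so each group mixes ability profiles.
by_cluster = [list(np.flatnonzero(clusters == c)) for c in range(3)]
groups = [[pool[i] for pool in by_cluster if i < len(pool)]
          for i in range(max(map(len, by_cluster)))]
print(groups[:3])  # each group contains at most one student per cluster
```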
Testing the World with Simulations.
ERIC Educational Resources Information Center
Roberts, Nancy
1983-01-01
Explains the three main concepts of the system dynamics approach to model building (dynamics, feedback, and systems) and the basic steps to problem solving by simulation applicable to all educational levels. Some DYNAMO commands are briefly described. (EAO)
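DYNAMO expresses a model as level (stock) and rate equations advanced in fixed time steps. As a minimal illustration of that idea (not of DYNAMO syntax itself), the sketch below advances a single stock with a feedback loop through two rates; all parameter values are made up.

```python
# Minimal Python analogue of a DYNAMO level/rate model: a single stock
# (population) with a feedback loop through birth and death rates.
# Parameter values are illustrative only.
DT, STEPS = 0.25, 40
pop = 1000.0                        # level (stock)
birth_frac, death_frac = 0.04, 0.02

for step in range(STEPS):
    births = birth_frac * pop       # rate equations (feedback on the level)
    deaths = death_frac * pop
    pop += DT * (births - deaths)   # cf. DYNAMO: L POP.K = POP.J + DT*(BR.JK - DR.JK)
    if step % 8 == 0:
        print(f"t={step * DT:5.2f}  pop={pop:8.1f}")
```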
Fischer, Fabian; Zufferey, Géraldine; Sugnaux, Marc; Happe, Manuel
2015-01-01
Phosphate was remobilised from iron phosphate contained in digested sewage sludge using a bio-electric cell. A significant acceleration over former results was achieved with strongly basic catholytes. For these experiments, a dual-chambered microbial electrolysis cell with a small cathode (40 mL) and an 80-times-larger anode (2.5 L) was equipped with a platinum-sputtered reticulated vitreous carbon cathode. Various applied voltages (0.2-6.0 V) generated moderately to strongly basic catholytes using artificial waste water with a pH close to neutral. Phosphate from iron phosphate contained in digested sewage sludge was remobilised most effectively at pH ∼13, with up to 95% yield. Besides minor electrochemical reduction, hydroxyl substitution was the dominant remobilisation mechanism. Particle-fluid kinetics using the "shrinking core" model allowed us to determine the rate-controlling step. Reaction rates changed with temperature (15-40 °C), and an activation energy of Ea = 55 kJ mol(-1) was found. These analyses indicated chemical and physical reaction control, which is of interest for future scale-up work. Phosphate remobilisation rates increased significantly, yields doubled, and recovered PO4(3-) concentrations increased four-fold using a task-specific bio-electric system. The result is a sustainable process for decentralized phosphate mining and a green chemical base generator useful for many other sustainable processing needs.
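The reported activation energy follows from a standard Arrhenius analysis: fitting ln k against 1/T gives a slope of -Ea/R. The sketch below reproduces that calculation on synthetic rate constants generated to embed Ea = 55 kJ/mol; it is not the paper's data.

```python
import numpy as np

# Standard Arrhenius fit behind an activation-energy estimate:
# ln k = ln A - Ea/(R*T), so the slope of ln k vs 1/T is -Ea/R.
R = 8.314                                        # J mol^-1 K^-1
T = np.array([288.15, 298.15, 308.15, 313.15])   # 15-40 degC in kelvin
k = 2.0e5 * np.exp(-55_000.0 / (R * T))          # synthetic rate constants

slope, _ = np.polyfit(1.0 / T, np.log(k), 1)
print(f"Ea = {-slope * R / 1000:.1f} kJ/mol")    # recovers 55.0
```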
A fast, parallel algorithm to solve the basic fluvial erosion/transport equations
NASA Astrophysics Data System (ADS)
Braun, J.
2012-04-01
Quantitative models of landform evolution are commonly based on the solution of a set of equations representing the processes of fluvial erosion, transport and deposition, which leads to predictions of the geometry of a river channel network and its evolution through time. The river network is often regarded as the backbone of any surface processes model (SPM) that might include other physical processes acting at a range of spatial and temporal scales along hill slopes. The basic laws of fluvial erosion require the computation of local (slope) and non-local (drainage area) quantities at every point of a given landscape, a computationally expensive operation which limits the resolution of most SPMs. I present here an algorithm to compute the various components required in the parameterization of fluvial erosion (and transport), and thus to solve the basic fluvial geomorphic equation, that is very efficient because it is O(n) (the number of required arithmetic operations is linearly proportional to the number of nodes defining the landscape) and fully parallelizable (the computational cost decreases in direct inverse proportion to the number of processors used to solve the problem). The algorithm is ideally suited for use on the latest multi-core processors. Using this new technique, geomorphic problems can be solved at an unprecedented resolution (typically of the order of 10,000 × 10,000 nodes) while keeping the computational cost reasonable (on the order of 1 s per time step). Furthermore, I will show that the algorithm is applicable to any regular or irregular representation of the landform, and is such that the temporal evolution of the landform can be discretized by a fully implicit time-marching algorithm, making it unconditionally stable. I will demonstrate that such an efficient algorithm is ideally suited to produce a fully predictive SPM that links observationally based parameterizations of small-scale processes to the evolution of large-scale features of landscapes on geological time scales. It can also be used to model surface processes at the continental or planetary scale and be linked to lithospheric or mantle flow models to predict the potential interactions between tectonics driving surface uplift in orogenic areas, mantle flow producing dynamic topography on continental scales, and surface processes.
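The O(n) character of such algorithms typically rests on single-receiver flow routing: each node drains to exactly one neighbour, so drainage area can be accumulated in one pass over a suitable node ordering. The sketch below illustrates this general idea on a made-up six-node network; it is an illustration, not the author's implementation.

```python
# Minimal sketch of O(n) drainage-area accumulation under single-receiver
# flow routing: every node drains to exactly one receiver, so one pass over
# a suitable node ordering suffices. The tiny network below is made up.
n = 6
receiver = [2, 2, 4, 4, 5, 5]   # node i drains to receiver[i]; node 5 is base level (self)
cell_area = [1.0] * n

# Traverse from the base-level node outward (downstream to upstream).
children = [[] for _ in range(n)]
for i, r in enumerate(receiver):
    if i != r:
        children[r].append(i)
stack, order = [5], []
while stack:
    node = stack.pop()
    order.append(node)
    stack.extend(children[node])

# Processing the traversal in reverse visits every node before its receiver,
# so each cell's area is passed downstream exactly once: O(n) overall.
area = cell_area[:]
for node in reversed(order):
    if receiver[node] != node:
        area[receiver[node]] += area[node]
print(area)   # per-node drainage area; the base-level node collects the whole domain
```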
TRUST84. Sat-Unsat Flow in Deformable Media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narasimhan, T.N.
1984-11-01
TRUST84 solves for transient and steady-state flow in variably saturated deformable media in one, two, or three dimensions. It can handle porous media, fractured media, or fractured-porous media. Boundary conditions may be an arbitrary function of time. Sources or sinks may be a function of time or of potential. The theoretical model considers a general three-dimensional field of flow in conjunction with a one-dimensional vertical deformation field. The governing equation expresses the conservation of fluid mass in an elemental volume that has a constant volume of solids. Deformation of the porous medium may be nonelastic. Permeability and the compressibility coefficients may be nonlinearly related to effective stress. Relationships between permeability and saturation with pore water pressure in the unsaturated zone may be characterized by hysteresis. The relation between pore pressure change and effective stress change may be a function of saturation. The basic calculational model of the conductive heat transfer code TRUMP is applied in TRUST84 to the flow of fluids in porous media. The model combines an integrated finite difference algorithm for numerically solving the governing equation with a mixed explicit-implicit iterative scheme in which the explicit changes in potential are first computed for all elements in the system, after which implicit corrections are made only for those elements for which the stable time-step is less than the time-step being used. Time-step sizes are automatically controlled to optimize the number of iterations, to control the maximum change in potential during a time-step, and to obtain desired output information. Time derivatives, estimated on the basis of system behavior during the two previous time-steps, are used to start the iteration process and to evaluate nonlinear coefficients. Both heterogeneity and anisotropy can be handled.
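As a schematic illustration of the mixed explicit-implicit idea (not the actual TRUMP/TRUST84 scheme), the toy below advances a 1-D diffusion problem explicitly everywhere, then applies iterative backward-Euler corrections only to the cells whose local stable time step is exceeded; geometry, coefficients, and tolerances are arbitrary.

```python
import numpy as np

# Toy mixed explicit-implicit stepping: explicit pass for all cells, then
# implicit (backward-Euler) corrections only for flagged "stiff" cells.
n, dt, n_steps = 20, 0.4, 50
alpha = np.where(np.arange(n) < 10, 0.5, 2.0)  # diffusivity; stiff right half
u = np.zeros(n)
u[0] = 1.0                                     # fixed boundary potential

dt_stable = 0.5 / alpha                        # per-cell explicit limit (dx = 1)
flagged = dt > dt_stable                       # cells needing implicit correction
flagged[[0, -1]] = False                       # boundary cells stay fixed
idx = np.flatnonzero(flagged)

for _ in range(n_steps):
    lap = np.zeros(n)
    lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]
    u_new = u + dt * alpha * lap               # explicit pass, all cells
    for _ in range(50):                        # iterative implicit corrections
        prev = u_new.copy()
        u_new[idx] = (u[idx] + dt * alpha[idx] * (prev[idx - 1] + prev[idx + 1])) \
                     / (1.0 + 2.0 * dt * alpha[idx])
        if np.max(np.abs(u_new - prev)) < 1e-10:
            break
    u = u_new
print(np.round(u, 3))
```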
Cost analysis of oxygen recovery systems
NASA Technical Reports Server (NTRS)
Yakut, M. M.
1973-01-01
The design and development of equipment for flight use in earth-orbital programs, when approached in a cost-effective manner, proceed through the following logical progression: (1) bench testing of breadboard designs, (2) fabrication and evaluation of prototype equipment, (3) redesign to meet flight-imposed requirements, and (4) qualification and testing of a flight-ready system. Each of these steps is intended to produce the basic design information necessary to progress to the next step. The cost of each step is normally substantially less than that of the following step. An evaluation of the cost elements involved in each of the steps and their impact on total program cost is presented. Cost analyses are described for four leading oxygen recovery subsystems: two carbon dioxide reduction subsystems, Sabatier and Bosch, and two water electrolysis subsystems, the solid polymer electrolyte and the circulating KOH electrolyte.
Semantic features of 'stepped' versus 'continuous' contours in German intonation.
Dombrowski, Ernst
2013-01-01
This study analyses the meaning spaces of German pitch contours using two modes of melodic movement: continuous or in steps of sustained pitch. Both the continuous and stepped movements are represented by a set of five basic patterns, the latter being derived from the former. Thirty-six German native speakers judged the pattern sets on a 12-scale semantic differential. The semantic profiles confirm that stepped contours can be conceived of as stylized intonation, in a formal as well as in a functional sense. On the one hand, continuous (non-stylized) and stepped (stylized) contours are assigned different overall meanings (especially on the scales astonished - commonplace and interested - not interested). On the other hand, listeners organize the two contour sets in a similar fashion, which speaks in favour of parallel pattern inventories of continuous and stepped movement, respectively. However, the meaning space of the stylized patterns is affected by formal restrictions, for instance in the step transformation of continuous rises. © 2014 S. Karger AG, Basel.
Keijsers, Carolina J P W; Segers, Wieke S; de Wildt, Dick J; Brouwers, Jacobus R B J; Keijsers, Loes; Jansen, Paul A F
2015-06-01
The only validated tool for pharmacotherapy education for medical students is the 6-step method of the World Health Organization. It has proven effective in experimental studies with short-term interventions. The generalizability of this effect after implementation in a context-rich medical curriculum was investigated. The pharmacology knowledge and pharmacotherapy skills of cohorts of students from the years before, during and after implementation of a WHO-6-step-based integrated learning programme were tested using a standardized assessment containing 50 items covering knowledge of basic (n = 25) and clinical (n = 24) pharmacology, and pharmacotherapy skills (n = 1 open question). All scores are expressed as a percentage of the maximum score possible per (sub)domain. In total, 1652 students were included between September 2010 and July 2014 (participation rate 89%). The WHO-6-step-based learning programme improved students' knowledge of basic pharmacology (mean score ± SD, 60.6 ± 10.5% vs. 63.4 ± 10.9%, P < 0.01) and clinical or applied pharmacology (63.7 ± 10.4% vs. 67.4 ± 10.3%, P < 0.01), and improved their pharmacotherapy skills (68.8 ± 26.1% vs. 74.6 ± 22.9%, P = 0.02). Moreover, satisfaction with education increased (5.7 ± 1.3 vs. 6.3 ± 1.0 on a 10-point scale, P < 0.01), as did students' confidence in daily practice (from -0.81 ± 0.72 to -0.50 ± 0.79 on a -2 to +2 scale, P < 0.01). The WHO-6-step method was successfully implemented in a medical curriculum. In this observational study, the integrated learning programme had positive effects on students' knowledge of basic and applied pharmacology, improved their pharmacotherapy skills, and increased satisfaction with education and self-confidence in prescribing. Whether this training method leads to better patient care remains to be established. © 2015 The British Pharmacological Society.
NASA Technical Reports Server (NTRS)
Hepp, Aloysius F.; Kulis, Michael J.; Psarras, Peter C.; Ball, David W.; Timko, Michael T.; Wong, Hsi-Wu; Peck, Jay; Chianelli, Russell R.
2014-01-01
Transportation fuels production (including aerospace propellants) from non-traditional sources (gases, waste materials, and biomass) has been an active area of research and development for decades. Reducing terrestrial waste streams simultaneously with energy conversion, plentiful biomass, new low-cost methane sources, and/or extra-terrestrial resource harvesting and utilization present significant technological and business opportunities being realized by a new generation of visionary entrepreneurs. We examine several new approaches to catalyst fabrication and new processing technologies to enable utilization of these non-traditional raw materials. Two basic processing architectures are considered: a single-stage pyrolysis approach that seeks to basically recycle hydrocarbons with minimal net chemistry, or a two-step paradigm that involves production of supply or synthesis gas (mainly carbon oxides and H2) followed by production of fuel(s) via Sabatier or methanation reactions and/or Fischer-Tropsch synthesis. Optimizing the fraction of the product stream relevant to targeted aerospace (and other transportation) fuels via modeling, catalyst fabrication and novel reactor design is described. Energy utilization is a concern for production of fuels for either terrestrial or space operations; renewable sources based on solar energy and/or energy-efficient processes may be mission enabling. Another important issue is minimizing impurities in the product stream(s), especially those potentially posing risks to personnel or operations through (catalyst) poisoning or (equipment) damage. Technologies being developed to remove (and/or recycle) heteroatom impurities are briefly discussed, as well as the development of chemically robust catalysts whose activities are not diminished during operation. The potential impacts on future missions of such new approaches, as well as balance-of-system issues, are addressed.
Minimalist Design of Allosterically Regulated Protein Catalysts.
Makhlynets, O V; Korendovych, I V
2016-01-01
Nature facilitates chemical transformations with exceptional selectivity and efficiency. Despite tremendous progress in understanding and predicting protein function, the overall problem of designing a protein catalyst for a given chemical transformation is far from solved. Over the years, many design techniques with various degrees of complexity and rational input have been developed. The minimalist approach to protein design, which focuses on the bare minimum requirements for achieving activity, presents several important advantages. By focusing on basic physicochemical properties and the strategic placement of only a few highly active residues, one can feasibly evaluate in silico a very large variety of possible catalysts. In more general terms, the minimalist approach looks for the mere possibility of catalysis rather than trying to identify the most active catalyst possible. Even very basic designs that utilize a single residue introduced into nonenzymatic proteins or peptide bundles are surprisingly active. Because of the inherent simplicity of the minimalist approach, computational tools greatly enhance its efficiency. No complex calculations need to be set up, and even a beginner can master this technique in a very short time. Here, we present a step-by-step protocol for minimalist design of functional proteins using basic, easily available, and free computational tools. © 2016 Elsevier Inc. All rights reserved.
An Overview of Cloud Implementation in the Manufacturing Process Life Cycle
NASA Astrophysics Data System (ADS)
Kassim, Noordiana; Yusof, Yusri; Hakim Mohamad, Mahmod Abd; Omar, Abdul Halim; Roslan, Rosfuzah; Aryanie Bahrudin, Ida; Ali, Mohd Hatta Mohamed
2017-08-01
The advancement of information and communication technology (ICT) has changed the structure and functions of various sectors, and it has also started to play a significant role in modern manufacturing in terms of computerized machining and cloud manufacturing. It is important for industries to keep up with the current trend of ICT in order to survive and be competitive. Cloud manufacturing is an approach that seeks to realize real-world manufacturing processes by applying the basic concepts of cloud computing to the manufacturing domain; it is also called cloud-based manufacturing (CBM). Cloud manufacturing has been recognized as a new paradigm for manufacturing businesses. In cloud manufacturing, manufacturing companies need to support flexible and scalable business processes on the shop floor as well as in the software itself. This paper provides an overview of the implementation of cloud manufacturing in modern manufacturing processes, analyses the requirements regarding process enactment for cloud manufacturing, and proposes a STEP-NC concept that can serve as a tool to support the cloud manufacturing concept.
Guide to Developing an Environmental Management System - Act
This page takes you through the basic steps (Plan, Do, Check, Act) of building an Environmental Management System (EMS) as they are outlined in the 2001 Second Edition of Environmental Management Systems: An Implementation Guide. Act section.
Guide to Developing an Environmental Management System - Plan
This page takes you through the basic steps (Plan, Do, Check, Act) of building an Environmental Management System (EMS) as they are outlined in the 2001 Second Edition of Environmental Management Systems: An Implementation Guide. Plan section.
Systematic reviews in the field of nutrition
USDA-ARS?s Scientific Manuscript database
Systematic reviews are valuable tools for staying abreast of evolving nutrition and aging -related topics, formulating dietary guidelines, establishing nutrient reference intakes, formulating clinical practice guidance, evaluating health claims, and setting research agendas. Basic steps of conductin...
Guide to Developing an Environmental Management System - Check
This page takes you through the basic steps (Plan, Do, Check, Act) of building an Environmental Management System (EMS) as they are outlined in the 2001 Second Edition of Environmental Management Systems: An Implementation Guide. Check section.
4 Basic Steps to Food Safety at Home
... °F. Ground beef, pork, lamb: 160 °F; turkey, chicken, duck: 165 °F. Use a food thermometer to ... Foodborne illness can cause you to feel like you have the flu and can also cause serious health problems, ...
Regional Classification of Traditional Japanese Folk Songs
NASA Astrophysics Data System (ADS)
Kawase, Akihiro; Tokosumi, Akifumi
In this study, we focus on the melodies of Japanese folk songs and examine the basic structures that represent the characteristics of different regions. We sample the five largest song genres within the music corpora of the Nihon Min-yo Taikan (Anthology of Japanese Folk Songs), consisting of 202,246 tones from 1,794 song pieces from 45 prefectures in Japan. We then calculate, within 11 regions, the probabilities of the 24 transition patterns that fill the interval of the perfect fourth, the interval that accounts for most of the one-step and two-step pitch transitions, in order to determine the parameters for cluster analysis. As a result, we classify the regions into two basic groups, eastern Japan and western Japan, a division that corresponds to geographical factors and cultural backgrounds and also matches accent distributions in the Japanese language.
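The pipeline described, per-region transition-pattern probabilities feeding a cluster analysis, can be sketched as follows. The counts are random placeholders standing in for the Nihon Min-yo Taikan tallies, and Ward linkage is an assumption, as the abstract does not name the linkage criterion.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Sketch: estimate per-region probabilities of pitch-transition patterns,
# then cluster regions on those probability vectors.
rng = np.random.default_rng(2)
n_regions, n_patterns = 11, 24
counts = rng.integers(1, 200, size=(n_regions, n_patterns))   # placeholder tallies
probs = counts / counts.sum(axis=1, keepdims=True)            # transition probabilities

Z = linkage(probs, method="ward")                 # assumed linkage criterion
groups = fcluster(Z, t=2, criterion="maxclust")   # force a two-group (east/west-style) split
print(groups)
```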
Steady-State Density Functional Theory for Finite Bias Conductances.
Stefanucci, G; Kurth, S
2015-12-09
In the framework of density functional theory, a formalism to describe electronic transport in the steady state is proposed which uses the density on the junction and the steady current as basic variables. We prove that, in a finite window around zero bias, there is a one-to-one map between the basic variables and both the local potential on, and the bias across, the junction. The resulting Kohn-Sham system features two exchange-correlation (xc) potentials: a local xc potential and an xc contribution to the bias. For weakly coupled junctions the xc potentials exhibit steps in the density-current plane which are shown to be crucial to describe the Coulomb blockade diamonds. At small currents these steps emerge as the equilibrium xc discontinuity bifurcates. The formalism is applied to a model benzene junction, finding perfect agreement with the orthodox theory of Coulomb blockade.
Temporal differentiation and the optimization of system output
NASA Astrophysics Data System (ADS)
Tannenbaum, Emmanuel
2008-01-01
We develop two simplified dynamical models with which to explore the conditions under which temporal differentiation leads to increased system output. By temporal differentiation, we mean a division of labor whereby different subtasks associated with performing a given task are done at different times. The idea is that, by focusing on one particular set of subtasks at a time, it is possible to increase the efficiency with which each subtask is performed, thereby allowing for faster completion of the overall task. In the first model, we consider the filling and emptying of a tank in the presence of a time-varying resource profile. If a given resource is available, the tank may be filled at some rate r_f. As long as the tank contains a resource, it may be emptied at a rate r_e, corresponding to processing into some product, which is either the final product of a process or an intermediate that is transported for further processing. Given a resource-availability profile over some time interval T, we develop an algorithm for determining the fill-empty profile that produces the maximum quantity of processed resource at the end of the time interval. We rigorously prove that the basic algorithm is one where the tank is filled when a resource is available and emptied when a resource is not available. In the second model, we consider a process whereby some resource is converted into some final product in a series of three agent-mediated steps. Temporal differentiation is incorporated by allowing the agents to oscillate between performing the first two steps and performing the last step. We find that temporal differentiation is favored when the number of agents is at intermediate values and when there are process intermediates that have long lifetimes compared to other characteristic time scales in the system. Based on these results, we speculate that temporal differentiation may provide an evolutionary basis for the emergence of phenomena such as sleep, distinct REM and non-REM sleep states, and circadian rhythms in general. The essential argument is that in sufficiently complex biological systems, a maximal amount of information and tasks can be processed and completed if the system follows a temporally differentiated “work plan,” whereby the system focuses on one or a few tasks at a time.
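The proved optimal policy for the first model is easy to state in code: fill at r_f whenever the resource is available, otherwise empty at r_e while the tank is nonempty. The sketch below simulates that policy under an illustrative availability profile; the rates, capacity, and horizon are arbitrary.

```python
# Minimal simulation of the fill/empty policy the model proves optimal:
# fill at rate r_f whenever the resource is available, otherwise empty
# (process) at rate r_e while the tank is nonempty. All values illustrative.
r_f, r_e, capacity, dt = 2.0, 1.0, 5.0, 0.1
available = lambda t: int(t) % 2 == 0        # resource present on even seconds

tank, processed, t = 0.0, 0.0, 0.0
while t < 10.0:
    if available(t):
        tank = min(capacity, tank + r_f * dt)    # fill step
    elif tank > 0.0:
        out = min(tank, r_e * dt)                # empty/process step
        tank -= out
        processed += out
    t += dt
print(f"processed resource after 10 s: {processed:.2f}")
```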
Solar Process Heat Basics | NREL
Commercial and industrial buildings may use solar process heat, as do many nonresidential buildings. A typical system includes solar collectors that work along with a pump and a heat exchanger; one simple collector is a black metal panel mounted on a south-facing wall to absorb the sun's heat, with air passing through the panel.
Horigian, Viviana E; Anderson, Austen R; Szapocznik, José
2016-09-01
In this article, we review the research evidence generated over 40 years on Brief Strategic Family Therapy illustrating the NIH stages of intervention development and highlighting the translational process. Basic research (Stage 0) led to the discovery of the characteristics of the population and the nature of the problems that needed to be addressed. This step informed the selection of an intervention model that addressed the problems presented by the population, but in a fashion that was congruent with the population's culture, defined in terms of its value orientations. From this basic research, an intervention that integrated structural and strategic elements was selected and refined through testing (Stage I). The second stage of translation (Stage II) included efficacy trials of a specialized engagement module that responded to challenges to the provision of services. It also included several other efficacy trials that documented the effects of the intervention, mostly in research settings or with research therapists. Stages III/IV in the translational process led to the testing of the effectiveness of the intervention in real-world settings with community therapists and some oversight from the developer. This work revealed that an implementation/organizational intervention was required to achieve fidelity and sustainability of the intervention in real-world settings. The work is currently in Stage V in which new model development led to an implementation intervention that can ensure fidelity and sustainability. Future research will evaluate the effectiveness of the current implementation model in increasing adoption, fidelity, and long-term sustainability in real-world settings. © 2016 Family Process Institute.
Biogenesis of iron-sulfur clusters in mammalian cells: new insights and relevance to human disease
Rouault, Tracey A.
2012-01-01
Iron-sulfur (Fe-S) clusters are ubiquitous cofactors composed of iron and inorganic sulfur. They are required for the function of proteins involved in a wide range of activities, including electron transport in respiratory chain complexes, regulatory sensing, photosynthesis and DNA repair. The proteins involved in the biogenesis of Fe-S clusters are evolutionarily conserved from bacteria to humans, and many insights into the process of Fe-S cluster biogenesis have come from studies of model organisms, including bacteria, fungi and plants. It is now clear that several rare and seemingly dissimilar human diseases are attributable to defects in the basic process of Fe-S cluster biogenesis. Although these diseases – which include Friedreich’s ataxia (FRDA), ISCU myopathy, a rare form of sideroblastic anemia, an encephalomyopathy caused by dysfunction of respiratory chain complex I and multiple mitochondrial dysfunctions syndrome – affect different tissues, a feature common to many of them is that mitochondrial iron overload develops as a secondary consequence of a defect in Fe-S cluster biogenesis. This Commentary outlines the basic steps of Fe-S cluster biogenesis as they have been defined in model organisms. In addition, it draws attention to refinements of the process that might be specific to the subcellular compartmentalization of Fe-S cluster biogenesis proteins in some eukaryotes, including mammals. Finally, it outlines several important unresolved questions in the field that, once addressed, should offer important clues into how mitochondrial iron homeostasis is regulated, and how dysfunction in Fe-S cluster biogenesis can contribute to disease. PMID:22382365