Sample records for specific processing steps

  1. RNA splicing process analysis for identifying antisense oligonucleotide inhibitors with padlock probe-based isothermal amplification.

    PubMed

    Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang; Li, Jinghong

    2017-08-01

    RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing still remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5'-ASO could block RNA splicing by inhibiting the first step, while 3'-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs.

  2. An Application of Business Process Management to Health Care Facilities.

    PubMed

    Hassan, Mohsen M D

    The purpose of this article is to help health care facility managers and personnel identify significant elements of their facilities to address, and steps and actions to follow, when applying business process management to them. The ABPMP (Association of Business Process Management Professionals) life-cycle model of business process management is adopted, and steps from Lean, business process reengineering, and Six Sigma, and actions from operations management are presented to implement it. In business process management, managers of health care facilities can find an approach to improving their facilities that is more comprehensive than Lean, Six Sigma, business process reengineering, or ad hoc approaches, yet does not conflict with them, because many of their elements can be included under its umbrella. Furthermore, the suggested application of business process management can guide them in, and relieve them from, selecting among these approaches, as well as provide them with specific steps and actions that they can follow. This article fills a gap in the literature by presenting a much-needed comprehensive application of business process management to health care facilities, with specific steps and actions for implementation.

  3. Carp Collage

    ERIC Educational Resources Information Center

    Laux, David

    2009-01-01

    In this article, the author describes a metal-tooling project for his fourth-graders. Giving the students a specific subject with specific features and textures enabled him to guide them step by step through the metal-tooling process. This would be good practice even for high-school students before attempting other relief work. After…

  4. Developing the Instructional Specification. Monograph Number 12.

    ERIC Educational Resources Information Center

    Gerlach, Vernon S.; And Others

    Analysis of a film script that makes the learner identify and distinguish between statements of observation and statements of inference leads an instructor (or other program writer) step by step through the processes of preparing an instructional specification--a blue-print for preparing powerful self-instructional materials. There are four parts…

  5. Creating the Infrastructure for Rapid Application Development and Processing Response to the HIRDLS Radiance Anomaly

    NASA Astrophysics Data System (ADS)

    Cavanaugh, C.; Gille, J.; Francis, G.; Nardi, B.; Hannigan, J.; McInerney, J.; Krinsky, C.; Barnett, J.; Dean, V.; Craig, C.

    2005-12-01

    The High Resolution Dynamics Limb Sounder (HIRDLS) instrument onboard the NASA Aura spacecraft experienced a rupture of the thermal blanketing material (Kapton) during the rapid depressurization of launch. The Kapton draped over the HIRDLS scan mirror, severely limiting the aperture through which HIRDLS views space and Earth's atmospheric limb. In order for HIRDLS to achieve its intended measurement goals, rapid characterization of the anomaly and rapid recovery from it were required. The recovery centered on a new processing module inserted into the standard HIRDLS processing scheme, with a goal of minimizing the effect of the anomaly on the already existing processing modules. We describe the software infrastructure on which the new processing module was built, and how that infrastructure allows for rapid application development and processing response. The scope of the infrastructure spans three distinct anomaly recovery steps and the means for their intercommunication. Each of the three recovery steps (removing the Kapton-induced oscillation in the radiometric signal, removing the Kapton signal contamination upon the radiometric signal, and correcting for the partially obscured atmospheric view) is completely modularized and insulated from the other steps, allowing focused and rapid application development towards a specific step, and neutralizing unintended inter-step influences, thus greatly shortening the design-development-test lifecycle. The intercommunication is also completely modularized and has a simple interface to which the three recovery steps adhere, allowing easy modification and replacement of specific recovery scenarios, thereby improving the processing response.
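
The insulated, uniform-interface design described above can be sketched roughly as follows. The class names, the interface, and the numerical placeholders are invented for illustration; they are not the actual HIRDLS algorithms.

```python
# Sketch: each recovery step hides behind one small shared interface, so
# steps can be developed, tested, and swapped independently, and the only
# intercommunication is each step's output feeding the next step's input.

class RecoveryStep:
    """Common interface that every recovery step adheres to."""
    def apply(self, radiances):
        raise NotImplementedError

class RemoveOscillation(RecoveryStep):
    def apply(self, radiances):
        # placeholder for removing the Kapton-induced oscillation
        return [r - 0.1 for r in radiances]

class RemoveContamination(RecoveryStep):
    def apply(self, radiances):
        # placeholder for removing the Kapton signal contamination
        return [r * 0.95 for r in radiances]

class CorrectObscuration(RecoveryStep):
    def apply(self, radiances):
        # placeholder for correcting the partially obscured view
        return [r / 0.8 for r in radiances]

def run_pipeline(radiances, steps):
    # intercommunication: the output of one step is the input of the next
    for step in steps:
        radiances = step.apply(radiances)
    return radiances

corrected = run_pipeline(
    [1.0, 2.0],
    [RemoveOscillation(), RemoveContamination(), CorrectObscuration()],
)
```

Because each step only touches the shared interface, replacing one recovery scenario never requires changes to the other two, which is the property the abstract credits with shortening the design-development-test cycle.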

  6. Seven Steps to Responsible Software Selection. ERIC Digest.

    ERIC Educational Resources Information Center

    Komoski, P. Kenneth; Plotnick, Eric

    Microcomputers in schools contribute significantly to the learning process, and software selection is taken as seriously as the selection of textbooks. The seven-step process for responsible software selection is: (1) analyzing needs, including the differentiation between needs and objectives; (2) specification of requirements; (3) identifying…

  7. RNA splicing process analysis for identifying antisense oligonucleotide inhibitors with padlock probe-based isothermal amplification (DOI: 10.1039/c7sc01336a).

    PubMed Central

    Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang

    2017-01-01

    RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing still remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5′-ASO could block RNA splicing by inhibiting the first step, while 3′-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs. PMID:28989608

  8. Viral safety characteristics of Flebogamma DIF, a new pasteurized, solvent-detergent treated and Planova 20 nm nanofiltered intravenous immunoglobulin.

    PubMed

    Caballero, Santiago; Nieto, Sandra; Gajardo, Rodrigo; Jorquera, Juan I

    2010-07-01

    A new human liquid intravenous immunoglobulin product, Flebogamma DIF, has been developed. This IgG is purified from human plasma by cold ethanol fractionation, PEG precipitation and ion exchange chromatography. The manufacturing process includes three different specific pathogen clearance (inactivation/removal) steps: pasteurization, solvent/detergent treatment and Planova nanofiltration with a pore size of 20 nm. This study evaluates the pathogen clearance capacity of seven steps in the production process for a wide range of viruses through spiking experiments: the three specific steps mentioned above and also four more production steps. Infectivity of samples was measured using a Tissue Culture Infectious Dose assay (log10 TCID50) or a Plaque Forming Units assay (log10 PFU). Validation studies demonstrated that each specific step cleared more than 4 log10 for all viruses assayed. An overall viral clearance of between ≥13.33 log10 and ≥25.21 log10 was achieved, depending on the virus and the number of steps studied for each virus. It can be concluded that Flebogamma DIF has a very high viral safety profile.
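
The arithmetic behind an "overall" clearance figure can be illustrated with a short sketch. The log-reduction values below are invented for illustration, not Flebogamma DIF validation data.

```python
# Per-step clearance is reported as log10(titer before / titer after).
# Because the titer ratios of sequential steps multiply, the overall
# clearance is simply the sum of the per-step log10 reduction values.

step_log_reductions = [4.5, 5.2, 6.1]   # e.g. pasteurization, S/D, nanofiltration
overall = sum(step_log_reductions)       # equivalent to one ratio of 10**overall
```

This is why validating three or more steps at >4 log10 each quickly yields overall figures above 13 log10, as reported in the study.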

  9. Improving Program Performance through Management Information. A Workbook.

    ERIC Educational Resources Information Center

    Bienia, Nancy

    Designed specifically for state and local managers and supervisors who plan, direct, and operate child support enforcement programs, this workbook provides a four-part, step-by-step process for identifying needed information and methods of using the information to operate an effective program. The process consists of: (1) determining what…

  10. Systems Maintenance Automated Repair Tasks (SMART)

    NASA Technical Reports Server (NTRS)

    Schuh, Joseph; Mitchell, Brent; Locklear, Louis; Belson, Martin A.; Al-Shihabi, Mary Jo Y.; King, Nadean; Norena, Elkin; Hardin, Derek

    2010-01-01

    SMART is a uniform automated discrepancy analysis and repair-authoring platform that improves technical accuracy and timely delivery of repair procedures for a given discrepancy (see figure a). SMART will minimize data errors, create uniform repair processes, and enhance the existing knowledge base of engineering repair processes. This innovation is the first tool developed that links the hardware specification requirements with the actual repair methods, sequences, and required equipment. SMART is flexibly designed to be usable by multiple engineering groups requiring decision analysis, and by any work authorization and disposition platform (see figure b). The organizational logic creates the link between the specification requirements of the hardware and the specific procedures required to repair discrepancies. The first segment in the SMART process uses a decision analysis tree to define all the permutations between component/subcomponent/discrepancy/repair on the hardware. The second segment uses a repair matrix to define what the steps and sequences are for any repair defined in the decision tree. This segment also allows for the selection of specific steps from multivariable steps. SMART will also be able to interface with outside databases and to store information from them to be inserted into the repair-procedure document. Some of the steps will be identified as optional, and would only be used based on the location and the current configuration of the hardware. The output from this analysis would be sent to a work authoring system in the form of a predefined sequence of steps containing required actions, tools, parts, materials, certifications, and specific requirements controlling quality, functional requirements, and limitations.
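
The two SMART segments can be sketched roughly as follows. The component names, discrepancy, repair, and steps are invented examples, not actual SMART data.

```python
# Segment 1: a decision tree maps (component, subcomponent, discrepancy)
# to a repair. Segment 2: a repair matrix maps each repair to its ordered
# steps, some of which are flagged optional (location/configuration-dependent).

decision_tree = {
    ("panel", "fastener", "stripped-thread"): "thread-insert-repair",
}

repair_matrix = {
    "thread-insert-repair": [
        {"step": "drill out damaged thread", "optional": False},
        {"step": "tap new thread",           "optional": False},
        {"step": "apply sealant",            "optional": True},
        {"step": "install thread insert",    "optional": False},
    ],
}

def author_repair(component, subcomponent, discrepancy, include_optional=False):
    # look up the repair, then emit its step sequence in order,
    # keeping optional steps only when the configuration calls for them
    repair = decision_tree[(component, subcomponent, discrepancy)]
    return [s["step"] for s in repair_matrix[repair]
            if include_optional or not s["optional"]]

steps = author_repair("panel", "fastener", "stripped-thread")
```

The output list plays the role of the predefined step sequence handed to the work-authoring system.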

  11. Surface Modified Particles By Multi-Step Michael-Type Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2005-05-03

    A new class of surface modified particles and a multi-step Michael-type addition surface modification process for the preparation of the same is provided. The multi-step Michael-type addition surface modification process involves two or more reactions to compatibilize particles with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through reactive organic linking groups. Specifically, these reactive groups are activated carbon-carbon pi bonds and carbon and non-carbon nucleophiles that react via Michael or Michael-type additions.

  12. Evidence-Based Assessment of Attention-Deficit/Hyperactivity Disorder: Using Multiple Sources of Information

    ERIC Educational Resources Information Center

    Frazier, Thomas W.; Youngstrom, Eric A.

    2006-01-01

    In this article, the authors illustrate a step-by-step process of acquiring and integrating information according to the recommendations of evidence-based practices. A case example models the process, leading to specific recommendations regarding instruments and strategies for evidence-based assessment (EBA) of attention-deficit/hyperactivity…

  13. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument-data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation…
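
The fail-safe, rule-driven pattern described above can be sketched as follows. This is a hypothetical illustration of the pattern, not the actual MATIS code; the event and action names are invented.

```python
import json
import os

# Sketch: user-defined rules bind events to actions, and the list of
# completed actions is checkpointed after every step so that a restart
# can reload the checkpoint and resume where processing left off.

rules = {
    "raw-file-arrived": "calibrate",
    "calibrated": "generate-product",
}

def run(events, checkpoint_path):
    done = []
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            done = json.load(f)                # resume after a restart
    for event in events:
        action = rules.get(event)
        if action is None or action in done:
            continue                           # no rule, or already completed
        done.append(action)                    # "execute" the plug-in action
        with open(checkpoint_path, "w") as f:
            json.dump(done, f)                 # checkpoint after each step
    return done
```

Running the same event stream twice against the same checkpoint file is idempotent, which is the behavior that makes restarts safe.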

  14. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
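
The parallelism described above (independent subjects in parallel, serial steps within a subject) can be sketched minimally. This is not any specific tool's API; the step functions are placeholders.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: a workflow chains post-processing steps for one subject in
# sequence, while independent subjects are dispatched in parallel.

def skull_strip(subject):
    return f"{subject}:stripped"        # placeholder processing step

def register(image):
    return f"{image}:registered"        # placeholder processing step

def process_subject(subject):
    # serial chain of steps for a single subject
    return register(skull_strip(subject))

def process_all(subjects):
    # independent subjects run concurrently; map preserves input order
    with ThreadPoolExecutor() as pool:
        return list(pool.map(process_subject, subjects))

results = process_all(["sub-01", "sub-02"])
```

Real workflow tools additionally parallelize independent steps within one subject and schedule work against available compute resources, which this sketch omits.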

  15. Facilitating the Research Paper Process: A Guide for the Social Science Instructor.

    ERIC Educational Resources Information Center

    Medina, Suzanne L.

    This paper describes the approach used successfully at California State University, Dominguez Hills, to instruct college students in the research paper writing process. To achieve the results, the instructor followed a specific set of steps during a class meeting set aside for this specialized training. This paper details each step in the…

  16. Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.

    PubMed

    Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen

    2018-07-20

    Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals which attracts considerable attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) and the cell-killing capacity of highly cytotoxic small molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods have so far scarcely been used in ADC development. In this work an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented by use of a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by utilization of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established including feedback to the process. For conjugate characterization, a high-throughput compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate the efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development effort and material demands and of shortening development timelines for antibody-drug conjugates.

  17. Contamination Control in Hybrid Microelectronic Modules. Part 1: Identification of Critical Process and Contaminants

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.

    1975-01-01

    Various hybrid processing steps, handling procedures, and materials are examined in an attempt to identify sources of contamination and to propose methods for the control of these contaminants. It is found that package sealing, assembly, and rework are especially susceptible to contamination. Moisture and loose particles are identified as the worst contaminants. The points at which contaminants are most likely to enter the hybrid package are also identified, and both general and specific methods for their detection and control are developed. In general, the most effective controls for contaminants are: clean working areas, visual inspection at each step of the process, and effective cleaning at critical process steps. Specific methods suggested include the detection of loose particles by a precap visual inspection, by preseal and post-seal electrical testing, and by a particle impact noise test. Moisture is best controlled by sealing all packages in a clean, dry, inert atmosphere after a thorough bake-out of all parts.

  18. Chapter 10. Developing a habitat monitoring program: three examples from national forest planning

    Treesearch

    Michael I. Goldstein; Lowell H. Suring; Christina D. Vojta; Mary M. Rowland; Clinton. McCarthy

    2013-01-01

    This chapter reviews the process steps of wildlife habitat monitoring described in chapters 2 through 9 and provides three case examples that illustrate how the process steps apply to specific situations. It provides the reader an opportunity to synthesize the material while also revealing the potential knowledge gaps and pitfalls that may complicate completion of a...

  19. Meeting the Challenge of Adequate Yearly Progress: How One School Is Learning to Leave No Child Behind

    ERIC Educational Resources Information Center

    Richardson, Jacqueline S.; Soltez, Jeff

    2003-01-01

    This article describes the step-by-step process Ross Elementary in Topeka, Kan., used to achieve Adequate Yearly Progress (AYP) in accordance with the No Child Left Behind Act of 2001. A simple process of using teamwork, setting specific measurable goals, and frequent monitoring of student progress toward these goals allowed Ross to significantly…

  20. 3D road marking reconstruction from street-level calibrated stereo pairs

    NASA Astrophysics Data System (ADS)

    Soheilian, Bahman; Paparoditis, Nicolas; Boldo, Didier

    This paper presents an automatic approach to road marking reconstruction using stereo pairs acquired by a mobile mapping system in a dense urban area. Two types of road markings were studied: zebra crossings (crosswalks) and dashed lines. Both types consist of strips of known shape and size, and these geometric specifications are used to constrain the recognition of strips. In both cases (i.e. zebra crossings and dashed lines), the reconstruction method consists of three main steps. The first step extracts edge points from the left and right images of a stereo pair and computes 3D linked edges using a matching process. The second step comprises a filtering process that uses the known geometric specifications of road marking objects. The goal is to preserve linked edges that can plausibly belong to road markings and to filter others out. The final step uses the remaining linked edges to fit a theoretical model to the data. The method developed has been used for processing a large number of images. Road markings are successfully and precisely reconstructed in dense urban areas under real traffic conditions.
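
The specification-based filter step can be illustrated with a deliberately simplified sketch. The width specification, tolerance, and candidate values below are invented, and the real method operates on 3D linked edges rather than scalar widths.

```python
# Sketch of the three-step strategy: step 1 (edge extraction + stereo
# matching) is stubbed out as a list of measured strip widths; step 2
# filters candidates against the known strip specification; step 3
# "fits" the theoretical model, here reduced to a mean over inliers.

STRIP_WIDTH_M = 0.50   # assumed strip-width specification (illustrative)
TOLERANCE_M = 0.10

def filter_by_spec(widths):
    # step 2: keep candidates that can plausibly belong to road markings
    return [w for w in widths if abs(w - STRIP_WIDTH_M) <= TOLERANCE_M]

def fit_model(widths):
    # step 3: fit the theoretical model to the surviving data
    return sum(widths) / len(widths)

candidates = [0.48, 0.52, 1.30, 0.55]   # step 1 output (stubbed)
inliers = filter_by_spec(candidates)
model_width = fit_model(inliers)
```

The key design choice carried over from the paper is that known geometry prunes the candidate set before any model fitting, so the fit is not distorted by clutter such as the 1.30 m outlier above.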

  1. Measurements of gluconeogenesis and glycogenolysis: A methodological review

    USDA-ARS?s Scientific Manuscript database

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to deter...

  2. Process improvement as an investment: Measuring its worth

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Jeletic, Kellyann

    1993-01-01

    This paper discusses return on investment (ROI) generated from software process improvement programs. It details the steps needed to compute ROI and compares these steps from the perspective of two process improvement approaches: the widely known Software Engineering Institute's capability maturity model and the approach employed by NASA's Software Engineering Laboratory (SEL). The paper then describes the specific investments made in the SEL over the past 18 years and discusses the improvements gained from this investment by the production organization in the SEL.
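
The basic ROI arithmetic behind such comparisons is simple to state. The figures below are illustrative only, not SEL or SEI data.

```python
# Generic return-on-investment arithmetic as used in process-improvement
# accounting: net benefit divided by the investment that produced it.

def roi(benefit, investment):
    # ROI expressed as a fraction of the investment
    return (benefit - investment) / investment

r = roi(benefit=500_000, investment=200_000)   # 1.5, i.e. a 150% return
```

The hard part, as the paper emphasizes, is not this formula but attributing measured improvement in the production organization to the specific investments made.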

  3. Key future research questions on mediators and moderators of behaviour change processes for substance abuse.

    PubMed

    Rehm, Jürgen

    2008-06-01

    In summarizing the key themes and results of the second meeting of the German Addiction Research Network 'Understanding Addiction: Mediators and Moderators of Behaviour Change Process', the following concrete steps forward were laid out to improve knowledge. The steps included pleas to (1) redefine substance abuse disorders, especially redefine the concept of abuse and harmful use; (2) increase the use of longitudinal and life-course studies with more adequate statistical methods such as latent growth modelling; (3) empirically test more specific and theoretically derived common factors and mechanisms of behavioural change processes; (4) better exploit cross-regional and cross-cultural differences. Funding agencies are urged to support these developments by specifically supporting interdisciplinary research along the lines specified above. This may include improved forms of international funding of groups of researchers from different countries, where each national group conducts a specific part of an integrated proposal.

  4. Emergence of Coding and its Specificity as a Physico-Informatic Problem

    NASA Astrophysics Data System (ADS)

    Wills, Peter R.; Nieselt, Kay; McCaskill, John S.

    2015-06-01

    We explore the origin-of-life consequences of the view that biological systems are demarcated from inanimate matter by their possession of referential information, which is processed computationally to control choices of specific physico-chemical events. Cells are cybernetic: they use genetic information in processes of communication and control, subjecting physical events to a system of integrated governance. The genetic code is the most obvious example of how cells use information computationally, but the historical origin of the usefulness of molecular information is not well understood. Genetic coding made information useful because it imposed a modular metric on the evolutionary search and thereby offered a general solution to the problem of finding catalysts of any specificity. We use the term "quasispecies symmetry breaking" to describe the iterated process of self-organisation whereby the alphabets of distinguishable codons and amino acids increased, step by step.

  5. Understanding and Effectively Managing the Aggressive Child.

    ERIC Educational Resources Information Center

    Beck, Mitchell A; Roblee, Cathy

    1983-01-01

    Identifies specific behavioral characteristics of both aggressive and passive-aggressive students, delineates a four-step process that most aggressive students go through as they lose control of their emotions, and describes intervention strategies that teachers can use to deal with each step of the breakdown of control. (FL)

  6. Graphics-Based Parallel Programming Tools

    DTIC Science & Technology

    1991-09-01

    mean "beyond" (as in "paranormal"), emphasizing the fact that the editor supports the specification of not just single graphs, but entire graph… conflicting dependencies: all processes see the three steps in the same order and all interprocess communication happens within a step.

  7. 22 CFR 161.9 - Specific steps in the Department's NEPA process.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... cooperation with the Office of Environment and Health, initiate steps to prepare an environmental impact... EIS, will itself have no significant impact, he should conduct an environmental assessment in... assessment or environmental impact statement. However, the responsible action officer shall note in the...

  8. 22 CFR 161.9 - Specific steps in the Department's NEPA process.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... cooperation with the Office of Environment and Health, initiate steps to prepare an environmental impact... EIS, will itself have no significant impact, he should conduct an environmental assessment in... assessment or environmental impact statement. However, the responsible action officer shall note in the...

  9. 22 CFR 161.9 - Specific steps in the Department's NEPA process.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... cooperation with the Office of Environment and Health, initiate steps to prepare an environmental impact... EIS, will itself have no significant impact, he should conduct an environmental assessment in... assessment or environmental impact statement. However, the responsible action officer shall note in the...

  10. 22 CFR 161.9 - Specific steps in the Department's NEPA process.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... cooperation with the Office of Environment and Health, initiate steps to prepare an environmental impact... EIS, will itself have no significant impact, he should conduct an environmental assessment in... assessment or environmental impact statement. However, the responsible action officer shall note in the...

  11. A Team-Based Process for Designing Comprehensive, Integrated, Three-Tiered (CI3T) Models of Prevention: How Does My School-Site Leadership Team Design a CI3T Model?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Jenkins, Abbie; Menzies, Holly Mariah; Kalberg, Jemma Robertson

    2014-01-01

    Comprehensive, integrated, three-tiered models are context specific and developed by school-site teams according to the core values held by the school community. In this article, the authors provide a step-by-step, team-based process for designing comprehensive, integrated, three-tiered models of prevention that integrate academic, behavioral, and…

  12. Cooperative Buying for New Associates: Some Assembly Required. Important Safety Instructions.

    ERIC Educational Resources Information Center

    Talarico, Scott

    1998-01-01

    A guide for using cooperative buying to block-book campus activities or attractions through a campus activities convention provides a step-by-step process and outlines some specific considerations, including forms, pricing, preparation for the conference, follow-up approaches, and hints for new users of the system. (MSE)

  13. Towards Core Modelling Practices in Integrated Water Resource Management: An Interdisciplinary View of the Modelling Process

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Elsawah, S.; Pierce, S. A.; Ames, D. P.

    2016-12-01

    The National Socio-Environmental Synthesis Center (SESYNC) Core Modelling Practices Pursuit is developing resources to describe core practices for developing and using models to support integrated water resource management. These practices implement specific steps in the modelling process with an interdisciplinary perspective; however, the particular practice that is most appropriate depends on contextual aspects specific to the project. The first task of the pursuit is to identify the various steps for which implementation practices are to be described. This paper reports on those results. The paper draws on knowledge from the modelling process literature for environmental modelling (Jakeman et al., 2006), engaging stakeholders (Voinov and Bousquet, 2010) and general modelling (Banks, 1999), as well as the experience of the consortium members. We organise the steps around the four modelling phases. The planning phase identifies what is to be achieved, how and with what resources. The model is built and tested during the construction phase, and then used in the application phase. Finally, models that become part of the ongoing policy process require a maintenance phase. For each step, the paper focusses on what is to be considered or achieved, rather than how it is performed. This reflects the separation of the steps from the practices that implement them in different contexts. We support the description of steps with a wide range of examples. Examples are designed to be generic and do not reflect any one project or context, but instead are drawn from common situations or from extremely different ones so as to highlight some of the issues that may arise at each step.
    References
    Banks, J. (1999). Introduction to simulation. In Proceedings of the 1999 Winter Simulation Conference.
    Jakeman, A. J., R. A. Letcher, and J. P. Norton (2006). Ten iterative steps in development and evaluation of environmental models. Environmental Modelling and Software 21, 602-614.
    Voinov, A. and F. Bousquet (2010). Modelling with stakeholders. Environmental Modelling & Software 25(11), 1268-1281.

  14. Controlling the Local Electronic Properties of Si(553)-Au through Hydrogen Doping

    NASA Astrophysics Data System (ADS)

    Hogan, C.; Speiser, E.; Chandola, S.; Suchkova, S.; Aulbach, J.; Schäfer, J.; Meyer, S.; Claessen, R.; Esser, N.

    2018-04-01

    We propose a quantitative and reversible method for tuning the charge localization of Au-stabilized stepped Si surfaces by site-specific hydrogenation. This is demonstrated for Si(553)-Au as a model system by combining density functional theory simulations and reflectance anisotropy spectroscopy experiments. We find that controlled H passivation is a two-step process: step-edge adsorption drives excess charge into the conducting metal chain "reservoir" and renders it insulating, while surplus H recovers metallic behavior. Our approach illustrates a route towards microscopic manipulation of the local surface charge distribution and establishes a reversible switch of site-specific chemical reactivity and magnetic properties on vicinal surfaces.

  15. An episodic specificity induction enhances means-end problem solving in young and older adults.

    PubMed

    Madore, Kevin P; Schacter, Daniel L

    2014-12-01

    Episodic memory plays an important role not only in remembering past experiences, but also in constructing simulations of future experiences and solving means-end social problems. We recently found that an episodic specificity induction (brief training in recollecting details of past experiences) enhances performance of young and older adults on memory and imagination tasks. Here we tested the hypothesis that this specificity induction would also positively impact a means-end problem-solving task on which age-related changes have been linked to impaired episodic memory. Young and older adults received the specificity induction or a control induction before completing a means-end problem-solving task, as well as memory and imagination tasks. Consistent with previous findings, older adults provided fewer relevant steps on problem solving than did young adults, and their responses also contained fewer internal (i.e., episodic) details across the 3 tasks. There was no difference in the number of other (e.g., irrelevant) steps on problem solving or external (i.e., semantic) details generated on the 3 tasks as a function of age. Critically, the specificity induction increased the number of relevant steps and internal details (but not other steps or external details) that both young and older adults generated in problem solving compared with the control induction, as well as the number of internal details (but not external details) generated for memory and imagination. Our findings support the idea that episodic retrieval processes are involved in means-end problem solving, extend the range of tasks on which a specificity induction targets these processes, and show that the problem-solving performance of older adults can benefit from a specificity induction as much as that of young adults. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  16. An episodic specificity induction enhances means-end problem solving in young and older adults

    PubMed Central

    Madore, Kevin P.; Schacter, Daniel L.

    2014-01-01

    Episodic memory plays an important role not only in remembering past experiences, but also in constructing simulations of future experiences and solving means-end social problems. We recently found that an episodic specificity induction (brief training in recollecting details of past experiences) enhances performance of young and older adults on memory and imagination tasks. Here we tested the hypothesis that this specificity induction would also positively impact a means-end problem-solving task on which age-related changes have been linked to impaired episodic memory. Young and older adults received the specificity induction or a control induction before completing a means-end problem-solving task, as well as memory and imagination tasks. Consistent with previous findings, older adults provided fewer relevant steps on problem solving than did young adults, and their responses also contained fewer internal (i.e., episodic) details across the three tasks. There was no difference in the number of other (e.g., irrelevant) steps on problem solving or external (i.e., semantic) details generated on the three tasks as a function of age. Critically, the specificity induction increased the number of relevant steps and internal details (but not other steps or external details) that both young and older adults generated in problem solving compared with the control induction, as well as the number of internal details (but not external details) generated for memory and imagination. Our findings support the idea that episodic retrieval processes are involved in means-end problem solving, extend the range of tasks on which a specificity induction targets these processes, and show that the problem-solving performance of older adults can benefit from a specificity induction as much as that of young adults. PMID:25365688

  17. Generation of cell type-specific monoclonal antibodies for the planarian and optimization of sample processing for immunolabeling.

    PubMed

    Forsthoefel, David J; Waters, Forrest A; Newmark, Phillip A

    2014-12-21

    Efforts to elucidate the cellular and molecular mechanisms of regeneration have required the application of methods to detect specific cell types and tissues in a growing cohort of experimental animal models. For example, in the planarian Schmidtea mediterranea, substantial improvements to nucleic acid hybridization and electron microscopy protocols have facilitated the visualization of regenerative events at the cellular level. By contrast, immunological resources have been slower to emerge. Specifically, the repertoire of antibodies recognizing planarian antigens remains limited, and a more systematic approach is needed to evaluate the effects of processing steps required during sample preparation for immunolabeling. To address these issues and to facilitate studies of planarian digestive system regeneration, we conducted a monoclonal antibody (mAb) screen using phagocytic intestinal cells purified from the digestive tracts of living planarians as immunogens. This approach yielded ten antibodies that recognized intestinal epitopes, as well as markers for the central nervous system, musculature, secretory cells, and epidermis. In order to improve signal intensity and reduce non-specific background for a subset of mAbs, we evaluated the effects of fixation and other steps during sample processing. We found that fixative choice, treatments to remove mucus and bleach pigment, as well as methods for tissue permeabilization and antigen retrieval profoundly influenced labeling by individual antibodies. These experiments led to the development of a step-by-step workflow for determining optimal specimen preparation for labeling whole planarians as well as unbleached histological sections. We generated a collection of monoclonal antibodies recognizing the planarian intestine and other tissues; these antibodies will facilitate studies of planarian tissue morphogenesis. 
We also developed a protocol for optimizing specimen processing that will accelerate future efforts to generate planarian-specific antibodies, and to extend functional genetic studies of regeneration to post-transcriptional aspects of gene expression, such as protein localization or modification. Our efforts demonstrate the importance of systematically testing multiple approaches to species-specific idiosyncrasies, such as mucus removal and pigment bleaching, and may serve as a template for the development of immunological resources in other emerging model organisms.

  18. Genetics Home Reference: peroxisomal acyl-CoA oxidase deficiency

    MedlinePlus

    ... of certain fat molecules called very long-chain fatty acids (VLCFAs). Specifically, it is involved in the first step of a process called the peroxisomal fatty acid beta-oxidation pathway. This process shortens the VLCFA ...

  19. Enviropod handbook: A guide to preparation and use of the Environmental Protection Agency's light-weight aerial camera system. [Weber River, Utah

    NASA Technical Reports Server (NTRS)

    Brower, S. J.; Ridd, M. K.

    1984-01-01

    The use of the Environmental Protection Agency (EPA) Enviropod camera system is detailed in this handbook which contains a step-by-step guide for mission planning, flights, film processing, indexing, and documentation. Information regarding Enviropod equipment and specifications is included.

  20. Preparing for High Technology: 30 Steps to Implementation. Research & Development Series No. 232.

    ERIC Educational Resources Information Center

    Abram, Robert; And Others

    This planning guide is one of three that addresses the concerns of postsecondary college administrators and planners regarding the planning and implementation of technician training programs in high technology areas. It specifically focuses on a 30-step planning process that is generalizable to various high technology areas. (The other two…

  1. Game Theory and Technical Communication: Interpreting the Texas Two-Step through Harsanyi Transformation

    ERIC Educational Resources Information Center

    Williams, Miriam F.

    2012-01-01

    The author uses game theoretical models to identify technical communication breakdowns encountered during the notoriously confusing Texas Two-Step voting and caucusing process. Specifically, the author uses narrative theory and game theory to highlight areas where caucus participants needed instructions to better understand the rules of the game…

  2. Beyond Constant Comparison Qualitative Data Analysis: Using NVivo

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2011-01-01

    The purposes of this paper are to outline seven types of qualitative data analysis techniques, to present step-by-step guidance for conducting these analyses via a computer-assisted qualitative data analysis software program (i.e., NVivo9), and to present screenshots of the data analysis process. Specifically, the following seven analyses are…

  3. The HPT Model Applied to a Kayak Company's Registration Process

    ERIC Educational Resources Information Center

    Martin, Florence; Hall, Herman A., IV; Blakely, Amanda; Gayford, Matthew C.; Gunter, Erin

    2009-01-01

    This case study describes the step-by-step application of the traditional human performance technology (HPT) model at a premier kayak company located on the coast of North Carolina. The HPT model was applied to address lost revenues related to three specific business issues: misinformed customers, dissatisfied customers, and guides not showing up…

  4. Addressable-Matrix Integrated-Circuit Test Structure

    NASA Technical Reports Server (NTRS)

    Sayah, Hoshyar R.; Buehler, Martin G.

    1991-01-01

    Method of quality control based on use of row- and column-addressable test structure speeds collection of data on widths of resistor lines and coverage of steps in integrated circuits. By use of a straightforward mathematical model, line widths and step coverages are deduced from measurements of electrical resistances in each of various combinations of lines, steps, and bridges addressable in the test structure. Intended for use in evaluating processes and equipment used in manufacture of application-specific integrated circuits.
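The kind of deduction described, recovering a line width from a measured resistance, can be illustrated with the standard sheet-resistance relation R = Rs·L/W. This is a minimal sketch: the numbers are hypothetical, and the actual test-structure model also handles step and bridge combinations.

```python
# Sketch: deducing the width of a resistor line from its measured
# resistance via the sheet-resistance model R = Rs * (L / W).
# All values below are hypothetical, for illustration only.

def line_width(resistance_ohm, length_um, sheet_res_ohm_sq):
    """Solve R = Rs * L / W for the line width W (in micrometres)."""
    return sheet_res_ohm_sq * length_um / resistance_ohm

# A 200-um-long line of 50 ohm/sq material measuring 2000 ohms
# implies a 5-um drawn width.
w = line_width(2000.0, 200.0, 50.0)
print(w)  # -> 5.0
```

Comparing the deduced width against the drawn width is the sort of check such a structure enables across many addressable lines at once.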

  5. Strategies for developing competency models.

    PubMed

    Marrelli, Anne F; Tondora, Janis; Hoge, Michael A

    2005-01-01

    There is an emerging trend within healthcare to introduce competency-based approaches in the training, assessment, and development of the workforce. The trend is evident in various disciplines and specialty areas within the field of behavioral health. This article is designed to inform those efforts by presenting a step-by-step process for developing a competency model. An introductory overview of competencies, competency models, and the legal implications of competency development is followed by a description of the seven steps involved in creating a competency model for a specific function, role, or position. This modeling process is drawn from advanced work on competencies in business and industry.

  6. Automated drug identification system

    NASA Technical Reports Server (NTRS)

    Campen, C. F., Jr.

    1974-01-01

    System speeds up analysis of blood and urine and is capable of identifying 100 commonly abused drugs. System includes computer that controls entire analytical process by ordering various steps in specific sequences. Computer processes data output and has readout of identified drugs.

  7. Development of a Premium Quality Plasma-derived IVIg (IQYMUNE®) Utilizing the Principles of Quality by Design-A Worked-through Case Study.

    PubMed

    Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent

    2018-01-01

    Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular quality by design approach was implemented, consisting of five consecutive steps to cover all the stages from product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiment approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established.
The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach plus elements of the quality by design enhanced approach, as illustrated, to more robustly assign material and process controls and to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises. © PDA, Inc. 2018.

  8. Scatterometry-based metrology for SAQP pitch walking using virtual reference

    NASA Astrophysics Data System (ADS)

    Kagalwala, Taher; Vaid, Alok; Mahendrakar, Sridhar; Lenahan, Michael; Fang, Fang; Isbester, Paul; Shifrin, Michael; Etzioni, Yoav; Cepler, Aron; Yellai, Naren; Dasari, Prasad; Bozdog, Cornel

    2016-03-01

    Advanced technology nodes, 10nm and beyond, employing multi-patterning techniques for pitch reduction pose new process and metrology challenges in maintaining consistent positioning of structural features. Self-Aligned Quadruple Patterning (SAQP) process is used to create the Fins in FinFET devices with pitch values well below optical lithography limits. The SAQP process bears the compounding effects of successive Reactive Ion Etch (RIE) and spacer deposition steps. These processes induce a shift in the pitch value from one fin compared to another neighboring fin. This is known as pitch walking. Pitch walking affects device performance as well as later processes which work on an assumption that there is consistent spacing between fins. In SAQP there are 3 pitch walking parameters of interest, each linked to specific process steps in the flow. These pitch walking parameters are difficult to discriminate at a specific process step by singular evaluation technique or even with reference metrology such as Transmission Electron Microscopy (TEM). In this paper we will utilize a virtual reference to generate a scatterometry model to measure pitch walk for SAQP process flow.

  9. Measuring self-aligned quadruple patterning pitch walking with scatterometry-based metrology utilizing virtual reference

    NASA Astrophysics Data System (ADS)

    Kagalwala, Taher; Vaid, Alok; Mahendrakar, Sridhar; Lenahan, Michael; Fang, Fang; Isbester, Paul; Shifrin, Michael; Etzioni, Yoav; Cepler, Aron; Yellai, Naren; Dasari, Prasad; Bozdog, Cornel

    2016-10-01

    Advanced technology nodes, 10 nm and beyond, employing multipatterning techniques for pitch reduction pose new process and metrology challenges in maintaining consistent positioning of structural features. A self-aligned quadruple patterning (SAQP) process is used to create the fins in FinFET devices with pitch values well below optical lithography limits. The SAQP process bears the compounding effects from successive reactive ion etch and spacer depositions. These processes induce a shift in the pitch value from one fin compared to another neighboring fin. This is known as pitch walking. Pitch walking affects device performance as well as later processes, which work on an assumption that there is consistent spacing between fins. In SAQP, there are three pitch walking parameters of interest, each linked to specific process steps in the flow. These pitch walking parameters are difficult to discriminate at a specific process step by singular evaluation technique or even with reference metrology, such as transmission electron microscopy. We will utilize a virtual reference to generate a scatterometry model to measure pitch walk for SAQP process flow.
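Pitch walking can be pictured as the deviation of each fin-to-fin spacing from the nominal pitch. The sketch below uses hypothetical fin positions and a hypothetical nominal pitch purely to illustrate the quantity; it is not the paper's scatterometry model, which infers these deviations optically.

```python
# Sketch: quantifying pitch walking as the deviation of each fin-to-fin
# spacing from the nominal (target) pitch. Fin positions and pitch are
# hypothetical; the paper infers such deviations via scatterometry.

def pitch_walk(fin_positions_nm, nominal_pitch_nm):
    """Return the per-gap deviation from the nominal pitch, in nm."""
    spacings = [b - a for a, b in zip(fin_positions_nm, fin_positions_nm[1:])]
    return [s - nominal_pitch_nm for s in spacings]

# Four fins from one SAQP cycle, ideally spaced 30 nm apart:
walk = pitch_walk([0.0, 31.0, 59.0, 90.0], 30.0)
print(walk)  # -> [1.0, -2.0, 1.0]
```

A perfectly patterned quartet would give all-zero deviations; systematic nonzero patterns are what tie back to specific RIE and spacer steps.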

  10. Molecular-Scale Structural Controls on Nanoscale Growth Processes: Step-Specific Regulation of Biomineral Morphology

    NASA Astrophysics Data System (ADS)

    Dove, P. M.; Davis, K. J.; De Yoreo, J. J.; Orme, C. A.

    2001-12-01

    Deciphering the complex strategies by which organisms produce nanocrystalline materials with exquisite morphologies is central to understanding biomineralizing systems. One control on the morphology of biogenic nanoparticles is the specific interactions of their surfaces with the organic functional groups provided by the organism and the various inorganic species present in the ambient environment. It is now possible to directly probe the microscopic structural controls on crystal morphology by making quantitative measurements of the dynamic processes occurring at the mineral-water interface. These observations can provide crucial information concerning the actual mechanisms of growth that is otherwise unobtainable through macroscopic techniques. Here we use in situ molecular-scale observations of step dynamics and growth hillock morphology to directly resolve roles of principal impurities in regulating calcite surface morphologies. We show that the interactions of certain inorganic as well as organic impurities with the calcite surface are dependent upon the molecular-scale structures of step-edges. These interactions can assume a primary role in directing crystal morphology. In calcite growth experiments containing magnesium, we show that growth hillock structures become modified owing to the preferential inhibition of step motion along directions approximately parallel to the [010]. Compositional analyses have shown that Mg incorporates at different levels into the two types of nonequivalent steps, which meet at the hillock corner parallel to [010]. A simple calculation of the strain caused by this difference indicates that we should expect a significant retardation at this corner, in agreement with the observed development of [010] steps. 
If the low-energy step-risers produced by these [010] steps is perpendicular to the c-axis as seems likely from crystallographic considerations, this effect provides a plausible mechanism for the elongated calcite crystal habits found in natural environments that contain magnesium. In a separate study, step-specific interactions are also found between chiral aspartate molecules and the calcite surface. The L and D- aspartate enantiomers exhibit structure preferences for the different types of step-risers on the calcite surface. These site-specific interactions result in the transfer of asymmetry from the organic molecule to the crystal surface through the formation of chiral growth hillocks and surface morphologies. These studies yield direct experimental insight into the molecular-scale structural controls on nanocrystal morphology in biomineralizing systems.

  11. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    PubMed

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limit, that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need to have a more predictable process prompted the need to control variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data, compared to the historical average, and, in addition, the variation was reduced. The model is subject to limitations: For example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.
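Step 3 of the model, checking that the process shows only random variation before benchmarking, can be sketched with a basic 3-sigma control-limit rule. The monthly rates, limits, and chart choice below are illustrative assumptions, not the authors' method; a real application would use the control chart appropriate to the indicator.

```python
# Sketch: flag special-cause variation with a basic 3-sigma rule, as a
# stand-in for Step 3 of the model above. The monthly rates are
# hypothetical; real use would pick the chart type suited to the PI.
from statistics import mean, stdev

def special_cause_points(rates):
    """Return indices of points outside the 3-sigma control limits."""
    centre = mean(rates)
    sigma = stdev(rates)
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
    return [i for i, r in enumerate(rates) if r > ucl or r < lcl]

# Twelve hypothetical monthly infection rates; month 6 spikes.
rates = [2.1, 2.4, 2.2, 2.3, 2.0, 2.2, 6.8, 2.1, 2.3, 2.2, 2.4, 2.1]
print(special_cause_points(rates))  # -> [6]
```

An empty result would indicate a stable process, ready for Steps 4 and 5; a flagged point sends the process back to the action-plan loop instead.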

  12. Array Automated Assembly Task Low Cost Silicon Solar Array Project, Phase 2

    NASA Technical Reports Server (NTRS)

    Rhee, S. S.; Jones, G. T.; Allison, K. L.

    1978-01-01

    Progress in the development of solar cells and module process steps for low-cost solar arrays is reported. Specific topics covered include: (1) a system to automatically measure solar cell electrical performance parameters; (2) automation of wafer surface preparation, printing, and plating; (3) laser inspection of mechanical defects of solar cells; and (4) a silicon antireflection coating system. Two solar cell process steps, laser trimming and holing automation and spray-on dopant junction formation, are described.

  13. 22 CFR 161.9 - Specific steps in the Department's NEPA process.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... cooperation with the Office of Environment and Health, initiate steps to prepare an environmental impact... proposed action may have a significant impact on the human environment the necessary revision in the... Environment and Health, prepare a “Finding of no significant impact” (see §§ 1501.4 and 1508.13 of the CEQ...

  14. Screening for Usher Syndrome: A Hands-On Guide for School Nurses.

    ERIC Educational Resources Information Center

    Houghton, Joan; Coonts, Teresa; Jordan, Beth; Schafer, Jacqueline, Ed.

    This manual was written specifically to help school nurses conduct screenings for Usher syndrome, a genetic condition that involves deafness or hearing loss and the progressive loss of vision. It provides information on the step-by-step process of how to conduct a screening, the actual forms needed for a screening, and resources for referring…

  15. Best of Teacher-to-Teacher: The Ultimate Beginner's Guide. NEA Teacher-to-Teacher Books.

    ERIC Educational Resources Information Center

    National Education Association, Washington, DC.

    In this book, beginning teachers from around the country share their favorite chapters from the National Education Association's "Teacher-to-Teacher" books. Each story illustrates step-by-step how teachers tackle a specific restructuring challenge, describing what worked and what did not work in the process. Each chapter includes diagrams,…

  16. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  17. Specific and selective target detection of supra-genome 21 Mers Salmonella via silicon nanowires biosensor

    NASA Astrophysics Data System (ADS)

    Mustafa, Mohammad Razif Bin; Dhahi, Th S.; Ehfaed, Nuri. A. K. H.; Adam, Tijjani; Hashim, U.; Azizah, N.; Mohammed, Mohammed; Noriman, N. Z.

    2017-09-01

    Silicon-based nanostructures can be surface-modified for use as label-free biosensors that allow real-time measurements. The silicon nanowire surface was functionalized using 3-aminopropyltrimethoxysilane (APTES), which acts as a facilitator to immobilize biomolecules on the silicon nanowire surface. The process is simple and economical, paving the way for point-of-care applications. However, the surface modification and the subsequent detection mechanism are still not fully understood. This study therefore proposes a step-by-step process for silicon nanowire surface modification and demonstrates its potential for specific and selective detection of the Supra-genome 21 Mers Salmonella target. The device captured the target molecule precisely; the approach takes advantage of the strong binding chemistry created between APTES and the biomolecule. The results indicate how modification of the nanowires provides sensing capability with robust surface chemistries that can lead to specific and selective target detection.

  18. EST databases and web tools for EST projects.

    PubMed

    Shen, Yao-Qing; O'Brien, Emmet; Koski, Liisa; Lang, B Franz; Burger, Gertraud

    2009-01-01

    This chapter outlines key considerations for constructing and implementing an EST database. Rather than presenting the technological details step by step, it emphasizes the design of an EST database suited to the specific needs of EST projects and the choice of the most suitable tools. Using TBestDB as an example, we illustrate the essential factors to be considered for database construction and the steps for data population and annotation. This process employs technologies such as PostgreSQL, Perl, and PHP to build the database and interface, and tools such as AutoFACT for data processing and annotation. We discuss these in comparison to other available technologies and tools, and explain the reasons for our choices.

  19. Sperm kinematics and subpopulational responses during the cryopreservation process in caprine ejaculates.

    PubMed

    Barbas, J P; Leahy, T; Horta, A E; García-Herreros, M

    2018-03-20

    Sperm cryopreservation in goats has been a challenge for many years due to the detrimental effects of seminal plasma enzymes produced by the bulbo-urethral glands, which catalyse the hydrolysis of lecithins in egg yolk to fatty acids and lysolecithins that are deleterious to spermatozoa. This necessitates additional processing steps during sperm cryopreservation to remove seminal plasma, triggering different sperm responses that may affect sperm functionality. The objective of the present study was to determine specific sperm subpopulation responses at different handling steps during the cryopreservation process by using functional sperm kinematic descriptors in caprine ejaculates. Buck ejaculates (n = 40) were analysed for sperm concentration, viability, morphology and acrosome integrity. Moreover, sperm motility was assessed using a computer-assisted sperm analysis (CASA) system after five different handling steps (fresh sperm, 1st washing, 2nd washing, cooling and frozen-thawed sperm) during a standard cryopreservation protocol for goat semen. The results were analysed using Principal Component Analysis (PCA) and multivariate clustering procedures to establish the relationship between the distribution of the subpopulations found and functional sperm motility at each step. Except at the 1st and 4th steps, four sperm kinematic subpopulations were observed, explaining more than 75% of the variance. Based on velocity and linearity parameters and the subpopulations disclosed, the kinematic response varied among processing steps, modifying sperm movement trajectories in a subpopulation-specific and handling step-dependent manner (p < 0.001). The predominant motile subpopulation in freshly ejaculated buck sperm had very fast velocity characteristics and a non-linear trajectory (41.1%). Washing buck sperm twice altered the subpopulation structure, as did cooling, which resulted in a dramatic reduction in sperm velocities (p < 0.01).
Frozen-thawed spermatozoa showed similar characteristics to cooled sperm, except that there was a further increase in linearity, with a large proportion of sperm attributed to a new slow, linear cluster (32.5%). In conclusion, this study confirms the variability and heterogeneity of goat sperm kinematic patterns throughout the cryopreservation process and suggests that the predominant motility pattern (assayed in vitro via CASA) of high-quality spermatozoa might be typified by high speed and a non-linear trajectory. The relationships among the number and distribution of sperm subpopulations and the different handling steps were particularly relevant, especially after the cooling and post-thawing steps, when effects derived from these critical handling steps were evident and drastically altered the sperm motion patterns. Copyright © 2018 Elsevier Inc. All rights reserved.
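The subpopulation analysis described above can be loosely sketched by clustering cells on kinematic descriptors. The sketch below runs a plain k-means on hypothetical (velocity, linearity) pairs with an assumed cluster count of two; the study itself applied PCA plus multivariate clustering to the full CASA descriptor set.

```python
# Sketch: grouping cells into kinematic subpopulations by clustering
# (velocity, linearity) pairs, loosely mirroring the multivariate
# clustering step described above. The data, descriptor choice, and
# cluster count are hypothetical, not the study's actual pipeline.
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on 2-D points; returns one cluster label per point."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centre.
        labels = []
        for p in points:
            dists = [(p[0] - cx) ** 2 + (p[1] - cy) ** 2 for cx, cy in centres]
            labels.append(dists.index(min(dists)))
        # Move each centre to the mean of its assigned points.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centres[c] = (sum(p[0] for p in members) / len(members),
                              sum(p[1] for p in members) / len(members))
    return labels

# (velocity in um/s, linearity in %) for six hypothetical cells:
cells = [(140, 35), (150, 30), (145, 40), (60, 85), (55, 90), (65, 80)]
labels = kmeans(cells, k=2)
# Fast non-linear swimmers and slow linear swimmers form two groups.
print(labels)
```

Tracking how cluster membership shifts between handling steps (fresh, washed, cooled, thawed) is the kind of comparison the study draws from its richer descriptor set.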

  20. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    NASA Astrophysics Data System (ADS)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy, and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
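The kind of analysis described, finding resource allocation plans that comply with a security policy, can be sketched by brute-force enumeration on a toy process. The tasks, users, and separation-of-duty rule below are hypothetical stand-ins for the paper's action-language specification and CCALC machinery.

```python
# Sketch: brute-force search for task-to-user assignment plans that
# satisfy an authorization policy with a separation-of-duty rule.
# A toy stand-in for the action-language/CCALC analysis above;
# tasks, users, and the rule are hypothetical.
from itertools import product

tasks = ["request", "approve", "execute"]
authorized = {                      # which users may perform each task
    "request": ["alice", "bob"],
    "approve": ["bob", "carol"],
    "execute": ["alice", "carol"],
}

def separation_of_duty(plan):
    # No single user may both request and approve.
    return plan["request"] != plan["approve"]

# Enumerate every authorized assignment and keep the compliant ones.
plans = [dict(zip(tasks, combo))
         for combo in product(*(authorized[t] for t in tasks))]
valid = [p for p in plans if separation_of_duty(p)]
print(len(valid))  # -> 6 of the 8 authorized plans comply
```

Real business processes make this search space explode, which is why the paper delegates it to a dedicated causal reasoner rather than enumeration.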

  1. Development, upscaling and validation of the purification process for human-cl rhFVIII (Nuwiq®), a new generation recombinant factor VIII produced in a human cell-line.

    PubMed

    Winge, Stefan; Yderland, Louise; Kannicht, Christoph; Hermans, Pim; Adema, Simon; Schmidt, Torben; Gilljam, Gustav; Linhult, Martin; Tiemeyer, Maya; Belyanskaya, Larisa; Walter, Olaf

    2015-11-01

    Human-cl rhFVIII (Nuwiq®), a new generation recombinant factor VIII (rFVIII), is the first rFVIII produced in a human cell-line approved by the European Medicines Agency. Our aim was to describe the development, upscaling and process validation of industrial-scale human-cl rhFVIII purification. The purification process involves one centrifugation step, two filtration steps, five chromatography columns and two dedicated pathogen clearance steps (solvent/detergent treatment and 20 nm nanofiltration). The key purification step uses an affinity resin (VIIISelect) with high specificity for FVIII, removing essentially all host-cell proteins with >80% product recovery. The production-scale multi-step purification process efficiently removes process- and product-related impurities and results in a high-purity rhFVIII product, with an overall yield of ∼50%. Specific activity of the final product was >9000 IU/mg, and the ratio between active FVIII and total FVIII protein present was >0.9. The entire production process is free of animal-derived products. Leaching of potentially harmful compounds from chromatography resins and all pathogens tested were below the limit of quantification in the final product. Human-cl rhFVIII can be produced at 500 L bioreactor scale, maintaining high purity and recoveries. The innovative purification process ensures a high-purity and high-quality human-cl rhFVIII product with a high pathogen safety margin. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  2. DEVELOPMENT OF PERMANENT MECHANICAL REPAIR SLEEVE FOR PLASTIC PIPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hitesh Patadia

    2005-04-29

    The report presents a comprehensive summary of the prototype development process used to develop a permanent mechanical repair fitting intended to be installed on damaged PE mains under blowing gas conditions. Specifically, the step-by-step construction approach is presented, along with experimental data supporting the mitigation of ensuing slow crack growth (SCG) in the damaged area.

  3. Cross-species infection trials reveal cryptic parasite varieties and a putative polymorphism shared among host species.

    PubMed

    Luijckx, Pepijn; Duneau, David; Andras, Jason P; Ebert, Dieter

    2014-02-01

    A parasite's host range can have important consequences for ecological and evolutionary processes but can be difficult to infer. Successful infection depends on the outcome of multiple steps, and only some steps of the infection process may be critical in determining a parasite's host range. To test this hypothesis, we investigated the host range of the bacterium Pasteuria ramosa, a Daphnia parasite, and determined the parasite's success in different stages of the infection process. Multiple genotypes of Daphnia pulex, Daphnia longispina and Daphnia magna were tested with four Pasteuria genotypes using infection trials and an assay that determines the ability of the parasite to attach to the host's esophagus. We find that attachment is not specific to host species but is specific to host genotype. This may suggest that alleles at the locus controlling attachment are shared among host species that diverged 100 million years ago. However, in our trials, Pasteuria was never able to reproduce in nonnative host species, suggesting that Pasteuria infecting different host species are different varieties, each with a narrow host range. Our approach highlights the explanatory power of dissecting the steps of the infection process and resolves potentially conflicting reports on parasite host ranges. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.

  4. [Effect of excess ethanol on the growth of yeasts of the genus Candida during continuous cultivation].

    PubMed

    Shkidchenko, A N; Shul'ga, A V; Gurina, L V

    1988-01-01

    The effect of flow rates and a specific ethanol load on the growth of Candida utilis and Candida krusei was studied in the process of one-step and three-step cultivation. The productive capacity of fermenters and the economic coefficient of yeast biomass production were shown to depend on the ability of microbial populations to assimilate a certain quantity of a carbon substrate per unit time. When the specific ethanol load exceeds the optimal one, the respiratory activity of a population and the economic coefficient of growth decline, whereas the accumulation of metabolites in the culture broth increases. A steady state of biomass can be maintained in the process of continuous cultivation by inhibiting the yeast growth with an excess of ethanol.
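    The steady state described above, in which continuous cultivation balances growth against dilution, is conventionally captured by textbook Monod chemostat relations. The sketch below is illustrative only; the parameter values are hypothetical and not taken from the study.

```python
# Textbook Monod chemostat relations (illustrative values, not the paper's data).
mu_max = 0.45   # 1/h, maximum specific growth rate
Ks     = 0.3    # g/L, half-saturation constant for ethanol
Y      = 0.5    # g biomass per g ethanol (the "economic coefficient")
S0     = 10.0   # g/L, ethanol concentration in the feed

def steady_state(D):
    """At steady state the specific growth rate equals the dilution rate (mu = D)."""
    if D >= mu_max:
        return None                    # washout: the culture cannot grow fast enough
    S = Ks * D / (mu_max - D)          # residual substrate concentration
    X = Y * (S0 - S)                   # steady-state biomass concentration
    return S, X, D * X                 # D*X = volumetric productivity

for D in (0.1, 0.3, 0.44):             # increasing flow (dilution) rates
    print(D, steady_state(D))
```

As the dilution rate approaches mu_max, residual substrate rises and biomass falls, mirroring the abstract's observation that an excess ethanol load depresses the economic coefficient of growth.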

  5. Coincidence measurements following 2p photoionization in Mg

    NASA Astrophysics Data System (ADS)

    Sokell, E.; Bolognesi, P.; Safgren, S.; Avaldi, L.

    2014-04-01

    Triple Differential Cross-Section (TDCS) measurements have been made to investigate the 2p photoionization of magnesium. In the experiment the photoelectron and the L3-M1M1 Auger electron were detected in coincidence at four distinct photon energies from 7 to 40 eV above the 2p threshold. Auger decay is usually treated as a two-step process, where the intermediate single-hole state links the photoionization and decay processes. These measurements allow the investigation of the process as a function of excess energy, and specifically a test of the validity of the two-step model as the ionization threshold is approached.

  6. Kaizen: a process improvement model for the business of health care and perioperative nursing professionals.

    PubMed

    Tetteh, Hassan A

    2012-01-01

    Kaizen is a proven management technique that has a practical application for health care in the context of health care reform and the 2010 Institute of Medicine landmark report on the future of nursing. Compounded productivity is the unique benefit of kaizen, and its principles are change, efficiency, performance of key essential steps, and the elimination of waste through small and continuous process improvements. The kaizen model offers specific instruction for perioperative nurses to achieve process improvement in a five-step framework that includes teamwork, personal discipline, improved morale, quality circles, and suggestions for improvement. Published by Elsevier Inc.

  7. Faculty Governance and Outcomes Assessment: Compatible or Combustible?

    ERIC Educational Resources Information Center

    DeBoy, James L.; Monsilovich, Sally B.; DeBoy, Joanne R.

    2013-01-01

    This paper identifies the various factors driving the outcomes assessment movement, contrasts the old paradigm with the model now espoused by regional accreditors, discusses the six-step process of student outcomes assessment, emphasizes faculty ownership of the process to prevent administrative usurpation, and proposes specific strategies to…

  8. Six-sigma application in tire-manufacturing company: a case study

    NASA Astrophysics Data System (ADS)

    Gupta, Vikash; Jain, Rahul; Meena, M. L.; Dangayach, G. S.

    2017-09-01

    Globalization, technological advances, and rising customer demand are changing the way companies do business. To meet these challenges, the six-sigma define-measure-analyze-improve-control (DMAIC) method is popular and useful: it helps trim waste and generate potential improvements in process as well as service industries. In the current research, the DMAIC method was used to decrease the process variation of the bead splice, which was causing wastage of material. This six-sigma DMAIC research was initiated by problem identification through voice of the customer in the define step. The subsequent step consisted of gathering the specification data of the existing tire bead. This was followed by the analysis and improvement steps, where six-sigma quality tools such as the cause-effect diagram, statistical process control, and substantial analysis of the existing system were implemented for root cause identification and reduction in process variation. Process control charts were used for systematic observation and control of the process. Utilizing the DMAIC methodology, the standard deviation was decreased from 2.17 to 1.69. The process capability index (Cp) value was enhanced from 1.65 to 2.95 and the process performance capability index (Cpk) value was enhanced from 0.94 to 2.66. A DMAIC methodology was established that can play a key role in reducing defects in the tire-manufacturing process in India.
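    The capability indices quoted above follow the standard definitions Cp = (USL - LSL) / 6σ and Cpk = min(USL - μ, μ - LSL) / 3σ. A minimal sketch, using hypothetical specification limits (the paper's actual limits and means are not given in the abstract):

```python
def cp(usl, lsl, sigma):
    """Process capability: specification width over the six-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mu, sigma):
    """Capability accounting for centering: distance from the mean to the nearer limit."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical bead-splice specification: target 100 mm, limits +/- 15 mm.
usl, lsl = 115.0, 85.0
print("before:", cp(usl, lsl, sigma=2.17), cpk(usl, lsl, mu=97.0, sigma=2.17))
print("after: ", cp(usl, lsl, sigma=1.69), cpk(usl, lsl, mu=99.5, sigma=1.69))
```

Note that Cpk <= Cp always holds, with equality only for a perfectly centered process, which is why the paper reports both indices.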

  9. Video Processes in Teacher Education Programs; Scope, Techniques, and Assessment. Multi-State Teacher Education Project, Monograph III.

    ERIC Educational Resources Information Center

    Bosley, Howard E.; And Others

    "Video Processes Are Changing Teacher Education" by Howard Bosley (the first of five papers comprising this document) discusses the Multi-State Teacher Education Project (M-STEP) experimentation with media; it lists various uses of video processes, concentrating specifically on microteaching and the use of simulation and critical…

  10. The Impact of Metacognitive Strategies and Self-Regulating Processes of Solving Math Word Problems

    ERIC Educational Resources Information Center

    Vula, Eda; Avdyli, Rrezarta; Berisha, Valbona; Saqipi, Blerim; Elezi, Shpetim

    2017-01-01

    This empirical study investigates the impact of metacognitive strategies and self-regulating processes in learners' achievement on solving math word problems. It specifically analyzes the impact of the linguistic factor and the number of steps and arithmetical operations that learners need to apply during the process of solving math word problems.…

  11. Expected values for pedometer-determined physical activity in older populations

    PubMed Central

    2009-01-01

    The purpose of this review is to update expected values for pedometer-determined physical activity in free-living healthy older populations. A search of the literature published since 2001 began with a keyword (pedometer, "step counter," "step activity monitor" or "accelerometer AND steps/day") search of PubMed, Cumulative Index to Nursing & Allied Health Literature (CINAHL), SportDiscus, and PsychInfo. An iterative process was then undertaken to abstract and verify studies of pedometer-determined physical activity (captured in terms of steps taken; distance only was not accepted) in free-living adult populations described as ≥ 50 years of age (studies that included samples which spanned this threshold were not included unless they provided at least some appropriately age-stratified data) and not specifically recruited based on any chronic disease or disability. We identified 28 studies representing at least 1,343 males and 3,098 females ranging in age from 50–94 years. Eighteen (or 64%) of the studies clearly identified using a Yamax pedometer model. Monitoring frames ranged from 3 days to 1 year; the modal length of time was 7 days (17 studies, or 61%). Mean pedometer-determined physical activity ranged from 2,015 steps/day to 8,938 steps/day. In those studies reporting such data, consistent patterns emerged: males generally took more steps/day than similarly aged females, steps/day decreased across study-specific age groupings, and BMI-defined normal weight individuals took more steps/day than overweight/obese older adults. The range of 2,000–9,000 steps/day likely reflects the true variability of physical activity behaviors in older populations. More explicit patterns, for example sex- and age-specific relationships, remain to be informed by future research endeavors. PMID:19706192

  12. Storybridging: Four steps for constructing effective health narratives

    PubMed Central

    Boeijinga, Anniek; Hoeken, Hans; Sanders, José

    2017-01-01

    Objective: To develop a practical step-by-step approach to constructing narrative health interventions in response to the mixed results and wide diversity of narratives used in health-related narrative persuasion research. Method: Development work was guided by essential narrative characteristics as well as principles enshrined in the Health Action Process Approach. Results: The 'storybridging' method for constructing health narratives consists of four concrete steps: (a) identifying the stage of change, (b) identifying the key elements, (c) building the story, and (d) pre-testing the story. These steps are illustrated by means of a case study in which an effective narrative health intervention was developed for Dutch truck drivers: a high-risk, underprivileged occupational group. Conclusion: Although time- and labour-intensive, the storybridging approach integrates the target audience as an important stakeholder throughout the development process. Implications and recommendations are provided for health promotion targeting truck drivers specifically and for constructing narrative health interventions in general. PMID:29276232

  13. The Design Process of Corporate Universities: A Stakeholder Approach

    ERIC Educational Resources Information Center

    Patrucco, Andrea Stefano; Pellizzoni, Elena; Buganza, Tommaso

    2017-01-01

    Purpose: Corporate universities (CUs) have been experiencing tremendous growth during the past years and can represent a real driving force for structured organizations. The paper aims to define the process of CU design shaped around company strategy. For each step, the authors propose specific roles, activities and methods.…

  14. 36 CFR 800.3 - Initiation of the section 106 process.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Religious Freedom Act, the Archeological Resources Protection Act, and agency-specific legislation, such as... the next step in the process based on the finding or determination or consult with the Council in lieu... as the consulting parties and the public have an adequate opportunity to express their views as...

  15. Purification of anti-Japanese encephalitis virus monoclonal antibody by ceramic hydroxyapatite chromatography without proteins A and G.

    PubMed

    Saito, Maiko; Kurosawa, Yae; Okuyama, Tsuneo

    2012-02-01

    Antibody purification using proteins A and G has been a standard method for research and industrial processes. The conventional method, however, includes a three-step process, including buffer exchange, before chromatography. In addition, proteins A and G require low-pH elution, which causes antibody aggregation and loss of the antibody's immunoreactivity. This report proposes a two-step method using hydroxyapatite chromatography and membrane filtration, without proteins A and G. This novel method shortens the running time of each cycle to one-third that of the conventional method. Using our two-step method, 90.2% of the purified monoclonal antibodies were recovered in the elution fraction, the purity achieved was >90%, and most of the antigen-specific activity was retained. This report suggests that the two-step method using hydroxyapatite chromatography and membrane filtration should be considered as an alternative to purification using proteins A and G.

  16. Perspective: Optical measurement of feature dimensions and shapes by scatterometry

    NASA Astrophysics Data System (ADS)

    Diebold, Alain C.; Antonelli, Andy; Keller, Nick

    2018-05-01

    The use of optical scattering to measure feature shape and dimensions, scatterometry, is now routine during semiconductor manufacturing. Scatterometry iteratively improves an optical model structure using simulations that are compared to experimental data from an ellipsometer. These simulations are performed using rigorous coupled wave analysis (RCWA) to solve Maxwell's equations. In this article, we describe Mueller matrix spectroscopic ellipsometry-based scatterometry. Next, the rigorous coupled wave analysis for Maxwell's equations is presented. Following this, several example measurements are described as they apply to specific process steps in the fabrication of gate-all-around (GAA) transistor structures. First, simulations of measurement sensitivity for the inner spacer etch-back step of horizontal GAA transistor processing are described. Next, the simulated metrology sensitivity for the sacrificial (dummy) amorphous silicon etch-back step of vertical GAA transistor processing is discussed. Finally, we present the application of plasmonically active test structures for improving the sensitivity of metal linewidth measurements.

  17. Good pharmacovigilance practices: technology enabled.

    PubMed

    Nelson, Robert C; Palsulich, Bruce; Gogolak, Victor

    2002-01-01

    The assessment of spontaneous reports is most effective when it is conducted within a defined and rigorous process. The framework for good pharmacovigilance process (GPVP) is proposed as a subset of good postmarketing surveillance process (GPMSP), a functional structure for both a public health and a corporate risk management strategy. GPVP comprises good practices that implement each step within a defined process. These practices are designed to efficiently and effectively detect and alert the drug safety professional to new and potentially important information on drug-associated adverse reactions. They are enabled by technology designed specifically for the review and assessment of spontaneous reports. Specific practices include rules-based triage, active query prompts for severe organ insults, contextual single-case evaluation, statistical proportionality and correlational checks, case-series analyses, and templates for signal work-up and interpretation. These practices and the overall GPVP are supported by state-of-the-art web-based systems with powerful analytical engines, workflow, and audit trails that allow validated systems support for valid drug safety signalling efforts. It is also important to understand that a process has a defined set of steps, no one of which can stand independently. Specifically, advanced use of technical alerting methods in isolation can mislead and allow one to misjudge priorities and relative value. In the end, pharmacovigilance is a clinical art and a component process of the science of pharmacoepidemiology and risk management.
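    The statistical proportionality checks mentioned above are commonly implemented as disproportionality measures such as the proportional reporting ratio (PRR). The sketch below shows the standard PRR formula on invented counts; it is not the specific implementation described in the paper.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table of reports:
       a: drug of interest, reaction of interest
       b: drug of interest, all other reactions
       c: all other drugs, reaction of interest
       d: all other drugs, all other reactions"""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: 30 reports of the reaction among 1,030 for the drug,
# versus 120 among 100,120 for all other drugs in the database.
signal = prr(30, 1000, 120, 100000)
print(f"PRR = {signal:.1f}")   # a common screening threshold is PRR >= 2
```

As the abstract cautions, such alerting statistics support but do not replace clinical case evaluation.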

  18. GeoSegmenter: A statistically learned Chinese word segmenter for the geoscience domain

    NASA Astrophysics Data System (ADS)

    Huang, Lan; Du, Youfu; Chen, Gongyang

    2015-03-01

    Unlike English, the Chinese language has no space between words. Segmenting texts into words, known as the Chinese word segmentation (CWS) problem, thus becomes a fundamental issue for processing Chinese documents and the first step in many text mining applications, including information retrieval, machine translation and knowledge acquisition. However, for the geoscience subject domain, the CWS problem remains unsolved. Although a generic segmenter can be applied to geoscience documents, it lacks domain-specific knowledge and consequently its segmentation accuracy drops dramatically. This motivated us to develop a segmenter specifically for the geoscience subject domain: the GeoSegmenter. We first proposed a generic two-step framework for domain-specific CWS. Following this framework, we built GeoSegmenter using conditional random fields, a principled statistical framework for sequence learning. Specifically, GeoSegmenter first identifies general terms by using a generic baseline segmenter. Then it recognises geoscience terms by learning and applying a model that can transform the initial segmentation into the goal segmentation. Empirical experimental results on geoscience documents and benchmark datasets showed that GeoSegmenter could effectively recognise both geoscience terms and general terms.
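    A CRF-based segmenter of this kind typically labels each character with a positional tag (B = begin, M = middle, E = end, S = single-character word) and then decodes the tag sequence into words. The decoding step can be sketched as follows; this is an illustrative reconstruction, not GeoSegmenter's actual code.

```python
def tags_to_words(chars, tags):
    """Convert per-character BMES tags into a word list, as a CRF-based
    segmenter does after labeling each character in the sentence."""
    words, current = [], []
    for ch, tag in zip(chars, tags):
        current.append(ch)
        if tag in ("E", "S"):          # a word boundary has been reached
            words.append("".join(current))
            current = []
    if current:                         # tolerate a truncated tag sequence
        words.append("".join(current))
    return words

# "花岗岩形成" (granite / forms): "花岗岩" tagged B M E, "形成" tagged B E.
print(tags_to_words("花岗岩形成", ["B", "M", "E", "B", "E"]))
```

The CRF's job is to predict the tag sequence; once tags are assigned, segmentation reduces to this simple boundary decoding.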

  19. A Microwell-Printing Fabrication Strategy for the On-Chip Templated Biosynthesis of Protein Microarrays for Surface Plasmon Resonance Imaging

    PubMed Central

    Manuel, Gerald; Lupták, Andrej; Corn, Robert M.

    2017-01-01

    A two-step templated, ribosomal biosynthesis/printing method for the fabrication of protein microarrays for surface plasmon resonance imaging (SPRI) measurements is demonstrated. In the first step, a sixteen-component microarray of proteins is created in microwells by cell-free, on-chip protein synthesis; each microwell contains both an in vitro transcription and translation (IVTT) solution and 350 femtomoles of a specific DNA template sequence that together are used to create approximately 40 picomoles of a specific hexahistidine-tagged protein. In the second step, the protein microwell array is used to contact print one or more protein microarrays onto nitrilotriacetic acid (NTA)-functionalized gold thin film SPRI chips for real-time SPRI surface bioaffinity adsorption measurements. Even though each microwell array element only contains approximately 40 picomoles of protein, the concentration is sufficiently high for the efficient bioaffinity adsorption and capture of the approximately 100 femtomoles of hexahistidine-tagged protein required to create each SPRI microarray element. As a first example, the protein biosynthesis process is verified with fluorescence imaging measurements of a microwell array containing His-tagged green fluorescent protein (GFP), yellow fluorescent protein (YFP) and mCherry (RFP), and then the fidelity of SPRI chips printed from this protein microwell array is ascertained by measuring the real-time adsorption of various antibodies specific to these three structurally related proteins. This greatly simplified two-step synthesis/printing fabrication methodology eliminates most of the handling, purification and processing steps normally required in the synthesis of multiple protein probes, and enables the rapid fabrication of SPRI protein microarrays from DNA templates for the study of protein-protein bioaffinity interactions. PMID:28706572

  20. Modeling Growth of Nanostructures in Plasmas

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.

    2004-01-01

    As semiconductor circuits shrink to critical dimensions (CDs) below 0.1 μm, it is becoming increasingly critical to replace and/or enhance existing technology with nanoscale structures, such as nanowires for interconnects. Nanowires grown in plasmas depend strongly on processing conditions, such as gas composition and substrate temperature. Growth occurs at specific sites, or step-edges, with the bulk growth rate of the nanowires determined from the equation of motion of the nucleating crystalline steps. Traditional front-tracking algorithms, such as string-based or level-set methods, suffer either from numerical complications in higher spatial dimensions or from difficulties in incorporating surface-intense physical and chemical phenomena. Phase field models have the robustness of the level-set method combined with the ability to implement the surface-specific chemistry required to model crystal growth, although they do not necessarily solve directly for the advancing front location. We have adopted a phase field approach and will present results for the adatom density and step-growth location in time as a function of processing conditions, such as temperature and plasma gas composition.
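    A phase field model of the kind the authors adopt evolves a smooth order parameter whose level surfaces implicitly track the advancing front. The minimal 1-D Allen-Cahn relaxation below illustrates the idea only; it is not the authors' model, and all parameters are invented.

```python
import numpy as np

# Minimal 1-D Allen-Cahn phase-field sketch: phi ~ 1 on the grown region,
# phi ~ 0 ahead of the step edge. A sharp initial front relaxes into a smooth
# diffuse interface; a full model would couple its motion to the adatom density.
n, dx, dt, M, eps2 = 200, 1.0, 0.1, 1.0, 2.0
x = np.arange(n) * dx
phi = (x < n * dx / 2).astype(float)       # sharp initial step edge

def dW(p):
    """Derivative of the double-well potential W(p) = p^2 (1 - p)^2."""
    return 2 * p * (1 - p) * (1 - 2 * p)

for _ in range(2000):
    # Explicit Euler update of phi_t = M * (eps2 * phi_xx - W'(phi)),
    # with periodic boundaries via np.roll.
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    phi += dt * M * (eps2 * lap - dW(phi))

print(phi.min(), phi.max())                # bulk phases stay near 0 and 1
```

The double-well term keeps the two phases near 0 and 1 while the gradient term gives the interface a finite width, which is what makes the method robust where explicit front tracking struggles.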

  1. A Flow Adhesion Assay to Study Leucocyte Recruitment to Human Hepatic Sinusoidal Endothelium Under Conditions of Shear Stress

    PubMed Central

    Shetty, Shishir; Weston, Christopher J.; Adams, David H.; Lalor, Patricia F.

    2014-01-01

    Leucocyte infiltration into human liver tissue is a common process in all adult inflammatory liver diseases. Chronic infiltration can drive the development of fibrosis and progression to cirrhosis. Understanding the molecular mechanisms that mediate leucocyte recruitment to the liver could identify important therapeutic targets for liver disease. The key interaction during leucocyte recruitment is that of inflammatory cells with endothelium under conditions of shear stress. Recruitment to the liver occurs within the low shear channels of the hepatic sinusoids which are lined by hepatic sinusoidal endothelial cells (HSEC). The conditions within the hepatic sinusoids can be recapitulated by perfusing leucocytes through channels lined by human HSEC monolayers at specific flow rates. In these conditions leucocytes undergo a brief tethering step followed by activation and firm adhesion, followed by a crawling step and subsequent transmigration across the endothelial layer. Using phase contrast microscopy, each step of this 'adhesion cascade' can be visualized and recorded followed by offline analysis. Endothelial cells or leucocytes can be pretreated with inhibitors to determine the role of specific molecules during this process. PMID:24686418

  2. Indicator Systems and Evaluation

    NASA Technical Reports Server (NTRS)

    Canright, Shelley; Grabowski, Barbara

    1995-01-01

    Participants in the workshop session were actively engaged in a hands-on, minds-on approach to learning about indicators and evaluation processes. The six-hour session was broken down into three two-hour sessions, each built upon an instructional model which moved from general understanding to specific IITA application. Examples and practice exercises served to demonstrate and reinforce the workshop concepts. Each successive session built upon the previous session and addressed the major steps in the evaluation process: project descriptions, writing goals and objectives for categories, determining indicators and indicator systems for specific projects, and methods and issues of data collection. The workshop served as a baseline upon which the field centers will build during the summer in undertaking a comprehensive examination and evaluation of their existing K-12 education projects.

  3. Specifications of a Simulation Model for a Local Area Network Design in Support of a Stock Point Logistics Integrated Communication Environment (SPLICE).

    DTIC Science & Technology

    1983-06-01

    constrained at each step. Use of discrete simulation can be a powerful tool in this process if its role is carefully planned. The gross behavior of the...by projecting: the arrival of units of work at SPLICE processing facilities (workload analysis); the amount of processing resources consumed in

  4. Groundwater Circulating Well Assessment and Guidance

    DTIC Science & Technology

    1998-04-03

    3.1 Decision Tree and Process Description...two GCW systems placed close enough to affect each other significantly (Herding et al., 1994). This type of well spacing may be required to...3.1 Decision Tree and Process Description: The process for screening the GCW technology is a logical sequence of steps during which site-specific

  5. Fabrication and characterization of resonant SOI micromechanical silicon sensors based on DRIE micromachining, freestanding release process and silicon direct bonding

    NASA Astrophysics Data System (ADS)

    Gigan, Olivier; Chen, Hua; Robert, Olivier; Renard, Stephane; Marty, Frederic

    2002-11-01

    This paper is dedicated to the fabrication and technological aspects of a silicon microresonator sensor. The entire project includes the fabrication processes, the system modelling/simulation, and the electronic interface. The mechanical model of such a resonator is presented, including a description of the frequency stability and hysteresis behaviour of the electrostatically driven resonator. A numerical model and FEM simulations are used to simulate the system's dynamic behaviour. The complete fabrication process is based on standard microelectronics technology with specific MEMS technological steps. The key steps are described: micromachining on SOI by Deep Reactive Ion Etching (DRIE), specific release processes to prevent sticking (resist and HF-vapour release processes) and collective vacuum encapsulation by Silicon Direct Bonding (SDB). The complete process has been validated and prototypes have been fabricated. An ASIC was designed to interface the sensor and to control the vibration amplitude. This electronic interface was simulated and designed to work up to 200°C and implemented in a standard 0.6 μm CMOS technology. Sensor prototypes were characterized both mechanically and electrostatically; these measurements showed good agreement with theory and FEM simulations.

  6. Dimming LEDs with Phase-Cut Dimmers: The Specifier's Process for Maximizing Success

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Naomi J.; Poplawski, Michael E.

    2013-10-01

    This report reviews how phase-cut dimmers work, how LEDs differ from the incandescent lamps that the dimmers were historically designed to control, and how these differences can lead to complications when trying to dim LEDs. Compatibility between a specific LED source and a specific phase-cut dimmer is often unknown and difficult to assess, and ensuring compatibility adds complexity to the design, specification, bidding, and construction observation phases for new buildings and major remodel projects. To maximize project success, this report provides both general guidance and step-by-step procedures for designing phase-controlled LED dimming on both new and existing projects, as well as real-world examples of how to use those procedures.

  7. Virus elimination during the purification of monoclonal antibodies by column chromatography and additional steps.

    PubMed

    Roberts, Peter L

    2014-01-01

    The theoretical potential for virus transmission by monoclonal antibody-based therapeutic products has led to the inclusion of appropriate virus reduction steps. In this study, virus elimination by the chromatographic steps used during the purification process for two (IgG-1 and -3) monoclonal antibodies (MAbs) was investigated. Both the Protein G (>7 log) and ion-exchange (5 log) chromatography steps were very effective at eliminating both enveloped and non-enveloped viruses over the lifetime of the chromatographic gel. However, the contribution made by the final gel filtration step was more limited, i.e., 3 log. Because these chromatographic columns were recycled between uses, the effectiveness of the column sanitization procedures (guanidinium chloride for Protein G or NaOH for ion exchange) was tested. By evaluating standard column runs immediately after each virus-spiked run, it was possible to confirm directly that there was no cross-contamination with virus between column runs. To further ensure the virus safety of the product, two specific virus elimination steps were also included in the process. A solvent/detergent step based on 1% Triton X-100 rapidly inactivated a range of enveloped viruses, achieving >6 log inactivation within 1 min of the 60 min treatment time. Virus removal by the virus filtration step was also confirmed to be effective for viruses of about 50 nm or greater. In conclusion, the combination of these multiple steps ensures a high margin of virus safety for this purification process. © 2014 American Institute of Chemical Engineers.
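    The log reduction values quoted above combine multiplicatively, which is why they are reported and summed on a log10 scale. A minimal sketch with hypothetical spiking-study titers (the actual titers are not given in the abstract):

```python
import math

def log_reduction(titer_in, titer_out):
    """Log10 reduction factor for one process step (titers in infectious units/mL)."""
    return math.log10(titer_in / titer_out)

# Hypothetical spiking-study titers for steps like those described above.
steps = {
    "Protein G chromatography": (1e8, 1e1),   # ~7 log
    "Ion exchange":             (1e7, 1e2),   # ~5 log
    "Gel filtration":           (1e6, 1e3),   # ~3 log
    "Solvent/detergent":        (1e7, 1e1),   # ~6 log
}
total = sum(log_reduction(a, b) for a, b in steps.values())
print(f"cumulative reduction: {total:.0f} log10")
```

Summing log factors across orthogonal steps is the conventional way to express the overall safety margin of a purification process.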

  8. Kinetic analysis of overlapping multistep thermal decomposition comprising exothermic and endothermic processes: thermolysis of ammonium dinitramide.

    PubMed

    Muravyev, Nikita V; Koga, Nobuyoshi; Meerov, Dmitry B; Pivkina, Alla N

    2017-01-25

    This study focused on kinetic modeling of a specific type of multistep heterogeneous reaction comprising exothermic and endothermic reaction steps, as exemplified by the practical kinetic analysis of the experimental kinetic curves for the thermal decomposition of molten ammonium dinitramide (ADN). It is known that the thermal decomposition of ADN occurs as a consecutive two-step mass-loss process comprising the decomposition of ADN and subsequent evaporation/decomposition of in situ generated ammonium nitrate. These reaction steps provide exothermic and endothermic contributions, respectively, to the overall thermal effect. The overall reaction process was deconvoluted into two reaction steps using simultaneously recorded thermogravimetry and differential scanning calorimetry (TG-DSC) curves by considering the different physical meanings of the kinetic data derived from TG and DSC by P value analysis. The kinetic data thus separated into exothermic and endothermic reaction steps were kinetically characterized using kinetic computation methods, including the isoconversional method, combined kinetic analysis, and the master plot method. The overall kinetic behavior was reproduced as the sum of the kinetic equations for each reaction step considering the contributions to the rate data derived from TG and DSC. During reproduction of the kinetic behavior, the kinetic parameters and contributions of each reaction step were optimized using kinetic deconvolution analysis. As a result, the thermal decomposition of ADN was successfully modeled as partially overlapping exothermic and endothermic reaction steps. The logic of the kinetic modeling was critically examined, and the practical usefulness of phenomenological modeling for the thermal decomposition of ADN was illustrated to demonstrate the validity of the methodology and its applicability to similar complex reaction processes.
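
    The core of such a kinetic deconvolution is writing the overall rate as a contribution-weighted sum of independent step rates, each with its own Arrhenius constant and conversion function. A minimal sketch of that structure (all parameter values below are illustrative placeholders, not fitted values from the paper, and a simple first-order f(α) = 1 − α is assumed for both steps):

    ```python
    import math

    R = 8.314  # gas constant, J mol^-1 K^-1

    def step_rate(alpha, T, A, Ea):
        """Rate of one step: Arrhenius k(T) times a first-order f(alpha)."""
        return A * math.exp(-Ea / (R * T)) * (1.0 - alpha)

    def overall_rate(a1, a2, T, c1=0.6, c2=0.4):
        """Overall rate as the contribution-weighted sum of the two steps."""
        r1 = step_rate(a1, T, A=1e12, Ea=120e3)  # exothermic decomposition step
        r2 = step_rate(a2, T, A=1e10, Ea=110e3)  # endothermic evaporation step
        return c1 * r1 + c2 * r2

    print(overall_rate(0.3, 0.1, 450.0))
    ```

    In the actual analysis, c1, c2 and the kinetic parameters of each step are optimized so that this sum reproduces the measured TG and DSC rate curves.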

  9. The Influence of Task Complexity on Knee Joint Kinetics Following ACL Reconstruction

    PubMed Central

    Schroeder, Megan J.; Krishnan, Chandramouli; Dhaher, Yasin Y.

    2015-01-01

    Background Previous research indicates that subjects with anterior cruciate ligament reconstruction exhibit abnormal knee joint movement patterns during functional activities like walking. While the sagittal plane mechanics have been studied extensively, less is known about the secondary planes, specifically with regard to more demanding tasks. This study explored the influence of task complexity on functional joint mechanics in the context of graft-specific surgeries. Methods In 25 participants (10 hamstring tendon graft, 6 patellar tendon graft, 9 matched controls), three-dimensional joint torques were calculated using a standard inverse dynamics approach during level walking and stair descent. The stair descent task was separated into two functionally different sub-tasks—step-to-floor and step-to-step. The differences in external knee moment profiles were compared between groups; paired differences between the reconstructed and non-reconstructed knees were also assessed. Findings The reconstructed knees, irrespective of graft type, typically exhibited significantly lower peak knee flexion moments compared to control knees during stair descent, with the differences more pronounced in the step-to-step task. Frontal plane adduction torque deficits were graft-specific and limited to the hamstring tendon knees during the step-to-step task. Internal rotation torque deficits were also primarily limited to the hamstring tendon graft group during stair descent. Collectively, these results suggest that task complexity was a primary driver of differences in joint mechanics between anterior cruciate ligament reconstructed individuals and controls, and such differences were more pronounced in individuals with hamstring tendon grafts. 
Interpretation The mechanical environment experienced in the cartilage during repetitive, cyclical tasks such as walking and other activities of daily living has been argued to contribute to the development of degenerative changes to the joint and ultimately osteoarthritis. Given the task-specific and graft-specific differences in joint mechanics detected in this study, care should be taken during the rehabilitation process to mitigate these changes. PMID:26101055

  10. Phase 2 of the array automated assembly task for the low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Petersen, R. C.

    1980-01-01

    Studies were conducted on several fundamental aspects of electroless nickel/solder metallization for silicon solar cells. A process, which precedes the electroless nickel plating with several steps of palladium plating and heat treatment, was compared directly with single step electroless nickel plating. Work was directed toward answering specific questions concerning the effect of silicon surface oxide on nickel plating, effects of thermal stresses on the metallization, sintering of nickel plated on silicon, and effects of exposure to the plating solution on solar cell characteristics. The process was found to be extremely lengthy and cumbersome, and was also found to produce a product virtually identical to that produced by single step electroless nickel plating, as shown by adhesion tests and electrical characteristics of cells under illumination.

  11. B827 Chemical Synthesis Project - Industrial Control System Integration - Statement of Work & Specification with Attachments 1-14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wade, F. E.

    The Chemical Synthesis Pilot Process at the Lawrence Livermore National Laboratory (LLNL) Site 300 827 Complex will be used to synthesize small quantities of material to support research and development. The project will modernize and increase current capabilities for chemical synthesis at LLNL. The primary objective of this project is the conversion of a non-automated hands-on process to a remote-operation process, while providing enhanced batch process step control, stored recipe-specific parameter sets, process variable visibility, monitoring, alarm and warning handling, and comprehensive batch record data logging. This Statement of Work and Specification provides the industrial-grade process control requirements for the chemical synthesis batching control system, hereafter referred to as the “Control System”, to be delivered by the System Integrator.

  12. Noise Enhances Action Potential Generation in Mouse Sensory Neurons via Stochastic Resonance.

    PubMed

    Onorato, Irene; D'Alessandro, Giuseppina; Di Castro, Maria Amalia; Renzi, Massimiliano; Dobrowolny, Gabriella; Musarò, Antonio; Salvetti, Marco; Limatola, Cristina; Crisanti, Andrea; Grassi, Francesca

    2016-01-01

    Noise can enhance perception of tactile and proprioceptive stimuli by stochastic resonance processes. However, the mechanisms underlying this general phenomenon remain to be characterized. Here we studied how externally applied noise influences action potential firing in mouse primary sensory neurons of dorsal root ganglia, modelling a basic process in sensory perception. Since noisy mechanical stimuli may cause stochastic fluctuations in receptor potential, we examined the effects of sub-threshold depolarizing current steps with superimposed random fluctuations. We performed whole cell patch clamp recordings in cultured neurons of mouse dorsal root ganglia. Noise was added either before and during the step, or during the depolarizing step only, to focus onto the specific effects of external noise on action potential generation. In both cases, step + noise stimuli triggered significantly more action potentials than steps alone. The normalized power norm had a clear peak at intermediate noise levels, demonstrating that the phenomenon is driven by stochastic resonance. Spikes evoked in step + noise trials occur earlier and show faster rise time as compared to the occasional ones elicited by steps alone. These data suggest that external noise enhances, via stochastic resonance, the recruitment of transient voltage-gated Na channels, responsible for action potential firing in response to rapid step-wise depolarizing currents.
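
    The stimulation logic described above can be sketched with a toy leaky integrate-and-fire neuron: a current step that is sub-threshold on its own, with or without superimposed Gaussian noise. This is an illustration of the stochastic-resonance effect, not the authors' recording protocol, and all parameter values are invented:

    ```python
    import random

    def count_spikes(i_step, noise_sd, seed=0, n_steps=5000, dt=0.1,
                     tau=10.0, v_thresh=1.0):
        """Count threshold crossings of a leaky integrator driven by a
        current step with optional Gaussian noise superimposed."""
        rng = random.Random(seed)
        v, spikes = 0.0, 0
        for _ in range(n_steps):
            i = i_step + rng.gauss(0.0, noise_sd)
            v += dt * (-v / tau + i)   # leaky integration of the input
            if v >= v_thresh:
                spikes += 1
                v = 0.0                # reset after an action potential
        return spikes

    quiet = count_spikes(0.08, 0.0)  # step alone: settles below threshold, 0 spikes
    noisy = count_spikes(0.08, 0.5)  # step + noise: fluctuations cross threshold
    print(quiet, noisy)
    ```

    With these parameters the step alone drives the membrane toward 0.8, below the threshold of 1.0, so only the noisy trials fire, mirroring the step + noise advantage reported above.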

  13. Noise Enhances Action Potential Generation in Mouse Sensory Neurons via Stochastic Resonance

    PubMed Central

    Onorato, Irene; D'Alessandro, Giuseppina; Di Castro, Maria Amalia; Renzi, Massimiliano; Dobrowolny, Gabriella; Musarò, Antonio; Salvetti, Marco; Limatola, Cristina; Crisanti, Andrea; Grassi, Francesca

    2016-01-01

    Noise can enhance perception of tactile and proprioceptive stimuli by stochastic resonance processes. However, the mechanisms underlying this general phenomenon remain to be characterized. Here we studied how externally applied noise influences action potential firing in mouse primary sensory neurons of dorsal root ganglia, modelling a basic process in sensory perception. Since noisy mechanical stimuli may cause stochastic fluctuations in receptor potential, we examined the effects of sub-threshold depolarizing current steps with superimposed random fluctuations. We performed whole cell patch clamp recordings in cultured neurons of mouse dorsal root ganglia. Noise was added either before and during the step, or during the depolarizing step only, to focus onto the specific effects of external noise on action potential generation. In both cases, step + noise stimuli triggered significantly more action potentials than steps alone. The normalized power norm had a clear peak at intermediate noise levels, demonstrating that the phenomenon is driven by stochastic resonance. Spikes evoked in step + noise trials occur earlier and show faster rise time as compared to the occasional ones elicited by steps alone. These data suggest that external noise enhances, via stochastic resonance, the recruitment of transient voltage-gated Na channels, responsible for action potential firing in response to rapid step-wise depolarizing currents. PMID:27525414

  14. Long-term memory-based control of attention in multi-step tasks requires working memory: evidence from domain-specific interference

    PubMed Central

    Foerster, Rebecca M.; Carbone, Elena; Schneider, Werner X.

    2014-01-01

    Evidence for long-term memory (LTM)-based control of attention has been found during the execution of highly practiced multi-step tasks. However, does LTM directly control attention, or are working memory (WM) processes involved? In the present study, this question was investigated with a dual-task paradigm. Participants executed either a highly practiced visuospatial sensorimotor task (speed stacking) or a verbal task (high-speed poem reciting), while maintaining visuospatial or verbal information in WM. Results revealed unidirectional and domain-specific interference. Neither speed stacking nor high-speed poem reciting was influenced by WM retention. Stacking disrupted the retention of visuospatial locations, but did not modify memory performance of verbal material (letters). Reciting reduced the retention of verbal material substantially whereas it affected the memory performance of visuospatial locations to a smaller degree. We suggest that the selection of task-relevant information from LTM for the execution of overlearned multi-step tasks recruits domain-specific WM. PMID:24847304

  15. A Process for Developing and Articulating Learning Goals or Competencies for Social and Emotional Learning

    ERIC Educational Resources Information Center

    Yoder, Nick; Dusenbury, Linda

    2017-01-01

    Developing and articulating clear goals for student social and emotional learning (SEL) involves a number of important steps. This document focuses specifically on the articulation of learning goals (sometimes called "competencies" or "standards" in state and district policy) and suggests a process for those state teams that…

  16. Transforming English Language Learners' Work Readiness: Case Studies in Explicit, Work-Specific Vocabulary Instruction

    ERIC Educational Resources Information Center

    Madrigal-Hopes, Diana L.; Villavicencio, Edna; Foote, Martha M.; Green, Chris

    2014-01-01

    This qualitative study examined the impact of a six-step framework for work-specific vocabulary instruction in adult English language learners (ELLs). Guided by research in English as a second language (ESL) methodology and the transactional theory, the researchers sought to unveil how these processes supported the acquisition and application of…

  17. Preview of the NASA NNWG NDE Sample Preparation Handbook

    NASA Technical Reports Server (NTRS)

    2010-01-01

    This viewgraph presentation provides step-by-step fabrication documentation, with photos and illustrations, for every kind of sample fabricated for MSFC by UA Huntsville. The presentation includes a tabulation of which samples are being fabricated for which NDE method, detailed instructions and documentation for the inclusion and creation of defects, detailed specifications for materials, processes, and equipment, case histories and experiences with the different fabrication methods and defect inclusion techniques, a discussion of pitfalls and difficulties associated with sample fabrication and defect inclusion, and a discussion of why certain fabrication techniques are needed for specific NDE methods.

  18. Effects of acute alcohol intoxication on automated processing: evidence from the double-step paradigm.

    PubMed

    Vorstius, Christian; Radach, Ralph; Lang, Alan R

    2012-02-01

    Reflexive and voluntary levels of processing have been studied extensively with respect to possible impairments due to alcohol intoxication. This study examined alcohol effects at the 'automated' level of processing essential to many complex visual processing tasks (e.g., reading, visual search) that involve ongoing modifications or reprogramming of well-practiced routines. Data from 30 participants (16 male) were collected in two counterbalanced sessions (alcohol vs. no-alcohol control; mean breath alcohol concentration = 68 mg/dL vs. 0 mg/dL). Eye movements were recorded during a double-step task where 75% of trials involved two target stimuli in rapid succession (inter-stimulus interval [ISI]=40, 70, or 100 ms) so that they could elicit two distinct saccades or eye movements (double steps). On 25% of trials a single target appeared. Results indicated that saccade latencies were longer under alcohol. In addition, the proportion of single-step responses and the mean saccade amplitude (length) of primary saccades decreased significantly with increasing ISI. The key novel finding, however, was that the reprogramming time needed to cancel the first saccade and adjust saccade amplitude was extended significantly by alcohol. The additional time made available by prolonged latencies due to alcohol was not utilized by the saccade programming system to decrease the number of two-step responses. These results represent the first demonstration of specific alcohol-induced programming deficits at the automated level of oculomotor processing.

  19. Sequential decision tree using the analytic hierarchy process for decision support in rectal cancer.

    PubMed

    Suner, Aslı; Çelikoğlu, Can Cengiz; Dicle, Oğuz; Sökmen, Selman

    2012-09-01

    The aim of the study is to determine the most appropriate method for construction of a sequential decision tree in the management of rectal cancer, using various patient-specific criteria and treatments such as surgery, chemotherapy, and radiotherapy. An analytic hierarchy process (AHP) was used to determine the priorities of variables. Relevant criteria used in two decision steps and their relative priorities were established by a panel of five general surgeons. Data were collected via a web-based application and analyzed using the "Expert Choice" software specifically developed for the AHP. Consistency ratios in the AHP method were calculated for each set of judgments, and the priorities of sub-criteria were determined. A sequential decision tree was constructed for the best treatment decision process, using priorities determined by the AHP method. Consistency ratios in the AHP method were calculated for each decision step, and the judgments were considered consistent. The tumor-related criterion "presence of perforation" (0.331) and the patient-surgeon-related criterion "surgeon's experience" (0.630) had the highest priority in the first decision step. In the second decision step, the tumor-related criterion "the stage of the disease" (0.230) and the patient-surgeon-related criterion "surgeon's experience" (0.281) were the paramount criteria. The results showed some variation in the ranking of criteria between the decision steps. In the second decision step, for instance, the tumor-related criterion "presence of perforation" was just the fifth. The consistency of decision support systems largely depends on the quality of the underlying decision tree. When several choices and variables have to be considered in a decision, it is very important to determine priorities. The AHP method seems to be effective for this purpose. The decision algorithm developed by this method is more realistic and will improve the quality of the decision tree. 
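
    In the AHP, the priorities are the normalized principal eigenvector of a pairwise comparison matrix, and the judgments are accepted as consistent when the consistency ratio stays below about 0.10. A minimal sketch of that computation (the 3x3 judgment matrix below is hypothetical, not the surgeon panel's actual data):

    ```python
    def ahp_priorities(m, iters=100):
        """Return (priorities, consistency ratio) for a pairwise matrix m."""
        n = len(m)
        w = [1.0 / n] * n
        for _ in range(iters):  # power iteration for the principal eigenvector
            w2 = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w2)
            w = [x / s for x in w2]
        # lambda_max estimated as the mean of (M w)_i / w_i
        mw = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        lam = sum(mw[i] / w[i] for i in range(n)) / n
        ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty random indices
        cr = 0.0 if ri == 0 else (lam - n) / (n - 1) / ri
        return w, cr

    # Hypothetical judgments over three criteria, e.g.
    # (perforation, disease stage, surgeon's experience)
    m = [[1, 3, 2], [1 / 3, 1, 1 / 2], [1 / 2, 2, 1]]
    w, cr = ahp_priorities(m)
    print([round(x, 3) for x in w], round(cr, 3))
    ```

    The returned weights are the per-criterion priorities used at each decision step of the tree, and a small CR confirms that the panel's judgments were internally consistent.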
Copyright © 2012 Elsevier B.V. All rights reserved.

  20. A ten-step process to develop case management plans.

    PubMed

    Tahan, Hussein A

    2002-01-01

    The use of case management plans has contained cost and improved quality of care successfully. However, the process of developing these plans remains a great challenge for healthcare executives. In this article, the author presents the answer to this challenge by discussing a 10-step formal process that administrators of patient care services and case managers can adapt to their institutions. It also can be used by interdisciplinary team members as a practical guide to develop a specific case management plan. This process is applicable to any care setting (acute, ambulatory, long term, and home care), diagnosis, or procedure. It is particularly important for those organizations that currently do not have a deliberate and systematic process to develop case management plans and are struggling with how to improve the efficiency and productivity of interdisciplinary teams charged with developing case management plans.

  1. Introduction to Remote Sensing Image Registration

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline

    2017-01-01

    For many applications, accurate and fast image registration of large amounts of multi-source data is the first necessary step before subsequent processing and integration. Image registration is defined by several steps and each step can be approached by various methods which all present diverse advantages and drawbacks depending on the type of data, the type of applications, the a priori information known about the data and the type of accuracy that is required. This paper will first present a general overview of remote sensing image registration and then will go over a few specific methods and their applications.
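
    One of the simplest instances of the matching step described above is estimating a pure translational offset by searching for the shift that maximizes the correlation between a reference and a sensed signal. A toy 1-D sketch (real registration pipelines add feature extraction, sub-pixel matching, and a full geometric transformation model):

    ```python
    def best_shift(ref, sensed, max_shift=5):
        """Return the integer shift s maximizing the correlation between
        ref and sensed; negative s means sensed is a right-shifted copy of ref."""
        def corr(a, b):
            return sum(x * y for x, y in zip(a, b))
        scores = {}
        for s in range(-max_shift, max_shift + 1):
            if s >= 0:
                scores[s] = corr(ref[s:], sensed[:len(sensed) - s])
            else:
                scores[s] = corr(ref[:s], sensed[-s:])
        return max(scores, key=scores.get)

    ref = [0, 0, 1, 5, 1, 0, 0, 0]
    sensed = [0, 0, 0, 0, 1, 5, 1, 0]  # ref shifted right by 2 samples
    print(best_shift(ref, sensed))
    ```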

  2. Timing paradox of stepping and falls in ageing: not so quick and quick(er) on the trigger

    PubMed Central

    Mille, Marie‐Laure

    2016-01-01

    Abstract Physiological and degenerative changes affecting human standing balance are major contributors to falls with ageing. During imbalance, stepping is a powerful protective action for preserving balance that may be voluntarily initiated in recognition of a balance threat, or be induced by an externally imposed mechanical or sensory perturbation. Paradoxically, with ageing and falls, initiation slowing of voluntary stepping is observed together with perturbation‐induced steps that are triggered as fast as or faster than for younger adults. While age‐associated changes in sensorimotor conduction, central neuronal processing and cognitive functions are linked to delayed voluntary stepping, alterations in the coupling of posture and locomotion may also prolong step triggering. It is less clear, however, how these factors may explain the accelerated triggering of induced stepping. We present a conceptual model that addresses this issue. For voluntary stepping, a disruption in the normal coupling between posture and locomotion may underlie step‐triggering delays through suppression of the locomotion network based on an estimation of the evolving mechanical state conditions for stability. During induced stepping, accelerated step initiation may represent an event‐triggering process whereby stepping is released according to the occurrence of a perturbation rather than to the specific sensorimotor information reflecting the evolving instability. In this case, errors in the parametric control of induced stepping and its effectiveness in stabilizing balance would be likely to occur. We further suggest that there is a residual adaptive capacity with ageing that could be exploited to improve paradoxical triggering and other changes in protective stepping to impact fall risk. PMID:26915664

  3. A step-by-step translation of evidence into a psychosocial intervention for everyday activities in dementia: a focus group study.

    PubMed

    Giebel, Clarissa M; Challis, David; Hooper, Nigel M; Ferris, Sally

    2018-03-01

    In order to increase the efficacy of psychosocial interventions in dementia, a step-by-step process translating evidence and public engagement should be adhered to. This paper describes such a process by involving a two-stage focus group with people with dementia (PwD), informal carers, and staff. Based on previous evidence, general aspects of effective interventions were drawn out. These were tested in the first stage of focus groups, one with informal carers and PwD and one with staff. Findings from this stage helped shape the intervention further specifying its content. In the second stage, participants were consulted about the detailed components. The extant evidence base and focus groups helped to identify six practical and situation-specific elements worthy of consideration in planning such an intervention, including underlying theory and personal motivations for participation. Carers, PwD, and staff highlighted the importance of rapport between practitioners and PwD prior to commencing the intervention. It was also considered important that the intervention would be personalised to each individual. This paper shows how valuable public involvement can be to intervention development, and outlines a process of public involvement for future intervention development. The next step would be to formally test the intervention.

  4. Space Medicine in the Human System Integration Process

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard A.

    2010-01-01

    This slide presentation reviews the importance of integration of space medicine in the human system of lunar exploration. There is a review of historical precedence in reference to lunar surface operations. The integration process is reviewed in a chart which shows the steps from research to requirements development, requirements integration, design, verification, operations and using the lessons learned, giving more information and items for research. These steps are reviewed in view of specific space medical issues. Some of the testing of the operations is undertaken in an environment that is an analog to the exploration environment. Some of these analog environments are reviewed, and there is some discussion of the benefits of using an analog environment in testing the processes that are derived.

  5. 49 CFR 40.61 - What are the preliminary steps in the collection process?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... that the alcohol test is completed before the urine collection process begins. Example to paragraph (b... ensure that post-accident and reasonable suspicion alcohol tests happen as soon as possible (e.g., by... actually beginning a collection: (a) When a specific time for an employee's test has been scheduled, or the...

  6. 49 CFR 40.61 - What are the preliminary steps in the collection process?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that the alcohol test is completed before the urine collection process begins. Example to paragraph (b... ensure that post-accident and reasonable suspicion alcohol tests happen as soon as possible (e.g., by... actually beginning a collection: (a) When a specific time for an employee's test has been scheduled, or the...

  7. 49 CFR 40.61 - What are the preliminary steps in the collection process?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that the alcohol test is completed before the urine collection process begins. Example to paragraph (b... ensure that post-accident and reasonable suspicion alcohol tests happen as soon as possible (e.g., by... actually beginning a collection: (a) When a specific time for an employee's test has been scheduled, or the...

  8. 49 CFR 40.61 - What are the preliminary steps in the collection process?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that the alcohol test is completed before the urine collection process begins. Example to paragraph (b... ensure that post-accident and reasonable suspicion alcohol tests happen as soon as possible (e.g., by... actually beginning a collection: (a) When a specific time for an employee's test has been scheduled, or the...

  9. 49 CFR 40.61 - What are the preliminary steps in the collection process?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that the alcohol test is completed before the urine collection process begins. Example to paragraph (b... ensure that post-accident and reasonable suspicion alcohol tests happen as soon as possible (e.g., by... actually beginning a collection: (a) When a specific time for an employee's test has been scheduled, or the...

  10. Student Characteristics and Expectations of University Classes: A Free Elicitation Approach

    ERIC Educational Resources Information Center

    Moore, Melissa L.; Moore, Robert S.; McDonald, Robert

    2008-01-01

    Past research has shown that there are subpopulation differences in what students expect from a specific class. These expectation evaluations are usually conducted while the class is being delivered. Framing the class as a service experience, we take a step back in the service delivery process and assess what students expect from…

  11. Production Of Human Antibodies

    NASA Technical Reports Server (NTRS)

    Sammons, David W.; Neil, Garry A.

    1993-01-01

    Process for making human monoclonal antibodies based on combination of techniques. Antibodies made active against specific antigen. Process involves in vivo immunization of human B lymphocyte cells in mice. B cells of interest enriched in vitro before fusion. Method potentially applicable to any antigen. Does not rely on use of Epstein-Barr virus at any step. Human lymphocytes taken from any source.

  12. Physical activity in England: who is meeting the recommended level of participation through sports and exercise?

    PubMed

    Anokye, Nana Kwame; Pokhrel, Subhash; Buxton, Martin; Fox-Rushby, Julia

    2013-06-01

    Little is known about the correlates of meeting recommended levels of participation in physical activity (PA) and how this understanding informs public health policies on behaviour change. The aim was to analyse who meets the recommended level of participation in PA, for males and females separately, by applying 'process' modelling frameworks (single vs. sequential 2-step process). Using the Health Survey for England 2006, (n = 14 142; ≥ 16 years), gender-specific regression models were estimated using bivariate probit with selectivity correction and single probit models. A 'sequential, 2-step process' modelled participation and meeting the recommended level separately, whereas the 'single process' considered both participation and level together. In females, meeting the recommended level was associated with degree holders [Marginal effect (ME) = 0.013] and age (ME = -0.001), whereas in males, age was a significant correlate (ME = -0.003 to -0.004). The order of importance of correlates was similar across genders, with ethnicity being the most important correlate in both males (ME = -0.060) and females (ME = -0.133). In females, the 'sequential, 2-step process' performed better (ρ = -0.364, P < 0.001) than that in males (ρ = 0.154). The degree to which people undertake the recommended level of PA through vigorous activity varies between males and females, and the process that best predicts such decisions, i.e. whether it is a sequential, 2-step process or a single-step choice, is also different for males and females. Understanding this should help to identify subgroups that are less likely to meet the recommended level of PA (and hence more likely to benefit from any PA promotion intervention).
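
    The distinction between the two framings can be made concrete with probabilities: in the sequential framing, meeting the recommended level is only observed among those who participate at all, so the unconditional probability factorizes into two steps. An illustrative sketch (the probabilities are made up, not the paper's estimates, and this omits the selectivity correction used in the actual models):

    ```python
    def p_meets_sequential(p_participate, p_meet_given_participate):
        """Sequential 2-step process: participate first, then (conditionally)
        meet the recommended level."""
        return p_participate * p_meet_given_participate

    def p_meets_single(p_meet):
        """Single process: one joint decision over participation and level."""
        return p_meet

    # e.g. 60% participate at all, and 25% of those meet the recommended level
    print(p_meets_sequential(0.6, 0.25))  # 0.15 under these assumed values
    ```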

  13. [Indications of lung transplantation: Patients selection, timing of listing, and choice of procedure].

    PubMed

    Morisse Pradier, H; Sénéchal, A; Philit, F; Tronc, F; Maury, J-M; Grima, R; Flamens, C; Paulus, S; Neidecker, J; Mornex, J-F

    2016-02-01

    Lung transplantation (LT) is now considered as an excellent treatment option for selected patients with end-stage pulmonary diseases, such as COPD, cystic fibrosis, idiopathic pulmonary fibrosis, and pulmonary arterial hypertension. The 2 goals of LT are to provide a survival benefit and to improve quality of life. The 3-step decision process leading to LT is discussed in this review. The first step is the selection of candidates, which requires a careful examination in order to check absolute and relative contraindications. The second step is the timing of listing for LT; it requires the knowledge of disease-specific prognostic factors available in international guidelines, and discussed in this paper. The third step is the choice of procedure: indications of heart-lung, single-lung, and bilateral-lung transplantation are described. In conclusion, this document provides guidelines to help pulmonologists in the referral and selection processes of candidates for transplantation in order to optimize the outcome of LT. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  14. Probabilistic modeling of the fate of Listeria monocytogenes in diced bacon during the manufacturing process.

    PubMed

    Billoir, Elise; Denis, Jean-Baptiste; Cammeau, Natalie; Cornu, Marie; Zuliani, Veronique

    2011-02-01

    To assess the impact of the manufacturing process on the fate of Listeria monocytogenes, we built a generic probabilistic model intended to simulate the successive steps in the process. Contamination evolution was modeled in the appropriate units (breasts, dice, and then packaging units through the successive steps in the process). To calibrate the model, parameter values were estimated from industrial data, from the literature, and based on expert opinion. By means of simulations, the model was explored using a baseline calibration and alternative scenarios, in order to assess the impact of changes in the process and of accidental events. The results are reported as contamination distributions and as the probability that the product will be acceptable with regards to the European regulatory safety criterion. Our results are consistent with data provided by industrial partners and highlight that tumbling is a key step for the distribution of the contamination at the end of the process. Process chain models could provide an important added value for risk assessment models that basically consider only the outputs of the process in their risk mitigation strategies. Moreover, a model calibrated to correspond to a specific plant could be used to optimize surveillance. © 2010 Society for Risk Analysis.
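
    The structure of such a process-chain model can be sketched as a Monte Carlo simulation that propagates contamination (on a log10 CFU/g scale) through successive steps and compares the result with a safety criterion. The steps, distributions, and the 2 log10 CFU/g (100 CFU/g) limit below are invented for illustration; the paper's model was calibrated to plant data and expert opinion:

    ```python
    import random

    def simulate_batch(rng):
        """Propagate one batch's contamination (log10 CFU/g) through
        hypothetical process steps, each adding a random log change."""
        c = rng.gauss(-1.0, 0.5)   # initial contamination on incoming breasts
        c += rng.gauss(0.5, 0.2)   # tumbling: redistribution / growth
        c += rng.gauss(-0.3, 0.1)  # later step: mild reduction
        return c

    def p_acceptable(n=10000, limit=2.0, seed=1):
        """Fraction of simulated batches meeting the safety criterion."""
        rng = random.Random(seed)
        ok = sum(simulate_batch(rng) <= limit for _ in range(n))
        return ok / n

    print(p_acceptable())
    ```

    Running many such simulations under baseline and perturbed step parameters is what lets a model like this rank steps (e.g. tumbling) by their influence on the final contamination distribution.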

  15. A deep etching mechanism for trench-bridging silicon nanowires

    NASA Astrophysics Data System (ADS)

    Tasdemir, Zuhal; Wollschläger, Nicole; Österle, Werner; Leblebici, Yusuf; Erdem Alaca, B.

    2016-03-01

    Introducing a single silicon nanowire with a known orientation and dimensions to a specific layout location constitutes a major challenge. The challenge becomes even more formidable if one chooses to realize the task in a monolithic fashion with an extreme topography, a characteristic of microsystems. The need for such a monolithic integration is fueled by the recent surge in the use of silicon nanowires as functional building blocks in various electromechanical and optoelectronic applications. This challenge is addressed in this work by introducing a top-down, silicon-on-insulator technology. The technology provides a pathway for obtaining well-controlled silicon nanowires along with the surrounding microscale features, with up to a three-order-of-magnitude scale difference. A two-step etching process is developed, where the first shallow etch defines a nanoscale protrusion on the wafer surface. After applying a conformal protection on the protrusion, a deep etch step is carried out forming the surrounding microscale features. A minimum nanowire cross-section of 35 nm by 168 nm is demonstrated in the presence of an etch depth of 10 μm. Nanowire cross-sectional features are characterized via transmission electron microscopy and linked to specific process steps. The technology allows control over all dimensional aspects along with the exact location and orientation of the silicon nanowire. The adoption of the technology in the fabrication of micro and nanosystems can potentially lead to a significant reduction in process complexity by facilitating direct access to the nanowire during surface processes such as contact formation and doping.

  16. A deep etching mechanism for trench-bridging silicon nanowires.

    PubMed

    Tasdemir, Zuhal; Wollschläger, Nicole; Österle, Werner; Leblebici, Yusuf; Alaca, B Erdem

    2016-03-04

    Introducing a single silicon nanowire with a known orientation and dimensions to a specific layout location constitutes a major challenge. The challenge becomes even more formidable if one chooses to realize the task in a monolithic fashion with an extreme topography, a characteristic of microsystems. The need for such a monolithic integration is fueled by the recent surge in the use of silicon nanowires as functional building blocks in various electromechanical and optoelectronic applications. This challenge is addressed in this work by introducing a top-down, silicon-on-insulator technology. The technology provides a pathway for obtaining well-controlled silicon nanowires along with the surrounding microscale features, with up to a three-order-of-magnitude scale difference. A two-step etching process is developed, where the first shallow etch defines a nanoscale protrusion on the wafer surface. After applying a conformal protection on the protrusion, a deep etch step is carried out forming the surrounding microscale features. A minimum nanowire cross-section of 35 nm by 168 nm is demonstrated in the presence of an etch depth of 10 μm. Nanowire cross-sectional features are characterized via transmission electron microscopy and linked to specific process steps. The technology allows control over all dimensional aspects along with the exact location and orientation of the silicon nanowire. The adoption of the technology in the fabrication of micro and nanosystems can potentially lead to a significant reduction in process complexity by facilitating direct access to the nanowire during surface processes such as contact formation and doping.

  17. Characterization of Developer Application Methods Used in Fluorescent Penetrant Inspection

    NASA Astrophysics Data System (ADS)

    Brasche, L. J. H.; Lopez, R.; Eisenmann, D.

    2006-03-01

    Fluorescent penetrant inspection (FPI) is the most widely used inspection method for aviation components, seeing use in production as well as in-service inspection applications. FPI is a multiple-step process requiring attention to the process parameters of each step in order to enable a successful inspection. A multiyear program is underway to evaluate the most important factors affecting the performance of FPI, to determine whether existing industry specifications adequately address control of the process parameters, and to provide the needed engineering data to the public domain. The final step prior to the inspection is the application of developer, with typical aviation inspections involving the use of dry powder (form d), usually applied using either a pressure wand or a dust storm chamber. Results from several typical dust storm chambers and wand applications have shown less than optimal performance. Measurements of indication brightness, recording of the UVA image, and in some cases formal probability of detection (POD) studies were used to assess the developer application methods. Key conclusions and initial recommendations are provided.

  18. Automatic diagnosis of malaria based on complete circle-ellipse fitting search algorithm.

    PubMed

    Sheikhhosseini, M; Rabbani, H; Zekri, M; Talebi, A

    2013-12-01

    Diagnosis of malaria parasitemia from blood smears is a subjective and time-consuming task for pathologists. An automatic diagnostic process would reduce the diagnostic time; it could also serve as a second opinion for pathologists and may be useful in malaria screening. This study presents an automatic method for malaria diagnosis from thin blood smears. Because the malaria life cycle starts with the formation of a ring around the parasite nucleus, the proposed approach is mainly based on curve fitting to detect the parasite ring in the blood smear. The method is composed of six main phases. The first is a stain-object extraction step, which extracts candidate objects that may be infected by malaria parasites; it includes stained-pixel extraction based on intensity and colour, and stained-object segmentation by means of a stained-circle matching process. The second step is a preprocessing phase that applies nonlinear diffusion filtering. The third step detects the parasite nucleus in the resulting image according to image intensity. The fourth step introduces a complete search process, in which a circle search identifies the direction and initial points for a direct least-squares ellipse-fitting algorithm; during the subsequent ellipse search, once the parasite shape has been completed, undesired regions with high error values are removed and the ellipse parameters are modified. In the fifth step, features are extracted from the parasite candidate region rather than from the whole candidate object. This feature-extraction strategy, made possible by the dedicated search process, removes the need for clump-splitting methods, and the stained-circle matching defined in the first step speeds up the whole procedure. Finally, a series of decision rules is applied to the extracted features to decide on the presence or absence of malaria parasites.
    The algorithm was applied to 26 digital images of thin blood smear films containing 1274 objects, each either parasite-infected or healthy. On this database, the automatic identification of malaria showed a sensitivity of 82.28% and a specificity of 98.02%. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
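    The ellipse-fitting core of the fourth phase can be illustrated with a least-squares conic fit. The code below is a simplified stand-in for the direct least-squares ellipse-fitting algorithm the abstract refers to; the synthetic data and function names are assumptions for illustration:

```python
import numpy as np

def fit_conic(x, y):
    """Least-squares fit of a general conic a*x^2 + b*x*y + c*y^2 +
    d*x + e*y + f = 0; a simplified stand-in for the direct
    least-squares ellipse fit referenced in the abstract."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # The right singular vector for the smallest singular value
    # minimizes ||D p|| subject to ||p|| = 1
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]

def is_ellipse(p):
    """Discriminant test: the conic is an ellipse iff b^2 - 4ac < 0."""
    a, b, c = p[0], p[1], p[2]
    return b * b - 4.0 * a * c < 0.0

# Synthetic ring of points around a hypothetical parasite nucleus
t = np.linspace(0.0, 2.0 * np.pi, 50)
x = 3.0 * np.cos(t) + 0.5
y = 1.5 * np.sin(t) - 0.2
p = fit_conic(x, y)
```

    In a full pipeline, the error of this fit over candidate regions would drive the removal of undesired regions described in the fourth phase.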

  19. Timing paradox of stepping and falls in ageing: not so quick and quick(er) on the trigger.

    PubMed

    Rogers, Mark W; Mille, Marie-Laure

    2016-08-15

    Physiological and degenerative changes affecting human standing balance are major contributors to falls with ageing. During imbalance, stepping is a powerful protective action for preserving balance that may be voluntarily initiated in recognition of a balance threat, or be induced by an externally imposed mechanical or sensory perturbation. Paradoxically, with ageing and falls, initiation slowing of voluntary stepping is observed together with perturbation-induced steps that are triggered as fast as or faster than for younger adults. While age-associated changes in sensorimotor conduction, central neuronal processing and cognitive functions are linked to delayed voluntary stepping, alterations in the coupling of posture and locomotion may also prolong step triggering. It is less clear, however, how these factors may explain the accelerated triggering of induced stepping. We present a conceptual model that addresses this issue. For voluntary stepping, a disruption in the normal coupling between posture and locomotion may underlie step-triggering delays through suppression of the locomotion network based on an estimation of the evolving mechanical state conditions for stability. During induced stepping, accelerated step initiation may represent an event-triggering process whereby stepping is released according to the occurrence of a perturbation rather than to the specific sensorimotor information reflecting the evolving instability. In this case, errors in the parametric control of induced stepping and its effectiveness in stabilizing balance would be likely to occur. We further suggest that there is a residual adaptive capacity with ageing that could be exploited to improve paradoxical triggering and other changes in protective stepping to impact fall risk. © 2016 The Authors. The Journal of Physiology © 2016 The Physiological Society.

  20. Improvement of the System of Training of Specialists by University for Coal Mining Enterprises

    NASA Astrophysics Data System (ADS)

    Mikhalchenko, Vadim; Seredkina, Irina

    2017-11-01

    This article considers the Quality Function Deployment technique as applied to the process of university training of specialists with higher education. The method is based on the step-by-step conversion of customer requirements into specific organizational, content-related and functional transformations of the university's technological process. A fully deployed quality function includes four stages of tracking customer requirements while creating a product: product planning, product design, process planning, and production planning. Quality Function Deployment can be considered one of the methods for optimizing the technological processes of training specialists with higher education in the current economic conditions. Implemented at the initial stages of the life cycle of the technological process, it ensures not only the high quality of the "product" of graduate school, but also the fullest possible satisfaction of consumers' requests and expectations.

  1. A Featured-Based Strategy for Stereovision Matching in Sensors with Fish-Eye Lenses for Forest Environments

    PubMed Central

    Herrera, Pedro Javier; Pajares, Gonzalo; Guijarro, Maria; Ruz, José J.; Cruz, Jesús M.; Montes, Fernando

    2009-01-01

    This paper describes a novel feature-based stereovision matching process based on a pair of omnidirectional images of forest stands acquired with a stereovision sensor equipped with fish-eye lenses. The stereo analysis problem consists of the following steps: image acquisition, camera modelling, feature extraction, image matching and depth determination. Once the depths of significant points on the trees are obtained, the growing stock volume can be estimated by considering the geometrical camera modelling, which is the final goal. The key steps are feature extraction and image matching, and this paper is devoted solely to these two. In the first stage a segmentation process extracts the trunks, which are the regions used as features, where each feature is identified through a set of attributes or properties useful for matching. In the second stage the features are matched by applying four well-known matching constraints: epipolar, similarity, ordering and uniqueness. The combination of the segmentation and matching processes for this specific kind of sensor makes the main contribution of the paper. The method is tested with satisfactory results and compared against the human expert criterion. PMID:22303134

  2. Review of Peak Detection Algorithms in Liquid-Chromatography-Mass Spectrometry

    PubMed Central

    Zhang, Jianqiu; Gonzalez, Elias; Hestilow, Travis; Haskins, William; Huang, Yufei

    2009-01-01

    In this review, we will discuss peak detection in Liquid-Chromatography-Mass Spectrometry (LC/MS) from a signal processing perspective. A brief introduction to LC/MS is followed by a description of the major processing steps in LC/MS. Specifically, the problem of peak detection is formulated and various peak detection algorithms are described and compared. PMID:20190954
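    A minimal version of the peak-detection step discussed in this review can be sketched with SciPy's `find_peaks`; the synthetic chromatogram and threshold values below are illustrative assumptions:

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic total-ion chromatogram: two Gaussian elution peaks on a
# noisy baseline (peak positions, widths and noise level are made up)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
tic = (5.0 * np.exp(-(t - 3.0) ** 2 / 0.02)
       + 3.0 * np.exp(-(t - 7.0) ** 2 / 0.05)
       + rng.normal(0.0, 0.05, t.size))

# Two common processing steps: smooth to suppress noise, then keep
# only maxima that rise well above their surroundings
smoothed = np.convolve(tic, np.ones(9) / 9.0, mode="same")
peaks, _ = find_peaks(smoothed, prominence=1.0)
retention_times = t[peaks]
```

    Real LC/MS peak pickers add model fitting and m/z-dimension logic, but the smooth-then-threshold structure above is the common core the review formalizes.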

  3. Improvements to the single screw extruder

    NASA Technical Reports Server (NTRS)

    Hiemenz, C.; Ziegmann, G.; Franzkoch, B.; Hoffmanns, W.; Michaeli, W.

    1977-01-01

    Extrusion on a single-screw extruder is examined. The process is divided into several steps: dosing of the materials to be conveyed; modification of the shape of the feeding opening, which influences the feeding process and consequently the throughput of the extruder; optimizing the shape of the feeding zone to meet specific material requirements; and plasticizing and homogenizing.

  4. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research, Enhanced Pearson eText with Loose-Leaf Version--Access Card Package. Fifth Edition

    ERIC Educational Resources Information Center

    Creswell, John W.

    2015-01-01

    "Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research" offers a truly balanced, inclusive, and integrated overview of the processes involved in educational research. This text first examines the general steps in the research process and then details the procedures for conducting specific types…

  5. Review: A Position Paper on Selenium in Ecotoxicology: A Procedure for Deriving Site-Specific Water Quality Criteria

    Treesearch

    A. Dennis Lemly

    1997-01-01

    This paper describes a method for deriving site-specific water quality criteria for selenium using a two-step process: (1) gather information on selenium residues and biological effects at the site and in down-gradient systems and (2) examine criteria based on the degree of bioaccumulation, the relationship between mea-sured residues and threshold concentrations for...

  6. Assessment of molecular contamination in mask pod

    NASA Astrophysics Data System (ADS)

    Foray, Jean Marie; Dejaune, Patrice; Sergent, Pierre; Gough, Stuart; Cheung, D.; Davenet, Magali; Favre, Arnaud; Rude, C.; Trautmann, T.; Tissier, Michel; Fontaine, H.; Veillerot, M.; Avary, K.; Hollein, I.; Lerit, R.

    2008-04-01

    Context / study motivation: Contamination, and especially Airborne Molecular Contamination (AMC), is a critical issue for the mask material flow, with a severe and fairly unpredictable risk of induced contamination and damage, especially for 193 nm lithography. It is therefore essential to measure and understand AMC in the mask environment, and then try to reduce it. The mask material flow was studied in a global approach by a pool of European partners, particularly within the frame of the European MEDEA+ project known as "MUSCLE". This paper first presents results and an assessment of the mask pod environment in terms of molecular contamination, and then preliminary studies to reduce the influence of the mask pod and the contamination due to material outgassing. Approach and techniques: A specific assessment of environmental / molecular contamination along the supply chain was performed by all partners. Following previous work presented at EMLC 07, further studies were performed on real-time contamination measurement of pods at different site locations (including a mask manufacturing site, blank manufacturing sites, and an IC fab). The studies were linked to the main critical issues: cleaning, storage, handling, materials and processes. Contamination measurement campaigns were carried out along the mask supply chain using a specific Adixen analyzer in order to monitor organic contaminants (ppb level) in mask pods in real time. Key results are presented: VOC, AMC and humidity levels in different kinds of mask carriers, the impact of basic cleaning on pod outgassing measurements (VOC, NH3), and the influence of processes on pod contamination. In a second step, preliminary pod conditioning studies for a better pod environment were performed based on the Adixen vacuum process. The process influence was measured experimentally in terms of molecular outgassing from mask pods.
    Different AMC experimental characterization methods were carried out, leading to results on a wide range of organic and inorganic contaminants: in-line techniques based on Adixen humidity, VOC and organic sensors, together with off-line techniques already used in the extensive previous mask pod benchmark (TD-GCMS and ionic chromatography). Humidity and VOC levels from mask carriers showed a significant reduction after the Adixen pod conditioning process. The focus was placed on an optimized vacuum step (for AMC) after the particle carrier cleaning cycle. Based on these key results, new procedures as well as guidelines for mask carrier cleaning optimization are proposed to improve pod contamination control. Summary / next steps: This paper reports molecular contamination measurement campaigns performed by a pool of European partners along the mask supply chain. They allowed us to investigate, identify and quantify critical molecular contamination issues in mask pods, including VOC and humidity, depending on location, use and carrier type. Preliminary studies highlight initial process solutions for pod conditioning that are being used for short-term industrialization and will be further industrialized.

  7. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application composed of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. 
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.
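    One early step of such a pipeline, removal of the TMS pulse artifact, can be sketched in a few lines. The window length and linear-interpolation method below are illustrative assumptions, not TMSEEG's actual algorithms:

```python
import numpy as np

def remove_pulse_artifact(data, fs, pulse_time, win_ms=(-2.0, 10.0)):
    """Cut out the TMS pulse window and linearly interpolate across it,
    channel by channel. A minimal stand-in for the first artifact-removal
    step of a TMS-EEG pipeline (window and method are illustrative)."""
    i0 = int(round((pulse_time + win_ms[0] / 1000.0) * fs))
    i1 = int(round((pulse_time + win_ms[1] / 1000.0) * fs))
    cleaned = data.copy()
    for ch in range(data.shape[0]):
        cleaned[ch, i0:i1] = np.linspace(data[ch, i0 - 1], data[ch, i1],
                                         i1 - i0)
    return cleaned

# 4 channels of 10 Hz background activity with a large pulse at 0.5 s
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
eeg = np.tile(np.sin(2.0 * np.pi * 10.0 * t), (4, 1))
eeg[:, 500:510] += 50.0  # simulated TMS pulse artifact
clean = remove_pulse_artifact(eeg, fs, 0.5)
```

    Production pipelines apply this kind of step per trial and follow it with filtering and component-based artifact rejection, which is what the toolbox's modular workflow organizes.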

  8. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application composed of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. 
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054

  9. Specific detection of the cleavage activity of mycobacterial enzymes using a quantum dot based DNA nanosensor

    NASA Astrophysics Data System (ADS)

    Jepsen, Morten Leth; Harmsen, Charlotte; Godbole, Adwait Anand; Nagaraja, Valakunja; Knudsen, Birgitta R.; Ho, Yi-Ping

    2015-12-01

    We present a quantum dot based DNA nanosensor specifically targeting the cleavage step in the reaction cycle of the essential DNA-modifying enzyme, mycobacterial topoisomerase I. The design takes advantage of the unique photophysical properties of quantum dots to generate visible fluorescence recovery upon specific cleavage by mycobacterial topoisomerase I. This report, for the first time, demonstrates the possibility to quantify the cleavage activity of the mycobacterial enzyme without pre-processing sample purification or post-processing signal amplification. The cleavage-induced signal response has also proven reliable in biological matrices, such as whole cell extracts prepared from Escherichia coli and human Caco-2 cells. It is expected that the assay may contribute to the clinical diagnostics of bacterial diseases, as well as the evaluation of treatment outcomes. Electronic supplementary information (ESI) available: Characterization of the QD-based DNA Nanosensor. See DOI: 10.1039/c5nr06326d

  10. LC-MS Data Processing with MAVEN: A Metabolomic Analysis and Visualization Engine

    PubMed Central

    Clasquin, Michelle F.; Melamud, Eugene; Rabinowitz, Joshua D.

    2014-01-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis. PMID:22389014

  11. LC-MS data processing with MAVEN: a metabolomic analysis and visualization engine.

    PubMed

    Clasquin, Michelle F; Melamud, Eugene; Rabinowitz, Joshua D

    2012-03-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis.

  12. Application of lean manufacturing techniques in the Emergency Department.

    PubMed

    Dickson, Eric W; Singh, Sabi; Cheung, Dickson S; Wyatt, Christopher C; Nugent, Andrew S

    2009-08-01

    "Lean" is a set of principles and techniques that drive organizations to continually add value to the product they deliver by enhancing process steps that are necessary, relevant, and valuable while eliminating those that fail to add value. Lean has been used in manufacturing for decades and has been associated with enhanced product quality and overall corporate success. Our objective was to evaluate whether the adoption of Lean principles by an Emergency Department (ED) improves the value of emergency care delivered. Beginning in December 2005, we implemented a variety of Lean techniques in an effort to enhance patient and staff satisfaction. The implementation followed a six-step process of Lean education, ED observation, patient flow analysis, process redesign, new process testing, and full implementation. Process redesign focused on generating improvement ideas from frontline workers across all departmental units. Value-based and operational outcome measures, including patient satisfaction, expense per patient, ED length of stay (LOS), and patient volume, were compared for calendar year 2005 (pre-Lean) and periodically after 2006 (post-Lean). Patient visits increased by 9.23% in 2006. Despite this increase, LOS decreased slightly and patient satisfaction increased significantly without raising the inflation-adjusted cost per patient. Lean improved the value of the care we delivered to our patients. Generating and instituting ideas from our frontline providers has been the key to the success of our Lean program. Although Lean represents a fundamental change in the way we think about delivering care, the specific process changes we employed tended to be simple, small procedure modifications specific to our unique people, process, and place. We therefore believe that institutions or departments aspiring to adopt Lean should focus on the core principles of Lean rather than on emulating specific process changes made at other institutions.

  13. Rett Syndrome.

    ERIC Educational Resources Information Center

    Culbert, Linda A.

    This pamphlet reviews the historical process involved in initially recognizing Rett Syndrome as a specific disorder in girls. Its etiology is unknown, but studies have considered factors such as hyperammonemia, a two-step mutation, a fragile X chromosome, metabolic disorder, environmental causation, dopamine deficiency, and an inactive X chromosome.…

  14. Dalhousie Fire

    ERIC Educational Resources Information Center

    Matthews, Fred W.

    1986-01-01

    Describes steps taken by the Weldon Law Library at Dalhousie University in salvaging books damaged in a major fire, including procedures and processes used in packing, sorting, drying, and cleaning the books. The need for a disaster plan for specific libraries is emphasized, and some suggestions are made. (CDD)

  15. The dynamics of team cognition: A process-oriented theory of knowledge emergence in teams.

    PubMed

    Grand, James A; Braun, Michael T; Kuljanin, Goran; Kozlowski, Steve W J; Chao, Georgia T

    2016-10-01

    Team cognition has been identified as a critical component of team performance and decision-making. However, theory and research in this domain continues to remain largely static; articulation and examination of the dynamic processes through which collectively held knowledge emerges from the individual- to the team-level is lacking. To address this gap, we advance and systematically evaluate a process-oriented theory of team knowledge emergence. First, we summarize the core concepts and dynamic mechanisms that underlie team knowledge-building and represent our theory of team knowledge emergence (Step 1). We then translate this narrative theory into a formal computational model that provides an explicit specification of how these core concepts and mechanisms interact to produce emergent team knowledge (Step 2). The computational model is next instantiated into an agent-based simulation to explore how the key generative process mechanisms described in our theory contribute to improved knowledge emergence in teams (Step 3). Results from the simulations demonstrate that agent teams generate collectively shared knowledge more effectively when members are capable of processing information more efficiently and when teams follow communication strategies that promote equal rates of information sharing across members. Lastly, we conduct an empirical experiment with real teams participating in a collective knowledge-building task to verify that promoting these processes in human teams also leads to improved team knowledge emergence (Step 4). Discussion focuses on implications of the theory for examining team cognition processes and dynamics as well as directions for future research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
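    The kind of agent-based simulation described in Step 3 can be sketched as a toy model in which agents broadcast knowledge items at configurable rates. The dynamics, parameters, and function names are illustrative assumptions, not the authors' computational model:

```python
import random

def simulate_team(share_probs, n_items=20, rounds=200, seed=0):
    """Toy agent-based sketch of team knowledge emergence: each agent
    starts with a private slice of the item pool; on every round, agent i
    broadcasts one known item with probability share_probs[i]. Returns
    the fraction of items that end up known by every member."""
    rng = random.Random(seed)
    n = len(share_probs)
    # Disjoint starting knowledge: agent i knows items i, i+n, i+2n, ...
    knowledge = [set(range(i, n_items, n)) for i in range(n)]
    for _ in range(rounds):
        for i in range(n):
            if knowledge[i] and rng.random() < share_probs[i]:
                item = rng.choice(sorted(knowledge[i]))
                for member in knowledge:
                    member.add(item)  # broadcast to the whole team
    common = set.intersection(*knowledge)
    return len(common) / n_items

# Equal vs. unequal rates of information sharing across members
equal = simulate_team([0.5, 0.5, 0.5, 0.5])
skewed = simulate_team([0.95, 0.3, 0.3, 0.05])
```

    In line with the simulation finding above, configurations with equal sharing rates tend to produce more fully shared knowledge, since no member's private items are starved of airtime.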

  16. Electrohydraulic linear actuator with two stepping motors controlled by overshoot-free algorithm

    NASA Astrophysics Data System (ADS)

    Milecki, Andrzej; Ortmann, Jarosław

    2017-11-01

The paper describes electrohydraulic spool valves with stepping motors used as electromechanical transducers. A new concept of a proportional valve in which two stepping motors work differentially is introduced. Such a valve changes the fluid flow in proportion to the sum or difference of the motors' step counts. The valve design and its principle of operation are described. Theoretical equations and simulation models are proposed for all elements of the drive, i.e., the stepping motor units, hydraulic valve and cylinder. The main features of the valve and drive operation are described; some specific problem areas covering the nature of stepping motors and their differential work in the valve are also considered. The whole non-linear servo drive model is proposed and used further for simulation investigations. The initial simulation investigations of the drive with the new valve have shown that there is a significant overshoot in the drive step response, which is not acceptable in a positioning process. Therefore, additional effort is devoted to reducing the overshoot and, in consequence, the settling time. A special predictive algorithm is proposed to this end. The proposed control method is then tested and further improved in simulations. The design is subsequently implemented in hardware and the whole servo drive system is tested. The investigation results presented in this paper show an overshoot-free positioning process that enables high positioning accuracy.
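The predictive idea, decelerating as soon as the remaining travel no longer exceeds the predicted stopping distance, can be sketched generically (this is not the authors' algorithm; velocities and accelerations are in abstract steps per control cycle):

```python
def move_to(target, v_max=50, a=5):
    """Discrete-step positioning with predictive braking: the drive slows
    down once the remaining distance drops below the distance needed to
    brake from the current velocity, so the target is never overrun."""
    pos, v = 0, 0
    trace = [pos]
    while pos < target:
        stop_dist = v * v / (2 * a)      # distance needed to brake to zero
        remaining = target - pos
        if remaining <= stop_dist + v:   # predictive braking decision
            v = max(v - a, 1)            # decelerate, but keep creeping
        else:
            v = min(v + a, v_max)        # accelerate toward cruise speed
        pos += min(v, remaining)         # clip guards this toy model only;
                                         # a real drive has no such clamp
        trace.append(pos)
    return trace

trace = move_to(1000)  # monotonic approach with no overshoot past 1000
```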

  17. Closing the brain-to-brain loop in laboratory testing.

    PubMed

    Plebani, Mario; Lippi, Giuseppe

    2011-07-01

The delivery of laboratory services was described 40 years ago and defined with the foremost concept of the "brain-to-brain turnaround time loop". This concept consists of several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, even more worryingly, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phase of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refers only to the analytical phase, and should be better defined as "total analytical error" to avoid any confusion and misinterpretation. According to the hierarchical approach to classifying strategies for setting analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" is at the top and therefore should be applied as much as possible to direct analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, Lean and Six Sigma, since these techniques allow the identification of the most critical steps in the total testing process and reduce the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring every step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.
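The distinction between analytical error budgets and quality specifications drawn above is often operationalised with the sigma metric, which combines allowable total error, bias and imprecision (a minimal sketch; the assay figures below are illustrative, not from this commentary):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric commonly used to grade analytical performance:
    (allowable total error - |bias|) / imprecision, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative assay: 10% allowable total error, 1.5% bias, 2% CV.
sigma = sigma_metric(10.0, 1.5, 2.0)  # -> 4.25
```

A sigma of 6 or more is conventionally read as world-class analytical performance, while values below 3 flag a method needing tighter control.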

  18. Cellulose Biosynthesis: Current Views and Evolving Concepts

    PubMed Central

    SAXENA, INDER M.; BROWN, R. MALCOLM

    2005-01-01

    • Aims To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. • Scope Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. • Conclusions With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back. PMID:15894551

  19. Cellulose biosynthesis: current views and evolving concepts.

    PubMed

    Saxena, Inder M; Brown, R Malcolm

    2005-07-01

Aims: To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. Scope: Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. Conclusions: With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back.

  20. Effect and process evaluation of a kindergarten-based, family-involved cluster randomised controlled trial in six European countries on four- to six-year-old children's steps per day: the ToyBox-study.

    PubMed

    De Craemer, Marieke; Verloigne, Maïté; De Bourdeaudhuij, Ilse; Androutsos, Odysseas; Iotova, Violeta; Moreno, Luis; Koletzko, Berthold; Socha, Piotr; Manios, Yannis; Cardon, Greet

    2017-08-29

The ToyBox-intervention is a theory- and evidence-based intervention delivered in kindergartens to improve four- to six-year-old children's energy balance-related behaviours and prevent obesity. The current study aimed to (1) examine the effect of the ToyBox-intervention on increasing European four- to six-year-old children's steps per day, and (2) examine if a higher process evaluation score from teachers and parents was related to a more favourable effect on steps per day. A sample of 2438 four- to six-year-old children (51.9% boys, mean age 4.75 ± 0.43 years) from 6 European countries (Belgium, Bulgaria, Germany, Greece, Poland and Spain) wore a motion sensor (pedometer or accelerometer) for a minimum of two weekdays and one weekend day both at baseline and follow-up to objectively measure their steps per day. Kindergarten teachers implemented the physical activity component of the ToyBox-intervention for 6 weeks in total, with a focus on (1) environmental changes in the classroom, (2) the child performing the actual behaviour and (3) classroom activities. Children's parents received newsletters, tip cards and posters. To assess intervention effects, multilevel repeated measures analyses were conducted for the total sample and the six intervention countries separately. In addition, process evaluation questionnaires were used to calculate a total process evaluation score (with implementation and satisfaction as a part of the overall score) for teachers and parents, which was then linked with the physical activity outcomes. No significant intervention effects on four- to six-year-old children's steps per weekday, steps per weekend day and steps per average day were found, both in the total sample and in the country-specific samples (all p > 0.05). In general, the intervention effects on steps per day were least favourable in four- to six-year-old children with a low teachers' process evaluation score and most favourable in four- to six-year-old children with a high teachers' process evaluation score. No differences in intervention effects were found for a low, medium or high parents' process evaluation score. The physical activity component of the ToyBox-intervention had no overall effect on four- to six-year-old children's steps per day. However, the process evaluation scores showed that favourable effects on children's steps per day occurred where kindergarten teachers implemented the physical activity component as planned and were satisfied with it. Strategies to motivate, actively involve and engage the kindergarten teachers and parents/caregivers are needed to induce larger effects.

  1. [Implementation of a rational standard of hygiene for preparation of operating rooms].

    PubMed

    Bauer, M; Scheithauer, S; Moerer, O; Pütz, H; Sliwa, B; Schmidt, C E; Russo, S G; Waeschle, R M

    2015-10-01

The assurance of high standards of care is a major requirement in German hospitals, while cost reduction and efficient use of resources are mandatory. These requirements are particularly evident in the high-risk and cost-intensive operating theatre field with its multiple process steps. The cleaning of operating rooms (OR) between surgical procedures is of major relevance for patient safety and requires time and human resources. The hygiene procedure plan for OR cleaning between operations at the university hospital in Göttingen was revised and optimized according to the plan-do-check-act principle because responsibilities and the use of resources were not clearly specified, process times were prolonged and staff workload had increased. The current status was evaluated in 2012 as part of the first step "plan". The subsequent step "do" included an expert symposium with external consultants, interdisciplinary consensus conferences with an update of the former hygiene procedure plan, and the implementation process. All staff members involved were integrated into this change management process. The penetration rate of the training and information measures as well as the acceptance of and compliance with the new hygiene procedure plan were reviewed within the step "check". The rates of positive swabs and air samples as well as of postoperative wound infections were analyzed for quality control, and no evidence for a reduced effectiveness of the new hygiene plan was found. After the successful implementation of these measures, the next improvement cycle ("act") was performed in 2014, which led to a simplification of the hygiene plan by reducing the number of defined cleaning and disinfection programs for preparation of the OR. The reorganization measures described led to comprehensive acceptance of the hygiene procedure plan through distinct specifications of responsibilities, of the course of action and of the use of resources. Furthermore, a simplification of the plan, a rational staff assignment and reduced process times were accomplished. Finally, potential conflicts due to insufficient evidence-based knowledge among personnel were reduced. This project description can be used by other hospitals as a guideline for similar change management processes.

  2. Receptor-mediated gene transfer vectors: progress towards genetic pharmaceuticals.

    PubMed

    Molas, M; Gómez-Valadés, A G; Vidal-Alabró, A; Miguel-Turu, M; Bermudez, J; Bartrons, R; Perales, J C

    2003-10-01

    Although specific delivery to tissues and unique cell types in vivo has been demonstrated for many non-viral vectors, current methods are still inadequate for human applications, mainly because of limitations on their efficiencies. All the steps required for an efficient receptor-mediated gene transfer process may in principle be exploited to enhance targeted gene delivery. These steps are: DNA/vector binding, internalization, subcellular trafficking, vesicular escape, nuclear import, and unpacking either for transcription or other functions (i.e., antisense, RNA interference, etc.). The large variety of vector designs that are currently available, usually aimed at improving the efficiency of these steps, has complicated the evaluation of data obtained from specific derivatives of such vectors. The importance of the structure of the final vector and the consequences of design decisions at specific steps on the overall efficiency of the vector will be discussed in detail. We emphasize in this review that stability in serum and thus, proper bioavailability of vectors to their specific receptors may be the single greatest limiting factor on the overall gene transfer efficiency in vivo. We discuss current approaches to overcome the intrinsic instability of synthetic vectors in the blood. In this regard, a summary of the structural features of the vectors obtained from current protocols will be presented and their functional characteristics evaluated. Dissecting information on molecular conjugates obtained by such methodologies, when carefully evaluated, should provide important guidelines for the creation of effective, targeted and safe DNA therapeutics.

  3. SSME propellant path leak detection real-time

    NASA Technical Reports Server (NTRS)

    Crawford, R. A.; Smith, L. M.

    1994-01-01

    Included are four documents that outline the technical aspects of the research performed on NASA Grant NAG8-140: 'A System for Sequential Step Detection with Application to Video Image Processing'; 'Leak Detection from the SSME Using Sequential Image Processing'; 'Digital Image Processor Specifications for Real-Time SSME Leak Detection'; and 'A Color Change Detection System for Video Signals with Applications to Spectral Analysis of Rocket Engine Plumes'.
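The sequential step-detection theme of these reports can be illustrated with a standard one-sided CUSUM detector (a generic textbook sketch, not the algorithm from the grant documents; the drift and threshold values and the synthetic signal are invented):

```python
def cusum_step_detect(signal, mean0, drift=0.5, threshold=5.0):
    """One-sided CUSUM: accumulate the excess of each sample over the
    baseline mean plus an allowed drift, and flag the first index at
    which the cumulative sum crosses the threshold."""
    s = 0.0
    for i, x in enumerate(signal):
        s = max(0.0, s + (x - mean0 - drift))
        if s > threshold:
            return i
    return None  # no step detected

# Synthetic intensity trace: baseline 0.1, step up to 3.0 at sample 50.
sig = [0.1] * 50 + [3.0] * 50
step_at = cusum_step_detect(sig, mean0=0.1)  # flags shortly after sample 50
```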

  4. Wearing the Coveted Purple Ribbon: What Students Need To Know About the Cooperative Buying Process and How It Saves Money for Your Schools.

    ERIC Educational Resources Information Center

    Bestler, Laura

    1998-01-01

    Cooperative buying is a process developed to enable colleges and universities in specific geographic areas to block-book dates for an artist or attraction, resulting in lower costs to schools and more logical routing for artists. A seven-step procedure, beginning before the campus activities convention and continuing through contract completion…

  5. Data specifications for INSPIRE

    NASA Astrophysics Data System (ADS)

    Portele, Clemens; Woolf, Andrew; Cox, Simon

    2010-05-01

    In Europe a major recent development has been the entering in force of the INSPIRE Directive in May 2007, establishing an infrastructure for spatial information in Europe to support Community environmental policies, and policies or activities which may have an impact on the environment. INSPIRE is based on the infrastructures for spatial information established and operated by the 27 Member States of the European Union. The Directive addresses 34 spatial data themes needed for environmental applications, with key components specified through technical implementing rules. This makes INSPIRE a unique example of a legislative "regional" approach. One of the requirements of the INSPIRE Directive is to make existing spatial data sets with relevance for one of the spatial data themes available in an interoperable way, i.e. where the spatial data from different sources in Europe can be combined to a coherent result. Since INSPIRE covers a wide range of spatial data themes, the first step has been the development of a modelling framework that provides a common foundation for all themes. This framework is largely based on the ISO 19100 series of standards. The use of common generic spatial modelling concepts across all themes is an important enabler for interoperability. As a second step, data specifications for the first set of themes has been developed based on the modelling framework. The themes include addresses, transport networks, protected sites, hydrography, administrative areas and others. The data specifications were developed by selected experts nominated by stakeholders from all over Europe. For each theme a working group was established in early 2008 working on their specific theme and collaborating with the other working groups on cross-theme issues. 
After a public review of the draft specifications starting in December 2008, an open testing process and thorough comment resolution process, the draft technical implementing rules for these themes have been approved by the INSPIRE Committee. After they enter into force they become part of the legal framework and European Member States have to implement these rules. The next step is the development of the remaining 25 spatial data themes, which include many themes of interest for the Earth Sciences including geology, meteorological and oceanographic geographic features, atmospheric conditions, habitats and biotopes, species distribution, environmental monitoring facilities, and land cover to name a few. The process will follow in general the same steps as for the first themes and the working groups are expected to start their work in March/April 2010. The first draft specifications for public comment are expected at the end of 2010 and the work is scheduled to be completed in 2012. At the same time, other initiatives like GMES (Global Monitoring for Environment and Security) and GEOSS (Global Earth Observation System of Systems) are also dealing with spatial data from the themes covered by INSPIRE. With the EU-funded project GIGAS, a support action, a step has been made towards architectural coherence between these initiatives. Recommendations to improve the coherence of the information architectures across the initiatives have been discussed in January 2010 with stakeholders from all initiatives, the standards organisations and EU-funded research projects. Based on the general agreements achieved in these discussions, the next step will be to start working towards the implementation of these recommendations, which are in line with the approach taken by the INSPIRE data specifications.

  6. Cycle development and design for CO{sub 2} capture from flue gas by vacuum swing adsorption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jun Zhang; Paul A. Webley

CO{sub 2} capture and storage is an important component in the development of clean power generation processes. One CO{sub 2} capture technology is gas-phase adsorption, specifically pressure (or vacuum) swing adsorption. The complexity of these processes makes evaluation and assessment of new adsorbents difficult and time-consuming. In this study, we have developed a simple model specifically targeted at CO{sub 2} capture by pressure swing adsorption and validated our model by comparison with data from a fully instrumented pilot-scale pressure swing adsorption process. The model captures non-isothermal effects as well as nonlinear adsorption and nitrogen coadsorption. Using the model and our apparatus, we have designed and studied a large number of cycles for CO{sub 2} capture. We demonstrate that by careful management of adsorption fronts and assembly of cycles based on understanding of the roles of individual steps, we are able to quickly assess the effect of adsorbents and process parameters on capture performance and identify optimal operating regimes and cycles. We recommend this approach in contrast to exhaustive parametric studies which tend to depend on specifics of the chosen cycle and adsorbent. We show that appropriate combinations of process steps can yield excellent process performance and demonstrate how the pressure drop, and heat loss, etc. affect process performance through their effect on adsorption fronts and profiles. Finally, cyclic temperature profiles along the adsorption column can be readily used to infer concentration profiles - this has proved to be a very useful tool in cyclic function definition. Our research reveals excellent promise for the application of pressure/vacuum swing adsorption technology in the arena of CO{sub 2} capture from flue gases. 20 refs., 6 figs., 2 tabs.

  7. Cycle development and design for CO2 capture from flue gas by vacuum swing adsorption.

    PubMed

    Zhang, Jun; Webley, Paul A

    2008-01-15

CO2 capture and storage is an important component in the development of clean power generation processes. One CO2 capture technology is gas-phase adsorption, specifically pressure (or vacuum) swing adsorption. The complexity of these processes makes evaluation and assessment of new adsorbents difficult and time-consuming. In this study, we have developed a simple model specifically targeted at CO2 capture by pressure swing adsorption and validated our model by comparison with data from a fully instrumented pilot-scale pressure swing adsorption process. The model captures nonisothermal effects as well as nonlinear adsorption and nitrogen coadsorption. Using the model and our apparatus, we have designed and studied a large number of cycles for CO2 capture. We demonstrate that by careful management of adsorption fronts and assembly of cycles based on understanding of the roles of individual steps, we are able to quickly assess the effect of adsorbents and process parameters on capture performance and identify optimal operating regimes and cycles. We recommend this approach in contrast to exhaustive parametric studies which tend to depend on specifics of the chosen cycle and adsorbent. We show that appropriate combinations of process steps can yield excellent process performance and demonstrate how the pressure drop, and heat loss, etc. affect process performance through their effect on adsorption fronts and profiles. Finally, cyclic temperature profiles along the adsorption column can be readily used to infer concentration profiles; this has proved to be a very useful tool in cyclic function definition. Our research reveals excellent promise for the application of pressure/vacuum swing adsorption technology in the arena of CO2 capture from flue gases.
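The nonlinear adsorption with nitrogen coadsorption mentioned above is often represented by an extended Langmuir isotherm; a minimal sketch follows (all parameter values are illustrative placeholders, not the fitted values from this study):

```python
def extended_langmuir(p_co2, p_n2, q_co2=4.0, b_co2=0.8, q_n2=2.0, b_n2=0.05):
    """Binary extended-Langmuir loadings (mol/kg) for CO2 and N2 at the
    given partial pressures (bar); the two species compete for adsorption
    sites through the shared denominator."""
    denom = 1.0 + b_co2 * p_co2 + b_n2 * p_n2
    n_co2 = q_co2 * b_co2 * p_co2 / denom
    n_n2 = q_n2 * b_n2 * p_n2 / denom
    return n_co2, n_n2

# Flue-gas-like composition at 1 bar total: 15% CO2, 85% N2.
n_co2, n_n2 = extended_langmuir(0.15, 0.85)
```

Even at a fivefold lower partial pressure, the CO2 loading dominates here because of its much stronger affinity parameter; this equilibrium selectivity is what a pressure/vacuum swing cycle exploits.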

  8. The road to JCAHO disease-specific care certification: a step-by-step process log.

    PubMed

    Morrison, Kathy

    2005-01-01

    In 2002, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) implemented Disease-Specific Care (DSC) certification. This is a voluntary program in which organizations have their disease management program evaluated by this regulatory agency. Some of the DSC categories are stroke, heart failure, acute MI, diabetes, and pneumonia. The criteria for any disease management program certification are: compliance with consensus-based national standards, effective use of established clinical practice guidelines to manage and optimize care, and an organized approach to performance measurement and improvement activities. Successful accomplishment of DSC certification defines organizations as Centers of Excellence in management of that particular disease. This article will review general guidelines for DSC certification with an emphasis on Primary Stroke Center certification.

  9. Self-Regulated Strategy Development. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2017

    2017-01-01

    "Self-Regulated Strategy Development" ("SRSD") is an intervention designed to improve students' academic skills through a six-step process that teaches students specific academic strategies and self-regulation skills. The practice is especially appropriate for students with learning disabilities, the focal population of the…

  10. How to Develop a Better Guidance Program.

    ERIC Educational Resources Information Center

    Lawrence, William W.

    Designed for school board members, superintendents, administrators, counselors, and teachers, the paper outlines steps toward the systematic development of a guidance program. Following a rationale and introduction, the paper recommends a specific process for program development: (1) the identification of student needs through formal needs…

  11. Guidance for Product Category Rule Development: Process, Outcome and Next Steps

    EPA Science Inventory

    Background The development of Product Category Rules (PCRs) is inconsistent among the program operators using ISO 14025 as the basis. Furthermore, the existence of several other product claim standards and specifications that require PCRs for making product claims, has the potent...

  12. Effects of aging on the relationship between cognitive demand and step variability during dual-task walking.

    PubMed

    Decker, Leslie M; Cignetti, Fabien; Hunt, Nathaniel; Potter, Jane F; Stergiou, Nicholas; Studenski, Stephanie A

    2016-08-01

    A U-shaped relationship between cognitive demand and gait control may exist in dual-task situations, reflecting opposing effects of external focus of attention and attentional resource competition. The purpose of the study was twofold: to examine whether gait control, as evaluated from step-to-step variability, is related to cognitive task difficulty in a U-shaped manner and to determine whether age modifies this relationship. Young and older adults walked on a treadmill without attentional requirement and while performing a dichotic listening task under three attention conditions: non-forced (NF), forced-right (FR), and forced-left (FL). The conditions increased in their attentional demand and requirement for inhibitory control. Gait control was evaluated by the variability of step parameters related to balance control (step width) and rhythmic stepping pattern (step length and step time). A U-shaped relationship was found for step width variability in both young and older adults and for step time variability in older adults only. Cognitive performance during dual tasking was maintained in both young and older adults. The U-shaped relationship, which presumably results from a trade-off between an external focus of attention and competition for attentional resources, implies that higher-level cognitive processes are involved in walking in young and older adults. Specifically, while these processes are initially involved only in the control of (lateral) balance during gait, they become necessary for the control of (fore-aft) rhythmic stepping pattern in older adults, suggesting that attentional resources turn out to be needed in all facets of walking with aging. Finally, despite the cognitive resources required by walking, both young and older adults spontaneously adopted a "posture second" strategy, prioritizing the cognitive task over the gait task.
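Step-to-step variability of the kind measured here is commonly summarised as a coefficient of variation; a minimal sketch follows (the numeric series are invented illustrations, not study data):

```python
import statistics

def step_variability(values):
    """Coefficient of variation (%) of a series of step parameters,
    e.g. step times (s) or step widths (m): 100 * SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

steady_step_times = [0.52, 0.51, 0.53, 0.52, 0.52, 0.51]  # regular gait
noisy_step_times = [0.48, 0.57, 0.45, 0.60, 0.50, 0.55]   # variable gait

cv_steady = step_variability(steady_step_times)
cv_noisy = step_variability(noisy_step_times)  # larger, i.e. less stable stepping
```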

  13. Altered synthesis and processing of oligosaccharides of vesicular stomatitis virus glycoprotein in different lectin-resistant Chinese hamster ovary cell lines.

    PubMed

    Hunt, L A

    1980-08-01

    To determine the particular intracellular steps in the glycosylation of the vesicular stomatitis virus (VSV) glycoprotein that were altered in several lectin-resistant CHO cell lines, VSV-infected parental and mutant cells were pulse-labeled for 30 and 120 min with [3H]mannose and [3H]glucosamine. Cell-associated viral glycopeptides were analyzed by gel filtration combined with specific glycosidase digestions and compared with the corresponding mature virion oligosaccharides. The intracellular glycosylation of the VSV glycoprotein in a mutant cell line resistant to phytohemagglutinin was identical to that in the normal cells except for a complete block in processing at a specific step in the final trimming of the oligomannosyl core from five to three mannoses. The results demonstrated that a double-mutant cell line selected from the phytohemagglutinin-resistant cells for resistance to concanavalin A had an additional defect in one of the earliest stages of glycosylation, resulting in smaller precursor oligosaccharides linked to protein.

  14. Altered synthesis and processing of oligosaccharides of vesicular stomatitis virus glycoprotein in different lectin-resistant Chinese hamster ovary cell lines.

    PubMed Central

    Hunt, L A

    1980-01-01

To determine the particular intracellular steps in the glycosylation of the vesicular stomatitis virus (VSV) glycoprotein that were altered in several lectin-resistant CHO cell lines, VSV-infected parental and mutant cells were pulse-labeled for 30 and 120 min with [3H]mannose and [3H]glucosamine. Cell-associated viral glycopeptides were analyzed by gel filtration combined with specific glycosidase digestions and compared with the corresponding mature virion oligosaccharides. The intracellular glycosylation of the VSV glycoprotein in a mutant cell line resistant to phytohemagglutinin was identical to that in the normal cells except for a complete block in processing at a specific step in the final trimming of the oligomannosyl core from five to three mannoses. The results demonstrated that a double-mutant cell line selected from the phytohemagglutinin-resistant cells for resistance to concanavalin A had an additional defect in one of the earliest stages of glycosylation, resulting in smaller precursor oligosaccharides linked to protein. PMID:6255177

  15. [Practical guidelines for genetic testing in cardiovascular diseases].

    PubMed

    Reinhard, W; Trenkwalder, T; Schunkert, H

    2017-08-01

    In the last decade, genetic testing for cardiovascular disorders has become more and more relevant. Progress in molecular genetics has led to new opportunities for diagnostics, improved risk prediction and could lead to novel therapeutic approaches. Genetic diagnostic testing is relevant for both confirming a diagnosis as well as deciding on therapeutic consequences, if applicable. Furthermore, predictive testing in family members for specific cardiovascular diseases is now a standard procedure in holistic patient management. The process of genetic testing as well as documentation requirements and discussion of test results with patients are subject to legal regulations. These regulations might be confusing for clinical practitioners/cardiologists. The aim of this article is to provide a clinical framework for genetic testing. First, we explain the legal and ethical background. Second, we illustrate the process of genetic testing step by step and present updates on remuneration. Finally, we discuss the significance of genetic testing and specific disease indications in cardiology.

  16. An adjuvant-modulated vaccine response in human whole blood

    PubMed Central

    Hakimi, Jalil; Azizi, Ali; Ausar, Salvador F.; Todryk, Stephen M.; Rahman, Nausheen; Brookes, Roger H.

    2017-01-01

    The restimulation of an immune memory response by in vitro culture of blood cells with a specific antigen has been used for decades as a way to gauge immunity to vaccines. In this commentary we discuss a less appreciated application: supporting vaccine process development. We report that human whole blood from pre-primed subjects can generate a profound adjuvant-modulated, antigen-specific response to several different vaccine formulations. The response is able to differentiate subtle changes in the quality of an immune memory response to vaccine formulations and can be used to select optimal conditions for a particular manufacturing process step. While questions remain about how closely the assay reflects in vivo vaccination, the approach is a significant step toward the more relevant human response. It has special importance for new adjuvant development, complementing other preclinical in vivo and in vitro approaches to considerably de-risk progression of novel vaccines before and throughout early clinical development. Broader implications of the approach are discussed. PMID:28605295

  17. Visibility Equalizer Cutaway Visualization of Mesoscopic Biological Models.

    PubMed

    Le Muzic, M; Mindek, P; Sorger, J; Autin, L; Goodsell, D; Viola, I

    2016-06-01

    In scientific illustrations and visualization, cutaway views are often employed as an effective technique for occlusion management in densely packed scenes. We propose a novel method for authoring cutaway illustrations of mesoscopic biological models. In contrast to existing cutaway algorithms, we take advantage of the specific nature of the biological models. These models consist of thousands of instances with a comparably smaller number of different types. Our method constitutes a two-stage process. In the first step, clipping objects are placed in the scene, creating a cutaway visualization of the model. During this process, a hierarchical list of stacked bars informs the user about the instance visibility distribution of each individual molecular type in the scene. In the second step, the visibility of each molecular type is fine-tuned through these bars, which at this point act as interactive visibility equalizers. An evaluation of our technique with domain experts confirmed that our equalizer-based approach to visibility specification is valuable and effective for both scientific and educational purposes.

  18. Visibility Equalizer Cutaway Visualization of Mesoscopic Biological Models

    PubMed Central

    Le Muzic, M.; Mindek, P.; Sorger, J.; Autin, L.; Goodsell, D.; Viola, I.

    2017-01-01

    In scientific illustrations and visualization, cutaway views are often employed as an effective technique for occlusion management in densely packed scenes. We propose a novel method for authoring cutaway illustrations of mesoscopic biological models. In contrast to existing cutaway algorithms, we take advantage of the specific nature of the biological models. These models consist of thousands of instances with a comparably smaller number of different types. Our method constitutes a two-stage process. In the first step, clipping objects are placed in the scene, creating a cutaway visualization of the model. During this process, a hierarchical list of stacked bars informs the user about the instance visibility distribution of each individual molecular type in the scene. In the second step, the visibility of each molecular type is fine-tuned through these bars, which at this point act as interactive visibility equalizers. An evaluation of our technique with domain experts confirmed that our equalizer-based approach to visibility specification is valuable and effective for both scientific and educational purposes. PMID:28344374

  19. The Automated Array Assembly Task of the Low-cost Silicon Solar Array Project, Phase 2

    NASA Technical Reports Server (NTRS)

    Coleman, M. G.; Grenon, L.; Pastirik, E. M.; Pryor, R. A.; Sparks, T. G.

    1978-01-01

    An advanced process sequence for manufacturing high efficiency solar cells and modules in a cost-effective manner is discussed. Emphasis is on process simplicity and minimizing consumed materials. The process sequence incorporates texture etching, plasma processes for damage removal and patterning, ion implantation, low pressure silicon nitride deposition, and plated metal. A reliable module design is presented. Specific process step developments are given. A detailed cost analysis was performed to indicate future areas of fruitful cost reduction effort. Recommendations for advanced investigations are included.

  20. Plasmonic nanobubbles for target cell-specific gene and drug delivery and multifunctional processing of heterogeneous cell systems

    NASA Astrophysics Data System (ADS)

    Lukianova-Hleb, Ekaterina Y.; Huye, Leslie E.; Brenner, Malcolm K.; Lapotko, Dmitri O.

    2014-03-01

    Cell and gene cancer therapies require ex vivo cell processing of human grafts. Such processing requires at least three steps - cell enrichment, cell separation (destruction), and gene transfer - each of which requires a separate technology. While these technologies may be satisfactory for research use, they are of limited usefulness in the clinical treatment setting because they have a low processing rate, as well as low transfection and separation efficacy and specificity in heterogeneous human grafts. Most problematic, because current technologies are administered in multiple steps - rather than in a single, multifunctional, simultaneous procedure - they lengthen the treatment process and introduce an unnecessary level of complexity, labor, and resource use into clinical treatment; all these limitations result in high losses of valuable cells. We report a universal, high-throughput, multifunctional technology that simultaneously (1) injects free external cargo into target cells, (2) destroys unwanted cells, and (3) preserves valuable non-target cells in heterogeneous grafts. Each of these functions has single-target-cell specificity in a heterogeneous cell system, with a processing rate > 45 million cells/min, an injection efficacy of 90% with 96% viability of the injected cells, a target cell destruction efficacy > 99%, and viability of non-target cells > 99%. The developed technology employs novel cellular agents, called plasmonic nanobubbles (PNBs). PNBs are not particles but transient, intracellular events: vapor nanobubbles that expand and collapse in mere nanoseconds under optical excitation of gold nanoparticles with short picosecond laser pulses. PNBs of different, cell-specific size (1) inject free external cargo with small PNBs, (2) destroy other target cells mechanically with large PNBs, and (3) preserve non-target cells.
The multifunctionality, precision, and high throughput of the all-in-one PNB technology will tremendously impact cell and gene therapies and other clinical applications that depend on ex vivo processing of heterogeneous cell systems.

  1. Achieving Continuous Manufacturing for Final Dosage Formation: Challenges and How to Meet Them May 20-21 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous ones. In heterogeneous processing, there are discrete particles that can segregate, whereas in homogeneous processing, components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing ones, whereas homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it takes long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the drawback that, as its technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, what we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation to continuous processing, it is expected that homogeneous processing is the step that will follow. Specific action items for industry leaders are presented. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  2. Post-growth process for flexible CdS/CdTe thin film solar cells with high specific power.

    PubMed

    Cho, Eunwoo; Kang, Yoonmook; Kim, Donghwan; Kim, Jihyun

    2016-05-16

    We demonstrated a flexible CdS/CdTe thin film solar cell with a high specific power of approximately 254 W/kg. A flexible, ultra-lightweight CdS/CdTe cell treated with a pre-NP etch process exhibited a high conversion efficiency of 13.56% in the superstrate configuration. Morphological, structural, and optical changes of CdS/CdTe thin films were characterized when the pre-NP etch step was incorporated into the conventional post-deposition process. The improvement of photovoltaic parameters can be attributed to the removal of the oxide and the formation of a Te-rich layer, which benefit the activation process. Pre-NP etched cells maintained their flexibility and performance under a repeated tensile strain of 0.13%. Our method can pave the way for manufacturing flexible CdS/CdTe thin film solar cells with high specific power for mobile and aerospace applications.

  3. Steps to Ensure a Successful Implementation of Occupational Health and Safety Interventions at an Organizational Level

    PubMed Central

    Herrera-Sánchez, Isabel M.; León-Pérez, José M.; León-Rubio, José M.

    2017-01-01

    There is increasing meta-analytic evidence that addresses the positive impact of evidence-based occupational health and safety interventions on employee health and well-being. However, such evidence is less clear when interventions are approached at an organizational level and are aimed at changing organizational policies and processes. Given that occupational health and safety interventions are usually tailored to specific organizational contexts, generalizing and transferring such interventions to other organizations is a complex endeavor. In response, several authors have argued that an evaluation of the implementation process is crucial for assessing the intervention’s effectiveness and for understanding how and why the intervention has been (un)successful. Thus, this paper focuses on the implementation process and attempts to move this field forward by identifying the main factors that contribute toward ensuring a greater success of occupational health and safety interventions conducted at the organizational level. In doing so, we propose some steps that can guide a successful implementation. These implementation steps are illustrated using examples of evidence-based best practices reported in the literature that have described and systematically evaluated the implementation process behind their interventions during the last decade. PMID:29375413

  4. Site-selective substitutional doping with atomic precision on stepped Al (111) surface by single-atom manipulation

    PubMed Central

    2014-01-01

    In the fabrication of nano- and quantum devices, it is sometimes critical to position individual dopants at certain sites precisely to obtain specific or enhanced functionalities. With first-principles simulations, we propose a method for substitutional doping of an individual atom at a certain position on a stepped metal surface by single-atom manipulation. A selected atom at the step of an Al (111) surface can be extracted vertically with an Al trimer-apex tip, and the dopant atom is then positioned at this site. The details of the entire process, including potential energy curves, are given, which suggests the reliability of the proposed single-atom doping method. PMID:24899871

  5. Site-selective substitutional doping with atomic precision on stepped Al (111) surface by single-atom manipulation.

    PubMed

    Chen, Chang; Zhang, Jinhu; Dong, Guofeng; Shao, Hezhu; Ning, Bo-Yuan; Zhao, Li; Ning, Xi-Jing; Zhuang, Jun

    2014-01-01

    In the fabrication of nano- and quantum devices, it is sometimes critical to position individual dopants at certain sites precisely to obtain specific or enhanced functionalities. With first-principles simulations, we propose a method for substitutional doping of an individual atom at a certain position on a stepped metal surface by single-atom manipulation. A selected atom at the step of an Al (111) surface can be extracted vertically with an Al trimer-apex tip, and the dopant atom is then positioned at this site. The details of the entire process, including potential energy curves, are given, which suggests the reliability of the proposed single-atom doping method.

  6. Knowledge Discovery from Vibration Measurements

    PubMed Central

    Li, Jian; Wang, Daoyao

    2014-01-01

    The framework and particular algorithms of the pattern recognition process are widely adopted in structural health monitoring (SHM). However, as part of the overall process of knowledge discovery from databases (KDD), the results of pattern recognition are only changes and patterns of changes in data features. In this paper, based on the similarity between KDD and SHM and considering the particularities of SHM problems, a four-step framework for SHM is proposed which extends the final goal of SHM from detecting damage to extracting knowledge that facilitates decision making. The purposes and proper methods of each step of this framework are discussed. To demonstrate the proposed SHM framework, a specific SHM method composed of second-order structural parameter identification, statistical control chart analysis, and system reliability analysis is then presented. To examine the performance of this SHM method, real sensor data measured from a lab-size steel bridge model structure are used. The developed four-step framework of SHM has the potential to clarify the process of SHM and facilitate the further development of SHM techniques. PMID:24574933
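    The statistical control chart step of such an SHM method can be illustrated with a minimal numpy-only sketch on synthetic data. The monitored feature, the 3-sigma limits, and the simulated stiffness shift are all illustrative assumptions, not taken from the paper:

```python
import numpy as np

def control_chart_limits(baseline, k=3.0):
    """Shewhart-style control limits estimated from baseline (healthy) samples."""
    mu, sigma = np.mean(baseline), np.std(baseline, ddof=1)
    return mu - k * sigma, mu + k * sigma

def flag_out_of_control(samples, lcl, ucl):
    """Return indices of monitoring samples falling outside the control limits."""
    samples = np.asarray(samples)
    return np.where((samples < lcl) | (samples > ucl))[0]

rng = np.random.default_rng(0)

# Baseline phase: an identified structural feature of the healthy structure.
baseline = rng.normal(1.0, 0.02, 200)
lcl, ucl = control_chart_limits(baseline)

# Monitoring phase: a simulated damage event shifts the feature downward
# halfway through the record.
monitoring = np.concatenate([rng.normal(1.00, 0.02, 50),
                             rng.normal(0.85, 0.02, 50)])
alarms = flag_out_of_control(monitoring, lcl, ucl)
# Alarms concentrate in the second half, where the simulated damage occurs.
```

The flagged indices would then feed the reliability-analysis step rather than being reported directly as "damage", in line with the paper's knowledge-extraction framing.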

  7. The neural correlates of morphological complexity processing: Detecting structure in pseudowords.

    PubMed

    Schuster, Swetlana; Scharinger, Mathias; Brooks, Colin; Lahiri, Aditi; Hartwigsen, Gesa

    2018-06-01

    Morphological complexity is a highly debated issue in visual word recognition. Previous neuroimaging studies have shown that speakers are sensitive to degrees of morphological complexity. Two-step derived complex words (bridging, via bridge(N) > bridge(V) > bridging) led to more enhanced activation in the left inferior frontal gyrus than their one-step derived counterparts (running, via run(V) > running). However, it remains unclear whether sensitivity to degrees of morphological complexity extends to pseudowords. If this were the case, it would indicate that abstract knowledge of morphological structure is independent of lexicality. We addressed this question by investigating the processing of two sets of pseudowords in German. Both sets contained morphologically viable two-step derived pseudowords differing in the number of derivational steps required to access an existing lexical representation and therefore the degree of structural analysis expected during processing. Using a 2 × 2 factorial design, we found lexicality effects to be distinct from processing signatures relating to structural analysis in pseudowords. Semantically driven processes such as lexical search showed a more frontal distribution, while combinatorial processes related to structural analysis engaged more parietal parts of the network. Specifically, more complex pseudowords showed increased activation in parietal regions (right superior parietal lobe and left precuneus) relative to pseudowords that required less structural analysis to arrive at an existing lexical representation. As the two sets were matched on cohort size and surface form, these results highlight the role of internal levels of morphological structure even in forms that do not possess a lexical representation. © 2018 Wiley Periodicals, Inc.

  8. A detailed comparison of analysis processes for MCC-IMS data in disease classification—Automated methods can replace manual peak annotations

    PubMed Central

    Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven

    2017-01-01

    Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention by human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classification across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process, enabling unbiased, high-throughput use of the technology. PMID:28910313
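    The first two automated stages (peak detection per measurement, then peak clustering across experiments) can be sketched with a simplified, numpy-only illustration on synthetic traces. Here moving-average smoothing plus a local-maxima scan stands in for the Savitzky-Golay-based detectors, and a greedy 1-D density grouping stands in for DBSCAN; this is an assumption-laden sketch, not the authors' implementation:

```python
import numpy as np

def detect_peaks(signal, window=5, min_height=1.0):
    """Smooth with a moving average, then keep local maxima above a threshold
    (a simplified stand-in for Savitzky-Golay filtering + local-maxima search)."""
    kernel = np.ones(window) / window
    smooth = np.convolve(signal, kernel, mode="same")
    return np.array([i for i in range(1, len(smooth) - 1)
                     if smooth[i] > smooth[i - 1]
                     and smooth[i] >= smooth[i + 1]
                     and smooth[i] >= min_height])

def cluster_positions(positions, eps=3):
    """Greedy 1-D density clustering: peaks closer than eps share a label
    (a simplified stand-in for DBSCAN on peak coordinates)."""
    positions = np.sort(positions)
    labels, current = [], 0
    for j, p in enumerate(positions):
        if j > 0 and p - positions[j - 1] > eps:
            current += 1
        labels.append(current)
    return positions, np.array(labels)

# Two synthetic "measurements", each with compound peaks near positions 20 and 60.
t = np.arange(100)
m1 = 5 * np.exp(-0.5 * ((t - 20) / 2.0) ** 2) + 4 * np.exp(-0.5 * ((t - 60) / 2.0) ** 2)
m2 = 5 * np.exp(-0.5 * ((t - 21) / 2.0) ** 2) + 4 * np.exp(-0.5 * ((t - 59) / 2.0) ** 2)

all_peaks = np.concatenate([detect_peaks(m1), detect_peaks(m2)])
positions, labels = cluster_positions(all_peaks)
# Peaks from both measurements collapse into two compound clusters.
```

In the paper's full pipeline, the per-cluster peak intensities would then form the feature matrix passed to the multivariate classifier (Random Forest in the best configuration).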

  9. Leadership processes for re-engineering changes to the health care industry.

    PubMed

    Guo, Kristina L

    2004-01-01

    As health care organizations seek innovative ways to change financing and delivery mechanisms due to escalating health care costs and increased competition, drastic changes are being sought in the form of re-engineering. This study discusses the leader's role in re-engineering health care. It specifically addresses the reasons for failures in re-engineering and argues that success depends on senior-level leaders playing a critical role. Existing studies lack comprehensiveness in establishing models of re-engineering and management guidelines. This research focuses on integrating re-engineering and leadership processes in health care by creating a step-by-step model. In particular, it illustrates the four Es (Examination, Establishment, Execution, and Evaluation) as a comprehensive re-engineering process that combines managerial roles and activities to produce successfully changed and re-engineered health care organizations.

  10. Optimal Signal Processing of Frequency-Stepped CW Radar Data

    NASA Technical Reports Server (NTRS)

    Ybarra, Gary A.; Wu, Shawkang M.; Bilbro, Griff L.; Ardalan, Sasan H.; Hearn, Chase P.; Neece, Robert T.

    1995-01-01

    An optimal signal processing algorithm is derived for estimating the time delay and amplitude of each scatterer reflection using a frequency-stepped CW system. The channel is assumed to be composed of abrupt changes in the reflection coefficient profile. The optimization technique is intended to maximize the target range resolution achievable from any set of frequency-stepped CW radar measurements made in such an environment. The algorithm is composed of an iterative two-step procedure. First, the amplitudes of the echoes are optimized by solving an overdetermined least squares set of equations. Then, a nonlinear objective function is scanned in an organized fashion to find its global minimum. The result is a set of echo strengths and time delay estimates. Although this paper addresses the specific problem of resolving the time delay between the first two echoes, the derivation is general in the number of echoes. Performance of the optimization approach is illustrated using measured data obtained from an HP-8510 network analyzer. It is demonstrated that the optimization approach offers a significant resolution enhancement over the standard processing approach that employs an IFFT. Degradation in the performance of the algorithm due to suboptimal model order selection and the effects of additive white Gaussian noise is also addressed.
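    The two-step procedure can be sketched on a synthetic two-echo channel (all parameter values below are illustrative, not from the paper): Step 1 solves an overdetermined least squares system for the echo amplitudes given candidate delays, and Step 2 scans a delay grid in an organized fashion for the global minimum of the fit residual.

```python
import numpy as np

# Frequency-stepped CW model: H(f_k) = sum_i a_i * exp(-j*2*pi*f_k*tau_i).
freqs = np.linspace(1e9, 2e9, 64)        # 64 frequency steps (Hz)
true_taus = np.array([10e-9, 13e-9])     # true echo delays (s)
true_amps = np.array([1.0, 0.6])         # true echo strengths
H = sum(a * np.exp(-2j * np.pi * freqs * t)
        for a, t in zip(true_amps, true_taus))

def residual(taus):
    """Step 1: for candidate delays, solve the overdetermined least squares
    system for the amplitudes; return the resulting fit residual."""
    E = np.exp(-2j * np.pi * np.outer(freqs, taus))
    amps, *_ = np.linalg.lstsq(E, H, rcond=None)
    return np.linalg.norm(H - E @ amps)

# Step 2: organized scan of ordered delay pairs for the global minimum.
grid = np.arange(8e-9, 16e-9, 0.25e-9)
best = min(((t1, t2) for i, t1 in enumerate(grid) for t2 in grid[i + 1:]),
           key=residual)
# best recovers the two true delays (up to the grid spacing).
```

In practice the scan would be iterated on a refined grid around the minimum; with noiseless synthetic data a single coarse scan already lands on the true delays.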

  11. Optimal Signal Processing of Frequency-Stepped CW Radar Data

    NASA Technical Reports Server (NTRS)

    Ybarra, Gary A.; Wu, Shawkang M.; Bilbro, Griff L.; Ardalan, Sasan H.; Hearn, Chase P.; Neece, Robert T.

    1995-01-01

    An optimal signal processing algorithm is derived for estimating the time delay and amplitude of each scatterer reflection using a frequency-stepped CW system. The channel is assumed to be composed of abrupt changes in the reflection coefficient profile. The optimization technique is intended to maximize the target range resolution achievable from any set of frequency-stepped CW radar measurements made in such an environment. The algorithm is composed of an iterative two-step procedure. First, the amplitudes of the echoes are optimized by solving an overdetermined least squares set of equations. Then, a nonlinear objective function is scanned in an organized fashion to find its global minimum. The result is a set of echo strengths and time delay estimates. Although this paper addresses the specific problem of resolving the time delay between the first two echoes, the derivation is general in the number of echoes. Performance of the optimization approach is illustrated using measured data obtained from an HP-8510 network analyzer. It is demonstrated that the optimization approach offers a significant resolution enhancement over the standard processing approach that employs an IFFT. Degradation in the performance of the algorithm due to suboptimal model order selection and the effects of additive white Gaussian noise is also addressed.

  12. Testing Components of a Self-Management Theory in Adolescents With Type 1 Diabetes Mellitus.

    PubMed

    Verchota, Gwen; Sawin, Kathleen J

    The role of self-management in adolescents with type 1 diabetes mellitus is not well understood. The purpose of the research was to examine the relationship of key context and process variables from individual and family self-management theory to proximal (self-management behaviors) and distal (hemoglobin A1c and diabetes-specific health-related quality of life) outcomes in adolescents with type 1 diabetes. A correlational, cross-sectional study was conducted to identify factors contributing to outcomes in adolescents with type 1 diabetes and to examine potential relationships between context, process, and outcome variables delineated in individual and family self-management theory. Participants were 103 adolescent-parent dyads (adolescents ages 12-17) with type 1 diabetes from a Midwest outpatient diabetes clinic. The dyads completed a self-report survey including instruments intended to measure context, process, and outcome variables from individual and family self-management theory. Using hierarchical multiple regression, context (depressive symptoms) and process (communication) variables explained 37% of the variance in self-management behaviors. Regimen complexity, the only significant predictor, explained 11% of the variance in hemoglobin A1c; neither process variables nor self-management behaviors were significant. For the diabetes-specific health-related quality of life outcome, context (regimen complexity and depressive symptoms) explained 26% of the variance at Step 1; an additional 9% of the variance was explained when process (self-efficacy and communication) variables were added at Step 2; and 52% of the variance was explained when self-management behaviors were added at Step 3. In the final model, three variables were significant predictors: depressive symptoms, self-efficacy, and self-management behaviors.
The individual and family self-management theory can serve as a cogent theory for understanding key concepts, processes, and outcomes essential to self-management in adolescents and families dealing with type 1 diabetes mellitus.
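    The hierarchical (block-wise) regression design described above can be reproduced structurally with a small numpy sketch. The variable names mirror the study's predictor blocks, but all values are simulated, not the study's data: R-squared is computed after each block is entered, and the increment at each step quantifies that block's added explanatory power.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(1)
n = 103  # mirrors the study's sample size; the data below are synthetic

context = rng.normal(size=(n, 2))    # e.g. regimen complexity, depressive symptoms
process = rng.normal(size=(n, 2))    # e.g. self-efficacy, communication
behavior = rng.normal(size=(n, 1))   # self-management behaviors
y = (0.5 * context[:, 0] + 0.4 * process[:, 1]
     + 0.6 * behavior[:, 0] + rng.normal(scale=0.5, size=n))

r2_step1 = r_squared(context, y)                                  # Step 1
r2_step2 = r_squared(np.hstack([context, process]), y)            # Step 2
r2_step3 = r_squared(np.hstack([context, process, behavior]), y)  # Step 3
# Delta-R^2 between consecutive steps is each block's contribution.
```

Because the models are nested, R-squared is guaranteed to be non-decreasing across steps; only a meaningful increment (tested with an F statistic in practice) indicates that a block adds explanatory power.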

  13. Spectroturbidimitry as applied to biomedical and immunological investigations

    NASA Astrophysics Data System (ADS)

    Shchyogolev, Sergei Y.; Khlebtsov, Nikolai G.; Schwartsburd, Boris I.

    1993-06-01

    Methods of optical shifting spectroscopy and laser nephelometry may be used to register the antigen-antibody reaction. The use of laser nephelometry in serodiagnostics was investigated, along with the characteristics of a specific step of the humoral immune response in infectious, postinfectious, and postinoculation processes, and the intensity of immunity in several infectious diseases.

  14. Selection of species and sampling areas: The importance of inference

    Treesearch

    Paul Stephen Corn

    2009-01-01

    Inductive inference, the process of drawing general conclusions from specific observations, is fundamental to the scientific method. Platt (1964) termed conclusions obtained through rigorous application of the scientific method as "strong inference" and noted the following basic steps: generating alternative hypotheses; devising experiments, the...

  15. Discovery of Cellular Proteins Required for the Early Steps of HCV Infection Using Integrative Genomics

    PubMed Central

    Yang, Jae-Seong; Kwon, Oh Sung; Kim, Sanguk; Jang, Sung Key

    2013-01-01

    Successful viral infection requires intimate communication between virus and host cell, a process that absolutely requires various host proteins. However, current efforts to discover novel host proteins as therapeutic targets for viral infection are difficult. Here, we developed an integrative-genomics approach to predict human genes involved in the early steps of hepatitis C virus (HCV) infection. By integrating HCV and human protein associations, co-expression data, and tight junction-tetraspanin web specific networks, we identified host proteins required for the early steps in HCV infection. Moreover, we validated the roles of newly identified proteins in HCV infection by knocking down their expression using small interfering RNAs. Specifically, a novel host factor CD63 was shown to directly interact with HCV E2 protein. We further demonstrated that an antibody against CD63 blocked HCV infection, indicating that CD63 may serve as a new therapeutic target for HCV-related diseases. The candidate gene list provides a source for identification of new therapeutic targets. PMID:23593195

  16. Specificity and effector mechanisms of autoantibodies in congenital heart block.

    PubMed

    Wahren-Herlenius, Marie; Sonesson, Sven-Erik

    2006-12-01

    Complete congenital atrio-ventricular (AV) heart block develops in 2-5% of fetuses of Ro/SSA and La/SSB autoantibody-positive pregnant women. During pregnancy, the Ro/SSA and La/SSB antibodies are transported across the placenta and affect the fetus. Emerging data suggest that this happens by a two-stage process. In the first step, maternal autoantibodies bind fetal cardiomyocytes, dysregulate calcium homeostasis, and induce apoptosis in affected cells. This step might clinically correspond to a first-degree heart block and be reversible. La/SSB antibodies can bind apoptotic cardiomyocytes and thus increase Ig deposition in the heart. As a second step, the tissue damage could lead to spread of inflammation in genetically predisposed fetuses, progressing to fibrosis and calcification of the AV node and subsequent complete congenital heart block. Early intrauterine treatment of an incomplete AV block with fluorinated steroids has been shown to prevent progression of the heart block, making it clinically important to find specific markers to identify the high-risk pregnancies.

  17. nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.

    PubMed

    Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia

    2017-12-01

    Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have also been limited by time-consuming modeling workflows and the high level of expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that took a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create the supporting objects necessary to build models, e.g. surfaces, anatomical landmarks, and reference systems; a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom, and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application.
We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this will help promote personalized applications in musculoskeletal biomechanics, including larger sample-size studies, and it might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Method for indexing and retrieving manufacturing-specific digital imagery based on image content

    DOEpatents

    Ferrell, Regina K.; Karnowski, Thomas P.; Tobin, Jr., Kenneth W.

    2004-06-15

    A method for indexing and retrieving manufacturing-specific digital images based on image content comprises three steps. First, at least one feature vector can be extracted from a manufacturing-specific digital image stored in an image database. In particular, each extracted feature vector corresponds to a particular characteristic of the manufacturing-specific digital image, for instance, a digital image modality and overall characteristic, a substrate/background characteristic, and an anomaly/defect characteristic. Notably, the extracting step includes generating a defect mask using a detection process. Second, using an unsupervised clustering method, each extracted feature vector can be indexed in a hierarchical search tree. Third, a manufacturing-specific digital image associated with a feature vector stored in the hierarchical search tree can be retrieved, wherein the manufacturing-specific digital image has image content comparably related to the image content of a query image. More particularly, the retrieving step can include two data reductions, the first performed based upon a query vector extracted from the query image. Subsequently, a user can select relevant images resulting from the first data reduction. From the selection, a prototype vector can be calculated, from which a second-level data reduction can be performed. The second-level data reduction can result in a subset of feature vectors comparable to the prototype vector, and further comparable to the query vector. An additional fourth step can include managing the hierarchical search tree by substituting a vector average for several redundant feature vectors encapsulated by nodes in the hierarchical search tree.
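The two-level data reduction described above can be sketched as follows. This is an illustrative assumption, not the patented algorithm: a first reduction ranks stored feature vectors against the query vector, simulated user feedback selects relevant images, and a prototype vector (the mean of the selection) drives the second reduction. The image names and vectors are invented.

```python
# Hedged sketch of the two-level retrieval described in the patent abstract.
# The distance metric, candidate-set size, and data are illustrative
# assumptions, not the patented method.
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mean(vectors):
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def retrieve(db, query, relevant_ids, top_k=2):
    # First-level reduction: rank all stored feature vectors against the query.
    ranked = sorted(db, key=lambda item: dist(item[1], query))
    candidates = ranked[: max(top_k * 2, 4)]
    # Simulated user relevance feedback: keep only images marked relevant.
    selected = [v for (img, v) in candidates if img in relevant_ids]
    if not selected:
        return [img for img, _ in ranked[:top_k]]
    # Prototype vector from the selection drives the second-level reduction.
    proto = mean(selected)
    return [img for img, _ in sorted(db, key=lambda it: dist(it[1], proto))[:top_k]]

db = [("wafer_01", (0.1, 0.9)), ("wafer_02", (0.2, 0.8)),
      ("wafer_03", (0.9, 0.1)), ("wafer_04", (0.85, 0.15))]
hits = retrieve(db, query=(0.88, 0.12), relevant_ids={"wafer_03"})
```

A real implementation would traverse the hierarchical search tree rather than scan the database linearly; the sketch only shows the query-vector/prototype-vector logic.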

  19. Achieving continuous manufacturing for final dosage formation: challenges and how to meet them. May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous ones. In heterogeneous processing, there are discrete particles that can segregate, whereas in homogeneous processing, components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing technologies, whereas homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it requires long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the drawback that, as its technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, which we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation from batch to continuous processing, homogeneous processing is expected to follow as the next step. Specific action items for industry leaders are: Form precompetitive partnerships, including industry (pharmaceutical companies and equipment manufacturers), government, and universities.
These precompetitive partnerships would develop case studies of continuous manufacturing and ideally perform joint technology development, including development of small-scale equipment and processes. Develop ways to invest internally in continuous manufacturing. How best to do this will depend on the specifics of a given organization, in particular its current development projects. Upper managers will need to energize their process developers to incorporate continuous manufacturing in at least part of their processes to gain experience and demonstrate the benefits directly. Training in continuous manufacturing technologies, organizational approaches, and regulatory approaches is a key area that industry leaders should pursue together. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  20. One-step hydrothermal synthesis of hexangular starfruit-like vanadium oxide for high power aqueous supercapacitors

    NASA Astrophysics Data System (ADS)

    Shao, Jie; Li, Xinyong; Qu, Qunting; Zheng, Honghe

    2012-12-01

    Homogeneous hexangular starfruit-like vanadium oxide was prepared for the first time by a one-step hydrothermal method. The assembly process of the hexangular starfruit-like structure was observed from TEM images. The electrochemical performance of the starfruit-like vanadium oxide was examined by cyclic voltammetry and galvanostatic charge/discharge. The obtained starfruit-like vanadium oxide exhibits a high power capability (a specific energy of 19 Wh kg-1 at a specific power of 3.4 kW kg-1) and good cycling stability for supercapacitor applications.

  1. Literature Review: Crud Formation at the Liquid/Liquid Interface of TBP-Based Solvent-Extraction Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delegard, Calvin H.; Casella, Amanda J.

    2016-09-30

    This report summarizes the literature reviewed on crud formation at the liquid/liquid interface of solvent extraction processes. The review focuses both on classic PUREX extraction for industrial reprocessing, especially as practiced at the Hanford Site, and on those steps specific to plutonium purification that were used at the Plutonium Reclamation Facility (PRF) within the Plutonium Finishing Plant (PFP) at the Hanford Site.

  2. Monitoring stream temperatures—A guide for non-specialists

    USGS Publications Warehouse

    Heck, Michael P.; Schultz, Luke D.; Hockman-Wert, David; Dinger, Eric C.; Dunham, Jason B.

    2018-04-19

    Executive Summary: Water temperature influences most physical and biological processes in streams and, along with streamflow, is a major driver of ecosystem processes. Collecting data to measure water temperature is therefore imperative, and relatively straightforward. Several protocols exist for collecting stream temperature data, but these are frequently directed towards specialists. This document was developed to address the need for a protocol intended for non-specialist (non-aquatic) staff. It provides specific step-by-step procedures on (1) how to launch data loggers, (2) how to check the factory calibration of data loggers prior to field use, (3) how to install data loggers in streams for year-round monitoring, (4) how to download and retrieve data loggers from the field, and (5) how to input project data into organizational databases.
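Step (2), the pre-deployment calibration check, typically amounts to comparing logger readings against a known reference and a stated accuracy tolerance. The sketch below assumes an ice-bath reference of 0.0 deg C and a tolerance of 0.25 deg C; both figures and the readings are hypothetical illustrations, not values from the USGS protocol.

```python
# Illustrative pre-deployment calibration check: every reading recorded in a
# 0.0 deg C ice bath must fall within an assumed +/- 0.25 deg C tolerance.
def passes_calibration(readings_c, reference_c=0.0, tolerance_c=0.25):
    """Return True if every logger reading is within tolerance of reference."""
    return all(abs(r - reference_c) <= tolerance_c for r in readings_c)

ice_bath_readings = [0.05, -0.10, 0.12, 0.08]  # deg C, hypothetical
ok = passes_calibration(ice_bath_readings)
```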

  3. Chromatographic matrix based on hydrogel-coated reticulated polyurethane foams, prepared by gamma irradiation

    NASA Astrophysics Data System (ADS)

    Sánchez, Mirna L.; Giménez, Claudia Y.; Delgado, Juan F.; Martínez, Leandro J.; Grasselli, Mariano

    2017-12-01

    Novel chromatographic materials for protein purification with high adsorption capacity and fouling resistance are in high demand to improve downstream processes. Here, we describe a novel adsorptive material based on reticulated polyurethane foam (rPUF) coated with a functional hydrogel layer. The rPUF provides physical rigidity through its macroscopic structure, whereas the hydrogel layer provides the capacity to adsorb proteins by specific interactions. The hydrogel coating was performed by the dip-coating method, using a polyvinyl alcohol (PVA) solution. The PVA hydrogel was linked to the rPUF material by a radiation-induced crosslinking process in aqueous ethanol solution. The ethanol in the solvent mixture allowed a balance between PVA swelling and PVA dissolution during the irradiation step. The resulting material showed higher thermal stability than the non-irradiated one. In addition, a simultaneous radiation-induced grafting polymerization (SRIGP) was performed by simple addition of glycidyl methacrylate monomer to the irradiation solution. In a further step, sulfonic ligands were included specifically in the hydrogel layer, which contained around 200% PVA with respect to the original rPUF. The materials were characterized by FT-IR, thermogravimetric analysis, SEM, and EDX analysis. The cation-exchange rPUF material was functionally characterized by the Langmuir isotherm and a dynamic adsorption experiment to analyze its chromatographic properties for protein purification processes.

  4. Resolving the infection process reveals striking differences in the contribution of environment, genetics and phylogeny to host-parasite interactions.

    PubMed

    Duneau, David; Luijckx, Pepijn; Ben-Ami, Frida; Laforsch, Christian; Ebert, Dieter

    2011-02-22

    Infection processes consist of a sequence of steps, each critical for the interaction between host and parasite. Studies of host-parasite interactions rarely take into account the fact that different steps might be influenced by different factors and might, therefore, make different contributions to shaping coevolution. We designed a new method using the Daphnia magna-Pasteuria ramosa system, one of the rare examples where coevolution has been documented, in order to resolve the steps of the infection and analyse the factors that influence each of them. Using the transparent Daphnia hosts and fluorescently-labelled spores of the bacterium P. ramosa, we identified a sequence of infection steps: encounter between parasite and host; activation of parasite dormant spores; attachment of spores to the host; and parasite proliferation inside the host. The chances of encounter had been shown to depend on host genotype and environment. We tested the role of genetic and environmental factors in the newly described activation and attachment steps. Hosts of different genotypes, genders and species were all able to activate endospores of all parasite clones tested in different environments, suggesting that the activation cue is phylogenetically conserved. We next established that parasite attachment occurs on the host oesophagus independently of host species, gender and environmental conditions. In contrast to spore activation, attachment depended strongly on the combination of host and parasite genotypes. Our results show that different steps are influenced by different factors. Host-type-independent spore activation suggests that this step can be ruled out as a major factor in Daphnia-Pasteuria coevolution. On the other hand, we show that the attachment step is crucial for the pronounced genetic specificities of this system. We suggest that this one step can explain host population structure and could be a key force behind coevolutionary cycles.
We discuss how different steps can explain different aspects of the coevolutionary dynamics of the system: the properties of the attachment step, explaining the rapid evolution of infectivity and the properties of later parasite proliferation explaining the evolution of virulence. Our study underlines the importance of resolving the infection process in order to better understand host-parasite interactions.

  5. Understanding Patients' Process to Use Medical Marijuana: A Southern New Jersey Community Engagement Project.

    PubMed

    Crowell, Tara L

    2016-09-01

    Given the necessity of better understanding the process patients must go through to seek treatment via medical marijuana, this study investigates that process. Specifically, Compassion Care Foundation (CCF) and Stockton University worked together to identify a solution to this problem. To that end, 240 new patients at CCF were asked to complete a 1-page survey regarding various aspects of their experience prior to their use of medicinal marijuana: diagnosis, what prompted them to seek treatment, level of satisfaction with specific stages in the process, total length of time the process took, and the patient's level of pain. Results reveal numerous patient diagnoses for which medical marijuana is being prescribed; the 4 most common are intractable skeletal spasticity, chronic and severe pain, multiple sclerosis, and inflammatory bowel disease. Next, results indicate that a little over half of the patients were first prompted to seek alternative treatment by their physicians, while the remaining patients indicated that other sources, such as written information along with friends, relatives, media, and the Internet, persuaded them to seek treatment. These data indicate that a variety of sources play a role in prompting patients to seek alternative treatment, a critical first step in this process. Additional results posit that once patients began the process of qualifying to receive medical marijuana as treatment, the process seemed more positive, even though it takes patients on average almost 6 months to obtain their first treatment after starting the process. Finally, results indicate that patients report a moderately high level of pain prior to treatment. Implications of these results highlight several important elements in patients' initial steps toward seeking medical marijuana, along with the quality and quantity of the process patients must engage in prior to obtaining treatment.
In addition, identifying patients' level of pain and better understanding the possible therapeutic value of medical marijuana are essential to patients and health practitioners.

  6. Designing Flightdeck Procedures

    NASA Technical Reports Server (NTRS)

    Barshi, Immanuel; Mauro, Robert; Degani, Asaf; Loukopoulou, Loukia

    2016-01-01

    The primary goal of this document is to provide guidance on how to design, implement, and evaluate flight deck procedures. It provides a process for developing procedures that meet clear and specific requirements. This document provides a brief overview of: 1) the requirements for procedures, 2) a process for the design of procedures, and 3) a process for the design of checklists. The brief overview is followed by amplified procedures that follow the above steps and provide details for the proper design, implementation and evaluation of good flight deck procedures and checklists.

  7. Computational Phenotyping in Psychiatry: A Worked Example

    PubMed Central

    2016-01-01

    Computational psychiatry is a rapidly emerging field that uses model-based quantities to infer the behavioral and neuronal abnormalities that underlie psychopathology. If successful, this approach promises key insights into (pathological) brain function as well as a more mechanistic and quantitative approach to psychiatric nosology, structuring therapeutic interventions and predicting response and relapse. The basic procedure in computational psychiatry is to build a computational model that formalizes a behavioral or neuronal process. Measured behavioral (or neuronal) responses are then used to infer the model parameters of a single subject or a group of subjects. Here, we provide an illustrative overview of this process, starting from the modeling of choice behavior in a specific task, simulating data, and then inverting that model to estimate group effects. Finally, we illustrate cross-validation to assess whether between-subject variables (e.g., diagnosis) can be recovered successfully. Our worked example uses a simple two-step maze task and a model of choice behavior based on (active) inference and Markov decision processes. The procedural steps and routines we illustrate are not restricted to a specific field of research or particular computational model but can, in principle, be applied in many domains of computational psychiatry. PMID:27517087

  8. Computational Phenotyping in Psychiatry: A Worked Example.

    PubMed

    Schwartenbeck, Philipp; Friston, Karl

    2016-01-01

    Computational psychiatry is a rapidly emerging field that uses model-based quantities to infer the behavioral and neuronal abnormalities that underlie psychopathology. If successful, this approach promises key insights into (pathological) brain function as well as a more mechanistic and quantitative approach to psychiatric nosology, structuring therapeutic interventions and predicting response and relapse. The basic procedure in computational psychiatry is to build a computational model that formalizes a behavioral or neuronal process. Measured behavioral (or neuronal) responses are then used to infer the model parameters of a single subject or a group of subjects. Here, we provide an illustrative overview of this process, starting from the modeling of choice behavior in a specific task, simulating data, and then inverting that model to estimate group effects. Finally, we illustrate cross-validation to assess whether between-subject variables (e.g., diagnosis) can be recovered successfully. Our worked example uses a simple two-step maze task and a model of choice behavior based on (active) inference and Markov decision processes. The procedural steps and routines we illustrate are not restricted to a specific field of research or particular computational model but can, in principle, be applied in many domains of computational psychiatry.
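The simulate-then-invert procedure described in these two records can be illustrated with a much simpler model than the Markov decision process used in the paper. The sketch below, a toy assumption throughout, simulates choices from a softmax model with a subject-specific inverse temperature beta, then recovers beta by grid-search maximum likelihood; the task, parameter values, and grid are invented for illustration.

```python
# Minimal simulate-then-invert sketch: a softmax choice model with inverse
# temperature beta is simulated, then beta is re-estimated by grid-search
# maximum likelihood. Toy stand-in, not the paper's active-inference model.
import math, random

def p_choose_a(value_a, value_b, beta):
    # Softmax (logistic) probability of choosing option A over option B.
    return 1.0 / (1.0 + math.exp(-beta * (value_a - value_b)))

def simulate(trials, beta, rng):
    data = []
    for _ in range(trials):
        va, vb = rng.uniform(-1, 1), rng.uniform(-1, 1)
        chose_a = rng.random() < p_choose_a(va, vb, beta)
        data.append((va, vb, chose_a))
    return data

def fit_beta(data, grid):
    # Model inversion by exhaustive search over candidate beta values.
    def loglik(beta):
        ll = 0.0
        for va, vb, chose_a in data:
            p = p_choose_a(va, vb, beta)
            ll += math.log(p if chose_a else 1.0 - p)
        return ll
    return max(grid, key=loglik)

rng = random.Random(0)
true_beta = 3.0
data = simulate(500, true_beta, rng)
beta_hat = fit_beta(data, [b / 2 for b in range(1, 21)])  # grid 0.5 .. 10.0
```

Cross-validation in the sense of the abstract would repeat this per subject and test whether a group label (e.g. diagnosis) can be predicted from the fitted parameters.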

  9. Development of an Electronic Pediatric All-Cause Harm Measurement Tool Using a Modified Delphi Method.

    PubMed

    Stockwell, David Christopher; Bisarya, Hema; Classen, David C; Kirkendall, Eric S; Lachman, Peter I; Matlow, Anne G; Tham, Eric; Hyman, Dan; Lehman, Samuel M; Searles, Elizabeth; Muething, Stephen E; Sharek, Paul J

    2016-12-01

    To have an impact on reducing harm in pediatric inpatients, an efficient and reliable process for harm detection is needed. This work describes the first step toward the development of a pediatric all-cause harm measurement tool by recognized experts in the field. An international group of leaders in pediatric patient safety and informatics were charged with developing a comprehensive pediatric inpatient all-cause harm measurement tool using a modified Delphi technique. The process was conducted in 5 distinct steps: (1) literature review of triggers (elements from a medical record that assist in identifying patient harm) for inclusion; (2) translation of triggers to likely associated harm, improving the ability for expert prioritization; (3) 2 applications of a modified Delphi selection approach with consensus criteria, using severity and frequency of harm as well as detectability of the associated trigger as criteria to rate each trigger and associated harm; (4) developing specific trigger logic and relevant values when applicable; and (5) final vetting of the entire trigger list for pilot testing. Literature and expert panel review identified 108 triggers and associated harms suitable for consideration (steps 1 and 2). This list was pared to 64 triggers and their associated harms after the first of the 2 independent expert reviews. The second independent expert review led to further refinement of the trigger package, resulting in 46 items for inclusion (step 3). Adding in specific trigger logic expanded the list. Final review and voting resulted in a list of 51 triggers (steps 4 and 5). Application of a modified Delphi method on an expert-constructed list of 108 triggers, focusing on severity and frequency of harms as well as detectability of triggers in an electronic medical record, resulted in a final list of 51 pediatric triggers.
Pilot testing this list of pediatric triggers to identify all-cause harm for pediatric inpatients is the next step to establish the appropriateness of each trigger for inclusion in a global pediatric safety measurement tool.

  10. Catalytic biomass pyrolysis process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dayton, David C.; Gupta, Raghubir P.; Turk, Brian S.

    Described herein are processes for converting a biomass starting material (such as lignocellulosic materials) into a low oxygen containing, stable liquid intermediate that can be refined to make liquid hydrocarbon fuels. More specifically, the process can be a catalytic biomass pyrolysis process wherein an oxygen removing catalyst is employed in the reactor while the biomass is subjected to pyrolysis conditions. The stream exiting the pyrolysis reactor comprises bio-oil having a low oxygen content, and such stream may be subjected to further steps, such as separation and/or condensation to isolate the bio-oil.

  11. Isolated step-down DC -DC converter for electric vehicles

    NASA Astrophysics Data System (ADS)

    Kukovinets, O. V.; Sidorov, K. M.; Yutt, V. E.

    2018-02-01

    The modern motor-vehicle industry is moving rapidly toward the production of electrically driven cars, improving their range and the efficiency of their components, in particular the step-down DC/DC converter that supplies the 12/24 V onboard circuit of an electric vehicle from the high-voltage battery. The purpose of this article is to identify the best circuit topology for designing advanced step-down DC/DC converters with the smallest mass and volume and the highest efficiency and power, which in turn has a positive effect on the driving range of an electric vehicle (EV). On the basis of computational research into existing and implemented circuit topologies of step-down DC/DC converters (series resonant converter, full bridge with phase-shift converter, LLC resonant converter), a comprehensive analysis was carried out on the following characteristics: specific volume, specific weight, power, and efficiency. The data obtained identified the LLC resonant converter as the best technical option. The results can serve as guide material in the design of traction-equipment components for electric vehicles, supporting the best technical solutions in the design and manufacture of converter equipment, self-contained power supply systems, and advanced driver assistance systems.
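A comparison across several characteristics like the one described is, at its core, a multi-criteria scoring exercise. The sketch below normalizes each criterion to [0, 1] and averages with equal weights; every figure is a hypothetical placeholder, not data from the study, and the equal weighting is an assumption.

```python
# Hedged sketch of a multi-criteria topology comparison. All numbers are
# invented placeholders; the study's actual data and weighting are not shown.
criteria = {
    "series resonant":    {"eff": 0.94, "spec_power": 2.0, "spec_volume": 1.2},
    "phase-shift bridge": {"eff": 0.95, "spec_power": 2.2, "spec_volume": 1.1},
    "LLC resonant":       {"eff": 0.97, "spec_power": 2.6, "spec_volume": 0.9},
}
higher_is_better = {"eff": True, "spec_power": True, "spec_volume": False}

def normalized(name, key):
    # Min-max normalize a criterion; invert it when lower values are better.
    vals = [c[key] for c in criteria.values()]
    lo, hi = min(vals), max(vals)
    x = (criteria[name][key] - lo) / (hi - lo)
    return x if higher_is_better[key] else 1.0 - x

def score(name):
    # Equal-weight average of normalized criteria (an assumption).
    return sum(normalized(name, k) for k in higher_is_better) / len(higher_is_better)

best = max(criteria, key=score)
```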

  12. Transcriptional profiling of resistant and susceptible buffalograsses in response to Blissus occiduus (Hemiptera: Blissidae) feeding

    USDA-ARS?s Scientific Manuscript database

    Understanding plant resistance mechanisms at a molecular level would provide valuable insights into the biological pathways impacted by insect feeding, and help explain specific plant tolerance mechanisms. As a first step in this process, we conducted next generation sequencing using RNA extracted f...

  13. Communique: Resources for Practicing Counselors, Vol. 2, No. 8.

    ERIC Educational Resources Information Center

    Walz, Garry R., Ed.

    This issue of Communique, a newsletter providing resource information for practicing counselors, features an article describing two non-verbal group counseling techniques for the elementary school counselor; a description of value clarification including a definition of values, the steps in the value clarification process, and specific value…

  14. Environmental isolation task

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1982-01-01

    The failure-analysis process was organized into a more specific set of long-term degradation steps so that material property change can be differentiated from module damage and module failure. Increasing module performance and life are discussed. A polymeric aging computer model is discussed. Early detection of polymer surface reactions due to aging is reported.

  15. SUMMARY REPORT OF THE MEETING ON DEVELOPMENT OF A METALS ASSESSMENT FRAMEWORK

    EPA Science Inventory

    Under the auspices of the Agency's Science Policy Council, the Agency is embarking on the development of an assessment framework for metals. The first step in the process is formulation of an Action Plan that will identify key scientific issues specific to metals and metal compou...

  16. Sea Stories: A Collaborative Tool for Articulating Tactical Knowledge.

    ERIC Educational Resources Information Center

    Radtke, Paul H.; Frey, Paul R.

    Having subject matter experts (SMEs) identify the skills and knowledge to be taught is among the more difficult and time-consuming steps in the training development process. A procedure has been developed for identifying specific tactical decision-making knowledge requirements and translating SME knowledge into appropriate multimedia…

  17. Integrated Watershed Management to Rehabilitate the Distorted Hydrologic Cycle in a Korean Urban Region

    NASA Astrophysics Data System (ADS)

    Lee, K.; Chung, E.; Park, K.

    2007-12-01

    Many urbanized watersheds suffer from streamflow depletion and poor stream quality, which often negatively affect related factors such as in-stream and near-stream ecologic integrity and water supply. However, any watershed management that does not consider all potential risks is inadequate, since all hydrological components are closely related. This study has therefore developed and applied a ten-step integrated watershed management (IWM) procedure to sustainably rehabilitate hydrologic cycles distorted by urbanization. Step 1 of this procedure is understanding the watershed components and processes. This study proposes not only water quantity/quality monitoring but also continuous water quantity/quality simulation and estimation of annual pollutant loads from the unit loads of all land uses. Step 2 is quantifying the watershed problem as potential flood damage (PFD), potential streamflow depletion (PSD), potential water quality deterioration (PWQD), and a watershed evaluation index (WEI). All indicators are selected based on the sustainability concept of the Pressure-State-Response (PSR) model. All weights are estimated by the Analytic Hierarchy Process (AHP). The four indices are calculated using composite programming, a multicriteria decision-making technique. In Step 3, residents' preferences regarding the management objectives, which consist of flood damage mitigation, prevention of streamflow depletion, and water quality enhancement, are quantified; the WEI can be recalculated using these values. Step 4 requires setting specific goals and objectives based on the results of Steps 2 and 3. Objectives can include spatial flood allocation, instreamflow requirements, and total maximum daily load (TMDL). Steps 5 and 6 are developing all possible alternatives and eliminating the infeasible ones. Step 7 is analyzing the effectiveness of all remaining feasible alternatives.
The criteria for water quantity are presented as the changed low flow (Q275) and drought flow (Q355) of the flow duration curve and the number of days the instreamflow requirement is satisfied. The criteria for water quality are the changed average BOD concentration and total daily loads and the number of days the TMDL is satisfied. Step 8 involves the calculation of the AEI using various MCDM techniques. The indicators of the AEI are obtained from the sustainability concept of Drivers-Pressure-State-Impact-Response (DPSIR), an improved PSR model. All previous results are used in this step. Step 9 is estimating the benefit and cost of alternatives. Discrete willingness to pay (WTP) for specific improvements of current watershed conditions is estimated by the choice experiment method, an economic valuation with stated preference techniques. The WTPs of specific alternatives are calculated by combining the AEI and the choice experiment results; the benefit of an alternative can then be obtained by multiplying the WTP by the total household value of the sub-watershed. Finally, in Step 10, the final alternatives are determined by comparing the net benefit and benefit-cost (BC) ratio. Final alternatives derived from the proposed IWM procedure should not be carried out immediately but should be discussed by stakeholders and decision makers. However, since plans obtained from these elaborate analyses reflect the sustainability concept, the alternatives are comparatively likely to be accepted. This ten-step procedure will be helpful in building a decision support system for sustainable IWM.
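The composite-programming aggregation used in Step 2 can be sketched in one line of arithmetic: normalized indicators z_i in [0, 1] are combined with weights w_i as (sum_i w_i * z_i**p)**(1/p). The weights, indicator values, and exponent p below are invented for illustration; the study's AHP-derived weights are not reproduced here.

```python
# Illustrative composite-programming index in the spirit of Step 2.
# Weights, indicator values, and the exponent p are assumptions.
def composite_index(indicators, weights, p=2.0):
    """Combine normalized indicators z_i with weights w_i as
    (sum_i w_i * z_i**p) ** (1/p)."""
    assert abs(sum(weights) - 1.0) < 1e-9  # weights must sum to one
    return sum(w * z ** p for w, z in zip(weights, indicators)) ** (1.0 / p)

# Hypothetical normalized indicators for one sub-watershed, e.g. feeding
# a potential-flood-damage (PFD) score:
pfd = composite_index([0.8, 0.6, 0.4], [0.5, 0.3, 0.2])
```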

  18. Artificial neural networks to model formulation-property correlations in the process of inline-compounding on an injection moulding machine

    NASA Astrophysics Data System (ADS)

    Moritzer, Elmar; Müller, Ellen; Martin, Yannick; Kleeschulte, Rainer

    2015-05-01

    Today the global market poses great challenges for industrial product development. Complexity, diversity of variants, flexibility and individuality are just some of the features that products have to offer today. In addition, product series have shorter lifetimes. Because of their high capacity for adaptation, polymers are increasingly able to displace traditional materials such as wood, glass and metals from various fields of application. Polymers can only be used to substitute other materials, however, if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important. Integrating the compounding step into the injection moulding process permits a more efficient and faster development process for a new polymer formulation, making it possible to create new product-specific materials. This process is called inline-compounding on an injection moulding machine. The entire process sequence is supported by software from Bayer Technology called Product Design Workbench (PDWB), which provides assistance in all the individual steps from data management, via analysis and model compilation, right through to the optimization of the formulation and the design of experiments. The software is based on artificial neural networks and can model the formulation-property correlations, thus enabling different formulations to be optimized. In this study, the workflow and the modelling with the software are described.
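The core idea of learning a formulation-property correlation from data can be shown with a deliberately minimal stand-in: a single linear neuron trained by gradient descent on toy data in which a property depends on two formulation fractions. The data, ingredient names, learning rate, and model are all assumptions; PDWB's actual neural networks are far richer than this.

```python
# Deliberately minimal stand-in for formulation-property modelling: one
# linear neuron trained by stochastic gradient descent on toy data. The
# "formulation" fractions and the target relation are invented.
def train(samples, lr=0.1, epochs=5000):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = w[0] * x[0] + w[1] * x[1] + b
            err = pred - y
            # Gradient step on squared error for this sample.
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

# Toy "measurements": property = 2*glass_fibre + 0.5*talc + 1.0 (assumed)
data = [((0.1, 0.2), 1.3), ((0.3, 0.1), 1.65),
        ((0.5, 0.4), 2.2), ((0.2, 0.5), 1.65)]
(w, b) = train(data)
```

Once such a model fits the data, formulation optimization amounts to searching the input space for the property target, which is the role the abstract assigns to the software.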

  19. Orbital construction support equipment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Approximately 200 separate construction steps were defined for the three solar power satellite (SPS) concepts. Detailed construction scenarios were developed which describe the specific tasks to be accomplished, and identify general equipment requirements. The scenarios were used to perform a functional analysis, which resulted in the definition of 100 distinct SPS elements. These elements are the components, parts, subsystems, or assemblies upon which construction activities take place. The major SPS elements for each configuration are shown. For those elements, 300 functional requirements were identified in seven generic processes. Cumulatively, these processes encompass all functions required during SPS construction/assembly. Individually each process is defined such that it includes a specific type of activity. Each SPS element may involve activities relating to any or all of the generic processes. The processes are listed, and examples of the requirements defined for a typical element are given.

  20. The integrated simulation and assessment of the impacts of process change in biotherapeutic antibody production.

    PubMed

    Chhatre, Sunil; Jones, Carl; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel; Newcombe, Anthony; Keshavarz-Moore, Eli

    2006-01-01

    Growing commercial pressures in the pharmaceutical industry are establishing a need for robust computer simulations of whole bioprocesses to allow rapid prediction of the effects of changes made to manufacturing operations. This paper presents an integrated process simulation that models the cGMP manufacture of the FDA-approved biotherapeutic CroFab, an IgG fragment used to treat rattlesnake envenomation (Protherics U.K. Limited, Blaenwaun, Ffostrasol, Llandysul, Wales, U.K.). Initially, the product is isolated from ovine serum by precipitation and centrifugation, before enzymatic digestion of the IgG to produce FAB and FC fragments. These are purified by ion exchange and affinity chromatography to remove the FC and non-specific FAB fragments from the final venom-specific FAB product. The model was constructed in a discrete event simulation environment and used to determine the potential impact of a series of changes to the process, such as increasing the step efficiencies or volumes of chromatographic matrices, upon product yields and process times. The study indicated that the overall FAB yield was particularly sensitive to changes in the digestive and affinity chromatographic step efficiencies, which have a predicted 30% greater impact on process FAB yield than do the precipitation or centrifugation stages. The study showed that increasing the volume of affinity matrix has a negligible impact upon total process time. Although results such as these would require experimental verification within the physical constraints of the process and the facility, the model predictions are still useful in allowing rapid "what-if" scenario analysis of the likely impacts of process changes within such an integrated production process.
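The yield bookkeeping underlying such a simulation can be sketched simply: overall yield is the product of per-step efficiencies, so improving any one step scales the whole chain, and the most sensitive steps are those identified in the abstract. The step names follow the abstract, but the efficiency values below are invented placeholders, not CroFab process data.

```python
# Hedged sketch of multiplicative yield through a process chain.
# Step efficiencies are invented placeholders, not the study's data.
base = {"precipitation": 0.90, "centrifugation": 0.95,
        "digestion": 0.80, "ion_exchange": 0.85, "affinity": 0.75}

def overall_yield(steps):
    y = 1.0
    for eff in steps.values():
        y *= eff
    return y

def with_change(steps, name, new_eff):
    # "What-if" scenario: re-run the chain with one step efficiency changed.
    changed = dict(steps)
    changed[name] = new_eff
    return overall_yield(changed)

baseline = overall_yield(base)
improved_affinity = with_change(base, "affinity", 0.85)
```

A discrete event simulation adds timing, resources, and scheduling on top of this arithmetic, which is what allows the abstract's conclusions about process time as well as yield.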

  1. Making three-dimensional echocardiography more tangible: a workflow for three-dimensional printing with echocardiographic data.

    PubMed

    Mashari, Azad; Montealegre-Gallegos, Mario; Knio, Ziyad; Yeh, Lu; Jeganathan, Jelliffe; Matyal, Robina; Khabbaz, Kamal R; Mahmood, Feroze

    2016-12-01

    Three-dimensional (3D) printing is a rapidly evolving technology with several potential applications in the diagnosis and management of cardiac disease. Recently, 3D printing (i.e. rapid prototyping) derived from 3D transesophageal echocardiography (TEE) has become possible. Due to the multiple steps involved and the specific equipment required for each step, it might be difficult to start implementing echocardiography-derived 3D printing in a clinical setting. In this review, we provide an overview of this process, including its logistics and organization of tools and materials, 3D TEE image acquisition strategies, data export, format conversion, segmentation, and printing. Generation of patient-specific models of cardiac anatomy from echocardiographic data is a feasible, practical application of 3D printing technology. © 2016 The authors.

  2. Graphical modeling and query language for hospitals.

    PubMed

    Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris

    2013-01-01

So far there has been little evidence that implementation of health information technologies (HIT) leads to health care cost savings. One likely reason for this lack of impact lies in the complexity of business process ownership in hospitals. The goal of our research is to develop a business model-based method that allows doctors to retrieve ad hoc information directly from various hospital databases. We have developed a domain-specific process modeling language called MedMod. Formally, we define MedMod as a profile on UML class diagrams, but we also demonstrate it with examples in which the semantics of all its elements are explained informally. In addition, we have developed the Process Query Language (PQL), which is based on the MedMod process definition language. The purpose of PQL is to let a doctor query (filter) runtime data of hospital processes described in MedMod. MedMod aims to overcome deficiencies in existing process modeling languages by allowing the loosely defined sequence of steps in a clinical process to be specified. The main advantages of PQL fall into two areas, usability and efficiency: 1) data are viewed through the "glasses" of a familiar process; 2) the simple, easy-to-perceive means of setting filter conditions require no more expertise than using spreadsheet applications; 3) the dynamic response to each step in constructing a query greatly shortens the learning curve and reduces the error rate; and 4) the chosen means of filtering and data retrieval allow queries to execute in O(n) time in the size of the dataset. We plan to continue this project with three further steps. First, we will develop user-friendly graphical editors for the MedMod process modeling and query languages. Second, we will evaluate the usability of the proposed language and tool with physicians from several hospitals in Latvia, working with real data from those hospitals. Third, we will develop an efficient implementation of the query language.
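PQL itself is not specified in this record, but the O(n) single-pass filtering of process runtime data it describes can be sketched generically (the record fields and predicates below are hypothetical illustrations, not PQL syntax):

```python
# Minimal sketch of single-pass (O(n)) filtering of process runtime
# data, in the spirit of the PQL queries described above. The record
# structure and the predicates are hypothetical.

def run_query(records, predicates):
    """Return the records satisfying all predicates; one pass, O(n)."""
    return [r for r in records if all(p(r) for p in predicates)]

# Hypothetical runtime data for a clinical process.
records = [
    {"step": "admission", "duration_min": 35, "ward": "A"},
    {"step": "lab_test",  "duration_min": 90, "ward": "A"},
    {"step": "admission", "duration_min": 20, "ward": "B"},
]

# "Admissions that took longer than 30 minutes."
slow_admissions = run_query(
    records,
    [lambda r: r["step"] == "admission",
     lambda r: r["duration_min"] > 30],
)
print(slow_admissions)  # [{'step': 'admission', 'duration_min': 35, 'ward': 'A'}]
```

Because every predicate is evaluated against each record exactly once, the cost grows linearly with the dataset, matching the O(n) claim above.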

  3. Procedure for minimizing the cost per watt of photovoltaic systems

    NASA Technical Reports Server (NTRS)

    Redfield, D.

    1977-01-01

A general analytic procedure is developed that provides a quantitative method for optimizing any element or process in the fabrication of a photovoltaic energy conversion system by minimizing its impact on the cost per watt of the complete system. By determining the effective value of any power loss associated with each element of the system, this procedure furnishes the design specifications that optimize the cost-performance tradeoffs for each element. A general equation is derived that optimizes the properties of any part of the system in terms of appropriate cost and performance functions, although the power-handling components are found to have a different character from the cell and array steps. Another principal result is that a fractional performance loss occurring at any cell- or array-fabrication step produces that same fractional increase in the cost per watt of the complete array. It also follows that no element or process step can be optimized correctly by considering only its own cost and performance.
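The stated result, that a fractional performance loss at any fabrication step raises cost per watt by the same fraction, follows directly when output power is a product of per-step performance factors, since cost/watt = cost / (P0 · Πηi). A numeric check with hypothetical numbers (not from the paper):

```python
# Numeric illustration of the fractional-loss result. All numbers
# below are hypothetical; the point is only the proportionality.

cost = 1000.0                      # total array cost, $
power_ideal = 500.0                # output with perfect steps, W
efficiencies = [0.98, 0.95, 0.97]  # per-step performance factors

def cost_per_watt(effs):
    power = power_ideal
    for e in effs:
        power *= e
    return cost / power

base = cost_per_watt(efficiencies)

# Degrade any one step by 10%: cost per watt rises by exactly 1/0.9.
degraded = efficiencies.copy()
degraded[1] *= 0.9
assert abs(cost_per_watt(degraded) / base - 1 / 0.9) < 1e-12
```

Because the degraded factor enters the denominator multiplicatively, the same 1/0.9 increase results no matter which step is degraded, which is the paper's point about not optimizing steps in isolation.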

  4. Apparatus and process for freeform fabrication of composite reinforcement preforms

    NASA Technical Reports Server (NTRS)

    Yang, Junsheng (Inventor); Wu, Liangwei (Inventor); Liu, Junhai (Inventor); Jang, Bor Z. (Inventor)

    2001-01-01

    A solid freeform fabrication process and apparatus for making a three-dimensional reinforcement shape. The process comprises the steps of (1) operating a multiple-channel material deposition device for dispensing a liquid adhesive composition and selected reinforcement materials at predetermined proportions onto a work surface; (2) during the material deposition process, moving the deposition device and the work surface relative to each other in an X-Y plane defined by first and second directions and in a Z direction orthogonal to the X-Y plane so that the materials are deposited to form a first layer of the shape; (3) repeating these steps to deposit multiple layers for forming a three-dimensional preform shape; and (4) periodically hardening the adhesive to rigidize individual layers of the preform. These steps are preferably executed under the control of a computer system by taking additional steps of (5) creating a geometry of the shape on the computer with the geometry including a plurality of segments defining the preform shape and each segment being preferably coded with a reinforcement composition defining a specific proportion of different reinforcement materials; (6) generating programmed signals corresponding to each of the segments in a predetermined sequence; and (7) moving the deposition device and the work surface relative to each other in response to these programmed signals. Preferably, the system is also operated to generate a support structure for any un-supported feature of the 3-D preform shape.

  5. Simultaneous Runs of the Bayer VERSANT HIV-1 Version 3.0 and HCV bDNA Version 3.0 Quantitative Assays on the System 340 Platform Provide Reliable Quantitation and Improved Work Flow

    PubMed Central

    Elbeik, Tarek; Markowitz, Norman; Nassos, Patricia; Kumar, Uday; Beringer, Scott; Haller, Barbara; Ng, Valerie

    2004-01-01

    Branched DNA (bDNA) assays to quantify human immunodeficiency virus type 1 (HIV-1) and hepatitis C virus (HCV) consist of three distinct steps, including sample processing, hybridization, and detection, and utilize the System 340 platform for plate incubation and washing. Sample processing differs: HIV-1 from 1 ml of plasma is concentrated by high-speed centrifugation, whereas HCV plasma or serum samples are used without concentration. The first step of hybridization involves viral lysis at 63°C: HIV-1 is performed in a heat block, whereas HCV is performed in System 340. The remaining hybridization and detection steps are similar for HIV-1 and HCV and executed on System 340. In the present study, the HIV-1 bDNA assay was adapted for viral lysis in the System 340 platform. The adaptation, test method 2, includes a 20-s vortex of concentrated viral pellet and lysis working solution, transfer of viral lysate to the 96-well capture plate, and transfer to System 340 programmed for HCV assay specifications. With test method 2, specificity and quantification were within assay specifications. HCV bDNA methodology remains unchanged. Hence, an HIV-1 and an HCV bDNA can be run simultaneously on System 340. With simultaneous testing, laboratories can run full plates, as well as combinations of full and partial plates. Also, simultaneous HIV-1 and HCV bDNA permits labor consolidation and improved workflow while maintaining multitasking and rapid patient result turnaround. PMID:15243070

  6. Simultaneous runs of the Bayer VERSANT HIV-1 version 3.0 and HCV bDNA version 3.0 quantitative assays on the system 340 platform provide reliable quantitation and improved work flow.

    PubMed

    Elbeik, Tarek; Markowitz, Norman; Nassos, Patricia; Kumar, Uday; Beringer, Scott; Haller, Barbara; Ng, Valerie

    2004-07-01

    Branched DNA (bDNA) assays to quantify human immunodeficiency virus type 1 (HIV-1) and hepatitis C virus (HCV) consist of three distinct steps, including sample processing, hybridization, and detection, and utilize the System 340 platform for plate incubation and washing. Sample processing differs: HIV-1 from 1 ml of plasma is concentrated by high-speed centrifugation, whereas HCV plasma or serum samples are used without concentration. The first step of hybridization involves viral lysis at 63 degrees C: HIV-1 is performed in a heat block, whereas HCV is performed in System 340. The remaining hybridization and detection steps are similar for HIV-1 and HCV and executed on System 340. In the present study, the HIV-1 bDNA assay was adapted for viral lysis in the System 340 platform. The adaptation, test method 2, includes a 20-s vortex of concentrated viral pellet and lysis working solution, transfer of viral lysate to the 96-well capture plate, and transfer to System 340 programmed for HCV assay specifications. With test method 2, specificity and quantification were within assay specifications. HCV bDNA methodology remains unchanged. Hence, an HIV-1 and an HCV bDNA can be run simultaneously on System 340. With simultaneous testing, laboratories can run full plates, as well as combinations of full and partial plates. Also, simultaneous HIV-1 and HCV bDNA permits labor consolidation and improved workflow while maintaining multitasking and rapid patient result turnaround.

  7. Downstream reactions and engineering in the microbially reconstituted pathway for Taxol.

    PubMed

    Jiang, Ming; Stephanopoulos, Gregory; Pfeifer, Blaine A

    2012-05-01

    Taxol (a trademarked product of Bristol-Myers Squibb) is a complex isoprenoid natural product which has displayed potent anticancer activity. Originally isolated from the Pacific yew tree (Taxus brevifolia), Taxol has been mass-produced through processes reliant on plant-derived biosynthesis. Recently, there have been alternative efforts to reconstitute the biosynthetic process through technically convenient microbial hosts, which offer unmatched growth kinetics and engineering potential. Such an approach is made challenging by the need to successfully introduce the significantly foreign enzymatic steps responsible for eventual biosynthesis. Doing so, however, offers the potential to engineer more efficient and economical production processes and the opportunity to design and produce tailored analog compounds with enhanced properties. This mini review will specifically focus on heterologous biosynthesis as it applies to Taxol with an emphasis on the challenges associated with introducing and reconstituting the downstream reaction steps needed for final bioactivity.

  8. The automated array assembly task of the low-cost silicon solar array project, phase 2

    NASA Technical Reports Server (NTRS)

    Coleman, M. G.; Pryor, R. A.; Sparks, T. G.; Legge, R.; Saltzman, D. L.

    1980-01-01

    Several specific processing steps as part of a total process sequence for manufacturing silicon solar cells were studied. Ion implantation was identified as the preferred process step for impurity doping. Unanalyzed beam ion implantation was shown to have major cost advantages over analyzed beam implantation. Further, high quality cells were fabricated using a high current unanalyzed beam. Mechanically masked plasma patterning of silicon nitride was shown to be capable of forming fine lines on silicon surfaces with spacings between mask and substrate as great as 250 micrometers. Extensive work was performed on advances in plated metallization. The need for the thick electroless palladium layer was eliminated. Further, copper was successfully utilized as a conductor layer utilizing nickel as a barrier to copper diffusion into the silicon. Plasma etching of silicon for texturing and saw damage removal was shown technically feasible but not cost effective compared to wet chemical etching techniques.

  9. The algorithm for automatic detection of the calibration object

    NASA Astrophysics Data System (ADS)

    Kruglov, Artem; Ugfeld, Irina

    2017-06-01

    The problem of automatic image calibration is considered in this paper. The most challenging task in automatic calibration is proper detection of the calibration object. Solving this problem required digital image processing methods and algorithms such as morphology, filtering, edge detection, and shape approximation. The step-by-step development of the algorithm and its adaptation to the specific conditions of log cuts in the image background is presented. The automatic calibration module was tested under the production conditions of a logging enterprise. In these tests the calibration object was isolated automatically in 86.1% of cases on average, with no type I errors. The algorithm was implemented in the automatic calibration module of mobile software for log deck volume measurement.
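The paper's detection pipeline is not reproduced here, but the general shape of such an algorithm, thresholding followed by keeping the largest connected blob as the candidate calibration object, can be sketched in pure NumPy (a real implementation would likely use OpenCV; the threshold and test image are arbitrary):

```python
# Generic sketch (not the authors' code) of calibration-object
# detection: threshold a grayscale image and return the bounding box
# of the largest bright connected component.
import numpy as np
from collections import deque

def detect_object(gray, thresh=128):
    """Return (rmin, cmin, rmax, cmax) of the largest bright
    4-connected component, or None if no pixel passes the threshold."""
    mask = gray >= thresh
    seen = np.zeros_like(mask, dtype=bool)
    best, best_size = None, 0
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                # Flood fill to collect one connected component.
                comp, q = [], deque([(r, c)])
                seen[r, c] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                           and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) > best_size:
                    best_size = len(comp)
                    ys = [p[0] for p in comp]
                    xs = [p[1] for p in comp]
                    best = (min(ys), min(xs), max(ys), max(xs))
    return best

# Synthetic 8x8 image: a bright 3x3 square plus one noise pixel.
img = np.zeros((8, 8), dtype=np.uint8)
img[2:5, 3:6] = 200
img[7, 0] = 255
print(detect_object(img))  # (2, 3, 4, 5)
```

Keeping only the largest component is a crude stand-in for the morphology and shape-approximation steps the abstract names, which is what suppresses isolated noise pixels here.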

  10. Application of The APA Practice Guidelines on Suicide to Clinical Practice.

    PubMed

    Jacobs, Douglas G; Brewer, Margaret L

    2006-06-01

    This article presents charts from The American Psychiatric Association Practice Guideline for the Assessment and Treatment of Patients with Suicidal Behaviors, part of the Practice Guidelines for the Treatment of Psychiatric Disorders Compendium, and a summary of the assessment information in a format that can be used in routine clinical practice. Four steps in the assessment process are presented: using a thorough psychiatric examination to obtain information about the patient's current presentation, history, and diagnosis, and to recognize the suicide risk factors therein; asking very specific questions about suicidal ideation, intent, plans, and attempts; estimating the patient's level of suicide risk; and using modifiable risk and protective factors as the basis for treatment planning. Case reports are used to clarify the use of each step in this process.

  11. Statistical patterns of visual search for hidden objects

    PubMed Central

    Credidio, Heitor F.; Teixeira, Elisângela N.; Reis, Saulo D. S.; Moreira, André A.; Andrade Jr, José S.

    2012-01-01

    The movement of the eyes has been the subject of intensive research as a way to elucidate inner mechanisms of cognitive processes. A cognitive task that is rather frequent in our daily life is the visual search for hidden objects. Here we investigate through eye-tracking experiments the statistical properties associated with the search of target images embedded in a landscape of distractors. Specifically, our results show that the twofold process of eye movement, composed of sequences of fixations (small steps) intercalated by saccades (longer jumps), displays characteristic statistical signatures. While the saccadic jumps follow a log-normal distribution of distances, which is typical of multiplicative processes, the lengths of the smaller steps in the fixation trajectories are consistent with a power-law distribution. Moreover, the present analysis reveals a clear transition between a directional serial search to an isotropic random movement as the difficulty level of the searching task is increased. PMID:23226829
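The two step-length distributions named here can be illustrated with synthetic samples; all parameters below are hypothetical, chosen only for the sketch, and the exponent estimator is the standard continuous-power-law MLE (Hill estimator), not the study's fitting procedure:

```python
# Synthetic illustration of log-normal saccadic jumps and power-law
# fixation steps. Parameters are arbitrary, not the study's data.
import numpy as np

rng = np.random.default_rng(0)

# Log-normal jumps: log(length) is normal, typical of multiplicative
# processes.
saccades = rng.lognormal(mean=2.0, sigma=0.5, size=100_000)

# Power-law steps with exponent alpha > 1, by inverse transform:
# P(X > x) = (x / xmin) ** (1 - alpha).
alpha, xmin = 2.5, 1.0
u = 1.0 - rng.random(100_000)          # uniform in (0, 1]
fixation_steps = xmin * u ** (-1.0 / (alpha - 1.0))

# Sanity check: sample log-mean/log-std of the jumps near (2.0, 0.5).
logs = np.log(saccades)
print(logs.mean(), logs.std())

# Maximum-likelihood estimate of the power-law exponent.
alpha_hat = 1.0 + len(fixation_steps) / np.sum(np.log(fixation_steps / xmin))
print(alpha_hat)
```

With 100,000 samples both recovered parameters land within a few percent of the inputs, which is the kind of check one would run before fitting real eye-tracking step lengths.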

  12. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
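The claimed flow, characterize each step, compile per-step data, sum it into process metrics, can be sketched schematically (this is not the patented implementation; the field names and the waiting-fraction metric are hypothetical illustrations):

```python
# Schematic sketch of the evaluation flow in the claims: compile
# per-step data from step parameters, then aggregate process metrics.
# Field names and the chosen metric are hypothetical.

def evaluate(steps):
    """Compile per-step data and aggregate overall process metrics."""
    compiled = [
        {"name": s["name"],
         "touch_time": s["cycle_min"] * s["batch_size"],
         "queue_time": s["queue_min"]}
        for s in steps
    ]
    metrics = {
        "total_touch_time": sum(c["touch_time"] for c in compiled),
        "total_queue_time": sum(c["queue_time"] for c in compiled),
    }
    # Simple batch-vs-lean signal: share of elapsed time spent waiting.
    total = metrics["total_touch_time"] + metrics["total_queue_time"]
    metrics["waiting_fraction"] = metrics["total_queue_time"] / total
    return compiled, metrics

steps = [
    {"name": "machining", "cycle_min": 2.0, "batch_size": 50, "queue_min": 240},
    {"name": "assembly",  "cycle_min": 1.5, "batch_size": 50, "queue_min": 120},
]
_, metrics = evaluate(steps)
print(metrics["waiting_fraction"])  # 360 / (175 + 360)
```

A high waiting fraction is the kind of compiled metric that would argue for transitioning from batch to lean production in the method's final decision step.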

  13. Study on Potential Changes in Geological and Disposal Environment Caused by 'Natural Phenomena' on a HLW Disposal System

    NASA Astrophysics Data System (ADS)

    Kawamura, M.; Umeda, K.; Ohi, T.; Ishimaru, T.; Niizato, T.; Yasue, K.; Makino, H.

    2007-12-01

    We have developed a formal evaluation method to assess the potential impact of natural phenomena (earthquakes and faulting; volcanism; uplift, subsidence, denudation and sedimentation; climatic and sea-level changes) on a high-level radioactive waste (HLW) disposal system. In 2000, we developed perturbation scenarios in a generic and conservative sense and illustrated their potential impact on a HLW disposal system. That work highlighted two points for subsequent study: improving the scenarios with respect to realism, transparency, traceability and consistency, and avoiding extreme conservatism. We have therefore developed a new procedure for describing such perturbation scenarios, based on further study of the characteristics of these natural perturbation phenomena in Japan. The perturbation scenarios are developed in five steps. Step 1: description of the potential processes of the phenomena and their impacts on the geological environment. Step 2: characterization of potential changes to the geological environment in terms of T-H-M-C (thermal - hydrological - mechanical - chemical) processes, focusing on the specific T-H-M-C parameters that influence geological barrier performance and using the input from Step 1. Step 3: classification of potential influences based on the similarity of the T-H-M-C perturbations, leading to perturbation scenarios that serve as a basis for consequence analysis. Step 4: establishment of models and parameters for performance assessment. Step 5: calculation and assessment. This study focuses on identifying the key T-H-M-C processes associated with perturbations in Step 2. The framework has two advantages. First, it assures traceability throughout scenario construction, facilitating the production and structuring of suitable records. Second, it provides effective elicitation and organization of information from a wide range of earth-science investigations within a performance assessment context. In this framework, scenario development proceeds in a stepwise manner to ensure clear identification of the impact of the processes associated with these phenomena on a HLW disposal system. The output is organized to create credible scenarios with the required transparency, consistency, traceability and adequate conservatism. In this presentation, the potential impact of natural phenomena from the viewpoint of performance assessment for HLW disposal will be discussed and modeled using this approach.

  14. Phage display and kinetic selection of antibodies that specifically inhibit amyloid self-replication.

    PubMed

    Munke, Anna; Persson, Jonas; Weiffert, Tanja; De Genst, Erwin; Meisl, Georg; Arosio, Paolo; Carnerup, Anna; Dobson, Christopher M; Vendruscolo, Michele; Knowles, Tuomas P J; Linse, Sara

    2017-06-20

    The aggregation of the amyloid β peptide (Aβ) into amyloid fibrils is a defining characteristic of Alzheimer's disease. Because of the complexity of this aggregation process, effective therapeutic inhibitors will need to target the specific microscopic steps that lead to the production of neurotoxic species. We introduce a strategy for generating fibril-specific antibodies that selectively suppress fibril-dependent secondary nucleation of the 42-residue form of Aβ (Aβ42). We target this step because it has been shown to produce the majority of neurotoxic species during aggregation of Aβ42. Starting from large phage display libraries of single-chain antibody fragments (scFvs), the three-stage approach that we describe includes (i) selection of scFvs with high affinity for Aβ42 fibrils after removal of scFvs that bind Aβ42 in its monomeric form; (ii) ranking, by surface plasmon resonance affinity measurements, of the resulting candidate scFvs that bind to the Aβ42 fibrils; and (iii) kinetic screening and analysis to find the scFvs that selectively inhibit the fibril-catalyzed secondary nucleation process in Aβ42 aggregation. By applying this approach, we have identified four scFvs that specifically inhibit the fibril-dependent secondary nucleation process. Our method also makes it possible to discard antibodies that inhibit elongation, an important factor because the suppression of elongation does not target directly the production of toxic oligomers and may even lead to its increase. On the basis of our results, we suggest that the method described here could form the basis for rationally designed immunotherapy strategies to combat Alzheimer's and related neurodegenerative diseases.

  15. Regulative feedback in pattern formation: towards a general relativistic theory of positional information.

    PubMed

    Jaeger, Johannes; Irons, David; Monk, Nick

    2008-10-01

    Positional specification by morphogen gradients is traditionally viewed as a two-step process. A gradient is formed and then interpreted, providing a spatial metric independent of the target tissue, similar to the concept of space in classical mechanics. However, the formation and interpretation of gradients are coupled, dynamic processes. We introduce a conceptual framework for positional specification in which cellular activity feeds back on positional information encoded by gradients, analogous to the feedback between mass-energy distribution and the geometry of space-time in Einstein's general theory of relativity. We discuss how such general relativistic positional information (GRPI) can guide systems-level approaches to pattern formation.

  16. Integrating Mission Type Orders into Operational Level Intelligence Collection

    DTIC Science & Technology

    2011-05-27

    the planning and direction step of the intelligence process and turning them into collection tasks. The companion effort to CRM is COM. COM is the...the differences between MTOs and standard collection processes observing that “an MTO is asking a chef for their best soup, whereas the standard...collection deck is handing the chef a recipe calling for specific ingredients.”17 Theater collection lacks synergy from the perspective that it is

  17. Evaluation of engineering foods for closed Ecological Life Support System (CELSS)

    NASA Technical Reports Server (NTRS)

    Karel, M.

    1982-01-01

    A nutritionally adequate and acceptable diet was evaluated and developed. A design for a multipurpose food plant is discussed. The types and amounts of foods needed to be regenerated in a partially closed ecological life support system (PCELSS) were proposed. All steps of food processes to be utilized in the multipurpose food plant of PCELSS were also considered. Equipment specifications, simplification of the proposed processes, and food waste treatment were analyzed.

  18. Design and grayscale fabrication of beamfanners in a silicon substrate

    NASA Astrophysics Data System (ADS)

    Ellis, Arthur Cecil

    2001-11-01

    This dissertation addresses important first steps in the development of a grayscale fabrication process for multiple-phase diffractive optical elements (DOEs) in silicon. Specifically, the process was developed through the design, fabrication, and testing of 1-2 and 1-4 beamfanner arrays for 5-micron illumination. The 1-2 beamfanner arrays serve as a test of concept and a basic developmental step toward the construction of the 1-4 beamfanners. The beamfanners are 50 microns wide and have features with dimensions between 2 and 10 microns. The Iterative Annular Spectrum Approach (IASA) method, developed by Steve Mellin of UAH, and the Boundary Element Method (BEM) are the design and testing tools used to create the beamfanner profiles and predict their performance. Fabrication of the beamfanners required grayscale photolithography and reactive ion etching (RIE). A 1-4 silicon beamfanner array with 2-3-micron features was fabricated, but the small features and the available contact photolithographic techniques prevented its construction to specification. A second, more successful attempt fabricated both 1-4 and 1-2 beamfanner arrays with a 5-micron minimum feature size. Photolithography for the UAH array was contracted to MEMS-Optical of Huntsville, Alabama. A statistical repeatability study was performed on 14 photoresist arrays and the subsequent RIE processes used to etch the arrays into silicon. The variance in selectivity between the 14 processes was far greater than the variance between individual etched features within each process. Specifically, the ratio of the variance of the selectivities averaged over each of the 14 etch processes to the variance of individual feature selectivities within the processes yielded a significance level below 0.1% by F-test, indicating that good etch-to-etch process repeatability was not attained. One of the 14 arrays had feature etch depths close enough to the design specifications for optical testing, but 5-micron IR illumination of the 1-4 and 1-2 beamfanners yielded no convincing evidence of beam splitting in the detector plane 340 microns from the surface of the beamfanner array.
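The repeatability comparison described here is a between-run versus within-run variance test, the form of a one-way ANOVA F statistic. A sketch with synthetic numbers (not the dissertation's data; a p-value would additionally require the F distribution, e.g. via `scipy.stats.f.sf`):

```python
# One-way ANOVA F statistic: between-group mean square over
# within-group mean square. A large F means run-to-run variation
# dominates feature-to-feature variation, i.e. poor repeatability.
# The selectivity numbers below are synthetic.

def f_statistic(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three hypothetical etch runs, four feature selectivities each; the
# run means differ far more than features within a run.
runs = [
    [1.02, 1.01, 1.03, 1.00],
    [1.21, 1.22, 1.20, 1.23],
    [0.85, 0.84, 0.86, 0.85],
]
F = f_statistic(runs)
print(F)  # a very large F indicates poor run-to-run repeatability
```

Comparing F to the critical value of the F distribution with (k-1, n-k) degrees of freedom gives the significance level quoted in the abstract.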

  19. Color-specific conditioning effects due to both orange and blue stimuli are observed in a Halobacterium salinarum strain devoid of putative methylatable sites on HtrI.

    PubMed

    Lucia, S; Cercignani, G; Frediani, A; Petracchi, D

    2003-01-01

    Behavioral responses of Halobacterium salinarum appear as changes in the frequency of motion reversals. Turning on orange light decreases the reversal frequency, whereas blue light induces reversals. Light pulses normally induce the same response as step-up stimuli. However, anomalous behavioral reactions, including inverse responses, are seen when stimuli are applied in sequence. A prior stimulus conditions the response to subsequent stimulation on a time scale of the same order as that of adaptation processes. These prolonged conditioning effects are color-specific. The only adaptation process identified so far is methylation of the transducers, which could in principle be color-specific. We therefore tested for the behavioral anomalies in a mutant in which all methylation sites on the transducer have been eliminated. The results show that the behavioral anomalies are unaffected by the absence of methylation processes on the transducer.

  20. From framework to action: the DESIRE approach to combat desertification.

    PubMed

    Hessel, R; Reed, M S; Geeson, N; Ritsema, C J; van Lynden, G; Karavitis, C A; Schwilch, G; Jetten, V; Burger, P; van der Werff Ten Bosch, M J; Verzandvoort, S; van den Elsen, E; Witsenburg, K

    2014-11-01

    It has become increasingly clear that desertification can only be tackled through a multi-disciplinary approach that not only involves scientists but also stakeholders. In the DESIRE project such an approach was taken. As a first step, a conceptual framework was developed in which the factors and processes that may lead to land degradation and desertification were described. Many of these factors do not work independently, but can reinforce or weaken one another, and to illustrate these relationships sustainable management and policy feedback loops were included. This conceptual framework can be applied globally, but can also be made site-specific to take into account that each study site has a unique combination of bio-physical, socio-economic and political conditions. Once the conceptual framework was defined, a methodological framework was developed in which the methodological steps taken in the DESIRE approach were listed and their logic and sequence were explained. The last step was to develop a concrete working plan to put the project into action, involving stakeholders throughout the process. This series of steps, in full or in part, offers explicit guidance for other organizations or projects that aim to reduce land degradation and desertification.

  1. An industrial ecology approach to municipal solid waste ...

    EPA Pesticide Factsheets

    Municipal solid waste (MSW) can be viewed as a feedstock for industrial-ecology-inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams to create value-added products is applied to MSW, with examples suggested for various residual streams. A methodology is presented for considering individual waste-to-energy or waste-to-product system synergies and evaluating the economic and environmental issues associated with each system. Its steps include identifying waste streams, the specific waste components of interest, and candidate conversion technologies, as well as determining the economic and environmental effects of using the wastes and the changes due to transport, administrative handling, and processing. In addition to presenting the methodology, technologies for various MSW input streams are categorized as commercialized or demonstrated, providing summarized information to organizations that are considering processes for MSW; an organization can then follow the methodology to analyze processes of interest. The article presents information useful for analyzing the sustainability of alternatives for the management of municipal solid waste.

  2. Cyclin-dependent Kinase 9 Links RNA Polymerase II Transcription to Processing of Ribosomal RNA*

    PubMed Central

    Burger, Kaspar; Mühl, Bastian; Rohrmoser, Michaela; Coordes, Britta; Heidemann, Martin; Kellner, Markus; Gruber-Eber, Anita; Heissmeyer, Vigo; Strässer, Katja; Eick, Dirk

    2013-01-01

    Ribosome biogenesis is a process required for cellular growth and proliferation. Processing of ribosomal RNA (rRNA) is highly sensitive to flavopiridol, a specific inhibitor of cyclin-dependent kinase 9 (Cdk9). Cdk9 has been characterized as the catalytic subunit of the positive transcription elongation factor b (P-TEFb) of RNA polymerase II (RNAPII). Here we studied the connection between RNAPII transcription and rRNA processing. We show that inhibition of RNAPII activity by α-amanitin specifically blocks processing of rRNA. The block is characterized by accumulation of 3′-extended unprocessed 47S rRNA and complete inhibition of the other 47S rRNA-specific processing steps. The transcription rate of rRNA is moderately reduced after inhibition of Cdk9, suggesting that defective 3′ processing of rRNA negatively feeds back on RNAPI transcription. Knockdown of Cdk9 caused a strong reduction of the levels of RNAPII-transcribed U8 small nucleolar RNA, which is essential for 3′ rRNA processing in mammalian cells. Our data demonstrate a pivotal role of Cdk9 activity in coupling RNAPII transcription with small nucleolar RNA production and rRNA processing. PMID:23744076

  3. The Chinese Life-Steps Program: A Cultural Adaptation of a Cognitive-Behavioral Intervention to Enhance HIV Medication Adherence.

    PubMed

    Shiu, Cheng-Shi; Chen, Wei-Ti; Simoni, Jane; Fredriksen-Goldsen, Karen; Zhang, Fujie; Zhou, Hongxin

    2013-05-01

    China is considered to be the new frontier of the global AIDS pandemic. Although effective treatment for HIV is becoming widely available in China, adherence to treatment remains a challenge. This study aimed to adapt an intervention promoting HIV-medication adherence-favorably evaluated in the West-for Chinese HIV-positive patients. The adaptation process was theory-driven and covered several key issues of cultural adaptation. We considered the importance of interpersonal relationships and family in China and cultural notions of health. Using an evidence-based treatment protocol originally designed for Western HIV-positive patients, we developed an 11-step Chinese Life-Steps program with an additional culture-specific intervention option. We describe in detail how the cultural elements were incorporated into the intervention and put into practice at each stage. Clinical considerations are also outlined and followed by two case examples that are provided to illustrate our application of the intervention. Finally, we discuss practical and research issues and limitations emerging from our field experiments in a HIV clinic in Beijing. The intervention was tailored to address both universal and culturally specific barriers to adherence and is readily applicable to generalized clinical settings. This evidence-based intervention provides a case example of the process of adapting behavioral interventions to culturally diverse communities with limited resources.

  4. The Chinese Life-Steps Program: A Cultural Adaptation of a Cognitive-Behavioral Intervention to Enhance HIV Medication Adherence

    PubMed Central

    Shiu, Cheng-Shi; Chen, Wei-Ti; Simoni, Jane; Fredriksen-Goldsen, Karen; Zhang, Fujie; Zhou, Hongxin

    2013-01-01

    China is considered to be the new frontier of the global AIDS pandemic. Although effective treatment for HIV is becoming widely available in China, adherence to treatment remains a challenge. This study aimed to adapt an intervention promoting HIV-medication adherence—favorably evaluated in the West—for Chinese HIV-positive patients. The adaptation process was theory-driven and covered several key issues of cultural adaptation. We considered the importance of interpersonal relationships and family in China and cultural notions of health. Using an evidence-based treatment protocol originally designed for Western HIV-positive patients, we developed an 11-step Chinese Life-Steps program with an additional culture-specific intervention option. We describe in detail how the cultural elements were incorporated into the intervention and put into practice at each stage. Clinical considerations are also outlined and followed by two case examples that are provided to illustrate our application of the intervention. Finally, we discuss practical and research issues and limitations emerging from our field experiments in a HIV clinic in Beijing. The intervention was tailored to address both universal and culturally specific barriers to adherence and is readily applicable to generalized clinical settings. This evidence-based intervention provides a case example of the process of adapting behavioral interventions to culturally diverse communities with limited resources. PMID:23667305

  5. Constructing Singing Assessments for the Music Classroom

    ERIC Educational Resources Information Center

    Nichols, Bryan E.

    2017-01-01

    As teachers are increasingly asked to respond to assessment and evaluation demands in their schools, the aim of this article is to provide suggestions and scales for singing accuracy. A description of each step in the assessment design process is provided, along with specific suggestions for a singing development scale and a singing accuracy scale.

  6. Functionalized graphene hydrogel-based high-performance supercapacitors.

    PubMed

    Xu, Yuxi; Lin, Zhaoyang; Huang, Xiaoqing; Wang, Yang; Huang, Yu; Duan, Xiangfeng

    2013-10-25

    Functionalized graphene hydrogels are prepared by a one-step low-temperature reduction process and exhibit ultrahigh specific capacitances and excellent cycling stability in the aqueous electrolyte. Flexible solid-state supercapacitors based on functionalized graphene hydrogels are demonstrated with superior capacitive performances and extraordinary mechanical flexibility. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Optical method for measuring the surface area of a threaded fastener

    Treesearch

    Douglas Rammer; Samuel Zelinka

    2010-01-01

    This article highlights major aspects of a new optical technique to determine the surface area of a threaded fastener; the theoretical framework has been reported elsewhere. Specifically, this article describes general surface area expressions used in the analysis, details of image acquisition system, and major image processing steps contained within the measurement...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herk, A.; Beggs, T.

    This report outlines the steps a developer can take when creating and implementing high performance standards such as the U.S. Department of Energy’s (DOE’s) Zero Energy Ready Home (ZERH) standards on a community-wide scale. The report also describes the specific examples of how this process is underway in the Stapleton community in Denver, Colorado, by the developer Forest City.

  9. 78 FR 3433 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ... and by educating the public, especially young people, about tobacco products and the dangers their use... identified. When FDA receives tobacco-specific adverse event and product problem information, it will use the... quality problem, or product use error occurs. This risk identification process is the first necessary step...

  10. Letter-Sound Reading: Teaching Preschool Children Print-to-Sound Processing

    ERIC Educational Resources Information Center

    Wolf, Gail Marie

    2016-01-01

    This intervention study investigated the growth of letter sound reading and growth of consonant-vowel-consonant (CVC) word decoding abilities for a representative sample of 41 US children in preschool settings. Specifically, the study evaluated the effectiveness of a 3-step letter-sound teaching intervention in teaching preschool children to…

  11. Interactive Videodisc Design and Production, Workshop Guide. Volume 2

    DTIC Science & Technology

    1983-12-01

    type of answer processing sheet. The details of branching specifications will depend on the authoring system you use. Step 8: Add "roadsigns" to the...script. Roadsigns are the frames put into the script to keep the student oriented. Roadsigns are not part of the instructions themselves but are inserted

  12. Facilitating Lasting Changes at an Elementary School

    ERIC Educational Resources Information Center

    James, Laurie

    2016-01-01

    The purpose of this study was to determine how to minimize waste in a school setting by reducing, reusing, recycling, and composting waste products. Specifically, the desire was to identify what steps could be taken to decrease waste practices at a Title I elementary school. Through the Washington Green Schools certification process, a Waste and…

  13. Kindergarten Transitions: How Pre-K Teachers Can Help. Program Services Paper.

    ERIC Educational Resources Information Center

    Amwake, Lynn

    Noting that the transition to Kindergarten is a process rather than a one-time event, this guide will help early childhood educators--home caregivers, center providers and public preschool teachers--understand the transitions to school and consider steps teachers can take to improve children's transition and adjustment. Specific ideas and…

  14. New Beginnings. A Manual for Facilitating Growth for Displaced Homemakers.

    ERIC Educational Resources Information Center

    Burnside, Nancy; And Others

    This resource manual shares ideas and exercises with those persons interested in and working with the specific concerns of displaced homemakers. Sections correspond to steps in an adjustment process leading to job search. An introduction overviews the manual and provides helpful hints for use by peer counselors, professional therapists, and group…

  15. What to Do When a Bad Teacher Doesn't Get Better.

    ERIC Educational Resources Information Center

    Dennis, Bruce L.

    1990-01-01

    Responsible administrators are obligated to confront poor teacher performance. Guides principals through 12 steps to take in the confrontation process that include the following: gathering information, waiting for a specific incident, developing a file, meeting with the teacher, helping the teacher to improve, and working with the teacher union.…

  16. Synthesis of Platinum-nickel Nanowires and Optimization for Oxygen Reduction Performance.

    PubMed

    Alia, Shaun M; Pivovar, Bryan S

    2018-04-27

    Platinum-nickel (Pt-Ni) nanowires were developed as fuel cell electrocatalysts and optimized for performance and durability in the oxygen reduction reaction. Spontaneous galvanic displacement was used to deposit Pt layers onto Ni nanowire substrates. The synthesis approach produced catalysts with high specific activities and high Pt surface areas. Hydrogen annealing improved Pt and Ni mixing and specific activity. Acid leaching was used to preferentially remove Ni near the nanowire surface, and oxygen annealing was used to stabilize near-surface Ni, improving durability and minimizing Ni dissolution. These protocols detail the optimization of each post-synthesis processing step, including hydrogen annealing to 250 °C, exposure to 0.1 M nitric acid, and oxygen annealing to 175 °C. Through these steps, Pt-Ni nanowires produced activities more than an order of magnitude higher than Pt nanoparticles, while offering significant durability improvements. The presented protocols are based on Pt-Ni systems in the development of fuel cell catalysts. These techniques have also been used for a variety of metal combinations and can be applied to develop catalysts for a number of electrochemical processes.

  17. Hydrothermal synthesis of nanostructured graphene/polyaniline composites as high-capacitance electrode materials for supercapacitors

    PubMed Central

    Wang, Ronghua; Han, Meng; Zhao, Qiannan; Ren, Zonglin; Guo, Xiaolong; Xu, Chaohe; Hu, Ning; Lu, Li

    2017-01-01

    As is well known, hydrothermal synthesis is a powerful technique for preparing inorganic and organic materials or composites with different architectures. In this report, by controlling hydrothermal conditions, nanostructured polyaniline (PANi) with different morphologies was composited with graphene sheets (GNS) and used as an electrode material for supercapacitors. Specifically, ultrathin PANi layers with a total thickness of 10–20 nm are uniformly composited with GNS by a two-step hydrothermal-assisted chemical oxidation polymerization process, while PANi nanofibers with diameters of 50–100 nm are obtained by a one-step direct hydrothermal process. Benefiting from the ultrathin layers and porous structure, the sheet-like GNS/PANi composites deliver specific capacitances of 532.3 to 304.9 F/g at scan rates of 2 to 50 mV/s. This active material also showed very good stability, with capacitance retention as high as ~99.6% at a scan rate of 50 mV/s, indicating great potential for use in supercapacitors. Furthermore, the effects of hydrothermal temperature on the electrochemical performance were systematically studied and discussed. PMID:28291246
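
    The specific capacitances quoted above are obtained by integrating the CV current over the potential window and normalizing by electrode mass and scan rate. A minimal sketch of that standard calculation (the numbers are illustrative, not the paper's data):

```python
def specific_capacitance(voltages, currents, scan_rate, mass):
    """Specific capacitance (F/g) from one CV half-cycle:
    C = (integral of I dV) / (m * nu * dV), via the trapezoidal rule.
    voltages in V, currents in A, scan_rate nu in V/s, mass in g."""
    charge = sum(0.5 * (currents[k] + currents[k + 1])
                 * (voltages[k + 1] - voltages[k])
                 for k in range(len(voltages) - 1))
    window = voltages[-1] - voltages[0]
    return charge / (mass * scan_rate * window)

# Illustrative numbers: a flat 1 mA response over a 0.8 V window
# at 2 mV/s for a 1 mg electrode.
v = [0.0, 0.4, 0.8]
i = [1e-3, 1e-3, 1e-3]
c = specific_capacitance(v, i, scan_rate=2e-3, mass=1e-3)  # 500 F/g
```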

  18. Protein Adsorption onto Nanomaterials for the Development of Biosensors and Analytical Devices: A Review

    PubMed Central

    Bhakta, Samir A.; Evans, Elizabeth; Benavidez, Tomás E.; Garcia, Carlos D.

    2014-01-01

    An important consideration for the development of biosensors is the adsorption of the biorecognition element to the surface of a substrate. As the first step in the immobilization process, adsorption affects most immobilization routes, and much attention is given to researching this process to maximize the overall activity of the biosensor. The use of nanomaterials, specifically nanoparticles and nanostructured films, offers advantageous properties that can be fine-tuned for interaction with specific proteins to maximize activity, minimize structural changes, and enhance the catalytic step. In the biosensor field, protein-nanomaterial interactions are an emerging trend that spans many disciplines. This review addresses recent publications about the proteins most frequently used, their most relevant characteristics, and the conditions required to adsorb them to nanomaterials. When relevant and available, subsequent analytical figures of merit are discussed for selected biosensors. The general trend among the research papers supports the conclusion that the use of nanomaterials has already provided significant improvements in the analytical performance of many biosensors and that this research field will continue to grow. PMID:25892065

  19. Real-time traffic sign detection and recognition

    NASA Astrophysics Data System (ADS)

    Herbschleb, Ernst; de With, Peter H. N.

    2009-01-01

    The continuous growth of imaging databases increasingly requires analysis tools for extraction of features. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images, with resolutions up to 4,800×2,400 pixels. Because of the size of the database, both high reliability and high throughput are required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining both color and specific spatial information. The first stage contains an area-limitation step, which is critical to both the detection rate and the overall processing time. The second stage locates candidate traffic signs using recently published feature processing. The third stage contains a validation step to enhance the reliability of the algorithm; during this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput is 35 Hz for line-of-sight images of 800×600 pixels and 4 Hz for panorama images. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.
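
    The three-stage cascade described above (color-based area limitation, candidate generation, validation) can be sketched as a toy pipeline. The thresholds, the compactness check, and the tiny pixel "image" below are all illustrative assumptions, not the paper's algorithm:

```python
def stage1_area_limit(image):
    """Area limitation: keep only pixels with strong red dominance."""
    return [(x, y) for (x, y), (r, g, b) in image.items()
            if r > 100 and r > 2 * g and r > 2 * b]

def stage2_candidates(pixels, min_size=3):
    """Candidate generation: a dense enough pixel group becomes a candidate."""
    return [pixels] if len(pixels) >= min_size else []

def stage3_validate(candidates):
    """Validation: accept candidates whose bounding box is roughly compact."""
    accepted = []
    for cand in candidates:
        xs = [x for x, _ in cand]
        ys = [y for _, y in cand]
        w, h = max(xs) - min(xs) + 1, max(ys) - min(ys) + 1
        if 0.5 <= w / h <= 2.0:
            accepted.append(cand)
    return accepted

# Toy 4-pixel "image": three reddish pixels and one green outlier.
image = {(0, 0): (200, 10, 10), (1, 0): (210, 20, 15),
         (0, 1): (190, 5, 12), (5, 5): (10, 200, 10)}
signs = stage3_validate(stage2_candidates(stage1_area_limit(image)))
```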

  20. Comparative TEA for Indirect Liquefaction Pathways to Distillate-Range Fuels via Oxygenated Intermediates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Eric; Snowden-Swan, Lesley J.; Talmadge, Michael

    This paper presents a comparative techno-economic analysis of five conversion pathways from biomass to gasoline-, jet-, and diesel-range hydrocarbons via indirect liquefaction, with specific focus on pathways utilizing oxygenated intermediates (derived via either thermochemical or biochemical conversion steps). The four emerging pathways of interest are compared with one conventional pathway (Fischer-Tropsch) for the production of the hydrocarbon blendstocks. The processing steps of the four emerging pathways include: biomass-to-syngas via indirect gasification, gas cleanup, conversion of syngas to alcohols/oxygenates, followed by conversion of alcohols/oxygenates to hydrocarbon blendstocks via dehydration, oligomerization, and hydrogenation. We show that the emerging pathways via oxygenated intermediates have the potential to be cost competitive with the conventional Fischer-Tropsch process. The evaluated pathways and the benchmark process generally exhibit similar fuel yields and carbon conversion efficiencies. The resulting minimum fuel selling prices are comparable to the benchmark at approximately $3.60 per gallon-gasoline equivalent, with potential for two new pathways to be more economically competitive. Additionally, coproduct values can play an important role in the economics of the processes with oxygenated intermediates derived via syngas fermentation. Major cost drivers for the integrated processes are tied to achievable fuel yields and the conversion efficiency of the intermediate steps, i.e., the production of oxygenates/alcohols from syngas and the conversion of oxygenates/alcohols to hydrocarbon fuels.

  1. Insoluble Coatings for Stirling Engine Heat Pipe Condenser Surfaces

    NASA Technical Reports Server (NTRS)

    Dussinger, Peter M.; Lindemuth, James E.

    1997-01-01

    The principal objective of this Phase 2 SBIR program was to develop and demonstrate a practically insoluble coating for nickel-based superalloys for Stirling engine heat pipe applications. Specific technical objectives of the program were: (1) Determine the solubility corrosion rates for Nickel 200, Inconel 718, and Udimet 72OLI in a simulated Stirling engine heat pipe environment, (2) Develop coating processes and techniques for capillary groove and screen wick structures, (3) Evaluate the durability and solubility corrosion rates for capillary groove and screen wick structures coated with an insoluble coating in cylindrical heat pipes operating under Stirling engine conditions, and (4) Design and fabricate a coated full-scale, partial segment of the current Stirling engine heat pipe for the Stirling Space Power Convertor program. The work effort successfully demonstrated a two-step nickel aluminide coating process for groove wick structures and interior wall surfaces in contact with liquid metals; demonstrated a one-step nickel aluminide coating process for nickel screen wick structures; and developed and demonstrated a two-step aluminum-to-nickel aluminide coating process for nickel screen wick structures. In addition, the full-scale, partial segment was fabricated and the interior surfaces and wick structures were coated. The heat pipe was charged with sodium, processed, and scheduled to be life tested for up to ten years as a Phase 3 effort.

  2. A new approach in the design of an interactive environment for teaching Hamiltonian digraphs

    NASA Astrophysics Data System (ADS)

    Iordan, A. E.; Panoiu, M.

    2014-03-01

    In this article the authors present the necessary steps in the object-oriented design of an interactive environment dedicated to knowledge assimilation in the domain of Hamiltonian graph theory, especially the simulation of algorithms that determine Hamiltonian trails and circuits. The modelling of the interactive environment is achieved through specific UML diagrams representing the analysis, design, and implementation steps. This interactive environment is very useful for both students and professors, because the computer programming domain, and digraph theory in particular, is comprehended and assimilated with difficulty by students.

  3. Cell separation and electrofusion in space

    NASA Technical Reports Server (NTRS)

    Morrison, D. R.; Hofmann, G. A.

    1990-01-01

    In microgravity, free-fluid electrophoretic methods for separating living cells and proteins are improved significantly by the absence of gravity-driven phenomena. Cell fusion, culture, and other bioprocessing steps are being investigated to understand the limits of earth-based processing. A multistep space bioprocess is described that includes electrophoretic separation of human target cells, single-cell manipulations using receptor-specific antibodies, electrofusion to produce immortal hybridomas, gentle suspension culture, and monoclonal antibody recovery using continuous-flow electrophoresis or recirculating isoelectric focusing. Improvements in several key steps already have been demonstrated by space experiments, and others will be studied on Space Station Freedom.

  4. Computerization of guidelines: a knowledge specification method to convert text to detailed decision tree for electronic implementation.

    PubMed

    Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles

    2004-01-01

    The initial step in the computerization of guidelines is knowledge specification from the prose text of the guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of the text, allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary recommendation messages. Editing tools are also necessary to facilitate the validation process and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allows quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the text guidelines, however, still needs further development. The method used for computerization could help define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.
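
    A decision tree of the kind described, with internal nodes testing patient data and leaves holding elementary recommendation messages, can be represented compactly. The node layout and the clinical rules below are hypothetical illustrations, not taken from the EsPeR model:

```python
class Node:
    """Guideline decision-tree node: either an internal test or a leaf message."""
    def __init__(self, test=None, yes=None, no=None, message=None):
        self.test, self.yes, self.no, self.message = test, yes, no, message

    def recommend(self, patient):
        if self.message is not None:          # leaf: elementary recommendation
            return self.message
        branch = self.yes if self.test(patient) else self.no
        return branch.recommend(patient)

# Hypothetical mini-guideline encoded as a tree.
tree = Node(
    test=lambda p: p["systolic_bp"] >= 140,
    yes=Node(test=lambda p: p["age"] >= 60,
             yes=Node(message="refer for hypertension work-up"),
             no=Node(message="recheck blood pressure in 1 month")),
    no=Node(message="no action required"),
)
advice = tree.recommend({"systolic_bp": 150, "age": 65})
```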

  5. Reduction of Isoagglutinin in Intravenous Immunoglobulin (IVIG) Using Blood Group A- and B-Specific Immunoaffinity Chromatography: Industry-Scale Assessment.

    PubMed

    Gerber, Simon; Gaida, Annette; Spiegl, Nicole; Wymann, Sandra; Antunes, Adriano Marques; Menyawi, Ibrahim El; Zurbriggen, Brigitte; Hubsch, Alphonse; Imboden, Martin

    2016-10-01

    Hemolysis, a rare but potentially serious complication of intravenous immunoglobulin (IVIG) therapy, is associated with the presence of antibodies to blood groups A and B (isoagglutinins) in the IVIG product. An immunoaffinity chromatography (IAC) step in the production process could decrease isoagglutinin levels in IVIG. Our objectives were to compare isoagglutinin levels in a large number of IVIG (Privigen®) batches produced with or without IAC and to assess the feasibility of the production process with an IAC step on an industrial scale. The IAC column comprised a blend of anti-A and anti-B resins formed by coupling synthetic blood group antigens (A/B-trisaccharides) to a base bead matrix, and was introduced towards the end of the industrial-scale IVIG manufacturing process. Isoagglutinin levels in IVIG were determined by anti-A and anti-B hemagglutinin direct and indirect methods according to the European Pharmacopoeia (Ph. Eur.) and an isoagglutinin flow cytometry assay. IVIG product quality was assessed with respect to the retention of immunoglobulin G (IgG) subclasses, specific antibodies, and removal of IgM using standardized procedures. The IAC step reduced isoagglutinins in IVIG by two to three titer steps compared with lots produced without IAC. The median anti-A and anti-B titers with IAC were 1:8 and 1:4, respectively, when measured by the Ph. Eur. direct method, and 1:2 and <1, respectively, when measured by the Ph. Eur. indirect method. The isoagglutinin flow cytometry assay showed an 87-90 % reduction in isoagglutinins in post-IAC versus pre-IAC fractions. IAC alone reduced anti-A and anti-B of the IgM isotype by 92.5-97.8 % and 95.4-99.2 %, respectively. Other product quality characteristics were similar with and without IAC. IAC is an effective method for reducing isoagglutinin levels in IVIG, and it is feasible on an industrial scale.
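
    Hemagglutinin titers move in two-fold dilution steps, so a reduction of "two to three titer steps" corresponds to a 4- to 8-fold drop in the reciprocal titer. A tiny helper illustrating that arithmetic (the starting titer is an assumed example, not a reported value):

```python
def reduce_titer(reciprocal_titer, steps):
    """One titer step = one two-fold dilution, so the reciprocal
    titer shrinks by a factor of 2 per step."""
    return reciprocal_titer // (2 ** steps)

before = 32                              # an assumed 1:32 anti-A titer
after = reduce_titer(before, steps=2)    # two titer steps lower -> 1:8
```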

  6. Process for remediation of plastic waste

    DOEpatents

    Pol, Vilas G. [Westmont, IL]; Thiyagarajan, Pappannan [Germantown, MD]

    2012-04-10

    A single-step process for degrading plastic waste by converting it into carbonaceous products via thermal decomposition: the plastic waste is placed in a reactor, heated under an inert or air atmosphere until a temperature of 700 °C is reached, the reactor is allowed to cool, and the resulting decomposition products are recovered. The decomposition products this process yields are carbonaceous materials, more specifically egg-shaped and spherical solid carbons. Additionally, in the presence of a transition metal compound, this thermal decomposition process produces multi-walled carbon nanotubes.

  7. IRIS Toxicological Review of Tetrahydrofuran (THF) ...

    EPA Pesticide Factsheets

    EPA is releasing the draft report, Toxicological Review of Tetrahydrofuran, that was distributed to Federal agencies and White House Offices for comment during the Science Discussion step of the IRIS Assessment Development Process. Comments received from other Federal agencies and White House Offices are provided below with external peer review panel comments. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for tetrahydrofuran. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with specific situational exposure assessment information to evaluate potential public health risk associated with environmental contaminants.

  8. RNA Interference: Biology, Mechanism, and Applications

    PubMed Central

    Agrawal, Neema; Dasaradhi, P. V. N.; Mohmmed, Asif; Malhotra, Pawan; Bhatnagar, Raj K.; Mukherjee, Sunil K.

    2003-01-01

    Double-stranded RNA-mediated interference (RNAi) is a simple and rapid method of silencing gene expression in a range of organisms. The silencing of a gene is a consequence of degradation of RNA into short RNAs that activate ribonucleases to target homologous mRNA. The resulting phenotypes either are identical to those of genetic null mutants or resemble an allelic series of mutants. Specific gene silencing has been shown to be related to two ancient processes, cosuppression in plants and quelling in fungi, and has also been associated with regulatory processes such as transposon silencing, antiviral defense mechanisms, gene regulation, and chromosomal modification. Extensive genetic and biochemical analysis revealed a two-step mechanism of RNAi-induced gene silencing. The first step involves degradation of dsRNA into small interfering RNAs (siRNAs), 21 to 25 nucleotides long, by an RNase III-like activity. In the second step, the siRNAs join an RNase complex, RISC (RNA-induced silencing complex), which acts on the cognate mRNA and degrades it. Several key components such as Dicer, RNA-dependent RNA polymerase, helicases, and dsRNA endonucleases have been identified in different organisms for their roles in RNAi. Some of these components also control the development of many organisms by processing many noncoding RNAs, called micro-RNAs. The biogenesis and function of micro-RNAs resemble RNAi activities to a large extent. Recent studies indicate that in the context of RNAi, the genome also undergoes alterations in the form of DNA methylation, heterochromatin formation, and programmed DNA elimination. As a result of these changes, the silencing effect of gene functions is exercised as tightly as possible. Because of its exquisite specificity and efficiency, RNAi is being considered as an important tool not only for functional genomics, but also for gene-specific therapeutic activities that target the mRNAs of disease-related genes. PMID:14665679

  9. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

    The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure the quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented, as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial-scale process, combined with multivariate data analysis (MVDA) of process and PAT data to increase process knowledge; (2) execution of scaled-down designed experiments at pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical Quality Attributes (CQAs), within which product quality is ensured by design and, after scale-up, enabling its use at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in product CQA variability, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Abstract Interpreters for Free

    NASA Astrophysics Data System (ADS)

    Might, Matthew

    In small-step abstract interpretations, the concrete and abstract semantics bear an uncanny resemblance. In this work, we present an analysis-design methodology that both explains and exploits that resemblance. Specifically, we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the "optimal" abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization. To illustrate the method, we systematically abstract the continuation-passing style lambda calculus to arrive at two distinct families of analyses. The first is the well-known k-CFA family of analyses. The second consists of novel "environment-centric" abstract interpretations, none of which appear in the literature on static analysis of higher-order programs.
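
    The soundness requirement that drives the Galois-connection construction, namely that every concrete step must be mirrored by the abstract step, can be illustrated on a much smaller example than the CPS lambda calculus. The sign abstraction below is a standard toy illustration, not the paper's analysis:

```python
def alpha(n):
    """Abstraction map: a concrete integer state to its sign."""
    return "neg" if n < 0 else "zero" if n == 0 else "pos"

def concrete_step(n):
    """Tiny concrete small-step semantics: decrement toward zero."""
    return n - 1 if n > 0 else n

def abstract_step(sign):
    """Sound abstract transition: over-approximates the concrete step."""
    if sign == "pos":
        return {"pos", "zero"}   # n - 1 may stay positive or reach zero
    return {sign}                # zero and neg are fixed points

# Soundness check: alpha(concrete_step(n)) must lie inside
# abstract_step(alpha(n)) for every concrete state we try.
ok = all(alpha(concrete_step(n)) in abstract_step(alpha(n))
         for n in range(-3, 4))
```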

  11. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  12. A semi-automatic method for quantification and classification of erythrocytes infected with malaria parasites in microscopic images.

    PubMed

    Díaz, Gloria; González, Fabio A; Romero, Eduardo

    2009-04-01

    Visual quantification of parasitemia in thin blood films is a very tedious, subjective and time-consuming task. This study presents an original method for quantification and classification of erythrocytes in stained thin blood films infected with Plasmodium falciparum. The proposed approach is composed of three main phases: a preprocessing step, which corrects luminance differences; a segmentation step, which uses the normalized RGB color space to classify pixels as either erythrocyte or background and then structures the pixel information into objects through an Inclusion-Tree representation, from which erythrocytes are found; and, finally, a two-step classification process, which identifies infected erythrocytes and differentiates the infection stage using a trained bank of classifiers. Additionally, user intervention is allowed when the approach cannot make a proper decision. Four hundred fifty malaria images were used for training and evaluating the method. Automatic identification of infected erythrocytes showed a specificity of 99.7% and a sensitivity of 94%. The infection stage was determined with an average sensitivity of 78.8% and an average specificity of 91.2%.
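The normalized-RGB pixel-classification step can be sketched as below. The threshold, labels and example pixel values are hypothetical illustrations of the idea (chromaticity coordinates are insensitive to overall luminance), not the paper's trained classifier bank.

```python
# Sketch of luminance-insensitive pixel labelling in normalized RGB space.
# The red-chromaticity threshold here is an assumed illustrative value.

def normalized_rgb(r, g, b):
    """Convert raw RGB to chromaticity coordinates (each in [0, 1], summing to 1)."""
    s = r + g + b
    if s == 0:
        return (0.0, 0.0, 0.0)   # black pixel: no chromaticity information
    return (r / s, g / s, b / s)

def classify_pixel(r, g, b, red_threshold=0.45):
    """Label a pixel by its normalized red component (threshold is hypothetical)."""
    rn, gn, bn = normalized_rgb(r, g, b)
    return "erythrocyte" if rn > red_threshold else "background"

# A reddish stained-cell pixel vs a near-grey background pixel:
print(classify_pixel(180, 60, 70))    # -> erythrocyte
print(classify_pixel(200, 200, 210))  # -> background
```

Because each coordinate is a proportion of the pixel's total intensity, a dim and a bright pixel of the same hue map to the same chromaticity, which is why such a space helps correct for uneven illumination across the blood-film image.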

  13. Searching for disability in electronic databases of published literature.

    PubMed

    Walsh, Emily S; Peterson, Jana J; Judkins, Dolores Z

    2014-01-01

    As researchers in disability and health conduct systematic reviews with greater frequency, the definition of disability used in these reviews gains importance. Translating a comprehensive conceptual definition of "disability" into an operational definition that utilizes electronic databases in the health sciences is a difficult step necessary for performing systematic literature reviews in the field. Consistency of definition across studies will help build a body of evidence that is comparable and amenable to synthesis. Here we illustrate a process for operationalizing the World Health Organization's International Classification of Functioning, Disability and Health concept of disability for the MEDLINE, PsycINFO, and CINAHL databases. We created an electronic search strategy in conjunction with a reference librarian and an expert panel. Quality control steps included comparison of search results to the results of a search for a specific disabling condition and to articles nominated by the expert panel. The complete search strategy is presented. Results of the quality control steps indicated that our strategy was sufficiently sensitive and specific. Our search strategy will be valuable to researchers conducting literature reviews on broad populations with disabilities. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Facile Synthesis of Novel Networked Ultralong Cobalt Sulfide Nanotubes and Its Application in Supercapacitors.

    PubMed

    Liu, Sangui; Mao, Cuiping; Niu, Yubin; Yi, Fenglian; Hou, Junke; Lu, Shiyu; Jiang, Jian; Xu, Maowen; Li, Changming

    2015-11-25

    Ultralong cobalt sulfide (CoS(1.097)) nanotube networks are synthesized by a simple one-step solvothermal method without any surfactant or template. A possible formation mechanism for the growth process is proposed. Owing to the hollow structure and large specific surface area, the novel CoS(1.097) materials present outstanding electrochemical properties. Electrochemical measurements for supercapacitors show that the as-prepared ultralong CoS(1.097) nanotube networks exhibit high specific capacity, good capacity retention, and excellent Coulombic efficiency.

  15. Framework for enhancing clinical practice guidelines through continuous patient engagement.

    PubMed

    Armstrong, Melissa J; Rueda, Juan-David; Gronseth, Gary S; Mullins, C Daniel

    2017-02-01

    Patient engagement in clinical practice guideline (CPG) development is recommended by multiple institutions and instruments measuring guideline quality. Approaches to engaging patients, however, vary between oversight organizations, quality tools and guideline developers. We propose a ten-step framework outlining steps and options for patient engagement in guideline development, with the goal of highlighting steps for patient engagement and methods by which this can be achieved. This framework provides a model for continuous patient engagement in CPGs by outlining ten steps of guideline development occurring at the levels of the developer/committee and the individual guideline project. At the developer level, patients can assist in topic nomination (step 1), topic prioritization (step 2) and guideline development group selection (step 3). Within specific guideline projects, patients' opinions may be incorporated when framing the question (step 4), creating an analytic framework and research plan (step 5), conducting the systematic review and conclusion formation (step 6), development of recommendations (step 7) and dissemination and implementation (step 8). At the end of the process, patients can again be engaged at the developer level by helping determine when guidelines need updating (step 9) and evaluating the developer's approach to patient engagement (step 10). Patient engagement at each CPG development step has different purposes, mechanisms, advantages and disadvantages, and implications for resource utilization. This framework can serve as a resource for guideline developers desiring to increase patient engagement and a reference for researchers investigating engagement methodology at different steps of the CPG lifecycle. © 2016 The Authors. Health Expectations Published by John Wiley & Sons Ltd.

  16. Structural basis for androgen specificity and oestrogen synthesis in human aromatase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Debashis; Griswold, Jennifer; Erman, Mary

    2009-03-06

    Aromatase cytochrome P450 is the only enzyme in vertebrates known to catalyse the biosynthesis of all oestrogens from androgens. Aromatase inhibitors therefore constitute a frontline therapy for oestrogen-dependent breast cancer. In a three-step process, each step requiring 1 mol of O2, 1 mol of NADPH, and coupling with its redox partner cytochrome P450 reductase, aromatase converts androstenedione, testosterone and 16α-hydroxytestosterone to oestrone, 17β-oestradiol and 17β,16α-oestriol, respectively. The first two steps are C19-methyl hydroxylation steps, and the third involves the aromatization of the steroid A-ring, unique to aromatase. Whereas most P450s are not highly substrate selective, it is the hallmark androgenic specificity that sets aromatase apart. The structure of this enzyme of the endoplasmic reticulum membrane has remained unknown for decades, hindering elucidation of the biochemical mechanism. Here we present the crystal structure of human placental aromatase, the only natural mammalian, full-length P450 and P450 in hormone biosynthetic pathways to be crystallized so far. Unlike the active sites of many microsomal P450s that metabolize drugs and xenobiotics, aromatase has an androgen-specific cleft that binds the androstenedione molecule snugly. Hydrophobic and polar residues exquisitely complement the steroid backbone. The locations of catalytically important residues shed light on the reaction mechanism. The relative juxtaposition of the hydrophobic amino-terminal region and the opening to the catalytic cleft shows why membrane anchoring is necessary for the lipophilic substrates to gain access to the active site. The molecular basis for the enzyme's androgenic specificity and unique catalytic mechanism can be used for developing next-generation aromatase inhibitors.

  17. Developing a workbook to support the contextualisation of global health systems guidance: a case study identifying steps and critical factors for success in this process at WHO.

    PubMed

    Alvarez, Elizabeth; Lavis, John N; Brouwers, Melissa; Schwartz, Lisa

    2018-03-02

    Global guidance can help countries strengthen their health systems to deliver effective interventions to their populations. However, to have an impact, guidance needs to be contextualised or adapted to local settings; this process includes consideration of health system arrangements and political system factors. To date, methods to support contextualisation do not exist. In response, a workbook was designed to provide specific methods and strategies to enable the contextualisation of WHO's 'Optimizing health worker roles to improve maternal and newborn health' (OptimizeMNH) guidance at the national or subnational level. The objective of this study was to describe the process of developing the workbook and identify key steps of the development process, barriers that arose and facilitators that helped overcome some of these barriers. A qualitative single case study design was carried out. Interviews, documents and a reflexive journal were used. Constant comparison and an edit-style of organisation were used during data analysis to develop concepts, themes, subthemes and relationships among them. Thirteen interviews were conducted and 52 documents were reviewed. Three main steps were identified in the process of developing the workbook for health systems guidance contextualisation, namely (1) determining the need for and gaining approval to develop the workbook, (2) developing the workbook (taking on the task, creating the structure of the workbook, operationalising its components, undergoing approval processes and editing it), and (3) implementing the workbook both at the WHO level and at the national/subnational level. Five barriers and/or facilitators emerged relevant to each step, namely (1) having well-placed and credible champions, (2) creating and capitalising on opportunities, (3) finding the right language to engage various actors and obtain buy-in, (4) obtaining and maintaining meaningful buy-in, and (5) ensuring access to resources. Understanding the key steps and the critical factors involved in the process of developing the workbook could help in the planning of similar and other tools aimed to support the implementation of WHO guidance. A plan for dissemination and implementation needs to be addressed during the preparation of these tools.

  18. Resolving the infection process reveals striking differences in the contribution of environment, genetics and phylogeny to host-parasite interactions

    PubMed Central

    2011-01-01

    Background: Infection processes consist of a sequence of steps, each critical for the interaction between host and parasite. Studies of host-parasite interactions rarely take into account the fact that different steps might be influenced by different factors and might, therefore, make different contributions to shaping coevolution. We designed a new method using the Daphnia magna - Pasteuria ramosa system, one of the rare examples where coevolution has been documented, in order to resolve the steps of the infection and analyse the factors that influence each of them. Results: Using the transparent Daphnia hosts and fluorescently-labelled spores of the bacterium P. ramosa, we identified a sequence of infection steps: encounter between parasite and host; activation of parasite dormant spores; attachment of spores to the host; and parasite proliferation inside the host. The chances of encounter had been shown to depend on host genotype and environment. We tested the role of genetic and environmental factors in the newly described activation and attachment steps. Hosts of different genotypes, gender and species were all able to activate endospores of all parasite clones tested in different environments, suggesting that the activation cue is phylogenetically conserved. We next established that parasite attachment occurs onto the host oesophagus independently of host species, gender and environmental conditions. In contrast to spore activation, attachment depended strongly on the combination of host and parasite genotypes. Conclusions: Our results show that different steps are influenced by different factors. Host-type-independent spore activation suggests that this step can be ruled out as a major factor in Daphnia-Pasteuria coevolution. On the other hand, we show that the attachment step is crucial for the pronounced genetic specificities of this system. We suggest that this one step can explain host population structure and could be a key force behind coevolutionary cycles. We discuss how different steps can explain different aspects of the coevolutionary dynamics of the system: the properties of the attachment step explaining the rapid evolution of infectivity, and the properties of later parasite proliferation explaining the evolution of virulence. Our study underlines the importance of resolving the infection process in order to better understand host-parasite interactions. PMID:21342515

  19. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool combines existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision-maker preferences. The tool's user-friendly interface guides the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to improve the trade-off between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision-making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool, such as built-in default criteria, explicit decision steps, and flexibility in choosing different options, were key features that benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to the selection process for the best locations of the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multicriteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
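The Weighted Linear Combination overlay at the core of such a tool can be sketched as a weighted sum over standardized criterion scores. The criteria names, weights and site scores below are hypothetical illustrations, not the tool's built-in defaults.

```python
# Minimal WLC sketch: each candidate site has criterion scores already
# standardized to [0, 1]; suitability is their weighted sum, with weights
# (e.g. derived from an AHP pairwise comparison) summing to 1.

def wlc_suitability(scores, weights):
    """Weighted Linear Combination of standardized criterion scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical criteria and weights for a MAR site-selection example:
weights = {"aquifer_thickness": 0.5, "soil_infiltration": 0.3, "land_slope": 0.2}
site_a = {"aquifer_thickness": 0.8, "soil_infiltration": 0.6, "land_slope": 0.9}
site_b = {"aquifer_thickness": 0.4, "soil_infiltration": 0.9, "land_slope": 0.5}

print(round(wlc_suitability(site_a, weights), 2))  # 0.76
print(round(wlc_suitability(site_b, weights), 2))  # 0.57
```

In the GIS setting the same sum is evaluated per raster cell rather than per site, after non-compensatory screening has masked out cells that violate any hard constraint; OWA generalizes this by reordering the scores before weighting.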

  20. Bioethics education in clinical settings: theory and practice of the dilemma method of moral case deliberation.

    PubMed

    Stolper, Margreet; Molewijk, Bert; Widdershoven, Guy

    2016-07-22

    Moral Case Deliberation is a specific form of bioethics education fostering professionals' moral competence in order to deal with their moral questions. So far, few studies focus in detail on Moral Case Deliberation methodologies and their didactic principles. The dilemma method is a structured and frequently used method in Moral Case Deliberation that stimulates methodological reflection and reasoning through a systematic dialogue on an ethical issue experienced in practice. In this paper we present a case study of a Moral Case Deliberation with the dilemma method in a health care institution for people with an intellectual disability, describing the theoretical background and the practical application of the dilemma method. The dilemma method focuses on moral experiences of participants concerning a concrete dilemma in practice. By an in-depth description of each of the steps of the deliberation process, we elucidate the educational value and didactics of this specific method. The didactics and methodical steps of the dilemma method both supported and structured the dialogical reflection process of the participants. The process shows that the participants learned to recognize the moral dimension of the issue at stake and were able to distinguish various perspectives and reasons in a systematic manner. The facilitator played an important role in the learning process of the participants, by assisting them in focusing on and exploring moral aspects of the case. The reflection and learning process, experienced by the participants, shows competency-based characteristics. The role of the facilitator is that of a Socratic teacher with specific knowledge and skills, fostering reflection, inquiry and dialogue. The specific didactics of the dilemma method is well suited for teaching bioethics in clinical settings. The dilemma method follows an inductive learning approach through a dialogical moral inquiry in which participants develop not only knowledge but also skills, attitude and character. The role of a trained facilitator and a specific view on teaching and practicing ethics are essential when using the dilemma method in teaching health care professionals how to reflect on their own moral issues in practice.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakel, Allen J.; Conner, Cliff; Quigley, Kevin

    One of the missions of the Reduced Enrichment for Research and Test Reactors (RERTR) program (and now the National Nuclear Security Administration's Material Management and Minimization program) is to facilitate the use of low enriched uranium (LEU) targets for 99Mo production. The conversion from highly enriched uranium (HEU) to LEU targets will require five to six times more uranium to produce an equivalent amount of 99Mo. The work discussed here addresses the technical challenges encountered in the treatment of uranyl nitrate hexahydrate (UNH)/nitric acid solutions remaining after the dissolution of LEU targets. Specifically, the focus of this work is the calcination of the uranium waste from 99Mo production using LEU foil targets and the Modified Cintichem Process. Work with our calciner system showed that high furnace temperature, a large vent tube, and a mechanical shield are beneficial for calciner operation. One- and two-step direct calcination processes were evaluated. The high-temperature one-step process led to contamination of the calciner system. The two-step direct calcination process operated stably and resulted in a relatively large amount of material in the calciner cup. Chemically assisted calcination using peroxide was rejected for further work due to the difficulty in handling the products. Chemically assisted calcination using formic acid was rejected due to unstable operation. Chemically assisted calcination using oxalic acid was recommended, although a better understanding of its chemistry is needed. Overall, this work showed that the two-step direct calcination and the in-cup oxalic acid processes are the best approaches for the treatment of the UNH/nitric acid waste solutions remaining from dissolution of LEU targets for 99Mo production.

  2. Evaluating and selecting an information system, Part 1.

    PubMed

    Neal, T

    1993-01-01

    Initial steps in the process of evaluating and selecting a computerized information system for the pharmacy department are described. The first step in the selection process is to establish a steering committee and a project committee. The steering committee oversees the project, providing policy guidance, making major decisions, and allocating budgeted expenditures. The project committee conducts the departmental needs assessment, identifies system requirements, performs day-to-day functions, evaluates vendor proposals, trains personnel, and implements the system chosen. The second step is the assessment of needs in terms of personnel, workload, physical layout, and operating requirements. The needs assessment should be based on the department's mission statement and strategic plan. The third step is the development of a request for information (RFI) and a request for proposal (RFP). The RFI is a document designed for gathering preliminary information from a wide range of vendors; this general information is used in deciding whether to send the RFP to a given vendor. The RFP requests more detailed information and gives the purchaser's exact specifications for a system; the RFP also includes contractual information. To help ensure project success, many institutions turn to computer consultants for guidance. The initial steps in selecting a computerized pharmacy information system are establishing computerization committees, conducting a needs assessment, and writing an RFI and an RFP. A crucial early decision is whether to seek a consultant's expertise.

  3. Liver Gene Expression Profiles of Rats Treated with Clofibric Acid

    PubMed Central

    Michel, Cécile; Desdouets, Chantal; Sacre-Salem, Béatrice; Gautier, Jean-Charles; Roberts, Ruth; Boitier, Eric

    2003-01-01

    Clofibric acid (CLO) is a peroxisome proliferator (PP) that acts through the peroxisome proliferator activated receptor α, leading to hepatocarcinogenesis in rodents. CLO-induced hepatocarcinogenesis is a multi-step process, first transforming normal liver cells into foci. The combination of laser capture microdissection (LCM) and genomics has the potential to provide expression profiles from such small cell clusters, giving an opportunity to understand the process of cancer development in response to PPs. To our knowledge, this is the first evaluation of the impact of the successive steps of LCM procedure on gene expression profiling by comparing profiles from LCM samples to those obtained with non-microdissected liver samples collected after a 1 month CLO treatment in the rat. We showed that hematoxylin and eosin (H&E) staining and laser microdissection itself do not impact on RNA quality. However, the overall process of the LCM procedure affects the RNA quality, resulting in a bias in the gene profiles. Nonetheless, this bias did not prevent accurate determination of a CLO-specific molecular signature. Thus, gene-profiling analysis of microdissected foci, identified by H&E staining may provide insight into the mechanisms underlying non-genotoxic hepatocarcinogenesis in the rat by allowing identification of specific genes that are regulated by CLO in early pre-neoplastic foci. PMID:14633594

  4. Liver gene expression profiles of rats treated with clofibric acid: comparison of whole liver and laser capture microdissected liver.

    PubMed

    Michel, Cécile; Desdouets, Chantal; Sacre-Salem, Béatrice; Gautier, Jean-Charles; Roberts, Ruth; Boitier, Eric

    2003-12-01

    Clofibric acid (CLO) is a peroxisome proliferator (PP) that acts through the peroxisome proliferator activated receptor alpha, leading to hepatocarcinogenesis in rodents. CLO-induced hepatocarcinogenesis is a multi-step process, first transforming normal liver cells into foci. The combination of laser capture microdissection (LCM) and genomics has the potential to provide expression profiles from such small cell clusters, giving an opportunity to understand the process of cancer development in response to PPs. To our knowledge, this is the first evaluation of the impact of the successive steps of LCM procedure on gene expression profiling by comparing profiles from LCM samples to those obtained with non-microdissected liver samples collected after a 1 month CLO treatment in the rat. We showed that hematoxylin and eosin (H&E) staining and laser microdissection itself do not impact on RNA quality. However, the overall process of the LCM procedure affects the RNA quality, resulting in a bias in the gene profiles. Nonetheless, this bias did not prevent accurate determination of a CLO-specific molecular signature. Thus, gene-profiling analysis of microdissected foci, identified by H&E staining may provide insight into the mechanisms underlying non-genotoxic hepatocarcinogenesis in the rat by allowing identification of specific genes that are regulated by CLO in early pre-neoplastic foci.

  5. Preparation of clinical-grade 89Zr-panitumumab as a positron emission tomography biomarker for evaluating epidermal growth factor receptor-targeted therapy

    PubMed Central

    Wei, Ling; Shi, Jianfeng; Afari, George; Bhattacharyya, Sibaprasad

    2014-01-01

    Panitumumab is a fully human monoclonal antibody approved for the treatment of epidermal growth factor receptor (EGFR) positive colorectal cancer. Recently, panitumumab has been radiolabeled with 89Zr and evaluated for its potential to be used as immuno-positron emission tomography (PET) probe for EGFR positive cancers. Interesting preclinical results published by several groups of researchers have prompted us to develop a robust procedure for producing clinical-grade 89Zr-panitumumab as an immuno-PET probe to evaluate EGFR-targeted therapy. In this process, clinical-grade panitumumab is bio-conjugated with desferrioxamine chelate and subsequently radiolabeled with 89Zr resulting in high radiochemical yield (>70%, n=3) and purity (>98%, n=3). All quality control (QC) tests were performed according to United States Pharmacopeia specifications. QC tests showed that 89Zr-panitumumab met all specifications for human injection. Herein, we describe a step-by-step method for the facile synthesis and QC tests of 89Zr-panitumumab for medical use. The entire process of bioconjugation, radiolabeling, and all QC tests will take about 5h. Because the synthesis is fully manual, two rapid, in-process QC tests have been introduced to make the procedure robust and error free. PMID:24448743

  6. Solvent recyclability in a multistep direct liquefaction process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hetland, M.D.; Rindt, J.R.

    1995-12-31

    Direct liquefaction research at the Energy & Environmental Research Center (EERC) has, for a number of years, concentrated on developing a direct liquefaction process specifically for low-rank coals (LRCs) through the use of hydrogen-donating solvents and solvents similar to coal-derived liquids, the water/gas shift reaction, and lower-severity reaction conditions. The underlying assumption of all of the research was that advantage could be taken of the reactivity and specific qualities of LRCs to produce a tetrahydrofuran (THF)-soluble material that might be easier to upgrade than the soluble residuum produced during direct liquefaction of high-rank coals. A multistep approach was taken to produce the THF-soluble material, consisting of (1) preconversion treatment to prepare the coal for solubilization, (2) solubilization of the coal in the solvent, and (3) polishing to complete solubilization of the remaining material. The product of these three steps can then be upgraded during a traditional hydrotreatment step. The results of the EERC's research indicated that additional studies to develop this process more fully were justified. Two areas were targeted for further research: (1) determination of the recyclability of the solvent used during solubilization and (2) determination of the minimum severity required for hydrotreatment of the liquid product. The current project was funded to investigate these two areas.

  7. Using lean methodology to improve productivity in a hospital oncology pharmacy.

    PubMed

    Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D

    2014-09-01

    Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  8. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Owing to limitations of current satellite architectures and frequent cloud cover, daily availability of high-spatial-resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework for a Geographic Information System (GIS)-based tool is presented for semi-autonomous time series generation. This tool eliminates these difficulties by automating all the steps, enabling users to generate synthetic time series data with ease. First, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Two main frameworks are then created: one to perform all the preprocessing steps on various satellite data, and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or they can be combined to perform both processes in one go. The tool can handle most of the known geodata formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a good interface that provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  9. RECAPDOC - a questionnaire for the documentation of rehabilitation care utilization in individuals with disorders of consciousness in long-term care in Germany: development and pretesting.

    PubMed

    Klingshirn, Hanna; Mittrach, Rene; Braitmayer, Kathrin; Strobl, Ralf; Bender, Andreas; Grill, Eva; Müller, Martin

    2018-05-04

    A multitude of different rehabilitation interventions and other specific health care services are offered to individuals with disorders of consciousness in long-term care settings. To investigate the association of these services with patient-relevant outcomes, a specific instrument to document their utilization is needed. The purpose of this study was to develop such a questionnaire, to be administered to caregivers in epidemiological studies or patient registries in Germany. The RECAPDOC questionnaire was developed in three steps. Step 1 consisted of a systematic literature review and an online expert survey to define the general content. Step 2 was an expert interview to evaluate the preliminary content of the questionnaire. Step 3 was a pretest including cognitive interviews with caregivers. After each step, the results were combined into a new version of the questionnaire. The first version included items on the utilization of medical care, medical aids, and nursing and therapeutic care. The results of the expert interview led to the integration of five new items and the modification of six others. The pretest, in which the questionnaire was rated as feasible and acceptable, led to some minor modifications. The final questionnaire consists of 29 items covering the domains "living situation", "social insurance status", "utilization of home health care", "domestic services", "outpatient health care", "specific diagnostic measures", "adaptive technologies", "medical aids" and "utilization of therapies". The experience of family support and the multidisciplinary collaboration of health professionals are also covered. The developed questionnaire is a first step toward making the situation of patients with disorders of consciousness in long-term care accessible for evaluation in epidemiological studies and patient registries. However, further reliability and validity studies are needed.

  10. Automation of cellular therapy product manufacturing: results of a split validation comparing CD34 selection of peripheral blood stem cell apheresis product with a semi-manual vs. an automatic procedure.

    PubMed

    Hümmer, Christiane; Poppe, Carolin; Bunos, Milica; Stock, Belinda; Wingenfeld, Eva; Huppert, Volker; Stuth, Juliane; Reck, Kristina; Essl, Mike; Seifried, Erhard; Bonig, Halvard

    2016-03-16

    Automation of cell therapy manufacturing promises higher productivity of cell factories, more economical use of highly trained (and costly) manufacturing staff, facilitation of processes requiring manufacturing steps at inconvenient hours, improved consistency of processing steps, and other benefits. One of the most broadly disseminated engineered cell therapy products is immunomagnetically selected CD34+ hematopoietic "stem" cells (HSCs). As the clinical GMP-compliant automated instrument CliniMACS Prodigy is being programmed to perform ever more complex sequential manufacturing steps, we developed a CD34+ selection module for comparison with the standard semi-automatic CD34 "normal scale" selection process on CliniMACS Plus, applicable for 600 × 10⁶ target cells out of 60 × 10⁹ total cells. Three split-validation processings of healthy-donor G-CSF-mobilized apheresis products were performed; feasibility, time consumption and product quality were assessed. All processes proceeded uneventfully. Prodigy runs took about 1 h longer than CliniMACS Plus runs, albeit with markedly less hands-on operator time, making them suitable also for less experienced operators. Recovery of target cells was the same for both technologies. Although impurities, specifically T- and B-cells, were 5 ± 1.6-fold and 4 ± 0.4-fold higher in the Prodigy products (p = ns and p = 0.013 for T- and B-cell depletion, respectively), the T-cell content per kg for a virtual recipient receiving 4 × 10⁶ CD34+ cells/kg was below 10 × 10³/kg even in the worst Prodigy product, and thus more than fivefold below the specification for CD34+ selected mismatched-donor stem cell products. The products' theoretical clinical usability is thus confirmed. This split-validation exercise of a relatively short and simple process exemplifies the potential of automated cell manufacturing. Automation will gain further attractiveness when applied to more complex processes requiring frequent interventions or handling at unfavourable working hours, such as the re-targeting of T-cells.

  11. Renewable energy from corn residues by thermochemical conversion

    NASA Astrophysics Data System (ADS)

    Yu, Fei

    Declining fossil oil reserves, skyrocketing prices, insecure supplies, and environmental pollution are among the many energy problems we face today. It is our conviction that renewable energy is a solution to these problems. The long-term goal of the proposed research is to develop commercially practical technologies to produce energy from renewable resources. The overall objective of my research is to study and develop thermochemical processes for converting bulky, low-energy-density biomass materials into bio-fuels and value-added bio-products. The rationale for the proposed research is that, once such processes are developed, processing facilities can be set up at or near biomass production sites, reducing the costs associated with transporting bulky biomass, which is a key technical barrier to biomass conversion. In my preliminary research, several conversion technologies, including atmospheric pressure liquefaction, high pressure liquefaction, and microwave pyrolysis, were evaluated. Our data indicated that microwave pyrolysis had the potential to become a simple and economically viable biomass conversion technology. Microwave pyrolysis is an innovative process that provides efficient and uniform heating and is robust to the type, size and uniformity of the feedstock, making it suitable for almost any waste material without the need to reduce particle size. The proposed thesis focused on in-depth investigations of microwave pyrolysis of corn residues. My first specific aim was to examine the effects of processing parameters on product yields. The second specific aim was to characterize the products (gases, bio-oils, and solid residues), which was critical to process optimization and product development. Other research tasks included kinetic modeling and a preliminary mass and energy balance. This study demonstrated that microwave pyrolysis could be optimized to produce high-value syngas, liquid fuels and pyrolytic carbons, and, according to the mass and energy balance, has great potential to become a commercial process. A one-step global model and a two-step consecutive-reaction kinetic model offered clues to the key mechanistic steps in the overall pyrolysis of corn residues. These results should have a positive impact on advancing renewable energy technologies and establishing the University's leadership in renewable energy development.
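    The two-step consecutive-reaction scheme mentioned above (biomass to intermediate to products, each step first-order) can be sketched with a simple Euler integration. The rate constants below are arbitrary placeholders for illustration, not values fitted to the corn-residue data.

```python
# Illustrative two-step consecutive-reaction kinetics:
#   d(biomass)/dt = -k1*biomass
#   d(inter)/dt   =  k1*biomass - k2*inter
#   d(prod)/dt    =  k2*inter
# k1, k2 are assumed, not from the study.
k1, k2 = 0.10, 0.05          # first-order rate constants (1/min), assumed
dt, t_end = 0.01, 60.0       # Euler time step and end time (min)

biomass, inter, prod = 1.0, 0.0, 0.0   # normalized mass fractions
t = 0.0
while t < t_end:
    d_b = -k1 * biomass * dt
    d_i = (k1 * biomass - k2 * inter) * dt
    d_p = k2 * inter * dt
    biomass, inter, prod = biomass + d_b, inter + d_i, prod + d_p
    t += dt

# The three fractions sum to 1 throughout: the scheme conserves mass.
print(round(biomass, 3), round(inter, 3), round(prod, 3))
```

    The one-step global model is the special case with a single lumped first-order decay of biomass into products.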

  12. Guideline validation in multiple trauma care through business process modeling.

    PubMed

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery, a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal, logically consistent version represented with a standardized meta-model is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors in seven error categories were detected. The errors were corrected to reach a formally and logically consistent process model. In a second step, the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools that check the content against a meta-model. The meta-model itself could support domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization and workflow management.
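    One kind of formal check such a tool can run against a flowchart-style guideline is reachability analysis: finding steps that can never be reached from the start node. The tiny graph below is hypothetical, purely to illustrate the idea.

```python
# Minimal sketch of a formal-consistency check on a process model:
# detect guideline steps unreachable from the start node via BFS.
# The flowchart graph is invented for illustration.
from collections import deque

flow = {
    "start": ["triage"],
    "triage": ["imaging", "surgery"],
    "imaging": ["surgery"],
    "surgery": [],
    "orphan_step": ["surgery"],   # formal error: no path leads here
}

def unreachable(graph, start):
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(set(graph) - seen)

print(unreachable(flow, "start"))
```

    Checks of this kind (unreachable steps, dangling edges, missing end nodes) are what a meta-model-aware tool can automate during guideline validation.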

  13. [Eight-step structured decision-making process to assign criminal responsibility and seven focal points for describing relationship between psychopathology and offense].

    PubMed

    Okada, Takayuki

    2013-01-01

    The author suggested that it is essential for lawyers and psychiatrists to have a common understanding of their mutual division of roles when determining criminal responsibility (CR) and, for this purpose, proposed an 8-step structured CR decision-making process. The 8 steps are: (1) gathering of information related to mental function and condition; (2) recognition of mental function and condition; (3) psychiatric diagnosis; (4) description of the relationship between psychiatric symptoms or psychopathology and the index offense; (5) focus on the capacities for differentiation between right and wrong and for behavioral control; (6) specification of elements of the cognitive/volitional prong in the legal context; (7) legal evaluation of the degree of the cognitive/volitional prong; and (8) final interpretation of CR as a legal conclusion. The author suggested that the CR decision-making process should proceed not in a step-like pattern from (1) to (2) to (3) to (8), but in a step-like pattern from (1) to (2) to (4) to (5) to (6) to (7) to (8), and that Step (4), rather than the steps after (5), which require the interpretation or application of section 39 of the Penal Code, must be the core of psychiatric expert evidence. When explaining the relationship between the mental disorder and the offense described in Step (4), the Seven Focal Points (7FP) are often used. The author urged basic precautions to prevent the misuse of the 7FP: (a) the priority of each item is not equal, and the relative importance differs from case to case; (b) the items are not mutually exclusive; there may be overlap between them; (c) criminal responsibility shall not be judged simply because one item, or a number of items, is applicable, i.e., the 7FP are not "criteria": for example, the aim is not to decide such things as 'the motive is understandable' or 'the conduct is appropriate', but to describe how psychopathological factors specifically affected the offense in the context of the understandability of the motive or the appropriateness of the conduct; (d) it is essential to evaluate each item from a neutral point of view rather than from only one perspective, for example, looking at the case from the aspects of both comprehensibility and incomprehensibility of the motive, or from the aspects of both oriented, purposeful, organized behavior and disoriented, purposeless, disorganized behavior during the offense; (e) depending on the case, some items do not require any consideration (in some cases fewer than seven items apply); and (f) the 7FP are not exhaustive, and in some cases there should be a focus on points not included among them.

  14. When a Step Is Not a Step! Specificity Analysis of Five Physical Activity Monitors.

    PubMed

    O'Connell, Sandra; ÓLaighin, Gearóid; Quinlan, Leo R

    2017-01-01

    Physical activity is an essential aspect of a healthy lifestyle for both physical and mental health. As step count is one of the most widely used measures for quantifying physical activity, it is important that activity-monitoring devices be both sensitive and specific, recording actual steps taken while disregarding non-stepping body movements. The objective of this study was to assess the specificity of five activity monitors during a variety of prescribed non-stepping activities. Participants wore five activity monitors simultaneously during prescribed activities including deskwork; taking an elevator; taking a bus journey; automobile driving; washing and drying dishes; a functional reaching task; indoor cycling; outdoor cycling; and indoor rowing. Each task was carried out either for a specific duration or over a specific distance. The activity monitors tested were the ActivPAL micro™, NL-2000™ pedometer, Withings Smart Activity Monitor Tracker (Pulse O2)™, Fitbit One™ and Jawbone UP™. Participants were video-recorded while carrying out the prescribed activities, and the false positive step count registered on each activity monitor was obtained and compared to the video. All activity monitors registered a significant number of false positive steps per minute during one or more of the prescribed activities. The Withings™ activity monitor performed best, registering a significant number of false positive steps per minute only during the outdoor cycling activity (P = 0.025). The Jawbone™ registered a significant number of false positive steps during the functional reaching task and while washing and drying dishes, both of which involve arm and hand movement (P < 0.01 for both). The ActivPAL™ registered a significant number of false positive steps during the cycling exercises (P < 0.001 for both). Because false positive steps were registered on the activity monitors during the non-stepping activities, the authors conclude that non-stepping physical activities can result in the false detection of steps. This can negatively affect the quantification of physical activity when step count is the output. The Withings™ activity monitor performed best with regard to specificity during the activities of daily living tested.

  15. Conformational dynamics of abasic DNA upon interactions with AP endonuclease 1 revealed by stopped-flow fluorescence analysis.

    PubMed

    Kanazhevskaya, Lyubov Yu; Koval, Vladimir V; Vorobjev, Yury N; Fedorova, Olga S

    2012-02-14

    Apurinic/apyrimidinic (AP) sites are abundant DNA lesions arising from exposure to UV light, ionizing radiation, alkylating agents, and oxygen radicals. In human cells, AP endonuclease 1 (APE1) recognizes this mutagenic lesion and initiates its repair via a specific incision of the phosphodiester backbone 5' to the AP site. We have investigated the detailed mechanism of APE1 functioning using fluorescently labeled DNA substrates. A fluorescent adenine analogue, 2-aminopurine (2-aPu), was introduced into the DNA substrates adjacent to the abasic site to serve as an on-site reporter of conformational transitions in the DNA during the catalytic cycle. Application of a pre-steady-state stopped-flow technique allowed us to observe, in real time, changes in fluorescence intensity corresponding to different stages of the process. We also monitored the intrinsic Trp fluorescence of the enzyme during its interactions with 2-aPu-containing substrates. Our data reveal the conformational flexibility of the abasic DNA being processed by APE1. Quantitative analysis of the fluorescence traces yielded a minimal kinetic scheme consisting of four steps, together with the corresponding rate constants. The stopped-flow data showed a substantial influence of the 2-aPu base location on the completion of certain reaction steps. Using detailed molecular dynamics simulations of the DNA substrates, we attributed the structural distortions of AP-DNA to the realization of specific binding, effective locking, and incision of the damaged DNA. These findings allowed us to accurately discern the step that corresponds to insertion of specific APE1 amino acid residues into the abasic DNA void during stabilization of the precatalytic complex.

  16. Laser capture microdissection of embryonic cells and preparation of RNA for microarray assays.

    PubMed

    Redmond, Latasha C; Pang, Christopher J; Dumur, Catherine; Haar, Jack L; Lloyd, Joyce A

    2014-01-01

    In order to compare the global gene expression profiles of different embryonic cell types, it is first necessary to isolate the specific cells of interest. The purpose of this chapter is to provide a step-by-step protocol to perform laser capture microdissection (LCM) on embryo samples and obtain sufficient amounts of high-quality RNA for microarray hybridizations. Using the LCM/microarray strategy on mouse embryo samples has some challenges, because the cells of interest are available in limited quantities. The first step in the protocol is to obtain embryonic tissue, and immediately cryoprotect and freeze it in a cryomold containing Optimal Cutting Temperature freezing media (Sakura Finetek), using a dry ice-isopentane bath. The tissue is then cryosectioned, and the microscope slides are processed to fix, stain, and dehydrate the cells. LCM is employed to isolate specific cell types from the slides, identified under the microscope by virtue of their morphology. Detailed protocols are provided for using the currently available ArcturusXT LCM instrument and CapSure(®) LCM Caps, to which the selected cells adhere upon laser capture. To maintain RNA integrity, upon removing a slide from the final processing step, or attaching the first cells on the LCM cap, LCM is completed within 20 min. The cells are then immediately recovered from the LCM cap using a denaturing solution that stabilizes RNA integrity. RNA is prepared using standard methods, modified for working with small samples. To ensure the validity of the microarray data, the quality of the RNA is assessed using the Agilent bioanalyzer. Only RNA that is of sufficient integrity and quantity is used to perform microarray assays. This chapter provides guidance regarding troubleshooting and optimization to obtain high-quality RNA from cells of limited availability, obtained from embryo samples by LCM.

  17. Laser Capture Microdissection of Embryonic Cells and Preparation of RNA for Microarray Assays

    PubMed Central

    Redmond, Latasha C.; Pang, Christopher J.; Dumur, Catherine; Haar, Jack L.; Lloyd, Joyce A.

    2014-01-01

    In order to compare the global gene expression profiles of different embryonic cell types, it is first necessary to isolate the specific cells of interest. The purpose of this chapter is to provide a step-by-step protocol to perform laser capture microdissection (LCM) on embryo samples and obtain sufficient amounts of high-quality RNA for microarray hybridizations. Using the LCM/microarray strategy on mouse embryo samples has some challenges, because the cells of interest are available in limited quantities. The first step in the protocol is to obtain embryonic tissue, and immediately cryoprotect and freeze it in a cryomold containing Optimal Cutting Temperature freezing media (Sakura Finetek), using a dry ice–isopentane bath. The tissue is then cryosectioned, and the microscope slides are processed to fix, stain, and dehydrate the cells. LCM is employed to isolate specific cell types from the slides, identified under the microscope by virtue of their morphology. Detailed protocols are provided for using the currently available ArcturusXT LCM instrument and CapSure® LCM Caps, to which the selected cells adhere upon laser capture. To maintain RNA integrity, upon removing a slide from the final processing step, or attaching the first cells on the LCM cap, LCM is completed within 20 min. The cells are then immediately recovered from the LCM cap using a denaturing solution that stabilizes RNA integrity. RNA is prepared using standard methods, modified for working with small samples. To ensure the validity of the microarray data, the quality of the RNA is assessed using the Agilent bioanalyzer. Only RNA that is of sufficient integrity and quantity is used to perform microarray assays. This chapter provides guidance regarding troubleshooting and optimization to obtain high-quality RNA from cells of limited availability, obtained from embryo samples by LCM. PMID:24318813

  18. Estimating heterotrophic respiration at large scales: Challenges, approaches, and next steps

    DOE PAGES

    Bond-Lamberty, Ben; Epron, Daniel; Harden, Jennifer; ...

    2016-06-27

    Heterotrophic respiration (HR), the aerobic and anaerobic processes mineralizing organic matter, is a key carbon flux but one impossible to measure at scales significantly larger than small experimental plots. This impedes our ability to understand carbon and nutrient cycles, benchmark models, or reliably upscale point measurements. Given that a new generation of highly mechanistic, genomic-specific global models is not imminent, we suggest that a useful step to improve this situation would be the development of "Decomposition Functional Types" (DFTs). Analogous to plant functional types (PFTs), DFTs would abstract and capture important differences in HR metabolism and flux dynamics, allowing modelers and experimentalists to efficiently group and vary these characteristics across space and time. We argue that DFTs should be initially informed by top-down expert opinion, but ultimately developed using bottom-up, data-driven analyses, and provide specific examples of potential dependent and independent variables that could be used. We present an example clustering analysis to show how annual HR can be broken into distinct groups associated with global variability in biotic and abiotic factors, and demonstrate that these groups are distinct from (but complementary to) already-existing PFTs. A similar analysis incorporating observational data could form the basis for future DFTs. Finally, we suggest next steps and critical priorities: collection and synthesis of existing data; more in-depth analyses combining open data with rigorous testing of analytical results; using point measurements and realistic forcing variables to constrain process-based models; and planning by the global modeling community for decoupling decomposition from fixed site data. These are all critical steps to build a foundation for DFTs in global models, thus providing the ecological and climate change communities with robust, scalable estimates of HR.
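    The kind of clustering analysis described above, grouping sites by annual HR into candidate functional types, can be sketched with a tiny pure-Python k-means. The annual-HR values and the choice of k = 2 below are synthetic illustrations, not data from the study.

```python
# Minimal 1-D k-means sketch: group sites into candidate
# "Decomposition Functional Types" by annual HR. All values synthetic.
import random

random.seed(0)
hr = [120, 130, 125, 560, 540, 580, 115, 570]  # annual HR (synthetic), gC m-2 yr-1

def kmeans_1d(xs, k, iters=20):
    centers = random.sample(xs, k)             # random initial centers
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:                           # assign each site to nearest center
            i = min(range(k), key=lambda j: abs(x - centers[j]))
            groups[i].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]   # recompute centers
    return sorted(centers)

print(kmeans_1d(hr, k=2))   # two cluster centers: low-flux vs high-flux sites
```

    A real analysis would cluster on several biotic and abiotic covariates at once, but the grouping logic is the same.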

  19. Estimating heterotrophic respiration at large scales: Challenges, approaches, and next steps

    USGS Publications Warehouse

    Bond-Lamberty, Ben; Epron, Daniel; Harden, Jennifer W.; Harmon, Mark E.; Hoffman, Forrest; Kumar, Jitendra; McGuire, Anthony David; Vargas, Rodrigo

    2016-01-01

    Heterotrophic respiration (HR), the aerobic and anaerobic processes mineralizing organic matter, is a key carbon flux but one impossible to measure at scales significantly larger than small experimental plots. This impedes our ability to understand carbon and nutrient cycles, benchmark models, or reliably upscale point measurements. Given that a new generation of highly mechanistic, genomic-specific global models is not imminent, we suggest that a useful step to improve this situation would be the development of “Decomposition Functional Types” (DFTs). Analogous to plant functional types (PFTs), DFTs would abstract and capture important differences in HR metabolism and flux dynamics, allowing modelers and experimentalists to efficiently group and vary these characteristics across space and time. We argue that DFTs should be initially informed by top-down expert opinion, but ultimately developed using bottom-up, data-driven analyses, and provide specific examples of potential dependent and independent variables that could be used. We present an example clustering analysis to show how annual HR can be broken into distinct groups associated with global variability in biotic and abiotic factors, and demonstrate that these groups are distinct from (but complementary to) already-existing PFTs. A similar analysis incorporating observational data could form the basis for future DFTs. Finally, we suggest next steps and critical priorities: collection and synthesis of existing data; more in-depth analyses combining open data with rigorous testing of analytical results; using point measurements and realistic forcing variables to constrain process-based models; and planning by the global modeling community for decoupling decomposition from fixed site data. These are all critical steps to build a foundation for DFTs in global models, thus providing the ecological and climate change communities with robust, scalable estimates of HR.

  20. One-Step Nucleic Acid Amplification (OSNA): A fast molecular test based on CK19 mRNA concentration for assessment of lymph-nodes metastases in early stage endometrial cancer.

    PubMed

    Fanfani, Francesco; Monterossi, Giorgia; Ghizzoni, Viola; Rossi, Esther D; Dinoi, Giorgia; Inzani, Frediano; Fagotti, Anna; Gueli Alletti, Salvatore; Scarpellini, Francesca; Nero, Camilla; Santoro, Angela; Scambia, Giovanni; Zannoni, Gian F

    2018-01-01

    The aim of the current study was to evaluate the detection rate of micro- and macro-metastases by One-Step Nucleic Acid Amplification (OSNA) compared to frozen section examination with subsequent ultra-staging in early-stage endometrial cancer (EC). From March 2016 to June 2016, data on 40 consecutive FIGO stage I EC patients were prospectively collected in an electronic database. Sentinel lymph node mapping was performed in all patients. All mapped nodes were removed and processed. Sentinel lymph nodes were sectioned, and alternate sections were examined by OSNA and by frozen section analysis, respectively. After frozen section, the residual tissue from each block was processed with step-level sections (one step every 200 µm), including H&E and IHC slides. Sentinel lymph node mapping was successful in 29 patients (72.5%). In the remaining 11 patients (27.5%), systematic pelvic lymphadenectomy was performed. OSNA assay sensitivity and specificity were 87.5% and 100%, respectively. Positive and negative predictive values were 100% and 99%, respectively, with a diagnostic accuracy of 99%. For frozen section examination with subsequent ultra-staging, sensitivity and specificity were 50% and 94.4%, respectively; positive and negative predictive values were 14.3% and 99%, respectively, with an accuracy of 93.6%. In one patient, despite negative OSNA and frozen section analysis of the sentinel node, a macro-metastasis was found in one non-sentinel node. The combination of the OSNA procedure with sentinel lymph node mapping could provide an efficient intra-operative tool for selecting early-stage EC patients for systematic lymphadenectomy.
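    The reported diagnostic metrics all follow from a standard 2×2 confusion table. The sketch below uses hypothetical node counts, chosen only so the arithmetic reproduces the reported OSNA percentages; they are not the study's raw counts.

```python
# Diagnostic metrics from a 2x2 confusion table:
# tp/fp/fn/tn = true/false positives, false/true negatives.
# Counts are illustrative, not the study's data.

def diagnostics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

d = diagnostics(tp=7, fp=0, fn=1, tn=92)   # hypothetical counts
print({k: round(v, 3) for k, v in d.items()})
```

    With these counts, sensitivity is 7/8 = 87.5% and specificity 92/92 = 100%, matching the reported OSNA figures.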

  1. Costume and Music-Specific Dance: A Structure for Experimentation with Process and Technology

    ERIC Educational Resources Information Center

    Brown, Nathan; Dasen, Ann; Trommer-Beardslee, Heather

    2016-01-01

    This article describes how the authors completed a project at Central Michigan University (CMU) with undergraduate theater majors and minors and dance minors as part of the annual mainstage dance concert. Although the concert is predominantly choreographed and designed by CMU faculty, students are engaged in every step of the performance and…

  2. THE DEVELOPMENT OF TRAINING OBJECTIVES.

    ERIC Educational Resources Information Center

    SMITH, ROBERT G., JR.

    A SIX-STEP PROCESS IS DESCRIBED FOR DEFINING JOB-RELEVANT OBJECTIVES FOR THE TRAINING OF MILITARY PERSONNEL. (1) A FORM OF SYSTEM ANALYSIS IS OUTLINED TO PROVIDE THE CONTEXT FOR THE STUDY OF A PARTICULAR MILITARY OCCUPATION SPECIALTY. (2) A TASK INVENTORY IS MADE OF THE MAJOR DUTIES IN THE JOB AND THE MORE SPECIFIC JOB TASKS ASSOCIATED WITH EACH…

  3. An Assessment Program Designed To Improve Communication Instruction through a Competency-Based Core Curriculum.

    ERIC Educational Resources Information Center

    Aitken, Joan E.; Neer, Michael R.

    This paper provides an example procedure used to design and install a program of assessment to improve communication instruction through a competency-based core curriculum at a mid-sized, urban university. The paper models the various steps in the process, and includes specific tests, forms, memos, course description, sources, and procedures which…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Eric C.D.

    This paper presents a comparative techno-economic analysis of four emerging conversion pathways from biomass to gasoline-, jet-, and diesel-range hydrocarbons via indirect liquefaction, with a specific focus on pathways utilizing oxygenated intermediates. The processing steps include: biomass-to-syngas conversion via indirect gasification, gas cleanup, conversion of syngas to alcohols/oxygenates, and conversion of the alcohols/oxygenates to hydrocarbon blendstocks via dehydration, oligomerization, and hydrogenation.

  5. Teaching CAD at the University: Specifically Written or Commercial Software?

    ERIC Educational Resources Information Center

    Garcia, Ramon Rubio; Quiros, Javier Suarez; Santos, Ramon Gallego; Penin, Pedro I. Alvarez

    2007-01-01

    At most universities throughout the world, Computer Aided Design is taught using commercial programs more suitable for business and industry than for teaching. This led us to write our own design program (GIcad), starting from the best-known standards on the market but always avoiding unnecessary commands in the first steps of the learning process.…

  6. A Manual for Proposal Planning and Development within the College of Education.

    ERIC Educational Resources Information Center

    Kussrow, Paul G.

    A guide to writing effective proposals designed to secure funding for specific projects is presented. The need to clearly identify the nature of the project--its design, its anticipated outcomes, and the needs it addresses--is described as the crucial first step in the grantwriting process. Next, the importance of the prospectus…

  7. Reflective Process in Play Therapy: A Practical Model for Supervising Counseling Students

    ERIC Educational Resources Information Center

    Allen, Virginia B.; Folger, Wendy A.; Pehrsson, Dale-Elizabeth

    2007-01-01

    Counselor educators and other supervisors, who work with graduate student counseling interns utilizing Play Therapy, should be educated, grounded, and trained in theory, supervision, and techniques specific to Play Therapy. Unfortunately, this is often not the case. Therefore, a three step model was created to assist those who do not have specific…

  8. A Comparison of Student Performance on Discipline-Specific versus Integrated Exams in a Medical School Course

    ERIC Educational Resources Information Center

    Thompson, Andrew R.; Braun, Mark W.; O'Loughlin, Valerie D.

    2013-01-01

    Curricular reform is a widespread trend among medical schools. Assessing the impact that pedagogical changes have on students is a vital step in the review process. This study examined how a shift from discipline-focused instruction and assessment to integrated instruction and assessment affected student performance in a second-year medical school…

  9. pyGrav, a Python-based program for handling and processing relative gravity data

    NASA Astrophysics Data System (ADS)

    Hector, Basile; Hinderer, Jacques

    2016-06-01

    pyGrav is a Python-based open-source software package dedicated to the complete processing of relative-gravity data. It is particularly suited to time-lapse gravity surveys where high precision is sought. Its purpose is to bind single-task processing codes together in a user-friendly interface for handy and fast treatment of raw gravity data from the many stations of a network. The intuitive object-based implementation makes it easy to integrate additional functions (reading/writing routines, processing schemes, data plots) attached to the appropriate object (a station, a loop, or a survey), which makes pyGrav an evolving tool. Raw data can be corrected for tides and air-pressure effects. The data-selection step features a double table-plot graphical window with either manual or automatic selection according to specific thresholds on data channels (tilts, gravity values, gravity standard deviation, duration of measurements, etc.). Instrumental drifts and gravity residuals are obtained by least-squares analysis of the dataset. This first step yields the simple gravity differences between a reference point and any point of the network. When the network is repeated several times, the software then computes the gravity double differences and their associated errors. The program has been tested on two specific case studies: a large dataset acquired to study water storage changes in a small catchment in West Africa, and a dataset operated and processed by several different users for geothermal studies in northern Alsace, France. In both cases, pyGrav proved to be an efficient and easy-to-use solution for the effective processing of relative-gravity data.
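    The drift-and-differences step can be sketched conceptually as follows. This is not pyGrav's actual API: the function names and all numbers are invented, and a linear drift on repeated base-station readings stands in for the full least-squares network adjustment.

```python
# Conceptual sketch: estimate a linear instrumental drift from repeated
# base-station readings by least squares, then form a drift-corrected
# simple difference for another station. All values are made up.

base_t = [0.0, 2.0, 4.0]          # hours of repeated base occupations
base_g = [100.0, 104.0, 108.0]    # raw readings (uGal): 2 uGal/h drift

def linear_fit(ts, gs):
    """Ordinary least squares for g = intercept + slope * t."""
    n = len(ts)
    mt, mg = sum(ts) / n, sum(gs) / n
    slope = sum((t - mt) * (g - mg) for t, g in zip(ts, gs)) / \
        sum((t - mt) ** 2 for t in ts)
    return mg - slope * mt, slope

g0, drift = linear_fit(base_t, base_g)

# Drift-corrected simple difference of station A relative to the base:
t_A, g_A = 1.0, 152.0
diff_A = (g_A - drift * t_A) - g0
print(round(drift, 3), round(diff_A, 3))
```

    Repeating the whole network at a later date and subtracting the two sets of simple differences gives the double differences mentioned in the abstract.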

  10. Gold glyconanoparticles as new tools in antiadhesive therapy.

    PubMed

    Rojo, Javier; Díaz, Vicente; de la Fuente, Jesús M; Segura, Inmaculada; Barrientos, Africa G; Riese, Hans H; Bernad, Antonio; Penadés, Soledad

    2004-03-05

    Gold glyconanoparticles (GNPs) have been prepared as new multivalent tools that mimic glycosphingolipids on the cell surface. GNPs are highly soluble under physiological conditions, stable against enzymatic degradation and nontoxic. Thereby GNPs open up a novel promising multivalent platform for biological applications. It has recently been demonstrated that specific tumor-associated carbohydrate antigens (glycosphingolipids and glycoproteins) are involved in the initial step of tumor spreading. A mouse melanoma model was selected to test glyconanoparticles as possible inhibitors of experimental lung metastasis. A carbohydrate-carbohydrate interaction is proposed as the first recognition step for this process. Glyconanoparticles presenting lactose (lacto-GNPs) have been used successfully to significantly reduce the progression of experimental metastasis. This result shows for the first time a clear biological effect of lacto-GNPs, demonstrating the potential application of this glyconanotechnology in biological processes.

  11. Diagonal chromatography to study plant protein modifications.

    PubMed

    Walton, Alan; Tsiatsiani, Liana; Jacques, Silke; Stes, Elisabeth; Messens, Joris; Van Breusegem, Frank; Goormachtig, Sofie; Gevaert, Kris

    2016-08-01

    An interesting asset of diagonal chromatography, which we have introduced for contemporary proteome research, is its high versatility concerning proteomic applications. Indeed, the peptide modification or sorting step that is required between consecutive peptide separations can easily be altered and thereby allows for the enrichment of specific, though different types of peptides. Here, we focus on the application of diagonal chromatography for the study of modifications of plant proteins. In particular, we show how diagonal chromatography allows for studying proteins processed by proteases, protein ubiquitination, and the oxidation of protein-bound methionines. We discuss the actual sorting steps needed for each of these applications and the obtained results. This article is part of a Special Issue entitled: Plant Proteomics--a bridge between fundamental processes and crop production, edited by Dr. Hans-Peter Mock. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Open Integrated Personal Learning Environment: Towards a New Conception of the ICT-Based Learning Processes

    NASA Astrophysics Data System (ADS)

    Conde, Miguel Ángel; García-Peñalvo, Francisco José; Casany, Marià José; Alier Forment, Marc

    Learning processes are changing in step with technological and sociological evolution; taking this into account, a new learning strategy must be considered. Specifically, what is needed is an effective step towards the consolidation of eLearning 2.0 environments. This implies fusing the advantages of the traditional LMS (Learning Management System), which is more oriented towards formative program control and planning, with the social learning and flexibility of web 2.0 educational applications.

  13. Frequency-Dependent Characteristics of Regional Seismic Phases: Propagation of Pn in Western China

    DTIC Science & Technology

    2011-01-31

    CLIMB project using instrumentation from different sources, including many stations from the Incorporated Research Institutions for Seismology – Progr...specific terranes have also been identified for later processing. Four of the in-line events have epicenters less than 200 km from the Hi-CLIMB...observed and the calculated data are analyzed using the same processing steps. Figure A1. This shows an example of a data trace from the Hi-CLIMB

  14. Magnetorheological finishing: a perfect solution to nanofinishing requirements

    NASA Astrophysics Data System (ADS)

    Sidpara, Ajay

    2014-09-01

    Finishing of optics for different applications is the most important, as well as the most difficult, step in meeting optical specifications. Conventional grinding or other polishing processes are not able to reduce surface roughness beyond a certain limit due to the high forces acting on the workpiece, embedded abrasive particles, limited control over the process, etc. The magnetorheological finishing (MRF) process provides a new, efficient, and innovative way to finish optical materials, as well as many metals, to their desired level of accuracy. This paper provides an overview of the MRF process for different applications, important process parameters, the requirements on the magnetorheological fluid with respect to the workpiece material, and some areas that need to be explored to extend the application of the MRF process.

  15. Leap Frog and Time Step Sub-Cycle Scheme for Coupled Neutronics and Thermal-Hydraulic Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, S.

    2002-07-01

    As a result of advancing TCP/IP-based inter-process communication technology, more and more legacy thermal-hydraulic codes have been coupled with neutronics codes to provide best-estimate capabilities for reactivity-related reactor transient analysis. Most of the coupling schemes are based on closely coupled serial or parallel approaches. Therefore, the execution of the coupled codes usually requires significant CPU time when a complicated system is analyzed. The Leap Frog scheme has been used to reduce the run time. The extent of the decoupling is usually determined based on a trial-and-error process for a specific analysis. It is the intent of this paper to develop a set of general criteria that can be used to invoke the automatic Leap Frog algorithm. The algorithm will not only provide the run-time reduction but also preserve the accuracy. The criteria will also serve as the basis of an automatic time-step sub-cycle scheme when a sudden reactivity change is introduced and the thermal-hydraulic code is marching with a relatively large time step. (authors)
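    The abstract does not give the authors' actual criteria, so the following Python sketch is only a generic illustration of the idea: a thermal-hydraulic-style outer loop marches with a large time step, and the neutronics update sub-cycles whenever the reactivity change across the step exceeds a threshold. All functions and constants here are assumptions, not the paper's models.

```python
# Illustrative sketch (not the authors' algorithm): the outer loop marches
# with a large thermal-hydraulic step dt_th, while the neutronics update
# sub-cycles with dt_th / n whenever the reactivity change over the step
# exceeds a tolerance -- the kind of criterion the paper seeks to automate.

def reactivity(t):
    # Assumed external reactivity history: a sudden insertion at t = 1.0 s.
    return 0.0 if t < 1.0 else 0.5

def neutronics_step(power, rho, dt, alpha=5.0):
    # Toy kinetics-like update (an assumption, for illustration only).
    return power * (1.0 + alpha * rho * dt)

def coupled_march(t_end=2.0, dt_th=0.25, rho_tol=0.1):
    t, power, history = 0.0, 1.0, []
    while t < t_end - 1e-12:
        drho = abs(reactivity(t + dt_th) - reactivity(t))
        # Sub-cycle criterion: refine the neutronics step on fast transients.
        n_sub = 10 if drho > rho_tol else 1
        dt_n = dt_th / n_sub
        for k in range(n_sub):
            power = neutronics_step(power, reactivity(t + k * dt_n), dt_n)
        t += dt_th
        history.append((round(t, 2), n_sub))
    return power, history

power, history = coupled_march()
# Only the step spanning the reactivity insertion is sub-cycled.
print(history)
```

    A production criterion would of course be based on the coupled codes' own convergence behaviour rather than on a prescribed reactivity history; the sketch only shows where such a criterion plugs into the marching loop.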

  16. Location specific solidification microstructure control in electron beam melting of Ti-6Al-4V

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narra, Sneha P.; Cunningham, Ross; Beuth, Jack

    Relationships between prior beta grain size in solidified Ti-6Al-4V and melting process parameters in the Electron Beam Melting (EBM) process are investigated. Samples are built by varying a machine-dependent proprietary speed function to cover the process space. Optical microscopy is used to measure prior beta grain widths and assess the number of prior beta grains present in a melt pool in the raster region of the build. Despite the complicated evolution of beta grain sizes, the beta grain width scales with melt pool width. The resulting understanding of the relationship between primary machine variables and prior beta grain widths is a key step toward enabling the location specific control of as-built microstructure in the EBM process. Control of grain width in separate specimens and within a single specimen is demonstrated.

  17. Development of biology student worksheets to facilitate science process skills of student

    NASA Astrophysics Data System (ADS)

    Rahayu, Y. S.; Pratiwi, R.; Indana, S.

    2018-01-01

    This research aims to describe the development of Biology student worksheets that facilitate students' science process skills and, at the same time, their thinking skills in senior high school; the worksheets are equipped with assessment sheets. The worksheet development follows a cycle that includes analysis, planning, design, development, implementation, and evaluation-and-revision phases. Evaluation and revision is an ongoing activity conducted in each phase of the development cycle: after the results of the activities in a phase are evaluated and revisions are made, development continues to the next phase. Based on the test results for grades X, XI, and XII in St. Agnes Surabaya high school, some important findings were obtained. (1) The developed worksheets could be used to facilitate students' thinking abilities, in particular integrated process skills that include formulating the problem, formulating hypotheses, determining the study variables, formulating operational definitions of variables, determining the steps in the research, planning data tables, organizing data in the form of tables/charts, and drawing conclusions. (2) The developed worksheets could also facilitate the development of students' social interaction, such as working together, listening to and respecting the opinions of others, assembling equipment and materials, discussing and sharing information, and upgrading hands-on activity skills. (3) The developed worksheets could basically be implemented with step-by-step teacher guidance, especially for students who have never used a similar worksheet. Guidance is needed at the beginning, especially for worksheets that require special skills or an understanding of specific concepts as a prerequisite, such as using a microscope, determining the heart rate, or understanding the mechanism of specific indicators.

  18. Is the size of the useful field of view affected by postural demands associated with standing and stepping?

    PubMed

    Reed-Jones, James G; Reed-Jones, Rebecca J; Hollands, Mark A

    2014-04-30

    The useful field of view (UFOV) is the visual area from which information is obtained at a brief glance. While studies have examined the effects of increased cognitive load on the visual field, no one has specifically looked at the effects of postural control or locomotor activity on the UFOV. The current study aimed to examine the effects of postural demand and locomotor activity on UFOV performance in healthy young adults. Eleven participants were tested on three modified UFOV tasks (central processing, peripheral processing, and divided-attention) while seated, standing, and stepping in place. Across all postural conditions, participants showed no difference in their central or peripheral processing. However, in the divided-attention task (reporting the letter in central vision and the target location in peripheral vision amongst distracter items), a main effect of posture condition on peripheral target accuracy was found for targets at 57° of eccentricity (p=.037). The mean accuracy decreased from 80.5% (standing) to 74% (seated) to 56.3% (stepping). These findings show that postural demands do affect UFOV divided-attention performance. In particular, the size of the useful field of view significantly decreases when stepping. This finding has important implications for how the results of a UFOV test are used to evaluate the general size of the UFOV during varying activities, as the traditional seated test procedure may overestimate the size of the UFOV during locomotor activities. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. [The future of intensive medicine].

    PubMed

    Palencia Herrejón, E; González Díaz, G; Mancebo Cortés, J

    2011-05-01

    Although Intensive Care Medicine is a young specialty compared with other medical disciplines, it currently plays a key role in the process of care for many patients. Experience has shown that professionals with specific training in Intensive Care Medicine are needed to provide high quality care to critically ill patients. In Europe, important steps have been taken towards the standardization of training programs of the different member states. However, it is now necessary to take one more step forward, that is, the creation of a primary specialty in Intensive Care Medicine. Care of the critically ill needs to be led by specialists who have received specific and complete training and who have the necessary professional competences to provide maximum quality care to their patients. The future of the specialty presents challenges that must be faced with determination, with the main objective of meeting the needs of the population. Copyright © 2011 Elsevier España, S.L. y SEMICYUC. All rights reserved.

  20. A two-step process for epigenetic inheritance in Arabidopsis

    PubMed Central

    Blevins, Todd; Pontvianne, Frédéric; Cocklin, Ross; Podicheti, Ram; Chandrasekhara, Chinmayi; Yerneni, Satwica; Braun, Chris; Lee, Brandon; Rusch, Doug; Mockaitis, Keithanne; Tang, Haixu; Pikaard, Craig S.

    2014-01-01

    Summary In Arabidopsis, multisubunit RNA polymerases IV and V orchestrate RNA-directed DNA methylation (RdDM) and transcriptional silencing, but what identifies the loci to be silenced is unclear. We show that heritable silent locus identity at a specific subset of RdDM targets requires HISTONE DEACETYLASE 6 (HDA6) acting upstream of Pol IV recruitment and siRNA biogenesis. At these loci, epigenetic memory conferring silent locus identity is erased in hda6 mutants such that restoration of HDA6 activity cannot restore siRNA biogenesis or silencing. Silent locus identity is similarly lost in mutants for the cytosine maintenance methyltransferase, MET1. By contrast, pol IV or pol V mutants disrupt silencing without erasing silent locus identity, allowing restoration of Pol IV or Pol V function to restore silencing. Collectively, these observations indicate that silent locus specification and silencing are separable steps that together account for epigenetic inheritance of the silenced state. PMID:24657166

  1. Directed Synthesis of Nanoporous Carbons from Task-Specific Ionic Liquid Precursors for the Adsorption of CO2

    DOE PAGES

    Mahurin, Shannon M.; Fulvio, Pasquale F.; Hillesheim, Patrick C.; ...

    2014-07-31

    Postcombustion CO2 capture has become a key component of greenhouse-gas reduction as anthropogenic emissions continue to impact the environment. In this paper, we report a one-step synthesis of porous carbon materials using a series of task-specific ionic liquids for the adsorption of CO2. By varying the structure of the ionic liquid precursor, we were able to control the pore architecture and surface functional groups of the carbon materials in this one-step synthesis process, leading to adsorbents with high CO2 sorption capacities (up to 4.067 mmol g-1) at 0 °C and 1 bar. Finally, added nitrogen functional groups led to high CO2/N2 adsorption-selectivity values ranging from 20 to 37, while simultaneously the interaction energy was enhanced relative to carbon materials with no added nitrogen.

  2. Thermodynamic analyses of hydrogen production from sub-quality natural gas. Part I: Pyrolysis and autothermal pyrolysis

    NASA Astrophysics Data System (ADS)

    Huang, Cunping; T-Raissi, Ali

    Sub-quality natural gas (SQNG) is defined as natural gas whose composition exceeds pipeline specifications for nitrogen, carbon dioxide (CO2) and/or hydrogen sulfide (H2S). Approximately one-third of the U.S. natural gas resource is sub-quality gas [1]. Due to the high cost of removing H2S from hydrocarbons using current processing technologies, SQNG wells are often capped and the gas remains in the ground. We propose and analyze a two-step hydrogen production scheme using SQNG as feedstock. The first step of the process involves hydrocarbon processing (via steam-methane reformation, autothermal steam-methane reformation, pyrolysis and autothermal pyrolysis) in the presence of H2S. Our analyses reveal that H2S existing in SQNG is stable and can be considered as an inert gas. No sulfur dioxide (SO2) and/or sulfur trioxide (SO3) is formed from the introduction of oxygen to SQNG. In the second step, after the separation of hydrogen from the main stream, un-reacted H2S is used to reform the remaining methane, generating more hydrogen and carbon disulfide (CS2). Thermodynamic analyses on SQNG feedstock containing up to 10% (v/v) H2S have shown that no H2S separation is required in this process. Part I of this paper includes only the thermodynamic analyses for SQNG pyrolysis and autothermal pyrolysis.

  3. Step-by-Step Construction of Gene Co-expression Networks from High-Throughput Arabidopsis RNA Sequencing Data.

    PubMed

    Contreras-López, Orlando; Moyano, Tomás C; Soto, Daniela C; Gutiérrez, Rodrigo A

    2018-01-01

    The rapid increase in the availability of transcriptomics data generated by RNA sequencing represents both a challenge and an opportunity for biologists without bioinformatics training. The challenge is handling, integrating, and interpreting these data sets. The opportunity is to use this information to generate testable hypotheses to understand molecular mechanisms controlling gene expression and biological processes (Fig. 1). A successful strategy for generating tractable hypotheses from transcriptomics data has been to build undirected network graphs based on patterns of gene co-expression. Many examples of new hypotheses derived from network analyses can be found in the literature, spanning different organisms including plants and specific fields such as root developmental biology. In order to make the process of constructing a gene co-expression network more accessible to biologists, here we provide step-by-step instructions using published RNA-seq experimental data obtained from a public database. Similar strategies have been used in previous studies to advance root developmental biology. This guide includes basic instructions for the operation of widely used open-source platforms such as Bio-Linux, R, and Cytoscape. Even though the data we used in this example were obtained from Arabidopsis thaliana, the workflow developed in this guide can be easily adapted to work with RNA-seq data from any organism.
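    The guide itself relies on Bio-Linux, R, and Cytoscape; purely as an illustration of the core idea (an undirected graph whose edges link genes with strongly correlated expression), the following Python sketch computes pairwise Pearson correlations on a toy expression matrix and keeps pairs above an assumed cutoff. The gene names, values, and threshold are invented for the example.

```python
import numpy as np

# Toy expression matrix: rows = genes, columns = RNA-seq samples
# (illustrative values; the guide works with real Arabidopsis data).
genes = ["geneA", "geneB", "geneC", "geneD"]
expr = np.array([
    [1.0, 2.0, 3.0, 4.0],   # geneA
    [2.1, 4.2, 5.9, 8.1],   # geneB: tracks geneA
    [4.0, 3.0, 2.0, 1.0],   # geneC: anti-correlated with geneA
    [1.0, 9.0, 2.0, 8.0],   # geneD: unrelated
])

# Pairwise Pearson correlation between gene expression profiles.
corr = np.corrcoef(expr)

# Undirected co-expression edges: |r| above a chosen cutoff (assumed 0.95).
cutoff = 0.95
edges = sorted(
    (genes[i], genes[j])
    for i in range(len(genes))
    for j in range(i + 1, len(genes))
    if abs(corr[i, j]) >= cutoff
)
print(edges)
```

    The resulting edge list is what would be loaded into Cytoscape for visualization; real workflows also filter low-count genes and correct for multiple testing before thresholding.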

  4. Step-by-step guideline for disease-specific costing studies in low- and middle-income countries: a mixed methodology

    PubMed Central

    Hendriks, Marleen E.; Kundu, Piyali; Boers, Alexander C.; Bolarinwa, Oladimeji A.; te Pas, Mark J.; Akande, Tanimola M.; Agbede, Kayode; Gomez, Gabriella B.; Redekop, William K.; Schultsz, Constance; Tan, Siok Swan

    2014-01-01

    Background Disease-specific costing studies can be used as input into cost-effectiveness analyses and provide important information for efficient resource allocation. However, limited data availability and limited expertise constrain such studies in low- and middle-income countries (LMICs). Objective To describe a step-by-step guideline for conducting disease-specific costing studies in LMICs where data availability is limited and to illustrate how the guideline was applied in a costing study of cardiovascular disease prevention care in rural Nigeria. Design The step-by-step guideline provides practical recommendations on methods and data requirements for six sequential steps: 1) definition of the study perspective, 2) characterization of the unit of analysis, 3) identification of cost items, 4) measurement of cost items, 5) valuation of cost items, and 6) uncertainty analyses. Results We discuss the necessary tradeoffs between the accuracy of estimates and data availability constraints at each step and illustrate how a mixed methodology of accurate bottom-up micro-costing and more feasible approaches can be used to make optimal use of all available data. An illustrative example from Nigeria is provided. Conclusions An innovative, user-friendly guideline for disease-specific costing in LMICs is presented, using a mixed methodology to account for limited data availability. The illustrative example showed that the step-by-step guideline can be used by healthcare professionals in LMICs to conduct feasible and accurate disease-specific cost analyses. PMID:24685170

  5. Using the Intervention Mapping Protocol to develop an online video intervention for parents to prevent childhood obesity: Movie Models.

    PubMed

    De Lepeleere, Sara; Verloigne, Maïté; Brown, Helen Elizabeth; Cardon, Greet; De Bourdeaudhuij, Ilse

    2016-08-08

    The increasing prevalence of childhood overweight/obesity caused by an unhealthy diet, insufficient physical activity (PA) and high levels of sedentary behaviour (SB) is a prominent public health concern. Parenting practices may contribute to healthy behaviour change in children, but well-researched examples are limited. The aim of this study is to describe the systematic development of an intervention for parents to prevent childhood overweight/obesity through the improvement of parenting practices. The six steps of the Intervention Mapping Protocol (IMP), a theory- and evidence-based tool to develop health-related interventions, were used as a framework to develop the 'Movie Models' programme. In Step 1, a needs assessment was performed to better understand the health problem of overweight/obesity in children and its association with diet, PA and SB. In Step 2, the programme goal (increasing the adoption of effective parenting practices) was sub-divided into performance objectives. Change objectives, which specify explicit actions required to accomplish the performance objectives, were also identified. Step 3 included the selection of theoretical methods (e.g. 'modelling' and 'images'), which were then translated into the practical strategy of online parenting videos. Step 4 comprised the development of a final intervention framework, and Step 5 included the planning of programme adoption and implementation. The final phase, Step 6, included the development of an effect- and process-evaluation plan. The IMP was used to structure the development of 'Movie Models', an intervention targeting specific parenting practices related to children's healthy diet, PA, SB, and parental self-efficacy. A clear framework for process analyses is offered, which aims to increase the potential effectiveness of an intervention and can be useful for those developing health promotion programmes. © The Author(s) 2016.

  6. Implementing Immediate Postpartum Long-Acting Reversible Contraception Programs.

    PubMed

    Hofler, Lisa G; Cordes, Sarah; Cwiak, Carrie A; Goedken, Peggy; Jamieson, Denise J; Kottke, Melissa

    2017-01-01

    To understand the most important steps required to implement immediate postpartum long-acting reversible contraception (LARC) programs in different Georgia hospitals and the barriers to implementing such a program. This was a qualitative study. We interviewed 32 key personnel from 10 Georgia hospitals working to establish immediate postpartum LARC programs. Data were analyzed using directed qualitative content analysis principles. We used the Stages of Implementation to organize participant-identified key steps for immediate postpartum LARC into an implementation guide. We compared this guide to hospitals' implementation experiences. At the completion of the study, LARC was available for immediate postpartum placement at 7 of 10 study hospitals. Participants identified common themes for the implementation experience: team member identification and ongoing communication, payer preparedness challenges, interdependent department-specific tasks, and piloting with continuing improvements. Participants expressed a need for anticipatory guidance throughout the process. Key first steps to immediate postpartum LARC program implementation were identifying project champions, creating an implementation team that included all relevant departments, obtaining financial reassurance, and ensuring hospital administration awareness of the project. Potential barriers included lack of knowledge about immediate postpartum LARC, financial concerns, and competing clinical and administrative priorities. Hospitals that were successful at implementing immediate postpartum LARC programs did so by prioritizing clear communication and multidisciplinary teamwork. Although the implementation guide reflects a comprehensive assessment of the steps to implementing immediate postpartum LARC programs, not all hospitals required every step to succeed. Hospital teams report that implementing immediate postpartum LARC programs involves multiple departments and a number of important steps to consider. 
A stage-based approach to implementation, and a standardized guide detailing these steps, may provide the necessary structure for the complex process of implementing immediate postpartum LARC programs in the hospital setting.

  7. Process, including PSA and membrane separation, for separating hydrogen from hydrocarbons

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo

    2001-01-01

    An improved process for separating hydrogen from hydrocarbons. The process includes a pressure swing adsorption step, a compression/cooling step and a membrane separation step. The membrane step relies on achieving a methane/hydrogen selectivity of at least about 2.5 under the conditions of the process.

  8. Use of Single-Cysteine Variants for Trapping Transient States in DNA Mismatch Repair.

    PubMed

    Friedhoff, Peter; Manelyte, Laura; Giron-Monzon, Luis; Winkler, Ines; Groothuizen, Flora S; Sixma, Titia K

    2017-01-01

    DNA mismatch repair (MMR) is necessary to prevent incorporation of polymerase errors into the newly synthesized DNA strand, as they would be mutagenic. In humans, errors in MMR cause a predisposition to cancer, called Lynch syndrome. The MMR process is performed by a set of ATPases that transmit, validate, and couple information to identify which DNA strand requires repair. To understand the individual steps in the repair process, it is useful to be able to study these large molecular machines structurally and functionally. However, the steps and states are highly transient; therefore, the methods to capture and enrich them are essential. Here, we describe how single-cysteine variants can be used for specific cross-linking and labeling approaches that allow trapping of relevant transient states. Analysis of these defined states in functional and structural studies is instrumental to elucidate the molecular mechanism of this important DNA MMR process. © 2017 Elsevier Inc. All rights reserved.

  9. Applying macromolecular crowding to 3D bioprinting: fabrication of 3D hierarchical porous collagen-based hydrogel constructs.

    PubMed

    Ng, Wei Long; Goh, Min Hao; Yeong, Wai Yee; Naing, May Win

    2018-02-27

    Native tissues and/or organs possess complex hierarchical porous structures that confer highly-specific cellular functions. Despite advances in fabrication processes, it is still very challenging to emulate the hierarchical porous collagen architecture found in most native tissues. Hence, the ability to recreate such hierarchical porous structures would result in biomimetic tissue-engineered constructs. Here, a single-step drop-on-demand (DOD) bioprinting strategy is proposed to fabricate hierarchical porous collagen-based hydrogels. Printable macromolecule-based bio-inks (polyvinylpyrrolidone, PVP) have been developed and printed in a DOD manner to manipulate the porosity within the multi-layered collagen-based hydrogels by altering the collagen fibrillogenesis process. The experimental results have indicated that hierarchical porous collagen structures could be achieved by controlling the number of macromolecule-based bio-ink droplets printed on each printed collagen layer. This facile single-step bioprinting process could be useful for the structural design of collagen-based hydrogels for various tissue engineering applications.

  10. Controlled electrochemical doping of graphene-based 3D nanoarchitecture electrodes for supercapacitors and capacitive deionisation.

    PubMed

    Abdelkader, A M; Fray, D J

    2017-10-05

    Chemically-doped graphenes are promising electrode materials for energy storage and electrosorption applications. Here, an affordable electrochemical green process is introduced to dope graphene with nitrogen. The process is based on reversing the polarity of two identical graphene oxide (GO) electrodes in molten KCl-LiCl-Li3N. During the cathodic step, the oxygen functional groups on the GO surface are removed through direct electro-deoxidation reactions or a reaction with the deposited lithium. In the anodic step, nitrogen is adsorbed onto the surface of graphene and subsequently reacts to form nitrogen-doped graphene. The doping process is controllable, and graphene with up to 7.4 at% nitrogen can be produced. The electrochemically treated electrodes show a specific capacitance of 320 F g-1 in an aqueous KOH electrolyte and maintain 96% of this value after 10,000 cycles. The electrodes also display excellent electrosorption performance in capacitive deionisation devices, with the salt removal efficiency reaching up to 18.6 mg g-1.
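    The 320 F g-1 figure is a specific capacitance, which for a galvanostatic charge-discharge test is commonly estimated from the standard relation C = I·Δt/(m·ΔV). A brief numeric sketch of that relation follows; the numbers are invented for illustration and are not the paper's data.

```python
# Specific capacitance from a galvanostatic discharge, using the standard
# relation C = I * dt / (m * dV). Illustrative numbers only.

def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
    """C (F/g) = I (A) * t (s) / (m (g) * dV (V))."""
    return current_a * discharge_time_s / (mass_g * voltage_window_v)

# Example: a 1 mA discharge lasting 320 s, 1 mg of active material, 1 V window.
c_spec = specific_capacitance(1e-3, 320.0, 1e-3, 1.0)
print(c_spec)  # F/g
```

    The 96% retention after 10,000 cycles would then simply be the ratio of the capacitance computed this way at the last cycle to that at the first.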

  11. Molecular chaperone function of Mia40 triggers consecutive induced folding steps of the substrate in mitochondrial protein import

    PubMed Central

    Banci, Lucia; Bertini, Ivano; Cefaro, Chiara; Cenacchi, Lucia; Ciofi-Baffoni, Simone; Felli, Isabella Caterina; Gallo, Angelo; Gonnelli, Leonardo; Luchinat, Enrico; Sideris, Dionisia; Tokatlidis, Kostas

    2010-01-01

    Several proteins of the mitochondrial intermembrane space are targeted by internal targeting signals. A class of such proteins with α-helical hairpin structure bridged by two intramolecular disulfides is trapped by a Mia40-dependent oxidative process. Here, we describe the oxidative folding mechanism underpinning this process by an exhaustive structural characterization of the protein in all stages and as a complex with Mia40. Two consecutive induced folding steps are at the basis of the protein-trapping process. In the first one, Mia40 functions as a molecular chaperone assisting α-helical folding of the internal targeting signal of the substrate. Subsequently, in a Mia40-independent manner, folding of the second substrate helix is induced by the folded targeting signal functioning as a folding scaffold. The Mia40-induced folding pathway provides a proof of principle for the general concept that internal targeting signals may operate as a folding nucleus upon compartment-specific activation. PMID:21059946

  12. Impact of Marine Drugs on Animal Reproductive Processes

    PubMed Central

    Silvestre, Francesco; Tosti, Elisabetta

    2009-01-01

    The discovery and description of bioactive substances from natural sources has been a research topic for the last 50 years. In this respect, marine animals have been used to extract many new compounds exerting different actions. Reproduction is a complex process whose main steps are the production and maturation of gametes, their activation, fertilisation, and the beginning of development. The literature shows that many substances extracted from marine organisms may have a profound influence on reproductive behaviour, function, reproductive strategies, and the survival of species. However, despite the central importance of reproduction, and thus the maintenance of species, there are still few studies on how reproductive mechanisms are impacted by marine bioactive drugs. At present, studies in both marine and terrestrial animals have been particularly important in identifying which specific reproductive mechanisms are affected by marine-derived substances. In this review we describe the main steps of the biology of reproduction and the impact of substances from the marine environment and organisms on reproductive processes. PMID:20098597

  13. Improving health care, Part 4: Concepts for improving any clinical process.

    PubMed

    Batalden, P B; Mohr, J J; Nelson, E C; Plume, S K

    1996-10-01

    One promising method for streamlining the generation of "good ideas" is to formulate what are sometimes called change concepts-general notions or approaches to change found useful in developing specific ideas for changes that lead to improvement. For example, in current efforts to reduce health care costs by discounting provider charges, the underlying generic concept is "reducing health care costs," and the specific idea is "discounting provider charges." Short-term gains in health care cost reduction can occur by pursuing discounts. After some time, however, limits to such reduction in costs are experienced. Persevering and continuing to travel down the "discounting provider charges" path is less likely to produce further substantial improvement than returning to the basic concept of "reducing health care costs." An interdisciplinary team aiming to reduce costs while improving quality of care for patients in need of hip joint replacement generated ideas for changing "what's done (process) to get better results." After team members wrote down their improvement ideas, they deduced the underlying change concepts and used them to generate even more ideas for improvement. Such change concepts include reordering the sequence of steps (preadmission physical therapy "certification"), eliminating failures at hand-offs between steps (transfer of information from physician's office to hospital), and eliminating a step (epidural pain control). Learning about making change, encouraging change, managing the change within and across organizations, and learning from the changes tested will characterize the sustainable, thriving health systems of the future.

  14. Gwyscan: a library to support non-equidistant scanning probe microscope measurements

    NASA Astrophysics Data System (ADS)

    Klapetek, Petr; Yacoot, Andrew; Grolich, Petr; Valtr, Miroslav; Nečas, David

    2017-03-01

    We present a software library and related methodology for enabling easy integration of adaptive-step (non-equidistant) scanning techniques into metrological scanning probe microscopes, or scanning probe microscopes where individual x, y position data are recorded during measurements. Scanning with adaptive steps can reduce the amount of data collected in SPM measurements, leading to faster data acquisition, a smaller amount of data to collect for a specific analytical task, and less sensitivity to mechanical and thermal drift. Implementation of adaptive scanning routines in a custom-built microscope is not normally an easy task: regular data are much easier to handle for previewing (e.g. levelling) and storage. We present an environment that makes implementation of adaptive scanning easier for an instrument developer, specifically taking into account the data acquisition approaches used in high-accuracy microscopes such as those developed by National Metrology Institutes. This includes a library with algorithms written in C and LabVIEW for handling data storage, generating regular mesh previews, and planning the scan path on the basis of different assumptions. A set of modules for the Gwyddion open source software for handling these data and for their further analysis is presented. Using this combination of data acquisition and processing tools, one can implement adaptive scanning relatively easily in an instrument that previously measured on a regular grid. The performance of the presented approach is shown and general non-equidistant data processing steps are discussed.
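
    The idea behind adaptive-step scanning can be illustrated with a minimal sketch, independent of the actual Gwyscan API: take a coarse pass, then add sample points only where neighbouring measurements differ strongly. All function and parameter names below are hypothetical.

```python
# Minimal sketch of adaptive-step (non-equidistant) 1-D scanning:
# take a coarse pass, then insert extra sample points wherever the
# local height difference between neighbours exceeds a threshold.
# Names are illustrative; this is not the Gwyscan API.

def surface(x):
    """Stand-in for the physical measurement at position x."""
    return 0.0 if x < 0.5 else 1.0  # a sharp step feature

def adaptive_scan(measure, x0, x1, coarse_n=8, tol=0.1, max_depth=10):
    xs = [x0 + i * (x1 - x0) / coarse_n for i in range(coarse_n + 1)]
    zs = [measure(x) for x in xs]
    depth = 0
    while depth < max_depth:
        new_xs = []
        for a, b, za, zb in zip(xs, xs[1:], zs, zs[1:]):
            if abs(zb - za) > tol:          # feature detected: refine here
                new_xs.append(0.5 * (a + b))
        if not new_xs:
            break
        xs = sorted(xs + new_xs)
        zs = [measure(x) for x in xs]
        depth += 1
    return xs, zs

xs, zs = adaptive_scan(surface, 0.0, 1.0)
# Points cluster around the step at x = 0.5; flat regions stay coarse.
```

    Flat regions keep the coarse spacing while points cluster around sharp features, which is exactly the data reduction that non-equidistant scanning exploits.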

  15. The nuts and bolts of evaluating science communication activities.

    PubMed

    Spicer, Suzanne

    2017-10-01

    Since 2008 there has been a focus on fostering a culture of public engagement in higher education, together with an impact agenda that demands scientists provide evidence of how their work, including their science communication, is making a difference. Good science communication takes a significant amount of time to plan and deliver, so how can you improve what you are doing and demonstrate whether you are having an impact? The answer is to evaluate. Effective evaluation needs to be planned, so this paper takes you step by step through the evaluation process, illustrated with specific examples. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  16. Self-assembled graphene hydrogel via a one-step hydrothermal process.

    PubMed

    Xu, Yuxi; Sheng, Kaixuan; Li, Chun; Shi, Gaoquan

    2010-07-27

    Self-assembly of two-dimensional graphene sheets is an important strategy for producing macroscopic graphene architectures for practical applications, such as thin films and layered paperlike materials. However, construction of graphene self-assembled macrostructures with three-dimensional networks has never been realized. In this paper, we prepared a self-assembled graphene hydrogel (SGH) via a convenient one-step hydrothermal method. The SGH is electrically conductive, mechanically strong, and thermally stable and exhibits a high specific capacitance. The high-performance SGH, with the inherent biocompatibility of carbon materials, is attractive in the fields of biotechnology and electrochemistry, for applications such as drug delivery, tissue scaffolds, bionic nanocomposites, and supercapacitors.

  17. Peer coaching: the next step in staff development.

    PubMed

    Waddell, Donna L; Dunn, Nancy

    2005-01-01

    A common problem in continuing nursing education and staff development is the transfer of learning to clinical practice. Peer coaching offers a solution to this problem. Initiated by educators, peer coaching has been researched in educational settings and found to be effective in facilitating the transfer of newly acquired knowledge and skill into classroom teaching strategies. This article describes the background, components, process, characteristics, and benefits of peer coaching. A specific example of using peer coaching to teach clinical breast examination skills is used to illustrate the application of peer coaching to the staff development of healthcare professionals. Peer coaching is the next step in nursing staff development.

  18. Development of DKB ETL module in case of data conversion

    NASA Astrophysics Data System (ADS)

    Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.

    2018-05-01

    Modern scientific experiments produce huge volumes of data, which requires new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a valuable amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue, requiring the quest for solutions outside the box. One of the tasks is to integrate metadata from different repositories into some kind of central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of particular pipelines arranging data flow from data sources to the main DKB storage. The data transformation process represented by a single pipeline can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.

  19. Steps to a HealthierUS Cooperative Agreement Program: foundational elements for program evaluation planning, implementation, and use of findings.

    PubMed

    MacDonald, Goldie; Garcia, Danyael; Zaza, Stephanie; Schooley, Michael; Compton, Don; Bryant, Terry; Bagnol, Lulu; Edgerly, Cathy; Haverkate, Rick

    2006-01-01

    The Steps to a HealthierUS Cooperative Agreement Program (Steps Program) enables funded communities to implement chronic disease prevention and health promotion efforts to reduce the burden of diabetes, obesity, asthma, and related risk factors. At both the national and community levels, investment in surveillance and program evaluation is substantial. Public health practitioners engaged in program evaluation planning often identify desired outcomes, related indicators, and data collection methods but may pay only limited attention to an overarching vision for program evaluation among participating sites. We developed a set of foundational elements to provide a vision of program evaluation that informs the technical decisions made throughout the evaluation process. Given the diversity of activities across the Steps Program and the need for coordination between national- and community-level evaluation efforts, our recommendations to guide program evaluation practice are explicit yet leave room for site-specific context and needs. Staff across the Steps Program must consider these foundational elements to prepare a formal plan for program evaluation. Attention to each element moves the Steps Program closer to well-designed and complementary plans for program evaluation at the national, state, and community levels.

  20. Process for remediation of plastic waste

    DOEpatents

    Pol, Vilas G; Thiyagarajan, Pappannan

    2013-11-12

    A single-step process for degrading plastic waste by converting it into carbonaceous products via thermal decomposition: the plastic waste is placed into a reactor, heated under an inert or air atmosphere until a temperature of about 700 °C is reached, the reactor is allowed to cool down, and the resulting decomposition products are recovered. The decomposition products that this process yields are carbonaceous materials, more specifically carbon nanotubes having a partially filled (encapsulated) core adjacent to one end of the nanotube. Additionally, in the presence of a transition metal compound, this thermal decomposition process produces multi-walled carbon nanotubes.

  1. Uncertainty analysis as essential step in the establishment of the dynamic Design Space of primary drying during freeze-drying.

    PubMed

    Mortier, Séverine Thérèse F C; Van Bockstal, Pieter-Jan; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas

    2016-06-01

    Large molecules, such as biopharmaceuticals, are considered the key driver of growth for the pharmaceutical industry. Freeze-drying is the preferred way to stabilise these products when needed. However, it is an expensive, inefficient, time- and energy-consuming process. During freeze-drying, there are only two main process variables to be set, i.e. the shelf temperature and the chamber pressure, preferably in a dynamic way. This manuscript focuses on the essential use of uncertainty analysis for the determination and experimental verification of the dynamic primary drying Design Space for pharmaceutical freeze-drying. Traditionally, the chamber pressure and shelf temperature are kept constant during primary drying, leading to suboptimal process conditions. In this paper it is demonstrated how a mechanistic model of the primary drying step gives the opportunity to determine the optimal dynamic values for both process variables during processing, resulting in a dynamic Design Space with a well-known risk of failure. This allows the primary drying step to be run as time-efficiently as possible, while guaranteeing that the temperature at the sublimation front does not exceed the collapse temperature. The Design Space is the multidimensional combination and interaction of input variables and process parameters leading to the expected product specifications with a controlled (i.e., high) probability. Therefore, inclusion of parameter uncertainty is an essential part of the definition of the Design Space, although it is often neglected. To quantitatively assess the inherent uncertainty on the parameters of the mechanistic model, an uncertainty analysis was performed to establish the borders of the dynamic Design Space, i.e. a time-varying shelf temperature and chamber pressure, associated with a specific risk of failure. A risk of failure acceptance level of 0.01%, i.e. 
a 'zero-failure' situation, results in an increased primary drying process time compared to the deterministic dynamic Design Space; however, the risk of failure is under control. Experimental verification revealed that only a risk of failure acceptance level of 0.01% yielded a guaranteed zero-defect quality end-product. The computed process settings with a risk of failure acceptance level of 0.01% reduced the primary drying time by more than half in comparison with a regular, conservative cycle with fixed settings. Copyright © 2016. Published by Elsevier B.V.
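
    The role of uncertainty analysis described above can be sketched as a Monte Carlo estimate of the risk of failure for a given process setting. The surrogate model, parameter distribution, and numbers below are purely illustrative; they are not the paper's mechanistic primary-drying model.

```python
# Toy Monte Carlo sketch of a risk-of-failure estimate for a process
# setting: sample uncertain model parameters and count how often the
# predicted sublimation-front temperature exceeds the collapse
# temperature.  The linear "model" below is purely illustrative.
import random

random.seed(0)
T_COLLAPSE = -32.0  # collapse temperature (deg C), illustrative

def front_temperature(shelf_T, chamber_P, resistance):
    # Hypothetical surrogate: warmer shelf, higher pressure and higher
    # dried-layer resistance all push the front temperature up.
    return shelf_T + 8.0 * chamber_P + 2.0 * resistance

def risk_of_failure(shelf_T, chamber_P, n=100_000):
    failures = 0
    for _ in range(n):
        # Parameter uncertainty: dried-layer resistance varies run to run.
        resistance = random.gauss(1.0, 0.2)
        if front_temperature(shelf_T, chamber_P, resistance) > T_COLLAPSE:
            failures += 1
    return failures / n

# A conservative setting keeps the estimated risk near zero; a more
# aggressive (faster) setting trades drying time for a higher risk.
print(risk_of_failure(shelf_T=-40.0, chamber_P=0.1))  # low risk
print(risk_of_failure(shelf_T=-36.0, chamber_P=0.1))  # higher risk
```

    Sweeping the settings and keeping only those whose estimated risk stays below the acceptance level (e.g. 0.01%) traces out a Design Space border that accounts for parameter uncertainty.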

  2. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images.

    PubMed

    Muncy, Nathan M; Hedges-Muncy, Ariana M; Kirwan, C Brock

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, each contributing three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency), and repeated pipeline runs (for algorithmic consistency). A main effect of pipeline step was detected and, interestingly, an interaction between pipeline step and ROI exists. No effect of either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing.
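
    The within-pipeline design can be sketched as a step-wise pipeline that records the ROI volume after each successive step. The step names, scaling factors, and volume function below are invented for illustration and do not reproduce the study's tools.

```python
# Structural sketch of a step-wise pre-processing pipeline in which a
# volume estimate is recorded after every successive step, so that a
# per-step effect can be tested.  All numbers are illustrative.

def skull_strip(scan):      return {**scan, "voxels": scan["voxels"] * 0.80}
def bias_correct(scan):     return {**scan, "voxels": scan["voxels"] * 0.99}
def normalize_space(scan):  return {**scan, "voxels": scan["voxels"] * 1.05}

PIPELINE = [skull_strip, bias_correct, normalize_space]

def roi_volume(scan):
    # Stand-in for a real segmentation-based ROI volume estimate (mm^3).
    return scan["voxels"] * scan["voxel_mm3"]

def run_pipeline(scan):
    """Apply each step in order, logging the ROI volume after each."""
    volumes = {"raw": roi_volume(scan)}
    for step in PIPELINE:
        scan = step(scan)
        volumes[step.__name__] = roi_volume(scan)
    return volumes

vols = run_pipeline({"voxels": 4000.0, "voxel_mm3": 1.0})
# Comparing successive entries of `vols` shows how much each discrete
# step shifts the estimate, which is the within-pipeline effect tested.
```

    Running the same pipeline repeatedly on the same scan, and on repeat scans of the same participant, gives the scan-rescan and algorithmic-consistency comparisons described above.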

  3. Mass production of silicon pore optics for ATHENA

    NASA Astrophysics Data System (ADS)

    Wille, Eric; Bavdaz, Marcos; Collon, Maximilien

    2016-07-01

    Silicon Pore Optics (SPO) provide high angular resolution with low effective area density as required for the Advanced Telescope for High Energy Astrophysics (Athena). The x-ray telescope consists of several hundred SPO mirror modules. During the development of the process steps of the SPO technology, the specific requirements of a future mass production have been considered right from the beginning. The manufacturing methods heavily utilise off-the-shelf equipment from the semiconductor industry, robotic automation, and parallel processing. This allows the present production flow to be upscaled in a cost-effective way to produce hundreds of mirror modules per year. Considering manufacturing predictions based on the current technology status, we present an analysis of the time and resources required for the Athena flight programme. This includes the full production process, starting with Si wafers up to the integration of the mirror modules. We present the times required for the individual process steps and identify the equipment required to produce two mirror modules per day. A preliminary timeline for building and commissioning the required infrastructure, and for flight model production of about 1000 mirror modules, is presented.

  4. Six Sigma process utilization in reducing door-to-balloon time at a single academic tertiary care center.

    PubMed

    Kelly, Elizabeth W; Kelly, Jonathan D; Hiestand, Brian; Wells-Kiser, Kathy; Starling, Stephanie; Hoekstra, James W

    2010-01-01

    Rapid reperfusion in patients with ST-elevation myocardial infarction (STEMI) is associated with lower mortality. Reduction in door-to-balloon (D2B) time for percutaneous coronary intervention requires multidisciplinary cooperation, process analysis, and quality improvement methodology. Six Sigma methodology was used to reduce D2B times in STEMI patients presenting to a tertiary care center. Specific steps in STEMI care were determined, time goals were established, and processes were changed to reduce each step's duration. Outcomes were tracked, and timely feedback was given to providers. After process analysis and implementation of improvements, mean D2B times decreased from 128 to 90 minutes. Improvement has been sustained; as of June 2010, the mean D2B was 56 minutes, with 100% of patients meeting the 90-minute window for the year. Six Sigma methodology and immediate provider feedback result in significant reductions in D2B times. The lessons learned may be extrapolated to other primary percutaneous coronary intervention centers. Copyright © 2010 Elsevier Inc. All rights reserved.
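
    The outcome tracking described above reduces to two simple metrics per review period, sketched here with invented times: the mean D2B and the fraction of cases meeting the 90-minute goal.

```python
# Toy sketch of the outcome tracking: compute the mean door-to-balloon
# time and the fraction of cases meeting the 90-minute goal, the two
# metrics fed back to providers.  The times below are invented.

d2b_minutes = [42, 55, 61, 48, 88, 73, 39, 95, 66, 51]

mean_d2b = sum(d2b_minutes) / len(d2b_minutes)
within_goal = sum(t <= 90 for t in d2b_minutes) / len(d2b_minutes)

print(f"mean D2B: {mean_d2b:.0f} min, within 90 min: {within_goal:.0%}")
```

    In a Six Sigma framing, the cases exceeding the 90-minute goal are the defects, and the per-step timestamps identify which process step to target next.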

  5. A case study analysing the process of analogy-based learning in a teaching unit about simple electric circuits

    NASA Astrophysics Data System (ADS)

    Paatz, Roland; Ryder, James; Schwedes, Hannelore; Scott, Philip

    2004-09-01

    The purpose of this case study is to analyse the learning processes of a 16-year-old student as she learns about simple electric circuits in response to an analogy-based teaching sequence. Analogical thinking processes are modelled by a sequence of four steps according to Gentner's structure mapping theory (activate base domain, postulate local matches, connect them to a global match, draw candidate inferences). We consider whether Gentner's theory can be used to account for the details of this specific teaching/learning context. The case study involved video-taping teaching and learning activities in a 10th-grade high school course in Germany. Teaching used water flow through pipes as an analogy for electrical circuits. Using Gentner's theory, relational nets were created from the student's statements at different stages of her learning. Overall, these nets reflect the four steps outlined earlier. We also consider to what extent the learning processes revealed by this case study are different from previous analyses of contexts in which no analogical knowledge is available.

  6. [Psychosocial Risk Evaluation in the Workplace: Expert-based Development of a Checklist for Occupational Physicians].

    PubMed

    Weigl, M; Müller, A; Angerer, P; Petru, R

    2016-03-01

    The implementation of psychosocial risk assessment at the workplace often fails in practice. One reason is a lack of competence of those who are in charge of the process. We present a checklist for the effective implementation of psychosocial risk assessment at the workplace. This tool shall support occupational physicians in the preparation, planning and implementation of a psychosocial risk assessment process. Based on a stepwise development and validation process, specific steps and factors for successful implementation were identified qualitatively with 15 occupational physicians and experts in a two-stage Delphi study. The identified steps and factors were then transferred into a checklist, which was subsequently evaluated in a focus group of occupational physicians (user evaluation); thereafter, the contents were subjected to an expert evaluation. Our checklist for the effective implementation of psychosocial risk management in the workplace aims to strengthen the competence of occupational physicians, especially in the implementation of risk assessments in small and medium-sized enterprises (SMEs). © Georg Thieme Verlag KG Stuttgart · New York.

  7. The Need for a Kinetics for Biological Transport

    PubMed Central

    Schindler, A. M.; Iberall, A. S.

    1973-01-01

    The traditional theory of transport across capillary membranes via a laminar Poiseuille flow is shown to be invalid. It is demonstrated that the random, diffusive nature of the molecular flow and interactions with the “pore” walls play an important role in the transport process. Neither the continuum Navier-Stokes theory nor the equivalent theory of irreversible thermodynamics is adequate to treat the problem. Combination of near-continuum hydrodynamic theory, noncontinuum kinetic theory, and the theory of fluctuations provides a first step toward modeling both liquid processes in general and membrane transport processes as a specific application. PMID:4726880

  8. VisAdapt: A Visualization Tool to Support Climate Change Adaptation.

    PubMed

    Johansson, Jimmy; Opach, Tomasz; Glaas, Erik; Neset, Tina-Simone; Navarra, Carlo; Linner, Bjorn-Ola; Rod, Jan Ketil

    2017-01-01

    The web-based visualization tool VisAdapt was developed to help laypeople in the Nordic countries assess how anticipated climate change will impact their homes. The tool guides users through a three-step visual process that helps them explore risks and identify adaptive actions tailored to their location and house type. This article walks through the tool's multistep, user-centered design process. Although VisAdapt's target end users are Nordic homeowners, the insights gained from the development process and the lessons learned from the project are applicable to a wide range of domains.

  9. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Milner, E. J.

    1982-01-01

    The application of predictor-corrector integration algorithms developed for the digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented, and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
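
    A predictor-corrector scheme of the kind investigated can be sketched with a two-step Adams-Bashforth predictor and a trapezoidal corrector on a simple linear test equation; the turbofan model itself is not reproduced here, but the effect of integration step size on accuracy is directly visible.

```python
# Minimal predictor-corrector sketch (Adams-Bashforth 2 predictor,
# trapezoidal/Adams-Moulton corrector) on a linear test system,
# illustrating how accuracy depends on the integration step size.
import math

def integrate(f, y0, h, t_end):
    t, y = 0.0, y0
    f_prev = f(t, y)
    # Bootstrap one step with Euler so the two-step predictor has history.
    y = y + h * f_prev
    t += h
    while t < t_end - 1e-12:
        f_cur = f(t, y)
        y_pred = y + h * (1.5 * f_cur - 0.5 * f_prev)      # predictor
        y = y + 0.5 * h * (f_cur + f(t + h, y_pred))       # corrector
        f_prev = f_cur
        t += h
    return y

# Linear test model y' = -2y (a stand-in for a linearized engine state).
f = lambda t, y: -2.0 * y
exact = math.exp(-2.0)
for h in (0.1, 0.01):
    err = abs(integrate(f, 1.0, h, 1.0) - exact)
    print(f"h={h}: error={err:.2e}")  # error shrinks roughly as h**2
```

    In a parallel-processing setting, the appeal of predictor-corrector schemes is that the predicted derivative evaluations for coupled state variables can be computed concurrently on separate processors.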

  10. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information, including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a way that is consistent with its capabilities and capacities. When more than one machine offers a given process for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
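
    The bidding step can be sketched as follows, with invented machine data: the coordinator broadcasts a requirement, capable machines respond, and the bid that meets the due date at the lowest cost wins. This is a toy illustration of the contract-net idea, not the paper's implementation.

```python
# Toy sketch of the bidding step in a contract-net style shop floor:
# the coordinator broadcasts a product's processing requirement, each
# autonomous machine that offers the process returns a bid, and the
# bid with the lowest cost that still meets the due date wins.
# Machine data and the scoring rule are illustrative.

machines = [
    {"name": "mill_1",  "processes": {"milling"}, "cost": 120, "finish_day": 4},
    {"name": "mill_2",  "processes": {"milling"}, "cost": 95,  "finish_day": 7},
    {"name": "lathe_1", "processes": {"turning"}, "cost": 60,  "finish_day": 2},
]

def broadcast(requirement, due_day):
    """Collect bids from machines able to perform the required process."""
    return [m for m in machines
            if requirement in m["processes"] and m["finish_day"] <= due_day]

def award(requirement, due_day):
    bids = broadcast(requirement, due_day)
    if not bids:
        return None                       # no machine can take the job
    return min(bids, key=lambda m: m["cost"])["name"]

print(award("milling", due_day=5))   # mill_2 is cheaper but too late
print(award("milling", due_day=10))  # now mill_2's bid wins on cost
```

    Because each machine scores the request against its own schedule and capabilities, planning and scheduling are decided together at award time rather than in separate sequential stages.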

  11. Systems cost/performance analysis (study 2.3). Volume 2: Systems cost/performance model. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step selects all hardware designs which satisfy the given performance and safety requirements; the second step estimates the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedules. The user is able to determine the sensitivity of designs, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.
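
    The two-step methodology can be sketched as a feasibility screen followed by cost/schedule estimation; the designs, requirement thresholds, and estimating relationships below are invented for illustration.

```python
# Sketch of the two-step methodology: (1) keep every candidate design
# that satisfies the performance and safety requirements, (2) estimate
# cost and schedule for each survivor so sensitivities can be explored.
# The designs and estimating relationships below are made up.

designs = [
    {"name": "A", "pointing_acc": 0.05, "redundancy": 2, "mass_kg": 310},
    {"name": "B", "pointing_acc": 0.20, "redundancy": 1, "mass_kg": 180},
    {"name": "C", "pointing_acc": 0.08, "redundancy": 2, "mass_kg": 250},
]

def feasible(d, max_acc=0.1, min_redundancy=2):
    """Step 1: performance (pointing) and safety (redundancy) screen."""
    return d["pointing_acc"] <= max_acc and d["redundancy"] >= min_redundancy

def estimate(d):
    """Step 2: toy cost/schedule estimating relationships driven by mass."""
    return {"name": d["name"],
            "cost_musd": 5.0 + 0.1 * d["mass_kg"],
            "schedule_months": 12 + 0.05 * d["mass_kg"]}

candidates = [estimate(d) for d in designs if feasible(d)]
# Tightening max_acc or min_redundancy and re-running shows how the
# surviving designs' cost and schedule respond to requirement changes.
```

    Re-running the screen under perturbed requirements is precisely the sensitivity analysis the model is meant to support.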

  12. Development of adsorptive hybrid filters to enable two-step purification of biologics

    PubMed Central

    Peck, Michael; Voloshin, Alexei M.; Moreno, Angela M.; Tan, Zhijun; Hester, Jonathan; Borys, Michael C.; Li, Zheng Jian

    2017-01-01

    Recent progress in mammalian cell culture processes has resulted in significantly increased product titers, but also a substantial increase in process- and product-related impurities. Due to the diverse physicochemical properties of these impurities, there is a constant need for new technologies that offer higher productivity and improved economics without sacrificing the process robustness required to meet final drug substance specifications. Here, we examined the use of new synthetic adsorptive hybrid filters (AHF) modified with high-binding-capacity quaternary amine (Emphaze™ AEX) and salt-tolerant biomimetic (Emphaze™ ST-AEX) ligands for clearance of process-related impurities like host cell protein (HCP), residual DNA, and virus. The potential to remove soluble aggregates was also examined. Our aim was to develop a mechanistic understanding of the interactions governing adsorptive removal of impurities during filtration by evaluating the effect of various filter types, feed streams, and process conditions on impurity removal. The ionic capacity of these filters was measured and correlated with their ability to remove impurities for multiple molecules. The ionic capacity of AHF significantly exceeded that of traditional adsorptive depth filters (ADF) by 40% for the Emphaze™ AEX and by 700% for the Emphaze™ ST-AEX, providing substantially higher reduction of soluble anionic impurities, including DNA, HCPs, and model virus. Nevertheless, we determined that ADF with filter aid provided additional hydrophobic functionality that resulted in removal of higher molecular weight species than AHF. Implementing AHF demonstrated improved process-related impurity removal and viral clearance after Protein A chromatography and enabled a two-step purification process. 
The consequences of enhanced process performance are far reaching because it allows the downstream polishing train to be restructured and simplified, and chromatographic purity standards to be met with a reduced number of chromatographic steps. PMID:27929735

  13. Two step continuous method to synthesize colloidal spheroid gold nanorods.

    PubMed

    Chandra, S; Doran, J; McCormack, S J

    2015-12-01

    This research investigated a two-step continuous process to synthesize a colloidal suspension of spheroid gold nanorods. In the first step, the gold precursor was reduced to seed-like particles in the presence of polyvinylpyrrolidone and ascorbic acid. In the continuous second step, silver nitrate and alkaline sodium hydroxide produced Au nanoparticles of various shapes and sizes. The shape was manipulated through the weight ratio of ascorbic acid to silver nitrate by varying the silver nitrate concentration. A specific weight ratio of 1.35-1.75 grew spheroid gold nanorods with aspect ratios of ∼1.85 to ∼2.2; lower weight ratios of 0.5-1.1 formed spherical nanoparticles. The alkaline medium increased the yield of gold nanorods and reduced the reaction time at room temperature. The synthesized gold nanorods retained their shape and size in ethanol. The surface plasmon resonance was red-shifted by ∼5 nm due to the higher refractive index of ethanol compared with water. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Description and Results of the Air Force Research and Development Program for the Improvement of Maintenance Efficiency.

    ERIC Educational Resources Information Center

    Foley, John P., Jr.

    An overview of the Air Force's Research and Development Program for the Improvement of Maintenance Efficiency is provided. First described are the steps found in any detailed task analysis, a process which results in the complete specification of each task involved in an overall maintenance effort. The factors influencing maintenance effectiveness…

  15. How to Perform a Security Audit: Is Your School's or District's Network Vulnerable?

    ERIC Educational Resources Information Center

    Dark, Melissa; Poftak, Amy

    2004-01-01

    In this article, the authors address the importance of taking a proactive approach to securing a school's network. To do this, it is first required to know the system's specific vulnerabilities and what steps to take to reduce them. The formal process for doing this is known as an information security risk assessment, or a security audit. What…

  16. Furniture and Equipment in Schools: A Purchasing Guide. Managing School Facilities, Guide 7.

    ERIC Educational Resources Information Center

    Wadsworth, Alison

    This document offers advice on the processes that should be followed when schools in the United Kingdom buy their furniture and equipment (F&E). Sections 1 and 2 examine the first steps, prior to purchasing, such as curriculum analysis and market exploration; and sections 3 and 4 explore the importance of creating a clear specification for…

  17. From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.

    2013-01-01

    We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…

  18. Structure of matrix metalloproteinase-3 with a platinum-based inhibitor.

    PubMed

    Belviso, Benny Danilo; Caliandro, Rocco; Siliqi, Dritan; Calderone, Vito; Arnesano, Fabio; Natile, Giovanni

    2013-06-18

    An X-ray investigation has been performed with the aim of characterizing the binding sites of a platinum-based inhibitor (K[PtCl3(DMSO)]) of matrix metalloproteinase-3 (stromelysin-1). The platinum complex targets His224 in the S1' specificity loop, representing the first step in the selective inhibition process (PDB ID code 4JA1).

  19. From Red Tape to Results: Creating a Government That Works Better & Costs Less. Report of the National Performance Review.

    ERIC Educational Resources Information Center

    Gore, Al

    This monograph presents results of a 6-month study of the federal government and the Clinton Administration's proposal for a decade-long process of re-inventing the federal government's operations. Each of four major principles are presented in a chapter organized around specific steps towards its implementation. These principles are: cutting red…

  20. Synthesis of Platinum-nickel Nanowires and Optimization for Oxygen Reduction Performance

    DOE PAGES

    Alia, Shaun M.; Pivovar, Bryan S.

    2018-01-01

    Platinum-nickel (Pt-Ni) nanowires were developed as fuel cell electrocatalysts and were optimized for performance and durability in the oxygen reduction reaction. Spontaneous galvanic displacement was used to deposit Pt layers onto Ni nanowire substrates. The synthesis approach produced catalysts with high specific activities and high Pt surface areas. Hydrogen annealing improved Pt and Ni mixing and specific activity. Acid leaching was used to preferentially remove Ni near the nanowire surface, and oxygen annealing was used to stabilize near-surface Ni, improving durability and minimizing Ni dissolution. These protocols detail the optimization of each post-synthesis processing step, including hydrogen annealing to 250 °C, exposure to 0.1 M nitric acid, and oxygen annealing to 175 °C. Through these steps, Pt-Ni nanowires produced activities more than an order of magnitude higher than Pt nanoparticles, while offering significant durability improvements. The presented protocols are based on Pt-Ni systems in the development of fuel cell catalysts. Furthermore, these techniques have also been used for a variety of metal combinations and can be applied to develop catalysts for a number of electrochemical processes.

  2. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and again after their second, for a total of 10 months of instruction. The study aimed to characterize the transition from modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned to processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps, right hemispheric recruitment was observed with increasing left-lateralization, a trajectory similar to that of native signers and L2 learners of spoken language; however, specialization for sign language processing, with activation in the inferior parietal lobule (i.e., angular gyrus), was observed even in these late learners. As such, the present study is the first to track L2 acquisition in sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Scintillator Design Via Codoping

    NASA Astrophysics Data System (ADS)

    Melcher, C. L.; Koschan, M.; Zhuravleva, M.; Wu, Y.; Rothfuss, H.; Meng, F.; Tyagi, M.; Donnald, S.; Yang, K.; Hayward, J. P.; Eriksson, L.

Scintillation materials that lack intrinsic luminescence centers must be doped with optically active ions in order to provide luminescent centers that radiatively de-excite as the final step of the scintillation process. Codoping, on the other hand, can be defined as the incorporation of additional specific impurity species, usually for the purpose of modifying the scintillation properties, mechanical properties, or crystal growth behavior. In recent years codoping has become an increasingly popular approach for engineering scintillators with optimal performance for targeted applications. This report reviews several successful examples of codoping and its effects on specific properties.

  4. Recombinant Passenger Proteins Can Be Conveniently Purified by One-Step Affinity Chromatography.

    PubMed

    Wang, Hua-zhen; Chu, Zhi-zhan; Chen, Chang-chao; Cao, Ao-cheng; Tong, Xin; Ouyang, Can-bin; Yuan, Qi-hang; Wang, Mi-nan; Wu, Zhong-kun; Wang, Hai-hong; Wang, Sheng-bin

    2015-01-01

Fusion tags are among the best tools available to date for enhancing the solubility or improving the expression level of recombinant proteins in Escherichia coli. Typically, two consecutive affinity purification steps are required for the purification of passenger proteins. As a fusion tag, acyl carrier protein (ACP) could greatly increase the soluble expression level of glucokinase (GlcK), α-amylase (Amy) and GFP. When the fusion proteins ACP-G2-GlcK-Histag and ACP-G2-Amy-Histag, in which a TEV protease recognition site was inserted between the fusion tag and the passenger protein, were each coexpressed with TEV protease in E. coli, efficient intracellular processing of the fusion proteins was achieved. The resulting passenger proteins GlcK-Histag and Amy-Histag accumulated predominantly in a soluble form, and could be conveniently purified by one-step Ni-chelating chromatography. However, the fusion protein ACP-GFP-Histag was processed incompletely by the TEV protease coexpressed in vivo, and a large portion of the resulting target protein GFP-Histag aggregated in an insoluble form, indicating that intracellular processing may affect the solubility of the cleaved passenger protein. In this context, the soluble fusion protein ACP-GFP-Histag, contained in the supernatant of the E. coli cell lysate, was directly subjected to cleavage in vitro by mixing it with the clarified cell lysate of E. coli overexpressing TEV protease. Consequently, the resulting target protein GFP-Histag accumulated predominantly in a soluble form, and could be purified conveniently by one-step Ni-chelating chromatography. The approaches presented here greatly simplify the purification of passenger proteins, and eliminate the use of large amounts of pure site-specific proteases.

  6. Numerical Simulation of Nanostructure Growth

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.

    2004-01-01

Nanoscale structures, such as nanowires and carbon nanotubes (CNTs), are often grown in gaseous or plasma environments. Successful growth of these structures is defined by achieving a specified crystallinity or chirality, size or diameter, alignment, etc., which in turn depend on gas mixture ratios, pressure, flow rate, substrate temperature, and other operating conditions. To date, there has not been a rigorous growth model that addresses the specific concerns of crystalline nanowire growth while demonstrating the correct trends of the processing conditions on growth rates. Most crystal growth models are based on the Burton, Cabrera, and Frank (BCF) method, in which adatoms are incorporated into a growing crystal at surface steps or spirals. When the supersaturation of the vapor is high, islands nucleate to form steps, and these steps subsequently spread (grow). The overall bulk growth rate is determined by solving for the evolving motion of the steps. Our approach is to use a phase field model to simulate the growth of finite-sized nanowire crystals, linking the free energy equation with the diffusion equation for the adatoms. The phase field method solves for an order parameter that defines the evolving steps in a concentration field. This eliminates the need for explicit front tracking/location, or complicated shadowing routines, both of which can be computationally expensive, particularly in higher dimensions. We will present results demonstrating the effect of process conditions, such as substrate temperature and vapor supersaturation, on the evolving morphologies and overall growth rates of the nanostructures.
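The order-parameter idea described above can be illustrated with a toy phase-field update. This is a generic 1D Allen-Cahn sketch with an explicit Euler step, periodic boundaries, and invented parameters; it is not the authors' growth model, only the kind of equation a phase field method evolves:

```python
import numpy as np

def allen_cahn_step(phi, dx, dt, eps):
    """One explicit Euler step of a 1D Allen-Cahn phase-field equation:
    phi_t = eps^2 * phi_xx - W'(phi), with double-well W = phi^2 (1 - phi)^2,
    so phi = 0 (vapor) and phi = 1 (crystal) are the two stable phases."""
    # Periodic finite-difference Laplacian over the whole array at once.
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    dW = 2.0 * phi * (1.0 - phi) * (1.0 - 2.0 * phi)  # W'(phi)
    return phi + dt * (eps**2 * lap - dW)

# A sharp step (a "surface step" profile) relaxes toward a smooth interface.
x = np.linspace(0.0, 1.0, 200)
phi = (x < 0.5).astype(float)
for _ in range(500):
    phi = allen_cahn_step(phi, dx=x[1] - x[0], dt=1e-5, eps=0.02)
```

The diffuse interface produced by the double-well potential is exactly what removes the need for explicit front tracking: the step location is wherever phi crosses 1/2.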

  7. Pyrrhocoricin, a proline-rich antimicrobial peptide derived from insect, inhibits the translation process in the cell-free Escherichia coli protein synthesis system.

    PubMed

    Taniguchi, Masayuki; Ochiai, Akihito; Kondo, Hiroshi; Fukuda, Shun; Ishiyama, Yohei; Saitoh, Eiichi; Kato, Tetsuo; Tanaka, Takaaki

    2016-05-01

Previous studies have shown that pyrrhocoricin, a proline-rich antimicrobial peptide (PrAMP), killed sensitive species in a dose-dependent manner by specifically binding to DnaK. Here, on the basis of the finding that DnaK-deficient Escherichia coli strains are susceptible to PrAMPs, we used pyrrhocoricin to investigate internal targets other than DnaK. Using conventional antibiotics (bleomycin, streptomycin, and fosfomycin) that have known modes of action, we first validated the availability of an assay using a cell-free rapid translation system (RTS), an in vitro protein synthesis system based on E. coli lysate, for evaluating inhibition of protein synthesis. We found that, similarly to bleomycin and streptomycin, pyrrhocoricin inhibited GFP synthesis in RTS in a concentration-dependent manner. In addition, blockage of the transcription and translation steps in RTS was estimated individually: RT-PCR after gene expression was used to determine mRNA products, and sodium dodecyl sulfate-polyacrylamide gel electrophoresis was used to determine the amounts of GFP expressed from purified mRNA. The results demonstrated that this inhibition of GFP synthesis by pyrrhocoricin occurred not at the transcription step but at the translation step, in a manner similar to that of streptomycin, which inhibits the translation step by causing misreading of tRNA. These results suggest that RTS is a powerful assay system for determining whether antimicrobial peptides inhibit protein synthesis at its transcription and/or translation steps. This is the first study to show that pyrrhocoricin inhibits protein synthesis by specifically repressing the translation step. Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  8. Furthering Our Understanding of Therapist Knowledge and Attitudinal Measurement in Youth Community Mental Health.

    PubMed

    Okamura, Kelsie H; Hee, Puanani J; Jackson, David; Nakamura, Brad J

    2018-02-19

    Examining therapist evidence-based practice (EBP) knowledge seems an important step for supporting successful implementation. Advances in implementation science suggest a distinction between practice specific (i.e., knowing which practices are derived from the evidence base) and EBP process (i.e., integrating research evidence, clinical experience, client characteristics, and monitoring outcomes) knowledge. An examination of how these knowledge types are measured and relate to attitudes appears warranted. In our sample of 58 youth community therapists, both practice specific and EBP process knowledge accounted for EBP attitude scores, which varied by therapist demographic variables. Implications for measurement of therapist constructs and future research in identifying therapist predictors of EBP use and youth clinical improvement are discussed.

  9. Use of commercial off-the-shelf digital cameras for scientific data acquisition and scene-specific color calibration

    PubMed Central

    Akkaynak, Derya; Treibitz, Tali; Xiao, Bei; Gürkan, Umut A.; Allen, Justine J.; Demirci, Utkan; Hanlon, Roger T.

    2014-01-01

    Commercial off-the-shelf digital cameras are inexpensive and easy-to-use instruments that can be used for quantitative scientific data acquisition if images are captured in raw format and processed so that they maintain a linear relationship with scene radiance. Here we describe the image-processing steps required for consistent data acquisition with color cameras. In addition, we present a method for scene-specific color calibration that increases the accuracy of color capture when a scene contains colors that are not well represented in the gamut of a standard color-calibration target. We demonstrate applications of the proposed methodology in the fields of biomedical engineering, artwork photography, perception science, marine biology, and underwater imaging. PMID:24562030
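At its core, the scene-specific calibration described above amounts to fitting a linear transform between linearized camera RGB and reference RGB over a set of color patches. A minimal least-squares sketch on synthetic patch values (the 3×3-matrix formulation and all numbers are illustrative assumptions, not the paper's exact procedure):

```python
import numpy as np

def fit_color_matrix(cam_rgb, ref_rgb):
    """Least-squares fit of a 3x3 matrix M mapping linear camera RGB to
    reference RGB: ref ≈ cam @ M.  Each row is one color patch."""
    M, *_ = np.linalg.lstsq(cam_rgb, ref_rgb, rcond=None)
    return M

# Synthetic demo: recover a known channel-mixing matrix from noiseless patches.
rng = np.random.default_rng(0)
true_M = np.array([[1.20, 0.10, 0.00],
                   [0.05, 0.90, 0.05],
                   [0.00, 0.20, 1.10]])
cam = rng.uniform(0.0, 1.0, size=(24, 3))  # 24 patches, as on a classic chart
ref = cam @ true_M
M = fit_color_matrix(cam, ref)
```

In a scene-specific workflow the "patches" would instead be colors sampled from the scene itself, which is what extends accuracy beyond the gamut of a standard target.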

  10. Optimization of High-Throughput Sequencing Kinetics for determining enzymatic rate constants of thousands of RNA substrates

    PubMed Central

    Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.

    2016-01-01

Quantification of the specificity of RNA binding proteins and RNA processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high throughput sequencing and internal competition kinetics to simultaneously monitor the processing rate constants of thousands of substrates of RNA processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and these are the main sources of imprecision in the quantified results when otherwise optimized guidelines are followed. PMID:27296633
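Internal competition kinetics of the kind HTS-Kin relies on lets a relative rate constant be computed from sequencing read fractions alone. A simplified first-order sketch (the fractions below are invented for illustration, and real analyses fit many time points rather than one):

```python
import math

def relative_rate_constant(f0_sub, ft_sub, f0_ref, ft_ref):
    """k_sub / k_ref from the read fraction of remaining substrate before (f0)
    and after (ft) reaction, assuming first-order internal competition:
    ln(ft_sub / f0_sub) / ln(ft_ref / f0_ref) = k_sub / k_ref."""
    return math.log(ft_sub / f0_sub) / math.log(ft_ref / f0_ref)

# Substrate depleted from 10% to 2.5% of reads while a reference substrate
# in the same reaction goes from 10% to 5%: the substrate reacts twice as fast.
k_rel = relative_rate_constant(0.10, 0.025, 0.10, 0.05)
```

Because both substrates see the same enzyme in the same tube, run-to-run variation cancels in the ratio, which is the main appeal of the internal-competition design.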

  11. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images

    PubMed Central

    2017-01-01

Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between, and not within, pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency) and repeated pipeline runs (for algorithmic consistency). A main effect of pipeline step was detected and, interestingly, an interaction between pipeline step and ROI exists. No effect of either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing. PMID:29023597

  12. A marker-free system for the analysis of movement disabilities.

    PubMed

    Legrand, L; Marzani, F; Dusserre, L

    1998-01-01

A major step toward improving the treatment of disabled persons may be achieved by using motion analysis equipment. We are developing such a system. It allows the analysis of planar human motion (e.g. gait) without tracking markers. The system is composed of one fixed camera which acquires an image sequence of a human in motion. The processing is then divided into two steps: first, a large number of pixels belonging to the boundaries of the human body are extracted at each acquisition time. Second, a two-dimensional model of the human body, based on tapered superquadrics, is successively matched with the sets of pixels previously extracted; a specific fuzzy clustering process is used for this purpose. Moreover, an optical flow procedure predicts the model location at each acquisition time from its location at the previous time. Finally we present some results of this process applied to a leg in motion.
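The abstract does not detail its "specific fuzzy clustering process"; as a generic illustration of the idea, the membership-update step of standard fuzzy c-means assigns each boundary pixel a graded membership in every model segment rather than a hard label:

```python
import math

def fcm_memberships(points, centers, m=2.0):
    """Standard fuzzy c-means membership update:
    u_ik = 1 / sum_j (d_ik / d_jk)^(2 / (m - 1)),
    where d_ik is the distance from point k to center i.
    Returns one membership row per point; each row sums to 1."""
    U = []
    for p in points:
        d = [max(math.dist(p, c), 1e-12) for c in centers]  # avoid divide-by-zero
        row = [1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                         for j in range(len(centers)))
               for i in range(len(centers))]
        U.append(row)
    return U

# Toy boundary pixels against two hypothetical body-segment centers.
pts = [(0.0, 0.0), (1.0, 0.0), (10.0, 0.0)]
centers = [(0.0, 0.0), (10.0, 0.0)]
U = fcm_memberships(pts, centers)
```

Soft memberships are useful here because boundary pixels near a joint genuinely belong partly to two superquadric segments at once.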

  13. CIDOC-CRM extensions for conservation processes: A methodological approach

    NASA Astrophysics Data System (ADS)

    Vassilakaki, Evgenia; Zervos, Spiros; Giannakopoulos, Georgios

    2015-02-01

This paper aims to report the steps taken to create the CIDOC Conceptual Reference Model (CIDOC-CRM) extensions and the relationships established to accommodate the depiction of conservation processes. In particular, the specific steps undertaken to develop and apply the CIDOC-CRM extensions for defining the conservation interventions performed on the cultural artifacts of the National Archaeological Museum of Athens, Greece are presented in detail. A report on the preliminary design of the DOC-CULTURE project (Development of an integrated information environment for assessment and documentation of conservation interventions to cultural works/objects with nondestructive testing techniques [NDTs], www.ndt-lab.gr/docculture), co-financed by the European Union NSRF THALES program, can be found in Kyriaki-Manessi, Zervos & Giannakopoulos (1), whereas the use of the project's CIDOC-CRM extensions to standardize the documentation of NDT&E methods and their output data was further reported in Kouis et al. (2).

  14. Requirements for a documentation of the image manipulation processes within PACS

    NASA Astrophysics Data System (ADS)

    Retter, Klaus; Rienhoff, Otto; Karsten, Ch.; Prince, Hazel E.

    1990-08-01

This paper discusses to what extent manipulation functions which have been applied to images handled in PACS should be documented. After postulating an increasing amount of post-processing features on PACS consoles, legal, educational and medical reasons for documenting image manipulation processes are presented. Besides legal necessities, aspects of storage capacity, response time, and potential uses determine the extent of this documentation. Is there a specific kind of manipulation function which has to be documented in general? Should the physician decide which parts of the various pathways he tries are recorded by the system? To distinguish, for example, between reversible and irreversible functions, or between interactive and non-interactive functions, is one step towards a solution. Another step is to establish definitions for terms like "raw" and "final" image. The paper systematizes these questions and offers strategic help. The answers will have an important impact on PACS design and functionality.
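One way to make the reversible/irreversible and interactive/non-interactive distinctions concrete is a small audit-record schema. Everything here (names, fields, the example policy) is a hypothetical sketch for illustration, not a PACS standard or the paper's proposal:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ManipulationRecord:
    """One documented image-manipulation step (hypothetical schema)."""
    function: str
    reversible: bool   # e.g. window/level display setting vs. destructive filter
    interactive: bool  # applied by the physician vs. applied automatically
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def must_document(rec):
    """One possible policy built on the paper's distinctions: always record
    irreversible functions; record reversible ones only when interactive."""
    return (not rec.reversible) or rec.interactive

log = [
    ManipulationRecord("window-level", reversible=True, interactive=True),
    ManipulationRecord("auto-contrast", reversible=True, interactive=False),
    ManipulationRecord("gaussian-smooth", reversible=False, interactive=False),
]
kept = [r.function for r in log if must_document(r)]
```

A policy like this directly trades storage capacity against completeness, which is exactly the tension the paper identifies.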

  15. An improved method for purification of recombinant truncated heme oxygenase-1 by expanded bed adsorption and gel filtration.

    PubMed

    Hu, Hong-Bo; Wang, Wei; Han, Ling; Zhou, Wen-Pu; Zhang, Xue-Hong

    2007-03-01

Recombinant truncated human heme oxygenase-1 (hHO-1) expressed in Escherichia coli was efficiently separated and purified from feedstock by DEAE ion-exchange expanded bed adsorption. Protocol optimization of hHO-1 on the DEAE adsorbent resulted in adsorption in 0 M NaCl and elution in 150 mM NaCl at pH 8.5. The active enzyme fractions separated on the expanded bed column were further purified by a Superdex 75 gel filtration step. The specific hHO-1 activity increased from 0.82 +/- 0.05 to 24.8 +/- 1.8 U/mg over the whole purification process. The overall recovery and purification factor of truncated hHO-1 were 72.7 +/- 4.7% and 30.2 +/- 2.3, respectively. This purification process can reduce the demands of feedstock preparation and simplify the overall purification.
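The reported ~30-fold purification factor follows directly from the two specific activities quoted above:

```python
# Fold-purification implied by the specific activities in the abstract (U/mg):
initial_activity = 0.82  # crude feedstock
final_activity = 24.8    # after expanded bed adsorption + gel filtration
factor = final_activity / initial_activity  # ≈ 30.2-fold
```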

  16. Laboratory procedures to generate viral metagenomes.

    PubMed

    Thurber, Rebecca V; Haynes, Matthew; Breitbart, Mya; Wegley, Linda; Rohwer, Forest

    2009-01-01

This collection of laboratory protocols describes the steps to collect viruses from various samples with the specific aim of generating viral metagenome sequence libraries (viromes). Viral metagenomics, the study of uncultured viral nucleic acid sequences from different biomes, relies on several concentration, purification, extraction, sequencing and heuristic bioinformatic methods. No single technique can provide an all-inclusive approach, and therefore the protocols presented here will be discussed in terms of hypothetical projects. However, care must be taken to individualize each step depending on the source and type of viral particles. This protocol is a description of the processes we have successfully used to: (i) concentrate viral particles from various types of samples, (ii) eliminate contaminating cells and free nucleic acids and (iii) extract, amplify and purify viral nucleic acids. Overall, a sample can be processed to isolate viral nucleic acids suitable for high-throughput sequencing in approximately 1 week.

  17. Vectorized Rebinning Algorithm for Fast Data Down-Sampling

    NASA Technical Reports Server (NTRS)

    Dean, Bruce; Aronstein, David; Smith, Jeffrey

    2013-01-01

    A vectorized rebinning (down-sampling) algorithm, applicable to N-dimensional data sets, has been developed that offers a significant reduction in computer run time when compared to conventional rebinning algorithms. For clarity, a two-dimensional version of the algorithm is discussed to illustrate some specific details of the algorithm content, and using the language of image processing, 2D data will be referred to as "images," and each value in an image as a "pixel." The new approach is fully vectorized, i.e., the down-sampling procedure is done as a single step over all image rows, and then as a single step over all image columns. Data rebinning (or down-sampling) is a procedure that uses a discretely sampled N-dimensional data set to create a representation of the same data, but with fewer discrete samples. Such data down-sampling is fundamental to digital signal processing, e.g., for data compression applications.
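The rows-then-columns vectorization described above can be sketched in a few lines of NumPy; this is an illustrative reimplementation of the general technique (block-mean rebinning by an integer factor), not the authors' code:

```python
import numpy as np

def rebin(image, factor):
    """Down-sample a 2D array by an integer factor using block means,
    computed as a single vectorized step over all rows and then a single
    vectorized step over all columns (no per-pixel Python loops)."""
    h, w = image.shape
    assert h % factor == 0 and w % factor == 0, "dimensions must divide evenly"
    # Step 1: average groups of `factor` adjacent columns, all rows at once.
    rows = image.reshape(h, w // factor, factor).mean(axis=2)
    # Step 2: average groups of `factor` adjacent rows, all columns at once.
    return rows.reshape(h // factor, factor, w // factor).mean(axis=1)

img = np.arange(16, dtype=float).reshape(4, 4)
small = rebin(img, 2)  # each output pixel is the mean of a 2x2 block
```

The same reshape trick generalizes to N dimensions by adding one grouping axis per dimension, which is what makes the approach attractive for large data sets.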

  18. The Overgrid Interface for Computational Simulations on Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Kwak, Dochan (Technical Monitor)

    2002-01-01

Computational simulations using overset grids typically involve multiple steps and a variety of software modules. A graphical interface called OVERGRID has been specially designed for such purposes. Data required and created by the different steps include geometry, grids, domain connectivity information, and flow solver input parameters. The interface provides a unified environment for the visualization, processing, generation and diagnosis of such data. General modules are available for the manipulation of structured grids and unstructured surface triangulations. Modules more specific to the overset approach include surface curve generators, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, Cartesian box grid generators, and domain connectivity pre-processing tools. An interface provides automatic selection and viewing of flow solver boundary conditions, and various other flow solver inputs. For problems involving multiple components in relative motion, a module is available to build the component/grid relationships and to prescribe and animate the dynamics of the different components.

  19. A cultural accommodation model for cross-cultural psychotherapy: Illustrated with the case of Asian Americans.

    PubMed

    Leong, Frederick T; Lee, Szu-Hui

    2006-01-01

    As an extension of F. T. L. Leong's (1996) integrative model, this article presents the cultural accommodation model (CAM), an enhanced theoretical guide to effective cross-cultural clinical practice and research. Whereas F. T. L. Leong's model identifies the importance of integrating the universal, group, and individual dimensions, the CAM takes the next step by providing a theoretical guide to effective psychotherapy with culturally different clients by means of a cultural accommodation process. This model argues for the importance of selecting and applying culture-specific constructs when working with culturally diverse groups. The first step of the CAM is to identify cultural disparities that are often ignored and then accommodate them by using current culturally specific concepts. In this article, several different cultural "gaps" or culture-specific constructs of relevance to Asian Americans with strong scientific foundations are selected and discussed as they pertain to providing effective psychotherapy to this ethnic minority group. Finally, a case study is incorporated to illustrate application of the CAM. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  20. The Architecture of Chemical Alternatives Assessment.

    PubMed

    Geiser, Kenneth; Tickner, Joel; Edwards, Sally; Rossi, Mark

    2015-12-01

    Chemical alternatives assessment is a method rapidly developing for use by businesses, governments, and nongovernment organizations seeking to substitute chemicals of concern in production processes and products. Chemical alternatives assessment is defined as a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. The process is intended to provide guidance for assuring that chemicals of concern are replaced with safer alternatives that are not likely to be later regretted. Conceptually, the assessment methods are developed from a set of three foundational pillars and five common principles. Based on a number of emerging alternatives assessment initiatives, in this commentary, we outline a chemical alternatives assessment blueprint structured around three broad steps: Scope, Assessment, and Selection and Implementation. Specific tasks and tools are identified for each of these three steps. While it is recognized that on-going practice will further refine and develop the method and tools, it is important that the structure of the assessment process remain flexible, adaptive, and focused on the substitution of chemicals of concern with safer alternatives. © 2015 Society for Risk Analysis.

  1. Performance of a visuomotor walking task in an augmented reality training setting.

    PubMed

    Haarman, Juliet A M; Choi, Julia T; Buurke, Jaap H; Rietman, Johan S; Reenalda, Jasper

    2017-12-01

Visual cues can be used to train walking patterns. Here, we studied the performance and learning capacities of healthy subjects executing a high-precision visuomotor walking task in an augmented reality training set-up. A projector was used to display visual stepping targets on the walking surface of an instrumented treadmill. Two speeds were used to manipulate task difficulty. All participants (n = 20) had to change their step length to hit visual stepping targets with a specific part of their foot, while walking on a treadmill over seven consecutive training blocks, each block composed of 100 stepping targets. The distance between stepping targets was varied between short, medium and long steps. Training blocks were composed of either random stepping targets (no fixed sequence in the distances between stepping targets) or sequenced stepping targets (a repeating fixed sequence). Random training blocks were used to measure non-specific learning and sequenced training blocks were used to measure sequence-specific learning. Primary outcome measures were performance (% of correct hits) and learning effects (increase in performance over the training blocks, both sequence-specific and non-specific). Secondary outcome measures were performance and stepping error in relation to step length (distance between stepping targets). Subjects scored 76% and 54% on their first attempts at the lower speed (2.3 km/h) and the higher speed (3.3 km/h), respectively. Performance scores did not increase over the course of the trials, nor did subjects show the ability to learn a sequenced walking task. Subjects were better able to hit targets when lengthening their step than when shortening it. In conclusion, augmented reality training with the current set-up was intuitive for the user. Suboptimal feedback presentation might have limited the learning effects of the subjects. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. SU-E-T-420: Failure Effects Mode Analysis for Trigeminal Neuralgia Frameless Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howe, J

    2015-06-15

Purpose: Functional radiosurgery has been used successfully in the treatment of trigeminal neuralgia but presents significant challenges to ensuring that the high prescription dose is delivered accurately. A review of existing practice should help direct the focus of quality improvement for this treatment regime. Method: Failure modes and effects analysis was used to identify the processes in preparing radiosurgery treatment for TN. The process map was developed by a multidisciplinary team including a neurosurgeon, radiation oncologist, physicist and therapist. Potential failure modes were identified for each step in the process map, as well as potential causes and end effects. A risk priority number was assigned to each cause. Results: The process map identified 66 individual steps (see attached supporting document). Corrective actions were developed for areas with high risk priority numbers. Wrong-site treatment is at higher risk for trigeminal neuralgia treatment due to the lack of site-specific pathologic imaging on MR and CT; additional site-specific checks were implemented to minimize the risk of wrong-site treatment. Failed collision checks resulted from an insufficient collision model in the treatment planning system, and a plan template was developed to address this problem. Conclusion: Failure modes and effects analysis is an effective tool for developing quality improvement in high-risk radiotherapy procedures such as functional radiosurgery.
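In conventional FMEA, the risk priority number is the product of severity, occurrence, and detection scores, and corrective actions are targeted at the highest products. A sketch with hypothetical scores (the study's actual ratings are not given in the abstract):

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA risk priority number: RPN = S x O x D, with each
    score conventionally on a 1-10 scale (higher = worse)."""
    for score in (severity, occurrence, detection):
        assert 1 <= score <= 10, "scores are expected on a 1-10 scale"
    return severity * occurrence * detection

# Hypothetical scores for the two failure modes named in the abstract.
modes = {
    "wrong-site treatment": risk_priority_number(10, 2, 4),
    "failed collision check": risk_priority_number(6, 3, 3),
}
worst = max(modes, key=modes.get)  # the mode to prioritize for correction
```

Note how a rare but severe, hard-to-detect failure (wrong-site treatment) can outrank a more frequent, easily caught one, which matches the priorities the study reports.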

  3. Kinetics and Thermodynamics of DNA Processing by Wild Type DNA-Glycosylase Endo III and Its Catalytically Inactive Mutant Forms.

    PubMed

    Kladova, Olga A; Krasnoperov, Lev N; Kuznetsov, Nikita A; Fedorova, Olga S

    2018-03-30

    Endonuclease III (Endo III or Nth) is one of the key enzymes responsible for initiating the base excision repair of oxidized or reduced pyrimidine bases in DNA. In this study, a thermodynamic analysis of structural rearrangements of specific and nonspecific DNA duplexes during their interaction with Endo III is performed based on stopped-flow kinetic data. 1,3-Diaza-2-oxophenoxazine (tCO), a fluorescent analog of the natural nucleobase cytosine, is used to record multistep DNA binding and lesion recognition over a temperature range of 5-37 °C. Standard Gibbs energy, enthalpy, and entropy of the specific steps are derived from the kinetic data using van't Hoff plots. The data suggest that enthalpy-driven exothermic 5,6-dihydrouracil (DHU) recognition and desolvation-accompanied, entropy-driven adjustment of the enzyme-substrate complex into a catalytically active state play equally important parts in the overall process. The roles of the catalytically significant amino acids Lys120 and Asp138 in DNA lesion recognition and catalysis are identified. Lys120 participates not only in the catalytic steps but also in the processes of local duplex distortion, whereas the substitution Asp138Ala leads to a complete loss of the ability of Endo III to distort the DNA double helix during enzyme-DNA complex formation.
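
A van't Hoff analysis of the kind described rests on ln K being linear in 1/T: ln K = -ΔH°/(RT) + ΔS°/R, with ΔG° = ΔH° - TΔS°. A minimal sketch of recovering the thermodynamic parameters from equilibrium constants measured at several temperatures; the data below are synthetic, generated from assumed ΔH° and ΔS° values purely for illustration:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def vant_hoff_fit(temps_K, K_eq):
    """Fit ln K = -dH/(R*T) + dS/R by least squares on (1/T, ln K).
    Returns (dH [J/mol], dS [J/(mol*K)], dG at 298.15 K [J/mol])."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(K) for K in K_eq]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    dH = -slope * R           # enthalpy from the slope of the van't Hoff plot
    dS = intercept * R        # entropy from the intercept
    dG = dH - 298.15 * dS     # standard Gibbs energy at 25 C
    return dH, dS, dG

# Synthetic K values for assumed dH = -40 kJ/mol, dS = -100 J/(mol*K),
# over the 5-37 C range used in the study.
temps = [278.15, 288.15, 298.15, 310.15]
dH_true, dS_true = -40000.0, -100.0
K = [math.exp(-dH_true / (R * T) + dS_true / R) for T in temps]

dH, dS, dG = vant_hoff_fit(temps, K)
```

On real stopped-flow data the fit would be applied to the equilibrium (or rate) constants of each resolved binding step separately, yielding per-step thermodynamic signatures.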

  4. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended, but the analysis can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.

  5. Method for localizing and isolating an errant process step

    DOEpatents

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.

    2003-01-01

    A method for localizing and isolating an errant process step includes retrieving from a defect image database a selection of images, each having image content similar to that extracted from a query image depicting a defect and each having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. The process step that is the most probable source of the defect according to the derived conditional probability distribution is then identified. A method for process-step defect identification includes characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
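
The localization step can be sketched as follows. This is a minimal illustration, not the patented method itself: the process-step names are hypothetical, and the conditional distribution P(step | defect) is estimated here by simple relative frequency over the step labels attached to the retrieved similar defect images:

```python
from collections import Counter

def step_posterior(retrieved_step_labels):
    """Estimate P(step | defect) from the process-step labels in the
    characterization data of the retrieved similar defect images."""
    counts = Counter(retrieved_step_labels)
    total = sum(counts.values())
    return {step: n / total for step, n in counts.items()}

def most_probable_source(retrieved_step_labels):
    """Identify the process step with the highest conditional probability."""
    posterior = step_posterior(retrieved_step_labels)
    return max(posterior, key=posterior.get)

# Hypothetical step labels for the images most similar to the query image.
labels = ["etch", "etch", "lithography", "etch", "deposition"]
posterior = step_posterior(labels)   # e.g. {"etch": 0.6, ...}
source = most_probable_source(labels)
```

In the patented workflow, the image retrieval itself would be driven by content-based similarity to the query image; only the final probabilistic attribution is sketched here.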

  6. A UMLS-based spell checker for natural language processing in vaccine safety.

    PubMed

    Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C

    2007-02-12

    The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74-75), 100% (95% CI: 100-100), and 47% (95% CI: 46%-48%), respectively. We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. 
The tool's slow processing speed could be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their own areas of interest.
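
The four-step correction loop described above (error detection, word list generation, disambiguation, correction) can be sketched with a tiny edit-distance checker. The miniature lexicon and frequency table below are illustrative stand-ins for the UMLS Specialist Lexicon and a corpus-derived frequency model, not part of the original tool:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Illustrative stand-ins for a domain-specific lexicon and term frequencies.
LEXICON = {"fever", "rash", "injection", "site", "swelling"}
FREQ = {"fever": 120, "rash": 80, "injection": 60, "site": 50, "swelling": 40}

def correct(word: str) -> str:
    if word in LEXICON:                       # step 1: error detection
        return word
    candidates = [w for w in LEXICON          # step 2: word list generation
                  if edit_distance(word, w) <= 2]
    if not candidates:
        return word                           # leave unknown words untouched
    # steps 3-4: disambiguate by (distance, corpus frequency), then correct
    return min(candidates,
               key=lambda w: (edit_distance(word, w), -FREQ.get(w, 0)))
```

A real AEFI pipeline would first expand abbreviations and shortcuts, then apply this loop token by token before concept extraction.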

  7. A UMLS-based spell checker for natural language processing in vaccine safety

    PubMed Central

    Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C

    2007-01-01

    Background The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. Methods We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. Results We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74–75), 100% (95% CI: 100–100), and 47% (95% CI: 46%–48%), respectively. Conclusion We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. 
The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. The slow processing speed may be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their area of interest. PMID:17295907

  8. A methodological survey identified eight proposed frameworks for the adaptation of health related guidelines.

    PubMed

    Darzi, Andrea; Abou-Jaoude, Elias A; Agarwal, Arnav; Lakis, Chantal; Wiercioch, Wojtek; Santesso, Nancy; Brax, Hneine; El-Jardali, Fadi; Schünemann, Holger J; Akl, Elie A

    2017-06-01

    Our objective was to identify and describe published frameworks for the adaptation of clinical, public health, and health services guidelines. We included reports describing methods of guideline adaptation in sufficient detail to allow their reproducibility. We searched the Medline and EMBASE databases. We also searched personal files, as well as manuals and handbooks of organizations and professional societies that proposed methods of adaptation and adoption of guidelines. We followed standard systematic review methodology. Our search captured 12,021 citations, out of which we identified eight proposed methods of guideline adaptation: ADAPTE, Adapted ADAPTE, Alberta Ambassador Program adaptation phase, GRADE-ADOLOPMENT, MAGIC, RAPADAPTE, Royal College of Nursing (RCN), and Systematic Guideline Review (SGR). The ADAPTE framework consists of a 24-step process to adapt guidelines to a local context, taking into consideration the needs, priorities, legislation, policies, and resources. The Alexandria Center for Evidence-Based Clinical Practice Guidelines updated one of ADAPTE's tools, modified three tools, and added three new ones. In addition, they proposed optionally using three other tools. The Alberta Ambassador Program adaptation phase consists of 11 steps and focused on adapting good-quality guidelines for nonspecific low back pain to the local context. GRADE-ADOLOPMENT is an eight-step process based on the GRADE Working Group's Evidence to Decision frameworks, applied in 22 guidelines in the context of a national guideline development program. The MAGIC research program developed a five-step adaptation process, informed by ADAPTE and the GRADE approach, in the context of adapting thrombosis guidelines. The RAPADAPTE framework consists of 12 steps based on ADAPTE and using synthesized evidence databases, retrospectively derived from the experience of producing a high-quality guideline for the treatment of breast cancer with limited resources in Costa Rica.
The RCN outlines a five-step strategy for adapting guidelines to the local context. The SGR method consists of nine steps and takes into consideration both methodological gaps and context-specific normative issues in source guidelines. Through searching personal files, we also identified two abandoned methods. In total, we identified and described eight proposed frameworks for the adaptation of health-related guidelines. There is a need to evaluate these different frameworks to assess the rigor, efficiency, and transparency of their proposed processes. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. 25 CFR 15.11 - What are the basic steps of the probate process?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What are the basic steps of the probate process? 15.11 Section 15.11 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR PROBATE PROBATE OF INDIAN... are the basic steps of the probate process? The basic steps of the probate process are: (a) We learn...

  10. Relevance of a subjective quality of life questionnaire for long-term homeless persons with schizophrenia.

    PubMed

    Girard, V; Tinland, A; Bonin, J P; Olive, F; Poule, J; Lancon, C; Apostolidis, T; Rowe, M; Greacen, T; Simeoni, M C

    2017-02-17

    Increasing numbers of programs are addressing the specific needs of homeless people with schizophrenia in terms of access to housing, healthcare, basic human rights and other domains. Although quality of life scales are being used to evaluate such programs, few instruments have been validated for people with schizophrenia and none for people with schizophrenia who experience major social problems such as homelessness. The aim of the present study was to validate, for people with schizophrenia who are homeless, the French version of the S-QoL, a self-administered, subjective quality of life questionnaire specific to schizophrenia. In a two-step process, the S-QoL was administered to two independent convenience samples of long-term homeless people with schizophrenia in Marseille, France. The objective of the first step was to analyse the psychometric properties of the S-QoL. The objective of the second step was to examine, through qualitative interviews with members of the population in question, the relevance and acceptability of the principal quality of life indicators used in the S-QoL instrument. Although the psychometric characteristics of the S-QoL were found to be globally satisfactory, acceptability was poor from the point of view of the people being interviewed. Respondents frequently interrupted participation, complaining that questionnaire items did not take into account the specific context of life on the streets. Less intrusive questions, more readily understandable vocabulary and greater relevance to subjects' living conditions are needed to improve the S-QoL questionnaire for this population. A modular questionnaire with context-specific sections, or specific quality of life instruments for socially excluded populations, may well be the way forward.

  11. Site-Specific Pre-Swelling-Directed Morphing Structures of Patterned Hydrogels.

    PubMed

    Wang, Zhi Jian; Hong, Wei; Wu, Zi Liang; Zheng, Qiang

    2017-12-11

    Morphing materials have promising applications in various fields, yet how to program the self-shaping process for specific configurations remains a challenge. Herein we show a versatile approach to control the buckling of individual domains and thus the outcome configurations of planar-patterned hydrogels. By photolithography, high-swelling disc gels were positioned in a non-swelling gel sheet; the swelling mismatch resulted in out-of-plane buckling of the disc gels. To locally control the buckling direction, masks with holes were used to guide site-specific swelling of the high-swelling gel under the holes, which built a transient through-thickness gradient and thus directed the buckling during the subsequent unmasked swelling process. Therefore, various configurations of an identical patterned hydrogel can be programmed by the pre-swelling step with different masks to encode the buckling directions of separate domains. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Purification and substrate specificities of a fructanase from Kluyveromyces marxianus isolated from the fermentation process of Mezcal.

    PubMed

    Arrizon, Javier; Morel, Sandrine; Gschaedler, Anne; Monsan, Pierre

    2011-02-01

    A fructanase produced by a Kluyveromyces marxianus strain, isolated during the fermentation step of the elaboration process of "Mezcal de Guerrero", was purified and biochemically characterized. The active protein was a glycosylated dimer with a molecular weight of approximately 250 kDa. The specific enzymatic activity of the protein was determined for different substrates: sucrose, inulin, Agave tequilana fructan, levan and Actilight®, and compared with the activity of Fructozyme®. The hydrolysis profiles of the different substrates, analyzed by HPAEC-PAD, showed that the enzyme has different affinities for the substrates tested, with a sucrose/inulin enzymatic activity ratio (S/I) of 125. For the hydrolysis of Agave tequilana fructans, the enzyme also showed higher enzymatic activity and specificity than Fructozyme®, which is important for its potential application in the tequila industry. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. [Autoantibodies as biomarkers].

    PubMed

    Tron, François

    2014-01-01

    Activation and differentiation of autoreactive B-lymphocytes lead to the production of autoantibodies, which are thus the direct consequence of the autoimmune process. They often constitute biomarkers of autoimmune diseases and are measured by tests displaying varying diagnostic sensitivity and specificity. Autoantibody titers can be correlated with disease activity, and certain autoantibody populations are associated with particular clinical manifestations or tissue lesions. The demonstration that autoantibodies appear years before the onset of autoimmune diseases indicates that their presence in healthy individuals may be a predictive marker of the occurrence of disease. Certain autoantibodies could also be predictive markers of a therapeutic response to biologics, as well as of the occurrence of side effects. Thus, autoantibodies are useful tools in the diagnosis and management of patients with organ-specific or non-organ-specific autoimmune diseases at different steps of the autoimmune process. Copyright © 2013. Published by Elsevier Masson SAS.

  14. Guiding electrical current in nanotube circuits using structural defects: a step forward in nanoelectronics.

    PubMed

    Romo-Herrera, Jose M; Terrones, Mauricio; Terrones, Humberto; Meunier, Vincent

    2008-12-23

    Electrical current could be efficiently guided in 2D nanotube networks by introducing specific topological defects within the periodic framework. Using semiempirical transport calculations coupled with the Landauer-Büttiker formalism of quantum transport in multiterminal nanoscale systems, we provide a detailed analysis of the processes governing the atomic-scale design of nanotube circuits. We found that when defects are introduced as patches at specific sites, they act as bouncing centers that reinject electrons along specific paths via a wave reflection process. This type of defect can be incorporated while preserving the 3-fold connectivity of each carbon atom embedded within the graphitic lattice. Our findings open up a new way to explore bottom-up design, at the nanometer scale, of complex nanotube circuits, which could be extended to 3D nanosystems and applied in the fabrication of nanoelectronic devices.

  15. [The workplace injury trends in the petrochemical industry: from data analysis to risk management].

    PubMed

    Campo, Giuseppe; Martini, Benedetta

    2013-01-01

    The most recent INAIL data show that, in 2009-2011, the accident frequency rate and the severity rate of workplace injuries in the chemical industry were lower than for the total non-agricultural workforce. The chemical industry, primarily because of its complex and hazardous work processes, requires an appropriate system for assessing and monitoring specific risks. The implementation of Responsible Care, a risk management system specific to the chemical industry, in 1984 represented a historic step in the process of critical awareness of risk management by chemical companies. Responsible Care is a risk management system specifically designed around the risk profiles of this type of enterprise, integrating safety, health and environment. A risk management system suitable for the needs of a chemical company should extend its coverage, beyond the responsible management of products throughout the entire production cycle, to the issues of corporate responsibility.

  16. Preanalytics in lung cancer.

    PubMed

    Warth, Arne; Muley, Thomas; Meister, Michael; Weichert, Wilko

    2015-01-01

    Preanalytic sampling techniques and the preparation of tissue specimens strongly influence analytical results in lung tissue diagnostics, both on the morphological and on the molecular level. However, in contrast to analytics, where tremendous achievements in the last decade have led to a whole new portfolio of test methods, developments in preanalytics have been minimal. This is specifically unfortunate in lung cancer, where usually only small amounts of tissue are at hand and optimization of all processing steps is mandatory in order to increase the diagnostic yield. In the following, we provide a comprehensive overview of some aspects of preanalytics in lung cancer, from the method of sampling through tissue processing to its impact on analytical test results. We specifically discuss the role of preanalytics in novel technologies such as next-generation sequencing and in state-of-the-art cytology preparations. In addition, we point out specific problems in preanalytics which hamper further developments in the field of lung tissue diagnostics.

  17. Two steps forward, one step back? A commentary on the disease-specific core sets of the International Classification of Functioning, Disability and Health (ICF).

    PubMed

    McIntyre, Anne; Tempest, Stephanie

    2007-09-30

    The International Classification of Functioning, Disability and Health (ICF) has been received favourably by health care professionals, disability rights organizations and proponents of the social model of disability. The success of the ICF largely depends on its uptake in practice, yet it is considered unwieldy in its full format. To enhance the application of the ICF in practice, disease- and site-specific core sets have been developed. The objective of this paper is to stimulate thought and discussion about the place of the ICF core sets in rehabilitation practice. The authors' review of the literature uses the ICF core sets (especially for stroke) to debate whether the ICF is at risk of taking two steps forward, one step back in its holistic portrayal of health. ICF disease-specific core sets could be seen as taking two steps forward in enhancing the user-friendliness of the ICF and evidence-based practice in rehabilitation. However, there is a danger of taking one step back by reverting to a disease-specific classification. It is too early to judge the efficacy of the disease-specific core sets, but there is an opportunity to debate where the next steps may lead.

  18. Tolerance design of patient-specific range QA using the DMAIC framework in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Manger, Ryan P; Kim, Tae Hyun; Oh, Do Hoon; Kim, Dae Yong; Kim, Gwe-Ya

    2018-02-01

    To demonstrate that DMAIC (Define-Measure-Analyze-Improve-Control) can be used to customize patient-specific QA by designing site-specific range tolerances. The DMAIC framework tools (process flow diagram, cause and effect, Pareto chart, control chart, and capability analysis) were utilized to determine the steps that need focus for improving patient-specific QA. The patient-specific range QA plans were selected according to seven treatment site groups, a total of 1437 cases. The process capability index, Cpm, was used to guide the tolerance design of the patient site-specific range. For prostate fields, our results suggested that the patient range measurements were capable at the current tolerance level of ±1 mm in clinical proton plans. For the other site-specific ranges, our analysis showed that the tolerances tended to be overdesigned relative to the insufficient process capability calculated from the patient-specific QA data. Customized tolerances were calculated for these treatment sites. Control charts were constructed to simulate the patient QA time before and after the new tolerances were implemented. The total simulated QA time decreased by approximately 20% on average after establishing the new site-specific range tolerances. We also simulated the financial impact of this project: QA failure across the whole proton therapy process would lead to an increase of up to approximately 30% in total cost. The DMAIC framework can be used to provide effective QA by setting customized tolerances. When tolerance design is customized, quality is reasonably balanced with time and cost demands. © 2017 American Association of Physicists in Medicine.
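
The Taguchi capability index referenced above is conventionally Cpm = (USL - LSL) / (6·sqrt(s² + (mean - T)²)), penalizing both process spread and deviation of the mean from the target. A minimal sketch; the range-difference measurements below are hypothetical illustrations against the ±1 mm tolerance mentioned, not data from the study:

```python
import math

def cpm(measurements, lsl, usl, target):
    """Taguchi capability index:
    Cpm = (USL - LSL) / (6 * sqrt(s^2 + (mean - T)^2))."""
    n = len(measurements)
    mean = sum(measurements) / n
    var = sum((x - mean) ** 2 for x in measurements) / (n - 1)  # sample variance
    tau = math.sqrt(var + (mean - target) ** 2)
    return (usl - lsl) / (6 * tau)

# Hypothetical measured-minus-planned range differences (mm), evaluated
# against the +/-1 mm tolerance with a target deviation of 0 mm.
diffs = [0.1, -0.2, 0.05, 0.15, -0.1, 0.0, 0.2, -0.05]
capability = cpm(diffs, lsl=-1.0, usl=1.0, target=0.0)
```

A Cpm well above 1 indicates the tolerance band is wide relative to observed process variation (i.e. possibly overdesigned), which is the situation the study reports for several treatment sites.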

  19. The influence of patient portals on users' decision making is insufficiently investigated: A systematic methodological review.

    PubMed

    Fraccaro, Paolo; Vigo, Markel; Balatsoukas, Panagiotis; Buchan, Iain E; Peek, Niels; van der Veer, Sabine N

    2018-03-01

    Patient portals are considered valuable conduits for supporting patients' self-management. However, it is unknown why they often fail to impact on health care processes and outcomes. This may be due to a scarcity of robust studies focusing on the steps that are required to induce improvement: users need to effectively interact with the portal (step 1) in order to receive information (step 2), which might influence their decision-making (step 3). We aimed to explore this potential knowledge gap by investigating to what extent each step has been investigated for patient portals, and explore the methodological approaches used. We performed a systematic literature review using Coiera's information value chain as a guiding theoretical framework. We searched MEDLINE and Scopus by combining terms related to patient portals and evaluation methodologies. Two reviewers selected relevant papers through duplicate screening, and one extracted data from the included papers. We included 115 articles. The large majority (n = 104) evaluated aspects related to interaction with patient portals (step 1). Usage was most often assessed (n = 61), mainly by analysing system interaction data (n = 50), with most authors considering participants as active users if they logged in at least once. Overall usability (n = 57) was commonly assessed through non-validated questionnaires (n = 44). Step 2 (information received) was investigated in 58 studies, primarily by analysing interaction data to evaluate usage of specific system functionalities (n = 34). Eleven studies explicitly assessed the influence of patient portals on patients' and clinicians' decisions (step 3). Whereas interaction with patient portals has been extensively studied, their influence on users' decision-making remains under-investigated. Methodological approaches to evaluating usage and usability of portals showed room for improvement. 
To unlock the potential of patient portals, more (robust) research should focus on better understanding the complex process of how portals lead to improved health and care. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  20. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost driver in downstream processing, with high attrition costs especially when protein A resin is not used until the end of its lifetime. Here we describe a feasibility study transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the required process adaptations compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in product quality and step yield comparable to the batch process. Productivity for the protein A step could be increased by up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, the cost of goods for the protein A step was compared between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  2. A comparison of fluctuations of Campylobacter and Escherichia coli concentrations on broiler chicken carcasses during processing in two slaughterhouses.

    PubMed

    Pacholewicz, Ewa; Swart, Arno; Schipper, Maarten; Gortemaker, Betty G M; Wagenaar, Jaap A; Havelaar, Arie H; Lipman, Len J A

    2015-07-16

    The causes of differences between slaughterhouses in Campylobacter and Escherichia coli concentrations on broiler chicken carcasses after chilling are not fully identified. It is therefore a challenge for slaughterhouses to comply with the Process Hygiene Criteria for broiler meat. The aim of the study was to identify which processing steps contribute to increases or decreases in Campylobacter and E. coli concentrations within and between two slaughterhouses. Identifying the processing steps with variable performance could explain the differences in bacterial concentrations after chilling between slaughterhouses. Thermotolerant Campylobacter and E. coli concentrations on carcasses during broiler processing were measured during the summer period in 21 trials, after bleeding, scalding, defeathering, evisceration and chilling. In two slaughterhouses with comparable Campylobacter and E. coli concentrations in the incoming batches (after bleeding), the mean log10 concentrations were found to be significantly different after chilling. Campylobacter concentrations decreased by 1.40 log10 in Slaughterhouse 1 and by 1.86 log10 in Slaughterhouse 2, whereas E. coli decreased by 2.19 log10 in Slaughterhouse 1 and by 2.84 log10 in Slaughterhouse 2. Higher concentrations of Campylobacter and E. coli on carcasses after chilling were observed in Slaughterhouse 1, in which an increase in concentrations was observed after evisceration. The effect of processing on Campylobacter and E. coli concentrations in Slaughterhouse 1 did not differ between batches. In Slaughterhouse 2, the effect of processing on the concentrations of both bacteria varied over batches. Changes in E. coli concentrations during processing were similar to those for Campylobacter, except for defeathering: E. coli concentrations significantly decreased after defeathering in both slaughterhouses, whereas Campylobacter increased in Slaughterhouse 2, and in Slaughterhouse 1 no significant changes were observed.
The patterns of increases and decreases in bacterial concentrations during processing are specific to each slaughterhouse. These inhomogeneous patterns potentially explain the differences in concentrations after chilling between slaughterhouses. Critical processing steps should be validated in each slaughterhouse by longitudinal studies, potentially based on E. coli. E. coli has the potential to be used as an indicator of processing hygiene, because the impact of most of the studied processing steps was similar to that for Campylobacter. Copyright © 2015 Elsevier B.V. All rights reserved.
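
The reductions above are quoted as log10 differences, which translate into percentage reductions via 1 - 10^(-log10 reduction). A small helper sketch (the function names are ours, not from the study):

```python
import math

def log10_reduction(before_cfu, after_cfu):
    """Log10 reduction between concentrations (e.g. CFU/g) before and
    after a processing step."""
    return math.log10(before_cfu) - math.log10(after_cfu)

def percent_reduction(log10_red):
    """Convert a log10 reduction to the equivalent percentage reduction."""
    return (1.0 - 10.0 ** -log10_red) * 100.0

# The 1.40 and 2.84 log10 reductions reported above correspond to roughly
# 96% and 99.9% reductions in concentration, respectively.
campy_sh1 = percent_reduction(1.40)
ecoli_sh2 = percent_reduction(2.84)
```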

  3. Spermatogenic Cell-Specific Gene Mutation in Mice via CRISPR-Cas9.

    PubMed

    Bai, Meizhu; Liang, Dan; Wang, Yinghua; Li, Qing; Wu, Yuxuan; Li, Jinsong

    2016-05-20

    Tissue-specific knockout technology enables the analysis of gene function in specific tissues in adult mammals. However, the conventional strategy for producing tissue-specific knockout mice is a time- and labor-consuming process, restricting rapid study of gene function in vivo. The CRISPR-Cas9 system, derived from bacteria, is a simple and efficient gene-editing technique that has enabled rapid generation of gene knockout lines in mice by direct injection of CRISPR-Cas9 into zygotes. Here, we demonstrate CRISPR-Cas9-mediated spermatogenic cell-specific disruption of the Scp3 gene in testes in one step. We first generated transgenic mice by pronuclear injection of a plasmid containing the Hspa2 promoter driving Cas9 expression and showed specific Cas9 expression in spermatogenic cells. We then produced transgenic mice carrying Hspa2 promoter-driven Cas9 and a constitutively expressed sgRNA targeting the Scp3 gene. Male founders were infertile due to developmental arrest of spermatogenic cells, while female founders could produce progeny normally. Consistently, male progeny from female founders were infertile, and females could transmit the transgenes to the next generation. Our study establishes a CRISPR-Cas9-based one-step strategy to analyze gene function in adult tissues in a temporal-spatial pattern. Copyright © 2016 Institute of Genetics and Developmental Biology, Chinese Academy of Sciences, and Genetics Society of China. Published by Elsevier Ltd. All rights reserved.

  4. Efficient differentiation of mouse embryonic stem cells into motor neurons.

    PubMed

    Wu, Chia-Yen; Whye, Dosh; Mason, Robert W; Wang, Wenlan

    2012-06-09

    Direct differentiation of embryonic stem (ES) cells into functional motor neurons represents a promising resource to study disease mechanisms, to screen new drug compounds, and to develop new therapies for motor neuron diseases such as spinal muscular atrophy (SMA) and amyotrophic lateral sclerosis (ALS). Many current protocols use a combination of retinoic acid (RA) and sonic hedgehog (Shh) to differentiate mouse embryonic stem (mES) cells into motor neurons. However, differentiation of mES cells into motor neurons has only met with moderate success. We have developed a two-step differentiation protocol that significantly improves the differentiation efficiency compared with currently established protocols. The first step is to enhance the neuralization process by adding Noggin and fibroblast growth factors (FGFs). Noggin, a bone morphogenetic protein (BMP) antagonist implicated in neural induction according to the default model of neurogenesis, promotes the formation of anterior neural patterning. FGF signaling acts synergistically with Noggin in inducing neural tissue formation by promoting a posterior neural identity. In this step, mES cells were primed with Noggin, bFGF, and FGF-8 for two days to promote differentiation towards neural lineages. The second step is to induce motor neuron specification. Noggin/FGF-exposed mES cells were incubated with RA and a Shh agonist, Smoothened agonist (SAG), for another 5 days to facilitate motor neuron generation. To monitor the differentiation of mES cells into motor neurons, we used an ES cell line derived from a transgenic mouse expressing eGFP under the control of the motor neuron-specific promoter Hb9. Using this robust protocol, we achieved a differentiation efficiency of 51 ± 0.8% (n = 3; p < 0.01, Student's t-test). Results from immunofluorescent staining showed that GFP+ cells express the motor neuron-specific markers Islet-1 and choline acetyltransferase (ChAT). 
Our two-step differentiation protocol provides an efficient way to differentiate mES cells into spinal motor neurons.

  5. Comparison of machinability of manganese alloyed austempered ductile iron produced using conventional and two step austempering processes

    NASA Astrophysics Data System (ADS)

    Hegde, Ananda; Sharma, Sathyashankara

    2018-05-01

    Austempered Ductile Iron (ADI) is a revolutionary material with high strength and hardness combined with optimum ductility and toughness. The discovery of the two-step austempering process has led to a superior combination of all the mechanical properties. However, because of the high strength and hardness of ADI, there is a concern regarding its machinability. In the present study, the machinability of ADI produced using conventional and two-step heat treatment processes is assessed using tool life and surface roughness. Speed, feed and depth of cut are considered as the machining parameters in the dry turning operation. The machinability results, along with the mechanical properties, are compared for ADI produced using both conventional and two-step austempering processes. The results show that the two-step austempering process produced better toughness with good hardness and strength without sacrificing ductility. Addition of 0.64 wt% manganese did not cause any detrimental effect on the machinability of ADI in either the conventional or the two-step process. Marginal improvements in tool life and surface roughness were observed in the two-step process compared with the conventional process.

  6. The MLE Teacher: An Agent of Change or a Cog in the Wheel?

    ERIC Educational Resources Information Center

    Bedamatta, Urmishree

    2014-01-01

    This article examines the role of the multilingual education (MLE) teacher in the mother tongue-based MLE program for the Juangas, a tribe in Odisha, an eastern state of India, and is part of a broader study of the MLE program in the state. For the specific purpose of this article, I have adopted Welmond's (2002) three-step process: identifying…

  7. Crew Skills and Training

    NASA Technical Reports Server (NTRS)

    Jones, Thomas; Burbank, Daniel C.; Eppler, Dean; Garrison, Robert; Harvey, Ralph; Hoffman, Paul; Schmitt, Harrison

    1998-01-01

    One of the major focus points for the workshop was the topic of crew skills and training necessary for the Mars surface mission. Discussions centered on the mix of scientific skills necessary to accomplish the proposed scientific goals, and the training environment that can bring the ground and flight teams to readiness. Subsequent discussion resulted in recommendations for specific steps to begin the process of training an experienced Mars exploration team.

  8. Results of the Software Process Improvement Efforts of the Early Adopters in NAVAIR 4.0

    DTIC Science & Technology

    2007-12-01

    AIRSpeed utilizes a structured problem-solving methodology called DMAIC (Define, Measure, Analyze, Improve, Control), widely used in business, aimed at reducing costs and improving productivity and customer satisfaction. DMAIC leads project teams through the logical steps from problem definition to problem resolution. Each phase has a specific set…

  9. New numerical approach for the modelling of machining applied to aeronautical structural parts

    NASA Astrophysics Data System (ADS)

    Rambaud, Pierrick; Mocellin, Katia

    2018-05-01

    The manufacturing of aluminium alloy structural aerospace parts involves several steps: forming (rolling, forging, etc.), heat treatments and machining. Before machining, the manufacturing processes have embedded residual stresses into the workpiece. The final geometry is obtained during this last step, when up to 90% of the raw material volume is removed by machining. During this operation, the mechanical equilibrium of the part is in constant evolution due to the redistribution of the initial stresses. This redistribution is the main cause of workpiece deflections during machining and of distortions after unclamping. Both may lead to non-conformity of the part with the geometrical and dimensional specifications and therefore to rejection of the part or to additional conforming steps. In order to improve the machining accuracy and the robustness of the process, the effect of the residual stresses has to be considered in the definition of the machining process plan and even in the geometrical definition of the part. In this paper, the authors present two new numerical approaches to the modelling of machining of aeronautical structural parts. The first deals with the use of an immersed volume framework to model the cutting step, improving the robustness and the quality of the resulting mesh compared with the previous version. The second concerns the mechanical modelling of the machining problem. The authors thus show that, in the framework of rolled aluminium parts, the use of a linear elasticity model is functional in the finite element formulation and promising with regard to the reduction of computation times.

  10. Emotional Speech Perception Unfolding in Time: The Role of the Basal Ganglia

    PubMed Central

    Paulmann, Silke; Ott, Derek V. M.; Kotz, Sonja A.

    2011-01-01

    The basal ganglia (BG) have repeatedly been linked to emotional speech processing in studies involving patients with neurodegenerative and structural changes of the BG. However, the majority of previous studies did not consider that (i) emotional speech processing entails multiple processing steps, and the possibility that (ii) the BG may engage in one rather than another of these processing steps. In the present study we investigated three different stages of emotional speech processing (emotional salience detection, meaning-related processing, and identification) in the same patient group to verify whether lesions to the BG affect these stages in a qualitatively different manner. Specifically, we explored early implicit emotional speech processing (probe verification) in an ERP experiment followed by an explicit behavioral emotional recognition task. In both experiments, participants listened to emotional sentences expressing one of four emotions (anger, fear, disgust, happiness) or neutral sentences. In line with previous evidence, patients and healthy controls showed differentiation of emotional and neutral sentences in the P200 component (emotional salience detection) and in a following negative-going brain wave (meaning-related processing). However, behavioral recognition (identification stage) of emotional sentences was impaired in BG patients, but not in healthy controls. The current data provide further support that the BG are involved in late, explicit emotional speech processing stages rather than in early stages. PMID:21437277

  11. Automating the evaluation of flood damages: methodology and potential gains

    NASA Astrophysics Data System (ADS)

    Eleutério, Julian; Martinez, Edgar Daniel

    2010-05-01

    The evaluation of flood damage potential consists of three main steps: assessing and processing data, combining the data, and calculating potential damages. The first step consists of modelling hazard and assessing vulnerability. In general, this step of the evaluation demands more time and investment than the others. The second step of the evaluation consists of combining spatial data on hazard with spatial data on vulnerability. A Geographic Information System (GIS) is a fundamental tool in the realisation of this step, as GIS software allows the simultaneous analysis of spatial and matrix data. The third step of the evaluation consists of calculating potential damages by means of damage functions or contingent analysis. All steps demand time and expertise. However, the last two steps must be realised several times when comparing different management scenarios. In addition, uncertainty analyses and sensitivity tests are performed during the second and third steps of the evaluation. The feasibility of these steps could be relevant in the choice of the extent of the evaluation: low feasibility could lead to choosing not to evaluate uncertainty, or to limiting the number of scenario comparisons. Several computer models have been developed over time in order to evaluate flood risk. GIS software is widely used to realise flood risk analysis. The software is used to combine and process different types of data, and to visualise the risk and the evaluation results. The main advantages of using a GIS in these analyses are: the possibility of "easily" realising the analyses several times, in order to compare different scenarios and study uncertainty; the generation of datasets which could be used at any time in the future to support territorial decision making; and the possibility of adding information over time to update the dataset and make other analyses. However, these analyses require personnel specialisation and time. 
The use of GIS software to evaluate flood risk requires personnel with a double professional specialisation: the analyst should be proficient both in GIS software and in flood damage analysis (which is already a multidisciplinary field). Great effort is necessary in order to correctly evaluate flood damages, and updating and improving the evaluation over time becomes a difficult task. The automation of this process should bring great advances in flood management studies over time, especially for public utilities. This study has two specific objectives: (1) show the entire process of automation of the second and third steps of flood damage evaluations; and (2) analyse the potential gains in terms of the time and expertise needed for the analysis. A programming language is used within GIS software in order to automate the combination of hazard and vulnerability data and the calculation of potential damages. We discuss the overall process of flood damage evaluation. The main result of this study is a computational tool which allows significant operational gains in flood loss analyses. We quantify these gains by means of a hypothetical example. The tool significantly reduces the time of analysis and the need for expertise. An indirect gain is that sensitivity and cost-benefit analyses can be more easily realised.
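    The combination and calculation steps that the tool automates can be sketched as a raster overlay plus a depth-damage lookup; the depth-damage curve and asset values below are hypothetical, not taken from the study:

```python
import numpy as np

# Step 2 (sketch): co-registered rasters of water depth (m) and asset value (EUR)
depth = np.array([[0.0, 0.5], [1.2, 2.5]])
value = np.array([[0.0, 1.0e5], [2.0e5, 1.0e5]])

# Step 3 (sketch): a hypothetical piecewise-linear depth-damage function
# mapping water depth to the damaged fraction of asset value
curve_depth = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
curve_fraction = np.array([0.0, 0.2, 0.4, 0.7, 0.9])

fraction = np.interp(depth, curve_depth, curve_fraction)
damage = fraction * value
print(damage.sum())   # total potential damage over the raster, in EUR
```

Rerunning the overlay with a different depth raster is then a one-line change, which is the operational gain the automation targets.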

  12. A cascade reaction network mimicking the basic functional steps of adaptive immune response

    NASA Astrophysics Data System (ADS)

    Han, Da; Wu, Cuichen; You, Mingxu; Zhang, Tao; Wan, Shuo; Chen, Tao; Qiu, Liping; Zheng, Zheng; Liang, Hao; Tan, Weihong

    2015-10-01

    Biological systems use complex ‘information-processing cores’ composed of molecular networks to coordinate their external environment and internal states. An example of this is the acquired, or adaptive, immune system (AIS), which is composed of both humoral and cell-mediated components. Here we report the step-by-step construction of a prototype mimic of the AIS that we call an adaptive immune response simulator (AIRS). DNA and enzymes are used as simple artificial analogues of the components of the AIS to create a system that responds to specific molecular stimuli in vitro. We show that this network of reactions can function in a manner that is superficially similar to the most basic responses of the vertebrate AIS, including reaction sequences that mimic both humoral and cellular responses. As such, AIRS provides guidelines for the design and engineering of artificial reaction networks and molecular devices.

  13. Automatic alignment method for calibration of hydrometers

    NASA Astrophysics Data System (ADS)

    Lee, Y. J.; Chang, K. H.; Chon, J. C.; Oh, C. Y.

    2004-04-01

    This paper presents a new method to automatically align specific scale-marks for the calibration of hydrometers. A hydrometer calibration system adopting the new method consists of a vision system, a stepping motor, and software to control the system. The vision system is composed of a CCD camera and a frame grabber, and is used to acquire images. The stepping motor moves the camera, which is attached to the vessel containing a reference liquid, along the hydrometer. The operating program has two main functions: to process images from the camera to find the position of the horizontal plane, and to control the stepping motor for the alignment of the horizontal plane with a particular scale-mark. A system adopting this automatic alignment method provides a convenient and precise means of calibrating hydrometers. The performance of the proposed method is illustrated by comparing the calibration results using the automatic alignment method with those obtained using the manual method.
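    The two software functions described (locating the horizontal plane in the image and commanding the stepping motor) can be sketched as follows; the gradient-based detection and the rows-per-step conversion are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def meniscus_row(image):
    """Locate the horizontal liquid plane as the row where the column-averaged
    intensity changes most sharply (a hypothetical detection scheme)."""
    profile = image.mean(axis=1)
    return int(np.argmax(np.abs(np.diff(profile))))

def steps_to_align(current_row, target_row, rows_per_step=2.0):
    """Signed number of motor steps to bring a scale-mark onto the plane,
    assuming a hypothetical calibration of rows_per_step image rows per step."""
    return round((target_row - current_row) / rows_per_step)

# Synthetic image: bright air above row 30, darker liquid below
img = np.ones((60, 40))
img[30:, :] = 0.2
print(meniscus_row(img))   # → 29 (the last bright row before the surface)
```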

  14. Conducting survey research at nursing conferences.

    PubMed

    Sleutel, M R

    2001-01-01

    Conferences can provide large numbers of potential subjects in one location, yet there is little published guidance on how to collect data at a conference site; a computer search revealed no citations on this topic. This article outlines a systematic, step-by-step strategy to guide researchers in planning and conducting survey research at conferences. Initial components in planning data collection at a conference include making a timeline and getting permission. Detailed advance planning involves specific strategies for attracting participants and for distributing and collecting the questionnaires. Travel provisions and on-site logistical approaches are explained, followed by suggestions for post-conference activities.

  15. A multi-step system for screening and localization of hard exudates in retinal images

    NASA Astrophysics Data System (ADS)

    Bopardikar, Ajit S.; Bhola, Vishal; Raghavendra, B. S.; Narayanan, Rangavittal

    2012-03-01

    The number of people affected by Diabetes mellitus worldwide is increasing at an alarming rate. Monitoring of the diabetic condition and its effects on the human body is therefore of great importance. Of particular interest is diabetic retinopathy (DR), which is a result of prolonged, unchecked diabetes and affects the visual system. DR is a leading cause of blindness throughout the world. At any point in time, 25-44% of people with diabetes are afflicted by DR. Automation of the screening and monitoring process for DR is therefore essential for efficient utilization of healthcare resources and optimal treatment of the affected individuals. Such automation would use retinal images and detect the presence of specific artifacts such as hard exudates, hemorrhages and soft exudates (that may appear in the image) to gauge the severity of DR. In this paper, we focus on the detection of hard exudates. We propose a two-step system that consists of a screening step, which classifies retinal images as normal or abnormal based on the presence of hard exudates, and a detection step, which localizes these artifacts in an abnormal retinal image. The proposed screening step automatically detects the presence of hard exudates with a high sensitivity and positive predictive value (PPV). The detection/localization step uses a k-means-based clustering approach to localize hard exudates in the retinal image. Suitable feature vectors are chosen based on their ability to isolate hard exudates while minimizing false detections. The algorithm was tested on a benchmark dataset (DIARETDB1) and was seen to provide superior performance compared with existing methods. The two-step process described in this paper can be embedded in a tele-ophthalmology system to aid speedy detection and diagnosis of the severity of DR.
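    The k-means-based localization step can be sketched on a single intensity feature; the feature choice, synthetic pixel values, and two-cluster setup below are illustrative, not the paper's feature vectors:

```python
import numpy as np

def kmeans_1d(x, k=2, iters=20, seed=0):
    """Tiny k-means on a 1-D feature vector: assign points to the nearest
    centroid, then move each centroid to the mean of its points."""
    rng = np.random.default_rng(seed)
    centroids = rng.choice(x, size=k, replace=False)
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = x[labels == j].mean()
    return centroids, labels

# Synthetic feature: bright hard-exudate-like pixels (~0.9) vs background (~0.2)
pixels = np.array([0.18, 0.22, 0.20, 0.88, 0.92, 0.90, 0.21, 0.89])
centroids, labels = kmeans_1d(pixels)
print(np.sort(centroids))   # one centroid near 0.2, one near 0.9
```

In the paper's setting the feature vectors are multi-dimensional, but the assign-then-update loop is the same.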

  16. Selective removal of cesium by ammonium molybdophosphate - polyacrylonitrile bead and membrane.

    PubMed

    Ding, Dahu; Zhang, Zhenya; Chen, Rongzhi; Cai, Tianming

    2017-02-15

    The selective removal of radionuclides with extremely low concentrations from environmental media remains a big challenge. Ammonium molybdophosphate possesses considerable selectivity towards the cesium ion (Cs+) due to the specific ion exchange between Cs+ and NH4+. An ammonium molybdophosphate-polyacrylonitrile (AMP-PAN) membrane was successfully prepared for the first time in this study. Efficient removal of Cs+ (95.7%, 94.1% and 91.3% of 1 mg/L) from solutions with high ionic strength (400 mg/L of Na+, Ca2+ or K+) was achieved by the AMP-PAN composite. A multilayer chemical adsorption process was verified through kinetic and isotherm studies. The estimated maximum adsorption capacities reached 138.9 ± 21.3 mg/g. Specifically, liquid film diffusion was identified as the rate-limiting step throughout the removal process. Finally, the AMP-PAN membrane could eliminate Cs+ from water effectively through a filtration-adsorption process. Copyright © 2016 Elsevier B.V. All rights reserved.
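    For illustration, a maximum adsorption capacity of the kind reported can be estimated by fitting an isotherm to equilibrium data. The sketch below assumes a Langmuir form and synthetic, noise-free data purely to show the fitting mechanics; the study itself reports multilayer adsorption, so its isotherm model may differ:

```python
import numpy as np

# Hypothetical equilibrium data: Ce (mg/L) and qe (mg/g), generated from a
# Langmuir model with q_max = 140 mg/g and K_L = 0.15 L/mg
Ce = np.array([1.0, 5.0, 10.0, 25.0, 50.0])
qe = 140.0 * 0.15 * Ce / (1.0 + 0.15 * Ce)

# Linearized Langmuir: Ce/qe = Ce/q_max + 1/(K_L * q_max)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
q_max_est = 1.0 / slope
K_L_est = slope / intercept
print(round(q_max_est, 1), round(K_L_est, 3))   # → 140.0 0.15
```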

  17. Migration of antioxidants from polylactic acid films, a parameter estimation approach: Part I - A model including convective mass transfer coefficient.

    PubMed

    Samsudin, Hayati; Auras, Rafael; Burgess, Gary; Dolan, Kirk; Soto-Valdez, Herlinda

    2018-03-01

    A two-step solution based on the boundary conditions of Crank's equations for mass transfer in a film was developed. Three driving factors - the diffusion coefficient (D), the partition coefficient (Kp,f) and the convective mass transfer coefficient (h) - govern the sorption and/or desorption kinetics of migrants from polymer films. These three parameters were estimated simultaneously, and they provide in-depth insight into the physics of a migration process. The first step was used to find the combination of D, Kp,f and h that minimized the sum of squared errors (SSE) between the predicted and actual results. In step 2, an ordinary least squares (OLS) estimation was performed using the proposed analytical solution containing D, Kp,f and h. Three selected migration studies of PLA/antioxidant-based films were used to demonstrate the use of this two-step solution. Additional parameter estimation approaches, such as sequential and bootstrap, were also performed to acquire better knowledge about the kinetics of migration. The proposed model successfully provided the initial guesses for D, Kp,f and h. The h value was determined without performing a specific experiment for it. By determining h together with D, under- or overestimation issues pertaining to a migration process can be avoided, since these two parameters are correlated. Copyright © 2017 Elsevier Ltd. All rights reserved.
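    The two-step idea (a coarse search for the parameter combination minimizing SSE, followed by refinement) can be sketched with a deliberately simplified two-parameter, first-order migration model standing in for the Crank-based analytical solution in D, Kp,f and h; all values below are synthetic:

```python
import numpy as np

# Hypothetical migration data: time (days) and migrated fraction, generated
# from a simplified first-order model with M_inf = 0.8 and k = 0.3
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
obs = 0.8 * (1.0 - np.exp(-0.3 * t))

def sse(m_inf, k):
    """Sum of squared errors between model prediction and observations."""
    pred = m_inf * (1.0 - np.exp(-k * t))
    return float(np.sum((obs - pred) ** 2))

# Step 1: coarse grid search for the parameter combination minimizing SSE
grid_m = np.linspace(0.1, 1.0, 10)
grid_k = np.linspace(0.05, 1.0, 20)
m0, k0 = min(((m, k) for m in grid_m for k in grid_k), key=lambda p: sse(*p))

# Step 2: refine the estimate on a finer grid around the coarse optimum
fine_m = np.linspace(m0 - 0.05, m0 + 0.05, 51)
fine_k = np.linspace(k0 - 0.05, k0 + 0.05, 51)
m1, k1 = min(((m, k) for m in fine_m for k in fine_k), key=lambda p: sse(*p))
print(round(m1, 3), round(k1, 3))   # → 0.8 0.3
```

In the paper, step 2 is an OLS estimation against the full analytical solution rather than a finer grid, but the role of step 1 (supplying good initial guesses) is the same.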

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tumuluru, Jaya Shankar; McCulloch, Richard Chet James

    In this work a new hybrid genetic algorithm was developed which combines a rudimentary adaptive steepest-ascent hill climbing algorithm with a sophisticated evolutionary algorithm in order to optimize complex multivariate design problems. By combining a highly stochastic algorithm (evolutionary) with a simple deterministic optimization algorithm (adaptive steepest ascent), computational resources are conserved and the solution converges rapidly compared to either algorithm alone. In genetic algorithms, natural selection is mimicked by random events such as breeding and mutation. In the adaptive steepest ascent algorithm, each variable is perturbed by a small amount and the variable that caused the most improvement is incremented by a small step. If the direction of most benefit is exactly opposite the previous direction of most benefit, then the step size is reduced by a factor of 2; thus the step size adapts to the terrain. A graphical user interface was created in MATLAB to provide an interface between the hybrid genetic algorithm and the user. Additional features, such as bounding the solution space and weighting the objective functions individually, are also built into the interface. The algorithm developed was tested to optimize the functions developed for a wood pelleting process. Using process variables (such as feedstock moisture content, die speed, and preheating temperature), pellet properties were appropriately optimized. Specifically, variables were found which maximized unit density, bulk density, tapped density, and durability while minimizing pellet moisture content and specific energy consumption. The time and computational resources required for the optimization were dramatically decreased using the hybrid genetic algorithm compared to MATLAB's native evolutionary optimization tool.
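    The hybrid described above can be sketched as an evolutionary phase feeding an adaptive steepest-ascent climb in which the step size halves when the best direction reverses; this is an illustrative reconstruction in Python, not the authors' MATLAB implementation (their mutation scheme, population handling, and stopping rules are not specified here):

```python
import random

def hybrid_optimize(f, x0, step=0.5, gens=40, pop=20, hill_iters=400, seed=1):
    """Maximize f: a crude evolutionary phase (mutation + selection) seeds an
    adaptive steepest-ascent climb that takes the single best variable
    perturbation and halves the step when the best direction reverses."""
    rng = random.Random(seed)
    n = len(x0)

    # Evolutionary phase: Gaussian mutation around the incumbent, keep the best.
    best = list(x0)
    for _ in range(gens):
        candidates = [[xi + rng.gauss(0.0, 0.5) for xi in best]
                      for _ in range(pop)] + [best]
        best = max(candidates, key=f)

    # Adaptive steepest-ascent phase.
    prev = None
    for _ in range(hill_iters):
        def trial(move):
            i, d = move
            x = list(best)
            x[i] += d * step
            return f(x)
        i, d = max(((i, d) for i in range(n) for d in (1.0, -1.0)), key=trial)
        if trial((i, d)) <= f(best):   # no single-variable move improves f
            step /= 2.0                # shrink the step and retry
            if step < 1e-7:
                break
            continue
        if prev == (i, -d):            # best direction exactly reversed
            step /= 2.0                # adapt the step to the terrain
        best[i] += d * step
        prev = (i, d)
    return best
```

A usage example: maximizing f(x) = -(x0 - 3)^2 - (x1 + 1)^2 from [0, 0] converges to approximately [3, -1].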

  19. Loss of laminin alpha 1 results in multiple structural defects and divergent effects on adhesion during vertebrate optic cup morphogenesis

    PubMed Central

    Bryan, Chase D.; Chien, Chi-Bin; Kwan, Kristen M.

    2016-01-01

    The vertebrate eye forms via a complex set of morphogenetic events. The optic vesicle evaginates and undergoes transformative shape changes to form the optic cup, in which neural retina and retinal pigmented epithelium enwrap the lens. It has long been known that a complex, glycoprotein-rich extracellular matrix layer surrounds the developing optic cup throughout the process, yet the functions of the matrix and its specific molecular components have remained unclear. Previous work established a role for laminin extracellular matrix in particular steps of eye development, including optic vesicle evagination, lens differentiation, and retinal ganglion cell polarization, yet it is unknown what role laminin might play in the early process of optic cup formation subsequent to the initial step of optic vesicle evagination. Here, we use the zebrafish lama1 mutant (lama1UW1) to determine the function of laminin during optic cup morphogenesis. Using live imaging, we find, surprisingly, that loss of laminin leads to divergent effects on focal adhesion assembly in a spatiotemporally-specific manner, and that laminin is required for multiple steps of optic cup morphogenesis, including optic stalk constriction, invagination, and formation of a spherical lens. Laminin is not required for single cell behaviors and changes in cell shape. Rather, in lama1UW1 mutants, loss of epithelial polarity and altered adhesion lead to defective tissue architecture and formation of a disorganized retina. These results demonstrate that the laminin extracellular matrix plays multiple critical roles regulating adhesion and polarity to establish and maintain tissue structure during optic cup morphogenesis. PMID:27339294

  20. Planning for the next influenza pandemic: using the science and art of logistics.

    PubMed

    Cupp, O Shawn; Predmore, Brad G

    2011-01-01

    The complexities and challenges for healthcare providers and their efforts to provide fundamental basic items to meet the logistical demands of an influenza pandemic are discussed in this article. The supply chain, planning, and alternatives for inevitable shortages are some of the considerations associated with this emergency mass critical care situation. The planning process and support for such events are discussed in detail with several recommendations obtained from the literature and the experience from recent mass casualty incidents (MCIs). The first step in this planning process is the development of specific triage requirements during an influenza pandemic. The second step is identification of logistical resources required during such a pandemic, which are then analyzed within the proposed logistics science and art model for planning purposes. Resources highlighted within the model include allocation and use of work force, bed space, intensive care unit assets, ventilators, personal protective equipment, and oxygen. The third step is using the model to discuss in detail possible workarounds, suitable substitutes, and resource allocation. An examination is also made of the ethics surrounding palliative care within the construction of an MCI and the factors that will inevitably determine rationing and prioritizing of these critical assets to palliative care patients.

  1. A Geometry Based Infra-structure for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1997-01-01

    The computational steps traditionally taken for most engineering analysis (CFD, structural analysis, etc.) are: Surface Generation - usually by employing a CAD system; Grid Generation - preparing the volume for the simulation; Flow Solver - producing the results at the specified operational point; and Post-processing Visualization - interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. Specifically, the problems with this procedure are: (1) File based. Information flows from one step to the next via data files with formats specified for that procedure. (2) 'Good' geometry. A bottleneck in getting results from a solver is the construction of proper geometry to be fed to the grid generator. With 'good' geometry a grid can be constructed in tens of minutes (even with a complex configuration) using unstructured techniques. (3) One-way communication. All information travels forward from one phase to the next. Until this process can be automated, more complex problems, such as multi-disciplinary analysis or using the above procedure for design, become prohibitive.

  2. Image preprocessing for improving computational efficiency in implementation of restoration and superresolution algorithms.

    PubMed

    Sundareshan, Malur K; Bhattacharjee, Supratik; Inampudi, Radhika; Pang, Ho-Yuen

    2002-12-10

    Computational complexity is a major impediment to the real-time implementation of image restoration and superresolution algorithms in many applications. Although powerful restoration algorithms have been developed within the past few years utilizing sophisticated mathematical machinery (based on statistical optimization and convex set theory), these algorithms are typically iterative in nature and require a sufficient number of iterations to be executed to achieve the desired resolution improvement that may be needed to meaningfully perform postprocessing image exploitation tasks in practice. Additionally, recent technological breakthroughs have facilitated novel sensor designs (focal plane arrays, for instance) that make it possible to capture megapixel imagery data at video frame rates. A major challenge in the processing of these large-format images is to complete the execution of the image processing steps within the frame capture times and to keep up with the output rate of the sensor so that all data captured by the sensor can be efficiently utilized. Consequently, development of novel methods that facilitate real-time implementation of image restoration and superresolution algorithms is of significant practical interest and is the primary focus of this study. The key to designing computationally efficient processing schemes lies in strategically introducing appropriate preprocessing steps together with the superresolution iterations to tailor optimized overall processing sequences for imagery data of specific formats. For substantiating this assertion, three distinct methods for tailoring a preprocessing filter and integrating it with the superresolution processing steps are outlined. 
These methods consist of a region-of-interest extraction scheme, a background-detail separation procedure, and a scene-derived information extraction step for implementing a set-theoretic restoration of the image that is less demanding in computation compared with the superresolution iterations. A quantitative evaluation of the performance of these algorithms for restoring and superresolving various imagery data captured by diffraction-limited sensing operations is also presented.

  3. EUV mask pilot line at Intel Corporation

    NASA Astrophysics Data System (ADS)

    Stivers, Alan R.; Yan, Pei-Yang; Zhang, Guojing; Liang, Ted; Shu, Emily Y.; Tejnil, Edita; Lieberman, Barry; Nagpal, Rajesh; Hsia, Kangmin; Penn, Michael; Lo, Fu-Chang

    2004-12-01

    The introduction of extreme ultraviolet (EUV) lithography into high volume manufacturing requires the development of a new mask technology. In support of this, Intel Corporation has established a pilot line devoted to encountering and eliminating barriers to manufacturability of EUV masks. It concentrates on EUV-specific process modules and makes use of the captive standard photomask fabrication capability of Intel Corporation. The goal of the pilot line is to accelerate EUV mask development to intersect the 32nm technology node. This requires EUV mask technology to be comparable to standard photomask technology by the beginning of the silicon wafer process development phase for that technology node. The pilot line embodies Intel's strategy to lead EUV mask development in the areas of the mask patterning process, mask fabrication tools, the starting material (blanks) and the understanding of process interdependencies. The patterning process includes all steps from blank defect inspection through final pattern inspection and repair. We have specified and ordered the EUV-specific tools and most will be installed in 2004. We have worked with International Sematech and others to provide for the next generation of EUV-specific mask tools. Our process of record is run repeatedly to ensure its robustness. This primes the supply chain and collects information needed for blank improvement.

  4. Decision Making Processes and Outcomes

    PubMed Central

    Hicks Patrick, Julie; Steele, Jenessa C.; Spencer, S. Melinda

    2013-01-01

    The primary aim of this study was to examine the contributions of individual characteristics and strategic processing to the prediction of decision quality. Data were provided by 176 adults, ages 18 to 93 years, who completed computerized decision-making vignettes and a battery of demographic and cognitive measures. We examined the contributions of age, domain-specific experience, working memory, and three measures of strategic information search to solution quality using a 4-step hierarchical linear regression analysis. Working memory and two measures of strategic processing uniquely contributed to the variance explained. Results are discussed in terms of potential advances to both theory and intervention efforts. PMID:24282638
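
    The 4-step hierarchical regression described above can be illustrated on simulated data (the values below are invented, not the study's): predictors are entered in blocks, and the increment in R² at each step estimates each block's unique contribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 176  # sample size matching the study; the data themselves are simulated

# Simulated predictors entered in four blocks, loosely mirroring the design:
age = rng.normal(size=n)
experience = rng.normal(size=n)
working_memory = rng.normal(size=n)
search = rng.normal(size=(n, 3))          # three strategic-search measures
quality = 0.4 * working_memory + 0.3 * search[:, 0] + rng.normal(scale=0.8, size=n)

def r_squared(predictors, y):
    # OLS R^2 with an intercept; residual mean is zero, so this is exact.
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

blocks = [[age],
          [age, experience],
          [age, experience, working_memory],
          [age, experience, working_memory, *search.T]]
r2 = [r_squared(b, quality) for b in blocks]
delta_r2 = np.diff([0.0] + r2)   # unique variance added at each step
```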

  5. Solar cell and I.C. aspects of ingot-to-slice mechanical processing

    NASA Astrophysics Data System (ADS)

    Dyer, L. D.

    1985-08-01

    Intensive efforts have been put into the growth of silicon crystals to suit today's solar cell and integrated circuit requirements. Each step of processing the crystal must also receive concentrated attention to preserve the grown-in perfection and to provide a suitable device-ready wafer at reasonable cost. A comparison is made between solar cell and I.C. requirements on the mechanical processing of silicon from ingot to wafer. Specific defects are described that can ruin the slice or can possibly lead to device degradation. These include grinding cracks, saw exit chips, crow's-foot fractures, edge cracks, and handling scratches.

  6. Solar cell and I.C. aspects of ingot-to-slice mechanical processing

    NASA Technical Reports Server (NTRS)

    Dyer, L. D.

    1985-01-01

    Intensive efforts have been put into the growth of silicon crystals to suit today's solar cell and integrated circuit requirements. Each step of processing the crystal must also receive concentrated attention to preserve the grown-in perfection and to provide a suitable device-ready wafer at reasonable cost. A comparison is made between solar cell and I.C. requirements on the mechanical processing of silicon from ingot to wafer. Specific defects are described that can ruin the slice or can possibly lead to device degradation. These include grinding cracks, saw exit chips, crow's-foot fractures, edge cracks, and handling scratches.

  7. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, multi-dimensional combinations of operational variables were studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular-weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.
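
    Mapping a design space of this kind amounts to evaluating acceptance criteria over a grid of operating conditions. A minimal sketch, with invented response surfaces and acceptance limits standing in for the measured HIC performance:

```python
import itertools

# Hypothetical two-factor characterization study for an HIC step: each
# grid point of the operating range is scored for yield and HMW clearance.
def run_step(salt_M, load_g_per_L):
    # Toy response surfaces standing in for measured process performance.
    yield_pct = 95 - 8 * abs(salt_M - 1.0) - 0.4 * max(0, load_g_per_L - 30)
    hmw_pct = 0.5 + 0.6 * max(0, salt_M - 1.2) + 0.02 * load_g_per_L
    return yield_pct, hmw_pct

# The design space is the set of conditions meeting both (assumed) criteria.
design_space = []
for salt, load in itertools.product([0.8, 1.0, 1.2, 1.4], [20, 30, 40]):
    y, hmw = run_step(salt, load)
    if y >= 90 and hmw <= 1.5:   # acceptance criteria (assumed)
        design_space.append((salt, load))
```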

  8. A Highly Flexible, Automated System Providing Reliable Sample Preparation in Element- and Structure-Specific Measurements.

    PubMed

    Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin

    2016-10-01

    Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element- and structure-specific measurements, these automated platforms are not suitable for such analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high flexibility in performance. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality has been confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise than the manual procedure, and as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.

  9. NPTool: Towards Scalability and Reliability of Business Process Management

    NASA Astrophysics Data System (ADS)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently, an important challenge in business process management is to provide both scalability and reliability of business process executions. This difficulty becomes more accentuated when the execution control must handle countless complex business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by the Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as an SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can be used to provide scalable and reliable control of the execution of business processes. The next steps for NPTool include reuse of control-flow patterns and support for data-flow management.

  10. A new approach for bioassays based on frequency- and time-domain measurements of magnetic nanoparticles.

    PubMed

    Oisjöen, Fredrik; Schneiderman, Justin F; Astalan, Andrea Prieto; Kalabukhov, Alexey; Johansson, Christer; Winkler, Dag

    2010-01-15

    We demonstrate a one-step wash-free bioassay measurement system capable of tracking biochemical binding events. Our approach combines the high resolution of frequency- and high speed of time-domain measurements in a single device in combination with a fast one-step bioassay. The one-step nature of our magnetic nanoparticle (MNP) based assay reduces the time between sample extraction and quantitative results while mitigating the risks of contamination related to washing steps. Our method also enables tracking of binding events, providing the possibility of, for example, investigation of how chemical/biological environments affect the rate of a binding process or study of the action of certain drugs. We detect specific biological binding events occurring on the surfaces of fluid-suspended MNPs that modify their magnetic relaxation behavior. Herein, we extrapolate a modest sensitivity to analyte of 100 ng/ml with the present setup using our rapid one-step bioassay. More importantly, we determine the size-distributions of the MNP systems with theoretical fits to our data obtained from the two complementary measurement modalities and demonstrate quantitative agreement between them. Copyright 2009 Elsevier B.V. All rights reserved.
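
    The detection principle above rests on Brownian rotation: binding enlarges the hydrodynamic volume of the MNP, lengthening its relaxation time τ_B = 3ηV_h/(k_BT) and shifting the characteristic frequency downward. A small sketch with assumed particle sizes (the radii and shell thickness below are illustrative, not the paper's values):

```python
import math

def brownian_relaxation_time(radius_m, eta=1.0e-3, T=298.0):
    # tau_B = 3 * eta * V_h / (k_B * T) for a particle of hydrodynamic
    # radius `radius_m` suspended in water (eta in Pa*s, T in K).
    k_B = 1.380649e-23
    V_h = (4.0 / 3.0) * math.pi * radius_m ** 3
    return 3.0 * eta * V_h / (k_B * T)

tau_free = brownian_relaxation_time(50e-9)    # bare 100 nm MNP (assumed)
tau_bound = brownian_relaxation_time(55e-9)   # +5 nm bound-protein shell
f_free = 1.0 / (2.0 * math.pi * tau_free)     # characteristic frequency
f_bound = 1.0 / (2.0 * math.pi * tau_bound)   # shifts down upon binding
```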

  11. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry-based active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergistic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided throughout the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as a low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction.
The case-study includes reaction steps typically used by the pharmaceutical industry featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be useful to efficiently design novel or already existing API manufacturing processes taking advantage of continuous processes. Copyright © 2012 Elsevier B.V. All rights reserved.
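
    The process mass intensity metric mentioned above is a simple ratio, so halving total material use halves the index. A sketch with invented masses (not the Lundbeck case-study figures):

```python
def process_mass_intensity(masses_in_kg, product_kg):
    # PMI = total mass of all materials used / mass of product (kg/kg).
    return sum(masses_in_kg) / product_kg

# Hypothetical numbers: solvent, reagents, and water for 2 kg of API.
batch_pmi = process_mass_intensity([120.0, 45.0, 35.0], 2.0)       # 100 kg/kg
continuous_pmi = process_mass_intensity([60.0, 22.5, 17.5], 2.0)   # 50 kg/kg
```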

  12. Modeling metabolic networks in C. glutamicum: a comparison of rate laws in combination with various parameter optimization strategies

    PubMed Central

    Dräger, Andreas; Kronfeld, Marcel; Ziller, Michael J; Supper, Jochen; Planatscher, Hannes; Magnus, Jørgen B; Oldiges, Marco; Kohlbacher, Oliver; Zell, Andreas

    2009-01-01

    Background To understand the dynamic behavior of cellular systems, mathematical modeling is often necessary and comprises three steps: (1) experimental measurement of participating molecules, (2) assignment of rate laws to each reaction, and (3) parameter calibration with respect to the measurements. In each of these steps the modeler is confronted with a plethora of alternative approaches, e.g., the selection of approximative rate laws in step two as specific equations are often unknown, or the choice of an estimation procedure with its specific settings in step three. This overall process with its numerous choices and the mutual influence between them makes it hard to single out the best modeling approach for a given problem. Results We investigate the modeling process using multiple kinetic equations together with various parameter optimization methods for a well-characterized example network, the biosynthesis of valine and leucine in C. glutamicum. For this purpose, we derive seven dynamic models based on generalized mass action, Michaelis-Menten and convenience kinetics as well as the stochastic Langevin equation. In addition, we introduce two modeling approaches for feedback inhibition to the mass action kinetics. The parameters of each model are estimated using eight optimization strategies. To determine the most promising modeling approaches together with the best optimization algorithms, we carry out a two-step benchmark: (1) coarse-grained comparison of the algorithms on all models and (2) fine-grained tuning of the best optimization algorithms and models. To analyze the space of the best parameters found for each model, we apply clustering, variance, and correlation analysis. Conclusion A mixed model based on the convenience rate law and the Michaelis-Menten equation, in which all reactions are assumed to be reversible, is the most suitable deterministic modeling approach followed by a reversible generalized mass action kinetics model.
A Langevin model is advisable to take stochastic effects into account. To estimate the model parameters, three algorithms are particularly useful: For first attempts the settings-free Tribes algorithm yields valuable results. Particle swarm optimization and differential evolution provide significantly better results with appropriate settings. PMID:19144170
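
    Steps (2) and (3) of the workflow, assigning a rate law and calibrating its parameters, can be sketched for a single Michaelis-Menten reaction using differential evolution, one of the optimization strategies the benchmark found effective. The data are synthetic, with assumed true parameters Vmax = 1.0 and Km = 0.5:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Synthetic rate data from a Michaelis-Menten law v = Vmax*S/(Km + S)
# with Vmax = 1.0, Km = 0.5 (arbitrary units), plus small noise.
rng = np.random.default_rng(0)
S = np.linspace(0.05, 5.0, 25)
v_obs = 1.0 * S / (0.5 + S) + rng.normal(scale=0.01, size=S.size)

def sse(params):
    # Sum of squared errors between the candidate rate law and the data.
    vmax, km = params
    return float(np.sum((vmax * S / (km + S) - v_obs) ** 2))

# Calibrate (Vmax, Km) within bounds using differential evolution.
result = differential_evolution(sse, bounds=[(0.1, 5.0), (0.01, 5.0)], seed=1)
vmax_fit, km_fit = result.x
```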

  13. Neck linker length determines the degree of processivity in kinesin-1 and kinesin-2 motors.

    PubMed

    Shastry, Shankar; Hancock, William O

    2010-05-25

    Defining the mechanical and biochemical determinants of kinesin processivity is important for understanding how diverse kinesins are tuned for specific cellular functions. Because transmission of mechanical forces through the 14-18 amino acid neck linker domain underlies coordinated stepping, we investigated the role of neck linker length, charge, and structure in kinesin-1 and kinesin-2 motor behavior. For optimum comparison with kinesin-1, the KIF3A head and neck linker of kinesin-2 were fused to the kinesin-1 neck coil and rod. Extending the 14-residue kinesin-1 neck linker reduced processivity, and shortening the 17-residue kinesin-2 neck linker enhanced processivity. When a proline in the kinesin-2 neck linker was replaced, kinesin-1 and kinesin-2 run lengths scaled identically with neck linker length, despite moving at different speeds. In low-ionic-strength buffer, charge had a dominant effect on motor processivity, which resolves ongoing controversy regarding the effect of neck linker length on kinesin processivity. From stochastic simulations, the results are best explained by neck linker extension slowing strain-dependent detachment of the rear head along with diminishing strain-dependent inhibition of ATP binding. These results help delineate how interhead strain maximizes stepping and suggest that less processive kinesins are tuned to coordinate with other motors differently than the maximally processive kinesin-1. Copyright 2010 Elsevier Ltd. All rights reserved.
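
    The core intuition, that a higher per-step detachment probability shortens run length, can be reproduced with a minimal stochastic stepping simulation. This is a toy geometric model, not the strain-dependent simulation used in the paper, and the detachment probabilities are assumed values:

```python
import random

def mean_run_length_nm(p_detach, n_motors=2000, step_nm=8.0, seed=0):
    # Each 8 nm step either continues processively or the motor detaches
    # with probability p_detach; run length is geometric in the step count,
    # so the expected run length is step_nm * (1 - p_detach) / p_detach.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_motors):
        steps = 0
        while rng.random() > p_detach:
            steps += 1
        total += steps * step_nm
    return total / n_motors

# Extending the neck linker is modeled here simply as raising the
# per-step detachment probability (assumed values).
short_linker = mean_run_length_nm(0.01)   # ~790 nm expected
long_linker = mean_run_length_nm(0.05)    # ~150 nm expected
```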

  14. EUV process improvement with novel litho track hardware

    NASA Astrophysics Data System (ADS)

    Stokes, Harold; Harumoto, Masahiko; Tanaka, Yuji; Kaneyama, Koji; Pieczulewski, Charles; Asai, Masaya

    2017-03-01

    Currently, there are many developments in the field of EUV lithography that are helping to move it towards increased HVM feasibility. Targeted improvements in hardware design for advanced lithography are of interest to our group specifically for metrics such as CD uniformity, LWR, and defect density. Our work is focused on EUV process steps that are specifically affected by litho track performance and, consequently, can be improved by litho track design improvement and optimization. In this study we build on our experience to provide continual improvement in LWR, CDU, and defects as applied to a standard EUV process by employing novel hardware solutions on our SOKUDO DUO coat develop track system. Although it is preferable to achieve such improvements post-etch, we feel, as many do, that improvements after patterning are a precursor to improvements after etching. We present our work utilizing the SOKUDO DUO coat develop track system with an ASML NXE:3300 in the IMEC (Leuven, Belgium) cleanroom environment to improve aggressive dense L/S patterns.

  15. Space Shuttle ET Friction Stir Weld Machines

    NASA Technical Reports Server (NTRS)

    Thompson, Jack M.

    2003-01-01

    NASA and Lockheed-Martin approached the FSW machine vendor community with a specification for longitudinal barrel production FSW weld machines and a shorter-travel process development machine in June of 2000. This specification was based on three years of FSW process development on the Space Shuttle External Tank alloys, Al 2195-T8M4 and Al 2219-T87. The primary motivations for changing the ET longitudinal welds from the existing variable polarity plasma arc (VPPA) weld process included: (1) Significantly reduced weld defect rates and related reduction in cycle time and uncertainty; (2) Many fewer process variables to control (5 vs. 17); (3) Fewer manufacturing steps; (4) Lower residual stresses and distortion; (5) Improved weld strengths, particularly at cryogenic temperatures; (6) Fewer hazards to production personnel. General Tool was the successful bidder. As of this writing, the equipment is installed and welding flight hardware. This paper is a means of sharing with the rest of the FSW community the unique features developed to assure NASA/L-M of successful production welds.

  16. Three-Dimensional Finite Element Ablative Thermal Response and Thermostructural Design of Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2011-01-01

    A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process beginning with a parameter sensitivity study, followed by a deterministic analysis whereby an optimum design can be determined. The design process concludes with a Monte Carlo simulation where the probabilities of exceeding design specifications are estimated. The design methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
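
    The final Monte Carlo step of such a methodology can be sketched as follows: sample uncertain inputs, evaluate a surrogate thermal model, and count exceedances of the design specification. The surrogate, distributions, and limit below are all invented for illustration:

```python
import random

# Toy linear surrogate standing in for the finite element thermal response.
def bondline_temp(heat_load, thickness):
    return 150.0 + 40.0 * heat_load - 60.0 * thickness   # deg C (invented)

rng = random.Random(42)
limit = 250.0        # assumed maximum allowed bondline temperature
trials = 20000
exceed = 0
for _ in range(trials):
    q = rng.gauss(2.0, 0.3)     # normalized heat load (assumed distribution)
    t = rng.gauss(0.5, 0.05)    # normalized ablator thickness
    if bondline_temp(q, t) > limit:
        exceed += 1
p_exceed = exceed / trials      # estimated probability of exceeding the spec
```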

  17. Analyzing women's roles through graphic representation of narratives.

    PubMed

    Hall, Joanne M

    2003-08-01

    A 1992 triangulated international nursing study of women's health was reported. The researchers used the perspectives of feminism and symbolic interactionism, specifically role theory. A narrative analysis was done to clarify the concept of role integration. The narrative analysis was reported in 1992, but graphic/visual techniques used in the team dialogue process of narrative analysis were not reported due to space limitations. These techniques have not been reported elsewhere and thus remain innovative. Specific steps in the method are outlined here in detail as an audit trail. The process would be useful to other qualitative researchers as an exemplar of one novel way that verbal data can be abstracted visually/graphically. Suggestions are included for aspects of narrative, in addition to roles, that could be depicted graphically in qualitative research.

  18. Direct separation of arsenic and antimony oxides by high-temperature filtration with porous FeAl intermetallic.

    PubMed

    Zhang, Huibin; Liu, Xinli; Jiang, Yao; Gao, Lin; Yu, Linping; Lin, Nan; He, Yuehui; Liu, C T

    2017-09-15

    A temperature-controlled selective filtration technology for synchronous removal of arsenic and recovery of antimony from the fume produced by the reduction smelting process of lead anode slimes was proposed. A chromium (Cr)-alloyed FeAl intermetallic with an asymmetric pore structure was developed as the high-temperature filter material after evaluating its corrosion resistance, structural stability, and mechanical properties. The results showed that porous FeAl alloyed with 20wt.% Cr had long-term stability in a high-temperature sulfide-bearing environment. The separation of arsenic and antimony trioxides was realized principally based on their disparate saturated vapor pressures at specific temperature ranges and on the asymmetric membrane of the FeAl filter elements with a mean pore size of 1.8μm. Pilot-scale filtration tests showed that the direct separation of arsenic and antimony can be achieved by a one-step or two-step filtration process. A removal percentage of arsenic as high as 92.24% was reached at the expense of a 6∼7% loss of antimony in the two-step filtration process at 500∼550°C and 300∼400°C. The FeAl filters still had good permeability and mechanical properties after 1041h of uninterrupted service, which indicates the feasibility of this high-temperature filtration technology. Copyright © 2017. Published by Elsevier B.V.

  19. An Emerging Theoretical Model of Music Therapy Student Development.

    PubMed

    Dvorak, Abbey L; Hernandez-Ruiz, Eugenia; Jang, Sekyung; Kim, Borin; Joseph, Megan; Wells, Kori E

    2017-07-01

    Music therapy students negotiate a complex relationship with music and its use in clinical work throughout their education and training. This distinct, pervasive, and evolving relationship suggests a developmental process unique to music therapy. The purpose of this grounded theory study was to create a theoretical model of music therapy students' developmental process, beginning with a study within one large Midwestern university. Participants (N = 15) were music therapy students who completed one 60-minute intensive interview, followed by a 20-minute member check meeting. Recorded interviews were transcribed, analyzed, and coded using open and axial coding. The theoretical model that emerged was a six-step sequential developmental progression that included the following themes: (a) Personal Connection, (b) Turning Point, (c) Adjusting Relationship with Music, (d) Growth and Development, (e) Evolution, and (f) Empowerment. The first three steps are linear; development continues in a cyclical process among the last three steps. As the cycle continues, music therapy students continue to grow and develop their skills, leading to increased empowerment, and more specifically, increased self-efficacy and competence. Further exploration of the model is needed to inform educators' and other key stakeholders' understanding of student needs and concerns as they progress through music therapy degree programs. © the American Music Therapy Association 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  20. A theoretical model to determine the capacity performance of shape-specific electrodes

    NASA Astrophysics Data System (ADS)

    Yue, Yuan; Liang, Hong

    2018-06-01

    A theory is proposed to explain and predict the electrochemical process during the reaction between lithium ions and electrode materials. In the model, the reaction proceeds in two steps: surface adsorption and diffusion of lithium ions. Surface adsorption is an instantaneous process in which lithium ions adsorb onto the surface sites of active materials. The diffusion of lithium ions into particles is determined by the charge-discharge condition. A formula to determine the maximum specific capacity of active materials at different charging rates (C-rates) is derived. The maximum specific capacity is correlated to characteristic parameters of the material and of cycling, such as size, aspect ratio, surface area, and C-rate. Analysis indicates that a larger particle size or greater aspect ratio of active materials and faster C-rates can reduce the maximum specific capacity. This suggests that reducing the particle size of active materials and slowing the charge-discharge speed can enhance the electrochemical performance of a battery cell. Furthermore, the model is validated against published experimental results. This model brings new understanding to the quantification of electrochemical kinetics and capacity performance. It enables development of design strategies for novel electrodes and future generations of energy storage devices.
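
    A back-of-envelope version of the two-step picture (instantaneous surface adsorption plus diffusion-limited filling of the particle) reproduces the stated trends; the functional form and all parameter values below are illustrative assumptions, not the paper's derived formula:

```python
import math

def max_specific_capacity(radius_m, c_rate, D=1e-16, q_surf=20.0, q_bulk=150.0):
    # Illustrative model: surface adsorption gives an instantaneous
    # contribution q_surf, while diffusion fills a shell of depth
    # L = sqrt(D * t_charge), with t_charge = 3600 / C-rate seconds.
    t_charge = 3600.0 / c_rate
    L = math.sqrt(D * t_charge)
    # Volume fraction of a sphere reached within penetration depth L.
    frac = 1.0 - max(0.0, (radius_m - L) / radius_m) ** 3
    return q_surf + q_bulk * frac        # mAh/g (assumed values)

slow_large = max_specific_capacity(1e-6, 0.1)    # fully utilized particle
fast_large = max_specific_capacity(1e-6, 10.0)   # diffusion-limited
fast_small = max_specific_capacity(1e-7, 10.0)   # small particle recovers
```

As the abstract states, larger particles and faster C-rates reduce the attainable capacity, while shrinking the particle restores it.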

  1. A phonon thermodynamics approach of gold nanofluids synthesized in solution plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heo, YongKang, E-mail: yk@rd.numse.nagoya-u.ac.jp; Aburaya, Daiki, E-mail: daiki@rd.numse.nagoya-u.ac.jp; Antoaneta Bratescu, Maria, E-mail: maria@rd.numse.nagoya-u.ac.jp

    2014-03-17

    The phonon thermodynamics theory for liquids was applied to explain the thermal characteristics of gold nanofluids synthesized by a simple, one-step, and chemical-free method using an electrical discharge in a liquid environment, termed the solution plasma process. The specific heat capacity of the nanofluids was measured with a differential scanning calorimeter using the ratio between the differential heat flow rate and the heating rate. The 10% decrease in the specific heat capacity of the gold nanofluids relative to water was explained by a 22% decrease in the Frenkel relaxation time, considering a solid-like state model of liquids.
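
    The DSC relation used here, c_p = (dQ/dt)/(m·dT/dt), is easy to compute directly; the readings below are invented but chosen to reproduce a water-like baseline and a roughly 10% lower nanofluid value:

```python
def specific_heat(heat_flow_mW, mass_mg, heating_rate_K_per_min):
    # c_p = (dQ/dt) / (m * dT/dt), with unit conversions:
    # mW -> W (J/s), mg -> kg, K/min -> K/s; result in J/(kg K).
    dq_dt = heat_flow_mW * 1e-3
    m = mass_mg * 1e-6
    beta = heating_rate_K_per_min / 60.0
    return dq_dt / (m * beta)

# Hypothetical DSC readings at the same mass and heating rate.
cp_water = specific_heat(6.97, 10.0, 10.0)      # ~4182 J/(kg K)
cp_nanofluid = specific_heat(6.27, 10.0, 10.0)  # ~10% lower
```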

  2. SEM evaluation of metallization on semiconductors. [Scanning Electron Microscope

    NASA Technical Reports Server (NTRS)

    Fresh, D. L.; Adolphsen, J. W.

    1974-01-01

    A test method for the evaluation of metallization on semiconductors is presented and discussed. The method has been prepared in MIL-STD format for submittal as a proposed addition to MIL-STD-883. It is applicable to discrete devices and to integrated circuits and specifically addresses batch-process oriented defects. Quantitative accept/reject criteria are given for contact windows, other oxide steps, and general interconnecting metallization. Figures are provided that illustrate typical types of defects. Apparatus specifications, sampling plans, and specimen preparation and examination requirements are described. Procedures for glassivated devices and for multi-metal interconnection systems are included.

  3. Finding glenoid surface on scapula in 3D medical images for shoulder joint implant operation planning: 3D OCR

    NASA Astrophysics Data System (ADS)

    Mohammad Sadeghi, Majid; Kececi, Emin Faruk; Bilsel, Kerem; Aralasmak, Ayse

    2017-03-01

    Medical imaging has great importance in earlier detection, better treatment, and follow-up of diseases. 3D medical image analysis with CT scan and MRI images has also been used to aid surgeries by enabling patient-specific implant fabrication, where having a precise three-dimensional model of the associated body parts is essential. In this paper, a 3D image processing methodology for finding the plane on which the glenoid surface has a maximum surface area is proposed. Finding this surface is the first step in designing a patient-specific shoulder joint implant.

  4. Program Criteria Specifications Document. Computer Program TWDA for Design and Analysis of Inverted-T Retaining Walls and Floodwalls.

    DTIC Science & Technology

    1981-02-01

    ...or analysis modules, each performing one specific step in the design or analysis process. These modules will be callable, in any logical sequence... tempt to place and cut off bars, but will show the required steel area and bond requirements per foot at suitable intervals across the base... bond strength) shall be as required in ACI 318-71 Chapter 12, except that computed shear V shall be multiplied by 2.0 and substituted for V u.

  5. Bridging the guideline implementation gap: a systematic, document-centered approach to guideline implementation.

    PubMed

    Shiffman, Richard N; Michel, George; Essaihi, Abdelwaheb; Thornquist, Elizabeth

    2004-01-01

    A gap exists between the information contained in published clinical practice guidelines and the knowledge and information that are necessary to implement them. This work describes a process to systematize and make explicit the translation of document-based knowledge into workflow-integrated clinical decision support systems. This approach uses the Guideline Elements Model (GEM) to represent the guideline knowledge. Implementation requires a number of steps to translate the knowledge contained in guideline text into a computable format and to integrate the information into clinical workflow. The steps include: (1) selection of a guideline and specific recommendations for implementation, (2) markup of the guideline text, (3) atomization, (4) deabstraction and (5) disambiguation of recommendation concepts, (6) verification of rule set completeness, (7) addition of explanations, (8) building executable statements, (9) specification of origins of decision variables and insertions of recommended actions, (10) definition of action types and selection of associated beneficial services, (11) choice of interface components, and (12) creation of requirement specification. The authors illustrate these component processes using examples drawn from recent experience translating recommendations from the National Heart, Lung, and Blood Institute's guideline on management of chronic asthma into a workflow-integrated decision support system that operates within the Logician electronic health record system. Using the guideline document as a knowledge source promotes authentic translation of domain knowledge and reduces the overall complexity of the implementation task. From this framework, we believe that a better understanding of activities involved in guideline implementation will emerge.
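
    The end product of steps (8) and (9), an executable statement with specified decision variables and recommended actions, can be sketched as a small rule; the thresholds and wording below are assumptions for illustration, not the NHLBI guideline text:

```python
# Hypothetical computable form of one guideline recommendation, after the
# markup, atomization, and disambiguation steps: decision variables would
# come from the patient record, and the action is a recommendation string.
def asthma_rule(daytime_symptoms_per_week, night_wakenings_per_month):
    # Assumed rule shape: "if symptoms > 2 days/week or night wakenings
    # > 2/month, classify as persistent and recommend a daily controller".
    if daytime_symptoms_per_week > 2 or night_wakenings_per_month > 2:
        return ("persistent", "recommend daily inhaled corticosteroid")
    return ("intermittent", "recommend as-needed SABA only")

severity, action = asthma_rule(3, 1)
```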

  6. A metallurgical route to solar-grade silicon

    NASA Technical Reports Server (NTRS)

    Schei, A.

    1986-01-01

    The aim of the process is to produce silicon for crystallization into ingots that can be sliced to wafers for processing into photovoltaic cells. If the potential purity can be realized, the silicon will also be applicable for ribbon pulling techniques where the purification during crystallization is negligible. The process consists of several steps: selection and purification of raw materials, carbothermic reduction of silica, ladle treatment, casting, crushing, leaching, and melting. The leaching step is crucial for high purity, and the obtainable purity is determined by the solidification before leaching. The most difficult specifications to fulfill are the low contents of boron, phosphorus, and carbon. Boron and phosphorus can be excluded from the raw materials, but the carbothermic reduction will unavoidably saturate the silicon with carbon at high temperature. During cooling carbon will precipitate as silicon carbide crystals, which will be harmful in solar cells. The cost of this solar silicon will depend strongly on the scale of production. It is as yet premature to give exact figures, but with a scale of some thousand tons per year, the cost will only be a few times the cost of ordinary metallurgical silicon.

  7. Detection Methodologies for Pathogen and Toxins: A Review.

    PubMed

    Alahi, Md Eshrat E; Mukhopadhyay, Subhas Chandra

    2017-08-16

Pathogen- and toxin-contaminated foods and beverages are a major source of illness, and even death, and have a significant economic impact worldwide. Human health is under constant potential threat from these dangerous pathogens, including through biological warfare. The agricultural and food production chain consists of many steps, such as harvesting, handling, processing, packaging, storage, distribution, preparation, and consumption, and each step is susceptible to environmental contamination or failure to safeguard the process. The production process can be controlled in the food and agricultural sector, where smart sensors can play a major role in ensuring greater food quality and safety through low-cost, fast, and reliable detection methods. Techniques for detecting pathogens and toxins vary in cost, size, specificity, speed of response, sensitivity, and precision. Smart sensors can detect, analyse, and quantify materials of biological origin at the molecular level and safeguard food quality against contamination with pesticides, fertilizers, dioxins, modified organisms, anti-nutrients, allergens, drugs, and so on. This paper reviews different methodologies for detecting pathogens and toxins in foods and beverages.

  8. A Review of Organ Transplantation: Heart, Lung, Kidney, Liver, and Simultaneous Liver-Kidney.

    PubMed

    Scheuher, Cynthia

    2016-01-01

Heart, lung, kidney, liver, and simultaneous liver-kidney transplants share many features. They all follow the same 7-step process, use the same 3 immunosuppressant medications, and share the same reason for transplantation: organ failure. The similarities end there. Each organ has its own unique causes of failure, and each has its own set of criteria that must be met prior to transplantation. Simultaneous liver-kidney transplant criteria vary per transplant center but are similar in nature. Both the required criteria and the 7-step process are defined by the United Network for Organ Sharing, a private, nonprofit organization under contract with the US Department of Health and Human Services. Its function is to increase the number of transplants, improve survival rates after transplantation, promote safe transplant practices, and promote efficiency. The purpose of this article is to review the reasons transplant is needed, specifically heart, lung, kidney, liver, and simultaneous liver-kidney, and to give a brief overview of the transplant process, including criteria used, contraindications, and medications prescribed.

  9. Designing industrial yeasts for the consolidated bioprocessing of starchy biomass to ethanol

    PubMed Central

    Favaro, Lorenzo; Jooste, Tania; Basaglia, Marina; Rose, Shaunita H.; Saayman, Maryna; Görgens, Johann F.; Casella, Sergio; van Zyl, Willem H.

    2013-01-01

Consolidated bioprocessing (CBP), which integrates enzyme production, saccharification and fermentation into a one-step process, is a promising strategy for effective ethanol production from cheap lignocellulosic and starchy materials. CBP requires a highly engineered microbial strain able both to hydrolyze biomass with enzymes produced on its own and to convert the resulting simple sugars into high-titer ethanol. Recently, heterologous production of cellulose- and starch-degrading enzymes has been achieved in yeast hosts, enabling direct processing of biomass to ethanol. However, essentially all efforts aimed at the efficient heterologous expression of saccharolytic enzymes in yeast have involved laboratory strains, and much of this work has yet to be transferred to industrial yeasts, which provide the fermentation capacity and robustness desired for large-scale bioethanol production. Specifically, the development of an industrial CBP amylolytic yeast would allow the one-step processing of low-cost starchy substrates into ethanol. This article gives insight into the current knowledge and achievements on bioethanol production from starchy materials with industrially engineered S. cerevisiae strains. PMID:22989992

  10. Integrated system for single leg walking

    NASA Astrophysics Data System (ADS)

    Simmons, Reid; Krotkov, Eric; Roston, Gerry

    1990-07-01

    The Carnegie Mellon University Planetary Rover project is developing a six-legged walking robot capable of autonomously navigating, exploring, and acquiring samples in rugged, unknown environments. This report describes an integrated software system capable of navigating a single leg of the robot over rugged terrain. The leg, based on an early design of the Ambler Planetary Rover, is suspended below a carriage that slides along rails. To walk, the system creates an elevation map of the terrain from laser scanner images, plans an appropriate foothold based on terrain and geometric constraints, weaves the leg through the terrain to position it above the foothold, contacts the terrain with the foot, and applies force enough to advance the carriage along the rails. Walking both forward and backward, the system has traversed hundreds of meters of rugged terrain including obstacles too tall to step over, trenches too deep to step in, closely spaced obstacles, and sand hills. The implemented system consists of a number of task-specific processes (two for planning, two for perception, one for real-time control) and a central control process that directs the flow of communication between processes.
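The process architecture described above, task-specific planning, perception, and real-time control processes coordinated by a central control process, can be sketched in miniature. The module names and messages below are hypothetical, and in-process queues stand in for the actual inter-process communication:

```python
import queue

# Toy sketch of a central control process that directs the flow of
# communication between task-specific modules (names hypothetical).
def central_control(inbox, outboxes, num_messages):
    """Route num_messages (destination, payload) messages from the
    shared inbox to the addressed module's queue."""
    for _ in range(num_messages):
        dest, payload = inbox.get()
        outboxes[dest].put(payload)

inbox = queue.Queue()
outboxes = {"leg_planner": queue.Queue(), "controller": queue.Queue()}
inbox.put(("leg_planner", "elevation map ready"))
inbox.put(("controller", "foothold selected"))
central_control(inbox, outboxes, 2)
print(outboxes["leg_planner"].get())  # -> "elevation map ready"
```

The design choice being illustrated is that modules never talk to each other directly; all traffic passes through the central router, which is what lets the system add or swap task-specific processes without rewiring the rest.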

  11. Prion removal effect of a specific affinity ligand introduced into the manufacturing process of the pharmaceutical quality solvent/detergent (S/D)-treated plasma OctaplasLG.

    PubMed

    Neisser-Svae, A; Bailey, A; Gregori, L; Heger, A; Jordan, S; Behizad, M; Reichl, H; Römisch, J; Svae, T-E

    2009-10-01

A new chromatographic step for the selective binding of abnormal prion protein (PrP(Sc)) was developed, and optimization for PrP(Sc) capture was achieved by binding to an affinity ligand attached to synthetic resin particles. This step was implemented into the manufacturing process of the solvent/detergent (S/D)-treated biopharmaceutical quality plasma Octaplas to further improve the safety margin with respect to the risk of variant Creutzfeldt-Jakob disease (vCJD) transmission. Intermediates and Octaplas final container material, spiked with hamster brain-derived PrP(Sc)-containing fractions, were used in experiments to establish the feasibility of introducing this novel chromatography step. The binding capacity per millilitre of ligand gel was determined under the selected manufacturing conditions. In addition, the specificity of the ligand gel for binding PrP(Sc) from human sources was investigated. A validated Western blot test was used for the identification and quantification of PrP(Sc). A reduction factor of ≥ 3.0 log10 could be demonstrated by Western blotting, utilizing the relevant Octaplas matrix from manufacturing. In this particular cell-free plasma solution, the PrP(Sc) binding capacity of the selected gel was very high (≥ 6 log10 ID50/ml, equivalent to roughly 10 log10 ID50/column at manufacturing scale). The gel specifically binds PrP(Sc) from both animal (hamster and mouse) and human (sporadic and variant CJD) sources. This new single-use, disposable PrP(Sc)-harvesting gel ensures a very high capacity for removing the pathogenic agent causing vCJD from the new-generation OctaplasLG, in the event that prions are present in plasma from donors incubating the disease, thereby contaminating the raw material plasma used for manufacturing.

  12. Production and Purification of the Native Saccharomyces cerevisiae Hsp12 in Escherichia coli.

    PubMed

    Léger, Antoine; Hocquellet, Agnès; Dieryck, Wilfrid; Moine, Virginie; Marchal, Axel; Marullo, Philippe; Josseaume, Annabelle; Cabanne, Charlotte

    2017-09-20

    Hsp12 is a small heat shock protein produced in many organisms, including the yeast Saccharomyces cerevisiae. It has been described as an indicator of yeast stress rate and has also been linked to the sweetness sensation of wine. To obtain a sufficient amount of protein, we produced and purified Hsp12 without tag in Escherichia coli. A simple fast two-step process was developed using a microplate approach and a design of experiments. A capture step on an anion-exchange salt-tolerant resin was followed by size exclusion chromatography for polishing, leading to a purity of 97%. Thereafter, specific anti-Hsp12 antibodies were obtained by rabbit immunization. An ELISA was developed to quantify Hsp12 in various strains of Saccharomyces cerevisiae. The antibodies showed high specificity and allowed the quantitation of Hsp12 in the yeast. The quantities of Hsp12 measured in the strains differed in direct proportion to the level of expression found in previous studies.

  13. Triggers of key calcium signals during erythrocyte invasion by Plasmodium falciparum

    PubMed Central

    Gao, Xiaohong; Gunalan, Karthigayan; Yap, Sally Shu Lin; Preiser, Peter R.

    2013-01-01

    Invasion of erythrocytes by Plasmodium falciparum merozoites is a complex multi-step process mediated by specific interactions between host receptors and parasite ligands. Reticulocyte-binding protein homologues (RHs) and erythrocyte-binding-like (EBL) proteins are discharged from specialized organelles and used in early steps of invasion. Here we show that monoclonal antibodies against PfRH1 (an RH) block merozoite invasion by specifically inhibiting calcium signalling in the parasite, whereas invasion-inhibiting monoclonal antibodies targeting EBA175 (an EBL protein) have no effect on signalling. We further show that inhibition of this calcium signalling prevents EBA175 discharge and thereby formation of the junction between parasite and host cell. Our results indicate that PfRH1 has an initial sensing as well as signal transduction role that leads to the subsequent release of EBA175. They also provide new insights on how RH–host cell interactions lead to essential downstream signalling events in the parasite, suggesting new targets for malaria intervention. PMID:24280897

  14. Three steps to gold: mechanism of protein adsorption revealed by Brownian and molecular dynamics simulations.

    PubMed

    Ozboyaci, M; Kokh, D B; Wade, R C

    2016-04-21

    The addition of three N-terminal histidines to β-lactamase inhibitor protein was shown experimentally to increase its binding potency to an Au(111) surface substantially but the binding mechanism was not resolved. Here, we propose a complete adsorption mechanism for this fusion protein by means of a multi-scale simulation approach and free energy calculations. We find that adsorption is a three-step process: (i) recognition of the surface predominantly by the histidine fusion peptide and formation of an encounter complex facilitated by a reduced dielectric screening of water in the interfacial region, (ii) adsorption of the protein on the surface and adoption of a specific binding orientation, and (iii) adaptation of the protein structure on the metal surface accompanied by induced fit. We anticipate that the mechanistic features of protein adsorption to an Au(111) surface revealed here can be extended to other inorganic surfaces and proteins and will therefore aid the design of specific protein-surface interactions.

  15. Processing of zero-derived words in English: an fMRI investigation.

    PubMed

    Pliatsikas, Christos; Wheeldon, Linda; Lahiri, Aditi; Hansen, Peter C

    2014-01-01

Derivational morphological processes allow us to create new words (e.g. punish (V) to noun (N) punishment) from base forms. The number of steps from the basic units to derived words often varies (e.g., nationality vs. bridge-V, i.e., zero-derivation; Aronoff, 1980). We compared the processing of one-step derivations (e.g., soaking) …

  16. Recognition of pigment network pattern in dermoscopy images based on fuzzy classification of pixels.

    PubMed

    Garcia-Arroyo, Jose Luis; Garcia-Zapirain, Begonya

    2018-01-01

One of the most relevant dermoscopic patterns is the pigment network. An innovative pattern recognition method is presented for its detection in dermoscopy images. It consists of two steps. In the first, by means of a supervised machine learning process and after extracting different colour and texture features, a fuzzy classification of pixels into the three categories present in the pattern's definition ("net", "hole" and "other") is carried out. This enables the three corresponding fuzzy sets to be created and, as a result, the three probability images that map them out are generated. In the second step, the pigment network pattern is characterised through a parameterisation process derived from the system specification, followed by the extraction of different features calculated from combinations of image masks obtained from the probability images, corresponding to the alpha-cuts of the fuzzy sets. The method was tested on a database of 875 images, by far the largest used in the state of the art to detect pigment network, extracted from a public Atlas of Dermoscopy, obtaining an AUC of 0.912 and 88% accuracy, with 90.71% sensitivity and 83.44% specificity. The main contribution of this method is the design of the algorithm itself, which is highly innovative and could also be applied to other pattern recognition problems of a similar nature. Other contributions are: 1. good performance in discriminating between the pattern and disturbing artefacts, which means no prior preprocessing is required, and between the pattern and other dermoscopic patterns; 2. it puts forward a new methodological approach for work of this kind, introducing the system specification as a required step prior to algorithm design and development, this specification being the basis for a required parameterisation of the algorithm, in the form of configurable parameters (with their value ranges) and set threshold values, and for the subsequent conducting of the experiments. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
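The alpha-cut machinery of the second step can be sketched in a few lines. Assuming a per-pixel probability image for one class (the output of the fuzzy classification), binary masks are extracted as alpha-cuts, i.e. the pixels whose membership value meets each threshold alpha; the thresholds and the toy feature below are illustrative, not the paper's parameterisation:

```python
import numpy as np

# Sketch: alpha-cut masks from a fuzzy-membership (probability) image.
def alpha_cut_masks(prob_image, alphas=(0.3, 0.5, 0.7)):
    """Return {alpha: boolean mask} for each alpha-cut of the fuzzy set."""
    return {a: prob_image >= a for a in alphas}

def mask_features(mask):
    """Toy feature computed from one mask (area fraction here)."""
    return {"area_fraction": float(mask.mean())}

prob = np.array([[0.9, 0.2], [0.6, 0.4]])
masks = alpha_cut_masks(prob)
print(mask_features(masks[0.5]))  # -> {'area_fraction': 0.5}
```

Features computed across several alpha-cuts capture how the detected structure grows as the membership requirement is relaxed, which is the information the characterisation step combines.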

  17. A comparative study of one-step and two-step approaches for MAPbI3 perovskite layer and its influence on the performance of mesoscopic perovskite solar cell

    NASA Astrophysics Data System (ADS)

    Wang, Minhuan; Feng, Yulin; Bian, Jiming; Liu, Hongzhu; Shi, Yantao

    2018-01-01

Mesoscopic perovskite solar cells (M-PSCs) were synthesized with MAPbI3 perovskite layers as light harvesters, grown by one-step and two-step solution processes, respectively. A comparative study was performed through quantitative correlation of the resulting device performance with the crystalline quality of the perovskite layers. Compared with the one-step counterpart, a pronounced improvement of 56.86% in the steady-state power conversion efficiency (PCE) was achieved with the two-step process, which mainly resulted from a significant enhancement in fill factor (FF) from 48% to 77% without sacrificing the open-circuit voltage (Voc) or short-circuit current (Jsc). The enhanced FF was attributed to reduced non-radiative recombination channels owing to the better crystalline quality and larger grain size of the two-step processed perovskite layer. Moreover, the superiority of the two-step over the one-step process was demonstrated with rather good reproducibility.

  18. Installation Restoration Program. Feasibility Study for Sites 1, 3, 5, 6, 7, 8, and 9. Michigan Air National Guard. Alpena Combat Readiness Training Center, Alpena, Michigan.

    DTIC Science & Technology

    1996-04-01

This study is to screen and evaluate remediation alternatives for IRP sites at Alpena CRTC, Alpena, MI. The purpose of this FS is to develop, screen... Alpena CRTC. The first step in the FS process is to identify the RAOs and general response actions (GRAs). RAOs consist of medium-specific or site

  19. RESEARCH FOR MANAGING URBAN WATERSHED MICROBIAL CONTAMINATION (PROJECT 1: MANAGING URBAN WATERSHED PATHOGEN CONTAMINATION: 2. EFFECT OF LAND USE AND SEASON ON MICROORGANISM CONCENTRATION ON URBAN STORMWATER RUNOFF; 3. MICROORGANISM DIE-OFF RATES UNDER VARIOUS CONDITIONS.

    EPA Science Inventory

    The Water Supply and Water Resources Division (WSWRD) developed a document entitled Managing Urban Watershed Pathogen Contamination (EPA 600/R-03/111). This document provides information to support specific steps of the total maximum daily load (TMDL) process for meeting water q...

  20. Subgrouping Chronic Fatigue Syndrome Patients By Genetic and Immune Profiling

    DTIC Science & Technology

    2015-12-01

participant inclusion was also verified against our master demographic file. This process revealed that only a small percentage of participants (... ) ... a cubic B-spline basis on three knots, ... the value of the outcome for the batch control, and ... the residual ... tests. Specifically, p-value adjustments will employ an adaptive two-stage linear step-up procedure to control the FDR at 5% (Benjamini et al. 2006
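The FDR procedure cited above belongs to the Benjamini-Hochberg family of linear step-up procedures. As a sketch, here is the classic (non-adaptive) linear step-up; the adaptive two-stage variant of Benjamini, Krieger, and Yekutieli (2006) additionally estimates the proportion of true null hypotheses in a first stage and rescales the level, which is omitted here:

```python
import numpy as np

def bh_stepup(pvals, q=0.05):
    """Classic Benjamini-Hochberg linear step-up: return a boolean
    rejection mask controlling the FDR at level q."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresholds = q * np.arange(1, m + 1) / m   # q * i/m for rank i
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])       # largest rank meeting its threshold
        reject[order[: k + 1]] = True          # reject all hypotheses up to rank k
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.20]
print(bh_stepup(pvals, q=0.05))  # rejects the two smallest p-values
```

Note the step-up character: once the largest rank k satisfying p(k) ≤ qk/m is found, all smaller p-values are rejected too, even those that individually missed their own thresholds.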

  1. Effect of production management on semen quality during long-term storage in different European boar studs.

    PubMed

    Schulze, M; Kuster, C; Schäfer, J; Jung, M; Grossfeld, R

    2018-03-01

The processing of ejaculates is a fundamental step for the fertilizing capacity of boar spermatozoa. The aim of the present study was to identify factors that affect the quality of boar semen doses. The production process during 1 day of semen processing in 26 European boar studs was monitored. In each boar stud, nine to 19 randomly selected ejaculates from 372 Pietrain boars were analyzed for sperm motility, acrosome and plasma membrane integrity, mitochondrial activity and thermo-resistance (TRT). Each ejaculate was monitored for production time and temperature at each step in semen processing using the specially programmed software SEQU (version 1.7, Minitüb, Tiefenbach, Germany). The dilution of ejaculates with a short-term extender was completed in one step in 10 AI centers (n = 135 ejaculates), in two steps in 11 AI centers (n = 158 ejaculates) and in three steps in five AI centers (n = 79 ejaculates). Results indicated greater semen quality with one-step isothermal dilution than with multi-step dilution of AI semen doses (total motility TRT d7: 71.1 ± 19.2%, 64.6 ± 20.0%, 47.1 ± 27.1% for one-step, two-step and three-step dilution, respectively; P < .05). There was a marked advantage of one-step isothermal dilution in terms of time management, preservation suitability, stability and stress resistance. One-step dilution resulted in significantly lower holding times for raw ejaculates and reduced the risk of errors owing to the smaller number of processing steps. These results lead to refined recommendations for boar semen processing. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Somatic Coliphage Profiles of Produce and Environmental Samples from Farms in Northern México.

    PubMed

    Bartz, Faith E; Hodge, Domonique Watson; Heredia, Norma; de Aceituno, Anna Fabiszewski; Solís, Luisa; Jaykus, Lee-Ann; Garcia, Santos; Leon, Juan S

    2016-09-01

    Somatic coliphages were quantified in 459 produce and environmental samples from 11 farms in Northern Mexico to compare amounts of somatic coliphages among different types of fresh produce and environmental samples across the production steps on farms. Rinsates from cantaloupe melons, jalapeño peppers, tomatoes, and the hands of workers, soil, and water were collected during 2011-2012 at four successive steps on each farm, from the field before harvest through the packing facility, and assayed by FastPhage MPN Quanti-tray method. Cantaloupe farm samples contained more coliphages than jalapeño or tomato (p range <0.01-0.03). Across production steps, jalapeños had higher coliphage percentages before harvest than during packing (p = 0.03), while tomatoes had higher coliphage concentrations at packing than all preceding production steps (p range <0.01-0.02). These findings support the use of targeted produce-specific interventions at multiple points in the process of growing and packing produce to reduce the risk of enteric virus contamination and improve food safety during fruit and vegetable production.

  3. Mobile magnetic particles as solid-supports for rapid surface-based bioanalysis in continuous flow.

    PubMed

    Peyman, Sally A; Iles, Alexander; Pamme, Nicole

    2009-11-07

    An extremely versatile microfluidic device is demonstrated in which multi-step (bio)chemical procedures can be performed in continuous flow. The system operates by generating several co-laminar flow streams, which contain reagents for specific (bio)reactions across a rectangular reaction chamber. Functionalized magnetic microparticles are employed as mobile solid-supports and are pulled from one side of the reaction chamber to the other by use of an external magnetic field. As the particles traverse the co-laminar reagent streams, binding and washing steps are performed on their surface in one operation in continuous flow. The applicability of the platform was first demonstrated by performing a proof-of-principle binding assay between streptavidin coated magnetic particles and biotin in free solution with a limit of detection of 20 ng mL(-1) of free biotin. The system was then applied to a mouse IgG sandwich immunoassay as a first example of a process involving two binding steps and two washing steps, all performed within 60 s, a fraction of the time required for conventional testing.

  4. Individualizing drug dosage with longitudinal data.

    PubMed

    Zhu, Xiaolu; Qu, Annie

    2016-10-30

We propose a two-step procedure to personalize drug dosage over time under the framework of a log-linear mixed-effect model. We model patients' heterogeneity using subject-specific random effects, which are treated as the realizations of an unspecified stochastic process. We extend the conditional quadratic inference function to estimate both fixed-effect coefficients and individual random effects on a longitudinal training data sample in the first step and propose an adaptive procedure to estimate new patients' random effects and provide dosage recommendations for new patients in the second step. An advantage of our approach is that we do not impose any distribution assumption on estimating random effects. Moreover, the new approach can accommodate more general time-varying covariates corresponding to random effects. We show in theory and numerical studies that the proposed method is more efficient compared with existing approaches, especially when covariates are time varying. In addition, a real data example of a clozapine study confirms that our two-step procedure leads to more accurate drug dosage recommendations. Copyright © 2016 John Wiley & Sons, Ltd.
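The two-step logic can be sketched under strong simplifying assumptions. This is not the paper's conditional-quadratic-inference-function estimator; it is an ordinary least-squares stand-in for a log-linear model log(response) = u_i + b1·log(dose) with subject-specific intercepts u_i, on synthetic data:

```python
import numpy as np

# Step 1 (training): fit the slope and subject intercepts jointly.
def fit_training(doses, responses, subjects):
    """Least squares with one dummy intercept per training subject."""
    subjects = np.asarray(subjects)
    ids = np.unique(subjects)
    X = np.column_stack([np.log(doses)] + [(subjects == s).astype(float) for s in ids])
    coef, *_ = np.linalg.lstsq(X, np.log(responses), rcond=None)
    return coef[0], dict(zip(ids, coef[1:]))   # slope b1, intercepts u_i

# Step 2 (new patient): estimate u from the patient's own observations,
# then invert the model for the dose reaching a target response.
def recommend_dose(b1, new_doses, new_responses, target):
    u_new = np.mean(np.log(new_responses) - b1 * np.log(new_doses))
    return float(np.exp((np.log(target) - u_new) / b1))

# Synthetic usage: two training subjects with true b1 = 0.5.
b1, intercepts = fit_training([1, 4, 1, 4],
                              [np.e, 2 * np.e, np.e**2, 2 * np.e**2],
                              ["A", "A", "B", "B"])
dose = recommend_dose(b1, [2.0], [np.exp(1.5) * np.sqrt(2.0)],
                      target=np.exp(1.5) * 3)
print(round(dose, 2))  # ≈ 9.0 for this synthetic data
```

The separation mirrors the paper's structure: fixed effects come from the training sample once, while each new patient contributes only a cheap update for their own random effect.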

  5. [Gradient elevation of temperature startup experiment of thermophilic ASBR treating thermal-hydrolyzed sewage sludge].

    PubMed

    Ouyang, Er-Ming; Wang, Wei; Long, Neng; Li, Huai

    2009-04-15

A startup experiment was conducted for a thermophilic anaerobic sequencing batch reactor (ASBR) treating thermal-hydrolyzed sewage sludge using a strategy of step-wise temperature increments: 35 °C → 40 °C → 47 °C → 53 °C. The results showed that the first step increase (from 35 °C to 40 °C) and the final step increase (from 47 °C to 53 °C) had only a slight effect on the digestion process. The second step increase (from 40 °C to 47 °C) resulted in a severe disturbance: biogas production, methane content, effluent COD and the microbial community were all strongly disturbed. At the steady stage of the thermophilic ASBR treating thermal-hydrolyzed sewage sludge, the average daily gas production, methane content, specific methane production (CH4/CODinfluent), TCOD removal rate and SCOD removal rate were 2.038 L/d, 72.0%, 188.8 mL/g, 63.8% and 83.3%, respectively. The results of SEM and DGGE indicated that the dominant species differed markedly between the early and steady stages.

  6. [Quality indicators for National Disease Management Guidelines using the example of the National Disease Management Guideline for "Chronic Heart Failure"].

    PubMed

    Nothacker, Monika Judith; Langer, Thomas; Weinbrenner, Susanne

    2011-01-01

    Together with an expert committee a structured approach to determining quality indicators for National Disease Management Guidelines has been developed. The key steps of this approach include: introducing guideline authors to the methodology at an early stage of the process of guideline development, pre-selecting recommendations of the guideline which are potentially measurable by means of quality indicators, assessing the potentially measurable quality indicators in written form using five criteria (including their importance for the health care system and clarity of definitions) and approving them in a formal consensus process. For lack of a database these quality indicators must be regarded as preliminary. For the National Disease Management Guideline "Chronic Heart Failure" nine rate-based indicators have been chosen. The indicators correspond to important strong recommendations (grade of recommendation: A) from the fields of diagnosis (two), general therapeutic strategy (two), specific treatment (three), clinical monitoring (one) and co-ordination of care (one). In a second step, the quality indicators have to be validated within a pilot project. The determination and assessment of the potential quality indicators have revealed room for improvement of guideline development. In particular, there is a need for more health care data and for specification of recommendations.

  7. Laser-induced patterns on metals and polymers for biomimetic surface engineering

    NASA Astrophysics Data System (ADS)

    Kietzig, Anne-Marie; Lehr, Jorge; Matus, Luke; Liang, Fang

    2014-03-01

One common feature of many functional surfaces found in nature is their modular composition, often exhibiting several length scales. Prominent natural examples of extreme behaviors include various plant leaves (rose, peanut, lotus) and animal toe surfaces (gecko, tree frog). Factors of interest are the surface's chemical composition, its microstructure, its organized or random roughness and, hence, the resulting surface wetting and adhesion character. Femtosecond (fs) laser micromachining offers the possibility to render all these factors in one single processing step on metallic and polymeric surfaces. As examples, studies on titanium and PTFE are shown, in which the dependence of the resulting feature sizes on lasing intensity is investigated. While Ti surfaces show rigid surface patterns of micrometer-scaled features with superimposed nanostructures, PTFE exhibits elastic hairy structures of nanometric diameter which, above a certain threshold, tend to bundle into larger features. Both surface patterns can be adjusted to mimic specific wetting and flow behaviour as seen in natural examples. Therefore, fs-laser micromachining is suggested as an industrially scalable technique to pattern and fine-tune the wettability of a surface to the desired extent in one process step. Possible applications can be seen for surfaces that require specific wetting, fouling, icing, friction or cell adhesion behaviour.

  8. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capture, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and, over a period of time, may cause serious damage to many of the body's systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation, which was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  9. Specificity control for read alignments using an artificial reference genome-guided false discovery rate.

    PubMed

    Giese, Sven H; Zickmann, Franziska; Renard, Bernhard Y

    2014-01-01

    Accurate estimation, comparison and evaluation of read mapping error rates is a crucial step in the processing of next-generation sequencing data, as further analysis steps and interpretation assume the correctness of the mapping results. Current approaches are either focused on sensitivity estimation and thereby disregard specificity or are based on read simulations. Although continuously improving, read simulations are still prone to introduce a bias into the mapping error quantitation and cannot capture all characteristics of an individual dataset. We introduce ARDEN (artificial reference driven estimation of false positives in next-generation sequencing data), a novel benchmark method that estimates error rates of read mappers based on real experimental reads, using an additionally generated artificial reference genome. It allows a dataset-specific computation of error rates and the construction of a receiver operating characteristic curve. Thereby, it can be used for optimization of parameters for read mappers, selection of read mappers for a specific problem or for filtering alignments based on quality estimation. The use of ARDEN is demonstrated in a general read mapper comparison, a parameter optimization for one read mapper and an application example in single-nucleotide polymorphism discovery with a significant reduction in the number of false positive identifications. The ARDEN source code is freely available at http://sourceforge.net/projects/arden/.
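The idea of estimating false positives from an artificial reference can be sketched in a simplified, decoy-style form. This is not ARDEN's actual algorithm; the scores and the acceptance rule below are illustrative assumptions:

```python
# Simplified sketch: reads are aligned against both the real reference
# and an artificially generated one. Hits on the artificial reference
# at or above a score cutoff serve as an estimate of the number of
# false positives among the accepted real-reference alignments.

def estimated_fdr(real_scores, artificial_scores, cutoff):
    """real_scores / artificial_scores: best alignment score per read
    against each reference; higher is better."""
    accepted = [s for s in real_scores if s >= cutoff]
    decoy_hits = [s for s in artificial_scores if s >= cutoff]
    return len(decoy_hits) / max(len(accepted), 1)

real = [40, 35, 30, 28, 10]
artificial = [31, 12]
print(round(estimated_fdr(real, artificial, cutoff=30), 3))  # -> 0.333
```

Sweeping the cutoff over the observed score range yields the operating points from which a receiver operating characteristic curve, and a dataset-specific choice of mapping parameters, can be derived.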

  10. FRAMEWORK FOR ASSESSING RISKS OF ...

    EPA Pesticide Factsheets

    The Framework for Children's Health Risk Assessment report can serve as a resource on children's health risk assessment and it addresses the need to provide a comprehensive and consistent framework for considering children in risk assessments at EPA. This framework lays out the process, points to existing published sources for more detailed information on life stage-specific considerations, and includes web links to specific online publications and relevant Agency science policy papers, guidelines and guidance. The document emphasizes the need to take into account the potential exposures to environmental agents during preconception and all stages of development and focuses on the relevant adverse health outcomes that may occur as a result of such exposures. This framework is not an Agency guideline, but rather describes the overall structure and the components considered important for children's health risk assessment. The document describes an approach that includes problem formulation, analysis, and risk characterization, and also builds on Agency experience assessing risk to susceptible populations. The problem formulation step focuses on the life stage-specific nature of the analysis to include scoping and screening level questions for hazard characterization, dose response and exposure assessment. The risk characterization step recognizes the need to consider life stage-specific risks and explicitly describes the uncertainties and variability in the d

  11. Regulation of neuronal axon specification by glia-neuron gap junctions in C. elegans.

    PubMed

    Meng, Lingfeng; Zhang, Albert; Jin, Yishi; Yan, Dong

    2016-10-21

    Axon specification is a critical step in neuronal development, and the function of glial cells in this process is not fully understood. Here, we show that C. elegans GLR glial cells regulate axon specification of their nearby GABAergic RME neurons through GLR-RME gap junctions. Disruption of GLR-RME gap junctions causes misaccumulation of axonal markers in non-axonal neurites of RME neurons and converts microtubules in those neurites to form an axon-like assembly. We further uncover that GLR-RME gap junctions regulate RME axon specification through activation of the CDK-5 pathway in a calcium-dependent manner, involving the calpain clp-4. Therefore, our study reveals the function of glia-neuron gap junctions in neuronal axon specification and shows that calcium originating from glial cells can regulate neuronal intracellular pathways through gap junctions.

  12. Modelling uveal melanoma

    PubMed Central

    Foss, A.; Cree, I.; Dolin, P.; Hungerford, J.

    1999-01-01

    BACKGROUND/AIM—There has been no consistent pattern reported on how mortality for uveal melanoma varies with age. This information can be useful to model the complexity of the disease. The authors have examined ocular cancer trends, as an indirect measure for uveal melanoma mortality, to see how rates vary with age and to compare the results with their other studies on predicting metastatic disease.
METHODS—Age specific mortality was examined for England and Wales, the USA, and Canada. A log-log model was fitted to the data. The slopes of the log-log plots were used as a measure of disease complexity and compared with the results of previous work on predicting metastatic disease.
RESULTS—The log-log model provided a good fit for the US and Canadian data, but the observed rates deviated for England and Wales among people over the age of 65 years. The log-log model for mortality data suggests that the underlying process depends upon four rate limiting steps, while a similar model for the incidence data suggests between three and four rate limiting steps. Further analysis of previous data on predicting metastatic disease on the basis of tumour size and blood vessel density would indicate a single rate limiting step between developing the primary tumour and developing metastatic disease.
CONCLUSIONS—There is significant underreporting or underdiagnosis of ocular melanoma for England and Wales in those over the age of 65 years. In those under the age of 65, a model is presented for ocular melanoma oncogenesis requiring three rate limiting steps to develop the primary tumour and a fourth rate limiting step to develop metastatic disease. The three steps in the generation of the primary tumour involve two key processes—namely, growth and angiogenesis within the primary tumour. The step from development of the primary to development of metastatic disease is likely to involve a single rate limiting process.

 PMID:10216060
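The log-log fit underlying the abstract above can be sketched directly: in a multistage model, a process with k rate-limiting steps gives log(rate) rising roughly linearly with log(age) with slope k − 1, so the fitted slope estimates the number of steps. The rates below are synthetic, generated from an exact power law for illustration; they are not the England/Wales or North American data.

```python
# Sketch: estimating the number of rate-limiting steps from the slope
# of a log-log plot of mortality rate against age.
import math

def loglog_slope(ages, rates):
    """Least-squares slope of log(rate) against log(age)."""
    xs = [math.log(a) for a in ages]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

ages = [35, 45, 55, 65, 75]
rates = [1e-6 * a ** 3 for a in ages]  # exact power law, exponent 3
k = loglog_slope(ages, rates) + 1      # implied number of rate-limiting steps
print(round(k))  # 4
```

On real registry data the deviation of observed points from this line (as seen for England and Wales over age 65) is itself informative, flagging underreporting or underdiagnosis.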

  13. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, which is the first step of risk management and is designed to find out what the problems are; it comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. Risk management itself is not covered in this paper, and ideally a fourth step would address the need for feedback and for post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are specified as follows: 1. Risk identification: This step involves drought quantification and monitoring based on remotely sensed RDI and extraction of several features such as severity, duration, areal extent, onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: This step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: This step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). 
The results of this three-step drought risk assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece. Moreover, remote sensing has proven very effective in delineating spatial variability and features in drought monitoring and assessment.
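The RDI computation behind this record can be sketched as follows, assuming the usual formulation: alpha = sum(P)/sum(PET) over the reference period, y = ln(alpha), and the standardized index RDIst = (y − mean(y)) / std(y) across years. The input values below are illustrative, not the Thessaly dataset.

```python
# Sketch: standardized Reconnaissance Drought Index (RDI) across years.
import math

def rdi_standardized(precip_by_year, pet_by_year):
    """precip_by_year, pet_by_year: per-year totals (same length, same units)."""
    alphas = [p / e for p, e in zip(precip_by_year, pet_by_year)]
    ys = [math.log(a) for a in alphas]
    mean = sum(ys) / len(ys)
    std = math.sqrt(sum((y - mean) ** 2 for y in ys) / len(ys))
    return [(y - mean) / std for y in ys]

precip = [480, 520, 350, 610, 400]   # mm per hydrological year
pet = [900, 880, 950, 870, 940]      # mm per hydrological year
rdi = rdi_standardized(precip, pet)
# Negative values flag drier-than-average years; the third year is driest here.
print(min(range(5), key=lambda i: rdi[i]))  # 2
```

In the remotely sensed version described above, the precipitation and PET inputs are derived per pixel from NOAA-AVHRR data, so the same standardization yields drought severity maps rather than a single series.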

  14. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
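The nine-step sequence above can be sketched as an ordered initialization routine with barrier-like synchronization points. The classes and method names below are invented for illustration; a real DSES federate would invoke the corresponding HLA RTI federation-management services instead.

```python
# Hypothetical sketch of the nine-step federate initialization sequence.

class Federation:
    def __init__(self, expected):
        self.expected = expected   # number of federates that must join
        self.joined = []
        self.sync_points = set()

    def join(self, name):
        self.joined.append(name)

class Federate:
    def __init__(self, name, federation):
        self.name = name
        self.fed = federation
        self.log = []

    def run_initialization(self):
        steps = [
            ("create_federation", lambda: self.fed.join(self.name)),
            ("publish_subscribe", lambda: None),
            ("create_object_instances", lambda: None),
            ("confirm_all_joined", self._wait_for_all),
            ("achieve_initialize_sync", lambda: self.fed.sync_points.add("initialize")),
            ("update_initial_data", lambda: None),
            ("wait_for_reflections", lambda: None),
            ("setup_time_management", lambda: None),
            ("achieve_startup_sync", lambda: self.fed.sync_points.add("startup")),
        ]
        for step_name, action in steps:   # steps must run strictly in order
            action()
            self.log.append(step_name)

    def _wait_for_all(self):
        assert len(self.fed.joined) == self.fed.expected, "not all federates joined"

fed = Federation(expected=1)
f = Federate("dses_driver", fed)
f.run_initialization()
print(f.log[0], f.log[-1])  # create_federation achieve_startup_sync
```

The key design point the slides emphasize survives even in this toy form: no federate proceeds past the "initialize" and "startup" synchronization points until every participant has reached them.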

  15. Teaching ethics to engineers: ethical decision making parallels the engineering design process.

    PubMed

    Bero, Bridget; Kuhlman, Alana

    2011-09-01

    In order to fulfill ABET requirements, Northern Arizona University's Civil and Environmental engineering programs incorporate professional ethics in several of its engineering courses. This paper discusses an ethics module in a 3rd year engineering design course that focuses on the design process and technical writing. Engineering students early in their student careers generally possess good black/white critical thinking skills on technical issues. Engineering design is the first time students are exposed to "grey" or multiple possible solution technical problems. To identify and solve these problems, the engineering design process is used. Ethical problems are also "grey" problems and present similar challenges to students. Students need a practical tool for solving these ethical problems. The step-wise engineering design process was used as a model to demonstrate a similar process for ethical situations. The ethical decision making process of Martin and Schinzinger was adapted for parallelism to the design process and presented to students as a step-wise technique for identification of the pertinent ethical issues, relevant moral theories, possible outcomes and a final decision. Students had greatest difficulty identifying the broader, global issues presented in an ethical situation, but by the end of the module, were better able to not only identify the broader issues, but also to more comprehensively assess specific issues, generate solutions and a desired response to the issue.

  16. Cognitive representation of "musical fractals": Processing hierarchy and recursion in the auditory domain.

    PubMed

    Martins, Mauricio Dias; Gingras, Bruno; Puig-Waldmueller, Estela; Fitch, W Tecumseh

    2017-04-01

    The human ability to process hierarchical structures has been a longstanding research topic. However, the nature of the cognitive machinery underlying this faculty remains controversial. Recursion, the ability to embed structures within structures of the same kind, has been proposed as a key component of our ability to parse and generate complex hierarchies. Here, we investigated the cognitive representation of both recursive and iterative processes in the auditory domain. The experiment used a two-alternative forced-choice paradigm: participants were exposed to three-step processes in which pure-tone sequences were built either through recursive or iterative processes, and had to choose the correct completion. Foils were constructed according to generative processes that did not match the previous steps. Both musicians and non-musicians were able to represent recursion in the auditory domain, although musicians performed better. We also observed that general 'musical' aptitudes played a role in both recursion and iteration, although the influence of musical training was somewhat independent from melodic memory. Moreover, unlike iteration, recursion in audition was well correlated with its non-auditory (recursive) analogues in the visual and action sequencing domains. These results suggest that the cognitive machinery involved in establishing recursive representations is domain-general, even though this machinery requires access to information resulting from domain-specific processes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Two-Step Plasma Process for Cleaning Indium Bonding Bumps

    NASA Technical Reports Server (NTRS)

    Greer, Harold F.; Vasquez, Richard P.; Jones, Todd J.; Hoenk, Michael E.; Dickie, Matthew R.; Nikzad, Shouleh

    2009-01-01

    A two-step plasma process has been developed as a means of removing surface oxide layers from indium bumps used in flip-chip hybridization (bump bonding) of integrated circuits. The two-step plasma process makes it possible to remove surface indium oxide, without incurring the adverse effects of the acid etching process.

  18. Functionalized Anodic Aluminum Oxide Membrane–Electrode System for Enzyme Immobilization

    PubMed Central

    2015-01-01

    A nanoporous membrane system with directed flow carrying reagents to sequentially attached enzymes to mimic nature’s enzyme complex system was demonstrated. Genetically modified glycosylation enzyme, OleD Loki variant, was immobilized onto nanometer-scale electrodes at the pore entrances/exits of anodic aluminum oxide membranes through His6-tag affinity binding. The enzyme activity was assessed in two reactions—a one-step “reverse” sugar nucleotide formation reaction (UDP-Glc) and a two-step sequential sugar nucleotide formation and sugar nucleotide-based glycosylation reaction. For the one-step reaction, enzyme specific activity of 6–20 min⁻¹ on membrane supports was seen to be comparable to solution enzyme specific activity of 10 min⁻¹. UDP-Glc production efficiencies as high as 98% were observed at a flow rate of 0.5 mL/min, at which the substrate residence time over the electrode length down pore entrances was matched to the enzyme activity rate. This flow geometry also prevented an unwanted secondary product hydrolysis reaction, as observed in the test homogeneous solution. Enzyme utilization increased by a factor of 280 compared to test homogeneous conditions due to the continuous flow of fresh substrate over the enzyme. To mimic enzyme complex systems, a two-step sequential reaction using OleD Loki enzyme was performed at membrane pore entrances then exits. After UDP-Glc formation at the entrance electrode, aglycon 4-methylumbelliferone was supplied at the exit face of the reactor, affording overall 80% glycosylation efficiency. The membrane platform showed the ability to be regenerated with purified enzyme as well as directly from expression crude, thus demonstrating a single-step immobilization and purification process. PMID:25025628

  19. Integrated conversion of food waste diluted with sewage into volatile fatty acids through fermentation and electricity through a fuel cell.

    PubMed

    Pant, Deepak; Arslan, Doga; Van Bogaert, Gilbert; Gallego, Yolanda Alvarez; De Wever, Heleen; Diels, Ludo; Vanbroekhoven, Karolien

    2013-01-01

    In this study, domestic wastewater was given a second life as dilution medium for concentrated organic waste streams, in particular artificial food waste. A two-step continuous process with first volatile fatty acid (VFA)/hydrogen production and second electricity production in microbial fuel cells (MFCs) was employed. For primary treatment, bioreactors were optimized to produce hydrogen and VFAs. Hydrolysis of the solids and formation of fermentation products and hydrogen was monitored. In the second step, MFCs were operated batch-wise using the effluent rich in VFAs, specifically acetic acid, from the continuous reactor of the first step. The combined system was able to reduce the chemical oxygen demand load by 90%. The concentration of VFAs was also monitored regularly in the MFCs and showed a decreasing trend over time. Further, the anode potential changed from -500 to 0 mV vs. Ag/AgCl when the VFAs (especially acetate) were depleted in the system. On feeding the system again with the effluent, the anode potential recovered back to -500 mV vs. Ag/AgCl. Thus, the overall aim of converting chemical energy into electrical energy was achieved with a coulombic efficiency of 46% generating 65.33 mA/m² at a specific cell potential of 148 mV.
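The coulombic-efficiency figure reported above is conventionally computed for batch-fed MFCs as CE = (M_O2 · q) / (F · b · V_an · ΔCOD), where q is the total charge harvested, b = 4 electrons are exchanged per mole of O2 and M_O2 = 32 g/mol. The sketch below uses illustrative numbers, not values from this study.

```python
# Sketch: coulombic efficiency of a batch-fed microbial fuel cell.
F = 96485.0        # Faraday constant, C/mol
M_O2, b = 32.0, 4  # g/mol, and electrons exchanged per mole of O2

def coulombic_efficiency(charge_c, anode_volume_l, delta_cod_g_per_l):
    """Fraction of COD-derived electrons recovered as current.

    charge_c: total charge passed over the batch, in coulombs
    anode_volume_l: anode chamber liquid volume, in litres
    delta_cod_g_per_l: COD removed over the batch, in g/L
    """
    electrons_as_cod = F * b * anode_volume_l * delta_cod_g_per_l / M_O2
    return charge_c / electrons_as_cod

# Illustrative batch: 0.1 L anode, 1.2 g/L COD removed, 665 C harvested.
ce = coulombic_efficiency(665.0, 0.1, 1.2)
print(f"{ce:.0%}")  # 46%
```

The gap between COD removal (90%) and coulombic efficiency (46%) reflects electrons lost to competing sinks such as biomass growth and aerobic oxidation rather than to the circuit.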

  20. The next step towards more equity in health in Sweden: how can we close the gap in a generation?

    PubMed

    Lundberg, Olle

    2018-06-01

    In 2015, a national Commission for Equity in Health was appointed by the Swedish Government. In this paper, some key lines of thought from the three reports published by the Commission are summarised. First, the theories and principles for the Commission's work are outlined, in particular regarding the views taken on how health inequalities arise. Second, the importance of process is discussed in relation to cross-sectorial efforts to reduce inequalities in health. More specifically, this brings up some of the proposals made for how to redesign the public health policy framework for cross-sectorial work. Third, the proposed content of cross-sectorial work for more equal health is presented in three steps, namely: (1) overarching recommendations, (2) more equal conditions and opportunities, and (3) general problems of governance. Regarding people's conditions and opportunities, the Commission submitted a number of proposals for the general direction of work that needs to be taken in order to reduce health inequalities, as well as some examples of more specific policy changes or reforms on the basis of each of these general directions, which are summarised here. Finally, some challenges and difficulties that may prevent Sweden from taking the next step towards more equity in health are discussed.

  1. A Semi-Empirical Two Step Carbon Corrosion Reaction Model in PEM Fuel Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Alan; Colbow, Vesna; Harvey, David

    2013-01-01

    The cathode CL of a polymer electrolyte membrane fuel cell (PEMFC) was exposed to high potentials, 1.0 to 1.4 V versus a reversible hydrogen electrode (RHE), that are typically encountered during start up/shut down operation. While both platinum dissolution and carbon corrosion occurred, the carbon corrosion effects were isolated and modeled. The presented model separates the carbon corrosion process into two reaction steps; (1) oxidation of the carbon surface to carbon-oxygen groups, and (2) further corrosion of the oxidized surface to carbon dioxide/monoxide. To oxidize and corrode the cathode catalyst carbon support, the CL was subjected to an accelerated stress test that cycled the potential from 0.6 VRHE to an upper potential limit (UPL) ranging from 0.9 to 1.4 VRHE at varying dwell times. The reaction rate constants and specific capacitances of carbon and platinum were fitted by evaluating the double layer capacitance (Cdl) trends. Carbon surface oxidation increased the Cdl due to increased specific capacitance for carbon surfaces with carbon-oxygen groups, while the second corrosion reaction decreased the Cdl due to loss of the overall carbon surface area. The first oxidation step differed between carbon types, while both reaction rate constants were found to have a dependency on UPL, temperature, and gas relative humidity.

  2. Kinematic, muscular, and metabolic responses during exoskeletal-, elliptical-, or therapist-assisted stepping in people with incomplete spinal cord injury.

    PubMed

    Hornby, T George; Kinnaird, Catherine R; Holleran, Carey L; Rafferty, Miriam R; Rodriguez, Kelly S; Cain, Julie B

    2012-10-01

    Robotic-assisted locomotor training has demonstrated some efficacy in individuals with neurological injury and is slowly gaining clinical acceptance. Both exoskeletal devices, which control individual joint movements, and elliptical devices, which control endpoint trajectories, have been utilized with specific patient populations and are available commercially. No studies have directly compared training efficacy or patient performance during stepping between devices. The purpose of this study was to evaluate kinematic, electromyographic (EMG), and metabolic responses during elliptical- and exoskeletal-assisted stepping in individuals with incomplete spinal cord injury (SCI) compared with therapist-assisted stepping. A prospective, cross-sectional, repeated-measures design was used. Participants with incomplete SCI (n=11) performed 3 separate bouts of exoskeletal-, elliptical-, or therapist-assisted stepping. Unilateral hip and knee sagittal-plane kinematics, lower-limb EMG recordings, and oxygen consumption were compared across stepping conditions and with control participants (n=10) during treadmill stepping. Exoskeletal stepping kinematics closely approximated normal gait patterns, whereas significantly greater hip and knee flexion postures were observed during elliptical-assisted stepping. Measures of kinematic variability indicated consistent patterns in control participants and during exoskeletal-assisted stepping, whereas therapist- and elliptical-assisted stepping kinematics were more variable. Despite specific differences, EMG patterns generally were similar across stepping conditions in the participants with SCI. In contrast, oxygen consumption was consistently greater during therapist-assisted stepping. Limitations included a small sample size, lack of ability to evaluate kinetics during stepping, unilateral EMG recordings, and sagittal-plane kinematics. 
Despite specific differences in kinematics and EMG activity, metabolic activity was similar during stepping in each robotic device. Understanding potential differences and similarities in stepping performance with robotic assistance may be important in delivery of repeated locomotor training using robotic or therapist assistance and for consumers of robotic devices.

  3. SU-D-BRC-02: Application of Six Sigma Approach to Improve the Efficiency of Patient-Specific QA in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LAH, J; Shin, D; Manger, R

    Purpose: To show how the Six Sigma DMAIC (Define-Measure-Analyze-Improve-Control) can be used for improving and optimizing the efficiency of patient-specific QA process by designing site-specific range tolerances. Methods: The Six Sigma tools (process flow diagram, cause and effect, capability analysis, Pareto chart, and control chart) were utilized to determine the steps that need focus for improving the patient-specific QA process. The patient-specific range QA plans were selected according to 7 treatment site groups, a total of 1437 cases. The process capability index, Cpm, was used to guide the tolerance design of patient site-specific range. We also analyzed the financial impact of this project. Results: Our results suggested that the patient range measurements were non-capable at the current tolerance level of ±1 mm in clinical proton plans. The optimized tolerances were calculated for treatment sites. Control charts for the patient QA time were constructed to compare QA time before and after the new tolerances were implemented. It is found that overall processing time was decreased by 24.3% after establishing new site-specific range tolerances. QA failures across the whole proton therapy process could increase total cost by up to 46%. This result can also predict how costs are affected by changes in adopting the tolerance design. Conclusion: We often believe that the quality and performance of proton therapy can easily be improved by merely tightening some or all of its tolerance requirements. This can become costly, however, and it is not necessarily a guarantee of better performance. The tolerance design is not a task to be undertaken without careful thought. The Six Sigma DMAIC can be used to improve the QA process by setting optimized tolerances. When tolerance design is optimized, the quality is reasonably balanced with time and cost demands.
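The capability index named above follows the standard Taguchi definition, Cpm = (USL − LSL) / (6·sqrt(σ² + (μ − T)²)), which penalizes both spread and deviation of the process mean from the target T. The measurement data below are illustrative, not the 1437 QA cases from the abstract.

```python
# Sketch: process capability index Cpm for a toleranced QA measurement.
import math

def cpm(data, lsl, usl, target):
    """Taguchi capability index: penalizes spread and off-target mean."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return (usl - lsl) / (6 * math.sqrt(var + (mu - target) ** 2))

# Range errors in mm against a +/-1 mm tolerance centered on a target of 0.
errors = [0.2, -0.4, 0.6, -0.1, 0.5, -0.7, 0.3, 0.1]
print(round(cpm(errors, -1.0, 1.0, 0.0), 2))
```

A common rule of thumb treats Cpm below about 1.0 as non-capable, which is the situation the abstract reports for the ±1 mm tolerance; widening site-specific tolerances raises the numerator and hence the index.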

  4. Development of a water-jet assisted laser paint removal process

    NASA Astrophysics Data System (ADS)

    Madhukar, Yuvraj K.; Mullick, Suvradip; Nath, Ashish K.

    2013-12-01

    The laser paint removal process usually leaves behind traces of combustion product, i.e. ashes, on the surface. An additional post-processing step such as light brushing or wiping by some mechanical means is required to remove the residual ash. In order to strip out the paint completely from the surface in a single step, a water-jet assisted laser paint removal process has been investigated. The 1.07 μm wavelength of Yb-fiber laser radiation has low absorption in water; therefore a high power fiber laser was used in the experiment. The laser beam was delivered on the paint-surface along with a water jet to remove the paint and residual ashes effectively. The specific energy, defined as the laser energy required to remove a unit volume of paint, was found to be marginally more than that for the gas-jet assisted laser paint removal process. However, complete paint removal was achieved with the water-jet assist only. The relatively higher specific energy in case of water-jet assist is mainly due to the scattering of laser beam in the turbulent flow of water-jet.

  5. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally performed. Results of all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space rather than a given condition for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved. 
Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Lessons Learned for Collaborative Clinical Content Development

    PubMed Central

    Collins, S.A.; Bavuso, K.; Zuccotti, G.; Rocha, R.A.

    2013-01-01

    Background: Site-specific content configuration of vendor-based Electronic Health Records (EHRs) is a vital step in the development of standardized and interoperable content that can be used for clinical decision-support, reporting, care coordination, and information exchange. The multi-site, multi-stakeholder Acute Care Documentation (ACD) project at Partners Healthcare Systems (PHS) aimed to develop highly structured clinical content with adequate breadth and depth to meet the needs of all types of acute care clinicians at two academic medical centers. The Knowledge Management (KM) team at PHS led the informatics and knowledge management effort for the project. Objectives: We aimed to evaluate the role, governance, and project management processes and resources for the KM team’s effort as part of the standardized clinical content creation. Methods: We employed the Centers for Disease Control and Prevention’s six-step Program Evaluation Framework to guide our evaluation steps. We administered a forty-four question, open-ended, semi-structured voluntary survey to gather focused, credible evidence from members of the KM team. Qualitative open-coding was performed to identify themes for lessons learned and concluding recommendations. Results: Six surveys were completed. Qualitative data analysis informed five lessons learned and thirty specific recommendations associated with the lessons learned. The five lessons learned are: 1) Assess and meet knowledge needs and set expectations at the start of the project; 2) Define an accountable decision-making process; 3) Increase team meeting moderation skills; 4) Ensure adequate resources and competency training with online asynchronous collaboration tools; 5) Develop focused, goal-oriented teams and supportive, consultative service based teams. 
Conclusions: Knowledge management requirements for the development of standardized clinical content within a vendor-based EHR among multi-stakeholder teams and sites include: 1) assessing and meeting informatics knowledge needs, 2) setting expectations and standardizing the process for decision-making, and 3) ensuring the availability of adequate resources and competency training. PMID:23874366

  7. Collection and conversion of algal lipid

    NASA Astrophysics Data System (ADS)

    Lin, Ching-Chieh

    Sustainable economic activities mandate a significant replacement of fossil energy by renewable forms. Algae-derived biofuels are increasingly seen as an alternative source of energy with potential to supplement the world's ever increasing demand. Our primary objective is, once the algae are cultivated, to eliminate the energy-intensive processing steps of collection, drying, grinding, and solvent extraction prior to conversion, or to make them more efficient. To overcome the processing barrier, we propose to streamline from cultivated algae to biodiesel via algal biomass collection by sand filtration, cell rupturing with ozone, and immediate transesterification. To collect the algal biomass, the specific Chlorococcum aquaticum suspension was acidified to pH 3.3 to promote agglomeration prior to sand filtration. The algae-loaded filter bed was drained of free water, methanol was added, and the bed was ozonated for 2 min to rupture the cell membranes and accelerate release of the cellular contents. The methanol solution, now containing the dissolved lipid product, was collected by draining, while the filter bed was regenerated by further ozonation when needed. The results showed 95% collection of the algal biomass from the suspension and a 16% yield of lipid from the algae, as well as restoration of filtration velocity of the sand bed via ozonation. The results further showed increased lipid yield upon cell rupturing and transesterified products composed entirely of fatty acid methyl ester (FAME) compounds, demonstrating that the rupture and transesterification processes could proceed consecutively in the same medium, requiring no separate steps of drying, extraction, and conversion. The FAME products from algae without exposure to ozone were mainly of 16 to 18 carbons containing up to 3 double bonds, while those from algae having been ozonated were smaller, highly saturated hydrocarbons. 
The new technique streamlines individual steps from cultivated algal lipid to transesterified products and represents an improvement over existing energy-intensive steps.

  8. Navigating "Assisted Dying".

    PubMed

    Schipper, Harvey

    2016-02-01

    Carter is a bellwether decision, an adjudication on a narrow point of law whose implications are vast across society, and whose impact may not be realized for years. Coupled with Quebec's Act Respecting End-of-life Care it has sharply changed the legal landscape with respect to actively ending a person's life. "Medically assisted dying" will be permitted under circumstances, and through processes, which have yet to be operationally defined. This decision carries with it moral assumptions, which mean that it will be difficult to reach a unifying consensus. For some, the decision and Act reflect a modern acknowledgement of individual autonomy. For others, allowing such acts is morally unspeakable. Having opened the Pandora's Box, the question becomes one of navigating a tolerable societal path. I believe it is possible to achieve a workable solution based on the core principle that "medically assisted dying" should be a very rarely employed last option, subject to transparent ongoing review, specifically as to why it was deemed necessary. My analysis is based on: 1. The societal conditions which have fostered demand for "assisted dying", 2. Actions in other jurisdictions, 3. Carter and Quebec Bill 52, 4. Political considerations, 5. Current medical practice. This leads to a series of recommendations regarding: 1. Legislation and regulation, 2. The role of professional regulatory agencies, 3. Medical professions education and practice, 4. Public education, 5. Health care delivery and palliative care. Given the burden of public opinion, and the legal steps already taken, a process for assisted-dying is required. However, those legal and regulatory steps should only be considered a necessary and defensive first step in a two stage process. The larger goal, the second step, is to drive the improvement of care, and thus minimize assisted-dying.

  9. DSP-Based dual-polarity mass spectrum pattern recognition for bio-detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riot, V; Coffee, K; Gard, E

    2006-04-21

    The Bio-Aerosol Mass Spectrometry (BAMS) instrument analyzes single aerosol particles using a dual-polarity time-of-flight mass spectrometer, recording simultaneously spectra of thirty to a hundred thousand points on each polarity. We describe here a real-time pattern recognition algorithm developed at Lawrence Livermore National Laboratory that has been implemented on a nine Digital Signal Processor (DSP) system from Signatec Incorporated. The algorithm first preprocesses independently the raw time-of-flight data through an adaptive baseline removal routine. The next step consists of a polarity dependent calibration to a mass-to-charge representation, reducing the data to about five hundred to a thousand channels per polarity. The last step is the identification step using a pattern recognition algorithm based on a library of known particle signatures including threat agents and background particles. The identification step includes integrating the two polarities for a final identification determination using a score-based rule tree. This algorithm, operating on multiple channels per polarity and multiple polarities, is well suited for parallel real-time processing. It has been implemented on the PMP8A from Signatec Incorporated, which is a computer-based board that can interface directly to the two one-Giga-Sample digitizers (PDA1000 from Signatec Incorporated) used to record the two polarities of time-of-flight data. By using optimized data separation, pipelining, and parallel processing across the nine DSPs it is possible to achieve a processing speed of up to a thousand particles per second, while maintaining the recognition rate observed on a non-real time implementation. This embedded system has allowed the BAMS technology to improve its throughput and therefore its sensitivity while maintaining a large dynamic range (number of channels and two polarities), thus maintaining the system's specificity for bio-detection.

  10. Chemical Vapor Deposition Of Silicon Carbide

    NASA Technical Reports Server (NTRS)

    Powell, J. Anthony; Larkin, David J.; Matus, Lawrence G.; Petit, Jeremy B.

    1993-01-01

Large single-crystal SiC boules, from which wafers of large area are cut, now being produced commercially. Availability of wafers opens door for development of SiC semiconductor devices. Recently developed chemical vapor deposition (CVD) process produces thin single-crystal SiC films on SiC wafers -- an essential step in the sequence of steps used to fabricate semiconductor devices. Further development required for specific devices. Some potential high-temperature applications include sensors and control electronics for advanced turbine engines and automobile engines, power electronics for electromechanical actuators for advanced aircraft and for space power systems, and equipment used in drilling of deep wells. High-frequency applications include communication systems, high-speed computers, and microwave power transistors. High-radiation applications include sensors and controls for nuclear reactors.

  11. An integrated treatment model for dual diagnosis of psychosis and addiction.

    PubMed

    Minkoff, K

    1989-10-01

    A model that integrates the treatment of patients with a dual diagnosis of psychosis and addiction has been developed on a general hospital psychiatric unit. The model emphasizes the parallels between the standard biopsychosocial illness-and-rehabilitation model for treatment of serious psychiatric disorders and the 12-step disease-and-recovery model of Alcoholics Anonymous for treatment of addiction. Dual-diagnosis patients are viewed as having two primary, chronic, biologic mental illnesses, each requiring specific treatment to stabilize acute symptoms and engage the patient in a recovery process. An integrated treatment program is described, as are the steps taken to alleviate psychiatric clinicians' concerns about patient involvement in AA and addiction clinicians' discomfort with patients' use of medication.

  12. a Standardized Approach to Topographic Data Processing and Workflow Management

    NASA Astrophysics Data System (ADS)

    Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.

    2013-12-01

An ever-increasing list of options exists for collecting high-resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed are lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and the sequence in which they want to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and sharing effective topographic processing workflows.
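The core idea above, storing each tool's name and parameters so the whole chain can be archived and re-run, can be sketched as a tiny workflow runner. The tool names and parameters are hypothetical stand-ins, not the ZCloudTools API:

```python
# Minimal sketch of an archivable topographic-processing workflow: each step
# is a (tool_name, params) pair, applied in order, with a log that records
# exactly what was run. Tools here are toy illustrations.

def decimate(points, keep_every=2):
    """Keep every k-th point of a point cloud."""
    return points[::keep_every]

def filter_low(points, z_min=0.0):
    """Drop points whose elevation (third coordinate) is below z_min."""
    return [p for p in points if p[2] >= z_min]

TOOLS = {"decimate": decimate, "filter_low": filter_low}

def run_workflow(points, steps):
    """Apply each (tool_name, params) step in sequence; return data + a log."""
    log = []
    for name, params in steps:
        points = TOOLS[name](points, **params)
        log.append({"tool": name, "params": params, "n_out": len(points)})
    return points, log

cloud = [(0, 0, -1.0), (1, 0, 2.0), (2, 0, 3.0), (3, 0, 4.0)]
workflow = [("filter_low", {"z_min": 0.0}), ("decimate", {"keep_every": 2})]
result, log = run_workflow(cloud, workflow)
print(result)   # [(1, 0, 2.0), (3, 0, 4.0)]
```

Because `workflow` is plain data, it can be saved, shared, and replayed, which is the reproducibility property the abstract argues for.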

  13. Procurement of a more efficient railroad system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detmold, P.J.

    An analysis of how new technology penetrates the railroad system and who initiates the changes suggests ways for improving the procedures. The author identifies a six-step process in transport development which includes invention, innovation, information dissemination, and persuasion. He concludes that poor economic justification and the lack of capital explain the low rate of equipment replacement. The nature of the market and the need for standardized specification appear to influence procurement changes, but both suppliers and transport operators must be convinced of the need for change. The Advanced Train Control Systems Project illustrates how the initiation process works.

  14. Surgery and Research: A Practical Approach to Managing the Research Process

    PubMed Central

    Swiatek, Peter R.; Chung, Kevin C.; Mahmoudi, Elham

    2016-01-01

    Following a practical project management method is essential in completing a research project on time and within budget. Although this concept is well developed in the business world, it has yet to be explored in academic surgical research. Defining and adhering to a suitable workflow would increase portability, reusability, and therefore, efficiency of the research process. In this article, we briefly review project management techniques. We specifically underline four main steps of project management: (1) definition and organization, (2) planning, (3) execution, and (4) evaluation, using practical examples from our own multidisciplinary plastic surgery research team. PMID:26710037

  15. A process for assessing the feasibility of a network meta-analysis: a case study of everolimus in combination with hormonal therapy versus chemotherapy for advanced breast cancer.

    PubMed

    Cope, Shannon; Zhang, Jie; Saletan, Stephen; Smiechowski, Brielan; Jansen, Jeroen P; Schmid, Peter

    2014-06-05

    The aim of this study is to outline a general process for assessing the feasibility of performing a valid network meta-analysis (NMA) of randomized controlled trials (RCTs) to synthesize direct and indirect evidence for alternative treatments for a specific disease population. Several steps to assess the feasibility of an NMA are proposed based on existing recommendations. Next, a case study is used to illustrate this NMA feasibility assessment process in order to compare everolimus in combination with hormonal therapy to alternative chemotherapies in terms of progression-free survival for women with advanced breast cancer. A general process for assessing the feasibility of an NMA is outlined that incorporates explicit steps to visualize the heterogeneity in terms of treatment and outcome characteristics (Part A) as well as the study and patient characteristics (Part B). Additionally, steps are performed to illustrate differences within and across different types of direct comparisons in terms of baseline risk (Part C) and observed treatment effects (Part D) since there is a risk that the treatment effect modifiers identified may not explain the observed heterogeneity or inconsistency in the results due to unexpected, unreported or unmeasured differences. Depending on the data available, alternative approaches are suggested: list assumptions, perform a meta-regression analysis, subgroup analysis, sensitivity analyses, or summarize why an NMA is not feasible. The process outlined to assess the feasibility of an NMA provides a stepwise framework that will help to ensure that the underlying assumptions are systematically explored and that the risks (and benefits) of pooling and indirectly comparing treatment effects from RCTs for a particular research question are transparent.

  16. Chromatography process development in the quality by design paradigm I: Establishing a high-throughput process development platform as a tool for estimating "characterization space" for an ion exchange chromatography step.

    PubMed

    Bhambure, R; Rathore, A S

    2013-01-01

This article describes the development of a high-throughput process development (HTPD) platform for developing chromatography steps. An assessment of the platform as a tool for establishing the "characterization space" for an ion exchange chromatography step has been performed by using design of experiments. Case studies involving use of a biotech therapeutic, granulocyte colony-stimulating factor, have been used to demonstrate the performance of the platform. We discuss the various challenges that arise when working at such small volumes, along with the solutions that we propose to alleviate them and make the HTPD data suitable for empirical modeling. Further, we have validated the scalability of this platform by comparing the results from the HTPD platform (2 and 6 μL resin volumes) against those obtained at the traditional laboratory scale (resin volume, 0.5 mL). We find that, after integration of the proposed correction factors, the HTPD platform is capable of performing the process optimization studies at 170-fold higher productivity. The platform is capable of providing semi-quantitative assessment of the effects of the various input parameters under consideration. We think that a platform such as the one presented is an excellent tool for examining the "characterization space" and reducing the extensive experimentation at the traditional lab scale that is otherwise required for establishing the "design space." Thus, this platform will specifically aid in successful implementation of quality by design in biotech process development. This is especially significant in view of the constraints with respect to time and resources that the biopharma industry faces today. Copyright © 2013 American Institute of Chemical Engineers.

  17. Two-step Raman spectroscopy method for tumor diagnosis

    NASA Astrophysics Data System (ADS)

    Zakharov, V. P.; Bratchenko, I. A.; Kozlov, S. V.; Moryatov, A. A.; Myakinin, O. O.; Artemyev, D. N.

    2014-05-01

A two-step Raman spectroscopy phase method was proposed for differential diagnosis of malignant tumors in skin and lung tissue. It comprises detection of a malignant tumor in healthy tissue in the first step, with identification of the specific cancer type in the second step. The proposed phase method analyzes spectral intensity alterations in the 1300-1340 and 1640-1680 cm-1 Raman bands relative to the intensity of the 1450 cm-1 band in the first step, and relative differences between RS intensities for the tumor area and healthy skin closely adjacent to the lesion in the second step. More than 40 ex vivo samples of lung tissue and more than 50 in vivo skin tumors were tested. Linear Discriminant Analysis, Quadratic Discriminant Analysis and Support Vector Machine were used for tumor type classification on phase planes. It is shown that the two-step phase method reaches 88.9% sensitivity and 87.8% specificity for malignant melanoma diagnosis (skin cancer); 100% sensitivity and 81.5% specificity for adenocarcinoma diagnosis (lung cancer); and 90.9% sensitivity and 77.8% specificity for squamous cell carcinoma diagnosis (lung cancer).
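The first-step features described above, band intensities normalized by the 1450 cm-1 band, can be sketched directly. The spectrum format, band-integration scheme and reference-band width are illustrative assumptions, not the authors' processing code:

```python
# Sketch of the first-step ratio features: integrate the 1300-1340 and
# 1640-1680 cm^-1 Raman bands and normalize each by the 1450 cm^-1 band.
# A spectrum is modeled as (wavenumber, intensity) pairs; the +/-10 cm^-1
# reference window is an assumption for illustration.

def band_intensity(spectrum, lo, hi):
    """Sum intensities whose wavenumber falls in [lo, hi]."""
    return sum(i for wn, i in spectrum if lo <= wn <= hi)

def phase_features(spectrum):
    ref = band_intensity(spectrum, 1440, 1460)  # 1450 cm^-1 reference band
    return (band_intensity(spectrum, 1300, 1340) / ref,
            band_intensity(spectrum, 1640, 1680) / ref)

# Toy spectrum with three resolved bands.
spec = [(1320, 2.0), (1330, 2.0), (1450, 4.0), (1660, 6.0)]
print(phase_features(spec))  # (1.0, 1.5)
```

These two ratios form the coordinates of the "phase plane" on which the classifiers (LDA, QDA, SVM) mentioned above operate.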

  18. Molecular mechanism of ERK dephosphorylation by striatal-enriched protein tyrosine phosphatase (STEP)

    PubMed Central

    Li, Hui; Li, Kang-shuai; Su, Jing; Chen, Lai-Zhong; Xu, Yun-Fei; Wang, Hong-Mei; Gong, Zheng; Cui, Guo-Ying; Yu, Xiao; Wang, Kai; Yao, Wei; Xin, Tao; Li, Min-Yong; Xiao, Kun-Hong; An, Xiao-fei; Huo, Yuqing; Xu, Zhi-gang; Sun, Jin-Peng; Pang, Qi

    2013-01-01

Striatal-enriched tyrosine phosphatase (STEP) is an important regulator of neuronal synaptic plasticity, and its abnormal level or activity contributes to cognitive disorders. One crucial downstream effector and direct substrate of STEP is extracellular signal-regulated protein kinase (ERK), which has important functions in spine stabilisation and action potential transmission. The inhibition of STEP activity toward phospho-ERK has the potential to treat neuronal diseases, but the detailed mechanism underlying the dephosphorylation of phospho-ERK by STEP is not known. Therefore, we examined STEP activity toward pNPP, phospho-tyrosine-containing peptides, and the full-length phospho-ERK protein using STEP mutants with different structural features. STEP was found to be a highly efficient ERK tyrosine phosphatase that required both its N-terminal regulatory region and key residues in its active site. Specifically, both KIM and KIS of STEP were required for ERK interaction. In addition to the N-terminal KIS region, S245, hydrophobic residues L249/L251, and basic residues R242/R243 located in the KIM region were important in controlling STEP activity toward phospho-ERK. Further kinetic experiments revealed subtle structural differences between STEP and HePTP that affected the interactions of their KIMs with ERK. Moreover, STEP recognised specific positions of a phospho-ERK peptide sequence through its active site, and the contacts of STEP F311 with phospho-ERK V205 and T207 were crucial interactions. Taken together, our results not only provide information on the interactions between ERK and STEP, but will also help in the development of specific strategies to target STEP-ERK recognition, which could serve as a potential therapy for neurological disorders. PMID:24117863

  19. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Emmert-Buck, Michael R

    2005-01-01

Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of pancreatic malignancy and other biological phenomena. This chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification. High-quality tissue microdissection does not necessarily mean high-quality samples to analyze. The quality of biomaterials obtained for analysis is highly dependent on steps upstream and downstream from tissue microdissection. We provide protocols for each of these steps, and encourage you to improve upon these. It is worth the effort of every laboratory to optimize and document its technique at each stage of the process, and we provide a starting point for those willing to spend the time to optimize. In our view, poor documentation of tissue and cell type of origin and the use of nonoptimized protocols are a source of inefficiency in current life science research. Even incremental improvement in this area will increase productivity significantly.

  20. Laser-induced porous graphene films from commercial polymers

    NASA Astrophysics Data System (ADS)

    Lin, Jian; Peng, Zhiwei; Liu, Yuanyue; Ruiz-Zepeda, Francisco; Ye, Ruquan; Samuel, Errol L. G.; Yacaman, Miguel Jose; Yakobson, Boris I.; Tour, James M.

    2014-12-01

    The cost effective synthesis and patterning of carbon nanomaterials is a challenge in electronic and energy storage devices. Here we report a one-step, scalable approach for producing and patterning porous graphene films with three-dimensional networks from commercial polymer films using a CO2 infrared laser. The sp3-carbon atoms are photothermally converted to sp2-carbon atoms by pulsed laser irradiation. The resulting laser-induced graphene (LIG) exhibits high electrical conductivity. The LIG can be readily patterned to interdigitated electrodes for in-plane microsupercapacitors with specific capacitances of >4 mF cm-2 and power densities of ~9 mW cm-2. Theoretical calculations partially suggest that enhanced capacitance may result from LIG’s unusual ultra-polycrystalline lattice of pentagon-heptagon structures. Combined with the advantage of one-step processing of LIG in air from commercial polymer sheets, which would allow the employment of a roll-to-roll manufacturing process, this technique provides a rapid route to polymer-written electronic and energy storage devices.

  1. A design for an intelligent monitor and controller for space station electrical power using parallel distributed problem solving

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.

    1990-01-01

    The emphasis is on defining a set of communicating processes for intelligent spacecraft secondary power distribution and control. The computer hardware and software implementation platform for this work is that of the ADEPTS project at the Johnson Space Center (JSC). The electrical power system design which was used as the basis for this research is that of Space Station Freedom, although the functionality of the processes defined here generalize to any permanent manned space power control application. First, the Space Station Electrical Power Subsystem (EPS) hardware to be monitored is described, followed by a set of scenarios describing typical monitor and control activity. Then, the parallel distributed problem solving approach to knowledge engineering is introduced. There follows a two-step presentation of the intelligent software design for secondary power control. The first step decomposes the problem of monitoring and control into three primary functions. Each of the primary functions is described in detail. Suggestions for refinements and embelishments in design specifications are given.

  2. Laser-induced porous graphene films from commercial polymers

    PubMed Central

    Lin, Jian; Peng, Zhiwei; Liu, Yuanyue; Ruiz-Zepeda, Francisco; Ye, Ruquan; Samuel, Errol L. G.; Yacaman, Miguel Jose; Yakobson, Boris I.; Tour, James M.

    2014-01-01

The cost-effective synthesis and patterning of carbon nanomaterials is a challenge in electronic and energy storage devices. Here we report a one-step, scalable approach for producing and patterning porous graphene films with three-dimensional networks from commercial polymer films using a CO2 infrared laser. The sp3-carbon atoms are photothermally converted to sp2-carbon atoms by pulsed laser irradiation. The resulting laser-induced graphene (LIG) exhibits high electrical conductivity. The LIG can be readily patterned to interdigitated electrodes for in-plane microsupercapacitors with specific capacitances of >4 mF·cm−2 and power densities of ~9 mW·cm−2. Theoretical calculations partially suggest that enhanced capacitance may result from LIG’s unusual ultra-polycrystalline lattice of pentagon-heptagon structures. Combined with the advantage of one-step processing of LIG in air from commercial polymer sheets, which would allow the employment of a roll-to-roll manufacturing process, this technique provides a rapid route to polymer-written electronic and energy storage devices. PMID:25493446

  3. Observing in space and time the ephemeral nucleation of liquid-to-crystal phase transitions.

    PubMed

    Yoo, Byung-Kuk; Kwon, Oh-Hoon; Liu, Haihua; Tang, Jau; Zewail, Ahmed H

    2015-10-19

The phase transition of crystalline ordering is a general phenomenon, but its evolution in space and time requires microscopic probes for visualization. Here we report direct imaging of the transformation of an amorphous titanium dioxide nanofilm, from the liquid state, passing through the nucleation step and finally reaching the ordered crystal phase. Single-pulse transient diffraction profiles at different times provide the structural transformation and the specific degree of crystallinity (η) in the evolution process. It is found that the temporal behaviour of η exhibits unique 'two-step' dynamics, with a robust 'plateau' that extends over a microsecond; the rate constants vary by two orders of magnitude. Such behaviour reflects the presence of intermediate structure(s) that are the precursor of the ordered crystal state. Theoretically, we extend the well-known Johnson-Mehl-Avrami-Kolmogorov equation, which describes the isothermal process with a stretched-exponential function, but here over the range of times covering the melt-to-crystal transformation.
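For reference, the classical Johnson-Mehl-Avrami-Kolmogorov (Avrami) expression mentioned above relates the degree of crystallinity η to time t through a rate constant K and the Avrami exponent n; the stretched-exponential form (time constant τ, exponent β) is a standard equivalent parametrization, with β = n and τ = K^(-1/n):

```latex
% Classical JMAK isothermal crystallization kinetics
\eta(t) = 1 - \exp\left(-K\,t^{\,n}\right)
% Equivalent stretched-exponential parametrization
\eta(t) = 1 - \exp\left[-\left(t/\tau\right)^{\beta}\right]
```

The paper's extension of this law to the full melt-to-crystal time range is beyond these standard forms; the equations above are only the textbook baseline it starts from.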

  4. An effective hand vein feature extraction method.

    PubMed

    Li, Haigang; Zhang, Qian; Li, Chengdong

    2015-01-01

As an authentication method developed in recent years, vein recognition technology offers the unique advantages of a biometric. This paper studies the specific procedure for the extraction of hand-back vein characteristics. Because different hand positions occur during the collection process, a suitable vein region orientation method is put forward, allowing the positioning area to be the same for all hand positions. In addition, to eliminate pseudo vein areas, the valley regional shape extraction operator is improved and combined with multiple segmentation algorithms. The images are segmented step by step, making the vein texture appear clear and accurate. Lastly, the segmented images are filtered, eroded, and refined, which removes most of the pseudo vein information. Finally, a clear vein skeleton diagram is obtained, demonstrating the effectiveness of the algorithm. This paper also presents a hand-back vein region location method, which makes it possible to rotate and correct the image by working out the inclination degree of the contour at the side of the back of the hand.
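One of the post-segmentation steps mentioned above, erosion, can be illustrated on a tiny binary grid: a pixel survives only if all four of its neighbours are also set, which strips away thin pseudo-vein specks. This pure-Python sketch is an illustration of the morphological operation, not the authors' pipeline:

```python
# Binary erosion with a 4-neighbour (plus-shaped) structuring element.
# Border pixels are treated as background, so anything touching the edge
# or one pixel thick is removed.

def erode(img):
    h, w = len(img), len(img[0])
    def on(r, c):
        return 0 <= r < h and 0 <= c < w and img[r][c] == 1
    return [[1 if img[r][c] == 1
                  and on(r - 1, c) and on(r + 1, c)
                  and on(r, c - 1) and on(r, c + 1) else 0
             for c in range(w)] for r in range(h)]

img = [[0, 0, 0, 0, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 0, 0, 0, 0]]
print(erode(img))  # only the centre pixel of the 3x3 block survives
```

In practice libraries such as scipy.ndimage or OpenCV provide this operation with configurable structuring elements; the hand-rolled version is only to make the step concrete.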

  5. Development of culturally tailored educational brochures on HPV and pap tests for American Indian women.

    PubMed

    Sharpe, Patricia A; Brandt, Heather M; McCree, Donna H; Owl-Myers, Elizabeth; Taylor, Betty; Mullins, Glenda

    2013-07-01

    Participatory formative research guided the creation of a culturally tailored educational brochure about human papillomavirus (HPV) at an American Indian women's clinic. A review of existing educational materials and in-depth interviews were conducted. Nine steps for creating health communications messages that were patterned after National Cancer Institute guidelines guided the brochure development process. Of 95 women tested for HPV, 41% were positive, 32 (34%) agreed to the in-depth interview, and 9 agreed to the pretesting interview. Mean age was 41 years. Interviews revealed key themes concerning emotional reactions to abnormal Pap test results and HPV; need for basic information about HPV, Pap tests, and results; concerns about HPV stigma, sexual transmission, and communication with sexual partner; and the preferred source and format for HPV educational materials. A literature review revealed 12 areas of basic HPV content. A participatory process successfully engaged nursing staff and patients in creating culturally appropriate brochures for clinic use. This article provides specific steps for creating culturally tailored patient education materials.

  6. Limits of acceptable change and natural resources planning: when is LAC useful, when is it not?

    Treesearch

    David N. Cole; Stephen F. McCool

    1997-01-01

    There are ways to improve the LAC process and its implementational procedures. One significant procedural modification is the addition of a new step. This step — which becomes the first step in the process — involves more explicitly defining goals and desired conditions. For other steps in the process, clarifications of concept and terminology are advanced, as are...

  7. Fabrication of diamond shells

    DOEpatents

    Hamza, Alex V.; Biener, Juergen; Wild, Christoph; Woerner, Eckhard

    2016-11-01

A novel method for fabricating diamond shells is introduced. The fabrication of such shells is a multi-step process, which involves diamond chemical vapor deposition on predetermined mandrels followed by polishing, microfabrication of holes, and removal of the mandrel by an etch process. The resultant shells of the present invention can be configured with a surface roughness at the nanometer level (e.g., down to about 10 nm RMS) on a mm length scale, and exhibit excellent hardness/strength and good transparency in both the infrared and visible. Specifically, a novel process is disclosed herein, which allows coating of spherical substrates with optical-quality diamond films or nanocrystalline diamond films.

  8. Contamination control in hybrid microelectronic modules. Part 3: Specifications for coating material and process controls

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.

    1975-01-01

    Resin systems for coating hybrids prior to hermetic sealing are described. The resin systems are a flexible silicone junction resin system and a flexible cycloaliphatic epoxy resin system. The coatings are intended for application to the hybrid after all the chips have been assembled and wire bonded, but prior to hermetic sealing of the package. The purpose of the coating is to control particulate contamination by immobilizing particles and by passivating the hybrid. Recommended process controls for the purpose of minimizing contamination in hybrid microcircuit packages are given. Emphasis is placed on those critical hybrid processing steps in which contamination is most likely to occur.

  9. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  10. Role of step stiffness and kinks in the relaxation of vicinal (001) with zigzag [110] steps

    NASA Astrophysics Data System (ADS)

    Mahjoub, B.; Hamouda, Ajmi BH.; Einstein, TL.

    2017-08-01

We present a kinetic Monte Carlo study of the relaxation dynamics and steady state configurations of 〈110〉 steps on a vicinal (001) simple cubic surface. This system is interesting because 〈110〉 (fully kinked) steps have different elementary excitation energetics and favor step diffusion more than 〈100〉 (nominally straight) steps. In this study we show how this leads to different relaxation dynamics as well as to different steady state configurations, including that 2-bond breaking processes are rate determining for 〈110〉 steps, in contrast to the 3-bond breaking processes for 〈100〉 steps found in previous work [Surface Sci. 602, 3569 (2008)]. The analysis of the terrace-width distribution (TWD) shows a significant role of kink-generation-annihilation processes during the relaxation of steps: the kinetics of relaxation toward the steady state are much faster in the case of 〈110〉 zigzag steps, with a higher standard deviation of the TWD, in agreement with a decrease of step stiffness due to orientation. We conclude that smaller step stiffness leads inexorably to faster step dynamics towards the steady state. The step-edge anisotropy slows the relaxation of steps and increases the strength of step-step effective interactions.

  11. Solving the infeasible trust-region problem using approximations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, John E.; Perez, Victor M.; Eldred, Michael Scott

    2004-07-01

The use of optimization in engineering design has fueled the development of algorithms for specific engineering needs. When the simulations are expensive to evaluate or the outputs present some noise, the direct use of nonlinear optimizers is not advisable, since the optimization process will be expensive and may result in premature convergence. The use of approximations for both cases is an alternative investigated by many researchers, including the authors. When approximations are present, model management is required for proper convergence of the algorithm. In nonlinear programming, the use of trust regions for globalization of a local algorithm has been proven effective. The same approach has been used to manage the local move limits in sequential approximate optimization frameworks, as in Alexandrov et al., Giunta and Eldred, Perez et al., Rodriguez et al., etc. The experience in the mathematical community has shown that more effective algorithms can be obtained by the specific inclusion of the constraints (SQP-type algorithms) rather than by using a penalty function as in the augmented Lagrangian formulation. With explicit constraints, however, the local problem bounded by the trust region may have no feasible solution. To remedy this problem, the mathematical community has developed different versions of a composite-steps approach, which consists of a normal step to reduce the amount of constraint violation and a tangential step to minimize the objective function while maintaining the level of constraint violation attained at the normal step. Two of the authors have developed a different approach for a sequential approximate optimization framework using homotopy ideas to relax the constraints. This algorithm, called interior-point trust-region sequential approximate optimization (IPTRSAO), presents some similarities to the normal-tangential two-step algorithms. In this paper, a description of the similarities is presented, and an extension of the two-step algorithm is presented for the case of approximations.
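The normal/tangential composite-step idea described in this record can be illustrated on a two-variable toy problem with a linear constraint (this is an illustration of the general idea, not the IPTRSAO algorithm): the normal step restores feasibility, then the tangential step reduces the objective while moving only in the constraint's null space.

```python
# Toy composite-step iteration for: min f(x,y) = x^2 + 2y^2
# subject to c(x,y) = x + y - 2 = 0 (constraint Jacobian A = (1, 1)).

def c(x, y):
    return x + y - 2.0

def grad_f(x, y):
    return (2.0 * x, 4.0 * y)

x, y = 0.0, 0.0                  # start infeasible at the origin

# Normal step: Gauss-Newton step for the constraint, n = -c * A^T / ||A||^2.
v = -c(x, y) / 2.0
x, y = x + v, y + v              # now c(x, y) == 0

# Tangential step: project -grad f onto the null space of A, spanned by
# (1, -1), and take a short step alpha along that descent direction.
gx, gy = grad_f(x, y)
coeff = (-gx * 1.0 + (-gy) * (-1.0)) / 2.0
alpha = 0.25
x, y = x + alpha * coeff, y - alpha * coeff

print((x, y), c(x, y))  # objective drops from 3.0 to 2.6875; c stays 0
```

Because the constraint is linear, any move along (1, -1) leaves c unchanged, which is exactly the property the tangential step exploits; with nonlinear constraints the violation is only held approximately.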

  12. Hierarchical fractional-step approximations and parallel kinetic Monte Carlo algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Giorgos, E-mail: garab@math.uoc.gr; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Plechac, Petr, E-mail: plechac@math.udel.edu

    2012-10-01

    We present a mathematical framework for constructing and analyzing parallel algorithms for lattice kinetic Monte Carlo (KMC) simulations. The resulting algorithms have the capacity to simulate a wide range of spatio-temporal scales in spatially distributed, non-equilibrium physicochemical processes with complex chemistry and transport micro-mechanisms. Rather than focusing on constructing the stochastic trajectories exactly, our approach relies on approximating the evolution of observables, such as density, coverage and correlations. More specifically, we develop a spatial domain decomposition of the Markov operator (generator) that describes the evolution of all observables according to the kinetic Monte Carlo algorithm. This domain decomposition corresponds to a decomposition of the Markov generator into a hierarchy of operators and can be tailored to specific hierarchical parallel architectures such as multi-core processors or clusters of Graphical Processing Units (GPUs). Based on this operator decomposition, we formulate parallel fractional step kinetic Monte Carlo algorithms by employing the Trotter theorem and its randomized variants; these schemes (a) are partially asynchronous on each fractional step time window, and (b) are characterized by their communication schedule between processors. The proposed mathematical framework allows us to rigorously justify the numerical and statistical consistency of the proposed algorithms, showing the convergence of our approximating schemes to the original serial KMC. The approach also provides a systematic evaluation of different processor communication schedules. We carry out a detailed benchmarking of the parallel KMC schemes using available exact solutions, for example in Ising-type systems, and we demonstrate the capabilities of the method to simulate complex spatially distributed reactions at very large scales on GPUs. Finally, we discuss workload balancing between processors and propose a re-balancing scheme based on probabilistic mass transport methods.
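    The fractional step idea can be sketched on a toy model: a one-dimensional Ising chain is split into two spatial blocks, and within each time window every block is evolved by ordinary serial KMC while the spins of the other block are held frozen (a Lie-Trotter splitting of the generator). The following minimal Python sketch is illustrative only; all function names, parameters and the two-block layout are assumptions, not the paper's implementation:

```python
import math
import random

def flip_rate(spins, i, beta):
    """Metropolis rate for flipping spin i in a periodic Ising chain."""
    n = len(spins)
    dE = 2.0 * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
    return min(1.0, math.exp(-beta * dE))

def kmc_block(spins, block, t_window, beta, rng):
    """Serial (Gillespie-style) KMC restricted to the sites in `block` for a
    time window t_window; spins outside the block stay fixed -- this freeze
    is exactly the fractional-step approximation."""
    t = 0.0
    while True:
        rates = [flip_rate(spins, i, beta) for i in block]
        total = sum(rates)
        t += rng.expovariate(total)        # exponential waiting time
        if t > t_window:
            return
        # choose a site with probability proportional to its rate and flip it
        r = rng.random() * total
        acc = 0.0
        for i, rate in zip(block, rates):
            acc += rate
            if r <= acc:
                spins[i] = -spins[i]
                break

def fractional_step_kmc(spins, n_windows, dt, beta=0.5, seed=0):
    """Lie-Trotter fractional-step KMC: alternate sub-lattice sweeps that
    could run on separate processors (executed sequentially here)."""
    rng = random.Random(seed)
    half = len(spins) // 2
    blocks = [list(range(half)), list(range(half, len(spins)))]
    for _ in range(n_windows):
        for block in blocks:   # in a parallel setting, independent tasks
            kmc_block(spins, block, dt, beta, rng)
    return spins
```

On real hardware each block would be assigned to its own processor or GPU thread group, and the communication schedule named in the abstract would govern when boundary spins are exchanged between windows.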

  13. Validation of the manufacturing process used to produce long-acting recombinant factor IX Fc fusion protein

    PubMed Central

    McCue, J; Osborne, D; Dumont, J; Peters, R; Mei, B; Pierce, G F; Kobayashi, K; Euwart, D

    2014-01-01

    Recombinant factor IX Fc (rFIXFc) fusion protein is the first of a new class of bioengineered long-acting factors approved for the treatment and prevention of bleeding episodes in haemophilia B. The aim of this work was to describe the manufacturing process for rFIXFc, to assess product quality and to evaluate the capacity of the process to remove impurities and viruses. This manufacturing process utilized a transferable and scalable platform approach established for therapeutic antibody manufacturing and adapted for production of the rFIXFc molecule. rFIXFc was produced using a process free of human- and animal-derived raw materials and a host cell line derived from human embryonic kidney (HEK) 293H cells. The process employed multi-step purification and viral clearance processing, including use of a protein A affinity capture chromatography step, which binds to the Fc portion of the rFIXFc molecule with high affinity and specificity, and a 15 nm pore size virus removal nanofilter. Process validation studies were performed to evaluate identity, purity, activity and safety. The manufacturing process produced rFIXFc with consistent product quality and high purity. Impurity clearance validation studies demonstrated robust and reproducible removal of process-related impurities and adventitious viruses. The rFIXFc manufacturing process produces a highly pure product, free of non-human glycan structures. Validation studies demonstrate that this product is produced with consistent quality and purity. In addition, the scalability and transferability of this process are key attributes to ensure consistent and continuous supply of rFIXFc. PMID:24811361

  14. Lung microenvironment promotes the metastasis of human hepatocellular carcinoma cells to the lungs.

    PubMed

    Jin, Yun; Ai, Junhua; Shi, Jun

    2015-01-01

    Cancer metastasis is a highly tissue-specific and organ-selective process. It has been shown that the affected tissues and/or organs play a major role in this complex process. The lung is the most common target organ of extrahepatic hepatocellular carcinoma (HCC) metastasis, but the precise molecular mechanism underlying this organ-specific metastasis remains unclear. We hypothesized that the lung microenvironment promotes the metastasis of HCC cells to the lungs. In support of this hypothesis, we provided evidence from organ-targeted metastasis in various types of cancer and from contributing factors in the microenvironment of the targeted tissues/organs. A better understanding of the steps involved in the interplay between HCC cells and the lung microenvironment may offer new perspectives for the medical management of lung metastases of HCC.

  15. Transformations of software design and code may lead to reduced errors

    NASA Technical Reports Server (NTRS)

    Connelly, E. M.

    1983-01-01

    The capability of programmers and non-programmers to specify problem solutions by developing example solutions (and, for the programmers, also by writing computer programs) was investigated; each method of specification was applied at various levels of problem complexity. The level of difficulty of each problem was reflected by the number of steps needed by the user to develop a solution. Machine processing of the user inputs permitted inferences to be developed about the algorithms required to solve a particular problem. The interactive feedback of processing results led users to a more precise definition of the desired solution. Two participant groups (programmers and bookkeepers/accountants) worked with three levels of problem complexity and three levels of processor complexity. The experimental task required specification of a solution logic for a Navy task force problem.

  16. Electron beam, laser beam and plasma arc welding studies

    NASA Technical Reports Server (NTRS)

    Banas, C. M.

    1974-01-01

    This program was undertaken as an initial step in establishing an evaluation framework which would permit a priori selection of advanced welding processes for specific applications. To this end, a direct comparison of laser beam, electron beam and arc welding of Ti-6Al-4V alloy was undertaken. Ti-6Al-4V was selected for use in view of its established welding characteristics and its importance in aerospace applications.

  17. Aqueous Chloride Operations Overview: Plutonium and Americium Purification/Recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, Kyle Shelton; Kimball, David Bryan; Skidmore, Bradley Evan

    These are a set of slides intended for an information session as part of recruiting activities at Brigham Young University. They give an overview of aqueous chloride operations, specifically plutonium and americium purification/recovery. The presentation details the steps taken to perform these processes, from plutonium size reduction, dissolution, solvent extraction, and oxalate precipitation to calcination. For americium recovery, it details the CLEAR (chloride extraction and actinide recovery) Line, oxalate precipitation and calcination.

  18. Two-Step Vapor/Liquid/Solid Purification

    NASA Technical Reports Server (NTRS)

    Holland, L. R.

    1986-01-01

    A vertical distillation system combines, in a single operation, the advantages of multiple zone refining with those of distillation. Developed specifically to load Bridgman-Stockbarger (vertical-solidification) growth ampoules with ultrapure tellurium and cadmium, the system, with suitable modifications, also serves as a material refiner. In the first phase of the purification process, the ampoule is heated to drive off absorbed volatiles. In the second phase, the evaporator is heated to drive off volatiles in the charge. In the third phase, a slowly descending heater causes distillation from the evaporator to the growing crystal in the ampoule.

  19. Modeling of SBS Phase Conjugation in Multimode Step Index Fibers

    DTIC Science & Technology

    2008-03-01

    cavity or in an external amplifier. Since pumping is never a perfectly efficient process, some heat will be introduced, and for very high pump powers...modes it supports, and the incident pump power. While theoretical investigations of SBS PCMs have been conducted by a number of authors, the model...predictions about the phase conjugate fidelity that could be expected from a given pump intensity input coupled into a specific fiber. A numerical

  20. Developmental Steps in Metaphorical Language Abilities: The Influence of Age, Gender, Cognitive Flexibility, Information Processing Speed, and Analogical Reasoning.

    PubMed

    Willinger, Ulrike; Deckert, Matthias; Schmöger, Michaela; Schaunig-Busch, Ines; Formann, Anton K; Auff, Eduard

    2017-12-01

    Metaphor is a specific type of figurative language that is used in various important fields such as in the work with children in clinical or teaching contexts. The aim of the study was to investigate the developmental course, developmental steps, and possible cognitive predictors regarding metaphor processing in childhood and early adolescence. One hundred sixty-four typically developing children (7-year-olds, 9-year-olds) and early adolescents (11-year-olds) were tested for metaphor identification, comprehension, comprehension quality, and preference by the Metaphoric Triads Task as well as for analogical reasoning, information processing speed, cognitive flexibility under time pressure, and cognitive flexibility without time pressure. Metaphor identification and comprehension consecutively increased with age. Eleven-year-olds showed significantly higher metaphor comprehension quality and preference scores than seven- and nine-year-olds, whilst these younger age groups did not differ. Age, cognitive flexibility under time pressure, information processing speed, analogical reasoning, and cognitive flexibility without time pressure significantly predicted metaphor comprehension. Metaphorical language ability shows an ongoing development and seemingly changes qualitatively at the beginning of early adolescence. These results can possibly be explained by a greater synaptic reorganization in early adolescents. Furthermore, cognitive flexibility under time pressure and information processing speed possibly facilitate the ability to adapt metaphor processing strategies in a flexible, quick, and appropriate way.
