Sample records for major process steps

  1. 40 CFR 93.104 - Frequency of conformity determinations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... in the project's design concept and scope; three years elapse since the most recent major step to.... Major steps include NEPA process completion; start of final design; acquisition of a significant portion...

  2. Effect of the processing steps on compositions of table olive since harvesting time to pasteurization.

    PubMed

    Nikzad, Nasim; Sahari, Mohammad A; Vanak, Zahra Piravi; Safafar, Hamed; Boland-nazar, Seyed A

    2013-08-01

    Weight, oil, fatty acid, tocopherol, polyphenol, and sterol properties of 5 olive cultivars (Zard, Fishomi, Ascolana, Amigdalolia, and Conservalia) were studied during the crude, lye treatment, washing, fermentation, and pasteurization steps. Results showed that oil percentage was highest in Ascolana (crude step) and lowest in Fishomi (pasteurization step); in all cultivars, oleic, palmitic, linoleic, and stearic acids were the predominant fatty acids throughout processing; the greatest changes in saturated and unsaturated fatty acids occurred in the fermentation step; the highest and lowest ω3/ω6 ratios were in Ascolana (washing step) and Zard (pasteurization step), respectively; tocopherol was highest in Amigdalolia and lowest in Fishomi, with the major damage occurring in the lye step; polyphenols were highest in Ascolana (crude step) and lowest in Zard and Ascolana (pasteurization step); in all cultivars the major polyphenol damage occurred during the lye step, in which the polyphenol content fell to one-tenth of its initial value; sterols did not change during processing. A review of olive patents shows that many fruit constituents, such as oil quality, quantity, and fatty acid fraction, can be altered by changing the cultivar and the process.

  3. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
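
    As a toy illustration of the normalization step named in this abstract (a generic median normalization, not one of the workflows benchmarked in the paper), each sample's intensities can be scaled so that sample medians agree before statistical comparison; the matrix below is hypothetical.

      import numpy as np

      def median_normalize(intensities: np.ndarray) -> np.ndarray:
          """Scale each sample (column) so all sample medians match the global median.

          intensities: peptides x samples matrix of quantified intensities (no NaNs).
          """
          sample_medians = np.median(intensities, axis=0)
          global_median = np.median(sample_medians)
          return intensities * (global_median / sample_medians)

      # Hypothetical 4-peptide x 3-sample matrix with a systematic loading difference
      x = np.array([[100.0, 210.0, 95.0],
                    [ 50.0, 105.0, 48.0],
                    [ 80.0, 168.0, 76.0],
                    [ 60.0, 126.0, 57.0]])
      print(np.median(median_normalize(x), axis=0))   # equal medians after scaling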

  4. Failure modes and effects analysis for ocular brachytherapy.

    PubMed

    Lee, Yongsook C; Kim, Yongbok; Huynh, Jason Wei-Yeong; Hamilton, Russell J

    The aim of the study was to identify potential failure modes (FMs) having a high risk and to improve our current quality management (QM) program in Collaborative Ocular Melanoma Study (COMS) ocular brachytherapy by undertaking a failure modes and effects analysis (FMEA) and a fault tree analysis (FTA). Process mapping and FMEA were performed for COMS ocular brachytherapy. For all FMs identified in FMEA, risk priority numbers (RPNs) were determined by assigning and multiplying occurrence, severity, and lack of detectability values, each ranging from 1 to 10. FTA was performed for the major process that had the highest ranked FM. Twelve major processes, 121 sub-process steps, 188 potential FMs, and 209 possible causes were identified. For 188 FMs, RPN scores ranged from 1.0 to 236.1. The plaque assembly process had the highest ranked FM. The majority of FMs were attributable to human failure (85.6%), and medical physicist-related failures were the most numerous (58.9% of all causes). After FMEA, additional QM methods were included for the top 10 FMs and 6 FMs with severity values > 9.0. As a result, for these 16 FMs and the 5 major processes involved, quality control steps were increased from 8 (50%) to 15 (93.8%), and major processes having quality assurance steps were increased from 2 to 4. To reduce high risk in current clinical practice, we proposed QM methods. They mainly include a check or verification of procedures/steps and the use of checklists for both ophthalmology and radiation oncology staff, and intraoperative ultrasound-guided plaque positioning for ophthalmology staff. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
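
    For readers unfamiliar with FMEA scoring, the risk priority number described here is simply the product of the occurrence, severity, and lack-of-detectability ratings, each on a 1-10 scale. A minimal sketch, using hypothetical failure modes rather than those from the study, is:

      def rpn(occurrence: float, severity: float, detectability: float) -> float:
          """Risk priority number: product of occurrence, severity, and lack of
          detectability ratings, each on a 1-10 scale."""
          return occurrence * severity * detectability

      # Hypothetical failure modes with (occurrence, severity, lack-of-detectability) ratings
      failure_modes = {
          "wrong seed ordered for plaque assembly": (2, 9, 6),
          "plaque positioned off target": (3, 8, 5),
          "treatment duration miscalculated": (2, 9, 3),
      }
      ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
      for name, scores in ranked:
          print(f"{name}: RPN = {rpn(*scores)}")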

  5. A Systems Approach towards an Intelligent and Self-Controlling Platform for Integrated Continuous Reaction Sequences

    PubMed Central

    Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V

    2015-01-01

    Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747

  6. Syntheses and properties of the major hydroxy metabolites in humans of blonanserin AD-5423, a novel antipsychotic agent.

    PubMed

    Ochi, Takeshi; Sakamoto, Masato; Minamida, Akira; Suzuki, Kenji; Ueda, Tomohiko; Une, Teruaki; Toda, Hiroshi; Matsumoto, Kazuya; Terauchi, Yoshiaki

    2005-02-15

    Two major metabolites in humans of blonanserin, 2-(4-ethyl-1-piperazinyl)-4-(4-fluorophenyl)-5,6,7,8,9,10-hexahydrocycloocta-[b]pyridine (code name AD-5423), were synthesized. The first, 7-hydroxylated AD-5423, was synthesized through a four-step process starting from 4-fluorobenzoylacetonitrile (1), and the second, 8-hydroxylated AD-5423, a nine-step process also from 1. The optical resolution, structures, and receptor binding properties of the metabolites were documented.

  7. A Guide for Developing Standard Operating Job Procedures for the Tertiary Multimedia Filtration Process Wastewater Treatment Facility. SOJP No. 7.

    ERIC Educational Resources Information Center

    Petrasek, Al, Jr.

    This guide describes the standard operating job procedures for the tertiary multimedia filtration process of wastewater treatment plants. The major objective of the filtration process is the removal of suspended solids from the reclaimed wastewater. The guide gives step-by-step instructions for pre-start up, start-up, continuous operation, and…

  8. How To Build a Strategic Plan: A Step-by-Step Guide for School Managers.

    ERIC Educational Resources Information Center

    Clay, Katherine; And Others

    Strategic planning techniques for administrators, with a focus on process managers, are presented in this guidebook. The three major tasks of the strategic planning process include the assessment of the current organizational situation, goal setting, and the development of strategies to accomplish those goals. Strategic planning differs from long-range…

  9. Methods for the continuous production of plastic scintillator materials

    DOEpatents

    Bross, Alan; Pla-Dalmau, Anna; Mellott, Kerry

    1999-10-19

    Methods are described for producing plastic scintillating material employing either two major steps (tumble-mix) or a single major step (inline-coloring or inline-doping). Using the two-step method, the polymer pellets are mixed with silicone oil, and the mixture is then tumble-mixed with the dopants necessary to yield the proper response from the scintillator material. The mixture is then placed in a compounder and compounded in an inert gas atmosphere. The resultant scintillator material is then extruded and pelletized or formed. When only a single step is employed, the polymer pellets and dopants are metered into an inline-coloring extruding system. The mixture is then processed under an inert gas atmosphere, usually argon or nitrogen, to form plastic scintillator material either as scintillator pellets, for subsequent processing, or as material used in the direct formation of the final scintillator shape or form.

  10. One step sintering of homogenized bauxite raw material and kinetic study

    NASA Astrophysics Data System (ADS)

    Gao, Chang-he; Jiang, Peng; Li, Yong; Sun, Jia-lin; Zhang, Jun-jie; Yang, Huan-ying

    2016-10-01

    A one-step sintering process of bauxite raw material from direct mining was completed, and the kinetics of this process was analyzed thoroughly. The results show that the sintering kinetics of bauxite raw material exhibits liquid-phase sintering behavior, with the small portion of impurities present in the raw material acting as a liquid phase. Based on X-ray diffraction analyses, scanning electron microscopy observations, and kinetics calculations, sintering temperature and heating duration were determined to be the two major factors contributing to the sintering process and densification of bauxite ore. An elevated heating temperature and longer duration favor the densification process. The major obstacle to the densification of bauxite material is attributed to the formation of enclosed blowholes during liquid-phase sintering.

  11. Applications of process improvement techniques to improve workflow in abdominal imaging.

    PubMed

    Tamm, Eric Peter

    2016-03-01

    Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.

  12. Flight data identification of six degree-of-freedom stability and control derivatives of a large crane type helicopter

    NASA Technical Reports Server (NTRS)

    Tomaine, R. L.

    1976-01-01

    Flight test data from a large 'crane' type helicopter were collected and processed for the purpose of identifying vehicle rigid body stability and control derivatives. The process consisted of using digital and Kalman filtering techniques for state estimation and Extended Kalman filtering for parameter identification, utilizing a least squares algorithm for initial derivative and variance estimates. Data were processed for indicated airspeeds from 0 m/sec to 152 m/sec. Pulse, doublet and step control inputs were investigated. Digital filter frequency did not have a major effect on the identification process, while the initial derivative estimates and the estimated variances had an appreciable effect on many derivative estimates. The major derivatives identified agreed fairly well with analytical predictions and engineering experience. Doublet control inputs provided better results than pulse or step inputs.
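
    As a schematic of the least-squares step mentioned here for initial derivative estimates (the state and control variables below are hypothetical placeholders, not the flight-test set), each measured acceleration channel can be regressed on the states and control inputs, and the regression coefficients are the stability and control derivatives:

      import numpy as np

      def estimate_derivatives(states: np.ndarray, controls: np.ndarray,
                               accelerations: np.ndarray) -> np.ndarray:
          """Least-squares estimate of stability/control derivatives.

          states:        N x n_s matrix of measured states (e.g., u, w, q, theta)
          controls:      N x n_c matrix of control inputs (e.g., collective, cyclic)
          accelerations: N-vector of one measured acceleration channel
          Returns the derivatives mapping [states, controls] to that channel.
          """
          regressors = np.hstack([states, controls])
          theta, *_ = np.linalg.lstsq(regressors, accelerations, rcond=None)
          return theta

      # Hypothetical data: 200 samples, 4 states, 2 controls, known true derivatives
      rng = np.random.default_rng(0)
      X, U = rng.standard_normal((200, 4)), rng.standard_normal((200, 2))
      true = np.array([-0.5, 0.1, 1.2, -0.3, 0.8, -0.2])
      a = np.hstack([X, U]) @ true + rng.normal(0, 0.01, 200)
      print(estimate_derivatives(X, U, a))   # close to the true derivative values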

  13. Understanding, Developing, and Writing Effective IEPs: A Step-by-Step Guide for Educators

    ERIC Educational Resources Information Center

    Pierangelo, Roger; Giuliani, George A.

    2007-01-01

    Creating and evaluating Individualized Education Programs (IEPs) for students with disabilities is a major responsibility for teachers and school leaders, yet the process involves legal components not always understood by educators. In "Understanding, Developing, and Writing Effective IEPs," legal and special education experts Roger…

  14. A model for undergraduate physics major outcomes objectives

    NASA Astrophysics Data System (ADS)

    Taylor, G. R.; Erwin, T. Dary

    1989-06-01

    Concern with assessment of student outcomes of undergraduate physics major programs is rapidly rising. The Southern Association of Colleges and Schools and many other regional and state organizations are requiring explicit outcomes assessment in the accrediting process. The first step in this assessment process for major programs is the establishment of student outcomes objectives. A model and set of physics outcomes (educational) objectives that were developed by the faculty in the Physics Department at James Madison University are presented.

  15. Terraforming - Making an earth of Mars

    NASA Astrophysics Data System (ADS)

    McKay, C. P.

    1987-12-01

    The possibility of creating a habitable environment on Mars via terraforming is discussed. The first step is to determine the amount, distribution, and chemical state of water, carbon dioxide, and nitrogen. The process of warming Mars and altering its atmosphere naturally divides into two steps: in the first step, the planet would be heated by a warm thick carbon dioxide atmosphere, while the second step would be to convert the atmospheric carbon dioxide and soil nitrates to the desired oxygen and nitrogen mixture. It is concluded that life will play a major role in any terraforming of Mars, and that terraforming will be a gradual evolutionary process duplicating the early evolution of life on earth.

  16. A mechanism for leader stepping

    NASA Astrophysics Data System (ADS)

    Ebert, U.; Carlson, B. E.; Koehn, C.

    2013-12-01

    The stepping of negative leaders is well observed, but not well understood. A major problem consists of the fact that the streamer corona is typically invisible within a thunderstorm, but determines the evolution of a leader. Motivated by recent observations of streamer and leader formation in the laboratory by T.M.P. Briels, S. Nijdam, P. Kochkin, A.P.J. van Deursen et al., by recent simulations of these processes by J. Teunissen, A. Sun et al., and by our theoretical understanding of the process, we suggest how laboratory phenomena can be extrapolated to lightning leaders to explain the stepping mechanism.

  17. Financial Management: Major Deficiencies in Financial Reporting for Other Defense Organizations-General Funds

    DTIC Science & Technology

    2002-05-31

    explains major financial reporting deficiencies that diminish the quality and utility of the Other Defense Organizations-General Funds financial reports...Accounting Service have taken steps to improve the financial reporting process of the Other Defense Organizations, deficiencies related to financial

  18. Potential Dimension Yields From Direct Processing

    Treesearch

    Wenjie Lin; D. Earl Kline; Philip A. Araman

    1994-01-01

    As the price of timber increases and environmental legislation limits harvestable log volumes, the process of converting logs directly into dimension parts needs further exploration. Direct processing converts logs directly into rough green dimension parts without the intermediate steps of lumber manufacturing, grading, trading, shipping and drying. A major attraction...

  19. Redesign of occupational health service operations--strategic planning and evaluation.

    PubMed

    Tobias, Beverley; Burnes-Line, Bernadette; Pellarin, Margaret

    2008-10-01

    This article describes the strategic planning process used by a major academic medical center to redesign the employee health service. The steps in the process are discussed and data demonstrating the success of the program redesign are presented.

  20. Developing and Modifying Behavioral Coding Schemes in Pediatric Psychology: A Practical Guide

    PubMed Central

    McMurtry, C. Meghan; Chambers, Christine T.; Bakeman, Roger

    2015-01-01

    Objectives: To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. Methods: This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. Results: A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Conclusions: Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. PMID:25416837

  1. CHARACTERIZATION OF RISKS POSED BY COMBUSTOR EMISSIONS

    EPA Science Inventory

    Risk characterization is the final step of the risk assessment process as practiced in the U.S. EPA. In risk characterization, the major scientific evidence and "bottom-line" results from the other components of the risk assessment process, hazard identification, dose-response as...

  2. Study of aroma formation and transformation during the manufacturing process of Biluochun green tea in Yunnan Province by HS-SPME and GC-MS.

    PubMed

    Wang, Chen; Lv, Shidong; Wu, Yuanshuang; Lian, Ming; Gao, Xuemei; Meng, Qingxiong

    2016-10-01

    Biluochun is a typical non-fermented tea and is also famous for its unique aroma in China. Few studies have been performed to evaluate the effect of the manufacturing process on the formation and content of its aroma. The volatile components were extracted at different manufacturing process steps of Biluochun green tea using fully automated headspace solid-phase microextraction (HS-SPME) and further characterised by gas chromatography-mass spectrometry (GC-MS). Among 67 volatile components collected, the fractions of linalool oxides, β-ionone, phenylacetaldehyde, aldehydes, ketones, and nitrogen compounds increased while alcohols and hydrocarbons declined during the manufacturing process. The aroma compounds decreased the most during the drying steps. We identified a number of significantly changed components that can be used as markers for quality control during the production process of Biluochun. The drying step played a major role in the aroma formation of green tea products and should be the most important step for quality control. © 2016 Society of Chemical Industry.

  3. Alternative Procedure of Heat Integration Technique Election between Two Unit Processes to Improve Energy Saving

    NASA Astrophysics Data System (ADS)

    Santi, S. S.; Renanto; Altway, A.

    2018-01-01

    The energy use system in a production process, in this case heat exchanger networks (HENs), is one element that plays a role in the smoothness and sustainability of the industry itself. Optimizing the heat exchanger networks built from process streams can have a major effect on the economic value of an industry as a whole, so solving design problems with heat integration is an important requirement. In a plant, heat integration can be carried out internally within a unit or in combination between process units. However, determining a suitable heat integration technique conventionally requires lengthy, time-consuming calculations. In this paper, we propose an alternative procedure for selecting the heat integration technique by investigating 6 hypothetical units using a Pinch Analysis approach with the objective functions of energy target and total annual cost target. The six hypothetical units, A, B, C, D, E, and F, differ in the location of their process streams relative to the pinch temperature. The result is a potential heat integration (ΔH') formula that trims the conventional procedure from 7 steps to just 3. The preferred heat integration technique is then determined by calculating the heat integration potential (ΔH') between the hypothetical process units; the calculations were implemented in MATLAB.

  4. Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.

    PubMed

    Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger

    2015-01-01

    To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Comparison of elevation derived from insar data with dem from topography map in Son Dong, Bac Giang, Viet Nam

    NASA Astrophysics Data System (ADS)

    Nguyen, Duy

    2012-07-01

    Digital Elevation Models (DEMs) are used in many earth science applications such as topographic mapping, environmental modeling, rainfall-runoff studies, landslide hazard zonation, and seismic source modeling. Over the last years, a multitude of scientific applications of Synthetic Aperture Radar Interferometry (InSAR) techniques have evolved. InSAR is an established technique for generating high-quality DEMs from spaceborne and airborne data, and it has advantages over other methods for the generation of large-area DEMs. However, the processing of InSAR data is still a challenging task. This paper describes the InSAR operational steps and processing chain for DEM generation from Single Look Complex (SLC) SAR data and compares a satellite SAR estimate of surface elevation with a DEM from a topographic map. The operational steps are performed in three major stages: Data Search, Data Processing, and Product Validation. The Data Processing stage is further divided into five steps: data pre-processing, co-registration, interferogram generation, phase unwrapping, and geocoding. The processing steps were tested with ERS 1/2 data using the Delft Object-oriented Interferometric (DORIS) InSAR processing software. Results from applying the described processing steps to a real data set are presented.
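
    As a rough illustration of the interferogram-generation step of such a chain, the sketch below forms an interferogram and extracts the wrapped interferometric phase from two co-registered SLC arrays. The array names and synthetic data are hypothetical; real processing (for example in DORIS) also handles spectral filtering, flat-earth phase removal, and coherence estimation.

      import numpy as np

      def form_interferogram(master_slc: np.ndarray, slave_slc: np.ndarray):
          """Form a complex interferogram from two co-registered SLC images.

          Both inputs are complex-valued 2-D arrays of identical shape. Returns the
          wrapped interferometric phase (radians) and the interferogram amplitude,
          ignoring flat-earth and topographic phase removal.
          """
          interferogram = master_slc * np.conj(slave_slc)   # complex cross-product
          wrapped_phase = np.angle(interferogram)           # phase in (-pi, pi]
          amplitude = np.abs(interferogram)
          return wrapped_phase, amplitude

      # Hypothetical usage with synthetic data standing in for real SLC rasters
      rng = np.random.default_rng(0)
      master = rng.standard_normal((256, 256)) + 1j * rng.standard_normal((256, 256))
      slave = master * np.exp(-1j * 0.5)                    # simulated 0.5 rad phase offset
      phase, amp = form_interferogram(master, slave)
      print(phase.mean())                                   # ~0.5 rad for this synthetic case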

  6. Use of proteomics for validation of the isolation process of clotting factor IX from human plasma.

    PubMed

    Clifton, James; Huang, Feilei; Gaso-Sokac, Dajana; Brilliant, Kate; Hixson, Douglas; Josic, Djuro

    2010-01-03

    The use of proteomic techniques in the monitoring of different production steps of plasma-derived clotting factor IX (pd F IX) was demonstrated. The first step, solid-phase extraction with a weak anion-exchange resin, fractionates the bulk of human serum albumin (HSA), immunoglobulin G, and other non-binding proteins from F IX. The proteins that strongly bind to the anion-exchange resin are eluted by higher salt concentrations. In the second step, anion-exchange chromatography, residual HSA, some proteases and other contaminating proteins are separated. In the last chromatographic step, affinity chromatography with immobilized heparin, the majority of the residual impurities are removed. However, some contaminating proteins still remain in the eluate from the affinity column. The next step in the production process, virus filtration, is also an efficient step for the removal of residual impurities, mainly high molecular weight proteins such as vitronectin and inter-alpha inhibitor proteins. In each production step, the active component, pd F IX, and the contaminating proteins are monitored by biochemical and immunochemical methods and by LC-MS/MS, and their removal is documented. Our methodology is very helpful for further process optimization, rapid identification of target proteins with relatively low abundance, and the design of subsequent steps for their removal or purification.

  7. Roughness and uniformity improvements on self-aligned quadruple patterning technique for 10nm node and beyond by wafer stress engineering

    NASA Astrophysics Data System (ADS)

    Liu, Eric; Ko, Akiteru; O'Meara, David; Mohanty, Nihar; Franke, Elliott; Pillai, Karthik; Biolsi, Peter

    2017-05-01

    Dimension shrinkage has been a major driving force in the development of integrated circuit processing over a number of decades. The Self-Aligned Quadruple Patterning (SAQP) technique is widely adopted for sub-10nm nodes in order to achieve the desired feature dimensions. This technique provides theoretical feasibility of multiple pitch-halving from 193nm immersion lithography by using various pattern transferring steps. The major concept of this approach is to create a spacer-defined self-aligned pattern from a single lithography print. By repeating the process steps, double, quadruple, or octuple patterning can theoretically be achieved. In these small architectures, line roughness control becomes extremely important since it may contribute a significant portion of process and device performance variations. In addition, the complexity of the SAQP processing flow makes roughness improvement indirect and ineffective, so a new approach is needed to improve roughness in the current SAQP technique. In this presentation, we demonstrate a novel method to improve line roughness performance in a 30nm pitch SAQP flow. We find that line roughness performance is strongly related to stress management: by selecting different stress levels of the film deposited onto the substrate, we can manipulate the roughness performance in line and space patterns. In addition, the impact of curvature change induced by the applied film stress on SAQP line roughness performance is also studied; no significant correlation is found between wafer curvature and line roughness performance. We discuss in detail the step-by-step physical performance of each processing step in terms of critical dimension (CD), critical dimension uniformity (CDU), line width roughness (LWR), and line edge roughness (LER). Finally, we summarize the process needed to reach full-wafer performance targets of 1.07nm/1.13nm LWR/LER on a 30nm pitch line and space pattern.

  8. Six Sigma and Lean concepts, a case study: patient centered care model for a mammography center.

    PubMed

    Viau, Mark; Southern, Becky

    2007-01-01

    Boca Raton Community Hospital in South Florida decided to increase returns while enhancing patient experience and increasing staff morale. They implemented a program to pursue "enterprise excellence" through Six Sigma methodologies. In order to ensure the root causes of delays and rework were addressed, a multigenerational project plan with 3 major components was developed: Step 1, Stabilize; Step 2, Optimize; Step 3, Innovate. By including staff and process owners in the process, they are empowered to think differently about what they do and how they do it. A team that works collaboratively to identify problems and develop solutions can only be a positive for any organization.

  9. Optical method for measuring the surface area of a threaded fastener

    Treesearch

    Douglas Rammer; Samuel Zelinka

    2010-01-01

    This article highlights major aspects of a new optical technique to determine the surface area of a threaded fastener; the theoretical framework has been reported elsewhere. Specifically, this article describes the general surface area expressions used in the analysis, details of the image acquisition system, and the major image processing steps contained within the measurement...

  10. Isolation and Purification of Biotechnological Products

    NASA Astrophysics Data System (ADS)

    Hubbuch, Jürgen; Kula, Maria-Regina

    2007-05-01

    The production of modern pharma proteins is one of the most rapidly growing fields in biotechnology. The overall development and production is a complex task ranging from strain development and cultivation to the purification and formulation of the drug. Downstream processing, however, still accounts for the major part of production costs. This is mainly due to the high demands on purity, and thus safety, of the final product, and it results in processes with a sequence of typically more than 10 unit operations. Consequently, even if each process step operated at near-optimal yield, a very significant amount of product would be lost. The majority of unit operations applied in downstream processing have a long history in the field of chemical and process engineering; nevertheless, mathematical description of the respective processes and economical large-scale production of modern pharmaceutical products are hampered by the complexity of the biological feedstock, especially the high molecular weight and limited stability of proteins. In order to develop new operational steps as well as a successful overall process, it is thus a necessary prerequisite to develop a deeper understanding of the thermodynamics and physics behind the applied processes, as well as their implications for the product.

  11. Using Desktop Publishing To Enhance the "Writing Process."

    ERIC Educational Resources Information Center

    Millman, Patricia G.; Clark, Margaret P.

    1997-01-01

    Describes the development of an instructional technology course at Fairmont State College (West Virginia) for education majors that included a teaching module combining steps of the writing process to provide for the interdisciplinary focus of writing across the curriculum. Discusses desktop publishing, the National Writing Project, and student…

  12. Developing and Implementing a Process for the Review of Nonacademic Units.

    ERIC Educational Resources Information Center

    Brown, Marilyn K.

    1989-01-01

    A major research university's recently developed process for systematic evaluation of nonacademic units is described, and the steps in its development and implementation are outlined: review of literature on organizational effectiveness; survey of peer institutions; development of guidelines for review; and implementation in several campus units.…

  13. USING WASTE TO CLEAN UP THE ENVIRONMENT: CELLULOSIC ETHANOL, THE FUTURE OF FUELS

    EPA Science Inventory

    In the process of converting municipal solid waste (MSW) into ethanol, we optimized the first two major steps, the pretreatment and enzymatic hydrolysis stages, to enhance the sugar yield and to reduce the cost. For the pretreatment process, we tested different parameters of react...

  14. Indicator Systems and Evaluation

    NASA Technical Reports Server (NTRS)

    Canright, Shelley; Grabowski, Barbara

    1995-01-01

    Participants in the workshop session were actively engaged in a hands-on, minds-on approach to learning about indicators and evaluation processes. The six-hour session was broken down into three two-hour sessions. Each session was built upon an instructional model which moved from general understanding to specific IITA application. Examples and practice exercises served to demonstrate and reinforce the workshop concepts. Each successive session built upon the previous one and addressed the major steps in the evaluation process. The major steps covered in the workshop included: project descriptions, writing goals and objectives for categories, determining indicators and indicator systems for specific projects, and methods and issues of data collection. The workshop served as a baseline upon which the field centers will build during the summer in undertaking a comprehensive examination and evaluation of their existing K-12 education projects.

  15. Benefiting Female Students in Science, Math, and Engineering: The Nuts and Bolts of Establishing a WISE (Women in Science and Engineering) Learning Community

    ERIC Educational Resources Information Center

    Pace, Diana; Witucki, Laurie; Blumreich, Kathleen

    2008-01-01

    This paper describes the rationale and the step by step process for setting up a WISE (Women in Science and Engineering) learning community at one institution. Background information on challenges for women in science and engineering and the benefits of a learning community for female students in these major areas are described. Authors discuss…

  16. Fruit Antioxidants during Vinegar Processing: Changes in Content and in Vitro Bio-Accessibility

    PubMed Central

    Bakir, Sena; Toydemir, Gamze; Boyacioglu, Dilek; Beekwilder, Jules; Capanoglu, Esra

    2016-01-01

    Background: Vinegars based on fruit juices could conserve part of the health-associated compounds present in the fruits. However, in general, very limited knowledge exists on the consequences of vinegar-making for different antioxidant compounds from fruit. Here, vinegars derived from apple and grape were investigated. Methods: A number of steps, starting from the fermentation of the fruit juices to the formation of the final vinegars, were studied from an industrial vinegar process. The effect of each of the vinegar processing steps on content of antioxidants, phenolic compounds and flavonoids was studied, by spectroscopic methods and by high-performance liquid chromatography (HPLC). Results: The major observation was that spectrophotometric methods indicate a strong loss of antioxidant phenolic compounds during the transition from fruit wine to fruit vinegar. A targeted HPLC analysis indicates that metabolites such as gallic acid are lost in later stages of the vinegar process. Conclusion: The major conclusion of this work is that major changes occur in phenolic compounds during vinegar making. An untargeted metabolite analysis should be used to reveal these changes in more detail. In addition, the effect of vinegar processing on bio-accessibility of phenolic compounds was investigated by mimicking the digestive tract in an in vitro set up. This study is meant to provide insight into the potential of vinegar as a source of health-related compounds from fruit. PMID:27690020

  17. Secretory immunoglobulin purification from whey by chromatographic techniques.

    PubMed

    Matlschweiger, Alexander; Engelmaier, Hannah; Himmler, Gottfried; Hahn, Rainer

    2017-08-15

    Secretory immunoglobulins (SIg) are a major fraction of the mucosal immune system and represent potential drug candidates. So far, platform technologies for their purification do not exist. SIg from animal whey was used as a model to develop a simple, efficient and potentially generic chromatographic purification process. Several chromatographic stationary phases were tested. A combination of two anion-exchange steps resulted in the highest purity. The key step was the use of a small-porous anion exchanger operated in flow-through mode. Diffusion of SIg into the resin particles was significantly hindered, while the main impurities, IgG and serum albumin, were bound. In this step, initial purity was increased from 66% to 89% with a step yield of 88%. In a second anion-exchange step using giga-porous material, SIg was captured and purified by step or linear gradient elution to obtain fractions with purities >95%. For the step gradient elution, the step yield of highly pure SIg was 54%. Elution of SIgA and SIgM with a linear gradient resulted in step yields of 56% and 35%, respectively. Overall yields for both anion-exchange steps were 43% for the combination of flow-through and step elution mode. Combination of flow-through and linear gradient elution mode resulted in a yield of 44% for SIgA and 39% for SIgM. The proposed process allows the purification of biologically active SIg from animal whey at preparative scale. For future applications, the process can easily be adopted for purification of recombinant secretory immunoglobulin species. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Developing a Competency-Based Curriculum for a Dental Hygiene Program.

    ERIC Educational Resources Information Center

    DeWald, Janice P.; McCann, Ann L.

    1999-01-01

    Describes the three-step process used to develop a competency-based curriculum at the Caruth School of Dental Hygiene (Texas A&M University). The process involved development of a competency document (detailing three domains, nine major competencies, and 54 supporting competencies), an evaluation plan, and a curriculum inventory which defined…

  19. Melt Conditioning of Light Metals by Application of High Shear for Improved Microstructure and Defect Control

    NASA Astrophysics Data System (ADS)

    Patel, Jayesh B.; Yang, Xinliang; Mendis, Chamini L.; Fan, Zhongyun

    2017-04-01

    Casting is the first step toward the production of the majority of metal products, whether the final processing step is casting or another thermomechanical process such as extrusion or forging. High shear melt conditioning provides an easily adopted pathway to producing castings with a more uniform, fine-grained microstructure and a more uniform distribution of the chemical composition, leading to fewer defects as a result of reduced shrinkage porosity and fewer large oxide films through the microstructure. The effectiveness of high shear melt conditioning in improving the microstructure obtained with processes used in industry illustrates the versatility of the technology. The application of the high shear process to the direct chill and twin roll casting processes is demonstrated with examples from magnesium melts.

  20. The Big Crunch: A Hybrid Solution to Earth and Space Science Instruction for Elementary Education Majors

    ERIC Educational Resources Information Center

    Cervato, Cinzia; Kerton, Charles; Peer, Andrea; Hassall, Lesya; Schmidt, Allan

    2013-01-01

    We describe the rationale and process for the development of a new hybrid Earth and Space Science course for elementary education majors. A five-step course design model, applicable to both online and traditional courses, is presented. Assessment of the course outcomes after two semesters indicates that the intensive time invested in the…

  1. Solving a layout design problem by analytic hierarchy process (AHP) and data envelopment analysis (DEA) approach

    NASA Astrophysics Data System (ADS)

    Tuzkaya, Umut R.; Eser, Arzum; Argon, Goner

    2004-02-01

    Today, growing amounts of waste due to the fast consumption rate of products have started to cause irreversible environmental pollution and damage. A considerable part of this waste is packaging material. With the realization of this fact, various waste policies have taken important steps. Here we consider a firm for which waste aluminium constitutes the majority of raw materials. In order to achieve a profitable recycling process, the plant layout should be well designed. In this study, we propose a two-step approach involving the Analytic Hierarchy Process (AHP) and Data Envelopment Analysis (DEA) to solve facility layout design problems. A case example is considered to demonstrate the results achieved.
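
    As a minimal illustration of the AHP side of such a two-step approach (the criteria and pairwise judgments below are hypothetical, not taken from the paper), criterion weights can be derived from a pairwise comparison matrix via its principal eigenvector, with consistency checked against Saaty's consistency ratio:

      import numpy as np

      def ahp_weights(pairwise: np.ndarray):
          """Return priority weights and consistency ratio for an AHP pairwise matrix."""
          n = pairwise.shape[0]
          eigvals, eigvecs = np.linalg.eig(pairwise)
          k = np.argmax(eigvals.real)                   # principal eigenvalue index
          w = np.abs(eigvecs[:, k].real)
          w = w / w.sum()                               # normalized priority weights
          ci = (eigvals[k].real - n) / (n - 1)          # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12) # Saaty random index (small n)
          return w, ci / ri                             # weights, consistency ratio

      # Hypothetical judgments for three layout criteria (e.g., flow, flexibility, cost)
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      weights, cr = ahp_weights(A)
      print(weights, cr)    # a consistency ratio below ~0.1 indicates acceptable judgments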

  2. Quality by Design (QbD)-Based Process Development for Purification of a Biotherapeutic.

    PubMed

    Rathore, Anurag S

    2016-05-01

    Quality by Design (QbD) is currently receiving increased attention from the pharmaceutical community. As a result, most major biotech manufacturers are in varying stages of implementing QbD. Here, I present a case study that illustrates the step-by-step development using QbD of a purification process for the production of a biosimilar product: granulocyte colony-stimulating factor (GCSF). I also highlight and discuss the advantages that QbD-based process development offers over traditional approaches. The case study is intended to help those who wish to implement QbD towards the development and commercialization of biotech products. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
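
    To make the time-driven, activity-based costing idea concrete, here is a minimal sketch using hypothetical numbers (the activities, times, and rates are illustrative, not data from the study): each activity's cost is its observed time multiplied by the per-minute cost of the resource performing it, and the per-unit transfusion cost is the sum over all mapped activities plus consumables.

      from dataclasses import dataclass

      @dataclass
      class Activity:
          name: str
          minutes: float          # observed time for the activity
          cost_per_minute: float  # capacity cost rate of the resource (staff/equipment)

      def transfusion_cost(activities, consumables_cost: float) -> float:
          """Time-driven ABC: sum of (time x cost rate) over activities plus consumables."""
          activity_cost = sum(a.minutes * a.cost_per_minute for a in activities)
          return activity_cost + consumables_cost

      # Hypothetical activities on the transfusion pathway (values are illustrative only)
      pathway = [
          Activity("pre-transfusion sample collection", 10.0, 1.20),
          Activity("crossmatch in transfusion laboratory", 25.0, 1.50),
          Activity("bedside administration and monitoring", 150.0, 1.10),
      ]
      print(round(transfusion_cost(pathway, consumables_cost=80.0), 2))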

  4. An Update on the NASA Planetary Science Division Research and Analysis Program

    NASA Astrophysics Data System (ADS)

    Bernstein, Max; Richey, Christina; Rall, Jonathan

    2015-11-01

    Introduction: NASA’s Planetary Science Division (PSD) solicits its research and analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD changed the structure of the program elements under which the majority of planetary science R&A is done. Major changes included the creation of five core research program elements aligned with PSD’s strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2015 submission changes: All PSD programs will continue to use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address the science goals and objectives, and the relevance of the proposed research to the program element to which it is submitted.

  5. Pocket Pal: A Graphic Arts Digest for Printers and Advertising Production Managers. Tenth Edition.

    ERIC Educational Resources Information Center

    1970

    In this digest of information about printing a brief survey of the history of printing precedes detailed explanations of the processes and the materials involved in printing. The four major printing processes--letterpress, gravure, offset lithography, and screen--are explained. Steps in preparing art and copy for printing, including selection of…

  6. Planning for the Aging: A Manual of Practical Methods.

    ERIC Educational Resources Information Center

    Cotton, Frank E., Jr.

    This manual is divided into three major parts. Part One explores the role and process of planning delivery services to the aging, including leadership. Part Two offers several methods for carrying out each of seven steps delineated in the planning process, and describes their application to the delivery of services for the aging. Part Three…

  7. Tennessee long-range transportation plan : modal needs

    DOT National Transportation Integrated Search

    2005-12-01

    This report documents one of several major steps in the long-range planning process. It examines each component of the state's transportation network to identify the long-term needs of the transportation modes to 2030. The determination of...

  8. Semiconductor grade, solar silicon purification project

    NASA Technical Reports Server (NTRS)

    Ingle, W. M.; Rosler, R. R.; Thompson, S. W.; Chaney, R. E.

    1979-01-01

    Experimental apparatus and procedures used in the development of a 3-step SiF2(x) polymer transport purification process are described. Both S.S.M.S. and E.S. analyses demonstrated that major purification had occurred, and some samples were indistinguishable from semiconductor-grade silicon (except possibly for phosphorus). Recent electrical analysis via crystal growth reveals that the product contains compensated phosphorus and boron. The low projected product cost and short energy payback time suggest that the economics of this process will result in a cost less than the goal of $10/kg (1975 dollars). The process appears to be readily scalable to a major silicon purification facility.

  9. Pollution prevention in the pulp and paper industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, P.G.

    1995-09-01

    Probably no other industry has made as much progress as the kraft pulp and paper industry in reclaiming waste products. About half of the wood used in making pulp is cellulose; the reclamation of the other ingredients in the wood constitutes a continuing evolution of pollution prevention and economic success. The by-products of chemical pulping include turpentine used in the paint industry, lignosulfonates used as surfactants and dispersants, "tall oil" used in chemical manufacturing, yeast, vanillin, acetic acid, activated carbon, and alcohol. Sulfate turpentine recovered in the kraft process is used to manufacture pine oil, dimethyl sulfoxide (DMSO), and other useful chemical products. In addition, the noncellulose portion of the wood is used to provide energy for the pulping process through the combustion of concentrated black liquor. Over 75% of the pulp produced in the US is manufactured using the kraft process. Because of the predominance of the kraft process, the remainder of this section addresses pollution prevention methods for kraft pulp and paper mills. Some of these techniques may be applicable or adaptable to other pulping processes, especially sulfite mills. The major steps in the kraft process are described, followed by a discussion of major wastestreams and proven pollution prevention methods for each of these steps.

  10. Tumor image signatures and habitats: a processing pipeline of multimodality metabolic and physiological images.

    PubMed

    You, Daekeun; Kim, Michelle M; Aryal, Madhava P; Parmar, Hemant; Piert, Morand; Lawrence, Theodore S; Cao, Yue

    2018-01-01

    To create tumor "habitats" from the "signatures" discovered from multimodality metabolic and physiological images, we developed a framework of a processing pipeline. The processing pipeline consists of six major steps: (1) creating superpixels as a spatial unit in a tumor volume; (2) forming a data matrix [Formula: see text] containing all multimodality image parameters at superpixels; (3) forming and clustering a covariance or correlation matrix [Formula: see text] of the image parameters to discover major image "signatures;" (4) clustering the superpixels and organizing the parameter order of the [Formula: see text] matrix according to the one found in step 3; (5) creating "habitats" in the image space from the superpixels associated with the "signatures;" and (6) pooling and clustering a matrix consisting of correlation coefficients of each pair of image parameters from all patients to discover subgroup patterns of the tumors. The pipeline was applied to a dataset of multimodality images in glioblastoma (GBM) first, which consisted of 10 image parameters. Three major image "signatures" were identified. The three major "habitats" plus their overlaps were created. To test generalizability of the processing pipeline, a second image dataset from GBM, acquired on the scanners different from the first one, was processed. Also, to demonstrate the clinical association of image-defined "signatures" and "habitats," the patterns of recurrence of the patients were analyzed together with image parameters acquired prechemoradiation therapy. An association of the recurrence patterns with image-defined "signatures" and "habitats" was revealed. These image-defined "signatures" and "habitats" can be used to guide stereotactic tissue biopsy for genetic and mutation status analysis and to analyze for prediction of treatment outcomes, e.g., patterns of failure.
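
    As a rough sketch of steps (3)-(4) of such a pipeline, the code below clusters image parameters by the correlation structure of a (superpixels x parameters) data matrix and then groups superpixels with k-means; the array shapes and cluster counts are hypothetical choices, not those of the paper.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform
      from scipy.cluster.vq import kmeans2

      def cluster_parameters(data: np.ndarray, n_signatures: int = 3):
          """Cluster image parameters (columns of data) by their correlation matrix."""
          corr = np.corrcoef(data, rowvar=False)             # parameters x parameters
          dist = 1.0 - np.abs(corr)                          # correlation-based distance
          np.fill_diagonal(dist, 0.0)
          z = linkage(squareform(dist, checks=False), method="average")
          return fcluster(z, t=n_signatures, criterion="maxclust")  # signature label per parameter

      def cluster_superpixels(data: np.ndarray, n_habitats: int = 3):
          """Group superpixels (rows of data) into habitat-like clusters with k-means."""
          _, labels = kmeans2(data, n_habitats, seed=0, minit="++")
          return labels

      # Hypothetical data: 500 superpixels x 10 image parameters
      rng = np.random.default_rng(1)
      X = rng.standard_normal((500, 10))
      print(cluster_parameters(X), cluster_superpixels(X)[:10])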

  11. Recovery of actinides from actinide-aluminium alloys by chlorination: Part III - Chlorination with HCl(g)

    NASA Astrophysics Data System (ADS)

    Meier, Roland; Souček, Pavel; Walter, Olaf; Malmbeck, Rikard; Rodrigues, Alcide; Glatz, Jean-Paul; Fanghänel, Thomas

    2018-01-01

    Two steps of a pyrochemical route for the recovery of actinides from spent metallic nuclear fuel are being investigated at JRC-Karlsruhe. The first step consists in electrorefining the fuel in a molten salt medium employing aluminium cathodes. The second step is a chlorination process for the separation of actinides (An) from the An-Al alloys formed on the cathodes. The chlorination process, in turn, consists of three steps: the distillation of adhered salt (1), the chlorination of An-Al by HCl/Cl2 under formation of AlCl3 and An chlorides (2), and the subsequent sublimation of AlCl3 (3). In the present work UAl2, UAl3, NpAl2, and PuAl2 were chlorinated with HCl(g) in a temperature range between 300 and 400 °C, forming UCl4, NpCl4 or PuCl3 as the major An-containing phases, respectively. Thermodynamic calculations were carried out to support the experimental work. The results showed a high chlorination efficiency for all starting materials used and indicated that the sublimation step may not be necessary when using HCl(g).

  12. Detection Methodologies for Pathogen and Toxins: A Review.

    PubMed

    Alahi, Md Eshrat E; Mukhopadhyay, Subhas Chandra

    2017-08-16

    Pathogen- and toxin-contaminated foods and beverages are a major source of illness, and even death, and have a significant economic impact worldwide. Human health is always under potential threat, including from biological warfare, due to these dangerous pathogens. The agricultural and food production chain consists of many steps such as harvesting, handling, processing, packaging, storage, distribution, preparation, and consumption, and each step is susceptible to threats of environmental contamination or failure to safeguard the process. The production process can be controlled in the food and agricultural sector, where smart sensors can play a major role, ensuring greater food quality and safety through low-cost, fast, reliable, and profitable methods of detection. Techniques for the detection of pathogens and toxins vary in cost, size, specificity, speed of response, sensitivity, and precision. Smart sensors can detect, analyse, and quantify, at the molecular level, contents of different biological origin and ensure the quality of foods against spiking with pesticides, fertilizers, dioxin, modified organisms, anti-nutrients, allergens, drugs, and so on. This paper reviews different methodologies to detect pathogens and toxins in foods and beverages.

  13. Zero Liquid Discharge (ZLD) System for Flue-Gas Derived Water From Oxy-Combustion Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivaram Harendra; Danylo Oryshchyn; Thomas Ochs

    2011-10-16

    Researchers at the National Energy Technology Laboratory (NETL) located in Albany, Oregon, have patented a process - Integrated Pollutant Removal (IPR) - that uses off-the-shelf technology to produce a sequestration-ready CO2 stream from an oxy-combustion power plant. Capturing CO2 from fossil-fuel combustion generates a significant water product which can be tapped for use in the power plant and its peripherals. Water condensed in the IPR® process may contain fly ash particles, sodium (from pH control), and sulfur species, as well as heavy metals, cations, and anions. NETL is developing a treatment approach for zero liquid discharge while maximizing the available heat from IPR. Current treatment-process steps being studied are flocculation/coagulation, for removal of cations and fine particles, and reverse osmosis, for anion removal as well as for scavenging the remaining cations. After the reverse osmosis process steps, thermal evaporation and crystallization steps will be carried out in order to build the whole zero liquid discharge (ZLD) system for flue-gas condensed wastewater. Gypsum is the major product from the crystallization process. Fast, in-line treatment of water for re-use in IPR seems to be one practical step for minimizing water treatment requirements for CO2 capture. The results obtained from the above experiments are being used to build water treatment models.

  14. Quality Procedures in the European Higher Education Area and Beyond--Second ENQA Survey. ENQA Occasional Papers 14

    ERIC Educational Resources Information Center

    Costes, Nathalie; Crozier, Fiona; Cullen, Peter; Grifoll, Josep; Harris, Nick; Helle, Emmi; Hopbach, Achim; Kekalainen, Helka; Knezevic, Bozana; Sits, Tanel; Sohm, Kurt

    2008-01-01

    Quality assurance for higher education in Europe has developed significantly since 2002, and has increasingly influenced, and been influenced by, the Bologna Process. A major step in the Bologna Process was taken at the ministerial meeting in Bergen in May 2005, with the adoption of the Standards and Guidelines for Quality Assurance in the…

  15. Review of Peak Detection Algorithms in Liquid-Chromatography-Mass Spectrometry

    PubMed Central

    Zhang, Jianqiu; Gonzalez, Elias; Hestilow, Travis; Haskins, William; Huang, Yufei

    2009-01-01

    In this review, we will discuss peak detection in Liquid-Chromatography-Mass Spectrometry (LC/MS) from a signal processing perspective. A brief introduction to LC/MS is followed by a description of the major processing steps in LC/MS. Specifically, the problem of peak detection is formulated and various peak detection algorithms are described and compared. PMID:20190954
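
    As a minimal signal-processing illustration of the peak detection problem discussed in this review (not any specific algorithm from it), the sketch below smooths a one-dimensional intensity trace and reports local maxima above a noise-based prominence threshold; the synthetic chromatogram and threshold choice are illustrative.

      import numpy as np
      from scipy.signal import find_peaks, savgol_filter

      def detect_peaks(intensity: np.ndarray, window: int = 11, poly: int = 3):
          """Smooth a 1-D intensity trace and return indices of prominent local maxima."""
          smoothed = savgol_filter(intensity, window_length=window, polyorder=poly)
          noise = np.median(np.abs(smoothed - np.median(smoothed)))   # robust noise scale
          peaks, props = find_peaks(smoothed, prominence=5 * noise)
          return peaks, props["prominences"]

      # Synthetic chromatogram: two Gaussian peaks on a noisy baseline
      t = np.linspace(0, 10, 1000)
      signal = 3 * np.exp(-((t - 3) ** 2) / 0.02) + 5 * np.exp(-((t - 7) ** 2) / 0.05)
      signal += np.random.default_rng(0).normal(0, 0.05, t.size)
      idx, prom = detect_peaks(signal)
      print(t[idx], prom)    # retention times and prominences of detected peaks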

  16. High yield of recombinant human Apolipoprotein A-I expressed in Pichia pastoris by using mixed-mode chromatography.

    PubMed

    Narasimhan Janakiraman, Vignesh; Noubhani, Abdelmajid; Venkataraman, Krishnan; Vijayalakshmi, Mookambeswaran; Santarelli, Xavier

    2016-01-01

    The vast majority of the cardioprotective properties exhibited by High-Density Lipoprotein (HDL) are mediated by its major protein component, Apolipoprotein A-I (ApoA1). In order to develop a simplified bioprocess for producing recombinant human Apolipoprotein A-I (rhApoA1) in its near-native form, rhApoA1 was expressed without the use of an affinity tag in view of its potential therapeutic applications. Expressed in Pichia pastoris at expression levels of 58.2 mg ApoA1 per litre of culture in a reproducible manner, the target protein was purified by mixed-mode chromatography using Capto™ MMC ligand with a purity and recovery of 84% and 68%, respectively. ApoA1 purification was scaled up to Mixed-mode Expanded Bed Adsorption chromatography to establish an 'on-line' process for the efficient capture of rhApoA1 directly from the P. pastoris expression broth. A polishing step using anion exchange chromatography enabled the recovery of ApoA1 up to 96% purity. Purified ApoA1 was identified and verified by RPLC-ESI-Q-TOF mass spectrometry. This two-step process would reduce processing times and therefore costs in comparison to the twelve-step procedure currently used for recovering rhApoA1 from P. pastoris. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Step-by-step: a model for practice-based learning.

    PubMed

    Kane, Gabrielle M

    2007-01-01

    Innovative technology has led to high-precision radiation therapy that has dramatically altered the practice of radiation oncology. This qualitative study explored the implementation of this innovation into practice from the perspective of the practitioners in a large academic radiation medicine program and aimed to improve understanding of and facilitate the educational process of this change. Multiprofession staff participated in a series of seven focus groups and nine in-depth interviews, and the descriptive data from the transcripts were analyzed using grounded theory methodology. Practitioners believed that there had been a major effect on many aspects of their practice. The team structure supported the adoption of change. The technology changed the way the practices worked. Learning new skills increased workload and stress but led to a new conception of the discipline and the generation of new practice-based knowledge. When the concepts were examined longitudinally, a four-step process of learning was identified. In step 1, there was anxiety as staff acquired the skills to use the technology. Step 2 involved learning to interpret new findings and images, experiencing uncertainty until new perspectives developed. Step 3 involved questioning assumptions and critical reflection, which resulted in new understanding. The final step 4 identified a process of constructing new knowledge through research, development, and dialogue within the profession. These findings expand our understanding of how practice-based learning occurs in the context of change and can guide learning activities appropriate to each stage.

  18. Image preprocessing for improving computational efficiency in implementation of restoration and superresolution algorithms.

    PubMed

    Sundareshan, Malur K; Bhattacharjee, Supratik; Inampudi, Radhika; Pang, Ho-Yuen

    2002-12-10

    Computational complexity is a major impediment to the real-time implementation of image restoration and superresolution algorithms in many applications. Although powerful restoration algorithms have been developed within the past few years utilizing sophisticated mathematical machinery (based on statistical optimization and convex set theory), these algorithms are typically iterative in nature and require a sufficient number of iterations to be executed to achieve the desired resolution improvement that may be needed to meaningfully perform postprocessing image exploitation tasks in practice. Additionally, recent technological breakthroughs have facilitated novel sensor designs (focal plane arrays, for instance) that make it possible to capture megapixel imagery data at video frame rates. A major challenge in the processing of these large-format images is to complete the execution of the image processing steps within the frame capture times and to keep up with the output rate of the sensor so that all data captured by the sensor can be efficiently utilized. Consequently, development of novel methods that facilitate real-time implementation of image restoration and superresolution algorithms is of significant practical interest and is the primary focus of this study. The key to designing computationally efficient processing schemes lies in strategically introducing appropriate preprocessing steps together with the superresolution iterations to tailor optimized overall processing sequences for imagery data of specific formats. For substantiating this assertion, three distinct methods for tailoring a preprocessing filter and integrating it with the superresolution processing steps are outlined. These methods consist of a region-of-interest extraction scheme, a background-detail separation procedure, and a scene-derived information extraction step for implementing a set-theoretic restoration of the image that is less demanding in computation compared with the superresolution iterations. A quantitative evaluation of the performance of these algorithms for restoring and superresolving various imagery data captured by diffraction-limited sensing operations is also presented.
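
    The record above argues that the cost of iterative restoration can be cut by preprocessing, for example by restoring only a region of interest rather than the full frame. As a hedged illustration of that idea only (not the authors' algorithms), the sketch below crops an ROI and runs a few Richardson-Lucy iterations on it; the function names, point-spread function, and ROI size are assumptions made for the example.

      import numpy as np
      from scipy.signal import convolve2d

      def extract_roi(image, center, half_size):
          """Preprocessing: crop a region of interest so the iterative
          restoration only runs on the pixels that matter."""
          r, c = center
          return image[max(r - half_size, 0):r + half_size,
                       max(c - half_size, 0):c + half_size]

      def richardson_lucy(blurred, psf, iterations=10):
          """Minimal Richardson-Lucy deconvolution, used here as a stand-in
          for the iterative restoration/superresolution stage."""
          estimate = np.full_like(blurred, blurred.mean())
          psf_mirror = psf[::-1, ::-1]
          for _ in range(iterations):
              reblurred = convolve2d(estimate, psf, mode="same", boundary="symm")
              ratio = blurred / np.maximum(reblurred, 1e-12)
              estimate *= convolve2d(ratio, psf_mirror, mode="same", boundary="symm")
          return estimate

      # Restoring only a 64x64 ROI instead of a full megapixel frame cuts the
      # per-iteration cost roughly in proportion to the pixel count.
      frame = np.random.default_rng(1).random((1024, 1024))
      psf = np.ones((5, 5)) / 25.0
      roi = extract_roi(frame, center=(512, 512), half_size=32)
      restored_roi = richardson_lucy(roi, psf, iterations=5)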

  19. Dalhousie Fire

    ERIC Educational Resources Information Center

    Matthews, Fred W.

    1986-01-01

    Describes steps taken by the Weldon Law Library at Dalhousie University in salvaging books damaged in a major fire, including procedures and processes used in packing, sorting, drying, and cleaning the books. The need for a disaster plan for specific libraries is emphasized, and some suggestions are made. (CDD)

  20. Timing paradox of stepping and falls in ageing: not so quick and quick(er) on the trigger

    PubMed Central

    Mille, Marie‐Laure

    2016-01-01

    Abstract Physiological and degenerative changes affecting human standing balance are major contributors to falls with ageing. During imbalance, stepping is a powerful protective action for preserving balance that may be voluntarily initiated in recognition of a balance threat, or be induced by an externally imposed mechanical or sensory perturbation. Paradoxically, with ageing and falls, initiation slowing of voluntary stepping is observed together with perturbation‐induced steps that are triggered as fast as or faster than for younger adults. While age‐associated changes in sensorimotor conduction, central neuronal processing and cognitive functions are linked to delayed voluntary stepping, alterations in the coupling of posture and locomotion may also prolong step triggering. It is less clear, however, how these factors may explain the accelerated triggering of induced stepping. We present a conceptual model that addresses this issue. For voluntary stepping, a disruption in the normal coupling between posture and locomotion may underlie step‐triggering delays through suppression of the locomotion network based on an estimation of the evolving mechanical state conditions for stability. During induced stepping, accelerated step initiation may represent an event‐triggering process whereby stepping is released according to the occurrence of a perturbation rather than to the specific sensorimotor information reflecting the evolving instability. In this case, errors in the parametric control of induced stepping and its effectiveness in stabilizing balance would be likely to occur. We further suggest that there is a residual adaptive capacity with ageing that could be exploited to improve paradoxical triggering and other changes in protective stepping to impact fall risk. PMID:26915664

  1. One‐Step Reforming of CO2 and CH4 into High‐Value Liquid Chemicals and Fuels at Room Temperature by Plasma‐Driven Catalysis

    PubMed Central

    Wang, Li; Yi, Yanhui; Wu, Chunfei; Guo, Hongchen

    2017-01-01

    Abstract The conversion of CO2 with CH4 into liquid fuels and chemicals in a single‐step catalytic process that bypasses the production of syngas remains a challenge. In this study, liquid fuels and chemicals (e.g., acetic acid, methanol, ethanol, and formaldehyde) were synthesized in a one‐step process from CO2 and CH4 at room temperature (30 °C) and atmospheric pressure for the first time by using a novel plasma reactor with a water electrode. The total selectivity to oxygenates was approximately 50–60 %, with acetic acid being the major component at 40.2 % selectivity, the highest value reported for acetic acid thus far. Interestingly, the direct plasma synthesis of acetic acid from CH4 and CO2 is an ideal reaction with 100 % atom economy, but it is almost impossible by thermal catalysis owing to the significant thermodynamic barrier. The combination of plasma and catalyst in this process shows great potential for manipulating the distribution of liquid chemical products in a given process. PMID:28842938

  2. Chemistry of rubber processing and disposal.

    PubMed Central

    Bebb, R L

    1976-01-01

    The major chemical changes during the processing of rubber occur with the breakdown in mastication and during vulcanization of the molded tire. There is little chemical change during the compounding, calendering, extrusion, and molding steps. Reclaiming is the process of converting scrap rubber into an unsaturated, processible product that can be vulcanized with sulfur. Pyrolysis of scrap rubber yields a complex mixture of liquids, gas, and residue in varying ratios dependent on the nature of the scrap and the conditions of pyrolysis. PMID:799964

  3. Membrane Fusion Induced by Small Molecules and Ions

    PubMed Central

    Mondal Roy, Sutapa; Sarkar, Munna

    2011-01-01

    Membrane fusion is a key event in many biological processes. These processes are controlled by various fusogenic agents, of which proteins and peptides form the principal group. The fusion process is characterized by three major steps: inter-membrane contact; lipid mixing, which forms the intermediate step; and finally pore opening and mixing of the inner contents of the cells/vesicles. These steps are governed by energy barriers that need to be overcome to complete fusion. Structural reorganization of large molecules such as proteins/peptides supplies the driving force required to overcome the energy barriers of the different intermediate steps. Small molecules/ions do not share this advantage, so fusion induced by small molecules/ions is expected to differ from that induced by proteins/peptides. Although several reviews exist on membrane fusion, no recent review is devoted solely to small-molecule/ion-induced membrane fusion. Here we intend to present how a variety of small molecules/ions act as independent fusogens. The detailed mechanisms of some are well understood, but for many they remain an open question. A clearer understanding of how a particular small molecule can control fusion will open up the possibility of using these molecules instead of proteins/peptides to induce fusion in both in vivo and in vitro fusion processes. PMID:21660306

  4. A novel automatic segmentation workflow of axial breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Besbes, Feten; Gargouri, Norhene; Damak, Alima; Sellami, Dorra

    2018-04-01

    In this paper we propose a novel process for fully automatic breast tissue segmentation that is independent of expert calibration and contrast. The proposed algorithm is composed of two major steps. The first step is the detection of the breast boundaries, based on image content analysis and the Moore-Neighbour tracing algorithm; Otsu thresholding and a neighbourhood algorithm are applied as processing steps, and the area external to the breast is then removed to obtain an approximate breast region. The second step is the delineation of the chest wall, treated as the lowest-cost path linking three key points that are located automatically on the breast: the left and right boundary points, and the middle upper point placed at the sternum region using a statistical method. The minimum-cost path search problem is solved with Dijkstra's algorithm. Evaluation results reveal the robustness of our process against different breast densities, complex shapes, and challenging cases. The mean overlap between manual segmentation and automatic segmentation with our method is 96.5%. A comparative study shows that the proposed process is competitive and faster than existing methods; the segmentation of 120 slices with our method is achieved in 20.57+/-5.2 s.
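
    The chest-wall delineation above is framed as a minimum-cost path problem solved with Dijkstra's algorithm. As a generic, hedged illustration of that step (not the authors' implementation), the sketch below runs Dijkstra over a pixel cost grid between two endpoints; the 8-connectivity, the random cost image, and the endpoint coordinates are assumptions for the example.

      import heapq
      import numpy as np

      def dijkstra_path(cost, start, goal):
          """Minimum-cost 8-connected path across a cost image (generic
          Dijkstra; stands in for the chest-wall delineation step)."""
          rows, cols = cost.shape
          dist = np.full((rows, cols), np.inf)
          prev = {}
          dist[start] = cost[start]
          heap = [(cost[start], start)]
          while heap:
              d, (r, c) = heapq.heappop(heap)
              if (r, c) == goal:
                  break
              if d > dist[r, c]:
                  continue                               # stale queue entry
              for dr in (-1, 0, 1):
                  for dc in (-1, 0, 1):
                      nr, nc = r + dr, c + dc
                      if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                          nd = d + cost[nr, nc]
                          if nd < dist[nr, nc]:
                              dist[nr, nc] = nd
                              prev[(nr, nc)] = (r, c)
                              heapq.heappush(heap, (nd, (nr, nc)))
          path, node = [], goal
          while node != start:                           # walk back from goal to start
              path.append(node)
              node = prev[node]
          path.append(start)
          return path[::-1]

      # Illustrative cost image: low-cost pixels attract the path, much as a
      # dark band between breast tissue and chest wall would in an MR slice.
      image = np.random.default_rng(2).random((64, 64))
      print(len(dijkstra_path(image, start=(32, 0), goal=(32, 63))))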

  5. Antimicrobial packaging for fresh-cut fruits

    USDA-ARS?s Scientific Manuscript database

    Fresh-cut fruits are minimally processed produce which are consumed directly at their fresh stage without any further kill step. Microbiological quality and safety are major challenges to fresh-cut fruits. Antimicrobial packaging is one of the innovative food packaging systems that is able to kill o...

  6. Structural Maturation of HIV-1 Reverse Transcriptase—A Metamorphic Solution to Genomic Instability

    PubMed Central

    London, Robert E.

    2016-01-01

    Human immunodeficiency virus 1 (HIV-1) reverse transcriptase (RT)—a critical enzyme of the viral life cycle—undergoes a complex maturation process, required so that a pair of p66 precursor proteins can develop conformationally along different pathways, one evolving to form active polymerase and ribonuclease H (RH) domains, while the second forms a non-functional polymerase and a proteolyzed RH domain. These parallel maturation pathways rely on the structural ambiguity of a metamorphic polymerase domain, for which the sequence–structure relationship is not unique. Recent nuclear magnetic resonance (NMR) studies utilizing selective labeling techniques, and structural characterization of the p66 monomer precursor have provided important insights into the details of this maturation pathway, revealing many aspects of the three major steps involved: (1) domain rearrangement; (2) dimerization; and (3) subunit-selective RH domain proteolysis. This review summarizes the major structural changes that occur during the maturation process. We also highlight how mutations, often viewed within the context of the mature RT heterodimer, can exert a major influence on maturation and dimerization. It is further suggested that several steps in the RT maturation pathway may provide attractive targets for drug development. PMID:27690082

  7. A practical guide for the identification of major sulcogyral structures of the human cortex.

    PubMed

    Destrieux, Christophe; Terrier, Louis Marie; Andersson, Frédéric; Love, Scott A; Cottier, Jean-Philippe; Duvernoy, Henri; Velut, Stéphane; Janot, Kevin; Zemmoura, Ilyess

    2017-05-01

    The precise sulcogyral localization of cortical lesions is mandatory to improve communication between practitioners and to predict and prevent post-operative deficits. This process, which assumes a good knowledge of the cortex anatomy and a systematic analysis of images, is, nevertheless, sometimes neglected in the neurological and neurosurgical training. This didactic paper proposes a brief overview of the sulcogyral anatomy, using conventional MR-slices, and also reconstructions of the cortical surface after a more or less extended inflation process. This method simplifies the cortical anatomy by removing part of the cortical complexity induced by the folding process, and makes it more understandable. We then reviewed several methods for localizing cortical structures, and proposed a three-step identification: after localizing the lateral, medial or ventro-basal aspect of the hemisphere (step 1), the main interlobar sulci were located to limit the lobes (step 2). Finally, intralobar sulci and gyri were identified (step 3) thanks to the same set of rules. This paper does not propose any new identification method but should be regarded as a set of practical guidelines, useful in daily clinical practice, for detecting the main sulci and gyri of the human cortex.

  8. Timing paradox of stepping and falls in ageing: not so quick and quick(er) on the trigger.

    PubMed

    Rogers, Mark W; Mille, Marie-Laure

    2016-08-15

    Physiological and degenerative changes affecting human standing balance are major contributors to falls with ageing. During imbalance, stepping is a powerful protective action for preserving balance that may be voluntarily initiated in recognition of a balance threat, or be induced by an externally imposed mechanical or sensory perturbation. Paradoxically, with ageing and falls, initiation slowing of voluntary stepping is observed together with perturbation-induced steps that are triggered as fast as or faster than for younger adults. While age-associated changes in sensorimotor conduction, central neuronal processing and cognitive functions are linked to delayed voluntary stepping, alterations in the coupling of posture and locomotion may also prolong step triggering. It is less clear, however, how these factors may explain the accelerated triggering of induced stepping. We present a conceptual model that addresses this issue. For voluntary stepping, a disruption in the normal coupling between posture and locomotion may underlie step-triggering delays through suppression of the locomotion network based on an estimation of the evolving mechanical state conditions for stability. During induced stepping, accelerated step initiation may represent an event-triggering process whereby stepping is released according to the occurrence of a perturbation rather than to the specific sensorimotor information reflecting the evolving instability. In this case, errors in the parametric control of induced stepping and its effectiveness in stabilizing balance would be likely to occur. We further suggest that there is a residual adaptive capacity with ageing that could be exploited to improve paradoxical triggering and other changes in protective stepping to impact fall risk. © 2016 The Authors. The Journal of Physiology © 2016 The Physiological Society.

  9. Failure mode and effects analysis: A community practice perspective.

    PubMed

    Schuller, Bradley W; Burns, Angi; Ceilley, Elizabeth A; King, Alan; LeTourneau, Joan; Markovic, Alexander; Sterkel, Lynda; Taplin, Brigid; Wanner, Jennifer; Albert, Jeffrey M

    2017-11-01

    To report our early experiences with failure mode and effects analysis (FMEA) in a community practice setting. The FMEA facilitator received extensive training at the AAPM Summer School. Early efforts focused on department education and emphasized the need for process evaluation in the context of high profile radiation therapy accidents. A multidisciplinary team was assembled with representation from each of the major department disciplines. Stereotactic radiosurgery (SRS) was identified as the most appropriate treatment technique for the first FMEA evaluation, as it is largely self-contained and has the potential to produce high impact failure modes. Process mapping was completed using breakout sessions, and then compiled into a simple electronic format. Weekly sessions were used to complete the FMEA evaluation. Risk priority number (RPN) values > 100 or severity scores of 9 or 10 were considered high risk. The overall time commitment was also tracked. The final SRS process map contained 15 major process steps and 183 subprocess steps. Splitting the process map into individual assignments was a successful strategy for our group. The process map was designed to contain enough detail such that another radiation oncology team would be able to perform our procedures. Continuous facilitator involvement helped maintain consistent scoring during FMEA. Practice changes were made in response to the highest RPN scores, and the resulting new RPN scores were below our high-risk threshold. The estimated person-hour equivalent for project completion was 258 hr. This report provides important details on the initial steps we took to complete our first FMEA, providing guidance for community practices seeking to incorporate this process into their quality assurance (QA) program. Determining the feasibility of implementing complex QA processes in different practice settings will take on increasing significance as the field of radiation oncology transitions into the new TG-100 QA paradigm. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
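
    As a hedged illustration of the scoring arithmetic described above (RPN values above 100, or severity scores of 9-10, treated as high risk), the short sketch below computes risk priority numbers for a couple of made-up failure modes; the failure modes, scores, and field names are hypothetical and are not taken from the report.

      from dataclasses import dataclass

      @dataclass
      class FailureMode:
          step: str
          severity: int       # 1-10
          occurrence: int     # 1-10
          detectability: int  # 1-10 (10 = hardest to detect)

          @property
          def rpn(self) -> int:
              # Conventional TG-100 style risk priority number.
              return self.severity * self.occurrence * self.detectability

          @property
          def high_risk(self) -> bool:
              # Threshold used in the report: RPN > 100 or severity of 9-10.
              return self.rpn > 100 or self.severity >= 9

      modes = [
          FailureMode("wrong isocenter coordinates", severity=9, occurrence=2, detectability=4),
          FailureMode("collimator size mis-set", severity=7, occurrence=3, detectability=3),
      ]
      for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
          print(f"{m.step}: RPN={m.rpn}, high risk={m.high_risk}")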

  10. Toward the reconstitution of synthetic cell motility

    PubMed Central

    Siton-Mendelson, Orit; Bernheim-Groswasser, Anne

    2016-01-01

    ABSTRACT Cellular motility is a fundamental process essential for embryonic development, wound healing, immune responses, and tissue development. Cells mostly move by crawling on, or inside, substrates that can differ in their surface composition, geometry, and dimensionality. Cells can adopt different migration phenotypes, e.g., bleb-based and protrusion-based, depending on myosin contractility, surface adhesion, and cell confinement. In the past few decades, research on cell motility has focused on uncovering the major molecular players and their order of events. Despite major progress, our ability to infer the collective behavior from the molecular properties remains a major challenge, especially because cell migration integrates numerous chemical and mechanical processes that are coupled via feedbacks spanning a large range of time and length scales. For this reason, reconstituted model systems were developed. These systems allow full control of the molecular constituents and various system parameters, thereby providing insight into their individual roles and functions. In this review we describe the various reconstituted model systems that were developed in the past decades. Because of the multiple steps involved in cell motility and the complexity of the overall process, most of the model systems focus on very specific aspects of the individual steps of cell motility. Here we describe the main advances in cell motility reconstitution and discuss the main challenges toward the realization of a synthetic motile cell. PMID:27019160

  11. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps as well as identifying article relevance to biosurveillance (e.g., relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
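
    The relevance-scoring step mentioned above is a standard text-classification problem. As a generic, hedged sketch (not the chapter's actual model or features), the example below trains a TF-IDF plus logistic-regression classifier on a tiny, made-up labeled set and scores an incoming headline.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Toy labeled examples (1 = relevant to biosurveillance, 0 = not).
      texts = [
          "Cluster of unexplained respiratory illness reported in regional hospital",
          "Ministry confirms avian influenza outbreak in poultry farms",
          "Local football team wins championship after penalty shootout",
          "New smartphone model released with larger display",
      ]
      labels = [1, 1, 0, 0]

      relevance_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
      relevance_model.fit(texts, labels)

      incoming = ["Health officials investigate sudden spike in hospital admissions"]
      print(relevance_model.predict_proba(incoming)[0, 1])   # probability the article is relevant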

  12. [Implementation of a rational standard of hygiene for preparation of operating rooms].

    PubMed

    Bauer, M; Scheithauer, S; Moerer, O; Pütz, H; Sliwa, B; Schmidt, C E; Russo, S G; Waeschle, R M

    2015-10-01

    The assurance of high standards of care is a major requirement in German hospitals, while cost reduction and efficient use of resources are mandatory. These requirements are particularly evident in the high-risk and cost-intensive operating theatre, with its multiple process steps. The cleaning of operating rooms (OR) between surgical procedures is of major relevance for patient safety and requires time and human resources. The hygiene procedure plan for OR cleaning between operations at the university hospital in Göttingen was revised and optimized according to the plan-do-check-act principle because responsibilities and resource use were not clearly specified and because of prolonged process times and increased staff effort. The current status was evaluated in 2012 as part of the first step, "plan". The subsequent step, "do", included an expert symposium with external consultants, interdisciplinary consensus conferences to update the former hygiene procedure plan, and the implementation process. All staff members involved were integrated into this change management process. The penetration rate of the training and information measures, as well as acceptance of and compliance with the new hygiene procedure plan, were reviewed within the step "check". The rates of positive swabs and air samples, as well as of postoperative wound infections, were analyzed for quality control, and no evidence of a reduced effectiveness of the new hygiene plan was found. After the successful implementation of these measures, the next improvement cycle ("act") was performed in 2014, which led to a simplification of the hygiene plan by reducing the number of defined cleaning and disinfection programs for preparation of the OR. The reorganization measures described led to comprehensive adherence to the hygiene procedure plan through distinct specifications of responsibilities, of the course of action, and of the use of resources. Furthermore, a simplification of the plan, rational staff assignment, and reduced process times were accomplished. Finally, potential conflicts due to insufficient evidence-based knowledge among personnel were reduced. This project description can be used by other hospitals as a guideline for similar changes in management processes.

  13. Space experiment development process

    NASA Technical Reports Server (NTRS)

    Depauw, James F.

    1987-01-01

    Described is a process for developing space experiments utilizing the Space Shuttle. The role of the Principal Investigator is described as well as the Principal Investigator's relation with the project development team. Described also is the sequence of events from an early definition phase through the steps of hardware development. The major interactions between the hardware development program and the Shuttle integration and safety activities are also shown. The presentation is directed to people with limited Shuttle experiment experience. The objective is to summarize the development process, discuss the roles of major participants, and list some lessons learned. Two points should be made at the outset. First, no two projects are the same so the process varies from case to case. Second, the emphasis here is on Code EN/Microgravity Science and Applications Division (MSAD).

  14. Mechanism and the origins of stereospecificity in copper-catalyzed ring expansion of vinyl oxiranes: a traceless dual transition-metal-mediated process.

    PubMed

    Mustard, Thomas J L; Mack, Daniel J; Njardarson, Jon T; Cheong, Paul Ha-Yeon

    2013-01-30

    Density functional theory computations show that the Cu-catalyzed ring expansion of vinyl oxiranes is mediated by a traceless dual Cu(I)-catalyst mechanism. Overall, the reaction involves a monomeric Cu(I)-catalyst, but a single key step, the Cu migration, requires two Cu(I)-catalysts for the transformation. This dual-Cu step is found to be a true double Cu(I) transition state rather than a single Cu(I) transition state in the presence of an adventitious, spectator Cu(I). Both Cu(I) catalysts are involved in the bond forming and breaking process. The single Cu(I) transition state is not a stationary point on the potential energy surface. Interestingly, the reductive elimination is rate-determining for the major diastereomeric product, while the Cu(I) migration step is rate-determining for the minor. Thus, while the reaction requires dual Cu(I) activation to proceed, kinetically, the presence of the dual-Cu(I) step is untraceable. The diastereospecificity of this reaction is controlled by the Cu migration step. Suprafacial migration is favored over antarafacial migration due to the distorted Cu π-allyl in the latter.

  15. An Update on the NASA Planetary Science Division Research and Analysis Program

    NASA Astrophysics Data System (ADS)

    Richey, Christina; Bernstein, Max; Rall, Jonathan

    2015-01-01

    Introduction: NASA's Planetary Science Division (PSD) solicits its Research and Analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD will be changing the structure of the program elements under which the majority of planetary science R&A is done. Major changes include the creation of five core research program elements aligned with PSD's strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2014 submission changes: All PSD programs will use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address the science goals and objectives, and the relevance of the proposed research to the call to which it is submitted. Additional Information: Additional details will be provided on the Cassini Data Analysis Program, the Exoplanets Research program, and the Discovery Data Analysis Program, for which Dr. Richey is the Lead Program Officer.

  16. Synthesis and purification of 1,3,5-triamino-2,4,6-trinitrobenzene (TATB)

    DOEpatents

    Mitchell, Alexander R [Livermore, CA; Coburn, Michael D [Santa Fe, NM; Lee, Gregory S [San Ramon, CA; Schmidt, Robert D [Livermore, CA; Pagoria, Philip F [Livermore, CA; Hsu, Peter C [Pleasanton, CA

    2006-06-06

    A method to convert surplus nitroarene explosives (picric acid, ammonium picrate) into TATB is described. The process comprises three major steps: conversion of picric acid/ammonium picrate into picramide; conversion of picramide to TATB through vicarious nucleophilic substitution (VNS) of hydrogen chemistry; and purification of TATB.

  17. When New Boundaries Abound: A Systematic Approach to Redistricting.

    ERIC Educational Resources Information Center

    Creighton, Roger L.; Irwin, Armond J.

    1994-01-01

    A systematic approach to school redistricting that was developed over the past half-dozen years utilizes a computer. Crucial to achieving successful results are accuracy of data, enrollment forecasting, and citizen participation. Outlines the major steps of a typical redistricting study. One figure illustrates the redistricting process. (MLF)

  18. Blueprint for Acquisition Reform, Version 3.0

    DTIC Science & Technology

    2008-07-01

    ... represents a substantial and immediate step forward in establishing the Coast Guard as a model mid-sized federal agency for acquisition processes ... Blueprint for Acquisition Reform in the U.S. Coast Guard: "The Coast Guard must become the model for mid-sized Federal agency acquisition in process ..." ... acquisition (DoD 5000 model > CG Major Systems Acquisition Manual) ... Deepwater Program Executive Officer (PEO): System of Systems performance-based ...

  19. Reforming Dutch substance abuse treatment services.

    PubMed

    Schippers, Gerard M; Schramade, Mark; Walburg, Jan A

    2002-01-01

    The Dutch substance abuse treatment system is in the middle of a major reorganization. The goal is to improve outcomes by redesigning all major primary treatment processes and by implementing a system of regular monitoring and feedback of clinical outcome data. The new program includes implementing standardized psychosocial behavior-oriented treatment modalities and a stepped-care patient placement algorithm in a core-shell organizational model. This article outlines the new program and presents its objectives, developmental stages, and current status.

  20. A proposed adaptive step size perturbation and observation maximum power point tracking algorithm based on photovoltaic system modeling

    NASA Astrophysics Data System (ADS)

    Huang, Yu

    Solar energy has become one of the major renewable energy options because of its abundance and accessibility. Owing to the intermittent nature of sunlight, Maximum Power Point Tracking (MPPT) techniques are in high demand when a photovoltaic (PV) system is used to extract energy from it. This thesis proposes an advanced Perturbation and Observation (P&O) algorithm aimed at practical operating conditions. First, a practical PV system model is studied, including determination of the series and shunt resistances that are neglected in some research. In the proposed algorithm, the duty ratio of a boost DC-DC converter is the perturbed variable, with input impedance conversion used to adjust the operating voltage. Based on this control strategy, an adaptive duty-ratio step-size P&O algorithm is proposed, with major modifications for sharp insolation changes as well as low-insolation scenarios. Matlab/Simulink simulations of the PV model, the boost converter control strategy, and the various MPPT processes are conducted step by step. The proposed adaptive P&O algorithm is validated by the simulation results and by detailed analysis of sharp insolation changes, low-insolation conditions, and continuous insolation variation.
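
    As a hedged, generic illustration of the perturb-and-observe idea described above (not the thesis's exact algorithm or step-size rule), the sketch below perturbs a converter duty ratio, observes the resulting power change, reverses direction when power drops, and scales the step size with the normalized power change; the toy PV/converter model and all parameter values are assumptions made for the example.

      def adaptive_p_and_o(measure, duty=0.5, base_step=0.01, k=0.05, iterations=200):
          """Generic adaptive-step perturb-and-observe MPPT loop.

          measure(duty) -> (voltage, current) at the PV terminals for the given
          converter duty ratio. The step-size scaling below is one common
          heuristic, not the scheme proposed in the thesis."""
          voltage, current = measure(duty)
          prev_power = voltage * current
          step = base_step
          for _ in range(iterations):
              duty = min(max(duty + step, 0.0), 1.0)        # perturb the duty ratio
              voltage, current = measure(duty)              # observe the new operating point
              power = voltage * current
              d_power = power - prev_power
              direction = 1.0 if step > 0 else -1.0
              if d_power < 0:                               # last perturbation reduced power: reverse
                  direction = -direction
              step = direction * (base_step + k * abs(d_power) / max(power, 1e-9))
              prev_power = power
          return duty

      # Toy PV/converter model whose output power peaks at duty = 0.62.
      def toy_measure(duty):
          power = max(0.0, 100.0 - 400.0 * (duty - 0.62) ** 2)
          voltage = 30.0 * (1.0 - duty) + 1e-3
          return voltage, power / voltage

      print(round(adaptive_p_and_o(toy_measure), 3))        # should settle near 0.62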

  1. Monitoring stream temperatures—A guide for non-specialists

    USGS Publications Warehouse

    Heck, Michael P.; Schultz, Luke D.; Hockman-Wert, David; Dinger, Eric C.; Dunham, Jason B.

    2018-04-19

    Executive Summary: Water temperature influences most physical and biological processes in streams, and along with streamflows is a major driver of ecosystem processes. Collecting data to measure water temperature is therefore imperative, and relatively straightforward. Several protocols exist for collecting stream temperature data, but these are frequently directed towards specialists. This document was developed to address the need for a protocol intended for non-specialist (non-aquatic) staff. It provides specific step-by-step procedures on (1) how to launch data loggers, (2) how to check the factory calibration of data loggers prior to field use, (3) how to install data loggers in streams for year-round monitoring, (4) how to download and retrieve data loggers from the field, and (5) how to input project data into organizational databases.

  2. A policy framework for accelerating adoption of new vaccines

    PubMed Central

    Hajjeh, Rana; Wecker, John; Cherian, Thomas; O'Brien, Katherine L; Knoll, Maria Deloria; Privor-Dumm, Lois; Kvist, Hans; Nanni, Angeline; Bear, Allyson P; Santosham, Mathuram

    2010-01-01

    Rapid uptake of new vaccines can improve health and wealth and contribute to meeting Millennium Development Goals. In the past, however, the introduction and use of new vaccines has been characterized by delayed uptake in the countries where the need is greatest. Based on experience with accelerating the adoption of Hib, pneumococcal and rotavirus vaccines, we propose here a framework for new vaccine adoption that may be useful for future efforts. The framework organizes the major steps in the process into a continuum from evidence to policy, implementation and finally access. It highlights the important roles of different actors at various times in the process and may allow new vaccine initiatives to save time and improve their efficiency by anticipating key steps and actions. PMID:21150269

  3. A policy framework for accelerating adoption of new vaccines.

    PubMed

    Levine, Orin S; Hajjeh, Rana; Wecker, John; Cherian, Thomas; O'Brien, Katherine L; Knoll, Maria Deloria; Privor-Dumm, Lois; Kvist, Hans; Nanni, Angeline; Bear, Allyson P; Santosham, Mathuram

    2010-12-01

    Rapid uptake of new vaccines can improve health and wealth and contribute to meeting Millennium Development Goals. In the past, however, the introduction and use of new vaccines has been characterized by delayed uptake in the countries where the need is greatest. Based on experience with accelerating the adoption of Hib, pneumococcal and rotavirus vaccines, we propose here a framework for new vaccine adoption that may be useful for future efforts. The framework organizes the major steps in the process into a continuum from evidence to policy, implementation and finally access. It highlights the important roles of different actors at various times in the process and may allow new vaccine initiatives to save time and improve their efficiency by anticipating key steps and actions.

  4. One-Step Reforming of CO2 and CH4 into High-Value Liquid Chemicals and Fuels at Room Temperature by Plasma-Driven Catalysis.

    PubMed

    Wang, Li; Yi, Yanhui; Wu, Chunfei; Guo, Hongchen; Tu, Xin

    2017-10-23

    The conversion of CO2 with CH4 into liquid fuels and chemicals in a single-step catalytic process that bypasses the production of syngas remains a challenge. In this study, liquid fuels and chemicals (e.g., acetic acid, methanol, ethanol, and formaldehyde) were synthesized in a one-step process from CO2 and CH4 at room temperature (30 °C) and atmospheric pressure for the first time by using a novel plasma reactor with a water electrode. The total selectivity to oxygenates was approximately 50-60 %, with acetic acid being the major component at 40.2 % selectivity, the highest value reported for acetic acid thus far. Interestingly, the direct plasma synthesis of acetic acid from CH4 and CO2 is an ideal reaction with 100 % atom economy, but it is almost impossible by thermal catalysis owing to the significant thermodynamic barrier. The combination of plasma and catalyst in this process shows great potential for manipulating the distribution of liquid chemical products in a given process. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  5. Impaired Response Selection During Stepping Predicts Falls in Older People-A Cohort Study.

    PubMed

    Schoene, Daniel; Delbaere, Kim; Lord, Stephen R

    2017-08-01

    Response inhibition, an important executive function, has been identified as a risk factor for falls in older people. This study investigated whether step tests that include different levels of response inhibition differ in their ability to predict falls and whether such associations are mediated by measures of attention, speed, and/or balance. A cohort study with a 12-month follow-up was conducted in community-dwelling older people without major cognitive and mobility impairments. Participants underwent 3 step tests: (1) choice stepping reaction time (CSRT) requiring rapid decision making and step initiation; (2) inhibitory choice stepping reaction time (iCSRT) requiring additional response inhibition and response-selection (go/no-go); and (3) a Stroop Stepping Test (SST) under congruent and incongruent conditions requiring conflict resolution. Participants also completed tests of processing speed, balance, and attention as potential mediators. Ninety-three of the 212 participants (44%) fell in the follow-up period. Of the step tests, only components of the iCSRT task predicted falls in this time with the relative risk per standard deviation for the reaction time (iCSRT-RT) = 1.23 (95%CI = 1.10-1.37). Multiple mediation analysis indicated that the iCSRT-RT was independently associated with falls and not mediated through slow processing speed, poor balance, or inattention. Combined stepping and response inhibition as measured in a go/no-go test stepping paradigm predicted falls in older people. This suggests that integrity of the response-selection component of a voluntary stepping response is crucial for minimizing fall risk. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  6. Achieving Continuous Manufacturing for Final Dosage Formation: Challenges and How to Meet Them May 20-21 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous. In heterogeneous processing, there are discrete particles that can segregate, whereas in homogeneous processing, components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing technologies, whereas homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it takes long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the detriment that, as the technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, what we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation of heterogeneous to continuous processing, it is expected that homogeneous processing is the next step that will follow. Specific action items for industry leaders are also provided. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  7. Self-regenerating column chromatography

    DOEpatents

    Park, Woo K.

    1995-05-30

    The present invention provides a process for treating both cations and anions by using a self-regenerating, multi-ionic exchange resin column system which requires no separate regeneration steps. The process involves alternating ion-exchange chromatography for cations and anions in a multi-ionic exchange column packed with a mixture of cation and anion exchange resins. The multi-ionic mixed-charge resin column works as a multi-function column, capable of independently processing either cationic or anionic exchange, or simultaneously processing both cationic and anionic exchanges. The major advantage offered by the alternating multi-function ion exchange process is the self-regeneration of the resins.

  8. Network meta-analysis: application and practice using Stata

    PubMed Central

    2017-01-01

    This review aimed to organize the concepts of network meta-analysis (NMA) and to demonstrate the analytical process of NMA using Stata software under a frequentist framework. An NMA synthesizes evidence for decision making by evaluating the comparative effectiveness of more than two alternative interventions for the same condition. Before conducting an NMA, three major assumptions (similarity, transitivity, and consistency) should be checked. The statistical analysis consists of five steps. The first step is to draw a network geometry to provide an overview of the network relationships. The second step checks the assumption of consistency. The third step is to make the network forest plot or interval plot in order to illustrate the summary effect sizes of comparative effectiveness among the various interventions. The fourth step calculates cumulative rankings for identifying superiority among interventions. The last step evaluates publication bias or effect modifiers to allow a valid inference from the results. The evidence synthesized through these five steps is highly useful for evidence-based decision making in healthcare. Thus, NMA should be promoted in order to help guarantee the quality of the healthcare system. PMID:29092392
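
    The fourth step above, cumulative ranking, is commonly reported as SUCRA (surface under the cumulative ranking curve). The analysis in the review is done in Stata, but as a language-neutral, hedged illustration of the arithmetic only, the sketch below converts a made-up matrix of rank probabilities into SUCRA scores.

      import numpy as np

      def sucra(rank_probs):
          """SUCRA from a (treatments x ranks) matrix of rank probabilities,
          where rank_probs[j, b] = P(treatment j has rank b+1), rank 1 = best."""
          rank_probs = np.asarray(rank_probs, dtype=float)
          n_treat, n_ranks = rank_probs.shape
          cumulative = np.cumsum(rank_probs, axis=1)        # P(rank <= b)
          return cumulative[:, :-1].sum(axis=1) / (n_ranks - 1)

      # Hypothetical rank probabilities for three interventions A, B, C.
      probs = [[0.70, 0.20, 0.10],    # A is most often ranked first
               [0.20, 0.60, 0.20],
               [0.10, 0.20, 0.70]]
      for name, score in zip("ABC", sucra(probs)):
          print(name, round(score, 2))                      # A = 0.8, B = 0.5, C = 0.2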

  9. Network meta-analysis: application and practice using Stata.

    PubMed

    Shim, Sungryul; Yoon, Byung-Ho; Shin, In-Soo; Bae, Jong-Myon

    2017-01-01

    This review aimed to organize the concepts of network meta-analysis (NMA) and to demonstrate the analytical process of NMA using Stata software under a frequentist framework. An NMA synthesizes evidence for decision making by evaluating the comparative effectiveness of more than two alternative interventions for the same condition. Before conducting an NMA, three major assumptions (similarity, transitivity, and consistency) should be checked. The statistical analysis consists of five steps. The first step is to draw a network geometry to provide an overview of the network relationships. The second step checks the assumption of consistency. The third step is to make the network forest plot or interval plot in order to illustrate the summary effect sizes of comparative effectiveness among the various interventions. The fourth step calculates cumulative rankings for identifying superiority among interventions. The last step evaluates publication bias or effect modifiers to allow a valid inference from the results. The evidence synthesized through these five steps is highly useful for evidence-based decision making in healthcare. Thus, NMA should be promoted in order to help guarantee the quality of the healthcare system.

  10. One Step beyond What the Literature Says on Institutional Effectiveness of Community, Junior, and Technical Colleges.

    ERIC Educational Resources Information Center

    Welker, William F.; Morgan, Samuel D.

    1991-01-01

    Analyzes the content of the literature on community, junior, and technical colleges, utilizing a matrix with seven major classifications (i.e., governance, organization, staffing, clientele, curriculum, finance, and evaluation) and three dimensions (i.e., structure/form, function/role, and process/operations). Offers observations on effectiveness…

  11. In Step with Technology: Can We Keep Up?

    ERIC Educational Resources Information Center

    James, Marcia L.

    1996-01-01

    A survey of 363 business communication instructors (146 responses) found a majority included technology in their courses (especially word processing, e-mail); 97% were self-taught. A test of a microcomputer applications course used a business plan project and a training unit project to teach students about home page development; the curriculum…

  12. Issues Management Process Course # 38401

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binion, Ula Marie

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory’s IM process; Understand your role in the Laboratory’s IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory’s IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.
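
    As a purely hypothetical illustration of an initial risk-level screening step (the actual LANL risk categories and criteria are not given in this record), the sketch below maps made-up likelihood and consequence scores to a risk level that would then determine which process steps are required for an issue.

      def screening_risk_level(likelihood: int, consequence: int) -> str:
          """Toy initial-screening matrix: combine 1-3 likelihood and 1-3
          consequence scores into a risk level (purely illustrative)."""
          score = likelihood * consequence
          if score >= 6:
              return "high"
          if score >= 3:
              return "medium"
          return "low"

      # Hypothetical issues with (likelihood, consequence) scores.
      issues = {"procedure not followed": (2, 3), "label typo": (1, 1)}
      for issue, (lik, con) in issues.items():
          print(issue, "->", screening_risk_level(lik, con))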

  13. Load forecasting via suboptimal seasonal autoregressive models and iteratively reweighted least squares estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mbamalu, G.A.N.; El-Hawary, M.E.

    The authors propose suboptimal least squares or IRWLS procedures for estimating the parameters of a seasonal multiplicative AR model encountered in power system load forecasting. The proposed method uses an interactive computer environment to estimate the parameters of a seasonal multiplicative AR process and comprises five major computational steps. The first determines the order of the seasonal multiplicative AR process, and the second uses least squares or IRWLS to estimate the optimal nonseasonal AR model parameters. In the third step the intermediate series is obtained by back forecasting, after which least squares or IRWLS is used to estimate the optimal seasonal AR parameters. The final step uses the estimated parameters to forecast future load. The method is applied to predict the Nova Scotia Power Corporation's hourly load at lead times up to 168 hours. The results obtained are documented and compared with results based on the Box and Jenkins method.
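
    As a hedged, generic illustration of the reweighting idea only (not the authors' seasonal multiplicative procedure), the sketch below fits a plain AR(p) model by iteratively reweighted least squares with Huber-type weights and checks it on a simulated series; the weight function, tuning constant, and simulated data are assumptions made for the example.

      import numpy as np

      def irwls_ar(series, order, c=1.345, iterations=10):
          """Fit AR(order) coefficients by iteratively reweighted least squares
          with Huber weights (illustrative, not the paper's exact scheme)."""
          y = np.asarray(series, dtype=float)
          n = len(y)
          X = np.column_stack([y[order - k - 1:n - k - 1] for k in range(order)])  # lagged regressors
          target = y[order:]
          coeffs = np.linalg.lstsq(X, target, rcond=None)[0]       # ordinary least squares start
          for _ in range(iterations):
              resid = target - X @ coeffs
              scale = np.median(np.abs(resid)) / 0.6745 + 1e-12     # robust residual scale
              u = np.abs(resid) / scale
              w = np.where(u <= c, 1.0, c / u)                      # Huber weights
              sw = np.sqrt(w)
              coeffs = np.linalg.lstsq(sw[:, None] * X, sw * target, rcond=None)[0]
          return coeffs

      # Simulated AR(2) series for a quick check; should recover roughly [0.6, 0.3].
      rng = np.random.default_rng(3)
      x = np.zeros(500)
      for t in range(2, 500):
          x[t] = 0.6 * x[t - 1] + 0.3 * x[t - 2] + rng.normal()
      print(np.round(irwls_ar(x, order=2), 2))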

  14. Synthesis of indolizidinone analogues of cytotoxic alkaloids: monocyclic precursors are also active.

    PubMed

    Boto, Alicia; Miguélez, Javier; Marín, Raquel; Díaz, Mario

    2012-05-15

    Readily available proline derivatives can be transformed in just two steps into analogues of cytotoxic phenanthroindolizidine alkaloids. The key step uses a sequential radical scission-oxidation-alkylation process, which yields 2-substituted pyrrolidine amides. A second process effects the cyclization to give the desired alkaloid analogues, which possess an indolizidine core. The major and minor isomers (dr 3:2 to 3:1) can be easily separated, allowing their use to study structure-activity relationships (SAR). The process is versatile and allows the introduction of aryl and heteroaryl groups (including biphenyl, halogenated phenyl, and pyrrole rings). Some of these alkaloid analogues displayed a selective cytotoxic activity against tumorogenic human neuronal and mammary cancer cells, and one derivative caused around 80% cell death in both tumor lines at micromolar doses. The cytotoxicity of some monocyclic precursors was also studied, being comparable or superior to the bicyclic derivatives. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. The automated array assembly task of the low-cost silicon solar array project, phase 2

    NASA Technical Reports Server (NTRS)

    Coleman, M. G.; Pryor, R. A.; Sparks, T. G.; Legge, R.; Saltzman, D. L.

    1980-01-01

    Several specific processing steps as part of a total process sequence for manufacturing silicon solar cells were studied. Ion implantation was identified as the preferred process step for impurity doping. Unanalyzed beam ion implantation was shown to have major cost advantages over analyzed beam implantation. Further, high quality cells were fabricated using a high current unanalyzed beam. Mechanically masked plasma patterning of silicon nitride was shown to be capable of forming fine lines on silicon surfaces with spacings between mask and substrate as great as 250 micrometers. Extensive work was performed on advances in plated metallization. The need for the thick electroless palladium layer was eliminated. Further, copper was successfully utilized as a conductor layer utilizing nickel as a barrier to copper diffusion into the silicon. Plasma etching of silicon for texturing and saw damage removal was shown technically feasible but not cost effective compared to wet chemical etching techniques.

  16. Scale-up and economic analysis of biodiesel production from municipal primary sewage sludge.

    PubMed

    Olkiewicz, Magdalena; Torres, Carmen M; Jiménez, Laureano; Font, Josep; Bengoa, Christophe

    2016-08-01

    Municipal wastewater sludge is a promising lipid feedstock for biodiesel production, but the need to eliminate the high water content before lipid extraction is the main limitation for scaling up. This study evaluates the economic feasibility of biodiesel production directly from liquid primary sludge based on experimental data at laboratory scale. Computational tools were used for the modelling of the process scale-up and the different configurations of lipid extraction to optimise this step, as it is the most expensive. The operational variables with a major influence on the cost were the extraction time and the amount of solvent. The optimised extraction process had a break-even price of biodiesel of 1232 $/t, being economically competitive with the current cost of fossil diesel. The proposed biodiesel production process from waste sludge eliminates the expensive step of sludge drying, lowering the biodiesel price. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Intensified recovery of valuable products from whey by use of ultrasound in processing steps - A review.

    PubMed

    Gajendragadkar, Chinmay N; Gogate, Parag R

    2016-09-01

    The current review focuses on the analysis of different aspects related to intensified recovery of potentially valuable products from cheese whey using ultrasound. Ultrasound can be used for process intensification in processing steps such as pre-treatment, ultrafiltration, spray drying, and crystallization. The combination of low-frequency, high-intensity ultrasound with pre-heat treatment minimizes the thickening or gelling of protein-containing whey solutions. These characteristics of whey after ultrasound-assisted pretreatment help improve the efficacy of ultrafiltration used for separation and also help prevent blockage of the orifice of the spray-dryer atomizing device. Further, the heat stability of the whey proteins is increased. In the subsequent processing step, the use of ultrasound-assisted atomization helps to reduce treatment times as well as to yield better-quality whey protein concentrate (WPC) powder. After the removal of proteins from the whey, lactose is the major constituent remaining in the solution, and it can be efficiently recovered by sonocrystallization using ethanol as an anti-solvent. The scale-up parameters to be considered when designing the process for large-scale applications are also discussed, along with an analysis of various reactor designs. Overall, it appears that the use of ultrasound can give significant process intensification benefits that can be harnessed even in commercial-scale applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Writing an academic essay: a practical guide for nurses.

    PubMed

    Booth, Y

    Writing academic essays can be a major hurdle and source of anxiety for many students. Fears and misconceptions relating to this kind of writing can be dispelled if the task is approached in a logical and systematic manner. This article outlines the key steps involved in successfully completing an essay and provides some practical tips to facilitate critical and analytical writing. These steps are: analysing the task; exploring the subject; planning the essay; writing the account; and revising the drafts. Although this process is challenging, academic writing is a means of developing both personally and professionally.

  19. Perspectives on the manufacture of combination vaccines.

    PubMed

    Vose, J R

    2001-12-15

    Evolving regulatory requirements in the United States and Europe create major challenges for manufacturers tasked with production of vaccines that contain nine or more separate antigens capable of protecting against infectious diseases, such as diphtheria, tetanus, pertussis, polio, hepatitis B, and Haemophilus influenzae type b, in a single shot. This article describes 10 steps that can facilitate the process of licensing these complex vaccines. It also points out problems associated with the use of animal tests for the crucial step of potency testing for batch release, caused by the inherent variability of such tests and the difficulties of interpreting their results.

  20. Resolving the infection process reveals striking differences in the contribution of environment, genetics and phylogeny to host-parasite interactions.

    PubMed

    Duneau, David; Luijckx, Pepijn; Ben-Ami, Frida; Laforsch, Christian; Ebert, Dieter

    2011-02-22

    Infection processes consist of a sequence of steps, each critical for the interaction between host and parasite. Studies of host-parasite interactions rarely take into account the fact that different steps might be influenced by different factors and might, therefore, make different contributions to shaping coevolution. We designed a new method using the Daphnia magna - Pasteuria ramosa system, one of the rare examples where coevolution has been documented, in order to resolve the steps of the infection and analyse the factors that influence each of them. Using the transparent Daphnia hosts and fluorescently-labelled spores of the bacterium P. ramosa, we identified a sequence of infection steps: encounter between parasite and host; activation of parasite dormant spores; attachment of spores to the host; and parasite proliferation inside the host. The chances of encounter had been shown to depend on host genotype and environment. We tested the role of genetic and environmental factors in the newly described activation and attachment steps. Hosts of different genotypes, gender and species were all able to activate endospores of all parasite clones tested in different environments; suggesting that the activation cue is phylogenetically conserved. We next established that parasite attachment occurs onto the host oesophagus independently of host species, gender and environmental conditions. In contrast to spore activation, attachment depended strongly on the combination of host and parasite genotypes. Our results show that different steps are influenced by different factors. Host-type-independent spore activation suggests that this step can be ruled out as a major factor in Daphnia-Pasteuria coevolution. On the other hand, we show that the attachment step is crucial for the pronounced genetic specificities of this system. We suggest that this one step can explain host population structure and could be a key force behind coevolutionary cycles. We discuss how different steps can explain different aspects of the coevolutionary dynamics of the system: the properties of the attachment step, explaining the rapid evolution of infectivity and the properties of later parasite proliferation explaining the evolution of virulence. Our study underlines the importance of resolving the infection process in order to better understand host-parasite interactions.

  1. Operator models for delivering municipal solid waste management services in developing countries: Part B: Decision support.

    PubMed

    Soós, Reka; Whiteman, Andrew D; Wilson, David C; Briciu, Cosmin; Nürnberger, Sofia; Oelz, Barbara; Gunsilius, Ellen; Schwehn, Ekkehard

    2017-08-01

    This is the second of two papers reporting the results of a major study considering 'operator models' for municipal solid waste management (MSWM) in emerging and developing countries. Part A documents the evidence base, while Part B presents a four-step decision support system for selecting an appropriate operator model in a particular local situation. Step 1 focuses on understanding local problems and framework conditions; Step 2 on formulating and prioritising local objectives; and Step 3 on assessing capacities and conditions, and thus identifying strengths and weaknesses, which underpin selection of the operator model. Step 4A addresses three generic questions, including public versus private operation, inter-municipal co-operation and integration of services. For steps 1-4A, checklists have been developed as decision support tools. Step 4B helps choose locally appropriate models from an evidence-based set of 42 common operator models (coms); decision support tools here are a detailed catalogue of the coms, setting out advantages and disadvantages of each, and a decision-making flowchart. The decision-making process is iterative, repeating steps 2-4 as required. The advantages of a more formal process include avoiding pre-selection of a particular com known to and favoured by one decision maker, and also its assistance in identifying the possible weaknesses and aspects to consider in the selection and design of operator models. To make the best of whichever operator models are selected, key issues which need to be addressed include the capacity of the public authority as 'client', management in general and financial management in particular.

  2. Modeling fatigue.

    PubMed

    Sumner, Walton; Xu, Jin Zhong

    2002-01-01

    The American Board of Family Practice is developing a patient simulation program to evaluate diagnostic and management skills. The simulator must give temporally and physiologically reasonable answers to symptom questions such as "Have you been tired?" A three-step process generates symptom histories. In the first step, the simulator determines points in time where it should calculate instantaneous symptom status. In the second step, a Bayesian network implementing a roughly physiologic model of the symptom generates a value on a severity scale at each sampling time. Positive, zero, and negative values represent increased, normal, and decreased status, as applicable. The simulator plots these values over time. In the third step, another Bayesian network inspects this plot and reports how the symptom changed over time. This mechanism handles major trends, multiple and concurrent symptom causes, and gradually effective treatments. Other temporal insights, such as observations about short-term symptom relief, require complementary mechanisms.
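
    The three-step history-generation mechanism can be pictured as a small pipeline: choose sampling times, score instantaneous severity, then summarize the trend. The Python sketch below is a toy stand-in under that reading; the simple scoring rule replaces the Bayesian networks used by the actual simulator, and all names and numbers are illustrative.

        import random

        def sample_times(start_day, end_day, n):
            # Step 1: choose the points in time at which symptom status is evaluated
            return sorted(random.uniform(start_day, end_day) for _ in range(n))

        def severity(t, onset, treatment_start):
            # Step 2: stand-in for the physiologic Bayesian network; positive values
            # mean increased fatigue, zero means normal status
            s = 2.0 if t >= onset else 0.0
            if t >= treatment_start:                      # gradually effective treatment
                s = max(0.0, s - 0.1 * (t - treatment_start))
            return s

        def report(values):
            # Step 3: stand-in for the network that inspects the plot over time
            return "improving" if values[-1] < values[0] else "stable or worsening"

        times = sample_times(0, 60, 12)                   # days
        values = [severity(t, onset=5, treatment_start=30) for t in times]
        print("fatigue over time:", report(values))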

  3. Interactions of aniline with soil and groundwater at an industrial spill site.

    PubMed Central

    Kosson, D S; Byrne, S V

    1995-01-01

    The interactions of aniline with soil at an industrial spill site were investigated. Sorption of aniline to the soil was observed to occur through a two-step mechanism. The first step was an ion exchange process with the protonated amine serving as an organic cation. This step was influenced by solution pH and ionic composition. The second step was covalent bonding most likely with quinone moieties and oxidation with polymerization of aniline. The extent of covalent bonding was influenced by the presence of oxygen and redox potential. The majority of aniline that was bound to the soil did not readily desorb under a variety of abiotic conditions. However, aniline was released to a significant extent in the presence of denitrifying and methanogenic microbial activity. Aniline in aqueous solution was readily biodegradable under aerobic and denitrifying conditions. Soil-bound aniline was observed not to be biodegradable. This paper provides an overview of results. PMID:8565915

  4. Step-by-step seeding procedure for preparing HKUST-1 membrane on porous α-alumina support.

    PubMed

    Nan, Jiangpu; Dong, Xueliang; Wang, Wenjin; Jin, Wanqin; Xu, Nanping

    2011-04-19

    Metal-organic framework (MOF) membranes have attracted considerable attention because of their striking advantages in small-molecule separation. The preparation of an integrated MOF membrane is still a major challenge. Depositing a uniform seed layer on a support for secondary growth is a main route to obtaining an integrated MOF membrane. A novel seeding method to prepare HKUST-1 (known as Cu(3)(btc)(2)) membranes on porous α-alumina supports is reported. The in situ production of the seed layer was realized in step-by-step fashion via the coordination of H(3)btc and Cu(2+) on an α-alumina support. The formation process of the seed layer was observed by ultraviolet-visible absorption spectroscopy and atomic force microscopy. An integrated HKUST-1 membrane could be synthesized by the secondary hydrothermal growth on the seeded support. The gas permeation performance of the membrane was evaluated. © 2011 American Chemical Society

  5. U.S. data processing for the IRAS project. [by Jet Propulsion Laboratory Scientific Data Analysis System

    NASA Technical Reports Server (NTRS)

    Duxbury, J. H.

    1983-01-01

    JPL's Scientific Data Analysis System (SDAS), which will process IRAS data and produce a catalogue of perhaps a million infrared sources in the sky, as well as other information for astronomical records, is described. The purposes of SDAS are discussed, and the major SDAS processors are shown in a block diagram. The catalogue processing is addressed, mentioning the basic processing steps which will be applied to raw detector data. Signal reconstruction and conversion to astrophysical units, source detection, source confirmation, data management, and survey data products are considered in detail.

  6. Composite chronicles: A study of the lessons learned in the development, production, and service of composite structures

    NASA Technical Reports Server (NTRS)

    Vosteen, Louis F.; Hadcock, Richard N.

    1994-01-01

    A study of past composite aircraft structures programs was conducted to determine the lessons learned during the programs. The study focused on finding major underlying principles and practices that experience showed have significant effects on the development process and should be recognized and understood by those responsible for the use of composites. Published information on programs was reviewed and interviews were conducted with personnel associated with current and past major development programs. In all, interviews were conducted with about 56 people representing 32 organizations. Most of the people interviewed have been involved in the engineering and manufacturing development of composites for the past 20 to 25 years. Although composites technology has made great advances over the past 30 years, the effective application of composites to aircraft is still a complex problem that requires experienced personnel with special knowledge. All disciplines involved in the development process must work together in real time to minimize risk and assure total product quality and performance at acceptable costs. The most successful programs have made effective use of integrated, collocated, concurrent engineering teams, and most often used well-planned, systematic development efforts wherein the design and manufacturing processes are validated in a step-by-step or 'building block' approach. Such approaches reduce program risk and are cost effective.

  7. Transcriptome and Small RNA Deep Sequencing Reveals Deregulation of miRNA Biogenesis in Human Glioma

    PubMed Central

    Moore, Lynette M.; Kivinen, Virpi; Liu, Yuexin; Annala, Matti; Cogdell, David; Liu, Xiuping; Liu, Chang-Gong; Sawaya, Raymond; Yli-Harja, Olli; Shmulevich, Ilya; Fuller, Gregory N.; Zhang, Wei; Nykter, Matti

    2013-01-01

    Altered expression of oncogenic and tumor-suppressing microRNAs (miRNAs) is widely associated with tumorigenesis. However, the regulatory mechanisms underlying these alterations are poorly understood. We sought to shed light on the deregulation of miRNA biogenesis promoting the aberrant miRNA expression profiles identified in these tumors. Using sequencing technology to perform both whole-transcriptome and small RNA sequencing of glioma patient samples, we examined precursor and mature miRNAs to directly evaluate the miRNA maturation process, and interrogated expression profiles for genes involved in the major steps of miRNA biogenesis. We found that ratios of mature to precursor forms of a large number of miRNAs increased with the progression from normal brain to low-grade and then to high-grade gliomas. The expression levels of genes involved in each of the three major steps of miRNA biogenesis (nuclear processing, nucleo-cytoplasmic transport, and cytoplasmic processing) were systematically altered in glioma tissues. Survival analysis of an independent data set demonstrated that the alteration of genes involved in miRNA maturation correlates with survival in glioma patients. Direct quantification of miRNA maturation with deep sequencing demonstrated that deregulation of the miRNA biogenesis pathway is a hallmark for glioma genesis and progression. PMID:23007860
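
    The mature-to-precursor ratio used here as a readout of miRNA maturation can be computed directly from read counts in the small RNA and whole-transcriptome data. A minimal sketch, assuming per-miRNA count dictionaries and a pseudocount to avoid division by zero (the names and counts are illustrative, not from the study):

        def maturation_ratio(mature_counts, precursor_counts, pseudocount=1.0):
            # Ratio of mature to precursor reads per miRNA; higher values suggest
            # more efficient processing of the precursor into the mature form
            ratios = {}
            for mirna, mature in mature_counts.items():
                precursor = precursor_counts.get(mirna, 0)
                ratios[mirna] = (mature + pseudocount) / (precursor + pseudocount)
            return ratios

        # Illustrative counts only
        print(maturation_ratio({"miR-21": 5400, "miR-7": 120},
                               {"miR-21": 300,  "miR-7": 240}))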

  8. Gynecologic oncology group strategies to improve timeliness of publication.

    PubMed

    Bialy, Sally; Blessing, John A; Stehman, Frederick B; Reardon, Anne M; Blaser, Kim M

    2013-08-01

    The Gynecologic Oncology Group (GOG) is a multi-institution cooperative group funded by the National Cancer Institute to conduct clinical trials encompassing clinical and basic scientific research in gynecologic malignancies. These results are disseminated via publication in peer-reviewed journals. This process requires collaboration of numerous investigators located in diverse cancer research centers. Coordination of manuscript development is positioned within the Statistical and Data Center (SDC), thus allowing the SDC personnel to manage the process and refine strategies to promote earlier dissemination of results. A major initiative to improve timeliness utilizing the assignment, monitoring, and enforcement of deadlines for each phase of manuscript development is the focus of this investigation. Document improvement in timeliness via comparison of deadline compliance and time to journal submission due to expanded administrative and technologic initiatives implemented in 2006. Major steps in the publication process include generation of first draft by the First Author and submission to SDC, Co-author review, editorial review by Publications Subcommittee, response to journal critique, and revision. Associated with each step are responsibilities of First Author to write or revise, collaborating Biostatistician to perform analysis and interpretation, and assigned SDC Clinical Trials Editorial Associate to format/revise according to journal requirements. Upon the initiation of each step, a deadline for completion is assigned. In order to improve efficiency, a publications database was developed to track potential steps in manuscript development that enables the SDC Director of Administration and the Publications Subcommittee Chair to assign, monitor, and enforce deadlines. They, in turn, report progress to Group Leadership through the Operations Committee. The success of the strategies utilized to improve the GOG publication process was assessed by comparing the timeliness of each potential step in the development of primary Phase II manuscripts during 2003-2006 versus 2007-2010. Improvement was noted in 10 of 11 identified steps resulting in a cumulative average improvement of 240 days from notification of data maturity to First Author through first submission to a journal. Moreover, the average time to journal acceptance has improved by an average of 346 days. The investigation is based on only Phase II trials to ensure comparability of manuscript complexity. Nonetheless, the procedures employed are applicable to the development of any clinical trials manuscript. The assignment, monitoring, and enforcement of deadlines for all stages of manuscript development have resulted in increased efficiency and timeliness. The positioning and support of manuscript development within the SDC provide a valuable resource to authors in meeting assigned deadlines, accomplishing peer review, and complying with journal requirements.

  9. DNA Bipedal Motor Achieves a Large Number of Steps Due to Operation Using Microfluidics-Based Interface.

    PubMed

    Tomov, Toma E; Tsukanov, Roman; Glick, Yair; Berger, Yaron; Liber, Miran; Avrahami, Dorit; Gerber, Doron; Nir, Eyal

    2017-04-25

    Realization of bioinspired molecular machines that can perform many and diverse operations in response to external chemical commands is a major goal in nanotechnology, but current molecular machines respond to only a few sequential commands. Lack of effective methods for introduction and removal of command compounds and low efficiencies of the reactions involved are major reasons for the limited performance. We introduce here a user interface based on a microfluidics device and single-molecule fluorescence spectroscopy that allows efficient introduction and removal of chemical commands and enables detailed study of the reaction mechanisms involved in the operation of synthetic molecular machines. The microfluidics provided 64 consecutive DNA strand commands to a DNA-based motor system immobilized inside the microfluidics, driving a bipedal walker to perform 32 steps on a DNA origami track. The microfluidics enabled removal of redundant strands, resulting in a 6-fold increase in processivity relative to an identical motor operated without strand removal and significantly more operations than previously reported for user-controlled DNA nanomachines. In the motor operated without strand removal, redundant strands interfere with motor operation and reduce its performance. The microfluidics also enabled computer control of motor direction and speed. Furthermore, analysis of the reaction kinetics and motor performance in the absence of redundant strands, made possible by the microfluidics, enabled accurate modeling of the walker processivity. This enabled identification of dynamic boundaries and provided an explanation, based on the "trap state" mechanism, for why the motor did not perform an even larger number of steps. This understanding is very important for the development of future motors with significantly improved performance. Our universal interface enables two-way communication between user and molecular machine and, relying on concepts similar to that of solid-phase synthesis, removes limitations on the number of external stimuli. This interface, therefore, is an important step toward realization of reliable, processive, reproducible, and useful externally controlled DNA nanomachines.
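
    The processivity gain can be illustrated with a textbook geometric model, which is not the authors' trap-state model: if each step completes with probability p, the expected number of consecutive steps is 1/(1 - p), so a modest increase in per-step success yields a large increase in processivity. A minimal, purely illustrative Python sketch:

        def expected_steps(p_success):
            # Mean number of consecutive successful steps in a geometric model
            return 1.0 / (1.0 - p_success)

        for p in (0.90, 0.95, 0.983):
            print(f"per-step success {p:.3f} -> ~{expected_steps(p):.0f} steps on average")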

  10. Costume and Music-Specific Dance: A Structure for Experimentation with Process and Technology

    ERIC Educational Resources Information Center

    Brown, Nathan; Dasen, Ann; Trommer-Beardslee, Heather

    2016-01-01

    This article describes how the authors completed a project at Central Michigan University (CMU) with undergraduate theater majors and minors and dance minors as part of the annual mainstage dance concert. Although the concert is predominantly choreographed and designed by CMU faculty, students are engaged in every step of the performance and…

  11. THE DEVELOPMENT OF TRAINING OBJECTIVES.

    ERIC Educational Resources Information Center

    SMITH, ROBERT G., JR.

    A SIX-STEP PROCESS IS DESCRIBED FOR DEFINING JOB-RELEVANT OBJECTIVES FOR THE TRAINING OF MILITARY PERSONNEL. (1) A FORM OF SYSTEM ANALYSIS IS OUTLINED TO PROVIDE THE CONTEXT FOR THE STUDY OF A PARTICULAR MILITARY OCCUPATION SPECIALTY. (2) A TASK INVENTORY IS MADE OF THE MAJOR DUTIES IN THE JOB AND THE MORE SPECIFIC JOB TASKS ASSOCIATED WITH EACH…

  12. A Cognitive Science Approach to Writing. Technical Report No. 89.

    ERIC Educational Resources Information Center

    Bruce, Bertram; And Others

    This paper explores the process of writing from several perspectives, as a first step toward a more comprehensive theory. The first perspective sees writing as a communicative act. The observation that to write is to communicate, though commonplace, has major and sometimes surprising implications for a theory of writing. It forces a focus on the…

  13. Adaptation Criteria for the Personalised Delivery of Learning Materials: A Multi-Stage Empirical Investigation

    ERIC Educational Resources Information Center

    Thalmann, Stefan

    2014-01-01

    Personalised e-Learning represents a major step-change from the one-size-fits-all approach of traditional learning platforms to a more customised and interactive provision of learning materials. Adaptive learning can support the learning process by tailoring learning materials to individual needs. However, this requires the initial preparation of…

  14. DNA strand displacement reaction for programmable release of biomolecules.

    PubMed

    Ramezani, Hamid; Jed Harrison, D

    2015-05-14

    Sample cleanup is a major processing step in many analytical assays. Here, we propose an approach to capture-and-release of analytes based on the DNA strand displacement reaction (SDR) and demonstrate its application to a fluoroimmunoassay on beads for a thyroid cancer biomarker, thyroglobulin. The SDR-based cleanup showed no interference from matrix molecules in serum.

  15. Ten steps for managing organizational change.

    PubMed

    Bolton, L B; Aydin, C; Popolow, G; Ramseyer, J

    1992-06-01

    Managing interdepartmental relations in healthcare organizations is a major challenge for nursing administrators. The authors describe the implementation process of an organization-wide change effort involving individuals from departments throughout the medical center. These strategies can serve as a model to guide effective planning in other institutions embarking on change projects, resulting in smoother and more effective implementation of interdepartmental change.

  16. Fabrication of Submillimeter Axisymmetric Optical Components

    NASA Technical Reports Server (NTRS)

    Grudinin, Ivan; Savchenkov, Anatoliy; Strekalov, Dmitry

    2007-01-01

    It is now possible to fashion transparent crystalline materials into axisymmetric optical components having diameters ranging from hundreds down to tens of micrometers, whereas previously, the smallest attainable diameter was 500 micrometers. A major step in the fabrication process that makes this possible can be characterized as diamond turning or computer numerically controlled machining on an ultrahigh-precision lathe.

  17. Classification and mensuration of LACIE segments

    NASA Technical Reports Server (NTRS)

    Heydorn, R. P.; Bizzell, R. M.; Quirein, J. A.; Abotteen, K. M.; Sumner, C. A. (Principal Investigator)

    1979-01-01

    The theory of classification methods and the functional steps in the manual training process used in the three phases of LACIE are discussed. The major problems that arose in using a procedure for manually training a classifier and a method of machine classification are discussed to reveal the motivation that led to a redesign for the third LACIE phase.

  18. Oligosaccharide formation during commercial pear juice processing.

    PubMed

    Willems, Jamie L; Low, Nicholas H

    2016-08-01

    The effect of enzyme treatment and processing on the oligosaccharide profile of commercial pear juice samples was examined by high performance anion exchange chromatography with pulsed amperometric detection and capillary gas chromatography with flame ionization detection. Industrial samples representing the major stages of processing produced with various commercial enzyme preparations were studied. Through the use of commercially available standards and laboratory scale enzymatic hydrolysis of pectin, starch and xyloglucan; galacturonic acid oligomers, glucose oligomers (e.g., maltose and cellotriose) and isoprimeverose were identified as being formed during pear juice production. It was found that the majority of polysaccharide hydrolysis and oligosaccharide formation occurred during enzymatic treatment at the pear mashing stage and that the remaining processing steps had minimal impact on the carbohydrate-based chromatographic profile of pear juice. Also, all commercial enzyme preparations and conditions (time and temperature) studied produced similar carbohydrate-based chromatographic profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Defense Intelligence: Additional Steps Could Better Integrate Intelligence Input into DODs Acquisition of Major Weapon Systems

    DTIC Science & Technology

    2016-11-01

    United States Government Accountability Office, Highlights of GAO-17-10, a report to congressional committees, November 2016. DEFENSE INTELLIGENCE: Additional Steps Could Better Integrate Intelligence Input into DOD's Acquisition of Major Weapon Systems. What GAO Found: The Department of Defense (DOD…

  20. Biophysical Interactions within Step-Pool Mountain Streams Following Wildfire

    NASA Astrophysics Data System (ADS)

    Parker, A.; Chin, A.; O'Dowd, A. P.

    2014-12-01

    Recovery of riverine ecosystems following disturbance is driven by a variety of interacting processes. Wildfires pose increasing disturbances to riverine landscapes, with rising frequencies and magnitudes owing to warming climates and increased fuel loads. The effects of wildfire include loss of vegetation, elevated runoff and flash floods, erosion and deposition, and changing biological habitats and communities. Understanding process interactions in post-fire landscapes is increasingly urgent for successful management and restoration of affected ecosystems. In steep channels, steps and pools provide prominent habitats for organisms and structural integrity in high energy environments. Step-pools are typically stable, responding to extreme events with recurrence intervals often exceeding 50 years. Once wildfire occurs, however, intensification of post-fire flood events can potentially overpower the inherent stability of these systems, with significant consequences for aquatic life and human well-being downstream. This study examined the short-term response of step-pool streams following the 2012 Waldo Canyon Fire in Colorado. We explored interacting feedbacks among geomorphology, hydrology, and ecology in the post-fire environment. At selected sites with varying burn severity, we established baseline conditions immediately after the fire with channel surveys, biological assessment using benthic macroinvertebrates, sediment analysis including pebble counts, and precipitation gauging. Repeat measurements after major storm events over several years enabled analysis of the interacting feedbacks among post-fire processes. We found that channels able to retain the step-pool structure changed less and facilitated recovery more readily. Step habitats maintained higher percentages of sensitive macroinvertebrate taxa compared to pools through post-fire floods. Sites burned with high severity experienced greater reduction in the percentage of sensitive taxa. The decimation of macroinvertebrates closely coincides with the physical destruction of the step-pool morphology. The role that step-pools play in enhancing the ecological quality of fluvial systems, therefore, provides a key focus for effective management and restoration of aquatic resources following wildfires.

  1. RNA editing in nascent RNA affects pre-mRNA splicing

    PubMed Central

    Hsiao, Yun-Hua Esther; Bahn, Jae Hoon; Yang, Yun; Lin, Xianzhi; Tran, Stephen; Yang, Ei-Wen; Quinones-Valdez, Giovanni

    2018-01-01

    In eukaryotes, nascent RNA transcripts undergo an intricate series of RNA processing steps to achieve mRNA maturation. RNA editing and alternative splicing are two major RNA processing steps that can introduce significant modifications to the final gene products. By tackling these processes in isolation, recent studies have enabled substantial progress in understanding their global RNA targets and regulatory pathways. However, the interplay between individual steps of RNA processing, an essential aspect of gene regulation, remains poorly understood. By sequencing the RNA of different subcellular fractions, we examined the timing of adenosine-to-inosine (A-to-I) RNA editing and its impact on alternative splicing. We observed that >95% A-to-I RNA editing events occurred in the chromatin-associated RNA prior to polyadenylation. We report about 500 editing sites in the 3′ acceptor sequences that can alter splicing of the associated exons. These exons are highly conserved during evolution and reside in genes with important cellular function. Furthermore, we identified a second class of exons whose splicing is likely modulated by RNA secondary structures that are recognized by the RNA editing machinery. The genome-wide analyses, supported by experimental validations, revealed remarkable interplay between RNA editing and splicing and expanded the repertoire of functional RNA editing sites. PMID:29724793

  2. RNA editing in nascent RNA affects pre-mRNA splicing.

    PubMed

    Hsiao, Yun-Hua Esther; Bahn, Jae Hoon; Yang, Yun; Lin, Xianzhi; Tran, Stephen; Yang, Ei-Wen; Quinones-Valdez, Giovanni; Xiao, Xinshu

    2018-06-01

    In eukaryotes, nascent RNA transcripts undergo an intricate series of RNA processing steps to achieve mRNA maturation. RNA editing and alternative splicing are two major RNA processing steps that can introduce significant modifications to the final gene products. By tackling these processes in isolation, recent studies have enabled substantial progress in understanding their global RNA targets and regulatory pathways. However, the interplay between individual steps of RNA processing, an essential aspect of gene regulation, remains poorly understood. By sequencing the RNA of different subcellular fractions, we examined the timing of adenosine-to-inosine (A-to-I) RNA editing and its impact on alternative splicing. We observed that >95% A-to-I RNA editing events occurred in the chromatin-associated RNA prior to polyadenylation. We report about 500 editing sites in the 3' acceptor sequences that can alter splicing of the associated exons. These exons are highly conserved during evolution and reside in genes with important cellular function. Furthermore, we identified a second class of exons whose splicing is likely modulated by RNA secondary structures that are recognized by the RNA editing machinery. The genome-wide analyses, supported by experimental validations, revealed remarkable interplay between RNA editing and splicing and expanded the repertoire of functional RNA editing sites. © 2018 Hsiao et al.; Published by Cold Spring Harbor Laboratory Press.

  3. Effect of a Quality Improvement Intervention on Clinical Outcomes in Patients in India With Acute Myocardial Infarction: The ACS QUIK Randomized Clinical Trial.

    PubMed

    Huffman, Mark D; Mohanan, Padinhare P; Devarajan, Raji; Baldridge, Abigail S; Kondal, Dimple; Zhao, Lihui; Ali, Mumtaj; Krishnan, Mangalath N; Natesan, Syam; Gopinath, Rajesh; Viswanathan, Sunitha; Stigi, Joseph; Joseph, Johny; Chozhakkat, Somanathan; Lloyd-Jones, Donald M; Prabhakaran, Dorairaj

    2018-02-13

    Wide heterogeneity exists in acute myocardial infarction treatment and outcomes in India. To evaluate the effect of a locally adapted quality improvement tool kit on clinical outcomes and process measures in Kerala, a southern Indian state. Cluster randomized, stepped-wedge clinical trial conducted between November 10, 2014, and November 9, 2016, in 63 hospitals in Kerala, India, with a last date of follow-up of December 31, 2016. During 5 predefined steps over the study period, hospitals were randomly selected to move in a 1-way crossover from the control group to the intervention group. Consecutively presenting patients with acute myocardial infarction were offered participation. Hospitals provided either usual care (control group; n = 10 066 participants [step 0: n = 2915; step 1: n = 2649; step 2: n = 2251; step 3: n = 1422; step 4: n = 829; step 5: n = 0]) or care using a quality improvement tool kit (intervention group; n = 11 308 participants [step 0: n = 0; step 1: n = 662; step 2: n = 1265; step 3: n = 2432; step 4: n = 3214; step 5: n = 3735]) that consisted of audit and feedback, checklists, patient education materials, and linkage to emergency cardiovascular care and quality improvement training. The primary outcome was the composite of all-cause death, reinfarction, stroke, or major bleeding using standardized definitions at 30 days. Secondary outcomes included the primary outcome's individual components, 30-day cardiovascular death, medication use, and tobacco cessation counseling. Mixed-effects logistic regression models were used to account for clustering and temporal trends. Among 21 374 eligible randomized participants (mean age, 60.6 [SD, 12.0] years; n = 16 183 men [76%]; n = 13 689 [64%] with ST-segment elevation myocardial infarction), 21 079 (99%) completed the trial. The primary composite outcome was observed in 5.3% of the intervention participants and 6.4% of the control participants. The observed difference in 30-day major adverse cardiovascular event rates between the groups was not statistically significant after adjustment (adjusted risk difference, -0.09% [95% CI, -1.32% to 1.14%]; adjusted odds ratio, 0.98 [95% CI, 0.80-1.21]). The intervention group had a higher rate of medication use including reperfusion but no effect on tobacco cessation counseling. There were no unexpected adverse events reported. Among patients with acute myocardial infarction in Kerala, India, use of a quality improvement intervention compared with usual care did not decrease a composite of 30-day major adverse cardiovascular events. Further research is needed to understand the lack of efficacy. clinicaltrials.gov Identifier: NCT02256657.
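
    The unadjusted contrast reported above (5.3% vs 6.4% for the 30-day composite) can be reproduced as a simple risk difference and odds ratio. The Python sketch below does only that arithmetic; the trial's adjusted estimates additionally account for clustering and secular trends with mixed-effects models, which this sketch does not attempt.

        def risk_difference_and_or(p_intervention, p_control):
            # Unadjusted risk difference (percentage points) and odds ratio
            rd = (p_intervention - p_control) * 100
            odds_ratio = (p_intervention / (1 - p_intervention)) / (p_control / (1 - p_control))
            return rd, odds_ratio

        rd, unadj_or = risk_difference_and_or(0.053, 0.064)
        print(f"risk difference = {rd:.1f} percentage points, unadjusted OR = {unadj_or:.2f}")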

  4. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  5. Quality control in the development of coagulation factor concentrates.

    PubMed

    Snape, T J

    1987-01-01

    Limitation of process change is a major factor contributing to assurance of quality in pharmaceutical manufacturing. This is particularly true in the manufacture of coagulation factor concentrates, for which presumptive testing for poorly defined product characteristics is an integral feature of finished product quality control. The development of new or modified preparations requires that this comfortable position be abandoned, and that the effect on finished product characteristics of changes to individual process steps (and components) be assessed. The degree of confidence in the safety and efficacy of the new product will be determined by, amongst other things, the complexity of the process alteration and the extent to which the results of finished product tests can be considered predictive. The introduction of a heat-treatment step for inactivation of potential viral contaminants in coagulation factor concentrates presents a significant challenge in both respects, quite independent of any consideration of assessment of the effectiveness of the viral inactivation step. These interactions are illustrated by some of the problems encountered with terminal dry heat-treatment (72 h. at 80 degrees C) of factor VIII and prothrombin complex concentrates manufactured by the Blood Products Laboratory.

  6. Mobilization of major inorganic ions during experimental diagenesis of characterized peats

    USGS Publications Warehouse

    Bailey, A.M.; Cohen, A.D.; Orem, W.H.; Blackson, J.H.

    2000-01-01

    Laboratory experiments were undertaken to study changes in concentrations of major inorganic ions during simulated burial of peats to about 1.5 km. Cladium, Rhizophora, and Cyrilla peats were first analyzed to determine cation distributions among fractions of the initial materials and minerals in residues from wet oxidation. Subsamples of the peats (80 g) were then subjected to increasing temperatures and pressures in steps of 5 °C and 300 psi at 2-day intervals, and the produced solutions were collected. After six steps, starting from 30 °C and 300 psi, a final temperature of 60 °C and a final pressure of 2100 psi were achieved. The system was then allowed to stand for an additional 2 weeks at 60 °C and 2100 psi. Treatments resulted in highly altered organic solids resembling lignite and expelled solutions of systematically varying compositions. Solutions from each step were analyzed for Na+, Ca2+, Mg2+, total dissolved Si (Si(T)), Cl-, SO42-, and organic acids and anions (OAAs). Some data on total dissolved Al (Al(T)) were also collected. Mobilization of major ions from peats during these experiments is controlled by at least three processes: (1) loss of dissolved ions in original porewater expelled during compaction, (2) loss of adsorbed cations as adsorption sites are lost during modification of organic solids, and (3) increased dissolution of inorganic phases at later steps due to increased temperatures (Si(T)) and increased complexing by OAAs (Al(T)). In general, results provide insight into early post-burial inorganic changes occurring during maturation of terrestrial organic matter. (C) 2000 Elsevier Science B.V. All rights reserved.
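
    The heating and pressurization schedule described above (a start at 30 °C and 300 psi, six increments of 5 °C and 300 psi at 2-day intervals, ending at 60 °C and 2100 psi) can be generated programmatically; the following Python sketch simply reproduces that ramp, with the default values taken from the abstract.

        def ramp_schedule(t_start=30, p_start=300, dt=5, dp=300, steps=6, interval_days=2):
            # Each entry: (elapsed days, temperature in deg C, pressure in psi)
            schedule = [(0, t_start, p_start)]
            for i in range(1, steps + 1):
                schedule.append((i * interval_days, t_start + i * dt, p_start + i * dp))
            return schedule

        for day, temp, press in ramp_schedule():
            print(f"day {day:2d}: {temp} degC, {press} psi")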

  7. Prospects for inhibiting the post-transcriptional regulation of gene expression in hepatitis B virus

    PubMed Central

    Chen, Augustine; Panjaworayan T-Thienprasert, Nattanan; Brown, Chris M

    2014-01-01

    There is a continuing need for novel antivirals to treat hepatitis B virus (HBV) infection, as it remains a major health problem worldwide. Ideally new classes of antivirals would target multiple steps in the viral lifecycle. In this review, we consider the steps in which HBV RNAs are processed, exported from the nucleus and translated. These are often overlooked steps in the HBV life-cycle. HBV, like retroviruses, incorporates a number of unusual steps in these processes, which use a combination of viral and host cellular machinery. Some of these unusual steps deserve a closer scrutiny. They may provide alternative targets to existing antiviral therapies, which are associated with increasing drug resistance. The RNA post-transcriptional regulatory element identified 20 years ago promotes nucleocytoplasmic export of all unspliced HBV RNAs. There is evidence that inhibition of this step is part of the antiviral action of interferon. Similarly, the structured RNA epsilon element situated at the 5’ end of the polycistronic HBV pregenomic RNA also performs key roles during HBV replication. The pregenomic RNA, which is the template for translation of both the viral core and polymerase proteins, is also encapsidated and used in replication. This complex process, regulated at the epsilon element, also presents an attractive antiviral target. These RNA elements that mediate and regulate gene expression are highly conserved and could be targeted using novel strategies employing RNAi, miRNAs or aptamers. Such approaches targeting these functionally constrained genomic regions should avoid escape mutations. Therefore understanding these regulatory elements, along with providing potential targets, may also facilitate the development of other new classes of antiviral drugs. PMID:25009369

  8. An approach to improving transporting velocity in the long-range ultrasonic transportation of micro-particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Jianxin; Mei, Deqing, E-mail: meidq-127@zju.edu.cn; Yang, Keji

    2014-08-14

    In existing ultrasonic transportation methods, the long-range transportation of micro-particles is always realized in a step-by-step way. Due to the substantial decrease of the driving force in each step, the transportation is low-speed and stair-stepping. To improve the transporting velocity, a non-stepping ultrasonic transportation approach is proposed. By quantitatively analyzing the acoustic potential well, an optimal region is defined as the position where the largest driving force is provided under the condition that the driving force is simultaneously the major component of the acoustic radiation force. To keep the micro-particle trapped in the optimal region during the whole transportation process, an approach of optimizing the phase-shifting velocity and phase-shifting step is adopted. Due to the stable and large driving force, the displacement of the micro-particle is an approximately linear function of time, instead of a stair-stepping function of time as in the existing step-by-step methods. An experimental setup is also developed to validate this approach. Long-range ultrasonic transportations of zirconium beads with high transporting velocity were realized. The experimental results demonstrated that this approach is an effective way to improve transporting velocity in the long-range ultrasonic transportation of micro-particles.

  9. Comparative TEA for Indirect Liquefaction Pathways to Distillate-Range Fuels via Oxygenated Intermediates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Eric; Snowden-Swan, Lesley J.; Talmadge, Michael

    This paper presents a comparative techno-economic analysis of five conversion pathways from biomass to gasoline-, jet-, and diesel-range hydrocarbons via indirect liquefaction with specific focus on pathways utilizing oxygenated intermediates (derived either via thermochemical or biochemical conversion steps). The four emerging pathways of interest are compared with one conventional pathway (Fischer-Tropsch) for the production of the hydrocarbon blendstocks. The processing steps of the four emerging pathways include: biomass-to-syngas via indirect gasification, gas cleanup, conversion of syngas to alcohols/oxygenates, followed by conversion of alcohols/oxygenates to hydrocarbon blendstocks via dehydration, oligomerization, and hydrogenation. We show that the emerging pathways via oxygenated intermediates have the potential to be cost competitive with the conventional Fischer-Tropsch process. The evaluated pathways and the benchmark process generally exhibit similar fuel yields and carbon conversion efficiencies. The resulting minimum fuel selling prices are comparable to the benchmark at approximately $3.60 per gallon-gasoline equivalent, with potential for two new pathways to be more economically competitive. Additionally, the coproduct values can play an important role in the economics of the processes with oxygenated intermediates derived via syngas fermentation. Major cost drivers for the integrated processes are tied to achievable fuel yields and conversion efficiency of the intermediate steps, i.e., the production of oxygenates/alcohols from syngas and the conversion of oxygenates/alcohols to hydrocarbon fuels.

  10. Evaluation of target efficiencies for solid-liquid separation steps in biofuels production.

    PubMed

    Kochergin, Vadim; Miller, Keith

    2011-01-01

    Development of liquid biofuels has entered a new phase of large scale pilot demonstration. A number of plants that are in operation or under construction face the task of addressing the engineering challenges of creating a viable plant design, scaling up and optimizing various unit operations. It is well-known that separation technologies account for 50-70% of both capital and operating cost. Additionally, reduction of environmental impact creates technological challenges that increase project cost without adding to the bottom line. Different technologies vary in terms of selection of unit operations; however, solid-liquid separations are likely to be a major contributor to the overall project cost. Despite the differences in pretreatment approaches, similar challenges arise for solid-liquid separation unit operations. A typical process for ethanol production from biomass includes several solid-liquid separation steps, depending on which particular stream is targeted for downstream processing. The nature of biomass-derived materials makes it either difficult or uneconomical to accomplish complete separation in a single step. Therefore, setting realistic efficiency targets for solid-liquid separations is an important task that influences overall process recovery and economics. Experimental data will be presented showing typical characteristics for pretreated cane bagasse at various stages of processing into cellulosic ethanol. Results of generic material balance calculations will be presented to illustrate the influence of separation target efficiencies on overall process recoveries and characteristics of waste streams.
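
    The importance of setting per-step efficiency targets follows from a simple material balance: the overall recovery of a train of solid-liquid separation steps is the product of the individual step recoveries, so even modest per-step losses compound. The Python sketch below illustrates this with hypothetical step recoveries (the values are not from any specific process).

        from math import prod

        def overall_recovery(step_recoveries):
            # Fraction of the target product surviving a train of separation steps
            return prod(step_recoveries)

        # Hypothetical per-step recoveries for three solid-liquid separation steps
        steps = [0.95, 0.92, 0.90]
        print(f"overall recovery: {overall_recovery(steps):.1%}")   # about 78.7%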

  11. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; and (3) the development of a stream-classification tool and a hydrologic assessment tool. Four computer software tools have been developed.

  12. Freeform Optics: current challenges for future serial production

    NASA Astrophysics Data System (ADS)

    Schindler, C.; Köhler, T.; Roth, E.

    2017-10-01

    One of the major recent developments in the optics industry is the commercial manufacturing of freeform surfaces for mid- and high-performance optical systems. Removing the limitation of rotational symmetry enables completely new optical design solutions - but it also creates completely new challenges for the manufacturer. Adapting serial production from rotationally symmetric to freeform optics cannot be done just by extending machine capabilities and software for every process step. New solutions for conventional optics production, or completely new process chains, are necessary.

  13. Design of two-column batch-to-batch recirculation to enhance performance in ion-exchange chromatography.

    PubMed

    Persson, Oliver; Andersson, Niklas; Nilsson, Bernt

    2018-01-05

    Preparative liquid chromatography is a separation technique widely used in the manufacturing of fine chemicals and pharmaceuticals. A major drawback of the traditional single-column batch chromatography step is the trade-off between product purity and process performance. Recirculation of impure product can be utilized to make the trade-off more favorable. The aim of the present study was to investigate the usage of a two-column batch-to-batch recirculation process step to increase the performance compared to single-column batch chromatography at a high purity requirement. The separation of a ternary protein mixture on ion-exchange chromatography columns was used to evaluate the proposed process. The investigation used modelling and simulation of the process step, experimental validation and optimization of the simulated process. In the presented case the yield increases from 45.4% to 93.6% and the productivity increases 3.4 times compared to the performance of a batch run for a nominal case. A rapid build-up of product concentration can be seen during the first cycles, before the process reaches a cyclic steady state with recurring concentration profiles. The optimization of the simulation model predicts that the recirculated salt can be used as a flying start for the elution, which would enhance the process performance. The proposed process is more complex than a batch process, but may improve the separation performance, especially while operating at cyclic steady state. The recirculation of impure fractions reduces the product losses and ensures separation of product to a high degree of purity. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. How to implement information technology in the operating room and the intensive care unit.

    PubMed

    Meyfroidt, Geert

    2009-03-01

    The number of operating rooms and intensive care units looking for a data management system to perform their increasingly complex tasks is rising. Although at this time only a minority is computerized, within the next few years many centres will start implementing information technology. The transition towards a computerized system is a major venture, which will have a major impact on workflow. This chapter reviews the present literature. Published papers on this subject are predominantly single- or multi-centre implementation reports. The general principles that should guide such a process are described. For healthcare institutions or individual practitioners that plan to undertake this venture, the implementation process is described in a practical, nine-step overview.

  15. Planning Library Interiors: The Selection of Furnishings for the 21st Century. Revised Edition.

    ERIC Educational Resources Information Center

    Brown, Carol R.

    For many librarians, the selection of furniture for a new building or the major refurbishing of an existing facility is a once-in-a-lifetime task. The furniture selection process now involves much more than considering what products are available. The acquisition of furniture and shelving includes the following steps: look at the existing library…

  16. Securing Funding in Rural Programs for Young Handicapped Children. Making It Work in Rural Communities. A Rural Network Monograph.

    ERIC Educational Resources Information Center

    Garland, Corinne Welt, Comp.

    The problem of securing funds to support programs for the young handicapped child is a major one for rural service providers. The process of securing funds from within the rural community itself should include nine steps: (1) defining the needy; (2) determining responsibility; (3) identifying resources; (4) considering the message; (5) choosing…

  17. Scaling environmental change through the community level: a trait-based response-and-effect framework for plants

    Treesearch

    Katharine N. Suding; Sandra Lavorel; F. Stuart Chapin; Johannes H.C. Cornelissen; Sandra Diaz; Eric Garnier; Deborah Goldberg; David U. Hooper; Stephen T. Jackson; Marie-Laure Navas

    2008-01-01

    Predicting ecosystem responses to global change is a major challenge in ecology. A critical step in that challenge is to understand how changing environmental conditions influence processes across levels of ecological organization. While direct scaling from individual to ecosystem dynamics can lead to robust and mechanistic predictions, new approaches are needed to...

  18. From Red Tape to Results: Creating a Government That Works Better & Costs Less. Report of the National Performance Review.

    ERIC Educational Resources Information Center

    Gore, Al

    This monograph presents results of a 6-month study of the federal government and the Clinton Administration's proposal for a decade-long process of re-inventing the federal government's operations. Each of four major principles are presented in a chapter organized around specific steps towards its implementation. These principles are: cutting red…

  19. Evaluating and selecting an information system, Part 1.

    PubMed

    Neal, T

    1993-01-01

    Initial steps in the process of evaluating and selecting a computerized information system for the pharmacy department are described. The first step in the selection process is to establish a steering committee and a project committee. The steering committee oversees the project, providing policy guidance, making major decisions, and allocating budgeted expenditures. The project committee conducts the departmental needs assessment, identifies system requirements, performs day-to-day functions, evaluates vendor proposals, trains personnel, and implements the system chosen. The second step is the assessment of needs in terms of personnel, workload, physical layout, and operating requirements. The needs assessment should be based on the department's mission statement and strategic plan. The third step is the development of a request for information (RFI) and a request for proposal (RFP). The RFI is a document designed for gathering preliminary information from a wide range of vendors; this general information is used in deciding whether to send the RFP to a given vendor. The RFP requests more detailed information and gives the purchaser's exact specifications for a system; the RFP also includes contractual information. To help ensure project success, many institutions turn to computer consultants for guidance. The initial steps in selecting a computerized pharmacy information system are establishing computerization committees, conducting a needs assessment, and writing an RFI and an RFP. A crucial early decision is whether to seek a consultant's expertise.

  20. Microengineering of Metals and Ceramics: Part I: Design, Tooling and Injection Molding; Volume 3: Advanced Micro & Nanosystems

    NASA Astrophysics Data System (ADS)

    Baltes, Henry; Brand, Oliver; Fedder, Gary K.; Hierold, Christofer; Korvink, Jan G.; Tabata, Osamu; Löhe, Detlef; Haußelt, Jürgen

    2005-09-01

    Microstructures, electronics, nanotechnology - these vast fields of research are growing together as the size gap narrows and many different materials are combined. Current research, engineering successes and newly commercialized products hint at the immense innovative potential and future applications that open up once mankind controls shape and function from the atomic level right up to the visible world without any gaps. In this volume, authors from three major competence centres for microengineering illustrate step by step the process from designing and simulating microcomponents of metallic and ceramic materials to replicating micro-scale components by injection molding.

  1. Cold crucible Czochralski for solar cells

    NASA Technical Reports Server (NTRS)

    Trumble, T. M.

    1982-01-01

    The efficiency and radiation resistance of present silicon solar cells are a function of the oxygen and carbon impurities and the boron doping used to provide the proper resistivity material. The standard Czochralski process used to grow single-crystal silicon contaminates the silicon stock material due to the use of a quartz crucible and graphite components. The use of a process which replaces these elements with a water-cooled copper crucible has provided a major step toward providing gallium-doped, (100)-oriented, low-oxygen, low-carbon silicon. A discussion of the Cold Crucible Czochralski process and recent float-zone developments is provided.

  2. Cold crucible Czochralski for solar cells

    NASA Astrophysics Data System (ADS)

    Trumble, T. M.

    The efficiency and radiation resistance of present silicon solar cells are a function of the oxygen and carbon impurities and the boron doping used to provide the proper resistivity material. The standard Czochralski process used to grow single-crystal silicon contaminates the silicon stock material due to the use of a quartz crucible and graphite components. The use of a process which replaces these elements with a water-cooled copper crucible has provided a major step in producing gallium-doped, (100)-oriented, low-oxygen, low-carbon silicon. A discussion of the Cold Crucible Czochralski process and recent float-zone developments is provided.

  3. Using lean methodology to improve productivity in a hospital oncology pharmacy.

    PubMed

    Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D

    2014-09-01

    Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  4. Resolving the infection process reveals striking differences in the contribution of environment, genetics and phylogeny to host-parasite interactions

    PubMed Central

    2011-01-01

    Background Infection processes consist of a sequence of steps, each critical for the interaction between host and parasite. Studies of host-parasite interactions rarely take into account the fact that different steps might be influenced by different factors and might, therefore, make different contributions to shaping coevolution. We designed a new method using the Daphnia magna - Pasteuria ramosa system, one of the rare examples where coevolution has been documented, in order to resolve the steps of the infection and analyse the factors that influence each of them. Results Using the transparent Daphnia hosts and fluorescently-labelled spores of the bacterium P. ramosa, we identified a sequence of infection steps: encounter between parasite and host; activation of parasite dormant spores; attachment of spores to the host; and parasite proliferation inside the host. The chances of encounter had been shown to depend on host genotype and environment. We tested the role of genetic and environmental factors in the newly described activation and attachment steps. Hosts of different genotypes, gender and species were all able to activate endospores of all parasite clones tested in different environments, suggesting that the activation cue is phylogenetically conserved. We next established that parasite attachment occurs onto the host oesophagus independently of host species, gender and environmental conditions. In contrast to spore activation, attachment depended strongly on the combination of host and parasite genotypes. Conclusions Our results show that different steps are influenced by different factors. Host-type-independent spore activation suggests that this step can be ruled out as a major factor in Daphnia-Pasteuria coevolution. On the other hand, we show that the attachment step is crucial for the pronounced genetic specificities of this system. We suggest that this one step can explain host population structure and could be a key force behind coevolutionary cycles. We discuss how different steps can explain different aspects of the coevolutionary dynamics of the system: the properties of the attachment step explaining the rapid evolution of infectivity, and the properties of later parasite proliferation explaining the evolution of virulence. Our study underlines the importance of resolving the infection process in order to better understand host-parasite interactions. PMID:21342515

  5. The function of advanced treatment process in a drinking water treatment plant with organic matter-polluted source water.

    PubMed

    Lin, Huirong; Zhang, Shuting; Zhang, Shenghua; Lin, Wenfang; Yu, Xin

    2017-04-01

    To understand the relationship between chemical and microbial treatment at each treatment step, as well as the relationship between microbial community structure in biofilms in biofilters and their ecological functions, a drinking water plant with severely organic matter-polluted source water was investigated. The bacterial community dynamics of two drinking water supply systems (traditional and advanced treatment processes) in this plant were studied from the source to the product water. Analysis by 454 pyrosequencing was conducted to characterize the bacterial diversity in each step of the treatment processes. The bacterial communities in these two treatment processes were highly diverse. Proteobacteria, which mainly consisted of beta-proteobacteria, was the dominant phylum. The two treatment processes used in the plant could effectively remove organic pollutants and microbial pollution, especially the advanced treatment process. Significant differences in the detection of the major groups were observed in the product water samples in the treatment processes. The treatment processes, particularly the biological pretreatment and O3-biological activated carbon in the advanced treatment process, highly influenced the microbial community composition and the water quality. Some opportunistic pathogens were found in the water. Nitrogen-related microorganisms found in the biofilm of filters may have an important influence on the microbial community composition and water quality improvement.

  6. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
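
    A minimal sketch of how such a risk scoring could be computed, assuming R is simply the product of the P and C scores and that the limits on R and D that trigger a more detailed analysis are chosen locally; the class, field names, example scores, and limits below are illustrative, not those of the paper.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str       # process step under investigation
    failure: str    # predefined failure type
    p: int          # Probability score, 1-10
    c: int          # Consequence score, 1-10
    d: int          # chance of Detecting the failure, 1-10 (high = hard to detect)

    @property
    def r(self) -> int:
        # Overall Risk score derived from P and C; the product used here is an assumption.
        return self.p * self.c

def needs_detailed_analysis(fm: FailureMode, r_limit: int = 25, d_limit: int = 7) -> bool:
    """Flag failure modes whose Risk or Detection score exceeds locally chosen limits."""
    return fm.r >= r_limit or fm.d >= d_limit

modes = [
    FailureMode("stat analysis", "delayed processing or analysis", p=6, c=7, d=4),
    FailureMode("sample reception", "identification error", p=3, c=9, d=8),
]
for fm in modes:
    print(f"{fm.step:18s} {fm.failure:32s} R={fm.r:3d} follow-up={needs_detailed_analysis(fm)}")
```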

  7. [ASSESSMENT OF EXTREME FACTORS OF SHIFT WORK IN ARCTIC CONDITIONS BY WORKERS WITH DIFFERENT REGULATORY PROCESSES].

    PubMed

    Korneeva, Ya A; Simonova, N N

    2016-01-01

    A man working on a shift basis in the Arctic is exposed every day to various extreme factors that are inevitable in the oil and gas industry. To adapt to shift work, employees draw on various individual resources. The purpose of the research is to determine the personal resources shift workers use to overcome the adverse environmental factors of the Arctic. The study involved 191 builders of main gas pipelines working in shifts in the Tyumen region (shift length of 52 days), aged 23 to 59 (mean age 34.9 ± 8.1) years. Methods: psychological testing, questionnaires, observation, descriptive statistics, and stepwise discriminant analysis. A correlation was revealed between the subjective assessment of the majority of adverse climatic factors and the regulatory process "assessment of results"; production factors were associated with regulatory processes such as flexibility, autonomy, simulation, and the general level of self-regulation; social factors were more strongly associated with the severity of the regulatory processes flexibility and evaluation of results.

  8. Low cost hydrogen/novel membrane technology for hydrogen separation from synthesis gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-02-01

    To make the coal-to-hydrogen route economically attractive, improvements are being sought in each step of the process: coal gasification, water-carbon monoxide shift reaction, and hydrogen separation. This report addresses the use of membranes in the hydrogen separation step. The separation of hydrogen from synthesis gas is a major cost element in the manufacture of hydrogen from coal. Separation by membranes is an attractive, new, and still largely unexplored approach to the problem. Membrane processes are inherently simple and efficient and often have lower capital and operating costs than conventional processes. In this report current and future trends in hydrogen production and use are first summarized. Methods of producing hydrogen from coal are then discussed, with particular emphasis on the Texaco entrained flow gasifier and on current methods of separating hydrogen from this gas stream. The potential for membrane separations in the process is then examined. In particular, the use of membranes for H2/CO2, H2/CO, and H2/N2 separations is discussed. 43 refs., 14 figs., 6 tabs.

  9. Effect of phase errors in stepped-frequency radar systems

    NASA Astrophysics Data System (ADS)

    Vanbrundt, H. E.

    1988-04-01

    Stepped-frequency waveforms are being considered for inverse synthetic aperture radar (ISAR) imaging from ship and airborne platforms and for detailed radar cross section (RCS) measurements of ships and aircraft. These waveforms make it possible to achieve resolutions of 1.0 foot by using existing radar designs and processing technology. One problem not yet fully resolved in using stepped-frequency waveforms for ISAR imaging is the deterioration in signal level caused by random frequency error. Random frequency error of the stepped-frequency source results in reduced peak responses and increased null responses. The resulting reduced signal-to-noise ratio is range dependent. Two of the major concerns addressed in this report are radar range limitations for ISAR and the error in calibration for RCS measurements caused by differences in range between a passive reflector used for an RCS reference and the target to be measured. In addressing these concerns, NOSC developed an analysis to assess the tolerable frequency error in terms of the resulting loss in signal power and in signal-to-phase-noise ratio.
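
    As an illustration of why random per-step phase error degrades the synthesised range profile, the following Monte Carlo sketch compares the profile peak of an ideal point target with one whose stepped-frequency samples carry zero-mean Gaussian phase jitter. The parameters are illustrative and unrelated to the report's specific radar or its analysis.

```python
import numpy as np

def peak_loss_db(n_steps=256, sigma_phase_rad=0.3, trials=200, seed=0):
    """Average reduction of the range-profile peak caused by per-step phase jitter."""
    rng = np.random.default_rng(seed)
    ideal_peak = n_steps  # |IFFT| peak times n_steps for a flat, error-free spectrum
    losses = []
    for _ in range(trials):
        phases = rng.normal(0.0, sigma_phase_rad, n_steps)
        spectrum = np.exp(1j * phases)                  # point target, phase jitter only
        profile = np.abs(np.fft.ifft(spectrum)) * n_steps
        losses.append((profile.max() / ideal_peak) ** 2)
    return 10.0 * np.log10(np.mean(losses))

print(peak_loss_db())   # roughly -0.4 dB of peak power for sigma = 0.3 rad
```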

  10. The Radio Play's the Thing: Teaching Text and Performance through Soundscripting.

    ERIC Educational Resources Information Center

    Young, Michael W.

    The technique of Soundscripting, the addition of sound cues and sound effects to the canonical pages of any play, is flexible enough to be done at no cost or with all the advantages of modern media. In class, the use of Shakespearean radio dramas or comedies can be effective. The long-term process in class involves four major steps and may take…

  11. Archaeological Study of CA-VEN-110, Ventura, California.

    DTIC Science & Technology

    1986-01-01

    technoeconomic studies of artifacts, on the other. The recovery, analysis, and interpretation of the data to be sought will constitute a major step toward...associated with plant processing will be immediately overwrapped and removed promptly from the field for technical analysis. The laboratory supervisor...Places. Preservation is recommended, with mitigation by data recovery and further analysis needed if total conservation is not possible.

  12. Crew Skills and Training

    NASA Technical Reports Server (NTRS)

    Jones, Thomas; Burbank, Daniel C.; Eppler, Dean; Garrison, Robert; Harvey, Ralph; Hoffman, Paul; Schmitt, Harrison

    1998-01-01

    One of the major focus points for the workshop was the topic of crew skills and training necessary for the Mars surface mission. Discussions centered on the mix of scientific skills necessary to accomplish the proposed scientific goals, and the training environment that can bring the ground and flight teams to readiness. Subsequent discussion resulted in recommendations for specific steps to begin the process of training an experienced Mars exploration team.

  13. Direct conversion of algal biomass to biofuel

    DOEpatents

    Deng, Shuguang; Patil, Prafulla D; Gude, Veera Gnaneswar

    2014-10-14

    A method and system for providing direct conversion of algal biomass. Optionally, the method and system can be used to directly convert dry algal biomass to biodiesels under microwave irradiation by combining the reaction steps. Alternatively, wet algae can be directly processed and converted to fatty acid methyl esters, which are the major components of biodiesel, by reacting with methanol at predetermined pressure and temperature ranges.

  14. Cost Effective Analysis of New Markets: First Steps of Enrollment Management for Nursing and Allied Health Programs. AIR 1997 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Coyne, Thomas J.; Nordone, Ronald; Donovan, Joseph W.; Thygeson, William

    This paper describes the initial analyses needed to help institutions of higher education plan majors in nursing and allied health as institutions look for new markets based on demographic and employment factors. Twelve variables were identified and weighted to describe an ideal recruitment market. Using a three-phase process, potential U.S.…

  15. A marker-free system for the analysis of movement disabilities.

    PubMed

    Legrand, L; Marzani, F; Dusserre, L

    1998-01-01

    A major step toward improving the treatment of disabled persons may be achieved by using motion analysis equipment. We are developing such a system. It allows the analysis of planar human motion (e.g. gait) without requiring the tracking of markers. The system is composed of one fixed camera which acquires an image sequence of a human in motion. The processing is then divided into two steps: first, a large number of pixels belonging to the boundaries of the human body are extracted at each acquisition time. Second, a two-dimensional model of the human body, based on tapered superquadrics, is successively matched with the sets of pixels previously extracted; a specific fuzzy clustering process is used for this purpose. Moreover, an optical flow procedure gives a prediction of the model location at each acquisition time from its location at the previous time. Finally, we present some results of this process applied to a leg in motion.

  16. Counter-propagation network with variable degree variable step size LMS for single switch typing recognition.

    PubMed

    Yang, Cheng-Huei; Luo, Ching-Hsing; Yang, Cheng-Hong; Chuang, Li-Yeh

    2004-01-01

    Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, including mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as a communication adaptive device for disabled persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool. This restriction is a major hindrance. Therefore, a switch adaptive automatic recognition method with a high recognition rate is needed. The proposed system combines counter-propagation networks with a variable degree variable step size LMS algorithm. It is divided into five stages: space recognition, tone recognition, learning process, adaptive processing, and character recognition. Statistical analyses demonstrated that the proposed method elicited a better recognition rate in comparison to alternative methods in the literature.
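
    The adaptive part of such a recognizer has to track a typing rate that drifts over time. The sketch below uses a generic variable step-size, LMS-style update to follow a user's "dot" duration, with the step size growing when the recent error energy is high; it illustrates the idea only and is not the variable degree variable step size algorithm of the paper, and all parameter values are assumptions.

```python
import numpy as np

def track_dot_length(durations, base_mu=0.05, alpha=0.9):
    """Follow a drifting Morse 'dot' duration with a variable step-size LMS-style update.

    The step size grows with the recent error energy, so the estimate reacts quickly to
    sudden typing-rate changes and settles when typing is stable.
    """
    est = float(durations[0])     # initial guess for the dot length
    err_energy = 0.0
    history = []
    for d in durations:
        # treat long elements (roughly 3 dots) as dashes before computing the error
        unit = d if d < 2.0 * est else d / 3.0
        err = unit - est
        err_energy = alpha * err_energy + (1.0 - alpha) * err ** 2
        mu = base_mu * (1.0 + err_energy / (est ** 2 + 1e-9))   # variable step size
        est += mu * err
        history.append(est)
    return np.array(history)

# a user who suddenly slows down, then speeds up again
durations = np.concatenate([np.full(30, 0.10), np.full(30, 0.18), np.full(30, 0.12)])
print(track_dot_length(durations)[[29, 59, 89]])
```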

  17. Ubiquitin and Proteasomes in Transcription

    PubMed Central

    Geng, Fuqiang; Wenzel, Sabine; Tansey, William P.

    2013-01-01

    Regulation of gene transcription is vitally important for the maintenance of normal cellular homeostasis. Failure to correctly regulate gene expression, or to deal with problems that arise during the transcription process, can lead to cellular catastrophe and disease. One of the ways cells cope with the challenges of transcription is by making extensive use of the proteolytic and nonproteolytic activities of the ubiquitin-proteasome system (UPS). Here, we review recent evidence showing deep mechanistic connections between the transcription and ubiquitin-proteasome systems. Our goal is to leave the reader with a sense that just about every step in transcription—from transcription initiation through to export of mRNA from the nucleus—is influenced by the UPS and that all major arms of the system—from the first step in ubiquitin (Ub) conjugation through to the proteasome—are recruited into transcriptional processes to provide regulation, directionality, and deconstructive power. PMID:22404630

  18. Storytelling, behavior planning, and language evolution in context.

    PubMed

    McBride, Glen

    2014-01-01

    An attempt is made to specify the structure of the hominin bands that began steps to language. Storytelling could evolve without need for language yet be strongly subject to natural selection and could provide a major feedback process in evolving language. A storytelling model is examined, including its effects on the evolution of consciousness and the possible timing of language evolution. Behavior planning is presented as a model of language evolution from storytelling. The behavior programming mechanism, working in both directions, provides a model of creating and understanding behavior and language. Culture began with societies, then family evolution, family life in troops, but storytelling created a culture of experiences, a final step in the long process of achieving experienced adults by natural selection. Most language evolution occurred in conversations where evolving non-verbal feedback ensured mutual agreements on understanding. Natural language evolved in conversations with feedback providing understanding of changes.

  19. Storytelling, behavior planning, and language evolution in context

    PubMed Central

    McBride, Glen

    2014-01-01

    An attempt is made to specify the structure of the hominin bands that began steps to language. Storytelling could evolve without need for language yet be strongly subject to natural selection and could provide a major feedback process in evolving language. A storytelling model is examined, including its effects on the evolution of consciousness and the possible timing of language evolution. Behavior planning is presented as a model of language evolution from storytelling. The behavior programming mechanism, working in both directions, provides a model of creating and understanding behavior and language. Culture began with societies, then family evolution, family life in troops, but storytelling created a culture of experiences, a final step in the long process of achieving experienced adults by natural selection. Most language evolution occurred in conversations where evolving non-verbal feedback ensured mutual agreements on understanding. Natural language evolved in conversations with feedback providing understanding of changes. PMID:25360123

  20. Using Lean Process Improvement to Enhance Safety and Value in Orthopaedic Surgery: The Case of Spine Surgery.

    PubMed

    Sethi, Rajiv; Yanamadala, Vijay; Burton, Douglas C; Bess, Robert Shay

    2017-11-01

    Lean methodology was developed in the manufacturing industry to increase output and decrease costs. These labor organization methods have become the mainstay of major manufacturing companies worldwide. Lean methods involve continuous process improvement through the systematic elimination of waste, prevention of mistakes, and empowerment of workers to make changes. Because of the profit and productivity gains made in the manufacturing arena using lean methods, several healthcare organizations have adopted lean methodologies for patient care. Lean methods have now been implemented in many areas of health care. In orthopaedic surgery, lean methods have been applied to reduce complication rates and create a culture of continuous improvement. A step-by-step guide based on our experience can help surgeons use lean methods in practice. Surgeons and hospital centers well versed in lean methodology will be poised to reduce complications, improve patient outcomes, and optimize cost/benefit ratios for patient care.

  1. Nutritional assessment of processing effects on major and trace element content in sea buckthorn juice (Hippophaë rhamnoides L. ssp. rhamnoides).

    PubMed

    Gutzeit, D; Winterhalter, P; Jerz, G

    2008-08-01

    Processing effects on the mineral content were investigated during juice production from sea buckthorn (Hippophaë rhamnoides L. ssp. rhamnoides, Elaeagnaceae) using berries from 2 different growing areas. The major and trace elements of sea buckthorn berries and juices were determined by atomic absorption spectroscopy (AAS; calcium, iron, magnesium, potassium, sodium) and inductively coupled plasma-mass spectrometry (ICP-MS; arsenic, boron, chromium, copper, manganese, molybdenum, nickel, selenium, zinc). Potassium is the most abundant major element in sea buckthorn berries and juices. The production process increased the potassium content in the juice by about 20%. Moreover, the processing of juice increased the manganese content by up to 32% compared to the content in berries. During industrial juice production, the technological steps caused a loss of about 53% to 77% of the chromium concentration, 50% of the copper content, 64% to 75% of the molybdenum amount, and up to 45% of the iron concentration in the final juice product. Consumption of sea buckthorn juice represents a beneficial source of chromium, copper, manganese, molybdenum, iron, and potassium for the achievement of the respective dietary requirements.

  2. Processing of laser formed SiC powder

    NASA Technical Reports Server (NTRS)

    Haggerty, J. S.; Bowen, H. K.

    1987-01-01

    Processing research was undertaken to demonstrate that superior SiC characteristics could be achieved through the use of ideal constituent powders and careful post-synthesis processing steps. Initial research developed the means to produce approximately 1000 A uniform diameter, nonagglomerated, spherical, high purity SiC powders. Accomplishing this goal required major revision of the particle formation and growth model from one based on classical nucleation and growth to one based on collision and coalescence of Si particles followed by their carburization. Dispersions based on pure organic solvents as well as steric stabilization were investigated. Test parts were made by the colloidal pressing technique; both liquid filtration and consolidation (rearrangement) stages were modeled. Green densities corresponding to a random close packed structure were achieved. After drying, parts were densified at temperatures ranging from 1800 to 2100 C. This research program accomplished all of its major objectives. Superior microstructures and properties were attained by using powders having ideal characteristics and special post-synthesis processing procedures.

  3. Modelling of Sub-daily Hydrological Processes Using Daily Time-Step Models: A Distribution Function Approach to Temporal Scaling

    NASA Astrophysics Data System (ADS)

    Kandel, D. D.; Western, A. W.; Grayson, R. B.

    2004-12-01

    Mismatches in scale between the fundamental processes, the model and supporting data are a major limitation in hydrologic modelling. Surface runoff generation via infiltration excess and the process of soil erosion are fundamentally short time-scale phenomena and their average behaviour is mostly determined by the short time-scale peak intensities of rainfall. Ideally, these processes should be simulated using time-steps of the order of minutes to appropriately resolve the effect of rainfall intensity variations. However, sub-daily data support is often inadequate and the processes are usually simulated by calibrating daily (or even coarser) time-step models. Generally process descriptions are not modified but rather effective parameter values are used to account for the effect of temporal lumping, assuming that the effect of the scale mismatch can be counterbalanced by tuning the parameter values at the model time-step of interest. Often this results in parameter values that are difficult to interpret physically. A similar approach is often taken spatially. This is problematic as these processes generally operate or interact non-linearly. This indicates a need for better techniques to simulate sub-daily processes using daily time-step models while still using widely available daily information. A new method applicable to many rainfall-runoff-erosion models is presented. The method is based on temporal scaling using statistical distributions of rainfall intensity to represent sub-daily intensity variations in a daily time-step model. This allows the effect of short time-scale nonlinear processes to be captured while modelling at a daily time-step, which is often attractive due to the wide availability of daily forcing data. The approach relies on characterising the rainfall intensity variation within a day using a cumulative distribution function (cdf). This cdf is then modified by various linear and nonlinear processes typically represented in hydrological and erosion models. The statistical description of sub-daily variability is thus propagated through the model, allowing the effects of variability to be captured in the simulations. This results in cdfs of various fluxes, the integration of which over a day gives respective daily totals. Using 42-plot-years of surface runoff and soil erosion data from field studies in different environments from Australia and Nepal, simulation results from this cdf approach are compared with the sub-hourly (2-minute for Nepal and 6-minute for Australia) and daily models having similar process descriptions. Significant improvements in the simulation of surface runoff and erosion are achieved, compared with a daily model that uses average daily rainfall intensities. The cdf model compares well with a sub-hourly time-step model. This suggests that the approach captures the important effects of sub-daily variability while utilizing commonly available daily information. It is also found that the model parameters are more robustly defined using the cdf approach compared with the effective values obtained at the daily scale. This suggests that the cdf approach may offer improved model transferability spatially (to other areas) and temporally (to other periods).
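
    A minimal illustration of the temporal-scaling idea: daily infiltration-excess runoff is computed from a discretised distribution of within-day rainfall intensity rather than from the daily mean intensity, so the nonlinear threshold acts on each intensity class before integration to a daily total. The intensity classes, weights, and infiltration capacity below are placeholders, not the paper's calibrated model.

```python
import numpy as np

def daily_runoff_from_intensity_cdf(daily_rain_mm, intensities_mm_h, weights, infil_cap_mm_h):
    """Daily infiltration-excess runoff from a discretised within-day intensity distribution.

    The daily total and the intensity distribution imply a rain duration; runoff is then
    generated only by the intensity classes that exceed the infiltration capacity.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    i = np.asarray(intensities_mm_h, dtype=float)
    mean_intensity = np.sum(w * i)
    rain_hours = daily_rain_mm / mean_intensity
    excess = np.maximum(i - infil_cap_mm_h, 0.0)      # nonlinear threshold per class
    return rain_hours * np.sum(w * excess)

# the same 20 mm day gives no runoff at uniform low intensity but substantial runoff
# when a small fraction of the day rains at high intensity
print(daily_runoff_from_intensity_cdf(20.0, [2.0], [1.0], infil_cap_mm_h=5.0))
print(daily_runoff_from_intensity_cdf(20.0, [1.0, 4.0, 20.0], [0.6, 0.3, 0.1], infil_cap_mm_h=5.0))
```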

  4. Multi-compartmental modeling of SORLA’s influence on amyloidogenic processing in Alzheimer’s disease

    PubMed Central

    2012-01-01

    Background Proteolytic breakdown of the amyloid precursor protein (APP) by secretases is a complex cellular process that results in formation of neurotoxic Aβ peptides, causative of neurodegeneration in Alzheimer’s disease (AD). Processing involves monomeric and dimeric forms of APP that traffic through distinct cellular compartments where the various secretases reside. Amyloidogenic processing is also influenced by modifiers such as sorting receptor-related protein (SORLA), an inhibitor of APP breakdown and major AD risk factor. Results In this study, we developed a multi-compartment model to simulate the complexity of APP processing in neurons and to accurately describe the effects of SORLA on these processes. Based on dose–response data, our study concludes that SORLA specifically impairs processing of APP dimers, the preferred secretase substrate. In addition, SORLA alters the dynamic behavior of β-secretase, the enzyme responsible for the initial step in the amyloidogenic processing cascade. Conclusions Our multi-compartment model represents a major conceptual advance over single-compartment models previously used to simulate APP processing; and it identified APP dimers and β-secretase as the two distinct targets of the inhibitory action of SORLA in Alzheimer’s disease. PMID:22727043

  5. Focused-electron-beam-induced processing (FEBIP) for emerging applications in carbon nanoelectronics

    NASA Astrophysics Data System (ADS)

    Fedorov, Andrei G.; Kim, Songkil; Henry, Mathias; Kulkarni, Dhaval; Tsukruk, Vladimir V.

    2014-12-01

    Focused-electron-beam-induced processing (FEBIP), a resist-free additive nanomanufacturing technique, is an actively researched method for "direct-write" processing of a wide range of structural and functional nanomaterials, with a high degree of spatial and time-domain control. This article attempts to critically assess the FEBIP capabilities and unique value proposition in the context of processing of electronics materials, with a particular emphasis on emerging carbon (i.e., based on graphene and carbon nanotubes) devices and interconnect structures. One of the major hurdles in advancing the carbon-based electronic materials and device fabrication is the disjoint nature of various processing steps involved in making a functional device from the precursor graphene/CNT materials. Not only does this multi-step sequence severely limit the throughput and increase the cost, but it also dramatically reduces the processing reproducibility and negatively impacts the quality because of possible between-the-step contamination, especially for impurity-susceptible materials such as graphene. The FEBIP provides a unique opportunity to address many challenges of carbon nanoelectronics, especially when it is employed as part of an integrated processing environment based on multiple "beams" of energetic particles, including electrons, photons, and molecules. This avenue is promising from an applications perspective, as such a multi-functional (electron/photon/molecule) beam approach enables one to define shapes (patterning), form structures (deposition/etching), and modify (cleaning/doping/annealing) properties with locally resolved control at the nanoscale using the same tool without ever changing the processing environment. It thus will have a direct positive impact on enhancing functionality, improving quality and reducing fabrication costs for electronic devices, based on both conventional CMOS and emerging carbon (CNT/graphene) materials.

  6. CDC Kerala 1: Organization of clinical child development services (1987-2013).

    PubMed

    Nair, M K C; George, Babu; Nair, G S Harikumaran; Bhaskaran, Deepa; Leena, M L; Russell, Paul Swamidhas Sudhakar

    2014-12-01

    The main objective of establishing the Child Development Centre (CDC), Kerala for piloting comprehensive child adolescent development program in India, has been to understand the conceptualization, design and scaling up of a pro-active positive child development initiative, easily replicable all over India. The process of establishing the Child Development Centre (CDC) Kerala for research, clinical services, training and community extension services over the last 25 y, has been as follows; Step 1: Conceptualization--The life cycle approach to child development; Step 2: Research basis--CDC model early stimulation is effective; Step 3: Development and validation of seven simple developmental screening tools; Step 4: CDC Diagnostic services--Ultrasonology and genetic, and metabolic laboratory; Step 5: Developing seven intervention packages; Step 6: Training--Post graduate diploma in clinical child development; Step 7: CDC Clinic Services--seven major ones; Step 8: CDC Community Services--Child development referral units; Step 9: Community service delivery models--Childhood disability and for adolescent care counselling projects; Step 10: National capacity building--Four child development related courses. CDC Kerala follow-up and clinic services are offered till 18 y of age and premarital counselling till 24 y of age as shown in "CDC Kerala Clinic Services Flow Chart" and 74,291 children have availed CDC clinic services in the last 10 y. CDC Kerala is the first model for comprehensive child adolescent development services using a lifecycle approach in the Government sector and hence declared as the collaborative centre for Rashtriya Bal Swasthya Karyakram (RBSK), in Kerala.

  7. Basic exploration geophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, E.S.

    1988-01-01

    An introduction to geophysical methods used to explore for natural resources and to survey Earth's geology is presented in this volume. It is suitable for second- and third-year undergraduate students majoring in geology or engineering and for professional engineers and earth scientists without formal instruction in geophysics. The author assumes the reader is familiar with geometry, algebra, and trigonometry. Geophysical exploration includes seismic refraction and reflection surveying, electrical resistivity and electromagnetic field surveying, and geophysical well logging. Surveying operations are described in step-by-step procedures and are illustrated by practical examples. Computer-based methods of processing and interpreting data as well as geographical methods are introduced.

  8. A Module Experimental Process System Development Unit (MEPSDU). [flat plate solar arrays

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a cost effective process sequence that has the potential for the production of flat plate photovoltaic modules which meet the price goal in 1986 of 70 cents or less per Watt peak is described. The major accomplishments include (1) an improved AR coating technique; (2) the use of sand blast back clean-up to reduce clean up costs and to allow much of the Al paste to serve as a back conductor; and (3) the development of wave soldering for use with solar cells. Cells were processed to evaluate different process steps, a cell and minimodule test plan was prepared and data were collected for preliminary Samics cost analysis.

  9. Economic evaluation of alternative wastewater treatment plant options for pulp and paper industry.

    PubMed

    Buyukkamaci, Nurdan; Koken, Emre

    2010-11-15

    Excessive water consumption in the pulp and paper industry results in a high amount of wastewater. Pollutant characteristics of the wastewater vary depending on the processes used in production and the quality of paper produced. However, in general, high organic material and suspended solid contents are considered the major pollutants of pulp and paper industry effluents. The major pollutant characteristics of pulp and paper industry effluents in Turkey were surveyed, and means of major pollutant concentrations, which were grouped in three different pollution grades (low, moderate and high strength effluents), and flow rates within the 3,000 to 10,000 m³/day range with 1,000 m³/day steps were used as design parameters. Ninety-six treatment plants were designed using twelve flow schemes which were combinations of physical treatment, chemical treatment, aerobic and anaerobic biological processes. A detailed comparative cost analysis which includes investment, operation, maintenance and rehabilitation costs was prepared to determine optimum treatment processes for each pollution grade. The most economic and technically optimal treatment processes were found to be the extended aeration activated sludge process for low strength effluents, the extended aeration activated sludge process or UASB followed by an aeration basin for medium strength effluents, and UASB followed by an aeration basin or UASB followed by the conventional activated sludge process for high strength effluents. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. Gait parameter and event estimation using smartphones.

    PubMed

    Pepa, Lucia; Verdini, Federica; Spalazzi, Luca

    2017-09-01

    The use of smartphones can greatly help with gait parameter estimation during daily living, but their accuracy needs a deeper evaluation against a gold standard. The objective of the paper is a step-by-step assessment of smartphone performance in heel strike, step count, step period, and step length estimation. The influence of smartphone placement and orientation on estimation performance is evaluated as well. This work relies on a smartphone app developed to acquire, process, and store inertial sensor data and rotation matrices about device position. Smartphone alignment was evaluated by expressing the acceleration vector in three reference frames. Two smartphone placements were tested. Three methods for heel strike detection were considered. On the basis of estimated heel strikes, step count is performed, step period is obtained, and the inverted pendulum model is applied for step length estimation. Pearson correlation coefficient, absolute and relative errors, ANOVA, and Bland-Altman limits of agreement were used to compare smartphone estimation with stereophotogrammetry on eleven healthy subjects. High correlations were found between smartphone and stereophotogrammetric measures: up to 0.93 for step count, 0.99 for heel strike, 0.96 for step period, and 0.92 for step length. Error ranges are comparable to those in the literature. Smartphone placement did not affect the performance. The major influence of acceleration reference frames and heel strike detection method was found in step count. This study provides detailed information about the expected accuracy when a smartphone is used as a gait monitoring tool. The obtained results encourage real-life applications. Copyright © 2017 Elsevier B.V. All rights reserved.
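
    For the step-length stage, the classic inverted-pendulum relation is step_length = 2*sqrt(2*l*h - h^2), where l is the pendulum (leg) length and h the vertical centre-of-mass excursion obtained by double integration of vertical acceleration between consecutive heel strikes. The sketch below applies this relation with only crude drift removal; the paper's exact filtering and calibration may differ, and the synthetic signal is for illustration only.

```python
import numpy as np

def step_lengths_inverted_pendulum(acc_vertical, fs, heel_strikes, leg_length_m):
    """Per-step length from the inverted-pendulum model.

    h is the vertical centre-of-mass excursion obtained by double integration of the
    vertical acceleration between consecutive heel strikes; drift removal is reduced
    here to simple de-meaning of acceleration and velocity.
    """
    lengths = []
    for i0, i1 in zip(heel_strikes[:-1], heel_strikes[1:]):
        a = np.asarray(acc_vertical[i0:i1], dtype=float)
        a = a - a.mean()                      # crude gravity/drift removal
        v = np.cumsum(a) / fs
        v = v - v.mean()
        z = np.cumsum(v) / fs
        h = (z.max() - z.min()) / 2.0         # vertical CoM excursion
        lengths.append(2.0 * np.sqrt(max(2.0 * leg_length_m * h - h ** 2, 0.0)))
    return lengths

# synthetic example: ~1 Hz vertical oscillation sampled at 100 Hz, heel strikes every second
fs = 100
t = np.arange(0, 5, 1 / fs)
acc = 0.8 * np.sin(2 * np.pi * 1.0 * t)      # m/s^2
strikes = list(range(0, len(t), fs))
print(step_lengths_inverted_pendulum(acc, fs, strikes, leg_length_m=0.9))
```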

  11. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
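
    A minimal sketch of the compile-and-summarise idea: per-step characterization parameters are compiled and summed into overall process metrics that can then be compared between techniques. The specific metrics chosen here (lead time, value-added ratio, rolled throughput yield) are common lean measures used for illustration, not the patent's mathematical model, and the data class is an assumption.

```python
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    cycle_time_s: float       # time spent adding value at this step
    queue_time_s: float       # time the part waits before this step
    defect_rate: float        # fraction of parts reworked or scrapped at this step

def process_metrics(steps):
    """Compile per-step data and sum it into overall process metrics."""
    lead_time = sum(s.cycle_time_s + s.queue_time_s for s in steps)
    value_added = sum(s.cycle_time_s for s in steps)
    yield_rolled = 1.0
    for s in steps:
        yield_rolled *= (1.0 - s.defect_rate)
    return {
        "lead_time_s": lead_time,
        "value_added_ratio": value_added / lead_time,
        "rolled_throughput_yield": yield_rolled,
    }

batch = [ProcessStep("stamping", 30, 600, 0.02), ProcessStep("assembly", 90, 1800, 0.05)]
print(process_metrics(batch))
```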

  12. Idle waves in high-performance computing

    NASA Astrophysics Data System (ADS)

    Markidis, Stefano; Vencels, Juris; Peng, Ivy Bo; Akhmetova, Dana; Laure, Erwin; Henri, Pierre

    2015-01-01

    The vast majority of parallel scientific applications distributes computation among processes that are in a busy state when computing and in an idle state when waiting for information from other processes. We identify the propagation of idle waves through processes in scientific applications with a local information exchange between the two processes. Idle waves are nondispersive and have a phase velocity inversely proportional to the average busy time. The physical mechanism enabling the propagation of idle waves is the local synchronization between two processes due to remote data dependency. This study provides a description of the large number of processes in parallel scientific applications as a continuous medium. This work also is a step towards an understanding of how localized idle periods can affect remote processes, leading to the degradation of global performance in parallel scientific applications.
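
    A toy model of the mechanism: in each iteration a rank computes for a fixed busy time and then waits for its neighbours' previous iteration to finish, so a one-off delay injected at one rank propagates outward roughly one rank per iteration, i.e. with a phase velocity that scales as the inverse of the busy time. This is an illustrative sketch of local synchronization through remote data dependency, not the paper's experimental setup.

```python
import numpy as np

def idle_wave_arrivals(n_procs=32, n_iters=64, busy_time=1.0, delay=5.0, source=0):
    """Iteration at which an injected delay first reaches each rank."""
    finish = np.zeros(n_procs)            # finish time of the previous iteration per rank
    finish[source] += delay               # inject a one-off idle period at one rank
    first_delayed = np.full(n_procs, -1)
    for it in range(1, n_iters + 1):
        left = np.roll(finish, 1);  left[0] = finish[0]       # boundary ranks have one neighbour
        right = np.roll(finish, -1); right[-1] = finish[-1]
        start = np.maximum(finish, np.maximum(left, right))   # wait for neighbour data
        finish = start + busy_time
        newly = (first_delayed < 0) & (finish > it * busy_time + 1e-9)
        first_delayed[newly] = it
    return first_delayed

arrivals = idle_wave_arrivals()
print(arrivals[:8])   # grows linearly with rank distance: speed ~ 1 rank per busy period
```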

  13. Automating Guidelines for Clinical Decision Support: Knowledge Engineering and Implementation.

    PubMed

    Tso, Geoffrey J; Tu, Samson W; Oshiro, Connie; Martins, Susana; Ashcraft, Michael; Yuen, Kaeli W; Wang, Dan; Robinson, Amy; Heidenreich, Paul A; Goldstein, Mary K

    2016-01-01

    As utilization of clinical decision support (CDS) increases, it is important to continue the development and refinement of methods to accurately translate the intention of clinical practice guidelines (CPG) into a computable form. In this study, we validate and extend the 13 steps that Shiffman et al. identified for translating CPG knowledge for use in CDS. During an implementation project of ATHENA-CDS, we encoded complex CPG recommendations for five common chronic conditions for integration into an existing clinical dashboard. Major decisions made during the implementation process were recorded and categorized according to the 13 steps. During the implementation period, we categorized 119 decisions and identified 8 new categories required to complete the project. We provide details on an updated model that outlines all of the steps used to translate CPG knowledge into a CDS integrated with existing health information technology.

  14. Change management in health care.

    PubMed

    Campbell, Robert James

    2008-01-01

    This article introduces health care managers to the theories and philosophies of John Kotter and William Bridges, 2 leaders in the evolving field of change management. For Kotter, change has both an emotional and situational component, and methods for managing each are expressed in his 8-step model (developing urgency, building a guiding team, creating a vision, communicating for buy-in, enabling action, creating short-term wins, don't let up, and making it stick). Bridges deals with change at a more granular, individual level, suggesting that change within a health care organization means that individuals must transition from one identity to a new identity when they are involved in a process of change. According to Bridges, transitions occur in 3 steps: endings, the neutral zone, and beginnings. The major steps and important concepts within the models of each are addressed, and examples are provided to demonstrate how health care managers can actualize the models within their health care organizations.

  15. Acidic Residues Control the Dimerization of the N-terminal Domain of Black Widow Spiders’ Major Ampullate Spidroin 1

    NASA Astrophysics Data System (ADS)

    Bauer, Joschka; Schaal, Daniel; Eisoldt, Lukas; Schweimer, Kristian; Schwarzinger, Stephan; Scheibel, Thomas

    2016-09-01

    Dragline silk is the most prominent amongst spider silks and comprises two types of major ampullate spidroins (MaSp) differing in their proline content. In the natural spinning process, the conversion of soluble MaSp into a tough fiber is, amongst other factors, triggered by dimerization and conformational switching of their helical amino-terminal domains (NRN). Both processes are induced by protonation of acidic residues upon acidification along the spinning duct. Here, the structure and monomer-dimer equilibrium of the domain NRN1 of Latrodectus hesperus MaSp1 and variants thereof have been investigated, and the key residues for both could be identified. Changes in ionic composition and strength within the spinning duct enable electrostatic interactions between the acidic and basic pole of two monomers which prearrange into an antiparallel dimer. Upon naturally occurring acidification, this dimer is stabilized by protonation of residue E114. A conformational change is independently triggered by protonation of clustered acidic residues (D39, E76, E81). Such a step-by-step mechanism allows controlled spidroin assembly in a pH- and salt-sensitive manner, preventing premature aggregation of spider silk proteins in the gland and at the same time ensuring fast and efficient dimer formation and stabilization on demand in the spinning duct.

  16. A nonchromatographic process for purification of secretory immunoglobulins from caprine whey.

    PubMed

    Matlschweiger, Alexander; Himmler, Gottfried; Linhart, Clemens; Harasek, Michael; Hahn, Rainer

    2017-05-01

    Secretory immunoglobulins are an important antibody class, primarily responsible for immunoprotection of mucosal surfaces. A simple, non-chromatographic purification process for secretory immunoglobulins from caprine whey was developed. In the first process step, whey was concentrated 30-40-fold on a 500 kDa membrane, thereby increasing the purity from 3% to 15%. The second step consisted of a fractionated PEG precipitation, in which high molecular weight impurities were removed first and in the second stage the secretory immunoglobulins were precipitated, leaving a majority of the low molecular weight proteins in solution. The re-dissolved secretory immunoglobulin fraction had a purity of 43%, which could then be increased to 72% by diafiltration at a volume exchange factor of 10. Further increase of purity was only possible at the expense of very high buffer consumption. If diafiltration was performed directly after ultrafiltration, followed by precipitation, the yield was higher but purity was only 54%. Overall, filtration performance was characterized by high concentration polarization; therefore, process conditions were set to low trans-membrane pressure and moderate protein concentration. As such, purity and, to a lesser extent, throughput were the major objectives rather than yield, since whey, as a by-product of the dairy industry, is a cheap raw material of almost unlimited supply. Ultra-/diafiltration performance was described well by correlations using dimensionless numbers. Compared with a theoretical model (Graetz/Leveque solution), the flux was slightly overestimated. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:642-653, 2017. © 2017 American Institute of Chemical Engineers.

  17. Turbiditic systems on passive margins: fifteen years of fruitful industry-academic exchanges.

    NASA Astrophysics Data System (ADS)

    Guillocheau, F.

    2012-04-01

    During the last fifteen years, with oil discoveries in deep offshore plays, new tools have been developed that deeply modified our knowledge of sedimentary gravity processes on passive margins: geometry, physical processes, but also the importance of the topography and the quantification of the stratigraphic parameters of control. The major breakthrough was of course the extensive 3D seismic data available around most of the world margins, with a focus on gravity-tectonics dominated margins. The first major progress was the characterization of sinuous channel infilling, its diversity and the different models for its origin. This also brought better knowledge of the different types of slopes (graded vs. above-graded) and the extension of the concept of accommodation to deep-water environments (ponded, healed-slope, incised submarine valley and slope accommodation). The second step was the understanding of the synsedimentary deformations controlling the location and the growth of turbiditic systems on margins dominated by gravity tectonics, with the importance of the sedimentary flux and its variation through time and space. The third step is now the integration of the sedimentary system, from the upstream erosional catchment to the abyssal plain (source to sink approach), with the question of the sediment routing system. During the last 100 Ma, continents experienced major changes of both topography and climate. In the case of Africa, these are (1) the growth of the plateaus (and mainly the South African one) around 90-80 Ma (Late Cretaceous) and 40-20 Ma (Late Eocene-Early Miocene) and (2) a climate evolution from hot humid (50-40 Ma) to hot dry conditions since 20-15 Ma. This evolution changed the topography, the processes of erosion, and the volume and nature (weathered vs. non-weathered rocks) of the eroded materials. These are the primary processes controlling the deposition of turbiditic systems, and hence key to predicting the location of sands. This will be discussed along the Atlantic margin of Africa. Keywords: Turbidite, Passive margins, Topography, Deformation, Source to sink

  18. Nitrogen cycling models and their application to forest harvesting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, D.W.; Dale, V.H.

    1986-01-01

    The characterization of forest nitrogen- (N-) cycling processes by several N-cycling models (FORCYTE, NITCOMP, FORTNITE, and LINKAGES) is briefly reviewed and evaluated against current knowledge of N cycling in forests. Some important processes (e.g., translocation within trees, N dynamics in decaying leaf litter) appear to be well characterized, whereas others (e.g., N mineralization from soil organic matter, N fixation, N dynamics in decaying wood, nitrification, and nitrate leaching) are poorly characterized, primarily because of a lack of knowledge rather than an oversight by model developers. It is remarkable how well the forest models do work in the absence of data on some key processes. For those systems in which the poorly understood processes could cause major changes in N availability or productivity, the accuracy of model predictions should be examined. However, the development of N-cycling models represents a major step beyond the much simpler, classic conceptual models of forest nutrient cycling developed by early investigators. The new generation of computer models will surely improve as research reveals how key nutrient-cycling processes operate.

  19. Elementary reaction modeling of reversible CO/CO2 electrochemical conversion on patterned nickel electrodes

    NASA Astrophysics Data System (ADS)

    Luo, Yu; Shi, Yixiang; Li, Wenying; Cai, Ningsheng

    2018-03-01

    CO/CO2 are the major gas reactant/product in the fuel electrode of reversible solid oxide cells (RSOC). This study proposes a two-charge-transfer-step mechanism to describe the reaction and transfer processes of CO-CO2 electrochemical conversion on a patterned Ni electrode of RSOC. An elementary reaction model is developed to couple two charge transfer reactions, C(Ni)+O2-(YSZ) ↔ CO(Ni)+(YSZ)+2e- and CO(Ni)+O2-(YSZ) ↔ CO2(Ni)+(YSZ)+2e-, with adsorption/desorption, surface chemical reactions and surface diffusion. The model is validated in both solid oxide electrolysis cell (SOEC) and solid oxide fuel cell (SOFC) modes against experimental data from a patterned Ni electrode with 10 μm stripe width at different pCO (0-0.25 atm), pCO2 (0-0.35 atm) and operating temperatures (600-700 °C). The model indicates that SOEC mode is dominated by the charge transfer step C(Ni)+O2-(YSZ)↔CO(Ni)+(YSZ)+2e-, while SOFC mode is dominated by CO(Ni)+O2-(YSZ)↔CO2(Ni)+(YSZ)+2e- on the patterned Ni electrode. The sensitivity analysis shows that the charge transfer step is the major rate-determining step for the RSOC; in addition, surface diffusion of CO and CO2 as well as CO2 adsorption also play a significant role in the electrochemical reaction in SOEC mode, while surface diffusion of CO and CO2 desorption could be co-limiting in SOFC mode.

  20. Feature theory and the two-step hypothesis of Müllerian mimicry evolution.

    PubMed

    Balogh, Alexandra Catherine Victoria; Gamberale-Stille, Gabriella; Tullberg, Birgitta Sillén; Leimar, Olof

    2010-03-01

    The two-step hypothesis of Müllerian mimicry evolution states that mimicry starts with a major mutational leap between adaptive peaks, followed by gradual fine-tuning. The hypothesis was suggested to solve the problem of apostatic selection producing a valley between adaptive peaks, and appears reasonable for a one-dimensional phenotype. Extending the hypothesis to the realistic scenario of multidimensional phenotypes controlled by multiple genetic loci can be problematic, because it is unlikely that major mutational leaps occur simultaneously in several traits. Here we consider the implications of predator psychology on the evolutionary process. According to feature theory, single prey traits may be used by predators as features to classify prey into discrete categories. A mutational leap in such a trait could initiate mimicry evolution. We conducted individual-based evolutionary simulations in which virtual predators both categorize prey according to features and generalize over total appearances. We found that an initial mutational leap toward feature similarity in one dimension facilitates mimicry evolution of multidimensional traits. We suggest that feature-based predator categorization together with predator generalization over total appearances solves the problem of applying the two-step hypothesis to complex phenotypes, and provides a basis for a theory of the evolution of mimicry rings.

  1. Emotional Speech Perception Unfolding in Time: The Role of the Basal Ganglia

    PubMed Central

    Paulmann, Silke; Ott, Derek V. M.; Kotz, Sonja A.

    2011-01-01

    The basal ganglia (BG) have repeatedly been linked to emotional speech processing in studies involving patients with neurodegenerative and structural changes of the BG. However, the majority of previous studies did not consider that (i) emotional speech processing entails multiple processing steps, and the possibility that (ii) the BG may engage in one rather than the other of these processing steps. In the present study we investigate three different stages of emotional speech processing (emotional salience detection, meaning-related processing, and identification) in the same patient group to verify whether lesions to the BG affect these stages in a qualitatively different manner. Specifically, we explore early implicit emotional speech processing (probe verification) in an ERP experiment followed by an explicit behavioral emotional recognition task. In both experiments, participants listened to emotional sentences expressing one of four emotions (anger, fear, disgust, happiness) or neutral sentences. In line with previous evidence patients and healthy controls show differentiation of emotional and neutral sentences in the P200 component (emotional salience detection) and a following negative-going brain wave (meaning-related processing). However, the behavioral recognition (identification stage) of emotional sentences was impaired in BG patients, but not in healthy controls. The current data provide further support that the BG are involved in late, explicit rather than early emotional speech processing stages. PMID:21437277

  2. Automatic cloud coverage assessment of Formosat-2 image

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2011-11-01

    The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated in a daily-revisit mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations and processes the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System consists of two major steps. Firstly, an unsupervised K-means method is used to automatically estimate the cloud statistic of a Formosat-2 image. Secondly, cloud coverage is estimated by manual examination of the image. A more accurate Automatic Cloud Coverage Assessment (ACCA) method would therefore increase the efficiency of the second step by providing a good prediction of the cloud statistic. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that includes pre-processing and post-processing analysis. For the pre-processing analysis, the cloud statistic is determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel reexamination, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the results of the pre-processing analysis and increase the efficiency of the manual examination.
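
    The following is a minimal sketch, not the NSPO ACCA implementation, of how the unsupervised techniques named in the abstract (K-means clustering, Otsu's threshold, Sobel edges) can be combined to estimate a cloud fraction from a single band; the band choice, cluster count and threshold logic are assumptions for illustration only, and scikit-image/scikit-learn are assumed available.

    ```python
    import numpy as np
    from skimage import filters
    from sklearn.cluster import KMeans

    def rough_cloud_fraction(band, n_clusters=2, random_state=0):
        """Estimate a cloud fraction from a single reflectance band (2-D array).

        Clouds are assumed to be the brightest K-means cluster; Otsu's threshold
        and a Sobel edge map illustrate the steps named in the abstract, not the
        actual NSPO ACCA implementation.
        """
        otsu_t = filters.threshold_otsu(band)          # global brightness threshold
        edges = filters.sobel(band)                    # edge strength (cloud boundaries)

        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state)
        labels = km.fit_predict(band.reshape(-1, 1)).reshape(band.shape)
        bright_cluster = np.argmax(km.cluster_centers_.ravel())

        cloud_mask = (labels == bright_cluster) & (band > otsu_t)
        return cloud_mask.mean(), edges

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        fake_band = rng.random((100, 100)) * 0.3
        fake_band[20:50, 30:70] += 0.6                 # synthetic bright "cloud" patch
        frac, _ = rough_cloud_fraction(fake_band)
        print(f"estimated cloud fraction: {frac:.2%}")
    ```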

  3. Computational modeling of soot nucleation

    NASA Astrophysics Data System (ADS)

    Chung, Seung-Hyun

    Recent studies indicate that soot is the second most significant driver of climate change---behind CO2, but ahead of methane---and increased levels of soot particles in the air are linked to health hazards such as heart disease and lung cancer. Within the soot formation process, soot nucleation is the least understood step, and current experimental findings are still limited. This thesis presents computational modeling studies of the major pathways of the soot nucleation process. In this study, two regimes of soot nucleation---chemical growth and physical agglomeration---were evaluated and the results demonstrated that combustion conditions determine the relative importance of these two routes. Also, the dimerization process of polycyclic aromatic hydrocarbons, which has been regarded as one of the most important physical agglomeration processes in soot formation, was carefully examined with a new method for obtaining the nucleation rate using molecular dynamics simulation. The results indicate that the role of pyrene dimerization, which is the commonly accepted model, is expected to be highly dependent on various flame temperature conditions and may not be a key step in the soot nucleation process. An additional pathway, coronene dimerization in this case, needed to be included to improve the match with experimental data. The results of this thesis provide insight on the soot nucleation process and can be utilized to improve current soot formation models.

  4. The influence of patient portals on users' decision making is insufficiently investigated: A systematic methodological review.

    PubMed

    Fraccaro, Paolo; Vigo, Markel; Balatsoukas, Panagiotis; Buchan, Iain E; Peek, Niels; van der Veer, Sabine N

    2018-03-01

    Patient portals are considered valuable conduits for supporting patients' self-management. However, it is unknown why they often fail to impact on health care processes and outcomes. This may be due to a scarcity of robust studies focusing on the steps that are required to induce improvement: users need to effectively interact with the portal (step 1) in order to receive information (step 2), which might influence their decision-making (step 3). We aimed to explore this potential knowledge gap by investigating to what extent each step has been investigated for patient portals, and explore the methodological approaches used. We performed a systematic literature review using Coiera's information value chain as a guiding theoretical framework. We searched MEDLINE and Scopus by combining terms related to patient portals and evaluation methodologies. Two reviewers selected relevant papers through duplicate screening, and one extracted data from the included papers. We included 115 articles. The large majority (n = 104) evaluated aspects related to interaction with patient portals (step 1). Usage was most often assessed (n = 61), mainly by analysing system interaction data (n = 50), with most authors considering participants as active users if they logged in at least once. Overall usability (n = 57) was commonly assessed through non-validated questionnaires (n = 44). Step 2 (information received) was investigated in 58 studies, primarily by analysing interaction data to evaluate usage of specific system functionalities (n = 34). Eleven studies explicitly assessed the influence of patient portals on patients' and clinicians' decisions (step 3). Whereas interaction with patient portals has been extensively studied, their influence on users' decision-making remains under-investigated. Methodological approaches to evaluating usage and usability of portals showed room for improvement. To unlock the potential of patient portals, more (robust) research should focus on better understanding the complex process of how portals lead to improved health and care. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  5. A deep etching mechanism for trench-bridging silicon nanowires

    NASA Astrophysics Data System (ADS)

    Tasdemir, Zuhal; Wollschläger, Nicole; Österle, Werner; Leblebici, Yusuf; Erdem Alaca, B.

    2016-03-01

    Introducing a single silicon nanowire with a known orientation and dimensions to a specific layout location constitutes a major challenge. The challenge becomes even more formidable, if one chooses to realize the task in a monolithic fashion with an extreme topography, a characteristic of microsystems. The need for such a monolithic integration is fueled by the recent surge in the use of silicon nanowires as functional building blocks in various electromechanical and optoelectronic applications. This challenge is addressed in this work by introducing a top-down, silicon-on-insulator technology. The technology provides a pathway for obtaining well-controlled silicon nanowires along with the surrounding microscale features up to a three-order-of-magnitude scale difference. A two-step etching process is developed, where the first shallow etch defines a nanoscale protrusion on the wafer surface. After applying a conformal protection on the protrusion, a deep etch step is carried out forming the surrounding microscale features. A minimum nanowire cross-section of 35 nm by 168 nm is demonstrated in the presence of an etch depth of 10 μm. Nanowire cross-sectional features are characterized via transmission electron microscopy and linked to specific process steps. The technology allows control on all dimensional aspects along with the exact location and orientation of the silicon nanowire. The adoption of the technology in the fabrication of micro and nanosystems can potentially lead to a significant reduction in process complexity by facilitating direct access to the nanowire during surface processes such as contact formation and doping.

  6. A deep etching mechanism for trench-bridging silicon nanowires.

    PubMed

    Tasdemir, Zuhal; Wollschläger, Nicole; Österle, Werner; Leblebici, Yusuf; Alaca, B Erdem

    2016-03-04

    Introducing a single silicon nanowire with a known orientation and dimensions to a specific layout location constitutes a major challenge. The challenge becomes even more formidable, if one chooses to realize the task in a monolithic fashion with an extreme topography, a characteristic of microsystems. The need for such a monolithic integration is fueled by the recent surge in the use of silicon nanowires as functional building blocks in various electromechanical and optoelectronic applications. This challenge is addressed in this work by introducing a top-down, silicon-on-insulator technology. The technology provides a pathway for obtaining well-controlled silicon nanowires along with the surrounding microscale features up to a three-order-of-magnitude scale difference. A two-step etching process is developed, where the first shallow etch defines a nanoscale protrusion on the wafer surface. After applying a conformal protection on the protrusion, a deep etch step is carried out forming the surrounding microscale features. A minimum nanowire cross-section of 35 nm by 168 nm is demonstrated in the presence of an etch depth of 10 μm. Nanowire cross-sectional features are characterized via transmission electron microscopy and linked to specific process steps. The technology allows control on all dimensional aspects along with the exact location and orientation of the silicon nanowire. The adoption of the technology in the fabrication of micro and nanosystems can potentially lead to a significant reduction in process complexity by facilitating direct access to the nanowire during surface processes such as contact formation and doping.

  7. Defining care products to finance health care in the Netherlands.

    PubMed

    Westerdijk, Machiel; Zuurbier, Joost; Ludwig, Martijn; Prins, Sarah

    2012-04-01

    A case-mix project started in the Netherlands with the primary goal to define a complete set of health care products for hospitals. The definition of the product structure was completed 4 years later. The results are currently being used for billing purposes. This paper focuses on the methodology and techniques that were developed and applied in order to define the case-mix product structure. The central research question was how to develop a manageable product structure, i.e., a limited set of hospital products, with acceptable cost homogeneity. For this purpose, a data warehouse with approximately 1.5 million patient records from 27 hospitals was built up over a period of 3 years. The data associated with each patient consist of a large number of a priori independent parameters describing the resource utilization in different stages of the treatment process, e.g., activities in the operating theatre, the lab and the radiology department. Because of the complexity of the database, it was necessary to apply advanced data analysis techniques. The full analysis process, which starts from the database and ends with a product definition, consists of four basic analysis steps. Each of these steps has revealed interesting insights. This paper describes each step in some detail and presents the major results of each step. The result consists of 687 product groups for 24 medical specialties used for billing purposes.

  8. Early assessment of the 10-step patient engagement framework for patient-centred outcomes research studies: the first three steps.

    PubMed

    Sofolahan-Oladeinde, Yewande; Newhouse, Robin P; Lavallee, Danielle C; Huang, Jennifer C; Mullins, C Daniel

    2017-06-01

    A key principle of patient-centred outcomes research (PCOR) is the engagement of patients and other stakeholders in the research process, but the evidence is still emerging on the impact patient engagement has on the research process. A 10-step framework has been developed to provide methodological guidance for patient engagement throughout the research process. However, the utility of the framework for patient engagement has not been tested in actual research studies. To describe researchers' overall experiences with engaging patients at the beginning of their PCOR research process. Twelve in-depth interviews were conducted face-to-face and by telephone with PCOR researchers between November 2014 and January 2015 at an Academic Health Center in the eastern USA. All data were audiotaped and transcribed, and NVivo 10 software was used for data analysis. Four major themes emerged: (i) the importance of patient engagement and how it provides 'a perspective you can't get unless you talk to the patient'; (ii) the impact of patient engagement; (iii) challenges and barriers of engagement; and (iv) the realities of patient engagement. Researchers' views illustrate the need to re-evaluate patient engagement in PCOR based on current realities. Given the many challenges to engagement that researchers encounter, it may be more productive to redefine the process of patient engagement so that the issues researchers now face are taken into account in future funding announcements, engagement rubrics and methodology frameworks developed. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. One-Step Preservation and Decalcification of Bony Tissue for Molecular Profiling.

    PubMed

    Mueller, Claudius; Harpole, Michael G; Espina, Virginia

    2017-01-01

    Bone metastasis from primary cancer sites creates diagnostic and therapeutic challenges. Calcified bone is difficult to biopsy due to tissue hardness and patient discomfort, thus limiting the frequency and availability of bone/bone marrow biopsy material for molecular profiling. In addition, bony tissue must be demineralized (decalcified) prior to histomorphologic analysis. Decalcification processes rely on three main principles: (a) solubility of calcium salts in an acid, such as formic or nitric acid; (b) calcium chelation with ethylenediaminetetraacetic acid (EDTA); or (c) ion-exchange resins in a weak acid. A major roadblock in molecular profiling of bony tissue has been the lack of a suitable demineralization process that preserves histomorphology of calcified and soft tissue elements while also preserving phosphoproteins and nucleic acids. In this chapter, we describe general issues relevant to specimen collection and preservation of osseous tissue for molecular profiling. We provide two protocols: (a) one-step preservation of tissue histomorphology and proteins and posttranslational modifications, with simultaneous decalcification of bony tissue, and (b) ethanol-based tissue processing for TheraLin-fixed bony tissue.

  10. Automatic Processing of Impact Ionization Mass Spectra Obtained by Cassini CDA

    NASA Astrophysics Data System (ADS)

    Villeneuve, M.

    2015-12-01

    Since Cassini's arrival at Saturn in 2004, the Cosmic Dust Analyzer (CDA) has recorded nearly 200,000 mass spectra of dust particles. A majority of this data has been collected in Saturn's diffuse E ring, where sodium salts embedded in water ice particles indicate that many particles are in fact frozen droplets from Enceladus' subsurface ocean that have been expelled from cracks in the icy crust. So far only a small fraction of the obtained spectra have been processed because the steps in processing the spectra require human manipulation. We developed an automatic processing pipeline for CDA mass spectra which will consistently analyze this data. The preprocessing steps are to de-noise the spectra, determine and remove the baseline, calculate the correct stretch parameter, and finally to identify elements and compounds in the spectra. With the E ring constantly evolving due to embedded active moons, this data will provide valuable information about the source of the E ring, the subsurface of Saturn's icy moon Enceladus, as well as about the dynamics of the ring itself.
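
    As a rough illustration of the kind of pre-processing named in the abstract (de-noising, baseline removal, peak identification), the sketch below applies generic SciPy filters to a synthetic one-dimensional spectrum; the window sizes, prominence threshold and synthetic data are assumptions, and the mass-calibration ("stretch parameter") step is not modeled.

    ```python
    import numpy as np
    from scipy.signal import medfilt, find_peaks
    from scipy.ndimage import minimum_filter1d, uniform_filter1d

    def preprocess_spectrum(intensity, smooth=5, baseline_window=101):
        """Toy pre-processing for a 1-D spectrum: de-noise, estimate and subtract
        a baseline, then locate peaks. Window sizes and thresholds are
        illustrative, not values from the CDA pipeline."""
        denoised = medfilt(intensity, kernel_size=smooth)
        baseline = uniform_filter1d(
            minimum_filter1d(denoised, size=baseline_window), size=baseline_window)
        corrected = np.clip(denoised - baseline, 0, None)
        peaks, _ = find_peaks(corrected, prominence=corrected.max() * 0.05)
        return corrected, peaks

    if __name__ == "__main__":
        t = np.arange(2000, dtype=float)
        spectrum = 5.0 + 0.002 * t                              # drifting baseline
        for center in (300, 700, 1200):                         # synthetic peaks
            spectrum += 40.0 * np.exp(-0.5 * ((t - center) / 4.0) ** 2)
        corrected, peaks = preprocess_spectrum(spectrum)
        print("peak positions:", peaks.tolist())
    ```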

  11. An integrated toolbox for processing and analysis of remote sensing data of inland and coastal waters - atmospheric correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.
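
    The toolbox's MODTRAN-based correction is not specified in the abstract; the sketch below shows only the simplest single-band form of an atmospheric correction for water targets (path-radiance subtraction followed by transmittance and downwelling-irradiance normalization). All symbols and numerical values are generic assumptions, and effects such as sky glint and the air-water interface are ignored.

    ```python
    import math

    def water_reflectance(L_sensor, L_path, t_up, E_down):
        """Very simplified single-band atmospheric correction giving the
        irradiance reflectance just above the water surface.

        L_sensor : at-sensor radiance (W m-2 sr-1 nm-1)
        L_path   : atmospheric path radiance from a radiative-transfer code
        t_up     : diffuse upward transmittance of the atmosphere
        E_down   : downwelling irradiance at the surface (W m-2 nm-1)
        All inputs here are placeholders; a real correction (e.g. MODTRAN-based)
        also handles sky glint and the air-water interface.
        """
        L_water = (L_sensor - L_path) / t_up       # water-leaving radiance
        return math.pi * L_water / E_down          # irradiance reflectance

    print(water_reflectance(L_sensor=2.4, L_path=1.8, t_up=0.85, E_down=120.0))
    ```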

  12. All-solid-state lithium-ion and lithium metal batteries - paving the way to large-scale production

    NASA Astrophysics Data System (ADS)

    Schnell, Joscha; Günther, Till; Knoche, Thomas; Vieider, Christoph; Köhler, Larissa; Just, Alexander; Keller, Marlou; Passerini, Stefano; Reinhart, Gunther

    2018-04-01

    Challenges and requirements for the large-scale production of all-solid-state lithium-ion and lithium metal batteries are herein evaluated via workshops with experts from renowned research institutes, material suppliers, and automotive manufacturers. Aiming to bridge the gap between materials research and industrial mass production, possible solutions for the production chains of sulfide and oxide based all-solid-state batteries from electrode fabrication to cell assembly and quality control are presented. Based on these findings, a detailed comparison of the production processes for a sulfide based all-solid-state battery with conventional lithium-ion cell production is given, showing that processes for composite electrode fabrication can be adapted with some effort, while the fabrication of the solid electrolyte separator layer and the integration of a lithium metal anode will require completely new processes. This work identifies the major steps towards mass production of all-solid-state batteries, giving insight into promising manufacturing technologies and helping stakeholders, such as machine engineering, cell producers, and original equipment manufacturers, to plan the next steps towards safer batteries with increased storage capacity.

  13. Modern trends in lipomodeling

    PubMed Central

    El-Sabbagh, Ahmed Hassan

    2017-01-01

    Lipomodeling is the process of relocating autologous fat to change the shape, volume, consistency, and profile of tissues, with the aim of reconstructing, rejuvenating, and regenerating body features. There have been several important advancements in lipomodeling procedures during the last thirty years. Four clinical steps are important for the success of engraftment: fat harvesting, fat processing, fat reinjection, and preconditioning of the recipient site. With the discovery of adipose derived stem cells and dedifferentiated cells, fat cells become a major tool of regenerative medicine. This article reviews recent trends in lipomodeling trying to understand most of the issues in this field. PMID:28401032

  14. Low cost hydrogen/novel membrane technology for hydrogen separation from synthesis gas. Task 1, Literature survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-02-01

    To make the coal-to-hydrogen route economically attractive, improvements are being sought in each step of the process: coal gasification, water-carbon monoxide shift reaction, and hydrogen separation. This report addresses the use of membranes in the hydrogen separation step. The separation of hydrogen from synthesis gas is a major cost element in the manufacture of hydrogen from coal. Separation by membranes is an attractive, new, and still largely unexplored approach to the problem. Membrane processes are inherently simple and efficient and often have lower capital and operating costs than conventional processes. In this report, current and future trends in hydrogen production and use are first summarized. Methods of producing hydrogen from coal are then discussed, with particular emphasis on the Texaco entrained flow gasifier and on current methods of separating hydrogen from this gas stream. The potential for membrane separations in the process is then examined. In particular, the use of membranes for H2/CO2, H2/CO, and H2/N2 separations is discussed. 43 refs., 14 figs., 6 tabs.

  15. Self-management of type 2 diabetes mellitus: a qualitative investigation from the perspective of participants in a nurse-led, shared-care programme in the Netherlands.

    PubMed

    Moser, Albine; van der Bruggen, Harry; Widdershoven, Guy; Spreeuwenberg, Cor

    2008-03-18

    Diabetes mellitus is a major public health problem. Little is known about how people with type 2 diabetes experience self-management in a nurse-led, shared-care programme. The purpose of this article is to report an empirically grounded conceptualization of self-management in the context of autonomy of people with type 2 diabetes. This study has a qualitative descriptive, and exploratory design with an inductive approach. Data were collected by means of in-depth interviews. The sample consisted of older adults with type 2 diabetes in a nurse-led, shared-care setting. The data analysis was completed by applying the constant comparative analysis as recommended in grounded theory. People with type 2 diabetes use three kinds of self-management processes: daily, off-course, and preventive. The steps for daily self-management are adhering, adapting, and acting routinely. The steps for off-course self-management are becoming aware, reasoning, deciding, acting, and evaluating. The steps for preventive self-management are experiencing, learning, being cautious, and putting into practice. These processes are interwoven and recurring. Self-management consists of a complex and dynamic set of processes and it is deeply embedded in one's unique life situation. Support from diabetes specialist nurses and family caregivers is a necessity of self-managing diabetes.

  16. A 10-Step Guide to Adopting and Sustaining Evidence-Based Practices in Out-of-School Time Programs. Research-to-Results Brief. Publication #2007-15

    ERIC Educational Resources Information Center

    Metz, Allison J. R.

    2007-01-01

    This brief represents part 2 in a series on fostering the adoption of evidence-based practices in out-of-school time programs. Many practitioners lack information on how to implement evidence-based practice(s) in their own programs or communities. A major reason for this gap is a lack of research on the process for implementing evidence-based…

  17. Interferometry theory for the block 2 processor

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1987-01-01

    Presented is the interferometry theory for the Block 2 processor, including a high-level functional description and a discussion of data structure. The analysis covers the major processing steps: cross-correlation, fringe counter-rotation, transformation to the frequency domain, phase calibration, bandwidth synthesis, and extraction of the observables of amplitude, phase, phase rate, and delay. Also included are analyses for fractional bitshift correction, station clock error, ionosphere correction, and effective frequencies for the observables.
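
    The Block 2 algorithms themselves are not given here; as an illustrative sketch of the cross-correlation and frequency-domain steps listed above, the snippet below estimates the relative delay between two station data streams from the slope of the cross-spectrum phase. The signal model, sample rate and fitting approach are assumptions for demonstration only, not the Block 2 processing.

    ```python
    import numpy as np

    def fringe_phase_and_delay(x, y, fs):
        """Estimate the delay of y relative to x from the cross-spectrum phase slope
        (illustrative only, not the Block 2 code).

        x, y : real sampled voltages from the two stations
        fs   : sample rate in Hz
        """
        X, Y = np.fft.rfft(x), np.fft.rfft(y)
        cross = X * np.conj(Y)                       # cross-correlation in frequency domain
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        phase = np.unwrap(np.angle(cross))
        # group delay from a linear fit of phase vs angular frequency
        slope = np.polyfit(2 * np.pi * freqs[1:], phase[1:], 1)[0]
        return phase, slope                          # delay in seconds

    fs = 1.0e6
    rng = np.random.default_rng(1)
    x = rng.standard_normal(4096)
    k = 3                                            # integer-sample delay for the toy example
    y = np.roll(x, k)                                # y lags x by k samples (circular shift)
    _, delay_est = fringe_phase_and_delay(x, y, fs)
    print(f"estimated delay: {delay_est * 1e6:.2f} us (true {k / fs * 1e6:.2f} us)")
    ```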

  18. Design review - A tool for all seasons.

    NASA Technical Reports Server (NTRS)

    Liberman, D. S.

    1972-01-01

    The origins of design review are considered together with questions of definitions. The main characteristics which distinguish the concept of design review discussed from the basic master-apprentice relationship include competence, objectivity, formality, and a systematic approach. Preliminary, major, and final reviews are the steps used in the management of the design and development process in each company. It is shown that the design review is generically a systems engineering milestone review with certain unique characteristics.

  19. Next Steps for the 1980s in Student Financial Aid. A Fourth Alternative. Comments and Recommendations by the Carnegie Council on Policy Studies in Higher Education.

    ERIC Educational Resources Information Center

    Carnegie Council on Policy Studies in Higher Education, Berkeley, CA.

    In 1979 the Congress and the administration will consider legislation to extend and revise the Higher Education Act and, in the process, will review the structure of the student-aid system. Several alternatives are likely to be considered, but a fourth is proposed here: a major overhaul of the existing package of programs to make them more…

  20. In operando neutron diffraction study of the temperature and current rate-dependent phase evolution of LiFePO4 in a commercial battery

    NASA Astrophysics Data System (ADS)

    Sharma, N.; Yu, D. H.; Zhu, Y.; Wu, Y.; Peterson, V. K.

    2017-02-01

    In operando NPD data of electrodes in lithium-ion batteries reveal unusual LiFePO4 phase evolution after the application of a thermal step and at high current. At low current under ambient conditions the LiFePO4 to FePO4 two-phase reaction occurs during the charge process, however, following a thermal step and at higher current this reaction appears at the end of charge and continues into the next electrochemical step. The same behavior is observed for the FePO4 to LiFePO4 transition, occurring at the end of discharge and continuing into the following electrochemical step. This suggests that the bulk (or the majority of the) electrode transformation is dependent on the battery's history, current, or temperature. Such information concerning the non-equilibrium evolution of an electrode allows a direct link between the electrode's functional mechanism that underpins lithium-ion battery behavior and the real-life operating conditions of the battery, such as variable temperature and current, to be made.

  1. Dimming LEDs with Phase-Cut Dimmers: The Specifier's Process for Maximizing Success

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Naomi J.; Poplawski, Michael E.

    2013-10-01

    This report reviews how phase-cut dimmers work, how LEDs differ from the incandescent lamps that the dimmers were historically designed to control, and how these differences can lead to complications when trying to dim LEDs. Compatibility between a specific LED source and a specific phase-cut dimmer is often unknown and difficult to assess, and ensuring compatibility adds complexity to the design, specification, bidding, and construction observation phases for new buildings and major remodel projects. To maximize project success, this report provides both general guidance and step-by-step procedures for designing phase-controlled LED dimming on both new and existing projects, as well as real-world examples of how to use those procedures.
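
    The report's step-by-step procedures are not reproduced here. As a back-of-envelope illustration of why phase-cut (triac) dimming reduces delivered power, the sketch below computes the RMS voltage of an idealized leading-edge phase-cut sine wave as a function of firing angle; real dimmer and LED-driver behaviour deviates from this textbook curve, which is part of the compatibility problem the report addresses.

    ```python
    import math

    def phase_cut_vrms(v_peak, firing_angle_deg):
        """RMS output voltage of an ideal leading-edge (forward) phase-cut dimmer.

        v_peak           : peak of the sinusoidal line voltage (~170 V for 120 Vrms mains)
        firing_angle_deg : angle (0-180 deg) at which conduction starts each half-cycle
        Idealized waveform; actual LED light output does not track this curve exactly.
        """
        a = math.radians(firing_angle_deg)
        mean_square = v_peak ** 2 * ((math.pi - a) / (2 * math.pi)
                                     + math.sin(2 * a) / (4 * math.pi))
        return math.sqrt(mean_square)

    for angle in (0, 45, 90, 135):
        print(f"firing angle {angle:3d} deg -> Vrms = {phase_cut_vrms(170.0, angle):6.1f} V")
    ```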

  2. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    PubMed

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. Other than microbial hazards, the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Since this product is not cooked during processing, finally, educating the consumer, by providing information on the label such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.

  3. SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, M; Russell Eibling, R; David Koopman, D

    2007-09-04

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository interment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants remove radioactive species and renormalize the remaining species. Supernate composition is shown in Table 2.

  4. Characterization of shallow trench isolation CMP process and its application

    NASA Astrophysics Data System (ADS)

    Li, Helen; Zhang, ChunLei; Liu, JinBing; Liu, ZhengFang; Chen, Kuang Han; Gbondo-Tugbawa, Tamba; Ding, Hua; Li, Flora; Lee, Brian; Gower-Hall, Aaron; Chiu, Yang-Chih

    2016-03-01

    Chemical mechanical polishing (CMP) has been a critical enabling technology for shallow trench isolation (STI), which is used in current integrated circuit fabrication processes to accomplish device isolation. Excessive dishing and erosion in STI CMP processes, however, create device yield concerns. This paper proposes characterization and modeling techniques to address a variety of concerns in STI CMP. In the past, the majority of CMP publications have addressed interconnect layers in the back-end-of-line (BEOL) process. However, the number of CMP steps in the front-end-of-line (FEOL) has been increasing in more advanced process technologies such as 3D FinFET and replacement metal gate; as a result, incoming topography induced by FEOL CMP steps can no longer be ignored, since it accumulates across multiple CMP steps and eventually propagates to the BEOL layers. In this paper, we first discuss how to characterize and model the STI CMP process. Once an STI CMP model is developed, it can be used to screen designs and detect possible manufacturing weak spots. We also work with the process engineering team to establish hotspot criteria in terms of oxide dishing and nitride loss. As process technologies move from planar transistors to 3D transistors such as FinFETs and multi-gate devices, it is important to accurately predict topography in FEOL CMP processes. These incoming topographies, when stacked up, can have a large impact on BEOL copper processes, where copper pooling leads to catastrophic yield loss. A calibration methodology to characterize the STI CMP step is developed as shown in Figure 1; moreover, the STI CMP model is validated against silicon data collected from product chips not used in the calibration stage. Additionally, the wafer experimental setup and metrology plan are instrumental to an accurate model with high predictive power. After a model is generated, spec limits and thresholds that establish hotspot criteria can be defined. Such definitions require working closely with the foundry process engineering and integration teams and reviewing past failure analysis (FA) to arrive at reasonable metrics. Conventionally, a potential STI weak point is found when nitride residue remains in the active region after nitride strip. Another source of STI hotspots occurs when nitride erosion is excessive and the active region suffers severe damage.
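
    The calibrated STI CMP model used in the paper is not described in enough detail to reproduce; as a generic illustration of how pattern density enters simple CMP models, the sketch below evaluates a Preston-type removal rate with an assumed inverse-density pressure scaling. The Preston coefficient, pressure, velocity and density values are placeholders, not the authors' calibrated parameters.

    ```python
    def removal_rate(pressure_pa, velocity_m_s, k_preston=1.0e-13, pattern_density=1.0):
        """Preston-type blanket removal rate (m/s), with the local down pressure
        scaled by inverse pattern density as in simple density-based CMP models.
        k_preston (1/Pa) and the density scaling are placeholder assumptions,
        not the calibrated STI CMP model described in the paper."""
        effective_pressure = pressure_pa / max(pattern_density, 1e-3)
        return k_preston * effective_pressure * velocity_m_s

    # Dense (80% raised area) vs sparse (20%) regions polish at different rates,
    # which is the origin of oxide dishing and nitride erosion hotspots.
    p, v = 27_000.0, 1.0                      # ~4 psi down force, 1 m/s relative velocity
    for rho in (0.8, 0.5, 0.2):
        rr = removal_rate(p, v, pattern_density=rho)
        print(f"pattern density {rho:.1f}: removal rate {rr * 1e9 * 60:.1f} nm/min")
    ```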

  5. Synchrotron x-ray microtomography of the interior microstructure of chocolate

    NASA Astrophysics Data System (ADS)

    Lügger, Svenja K.; Wilde, Fabian; Dülger, Nihan; Reinke, Lennart M.; Kozhar, Sergii; Beckmann, Felix; Greving, Imke; Vieira, Josélio; Heinrich, Stefan; Palzer, Stefan

    2016-10-01

    The structure of chocolate, a multicomponent food product, was analyzed using microtomography. Chocolate consists of a semi-solid cocoa butter matrix and a dense network of suspended particles. A detailed analysis of the microstructure is needed to understand mass transport phenomena. Transport of lipids from e.g. a filling or liquid cocoa butter is responsible for major problems in the confectionery industry such as formation of chocolate bloom, which is the formation of visible white spots or a grayish haze on the chocolate surface and leads to consumer rejections and thus large sales losses for the confectionery industry. In this study it was possible to visualize the inner structure of chocolate and clearly distinguish the particles from the continuous phase by taking advantage of the high density contrast of synchrotron radiation. Consequently, particle arrangement and cracks within the sample were made visible. The cracks are several micrometers thick and propagate throughout the entire sample. Images of pure cocoa butter, chocolate without any particles, did not show any cracks and thus confirmed that cracks are a result of embedded particles. They arise during the manufacturing process. Thus, the solidification process, a critical manufacturing step, was simulated with finite element methods in order to understand crack formation during this step. The simulation showed that cracks arise because of significant contraction of cocoa butter, the matrix phase, without any major change of volume of the suspended particles. Tempering of the chocolate mass prior to solidification is another critical step for a good product quality. We found that samples which solidified in an uncontrolled manner are less homogeneous than tempered samples. In summary, our study visualized for the first time the inner microstructure of tempered and untempered cocoa butter as well as chocolate without sample destruction and revealed cracks, which might act as transport pathways.

  6. Laser-zone growth in a Ribbon-To-Ribbon (RTR) process. Silicon sheet growth development for the large area silicon sheet task of the low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Gurtler, R. W.; Baghdadi, A.; Legge, R.; Sopori, B.; Ellis, R. J.

    1977-01-01

    The Ribbon-to-Ribbon (RTR) approach to silicon ribbon growth is investigated. An existing RTR apparatus is to be upgraded to its full capabilities and operated routinely to investigate and optimize the effects of various growth parameters on growth results. A new RTR apparatus was constructed to incorporate increased capabilities and improvements over the first apparatus and to be capable of continuous growth. New high power lasers were implemented and this led to major improvements in growth velocity -- 4 inch/min. growth has been demonstrated. A major step in demonstration of the full feasibility of the RTR process is reported in the demonstration of RTR growth from CVD polyribbon rather than sliced polyribbon ingots. Average solar cell efficiencies of greater than 9% and a best cell efficiency of 11.7% are reported. Processing was shown to provide a substantial improvement in material minority carrier diffusion length. An economic analysis is reported which treats both the polyribbon fabrication and RTR processes.

  7. Process engineering economics of bioethanol production.

    PubMed

    Galbe, Mats; Sassner, Per; Wingren, Anders; Zacchi, Guido

    2007-01-01

    This work presents a review of studies on the process economics of ethanol production from lignocellulosic materials published since 1996. Our objective was to identify the most costly process steps and the impact of various parameters on the final production cost, e.g. plant capacity, raw material cost, and overall product yield, as well as process configuration. The variation in estimated ethanol production cost is considerable, ranging from about 0.13 to 0.81 US$ per liter ethanol. This can be explained to a large extent by actual process differences and variations in the assumptions underlying the techno-economic evaluations. The most important parameters for the economic outcome are the feedstock cost, which varied between 30 and 90 US$ per metric ton in the papers studied, and the plant capacity, which influences the capital cost. To reduce the ethanol production cost it is necessary to reach high ethanol yields, as well as a high ethanol concentration during fermentation, to be able to decrease the energy required for distillation and other downstream process steps. Improved pretreatment methods, enhanced enzymatic hydrolysis with cheaper and more effective enzymes, as well as improved fermentation systems present major research challenges if we are to make lignocellulose-based ethanol production competitive with sugar- and starch-based ethanol. Process integration, either internally or externally with other types of plants, e.g. heat and power plants, also offers a way of reducing the final ethanol production cost.
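
    As a simple illustration of how the quoted feedstock price range propagates into the per-liter cost, the sketch below computes only the feedstock contribution for assumed overall ethanol yields; capital, enzyme and energy costs, which the review identifies as equally important, are not modeled, and the yield figures are assumptions rather than values from the review.

    ```python
    def feedstock_cost_per_liter(feedstock_usd_per_tonne, ethanol_yield_l_per_tonne):
        """Feedstock contribution to ethanol production cost (US$ per liter).
        Yield values are illustrative assumptions; the review's full cost range
        also reflects capital, enzyme, and energy costs not modeled here."""
        return feedstock_usd_per_tonne / ethanol_yield_l_per_tonne

    # Rough bracketing with the feedstock price range quoted in the review
    # (30-90 US$/t) and assumed overall yields of 200-300 L ethanol per dry tonne.
    for price in (30, 60, 90):
        for yield_l in (200, 300):
            c = feedstock_cost_per_liter(price, yield_l)
            print(f"feedstock {price:2d} US$/t, yield {yield_l} L/t -> {c:.2f} US$/L")
    ```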

  8. Strategies for Stabilizing Nitrogenous Compounds in ECLSS Wastewater: Top-Down System Design and Unit Operation Selection with Focus on Bio-Regenerative Processes for Short and Long Term Scenarios

    NASA Technical Reports Server (NTRS)

    Lunn, Griffin M.

    2011-01-01

    Water recycling and eventual nutrient recovery is crucial for surviving in or past low earth orbit. New approaches and system architecture considerations need to be addressed to meet current and future system requirements. This paper proposes a flexible system architecture that breaks down pretreatment steps into discrete areas where multiple unit operations can be considered. An overview focusing on the urea and ammonia conversion steps allows an analysis of each process's strengths and weaknesses and synergy with upstream and downstream processing. Process technologies to be covered include chemical pretreatment, biological urea hydrolysis, chemical urea hydrolysis, combined nitrification-denitrification, nitrate nitrification, anammox denitrification, and regenerative ammonia absorption through struvite formation. Biological processes are considered mainly for their ability to both maximize water recovery and to produce nutrients for future plant systems. Unit operations can be considered for traditional equivalent system mass requirements in the near term or what they can provide downstream in the form of usable chemicals or nutrients for the long term closed-loop ecological control and life support system. Optimally this would allow a system to meet the former but to support the latter without major modification.

  9. The effect of a novel minimally invasive strategy for infected necrotizing pancreatitis.

    PubMed

    Tong, Zhihui; Shen, Xiao; Ke, Lu; Li, Gang; Zhou, Jing; Pan, Yiyuan; Li, Baiqiang; Yang, Dongliang; Li, Weiqin; Li, Jieshou

    2017-11-01

    The step-up approach, consisting of multiple minimally invasive techniques, has gradually become the mainstream for managing infected pancreatic necrosis (IPN). In the present study, we aimed to compare the safety and efficacy of a novel four-step approach and the conventional approach in managing IPN. According to the treatment strategy, consecutive patients fulfilling the inclusion criteria were put into two time intervals to conduct a before-and-after comparison: the conventional group (2010-2011) and the novel four-step group (2012-2013). The conventional group was essentially open necrosectomy for any patient who failed percutaneous drainage of infected necrosis. The novel drainage approach consisted of four different steps including percutaneous drainage, negative pressure irrigation, endoscopic necrosectomy and open necrosectomy in sequence. The primary endpoint was major complications (new-onset organ failure, sepsis or local complications, etc.). Secondary endpoints included mortality during hospitalization, need of emergency surgery, duration of organ failure and sepsis, etc. Of the 229 recruited patients, 92 were treated with the conventional approach and the remaining 137 were managed with the novel four-step approach. New-onset major complications occurred in 72 patients (78.3%) in the conventional group and 75 patients (54.7%) in the four-step group (p < 0.001). For other important endpoints, although there was no statistical difference in mortality between the two groups (p = 0.403), significantly fewer patients in the four-step group required emergency surgery when compared with the conventional group [14.6% (20/137) vs. 45.6% (42/92), p < 0.001]. In addition, stratified analysis revealed that the four-step approach group presented significantly lower incidence of new-onset organ failure and other major complications in patients with the most severe type of AP. Compared with the conventional approach, the novel four-step approach significantly reduced the rate of new-onset major complications and requirement of emergency operations in treating IPN, especially in those with the most severe type of acute pancreatitis.

  10. Regulation of cerebral cortex development by Rho GTPases: insights from in vivo studies

    PubMed Central

    Azzarelli, Roberta; Kerloch, Thomas; Pacary, Emilie

    2015-01-01

    The cerebral cortex is the site of higher human cognitive and motor functions. Histologically, it is organized into six horizontal layers, each containing unique populations of molecularly and functionally distinct excitatory projection neurons and inhibitory interneurons. The stereotyped cellular distribution of cortical neurons is crucial for the formation of functional neural circuits and it is predominantly established during embryonic development. Cortical neuron development is a multiphasic process characterized by sequential steps of neural progenitor proliferation, cell cycle exit, neuroblast migration and neuronal differentiation. This series of events requires an extensive and dynamic remodeling of the cell cytoskeleton at each step of the process. As major regulators of the cytoskeleton, the family of small Rho GTPases has been shown to play essential functions in cerebral cortex development. Here we review in vivo findings that support the contribution of Rho GTPases to cortical projection neuron development and we address their involvement in the etiology of cerebral cortex malformations. PMID:25610373

  11. A smart technique for attendance system to recognize faces through parallelism

    NASA Astrophysics Data System (ADS)

    Prabhavathi, B.; Tanuja, V.; Madhu Viswanatham, V.; Rajashekhara Babu, M.

    2017-11-01

    The face is the major feature used to recognise a person, and image processing techniques can exploit such physical features. In the traditional approach used in schools and colleges, the professor calls each student's name and marks attendance manually. This paper departs from that approach and proposes an automatic method based on image processing techniques, presenting automatic attendance marking for students in a classroom. First, an image of the classroom is captured and stored in the data record. The system then applies a sequence of steps to the stored images, including histogram classification, noise removal, face detection and face recognition. These steps detect the faces, which are compared with the database; attendance is marked automatically when the system recognizes a face.
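
    The paper's exact algorithms are not specified beyond the listed steps; the sketch below shows one common way to build such a detect-then-recognize attendance pipeline with OpenCV (Haar-cascade detection plus an LBPH recognizer). The cascade file, confidence threshold and roster handling are illustrative assumptions, the recognizer must be trained on enrolled face images beforehand, and cv2.face requires opencv-contrib-python.

    ```python
    import cv2

    # Minimal sketch of a detect-then-recognize attendance pipeline (not the
    # authors' implementation). Requires opencv-contrib-python for cv2.face.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    recognizer = cv2.face.LBPHFaceRecognizer_create()
    # recognizer.train(enrolled_face_images, enrolled_labels)  # enrolment, done beforehand

    def mark_attendance(classroom_image_path, roster, confidence_limit=80.0):
        """Return the subset of roster labels recognized in the classroom image."""
        img = cv2.imread(classroom_image_path)
        gray = cv2.equalizeHist(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))  # histogram step
        gray = cv2.medianBlur(gray, 3)                                   # noise removal
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        present = set()
        for (x, y, w, h) in faces:
            label, confidence = recognizer.predict(gray[y:y + h, x:x + w])
            if confidence < confidence_limit and label in roster:        # lower = better match
                present.add(label)
        return present
    ```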

  12. Direct Growth of Graphene Film on Germanium Substrate

    PubMed Central

    Wang, Gang; Zhang, Miao; Zhu, Yun; Ding, Guqiao; Jiang, Da; Guo, Qinglei; Liu, Su; Xie, Xiaoming; Chu, Paul K.; Di, Zengfeng; Wang, Xi

    2013-01-01

    Graphene has been predicted to play a role in post-silicon electronics due to its extraordinary carrier mobility. Chemical vapor deposition of graphene on transition metals has been considered a major step towards commercial realization of graphene. However, fabrication based on transition metals involves an inevitable transfer step which can be as complicated as the deposition of graphene itself. By ambient-pressure chemical vapor deposition, we demonstrate large-scale and uniform deposition of high-quality graphene directly on a Ge substrate which is wafer scale and has been considered to replace conventional Si for the next generation of high-performance metal-oxide-semiconductor field-effect transistors (MOSFETs). The immiscible Ge-C system under equilibrium conditions dictates graphene deposition on Ge via a self-limiting and surface-mediated process rather than a precipitation process as observed from other metals with high carbon solubility. Our technique is compatible with modern microelectronics technology, thus allowing integration with high-volume production of complementary metal-oxide-semiconductors (CMOS). PMID:23955352

  13. MO-D-213-02: Quality Improvement Through a Failure Mode and Effects Analysis of Pediatric External Beam Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, J; Lukose, R; Bronson, J

    2015-06-15

    Purpose: To conduct a failure mode and effects analysis (FMEA) as per AAPM Task Group 100 on clinical processes associated with teletherapy, and the development of mitigations for processes with identified high risk. Methods: A FMEA was conducted on clinical processes relating to teletherapy treatment plan development and delivery. Nine major processes were identified for analysis. These steps included CT simulation, data transfer, image registration and segmentation, treatment planning, plan approval and preparation, and initial and subsequent treatments. Process tree mapping was utilized to identify the steps contained within each process. Failure modes (FM) were identified and evaluated with a scale of 1–10 based upon three metrics: the severity of the effect, the probability of occurrence, and the detectability of the cause. The analyzed metrics were scored as follows: severity – no harm = 1, lethal = 10; probability – not likely = 1, certainty = 10; detectability – always detected = 1, undetectable = 10. The three metrics were combined multiplicatively to determine the risk priority number (RPN) which defined the overall score for each FM and the order in which process modifications should be deployed. Results: Eighty-nine procedural steps were identified with 186 FM accompanied by 193 failure effects with 213 potential causes. Eighty-one of the FM were scored with a RPN > 10, and mitigations were developed for FM with RPN values exceeding ten. The initial treatment had the most FM (16) requiring mitigation development followed closely by treatment planning, segmentation, and plan preparation with fourteen each. The maximum RPN was 400 and involved target delineation. Conclusion: The FMEA process proved extremely useful in identifying previously unforeseen risks. New methods were developed and implemented for risk mitigation and error prevention. Similar to findings reported for adult patients, the process leading to the initial treatment has an associated high risk.
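
    The individual scores assigned in the study are not listed in the abstract; the sketch below only illustrates the scoring arithmetic described (RPN as the product of severity, occurrence and detectability on 1-10 scales, with mitigation for RPN > 10). The example failure modes and their scores are invented for demonstration.

    ```python
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        """One failure mode scored on the TG-100-style 1-10 scales described above."""
        name: str
        severity: int       # 1 = no harm ... 10 = lethal
        occurrence: int     # 1 = not likely ... 10 = certainty
        detectability: int  # 1 = always detected ... 10 = undetectable

        @property
        def rpn(self) -> int:
            return self.severity * self.occurrence * self.detectability

    # Illustrative entries only (the abstract does not list individual scores);
    # modes with RPN > 10 would receive mitigations per the study's criterion.
    modes = [
        FailureMode("wrong target delineation", 10, 5, 8),
        FailureMode("image registration offset", 7, 3, 4),
        FailureMode("plan not approved before treatment", 4, 2, 1),
    ]
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        flag = "mitigate" if m.rpn > 10 else "accept"
        print(f"{m.name:40s} RPN = {m.rpn:4d} -> {flag}")
    ```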

  14. Air emissions of ammonia and methane from livestock operations: valuation and policy options.

    PubMed

    Shih, Jhih-Shyang; Burtraw, Dallas; Palmer, Karen; Siikamäki, Juha

    2008-09-01

    The animal husbandry industry is a major emitter of ammonia (NH3), which is a precursor of fine particulate matter (PM2.5)--arguably, the number-one environment-related public health threat facing the nation. The industry is also a major emitter of methane (CH4), which is an important greenhouse gas (GHG). We present an integrated process model of the engineering economics of technologies to reduce NH3 and CH4 emissions at dairy operations in California. Three policy options are explored: PM offset credits for NH3 control, GHG offset credits for CH4 control, and expanded net metering policies to provide revenue for the sale of electricity generated from captured methane (CH4) gas. Individually these policies vary substantially in the economic incentives they provide for farm operators to reduce emissions. We report on initial steps to fully develop the integrated process model that will provide guidance for policy-makers.

  15. Autonomy for Constellation

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    The newer types of space systems, which are planned for the future, are placing challenging demands for newer autonomy concepts and techniques. Motivating these challenges are resource constraints. Even though onboard computing power will surely increase in the coming years, the resource constraints associated with space-based processes will continue to be a major factor that needs to be considered when dealing with, for example, agent-based spacecraft autonomy. To realize "economical intelligence", i.e., constrained computational intelligence that can reside within a process under severe resource constraints (time, power, space, etc.), is a major goal for such space systems as the Nanosat constellations. To begin to address the new challenges, we are developing approaches to constellation autonomy with constraints in mind. Within the Agent Concepts Testbed (ACT) at the Goddard Space Flight Center we are currently developing a Nanosat-related prototype for the first step of a two-step program.

  16. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, the first stage designed to find out what the problems are, which comprises three distinct steps, namely risk identification, risk estimation and risk evaluation; risk management is not covered in this paper, and a fourth step should address the need for feedback and post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, a new index based on hydrometeorological parameters such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, a drought-prone region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on the remotely sensed RDI and the extraction of several features such as severity, duration, areal extent, onset and end time; it also involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece. Moreover, remote sensing has proven very effective in delineating spatial variability and features in drought monitoring and assessment.
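
    As an illustration of the index named above, the sketch below computes a standardized Reconnaissance Drought Index from annual precipitation and potential evapotranspiration totals using the common log-standardized form; the input values are invented, and the paper's remotely sensed implementation may differ in detail.

    ```python
    import numpy as np

    def rdi_standardized(precip, pet):
        """Standardized Reconnaissance Drought Index for a set of years.

        precip, pet : arrays of shape (n_years,) holding precipitation and
        potential evapotranspiration totals for the same reference period
        (e.g. a 12-month hydrological year) in each year.
        Uses the common log-standardized form of the RDI; the paper's remotely
        sensed implementation may differ in detail.
        """
        alpha = np.asarray(precip, float) / np.asarray(pet, float)   # initial RDI value
        y = np.log(alpha)
        return (y - y.mean()) / y.std(ddof=1)                         # negative => drought

    years_p = np.array([420.0, 510.0, 300.0, 460.0, 250.0])   # mm/yr, made-up values
    years_pet = np.array([900.0, 880.0, 950.0, 910.0, 960.0])
    print(np.round(rdi_standardized(years_p, years_pet), 2))
    ```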

  17. Isoconversional approach for non-isothermal decomposition of un-irradiated and photon-irradiated 5-fluorouracil.

    PubMed

    Mohamed, Hala Sh; Dahy, AbdelRahman A; Mahfouz, Refaat M

    2017-10-25

    Kinetic analysis of the non-isothermal decomposition of un-irradiated and photon-beam-irradiated 5-fluorouracil (5-FU), an anti-cancer drug, was carried out in static air. Thermal decomposition of 5-FU proceeds in two steps: one minor step in the temperature range 270-283°C followed by the major step in the range 285-360°C. The non-isothermal data for un-irradiated and photon-irradiated 5-FU were analyzed using linear (Tang) and non-linear (Vyazovkin) isoconversional methods. Applying these model-free methods to the present kinetic data showed a clear dependence of the activation energy on the extent of conversion. For un-irradiated 5-FU, the non-isothermal data analysis indicates that the decomposition is generally described by the A3 and A4 models for the minor and major decomposition steps, respectively. For a photon-irradiated sample of 5-FU with a total absorbed dose of 10 Gy, the decomposition is controlled by the A2 model throughout the conversion range. The activation energies calculated for photon-irradiated 5-FU were lower than the values obtained from the thermal decomposition of the un-irradiated sample, probably due to the formation of additional nucleation sites created by photon irradiation. The decomposition path was investigated by intrinsic reaction coordinate (IRC) calculations at the B3LYP/6-311++G(d,p) level of DFT. Two transition states were involved in the process, corresponding to homolytic rupture of the NH bond and ring scission, respectively. Published by Elsevier B.V.
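
    The Tang and Vyazovkin procedures are not reproduced here; as an illustration of the model-free (isoconversional) idea they share, the sketch below applies the simpler Friedman differential method to invented rate data at a single conversion level, recovering the activation energy from the slope of ln(dα/dt) versus 1/T.

    ```python
    import numpy as np

    def friedman_activation_energy(temps_K, rates, R=8.314):
        """Friedman differential isoconversional estimate of Ea at one conversion level.

        temps_K : temperatures (K) at which a fixed conversion alpha is reached
                  under several heating rates
        rates   : corresponding conversion rates d(alpha)/dt (1/s) at that alpha
        Returns Ea in kJ/mol. This illustrates the model-free idea behind the
        Tang and Vyazovkin methods used in the paper, but is not those methods.
        """
        x = 1.0 / np.asarray(temps_K, float)
        y = np.log(np.asarray(rates, float))
        slope = np.polyfit(x, y, 1)[0]          # slope = -Ea / R
        return -slope * R / 1000.0

    # Made-up data for one conversion level at three heating rates:
    T = [560.0, 572.0, 585.0]                   # K
    dadt = [2.1e-4, 4.3e-4, 8.8e-4]             # 1/s
    print(f"Ea ≈ {friedman_activation_energy(T, dadt):.0f} kJ/mol")
    ```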

  18. Expression and putative role of mitochondrial transport proteins in cancer.

    PubMed

    Lytovchenko, Oleksandr; Kunji, Edmund R S

    2017-08-01

    Cancer cells undergo major changes in energy and biosynthetic metabolism. One of them is the Warburg effect, in which pyruvate is used for fermentation rather for oxidative phosphorylation. Another major one is their increased reliance on glutamine, which helps to replenish the pool of Krebs cycle metabolites used for other purposes, such as amino acid or lipid biosynthesis. Mitochondria are central to these alterations, as the biochemical pathways linking these processes run through these organelles. Two membranes, an outer and inner membrane, surround mitochondria, the latter being impermeable to most organic compounds. Therefore, a large number of transport proteins are needed to link the biochemical pathways of the cytosol and mitochondrial matrix. Since the transport steps are relatively slow, it is expected that many of these transport steps are altered when cells become cancerous. In this review, changes in expression and regulation of these transport proteins are discussed as well as the role of the transported substrates. This article is part of a Special Issue entitled Mitochondria in Cancer, edited by Giuseppe Gasparre, Rodrigue Rossignol and Pierre Sonveaux. Copyright © 2017. Published by Elsevier B.V.

  19. Process, including PSA and membrane separation, for separating hydrogen from hydrocarbons

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo

    2001-01-01

    An improved process for separating hydrogen from hydrocarbons. The process includes a pressure swing adsorption step, a compression/cooling step and a membrane separation step. The membrane step relies on achieving a methane/hydrogen selectivity of at least about 2.5 under the conditions of the process.

  20. Titanium: Industrial Base, Price Trends, and Technology Initiatives

    DTIC Science & Technology

    2009-01-01

    respectively. All titanium metal production begins with rutile (titanium oxide, or TiO2). High-titania slag, produced by ilmenite smelting, is the first... Ilmenite ores are used in iron production. They leave a TiO2-rich slag, which is usually upgraded to be used in titanium production. According to the... and least expensive process for producing titanium sponge, has four major steps. First, rutile concentrate or synthetic rutile (titanium slag) is

  1. Solar Project Development Pathway & Resources

    EPA Pesticide Factsheets

    The Local Government Solar Project Portal's Solar Project Development Pathway and Resources page details the major steps along the project development pathway and each step includes resources and tools to assist you with that step.

  2. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images.

    PubMed

    Muncy, Nathan M; Hedges-Muncy, Ariana M; Kirwan, C Brock

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency) and repeated pipeline runs (for algorithmic consistency). A main effect of pipeline step was detected, and interestingly an interaction between pipeline step and ROI exists. No effect of either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing.
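
    As a rough illustration of testing for a main effect of pipeline step on ROI volume, the sketch below applies a nonparametric repeated-measures comparison (Friedman test) to per-participant volumes recorded after each step. This is a simplified stand-in for the authors' repeated-measures analysis, and all numbers are synthetic.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)

# Synthetic hippocampal volumes (mm^3) for 115 participants after three
# successive pre-processing steps; a small systematic shift mimics a step effect.
n = 115
base = rng.normal(3500, 300, n)
step1 = base
step2 = base * 0.98 + rng.normal(0, 20, n)
step3 = base * 0.96 + rng.normal(0, 20, n)

stat, p = friedmanchisquare(step1, step2, step3)
print(f"Friedman chi-square = {stat:.1f}, p = {p:.2e}")
```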

  3. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    PubMed

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinder the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose/and or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars are discussed and compared with separate hydrolysis and fermentation (SHF) processes. The impact of individual technological steps on final process efficiency is emphasized and the potential for use of immobilized biocatalysts is considered. Copyright © 2014 Elsevier Inc. All rights reserved.
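
    For context on the yield figures discussed above, the short calculation below reproduces the standard stoichiometric ceiling for ethanol from glucose (C6H12O6 -> 2 C2H5OH + 2 CO2); this is textbook stoichiometry, not data from the review.

```python
# Theoretical ethanol yield from glucose fermentation.
M_glucose = 180.16   # g/mol
M_ethanol = 46.07    # g/mol

yield_per_g_glucose = 2 * M_ethanol / M_glucose
print(f"Maximum theoretical yield: {yield_per_g_glucose:.3f} g ethanol per g glucose")
# ~0.511 g/g; real lignocellulosic processes recover only a fraction of this
# because of pretreatment losses, inhibitors and incomplete pentose utilization.
```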

  4. Merging constitutional and motional covalent dynamics in reversible imine formation and exchange processes.

    PubMed

    Kovaříček, Petr; Lehn, Jean-Marie

    2012-06-06

    The formation and exchange processes of imines of salicylaldehyde, pyridine-2-carboxaldehyde, and benzaldehyde have been studied, showing that the former has features of particular interest for dynamic covalent chemistry, displaying high efficiency and fast rates. The monoimines formed with aliphatic α,ω-diamines display an internal exchange process of self-transimination type, inducing a local motion of either "stepping-in-place" or "single-step" type by bond interchange, whose rate decreases rapidly with the distance of the terminal amino groups. Control of the speed of the process over a wide range may be achieved by substituents, solvent composition, and temperature. These monoimines also undergo intermolecular exchange, thus merging motional and constitutional covalent behavior within the same molecule. With polyamines, the monoimines formed execute internal motions that have been characterized by extensive one-dimensional, two-dimensional, and EXSY proton NMR studies. In particular, with linear polyamines, nondirectional displacement occurs by shifting of the aldehyde residue along the polyamine chain serving as molecular track. Imines thus behave as simple prototypes of systems displaying relative motions of molecular moieties, a subject of high current interest in the investigation of synthetic and biological molecular motors. The motional processes described are of dynamic covalent nature and take place without change in molecular constitution. They thus represent a category of dynamic covalent motions, resulting from reversible covalent bond formation and dissociation. They extend dynamic covalent chemistry into the area of molecular motions. A major further step will be to achieve control of directionality. The results reported here for imines open wide perspectives, together with other chemical groups, for the implementation of such features in multifunctional molecules toward the design of molecular devices presenting a complex combination of motional and constitutional dynamic behaviors.

  5. Roll-to-roll suitable short-pulsed laser scribing of organic photovoltaics and close-to-process characterization

    NASA Astrophysics Data System (ADS)

    Kuntze, Thomas; Wollmann, Philipp; Klotzbach, Udo; Fledderus, Henri

    2017-03-01

    The proper long-term operation of organic electronic devices such as organic photovoltaics (OPV) depends on their resistance to environmental influences such as permeation of water vapor. Major efforts are therefore spent on encapsulating OPV. The state of the art is sandwich-like encapsulation between two ultra-barrier foils. Sandwich encapsulation has two major disadvantages: high cost (~1/3 of total costs) and parasitic intrinsic water (sponge effect of the substrate foil). A promising approach to address these drawbacks is to use the OPV substrate itself as the barrier by integrating an ultra-barrier coating, followed by alternating deposition and structuring of the OPV functional layers. In effect, more functionality is integrated into less material, and the number of production steps is reduced. None of the processing steps may degrade the underlying barrier function, while all electrical functionality must be maintained. Short- and ultrashort-pulsed (USP) lasers are the most suitable structuring tool. Laser machining applies to three layers: the bottom electrode made of transparent conductive materials (P1), the organic photovoltaic stack (P2) and the top electrode (P3). In this paper, the machining of functional 110…250 nm layers of flexible OPV by USP laser systems is presented. The main focus is on structuring without damaging the underlying ultra-barrier layer. Close-to-process machining quality characterization is performed with the analysis tool "hyperspectral imaging" (HSI), which is cross-checked against the "gold standard" Ca-test. It is shown that both the laser machining and the quality control are well suited for R2R production of OPV.

  6. Implementation and extension of the impulse transfer function method for future application to the space shuttle project. Volume 2: Program description and user's guide

    NASA Technical Reports Server (NTRS)

    Patterson, G.

    1973-01-01

    The data processing procedures and the computer programs were developed to predict structural responses using the Impulse Transfer Function (ITF) method. There are three major steps in the process: (1) analog-to-digital (A-D) conversion of the test data to produce Phase I digital tapes, (2) processing of the Phase I digital tapes to extract ITF's and store them in a permanent data bank, and (3) prediction of structural responses to a set of applied loads. The analog-to-digital conversion is performed by a standard package, which is described later in terms of the contents of the resulting Phase I digital tape. Two separate computer programs have been developed to perform the digital processing.
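
    The response-prediction step lends itself to a compact numerical illustration: once an impulse transfer function is available, the structural response to an applied load history is its discrete convolution with that load. The sketch below is generic and uses made-up arrays, not data from the Phase I tapes.

```python
import numpy as np

dt = 0.01                       # sample interval, s (assumed)
t = np.arange(0.0, 2.0, dt)

# Hypothetical impulse transfer function: a damped oscillator impulse response.
itf = np.exp(-1.5 * t) * np.sin(2 * np.pi * 5 * t)

# Hypothetical applied load history: a short rectangular pulse.
load = np.zeros_like(t)
load[10:30] = 1.0

# Predicted response = discrete convolution of the ITF with the load, scaled by dt.
response = np.convolve(load, itf)[: len(t)] * dt
print("peak predicted response:", response.max())
```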

  7. Multi-objective design optimization of antenna structures using sequential domain patching with automated patch size determination

    NASA Astrophysics Data System (ADS)

    Koziel, Slawomir; Bekasiewicz, Adrian

    2018-02-01

    In this article, a simple yet efficient and reliable technique for fully automated multi-objective design optimization of antenna structures using sequential domain patching (SDP) is discussed. The optimization procedure according to SDP is a two-step process: (i) obtaining the initial set of Pareto-optimal designs representing the best possible trade-offs between considered conflicting objectives, and (ii) Pareto set refinement for yielding the optimal designs at the high-fidelity electromagnetic (EM) simulation model level. For the sake of computational efficiency, the first step is realized at the level of a low-fidelity (coarse-discretization) EM model by sequential construction and relocation of small design space segments (patches) in order to create a path connecting the extreme Pareto front designs obtained beforehand. The second stage involves response correction techniques and local response surface approximation models constructed by reusing EM simulation data acquired in the first step. A major contribution of this work is an automated procedure for determining the patch dimensions. It allows for appropriate selection of the number of patches for each geometry variable so as to ensure reliability of the optimization process while maintaining its low cost. The importance of this procedure is demonstrated by comparing it with uniform patch dimensions.

  8. Immunoglobulin G elution in protein A chromatography employing the method of chromatofocusing for reducing the co-elution of impurities.

    PubMed

    Pinto, Nuno D S; Uplekar, Shaunak D; Moreira, Antonio R; Rao, Govind; Frey, Douglas D

    2017-01-01

    Purification processes for monoclonal Immunoglobulin G (IgG) typically employ protein A chromatography as a capture step to remove most of the impurities. One major concern of the post-protein A chromatography processes is the co-elution of some of the host cell proteins (HCPs) with IgG in the capture step. In this work, a novel method for IgG elution in protein A chromatography that reduces the co-elution of HCPs is presented, in which a two-step pH gradient is self-formed inside a protein A chromatography column. The complexities involved in using an internally produced pH gradient in a protein A chromatography column employing adsorbed buffering species are discussed through equation-based modeling. Under the conditions employed, ELISA assays show a 60% reduction in the HCPs co-eluting with the IgG fraction when using the method as compared to conventional protein A elution, without affecting the IgG yield. Evidence is also obtained which indicates that the amount of leached protein A present in free solution in the purified product is reduced by the new method. Biotechnol. Bioeng. 2017;114: 154-162. © 2016 Wiley Periodicals, Inc.

  9. Concerns about dose and underutilization of twelve-step programs: models, scales, and theory that inform treatment planning.

    PubMed

    Cloud, Richard N; Kingree, J B

    2008-01-01

    Researchers have observed that a majority of addicted persons who are encouraged and facilitated by treatment providers to attend twelve-step (TS) programs either drop out or sporadically use twelve-step programs following treatment. This is troubling given considerable evidence of TS program benefits associated with regular weekly attendance and ubiquitous reliance by treatment professionals on these programs to provide important support services. This chapter reviews and advances theory of TS utilization and dose that is supported by prior research, multivariate models, and scales that predict risk of TS meeting underutilization. Advancing theory should organize and clarify the process of initial utilization, guide intervention development, and improve adherence of TS program referrals, all of which should lead to improved treatment planning and better outcomes. Three theories are integrated to explain processes that may influence TS program dose: the health belief model, self-determination theory (motivational theory), and a person-in-organization cultural fit theory. Four multidimensional scales developed specifically to predict participation are described. Implications for practice and future research are considered in a final discussion. Information contained in this chapter raises awareness of the need for TS-focused treatments to focus on achieving weekly attendance during and after treatment.

  10. Removal of fluoride, SDS, ammonia and turbidity from semiconductor wastewater by combined electrocoagulation-electroflotation.

    PubMed

    Aoudj, S; Khelifa, A; Drouiche, N

    2017-08-01

    Semiconductor industry effluents contain organic and inorganic pollutants, such as sodium dodecyl sulfate (SDS), fluoride and ammonia, at high levels, which constitutes a major environmental issue. A combined EC-EF process is proposed as a post-treatment after precipitation for simultaneous clarification and removal of pollutants. In the EC step, a hybrid Fe-Al soluble anode was used in order to avoid a supplementary EC step. EC-Fe is more suitable for SDS removal and EC-Al for fluoride removal, while EC with the hybrid Al-Fe anode is a good compromise. Clarification and ammonia oxidation were achieved in the EF step. The effects of anode material, initial pH, current, anion nature, chloride concentration and initial pollutant concentration were studied. The final concentrations may reach 0.27, 6.23 and 0.22 mg L-1 for SDS, fluoride and ammonia, respectively; these concentrations are far below the corresponding discharge limits. Similarly, the final turbidity was 4.35 NTU, which is below the 5 NTU limit, so the treated water does not need further filtration before discharge. Furthermore, the EC-EF process proves to be sufficiently energy-efficient, with low soluble-electrode consumption. Copyright © 2017 Elsevier Ltd. All rights reserved.
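
    The soluble-electrode consumption mentioned at the end of the abstract can be bounded with Faraday's law, m = I*t*M/(z*F). The current and electrolysis time below are illustrative placeholders, not the operating conditions of the study.

```python
# Faraday's-law estimate of anode dissolution during electrocoagulation.
F = 96485.0    # Faraday constant, C/mol
I = 0.5        # applied current, A (assumed)
t = 30 * 60    # electrolysis time, s (assumed 30 min)

# Aluminium (Al -> Al3+): M = 26.98 g/mol, z = 3; iron (Fe -> Fe2+): M = 55.85 g/mol, z = 2.
for metal, M, z in [("Al", 26.98, 3), ("Fe", 55.85, 2)]:
    m = I * t * M / (z * F)          # dissolved anode mass, g
    print(f"{metal} dissolved: {m * 1000:.0f} mg")
```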

  11. Lignocellulosic Biomass Transformations via Greener Oxidative Pretreatment Processes: Access to Energy and Value-Added Chemicals

    PubMed Central

    Den, Walter; Sharma, Virender K.; Lee, Mengshan; Nadadur, Govind; Varma, Rajender S.

    2018-01-01

    Anthropogenic climate change, principally induced by the large volume of carbon dioxide emission from the global economy driven by fossil fuels, has been observed and scientifically proven as a major threat to civilization. Meanwhile, fossil fuel depletion has been identified as a future challenge. Lignocellulosic biomass in the form of organic residues appears to be the most promising option as renewable feedstock for the generation of energy and platform chemicals. As of today, relatively little bioenergy comes from lignocellulosic biomass as compared to feedstock such as starch and sugarcane, primarily due to high cost of production involving pretreatment steps required to fragment biomass components via disruption of the natural recalcitrant structure of these rigid polymers; low efficiency of enzymatic hydrolysis of refractory feedstock presents a major challenge. The valorization of lignin and cellulose into energy products or chemical products is contingent on the effectiveness of selective depolymerization of the pretreatment regime which typically involve harsh pyrolytic and solvothermal processes assisted by corrosive acids or alkaline reagents. These unselective methods decompose lignin into many products that may not be energetically or chemically valuable, or even biologically inhibitory. Exploring milder, selective and greener processes, therefore, has become a critical subject of study for the valorization of these materials in the last decade. Efficient alternative activation processes such as microwave- and ultrasound irradiation are being explored as replacements for pyrolysis and hydrothermolysis, while milder options such as advanced oxidative and catalytic processes should be considered as choices to harsher acid and alkaline processes. Herein, we critically abridge the research on chemical oxidative techniques for the pretreatment of lignocellulosics with the explicit aim to rationalize the objectives of the biomass pretreatment step and the problems associated with the conventional processes. The mechanisms of reaction pathways, selectivity and efficiency of end-products obtained using greener processes such as ozonolysis, photocatalysis, oxidative catalysis, electrochemical oxidation, and Fenton or Fenton-like reactions, as applied to depolymerization of lignocellulosic biomass are summarized with deliberation on future prospects of biorefineries with greener pretreatment processes in the context of the life cycle assessment. PMID:29755972

  12. Lignocellulosic Biomass Transformations via Greener Oxidative Pretreatment Processes: Access to Energy and Value-Added Chemicals.

    PubMed

    Den, Walter; Sharma, Virender K; Lee, Mengshan; Nadadur, Govind; Varma, Rajender S

    2018-01-01

    Anthropogenic climate change, principally induced by the large volume of carbon dioxide emission from the global economy driven by fossil fuels, has been observed and scientifically proven as a major threat to civilization. Meanwhile, fossil fuel depletion has been identified as a future challenge. Lignocellulosic biomass in the form of organic residues appears to be the most promising option as renewable feedstock for the generation of energy and platform chemicals. As of today, relatively little bioenergy comes from lignocellulosic biomass as compared to feedstock such as starch and sugarcane, primarily due to high cost of production involving pretreatment steps required to fragment biomass components via disruption of the natural recalcitrant structure of these rigid polymers; low efficiency of enzymatic hydrolysis of refractory feedstock presents a major challenge. The valorization of lignin and cellulose into energy products or chemical products is contingent on the effectiveness of selective depolymerization of the pretreatment regime which typically involve harsh pyrolytic and solvothermal processes assisted by corrosive acids or alkaline reagents. These unselective methods decompose lignin into many products that may not be energetically or chemically valuable, or even biologically inhibitory. Exploring milder, selective and greener processes, therefore, has become a critical subject of study for the valorization of these materials in the last decade. Efficient alternative activation processes such as microwave- and ultrasound irradiation are being explored as replacements for pyrolysis and hydrothermolysis, while milder options such as advanced oxidative and catalytic processes should be considered as choices to harsher acid and alkaline processes. Herein, we critically abridge the research on chemical oxidative techniques for the pretreatment of lignocellulosics with the explicit aim to rationalize the objectives of the biomass pretreatment step and the problems associated with the conventional processes. The mechanisms of reaction pathways, selectivity and efficiency of end-products obtained using greener processes such as ozonolysis, photocatalysis, oxidative catalysis, electrochemical oxidation, and Fenton or Fenton-like reactions, as applied to depolymerization of lignocellulosic biomass are summarized with deliberation on future prospects of biorefineries with greener pretreatment processes in the context of the life cycle assessment.

  13. Lignocellulosic Biomass Transformations via Greener Oxidative Pretreatment Processes: Access to Energy and Value-Added Chemicals

    NASA Astrophysics Data System (ADS)

    Den, Walter; Sharma, Virender K.; Lee, Mengshan; Nadadur, Govind; Varma, Rajender S.

    2018-04-01

    Anthropogenic climate change, principally induced by the large volume of carbon dioxide emission from the global economy driven by fossil fuels, has been observed and scientifically proven as a major threat to civilization. Meanwhile, fossil fuel depletion has been identified as a future challenge. Lignocellulosic biomass in the form of organic residues appears to be the most promising option as renewable feedstock for the generation of energy and platform chemicals. As of today, relatively little bioenergy comes from lignocellulosic biomass as compared to feedstock such as starch and sugarcane, primarily due to high cost of production involving pretreatment steps required to fragment biomass components via disruption of the natural recalcitrant structure of these rigid polymers; low efficiency of enzymatic hydrolysis of refractory feedstock presents a major challenge. The valorization of lignin and cellulose into energy products or chemical products is contingent on the effectiveness of selective depolymerization of the pretreatment regime which typically involve harsh pyrolytic and solvothermal processes assisted by corrosive acids or alkaline reagents. These unselective methods decompose lignin into many products that may not be energetically or chemically valuable, or even biologically inhibitory. Exploring milder, selective and greener processes, therefore, has become a critical subject of study for the valorization of these materials in the last decade. Efficient alternative activation processes such as microwave- and ultrasound irradiation are being explored as replacements for pyrolysis and hydrothermolysis, while milder options such as advanced oxidative and catalytic processes should be considered as choices to harsher acid and alkaline processes. Herein, we critically abridge the research on chemical oxidative techniques for the pretreatment of lignocellulosics with the explicit aim to rationalize the objectives of the biomass pretreatment step and the problems associated with the conventional processes. The mechanisms of reaction pathways, selectivity and efficiency of end-products obtained using greener processes such as ozonolysis, photocatalysis, oxidative catalysis, electrochemical oxidation, and Fenton or Fenton-like reactions, as applied to depolymerization of lignocellulosic biomass are summarized with deliberation on future prospects of biorefineries with greener pretreatment processes in the context of the life cycle assessment.

  14. Tracking acidic pharmaceuticals, caffeine, and triclosan through the wastewater treatment process.

    PubMed

    Thomas, Paul M; Foster, Gregory D

    2005-01-01

    Pharmaceuticals are a class of emerging contaminants whose fate in the wastewater treatment process has received increasing attention in past years. Acidic pharmaceuticals (ibuprofen, naproxen, mefenamic acid, ketoprofen, and diclofenac), caffeine, and the antibacterial triclosan were quantified at four different steps of wastewater treatment from three urban wastewater treatment plants. The compounds were extracted from wastewater samples on Waters Oasis hydrophilic-lipophilic balance solid-phase extraction columns, silylated, and analyzed by gas chromatography-mass spectrometry. For the chemicals studied, it was found that the majority of the influent load was removed during secondary treatment (51-99%), yielding expected surface water concentrations of 13 to 56 ng/L.

  15. An Approach for Stitching Satellite Images in a Bigdata Mapreduce Framework

    NASA Astrophysics Data System (ADS)

    Sarı, H.; Eken, S.; Sayar, A.

    2017-11-01

    In this study we present a two-step map/reduce framework to stitch satellite mosaic images. The proposed system enables recognition and extraction of objects whose parts fall in separate satellite mosaic images; however, this is a time- and resource-consuming process. The major aim of the study is to improve the performance of the image stitching process by utilizing a big data framework. To realize this, we first convert the images into bitmaps (first mapper) and then into String representations of 255s and 0s (second mapper), and finally find the best possible matching position of the images with a reduce function.
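
    The matching step can be sketched without any big-data machinery: binarize the overlapping edges of two tiles and slide one against the other to find the offset with the best agreement. This toy NumPy version only illustrates the idea behind the two mappers and the reduce step; it is not the authors' MapReduce implementation.

```python
import numpy as np

def best_offset(edge_a, edge_b, max_shift=20):
    """Return the shift of edge_b (in pixels) that best matches edge_a."""
    best_shift, best_score = 0, -1.0
    for shift in range(-max_shift, max_shift + 1):
        score = np.mean(edge_a == np.roll(edge_b, shift))  # fraction of agreeing pixels
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift, best_score

rng = np.random.default_rng(1)
a = (rng.random(500) > 0.5).astype(np.uint8) * 255   # binarized edge column of tile A
b = np.roll(a, 7)                                    # tile B is offset by 7 pixels
print(best_offset(a, b))                             # recovers the offset: (-7, 1.0)
```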

  16. Team-Based Introductory Research Experiences in Mathematics

    ERIC Educational Resources Information Center

    Baum, Brittany Smith; Rowell, Ginger Holmes; Green, Lisa; Yantz, Jennifer; Beck, Jesse; Cheatham, Thomas; Stephens, D. Christopher; Nelson, Donald

    2017-01-01

    As part of Middle Tennessee State University's (MTSU's) initiative to improve retention of at-risk STEM majors, they recruit first-time, full-time freshman STEM majors with mathematics ACT scores of 19 to 23 to participate in MTSU's "Mathematics as a FirstSTEP to Success in STEM" project (FirstSTEP). This article overviews MTSU's…

  17. Orbital construction support equipment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Approximately 200 separate construction steps were defined for the three solar power satellite (SPS) concepts. Detailed construction scenarios were developed which describe the specific tasks to be accomplished, and identify general equipment requirements. The scenarios were used to perform a functional analysis, which resulted in the definition of 100 distinct SPS elements. These elements are the components, parts, subsystems, or assemblies upon which construction activities take place. The major SPS elements for each configuration are shown. For those elements, 300 functional requirements were identified in seven generic processes. Cumulatively, these processes encompass all functions required during SPS construction/assembly. Individually each process is defined such that it includes a specific type of activity. Each SPS element may involve activities relating to any or all of the generic processes. The processes are listed, and examples of the requirements defined for a typical element are given.

  18. Steps toward Gaining Knowledge of World Music Pedagogy

    ERIC Educational Resources Information Center

    Carlisle, Katie

    2013-01-01

    This article presents steps toward gaining knowledge of world music pedagogy for K-12 general music educators. The majority of the article details steps that invite engagement within everyday contexts with accessible resources within local and online communities. The steps demonstrate ways general music teachers can diversify and self-direct their…

  19. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images

    PubMed Central

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for a main effects of pipeline step, scan-rescan (for MRI scanner consistency) and repeated pipeline runs (for algorithmic consistency). A main effect of pipeline step was detected, and interestingly an interaction between pipeline step and ROI exists. No effect for either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing. PMID:29023597

  20. NASA Planning for Orion Multi-Purpose Crew Vehicle Ground Operations

    NASA Technical Reports Server (NTRS)

    Letchworth, Gary; Schlierf, Roland

    2011-01-01

    The NASA Orion Ground Processing Team was originally formed by the Kennedy Space Center (KSC) Constellation (Cx) Project Office's Orion Division to define, refine and mature pre-launch and post-landing ground operations for the Orion human spacecraft. The multidisciplined KSC Orion team consisted of KSC civil servant, SAIC, Productivity Apex, Inc. and Boeing-CAPPS engineers, project managers and safety engineers, as well as engineers from Constellation's Orion Project and Lockheed Martin Orion Prime contractor. The team evaluated the Orion design configurations as the spacecraft concept matured between Systems Design Review (SDR), Systems Requirement Review (SRR) and Preliminary Design Review (PDR). The team functionally decomposed prelaunch and post-landing steps at three levels' of detail, or tiers, beginning with functional flow block diagrams (FFBDs). The third tier FFBDs were used to build logic networks and nominal timelines. Orion ground support equipment (GSE) was identified and mapped to each step. This information was subsequently used in developing lower level operations steps in a Ground Operations Planning Document PDR product. Subject matter experts for each spacecraft and GSE subsystem were used to define 5th - 95th percentile processing times for each FFBD step, using the Delphi Method. Discrete event simulations used this information and the logic network to provide processing timeline confidence intervals for launch rate assessments. The team also used the capabilities of the KSC Visualization Lab, the FFBDs and knowledge of the spacecraft, GSE and facilities to build visualizations of Orion pre-launch and postlanding processing at KSC. Visualizations were a powerful tool for communicating planned operations within the KSC community (i.e., Ground Systems design team), and externally to the Orion Project, Lockheed Martin spacecraft designers and other Constellation Program stakeholders during the SRR to PDR timeframe. Other operations planning tools included Kaizen/Lean events, mockups and human factors analysis. The majority of products developed by this team are applicable as KSC prepares 21st Century Ground Systems for the Orion Multi-Purpose Crew Vehicle and Space Launch System.
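
    The timeline confidence-interval idea can be illustrated with a very small Monte Carlo: sample each serial processing step from a distribution bounded by its 5th and 95th percentile estimates and accumulate the totals. The step names and durations below are invented, and a triangular distribution stands in for whatever the team actually used.

```python
import numpy as np

rng = np.random.default_rng(42)

# (5th percentile, most likely, 95th percentile) durations in shifts -- invented values.
steps = {
    "offload and transport":  (1, 2, 4),
    "receiving inspection":   (2, 3, 6),
    "functional checkout":    (4, 6, 10),
    "servicing and closeout": (3, 5, 9),
}

n_trials = 100_000
totals = np.zeros(n_trials)
for lo, mode, hi in steps.values():
    totals += rng.triangular(lo, mode, hi, n_trials)   # serial steps simply add

p5, p50, p95 = np.percentile(totals, [5, 50, 95])
print(f"total duration: 5th = {p5:.1f}, median = {p50:.1f}, 95th = {p95:.1f} shifts")
```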

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederix, Marijke; Mingardon, Florence; Hu, Matthew

    Biological production of chemicals and fuels using microbial transformation of sustainable carbon sources, such as pretreated and saccharified plant biomass, is a multi-step process. Typically, each segment of the workflow is optimized separately, often generating conditions that may not be suitable for integration or consolidation with the upstream or downstream steps. While significant effort has gone into developing solutions to incompatibilities at discrete steps, very few studies report the consolidation of the multi-step workflow into a single pot reactor system. Here we demonstrate a one-pot biofuel production process that uses the ionic liquid 1-ethyl-3-methylimidazolium acetate ([C2C1Im][OAc]) for pretreatment of switchgrass biomass. [C2C1Im][OAc] is highly effective in deconstructing lignocellulose, but nonetheless leaves behind residual reagents that are toxic to standard saccharification enzymes and the microbial production host. We report the discovery of an [C2C1Im]-tolerant E. coli strain, where [C2C1Im] tolerance is bestowed by a P7Q mutation in the transcriptional regulator encoded by rcdA. We establish that the causal impact of this mutation is the derepression of a hitherto uncharacterized major facilitator family transporter, YbjJ. To develop the strain for a one-pot process we engineered this [C2C1Im]-tolerant strain to express a recently reported d-limonene production pathway. We also screened previously reported [C2C1Im]-tolerant cellulases to select one that would function with the range of E. coli cultivation conditions and expressed it in the [C2C1Im]-tolerant E. coli strain so as to secrete this [C2C1Im]-tolerant cellulase. The final strain digests pretreated biomass, and uses the liberated sugars to produce the bio-jet fuel candidate precursor d-limonene in a one-pot process.

  2. Event-triggered logical flow control for comprehensive process integration of multi-step assays on centrifugal microfluidic platforms.

    PubMed

    Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens

    2014-07-07

    The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling with the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to completion of preceding liquid transfer event, i.e. completely independent of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.

  3. Development of an E. coli strain for one-pot biofuel production from ionic liquid pretreated cellulose and switchgrass

    DOE PAGES

    Frederix, Marijke; Mingardon, Florence; Hu, Matthew; ...

    2016-04-11

    Biological production of chemicals and fuels using microbial transformation of sustainable carbon sources, such as pretreated and saccharified plant biomass, is a multi-step process. Typically, each segment of the workflow is optimized separately, often generating conditions that may not be suitable for integration or consolidation with the upstream or downstream steps. While significant effort has gone into developing solutions to incompatibilities at discrete steps, very few studies report the consolidation of the multi-step workflow into a single pot reactor system. Here we demonstrate a one-pot biofuel production process that uses the ionic liquid 1-ethyl-3-methylimidazolium acetate ([C2C1Im][OAc]) for pretreatment of switchgrass biomass. [C2C1Im][OAc] is highly effective in deconstructing lignocellulose, but nonetheless leaves behind residual reagents that are toxic to standard saccharification enzymes and the microbial production host. We report the discovery of an [C2C1Im]-tolerant E. coli strain, where [C2C1Im] tolerance is bestowed by a P7Q mutation in the transcriptional regulator encoded by rcdA. We establish that the causal impact of this mutation is the derepression of a hitherto uncharacterized major facilitator family transporter, YbjJ. To develop the strain for a one-pot process we engineered this [C2C1Im]-tolerant strain to express a recently reported d-limonene production pathway. We also screened previously reported [C2C1Im]-tolerant cellulases to select one that would function with the range of E. coli cultivation conditions and expressed it in the [C2C1Im]-tolerant E. coli strain so as to secrete this [C2C1Im]-tolerant cellulase. The final strain digests pretreated biomass, and uses the liberated sugars to produce the bio-jet fuel candidate precursor d-limonene in a one-pot process.

  4. Nanofabrication on unconventional substrates using transferred hard masks

    DOE PAGES

    Li, Luozhou; Bayn, Igal; Lu, Ming; ...

    2015-01-15

    A major challenge in nanofabrication is to pattern unconventional substrates that cannot be processed for a variety of reasons, such as incompatibility with spin coating, electron beam lithography, optical lithography, or wet chemical steps. Here, we present a versatile nanofabrication method based on re-usable silicon membrane hard masks, patterned using standard lithography and mature silicon processing technology. These masks, transferred precisely onto targeted regions, can be in the millimetre scale. They allow for fabrication on a wide range of substrates, including rough, soft, and non-conductive materials, enabling feature linewidths down to 10 nm. Plasma etching, lift-off, and ion implantation are realized without the need for scanning electron/ion beam processing, UV exposure, or wet etching on target substrates.

  5. Indicators to facilitate the early identification of patients with major depressive disorder in need of highly specialized care: A concept mapping study.

    PubMed

    van Krugten, F C W; Goorden, M; van Balkom, A J L M; Spijker, J; Brouwer, W B F; Hakkaart-van Roijen, L

    2018-04-01

    Early identification of the subgroup of patients with major depressive disorder (MDD) in need of highly specialized care could enhance personalized intervention. This, in turn, may reduce the number of treatment steps needed to achieve and sustain an adequate treatment response. The aim of this study was to identify patient-related indicators that could facilitate the early identification of the subgroup of patients with MDD in need of highly specialized care. Initial patient indicators were derived from a systematic review. Subsequently, a structured conceptualization methodology known as concept mapping was employed to complement the initial list of indicators by clinical expertise and develop a consensus-based conceptual framework. Subject-matter experts were invited to participate in the subsequent steps (brainstorming, sorting, and rating) of the concept mapping process. A final concept map solution was generated using nonmetric multidimensional scaling and agglomerative hierarchical cluster analyses. In total, 67 subject-matter experts participated in the concept mapping process. The final concept map revealed the following 10 major clusters of indicators: 1-depression severity, 2-onset and (treatment) course, 3-comorbid personality disorder, 4-comorbid substance use disorder, 5-other psychiatric comorbidity, 6-somatic comorbidity, 7-maladaptive coping, 8-childhood trauma, 9-social factors, and 10-psychosocial dysfunction. The study findings highlight the need for a comprehensive assessment of patient indicators in determining the need for highly specialized care, and suggest that the treatment allocation of patients with MDD to highly specialized mental healthcare settings should be guided by the assessment of clinical and nonclinical patient factors. © 2018 Wiley Periodicals, Inc.
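
    The analysis chain named at the end of the abstract (nonmetric multidimensional scaling followed by hierarchical clustering of the sorted indicators) can be sketched generically with scikit-learn and SciPy. The dissimilarity matrix below is random, purely to show the mechanics; it is not the study's co-sorting data.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)

# Hypothetical dissimilarity matrix for 20 indicators (e.g. 1 - co-sorting frequency).
d = rng.random((20, 20))
d = (d + d.T) / 2.0
np.fill_diagonal(d, 0.0)

# Nonmetric MDS into 2-D coordinates, then agglomerative (Ward) clustering.
coords = MDS(n_components=2, metric=False, dissimilarity="precomputed",
             random_state=0).fit_transform(d)
clusters = fcluster(linkage(coords, method="ward"), t=10, criterion="maxclust")
print(clusters)   # cluster label for each indicator
```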

  6. Analysis/test correlation using VAWT-SDS on a step-relaxation test for the rotating Sandia 34 m test bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argueello, J.G.; Dohrmann, C.R.; Carne, T.G.

    The combined analysis/test effort described in this paper compares predictions with measured data from a step-relaxation test in the absence of significant wind-driven aerodynamic loading. The process described here is intended to illustrate a method for validation of time domain codes for structural analysis of wind turbine structures. Preliminary analyses were performed to investigate the transient dynamic response that the rotating Sandia 34 m Vertical Axis Wind Turbine (VAWT) would undergo when one of the two blades was excited by step-relaxation. The calculations served two purposes. The first was for pretest planning, to evaluate the relative importance of the various forces that would be acting on the structure during the test and to determine if the applied force in the step-relaxation would be sufficient to produce an excitation that was distinguishable from that produced by the aerodynamic loads. The second was to provide predictions that could subsequently be compared to the data from the test. The test was carried out specifically to help in the validation of the time-domain structural dynamics code, VAWT-SDS, which predicts the dynamic response of VAWTs subject to transient events. Post-test comparisons with the data were performed and showed a qualitative agreement between pretest predictions and measured response. However, they also showed that there was significantly more damping in the measurements than included in the predictions. Efforts to resolve this difference, including post-test analyses, were undertaken and are reported herein. The overall effort described in this paper represents a major step in the process of arriving at a validated structural dynamics code.

  7. A major trauma course based on posters, audio-guides and simulation improves the management skills of medical students: Evaluation via medical simulator.

    PubMed

    Cuisinier, Adrien; Schilte, Clotilde; Declety, Philippe; Picard, Julien; Berger, Karine; Bouzat, Pierre; Falcon, Dominique; Bosson, Jean Luc; Payen, Jean-François; Albaladejo, Pierre

    2015-12-01

    Medical competence requires the acquisition of theoretical knowledge and technical skills. Severe trauma management teaching is poorly developed during internship. Nevertheless, the basics of major trauma management should be acquired by every future physician. For this reason, the major trauma course (MTC), an educational course in major traumatology, has been developed for medical students. Our objective was to evaluate, via a high fidelity medical simulator, the impact of the MTC on medical student skills concerning major trauma management. The MTC contains 3 teaching modalities: posters with associated audio-guides, a procedural workshop on airway management and a teaching session using a medical simulator. Skills evaluation was performed 1 month before (step 1) and 1 month after (step 3) the MTC (step 2). Nineteen students were individually evaluated on 2 different major trauma scenarios. The primary endpoint was the difference between steps 1 and 3, in a combined score evaluating: admission, equipment, monitoring and safety (skill set 1) and systematic clinical examinations (skill set 2). After the course, the combined primary outcome score improved by 47% (P<0.01). Scenario choice or the order of use had no significant influence on the skill set evaluations. This study shows improvement in student skills for major trauma management, which we attribute mainly to the major trauma course developed in our institution. Copyright © 2015 Société française d’anesthésie et de réanimation (Sfar). Published by Elsevier Masson SAS. All rights reserved.

  8. Plasma processing conditions substantially influence circulating microRNA biomarker levels.

    PubMed

    Cheng, Heather H; Yi, Hye Son; Kim, Yeonju; Kroh, Evan M; Chien, Jason W; Eaton, Keith D; Goodman, Marc T; Tait, Jonathan F; Tewari, Muneesh; Pritchard, Colin C

    2013-01-01

    Circulating, cell-free microRNAs (miRNAs) are promising candidate biomarkers, but optimal conditions for processing blood specimens for miRNA measurement remain to be established. Our previous work showed that the majority of plasma miRNAs are likely blood cell-derived. In the course of profiling lung cancer cases versus healthy controls, we observed a broad increase in circulating miRNA levels in cases compared to controls and that higher miRNA expression correlated with higher platelet and particle counts. We therefore hypothesized that the quantity of residual platelets and microparticles remaining after plasma processing might impact miRNA measurements. To systematically investigate this, we subjected matched plasma from healthy individuals to stepwise processing with differential centrifugation and 0.22 µm filtration and performed miRNA profiling. We found a major effect on circulating miRNAs, with the majority (72%) of detectable miRNAs substantially affected by processing alone. Specifically, 10% of miRNAs showed 4-30x variation, 46% showed 30-1,000x variation, and 15% showed >1,000x variation in expression solely from processing. This was predominantly due to platelet contamination, which persisted despite using standard laboratory protocols. Importantly, we show that platelet contamination in archived samples could largely be eliminated by additional centrifugation, even in frozen samples stored for six years. To minimize confounding effects in microRNA biomarker studies, additional steps to limit platelet contamination for circulating miRNA biomarker studies are necessary. We provide specific practical recommendations to help minimize confounding variation attributable to plasma processing and platelet contamination.

  9. Method for localizing and isolating an errant process step

    DOEpatents

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.

    2003-01-01

    A method for localizing and isolating an errant process includes the steps of retrieving from a defect image database a selection of images each image having image content similar to image content extracted from a query image depicting a defect, each image in the selection having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. A process step as a highest probable source of the defect according to the derived conditional probability distribution is then identified. A method for process step defect identification includes the steps of characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
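
    The retrieval-then-attribution logic of the patent can be sketched in a few lines: given the process-step labels attached to the defect images retrieved as most similar to the query, form an empirical conditional distribution over steps and report the most probable source. The records below are fabricated for illustration.

```python
from collections import Counter

# Process-step labels attached to the retrieved similar defect images
# (hypothetical defect characterization data).
retrieved_steps = [
    "lithography", "etch", "lithography", "lithography",
    "deposition", "lithography", "etch",
]

counts = Counter(retrieved_steps)
total = sum(counts.values())
p_step = {step: n / total for step, n in counts.items()}   # P(step | defect)

likely_step = max(p_step, key=p_step.get)
print("P(step | defect):", p_step)
print("most probable errant step:", likely_step)
```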

  10. MetaDB a Data Processing Workflow in Untargeted MS-Based Metabolomics Experiments.

    PubMed

    Franceschi, Pietro; Mylonas, Roman; Shahaf, Nir; Scholz, Matthias; Arapitsas, Panagiotis; Masuero, Domenico; Weingart, Georg; Carlin, Silvia; Vrhovsek, Urska; Mattivi, Fulvio; Wehrens, Ron

    2014-01-01

    Due to their sensitivity and speed, mass-spectrometry based analytical technologies are widely used in metabolomics to characterize biological phenomena. To address issues like metadata organization, quality assessment, data processing, data storage, and, finally, submission to public repositories, bioinformatic pipelines of a non-interactive nature are often employed, complementing the interactive software used for initial inspection and visualization of the data. These pipelines are often created as open-source software, allowing the complete and exhaustive documentation of each step and ensuring the reproducibility of the analysis of extensive and often expensive experiments. In this paper, we review the major steps which constitute such a data processing pipeline, discussing them in the context of an open-source software for untargeted MS-based metabolomics experiments recently developed at our institute. The software has been developed by integrating our metaMS R package with a user-friendly web-based application written in Grails. MetaMS takes care of data pre-processing and annotation, while the interface deals with the creation of the sample lists, the organization of the data storage, and the generation of survey plots for quality assessment. Experimental and biological metadata are stored in the ISA-Tab format, making the proposed pipeline fully integrated with the Metabolights framework.

  11. "Coding" and "Decoding": hypothesis for the regulatory mechanism involved in heparan sulfate biosynthesis.

    PubMed

    Zhang, Xu; Wang, Fengshan; Sheng, Juzheng

    2016-06-16

    Heparan sulfate (HS) is widely distributed in mammalian tissues in the form of HS proteoglycans, which play essential roles in various physiological and pathological processes. In contrast to the template-guided processes involved in the synthesis of DNA and proteins, HS biosynthesis is not believed to involve a template. However, it appears that the final structure of HS chains is strictly regulated. Herein, we report a research-based hypothesis that two major steps, namely "coding" and "decoding" steps, are involved in the biosynthesis of HS, which strictly regulate its chemical structure and biological activity. The "coding" process in this context is based on the distribution of sulfate moieties on the amino groups of the glucosamine residues in the HS chains. The sulfation of these amine groups is catalyzed by N-deacetylase/N-sulfotransferase, which has four isozymes. The composition and distribution of sulfate groups and iduronic acid residues on the glycan chains of HS are determined by several other modification enzymes, which can recognize these coding sequences (i.e., the "decoding" process). The degree and pattern of sulfation and epimerization in the HS chains determine the extent of their interactions with several different protein factors, which further influences their biological activity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. European Workshop Industrial Computer Science Systems approach to design for safety

    NASA Technical Reports Server (NTRS)

    Zalewski, Janusz

    1992-01-01

    This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.

  13. Robustness. [in space systems

    NASA Technical Reports Server (NTRS)

    Ryan, Robert

    1993-01-01

    The concept of robustness includes design simplicity, component and path redundancy, desensitization to parameter and environment variations, control of parameter variations, and punctual operations. These characteristics must be traded, along with functional concepts, materials, and fabrication approach, against the criteria of performance, cost, and reliability. The paper describes the robustness design process, which includes the following seven major coherent steps: translation of vision into requirements, definition of the desired robustness characteristics, formulation of criteria for the required robustness, concept selection, detail design, manufacturing and verification, and operations.

  14. Noncontact minimally invasive technique for the assessment of mechanical properties of single cardiac myocyte via magnetic field loading

    NASA Astrophysics Data System (ADS)

    Yin, Shizhuo; Zhang, Xueqian; Cheung, Joseph; Wu, Juntao; Zhan, Chun; Xue, Jinchao

    2004-07-01

    In this paper, a unique non-contact, minimally invasive technique for the assessment of the mechanical properties of single cardiac myocytes is presented. The assessment process includes the following major steps: (1) attach a micro magnetic bead to the cell to be measured, (2) measure the contractile performance of the cell under different magnetic field loadings, (3) calculate the mechanical loading force, and (4) derive the contractile force from the contraction data measured under the different magnetic field loadings.
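
    Step (3), calculating the mechanical loading force, can be illustrated with the usual dipole approximation for an unsaturated superparamagnetic bead, F = (chi*V/mu0)*B*(dB/dx). Every number below is an assumed placeholder, not a value from the paper.

```python
import math

mu0 = 4 * math.pi * 1e-7   # vacuum permeability, T m/A
chi = 0.2                  # effective bead susceptibility (assumed)
radius = 1.4e-6            # bead radius, m (assumed)
B = 0.05                   # flux density at the bead, T (assumed)
dBdx = 50.0                # field gradient, T/m (assumed)

V = 4.0 / 3.0 * math.pi * radius**3   # bead volume, m^3
force = chi * V / mu0 * B * dBdx      # dipole approximation, N
print(f"magnetic loading force ~ {force * 1e12:.1f} pN")
```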

  15. 25 CFR 15.11 - What are the basic steps of the probate process?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians, 2010-04-01. Section 15.11, Bureau of Indian Affairs, Department of the Interior, Probate of Indian... What are the basic steps of the probate process? The basic steps of the probate process are: (a) We learn...

  16. Development of a hydro kinetic river turbine with simulation and operational measurement results in comparison

    NASA Astrophysics Data System (ADS)

    Ruopp, A.; Ruprecht, A.; Riedelbauch, S.; Arnaud, G.; Hamad, I.

    2014-03-01

    The development of a hydro-kinetic prototype is presented, including the compound structure, guide vanes, runner blades and a draft tube section with a steeply sloping, short spoiler. The design process for the hydrodynamic layout was split into three major steps. First, the compound and draft tube sections were designed and the best operating point was identified using porous media as a replacement for the guide vane and runner sections (step one). The best operating point, the volume flux and the pressure drop were then used for the design of the guide vane and runner sections, which were designed and simulated independently (step two). In step three, all parts were merged in stationary simulation runs to determine peak power and operational bandwidth. In addition, the full-scale demonstrator was installed in August 2010 and measured in the St. Lawrence River in Quebec, recording the average inflow velocity using ADCP (Acoustic Doppler Current Profiler) and the generator power output over the variable rotational speed. Simulation data and measurements are in good agreement; the presented approach is thus a suitable way to design a hydro-kinetic turbine.
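
    For orientation on what such a runner can deliver, the short calculation below applies the standard kinetic-power relation P = 0.5*rho*A*v^3 together with an assumed overall power coefficient; the rotor size, flow speed and coefficient are placeholders, not the figures of the St. Lawrence deployment.

```python
import math

rho = 1000.0     # water density, kg/m^3
diameter = 3.0   # rotor diameter, m (assumed)
v = 2.0          # mean inflow velocity, m/s (assumed)
cp = 0.35        # overall power coefficient incl. losses (assumed)

A = math.pi * (diameter / 2.0) ** 2
p_kinetic = 0.5 * rho * A * v**3      # kinetic power in the swept area, W
p_shaft = cp * p_kinetic              # rough estimate of extracted power, W
print(f"kinetic power in the flow: {p_kinetic / 1000:.1f} kW")
print(f"estimated extracted power: {p_shaft / 1000:.1f} kW")
```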

  17. Walking velocity and step length adjustments affect knee joint contact forces in healthy weight and obese adults.

    PubMed

    Milner, Clare E; Meardon, Stacey A; Hawkins, Jillian L; Willson, John D

    2018-04-28

    Knee osteoarthritis is a major public health problem and adults with obesity are particularly at risk. One approach to alleviating this problem is to reduce the mechanical load at the joint during daily activity. Adjusting temporospatial parameters of walking could mitigate cumulative knee joint mechanical loads. The purpose of this study was to determine how adjustments to velocity and step length affect knee joint loading in healthy weight adults and adults with obesity. We collected three-dimensional gait analysis data on 10 adults with a normal body mass index and 10 adults with obesity during over-ground walking in nine different conditions. In addition to preferred velocity and step length, we also tested combinations of 15% increased and decreased velocity and step length. Peak tibiofemoral joint impulse and knee adduction angular impulse were reduced in the decreased step length conditions in both healthy weight adults (main effect) and those with obesity (interaction effect). Peak knee joint adduction moment was also reduced with decreased step length, and with decreased velocity, in both groups. We conclude from these results that adopting shorter step lengths during daily activity and when walking for exercise can reduce mechanical stimuli associated with articular cartilage degenerative processes in adults with and without obesity. Thus, walking with reduced step length may benefit adults at risk for disability due to knee osteoarthritis. © 2018 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
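
    For readers unfamiliar with the loading metrics, the sketch below shows how a knee adduction angular impulse is typically obtained by integrating the external adduction moment over stance. The moment trace and step counts are synthetic placeholders, not data from this study.

```python
# Illustrative sketch: trapezoidal integration of a synthetic adduction moment.
import numpy as np

t = np.linspace(0.0, 0.6, 61)                      # stance time (s), ~0.6 s
adduction_moment = 40.0 * np.sin(np.pi * t / 0.6)  # external moment (N*m), synthetic

# Angular impulse = integral of moment over stance (trapezoidal rule)
angular_impulse = np.sum((adduction_moment[1:] + adduction_moment[:-1]) / 2 * np.diff(t))
print(f"knee adduction angular impulse ~ {angular_impulse:.1f} N*m*s per step")

# Cumulative load over a distance is roughly impulse per step times number of
# steps; shorter steps reduce total load only if the per-step impulse drops
# faster than the step count rises.
steps_per_km = 1400  # assumed
print("impulse per km ~", round(angular_impulse * steps_per_km), "N*m*s")
```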

  18. Chemical composition of distillers grains, a review.

    PubMed

    Liu, KeShun

    2011-03-09

    In recent years, increasing demand for ethanol as a fuel additive and decreasing dependency on fossil fuels have resulted in a dramatic increase in the amount of grain used for ethanol production. Dry-grind is the major process, yielding distillers dried grains with solubles (DDGS) as a major coproduct. Like fuel ethanol, DDGS has quickly become a global commodity. However, high compositional variation has been the main problem hindering its use as a feed ingredient. This review provides updated information on the chemical composition of distillers grains in terms of nutrient levels, changes during dry-grind processing, and causes of the large variation. The occurrence of mycotoxins in the grain feedstock and their fate during processing are also covered. During processing, starch is converted to glucose and then to ethanol and carbon dioxide. Most other components are relatively unchanged but are concentrated in DDGS about 3-fold over the original feedstock. Mycotoxins, if present in the original feedstock, are also concentrated. The higher-fold increases in S, Na, and Ca are mostly due to exogenous addition during processing, whereas unusual changes in inorganic phosphorus (P) and phytate P indicate phytate hydrolysis by yeast phytase. Fermentation causes the major changes, but other processing steps are also responsible. The causes of varying DDGS composition are multiple, including differences in feedstock species and composition, process methods and parameters, the amount of condensed solubles added to distillers wet grains, the effect of fermentation yeast, and analytical methodology. Most of these can be attributed to the complexity of the dry-grind process itself. It is hoped that the information provided in this review will improve understanding of the dry-grind process and aid in the development of strategies to control the compositional variation in DDGS.
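
    A simple mass balance makes the roughly 3-fold concentration plausible. The sketch below uses assumed round numbers (about two-thirds of corn dry matter being starch that is removed), not figures from the review.

```python
# Illustrative mass balance with assumed values: starch is fermented away, so
# the remaining components are concentrated in DDGS.
corn_dry_matter = 100.0   # kg, basis
starch_fraction = 0.67    # assumed fraction converted to ethanol/CO2 and removed
non_starch = corn_dry_matter * (1.0 - starch_fraction)

# A component present at 4% of corn dry matter (e.g. oil) ends up at:
component = 4.0  # kg per 100 kg corn dry matter
concentration_factor = corn_dry_matter / non_starch
print(f"concentration factor ~ {concentration_factor:.1f}x, "
      f"component ~ {component * concentration_factor:.1f}% of DDGS dry matter")
```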

  19. A Review of the Quantification and Classification of Pigmented Skin Lesions: From Dedicated to Hand-Held Devices.

    PubMed

    Filho, Mercedes; Ma, Zhen; Tavares, João Manuel R S

    2015-11-01

    In recent years, the incidence of skin cancer has risen worldwide, mainly due to prolonged exposure to harmful ultraviolet radiation. Concurrently, the computer-assisted diagnosis of skin cancer has undergone major advances, through improvements in instrumentation and detection technology and the development of algorithms to process the information. Moreover, because there has been an increased need to store medical data for monitoring, comparative and assisted-learning purposes, algorithms for data processing and storage have also become more efficient in handling the increased volume of data. In addition, the potential use of common mobile devices to register high-resolution images of skin lesions has fueled the need for real-time processing algorithms that can provide a likelihood of malignancy. This last possibility allows even non-specialists to monitor and follow up suspected skin cancer cases. In this review, we present the major steps in the pre-processing, processing and post-processing of skin lesion images, with a particular emphasis on the quantification and classification of pigmented skin lesions. We further review and outline the future challenges for the creation of minimum-feature, automated and real-time algorithms for the detection of skin cancer from images acquired via common mobile devices.
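
    A minimal example of the kind of pre-processing and segmentation chain discussed is sketched below using scikit-image; it is a generic illustration rather than any specific algorithm from the review.

```python
# Generic pigmented-lesion segmentation sketch (not an algorithm from the review).
import numpy as np
from skimage import color, filters, morphology

def segment_lesion(rgb_image: np.ndarray) -> np.ndarray:
    """Return a boolean mask of the (darker) lesion region."""
    gray = color.rgb2gray(rgb_image)             # pre-processing: grayscale
    smooth = filters.gaussian(gray, sigma=2.0)   # pre-processing: denoise
    thresh = filters.threshold_otsu(smooth)      # processing: global threshold
    mask = smooth < thresh                       # lesions are darker than skin
    mask = morphology.remove_small_objects(mask, min_size=200)  # post-processing
    return morphology.binary_closing(mask, morphology.disk(5))

# Features such as asymmetry, border irregularity, colour variance and diameter
# (the ABCD rule) would then be computed from the mask for classification.
```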

  20. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost driver in downstream processing, with high attrition costs especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the process adaptations required relative to the batch process. 35 g of antibody were purified using the hybrid approach, resulting in product quality and step yield comparable to the batch process. Productivity for the protein A step could be increased by up to 420%, reducing buffer consumption by 30-40% and showing robustness over at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, the cost of goods for the protein A step was compared between the hybrid process and the batch process, showing up to a threefold (300%) reduction in cost depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  2. Comparison of machinability of manganese alloyed austempered ductile iron produced using conventional and two step austempering processes

    NASA Astrophysics Data System (ADS)

    Hegde, Ananda; Sharma, Sathyashankara

    2018-05-01

    Austempered Ductile Iron (ADI) is a revolutionary material offering high strength and hardness combined with good ductility and toughness. The two-step austempering process has led to a superior combination of all the mechanical properties. However, because of the high strength and hardness of ADI, there is concern regarding its machinability. In the present study, the machinability of ADI produced using conventional and two-step heat treatment processes is assessed using tool life and surface roughness. Speed, feed and depth of cut are considered as the machining parameters in dry turning. The machinability results, along with the mechanical properties, are compared for ADI produced using both the conventional and the two-step austempering process. The results show that the two-step austempering process produced better toughness with good hardness and strength without sacrificing ductility. The addition of 0.64 wt% manganese did not have any detrimental effect on the machinability of ADI in either the conventional or the two-step process. Marginal improvements in tool life and surface roughness were observed in the two-step process compared with the conventional process.

  3. Achieving continuous manufacturing for final dosage formation: challenges and how to meet them. May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous. In heterogeneous processing there are discrete particles that can segregate, whereas in homogeneous processing components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing technologies, whereas homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it takes long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the detriment that, as the technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, which we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation from batch to continuous processing, it is expected that homogeneous processing is the next step that will follow. Specific action items for industry leaders are: Form precompetitive partnerships, including industry (pharmaceutical companies and equipment manufacturers), government, and universities. These precompetitive partnerships would develop case studies of continuous manufacturing and ideally perform joint technology development, including development of small-scale equipment and processes. Develop ways to invest internally in continuous manufacturing. How best to do this will depend on the specifics of a given organization, in particular the current development projects. Upper managers will need to energize their process developers to incorporate continuous manufacturing in at least part of their processes to gain experience and demonstrate the benefits directly. Training in continuous manufacturing technologies, organizational approaches, and regulatory approaches is a key area that industrial leaders should pursue together. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  4. The Nucleation and Growth of Protein Crystals

    NASA Technical Reports Server (NTRS)

    Pusey, Marc

    2004-01-01

    Obtaining crystals of suitable size and high quality continues to be a major bottleneck in macromolecular crystallography. Currently, structural genomics efforts are achieving on average about a 10% success rate in going from purified protein to a deposited crystal structure. Growth of crystals in microgravity was proposed as a means of overcoming size and quality problems, which subsequently led to a major NASA effort in microgravity crystal growth, with the agency also funding research into understanding the process. Studies of the macromolecule crystal nucleation and growth process were carried out in a number of labs in an effort to understand what affected the resultant crystal quality on Earth, and how microgravity improved the process. Based upon experimental evidence, as well as simple starting assumptions, we have proposed that crystal nucleation occurs by a series of discrete self assembly steps, which 'set' the underlying crystal symmetry. This talk will review the model developed, and its origins, in our laboratory for how crystals nucleate and grow, and will then present, along with preliminary data, how we propose to use this model to improve the success rate for obtaining crystals from a given protein.

  5. Process Intensification for Cellulosic Biorefineries.

    PubMed

    Sadula, Sunitha; Athaley, Abhay; Zheng, Weiqing; Ierapetritou, Marianthi; Saha, Basudeb

    2017-06-22

    Utilization of renewable carbon sources, especially non-food biomass, is critical to addressing climate change and future energy challenges. Current chemical and enzymatic processes for producing cellulosic sugars are multistep and energy- and water-intensive. Techno-economic analysis (TEA) suggests that upstream lignocellulose processing is a major hurdle to the economic viability of cellulosic biorefineries. Process intensification, which integrates processes and uses less water and energy, has the potential to overcome these challenges. Here, we demonstrate a one-pot depolymerization and saccharification process for woody biomass, energy crops, and agricultural residues to produce soluble sugars in high yields. Lignin is separated as a solid for selective upgrading. Further integration of our upstream process with a reactive extraction step enables energy-efficient separation of the sugars in the form of furans. TEA reveals that the process efficiency and integration enable, for the first time, economic production of feed streams that could profoundly improve process economics for downstream cellulosic bioproducts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, multi-dimensional combinations of operational variables were studied to quantify their impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular-weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.

  7. [Emotions and affect in psychoanalysisis].

    PubMed

    Carton, Solange; Widlöcher, Daniel

    2012-06-01

    The goal of this paper is to give some indications of the concept of affect in psychoanalysis. There is no single theory of affect, and Freud gave successive definitions, which continue to be deepened in contemporary psychoanalysis. We review some steps in Freud's work on affect, and then turn to major current questions, such as its relationship to the soma, the nature of unconscious affects, and the repression of affect, which is particularly developed in the field of psychoanalytic psychosomatics. Starting from Freud's definitions of affect as one of the drive representatives and as a limit-concept between the somatic and the psychic, we develop some major theoretical perspectives, which give a central place to the soma and drive impulses, and which agree on the central idea that affect is the result of a process. We then note some parallels between the psychoanalysis of affect and the psychology and neuroscience of emotion, and underline the gaps and conditions of comparison between these different epistemological approaches.

  8. [Innovative technology and blood safety].

    PubMed

    Begue, S; Morel, P; Djoudi, R

    2016-11-01

    Although technological innovations alone are not enough to improve blood safety, their contributions to blood transfusion over several decades have been major. Improvements in blood donation (new apheresis devices, RFID), in blood components (additive solutions, pathogen reduction technology, automated processing of platelet concentrates) and in the manufacturing process for these products (automated processing of whole blood), all steps where technological innovations have been implemented, have led to better traceability, more efficient processes, higher quality of blood products and therefore increased blood safety for donors and patients. We are on the threshold of a great change with the progress of pathogen reduction technology (for whole blood and red blood cells), and we may hope to see the production of ex vivo red blood cells or platelets, which would open new conceptual paths for blood safety. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  9. Recent developments in membrane-based separations in biotechnology processes: review.

    PubMed

    Rathore, A S; Shirke, A

    2011-01-01

    Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.

  10. 20 CFR 404.231 - Steps in computing your primary insurance amount under the guaranteed alternative-general.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Steps in computing your primary insurance... Steps in computing your primary insurance amount under the guaranteed alternative—general. If you reach age 62 after 1978 but before 1984, we follow three major steps in finding your guaranteed alternative...

  11. Development of a large low-cost double-chamber vacuum laminator

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1983-01-01

    A double-chamber vacuum laminator was required to investigate the processing and control of the fabrication of large terrestrial photovoltaic modules, and economic problems arising therefrom. Major design considerations were low cost, process flexibility and the exploration of novel equipment approaches. Spherical end caps for industrial tanks were used for the vacuum chambers. A stepping programmer and adjustable timers were used for process flexibility. New processing options were obtained by use of vacuum sensors. The upper vacuum chamber was provided with a diaphragm support to reduce diaphragm stress. A counterweight was used for handling ease and safety. Heat was supplied by a large electrical strip heater. Thermal isolation and mechanical support were provided inexpensively by a bed of industrial marbles. Operational testing disclosed the need for a differential vacuum gauge and proportional valve. Reprogramming of the process control system was simple and quick.

  12. Precise turnaround time measurement of laboratory processes using radiofrequency identification technology.

    PubMed

    Mayer, Horst; Brümmer, Jens; Brinkmann, Thomas

    2011-01-01

    To implement Lean Six Sigma in our central laboratory, we conducted a project to measure the single pre-analytical steps influencing the turnaround time (TAT) of emergency department (ED) serum samples. The traditional approach of extracting data from the Laboratory Information System (LIS) for a retrospective calculation of a mean TAT is not suitable. Therefore, we used radiofrequency identification (RFID) chips for real-time tracking of individual samples at every pre-analytical step. 1,200 serum tubes were labelled with RFID chips and provided to the emergency department. Three RFID receivers were installed in the laboratory: at the outlet of the pneumatic tube system, at the centrifuge, and in the analyser area. In addition, time stamps of sample entry at the automated sample distributor and of result communication from the analyser were collected from the LIS. 1,023 labelled serum tubes arrived at our laboratory, and 899 RFID tags were used for TAT calculation. The following transfer times were determined (median/95th percentile, in min:sec): pneumatic tube system --> centrifuge (01:25/04:48), centrifuge --> sample distributor (14:06/5:33), sample distributor --> analysis system zone (02:39/15:07), analysis system zone --> result communication (12:42/22:21). Total TAT was calculated at 33:19/57:40 min:sec. Manual processes around centrifugation were identified as a major part of TAT, accounting for 44%/60% (median/95th percentile). RFID is a robust, easy to use, and error-free technology that is not susceptible to interferences in the laboratory environment. With this study design we were able to measure significant variations in a single manual sample transfer process. We showed that TAT is mainly influenced by the manual steps around the centrifugation process and concluded that centrifugation should be integrated into solutions for total laboratory automation.
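
    The per-step statistics reported here are straightforward to reproduce once each sample carries a timestamp at every checkpoint. The sketch below computes median and 95th-percentile transfer times from synthetic event data, not from the study's records.

```python
# Illustrative TAT statistics from per-sample, per-step transfer times.
import numpy as np

rng = np.random.default_rng(0)
n = 899  # samples with usable RFID tags

# Synthetic per-step transfer times in seconds (gamma-distributed placeholders)
steps = {
    "tube_system->centrifuge": rng.gamma(shape=3.0, scale=30.0, size=n),
    "centrifuge->distributor": rng.gamma(shape=4.0, scale=220.0, size=n),
    "distributor->analyser":   rng.gamma(shape=2.0, scale=90.0, size=n),
    "analyser->result":        rng.gamma(shape=6.0, scale=130.0, size=n),
}

def mmss(seconds: float) -> str:
    return f"{int(seconds // 60):02d}:{int(seconds % 60):02d}"

total = np.zeros(n)
for name, t in steps.items():
    total += t
    print(f"{name:26s} median {mmss(np.median(t))}  p95 {mmss(np.percentile(t, 95))}")
print(f"{'total TAT':26s} median {mmss(np.median(total))}  p95 {mmss(np.percentile(total, 95))}")
```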

  13. The University of Minnesota Morris (UMM) STEP Program: an initiative to encourage the participation of Native Americans in the sciences

    NASA Astrophysics Data System (ADS)

    Cotter, J. F.

    2009-12-01

    The goal of the UMM STEP program is to increase the number of graduates in STEM fields through innovative curricular, recruiting and mentoring strategies. A unique focus of the UMM STEP program is increasing the number of Native American science majors. The STEP program fosters a summer research environment where peer interaction and mentoring create a web of support. To do so we will establish a supportive and fulfilling pipeline that: 1) identifies Native American students and involves them in research while they are in high school; 2) mentors and prepares participants for university academics the summer before their freshman year; 3) provides a complete tuition waiver, mentoring and a support network throughout their undergraduate career; and 4) involves participants in an active and dynamic summer undergraduate research environment where under-represented individuals are in the majority. The third and fourth components of this pipeline are in very good shape. The Morris campus was originally established as an Indian School in 1887. When the federal government deeded the Indian school campus to the University of Minnesota, a stipulation was that Native American students attend the college for free. At present, 196 Native Americans are enrolled at UMM (50 are STEM majors). The UMM STEP research experience provides the unique opportunity to interact with a scientific community that both breaks down a number of traditional barriers and aids in the maturation of these students as scientists. In summer 2008, four students were involved in summer research, and in 2009 seven Native American students participated. Early efforts of the UMM STEP program are encouraging. UMM Admissions staff used the UMM STEP program to recruit Native American students, and the P.I. phoned "uncommitted admits", visited reservations and hosted reservation student visits. The result was an increase in freshman Native American science majors from 7 in fall 2007 to 15 in fall 2008 and 20 in fall 2009. Overall, Native American science majors increased from 33 (2007) to 39 (2008) and now 50 (2009). One UMM STEP participant from summer 2008 graduated and is now in the University of Wisconsin Pharmacy Program. The biggest challenge to date for the UMM STEP program is making contact with rising junior and senior high school students. As a result, the envisioned UMM STEP "pipeline" remains without a solid foundation. Strategies employed to resolve this situation include: 1) writing to all educators with announcements and advertisements for summer opportunities for high school juniors, 2) contacting alumni who are educators in school districts with a significant number of Native American students, 3) utilizing the UMM STEP review board to make additional contacts on reservations, and 4) bringing UMM STEP students to high schools to talk about their experiences and answer questions about UMM. This research is funded by the NSF STEP program (NSF/DUE-0653063).

  14. Mechanical work for step-to-step transitions is a major determinant of the metabolic cost of human walking.

    PubMed

    Donelan, J Maxwell; Kram, Rodger; Kuo, Arthur D

    2002-12-01

    In the single stance phase of walking, center of mass motion resembles that of an inverted pendulum. Theoretically, mechanical work is not necessary for producing the pendular motion, but work is needed to redirect the center of mass velocity from one pendular arc to the next during the transition between steps. A collision model predicts a rate of negative work proportional to the fourth power of step length. Positive work is required to restore the energy lost, potentially exacting a proportional metabolic cost. We tested these predictions with humans (N=9) walking over a range of step lengths (0.4-1.1 m) while keeping step frequency fixed at 1.8 Hz. We measured individual limb external mechanical work using force plates, and metabolic rate using indirect calorimetry. As predicted, average negative and positive external mechanical work rates increased with the fourth power of step length (from 1 W to 38 W; r(2)=0.96). Metabolic rate also increased with the fourth power of step length (from 7 W to 379 W; r(2)=0.95), and linearly with mechanical work rate. Mechanical work for step-to-step transitions, rather than pendular motion itself, appears to be a major determinant of the metabolic cost of walking.
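
    A quick back-of-the-envelope check of the fourth-power prediction, using only the step-length range quoted above (illustrative arithmetic, not a re-analysis of the study's data):

```python
# If external mechanical work rate W scales as step length s^4 at fixed step
# frequency, the ratio between the longest and shortest steps tested is:
s_short, s_long = 0.4, 1.1          # m, range reported in the study
ratio = (s_long / s_short) ** 4
print(f"predicted work-rate ratio ~ {ratio:.0f}x")   # ~57x
# This is of the same order as the reported increase from about 1 W to 38 W;
# the fitted power law (r^2 = 0.96) absorbs the remaining offset.
```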

  15. Enantioselective catalysis of photochemical reactions.

    PubMed

    Brimioulle, Richard; Lenhart, Dominik; Maturi, Mark M; Bach, Thorsten

    2015-03-23

    The nature of the excited state renders the development of chiral catalysts for enantioselective photochemical reactions a considerable challenge. The absorption of a 400 nm photon corresponds to an energy uptake of approximately 300 kJ mol⁻¹. Given the large energetic distance to the ground state, innovative concepts are required to open reaction pathways that selectively lead to a single enantiomer of the desired product. This Review outlines the two major concepts of homogeneously catalyzed enantioselective processes. The first part deals with chiral photocatalysts, which intervene in the photochemical key step and induce asymmetric induction in this step. In the second part, reactions are presented in which the photochemical excitation is mediated by an achiral photocatalyst and the transfer of chirality is ensured by a second chiral catalyst (dual catalysis). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
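
    The 300 kJ mol⁻¹ figure follows directly from the Planck relation; a short worked check:

```python
# Molar energy of a 400 nm photon: E = N_A * h * c / lambda
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
N_A = 6.022e23       # Avogadro's number, 1/mol
wavelength = 400e-9  # m

E_molar = N_A * h * c / wavelength   # J/mol
print(f"E(400 nm) ~ {E_molar / 1e3:.0f} kJ/mol")   # ~299 kJ/mol, i.e. ~300 kJ/mol
```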

  16. Cardiac and circulatory assessment in intensive care units.

    PubMed

    McGrath, A; Cox, C L

    1998-12-01

    As healthcare delivery changes in critical care, nursing continues to evolve and develop. Nursing skills are expanding to incorporate skills once seen as the remit of the medical profession. Nurses are now equipping themselves with the skills and knowledge that can enhance the care they provide to their patients. Assessment of patients is a major role in nursing and, by expanding assessment skills, nurses can ensure that patients receive the care most appropriate to their needs. Nurses in critical care settings are well placed to carry out a more detailed assessment, which can help to focus nursing care. This article describes the step-by-step process of undertaking a full and comprehensive cardiac and circulatory assessment in a clinical setting. It identifies many of the problems that patients may have and the signs that the nurse may note whilst undertaking the assessment.

  17. Electrochemical characterization of p(+)n and n(+)p diffused InP structures

    NASA Technical Reports Server (NTRS)

    Wilt, David M.; Faur, Maria; Faur, Mircea; Goradia, M.; Vargas-Aburto, Carlos

    1993-01-01

    The relatively well documented and widely used electrolytes for the characterization and processing of Si and GaAs-related materials and structures by electrochemical methods are of little or no use with InP, because the electrolytes presently used either dissolve the surface preferentially at defect areas or form residual oxides and introduce a large density of surface states. Using an electrolyte newly developed for the anodic dissolution of InP, named the 'FAP' electrolyte, accurate characterization of InP-related structures was performed, including the nature and density of surface states, defect density, and net majority carrier concentration, all as functions of depth. A step-by-step optimization of n(+)p and p(+)n InP structures made by thermal diffusion was carried out using these electrochemical techniques, resulting in high-performance homojunction InP structures.
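
    Net majority carrier concentration is commonly extracted from electrochemical capacitance-voltage data via the Mott-Schottky relation; the sketch below applies that standard textbook analysis to synthetic data and is not necessarily the exact procedure used with the FAP electrolyte.

```python
# Mott-Schottky analysis sketch: N = 2 / (q * eps_r * eps0 * A^2 * d(1/C^2)/dV)
import numpy as np

q = 1.602e-19        # C
eps0 = 8.854e-12     # F/m
eps_r = 12.5         # relative permittivity of InP
A = 1e-6             # contact area, m^2 (1 mm^2), assumed

V = np.linspace(0.2, 1.0, 20)                       # reverse bias, V
N_true, V_fb = 1e23, 0.0                            # 1e17 cm^-3, flat band ~0 V
C = A * np.sqrt(q * eps_r * eps0 * N_true / (2 * (V - V_fb)))  # synthetic C-V data

slope = np.polyfit(V, 1.0 / C**2, 1)[0]             # d(1/C^2)/dV
N = 2.0 / (q * eps_r * eps0 * A**2 * slope)         # carriers per m^3
print(f"extracted N ~ {N / 1e6:.2e} cm^-3")         # recovers ~1e17 cm^-3
```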

  18. Estimating continuous floodplain and major river bed topography mixing ordinal coutour lines and topographic points

    NASA Astrophysics Data System (ADS)

    Bailly, J. S.; Dartevelle, M.; Delenne, C.; Rousseau, A.

    2017-12-01

    Floodplain and major river bed topography govern many river biophysical processes during floods. Despite the growth of direct topographic measurements from LiDAR on riverine systems, there is still room to develop methods for large (e.g. deltas) or very local (e.g. ponds) riverine systems that take advantage of information coming from simple SAR or optical image processing on the floodplain, resulting from waterbody delineation during flood rise or recession and producing ordered contour lines. The challenge is then to exploit such data in order to estimate continuous topography on the floodplain by combining heterogeneous data: a topographic point dataset and a located but unvalued, ordered contour-line dataset. This article compares two methods designed to estimate continuous floodplain topography by mixing ordinal contour lines and topographic points. In both methods, a first estimation step assigns an elevation to each contour line, and a second step estimates the continuous field from both the topographic points and the valued contour lines. The first method is stochastic, based on multi-Gaussian random fields and conditional simulation. The second is deterministic, based on thin-plate radial spline functions used for approximate bivariate surface construction. Results are first shown and discussed for a set of synthetic case studies with various topographic point densities and topographic smoothness, and then for an actual case study in the Montagua lagoon, located north of Valparaiso, Chile.
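
    As a sketch of the deterministic variant: once each contour line has been assigned an elevation, the topographic points and densified contour vertices can be merged and fitted with a thin-plate spline. The data, smoothing value and use of scipy's RBFInterpolator below are assumptions for illustration, not the authors' implementation.

```python
# Thin-plate spline surface from mixed points and a valued contour line.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)

# Scattered topographic points: (x, y) in metres, z elevation
xy_pts = rng.uniform(0, 1000, size=(200, 2))
z_pts = 5.0 + 0.004 * xy_pts[:, 0] + rng.normal(0, 0.05, 200)

# Vertices sampled along one contour line already valued at z = 6.5 m
theta = np.linspace(0, 2 * np.pi, 100)
xy_contour = np.c_[500 + 200 * np.cos(theta), 500 + 120 * np.sin(theta)]
z_contour = np.full(100, 6.5)

xy = np.vstack([xy_pts, xy_contour])
z = np.concatenate([z_pts, z_contour])

# 'smoothing' trades exact interpolation against surface smoothness
dem = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1.0)

grid_x, grid_y = np.meshgrid(np.linspace(0, 1000, 50), np.linspace(0, 1000, 50))
grid_z = dem(np.c_[grid_x.ravel(), grid_y.ravel()]).reshape(grid_x.shape)
print(grid_z.shape, float(grid_z.min()), float(grid_z.max()))
```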

  19. Estimating continuous floodplain and major river bed topography mixing ordinal coutour lines and topographic points

    NASA Astrophysics Data System (ADS)

    Brown, T. G.; Lespez, L.; Sear, D. A.; Houben, P.; Klimek, K.

    2016-12-01

    Floodplain and major river bed topography govern many river biophysical processes during floods. Despite the growth of direct topographic measurements from LiDAR on riverine systems, there is still room to develop methods for large (e.g. deltas) or very local (e.g. ponds) riverine systems that take advantage of information coming from simple SAR or optical image processing on the floodplain, resulting from waterbody delineation during flood rise or recession and producing ordered contour lines. The challenge is then to exploit such data in order to estimate continuous topography on the floodplain by combining heterogeneous data: a topographic point dataset and a located but unvalued, ordered contour-line dataset. This article compares two methods designed to estimate continuous floodplain topography by mixing ordinal contour lines and topographic points. In both methods, a first estimation step assigns an elevation to each contour line, and a second step estimates the continuous field from both the topographic points and the valued contour lines. The first method is stochastic, based on multi-Gaussian random fields and conditional simulation. The second is deterministic, based on thin-plate radial spline functions used for approximate bivariate surface construction. Results are first shown and discussed for a set of synthetic case studies with various topographic point densities and topographic smoothness, and then for an actual case study in the Montagua lagoon, located north of Valparaiso, Chile.

  20. SU-E-T-128: Applying Failure Modes and Effects Analysis to a Risk-Based Quality Management for Stereotactic Radiosurgery in Brazil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teixeira, F; Universidade do Estado do Rio de Janeiro, Rio De Janeiro, RJ; Almeida, C de

    2015-06-15

    Purpose: The goal of the present work was to evaluate the process maps for stereotactic radiosurgery (SRS) treatment at three radiotherapy centers in Brazil and to apply the FMEA technique to evaluate similarities and differences, if any, of the hazards and risks associated with these processes. Methods: A team consisting of professionals from different disciplines involved in SRS treatment was formed at each center. Each team was responsible for developing the process map and performing the FMEA and FTA. A facilitator knowledgeable in these techniques led the work at each center. The TG-100 recommended scales were used for the evaluation of hazard and severity for each step of the major process "treatment planning". Results: The hazard index, given by the Risk Priority Number (RPN), was found to range from 4 to 270 for the various process steps, and the severity (S) index from 1 to 10. RPN values > 100 and severity values ≥ 7 were chosen to flag safety improvement interventions. The number of steps with RPN ≥ 100 was 6, 59 and 45 for the three centers; the corresponding numbers for S ≥ 7 were 24, 21 and 25, respectively. The ranges of RPN and S values at each center belong to different process steps and failure modes. Conclusion: These results show that the interventions needed to improve safety differ for each center and are associated with the skill level of the professional team as well as the technology used to provide radiosurgery treatment. The present study will very likely serve as a model for the implementation of a risk-based prospective quality management program for SRS treatment in Brazil, where there are currently 28 radiotherapy centers performing SRS. A complete FMEA for SRS at these three radiotherapy centers is currently under development.
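
    For orientation, the sketch below shows the usual TG-100-style scoring in which each failure mode receives occurrence (O), severity (S) and detectability (D) scores on 1-10 scales, RPN = O × S × D, and items are flagged with the thresholds quoted above. The failure modes and scores are invented examples, not results from the three centers.

```python
# Illustrative FMEA scoring and flagging (made-up failure modes and scores).
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    description: str
    O: int  # occurrence, 1-10
    S: int  # severity, 1-10
    D: int  # lack of detectability, 1-10

    @property
    def rpn(self) -> int:
        return self.O * self.S * self.D

modes = [
    FailureMode("treatment planning", "wrong CT-MR registration", O=4, S=8, D=6),
    FailureMode("treatment planning", "incorrect prescription dose", O=2, S=9, D=3),
    FailureMode("treatment planning", "target contour too small", O=3, S=7, D=5),
]

# Flag items for safety interventions using the thresholds quoted in the study
flagged = [m for m in modes if m.rpn >= 100 or m.S >= 7]
for m in sorted(flagged, key=lambda m: m.rpn, reverse=True):
    print(f"RPN={m.rpn:3d}  S={m.S}  {m.step}: {m.description}")
```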

  1. Physico-chemical, colorimetric, rheological parameters and chemometric discrimination of the origin of Mugil cephalus' roes during the manufacturing process of Bottarga.

    PubMed

    Caredda, Marco; Addis, Margherita; Pes, Massimo; Fois, Nicola; Sanna, Gabriele; Piredda, Giovanni; Sanna, Gavino

    2018-06-01

    The aim of this work was to measure the physico-chemical and colorimetric parameters of ovaries from Mugil cephalus caught in the Tortolì lagoon (south-east coast of Sardinia) along the steps of the manufacturing process of Bottarga, together with the rheological parameters of the final product. A lowering of all CIELab coordinates (lightness, redness and yellowness) was observed during the manufacturing process. All CIELab parameters were used to build a Linear Discriminant Analysis (LDA) predictive model able to determine in real time whether the roes had been subjected to a freezing process, with a prediction success of 100%. This model could be used to identify the origin of the roes, since only the imported ones are frozen. The major changes in all the studied parameters (p < 0.05) occurred in the drying step rather than in the salting step. After processing, Bottarga was characterized by a pH value of 5.46 (CV = 2.8) and a moisture content of 25% (CV = 8), whereas the typical amounts of proteins, fat and NaCl, calculated as a percentage of the dried weight, were 56 (CV = 2), 34 (CV = 3) and 3.6 (CV = 17), respectively. The physico-chemical changes of the roes during the manufacturing process were largest for moisture, which decreased by 28%, whereas the protein and fat contents on a dry-weight basis decreased by 3% and 2%, respectively; the NaCl content increased by 3.1%. Principal Component Analyses (PCA) were also performed on all data to establish trends and relationships among the parameters. The hardness and consistency of Bottarga were negatively correlated with the moisture content (r = -0.87 and r = -0.88, respectively), while its adhesiveness was negatively correlated with the fat content (r = -0.68). Copyright © 2018. Published by Elsevier Ltd.
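
    The kind of LDA classifier described (fresh versus previously frozen roe from CIELab coordinates) can be prototyped in a few lines with scikit-learn; the colour values below are synthetic stand-ins, not the study's measurements.

```python
# LDA classification sketch on synthetic CIELab data (illustrative only).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 60
# Columns: L* (lightness), a* (redness), b* (yellowness)
fresh  = rng.normal(loc=[55, 25, 35], scale=[3, 2, 3], size=(n, 3))
frozen = rng.normal(loc=[50, 21, 30], scale=[3, 2, 3], size=(n, 3))

X = np.vstack([fresh, frozen])
y = np.array([0] * n + [1] * n)        # 0 = fresh (local), 1 = frozen (imported)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"cross-validated accuracy ~ {scores.mean():.2f}")

lda.fit(X, y)
print("predicted class for a new roe:", lda.predict([[52.0, 22.5, 31.0]]))
```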

  2. Arbitrary Shape Deformation in CFD Design

    NASA Technical Reports Server (NTRS)

    Landon, Mark; Perry, Ernest

    2014-01-01

    Sculptor(R) is a commercially available software tool, based on an Arbitrary Shape Design (ASD), which allows the user to perform shape optimization for computational fluid dynamics (CFD) design. The developed software tool provides important advances in the state of the art of automatic CFD shape deformation and optimization software. CFD is an analysis tool that is used by engineering designers to help gain a greater understanding of the fluid flow phenomena involved in the components being designed. The next step in the engineering design process is then to modify the design to improve the components' performance. This step has traditionally been performed manually via trial and error. Two major problems that have, in the past, hindered the development of automated CFD shape optimization are (1) inadequate shape parameterization algorithms, and (2) inadequate algorithms for CFD grid modification. The ASD that has been developed as part of the Sculptor(R) software tool is a major advancement in solving these two issues. First, the ASD allows the CFD designer to freely create his own shape parameters, thereby eliminating the restriction of only being able to use the CAD model parameters. Then, the software performs a smooth volumetric deformation, which eliminates the extremely costly process of having to remesh the grid for every shape change (which is how this process had previously been achieved). Sculptor(R) can be used to optimize shapes for aerodynamic and structural design of spacecraft, aircraft, watercraft, ducts, and other objects that affect and are affected by flows of fluids and heat. Sculptor(R) makes it possible to perform, in real time, a design change that would manually take hours or days if remeshing were needed.

  3. Top Ten Reasons for DEOX as a Front End to Pyroprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    B.R. Westphal; K.J. Bateman; S.D. Herrmann

    A front end step is being considered to augment chopping during the treatment of spent oxide fuel by pyroprocessing. The front end step, termed DEOX for its emphasis on decladding via oxidation, employs high temperatures to promote the oxidation of UO2 to U3O8 via an oxygen carrier gas. During oxidation, the spent fuel experiences a 30% increase in lattice structure volume, resulting in the separation of fuel from cladding with a reduced particle size. A potential added benefit of DEOX is the removal of fission products, either via direct release from the broken fuel structure or via oxidation and volatilization by the high temperature process. Fuel element chopping is the baseline operation to prepare spent oxide fuel for an electrolytic reduction step. Typical chopping lengths range from 1 to 5 mm for both individual elements and entire assemblies. During electrolytic reduction, uranium oxide is reduced to metallic uranium via a lithium molten salt. An electrorefining step is then performed to separate a majority of the fission products from the recoverable uranium. Although DEOX is based on a low temperature oxidation cycle near 500 °C, additional conditions have been tested to distinguish their effects on the process.[1] Both oxygen and air have been utilized during the oxidation portion, followed by vacuum conditions to temperatures as high as 1200 °C. In addition, the effects of cladding on fission product removal have also been investigated with released fuel to temperatures greater than 500 °C.
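
    The quoted ~30% volume increase can be sanity-checked from the stoichiometry 3 UO2 + O2 → U3O8 and theoretical densities; the densities used below are assumed handbook values, not numbers from the report.

```python
# Rough check of the volume change per mole of uranium on oxidation (assumed densities).
M_UO2, M_U3O8 = 270.03, 842.08      # g/mol
rho_UO2, rho_U3O8 = 10.96, 8.38     # g/cm^3, theoretical densities (assumed)

V_per_U_in_UO2 = M_UO2 / rho_UO2                 # cm^3 per mol U
V_per_U_in_U3O8 = (M_U3O8 / rho_U3O8) / 3.0      # cm^3 per mol U
increase = V_per_U_in_U3O8 / V_per_U_in_UO2 - 1.0
print(f"volume increase per mol U ~ {increase * 100:.0f}%")
# ~36%, of the same order as the ~30% lattice volume increase quoted above.
```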

  4. A macrosonic system for industrial processing

    PubMed

    Gallego-Juarez; Rodriguez-Corral; Riera-Franco de Sarabia E; Campos-Pozuelo; Vazquez-Martinez; Acosta-Aparicio

    2000-03-01

    The development of high-power applications of sonic and ultrasonic energy in industrial processing requires a great variety of practical systems whose characteristics depend on the effect to be exploited. Nevertheless, the majority of systems basically consist of a treatment chamber and one or several transducers coupled to it. Therefore, the feasibility of an application mainly depends on the efficiency of the transducer-chamber system. This paper deals with a macrosonic system essentially consisting of a high-power transducer with a double stepped-plate radiator coupled to a chamber of square cross-section. The radiator, which has a rectangular shape, is placed on one face of the chamber in order to drive the fluid volume inside. The stepped profile of the radiator allows piston-like radiation to be obtained. The radiation from the back face of the radiator is also applied to the chamber by using adequate reflectors. Transducer-chamber systems for sonic and ultrasonic frequencies have been developed with power capacities up to about 5 kW for the treatment of fluid volumes of several cubic meters.

  5. Impact of atmospheric correction and image filtering on hyperspectral classification of tree species using support vector machine

    NASA Astrophysics Data System (ADS)

    Shahriari Nia, Morteza; Wang, Daisy Zhe; Bohlman, Stephanie Ann; Gader, Paul; Graves, Sarah J.; Petrovic, Milenko

    2015-01-01

    Hyperspectral images can be used to identify savannah tree species at the landscape scale, which is a key step in measuring biomass and carbon, and tracking changes in species distributions, including invasive species, in these ecosystems. Before automated species mapping can be performed, image processing and atmospheric correction is often performed, which can potentially affect the performance of classification algorithms. We determine how three processing and correction techniques (atmospheric correction, Gaussian filters, and shade/green vegetation filters) affect the prediction accuracy of classification of tree species at pixel level from airborne visible/infrared imaging spectrometer imagery of longleaf pine savanna in Central Florida, United States. Species classification using fast line-of-sight atmospheric analysis of spectral hypercubes (FLAASH) atmospheric correction outperformed ATCOR in the majority of cases. Green vegetation (normalized difference vegetation index) and shade (near-infrared) filters did not increase classification accuracy when applied to large and continuous patches of specific species. Finally, applying a Gaussian filter reduces interband noise and increases species classification accuracy. Using the optimal preprocessing steps, our classification accuracy of six species classes is about 75%.
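
    A minimal per-pixel pipeline of the type evaluated (spectral Gaussian smoothing followed by an SVM classifier) is sketched below; the array shapes, filter width and SVM parameters are assumptions, not the study's settings.

```python
# Per-pixel species classification sketch on synthetic spectra.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_pixels, n_bands, n_species = 3000, 224, 6

# Synthetic pixel spectra with species-dependent offsets plus noise
labels = rng.integers(0, n_species, n_pixels)
spectra = rng.normal(0, 1, (n_pixels, n_bands)) + labels[:, None] * 0.15

# Smooth each spectrum along the band axis to reduce inter-band noise
spectra_smooth = gaussian_filter1d(spectra, sigma=2.0, axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    spectra_smooth, labels, test_size=0.3, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print(f"pixel-level accuracy ~ {clf.score(X_test, y_test):.2f}")
```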

  6. Using cognitive behaviour therapy with South Asian Muslims: Findings from the culturally sensitive CBT project.

    PubMed

    Naeem, Farooq; Phiri, Peter; Munshi, Tariq; Rathod, Shanaya; Ayub, Muhhhamad; Gobbi, Mary; Kingdon, David

    2015-01-01

    It has been suggested that cognitive behaviour therapy (CBT) needs adaptation for it to be effective for patients from collectivistic cultures, as currently CBT is underpinned by individualistic values. In prior studies we have demonstrated that CBT could be adapted for Pakistani patients in Southampton, UK, and for local populations in Pakistan. Findings from these studies suggest that CBT can be adapted for patients from collectivistic cultures using a series of steps. In this paper we focus on these steps, and the process of adapting CBT for specific groups. The adaptation process should focus on three major areas of therapy, rather than simple translation of therapy manuals. These include (1) awareness of relevant cultural issues and preparation for therapy, (2) assessment and engagement, and (3) adjustments in therapy. We also discuss the best practice guidelines that evolved from this work to help therapists working with this population. We reiterate that CBT can be adapted effectively for patients from traditional cultures. This is, however, an emerging area in psychotherapy, and further work is required to refine the methodology and to test adapted CBT.

  7. Deceased organ donation for transplantation: Challenges and opportunities

    PubMed Central

    Girlanda, Raffaele

    2016-01-01

    Organ transplantation saves thousands of lives every year but the shortage of donors is a major limiting factor to increase transplantation rates. To allow more patients to be transplanted before they die on the wait-list an increase in the number of donors is necessary. Patients with devastating irreversible brain injury, if medically suitable, are potential deceased donors and strategies are needed to successfully convert them into actual donors. Multiple steps in the process of deceased organ donation can be targeted to increase the number of organs suitable for transplant. In this review, after describing this process, we discuss current challenges and potential strategies to expand the pool of deceased donors. PMID:27683626

  8. An outlook on microalgal biofuels.

    PubMed

    Wijffels, René H; Barbosa, Maria J

    2010-08-13

    Microalgae are considered one of the most promising feedstocks for biofuels. The productivity of these photosynthetic microorganisms in converting carbon dioxide into carbon-rich lipids, only a step or two away from biodiesel, greatly exceeds that of agricultural oleaginous crops, without competing for arable land. Worldwide, research and demonstration programs are being carried out to develop the technology needed to expand algal lipid production from a craft to a major industrial process. Although microalgae are not yet produced at large scale for bulk applications, recent advances, particularly in the methods of systems biology, genetic engineering, and biorefining, present opportunities to develop this process in a sustainable and economical way within the next 10 to 15 years.

  9. Approaches to automated protein crystal harvesting

    PubMed Central

    Deller, Marc C.; Rupp, Bernhard

    2014-01-01

    The harvesting of protein crystals is almost always a necessary step in the determination of a protein structure using X-ray crystallographic techniques. However, protein crystals are usually fragile and susceptible to damage during the harvesting process. For this reason, protein crystal harvesting is the single step that remains entirely dependent on skilled human intervention. Automation has been implemented in the majority of other stages of the structure-determination pipeline, including cloning, expression, purification, crystallization and data collection. The gap in automation between crystallization and data collection results in a bottleneck in throughput and presents unfortunate opportunities for crystal damage. Several automated protein crystal harvesting systems have been developed, including systems utilizing microcapillaries, microtools, microgrippers, acoustic droplet ejection and optical traps. However, these systems have yet to be commonly deployed in the majority of crystallography laboratories owing to a variety of technical and cost-related issues. Automation of protein crystal harvesting remains essential for harnessing the full benefits of fourth-generation synchrotrons, free-electron lasers and microfocus beamlines. Furthermore, automation of protein crystal harvesting offers several benefits when compared with traditional manual approaches, including the ability to harvest microcrystals, improved flash-cooling procedures and increased throughput. PMID:24637746

  10. Automatic Layout Design for Power Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ning, Puqi; Wang, Fei; Ngo, Khai

    The layout of power modules is one of the most important elements in power module design, especially for high power densities, where couplings are increased. In this paper, an automatic design process using a genetic algorithm (GA) is presented, and some practical considerations are introduced into the optimization of the module layout design. Detailed GA implementations are described for both the outer loop and the inner loop. As verified by a design example, the results of the automatic design process presented here are better than those from manual design and also better than the results from a popular design software package. This automatic design procedure could be a major step toward improving the overall performance of future layout designs.
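
    A toy genetic-algorithm layout optimizer is sketched below to illustrate the general approach (chromosomes encode device positions, fitness penalizes overlap and interconnect length); it is a generic illustration, not the paper's algorithm or objective function.

```python
# Toy GA for placement-style layout optimization (illustrative only).
import random

N_DEV, POP, GENS = 4, 40, 200
random.seed(1)

def random_layout():
    # chromosome: list of (x, y) device positions on a 50 mm substrate
    return [(random.uniform(5, 45), random.uniform(5, 45)) for _ in range(N_DEV)]

def fitness(layout):
    # total Manhattan interconnect length along a simple daisy chain
    wire = sum(abs(x1 - x2) + abs(y1 - y2)
               for (x1, y1), (x2, y2) in zip(layout, layout[1:]))
    # penalty when two devices come closer than an 8 mm keep-out distance
    overlap = sum(max(0.0, 8.0 - max(abs(x1 - x2), abs(y1 - y2)))
                  for i, (x1, y1) in enumerate(layout)
                  for (x2, y2) in layout[i + 1:])
    return wire + 100.0 * overlap          # lower is better

def crossover(a, b):
    cut = random.randrange(1, N_DEV)
    return a[:cut] + b[cut:]

def mutate(layout, rate=0.2):
    return [(x + random.uniform(-2, 2), y + random.uniform(-2, 2))
            if random.random() < rate else (x, y) for x, y in layout]

pop = [random_layout() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[:10]                       # keep the best layouts
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - 10)]
print("best fitness:", round(fitness(min(pop, key=fitness)), 1))
```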

  11. Thermal Spray Maps: Material Genomics of Processing Technologies

    NASA Astrophysics Data System (ADS)

    Ang, Andrew Siao Ming; Sanpo, Noppakun; Sesso, Mitchell L.; Kim, Sun Yung; Berndt, Christopher C.

    2013-10-01

    There is currently no method whereby material properties of thermal spray coatings may be predicted from fundamental processing inputs such as temperature-velocity correlations. The first step in such an important understanding would involve establishing a foundation that consolidates the thermal spray literature so that known relationships could be documented and any trends identified. This paper presents a method to classify and reorder thermal spray data so that relationships and correlations between competing processes and materials can be identified. Extensive data mining of published experimental work was performed to create thermal spray property-performance maps, known as "TS maps" in this work. Six TS maps will be presented. The maps are based on coating characteristics of major importance; i.e., porosity, microhardness, adhesion strength, and the elastic modulus of thermal spray coatings.

  12. Processing of zero-derived words in English: an fMRI investigation.

    PubMed

    Pliatsikas, Christos; Wheeldon, Linda; Lahiri, Aditi; Hansen, Peter C

    2014-01-01

    Derivational morphological processes allow us to create new words (e.g. punish (V) to the noun (N) punishment) from base forms. The number of steps from the base unit to the derived word often varies (e.g. nationality), and some words are derived without any overt change in form (e.g. bridge-V from bridge-N), i.e. zero-derivation (Aronoff, 1980). We compared the processing of one-step (soaking) …

  13. Personalized Physical Activity Coaching: A Machine Learning Approach

    PubMed Central

    Dijkhuis, Talko B.; van Ittersum, Miriam W.; Velthuijsen, Hugo

    2018-01-01

    Living a sedentary lifestyle is one of the major causes of numerous health problems. To encourage employees to lead a less sedentary life, the Hanze University started a health promotion program. One of the interventions in the program was the use of an activity tracker to record participants' daily step count. The daily step count served as input for a fortnightly coaching session. In this paper, we investigate the possibility of automating part of the coaching procedure on physical activity by providing personalized feedback throughout the day on a participant's progress in achieving a personal step goal. The gathered step count data were used to train eight different machine learning algorithms to make hourly estimations of the probability of achieving a personalized daily step threshold. In 80% of the individual cases, the Random Forest algorithm was the best performing algorithm (mean accuracy = 0.93, range = 0.88–0.99, and mean F1-score = 0.90, range = 0.87–0.94). To demonstrate the practical usefulness of these models, we developed a proof-of-concept Web application that provides personalized feedback about whether a participant is expected to reach his or her daily threshold. We argue that the use of machine learning could become an invaluable asset in the process of automated personalized coaching. The individualized algorithms allow for predicting physical activity during the day and provide the possibility to intervene in time. PMID:29463052
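
    The prediction task can be prototyped as follows: from the hour of day and the cumulative step count so far, estimate the probability of reaching the personal daily goal. The simulated data and features below are illustrative and do not reproduce the study's feature set or tuning.

```python
# Random-forest sketch for hourly goal-attainment prediction (simulated data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_days, goal = 400, 10000

hour = rng.integers(8, 22, n_days)                       # hour of the day
rate = rng.normal(650, 180, n_days).clip(100)            # average steps per hour so far
steps_so_far = (hour - 7) * rate
# Label: will the daily goal be reached, assuming a slightly lower pace later on
reached_goal = (steps_so_far + (23 - hour) * rate * 0.8 >= goal).astype(int)

X = np.c_[hour, steps_so_far]
X_tr, X_te, y_tr, y_te = train_test_split(X, reached_goal, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"accuracy ~ {model.score(X_te, y_te):.2f}")
print("P(goal reached | 14:00, 4500 steps) ~",
      model.predict_proba([[14, 4500]])[0, 1].round(2))
```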

  14. A comparative study of one-step and two-step approaches for MAPbI3 perovskite layer and its influence on the performance of mesoscopic perovskite solar cell

    NASA Astrophysics Data System (ADS)

    Wang, Minhuan; Feng, Yulin; Bian, Jiming; Liu, Hongzhu; Shi, Yantao

    2018-01-01

    Mesoscopic perovskite solar cells (M-PSCs) were synthesized with MAPbI3 perovskite layers as light harvesters, grown with one-step and two-step solution processes, respectively. A comparative study was performed by quantitatively correlating the resulting device performance with the crystalline quality of the perovskite layers. Compared with the one-step counterpart, a pronounced improvement of 56.86% in the steady-state power conversion efficiency (PCE) was achieved with the two-step process, which mainly resulted from a significant enhancement in fill factor (FF) from 48% to 77% without sacrificing the open-circuit voltage (Voc) or short-circuit current (Jsc). The enhanced FF was attributed to reduced non-radiative recombination channels due to the better crystalline quality and larger grain size of the two-step processed perovskite layer. Moreover, the superiority of the two-step over the one-step process was demonstrated with rather good reproducibility.
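
    Since PCE = FF · Voc · Jsc / P_in, holding Voc and Jsc fixed means the relative PCE gain should track the relative FF gain; a one-line check (illustrative arithmetic only):

```python
# Consistency check: relative PCE gain expected from the fill-factor change alone.
ff_one_step, ff_two_step = 0.48, 0.77
relative_gain = ff_two_step / ff_one_step - 1.0
print(f"expected PCE gain from FF alone ~ {relative_gain * 100:.0f}%")  # ~60%
# Close to the ~57% steady-state improvement reported, consistent with the claim
# that the gain comes almost entirely from the fill factor.
```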

  15. Physicochemical structural changes of cellulosic substrates during enzymatic saccharification

    DOE PAGES

    Meng, Xianzhi; Yoo, Chang Geun; Li, Mi; ...

    2016-12-30

    Enzymatic hydrolysis represents one of the major steps and barriers in the commercialization of processes converting cellulosic substrates into biofuels and other value-added products. It is usually achieved by the synergistic action of an enzyme mixture typically consisting of multiple enzymes, such as glucanase, cellobiohydrolase and β-glucosidase, with different modes of action. Owing to innate biomass recalcitrance, enzymatic hydrolysis normally starts with an initial fast rate of hydrolysis followed by a rapid decrease in rate toward the end of hydrolysis. With the majority of literature studies focusing on the effect of key substrate characteristics on the initial rate or final yield of enzymatic hydrolysis, information about the physicochemical structural changes of cellulosic substrates during enzymatic hydrolysis is still quite limited. Consequently, what slows down the reaction rate toward the end of hydrolysis is not well understood. This review highlights recent advances in understanding the structural changes of cellulosic substrates during the hydrolysis process, to better understand the fundamental mechanisms of enzymatic hydrolysis.

  16. Physicochemical structural changes of cellulosic substrates during enzymatic saccharification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Xianzhi; Yoo, Chang Geun; Li, Mi

    Enzymatic hydrolysis represents one of the major steps and barriers in the commercialization process of converting cellulosic substrates into biofuels and other value-added products. It is usually achieved by the synergistic action of an enzyme mixture typically consisting of multiple enzymes such as glucanase, cellobiohydrolase and β-glucosidase with different modes of action. Due to the innate biomass recalcitrance, enzymatic hydrolysis normally starts with an initial fast rate of hydrolysis followed by a rapid decrease of rate toward the end of hydrolysis. With the majority of literature studies focusing on the effect of key substrate characteristics on the initial rate or final yield of enzymatic hydrolysis, information about the physicochemical structural changes of cellulosic substrates during enzymatic hydrolysis is still quite limited. Consequently, what slows down the reaction rate toward the end of hydrolysis is not well understood. Finally, this review highlights recent advances in understanding the structural changes of cellulosic substrates during the hydrolysis process, to better understand the fundamental mechanisms of enzymatic hydrolysis.

  17. Rapid saccharification for production of cellulosic biofuels.

    PubMed

    Lee, Dae-Seok; Wi, Seung Gon; Lee, Soo Jung; Lee, Yoon-Gyo; Kim, Yeong-Suk; Bae, Hyeun-Jong

    2014-04-01

    The economical production of biofuels is hindered by the recalcitrance of lignocellulose to processing, causing high consumption of processing enzymes and impeding hydrolysis of pretreated lignocellulosic biomass. We determined the major rate-limiting factor in the hydrolysis of popping pre-treated rice straw (PPRS) by examining cellulase adsorption to lignin and cellulose, amorphogenesis of PPRS, and re-hydrolysis. Based on the results, equivalence between enzyme loading and the open structural area of cellulose was required to significantly increase productive adsorption of cellulase and to accelerate enzymatic saccharification of PPRS. Amorphogenesis of PPRS by phosphoric acid treatment to expand the open structural area of the cellulose fibers resulted in twofold higher cellulase adsorption and increased the yield of the first re-hydrolysis step from 13% to 46%. The total yield from PPRS was increased to 84% after 3 h. These results provide evidence that cellulose structure is one of the major factors affecting enzymatic hydrolysis. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Effect of production management on semen quality during long-term storage in different European boar studs.

    PubMed

    Schulze, M; Kuster, C; Schäfer, J; Jung, M; Grossfeld, R

    2018-03-01

    The processing of ejaculates is a fundamental step for the fertilizing capacity of boar spermatozoa. The aim of the present study was to identify factors that affect the quality of boar semen doses. The production process during 1 day of semen processing in 26 European boar studs was monitored. In each boar stud, nine to 19 randomly selected ejaculates from 372 Pietrain boars were analyzed for sperm motility, acrosome and plasma membrane integrity, mitochondrial activity and thermo-resistance (TRT). Each ejaculate was monitored for production time and temperature at each step in semen processing using the specially programmed software SEQU (version 1.7, Minitüb, Tiefenbach, Germany). The dilution of ejaculates with a short-term extender was completed in one step in 10 AI centers (n = 135 ejaculates), in two steps in 11 AI centers (n = 158 ejaculates) and in three steps in five AI centers (n = 79 ejaculates). Results indicated greater semen quality with one-step isothermal dilution than with multi-step dilution of AI semen doses (total motility TRT d7: 71.1 ± 19.2%, 64.6 ± 20.0% and 47.1 ± 27.1% for one-step, two-step and three-step dilution, respectively; P < .05). There was a marked advantage of the one-step isothermal dilution regarding time management, preservation suitability, stability and stress resistance. One-step dilution resulted in significantly lower holding times of raw ejaculates and, owing to the lower number of processing steps, reduced the risk of processing mistakes. These results lead to refined recommendations for boar semen processing. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Diagnostic, design and implementation of an integrated model of care in France: a bottom-up process with a continuous leadership

    PubMed Central

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Purpose To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Context Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Case description In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out through continuous and flexible leadership throughout the process, initially a mixed leadership (clinician and researcher), followed by a double one (clinician and managers of services) in the implementation phase. Conclusions The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the changes in practices and service arrangements.

  20. Development and usability of a computer-tailored pedometer-based physical activity advice for breast cancer survivors.

    PubMed

    De Cocker, K; Charlier, C; Van Hoof, E; Pauwels, E; Lechner, L; Bourgois, J; Spittaels, H; Vandelanotte, C; De Bourdeaudhuij, I

    2015-09-01

    This observational study aimed to adapt a computer-tailored step advice for the general population into a feasible advice for breast cancer survivors and to test its usability. First, several adaptations were made to the original design (adding cancer-related physical activity (PA) barriers and beliefs, and self-management strategies to improve survivors' personal control). Second, the adapted advice was evaluated in two phases: (1) usability testing in healthy women (n = 3) and survivors (n = 6); and (2) a process evaluation during 3 weeks in breast cancer survivors (n = 8). Preliminary usability testing revealed no problems with logging in; however, three survivors misinterpreted some questions. After refining the questionnaire and advice, survivors evaluated the advice as interesting, attractive to read, comprehensible and credible. Inactive survivors found the advice novel, but too long. The process evaluation indicated that the majority of the women (n = 5/8) reported increased steps. Monitoring step counts by using a pedometer was perceived as an important motivator to be more active. To conclude, this study provides initial support for the usability and acceptability of a computer-tailored pedometer-based PA advice for breast cancer survivors. After testing the efficacy and effectiveness of this intervention, this tool can broaden the reach of PA promotion in breast cancer survivors. © 2014 John Wiley & Sons Ltd.

  1. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
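
    The ordering constraints behind the nine steps can be sketched as simple pseudocode; every call below is a hypothetical placeholder (none is a real HLA/RTI or DSES API), shown only to make the sequencing and the two synchronization points explicit.

        # Hypothetical sketch of the coordinated initialization ordering described above.
        # All functions are placeholders; none corresponds to a real RTI or DSES call.
        def initialize_federate(rti, federate):
            rti.create_or_join_federation()                  # Step 1: create the federation
            federate.publish_and_subscribe()                 # Step 2: declare publications/subscriptions
            federate.create_object_instances()               # Step 3: create object instances
            rti.wait_until_all_federates_joined()            # Step 4: confirm all federates have joined
            rti.achieve_sync_point("initialize")             # Step 5: achieve 'initialize' sync point
            federate.update_objects_with_initial_data()      # Step 6: push initial attribute values
            federate.wait_for_object_reflections()           # Step 7: wait for reflections from peers
            federate.set_up_time_management()                # Step 8: enable time management
            rti.achieve_sync_point("startup")                # Step 9: achieve 'startup' sync point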

  2. An early cytoplasmic step of peptidoglycan synthesis is associated to MreB in Bacillus subtilis.

    PubMed

    Rueff, Anne-Stéphanie; Chastanet, Arnaud; Domínguez-Escobar, Julia; Yao, Zhizhong; Yates, James; Prejean, Maria-Victoria; Delumeau, Olivier; Noirot, Philippe; Wedlich-Söldner, Roland; Filipe, Sergio R; Carballido-López, Rut

    2014-01-01

    MreB proteins play a major role during morphogenesis of rod-shaped bacteria by organizing biosynthesis of the peptidoglycan cell wall. However, the mechanisms underlying this process are not well understood. In Bacillus subtilis, membrane-associated MreB polymers have been shown to be associated to elongation-specific complexes containing transmembrane morphogenetic factors and extracellular cell wall assembly proteins. We have now found that an early intracellular step of cell wall synthesis is also associated to MreB. We show that the previously uncharacterized protein YkuR (renamed DapI) is required for synthesis of meso-diaminopimelate (m-DAP), an essential constituent of the peptidoglycan precursor, and that it physically interacts with MreB. Highly inclined laminated optical sheet microscopy revealed that YkuR forms uniformly distributed foci that exhibit fast motion in the cytoplasm, and are not detected in cells lacking MreB. We propose a model in which soluble MreB organizes intracellular steps of peptidoglycan synthesis in the cytoplasm to feed the membrane-associated cell wall synthesizing machineries. © 2013 John Wiley & Sons Ltd.

  3. Diffraction Techniques in Structural Biology

    PubMed Central

    Egli, Martin

    2010-01-01

    A detailed understanding of chemical and biological function and the mechanisms underlying the activities ultimately requires atomic-resolution structural data. Diffraction-based techniques such as single-crystal X-ray crystallography, electron microscopy and neutron diffraction are well established and have paved the road to the stunning successes of modern-day structural biology. The major advances achieved in the last 20 years in all aspects of structural research, including sample preparation, crystallization, the construction of synchrotron and spallation sources, phasing approaches and high-speed computing and visualization, now provide specialists and non-specialists alike with a steady flow of molecular images of unprecedented detail. The present chapter combines a general overview of diffraction methods with a step-by-step description of the process of a single-crystal X-ray structure determination experiment, from chemical synthesis or expression to phasing and refinement, analysis and quality control. For novices it may serve as a stepping-stone to more in-depth treatises of the individual topics. Readers relying on structural information for interpreting functional data may find it a useful consumer guide. PMID:20517991

  4. Two-Step Plasma Process for Cleaning Indium Bonding Bumps

    NASA Technical Reports Server (NTRS)

    Greer, Harold F.; Vasquez, Richard P.; Jones, Todd J.; Hoenk, Michael E.; Dickie, Matthew R.; Nikzad, Shouleh

    2009-01-01

    A two-step plasma process has been developed as a means of removing surface oxide layers from indium bumps used in flip-chip hybridization (bump bonding) of integrated circuits. The two-step plasma process makes it possible to remove surface indium oxide, without incurring the adverse effects of the acid etching process.

  5. Evaluating the cost effectiveness of environmental projects: Case studies in aerospace and defense

    NASA Technical Reports Server (NTRS)

    Shunk, James F.

    1995-01-01

    Using the replacement technology of high pressure waterjet decoating systems as an example, a simple methodology is presented for developing a cost effectiveness model. The model uses a four-step process to formulate an economic justification designed for presentation to decision makers as an assessment of the value of the replacement technology over conventional methods. Three case studies from major U.S. and international airlines are used to illustrate the methodology and resulting model. Tax and depreciation impacts are also presented as potential additions to the model.

  6. A method for using unmanned aerial vehicles for emergency investigation of single geo-hazards and sample applications of this method

    NASA Astrophysics Data System (ADS)

    Huang, Haifeng; Long, Jingjing; Yi, Wu; Yi, Qinglin; Zhang, Guodong; Lei, Bangjun

    2017-11-01

    In recent years, unmanned aerial vehicles (UAVs) have become widely used in emergency investigations of major natural hazards over large areas; however, UAVs are less commonly employed to investigate single geo-hazards. Based on a number of successful investigations in the Three Gorges Reservoir area, China, a complete UAV-based method for performing emergency investigations of single geo-hazards is described. First, a customized UAV system that consists of a multi-rotor UAV subsystem, an aerial photography subsystem, a ground control subsystem and a ground surveillance subsystem is described in detail. The implementation process, which includes four steps, i.e., indoor preparation, site investigation, on-site fast processing and application, and indoor comprehensive processing and application, is then elaborated, and two investigation schemes, automatic and manual, that are used in the site investigation step are put forward. Moreover, some key techniques and methods - e.g., the layout and measurement of ground control points (GCPs), route planning, flight control and image collection, and the Structure from Motion (SfM) photogrammetry processing - are explained. Finally, three applications are given. Experience has shown that using UAVs for emergency investigation of single geo-hazards greatly reduces the time, intensity and risks associated with on-site work and provides valuable, high-accuracy, high-resolution information that supports emergency responses.

  7. Effects of Topography-based Subgrid Structures on Land Surface Modeling

    NASA Astrophysics Data System (ADS)

    Tesfa, T. K.; Ruby, L.; Brunke, M.; Thornton, P. E.; Zeng, X.; Ghan, S. J.

    2017-12-01

    Topography has major control on land surface processes through its influence on atmospheric forcing, soil and vegetation properties, network topology and drainage area. Consequently, accurate climate and land surface simulations in mountainous regions cannot be achieved without considering the effects of topographic spatial heterogeneity. To test a computationally less expensive hyper-resolution land surface modeling approach, we developed topography-based landunits within a hierarchical subgrid spatial structure to improve representation of land surface processes in the ACME Land Model (ALM) with minimal increase in computational demand, while improving the ability to capture the spatial heterogeneity of atmospheric forcing and land cover influenced by topography. This study focuses on evaluation of the impacts of the new spatial structures on modeling land surface processes. As a first step, we compare ALM simulations with and without subgrid topography and driven by grid cell mean atmospheric forcing to isolate the impacts of the subgrid topography on the simulated land surface states and fluxes. Recognizing that subgrid topography also has important effects on atmospheric processes that control temperature, radiation, and precipitation, methods are being developed to downscale atmospheric forcings. Hence in the second step, the impacts of the subgrid topographic structure on land surface modeling will be evaluated by including spatial downscaling of the atmospheric forcings. Preliminary results on the atmospheric downscaling and the effects of the new spatial structures on the ALM simulations will be presented.
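
    One common ingredient of such forcing downscaling, shown purely as an illustration and not as the scheme actually implemented in ALM, is an elevation-based lapse-rate adjustment of the grid-cell temperature to each topographic subgrid unit; the 6.5 K/km lapse rate and the elevations below are assumptions.

        # Illustrative lapse-rate downscaling of grid-cell temperature to subgrid
        # elevation classes (not the ALM implementation; the lapse rate is assumed).
        LAPSE_RATE = 6.5e-3  # K per m, standard environmental lapse rate

        def downscale_temperature(t_grid_k, z_grid_m, z_subgrid_m):
            """Shift temperature by the elevation difference between subgrid unit and grid mean."""
            return t_grid_k - LAPSE_RATE * (z_subgrid_m - z_grid_m)

        # Example: a grid cell with 1500 m mean elevation, subgrid bands at 1000/1500/2500 m.
        for z in (1000.0, 1500.0, 2500.0):
            print(z, downscale_temperature(283.15, 1500.0, z))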

  8. Introduction to Polymer Chemistry.

    ERIC Educational Resources Information Center

    Harris, Frank W.

    1981-01-01

    Reviews the physical and chemical properties of polymers and the two major methods of polymer synthesis: addition (chain, chain-growth, or chain-reaction), and condensation (step-growth or step-reaction) polymerization. (JN)

  9. A sustainable woody biomass biorefinery.

    PubMed

    Liu, Shijie; Lu, Houfang; Hu, Ruofei; Shupe, Alan; Lin, Lu; Liang, Bin

    2012-01-01

    Woody biomass is renewable only if sustainable production is imposed. An optimum and sustainable biomass stand production rate is found to be one with the incremental growth rate at harvest equal to the average overall growth rate. Utilization of woody biomass leads to a sustainable economy. Woody biomass is comprised of at least four components: extractives, hemicellulose, lignin and cellulose. While extractives and hemicellulose are least resistant to chemical and thermal degradation, cellulose is most resistant to chemical, thermal, and biological attack. The difference or heterogeneity in reactivity leads to the recalcitrance of woody biomass at conversion. A selection of processes is presented together as a biorefinery based on incremental sequential deconstruction, fractionation/conversion of woody biomass to achieve efficient separation of major components. A preference is given to a biorefinery absent of pretreatment and detoxification process that produce waste byproducts. While numerous biorefinery approaches are known, a focused review on the integrated studies of water-based biorefinery processes is presented. Hot-water extraction is the first process step to extract value from woody biomass while improving the quality of the remaining solid material. This first step removes extractives and hemicellulose fractions from woody biomass. While extractives and hemicellulose are largely removed in the extraction liquor, cellulose and lignin largely remain in the residual woody structure. Xylo-oligomers, aromatics and acetic acid in the hardwood extract are the major components having the greatest potential value for development. Higher temperature and longer residence time lead to higher mass removal. While high temperature (>200°C) can lead to nearly total dissolution, the amount of sugars present in the extraction liquor decreases rapidly with temperature. Dilute acid hydrolysis of concentrated wood extracts renders the wood extract with monomeric sugars. At higher acid concentration and higher temperature the hydrolysis produced more xylose monomers in a comparatively shorter period of reaction time. Xylose is the most abundant monomeric sugar in the hydrolysate. The other comparatively small amounts of monomeric sugars include arabinose, glucose, rhamnose, mannose and galactose. Acetic acid, formic acid, furfural, HMF and other byproducts are inevitably generated during the acid hydrolysis process. Short reaction time is preferred for the hydrolysis of hot-water wood extracts. Acid hydrolysis presents a perfect opportunity for the removal or separation of aromatic materials from the wood extract/hydrolysate. The hot-water wood extract hydrolysate, after solid-removal, can be purified by Nano-membrane filtration to yield a fermentable sugar stream. Fermentation products such as ethanol can be produced from the sugar stream without a detoxification step. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Process development for elemental recovery from PGM tailings by thermochemical treatment: Preliminary major element extraction studies using ammonium sulphate as extracting agent.

    PubMed

    Mohamed, Sameera; van der Merwe, Elizabet M; Altermann, Wladyslaw; Doucet, Frédéric J

    2016-04-01

    Mine tailings can represent untapped secondary resources of non-ferrous, ferrous, precious, rare and trace metals. Continuous research is conducted to identify opportunities for the utilisation of these materials. This preliminary study investigated the possibility of extracting major elements from South African tailings associated with the mining of Platinum Group Metals (PGM) at the Two Rivers mine operations. These PGM tailings typically contain four major elements (11% Al2O3; 12% MgO; 22% Fe2O3; 34% Cr2O3), with lesser amounts of SiO2 (18%) and CaO (2%). Extraction was achieved via thermochemical treatment followed by aqueous dissolution, as an alternative to conventional hydrometallurgical processes. The thermochemical treatment step used ammonium sulphate, a widely available, low-cost, recyclable chemical agent. Quantification of the efficiency of the thermochemical process required the development and optimisation of the dissolution technique. Dissolution in water promoted the formation of secondary iron precipitates, which could be prevented by leaching thermochemically-treated tailings in 0.6M HNO3 solution. The best extraction efficiencies were achieved for aluminium (ca. 60%) and calcium (ca. 80%). 35% iron and 32% silicon were also extracted, alongside chromium (27%) and magnesium (25%). Thermochemical treatment using ammonium sulphate may therefore represent a promising technology for extracting valuable elements from PGM tailings, which could be subsequently converted to value-added products. However, it is not element-selective, and major elements were found to compete with the reagent to form water-soluble sulphate-metal species. Further development of this integrated process, which aims at achieving the full potential of utilisation of PGM tailings, is currently underway. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Development of the GPM Observatory Thermal Vacuum Test Model

    NASA Technical Reports Server (NTRS)

    Yang, Kan; Peabody, Hume

    2012-01-01

    A software-based thermal modeling process was documented for generating the thermal panel settings necessary to simulate worst-case on-orbit flight environments in an observatory-level thermal vacuum test setup. The method for creating such a thermal model involved four major steps: (1) determining the major thermal zones for test as indicated by the major dissipating components on the spacecraft, then mapping the major heat flows between these components; (2) finding the flight equivalent sink temperatures for these test thermal zones; (3) determining the thermal test ground support equipment (GSE) design and initial thermal panel settings based on the equivalent sink temperatures; and (4) adjusting the panel settings in the test model to match heat flows and temperatures with the flight model. The observatory test thermal model developed from this process allows quick predictions of the performance of the thermal vacuum test design. In this work, the method described above was applied to the Global Precipitation Measurement (GPM) core observatory spacecraft, a joint project between NASA and the Japanese Aerospace Exploration Agency (JAXA) which is currently being integrated at NASA Goddard Space Flight Center for launch in Early 2014. From preliminary results, the thermal test model generated from this process shows that the heat flows and temperatures match fairly well with the flight thermal model, indicating that the test model can simulate fairly accurately the conditions on-orbit. However, further analysis is needed to determine the best test configuration possible to validate the GPM thermal design before the start of environmental testing later this year. Also, while this analysis method has been applied solely to GPM, it should be emphasized that the same process can be applied to any mission to develop an effective test setup and panel settings which accurately simulate on-orbit thermal environments.
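
    A minimal sketch of the idea behind step (2), finding a flight-equivalent sink temperature from a zone's modeled environmental heat load; the simple radiative-balance form, emissivity, area, and heat load used here are assumptions for illustration, not the GPM team's actual formulation.

        # Illustrative radiative-balance estimate of an equivalent sink temperature:
        # the sink temperature at which a surface would absorb the same environmental
        # heat as in the modeled on-orbit case (assumed form, not the GPM model).
        SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

        def equivalent_sink_temperature(q_absorbed_w, area_m2, emissivity):
            """Sink temperature giving the same absorbed environmental flux."""
            return (q_absorbed_w / (emissivity * area_m2 * SIGMA)) ** 0.25

        # Example: 150 W absorbed over 1.2 m^2 of radiator with emissivity 0.85 -> ~226 K
        print(equivalent_sink_temperature(150.0, 1.2, 0.85))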

  12. Theory and applications for optimization of every part of a photovoltaic system

    NASA Technical Reports Server (NTRS)

    Redfield, D.

    1978-01-01

    A general method is presented for quantitatively optimizing the design of every part and fabrication step of an entire photovoltaic system, based on the criterion of minimum cost/Watt for the system output power. It is shown that no element or process step can be optimized properly by considering only its own cost and performance. Moreover, a fractional performance loss at any fabrication step within the cell or array produces the same fractional increase in the cost/Watt of the entire array, but not of the full system. One general equation is found to be capable of optimizing all parts of a system, although the cell and array steps are basically different from the power-handling elements. Applications of this analysis are given to show (1) when Si wafers should be cut to increase their packing fraction; and (2) what the optimum dimensions for solar cell metallizations are. The optimum shadow fraction of the fine grid is shown to be independent of metal cost and resistivity as well as cell size. The optimum thicknesses of both the fine grid and the bus bar are substantially greater than the values in general use, and the total array cost has a major effect on these values. By analogy, this analysis is adaptable to other solar energy systems.
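
    The claimed equivalence between a fractional performance loss and the fractional increase in array cost/Watt can be checked with a line of arithmetic (illustrative numbers; the equivalence is exact to first order in the loss):

        # If a processing step loses a fraction f of output power while array cost C
        # is unchanged, cost/Watt rises from C/P0 to C/((1 - f) * P0):
        # a relative increase of f / (1 - f), i.e. ~f for small losses.
        C, P0, f = 100.0, 200.0, 0.03          # illustrative cost ($), power (W), 3% loss
        base = C / P0
        degraded = C / ((1.0 - f) * P0)
        print((degraded - base) / base)        # ~0.031, matching the ~3% loss to first order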

  13. Direct Sensor Orientation of a Land-Based Mobile Mapping System

    PubMed Central

    Rau, Jiann-Yeou; Habib, Ayman F.; Kersting, Ana P.; Chiang, Kai-Wei; Bang, Ki-In; Tseng, Yi-Hsing; Li, Yu-Hua

    2011-01-01

    A land-based mobile mapping system (MMS) is flexible and useful for the acquisition of road environment geospatial information. It integrates a set of imaging sensors and a position and orientation system (POS). The positioning quality of such systems is highly dependent on the accuracy of the utilized POS. This limitation is the major drawback due to the elevated cost associated with high-end GPS/INS units, particularly the inertial system. The potential accuracy of the direct sensor orientation depends on the architecture and quality of the GPS/INS integration process as well as the validity of the system calibration (i.e., calibration of the individual sensors as well as the system mounting parameters). In this paper, a novel single-step procedure using integrated sensor orientation with relative orientation constraint for the estimation of the mounting parameters is introduced. A comparative analysis between the proposed single-step and the traditional two-step procedure is carried out. Moreover, the estimated mounting parameters using the different methods are used in a direct geo-referencing procedure to evaluate their performance and the feasibility of the implemented system. Experimental results show that the proposed system using single-step system calibration method can achieve high 3D positioning accuracy. PMID:22164015

  14. Step Up-Not On-The Step 2 Clinical Skills Exam: Directors of Clinical Skills Courses (DOCS) Oppose Ending Step 2 CS.

    PubMed

    Ecker, David J; Milan, Felise B; Cassese, Todd; Farnan, Jeanne M; Madigosky, Wendy S; Massie, F Stanford; Mendez, Paul; Obadia, Sharon; Ovitsh, Robin K; Silvestri, Ronald; Uchida, Toshiko; Daniel, Michelle

    2018-05-01

    Recently, a student-initiated movement to end the United States Medical Licensing Examination Step 2 Clinical Skills and the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation has gained momentum. These are the only national licensing examinations designed to assess clinical skills competence in the stepwise process through which physicians gain licensure and certification. Therefore, the movement to end these examinations and the ensuing debate merit careful consideration. The authors, elected representatives of the Directors of Clinical Skills Courses, an organization comprising clinical skills educators in the United States and beyond, believe abolishing the national clinical skills examinations would have a major negative impact on the clinical skills training of medical students, and that forfeiting a national clinical skills competency standard has the potential to diminish the quality of care provided to patients. In this Perspective, the authors offer important additional background information, outline key concerns regarding the consequences of ending these national clinical skills examinations, and provide recommendations for moving forward: reducing the costs for students, exploring alternatives, increasing the value and transparency of the current examinations, recognizing and enhancing the strengths of the current examinations, and engaging in a national dialogue about the issue.

  15. Investigating interactions between phospholipase B-Like 2 and antibodies during Protein A chromatography.

    PubMed

    Tran, Benjamin; Grosskopf, Vanessa; Wang, Xiangdan; Yang, Jihong; Walker, Don; Yu, Christopher; McDonald, Paul

    2016-03-18

    Purification processes for therapeutic antibodies typically exploit multiple and orthogonal chromatography steps in order to remove impurities, such as host-cell proteins. While the majority of host-cell proteins are cleared through purification processes, individual host-cell proteins such as Phospholipase B-like 2 (PLBL2) are more challenging to remove and can persist into the final purification pool even after multiple chromatography steps. With packed-bed chromatography runs using host-cell protein ELISAs and mass spectrometry analysis, we demonstrated that different therapeutic antibodies interact to varying degrees with host-cell proteins in general, and PLBL2 specifically. We then used a high-throughput Protein A chromatography method to further examine the interaction between our antibodies and PLBL2. Our results showed that the co-elution of PLBL2 during Protein A chromatography is highly dependent on the individual antibody and PLBL2 concentration in the chromatographic load. Process parameters such as antibody resin load density and pre-elution wash conditions also influence the levels of PLBL2 in the Protein A eluate. Furthermore, using surface plasmon resonance, we demonstrated that there is a preference for PLBL2 to interact with IgG4 subclass antibodies compared to IgG1 antibodies. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Fish welfare assurance system: initial steps to set up an effective tool to safeguard and monitor farmed fish welfare at a company level.

    PubMed

    van de Vis, J W; Poelman, M; Lambooij, E; Bégout, M-L; Pilarczyk, M

    2012-02-01

    The objective was to take a first step in the development of a process-oriented quality assurance (QA) system for monitoring and safeguarding of fish welfare at a company level. A process-oriented approach is focused on preventing hazards and involves establishment of critical steps in a process that requires careful control. The seven principles of the Hazard Analysis Critical Control Points (HACCP) concept were used as a framework to establish the QA system. HACCP is an internationally agreed approach for management of food safety, which was adapted for the purpose of safeguarding and monitoring the welfare of farmed fish. As the main focus of this QA system is farmed fish welfare assurance at a company level, it was named Fish Welfare Assurance System (FWAS). In this paper we present the initial steps of setting up FWAS for on growing of sea bass (Dicentrarchus labrax), carp (Cyprinus carpio) and European eel (Anguilla anguilla). Four major hazards were selected, which were fish species dependent. Critical Control Points (CCPs) that need to be controlled to minimize or avoid the four hazards are presented. For FWAS, monitoring of CCPs at a farm level is essential. For monitoring purposes, Operational Welfare Indicators (OWIs) are needed to establish whether critical biotic, abiotic, managerial and environmental factors are controlled. For the OWIs we present critical limits/target values. A critical limit is the maximum or minimum value to which a factor must be controlled at a critical control point to prevent, eliminate or reduce a hazard to an acceptable level. For managerial factors target levels are more appropriate than critical limits. Regarding the international trade of farmed fish products, we propose that FWAS needs to be standardized in aquaculture chains. For this standardization a consensus on the concept of fish welfare, methods to assess welfare objectively and knowledge on the needs of farmed fish are required.

  17. The Use of Lactic Acid Bacteria Starter Cultures during the Processing of Fermented Cereal-based Foods in West Africa: A Review.

    PubMed

    Soro-Yao, Amenan Anastasie; Brou, Kouakou; Amani, Georges; Thonart, Philippe; Djè, Koffi Marcelin

    2014-12-01

    Lactic acid bacteria (LAB) are the primary microorganisms used to ferment maize-, sorghum- or millet-based foods that are processed in West Africa. Fermentation contributes to desirable changes in taste, flavour, acidity, digestibility and texture in gruels (ogi, baca, dalaki), doughs (agidi, banku, komé) or steam-cooked granulated products (arraw, ciacry, dégué). Similar to other fermented cereal foods that are available in Africa, these products suffer from inconsistent quality. The use of LAB starter cultures during cereal dough fermentation is a subject of increasing interest in efforts to standardise this step and guarantee product uniformity. However, their use by small-scale processing units or small agro-food industrial enterprises is still limited. This review aims to illustrate and discuss major issues that influence the use of LAB starter cultures during the processing of fermented cereal foods in West Africa.

  18. The Use of Lactic Acid Bacteria Starter Cultures during the Processing of Fermented Cereal-based Foods in West Africa: A Review

    PubMed Central

    Soro-Yao, Amenan Anastasie; Brou, Kouakou; Amani, Georges; Thonart, Philippe; Djè, Koffi Marcelin

    2014-01-01

    Lactic acid bacteria (LAB) are the primary microorganisms used to ferment maize-, sorghum- or millet-based foods that are processed in West Africa. Fermentation contributes to desirable changes in taste, flavour, acidity, digestibility and texture in gruels (ogi, baca, dalaki), doughs (agidi, banku, komé) or steam-cooked granulated products (arraw, ciacry, dégué). Similar to other fermented cereal foods that are available in Africa, these products suffer from inconsistent quality. The use of LAB starter cultures during cereal dough fermentation is a subject of increasing interest in efforts to standardise this step and guarantee product uniformity. However, their use by small-scale processing units or small agro-food industrial enterprises is still limited. This review aims to illustrate and discuss major issues that influence the use of LAB starter cultures during the processing of fermented cereal foods in West Africa. PMID:27073601

  19. p53 on the crossroad between regeneration and cancer.

    PubMed

    Charni, Meital; Aloni-Grinstein, Ronit; Molchadsky, Alina; Rotter, Varda

    2017-01-01

    Regeneration and tumorigenesis share common molecular pathways; nevertheless, the outcome of regeneration is life, whereas tumorigenesis leads to death. Although the process of regeneration is strictly controlled, malignant transformation is unrestrained. In this review, we discuss the involvement of TP53, the major tumor-suppressor gene, in the regeneration process. We point to the role of p53 as a coordinator ensuring that regeneration does not shift to carcinogenesis. The fluctuation in p53 activity during the regeneration process permits tight control. On the one hand, its inhibition at the initial stages allows massive proliferation; on the other hand, its induction at advanced steps of regeneration is essential for preserving the robustness and fidelity of the regeneration process. A better understanding of the role of p53 in the regulation of regeneration may open new opportunities for the implementation of TP53-based therapies, currently available for cancer patients, in regenerative medicine.

  20. In situ encapsulation of liquids by means of crystallization

    NASA Astrophysics Data System (ADS)

    Hartwig, Anne; Ulrich, Joachim

    2017-07-01

    Owing to its few process steps, the in situ encapsulation process is an innovative and cost-effective alternative to common encapsulation techniques. It combines the well-known processes of pastillation and crystallization. The concept is demonstrated with case studies of three xylitol capsules that vary in composition and size. Knowledge of the solubility of the components was shown to be essential for determining suitable production conditions. The application of seed crystals and the temperatures during the process have major effects on capsule quality. Fast crystallization of the capsules results in an unstable shell. With increasing layer thickness of the shell, the crushing force that must be applied to break the capsules increases as well. However, the stability relative to the capsule size decreases with increasing diameter, even though layer thickness and crushing force also increase.

  1. Process development of starch hydrolysis using mixing characteristics of Taylor vortices.

    PubMed

    Masuda, Hayato; Horie, Takafumi; Hubacz, Robert; Ohmura, Naoto; Shimoyamada, Makoto

    2017-04-01

    In the food industry, enzymatic starch hydrolysis is an important process that consists of two steps: gelatinization and saccharification. One of the major difficulties in designing the starch hydrolysis process is the sharp change in its rheological properties. In this study, a Taylor-Couette flow reactor was applied to a continuous starch hydrolysis process. The concentration of reducing sugar produced via enzymatic hydrolysis was evaluated by varying the operational variables: rotational speed of the inner cylinder, axial velocity (reaction time), amount of enzyme, and initial starch content in the slurry. When Taylor vortices were formed in the annular space, efficient hydrolysis occurred because the vortices improved the mixing of gelatinized starch with the enzyme. Furthermore, a modified inner cylinder was proposed, and its mixing performance was numerically investigated. The modified inner cylinder showed higher potential for enhanced mixing of gelatinized starch and the enzyme than the conventional cylinder.

  2. Genetically Engineered Materials for Biofuels Production

    NASA Astrophysics Data System (ADS)

    Raab, Michael

    2012-02-01

    Agrivida, Inc., is an agricultural biotechnology company developing industrial crop feedstocks for the fuel and chemical industries. Agrivida's crops have improved processing traits that enable efficient, low cost conversion of the crops' cellulosic components into fermentable sugars. Currently, pretreatment and enzymatic conversion of the major cell wall components, cellulose and hemicellulose, into fermentable sugars is the most expensive processing step that prevents widespread adoption of biomass in biofuels processes. To lower production costs we are consolidating pretreatment and enzyme production within the crop. In this strategy, transgenic plants express engineered cell wall degrading enzymes in an inactive form, which can be reactivated after harvest. We have engineered protein elements that disrupt enzyme activity during normal plant growth. Upon exposure to specific processing conditions, the engineered enzymes are converted into their active forms. This mechanism significantly lowers pretreatment costs and enzyme loadings (>75% reduction) below those currently available to the industry.

  3. Flexible active-matrix displays and shift registers based on solution-processed organic transistors.

    PubMed

    Gelinck, Gerwin H; Huitema, H Edzer A; van Veenendaal, Erik; Cantatore, Eugenio; Schrijnemakers, Laurens; van der Putten, Jan B P H; Geuns, Tom C T; Beenhakkers, Monique; Giesbers, Jacobus B; Huisman, Bart-Hendrik; Meijer, Eduard J; Benito, Estrella Mena; Touwslager, Fred J; Marsman, Albert W; van Rens, Bas J E; de Leeuw, Dago M

    2004-02-01

    At present, flexible displays are an important focus of research. Further development of large, flexible displays requires a cost-effective manufacturing process for the active-matrix backplane, which contains one transistor per pixel. One way to further reduce costs is to integrate (part of) the display drive circuitry, such as row shift registers, directly on the display substrate. Here, we demonstrate flexible active-matrix monochrome electrophoretic displays based on solution-processed organic transistors on 25-microm-thick polyimide substrates. The displays can be bent to a radius of 1 cm without significant loss in performance. Using the same process flow we prepared row shift registers. With 1,888 transistors, these are the largest organic integrated circuits reported to date. More importantly, the operating frequency of 5 kHz is sufficiently high to allow integration with the display operating at video speed. This work therefore represents a major step towards 'system-on-plastic'.

  4. Structure and Function of the 26S Proteasome.

    PubMed

    Bard, Jared A M; Goodall, Ellen A; Greene, Eric R; Jonsson, Erik; Dong, Ken C; Martin, Andreas

    2018-06-20

    As the endpoint for the ubiquitin-proteasome system, the 26S proteasome is the principal proteolytic machine responsible for regulated protein degradation in eukaryotic cells. The proteasome's cellular functions range from general protein homeostasis and stress response to the control of vital processes such as cell division and signal transduction. To reliably process all the proteins presented to it in the complex cellular environment, the proteasome must combine high promiscuity with exceptional substrate selectivity. Recent structural and biochemical studies have shed new light on the many steps involved in proteasomal substrate processing, including recognition, deubiquitination, and ATP-driven translocation and unfolding. In addition, these studies revealed a complex conformational landscape that ensures proper substrate selection before the proteasome commits to processive degradation. These advances in our understanding of the proteasome's intricate machinery set the stage for future studies on how the proteasome functions as a major regulator of the eukaryotic proteome.

  5. Fabrication of magnetic tunnel junctions connected through a continuous free layer to enable spin logic devices

    NASA Astrophysics Data System (ADS)

    Wan, Danny; Manfrini, Mauricio; Vaysset, Adrien; Souriau, Laurent; Wouters, Lennaert; Thiam, Arame; Raymenants, Eline; Sayan, Safak; Jussot, Julien; Swerts, Johan; Couet, Sebastien; Rassoul, Nouredine; Babaei Gavan, Khashayar; Paredis, Kristof; Huyghebaert, Cedric; Ercken, Monique; Wilson, Christopher J.; Mocuta, Dan; Radu, Iuliana P.

    2018-04-01

    Magnetic tunnel junctions (MTJs) interconnected via a continuous ferromagnetic free layer were fabricated for spin torque majority gate (STMG) logic. The MTJs are biased independently and show magnetoelectric response under spin transfer torque. The electrical control of these devices paves the way to future spin logic devices based on domain wall (DW) motion. In particular, it is a significant step towards the realization of a majority gate. To our knowledge, this is the first fabrication of a cross-shaped free layer shared by several perpendicular MTJs. The fabrication process can be generalized to any geometry and any number of MTJs. Thus, this framework can be applied to other spin logic concepts based on magnetic interconnect. Moreover, it allows exploration of spin dynamics for logic applications.

  6. Dimensional measuring techniques in the automotive and aircraft industry

    NASA Astrophysics Data System (ADS)

    Muench, K. H.; Baertlein, Hugh

    1994-03-01

    Optical tooling methods used in industry are rapidly being replaced by new electronic sensor techniques. The impact of new measuring technologies on the production process has caused major changes on the industrial shop floor as well as within industrial measurement systems. The paper deals with one particular industrial measuring system, the manual theodolite measuring system (TMS), within the aircraft and automobile industry. With TMS, setup, data capture, and data analysis are flexible enough to suit industry's demands regarding speed, accuracy, and mobility. Examples show the efficiency and the wide range of TMS applications. The Video Theodolite System (VTS) was developed in cooperation with industry; its origin, functions, capabilities, and future plans are briefly described. With the VTS, a major step has been taken toward vision systems for industrial applications.

  7. Ammonia oxidation: Ecology, physiology, biochemistry and why they must all come together.

    PubMed

    Lehtovirta-Morley, Laura E

    2018-05-01

    Ammonia oxidation is a fundamental core process in the global biogeochemical nitrogen cycle. Oxidation of ammonia (NH3) to nitrite (NO2 -) is the first and rate-limiting step in nitrification and is carried out by distinct groups of microorganisms. Ammonia oxidation is essential for nutrient turnover in most terrestrial, aquatic and engineered ecosystems and plays a major role, both directly and indirectly, in greenhouse gas production and environmental damage. Although ammonia oxidation has been studied for over a century, this research field has been galvanised in the past decade by the surprising discoveries of novel ammonia oxidising microorganisms. This review reflects on the ammonia oxidation research to date and discusses the major gaps remaining in our knowledge of the biology of ammonia oxidation.

  8. Limits of acceptable change and natural resources planning: when is LAC useful, when is it not?

    Treesearch

    David N. Cole; Stephen F. McCool

    1997-01-01

    There are ways to improve the LAC process and its implementational procedures. One significant procedural modification is the addition of a new step. This step — which becomes the first step in the process — involves more explicitly defining goals and desired conditions. For other steps in the process, clarifications of concept and terminology are advanced, as are...

  9. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.

    PubMed

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for the pre-processing of experimental movement data for their use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a freely available toolbox that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling the full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, a complete implementation of the pre-processing procedures, its simple extensibility, the available user interfaces, and its free availability can boost the translation of neuromusculoskeletal methods into daily clinical practice.

  10. Biosorption of Pb (II) from aqueous solution by extracellular polymeric substances extracted from Klebsiella sp. J1: Adsorption behavior and mechanism assessment

    NASA Astrophysics Data System (ADS)

    Wei, Wei; Wang, Qilin; Li, Ang; Yang, Jixian; Ma, Fang; Pi, Shanshan; Wu, Dan

    2016-08-01

    The adsorption performance and mechanism of extracellular polymeric substances (EPS) extracted from Klebsiella sp. J1 for soluble Pb (II) were investigated. The maximum biosorption capacity of EPS for Pb (II) was found to be 99.5 mg g-1 at pH 6.0 and EPS concentration of 0.2 g/L. The data for adsorption process satisfactorily fitted to both Langmuir isotherm and pseudo-second order kinetic model. The mean free energy E and activation energy Ea were determined at 8.22- 8.98 kJ mol-1 and 42.46 kJ mol-1, respectively. The liquid-film diffusion step might be the rate-limiting step. The thermodynamic parameters (ΔGo, ΔHo and ΔSo) revealed that the adsorption process was spontaneous and exothermic under natural conditions. The interactions between EPS system and Pb (II) ions were investigated by qualitative analysis methods (i.e Zeta potential, FT-IR and EDAX). Based on the strong experimental evidence from the mass balance of the related elements participating in the sorption process, an ion exchange process was identified quantitatively as the major mechanism responsible for Pb (II) adsorption by EPS. Molar equivalents of both K+ and Mg2+ could be exchanged with Pb2+ molar equivalents in the process and the contribution rate of ion exchange to adsorption accounted for 85.72% (Δmequiv = -0.000541).
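
    For readers unfamiliar with the fitting step, a minimal sketch of a Langmuir-isotherm fit with a standard nonlinear least-squares routine is shown below; the equilibrium data, noise level, and initial guesses are synthetic and are not the study's measurements.

        # Illustrative Langmuir isotherm fit (synthetic data, not the study's measurements):
        # q = q_max * K_L * C / (1 + K_L * C), with q in mg/g and C in mg/L.
        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(c, q_max, k_l):
            return q_max * k_l * c / (1.0 + k_l * c)

        c_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])      # equilibrium concentrations
        q_eq = langmuir(c_eq, 99.5, 0.05) + np.random.default_rng(1).normal(0, 1.5, c_eq.size)

        (q_max_fit, k_l_fit), _ = curve_fit(langmuir, c_eq, q_eq, p0=(80.0, 0.01))
        print(q_max_fit, k_l_fit)   # recovered maximum capacity and affinity constant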

  11. Biosorption of Pb (II) from aqueous solution by extracellular polymeric substances extracted from Klebsiella sp. J1: Adsorption behavior and mechanism assessment

    PubMed Central

    Wei, Wei; Wang, Qilin; Li, Ang; Yang, Jixian; Ma, Fang; Pi, Shanshan; Wu, Dan

    2016-01-01

    The adsorption performance and mechanism of extracellular polymeric substances (EPS) extracted from Klebsiella sp. J1 for soluble Pb (II) were investigated. The maximum biosorption capacity of EPS for Pb (II) was found to be 99.5 mg g−1 at pH 6.0 and EPS concentration of 0.2 g/L. The data for adsorption process satisfactorily fitted to both Langmuir isotherm and pseudo-second order kinetic model. The mean free energy E and activation energy Ea were determined at 8.22– 8.98 kJ mol−1 and 42.46 kJ mol−1, respectively. The liquid-film diffusion step might be the rate-limiting step. The thermodynamic parameters (ΔGo, ΔHo and ΔSo) revealed that the adsorption process was spontaneous and exothermic under natural conditions. The interactions between EPS system and Pb (II) ions were investigated by qualitative analysis methods (i.e Zeta potential, FT-IR and EDAX). Based on the strong experimental evidence from the mass balance of the related elements participating in the sorption process, an ion exchange process was identified quantitatively as the major mechanism responsible for Pb (II) adsorption by EPS. Molar equivalents of both K+ and Mg2+ could be exchanged with Pb2+ molar equivalents in the process and the contribution rate of ion exchange to adsorption accounted for 85.72% (Δmequiv = −0.000541). PMID:27514493

  12. Role of step stiffness and kinks in the relaxation of vicinal (001) with zigzag [110] steps

    NASA Astrophysics Data System (ADS)

    Mahjoub, B.; Hamouda, Ajmi BH.; Einstein, TL.

    2017-08-01

    We present a kinetic Monte Carlo study of the relaxation dynamics and steady state configurations of 〈110〉 steps on a vicinal (001) simple cubic surface. This system is interesting because 〈110〉 (fully kinked) steps have different elementary excitation energetics and favor step diffusion more than 〈100〉 (nominally straight) steps. In this study we show how this leads to different relaxation dynamics as well as to different steady state configurations, including the finding that 2-bond-breaking processes are rate determining for 〈110〉 steps, in contrast to the 3-bond-breaking processes for 〈100〉 steps found in previous work [Surface Sci. 602, 3569 (2008)]. The analysis of the terrace-width distribution (TWD) shows a significant role of kink-generation-annihilation processes during the relaxation of steps: the kinetics of relaxation toward the steady state are much faster in the case of 〈110〉 zigzag steps, with a higher standard deviation of the TWD, in agreement with the decrease in step stiffness due to orientation. We conclude that smaller step stiffness leads inexorably to faster step dynamics towards the steady state. The step-edge anisotropy slows the relaxation of steps and increases the strength of the effective step-step interactions.

  13. A multistage gene normalization system integrating multiple effective methods.

    PubMed

    Li, Lishuang; Liu, Shanshan; Li, Lihua; Fan, Wenting; Huang, Degen; Zhou, Huiwei

    2013-01-01

    Gene/protein recognition and normalization is an important preliminary step for many biological text mining tasks. In this paper, we present a multistage gene normalization system which consists of four major subtasks: pre-processing, dictionary matching, ambiguity resolution and filtering. For the first subtask, we apply the gene mention tagger developed in our earlier work, which achieves an F-score of 88.42% on the BioCreative II GM testing set. In the stage of dictionary matching, the exact matching and approximate matching between gene names and the EntrezGene lexicon have been combined. For the ambiguity resolution subtask, we propose a semantic similarity disambiguation method based on Munkres' Assignment Algorithm. At the last step, a filter based on Wikipedia has been built to remove the false positives. Experimental results show that the presented system can achieve an F-score of 90.1%, outperforming most of the state-of-the-art systems.
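
    The ambiguity-resolution step can be illustrated with the Munkres (Hungarian) assignment algorithm as implemented in scipy.optimize.linear_sum_assignment; the mentions, candidate identifiers, and similarity scores below are invented for illustration, whereas the actual system derives its scores from semantic context.

        # Illustrative use of the Munkres (Hungarian) assignment algorithm to pair gene
        # mentions with dictionary candidates by maximizing semantic similarity.
        # Scores are made up; the real system computes them from contextual evidence.
        import numpy as np
        from scipy.optimize import linear_sum_assignment

        mentions = ["p53", "TNF", "IL-2"]
        candidates = ["TP53 (7157)", "TNF (7124)", "IL2 (3558)"]
        similarity = np.array([
            [0.92, 0.10, 0.05],
            [0.08, 0.88, 0.12],
            [0.04, 0.15, 0.90],
        ])

        rows, cols = linear_sum_assignment(similarity, maximize=True)  # one-to-one assignment
        for r, c in zip(rows, cols):
            print(mentions[r], "->", candidates[c], similarity[r, c])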

  14. Rock climbing: A local-global algorithm to compute minimum energy and minimum free energy pathways.

    PubMed

    Templeton, Clark; Chen, Szu-Hua; Fathizadeh, Arman; Elber, Ron

    2017-10-21

    The calculation of minimum energy or minimum free energy paths is an important step in the quantitative and qualitative studies of chemical and physical processes. The computations of these coordinates present a significant challenge and have attracted considerable theoretical and computational interest. Here we present a new local-global approach to study reaction coordinates, based on a gradual optimization of an action. Like other global algorithms, it provides a path between known reactants and products, but it uses a local algorithm to extend the current path in small steps. The local-global approach does not require an initial guess to the path, a major challenge for global pathway finders. Finally, it provides an exact answer (the steepest descent path) at the end of the calculations. Numerical examples are provided for the Mueller potential and for a conformational transition in a solvated ring system.
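
    As a point of reference (not the paper's local-global algorithm), a steepest-descent trace on the Mueller-Brown test potential, using its standard published parameters, can be written in a few lines; the starting point and step size are assumptions chosen only for illustration.

        # Illustrative steepest-descent trace on the Mueller-Brown test potential
        # (standard published parameters). This is only the reference object such
        # path-finding methods are tested on, not the paper's algorithm.
        import numpy as np

        A  = np.array([-200.0, -100.0, -170.0, 15.0])
        a  = np.array([-1.0, -1.0, -6.5, 0.7])
        b  = np.array([0.0, 0.0, 11.0, 0.6])
        c  = np.array([-10.0, -10.0, -6.5, 0.7])
        x0 = np.array([1.0, 0.0, -0.5, -1.0])
        y0 = np.array([0.0, 0.5, 1.5, 1.0])

        def potential(x, y):
            dx, dy = x - x0, y - y0
            return np.sum(A * np.exp(a * dx**2 + b * dx * dy + c * dy**2))

        def gradient(x, y, h=1e-5):
            gx = (potential(x + h, y) - potential(x - h, y)) / (2 * h)
            gy = (potential(x, y + h) - potential(x, y - h)) / (2 * h)
            return np.array([gx, gy])

        p = np.array([-0.82, 0.62])           # near a saddle of the potential (assumed start)
        path = [p.copy()]
        for _ in range(3000):                 # small fixed-step descent toward an adjacent minimum
            p = p - 1e-4 * gradient(*p)
            path.append(p.copy())
        print(path[-1], potential(*path[-1]))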

  15. Respiratory assessment in critical care units.

    PubMed

    Cox, C L; McGrath, A

    1999-08-01

    As healthcare delivery changes in critical care, nursing continues to extend its practice base. Nursing practice is expanding to incorporate skills once seen as the remit of the medical profession. Critical care nurses are equipping themselves with evidence-based knowledge and skills that can enhance the care they provide to their patients. Assessment of patients is a major role in nursing and, by expanding assessment techniques, nurses can ensure patients receive the care most appropriate to their needs. Nurses in critical care are well placed to perform a more detailed assessment which can help to focus nursing care. This article describes the step-by-step process of undertaking a full and comprehensive respiratory assessment in critical care settings. It identifies many of the problems that patients may have and the signs and symptoms that a nurse may note whilst undertaking the assessment and preparing to prescribe care.

  16. Report of the In Situ Resources Utilization Workshop

    NASA Technical Reports Server (NTRS)

    Fairchild, Kyle (Editor); Mendell, Wendell W. (Editor)

    1988-01-01

    The results of a workshop of 50 representatives from the public and private sector which investigated the potential joint development of the key technologies and mechanisms that will enable the permanent habitation of space are presented. The workshop is an initial step to develop a joint public/private assessment of new technology requirements of future space options, to share knowledge on required technologies that may exist in the private sector, and to investigate potential joint technology development opportunities. The majority of the material was produced in 5 working groups: (1) Construction, Assembly, Automation and Robotics; (2) Prospecting, Mining, and Surface Transportation; (3) Biosystems and Life Support; (4) Materials Processing; and (5) Innovative Ventures. In addition to the results of the working groups, preliminary technology development recommendations to assist in near-term development priority decisions are presented. Finally, steps are outlined for potential new future activities and relationships among the public, private, and academic sectors.

  17. A Method for Transforming Existing Web Service Descriptions into an Enhanced Semantic Web Service Framework

    NASA Astrophysics Data System (ADS)

    Du, Xiaofeng; Song, William; Munro, Malcolm

    Web Services, as a new distributed system technology, have been widely adopted by industry in areas such as enterprise application integration (EAI), business process management (BPM), and virtual organisation (VO). However, the lack of semantics in the current Web Service standards has been a major barrier to service discovery and composition. In this chapter, we propose an enhanced context-based semantic service description framework (CbSSDF+) that tackles the problem and improves the flexibility of service discovery and the correctness of generated composite services. We also provide an agile transformation method to demonstrate how the various formats of Web Service descriptions on the Web can be managed and renovated step by step into CbSSDF+-based service descriptions without a large amount of engineering work. At the end of the chapter, we evaluate the applicability of the transformation method and the effectiveness of CbSSDF+ through a series of experiments.

  18. Students Targeting Engineering and Physical Science (STEPS) at California State University Northridge (CSUN):Activities and Outcomes 2011-2016

    NASA Astrophysics Data System (ADS)

    Cadavid, A. C.; Pedone, V. A.; Horn, W.; Rich, H.

    2016-12-01

    The specific goal of STEPS at CSUN is to increase the number of bachelor's degrees in STEM majors, particularly those in engineering, computer science, mathematics and the physical sciences. Prior to STEPS, only 33% of first-time freshmen in these majors graduated from CSUN within 6-7 years. We employ two main strategies: 1) fostering success in lower-division mathematics for freshmen and sophomores, 2) a Summer Interdisciplinary Team Experience (SITE) for students transitioning to junior level courses. To improve success in mathematics, we have advanced initial placements in the foundational mathematics sequence by one or two semesters through improvements in the placement test (6-7% improvement) and have increased the first-time pass rate in foundational math courses through mandatory supplementary laboratories for at-risk students. Students who successfully complete the supplemental laboratories pass the lecture class at a higher rate than the total population of at-risk students (65% compared to 61%). Both approaches have been institutionalized. SITE targets students entering their junior years in a 3-week interdisciplinary team project that highlights problem solving and hands-on activities. Survey results of the 233 participants show that SITE: 1) maintained or increased desire to earn a STEM degree, 2) increased positive attitudes toward team-based problem solving, 3) increased understanding of how they will use their major in a career, and 4) increased interest in faculty-mentored research and industry internships. Our 5-year program is nearing completion and shows success in meeting our goal. We have measured a 9 percentage point increase in the pass rate of Calculus I for post-STEPS cohorts compared to pre-STEPS cohorts. Failure to pass Calculus is a leading cause of non-completion of the majors targeted by STEPS. We have analyzed the graduation rates of two pre-STEPS cohorts that have had over 6 years to graduate. Both have a graduation rate of 28%. We expect that the 9 percentage point increase in calculus passers will lead to a comparable increase in graduation rate, resulting in a 37% graduation rate for the post-STEPS cohorts.

  19. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, working closely with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, the covariances associated with input estimates, and the calculation of the measurement result. With this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
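
    As a minimal illustration of the GUM machinery referred to above, the sketch below propagates uncorrelated relative standard uncertainties through a purely multiplicative lumen-measurement model and applies a coverage factor of k = 2 for the expanded uncertainty. The input components and their values are invented for the example and are not taken from the CALiPER report or its spreadsheet.

```python
"""Illustrative GUM-style uncertainty budget (all values invented).

For a purely multiplicative measurement model the relative combined standard
uncertainty is the root sum of squares of the relative standard uncertainties
of the (assumed uncorrelated) inputs; the expanded uncertainty applies a
coverage factor k (k = 2 for roughly 95 % coverage).
"""
import math

# (component, relative standard uncertainty) -- placeholders, not CALiPER data
components = [
    ("reference-lamp calibration", 0.008),
    ("detector linearity",         0.002),
    ("self-absorption correction", 0.003),
    ("spectral mismatch",          0.005),
    ("repeatability",              0.004),
]

u_c = math.sqrt(sum(u ** 2 for _, u in components))   # combined standard uncertainty (relative)
k = 2.0                                               # coverage factor
U = k * u_c                                           # expanded uncertainty (relative)

lumens = 1500.0                                       # hypothetical measured luminous flux (lm)
print(f"combined standard uncertainty: {100 * u_c:.2f} %")
print(f"expanded uncertainty (k=2): {100 * U:.2f} %  ->  {lumens:.0f} lm +/- {lumens * U:.0f} lm")
```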

  20. Ten steps to successful software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, R. K.

    2003-01-01

    This paper identifies ten steps for managing change that address organizational and cultural issues. Four of these steps are critical; if they are not done, failure is almost guaranteed. This ten-step program emphasizes the alignment of business goals, change process goals, and the work performed by the employees of an organization.

  1. Bistatic SAR: Signal Processing and Image Formation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahl, Daniel E.; Yocky, David A.

    This report describes the significant processing steps that were used to take the raw recorded digitized signals from the bistatic synthetic aperture RADAR (SAR) hardware built for the NCNS Bistatic SAR project to a final bistatic SAR image. In general, the process steps herein are applicable to bistatic SAR signals that include the direct-path signal and the reflected signal. The steps include preprocessing, data extraction to form a phase history and, finally, image formation. Various plots and values will be shown at most steps to illustrate the processing for a bistatic COSMO SkyMed collection gathered on June 10, 2013 on Kirtland Air Force Base, New Mexico.

  2. Nonlinear Response of Layer Growth Dynamics in the Mixed Kinetics-Bulk-Transport Regime

    NASA Technical Reports Server (NTRS)

    Vekilov, Peter G.; Alexander, J. Iwan D.; Rosenberger, Franz

    1996-01-01

    In situ high-resolution interferometry on horizontal facets of the protein lysozyme reveals that the local growth rate R, vicinal slope p, and tangential (step) velocity v fluctuate by up to 80% of their average values. The time scale of these fluctuations, which occur under steady bulk transport conditions through the formation and decay of step bunches (macrosteps), is of the order of 10 min. The fluctuation amplitude of R increases with growth rate (supersaturation) and crystal size, while the amplitude of the v and p fluctuations changes relatively little. Based on a stability analysis for equidistant step trains in the mixed transport-interface-kinetics regime, we argue that the fluctuations originate from the coupling of bulk transport with nonlinear interface kinetics. Furthermore, step bunches moving across the interface in the direction of or opposite to the buoyancy-driven convective flow increase or decrease in height, respectively. This is in agreement with analytical treatments of the interaction of moving steps with solution flow. Major excursions in growth rate are associated with the formation of lattice defects (striations). We show that, in general, the system-dependent kinetic Peclet number, Pe(sub k), i.e., the relative weight of bulk transport and interface kinetics in the control of the growth process, governs the step bunching dynamics. Since Pe(sub k) can be modified by either forced solution flow or suppression of buoyancy-driven convection under reduced gravity, this model provides a rationale for the choice of specific transport conditions to minimize the formation of compositional inhomogeneities under steady bulk nutrient crystallization conditions.

  3. Effective virus inactivation and removal by steps of Biotest Pharmaceuticals IGIV production process

    PubMed Central

    Dichtelmüller, Herbert O.; Flechsig, Eckhard; Sananes, Frank; Kretschmar, Michael; Dougherty, Christopher J.

    2012-01-01

    The virus validation of three steps of the Biotest Pharmaceuticals IGIV production process is described here. The steps validated are precipitation and removal of fraction III of the cold ethanol fractionation process, solvent/detergent treatment, and 35 nm virus filtration. Virus validation was performed under combined worst-case conditions. These validated steps achieve sufficient virus inactivation/removal, resulting in a virus-safe product. PMID:24371563

  4. Developing stepped care treatment for depression (STEPS): study protocol for a pilot randomised controlled trial.

    PubMed

    Hill, Jacqueline J; Kuyken, Willem; Richards, David A

    2014-11-20

    Stepped care is recommended and implemented as a means to organise depression treatment. Compared with alternative systems, it is assumed to achieve equivalent clinical effects and greater efficiency. However, no trials have examined these assumptions. A fully powered trial of stepped care compared with intensive psychological therapy is required but a number of methodological and procedural uncertainties associated with the conduct of a large trial need to be addressed first. STEPS (Developing stepped care treatment for depression) is a mixed methods study to address uncertainties associated with a large-scale evaluation of stepped care compared with high-intensity psychological therapy alone for the treatment of depression. We will conduct a pilot randomised controlled trial with an embedded process study. Quantitative trial data on recruitment, retention and the pathway of patients through treatment will be used to assess feasibility. Outcome data on the effects of stepped care compared with high-intensity therapy alone will inform a sample size calculation for a definitive trial. Qualitative interviews will be undertaken to explore what people think of our trial methods and procedures and the stepped care intervention. A minimum of 60 patients with Major Depressive Disorder will be recruited from an Improving Access to Psychological Therapies service and randomly allocated to receive stepped care or intensive psychological therapy alone. All treatments will be delivered at clinic facilities within the University of Exeter. Quantitative patient-related data on depressive symptoms, worry and anxiety and quality of life will be collected at baseline and 6 months. The pilot trial and interviews will be undertaken concurrently. Quantitative and qualitative data will be analysed separately and then integrated. The outcomes of this study will inform the design of a fully powered randomised controlled trial to evaluate the effectiveness and efficiency of stepped care. Qualitative data on stepped care will be of immediate interest to patients, clinicians, service managers, policy makers and guideline developers. A more informed understanding of the feasibility of a large trial will be obtained than would be possible from a purely quantitative (or qualitative) design. Current Controlled Trials ISRCTN66346646 registered on 2 July 2014.

  5. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

    This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by implementing interrupted uniaxial tensile testing experiments. Evolution of the mechanical properties as well as the microstructural features, such as twins and textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of austenite. For the SIM, an Olson-Cohen-type equation was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured and weakened with increasing deformation.
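
    The Olson-Cohen form referred to above relates the strain-induced martensite fraction to true strain through two kinetic parameters and a fixed exponent. The small sketch below evaluates that equation; the parameter values are generic placeholders, not the coefficients fitted for SS 321 in the paper.

```python
"""Olson-Cohen-type kinetics for strain-induced martensite (illustrative).

f(eps) = 1 - exp(-beta * (1 - exp(-alpha * eps))**n), where alpha describes
shear-band formation, beta the probability that a band intersection nucleates
martensite, and n is a fixed exponent (often around 4.5).  The parameter
values below are placeholders, not the coefficients fitted for SS 321.
"""
import math

def olson_cohen(eps, alpha=6.0, beta=2.0, n=4.5):
    return 1.0 - math.exp(-beta * (1.0 - math.exp(-alpha * eps)) ** n)

for eps in (0.05, 0.10, 0.20, 0.30, 0.40):
    print(f"true strain {eps:.2f}: martensite fraction {olson_cohen(eps):.3f}")
```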

  6. Synthesis and Characterization of Antireflective ZnO Nanoparticles Coatings Used for Energy Improving Efficiency of Silicone Solar Cells

    NASA Astrophysics Data System (ADS)

    Pîslaru-Dănescu, Lucian; Chitanu, Elena; El-Leathey, Lucia-Andreea; Marinescu, Virgil; Marin, Dorian; Sbârcea, Beatrice-Gabriela

    2018-05-01

    The paper proposes a new and complex process for the synthesis of ZnO nanoparticles for antireflective coatings in silicone solar cell applications. The process consists of two major steps: preparation of a seed layer and hydrothermal growth of ZnO nanoparticles. Because the seed layer morphology influences the properties of the ZnO nanoparticles, optimization of the seed layer preparation step is necessary. Following the hydrothermal growth of the ZnO nanoparticles, the antireflective coating of the solar cells is achieved. After determining the functional parameters of the solar cells provided either with glass or with ZnO, it is concluded that all parameter values are superior in the case of solar cells with the ZnO antireflection coating and increase along with the solar irradiance.

  7. Examining the Association between Patient-Reported Symptoms of Attention and Memory Dysfunction with Objective Cognitive Performance: A Latent Regression Rasch Model Approach.

    PubMed

    Li, Yuelin; Root, James C; Atkinson, Thomas M; Ahles, Tim A

    2016-06-01

    Patient-reported cognition generally exhibits poor concordance with objectively assessed cognitive performance. In this article, we introduce latent regression Rasch modeling and provide a step-by-step tutorial for applying Rasch methods as an alternative to traditional correlation to better clarify the relationship of self-report and objective cognitive performance. An example analysis using these methods is also included. An introduction to latent regression Rasch modeling is provided together with a tutorial on implementing it using the JAGS programming language for the Bayesian posterior parameter estimates. In an example analysis, data from a longitudinal neurocognitive outcomes study of 132 breast cancer patients and 45 non-cancer matched controls that included self-report and objective performance measures pre- and post-treatment were analyzed using both conventional and latent regression Rasch model approaches. Consistent with previous research, conventional analysis found that correlations between neurocognitive decline and self-reported problems were generally near zero. In contrast, application of latent regression Rasch modeling found statistically reliable associations between objective attention and processing speed measures and self-reported Attention and Memory scores. Latent regression Rasch modeling, together with correlation of specific self-reported cognitive domains with neurocognitive measures, helps to clarify the relationship of self-report with objective performance. While the majority of patients attribute their cognitive difficulties to memory decline, the Rasch modeling suggests the importance of processing speed and initial learning. To encourage the use of this method, a step-by-step guide and programming language for implementation is provided. Implications of this method in cognitive outcomes research are discussed. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
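
    To make the model structure concrete, the sketch below simulates data from a latent regression Rasch model: dichotomous item responses follow a Rasch (one-parameter logistic) form, and the latent trait is regressed on an observed covariate such as an objective processing-speed score. It is a simulation only, not the authors' JAGS code, and all parameter values are invented.

```python
"""Minimal simulation of a latent regression Rasch model (not the authors' JAGS code).

Item responses: P(X_pi = 1) = logistic(theta_p - b_i)
Latent regression: theta_p = gamma * z_p + eps_p,  eps_p ~ N(0, sigma^2),
where z_p stands in for an objective measure such as a processing-speed score.
All parameter values are invented.
"""
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items = 200, 10
gamma, sigma = 0.6, 1.0                       # regression weight and residual SD

z = rng.standard_normal(n_persons)            # standardized objective covariate
theta = gamma * z + sigma * rng.standard_normal(n_persons)   # latent trait
b = np.linspace(-1.5, 1.5, n_items)           # item difficulties

p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))     # Rasch response probabilities
x = rng.binomial(1, p)                        # simulated dichotomous item responses

# quick check: sum scores should track the objective covariate through theta
print("corr(sum score, z) =", round(np.corrcoef(x.sum(axis=1), z)[0, 1], 3))
```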

  8. The “Health Coaching” programme: a new patient-centred and visually supported approach for health behaviour change in primary care

    PubMed Central

    2013-01-01

    Background: Health related behaviour is an important determinant of chronic disease, with a high impact on public health. Motivating and assisting people to change their unfavourable health behaviour is thus a major challenge for health professionals. The objective of the study was to develop a structured programme of counselling in primary care practice, and to test its feasibility and acceptance among general practitioners (GPs) and their patients. Methods: Our new concept integrates change of roles, shared responsibility, patient-centredness, and modern communication techniques, such as motivational interviewing. A new colour-coded visual communication tool is used to lead patients through the 4-step counselling process. As doctors’ communication skills are crucial, communication training is a mandatory part of the programme. We tested the feasibility and acceptance of the “Health Coaching” programme with 20 GPs and 1045 patients, using questionnaires and semistructured interviewing techniques. The main outcomes were participation rates; the duration of counselling; patients’ self-rated behavioural change in their areas of choice; and ratings of motivational, conceptual, acceptance, and feasibility issues. Results: In total, 37% (n=350) of the patients enrolled in step 1 completed the entire 4-step counselling process, with each step taking 8–22 minutes. 50% of ratings (n=303) improved by one or two categories in the three-colour circle, and the proportion of favourable health behaviour ratings increased from 9% to 39%. The ratings for motivation, concept, acceptance, and feasibility of the “Health Coaching” programme were consistently high. Conclusions: Our innovative, patient-centred counselling programme for health behaviour change was well accepted and feasible among patients and physicians in a primary care setting. Randomised controlled studies will have to establish cost-effectiveness and promote dissemination. PMID:23865509

  9. The "Health Coaching" programme: a new patient-centred and visually supported approach for health behaviour change in primary care.

    PubMed

    Neuner-Jehle, Stefan; Schmid, Margareta; Grüninger, Ueli

    2013-07-17

    Health related behaviour is an important determinant of chronic disease, with a high impact on public health. Motivating and assisting people to change their unfavourable health behaviour is thus a major challenge for health professionals. The objective of the study was to develop a structured programme of counselling in primary care practice, and to test its feasibility and acceptance among general practitioners (GPs) and their patients. Our new concept integrates change of roles, shared responsibility, patient-centredness, and modern communication techniques, such as motivational interviewing. A new colour-coded visual communication tool is used to lead patients through the 4-step counselling process. As doctors' communication skills are crucial, communication training is a mandatory part of the programme. We tested the feasibility and acceptance of the "Health Coaching" programme with 20 GPs and 1045 patients, using questionnaires and semistructured interviewing techniques. The main outcomes were participation rates; the duration of counselling; patients' self-rated behavioural change in their areas of choice; and ratings of motivational, conceptual, acceptance, and feasibility issues. In total, 37% (n=350) of the patients enrolled in step 1 completed the entire 4-step counselling process, with each step taking 8-22 minutes. 50% of ratings (n=303) improved by one or two categories in the three-colour circle, and the proportion of favourable health behaviour ratings increased from 9% to 39%. The ratings for motivation, concept, acceptance, and feasibility of the "Health Coaching" programme were consistently high. Our innovative, patient-centred counselling programme for health behaviour change was well accepted and feasible among patients and physicians in a primary care setting. Randomised controlled studies will have to establish cost-effectiveness and promote dissemination.

  10. Methanogenic Hydrocarbon Degradation: Evidence from Field and Laboratory Studies.

    PubMed

    Jiménez, Núria; Richnow, Hans H; Vogt, Carsten; Treude, Tina; Krüger, Martin

    2016-01-01

    Microbial transformation of hydrocarbons to methane is an environmentally relevant process taking place in a wide variety of electron acceptor-depleted habitats, from oil reservoirs and coal deposits to contaminated groundwater and deep sediments. Methanogenic hydrocarbon degradation is considered to be a major process in reservoir degradation and one of the main processes responsible for the formation of heavy oil deposits and oil sands. In the absence of external electron acceptors such as oxygen, nitrate, sulfate or Fe(III), fermentation and methanogenesis become the dominant microbial metabolisms. The major end product under these conditions is methane, and the only electron acceptor necessary to sustain the intermediate steps in this process is CO2, which is itself a net product of the overall reaction. We summarize the state of the art and recent advances in methanogenic hydrocarbon degradation research. Both the key microbial groups involved and the metabolic pathways are described, and we discuss the novel insights into methanogenic hydrocarbon-degrading populations studied in laboratory as well as environmental systems enabled by novel cultivation-based and molecular approaches. Their possible implications for energy resources, bioremediation of contaminated sites, deep-biosphere research, and consequences for atmospheric composition and ultimately climate change are also addressed. © 2016 S. Karger AG, Basel.

  11. Podocyte is the major culprit accounting for the progression of chronic renal disease.

    PubMed

    Kriz, Wilhelm

    2002-05-15

    The concept that the podocyte is the major culprit underlying the development and progression of glomerular diseases leading to chronic renal failure is well established. The essential steps in this process are (1) the establishment of tuft adhesions to Bowman's capsule; (2) the formation, by capillaries contained in a tuft adhesion, of a filtrate that is delivered not into Bowman's space but towards the interstitium; and (3) the spreading of this filtrate on the outer aspect of the affected nephron, leading to the degeneration of this nephron. The present review summarizes the pros and cons concerning the relevance of misdirected filtration for a nephron-to-nephron transfer of the disease at the level of the tubular interstitium. Surprisingly, the histopathology clearly shows that interstitial proliferation surrounding degenerating nephrons tends to encapsulate the degenerative process, confining it to the already affected nephron. No evidence is available that the disease, mediated by interstitial proliferation and matrix deposition, may jump to a neighboring, so far unaffected, nephron. It appears that the process that leads to the degeneration of a nephron in the context of "classic" FSGS always starts separately in the respective glomerulus by severe podocyte injury. Copyright 2002 Wiley-Liss, Inc.

  12. Leading trends in environmental regulation that affect energy development. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steele, R V; Attaway, L D; Christerson, J A

    1980-01-01

    Major environmental issues that are likely to affect the implementation of energy technologies between now and the year 2000 are identified and assessed. The energy technologies specifically addressed are: oil recovery and processing; gas recovery and processing; coal liquefaction; coal gasification (surface); in situ coal gasification; direct coal combustion; advanced power systems; magnetohydrodynamics; surface oil shale retorting; true and modified in situ oil shale retorting; geothermal energy; biomass energy conversion; and nuclear power (fission). Environmental analyses of these technologies included, in addition to the main processing steps, the complete fuel cycle from resource extraction to end use. A comprehensive survey of the environmental community (including environmental groups, researchers, and regulatory agencies) was carried out in parallel with an analysis of the technologies to identify important future environmental issues. Each of the final 20 issues selected by the project staff has the following common attributes: consensus of the environmental community that the issue is important; it is a likely candidate for future regulatory action; it deals with a major environmental aspect of energy development. The analyses of the 20 major issues address their environmental problem areas, current regulatory status, and the impact of future regulations. These analyses are followed by a quantitative assessment of the impact on energy costs and nationwide pollutant emissions of possible future regulations. This is accomplished by employing the Strategic Environmental Assessment System (SEAS) for a subset of the 20 major issues. The report concludes with a more general discussion of the impact of environmental regulatory action on energy development.

  13. Common threads in cardiac fibrosis, infarct scar formation, and wound healing.

    PubMed

    Czubryt, Michael P

    2012-11-01

    Wound healing, cardiac fibrosis, and infarct scar development, while possessing distinct features, share a number of key functional similarities, including extracellular matrix synthesis and remodeling by fibroblasts and myofibroblasts. Understanding the underlying mechanisms that are common to these processes may suggest novel therapeutic approaches for pathologic situations such as fibrosis, or defective wound healing such as hypertrophic scarring or keloid formation. This manuscript will briefly review the major steps of wound healing, and will contrast this process with how cardiac infarct scar formation or interstitial fibrosis occurs. The feasibility of targeting common pro-fibrotic growth factor signaling pathways will be discussed. Finally, the potential exploitation of novel regulators of wound healing and fibrosis (ski and scleraxis) will be examined.

  14. fMRI paradigm designing and post-processing tools

    PubMed Central

    James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan

    2014-01-01

    In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions using stimulus delivery systems such as Cogent, E-Prime, and Presentation, along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantages in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages, as well as the statistical analysis principles of General Linear Modeling for the final interpretation of a functional activation result. PMID:24851001
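
    As a pointer to what the General Linear Modeling step involves, the sketch below builds a single task regressor by convolving a block design with a simplified double-gamma HRF and fits one synthetic voxel time series by ordinary least squares. It is a toy illustration, not SPM or Brain Voyager code, and the timing parameters and HRF constants are simplified assumptions.

```python
"""Minimal GLM sketch for one voxel (illustrative; not SPM or Brain Voyager code).

Build a boxcar regressor from hypothetical block onsets, convolve it with a
simplified double-gamma HRF, and fit y = X beta + e by ordinary least squares.
"""
import numpy as np
from scipy.stats import gamma

TR, n_scans = 2.0, 120
t = np.arange(0, 30, TR)
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0     # simplified canonical double-gamma HRF
hrf /= hrf.max()

boxcar = np.zeros(n_scans)
for onset in range(10, n_scans, 40):               # hypothetical 20 s blocks every 80 s
    boxcar[onset:onset + 10] = 1.0

regressor = np.convolve(boxcar, hrf)[:n_scans]     # predicted BOLD response
X = np.column_stack([regressor, np.ones(n_scans)]) # design matrix: task + intercept

rng = np.random.default_rng(1)
y = 2.0 * regressor + rng.standard_normal(n_scans) # synthetic voxel time series

beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # ordinary least squares fit
print("estimated task effect:", round(float(beta[0]), 2))
```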

  15. Process Waste Assessment Machine and Fabrication Shop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, N.M.

    1993-03-01

    This Process Waste Assessment was conducted to evaluate hazardous wastes generated in the Machine and Fabrication Shop at Sandia National Laboratories, Building 913, Room 119. Spent machine coolant is the major hazardous chemical waste generated in this facility. The volume of spent coolant generated is approximately 150 gallons/month. It is sent off-site to a recycler, but a reclaiming system for on-site use is being investigated. The Shop's line management considers hazardous waste minimization very important. A number of steps have already been taken to minimize wastes, including replacement of a hazardous solvent with a biodegradable, non-caustic solution and filtration unit; waste segregation; restriction of beryllium-copper alloy machining; and reduction of lead usage.

  16. Collagen Quantification in Tissue Specimens.

    PubMed

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  17. Simulant Basis for the Standard High Solids Vessel Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Reid A.; Fiskum, Sandra K.; Suffield, Sarah R.

    The Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant and a non-Newtonian simulant be developed that would represent the Most Adverse Design Conditions (in development) with respect to mixing performance as specified by WTP. The majority of the simulant requirements are specified in 24590-PTF-RPT-PE-16-001, Rev. 0. The first step in this process is to develop the basis for these simulants. This document describes the basis for the properties of these two simulant types. The simulant recipes that meet this basis will be provided in a subsequent document.

  18. Coal gasification systems engineering and analysis. Appendix E: Cost estimation and economic evaluation methodology

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The cost estimation and economic evaluation methodologies presented are consistent with industry practice for assessing capital investment requirements and operating costs of coal conversion systems. All values stated are based on January 1980 dollars, with appropriate recognition of the time value of money. Evaluation of project economic feasibility can be considered a two-step process (subject to considerable refinement). First, the costs of the project must be quantified and, second, the price at which the product can be manufactured must be determined. These two major categories are discussed. The summary of methodology is divided into five parts: (1) systems costs, (2) instant plant costs, (3) annual operating costs, (4) the escalation and discounting process, and (5) product pricing.
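
    The two-step logic described above (quantify costs, then find the price at which the product can be manufactured) is essentially a break-even calculation over the plant lifetime. The sketch below shows one simplified version using a uniform-series present-worth factor; all cost, output and discount-rate figures are hypothetical, and the January 1980 dollar and escalation conventions of the report are not modelled.

```python
"""Hedged sketch of the two-step feasibility calculation (placeholder numbers).

Step 1: quantify capital and annual operating costs.
Step 2: find the product price at which discounted revenues just cover the
discounted costs (a simple break-even, required-price calculation).
"""
capital_cost  = 1.2e9     # instant plant cost, $ (hypothetical)
annual_opex   = 1.5e8     # annual operating cost, $/yr (hypothetical)
annual_output = 9.0e7     # annual product output, arbitrary units/yr (hypothetical)
discount_rate, lifetime = 0.10, 20

# present-worth factor for a uniform annual series over the plant lifetime
pw = sum(1.0 / (1.0 + discount_rate) ** n for n in range(1, lifetime + 1))

required_price = (capital_cost + annual_opex * pw) / (annual_output * pw)
print(f"required product price: {required_price:.2f} $/unit")
```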

  19. Artistic creativity, style and brain disorders.

    PubMed

    Bogousslavsky, Julien

    2005-01-01

    The production of novel, motivated or useful material defines creativity, which appears to be one of the higher, specific, human brain functions. While creativity can express itself in virtually any domain, art might particularly well illustrate how creativity may be modulated by the normal or pathological brain. Evidence emphasizes global brain functioning in artistic creativity and output, but critical steps which link perception processing to execution of a work, such as extraction-abstraction, as well as major developments of non-esthetic values attached to art also underline complex activation and inhibition processes mainly localized in the frontal lobe. Neurological diseases in artists provide a unique opportunity to study brain-creativity relationships, in particular through the stylistic changes which may develop after brain lesion. (c) 2005 S. Karger AG, Basel

  20. Wood formation in Angiosperms.

    PubMed

    Déjardin, Annabelle; Laurans, Françoise; Arnaud, Dominique; Breton, Christian; Pilate, Gilles; Leplé, Jean-Charles

    2010-04-01

    Wood formation is a complex biological process, involving five major developmental steps, including (1) cell division from a secondary meristem called the vascular cambium, (2) cell expansion (cell elongation and radial enlargement), (3) secondary cell wall deposition, (4) programmed cell death, and (5) heartwood formation. Thanks to the development of genomic studies in woody species, as well as genetic engineering, recent progress has been made in the understanding of the molecular mechanisms underlying wood formation. In this review, we will focus on two different aspects, the lignification process and the control of microfibril angle in the cell wall of wood fibres, as they are both key features of wood material properties. Copyright (c) 2010 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  1. Separation process using pervaporation and dephlegmation

    DOEpatents

    Vane, Leland M.; Mairal, Anurag P.; Ng, Alvin; Alvarez, Franklin R.; Baker, Richard W.

    2004-06-29

    A process for treating liquids containing organic compounds and water. The process includes a pervaporation step in conjunction with a dephlegmation step to treat at least a portion of the permeate vapor from the pervaporation step. The process yields a membrane residue stream, a stream enriched in the more volatile component (usually the organic) as the overhead stream from the dephlegmator and a condensate stream enriched in the less volatile component (usually the water) as a bottoms stream from the dephlegmator. Any of these may be the principal product of the process. The membrane separation step may also be performed in the vapor phase, or by membrane distillation.

  2. 40 CFR 35.927-3 - Rehabilitation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... minor rehabilitation concurrently with the sewer system evaluation survey in any step under a grant if... consisting of major rehabilitation work may be awarded concurrently with step 2 work for the design of the...

  3. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  4. Environmental impact associated with activated carbon preparation from olive-waste cake via life cycle assessment.

    PubMed

    Hjaila, K; Baccar, R; Sarrà, M; Gasol, C M; Blánquez, P

    2013-11-30

    The life cycle assessment (LCA) environmental tool was implemented to quantify the potential environmental impacts associated with the activated carbon (AC) production process from olive-waste cakes in Tunisia. On the basis of laboratory investigations for AC preparation, a flowchart was developed and the environmental impacts were determined. The LCA functional unit chosen was the production of 1 kg of AC from by-product olive-waste cakes. The results showed that impregnation using H3PO4 presented the highest environmental impacts for the majority of the indicators tested: acidification potential (62%), eutrophication (96%), ozone depletion potential (44%), human toxicity (64%), fresh water aquatic ecotoxicity (90%) and terrestrial ecotoxicity (92%). One of the highest impacts was found to be the global warming potential (11.096 kg CO2 eq/kg AC), which was equally weighted between the steps involving impregnation, pyrolysis, and drying the washed AC. The cumulative energy demand of the AC production process from the by-product olive-waste cakes was 167.63 MJ contributed by impregnation, pyrolysis, and drying the washed AC steps. The use of phosphoric acid and electricity in the AC production were the main factors responsible for the majority of the impacts. If certain modifications are incorporated into the AC production, such as implementing synthesis gas recovery and reusing it as an energy source and recovery of phosphoric acid after AC washing, additional savings could be realized, and environmental impacts could be minimized. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. The thermodynamics of pyrochemical processes for liquid metal reactor fuel cycles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, I.

    1987-01-01

    The thermodynamic basis for pyrochemical processes for the recovery and purification of fuel for the liquid metal reactor fuel cycle is described. These processes involve the transport of the uranium and plutonium from one liquid alloy to another through a molten salt. The processes discussed use liquid alloys of cadmium, zinc, and magnesium and molten chloride salts. The oxidation-reduction steps are done either chemically by the use of an auxiliary redox couple or electrochemically by the use of an external electrical supply. The same basic thermodynamics apply to both the salt transport and the electrotransport processes. Large deviations from ideal solution behavior of the actinides and lanthanides in the liquid alloys have a major influence on the solubilities and the performance of both the salt transport and electrotransport processes. Separation of plutonium and uranium from each other and decontamination from the more noble fission product elements can be achieved using both transport processes. The thermodynamic analysis is used to make process design computations for different process conditions.

  6. Maelstrom Research guidelines for rigorous retrospective data harmonization

    PubMed Central

    Fortier, Isabel; Raina, Parminder; Van den Heuvel, Edwin R; Griffith, Lauren E; Craig, Camille; Saliba, Matilda; Doiron, Dany; Stolk, Ronald P; Knoppers, Bartha M; Ferretti, Vincent; Granda, Peter; Burton, Paul

    2017-01-01

    Background: It is widely accepted and acknowledged that data harmonization is crucial: in its absence, the co-analysis of major tranches of high quality extant data is liable to inefficiency or error. However, despite its widespread practice, no formalized/systematic guidelines exist to ensure high quality retrospective data harmonization. Methods: To better understand real-world harmonization practices and facilitate development of formal guidelines, three interrelated initiatives were undertaken between 2006 and 2015. They included a phone survey with 34 major international research initiatives, a series of workshops with experts, and case studies applying the proposed guidelines. Results: A wide range of projects use retrospective harmonization to support their research activities but even when appropriate approaches are used, the terminologies, procedures, technologies and methods adopted vary markedly. The generic guidelines outlined in this article delineate the essentials required and describe an interdependent step-by-step approach to harmonization: 0) define the research question, objectives and protocol; 1) assemble pre-existing knowledge and select studies; 2) define targeted variables and evaluate harmonization potential; 3) process data; 4) estimate quality of the harmonized dataset(s) generated; and 5) disseminate and preserve final harmonization products. Conclusions: This manuscript provides guidelines aiming to encourage rigorous and effective approaches to harmonization which are comprehensively and transparently documented and straightforward to interpret and implement. This can be seen as a key step towards implementing guiding principles analogous to those that are well recognised as being essential in securing the foundational underpinning of systematic reviews and the meta-analysis of clinical trials. PMID:27272186

  7. Chemical interaction matrix between reagents in a Purex based process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brahman, R.K.; Hennessy, W.P.; Paviet-Hartmann, P.

    2008-07-01

    The United States Department of Energy (DOE) is the responsible entity for the disposal of the United States excess weapons grade plutonium. DOE selected a PUREX-based process to convert plutonium to low-enriched mixed oxide fuel for use in commercial nuclear power plants. To initiate this process in the United States, a Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF) is under construction and will be operated by Shaw AREVA MOX Services at the Savannah River Site. This facility will be licensed and regulated by the U.S. Nuclear Regulatory Commission (NRC). A PUREX process, similar to the one used at La Hague, France, will purify plutonium feedstock through solvent extraction. MFFF employs two major process operations to manufacture MOX fuel assemblies: (1) the Aqueous Polishing (AP) process to remove gallium and other impurities from plutonium feedstock and (2) the MOX fuel fabrication process (MP), which processes the oxides into pellets and manufactures the MOX fuel assemblies. The AP process consists of three major steps, dissolution, purification, and conversion, and is the center of the primary chemical processing. A study of process hazards controls has been initiated that will provide knowledge and protection against the chemical risks associated with the mixing of reagents over the lifetime of the process. This paper presents a comprehensive chemical interaction matrix evaluation for the reagents used in the PUREX-based process. The chemical interaction matrix supplements the process conditions by providing a checklist of any potential inadvertent chemical reactions that may take place. It also identifies the chemical compatibility/incompatibility of the reagents if mixed by failure of operations or equipment within the process itself or mixed inadvertently by a technician in the laboratories.
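
    Conceptually, a chemical interaction matrix is a symmetric reagent-by-reagent table whose off-diagonal cells flag pairs that must not be mixed inadvertently. The toy sketch below shows such a structure; the reagents and hazard notes are generic illustrative examples (e.g., the well-known red-oil concern for TBP in contact with hot concentrated nitric acid) and do not reproduce the MFFF evaluation.

```python
"""Toy chemical interaction matrix (illustrative entries, not a safety basis).

Each off-diagonal cell records whether inadvertent mixing of two reagents is
flagged for review.  The reagents and hazard notes are generic examples.
"""
reagents = ["HNO3 (conc.)", "TBP/dodecane", "hydrazine", "NaOH"]

# flagged pairs (order-independent), e.g. the classic red-oil concern for
# TBP in contact with hot concentrated nitric acid
flagged = {
    frozenset({"HNO3 (conc.)", "TBP/dodecane"}): "red-oil formation at elevated temperature",
    frozenset({"HNO3 (conc.)", "hydrazine"}):    "energetic oxidizer/reducer pair",
    frozenset({"HNO3 (conc.)", "NaOH"}):         "strongly exothermic neutralization",
}

def interaction(a, b):
    return flagged.get(frozenset({a, b}), "no flag recorded")

for i, a in enumerate(reagents):
    for b in reagents[i + 1:]:
        print(f"{a:13s} x {b:13s}: {interaction(a, b)}")
```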

  8. Surface Modified Particles By Multi-Step Addition And Process For The Preparation Thereof

    DOEpatents

    Cook, Ronald Lee; Elliott, Brian John; Luebben, Silvia DeVito; Myers, Andrew William; Smith, Bryan Matthew

    2006-01-17

    The present invention relates to a new class of surface modified particles and to a multi-step surface modification process for the preparation of the same. The multi-step surface functionalization process involves two or more reactions to produce particles that are compatible with various host systems and/or to provide the particles with particular chemical reactivities. The initial step comprises the attachment of a small organic compound to the surface of the inorganic particle. The subsequent steps attach additional compounds to the previously attached organic compounds through organic linking groups.

  9. Users guide to E859 phoswich analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costales, J.B.

    1992-11-30

    In this memo the authors describe the analysis path used to transform the phoswich data from raw data banks into cross sections suitable for publication. The primary purpose of this memo is not to document each analysis step in great detail but rather to point the reader to the fortran code used and to point out the essential features of the analysis path. A flow chart which summarizes the various steps performed to massage the data from beginning to end is given. In general, each step corresponds to a fortran program which was written to perform that particular task. The automation of the data analysis has been kept purposefully minimal in order to ensure the highest quality of the final product. However, tools have been developed which ease the non-automated steps. There are two major parallel routes for the data analysis: data reduction and acceptance determination using detailed GEANT Monte Carlo simulations. In this memo, the authors will first describe the data reduction up to the point where PHAD banks (Pass 1-like banks) are created. They will then describe the steps taken in the GEANT Monte Carlo route. Note that a detailed memo describing the methodology of the acceptance corrections has already been written. Therefore the discussion of the acceptance determination will be kept to a minimum and the reader will be referred to the other memo for further details. Finally, they will describe the cross section formation process and how final spectra are extracted.

  10. Influence of gait speed on stability: recovery from anterior slips and compensatory stepping.

    PubMed

    Bhatt, T; Wening, J D; Pai, Y-C

    2005-02-01

    Falls precipitated by slipping are a major health concern, with the majority of all slip-related falls occurring during gait. Recent evidence shows that a faster and/or more anteriorly positioned center of mass (COM) is more stable against backward balance loss, and that compensatory stepping is the key to recovering stability upon balance loss. The purposes of this paper were to determine whether walking speed affected gait stability for backward balance loss at slip onset and touchdown of compensatory stepping, and whether compensatory stepping response resembled the regular gait pattern. Forty-seven young subjects were slipped unexpectedly either at a self-selected fast, natural or slow speed. Speed-related differences in stability at slip onset and touchdown of the subsequent compensatory step were analyzed using the COM position-velocity state. The results indicate that gait speed highly correlated with stability against backward balance loss at slip onset. The low COM velocity of the slow group was not sufficiently compensated for by a more anteriorly positioned COM associated with a shorter step length at slip onset. At touchdown of the compensatory step, the speed-related differences in stability diminished, due to the continued advantage of anterior COM positioning from a short compensatory step retained by the slow group, coupled with an increase in COM velocity. Compensatory step length and relative COM position altered as a function of gait speed, indicating the motor program for gait regulation may play a role in modulating the compensatory step.
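
    The stability analysis described above works in a normalized centre-of-mass (COM) position-velocity state space. The sketch below computes one common normalization (COM position relative to the base of support divided by foot length, COM velocity divided by the square root of g times body height); the example numbers are invented and the actual stability boundary used in the paper is not reproduced.

```python
"""Normalized COM position-velocity state (illustrative, simplified).

Follows a common normalization in slip studies: anteroposterior COM position
relative to the rear edge of the base of support divided by foot length, and
COM velocity divided by sqrt(g * body height).  The example numbers are
invented; the stability boundary used in the paper is not reproduced.
"""
import math

def com_state(com_x, bos_rear_x, foot_length, com_vx, body_height, g=9.81):
    pos = (com_x - bos_rear_x) / foot_length      # dimensionless COM position
    vel = com_vx / math.sqrt(g * body_height)     # dimensionless COM velocity
    return pos, vel

# hypothetical values at slip onset for a faster and a slower walker
for label, vx in (("fast", 1.5), ("slow", 1.0)):
    pos, vel = com_state(com_x=0.12, bos_rear_x=0.0, foot_length=0.25,
                         com_vx=vx, body_height=1.75)
    print(f"{label}: normalized COM state = ({pos:.2f}, {vel:.2f})")
```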

  11. Data-based control of a multi-step forming process

    NASA Astrophysics Data System (ADS)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. However, in the field of forming technology, the fourth industrial revolution has so far arrived only gradually. In order to make a valuable contribution to the digital factory, the control of a multistage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how an interface between the different forming machines can be designed in practice, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes, which are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for improved process control of the subsequent process. On the basis of the gained scientific knowledge, it is possible to make forming operations more robust and at the same time more flexible, and thus create the foundation for linking various production processes in an efficient way.
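
    As a minimal illustration of controlling a subsequent step from data recorded in the first step, the sketch below fits a linear correction model from past process records and uses it as a feed-forward adjustment of a step-2 set-point. The variable names, units and coefficients are hypothetical and do not come from the investigated process chain.

```python
"""Hedged sketch of data-based feed-forward control between two forming steps.

A feature measured in step 1 (here: sheet thinning) adjusts a set-point of
step 2 (here: blank-holder force) through a linear model identified from past
process records.  All names, units and coefficients are hypothetical.
"""
import numpy as np

# past records: measured thinning after step 1 vs. the step-2 force correction
# that produced an acceptable part (hypothetical data)
thinning   = np.array([0.02, 0.04, 0.05, 0.07, 0.09])    # mm
correction = np.array([0.0, 1.5, 2.2, 3.6, 5.1])          # kN

slope, intercept = np.polyfit(thinning, correction, 1)    # identify a linear model

def step2_setpoint(measured_thinning, nominal_force=120.0):
    """Adapted blank-holder force (kN) for the subsequent process step."""
    return nominal_force + slope * measured_thinning + intercept

print(f"adapted force for 0.06 mm thinning: {step2_setpoint(0.06):.1f} kN")
```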

  12. Mechanical, thermal and morphological characterization of polycarbonate/oxidized carbon nanofiber composites produced with a lean 2-step manufacturing process.

    PubMed

    Lively, Brooks; Kumar, Sandeep; Tian, Liu; Li, Bin; Zhong, Wei-Hong

    2011-05-01

    In this study we report the advantages of a 2-step method that incorporates an additional process pre-conditioning step for rapid and precise blending of the constituents prior to the commonly used melt compounding method for preparing polycarbonate/oxidized carbon nanofiber composites. This additional step (equivalent to a manufacturing cell) involves the formation of a highly concentrated solid nano-nectar of polycarbonate/carbon nanofiber composite using a solution mixing process followed by melt mixing with pure polycarbonate. This combined method yields excellent dispersion and improved mechanical and thermal properties as compared to the 1-step melt mixing method. The test results indicated that inclusion of carbon nanofibers into composites via the 2-step method resulted in a dramatically reduced (approximately 48% lower) coefficient of thermal expansion compared to that of pure polycarbonate, and 30% lower than that from the 1-step processing, at the same loading of 1.0 wt%. Improvements were also found in dynamic mechanical analysis and flexural mechanical properties. The 2-step approach is more precise and leads to better dispersion, higher quality, consistency, and improved performance in critical application areas. It is also consistent with Lean Manufacturing principles, in which manufacturing cells are linked together using less of the key resources, creating a smoother production flow. Therefore, this 2-step process can be more attractive for industry.

  13. Cost-effectiveness of a stepped-care intervention to prevent major depression in patients with type 2 diabetes mellitus and/or coronary heart disease and subthreshold depression: design of a cluster-randomized controlled trial.

    PubMed

    van Dijk, Susan E M; Pols, Alide D; Adriaanse, Marcel C; Bosmans, Judith E; Elders, Petra J M; van Marwijk, Harm W J; van Tulder, Maurits W

    2013-05-07

    Co-morbid major depression is a significant problem among patients with type 2 diabetes mellitus and/or coronary heart disease and this negatively impacts quality of life. Subthreshold depression is the most important risk factor for the development of major depression. Given the highly significant association between depression and adverse health outcomes and the limited capacity for depression treatment in primary care, there is an urgent need for interventions that successfully prevent the transition from subthreshold depression into a major depressive disorder. Nurse led stepped-care is a promising way to accomplish this. The aim of this study is to evaluate the cost-effectiveness of a nurse-led indicated stepped-care program to prevent major depression among patients with type 2 diabetes mellitus and/or coronary heart disease in primary care who also have subthreshold depressive symptoms. An economic evaluation will be conducted alongside a cluster-randomized controlled trial in approximately thirty general practices in the Netherlands. Randomization takes place at the level of participating practice nurses. We aim to include 236 participants who will either receive a nurse-led indicated stepped-care program for depressive symptoms or care as usual. The stepped-care program consists of four sequential but flexible treatment steps: 1) watchful waiting, 2) guided self-help treatment, 3) problem solving treatment and 4) referral to the general practitioner. The primary clinical outcome measure is the cumulative incidence of major depressive disorder as measured with the Mini International Neuropsychiatric Interview. Secondary outcomes include severity of depressive symptoms, quality of life, anxiety and physical outcomes. Costs will be measured from a societal perspective and include health care utilization, medication and lost productivity costs. Measurements will be performed at baseline and 3, 6, 9 and 12 months. The intervention being investigated is expected to prevent new cases of depression among people with type 2 diabetes mellitus and/or coronary heart disease and subthreshold depression, with subsequent beneficial effects on quality of life, clinical outcomes and health care costs. When proven cost-effective, the program provides a viable treatment option in the Dutch primary care system. Dutch Trial Register NTR3715.

  14. Oxidation-driven surface dynamics on NiAl(100)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Hailang; Chen, Xidong; Li, Liang

    Atomic steps, a defect common to all crystal surfaces, can play an important role in many physical and chemical processes. However, attempts to predict surface dynamics under nonequilibrium conditions are usually frustrated by poor knowledge of the atomic processes of surface motion arising from mass transport from/to surface steps. Using low-energy electron microscopy that spatially and temporally resolves oxide film growth during the oxidation of NiAl(100) we demonstrate that surface steps are impermeable to oxide film growth. The advancement of the oxide occurs exclusively on the same terrace and requires the coordinated migration of surface steps. The resulting piling up of surface steps ahead of the oxide growth front progressively impedes the oxide growth. This process is reversed during oxide decomposition. The migration of the substrate steps is found to be a surface-step version of the well-known Hele-Shaw problem, governed by detachment (attachment) of Al atoms at step edges induced by the oxide growth (decomposition). As a result, by comparing with the oxidation of NiAl(110) that exhibits unimpeded oxide film growth over substrate steps, we suggest that whenever steps are the source of atoms used for oxide growth they limit the oxidation process; when atoms are supplied from the bulk, the oxidation rate is not limited by the motion of surface steps.

  15. Oxidation-driven surface dynamics on NiAl(100)

    DOE PAGES

    Qin, Hailang; Chen, Xidong; Li, Liang; ...

    2014-12-29

    Atomic steps, a defect common to all crystal surfaces, can play an important role in many physical and chemical processes. However, attempts to predict surface dynamics under nonequilibrium conditions are usually frustrated by poor knowledge of the atomic processes of surface motion arising from mass transport from/to surface steps. Using low-energy electron microscopy that spatially and temporally resolves oxide film growth during the oxidation of NiAl(100) we demonstrate that surface steps are impermeable to oxide film growth. The advancement of the oxide occurs exclusively on the same terrace and requires the coordinated migration of surface steps. The resulting piling up of surface steps ahead of the oxide growth front progressively impedes the oxide growth. This process is reversed during oxide decomposition. The migration of the substrate steps is found to be a surface-step version of the well-known Hele-Shaw problem, governed by detachment (attachment) of Al atoms at step edges induced by the oxide growth (decomposition). As a result, by comparing with the oxidation of NiAl(110) that exhibits unimpeded oxide film growth over substrate steps, we suggest that whenever steps are the source of atoms used for oxide growth they limit the oxidation process; when atoms are supplied from the bulk, the oxidation rate is not limited by the motion of surface steps.

  16. Closing the brain-to-brain loop in laboratory testing.

    PubMed

    Plebani, Mario; Lippi, Giuseppe

    2011-07-01

    The delivery of laboratory services was described 40 years ago and defined with the foremost concept of the "brain-to-brain turnaround time loop". This concept consists of several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, even more worryingly, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phase of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refers only to the analytical phase, and should be better defined as "total analytical error" to avoid any confusion and misinterpretation. According to the hierarchical approach for classifying strategies to set analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" is comprehensively at the top and therefore should be applied as much as possible to address analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, LEAN and Six Sigma since these techniques allow the identification of the most critical steps in the total testing process and reduce the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring any step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.
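
    The abstract's link between analytical performance and clinical quality goals can be illustrated with the sigma-metric commonly used in laboratory quality management. The sketch below is not from the article; the formula sigma = (TEa - |bias|) / CV and all numeric values are assumptions for illustration only.

```python
# Minimal sketch (not from the article): sigma-metric for an assay, a common way
# of relating analytical performance (bias, CV) to an allowable total error (TEa).
# The example values below are illustrative assumptions, not data from the paper.

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Return the sigma-metric for an assay; all inputs are in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

if __name__ == "__main__":
    # Hypothetical assay: allowable total error 10%, bias 1.5%, CV 2%.
    sigma = sigma_metric(10.0, 1.5, 2.0)
    print(f"sigma-metric = {sigma:.2f}")  # ~4.25: acceptable, not world-class
```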

  17. WASS: An open-source pipeline for 3D stereo reconstruction of ocean waves

    NASA Astrophysics Data System (ADS)

    Bergamasco, Filippo; Torsello, Andrea; Sclavo, Mauro; Barbariol, Francesco; Benetazzo, Alvise

    2017-10-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community and industry. Indeed, recent advances in both computer vision algorithms and computer processing power now allow the study of the spatio-temporal wave field with unprecedented accuracy, especially at small scales. Even if simple in theory, multiple details are difficult for a practitioner to master, so that the implementation of a sea-wave 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the reconstruction process from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS (http://www.dais.unive.it/wass), an open-source stereo processing pipeline for 3D reconstruction of sea waves. Our tool completely automates all the steps required to estimate dense point clouds from stereo images. Namely, it computes the extrinsic parameters of the stereo rig so that no delicate calibration has to be performed in the field. It implements a fast 3D dense stereo reconstruction procedure based on the consolidated OpenCV library and, lastly, it includes a set of filtering techniques, both on the disparity map and on the produced point cloud, to remove the vast majority of erroneous points that naturally arise when analyzing the optically complex nature of the water surface. In this paper, we describe the architecture of WASS and the internal algorithms involved. The pipeline workflow is shown step-by-step and demonstrated on real datasets acquired at sea.
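
    For readers unfamiliar with the dense-stereo step that pipelines like WASS automate, the following is a minimal sketch using OpenCV, not the WASS code itself. The file names, the SGBM parameters and the pre-computed disparity-to-depth matrix Q are illustrative assumptions; WASS additionally estimates extrinsics and the mean sea plane, which this sketch does not do.

```python
# Minimal sketch (not the WASS implementation): dense stereo disparity on already
# rectified images, then reprojection of the disparity map into a 3D point cloud.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # assumed rectified pair
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching (parameters are illustrative, not tuned for sea waves).
sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = sgbm.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Q is the 4x4 disparity-to-depth mapping from stereo rectification (assumed known).
Q = np.load("Q.npy")
points = cv2.reprojectImageTo3D(disparity, Q)

# Keep only pixels with a valid (positive) disparity.
mask = disparity > 0
cloud = points[mask]
print(f"{cloud.shape[0]} valid 3D points")
```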

  18. Cellulose Biosynthesis: Current Views and Evolving Concepts

    PubMed Central

    SAXENA, INDER M.; BROWN, R. MALCOLM

    2005-01-01

    • Aims To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. • Scope Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. • Conclusions With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back. PMID:15894551

  19. Cellulose biosynthesis: current views and evolving concepts.

    PubMed

    Saxena, Inder M; Brown, R Malcolm

    2005-07-01

    To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. * Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. * With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back.

  20. The Ph.D. Process - A Student's Guide to Graduate School in the Sciences

    NASA Astrophysics Data System (ADS)

    Bloom, Dale F.; Karp, Jonathan D.; Cohen, Nicholas

    1999-02-01

    The Ph.D. Process offers the essential guidance that students in the biological and physical sciences need to get the most out of their years in graduate school. Drawing upon the insights of numerous current and former graduate students, this book presents a rich portrayal of the intellectual and emotional challenges inherent in becoming a scientist, and offers the informed, practical advice a "best friend" would give about each stage of the graduate school experience. What are the best strategies for applying to a graduate program? How are classes conducted? How should I choose an advisor and a research project? What steps can I take now to make myself more "employable" when I get my degree? What goes on at the oral defense? Through a balanced, thorough examination of issues ranging from lab etiquette to stress management, the authors--each a Ph.D. in the sciences--provide the vital information that will allow students to make informed decisions all along the way to the degree. Headlined sections within each chapter make it fast and easy to look up any subject, while dozens of quotes describing personal experiences in graduate programs from people in diverse scientific fields contribute invaluable real-life expertise. Special attention is also given to the needs of international students. Read in advance, this book prepares students for each step of the graduate school experience that awaits them. Read during the course of a graduate education, it serves as a handy reference covering virtually all major issues and decisions a doctoral candidate is likely to face. The Ph.D. Process is the one book every graduate student in the biological and physical sciences can use to stay a step ahead, from application all the way through graduation.

  1. A Taxonomy of Instructional Strategies in Early Childhood Education; Toward a Developmental Theory of Instructional Design.

    ERIC Educational Resources Information Center

    Vance, Barbara

    This paper suggests two steps in instructional design for early childhood that can be derived from a recent major paper on instructional strategy taxonomy. These steps, together with the instructional design variables involved in each step, are reviewed relative to current research in child development and early education. The variables reviewed…

  2. Patient accounting systems: needs and capabilities.

    PubMed

    Kennedy, O G; Collignon, S

    1987-09-01

    In the first article of this series, it was stated that most finance executives are not very satisfied with the performance of their current patient accounting systems. What steps can a patient accounting system planner take to help ensure the system selected will garner high ratings from managers and users? Two primary steps need to be taken. First, the planner needs to perform a thorough evaluation of both near- and long-term patient accounting requirements. He should determine which features and functions are most critical and ensure they are incorporated as selection criteria. The planner should also incorporate institutional planning into that process, such as planned expansion of facilities or services, to ensure that the system selected has the growth potential, interfacing capabilities, and flexibility to respond to the changing environment. Then, once system needs are fully charted, the planner should educate himself about the range of patient accounting system solutions available. The data show that most financial managers lack knowledge about most of the major patient accounting system vendors in the marketplace. Once vendors that offer systems that seemingly could meet needs are identified, the wise system planner will also want to obtain information from users about those vendors, to determine whether the systems perform as described and whether the vendor has been responsive to the needs of its customers. This step is a particularly important part of the planning process, because the data also show that users of some systems are significantly more satisfied than users of other patient accounting systems.

  3. Modeling the MHC class I pathway by combining predictions of proteasomal cleavage, TAP transport and MHC class I binding.

    PubMed

    Tenzer, S; Peters, B; Bulik, S; Schoor, O; Lemmel, C; Schatz, M M; Kloetzel, P-M; Rammensee, H-G; Schild, H; Holzhütter, H-G

    2005-05-01

    Epitopes presented by major histocompatibility complex (MHC) class I molecules are selected by a multi-step process. Here we present the first computational prediction of this process based on in vitro experiments characterizing proteasomal cleavage, transport by the transporter associated with antigen processing (TAP) and MHC class I binding. Our novel prediction method for proteasomal cleavages outperforms existing methods when tested on in vitro cleavage data. The analysis of our predictions for a new dataset consisting of 390 endogenously processed MHC class I ligands from cells with known proteasome composition shows that the immunological advantage of switching from constitutive to immunoproteasomes is mainly to suppress the creation of peptides in the cytosol that TAP cannot transport. Furthermore, we show that proteasomes are unlikely to generate MHC class I ligands with a C-terminal lysine residue, suggesting processing of these ligands by a different protease that may be tripeptidyl-peptidase II (TPPII).
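
    The multi-step selection described above (cleavage, TAP transport, MHC binding) is naturally modeled by combining per-step scores into one ranking. The sketch below is not the authors' method; the scoring functions are placeholders, and the log-sum combination is one simple assumed way to make a failure at any single step penalize a peptide.

```python
# Minimal sketch (not the paper's predictor): combine per-step probabilities for
# proteasomal cleavage, TAP transport and MHC class I binding into a single score.
import math

def combined_score(cleavage_p: float, tap_p: float, mhc_p: float) -> float:
    """Sum of log-probabilities; higher means more likely to be presented."""
    eps = 1e-12  # guard against log(0)
    return sum(math.log(max(p, eps)) for p in (cleavage_p, tap_p, mhc_p))

# Placeholder scores for two hypothetical peptides (not real predictions).
peptides = {"SIINFEKL": (0.8, 0.9, 0.95), "AAAAAAAAA": (0.4, 0.1, 0.05)}
ranked = sorted(peptides, key=lambda p: combined_score(*peptides[p]), reverse=True)
print(ranked)
```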

  4. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    NASA Astrophysics Data System (ADS)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are the representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement using hybridized smoothing filters, and we introduce a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.
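
    To make the idea of searching over filter sequences concrete, the sketch below uses a plain random search as a simplified stand-in for the ABC/PSO/ACO searches described in the abstract. The filter set, the sequence length, and the fitness measure (variance of the Laplacian of an unsharp-masked image, as a proxy for edge contrast) are all illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): stochastic search over short
# sequences of smoothing filters, scored by the edge contrast of the result.
import random
import cv2
import numpy as np

FILTERS = [
    lambda im: cv2.GaussianBlur(im, (5, 5), 0),
    lambda im: cv2.medianBlur(im, 5),
    lambda im: cv2.blur(im, (5, 5)),
]

def fitness(image: np.ndarray, sequence: list) -> float:
    smoothed = image.copy()
    for idx in sequence:                      # apply the hybrid filter sequence
        smoothed = FILTERS[idx](smoothed)
    enhanced = cv2.addWeighted(image, 1.5, smoothed, -0.5, 0)  # unsharp masking
    return cv2.Laplacian(enhanced, cv2.CV_64F).var()           # proxy for acutance

image = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)          # assumed input file
best = max((random.choices(range(len(FILTERS)), k=3) for _ in range(50)),
           key=lambda seq: fitness(image, seq))
print("best filter sequence:", best)
```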

  5. Green dyeing process of modified cotton fibres using natural dyes extracted from Tamarix aphylla (L.) Karst. leaves.

    PubMed

    Baaka, Noureddine; Mahfoudhi, Adel; Haddar, Wafa; Mhenni, Mohamed Farouk; Mighri, Zine

    2017-01-01

    This research work involves an eco-friendly dyeing process of modified cotton with the aqueous extract of Tamarix aphylla leaves. During this process, the dyeing step was carried out on cotton modified with several cationising agents in order to improve its dyeability. The influence of the main dyeing conditions (dye bath pH, dyeing time, dyeing temperature, salt addition) on the performance of this dyeing process was studied. The dyeing performance of this process was assessed by measuring the colour yield (K/S) and the fastness properties of the dyed samples. The effect of mordant type with different mordanting methods on dyeing quality was also studied. The results showed that mordanting gave deeper shades and enhanced fastness properties. In addition, environmental indicators (BOD5, COD and COD/BOD5) were used to describe potential improvements in the biodegradability of the dyebath wastewater. Further, HPLC was used to identify the major phenolic compounds in the extracted dye.

  6. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    PubMed

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.

  7. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    NASA Astrophysics Data System (ADS)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

    Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high aspect ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature scale model that incorporates surface chemical reactions [1]. When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after final profile tuning steps. A natural optimization required from these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films [2].

  8. Parents' perceptions of the role of schools in tobacco use prevention and cessation for youth.

    PubMed

    Wyman, Jodi; Price, James H; Jordan, Timothy R; Dake, Joseph A; Telljohann, Susan K

    2006-06-01

    The purpose of this study was to examine Ohio parents' perceptions of the role of schools in smoking prevention, cessation, and anti-tobacco policy for their children. A 46-item questionnaire was based on the CDC Guidelines for School Health Programs to Prevent Tobacco Use and Addiction. Surveys (n = 800) were sent to a stratified random sample of parents of junior high and high school aged students and 57% responded. Parents were supportive of smoking prevention activities, but almost two-thirds believed their child's school should get parents' input. Furthermore, mothers/step-mothers were more likely than fathers/step-fathers to agree that the school had a role in smoking prevention activities. The majority of parents were also supportive of smoking cessation activities. However, only 8% of parent respondents supported schools providing nicotine gum or patches to students trying to quit smoking. Overall, the majority of parents were supportive of the seven recommendations developed by the CDC as guidelines for school health programs to prevent tobacco use and addiction. Schools have the opportunity to impact student smoking through prevention and cessation activities. Schools need to know that parents are supportive of these activities and want to be included in the process of implementing effective prevention or cessation programs.

  9. Hierarchical recruitment of ribosomal proteins and assembly factors remodels nucleolar pre-60S ribosomes.

    PubMed

    Biedka, Stephanie; Micic, Jelena; Wilson, Daniel; Brown, Hailey; Diorio-Toth, Luke; Woolford, John L

    2018-04-24

    Ribosome biogenesis involves numerous preribosomal RNA (pre-rRNA) processing events to remove internal and external transcribed spacer sequences, ultimately yielding three mature rRNAs. Removal of the internal transcribed spacer 2 spacer RNA is the final step in large subunit pre-rRNA processing and begins with endonucleolytic cleavage at the C2 site of 27SB pre-rRNA. C2 cleavage requires the hierarchical recruitment of 11 ribosomal proteins and 14 ribosome assembly factors. However, the function of these proteins in C2 cleavage remained unclear. In this study, we have performed a detailed analysis of the effects of depleting proteins required for C2 cleavage and interpreted these results using cryo-electron microscopy structures of assembling 60S subunits. This work revealed that these proteins are required for remodeling of several neighborhoods, including two major functional centers of the 60S subunit, suggesting that these remodeling events form a checkpoint leading to C2 cleavage. Interestingly, when C2 cleavage is directly blocked by depleting or inactivating the C2 endonuclease, assembly progresses through all other subsequent steps. © 2018 Biedka et al.

  10. Kinetic modelling of a diesel-polluted clayey soil bioremediation process.

    PubMed

    Fernández, Engracia Lacasa; Merlo, Elena Moliterni; Mayor, Lourdes Rodríguez; Camacho, José Villaseñor

    2016-07-01

    A mathematical model is proposed to describe a diesel-polluted clayey soil bioremediation process. The reaction system under study was considered a completely mixed closed batch reactor, which initially contacted a soil matrix polluted with diesel hydrocarbons, an aqueous liquid-specific culture medium and a microbial inoculation. The model coupled the mass transfer phenomena and the distribution of hydrocarbons among four phases (solid, S; water, A; non-aqueous liquid, NAPL; and air, V) with Monod kinetics. In the first step, the model simulating abiotic conditions was used to estimate only the mass transfer coefficients. In the second step, the model including both mass transfer and biodegradation phenomena was used to estimate the biological kinetic and stoichiometric parameters. In both situations, the model predictions were validated with experimental data that corresponded to previous research by the same authors. A correct fit between the model predictions and the experimental data was observed because the modelling curves captured the major trends for the diesel distribution in each phase. The model parameters were compared to different previously reported values found in the literature. Pearson correlation coefficients were used to show the reproducibility level of the model. Copyright © 2016. Published by Elsevier B.V.
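
    The coupling of mass transfer and Monod kinetics described above can be sketched as a small system of ODEs. The following is a minimal illustration, not the paper's full four-phase model: it tracks only a sorbed and a dissolved substrate pool plus biomass, and every parameter value is an assumption chosen for illustration.

```python
# Minimal sketch (not the authors' model): Monod growth on a dissolved hydrocarbon
# pool fed by first-order desorption from the solid phase. Units and values assumed.
from scipy.integrate import solve_ivp

mu_max, Ks, Y = 0.15, 50.0, 0.4     # 1/h, mg/L, biomass yield (assumed)
k_la = 0.02                          # 1/h, solid -> aqueous transfer coefficient (assumed)

def rhs(t, y):
    S_solid, S_aq, X = y             # sorbed substrate, dissolved substrate, biomass
    transfer = k_la * S_solid        # desorption into the aqueous phase
    mu = mu_max * S_aq / (Ks + S_aq) # Monod specific growth rate
    return [-transfer, transfer - mu * X / Y, mu * X]

sol = solve_ivp(rhs, (0, 500), [1000.0, 10.0, 5.0])  # 500 h, assumed initial state
print("remaining total substrate:", sol.y[0, -1] + sol.y[1, -1])
```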

  11. The Pathway of Oligomeric DNA Melting Investigated by Molecular Dynamics Simulations

    PubMed Central

    Wong, Ka-Yiu; Pettitt, B. Montgomery

    2008-01-01

    Details of the reaction coordinate for DNA melting are fundamental to much of biology and biotechnology. Recently, it has been shown experimentally that there are at least three states involved. To clarify the reaction mechanism of the melting transition of DNA, we perform 100-ns molecular dynamics simulations of a homo-oligomeric, 12-basepair DNA duplex, d(A12)·d(T12), with explicit salt water at 400 K. Analysis of the trajectory reveals the various biochemically important processes that occur on different timescales. Peeling (including fraying from the ends), searching for Watson-Crick complements, and dissociation are recognizable processes. However, we find that basepair searching for Watson-Crick complements along a strand is not mechanistically tied to or directly accessible from the dissociation steps of strand melting. A three-step melting mechanism is proposed where the untwisting of the duplex is determined to be the major component of the reaction coordinate at the barrier. Though the observations are limited to the characteristics of the system being studied, they provide important insight into the mechanism of melting of other more biologically relevant forms of DNA, which will certainly differ in details from those here. PMID:18952784

  12. Application of Organophosphonic Acids by One-Step Supercritical CO2 on 1D and 2D Semiconductors: Toward Enhanced Electrical and Sensing Performances.

    PubMed

    Bhartia, Bhavesh; Bacher, Nadav; Jayaraman, Sundaramurthy; Khatib, Salam; Song, Jing; Guo, Shifeng; Troadec, Cedric; Puniredd, Sreenivasa Reddy; Srinivasan, Madapusi Palavedu; Haick, Hossam

    2015-07-15

    Formation of dense monolayers with proven atmospheric stability using simple fabrication conditions remains a major challenge for potential applications such as (bio)sensors, solar cells, surfaces for growth of biological cells, and molecular, organic, and plastic electronics. Here, we demonstrate a single-step modification of organophosphonic acids (OPA) on 1D and 2D structures using supercritical carbon dioxide (SCCO2) as a processing medium, with high stability and significantly shorter processing times than those obtained by the conventional physisorption-chemisorption method (2.5 h vs 48-60 h). The advantages of this approach in terms of stability and atmospheric resistivity are demonstrated on various 2D materials, such as indium-tin-oxide (ITO) and 2D Si surfaces. The advantage of the reported approach on electronic and sensing devices is demonstrated by Si nanowire field effect transistors (SiNW FETs), which have shown a few orders of magnitude higher electrical and sensing performances, compared with devices obtained by conventional approaches. The compatibility of the reported approach with various materials and its simple implementation with a single reactor makes it easily scalable for various applications.

  13. Technical Performance and Economic Evaluation of Evaporative and Membrane-Based Concentration for Biomass-Derived Sugars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sievers, David A.; Stickel, Jonathan J.; Grundl, Nicholas J.

    Several conversion pathways of lignocellulosic biomass to advanced biofuels require or benefit from using concentrated sugar syrups of 600 g/L or greater. And while concentration may seem straightforward, thermal sugar degradation and energy efficiency remain major concerns. This study evaluated the trade-offs in product recovery, energy consumption, and economics between evaporative and membrane-based concentration methods. The degradation kinetics of xylose and glucose were characterized and applied to an evaporator process simulation. Though significant sugar loss was predicted for certain scenarios due to the Maillard reaction, industrially common falling-film plate evaporators offer short residence times (<5 min) and are expected to limit sugar losses. Membrane concentration experiments characterized flux and sugar rejection, but diminished flux occurred at >100 g/L. A second step using evaporation is necessary to achieve target concentrations. Techno-economic process model simulations evaluated the overall economics of concentrating a 35 g/L sugar stream to 600 g/L in a full-scale biorefinery. A two-step approach of preconcentrating using membranes and finishing with an evaporator consumed less energy than evaporation alone but was more expensive because of high capital expenses of the membrane units.

  14. Technical Performance and Economic Evaluation of Evaporative and Membrane-Based Concentration for Biomass-Derived Sugars

    DOE PAGES

    Sievers, David A.; Stickel, Jonathan J.; Grundl, Nicholas J.; ...

    2017-09-18

    Several conversion pathways of lignocellulosic biomass to advanced biofuels require or benefit from using concentrated sugar syrups of 600 g/L or greater. And while concentration may seem straightforward, thermal sugar degradation and energy efficiency remain major concerns. This study evaluated the trade-offs in product recovery, energy consumption, and economics between evaporative and membrane-based concentration methods. The degradation kinetics of xylose and glucose were characterized and applied to an evaporator process simulation. Though significant sugar loss was predicted for certain scenarios due to the Maillard reaction, industrially common falling-film plate evaporators offer short residence times (<5 min) and are expected to limit sugar losses. Membrane concentration experiments characterized flux and sugar rejection, but diminished flux occurred at >100 g/L. A second step using evaporation is necessary to achieve target concentrations. Techno-economic process model simulations evaluated the overall economics of concentrating a 35 g/L sugar stream to 600 g/L in a full-scale biorefinery. A two-step approach of preconcentrating using membranes and finishing with an evaporator consumed less energy than evaporation alone but was more expensive because of high capital expenses of the membrane units.
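
    The statement that short residence times limit thermal sugar loss can be illustrated with a simple first-order degradation assumption. The sketch below is not the study's kinetic model; the lumped rate constant is an illustrative assumption, not a value from the paper.

```python
# Minimal sketch (not the study's model): first-order sugar loss during evaporation,
# loss = 1 - exp(-k * t), evaluated at a few residence times.
import math

k = 0.01  # 1/min, assumed lumped Maillard degradation rate at evaporator temperature

for t_min in (2, 5, 30, 60):
    loss = 1.0 - math.exp(-k * t_min)
    print(f"residence time {t_min:>3} min -> sugar loss {loss:.1%}")
```

    Under this assumed rate, a residence time below 5 minutes keeps losses in the low single-digit percent range, consistent with the abstract's remark about falling-film evaporators.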

  15. On brain lesions, the milkman and Sigmunda.

    PubMed

    Izquierdo, I; Medina, J H

    1998-10-01

    Lesion studies have been of historical importance in establishing the brain systems involved in memory processes. Many of those studies, however, have been overinterpreted in terms of the actual role of each system and of connections between systems. The more recent molecular pharmacological approach has produced major advances in these two areas. The main biochemical steps of memory formation in the CA1 region of the hippocampus have been established by localized microinfusions of drugs acting on specific enzymes or receptors, by subcellular measurements of the activity or function of those enzymes and receptors at definite times, and by transgenic deletions or changes of those proteins. The biochemical steps of long-term memory formation in CA1 have been found to be quite similar to those of long-term potentiation in the same region, and of other forms of plasticity. Connections between the hippocampus and the entorhinal and parietal cortices in the formation and modulation of short- and long-term memory have also been elucidated using these techniques. Lesion studies, coupled with imaging studies, still have a role to play; with regard to human memory, this role is in many ways unique. But these methods by themselves are not informative as to the mechanisms of memory processing, storage or modulation.

  16. Development of a TiAl Alloy by Spark Plasma Sintering

    NASA Astrophysics Data System (ADS)

    Couret, Alain; Voisin, Thomas; Thomas, Marc; Monchoux, Jean-Philippe

    2017-12-01

    Spark plasma sintering (SPS) is a consolidated powder metallurgy process for which the powder sintering is achieved through an applied electric current. The present article aims to describe the method we employed to develop a TiAl-based alloy adjusted for this SPS process. Owing to its enhanced mechanical properties, this alloy was found to fully match the industrial specifications for the aeronautic and automotive industries, which require a high strength at high temperature and a reasonably good ductility at room temperature. A step-by-step method was followed for this alloy development. Starting from a basic study on the as-SPSed GE alloy (Ti-48Al-2Cr-2Nb) in which the influence of the microstructure was studied, the microstructure-alloy composition relationships were then investigated to increase the mechanical properties. As a result of this study, we concluded that tungsten had to be the major alloying element to improve the resistance at high temperature and a careful addition of boron would serve the properties at room temperature. Thus, we developed the IRIS alloy (Ti-48Al-2W-0.08B). Its microstructure and mechanical properties are described here.

  17. Utilization of 3D printing for an intravital microscopy platform to study the intestinal microcirculation.

    PubMed

    Burkovskiy, I; Lehmann, C; Jiang, C; Zhou, J

    2016-11-01

    Intravital microscopy of the intestine is a sophisticated technique that allows qualitative and quantitative in vivo observation of dynamic cellular interactions and blood flow at a high resolution. Physiological conditions of the animal, and in particular of the observed organ, such as temperature and moisture, are crucial for intravital imaging. Often, the microscopy stage with the animal or the organ of interest imposes limitations on how well the animal can be maintained. In addition, the access for additional oxygen supply or drug administration during the procedure is rather restricted. To address these limitations, we developed a novel intravital microscopy platform, allowing us to have improved access to the animal during the intravital microscopy procedure, as well as improved microenvironmental maintenance. The production process of this prototype platform is based on 3D printing of device parts in a single-step process. The simplicity of production and the advantages of this versatile and customizable design are shown and discussed in this paper. Our design potentially represents a major step forward in facilitating intestinal intravital imaging using fluorescent microscopy. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  18. Structural evolutions and hereditary characteristics of icosahedral nano-clusters formed in Mg70Zn30 alloys during rapid solidification processes

    NASA Astrophysics Data System (ADS)

    Liang, Yong-Chao; Liu, Rang-Su; Xie, Quan; Tian, Ze-An; Mo, Yun-Fei; Zhang, Hai-Tao; Liu, Hai-Rong; Hou, Zhao-Yang; Zhou, Li-Li; Peng, Ping

    2017-02-01

    To investigate the structural evolution and hereditary mechanism of icosahedral nano-clusters formed during rapid solidification, a molecular dynamics (MD) simulation study has been performed for a system consisting of 10^7 atoms of liquid Mg70Zn30 alloy. Adopting the Honeycutt-Anderson (HA) bond-type index method and the cluster type index method (CTIM-3) to analyse the microstructures in the system, it is found that for all the nano-clusters including 2~8 icosahedral clusters in the system, there are 62 kinds of geometrical structures, and these can be classified, by the configurations of the central atoms of the basic clusters they contain, into four types: chain-like, triangle-tailed, quadrilateral-tailed and pyramidal-tailed. The evolution of icosahedral nano-clusters proceeds by perfect heredity and replacement heredity; perfect heredity emerges when the temperature is slightly below Tm, then increases rapidly and far exceeds replacement heredity at Tg. For replacement heredity, there are three major modes: replacement by a triangle (3 atoms), a quadrangle (4 atoms) or a pentagonal pyramid (6 atoms), rather than by a single atom step by step during rapid solidification processes.
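
    The Honeycutt-Anderson bond-type analysis mentioned above starts from a neighbour list and, for every bonded pair, inspects their common neighbours. The sketch below is not the CTIM-3 implementation: it only counts common neighbours of bonded pairs (the first part of an HA index) on a placeholder configuration; a full HA analysis would additionally examine how those common neighbours bond to each other (for example, the five-membered ring of a 1551 pair, the signature of icosahedral short-range order). Coordinates and cutoff are assumed inputs.

```python
# Minimal sketch (not CTIM-3): common-neighbour counts for bonded atom pairs,
# the starting point of a Honeycutt-Anderson bond-type analysis.
import numpy as np
from itertools import combinations

def neighbour_sets(coords: np.ndarray, cutoff: float):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    bonded = (d < cutoff) & ~np.eye(len(coords), dtype=bool)
    return [set(np.flatnonzero(row)) for row in bonded]

def common_neighbour_counts(coords: np.ndarray, cutoff: float):
    nbrs = neighbour_sets(coords, cutoff)
    counts = {}
    for i, j in combinations(range(len(coords)), 2):
        if j in nbrs[i]:                        # only bonded root pairs
            counts[(i, j)] = len(nbrs[i] & nbrs[j])
    return counts

coords = np.random.rand(50, 3) * 10.0           # placeholder configuration, not MD data
print(common_neighbour_counts(coords, cutoff=3.0))
```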

  19. Adsorption process to recover hydrogen from feed gas mixtures having low hydrogen concentration

    DOEpatents

    Golden, Timothy Christopher; Weist, Jr., Edward Landis; Hufton, Jeffrey Raymond; Novosat, Paul Anthony

    2010-04-13

    A process for selectively separating hydrogen from at least one more strongly adsorbable component in a plurality of adsorption beds to produce a hydrogen-rich product gas from a low hydrogen concentration feed with a high recovery rate. Each of the plurality of adsorption beds is subjected to a repetitive cycle. The process comprises an adsorption step for producing the hydrogen-rich product from a feed gas mixture comprising 5% to 50% hydrogen, at least two pressure equalization by void space gas withdrawal steps, a provide purge step resulting in a first pressure decrease, a blowdown step resulting in a second pressure decrease, a purge step, at least two pressure equalization by void space gas introduction steps, and a repressurization step. The second pressure decrease is at least 2 times greater than the first pressure decrease.

  20. Considerations for pattern placement error correction toward 5nm node

    NASA Astrophysics Data System (ADS)

    Yaegashi, Hidetami; Oyama, Kenichi; Hara, Arisa; Natori, Sakurako; Yamauchi, Shohei; Yamato, Masatoshi; Koike, Kyohei; Maslow, Mark John; Timoshkov, Vadim; Kiers, Ton; Di Lorenzo, Paolo; Fonseca, Carlos

    2017-03-01

    Multi-patterning has been adopted widely in high-volume manufacturing as an extension of 193 nm immersion lithography, and it has become a realistic solution for nano-order scaling. In fact, it is a key technology for single-directional (1D) layout design [1] for logic devices, and SAQP has become a major option for further scaling. The requirements for patterning fidelity control are becoming ever more severe, and stochastic fluctuation as well as LER (line edge roughness) must now be observed at the microscopic scale. In our previous work, such atomic-order controllability was shown to be viable with a complementary technique combining etching and deposition [2]. Overlay issues form a major portion of yield management; therefore, a complete solution is keenly needed, covering both alignment accuracy on the scanner and detectability on overlay measurement instruments. Since EPE (edge placement error) is defined as the gap between the design pattern and the contour of the actual pattern edge, pattern registration at the single-process level must also be considered. Complementary patterning to fabricate 1D layouts does mitigate process restrictions; however, the multiple process steps, typified by LELE with 193-i, are a burden on yield management and affordability. Recent progress in EUV technology is remarkable, and it is a major potential solution for such complicated technical issues. EUV has a robust resolution limit and is definitely a strong scaling driver for process simplification. On the other hand, its stochastic variation, such as shot noise due to limited light source power, must be resolved with additional complementary techniques. In this work, we examined nano-order CD and profile control of EUV resist patterns and introduce our accomplishments.
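
    The EPE definition quoted above (gap between design edge and measured pattern edge) lends itself to a very small numerical illustration. The sketch below is not the authors' metrology flow; the gauge positions and the choice of summary statistics are illustrative assumptions.

```python
# Minimal sketch (not from the paper): edge placement error as the signed distance
# between intended and measured edge positions at a few gauge sites (values assumed).
import numpy as np

design_edges = np.array([10.0, 30.0, 50.0, 70.0])     # nm, intended edge positions
measured_edges = np.array([10.8, 29.1, 50.4, 71.6])   # nm, contoured edge positions

epe = measured_edges - design_edges
print("per-site EPE (nm):", epe)
print("mean |EPE| = %.2f nm, 3-sigma = %.2f nm" % (np.abs(epe).mean(), 3 * epe.std()))
```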

  1. Investigating the Conceptual Variation of Major Physics Textbooks

    NASA Astrophysics Data System (ADS)

    Stewart, John; Campbell, Richard; Clanton, Jessica

    2008-04-01

    The conceptual problem content of the electricity and magnetism chapters of seven major physics textbooks was investigated. The textbooks presented a total of 1600 conceptual electricity and magnetism problems. The solution to each problem was decomposed into its fundamental reasoning steps. These fundamental steps are, then, used to quantify the distribution of conceptual content among the set of topics common to the texts. The variation of the distribution of conceptual coverage within each text is studied. The variation between the major groupings of the textbooks (conceptual, algebra-based, and calculus-based) is also studied. A measure of the conceptual complexity of the problems in each text is presented.

  2. [Molecular combing method in the research of DNA replication parameters in isolated organs of Drosophila melanogaster].

    PubMed

    Ivankin, A V; Kolesnikova, T D; Demakov, S A; Andreenkov, O V; Bil'danova, E R; Andreenkova, N G; Zhimulev, I F

    2011-01-01

    Methods of physical DNA mapping and direct visualization of replication and transcription in specific regions of the genome play a crucial role in research on the structural and functional organization of eukaryotic genomes. Since DNA strands in cells are organized into highly folded structures and are present as highly compacted chromosomes, the majority of these methods have limited resolution at the chromosomal level. One approach to enhance the resolution and mapping accuracy is the method of molecular combing. The method is based on the stretching and alignment of DNA molecules that are covalently attached by one end to the cover glass surface. In this article we describe the major methodological steps of molecular combing and their adaptation for research on DNA replication parameters in polyploid and diploid tissues of Drosophila larvae.

  3. Exploring bacterial lignin degradation.

    PubMed

    Brown, Margaret E; Chang, Michelle C Y

    2014-04-01

    Plant biomass represents a renewable carbon feedstock that could potentially be used to replace a significant level of petroleum-derived chemicals. One major challenge in its utilization is that the majority of this carbon is trapped in the recalcitrant structural polymers of the plant cell wall. Deconstruction of lignin is a key step in the processing of biomass to useful monomers but remains challenging. Microbial systems can provide molecular information on lignin depolymerization as they have evolved to break lignin down using metalloenzyme-dependent radical pathways. Both fungi and bacteria have been observed to metabolize lignin; however, their differential reactivity with this substrate indicates that they may utilize different chemical strategies for its breakdown. This review will discuss recent advances in studying bacterial lignin degradation as an approach to exploring greater diversity in the environment. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Establishing effective working relations with a potential user community - NASA Lewis Research Center experience

    NASA Technical Reports Server (NTRS)

    Foster, P.

    1977-01-01

    The NASA Lewis Research Center has held a series of six major and unique technology utilization conferences which were major milestones in planned structured efforts to establish effective working relationships with specific technology user communities. These efforts were unique in that the activities undertaken prior to the conference were extensive, and effectively laid the groundwork for productive technology transfer following, and as a direct result of, the conferences. The effort leading to the conference was in each case tailored to the characteristics of the potential user community; however, the common factors comprise a basic framework applicable to similar endeavors. The process is essentially a planned sequence of steps that constitute a technical market survey and a marketing program for the development of beneficial applications of aerospace technology beyond the aerospace field.

  5. Accelerating electron tomography reconstruction algorithm ICON with GPU.

    PubMed

    Chen, Yu; Wang, Zihao; Zhang, Jingrong; Li, Lun; Wan, Xiaohua; Sun, Fei; Zhang, Fa

    2017-01-01

    Electron tomography (ET) plays an important role in studying in situ cell ultrastructure in three-dimensional space. Due to limited tilt angles, ET reconstruction always suffers from the "missing wedge" problem. With a validation procedure, iterative compressed-sensing optimized NUFFT reconstruction (ICON) demonstrates its power in the restoration of validated missing information for low-SNR biological ET datasets. However, the huge computational demand has become a major obstacle to the application of ICON. In this work, we analyzed the framework of ICON and classified the operations of the major steps of ICON reconstruction into three types. Accordingly, we designed parallel strategies and implemented them on graphics processing units (GPUs) to generate a parallel program, ICON-GPU. With high accuracy, ICON-GPU achieves a large acceleration compared to its CPU version, up to 83.7×, greatly relieving ICON's dependence on computing resources.

  6. Space science at NASA - Retrospect and prospect

    NASA Technical Reports Server (NTRS)

    Rosendhal, Jeffrey D.

    1988-01-01

    Following a brief overview of past accomplishments in space science, a status report is given concerning progress toward recovering from the Challenger accident, and a number of trends are described which are likely to have a major influence on the future of the NASA Space Science program. Key changes in process include a trend toward a program centered on the use of large, long-lived facilities, the emergence of strong space capabilities outside the U.S., and steps being taken toward the diversification of NASA's launch capability. A number of recent planning activities are also discussed. Major considerations which will specifically need to be taken into account in NASA's program planning include the need for provision of a spectrum of flight activities and the need to recognize likely resource limitations and to do more realistic program planning.

  7. Analytical separations of mammalian decomposition products for forensic science: a review.

    PubMed

    Swann, L M; Forbes, S L; Lewis, S W

    2010-12-03

    The study of mammalian soft tissue decomposition is an emerging area in forensic science, with a major focus of the research being the use of various chemical and biological methods to study the fate of human remains in the environment. Decomposition of mammalian soft tissue is a postmortem process that, depending on environmental conditions and physiological factors, will proceed until complete disintegration of the tissue. The major stages of decomposition involve complex reactions which result in the chemical breakdown of the body's main constituents; lipids, proteins, and carbohydrates. The first step to understanding this chemistry is identifying the compounds present in decomposition fluids and determining when they are produced. This paper provides an overview of decomposition chemistry and reviews recent advances in this area utilising analytical separation science. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Investigation to biodiesel production by the two-step homogeneous base-catalyzed transesterification.

    PubMed

    Ye, Jianchu; Tu, Song; Sha, Yong

    2010-10-01

    For two-step transesterification biodiesel production from sunflower oil, based on the kinetics model of the homogeneous base-catalyzed transesterification and the liquid-liquid phase equilibrium of the transesterification product, the total methanol/oil mole ratio, the total reaction time, and the split ratios of methanol and reaction time between the two reactors of the two-step reaction stage are determined quantitatively. In consideration of the transesterification intermediate product, both the traditional distillation separation process and the improved separation process of the two-step reaction product are investigated in detail by means of rigorous process simulation. In comparison with the traditional distillation process, the improved separation process of the two-step reaction product has a distinct advantage in energy duty and equipment requirements due to replacement of the costly methanol-biodiesel distillation column. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. Mapping of Technological Opportunities-Labyrinth Seal Example

    NASA Technical Reports Server (NTRS)

    Clarke, Dana W., Sr.

    2006-01-01

    All technological systems evolve based on evolutionary sequences that have repeated throughout history and can be abstracted from the history of technology and patents. These evolutionary sequences represent objective patterns and provide considerable insights that can be used to proactively model future seal concepts. This presentation provides an overview of how to map seal technology into the future using a labyrinth seal example. The mapping process delivers functional descriptions of sequential changes in market/consumer demand, from today's current paradigm to the next major paradigm shift. The future paradigm is developed according to a simple formula: the future paradigm is free of all flaws associated with the current paradigm; it is as far into the future as we can see. Although revolutionary, the vision of the future paradigm is typically not immediately or completely realizable, nor is it normally seen as practical. There are several reasons that prevent immediate and complete practical application, such as: 1) some of the required technological or business resources and knowledge are not available; 2) the availability of other technological or business resources is limited; and/or 3) some necessary knowledge has not been completely developed. These factors tend to drive the Total Cost of Ownership or Utilization out of an acceptable range; revealing the reasons for the high Total Cost of Ownership or Utilization provides a clear understanding of research opportunities essential for future developments and defines the current limits of the immediately achievable improvements. The typical roots of high Total Cost of Ownership or Utilization lie in the limited availability or even the absence of essential resources and knowledge necessary for its realization. In order to overcome this obstacle, step-by-step modification of the current paradigm is pursued to evolve from the current situation toward the ideal future, i.e., evolution rather than revolution. A key point is that evolutionary stages are mapped to show step-by-step evolution from the current paradigm to the next major paradigm.

  10. A model based method for recognizing psoas major muscles in torso CT images

    NASA Astrophysics Data System (ADS)

    Kamiya, Naoki; Zhou, Xiangrong; Chen, Huayue; Hara, Takeshi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Fujita, Hiroshi

    2010-03-01

    In aging societies, it is important to analyze age-related hypokinesia. The psoas major muscle supports many important functional capabilities, such as balance and posture control. These functions can be measured through its cross-sectional area (CSA), volume, and thickness. However, these values are calculated manually in the clinical setting. The purpose of our study is to propose an automated recognition method for psoas major muscles in X-ray torso CT images. The proposed recognition process involves three steps: 1) determination of anatomical points such as the origin and insertion of the psoas major muscle, 2) generation of a shape model for the psoas major muscle, and 3) recognition of the psoas major muscles by use of the shape model. The model was built using a quadratic function and was fitted to the anatomical center line of the psoas major muscle. The shape model was generated using 20 CT cases and tested on 20 other CT cases. The applied database consisted of 12 male and 8 female cases ranging in age from the 40s to the 80s. The average Jaccard similarity coefficient (JSC) used in the evaluation was 0.7. Our experimental results indicated that the proposed method is effective for volumetric analysis and could be used for quantitative measurement of psoas major muscles in CT images.
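
    The two quantitative ingredients of the abstract, a quadratic shape model fitted to the muscle centre line and the Jaccard similarity coefficient used for evaluation, are easy to illustrate. The sketch below is not the authors' implementation; the centre-line samples and the toy masks are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): quadratic fit to a centre line and the
# Jaccard similarity coefficient between two binary segmentation masks.
import numpy as np

def fit_quadratic_centerline(z: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Least-squares coefficients of x(z) = a*z**2 + b*z + c."""
    return np.polyfit(z, x, deg=2)

def jaccard(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 0.0

# Illustrative data: noisy centre-line samples along slices, plus two toy masks.
z = np.linspace(0, 1, 20)
x = 5 + 2 * z - 3 * z**2 + np.random.normal(0, 0.05, z.size)
print("quadratic coefficients:", fit_quadratic_centerline(z, x))

a = np.zeros((10, 10), bool); a[2:7, 2:7] = True
b = np.zeros((10, 10), bool); b[3:8, 3:8] = True
print("JSC =", round(jaccard(a, b), 2))
```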

  11. Functional Fault Modeling of a Cryogenic System for Real-Time Fault Detection and Isolation

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Brown, Barbara

    2010-01-01

    The purpose of this paper is to present the model development process used to create a Functional Fault Model (FFM) of a liquid hydrogen (LH2) system that will be used for real-time fault isolation in a Fault Detection, Isolation and Recovery (FDIR) system. The paper explains the steps in the model development process and the data products required at each step, including examples of how the steps were performed for the LH2 system. It also shows the relationship between the FDIR requirements and the steps in the model development process. The paper concludes with a description of a demonstration of the LH2 model developed using the process and future steps for integrating the model in a live operational environment.

  12. Array automated assembly task low cost silicon solar array project, phase 2

    NASA Technical Reports Server (NTRS)

    Olson, C.

    1980-01-01

    Analyses of solar cell and module process steps for throughput rate, cost effectiveness, and reproducibility are reported. In addition to the concentration on cell and module processing sequences, an investigation was made into the capability of using microwave energy in the diffusion, sintering, and thick film firing steps of cell processing. Although the entire process sequence was integrated, the steps are treated individually with test and experimental data, conclusions, and recommendations.

  13. Application of quality by design principles to the development and technology transfer of a major process improvement for the manufacture of a recombinant protein.

    PubMed

    Looby, Mairead; Ibarra, Neysi; Pierce, James J; Buckley, Kevin; O'Donovan, Eimear; Heenan, Mary; Moran, Enda; Farid, Suzanne S; Baganz, Frank

    2011-01-01

    This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts, and provide guidance and understanding on how the various components combine together to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the execution of process characterization studies evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from process characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of process design spaces. Successful implementation and validation of the process in the manufacturing facility and the subsequent manufacture of hundreds of batches of this therapeutic protein verifies the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
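
    To illustrate the kind of Monte Carlo design-space check mentioned above, the sketch below simulates operating parameters across an assumed proven acceptable range and counts how often a quality attribute stays within an acceptance limit. The response model, parameter ranges, and limit are hypothetical placeholders, not the study's characterization data.

      # Hedged illustration only: Monte Carlo assessment of a design space.
      import random

      def quality_attribute(temp, ph):
          """Hypothetical empirical model from a characterization DOE."""
          return 2.0 + 0.15 * (temp - 36.5) + 0.8 * abs(ph - 7.0)

      def monte_carlo_capability(n=100_000, limit=2.1):
          """Fraction of simulated batches meeting the acceptance criterion."""
          passed = 0
          for _ in range(n):
              temp = random.uniform(36.0, 37.0)   # assumed proven acceptable range
              ph = random.uniform(6.9, 7.1)       # assumed proven acceptable range
              if quality_attribute(temp, ph) <= limit:
                  passed += 1
          return passed / n

      print(f"Predicted in-spec fraction: {monte_carlo_capability():.3f}")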

  14. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    PubMed

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limit-that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need to have a more predictable process prompted the need to control variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data, compared to the historical average, and, in addition, the variation was reduced. The model is subject to limitations: For example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.
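
    The "check the variation" portion of the model (Steps 2 and 3) can be pictured with a simplified control-chart test: flag points outside 3-sigma limits before benchmarking. The sketch below uses made-up monthly rates and a plain standard deviation rather than the moving-range estimate a formal individuals chart would use; it is an illustration, not the hospital's analysis.

      # Simplified special-cause check with 3-sigma limits (synthetic data).
      import statistics

      def special_cause_points(rates):
          """Return indices of points outside the 3-sigma control limits."""
          mean = statistics.mean(rates)
          sd = statistics.stdev(rates)
          ucl, lcl = mean + 3 * sd, mean - 3 * sd
          return [i for i, r in enumerate(rates) if r > ucl or r < lcl]

      monthly_hai_rate = [2.1, 1.9, 2.3, 2.0, 2.2, 4.8, 2.1, 1.8, 2.0, 2.2, 1.9, 2.1]
      outliers = special_cause_points(monthly_hai_rate)
      if outliers:
          print("Special-cause variation at months:", outliers, "- control first, do not benchmark yet")
      else:
          print("Only common-cause variation - data are ready for benchmarking")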

  15. Next generation mothers: Maternal control of germline development in zebrafish.

    PubMed

    Dosch, Roland

    2015-01-01

    In many animals, factors deposited by the mother into the egg control the earliest events in development of the zygote. These maternal RNAs and proteins play critical roles in oocyte development and the earliest steps of embryogenesis such as fertilization, cell division and embryonic patterning. Here, this article summarizes recent discoveries made on the maternal control of germline specification in zebrafish. Moreover, this review will discuss the major gaps remaining in our understanding of this process and highlight recent technical innovations in zebrafish, which allow tackling some of these questions in the near future.

  16. Cross-Cultural Adaptation and Validation of the Attitudes Toward Suicide Questionnaire Among Healthcare personnel in Malaysia.

    PubMed

    Siau, Ching Sin; Wee, Lei-Hum; Ibrahim, Norhayati; Visvalingam, Uma; Wahab, Suzaily

    2017-01-01

    Understanding attitudes toward suicide, especially among healthcare personnel, is an important step in both suicide prevention and treatment. We document the adaptation process and establish the validity and reliability of the Attitudes Toward Suicide (ATTS) questionnaire among 262 healthcare personnel in 2 major public hospitals in the Klang Valley, Malaysia. The findings indicate that healthcare personnel in Malaysia have unique constructs on suicide attitude, compared with the original study on a Western European sample. The adapted Malay ATTS questionnaire demonstrates adequate reliability and validity for use among healthcare personnel in Malaysia.
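
    As a generic illustration of how internal-consistency reliability is commonly checked for questionnaire subscales such as those in the ATTS, the sketch below computes Cronbach's alpha on synthetic Likert responses; it is not the authors' analysis or data.

      # Cronbach's alpha on synthetic responses (rows = respondents, cols = items).
      import numpy as np

      def cronbach_alpha(items):
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1 - item_vars / total_var)

      responses = np.array([[4, 5, 4], [2, 2, 3], [5, 4, 4], [3, 3, 2], [4, 4, 5]])
      print(round(cronbach_alpha(responses), 2))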

  17. Cross-Cultural Adaptation and Validation of the Attitudes Toward Suicide Questionnaire Among Healthcare personnel in Malaysia

    PubMed Central

    Siau, Ching Sin; Wee, Lei-Hum; Ibrahim, Norhayati; Visvalingam, Uma; Wahab, Suzaily

    2017-01-01

    Understanding attitudes toward suicide, especially among healthcare personnel, is an important step in both suicide prevention and treatment. We document the adaptation process and establish the validity and reliability of the Attitudes Toward Suicide (ATTS) questionnaire among 262 healthcare personnel in 2 major public hospitals in the Klang Valley, Malaysia. The findings indicate that healthcare personnel in Malaysia have unique constructs on suicide attitude, compared with the original study on a Western European sample. The adapted Malay ATTS questionnaire demonstrates adequate reliability and validity for use among healthcare personnel in Malaysia. PMID:28486042

  18. The Science Manager's Guide to Case Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branch, Kristi M.; Peffers, Melissa S.; Ruegg, Rosalie T.

    2001-09-24

    This guide takes the science manager through the steps of planning, implementing, validating, communicating, and using case studies. It outlines the major methods of analysis, describing their relative merits and applicability while providing relevant examples and sources of additional information. Well-designed case studies can provide a combination of rich qualitative and quantitative information, offering valuable insights into the nature, outputs, and longer-term impacts of the research. An objective, systematic, and credible approach to the evaluation of U.S. Department of Energy Office of Science programs adds value to the research process and is the subject of this guide.

  19. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, since it is meant to support decisions that have important social and commercial implications. Residual moveout analysis, which is an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.
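
    As a generic picture of what a Bayesian treatment of residual moveout can look like, the sketch below samples the posterior of a single moveout curvature parameter with a random-walk Metropolis algorithm, assuming a parabolic moveout t(x) = t0 + r*x^2, Gaussian pick noise, and a flat prior. The model, data, and priors are illustrative assumptions, not the authors' formulation.

      # Generic Metropolis sketch for one residual-moveout parameter (synthetic picks).
      import random, math

      offsets = [0.0, 0.5, 1.0, 1.5, 2.0]          # source-receiver offsets in km (synthetic)
      picks = [1.000, 1.012, 1.049, 1.114, 1.201]  # picked times in s (synthetic)
      t0, sigma = 1.0, 0.005                       # zero-offset time and pick noise (assumed)

      def log_likelihood(r):
          """Gaussian misfit for t(x) = t0 + r*x^2."""
          return -0.5 * sum(((t - (t0 + r * x * x)) / sigma) ** 2
                            for x, t in zip(offsets, picks))

      def metropolis(n=20000, step=0.002):
          r, samples = 0.0, []
          for _ in range(n):
              prop = r + random.gauss(0.0, step)
              delta = log_likelihood(prop) - log_likelihood(r)
              if delta >= 0 or random.random() < math.exp(delta):
                  r = prop
              samples.append(r)
          return samples[n // 2:]                  # discard first half as burn-in

      post = metropolis()
      mean = sum(post) / len(post)
      sd = (sum((s - mean) ** 2 for s in post) / len(post)) ** 0.5
      print(f"residual moveout curvature r = {mean:.4f} +/- {sd:.4f} s/km^2")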

  20. Enhanced attrition bioreactor for enzyme hydrolysis of cellulosic materials

    DOEpatents

    Scott, Timothy C.; Scott, Charles D.; Faison, Brendlyn D.; Davison, Brian H.; Woodward, Jonathan

    1997-01-01

    A process for converting cellulosic materials, such as waste paper, into fuels and chemicals, such as sugars and ethanol, utilizing enzymatic hydrolysis of the major carbohydrate of paper: cellulose. A waste paper slurry is contacted by cellulase in an agitated hydrolyzer. An attritor and a cellobiase reactor are coupled to the agitated hydrolyzer to improve reaction efficiency. Additionally, microfiltration, ultrafiltration and reverse osmosis steps are included to further increase reaction efficiency. The resulting sugars are converted to a dilute product in a fluidized-bed bioreactor utilizing a biocatalyst, such as microorganisms. The dilute product is then concentrated and purified.

  1. Enhanced attrition bioreactor for enzyme hydrolysis of cellulosic materials

    DOEpatents

    Scott, Timothy C.; Scott, Charles D.; Faison, Brendlyn D.; Davison, Brian H.; Woodward, Jonathan

    1996-01-01

    A process for converting cellulosic materials, such as waste paper, into fuels and chemicals, such as sugars and ethanol, utilizing enzymatic hydrolysis of the major carbohydrate of paper: cellulose. A waste paper slurry is contacted by cellulase in an agitated hydrolyzer. An attritor and a cellobiase reactor are coupled to the agitated hydrolyzer to improve reaction efficiency. Additionally, microfiltration, ultrafiltration and reverse osmosis steps are included to further increase reaction efficiency. The resulting sugars are converted to a dilute product in a fluidized-bed bioreactor utilizing a biocatalyst, such as microorganisms. The dilute product is then concentrated and purified.

  2. Calculation of muscle loading and joint contact forces during the rock step in Irish dance.

    PubMed

    Shippen, James M; May, Barbara

    2010-01-01

    A biomechanical model for the analysis of dancers and their movements is described. The model consisted of 31 segments, 35 joints, and 539 muscles, and was animated using movement data obtained from a three-dimensional optical tracking system that recorded the motion of dancers. The model was used to calculate forces within the muscles and contact forces at the joints of the dancers in this study. Ground reaction forces were measured using force plates mounted in a sprung floor. The analysis procedure is generic and can be applied to any dance form. As an exemplar of the application process an Irish dance step, the rock, was analyzed. The maximum ground reaction force found was 4.5 times the dancer's body weight. The muscles connected to the Achilles tendon experienced a maximum force comparable to their maximal isometric strength. The contact force at the ankle joint was 14 times body weight, of which the majority of the force was due to muscle contraction. It is suggested that as the rock step produces high forces, and therefore the potential to cause injury, its use should be carefully monitored.

  3. Scaled-up production of poacic acid, a plant-derived antifungal agent

    DOE PAGES

    Yue, Fengxia; Gao, Ruili; Piotrowski, Jeff S.; ...

    2017-09-01

    Poacic acid, a decarboxylated product from 8–5-diferulic acid that is commonly found in monocot lignocellulosic hydrolysates, has been identified as a natural antifungal agent against economically significant fungi and oomycete plant pathogens. Starting from commercially available or monocot-derivable ferulic acid, a three-step synthetic procedure has been developed for the production of poacic acid needed for field testing in a controlled agricultural setting. First, ferulic acid was esterified to produce ethyl ferulate in 92% yield. Second, peroxidase-catalyzed free radical dehydrodimerization of ethyl ferulate produced crude diferulates, mainly 8–5-diferulate, in 91% yield. Finally, crystalline poacic acid was obtained in 25% yield via alkaline hydrolysis of the crude diferulates after purification by flash-column chromatography. Thus, this new procedure offers two key improvements relevant to large-scale production: 1) bubbling air through the reaction mixture in the second step to remove acetone greatly improves the recovery efficiency of the crude diferulates; and 2) telescoping minor impurities directly into the alkaline hydrolysis step eliminates the need for additional column purifications, thus reducing the overall cost of production and removing a major impediment to process scale-up.
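
    Assuming the three reported step yields multiply in sequence, the overall yield from ferulic acid follows directly; the short worked arithmetic below shows the calculation.

      # Worked arithmetic from the reported per-step yields (92%, 91%, 25%).
      step_yields = [0.92, 0.91, 0.25]
      overall = 1.0
      for y in step_yields:
          overall *= y
      print(f"overall yield from ferulic acid ~ {overall:.1%}")   # ~ 20.9%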

  4. Scaled-up production of poacic acid, a plant-derived antifungal agent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, Fengxia; Gao, Ruili; Piotrowski, Jeff S.

    Poacic acid, a decarboxylated product from 8–5-diferulic acid that is commonly found in monocot lignocellulosic hydrolysates, has been identified as a natural antifungal agent against economically significant fungi and oomycete plant pathogens. Starting from commercially available or monocot-derivable ferulic acid, a three-step synthetic procedure has been developed for the production of poacic acid needed for field testing in a controlled agricultural setting. First, ferulic acid was esterified to produce ethyl ferulate in 92% yield. Second, peroxidase-catalyzed free radical dehydrodimerization of ethyl ferulate produced crude diferulates, mainly 8–5-diferulate, in 91% yield. Finally, crystalline poacic acid was obtained in 25% yield via alkaline hydrolysis of the crude diferulates after purification by flash-column chromatography. Thus, this new procedure offers two key improvements relevant to large-scale production: 1) bubbling air through the reaction mixture in the second step to remove acetone greatly improves the recovery efficiency of the crude diferulates; and 2) telescoping minor impurities directly into the alkaline hydrolysis step eliminates the need for additional column purifications, thus reducing the overall cost of production and removing a major impediment to process scale-up.

  5. Diffractive optics fabricated by direct write methods with an electron beam

    NASA Technical Reports Server (NTRS)

    Kress, Bernard; Zaleta, David; Daschner, Walter; Urquhart, Kris; Stein, Robert; Lee, Sing H.

    1993-01-01

    State-of-the-art diffractive optics are fabricated using e-beam lithography and dry etching techniques to achieve multilevel phase elements with very high diffraction efficiencies. One of the major challenges encountered in fabricating diffractive optics is the small feature size (e.g. for diffractive lenses with small f-number). It is not only the e-beam system which dictates the feature size limitations, but also the alignment systems (mask aligner) and the materials (e-beam and photo resists). In order to allow diffractive optics to be used in new optoelectronic systems, it is necessary not only to fabricate elements with small feature sizes but also to do so in an economical fashion. Since price of a multilevel diffractive optical element is closely related to the e-beam writing time and the number of etching steps, we need to decrease the writing time and etching steps without affecting the quality of the element. To do this one has to utilize the full potentials of the e-beam writing system. In this paper, we will present three diffractive optics fabrication techniques which will reduce the number of process steps, the writing time, and the overall fabrication time for multilevel phase diffractive optics.

  6. Low Complexity Compression and Speed Enhancement for Optical Scanning Holography

    PubMed Central

    Tsang, P. W. M.; Poon, T.-C.; Liu, J.-P.; Kim, T.; Kim, Y. S.

    2016-01-01

    In this paper we report a low complexity compression method that is suitable for compact optical scanning holography (OSH) systems with different optical settings. Our proposed method can be divided into 2 major parts. First, an automatic decision maker is applied to select the rows of holographic pixels to be scanned. This process enhances the speed of acquiring a hologram and also lowers the data rate. Second, each row of down-sampled pixels is converted into a one-bit representation with delta modulation (DM). Existing DM-based hologram compression techniques suffer from the disadvantage that a core parameter, commonly known as the step size, has to be determined in advance. However, the correct value of the step size for compressing each row of the hologram depends on the dynamic range of the pixels, which can deviate significantly with the object scene as well as between OSH systems with different optical settings. We have overcome this problem by incorporating a dynamic step-size adjustment scheme. The proposed method is applied to the compression of holograms acquired with 2 different OSH systems, demonstrating a compression ratio of over two orders of magnitude while preserving favorable fidelity of the reconstructed images. PMID:27708410
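
    The sketch below is a schematic of one-bit delta modulation of a pixel row, with the step size derived from the row's dynamic range to mimic the idea of a per-row (dynamic) step size. It is not the authors' exact adjustment scheme, and the row values and divisor are arbitrary.

      # Schematic one-bit DM encode/decode of a hologram row (synthetic values).
      def dm_encode(row, divisor=4.0):
          step = (max(row) - min(row)) / divisor or 1.0   # per-row step size
          est, bits = row[0], []
          for sample in row[1:]:
              bit = 1 if sample >= est else 0
              est += step if bit else -step               # one-bit tracking
              bits.append(bit)
          return row[0], step, bits

      def dm_decode(first, step, bits):
          est, out = first, [first]
          for bit in bits:
              est += step if bit else -step
              out.append(est)
          return out

      row = [10, 12, 15, 20, 18, 14, 9, 7, 8, 11]
      first, step, bits = dm_encode(row)
      print(bits)
      print([round(v, 1) for v in dm_decode(first, step, bits)])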

  7. Use of aluminum phosphate as the dehydration catalyst in single step dimethyl ether process

    DOEpatents

    Peng, Xiang-Dong; Parris, Gene E.; Toseland, Bernard A.; Battavio, Paula J.

    1998-01-01

    The present invention pertains to a process for the coproduction of methanol and dimethyl ether (DME) directly from a synthesis gas in a single step (hereafter, the "single step DME process"). In this process, the synthesis gas comprising hydrogen and carbon oxides is contacted with a dual catalyst system comprising a physical mixture of a methanol synthesis catalyst and a methanol dehydration catalyst. The present invention is an improvement to this process for providing an active and stable catalyst system. The improvement comprises the use of an aluminum phosphate based catalyst as the methanol dehydration catalyst. Due to its moderate acidity, such a catalyst avoids the coke formation and catalyst interaction problems associated with the conventional dual catalyst systems taught for the single step DME process.

  8. Producing optical (contact) lenses by a novel low cost process

    NASA Astrophysics Data System (ADS)

    Skipper, Richard S.; Spencer, Ian D.

    2005-09-01

    The rapid and impressive growth of China has been achieved on the back of highly labour-intensive industries, often in manufacturing, and at the cost of companies and jobs in Europe and America. Approaches that worked well in the 1990s to reduce production costs in the developed countries are no longer effective when confronted with the low labour costs of China and India. We have looked at contact lenses as a product that has become highly available to consumers here, but as an industry that has reduced costs by moving to low-labour-cost countries. The question to be answered was, "Do we have the skill to still make the product in the UK, and can we make it cheap enough to export to China?" If we do not, then contact lens manufacture will move to China sooner or later. The challenge to enter the markets of the BRIC (Brazil, Russia, India and China) countries is extremely exciting, as here is the new money, high growth, and a product that sells to those with disposable incomes. To succeed we knew we had to be radical in our approach; the radical step was very simple: to devise a process in which each step added value to the customer and not cost to the product. The presentation examines the processes used by the major producers and how, by applying good manufacturing practice and sound scientific principles to them, the opportunity to design a new low-cost patented process was identified.

  9. Engineered Multifunctional Surfaces for Fluid Handling

    NASA Technical Reports Server (NTRS)

    Thomas, Chris; Ma, Yonghui; Weislogel, Mark

    2012-01-01

    Designs incorporating variations in capillary geometry and hydrophilic and/or antibacterial surface properties have been developed that are capable of passive gas/liquid separation and passive water flow. These designs can incorporate capillary grooves and/or surfaces arranged to create linear and circumferential capillary geometry at the micro and macro scale, radial fin configurations, micro holes and patterns, and combinations of the above. The antibacterial property of this design inhibits the growth of bacteria or the development of biofilm. The hydrophilic property reduces the water contact angle with a treated substrate such that water spreads into a thin layer atop the treated surface. These antibacterial and hydrophilic properties applied to a thermally conductive surface, combined with capillary geometry, create a novel heat exchanger capable of condensing water from a humid, two-phase water and gas flow onto the treated heat exchanger surfaces, and passively separating the condensed water from the gas flow in a reduced gravity application. The overall process to generate the antibacterial and hydrophilic properties includes multiple steps to generate the two different surface properties, and can be divided into two major steps. Step 1 uses a magnetron-based sputtering technique to implant the silver atoms into the base material. A layer of silver is built up on top of the base material. Completion of this step provides the antibacterial property. Step 2 uses a cold-plasma technique to generate the hydrophilic surface property on top of the silver layer generated in Step 1. Completion of this step provides the hydrophilic property in addition to the antibacterial property. Thermally conductive materials are fabricated and then treated to create the antibacterial and hydrophilic surface properties. The individual parts are assembled to create a condensing heat exchanger with antibacterial and hydrophilic surface properties and capillary geometry, which is capable of passive phase separation in a reduced gravity application. The plasma processes for creating antibacterial and hydrophilic surface properties are suitable for applications where water is present on an exposed surface for an extended time, such that bacteria or biofilms could form, and where there is a need to manage the water on the surface. The processes are also suitable for applications where only the hydrophilic property is needed. In particular, the processes are applicable to condensing heat exchangers (CHXs), which benefit from the antibacterial properties as well as the hydrophilic properties. Water condensing onto the control surfaces of the CHX will provide the moist conditions necessary for the growth of bacteria and the formation of biofilms. The antibacterial properties of the base layer (silver) will mitigate and prevent the growth of bacteria and formation of biofilms that would otherwise reduce the CHX performance. In addition, the hydrophilic properties reduce the water contact angle and prevent water droplets from bridging between control surfaces. Overall, the hydrophilic properties reduce the pressure drop across the CHX.

  10. Frequency-velocity mismatch: a fundamental abnormality in parkinsonian gait.

    PubMed

    Cho, Catherine; Kunin, Mikhail; Kudo, Koji; Osaki, Yasuhiro; Olanow, C Warren; Cohen, Bernard; Raphan, Theodore

    2010-03-01

    Gait dysfunction and falling are major sources of disability for patients with advanced Parkinson's disease (PD). It is presently thought that the fundamental defect is an inability to generate normal stride length. Our data suggest, however, that the basic problem in PD gait is an impaired ability to match step frequency to walking velocity. In this study, foot movements of PD and normal subjects were monitored with an OPTOTRAK motion-detection system while they walked on a treadmill at different velocities. PD subjects were also paced with auditory stimuli at different frequencies. PD gait was characterized by step frequencies that were faster and stride lengths that were shorter than those of normal controls. At low walking velocities, PD stepping had a reduced or absent terminal toe lift, which truncated swing phases, producing shortened steps. Auditory pacing was not able to normalize step frequency at these lower velocities. Peak forward toe velocities increased with walking velocity and PD subjects could initiate appropriate foot dynamics during initial phases of the swing. They could not control the foot appropriately in terminal phases, however. Increased treadmill velocity, which matched the natural PD step frequency, generated a second toe lift, normalizing step size. Levodopa increased the bandwidth of step frequencies, but was not as effective as increases in walking velocity in normalizing gait. We postulate that the inability to control step frequency and adjust swing phase dynamics to slower walking velocities are major causes for the gait impairment in PD.

  11. Frequency-Velocity Mismatch: A Fundamental Abnormality in Parkinsonian Gait

    PubMed Central

    Kunin, Mikhail; Kudo, Koji; Osaki, Yasuhiro; Olanow, C. Warren; Cohen, Bernard; Raphan, Theodore

    2010-01-01

    Gait dysfunction and falling are major sources of disability for patients with advanced Parkinson's disease (PD). It is presently thought that the fundamental defect is an inability to generate normal stride length. Our data suggest, however, that the basic problem in PD gait is an impaired ability to match step frequency to walking velocity. In this study, foot movements of PD and normal subjects were monitored with an OPTOTRAK motion-detection system while they walked on a treadmill at different velocities. PD subjects were also paced with auditory stimuli at different frequencies. PD gait was characterized by step frequencies that were faster and stride lengths that were shorter than those of normal controls. At low walking velocities, PD stepping had a reduced or absent terminal toe lift, which truncated swing phases, producing shortened steps. Auditory pacing was not able to normalize step frequency at these lower velocities. Peak forward toe velocities increased with walking velocity and PD subjects could initiate appropriate foot dynamics during initial phases of the swing. They could not control the foot appropriately in terminal phases, however. Increased treadmill velocity, which matched the natural PD step frequency, generated a second toe lift, normalizing step size. Levodopa increased the bandwidth of step frequencies, but was not as effective as increases in walking velocity in normalizing gait. We postulate that the inability to control step frequency and adjust swing phase dynamics to slower walking velocities are major causes for the gait impairment in PD. PMID:20042701

  12. Fostering Autonomy through Syllabus Design: A Step-by-Step Guide for Success

    ERIC Educational Resources Information Center

    Ramírez Espinosa, Alexánder

    2016-01-01

    Promoting learner autonomy is relevant in the field of applied linguistics due to the multiple benefits it brings to the process of learning a new language. However, despite the vast array of research on how to foster autonomy in the language classroom, it is difficult to find step-by-step processes to design syllabi and curricula focused on the…

  13. Disruption of striatal-enriched protein tyrosine phosphatase (STEP) function in neuropsychiatric disorders

    PubMed Central

    Karasawa, Takatoshi; Lombroso, Paul J.

    2014-01-01

    Striatal-enriched protein tyrosine phosphatase (STEP) is a brain-specific tyrosine phosphatase that plays a major role in the development of synaptic plasticity. Recent findings have implicated STEP in several psychiatric and neurological disorders, including Alzheimer’s disease, schizophrenia, fragile X syndrome, Huntington’s disease, stroke/ischemia, and stress-related psychiatric disorders. In these disorders, STEP protein expression levels and activity are dysregulated, contributing to the cognitive deficits that are present. In this review, we focus on the most recent findings on STEP, discuss how STEP expression and activity are maintained during normal cognitive function, and how disruptions in STEP activity contribute to a number of illnesses. PMID:25218562

  14. Treating the Tough Adolescent: A Family-Based, Step-by-Step Guide. The Guilford Family Therapy Series.

    ERIC Educational Resources Information Center

    Sells, Scott P.

    A model for treating difficult adolescents and their families is presented. Part 1 offers six basic assumptions about the causes of severe behavioral problems and presents the treatment model with guidelines necessary to address each of these six causes. Case examples highlight and clarify major points within each of the 15 procedural steps of the…

  15. Coupling of Spinosad Fermentation and Separation Process via Two-Step Macroporous Resin Adsorption Method.

    PubMed

    Zhao, Fanglong; Zhang, Chuanbo; Yin, Jing; Shen, Yueqi; Lu, Wenyu

    2015-08-01

    In this paper, a two-step resin adsorption technology was investigated for spinosad production and separation as follows: in the first step, resin is added to the fermentor early in the cultivation period to decrease the product concentration in the broth during fermentation; in the second step, resin is added after fermentation to adsorb and extract the spinosad. Based on this, a two-step macroporous resin adsorption-membrane separation process for spinosad fermentation, separation, and purification was established. The spinosad concentration in a 5-L fermentor increased by 14.45% after adding 50 g/L of macroporous resin at the beginning of fermentation. The established two-step macroporous resin adsorption-membrane separation process achieved 95.43% purity and 87% yield for spinosad, both higher than the 93.23% purity and 79.15% yield obtained by conventional crystallization of spinosad from the aqueous phase. The two-step macroporous resin adsorption method not only couples spinosad fermentation and separation but also increases spinosad productivity. In addition, the two-step macroporous resin adsorption-membrane separation process performs better in terms of spinosad yield and purity.

  16. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  17. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  18. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  19. 49 CFR 40.63 - What steps does the collector take in the collection process before the employee provides a urine...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... collection process before the employee provides a urine specimen? 40.63 Section 40.63 Transportation Office... PROGRAMS Urine Specimen Collections § 40.63 What steps does the collector take in the collection process before the employee provides a urine specimen? As the collector, you must take the following steps before...

  20. Improving the two-step remediation process for CCA-treated wood. Part I, Evaluating oxalic acid extraction

    Treesearch

    Carol Clausen

    2004-01-01

    In this study, three possible improvements to a remediation process for chromated-copper-arsenate (CCA) treated wood were evaluated. The process involves two steps: oxalic acid extraction of wood fiber followed by bacterial culture with Bacillus licheniformis CC01. The three potential improvements to the oxalic acid extraction step were (1) reusing oxalic acid for...

  1. A First Step in Learning Analytics: Pre-Processing Low-Level Alice Logging Data of Middle School Students

    ERIC Educational Resources Information Center

    Werner, Linda; McDowell, Charlie; Denner, Jill

    2013-01-01

    Educational data mining can miss or misidentify key findings about student learning without a transparent process of analyzing the data. This paper describes the first steps in the process of using low-level logging data to understand how middle school students used Alice, an initial programming environment. We describe the steps that were…

  2. Interference resolution in major depression.

    PubMed

    Joormann, Jutta; Nee, Derek Evan; Berman, Marc G; Jonides, John; Gotlib, Ian H

    2010-03-01

    In two experiments, we investigated individual differences in the ability to resolve interference in participants diagnosed with major depressive disorder (MDD). Participants were administered the "Ignore/Suppress" task, a short-term memory task composed of two steps. In Step 1 ("ignore"), participants were instructed to memorize a set of stimuli while ignoring simultaneously presented irrelevant material. In Step 2 ("suppress"), participants were instructed to forget a subset of the previously memorized material. The ability to resolve interference was indexed by response latencies on two recognition tasks in which participants decided whether a probe was a member of the target set. In Step 1, we compared response latencies to probes from the to-be-ignored list with response latencies to nonrecently presented items. In Step 2, we compared response latencies to probes from the to-be-suppressed list with response latencies to nonrecently presented items. The results indicate that, compared with control participants, depressed participants exhibited increased interference in the "suppress" but not in the "ignore" step of the task, when the stimuli were negative words. No group differences were obtained when we presented letters instead of emotional words. These findings indicate that depression is associated with difficulty in removing irrelevant negative material from short-term memory.

  3. Semantic Service Matchmaking in the ATM Domain Considering Infrastructure Capability Constraints

    NASA Astrophysics Data System (ADS)

    Moser, Thomas; Mordinyi, Richard; Sunindyo, Wikan Danar; Biffl, Stefan

    In a service-oriented environment business processes flexibly build on software services provided by systems in a network. A key design challenge is the semantic matchmaking of business processes and software services in two steps: 1. Find for one business process the software services that meet or exceed the BP requirements; 2. Find for all business processes the software services that can be implemented within the capability constraints of the underlying network, which poses a major problem since even for small scenarios the solution space is typically very large. In this chapter we analyze requirements from mission-critical business processes in the Air Traffic Management (ATM) domain and introduce an approach for semi-automatic semantic matchmaking for software services, the “System-Wide Information Sharing” (SWIS) business process integration framework. A tool-supported semantic matchmaking process like SWIS can provide system designers and integrators with a set of promising software service candidates and therefore strongly reduces the human matching effort by focusing on a much smaller space of matchmaking candidates. We evaluate the feasibility of the SWIS approach in an industry use case from the ATM domain.
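
    The two matchmaking steps described above can be pictured with a toy filter-then-check routine: keep the services that meet each business process's requirements, then keep only combinations that fit an infrastructure capability constraint such as total bandwidth. The sketch below is not the SWIS implementation; the service catalogue, requirement fields, and constraint are hypothetical.

      # Toy two-step matchmaking: per-process candidates, then a network-level check.
      from itertools import product

      services = {                                   # hypothetical service catalogue
          "radar_feed_a": {"latency_ms": 200, "bandwidth_mbps": 20},
          "radar_feed_b": {"latency_ms": 80,  "bandwidth_mbps": 50},
          "wx_service":   {"latency_ms": 150, "bandwidth_mbps": 10},
      }

      def candidates(requirements):
          """Step 1: services meeting or exceeding one business process's needs."""
          return [name for name, cap in services.items()
                  if cap["latency_ms"] <= requirements["max_latency_ms"]]

      def feasible_assignments(process_reqs, network_bandwidth_mbps):
          """Step 2: combinations that also fit the infrastructure capability."""
          per_process = [candidates(req) for req in process_reqs]
          for combo in product(*per_process):
              used = sum(services[name]["bandwidth_mbps"] for name in set(combo))
              if used <= network_bandwidth_mbps:
                  yield combo

      reqs = [{"max_latency_ms": 250}, {"max_latency_ms": 100}]
      print(list(feasible_assignments(reqs, network_bandwidth_mbps=60)))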

  4. Extraction of astaxanthin from microalgae: process design and economic feasibility study

    NASA Astrophysics Data System (ADS)

    Zgheib, Nancy; Saade, Roxana; Khallouf, Rindala; Takache, Hosni

    2018-03-01

    In this work, the process design and the economic feasibility of natural astaxanthin extraction from Haematococcus pluvialis have been reported. A complete drawing of the process was first produced, and the process was then designed around five main steps: harvesting, cell disruption, spray drying, supercritical CO2 extraction and anaerobic digestion. The major components of the facility would include sedimentation tanks, a disk stack centrifuge, a bed miller, a spray dryer, a multistage compressor, an extractor, a pasteurizer and a digester. All units were sized assuming a feedstock of 10 kg/h of dried biomass, producing nearly 2592 kg of astaxanthin per year. The investment payback time and the return on investment were estimated for different market prices of astaxanthin. Based on the results, the production process becomes economically feasible for a market price higher than 1500/kg. A payback period of 1 year and an ROI of 113% were estimated for an astaxanthin market price of 6000/kg.
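
    The payback and ROI figures above follow from standard relationships between annual profit and capital investment. The sketch below shows the arithmetic only; the capital investment and operating cost are hypothetical placeholders, not values from the study, and only the production rate and the 6000/kg price point are taken from the abstract.

      # Illustrative ROI and payback arithmetic (capital and operating costs assumed).
      annual_production_kg = 2592
      market_price_per_kg = 6000          # price point quoted in the abstract
      capital_investment = 12_000_000     # hypothetical
      annual_operating_cost = 2_000_000   # hypothetical

      annual_profit = annual_production_kg * market_price_per_kg - annual_operating_cost
      roi = annual_profit / capital_investment            # return on investment
      payback_years = capital_investment / annual_profit  # simple payback time
      print(f"ROI = {roi:.0%}, payback = {payback_years:.1f} years")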

  5. Comparison of Campylobacter contamination levels on chicken carcasses between modern and traditional types of slaughtering facilities in Malaysia.

    PubMed

    Rejab, Saira Banu Mohamed; Zessin, Karl-Hans; Fries, Reinhard; Patchanee, Prapas

    2012-01-01

    A total of 360 samples including fresh fecal droppings, neck skins, and swab samples was collected from 24 broiler flocks and processed by 12 modern processing plants in 6 states in Malaysia. Ninety samples from 10 traditional wet markets located in the same states as modern processing plants were also collected. Microbiological isolation for Campylobacter was performed following ISO 10272-1:2006 (E). The overall rate of contamination for Campylobacter in modern processing plants and in traditional wet markets was 61.1% (220/360) and 85.6% (77/90), respectively. Campylobacter jejuni was detected as the majority with approximately 70% for both facilities. In the modern processing plants, the contamination rate for Campylobacter gradually declined from 80.6% before the inside-outside washing to 62.5% after inside-outside washing and to 38.9% after the post chilling step. The contamination rate for Campylobacter from processed chicken neck skin in traditional wet markets (93.3%) was significantly (P<0.01) higher than in modern processing plants (38.9%).

  6. Biodiesel production from waste frying oils and its quality control.

    PubMed

    Sabudak, T; Yildiz, M

    2010-05-01

    The use of biodiesel as a fuel from alternative sources has increased considerably over recent years, affording numerous environmental benefits. Biodiesel, an alternative fuel for diesel engines, is produced from renewable sources such as vegetable oils or animal fats. However, the high costs involved in marketing biodiesel constitute a major obstacle. In this regard, the use of waste frying oils (WFO) should produce a marked reduction in the cost of biodiesel, given the ready availability of WFO at a relatively low price. In the present study, waste frying oils collected from several McDonald's restaurants in Istanbul were used to produce biodiesel. Biodiesel from WFO was prepared by means of three different transesterification processes: a one-step base-catalyzed, a two-step base-catalyzed, and a two-step acid-catalyzed transesterification followed by base transesterification. No detailed previous studies providing information on a two-step acid-catalyzed transesterification followed by a base (CH(3)ONa) transesterification are present in the literature. Each reaction was allowed to take place with and without tetrahydrofuran added as a co-solvent. Following production, three different purification procedures, washing with distilled water, dry washing with magnesol, and using an ion-exchange resin, were applied and the best outcome determined. The biodiesel obtained was analyzed to verify compliance with the European Standard 14214 (EN 14214), which also corresponds to the Turkish Biodiesel Standards. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  7. The Effect of Surfactant and Compatibilizer on Inorganic Loading and Properties of PPO-based EPMM Membranes

    NASA Astrophysics Data System (ADS)

    Bissadi, Golnaz

    Hybrid membranes represent a promising alternative to the limitations of organic and inorganic materials for high productivity and selectivity gas separation membranes. In this study, the previously developed concept of emulsion-polymerized mixed matrix (EPMM) membranes was further advanced by investigating the effects of surfactant and compatibilizer on inorganic loading in poly(2,6-dimethyl-1,4-phenylene oxide) (PPO)-based EPMM membranes, in which inorganic part of the membranes originated from tetraethylorthosilicate (TEOS). The polymerization of TEOS, which consists of hydrolysis of TEOS and condensation of the hydrolyzed TEOS, was carried out as (i) one- and (ii) two-step processes. In the one-step process, the hydrolysis and condensation take place in the same environment of a weak acid provided by the aqueous solution of aluminum hydroxonitrate and sodium carbonate. In the two-step process, the hydrolysis takes place in the environment of a strong acid (solution of hydrochloric acid), whereas the condensation takes place in weak base environment obtained by adding excess of the ammonium hydroxide solution to the acidic solution of the hydrolyzed TEOS. For both one- and two-step processes, the emulsion polymerization of TEOS was carried out in two types of emulsions made of (i) pure trichloroethylene (TCE) solvent, and (ii) 10 w/v% solution of PPO in TCE, using different combinations of the compatibilizer (ethanol) and the surfactant (n-octanol). The experiments with pure TCE, which are referred to as a gravimetric powder method (GPM) allowed assessing the effect of different experimental parameters on the conversion of TEOS. The GPM tests also provided a guide for the synthesis of casting emulsions containing PPO, from which the EPMM membranes were prepared using a spin coating technique. The synthesized EPMM membranes were characterized using 29Si nuclear magnetic resonance (29Si NMR), differential scanning calorimetry (DSC), inductively coupled plasma mass spectrometry (ICP-MS), and gas permeation measurements carried out in a constant pressure (CP) system. The 29Si NMR analysis verified polymerization of TEOS in the emulsions made of pure TCE, and the PPO solution in TCE. The conversions of TEOS in the two-step process in the two types of emulsions were very close to each other. In the case of the one-step process, the conversions in the TCE emulsion were significantly greater than those in the emulsion of the PPO solution in TCE. Consequently, the conversions of TEOS in the EPMM membranes made in the two-step process were greater than those in the EPMM membranes made in the one-step process. The latter ranged between 10 - 20%, while the highest conversion in the two-step process was 74% in the presence of pure compatibilizer with no surfactant. Despite greater conversions and hence the greater inorganic loadings, the EPMM membranes prepared in the two-step process had glass transition temperatures (Tg) only slightly greater than the reference PPO membranes. In contrast, despite relatively low inorganic loadings, the EPMM membranes prepared in the one-step process had Tgs markedly greater than PPO, and showed the expected trend of an increase in Tg with the inorganic loading. These results indicate that in the case of the one-step process the polymerized TEOS was well integrated with the PPO chains and the interactions between the two phases lead to high Tgs. 
On the other hand, this was not the case for the EPMM membranes prepared in the two-step process, suggesting possible phase separation between the polymerized TEOS and the organic phase. The latter was confirmed by detecting no selectivity in the EPMM membranes prepared by the two-step process. In contrast, the EPMM membranes prepared in the one-step process in the presence of the compatibilizer and no surfactant showed 50% greater O2 permeability coefficient and a slightly greater O2/N2 permeability ratio compared to the reference PPO membranes.

  8. Ancient numerical daemons of conceptual hydrological modeling: 1. Fidelity and efficiency of time stepping schemes

    NASA Astrophysics Data System (ADS)

    Clark, Martyn P.; Kavetski, Dmitri

    2010-10-01

    A major neglected weakness of many current hydrological models is the numerical method used to solve the governing model equations. This paper thoroughly evaluates several classes of time stepping schemes in terms of numerical reliability and computational efficiency in the context of conceptual hydrological modeling. Numerical experiments are carried out using 8 distinct time stepping algorithms and 6 different conceptual rainfall-runoff models, applied in a densely gauged experimental catchment, as well as in 12 basins with diverse physical and hydroclimatic characteristics. Results show that, over vast regions of the parameter space, the numerical errors of fixed-step explicit schemes commonly used in hydrology routinely dwarf the structural errors of the model conceptualization. This substantially degrades model predictions, but also, disturbingly, generates fortuitously adequate performance for parameter sets where numerical errors compensate for model structural errors. Simply running fixed-step explicit schemes with shorter time steps provides a poor balance between accuracy and efficiency: in some cases daily-step adaptive explicit schemes with moderate error tolerances achieved comparable or higher accuracy than 15 min fixed-step explicit approximations but were nearly 10 times more efficient. From the range of simple time stepping schemes investigated in this work, the fixed-step implicit Euler method and the adaptive explicit Heun method emerge as good practical choices for the majority of simulation scenarios. In combination with the companion paper, where impacts on model analysis, interpretation, and prediction are assessed, this two-part study vividly highlights the impact of numerical errors on critical performance aspects of conceptual hydrological models and provides practical guidelines for robust numerical implementation.
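
    The contrast drawn above between fixed-step explicit schemes and adaptive explicit schemes can be made concrete on a toy problem. The sketch below integrates a single linear reservoir dS/dt = P - k*S, a stand-in for one store of a conceptual rainfall-runoff model, with a fixed-step explicit Euler scheme and with an adaptive Heun step using a simple embedded error estimate. The equation, parameters, and step-control rule are illustrative assumptions, not the study's models or code.

      # Toy comparison: fixed-step explicit Euler vs. adaptive Heun on dS/dt = P - k*S.
      import math

      P, k, S0, T = 5.0, 2.0, 10.0, 2.0          # forcing, recession, initial store, horizon
      exact = P / k + (S0 - P / k) * math.exp(-k * T)

      def euler_fixed(dt):
          S, t = S0, 0.0
          while t < T - 1e-12:
              S += dt * (P - k * S)
              t += dt
          return S

      def heun_adaptive(tol=1e-4, dt=0.5):
          S, t = S0, 0.0
          while t < T - 1e-12:
              dt = min(dt, T - t)
              f0 = P - k * S
              pred = S + dt * f0                       # Euler predictor
              f1 = P - k * pred
              corr = S + 0.5 * dt * (f0 + f1)          # Heun corrector
              err = abs(corr - pred)                   # embedded error estimate
              if err <= tol:
                  S, t = corr, t + dt                  # accept step
              dt *= 0.9 * min(2.0, max(0.2, math.sqrt(tol / max(err, 1e-15))))
          return S

      print("exact         :", round(exact, 5))
      print("Euler, dt=0.5 :", round(euler_fixed(0.5), 5))
      print("adaptive Heun :", round(heun_adaptive(), 5))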

  9. Materials and processes laboratory composite materials characterization task, part 1. Damage tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Tucker, D. S.; Patterson, W. J.; Franklin, S. W.; Gordon, G. H.; Hart, L.; Hodge, A. J.; Lance, D. G.; Russel, S. S.

    1991-01-01

    A test run was performed on IM6/3501-6 carbon-epoxy in which the material was processed, machined into specimens, and tested for damage tolerance capabilities. Nondestructive test data played a major role in this element of composite characterization. A time chart was produced showing the time the composite material spent within each Branch or Division in order to identify those areas which produce a long turnaround time. Instrumented drop weight testing was performed on the specimens with nondestructive evaluation being performed before and after the impacts. Destructive testing in the form of cross-sectional photomicrography and compression-after-impact testing were used. Results show that the processing and machining steps needed to be performed more rapidly if data on composite material is to be collected within a reasonable timeframe. The results of the damage tolerance testing showed that IM6/3501-6 is a brittle material that is very susceptible to impact damage.

  10. Resin bleed improvement on surface mount semiconductor device

    NASA Astrophysics Data System (ADS)

    Rajoo, Indra Kumar; Tahir, Suraya Mohd; Aziz, Faieza Abdul; Shamsul Anuar, Mohd

    2018-04-01

    Resin bleed is a transparent layer of epoxy compound that forms during the molding process but is difficult to detect after molding. Resin bleed on the leads of the package studied here, SOD123, can cause solderability failure at the end customer, and units that fail at the customer are registered as customer complaints. Generally, the semiconductor company has to perform visual inspection after the plating process to detect resin bleed. A mold chase with an excess hole, a split cavity, and a stepped ejector pin hole design were found to be the major root causes of resin bleed in this company. The proposed solutions, modification of the mold chase, changing the split cavity to a solid cavity, and redesign of the ejector pin, were derived from a detailed study and analysis. The proposed solutions yielded good results during the pilot run, with zero occurrences of resin bleed for 3 consecutive months.

  11. Investigations for the Recycle of Pyroprocessed Uranium

    NASA Astrophysics Data System (ADS)

    Westphal, B. R.; Price, J. C.; Chambers, E. E.; Patterson, M. N.

    Given the renewed interest in uranium from the pyroprocessing of used nuclear fuel in a molten salt system, the two biggest hurdles for marketing the uranium are radiation levels and transuranic content. A radiation level as low as possible is desired so that handling operations can be performed directly with the uranium. The transuranic content of the uranium will affect the subsequent waste streams generated and thus should also be minimized. Although the pyroprocessing technology was originally developed without regard to radiation and transuranic levels, adaptations to the process have been considered. Process conditions have been varied during the distillation and casting cycles of the process, with increasing temperature showing the largest effect on the reduction of radiation levels. Transuranic levels can be reduced significantly by incorporating a pre-step in the salt distillation operation to remove a majority of the salt prior to distillation.

  12. Omics on bioleaching: current and future impacts.

    PubMed

    Martinez, Patricio; Vera, Mario; Bobadilla-Fazzini, Roberto A

    2015-10-01

    Bioleaching corresponds to the microbial-catalyzed process of conversion of insoluble metals into soluble forms. As an applied biotechnology globally used, it represents an extremely interesting field of research where omics techniques can be applied in terms of knowledge development, but moreover in terms of process design, control, and optimization. In this mini-review, the current state of genomics, proteomics, and metabolomics of bioleaching and the major impacts of these analytical methods at industrial scale are highlighted. In summary, genomics has been essential in the determination of the biodiversity of leaching processes and for development of conceptual and functional metabolic models. Proteomic impacts are mostly related to microbe-mineral interaction analysis, including copper resistance and biofilm formation. Early steps of metabolomics in the field of bioleaching have shown a significant potential for the use of metabolites as industrial biomarkers. Development directions are given in order to enhance the future impacts of the omics in biohydrometallurgy.

  13. GRI: The Gamma-Ray Imager mission

    NASA Astrophysics Data System (ADS)

    Knödlseder, Jürgen; GRI Consortium

    With the INTEGRAL observatory ESA has provided a unique tool to the astronomical community revealing hundreds of sources, new classes of objects, extraordinary views of antimatter annihilation in our Galaxy, and fingerprints of recent nucleosynthesis processes. While INTEGRAL provides the global overview over the soft gamma-ray sky, there is a growing need to perform deeper, more focused investigations of gamma-ray sources. In soft X-rays a comparable step was taken going from the Einstein and the EXOSAT satellites to the Chandra and XMM/Newton observatories. Technological advances in the past years in the domain of gamma-ray focusing using Laue diffraction have paved the way towards a new gamma-ray mission, providing major improvements regarding sensitivity and angular resolution. Such a future Gamma-Ray Imager will allow studies of particle acceleration processes and explosion physics in unprecedented detail, providing essential clues on the innermost nature of the most violent and most energetic processes in the Universe.

  14. GRI: The Gamma-Ray Imager mission

    NASA Astrophysics Data System (ADS)

    Knödlseder, Jürgen; GRI Consortium

    2006-06-01

    With the INTEGRAL observatory, ESA has provided a unique tool to the astronomical community revealing hundreds of sources, new classes of objects, extraordinary views of antimatter annihilation in our Galaxy, and fingerprints of recent nucleosynthesis processes. While INTEGRAL provides the global overview over the soft gamma-ray sky, there is a growing need to perform deeper, more focused investigations of gamma-ray sources. In soft X-rays a comparable step was taken going from the Einstein and the EXOSAT satellites to the Chandra and XMM/Newton observatories. Technological advances in the past years in the domain of gamma-ray focusing using Laue diffraction have paved the way towards a new gamma-ray mission, providing major improvements regarding sensitivity and angular resolution. Such a future Gamma-Ray Imager will allow the study of particle acceleration processes and explosion physics in unprecedented detail, providing essential clues on the innermost nature of the most violent and most energetic processes in the Universe.

  15. Simultaneous concentration and detoxification of lignocellulosic hydrolyzates by vacuum membrane distillation coupled with adsorption.

    PubMed

    Zhang, Yaqin; Li, Ming; Wang, Yafei; Ji, Xiaosheng; Zhang, Lin; Hou, Lian

    2015-12-01

    Low sugar concentration and the presence of various inhibitors are the major challenges associated with lignocellulosic hydrolyzates as a fermentation broth. A vacuum membrane distillation (VMD) process can be used to concentrate sugars and remove inhibitors (furans) efficiently, but it is less suited to the removal of less volatile inhibitors such as acetic acid. In this study, a VMD-adsorption process was proposed to improve the removal of acetic acid, achieving simultaneous concentration and detoxification of lignocellulosic hydrolyzates in a one-step process. Results showed that sugars were concentrated with high rejections (>98%) and little sugar loss (<2%), together with a significant reduction in nearly all furans (99.7%) and acetic acid (83.5%) under optimal operating conditions. Fermentation results showed that ethanol production from hydrolyzates concentrated and detoxified using the VMD-adsorption method was approximately 10-fold greater than from untreated hydrolyzates. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Automatic processing of spoken dialogue in the home hemodialysis domain.

    PubMed

    Lacson, Ronilda; Barzilay, Regina

    2005-01-01

    Spoken medical dialogue is a valuable source of information, and it forms a foundation for diagnosis, prevention and therapeutic management. However, understanding even a perfect transcript of spoken dialogue is challenging for humans because of the lack of structure and the verbosity of dialogues. This work presents a first step towards automatic analysis of spoken medical dialogue. The backbone of our approach is an abstraction of a dialogue into a sequence of semantic categories. This abstraction uncovers structure in informal, verbose conversation between a caregiver and a patient, thereby facilitating automatic processing of dialogue content. Our method induces this structure based on a range of linguistic and contextual features that are integrated in a supervised machine-learning framework. Our model has a classification accuracy of 73%, compared to 33% achieved by a majority baseline (p<0.01). This work demonstrates the feasibility of automatically processing spoken medical dialogue.
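
    The "majority baseline" comparison quoted above simply means always predicting the most frequent semantic category. The short sketch below shows the idea on synthetic labels; the category names and predictions are made up for illustration.

      # Majority-class baseline vs. a model's predictions (synthetic labels).
      from collections import Counter

      def accuracy(gold, predicted):
          return sum(g == p for g, p in zip(gold, predicted)) / len(gold)

      gold = ["clinical", "technical", "clinical", "backchannel", "clinical", "misc"]
      model_pred = ["clinical", "technical", "clinical", "misc", "clinical", "misc"]

      majority_label = Counter(gold).most_common(1)[0][0]
      baseline_pred = [majority_label] * len(gold)

      print("majority baseline:", accuracy(gold, baseline_pred))   # 0.5
      print("model            :", accuracy(gold, model_pred))      # ~ 0.83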

  17. [Clinical and etiopathogenetic role of plasminogen and metaloproteinase systems in the tumor growth. Pericellular proteolysis of extracellular matrix and tumor growth].

    PubMed

    Cosić, Sanda Jelisavac; Kovac, Zdenko

    2011-01-01

    Pericellular proteolysis is a cascade process involved in the degradation of the extracellular matrix, and it participates in various physiological and pathological processes. Its major functions are the degradation of tissue stroma and the weakening of intercellular connections, but it also contributes to the synthesis of bioactive molecules (cytokines, growth factors and inhibitory factors). The plasminogen system is involved in fibrinolysis and initiates metalloproteinase activation. The activity of proteolytic molecules is controlled by the rate of zymogen activation, the half-life of the molecules, and the action of inhibitory molecules. Inhibition is achieved through direct binding of inhibitor and enzyme and proceeds in a few steps. Pericellular proteolysis is involved in tumor invasion and metastasis, inflammatory reactions, degenerative diseases and other diseases. The pathophysiological regulation of pericellular proteolysis in these diseases contributes to their clinical properties and has diagnostic and therapeutic importance.

  18. High-quality bulk hybrid perovskite single crystals within minutes by inverse temperature crystallization

    NASA Astrophysics Data System (ADS)

    Saidaminov, Makhsud I.; Abdelhady, Ahmed L.; Murali, Banavoth; Alarousu, Erkki; Burlakov, Victor M.; Peng, Wei; Dursun, Ibrahim; Wang, Lingfei; He, Yao; Maculan, Giacomo; Goriely, Alain; Wu, Tom; Mohammed, Omar F.; Bakr, Osman M.

    2015-07-01

    Single crystals of methylammonium lead trihalide perovskites (MAPbX3; MA=CH3NH3+, X=Br- or I-) have shown remarkably low trap density and charge transport properties; however, growth of such high-quality semiconductors is a time-consuming process. Here we present a rapid crystal growth process to obtain MAPbX3 single crystals, an order of magnitude faster than previous reports. The process is based on our observation of the substantial decrease of MAPbX3 solubility, in certain solvents, at elevated temperatures. The crystals can be both size- and shape-controlled by manipulating the different crystallization parameters. Despite the rapidity of the method, the grown crystals exhibit transport properties and trap densities comparable to the highest quality MAPbX3 reported to date. The phenomenon of inverse or retrograde solubility and its correlated inverse temperature crystallization strategy present a major step forward for advancing the field on perovskite crystallization.

  19. Characterization of landfill leachates by molecular size distribution, biodegradability, and inert chemical oxygen demand.

    PubMed

    Amaral, Míriam C S; Ferreira, Cynthia F A; Lange, Liséte Celina; Aquino, Sérgio F

    2009-05-01

    This work presents results from a detailed characterization of landfill leachates of different ages from a landfill in a major Brazilian city. The characterization consists of determining the molecular size distribution, the inert chemical oxygen demand (COD), and the biodegradability under both aerobic and anaerobic processes. Results show that leachate with a high COD concentration has low biodegradability. A significant fraction of the COD is not characterized as proteins, carbohydrates, or lipids, which reinforces the hypothesis that this remaining fraction, present in all molecular size fractions (less than 1 kDa; between 1 and 10 kDa; between 10 and 100 kDa; and greater than 100 kDa), is refractory. These results suggest that leachates with such characteristics require treatment systems that use physical-chemical processes as a pre- or post-treatment step to biological processes.

  20. Full-waveform data for building roof step edge localization

    NASA Astrophysics Data System (ADS)

    Słota, Małgorzata

    2015-08-01

    Airborne laser scanning data perfectly represent flat or gently sloped areas; to date, however, accurate breakline detection is the main drawback of this technique. This issue becomes particularly important in the case of modeling buildings, where accuracy higher than the footprint size is often required. This article covers several issues related to full-waveform data registered on building step edges. First, the full-waveform data simulator was developed and presented in this paper. Second, this article provides a full description of the changes in echo amplitude, echo width and returned power caused by the presence of edges within the laser footprint. Additionally, two important properties of step edge echoes, peak shift and echo asymmetry, were noted and described. It was shown that these properties lead to incorrect echo positioning along the laser center line and can significantly reduce the edge points' accuracy. For these reasons and because all points are aligned with the center of the beam, regardless of the actual target position within the beam footprint, we can state that step edge points require geometric corrections. This article presents a novel algorithm for the refinement of step edge points. The main distinguishing advantage of the developed algorithm is the fact that none of the additional data, such as emitted signal parameters, beam divergence, approximate edge geometry or scanning settings, are required. The proposed algorithm works only on georeferenced profiles of reflected laser energy. Another major advantage is the simplicity of the calculation, allowing for very efficient data processing. Additionally, the developed method of point correction allows for the accurate determination of points lying on edges and edge point densification. For this reason, fully automatic localization of building roof step edges based on LiDAR full-waveform data with higher accuracy than the size of the lidar footprint is feasible.
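
    The echo properties analysed in the article (amplitude, width, peak position and its shift, asymmetry) are typically obtained by fitting the returned waveform. The sketch below fits a single Gaussian to a synthetic return and reports a peak shift against an assumed reference position; it is only an illustration of that kind of waveform processing, not the paper's refinement algorithm, which operates on georeferenced profiles of reflected laser energy.

```python
# Illustrative only: a synthetic return and a single-Gaussian fit as a stand-in
# for full-waveform echo analysis near a roof step edge.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amplitude, center, width):
    return amplitude * np.exp(-0.5 * ((t - center) / width) ** 2)

t = np.linspace(0.0, 20.0, 400)                 # time samples (ns)
true_center, true_width = 10.4, 1.8             # echo broadened/shifted by the edge
waveform = gaussian(t, 120.0, true_center, true_width)
waveform += np.random.default_rng(0).normal(0.0, 2.0, t.size)  # receiver noise

p0 = [waveform.max(), t[np.argmax(waveform)], 1.0]
(amp, center, width), _ = curve_fit(gaussian, t, waveform, p0=p0)

reference_center = 10.0   # expected echo position for a flat target (assumed)
print(f"amplitude {amp:.1f}, width {width:.2f} ns, "
      f"peak shift {center - reference_center:+.2f} ns")
```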

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yueh-Ning; Hennebelle, Patrick; Chabrier, Gilles, E-mail: yueh-ning.lee@cea.fr

    Observations suggest that star formation in filamentary molecular clouds occurs in a two-step process, with the formation of filaments preceding that of prestellar cores and stars. Here, we apply the gravoturbulent fragmentation theory of Hennebelle and Chabrier to a filamentary environment, taking into account magnetic support. We discuss the induced geometrical effect on the cores, with a transition from 3D geometry at small scales to 1D at large ones. The model predicts the fragmentation behavior of a filament for a given mass per unit length (MpL) and level of magnetization. This core mass function (CMF) for individual filaments is then convolved with the distribution of filaments to obtain the final system CMF. The model yields two major results. (i) The filamentary geometry naturally induces a hierarchical fragmentation process, first into groups of cores, separated by a length equal to a few filament Jeans lengths, i.e., a few times the filament width. These groups then fragment into individual cores. (ii) Non-magnetized filaments with high MpL are found to fragment excessively, at odds with observations. This is resolved by taking into account the magnetic field (treated simply as additional pressure support). The present theory suggests two complementary modes of star formation: although small (spherical or filamentary) structures will collapse directly into prestellar cores, according to the standard Hennebelle–Chabrier theory, the large (filamentary) ones, the dominant population according to observations, will follow the aforedescribed two-step process.

  2. Comparison of Tobacco Host Cell Protein Removal Methods by Blanching Intact Plants or by Heat Treatment of Extracts.

    PubMed

    Buyel, Johannes F; Hubbuch, Jürgen; Fischer, Rainer

    2016-08-08

    Plants not only provide food, feed and raw materials for humans, but have also been developed as an economical production system for biopharmaceutical proteins, such as antibodies, vaccine candidates and enzymes. These must be purified from the plant biomass but chromatography steps are hindered by the high concentrations of host cell proteins (HCPs) in plant extracts. However, most HCPs irreversibly aggregate at temperatures above 60 °C facilitating subsequent purification of the target protein. Here, three methods are presented to achieve the heat precipitation of tobacco HCPs in either intact leaves or extracts. The blanching of intact leaves can easily be incorporated into existing processes but may have a negative impact on subsequent filtration steps. The opposite is true for heat precipitation of leaf extracts in a stirred vessel, which can improve the performance of downstream operations albeit with major changes in process equipment design, such as homogenizer geometry. Finally, a heat exchanger setup is well characterized in terms of heat transfer conditions and easy to scale, but cleaning can be difficult and there may be a negative impact on filter capacity. The design-of-experiments approach can be used to identify the most relevant process parameters affecting HCP removal and product recovery. This facilitates the application of each method in other expression platforms and the identification of the most suitable method for a given purification strategy.

  3. Valorization of Proteins from Co- and By-Products from the Fish and Meat Industry.

    PubMed

    Aspevik, Tone; Oterhals, Åge; Rønning, Sissel Beate; Altintzoglou, Themistoklis; Wubshet, Sileshi Gizachew; Gildberg, Asbjørn; Afseth, Nils Kristian; Whitaker, Ragnhild Dragøy; Lindberg, Diana

    2017-06-01

    Large volumes of protein-rich residual raw materials, such as heads, bones, carcasses, blood, skin, viscera, hooves and feathers, are created as a result of processing of animals from fisheries, aquaculture, livestock and poultry sectors. These residuals contain proteins and other essential nutrients with potentially bioactive properties, eligible for recycling and upgrading for higher-value products, e.g. for human, pet food and feed purposes. Here, we aim to cover all the important aspects of achieving optimal utilization of proteins in such residual raw materials, identifying those eligible for human consumption as co-products and for feed applications as by-products. Strict legislation regulates the utilization of various animal-based co- and by-products, representing a major hurdle if not addressed properly. Thorough understanding and optimization of all parts of the production chain, including conservation and processing, are important prerequisites for successful upgrading and industrial implementation of such products. This review includes industrially applied technologies such as freezing/cooling, acid preservation, salting, rendering and protein hydrolysis. In this regard, it is important to achieve stable production and quality through all the steps in the manufacturing chain, preferably supported by at- or online quality control points in the actual processing step. If aiming for the human market, knowledge of consumer trends and awareness are important for production and successful introduction of new products and ingredients.

  4. Organic solar cells: understanding the role of Förster resonance energy transfer.

    PubMed

    Feron, Krishna; Belcher, Warwick J; Fell, Christopher J; Dastoor, Paul C

    2012-12-12

    Organic solar cells have the potential to become a low-cost sustainable energy source. Understanding the photoconversion mechanism is key to the design of efficient organic solar cells. In this review, we discuss the processes involved in the photo-electron conversion mechanism, which may be subdivided into exciton harvesting, exciton transport, exciton dissociation, charge transport and extraction stages. In particular, we focus on the role of energy transfer as described by Förster resonance energy transfer (FRET) theory in the photoconversion mechanism. FRET plays a major role in exciton transport, harvesting and dissociation. The spectral absorption range of organic solar cells may be extended using sensitizers that efficiently transfer absorbed energy to the photoactive materials. The limitations of Förster theory to accurately calculate energy transfer rates are discussed. Energy transfer is the first step of an efficient two-step exciton dissociation process and may also be used to preferentially transport excitons to the heterointerface, where efficient exciton dissociation may occur. However, FRET also competes with charge transfer at the heterointerface, turning it into a potential loss mechanism. An energy cascade comprising both energy transfer and charge transfer may aid in separating charges and is briefly discussed. Considering the extent to which the photo-electron conversion efficiency is governed by energy transfer, optimisation of this process offers the prospect of improved organic photovoltaic performance and thus aids in realising the potential of organic solar cells.
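
    For reference, the standard Förster expressions underlying the rate discussion relate the transfer rate and efficiency to the donor-acceptor separation r and the Förster radius R_0:

```latex
k_{\mathrm{FRET}}(r) = \frac{1}{\tau_D}\left(\frac{R_0}{r}\right)^{6},
\qquad
E_{\mathrm{FRET}} = \frac{R_0^{6}}{R_0^{6} + r^{6}},
```

    where \tau_D is the donor exciton lifetime in the absence of the acceptor and R_0 (typically a few nanometres) encapsulates the donor-acceptor spectral overlap, the donor quantum yield, and the dipole orientation factor.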

  5. A stepped strategy that aims at the nationwide implementation of the Enhanced Recovery After Surgery programme in major gynaecological surgery: study protocol of a cluster randomised controlled trial.

    PubMed

    de Groot, Jeanny Ja; Maessen, José Mc; Slangen, Brigitte Fm; Winkens, Bjorn; Dirksen, Carmen D; van der Weijden, Trudy

    2015-07-30

    Enhanced Recovery After Surgery (ERAS) programmes aim at an early recovery after surgical trauma and consequently at a reduced length of hospitalisation. This paper presents the protocol for a study that focuses on large-scale implementation of the ERAS programme in major gynaecological surgery in the Netherlands. The trial will evaluate effectiveness and costs of a stepped implementation approach that is characterised by tailoring the intensity of implementation activities to the needs of organisations and local barriers for change, in comparison with the generic breakthrough strategy that is usually applied in large-scale improvement projects in the Netherlands. All Dutch hospitals authorised to perform major abdominal surgery in gynaecological oncology patients are eligible for inclusion in this cluster randomised controlled trial. The hospitals that already fully implemented the ERAS programme in their local perioperative management or those who predominantly admit gynaecological surgery patients to an external hospital replacement care facility will be excluded. Cluster randomisation will be applied at the hospital level and will be stratified based on tertiary status. Hospitals will be randomly assigned to the stepped implementation strategy or the breakthrough strategy. The control group will receive the traditional breakthrough strategy with three educational sessions and the use of plan-do-study-act cycles for planning and executing local improvement activities. The intervention group will receive an innovative stepped strategy comprising four levels of intensity of support. Implementation starts with generic low-cost activities and may build up to the highest level of tailored and labour-intensive activities. The decision for a stepwise increase in intensive support will be based on the success of implementation so far. Both implementation strategies will be completed within 1 year and evaluated on effect, process, and cost-effectiveness. The primary outcome is length of postoperative hospital stay. Additional outcome measures are length of recovery, guideline adherence, and mean implementation costs per patient. This study takes up the challenge to evaluate an efficient strategy for large-scale implementation. Comparing effectiveness and costs of two different approaches, this study will help to define a preferred strategy for nationwide dissemination of best practices. Dutch Trial Register NTR4058.

  6. Systemic safety project selection tool.

    DOT National Transportation Integrated Search

    2013-07-01

    "The Systemic Safety Project Selection Tool presents a process for incorporating systemic safety planning into traditional safety management processes. The Systemic Tool provides a step-by-step process for conducting systemic safety analysis; conside...

  7. Failure mode and effects analysis and fault tree analysis of surface image guided cranial radiosurgery.

    PubMed

    Manger, Ryan P; Paxton, Adam B; Pawlicki, Todd; Kim, Gwe-Ya

    2015-05-01

    Surface image guided, Linac-based radiosurgery (SIG-RS) is a modern approach for delivering radiosurgery that utilizes optical stereoscopic imaging to monitor the surface of the patient during treatment in lieu of using a head frame for patient immobilization. Considering the novelty of the SIG-RS approach and the severity of errors associated with delivery of large doses per fraction, a risk assessment should be conducted to identify potential hazards, determine their causes, and formulate mitigation strategies. The purpose of this work is to investigate SIG-RS using the combined application of failure modes and effects analysis (FMEA) and fault tree analysis (FTA), report on the effort required to complete the analysis, and evaluate the use of FTA in conjunction with FMEA. A multidisciplinary team was assembled to conduct the FMEA on the SIG-RS process. A process map detailing the steps of the SIG-RS was created to guide the FMEA. Failure modes were determined for each step in the SIG-RS process, and risk priority numbers (RPNs) were estimated for each failure mode to facilitate risk stratification. The failure modes were ranked by RPN, and FTA was used to determine the root factors contributing to the riskiest failure modes. Using the FTA, mitigation strategies were formulated to address the root factors and reduce the risk of the process. The RPNs were re-estimated based on the mitigation strategies to determine the margin of risk reduction. The FMEA and FTAs for the top two failure modes required an effort of 36 person-hours (30 person-hours for the FMEA and 6 person-hours for two FTAs). The SIG-RS process consisted of 13 major subprocesses and 91 steps, which amounted to 167 failure modes. Of the 91 steps, 16 were directly related to surface imaging. Twenty-five failure modes resulted in a RPN of 100 or greater. Only one of these top 25 failure modes was specific to surface imaging. The riskiest surface imaging failure mode had an overall RPN-rank of eighth. Mitigation strategies for the top failure mode decreased the RPN from 288 to 72. Based on the FMEA performed in this work, the use of surface imaging for monitoring intrafraction position in Linac-based stereotactic radiosurgery (SRS) did not greatly increase the risk of the Linac-based SRS process. In some cases, SIG helped to reduce the risk of Linac-based RS. The FMEA was augmented by the use of FTA since it divided the failure modes into their fundamental components, which simplified the task of developing mitigation strategies.
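
    In FMEA the risk priority number is conventionally the product of severity, occurrence, and detectability scores, each typically rated on a 1-10 scale; the reported drop from 288 to 72 is consistent, for example, with one of the three scores falling from 8 to 2 while the other two stay unchanged. The sketch below shows that bookkeeping with purely illustrative failure modes and scores, not the SIG-RS study's data.

```python
# Minimal FMEA bookkeeping sketch: RPN = severity x occurrence x detectability.
# The failure modes and scores are illustrative, not the SIG-RS study's data.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detectability

modes = [
    FailureMode("wrong isocenter shift applied", 9, 4, 7),
    FailureMode("surface ROI drawn on wrong region", 6, 8, 6),   # RPN 288
    FailureMode("camera calibration out of date", 7, 3, 5),
]

# Risk stratification: rank failure modes by RPN, highest first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name}: RPN = {m.rpn}")

# Re-score after a mitigation, e.g. an independent check drops occurrence 8 -> 2.
modes[1].occurrence = 2
print(f"after mitigation: {modes[1].name}: RPN = {modes[1].rpn}")   # 72
```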

  8. 3D Texture Features Mining for MRI Brain Tumor Identification

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra

    2014-03-01

    Medical image segmentation is the process of extracting regions of interest and dividing an image into individually meaningful, homogeneous components. These components have a strong relationship with the objects of interest in the image. Medical image segmentation is a mandatory initial step for computer-aided diagnosis and therapy, and it is a challenging task because of the complex nature of medical images. Indeed, successful medical image analysis depends heavily on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, but 2D texture features yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with an SVM as the segmentation technique in the testing methodology.
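
    A minimal sketch of the idea of classifying voxels from statistics of their 3D neighbourhoods with an SVM. The hand-rolled features (patch mean, variance, gradient energy) are simple stand-ins for the paper's 3D texture descriptors, and the volume and labels are synthetic.

```python
# Illustrative sketch: simple 3D patch statistics as stand-ins for 3D texture
# features, classified with an SVM. Volume and labels are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
volume = rng.normal(0.0, 1.0, (32, 32, 32))
volume[10:22, 10:22, 10:22] += rng.normal(2.0, 1.5, (12, 12, 12))  # "lesion" region
mask = np.zeros(volume.shape, dtype=bool)
mask[10:22, 10:22, 10:22] = True

def patch_features(vol, z, y, x, r=2):
    """Texture-like statistics of the (2r+1)^3 neighbourhood of a voxel."""
    p = vol[z - r:z + r + 1, y - r:y + r + 1, x - r:x + r + 1]
    grad_energy = sum(np.mean(g ** 2) for g in np.gradient(p))
    return [p.mean(), p.var(), grad_energy]

coords = [(z, y, x) for z in range(2, 30, 3)
          for y in range(2, 30, 3) for x in range(2, 30, 3)]
X = np.array([patch_features(volume, *c) for c in coords])
y = np.array([mask[c] for c in coords], dtype=int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
print("voxel classification accuracy:", clf.score(X_te, y_te))
```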

  9. Low-energy (anti)neutrino physics with Borexino: Neutrinos from the primary proton-proton fusion process in the Sun

    NASA Astrophysics Data System (ADS)

    Mosteiro, P.; Bellini, G.; Benziger, J.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Cadonati, L.; Calaprice, F.; Caminata, A.; Cavalcante, P.; Chavarría, Á.; Chepurnov, A.; D'Angelo, D.; Davini, S.; Derbin, A.; Empl, A.; Etenko, A.; Fomenko, K.; Franco, D.; Gabriele, F.; Galbiati, C.; Gazzana, S.; Ghiano, C.; Giammarchi, M.; Göger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Hungerford, E.; Ianni, Al.; Ianni, An.; Kobychev, V.; Korablëv, D.; Korga, G.; Kryn, D.; Laubenstein, M.; Lehnert, B.; Lewke, T.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Marcocci, S.; Meindl, Q.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Muratova, V.; Oberauer, L.; Obolensky, M.; Ortica, F.; Otis, K.; Pallavicini, M.; Papp, L.; Perasso, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Rossi, N.; Saldanha, R.; Salvo, C.; Schönert, S.; Simgen, H.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Vignaud, D.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Winter, J.; Wojcik, M.; Wright, A.; Wurm, M.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.

    2015-08-01

    The Sun is fueled by a series of nuclear reactions that produce the energy that makes it shine. The primary reaction is the fusion of two protons into a deuteron, a positron and a neutrino. These neutrinos constitute the vast majority of neutrinos reaching Earth, providing us with key information about what goes on at the core of our star. Several experiments have now confirmed the observation of neutrino oscillations by detecting neutrinos from secondary nuclear processes in the Sun; this is the first direct spectral measurement of the neutrinos from the keystone proton-proton fusion. This observation is a crucial step towards the completion of the spectroscopy of pp-chain neutrinos, as well as further validation of the LMA-MSW model of neutrino oscillations.

  10. Containerless Measurements of Density and Viscosity of Fe-Co Alloys

    NASA Technical Reports Server (NTRS)

    Lee, Jonghyun; Choufani, Paul; Bradshaw, Richard C.; Hyers, Robert W.; Matson, Douglas M.

    2012-01-01

    During the past years, extensive collaborative research has been done to understand phase selection in undercooled metals using novel containerless processing techniques such as electrostatic and electromagnetic levitation. Of major interest is controlling a two-step solidification process, double recalescence, in which the metastable phase forms first and then transforms to the stable phase after a certain delay time. The previous research has shown that the delay time is greatly influenced by the internal convection velocity. In the prediction of internal flow, the fidelity of the results depends on the accuracy of the material properties. This research focuses on the measurements of density and viscosity of Fe-Co alloys which will be used for the fluid simulations whose results will support upcoming International Space Station flight experiments.

  11. Optimizing resource allocation and patient flow: process analysis and reorganization in three chemotherapy outpatient clinics.

    PubMed

    Holmes, Morgan; Bodie, Kelly; Porter, Geoffrey; Sullivan, Victoria; Tarasuk, Joy; Trembley, Jodie; Trudeau, Maureen

    2010-01-01

    Optimizing human and physical resources is a major concern for cancer care decision-makers and practitioners. This issue is particularly acute in the context of ambulatory outpatient chemotherapy clinics, especially when - as is the case almost everywhere in the industrialized world - the number of people requiring systemic therapy is increasing while budgets, staffing and physical space remain static. Recent initiatives at three hospital-based chemotherapy units - in Halifax, Toronto and Kingston - shed light on the value of process analysis and reorganization for using existing human and physical resources to their full potential, improving patient flow and enhancing patient satisfaction. The steps taken in these settings are broadly applicable to other healthcare settings and would likely result in similar benefits in those environments.

  12. The RiverFish Approach to Business Process Modeling: Linking Business Steps to Control-Flow Patterns

    NASA Astrophysics Data System (ADS)

    Zuliane, Devanir; Oikawa, Marcio K.; Malkowski, Simon; Alcazar, José Perez; Ferreira, João Eduardo

    Despite the recent advances in the area of Business Process Management (BPM), today’s business processes have largely been implemented without clearly defined conceptual modeling. This results in growing difficulties for identification, maintenance, and reuse of rules, processes, and control-flow patterns. To mitigate these problems in future implementations, we propose a new approach to business process modeling using conceptual schemas, which represent hierarchies of concepts for rules and processes shared among collaborating information systems. This methodology bridges the gap between conceptual model description and identification of actual control-flow patterns for workflow implementation. We identify modeling guidelines that are characterized by clear phase separation, step-by-step execution, and process building through diagrams and tables. The separation of business process modeling in seven mutually exclusive phases clearly delimits information technology from business expertise. The sequential execution of these phases leads to the step-by-step creation of complex control-flow graphs. The process model is refined through intuitive table and diagram generation in each phase. Not only does the rigorous application of our modeling framework minimize the impact of rule and process changes, but it also facilitates the identification and maintenance of control-flow patterns in BPM-based information system architectures.

  13. Processes for producing low cost, high efficiency silicon solar cells

    DOEpatents

    Rohatgi, Ajeet; Doshi, Parag; Tate, John Keith; Mejia, Jose; Chen, Zhizhang

    1998-06-16

    Processes which utilize rapid thermal processing (RTP) are provided for inexpensively producing high efficiency silicon solar cells. The RTP processes preserve minority carrier bulk lifetime τ and permit selective adjustment of the depth of the diffused regions, including emitter and back surface field (bsf), within the silicon substrate. In a first RTP process, an RTP step is utilized to simultaneously diffuse phosphorus and aluminum into the front and back surfaces, respectively, of a silicon substrate. Moreover, an in situ controlled cooling procedure preserves the carrier bulk lifetime τ and permits selective adjustment of the depth of the diffused regions. In a second RTP process, both simultaneous diffusion of the phosphorus and aluminum as well as annealing of the front and back contacts are accomplished during the RTP step. In a third RTP process, the RTP step accomplishes simultaneous diffusion of the phosphorus and aluminum, annealing of the contacts, and annealing of a double-layer antireflection/passivation coating SiN/SiOx. In a fourth RTP process, the process of applying front and back contacts is broken up into two separate respective steps, which enhances the efficiency of the cells, at a slight time expense. In a fifth RTP process, a second RTP step is utilized to fire and adhere the screen printed or evaporated contacts to the structure.

  14. Endoscopic or surgical step-up approach for infected necrotising pancreatitis: a multicentre randomised trial.

    PubMed

    van Brunschot, Sandra; van Grinsven, Janneke; van Santvoort, Hjalmar C; Bakker, Olaf J; Besselink, Marc G; Boermeester, Marja A; Bollen, Thomas L; Bosscha, Koop; Bouwense, Stefan A; Bruno, Marco J; Cappendijk, Vincent C; Consten, Esther C; Dejong, Cornelis H; van Eijck, Casper H; Erkelens, Willemien G; van Goor, Harry; van Grevenstein, Wilhelmina M U; Haveman, Jan-Willem; Hofker, Sijbrand H; Jansen, Jeroen M; Laméris, Johan S; van Lienden, Krijn P; Meijssen, Maarten A; Mulder, Chris J; Nieuwenhuijs, Vincent B; Poley, Jan-Werner; Quispel, Rutger; de Ridder, Rogier J; Römkens, Tessa E; Scheepers, Joris J; Schepers, Nicolien J; Schwartz, Matthijs P; Seerden, Tom; Spanier, B W Marcel; Straathof, Jan Willem A; Strijker, Marin; Timmer, Robin; Venneman, Niels G; Vleggaar, Frank P; Voermans, Rogier P; Witteman, Ben J; Gooszen, Hein G; Dijkgraaf, Marcel G; Fockens, Paul

    2018-01-06

    Infected necrotising pancreatitis is a potentially lethal disease and an indication for invasive intervention. The surgical step-up approach is the standard treatment. A promising alternative is the endoscopic step-up approach. We compared both approaches to see whether the endoscopic step-up approach was superior to the surgical step-up approach in terms of clinical and economic outcomes. In this multicentre, randomised, superiority trial, we recruited adult patients with infected necrotising pancreatitis and an indication for invasive intervention from 19 hospitals in the Netherlands. Patients were randomly assigned to either the endoscopic or the surgical step-up approach. The endoscopic approach consisted of endoscopic ultrasound-guided transluminal drainage followed, if necessary, by endoscopic necrosectomy. The surgical approach consisted of percutaneous catheter drainage followed, if necessary, by video-assisted retroperitoneal debridement. The primary endpoint was a composite of major complications or death during 6-month follow-up. Analyses were by intention to treat. This trial is registered with the ISRCTN registry, number ISRCTN09186711. Between Sept 20, 2011, and Jan 29, 2015, we screened 418 patients with pancreatic or extrapancreatic necrosis, of which 98 patients were enrolled and randomly assigned to the endoscopic step-up approach (n=51) or the surgical step-up approach (n=47). The primary endpoint occurred in 22 (43%) of 51 patients in the endoscopy group and in 21 (45%) of 47 patients in the surgery group (risk ratio [RR] 0·97, 95% CI 0·62-1·51; p=0·88). Mortality did not differ between groups (nine [18%] patients in the endoscopy group vs six [13%] patients in the surgery group; RR 1·38, 95% CI 0·53-3·59, p=0·50), nor did any of the major complications included in the primary endpoint. In patients with infected necrotising pancreatitis, the endoscopic step-up approach was not superior to the surgical step-up approach in reducing major complications or death. The rate of pancreatic fistulas and length of hospital stay were lower in the endoscopy group. The outcome of this trial will probably result in a shift to the endoscopic step-up approach as treatment preference. The Dutch Digestive Disease Foundation, Fonds NutsOhra, and the Netherlands Organization for Health Research and Development. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Hetero-diffusion of Au epitaxy on stepped Ag(110) surface: Study of the jump rate and diffusion coefficient

    NASA Astrophysics Data System (ADS)

    Benlattar, M.; El koraychy, E.; Kotri, A.; Mazroui, M.

    2017-12-01

    We have used molecular dynamics simulations combined with an interatomic potential derived from the embedded atom method, to investigate the hetero-diffusion of Au adatom near a stepped Ag(110) surface with the height of one monoatomic layer. The activation energies for different diffusion processes, which occur on the terrace and near the step edge, are calculated both by molecular statics and molecular dynamics simulations. Static energies are found by the drag method, whereas the dynamic barriers are computed at high temperature from the Arrhenius plots. Our numerical results reveal that the jump process requires very high activation energy compared to the exchange process either on the terrace or near the step edge. In this work, other processes, such as upward and downward diffusion at step edges, have also been discussed.
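
    The dynamic barriers mentioned here follow from the standard Arrhenius analysis of rates measured at several temperatures:

```latex
\Gamma(T) = \Gamma_0 \exp\!\left(-\frac{E_a}{k_B T}\right)
\quad\Longrightarrow\quad
\ln \Gamma = \ln \Gamma_0 - \frac{E_a}{k_B}\,\frac{1}{T},
```

    so the slope of \ln\Gamma versus 1/T in the Arrhenius plot gives -E_a/k_B, from which the activation energy E_a of each diffusion process (jump or exchange, on the terrace or at the step edge) is extracted, with the prefactor \Gamma_0 acting as the attempt frequency.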

  16. 45 CFR 16.7 - The first steps in the appeal process: The notice of appeal and the Board's response.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false The first steps in the appeal process: The notice... SERVICES GENERAL ADMINISTRATION PROCEDURES OF THE DEPARTMENTAL GRANT APPEALS BOARD § 16.7 The first steps... of these procedures, and advise the appellant of the next steps. The Board will also send a copy of...

  17. Application of a 2-step process for the biological treatment of sulfidic spent caustics.

    PubMed

    de Graaff, Marco; Klok, Johannes B M; Bijmans, Martijn F M; Muyzer, Gerard; Janssen, Albert J H

    2012-03-01

    This research demonstrates the feasibility and advantages of a 2-step process for the biological treatment of sulfidic spent caustics under halo-alkaline conditions (i.e. pH 9.5; Na⁺ = 0.8 M). Experiments with synthetically prepared solutions were performed in a continuously fed system consisting of two gas-lift reactors in series operated at aerobic conditions at 35 °C. The detoxification of sulfide to thiosulfate in the first step allowed the successful biological treatment of total-S loading rates up to 33 mmol L⁻¹ day⁻¹. In the second, biological step, the remaining sulfide and thiosulfate were completely converted to sulfate by haloalkaliphilic sulfide-oxidizing bacteria. Mathematical modeling of the 2-step process shows that under the prevailing conditions an optimal reactor configuration consists of 40% 'abiotic' and 60% 'biological' volume, whilst the total reactor volume is 22% smaller than for the 1-step process. Copyright © 2011 Elsevier Ltd. All rights reserved.
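
    A plausible overall stoichiometry for the two stages described, given as generic textbook chemistry rather than equations taken from the paper, is partial abiotic oxidation of sulfide to thiosulfate followed by complete biological oxidation to sulfate:

```latex
2\,\mathrm{HS^-} + 2\,\mathrm{O_2} \longrightarrow \mathrm{S_2O_3^{2-}} + \mathrm{H_2O} \quad \text{(abiotic first step)} \\
\mathrm{S_2O_3^{2-}} + 2\,\mathrm{O_2} + \mathrm{H_2O} \longrightarrow 2\,\mathrm{SO_4^{2-}} + 2\,\mathrm{H^+} \quad \text{(biological second step)}
```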

  18. 45 CFR 16.8 - The next step in the appeal process: Preparation of an appeal file and written argument.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false The next step in the appeal process: Preparation of an appeal file and written argument. 16.8 Section 16.8 Public Welfare DEPARTMENT OF HEALTH AND... step in the appeal process: Preparation of an appeal file and written argument. Except in expedited...

  19. Major transcriptome re-organisation and abrupt changes in signalling, cell cycle and chromatin regulation at neural differentiation in vivo.

    PubMed

    Olivera-Martinez, Isabel; Schurch, Nick; Li, Roman A; Song, Junfang; Halley, Pamela A; Das, Raman M; Burt, Dave W; Barton, Geoffrey J; Storey, Kate G

    2014-08-01

    Here, we exploit the spatial separation of temporal events of neural differentiation in the elongating chick body axis to provide the first analysis of transcriptome change in progressively more differentiated neural cell populations in vivo. Microarray data, validated against direct RNA sequencing, identified: (1) a gene cohort characteristic of the multi-potent stem zone epiblast, which contains neuro-mesodermal progenitors that progressively generate the spinal cord; (2) a major transcriptome re-organisation as cells then adopt a neural fate; and (3) increasing diversity as neural patterning and neuron production begin. Focussing on the transition from multi-potent to neural state cells, we capture changes in major signalling pathways, uncover novel Wnt and Notch signalling dynamics, and implicate new pathways (mevalonate pathway/steroid biogenesis and TGFβ). This analysis further predicts changes in cellular processes, cell cycle, RNA-processing and protein turnover as cells acquire neural fate. We show that these changes are conserved across species and provide biological evidence for reduced proteasome efficiency and a novel lengthening of S phase. This latter step may provide time for epigenetic events to mediate large-scale transcriptome re-organisation; consistent with this, we uncover simultaneous downregulation of major chromatin modifiers as the neural programme is established. We further demonstrate that transcription of one such gene, HDAC1, is dependent on FGF signalling, making a novel link between signals that control neural differentiation and transcription of a core regulator of chromatin organisation. Our work implicates new signalling pathways and dynamics, cellular processes and epigenetic modifiers in neural differentiation in vivo, identifying multiple new potential cellular and molecular mechanisms that direct differentiation. © 2014. Published by The Company of Biologists Ltd.

  20. Structure and assembly of desmosome junctions: biosynthesis, processing, and transport of the major protein and glycoprotein components in cultured epithelial cells.

    PubMed

    Penn, E J; Hobson, C; Rees, D A; Magee, A I

    1987-07-01

    Extracts of metabolically labeled cultured epithelial cells have been analyzed by immunoprecipitation followed by SDS-PAGE, using antisera to the major high molecular mass proteins and glycoproteins (greater than 100 kD) from desmosomes of bovine muzzle epidermis. For nonstratifying cells (Madin-Darby canine kidney [MDCK] and Madin-Darby bovine kidney), and A431 cells that have lost the ability to stratify through transformation, and a stratifying cell type (primary human keratinocytes) apparently similar polypeptides were immunoprecipitated with our antisera. These comprised three glycoproteins (DGI, DGII, and DGIII) and one major nonglycosylated protein (DPI). DPII, which has already been characterized by others in stratifying tissues, appeared to be absent or present in greatly reduced amounts in the nonstratifying cell types. The desmosome glycoproteins were further characterized in MDCK cells. Pulse-chase studies showed all three DGs were separate translation products. The two major glycoprotein families (DGI and DGII/III) were both found to be synthesized with co-translational addition of 2-4 high mannose cores later processed into complex type chains. However, they became endo-beta-N-acetylglucosaminidase H resistant at different times (DGII/III being slower). None of the DGs were found to have O-linked oligosaccharides unlike bovine muzzle DGI. Transport to the cell surface was rapid for all glycoproteins (60-120 min) as demonstrated by the rate at which they became sensitive to trypsin in intact cells. This also indicated that they were exposed at the outer cell surface. DGII/III, but not DGI, underwent a proteolytic processing step, losing 10 kD of carbohydrate-free peptide, during transport to the cell surface suggesting a possible regulatory mechanism in desmosome assembly.

  1. Method: automatic segmentation of mitochondria utilizing patch classification, contour pair classification, and automatically seeded level sets

    PubMed Central

    2012-01-01

    Background While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial blockface scanning electron microscopic data. Previously developed texture based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. Results We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour pair classification and level set operations improve segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. Conclusions We demonstrated that texture based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. While we used a random-forest based patch classifier to recognize texture, it would be possible to replace this with other texture identifiers, and we plan to explore this in future work. PMID:22321695
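
    A compressed stand-in for the flavour of such a pipeline: a random-forest patch classifier (step 1) produces a mitochondria probability map, and its most confident pixels seed a constrained region growing used here in place of the automatically seeded level-set step; the contour-pair classification step is omitted. The image, patch size, and thresholds are synthetic and illustrative, not the Cytoseg implementation.

```python
# Illustrative stand-in for a Cytoseg-style pipeline: (1) random-forest patch
# classification and (3) seeded growth in place of the level-set step; step (2),
# contour-pair classification, is omitted. All data here are synthetic.
import numpy as np
from scipy import ndimage as ndi
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
image = rng.normal(0.0, 1.0, (128, 128))
image[40:80, 50:90] += 2.0                       # fake "mitochondrion" region
truth = np.zeros(image.shape, dtype=bool)
truth[40:80, 50:90] = True

def patches(img, r=3):
    """Flattened (2r+1)^2 patches for every interior pixel."""
    feats, coords = [], []
    for yy in range(r, img.shape[0] - r):
        for xx in range(r, img.shape[1] - r):
            feats.append(img[yy - r:yy + r + 1, xx - r:xx + r + 1].ravel())
            coords.append((yy, xx))
    return np.array(feats), coords

X, coords = patches(image)
y = np.array([truth[c] for c in coords], dtype=int)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)  # step 1
prob = np.zeros(image.shape)
prob[tuple(np.array(coords).T)] = clf.predict_proba(X)[:, 1]

seeds = prob > 0.9            # confident pixels seed the final step
allowed = prob > 0.5          # region the growth (level-set stand-in) may enter
grown = seeds.copy()
for _ in range(20):           # constrained growth smooths and fills the region
    grown = ndi.binary_dilation(grown) & allowed

dice = 2 * (grown & truth).sum() / (grown.sum() + truth.sum())
print(f"Dice vs. ground truth: {dice:.2f}")
```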

  2. Novel Cyclosilazane-Type Silicon Precursor and Two-Step Plasma for Plasma-Enhanced Atomic Layer Deposition of Silicon Nitride.

    PubMed

    Park, Jae-Min; Jang, Se Jin; Lee, Sang-Ick; Lee, Won-Jun

    2018-03-14

    We designed cyclosilazane-type silicon precursors and proposed a three-step plasma-enhanced atomic layer deposition (PEALD) process to prepare silicon nitride films with high quality and excellent step coverage. The cyclosilazane-type precursor, 1,3-di-isopropylamino-2,4-dimethylcyclosilazane (CSN-2), has a closed ring structure for good thermal stability and high reactivity. CSN-2 showed thermal stability up to 450 °C and a sufficient vapor pressure of 4 Torr at 60 °C. The energy for the chemisorption of CSN-2 on the undercoordinated silicon nitride surface as calculated by the density functional theory method was -7.38 eV. The PEALD process window was between 200 and 500 °C, with a growth rate of 0.43 Å/cycle. The best film quality was obtained at 500 °C, with hydrogen impurity of ∼7 atom %, oxygen impurity less than 2 atom %, low wet etching rate, and excellent step coverage of ∼95%. At 300 °C and lower temperatures, the wet etching rate was high, especially at the lower sidewall of the trench pattern. We introduced the three-step PEALD process to improve the film quality and the step coverage on the lower sidewall. The sequence of the three-step PEALD process consists of the CSN-2 feeding step, the NH3/N2 plasma step, and the N2 plasma step. The H radicals in the NH3/N2 plasma efficiently remove the ligands from the precursor, and the N2 plasma after the NH3 plasma removes the surface hydrogen atoms to activate the adsorption of the precursor. The films deposited at 300 °C using the novel precursor and the three-step PEALD process showed a significantly improved step coverage of ∼95% and an excellent wet etching resistance at the lower sidewall, which is only twice as high as that of the blanket film prepared by low-pressure chemical vapor deposition.
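
    Using the growth per cycle reported in the abstract, converting a target thickness into a cycle count is simple bookkeeping; the sketch below assumes a 10 nm target and placeholder step durations, neither of which comes from the paper.

```python
# Simple ALD bookkeeping based on the growth per cycle reported in the abstract.
# The 10 nm target thickness and the step durations are assumed for illustration.
import math

growth_per_cycle_A = 0.43           # Å/cycle (from the abstract)
target_thickness_nm = 10.0          # assumed target

cycles = math.ceil(target_thickness_nm * 10.0 / growth_per_cycle_A)
print(f"{cycles} cycles for ~{target_thickness_nm} nm")          # ≈ 233 cycles

# Three-step PEALD cycle described in the abstract; durations are placeholders.
cycle_sequence = [
    ("CSN-2 precursor feed", 2.0),   # s (assumed)
    ("NH3/N2 plasma", 5.0),          # H radicals strip the precursor ligands
    ("N2 plasma", 3.0),              # removes surface hydrogen, re-activates adsorption
]
total_time_min = cycles * sum(t for _, t in cycle_sequence) / 60.0
print(f"feed/plasma time: {total_time_min:.0f} min (purge steps not counted)")
```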

  3. Developing Poultry Facility Type Information from USDA Agricultural Census Data for Use in Epidemiological and Economic Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melius, C

    2007-12-05

    The epidemiological and economic modeling of poultry diseases requires knowing the size, location, and operational type of each poultry type operation within the US. At the present time, the only national database of poultry operations that is available to the general public is the USDA's 2002 Agricultural Census data, published by the National Agricultural Statistics Service, herein referred to as the 'NASS data'. The NASS data provides census data at the county level on poultry operations for various operation types (i.e., layers, broilers, turkeys, ducks, geese). However, the number of farms and sizes of farms for the various types are not independent since some facilities have more than one type of operation. Furthermore, some data on the number of birds represents the number sold, which does not represent the number of birds present at any given time. In addition, any data tabulated by NASS that could identify numbers of birds or other data reported by an individual respondent is suppressed by NASS and coded with a 'D'. To be useful for epidemiological and economic modeling, the NASS data must be converted into a unique set of facility types (farms having similar operational characteristics). The unique set must not double count facilities or birds. At the same time, it must account for all the birds, including those for which the data has been suppressed. Therefore, several data processing steps are required to work back from the published NASS data to obtain a consistent database for individual poultry operations. This technical report documents data processing steps that were used to convert the NASS data into a national poultry facility database with twenty-six facility types (7 egg-laying, 6 broiler, 1 backyard, 3 turkey, and 9 others, representing ducks, geese, ostriches, emus, pigeons, pheasants, quail, game fowl breeders and 'other'). The process involves two major steps. The first step defines the rules used to estimate the data that is suppressed within the NASS database. The first step is similar to the first step used to estimate suppressed data for livestock [Melius et al (2006)]. The second step converts the NASS poultry types into the operational facility types used by the epidemiological and economic model. We also define two additional facility types for high and low risk poultry backyards, and an additional two facility types for live bird markets and swap meets. The distribution of these additional facility types among counties is based on US population census data. The algorithm defining the number of premises and the corresponding distribution among counties and the resulting premises density plots for the continental US are provided.
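
    A toy illustration of the first processing step, estimating county values suppressed with a 'D' code so that counties still sum to a known state total. The even-split allocation used here is a simplification for illustration only, not the rule documented in the report, and the numbers are invented.

```python
# Toy illustration: estimate suppressed ('D') county counts so that the county
# values sum to a published state total. The even-split rule and the numbers
# are invented; the report documents its own allocation rules.
import pandas as pd

counties = pd.DataFrame({
    "county": ["A", "B", "C", "D_county", "E"],
    "layers_inventory": [120000, 45000, "D", 78000, "D"],  # 'D' = suppressed
})
state_total = 300000   # published state-level total (not suppressed)

reported = pd.to_numeric(counties["layers_inventory"], errors="coerce")
residual = state_total - reported.sum()        # birds hidden behind the 'D' codes
suppressed = reported.isna()

estimate = reported.copy()
estimate[suppressed] = residual / suppressed.sum()   # spread the residual evenly
counties["layers_estimated"] = estimate
print(counties)
```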

  4. Investigation of International Space Station Major Constituent Analyzer Anomalous ORU 02 Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Burchfield, David E.; Trubey, Richard; Denson, Steve; Tissandier, Amber; Gentry, Greg; Granahan, John; Matty, Chris

    2011-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer-based instrument designed to provide critical monitoring of six major atmospheric constituents: nitrogen, oxygen, hydrogen, carbon dioxide, methane, and water vapor on-board the International Space Station. It is an integral part of the Environmental Control and Life Support System (ECLSS). The MCA system comprises seven orbit-replaceable units (ORUs) that provide flexibility in maintaining the MCA. Of these, ORU 02, the analyzer assembly, requires replacement every 1 to 2 years due to the consumption of limited life components including the ion pump and ion source filaments. Typically, ORU 02s that reach end of life are swapped out of the MCA on orbit and replaced with the on-orbit spare. The replaced ORU 02 is then returned to the OEM for refurbishment and is then returned to service. Recently, 2 refurbished ORU 02s, serial numbers F0001 and F0003, failed on orbit shortly after being installed into the MCA. Both ORU 02s have been returned to ground for TT&E, and a failure investigation is underway. The failure signatures have been reproduced on the ground and an initial investigation has determined that both ORU 02 failures involve either the ion source or the ion source control electronics. This paper discusses the results of the failure investigation, the steps required to refurbish the ORU 02s, and the risk mitigation steps that are being incorporated into the refurbishment process to preclude the recurrence of these failures in the future.

  5. Capacity building for health inequality monitoring in Indonesia: enhancing the equity orientation of country health information system.

    PubMed

    Hosseinpoor, Ahmad Reza; Nambiar, Devaki; Tawilah, Jihane; Schlotheuber, Anne; Briot, Benedicte; Bateman, Massee; Davey, Tamzyn; Kusumawardani, Nunik; Myint, Theingi; Nuryetty, Mariet Tetty; Prasetyo, Sabarinah; Suparmi; Floranita, Rustini

    Inequalities in health represent a major problem in many countries, including Indonesia. Addressing health inequality is a central component of the Sustainable Development Goals and a priority of the World Health Organization (WHO). WHO provides technical support for health inequality monitoring among its member states. Following a capacity-building workshop in the WHO South-East Asia Region in 2014, Indonesia expressed interest in incorporating health-inequality monitoring into its national health information system. This article details the capacity-building process for national health inequality monitoring in Indonesia, discusses successes and challenges, and how this process may be adapted and implemented in other countries/settings. We outline key capacity-building activities undertaken between April 2016 and December 2017 in Indonesia and present the four key outcomes of this process. The capacity-building process entailed a series of workshops, meetings, activities, and processes undertaken between April 2016 and December 2017. At each stage, a range of stakeholders with access to the relevant data and capacity for data analysis, interpretation and reporting was engaged with, under the stewardship of state agencies. Key steps to strengthening health inequality monitoring included capacity building in (1) identification of the health topics/areas of interest, (2) mapping data sources and identifying gaps, (3) conducting equity analyses using raw datasets, and (4) interpreting and reporting inequality results. As a result, Indonesia developed its first national report on the state of health inequality. A number of peer-reviewed manuscripts on various aspects of health inequality in Indonesia have also been developed. The capacity-building process undertaken in Indonesia is designed to be adaptable to other contexts. Capacity building for health inequality monitoring among countries is a critical step for strengthening equity-oriented national health information systems and eventually tackling health inequities.

  6. Design and implementation of a virtual world training simulation of ICU first hour handover processes.

    PubMed

    Brown, Ross; Rasmussen, Rune; Baldwin, Ian; Wyeth, Peta

    2012-08-01

    Nursing training for an Intensive Care Unit (ICU) is a resource intensive process. High demands are made on staff, students and physical resources. Interactive, 3D computer simulations, known as virtual worlds, are increasingly being used to supplement training regimes in the health sciences; especially in areas such as complex hospital ward processes. Such worlds have been found to be very useful in maximising the utilisation of training resources. Our aim is to design and develop a novel virtual world application for teaching and training Intensive Care nurses in the approach and method for shift handover, to provide an independent, but rigorous approach to teaching these important skills. In this paper we present a virtual world simulator for students to practice key steps in handing over the 24/7 care requirements of intensive care patients during the commencing first hour of a shift. We describe the modelling process to provide a convincing interactive simulation of the handover steps involved. The virtual world provides a practice tool for students to test their analytical skills with scenarios previously provided by simple physical simulations, and live on the job training. Additional educational benefits include facilitation of remote learning, high flexibility in study hours and the automatic recording of a reviewable log from the session. To the best of our knowledge, we believe this is a novel and original application of virtual worlds to an ICU handover process. The major outcome of the work was a virtual world environment for training nurses in the shift handover process, designed and developed for use by postgraduate nurses in training. Copyright © 2012 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.

  7. 24 CFR 55.20 - Decision making process.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Decision making process. 55.20... Decision making process. The decision making process for compliance with this part contains eight steps... decision making process are: (a) Step 1. Determine whether the proposed action is located in a 100-year...

  8. Enhanced performance of wearable piezoelectric nanogenerator fabricated by two-step hydrothermal process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qiu, Yu; Lei, Jixue; Yin, Bing

    2014-03-17

    A simple two-step hydrothermal process was proposed for enhancing the performance of a nanogenerator on a flexible and wearable terylene-fabric substrate. With this method, a significant enhancement in the output voltage of the nanogenerator, from ∼10 mV to 7 V, was achieved compared with a device made by the conventional one-step process. Another advantage of the devices synthesized by the two-step hydrothermal process is that their output voltages are sensitive only to strain rather than strain rate. The devices, with their high output voltage, can power common electronic devices and will have important applications in flexible electronics and wearable devices.

  9. The Pharmaceutical Capping Process-Correlation between Residual Seal Force, Torque Moment, and Flip-off Removal Force.

    PubMed

    Mathaes, Roman; Mahler, Hanns-Christian; Vorgrimler, Lothar; Steinberg, Henrik; Dreher, Sascha; Roggo, Yves; Nieto, Alejandra; Brown, Helen; Roehl, Holger; Adler, Michael; Luemkemann, Joerg; Huwyler, Joerg; Lam, Philippe; Stauch, Oliver; Mohl, Silke; Streubel, Alexander

    2016-01-01

    The majority of parenteral drug products are manufactured in glass vials with an elastomeric rubber stopper and a crimp cap. The vial sealing process is a critical process step during fill-and-finish operations, as it defines the seal quality of the final product. Different critical capping process parameters can affect rubber stopper defects, rubber stopper compression, container closure integrity, and also crimp cap quality. A sufficiently high force to remove the flip-off button prior to usage is required to ensure that the flip-off button protects the drug product unit during storage and transportation, until opening and use. Therefore, the final product is 100% visually inspected for loose or defective crimp caps, which is subjective as well as time- and labor-intensive. In this study, we sealed several container closure system configurations with different capping equipment settings (with corresponding residual seal force values) to investigate the torque moment required to turn the crimp cap. A correlation between torque moment and residual seal force has been established. The torque moment was found to be influenced by several parameters, including diameter of the vial head, type of rubber stopper (serum or lyophilized) and type of crimp cap (West® or Datwyler®). In addition, we measured the force required to remove the flip-off button of a sealed container closure system. The capping process had no influence on measured forces; however, it was possible to detect partially crimped vials. In conclusion, a controlled capping process with a defined target residual seal force range leads to a tight crimp cap on a sealed container closure system and can ensure product quality. The majority of parenteral drug products are manufactured in glass vials with an elastomeric rubber stopper and a crimp cap. The vial sealing process is a critical process step during fill-and-finish operations, as it defines the seal quality of the final product. An adequate force to remove the flip-off button prior to usage is required to ensure product quality during storage and transportation until use. In addition, the complete crimp cap needs to be fixed in a tight position on the vial. In this study, we investigated the torque moment required to turn the crimp cap and the force required to remove the flip-off button of a container closure system sealed with different capping equipment process parameters (having different residual seal force values). © PDA, Inc. 2016.

  10. The developmental processes for NANDA International Nursing Diagnoses.

    PubMed

    Scroggins, Leann M

    2008-01-01

    This study aims to provide a step-by-step procedural guideline for the development of a nursing diagnosis that meets the necessary criteria for inclusion in the NANDA International and NNN classification systems. The guideline is based on the processes developed by the Diagnosis Development Committee of NANDA International and includes the necessary processes for development of Actual, Wellness, Health Promotion, and Risk nursing diagnoses. Definitions of Actual, Wellness, Health Promotion, and Risk nursing diagnoses along with inclusion criteria and taxonomy rules have been incorporated into the guideline to streamline the development and review processes for submitted diagnoses. A step-by-step procedural guideline will assist the submitter to move efficiently and effectively through the submission process, resulting in increased submissions and enhancement of the NANDA International and NNN classification systems.

  11. Reducing noise component on medical images

    NASA Astrophysics Data System (ADS)

    Semenishchev, Evgeny; Voronin, Viacheslav; Dub, Vladimir; Balabaeva, Oksana

    2018-04-01

    Visualization and analysis of medical data is an active research area. Medical images are used in microbiology, genetics, roentgenology, oncology, surgery, ophthalmology, and other fields, and initial data processing is a major step towards obtaining a good diagnostic result. The paper considers an approach that allows image filtering while preserving object borders. The proposed algorithm is based on sequential data processing. At the first stage, local areas are determined by threshold processing together with the classical ICI (intersection of confidence intervals) algorithm. The second stage uses a method based on two criteria, namely the L2 norm and the first-order squared difference. To preserve object boundaries, the transition boundary and its local neighborhood are processed with a fixed-coefficient filtering algorithm. Reconstructed images from CT, X-ray, and microbiological studies are shown as examples, and the test images demonstrate the effectiveness of the proposed algorithm, indicating its applicability to many medical imaging applications.
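
    As a rough illustration of the two-stage idea (first detect locally homogeneous areas and object borders, then smooth with a fixed-coefficient filter while treating edge neighborhoods separately), the following sketch uses a simple gradient threshold as a stand-in for the threshold/ICI detection stage and a small averaging window away from edges. It is a minimal approximation under those assumptions, not the authors' algorithm; the threshold value and window size are arbitrary.

        import numpy as np

        def edge_preserving_smooth(img, grad_thresh=20.0, win=1):
            """Toy two-stage filter: average only over pixels away from strong gradients."""
            img = img.astype(float)
            # Stage 1: crude edge map from finite-difference gradients (a stand-in
            # for threshold/ICI-based detection of locally homogeneous areas).
            gy, gx = np.gradient(img)
            edges = np.hypot(gx, gy) > grad_thresh

            # Stage 2: fixed-coefficient averaging, skipped near edges so that
            # object boundaries are preserved.
            out = img.copy()
            rows, cols = img.shape
            for i in range(win, rows - win):
                for j in range(win, cols - win):
                    if not edges[i, j]:
                        out[i, j] = img[i - win:i + win + 1, j - win:j + win + 1].mean()
            return out

        # Example: denoise a synthetic image containing a sharp step edge.
        rng = np.random.default_rng(0)
        clean = np.zeros((64, 64)); clean[:, 32:] = 100.0
        noisy = clean + rng.normal(0.0, 5.0, clean.shape)
        smoothed = edge_preserving_smooth(noisy)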

  12. Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.

    PubMed

    Ivory, Catherine H

    2016-07-01

    The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.

  13. Microprocessor activity controls differential miRNA biogenesis In Vivo.

    PubMed

    Conrad, Thomas; Marsico, Annalisa; Gehre, Maja; Orom, Ulf Andersson

    2014-10-23

    In miRNA biogenesis, pri-miRNA transcripts are converted into pre-miRNA hairpins. The in vivo properties of this process remain enigmatic. Here, we determine in vivo transcriptome-wide pri-miRNA processing using next-generation sequencing of chromatin-associated pri-miRNAs. We identify a distinctive Microprocessor signature in the transcriptome profile from which efficiency of the endogenous processing event can be accurately quantified. This analysis reveals differential susceptibility to Microprocessor cleavage as a key regulatory step in miRNA biogenesis. Processing is highly variable among pri-miRNAs and a better predictor of miRNA abundance than primary transcription itself. Processing is also largely stable across three cell lines, suggesting a major contribution of sequence determinants. On the basis of differential processing efficiencies, we define functionality for short sequence features adjacent to the pre-miRNA hairpin. In conclusion, we identify Microprocessor as the main hub for diversified miRNA output and suggest a role for uncoupling miRNA biogenesis from host gene expression. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
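
    One simple way to picture how a processing efficiency could be read out of chromatin-associated RNA-seq coverage is sketched below: efficient Microprocessor cleavage should deplete reads over the excised pre-miRNA hairpin relative to the flanking pri-miRNA sequence. The depletion-ratio metric and the coverage values are illustrative assumptions for this sketch, not the quantification published in the study.

        import numpy as np

        def processing_efficiency(coverage, hairpin_start, hairpin_end):
            """Toy score for one pri-miRNA: 1 - (hairpin depth / flanking depth).

            coverage: per-nucleotide read depth along the chromatin-associated
            transcript. A deep local depletion over the pre-miRNA hairpin is
            taken here as a sign of efficient cleavage (illustrative metric only).
            """
            hairpin = np.mean(coverage[hairpin_start:hairpin_end])
            flank = np.mean(np.concatenate([coverage[:hairpin_start], coverage[hairpin_end:]]))
            return 1.0 - hairpin / flank if flank > 0 else float("nan")

        # Hypothetical coverage profile: uniform flanks, depleted hairpin region.
        cov = np.concatenate([np.full(100, 50.0), np.full(60, 10.0), np.full(100, 48.0)])
        print(f"estimated processing efficiency: {processing_efficiency(cov, 100, 160):.2f}")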

  14. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons end users avoid or reject health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of a thorough understanding of the healthcare process. Understanding the healthcare workflow is therefore the essential first step in designing optimal technologies that enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high-risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviews with two registered nurses who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted, and based on the data collected from observations, an as-is process model for medication management will be developed.

  15. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in a collaboration; consequently, a lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach for specifying and verifying a set of interoperability requirements to be satisfied by each partner in a collaborative process prior to process implementation. Enabling the verification of these interoperability requirements first requires a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. A verification technique must then be introduced, and model checking is the preferred option herein. This paper focuses on the application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into the UPPAAL modelling language, a 'Network of Timed Automata'. Second, the interoperability requirements must be formalised as properties in the dedicated UPPAAL language, i.e. the temporal logic TCTL.
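
    As a generic illustration of the kind of timed property that such a model-checking step can verify (a textbook-style bounded-response requirement, not one of the paper's interoperability requirements), one might demand that every information request issued by a partner is answered within a delay bound Δ, written in TCTL as:

        \mathrm{AG}\bigl(\mathit{requestSent} \;\Rightarrow\; \mathrm{AF}_{\le \Delta}\,\mathit{responseReceived}\bigr)

    In UPPAAL, bounded leads-to requirements of this shape are typically encoded with the leads-to operator (p --> q) together with an auxiliary clock that is reset when the request is sent and kept bounded while the response is pending.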

  16. Le management des projets scientifiques (The management of scientific projects)

    NASA Astrophysics Data System (ADS)

    Perrier, Françoise

    2000-12-01

    We describe in this paper a new approach to the management of scientific projects. This approach is the result of a long reflection carried out within the MQDP (Methodology and Quality in Project Development) group of INSU-CNRS, and continued with Guy Serra. Our reflection began with the study of the so-called `North-American Paradigm', which was initially considered the only relevant management model. Through our active participation in several astrophysical projects, we realized that this model could not be applied to our laboratories without major modifications. Therefore, step by step, we constructed our own methodology, making the fullest use of the human resources existing in our research field, together with their habits and skills. We have also participated in various working groups in industrial and scientific organisations for the benefit of CNRS. The management model presented here is based on a systemic and complexity-oriented approach, which lets us describe the multiple aspects of a scientific project while especially taking into account the human dimension. The project system model includes three major interconnected systems, immersed within an environment that both influences and is influenced by them: the `System to be Realized', which defines the scientific and technical tasks leading to the scientific goals; the `Realizing System', which describes procedures, processes, and organization; and the `Actors' System', which implements and drives all the processes. Each one exists only through a series of successive models, elaborated at predefined dates of the project called `key-points'. These systems evolve over time and under often-unpredictable circumstances, and the models have to take this into account. At each key-point, every model is compared to reality, and the difference between the predicted and realized tasks is evaluated in order to define the data for the next model. This model can be applied to any kind of project.

  17. Development of a resource protection and waste strategy for water use by the agricultural sector.

    PubMed

    Ligthelm, M E; Ranwedzi, R; Morokane, M; Senne, M

    2007-01-01

    The South African Department of Water Affairs and Forestry (DWAF) has started developing a strategy to regulate activities and water uses by the agricultural sector that could affect water resource quality. The aim would not be to over-regulate the sector, but to protect the water resource where necessary. Most of these activities constitute diffuse sources of potential pollution. The strategic process will start with investigative discussions with major stakeholders and with determining the strategic context and current situation; the latter will consist of a detailed literature and stakeholder survey and an evaluation of existing agricultural activities. The next steps, determining a vision and setting strategic objectives, will be carried out with the active participation of the major players, and an action plan will be developed to achieve the set objectives. Important components of the strategy will be to: classify activities according to their risk to the water resource, taking into account the sensitivity of the resource; set regulatory measures in accordance with the risk posed by the activity (measures could include the promulgation of regulations, general authorisations and/or the issuing of licenses); harmonise and link the process with existing relevant processes and guidelines within DWAF and other government departments; review existing guidelines; sign agreements with relevant government departments and the agricultural sector; and provide training, build capacity, and raise awareness during and after the process.

  18. Method and apparatus for automated assembly

    DOEpatents

    Jones, Rondall E.; Wilson, Randall H.; Calton, Terri L.

    1999-01-01

    A process and apparatus generates a sequence of steps for assembly or disassembly of a mechanical system. Each step in the sequence is geometrically feasible, i.e., the part motions required are physically possible. Each step in the sequence is also constraint feasible, i.e., the step satisfies user-definable constraints. Constraints allow process and other such limitations, not usually represented in models of the completed mechanical system, to affect the sequence.
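
    The core of the described sequence generation, in which each candidate step must pass both a geometric-feasibility check and a user-defined constraint check before it is accepted, can be sketched as a simple search. The parts, blocking relations, and process constraint below are hypothetical placeholders; the patented apparatus performs real geometric reasoning on models of the mechanical system, which is not reproduced here.

        from itertools import permutations

        # Hypothetical fastener/cover/pump example. BLOCKERS maps each part to the
        # parts that must already be removed before it can move without collision
        # (a stand-in for real geometric reasoning on the assembly model).
        BLOCKERS = {
            "bolt_a": set(),
            "bolt_b": set(),
            "cover": {"bolt_a", "bolt_b"},
            "pump": {"cover"},
        }

        def geometrically_feasible(part, removed):
            return BLOCKERS[part] <= removed

        def constraint_feasible(part, removed):
            # User-definable process constraint (hypothetical): the tooling can only
            # reach bolt_b after bolt_a has been removed.
            if part == "bolt_b":
                return "bolt_a" in removed
            return True

        def feasible_sequences(parts):
            """Yield every ordering in which each step is geometrically and constraint feasible."""
            for order in permutations(parts):
                removed = set()
                ok = True
                for part in order:
                    if not (geometrically_feasible(part, removed) and constraint_feasible(part, removed)):
                        ok = False
                        break
                    removed.add(part)
                if ok:
                    yield order

        for seq in feasible_sequences(list(BLOCKERS)):
            print(" -> ".join(seq))   # bolt_a -> bolt_b -> cover -> pump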

  19. The impact of a public health department's expansion from a one-step to a two-step refugee screening process on the detection and initiation of treatment of latent tuberculosis.

    PubMed

    Einterz, E M; Younge, O; Hadi, C

    2018-06-01

    The aim was to determine, subsequent to the expansion of a county health department's refugee screening process from a one-step to a two-step process, the change in early loss to follow-up and in time to initiation of treatment of new refugees with latent tuberculosis infection (LTBI). The study was quasi-experimental and quantitative, based on a review of patient medical records. Among 384 refugees who met the case definition of LTBI without prior tuberculosis (TB) classification, the proportion of cases lost to early follow-up fell from 12.5% to 0% after expansion to a two-step screening process, and the average interval between in-country arrival and initiation of LTBI treatment was shortened by 41.4%. The addition of a second step to the refugee screening process was thus associated with significant improvements in the county's success in tracking and treating cases of LTBI in refugees. Given the disproportionate contribution of foreign-born cases of LTBI to the incidence of TB disease in low-incidence countries, these improvements could have a substantial impact on overall TB control, and the process described could serve as a model for other local health department refugee screening programs. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  20. A comparison between atmospheric/humidity and vacuum cyanoacrylate fuming of latent fingermarks.

    PubMed

    Farrugia, Kevin J; Fraser, Joanna; Friel, Lauren; Adams, Duncan; Attard-Montalto, Nicola; Deacon, Paul

    2015-12-01

    A number of pseudo-operational trials were set up to compare the atmospheric/humidity and vacuum cyanoacrylate fuming processes on plastic carrier bags. The fuming processes were compared using two-step cyanoacrylate fuming with basic yellow 40 (BY40) staining and a one-step fluorescent cyanoacrylate fuming process, Lumicyano 4%. Preliminary work using planted fingermarks and split depletions was performed to identify the optimum vacuum fuming conditions. The first pseudo-operational trial compared the different fuming conditions (atmospheric/humidity vs. vacuum) for the two-step process, where 50% more marks were detected with the atmospheric/humidity process. None of the marks developed by the vacuum process could be observed visually; however, a significant number of marks were detected by fluorescence after BY40 staining. The second trial repeated the work of trial 1 using the one-step cyanoacrylate process, Lumicyano at a concentration of 4%. Trial 2 provided results comparable to trial 1, and all the items were then re-treated with Lumicyano 4% under atmospheric/humidity conditions before dyeing with BY40, giving the sequences of process A (Lumicyano 4% atmospheric-Lumicyano 4% atmospheric-BY40) and process B (Lumicyano 4% vacuum-Lumicyano 4% atmospheric-BY40). The number of marks (visual and fluorescent) was counted after each treatment, with a substantial increase in the number of detected marks in the second and third treatments of the process. The increased detection rate after the double Lumicyano process was unexpected and may have important implications. Trial 3 was performed to investigate whether the amount of cyanoacrylate and/or the fuming time had an impact on the results observed in trial 2, whereas trial 4 assessed whether the double process using conventional cyanoacrylate, rather than Lumicyano 4%, provided an increased detection rate. Trials 3 and 4 confirmed that doubling the amount of Lumicyano 4% cyanoacrylate and the fuming time in a single treatment produced a lower detection rate than the double process with Lumicyano 4%, and that the double process with conventional cyanoacrylate did not provide any benefit. Scanning electron microscopy was also performed to investigate the morphology of the cyanoacrylate polymer under different conditions. The atmospheric/humidity process appears to be superior to the vacuum process for both the two-step and one-step cyanoacrylate fuming, although the two-step process performed better than the one-step process under vacuum conditions. Nonetheless, the use of vacuum cyanoacrylate fuming may have certain operational advantages, and its use does not adversely affect subsequent cyanoacrylate fuming under atmospheric/humidity conditions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
