Sample records for processing steps required

  1. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    NASA Astrophysics Data System (ADS)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

    Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high aspect ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature scale model that incorporates surface chemical reactions. When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after final profile tuning steps. A natural optimization required from these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films.

  2. Performance monitoring and response conflict resolution associated with choice stepping reaction tasks.

    PubMed

    Watanabe, Tatsunori; Tsutou, Kotaro; Saito, Kotaro; Ishida, Kazuto; Tanabe, Shigeo; Nojima, Ippei

    2016-11-01

    Choice reaction requires response conflict resolution, and the resolution processes that occur during a choice stepping reaction task undertaken in a standing position, which requires maintenance of balance, may be different to those processes occurring during a choice reaction task performed in a seated position. The study purpose was to investigate the resolution processes during a choice stepping reaction task at the cortical level using electroencephalography and compare the results with a control task involving ankle dorsiflexion responses. Twelve young adults either stepped forward or dorsiflexed the ankle in response to a visual imperative stimulus presented on a computer screen. We used the Simon task and examined the error-related negativity (ERN) that follows an incorrect response and the correct-response negativity (CRN) that follows a correct response. Error was defined as an incorrect initial weight transfer for the stepping task and as an incorrect initial tibialis anterior activation for the control task. Results revealed that ERN and CRN amplitudes were similar in size for the stepping task, whereas the amplitude of ERN was larger than that of CRN for the control task. The ERN amplitude was also larger in the stepping task than the control task. These observations suggest that a choice stepping reaction task involves a strategy emphasizing post-response conflict and general performance monitoring of actual and required responses and also requires greater cognitive load than a choice dorsiflexion reaction. The response conflict resolution processes appear to be different for stepping tasks and reaction tasks performed in a seated position.

  3. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  4. Thermochemical water decomposition processes

    NASA Technical Reports Server (NTRS)

    Chao, R. E.

    1974-01-01

    Thermochemical processes which lead to the production of hydrogen and oxygen from water without the consumption of any other material have a number of advantages when compared to other processes such as water electrolysis. It is possible to operate a sequence of chemical steps with net work requirements equal to zero at temperatures well below the temperature required for water dissociation in a single step. Various types of procedures are discussed, giving attention to halide processes, reverse Deacon processes, iron oxide and carbon oxide processes, and metal and alkali metal processes. Economic questions are also considered.

  5. Alternative Procedure of Heat Integration Technique Selection between Two Unit Processes to Improve Energy Saving

    NASA Astrophysics Data System (ADS)

    Santi, S. S.; Renanto; Altway, A.

    2018-01-01

    The energy use system in a production process, in this case heat exchanger networks (HENs), is one element that plays a role in the smoothness and sustainability of the industry itself. Optimizing heat exchanger networks (HENs) built from process streams can have a major effect on the economic value of an industry as a whole, so solving design problems with heat integration becomes an important requirement. In a plant, heat integration can be carried out internally or in combination between process units. However, determining a suitable heat integration technique conventionally involves long calculations and requires considerable time. In this paper, we propose an alternative procedure for determining the heat integration technique by investigating six hypothetical units using a Pinch Analysis approach, with the energy target and the total annual cost target as objective functions. The six hypothetical units consist of units A, B, C, D, E, and F, where each unit has a different location of its process streams relative to the pinch temperature. The result is a potential heat integration (ΔH') formula that trims the conventional procedure from seven steps to just three; the preferred heat integration technique is then determined by calculating the heat integration potential (ΔH') between the hypothetical process units. The calculations were implemented in MATLAB.
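
    The abstract gives no explicit form for the ΔH' potential, so the sketch below only illustrates the standard problem-table algorithm that Pinch Analysis uses to set energy targets. The stream data and ΔTmin are invented placeholders, and Python is used here rather than the authors' MATLAB.

    ```python
    # Illustrative problem-table algorithm for pinch-analysis energy targets.
    # Stream data and DT_MIN are assumed for illustration, not from the paper.

    DT_MIN = 10.0  # minimum approach temperature [K], assumed

    # (supply T [degC], target T [degC], heat-capacity flowrate CP [kW/K])
    hot_streams  = [(180.0, 40.0, 2.0), (150.0, 60.0, 3.0)]
    cold_streams = [(30.0, 160.0, 2.5), (50.0, 140.0, 1.5)]

    def energy_targets(hot, cold, dt_min):
        # Shift hot streams down and cold streams up by dt_min/2 so that zero
        # temperature difference on the shifted scale means dt_min in reality.
        shifted  = [(ts - dt_min / 2, tt - dt_min / 2, cp) for ts, tt, cp in hot]
        shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp) for ts, tt, cp in cold]
        temps = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)
        # Net heat surplus (+) or deficit (-) in each shifted temperature interval.
        surpluses = []
        for hi, lo in zip(temps, temps[1:]):
            net_cp = 0.0
            for ts, tt, cp in shifted:
                if max(ts, tt) >= hi and min(ts, tt) <= lo:
                    net_cp += cp if ts > tt else -cp  # hot releases, cold absorbs
            surpluses.append(net_cp * (hi - lo))
        # Cascade heat downward; the most negative running total sets QH,min.
        running, cascade = 0.0, []
        for s in surpluses:
            running += s
            cascade.append(running)
        qh_min = max(0.0, -min(cascade))
        qc_min = qh_min + sum(surpluses)
        return qh_min, qc_min

    qh, qc = energy_targets(hot_streams, cold_streams, DT_MIN)
    print(f"minimum hot utility:  {qh:.1f} kW")
    print(f"minimum cold utility: {qc:.1f} kW")
    ```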

  6. Step-Growth Polymerization.

    ERIC Educational Resources Information Center

    Stille, J. K.

    1981-01-01

    Following a comparison of chain-growth and step-growth polymerization, focuses on the latter process by describing requirements for high molecular weight, step-growth polymerization kinetics, synthesis and molecular weight distribution of some linear step-growth polymers, and three-dimensional network step-growth polymers. (JN)

  7. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
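
    The nine steps lend themselves to a straightforward sequential sketch. The Python outline below mirrors the ordering in the presentation; every name (rti, create_federation, and so on) is a hypothetical placeholder, not the actual HLA/RTI API used by DSES.

    ```python
    # Sequential sketch of the nine-step DSES federate initialization.
    # All call names are hypothetical placeholders for illustration only.

    def initialize_federate(rti, expected_federates, initial_state):
        rti.create_federation("DSES")                            # Step 1
        rti.publish_and_subscribe()                              # Step 2
        instances = rti.create_object_instances(initial_state)   # Step 3
        rti.wait_until_joined(expected_federates)                # Step 4
        rti.achieve_sync_point("initialize")                     # Step 5
        for obj in instances:                                    # Step 6
            obj.update(initial_state[obj.name])
        rti.wait_for_reflections(instances)                      # Step 7
        rti.set_up_time_management()                             # Step 8
        rti.achieve_sync_point("startup")                        # Step 9
    ```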

  8. STEP Experiment Requirements

    NASA Technical Reports Server (NTRS)

    Brumfield, M. L. (Compiler)

    1984-01-01

    A plan to develop a space technology experiments platform (STEP) was examined. NASA Langley Research Center held a STEP Experiment Requirements Workshop on June 29 and 30 and July 1, 1983, at which experiment proposers were invited to present more detailed information on their experiment concept and requirements. A feasibility and preliminary definition study was conducted and the preliminary definition of STEP capabilities and experiment concepts and expected requirements for support services are presented. The preliminary definition of STEP capabilities based on detailed review of potential experiment requirements is investigated. Topics discussed include: Shuttle on-orbit dynamics; effects of the space environment on damping materials; erectable beam experiment; technology for development of very large solar array deployers; thermal energy management process experiment; photovoltaic concentrator pointing dynamics and plasma interactions; vibration isolation technology; flight tests of a synthetic aperture radar antenna with use of STEP.

  9. Implementation of Competency-Based Pharmacy Education (CBPE)

    PubMed Central

    Koster, Andries; Schalekamp, Tom; Meijerman, Irma

    2017-01-01

    Implementation of competency-based pharmacy education (CBPE) is a time-consuming, complicated process, which requires agreement on the tasks of a pharmacist, commitment, institutional stability, and a goal-directed developmental perspective of all stakeholders involved. In this article the main steps in the development of a fully-developed competency-based pharmacy curriculum (bachelor, master) are described and tips are given for successful implementation. After the decision to move to CBPE is made and a competency framework is adopted (step 1), intended learning outcomes are defined (step 2), followed by analyzing the required developmental trajectory (step 3) and the selection of appropriate assessment methods (step 4). Designing the teaching-learning environment involves the selection of learning activities, student experiences, and instructional methods (step 5). Finally, an iterative process of evaluation and adjustment of individual courses, and the curriculum as a whole, is entered (step 6). Successful implementation of CBPE requires a system of effective quality management and continuous professional development as a teacher. In this article, suggestions for the organization of CBPE and references to more detailed literature are given, in the hope of facilitating the implementation of CBPE. PMID:28970422

  10. Systems Maintenance Automated Repair Tasks (SMART)

    NASA Technical Reports Server (NTRS)

    Schuh, Joseph; Mitchell, Brent; Locklear, Louis; Belson, Martin A.; Al-Shihabi, Mary Jo Y.; King, Nadean; Norena, Elkin; Hardin, Derek

    2010-01-01

    SMART is a uniform automated discrepancy analysis and repair-authoring platform that improves technical accuracy and timely delivery of repair procedures for a given discrepancy (see figure a). SMART will minimize data errors, create uniform repair processes, and enhance the existing knowledge base of engineering repair processes. This innovation is the first tool developed that links the hardware specification requirements with the actual repair methods, sequences, and required equipment. SMART is flexibly designed to be usable by multiple engineering groups requiring decision analysis, and by any work authorization and disposition platform (see figure b). The organizational logic creates the link between specification requirements of the hardware, and specific procedures required to repair discrepancies. The first segment in the SMART process uses a decision analysis tree to define all the permutations between component/subcomponent/discrepancy/repair on the hardware. The second segment uses a repair matrix to define what the steps and sequences are for any repair defined in the decision tree. This segment also allows for the selection of specific steps from multivariable steps. SMART will also be able to interface with outside databases and to store information from them to be inserted into the repair-procedure document. Some of the steps will be identified as optional, and would only be used based on the location and the current configuration of the hardware. The output from this analysis would be sent to a work authoring system in the form of a predefined sequence of steps containing required actions, tools, parts, materials, certifications, and specific requirements controlling quality, functional requirements, and limitations.
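
    The two SMART segments amount to a lookup from discrepancy to repair, followed by a lookup from repair to ordered steps with optional entries filtered by configuration. A minimal sketch follows; every component, step, and tool name is an invented placeholder, not SMART data.

    ```python
    # Sketch of the two SMART segments: a decision tree keyed by
    # (component, subcomponent, discrepancy), then a repair matrix giving
    # the ordered steps for each repair. All entries are placeholders.

    decision_tree = {
        ("panel", "fastener", "corrosion"): "repair-A",
        ("panel", "fastener", "crack"):     "repair-B",
    }

    repair_matrix = {
        "repair-A": [
            {"step": "clean area",    "optional": False, "tools": ["brush"]},
            {"step": "apply sealant", "optional": True,  "tools": ["sealant gun"]},
            {"step": "reinstall",     "optional": False, "tools": ["torque wrench"]},
        ],
    }

    def author_repair(component, subcomponent, discrepancy, config):
        """Return the ordered steps for a discrepancy; optional steps are
        kept only when the hardware location/configuration calls for them."""
        repair = decision_tree[(component, subcomponent, discrepancy)]
        return [s for s in repair_matrix[repair]
                if not s["optional"] or config.get("needs_sealant", False)]

    print(author_repair("panel", "fastener", "corrosion", {"needs_sealant": True}))
    ```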

  11. Method and apparatus for automated assembly

    DOEpatents

    Jones, Rondall E.; Wilson, Randall H.; Calton, Terri L.

    1999-01-01

    A process and apparatus generates a sequence of steps for assembly or disassembly of a mechanical system. Each step in the sequence is geometrically feasible, i.e., the part motions required are physically possible. Each step in the sequence is also constraint feasible, i.e., the step satisfies user-definable constraints. Constraints allow process and other such limitations, not usually represented in models of the completed mechanical system, to affect the sequence.
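
    A minimal sketch of this idea: a candidate step enters the sequence only if it passes both a geometric-feasibility check and every user-definable constraint. The model below is deliberately tiny, with invented parts and a toy blocking relation; the patented method's actual geometric reasoning is far richer.

    ```python
    # Sketch: a step joins the plan only if it is geometrically feasible
    # (the part motion is physically possible) and constraint feasible
    # (it satisfies user-definable constraints). Example data is invented.

    from dataclasses import dataclass, field

    @dataclass
    class Step:
        part: str
        blocked_by: set = field(default_factory=set)  # parts obstructing this motion

    def geometrically_feasible(step, assembled):
        # Motion is possible only once all obstructing parts are removed.
        return not (step.blocked_by & assembled)

    def plan_disassembly(steps, constraints):
        assembled = {s.part for s in steps}
        sequence, progress = [], True
        while assembled and progress:
            progress = False
            for s in steps:
                if (s.part in assembled
                        and geometrically_feasible(s, assembled - {s.part})
                        and all(c(s, assembled) for c in constraints)):
                    sequence.append(s.part)
                    assembled.remove(s.part)
                    progress = True
        return sequence

    # Example: the cover blocks both other parts; a process constraint says
    # the board may not come out while the screw is still installed.
    steps = [Step("cover"),
             Step("board", blocked_by={"cover"}),
             Step("screw", blocked_by={"cover"})]
    no_board_before_screw = lambda s, st: not (s.part == "board" and "screw" in st)
    print(plan_disassembly(steps, [no_board_before_screw]))  # cover, screw, board
    ```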

  12. 20 CFR 404.1520 - Evaluation of disability in general.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-step sequential evaluation process we use to decide whether you are disabled, as defined in § 404.1505...-step sequential evaluation process. The sequential evaluation process is a series of five “steps” that... severe medically determinable physical or mental impairment that meets the duration requirement in § 404...

  13. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost-driver in downstream processing, with high attrition costs, especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study, transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step, while leaving the polishing steps unchanged to minimize required process adaptations compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in comparable product quality and step yield compared to the batch process. Productivity for the protein A step could be increased up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, the cost of goods was compared for the protein A step between the hybrid and batch processes, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Seven Steps to Responsible Software Selection. ERIC Digest.

    ERIC Educational Resources Information Center

    Komoski, P. Kenneth; Plotnick, Eric

    Microcomputers in schools contribute significantly to the learning process, and software selection is taken as seriously as the selection of textbooks. The seven-step process for responsible software selection is: (1) analyzing needs, including the differentiation between needs and objectives; (2) specification of requirements; (3) identifying…

  15. Functional Fault Modeling of a Cryogenic System for Real-Time Fault Detection and Isolation

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Brown, Barbara

    2010-01-01

    The purpose of this paper is to present the model development process used to create a Functional Fault Model (FFM) of a liquid hydrogen (LH2) system that will be used for real-time fault isolation in a Fault Detection, Isolation and Recovery (FDIR) system. The paper explains the steps in the model development process and the data products required at each step, including examples of how the steps were performed for the LH2 system. It also shows the relationship between the FDIR requirements and steps in the model development process. The paper concludes with a description of a demonstration of the LH2 model developed using the process and future steps for integrating the model in a live operational environment.

  16. State Requirements for Educational Facilities, 1999.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Office of Educational Facilities.

    This updated, two-volume document provides guidance for those involved in the educational facilities procurement process, and includes recent legislative changes affecting the state of Florida's building code. The first volume is organized by the sequence of steps required in the facilities procurement process and presents state requirements for…

  17. Mitigation Strategies To Protect Food Against Intentional Adulteration. Final rule.

    PubMed

    2016-05-27

    The Food and Drug Administration (FDA or we) is issuing this final rule to require domestic and foreign food facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to address hazards that may be introduced with the intention to cause wide scale public health harm. These food facilities are required to conduct a vulnerability assessment to identify significant vulnerabilities and actionable process steps and implement mitigation strategies to significantly minimize or prevent significant vulnerabilities identified at actionable process steps in a food operation. FDA is issuing these requirements as part of our implementation of the FDA Food Safety Modernization Act (FSMA).

  18. Reforming Cookbook Labs

    ERIC Educational Resources Information Center

    Peters, Erin

    2005-01-01

    Deconstructing cookbook labs to require the students to be more thoughtful could break down perceived teacher barriers to inquiry learning. Simple steps that remove or disrupt the direct transfer of step-by-step procedures in cookbook labs make students think more critically about their process. Through trials in the author's middle school…

  19. Investigation to biodiesel production by the two-step homogeneous base-catalyzed transesterification.

    PubMed

    Ye, Jianchu; Tu, Song; Sha, Yong

    2010-10-01

    For two-step transesterification biodiesel production from sunflower oil, based on the kinetics model of the homogeneous base-catalyzed transesterification and the liquid-liquid phase equilibrium of the transesterification product, the total methanol/oil mole ratio, the total reaction time, and the split ratios of methanol and reaction time between the two reactors are determined quantitatively. In consideration of the transesterification intermediate product, both the traditional distillation separation process and an improved separation process for the two-step reaction product are investigated in detail by means of rigorous process simulation. In comparison with the traditional distillation process, the improved separation process has a distinct advantage in energy duty and equipment requirements due to replacement of the costly methanol-biodiesel distillation column. Copyright 2010 Elsevier Ltd. All rights reserved.

  20. MRO DKF Post-Processing Tool

    NASA Technical Reports Server (NTRS)

    Ayap, Shanti; Fisher, Forest; Gladden, Roy; Khanampompan, Teerapat

    2008-01-01

    This software tool saves time and reduces risk by automating two labor-intensive and error-prone post-processing steps required for every DKF [DSN (Deep Space Network) Keyword File] that MRO (Mars Reconnaissance Orbiter) produces, and is being extended to post-process the corresponding TSOE (Text Sequence Of Events) as well. The need for this post-processing stems from limitations in the seq-gen modeling, which result in incorrect DKF generation that must then be cleaned up in post-processing.

  1. Hetero-diffusion of Au epitaxy on stepped Ag(110) surface: Study of the jump rate and diffusion coefficient

    NASA Astrophysics Data System (ADS)

    Benlattar, M.; El koraychy, E.; Kotri, A.; Mazroui, M.

    2017-12-01

    We have used molecular dynamics simulations combined with an interatomic potential derived from the embedded atom method to investigate the hetero-diffusion of an Au adatom near a stepped Ag(110) surface with steps one monoatomic layer in height. The activation energies for different diffusion processes, which occur on the terrace and near the step edge, are calculated both by molecular statics and molecular dynamics simulations. Static energies are found by the drag method, whereas the dynamic barriers are computed at high temperature from the Arrhenius plots. Our numerical results reveal that the jump process requires a much higher activation energy than the exchange process, either on the terrace or near the step edge. In this work, other processes, such as upward and downward diffusion at step edges, have also been discussed.
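
    For reference, the Arrhenius analysis mentioned above takes the standard form below; these are generic relations, and no fitted values from this paper are implied.

    ```latex
    % Generic Arrhenius relations for the hop rate and diffusion coefficient:
    \[
      \Gamma(T) = \Gamma_0 \exp\!\left(-\frac{E_a}{k_B T}\right), \qquad
      D(T) = D_0 \exp\!\left(-\frac{E_a}{k_B T}\right),
    \]
    % so a plot of ln(Gamma) versus 1/T is a straight line whose slope gives
    % the dynamic barrier E_a and whose intercept gives the prefactor.
    ```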

  2. Steps in the open space planning process

    Treesearch

    Stephanie B. Kelly; Melissa M. Ryan

    1995-01-01

    This paper presents the steps involved in developing an open space plan. The steps are generic in that the methods may be applied to communities of various sizes. The intent is to provide a framework for developing an open space plan that meets Massachusetts requirements for funding of open space acquisition.

  3. A Neural Dynamic Model Generates Descriptions of Object-Oriented Actions.

    PubMed

    Richter, Mathis; Lins, Jonas; Schöner, Gregor

    2017-01-01

    Describing actions entails that relations between objects are discovered. A pervasively neural account of this process requires that fundamental problems are solved: the neural pointer problem, the binding problem, and the problem of generating discrete processing steps from time-continuous neural processes. We present a prototypical solution to these problems in a neural dynamic model that comprises dynamic neural fields holding representations close to sensorimotor surfaces as well as dynamic neural nodes holding discrete, language-like representations. Making the connection between these two types of representations enables the model to describe actions as well as to perceptually ground movement phrases, all based on real visual input. We demonstrate how the dynamic neural processes autonomously generate the processing steps required to describe or ground object-oriented actions. By solving the fundamental problems of neural pointing, binding, and emergent discrete processing, the model may be a first but critical step toward a systematic neural processing account of higher cognition. Copyright © 2017 The Authors. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.

  4. Mass production of silicon pore optics for ATHENA

    NASA Astrophysics Data System (ADS)

    Wille, Eric; Bavdaz, Marcos; Collon, Maximilien

    2016-07-01

    Silicon Pore Optics (SPO) provide high angular resolution with low effective area density as required for the Advanced Telescope for High Energy Astrophysics (Athena). The x-ray telescope consists of several hundreds of SPO mirror modules. During the development of the process steps of the SPO technology, specific requirements of a future mass production have been considered right from the beginning. The manufacturing methods heavily utilise off-the-shelf equipment from the semiconductor industry, robotic automation and parallel processing. This makes it possible to upscale the present production flow cost-effectively to produce hundreds of mirror modules per year. Considering manufacturing predictions based on the current technology status, we present an analysis of the time and resources required for the Athena flight programme. This includes the full production process starting with Si wafers up to the integration of the mirror modules. We present the times required for the individual process steps and identify the equipment required to produce two mirror modules per day. A preliminary timeline for building and commissioning the required infrastructure, and for flight model production of about 1000 mirror modules, is presented.
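
    A back-of-the-envelope check on the throughput figures quoted above (ignoring ramp-up, yield loss, and calendar effects, so a rough estimate only):

    ```latex
    % Rough production duration implied by the quoted figures:
    \[
      \frac{1000\ \text{modules}}{2\ \text{modules/day}}
      = 500\ \text{production days}
      \approx 2.3\ \text{years at roughly 220 working days per year.}
    \]
    ```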

  5. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
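
    The pattern these workflow tools exploit can be sketched in a few lines: independent subjects are distributed over worker processes, while the steps within one subject respect their data dependencies. The step functions below are generic placeholders, not the API of any particular MRI tool.

    ```python
    # Minimal sketch of parallel MRI post-processing: subjects run in
    # parallel; steps within a subject respect dependencies. Step names
    # are generic placeholders for illustration only.

    from concurrent.futures import ProcessPoolExecutor

    def skull_strip(raw):   return f"stripped({raw})"
    def segment(image):     return f"segmented({image})"
    def register(image):    return f"registered({image})"

    def process_subject(raw_image):
        stripped = skull_strip(raw_image)  # prerequisite for both steps below
        # segment() and register() depend only on 'stripped', so a workflow
        # engine could also run them in parallel; sequential here for brevity.
        return segment(stripped), register(stripped)

    if __name__ == "__main__":
        subjects = ["sub-01.nii", "sub-02.nii", "sub-03.nii"]
        with ProcessPoolExecutor() as pool:  # independent subjects in parallel
            results = list(pool.map(process_subject, subjects))
        print(results)
    ```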

  6. Effects of aging on the relationship between cognitive demand and step variability during dual-task walking.

    PubMed

    Decker, Leslie M; Cignetti, Fabien; Hunt, Nathaniel; Potter, Jane F; Stergiou, Nicholas; Studenski, Stephanie A

    2016-08-01

    A U-shaped relationship between cognitive demand and gait control may exist in dual-task situations, reflecting opposing effects of external focus of attention and attentional resource competition. The purpose of the study was twofold: to examine whether gait control, as evaluated from step-to-step variability, is related to cognitive task difficulty in a U-shaped manner and to determine whether age modifies this relationship. Young and older adults walked on a treadmill without attentional requirement and while performing a dichotic listening task under three attention conditions: non-forced (NF), forced-right (FR), and forced-left (FL). The conditions increased in their attentional demand and requirement for inhibitory control. Gait control was evaluated by the variability of step parameters related to balance control (step width) and rhythmic stepping pattern (step length and step time). A U-shaped relationship was found for step width variability in both young and older adults and for step time variability in older adults only. Cognitive performance during dual tasking was maintained in both young and older adults. The U-shaped relationship, which presumably results from a trade-off between an external focus of attention and competition for attentional resources, implies that higher-level cognitive processes are involved in walking in young and older adults. Specifically, while these processes are initially involved only in the control of (lateral) balance during gait, they become necessary for the control of (fore-aft) rhythmic stepping pattern in older adults, suggesting that attentional resources turn out to be needed in all facets of walking with aging. Finally, despite the cognitive resources required by walking, both young and older adults spontaneously adopted a "posture second" strategy, prioritizing the cognitive task over the gait task.

  7. Impaired Response Selection During Stepping Predicts Falls in Older People-A Cohort Study.

    PubMed

    Schoene, Daniel; Delbaere, Kim; Lord, Stephen R

    2017-08-01

    Response inhibition, an important executive function, has been identified as a risk factor for falls in older people. This study investigated whether step tests that include different levels of response inhibition differ in their ability to predict falls and whether such associations are mediated by measures of attention, speed, and/or balance. A cohort study with a 12-month follow-up was conducted in community-dwelling older people without major cognitive and mobility impairments. Participants underwent 3 step tests: (1) choice stepping reaction time (CSRT) requiring rapid decision making and step initiation; (2) inhibitory choice stepping reaction time (iCSRT) requiring additional response inhibition and response-selection (go/no-go); and (3) a Stroop Stepping Test (SST) under congruent and incongruent conditions requiring conflict resolution. Participants also completed tests of processing speed, balance, and attention as potential mediators. Ninety-three of the 212 participants (44%) fell in the follow-up period. Of the step tests, only components of the iCSRT task predicted falls in this time with the relative risk per standard deviation for the reaction time (iCSRT-RT) = 1.23 (95%CI = 1.10-1.37). Multiple mediation analysis indicated that the iCSRT-RT was independently associated with falls and not mediated through slow processing speed, poor balance, or inattention. Combined stepping and response inhibition as measured in a go/no-go test stepping paradigm predicted falls in older people. This suggests that integrity of the response-selection component of a voluntary stepping response is crucial for minimizing fall risk. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  8. Rapid oxidation/stabilization technique for carbon foams, carbon fibers and C/C composites

    DOEpatents

    Tan, Seng; Tan, Cher-Dip

    2004-05-11

    An enhanced method for the post processing, i.e., oxidation or stabilization, of carbon materials including, but not limited to, carbon foams, carbon fibers, dense carbon-carbon composites, and carbon/ceramic and carbon/metal composites; the method requires much shorter and more effective processing steps. The introduction of an "oxygen spill over catalyst" into the carbon precursor, by blending it with the carbon starting material or exposing the carbon precursor to such a material, supplies the required oxygen at the atomic level and permits oxidation/stabilization of carbon materials in a fraction of the time and with a fraction of the energy normally required to accomplish such carbon processing steps. Carbon based foams, solids, composites and fiber products made utilizing this method are also described.

  9. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.

  10. Connecticut Department of Transportation safety techniques enhancement plan.

    DOT National Transportation Integrated Search

    2015-03-15

    The Highway Safety Manual (HSM) defines a six-step cycle of safety management processes. This report evaluates how well the Connecticut Department of Transportation conforms to the six safety management steps. The methods recommended in the HSM require additional...

  11. 40 CFR 161.162 - Description of production process.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements...) A flow chart of the chemical equations of each intended reaction occurring at each step of the...

  12. Transfer of a three step mAb chromatography process from batch to continuous: Optimizing productivity to minimize consumable requirements.

    PubMed

    Gjoka, Xhorxhi; Gantier, Rene; Schofield, Mark

    2017-01-20

    The goal of this study was to adapt a batch mAb purification chromatography platform for continuous operation. The experiments and rationale used to convert from batch to continuous operation are described. Experimental data was used to design chromatography methods for continuous operation that would exceed the threshold for critical quality attributes and minimize the consumables required as compared to the batch mode of operation. Four unit operations comprising Protein A capture, viral inactivation, flow-through anion exchange (AEX), and mixed-mode cation exchange chromatography (MMCEX) were integrated across two Cadence BioSMB PD multi-column chromatography systems in order to process a 25 L volume of harvested cell culture fluid (HCCF) in less than 12 h. Transfer from batch to continuous resulted in an increase in productivity of the Protein A step from 13 to 50 g/L/h and of the MMCEX step from 10 to 60 g/L/h, with no impact on the purification process performance in terms of contaminant removal (4.5 log reduction of host cell proteins, 50% reduction in soluble product aggregates) and overall chromatography process recovery yield (75%). The increase in productivity, combined with continuous operation, reduced the resin volume required for Protein A and MMCEX chromatography by more than 95% compared to batch. The volume of AEX membrane required for flow-through operation was reduced by 74%. Moreover, the continuous process required 44% less buffer than an equivalent batch process. This significant reduction in consumables enables cost-effective, disposable, single-use manufacturing. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Quantitative analysis of the thermal requirements for stepwise physical dormancy-break in seeds of the winter annual Geranium carolinianum (Geraniaceae)

    PubMed Central

    Gama-Arachchige, N. S.; Baskin, J. M.; Geneve, R. L.; Baskin, C. C.

    2013-01-01

    Background and Aims Physical dormancy (PY)-break in some annual plant species is a two-step process controlled by two different temperature and/or moisture regimes. The thermal time model has been used to quantify PY-break in several species of Fabaceae, but not to describe stepwise PY-break. The primary aims of this study were to quantify the thermal requirement for sensitivity induction by developing a thermal time model and to propose a mechanism for stepwise PY-breaking in the winter annual Geranium carolinianum. Methods Seeds of G. carolinianum were stored under dry conditions at different constant and alternating temperatures to induce sensitivity (step I). Sensitivity induction was analysed based on the thermal time approach using the Gompertz function. The effect of temperature on step II was studied by incubating sensitive seeds at low temperatures. Scanning electron microscopy, penetrometer techniques, and different humidity levels and temperatures were used to explain the mechanism of stepwise PY-break. Key Results The base temperature (Tb) for sensitivity induction was 17.2 °C and constant for all seed fractions of the population. Thermal time for sensitivity induction during step I in the PY-breaking process agreed with the three-parameter Gompertz model. Step II (PY-break) did not agree with the thermal time concept. Q10 values for the rate of sensitivity induction and PY-break were between 2.0 and 3.5 and between 0.02 and 0.1, respectively. The force required to separate the water gap palisade layer from the sub-palisade layer was significantly reduced after sensitivity induction. Conclusions Step I and step II in PY-breaking of G. carolinianum are controlled by chemical and physical processes, respectively. This study indicates the feasibility of applying the developed thermal time model to predict or manipulate sensitivity induction in seeds with two-step PY-breaking processes. The model is the first and most detailed one yet developed for sensitivity induction in PY-break. PMID:23456728
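
    The quantities named in the abstract have standard generic forms, shown below; the paper's fitted parameter values are not reproduced here.

    ```latex
    % Thermal time accumulated above the base temperature T_b = 17.2 degC:
    \[
      \theta = \sum_i \left(T_i - T_b\right)\,\Delta t_i \quad (T_i > T_b),
    \]
    % one common three-parameter Gompertz response for sensitivity induction:
    \[
      y(\theta) = a\,\exp\!\bigl(-e^{\,b - c\,\theta}\bigr),
    \]
    % and the temperature coefficient comparing rates R_1, R_2 at T_1, T_2:
    \[
      Q_{10} = \left(\frac{R_2}{R_1}\right)^{10/(T_2 - T_1)}.
    \]
    ```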

  14. Calculation tool for transported geothermal energy using two-step absorption process

    DOE Data Explorer

    Kyle Gluesenkamp

    2016-02-01

    This spreadsheet allows the user to calculate parameters relevant to techno-economic performance of a two-step absorption process to transport low temperature geothermal heat some distance (1-20 miles) for use in building air conditioning. The parameters included are (1) energy density of aqueous LiBr and LiCl solutions, (2) transportation cost of trucking solution, and (3) equipment cost for the required chillers and cooling towers in the two-step absorption approach. More information is available in the included public report: "A Technical and Economic Analysis of an Innovative Two-Step Absorption System for Utilizing Low-Temperature Geothermal Resources to Condition Commercial Buildings"

  15. The value marketing chain in health care.

    PubMed

    MacStravic, S

    1999-01-01

    In health care, Michael Porter's value chain can be reconceptualized as a "Value Marketing Chain," in which value is reinforced during each step of the customer recruitment and retention process. "Value" is a concept that must jointly be defined by buyer and seller as they interact every step of the way during the process. This requires the establishment of end-to-end mechanisms for soliciting feedback from customers.

  16. Lignocellulose hydrolysis by multienzyme complexes

    USDA-ARS?s Scientific Manuscript database

    Lignocellulosic biomass is the most abundant renewable resource on the planet. Converting this material into a usable fuel is a multi-step process, the rate-limiting step being enzymatic hydrolysis of organic polymers into monomeric sugars. While the substrate can be complex and require a multitud...

  17. Oxidation-driven surface dynamics on NiAl(100)

    DOE PAGES

    Qin, Hailang; Chen, Xidong; Li, Liang; ...

    2014-12-29

    Atomic steps, a defect common to all crystal surfaces, can play an important role in many physical and chemical processes. However, attempts to predict surface dynamics under nonequilibrium conditions are usually frustrated by poor knowledge of the atomic processes of surface motion arising from mass transport from/to surface steps. Using low-energy electron microscopy that spatially and temporally resolves oxide film growth during the oxidation of NiAl(100) we demonstrate that surface steps are impermeable to oxide film growth. The advancement of the oxide occurs exclusively on the same terrace and requires the coordinated migration of surface steps. The resulting piling up of surface steps ahead of the oxide growth front progressively impedes the oxide growth. This process is reversed during oxide decomposition. The migration of the substrate steps is found to be a surface-step version of the well-known Hele-Shaw problem, governed by detachment (attachment) of Al atoms at step edges induced by the oxide growth (decomposition). As a result, by comparing with the oxidation of NiAl(110) that exhibits unimpeded oxide film growth over substrate steps, we suggest that whenever steps are the source of atoms used for oxide growth they limit the oxidation process; when atoms are supplied from the bulk, the oxidation rate is not limited by the motion of surface steps.

  18. Low-Pressure Alcohol Distillation

    NASA Technical Reports Server (NTRS)

    Frazier, D. O.; Zur Burg, F. W.; Cody, J. C.

    1984-01-01

    Heat requirements lowered for process. Temperature requirements lowered enough to make solar heat absorbed by flat-plate collectors a feasible energy source. Alcohol produced without adding other solvents, eliminating need for dehydration or hydrocarbon stripping as final step.

  19. Point-of-care testing.

    PubMed

    O'Brien, J A

    2000-12-01

    Is POCT worth integrating into a facility? Despite its promise of speed and convenience, this technology requires careful evaluation of potential benefits, disadvantages, and challenges to the existing system. If the pros outweigh the cons, a step-by-step approach can ease the process of implementing a POCT program.

  1. Selective catalytic two-step process for ethylene glycol from carbon monoxide

    PubMed Central

    Dong, Kaiwu; Elangovan, Saravanakumar; Sang, Rui; Spannenberg, Anke; Jackstell, Ralf; Junge, Kathrin; Li, Yuehui; Beller, Matthias

    2016-01-01

    Upgrading C1 chemicals (for example, CO, CO/H2, MeOH and CO2) with C–C bond formation is essential for the synthesis of bulk chemicals. In general, these industrially important processes (for example, Fischer-Tropsch) proceed under drastic reaction conditions (>250 °C; high pressure) and suffer from low selectivity, which makes high capital investment necessary and requires additional purifications. Here, a different strategy for the preparation of ethylene glycol (EG) via initial oxidative coupling and subsequent reduction is presented. Separating the coupling and reduction steps allows for a completely selective formation of EG (99%) from CO. This two-step catalytic procedure makes use of a Pd-catalysed oxycarbonylation of amines to oxamides at room temperature (RT) and subsequent Ru- or Fe-catalysed hydrogenation to EG. Notably, in the first step the required amines can be efficiently reused. The presented stepwise oxamide-mediated coupling provides the basis for a new strategy for selective upgrading of C1 chemicals. PMID:27377550

  2. Breathe easy with proper respiratory protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bidwell, J.

    1996-05-01

    Evaluating the need for respiratory protection in chemical process industries (CPI) plants and selecting the appropriate respirator involves several steps. The Occupational Safety and Health Administration (OSHA) general industry standard for respiratory protection (29 CFR 1910.134(b)) requires the employer to establish a program to help reduce exposures to occupational contaminants. When feasible, employers must eliminate contaminants by using engineering controls (such as general and local ventilation, enclosure or isolation, or substitution of a less-hazardous process or material). Establishing a respiratory protection program consists of four steps: (1) Identify respiratory hazards and concentrations; (2) Understand the contaminants' effects on workers' health; (3) Select appropriate respiratory protection; and (4) Train in proper respirator use and maintenance. Consult applicable state and OSHA requirements to ensure that your program satisfies these steps. Industrial respirator manufacturers can assist with on-site training and fit testing. The paper discusses these four steps, program guidelines, determination of the hazard, and styles of respirators.

  3. [Indications of lung transplantation: Patients selection, timing of listing, and choice of procedure].

    PubMed

    Morisse Pradier, H; Sénéchal, A; Philit, F; Tronc, F; Maury, J-M; Grima, R; Flamens, C; Paulus, S; Neidecker, J; Mornex, J-F

    2016-02-01

    Lung transplantation (LT) is now considered as an excellent treatment option for selected patients with end-stage pulmonary diseases, such as COPD, cystic fibrosis, idiopathic pulmonary fibrosis, and pulmonary arterial hypertension. The 2 goals of LT are to provide a survival benefit and to improve quality of life. The 3-step decision process leading to LT is discussed in this review. The first step is the selection of candidates, which requires a careful examination in order to check absolute and relative contraindications. The second step is the timing of listing for LT; it requires the knowledge of disease-specific prognostic factors available in international guidelines, and discussed in this paper. The third step is the choice of procedure: indications of heart-lung, single-lung, and bilateral-lung transplantation are described. In conclusion, this document provides guidelines to help pulmonologists in the referral and selection processes of candidates for transplantation in order to optimize the outcome of LT. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  4. Space Medicine in the Human System Integration Process

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard A.

    2010-01-01

    This slide presentation reviews the importance of integrating space medicine into the human system of lunar exploration. There is a review of historical precedents relevant to lunar surface operations. The integration process is reviewed in a chart that shows the steps from research to requirements development, requirements integration, design, verification, and operations, with lessons learned feeding back additional information and items for research. These steps are reviewed in view of specific space medical issues. Some of the testing of the operations is undertaken in an environment that is an analog to the exploration environment. Some of these analog environments are reviewed, and there is some discussion of the benefits of using an analog environment in testing the processes that are derived.

  5. Morphological Study on Porous Silicon Carbide Membrane Fabricated by Double-Step Electrochemical Etching

    NASA Astrophysics Data System (ADS)

    Omiya, Takuma; Tanaka, Akira; Shimomura, Masaru

    2012-07-01

    The structure of porous silicon carbide membranes that peeled off spontaneously during electrochemical etching was studied. They were fabricated from n-type 6H SiC(0001) wafers by a double-step electrochemical etching process in a hydrofluoric electrolyte. Nanoporous membranes were obtained after double-step etching with current densities of 10-20 and 60-100 mA/cm2 in the first and second steps, respectively. Microporous membranes were also fabricated after double-step etching with current densities of 100 and 200 mA/cm2. It was found that the pore diameter is influenced by the etching current in step 1, and that a higher current is required in step 2 when the current in step 1 is increased. During the etching processes in steps 1 and 2, vertical nanopore and lateral crack formations proceed, respectively. The influx pathway of hydrofluoric solution, expansion of generated gases, and transfer limitation of positive holes to the pore surface are the key factors in the peeling-off mechanism of the membrane.

  6. One-Step Sub-micrometer-Scale Electrohydrodynamic Inkjet Three-Dimensional Printing Technique with Spontaneous Nanoscale Joule Heating.

    PubMed

    Zhang, Bin; Seong, Baekhoon; Lee, Jaehyun; Nguyen, VuDat; Cho, Daehyun; Byun, Doyoung

    2017-09-06

    A one-step sub-micrometer-scale electrohydrodynamic (EHD) inkjet three-dimensional (3D)-printing technique, based on drop-on-demand (DOD) operation and requiring no additional postsintering process, is proposed. Both the numerical simulation and the experimental observations proved that nanoscale Joule heating occurs at the interface between the charged silver nanoparticles (Ag-NPs) because of the high electrical contact resistance during the printing process; this is the reason why an additional postsintering process is not required. Sub-micrometer-scale 3D structures were printed with aspect ratios above 35 using the proposed printing technique; furthermore, designed 3D structures such as bridge-like shapes can also be printed, allowing for the cost-effective fabrication of a 3D touch sensor and an ultrasensitive air flow-rate sensor. It is believed that the proposed one-step printing technique may replace conventional 3D conductive-structure printing techniques that rely on a postsintering process, because of its economic efficiency.

  7. Planning that works: Empowerment through stakeholder focused interactive planning (SFIP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, J.E.; Ison, S.A.

    1994-12-31

    This paper describes a powerful planning tool that can enable government, private industries, and public interest organizations to actualize their visions through sound decision making. The stakeholder focused interactive planning model is designed to integrate stakeholders and ultimately gain their investment in the successful attainment of the vision. The only concessions required of the planning organization using this process are acceptance of the premise that sustained vision success requires the support of both internal and external stakeholders, and that each step in the process must be used as a validation of the previous step and is essential to the completion of the next step. What is stakeholder/public involvement? It is the process in which the stakeholders' (both internal and external) values, interests, and expectations are included in decision-making processes. The primary goal of public involvement efforts is to include all those who have a stake in the decision, whether or not they have already been identified. Stakeholders are individuals, contractors, clients, suppliers, public organizations, state and local governments, Indian tribes, federal agencies, and other parties affected by decisions.

  8. Method to Improve Indium Bump Bonding via Indium Oxide Removal Using a Multi-Step Plasma Process

    NASA Technical Reports Server (NTRS)

    Dickie, Matthew R. (Inventor); Nikzad, Shouleh (Inventor); Greer, H. Frank (Inventor); Jones, Todd J. (Inventor); Vasquez, Richard P. (Inventor); Hoenk, Michael E. (Inventor)

    2012-01-01

    A process for removing indium oxide from indium bumps in a flip-chip structure to reduce contact resistance, by a multi-step plasma treatment. A first plasma treatment of the indium bumps with an argon, methane and hydrogen plasma reduces indium oxide, and a second plasma treatment with an argon and hydrogen plasma removes residual organics. The multi-step plasma process for removing indium oxide from the indium bumps is more effective in reducing the oxide, and yet does not require the use of halogens, does not change the bump morphology, does not attack the bond pad material or under-bump metallization layers, and creates no new mechanisms for open circuits.

  9. Improvement of the System of Training of Specialists by University for Coal Mining Enterprises

    NASA Astrophysics Data System (ADS)

    Mikhalchenko, Vadim; Seredkina, Irina

    2017-11-01

    In this article, the Quality Function Deployment (QFD) technique is considered with reference to the process of training specialists with higher education at a university. The method is based on the step-by-step conversion of customer requirements into specific organizational, content-related, and functional transformations of the technological process of the university. A fully deployed quality function includes four stages of tracking customer requirements while creating a product: product planning, product design, process design, and production design. The Quality Function Deployment can be considered one of the methods for optimizing the technological processes of training specialists with higher education under current economic conditions. Implemented at the initial stages of the life cycle of the technological process, it ensures not only the high quality of the "product" of graduate school, but also the fullest possible satisfaction of consumers' requests and expectations.

  10. Overview ID/POV Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, James G.

    2016-08-16

    This is a flowchart of the inventory difference/propagation of variance (ID/POV) process, with steps such as determine all relevant transactions, group relevant transactions by affected MATLID, filter groups to S, F and identify required M's, etc.

  11. Using a contextualized sensemaking model for interaction design: A case study of tumor contouring.

    PubMed

    Aselmaa, Anet; van Herk, Marcel; Laprie, Anne; Nestle, Ursula; Götz, Irina; Wiedenmann, Nicole; Schimek-Jasch, Tanja; Picaud, Francois; Syrykh, Charlotte; Cagetti, Leonel V; Jolnerovski, Maria; Song, Yu; Goossens, Richard H M

    2017-01-01

    Sensemaking theories help designers understand the cognitive processes of a user when he/she performs a complicated task. This paper introduces a two-step approach for incorporating sensemaking support within the design of health information systems by: (1) modeling the sensemaking process of physicians while performing a task, and (2) identifying software interaction design requirements that support sensemaking based on this model. The two-step approach is presented based on a case study of the tumor contouring clinical task for radiotherapy planning. In the first step of the approach, a contextualized sensemaking model was developed to describe the sensemaking process based on the goal, the workflow, and the context of the task. In the second step, based on a research software prototype, an experiment was conducted in which three contouring tasks were each performed by eight physicians. Four types of navigation interactions and five types of interaction sequence patterns were identified by analyzing the interaction log data gathered from those twenty-four cases. Further in-depth study of each of the navigation interactions and interaction sequence patterns, in relation to the contextualized sensemaking model, revealed five main areas for design improvements to increase sensemaking support. Outcomes of the case study indicate that the proposed two-step approach was beneficial for gaining a deeper understanding of the sensemaking process during the task, as well as for identifying design requirements for better sensemaking support. Copyright © 2016. Published by Elsevier Inc.

  12. Environmental Benign Process for Production of Molybdenum Metal from Sulphide Based Minerals

    NASA Astrophysics Data System (ADS)

    Rajput, Priyanka; Janakiram, Vangada; Jayasankar, Kalidoss; Angadi, Shivakumar; Bhoi, Bhagyadhar; Mukherjee, Partha Sarathi

    2017-10-01

    Molybdenum is a strategic, high-temperature refractory metal that is not found in nature in the free state; it occurs in the earth's crust predominantly in the form of MoO3/MoS2. The main disadvantage of the industrial treatment of Mo concentrate is that the process involves many stages and requires very high temperatures. Almost every step generates gaseous, liquid, and solid chemical by-products that require further treatment. To overcome these drawbacks, a novel one-step alternative process was developed for the treatment of sulphide and trioxide molybdenum concentrates. This paper presents the results of investigations of molybdenite (MoS2) dissociation using a microwave-assisted plasma unit as well as a transferred-arc thermal plasma torch. It is a single-step process for the preparation of pure molybdenum metal from MoS2 by hydrogen reduction in thermal plasma. Process variables such as H2 gas, Ar gas, input current, voltage, and time were examined to prepare molybdenum metal. Molybdenum recovery of the order of 95% was achieved. The XRD results confirm the phases of molybdenum metal, and the chemical analysis of the end product indicates the formation of metallic molybdenum (Mo 98%).
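    Written as an idealized balanced equation (the record itself does not give one, and the actual plasma chemistry is more complex), the hydrogen reduction of molybdenite at the heart of the single-step route is:

        \mathrm{MoS_2(s) + 2\,H_2(g) \longrightarrow Mo(s) + 2\,H_2S(g)}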

  13. Expedited vocational assessment under the sequential evaluation process. Final rules.

    PubMed

    2012-07-25

    We are revising our rules to give adjudicators the discretion to proceed to the fifth step of the sequential evaluation process for assessing disability when we have insufficient information about a claimant's past relevant work history to make the findings required for step 4. If an adjudicator finds at step 5 that a claimant may be unable to adjust to other work existing in the national economy, the adjudicator will return to the fourth step to develop the claimant's work history and make a finding about whether the claimant can perform his or her past relevant work. We expect that this new expedited process will not disadvantage any claimant or change the ultimate conclusion about whether a claimant is disabled, but it will promote administrative efficiency and help us make more timely disability determinations and decisions.

  14. Estimation of Managerial and Technical Personnel Requirements in Selected Industries. Training for Industry Series, No. 2.

    ERIC Educational Resources Information Center

    United Nations Industrial Development Organization, Vienna (Austria).

    The need to develop managerial and technical personnel in the cement, fertilizer, pulp and paper, sugar, leather and shoe, glass, and metal processing industries of various nations was studied, with emphasis on necessary steps in developing nations to relate occupational requirements to technology, processes, and scale of output. Estimates were…

  15. Alternative process for thin layer etching: Application to nitride spacer etching stopping on silicon germanium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Posseme, N., E-mail: nicolas.posseme@cea.fr; Pollet, O.; Barnola, S.

    2014-08-04

    Silicon nitride spacer etching is considered today one of the most challenging etch processes in the realization of new devices. For this step, atomic etch precision is required to stop on silicon or silicon germanium with perfect anisotropy (no foot formation). None of the current plasma technologies can meet all these requirements. To overcome these issues and meet the highly complex requirements imposed by device fabrication processes, we recently proposed an alternative etching process to the current plasma etch chemistries. This process is based on thin film modification by light ion implantation followed by a selective removal of the modified layer with respect to the non-modified material. In this Letter, we demonstrate the benefit of this alternative etch method in terms of film damage control (the silicon germanium recess obtained is less than 6 Å), anisotropy (no foot formation), and its compatibility with other integration steps like epitaxy. The etch mechanisms of this approach are also addressed.

  16. Cooperative Buying for New Associates: Some Assembly Required. Important Safety Instructions.

    ERIC Educational Resources Information Center

    Talarico, Scott

    1998-01-01

    A guide for using cooperative buying to block-book campus activities or attractions through a campus activities convention provides a step-by-step process and outlines some specific considerations, including forms, pricing, preparation for the conference, follow-up approaches, and hints for new users of the system. (MSE)

  17. Physical characterization of a new composition of oxidized zirconium-2.5 wt% niobium produced using a two step process for biomedical applications

    NASA Astrophysics Data System (ADS)

    Pawar, V.; Weaver, C.; Jani, S.

    2011-05-01

    Zirconium and particularly the Zr-2.5 wt%Nb (Zr2.5Nb) alloy are useful for engineering bearing applications because they can be oxidized in air to form a hard surface ceramic. Oxidized zirconium (OxZr), owing to its abrasion-resistant ceramic surface and biocompatible substrate alloy, has been used as a bearing surface in total joint arthroplasty for several years. OxZr is characterized by a hard zirconium oxide (oxide) formed on Zr2.5Nb using one-step thermal oxidation carried out in air. Because the oxide is only at the surface, the bulk material behaves like a metal, with high toughness. The oxide, furthermore, exhibits high adhesion to the substrate because of an oxygen-rich diffusion hardened zone (DHZ) interposed between the oxide and the substrate. In this study, we demonstrate a two-step process that forms a thicker DHZ, and thus a greater depth of hardening, than can be obtained using a one-step oxidation process. The first step is thermal oxidation in air and the second step is a heat treatment in vacuum. The second step drives oxygen from the oxide formed in the first step deeper into the substrate to form a thicker DHZ. During the process only a portion of the oxide is dissolved. This new composition (DHOxZr) has an oxide approximately 4-6 μm thick, similar to that of OxZr. The nano-hardness of the oxide is similar, but the DHZ is approximately 10 times thicker. The stoichiometry of the oxide is similar, and a secondary phase rich in oxygen is present through the entire thickness. Due to the increased depth of hardening, the critical load required for the onset of oxide cracking is approximately 1.6 times higher than that of the oxide of OxZr. This new composition has the potential to be used as a bearing surface in applications where greater depth of hardening is required.

  18. Biodiesel production from waste frying oil using waste animal bone and solar heat.

    PubMed

    Corro, Grisel; Sánchez, Nallely; Pal, Umapada; Bañuelos, Fortino

    2016-01-01

    A two-step catalytic process for the production of biodiesel from waste frying oil (WFO) at low cost, utilizing waste animal bone as catalyst and solar radiation as heat source, is reported in this work. In the first step, the free fatty acids (FFA) in WFO were esterified with methanol by a catalytic process using calcined waste animal bone as catalyst, which remains active even after 10 esterification runs. The trans-esterification step was catalyzed by NaOH through a thermal activation process. The produced biodiesel fulfills all the international requirements for its utilization as a fuel. A probable reaction mechanism for the esterification process is proposed, considering the presence of hydroxyapatite at the surface of calcined animal bones. Copyright © 2015 Elsevier Ltd. All rights reserved.
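    The two steps follow standard biodiesel chemistry, written here in generic textbook form (the record does not spell the equations out), with R denoting a fatty-acid chain:

        \mathrm{RCOOH + CH_3OH \longrightarrow RCOOCH_3 + H_2O} \quad \text{(esterification of FFA, bone catalyst)}
        \text{triglyceride} + 3\,\mathrm{CH_3OH} \xrightarrow{\mathrm{NaOH}} 3\,\mathrm{RCOOCH_3} + \text{glycerol} \quad \text{(trans-esterification)}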

  19. Mechanism and the origins of stereospecificity in copper-catalyzed ring expansion of vinyl oxiranes: a traceless dual transition-metal-mediated process.

    PubMed

    Mustard, Thomas J L; Mack, Daniel J; Njardarson, Jon T; Cheong, Paul Ha-Yeon

    2013-01-30

    Density functional theory computations show that the Cu-catalyzed ring expansion of vinyl oxiranes is mediated by a traceless dual Cu(I)-catalyst mechanism. Overall, the reaction involves a monomeric Cu(I) catalyst, but a single key step, the Cu migration, requires two Cu(I) catalysts for the transformation. This dual-Cu step is found to be a true double-Cu(I) transition state rather than a single-Cu(I) transition state in the presence of an adventitious, spectator Cu(I). Both Cu(I) catalysts are involved in the bond-forming and bond-breaking process. The single-Cu(I) transition state is not a stationary point on the potential energy surface. Interestingly, the reductive elimination is rate-determining for the major diastereomeric product, while the Cu(I) migration step is rate-determining for the minor. Thus, while the reaction requires dual Cu(I) activation to proceed, kinetically, the presence of the dual-Cu(I) step is untraceable. The diastereospecificity of this reaction is controlled by the Cu migration step. Suprafacial migration is favored over antarafacial migration due to the distorted Cu π-allyl in the latter.

  20. AMPS data management concepts. [Atmospheric, Magnetospheric and Plasma in Space experiment

    NASA Technical Reports Server (NTRS)

    Metzelaar, P. N.

    1975-01-01

    Five typical AMPS experiments were formulated to allow simulation studies to verify data management concepts. Design studies were conducted to analyze these experiments in terms of the applicable procedures, data processing and displaying functions. Design concepts for the AMPS data management system are presented which permit both automatic repetitive measurement sequences and experimenter-controlled step-by-step procedures. Extensive use is made of a cathode ray tube display, the experimenters' alphanumeric keyboard, and the computer. The types of computer software required by the system and the possible choices of control and display procedures available to the experimenter are described for several examples. An electromagnetic wave transmission experiment illustrates the methods used to analyze data processing requirements.

  1. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  2. Heat recirculating cooler for fluid stream pollutant removal

    DOEpatents

    Richards, George A.; Berry, David A.

    2008-10-28

    A process by which heat is removed from a reactant fluid to reach the operating temperature of a known pollutant removal method and said heat is recirculated to raise the temperature of the product fluid. The process can be utilized whenever an intermediate step reaction requires a lower reaction temperature than the prior and next steps. The benefits of a heat-recirculating cooler include the ability to use known pollutant removal methods and increased thermal efficiency of the system.

  3. Glucose repression may involve processes with different sugar kinase requirements.

    PubMed Central

    Sanz, P; Nieto, A; Prieto, J A

    1996-01-01

    Adding glucose to Saccharomyces cerevisiae cells growing on nonfermentable carbon sources leads to glucose repression. This process may be resolved into several steps. An early repression response requires any one of the three glucose kinases present in S. cerevisiae (HXK1, HXK2, or GLK1). A late response is only achieved when Hxk2p is present. PMID:8755906

  4. Safe and Effective Schooling for All Students: Putting into Practice the Disciplinary Provisions of the 1997 IDEA.

    ERIC Educational Resources Information Center

    Gable, Robert A.; Butler, C. J.; Walker-Bolton, Irene; Tonelson, Stephen W.; Quinn, Mary M.; Fox, James J.

    2003-01-01

    Virginia's statewide plan of educator preparation in functional behavioral assessment, as required under the Individuals with Disabilities Education Act, is described. The step-by-step training process facilitated positive academic and nonacademic outcomes for all students. Preliminary data support the effectiveness of both the content and…

  5. Five Steps for Developing Effective Transition Plans for High School Students with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Szidon, Katherine; Ruppar, Andrea; Smith, Leann

    2015-01-01

    The Individuals With Disabilities Education Act (IDEA; 2006) requires schools to develop transition plans for students with disabilities, beginning at age 16, if not before. For students with autism spectrum disorder (ASD), the transition planning process includes unique considerations. This article describes five steps for developing effective…

  6. 77 FR 26929 - Requirements for Official Establishments To Notify FSIS of Adulterated or Misbranded Product...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-08

    ... Inspection Service, USDA. ACTION: Final rule. SUMMARY: The Food Safety and Inspection Service (FSIS) is... that outlines a step-by-step reaction process. They also requested that FSIS consider factors such as... to determine more quickly whether a recall action is necessary (including detention and seizure of...

  7. Space shuttle orbiter guidance, navigation and control software functional requirements: Horizontal flight operations

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The shuttle GN&C software functions for horizontal flight operations are defined. Software functional requirements are grouped into two categories: first horizontal flight requirements and full mission horizontal flight requirements. The document provides the initial step in the shuttle GN&C software design process. It also serves as a management tool to identify analyses which are required to define requirements.

  8. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a different way that is consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
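    The broadcast-bid-award cycle described above can be caricatured in a few lines of Python. The Machine and Bid classes below are invented for illustration; the real negotiation weighs processing time, due date, and cost against a full STEP product model rather than the single scalar used here.

        from dataclasses import dataclass

        @dataclass
        class Bid:
            machine: str
            cost: float
            finish_time: float

        class Machine:
            """Autonomous machine agent with its own process planner and schedule."""
            def __init__(self, name, capabilities, rate, available_at):
                self.name = name
                self.capabilities = capabilities   # processes this machine can offer
                self.rate = rate                   # cost per hour
                self.available_at = available_at   # next free slot (hours)

            def bid(self, process, duration):
                """Return a bid only if the requested process is within our capabilities."""
                if process not in self.capabilities:
                    return None
                return Bid(self.name, self.rate * duration, self.available_at + duration)

        def award_contract(machines, process, duration, due_date):
            """Shop floor manager: broadcast the requirement, collect bids, pick a winner."""
            bids = [b for m in machines if (b := m.bid(process, duration))]
            feasible = [b for b in bids if b.finish_time <= due_date]
            if not feasible:
                return None
            # Negotiation reduced to least cost among on-time bids.
            return min(feasible, key=lambda b: b.cost)

        machines = [Machine("mill-1", {"milling"}, 40.0, 2.0),
                    Machine("mill-2", {"milling", "drilling"}, 55.0, 0.0)]
        print(award_contract(machines, "milling", 3.0, due_date=4.0))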

  9. The neural correlates of morphological complexity processing: Detecting structure in pseudowords.

    PubMed

    Schuster, Swetlana; Scharinger, Mathias; Brooks, Colin; Lahiri, Aditi; Hartwigsen, Gesa

    2018-06-01

    Morphological complexity is a highly debated issue in visual word recognition. Previous neuroimaging studies have shown that speakers are sensitive to degrees of morphological complexity. Two-step derived complex words (bridging, through bridge [noun] > bridge [verb] > bridging) led to greater activation in the left inferior frontal gyrus than their one-step derived counterparts (running, through run [verb] > running). However, it remains unclear whether sensitivity to degrees of morphological complexity extends to pseudowords. If this were the case, it would indicate that abstract knowledge of morphological structure is independent of lexicality. We addressed this question by investigating the processing of two sets of pseudowords in German. Both sets contained morphologically viable two-step derived pseudowords differing in the number of derivational steps required to access an existing lexical representation and therefore the degree of structural analysis expected during processing. Using a 2 × 2 factorial design, we found lexicality effects to be distinct from processing signatures relating to structural analysis in pseudowords. Semantically-driven processes such as lexical search showed a more frontal distribution while combinatorial processes related to structural analysis engaged more parietal parts of the network. Specifically, more complex pseudowords showed increased activation in parietal regions (right superior parietal lobe and left precuneus) relative to pseudowords that required less structural analysis to arrive at an existing lexical representation. As the two sets were matched on cohort size and surface form, these results highlight the role of internal levels of morphological structure even in forms that do not possess a lexical representation. © 2018 Wiley Periodicals, Inc.

  10. A detailed comparison of analysis processes for MCC-IMS data in disease classification—Automated methods can replace manual peak annotations

    PubMed Central

    Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven

    2017-01-01

    Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high throughput use of the technology. PMID:28910313
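    A compressed sketch of the winning pipeline shape (detected peaks, clustered into features, classified by a random forest), using synthetic data in place of raw MCC-IMS spectra; the actual SGLTR/LM peak detection and EM clustering steps are far more involved than the stand-ins below.

        import numpy as np
        from sklearn.cluster import DBSCAN
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Stand-in for peak positions (retention time, drift time) pooled over all
        # measurements; in the paper these come from SGLTR/LM peak detection.
        centers = np.array([[1.0, 1.0], [4.0, 4.0], [8.0, 2.0]])
        peaks = np.concatenate([rng.normal(c, 0.2, size=(100, 2)) for c in centers])

        # Clustering step: each cluster of peaks across experiments becomes one feature.
        labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(peaks)
        n_clusters = labels.max() + 1

        # Per-measurement feature matrix (synthetic intensities) and disease/control labels.
        X = rng.random((60, n_clusters))
        y = rng.integers(0, 2, size=60)

        # Multivariate classification step: random forest, scored by cross-validated AUC.
        auc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                              X, y, cv=5, scoring="roc_auc")
        print(f"{n_clusters} peak clusters, mean AUC = {auc.mean():.2f}")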

  11. One Step at a Time: SBM as an Incremental Process.

    ERIC Educational Resources Information Center

    Conrad, Mark

    1995-01-01

    Discusses incremental SBM budgeting and answers questions regarding resource equity, bookkeeping requirements, accountability, decision-making processes, and purchasing. Approaching site-based management as an incremental process recognizes that every school system engages in some level of site-based decisions. Implementation can be gradual and…

  12. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
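    A toy sketch of the iterate-the-waterfall idea, with made-up industry-average parameters of the kind the record says PATT takes as input (lines of code, productivity, defects per line); it is not the PATT/IEEE 12207 model itself, merely an illustration of how a spiral accumulates effort and rework across iterations while requirements change.

        def simulate_spiral(total_size, iterations, productivity=10.0,
                            defects_per_kloc=20.0, detect_rate=0.7, fix_hours=2.0,
                            requirements_growth=0.1):
            """Run a repeated mini-waterfall; requirements may change each iteration."""
            size_per_iter = total_size / iterations
            effort = rework = escaped = 0.0
            for i in range(iterations):
                size = size_per_iter * (1 + requirements_growth * i)  # changing requirements
                effort += size / productivity                          # analyze/design/code/test
                injected = defects_per_kloc * size / 1000
                detected = detect_rate * injected
                rework += detected * fix_hours
                escaped += injected - detected
            return effort + rework, escaped

        hours, residual_defects = simulate_spiral(total_size=50_000, iterations=4)
        print(f"{hours:.0f} staff-hours, {residual_defects:.0f} escaped defects")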

  13. The "Motor" in Implicit Motor Sequence Learning: A Foot-stepping Serial Reaction Time Task.

    PubMed

    Du, Yue; Clark, Jane E

    2018-05-03

    This protocol describes a modified serial reaction time (SRT) task used to study implicit motor sequence learning. Unlike the classic SRT task that involves finger-pressing movements while sitting, the modified SRT task requires participants to step with both feet while maintaining a standing posture. This stepping task necessitates whole body actions that impose postural challenges. The foot-stepping task complements the classic SRT task in several ways. The foot-stepping SRT task is a better proxy for the daily activities that require ongoing postural control, and thus may help us better understand sequence learning in real-life situations. In addition, response time serves as an indicator of sequence learning in the classic SRT task, but it is unclear whether response time, reaction time (RT) representing mental process, or movement time (MT) reflecting the movement itself, is a key player in motor sequence learning. The foot-stepping SRT task allows researchers to disentangle response time into RT and MT, which may clarify how motor planning and movement execution are involved in sequence learning. Lastly, postural control and cognition are interactively related, but little is known about how postural control interacts with learning motor sequences. With a motion capture system, the movement of the whole body (e.g., the center of mass (COM)) can be recorded. Such measures allow us to reveal the dynamic processes underlying discrete responses measured by RT and MT, and may aid in elucidating the relationship between postural control and the explicit and implicit processes involved in sequence learning. Details of the experimental set-up, procedure, and data processing are described. The representative data are adopted from one of our previous studies. Results are related to response time, RT, and MT, as well as the relationship between the anticipatory postural response and the explicit processes involved in implicit motor sequence learning.
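    The decomposition the protocol exploits can be stated simply: the response time from stimulus onset to movement completion splits into RT (stimulus onset to movement onset) and MT (movement onset to movement end). A minimal sketch, assuming hypothetical per-trial event timestamps:

        def split_response_time(stimulus_on, movement_on, movement_end):
            """Decompose total response time into reaction time (RT) and movement time (MT)."""
            rt = movement_on - stimulus_on    # mental/planning component
            mt = movement_end - movement_on   # movement-execution component
            return rt, mt

        # One hypothetical stepping trial, times in seconds from stimulus onset.
        rt, mt = split_response_time(stimulus_on=0.000, movement_on=0.412, movement_end=1.087)
        print(f"RT = {rt*1000:.0f} ms, MT = {mt*1000:.0f} ms, response time = {(rt+mt)*1000:.0f} ms")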

  14. Creating the Infrastructure for Rapid Application Development and Processing Response to the HIRDLS Radiance Anomaly

    NASA Astrophysics Data System (ADS)

    Cavanaugh, C.; Gille, J.; Francis, G.; Nardi, B.; Hannigan, J.; McInerney, J.; Krinsky, C.; Barnett, J.; Dean, V.; Craig, C.

    2005-12-01

    The High Resolution Dynamics Limb Sounder (HIRDLS) instrument onboard the NASA Aura spacecraft experienced a rupture of the thermal blanketing material (Kapton) during the rapid depressurization of launch. The Kapton draped over the HIRDLS scan mirror, severely limiting the aperture through which HIRDLS views space and Earth's atmospheric limb. In order for HIRDLS to achieve its intended measurement goals, rapid characterization of the anomaly and rapid recovery from it were required. The recovery centered on a new processing module inserted into the standard HIRDLS processing scheme, with a goal of minimizing the effect of the anomaly on the already existing processing modules. We describe the software infrastructure on which the new processing module was built, and how that infrastructure allows for rapid application development and processing response. The scope of the infrastructure spans three distinct anomaly recovery steps and the means for their intercommunication. Each of the three recovery steps (removing the Kapton-induced oscillation in the radiometric signal, removing the Kapton signal contamination upon the radiometric signal, and correcting for the partially-obscured atmospheric view) is completely modularized and insulated from the other steps, allowing focused and rapid application development towards a specific step, and neutralizing unintended inter-step influences, thus greatly shortening the design-development-test lifecycle. The intercommunication is also completely modularized and has a simple interface to which the three recovery steps adhere, allowing easy modification and replacement of specific recovery scenarios, thereby heightening the processing response.
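    The modular design described (three insulated recovery steps behind one simple intercommunication interface) resembles, in sketch form and with invented class names, a pluggable pipeline:

        from typing import Protocol

        class RecoveryStep(Protocol):
            """Interface every recovery module adheres to; steps stay insulated."""
            def process(self, radiances: dict) -> dict: ...

        class RemoveOscillation:
            def process(self, radiances):
                radiances["oscillation_removed"] = True   # placeholder for the real correction
                return radiances

        class RemoveKaptonSignal:
            def process(self, radiances):
                radiances["kapton_removed"] = True
                return radiances

        class CorrectObscuredView:
            def process(self, radiances):
                radiances["view_corrected"] = True
                return radiances

        def run_recovery(radiances, steps):
            """Intercommunication layer: pass one record through each step in order."""
            for step in steps:
                radiances = step.process(radiances)
            return radiances

        print(run_recovery({"channel": 2},
                           [RemoveOscillation(), RemoveKaptonSignal(), CorrectObscuredView()]))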

  15. STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitch, S.H.; Morris, J.W.

    1962-12-15

    Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described with regard to their application in determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished, and the number of required tests was substantially reduced, by application of these statistical methods to the SNAP 10A production development effort. (auth)
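    The core technique, one-way analysis of variance across processing conditions, can be illustrated with scipy on made-up element measurements (the SNAP 10A data themselves are not reproduced in the record):

        from scipy.stats import f_oneway

        # Hypothetical figure-of-merit measurements for three variants of one processing step.
        step_a = [4.1, 4.3, 3.9, 4.2, 4.0]
        step_b = [4.6, 4.8, 4.5, 4.7, 4.9]
        step_c = [4.2, 4.1, 4.3, 4.0, 4.2]

        f_stat, p_value = f_oneway(step_a, step_b, step_c)
        # A small p-value suggests the processing variant has a real effect and the
        # step is worth keeping; otherwise it is a candidate for elimination.
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")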

  16. Magnetically Enhanced Solid-Liquid Separation

    NASA Astrophysics Data System (ADS)

    Rey, C. M.; Keller, K.; Fuchs, B.

    2005-07-01

    DuPont is developing an entirely new method of solid-liquid filtration involving the use of magnetic fields and magnetic field gradients. The new hybrid process, entitled Magnetically Enhanced Solid-Liquid Separation (MESLS), is designed to improve the de-watering kinetics and reduce the residual moisture content of solid particulates mechanically separated from liquid slurries. Gravitation, pressure, temperature, centrifugation, and fluid dynamics have dictated traditional solid-liquid separation for the past 50 years. The introduction of an external field (i.e. the magnetic field) offers the promise to manipulate particle behavior in an entirely new manner, which leads to increased process efficiency. Traditional solid-liquid separation typically consists of two primary steps. The first is a mechanical step in which the solid particulate is separated from the liquid using e.g. gas pressure through a filter membrane, centrifugation, etc. The second step is a thermal drying process, which is required due to imperfect mechanical separation. The thermal drying process is over 100-200 times less energy efficient than the mechanical step. Since enormous volumes of materials are processed each year, more efficient mechanical solid-liquid separations can be leveraged into dramatic reductions in overall energy consumption by reducing downstream drying requirements. Using DuPont's MESLS process, initial test results showed four very important effects of the magnetic field on the solid-liquid filtration process: 1) reduction of the time to reach gas breakthrough, 2) less loss of solid into the filtrate, 3) reduction of the (solids) residual moisture content, and 4) acceleration of the de-watering kinetics. These test results and their potential impact on future commercial solid-liquid filtration are discussed. New applications can be found in mining, chemical and bioprocesses.

  17. Customers First: Using Process Improvement To Improve Service Quality and Efficiency.

    ERIC Educational Resources Information Center

    Larson, Catherine A.

    1998-01-01

    Describes steps in a process-improvement project for reserve book services at the University of Arizona Library: (1) plan--identify process boundaries and customer requirements, gather/analyze data, prioritize problems; (2) do--encourage divergent thinking, reach convergent thinking, find solutions; (3) check--pilot solutions, compare costs; and…

  18. Automated synthesis of image processing procedures using AI planning techniques

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.

  19. Systems cost/performance analysis (study 2.3). Volume 2: Systems cost/performance model. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A methodology developed for the balanced design of spacecraft subsystems, which interrelates cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.

  20. 10 Steps to Building an Architecture for Space Surveillance Projects

    NASA Astrophysics Data System (ADS)

    Gyorko, E.; Barnhart, E.; Gans, H.

    Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first-time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well defined, or well architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques, based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines. Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavioral; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.

  1. Investigator's Guide to Missing Child Cases. For Law-Enforcement Officers Locating Missing Children. Second Edition.

    ERIC Educational Resources Information Center

    Patterson, John C.

    This booklet provides guidance to law enforcement officers investigating missing children cases, whether through parental kidnappings, abductions by strangers, runaway or "throwaway" cases, and those in which the circumstances are unknown. The guide describes, step-by-step, the investigative process required for each of the four types of missing…

  2. Tequila production.

    PubMed

    Cedeño, M

    1995-01-01

    Tequila is obtained from the distillation of fermented juice of the agave plant, Agave tequilana, to which up to 49% (w/v) of an adjunct sugar, mainly from cane or corn, may be added. Agave plants require 8 to 12 years to mature, and during all this time cleaning, pest control, and loosening of the land are required to produce an initial raw material with the appropriate chemical composition for tequila production. The production process comprises four steps: cooking to hydrolyze inulin into fructose, milling to extract the sugars, fermentation with a strain of Saccharomyces cerevisiae to convert the sugars into ethanol and organoleptic compounds, and, finally, a two-step distillation process. Maturation, if needed, is carried out in white oak barrels to obtain rested or aged tequila in 2 or 12 months, respectively.

  3. An online replanning method using warm start optimization and aperture morphing for flattening-filter-free beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahunbay, Ergun E., E-mail: eahunbay@mcw.edu; Ates,

    Purpose: In a situation where a couch shift for patient positioning is not preferred or prohibited (e.g., MR-linac), segment aperture morphing (SAM) can address target dislocation and deformation. For IMRT/VMAT with flattening-filter-free (FFF) beams, however, the SAM method would lead to an adverse translational dose effect due to the beam unflattening. Here the authors propose a new two-step process to address both the translational effect of FFF beams and the target deformation. Methods: The replanning method consists of an offline and an online step. The offline step is to create a series of preshifted-plans (PSPs) obtained by a so-called "warm start" optimization (starting optimization from the original plan, rather than from scratch) at a series of isocenter shifts. The PSPs all have the same number of segments with very similar shapes, since the warm start optimization only adjusts the MLC positions instead of regenerating them. In the online step, a new plan is obtained by picking the closest PSP or linearly interpolating the MLC positions and the monitor units of the closest PSPs for the shift determined from the image of the day. This two-step process is completely automated and almost instantaneous (no optimization or dose calculation needed). The previously developed SAM algorithm is then applied for daily deformation. The authors tested the method on sample prostate and pancreas cases. Results: The two-step interpolation method can account for the adverse dose effects from FFF beams, while SAM corrects for the target deformation. The plan interpolation method is effective in diminishing the unflat-beam effect and may allow reducing the required number of PSPs. The whole process takes the same time as the previously reported SAM process (5-10 min). Conclusions: The new two-step method plus SAM can address both the translation effects of FFF beams and target deformation, and can be executed in full automation except for the delineation of the target contour required by the SAM process.
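    The online step reduces to nearest-neighbour selection or linear interpolation between the two PSPs bracketing the measured shift. A one-dimensional sketch with hypothetical MLC leaf positions and monitor units (real plans interpolate many leaf pairs per segment):

        import numpy as np

        def interpolate_plan(shift, psp_shifts, psp_leaves, psp_mu):
            """Blend the two preshifted plans bracketing today's measured shift."""
            i = np.searchsorted(psp_shifts, shift)
            i = np.clip(i, 1, len(psp_shifts) - 1)
            lo, hi = psp_shifts[i - 1], psp_shifts[i]
            w = (shift - lo) / (hi - lo)                      # interpolation weight
            leaves = (1 - w) * psp_leaves[i - 1] + w * psp_leaves[i]
            mu = (1 - w) * psp_mu[i - 1] + w * psp_mu[i]
            return leaves, mu

        psp_shifts = np.array([-10.0, 0.0, 10.0])             # mm, offline-precomputed PSPs
        psp_leaves = np.array([[12.0, 48.0], [14.0, 50.0], [16.0, 52.0]])  # leaf pair, mm
        psp_mu = np.array([96.0, 100.0, 104.0])               # monitor units per segment

        print(interpolate_plan(3.5, psp_shifts, psp_leaves, psp_mu))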

  4. Has1 regulates consecutive maturation and processing steps for assembly of 60S ribosomal subunits

    PubMed Central

    Dembowski, Jill A.; Kuo, Benjamin; Woolford, John L.

    2013-01-01

    Ribosome biogenesis requires ∼200 assembly factors in Saccharomyces cerevisiae. The pre-ribosomal RNA (rRNA) processing defects associated with depletion of most of these factors have been characterized. However, how assembly factors drive the construction of ribonucleoprotein neighborhoods and how structural rearrangements are coupled to pre-rRNA processing are not understood. Here, we reveal ATP-independent and ATP-dependent roles of the Has1 DEAD-box RNA helicase in consecutive pre-rRNA processing and maturation steps for construction of 60S ribosomal subunits. Has1 associates with pre-60S ribosomes in an ATP-independent manner. Has1 binding triggers exonucleolytic trimming of 27SA3 pre-rRNA to generate the 5′ end of 5.8S rRNA and drives incorporation of ribosomal protein L17 with domain I of 5.8S/25S rRNA. ATP-dependent activity of Has1 promotes stable association of additional domain I ribosomal proteins that surround the polypeptide exit tunnel, which are required for downstream processing of 27SB pre-rRNA. Furthermore, in the absence of Has1, aberrant 27S pre-rRNAs are targeted for irreversible turnover. Thus, our data support a model in which Has1 helps to establish domain I architecture to prevent pre-rRNA turnover and couples domain I folding with consecutive pre-rRNA processing steps. PMID:23788678

  5. State Requirements for Educational Facilities, 1997.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Office of Educational Facilities.

    This document updates Florida's deregulation of construction of educational facilities guidelines, while keeping as the primary focus the safety of the students in pre-K through community college facilities. Organized by the sequence of steps required in the facilities procurement process, it covers general definitions, property…

  6. Study of residue type defect formation mechanism and the effect of advanced defect reduction (ADR) rinse process

    NASA Astrophysics Data System (ADS)

    Arima, Hiroshi; Yoshida, Yuichi; Yoshihara, Kosuke; Shibata, Tsuyoshi; Kushida, Yuki; Nakagawa, Hiroki; Nishimura, Yukio; Yamaguchi, Yoshikazu

    2009-03-01

    The residue type defect is one of the yield detractors in the lithography process. It is known that the occurrence of the residue type defect depends on the resist development process and that the defect is reduced by optimized rinsing conditions. However, the defect formation is affected by resist materials and substrate conditions. Therefore, it is necessary to optimize the development process condition for each mask level. Those optimization steps require a large amount of time and effort. The formation mechanism is investigated from the viewpoint of both material and process. The defect formation is affected by resist material types, substrate condition and development process condition (D.I.W. rinse step). Optimized resist formulation and new rinse technology significantly reduce the residue type defect.

  7. Refining of metallurgical-grade silicon

    NASA Technical Reports Server (NTRS)

    Dietl, J.

    1986-01-01

    A basic requirement of large scale solar cell fabrication is to provide low cost base material. Unconventional refining of metallurgical grade silicon represents one of the most promising ways of silicon meltstock processing. The refining concept is based on an optimized combination of metallurgical treatments. Commercially available crude silicon, in this sequence, requires a first pyrometallurgical step by slagging, or, alternatively, solvent extraction by aluminum. After grinding and leaching, high purity quality is gained as an advanced stage of refinement. To reach solar grade quality a final pyrometallurgical step is needed: liquid-gas extraction.

  8. On-site manufacture of propellant oxygen from lunar resources

    NASA Technical Reports Server (NTRS)

    Rosenberg, Sanders D.

    1992-01-01

    The Aerojet Carbothermal Process for the manufacture of oxygen from lunar resources has three essential steps: the reduction of silicate with methane to form carbon monoxide and hydrogen; the reduction of carbon monoxide with hydrogen to form methane and water; and the electrolysis of water to form oxygen and hydrogen. This cyclic process does not depend upon the presence of water or water precursors in the lunar materials; it will produce oxygen from silicates regardless of their precise composition and fine structure. Research on the first step of the process was initiated by determining some of the operating conditions required to reduce igneous rock with carbon and silicon carbide. The initial phase of research on the second step is completed; quantitative conversion of carbon monoxide and hydrogen to methane and water was achieved with a nickel-on-kieselguhr catalyst. The equipment used in and the results obtained from these process studies are reported in detail.
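    Written with an idealized generic silicate (stoichiometry hedged; the record notes that real lunar silicates vary in composition), the three steps form a closed cycle in which methane and hydrogen are regenerated and the net effect is splitting silica into silicon and oxygen:

        \mathrm{SiO_2 + 2\,CH_4 \longrightarrow Si + 2\,CO + 4\,H_2} \quad \text{(silicate reduction)}
        \mathrm{2\,CO + 6\,H_2 \longrightarrow 2\,CH_4 + 2\,H_2O} \quad \text{(methanation)}
        \mathrm{2\,H_2O \longrightarrow 2\,H_2 + O_2} \quad \text{(electrolysis)}
        \text{net: } \mathrm{SiO_2 \longrightarrow Si + O_2}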

  9. Health care priority setting: principles, practice and challenges

    PubMed Central

    Mitton, Craig; Donaldson, Cam

    2004-01-01

    Background: Health organizations the world over are required to set priorities and allocate resources within the constraint of limited funding. However, decision makers may not be well equipped to make explicit rationing decisions and as such often rely on historical or political resource allocation processes. One economic approach to priority setting which has gained momentum in practice over the last three decades is program budgeting and marginal analysis (PBMA). Methods: This paper presents a detailed step by step guide for carrying out a priority setting process based on the PBMA framework. This guide is based on the authors' experience in using this approach primarily in the UK and Canada, but as well draws on a growing literature of PBMA studies in various countries. Results: At the core of the PBMA approach is an advisory panel charged with making recommendations for resource re-allocation. The process can be supported by a range of 'hard' and 'soft' evidence, and requires that decision making criteria are defined and weighted in an explicit manner. Evaluating the process of PBMA using an ethical framework, and noting important challenges to such activity including that of organizational behavior, are shown to be important aspects of developing a comprehensive approach to priority setting in health care. Conclusion: Although not without challenges, international experience with PBMA over the last three decades would indicate that this approach has the potential to make substantial improvement on commonly relied upon historical and political decision making processes. In setting out a step by step guide for PBMA, as is done in this paper, implementation by decision makers should be facilitated. PMID:15104792
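    The heart of the marginal-analysis step is an explicit weighted scoring of candidate (dis)investments against panel-agreed criteria. A stripped-down sketch with invented criteria, weights, and proposals:

        # Hypothetical decision criteria and panel-agreed weights (sum to 1).
        weights = {"health_gain": 0.5, "access": 0.3, "sustainability": 0.2}

        # Candidate reallocations scored 1-5 by the advisory panel on each criterion.
        proposals = {
            "expand home care":    {"health_gain": 4, "access": 5, "sustainability": 3},
            "new imaging suite":   {"health_gain": 3, "access": 2, "sustainability": 4},
            "reduce clinic hours": {"health_gain": 2, "access": 1, "sustainability": 5},
        }

        def weighted_score(scores):
            return sum(weights[c] * s for c, s in scores.items())

        # Rank proposals for the panel's re-allocation recommendation.
        for name, scores in sorted(proposals.items(), key=lambda kv: -weighted_score(kv[1])):
            print(f"{weighted_score(scores):.2f}  {name}")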

  10. Proposed Conceptual Requirements for the CTBT Knowledge Base,

    DTIC Science & Technology

    1995-08-14

    knowledge available to automated processing routines and human analysts are significant, and solving these problems is an essential step in ensuring...knowledge storage in a CTBT system. In addition to providing regional knowledge to automated processing routines, the knowledge base will also address

  11. Designing Flightdeck Procedures

    NASA Technical Reports Server (NTRS)

    Barshi, Immanuel; Mauro, Robert; Degani, Asaf; Loukopoulou, Loukia

    2016-01-01

    The primary goal of this document is to provide guidance on how to design, implement, and evaluate flight deck procedures. It provides a process for developing procedures that meet clear and specific requirements. This document provides a brief overview of: 1) the requirements for procedures, 2) a process for the design of procedures, and 3) a process for the design of checklists. The brief overview is followed by amplified procedures that follow the above steps and provide details for the proper design, implementation and evaluation of good flight deck procedures and checklists.

  12. Business Management Practices and Requirements for Colorado School Districts: An Overview of Selected Colorado Business Management Practices and Requirements.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    Guidelines to help school district supervisors and business management personnel implement state-required financial policies and procedures are presented in this report. Steps to comply with Colorado regulations for budgeting, accounting, reporting, and auditing processes are discussed. Figures illustrate the budgeting cycle and schedule. (LMI)

  13. Toward the Decision Tree for Inferring Requirements Maturation Types

    NASA Astrophysics Data System (ADS)

    Nakatani, Takako; Kondo, Narihito; Shirogane, Junko; Kaiya, Haruhiko; Hori, Shozo; Katamine, Keiichi

    Requirements are elicited step by step during the requirements engineering (RE) process. However, some types of requirements are elicited completely only after the scheduled requirements elicitation process is finished. Such a situation is regarded as problematic. In our study, the difficulties of eliciting various kinds of requirements are observed by component. We refer to the components as observation targets (OTs) and introduce the term "requirements maturation," which denotes when and how requirements are elicited completely in the project. Requirements maturation is discussed for physical and logical OTs. OTs viewed from a logical viewpoint are called logical OTs, e.g. quality requirements. The requirements of physical OTs, e.g., modules, components, subsystems, etc., include functional and non-functional requirements. They are influenced by their requesters' environmental changes, as well as developers' technical changes. In order to infer the requirements maturation period of each OT, we need to know how much these factors influence the OTs' requirements maturation. According to observations of actual past projects, we defined the PRINCE (Pre Requirements Intelligence Net Consideration and Evaluation) model. It aims to guide developers in their observation of the requirements maturation of OTs. We quantitatively analyzed actual cases with their requirements elicitation processes and extracted the essential factors that influence requirements maturation. The results of interviews with project managers were analyzed by WEKA, a data mining system, from which the decision tree was derived. This paper introduces the PRINCE model and the category of logical OTs to be observed. The decision tree that helps developers infer the maturation type of an OT is also described. We evaluate the tree through real projects and discuss its ability to infer requirements maturation types.
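    The record uses WEKA; the same kind of inference can be sketched with scikit-learn, using invented project factors and maturation-type labels purely for illustration:

        from sklearn.tree import DecisionTreeClassifier, export_text

        # Hypothetical per-OT factors from project interviews:
        # [requester environment volatility (0-2), developer technology change (0-2), logical OT (0/1)]
        X = [[0, 0, 0], [2, 0, 0], [0, 2, 1], [2, 2, 1], [1, 0, 0], [1, 2, 1]]
        y = ["early", "late", "late", "very late", "early", "very late"]  # maturation types

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(export_text(tree, feature_names=["env_change", "tech_change", "logical_ot"]))
        print(tree.predict([[2, 1, 0]]))   # infer the maturation type of a new OT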

  14. Double-Vacuum-Bag Process for Making Resin-Matrix Composites

    NASA Technical Reports Server (NTRS)

    Bradford, Larry J.

    2007-01-01

    A double-vacuum-bag process has been devised as a superior alternative to a single-vacuum-bag process used heretofore in making laminated fiber-reinforced resin-matrix composite-material structural components. This process is applicable to broad classes of high-performance matrix resins including polyimides and phenolics that emit volatile compounds (solvents and volatile by-products of resin-curing chemical reactions) during processing. The superiority of the double-vacuum-bag process lies in enhanced management of the volatile compounds. Proper management of volatiles is necessary for making composite-material components of high quality: if not removed and otherwise properly managed, volatiles can accumulate in interior pockets as resins cure, thereby forming undesired voids in the finished products. The curing cycle for manufacturing a composite laminate containing a reactive resin matrix usually consists of a two-step ramp-and-hold temperature profile and an associated single-step pressure profile as shown in Figure 1. The lower-temperature ramp-and-hold step is known in the art as the B stage. During the B stage, prepregs are heated and volatiles are generated. Because pressure is not applied at this stage, volatiles are free to escape. Pressure is applied during the higher-temperature ramp-and-hold step to consolidate the laminate and impart desired physical properties to the resin matrix. The residual volatile content and fluidity of the resin at the beginning of application of consolidation pressure are determined by the temperature and time parameters of the B stage. Once the consolidation pressure is applied, residual volatiles are locked in. In order to produce a void-free, high-quality laminate, it is necessary to design the curing cycle to obtain the required residual fluidity and the required temperature at the time of application of the consolidation pressure.

  15. Advanced Information Processing. Volume II. Instructor's Materials. Curriculum Improvement Project. Region II.

    ERIC Educational Resources Information Center

    Stanford, Linda

    This course curriculum is intended for use by community college instructors and administrators in implementing an advanced information processing course. It builds on the skills developed in the previous information processing course but goes one step further by requiring students to perform in a simulated office environment and improve their…

  16. Discovery of Cellular Proteins Required for the Early Steps of HCV Infection Using Integrative Genomics

    PubMed Central

    Yang, Jae-Seong; Kwon, Oh Sung; Kim, Sanguk; Jang, Sung Key

    2013-01-01

    Successful viral infection requires intimate communication between virus and host cell, a process that absolutely requires various host proteins. However, current efforts to discover novel host proteins as therapeutic targets for viral infection are difficult. Here, we developed an integrative-genomics approach to predict human genes involved in the early steps of hepatitis C virus (HCV) infection. By integrating HCV and human protein associations, co-expression data, and tight junction-tetraspanin web specific networks, we identified host proteins required for the early steps in HCV infection. Moreover, we validated the roles of newly identified proteins in HCV infection by knocking down their expression using small interfering RNAs. Specifically, a novel host factor CD63 was shown to directly interact with HCV E2 protein. We further demonstrated that an antibody against CD63 blocked HCV infection, indicating that CD63 may serve as a new therapeutic target for HCV-related diseases. The candidate gene list provides a source for identification of new therapeutic targets. PMID:23593195

  17. Improvement of CFD Methods for Modeling Full Scale Circulating Fluidized Bed Combustion Systems

    NASA Astrophysics Data System (ADS)

    Shah, Srujal; Klajny, Marcin; Myöhänen, Kari; Hyppänen, Timo

With the currently available methods of computational fluid dynamics (CFD), the task of simulating full scale circulating fluidized bed combustors is very challenging. In order to simulate the complex fluidization process, the size of the calculation cells should be small and the calculation should be transient with a small time step size. For full scale systems, these requirements lead to very large meshes and very long calculation times, making such simulations difficult in practice. This study investigates the cell size and time step size required for accurate simulations, and the filtering effects caused by coarser meshes and longer time steps. A modeling study of a full scale CFB furnace is presented and the model results are compared with experimental data.
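    As a rough illustration of why these requirements are so punishing, the sketch below bounds the transient time step with a CFL-type condition, dt <= CFL * dx / u. The function name, mesh size, and gas velocity are hypothetical placeholders, not values from the study.

```python
def max_time_step(cell_size_m, velocity_m_s, cfl_target=1.0):
    """Upper bound on the transient time step from the CFL condition,
    dt <= CFL * dx / u. All values below are illustrative only."""
    return cfl_target * cell_size_m / velocity_m_s

# A mesh fine enough to resolve cluster-scale structures in a riser
# (e.g. 5 mm cells at 5 m/s gas velocity) forces sub-millisecond steps:
dt = max_time_step(5e-3, 5.0, cfl_target=0.5)
print(f"dt <= {dt*1e3:.2f} ms")            # ~0.5 ms

# Covering 60 s of furnace operation then needs on the order of 1e5
# transient steps, which is why full-scale CFB simulation is so costly.
print(f"steps for 60 s: {60.0/dt:,.0f}")
```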

  18. Chain of evidence generation for contrast enhancement in digital image forensics

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Messina, Giuseppe; Strano, Daniela

    2010-01-01

The quality of the images obtained by digital cameras has improved greatly since the early days of digital photography. Unfortunately, it is not unusual in image forensics to find wrongly exposed pictures. This is mainly due to obsolete techniques or old technologies, but also due to backlight conditions. To recover details that would otherwise remain invisible, stretching of the image contrast is required. Forensic rules for producing evidence require complete documentation of the processing steps, enabling the replication of the entire process. The automation of enhancement techniques is thus quite difficult and needs to be carefully documented. This work presents an automatic procedure to find contrast enhancement settings, allowing both image correction and automatic script generation. The technique is based on a preprocessing step which extracts the features of the image and selects correction parameters. The parameters are then saved through JavaScript code that is used in the second step of the approach to correct the image. The generated script is Adobe Photoshop compliant (Photoshop is widely used in image forensics analysis), thus permitting the replication of the enhancement steps. Experiments on a dataset of images are also reported, showing the effectiveness of the proposed methodology.
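    A minimal sketch of the two-step pattern follows: analyze the image to pick stretch parameters, then emit a small script that replays exactly that correction. The paper's actual feature extraction is not specified here; percentile clipping stands in for it, and the function names and output filename are hypothetical.

```python
import numpy as np

def select_stretch_params(img, low_pct=1.0, high_pct=99.0):
    """Pick input black/white points from the image histogram.
    Percentile clipping is one simple, replicable choice; the paper's
    actual feature-based parameter selection may differ."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return float(lo), float(hi)

def emit_replay_script(lo, hi, path="enhance.jsx"):
    """Write the chosen parameters into a script so the enhancement can
    be replayed and documented (plain JavaScript-like text here; the
    authors target Adobe Photoshop scripting)."""
    with open(path, "w") as f:
        f.write(f"// auto-generated contrast stretch\n"
                f"var lo = {lo};\nvar hi = {hi};\n"
                f"// map [lo, hi] linearly onto [0, 255]\n")

img = np.random.randint(40, 180, (480, 640)).astype(np.float64)
lo, hi = select_stretch_params(img)
stretched = np.clip((img - lo) / (hi - lo) * 255.0, 0, 255)
emit_replay_script(lo, hi)
```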

  19. Investigation of field induced trapping on floating gates

    NASA Technical Reports Server (NTRS)

    Gosney, W. M.

    1975-01-01

The development of a technology for building electrically alterable read only memories (EAROMs) or reprogrammable read only memories (RPROMs) using a single level metal gate p channel MOS process with all conventional processing steps is outlined. Nonvolatile storage of data is achieved by the use of charged floating gate electrodes. The floating gates are charged by avalanche injection of hot electrons through the gate oxide, and discharged by avalanche injection of hot holes through the gate oxide. Three extra diffusion and patterning steps are all that is required to convert a standard p channel MOS process into a nonvolatile memory process. For identification, this nonvolatile memory technology was given the descriptive acronym DIFMOS, which stands for Dual Injector, Floating gate MOS.

  20. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    PubMed

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected because laboratory-scale methods require large amounts of material and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing regarding impurity depletion and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  1. MIRADS-2 Implementation Manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base and includes capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. The MIRADS library must be established as a cataloged mass storage file as the first step in MIRADS implementation; the procedure for establishing the library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System. All procedures relate to the use of MIRADS on the U-1108 computer.

  2. Evaluation of Reaction Cross Section Data Used for Thin Layer Activation Technique

    NASA Astrophysics Data System (ADS)

    Ditrói, F.; Takács, S.; Tárkányi, F.

    2005-05-01

Thin layer activation (TLA) is a widely used nuclear method to investigate and control the loss of material during wear, corrosion and erosion processes. The process requires knowledge of depth profiles of the investigated radioisotopes produced by charged particle bombardment. The depth distribution of the activity can be determined with direct, very time-consuming step-by-step measurement or by calculation from reliable cross section, stopping power and sample composition data. These data were checked experimentally at several points by performing only a couple of measurements.
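    The calculated route follows directly from the relation A(x) ∝ σ(E(x)), with the particle energy E(x) obtained by integrating the stopping power along the path. The sketch below illustrates this with made-up placeholder curves for the cross section and stopping power; real work would use evaluated cross section data and tabulated stopping powers for the actual sample composition.

```python
# Hedged sketch: computing a TLA activity-depth profile from cross
# section and stopping-power data instead of measuring it step by step.
# sigma(E) and S(E) below are placeholder curves, not evaluated data.

def stopping_power(E_MeV):
    """S(E) = -dE/dx in MeV per micron (illustrative shape only)."""
    return 0.01 + 0.05 / max(E_MeV, 0.1)

def cross_section(E_MeV):
    """sigma(E) in arbitrary units (illustrative shape only)."""
    return max(0.0, 1.0 - ((E_MeV - 8.0) / 6.0) ** 2)

E, dx = 12.0, 1.0                       # beam energy (MeV), step (micron)
depth, activity = [], []
x = 0.0
while E > 0.5:                          # slow the particle down layer by layer
    depth.append(x)
    activity.append(cross_section(E))   # A(x) proportional to sigma(E(x))
    E -= stopping_power(E) * dx
    x += dx
```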

  3. Evaluation of Reaction Cross Section Data Used for Thin Layer Activation Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ditroi, F.; Takacs, S.; Tarkanyi, F.

    2005-05-24

Thin layer activation (TLA) is a widely used nuclear method to investigate and control the loss of material during wear, corrosion and erosion processes. The process requires knowledge of depth profiles of the investigated radioisotopes produced by charged particle bombardment. The depth distribution of the activity can be determined with direct, very time-consuming step-by-step measurement or by calculation from reliable cross section, stopping power and sample composition data. These data were checked experimentally at several points by performing only a couple of measurements.

  4. Real-time traffic sign detection and recognition

    NASA Astrophysics Data System (ADS)

    Herbschleb, Ernst; de With, Peter H. N.

    2009-01-01

The continuous growth of imaging databases increasingly requires analysis tools for extraction of features. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images with a resolution up to 4,800×2,400 pixels. Because of the size of the database, high reliability as well as high throughput is required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining both color and specific spatial information. The first stage contains an area-limitation step which is performance critical for both the detection rate and the overall processing time. The second stage locates suggestions for traffic signs using recently published feature processing. The third stage contains a validation step to enhance the reliability of the algorithm. During this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput for line-of-sight images of 800×600 pixels is 35 Hz, and for panorama images it is 4 Hz. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.

  5. Using the critical incident technique in community-based participatory research: a case study.

    PubMed

    Belkora, Jeffrey; Stupar, Lauren; O'Donnell, Sara

    2011-01-01

    Successful community-based participatory research involves the community partner in every step of the research process. The primary study for this paper took place in rural, Northern California. Collaborative partners included an academic researcher and two community based resource centers that provide supportive services to people diagnosed with cancer. This paper describes our use of the Critical Incident Technique (CIT) to conduct Community-based Participatory Research. We ask: Did the CIT facilitate or impede the active engagement of the community in all steps of the study process? We identified factors about the Critical Incident Technique that were either barriers or facilitators to involving the community partner in every step of the research process. Facilitators included the CIT's ability to accommodate involvement from a large spectrum of the community, its flexible design, and its personal approach. Barriers to community engagement included training required to conduct interviews, depth of interview probes, and time required. Overall, our academic-community partners felt that our use of the CIT facilitated community involvement in our Community-Based Participatory Research Project, where we used it to formally document the forces promoting and inhibiting successful achievement of community aims.

  6. K-space data processing for magnetic resonance elastography (MRE).

    PubMed

    Corbin, Nadège; Breton, Elodie; de Mathelin, Michel; Vappou, Jonathan

    2017-04-01

    Magnetic resonance elastography (MRE) requires substantial data processing based on phase image reconstruction, wave enhancement, and inverse problem solving. The objective of this study is to propose a new, fast MRE method based on MR raw data processing, particularly adapted to applications requiring fast MRE measurement or high elastogram update rate. The proposed method allows measuring tissue elasticity directly from raw data without prior phase image reconstruction and without phase unwrapping. Experimental feasibility is assessed both in a gelatin phantom and in the liver of a porcine model in vivo. Elastograms are reconstructed with the raw MRE method and compared to those obtained using conventional MRE. In a third experiment, changes in elasticity are monitored in real-time in a gelatin phantom during its solidification by using both conventional MRE and raw MRE. The raw MRE method shows promising results by providing similar elasticity values to the ones obtained with conventional MRE methods while decreasing the number of processing steps and circumventing the delicate step of phase unwrapping. Limitations of the proposed method are the influence of the magnitude on the elastogram and the requirement for a minimum number of phase offsets. This study demonstrates the feasibility of directly reconstructing elastograms from raw data.
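    One way to see how phase unwrapping can be sidestepped is to work directly with complex data: the angle of one acquisition multiplied by the conjugate of another yields the motion-induced phase difference, and it stays within (-π, π] as long as the wave amplitude is moderate. The sketch below illustrates only that general idea on synthetic complex images; the paper's actual raw k-space reconstruction differs in detail, and all array shapes and amplitudes here are invented.

```python
import numpy as np

# Minimal sketch: extracting motion-induced phase without explicit
# unwrapping, via the angle of a complex product between phase offsets.

ny, nx, n_offsets = 64, 64, 4
rng = np.random.default_rng(0)
bg = rng.uniform(-np.pi, np.pi, (ny, nx))     # static background phase
xx = np.linspace(0, 4 * np.pi, nx)
imgs = np.empty((n_offsets, ny, nx), dtype=complex)
for k in range(n_offsets):
    wave = 0.5 * np.sin(xx + 2 * np.pi * k / n_offsets)  # travelling wave (rad)
    imgs[k] = np.exp(1j * (bg + wave))

# angle(S_k * conj(S_0)) cancels the static phase and needs no
# unwrapping while the wave-induced difference stays below pi.
dphase = np.angle(imgs * np.conj(imgs[0]))

# First temporal harmonic across offsets -> complex wave field that an
# inversion algorithm would turn into an elastogram.
harmonic = np.fft.fft(dphase, axis=0)[1]
```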

  7. Dynamic load balancing for petascale quantum Monte Carlo applications: The Alias method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudheer, C. D.; Krishnan, S.; Srinivasan, A.

Diffusion Monte Carlo is the most accurate widely used Quantum Monte Carlo method for the electronic structure of materials, but it requires frequent load balancing or population redistribution steps to maintain efficiency and avoid accumulation of systematic errors on parallel machines. The load balancing step can be a significant factor affecting performance, and will become more important as the number of processing elements increases. We propose a new dynamic load balancing algorithm, the Alias Method, and evaluate it theoretically and empirically. An important feature of the new algorithm is that the load can be perfectly balanced with each process receiving at most one message. It is also optimal in the maximum size of messages received by any process. We also optimize its implementation to reduce network contention, a process facilitated by the low messaging requirement of the algorithm. Empirical results on the petaflop Cray XT Jaguar supercomputer at ORNL show up to 30% improvement in performance on 120,000 cores. The load balancing algorithm may be straightforwardly implemented in existing codes. The algorithm may also be employed by any method with many near identical computational tasks that requires load balancing.
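    For reference, here is a minimal sketch of the classical alias method (Vose's construction) for O(1) sampling from a discrete distribution after O(n) setup. The paper adapts this idea to walker redistribution so that each process receives at most one message; the parallel bookkeeping is not shown here, and the weights are arbitrary.

```python
import random

def build_alias(weights):
    """Vose's alias-table construction: each column holds its own
    probability plus at most one 'alias' donor."""
    n = len(weights)
    total = sum(weights)
    prob = [w * n / total for w in weights]   # scaled so the mean is 1
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                          # donor column l tops up s
        prob[l] -= 1.0 - prob[s]
        (small if prob[l] < 1.0 else large).append(l)
    return prob, alias

def draw(prob, alias):
    """O(1) sample: pick a column, then keep it or take its alias."""
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]

prob, alias = build_alias([5, 1, 2, 8])
counts = [0] * 4
for _ in range(100_000):
    counts[draw(prob, alias)] += 1
print(counts)  # roughly proportional to 5:1:2:8
```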

  8. Medical device development.

    PubMed

    Panescu, Dorin

    2009-01-01

    The development of a successful medical product requires not only engineering design efforts, but also clinical, regulatory, marketing and business expertise. This paper reviews items related to the process of designing medical devices. It discusses the steps required to take a medical product idea from concept, through development, verification and validation, regulatory approvals and market release.

  9. 33 CFR 230.17 - Filing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... supplement, district commanders will establish a time schedule for each step of the process based upon considerations listed in 40 CFR 1501.8 and upon other management considerations. The time required from the... reviews by division and the incorporation of division's comments in the EIS. HQUSACE and/or division will...

  10. Designing the Lunar Regolith Excavation Competition

    NASA Technical Reports Server (NTRS)

    Le, Christopher

    2009-01-01

The project assigned this summer involves designing a lunar regolith mining robotics competition. This process involves consulting several assets available at the Kennedy Space Center and consists of several steps. The first step is to determine the requirements for the competition. Once these requirements are determined, the dimensions of the playing field are drawn up, first by hand and then using computer models. After these drawings are tentatively decided upon, the cost of materials must be determined so as to fit within the allotted budget for the project. The materials are then to be ordered, assembled, broken down, and stored throughout the duration of the competition. We must also design the advertisements and logos for the competition, to market and publicize it to college-level teams. We must also determine the rules for the competition so as to have uniform requirements for all teams. Once these processes are completed, the competition can be finalized and announced to the public. The contributing parties are Greg Galloway, Robert Mueller, Susan Sawyer, Gloria Murphy, Julia Nething, and Cassandra Liles.

  11. One-step fabrication of multifunctional micromotors

    NASA Astrophysics Data System (ADS)

    Gao, Wenlong; Liu, Mei; Liu, Limei; Zhang, Hui; Dong, Bin; Li, Christopher Y.

    2015-08-01

Although artificial micromotors have undergone tremendous progress in recent years, their fabrication normally requires complex steps or expensive equipment. In this paper, we report a facile one-step method based on an emulsion solvent evaporation process to fabricate multifunctional micromotors. By simultaneously incorporating various components into an oil-in-water droplet, upon emulsification and solidification, a sphere-shaped, asymmetric, and multifunctional micromotor is formed. Some of the attractive functions of this model micromotor include autonomous movement in high ionic strength solution, remote control, enzymatic disassembly and sustained release. This one-step, versatile fabrication method can be easily scaled up and therefore may have great potential in mass production of multifunctional micromotors for a wide range of practical applications. Electronic supplementary information (ESI) available: Videos S1-S4 and Fig. S1-S3. See DOI: 10.1039/c5nr03574k

  12. Space Station tethered waste disposal

    NASA Technical Reports Server (NTRS)

    Rupp, Charles C.

    1988-01-01

The Shuttle Transportation System (STS) launches more payload to the Space Station than can be returned, creating an accumulation of waste. Several methods of deorbiting the waste are compared, including an OMV, solid rocket motors, and a tether system. The use of tethers is shown to offer the unique potential of a net savings in STS launch requirements. Tether technology is being developed which can satisfy the deorbit requirements, but additional effort is required in waste processing, packaging, and container design. The first step in developing this capability is already underway in the Small Expendable Deployer System program. A developmental flight test of a tether-initiated recovery system is seen as the second step in the evolution of this capability.

  13. Advanced Information Processing. Volume I. Student's Materials. Curriculum Improvement Project. Region II.

    ERIC Educational Resources Information Center

    Stanford, Linda

    This course curriculum is intended for use in an advanced information processing course. It builds on the skills developed in the previous information processing course but goes one step further by requiring students to perform in a simulated office environment and improve their decision-making skills. This volume contains two parts of the…

  14. [Implementation of a rational standard of hygiene for preparation of operating rooms].

    PubMed

    Bauer, M; Scheithauer, S; Moerer, O; Pütz, H; Sliwa, B; Schmidt, C E; Russo, S G; Waeschle, R M

    2015-10-01

The assurance of high standards of care is a major requirement in German hospitals, while cost reduction and efficient use of resources are mandatory. These requirements are particularly evident in the high-risk and cost-intensive operating theatre field with its multiple process steps. The cleaning of operating rooms (OR) between surgical procedures is of major relevance for patient safety and requires time and human resources. The hygiene procedure plan for OR cleaning between operations at the university hospital in Göttingen was revised and optimized according to the plan-do-check-act principle, due to unclearly defined specifications of responsibilities and use of resources, prolonged process times, and increased staff engagement. The current status was evaluated in 2012 as part of the first step, "plan". The subsequent step, "do", included an expert symposium with external consultants, interdisciplinary consensus conferences with an update of the former hygiene procedure plan, and the implementation process. All staff members involved were integrated into this change management process. The penetration rate of the training and information measures, as well as the acceptance of and compliance with the new hygiene procedure plan, were reviewed within the step "check". The rates of positive swabs and air samples as well as of postoperative wound infections were analyzed for quality control, and no evidence of reduced effectiveness of the new hygiene plan was found. After the successful implementation of these measures, the next improvement cycle ("act") was performed in 2014, which led to a simplification of the hygiene plan by reducing the number of defined cleaning and disinfection programs for preparation of the OR. The reorganization measures described led to comprehensive adherence to the hygiene procedure plan through distinct specifications of responsibilities, the course of action, and the use of resources. Furthermore, a simplification of the plan, a rational staff assignment, and reduced process times were accomplished. Finally, potential conflicts due to insufficient evidence-based knowledge among personnel were reduced. This project description can be used by other hospitals as a guideline for similar changes in management processes.

  15. An implementation of the look-ahead Lanczos algorithm for non-Hermitian matrices, part 1

    NASA Technical Reports Server (NTRS)

    Freund, Roland W.; Gutknecht, Martin H.; Nachtigal, Noel M.

    1990-01-01

    The nonsymmetric Lanczos method can be used to compute eigenvalues of large sparse non-Hermitian matrices or to solve large sparse non-Hermitian linear systems. However, the original Lanczos algorithm is susceptible to possible breakdowns and potential instabilities. We present an implementation of a look-ahead version of the Lanczos algorithm which overcomes these problems by skipping over those steps in which a breakdown or near-breakdown would occur in the standard process. The proposed algorithm can handle look-ahead steps of any length and is not restricted to steps of length 2, as earlier implementations are. Also, our implementation has the feature that it requires roughly the same number of inner products as the standard Lanczos process without look-ahead.
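    To make the breakdown concrete, here is a minimal sketch of the standard two-sided Lanczos biorthogonalization in its textbook form: the process fails when the inner product of the two residual vectors (nearly) vanishes, which is exactly the step the look-ahead variant skips over. This toy simply aborts at a breakdown rather than implementing look-ahead, and its normalization choices are one convention among several.

```python
import numpy as np

def lanczos_nonsym(A, v1, w1, m, tol=1e-12):
    """Standard two-sided Lanczos (no look-ahead). Raises at the
    (near-)breakdown that look-ahead implementations step over."""
    v1 = v1 / np.linalg.norm(v1)
    if abs(w1 @ v1) < tol:
        raise RuntimeError("bad starting vectors: w^T v ~ 0")
    w1 = w1 / (w1 @ v1)                   # enforce biorthogonality (v1, w1) = 1
    v_prev = w_prev = np.zeros_like(v1)
    beta = delta = 0.0
    v, w, alphas = v1, w1, []
    for j in range(m):
        alpha = w @ (A @ v)
        v_hat = A @ v - alpha * v - beta * v_prev
        w_hat = A.T @ w - alpha * w - delta * w_prev
        inner = v_hat @ w_hat
        if abs(inner) < tol:              # the (near-)breakdown case
            raise RuntimeError(f"breakdown at step {j}: (v_hat, w_hat) ~ 0")
        delta = np.sqrt(abs(inner))
        beta = inner / delta
        v_prev, w_prev = v, w
        v, w = v_hat / delta, w_hat / beta
        alphas.append(alpha)
    return np.array(alphas)
```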

  16. Orbital construction support equipment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Approximately 200 separate construction steps were defined for the three solar power satellite (SPS) concepts. Detailed construction scenarios were developed which describe the specific tasks to be accomplished, and identify general equipment requirements. The scenarios were used to perform a functional analysis, which resulted in the definition of 100 distinct SPS elements. These elements are the components, parts, subsystems, or assemblies upon which construction activities take place. The major SPS elements for each configuration are shown. For those elements, 300 functional requirements were identified in seven generic processes. Cumulatively, these processes encompass all functions required during SPS construction/assembly. Individually each process is defined such that it includes a specific type of activity. Each SPS element may involve activities relating to any or all of the generic processes. The processes are listed, and examples of the requirements defined for a typical element are given.

  17. Labor education programs in health and safety.

    PubMed

    Wallerstein, N; Baker, R

    1994-01-01

    Labor health and safety programs encourage workers to take an active part in making the workplace safe. The authors describe the growing need for preparing workers to participate in prevention efforts, the role of training in addressing this need, educational principles and traditions that contribute to empowerment education, and a step-by-step process that is required to achieve the goals of worker involvement and empowerment.

  18. Evaluating and selecting an information system, Part 1.

    PubMed

    Neal, T

    1993-01-01

    Initial steps in the process of evaluating and selecting a computerized information system for the pharmacy department are described. The first step in the selection process is to establish a steering committee and a project committee. The steering committee oversees the project, providing policy guidance, making major decisions, and allocating budgeted expenditures. The project committee conducts the departmental needs assessment, identifies system requirements, performs day-to-day functions, evaluates vendor proposals, trains personnel, and implements the system chosen. The second step is the assessment of needs in terms of personnel, workload, physical layout, and operating requirements. The needs assessment should be based on the department's mission statement and strategic plan. The third step is the development of a request for information (RFI) and a request for proposal (RFP). The RFI is a document designed for gathering preliminary information from a wide range of vendors; this general information is used in deciding whether to send the RFP to a given vendor. The RFP requests more detailed information and gives the purchaser's exact specifications for a system; the RFP also includes contractual information. To help ensure project success, many institutions turn to computer consultants for guidance. The initial steps in selecting a computerized pharmacy information system are establishing computerization committees, conducting a needs assessment, and writing an RFI and an RFP. A crucial early decision is whether to seek a consultant's expertise.

  19. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
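    The phase-wrapper pattern described above is easy to sketch: each pipeline is driven through preparation, submission, checking, and quality control, with an operational database recording what ran. The class names, phase names, and pipeline names below are illustrative stand-ins, not the SDSS code.

```python
import sqlite3

# Hedged sketch of wrapping pipelines in consecutive phases and
# chaining them, with a small operational database recording progress.

PHASES = ("prepare", "submit", "check", "quality_control")

class PipelineRun:
    def __init__(self, name, db):
        self.name, self.db = name, db

    def run_phase(self, phase):
        print(f"[{self.name}] {phase}")           # the real work goes here
        self.db.execute(
            "INSERT INTO runlog(pipeline, phase) VALUES (?, ?)",
            (self.name, phase))

def run_factory(pipelines):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE runlog(pipeline TEXT, phase TEXT)")
    for name in pipelines:                        # chain pipelines together
        run = PipelineRun(name, db)
        for phase in PHASES:                      # every pipeline, every phase
            run.run_phase(phase)
    db.commit()
    return db

db = run_factory(["astrometry", "photometry", "spectro_2d"])  # made-up names
```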

  20. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
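    The core of the evaluation step is Pareto dominance: a design is kept only if no other design is at least as good in every objective and strictly better in one. The sketch below shows that screening for a toy two-objective case (profit to maximize, one aggregate impact to minimize); the study itself uses eight impact categories, and all numbers here are invented.

```python
# Hedged sketch of Pareto-dominance screening for (profit, impact)
# pairs, with profit maximized and impact minimized.

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly
    better in at least one."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

def pareto_front(designs):
    return [d for d in designs
            if not any(dominates(e, d) for e in designs if e is not d)]

designs = [(12.0, 3.1), (10.5, 2.0), (12.0, 2.9), (9.0, 4.0), (11.0, 1.8)]
print(pareto_front(designs))   # only (12.0, 2.9) and (11.0, 1.8) survive
```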

  1. An automated workflow for parallel processing of large multiview SPIM recordings

    PubMed Central

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-01-01

Summary: Selective Plane Illumination Microscopy (SPIM) allows imaging of developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585

  2. An automated workflow for parallel processing of large multiview SPIM recordings.

    PubMed

    Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2016-04-01

Selective Plane Illumination Microscopy (SPIM) allows imaging of developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
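    The "trivially parallel over time points" structure can be illustrated in a few lines: each time point runs through the same fixed chain of steps independently, so a process pool (or an HPC array job, as in the actual snakemake workflow) simply fans them out. The step functions below are placeholders, not the pipeline's real registration, fusion, or deconvolution code.

```python
from multiprocessing import Pool

# Hedged sketch: per-time-point processing with a fixed dependency
# order, parallelized across time points. Step bodies are stand-ins.

def register(tp):      return f"tp{tp}:registered"
def fuse(state):       return state + ":fused"
def deconvolve(state): return state + ":deconvolved"

def process_timepoint(tp):
    # consecutive steps in their required order, for one time point
    return deconvolve(fuse(register(tp)))

if __name__ == "__main__":
    with Pool(4) as pool:                  # each time point is independent
        results = pool.map(process_timepoint, range(12))
    print(results[:3])
```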

  3. An intraorganizational model for developing and spreading quality improvement innovations.

    PubMed

    Kellogg, Katherine C; Gainer, Lindsay A; Allen, Adrienne S; OʼSullivan, Tatum; Singer, Sara J

    Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group's traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread.

  4. An intraorganizational model for developing and spreading quality improvement innovations

    PubMed Central

    Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.

    2017-01-01

    Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788

  5. Implementation of Energy Code Controls Requirements in New Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.; Hatten, Mike

Most state energy codes in the United States are based on one of two national model codes: ANSI/ASHRAE/IES 90.1 (Standard 90.1) or the International Code Council (ICC) International Energy Conservation Code (IECC). Since 2004, covering the last four cycles of Standard 90.1 updates, about 30% of all new requirements have been related to building controls. These requirements can be difficult to implement and verification is beyond the expertise of most building code officials, yet the assumption in studies that measure the savings from energy codes is that they are implemented and working correctly. The objective of the current research is to evaluate the degree to which high impact controls requirements included in commercial energy codes are properly designed, commissioned and implemented in new buildings. This study also evaluates the degree to which these control requirements are realizing their savings potential. This was done using a three-step process. The first step involved interviewing commissioning agents to get a better understanding of their activities as they relate to energy code required controls measures. The second involved field audits of a sample of commercial buildings to determine whether the code required control measures are being designed, commissioned and correctly implemented and functioning in new buildings. The third step includes compilation and analysis of the information gathered during the first two steps. Information gathered during these activities could be valuable to code developers, energy planners, designers, building owners, and building officials.

  6. Using lean methodology to improve productivity in a hospital oncology pharmacy.

    PubMed

    Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D

    2014-09-01

    Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  7. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to limitations of satellite architectures and frequent cloud cover, the availability of daily data at high spatial resolution is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework of a Geo Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Later, two main frameworks are created: one to perform all the pre-processing steps on various satellite data and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks or combined to perform both processes in one go. This tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. The tool is developed as a common platform with a good interface that provides many functions to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.
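    The simplest flavor of the data fusion underlying such tools can be sketched directly: predict a fine-resolution image at a target date by adding the coarse-sensor change between the base date and the target date to the fine image observed at the base date. Real fusion algorithms (STARFM and its descendants) add spatial weighting and similar-pixel searches on top of this idea; the arrays and offsets below are synthetic placeholders.

```python
import numpy as np

# Hedged sketch of "delta"-style spatio-temporal fusion. Inputs are
# assumed co-registered and resampled to the fine grid beforehand,
# which is the pre-processing framework's job.

def fuse(fine_t0, coarse_t0, coarse_t):
    """fine(t) ~ fine(t0) + [coarse(t) - coarse(t0)]."""
    return fine_t0 + (coarse_t - coarse_t0)

fine_t0   = np.random.rand(100, 100)       # fine-resolution scene at t0
coarse_t0 = fine_t0 + 0.02                 # coarse sensor's view at t0
coarse_t  = coarse_t0 + 0.10               # coarse scene at the target date
pred = fuse(fine_t0, coarse_t0, coarse_t)  # synthetic fine image at t
```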

  8. One-step process of hydrothermal and alkaline treatment of wheat straw for improving the enzymatic saccharification.

    PubMed

    Sun, Shaolong; Zhang, Lidan; Liu, Fang; Fan, Xiaolin; Sun, Run-Cang

    2018-01-01

To increase the production of bioethanol, a two-step process based on hydrothermal and dilute alkaline treatment has been applied to reduce the natural resistance of biomass. However, that process requires a large amount of water and a long operation time due to the solid/liquid separation before the alkaline treatment, which decreases the economic profitability of bioethanol production. Therefore, four one-step processes based on the order of hydrothermal and alkaline treatment were developed to enhance the glucose concentration obtained from wheat straw by enzymatic saccharification. The aim of the present study was to systematically evaluate the effects of the different one-step processes by analyzing the physicochemical properties (composition, structural change, crystallinity, surface morphology, and BET surface area) and enzymatic saccharification of the treated substrates. In this study, hemicelluloses and lignins were removed from wheat straw and the morphologic structures were destroyed to various extents during the four one-step processes, which were favorable for cellulase absorption on cellulose. A positive correlation was also observed between the crystallinity and enzymatic saccharification rate of the substrate under the conditions given. The surface area of the substrate was positively related to the concentration of glucose in this study. As compared to the control (3.0 g/L) and the substrates treated by the other three one-step processes (11.2-14.6 g/L), the substrate treated by the one-step process based on successive hydrothermal and alkaline treatment had a maximum glucose concentration of 18.6 g/L, due to the high cellulose concentration and surface area of the substrate, accompanied by removal of large amounts of lignins and hemicelluloses. The present study demonstrated that the order of hydrothermal and alkaline treatment had significant effects on the physicochemical properties and enzymatic saccharification of wheat straw. The one-step process based on successive hydrothermal and alkaline treatment is a simple and economically feasible method for the production of glucose, which can be further converted into bioethanol.

  9. A one pot organic/CdSe nanoparticle hybrid material synthesis with in situ π-conjugated ligand functionalization.

    PubMed

    Mazzio, Katherine A; Okamoto, Ken; Li, Zhi; Gutmann, Sebastian; Strein, Elisabeth; Ginger, David S; Schlaf, Rudy; Luscombe, Christine K

    2013-02-14

    A one pot method for organic/colloidal CdSe nanoparticle hybrid material synthesis is presented. Relative to traditional ligand exchange processes, these materials require smaller amounts of the desired capping ligand, shorter syntheses and fewer processing steps, while maintaining nanoparticle morphology.

  10. DESIGN MANUAL - REMOVAL OF ARSENIC FROM DRINKING WATER SUPPLIES BY ION EXCHANGE

    EPA Science Inventory

    This design manual is an in-depth presentation of the steps required to design and operate a water treatment plant for removal of excess arsenic from drinking water using the anion exchange process. The treatment process is very reliable, simple and cost-effective. This design ...

  11. REMOVAL OF ARSENIC FROM DRINKING WATER SUPPLIES BY IRON REMOVAL PROCESS

    EPA Science Inventory

    This design manual is an in-depth presentation of the steps required to design and operate a water treatment plant for removal of arsenic in the As (V) form from drinking water using an iron removal process. The manual also discusses the capital and operating costs including many...

  12. DESIGN MANUAL - REMOVAL OF ARSENIC FROM DRINKING WATER SUPPLIES BY ADSORPTIVE MEDIA

    EPA Science Inventory

    This design manual is an in-depth presentation of the steps required to design and operate a water treatment plant for removal of excess arsenic from drinking water using the adsorptive media process. The treatment process is very reliable, simple and cost-effective. The adsorpt...

  13. A Winning Transition Plan

    ERIC Educational Resources Information Center

    Moeder-Chandler, Markus

    2014-01-01

    Helping high school athletes navigate the college recruitment process requires some extra steps. This article assists school counselors in the athletic identification process with support provided for both the student and parents. Also covered is how the recruitment criteria for a college and team works. The role of counselor places them on the…

  14. Purification of anti-Japanese encephalitis virus monoclonal antibody by ceramic hydroxyapatite chromatography without proteins A and G.

    PubMed

    Saito, Maiko; Kurosawa, Yae; Okuyama, Tsuneo

    2012-02-01

Antibody purification using proteins A and G has been a standard method for research and industrial processes. The conventional method, however, includes a three-step process, including buffer exchange, before chromatography. In addition, proteins A and G require low pH elution, which causes antibody aggregation and loss of the antibody's immunoreactivity. This report proposes a two-step method using hydroxyapatite chromatography and membrane filtration, without proteins A and G. This novel method shortens the running time to one-third of that of the conventional method for each cycle. Using our two-step method, 90.2% of the purified monoclonal antibodies were recovered in the elution fraction, the purity achieved was >90%, and most of the antigen-specific activity was retained. This report suggests that the two-step method using hydroxyapatite chromatography and membrane filtration should be considered as an alternative to purification using proteins A and G.

  15. A high-throughput semi-automated preparation for filtered synaptoneurosomes.

    PubMed

    Murphy, Kathryn M; Balsor, Justin; Beshara, Simon; Siu, Caitlin; Pinto, Joshua G A

    2014-09-30

    Synaptoneurosomes have become an important tool for studying synaptic proteins. The filtered synaptoneurosomes preparation originally developed by Hollingsworth et al. (1985) is widely used and is an easy method to prepare synaptoneurosomes. The hand processing steps in that preparation, however, are labor intensive and have become a bottleneck for current proteomic studies using synaptoneurosomes. For this reason, we developed new steps for tissue homogenization and filtration that transform the preparation of synaptoneurosomes to a high-throughput, semi-automated process. We implemented a standardized protocol with easy to follow steps for homogenizing multiple samples simultaneously using a FastPrep tissue homogenizer (MP Biomedicals, LLC) and then filtering all of the samples in centrifugal filter units (EMD Millipore, Corp). The new steps dramatically reduce the time to prepare synaptoneurosomes from hours to minutes, increase sample recovery, and nearly double enrichment for synaptic proteins. These steps are also compatible with biosafety requirements for working with pathogen infected brain tissue. The new high-throughput semi-automated steps to prepare synaptoneurosomes are timely technical advances for studies of low abundance synaptic proteins in valuable tissue samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Image Processing Using a Parallel Architecture.

    DTIC Science & Technology

    1987-12-01

ENG/87D-25. This study developed a set of low-level image processing tools on a parallel computer that allows concurrent processing of images. In this environment, the set of tools offers a significant reduction in the time required to perform some commonly used image processing operations. As a step toward developing such systems, a structured set of image processing tools was implemented using a parallel computer.

  17. Automatic sequencing and control of Space Station airlock operations

    NASA Technical Reports Server (NTRS)

    Himel, Victor; Abeles, Fred J.; Auman, James; Tqi, Terry O.

    1989-01-01

    Procedures that have been developed as part of the NASA JSC-sponsored pre-prototype Checkout, Servicing and Maintenance (COSM) program for pre- and post-EVA airlock operations are described. This paper addresses the accompanying pressure changes in the airlock and in the Advanced Extravehicular Mobility Unit (EMU). Additionally, the paper focuses on the components that are checked out, and includes the step-by-step sequences to be followed by the crew, the required screen displays and prompts that accompany each step, and a description of the automated processes that occur.

  18. Cellobiohydrolase 1 from Trichoderma reesei degrades cellulose in single cellobiose steps

    NASA Astrophysics Data System (ADS)

    Brady, Sonia K.; Sreelatha, Sarangapani; Feng, Yinnian; Chundawat, Shishir P. S.; Lang, Matthew J.

    2015-12-01

    Cellobiohydrolase 1 from Trichoderma reesei (TrCel7A) processively hydrolyses cellulose into cellobiose. Although enzymatic techniques have been established as promising tools in biofuel production, a clear understanding of the motor's mechanistic action has yet to be revealed. Here, we develop an optical tweezers-based single-molecule (SM) motility assay for precision tracking of TrCel7A. Direct observation of motility during degradation reveals processive runs and distinct steps on the scale of 1 nm. Our studies suggest TrCel7A is not mechanically limited, can work against 20 pN loads and speeds up when assisted. Temperature-dependent kinetic studies establish the energy requirements for the fundamental stepping cycle, which likely includes energy from glycosidic bonds and other sources. Through SM measurements of isolated TrCel7A domains, we determine that the catalytic domain alone is sufficient for processive motion, providing insight into TrCel7A's molecular motility mechanism.

  19. Curriculum Redesign in Veterinary Medicine: Part I.

    PubMed

    Chaney, Kristin P; Macik, Maria L; Turner, Jacqueline S; Korich, Jodi A; Rogers, Kenita S; Fowler, Debra; Scallan, Elizabeth M; Keefe, Lisa M

    Curricular review is considered a necessary component for growth and enhancement of academic programs and requires time, energy, creativity, and persistence from both faculty and administration. At Texas A&M College of Veterinary Medicine & Biomedical Sciences (TAMU), the faculty and administration partnered with the university's Center for Teaching Excellence to create a faculty-driven, data-enhanced curricular redesign process. The 8-step process begins with the formation of a dedicated faculty curriculum design team to drive the redesign process and to support the college curriculum committee. The next steps include defining graduate outcomes and mapping the current curriculum to identify gaps and redundancies across the curriculum. Data are collected from internal and external stakeholders including veterinary students, faculty, alumni, and employers of graduates. Data collected through curriculum mapping and stakeholder engagement substantiate the curriculum redesign. The guidelines, supporting documents, and 8-step process developed at TAMU are provided to assist other veterinary schools in successful curricular redesign. This is the first of a two-part report that provides the background, context, and description of the process for charting the course for curricular change. The process involves defining expected learning outcomes for new graduates, conducting a curriculum mapping exercise, and collecting stakeholder data for curricular evaluation (steps 1-4). The second part of the report describes the development of rubrics that were applied to the graduate learning outcomes (steps 5-8) and engagement of faculty during the implementation phases of data-driven curriculum change.

  20. 20 CFR 655.15 - Required pre-filing recruitment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... all required steps of the recruitment process as specified in this section. (c) Retention of... § 655.10; (2) Submit a job order to the SWA serving the area of intended employment; (3) Publish two... section); and (4) Where the employer is a party to a collective bargaining agreement governing the job...

  1. 49 CFR 192.925 - What are the requirements for using External Corrosion Direct Assessment (ECDA)?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Corrosion Direct Assessment (ECDA)? 192.925 Section 192.925 Transportation Other Regulations Relating to... External Corrosion Direct Assessment (ECDA)? (a) Definition. ECDA is a four-step process that combines... corrosion to the integrity of a pipeline. (b) General requirements. An operator that uses direct assessment...

  2. 49 CFR 192.925 - What are the requirements for using External Corrosion Direct Assessment (ECDA)?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Corrosion Direct Assessment (ECDA)? 192.925 Section 192.925 Transportation Other Regulations Relating to... External Corrosion Direct Assessment (ECDA)? (a) Definition. ECDA is a four-step process that combines... corrosion to the integrity of a pipeline. (b) General requirements. An operator that uses direct assessment...

  3. 49 CFR 192.925 - What are the requirements for using External Corrosion Direct Assessment (ECDA)?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Corrosion Direct Assessment (ECDA)? 192.925 Section 192.925 Transportation Other Regulations Relating to... External Corrosion Direct Assessment (ECDA)? (a) Definition. ECDA is a four-step process that combines... corrosion to the integrity of a pipeline. (b) General requirements. An operator that uses direct assessment...

  4. 49 CFR 192.925 - What are the requirements for using External Corrosion Direct Assessment (ECDA)?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Corrosion Direct Assessment (ECDA)? 192.925 Section 192.925 Transportation Other Regulations Relating to... External Corrosion Direct Assessment (ECDA)? (a) Definition. ECDA is a four-step process that combines... corrosion to the integrity of a pipeline. (b) General requirements. An operator that uses direct assessment...

  5. Enhanced Traceability for Bulk Processing of Sentinel-Derived Information Products

    NASA Astrophysics Data System (ADS)

    Lankester, Thomas; Hubbard, Steven; Knowelden, Richard

    2016-08-01

    The advent of widely available, systematically acquired and advanced Earth observations from the Sentinel platforms is spurring development of a wide range of derived information products. Whilst welcome, this rapid rate of development inevitably leads to some processing instability as algorithms and production steps are required to evolve accordingly. To mitigate this instability, the provenance of EO-derived information products needs to be traceable and transparent. Airbus Defence and Space (Airbus DS) has developed the Airbus Processing Cloud (APC) as a virtualised processing farm for bulk production of EO-derived data and information products. The production control system of the APC transforms internal configuration control information into an INSPIRE metadata file containing a stepwise set of processing steps and data source elements that provide the complete and transparent provenance of each product generated.
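    For illustration, a minimal sketch of the stepwise-provenance idea: each product carries its source granules plus an ordered list of processing steps. The field names, example identifiers, and JSON serialization below are assumptions made for the sketch; the APC itself emits an INSPIRE-compliant metadata file rather than this structure.

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List

# Illustrative provenance structures; field names are assumptions,
# not the actual INSPIRE-compliant schema used by the APC.
@dataclass
class ProcessingStep:
    name: str              # e.g. "atmospheric-correction"
    software_version: str
    parameters: dict

@dataclass
class ProvenanceRecord:
    product_id: str
    sources: List[str] = field(default_factory=list)   # input granules
    steps: List[ProcessingStep] = field(default_factory=list)

    def add_step(self, step: ProcessingStep) -> None:
        self.steps.append(step)

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

record = ProvenanceRecord(
    product_id="NDVI-2016-08-01-T31UDQ",                 # hypothetical
    sources=["S2A_MSIL1C_20160801T105032_T31UDQ"],       # hypothetical
)
record.add_step(ProcessingStep("atmospheric-correction", "sen2cor-2.3",
                               {"aerosol": "rural"}))
record.add_step(ProcessingStep("ndvi", "1.0", {"bands": ["B04", "B08"]}))
print(record.to_json())   # full, transparent provenance of the product
```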

  6. Non-equilibrium calculations of atmospheric processes initiated by electron impact.

    NASA Astrophysics Data System (ADS)

    Campbell, L.; Brunger, M. J.

    2007-05-01

    Electron impact in the atmosphere produces ionisation, dissociation, electronic excitation and vibrational excitation of atoms and molecules. The products can then take part in chemical reactions, recombination with electrons, or radiative or collisional deactivation. While most such processes are fast, some longer-lived species do not reach equilibrium. The electron source (photoelectrons or auroral electrons) also varies over time, and longer-lived species can move substantially in altitude by molecular, ambipolar or eddy diffusion. Hence non-equilibrium calculations are required in some circumstances. Such time-step calculations need sufficiently short steps so that the fastest processes are still calculated correctly, but this can lead to computation times that are too large. Hence techniques that allow for longer time steps by incorporating equilibrium calculations are described. Examples are given of results of atmospheric non-equilibrium calculations, including the populations of the vibrational levels of ground state N2, the electron density and its dependence on vibrationally excited N2, predictions of nitric oxide density, and detailed processes during short duration auroral events.
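    A toy sketch of the longer-time-step technique described above: a species whose chemical lifetime is much shorter than the chosen step is set to its equilibrium (steady-state) value instead of being integrated explicitly, so the step size is no longer limited by the fastest process. The two-species system and rate constants are invented for illustration only.

```python
# Mixing explicit time stepping with equilibrium treatment of fast species.
k_prod = 1.0e2    # production rate of fast species F (invented units)
k_loss = 1.0e3    # first-order loss rate of F; lifetime ~ 1 ms
k_slow = 1.0e-4   # first-order loss rate of slow species S

def step(S, F, dt):
    # If F's lifetime (1/k_loss) is much shorter than dt, explicit
    # integration would force dt ~ 1/k_loss; instead impose dF/dt = 0,
    # i.e. the equilibrium value F = k_prod / k_loss.
    if dt * k_loss > 10.0:
        F = k_prod / k_loss
    else:
        F = F + dt * (k_prod - k_loss * F)      # explicit Euler
    S = S + dt * (k_loss * F - k_slow * S)      # slow species always explicit
    return S, F

S, F = 0.0, 0.0
for _ in range(1000):
    S, F = step(S, F, dt=1.0)   # 1 s steps, far longer than F's lifetime
print(S, F)
```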

  7. Middleware Case Study: MeDICi

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wynne, Adam S.

    2011-05-05

    In many application domains in science and engineering, data produced by sensors, instruments and networks is naturally processed by software applications structured as a pipeline. Pipelines comprise a sequence of software components that progressively process discrete units of data to produce a desired outcome. For example, in a Web crawler that is extracting semantics from text on Web sites, the first stage in the pipeline might be to remove all HTML tags to leave only the raw text of the document. The second step may parse the raw text to break it down into its constituent grammatical parts, such as nouns, verbs and so on. Subsequent steps may look for names of people or places, interesting events or times so documents can be sequenced on a time line. Each of these steps can be written as a specialized program that works in isolation with other steps in the pipeline. In many applications, simple linear software pipelines are sufficient. However, more complex applications require topologies that contain forks and joins, creating pipelines comprising branches where parallel execution is desirable. It is also increasingly common for pipelines to process very large files or high volume data streams which impose end-to-end performance constraints. Additionally, processes in a pipeline may have specific execution requirements and hence need to be distributed as services across a heterogeneous computing and data management infrastructure. From a software engineering perspective, these more complex pipelines become problematic to implement. While simple linear pipelines can be built using minimal infrastructure such as scripting languages, complex topologies and large, high volume data processing require suitable abstractions, run-time infrastructures and development tools to construct pipelines with the desired qualities-of-service and flexibility to evolve to handle new requirements. The above summarizes the reasons we created the MeDICi Integration Framework (MIF), which is designed for creating high-performance, scalable and modifiable software pipelines. MIF exploits a low friction, robust, open source middleware platform and extends it with component and service-based programmatic interfaces that make implementing complex pipelines simple. The MIF run-time automatically handles queues between pipeline elements in order to handle request bursts, and automatically executes multiple instances of pipeline elements to increase pipeline throughput. Distributed pipeline elements are supported using a range of configurable communications protocols, and the MIF interfaces provide efficient mechanisms for moving data directly between two distributed pipeline elements.
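    The queue-buffered pipeline idea is easy to sketch. The toy below uses plain Python threads and queues rather than the MIF API: queues between elements absorb request bursts, and running two instances of a slower element raises throughput, much as the run-time described above does automatically.

```python
import queue
import threading

SENTINEL = None   # shutdown marker flowing through the pipeline

def element(func, inbox, outbox):
    """One pipeline element: consume from inbox, produce to outbox."""
    while True:
        item = inbox.get()
        if item is SENTINEL:
            inbox.put(SENTINEL)    # wake sibling instances of this element
            outbox.put(SENTINEL)   # propagate shutdown downstream
            break
        outbox.put(func(item))

strip_tags = lambda doc: doc.replace("<p>", "").replace("</p>", "")
tokenize = lambda text: text.split()

q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
workers = [threading.Thread(target=element, args=(strip_tags, q_in, q_mid))]
workers += [threading.Thread(target=element, args=(tokenize, q_mid, q_out))
            for _ in range(2)]      # two instances of the slower element
for w in workers:
    w.start()

for doc in ["<p>first document</p>", "<p>second document</p>"]:
    q_in.put(doc)                   # a burst of requests is simply queued
q_in.put(SENTINEL)

for w in workers:
    w.join()
while not q_out.empty():
    item = q_out.get()
    if item is not SENTINEL:
        print(item)
```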

  8. Process for fabrication of large titanium diboride ceramic bodies

    DOEpatents

    Moorhead, Arthur J.; Bomar, E. S.; Becher, Paul F.

    1989-01-01

    A process for manufacturing large, fully dense, high purity TiB.sub.2 articles by pressing powders with a sintering aid at relatively low temperatures to reduce grain growth. The process requires stringent temperature and pressure applications in the hot-pressing step to ensure maximum removal of sintering aid and to avoid damage to the fabricated article or the die.

  9. Quality Indicators for the Total Testing Process.

    PubMed

    Plebani, Mario; Sciacovelli, Laura; Aita, Ada

    2017-03-01

    ISO 15189:2012 requires the use of quality indicators (QIs) to monitor and evaluate all steps of the total testing process, but several difficulties dissuade laboratories from effective and continuous use of QIs in routine practice. An International Federation of Clinical Chemistry and Laboratory Medicine working group addressed this problem and implemented a project to develop a model of QIs to be used in clinical laboratories worldwide to monitor and evaluate all steps of the total testing process, and decrease error rates and improve patient services in laboratory testing. All laboratories are invited, at no cost, to enroll in the project and contribute to harmonized management at the international level. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Process modeling of a HLA research lab

    NASA Astrophysics Data System (ADS)

    Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.

    2017-11-01

    Bioinformatics has provided tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for analysis and storage. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps in HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps and avoid mistakes.

  11. Process Development for Automated Solar Cell and Module Production. Task 4: Automated Array Assembly

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A baseline sequence for the manufacture of solar cell modules was specified. Starting with silicon wafers, the process goes through damage etching, texture etching, junction formation, plasma edge etch, aluminum back surface field formation, and screen printed metallization to produce finished solar cells. The cells were then series-connected on a ribbon and bonded into a finished glass-Tedlar module. A number of steps required additional developmental effort to verify technical and economic feasibility. These steps include texture etching, plasma edge etch, aluminum back surface field formation, array layup and interconnect, and module edge sealing and framing.

  12. Implementing Immediate Postpartum Long-Acting Reversible Contraception Programs.

    PubMed

    Hofler, Lisa G; Cordes, Sarah; Cwiak, Carrie A; Goedken, Peggy; Jamieson, Denise J; Kottke, Melissa

    2017-01-01

    To understand the most important steps required to implement immediate postpartum long-acting reversible contraception (LARC) programs in different Georgia hospitals and the barriers to implementing such a program. This was a qualitative study. We interviewed 32 key personnel from 10 Georgia hospitals working to establish immediate postpartum LARC programs. Data were analyzed using directed qualitative content analysis principles. We used the Stages of Implementation to organize participant-identified key steps for immediate postpartum LARC into an implementation guide. We compared this guide to hospitals' implementation experiences. At the completion of the study, LARC was available for immediate postpartum placement at 7 of 10 study hospitals. Participants identified common themes for the implementation experience: team member identification and ongoing communication, payer preparedness challenges, interdependent department-specific tasks, and piloting with continuing improvements. Participants expressed a need for anticipatory guidance throughout the process. Key first steps to immediate postpartum LARC program implementation were identifying project champions, creating an implementation team that included all relevant departments, obtaining financial reassurance, and ensuring hospital administration awareness of the project. Potential barriers included lack of knowledge about immediate postpartum LARC, financial concerns, and competing clinical and administrative priorities. Hospitals that were successful at implementing immediate postpartum LARC programs did so by prioritizing clear communication and multidisciplinary teamwork. Although the implementation guide reflects a comprehensive assessment of the steps to implementing immediate postpartum LARC programs, not all hospitals required every step to succeed. Hospital teams report that implementing immediate postpartum LARC programs involves multiple departments and a number of important steps to consider. A stage-based approach to implementation, and a standardized guide detailing these steps, may provide the necessary structure for the complex process of implementing immediate postpartum LARC programs in the hospital setting.

  13. Development of DKB ETL module in case of data conversion

    NASA Astrophysics Data System (ADS)

    Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.

    2018-05-01

    Modern scientific experiments produce huge volumes of data that require new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a valuable amount of additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue that calls for solutions outside the box. One of the tasks is to integrate metadata from different repositories into a central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of particular pipelines arranging data flow from data sources to the main DKB storage. The data transformation process, represented by a single pipeline, can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of the DKB.
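    As a sketch of this pipeline-of-modules pattern, with each step an independent, replaceable module (the step names and record fields below are invented for illustration, not the DKB data model):

```python
# ETL pipeline as a chain of independent step modules.
def aggregate(sources):
    # gather raw metadata records from several repositories
    for source in sources:
        yield from source

def transform(records):
    # normalize records to the target data model
    for rec in records:
        yield {"dataset": rec["name"].lower(), "events": int(rec["nevents"])}

def load(records, storage):
    # write standardized records into the central storage
    for rec in records:
        storage.append(rec)

def run_pipeline(sources, storage, steps=(aggregate, transform)):
    data = sources
    for step in steps:
        data = step(data)   # each step is a separate program module
    load(data, storage)

repo_a = [{"name": "MC16_TTBAR", "nevents": "1000"}]      # hypothetical
repo_b = [{"name": "DATA18_MINBIAS", "nevents": "250"}]   # hypothetical
storage = []
run_pipeline([repo_a, repo_b], storage)
print(storage)
```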

  14. HALE UAS Command and Control Communications: Step 1 - Functional Requirements Document. Version 4.0

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The High Altitude Long Endurance (HALE) unmanned aircraft system (UAS) communicates with an off-board pilot-in-command in all flight phases via the C2 data link, making it a critical component for the UA to fly in the NAS safely and routinely. This is a new requirement in current FAA communications planning and monitoring processes. This document provides a set of comprehensive C2 communications functional requirements and performance guidelines to help facilitate the future FAA certification process for civil UAS to operate in the NAS. The guidelines are intended to support validation of the functional requirements and, in the future, the development of performance-level requirements.

  15. Magnetorheological finishing: a perfect solution to nanofinishing requirements

    NASA Astrophysics Data System (ADS)

    Sidpara, Ajay

    2014-09-01

    Finishing of optics for different applications is the most important, and most difficult, step in meeting optical specifications. Conventional grinding and other polishing processes are unable to reduce surface roughness beyond a certain limit because of the high forces acting on the workpiece, embedded abrasive particles, limited control over the process, etc. The magnetorheological finishing (MRF) process provides a new, efficient, and innovative way to finish optical materials as well as many metals to the desired level of accuracy. This paper provides an overview of the MRF process for different applications, the important process parameters, the requirements of the magnetorheological fluid with respect to the workpiece material, and some areas that need to be explored to extend the applications of the MRF process.

  16. DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING DL

    2010-08-03

    This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.

  17. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  18. Survival in a Down Economy: A Budget Reduction Process for Superintendents

    ERIC Educational Resources Information Center

    Davis, E. E.; Coffland, Jack A.

    2010-01-01

    Dramatic reductions in the dollars available for public education require a new and systemic approach to balancing school district budgets. This manual provides numerous examples of successful budget reduction strategies based on a six-step process that has demonstrated its effectiveness in small, medium, and large school districts. Supported by…

  19. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.
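    A minimal sketch of such a process representation: steps produce products that feed later steps, and a flag marks the steps that certify their product. The fields and example flow are illustrative, not the CBA tooling itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Step:
    name: str
    consumes: List[str] = field(default_factory=list)
    produces: str = ""
    certifies: bool = False   # does this step fully vet its product?

process = [
    Step("plan activities", produces="activity plan"),
    Step("build sequence", consumes=["activity plan"],
         produces="command sequence"),
    Step("validate sequence", consumes=["command sequence"],
         produces="vetted sequence", certifies=True),
]

# Products created without a certifying step are natural places to levy
# streamlining or automation requirements on candidate technologies.
uncertified = [s.produces for s in process if not s.certifies]
print("products not certified at creation:", uncertified)
```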

  20. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  1. Software for Automated Reading of STEP Files by I-DEAS(trademark)

    NASA Technical Reports Server (NTRS)

    Pinedo, John

    2003-01-01

    A program called "readstep" enables the I-DEAS(tm) computer-aided-design (CAD) software to automatically read Standard for the Exchange of Product Model Data (STEP) files. (The STEP format is one of several used to transfer data between dissimilar CAD programs.) Prior to the development of "readstep," it was necessary to read STEP files into I-DEAS(tm) one at a time in a slow process that required repeated intervention by the user. In operation, "readstep" prompts the user for the location of the desired STEP files and the names of the I-DEAS(tm) project and model file, then generates an I-DEAS(tm) program file called "readstep.prg" and two Unix shell programs called "runner" and "controller." The program "runner" runs I-DEAS(tm) sessions that execute readstep.prg, while "controller" controls the execution of "runner" and edits readstep.prg if necessary. The user sets "runner" and "controller" into execution simultaneously, and then no further intervention by the user is required. When "runner" has finished, the user should see only parts from successfully read STEP files present in the model file. STEP files that could not be read successfully (e.g., because of format errors) should be regenerated before attempting to read them again.

  2. Apparatus and method for polarizing polarizable nuclear species

    DOEpatents

    Hersman, F. William; Leuschner, Mark; Carberry, Jeannette

    2005-09-27

    The present invention is a polarizing process involving a number of steps. The first step requires moving a flowing mixture of gas, the gas at least containing a polarizable nuclear species and vapor of at least one alkali metal, with a transport velocity that is not negligible when compared with the natural velocity of diffusive transport. The second step is propagating laser light in a direction, preferably at least partially through a polarizing cell. The next step is directing the flowing gas along a direction generally opposite to the direction of laser light propagating. The next step is containing the flowing gas mixture in the polarizing cell. The final step is immersing the polarizing cell in a magnetic field. These steps can be initiated in any order, although the flowing gas, the propagating laser and the magnetic field immersion must be concurrently active for polarization to occur.

  3. Terminological aspects of data elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strehlow, R.A.; Kenworthey, W.H. Jr.; Schuldt, R.E.

    1991-01-01

    The creation and display of data comprise a process that involves a sequence of steps requiring both semantic and systems analysis. An essential early step in this process is the choice, definition, and naming of data element concepts, followed by the specification of other needed data element concept attributes. The attributes and values of a data element concept remain associated with it from its birth as a concept through the generic data element that serves as a template for final application. Terminology is, therefore, centrally important to the entire data creation process. Smooth mapping from natural language to a database is a critical aspect of database design, and consequently it requires terminology standardization from the outset of database work. In this paper the semantic aspects of data elements are analyzed and discussed. Seven kinds of data element concept information are considered, and those that require terminological development and standardization are identified. The four terminological components of a data element are the hierarchical type of a concept, functional dependencies, schemata showing conceptual structures, and definition statements. These constitute the conventional role of terminology in database design. 12 refs., 8 figs., 1 tab.
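    A sketch of those four terminological components as a simple record type; the attribute names are paraphrases of the list above, not a standard schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DataElementConcept:
    name: str
    hierarchical_type: str                 # generic concept it specializes
    functional_dependencies: List[str] = field(default_factory=list)
    conceptual_schema: Dict[str, str] = field(default_factory=dict)
    definition: str = ""

# Hypothetical example of a data element concept and its components.
part_number = DataElementConcept(
    name="part number",
    hierarchical_type="identifier",
    functional_dependencies=["part number -> part name"],
    conceptual_schema={"identifies": "manufactured part"},
    definition="A designator that uniquely identifies a manufactured part.",
)
print(part_number)
```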

  4. Low-temperature direct bonding of glass nanofluidic chips using a two-step plasma surface activation process.

    PubMed

    Xu, Yan; Wang, Chenxi; Dong, Yiyang; Li, Lixiao; Jang, Kihoon; Mawatari, Kazuma; Suga, Tadatomo; Kitamori, Takehiko

    2012-01-01

    Owing to well-established nanochannel fabrication technology offering high resolution, reproducibility, and flexibility in two dimensions, glass is the leading, ideal, and essentially irreplaceable material for the fabrication of nanofluidic chips. However, high temperature (~1,000 °C) and a vacuum are usually required in the conventional fusion bonding process, which unfortunately impedes nanofluidic applications and even the development of the whole field of nanofluidics. We present a direct bonding of fused silica glass nanofluidic chips at low temperature, around 200 °C in ambient air, through a two-step plasma surface activation process which consists of an O2 reactive ion etching plasma treatment followed by nitrogen microwave radical activation. The low-temperature bonded glass nanofluidic chips not only had high bonding strength but also could work continuously without leakage during liquid introduction driven by air pressure even at 450 kPa, a very high pressure which can meet the requirements of most nanofluidic operations. Owing to the mild conditions required in the bonding process, the method has the potential to allow the integration of a range of functional elements into nanofluidic chips during manufacture, which is nearly impossible in the conventional high-temperature fusion bonding process. Therefore, we believe that the developed low-temperature bonding will be very useful and contribute to the field of nanofluidics.

  5. [Process optimisation in hospitals: from process to business organisation].

    PubMed

    Eberlein-Gonska, Maria

    2010-01-01

    Apart from a multidimensional quality definition and the understanding of quality as a company-wide challenge, a third essential element of quality management is prevention. Thus, company quality policy has to be prevention-oriented and requires both customer and process orientation as important prerequisites. Process orientation especially focuses on the critical analysis of work flows as a condition for identifying early intervention options which, in turn, may influence the result. Developing a business organisation requires the definition of criteria for space planning, room assignment and room integration in consideration of both medical and economic aspects and the architectural concept. Specific experiences are demonstrated in a case study of a new building in the midst of the Carl Gustav Carus University Hospital in Dresden, the Diagnostic Centre for Internal Medicine and Neurology. The hospital management placed an order to develop a sustainable and feasible business organisation for all the different departments. The idea was to create a medical centre where maximum use is made of all planned spaces and resources on the basis of target processes, which had to be defined and agreed upon with all the persons concerned. In a next step, all the personnel, space and operational resources required were assigned. The success of management in all industries, including the health care sector, crucially depends on the translation of ideas into practice, among them the critical factor of sustainability. In this context, support by the management as a role model, a formal frame for the respective project group and the definition of controlling via defined indicators have special importance. The example of the Diagnostic Centre for Internal Medicine and Neurology demonstrates that changed processes may trigger a cultural change in which competition is replaced by cooperation, step by step. Copyright © 2010. Published by Elsevier GmbH.

  6. One-step aluminium-assisted crystallization of Ge epitaxy on Si by magnetron sputtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ziheng, E-mail: ziheng.liu@unsw.edu.au; Hao, Xiaojing; Ho-Baillie, Anita

    In this work, one-step aluminium-assisted crystallization of Ge on Si is achieved via magnetron sputtering by applying an in-situ low temperature (50 °C to 150 °C) heat treatment in between Al and Ge depositions. The effect of heat treatment on film properties and the growth mechanism of Ge epitaxy on Si are studied via X-ray diffraction, Raman and transmission electron microscopy analyses. Compared with the conventional two-step process, the one-step aluminium-assisted crystallization requires much lower thermal budget and results in pure Ge epitaxial layer, which may be suitable for use as a virtual substrate for the fabrication of III-V solar cells.

  7. One-step fabrication of multifunctional micromotors.

    PubMed

    Gao, Wenlong; Liu, Mei; Liu, Limei; Zhang, Hui; Dong, Bin; Li, Christopher Y

    2015-09-07

    Although artificial micromotors have undergone tremendous progress in recent years, their fabrication normally requires complex steps or expensive equipment. In this paper, we report a facile one-step method based on an emulsion solvent evaporation process to fabricate multifunctional micromotors. By simultaneously incorporating various components into an oil-in-water droplet, upon emulsification and solidification, a sphere-shaped, asymmetric, and multifunctional micromotor is formed. Some of the attractive functions of this model micromotor include autonomous movement in high ionic strength solution, remote control, enzymatic disassembly and sustained release. This one-step, versatile fabrication method can be easily scaled up and therefore may have great potential in mass production of multifunctional micromotors for a wide range of practical applications.

  8. 3D freeform printing of silk fibroin.

    PubMed

    Rodriguez, Maria J; Dixon, Thomas A; Cohen, Eliad; Huang, Wenwen; Omenetto, Fiorenzo G; Kaplan, David L

    2018-04-15

    Freeform fabrication has emerged as a key direction in printing biologically-relevant materials and structures. With this emerging technology, complex structures with microscale resolution can be created in arbitrary geometries and without the limitations found in traditional bottom-up or top-down additive manufacturing methods. Recent advances in freeform printing have used the physical properties of microparticle-based granular gels as a medium for the submerged extrusion of bioinks. However, most of these techniques require post-processing or crosslinking for the removal of the printed structures (Miller et al., 2015; Jin et al., 2016) [1,2]. In this communication, we introduce a novel method for the one-step gelation of silk fibroin within a suspension of synthetic nanoclay (Laponite) and polyethylene glycol (PEG). Silk fibroin has been used as a biopolymer for bioprinting in several contexts, but chemical or enzymatic additives or bulking agents are needed to stabilize 3D structures. Our method requires no post-processing of printed structures and allows for in situ physical crosslinking of pure aqueous silk fibroin into arbitrary geometries produced through freeform 3D printing. 3D bioprinting has emerged as a technology that can produce biologically relevant structures in defined geometries with microscale resolution. Techniques for fabrication of free-standing structures by printing into granular gel media has been demonstrated previously, however, these methods require crosslinking agents and post-processing steps on printed structures. Our method utilizes one-step gelation of silk fibroin within a suspension of synthetic nanoclay (Laponite), with no need for additional crosslinking compounds or post processing of the material. This new method allows for in situ physical crosslinking of pure aqueous silk fibroin into defined geometries produced through freeform 3D printing. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  9. An Update on the NASA Planetary Science Division Research and Analysis Program

    NASA Astrophysics Data System (ADS)

    Bernstein, Max; Richey, Christina; Rall, Jonathan

    2015-11-01

    Introduction: NASA’s Planetary Science Division (PSD) solicits its research and analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD changed the structure of the program elements under which the majority of planetary science R&A is done. Major changes included the creation of five core research program elements aligned with PSD’s strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2015 submission changes: All PSD programs will continue to use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address the science goals and objectives, and the relevance of the proposed research to the call to which it is submitted.

  10. Framework for Integrating Science Data Processing Algorithms Into Process Control Systems

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.

    2011-01-01

    A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). PGEs codify a scientific algorithm, some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node information such as CPU availability, etc., is required, the PCS Task Wrapper provides this information to the underlying PGE. After this information is collected, the PGE is executed, and its output Product file and Metadata generation is managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This innovative framework is really the unifying bridge between the execution of a step in the overall processing pipeline, and the available PCS component services as well as the information that they collectively manage.
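    A compressed sketch of that wrapper lifecycle: stage inputs and metadata for the PGE, execute it, then register its outputs for downstream processing and pedigree. The paths, JSON config format, and PGE command below are hypothetical; the real framework uses its own config builders and the PCS Crawler Framework.

```python
import json
import subprocess
from pathlib import Path

def run_pge(pge_cmd, input_files, work_dir, catalog):
    work = Path(work_dir)
    work.mkdir(parents=True, exist_ok=True)

    # 1. Build the config the PGE expects (file locations, runtime info).
    config = {"inputs": [str(f) for f in input_files], "workdir": str(work)}
    config_path = work / "pge_config.json"
    config_path.write_text(json.dumps(config))

    # 2. Execute the PGE in its prepared environment.
    subprocess.run([*pge_cmd, str(config_path)], cwd=work, check=True)

    # 3. "Crawl" the work dir and register products plus metadata for
    #    downstream processing and pedigree.
    for product in work.glob("*.dat"):
        catalog.append({"product": product.name,
                        "metadata": {"generated_by": pge_cmd[0]}})

catalog = []
# run_pge(["my_pge"], ["granule1.h5"], "/tmp/pge_run", catalog)  # hypothetical
```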

  11. Zero Liquid Discharge (ZLD) System for Flue-Gas Derived Water From Oxy-Combustion Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivaram Harendra; Danylo Oryshchyn; Thomas Ochs

    2011-10-16

    Researchers at the National Energy Technology Laboratory (NETL) located in Albany, Oregon, have patented a process, Integrated Pollutant Removal (IPR), that uses off-the-shelf technology to produce a sequestration-ready CO2 stream from an oxy-combustion power plant. Capturing CO2 from fossil-fuel combustion generates a significant water product which can be tapped for use in the power plant and its peripherals. Water condensed in the IPR process may contain fly ash particles, sodium (from pH control), and sulfur species, as well as heavy metals, cations and anions. NETL is developing a treatment approach for zero liquid discharge while maximizing the heat available from IPR. The treatment-process steps currently being studied are flocculation/coagulation, for removal of cations and fine particles, and reverse osmosis, for anion removal as well as for scavenging the remaining cations. After the reverse osmosis steps, thermal evaporation and crystallization steps will be carried out in order to build the whole zero liquid discharge (ZLD) system for flue-gas condensed wastewater. Gypsum is the major product of the crystallization process. Fast, in-line treatment of water for re-use in IPR appears to be one practical step for minimizing water treatment requirements for CO2 capture. The results obtained from the above experiments are being used to build water treatment models.

  12. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soares, A.S.; Schneider, D. K.; Skinner, J. M.

    2008-09-01

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from an heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  13. Remote Access to the PXRR Macromolecular Crystallography Facilities at the NSLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A Soares; D Schneider; J Skinner

    2011-12-31

    The most recent surge of innovations that have simplified and streamlined the process of determining macromolecular structures by crystallography owes much to the efforts of the structural genomics community. However, this was only the last step in a long evolution that saw the metamorphosis of crystallography from an heroic effort that involved years of dedication and skill into a straightforward measurement that is occasionally almost trivial. Many of the steps in this remarkable odyssey involved reducing the physical labor that is demanded of experimenters in the field. Other steps reduced the technical expertise required for conducting those experiments.

  14. Introduction to Remote Sensing Image Registration

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline

    2017-01-01

    For many applications, accurate and fast image registration of large amounts of multi-source data is the first necessary step before subsequent processing and integration. Image registration is defined by several steps, and each step can be approached by various methods which all present diverse advantages and drawbacks depending on the type of data, the type of application, the a priori information known about the data, and the type of accuracy that is required. This paper will first present a general overview of remote sensing image registration and then will go over a few specific methods and their applications.

  15. Tetraethylene glycol promoted two-step, one-pot rapid synthesis of indole-3-[1-11C]acetic acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Sojeong; Qu, Wenchao; Alexoff, David L.

    2014-12-12

    An operationally friendly, two-step, one-pot process has been developed for the rapid synthesis of carbon-11 labeled indole-3-acetic acid ([11C]IAA or [11C]auxin). By replacing an aprotic polar solvent with tetraethylene glycol, nucleophilic [11C]cyanation and alkaline hydrolysis reactions were performed consecutively in a single pot without a time-consuming intermediate purification step. The entire production time for this updated procedure is 55 min, which dramatically simplifies the entire synthesis and reduces the starting radioactivity required for a whole-plant imaging study.

  16. Issues in nanocomposite ceramic engineering: focus on processing and properties of alumina-based composites.

    PubMed

    Palmero, Paola; Kern, Frank; Sommer, Frank; Lombardi, Mariangela; Gadow, Rainer; Montanaro, Laura

    2014-12-30

    Ceramic nanocomposites, containing at least one phase in the nanometric dimension, have received special interest in recent years. They have, in fact, demonstrated increased performance, reliability and lifetime with respect to monolithic ceramics. However, a successful approach to the production of tailored composite nanostructures requires the development of innovative concepts at each step of manufacturing, from the synthesis of composite nanopowders, to their processing and sintering. This review aims to deepen understanding of some of the critical issues associated with the manufacturing of nanocomposite ceramics, focusing on alumina-based composite systems. Two case studies are presented and briefly discussed. The former illustrates the benefits, in terms of sintered microstructure and related mechanical properties, resulting from the application of an engineering approach to a laboratory-scale protocol for the elaboration of nanocomposites in the system alumina-ZrO2-YAG (yttrium aluminium garnet). The latter illustrates the manufacturing of alumina-based composites for large-scale applications such as cutting tools, carried out by an injection molding process. The need for an engineering approach to be applied in all processing steps is demonstrated also in this second case study, where a tailored manufacturing process is required to obtain the desired results.

  17. Electrochemical reduction of CerMet fuels for transmutation using surrogate CeO2-Mo pellets

    NASA Astrophysics Data System (ADS)

    Claux, B.; Souček, P.; Malmbeck, R.; Rodrigues, A.; Glatz, J.-P.

    2017-08-01

    One of the concepts chosen for the transmutation of minor actinides in Accelerator Driven Systems or fast reactors proposes the use of fuels and targets containing minor actinide oxides embedded in an inert matrix composed either of molybdenum metal (CerMet fuel) or of ceramic magnesium oxide (CerCer fuel). Since sufficient transmutation cannot be achieved in a single step, multi-recycling of the fuel is required, including recovery of the non-transmuted minor actinides. In the present work, a pyrochemical process for the treatment of Mo metal inert matrix based CerMet fuels is studied, particularly electroreduction in molten chloride salt as a head-end step required prior to the main separation process. At the initial stage, different inactive pellets simulating the fuel, containing CeO2 as a minor actinide surrogate, were examined. The main studied parameters of process efficiency were the porosity and composition of the pellets, and the process parameters current density and passed charge. The results indicated the feasibility of the process, gave insight into its limiting parameters and defined the parameters for a future experiment on minor actinide containing material.

  18. Does footwear type impact the number of steps required to reach gait steady state?: an innovative look at the impact of foot orthoses on gait initiation.

    PubMed

    Najafi, Bijan; Miller, Daniel; Jarrett, Beth D; Wrobel, James S

    2010-05-01

    Many studies have attempted to better elucidate the effect of foot orthoses on gait dynamics. To our knowledge, most previous studies exclude the first few steps of gait and begin analysis at steady state walking. These unanalyzed steps of gait may contain important information about the dynamic and complex processes required to achieve equilibrium for a given gait velocity. The purpose of this study was to quantify gait initiation and determine how many steps were required to reach steady state walking under three footwear conditions: barefoot, habitual shoes, and habitual shoes with prefabricated foot orthoses. Fifteen healthy subjects walked 50 m at habitual speed in each condition. Wearing habitual shoes with the prefabricated orthoses enabled subjects to reach steady state walking in fewer steps (3.5 ± 2.0 steps) compared to the barefoot condition (5.2 ± 3.0 steps; p=0.02) as well as compared to the habitual shoes condition (4.7 ± 1.6 steps; p=0.05). Interestingly, the subjects' dynamic medial-lateral balance was significantly improved (22%, p<0.05) by using foot orthoses compared to the other footwear conditions. These findings suggest that foot orthoses may help individuals reach steady state more quickly and with better dynamic balance in the medial-lateral direction, independent of foot type. The findings of this pilot study may open new avenues for objectively assessing the impact of prescription footwear on dynamic balance and spatio-temporal parameters of gait. Further work to better assess the impact of foot orthoses on gait initiation in patients suffering from gait and instability pathologies may be warranted. Copyright 2010 Elsevier B.V. All rights reserved.
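    One plausible way to operationalize "steps to steady state" from per-step gait velocities: take the mean of the final steps as the steady-state value and report the first step after which the series stays within a tolerance band of it. This is an illustrative criterion, not necessarily the algorithm used in the study above.

```python
import numpy as np

def steps_to_steady_state(step_velocities, tail=5, tol=0.05):
    v = np.asarray(step_velocities, dtype=float)
    v_ss = v[-tail:].mean()                    # steady-state estimate (m/s)
    within = np.abs(v - v_ss) <= tol * v_ss    # inside the +/-5% band?
    for i, ok in enumerate(within):
        if ok and within[i:].all():            # stays in band from here on
            return i + 1                       # 1-indexed step count
    return len(v)

# Synthetic per-step velocities for one walking trial (invented numbers).
velocities = [0.6, 0.9, 1.1, 1.2, 1.3, 1.28, 1.31, 1.29, 1.30, 1.30]
print(steps_to_steady_state(velocities))   # -> 5 for this synthetic series
```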

  19. Planning bioinformatics workflows using an expert system.

    PubMed

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY), which includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
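    The backwards-chaining idea is compact enough to sketch. In the toy below, each rule records which input data types a tool consumes and the type it produces, and planning recurses from the goal type back to the available raw data. The rules and type names are invented, not BETSY's knowledge base, and the sketch omits cycle detection and alternative-rule search.

```python
RULES = [
    {"tool": "align_reads",   "inputs": ["fastq", "genome_index"], "output": "bam"},
    {"tool": "index_genome",  "inputs": ["fasta"],                 "output": "genome_index"},
    {"tool": "call_variants", "inputs": ["bam"],                   "output": "vcf"},
]

def plan(goal, available, steps=None):
    """Return an ordered list of tool invocations that produces `goal`."""
    steps = [] if steps is None else steps
    if goal in available:
        return steps                           # raw data, nothing to run
    for rule in RULES:
        if rule["output"] == goal:
            for needed in rule["inputs"]:      # chain backwards on inputs
                plan(needed, available, steps)
            steps.append(rule["tool"])
            return steps
    raise ValueError(f"no rule produces {goal!r}")

print(plan("vcf", available={"fastq", "fasta"}))
# -> ['index_genome', 'align_reads', 'call_variants']
```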

  20. Planning bioinformatics workflows using an expert system

    PubMed Central

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY), which includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928

  1. Demonstration of the feasibility of automated silicon solar cell fabrication

    NASA Technical Reports Server (NTRS)

    Taylor, W. E.; Schwartz, F. M.

    1975-01-01

    A study was undertaken to determine the process steps and design requirements of an automated silicon solar cell production facility. The key process steps were identified, and a laboratory model was conceptually designed to demonstrate the feasibility of automating the silicon solar cell fabrication process. A detailed laboratory model was designed to demonstrate those functions most critical to the question of whether solar cell fabrication can feasibly be automated. The study and conceptual design have established the technical feasibility of automating the solar cell manufacturing process to produce low cost solar cells with improved performance. Estimates predict an automated process throughput of 21,973 kilograms of silicon a year on a three-shift, 49-week basis, producing 4,747,000 hexagonal cells (38 mm/side), a total of 3,373 kilowatts, at an estimated manufacturing cost of $0.866 per cell or $1.22 per watt.
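    As a quick consistency check on the quoted economics, using only the figures above: 4,747,000 cells × $0.866/cell ≈ $4.11 million per year, and $4,110,000 / 3,373,000 W ≈ $1.22/W, matching the stated per-watt cost.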

  2. Epicenter location by analysis of interictal spikes

    NASA Technical Reports Server (NTRS)

    Hand, C.

    2001-01-01

    The MEG recording is a quick and painless process that requires no surgery. This approach has the potential to save time, reduce patient discomfort, and eliminate a painful and potentially dangerous surgical step in the treatment procedure.

  3. Improving the Simplified Acquisition of Base Engineering Requirements (SABER) Delivery Order Award Process: Results of a Process Improvement Plan

    DTIC Science & Technology

    1991-09-01

    putting all tasks directed towards achieving an outcome in sequence. The tasks can be viewed as steps in the process (39:2.3). Using this...improvement opportunity is investigated. A plan is developed, root causes are identified, and solutions are tested and implemented. The process is... solutions, check for actual improvement, and integrate the successful improvements into the process. Step 7. Check Improvement Performance. Finally, the

  4. A collaborative approach to lean laboratory workstation design reduces wasted technologist travel.

    PubMed

    Yerian, Lisa M; Seestadt, Joseph A; Gomez, Erron R; Marchant, Kandice K

    2012-08-01

    Lean methodologies have been applied in many industries to reduce waste. We applied Lean techniques to redesign laboratory workstations with the aim of reducing the number of times employees must leave their workstations to complete their tasks. At baseline, in the 68 workflows (aggregates or sequences of process steps) studied, 251 (38%) of 664 tasks required workers to walk away from their workstations. After analysis and redesign, only 59 (9%) of the 664 tasks required technologists to leave their workstations. On average, 3.4 travel events were removed for each workstation. Time studies in a single laboratory section demonstrated that workers spend 8 to 70 seconds in travel each time they step away from the workstation. The redesigned workstations will allow employees to spend less time travelling around the laboratory. Additional benefits include employee training in waste identification, improved overall laboratory layout, and identification of other process improvement opportunities in our laboratory.

  5. Components of the Engulfment Machinery Have Distinct Roles in Corpse Processing

    PubMed Central

    Meehan, Tracy L.; Joudi, Tony F.; Timmons, Allison K.; Taylor, Jeffrey D.; Habib, Corey S.; Peterson, Jeanne S.; Emmanuel, Shanan; Franc, Nathalie C.; McCall, Kimberly

    2016-01-01

    Billions of cells die in our bodies on a daily basis and are engulfed by phagocytes. Engulfment, or phagocytosis, can be broken down into five basic steps: attraction of the phagocyte, recognition of the dying cell, internalization, phagosome maturation, and acidification. In this study, we focus on the last two steps, which can collectively be considered corpse processing, in which the engulfed material is degraded. We use the Drosophila ovarian follicle cells as a model for engulfment of apoptotic cells by epithelial cells. We show that engulfed material is processed using the canonical corpse processing pathway involving the small GTPases Rab5 and Rab7. The phagocytic receptor Draper is present on the phagocytic cup and on nascent, phosphatidylinositol 3-phosphate (PI(3)P)- and Rab7-positive phagosomes, whereas integrins are maintained on the cell surface during engulfment. Due to the difference in subcellular localization, we investigated the role of Draper, integrins, and downstream signaling components in corpse processing. We found that some proteins were required for internalization only, while others had defects in corpse processing as well. This suggests that several of the core engulfment proteins are required for distinct steps of engulfment. We also performed double mutant analysis and found that combined loss of draper and αPS3 still resulted in a small number of engulfed vesicles. Therefore, we investigated another known engulfment receptor, Crq. We found that loss of all three receptors did not inhibit engulfment any further, suggesting that Crq does not play a role in engulfment by the follicle cells. A more complete understanding of how the engulfment and corpse processing machinery interact may enable better understanding and treatment of diseases associated with defects in engulfment by epithelial cells. PMID:27347682

  6. CNC Machining Of The Complex Copper Electrodes

    NASA Astrophysics Data System (ADS)

    Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina

    2015-07-01

    This paper presents the machining process of complex copper electrodes. Machining complex shapes in copper is difficult because the material is soft and sticky. This research presents the main steps for processing these copper electrodes at a high dimensional accuracy and a good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for the accurate CNC equipment, using smart CAD/CAM software.

  7. Gynecologic oncology group strategies to improve timeliness of publication.

    PubMed

    Bialy, Sally; Blessing, John A; Stehman, Frederick B; Reardon, Anne M; Blaser, Kim M

    2013-08-01

    The Gynecologic Oncology Group (GOG) is a multi-institution cooperative group funded by the National Cancer Institute to conduct clinical trials encompassing clinical and basic scientific research in gynecologic malignancies. These results are disseminated via publication in peer-reviewed journals. This process requires collaboration of numerous investigators located in diverse cancer research centers. Coordination of manuscript development is positioned within the Statistical and Data Center (SDC), thus allowing the SDC personnel to manage the process and refine strategies to promote earlier dissemination of results. A major initiative to improve timeliness utilizing the assignment, monitoring, and enforcement of deadlines for each phase of manuscript development is the focus of this investigation. The objective was to document improvement in timeliness by comparing deadline compliance and time to journal submission before and after the expanded administrative and technologic initiatives implemented in 2006. Major steps in the publication process include generation of first draft by the First Author and submission to SDC, Co-author review, editorial review by Publications Subcommittee, response to journal critique, and revision. Associated with each step are responsibilities of First Author to write or revise, collaborating Biostatistician to perform analysis and interpretation, and assigned SDC Clinical Trials Editorial Associate to format/revise according to journal requirements. Upon the initiation of each step, a deadline for completion is assigned. In order to improve efficiency, a publications database was developed to track potential steps in manuscript development that enables the SDC Director of Administration and the Publications Subcommittee Chair to assign, monitor, and enforce deadlines. They, in turn, report progress to Group Leadership through the Operations Committee. The success of the strategies utilized to improve the GOG publication process was assessed by comparing the timeliness of each potential step in the development of primary Phase II manuscripts during 2003-2006 versus 2007-2010. Improvement was noted in 10 of 11 identified steps resulting in a cumulative average improvement of 240 days from notification of data maturity to First Author through first submission to a journal. Moreover, the average time to journal acceptance has improved by an average of 346 days. The investigation is based only on Phase II trials to ensure comparability of manuscript complexity. Nonetheless, the procedures employed are applicable to the development of any clinical trials manuscript. The assignment, monitoring, and enforcement of deadlines for all stages of manuscript development have resulted in increased efficiency and timeliness. The positioning and support of manuscript development within the SDC provide a valuable resource to authors in meeting assigned deadlines, accomplishing peer review, and complying with journal requirements.

  8. One-Step Method to Prepare PLLA Porous Microspheres in a High-Voltage Electrostatic Anti-Solvent Process

    PubMed Central

    Wang, Ying; Zhu, Li-Hui; Chen, Ai-Zheng; Xu, Qiao; Hong, Yu-Juan; Wang, Shi-Bin

    2016-01-01

    A one-step method using a high-voltage electrostatic anti-solvent process was employed to fabricate poly-l-lactide (PLLA) porous microspheres (PMs). To address the simplification and control of the preparation process, a 2^4 full factorial experiment was performed to optimize the operating process and analyze the effect of the factors on the morphology and aerodynamic properties of the PLLA PMs, and various characterization tests were also performed. The resulting PLLA PMs exhibited an even and porous morphology with a density less than 0.4 g/cm3, a geometric mean diameter (Dg) of 10–30 μm, an aerodynamic diameter (Da) of 1–5 μm, a fine particle fraction (FPF) of 56.3%, and a porosity of 76.2%, meeting the requirements for pulmonary drug delivery. The physicochemical characterizations reveal that no significant chemical change occurred in the PLLA during the process. An investigation of its in vitro cytotoxicity and pulmonary toxicity shows no obvious toxic response, indicating good biosafety. This study indicates that the one-step method using a high-voltage electrostatic anti-solvent process has great potential in developing an inhalable drug carrier for pulmonary drug delivery. PMID:28773489
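
    As a note on the design of experiments used here: a 2^4 full factorial enumerates all 16 low/high combinations of four factors, so every main effect and interaction can be estimated. A minimal sketch in Python (the factor names are illustrative assumptions, not those of the study):

        from itertools import product

        # Four two-level factors -> 2**4 = 16 runs covering every combination.
        factors = ["voltage", "flow_rate", "polymer_conc", "collector_dist"]  # hypothetical
        for run, levels in enumerate(product((-1, +1), repeat=len(factors)), start=1):
            print(f"run {run:2d}:", dict(zip(factors, levels)))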

  9. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  10. State Standard-Setting Processes in Brief. State Academic Standards: Standard-Setting Processes

    ERIC Educational Resources Information Center

    Thomsen, Jennifer

    2014-01-01

    Concerns about academic standards, whether created by states from scratch or adopted by states under the Common Core State Standards (CCSS) banner, have drawn widespread media attention and are at the top of many state policymakers' priority lists. Recently, a number of legislatures have required additional steps, such as waiting periods for…

  11. Low-temperature wafer direct bonding of silicon and quartz glass by a two-step wet chemical surface cleaning

    NASA Astrophysics Data System (ADS)

    Wang, Chenxi; Xu, Jikai; Zeng, Xiaorun; Tian, Yanhong; Wang, Chunqing; Suga, Tadatomo

    2018-02-01

    We demonstrate a facile bonding process for combining silicon and quartz glass wafers by a two-step wet chemical surface cleaning. After post-annealing at 200 °C, strong bonding interfaces with no defects or microcracks were obtained. On the basis of the detailed surface and bonding interface characterizations, the bonding mechanism was explored and discussed. The amino groups terminated on the cleaned surfaces might contribute to the bonding strength enhancement during the annealing. This cost-effective bonding process has great potential for silicon- and glass-based heterogeneous integration without requiring a vacuum system.

  12. Gaussian process regression for geometry optimization

    NASA Astrophysics Data System (ADS)

    Denzel, Alexander; Kästner, Johannes

    2018-03-01

    We implemented a geometry optimizer based on Gaussian process regression (GPR) to find minimum structures on potential energy surfaces. We tested both a twice-differentiable form of the Matérn kernel and the squared exponential kernel; the Matérn kernel performs much better. We give a detailed description of the optimization procedures. These include overshooting the step resulting from GPR in order to obtain a higher degree of interpolation vs. extrapolation. In a benchmark against the Limited-memory Broyden-Fletcher-Goldfarb-Shanno optimizer of the DL-FIND library on 26 test systems, we found the new optimizer to generally reduce the number of required optimization steps.
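
    To make the procedure concrete, the following is a minimal sketch of GPR-based minimization in one dimension with a Matérn 5/2 kernel (a twice-differentiable member of the Matérn family) and an overshoot factor applied to each step. It is illustrative only: the kernel parameters, the toy objective, and the simple grid search over the surrogate are assumptions, not the authors' optimizer.

        import numpy as np

        def matern52(a, b, ell=1.0, sig=1.0):
            # Matérn 5/2 kernel: twice differentiable, as discussed in the paper.
            s = np.sqrt(5.0) * np.abs(a[:, None] - b[None, :]) / ell
            return sig**2 * (1.0 + s + s**2 / 3.0) * np.exp(-s)

        def gp_mean(xq, X, y, ell=1.0, noise=1e-8):
            # Posterior mean of the GPR surrogate at the query points xq.
            K = matern52(X, X, ell) + noise * np.eye(len(X))
            alpha = np.linalg.solve(K, y - y.mean())
            return matern52(xq, X, ell) @ alpha + y.mean()

        f = lambda x: (x - 2.0)**2 + np.sin(3.0 * x)    # toy 1D "energy surface"
        X = np.array([-1.0, 0.0, 4.0]); y = f(X)
        overshoot = 1.3                                  # step past the surrogate minimum
        x_cur = X[np.argmin(y)]
        for _ in range(15):
            grid = np.linspace(X.min() - 1.0, X.max() + 1.0, 2001)
            x_min = grid[np.argmin(gp_mean(grid, X, y))]
            x_new = x_cur + overshoot * (x_min - x_cur)  # overshot GPR step
            X, y = np.append(X, x_new), np.append(y, f(x_new))
            x_cur = x_new
        print(f"estimated minimum near x = {x_cur:.4f}")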

  13. Data requirements for valuing externalities: The role of existing permitting processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, A.D.; Baechler, M.C.; Callaway, J.M.

    1990-08-01

    While the assessment of externalities, or residual impacts, will place new demands on regulators, utilities, and developers, existing processes already require certain data and information that may fulfill some of the data needs for externality valuation. This paper examines existing siting, permitting, and other processes and highlights similarities and differences between their data requirements and the data required to value environmental externalities. It specifically considers existing requirements for siting new electricity resources in Oregon and compares them with the information and data needed to value externalities for such resources. This paper also presents several observations about how states can take advantage of data acquired through processes already in place as they move into an era when externalities are considered in utility decision-making. It presents other observations on the similarities and differences between the data requirements under existing processes and those for valuing externalities. This paper also briefly discusses the special case of cumulative impacts. And it presents recommendations on what steps to take in future efforts to value externalities. 35 refs., 2 tabs.

  14. Work flow of signal processing data of ground penetrating radar case of rigid pavement measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Handayani, Gunawan

    The signal processing of Ground Penetrating Radar (GPR) requires a certain work flow to obtain good results. Although GPR data look similar to seismic reflection data, they carry particular signatures that seismic reflection data do not have, owing to the coupling between the antennae and the ground surface. Because of this, GPR data should be treated differently from the seismic reflection processing work flow, even though most of the processing steps still follow the same work flow as seismic reflection data, such as filtering and predictive deconvolution. This paper presents the work flow for processing GPR data from rigid pavement measurements. The processing steps start from the raw data, proceed through the de-wow process and DC removal, and continue with the standard processes to remove noise, i.e., filtering. Some radargram features particular to rigid pavement, along with pile foundations, are presented.
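
    A minimal sketch of the trace-conditioning steps named above, in Python: de-wow by running-mean subtraction, per-trace DC removal, and zero-phase band-pass filtering. The array layout, sampling rate, and band edges are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import uniform_filter1d
        from scipy.signal import butter, filtfilt

        def dewow(traces, window=51):
            # Subtract a running mean along the time axis (axis 0) of each trace.
            return traces - uniform_filter1d(traces, size=window, axis=0)

        def remove_dc(traces):
            # Remove the constant (DC) shift of each trace.
            return traces - traces.mean(axis=0, keepdims=True)

        def bandpass(traces, fs, lo, hi, order=4):
            # Zero-phase Butterworth band-pass to suppress out-of-band noise.
            b, a = butter(order, [lo, hi], btype="band", fs=fs)
            return filtfilt(b, a, traces, axis=0)

        # Toy radargram: 512 time samples per trace, 200 traces, 8 GS/s sampling.
        rng = np.random.default_rng(0)
        raw = rng.standard_normal((512, 200)).cumsum(axis=0)  # drift mimics "wow"
        clean = bandpass(remove_dc(dewow(raw)), fs=8e9, lo=2e8, hi=1.5e9)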

  15. The Fractional Step Method Applied to Simulations of Natural Convective Flows

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.; Heinrich, Juan C.; Saxon, Jeff (Technical Monitor)

    2002-01-01

    This paper describes research done to apply the Fractional Step Method to finite-element simulations of natural convective flows in pure liquids, permeable media, and in a directionally solidified metal alloy casting. The Fractional Step Method has been applied commonly to high Reynolds number flow simulations, but is less common for low Reynolds number flows, such as natural convection in liquids and in permeable media. The Fractional Step Method offers increased speed and reduced memory requirements by allowing non-coupled solution of the pressure and the velocity components. The Fractional Step Method has particular benefits for predicting flows in a directionally solidified alloy, since other methods presently employed are not very efficient. Previously, the most suitable method for predicting flows in a directionally solidified binary alloy was the penalty method. The penalty method requires direct matrix solvers, due to the penalty term. The Fractional Step Method allows iterative solution of the finite element stiffness matrices, thereby allowing more efficient solution of the matrices. The Fractional Step Method also lends itself to parallel processing, since the velocity component stiffness matrices can be built and solved independently of each other. The finite-element simulations of a directionally solidified casting are used to predict macrosegregation in directionally solidified castings. In particular, the finite-element simulations predict the existence of 'channels' within the processing mushy zone and subsequently 'freckles' within the fully processed solid, which are known to result from macrosegregation, or what is often referred to as thermo-solutal convection. These freckles cause material property non-uniformities in directionally solidified castings; therefore many of these castings are scrapped. The phenomenon of natural convection in an alloy undergoing directional solidification, or thermo-solutal convection, will be explained. The development of the momentum and continuity equations for natural convection in a fluid, a permeable medium, and in a binary alloy undergoing directional solidification will be presented. Finally, results for natural convection in a pure liquid, natural convection in a medium with a constant permeability, and for directional solidification will be presented.
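
    The decoupling at the heart of the method can be shown in a few lines: advance the velocity without the pressure gradient, then project the intermediate field onto divergence-free fields by solving a pressure Poisson equation. Below is a minimal spectral sketch on a doubly periodic grid with advection omitted; all parameters are illustrative assumptions, and the finite-element solver described in the paper is far more elaborate.

        import numpy as np

        n, L, nu, dt = 64, 2.0 * np.pi, 0.1, 1e-3
        k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)       # integer wavenumbers
        kx, ky = np.meshgrid(k, k, indexing="ij")
        k2 = kx**2 + ky**2
        k2[0, 0] = 1.0                                     # guard the mean mode

        def project(u_hat, v_hat):
            # Step 2: solve Lap(phi) = div(u*) spectrally, subtract grad(phi).
            div_hat = 1j * kx * u_hat + 1j * ky * v_hat
            phi_hat = -div_hat / k2                        # -k2 * phi_hat = div_hat
            return u_hat - 1j * kx * phi_hat, v_hat - 1j * ky * phi_hat

        u_hat = np.zeros((n, n), complex)
        v_hat = np.zeros((n, n), complex)
        x = np.linspace(0.0, L, n, endpoint=False)
        f_hat = np.fft.fft2(np.sin(x)[:, None] * np.ones(n))  # steady forcing

        for _ in range(100):
            # Step 1: intermediate velocity u* without the pressure gradient.
            u_hat = (u_hat + dt * f_hat) / (1.0 + dt * nu * k2)  # implicit viscosity
            v_hat = v_hat / (1.0 + dt * nu * k2)
            # Step 2: projection enforces incompressibility, decoupled from step 1.
            u_hat, v_hat = project(u_hat, v_hat)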

  16. Step 1: C3 Flight Demo Data Analysis Plan

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The Data Analysis Plan (DAP) describes the data analysis that the C3 Work Package (WP) will perform in support of the Access 5 Step 1 C3 flight demonstration objectives, as well as the processes that will be used by the Flight IPT to gather and distribute the data collected to satisfy those objectives. In addition to C3 requirements, this document will encompass some Human Systems Interface (HSI) requirements in performing the C3 flight demonstrations. The C3 DAP will be used as the primary interface requirements document between the C3 Work Package and Flight Test organizations (Flight IPT and Non-Access 5 Flight Programs). In addition to providing data requirements for Access 5 flight test (piggyback technology demonstration flights, dedicated C3 technology demonstration flights, and Airspace Operations Demonstration flights), the C3 DAP will be used to request flight data from Non-Access 5 flight programs for C3 related data products.

  17. Large-Scale Traffic Microsimulation From An MPO Perspective

    DOT National Transportation Integrated Search

    1997-01-01

    One potential advancement of the four-step travel model process is the forecasting and simulation of individual activities and travel. A common concern with such an approach is that the data and computational requirements for a large-scale, regional ...

  18. Rac-WAVE-mediated actin reorganization is required for organization and maintenance of cell-cell adhesion.

    PubMed

    Yamazaki, Daisuke; Oikawa, Tsukasa; Takenawa, Tadaomi

    2007-01-01

    During cadherin-dependent cell-cell adhesion, the actin cytoskeleton undergoes dynamic reorganization in epithelial cells. Rho-family small GTPases, which regulate actin dynamics, play pivotal roles in cadherin-dependent cell-cell adhesion; however, the precise molecular mechanisms that underlie cell-cell adhesion formation remain unclear. Here we show that Wiskott-Aldrich syndrome protein family verprolin-homologous protein (WAVE)-mediated reorganization of actin, downstream of Rac plays an important role in normal development of cadherin-dependent cell-cell adhesions in MDCK cells. Rac-induced development of cadherin-dependent adhesions required WAVE2-dependent actin reorganization. The process of cell-cell adhesion is divided into three steps: formation of new cell-cell contacts, stabilization of these new contacts and junction maturation. WAVE1 and WAVE2 were expressed in MDCK cells. The functions of WAVE1 and WAVE2 were redundant in this system but WAVE2 appeared to play a more significant role. During the first step, WAVE2-dependent lamellipodial protrusions facilitated formation of cell-cell contacts. During the second step, WAVE2 recruited actin filaments to new cell-cell contacts and stabilized newly formed cadherin clusters. During the third step, WAVE2-dependent actin reorganization was required for organization and maintenance of mature cell-cell adhesions. Thus, Rac-WAVE-dependent actin reorganization is not only involved in formation of cell-cell adhesions but is also required for their maintenance.

  19. Completing the Physical Representation of Quantum Algorithms Provides a Quantitative Explanation of Their Computational Speedup

    NASA Astrophysics Data System (ADS)

    Castagnoli, Giuseppe

    2018-03-01

    The usual representation of quantum algorithms, limited to the process of solving the problem, is physically incomplete. We complete it in three steps: (i) extending the representation to the process of setting the problem, (ii) relativizing the extended representation to the problem solver to whom the problem setting must be concealed, and (iii) symmetrizing the relativized representation for time reversal to represent the reversibility of the underlying physical process. The third step projects the input state of the representation, where the problem solver is completely ignorant of the setting and thus the solution of the problem, onto one where she knows half of the solution (half of the information specifying it when the solution is an unstructured bit string). Completing the physical representation shows that the number of computation steps (oracle queries) required to solve any oracle problem in an optimal quantum way should be that of a classical algorithm endowed with advanced knowledge of half of the solution.
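
    A worked instance of the final claim, for unstructured search (my rendering of the standard illustration, not an excerpt from the paper): a database of size N = 2^n has an n-bit solution, so advanced knowledge of half of the solution leaves 2^{n/2} candidates, and the classical query count matches Grover's quantum bound.

        % A classical search endowed with half of the n solution bits needs
        % O(2^{n/2}) oracle queries, the same scaling as Grover's algorithm:
        \[
          N = 2^{n}, \qquad
          O\bigl(2^{n/2}\bigr) = O\bigl(\sqrt{N}\bigr).
        \]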

  20. Cellobiohydrolase 1 from Trichoderma reesei degrades cellulose in single cellobiose steps

    DOE PAGES

    Brady, Sonia K.; Sreelatha, Sarangapani; Feng, Yinnian; ...

    2015-12-10

    Cellobiohydrolase 1 from Trichoderma reesei (TrCel7A) processively hydrolyses cellulose into cellobiose. Although enzymatic techniques have been established as promising tools in biofuel production, a clear understanding of the motor’s mechanistic action has yet to be revealed. We develop an optical tweezers-based single-molecule (SM) motility assay for precision tracking of TrCel7A. Direct observation of motility during degradation reveals processive runs and distinct steps on the scale of 1 nm. Our studies suggest TrCel7A is not mechanically limited, can work against 20 pN loads and speeds up when assisted. Temperature-dependent kinetic studies establish the energy requirements for the fundamental stepping cycle, which likely includes energy from glycosidic bonds and other sources. Moreover, through SM measurements of isolated TrCel7A domains, we determine that the catalytic domain alone is sufficient for processive motion, providing insight into TrCel7A’s molecular motility mechanism.

  1. Cellobiohydrolase 1 from Trichoderma reesei degrades cellulose in single cellobiose steps

    PubMed Central

    Brady, Sonia K.; Sreelatha, Sarangapani; Feng, Yinnian; Chundawat, Shishir P. S.; Lang, Matthew J

    2015-01-01

    Cellobiohydrolase 1 from Trichoderma reesei (TrCel7A) processively hydrolyses cellulose into cellobiose. Although enzymatic techniques have been established as promising tools in biofuel production, a clear understanding of the motor's mechanistic action has yet to be revealed. Here, we develop an optical tweezers-based single-molecule (SM) motility assay for precision tracking of TrCel7A. Direct observation of motility during degradation reveals processive runs and distinct steps on the scale of 1 nm. Our studies suggest TrCel7A is not mechanically limited, can work against 20 pN loads and speeds up when assisted. Temperature-dependent kinetic studies establish the energy requirements for the fundamental stepping cycle, which likely includes energy from glycosidic bonds and other sources. Through SM measurements of isolated TrCel7A domains, we determine that the catalytic domain alone is sufficient for processive motion, providing insight into TrCel7A's molecular motility mechanism. PMID:26657780

  2. Characterization of Developer Application Methods Used in Fluorescent Penetrant Inspection

    NASA Astrophysics Data System (ADS)

    Brasche, L. J. H.; Lopez, R.; Eisenmann, D.

    2006-03-01

    Fluorescent penetrant inspection (FPI) is the most widely used inspection method for aviation components, seeing use in production as well as in-service inspection applications. FPI is a multiple-step process requiring attention to the process parameters for each step in order to enable a successful inspection. A multiyear program is underway to evaluate the most important factors affecting the performance of FPI, to determine whether existing industry specifications adequately address control of the process parameters, and to provide the needed engineering data to the public domain. The final step prior to the inspection is the application of developer; typical aviation inspections involve the use of dry powder (form d), usually applied using either a pressure wand or a dust storm chamber. Results from several typical dust storm chambers and wand applications have shown less than optimal performance. Measurements of indication brightness, recording of the UVA image, and in some cases formal probability of detection (POD) studies were used to assess the developer application methods. Key conclusions and initial recommendations are provided.

  3. Membrane Fusion Induced by Small Molecules and Ions

    PubMed Central

    Mondal Roy, Sutapa; Sarkar, Munna

    2011-01-01

    Membrane fusion is a key event in many biological processes. These processes are controlled by various fusogenic agents, of which proteins and peptides form the principal group. The fusion process is characterized by three major steps: inter-membrane contact; lipid mixing, forming the intermediate step; and finally pore opening and mixing of the inner contents of the cells/vesicles. These steps are governed by energy barriers, which must be overcome to complete fusion. Structural reorganization of large molecules such as proteins/peptides supplies the driving force required to overcome the energy barriers of the different intermediate steps. Small molecules/ions do not share this advantage; hence fusion induced by small molecules/ions is expected to differ from that induced by proteins/peptides. Although several reviews exist on membrane fusion, no recent review is devoted solely to membrane fusion induced by small molecules/ions. Here we intend to present how a variety of small molecules/ions act as independent fusogens. The detailed mechanism of some is well understood, but for many it is still an unanswered question. A clearer understanding of how a particular small molecule can control fusion will open up a vista to use these molecules instead of proteins/peptides to induce both in vivo and in vitro fusion processes. PMID:21660306

  4. The Military Spouse Education and Career Opportunities Program: Recommendations for an Internal Monitoring System

    DTIC Science & Technology

    2016-01-01

    Family Policy’s SECO program, which reviewed existing SECO metrics and data sources, as well as analytic methods of previous research, to determine …process that requires an iterative cycle of assessment of collected data (typically, but not solely, quantitative data) to determine whether SECO…RAND suggests five steps to develop and implement the SECO internal monitoring system: Step 1. Describe the logic or theory of how activities are

  5. Copper-catalyzed decarboxylative trifluoromethylation of allylic bromodifluoroacetates.

    PubMed

    Ambler, Brett R; Altman, Ryan A

    2013-11-01

    The development of new synthetic fluorination reactions has important implications in medicinal, agricultural, and materials chemistries. Given the prevalence and accessibility of alcohols, methods to convert alcohols to trifluoromethanes are desirable. However, this transformation typically requires four-step processes, specialty chemicals, and/or stoichiometric metals to access the trifluoromethyl-containing product. A two-step copper-catalyzed decarboxylative protocol for converting allylic alcohols to trifluoromethanes is reported. Preliminary mechanistic studies distinguish this reaction from previously reported Cu-mediated reactions.

  6. Licensing of future mobile satellite systems

    NASA Technical Reports Server (NTRS)

    Lepkowski, Ronald J.

    1990-01-01

    The regulatory process for licensing mobile satellite systems is complex and can require many years to complete. This process involves frequency allocations, national licensing, and frequency coordination. The regulatory process that resulted in the establishment of the radiodetermination satellite service (RDSS) between 1983 and 1987 is described. In contrast, each of these steps in the licensing of the mobile satellite service (MSS) is taking a significantly longer period of time to complete.

  7. Results of a State-Wide Evaluation of “Paperwork Burden” in Addiction Treatment

    PubMed Central

    Carise, Deni; Love, Meghan; Zur, Julia; McLellan, A. Thomas; Kemp, Jack

    2009-01-01

    This article chronicles three steps taken by research, clinical and state staff towards assessing, evaluating and streamlining clinical and administrative paperwork at all public outpatient addiction treatment programs in one state. The first step was an accounting of all paperwork requirements at each program. Step two included the development of time estimates for the paperwork requirements, synthesis of information across sites, providing written evaluation of the need, utility and redundancy of all forms (paperwork) collected, and suggestions for eliminating unused or unnecessary data collection and streamlining the remaining data collection. Third, the state agency hosted a meeting with the state staff, researchers and staff from all programs and agencies with state-funded contracts and took action. Paperwork reductions over the course of a 6-month outpatient treatment episode were estimated at 4–6 hours, with most of the time burden being eliminated from the intake process. PMID:19150201

  8. Advantages offered by high average power picosecond lasers

    NASA Astrophysics Data System (ADS)

    Moorhouse, C.

    2011-03-01

    As electronic devices shrink in size to reduce material costs, device size and weight, thinner materials are also utilized. Feature sizes are also decreasing, which is pushing manufacturers toward single-step laser direct-write processes as an attractive alternative to conventional, multiple-step photolithography processes, eliminating process steps and the cost of chemicals. The fragile nature of these thin materials makes them difficult to machine either mechanically or with conventional nanosecond-pulsewidth, Diode Pumped Solid State (DPSS) lasers. Picosecond laser pulses can cut materials with reduced damage regions and selectively remove thin films due to the reduced thermal effects of the shorter pulsewidth. Also, the high repetition rate allows high-speed processing for industrial applications. Selective removal of thin films for OLED patterning, silicon solar cells and flat panel displays is discussed, as well as laser cutting of transparent materials with low melting point such as Polyethylene Terephthalate (PET). For many of these thin film applications, where low pulse energy and high repetition rate are required, a novel technique for increasing throughput by using multiple beams from a single laser source is outlined.

  9. Planning for the next influenza pandemic: using the science and art of logistics.

    PubMed

    Cupp, O Shawn; Predmore, Brad G

    2011-01-01

    The complexities and challenges for healthcare providers and their efforts to provide fundamental basic items to meet the logistical demands of an influenza pandemic are discussed in this article. The supply chain, planning, and alternatives for inevitable shortages are some of the considerations associated with this emergency mass critical care situation. The planning process and support for such events are discussed in detail with several recommendations obtained from the literature and the experience from recent mass casualty incidents (MCIs). The first step in this planning process is the development of specific triage requirements during an influenza pandemic. The second step is identification of logistical resources required during such a pandemic, which are then analyzed within the proposed logistics science and art model for planning purposes. Resources highlighted within the model include allocation and use of work force, bed space, intensive care unit assets, ventilators, personal protective equipment, and oxygen. The third step is using the model to discuss in detail possible workarounds, suitable substitutes, and resource allocation. An examination is also made of the ethics surrounding palliative care within the construct of an MCI and the factors that will inevitably determine rationing and prioritizing of these critical assets to palliative care patients.

  10. Systematic development of technical textiles

    NASA Astrophysics Data System (ADS)

    Beer, M.; Schrank, V.; Gloy, Y.-S.; Gries, T.

    2016-07-01

    Technical textiles are used in various fields of applications, ranging from small scale (e.g. medical applications) to large scale products (e.g. aerospace applications). The development of new products is often complex and time consuming, due to multiple interacting parameters. These interacting parameters are production process related and also a result of the textile structure and used material. A huge number of iteration steps are necessary to adjust the process parameter to finalize the new fabric structure. A design method is developed to support the systematic development of technical textiles and to reduce iteration steps. The design method is subdivided into six steps, starting from the identification of the requirements. The fabric characteristics vary depending on the field of application. If possible, benchmarks are tested. A suitable fabric production technology needs to be selected. The aim of the method is to support a development team within the technology selection without restricting the textile developer. After a suitable technology is selected, the transformation and correlation between input and output parameters follows. This generates the information for the production of the structure. Afterwards, the first prototype can be produced and tested. The resulting characteristics are compared with the initial product requirements.

  11. Ion Implantation with in-situ Patterning for IBC Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graff, John W.

    2014-10-24

    Interdigitated back-side Contact (IBC) solar cells are the highest efficiency silicon solar cells currently on the market. Unfortunately the cost to produce these solar cells is also very high, due to the large number of processing steps required. Varian believes that only the combination of high efficiency and low cost can meet the stated goal of $1/Wp. The core of this program has been to develop an in-situ patterning capability for an ion implantation system capable of producing patterned doped regions for IBC solar cells. Such a patterning capable ion implanter can reduce the number of process steps required to manufacture IBC cells, and therefore significantly reduce the cost. The present program was organized into three phases. Phase I was to select a patterning approach and determine the patterning requirements for IBC cells. Phase II consists of construction of a Beta ion implantation system containing in-situ patterning capability. Phase III consists of shipping and installation of the ion implant system in a customer factory where it will be tested and proven in a pilot production line.

  12. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  13. A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Boyles, Carole A.

    2008-01-01

    The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rotman, D.

    After nearly a decade of work and $150 million in development costs, Exxon Research and Engineering (ER&E; Florham Park, NJ) says its natural gas conversion process based on Fischer-Tropsch technology is ready for full-scale commercialization. ER&E is looking to entice one of Exxon's other business units into building a plant based on the process. The Exxon technology makes refinery or petrochemical feedstocks from natural gas in an integrated three-step process, including a fluid-bed reactor to make synthesis gas and a hydrocarbon synthesis step using a proprietary Fischer-Tropsch catalyst. Exxon has successfully demonstrated the process at a pilot plant in Baton Rouge, LA, but says no commercialization decision has been made. ER&E estimates that commercializing the technology economically will require a large gas conversion plant, with a price tag of about $2 billion.
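
    In outline, the underlying gas-to-liquids chemistry (generic textbook stoichiometry; the account above does not disclose Exxon's proprietary catalyst or conditions) is:

        \[
          \mathrm{CH_4} + \tfrac{1}{2}\,\mathrm{O_2} \longrightarrow \mathrm{CO} + 2\,\mathrm{H_2}
          \qquad \text{(synthesis gas generation)}
        \]
        \[
          n\,\mathrm{CO} + 2n\,\mathrm{H_2} \longrightarrow (\mathrm{CH_2})_n + n\,\mathrm{H_2O}
          \qquad \text{(Fischer-Tropsch synthesis)}
        \]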

  15. Authorized limits for Fernald copper ingots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frink, N.; Kamboj, S.; Hensley, J.

    This development document contains data and analysis to support the approval of authorized limits for the unrestricted release of 59 t of copper ingots containing residual radioactive material from the U.S. Department of Energy (DOE) Fernald Environmental Management Project (FEMP). The analyses presented in this document comply with the requirements of DOE Order 5400.5, "Radiation Protection of the Public and the Environment," as well as the requirements of the proposed promulgation of this order as 10 CFR Part 834. The document was developed following the step-by-step process described in the Draft Handbook for Controlling Release for Reuse or Recycle of Property Containing Residual Radioactive Material.

  16. How development and manufacturing will need to be structured--heads of development/manufacturing. May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Nepveux, Kevin; Sherlock, Jon-Paul; Futran, Mauricio; Thien, Michael; Krumme, Markus

    2015-03-01

    Continuous manufacturing (CM) is a process technology that has been used with benefit in the chemical industry for many years for large-scale mass production of chemicals in single-purpose plants. Interest has recently arisen in expanding CM into the low-volume, high-value pharmaceutical business, with its unique requirements regarding readiness for human use and the quality, supply chain, and liability constraints in this business context. Using a fairly abstract set of definitions, this paper derives technical consequences of CM in different scenarios along the development-launch-supply axis in different business models and how they compare to batch processes. The impact of CM on functions in development is discussed, several operational models suitable for originators and other business models are presented, and specific aspects of CM are deduced from CM's technical characteristics. Organizational structures of current operations typically can support CM implementations with just minor refinements if the CM technology is limited to single steps or small sequences (bin-to-bin approach) and if the appropriate technical skill set is available. In such cases, a small, dedicated group focused on CM is recommended. The manufacturing strategy, centralized versus decentralized in light of CM processes, is discussed, along with the potential impact of significantly shortened supply lead times on the organization that runs these processes. The ultimate CM implementation may be seen by some as a totally integrated monolithic plant, one that unifies chemistry and pharmaceutical operations into one plant. The organization supporting this approach will have to reflect this change in scope and responsibility. The other extreme, admittedly futuristic at this point, would be a highly decentralized approach with multiple smaller hubs; this would require a new and different organizational structure. This processing approach would open up new opportunities for products that, because of stability constraints or individualization to patients, do not allow centralized manufacturing approaches at all. Again, the entire enterprise needs to be restructured accordingly. The situation of CM in an outsourced operation business model is discussed. Next steps for the industry are recommended. In summary, opportunistic implementation of isolated steps in existing portfolios can be implemented with minimal organizational changes; the availability of the appropriate skills is the determining factor. The implementation of more substantial sequences requires business processes that consider the portfolio, not just single products. Exploration and implementation of complete process chains with consequences for quality decisions do require appropriate organizational support. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  17. Evaluation of Lumicyano™ cyanoacrylate fuming process for the development of latent fingermarks on plastic carrier bags by means of a pseudo operational comparative trial.

    PubMed

    Farrugia, Kevin J; Deacon, Paul; Fraser, Joanna

    2014-03-01

    There are a number of studies discussing recent developments of one-step fluorescent cyanoacrylate processes. This study is a pseudo-operational trial comparing an example of a one-step fluorescent cyanoacrylate product, Lumicyano™, with the two recommended techniques for plastic carrier bags: cyanoacrylate fuming followed by basic yellow 40 (BY40) dyeing, and powder suspensions. 100 plastic carrier bags were collected from the workplace, and the items were treated as found without any additional fingermark deposition. The bags were split into three groups, and after treatment with the three techniques a comparable number of fingermarks was detected by each technique (an average of 300 fingermarks). The items treated with Lumicyano™ were sequentially processed with BY40, and an additional 43 new fingermarks were detected. Lumicyano™ appears to be a suitable technique for the development of fingermarks on plastic carrier bags, and it can help save laboratory space and time because it does not require dyeing or drying procedures. Furthermore, contrary to other one-step cyanoacrylate products, existing cyanoacrylate cabinets do not require any modification for the treatment of articles with Lumicyano™. To date, there are few peer-reviewed articles in the literature on trials related to Lumicyano™, and this study aims to help fill this gap. © 2013.

  18. Fabrication High Resolution Metrology Target By Step And Repeat Method

    NASA Astrophysics Data System (ADS)

    Dusa, Mircea

    1983-10-01

    Based on the photolithography process generally used to generate high resolution masks for semiconductor ICs, we found a very useful industrial application of laser technology. First, we have generated high resolution metrology targets which are used in industrial measurement laser interferometers as diffraction gratings. Second, we have generated these targets using a step and repeat machine, with a He-Ne laser interferometer controlled stage, as a pattern generator, due to suitable computer programming. A high resolution metrology target consists of two chromium plates, one of which is called the "rule", the other the "vernier". In Fig. 1 we have the configuration of the rule and the vernier. The rule has a succession of 3 μm lines generated as a diffraction grating on a 4 x 4 inch chromium blank. The vernier has several exposed fields (areas) having 3-15 μm lines, fields placed at very precise positions on the chromium blank surface. The high degree of uniformity, tight CD tolerances, and low defect density required by the targets create specialised problems during processing. Details of the processing, together with experimental results, will be presented. Before entering into process details, we have to point out that the dimensional requirements of the reticle target are quite similar to, or perhaps more strict than, those of LSI master masks. These requirements are presented in Fig. 2.

  19. Theoretical and experimental study of the formation conditions of stepped leaders in negative flashes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Shijun, E-mail: sj-xie@163.com; State Key Laboratory of Control and Simulation of Power System and Generation Equipment, Department of Electrical Engineering, Tsinghua University, Beijing 100084; Zeng, Rong

    2015-08-15

    Natural lightning flashes are stochastic and uncontrollable, and thus, it is difficult to observe the formation process of a downward negative stepped leader (NSL) directly and in detail. This situation has led to some dispute over the actual NSL formation mechanism, and thus has hindered improvements in the lightning shielding analysis model. In this paper, on the basis of controllable long air gap discharge experiments, the formation conditions required for NSLs in negative flashes have been studied. First, a series of simulation experiments on varying scales were designed and carried out. The NSL formation processes were observed, and several of the characteristic process parameters, including the scale, the propagation velocity, and the dark period, were obtained. By comparing the acquired formation processes and the characteristic parameters with those in natural lightning flashes, the similarity between the NSLs in the simulation experiments and those in natural flashes was proved. Then, based on the local thermodynamic equation and the space charge estimation method, the required NSL formation conditions were deduced, and the space background electric field (E_b) was proposed as the primary parameter for NSL formation. Finally, the critical value of E_b required for the formation of NSLs in natural flashes was determined to be approximately 75 kV/m by extrapolation of the results of the simulation experiments.

  20. Automating the evaluation of flood damages: methodology and potential gains

    NASA Astrophysics Data System (ADS)

    Eleutério, Julian; Martinez, Edgar Daniel

    2010-05-01

    The evaluation of flood damage potential consists of three main steps: assessing and processing data, combining data and calculating potential damages. The first step consists of modelling hazard and assessing vulnerability. In general, this step of the evaluation demands more time and investments than the others. The second step of the evaluation consists of combining spatial data on hazard with spatial data on vulnerability. Geographic Information System (GIS) is a fundamental tool in the realization of this step. GIS software allows the simultaneous analysis of spatial and matrix data. The third step of the evaluation consists of calculating potential damages by means of damage-functions or contingent analysis. All steps demand time and expertise. However, the last two steps must be realized several times when comparing different management scenarios. In addition, uncertainty analysis and sensitivity test are made during the second and third steps of the evaluation. The feasibility of these steps could be relevant in the choice of the extent of the evaluation. Low feasibility could lead to choosing not to evaluate uncertainty or to limit the number of scenario comparisons. Several computer models have been developed over time in order to evaluate the flood risk. GIS software is largely used to realise flood risk analysis. The software is used to combine and process different types of data, and to visualise the risk and the evaluation results. The main advantages of using a GIS in these analyses are: the possibility of "easily" realising the analyses several times, in order to compare different scenarios and study uncertainty; the generation of datasets which could be used any time in future to support territorial decision making; the possibility of adding information over time to update the dataset and make other analyses. However, these analyses require personnel specialisation and time. The use of GIS software to evaluate the flood risk requires personnel with a double professional specialisation. The professional should be proficient in GIS software and in flood damage analysis (which is already a multidisciplinary field). Great effort is necessary in order to correctly evaluate flood damages, and the updating and the improvement of the evaluation over time become a difficult task. The automation of this process should bring great advance in flood management studies over time, especially for public utilities. This study has two specific objectives: (1) show the entire process of automation of the second and third steps of flood damage evaluations; and (2) analyse the induced potential gains in terms of time and expertise needed in the analysis. A programming language is used within GIS software in order to automate hazard and vulnerability data combination and potential damages calculation. We discuss the overall process of flood damage evaluation. The main result of this study is a computational tool which allows significant operational gains on flood loss analyses. We quantify these gains by means of a hypothetical example. The tool significantly reduces the time of analysis and the needs for expertise. An indirect gain is that sensitivity and cost-benefit analyses can be more easily realized.
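
    A minimal sketch of the automated second and third steps (cell-wise overlay of hazard and vulnerability layers, then damage calculation through a damage function); the rasters and the depth-damage curve below are illustrative assumptions, not the authors' model.

        import numpy as np

        # Hypothetical depth-damage curve: damage fraction vs. water depth (m).
        curve_depth = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
        curve_frac = np.array([0.00, 0.15, 0.35, 0.65, 0.90])

        def potential_damage(depth, asset_value):
            # Step 3: interpolate the damage fraction and scale by asset value.
            return np.interp(depth, curve_depth, curve_frac) * asset_value

        # Step 2: co-registered rasters (random stand-ins here) combined cell-wise.
        rng = np.random.default_rng(1)
        depth = rng.uniform(0.0, 3.0, size=(100, 100))  # hazard layer: depth (m)
        value = rng.uniform(0.0, 2e5, size=(100, 100))  # vulnerability layer (EUR)

        damage = potential_damage(depth, value)
        print(f"total potential damage: {damage.sum() / 1e6:.1f} million EUR")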

  1. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
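
    A minimal sketch of the event-driven, rule-based pattern described above, in Python: user-defined rules map an event and a condition to a plug-in action. All names are hypothetical illustrations, not MATIS's actual interfaces.

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Rule:
            event: str                          # event name that triggers the rule
            condition: Callable[[dict], bool]   # predicate on the event payload
            action: Callable[[dict], None]      # plug-in program to execute

        class Workflow:
            def __init__(self, rules):
                self.rules = rules

            def dispatch(self, event, payload):
                # Execute every matching rule; a real manager would also record
                # restart information here before starting the next step.
                for r in self.rules:
                    if r.event == event and r.condition(payload):
                        r.action(payload)

        wf = Workflow([
            Rule("file_arrived",
                 lambda p: p["type"] == "raw_instrument_data",
                 lambda p: print(f"launch product pipeline for {p['name']}")),
        ])
        wf.dispatch("file_arrived", {"type": "raw_instrument_data", "name": "img_001"})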

  2. Adaptive Time Stepping for Transient Network Flow Simulation in Rocket Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok K.; Ravindran, S. S.

    2017-01-01

    Fluid and thermal transients found in rocket propulsion systems such as propellant feedline systems are complex processes involving fast phases followed by slow phases. Their time-accurate computation therefore requires the use of a short time step initially, followed by a much larger time step. Yet there are instances that involve fast-slow-fast phases. In this paper, we present a feedback-control-based adaptive time stepping algorithm and discuss its use in network flow simulation of fluid and thermal transients. The time step is automatically controlled during the simulation by monitoring changes in certain key variables and by feedback. In order to demonstrate the viability of time adaptivity for engineering problems, we applied it to simulate water hammer and cryogenic chill down in pipelines. Our comparison and validation demonstrate the accuracy and efficiency of this adaptive strategy.
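
    A minimal sketch of error-feedback step-size control on an embedded Euler/Heun pair, in Python; the test equation, tolerance, and controller gains are illustrative assumptions rather than the authors' algorithm.

        import numpy as np

        def step_pair(f, t, y, dt):
            # One explicit Euler step and one Heun step; their difference
            # estimates the local error used for feedback control.
            k1 = f(t, y)
            y_lo = y + dt * k1                    # 1st order
            k2 = f(t + dt, y_lo)
            y_hi = y + 0.5 * dt * (k1 + k2)       # 2nd order
            return y_hi, float(np.linalg.norm(y_hi - y_lo))

        def integrate(f, y0, t_end, tol=1e-6, dt=1e-4):
            t, y = 0.0, np.asarray(y0, float)
            while t < t_end:
                dt = min(dt, t_end - t)
                y_new, err = step_pair(f, t, y, dt)
                if err <= tol:                    # accept only accurate steps
                    t, y = t + dt, y_new
                # Feedback: grow dt when the error is small, shrink it when it
                # is large (safety factor 0.9, growth/shrink limits 2.0 / 0.2).
                dt *= min(2.0, max(0.2, 0.9 * np.sqrt(tol / max(err, 1e-16))))
            return t, y

        # Fast decay followed by a slow forced drift: a fast-slow transient.
        f = lambda t, y: np.array([-50.0 * y[0] + np.sin(t)])
        print(integrate(f, [1.0], 10.0))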

  3. BTS guide to good statistical practice

    DOT National Transportation Integrated Search

    2002-09-01

    Quality of data has many faces. Primarily, it has to be relevant to its users. Relevance is an outcome that is achieved through a series of steps, starting with a planning process that links user needs to data requirements. It continues through acq...

  4. Improvements to the single screw extruder

    NASA Technical Reports Server (NTRS)

    Hiemenz, C.; Ziegmann, G.; Franzkoch, B.; Hoffmanns, W.; Michaeli, W.

    1977-01-01

    The extrusion on a single screw extruder is examined. The process is divided into several steps: the dosage of the materials to be conveyed; the modification of the shape of the feeding opening which influences the feeding process and consequently the throughput of the extruder; optimizing the shape of the feeding zone to meet the specific material requirements; and plasticizing and homogenizing.

  5. Solar cell and I.C. aspects of ingot-to-slice mechanical processing

    NASA Astrophysics Data System (ADS)

    Dyer, L. D.

    1985-08-01

    Intensive efforts have been put into the growth of silicon crystals to suit today's solar cell and integrated circuit requirements. Each step of processing the crystal must also receive concentrated attention to preserve the grown-in perfection and to provide a suitable device-ready wafer at reasonable cost. A comparison is made between solar cell and I.C. requirements on the mechanical processing of silicon from ingot to wafer. Specific defects are described that can ruin the slice or can possibly lead to device degradation. These include grinding cracks, saw exit chips, crow's-foot fractures, edge cracks, and handling scratches.

  6. Solar cell and I.C. aspects of ingot-to-slice mechanical processing

    NASA Technical Reports Server (NTRS)

    Dyer, L. D.

    1985-01-01

    Intensive efforts have been put into the growth of silicon crystals to suit today's solar cell and integrated circuit requirements. Each step of processing the crystal must also receive concentrated attention to preserve the grown-in perfection and to provide a suitable device-ready wafer at reasonable cost. A comparison is made between solar cell and I.C. requirements on the mechanical processing of silicon from ingot to wafer. Specific defects are described that can ruin the slice or can possibly lead to device degradation. These include grinding cracks, saw exit chips, crow's-foot fractures, edge cracks, and handling scratches.

  7. A biogenesis step upstream of Microprocessor controls miR-17~92 expression

    PubMed Central

    Du, Peng; Wang, Longfei; Sliz, Piotr; Gregory, Richard I.

    2015-01-01

    The precise control of miR-17~92 microRNA (miRNA) is essential for normal development, and overexpression of certain miRNAs from this cluster is oncogenic. Here we find the relative expression of the six miRNAs processed from the primary (pri-miR-17~92) transcript is dynamically regulated during embryonic stem cell (ESC) differentiation. Pri-miR-17~92 is processed to a biogenesis intermediate, termed ‘progenitor-miRNA’ (pro-miRNA). Pro-miRNA is an efficient substrate for Microprocessor and is required to selectively license production of pre-miR-17, -18a, -19a, -20a, and -19b from this cluster. Two complementary cis-regulatory repression domains within pri-miR-17~92 are required for the blockade of miRNA processing through the formation of an autoinhibitory RNA conformation. The endonuclease CPSF3 (CPSF73) and the Spliceosome-associated ISY1 are responsible for pro-miRNA biogenesis and expression of all miRNAs within the cluster except miR-92. Thus, developmentally regulated pro-miRNA processing is a key step controlling miRNA expression and explains the posttranscriptional control of miR-17~92 expression in development. PMID:26255770

  8. Effect of Processing on Silk-Based Biomaterials: Reproducibility and Biocompatibility

    PubMed Central

    Wray, Lindsay S.; Hu, Xiao; Gallego, Jabier; Georgakoudi, Irene; Omenetto, Fiorenzo G.; Schmidt, Daniel; Kaplan, David L.

    2012-01-01

    Silk fibroin has been successfully used as a biomaterial for tissue regeneration. In order to prepare silk fibroin biomaterials for human implantation, a series of processing steps is required to purify the protein. Degumming to remove inflammatory sericin is a crucial step related to biocompatibility and variability in the material. Detailed characterization of silk fibroin degumming is reported. The degumming conditions significantly affected cell viability on the silk fibroin material and the ability to form three-dimensional porous scaffolds from the silk fibroin, but did not affect macrophage activation or β-sheet content in the materials formed. Methods are also provided to determine the content of residual sericin in silk fibroin solutions and to assess changes in silk fibroin molecular weight. Amino acid composition analysis was used to detect sericin residuals in silk solutions with a detection limit between 1.0% and 10% wt/wt, while fluorescence spectroscopy was used to reproducibly distinguish between silk samples with different molecular weights. Both methods are simple and require minimal sample volume, providing useful quality control tools for silk fibroin preparation processes. PMID:21695778

  9. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. Main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database, containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters, but for optically complex waters the development of more advanced retrieval methods is required.
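
    A band-ratio chlorophyll algorithm of the kind implemented in the Toolkit can be sketched in a few lines. The sketch below is illustrative only: the 705/675 nm band pair is a common choice for turbid inland waters, and the coefficients are hypothetical placeholders, not the Toolkit's calibrated values, which would come from regression against a spectral database such as the one described above.

    ```python
    # Illustrative band-ratio chlorophyll retrieval for inland waters.
    # Coefficients a and b are hypothetical; operational values are fitted
    # against in situ measurements.
    def chlorophyll_band_ratio(r705: float, r675: float,
                               a: float = 61.0, b: float = -37.0) -> float:
        """Estimate chlorophyll-a (ug/l) from subsurface irradiance
        reflectance at 705 nm and 675 nm with a linear band-ratio model."""
        if r675 <= 0:
            raise ValueError("reflectance at 675 nm must be positive")
        return a * (r705 / r675) + b

    # Example: a reflectance ratio of 1.2 gives about 36 ug/l here.
    print(chlorophyll_band_ratio(0.036, 0.030))
    ```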

  10. An integrated toolbox for processing and analysis of remote sensing data of inland and coastal waters - atmospheric correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.

  11. Automatic Coregistration and orthorectification (ACRO) and subsequent mosaicing of NASA high-resolution imagery over the Mars MC11 quadrangle, using HRSC as a baseline

    NASA Astrophysics Data System (ADS)

    Sidiropoulos, Panagiotis; Muller, Jan-Peter; Watson, Gillian; Michael, Gregory; Walter, Sebastian

    2018-02-01

    This work presents the coregistered, orthorectified and mosaiced high-resolution products of the MC11 quadrangle of Mars, which have been processed using novel, fully automatic techniques. We discuss the development of a pipeline that achieves fully automatic and parameter-independent geometric alignment of high-resolution planetary images, starting from raw input images in NASA PDS format and following all required steps to produce a coregistered geotiff image, a corresponding footprint and useful metadata. Additionally, we describe the development of a radiometric calibration technique that post-processes coregistered images to make them radiometrically consistent. Finally, we present a batch-mode application of the developed techniques over the MC11 quadrangle to validate their potential, as well as to generate end products, which are released to the planetary science community, thus assisting in the analysis of static and dynamic features of Mars. This case study is a step towards the full automation of signal processing tasks that are essential to increase the usability of planetary data but currently require extensive use of human resources.

  12. All-solid-state lithium-ion and lithium metal batteries - paving the way to large-scale production

    NASA Astrophysics Data System (ADS)

    Schnell, Joscha; Günther, Till; Knoche, Thomas; Vieider, Christoph; Köhler, Larissa; Just, Alexander; Keller, Marlou; Passerini, Stefano; Reinhart, Gunther

    2018-04-01

    Challenges and requirements for the large-scale production of all-solid-state lithium-ion and lithium metal batteries are herein evaluated via workshops with experts from renowned research institutes, material suppliers, and automotive manufacturers. Aiming to bridge the gap between materials research and industrial mass production, possible solutions for the production chains of sulfide and oxide based all-solid-state batteries from electrode fabrication to cell assembly and quality control are presented. Based on these findings, a detailed comparison of the production processes for a sulfide based all-solid-state battery with conventional lithium-ion cell production is given, showing that processes for composite electrode fabrication can be adapted with some effort, while the fabrication of the solid electrolyte separator layer and the integration of a lithium metal anode will require completely new processes. This work identifies the major steps towards mass production of all-solid-state batteries, giving insight into promising manufacturing technologies and helping stakeholders, such as machine engineering, cell producers, and original equipment manufacturers, to plan the next steps towards safer batteries with increased storage capacity.

  13. Molgenis-impute: imputation pipeline in a box.

    PubMed

    Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A

    2015-08-19

    Genotype imputation is an important procedure in current genomic analyses such as genome-wide association studies, meta-analyses and fine mapping. Although high-quality tools are available that perform the steps of this process, considerable effort and expertise are required to set up and run a best-practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the set-up and running of all the steps of the imputation process. These steps include genome build liftover, genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute at different locations and imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command line interface, without sacrificing flexibility to adapt or limiting the options of the underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS-compute workflow framework to enable customization with additional computational steps, or it can be included in other bioinformatics pipelines. It is available as open source from: https://github.com/molgenis/molgenis-imputation.
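
    One of the pipeline steps listed above, chromosomal chunking for parallel imputation, can be illustrated generically. This is not MOLGENIS-impute's own code; the window size and chunk format are assumptions used only to show how a chromosome is split into independent cluster jobs.

    ```python
    # Generic sketch of chromosomal chunking for parallel imputation
    # (window size and tuple format are illustrative assumptions).
    def chunk_chromosome(chrom: str, length_bp: int, window_bp: int = 5_000_000):
        """Yield (chrom, start, end) windows covering one chromosome."""
        for start in range(1, length_bp + 1, window_bp):
            yield chrom, start, min(start + window_bp - 1, length_bp)

    # Each chunk can be phased and imputed as an independent job, with
    # the per-chunk results merged afterwards.
    chunks = list(chunk_chromosome("chr20", 63_025_520))
    print(len(chunks), chunks[0])   # -> 13 chunks; first is ('chr20', 1, 5000000)
    ```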

  14. Framework for the quality assurance of 'omics technologies considering GLP requirements.

    PubMed

    Kauffmann, Hans-Martin; Kamp, Hennicke; Fuchs, Regine; Chorley, Brian N; Deferme, Lize; Ebbels, Timothy; Hackermüller, Jörg; Perdichizzi, Stefania; Poole, Alan; Sauer, Ursula G; Tollefsen, Knut E; Tralau, Tewes; Yauk, Carole; van Ravenzwaay, Ben

    2017-12-01

    'Omics technologies are gaining importance to support regulatory toxicity studies. Prerequisites for performing 'omics studies considering GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Workshop Applying 'omics technologies in Chemical Risk Assessment. A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections of independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective 'omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of 'omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, 'omics studies can be supported by quality measures (e.g., GLP principles) to ensure quality control, reproducibility and traceability of experiments. This enables regulators to use 'omics data in a fit-for-purpose context, which enhances their applicability for risk assessment.
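
    One of the quality measures discussed above, keeping raw data defined and all changes traceable, can be approximated with a content-hash audit trail. The sketch below is a minimal illustration under assumed file names and log format, not a GLP-validated system.

    ```python
    # Minimal raw-data traceability sketch: fingerprint the defined raw
    # data and append every processing step to an append-only audit log.
    # Paths and the JSON-lines format are illustrative assumptions.
    import hashlib
    import json
    import time

    def sha256(path: str) -> str:
        with open(path, "rb") as fh:
            return hashlib.sha256(fh.read()).hexdigest()

    def log_step(logfile: str, step: str, inputs: list) -> None:
        entry = {"time": time.strftime("%Y-%m-%dT%H:%M:%S"),
                 "step": step,
                 "inputs": {p: sha256(p) for p in inputs}}
        with open(logfile, "a") as fh:   # append-only: no silent edits
            fh.write(json.dumps(entry) + "\n")

    # Example call (hypothetical file names):
    # log_step("audit.log", "normalization", ["raw_counts.csv"])
    ```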

  15. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.
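
    The step-by-step modular workflow described above can be mimicked in a few lines: an ordered list of processing functions with a quality-control hook after each step. This is a generic sketch of the design idea (TMSEEG itself is a MATLAB application); the step names are placeholders.

    ```python
    # Generic sketch of a modular, step-by-step signal-processing workflow
    # with QC checkpoints (step names are placeholders, not TMSEEG code).
    def remove_tms_pulse(data):
        return data   # placeholder artifact-removal step

    def filter_artifacts(data):
        return data   # placeholder filtering step

    PIPELINE = [remove_tms_pulse, filter_artifacts]   # user-editable order

    def run(data, steps=PIPELINE, qc_hook=print):
        for step in steps:
            data = step(data)                      # apply one module
            qc_hook(f"QC checkpoint after {step.__name__}")
        return data
    ```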

  16. Towards Core Modelling Practices in Integrated Water Resource Management: An Interdisciplinary View of the Modelling Process

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Elsawah, S.; Pierce, S. A.; Ames, D. P.

    2016-12-01

    The National Socio-Environmental Synthesis Center (SESYNC) Core Modelling Practices Pursuit is developing resources to describe core practices for developing and using models to support integrated water resource management. These practices implement specific steps in the modelling process with an interdisciplinary perspective; however, the particular practice that is most appropriate depends on contextual aspects specific to the project. The first task of the pursuit is to identify the various steps for which implementation practices are to be described. This paper reports on those results. The paper draws on knowledge from the modelling process literature for environmental modelling (Jakeman et al., 2006), engaging stakeholders (Voinov and Bousquet, 2010) and general modelling (Banks, 1999), as well as the experience of the consortium members. We organise the steps around the four modelling phases. The planning phase identifies what is to be achieved, how and with what resources. The model is built and tested during the construction phase, and then used in the application phase. Finally, models that become part of the ongoing policy process require a maintenance phase. For each step, the paper focusses on what is to be considered or achieved, rather than how it is performed. This reflects the separation of the steps from the practices that implement them in different contexts. We support description of steps with a wide range of examples. Examples are designed to be generic and do not reflect any one project or context, but instead are drawn from common situations or from extremely different ones so as to highlight some of the issues that may arise at each step. References Banks, J. (1999). Introduction to simulation. In Proceedings of the 1999 Winter Simulation Conference. Jakeman, A. J., R. A. Letcher, and J. P. Norton (2006). Ten iterative steps in development and evaluation of environmental models. Environmental Modelling and Software 21, 602-614. Voinov, A. and F. Bousquet (2010). Modelling with stakeholders. Environmental Modelling & Software 25 (11), 1268-1281.

  17. Website Redesign: A Case Study.

    PubMed

    Wu, Jin; Brown, Janis F

    2016-01-01

    A library website redesign is a complicated and at times arduous task, requiring many different steps including determining user needs, analyzing past user behavior, examining other websites, defining design preferences, testing, marketing, and launching the site. Many different types of expertise are required over the entire process. Lessons learned from the Norris Medical Library's experience with the redesign effort may be useful to others undertaking a similar project.

  18. Manufacturing considerations for AMLCD cockpit displays

    NASA Astrophysics Data System (ADS)

    Luo, Fang-Chen

    1995-06-01

    AMLCD cockpit displays need to meet more stringent requirements compared with AMLCD commercial displays in areas such as environmental conditions, optical performance and device reliability. Special considerations are required for the manufacturing of AMLCD cockpit displays in each process step to address these issues. Some examples are: UV stable polarizers, wide-temperature LC material, strong LC glue seal, ESS test system, gray scale voltage EEPROM, etc.

  19. Fort Benning Land-Use Planning and Management Study

    DTIC Science & Technology

    1990-04-01

    process is three-tiered: (a) an initial phase that results in preliminary allocations for natural resources, (b) a second phase that focuses on...allocations of military training requirements, and (c) a final phase that resolves conflicts between the military and natural resource requirements and...assigns final allocations. Initial phase: Natural resource allocations. The first step in this phase was to make allocations among natural resource

  1. Determining the Number of Clusters in a Data Set Without Graphical Interpretation

    NASA Technical Reports Server (NTRS)

    Aguirre, Nathan S.; Davies, Misty D.

    2011-01-01

    Cluster analysis is a data mining technique that is meant to simplify the process of classifying data points. The basic clustering process requires an input of data points and the number of clusters wanted. The clustering algorithm will then pick starting C points for the clusters, which can be either random spatial points or random data points. It then assigns each data point to the nearest C point, where "nearest" usually means Euclidean distance, but some algorithms use another criterion. The next step is determining whether the clustering arrangement thus found is within a certain tolerance. If it falls within this tolerance, the process ends. Otherwise the C points are adjusted based on how many data points are in each cluster, and the steps repeat until the algorithm converges.
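
    The procedure described is essentially k-means clustering (Lloyd's algorithm); a compact sketch with random data-point initialization and a convergence tolerance:

    ```python
    import numpy as np

    def kmeans(points: np.ndarray, c: int, tol: float = 1e-6, seed: int = 0):
        """Plain k-means: nearest-center assignment, centroid update, repeat."""
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), c, replace=False)]  # random data points
        while True:
            # assign each point to the nearest center (Euclidean distance)
            labels = np.argmin(((points[:, None] - centers) ** 2).sum(-1), axis=1)
            # move each center to the mean of its assigned points
            centers_new = np.array([points[labels == k].mean(axis=0)
                                    if np.any(labels == k) else centers[k]
                                    for k in range(c)])
            if np.linalg.norm(centers_new - centers) < tol:   # within tolerance?
                return labels, centers_new
            centers = centers_new
    ```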

  2. [Information system for supporting the Nursing Care Systematization].

    PubMed

    Malucelli, Andreia; Otemaier, Kelly Rafaela; Bonnet, Marcel; Cubas, Marcia Regina; Garcia, Telma Ribeiro

    2010-01-01

    The importance, relevance and necessity of implementing the Nursing Care Systematization in the different environments of professional practice are unquestionable. Taking this as a principle, the motivation emerged for the development of an information system to support the Nursing Care Systematization, based on the Nursing Process steps and Human Needs, using the languages of nursing diagnoses, interventions and outcomes for documenting professional practice. This paper describes the methodological steps and results of the information system development: requirements elicitation, modeling, object-relational mapping, implementation and system validation.

  3. DART system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
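
    The trade-off quantified above can be captured by a simple expected-time model; the following is an illustration consistent with the abstract's findings, not the team's actual community model. If a step takes time t once through, loops back with probability p, and each repeat pass costs a fraction r of t, the expected total time is t(1 + rp/(1 - p)).

    ```python
    # Illustrative expected-time model for one process step (not the DART
    # community model): once-through time t, backward-iteration probability
    # p, and rework fraction r per repeat pass.
    def expected_step_time(t: float, p: float, r: float) -> float:
        """E[T] = t * (1 + r * p / (1 - p)), valid for 0 <= p < 1."""
        assert 0.0 <= p < 1.0
        return t * (1.0 + r * p / (1.0 - p))

    # Halving the rework fraction captures much of the benefit of halving
    # the iteration probability, mirroring the 40-80% figure quoted above.
    print(expected_step_time(t=10.0, p=0.5, r=0.6))    # 16.0
    print(expected_step_time(t=10.0, p=0.5, r=0.3))    # 13.0
    print(expected_step_time(t=10.0, p=0.25, r=0.6))   # 12.0
    ```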

  4. A flow-through chromatography process for influenza A and B virus purification.

    PubMed

    Weigel, Thomas; Solomaier, Thomas; Peuker, Alessa; Pathapati, Trinath; Wolff, Michael W; Reichl, Udo

    2014-10-01

    Vaccination is still the most efficient measure to protect against influenza virus infections. Besides the seasonal wave of influenza, pandemic outbreaks of bird or swine flu represent a high threat to the human population. With the establishment of cell culture-based processes, there is a growing demand for robust, economic and efficient downstream processes for influenza virus purification. This study focused on the development of an economic flow-through chromatographic process avoiding virus strain sensitive capture steps. Therefore, a three-step process consisting of anion exchange chromatography (AEC), Benzonase® treatment, and size exclusion chromatography with a ligand-activated core (LCC) was established, and tested for purification of two influenza A virus strains and one influenza B virus strain. The process resulted in high virus yields (≥68%) with protein contamination levels fulfilling requirements of the European Pharmacopeia for production of influenza vaccines for human use. DNA was depleted by ≥98.7% for all strains. The measured DNA concentrations per dose were close to the required limit of 10 ng DNA per dose set by the European Pharmacopeia. In addition, the added Benzonase® could be successfully removed from the product fraction. Overall, the presented downstream process could potentially represent a simple, robust and economic platform technology for production of cell culture-derived influenza vaccines.

  5. A Generic Data Harmonization Process for Cross-linked Research and Network Interaction. Construction and Application for the Lung Cancer Phenotype Database of the German Center for Lung Research.

    PubMed

    Firnkorn, D; Ganzinger, M; Muley, T; Thomas, M; Knaup, P

    2015-01-01

    Joint data analysis is a key requirement in medical research networks. Data are available in heterogeneous formats at each network partner and their harmonization is often rather complex. The objective of our paper is to provide a generic approach for the harmonization process in research networks. We applied the process when harmonizing data from three sites for the Lung Cancer Phenotype Database within the German Center for Lung Research. We developed a spreadsheet-based solution as a tool to support the harmonization process for lung cancer data and a data integration procedure based on Talend Open Studio. The harmonization process consists of eight steps describing a systematic approach for defining and reviewing source data elements and standardizing common data elements. The steps for defining common data elements and harmonizing them with local data definitions are repeated until consensus is reached. Application of this process for building the phenotype database led to a common basic data set on lung cancer with 285 structured parameters. The Lung Cancer Phenotype Database was realized as an i2b2 research data warehouse. Data harmonization is a challenging task requiring informatics skills as well as domain knowledge. Our approach facilitates data harmonization by providing guidance through a uniform process that can be applied in a wide range of projects.
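
    The core of such a process, repeatedly mapping local data elements onto agreed common elements and flagging gaps for the next review round, can be sketched as a dictionary-driven transformation. The element names and site mappings below are invented examples, not the actual phenotype database definitions.

    ```python
    # Sketch of the map-local-to-common harmonization step (element names
    # and site mappings are invented examples).
    COMMON_ELEMENTS = {"tumor_stage", "histology", "smoking_status"}

    SITE_MAPPINGS = {
        "site_a": {"T_STAGE": "tumor_stage", "HISTO": "histology"},
        "site_b": {"stage": "tumor_stage", "smoker": "smoking_status"},
    }

    def harmonize(record: dict, site: str) -> dict:
        mapping = SITE_MAPPINGS[site]
        out = {mapping[k]: v for k, v in record.items() if k in mapping}
        missing = COMMON_ELEMENTS - out.keys()
        if missing:   # feeds the next definition/review iteration
            print(f"{site}: unmapped common elements {sorted(missing)}")
        return out

    print(harmonize({"T_STAGE": "II", "HISTO": "adeno"}, "site_a"))
    ```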

  6. Study on Capturing Functional Requirements of the New Product Based on Evolution

    NASA Astrophysics Data System (ADS)

    Liu, Fang; Song, Liya; Bai, Zhonghang; Zhang, Peng

    In order to survive in an increasingly competitive global marketplace, it is important for corporations to forecast the evolutionary direction of new products rapidly and effectively. Most products in the world are developed based on the design of existing products. In product design, capturing functional requirements is a key step. Function is continuously evolving, driven by the evolution of needs and technologies, so the functional requirements of a new product can be forecast from the functions of an existing product. Eight laws of function evolution are put forward in this paper. A process model for capturing the functional requirements of a new product based on function evolution is proposed, and an example illustrates the design process.

  7. Strategies for Stabilizing Nitrogenous Compounds in ECLSS Wastewater: Top-Down System Design and Unit Operation Selection with Focus on Bio-Regenerative Processes for Short and Long Term Scenarios

    NASA Technical Reports Server (NTRS)

    Lunn, Griffin M.

    2011-01-01

    Water recycling and eventual nutrient recovery are crucial for surviving in or beyond low Earth orbit. New approaches and system architecture considerations need to be addressed to meet current and future system requirements. This paper proposes a flexible system architecture that breaks down pretreatment steps into discrete areas where multiple unit operations can be considered. An overview focusing on the urea and ammonia conversion steps allows an analysis of each process's strengths and weaknesses and synergy with upstream and downstream processing. Process technologies covered include chemical pretreatment, biological urea hydrolysis, chemical urea hydrolysis, combined nitrification-denitrification, nitrate nitrification, anammox denitrification, and regenerative ammonia absorption through struvite formation. Biological processes are considered mainly for their ability both to maximize water recovery and to produce nutrients for future plant systems. Unit operations can be considered for traditional equivalent system mass requirements in the near term, or for what they can provide downstream in the form of usable chemicals or nutrients for the long-term closed-loop ecological control and life support system. Optimally, this would allow a system to meet the former while supporting the latter without major modification.

  8. Human Factors Design Of Automated Highway Systems: Scenario Definition

    DOT National Transportation Integrated Search

    1995-09-01

    Attention to driver acceptance and performance issues during system design will be key to the success of the Automated Highway System (AHS). A first step in the process of defining driver roles and driver-system interface requirements of AHS is the d...

  9. Autoclave heat treatment for prealloyed powder products

    NASA Technical Reports Server (NTRS)

    Freche, J. C.; Ashbrook, R. L.

    1973-01-01

    Technique could be applied directly to loose powders as part of hot pressing process of forming them to any required shapes. This would eliminate initial extrusion step commonly applied to prealloyed powders, substantially reduce cost of forming operation, and result in optimum properties.

  10. Guide to good statistical practice in the transportation field

    DOT National Transportation Integrated Search

    2003-05-01

    Quality of data has many faces. Primarily, it has to be relevant (i.e., useful) to its users. Relevance is achieved through a series of steps starting with a planning process that links user needs to data requirements. It continues through acquisitio...

  11. Methods and Techniques of Revenue Forecasting.

    ERIC Educational Resources Information Center

    Caruthers, J. Kent; Wentworth, Cathi L.

    1997-01-01

    Revenue forecasting is the critical first step in most college and university budget-planning processes. While it seems a straightforward exercise, effective forecasting requires consideration of a number of interacting internal and external variables, including demographic trends, economic conditions, and broad social priorities. The challenge…

  12. DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR BENCH-SCALE REFORMER TREATABILITY STUDIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING DL

    2011-02-11

    This document describes the data quality objectives used to select archived samples located at the 222-S Laboratory for Bench-Scale Reforming testing. The type, quantity, and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluidized bed steam reformer. A determination of the adequacy of the fluidized bed steam reformer process to treat Hanford tank waste is required. The initial step in determining the adequacy of the fluidized bed steam reformer process is to select archived waste samples from the 222-S Laboratory that will be used in bench-scale tests. Analyses of the selected samples will be required to confirm the samples meet the shipping requirements and for comparison to the bench-scale reformer (BSR) test sample selection requirements.

  13. Algae to Bio-Crude in Less Than 60 Minutes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Doug

    Engineers have created a chemical process that produces useful crude oil just minutes after engineers pour in harvested algae -- a verdant green paste with the consistency of pea soup. The PNNL team combined several chemical steps into one continuous process that starts with an algae slurry that contains as much as 80 to 90 percent water. Most current processes require the algae to be dried -- an expensive process that takes a lot of energy. The research has been licensed by Genifuel Corp.

  14. Thermal contouring of forestry data: Wallops Island

    NASA Technical Reports Server (NTRS)

    Thomson, F.

    1972-01-01

    The contouring of 8-13.5 micrometer thermal data collected over a forestry site in Virginia is described. The data were collected at an altitude of 1000 ft above terrain on November 4, 1970. The site was covered on three approximately parallel lines. The purpose of the contouring was to attempt to delineate pine trees attacked by southern pine bark beetle, and to map other important terrain categories. Special processing steps were required to achieve the correct aspect ratio of the thermal data. The reference for the correction procedure was color infrared photography. Data form and quality are given, processing steps are outlined, a brief interpretation of results is given, and conclusions are presented.

  15. An implementation of the look-ahead Lanczos algorithm for non-Hermitian matrices

    NASA Technical Reports Server (NTRS)

    Freund, Roland W.; Gutknecht, Martin H.; Nachtigal, Noel M.

    1991-01-01

    The nonsymmetric Lanczos method can be used to compute eigenvalues of large sparse non-Hermitian matrices or to solve large sparse non-Hermitian linear systems. However, the original Lanczos algorithm is susceptible to possible breakdowns and potential instabilities. An implementation is presented of a look-ahead version of the Lanczos algorithm that, except for the very special situation of an incurable breakdown, overcomes these problems by skipping over those steps in which a breakdown or near-breakdown would occur in the standard process. The proposed algorithm can handle look-ahead steps of any length and requires the same number of matrix-vector products and inner products as the standard Lanczos process without look-ahead.
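
    For contrast with the look-ahead variant, a minimal sketch of the standard two-sided (biorthogonal) Lanczos recurrence is given below, halting exactly where a breakdown occurs. It follows the textbook formulation, not the authors' look-ahead implementation; note it uses one matrix-vector product with A and one with A.T per step.

    ```python
    import numpy as np

    def lanczos_two_sided(A, v1, w1, m, tol=1e-12):
        """Standard non-Hermitian Lanczos WITHOUT look-ahead. Halts on a
        (near-)breakdown, i.e. when (v_hat, w_hat) ~ 0 with neither vector
        zero -- the step a look-ahead implementation would skip over.
        Assumes w1 is not (nearly) orthogonal to v1."""
        v = v1 / np.linalg.norm(v1)
        w = w1 / (w1 @ v)                  # enforce biorthogonality (w, v) = 1
        v_old, w_old = np.zeros_like(v), np.zeros_like(w)
        beta = delta = 0.0
        alphas, betas, deltas = [], [], []
        for _ in range(m):
            Av = A @ v                     # one product with A per step
            alpha = w @ Av                 # diagonal of the tridiagonal matrix
            alphas.append(alpha)
            v_hat = Av - alpha * v - beta * v_old
            w_hat = A.T @ w - alpha * w - delta * w_old
            prod = w_hat @ v_hat
            if abs(prod) < tol:            # breakdown: look-ahead needed here
                break
            delta = np.sqrt(abs(prod))     # scale factors for the next pair
            beta = prod / delta
            v_old, w_old = v, w
            v, w = v_hat / delta, w_hat / beta
            deltas.append(delta)
            betas.append(beta)
        return np.array(alphas), np.array(betas), np.array(deltas)
    ```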

  16. Non-rigid CT/CBCT to CBCT registration for online external beam radiotherapy guidance

    NASA Astrophysics Data System (ADS)

    Zachiu, Cornel; de Senneville, Baudouin Denis; Tijssen, Rob H. N.; Kotte, Alexis N. T. J.; Houweling, Antonetta C.; Kerkmeijer, Linda G. W.; Lagendijk, Jan J. W.; Moonen, Chrit T. W.; Ries, Mario

    2018-01-01

    Image-guided external beam radiotherapy (EBRT) allows radiation dose deposition with a high degree of accuracy and precision. Guidance is usually achieved by estimating the displacements, via image registration, between cone beam computed tomography (CBCT) and computed tomography (CT) images acquired at different stages of the therapy. The resulting displacements are then used to reposition the patient such that the location of the tumor at the time of treatment matches its position during planning. Moreover, ongoing research aims to use CBCT-CT image registration for online plan adaptation. However, CBCT images are usually acquired using a small number of x-ray projections and/or low beam intensities. This often leads to the images being subject to low contrast, low signal-to-noise ratio and artifacts, which ends up hampering the image registration process. Previous studies addressed this by integrating additional image processing steps into the registration procedure. However, these steps are usually designed for particular image acquisition schemes, limiting their use to a case-by-case basis. In the current study we address CT to CBCT and CBCT to CBCT registration by means of the recently proposed EVolution registration algorithm. Contrary to previous approaches, EVolution does not require the integration of additional image processing steps in the registration scheme. Moreover, the algorithm requires a low number of input parameters, is easily parallelizable and provides an elastic deformation on a point-by-point basis. Results have shown that, relative to a pure CT-based registration, the intrinsic artifacts present in typical CBCT images have only a sub-millimeter impact on the accuracy and precision of the estimated deformation. In addition, the algorithm has low computational requirements, which are compatible with online image-based guidance of EBRT treatments.

  17. Temporally distinct transcriptional regulation of myocyte dedifferentiation and myofiber growth during muscle regeneration.

    PubMed

    Louie, Ke'ale W; Saera-Vila, Alfonso; Kish, Phillip E; Colacino, Justin A; Kahana, Alon

    2017-11-09

    Tissue regeneration requires a series of steps, beginning with generation of the necessary cell mass, followed by cell migration into damaged area, and ending with differentiation and integration with surrounding tissues. Temporal regulation of these steps lies at the heart of the regenerative process, yet its basis is not well understood. The ability of zebrafish to dedifferentiate mature "post-mitotic" myocytes into proliferating myoblasts that in turn regenerate lost muscle tissue provides an opportunity to probe the molecular mechanisms of regeneration. Following subtotal excision of adult zebrafish lateral rectus muscle, dedifferentiating residual myocytes were collected at two time points prior to cell cycle reentry and compared to uninjured muscles using RNA-seq. Functional annotation (GAGE or K-means clustering followed by GO enrichment) revealed a coordinated response encompassing epigenetic regulation of transcription, RNA processing, and DNA replication and repair, along with protein degradation and translation that would rewire the cellular proteome and metabolome. Selected candidate genes were phenotypically validated in vivo by morpholino knockdown. Rapidly induced gene products, such as the Polycomb group factors Ezh2 and Suz12a, were necessary for both efficient dedifferentiation (i.e. cell reprogramming leading to cell cycle reentry) and complete anatomic regeneration. In contrast, the late activated gene fibronectin was important for efficient anatomic muscle regeneration but not for the early step of myocyte cell cycle reentry. Reprogramming of a "post-mitotic" myocyte into a dedifferentiated myoblast requires a complex coordinated effort that reshapes the cellular proteome and rewires metabolic pathways mediated by heritable yet nuanced epigenetic alterations and molecular switches, including transcription factors and non-coding RNAs. Our studies show that temporal regulation of gene expression is programmatically linked to distinct steps in the regeneration process, with immediate early expression driving dedifferentiation and reprogramming, and later expression facilitating anatomical regeneration.

  18. 5 CFR 531.504 - Level of performance required for quality step increase.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 531.504 Level of performance required for quality step increase. A quality step increase shall not be required but may be granted only...

  19. Gallium arsenide processing for gate array logic

    NASA Technical Reports Server (NTRS)

    Cole, Eric D.

    1989-01-01

    The development of a reliable and reproducible GaAs process was initiated for applications in gate array logic. Gallium arsenide is an extremely important material for high-speed electronic applications in both digital and analog circuits, since its electron mobility is 3 to 5 times that of silicon, allowing faster switching times for devices fabricated with it. Unfortunately, GaAs is an extremely difficult material to process compared with silicon, and because of its arsenic component it can be quite dangerous (toxic), especially during some heating steps. The first stage of the research was directed at developing a simple process to produce GaAs MESFETs. The MESFET (MEtal Semiconductor Field Effect Transistor) is the most useful, practical and simple active device which can be fabricated in GaAs. It utilizes an ohmic source and drain contact separated by a Schottky gate. The gate width is typically a few microns. Several process steps were required to produce a good working device, including ion implantation, photolithography, thermal annealing, and metal deposition. A process was designed to reduce the total number of steps to a minimum so as to reduce possible errors. The first run produced no good devices. The problem occurred during an aluminum etch step while defining the gate contacts. It was found that the chemical etchant attacked the GaAs, causing trenching and subsequent severing of the active gate region from the rest of the device. Thus all devices appeared as open circuits. This problem is being corrected, and since it was the last step in the process, correction should be successful. The second planned stage involves the circuit assembly of the discrete MESFETs into logic gates for test and analysis. Finally, the third stage is to incorporate the designed process with the tested circuit in a layout that would produce the gate array as a GaAs integrated circuit.

  1. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in a collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise the interoperability requirements into properties in the dedicated UPPAAL language, i.e. the temporal logic TCTL.

  2. Integrated experimental and technoeconomic evaluation of two-stage Cu-catalyzed alkaline–oxidative pretreatment of hybrid poplar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhalla, Aditya; Fasahati, Peyman; Particka, Chrislyn A.

    2018-05-17

    When applied to recalcitrant lignocellulosic feedstocks, multi-stage pretreatments can provide more processing flexibility to optimize or balance process outcomes such as increasing delignification, preserving hemicellulose, and maximizing enzymatic hydrolysis yields. We previously reported that adding an alkaline pre-extraction step to a copper-catalyzed alkaline hydrogen peroxide (Cu-AHP) pretreatment process resulted in improved sugar yields, but the process still utilized relatively high chemical inputs (catalyst and H2O2) and enzyme loadings. We hypothesized that by increasing the temperature of the alkaline pre-extraction step in water or ethanol, we could reduce the inputs required during Cu-AHP pretreatment and enzymatic hydrolysis without significant loss in sugar yield. We also performed technoeconomic analysis to determine if ethanol or water was the more cost-effective solvent during alkaline pre-extraction and if the expense associated with increasing the temperature was economically justified.

  3. A novel patterning control strategy based on real-time fingerprint recognition and adaptive wafer level scanner optimization

    NASA Astrophysics Data System (ADS)

    Cekli, Hakki Ergun; Nije, Jelle; Ypma, Alexander; Bastani, Vahid; Sonntag, Dag; Niesing, Henk; Zhang, Linmiao; Ullah, Zakir; Subramony, Venky; Somasundaram, Ravin; Susanto, William; Matsunobu, Masazumi; Johnson, Jeff; Tabery, Cyrus; Lin, Chenxi; Zou, Yi

    2018-03-01

    In addition to lithography process and equipment induced variations, processes like etching, annealing, film deposition and planarization exhibit variations, each having their own intrinsic characteristics and leaving an effect, a 'fingerprint', on the wafers. With ever tighter requirements for CD and overlay, controlling these process-induced variations is both increasingly important and increasingly challenging in advanced integrated circuit (IC) manufacturing. For example, the on-product overlay (OPO) requirement for future nodes is approaching <3 nm, requiring the allowable budget for process-induced variance to become extremely small. Process variance control is seen as a bottleneck to further shrink, which drives the need for more sophisticated process control strategies. In this context we developed a novel 'computational process control strategy' which provides the capability of proactive control of each individual wafer, with the aim of maximizing yield without significantly impacting metrology requirements, cycle time or productivity. The complexity of the wafer process is approached by characterizing the full wafer stack, building a fingerprint library containing key patterning performance parameters such as overlay and focus. Historical wafer metrology is decomposed into dominant fingerprints using Principal Component Analysis. By associating observed fingerprints with their origin, e.g. process steps, tools and variables, we can give an inline assessment of the strength and origin of the fingerprints on every wafer. Once the fingerprint library is established, wafer-specific fingerprint correction recipes can be determined based on each wafer's processing history. Data science techniques are used in real-time to ensure that the library is adaptive. To realize this concept, ASML TWINSCAN scanners play a vital role with their on-board full wafer detection and exposure correction capabilities. High-density metrology data is created by the scanner for each wafer and on every layer during the lithography steps. This metrology data will be used to obtain the process fingerprints. Also, the per-exposure and per-wafer correction potential of the scanners will be utilized for improved patterning control. Additionally, the fingerprint library will provide early detection of excursions for inline root cause analysis and process optimization guidance.
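
    The decomposition step named above, extracting dominant fingerprints from historical wafer metrology with Principal Component Analysis, can be sketched with a plain SVD. The array shapes and names are illustrative assumptions; the paper does not publish its implementation.

    ```python
    import numpy as np

    # Hypothetical historical metrology: one row per wafer, one column per
    # measurement site (shapes and data are illustrative stand-ins).
    rng = np.random.default_rng(0)
    history = rng.normal(size=(500, 200))

    mean_map = history.mean(axis=0)                   # average fingerprint
    U, S, Vt = np.linalg.svd(history - mean_map, full_matrices=False)
    fingerprints = Vt[:5]                             # dominant spatial patterns

    def fingerprint_strengths(wafer_map: np.ndarray) -> np.ndarray:
        """Project a new wafer's map onto the library fingerprints."""
        return fingerprints @ (wafer_map - mean_map)

    # Per-wafer strengths could then drive correction recipes or flag
    # excursions for root cause analysis.
    print(fingerprint_strengths(rng.normal(size=200)))
    ```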

  4. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques, to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps as well as identifying article relevance to biosurveillance (e.g., relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
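
    The article-relevance step can be illustrated with a minimal keyword-weighted scorer. The terms, weights, and threshold below are invented placeholders; the actual pipeline relies on NLP, semantic concept matching, and trained machine-learning models.

    ```python
    # Toy article-relevance scorer for biosurveillance (weights and
    # threshold are invented placeholders, not a trained model).
    WEIGHTS = {"outbreak": 2.0, "virus": 1.5, "infection": 1.5,
               "cases": 1.0, "football": -2.0}

    def is_relevant(text: str, threshold: float = 2.0) -> bool:
        score = sum(WEIGHTS.get(tok, 0.0) for tok in text.lower().split())
        return score >= threshold

    print(is_relevant("Officials confirm new outbreak with 40 virus cases"))  # True
    print(is_relevant("Football club reports record season"))                 # False
    ```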

  5. Recycling plant, human and animal wastes to plant nutrients in a closed ecological system

    NASA Technical Reports Server (NTRS)

    Meissner, H. P.; Modell, M.

    1979-01-01

    The essential minerals for plant growth are nitrogen, phosphorus, potassium (macronutrients); calcium, magnesium, sulfur (secondary nutrients); and iron, manganese, boron, copper, zinc, chlorine, sodium, and molybdenum (micronutrients). The first step in recycling wastes will undoubtedly be oxidation of carbon and hydrogen to CO2 and H2O. Transformation of minerals to plant nutrients depends upon the mode of oxidation, which defines the state of the nutrients. For the purpose of illustrating the type of processing required, ash and off-gas compositions of an incineration process were assumed and the subsequent processing requirements were identified. Several processing schemes are described for separating sodium chloride from the ash, leading to reformulation of a nutrient solution which should be acceptable to plants.

  6. A microfluidic device for preparing next generation DNA sequencing libraries and for automating other laboratory protocols that require one or more column chromatography steps.

    PubMed

    Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C; Quake, Stephen R; Burkholder, William F

    2013-01-01

    Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation.

  7. An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)

    NASA Astrophysics Data System (ADS)

    van den Heever, Lize; Marais, Neilen; Slabber, Martin

    2016-08-01

    This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal Qualification Testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT System Engineers are extremely happy with the AQF results, but most of all with the approach and process it enforces.
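
    The central AQF idea, tests annotated with the requirement they verify and a procedure and report generated from those annotations, can be sketched generically. The decorator, requirement ID, and report format below are assumptions, not the MeerKAT CAM code.

    ```python
    # Generic sketch of requirement-annotated tests driving an auto-generated
    # qualification report (names and formats are assumptions).
    REGISTRY = []

    def verifies(requirement: str):
        def wrap(fn):
            REGISTRY.append((requirement, fn))   # link test to requirement
            return fn
        return wrap

    @verifies("CAM-REQ-042: subsystem alarms reach the operator display")
    def test_alarm_propagation():
        assert True   # annotated test and evaluation steps would go here

    def run_and_report() -> str:
        lines = []
        for req, fn in REGISTRY:
            try:
                fn()
                lines.append(f"PASS  {req}  ({fn.__name__})")
            except AssertionError:
                lines.append(f"FAIL  {req}  ({fn.__name__})")
        return "\n".join(lines)   # becomes the Qualification Test Report

    print(run_and_report())
    ```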

  9. The people side of MRP (materiel requirements planning).

    PubMed

    Lunn, T

    1994-05-01

    A montage of ideas and concepts has been successfully used to train and motivate people to use MRP II systems more effectively. This is important today because many companies are striving to achieve World Class Manufacturing status. Closed-loop Materiel Requirements Planning (MRP) systems are an integral part of the process of continuous improvement. Successfully using a formal management planning system, such as MRP II, is a fundamental stepping stone on the path toward World Class Excellence. Included in this article are techniques that companies use to reduce lead time, simplify bills of materiel, and improve schedule adherence. These and other steps all depend on the people who use the system. The focus will be on how companies use the MRP tool more effectively.

  10. Instructional System Development.

    ERIC Educational Resources Information Center

    Department of the Air Force, Washington, DC.

    The manual presents a technology of instructional design and a model for developing and conducting efficient and cost-effective Air Force instructional systems. Chapter 1 provides an overview of Instructional System Development (ISD). Chapters 2-6 each focus on a step of the process: analysis of system requirements; definition of…

  11. Guidance for Product Category Rule Development: Process, Outcome and Next Steps

    EPA Science Inventory

    Background: The development of Product Category Rules (PCRs) is inconsistent among the program operators using ISO 14025 as the basis. Furthermore, the existence of several other product claim standards and specifications that require PCRs for making product claims has the potent...

  12. Apoptosis induced in an early step of African swine fever virus entry into vero cells does not require virus replication.

    PubMed

    Carrascosa, Angel L; Bustos, María J; Nogal, María L; González de Buitrago, Gonzalo; Revilla, Yolanda

    2002-03-15

    Permissive Vero cells develop apoptosis, as characterized by DNA fragmentation, caspase activation, cytosolic release of mitochondrial cytochrome c, and flow cytometric analysis of DNA content, upon infection with African swine fever virus (ASFV). To determine the step in virus replication that triggers apoptosis, we used UV-inactivated virus, inhibitors of protein and nucleic acid synthesis, and lysosomotropic drugs that block virus uncoating. ASFV-induced apoptosis was accompanied by caspase-3 activation, which was detected even in the presence of either cytosine arabinoside or cycloheximide, indicating that viral DNA replication and protein synthesis were not required to activate the apoptotic process. The activation of caspase-3 was released from chloroquine inhibition 2 h after virus adsorption, while infection with UV-inactivated ASFV did not induce the activation of the caspase cascade. We conclude that ASFV induces apoptosis in the infected cell by an intracellular pathway, probably triggered during the process of virus uncoating.

  13. Pozzolanic filtration/solidification of radionuclides in nuclear reactor cooling water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Englehardt, J.D.; Peng, C.

    1995-12-31

    Laboratory studies investigating the feasibility of one- and two-step processes for precipitating/coprecipitating radionuclides from nuclear reactor cooling water, filtering with pozzolanic filter aid, and solidifying are reported in this paper. In the one-step process, a ferrocyanide salt and excess lime are added ahead of the filter, and the resulting filter cake solidifies by a pozzolanic reaction. The two-step process involves the addition of solidifying agents subsequent to filtration. It was found that high-surface-area diatomaceous synthetic calcium silicate powders, sold commercially as functional fillers and carriers, adsorb nickel isotopes from solution at neutral and slightly basic pH. Addition of the silicates to cooling water allowed removal of the tested metal isotopes (nickel, iron, manganese, cobalt, and cesium) simultaneously at neutral to slightly basic pH. The lime-to-diatomite ratio was the compositional characteristic with the greatest influence on final strength, with higher lime ratios giving higher strength. Diatomaceous earth filter aids manufactured without sodium fluxes exhibited higher pozzolanic activity. Pozzolanic filter cake solidified with sodium silicate and a ratio of 0.45 parts lime to 1 part diatomite had compressive strength ranging from 470 to 595 psi at a 90% confidence level. Leachability indices of all tested metals in the solidified waste were acceptable. In light of the typical requirement of removing iron and the desirability of control over process pH, a two-step process involving addition of Portland cement to the filter cake may be most generally applicable.

  14. Spatial Data Integration Using Ontology-Based Approach

    NASA Astrophysics Data System (ADS)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the need for spatial data has become so pressing that many organizations have begun to produce it themselves. In some circumstances, the need for real-time integrated data requires a sustainable mechanism for real-time integration; disaster management, for example, requires obtaining real-time data from various sources of information. One of the main challenges in such situations is the high degree of heterogeneity between the data of different organizations. To address this issue, we introduce an ontology-based method that provides sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. The approach consists of three steps. In the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and the ontology of each database is created. In the second step, the resulting ontology is inserted into the database, and the relationship of each ontology class is stored in a newly created column in the database tables. The last step is a platform based on service-oriented architecture that allows integration of the data using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy while leaving the data unchanged, and thus preserves the legacy applications built on it.
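
    A minimal sketch of the three steps, assuming toy schemas and an invented shared vocabulary (none of the names below come from the paper):

    ```python
    # Hypothetical sketch of the three-step approach in the abstract: derive a
    # per-database ontology from relational tables, then integrate via ontology
    # mapping. Table names, classes, and the mapping are illustrative assumptions.

    db_a = {"Hospital": ["id", "name", "lat", "lon"]}    # schema of source A
    db_b = {"MedCentre": ["key", "title", "x", "y"]}     # schema of source B

    # Steps 1-2: each table becomes an ontology class; columns become properties.
    def schema_to_ontology(schema, prefix):
        return {f"{prefix}:{table}": [f"{prefix}:{col}" for col in cols]
                for table, cols in schema.items()}

    ont_a = schema_to_ontology(db_a, "a")
    ont_b = schema_to_ontology(db_b, "b")

    # Step 3: ontology mapping aligns semantically equivalent classes and
    # properties, so a query against the shared concept reaches both databases.
    mapping = {
        "shared:HealthFacility": ["a:Hospital", "b:MedCentre"],
        "shared:name":           ["a:name", "b:title"],
    }
    print(ont_a, ont_b, mapping, sep="\n")
    ```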

  15. Evaluation of Vitrification Processing Step for Rocky Flats Incinerator Ash

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wigent, W.L.; Luey, J.K.; Scheele, R.D.

    In 1997, Pacific Northwest National Laboratory (PNNL) staff developed a processing option for incinerator ash at the Rocky Flats Environmental Technology Site (RFETS). This work was performed with support from Los Alamos National Laboratory (LANL) and Safe Sites of Colorado (SSOC). A description of the remediation needs for the RFETS incinerator ash is provided in a report summarizing the recommended processing option for treatment of the ash (Luey et al. 1998). The recommended process flowsheet involves a calcination pretreatment step to remove carbonaceous material followed by a vitrification processing step for a mixture of glass frit and calcined incinerator ash. Using the calcination pretreatment step to remove carbonaceous material reduced process upsets for the vitrification step, allowed for increased waste loading in the final product, and improved the quality of the final product. Figure 1.1 illustrates the flowsheet for the recommended processing option for treatment of RFETS incinerator ash. In 1998, work at PNNL further developed the recommended flowsheet through a series of studies to better define the vitrification operating parameters and to address secondary processing issues (such as characterizing the offgas species from the calcination process). Because a prototypical rotary calciner was not available for use, studies to evaluate the offgas from the calcination process were performed using a benchtop rotary calciner and laboratory-scale equipment (Luey et al. 1998). This report focuses on the vitrification process step after the ash has been calcined. Testing with full-scale containers was performed using ash surrogates and a muffle furnace similar to that planned for use at RFETS. Small-scale testing was performed using plutonium-bearing incinerator ash to verify performance of the waste form. Ash was not obtained from RFETS because of transportation requirements to calcine the incinerator ash prior to shipment of the material. Because part of PNNL's work was to characterize the ash prior to calcination and to investigate the effect of calcination on product quality, representative material was obtained from LANL. The ash obtained from LANL was selected based on its similarity to that currently stored at RFETS. The plutonium-bearing ashes obtained from LANL are likely from an RFETS incinerator, but the exact origin was not identified.

  16. SWIFT MODELLER: a Java based GUI for molecular modeling.

    PubMed

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

    MODELLER is command-line-based software that requires tedious formatting of inputs and the writing of Python scripts, which most people are not comfortable with. The visualization of output also becomes cumbersome due to verbose files. This makes the whole software protocol complex and requires extensive study of the MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting, and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow depicts its steps. It eliminates the formatting of inputs, the scripting process, and the analysis of verbose output files through automation, making the pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI; it opens and displays the protein data bank files created by MODELLER. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software by automating many of the steps in the original software protocol, saving an enormous amount of time per instance and making MODELLER very easy to work with.

  17. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation, and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body's systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  18. Equivalent model construction for a non-linear dynamic system based on an element-wise stiffness evaluation procedure and reduced analysis of the equivalent system

    NASA Astrophysics Data System (ADS)

    Kim, Euiyoung; Cho, Maenghyo

    2017-11-01

    In most non-linear analyses, the construction of a system matrix uses a large amount of computation time, comparable to the computation time required by the solving process. If the process for computing non-linear internal force matrices is substituted with an effective equivalent model that enables the bypass of numerical integrations and assembly processes used in matrix construction, efficiency can be greatly enhanced. A stiffness evaluation procedure (STEP) establishes non-linear internal force models using polynomial formulations of displacements. To efficiently identify an equivalent model, the method has evolved such that it is based on a reduced-order system. The reduction process, however, makes the equivalent model difficult to parameterize, which significantly affects the efficiency of the optimization process. In this paper, therefore, a new STEP, E-STEP, is proposed. Based on the element-wise nature of the finite element model, the stiffness evaluation is carried out element-by-element in the full domain. Since the unit of computation for the stiffness evaluation is restricted by element size, and since the computation is independent, the equivalent model can be constructed efficiently in parallel, even in the full domain. Due to the element-wise nature of the construction procedure, the equivalent E-STEP model is easily characterized by design parameters. Various reduced-order modeling techniques can be applied to the equivalent system in a manner similar to how they are applied in the original system. The reduced-order model based on E-STEP is successfully demonstrated for the dynamic analyses of non-linear structural finite element systems under varying design parameters.
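
    For orientation, stiffness evaluation procedures of this kind identify the non-linear internal force as a polynomial in the (generalized) displacement coordinates. The cubic form below is the standard one from the STEP literature and is given for context; it is not necessarily the paper's exact notation.

    ```latex
    % Cubic polynomial model of the non-linear internal force identified by a
    % stiffness evaluation procedure: linear (K1), quadratic (K2), and cubic (K3)
    % coefficients acting on displacement coordinates q_j. Standard form from the
    % STEP literature; the notation here is illustrative.
    f_i(\mathbf{q}) = \sum_{j} K^{(1)}_{ij}\, q_j
                    + \sum_{j \le k} K^{(2)}_{ijk}\, q_j q_k
                    + \sum_{j \le k \le l} K^{(3)}_{ijkl}\, q_j q_k q_l
    ```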

  19. PLAN-TA9-2443(U), Rev. B Remediated Nitrate Salt (RNS) Surrogate Formulation and Testing Standard Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Geoffrey Wayne

    2016-03-16

    This document identifies the scope and some general procedural steps for performing Remediated Nitrate Salt (RNS) Surrogate Formulation and Testing. This Test Plan describes the requirements, responsibilities, and process for preparing and testing a range of chemical surrogates intended to mimic the energetic response of waste created during processing of legacy nitrate salts. The surrogates developed are expected to bound the thermal and mechanical sensitivity of such waste, allowing for the development of the process parameters required to minimize the risk to workers and the public when processing this waste. Such parameters will be based on the worst-case kinetic parameters derived from APTAC measurements, as well as on the development of controls to mitigate sensitivities that may exist due to friction, impact, and spark. This Test Plan will define the scope and technical approach for activities that implement Quality Assurance requirements relevant to formulation and testing.

  20. The roles of SSU processome components and surveillance factors in the initial processing of human ribosomal RNA

    PubMed Central

    Sloan, Katherine E.; Bohnsack, Markus T.; Schneider, Claudia; Watkins, Nicholas J.

    2014-01-01

    During eukaryotic ribosome biogenesis, three of the mature ribosomal (r)RNAs are released from a single precursor transcript (pre-rRNA) by an ordered series of endonucleolytic cleavages and exonucleolytic processing steps. Production of the 18S rRNA requires the removal of the 5′ external transcribed spacer (5′ETS) by endonucleolytic cleavages at sites A0 and A1/site 1. In metazoans, an additional cleavage in the 5′ETS, at site A′, upstream of A0, has also been reported. Here, we have investigated how A′ processing is coordinated with assembly of the early preribosomal complex. We find that only the tUTP (UTP-A) complex is critical for A′ cleavage, while components of the bUTP (UTP-B) and U3 snoRNP are important, but not essential, for efficient processing at this site. All other factors involved in the early stages of 18S rRNA processing that were tested here function downstream from this processing step. Interestingly, we show that the RNA surveillance factors XRN2 and MTR4 are also involved in A′ cleavage in humans. A′ cleavage is largely bypassed when XRN2 is depleted, and we also discover that A′ cleavage is not always the initial processing event in all cell types. Together, our data suggest that A′ cleavage is not a prerequisite for downstream pre-rRNA processing steps and may, in fact, represent a quality control step for initial pre-rRNA transcripts. Furthermore, we show that components of the RNA surveillance machinery, including the exosome and TRAMP complexes, also play key roles in the recycling of excised spacer fragments and degradation of aberrant pre-rRNAs in human cells. PMID:24550520

  1. An Update on the NASA Planetary Science Division Research and Analysis Program

    NASA Astrophysics Data System (ADS)

    Richey, Christina; Bernstein, Max; Rall, Jonathan

    2015-01-01

    Introduction: NASA's Planetary Science Division (PSD) solicits its Research and Analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD will be changing the structure of the program elements under which the majority of planetary science R&A is done. Major changes include the creation of five core research program elements aligned with PSD's strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2014 submission changes: All PSD programs will use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address them, and the relevance of the proposed research to the program element to which it is submitted. Additional Information: Additional details will be provided on the Cassini Data Analysis Program, the Exoplanets Research Program, and the Discovery Data Analysis Program, for which Dr. Richey is the Lead Program Officer.

  2. The exact fundamental solution for the Benes tracking problem

    NASA Astrophysics Data System (ADS)

    Balaji, Bhashyam

    2009-05-01

    The universal continuous-discrete tracking problem requires the solution of a Fokker-Planck-Kolmogorov forward equation (FPKfe) for an arbitrary initial condition. Using results from quantum mechanics, the exact fundamental solution for the FPKfe is derived for the state model of arbitrary dimension with Benes drift that requires only the computation of elementary transcendental functions and standard linear algebra techniques; no ordinary or partial differential equations need to be solved. The measurement process may be an arbitrary, discrete-time nonlinear stochastic process, and the time step size can be arbitrary. Numerical examples are included, demonstrating its utility in practical implementation.
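
    For reference, the FPKfe referred to above is, in one state dimension, the standard forward equation for the transition density p(x,t) of the diffusion dx = f(x) dt + sigma dW; the Benes case corresponds to a drift f for which this equation admits an exact fundamental solution.

    ```latex
    % One-dimensional Fokker-Planck-Kolmogorov forward equation (FPKfe) for the
    % transition density p(x,t) of dx = f(x) dt + \sigma dW. The Benes drift is a
    % special choice of f for which an exact fundamental solution exists.
    \frac{\partial p(x,t)}{\partial t}
      = -\frac{\partial}{\partial x}\left[ f(x)\, p(x,t) \right]
        + \frac{\sigma^{2}}{2}\, \frac{\partial^{2} p(x,t)}{\partial x^{2}}
    ```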

  3. Optimization of Advanced ACTPol Transition Edge Sensor Bolometer Operation Using R(T,I) Transition Measurements

    NASA Astrophysics Data System (ADS)

    Salatino, Maria

    2017-06-01

    In current submm and mm cosmology experiments, the focal planes are populated by kilopixel arrays of transition edge sensors (TESes). Varying incoming power loads require frequent rebiasing of the TESes through standard current-voltage (IV) acquisition. The time required to perform IVs on such large arrays, and the resulting transient heating of the bath, reduce the sky observation time. We explore a bias-step method that significantly reduces the time required for the rebiasing process. It exploits the detectors' responses to the injection of a small square-wave signal on top of the dc bias current, together with knowledge of the shape of the detector transition R(T,I). This method has been tested on two detector arrays of the Atacama Cosmology Telescope (ACT). In this paper, we focus on the first step of the method, the estimate of the TES %Rn.
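
    The following toy sketch illustrates only the simplest reading of the bias-step idea, assuming a purely resistive TES/shunt current divider and ignoring electrothermal feedback entirely; the resistances and the measured current ratio are invented values, not ACT parameters.

    ```python
    # Toy sketch of estimating a TES operating point (%Rn) from a bias-step
    # response, under a purely resistive bias-divider model: the TES sits in
    # parallel with a shunt resistor R_sh, so dI_tes/dI_bias = R_sh/(R_sh + R_tes).
    # Electrothermal feedback is ignored, so this is illustrative only; all
    # numbers below are assumed, not ACT values.

    def percent_rn(dI_ratio, r_shunt, r_normal):
        """Infer R_tes from the measured bias-step current ratio, return %Rn."""
        r_tes = r_shunt * (1.0 / dI_ratio - 1.0)
        return 100.0 * r_tes / r_normal

    # Example: a 0.18 mOhm shunt, 8 mOhm normal resistance, measured ratio 0.04.
    print(f"{percent_rn(dI_ratio=0.04, r_shunt=0.18e-3, r_normal=8e-3):.1f} %Rn")
    ```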

  4. Three steps to writing adaptive study protocols in the early phase clinical development of new medicines

    PubMed Central

    2014-01-01

    This article attempts to define terminology and to describe a process for writing adaptive, early-phase study protocols which are transparent, self-intuitive and uniform. It provides a step-by-step guide, giving templates from projects which received regulatory authorisation and were successfully performed in the UK. During adaptive studies, evolving data are used to modify the trial design and conduct within the protocol-defined remit. Adaptations within that remit are documented using non-substantial protocol amendments, which do not require regulatory or ethical review. This concept is efficient in gathering relevant data in exploratory early-phase studies, and it is ethical as well as time- and cost-effective. PMID:24980283

  5. A Microwell-Printing Fabrication Strategy for the On-Chip Templated Biosynthesis of Protein Microarrays for Surface Plasmon Resonance Imaging

    PubMed Central

    Manuel, Gerald; Lupták, Andrej; Corn, Robert M.

    2017-01-01

    A two-step templated, ribosomal biosynthesis/printing method for the fabrication of protein microarrays for surface plasmon resonance imaging (SPRI) measurements is demonstrated. In the first step, a sixteen-component microarray of proteins is created in microwells by cell-free, on-chip protein synthesis; each microwell contains both an in vitro transcription and translation (IVTT) solution and 350 femtomoles of a specific DNA template sequence that together are used to create approximately 40 picomoles of a specific hexahistidine-tagged protein. In the second step, the protein microwell array is used to contact print one or more protein microarrays onto nitrilotriacetic acid (NTA)-functionalized gold thin-film SPRI chips for real-time SPRI surface bioaffinity adsorption measurements. Even though each microwell array element only contains approximately 40 picomoles of protein, the concentration is sufficiently high for the efficient bioaffinity adsorption and capture of the approximately 100 femtomoles of hexahistidine-tagged protein required to create each SPRI microarray element. As a first example, the protein biosynthesis process is verified with fluorescence imaging measurements of a microwell array containing His-tagged green fluorescent protein (GFP), yellow fluorescent protein (YFP) and mCherry (RFP), and then the fidelity of SPRI chips printed from this protein microwell array is ascertained by measuring the real-time adsorption of various antibodies specific to these three structurally related proteins. This greatly simplified two-step synthesis/printing fabrication methodology eliminates most of the handling, purification and processing steps normally required in the synthesis of multiple protein probes, and enables the rapid fabrication of SPRI protein microarrays from DNA templates for the study of protein-protein bioaffinity interactions. PMID:28706572

  6. Estimating psychiatric manpower requirements based on patients' needs.

    PubMed

    Faulkner, L R; Goldman, C R

    1997-05-01

    To provide a better understanding of the complexities of estimating psychiatric manpower requirements, the authors describe several approaches to estimation and present a method based on patients' needs. A five-step method for psychiatric manpower estimation is used, with estimates of data pertinent to each step, to calculate the total psychiatric manpower requirements for the United States. The method is also used to estimate the hours of psychiatric service per patient per year that might be available under current psychiatric practice and under a managed care scenario. Depending on assumptions about data at each step in the method, the total psychiatric manpower requirements for the U.S. population range from 2,989 to 358,696 full-time-equivalent psychiatrists. The number of available hours of psychiatric service per patient per year is 14.1 hours under current psychiatric practice and 2.8 hours under the managed care scenario. The key to psychiatric manpower estimation lies in clarifying the assumptions that underlie the specific method used. Even small differences in assumptions mean large differences in estimates. Any credible manpower estimation process must include discussions and negotiations between psychiatrists, other clinicians, administrators, and patients and families to clarify the treatment needs of patients and the roles, responsibilities, and job description of psychiatrists.
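
    The arithmetic behind such an estimate is simple enough to show directly; in the sketch below every input is an assumed placeholder rather than the paper's data, which is exactly why the resulting range is so wide.

    ```python
    # Minimal sketch of a needs-based manpower calculation like the one the
    # abstract describes: available psychiatrist-hours divided across patients.
    # Every input below is an assumed placeholder, not the paper's data.

    n_psychiatrists = 40_000      # assumed workforce (full-time equivalents)
    clinical_hours  = 1_500       # assumed clinical hours per FTE per year
    n_patients      = 4_000_000   # assumed treated population

    hours_per_patient = n_psychiatrists * clinical_hours / n_patients
    print(f"{hours_per_patient:.1f} h of psychiatric service per patient per year")
    # 40,000 * 1,500 / 4,000,000 = 15.0 h, the same order as the paper's 14.1 h
    # figure; small changes in any assumption move the result substantially.
    ```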

  7. Development of Level 2 Calibration and Validation Plans for GOES-R; What is a RIMP?

    NASA Technical Reports Server (NTRS)

    Kopp, Thomas J.; Belsma, Leslie O.; Mollner, Andrew K.; Sun, Ziping; Deluccia, Frank

    2017-01-01

    Calibration and Validation (CalVal) plans for Geostationary Operational Environmental Satellite R-Series (GOES-R) Level 2 (L2) products were documented via Resource, Implementation, and Management Plans (RIMPs) for all of the official L2 products required from the GOES-R Advanced Baseline Imager (ABI). In 2015 the GOES-R program decided to replace the typical CalVal plans with RIMPs that covered, for a given L2 product, what was required from that product, how it would be validated, and what tools would be used to do so. As with the Level 1b products, the intent was to cover the full spectrum of planning required for the CalVal of L2 ABI products. Instead of focusing on step-by-step procedures, the RIMPs concentrated on the criteria for each stage of the validation process (Beta, Provisional, and Full Validation) and the many elements required to prove when each stage was reached.

  8. Methods of downstream processing for the production of biodiesel from microalgae.

    PubMed

    Kim, Jungmin; Yoo, Gursong; Lee, Hansol; Lim, Juntaek; Kim, Kyochan; Kim, Chul Woong; Park, Min S; Yang, Ji-Won

    2013-11-01

    Despite receiving increasing attention during the last few decades, the production of microalgal biofuels is not yet sufficiently cost-effective to compete with that of petroleum-based conventional fuels. Among the steps required for the production of microalgal biofuels, the harvest of the microalgal biomass and the extraction of lipids from microalgae are two of the most expensive. In this review article, we surveyed a substantial amount of previous work in microalgal harvesting and lipid extraction to highlight recent progress in these areas. We also discuss new developments in the biodiesel conversion technology due to the importance of the connectivity of this step with the lipid extraction process. Furthermore, we propose possible future directions for technological or process improvements that will directly affect the final production costs of microalgal biomass-based biofuels. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Knowledge synthesis methods for integrating qualitative and quantitative data: a scoping review reveals poor operationalization of the methodological steps.

    PubMed

    Tricco, Andrea C; Antony, Jesmin; Soobiah, Charlene; Kastner, Monika; MacDonald, Heather; Cogo, Elise; Lillie, Erin; Tran, Judy; Straus, Sharon E

    2016-05-01

    To describe and compare, through a scoping review, emerging knowledge synthesis methods for integrating qualitative and quantitative evidence in health care, in terms of expertise required, similarities, differences, strengths, limitations, and steps involved in using the methods. Electronic databases (e.g., MEDLINE) were searched, and two reviewers independently selected studies and abstracted data for qualitative analysis. In total, 121 articles reporting seven knowledge synthesis methods (critical interpretive synthesis, integrative review, meta-narrative review, meta-summary, mixed studies review, narrative synthesis, and realist review) were included after screening of 17,962 citations and 1,010 full-text articles. Common similarities among methods related to the entire synthesis process, while common differences related to the research question and eligibility criteria. The most common strength was a comprehensive synthesis providing rich contextual data, whereas the most common weakness was a highly subjective method that was not reproducible. For critical interpretive synthesis, meta-narrative review, meta-summary, and narrative synthesis, guidance was not provided for some steps of the review process. Some of the knowledge synthesis methods provided guidance on all steps, whereas other methods were missing guidance on the synthesis process. Further work is needed to clarify these emerging knowledge synthesis methods. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Restoration handbook for sagebrush steppe ecosystems with emphasis on greater sage-grouse habitat—Part 3. Site level restoration decisions

    USGS Publications Warehouse

    Pyke, David A.; Chambers, Jeanne C.; Pellant, Mike; Miller, Richard F.; Beck, Jeffrey L.; Doescher, Paul S.; Roundy, Bruce A.; Schupp, Eugene W.; Knick, Steven T.; Brunson, Mark; McIver, James D.

    2017-02-14

    Sagebrush steppe ecosystems in the United States currently (2016) occur on only about one-half of their historical land area because of changes in land use, urban growth, and degradation of land, including invasions of non-native plants. The existence of many animal species depends on the existence of sagebrush steppe habitat. The greater sage-grouse (Centrocercus urophasianus) depends on large landscapes of intact habitat of sagebrush and perennial grasses for their existence. In addition, other sagebrush-obligate animals have similar requirements and restoration of landscapes for greater sage-grouse also will benefit these animals. Once sagebrush lands are degraded, they may require restoration actions to make those lands viable habitat for supporting sagebrush-obligate animals, livestock, and wild horses, and to provide ecosystem services for humans now and for future generations. When a decision is made on where restoration treatments should be applied, there are a number of site-specific decisions managers face before selecting the appropriate type of restoration. This site-level decision tool for restoration of sagebrush steppe ecosystems is organized in nine steps. Step 1 describes the process of defining site-level restoration objectives. Step 2 describes the ecological site characteristics of the restoration site. This covers soil chemistry and texture, soil moisture and temperature regimes, and the vegetation communities the site is capable of supporting. Step 3 compares the current vegetation to the plant communities associated with the site State and Transition models. Step 4 takes the manager through the process of current land uses and past disturbances that may influence restoration success. Step 5 is a brief discussion of how weather before and after treatments may impact restoration success. Step 6 addresses restoration treatment types and their potential positive and negative impacts on the ecosystem and on habitats, especially for greater sage-grouse. We discuss when passive restoration options may be sufficient and when active restoration may be necessary to achieve restoration objectives. Step 7 addresses decisions regarding post-restoration livestock grazing management. Step 8 addresses monitoring of the restoration; we discuss important aspects associated with implementation monitoring as well as effectiveness monitoring. Step 9 takes the information learned from monitoring to determine how restoration actions in the future might be adapted to improve restoration success.

  11. Hot working behavior of selective laser melted and laser metal deposited Inconel 718

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Sizova, Irina

    2018-05-01

    The production of nickel-based high-temperature components is of great importance for the transport and energy sectors. Forging of high-temperature alloys often requires expensive dies and multiple forming steps, and it yields forged parts whose tolerances require machining to create the final shape, producing a large amount of scrap. Additive manufacturing offers the possibility to print the desired shapes directly as net-shape components, requiring only little additional effort in machining. Especially for high-temperature alloys, which carry a large amount of energy per unit mass, additive manufacturing could be more energy-efficient than forging if the energy contained in the machining scrap exceeds the energy needed for powder production and laser processing. However, the microstructure and performance of 3D-printed parts will not reach the level of forged material unless further expensive processes such as hot isostatic pressing are used. Using the design freedom and the possibility to locally engineer material, additive manufacturing could be combined with forging operations into novel process chains, offering the possibility to reduce the number of forging steps and to create near-net-shape forgings with desired local properties. Some innovative process chains combining additive manufacturing and forging have been patented recently, but almost no scientific knowledge on the workability of 3D-printed preforms exists. The present study investigates the flow stress and microstructure evolution during hot working of preforms produced by laser powder deposition and selective laser melting (Figure 1) and puts forward a model for the flow stress.

  12. Requirements for the design and implementation of checklists for surgical processes.

    PubMed

    Verdaasdonk, E G G; Stassen, L P S; Widhiasmara, P P; Dankelman, J

    2009-04-01

    The use of checklists is a promising strategy for improving patient safety in all types of surgical processes inside and outside the operating room. This article aims to provide requirements for the design and implementation of checklists for surgical processes. The literature on checklist use in the operating room was reviewed based on research using Medline, PubMed, and Google Scholar. Although all the studies showed positive effects and important benefits, such as improved team cohesion, improved awareness of safety issues, and reduction of errors, their number is still limited. The motivation of team members is considered essential for compliance. Currently, no general guidelines exist for checklist design in the surgical field. Based on the authors' experiences and on guidelines used in the aviation industry, requirements for checklist design are proposed. The design depends on the checklist purpose, philosophy, and method chosen. The methods consist of the "call-do-response" approach, the "do-verify" approach, or a combination of both. The advantages and disadvantages of paper versus electronic solutions are discussed. Furthermore, a step-by-step strategy for implementing a checklist in the clinical situation is suggested. The use of structured checklists in surgical processes is most likely to be effective because it standardizes human performance and ensures that procedures are followed correctly instead of relying on human memory alone. Several studies present promising and positive first results, providing a solid basis for further investigation. Future research should focus on the effect of various checklist designs and strategies to ensure maximal compliance.
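
    As a small illustration of the two execution methods named above, the sketch below encodes checklist items tagged as "call-do-response" or "do-verify"; the items themselves are invented examples, not taken from the article.

    ```python
    # Illustrative sketch of the two checklist execution methods named in the
    # abstract. The item contents are invented examples, not from the article.

    CHECKLIST = [
        # "call-do-response": the item is read out, performed, then confirmed.
        {"method": "call-do-response", "item": "Confirm patient identity"},
        # "do-verify": actions are done from memory, then verified as a batch.
        {"method": "do-verify",        "item": "Antibiotic prophylaxis given"},
    ]

    def execute(checklist):
        for entry in checklist:
            if entry["method"] == "call-do-response":
                print(f"CALL: {entry['item']} -> DO -> RESPONSE: done")
            else:  # do-verify
                print(f"VERIFY (already done from memory): {entry['item']}")

    execute(CHECKLIST)
    ```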

  13. The research of PSD location method in micro laser welding fields

    NASA Astrophysics Data System (ADS)

    Zhang, Qiue; Zhang, Rong; Dong, Hua

    2010-11-01

    In the field of micro laser welding, besides the special requirements on the laser parameters, accurately locating the welding points is very important. This article adopts a position sensitive detector (PSD) as the core component and combines an optical system, electronic circuits, a PC, and software processing to determine the location of the welding points. The signal detection circuits use the special-purpose integrated circuit H-2476 to process the weak signal. This is an integrated circuit for high-speed, high-sensitivity optical range finding with strong noise immunity; combined with a digital filtering algorithm, it corrects for non-ideal factors and increases the measurement precision. The amplifier is the programmable amplifier LTC6915. The system uses a workbench driven by a two-dimensional stepping motor, with the computer and corresponding software determining the location of the spot weld. Clamps are designed according to the different workpieces. The system detects the PSD's output signal on-line while the workbench is moving. As the workbench moves in the X direction, the filament offset is detected dynamically. Analyzing the sampled signal from the X-axis motion allows the Y-axis moving direction to be estimated and the Y-axis displacement to be adjusted. The workbench driver is the A3979, a stepping-motor driver with an integrated translator that is easy to operate. The system adapts to the location requirements of micro laser welding, with real-time control and adjustment by computer, and can on the whole satisfy micro laser welding requirements at the 20 μm level. Using laser powder cladding technology, inter-penetration welding of high quality and reliability is achieved.
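
    For background, a one-dimensional lateral-effect PSD locates the light spot from its two electrode photocurrents; the sketch below uses the standard formula with invented example values.

    ```python
    # Background sketch: a 1-D lateral-effect PSD gives the light-spot position
    # from the two electrode photocurrents I1, I2 on an active length L:
    #     x = (L / 2) * (I2 - I1) / (I1 + I2)
    # The currents and sensor length below are assumed example values.

    def psd_position(i1, i2, length_mm):
        return 0.5 * length_mm * (i2 - i1) / (i1 + i2)

    # Example: a 10 mm sensor with a slightly off-centre spot.
    x = psd_position(i1=42e-6, i2=48e-6, length_mm=10.0)
    print(f"spot offset from centre: {x * 1000:.0f} um")  # ~333 um toward electrode 2
    ```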

  14. Full-waveform data for building roof step edge localization

    NASA Astrophysics Data System (ADS)

    Słota, Małgorzata

    2015-08-01

    Airborne laser scanning data represent flat or gently sloped areas perfectly; to date, however, accurate breakline detection is the main drawback of this technique. This issue becomes particularly important in the case of modeling buildings, where accuracy higher than the footprint size is often required. This article covers several issues related to full-waveform data registered on building step edges. First, a full-waveform data simulator was developed and is presented in this paper. Second, this article provides a full description of the changes in echo amplitude, echo width, and returned power caused by the presence of edges within the laser footprint. Additionally, two important properties of step edge echoes, peak shift and echo asymmetry, were noted and described. It was shown that these properties lead to incorrect echo positioning along the laser center line and can significantly reduce the edge points' accuracy. For these reasons, and because all points are aligned with the center of the beam regardless of the actual target position within the beam footprint, we can state that step edge points require geometric corrections. This article presents a novel algorithm for the refinement of step edge points. The main distinguishing advantage of the developed algorithm is that none of the additional data, such as emitted signal parameters, beam divergence, approximate edge geometry, or scanning settings, are required. The proposed algorithm works only on georeferenced profiles of reflected laser energy. Another major advantage is the simplicity of the calculation, allowing for very efficient data processing. Additionally, the developed method of point correction allows for the accurate determination of points lying on edges and for edge point densification. For this reason, fully automatic localization of building roof step edges based on LiDAR full-waveform data, with accuracy higher than the size of the LiDAR footprint, is feasible.
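
    A small sketch of the asymmetry idea: on an asymmetric (edge-like) energy profile, the peak and the centroid of the returned energy separate, which is one simple way a peak shift can be quantified. The profile values below are invented, not from the article's data.

    ```python
    # Illustrative sketch of the two echo properties highlighted above: the peak
    # position and the centroid of a returned-energy profile differ when the
    # echo is asymmetric (as on a roof step edge). Profile values are invented.

    def peak_and_centroid(positions, energy):
        peak = positions[max(range(len(energy)), key=energy.__getitem__)]
        centroid = sum(p * e for p, e in zip(positions, energy)) / sum(energy)
        return peak, centroid

    pos    = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]   # metres along the profile
    energy = [1.0, 4.0, 9.0, 5.0, 3.0, 2.0]   # asymmetric (edge-like) echo

    peak, centroid = peak_and_centroid(pos, energy)
    print(f"peak at {peak:.2f} m, centroid at {centroid:.2f} m, "
          f"shift {abs(peak - centroid) * 100:.1f} cm")
    ```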

  15. A Population Biology Perspective on the Stepwise Infection Process of the Bacterial Pathogen Pasteuria ramosa in Daphnia.

    PubMed

    Ebert, Dieter; Duneau, David; Hall, Matthew D; Luijckx, Pepijn; Andras, Jason P; Du Pasquier, Louis; Ben-Ami, Frida

    2016-01-01

    The infection process of many diseases can be divided into a series of steps, each one required to successfully complete the parasite's life and transmission cycle. This approach often reveals that the complex phenomenon of infection is composed of a series of simpler mechanisms. Here we demonstrate that a population biology approach, which takes into consideration the natural genetic and environmental variation at each step, can greatly aid our understanding of the evolutionary processes shaping disease traits. We focus in this review on the biology of the bacterial parasite Pasteuria ramosa and its aquatic crustacean host Daphnia, a model system for the evolutionary ecology of infectious disease. Our analysis reveals tremendous differences in the degree to which the environment, host genetics, parasite genetics and their interactions contribute to the expression of disease traits at each of seven different steps. This allows us to predict which steps may respond most readily to selection and which steps are evolutionarily constrained by an absence of variation. We show that the ability of Pasteuria to attach to the host's cuticle (attachment step) stands out as being strongly influenced by the interaction of host and parasite genotypes, but not by environmental factors, making it the prime candidate for coevolutionary interactions. Furthermore, the stepwise approach helps us understand the evolution of resistance, virulence and host ranges. The population biological approach introduced here is a versatile tool that can be easily transferred to other systems of infectious disease. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Hierarchical recruitment of ribosomal proteins and assembly factors remodels nucleolar pre-60S ribosomes.

    PubMed

    Biedka, Stephanie; Micic, Jelena; Wilson, Daniel; Brown, Hailey; Diorio-Toth, Luke; Woolford, John L

    2018-04-24

    Ribosome biogenesis involves numerous preribosomal RNA (pre-rRNA) processing events to remove internal and external transcribed spacer sequences, ultimately yielding three mature rRNAs. Removal of the internal transcribed spacer 2 (ITS2) RNA is the final step in large subunit pre-rRNA processing and begins with endonucleolytic cleavage at the C2 site of 27SB pre-rRNA. C2 cleavage requires the hierarchical recruitment of 11 ribosomal proteins and 14 ribosome assembly factors. However, the function of these proteins in C2 cleavage remained unclear. In this study, we have performed a detailed analysis of the effects of depleting proteins required for C2 cleavage and interpreted these results using cryo-electron microscopy structures of assembling 60S subunits. This work revealed that these proteins are required for remodeling of several neighborhoods, including two major functional centers of the 60S subunit, suggesting that these remodeling events form a checkpoint leading to C2 cleavage. Interestingly, when C2 cleavage is directly blocked by depleting or inactivating the C2 endonuclease, assembly progresses through all other subsequent steps. © 2018 Biedka et al.

  17. Surface Structure and Photocatalytic Activity of Nano-TiO2 Thin Film

    EPA Science Inventory

    Controlled titanium dioxide (TiO2) thin films were deposited on stainless steel surfaces using a flame aerosol synthesis technique, a one-step coating process that does not require further calcination. Solid-state characterization of the coatings was conducted by different...

  18. Fundraising for Early Childhood Programs: Getting Started and Getting Results.

    ERIC Educational Resources Information Center

    Finn, Matia

    Designed to assist practitioners serving young children and their families, this book contains information about methods of raising money and managing nonprofit organizations. Following the first chapter's introductory definition of important terms associated with the fundraising process, chapter 2 discusses some prerequisite steps required before…

  19. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  20. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  1. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  2. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  3. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  4. Lessons from Alternative Grading: Essential Qualities of Teacher Feedback

    ERIC Educational Resources Information Center

    Percell, Jay C.

    2017-01-01

    One critically important step in the instructional process is providing feedback to students, and yet providing timely and thorough feedback often lacks due attention. Reasons for this oversight range across several factors, including increased class sizes, vast content-coverage requirements, extracurricular responsibilities, and the…

  5. Dengue Virus Genome Uncoating Requires Ubiquitination.

    PubMed

    Byk, Laura A; Iglesias, Néstor G; De Maio, Federico A; Gebhard, Leopoldo G; Rossi, Mario; Gamarnik, Andrea V

    2016-06-28

    The process of genome release or uncoating after viral entry is one of the least-studied steps in the flavivirus life cycle. Flaviviruses are mainly arthropod-borne viruses, including emerging and reemerging pathogens such as dengue, Zika, and West Nile viruses. Currently, dengue virus is one of the most significant human viral pathogens transmitted by mosquitoes and is responsible for about 390 million infections every year around the world. Here, we examined for the first time molecular aspects of dengue virus genome uncoating. We followed the fate of the capsid protein and RNA genome early during infection and found that capsid is degraded after viral internalization by the host ubiquitin-proteasome system. However, proteasome activity and capsid degradation were not necessary to free the genome for initial viral translation. Unexpectedly, genome uncoating was blocked by inhibiting ubiquitination. Using different assays to bypass entry and evaluate the first rounds of viral translation, a narrow window of time during infection that requires ubiquitination but not proteasome activity was identified. In this regard, ubiquitin E1-activating enzyme inhibition was sufficient to stabilize the incoming viral genome in the cytoplasm of infected cells, causing its retention in either endosomes or nucleocapsids. Our data support a model in which dengue virus genome uncoating requires a nondegradative ubiquitination step, providing new insights into this crucial but understudied viral process. Dengue is the most significant arthropod-borne viral infection in humans. Although the number of cases increases every year, there are no approved therapeutics available for the treatment of dengue infection, and many basic aspects of the viral biology remain elusive. After entry, the viral membrane must fuse with the endosomal membrane to deliver the viral genome into the cytoplasm for translation and replication. A great deal of information has been obtained in the last decade regarding molecular aspects of the fusion step, but little is known about the events that follow this process, which leads to viral RNA release from the nucleocapsid. Here, we investigated the fate of nucleocapsid components (capsid protein and viral genome) during the infection process and found that capsid is degraded by the ubiquitin-proteasome system. However, in contrast to that observed for other RNA and DNA viruses, dengue virus capsid degradation was not responsible for genome uncoating. Interestingly, we found that dengue virus genome release requires a nondegradative ubiquitination step. These results provide the first insights into dengue virus uncoating and present new opportunities for antiviral intervention. Copyright © 2016 Byk et al.

  6. To build a mine: Prospect to product

    NASA Technical Reports Server (NTRS)

    Gertsch, Richard E.

    1992-01-01

    The terrestrial definition of ore is a quantity of earth materials containing a mineral that can be extracted at a profit. While a space-based resource-gathering operation may well be driven by other motives, such an operation should have the most favorable cost-benefit ratio possible. To this end, principles and procedures already tested by the stringent requirements of the profit motive should guide the selection, design, construction, and operation of a space-based mine. Proceeding from project initiation to a fully operational mine requires several interacting and overlapping steps, which are designed to facilitate the decision process and ensure economic viability. The steps to achieve a fully operational mine are outlined. Presuming that the approach to developing nonterrestrial resources will parallel that for developing mineral resources on Earth, we can speculate on some of the problems associated with developing lunar and asteroidal resources. The baseline for our study group was a small lunar mine and oxygen extraction facility. The development of this facility is described in accordance with the steps outlined.

  7. Organic electronics with polymer dielectrics on plastic substrates fabricated via transfer printing

    NASA Astrophysics Data System (ADS)

    Hines, Daniel R.

    Printing methods are fast becoming important processing techniques for the fabrication of flexible electronics. Some goals for flexible electronics are to produce cheap, lightweight, disposable radio frequency identification (RFID) tags; very large flexible displays that can be produced in a roll-to-roll process; and wearable electronics for both the clothing and medical industries. Such applications will require fabrication processes for the assembly of dissimilar materials onto a common substrate in ways that are compatible with organic and polymeric materials as well as traditional solid-state electronic materials. A transfer printing method has been developed with these goals and applications in mind. This printing method relies primarily on differential adhesion, where no chemical processing is performed on the device substrate. It is compatible with a wide variety of materials, with each component printed in exactly the same way, thus avoiding any mixed processing steps on the device substrate. The adhesion requirements of one material printed onto a second are studied by measuring the surface energy of both materials and by surface treatments such as plasma exposure or the application of self-assembled monolayers (SAM). Transfer printing has been developed within the context of fabricating organic electronics onto plastic substrates because these materials introduce unique opportunities associated with processing conditions not typically required for traditional semiconducting materials. Compared to silicon, organic semiconductors are soft materials that require low-temperature processing and are extremely sensitive to chemical processing and environmental contamination. The transfer printing process has been developed for the important and commonly used organic semiconducting materials pentacene (Pn) and poly(3-hexylthiophene) (P3HT). A three-step printing process has been developed by which these materials are printed onto an electrode subassembly consisting of previously printed electrodes separated by a polymer dielectric layer, all on a plastic substrate. These bottom-contact, flexible organic thin-film transistors (OTFT) have been compared to unprinted (reference) devices consisting of top-contact electrodes and a silicon dioxide dielectric layer on a silicon substrate. Printed Pn and P3HT TFTs have been shown to outperform the reference devices. This enhancement has been attributed to an annealing under pressure of the organic semiconducting material.

  8. A LEAN approach toward automated analysis and data processing of polymers using proton NMR spectroscopy.

    PubMed

    de Brouwer, Hans; Stegeman, Gerrit

    2011-02-01

    To maximize utilization of expensive laboratory instruments and to make the most effective use of skilled human resources, the entire chain of data processing, calculation, and reporting that is needed to transform raw NMR data into meaningful results was automated. The LEAN process-improvement tools were used to identify non-value-added steps in the existing process. These steps were eliminated using an in-house developed software package, which allowed us to meet the key requirement of improving quality and reliability compared with the existing process while freeing up valuable human resources and increasing productivity. Reliability and quality were improved by the consistent data treatment performed by the software and the uniform administration of results. Automating a single NMR spectrometer led to a reduction in operator time of 35%, a doubling of the annual sample throughput from 1400 to 2800, and a reduction in turnaround time from 6 days to less than 2. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
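
    A minimal sketch of what such an automated chain might look like, assuming invented integration windows and a made-up composition metric; the in-house package is not described at this level of detail, so everything below is illustrative.

    ```python
    # Hypothetical sketch of the automated chain described above: raw NMR data
    # in, result out, with no manual steps in between. The function names,
    # integration windows, and composition metric are illustrative assumptions.

    def integrate_region(spectrum, lo_ppm, hi_ppm):
        # Stand-in for peak integration over a chemical-shift window.
        return sum(v for ppm, v in spectrum if lo_ppm <= ppm <= hi_ppm)

    def analyze(spectrum):
        a = integrate_region(spectrum, 6.5, 8.0)   # assumed aromatic window
        b = integrate_region(spectrum, 0.5, 2.5)   # assumed aliphatic window
        return {"comonomer_pct": 100.0 * a / (a + b)}

    def report(sample_id, result):
        # Uniform administration of results replaces per-analyst spreadsheets.
        print(f"{sample_id}: {result['comonomer_pct']:.1f} % comonomer")

    spectrum = [(7.2, 30.0), (1.3, 70.0)]          # toy (ppm, intensity) pairs
    report("sample-001", analyze(spectrum))
    ```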

  9. Investigation of test methods, material properties and processes for solar cell encapsulants

    NASA Technical Reports Server (NTRS)

    Willis, P. B.; Baum, B.

    1983-01-01

    The goal of the program is to identify, test, evaluate, and recommend encapsulation materials and processes for the fabrication of cost-effective and long-life solar modules. Of the $18 (1948 $) per square meter allocated for the encapsulation components, approximately 50% of the cost ($9/sq m) may be taken by the load-bearing component. Due to the proportionally high cost of this element, lower-cost materials were investigated. Wood-based products were found to be the lowest-cost structural materials for module construction; however, they require protection from rainwater and humidity in order to attain dimensional stability. The cost of a wood-product-based substrate must therefore include raw material costs plus the cost of additional processing to impart hygroscopic inertness. This protection is provided by a two-step, or split, process in which a flexible laminate containing the cell string is prepared first in a vacuum process and then adhesively attached with a back cover film to the hardboard in a subsequent step.

  10. Successive membrane separation processes simplify concentration of lipases produced by Aspergillus niger by solid-state fermentation.

    PubMed

    Reinehr, Christian Oliveira; Treichel, Helen; Tres, Marcus Vinicius; Steffens, Juliana; Brião, Vandré Barbosa; Colla, Luciane Maria

    2017-06-01

    In this study, we developed a simplified method for producing, separating, and concentrating lipases derived from solid-state fermentation of agro-industrial residues by filamentous fungi. First, we used Aspergillus niger to produce lipases with hydrolytic activity. We analyzed the separation and concentration of enzymes using membrane separation processes. The sequential use of microfiltration and ultrafiltration processes made it possible to obtain concentrates with enzymatic activities much higher than those in the initial extract. The permeate flux was higher than 60 L/m²·h during microfiltration using 20- and 0.45-µm membranes and during ultrafiltration using 100- and 50-kDa membranes, where fouling was reversible during the filtration steps, thereby indicating that the fouling may be removed by cleaning processes. These results demonstrate the feasibility of lipase production using A. niger by solid-state fermentation of agro-industrial residues, followed by successive tangential filtration with membranes, which simplifies the separation and concentration steps that are typically required in downstream processes.
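
    For readers interpreting the flux figures, the sketch below applies the standard resistance-in-series form of Darcy's law often used with such data; only the >60 L/m²·h flux comes from the abstract, while the transmembrane pressure, viscosity and clean-water flux are hypothetical placeholders.

        # Resistance-in-series reading of flux data: J = TMP / (mu * R_total),
        # with R_total = R_membrane + R_fouling. TMP, viscosity and the
        # clean-water flux below are assumptions, not values from the study.
        TMP = 1.0e5   # transmembrane pressure, Pa (hypothetical)
        MU = 1.0e-3   # permeate viscosity, Pa.s (water near 20 C)

        def flux_m_per_s(l_per_m2_h: float) -> float:
            """Convert L/(m^2 h) to m/s (m^3 per m^2 per s)."""
            return l_per_m2_h * 1e-3 / 3600.0

        J_clean = flux_m_per_s(80.0)    # hypothetical clean-membrane flux
        J_fouled = flux_m_per_s(60.0)   # order of the reported flux

        R_membrane = TMP / (MU * J_clean)
        R_total = TMP / (MU * J_fouled)
        R_fouling = R_total - R_membrane  # reversible part, removed by cleaning

        print(f"R_membrane = {R_membrane:.2e} 1/m, R_fouling = {R_fouling:.2e} 1/m")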

  11. Industrial Photogrammetry - Accepted Metrology Tool or Exotic Niche

    NASA Astrophysics Data System (ADS)

    Bösemann, Werner

    2016-06-01

    New production technologies like 3D printing and other additive manufacturing technologies have changed the industrial manufacturing process, a change often referred to as the next industrial revolution or, for short, Industry 4.0. Such Cyber-Physical Production Systems combine the virtual and real worlds through digitization, model building, process simulation and optimization. It is commonly understood that measurement technologies are the key to combining the real and virtual worlds (e.g., [Schmitt 2014]). This change from measurement as a quality control tool to a fully integrated step in the production process has also changed the requirements for 3D metrology solutions. Key words like MAA (Measurement Assisted Assembly) illustrate this new position of metrology in the industrial production process. At the same time, it is obvious that these processes not only require more measurements but also systems that deliver the required information at high density in a short time. Here, optical solutions including photogrammetry for 3D measurements have big advantages over traditional mechanical CMMs. The paper describes the relevance of different photogrammetric solutions, including the state of the art, industry requirements and application examples.

  12. Improving government regulations: a guidebook for conservation and renewable energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neese, R. J.; Scheer, R. M.; Marasco, A. L.

    1981-04-01

    An integrated view of the Office of Conservation and Solar Energy (CS) policy making, encompassing both administrative procedures and policy analysis, is presented. Chapter One very briefly sketches each step in the development of a significant regulation, noting important requirements and participants. Chapter Two expands upon the Overview, providing the details of the process, the rationale and source of requirements, concurrence procedures, and advice on the timing and synchronization of steps. Chapter Three explains the types of analysis documents that may be required for a program. Regulatory Analyses, Environmental Impact Statements, Urban and Community Impact Analyses, and Regulatory Flexibility Analyses are all discussed. Specific information to be included in the documents and the circumstances under which the documents need to be prepared are explained. Chapter Four is a step-by-step discussion of how to do good analysis. Use of models and data bases is discussed. Policy objectives, alternatives, and decision making are explained. In Chapter Five, guidance is provided on identifying the public that would most likely be interested in the regulation, involving its constituents in a dialogue with CS, evaluating and handling comments, and engineering the final response. Chapter Six provides direction on planning the evaluation, monitoring the regulation's success once it has been promulgated, and allowing for constructive support or criticism from outside DOE. (MCW)

  13. How Development and Manufacturing Will Need to Be Structured-Heads of Development/Manufacturing May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Nepveux, Kevin; Sherlock, Jon-Paul; Futran, Mauricio; Thien, Michael; Krumme, Markus

    2015-03-01

    Continuous manufacturing (CM) is a process technology that has been used with benefit in the chemical industry for many years for large-scale mass production of chemicals in single-purpose plants. Interest has recently grown in expanding CM into the low-volume, high-value pharmaceutical business, with its unique requirements regarding readiness for human use and the quality, supply chain, and liability constraints of this business context. Using a fairly abstract set of definitions, this paper derives the technical consequences of CM in different scenarios along the development-launch-supply axis in different business models and compares them to batch processes. The impact of CM on functions in development is discussed, several operational models suitable for originators and other business models are presented, and specific aspects of CM are deduced from its technical characteristics. Organizational structures of current operations can typically support CM implementations with just minor refinements if the CM technology is limited to single steps or small sequences (a bin-to-bin approach) and if the appropriate technical skill set is available. In such cases, a small, dedicated group focused on CM is recommended. The manufacturing strategy, centralized versus decentralized in light of CM processes, is discussed, along with the potential impact of significantly shortened supply lead times on the organization that runs these processes. The ultimate CM implementation may be seen by some as a totally integrated monolithic plant, one that unifies chemistry and pharmaceutical operations into one plant. The organization supporting this approach will have to reflect this change in scope and responsibility. The other extreme, admittedly futuristic at this point, would be a highly decentralized approach with multiple smaller hubs; this would require a new and different organizational structure. This processing approach would open up new opportunities for products that, because of stability constraints or individualization to patients, do not allow centralized manufacturing approaches at all. Again, the entire enterprise needs to be restructured accordingly. The situation of CM in an outsourced-operation business model is discussed, and next steps for the industry are recommended. In summary, opportunistic implementation of isolated steps in existing portfolios can proceed with minimal organizational changes; the availability of the appropriate skills is the determining factor. The implementation of more substantial sequences requires business processes that consider the portfolio, not just single products. Exploration and implementation of complete process chains with consequences for quality decisions do require appropriate organizational support.

  14. Continuous Energy Improvement in Motor Driven Systems - A Guidebook for Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert A. McCoy and John G. Douglass

    2014-02-01

    This guidebook provides a step-by-step approach to developing a motor system energy-improvement action plan. An action plan includes which motors should be repaired or replaced with higher efficiency models, recommendations on maintaining a spares inventory, and discussion of improvements in maintenance practices. The guidebook is the successor to DOE’s 1997 Energy Management for Motor Driven Systems. It builds on its predecessor publication by including topics such as power transmission systems and matching driven equipment to process requirements in addition to motors.

  15. Al Control in High Titanium Ferro with Low Oxygen Prepared by Thermite Reaction

    NASA Astrophysics Data System (ADS)

    Dou, Zhi-he; Wang, Cong; Fan, Shi-gang; Shi, Guan-yong; Zhang, Ting-an

    Building on previous work, this paper proposes a new short-stage process of staged, intensified aluminothermic reduction to prepare high-titanium ferroalloy with low O and Al contents. We investigated the effects of a combined Al-Ca-Si reducing agent, slag type and stepwise reduction conditions on the Al content and its distribution in the alloy. The results show that stepwise reduction effectively reduces not only the oxygen content in the alloy but also the Al content. For instance, the oxygen content in the high-titanium ferroalloy is within 1%–4%, and the Al content is within 1%–5%. Its quality meets the requirements for high-titanium ferroalloy prepared by the remelting process.

  16. Water requirements of the aluminum industry

    USGS Publications Warehouse

    Conklin, Howard L.

    1956-01-01

    Aluminum is unique among metals in the way it is obtained from its ore. The first step is to produce alumina, a white powder that bears no resemblance to the bauxite from which it is derived or to the metallic aluminum to which it is reduced by electrolytic action in a second step. Each step requires a complete plant facility, and the plants may be adjacent or separated by as much as the width of the North American continent. Field investigations of every alumina plant and reduction works in the United States were undertaken to determine the industry's water use. Detailed studies were made of process and plant layout so that a water balance could be made for each plant to determine not only the gross water intake but also an approximation of the consumptive use of water. Water requirements of alumina plants range from 0.28 to 1.10 gallons per pound of alumina; the average for the industry is 0.66 gallon. Water requirements of reduction works vary considerably more, ranging from 1.24 to 36.33 gallons per pound of aluminum, and average 14.62 gallons. All alumina plants in the United States derive alumina from bauxite by the Bayer process or by the Combination process, a modification of the Bayer process. Although the chemical process for obtaining alumina from bauxite is essentially the same at all plants, different procedures are employed to cool the sodium aluminate solution before it enters the precipitating tanks and to concentrate it by evaporation of some of the water in the solution. Where this evaporation takes place in a cooling tower, water in the solution is lost to the atmosphere as water vapor and so is used consumptively. In other plants, the quantity of solution in the system is controlled by evaporation in a multiple-effect evaporator where practically all vapor distilled out of the solution is condensed to water that may be reused. The latter method is used in all recently constructed alumina plants, and some older plants are replacing cooling towers with multiple-effect evaporators. All reduction works in the United States use the Hall process, but the variation in water requirements is even greater than the variation at alumina plants, and, further, the total daily water requirement for all reduction works is more than 9 times the total daily requirement of all alumina plants. Many reduction works use gas scrubbers, but some do not. As gas scrubbing is one of the principal water uses in reduction works, the manner in which wash water is used, cooled, and reused accounts in large measure for the variation in water requirements. Although the supply of water for all plants but one was reported by the management to be ample for all plant needs, the economic factor of the cost of water differs considerably among plants. It is this factor that accounts in large measure for the widely divergent plant practices. Plant capacity alone has so little effect on plant water requirements that other conditions such as plant operation based on the cost of water, plant location, and the need for conservation of water mask any economy inherent in plant size.
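
    The reported averages lend themselves to a small worked example; in the sketch below, the 0.66 and 14.62 gal/lb figures come from the survey above, while the ~1.9 lb of alumina consumed per pound of aluminum is an outside stoichiometric assumption, not a figure from the report.

        # Gross water behind one pound of primary aluminum, combining the
        # two plant stages. The alumina-per-aluminum ratio is an assumed
        # round number for the Hall process, not from the report.
        GAL_PER_LB_ALUMINA = 0.66     # industry average, alumina plants
        GAL_PER_LB_ALUMINUM = 14.62   # industry average, reduction works
        LB_ALUMINA_PER_LB_AL = 1.9    # assumption

        def water_per_lb_aluminum() -> float:
            """Gallons of water behind one pound of primary aluminum."""
            alumina_stage = LB_ALUMINA_PER_LB_AL * GAL_PER_LB_ALUMINA
            return alumina_stage + GAL_PER_LB_ALUMINUM

        print(f"{water_per_lb_aluminum():.2f} gal per lb of aluminum")
        # ~15.9 gal/lb: the reduction stage dominates, consistent with the
        # report's note that reduction works need over 9 times the water
        # of alumina plants.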

  17. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
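
    The core roll-up in time-driven activity-based costing is simple enough to sketch; the following is a minimal illustration of costing one pass through a process map, where the activity names, timings and capacity cost rates are hypothetical placeholders rather than the study's data.

        # Minimal TDABC roll-up over a process map: cost = sum(time * rate).
        # Activities, durations and rates are hypothetical placeholders.
        from dataclasses import dataclass

        @dataclass
        class Activity:
            name: str
            minutes: float       # timed duration from direct observation
            rate_per_min: float  # capacity cost rate of the resource used

        def process_cost(activities: list[Activity]) -> float:
            """Activity cost of one pass through a process map."""
            return sum(a.minutes * a.rate_per_min for a in activities)

        episode = [
            Activity("pre-transfusion sample collection", 10.0, 1.2),
            Activity("crossmatch in blood bank", 30.0, 1.5),
            Activity("RBC unit administration", 90.0, 1.2),
            Activity("post-transfusion observation", 30.0, 0.9),
        ]

        print(f"activity cost per transfusion: ${process_cost(episode):.2f}")
        # Product cost (the RBC unit itself) is added on top; a sensitivity
        # analysis would vary the timings, rates and task probabilities.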

  18. Polarized Growth in the Absence of F-Actin in Saccharomyces cerevisiae Exiting Quiescence

    PubMed Central

    Sahin, Annelise; Daignan-Fornier, Bertrand; Sagot, Isabelle

    2008-01-01

    Background: Polarity establishment and maintenance are crucial for morphogenesis and development. In budding yeast, these two intricate processes involve the superposition of regulatory loops between polarity landmarks, RHO GTPases, actin-mediated vesicle transport and endocytosis. Deciphering the chronology and the significance of each molecular step of polarized growth is therefore very challenging. Principal Findings: We have taken advantage of the fact that yeast quiescent cells display actin bodies, a non-polarized actin structure, to evaluate the role of F-actin in bud emergence. Here we show that upon exit from quiescence, actin cables are not required for the first steps of polarized growth. We further show that polarized growth can occur in the absence of actin patch-mediated endocytosis. We finally establish, using latrunculin-A, that the first steps of polarized growth do not require any F-actin-containing structures. Yet, these structures are required for the formation of a bona fide daughter cell and cell cycle completion. We propose that upon exit from quiescence in the absence of F-actin, secretory vesicles randomly reach the plasma membrane but preferentially dock and fuse where polarity cues are localized, this being sufficient to trigger polarized growth. PMID: 18596916

  19. More steps towards process automation for optical fabrication

    NASA Astrophysics Data System (ADS)

    Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina

    2017-06-01

    In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tweedie, A.; Doris, E.

    Establishing interconnection to the grid is a recognized barrier to the deployment of distributed energy generation. This report compares interconnection processes for photovoltaic projects in California and Germany. This report summarizes the steps of the interconnection process for developers and utilities, the average length of time utilities take to process applications, and paperwork required of project developers. Based on a review of the available literature, this report finds that while the interconnection procedures and timelines are similar in California and Germany, differences in the legal and regulatory frameworks are substantial.

  1. Process for operating equilibrium controlled reactions

    DOEpatents

    Nataraj, Shankar; Carvill, Brian Thomas; Hufton, Jeffrey Raymond; Mayorga, Steven Gerard; Gaffney, Thomas Richard; Brzozowski, Jeffrey Richard

    2001-01-01

    A cyclic process for operating an equilibrium-controlled reaction in a plurality of reactors containing an admixture of an adsorbent and a reaction catalyst suitable for performing the desired reaction, operated in a predetermined timed sequence. The heating and cooling requirements in a moving reaction mass-transfer zone within each reactor are provided by indirect heat exchange with a fluid capable of phase change at the temperatures maintained in each reactor during the sorpreaction, depressurization, purging and pressurization steps of each process cycle.

  2. Computerized Adaptive Testing System Design: Preliminary Design Considerations.

    DTIC Science & Technology

    1982-07-01

    A design model for a computerized adaptive testing (CAT) system was developed and presented through a series of hierarchy plus input-process-output (HIPO) diagrams, addressing the administrative and operational requirements of CAT (NPRDC TR 82-52, July 1982). The physical system was addressed through brief discussions of hardware, software, interfaces, and personnel requirements. Further steps in CAT system design are outlined.

  3. Parallel processing of real-time dynamic systems simulation on OSCAR (Optimally SCheduled Advanced multiprocessoR)

    NASA Technical Reports Server (NTRS)

    Kasahara, Hironori; Honda, Hiroki; Narita, Seinosuke

    1989-01-01

    Parallel processing of real-time dynamic systems simulation on a multiprocessor system named OSCAR is presented. In the simulation of dynamic systems, generally, the same calculations are repeated every time step. However, Do-all or Do-across techniques cannot be applied to parallel processing of the simulation, since there exist data dependencies from the end of an iteration to the beginning of the next iteration, and furthermore data input and data output are required every sampling period. Therefore, parallelism inside the calculation required for a single time step, or a large basic block consisting of arithmetic assignment statements, must be used. In the proposed method, near-fine-grain tasks, each of which consists of one or more floating-point operations, are generated to extract the parallelism from the calculation and assigned to processors by optimal static scheduling at compile time, in order to reduce the large run-time overhead caused by the use of near-fine-grain tasks. The practicality of the scheme is demonstrated on OSCAR (Optimally SCheduled Advanced multiprocessoR), which has been developed to exploit the advantages of static scheduling algorithms to the maximum extent.
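
    The loop structure at issue can be sketched directly; the following is a generic illustration (not OSCAR code) of why the outer time loop resists Do-all parallelization and where the near-fine-grain tasks live, with all computations hypothetical stand-ins.

        # The outer time loop carries a dependence (state at step t feeds
        # step t+1) and does I/O each sampling period, so Do-all/Do-across
        # over t is impossible; only the work *inside* one step can be
        # split into near-fine-grain tasks and statically scheduled.

        def read_inputs(t):           # sampled data, hypothetical stand-in
            return float(t)

        def write_outputs(t, state):  # per-step output requirement
            pass

        def simulate(n_steps: int):
            state = 0.0
            for t in range(n_steps):      # sequential: loop-carried dependence
                u = read_inputs(t)
                # Everything below is one large basic block of assignment
                # statements; the independent expressions are the tasks a
                # static scheduler can map onto different processors.
                a = 0.5 * state + u           # task 1
                b = 0.25 * state * state      # task 2 (independent of task 1)
                state = a + b                 # task 3 (joins tasks 1 and 2)
                write_outputs(t, state)

        simulate(100)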

  4. Solid propellant processing factor in rocket motor design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The ways in which propellant processing is affected by choices made in designing rocket engines are described. Tradeoff studies, design proof or scaleup studies, and special design features required to obtain high product quality and optimum processing costs are presented. Processing is considered to include the operational steps involved in the lining and preparation of the motor case for the grain; the procurement of propellant raw materials; and propellant mixing, casting or extrusion, curing, machining, and finishing. The design criteria, recommended practices, and propellant formulations are included.

  5. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
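
    The computational core of such analyses can be sketched compactly; the following CPU version (NumPy) shows the image-difference structure function common to differential-dynamic-microscopy-style NFS processing, as a generic illustration of the computation whose data management the note optimizes on the GPU, not the authors' implementation.

        # Image structure function: average |FFT(I(t+dt) - I(t))|^2 over t,
        # one 2-D map per lag dt. Generic illustration, not the GPU code.
        import numpy as np

        def structure_function(stack, lags):
            """stack: (n_frames, ny, nx) array of images."""
            n = stack.shape[0]
            out = {}
            for dt in lags:
                acc = np.zeros(stack.shape[1:], dtype=float)
                for t in range(n - dt):
                    diff = stack[t + dt] - stack[t]
                    acc += np.abs(np.fft.fft2(diff)) ** 2
                out[dt] = acc / (n - dt)
            return out

        frames = np.random.rand(32, 64, 64)   # stand-in for camera frames
        maps = structure_function(frames, lags=[1, 2, 4])
        print({dt: m.mean() for dt, m in maps.items()})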

  6. Method of manufacturing carbon nanotubes

    NASA Technical Reports Server (NTRS)

    Benavides, Jeanette M. (Inventor); Leidecker, Henning W. (Inventor); Frazier, Jeffrey (Inventor)

    2004-01-01

    A process for manufacturing carbon nanotubes, including a step of inducing electrical current through a carbon anode and a carbon cathode under conditions effective to produce the carbon nanotubes, wherein the carbon cathode is larger than the carbon anode. Preferably, a welder is used to induce the electrical current via an arc welding process. Preferably, an exhaust hood is placed on the anode, and the process does not require a closed or pressurized chamber. The process provides high-quality, single-walled carbon nanotubes, while eliminating the need for a metal catalyst.

  7. SU-E-J-126: An Online Replanning Method for FFF Beams Without Couch Shift

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahunbay, E; Ates, O; Li, X

    2015-06-15

    Purpose: In situations where a couch shift for patient positioning is not preferred or is prohibited (e.g., MR-Linac), segment aperture morphing (SAM) can address target dislocation and deformation. For IMRT/VMAT with flattening filter free (FFF) beams, however, the SAM method would lead to an adverse translational dose effect due to the beam unflattening. Here we propose a new 2-step process to address both the translational effect of FFF beams and the target deformation. Methods: The replanning method consists of an offline step and an online step. The offline step creates a series of pre-shifted plans (PSPs) obtained by a so-called "warm start" optimization (starting optimization from the original plan rather than from scratch) at a series of isocenter shifts with fixed spacing (e.g., 2 cm, at (x,y,z) = (2,0,0); (2,2,0); (0,2,0); …; (−2,0,0)). The PSPs all have the same number of segments with very similar shapes, since the warm-start optimization only adjusts the MLC positions instead of regenerating them. In the online step, a new plan is obtained by linearly interpolating the MLC positions and the monitor units of the closest PSPs for the shift determined from the image of the day. This two-step process is completely automated and instantaneous (no optimization or dose calculation is needed). The previously developed SAM algorithm is then applied for daily deformation. We tested the method on sample prostate and pancreas cases. Results: The two-step interpolation method can account for the adverse dose effects from FFF beams, while SAM corrects for the target deformation. The whole process takes the same time as the previously reported SAM process (5–10 min). Conclusion: The new two-step method plus SAM can address both the translation effects of FFF beams and target deformation, and can be executed in full automation requiring no additional time beyond the SAM process. This research was supported by Elekta Inc. (Crawley, UK).
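
    The online step is a pure interpolation, which the sketch below illustrates in one dimension for brevity; the plan data (leaf positions, monitor units, grid) are hypothetical placeholders, and the actual method interpolates in 3-D.

        # Hedged 1-D sketch of the online step: take the two nearest
        # pre-shifted plans (PSPs, computed offline on a fixed 2-cm grid)
        # and linearly interpolate MLC leaf positions and monitor units.
        import numpy as np

        PSP_SHIFTS = np.array([-2.0, 0.0, 2.0])   # cm, offline grid
        PSP_MLC = {s: np.array([1.0, 2.0, 3.0]) + s for s in PSP_SHIFTS}
        PSP_MU = {-2.0: 95.0, 0.0: 100.0, 2.0: 104.0}   # hypothetical MUs

        def online_plan(shift: float):
            """No optimization or dose calculation: pure interpolation."""
            lo = max(s for s in PSP_SHIFTS if s <= shift)
            hi = min(s for s in PSP_SHIFTS if s >= shift)
            w = 0.0 if hi == lo else (shift - lo) / (hi - lo)
            mlc = (1 - w) * PSP_MLC[lo] + w * PSP_MLC[hi]
            mu = (1 - w) * PSP_MU[lo] + w * PSP_MU[hi]
            return mlc, mu

        mlc, mu = online_plan(0.7)   # shift measured from the image of the day
        print(mlc, mu)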

  8. Development of a Rubric to Improve Critical Thinking

    ERIC Educational Resources Information Center

    Hildenbrand, Kasee J.; Schultz, Judy A.

    2012-01-01

    Context: Health care professionals, including athletic trainers are confronted daily with multiple complex problems that require critical thinking. Objective: This research attempts to develop a reliable process to assess students' critical thinking in a variety of athletic training and kinesiology courses. Design: Our first step was to create a…

  9. Simple Steps for Teaching Prepositions to Students with Autism and Other Developmental Disabilities

    ERIC Educational Resources Information Center

    Hicks, S. Christy; Rivera, Christopher J.; Patterson, Dawn R.

    2016-01-01

    The acquisition of receptive and expressive language skills by students with autism and developmental disabilities (DD) is often delayed, thus making the process of communicating with others challenging. Some students develop language skills incidentally through conversations with their families and peers, but others require instruction in…

  10. American Taxpayer Relief Act of 2012

    THOMAS, 112th Congress

    Rep. Camp, Dave [R-MI-4]

    2012-07-24

    01/02/2013: Became Public Law No. 112-240. Notes: Enactment of the "fiscal cliff bill" averted scheduled income tax rate increases and the spending reductions required by the sequestration process.

  11. The Course Development Plan: Macro-Level Decisions and Micro-Level Processes

    ERIC Educational Resources Information Center

    Franker, Karen; James, Dennis

    2016-01-01

    A key step in distance learning project management is the creation of a course development plan. The plan should account for decisions related to materials, curriculum, delivery methods, staffing, technology applications, resources, reporting lines, and project management--issues that may require administrator involvement and support, particularly…

  12. The Creative Use of Alienation.

    ERIC Educational Resources Information Center

    Herrick, James E.

    The process of becoming a world citizen requires, as a first step, personal alienation from the cultural values and social arrangements of nation-states that stand in the way of such an effort. Three interrelated propositions underlie this thesis. First, prevailing cultural values and social arrangements severely limit opportunities for people to…

  13. A New Sampling Strategy for the Detection of Fecal Bacteria Integrated with USEPA Method 1622/1623

    EPA Science Inventory

    USEPA Method 1622/1623 requires the concentration of Cryptosporidium and Giardia from 10 liters of water samples prior to detection. During this process the supernatant is discarded because it is assumed that most protozoa are retained in the filtration and centrifugation steps....

  14. Counseling Workers over 40: GULHEMP, a New Approach.

    ERIC Educational Resources Information Center

    Meredith, Jack

    This series of presentations describes a method of job counseling and placement for the middle-aged which combines pre-employment physical worker analysis with job analysis for effective matching of job requirements with worker capacities. The matching process involves these steps: (1) job analysis by an industrial engineer; (2) worker examination…

  15. What Does it Really Cost? Allocating Indirect Costs.

    ERIC Educational Resources Information Center

    Snyder, Herbert; Davenport, Elisabeth

    1997-01-01

    Better managerial control in terms of decision making and understanding the costs of a system/service result from allocating indirect costs. Allocation requires a three-step process: selecting cost objectives, pooling related overhead costs, and selecting cost bases to connect the objectives to the pooled costs. Argues that activity-based costing…

  16. Program Assessment: Getting to a Practical How-To Model

    ERIC Educational Resources Information Center

    Gardiner, Lorraine R.; Corbitt, Gail; Adams, Steven J.

    2010-01-01

    The Association to Advance Collegiate Schools of Business (AACSB) International's assurance of learning (AoL) standards require that schools develop a sophisticated continuous-improvement process. The authors review various assessment models and develop a practical, 6-step AoL model based on the literature and the authors' AoL-implementation…

  17. Sea Stories: A Collaborative Tool for Articulating Tactical Knowledge.

    ERIC Educational Resources Information Center

    Radtke, Paul H.; Frey, Paul R.

    Having subject matter experts (SMEs) identify the skills and knowledge to be taught is among the more difficult and time-consuming steps in the training development process. A procedure has been developed for identifying specific tactical decision-making knowledge requirements and translating SME knowledge into appropriate multimedia…

  18. Are We There Yet? Evaluating Library Collections, Reference Services, Programs, and Personnel.

    ERIC Educational Resources Information Center

    Robbins-Carter, Jane; Zweizig, Douglas L.

    1985-01-01

    This second in a five-lesson tutorial on library evaluation focuses on the evaluation of library collections. Highlights include the seven-step evaluation process described in lesson one; quantitative methods (total size, unfilled requests, circulation, turnover rate); and qualitative methods (impressionistic, list-checking). One required and…

  19. 40 CFR 141.133 - Compliance requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... specified by § 141.135(c). Systems may begin monitoring to determine whether Step 1 TOC removals can be met... the Step 1 requirements in § 141.135(b)(2) and must therefore apply for alternate minimum TOC removal (Step 2) requirements, is not eligible for retroactive approval of alternate minimum TOC removal (Step 2...

  20. 15 CFR 732.6 - Steps for other requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Steps for other requirements. 732.6...) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS STEPS FOR USING THE EAR § 732.6 Steps for other requirements. Sections 732.1 through 732.4 of this part are useful in...

  1. Integrated Watershed Management to Rehabilitate the Distorted Hydrologic Cycle in a Korean Urban Region

    NASA Astrophysics Data System (ADS)

    Lee, K.; Chung, E.; Park, K.

    2007-12-01

    Many urbanized watersheds suffer from streamflow depletion and poor stream quality, which often negatively affect related factors such as in-stream and near-stream ecologic integrity and water supply. Watershed management that does not consider all potential risks is inadequate, since all hydrological components are closely related. This study has therefore developed and applied a ten-step integrated watershed management (IWM) procedure to sustainably rehabilitate hydrologic cycles distorted by urbanization. Step 1 of this procedure is understanding the watershed components and processes. This study proposes not only water quantity/quality monitoring but also continuous water quantity/quality simulation and estimation of annual pollutant loads from the unit loads of all land uses. Step 2 is quantifying the watershed problem as potential flood damage (PFD), potential streamflow depletion (PSD), potential water quality deterioration (PWQD) and a watershed evaluation index (WEI). All indicators are selected following the sustainability concept of the Pressure-State-Response (PSR) model. All weights are estimated by the Analytic Hierarchy Process (AHP). The four indices are calculated using composite programming, a multicriteria decision-making technique (a sketch follows this abstract). In Step 3, residents' preferences regarding management objectives, which consist of flood damage mitigation, prevention of streamflow depletion, and water quality enhancement, are quantified; the WEI can be recalculated using these values. Step 4 sets specific goals and objectives based on the results from Steps 2 and 3. Objectives can include spatial flood allocation, instreamflow requirements and total maximum daily load (TMDL). Steps 5 and 6 develop all possible alternatives and eliminate the infeasible ones. Step 7 analyzes the effectiveness of all remaining feasible alternatives. The water quantity criteria are expressed as the changes in low flow (Q275) and drought flow (Q355) on the flow duration curve and the number of days satisfying the instreamflow requirement. The water quality criteria are the change in average BOD concentration, the total daily loads, and the number of days satisfying the TMDL. Step 8 involves the calculation of the AEI using various MCDM techniques. The indicators of the AEI are obtained from the sustainability concept of the Drivers-Pressure-State-Impact-Response (DPSIR) framework, an improved PSR model; all previous results are used in this step. Step 9 estimates the benefit and cost of alternatives. Discrete willingness to pay (WTP) for specific improvements of current watershed conditions is estimated by the choice experiment method, an economic valuation technique based on stated preferences. The WTPs of specific alternatives are calculated by combining the AEI and choice experiment results, so the benefit of an alternative can be obtained by multiplying the WTP by the total household value of the sub-watershed. Finally, in Step 10, the final alternatives are determined by comparing net benefit and the benefit-cost ratio. Final alternatives derived from the proposed IWM procedure should not be carried out immediately but should be discussed by stakeholders and decision makers. However, since the plans obtained from these analyses reflect the sustainability concept, the alternatives are comparatively likely to be accepted. This ten-step procedure will be helpful in building decision support systems for sustainable IWM.
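
    As referenced in Step 2 above, the index aggregation can be made concrete with a small sketch: a composite-programming combination of normalized indicators under AHP-derived weights, where the indicator values and weights are hypothetical placeholders rather than the study's data.

        # Composite programming: weighted l_p aggregation of indicators
        # normalized to [0, 1]. Values and weights are hypothetical.
        def composite_index(values, weights, p=2.0):
            assert abs(sum(weights) - 1.0) < 1e-9
            return sum(w * v ** p for w, v in zip(weights, values)) ** (1.0 / p)

        # Pressure-State-Response indicators for one sub-watershed.
        indicators = [0.8, 0.4, 0.6]   # pressure, state, response (normalized)
        ahp_weights = [0.5, 0.3, 0.2]  # from AHP pairwise comparisons

        print(f"composite index = {composite_index(indicators, ahp_weights):.3f}")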

  2. Late Maturation Steps Preceding Selective Nuclear Export and Egress of Progeny Parvovirus

    PubMed Central

    Wolfisberg, Raphael; Kempf, Christoph

    2016-01-01

    Although the mechanism is not well understood, growing evidence indicates that the nonenveloped parvovirus minute virus of mice (MVM) may actively egress before passive release through cell lysis. We have dissected the late maturation steps of the intranuclear progeny with the aims of confirming the existence of active prelytic egress and identifying critical capsid rearrangements required to initiate the process. By performing anion-exchange chromatography (AEX), we separated intranuclear progeny particles by their net surface charges. Apart from empty capsids (EC), two distinct populations of full capsids (FC) arose in the nuclei of infected cells. The earliest population of FC to appear was infectious but, like EC, could not be actively exported from the nucleus. Further maturation of this early population, involving the phosphorylation of surface residues, gave rise to a second, late population with nuclear export potential. While capsid surface phosphorylation was strictly associated with nuclear export capacity, mutational analysis revealed that the phosphoserine-rich N terminus of VP2 (N-VP2) was dispensable, although it contributed to passive release. The reverse situation was observed for the incoming particles, which were dephosphorylated in the endosomes. Our results confirm the existence of active prelytic egress and reveal a late phosphorylation event occurring in the nucleus as a selective factor for initiating the process. Importance: In general, the process of egress of enveloped viruses is active and involves host cell membranes. However, the release of nonenveloped viruses seems to rely more on cell lysis. At least for some nonenveloped viruses, an active process before passive release by cell lysis has been reported, although the underlying mechanism remains poorly understood. By using the nonenveloped model parvovirus minute virus of mice, we could confirm the existence of an active process of nuclear export and further characterize the associated capsid maturation steps. Following DNA packaging in the nucleus, capsids required further modifications, involving the phosphorylation of surface residues, to acquire nuclear export potential. Inversely, those surface residues were dephosphorylated on entering capsids. These spatially controlled phosphorylation-dephosphorylation events concurred with the nuclear export-import potential required to complete the infectious cycle. PMID: 27009963

  3. Late Maturation Steps Preceding Selective Nuclear Export and Egress of Progeny Parvovirus.

    PubMed

    Wolfisberg, Raphael; Kempf, Christoph; Ros, Carlos

    2016-06-01

    Although the mechanism is not well understood, growing evidence indicates that the nonenveloped parvovirus minute virus of mice (MVM) may actively egress before passive release through cell lysis. We have dissected the late maturation steps of the intranuclear progeny with the aims of confirming the existence of active prelytic egress and identifying critical capsid rearrangements required to initiate the process. By performing anion-exchange chromatography (AEX), we separated intranuclear progeny particles by their net surface charges. Apart from empty capsids (EC), two distinct populations of full capsids (FC) arose in the nuclei of infected cells. The earliest population of FC to appear was infectious but, like EC, could not be actively exported from the nucleus. Further maturation of this early population, involving the phosphorylation of surface residues, gave rise to a second, late population with nuclear export potential. While capsid surface phosphorylation was strictly associated with nuclear export capacity, mutational analysis revealed that the phosphoserine-rich N terminus of VP2 (N-VP2) was dispensable, although it contributed to passive release. The reverse situation was observed for the incoming particles, which were dephosphorylated in the endosomes. Our results confirm the existence of active prelytic egress and reveal a late phosphorylation event occurring in the nucleus as a selective factor for initiating the process. In general, the process of egress of enveloped viruses is active and involves host cell membranes. However, the release of nonenveloped viruses seems to rely more on cell lysis. At least for some nonenveloped viruses, an active process before passive release by cell lysis has been reported, although the underlying mechanism remains poorly understood. By using the nonenveloped model parvovirus minute virus of mice, we could confirm the existence of an active process of nuclear export and further characterize the associated capsid maturation steps. Following DNA packaging in the nucleus, capsids required further modifications, involving the phosphorylation of surface residues, to acquire nuclear export potential. Inversely, those surface residues were dephosphorylated on entering capsids. These spatially controlled phosphorylation-dephosphorylation events concurred with the nuclear export-import potential required to complete the infectious cycle.

  4. On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.

    PubMed

    Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar

    2015-01-01

    Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian university hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives of different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient-centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers in defining the requirements for a robust and useful system.

  5. Operator models for delivering municipal solid waste management services in developing countries: Part B: Decision support.

    PubMed

    Soós, Reka; Whiteman, Andrew D; Wilson, David C; Briciu, Cosmin; Nürnberger, Sofia; Oelz, Barbara; Gunsilius, Ellen; Schwehn, Ekkehard

    2017-08-01

    This is the second of two papers reporting the results of a major study considering 'operator models' for municipal solid waste management (MSWM) in emerging and developing countries. Part A documents the evidence base, while Part B presents a four-step decision support system for selecting an appropriate operator model in a particular local situation. Step 1 focuses on understanding local problems and framework conditions; Step 2 on formulating and prioritising local objectives; and Step 3 on assessing capacities and conditions, and thus identifying strengths and weaknesses, which underpin selection of the operator model. Step 4A addresses three generic questions, including public versus private operation, inter-municipal co-operation and integration of services. For Steps 1-4A, checklists have been developed as decision support tools. Step 4B helps choose locally appropriate models from an evidence-based set of 42 common operator models (coms); decision support tools here are a detailed catalogue of the coms, setting out advantages and disadvantages of each, and a decision-making flowchart. The decision-making process is iterative, repeating Steps 2-4 as required. The advantages of a more formal process include avoiding pre-selection of a particular com known to and favoured by one decision maker, and also its assistance in identifying the possible weaknesses and aspects to consider in the selection and design of operator models. To make the best of whichever operator models are selected, key issues which need to be addressed include the capacity of the public authority as 'client', management in general and financial management in particular.

  6. Parsing the roles of neck-linker docking and tethered head diffusion in the stepping dynamics of kinesin.

    PubMed

    Zhang, Zhechun; Goldtzvik, Yonathan; Thirumalai, D

    2017-11-14

    Kinesin walks processively on microtubules (MTs) in an asymmetric hand-over-hand manner consuming one ATP molecule per 16-nm step. The individual contributions due to docking of the approximately 13-residue neck linker to the leading head (deemed to be the power stroke) and diffusion of the trailing head (TH) that contributes in propelling the motor by 16 nm have not been quantified. We use molecular simulations by creating a coarse-grained model of the MT-kinesin complex, which reproduces the measured stall force as well as the force required to dislodge the motor head from the MT, to show that nearly three-quarters of the step occurs by bidirectional stochastic motion of the TH. However, docking of the neck linker to the leading head constrains the extent of diffusion and minimizes the probability that kinesin takes side steps, implying that both events are necessary in the motility of kinesin and for the maintenance of processivity. Surprisingly, we find that during a single step, the TH stochastically hops multiple times between the geometrically accessible neighboring sites on the MT before forming a stable interaction with the target binding site with the correct orientation between the motor head and the αβ-tubulin dimer.

  7. Graphical modeling and query language for hospitals.

    PubMed

    Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris

    2013-01-01

    So far there has been little evidence that implementation of health information technologies (HIT) is leading to health care cost savings. One of the reasons for this lack of impact likely lies in the complexity of business process ownership in hospitals. The goal of our research is to develop a business model-based method for hospital use which would allow doctors to retrieve ad-hoc information directly from various hospital databases. We have developed a special domain-specific process modelling language called MedMod. Formally, we define the MedMod language as a profile on UML class diagrams, but we also demonstrate it on examples, where we explain the semantics of all its elements informally. Moreover, we have developed the Process Query Language (PQL), which is based on the MedMod process definition language. The purpose of PQL is to allow a doctor to query (filter) runtime data of the hospital's processes described using MedMod. The MedMod language tries to overcome deficiencies in existing process modeling languages by allowing specification of the loosely defined sequence of steps to be performed in a clinical process. The main advantages of PQL lie in two areas, usability and efficiency: 1) data are viewed through the "glasses" of a familiar process; 2) the simple and easy-to-perceive means of setting filtering conditions require no more expertise than using spreadsheet applications; 3) the dynamic response to each step in the construction of a complete query greatly shortens the learning curve and reduces the error rate; and 4) the selected means of filtering and data retrieval allow queries to execute in O(n) time with respect to the size of the dataset. We plan to continue developing this project in three further steps. First, we are planning to develop user-friendly graphical editors for the MedMod process modeling and query languages. The second step is to evaluate the usability of the proposed language and tool with physicians from several hospitals in Latvia, working with real data from these hospitals. The third step is to develop an efficient implementation of the query language.
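
    The O(n) behaviour claimed for PQL amounts to a single linear pass with compound filters, which the sketch below illustrates; the record fields and conditions are hypothetical stand-ins, not the MedMod schema.

        # One linear pass over runtime records of process steps, applying
        # spreadsheet-style conditions. Fields are hypothetical stand-ins.
        episodes = [
            {"patient": "A", "step": "admission", "duration_h": 2.0},
            {"patient": "A", "step": "surgery",   "duration_h": 3.5},
            {"patient": "B", "step": "admission", "duration_h": 6.0},
            {"patient": "B", "step": "discharge", "duration_h": 1.0},
        ]

        def query(records, **conditions):
            """Keep records matching every (field, predicate) pair: O(n)."""
            return [r for r in records
                    if all(pred(r[f]) for f, pred in conditions.items())]

        slow_admissions = query(
            episodes,
            step=lambda s: s == "admission",
            duration_h=lambda d: d > 4.0,
        )
        print(slow_admissions)  # -> the one long admission, patient B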

  8. Estimating Manpower, Personnel, and Training Requirements Early in the Weapon System Acquisition Process: An Application of the HARDMAN Methodology to the Army’s Division Support Weapon System

    DTIC Science & Technology

    1984-02-01

    ...identifies the supply of personnel and training resources that can be expected at critical dates in the conceptual weapon system's acquisition schedule... The impact analysis matches demand to supply and identifies shortfalls in skills, new skill requirements, and high resource drivers. The tradeoff analysis... system. Step 5 - Conduct Impact Analysis: The Impact Analysis determines the Army's supply of those personnel and training resources required by the...

  9. Synchronous separation, seaming, sealing and sterilization (S4) using brazing for sample containerization and planetary protection

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Yoseph; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Lindsey, Cameron; Kutzer, Thomas; Salazar, Eduardo

    2018-03-01

    The return of samples to Earth in future missions would require protection of our planet from the risk of bringing uncontrolled biological materials back with the samples. This protection would require "breaking the chain of contact" (BTC), where any returned material reaching Earth for further analysis would have to be sealed inside a container with extremely high confidence. Therefore, the acquired samples would need to be contained while destroying any potential biological materials that may contaminate the external surface of the container. A novel process that could be used to contain returning samples has been developed and demonstrated at quarter scale. The process consists of brazing using non-contact induction heating that synchronously separates, seams, seals and sterilizes (S4) the container. Brazing involves melting at temperatures higher than 500°C, and this level of heating assures sterilization of the exposed areas, since all carbon bonds (namely, organic materials) are broken at this temperature. The mechanism consists of a double-wall container with inner and outer shells having Earth-clean interior surfaces. The process consists of two steps. In Step 1, the double-wall container halves are fabricated and brazed (equivalent to production on Earth); Step 2 is the S4 process, the equivalent of on-orbit execution around Mars. In a potential future mission, the double-wall container would be split into two halves and prepared on Earth. The potential on-orbit execution would consist of inserting the orbiting sample (OS) container into one of the halves, mating it to the other half, and brazing. The latest results of this effort are described and discussed in this manuscript.

  10. Preparation of cellulose based microspheres by combining spray coagulating with spray drying.

    PubMed

    Wang, Qiao; Fu, Aiping; Li, Hongliang; Liu, Jingquan; Guo, Peizhi; Zhao, Xiu Song; Xia, Lin Hua

    2014-10-13

    Porous microspheres of regenerated cellulose with sizes in the range of 1-2 μm and composite microspheres of chitosan-coated cellulose with sizes of 1-3 μm were obtained through a two-step spray-assisted approach. The spray coagulating process must be combined with a spray drying step to guarantee the formation of stable cellulose microspheres. This approach exhibits two main virtues. First, the preparation was performed using an aqueous solution of cellulose as the precursor, in the absence of organic solvent and surfactant; second, neither a crosslinking agent nor a separate crosslinking process was required for the formation of stable microspheres. Moreover, the spray drying step also provided the chance to encapsulate guests into the resultant cellulose microspheres. The potential application of the cellulose microspheres as a drug delivery vector has been studied in two PBS (phosphate-buffered saline) solutions with pH values of 4.0 and 7.4 to mimic the environments of the stomach and intestine, respectively.

  11. Silicon algae with carbon topping as thin-film anodes for lithium-ion microbatteries by a two-step facile method

    NASA Astrophysics Data System (ADS)

    Biserni, E.; Xie, M.; Brescia, R.; Scarpellini, A.; Hashempour, M.; Movahed, P.; George, S. M.; Bestetti, M.; Li Bassi, A.; Bruno, P.

    2015-01-01

    Silicon-based electrodes for Li-ion batteries (LIB) attract much attention because of their high theoretical capacity. However, their large volume change during lithiation results in poor cycling due to mechanical cracking. Moreover, silicon can hardly form a stable solid electrolyte interphase (SEI) layer with common electrolytes. We present a safe, innovative strategy to prepare nanostructured silicon-carbon anodes in a two-step process. The nanoporosity of the Si films accommodates the volume expansion, while a disordered graphitic C layer on top promotes the formation of a stable SEI. This approach shows its promise: carbon-coated porous silicon anodes perform in a very stable way, reaching an areal capacity of ∼175 μAh cm⁻² and showing no decay for at least 1000 cycles. Requiring only a two-step deposition process at moderate temperatures, this very simple cell concept introduces a promising route toward viable up-scaled production of next-generation nanostructured Si anodes for lithium-ion microbatteries.

  12. Laser-induced Self-organizing Microstructures on Steel for Joining with Polymers

    NASA Astrophysics Data System (ADS)

    van der Straeten, Kira; Burkhardt, Irmela; Olowinsky, Alexander; Gillner, Arnold

    The combination of different materials such as thermoplastic composites and metals is an important way to improve lightweight construction. As direct connections between these materials fail due to their physical and chemical properties, other joining techniques are required. A new joining approach besides fastening and adhesive joining is a laser-based two-step process. Within the first step the metal surface is modified by laser-microstructuring. In order to enlarge the boundary surface and create undercuts, random self-organizing microstructures are generated on stainless steel substrates. In a second process step both joining partners, metal and composite, are clamped together, the steel surface is heated up with laser radiation and through heat conduction the thermoplastic matrix is melted and flows into the structures. After cooling-down a firm joint between both materials is created. The presented work shows the influence of different laser parameters on the generation of the microstructures. The joint strength is investigated through tensile shear strength tests.

  13. Pyroprocessing of Light Water Reactor Spent Fuels Based on an Electrochemical Reduction Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohta, Hirokazu; Inoue, Tadashi; Sakamura, Yoshiharu

    A concept of pyroprocessing light water reactor (LWR) spent fuels based on an electrochemical reduction technology is proposed, and the material balance of the processing of mixed oxide (MOX) or high-burnup uranium oxide (UO₂) spent fuel is evaluated. Furthermore, a burnup analysis for metal fuel fast breeder reactors (FBRs) is conducted on low-decontamination materials recovered by pyroprocessing. In the case of processing MOX spent fuel (40 GWd/t), UO₂ is separately collected for ~60 wt% of the spent fuel in advance of the electrochemical reduction step, and the product recovered through the rare earth (RE) removal step, which has the composition uranium:plutonium:minor actinides:fission products (FPs) = 76.4:18.4:1.7:3.5, can be applied as an ingredient of FBR metal fuel without a further decontamination process. On the other hand, the electroreduced alloy of high-burnup UO₂ spent fuel (48 GWd/t) requires further decontamination of residual FPs by an additional process such as electrorefining, even if RE FPs are removed from the alloy, because the recovered plutonium (Pu) is accompanied by almost the same amount of FPs in addition to RE. However, the amount of treated material in the electrorefining step is reduced to ~10 wt% of the total spent fuel owing to the prior UO₂ recovery step. These results reveal that the application of electrochemical reduction technology to LWR spent oxide fuel is a promising concept for providing FBR metal fuel by a rationalized process.

  14. Thermal modeling of cogging process using finite element method

    NASA Astrophysics Data System (ADS)

    Khaled, Mahmoud; Ramadan, Mohamad; Fourment, Lionel

    2016-10-01

    Among forging processes, incremental processes are those where the workpiece undergoes several thermal and deformation steps with a small increment of deformation at each step. They offer high flexibility in terms of workpiece size, since they allow shaping a wide range of parts from small to large. Since thermal treatment is essential to obtain the required shape and quality, this paper presents the thermal modeling of incremental processes. The finite element discretization, spatial and temporal, is presented. Simulation is performed using the commercial software Forge 3. Results show the thermal behavior at the beginning and at the end of the process.
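
    The temporal part of such a discretization is easy to illustrate; the sketch below marches 1-D heat conduction forward with an explicit scheme, as a finite-difference stand-in for exposition rather than the 3-D finite element formulation used in Forge 3, with rough assumed material and process values.

        # Explicit time stepping of 1-D heat conduction on a uniform grid:
        # T_i(t+dt) = T_i + r*(T_{i+1} - 2*T_i + T_{i-1}), r = alpha*dt/dx^2.
        # Illustration only; values below are rough assumptions.
        import numpy as np

        alpha = 1.2e-5        # thermal diffusivity of steel, m^2/s (approx.)
        dx, dt = 1e-3, 0.02   # grid spacing (m) and time step (s)
        r = alpha * dt / dx**2
        assert r <= 0.5, "explicit scheme stability limit"

        T = np.full(101, 1200.0)   # billet interior at forging temperature, C
        T[0] = T[-1] = 20.0        # die contact / ambient at the surfaces

        for _ in range(500):       # march the temperature field in time
            T[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
            T[0] = T[-1] = 20.0    # re-impose boundary condition each step

        print(f"centre temperature after {500*dt:.0f} s: {T[50]:.1f} C")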

  15. Reductive stripping process for uranium recovery from organic extracts

    DOEpatents

    Hurst, F.J. Jr.

    1983-06-16

    In the reductive stripping of uranium from an organic extractant in a uranium recovery process, the use of phosphoric acid having a molarity in the range of 8 to 10 increases the efficiency of the reductive stripping and allows the strip step to operate with lower aqueous to organic recycle ratios and shorter retention time in the mixer stages. Under these operating conditions, less solvent is required in the process, and smaller, less expensive process equipment can be utilized. The high strength H/sub 3/PO/sub 4/ is available from the evaporator stage of the process.

  16. Self-regenerating column chromatography

    DOEpatents

    Park, Woo K.

    1995-05-30

    The present invention provides a process for treating both cations and anions by using a self-regenerating, multi-ionic exchange resin column system which requires no separate regeneration steps. The process involves alternating ion-exchange chromatography for cations and anions in a multi-ionic exchange column packed with a mixture of cation and anion exchange resins. The multi-ionic mixed-charge resin column works as a multi-function column, capable of independently processing either cationic or anionic exchange, or simultaneously processing both cationic and anionic exchanges. The major advantage offered by the alternating multi-function ion exchange process is the self-regeneration of the resins.

  17. Reductive stripping process for uranium recovery from organic extracts

    DOEpatents

    Hurst, Jr., Fred J.

    1985-01-01

    In the reductive stripping of uranium from an organic extractant in a uranium recovery process, the use of phosphoric acid having a molarity in the range of 8 to 10 increases the efficiency of the reductive stripping and allows the strip step to operate with lower aqueous to organic recycle ratios and shorter retention time in the mixer stages. Under these operating conditions, less solvent is required in the process, and smaller, less expensive process equipment can be utilized. The high strength H.sub.3 PO.sub.4 is available from the evaporator stage of the process.

  18. 49 CFR 399.207 - Truck and truck-tractor access requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... requirements: (1) Vertical height. All measurements of vertical height shall be made from ground level with the... vertical height of the first step shall be no more than 609 millimeters (24 inches) from ground level. (3... requirement. The step need not retain the disc at rest. (5) Step strength. Each step must withstand a vertical...

  19. 49 CFR 399.207 - Truck and truck-tractor access requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... requirements: (1) Vertical height. All measurements of vertical height shall be made from ground level with the... vertical height of the first step shall be no more than 609 millimeters (24 inches) from ground level. (3... requirement. The step need not retain the disc at rest. (5) Step strength. Each step must withstand a vertical...

  20. 49 CFR 399.207 - Truck and truck-tractor access requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... requirements: (1) Vertical height. All measurements of vertical height shall be made from ground level with the... vertical height of the first step shall be no more than 609 millimeters (24 inches) from ground level. (3... requirement. The step need not retain the disc at rest. (5) Step strength. Each step must withstand a vertical...

  1. 49 CFR 399.207 - Truck and truck-tractor access requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... requirements: (1) Vertical height. All measurements of vertical height shall be made from ground level with the... vertical height of the first step shall be no more than 609 millimeters (24 inches) from ground level. (3... requirement. The step need not retain the disc at rest. (5) Step strength. Each step must withstand a vertical...

  2. Issues Management Process Course # 38401

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binion, Ula Marie

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory’s IM process; Understand your role in the Laboratory’s IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory’s IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.

  3. Determining geometric error model parameters of a terrestrial laser scanner through Two-face, Length-consistency, and Network methods

    PubMed Central

    Wang, Ling; Muralikrishnan, Bala; Rachakonda, Prem; Sawyer, Daniel

    2017-01-01

    Terrestrial laser scanners (TLS) are increasingly used in large-scale manufacturing and assembly where required measurement uncertainties are on the order of a few tenths of a millimeter or smaller. In order to meet these stringent requirements, systematic errors within a TLS are compensated in situ through self-calibration. In the Network method of self-calibration, numerous targets distributed in the work volume are measured from multiple locations with the TLS to determine parameters of the TLS error model. In this paper, we propose two new self-calibration methods, the Two-face method and the Length-consistency method. The Length-consistency method is proposed as a more efficient way of realizing the Network method, where the lengths between pairs of targets measured from multiple TLS positions are compared to determine TLS model parameters. The Two-face method is a two-step process. In the first step, many model parameters are determined directly from the difference between front-face and back-face measurements of targets distributed in the work volume. In the second step, all remaining model parameters are determined through the Length-consistency method. We compare the Two-face method, the Length-consistency method, and the Network method in terms of the uncertainties in the model parameters, and demonstrate the validity of our techniques using a calibrated scale bar and front-face back-face target measurements. The clear advantage of these self-calibration methods is that neither a reference instrument nor calibrated artifacts are required, thus significantly lowering the cost involved in the calibration process. PMID:28890607
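
    The Length-consistency principle lends itself to a short numerical illustration: the distance between two fixed targets must agree regardless of where the scanner stands, so any disagreement exposes a systematic error. The target layout, the 2 mm rangefinder-offset error model, and all numbers below are illustrative assumptions, not the authors' instrument model.

```python
import numpy as np

# Hedged sketch of the Length-consistency idea: lengths between fixed
# targets, measured from two scanner stations, should match; a systematic
# ranging error makes them disagree. Geometry and error model are made up.

rng = np.random.default_rng(0)
targets = rng.uniform(-5, 5, size=(10, 3))       # true target positions, m

def measure(points, station, range_offset=0.002):
    """Simulate TLS ranging from `station` with a 2 mm rangefinder offset."""
    vec = points - station
    r = np.linalg.norm(vec, axis=1)
    return station + vec / r[:, None] * (r + range_offset)[:, None]

p1 = measure(targets, np.array([0.0, 0.0, 1.5]))
p2 = measure(targets, np.array([4.0, -3.0, 1.5]))

i, j = np.triu_indices(len(targets), k=1)        # all target pairs
d1 = np.linalg.norm(p1[i] - p1[j], axis=1)
d2 = np.linalg.norm(p2[i] - p2[j], axis=1)
print(f"max length disagreement: {np.abs(d1 - d2).max() * 1e3:.3f} mm")
```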

  4. Classifying seismic waveforms from scratch: a case study in the alpine environment

    NASA Astrophysics Data System (ADS)

    Hammer, C.; Ohrnberger, M.; Fäh, D.

    2013-01-01

    Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step for successfully analyzing those data is the correct detection of various event types. However, visual scanning is a time-consuming task. Applying standard detection techniques such as the STA/LTA trigger still requires manual review for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows classification to start from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action allows classifier properties to be learned from a single waveform example and some hours of background recording. Besides reducing the required workload, this also enables the detection of very rare events. The latter feature in particular provides a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system using a data set from the Swiss Seismological Survey, achieving very high recognition rates. In detail we document all refinements of the classifier, providing a step-by-step guide for the fast set-up of a well-working classification system.
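
    The one-model-per-class scheme can be sketched in a few lines. This is not the authors' implementation: the hmmlearn library, the random stand-in features (a real system would use spectral or similar features), and the per-class log-likelihood scoring are all assumptions for illustration.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # pip install hmmlearn

# Hedged sketch of HMM-based event classification in the spirit of the
# abstract: one HMM per class, scored against an incoming feature window.
# Features here are random stand-ins for real seismic features.

rng = np.random.default_rng(1)

def train_class_model(features):
    """Learn one HMM per class of interest; features: (n_frames, n_features)."""
    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=25)
    model.fit(features)
    return model

models = {
    "rockfall": train_class_model(rng.normal(0, 1, (200, 4))),
    "earthquake": train_class_model(rng.normal(2, 1, (200, 4))),
}

window = rng.normal(2, 1, (50, 4))    # incoming data-stream segment
scores = {name: m.score(window) for name, m in models.items()}
print("detected class:", max(scores, key=scores.get))
```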

  5. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
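
    The claimed method reduces to compiling per-step parameters and rolling them up into process metrics, which a short sketch can make concrete. The field names and the two lean metrics computed below are illustrative assumptions, not the patented mathematical model.

```python
# Hedged sketch of the evaluation idea in the patent: compile per-step
# parameters, then roll them up into process metrics. Field names and the
# metrics chosen are illustrative, not the patented model.

steps = [  # user-supplied manufacturing process step parameters
    {"name": "stamp", "cycle_s": 30, "queue_s": 600, "defect_rate": 0.02},
    {"name": "weld",  "cycle_s": 45, "queue_s": 900, "defect_rate": 0.01},
    {"name": "paint", "cycle_s": 60, "queue_s": 300, "defect_rate": 0.03},
]

total_cycle = sum(s["cycle_s"] for s in steps)
total_lead = sum(s["cycle_s"] + s["queue_s"] for s in steps)
yield_rate = 1.0
for s in steps:
    yield_rate *= 1 - s["defect_rate"]

print(f"value-added ratio: {total_cycle / total_lead:.1%}")  # lean metric
print(f"rolled-throughput yield: {yield_rate:.1%}")
```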

  6. Innovative and water based stripping approach for thick and bulk photoresists

    NASA Astrophysics Data System (ADS)

    Rudolph, Matthias; Schumann, Dirk; Thrun, Xaver; Esche, Silvio; Hohle, Christoph

    2014-10-01

    The usage of phase-fluid-based stripping agents to remove photoresists from silicon substrates was studied. Photoresists are required for many silicon-based technologies such as MEMS patterning, 3D integration, or frontend- and backend-of-line semiconductor applications [1]. Although the use of resists is very common, their successful integration often depends on the ability to remove the resist after certain processing steps. On the one hand, the resist changes during subsequent process steps, which can cause thermally activated cross-linking that increases the stripping complexity. Resist removal is also challenging after the formation of a hard polymer surface layer, called a skin or crust, during plasma or implant processes [2]. On the other hand, the choice of stripping chemistry is often limited due to the presence of functional materials such as metals, which can be damaged by aggressive stripping chemistries [3].

  7. Adaptive interference cancel filter for evoked potential using high-order cumulants.

    PubMed

    Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei

    2004-01-01

    This paper presents evoked potential (EP) processing using an adaptive interference cancel (AIC) filter with second- and higher-order cumulants. In the conventional ensemble averaging method, experiments must be repeated many times to record the required data. Recently, the use of the AIC structure with second-order statistics in processing EP has proved more efficient than the traditional averaging method, but it is sensitive both to the reference signal statistics and to the choice of step size. Thus, we propose a higher-order-statistics-based AIC method to overcome these disadvantages. This study was conducted on somatosensory EP corrupted with EEG. A gradient-type algorithm is used in the AIC method. Comparisons of the AIC filter with second-, third-, and fourth-order statistics are also presented in this paper. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the selection of step size and reference input.
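
    The AIC structure with a gradient-type update is essentially an adaptive noise canceller. The sketch below shows only the standard second-order (mean-square-error, LMS) case on synthetic EP-plus-EEG data; it does not reproduce the paper's cumulant-based criteria, and every signal and parameter value is an illustrative assumption.

```python
import numpy as np

# Hedged sketch of an adaptive interference canceller (AIC) with a
# gradient-type (LMS) update. The paper uses higher-order cumulant
# criteria; this minimal version shows only the standard second-order
# (mean-square error) case on synthetic EP + EEG data.

rng = np.random.default_rng(2)
n, taps, mu = 5000, 8, 0.01
eeg = rng.normal(0, 1, n)                          # interference reference
ep = 0.3 * np.sin(2 * np.pi * np.arange(n) / 50)   # stand-in "evoked potential"
primary = ep + np.convolve(eeg, [0.6, 0.3, 0.1], mode="same")

w = np.zeros(taps)
out = np.zeros(n)
for k in range(taps, n):
    x = eeg[k - taps:k][::-1]      # reference tap vector
    e = primary[k] - w @ x         # error = EP estimate
    w += mu * e * x                # LMS gradient step
    out[k] = e

print(f"residual interference power: {np.var(out - ep):.4f}")
```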

  8. The algorithm for automatic detection of the calibration object

    NASA Astrophysics Data System (ADS)

    Artem, Kruglov; Irina, Ugfeld

    2017-06-01

    The problem of automatic image calibration is considered in this paper. The most challenging task of automatic calibration is the proper detection of the calibration object. Solving this problem required applying methods and algorithms of digital image processing, such as morphology, filtering, edge detection, and shape approximation. The step-by-step development of the algorithm and its adaptation to the specific conditions of log cuts in the image background is presented. Testing of the automatic calibration module was carried out under the conditions of the production process of a logging enterprise. In the tests, the average rate of automatic isolation of the calibration object is 86.1% in the absence of type 1 errors. The algorithm was implemented in the automatic calibration module within mobile software for log deck volume measurement.
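
    The pipeline named in the abstract (filtering, morphology, edge detection, shape approximation) maps directly onto standard OpenCV calls. The sketch below is a generic version of such a pipeline; the image file, the Otsu threshold, and the assumption of a roughly rectangular calibration object are illustrative, since the authors' actual criteria are not given.

```python
import cv2
import numpy as np

# Hedged sketch of the detection pipeline named in the abstract:
# filtering -> morphology -> edge detection -> shape approximation.
# "log_deck.jpg" is a hypothetical input file.

img = cv2.imread("log_deck.jpg", cv2.IMREAD_GRAYSCALE)
if img is None:
    raise SystemExit("image not found")

img = cv2.GaussianBlur(img, (5, 5), 0)                   # filtering
_, bw = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
kernel = np.ones((5, 5), np.uint8)
bw = cv2.morphologyEx(bw, cv2.MORPH_OPEN, kernel)        # morphology
edges = cv2.Canny(bw, 50, 150)                           # edge detection

contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
for c in contours:                                       # shape approximation
    approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
    if len(approx) == 4 and cv2.contourArea(approx) > 1000:
        print("calibration object candidate at", cv2.boundingRect(approx))
```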

  9. Standardized Competencies for Parenteral Nutrition Order Review and Parenteral Nutrition Preparation, Including Compounding: The ASPEN Model.

    PubMed

    Boullata, Joseph I; Holcombe, Beverly; Sacks, Gordon; Gervasio, Jane; Adams, Stephen C; Christensen, Michael; Durfee, Sharon; Ayers, Phil; Marshall, Neil; Guenter, Peggi

    2016-08-01

    Parenteral nutrition (PN) is a high-alert medication with a complex drug use process. Key steps in the process include the review of each PN prescription followed by the preparation of the formulation. The preparation step includes compounding the PN or activating a standardized commercially available PN product. The verification and review, as well as preparation of this complex therapy, require competency that may be determined by using a standardized process for pharmacists and for pharmacy technicians involved with PN. An American Society for Parenteral and Enteral Nutrition (ASPEN) standardized model for PN order review and PN preparation competencies is proposed based on a competency framework, the ASPEN-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines, and is intended for institutions and agencies to use with their staff. © 2016 American Society for Parenteral and Enteral Nutrition.

  10. A hitchhiker's guide to an ISS experiment in under 9 months.

    PubMed

    Nadir, Andrei James; Sato, Kevin

    2017-01-01

    The International Space Station National Laboratory gives students a platform to conduct space-flight science experiments. To successfully take advantage of this opportunity, students and their mentors must have an understanding of how to develop and then conduct a science project on the International Space Station within a school year. Many factors influence the speed at which a project progresses. The first step is to develop a science plan, including defining a hypothesis, developing science objectives, and defining a concept of operation for conducting the flight experiment. The next step is to translate the plan into well-defined requirements for payload development. The last step is a rapid development process. Included in this step is identifying problems early and negotiating appropriate trade-offs between science and implementation complexity. Organizing the team and keeping players motivated is an equally important task, as is employing the right mentors. The project team must understand the flight experiment infrastructure, which includes the International Space Station environment, payload resource requirements and available components, fail-safe operations, system logs, and payload data. Without this understanding, project development can be impacted, resulting in schedule delays, added costs, undiagnosed problems, and data misinterpretation. The information and processes for conducting low-cost, rapidly developed student-based International Space Station experiments are presented, including insight into the system operations, the development environment, effective team organization, and data analysis. The details are based on the Valley Christian Schools (VCS, San Jose, CA) fluidic density experiment and penicillin experiment, which were developed by 13- and 14-year-old students and flown on the ISS.

  11. Study on Potential Changes in Geological and Disposal Environment Caused by 'Natural Phenomena' on a HLW Disposal System

    NASA Astrophysics Data System (ADS)

    Kawamura, M.; Umeda, K.; Ohi, T.; Ishimaru, T.; Niizato, T.; Yasue, K.; Makino, H.

    2007-12-01

    We have developed a formal evaluation method to assess the potential impact of natural phenomena (earthquakes and faulting; volcanism; uplift, subsidence, denudation and sedimentation; climatic and sea-level changes) on a High Level Radioactive Waste (HLW) Disposal System. In 2000, we developed perturbation scenarios in a generic and conservative sense and illustrated the potential impact on a HLW disposal system. As a result of this development, two points were highlighted for consideration in subsequent work: improving the scenarios from the viewpoints of realism, transparency, traceability and consistency, and avoiding extreme conservatism. We have thus subsequently developed a new procedure for describing such perturbation scenarios based on further studies of the characteristics of these natural perturbation phenomena in Japan. The approach to describing the perturbation scenario is developed in five steps: Step 1: Description of potential processes of the phenomena and their impacts on the geological environment. Step 2: Characterization of potential changes of the geological environment in terms of T-H-M-C (Thermal - Hydrological - Mechanical - Chemical) processes. The focus is on specific T-H-M-C parameters that influence geological barrier performance, utilizing the input from Step 1. Step 3: Classification of potential influences, based on similarity of T-H-M-C perturbations. This leads to development of perturbation scenarios to serve as a basis for consequence analysis. Step 4: Establishing models and parameters for performance assessment. Step 5: Calculation and assessment. This study focuses on identifying key T-H-M-C processes associated with perturbations at Step 2. This framework has two advantages. The first is assuring maintenance of traceability during the scenario construction processes, facilitating the production and structuring of suitable records. The second is providing effective elicitation and organization of information from a wide range of investigations in the earth sciences within a performance assessment context. In this framework, scenario development work proceeds in a stepwise manner, to ensure clear identification of the impact of processes associated with these phenomena on a HLW disposal system. Output is organized to create credible scenarios with the required transparency, consistency, traceability and adequate conservatism. In this presentation, the potential impact of natural phenomena from the viewpoint of performance assessment for HLW disposal will be discussed and modeled using the approach.

  12. Planning effectiveness may grow on fault trees.

    PubMed

    Chow, C W; Haddad, K; Mannino, B

    1991-10-01

    The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.

  13. How to unlock the benefits of MRP (materiel requirements planning) II and Just-in-Time.

    PubMed

    Jacobi, M A

    1994-05-01

    Manufacturing companies need to use the best and most applicable parts of MRP II and JIT to run their businesses effectively. MRP II provides the methodology to plan and control the total resources of the company and focuses on the processes that add value to their customers' products. It is the cornerstone of total quality management, as it reduces the variability and costly activities in the communication and subsequent execution of the required steps from customer order to shipment. JIT focuses on simplifying the total business operation and execution of business processes. MRP II and JIT are the foundations for successful manufacturing businesses.

  14. Coagulation-Fenton coupled treatment for ecotoxicity reduction in highly polluted industrial wastewater.

    PubMed

    Perdigón-Melón, J A; Carbajo, J B; Petre, A L; Rosal, R; García-Calvo, E

    2010-09-15

    A coupled coagulation-Fenton process was applied for the treatment of cosmetic industry effluents. In a first step, FeSO(4) was used as coagulant and the non-precipitated Fe(2+) remaining in dissolution was used as catalyst in the further Fenton process. In the coagulation process a huge decrease in total organic carbon (TOC) was achieved, but the high concentration of phenol derivatives was not diminished. The decrease in TOC in the coagulation step significantly reduces the amount of H(2)O(2) required in the Fenton process for phenol depletion. The coupled process, using a H(2)O(2) dose of only 2 g l(-1), reduced TOC and total phenol to values lower than 40 and 0.10 mg l(-1), respectively. The short reaction period (less than 15 min) in TOC and phenol degradation bodes well for improving treatment in a continuous regime. The combination of both processes significantly reduced the ecotoxicity of raw effluent and markedly increased its biodegradability, thus allowing easier treatment by the conventional biological units in conventional sewage treatment plants (STPs). Copyright 2010 Elsevier B.V. All rights reserved.

  15. Antigen Masking During Fixation and Embedding, Dissected

    PubMed Central

    Scalia, Carla Rossana; Boi, Giovanna; Bolognesi, Maddalena Maria; Riva, Lorella; Manzoni, Marco; DeSmedt, Linde; Bosisio, Francesca Maria; Ronchi, Susanna; Leone, Biagio Eugenio; Cattoretti, Giorgio

    2016-01-01

    Antigen masking in routinely processed tissue is a poorly understood process caused by multiple factors. We sought to dissect the effect on antigenicity of each step of processing by using frozen sections as proxies of the whole tissue. An equivalent extent of antigen masking occurs across variable fixation times at room temperature. Most antigens benefit from longer fixation times (>24 hr) for optimal detection after antigen retrieval (AR; for example, Ki-67, bcl-2, ER). The transfer to a graded alcohol series results in an enhanced staining effect, reproduced by treating the sections with detergents, possibly because of a better access of the polymeric immunohistochemical detection system to tissue structures. A second round of masking occurs upon entering the clearing agent, mostly at the paraffin embedding step. This may depend on the non-freezable water removal. AR fully reverses the masking due both to the fixation time and the paraffin embedding. AR itself destroys some epitopes which do not survive routine processing. Processed frozen sections are a tool to investigate fixation and processing requirements for antigens in routine specimens. PMID:27798289

  16. Pilot Plant Testing of Hot Gas Building Decontamination Process

    DTIC Science & Technology

    1987-10-30

    Measurements during the last hours of the cooldown (after water traps in the line were installed) showed no detectable contamination from this station. ... Since we will not require refrigeration, additional generators probably will not be required. Water is trucked to the site. Agent-contaminated water ... surface. The gauze was handled by forceps during all of the sampling steps to prevent contamination after the solvent extraction clean-up of the gauze pads.

  17. Maintenance of the Maxillomandibular Position with Digital Workflow in Oral Rehabilitation: A Technical Note.

    PubMed

    Li, Zhongjie; Xia, Yingfeng; Chen, Kai; Zhao, Hanchi; Liu, Yang

    Prosthodontic oral rehabilitation procedures are time consuming and require efforts to maintain the confirmed maxillomandibular relationship. Several occlusal registrations and impressions are needed, and cross-mounting is performed to transfer the diagnostic wax-up to master working casts. The introduction of a digital workflow protocol reduces steps in the required process, and occlusal registrations with less deformation are used. The outcome is a maintained maxillomandibular position that is accurately and conveniently transferred.

  18. Defining the Costs of Reusable Flexible Ureteroscope Reprocessing Using Time-Driven Activity-Based Costing.

    PubMed

    Isaacson, Dylan; Ahmad, Tessnim; Metzler, Ian; Tzou, David T; Taguchi, Kazumi; Usawachintachit, Manint; Zetumer, Samuel; Sherer, Benjamin; Stoller, Marshall; Chi, Thomas

    2017-10-01

    Careful decontamination and sterilization of reusable flexible ureteroscopes used in ureterorenoscopy cases prevent the spread of infectious pathogens to patients and technicians. However, inefficient reprocessing and unavailability of ureteroscopes sent out for repair can contribute to expensive operating room (OR) delays. Time-driven activity-based costing (TDABC) was applied to describe the time and costs involved in reprocessing. Direct observation and timing were performed for all steps in reprocessing of reusable flexible ureteroscopes following operative procedures. Estimated times needed for each step by which damaged ureteroscopes identified during reprocessing are sent for repair were characterized through interviews with purchasing analyst staff. Process maps were created for reprocessing and repair detailing individual step times and their variances. Cost data for labor and disposables used were applied to calculate per minute and average step costs. Ten ureteroscopes were followed through reprocessing. Process mapping for ureteroscope reprocessing averaged 229.0 ± 74.4 minutes, whereas sending a ureteroscope for repair required an estimated 143 minutes per repair. Most steps demonstrated low variance between timed observations. Ureteroscope drying was the longest and highest variance step at 126.5 ± 55.7 minutes and was highly dependent on manual air flushing through the ureteroscope working channel and ureteroscope positioning in the drying cabinet. Total costs for reprocessing totaled $96.13 per episode, including the cost of labor and disposable items. Utilizing TDABC delineates the full spectrum of costs associated with ureteroscope reprocessing and identifies areas for process improvement to drive value-based care. At our institution, ureteroscope drying was one clearly identified target area. Implementing training in ureteroscope drying technique could save up to 2 hours per reprocessing event, potentially preventing expensive OR delays.
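
    The TDABC arithmetic itself is simple: each step's observed time is multiplied by the capacity cost rate of the resource performing it, and the products are summed. The sketch below illustrates this with made-up rates and a compressed three-step process map; only the 126.5-minute drying time echoes a figure reported above, and the result does not attempt to reproduce the paper's $96.13 total.

```python
# Hedged sketch of time-driven activity-based costing (TDABC) as applied
# in the abstract: cost per step = (step time) x (capacity cost rate).
# Rates, step names, and disposables cost are illustrative assumptions.

cost_per_min = {"technician": 0.55, "drying_cabinet": 0.05}  # $/min, assumed

process_map = [  # (step, minutes, resource)
    ("manual pre-clean", 20.0, "technician"),
    ("leak test + reprocessing", 82.5, "technician"),
    ("drying", 126.5, "drying_cabinet"),   # longest step, per the abstract
]

total = sum(minutes * cost_per_min[res] for _, minutes, res in process_map)
disposables = 15.00                        # assumed per-episode cost
print(f"labor/equipment: ${total:.2f}; "
      f"with disposables: ${total + disposables:.2f}")
```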

  19. Inverting the planning gradient: adjustment of grasps to late segments of multi-step object manipulations.

    PubMed

    Mathew, Hanna; Kunde, Wilfried; Herbort, Oliver

    2017-05-01

    When someone grasps an object, the grasp depends on the intended object manipulation and usually facilitates it. If several object manipulation steps are planned, the first step has been reported to primarily determine the grasp selection. We address whether the grasp can be aligned to the second step, if the second step's requirements exceed those of the first step. Participants grasped and rotated a dial first by a small extent and then by various extents in the opposite direction, without releasing the dial. On average, when the requirements of the first and the second step were similar, participants mostly aligned the grasp to the first step. When the requirements of the second step were considerably higher, participants aligned the grasp to the second step, even though the first step still had a considerable impact. Participants employed two different strategies. One subgroup initially aligned the grasp to the first step and then ceased adjusting the grasp to either step. Another group also initially aligned the grasp to the first step and then switched to aligning it primarily to the second step. The data suggest that participants are more likely to switch to the latter strategy when they experienced more awkward arm postures. In summary, grasp selections for multi-step object manipulations can be aligned to the second object manipulation step, if the requirements of this step clearly exceed those of the first step and if participants have some experience with the task.

  20. Sequence-dependent base pair stepping dynamics in XPD helicase unwinding

    PubMed Central

    Qi, Zhi; Pugh, Robert A; Spies, Maria; Chemla, Yann R

    2013-01-01

    Helicases couple the chemical energy of ATP hydrolysis to directional translocation along nucleic acids and transient duplex separation. Understanding helicase mechanism requires that the basic physicochemical process of base pair separation be understood. This necessitates monitoring helicase activity directly, at high spatio-temporal resolution. Using optical tweezers with single base pair (bp) resolution, we analyzed DNA unwinding by XPD helicase, a Superfamily 2 (SF2) DNA helicase involved in DNA repair and transcription initiation. We show that monomeric XPD unwinds duplex DNA in 1-bp steps, yet exhibits frequent backsteps and undergoes conformational transitions manifested in 5-bp backward and forward steps. Quantifying the sequence dependence of XPD stepping dynamics with near base pair resolution, we provide the strongest and most direct evidence thus far that forward, single-base pair stepping of a helicase utilizes the spontaneous opening of the duplex. The proposed unwinding mechanism may be a universal feature of DNA helicases that move along DNA phosphodiester backbones. DOI: http://dx.doi.org/10.7554/eLife.00334.001 PMID:23741615

  1. Direct, enantioselective α-alkylation of aldehydes using simple olefins.

    PubMed

    Capacci, Andrew G; Malinowski, Justin T; McAlpine, Neil J; Kuhne, Jerome; MacMillan, David W C

    2017-11-01

    Although the α-alkylation of ketones has already been established, the analogous reaction using aldehyde substrates has proven surprisingly elusive. Despite the structural similarities between the two classes of compounds, the sensitivity and unique reactivity of the aldehyde functionality has typically required activated substrates or specialized additives. Here, we show that the synergistic merger of three catalytic processes-photoredox, enamine and hydrogen-atom transfer (HAT) catalysis-enables an enantioselective α-aldehyde alkylation reaction that employs simple olefins as coupling partners. Chiral imidazolidinones or prolinols, in combination with a thiophenol, iridium photoredox catalyst and visible light, have been successfully used in a triple catalytic process that is temporally sequenced to deliver a new hydrogen and electron-borrowing mechanism. This multicatalytic process enables both intra- and intermolecular aldehyde α-methylene coupling with olefins to construct both cyclic and acyclic products, respectively. With respect to atom and step-economy ideals, this stereoselective process allows the production of high-value molecules from feedstock chemicals in one step while consuming only photons.

  2. Influence of bilayer resist processing on p-i-n OLEDs: towards multicolor photolithographic structuring of organic displays

    NASA Astrophysics Data System (ADS)

    Krotkus, Simonas; Nehm, Frederik; Janneck, Robby; Kalkura, Shrujan; Zakhidov, Alex A.; Schober, Matthias; Hild, Olaf R.; Kasemann, Daniel; Hofmann, Simone; Leo, Karl; Reineke, Sebastian

    2015-03-01

    Recently, bilayer resist processing combined with development in hydrofluoroether (HFE) solvents has been shown to enable single-color structuring of vacuum-deposited state-of-the-art organic light-emitting diodes (OLED). In this work, we focus on the further steps required to achieve multicolor structuring of p-i-n OLEDs using a bilayer resist approach. We show that the green phosphorescent OLED stack is undamaged after lift-off in HFEs, which is a necessary step in order to achieve an RGB pixel array structured by means of photolithography. Furthermore, we investigate the influence of both double resist processing and exposure to ambient conditions on red OLEDs, on the basis of the electrical, optical and lifetime parameters of the devices. Additionally, water vapor transmission rates of the single- and bilayer systems are evaluated with a thin-Ca-film conductance test. We conclude that diffusion of propylene glycol methyl ether acetate (PGMEA) through the fluoropolymer film is the main mechanism behind the OLED degradation observed after bilayer processing.

  3. Technical difficulties and solutions of direct transesterification process of microbial oil for biodiesel synthesis.

    PubMed

    Yousuf, Abu; Khan, Maksudur Rahman; Islam, M Amirul; Wahid, Zularisam Ab; Pirozzi, Domenico

    2017-01-01

    Microbial oils are considered an alternative to vegetable oils or animal fats as biodiesel feedstock. Microalgae and oleaginous yeasts are the main candidate microbial oil producers. However, biodiesel synthesis from these sources is associated with high cost and process complexity. The traditional transesterification method includes several steps such as biomass drying, cell disruption, oil extraction and solvent recovery. Therefore, direct transesterification, or in situ transesterification, which combines all the steps in a single reactor, has been suggested to make the process cost-effective. Nevertheless, the process is not yet applicable to large-scale biodiesel production owing to difficulties such as the high water content of the biomass, which slows the reaction rate, and the hurdles of cell disruption, which lower the efficiency of oil extraction. Additionally, it requires high heating energy in the solvent extraction and recovery stage. To resolve these difficulties, this review suggests the application of antimicrobial peptides and high electric fields to foster microbial cell wall disruption.

  4. Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm

    NASA Astrophysics Data System (ADS)

    Mathai, J.; Mujumdar, P.

    2017-12-01

    A key focus of this study is to develop a method that is physically consistent with hydrologic processes and can capture the short-term characteristics of the daily hydrograph as well as the correlation of streamflow in the temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales such as daily scales. Also, simultaneous generation of synthetic flows at different sites in the same basin is required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of streamflow time series. The method has two steps: in step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising-limb increments randomly sampled from a Gamma distribution and the falling limb modelled as an exponential recession; in step 2, the streamflow generated in step 1 is input to a nonparametric K-nearest neighbor (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN-based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. However, daily flow generated using the Markov chain approach is capable of generating a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of the daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and the improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
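
    Both steps can be condensed into a short single-station sketch. The state persistence, Gamma parameters, recession constant, and neighborhood size below are illustrative assumptions, not the study's calibrated values, and the multi-site aspect is omitted.

```python
import numpy as np

# Hedged sketch of the two-step generator described above: (1) a two-state
# Markov chain (rise/fall) with Gamma rising-limb increments and an
# exponential recession, then (2) a KNN bootstrap that resamples successors
# of days with similar flow. All parameter values are assumptions.

rng = np.random.default_rng(3)
n_days, p_stay = 365, 0.7            # persistence of the rise/fall states
k_shape, k_scale, recession = 2.0, 5.0, 0.95

# Step 1: Markov-chain daily flow at one station
flow = np.empty(n_days)
flow[0], rising = 50.0, True
for t in range(1, n_days):
    if rng.random() > p_stay:
        rising = not rising          # switch state
    if rising:
        flow[t] = flow[t - 1] + rng.gamma(k_shape, k_scale)
    else:
        flow[t] = flow[t - 1] * recession

# Step 2: KNN bootstrap -- resample each day from the successors of its
# k nearest neighbors in the step-1 series
k = 5
sim = np.empty(n_days)
sim[0] = flow[0]
for t in range(1, n_days):
    nn = np.argsort(np.abs(flow[:-1] - sim[t - 1]))[:k]  # similar days
    sim[t] = flow[nn[rng.integers(k)] + 1]               # their successors

print(f"simulated mean flow: {sim.mean():.1f}")
```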

  5. Modeling fatigue.

    PubMed

    Sumner, Walton; Xu, Jin Zhong

    2002-01-01

    The American Board of Family Practice is developing a patient simulation program to evaluate diagnostic and management skills. The simulator must give temporally and physiologically reasonable answers to symptom questions such as "Have you been tired?" A three-step process generates symptom histories. In the first step, the simulator determines points in time where it should calculate instantaneous symptom status. In the second step, a Bayesian network implementing a roughly physiologic model of the symptom generates a value on a severity scale at each sampling time. Positive, zero, and negative values represent increased, normal, and decreased status, as applicable. The simulator plots these values over time. In the third step, another Bayesian network inspects this plot and reports how the symptom changed over time. This mechanism handles major trends, multiple and concurrent symptom causes, and gradually effective treatments. Other temporal insights, such as observations about short-term symptom relief, require complementary mechanisms.

  6. The Pharmaceutical Capping Process-Correlation between Residual Seal Force, Torque Moment, and Flip-off Removal Force.

    PubMed

    Mathaes, Roman; Mahler, Hanns-Christian; Vorgrimler, Lothar; Steinberg, Henrik; Dreher, Sascha; Roggo, Yves; Nieto, Alejandra; Brown, Helen; Roehl, Holger; Adler, Michael; Luemkemann, Joerg; Huwyler, Joerg; Lam, Philippe; Stauch, Oliver; Mohl, Silke; Streubel, Alexander

    2016-01-01

    The majority of parenteral drug products are manufactured in glass vials with an elastomeric rubber stopper and a crimp cap. The vial sealing process is a critical process step during fill-and-finish operations, as it defines the seal quality of the final product. Different critical capping process parameters can affect rubber stopper defects, rubber stopper compression, container closure integrity, and also crimp cap quality. A sufficiently high force to remove the flip-off button prior to usage is required to ensure the quality of the drug product unit during storage, transportation, and until opening and use. Therefore, the final product is 100% visually inspected for loose or defective crimp caps, which is subjective as well as time- and labor-intensive. In this study, we sealed several container closure system configurations with different capping equipment settings (with corresponding residual seal force values) to investigate the torque moment required to turn the crimp cap. A correlation between torque moment and residual seal force has been established. The torque moment was found to be influenced by several parameters, including the diameter of the vial head, the type of rubber stopper (serum or lyophilized) and the type of crimp cap (West(®) or Datwyler(®)). In addition, we measured the force required to remove the flip-off button of a sealed container closure system. The capping process had no influence on the measured forces; however, it was possible to detect partially crimped vials. In conclusion, a controlled capping process with a defined target residual seal force range leads to a tight crimp cap on a sealed container closure system and can ensure product quality. © PDA, Inc. 2016.

  7. Extrinsic Repair of Injured Dendrites as a Paradigm for Regeneration by Fusion in Caenorhabditis elegans

    PubMed Central

    Oren-Suissa, Meital; Gattegno, Tamar; Kravtsov, Veronika; Podbilewicz, Benjamin

    2017-01-01

    Injury triggers regeneration of axons and dendrites. Research has identified factors required for axonal regeneration outside the CNS, but little is known about regeneration triggered by dendrotomy. Here, we study neuronal plasticity triggered by dendrotomy and determine the fate of complex PVD arbors following laser surgery of dendrites. We find that severed primary dendrites grow toward each other and reconnect via branch fusion. Simultaneously, terminal branches lose self-avoidance and grow toward each other, meeting and fusing at the tips via an AFF-1-mediated process. Ectopic branch growth is identified as a step in the regeneration process required for bypassing the lesion site. Failure of reconnection to the severed dendrites results in degeneration of the distal end of the neuron. We discover pruning of excess branches via EFF-1 that acts to recover the original wild-type arborization pattern in a late stage of the process. In contrast, AFF-1 activity during dendritic auto-fusion is derived from the lateral seam cells and not autonomously from the PVD neuron. We propose a model in which AFF-1-vesicles derived from the epidermal seam cells fuse neuronal dendrites. Thus, EFF-1 and AFF-1 fusion proteins emerge as new players in neuronal arborization and maintenance of arbor connectivity following injury in Caenorhabditis elegans. Our results demonstrate that there is a genetically determined multi-step pathway to repair broken dendrites in which EFF-1 and AFF-1 act on different steps of the pathway. EFF-1 is essential for dendritic pruning after injury and extrinsic AFF-1 mediates dendrite fusion to bypass injuries. PMID:28283540

  8. Solving satisfiability problems using a novel microarray-based DNA computer.

    PubMed

    Lin, Che-Hsin; Cheng, Hsiao-Ping; Yang, Chang-Biau; Yang, Chia-Ning

    2007-01-01

    An algorithm based on a modified sticker model, accompanied by an advanced MEMS-based microarray technology, is demonstrated to solve the SAT problem, which has long served as a benchmark in DNA computing. Unlike conventional DNA computing algorithms, which need an initial data pool covering correct and incorrect answers and then execute a series of separation procedures to destroy the unwanted ones, we build solutions in parts, satisfying one clause per step, and eventually solve the entire Boolean formula step by step. No time-consuming sample preparation procedures or delicate sample-application equipment were required for the computing process. Moreover, experimental results show that the bound DNA sequences can withstand the chemical solutions used during the computing processes, so the proposed method should be useful in dealing with large-scale problems.
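
    A software analogue makes the build-in-parts strategy clear: instead of filtering down a complete initial pool, partial assignments are extended so that each step satisfies one more clause. The three-clause CNF instance below is an illustrative assumption, not the instance solved in the paper.

```python
# Hedged software analogue of the clause-by-clause strategy described
# above: partial assignments are grown one clause at a time, rather than
# filtering a full initial solution pool as in conventional DNA computing.

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3), variables 1..3
clauses = [[(1, True), (2, False)],
           [(2, True), (3, True)],
           [(1, False), (3, False)]]

partials = [{}]                       # start with the empty assignment
for clause in clauses:                # one clause per step, as in the paper
    extended = []
    for assign in partials:
        for var, want in clause:
            if assign.get(var) == want:
                extended.append(assign)         # clause already satisfied
                break
            if var not in assign:               # extend to satisfy it
                extended.append({**assign, var: want})
    partials = extended

print(f"{len(partials)} partial solutions satisfy all clauses")
print("example:", partials[0] if partials else None)
```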

  9. Direct write with microelectronic circuit fabrication

    DOEpatents

    Drummond, T.; Ginley, D.

    1988-05-31

    In a process for deposition of material onto a substrate, for example, the deposition of metals or dielectrics onto a semiconductor laser, the material is deposited by providing a colloidal suspension of the material and directly writing the suspension onto the substrate surface by ink jet printing techniques. This procedure minimizes the handling requirements of the substrate during the deposition process and also minimizes the exchange of energy between the material to be deposited and the substrate at the interface. The deposited material is then resolved into a desired pattern, preferably by subjecting the deposit to a laser annealing step. The laser annealing step provides high resolution of the resultant pattern while minimizing the overall thermal load of the substrate and permitting precise control of interface chemistry and interdiffusion between the substrate and the deposit. 3 figs.

  10. Direct write with microelectronic circuit fabrication

    DOEpatents

    Drummond, Timothy; Ginley, David

    1992-01-01

    In a process for deposition of material onto a substrate, for example, the deposition of metals or dielectrics onto a semiconductor laser, the material is deposited by providing a colloidal suspension of the material and directly writing the suspension onto the substrate surface by ink jet printing techniques. This procedure minimizes the handling requirements of the substrate during the deposition process and also minimizes the exchange of energy between the material to be deposited and the substrate at the interface. The deposited material is then resolved into a desired pattern, preferably by subjecting the deposit to a laser annealing step. The laser annealing step provides high resolution of the resultant pattern while minimizing the overall thermal load of the substrate and permitting precise control of interface chemistry and interdiffusion between the substrate and the deposit.

  11. Open LED Illuminator: A Simple and Inexpensive LED Illuminator for Fast Multicolor Particle Tracking in Neurons

    PubMed Central

    Bosse, Jens B.; Tanneti, Nikhila S.; Hogue, Ian B.; Enquist, Lynn W.

    2015-01-01

    Dual-color live cell fluorescence microscopy of fast intracellular trafficking processes, such as axonal transport, requires rapid switching of illumination channels. Typical broad-spectrum sources necessitate the use of mechanical filter switching, which introduces delays between acquisition of different fluorescence channels, impeding the interpretation and quantification of highly dynamic processes. Light Emitting Diodes (LEDs), however, allow modulation of excitation light in microseconds. Here we provide a step-by-step protocol to enable any scientist to build a research-grade LED illuminator for live cell microscopy, even without prior experience with electronics or optics. We quantify and compare components, discuss our design considerations, and demonstrate the performance of our LED illuminator by imaging axonal transport of herpes virus particles with high temporal resolution. PMID:26600461

  12. A detailed description of the implementation of inpatient insulin orders with a commercial electronic health record system.

    PubMed

    Neinstein, Aaron; MacMaster, Heidemarie Windham; Sullivan, Mary M; Rushakoff, Robert

    2014-07-01

    In the setting of Meaningful Use laws and professional society guidelines, hospitals are rapidly implementing electronic glycemic management order sets. There are a number of best practices established in the literature for glycemic management protocols and programs. We believe that this is the first published account of the detailed steps to be taken to design, implement, and optimize glycemic management protocols in a commercial computerized provider order entry (CPOE) system. Prior to CPOE implementation, our hospital already had a mature glycemic management program. To transition to CPOE, we underwent the following 4 steps: (1) preparation and requirements gathering, (2) design and build, (3) implementation and dissemination, and (4) optimization. These steps required more than 2 years of coordinated work between physicians, nurses, pharmacists, and programmers. With the move to CPOE, our complex glycemic management order sets were successfully implemented without any significant interruptions in care. With feedback from users, we have continued to refine the order sets, and this remains an ongoing process. Successful implementation of glycemic management protocols in CPOE is dependent on broad stakeholder input and buy-in. When using a commercial CPOE system, there may be limitations of the system, necessitating workarounds. There should be an upfront plan to apply resources for continuous process improvement and optimization after implementation. © 2014 Diabetes Technology Society.

  13. Molding cork sheets to complex shapes

    NASA Technical Reports Server (NTRS)

    Sharpe, M. H.; Simpson, W. G.; Walker, H. M.

    1977-01-01

    Partially cured cork sheet is easily formed to complex shapes and then final-cured. Temperature and pressure levels required for process depend upon resin system used and final density and strength desired. Sheet can be bonded to surface during final cure, or can be first-formed in mold and bonded to surface in separate step.

  14. Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.

    ERIC Educational Resources Information Center

    Meghabghab, Dania Bilal

    Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field- tested activities. After discussing needs…

  15. Neural Correlates of Sequence Learning with Stochastic Feedback

    ERIC Educational Resources Information Center

    Averbeck, Bruno B.; Kilner, James; Frith, Christopher D.

    2011-01-01

    Although much is known about decision making under uncertainty when only a single step is required in the decision process, less is known about sequential decision making. We carried out a stochastic sequence learning task in which subjects had to use noisy feedback to learn sequences of button presses. We compared flat and hierarchical behavioral…

  16. Home Language Survey Data Quality Self-Assessment. REL 2017-198

    ERIC Educational Resources Information Center

    Henry, Susan F.; Mello, Dan; Avery, Maria-Paz; Parker, Caroline; Stafford, Erin

    2017-01-01

    Most state departments of education across the United States recommend or require that districts use a home language survey as the first step in a multistep process of identifying students who qualify for English learner student services. School districts typically administer the home language survey to parents and guardians during a student's…

  17. Skill Sets Required for Environmental Engineering and Where They Are Learned

    ERIC Educational Resources Information Center

    Reed, Kathaleen

    2010-01-01

    The purpose of this study was to investigate the knowledge, skills, abilities and traits environmental engineers need. Two questions were asked: what skills are considered important, and where are they learned? Dreyfus and Dreyfus' novice-to-expert model, which describes a progressive, five-step process of skill development that occurs over time…

  18. From Course Assessment to Redesign: A Hybrid-Vehicle Course as a Case Illustration

    ERIC Educational Resources Information Center

    Stanton, Ken C.; Bradley, Thomas H.

    2013-01-01

    Assessment has become a central aspect of engineering education for evaluating student learning, attaining accreditation, and ensuring accountability. However, the final step of the assessment process, which requires assessment results be used to redesign courses and programmes, is appreciably underdeveloped in the literature. As such, this work…

  19. 43 CFR 45.40 - What are the requirements for prehearing conferences?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and the schedule of remaining steps in the hearing process. (e) Failure to attend. Unless the ALJ... an initial prehearing conference with the parties at the time specified in the docketing notice under... material fact and exclude issues that do not qualify for review as factual, material, and disputed; (ii) To...

  20. Making Wishes Come True: A Guide to Successful School Expansions.

    ERIC Educational Resources Information Center

    Fowle, Bruce

    1993-01-01

    A school expansion project should be carefully orchestrated with following steps: compiling list of everyone's perceived needs; determining what is affordable; developing program of requirements; developing a master plan; and overseeing the process. Case studies of two New York State schools (one urban and one suburban) illustrate how the location…

  1. Evidence of Early Strategies in Learning to Walk

    ERIC Educational Resources Information Center

    Snapp-Childs, Winona; Corbetta, Daniela

    2009-01-01

    Learning to walk is a dynamic process requiring the fine coordination, assembly, and balancing of many body segments at once. For the young walker, coordinating all these behavioral levels may be quite daunting. In this study, we examine the whole-body strategies to which infants resort to produce their first independent steps and progress over…

  2. The value of decision models: Using ecologically based invasive plant management as an example

    USDA-ARS?s Scientific Manuscript database

    Humans have both fast and slow thought processes which influence our judgment and decision-making. The fast system is intuitive and valuable for decisions which do not require multiple steps or the application of logic or statistics. However, many decisions in natural resources are complex and req...

  3. Design of two-column batch-to-batch recirculation to enhance performance in ion-exchange chromatography.

    PubMed

    Persson, Oliver; Andersson, Niklas; Nilsson, Bernt

    2018-01-05

    Preparative liquid chromatography is a separation technique widely used in the manufacturing of fine chemicals and pharmaceuticals. A major drawback of the traditional single-column batch chromatography step is the trade-off between product purity and process performance. Recirculation of impure product can be utilized to make the trade-off more favorable. The aim of the present study was to investigate the use of a two-column batch-to-batch recirculation process step to increase performance compared to single-column batch chromatography at a high purity requirement. The separation of a ternary protein mixture on ion-exchange chromatography columns was used to evaluate the proposed process. The investigation used modelling and simulation of the process step, experimental validation, and optimization of the simulated process. In the presented case, the yield increases from 45.4% to 93.6% and the productivity increases 3.4 times compared to the performance of a batch run for a nominal case. A rapid build-up of product concentration can be seen during the first cycles, before the process reaches a cyclic steady state with reoccurring concentration profiles. The optimization of the simulation model predicts that the recirculated salt can be used as a flying start for the elution, which would enhance the process performance. The proposed process is more complex than a batch process, but may improve separation performance, especially while operating at cyclic steady state. The recirculation of impure fractions reduces product losses and ensures separation of the product to a high degree of purity. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed Central

    LeBozec, C.; Jaulent, M. C.; Zapletal, E.; Degoulet, P.

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case-base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and making visualization of cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required but UML seems to be a promising formalism, improving the communication between the developers and users. PMID:9929346

  6. Route to one-step microstructure mold fabrication for PDMS microfluidic chip

    NASA Astrophysics Data System (ADS)

    Lv, Xiaoqing; Geng, Zhaoxin; Fan, Zhiyuan; Wang, Shicai; Su, Yue; Fang, Weihao; Pei, Weihua; Chen, Hongda

    2018-04-01

    Microstructure mold fabrication for PDMS microfluidic chips remains a complex and time-consuming process requiring special equipment and protocols: photolithography and etching. Thus, a rapid and cost-effective method is highly needed. Compared with the traditional microfluidic chip fabrication process based on micro-electromechanical system (MEMS) technology, this method is simple and easy to implement, and the whole fabrication process only requires 1-2 h. Microstructures of different sizes, from 100 to 1000 μm, were fabricated and used to culture four kinds of breast cancer cell lines. Cell viability and morphology were assessed when the cells were cultured in the micro straight channels, micro square holes, and the bonded PDMS-glass microfluidic chip. The experimental results indicate that the microfluidic chip performs well and meets the experimental requirements. This method can greatly reduce the processing time and cost of microfluidic chips, and provides a simple and effective route to structure design in the fields of biological microfabrication and microfluidic chips.

  7. Liquid fuels from food waste: An alternative process to co-digestion

    NASA Astrophysics Data System (ADS)

    Sim, Yoke-Leng; Ch'ng, Boon-Juok; Mok, Yau-Cheng; Goh, Sok-Yee; Hilaire, Dickens Saint; Pinnock, Travis; Adams, Shemlyn; Cassis, Islande; Ibrahim, Zainab; Johnson, Camille; Johnson, Chantel; Khatim, Fatima; McCormack, Andrece; Okotiuero, Mary; Owens, Charity; Place, Meoak; Remy, Cristine; Strothers, Joel; Waithe, Shannon; Blaszczak-Boxe, Christopher; Pratt, Lawrence M.

    2017-04-01

    Waste from uneaten, spoiled, or otherwise unusable food is an untapped source of material for biofuels. A process is described to recover the oil from mixed food waste, together with a solid residue. This process includes grinding the food waste to an aqueous slurry, skimming off the oil, a combined steam treatment of the remaining solids concurrent with extrusion through a porous cylinder to release the remaining oil, a second oil skimming step, and centrifuging the solids to obtain a moist solid cake for fermentation. The water, together with any resulting oil from the centrifuging step, is recycled back to the grinding step, and the cycle is repeated. The efficiency of oil extraction increases with the oil content of the waste, and greater than 90% of the oil was collected from waste containing at least 3% oil based on the wet mass. Fermentation was performed on the solid cake to obtain ethanol, and the dried solid fermentation residue was a nearly odorless material with potential uses as biochar or in gasification or compost production. This technology has the potential to enable large producers of food waste to comply with new laws that require this material to be diverted from landfills.

  8. Flowsheet Analysis of U-Pu Co-Crystallization Process as a New Reprocessing System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shunji Homma; Jun-ichi Ishii; Jiro Koga

    2006-07-01

    A new fuel reprocessing system based on a U-Pu co-crystallization process is proposed and examined by flowsheet analysis. This reprocessing system is based on the fact that hexavalent plutonium in nitric acid solution is co-crystallized with uranyl nitrate, whereas it is not crystallized when uranyl nitrate is absent from the solution. The system consists of five steps: dissolution of spent fuel, plutonium oxidation, U-Pu co-crystallization as a co-decontamination step, re-dissolution of the crystals, and U re-crystallization as a U-Pu separation step. The system requires recycling of the mother liquor from the U-Pu co-crystallization step, and the appropriate recycle ratio is determined by flowsheet analysis such that satisfactory decontamination is achieved. A further flowsheet study using four different compositions of LWR spent fuel demonstrates that a constant ratio of plutonium to uranium in the mother liquor from the re-crystallization step is achieved for every composition by controlling the temperature. It is also demonstrated, by comparison with the Purex process, that the size of a plant based on the proposed system is significantly reduced. (authors)
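
    The role of the recycle ratio can be sketched with a successive-substitution loop calculation (illustrative Python only; the single split factor below is a hypothetical stand-in for the full flowsheet):

    ```python
    # Illustrative recycle-loop balance: iterate until the stream entering the
    # crystallizer stops changing (the loop's steady state). Not the published
    # flowsheet; split_to_crystal and recycle_ratio are placeholder values.

    def loop_steady_state(feed_u=1.0, split_to_crystal=0.80, recycle_ratio=0.9,
                          tol=1e-10, max_iter=1000):
        recycle = 0.0
        into_crystallizer = feed_u
        for _ in range(max_iter):
            into_crystallizer = feed_u + recycle
            mother_liquor = (1.0 - split_to_crystal) * into_crystallizer
            new_recycle = recycle_ratio * mother_liquor   # remainder is purged
            if abs(new_recycle - recycle) < tol:
                break
            recycle = new_recycle
        return into_crystallizer, recycle

    stream, recycle = loop_steady_state()
    print(f"steady-state crystallizer feed: {stream:.4f}, recycle: {recycle:.4f}")
    ```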

  9. Quality control in the development of coagulation factor concentrates.

    PubMed

    Snape, T J

    1987-01-01

    Limitation of process change is a major factor contributing to assurance of quality in pharmaceutical manufacturing. This is particularly true in the manufacture of coagulation factor concentrates, for which presumptive testing for poorly defined product characteristics is an integral feature of finished product quality control. The development of new or modified preparations requires that this comfortable position be abandoned, and that the effect on finished product characteristics of changes to individual process steps (and components) be assessed. The degree of confidence in the safety and efficacy of the new product will be determined by, amongst other things, the complexity of the process alteration and the extent to which the results of finished product tests can be considered predictive. The introduction of a heat-treatment step for inactivation of potential viral contaminants in coagulation factor concentrates presents a significant challenge in both respects, quite apart from any assessment of the effectiveness of the viral inactivation step itself. These interactions are illustrated by some of the problems encountered with terminal dry heat-treatment (72 h at 80 °C) of factor VIII and prothrombin complex concentrates manufactured by the Blood Products Laboratory.

  10. VICAR image processing system guide to system use

    NASA Technical Reports Server (NTRS)

    Seidman, J. B.

    1977-01-01

    The functional characteristics and operating requirements of the VICAR (Video Image Communication and Retrieval) system are described. An introduction to the system describes the functional characteristics and the basic theory of operation. A brief description of the data flow as well as tape and disk formats is also presented. A formal presentation of the control statement formats is given along with a guide to usage of the system. The guide provides a step-by-step reference to the creation of a VICAR control card deck. Simple examples are employed to illustrate the various options and the system response thereto.

  11. Theoretical study of gas hydrate decomposition kinetics--model development.

    PubMed

    Windmeier, Christoph; Oellrich, Lothar R

    2013-10-10

    In order to provide an estimate of the order of magnitude of intrinsic gas hydrate dissolution and dissociation kinetics, the "Consecutive Desorption and Melting Model" (CDM) is developed from purely theoretical considerations. The process of gas hydrate decomposition is assumed to comprise two consecutive and repetitive quasi-chemical reaction steps: desorption of the guest molecule, followed by local solid-body melting. The individual kinetic steps are modeled according to the "Statistical Rate Theory of Interfacial Transport" and the Wilson-Frenkel approach. All remaining required model parameters are linked directly to geometric considerations and a thermodynamic gas hydrate equilibrium model.
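
    The structure of two consecutive steps can be sketched as rates in series, with a Wilson-Frenkel-type driving-force term for the melting step (a hedged illustration, not the CDM equations; all rate constants and the linearized driving force are placeholders):

    ```python
    import numpy as np

    # Two consecutive steps in series: the slower of desorption and melting
    # limits the overall decomposition rate. Parameter values are invented.

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def decomposition_rate(T, T_eq, k_des=1.0, k_wf=1.0, dh=1e-20):
        """dh approximates the melting enthalpy per guest molecule (J);
        the Wilson-Frenkel factor vanishes at equilibrium (T == T_eq)."""
        dmu = dh * (T - T_eq) / T_eq                   # linearized driving force
        r_melt = k_wf * (1.0 - np.exp(-dmu / (K_B * T)))
        r_des = k_des
        return 1.0 / (1.0 / r_des + 1.0 / r_melt)      # series combination

    for T in (275.0, 280.0, 285.0):
        print(T, decomposition_rate(T, T_eq=273.0))
    ```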

  12. Perspectives on the manufacture of combination vaccines.

    PubMed

    Vose, J R

    2001-12-15

    Evolving regulatory requirements in the United States and Europe create major challenges for manufacturers tasked with production of vaccines that contain ≥9 separate antigens capable of protecting against infectious diseases, such as diphtheria, tetanus, pertussis, polio, hepatitis B, and Haemophilus influenzae type b, in a single shot. This article describes 10 steps that can facilitate the process of licensing these complex vaccines. It also points out problems associated with the use of animal tests for the crucial step of potency testing for batch release, caused by the inherent variability of such tests and the difficulties of interpreting their results.

  13. Overview of the production of sintered SiC optics and optical sub-assemblies

    NASA Astrophysics Data System (ADS)

    Williams, S.; Deny, P.

    2005-08-01

    The following is an overview of sintered silicon carbide (SSiC) material properties and processing requirements for the manufacturing of components for advanced-technology optical systems. The overview compares SSiC material properties to those of typical materials used for optics and optical structures. In addition, it reviews, step by step, the manufacturing processes required to produce optical components. The process overview illustrates the current manufacturing process and concepts for expanding the process size capability, and includes information on the substantial capital equipment employed in the manufacturing of SSiC. This paper also reviews common in-process inspection methodology and design rules. The design rules are used to improve production yield, minimize cost, and maximize the inherent benefits of SSiC for optical systems. Optimizing optical system designs for an SSiC manufacturing process will allow systems designers to utilize SSiC as a low-risk, cost-competitive, and fast-cycle-time technology for next-generation optical systems.

  14. Isothermal Amplification Methods for the Detection of Nucleic Acids in Microfluidic Devices

    PubMed Central

    Zanoli, Laura Maria; Spoto, Giuseppe

    2012-01-01

    Diagnostic tools for biomolecular detection need to fulfill specific requirements in terms of sensitivity, selectivity, and throughput in order to widen their applicability and to minimize the cost of the assay. Nucleic acid amplification is a key step in DNA detection assays. It contributes to improving assay sensitivity by enabling the detection of a limited number of target molecules. The use of microfluidic devices to miniaturize amplification protocols reduces the required sample volume and the analysis time, and offers new possibilities for process automation and integration in one single device. The vast majority of miniaturized systems for nucleic acid analysis exploit the polymerase chain reaction (PCR) amplification method, which requires repeated cycles of two or three temperature-dependent steps during the amplification of the nucleic acid target sequence. In contrast, low-temperature isothermal amplification methods have no need for thermal cycling, thus requiring simplified microfluidic device features. Here, the use of miniaturized analysis systems using isothermal amplification reactions for nucleic acid amplification is discussed. PMID:25587397

  15. Simulation of the Press Hardening Process and Prediction of the Final Mechanical Material Properties

    NASA Astrophysics Data System (ADS)

    Hochholdinger, Bernd; Hora, Pavel; Grass, Hannes; Lipp, Arnulf

    2011-08-01

    Press hardening is a well-established production process in the automotive industry today. The current trend in this process technology points towards the manufacturing of parts with tailored properties. Since knowledge of the mechanical properties of a structural part after forming and quenching is essential for evaluating, for example, its crash performance, a virtual assessment of the production process that is as accurate as possible is more necessary than ever. Achieving this requires the definition of reliable input parameters and boundary conditions for the thermo-mechanically coupled simulation of the process steps. One of the most important input parameters, especially regarding the final properties of the quenched material, is the contact heat transfer coefficient (CHTC). The CHTC depends on the effective pressure or the gap distance between part and tool; it is determined here at different contact pressures and gap distances through inverse parameter identification. Furthermore, a simulation strategy for the subsequent steps of the press hardening process, as well as adequate modeling approaches for part and tools, are discussed. For the prediction of the yield curves of the material after press hardening, a phenomenological model is presented. This model requires knowledge of the microstructure within the part. By post-processing the nodal temperature history with a CCT diagram, the quantitative distribution of the phase fractions martensite, bainite, ferrite, and pearlite after press hardening is determined. The model itself is based on a Hockett-Sherby approach, with the Hockett-Sherby parameters defined as functions of the phase fractions and a characteristic cooling rate.
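
    The phenomenological idea can be sketched as a Hockett-Sherby flow curve whose parameters are mixed from the local phase fractions (a Python sketch; the per-phase parameter values are invented placeholders, and the cooling-rate dependence described above is omitted):

    ```python
    import numpy as np

    def hockett_sherby(eps, sigma_i, sigma_sat, a, p):
        """sigma(eps) = sigma_sat - (sigma_sat - sigma_i) * exp(-a * eps**p)"""
        return sigma_sat - (sigma_sat - sigma_i) * np.exp(-a * eps**p)

    # Hypothetical per-phase parameters (sigma_i, sigma_sat, a, p), MPa:
    PHASE_PARAMS = {
        "martensite": (1000.0, 1600.0, 8.0, 0.7),
        "bainite":    ( 600.0,  950.0, 7.0, 0.7),
        "ferrite":    ( 250.0,  450.0, 6.0, 0.7),
        "pearlite":   ( 350.0,  600.0, 6.5, 0.7),
    }

    def mixed_flow_curve(eps, fractions):
        """Linear rule of mixtures over phase fractions (summing to 1)."""
        params = np.zeros(4)
        for phase, f in fractions.items():
            params += f * np.asarray(PHASE_PARAMS[phase])
        return hockett_sherby(eps, *params)

    eps = np.linspace(0.0, 0.2, 5)
    mix = {"martensite": 0.7, "bainite": 0.2, "ferrite": 0.1, "pearlite": 0.0}
    print(mixed_flow_curve(eps, mix))   # yield stress along the strain path
    ```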

  16. Integration of decentralized clinical data in a data warehouse: a service-oriented design and realization.

    PubMed

    Hanss, Sabine; Schaaf, T; Wetzel, T; Hahn, C; Schrader, T; Tolxdorff, T

    2009-01-01

    In this paper we present a general concept for, and describe the difficulties of, integrating data from various clinical partners in one data warehouse, using the Open European Nephrology Science Center (OpEN.SC) as an example. This includes a requirements analysis of the data integration process as well as a design that follows these requirements. The conceptual approach is based on the Rational Unified Process (RUP) and the paradigm of Service-Oriented Architecture (SOA). Because we have to strengthen our partners' confidence in the OpEN.SC system, and with it their willingness to participate, important requirements are controllability, transparency, and security for all partners. Reusable and fine-grained components were found to be necessary when working with diverse data sources, and with SOA the requested reusability is implemented easily. A key step in the development of a data integration process within a health information system like OpEN.SC is to analyze the requirements. To show that this is not only theoretical work, we present a design, developed with RUP and SOA, which fulfills these requirements.

  17. Fish welfare assurance system: initial steps to set up an effective tool to safeguard and monitor farmed fish welfare at a company level.

    PubMed

    van de Vis, J W; Poelman, M; Lambooij, E; Bégout, M-L; Pilarczyk, M

    2012-02-01

    The objective was to take a first step in the development of a process-oriented quality assurance (QA) system for monitoring and safeguarding of fish welfare at a company level. A process-oriented approach is focused on preventing hazards and involves establishment of critical steps in a process that require careful control. The seven principles of the Hazard Analysis Critical Control Points (HACCP) concept were used as a framework to establish the QA system. HACCP is an internationally agreed approach for the management of food safety, which was adapted here for the purpose of safeguarding and monitoring the welfare of farmed fish. As the main focus of this QA system is farmed fish welfare assurance at a company level, it was named the Fish Welfare Assurance System (FWAS). In this paper we present the initial steps of setting up FWAS for the on-growing of sea bass (Dicentrarchus labrax), carp (Cyprinus carpio), and European eel (Anguilla anguilla). Four major hazards were selected, which were fish-species dependent. Critical Control Points (CCPs) that need to be controlled to minimize or avoid these four hazards are presented. For FWAS, monitoring of CCPs at the farm level is essential. For monitoring purposes, Operational Welfare Indicators (OWIs) are needed to establish whether critical biotic, abiotic, managerial, and environmental factors are under control. For the OWIs we present critical limits/target values. A critical limit is the maximum or minimum value to which a factor must be controlled at a critical control point to prevent, eliminate, or reduce a hazard to an acceptable level; for managerial factors, target levels are more appropriate than critical limits. Regarding the international trade in farmed fish products, we propose that FWAS be standardized in aquaculture chains. This standardization requires a consensus on the concept of fish welfare, methods to assess welfare objectively, and knowledge of the needs of farmed fish.
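
    The CCP/OWI bookkeeping lends itself to a simple data structure (an illustrative Python sketch; the indicator names and limits below are hypothetical, not the values selected in the study):

    ```python
    from dataclasses import dataclass

    @dataclass
    class OWI:
        """Operational Welfare Indicator with a critical-limit band."""
        name: str
        low: float
        high: float

        def in_control(self, value: float) -> bool:
            return self.low <= value <= self.high

    @dataclass
    class CCP:
        """Critical Control Point monitored through a set of OWIs."""
        hazard: str
        indicators: list

        def check(self, readings: dict) -> list:
            """Return the names of indicators that are out of control."""
            return [i.name for i in self.indicators
                    if not i.in_control(readings[i.name])]

    water_quality = CCP(
        hazard="poor water quality during on-growing",
        indicators=[OWI("dissolved_oxygen_mg_per_l", 5.0, 12.0),
                    OWI("temperature_C", 18.0, 26.0)],
    )
    print(water_quality.check({"dissolved_oxygen_mg_per_l": 4.2,
                               "temperature_C": 22.0}))
    # -> ['dissolved_oxygen_mg_per_l']
    ```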

  18. Formulation for Simultaneous Aerodynamic Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, G. W.; Taylor, A. C., III; Mani, S. V.; Newman, P. A.

    1993-01-01

    An efficient approach for simultaneous aerodynamic analysis and design optimization is presented. This approach does not require the performance of many flow analyses at each design optimization step, which can be an expensive procedure, and thus brings us one step closer to meeting the challenge of incorporating computational fluid dynamics codes into gradient-based optimization techniques for aerodynamic design. An adjoint-variable method is introduced to nullify the effect of the increased number of design variables in the problem formulation. The method has been successfully tested on one-dimensional nozzle flow problems, including a sample problem with a normal shock. Implementations of the above algorithm are also presented that incorporate Newton iterations to secure a high-quality flow solution at the end of the design process. Implementations with iterative flow solvers are possible and will be required for large, multidimensional flow problems.
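
    The "simultaneous" idea, solving for the flow state and the design together instead of nesting a full flow solve inside every design step, can be shown on a toy problem (a sketch only, not the paper's nozzle formulation; the algebraic state equation and objective are invented):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def residual(u, x):
        """Stand-in 'flow equation' R(u, x) = 0 linking state u to design x."""
        return u**3 + u - x

    def objective(z):
        u, x = z
        return (u - 1.5)**2 + 0.1 * x**2   # hypothetical performance target

    # State and design are optimized together; the flow equation is imposed
    # as an equality constraint rather than re-solved at each design step.
    cons = {"type": "eq", "fun": lambda z: residual(z[0], z[1])}
    sol = minimize(objective, x0=np.array([1.0, 1.0]), constraints=[cons])
    u_opt, x_opt = sol.x
    print(f"u*={u_opt:.4f}, x*={x_opt:.4f}, R={residual(u_opt, x_opt):.2e}")
    ```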

  19. Economical Fabrication of Thick-Section Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Babcock, Jason; Ramachandran, Gautham; Williams, Brian; Benander, Robert

    2010-01-01

    A method was developed for producing thick-section [>2 in. (approx. 5 cm)], continuous fiber-reinforced ceramic matrix composites (CMCs). Ultramet-modified fiber interface coating and melt infiltration processing, developed previously for thin-section components, were used for the fabrication of CMCs that were an order of magnitude greater in thickness [up to 2.5 in. (approx. 6.4 cm)]. Melt processing first involves infiltration of a fiber preform with the desired interface coating, and then with carbon to partially densify the preform. A molten refractory metal is then infiltrated and reacts with the excess carbon to form the carbide matrix without damaging the fiber reinforcement. Infiltration occurs from the inside out as the molten metal fills virtually all the available void space. Densification to <5 vol% porosity is a one-step process requiring no intermediate machining steps. The melt infiltration method requires no external pressure. This prevents over-infiltration of the outer surface plies, which can lead to excessive residual porosity in the center of the part. However, processing of thick-section components required modification of the conventional process conditions, and the means by which the large amount of molten metal is introduced into the fiber preform. Modification of the low-temperature, ultraviolet-enhanced chemical vapor deposition process used to apply interface coatings to the fiber preform was also required to accommodate the high preform thickness. The thick-section CMC processing developed in this work proved to be invaluable for component development, fabrication, and testing in two complementary efforts. In a project for the Army, involving SiC/SiC blisk development, nominally 0.8 in. thick x 8 in. diameter (approx. 2 cm thick x 20 cm diameter) components were successfully infiltrated. Blisk hubs were machined using diamond-embedded cutting tools and successfully spin-tested. Good ply uniformity and extremely low residual porosity (<2 percent) were achieved, the latter being far lower than that achieved with SiC matrix composites fabricated via CVI or PIP. The pyrolytic carbon/zirconium nitride interface coating optimized in this work for use on carbon fibers was incorporated in the SiC/SiC composites and yielded a >41 ksi (approx. 283 MPa) flexural strength.

  20. Multiwavelength digital holography for polishing tool shape measurement

    NASA Astrophysics Data System (ADS)

    Lédl, Vít.; Psota, Pavel; Václavík, Jan; Doleček, Roman; Vojtíšek, Petr

    2013-09-01

    Classical mechano-chemical polishing is still a valuable technique which gives unbeatable results for some types of optical surfaces. For example, optics for high-power lasers require minimized subsurface damage, very high cosmetic quality, and low mid-spatial-frequency error; one can hardly achieve this with subaperture polishing. The shape of the polishing tool plays a crucial role in achieving the required form of the optical surface, yet often the shape of the polishing tool or pad is not known precisely enough during the manufacturing process. The tool shape is usually pre-machined and later changes during the polishing procedure. An experienced worker can estimate the shape of the tool indirectly from the shape of the polished element, and can therefore achieve the required shape in a few reasonably long iterative steps, so the lack of exact knowledge of the tool shape is tolerated. Sometimes this indirect method is not feasible, even for small parts. Moreover, for processes on machines like planetary (continuous) polishers, an incorrect shape of the polishing pad can extend the polishing times enormously. Every iteration step takes hours; even worse, the polished piece can be wasted if the pad has a poor shape. The ability to determine the tool shape would be very valuable in these lengthy processes. This was our primary motivation to develop a contactless measurement method for large diffusive surfaces and to demonstrate its usability. The proposed method is based on the application of multiwavelength digital holographic interferometry with phase shifting.
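
    The multiwavelength principle underlying the method can be sketched briefly: two single-wavelength phase maps yield a phase at a much longer synthetic wavelength, extending the unambiguous height range on diffusive surfaces (illustrative Python; the wavelengths and phase values are arbitrary examples):

    ```python
    import numpy as np

    lam1, lam2 = 632.8e-9, 594.1e-9                # example wavelengths (m)
    lam_synth = lam1 * lam2 / abs(lam1 - lam2)     # synthetic wavelength
    print(f"synthetic wavelength: {lam_synth * 1e6:.1f} um")

    def height_from_phases(phi1, phi2):
        """Height from two wrapped phase maps in a reflection setup;
        the phase difference varies with the synthetic wavelength."""
        dphi = np.mod(phi1 - phi2, 2 * np.pi)      # wrap to [0, 2*pi)
        return dphi * lam_synth / (4 * np.pi)      # factor 2 for reflection

    phi1 = np.array([0.5, 1.0, 2.0])
    phi2 = np.array([0.3, 0.6, 1.1])
    print(height_from_phases(phi1, phi2))          # heights in meters
    ```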

  1. A Single-Step Enrichment Medium for Nonchromogenic Isolation of Healthy and Cold-Injured Salmonella spp. from Fresh Vegetables.

    PubMed

    Kim, Hong-Seok; Choi, Dasom; Kang, Il-Byeong; Kim, Dong-Hyeon; Yim, Jin-Hyeok; Kim, Young-Ji; Chon, Jung-Whan; Oh, Deog-Hwan; Seo, Kun-Ho

    2017-02-01

    Culture-based detection of nontyphoidal Salmonella spp. in foods requires at least four working days; therefore, new detection methods that shorten the test time are needed. In this study, we developed a novel single-step Salmonella enrichment broth, SSE-1, and compared its detection capability with that of the commercial single-step ONE broth-Salmonella (OBS) medium and a conventional two-step enrichment method using buffered peptone water and Rappaport-Vassiliadis soy broth (BPW-RVS). Minimally processed lettuce samples were artificially inoculated with low levels of healthy and cold-injured Salmonella Enteritidis (10(0) or 10(1) colony-forming units/25 g), incubated in OBS, BPW-RVS, and SSE-1 broths, and streaked on xylose lysine deoxycholate (XLD) agar. Salmonella recoverability was significantly higher in BPW-RVS (79.2%) and SSE-1 (83.3%) than in OBS (39.3%) (p < 0.05). Our data suggest that the SSE-1 single-step enrichment broth could completely replace two-step enrichment, reducing the enrichment time from 48 to 24 h and performing better than the commercial single-step enrichment medium in conventional nonchromogenic Salmonella detection, thus saving time, labor, and cost.

  2. A model for undergraduate physics major outcomes objectives

    NASA Astrophysics Data System (ADS)

    Taylor, G. R.; Erwin, T. Dary

    1989-06-01

    Concern with assessment of student outcomes of undergraduate physics major programs is rapidly rising. The Southern Association of Colleges and Schools and many other regional and state organizations are requiring explicit outcomes assessment in the accrediting process. The first step in this assessment process for major programs is the establishment of student outcomes objectives. A model and set of physics outcomes (educational) objectives that were developed by the faculty in the Physics Department at James Madison University are presented.

  3. Registration and Marking Requirements for UAS. Unmanned Aircraft System (UAS) Registration

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The registration of an aircraft is a prerequisite for issuance of a U.S. certificate of airworthiness by the FAA. The procedures and requirements for aircraft registration, and the subsequent issuance of registration numbers, are contained in FAR Part 47. However, the process/method(s) for applying the requirements of Parts 45 & 47 to Unmanned Aircraft Systems (UAS) have not been defined. This task resolved the application of 14 CFR Parts 45 and 47 to UAS. Key findings: UAS are aircraft systems, and as such the recommended approach to registration is to follow the same process for registration as manned aircraft. This will require manufacturers to comply with the requirements of 14 CFR 47, Aircraft Registration, and 14 CFR 45, Identification and Registration Marking. In addition, only the UA should be identified with the N-number registration markings. There should also be a documentation link showing the applicability of the control station and communication link to the UA. The documentation link can be in the form of a Type Certificate Data Sheet (TCDS) entry or a UAS logbook entry. The recommended process for the registration of UAS is similar to the manned aircraft process and is outlined in a 6-step process in the paper.

  4. Silicon compilation: From the circuit to the system

    NASA Astrophysics Data System (ADS)

    Obrien, Keven

    The methodology used for the compilation of silicon from the behavioral level to the system level is presented. The aim was to link the heretofore unrelated areas of high-level synthesis and system-level design. This link will play an important role in the development of future design automation tools, as it will allow hardware/software co-designs to be synthesized. A design methodology is presented that allows, through the use of an intermediate representation, SOLAR, a system-level design language (SDL) to be combined with a hardware description language (VHDL). Two main steps are required in order to transform this specification into a synthesizable one. First, a system-level synthesis step, including partitioning and communication synthesis, is required in order to split the model into a set of interconnected subsystems, each of which is then processed by a high-level synthesis tool. For this latter step AMICAL is used, which allows powerful scheduling techniques that accept very abstract descriptions of control-flow-dominated circuits as input, and generates interconnected RTL blocks that may feed existing logic-level synthesis tools.

  5. Biointervention makes leather processing greener: an integrated cleansing and tanning system.

    PubMed

    Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2003-06-01

    The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing. Included in this is a great deal of solid waste such as lime and chrome sludge. In the approach described here, the hair and flesh removal as well as the fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely, dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, opening up of fiber bundles, and penetration and distribution of chromium are comparable to those produced by traditional methods. This has been substantiated through scanning electron microscopy, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical testing and hand evaluation. Importantly, the softness of the leathers is numerically proven to be comparable with that of the control. The process also demonstrates reductions in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% as compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, thereby reducing the discharge by 90% as compared to conventional multistep processing. Furthermore, it is also demonstrated that the process is technoeconomically viable.

  6. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.

    PubMed

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for the pre-processing of experimental movement data for use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a toolbox freely available to the community that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps, and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing the data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, complete implementation of the pre-processing procedures, simple extensibility, available user interfaces, and free availability can boost the translation of neuromusculoskeletal methods into daily and clinical practice.

  7. Robustness. [in space systems

    NASA Technical Reports Server (NTRS)

    Ryan, Robert

    1993-01-01

    The concept of robustness includes design simplicity, component and path redundancy, desensitization to parameter and environment variations, control of parameter variations, and punctual operations. These characteristics must be traded, together with functional concepts, materials, and fabrication approach, against the criteria of performance, cost, and reliability. The paper describes the robustness design process, which includes the following seven major coherent steps: translation of vision into requirements, definition of the desired robustness characteristics, formulation of criteria for the required robustness, concept selection, detail design, manufacturing and verification, and operations.

  8. Information in general medical practices: the information processing model.

    PubMed

    Crowe, Sarah; Tully, Mary P; Cantrill, Judith A

    2010-04-01

    The need for effective communication and handling of secondary care information in general practices is paramount. This study aimed to explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff, both clinical and administrative. It was a qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach, using N-Vivo software, and involved transcription, familiarization, coding, charting, mapping, and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.

  9. Plasmonic nanobubbles for target cell-specific gene and drug delivery and multifunctional processing of heterogeneous cell systems

    NASA Astrophysics Data System (ADS)

    Lukianova-Hleb, Ekaterina Y.; Huye, Leslie E.; Brenner, Malcolm K.; Lapotko, Dmitri O.

    2014-03-01

    Cell and gene cancer therapies require ex vivo processing of human grafts. Such processing requires at least three steps - cell enrichment, cell separation (destruction), and gene transfer - each of which requires the use of a separate technology. While these technologies may be satisfactory for research use, they are of limited usefulness in the clinical treatment setting because they have a low processing rate, as well as low transfection and separation efficacy and specificity in heterogeneous human grafts. Most problematic, because current technologies are administered in multiple steps - rather than in a single, multifunctional, and simultaneous procedure - they lengthen the treatment process and introduce an unnecessary level of complexity, labor, and resources into clinical treatment; all these limitations result in high losses of valuable cells. We report a universal, high-throughput, and multifunctional technology that simultaneously (1) injects free external cargo into target cells, (2) destroys unwanted cells, and (3) preserves valuable non-target cells in heterogeneous grafts. Each of these functions has single-target-cell specificity in a heterogeneous cell system, with a processing rate > 45 million cells/min, an injection efficacy of 90% at 96% viability of the injected cells, a target cell destruction efficacy > 99%, and viability of non-target cells > 99%. The developed technology employs novel cellular agents called plasmonic nanobubbles (PNBs). PNBs are not particles but transient, intracellular events: vapor nanobubbles that expand and collapse in mere nanoseconds under optical excitation of gold nanoparticles with short picosecond laser pulses. PNBs of different, cell-specific sizes (1) inject free external cargo with small PNBs, (2) destroy other target cells mechanically with large PNBs, and (3) preserve non-target cells. The multi-functionality, precision, and high throughput of the all-in-one PNB technology will tremendously impact cell and gene therapies and other clinical applications that depend on ex vivo processing of heterogeneous cell systems.

  10. Image alignment for tomography reconstruction from synchrotron X-ray microscopic images.

    PubMed

    Cheng, Chang-Chieh; Chien, Chia-Chi; Chen, Hsiang-Hsin; Hwu, Yeukuang; Ching, Yu-Tai

    2014-01-01

    A synchrotron X-ray microscope is a powerful imaging apparatus for taking high-resolution and high-contrast X-ray images of nanoscale objects. A sufficient number of X-ray projection images from different angles is required for constructing 3D volume images of an object. Because a synchrotron light source is immobile, a rotational object holder is required for tomography. At a resolution of 10 nm per pixel, the vibration of the holder caused by rotating the object cannot be disregarded if tomographic images are to be reconstructed accurately. This paper presents a computational method to compensate for the vibration of the rotational holder by aligning neighboring X-ray images. This alignment process involves two steps. The first step is to match the "projected feature points" in the sequence of images; the matched projected feature points in the x-θ plane should form a set of sine-shaped loci. The second step is to fit the loci to a set of sine waves to compute the parameters required for alignment. The experimental results show that the proposed method outperforms two previously proposed methods, Xradia and SPIDER. The developed software system can be downloaded from the URL http://www.cs.nctu.edu.tw/~chengchc/SCTA or http://goo.gl/s4AMx.
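
    The second step can be sketched directly: since x = A*sin(θ) + B*cos(θ) + C is linear in (A, B, C), each feature locus can be fitted by linear least squares, and the residuals give the per-angle correction (a Python sketch with synthetic data standing in for real feature tracks, not the authors' code):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    theta = np.linspace(0, 2 * np.pi, 181)             # projection angles
    x_true = 40.0 * np.sin(theta + 0.6) + 128.0        # ideal sine locus
    x_meas = x_true + rng.normal(0, 2.0, theta.size)   # plus holder vibration

    # Fit x = A*sin(theta) + B*cos(theta) + C by linear least squares.
    M = np.column_stack([np.sin(theta), np.cos(theta), np.ones_like(theta)])
    coef, *_ = np.linalg.lstsq(M, x_meas, rcond=None)
    x_fit = M @ coef

    shift = x_meas - x_fit      # alignment correction for each projection
    print("rms jitter estimate:", np.sqrt(np.mean(shift**2)))
    ```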

  11. Immobilization techniques to avoid enzyme loss from oxidase-based biosensors: a one-year study.

    PubMed

    House, Jody L; Anderson, Ellen M; Ward, W Kenneth

    2007-01-01

    Continuous amperometric sensors that measure glucose or lactate require a stable sensitivity, and glutaraldehyde crosslinking has been used widely to avoid enzyme loss. Nonetheless, little data has been published on the effectiveness of enzyme immobilization with glutaraldehyde. A combination of electrochemical testing and spectrophotometric assays was used to study the relationship between enzyme shedding and the fabrication procedure. In addition, we studied the relationship between glutaraldehyde concentration and sensor performance over a period of one year. The enzyme immobilization process by glutaraldehyde crosslinking to glucose oxidase appears to require at least 24 hours at room temperature to reach completion. In addition, excess free glucose oxidase can be removed by soaking sensors in purified water for 20 minutes. Even with the addition of these steps, however, it appears that some free glucose oxidase remains entrapped within the enzyme layer, which contributes to a decline in sensitivity over time. Although it reduces the ultimate sensitivity (probably via a change in the enzyme's natural conformation), the glutaraldehyde concentration in the enzyme layer can be increased in order to minimize this instability. After exposure of oxidase enzymes to glutaraldehyde, effective crosslinking requires a rinse step and a 24-hour incubation step. In order to minimize the loss of sensor sensitivity over time, the glutaraldehyde concentration can be increased.

  12. Generation of cell type-specific monoclonal antibodies for the planarian and optimization of sample processing for immunolabeling.

    PubMed

    Forsthoefel, David J; Waters, Forrest A; Newmark, Phillip A

    2014-12-21

    Efforts to elucidate the cellular and molecular mechanisms of regeneration have required the application of methods to detect specific cell types and tissues in a growing cohort of experimental animal models. For example, in the planarian Schmidtea mediterranea, substantial improvements to nucleic acid hybridization and electron microscopy protocols have facilitated the visualization of regenerative events at the cellular level. By contrast, immunological resources have been slower to emerge. Specifically, the repertoire of antibodies recognizing planarian antigens remains limited, and a more systematic approach is needed to evaluate the effects of processing steps required during sample preparation for immunolabeling. To address these issues and to facilitate studies of planarian digestive system regeneration, we conducted a monoclonal antibody (mAb) screen using phagocytic intestinal cells purified from the digestive tracts of living planarians as immunogens. This approach yielded ten antibodies that recognized intestinal epitopes, as well as markers for the central nervous system, musculature, secretory cells, and epidermis. In order to improve signal intensity and reduce non-specific background for a subset of mAbs, we evaluated the effects of fixation and other steps during sample processing. We found that fixative choice, treatments to remove mucus and bleach pigment, as well as methods for tissue permeabilization and antigen retrieval profoundly influenced labeling by individual antibodies. These experiments led to the development of a step-by-step workflow for determining optimal specimen preparation for labeling whole planarians as well as unbleached histological sections. We generated a collection of monoclonal antibodies recognizing the planarian intestine and other tissues; these antibodies will facilitate studies of planarian tissue morphogenesis. We also developed a protocol for optimizing specimen processing that will accelerate future efforts to generate planarian-specific antibodies, and to extend functional genetic studies of regeneration to post-transcriptional aspects of gene expression, such as protein localization or modification. Our efforts demonstrate the importance of systematically testing multiple approaches to species-specific idiosyncrasies, such as mucus removal and pigment bleaching, and may serve as a template for the development of immunological resources in other emerging model organisms.

  13. Testing and validation of computerized decision support systems.

    PubMed

    Sailors, R M; East, T D; Wallace, C J; Carlson, D A; Franklin, M A; Heermann, L K; Kinder, A T; Bradshaw, R L; Randolph, A G; Morris, A H

    1996-01-01

    Systematic, thorough testing of decision support systems (DSSs) prior to release to general users is a critical aspect of high-quality software design. Omission of this step may lead to the dangerous, and potentially fatal, condition of relying on a system with outputs of uncertain quality. Thorough testing requires a great deal of effort and is a difficult job because the tools necessary to facilitate testing are not well developed. Testing is a job ill-suited to humans because it requires tireless attention to a large number of details. For these reasons, the majority of DSSs available are probably not well tested prior to release. We have successfully implemented a software design and testing plan which has helped us meet our goal of continuously improving the quality of our DSS software prior to release. While requiring a large amount of effort, we feel that documenting and standardizing our testing methods are important steps toward meeting recognized national and international quality standards. Our testing methodology includes both functional and structural testing and requires input from all levels of development. Our system does not focus solely on meeting design requirements but also addresses the robustness of the system and the completeness of testing.

  14. Pleiades image quality: from users' needs to products definition

    NASA Astrophysics Data System (ADS)

    Kubik, Philippe; Pascal, Véronique; Latry, Christophe; Baillarin, Simon

    2005-10-01

    Pleiades is the highest-resolution civilian Earth observing system ever developed in Europe. This imagery programme is conducted by the French National Space Agency, CNES. In 2008-2009 it will operate two agile satellites designed to provide optical images to civilian and defence users. Images will be acquired simultaneously in panchromatic (PA) and multispectral (XS) mode, which allows, under nadir acquisition conditions, the delivery of 20 km wide, false- or natural-colored scenes with a 70 cm ground sampling distance after PA+XS fusion. Imaging capabilities have been highly optimized in order to acquire along-track mosaics, stereo pairs and triplets, and multiple targets. To fulfill the operational requirements and ensure quick access to information, the ground processing has to perform the radiometric and geometric corrections automatically. Since ground processing capabilities were taken into account very early in the programme development, it was possible to relax some costly on-board component requirements in order to achieve a cost-effective on-board/ground compromise. Starting from an overview of the system characteristics, this paper deals with the image product definitions (raw level, perfect sensor, orthoimage, and along-track orthomosaics) and the main processing steps. It shows how each system performance results from the satellite performance followed by appropriate ground processing. Finally, it focuses on the radiometric performance of the final products, which is intimately linked to the following processing steps: radiometric corrections, PA restoration, image resampling, and PAN-sharpening.
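
    As a generic illustration of the final PAN-sharpening step, the sketch below applies a simple Brovey ratio fusion (Python; this is not the operational Pleiades algorithm, which the abstract does not specify, and the XS bands are assumed already resampled to the PA grid):

    ```python
    import numpy as np

    def brovey(pan, xs):
        """pan: (H, W) panchromatic band; xs: (4, H, W) multispectral bands."""
        intensity = xs.mean(axis=0)
        gain = pan / np.maximum(intensity, 1e-6)   # avoid division by zero
        return xs * gain                           # rescale bands to PA detail

    pan = np.random.rand(8, 8)                     # synthetic stand-in data
    xs = np.random.rand(4, 8, 8)
    sharp = brovey(pan, xs)
    print(sharp.shape)   # (4, 8, 8): colored scene at the PA sampling distance
    ```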

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakel, Allen J.; Conner, Cliff; Quigley, Kevin

    One of the missions of the Reduced Enrichment for Research and Test Reactors (RERTR) program (and now the National Nuclear Security Administration's Material Management and Minimization program) is to facilitate the use of low enriched uranium (LEU) targets for 99Mo production. The conversion from highly enriched uranium (HEU) to LEU targets will require five to six times more uranium to produce an equivalent amount of 99Mo. The work discussed here addresses the technical challenges encountered in the treatment of uranyl nitrate hexahydrate (UNH)/nitric acid solutions remaining after the dissolution of LEU targets. Specifically, the focus of this work is the calcination of the uranium waste from 99Mo production using LEU foil targets and the Modified Cintichem Process. Work with our calciner system showed that a high furnace temperature, a large vent tube, and a mechanical shield are beneficial for calciner operation. One- and two-step direct calcination processes were evaluated. The high-temperature one-step process led to contamination of the calciner system. The two-step direct calcination process operated stably and resulted in a relatively large amount of material in the calciner cup. Chemically assisted calcination using peroxide was rejected for further work due to the difficulty in handling the products. Chemically assisted calcination using formic acid was rejected due to unstable operation. Chemically assisted calcination using oxalic acid was recommended, although a better understanding of its chemistry is needed. Overall, this work showed that the two-step direct calcination and the in-cup oxalic acid processes are the best approaches for the treatment of the UNH/nitric acid waste solutions remaining from the dissolution of LEU targets for 99Mo production.

  16. A MULTICORE BASED PARALLEL IMAGE REGISTRATION METHOD

    PubMed Central

    Yang, Lin; Gong, Leiguang; Zhang, Hong; Nosher, John L.; Foran, David J.

    2012-01-01

    Image registration is a crucial step for many image-assisted clinical applications such as surgery planning and treatment evaluation. In this paper we proposed a landmark-based nonlinear image registration algorithm for matching 2D image pairs. The algorithm was shown to be effective and robust under conditions of large deformations. In landmark-based registration, the most important step is establishing the correspondence among the selected landmark points. This usually requires an extensive search, which is often computationally expensive. We introduced a nonregular data partition algorithm using K-means clustering to group the landmarks based on the number of available processing cores. This step optimizes memory usage and data transfer. We have tested our method on the IBM Cell Broadband Engine (Cell/B.E.) platform. PMID:19964921
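
    The partitioning step can be sketched with plain Lloyd iterations: cluster the landmarks into as many spatially coherent groups as there are cores (an illustrative NumPy sketch with synthetic coordinates, not the paper's implementation):

    ```python
    import numpy as np

    def kmeans(points, k, iters=50, seed=0):
        """Plain Lloyd's algorithm: returns a cluster label per point."""
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = points[labels == j].mean(axis=0)
        return labels

    n_cores = 4
    landmarks = np.random.default_rng(1).uniform(0, 512, size=(200, 2))
    groups = kmeans(landmarks, n_cores)
    for j in range(n_cores):
        print(f"core {j}: {np.sum(groups == j)} landmarks")
    ```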

  17. Metastasis Suppressor Genes: At the Interface Between the Environment and Tumor Cell Growth

    PubMed Central

    Hurst, Douglas R.; Welch, Danny R.

    2013-01-01

    The molecular mechanisms and genetic programs required for cancer metastasis are sometimes overlapping, but components are clearly distinct from those promoting growth of a primary tumor. Every sequential, rate-limiting step in the sequence of events leading to metastasis requires coordinated expression of multiple genes, necessary signaling events, and favorable environmental conditions or the ability to escape negative selection pressures. Metastasis suppressors are molecules that inhibit the process of metastasis without preventing growth of the primary tumor. The cellular processes regulated by metastasis suppressors are diverse and function at every step in the metastatic cascade. As we gain knowledge of the molecular mechanisms of metastasis suppressors and the cofactors with which they interact, we learn more about the process, including the appreciation that some are potential targets for therapy of metastasis, the most lethal aspect of cancer. Until now, metastasis suppressors have been described largely by their function. With greater appreciation of their biochemical mechanisms of action, the importance of context is increasingly recognized, especially since tumor cells exist in myriad microenvironments. In this review, we assemble the evidence that selected molecules are indeed suppressors of metastasis, collate the data defining their biochemical mechanisms of action, and glean insights regarding how metastasis suppressors regulate tumor cell communication to and from microenvironments. PMID:21199781

  18. Modifications to the Conduit Flow Process Mode 2 for MODFLOW-2005

    USGS Publications Warehouse

    Reimann, T.; Birk, S.; Rehrl, C.; Shoemaker, W.B.

    2012-01-01

    As a result of rock dissolution processes, karst aquifers exhibit highly conductive features such as caves and conduits. Within these structures, groundwater flow can become turbulent and therefore be described by nonlinear gradient functions. Some numerical groundwater flow models explicitly account for pipe hydraulics by coupling the continuum model with a pipe network that represents the conduit system. In contrast, the Conduit Flow Process Mode 2 (CFPM2) for MODFLOW-2005 approximates turbulent flow by reducing the hydraulic conductivity within the existing linear head gradient of the MODFLOW continuum model. This approach reduces the practical as well as numerical efforts for simulating turbulence. The original formulation was for large-pore aquifers, where the onset of turbulence is at low Reynolds numbers (1 to 100), and not for conduits or pipes. In addition, the existing code requires multiple time steps for convergence due to iterative adjustment of the hydraulic conductivity. Modifications to the existing CFPM2 were made by implementing a generalized power function with a user-defined exponent. This allows for matching turbulence in porous media or pipes and eliminates the time steps required for iterative adjustment of hydraulic conductivity. The modified CFPM2 successfully replicated simple benchmark test problems.
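
    The gist of the modification can be sketched as follows: keep MODFLOW's linear form q = -K_eff * grad(h), but let K_eff follow a power law in the head gradient so that the flux grows like |grad h|**m above a laminar threshold (a hedged Python illustration; the exponent and threshold values are arbitrary, not CFPM2 defaults):

    ```python
    def effective_conductivity(K_laminar, grad_h, grad_crit=1e-3, m=0.5):
        """Below grad_crit flow stays laminar (constant K); above it the
        conductivity is reduced so that q = K_eff*|grad h| ~ |grad h|**m."""
        g = abs(grad_h)
        if g <= grad_crit:
            return K_laminar
        return K_laminar * (g / grad_crit) ** (m - 1.0)

    for g in (1e-4, 1e-3, 1e-2, 1e-1):
        K = effective_conductivity(1.0, g)
        print(f"grad={g:.0e}  K_eff={K:.3f}  q={K * g:.2e}")
    ```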

  19. Processing Satellite Images on Tertiary Storage: A Study of the Impact of Tile Size on Performance

    NASA Technical Reports Server (NTRS)

    Yu, JieBing; DeWitt, David J.

    1996-01-01

    Before raw data from a satellite can be used by an Earth scientist, it must first undergo a number of processing steps, including basic processing, cleansing, and geo-registration. Processing actually expands the volume of data collected by a factor of 2 or 3, and the original data is never deleted; thus processing and storage requirements can exceed 2 terabytes/day. Once processed data is ready for analysis, a series of algorithms (typically developed by the Earth scientists) is applied to a large number of images in a data set. The focus of this paper is how best to handle such images stored on tape under the following assumptions: (1) all images of interest to a scientist are stored on a single tape, (2) images are accessed and processed in the order that they are stored on tape, and (3) the analysis requires access to only a portion of each image and not the entire image.

  20. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, High-Level Use Cases for DMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui; Lu, Xiaonan; Martino, Sal

    Many distribution management systems (DMS) projects have achieved limited success because the electric utility did not sufficiently plan for actual use of the DMS functions in the control room environment. As a result, end users were not clear on how to use the new application software in actual production environments with existing, well-established business processes. An important first step in the DMS implementation process is the development and refinement of the "to be" business processes. Development of use cases for the required DMS application functions is a key activity that leads to the formulation of the "to be" requirements. It is also an important activity needed to develop the specifications used to procure a new DMS.

  1. Parallel Processing of Images in Mobile Devices using BOINC

    NASA Astrophysics Data System (ADS)

    Curiel, Mariela; Calle, David F.; Santamaría, Alfredo S.; Suarez, David F.; Flórez, Leonardo

    2018-04-01

    Medical image processing helps health professionals make decisions for the diagnosis and treatment of patients. Since some algorithms for processing images require substantial amounts of resources, one can take advantage of distributed or parallel computing. A mobile grid can be an adequate computing infrastructure for this problem; a mobile grid is a grid that includes mobile devices as resource providers. In a previous step of this research, we selected BOINC as the infrastructure on which to build our mobile grid. However, parallel processing of images on mobile devices poses at least two important challenges: the execution of standard libraries for processing images, and obtaining adequate performance compared to grids of desktop computers. By the time we started our research, the use of BOINC on mobile devices also involved two issues: a) executing programs on mobile devices required modifying the code to insert calls to the BOINC API, and b) dividing the image among the mobile devices, as well as merging the results, required additional code in some BOINC components. This article presents answers to these four challenges.
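
    The image-division issue can be illustrated with a minimal split/merge sketch (Python/NumPy; the per-strip process() below is a placeholder filter, and the actual BOINC work-unit plumbing is not shown):

    ```python
    import numpy as np

    def split_image(img, n_workers):
        """One horizontal strip per device."""
        return np.array_split(img, n_workers, axis=0)

    def merge_strips(strips):
        return np.concatenate(strips, axis=0)

    def process(strip):
        return 255 - strip        # placeholder: invert the strip

    image = np.random.randint(0, 256, size=(100, 80), dtype=np.uint8)
    strips = split_image(image, n_workers=4)
    result = merge_strips([process(s) for s in strips])
    assert result.shape == image.shape
    print("merged:", result.shape)
    ```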

  2. Automation of cellular therapy product manufacturing: results of a split validation comparing CD34 selection of peripheral blood stem cell apheresis product with a semi-manual vs. an automatic procedure.

    PubMed

    Hümmer, Christiane; Poppe, Carolin; Bunos, Milica; Stock, Belinda; Wingenfeld, Eva; Huppert, Volker; Stuth, Juliane; Reck, Kristina; Essl, Mike; Seifried, Erhard; Bonig, Halvard

    2016-03-16

    Automation of cell therapy manufacturing promises higher productivity of cell factories, more economical use of highly trained (and costly) manufacturing staff, facilitation of processes requiring manufacturing steps at inconvenient hours, improved consistency of processing steps, and other benefits. One of the most broadly disseminated engineered cell therapy products is immunomagnetically selected CD34+ hematopoietic "stem" cells (HSCs). As the clinical GMP-compliant automated device CliniMACS Prodigy is being programmed to perform ever more complex sequential manufacturing steps, we developed a CD34+ selection module for comparison with the standard semi-automatic CD34 "normal scale" selection process on CliniMACS Plus, applicable for 600 × 10(6) target cells out of 60 × 10(9) total cells. Three split-validation processings with healthy-donor G-CSF-mobilized apheresis products were performed; feasibility, time consumption, and product quality were assessed. All processes proceeded uneventfully. Prodigy runs took about 1 h longer than CliniMACS Plus runs, albeit with markedly less hands-on operator time, making them suitable also for less experienced operators. Recovery of target cells was the same for both technologies. Although impurities, specifically T- and B-cells, were 5 ± 1.6-fold and 4 ± 0.4-fold higher in the Prodigy products (p = ns and p = 0.013 for T- and B-cell depletion, respectively), the T-cell content per kg of a virtual recipient receiving 4 × 10(6) CD34+ cells/kg was below 10 × 10(3)/kg even in the worst Prodigy product and thus more than fivefold below the specification for CD34+ selected mismatched-donor stem cell products. The products' theoretical clinical usability is thus confirmed. This split-validation exercise of a relatively short and simple process exemplifies the potential of automatic cell manufacturing. Automation will gain further in attractiveness when applied to more complex processes requiring frequent interventions or handling at unfavourable working hours, such as re-targeting of T-cells.

  3. [Eight-step structured decision-making process to assign criminal responsibility and seven focal points for describing relationship between psychopathology and offense].

    PubMed

    Okada, Takayuki

    2013-01-01

    The author suggested that it is essential for lawyers and psychiatrists to have a common understanding of the mutual division of roles between them when determining criminal responsibility (CR) and, for this purpose, proposed an 8-step structured CR decision-making process. The 8 steps are: (1) gathering of information related to mental function and condition, (2) recognition of mental function and condition, (3) psychiatric diagnosis, (4) description of the relationship between psychiatric symptoms or psychopathology and the index offense, (5) focus on the capacities to differentiate between right and wrong and to control behavior, (6) specification of elements of the cognitive/volitional prong in the legal context, (7) legal evaluation of the degree of the cognitive/volitional prong, and (8) final interpretation of CR as a legal conclusion. The author suggested that the CR decision-making process should proceed not in a step-like pattern from (1) to (2) to (3) to (8), but in a step-like pattern from (1) to (2) to (4) to (5) to (6) to (7) to (8), and that the core of psychiatric expert evidence must be step (4), not the steps after (5), which require the interpretation or application of section 39 of the Penal Code. When explaining the relationship between the mental disorder and the offense described in step (4), the Seven Focal Points (7FP) are often used. The author urged basic precautions to prevent the misuse of the 7FP, which are: (a) the priority of each item is not equal, and the relative importance differs from case to case; (b) the items are not mutually exclusive, and there may be overlap between them; (c) criminal responsibility shall not be judged merely because one item, or a number of items, is applicable, i.e., the 7FP are not "criteria"; for example, the aim is not to decide such things as whether 'the motive is understandable' or 'the conduct is appropriate', but to describe how psychopathological factors affected the offense specifically, in the context of the understandability of the motive or the appropriateness of the conduct; (d) it is essential to evaluate each item from a neutral point of view rather than from only one perspective, for example, looking at the case from the aspects of both comprehensibility and incomprehensibility of motive, or from the aspects of both oriented, purposeful, organized behavior and disoriented, purposeless, disorganized behavior during the offense; (e) depending on the case, some items may not require any consideration (in some cases there are fewer than seven applicable items); and (f) the 7FP are not exhaustive, and depending on the case there may be points requiring focus that are not included in these seven.

  4. PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; Suhs, Norman; Dietz, William; Rogers, Stuart; Nash, Steve; Chan, William; Tramel, Robert; Onufer, Jeff

    2006-01-01

    This viewgraph presentation reviews the use and requirements of PEGASUS 5, a code that performs a pre-processing step for the overset CFD method. The code prepares the overset volume grids for the flow solver by computing the domain connectivity database and blanking out grid points that are contained inside a solid body. PEGASUS 5 successfully automates most of the overset process, leading to a dramatic reduction in user input compared with previous generations of overset software. It can also lead to an order-of-magnitude reduction in both turn-around time and user expertise requirements. It is not, however, a "black-box" procedure; care must be taken to examine the resulting grid system.
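
    The hole-cutting ("blanking") idea can be illustrated with a toy point-in-body test; the spherical body and the iblank convention below are assumptions for illustration, not PEGASUS 5's actual domain connectivity algorithm.

    ```python
    import numpy as np

    def blank_inside_sphere(points: np.ndarray, center: np.ndarray, radius: float) -> np.ndarray:
        """Return an iblank array: 1 = active field point, 0 = blanked (inside body)."""
        dist = np.linalg.norm(points - center, axis=-1)
        return np.where(dist < radius, 0, 1)

    # Uniform volume grid points in the unit cube
    x, y, z = np.meshgrid(*[np.linspace(0, 1, 21)] * 3, indexing="ij")
    pts = np.stack([x, y, z], axis=-1)
    iblank = blank_inside_sphere(pts, center=np.array([0.5, 0.5, 0.5]), radius=0.2)
    print(f"{(iblank == 0).sum()} of {iblank.size} points blanked")
    ```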

  5. Fermentation Methods for Protein Enrichment of Cassava and Corn with Candida tropicalis

    PubMed Central

    Azoulay, Edgard; Jouanneau, Françoise; Bertrand, Jean-Claude; Raphael, Alain; Janssens, Jacques; Lebeault, Jean Michel

    1980-01-01

    Candida tropicalis grows on soluble starch, corn, and cassava powders without requiring that these substrates be previously hydrolyzed. C. tropicalis possesses the enzyme needed to hydrolyze starch, namely, an α-amylase. That property has been used to develop a fermentation process whereby C. tropicalis can be grown directly on corn or cassava powders so that the resultant mixture of biomass and residual corn or cassava contains about 20% protein, which represents a balanced diet for either animal fodder or human food. The fact that no extra enzymes are required to hydrolyze starch results in a particularly efficient way of improving the nutritional value of amylaceous products, through a single-step fermentation process. PMID:16345495

  6. The Arizona Home Language Survey: The Identification of Students for ELL Services

    ERIC Educational Resources Information Center

    Goldenberg, Claude; Rutherford-Quach, Sara

    2010-01-01

    Assuring that English language learners (ELLs) receive the services to which they have a right requires accurately identifying those students. Virtually all states identify ELLs in a two-step process. First, parents fill out a home language survey. Second, students in whose homes a language other than English is spoken and who therefore might…

  7. Adaptation Criteria for the Personalised Delivery of Learning Materials: A Multi-Stage Empirical Investigation

    ERIC Educational Resources Information Center

    Thalmann, Stefan

    2014-01-01

    Personalised e-Learning represents a major step-change from the one-size-fits-all approach of traditional learning platforms to a more customised and interactive provision of learning materials. Adaptive learning can support the learning process by tailoring learning materials to individual needs. However, this requires the initial preparation of…

  8. Young Children's Views of the Technology Process: An Exploratory Study

    ERIC Educational Resources Information Center

    Milne, Louise; Edwards, Richard

    2013-01-01

    This paper describes an exploratory study of an aspect of the technological knowledge of two groups of five-year-old students in their first year at school. Their emerging understandings of the steps required to develop a new product were investigated through a series of interviews. A theoretical framework linking technological knowledge to "funds…

  9. Dynamic control of moisture during hot pressing of wood composites

    Treesearch

    Cheng Piao; Todd F. Shupe; Chung Y. Hse

    2006-01-01

    Hot pressing is an important step in the manufacture of wood composites. In the conventional pressing system, hot press output often acts as a constraint to increased production. Severe drying of the furnish (e.g., particles, flakes, or fibers) required by this process substantially increases the manufacturing cost and creates air-polluting emissions of volatile...

  10. 7 CFR 1.640 - What are the requirements for prehearing conferences?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and the schedule of remaining steps in the hearing process. (e) Failure to attend. Unless the ALJ... prehearing conference with the parties at the time specified in the docketing notice under § 1.630, on or... exclude issues that do not qualify for review as factual, material, and disputed; (ii) To consider the...

  11. Back to School: A Guide to Continuing Your Education after Prison

    ERIC Educational Resources Information Center

    Crayton, Anna; Lindahl, Nicole

    2010-01-01

    By picking up this guide, you have already taken an important step towards continuing your education. Going back to school will require hard work and dedication. The process will be both challenging and frustrating. But if you stick with it, continuing your education can bring you tremendous rewards. Earning a General Education Development…

  12. Recruiting High Quality Students through a Fifth Year Program: It Can Be Done.

    ERIC Educational Resources Information Center

    James, Terry L.; And Others

    This paper describes the process of recruiting high quality liberal arts graduates into teacher preparation programs at Memphis State University, Tennessee. The two graduate programs, the Master of Arts in Teaching and the Lyndhurst, both require full-time attendance, have lock-step delivery systems and extended internships, and are intensive. In…

  13. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Process control schedules sometimes require frequent changes, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set that allows the user to define the user commands to be executed by the computer. To set the system up, the operator writes device driver routines for all of the controlled devices. Once set up, the system requires only an input file containing natural language command lines that tell the system what to do and when to do it. The operator can make up custom commands for operating and taking data from external research equipment at any time of the day or night, without being in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
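
    A minimal sketch of the command-table idea behind such a controller appears below: natural-language command lines are matched to user-written device drivers and executed on a schedule. The command grammar and driver functions are hypothetical stand-ins for the FORTRAN device drivers the abstract describes.

    ```python
    import time

    def open_valve(tag: str):  print(f"valve {tag} opened")
    def read_sensor(tag: str): print(f"sensor {tag} logged")

    # Table mapping natural-language command phrases to device drivers
    DRIVERS = {"OPEN VALVE": open_valve, "READ SENSOR": read_sensor}

    def run_script(lines):
        """Execute '<seconds> <command words> <device tag>' lines in order."""
        for line in lines:
            parts = line.split()
            delay, command, tag = float(parts[0]), " ".join(parts[1:-1]), parts[-1]
            time.sleep(delay)      # 'when to do it'
            DRIVERS[command](tag)  # 'what to do'

    run_script(["0 OPEN VALVE V1", "0.1 READ SENSOR T3"])
    ```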

  14. 3D Stacked Memory Final Report CRADA No. TC-0494-93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernhardt, A.; Beene, G.

    TI and LLNL demonstrated: (1) a process for the fabrication of 3-D memory using stacked DRAM chips, and (2) a fast prototyping process for 3-D stacks and MCMs. The metallization to route the chip pads to the sides of the die was carried out in a single high-speed masking step. The mask was not the usual physical one in glass and chrome, but was simply a computer file used to control the laser patterning process. Changes in either chip or customer circuit-board pad layout were easily and inexpensively accommodated, so that prototyping was a natural consequence of the laser patterning process. As in the current TI process, a dielectric layer was added to the wafer, and vias to the chip I/O pads were formed. All of the steps in Texas Instruments' earlier process that were required to gold bump the pads were eliminated, significantly reducing fabrication cost and complexity. Pads were created on the sides of the die, which became pads on the side of the stack. In order to extend the process to accommodate non-memory devices with substantially greater I/O than is required for DRAMs, pads were patterned on two sides of the memory stacks as a proof of principle. Stacking and bonding were done using modifications of the current TI process. After stacking and bonding, the pads on the sides of the dice were connected by application of a polyimide insulator film with laser ablation of the polyimide to form contacts to the pads. Then metallization was accomplished in the same manner as on the individual die.

  15. Pasteurization of shell eggs using radio frequency heating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveke, David J.; Bigley, Andrew B. W.; Brunkhorst, Christopher D.

    The USDA-FSIS estimates that pasteurization of all shell eggs in the U.S. would reduce the annual number of illnesses by more than 110,000. However, less than 3% of shell eggs are commercially pasteurized. One of the main reasons for this is that the commercial hot water process requires as much as 60 min to complete. In the present study, a radio frequency (RF) apparatus was constructed, and a two-step process was developed that uses RF energy and hot water to pasteurize eggs in less than half the time. In order to select an appropriate RF generator, the impedance of shell eggs was measured in the frequency range of 10–70 MHz. The power density within the egg was modeled to prevent potential hotspots. Escherichia coli (ATCC 35218) was inoculated in the yolk to approximately 7.5 log CFU/ml. The combination process first heated the egg in 35.0 °C water for 3.5 min using 60 MHz RF energy. This resulted in the yolk being preferentially heated to 61 °C. Then, the egg was heated for an additional 20 min with 56.7 °C water. This two-step process reduced the population of E. coli by 6.5 log. The total time for the process was 23.5 min. By contrast, processing for 60 min was required to reduce the E. coli by 6.6 log using just hot water. The novel RF pasteurization process presented in this study was considerably faster than the existing commercial process. As a result, this should lead to an increase in the percentage of eggs being pasteurized, as well as a reduction of foodborne illnesses.

  16. Pasteurization of shell eggs using radio frequency heating

    DOE PAGES

    Geveke, David J.; Bigley, Andrew B. W.; Brunkhorst, Christopher D.

    2016-08-21

    The USDA-FSIS estimates that pasteurization of all shell eggs in the U.S. would reduce the annual number of illnesses by more than 110,000. However, less than 3% of shell eggs are commercially pasteurized. One of the main reasons for this is that the commercial hot water process requires as much as 60 min to complete. In the present study, a radio frequency (RF) apparatus was constructed, and a two-step process was developed that uses RF energy and hot water to pasteurize eggs in less than half the time. In order to select an appropriate RF generator, the impedance of shell eggs was measured in the frequency range of 10–70 MHz. The power density within the egg was modeled to prevent potential hotspots. Escherichia coli (ATCC 35218) was inoculated in the yolk to approximately 7.5 log CFU/ml. The combination process first heated the egg in 35.0 °C water for 3.5 min using 60 MHz RF energy. This resulted in the yolk being preferentially heated to 61 °C. Then, the egg was heated for an additional 20 min with 56.7 °C water. This two-step process reduced the population of E. coli by 6.5 log. The total time for the process was 23.5 min. By contrast, processing for 60 min was required to reduce the E. coli by 6.6 log using just hot water. The novel RF pasteurization process presented in this study was considerably faster than the existing commercial process. As a result, this should lead to an increase in the percentage of eggs being pasteurized, as well as a reduction of foodborne illnesses.

  17. Changes, disruption and innovation: An investigation of the introduction of new health information technology in a microbiology laboratory.

    PubMed

    Toouli, George; Georgiou, Andrew; Westbrook, Johanna

    2012-01-01

    It is expected that health information technology (HIT) will deliver a safer, more efficient and effective health care system. The aim of this study was to undertake a qualitative and video-ethnographic examination of the impact of information technologies on work processes in the reception area of a Microbiology Department, to ascertain what changed, how it changed, and the impact of the change. The setting for this study was the microbiology laboratory of a large tertiary hospital in Sydney. The study consisted of qualitative (interview and focus group) data and observation sessions for the period August 2005 to October 2006, along with video footage shot in three sessions covering the original system and the two stages of the Cerner implementation. Data analysis was assisted by NVivo software, and process maps were produced from the video footage. Two laboratory information systems were observed in the video footage, with computerized provider order entry introduced four months later. Process maps highlighted the large number of pre-data-entry steps in the original system, whereas the newer system incorporated many of these steps into the data entry stage. However, any time saved with the new system was offset by the requirement to complete data entry of some patient information not previously required. Other changes noted included the change of responsibilities for the reception staff and the physical changes required to accommodate the increased activity around the data entry area. Implementing a new HIT is always an exciting time for any environment, but ensuring that the implementation goes smoothly and with minimal trouble requires the administrator and their team to plan well in advance for staff training, physical layout, and possible staff resource reallocation.

  18. Transforming user needs into functional requirements for an antibiotic clinical decision support system: explicating content analysis for system design.

    PubMed

    Bright, T J

    2013-01-01

    Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. The objective was to describe a user-centered approach for transforming emergent themes derived from focus group data into functional requirements for informatics solutions, and to illustrate the application of these methods to the development of an antibiotic clinical decision support system (CDS). The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link them with the identified informatics solutions and functional requirements. The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements for an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. This study presents one example explicating content analysis of focus group data and the analysis process from narrative data to functional requirements. Illustrating this 5-step method, an antibiotic CDS system was developed, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains.

  19. Transforming User Needs into Functional Requirements for an Antibiotic Clinical Decision Support System

    PubMed Central

    Bright, T.J.

    2013-01-01

    Background: Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. Objective: To describe a user-centered approach transforming emergent themes derived from focus group data into functional requirements for informatics solutions and to illustrate the application of these methods to the development of an antibiotic clinical decision support system (CDS). Methods: The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link them with the identified informatics solutions and functional requirements. Results: The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements for an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. Conclusion: This study presents one example explicating content analysis of focus group data and the analysis process from narrative data to functional requirements. Illustrating this 5-step method, an antibiotic CDS system was developed, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains. PMID:24454586

  20. Advanced Research Deposition System (ARDS) for processing CdTe solar cells

    NASA Astrophysics Data System (ADS)

    Barricklow, Keegan Corey

    CdTe solar cells have been commercialized at the gigawatt-per-year level. The development of volume manufacturing processes for next-generation CdTe photovoltaics (PV) with higher efficiencies requires research systems with flexibility, scalability, repeatability and automation. The Advanced Research Deposition System (ARDS) developed by the Materials Engineering Laboratory (MEL) provides such a platform for the investigation of materials and manufacturing processes necessary to produce the next generation of CdTe PV. Because previous research systems were limiting, the ARDS was developed to provide process and hardware flexibility, accommodate advanced processing techniques, and produce device-quality films. The ARDS is a unique in-line process tool with nine processing stations. The system was designed, built and assembled at the Materials Engineering Laboratory. Final assembly, startup, characterization and process development are the focus of this research. Many technical challenges encountered during the startup of the ARDS were addressed; several hardware modifications needed for reliable operation were designed, constructed and successfully incorporated into the ARDS. The effect of process conditions on film properties was quantified for each process step. Process development to achieve a 12%-efficient baseline solar cell required investigation of discrete processing steps, troubleshooting process variation, and developing performance correlations. Subsequent to this research, many advances have been demonstrated with the ARDS. The ARDS consistently produces devices of 12% ± 0.5% by the process of record (POR). The champion cell produced to date with the ARDS has an efficiency of 16.2% on low-cost commercial soda-lime glass and utilizes advanced films. The ARDS has enabled investigation of advanced concepts for processing CdTe devices, including plasma cleaning, plasma-enhanced closed space sublimation (PECSS), an electron reflector (ER) using a Cd1-xMgxTe (CMT) structure, and alternative device structures. The ARDS has been instrumental in collaborative research with many institutions.

  1. Validation of an Improved Computer-Assisted Technique for Mining Free-Text Electronic Medical Records.

    PubMed

    Duz, Marco; Marshall, John F; Parkin, Tim

    2017-06-29

    The use of electronic medical records (EMRs) offers opportunity for clinical epidemiological research. With large EMR databases, automated analysis processes are necessary but require thorough validation before they can be routinely used. The aim of this study was to validate a computer-assisted technique using commercially available content analysis software (SimStat-WordStat v.6 (SS/WS), Provalis Research) for mining free-text EMRs. The dataset used for the validation process included life-long EMRs from 335 patients (17,563 rows of data), selected at random from a larger dataset (141,543 patients, ~2.6 million rows of data) and obtained from 10 equine veterinary practices in the United Kingdom. The ability of the computer-assisted technique to detect rows of data (cases) of colic, renal failure, right dorsal colitis, and non-steroidal anti-inflammatory drug (NSAID) use in the population was compared with manual classification. The first step of the computer-assisted analysis process was the definition of inclusion dictionaries to identify cases, including terms identifying a condition of interest. Words in inclusion dictionaries were selected from the list of all words in the dataset obtained in SS/WS. The second step consisted of defining an exclusion dictionary, including combinations of words to remove cases erroneously classified by the inclusion dictionary alone. The third step was the definition of a reinclusion dictionary to reinclude cases that had been erroneously removed by the exclusion dictionary. Finally, cases obtained by the exclusion dictionary were removed from cases obtained by the inclusion dictionary, and cases from the reinclusion dictionary were subsequently reincluded, using R v3.0.2 (R Foundation for Statistical Computing, Vienna, Austria). Manual analysis was performed as a separate process by a single experienced clinician reading through the dataset once and classifying each row of data based on the interpretation of the free-text notes. Validation was performed by comparison of the computer-assisted method with manual analysis, which was used as the gold standard. Sensitivity, specificity, negative predictive values (NPVs), positive predictive values (PPVs), and F values of the computer-assisted process were calculated by comparison with the manual classification. The lowest sensitivity, specificity, PPV, NPV, and F values were 99.82% (1128/1130), 99.88% (16410/16429), 94.6% (223/239), 100.00% (16410/16412), and 99.0% (100×2×0.983×0.998/[0.983+0.998]), respectively. The computer-assisted process required only a few seconds to run, although an estimated 30 h were required for dictionary creation. Manual classification required approximately 80 man-hours. The critical step in this work is the creation of accurate and inclusive dictionaries to ensure that no potential cases are missed. It is significantly easier to remove false positive terms from an SS/WS-selected subset of a large database than to search the original database for potential false negatives. The benefits of using this method are proportional to the size of the dataset to be analyzed. ©Marco Duz, John F Marshall, Tim Parkin. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 29.06.2017.
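
    The three-dictionary selection logic lends itself to a short sketch; the example terms below are invented for illustration, whereas the study built its dictionaries in SimStat-WordStat and combined the case sets in R.

    ```python
    def select_cases(rows: list[str], include: set[str],
                     exclude: set[str], reinclude: set[str]) -> list[int]:
        """Indices of rows kept after include -> exclude -> reinclude filtering."""
        def matches(text, terms):
            return any(t in text.lower() for t in terms)
        included   = {i for i, r in enumerate(rows) if matches(r, include)}
        excluded   = {i for i in included if matches(rows[i], exclude)}
        reincluded = {i for i in excluded if matches(rows[i], reinclude)}
        return sorted((included - excluded) | reincluded)

    rows = ["mild colic episode overnight",
            "no signs of colic today",
            "no signs of colic but painful on palpation"]
    print(select_cases(rows, include={"colic"},
                       exclude={"no signs of colic"},
                       reinclude={"painful"}))   # -> [0, 2]
    ```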

  2. Validation of an Improved Computer-Assisted Technique for Mining Free-Text Electronic Medical Records

    PubMed Central

    Marshall, John F; Parkin, Tim

    2017-01-01

    Background: The use of electronic medical records (EMRs) offers opportunity for clinical epidemiological research. With large EMR databases, automated analysis processes are necessary but require thorough validation before they can be routinely used. Objective: The aim of this study was to validate a computer-assisted technique using commercially available content analysis software (SimStat-WordStat v.6 (SS/WS), Provalis Research) for mining free-text EMRs. Methods: The dataset used for the validation process included life-long EMRs from 335 patients (17,563 rows of data), selected at random from a larger dataset (141,543 patients, ~2.6 million rows of data) and obtained from 10 equine veterinary practices in the United Kingdom. The ability of the computer-assisted technique to detect rows of data (cases) of colic, renal failure, right dorsal colitis, and non-steroidal anti-inflammatory drug (NSAID) use in the population was compared with manual classification. The first step of the computer-assisted analysis process was the definition of inclusion dictionaries to identify cases, including terms identifying a condition of interest. Words in inclusion dictionaries were selected from the list of all words in the dataset obtained in SS/WS. The second step consisted of defining an exclusion dictionary, including combinations of words to remove cases erroneously classified by the inclusion dictionary alone. The third step was the definition of a reinclusion dictionary to reinclude cases that had been erroneously removed by the exclusion dictionary. Finally, cases obtained by the exclusion dictionary were removed from cases obtained by the inclusion dictionary, and cases from the reinclusion dictionary were subsequently reincluded, using R v3.0.2 (R Foundation for Statistical Computing, Vienna, Austria). Manual analysis was performed as a separate process by a single experienced clinician reading through the dataset once and classifying each row of data based on the interpretation of the free-text notes. Validation was performed by comparison of the computer-assisted method with manual analysis, which was used as the gold standard. Sensitivity, specificity, negative predictive values (NPVs), positive predictive values (PPVs), and F values of the computer-assisted process were calculated by comparison with the manual classification. Results: The lowest sensitivity, specificity, PPV, NPV, and F values were 99.82% (1128/1130), 99.88% (16410/16429), 94.6% (223/239), 100.00% (16410/16412), and 99.0% (100×2×0.983×0.998/[0.983+0.998]), respectively. The computer-assisted process required only a few seconds to run, although an estimated 30 h were required for dictionary creation. Manual classification required approximately 80 man-hours. Conclusions: The critical step in this work is the creation of accurate and inclusive dictionaries to ensure that no potential cases are missed. It is significantly easier to remove false positive terms from an SS/WS-selected subset of a large database than to search the original database for potential false negatives. The benefits of using this method are proportional to the size of the dataset to be analyzed. PMID:28663163

  3. Validating Signs and Symptoms From An Actual Mass Casualty Incident to Characterize An Irritant Gas Syndrome Agent (IGSA) Exposure: A First Step in The Development of a Novel IGSA Triage Algorithm.

    PubMed

    Culley, Joan M; Richter, Jane; Donevant, Sara; Tavakoli, Abbas; Craig, Jean; DiNardi, Salvatore

    2017-07-01

    Chemical exposures can pose a significant threat to life, and rapid assessment by first responders/emergency nurses is required to reduce death and disability. Currently, no informatics tools for irritant gas syndrome agent (IGSA) exposures exist to process victims efficiently, continuously monitor for latent signs/symptoms, or make triage recommendations. This study describes the first step in developing ED informatics tools for chemical incidents: validation of signs/symptoms that characterize an IGSA syndrome. Data abstracted from 146 patients treated for chlorine exposure in one emergency department during a 2005 train derailment, and from 152 patients not exposed to chlorine (a comparison group), were mapped to 93 possible signs/symptoms within 2 tools (WISER and CHEMM-IST) designed to assist emergency responders/emergency nurses with managing hazardous material exposures. Inferential statistics (χ²/Fisher's exact test) and diagnostic tests were used to examine mapped signs/symptoms of persons who were and were not exposed to chlorine. Three clusters of signs/symptoms are statistically associated with an IGSA syndrome (P < .01): respiratory (shortness of breath, wheezing, coughing, and choking); chest discomfort (tightness, pain, and burning); and eye, nose and/or throat (pain, irritation, and burning). The syndrome requires the presence of signs/symptoms from at least 2 of these clusters, and the latency period must also be considered for exposed/potentially exposed persons. This study uses actual patient data from a chemical incident to characterize and validate signs/symptoms of an IGSA syndrome; validating signs/symptoms is the first step in developing new ED informatics tools with the potential to revolutionize the process by which emergency nurses triage victims of chemical incidents. Copyright © 2017 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.

  4. Investigating the feasibility of scale up and automation of human induced pluripotent stem cells cultured in aggregates in feeder free conditions☆

    PubMed Central

    Soares, Filipa A.C.; Chandra, Amit; Thomas, Robert J.; Pedersen, Roger A.; Vallier, Ludovic; Williams, David J.

    2014-01-01

    The transfer of a laboratory process into a manufacturing facility is one of the most critical steps required for the large-scale production of cell-based therapy products. This study describes the first published protocol for scalable automated expansion of human induced pluripotent stem cell lines growing as aggregates in feeder-free and chemically defined medium. Cells were successfully transferred between different sites representative of research and manufacturing settings, and were passaged both manually and using the CompacT SelecT automation platform. Modified protocols were developed for the automated system, and the management of cell aggregates (clumps) was identified as the critical step. Cellular morphology, pluripotency gene expression and differentiation into the three germ layers have been used to compare the outcomes of the manual and automated processes. PMID:24440272

  5. Development and validation of instrument for ergonomic evaluation of tablet arm chairs

    PubMed Central

    Tirloni, Adriana Seára; dos Reis, Diogo Cunha; Bornia, Antonio Cezar; de Andrade, Dalton Francisco; Borgatto, Adriano Ferreti; Moro, Antônio Renato Pereira

    2016-01-01

    The purpose of this study was to develop and validate an evaluation instrument for tablet arm chairs based on ergonomic requirements, focused on user perceptions and using Item Response Theory (IRT). This exploratory study involved 1,633 participants (university students and professors) in four steps: a pilot study (n=26), semantic validation (n=430), content validation (n=11) and construct validation (n=1,166). Samejima's graded response model was applied to validate the instrument. The results showed that all the steps (theoretical and practical) of the instrument's development and validation processes were successful and that the group of remaining items (n=45) had a high consistency (0.95). This instrument can be used in the furniture industry by engineers and product designers and in the purchasing process of tablet arm chairs for schools, universities and auditoriums. PMID:28337099
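
    For readers unfamiliar with Samejima's graded response model used in the construct validation, the sketch below computes category response probabilities from a 2PL cumulative curve; the discrimination and threshold values are made-up illustrations, not estimates from the study.

    ```python
    import numpy as np

    def grm_category_probs(theta: float, a: float, b: np.ndarray) -> np.ndarray:
        """P(X = k | theta) for ordered categories 0..K given thresholds b_1..b_K."""
        p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # P(X >= k), k = 1..K
        bounds = np.concatenate(([1.0], p_star, [0.0]))   # P(X >= 0) = 1, P(X > K) = 0
        return bounds[:-1] - bounds[1:]                   # adjacent differences

    probs = grm_category_probs(theta=0.5, a=1.7, b=np.array([-1.0, 0.0, 1.2]))
    print(probs, probs.sum())  # category probabilities sum to 1
    ```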

  6. One-step global parameter estimation of kinetic inactivation parameters for Bacillus sporothermodurans spores under static and dynamic thermal processes.

    PubMed

    Cattani, F; Dolan, K D; Oliveira, S D; Mishra, D K; Ferreira, C A S; Periago, P M; Aznar, A; Fernandez, P S; Valdramidis, V P

    2016-11-01

    Bacillus sporothermodurans produces highly heat-resistant endospores that can survive ultra-high-temperature treatment. Highly heat-resistant spore-forming bacteria are one of the main causes of spoilage, and a safety concern, in low-acid foods. They can be used as indicators or surrogates to establish the minimum requirements for heat processes, but it is necessary to understand their thermal inactivation kinetics. The aim of the present work was to study the inactivation kinetics under both static and dynamic conditions in a vegetable soup. Ordinary least squares one-step regression and sequential procedures were applied to estimate the kinetic parameters. Results showed that multiple dynamic heating profiles, when analyzed simultaneously, can be used to accurately estimate the kinetic parameters while significantly reducing estimation errors and data collection. Copyright © 2016 Elsevier Ltd. All rights reserved.
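
    A minimal sketch of such one-step global fitting, assuming log-linear kinetics with a Bigelow-type D(T) = D_ref·10^((T_ref − T)/z) model and synthetic dynamic profiles (the study's soup data and model details may differ):

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    T_REF = 121.0  # reference temperature, deg C

    def log_reduction(times, temps, d_ref, z):
        """Cumulative log10 reduction along one dynamic profile (trapezoidal rule)."""
        rate = 1.0 / (d_ref * 10.0 ** ((T_REF - temps) / z))  # 1/D(T(t)), log10/min
        steps = np.diff(times) * (rate[1:] + rate[:-1]) / 2.0
        return np.concatenate(([0.0], np.cumsum(steps)))

    # Two synthetic dynamic heating profiles sampled once per minute
    rng = np.random.default_rng(1)
    profiles = []
    for ramp in (0.5, 1.0):  # deg C per minute
        t = np.arange(0.0, 16.0)
        temps = 100.0 + ramp * t
        data = log_reduction(t, temps, d_ref=0.3, z=10.0) + rng.normal(0, 0.05, t.size)
        profiles.append((t, temps, data))

    def residuals(params):
        # one-step: all profiles enter a single global objective
        d_ref, z = params
        return np.concatenate([log_reduction(t, T, d_ref, z) - y for t, T, y in profiles])

    fit = least_squares(residuals, x0=[1.0, 8.0], bounds=([1e-3, 1.0], [10.0, 50.0]))
    print("fitted D_ref (min), z (deg C):", fit.x)
    ```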

  7. [Lessons learned from a distribution incident at the Alps-Mediterranean Division of the French Blood Establishment].

    PubMed

    Legrand, D

    2008-11-01

    The Alps-Mediterranean division of the French blood establishment (EFS Alpes-Mediterranée) has implemented a risk management program. Within this framework, the labile blood product distribution process was assessed to identify critical steps. Subsequently, safety measures were instituted including computer-assisted decision support, detailed written instructions and control checks at each step. Failure of these measures to prevent an incident underlines the vulnerability of the process to the human factor. Indeed root cause analysis showed that the incident was due to underestimation of the danger by one individual. Elimination of this type of risk will require continuous training, testing and updating of personnel. Identification and reporting of nonconformities will allow personnel at all levels (local, regional, and national) to share lessons and implement appropriate risk mitigation strategies.

  8. Diagonal chromatography to study plant protein modifications.

    PubMed

    Walton, Alan; Tsiatsiani, Liana; Jacques, Silke; Stes, Elisabeth; Messens, Joris; Van Breusegem, Frank; Goormachtig, Sofie; Gevaert, Kris

    2016-08-01

    An interesting asset of diagonal chromatography, which we have introduced for contemporary proteome research, is its high versatility concerning proteomic applications. Indeed, the peptide modification or sorting step that is required between consecutive peptide separations can easily be altered and thereby allows for the enrichment of specific, though different types of peptides. Here, we focus on the application of diagonal chromatography for the study of modifications of plant proteins. In particular, we show how diagonal chromatography allows for studying proteins processed by proteases, protein ubiquitination, and the oxidation of protein-bound methionines. We discuss the actual sorting steps needed for each of these applications and the obtained results. This article is part of a Special Issue entitled: Plant Proteomics--a bridge between fundamental processes and crop production, edited by Dr. Hans-Peter Mock. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Direct LiT Electrolysis in a Metallic Fusion Blanket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, Luke

    2016-09-30

    A process that simplifies the extraction of tritium from molten lithium-based breeding blankets was developed. The process is based on the direct electrolysis of lithium tritide using a ceramic Li-ion conductor that replaces the molten salt extraction step. Extraction of tritium in the form of lithium tritide in the blankets/targets of fusion/fission reactors is critical in order to maintain low concentrations. This is needed to decrease the potential tritium permeation to the surroundings and large releases from unforeseen accident scenarios. Extraction is complicated by the required low tritium concentration limits and by the high affinity of tritium for the blanket. This work identified, developed and tested ceramic lithium-ion conductors capable of recovering hydrogen and deuterium through an electrolysis step at high temperatures.

  10. Direct Lit Electrolysis In A Metallic Lithium Fusion Blanket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colon-Mercado, H.; Babineau, D.; Elvington, M.

    2015-10-13

    A process that simplifies the extraction of tritium from molten lithium-based breeding blankets was developed. The process is based on the direct electrolysis of lithium tritide using a ceramic Li-ion conductor that replaces the molten salt extraction step. Extraction of tritium in the form of lithium tritide in the blankets/targets of fission/fusion reactors is critical in order to maintain low concentrations. This is needed to decrease the potential tritium permeation to the surroundings and large releases from unforeseen accident scenarios. Because of the high affinity of tritium for the blanket, extraction is complicated at the required low levels. This work identified, developed and tested ceramic lithium-ion conductors capable of recovering hydrogen and deuterium through an electrolysis step at high temperatures.

  11. Masturbation, sexuality, and adaptation: normalization in adolescence.

    PubMed

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  12. A simplified bioprocess for human alpha-fetoprotein production from inclusion bodies.

    PubMed

    Leong, Susanna S J; Middelberg, Anton P J

    2007-05-01

    A simple and effective Escherichia coli (E. coli) bioprocess is demonstrated for the preparation of recombinant human alpha-fetoprotein (rhAFP), a pharmaceutically promising protein that has important immunomodulatory functions. The new rhAFP process employs only unit operations that are easy to scale and validate, and reduces the complexity embedded in existing inclusion body processing methods. A key requirement in the establishment of this process was the attainment of high-purity rhAFP prior to protein refolding because (i) rhAFP binds easily to hydrophobic contaminants once refolded, and (ii) rhAFP aggregates during renaturation in a contaminant-dependent way. In this work, direct protein extraction from cell suspension was coupled with a DNA precipitation-centrifugation step prior to purification using two simple chromatographic steps. Refolding was conducted using a single-step, redox-optimized dilution refolding protocol, with refolding success determined by reversed phase HPLC analysis, ELISA, and circular dichroism spectroscopy. Quantitation of DNA and protein contaminant loads after each unit operation showed that contaminant levels were reduced to levels comparable to traditional flowsheets. Protein microchemical modification due to carbamylation in this urea-based process was identified and minimized, yielding a final refolded and purified product that was significantly purified from carbamylated variants. Importantly, this work conclusively demonstrates, for the first time, that a chemical extraction process can replace the more complex traditional inclusion body processing flowsheet without compromising product purity and yield. This highly intensified and simplified process is expected to be of general utility for the preparation of other therapeutic candidates expressed as inclusion bodies. (c) 2006 Wiley Periodicals, Inc.

  13. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
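
    A small sketch of the scoring scheme follows; the R = P × C product, the action thresholds, and the example steps are illustrative assumptions, since the paper's actual criteria are predetermined per laboratory.

    ```python
    steps = [
        # (process step, P, C, D) -- illustrative scores on a 10-point scale
        ("sample identification", 3, 9, 4),
        ("stat centrifugation",   5, 6, 3),
        ("result authorisation",  2, 8, 2),
    ]
    R_LIMIT, D_LIMIT = 30, 3  # hypothetical criteria for detailed analysis

    for name, p, c, d in steps:
        r = p * c                               # overall risk, assumed R = P x C
        flagged = r >= R_LIMIT or d > D_LIMIT   # needs detailed analysis / measures
        print(f"{name:24s} R={r:3d} D={d}  {'REVIEW' if flagged else 'ok'}")
    ```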

  14. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space and high-performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation covers work being conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, onto an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings that were observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
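
    The index math named in the abstract is simple per-pixel band arithmetic, sketched below with random arrays standing in for the red, NIR, and SWIR rasters the pipeline would pull from cloud storage.

    ```python
    import numpy as np

    def normalized_difference(a: np.ndarray, b: np.ndarray) -> np.ndarray:
        """(a - b) / (a + b), with divide-by-zero pixels left as NaN."""
        denom = a + b
        out = np.full(a.shape, np.nan, dtype=np.float32)
        np.divide(a - b, denom, out=out, where=denom != 0)
        return out

    red  = np.random.rand(256, 256).astype(np.float32)
    nir  = np.random.rand(256, 256).astype(np.float32)
    swir = np.random.rand(256, 256).astype(np.float32)

    ndvi = normalized_difference(nir, red)    # vegetation vigour
    ndmi = normalized_difference(nir, swir)   # canopy moisture
    print(float(np.nanmean(ndvi)), float(np.nanmean(ndmi)))
    ```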

  15. Topology and Function of Human P-Glycoprotein in Multidrug Resistant Breast Cancer Cells.

    DTIC Science & Technology

    1995-09-01

    membrane orientation and insertion process co-translationally. For the C-terminal half of Pgp, little is known about the regulatory mechanisms of...solution (in mM: 250 sucrose, 10 Tris-HCl, pH 7.5, 150 NaCl) for further processing. For experiments requiring protease digestion and endoglycosidase...steps), 40 ms after the start of the voltage pulse. Bath and pipette solution compositions were as follows (in mM): NMDG-Cl pipette (280 mosmol/kg

  16. RCRA/UST, Superfund, and EPCRA hotline training module. Introduction to superfund community involvement. Directive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-03-01

    This module covers EPA's Superfund community involvement program, a set of requirements under the National Contingency Plan (NCP) designed to ensure that the public is informed about site conditions and given the opportunity to comment on the proposed remedy of a Superfund site. The NCP serves to uphold the public's right to voice opinions and express concerns about Superfund site activities. EPA must involve communities throughout the Superfund process, particularly at critical decision-making steps.

  17. Obtaining and processing Daymet data using Python and ArcGIS

    USGS Publications Warehouse

    Bohms, Stefanie

    2013-01-01

    This set of scripts was developed to automate the process of downloading and mosaicking daily Daymet data to a user-defined extent using ArcGIS and the Python programming language. The three steps are downloading the needed Daymet tiles for the study area extent, converting the netCDF files to the GeoTIFF raster format, and mosaicking those rasters into one file. The set of scripts is intended for all levels of experience with the Python programming language and requires no scripting by the user.
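
    A sketch of the same three steps using open libraries instead of arcpy is shown below; the URL template, tile IDs, and variable name are placeholders, not the actual Daymet service endpoints or the published scripts' code.

    ```python
    import urllib.request
    import xarray as xr           # netCDF reading
    import rioxarray              # registers the .rio accessor for GeoTIFF export
    import rasterio
    from rasterio.merge import merge

    TILE_URL = "https://example.org/daymet/{year}/{tile}_tmax.nc"  # placeholder endpoint
    tiles, year = ["11738", "11739"], 2012                         # hypothetical tile ids

    tif_paths = []
    for tile in tiles:
        nc_path = f"{tile}.nc"
        urllib.request.urlretrieve(TILE_URL.format(year=year, tile=tile), nc_path)  # step 1
        da = xr.open_dataset(nc_path)["tmax"].isel(time=0)                          # step 2
        tif = f"{tile}.tif"
        da.rio.to_raster(tif)
        tif_paths.append(tif)

    sources = [rasterio.open(p) for p in tif_paths]                                 # step 3
    mosaic, transform = merge(sources)
    profile = sources[0].profile
    profile.update(height=mosaic.shape[1], width=mosaic.shape[2], transform=transform)
    with rasterio.open("mosaic.tif", "w", **profile) as dst:
        dst.write(mosaic)
    ```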

  18. Characterization of nonplanar motion in MEMS involving scanning laser interferometry

    NASA Astrophysics Data System (ADS)

    Lawton, Russell A.; Abraham, Margaret H.; Lawrence, Eric

    1999-08-01

    A study to evaluate three processes used for the release of standard devices produced by MCNC using the MUMPS process was undertaken by the Jet Propulsion Laboratory in collaboration with The Aerospace Corporation and Polytec PI. The processes used were developed at various laboratories and are commonly the final step in the production of micro-electro-mechanical systems prior to packaging. It is at this stage of the process that the devices become extremely delicate and are subject to yield losses due to handling errors or the phenomenon of stiction. The effects of HF post-processing on grain boundaries, and of subsequent thermal processing producing native oxide growth during packaging, will require further investigation.

  19. Isothermal DNA origami folding: avoiding denaturing conditions for one-pot, hybrid-component annealing

    NASA Astrophysics Data System (ADS)

    Kopielski, Andreas; Schneider, Anne; Csáki, Andrea; Fritzsche, Wolfgang

    2015-01-01

    The DNA origami technique offers great potential for nanotechnology. Using biomolecular self-assembly, defined 2D and 3D nanoscale DNA structures can be realized. DNA origami allows the positioning of proteins, fluorophores or nanoparticles with an accuracy of a few nanometers and thereby enables novel nanoscale devices. Origami assembly usually includes a thermal denaturation step at 90 °C. Additional components used for nanoscale assembly (such as proteins) are often thermosensitive and may be damaged by such harsh conditions. They therefore have to be attached in an extra second step to avoid defects. To enable a streamlined one-step nanoscale synthesis - a so-called one-pot folding - an adaptation of the folding procedures is required. Here we present a thermal optimization of this process for a 2D rectangle-shaped DNA origami, resulting in an isothermal assembly protocol below 60 °C without thermal denaturation. Moreover, a room-temperature protocol is presented using the chemical additive betaine, which is biocompatible in contrast to the chemical denaturing approaches reported previously. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr04176c

  20. Passive serialization in a multitasking environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hennessey, J.P.; Osisek, D.L.; Seigh, J.W. II

    1989-02-28

    In a multiprocessing system having a control program in which data objects are shared among processes, this patent describes a method for serializing references to a data object by the processes so as to prevent invalid references to the data object by any process when an operation requiring exclusive access is performed by another process, comprising the steps of: permitting the processes to reference data objects on a shared access basis without obtaining a shared lock; monitoring a point of execution of the control program which is common to all processes in the system, which occurs regularly in the process' execution and across which no references to any data object can be maintained by any process, except references using locks; establishing a system reference point which occurs after each process in the system has passed the point of execution at least once since the last such system reference point; requesting an operation requiring exclusive access on a selected data object; preventing subsequent references by other processes to the selected data object; waiting until two of the system reference points have occurred; and then performing the requested operation.
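
    The claim describes what is now commonly called an RCU-style grace period. The Python sketch below is a toy illustration of that idea, not the patented control-program implementation: readers touch the shared object without locks, bump a per-thread counter at a quiescent point, and a writer waits until every reader has passed that point before reclaiming the old version.

    ```python
    import threading, time

    quiescent_count = {}   # per-thread counters, bumped at the common execution point
    shared = {"value": 0}  # lock-free shared object, replaced wholesale by the writer

    def reader(tid, stop):
        while not stop.is_set():
            _ = shared["value"]                                    # read without any lock
            quiescent_count[tid] = quiescent_count.get(tid, 0) + 1  # quiescent point

    def wait_grace_period():
        """Return once every reader has passed its quiescent point at least once."""
        snapshot = dict(quiescent_count)
        while any(quiescent_count.get(t, 0) == c for t, c in snapshot.items()):
            time.sleep(0.001)

    stop = threading.Event()
    threads = [threading.Thread(target=reader, args=(i, stop)) for i in range(3)]
    for t in threads:
        t.start()
    time.sleep(0.01)                      # let readers run

    old, shared = shared, {"value": 1}    # writer publishes the new version
    wait_grace_period()                   # no reader can still hold 'old': safe to free
    stop.set()
    for t in threads:
        t.join()
    print("grace period complete; old version reclaimable:", old)
    ```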

  1. Leap Frog and Time Step Sub-Cycle Scheme for Coupled Neutronics and Thermal-Hydraulic Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, S.

    2002-07-01

    As a result of advancing TCP/IP-based inter-process communication technology, more and more legacy thermal-hydraulic codes have been coupled with neutronics codes to provide best-estimate capabilities for reactivity-related reactor transient analysis. Most of the coupling schemes are based on closely coupled serial or parallel approaches, so the execution of the coupled codes usually requires significant CPU time when a complicated system is analyzed. A Leap Frog scheme has been used to reduce the run time. The extent of the decoupling is usually determined through trial and error for a specific analysis. It is the intent of this paper to develop a set of general criteria that can be used to invoke an automatic Leap Frog algorithm. The algorithm will not only reduce run time but also preserve accuracy. The criteria will also serve as the basis of an automatic time step sub-cycle scheme when a sudden reactivity change is introduced while the thermal-hydraulic code is marching with a relatively large time step. (authors)
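
    As a concrete illustration of such criteria, the Python sketch below shows a coupling loop in which the neutronics solver is invoked only every few thermal-hydraulic steps (the Leap Frog decoupling) and the time step is sub-cycled automatically when a sudden reactivity change is detected. The solver callables, the leap interval and the reactivity threshold are assumptions for illustration, not the paper's criteria.

    def coupled_transient(th_advance, nk_solve, t_end, dt, leap=5, rho_tol=1e-4):
        """th_advance(power, dt) marches the thermal-hydraulics; nk_solve(t)
        returns (power, reactivity). Both are stand-in callables."""
        t, step = 0.0, 0
        power, rho_prev = 1.0, 0.0
        while t < t_end:
            if step % leap == 0:                  # Leap Frog: neutronics called sparsely
                power, rho = nk_solve(t)
                if abs(rho - rho_prev) > rho_tol: # sudden reactivity change detected
                    n_sub = 10                    # sub-cycle the large TH time step
                    for _ in range(n_sub):
                        th_advance(power, dt / n_sub)
                        t += dt / n_sub
                    rho_prev, step = rho, step + 1
                    continue
                rho_prev = rho
            th_advance(power, dt)                 # TH code marches every step
            t += dt
            step += 1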

  2. Asymptotic densities from the modified Montroll-Weiss equation for coupled CTRWs

    NASA Astrophysics Data System (ADS)

    Aghion, Erez; Kessler, David A.; Barkai, Eli

    2018-01-01

    We examine the bi-scaling behavior of Lévy walks with nonlinear coupling, where χ, the particle displacement during each step, is coupled to the duration of the step, τ, by χ ∼ τ^β. An example of such a process is the regular Lévy walk, where β = 1. In recent years such processes were shown to be highly useful for the analysis of a class of Langevin dynamics, in particular a system of Sisyphus laser-cooled atoms in an optical lattice, where β = 3/2. We discuss the well-known decoupling approximation used to describe the central part of the particles' position distribution, and use the recently introduced infinite-covariant density approach to study the large fluctuations. Since the density of the step displacements is fat-tailed, the last travel event must be treated with care in the latter analysis. This effect requires a modification of the Montroll-Weiss equation, an equation which has proved important for the analysis of many microscopic models. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
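
    A short Monte Carlo sketch of such a coupled walk follows; the Pareto form of the step-duration density and all parameter values are illustrative assumptions. Note the truncation of the final, incomplete travel event, the effect that motivates the modification of the Montroll-Weiss equation.

    import numpy as np

    rng = np.random.default_rng(0)

    def levy_walk_positions(n_walkers, t_total, alpha=1.5, beta=1.5):
        """Final positions of walkers whose step durations tau follow a
        fat-tailed density p(tau) ~ tau**-(1 + alpha), with displacement
        magnitude tau**beta per step (beta = 3/2 for the laser-cooled case)."""
        x = np.zeros(n_walkers)
        for i in range(n_walkers):
            t = 0.0
            while t < t_total:
                tau = rng.pareto(alpha) + 1.0     # fat-tailed step duration
                tau = min(tau, t_total - t)       # truncate the last travel event
                x[i] += rng.choice((-1.0, 1.0)) * tau**beta
                t += tau
        return x

    positions = levy_walk_positions(10_000, t_total=100.0)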

  3. Modeling Growth of Nanostructures in Plasmas

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.

    2004-01-01

    As semiconductor circuits shrink to CDs below 0.1 μm, it is becoming increasingly critical to replace and/or enhance existing technology with nanoscale structures, such as nanowires for interconnects. Nanowires grown in plasmas are strongly dependent on processing conditions, such as gas composition and substrate temperature. Growth occurs at specific sites, or step-edges, with the bulk growth rate of the nanowires determined from the equation of motion of the nucleating crystalline steps. Traditional front-tracking algorithms, such as string-based or level set methods, suffer either from numerical complications in higher spatial dimensions, or from difficulties in incorporating surface-intense physical and chemical phenomena. Phase field models have the robustness of the level set method, combined with the ability to implement the surface-specific chemistry required to model crystal growth, although they do not necessarily solve directly for the advancing front location. We have adopted a phase field approach and present results for the adatom density and step-growth location in time as a function of processing conditions, such as temperature and plasma gas composition.
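
    The phase-field idea can be illustrated with a one-dimensional Allen-Cahn update; the authors' full model additionally couples an adatom density field and plasma-dependent surface chemistry, which this minimal sketch with assumed parameters omits.

    import numpy as np

    def allen_cahn_step(phi, dx, dt, eps=0.05, mobility=1.0):
        """One explicit Euler update of dphi/dt = M (eps^2 lap(phi) - f'(phi))
        with the double-well f(phi) = phi^2 (1 - phi)^2 and periodic bounds."""
        lap = (np.roll(phi, 1) + np.roll(phi, -1) - 2.0 * phi) / dx**2
        dfdphi = 2.0 * phi * (1.0 - phi) * (1.0 - 2.0 * phi)
        return phi + dt * mobility * (eps**2 * lap - dfdphi)

    # Diffuse interface between phi = 1 (crystal step) and phi = 0 (terrace).
    x = np.linspace(0.0, 1.0, 200)
    phi = 0.5 * (1.0 + np.tanh((0.5 - x) / 0.05))
    for _ in range(1000):
        phi = allen_cahn_step(phi, dx=x[1] - x[0], dt=1e-5)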

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobo, R.; Revah, S.; Viveros-Garcia, T.

    An analysis of the local processes occurring in a trickle-bed bioreactor (TBB) with a first-order bioreaction shows that identifying the TBB operating regime requires knowledge of the substrate concentration in the liquid phase. If the substrate liquid concentration is close to zero, the rate-controlling step is mass transfer at the gas-liquid interface; when it is close to the value in equilibrium with the gas phase, the controlling step is the phenomena occurring in the biofilm. CS₂ removal rate data obtained in a TBB with a Thiobacilli consortium biofilm are analyzed to obtain the mass transfer and kinetic parameters, and to show that the bioreactor operates in a regime mainly controlled by mass transfer. A TBB model with two experimentally determined parameters is developed and used to show how the bioreactor size depends on the rate-limiting step, the absorption factor, the substrate fractional conversion, and the gas and liquid contact pattern. Under certain conditions, the TBB size is independent of the flowing phases' contact pattern. The model effectively describes substrate gas and liquid concentration data for mass transfer and biodegradation rate controlled processes.
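
    The regime identification described above reduces to comparing the measured liquid-phase substrate concentration with its gas-equilibrium value; the sketch below encodes that test, with threshold values chosen for illustration rather than taken from the paper.

    def operating_regime(c_liquid, c_gas, henry):
        """c_liquid and c_gas in consistent units; henry is the dimensionless
        partition coefficient H = c_gas / c_liquid at equilibrium."""
        c_star = c_gas / henry      # liquid concentration in equilibrium with the gas
        saturation = c_liquid / c_star
        if saturation < 0.1:        # liquid nearly depleted of substrate
            return "gas-liquid mass transfer controlled"
        if saturation > 0.9:        # liquid nearly saturated
            return "biofilm (reaction/diffusion) controlled"
        return "mixed control"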

  5. Synthesis of Ultrathin Si Nanosheets from Natural Clays for Lithium-Ion Battery Anodes.

    PubMed

    Ryu, Jaegeon; Hong, Dongki; Choi, Sinho; Park, Soojin

    2016-02-23

    Two-dimensional Si nanosheets have been studied as a promising candidate for lithium-ion battery anode materials. However, the Si nanosheets reported so far have shown poor cycling performance and require further improvement. In this work, we utilize inexpensive natural clays to prepare high quality Si nanosheets via a one-step simultaneous molten salt-induced exfoliation and chemical reduction process. This approach produces high purity mesoporous Si nanosheets in high yield. In a control experiment, a two-step process (pre-exfoliation of silicate sheets and subsequent chemical reduction) could not sustain the original two-dimensional structure. In contrast, the one-step method yields 5 nm-thick highly porous Si nanosheets. Carbon-coated Si nanosheet anodes exhibit a high reversible capacity of 865 mAh g⁻¹ at 1.0 A g⁻¹ with an outstanding capacity retention of 92.3% after 500 cycles. They also deliver high rate capability, retaining 60% of the 2.0 A g⁻¹ capacity at 20 A g⁻¹. Furthermore, the Si nanosheet electrodes show a volume expansion of only 42% after 200 cycles.

  6. Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization

    PubMed Central

    Marai, G. Elisabeta

    2018-01-01

    Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process and the abstraction stage (and its evaluation) of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements into activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550

  7. High Fidelity Tape Transfer Printing Based On Chemically Induced Adhesive Strength Modulation

    NASA Astrophysics Data System (ADS)

    Sim, Kyoseung; Chen, Song; Li, Yuhang; Kammoun, Mejdi; Peng, Yun; Xu, Minwei; Gao, Yang; Song, Jizhou; Zhang, Yingchun; Ardebili, Haleh; Yu, Cunjiang

    2015-11-01

    Transfer printing, a two-step process (picking up and printing) for heterogeneous integration, has been widely exploited for the fabrication of functional electronic systems. To ensure a reliable process, strong adhesion for picking up and weak or no adhesion for printing are required. However, it is challenging to meet these requirements for switchable stamp adhesion. Here we introduce a simple, high fidelity process, namely tape transfer printing (TTP), enabled by chemically induced dramatic modulation of tape adhesive strength. We describe the working mechanism of the adhesion modulation that governs this process and demonstrate the method by high fidelity tape transfer printing of several types of materials and devices, including Si pellet arrays, photodetector arrays, and electromyography (EMG) sensors, from their preparation substrates to various alien substrates. High fidelity tape transfer printing of components onto curvilinear surfaces is also illustrated.

  8. A narrative method for consciousness research.

    PubMed

    Díaz, José-Luis

    2013-01-01

    Some types of first-person narrations of mental processes that constitute phenomenological accounts and texts, such as internal monolog statements, epitomize the best expressions and representations of human consciousness available and therefore may be used to model phenomenological streams of consciousness. The type of autonomous monolog in which an author or narrator declares actual mental processes in a think aloud manner seems particularly suitable for modeling streams of consciousness. A narrative method to extract and depict conscious processes, operations, contents, and states from an acceptable phenomenological text would require three subsequent steps: operational criteria for producing and/or selecting a phenomenological text, a system for detecting text items that are indicative of conscious contents and processes, and a procedure for representing such items in formal dynamic system devices such as Petri nets. The requirements and restrictions of each of these steps are presented, analyzed, and applied to phenomenological texts in the following manner: (1) the relevance of introspective language and narrative analyses to consciousness research and the idea that specific narratives are of paramount interest for such investigation is justified; (2) some of the obstacles and constraints to attain plausible consciousness inferences from narrative texts and the methodological requirements to extract and depict items relevant to consciousness contents and operations from a suitable phenomenological text are examined; (3) a preliminary exercise of the proposed method is used to analyze and chart a classical interior monolog excerpted from James Joyce's Ulysses, a masterpiece of the stream-of-consciousness literary technique and, finally, (4) an inter-subjective evaluation for inter-observer agreement of mental attributions of another phenomenological text (an excerpt from the Intimate Journal of Miguel de Unamuno) is presented using some mathematical tools.

  9. A narrative method for consciousness research

    PubMed Central

    Díaz, José-Luis

    2013-01-01

    Some types of first-person narrations of mental processes that constitute phenomenological accounts and texts, such as internal monolog statements, epitomize the best expressions and representations of human consciousness available and therefore may be used to model phenomenological streams of consciousness. The type of autonomous monolog in which an author or narrator declares actual mental processes in a think aloud manner seems particularly suitable for modeling streams of consciousness. A narrative method to extract and depict conscious processes, operations, contents, and states from an acceptable phenomenological text would require three subsequent steps: operational criteria for producing and/or selecting a phenomenological text, a system for detecting text items that are indicative of conscious contents and processes, and a procedure for representing such items in formal dynamic system devices such as Petri nets. The requirements and restrictions of each of these steps are presented, analyzed, and applied to phenomenological texts in the following manner: (1) the relevance of introspective language and narrative analyses to consciousness research and the idea that specific narratives are of paramount interest for such investigation is justified; (2) some of the obstacles and constraints to attain plausible consciousness inferences from narrative texts and the methodological requirements to extract and depict items relevant to consciousness contents and operations from a suitable phenomenological text are examined; (3) a preliminary exercise of the proposed method is used to analyze and chart a classical interior monolog excerpted from James Joyce’s Ulysses, a masterpiece of the stream-of-consciousness literary technique and, finally, (4) an inter-subjective evaluation for inter-observer agreement of mental attributions of another phenomenological text (an excerpt from the Intimate Journal of Miguel de Unamuno) is presented using some mathematical tools. PMID:24265610

  10. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, they have two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to generic rules or guidelines, e.g. for binder shape, draw-in conditions or the use of drawbeads. It is therefore important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part; all requirements defined by a set of die design standards with industrial relevance are fulfilled. In a first step, binder and addendum geometry are systematically checked for material saving potential. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness against splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with stochastic variation of input variables, as sketched below. With the proposed workflow a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.
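
    The robustness validation in the final step combines deterministic forming simulations with stochastic variation of the inputs; the outer loop is sketched below, where run_forming_sim, the input scatter and the failure limits are placeholders, not the authors' tool chain.

    import random

    def robustness(run_forming_sim, n_trials=100):
        """Fraction of randomly perturbed process set-ups that stay within
        assumed thinning, wrinkling and springback limits."""
        failures = 0
        for _ in range(n_trials):
            params = {
                "blank_width_mm": random.gauss(500.0, 2.0),        # assumed scatter
                "friction": random.gauss(0.12, 0.01),
                "blankholder_force_kN": random.gauss(800.0, 20.0),
            }
            result = run_forming_sim(**params)   # returns dict of response values
            if (result["thinning"] > 0.25 or result["wrinkling"] > 1.0
                    or abs(result["springback_mm"]) > 0.5):
                failures += 1
        return 1.0 - failures / n_trials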

  11. Cellulose Biosynthesis: Current Views and Evolving Concepts

    PubMed Central

    SAXENA, INDER M.; BROWN, R. MALCOLM

    2005-01-01

    • Aims To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. • Scope Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. • Conclusions With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back. PMID:15894551

  12. Cellulose biosynthesis: current views and evolving concepts.

    PubMed

    Saxena, Inder M; Brown, R Malcolm

    2005-07-01

    To outline the current state of knowledge and discuss the evolution of various viewpoints put forth to explain the mechanism of cellulose biosynthesis. * Understanding the mechanism of cellulose biosynthesis is one of the major challenges in plant biology. The simplicity in the chemical structure of cellulose belies the complexities that are associated with the synthesis and assembly of this polysaccharide. Assembly of cellulose microfibrils in most organisms is visualized as a multi-step process involving a number of proteins with the key protein being the cellulose synthase catalytic sub-unit. Although genes encoding this protein have been identified in almost all cellulose synthesizing organisms, it has been a challenge in general, and more specifically in vascular plants, to demonstrate cellulose synthase activity in vitro. The assembly of glucan chains into cellulose microfibrils of specific dimensions, viewed as a spontaneous process, necessitates the assembly of synthesizing sites unique to most groups of organisms. The steps of polymerization (requiring the specific arrangement and activity of the cellulose synthase catalytic sub-units) and crystallization (directed self-assembly of glucan chains) are certainly interlinked in the formation of cellulose microfibrils. Mutants affected in cellulose biosynthesis have been identified in vascular plants. Studies on these mutants and herbicide-treated plants suggest an interesting link between the steps of polymerization and crystallization during cellulose biosynthesis. * With the identification of a large number of genes encoding cellulose synthases and cellulose synthase-like proteins in vascular plants and the supposed role of a number of other proteins in cellulose biosynthesis, a complete understanding of this process will necessitate a wider variety of research tools and approaches than was thought to be required a few years back.

  13. Extrinsic Repair of Injured Dendrites as a Paradigm for Regeneration by Fusion in Caenorhabditis elegans.

    PubMed

    Oren-Suissa, Meital; Gattegno, Tamar; Kravtsov, Veronika; Podbilewicz, Benjamin

    2017-05-01

    Injury triggers regeneration of axons and dendrites. Research has identified factors required for axonal regeneration outside the CNS, but little is known about regeneration triggered by dendrotomy. Here, we study neuronal plasticity triggered by dendrotomy and determine the fate of complex PVD arbors following laser surgery of dendrites. We find that severed primary dendrites grow toward each other and reconnect via branch fusion. Simultaneously, terminal branches lose self-avoidance and grow toward each other, meeting and fusing at the tips via an AFF-1-mediated process. Ectopic branch growth is identified as a step in the regeneration process required for bypassing the lesion site. Failure of reconnection to the severed dendrites results in degeneration of the distal end of the neuron. We discover pruning of excess branches via EFF-1 that acts to recover the original wild-type arborization pattern in a late stage of the process. In contrast, AFF-1 activity during dendritic auto-fusion is derived from the lateral seam cells and not autonomously from the PVD neuron. We propose a model in which AFF-1 vesicles derived from the epidermal seam cells fuse neuronal dendrites. Thus, EFF-1 and AFF-1 fusion proteins emerge as new players in neuronal arborization and maintenance of arbor connectivity following injury in Caenorhabditis elegans. Our results demonstrate that there is a genetically determined multi-step pathway to repair broken dendrites in which EFF-1 and AFF-1 act on different steps of the pathway. EFF-1 is essential for dendritic pruning after injury and extrinsic AFF-1 mediates dendrite fusion to bypass injuries. Copyright © 2017 by the Genetics Society of America.

  14. Advanced Flip Chips in Extreme Temperature Environments

    NASA Technical Reports Server (NTRS)

    Ramesham, Rajeshuni

    2010-01-01

    The use of underfill materials is necessary with flip-chip interconnect technology to redistribute stresses due to mismatched coefficients of thermal expansion (CTEs) between dissimilar materials in the overall assembly. Underfills are formulated using organic polymers and possibly inorganic filler materials. There are a few ways to apply underfills in flip-chip technology. Traditional capillary-flow underfill materials now possess high flow speed and reduced time to cure, but they still require additional processing steps beyond the typical surface-mount technology (SMT) assembly process. Studies were conducted using underfills in a temperature range of -190 to 85 C, which resulted in an increase in reliability of one to two orders of magnitude. Thermal shock of the flip-chip test articles was designed to induce failures at the interconnect sites (-40 to 100 C). The study of flip-chip reliability using underfills in the extreme temperature region is of significant value for space applications, and this technology is considered enabling for future space missions. Flip-chip interconnect technology is an advanced electrical interconnection approach in which the silicon die or chip is electrically connected, face down, to the substrate by reflowing solder bumps on area-array metallized terminals on the die to matching footprints of solder-wettable pads on the chosen substrate. This technology offers significantly improved performance for high-speed systems, productivity gains over manual wire bonding, self-alignment during die joining, low lead inductance, and a reduced need for attachment of precious metals. The use of commercially developed no-flow fluxing underfills provides a means of reducing the processing steps employed in traditional capillary-flow methods and enhances SMT compatibility. Reliability of flip chips may be significantly increased by matching/tailoring the CTEs of the substrate material, the silicon die or chip, and the underfill materials. Advanced packaging interconnect technology, such as flip-chip interconnect test boards, has been subjected to temperature ranges that cover military specifications and extreme Mars and asteroid environments. The eventual goal of each process step, and of the entire process, is to produce components with 100 percent interconnect yield and to satisfy the reliability requirements. Underfill materials, in general, may meet demanding end-use requirements such as low warpage, low stress, fine pitch, high reliability, and high adhesion.
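
    The scale of the problem underfills solve can be estimated with a back-of-envelope calculation (all values assumed, not from the article): without underfill, a peripheral joint must absorb a shear displacement equal to the CTE difference times the temperature swing times the distance from the neutral point.

    alpha_si, alpha_fr4 = 2.6e-6, 17.0e-6   # 1/K, typical handbook CTE values
    delta_T = 85 - (-190)                   # K, the extreme test range quoted above
    dnp = 5e-3                              # m, assumed distance from neutral point
    delta = (alpha_fr4 - alpha_si) * delta_T * dnp
    print(f"joint shear displacement ~ {delta * 1e6:.1f} um")  # about 19.8 um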

  15. New 3D structuring process for non-integrated circuit related technologies (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nouri, Lamia; Possémé, Nicolas; Landis, Stéfan; Milesi, Frédéric; Gaillard, Frédéric-Xavier

    2017-04-01

    Fabrication processes developed by microelectronics for integrated circuit (IC) technologies over decades do not meet the requirements of newly emerging structuring applications, in particular non-IC technologies such as MEMS/NEMS, microfluidics, photovoltaics, and lenses. Complex 3D structuring currently requires complex lithography patterning approaches such as gray-scale electron beam lithography, laser ablation, focused ion beam lithography, or two-photon polymerization. It is therefore attractive to find cheaper and easier techniques to achieve 3D structures. In this work, we propose a straightforward process to realize 3D structuration, intended for silicon-based materials (Si, SiN, SiOCH). The technique is based on nano-imprint lithography (NIL), ion implantation and selective wet etching. In a first step a pattern is defined by lithography on a substrate; ion implantation is then performed through the resist mask to create localized modifications in the material, transferring the pattern into the subjacent layer. Finally, after resist stripping, a selective wet etch removes the modified material selectively with respect to the unmodified one. In this paper, we first present results achieved with a simple 2D line array pattern processed either on silicon or SiOCH samples. This step was carried out to demonstrate the feasibility of the new structuration process. SEM images reveal that "infinite" selectivity between implanted and non-implanted areas can be achieved. We show that a key combination of implanted ion species and wet etching chemistries is required to obtain such results. The mechanisms involved during both implantation and wet etching are also examined through fine characterization with photoluminescence, Raman and Secondary Ion Mass Spectrometry (SIMS) for silicon samples, and ellipso-porosimetry and Fourier Transform InfraRed spectroscopy (FTIR) for SiOCH samples. Finally, the benefit of this new patterning approach is presented on 3D pattern structures.

  16. Biosynthesis and intracellular movement of the melanosomal membrane glycoprotein gp75, the human b (brown) locus product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vijayasaradhi, S.; Doskoch, P.M.; Houghton, A.N.

    1991-10-01

    A 75-kDa melanosomal glycoprotein (gp75) is the product of a gene that maps to the b (brown) locus, a genetic locus that determines coat color in the mouse. The b locus is conserved (88% identity) between mouse and human. The mouse monoclonal antibody TA99 was used to study the biosynthesis and processing of gp75. gp75 was synthesized as a 55-kDa polypeptide, glycosylated by addition and processing of five or more Asn-linked carbohydrate chains through the cis and trans Golgi, and transported to melanosomes as a mature 75-kDa form. Synthesis and processing of gp75 was rapid (T₁/₂ < 30 min), and early steps in processing were required for efficient export of gp75; mature gp75 was quite stable in the melanosome. Studies with inhibitors of steps in oligosaccharide processing showed that alternative forms of gp75 were generated during trimming reactions by mannosidase IA/IB and that further maturation resulted in the two mature forms of gp75. The authors propose that the kinetics of biosynthesis and processing reflect events in the biogenesis and maturation of melanosomes.

  17. Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.

    PubMed

    Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S

    2014-07-01

    This work focused on control of the manufacturing process for a controlled release (CR) pellet product within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: first layering active pharmaceutical ingredient (API) onto sugar pellet cores and then applying a controlled release (CR) coating. For each of these two steps, the development of a Process Analytical Technology (PAT) method is discussed, along with a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance, and the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness and dissolution performance. These correlations allow the coating process to be monitored at-line and so give better control of product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.
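
    Calibrations of this kind are commonly built with partial least squares regression; the paper does not disclose its exact chemometric model, so the following sketch, using synthetic stand-in spectra and scikit-learn's PLSRegression, is only an assumed illustration of linking NIR spectra to coating weight.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n_samples, n_wavelengths = 40, 200
    spectra = rng.normal(size=(n_samples, n_wavelengths))          # stand-in NIR spectra
    coat_weight = 3.0 * spectra[:, 50] + rng.normal(0.0, 0.1, n_samples)

    pls = PLSRegression(n_components=3)
    pls.fit(spectra, coat_weight)
    predicted = pls.predict(spectra).ravel()   # at-line prediction of coating weight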

  18. Use of scatterometry for resist process control

    NASA Astrophysics Data System (ADS)

    Bishop, Kenneth P.; Milner, Lisa-Michelle; Naqvi, S. Sohail H.; McNeil, John R.; Draper, B. L.

    1992-06-01

    The formation of resist lines having submicron critical dimensions (CDs) is a complex multistep process, requiring precise control of each processing step. Optimization of parameters for each processing step may be accomplished through theoretical modeling techniques and/or the use of send-ahead wafers followed by scanning electron microscope measurements. Once the optimum parameters for a process have been selected (e.g., duration and temperature of the post-exposure bake), no in-situ CD measurements are made. In this paper we describe the use of scatterometry to provide this essential metrology capability. It involves focusing a laser beam on a periodic grating and predicting the shape of the grating lines from a measurement of the scattered power in the diffraction orders. The inverse prediction of lineshape from a measurement of the scattered power is based on a vector diffraction analysis used in conjunction with photolithography simulation tools to provide an accurate scatter model for latent image gratings. This diffraction technique has previously been applied to observing latent image grating formation as exposure takes place. We have broadened the scope of the application and consider the problem of determining optimal focus.
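
    The measurement geometry follows from the grating equation sin θ_m = sin θ_i + mλ/d, which fixes the angles of the propagating diffraction orders whose powers are sampled; a small helper with illustrative values is shown below.

    import numpy as np

    def diffraction_orders(wavelength_nm, pitch_nm, theta_i_deg, max_order=5):
        """Angles (degrees) of the propagating diffraction orders of a grating."""
        sin_i = np.sin(np.radians(theta_i_deg))
        orders = {}
        for m in range(-max_order, max_order + 1):
            s = sin_i + m * wavelength_nm / pitch_nm
            if abs(s) <= 1.0:                       # order propagates
                orders[m] = float(np.degrees(np.arcsin(s)))
        return orders

    print(diffraction_orders(wavelength_nm=633, pitch_nm=1000, theta_i_deg=10))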

  19. Cold Test Operation of the German VEK Vitrification Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleisch, J.; Schwaab, E.; Weishaupt, M.

    2008-07-01

    In 2007 the German High-Level Liquid Waste (HLLW) Vitrification plant VEK (Verglasungseinrichtung Karlsruhe) passed a three-month integral cold test operation as the final step before entering the hot phase. The overall performance of the vitrification process equipment, with a liquid-fed ceramic glass melter as the main component, proved to be completely in line with the requirements of the regulatory body. The retention efficiency of the main radioactive-bearing elements across the melter and wet off-gas treatment system exceeded the design values distinctly. The strategy to produce a specified waste glass was successfully demonstrated. The results of the cold test operation allow entering the next step of hot commissioning, i.e. processing of approximately 2 m³ of diluted HLLW. In summary: an important step of the VEK vitrification plant towards hot operation has been the performance of the cold test operation from April to July 2007. This first integral operation was carried out under boundary conditions and rules established for radioactive operation. Operation and process control were carried out following the procedures documented in the licensed operational manuals. The function of the process technology and the safety of operation were demonstrated, and no severe problems were encountered. Based on the positive results of the cold test, application for the license for hot operation has been initiated and approval is expected in the near future. (authors)

  20. Thermodynamic analyses of hydrogen production from sub-quality natural gas. Part I: Pyrolysis and autothermal pyrolysis

    NASA Astrophysics Data System (ADS)

    Huang, Cunping; T-Raissi, Ali

    Sub-quality natural gas (SQNG) is defined as natural gas whose composition exceeds pipeline specifications for nitrogen, carbon dioxide (CO₂) and/or hydrogen sulfide (H₂S). Approximately one-third of the U.S. natural gas resource is sub-quality gas [1]. Due to the high cost of removing H₂S from hydrocarbons using current processing technologies, SQNG wells are often capped and the gas remains in the ground. We propose and analyze a two-step hydrogen production scheme using SQNG as feedstock. The first step of the process involves hydrocarbon processing (via steam-methane reformation, autothermal steam-methane reformation, pyrolysis and autothermal pyrolysis) in the presence of H₂S. Our analyses reveal that H₂S existing in SQNG is stable and can be considered an inert gas. No sulfur dioxide (SO₂) and/or sulfur trioxide (SO₃) is formed upon the introduction of oxygen to SQNG. In the second step, after the separation of hydrogen from the main stream, un-reacted H₂S is used to reform the remaining methane, generating more hydrogen and carbon disulfide (CS₂). Thermodynamic analyses of SQNG feedstock containing up to 10% (v/v) H₂S have shown that no H₂S separation is required in this process. Part I of this paper includes only the thermodynamic analyses for SQNG pyrolysis and autothermal pyrolysis.
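
    A quick consistency check on the pyrolysis step (not from the paper; standard 298 K data, with ΔH and ΔS assumed temperature independent) shows why elevated temperatures are needed: CH₄(g) → C(s) + 2 H₂(g) becomes spontaneous only above roughly 930 K.

    dH = 74.9e3                      # J/mol, reverse of the CH4 formation enthalpy
    dS = 5.7 + 2 * 130.7 - 186.3     # J/(mol K), standard entropies of C, H2, CH4
    T_onset = dH / dS                # temperature where dG = dH - T*dS crosses zero
    print(f"pyrolysis becomes spontaneous above ~{T_onset:.0f} K")  # ~930 K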
