Sample records for entire design process

  1. Cognitive Design for Learning: Cognition and Emotion in the Design Process

    ERIC Educational Resources Information Center

    Hasebrook, Joachim

    2016-01-01

We are so used to accepting new technologies as the driver of change and innovation in human-computer interfaces (HCI). In our research we focus on the development of innovations as a design process--or design, for short. We also refer to the entire process of creating innovations and putting them to use as "cognitive processes"--or…

  2. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  3. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

Introduction: Integrated care involves different professionals, belonging to different care provider organizations, and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view of the care process [1]. Purpose: To present a method for the development of usable and work-process-oriented information and communication technology (ICT) systems for integrated care. Theory and method: Based on Human-Computer Interaction Science, and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user-validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions: Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT-supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  4. 76 FR 56466 - Notice of Intent to Prepare an Environmental Document and Proposed Plan Amendment for the West...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-13

    ... Vehicle Access Element of the CDCA Plan for the WEMO area; and (2) Alternative processes for designating.... Identification of the process and decision criteria that should be used to designate routes in the sub-regional... analysis, and guide the entire process from plan decision-making to route designation review in order to...

  5. Distributed Group Design Process: Lessons Learned.

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ganesan, Radha

    A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…

  6. Students Matter: Quality Measurements in Online Courses

    ERIC Educational Resources Information Center

    Unal, Zafer; Unal, Aslihan

    2016-01-01

    Quality Matters (QM) is a peer review process designed to certify the quality of online courses and online components. It has generated widespread interest and received national recognition for its peer-based approach to quality assurance and continuous improvement in online education. While the entire QM online course design process is…

  7. Multi-Criteria Approach in Multifunctional Building Design Process

    NASA Astrophysics Data System (ADS)

    Gerigk, Mateusz

    2017-10-01

The paper presents a new approach to the multifunctional building design process. The publication defines problems related to the design of complex multifunctional buildings. Contemporary urban areas are characterized by very intensive use of space; buildings are being built bigger and contain more diverse functions to meet the needs of a large number of users in one capacity. These trends show the need to treat design objects as an organized structure that must meet current design criteria. The design process, viewed as a complex system, is a theoretical model that is the basis for optimizing solutions for the entire life cycle of the building. From the concept phase through the exploitation phase to the disposal phase, multipurpose spaces should guarantee aesthetics, functionality, system efficiency, system safety and environmental protection in the best possible way. The result of the analysis of the design process is presented as a theoretical model of the multifunctional structure. Recognition of the multi-criteria model in the form of a Cartesian product allows a holistic representation of the designed building to be created in the form of a graph model. The proposed network is the theoretical base that can be used in the design process of complex engineering systems. The systematic multi-criteria approach makes it possible to maintain control over the entire design process and to provide the best possible performance. With respect to current design requirements, there are no established design rules for multifunctional buildings in relation to their operating phase. Enriching the basic criteria with a functional-flexibility criterion makes it possible to extend the exploitation phase, which brings advantages on many levels.

  8. GREENSCOPE Technical User’s Guide

    EPA Pesticide Factsheets

    GREENSCOPE’s methodology has been developed and its software tool designed such that it can be applied to an entire process, to a piece of equipment or process unit, or at the investigatory bench scale.

  9. User-Centered Design of Online Learning Communities

    ERIC Educational Resources Information Center

    Lambropoulos, Niki, Ed.; Zaphiris, Panayiotis, Ed.

    2007-01-01

    User-centered design (UCD) is gaining popularity in both the educational and business sectors. This is due to the fact that UCD sheds light on the entire process of analyzing, planning, designing, developing, using, evaluating, and maintaining computer-based learning. "User-Centered Design of Online Learning Communities" explains how…

  10. PrimerDesign-M: A multiple-alignment based multiple-primer design tool for walking across variable genomes

    DOE PAGES

    Yoon, Hyejin; Leitner, Thomas

    2014-12-17

Analyses of entire viral genomes or mtDNA require comprehensive design of many primers across the genome. In addition, simultaneous optimization of several DNA primer design criteria may improve overall experimental efficiency and downstream bioinformatic processing. To achieve these goals, we developed PrimerDesign-M. It includes several options for multiple-primer design, allowing researchers to efficiently design walking primers that cover long DNA targets, such as entire HIV-1 genomes, and that are optimized simultaneously by the genetic diversity in multiple alignments and by experimental design constraints given by the user. PrimerDesign-M can also design primers that include DNA barcodes and minimize primer dimerization. PrimerDesign-M finds optimal primers for highly variable DNA targets and facilitates design flexibility by suggesting alternative designs to adapt to experimental conditions.

  11. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology.

  12. Wake County Public School System Design Guidelines.

    ERIC Educational Resources Information Center

    Wake County Public School System, Raleigh, NC.

The Wake County Public School System has published its guidelines for the planning and design of functional, cost-effective, and durable educational facilities that are attractive and enhance students' educational experience. The guidelines present basic planning requirements and design criteria for the entire construction process, including: codes…

  13. What are you trying to learn? Study designs and the appropriate analysis for your research question

    USDA-ARS?s Scientific Manuscript database

One fundamental necessity in the entire process of a well-performed study is the experimental design. A well-designed study can help researchers understand and have confidence in their results and analyses, as well as in the agreement or disagreement with the stated hypothesis. This well-designed...

  14. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  15. From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

    PubMed

    Scandurra, I; Hägglund, M; Koch, S

    2008-08-01

    This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.

  16. Problem Decomposition and Recomposition in Engineering Design: A Comparison of Design Behavior between Professional Engineers, Engineering Seniors, and Engineering Freshmen

    ERIC Educational Resources Information Center

    Song, Ting; Becker, Kurt; Gero, John; DeBerard, Scott; DeBerard, Oenardi; Reeve, Edward

    2016-01-01

    The authors investigated the differences in using problem decomposition and problem recomposition between dyads of engineering experts, engineering seniors, and engineering freshmen. Participants worked in dyads to complete an engineering design challenge within 1 hour. The entire design process was video and audio recorded. After the design…

  17. UOE Pipe Manufacturing Process Simulation: Equipment Designing and Construction

    NASA Astrophysics Data System (ADS)

    Delistoian, Dmitri; Chirchor, Mihael

    2017-12-01

The UOE pipe manufacturing process directly influences pipeline resilience and operating capacity. At present, the most widespread pipe manufacturing method is UOE. This method is based on cold forming. After each technological step, a certain stress and strain level appears. To study pipe stress and strain, special equipment was designed and constructed that simulates the entire technological process. The UOE pipe equipment is dedicated to manufacturing longitudinally submerged-arc-welded DN 400 (16 inch) steel pipe.

  18. Application of advanced multidisciplinary analysis and optimization methods to vehicle design synthesis

    NASA Technical Reports Server (NTRS)

    Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.

  19. Design and optimization of a fiber optic data link for new generation on-board SAR processing architectures

    NASA Astrophysics Data System (ADS)

    Ciminelli, Caterina; Dell'Olio, Francesco; Armenise, Mario N.; Iacomacci, Francesco; Pasquali, Franca; Formaro, Roberto

    2017-11-01

A fiber optic digital link for on-board data handling is modeled, designed and optimized in this paper. Design requirements and constraints relevant to the link, which is intended for novel on-board processing architectures, are discussed. Two possible link configurations are investigated, showing their advantages and disadvantages. An accurate mathematical model of each link component and of the entire system is reported, and results of link simulations based on those models are presented. Finally, some details on the optimized design are provided.

  20. An investigation into creative design methodologies for textiles and fashion

    NASA Astrophysics Data System (ADS)

    Gault, Alison

    2017-10-01

Understanding market intelligence, trends, influences and personal approaches is essential for design students developing their ideas in textiles and fashion. Identifying different personal approaches, including visual, process-led or concept-led, by employing creative methodologies is key to developing a brief. A series of ideas or themes starts to emerge and, through the design process, serves to underpin and inform an entire collection. These investigations ensure that the design collections are able to produce a diverse range of outcomes. Following key structures and coherent stages in the design process creates authentic collections in textiles and fashion. A range of undergraduate students presented their design portfolios (180), and the methodologies employed were mapped against success at module level, industry response and graduate employment.

  1. Designing and Proposing Your Research Project. Concise Guides to Conducting Behavioral, Health, and Social Science Research Series

    ERIC Educational Resources Information Center

    Urban, Jennifer Brown; van Eeden-Moorefield, Bradley Matheus

    2017-01-01

    Designing your own study and writing your research proposal takes time, often more so than conducting the study. This practical, accessible guide walks you through the entire process. You will learn to identify and narrow your research topic, develop your research question, design your study, and choose appropriate sampling and measurement…

  2. A CAD approach to magnetic bearing design

    NASA Technical Reports Server (NTRS)

    Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.

    1988-01-01

    A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.

  3. Interactive computer graphics system for structural sizing and analysis of aircraft structures

    NASA Technical Reports Server (NTRS)

    Bendavid, D.; Pipano, A.; Raibstein, A.; Somekh, E.

    1975-01-01

    A computerized system for preliminary sizing and analysis of aircraft wing and fuselage structures was described. The system is based upon repeated application of analytical program modules, which are interactively interfaced and sequence-controlled during the iterative design process with the aid of design-oriented graphics software modules. The entire process is initiated and controlled via low-cost interactive graphics terminals driven by a remote computer in a time-sharing mode.

  4. An Interactive Design Space Supporting Development of Vehicle Architecture Concept Models

    DTIC Science & Technology

    2011-01-01

Denver, Colorado, USA, IMECE2011-64510. ...early in the development cycle. Optimization taking place later in the cycle usually occurs at the detail design level, and tends to result in... architecture changes may be imposed, but such modifications are equivalent to a huge optimization cycle covering almost the entire design process, and

  5. 2010 Anthropometric Survey of U.S. Marine Corps Personnel: Methods and Summary Statistics

    DTIC Science & Technology

    2013-06-01

models for the ergonomic design of working environments. Today, the entire production chain for a piece of clothing, beginning with the design and... Corps 382 crewstations and workstations. Digital models are increasingly used in the design process for seated and standing workstations, as well... International Standards for Ergonomic Design: These dimensions are useful for comparing data sets between nations, and are measured according to

  6. Program Evolves from Basic CAD to Total Manufacturing Experience

    ERIC Educational Resources Information Center

    Cassola, Joel

    2011-01-01

    Close to a decade ago, John Hersey High School (JHHS) in Arlington Heights, Illinois, made a transition from a traditional classroom-based pre-engineering program. The new program is geared towards helping students understand the entire manufacturing process. Previously, a JHHS student would design a project in computer-aided design (CAD) software…

  7. Mathematical modeling of the whole expanded bed adsorption process to recover and purify chitosanases from the unclarified fermentation broth of Paenibacillus ehimensis.

    PubMed

    de Araújo Padilha, Carlos Eduardo; Fortunato Dantas, Paulo Victor; de Sousa, Francisco Canindé; de Santana Souza, Domingos Fabiano; de Oliveira, Jackson Araújo; de Macedo, Gorete Ribeiro; Dos Santos, Everaldo Silvino

    2016-12-15

In this study, a general rate model was applied to the entire process of expanded bed adsorption chromatography (EBAC) for a chitosanases purification protocol from unclarified fermentation broth produced by Paenibacillus ehimensis, using the anionic adsorbent Streamline® DEAE. For the experiments performed using the expanded bed, a homemade column (2.6 cm × 30.0 cm) was specially designed. The proposed model predicted the entire EBA process adequately, giving R² values higher than 0.85 and χ² as low as 0.351 for the elution step. Using the validated model, a 3³ factorial design was used to investigate other, non-tested conditions as input. It was observed that the superficial velocity during the loading and washing steps, as well as the settled bed height, has a strong positive effect on the F objective function used to evaluate the production of the purified chitosanases.
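A 3³ full factorial design like the one mentioned in the abstract simply enumerates every combination of three factors at three levels each, giving 27 runs. The sketch below illustrates this with hypothetical factor names and levels loosely inspired by the abstract; they are illustrative assumptions, not the study's actual settings.

```python
# Minimal sketch of a 3^3 full factorial design: three factors, three
# levels each -> 3**3 = 27 experimental runs. Factor names and level
# labels are hypothetical, not taken from the cited study.
from itertools import product

factors = {
    "superficial_velocity": ["low", "mid", "high"],
    "settled_bed_height": ["low", "mid", "high"],
    "elution_flow_rate": ["low", "mid", "high"],
}

# The Cartesian product of all factor levels yields the full run table.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 27
```

Each entry in `runs` is one experimental condition; a simulation or response model can then be evaluated at all 27 points to screen for strong effects.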

  8. Expanding lean thinking to the product and process design and development within the framework of sustainability

    NASA Astrophysics Data System (ADS)

    Sorli, M.; Sopelana, A.; Salgado, M.; Pelaez, G.; Ares, E.

    2012-04-01

Companies require tools to change towards a new way of developing and producing innovative products, manufactured with the economic, social and environmental impact along the product life cycle in mind. By translating Lean principles into Product Development (PD) from the design stage onward and along the entire product life cycle, the work aims to address both sustainability and environmental issues. The drivers of a sustainable culture within lean PD have been identified, and a baseline for future research on the development of appropriate tools and techniques has been provided. This research provides industry with a framework that balances environmental and sustainability factors with lean principles, to be considered and incorporated from the beginning of product design and development, covering the entire product life cycle.

  9. Playground Facilities and Equipment. ACSA School Management Digest, Series 1, Number 7. ERIC/CEM Research Analysis Series, Number 34.

    ERIC Educational Resources Information Center

    Coursen, David

    Modern educators and playground designers are increasingly recognizing that play is a part, perhaps the decisive part, of the entire learning process. Theories of playground equipment design, planning the playground, financial considerations, and equipment suggestions are featured in this review. Examples of playgrounds include innovative…

  10. Developing a workstation-based, real-time simulation for rapid handling qualities evaluations during design

    NASA Technical Reports Server (NTRS)

    Anderson, Frederick; Biezad, Daniel J.

    1994-01-01

This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the Aircraft SYNThesis (ACSYNT) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are then linked to flight simulation and flight-controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students, who view it as a significant advance over prior methods.

  11. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  12. Testing military grade magnetics (transformers, inductors and coils).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

Engineers and designers are constantly searching for test methods to qualify or 'prove in' new designs. In the high-reliability world of military parts, design tests, qualification tests, in-process tests and product characteristic tests become even more important. The use of in-process and function tests has been adopted as a way of demonstrating that parts will operate correctly and survive their 'use' environments. This paper discusses various types of tests to qualify magnetic components - the current-carrying capability of coils, a next-assembly 'as used' test, a corona test and an inductance-at-temperature test. Each of these tests addresses a different potential failure of a component. The entire process from design to implementation is described.

  13. Green Schools as High Performance Learning Facilities

    ERIC Educational Resources Information Center

    Gordon, Douglas E.

    2010-01-01

    In practice, a green school is the physical result of a consensus process of planning, design, and construction that takes into account a building's performance over its entire 50- to 60-year life cycle. The main focus of the process is to reinforce optimal learning, a goal very much in keeping with the parallel goals of resource efficiency and…

  14. Safeguards Approaches for Black Box Processes or Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz-Marcano, Helly; Gitau, Ernest TN; Hockert, John

    2013-09-25

The objective of this study is to determine whether a safeguards approach can be developed for "black box" processes or facilities. These are facilities where a State or operator may limit IAEA access to specific processes or portions of a facility; in other cases, the IAEA may be prohibited access to the entire facility. The determination of whether a black box process or facility is safeguardable depends upon the details of the process type, design, and layout; the specific limitations on inspector access; and the restrictions placed upon the design information that can be provided to the IAEA. This analysis identified the necessary conditions for safeguardability of black box processes and facilities.

  15. The Role of Water Chemistry in Marine Aquarium Design: A Model System for a General Chemistry Class

    ERIC Educational Resources Information Center

    Keaffaber, Jeffrey J.; Palma, Ramiro; Williams, Kathryn R.

    2008-01-01

    Water chemistry is central to aquarium design, and it provides many potential applications for discussion in undergraduate chemistry and engineering courses. Marine aquaria and their life support systems feature many chemical processes. A life support system consists of the entire recirculation system, as well as the habitat tank and all ancillary…

  16. Options in virtual 3D, optical-impression-based planning of dental implants.

    PubMed

    Reich, Sven; Kern, Thomas; Ritter, Lutz

    2014-01-01

If a 3D radiograph, which in today's dentistry often consists of a CBCT dataset, is available for computerized implant planning, the 3D planning should also consider functional prosthetic aspects. In a conventional workflow, the CBCT is done with a specially produced radiopaque prosthetic setup that makes the desired prosthetic situation visible during virtual implant planning. If an exclusively digital workflow is chosen, intraoral digital impressions are taken. On these digital models, the desired prosthetic suprastructures are designed. The entire datasets are virtually superimposed by a "registration" process on the corresponding structures (teeth) in the CBCT. Thus, both the osseous and prosthetic structures are visible in a single 3D application, making it possible to consider surgical and prosthetic aspects together. After the implant positions have been determined on the computer screen, a drilling template is designed digitally. According to this design (CAD), a template is printed or milled in a CAM process. This template is the first physically extant product in the entire workflow. The article discusses the options and limitations of this workflow.

  17. Improving aircraft conceptual design - A PHIGS interactive graphics interface for ACSYNT

    NASA Technical Reports Server (NTRS)

    Wampler, S. G.; Myklebust, A.; Jayaram, S.; Gelhausen, P.

    1988-01-01

A CAD interface has been created for the 'ACSYNT' aircraft conceptual design code that permits the execution and control of the design process via interactive graphics menus. This CAD interface was coded entirely with the new three-dimensional graphics standard, the Programmer's Hierarchical Interactive Graphics System (PHIGS). The CAD/ACSYNT system is designed for use by state-of-the-art high-speed imaging workstations. Attention is given to the approaches employed in modeling, data storage, and rendering.

  18. Lakes and ponds recreation management: a state-wide application of the visitor impact management process

    Treesearch

    Jerry J. Vaske; Rodney R. Zwick; Maureen P. Donnelly

    1992-01-01

    The Visitor Impact Management (VIM) process is designed to identify unacceptable changes occurring as a result of visitor use and to develop management strategies to keep visitor impacts within acceptable levels. All previous attempts to apply the VIM planning framework have concentrated on specific resources. This paper expands this focus to an entire state. Based on...

  19. Cognitive dimensions of talim: evaluating weaving notation through cognitive dimensions (CDs) framework.

    PubMed

    Kaur, Gagan Deep

    2017-05-01

    The design process in Kashmiri carpet weaving is distributed over a number of actors and artifacts and is mediated by a weaving notation called talim. The script encodes entire design in practice-specific symbols. This encoded script is decoded and interpreted via design-specific conventions by weavers to weave the design embedded in it. The cognitive properties of this notational system are described in the paper employing cognitive dimensions (CDs) framework of Green (People and computers, Cambridge University Press, Cambridge, 1989) and Blackwell et al. (Cognitive technology: instruments of mind-CT 2001, LNAI 2117, Springer, Berlin, 2001). After introduction to the practice, the design process is described in 'The design process' section which includes coding and decoding of talim. In 'Cognitive dimensions of talim' section, after briefly discussing CDs framework, the specific cognitive dimensions possessed by talim are described in detail.

  20. Response to Intervention and Continuous School Improvement: Using Data, Vision, and Leadership to Design, Implement, and Evaluate a Schoolwide Prevention System

    ERIC Educational Resources Information Center

    Bernhardt, Victoria L.; Hebert, Connie L.

    2011-01-01

    Ensure the success of your school and improve the learning of "all" students by implementing Response-to-Intervention (RTI) as part of a continuous school improvement (CSI) process. This book shows you how to get your entire staff working together to design, implement, and evaluate a schoolwide prevention system. With specific examples, CSI expert…

  1. Cyber-Informed Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert S.; Benjamin, Jacob; Wright, Virginia L.

A continuing challenge for engineers who utilize digital systems is to understand the impact of cyber-attacks across the entire product and program lifecycle. This is a challenge due to the evolving nature of cyber threats, which may impact the design, development, deployment, and operational phases of all systems. Cyber-Informed Engineering is the process by which engineers are made aware of how to use their engineering knowledge to positively impact cyber security, both in the processes by which they architect and design components and in the services and security of the components themselves.

  2. Design Evolution and Performance Characterization of the GTX Air-Breathing Launch Vehicle Inlet

    NASA Technical Reports Server (NTRS)

    DeBonis, J. R.; Steffen, C. J., Jr.; Rice, T.; Trefny, C. J.

    2002-01-01

The design and analysis of a second version of the inlet for the GTX rocket-based combined-cycle launch vehicle is discussed. The previous design did not achieve its predicted performance levels due to excessive turning of low-momentum corner flows and local over-contraction due to asymmetric end-walls. This design attempts to remove these problems by reducing the spike half-angle from 12 to 10 degrees and by implementing true plane-of-symmetry end-walls. Axisymmetric Reynolds-averaged Navier-Stokes simulations using both perfect-gas and real-gas, finite-rate chemistry assumptions were performed to aid in the design process and to create a comprehensive database of inlet performance. The inlet design, which operates over the entire air-breathing Mach number range from 0 to 12, and the performance database are presented. The performance database, for use in cycle analysis, includes predictions of mass capture, pressure recovery, throat Mach number, drag force, and heat load for the entire Mach range. Results of the computations are compared with experimental data to validate the performance database.

  3. An application of computer aided requirements analysis to a real time deep space system

    NASA Technical Reports Server (NTRS)

    Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.

    1981-01-01

The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.

  4. Interactive Videodisc Design and Production Workshop Guide.

    ERIC Educational Resources Information Center

    Campbell, J. Olin; And Others

    This "how to" workshop guide provides an overview of the entire videodisc authoring and production process through six individual modules. Focusing on project planning, the first module provides guidelines, procedures, and job aids to help each instructional development team member effectively use the videodisc medium. The second module…

  5. Thermal design of the space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Bachrtel, F. D.; Vaniman, J. L.; Stuckey, J. M.; Gray, C.; Widofsky, B.

    1985-01-01

The shuttle external tank thermal design presents many challenges in meeting the stringent requirements established by the structures, main propulsion systems, and Orbiter elements. The selected thermal protection design had to meet these requirements while offering ease of application and suitability for mass production, considering low weight, cost, and high reliability. This development led to a spray-on foam insulation (SOFI) which covers the entire tank. The need and design for a SOFI material with the dual role of cryogenic insulation and ablator, and the development of the SOFI-over-SLA concept for high-heating areas, are discussed. Further issues of minimum surface ice/frost, no debris, and the development of the TPS spray process considering the required quality and process control are examined.

  6. Enhancing the traditional hospital design process: a focus on patient safety.

    PubMed

    Reiling, John G; Knutzen, Barbara L; Wallen, Thomas K; McCullough, Susan; Miller, Ric; Chernos, Sonja

    2004-03-01

In 2002 St. Joseph's Community Hospital (West Bend, WI), a member of SynergyHealth, brought together leaders in health care and systems engineering to develop a set of safety-driven facility design principles that would guide the hospital design process. DESIGNING FOR SAFETY: Hospital leadership recognized that a cross-departmental team approach would be needed and formed the 11-member Facility Design Advisory Council, which, with departmental teams and the aid of architects, was responsible for overseeing the design process and for ensuring that the safety considerations were met. The design process was a team approach, with input from national experts, patients and families, hospital staff and physicians, architects, contractors, and the community. The new facility, designed using safety-driven design principles, reflects many innovative design elements, including truly standardized patient rooms, new technology to minimize falls, and patient care alcoves for every patient room. The new hospital has been designed with maximum adaptability and flexibility in mind, to accommodate changes and provide for future growth. The architects labeled the innovative design "The Synergy Model" to describe the process of shaping the entire building and its spaces to work efficiently as a whole for the care and safety of patients. Construction began on the new facility in August 2003 and is expected to be completed in 2005.

  7. Integrating the Complete Research Project into a Large Qualitative Methods Course

    ERIC Educational Resources Information Center

    Raddon, Mary-Beth; Nault, Caleb; Scott, Alexis

    2008-01-01

    Participatory exercises are standard practice in qualitative methods courses; less common are projects that engage students in the entire research process, from research design to write-up. Although the teaching literature provides several models of complete research projects, their feasibility, and appropriateness for large, compulsory,…

  8. Cosmopolitanism and the De-Colonial Option

    ERIC Educational Resources Information Center

    Mignolo, Walter

    2010-01-01

What are the differences between cosmopolitanism and globalization? Are they "natural" historical processes or are they designed for specific purposes? Was Kant's cosmopolitanism good for the entire population of the globe or did it respond to a particular Eurocentered view of what a cosmo-polis should be? The article argues that, while…

  9. A Directed Research Project Investigating Aggressive Behavior in Paradise Fish.

    ERIC Educational Resources Information Center

    Darling, Ruth A.

    2003-01-01

    Presents a laboratory experiment that examines the aggressive behavior of male paradise fish. Students design the experiment, collect data, and analyze and interpret the results. This activity is appropriate for biology, ecology, and animal behavior classes and allows students to be involved in the entire scientific process. (Author/NB)

  10. Approach to design space from retrospective quality data.

    PubMed

    Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon

    2016-01-01

Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies hold large amounts of recorded data about their processes. The aim was the establishment of the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production was carried out, using as experimental factors the results of the previous risk analysis and eight key parameters (quality specifications) that encompassed process and quality control data. Software Statgraphics 5.0 was applied, and data were processed to obtain eight DS as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data; however, the practicality of this study is very interesting, as it yields the DS with minimal investment in experiments, since actual production batch data are processed statistically.
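The retrospective-DS idea above, deriving a design space from historical batch records rather than new experiments, can be sketched as follows. This is a minimal illustration, not the study's procedure: the factor names, specification limit, and data are all hypothetical, and a plain least-squares fit stands in for the Statgraphics DoE analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical batch records: two process factors
# (compression force, moisture) and one quality response (hardness).
n = 200
force = rng.uniform(5.0, 15.0, n)        # kN
moisture = rng.uniform(1.0, 4.0, n)      # %
hardness = 40 + 2.0 * force - 3.0 * moisture + rng.normal(0, 1.0, n)

# Fit a first-order model: hardness ~ b0 + b1*force + b2*moisture.
X = np.column_stack([np.ones(n), force, moisture])
beta, *_ = np.linalg.lstsq(X, hardness, rcond=None)

# Scan a factor grid; the design space is the region whose predicted
# response meets the (illustrative) quality specification hardness >= 55.
f_grid, m_grid = np.meshgrid(np.linspace(5, 15, 50), np.linspace(1, 4, 50))
pred = beta[0] + beta[1] * f_grid + beta[2] * m_grid
design_space = pred >= 55.0
print(f"fraction of scanned region inside DS: {design_space.mean():.2f}")
```

The working and safe ranges would then be read off as the factor intervals lying inside (or comfortably inside) this region.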

  11. Computational Optimization of a Natural Laminar Flow Experimental Wing Glove

    NASA Technical Reports Server (NTRS)

    Hartshom, Fletcher

    2012-01-01

Computational optimization of a natural laminar flow experimental wing glove that is mounted on a business jet is presented and discussed. The process of designing a laminar flow wing glove starts with creating a two-dimensional optimized airfoil and then lofting it into a three-dimensional wing glove section. The airfoil design process does not consider three-dimensional flow effects such as cross flow due to wing sweep as well as engine and body interference. Therefore, once an initial glove geometry is created from the airfoil, the three-dimensional wing glove has to be optimized to ensure that the desired extent of laminar flow is maintained over the entire glove. TRANAIR, a non-linear full-potential solver with a coupled boundary layer code, was used as the main tool in the design and optimization process of the three-dimensional glove shape. The optimization process uses the Class-Shape-Transformation method to perturb the geometry, with geometric constraints that allow for a 2-in clearance from the main wing. The three-dimensional glove shape was optimized with the objective of having a spanwise uniform pressure distribution that matches the optimized two-dimensional pressure distribution as closely as possible. Results show that with the appropriate inputs, the optimizer is able to match the two-dimensional pressure distributions practically across the entire span of the wing glove. This allows the experiment a much higher probability of achieving a large extent of natural laminar flow in flight.
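The Class-Shape-Transformation (CST) parameterization mentioned above represents a surface as a class function multiplied by a Bernstein-polynomial shape function, so an optimizer can perturb the geometry by adjusting a few weights. A minimal 2D sketch (the weights and the perturbation are illustrative, not from the study):

```python
import numpy as np
from math import comb

def cst_surface(x, weights, n1=0.5, n2=1.0, dz_te=0.0):
    """CST: class function x^n1 * (1-x)^n2 scaled by a Bernstein-polynomial
    shape function with the given weights (n1=0.5, n2=1.0 gives a
    round-nose, sharp-trailing-edge airfoil class)."""
    x = np.asarray(x, dtype=float)
    n = len(weights) - 1
    shape = sum(w * comb(n, i) * x**i * (1 - x)**(n - i)
                for i, w in enumerate(weights))
    return x**n1 * (1 - x)**n2 * shape + x * dz_te

x = np.linspace(0.0, 1.0, 101)
base = cst_surface(x, [0.17, 0.16, 0.15])       # hypothetical upper surface
perturbed = cst_surface(x, [0.17, 0.18, 0.15])  # optimizer bumps one weight
print(f"max thickness change: {np.abs(perturbed - base).max():.4f}")
```

Because each weight multiplies one smooth Bernstein basis function, a small weight change produces a small, localized bump in the surface, which is what makes the parameterization convenient for gradient-based shape optimization.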

  12. Space Coatings for Industry

    NASA Technical Reports Server (NTRS)

    1980-01-01

Ball Aerospace developed entirely new space lubrication technologies. A new family of dry lubricants emerged from Apollo, specifically designed for long life in space, together with processes for applying them to spacecraft components in microscopically thin coatings. The lubricants worked successfully on seven Orbiting Solar Observatory flights over the span of a decade and attracted the attention of other contractors, which became Ball customers. The company has developed several hundred variations of the original OSO technology, generally designed to improve the quality and useful life of a wide range of products or to improve the efficiency of the industrial processes by which such products are manufactured.

  13. Advances in the production of freeform optical surfaces

    NASA Astrophysics Data System (ADS)

    Tohme, Yazid E.; Luniya, Suneet S.

    2007-05-01

Recent market demands for free-form optics have challenged the industry to find new methods and techniques to manufacture free-form optical surfaces with a high level of accuracy and reliability. Production techniques are becoming a mix of multi-axis single-point diamond machining centers or deterministic ultra-precision grinding centers coupled with capable measurement systems to accomplish the task. It has been determined that a complex software tool is required to seamlessly integrate all aspects of the manufacturing process chain. Advances in computational power and improved performance of computer-controlled precision machinery have driven the use of such software programs to measure, visualize, analyze, produce, and re-validate the 3D free-form design, thus making the process of manufacturing such complex surfaces a viable task. Consolidation of the entire production cycle in a comprehensive software tool that can interact with all systems in the design, production, and measurement phases will enable manufacturers to solve these complex challenges, providing improved product quality, simplified processes, and enhanced performance. The work being presented describes the latest advancements in developing such a software package for the entire fabrication process chain for aspheric and free-form shapes. It applies a rational B-spline-based kernel to transform an optical design in the form of a parametrical definition (optical equation), a standard CAD format, or a cloud of points into a central format that drives the simulation. This software tool creates a closed loop for the fabrication process chain. It integrates surface analysis and compensation, tool path generation, and measurement analysis in one package.

  14. Guidelines of the Design of Electropyrotechnic Firing Circuit for Unmanned Flight and Ground Test Projects

    NASA Technical Reports Server (NTRS)

    Gonzalez, Guillermo A.; Lucy, Melvin H.; Massie, Jeffrey J.

    2013-01-01

The NASA Langley Research Center, Engineering Directorate, Electronic System Branch, is responsible for providing pyrotechnic support capabilities to Langley Research Center unmanned flight and ground test projects. These capabilities include device selection, procurement, testing, and problem solving; firing system design, fabrication, and testing; ground support equipment design, fabrication, and testing; and checkout procedures and procedures training for pyrotechnicians. This technical memorandum will serve as a guideline for the design, fabrication, and testing of electropyrotechnic firing systems. The guidelines will discuss the entire process, beginning with requirements definition and ending with development and execution.

  15. 78 FR 32424 - Notice of Issuance of Final Determination Concerning Monochrome Laser Printers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... for manufacture in the U.S. and subsequent sale to U.S. government agencies. Ricoh states that it developed the SP52000-series printers in Japan, and that the entire engineering, development, design and..., Ltd. At the initial stage of the printers production process, individual parts are assembled into...

  16. 78 FR 32427 - Notice of Issuance of Final Determination Concerning Multifunctional Digital Imaging Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... manufacture different types of Controller units. Ricoh considers the manufacturing of the Controller unit... components and subassemblies of the MFPs from China and the Philippines for manufacture in the U.S. and..., and that the entire engineering, development, design and artwork processes for the MFPs took place in...

  17. Learning To Learn: A Guide to Becoming Information Literate.

    ERIC Educational Resources Information Center

    Riedling, Ann Marlow

    This guide is designed to help students from middle school through the beginning college level master the essential information literacy skills and become effective, efficient learners. It covers the entire process of the research experience from choosing a topic and learning how to explore it effectively, to using the library and its resources,…

  18. Online Student Orientation in Higher Education: A Developmental Study

    ERIC Educational Resources Information Center

    Cho, Moon-Heum

    2012-01-01

    Although orientation for online students is important to their success, little information about how to develop an online student orientation (OSO) has appeared in the literature; therefore, the purpose of this article was to describe the entire process of developing an OSO. This article describes the analysis, design, development, and evaluation…

  19. Reflections on Designing a MPA Service-Learning Component: Lessons Learned

    ERIC Educational Resources Information Center

    Roman, Alexandru V.

    2015-01-01

    This article provides the "lessons learned" from the experience of redesigning two sections (face-to-face and online) of a core master of public administration class as a service-learning course. The suggestions made here can be traced to the entire process of the project, from the "seed idea" through its conceptualization and…

  20. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  1. Accelerating Molecular Dynamic Simulation on Graphics Processing Units

    PubMed Central

    Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.

    2009-01-01

    We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337

  2. Automatic design of conformal cooling channels in injection molding tooling

    NASA Astrophysics Data System (ADS)

    Zhang, Yingming; Hou, Binkui; Wang, Qian; Li, Yang; Huang, Zhigao

    2018-02-01

The generation of the cooling system plays an important role in injection molding design. A conformal cooling system can effectively improve molding efficiency and product quality. This paper provides a generic approach for building conformal cooling channels. The centrelines of these channels are generated in two steps. First, conformal loops are extracted based on the geometric information of the product. Second, centrelines in a spiral shape are built by blending these loops. We devise algorithms to implement the entire design process. A case study verifies the feasibility of this approach.
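The loop-blending step can be illustrated with the simplest possible case: two concentric circular loops blended into one spiral centreline by interpolating the radius against the winding angle. In the paper the loops conform to the product geometry; circles are used here only to keep the sketch self-contained.

```python
import numpy as np

def spiral_centreline(inner_r, outer_r, turns, n=500):
    """Blend two loops (here: concentric circles of radius inner_r and
    outer_r) into a single spiral channel centreline by sweeping the
    radius linearly while the angle winds through the given turns."""
    t = np.linspace(0.0, 1.0, n)            # blend parameter, 0 -> 1
    theta = 2 * np.pi * turns * t           # total winding angle
    r = inner_r + (outer_r - inner_r) * t   # radius sweeps between loops
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

# Hypothetical channel: spiral from a 10 mm loop out to a 30 mm loop.
path = spiral_centreline(10.0, 30.0, turns=4)
```

A real implementation would blend between arbitrary conformal loops (closed polylines offset from the cavity surface) rather than circles, but the radius-vs-angle interpolation idea is the same.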

  3. The Airborne Visible / Infrared Imaging Spectrometer AVIS: Design, Characterization and Calibration.

    PubMed

    Oppelt, Natascha; Mauser, Wolfram

    2007-09-14

The Airborne Visible / Infrared Imaging Spectrometer AVIS is a hyperspectral imager designed for environmental monitoring purposes. The sensor, which was constructed entirely from commercially available components, has been successfully deployed during several experiments between 1999 and 2007. We describe the instrument design and present the results of laboratory characterization and calibration of the system's second generation, AVIS-2, which is currently being operated. The processing of the data is described and examples of remote sensing reflectance data are presented.

  4. LMI design method for networked-based PID control

    NASA Astrophysics Data System (ADS)

    Souza, Fernando de Oliveira; Mozelli, Leonardo Amaral; de Oliveira, Maurício Carvalho; Palhares, Reinaldo Martinez

    2016-10-01

In this paper, we propose a methodology for the design of networked PID controllers for second-order delayed processes using linear matrix inequalities. The proposed procedure takes into account time-varying delay on the plant, time-varying delays induced by the network, and packet dropouts. The design is carried out entirely using a continuous-time model of the closed-loop system, where time-varying delays are used to represent the sampling and holding occurring in a discrete-time digital PID controller.
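The feasibility certificate underlying LMI-based design is a Lyapunov matrix inequality AᵀP + PA < 0 with P > 0. The sketch below solves the equality analogue for a hypothetical stable closed-loop matrix; the paper's actual LMIs additionally encode the time-varying delays and packet dropouts and are solved with a semidefinite-programming solver.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable closed-loop state matrix (eigenvalues -1, -3).
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
Q = np.eye(2)

# Solve the Lyapunov equation A'P + PA = -Q.  A positive-definite P
# certifies stability: it is the equality analogue of the LMI
# feasibility condition A'P + PA < 0, P > 0.
P = solve_continuous_lyapunov(A.T, -Q)
print("P positive definite:", bool(np.all(np.linalg.eigvalsh(P) > 0)))
```

In a synthesis setting the controller gains appear inside A, so P and the gains are found jointly by an LMI solver rather than by this direct equation solve.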

  5. Analysis of design characteristics of a V-type support using an advanced engineering environment

    NASA Astrophysics Data System (ADS)

    Gwiazda, A.; Banaś, W.; Sękala, A.; Cwikla, G.; Topolska, S.; Foit, K.; Monica, Z.

    2017-08-01

A modern mining support, for the entire period of its use, is an important part of the mining complex, which includes all the devices in the excavation during its normal use. Therefore, during the design of the support, it is an important task to choose the shape and select the dimensions of the support as well as its strength characteristics. According to the rules, the design process of a support must take into account, inter alia, the type and dimensions of the expected means of transport, the number and size of pipelines, and the type of additional equipment used in the excavation area. The support design must ensure the functionality of the excavation process and job safety, while maintaining the economic viability of the entire project. Among other things, it should ensure the selection of a support for specific natural conditions; it is also important to take the economic characteristics of the project into consideration. The article presents an algorithm of an integrative approach and its formalized description, integrating the optimization of different construction characteristics of a V-type mining support. The paper includes an example of its application for developing the construction of this support, and also describes the results of the characteristics analysis and the changes that were introduced afterwards. The support models were prepared in a CAD-class computer environment (Siemens NX PLM), and the analyses were conducted in the same graphical design environment.

  6. Gaining the Competitive Edge: Design for Manufacturing

    NASA Technical Reports Server (NTRS)

    Batill, Stephen M.; Pinkelman, Jim; Sellar, Richard

    1993-01-01

The successful design of a commercial aircraft which is intended to be in direct competition with existing aircraft requires a market analysis to establish design requirements, the development of a concept to achieve those goals, and the ability to economically manufacture the aircraft. It is often the case that an engineer designs system components only from the perspective of a particular discipline. The relationship of that component to the entire system is often a minor consideration. In an effort to highlight the interaction that is necessary during the design process, the students were organized into design/build teams and required to integrate aspects of market analysis, engineering design, production, and economics into their concepts. In order to facilitate this process a hypothetical "Aeroworld" was established. Having been furnished relevant demographic and economic data for "Aeroworld", students were given the task of designing and building an aircraft for a specific market while achieving an economically competitive design. Involvement of the team in the evolution of the design from market definition to technical development to manufacturing allowed the students to identify critical issues in the design process and to encounter many of the conflicting requirements which arise in aerospace systems design.

  7. A scalable neural chip with synaptic electronics using CMOS integrated memristors.

    PubMed

    Cruz-Albrecht, Jose M; Derosier, Timothy; Srinivasa, Narayan

    2013-09-27

    The design and simulation of a scalable neural chip with synaptic electronics using nanoscale memristors fully integrated with complementary metal-oxide-semiconductor (CMOS) is presented. The circuit consists of integrate-and-fire neurons and synapses with spike-timing dependent plasticity (STDP). The synaptic conductance values can be stored in memristors with eight levels, and the topology of connections between neurons is reconfigurable. The circuit has been designed using a 90 nm CMOS process with via connections to on-chip post-processed memristor arrays. The design has about 16 million CMOS transistors and 73 728 integrated memristors. We provide circuit level simulations of the entire chip performing neuronal and synaptic computations that result in biologically realistic functional behavior.

  8. Knowledge Utilization Strategies in the Design and Implementation of New Schools--Symbolic Functions.

    ERIC Educational Resources Information Center

    Sieber, Sam D.

    An examination of case studies suggests that rational processes were not entirely at work in the planning and conception of new, innovative schools. The rational model that serves as the foundation of our information systems assumes that a compelling professional need triggers a search for solutions; and, therefore, school personnel are eager to…

  9. Patient Activities Planning and Progress Noting a Humanistic Integrated-Team Approach.

    ERIC Educational Resources Information Center

    Muilenburg, Ted

    This document outlines a system for planning recreation therapy, documenting progress, and relating the entire process to a team approach which includes patient assessment and involvement. The recreation program is seen as therapeutic, closely related to the total medical treatment program. The model is designed so that it can be adapted to almost…

  10. Indicator methods to evaluate the hygienic performance of industrial scale operating Biowaste Composting Plants.

    PubMed

    Martens, Jürgen

    2005-01-01

The hygienic performance of biowaste composting plants in ensuring the quality of compost is of high importance. Existing compost quality assurance systems reflect this importance through intensive testing of hygienic parameters. In many countries, compost quality assurance systems are under development, and it is necessary to check and optimize the methods used to assess the hygienic performance of composting plants. A set of indicator methods to evaluate the hygienic performance of normally operating biowaste composting plants was developed. The indicator methods were developed by investigating temperature measurements from indirect process tests at 23 composting plants belonging to 11 design types of the Hygiene Design Type Testing System of the German Compost Quality Association (BGK e.V.). The presented indicator methods are the grade of hygienization, the basic curve shape, and the hygienic risk area. The temperature courses of single plants are not normally distributed, but they were grouped by cluster analysis into normally distributed subgroups. This was a precondition for developing the mentioned indicator methods. For each plant, the grade of hygienization was calculated through transformation into the standard normal distribution. It gives the percentage of the entire data set that meets the legal temperature requirements. The hygienization grade differs widely within the design types and falls below 50% for about one fourth of the plants. The subgroups are divided visually into basic curve shapes which stand for different process courses. For each plant, the composition of the entire data set from the various basic curve shapes can be used as an indicator of the basic process conditions. Some basic curve shapes indicate abnormal process courses which can be corrected through process optimization. A hygienic risk area concept using the 90% range of variation of the normal temperature courses was introduced. Comparing the design-type range of variation with the legal temperature defaults revealed hygienic risk areas over the temperature courses which could be minimized through process optimization. The hygienic risk areas of four design types show a suboptimal hygienic performance.
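The grade-of-hygienization computation, the share of a plant's (approximately normal) temperature data meeting a legal minimum, can be sketched as follows; the temperature data and the 55 °C threshold are illustrative, not taken from the study.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical process temperatures (deg C) from one composting-plant
# subgroup, assumed approximately normally distributed after clustering.
rng = np.random.default_rng(1)
temps = rng.normal(loc=58.0, scale=4.0, size=500)

# Grade of hygienization: share of the fitted temperature distribution
# meeting a legal minimum (55 deg C used here as an illustrative value).
mu, sigma = float(temps.mean()), float(temps.std(ddof=1))
grade = 1.0 - NormalDist(mu, sigma).cdf(55.0)
print(f"grade of hygienization: {grade:.1%}")
```

Transforming each subgroup into a standard normal distribution, as the study does, is what justifies reading this percentage directly off the fitted distribution rather than counting raw measurements.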

  11. Building emotional resilience over 14 sessions of emotion focused therapy: Micro-longitudinal analyses of productive emotional patterns.

    PubMed

    Pascual-Leone, A; Yeryomenko, N; Sawashima, T; Warwar, S

    2017-05-04

Pascual-Leone and Greenberg's sequential model of emotional processing has been used to explore process in over 24 studies. This line of research shows that emotional processing in good psychotherapy often follows a sequential order, supporting a saw-toothed pattern of change within individual sessions (progressing "2-steps-forward, 1-step-back"). However, one cannot assume that local in-session patterns are scalable across an entire course of therapy. Thus, the primary objective of this exploratory study was to consider how the sequential patterns identified by Pascual-Leone may apply across entire courses of treatment. Intensive emotion codings from two separate single-case designs were submitted to quantitative analyses of longitudinal patterns. Comprehensive coding in these cases involved recording observations for every emotional event in an entire course of treatment (using the Classification of Affective-Meaning States), which were then treated as a 9-point ordinal scale. Applying multilevel modeling to each of the two cases showed significant patterns of change over a large number of sessions, and those patterns were either nested at the within-session level or observed at the broader session-by-session level of change. Examining successful treatment cases showed several theoretically coherent kinds of temporal patterns, although not always in the same case. Clinical or methodological significance of this article: This is the first paper to demonstrate systematic temporal patterns of emotion over the course of an entire treatment. (1) The study offers a proof of concept that longitudinal patterns in the micro-processes of emotion can be objectively derived and quantified. (2) It also shows that patterns in emotion may be identified on the within-session level, as well as at the session-by-session level of analysis. (3) Finally, the observed processes over time support the ordered pattern of emotional states hypothesized in Pascual-Leone and Greenberg's (2007) model of emotional processing.

  12. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire-workflow management in a developing country. Methods: The entire workflow QA process management starts from patient registration to the end of the last treatment, including all steps through the entire radiotherapy process. The error rate from chart check is used to evaluate the entire workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6000 patients before and after implementing the entire workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), documentation of treatment QA, and QA of the treatment history. The error rate derived from the chart check decreased from 1.7% to 0.9% after implementing the entire workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent those errors. Conclusion: The entire workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience can be applicable to heavily loaded radiotherapy departments in developing countries.

  13. Reinvisioning and redesigning “a library for the fifteenth through twenty-first centuries”: a case study on loss of space from the Library and Center for Knowledge Management, University of California, San Francisco*

    PubMed Central

    Persily, Gail L.; Butter, Karen A.

    2010-01-01

    The University of California, San Francisco, is an academic health sciences campus that is part of a state public university system. Space is very limited at this urban campus, and the library building's 90,000 square feet represent extremely valuable real estate. A planning process spanning several years initially proposed creating new teaching space utilizing 10,000 square feet of the library. A collaborative campus-wide planning process eventually resulted in the design of a new teaching and learning center that integrates clinical skills, simulation, and technology-enhanced education facilities on one entire floor of the building (21,000 square feet). The planning process resulted in a project that serves the entire campus and strengthens the library's role in the education mission. The full impact of the project is not yet known, as construction is not complete. PMID:20098654

  14. Parameter estimating state reconstruction

    NASA Technical Reports Server (NTRS)

    George, E. B.

    1976-01-01

    Parameter estimation is considered for systems whose entire state cannot be measured. Linear observers are designed to recover the unmeasured states to a sufficient accuracy to permit the estimation process. There are three distinct dynamics that must be accommodated in the system design: the dynamics of the plant, the dynamics of the observer, and the system updating of the parameter estimation. The latter two are designed to minimize interaction of the involved systems. These techniques are extended to weakly nonlinear systems. The application to a simulation of a space shuttle POGO system test is of particular interest. A nonlinear simulation of the system is developed, observers designed, and the parameters estimated.
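
    The observer scheme described above can be sketched numerically. Below is a minimal, hypothetical example of a Luenberger observer recovering an unmeasured state of a simple second-order plant; the plant matrices and gains are illustrative choices, not those of the POGO system in the report.

```python
import numpy as np

# Hypothetical second-order plant: position x1 is measured, velocity x2 is not.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
C = np.array([[1.0, 0.0]])

# Observer gain chosen so the observer error dynamics (A - L C) are much
# faster than the plant, minimizing interaction between the two systems.
L = np.array([[8.0], [20.0]])

def simulate(steps=2000, dt=0.005):
    x = np.array([[1.0], [0.0]])   # true state (unknown to the observer)
    xh = np.zeros((2, 1))          # observer's estimate
    for _ in range(steps):
        y = C @ x                                   # measurement
        x = x + dt * (A @ x)                        # plant, forward Euler
        xh = xh + dt * (A @ xh + L @ (y - C @ xh))  # Luenberger observer
    return x, xh

x, xh = simulate()
err = float(np.linalg.norm(x - xh))  # estimate tracks the full state
```

    Once the estimate has converged, the recovered states can feed a parameter-estimation loop whose update rate is kept slow relative to the observer dynamics, in the spirit of the design separation the abstract describes.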

  15. Failure modes and effects analysis automation

    NASA Technical Reports Server (NTRS)

    Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron

    1988-01-01

    A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man years the time required to perform the analysis.

  16. A total design and implementation of an intelligent mobile chemotherapy medication administration.

    PubMed

    Kuo, Ming-Chuan; Chang, Polun

    2014-01-01

    Chemotherapy medication administration is a process involving many stakeholders and considerable effort. The information support system therefore cannot be well designed unless the entire process is first carefully examined and reengineered. We, at an 805-bed teaching medical center, carried out a process reengineering and brought physicians, pharmacists, and IT engineers together to design a mobile support solution. The system was implemented from March to July 2013. A 6" Android handheld device with a 1D barcode reader (BCR) was used as the main hardware. 18 nurses were invited to evaluate their perceived acceptance of the system based on the Technology Acceptance Model for Mobile Services. Time saved was also calculated to measure the effectiveness of the system. The results showed positive support from the nurses. The estimated time saved every year was about 288 nursing days. We believe our mobile chemotherapy medication administration support system is successful in terms of acceptance and real impact.

  17. The ARCO 1 megawatt Photovoltaic Power Plant

    NASA Astrophysics Data System (ADS)

    Rhodes, G. W.; Reilly, M. R.

    The world's largest photovoltaic power plant is in operation and meeting performance specifications on the Southern California Edison (SCE) grid near Hesperia, California. The 1 MW plant, designed and constructed by The BDM Corporation for ARCO Solar Inc., occupies a 20 acre site adjacent to the SCE Lugo substation. The entire design and construction process took 7 1/2 months and was not only on schedule but also under budget. Because of its vast photovoltaic experience, BDM was chosen over several engineering firms to perform this complex job. We were provided a conceptual design from ARCO, which we quickly refined before immediately initiating construction.

  18. [Design of medical devices management system supporting full life-cycle process management].

    PubMed

    Su, Peng; Zhong, Jianping

    2014-03-01

    Based on an analysis of the present status of medical device management, this paper optimized the management process and developed a medical device management system using Web technologies. The system applies information technology to dynamically track the state of use across the entire life-cycle of medical devices. Through closed-loop management with pre-event budgeting, mid-event control, and after-event analysis, it improves the refined management of medical devices, optimizes asset allocation, and promotes the effective operation of devices.

  19. Review of the Application of Green Building and Energy Saving Technology

    NASA Astrophysics Data System (ADS)

    Tong, Zhineng

    2017-12-01

    The use of energy-saving technologies in green buildings should run through the entire process of building design, construction, and use, enabling green energy-saving technologies to maximize their effectiveness in construction. This realizes the sustainable development of green building, reduces energy consumption, lessens people's interference with the natural environment, and makes “green” buildings suitable for people to live in.

  20. Data Relationships: Towards a Conceptual Model of Scientific Data Catalogs

    NASA Astrophysics Data System (ADS)

    Hourcle, J. A.

    2008-12-01

    As the amount of data, types of processing, and storage formats increase, the total number of record permutations increases dramatically. The result is an overwhelming number of records that makes identifying the best data object to answer a user's needs more difficult. The issue is further complicated as each archive's data catalog may be designed around different concepts: anything from individual files to be served, series of similarly generated and processed data, or something entirely different. Catalogs may not only be flat tables, but may be structured as multiple tables with each table being a different data series, or as a normalized structure of the individual data files. Merging federated search results from archives with different catalog designs can create situations where the data object of interest is difficult to find amid an overwhelming number of seemingly similar or entirely unwanted records. We present a reference model for discussing data catalogs and the complex relationships between similar data objects. We show how the model can be used to improve scientists' ability to quickly identify the best data object for their purposes and discuss the technical issues involved in using this model in a federated system.

  1. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1996-01-01

    The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.

  2. Phenomena induced by charged particle beams. [experimental design for Spacelab

    NASA Technical Reports Server (NTRS)

    Beghin, C.

    1981-01-01

    The injection of energetic particles along the Earth's magnetic field lines is a possible remote sensing method for measuring the electric fields parallel to the magnetic field with good time resolution over the entire magnetic field. Neutralization processes, return-current effects, dynamics of the beams, triggered instabilities, and waves must be investigated before the fundamental question about proper experimental conditions, such as energy, intensity and divergence of the beams, pitch-angle injection, ion species, proper probes and detectors and their location, and rendezvous conditions, can be resolved. An experiment designed to provide a better understanding of these special physical processes and to provide some answers to questions concerning beam injection techniques is described.

  3. Digital analyzer for point processes based on first-in-first-out memories

    NASA Astrophysics Data System (ADS)

    Basano, Lorenzo; Ottonello, Pasquale; Schiavi, Enore

    1992-06-01

    We present an entirely new version of a multipurpose instrument designed for the statistical analysis of point processes, especially those characterized by high bunching. A long sequence of pulses can be recorded in the RAM bank of a personal computer via a suitably designed front end which employs a pair of first-in-first-out (FIFO) memories; these allow one to build an analyzer that, besides being simpler from the electronic point of view, is capable of sustaining much higher intensity fluctuations of the point process. The overflow risk of the device is evaluated by treating the FIFO pair as a queueing system. The apparatus was tested using both a deterministic signal and a sequence of photoelectrons obtained from laser light scattered by random surfaces.
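
    The overflow analysis mentioned above can be illustrated with a toy queueing simulation. The sketch below (illustrative parameters, not the instrument's actual FIFO depth or rates) feeds a bunched point process into a finite FIFO drained at one word per clock slot and estimates how often it overflows.

```python
import random

def overflow_fraction(depth=16, mean_rate=0.8, burst=8, n_slots=100_000, seed=1):
    """Fraction of slots in which a finite FIFO overflows when fed a
    highly bunched arrival process (bursts of `burst` pulses) and read
    out at one word per slot."""
    random.seed(seed)
    occupancy, overflows = 0, 0
    for _ in range(n_slots):
        # Bunching: a whole burst arrives with small probability, so the
        # long-run arrival rate stays at `mean_rate` words per slot.
        if random.random() < mean_rate / burst:
            occupancy += burst
        if occupancy > depth:
            overflows += 1
            occupancy = depth          # excess pulses are lost
        if occupancy > 0:
            occupancy -= 1             # FIFO read-out
    return overflows / n_slots

shallow = overflow_fraction(depth=16)
deep = overflow_fraction(depth=512)    # a deeper FIFO overflows no more often
```

    Because the same arrival sequence is replayed for both depths, the comparison isolates the effect of FIFO size, mirroring how one could size the FIFO pair against a target overflow risk.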

  4. A synthetic design environment for ship design

    NASA Technical Reports Server (NTRS)

    Chipman, Richard R.

    1995-01-01

    Rapid advances in computer science and information system technology have made possible the creation of synthetic design environments (SDE) which use virtual prototypes to increase the efficiency and agility of the design process. This next generation of computer-based design tools will rely heavily on simulation and advanced visualization techniques to enable integrated product and process teams to concurrently conceptualize, design, and test a product and its fabrication processes. This paper summarizes a successful demonstration of the feasibility of using a simulation based design environment in the shipbuilding industry. As computer science and information science technologies have evolved, there have been many attempts to apply and integrate the new capabilities into systems for the improvement of the process of design. We see the benefits of those efforts in the abundance of highly reliable, technologically complex products and services in the modern marketplace. Furthermore, the computer-based technologies have been so cost effective that the improvements embodied in modern products have been accompanied by lowered costs. Today the state-of-the-art in computerized design has advanced so dramatically that the focus is no longer on merely improving design methodology; rather the goal is to revolutionize the entire process by which complex products are conceived, designed, fabricated, tested, deployed, operated, maintained, refurbished and eventually decommissioned. By concurrently addressing all life-cycle issues, the basic decision making process within an enterprise will be improved dramatically, leading to new levels of quality, innovation, efficiency, and customer responsiveness. By integrating functions and people with an enterprise, such systems will change the fundamental way American industries are organized, creating companies that are more competitive, creative, and productive.

  5. Aerodynamic optimization of supersonic compressor cascade using differential evolution on GPU

    NASA Astrophysics Data System (ADS)

    Aissa, Mohamed Hasanine; Verstraete, Tom; Vuik, Cornelis

    2016-06-01

    Differential Evolution (DE) is a powerful stochastic optimization method. Compared to gradient-based algorithms, DE is able to avoid local minima but requires more function evaluations. In turbomachinery applications, function evaluations are performed with time-consuming CFD simulations, which results in a long, unaffordable design cycle. Modern High Performance Computing systems, especially Graphics Processing Units (GPUs), are able to alleviate this inconvenience by accelerating the design evaluation itself. In this work we present a validated CFD solver running on GPUs, able to accelerate the design evaluation and thus the entire design process. An achieved speedup of 20x to 30x enabled the DE algorithm to run on a high-end computer instead of a costly large cluster. The GPU-enhanced DE was used to optimize the aerodynamics of a supersonic compressor cascade, achieving an aerodynamic loss reduction of 20%.

  6. Aerodynamic optimization of supersonic compressor cascade using differential evolution on GPU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aissa, Mohamed Hasanine; Verstraete, Tom; Vuik, Cornelis

    Differential Evolution (DE) is a powerful stochastic optimization method. Compared to gradient-based algorithms, DE is able to avoid local minima but requires more function evaluations. In turbomachinery applications, function evaluations are performed with time-consuming CFD simulations, which results in a long, unaffordable design cycle. Modern High Performance Computing systems, especially Graphics Processing Units (GPUs), are able to alleviate this inconvenience by accelerating the design evaluation itself. In this work we present a validated CFD solver running on GPUs, able to accelerate the design evaluation and thus the entire design process. An achieved speedup of 20x to 30x enabled the DE algorithm to run on a high-end computer instead of a costly large cluster. The GPU-enhanced DE was used to optimize the aerodynamics of a supersonic compressor cascade, achieving an aerodynamic loss reduction of 20%.
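
    The DE loop summarized in both records can be sketched in a few lines. Below is a minimal DE/rand/1/bin implementation (a common variant; the paper does not specify its exact scheme), with the expensive CFD evaluation replaced by a cheap analytic test function.

```python
import numpy as np

def differential_evolution(f, bounds, pop=20, F=0.7, CR=0.9, gens=200, seed=0):
    """DE/rand/1/bin sketch: in the turbomachinery setting each call to f
    would be a GPU-accelerated CFD evaluation; here f is analytic."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    cost = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            # Mutation: combine three distinct individuals other than i.
            a, b, c = X[rng.choice([j for j in range(pop) if j != i], 3,
                                   replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, keeping at least one gene from the mutant.
            cross = rng.random(len(lo)) < CR
            cross[rng.integers(len(lo))] = True
            trial = np.where(cross, mutant, X[i])
            # Greedy selection never discards the best design found so far.
            fc = f(trial)
            if fc <= cost[i]:
                X[i], cost[i] = trial, fc
    best = int(cost.argmin())
    return X[best], float(cost[best])

# Usage: minimize a 4-D sphere function as a stand-in for aerodynamic loss.
x_best, f_best = differential_evolution(lambda x: float(np.sum(x**2)),
                                        [(-5, 5)] * 4)
```

    Because every candidate evaluation is independent, the inner loop is what the GPU solver accelerates in the papers above.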

  7. Process engineering concerns in the lunar environment

    NASA Technical Reports Server (NTRS)

    Sullivan, T. A.

    1990-01-01

    The paper discusses the constraints on a production process imposed by the lunar or Martian environment on the space transportation system. A proposed chemical route to produce oxygen from iron oxide bearing minerals (including ilmenite) is presented in three different configurations which vary in complexity. A design for thermal energy storage is presented that could both provide power during the lunar night and act as a blast protection barrier for the outpost. A process to release carbon from the lunar regolith as methane is proposed, capitalizing on the greater abundance and favorable physical properties of methane relative to hydrogen to benefit the entire system.

  8. The GAMCIT gamma ray burst detector

    NASA Technical Reports Server (NTRS)

    Mccall, Benjamin J.; Grunsfeld, John M.; Sobajic, Srdjan D.; Chang, Chinley Leonard; Krum, David M.; Ratner, Albert; Trittschuh, Jennifer E.

    1993-01-01

    The GAMCIT payload is a Get-Away-Special payload designed to search for high-energy gamma-ray bursts and any associated optical transients. This paper presents details on the design of the GAMCIT payload, in the areas of battery selection, power processing, electronics design, gamma-ray detection systems, and the optical imaging of the transients. The paper discusses the progress of the construction, testing, and specific design details of the payload. In addition, this paper discusses the unique challenges involved in bringing this payload to completion, as the project has been designed, constructed, and managed entirely by undergraduate students. Our experience will certainly be valuable to other student groups interested in taking on a challenging project such as a Get-Away-Special payload.

  9. Website Redesign: A Case Study.

    PubMed

    Wu, Jin; Brown, Janis F

    2016-01-01

    A library website redesign is a complicated and at times arduous task, requiring many different steps including determining user needs, analyzing past user behavior, examining other websites, defining design preferences, testing, marketing, and launching the site. Many different types of expertise are required over the entire process. Lessons learned from the Norris Medical Library's experience with the redesign effort may be useful to others undertaking a similar project.

  10. Planting seedlings in tree islands versus plantations as a large-scale tropical forest restoration strategy

    Treesearch

    K. D. Holl; R. A. Zahawi; R. J. Cole; R. Ostertag; S. Cordell

    2010-01-01

    Planting tree seedlings in small patches (islands) has been proposed as a method to facilitate forest recovery that is less expensive than planting large areas and better simulates the nucleation process of recovery. We planted seedlings of four tree species at 12 formerly agricultural sites in southern Costa Rica in two designs: plantation (entire 50 × 50 m area...

  11. Object-oriented design and programming in medical decision support.

    PubMed

    Heathfield, H; Armstrong, J; Kirkham, N

    1991-12-01

    The concept of object-oriented design and programming has recently received a great deal of attention from the software engineering community. This paper highlights the realisable benefits of using the object-oriented approach in the design and development of clinical decision support systems. These systems seek to build a computational model of some problem domain and therefore tend to be exploratory in nature. Conventional procedural design techniques do not support either the process of model building or rapid prototyping. The central concepts of the object-oriented paradigm are introduced, namely encapsulation, inheritance and polymorphism, and their use illustrated in a case study, taken from the domain of breast histopathology. In particular, the dual roles of inheritance in object-oriented programming are examined, i.e., inheritance as a conceptual modelling tool and inheritance as a code reuse mechanism. It is argued that the use of the former is not entirely intuitive and may be difficult to incorporate into the design process. However, inheritance as a means of optimising code reuse offers substantial technical benefits.
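
    The paradigm's central concepts can be made concrete with a small example. The sketch below uses hypothetical class names loosely inspired by the histopathology domain (they are not the paper's actual model) to show encapsulation, inheritance as a conceptual modelling tool, and polymorphism.

```python
class Lesion:
    """Base class: inheritance used as conceptual modelling,
    mirroring an is-a taxonomy in the problem domain."""
    def __init__(self, site):
        self.site = site              # encapsulated state

    def describe(self):
        return f"{type(self).__name__} at {self.site}"

class BenignLesion(Lesion):
    pass                              # inherits behaviour unchanged (code reuse)

class MalignantLesion(Lesion):
    def describe(self):               # polymorphism: specialised behaviour
        return super().describe() + " (requires staging)"

reports = [l.describe()
           for l in (BenignLesion("duct"), MalignantLesion("lobule"))]
```

    As the authors observe, the taxonomy reads naturally as a domain model, while the silent reuse in BenignLesion is the less intuitive, purely technical face of inheritance.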

  12. Effects of distinctive encoding on correct and false memory: a meta-analytic review of costs and benefits and their origins in the DRM paradigm.

    PubMed

    Huff, Mark J; Bodner, Glen E; Fawcett, Jonathan M

    2015-04-01

    We review and meta-analyze how distinctive encoding alters encoding and retrieval processes and, thus, affects correct and false recognition in the Deese-Roediger-McDermott (DRM) paradigm. Reductions in false recognition following distinctive encoding (e.g., generation), relative to a nondistinctive read-only control condition, reflected both impoverished relational encoding and use of a retrieval-based distinctiveness heuristic. Additional analyses evaluated the costs and benefits of distinctive encoding in within-subjects designs relative to between-group designs. Correct recognition was design independent, but in a within design, distinctive encoding was less effective at reducing false recognition for distinctively encoded lists but more effective for nondistinctively encoded lists. Thus, distinctive encoding is not entirely "cost free" in a within design. In addition to delineating the conditions that modulate the effects of distinctive encoding on recognition accuracy, we discuss the utility of using signal detection indices of memory information and memory monitoring at test to separate encoding and retrieval processes.
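
    The signal detection indices mentioned at the end of the abstract can be computed directly from hit and false-alarm rates. The sketch below uses hypothetical rates and a log-linear correction (one common choice, not necessarily the meta-analysis's exact method).

```python
from statistics import NormalDist

def sdt_indices(hit_rate, fa_rate, n_old, n_new):
    """d' (memory information) and criterion c (memory monitoring/bias),
    with a log-linear correction so rates of 0 or 1 stay finite."""
    h = (hit_rate * n_old + 0.5) / (n_old + 1)
    f = (fa_rate * n_new + 0.5) / (n_new + 1)
    z = NormalDist().inv_cdf
    return z(h) - z(f), -0.5 * (z(h) + z(f))

# Hypothetical within-subjects data: distinctive (generate) vs. read lists.
d_generate, c_generate = sdt_indices(0.80, 0.30, n_old=48, n_new=48)
d_read, c_read = sdt_indices(0.80, 0.55, n_old=48, n_new=48)
```

    With equal hit rates, the lower false-alarm rate in the generate condition yields a higher d' and a more conservative criterion, which is the kind of decomposition that lets encoding effects be separated from a retrieval-based distinctiveness heuristic.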

  13. Design and performance study of an orthopaedic surgery robotized module for automatic bone drilling.

    PubMed

    Boiadjiev, George; Kastelov, Rumen; Boiadjiev, Tony; Kotev, Vladimir; Delchev, Kamen; Zagurski, Kazimir; Vitkov, Vladimir

    2013-12-01

    Many orthopaedic operations involve drilling and tapping before the insertion of screws into a bone. This drilling is usually performed manually, thus introducing many problems. These include attaining a specific drilling accuracy, preventing blood vessels from breaking, and minimizing drill oscillations that would widen the hole. Bone overheating is the most important problem. To avoid such problems and reduce the subjective factor, automated drilling is recommended. Because numerous parameters influence the drilling process, this study examined some experimental methods. These concerned the experimental identification of technical drilling parameters, including the bone resistance force and temperature in the drilling process. During the drilling process, the following parameters were monitored: time, linear velocity, angular velocity, resistance force, penetration depth, and temperature. Specific drilling effects were revealed during the experiments. The accuracy was improved at the starting point of the drilling, and the error for the entire process was less than 0.2 mm. The temperature deviations were kept within tolerable limits. The results of various experiments with different drilling velocities, drill bit diameters, and penetration depths are presented in tables, as well as the curves of the resistance force and temperature with respect to time. Real-time digital indications of the progress of the drilling process are shown. Automatic bone drilling could entirely solve the problems that usually arise during manual drilling. An experimental setup was designed to identify bone drilling parameters such as the resistance force arising from variable bone density, appropriate mechanical drilling torque, linear speed of the drill, and electromechanical characteristics of the motors, drives, and corresponding controllers. Automatic drilling guarantees greater safety for the patient. 
Moreover, the robot presented is user-friendly because it is simple to set robot tasks, and process data are collected in real time. Copyright © 2013 John Wiley & Sons, Ltd.

  14. The impact of automation on organizational changes in a community hospital clinical microbiology laboratory.

    PubMed

    Camporese, Alessandro

    2004-06-01

    The diagnosis of infectious diseases and the role of the microbiology laboratory are currently undergoing a process of change. The need for overall efficiency in providing results is now given the same importance as accuracy. This means that laboratories must be able to produce quality results in less time with the capacity to interpret the results clinically. To improve the clinical impact of microbiology results, the new challenge facing the microbiologist has become one of process management instead of pure analysis. A proper project management process designed to improve workflow, reduce analytical time, and provide the same high quality results without losing valuable time treating the patient, has become essential. Our objective was to study the impact of introducing automation and computerization into the microbiology laboratory, and the reorganization of the laboratory workflow, i.e. scheduling personnel to work shifts covering both the entire day and the entire week. In our laboratory, the introduction of automation and computerization, as well as the reorganization of personnel, thus the workflow itself, has resulted in an improvement in response time and greater efficiency in diagnostic procedures.

  15. A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems

    PubMed Central

    Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang

    2016-01-01

    With the growing popularity of complex dynamic activities in manufacturing processes, traceability of the entire life of every product has drawn significant attention especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach. PMID:26999141

  16. A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems.

    PubMed

    Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang

    2016-03-17

    With the growing popularity of complex dynamic activities in manufacturing processes, traceability of the entire life of every product has drawn significant attention especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach.
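
    The trace-back idea in these two records can be sketched with a tiny net. Below, transitions map input places to output places (a hypothetical bee-product line, not the paper's actual model), and a backwards walk from a possibly contaminated place collects every upstream suspect in the item's lifecycle.

```python
from collections import deque

# Hypothetical Petri-net fragment: each transition consumes its input
# places (materials/states) and produces its output places.
TRANSITIONS = {
    "blend":  ({"raw_honey_A", "raw_honey_B"}, {"blended_honey"}),
    "filter": ({"blended_honey"}, {"filtered_honey"}),
    "bottle": ({"filtered_honey", "empty_jar"}, {"jarred_honey"}),
    "label":  ({"jarred_honey"}, {"finished_product"}),
}

def trace_back(place):
    """Walk transitions backwards from a possibly contaminated place,
    collecting every upstream place that could have introduced the fault."""
    suspects, frontier = set(), deque([place])
    while frontier:
        p = frontier.popleft()
        for inputs, outputs in TRANSITIONS.values():
            if p in outputs:
                for q in inputs - suspects:
                    suspects.add(q)
                    frontier.append(q)
    return suspects

# A contaminated finished product implicates the whole upstream lifecycle.
print(sorted(trace_back("finished_product")))
```

    A real cyber-physical deployment would attach sensor readings and timestamps to each marking, but the backward-reachability walk is the core of the traceability analysis.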

  17. Turboexpander plant designs can provide high ethane recovery without inlet CO/sub 2/ removal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkinson, J.D.; Hudson, H.M.

    1982-05-01

    Several new turboexpander gas-plant schemes offer two advantages over conventional processes. First, they can recover over 85% of the natural gas stream's ethane while handling higher inlet CO2 concentrations without freezing; this saves considerable cost by allowing smaller CO2 removal units or eliminating the need for them entirely. Second, the liquids recovery system requires no more external horsepower and in many cases even less; this maximizes the quantity of liquids recovered per unit of energy input, further lowering costs. The economic benefits associated with the proved plant designs make the processes attractive even for inlet gas streams containing little or no CO2.

  18. A design of experiment approach for efficient multi-parametric drug testing using a Caenorhabditis elegans model.

    PubMed

    Letizia, M C; Cornaglia, M; Tranchida, G; Trouillon, R; Gijs, M A M

    2018-01-22

    When studying the drug effectiveness towards a target model, one should distinguish the effects of the drug itself and of all the other factors that could influence the screening outcome. This comprehensive knowledge is crucial, especially when model organisms are used to study the drug effect at a systemic level, as a higher number of factors can influence the drug-testing outcome. Covering the entire experimental domain and studying the effect of the simultaneous change in several factors would require numerous experiments, which are costly and time-consuming. Therefore, a design of experiment (DoE) approach in drug-testing is emerging as a robust and efficient method to reduce the use of resources, while maximizing the knowledge of the process. Here, we used a 3-factor-Doehlert DoE to characterize the concentration-dependent effect of the drug doxycycline on the development duration of the nematode Caenorhabditis elegans. To cover the experimental space, 13 experiments were designed and performed, where different doxycycline concentrations were tested, while also varying the temperature and the food amount, which are known to influence the duration of C. elegans development. A microfluidic platform was designed to isolate and culture C. elegans larvae, while testing the doxycycline effect with full control of temperature and feeding over the entire development. Our approach allowed predicting the doxycycline effect on C. elegans development in the complete drug concentration/temperature/feeding experimental space, maximizing the understanding of the effect of this antibiotic on the C. elegans development and paving the way towards a standardized and optimized drug-testing process.
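
    A Doehlert design for d factors consists of a center point plus all pairwise differences of the vertices of a regular d-simplex, giving 1 + d(d+1) runs, i.e. the 13 experiments of the 3-factor study. The sketch below generates the coded points; mapping them onto the actual concentration, temperature, and feeding ranges (and the conventional per-axis scaling) is omitted.

```python
import itertools
import numpy as np

def doehlert(d):
    """Coded Doehlert design: center + pairwise vertex differences of a
    regular d-simplex, yielding 1 + d*(d+1) uniformly spread points."""
    # Vertices of a regular simplex: standard basis vectors of R^(d+1)
    # projected onto the hyperplane orthogonal to (1, ..., 1).
    V = np.eye(d + 1) - 1.0 / (d + 1)
    # Express the vertices in an orthonormal basis of that d-dim hyperplane.
    Q, _ = np.linalg.qr(V.T)
    verts = V @ Q[:, :d]
    pts = [np.zeros(d)] + [verts[i] - verts[j]
                           for i, j in itertools.permutations(range(d + 1), 2)]
    return np.array(pts)

design = doehlert(3)   # 13 runs for drug dose x temperature x food amount
```

    All non-center points sit at the same distance from the origin, which is what gives the Doehlert design its uniform coverage of the experimental space with few runs.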

  19. Design and fabrication of self-assembled thin films

    NASA Astrophysics Data System (ADS)

    Topasna, Daniela M.; Topasna, Gregory A.

    2015-10-01

    Students experience the entire process of designing, fabricating and testing thin films during their capstone course. The films are fabricated by the ionic-self assembled monolayer (ISAM) technique, which is suited to a short class and is relatively rapid, inexpensive and environmentally friendly. The materials used are polymers, nanoparticles, and small organic molecules that, in various combinations, can create films with nanometer thickness and with specific properties. These films have various potential applications such as pH optical sensors or antibacterial coatings. This type of project offers students an opportunity to go beyond the standard lecture and labs and to experience firsthand the design and fabrication processes. They learn new techniques and procedures, as well as familiarize themselves with new instruments and optical equipment. For example, students learn how to characterize the films by using UV-Vis-NIR spectrophotometry and in the process learn how the instruments operate. This work complements a previous exercise that we introduced where students use MATHCAD to numerically model the transmission and reflection of light from thin films.

  20. Roll Angle Estimation Using Thermopiles for a Flight Controlled Mortar

    DTIC Science & Technology

    2012-06-01

    Using Xilinx’s System Generator, the entire design was implemented at a relatively high level within MATLAB’s Simulink. This allowed VHDL code to...thermopile data with a Recursive Least Squares (RLS) filter implemented on a field programmable gate array (FPGA). These results demonstrate the...accurately estimated by processing the thermopile data with a Recursive Least Squares (RLS) filter implemented on a field programmable gate array (FPGA).
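    The RLS filter named in this excerpt is a standard recursive estimator for linear-in-parameters models. A minimal sketch of its update equations, applied to a synthetic two-parameter sinusoidal measurement model (an assumption for illustration, not the actual thermopile model):

    ```python
    # Sketch of the recursive least squares (RLS) update; the regressor
    # basis and channel gains below are illustrative, not the mortar's
    # thermopile measurement model.
    import numpy as np

    def rls_step(theta, P, phi, y, lam=1.0):
        """One RLS update: returns the new estimate and covariance.
        lam is the forgetting factor (1.0 = ordinary least squares)."""
        k = P @ phi / (lam + phi @ P @ phi)      # gain vector
        theta = theta + k * (y - phi @ theta)    # innovation correction
        P = (P - np.outer(k, phi @ P)) / lam
        return theta, P

    rng = np.random.default_rng(1)
    true = np.array([2.0, -0.5])                 # e.g. sin/cos channel gains
    theta = np.zeros(2)
    P = 1e6 * np.eye(2)                          # large initial uncertainty
    for t in np.linspace(0, 10, 500):
        phi = np.array([np.sin(t), np.cos(t)])   # known regressor basis
        y = phi @ true + 0.01 * rng.standard_normal()
        theta, P = rls_step(theta, P, phi, y)
    ```

    Because each update is a handful of multiply-accumulates, this structure maps naturally onto FPGA fabric, which is presumably why the report pairs RLS with an FPGA implementation.
    
    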

  1. Technology Horizons: A Vision for Air Force Science and Technology 2010-30

    DTIC Science & Technology

    2011-09-01

    software, hardware, and networks, it is now recognized as encompassing the entire system that couples information flow and decision processes across...acceleration, and scramjet cruise. Inward-turning inlets and a dual-flow-path design allow high volumetric efficiency, and high cruise speed provides...the same time, emerging “third-stream engine architectures” can enable constant-mass-flow engines that can provide further reductions in fuel

  2. Contingency plans for chromium utilization. Publication NMAB-335

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The United States depends entirely on foreign sources for the critical material, chromium, making it very vulnerable to supply disruptions. The effectiveness of programs such as stockpiling, conservation, and research and development for substitutes to reduce the impact of disruption of imports of chromite and ferrochromium are discussed. Alternatives for decreasing chromium consumption also are identified for chromium-containing materials in the areas of design, processing, and substitution.

  3. Missing the Mark: Is ICS Training Achieving Its Goal

    DTIC Science & Technology

    2016-12-01

    method achieves learning and actually gives students new knowledge, skills, and...designed to be five days (40 hours) long. The class assumes that the student already has a general understanding of ICS and completion of at least...The entire process is time consuming, as the student must complete the in-class time (as required for the specific class) and

  4. The Ontology of Clinical Research (OCRe): An Informatics Foundation for the Science of Clinical Research

    PubMed Central

    Sim, Ida; Tu, Samson W.; Carini, Simona; Lehmann, Harold P.; Pollock, Brad H.; Peleg, Mor; Wittkowski, Knut M.

    2013-01-01

    To date, the scientific process for generating, interpreting, and applying knowledge has received less informatics attention than operational processes for conducting clinical studies. The activities of these scientific processes — the science of clinical research — are centered on the study protocol, which is the abstract representation of the scientific design of a clinical study. The Ontology of Clinical Research (OCRe) is an OWL 2 model of the entities and relationships of study design protocols for the purpose of computationally supporting the design and analysis of human studies. OCRe’s modeling is independent of any specific study design or clinical domain. It includes a study design typology and a specialized module called ERGO Annotation for capturing the meaning of eligibility criteria. In this paper, we describe the key informatics use cases of each phase of a study’s scientific lifecycle, present OCRe and the principles behind its modeling, and describe applications of OCRe and associated technologies to a range of clinical research use cases. OCRe captures the central semantics that underlies the scientific processes of clinical research and can serve as an informatics foundation for supporting the entire range of knowledge activities that constitute the science of clinical research. PMID:24239612

  5. On the design and operation of primary settling tanks in state of the art wastewater treatment and water resources recovery.

    PubMed

    Patziger, Miklos; Günthert, Frank Wolfgang; Jardin, Norbert; Kainz, Harald; Londong, Jörg

    2016-11-01

    In state of the art wastewater treatment, primary settling tanks (PSTs) are considered an integral part of the biological wastewater and sludge treatment process, as well as of biogas and electric energy production. Consequently, they strongly influence the efficiency of the entire wastewater treatment plant. However, in recent decades the inner physical processes of PSTs, which largely determine their efficiency, have been poorly addressed. In common practice, PSTs are still designed and operated as a black box, based solely on the surface overflow rate and the hydraulic retention time (HRT). The paper shows the results of a comprehensive investigation programme covering 16 PSTs. Their removal efficiency, inner physical processes (such as the settling of primary sludge), internal flow structures, and the impact of these on performance were investigated. The results show that: (1) the removal rates of PSTs are often underestimated in current design guidelines, (2) the removal rate of different PSTs fluctuates strongly even within the same range of HRT, and (3) the inlet design of PSTs becomes highly relevant to removal efficiency at high surface overflow rates above 5 m/h, which is the upper design limit of PSTs for dry weather load.

  6. CMOS based capacitance to digital converter circuit for MEMS sensor

    NASA Astrophysics Data System (ADS)

    Rotake, D. R.; Darji, A. D.

    2018-02-01

    Most MEMS cantilever-based systems require costly instruments for characterization and processing and involve large experimental setups, which leads to non-portable devices. There is therefore a need for a low-cost, highly sensitive, high-speed, and portable digital system. The proposed capacitance-to-digital converter (CDC) interfacing circuit converts capacitance to the digital domain, where it can be easily processed. Recent applications demand microcantilever deflection measurement in the parts-per-trillion range, which changes the capacitance in the 1-10 femtofarad (fF) range. The entire CDC circuit is designed in 250 nm CMOS technology. The CDC circuit consists of a D-latch and two oscillators, namely a sensor-controlled oscillator (SCO) and a digitally controlled oscillator (DCO). The D-latch is designed using a transmission-gate-based MUX for power optimization. CDC designs with 7-stage, 9-stage, and 11-stage oscillators were tested for 1-18 fF and simulated using the Mentor Graphics Eldo tool with parasitics. Since the proposed design uses no resistive components, the total power dissipation is reduced to 2.3621 mW for the CDC designed with the 9-stage SCO and DCO.
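    A toy model of the counting principle behind such a two-oscillator CDC clarifies why the digital output tracks capacitance. Everything below (the reference frequency, the SCO gain law) is an illustrative assumption, not the paper's 250 nm circuit:

    ```python
    # Sketch of the counting principle of an oscillator-based CDC: the
    # sensor-controlled oscillator (SCO) slows down as sensor capacitance
    # grows, and a fixed digitally controlled oscillator (DCO) clocks a
    # counter gated by one SCO period. Constants are illustrative only.

    F_DCO = 100e6      # assumed fixed reference oscillator frequency, Hz
    SCO_GAIN = 1e-8    # assumed ring-oscillator model: f_SCO = SCO_GAIN / C

    def sco_freq(c_farad):
        """SCO frequency: ring-oscillator stage delay scales with load C."""
        return SCO_GAIN / c_farad

    def cdc_count(c_farad):
        """Digital word: DCO cycles counted during one SCO period, which
        makes the count linear in the sensed capacitance."""
        return round(F_DCO / sco_freq(c_farad))

    # 1 fF to 18 fF, the range tested in the paper, maps to a monotone code
    codes = [cdc_count(c * 1e-15) for c in range(1, 19)]
    ```

    The linearity of count versus capacitance is the property that lets a downstream digital block recover femtofarad-scale changes without any resistive components.
    
    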

  7. Use Of REX Control System For The Ball On Spool Model

    NASA Astrophysics Data System (ADS)

    Ožana, Štěpán; Pieš, Martin; Hájovský, Radovan; Dočekal, Tomáš

    2015-07-01

    This paper deals with the design and implementation of a linear quadratic regulator (LQR) for the Ball on Spool model. The paper presents the entire process, from the mathematical model through control design to application of the controller on the given hardware platform. The proposed solution, based on the REX Control System, provides a high level of user comfort regarding implementation of the control loop, diagnostics, and automatically generated visualization based on HTML5. The Ball on Spool is an ideal example of a complex nonlinear mechatronic system with many possibilities to apply other types of controllers.
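    An LQR gain like the one applied to the Ball on Spool model can be computed by iterating the discrete-time Riccati equation. The sketch below uses a stand-in discretized double-integrator plant and unit weights, since the actual Ball on Spool dynamics and tuning are not given here.

    ```python
    # Sketch: discrete LQR gain via Riccati iteration. The 2x2
    # double-integrator dynamics are a stand-in, not the real plant.
    import numpy as np

    def dlqr(A, B, Q, R, iters=2000):
        """Iterate the discrete-time Riccati recursion to its fixed point,
        then return the state-feedback gain K for u = -K x."""
        P = Q.copy()
        for _ in range(iters):
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            P = Q + A.T @ P @ (A - B @ K)
        return K

    dt = 0.01
    A = np.array([[1.0, dt], [0.0, 1.0]])   # position/velocity integrator
    B = np.array([[0.0], [dt]])
    K = dlqr(A, B, Q=np.eye(2), R=np.array([[1.0]]))

    # closed-loop poles must lie inside the unit circle
    poles = np.linalg.eigvals(A - B @ K)
    assert np.all(np.abs(poles) < 1.0)
    ```

    The pole check at the end is the standard acceptance test: the LQR fixed point always yields a stabilizing gain for a stabilizable plant.
    
    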

  8. An intelligent factory-wide optimal operation system for continuous production process

    NASA Astrophysics Data System (ADS)

    Ding, Jinliang; Chai, Tianyou; Wang, Hongfeng; Wang, Junwei; Zheng, Xiuping

    2016-03-01

    In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; furthermore, this system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into a framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.

  9. Industrializing Offshore Wind Power with Serial Assembly and Lower-cost Deployment - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kempton, Willett

    A team of engineers and contractors has developed a method to move offshore wind installation toward lower cost, faster deployment, and lower environmental impact. A combination of methods, some incremental and some breaks from past practice, interact to yield multiple improvements. Three designs were evaluated based on detailed engineering: 1) a 5 MW turbine on a jacket with pin piles (base case), 2) a 10 MW turbine on a conventional jacket with pin piles, assembled at sea, and 3) a 10 MW turbine on a tripod jacket with suction buckets (caissons) and with complete turbine assembly on-shore. The larger turbine, assembly ashore, and the use of suction buckets together substantially reduce the capital cost of offshore wind projects. Notable capital cost reductions are: changing from a 5 MW to a 10 MW turbine, a 31% capital cost reduction, and assembly on land followed by single-piece installation at sea, an additional 9% capital cost reduction. A projected Design 4) estimates further cost reduction when the equipment and processes of Design 3) are optimized rather than adapted to existing equipment and processes. The cost of energy for each of the four designs is also calculated, yielding approximately the same percentage reductions. The methods of Design 3) analyzed here include accepted structures such as suction buckets used in new ways, innovations conceived but previously without engineering and economic validation, combined with new methods not previously proposed. Analyses of Designs 2) and 3) are based on extensive engineering calculations and detailed cost estimates. All design methods can be done with existing equipment, including lift equipment, ports, and ships (except that Design 4 assumes a more optimized ship). The design team consists of experienced offshore structure designers, heavy lift engineers, wind turbine designers, vessel operators, and marine construction contractors.
Comparing the methods on criteria of cost and deployment speed, the study selected the third design. That design is, in brief: a conventional turbine and tubular tower is mounted on a tripod jacket, in turn atop three suction buckets. Blades are mounted on the tower, not on the hub. The entire structure is built in port, from the bottom up, and the assembled structures are then queued in the port for deployment. During weather windows, the fully-assembled structures are lifted off the quay, lashed to the vessel, and transported to the deployment site. The vessel analyzed is a shear leg crane vessel with dynamic positioning like the existing Gulliver, or it could be a US-built crane barge. On site, the entire structure is lowered to the bottom by the crane vessel, then pumping of the suction buckets is managed by smaller service vessels. Blades are lifted into place by small winches operated by workers in the nacelle, without lift vessel support. Advantages of the selected design include: cost and time at sea of the expensive lift vessel are significantly reduced; no jack-up vessel is required; the weather window required for each installation is shorter; turbine structure construction is continuous, with a queue feeding the weather-dependent installation process; pre-installation geotechnical work is faster and less expensive; there are no sound impacts on marine mammals, thus minimal spotting and no work stoppage for mammal passage; the entire structure can be removed for decommissioning or major repairs; and the method has been validated for current turbines up to 10 MW, with a calculation using simple scaling showing it usable up to 20 MW turbines.

  10. Computer-aided design of tooth preparations for automated development of fixed prosthodontics.

    PubMed

    Yuan, Fusong; Sun, Yuchun; Wang, Yong; Lv, Peijun

    2014-01-01

    This paper introduces a method to digitally design a virtual model of a tooth preparation of the mandibular first molar using the commercial three-dimensional (3D) computer-aided design software packages Geomagic and Imageware, and to use the model as input to an automatic tooth preparation system. The procedure included acquisition of 3D data from dentate casts and digital modeling of the shape of the tooth preparation components, such as the margin, occlusal surface, and axial surface. The completed model data were stored as stereolithography (STL) files, which were used in the tooth preparation system to help plan the trajectory. The mathematical models required in the design process are also introduced. The method was used to make an individualized tooth preparation of the mandibular first molar; the entire process took 15 min. Using the method presented, a straightforward 3D shape of a full crown can be obtained to meet clinical needs prior to tooth preparation. © 2013 Published by Elsevier Ltd.

  11. Advanced Biofuels and Beyond: Chemistry Solutions for Propulsion and Production.

    PubMed

    Leitner, Walter; Klankermayer, Jürgen; Pischinger, Stefan; Pitsch, Heinz; Kohse-Höinghaus, Katharina

    2017-05-08

    Sustainably produced biofuels, especially when they are derived from lignocellulosic biomass, are being discussed intensively for future ground transportation. Traditionally, research activities focus on the synthesis process, while leaving their combustion properties to be evaluated by a different community. This Review adopts an integrative view of engine combustion and fuel synthesis, focusing on chemical aspects as the common denominator. It will be demonstrated that a fundamental understanding of the combustion process can be instrumental to derive design criteria for the molecular structure of fuel candidates, which can then be targets for the analysis of synthetic pathways and the development of catalytic production routes. With such an integrative approach to fuel design, it will be possible to improve systematically the entire system, spanning biomass feedstock, conversion process, fuel, engine, and pollutants with a view to improve the carbon footprint, increase efficiency, and reduce emissions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. The PhOCoe Model--ergonomic pattern mapping in participatory design processes.

    PubMed

    Silva e Santos, Marcello

    2012-01-01

    The discipline and practice of human factors and ergonomics is quite rich in analysis, development, and evaluation tools and methods for its various processes. However, we lack effective instruments to comprehensively map or regulate cognitive and organizational impacts, and especially environmental ones. Moreover, when ergonomic transformation through design - such as a new workstation design or even an entire new facility - is at play, ergonomics professionals tend to stay on the sidelines, relying solely on design professionals and engineers. There is vast empirical evidence that participation of ergonomists as project facilitators may contribute to an effective professional synergy among the various stakeholders in a multidisciplinary venue. When that happens, everyone wins - users and designers alike - because conflicts raised in the midst of option selection are dissipated in exchange for more convergent design alternatives. This paper presents a method for participatory design in which users are encouraged to actively participate in the whole design process by sharing their real work activities with the design team. The negotiated results inferred from the ergonomic action and translated into a new design are then compiled into an "Ergonomic Pattern Manual". This handbook of ergonomics-oriented design guidelines contains essential guidelines to be consulted in recurrent design project situations in which similar patterns might be used. The main drive is simple: nobody knows better than the workers themselves what an adequate workplace design solution (equipment, workstation, office layout) should be.

  13. Strongly-Refractive One-Dimensional Photonic Crystal Prisms

    NASA Technical Reports Server (NTRS)

    Ting, David Z. (Inventor)

    2004-01-01

    One-dimensional (1D) photonic crystal prisms can separate a beam of polychromatic electromagnetic waves into constituent wavelength components and can utilize unconventional refraction properties for wavelength dispersion over significant portions of an entire photonic band rather than just near the band edges outside the photonic band gaps. Using a 1D photonic crystal simplifies the design and fabrication process and allows the use of larger feature sizes. The prism geometry broadens the useful wavelength range, enables better optical transmission, and exhibits angular dependence on wavelength with reduced non-linearity. The properties of the 1D photonic crystal prism can be tuned by varying design parameters such as incidence angle, exit surface angle, and layer widths. The 1D photonic crystal prism can be fabricated in a planar process, and can be used as an optical integrated circuit element.

  14. Low-cost composite blades for the Mod-0A wind turbines

    NASA Technical Reports Server (NTRS)

    Weingart, O.

    1982-01-01

    Low cost approaches to the design and fabrication of blades for a two-bladed 200 kW wind turbine were identified and the applicability of the techniques to larger and smaller blades was assessed. Blade tooling was designed and fabricated. Two complete blades and a partial blade for tool tryout were built. The patented TFT process was used to wind the entire blade. This process allows rapid winding of an axially oriented composite onto a tapered mandrel, with tapered wall thickness. The blade consists of a TFT glass-epoxy airfoil structure filament wound onto a steel root end fitting. The fitting is, in turn, bolted to a conical steel adapter section to provide for mounting attachment to the hub. Structural analysis, blade properties, and cost and weight analysis are described.

  15. From community preferences to design: Investigation of human-centered optimization algorithms in web-based, democratic planning of watershed restoration

    NASA Astrophysics Data System (ADS)

    Babbar-Sebens, M.; Mukhopadhyay, S.

    2014-12-01

    Web 2.0 technologies are useful resources for reaching out to larger stakeholder communities and involve them in policy making and planning efforts. While these technologies have been used in the past to support education and communication endeavors, we have developed a novel, web-based, interactive planning tool that involves the community in using science-based methods for the design of potential runoff management strategies on their landscape. The tool, Watershed REstoration using Spatio-Temporal Optimization of Resources (WRESTORE), uses a democratic voting process coupled with visualization interfaces, computational simulation and optimization models, and user modeling techniques to support a human-centered design approach. The tool can be used to engage diverse watershed stakeholders and landowners via the internet, thereby improving opportunities for outreach and collaborations. Users are able to (a) design multiple types of conservation practices at their field-scale catchment and at the entire watershed scale, (b) examine impacts and limitations of their decisions on their neighboring catchments and on the entire watershed, (c) compare alternatives via a cost-benefit analysis, (d) vote on their "favorite" designs based on their preferences and constraints, and (e) propose their "favorite" alternatives to policy makers and other stakeholders. In this presentation, we will demonstrate the effectiveness of WRESTORE for designing alternatives of conservation practices to reduce peak flows in a Midwestern watershed, present results on multiple approaches for engaging with larger communities, and discuss potential for future developments.

  16. Design and development of a layer-based additive manufacturing process for the realization of metal parts of designed mesostructure

    NASA Astrophysics Data System (ADS)

    Williams, Christopher Bryant

    Low-density cellular materials, metallic bodies with gaseous voids, are a unique class of materials that are characterized by their high strength, low mass, good energy absorption characteristics, and good thermal and acoustic insulation properties. In an effort to take advantage of this entire suite of positive mechanical traits, designers are tailoring the cellular mesostructure for multiple design objectives. Unfortunately, existing cellular material manufacturing technologies limit the design space as they are limited to certain part mesostructure, material type, and macrostructure. The opportunity that exists to improve the design of existing products, and the ability to reap the benefits of cellular materials in new applications is the driving force behind this research. As such, the primary research goal of this work is to design, embody, and analyze a manufacturing process that provides a designer the ability to specify the material type, material composition, void morphology, and mesostructure topology for any conceivable part geometry. The accomplishment of this goal is achieved in three phases of research: (1) Design---Following a systematic design process and a rigorous selection exercise, a layer-based additive manufacturing process is designed that is capable of meeting the unique requirements of fabricating cellular material geometry. Specifically, metal parts of designed mesostructure are fabricated via three-dimensional printing of metal oxide ceramic powder followed by post-processing in a reducing atmosphere. (2) Embodiment ---The primary research hypothesis is verified through the use of the designed manufacturing process chain to successfully realize metal parts of designed mesostructure. (3) Modeling & Evaluation ---The designed manufacturing process is modeled in this final research phase so as to increase understanding of experimental results and to establish a foundation for future analytical modeling research. 
In addition to an analysis of the physics of primitive creation and an investigation of failure modes during the layered fabrication of thin trusses, build time and cost models are presented in order to verify claims of the process's economic benefits. The main contribution of this research is the embodiment of a novel manner for realizing metal parts of designed mesostructure.

  17. Designing Domain-Specific HUMS Architectures: An Automated Approach

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban

    2004-01-01

    The HUMS automation system automates the design of HUMS architectures. The automated design process involves both selection of solutions from a large space of designs and pure synthesis of designs. The objective is thus to efficiently search for or synthesize designs, or parts of designs, in the database and to integrate them to form the entire system design. The automation system adopts two approaches to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are endowed with a suite of qualitative and quantitative techniques that enable a) the selection of matching component instances, b) the determination of design parameters, c) the evaluation of candidate designs at component level and at system level, d) the performance of cost-benefit analyses, e) the performance of trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design, and operation of HUMS systems in order to economically produce optimal, domain-specific designs.

  18. Predicting Silk Fiber Mechanical Properties through Multiscale Simulation and Protein Design.

    PubMed

    Rim, Nae-Gyune; Roberts, Erin G; Ebrahimi, Davoud; Dinjaski, Nina; Jacobsen, Matthew M; Martín-Moldes, Zaira; Buehler, Markus J; Kaplan, David L; Wong, Joyce Y

    2017-08-14

    Silk is a promising material for biomedical applications, and much research is focused on how application-specific, mechanical properties of silk can be designed synthetically through proper amino acid sequences and processing parameters. This protocol describes an iterative process between research disciplines that combines simulation, genetic synthesis, and fiber analysis to better design silk fibers with specific mechanical properties. Computational methods are used to assess the protein polymer structure as it forms an interconnected fiber network through shearing and how this process affects fiber mechanical properties. Model outcomes are validated experimentally with the genetic design of protein polymers that match the simulation structures, fiber fabrication from these polymers, and mechanical testing of these fibers. Through iterative feedback between computation, genetic synthesis, and fiber mechanical testing, this protocol will enable a priori prediction capability of recombinant material mechanical properties via insights from the resulting molecular architecture of the fiber network based entirely on the initial protein monomer composition. This style of protocol may be applied to other fields where a research team seeks to design a biomaterial with biomedical application-specific properties. This protocol highlights when and how the three research groups (simulation, synthesis, and engineering) should be interacting to arrive at the most effective method for predictive design of their material.

  19. Space station automation study: Automation requirements derived from space manufacturing concepts, volume 2

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Automation requirements were developed for two manufacturing concepts: (1) Gallium Arsenide Electroepitaxial Crystal Production and Wafer Manufacturing Facility, and (2) Gallium Arsenide VLSI Microelectronics Chip Processing Facility. A functional overview of the ultimate design concept incorporating the two manufacturing facilities on the space station is provided. The concepts were selected to facilitate an in-depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, sensors, and artificial intelligence. While the cost-effectiveness of these facilities was not analyzed, both appear entirely feasible for the year 2000 timeframe.

  20. Applying simulation to optimize plastic molded optical parts

    NASA Astrophysics Data System (ADS)

    Jaworski, Matthew; Bakharev, Alexander; Costa, Franco; Friedl, Chris

    2012-10-01

    Optical injection molded parts are used in many different industries including electronics, consumer, medical and automotive due to their cost and performance advantages compared to alternative materials such as glass. The injection molding process, however, induces elastic (residual stress) and viscoelastic (flow orientation stress) deformation into the molded article which alters the material's refractive index to be anisotropic in different directions. Being able to predict and correct optical performance issues associated with birefringence early in the design phase is a huge competitive advantage. This paper reviews how to apply simulation analysis of the entire molding process to optimize manufacturability and part performance.

  1. Process Research On Polycrystalline Silicon Material (PROPSM)

    NASA Technical Reports Server (NTRS)

    Culik, J. S.; Wohlgemuth, J. H.

    1982-01-01

    Performance limiting mechanisms in polycrystalline silicon are investigated by fabricating a matrix of solar cells of various thicknesses from polycrystalline silicon wafers of several bulk resistivities. The analysis of the results for the entire matrix indicates that bulk recombination is the dominant factor limiting the short circuit current in large grain (greater than 1 to 2 mm diameter) polycrystalline silicon, the same mechanism that limits the short circuit current in single crystal silicon. An experiment to investigate the limiting mechanisms of open circuit voltage and fill factor for large grain polycrystalline silicon is designed. Two process sequences to fabricate small cells are investigated.

  2. Innovative vitrification for soil remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetta, N.W.; Patten, J.S.; Hart, J.G.

    1995-12-01

    The objective of this DOE demonstration program is to validate the performance and operation of the Vortec Cyclone Melting System (CMS{trademark}) for the processing of LLW contaminated soils found at DOE sites. This DOE vitrification demonstration project has successfully progressed through the first two phases. Phase 1 consisted of pilot scale testing with surrogate wastes and the conceptual design of a process plant operating at a generic DOE site. The objective of Phase 2, which is scheduled to be completed by the end of FY 95, is to develop a definitive process plant design for the treatment of wastes at a specific DOE facility. During Phase 2, a site-specific design was developed for the processing of LLW soils and muds containing TSCA organics and RCRA metal contaminants. Phase 3 will consist of a full scale demonstration at the DOE gaseous diffusion plant located in Paducah, KY. Several DOE sites were evaluated for potential application of the technology. Paducah was selected for the demonstration program because of its urgent waste remediation needs as well as its strong management and cost-sharing financial support for the project. During Phase 2, the basic vitrification process design was modified to meet the specific needs of the new waste streams available at Paducah. The system design developed for Paducah has significantly enhanced the processing capabilities of the Vortec vitrification process. The overall system design now includes the capability to shred entire drums and drum packs containing mud, concrete, plastics and PCBs, as well as bulk waste materials. This enhanced processing capability will substantially expand the total DOE waste remediation applications of the technology.

  3. Approximate Model of Zone Sedimentation

    NASA Astrophysics Data System (ADS)

    Dzianik, František

    2011-12-01

    The process of zone sedimentation is affected by many factors that cannot be expressed analytically. For this reason, zone settling is evaluated in practice experimentally or through an empirical mathematical description of the process. The paper presents the development of an approximate model of zone settling, i.e. a general function that properly approximates the behaviour of the settling process over its entire range and under various conditions. Furthermore, the specification of the model parameters by regression analysis of settling test results is shown. The suitability of the model is reviewed by graphical dependencies and by statistical correlation coefficients. The approximate model could also be useful in simplifying the process design of continuous settling tanks and thickeners.
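    One common empirical description of zone settling (not necessarily the specific function developed in this paper) is the Vesilind law v = v0 * exp(-k * C), which becomes linear in C after taking logarithms and can therefore be fitted by ordinary regression. A minimal sketch with synthetic settling-test data:

    ```python
    # Sketch: fitting the Vesilind zone-settling law v = v0 * exp(-k * C)
    # to settling-test results by linearization (ln v is linear in the
    # solids concentration C). The data points are synthetic.
    import numpy as np

    C = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # solids conc., kg/m^3
    v = 7.0 * np.exp(-0.45 * C)                    # synthetic settling velocities, m/h

    slope, intercept = np.polyfit(C, np.log(v), 1) # regress ln v on C
    v0, k = np.exp(intercept), -slope

    def settling_velocity(conc):
        """Predicted zone-settling velocity at concentration conc."""
        return v0 * np.exp(-k * conc)
    ```

    Once v0 and k are pinned down from a few settling tests, the same function can feed the flux-based sizing of continuous settling tanks and thickeners that the paper alludes to.
    
    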

  4. An Exploration of Software-Based GNSS Signal Processing at Multiple Frequencies

    NASA Astrophysics Data System (ADS)

    Pasqual Paul, Manuel; Elosegui, Pedro; Lind, Frank; Vazquez, Antonio; Pankratius, Victor

    2017-01-01

    The Global Navigation Satellite System (GNSS; i.e., GPS, GLONASS, Galileo, and other constellations) has recently grown into numerous areas that go far beyond the traditional scope in navigation. In the geosciences, for example, high-precision GPS has become a powerful tool for a myriad of geophysical applications such as in geodynamics, seismology, paleoclimate, cryosphere, and remote sensing of the atmosphere. Positioning with millimeter-level accuracy can be achieved through carrier-phase-based, multi-frequency signal processing, which mitigates various biases and error sources such as those arising from ionospheric effects. Today, however, most receivers with multi-frequency capabilities are highly specialized hardware receiving systems with proprietary and closed designs, limited interfaces, and significant acquisition costs. This work explores alternatives that are entirely software-based, using Software-Defined Radio (SDR) receivers as a way to digitize the entire spectrum of interest. It presents an overview of existing open-source frameworks and outlines the next steps towards converting GPS software receivers from single-frequency to dual-frequency, geodetic-quality systems. In the future, this development will lead to a more flexible multi-constellation GNSS processing architecture that can be easily reused in different contexts, as well as to further miniaturization of receivers.
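The ionospheric mitigation that motivates dual-frequency, geodetic-quality processing can be sketched with the standard first-order "ionosphere-free" linear combination; this is textbook GNSS material, not code from the work described, and the range and TEC values below are assumed.

```python
# Standard first-order ionosphere-free pseudorange combination: the
# ionospheric delay scales as 1/f^2, so a weighted difference of the two
# frequencies cancels it. Values below are assumed for illustration.
F1, F2 = 1575.42e6, 1227.60e6         # GPS L1/L2 carrier frequencies (Hz)

def iono_free(p1, p2, f1=F1, f2=F2):
    """First-order ionosphere-free combination of two pseudoranges (m)."""
    return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

rho = 21_000_123.456                  # hypothetical geometric range (m)
q = 40.3 * 5e17                       # 40.3 * TEC term (m*Hz^2), assumed TEC
p1 = rho + q / F1**2                  # L1 pseudorange with ionospheric delay
p2 = rho + q / F2**2                  # L2 delay is larger (lower frequency)
print(iono_free(p1, p2) - rho)        # first-order delay cancels
```

A software receiver that digitizes both bands with an SDR front end can form this combination in post-processing, which is one reason dual-frequency capability matters for geodetic accuracy.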

  5. Engineering of an inhalable DDA/TDB liposomal adjuvant: a quality-by-design approach towards optimization of the spray drying process.

    PubMed

    Ingvarsson, Pall Thor; Yang, Mingshi; Mulvad, Helle; Nielsen, Hanne Mørck; Rantanen, Jukka; Foged, Camilla

    2013-11-01

    The purpose of this study was to identify and optimize spray drying parameters of importance for the design of an inhalable powder formulation of a cationic liposomal adjuvant composed of dimethyldioctadecylammonium (DDA) bromide and trehalose-6,6'-dibehenate (TDB). A quality by design (QbD) approach was applied to identify and link critical process parameters (CPPs) of the spray drying process to critical quality attributes (CQAs) using risk assessment and design of experiments (DoE), followed by identification of an optimal operating space (OOS). A central composite face-centered design was carried out followed by multiple linear regression analysis. Four CQAs were identified; the mass median aerodynamic diameter (MMAD), the liposome stability (size) during processing, the moisture content and the yield. Five CPPs (drying airflow, feed flow rate, feedstock concentration, atomizing airflow and outlet temperature) were identified and tested in a systematic way. The MMAD and the yield were successfully modeled. For the liposome size stability, the ratio between the size after and before spray drying was modeled successfully. The model for the residual moisture content was poor, although, the moisture content was below 3% in the entire design space. Finally, the OOS was drafted from the constructed models for the spray drying of trehalose stabilized DDA/TDB liposomes. The QbD approach for the spray drying process should include a careful consideration of the quality target product profile. This approach implementing risk assessment and DoE was successfully applied to optimize the spray drying of an inhalable DDA/TDB liposomal adjuvant designed for pulmonary vaccination.
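The design-of-experiments machinery the study uses can be sketched in miniature: a face-centered central composite design (CCF) in coded factors, followed by a quadratic response-surface fit via multiple linear regression. The study varied five CPPs; only two factors and a synthetic response are shown here to keep the example small, and the coefficients are assumed for testing.

```python
import numpy as np
from itertools import product

# Face-centered central composite design (CCF) in two coded factors:
# 4 factorial points, 4 axial (face-centered) points, 1 center point.
factorial = list(product([-1.0, 1.0], repeat=2))
axial = [(-1.0, 0.0), (1.0, 0.0), (0.0, -1.0), (0.0, 1.0)]
center = [(0.0, 0.0)]
design = np.array(factorial + axial + center)

def model_matrix(x):
    """Full quadratic model: intercept, linear, interaction, square terms."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Synthetic noiseless response with known coefficients (assumed, for testing).
beta_true = np.array([5.0, 1.2, -0.8, 0.5, 2.0, -1.5])
y = model_matrix(design) @ beta_true

# Multiple linear regression, as in the study's response-surface analysis.
beta_fit, *_ = np.linalg.lstsq(model_matrix(design), y, rcond=None)
print(np.round(beta_fit, 3))
```

In the actual workflow each CQA (MMAD, size ratio, moisture, yield) would get its own fitted model over the five CPPs, and the operating space is read off from the overlap of acceptable predicted responses.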

  6. End-to-end commissioning demonstration of the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    Acton, D. Scott; Towell, Timothy; Schwenker, John; Shields, Duncan; Sabatke, Erin; Contos, Adam R.; Hansen, Karl; Shi, Fang; Dean, Bruce; Smith, Scott

    2007-09-01

    The one-meter Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate the design and implementation of the wavefront sensing and control (WFSC) capabilities of the James Webb Space Telescope (JWST). We have recently conducted an "end-to-end" demonstration of the flight commissioning process on the TBT. This demonstration started with the Primary Mirror (PM) segments and the Secondary Mirror (SM) in random positions, traceable to the worst-case flight deployment conditions. The commissioning process detected and corrected the deployment errors, resulting in diffraction-limited performance across the entire science FOV. This paper will describe the commissioning demonstration and the WFSC algorithms used at each step in the process.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kung, C C; Brunkhorst, C; Greenough, N

    Experimental results have shown that the high harmonic fast wave (HHFW) at 30 MHz can provide substantial plasma heating and current drive for NSTX spherical tokamak operation. However, the present antenna strap design rarely achieves the design goal of delivering the full transmitter capability of 6 MW to the plasma. In order to deliver more power to the plasma, a new antenna strap design and the associated coaxial line feeds are being constructed. This new antenna strap design features two feed-throughs to replace the old single feed-through design. In the design process, CST Microwave Studio has been used to simulate the entire new antenna strap structure, including the enclosure and the Faraday shield. In this paper, the antenna strap model and the simulation results will be discussed in detail. The test results from the new antenna straps with their associated resonant loops will be presented as well.

  8. Stagnation-point heat-transfer rate predictions at aeroassist flight conditions

    NASA Technical Reports Server (NTRS)

    Gupta, Roop N.; Jones, Jim J.; Rochelle, William C.

    1992-01-01

    The results are presented for the stagnation-point heat-transfer rates used in the design process of the Aeroassist Flight Experiment (AFE) vehicle over its entire aeropass trajectory. The prediction methods used in this investigation demonstrate the application of computational fluid dynamics (CFD) techniques to a wide range of flight conditions and their usefulness in a design process. The heating rates were computed by a viscous-shock-layer (VSL) code at the lower altitudes and by a Navier-Stokes (N-S) code for the higher altitude cases. For both methods, finite-rate chemically reacting gas was considered, and a temperature-dependent wall-catalysis model was used. The wall temperature for each case was assumed to be the radiative equilibrium temperature, based on total heating. The radiative heating was estimated by using a correlation equation. Wall slip was included in the N-S calculation method, and this method implicitly accounts for shock slip. The N-S/VSL combination of prediction methods was established by comparison with the published benchmark flow-field code LAURA results at lower altitudes, and the direct simulation Monte Carlo results at higher altitude cases. To obtain the design heating rate over the entire forward face of the vehicle, a boundary-layer method (BLIMP code) that employs reacting chemistry and surface catalysis was used. The ratio of the VSL or N-S method prediction to that obtained from the boundary-layer method code at the stagnation point is used to define an adjustment factor, which accounts for the errors involved in using the boundary-layer method.
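The radiative-equilibrium wall-temperature assumption mentioned above follows from balancing incident total heating against surface re-radiation. A minimal sketch of that balance, with an assumed emissivity and heat flux (not AFE design values):

```python
# Radiative-equilibrium wall temperature: the wall re-radiates the total
# incident heating, q_total = eps * sigma * Tw**4, so
#   Tw = (q_total / (eps * sigma)) ** 0.25.
# The numbers below are assumed for illustration only.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant (W/m^2/K^4)

def radiative_equilibrium_temp(q_total, emissivity):
    """Wall temperature (K) for incident total heating q_total (W/m^2)."""
    return (q_total / (emissivity * SIGMA)) ** 0.25

Tw = radiative_equilibrium_temp(q_total=3.0e5, emissivity=0.85)  # 30 W/cm^2
print(round(Tw, 1))
```

The adjustment-factor step the abstract describes is then just the ratio of the VSL or N-S stagnation-point heating prediction to the boundary-layer (BLIMP) prediction at the same point.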

  9. Civil Engineering Corrosion Control. Volume 3. Cathodic Protection Design

    DTIC Science & Technology

    1975-02-01

    ...coatings, test stations, bonds, and insulation. It is certainly not a "cure-all"; its economics and feasibility must always be carefully studied... General Description of Cathodic Protection: cathodic protection, as the name signifies, is the process by which an entire surface is transformed into a... The National Association of Corrosion Engineers Standard RP-01-69, "Recommended Practice for Control of External Corrosion on..."

  10. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: looking at what you do NOT want software to do along with what you want it to do, and assuming things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This document is a review journal that covers significant developments in the field of nuclear safety. Its scope includes the analysis and control of hazards associated with nuclear energy, operations involving fissionable materials, and the products of nuclear fission and their effects on the environment. Primary emphasis is on safety in reactor design, construction, and operation; however, the safety aspects of the entire fuel cycle, including fuel fabrication, spent-fuel processing, nuclear waste disposal, handling of radioisotopes, and environmental effects of these operations, are also treated.

  12. Application of experimental design in geothermal resources assessment of Ciwidey-Patuha, West Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Ashat, Ali; Pratama, Heru Berian

    2017-12-01

    A successful assessment of the Ciwidey-Patuha geothermal field size required integrated data analysis of all aspects to determine the optimum capacity to be installed. Resource assessment involves significant uncertainty in subsurface information and in the multiple development scenarios for the field. Therefore, this paper applies an experimental design approach to the geothermal numerical simulation of Ciwidey-Patuha to generate a probabilistic resource assessment. The process assesses the impact of the evaluated parameters on resources and the interactions among these parameters. This methodology successfully estimated the maximum resources with a polynomial function covering the entire range of possible values of the important reservoir parameters.

  13. Vertical electrostatic actuator with extended digital range via tailored topology

    NASA Astrophysics Data System (ADS)

    Zhang, Yanhang; Dunn, Martin L.

    2002-07-01

    We describe the design, fabrication, and testing of an electrostatic vertical actuator that exhibits a range of motion that covers the entire initial gap between the actuator and substrate and provides controllable digital output motion. This is obtained by spatially tailoring the electrode arrangement and the stiffness characteristics of the microstructure to control the voltage-deflection characteristics. The concept is based on the electrostatic pull down of bimaterial beams, via a series of electrodes attached to the beams by flexures with tailored stiffness characteristics. The range of travel of the actuator is defined by the post-release deformed shape of the bilayer beams, and can be controlled by a post-release heat-treat process combined with a tailored actuator topology (material distribution and geometry, including spatial geometrical patterning of the individual layers of the bilayer beams). Not only does this allow an increase in the range of travel to cover the entire initial gap, but it also permits digital control of the tip of the actuator which can be designed to yield linear displacement - pull in step characteristics. We fabricated these actuators using the MUMPs surface micromachining process, and packaged them in-house. We measured, using an interferometric microscope, full field deformed shapes of the actuator at each pull in step. The measurements compare well with companion simulation results, both qualitatively and quantitatively.
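For context, the travel limit that tailored topologies like this one are designed to circumvent is the textbook parallel-plate pull-in instability: a simple spring-suspended electrode snaps down after deflecting about one third of the gap. A minimal sketch of that standard relation, with assumed parameters (not the fabricated MUMPs device's values):

```python
import math

# Textbook parallel-plate pull-in voltage for a spring-suspended electrode:
#   V_pi = sqrt(8 * k * g**3 / (27 * eps0 * A)),
# beyond which the plate snaps through the remaining ~2/3 of the gap.
# Tailored electrode/flexure topologies extend controllable travel past this.
# All numbers below are assumed for illustration.
EPS0 = 8.854e-12          # vacuum permittivity (F/m)

def pull_in_voltage(k, gap, area):
    """Pull-in voltage (V) for stiffness k (N/m), gap (m), electrode area (m^2)."""
    return math.sqrt(8.0 * k * gap**3 / (27.0 * EPS0 * area))

V = pull_in_voltage(k=10.0, gap=2e-6, area=(100e-6) ** 2)  # 10 N/m, 2 um gap
print(round(V, 2))
```

By distributing many electrodes on flexures with tailored stiffness, the actuator described above converts this single snap-down event into a sequence of controllable digital pull-in steps spanning the entire gap.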

  14. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronics system, such as a high speed train, it is relatively difficult to effectively simulate the entire system's dynamic behaviors because it involves multi-disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulating a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can certainly be used to simulate the subsystems' interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high speed train design and development processes, demonstrating that it can be applied to a wide range of engineering systems design and simulation with improved efficiency and effectiveness.
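The coupler-mediated, time-synchronized exchange the method relies on can be sketched as a lock-step co-simulation loop: each subsystem solver advances one macro time step, then interface variables are exchanged through the coupler so both stay synchronized at every exchange point. All names and dynamics here are hypothetical toys, not the paper's train model.

```python
# Minimal lock-step co-simulation sketch (all names hypothetical): a toy
# "mechanical" subsystem and a toy "control" subsystem exchange interface
# variables (position, force) through the coupler once per macro step.
def mech_step(x, force, dt):
    return x + dt * force              # toy mechanical update over one macro step

def ctrl_step(x_meas):
    return -0.5 * x_meas               # toy controller: proportional feedback force

def cosimulate(x0, dt, n_steps):
    x, force = x0, 0.0
    for _ in range(n_steps):
        x = mech_step(x, force, dt)    # subsystem A advances the macro step
        force = ctrl_step(x)           # coupler hands A's output to B, and back
    return x

print(round(cosimulate(x0=1.0, dt=0.1, n_steps=100), 6))
```

Real couplers also handle the spatial side (mapping quantities between mismatched interface meshes) and more elaborate step-size negotiation, which is what the paper's process control algorithm addresses.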

  15. Discovery informatics in biological and biomedical sciences: research challenges and opportunities.

    PubMed

    Honavar, Vasant

    2015-01-01

    New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate and analyze, and construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery, or assisting humans in discovery, through advances in (i) understanding, formalizing, and developing information-processing accounts of the entire scientific process; (ii) the design, development, and evaluation of computational artifacts (representations, processes) that embody such understanding; and (iii) the application of the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).

  16. Aerodynamic Impact of an Aft-Facing Slat-Step on High Re Airfoils

    NASA Astrophysics Data System (ADS)

    Kibble, Geoffrey; Petrin, Chris; Jacob, Jamey; Elbing, Brian; Ireland, Peter; Black, Buddy

    2016-11-01

    Typically, the initial aerodynamic design and subsequent testing and simulation of an aircraft wing assume an ideal wing surface without imperfections. In reality, however, the surface of an in-service aircraft wing rarely matches the surface characteristics of the test wings used during the conceptual design phase and certification process. This disconnect is usually deemed negligible or overlooked entirely. Specifically, many aircraft incorporate a leading edge slat; however, the mating between the slat and the top surface of the wing is not perfectly flush and creates a small aft-facing step behind the slat. In some cases, the slat can create a step as large as one millimeter tall, which is entirely submerged within the boundary layer. This abrupt change in geometry creates a span-wise vortex behind the step and, in transonic flow, causes a shock to form near the leading edge. This study investigates both experimentally and computationally the implications of an aft-facing slat-step on an aircraft wing, compared to the ideal wing surface, for subsonic and transonic flow conditions. The results of this study are useful for the design of flow control modifications for aircraft currently in service and important for improving the next generation of aircraft wings.

  17. Spacecraft Conceptual Design for Returning Entire Near-Earth Asteroids

    NASA Technical Reports Server (NTRS)

    Brophy, John R.; Oleson, Steve

    2012-01-01

    In situ resource utilization (ISRU) in general, and asteroid mining in particular are ideas that have been around for a long time, and for good reason. It is clear that ultimately human exploration beyond low-Earth orbit will have to utilize the material resources available in space. Historically, the lack of sufficiently capable in-space transportation has been one of the key impediments to the harvesting of near-Earth asteroid resources. With the advent of high-power (or order 40 kW) solar electric propulsion systems, that impediment is being removed. High-power solar electric propulsion (SEP) would be enabling for the exploitation of asteroid resources. The design of a 40-kW end-of-life SEP system is presented that could rendezvous with, capture, and subsequently transport a 1,000-metric-ton near-Earth asteroid back to cislunar space. The conceptual spacecraft design was developed by the Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at the Glenn Research Center in collaboration with the Keck Institute for Space Studies (KISS) team assembled to investigate the feasibility of an asteroid retrieval mission. Returning such an object to cislunar space would enable astronaut crews to inspect, sample, dissect, and ultimately determine how to extract the desired materials from the asteroid. This process could jump-start the entire ISRU industry.
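The leverage of high-specific-impulse SEP for moving so massive an object can be illustrated with a back-of-the-envelope Tsiolkovsky estimate. The delta-v, specific impulse, and masses below are assumed round numbers for illustration, not the COMPASS team's point design.

```python
import math

# Back-of-the-envelope sketch (assumed numbers, not the COMPASS design):
# Tsiolkovsky's rocket equation gives the propellant an SEP tug needs to
# impart a small delta-v to a captured 1,000-metric-ton asteroid.
G0 = 9.80665                      # standard gravity (m/s^2)

def propellant_mass(m_final, delta_v, isp):
    """Propellant (kg) consumed to give final mass m_final a delta_v at Isp (s)."""
    return m_final * (math.exp(delta_v / (isp * G0)) - 1.0)

# Assumed: ~200 m/s of return delta-v; Isp ~3000 s, typical of high-power
# ion/Hall thrusters in the tens-of-kW class.
mp = propellant_mass(m_final=1.0e6, delta_v=200.0, isp=3000.0)
print(round(mp / 1000.0, 1))      # propellant in metric tons
```

The point of the estimate is the scaling: at chemical-rocket Isp (~300 s) the same maneuver would need roughly ten times the propellant, which is why high-power SEP is described as enabling for asteroid retrieval.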

  18. Shielded loaded bowtie antenna incorporating the presence of paving structure for improved GPR pipe detection

    NASA Astrophysics Data System (ADS)

    Seyfried, Daniel; Jansen, Ronald; Schoebel, Joerg

    2014-12-01

    In civil engineering, Ground Penetrating Radar is becoming an increasingly valuable tool for nondestructive testing and exploration of the subsurface. For example, detecting existing utility pipe networks prior to construction work, or locating damaged spots beneath a paved street, is a highly advantageous application. However, different surface conditions as well as ground bounce reflection and antenna cross-talk may seriously affect the detection capability of the entire radar system. Therefore, proper antenna design is an essential part of obtaining radar data of high quality. In this paper we redesign a given loaded bowtie antenna in order to reduce strong and unwanted signal contributions such as ground bounce reflection and antenna cross-talk. During the optimization process we also review all parameters of our existing antenna in order to maximize energy transfer into the ground. The entire process is described in this paper, incorporating appropriate simulations along with measurements on our GPR test site, where we buried different types of pipes and cables in order to test and develop radar hardware and software algorithms under quasi-real conditions.

  19. Surgical quality assessment. A simplified approach.

    PubMed

    DeLong, D L

    1991-10-01

    The current approach to QA primarily involves taking action when problems are discovered and designing a documentation system that records the delivery of quality care. Involving the entire staff helps eliminate problems before they occur. By keeping abreast of current problems and soliciting input from staff members, QA at our hospital has improved dramatically. The cross-referencing of JCAHO and AORN standards on the assessment form and the single-sheet reporting form expedite the evaluation process and simplify record keeping. The bulletin board increases staff members' understanding of QA and boosts morale and participation. A sound and effective QA program does not require reorganizing an entire department, nor should it invoke negative connotations. Developing an effective QA program merely requires rethinking current processes. The program must meet the department's specific needs, and although many departments concentrate on documentation, auditing charts does not give a complete picture of the quality of care delivered. The QA committee must employ a variety of data collection methods on multiple indicators to ensure an accurate representation of the care delivered, and it must not overlook any issues that directly affect patient outcomes.

  20. MIRATE: MIps RATional dEsign Science Gateway.

    PubMed

    Busato, Mirko; Distefano, Rosario; Bates, Ferdia; Karim, Kal; Bossi, Alessandra Maria; López Vilariño, José Manuel; Piletsky, Sergey; Bombieri, Nicola; Giorgetti, Alejandro

    2018-06-13

    Molecularly imprinted polymers (MIPs) are robust, high-affinity synthetic receptors, which can be optimally synthesized and manufactured more economically than their biological equivalents (i.e., antibodies). In MIP production, rational design based on molecular modeling is a commonly employed technique. This mostly aids in (i) virtual screening of functional monomers (FMs), (ii) optimization of the monomer-template ratio, and (iii) selectivity analysis. We present MIRATE, an integrated science gateway for the intelligent design of MIPs. By combining and adapting multiple state-of-the-art bioinformatics tools into automated and innovative pipelines, MIRATE guides the user through the entire process of MIP design. The platform allows the user to fully customize each stage involved in the design, with the main goal of supporting synthesis in the wet laboratory. MIRATE is freely accessible with no login requirement at http://mirate.di.univr.it/. All major browsers are supported.

  1. Participatory ergonomics for psychological factors evaluation in work system design.

    PubMed

    Wang, Lingyan; Lau, Henry Y K

    2012-01-01

    It is well recognized that workers, whose voices need to be heard, should be actively encouraged to be full participants in the early design stages of a new ergonomic work system, which encompasses the development and implementation of new tools, workplaces, technologies, or organizations. This paper presents a novel participatory strategy to evaluate three key psychological factors, namely mental fatigue, spiritual stress, and emotional satisfaction, in work system design based on a modified version of Participatory Ergonomics (PE). Specifically, it integrates a PE technique with a formulation view by combining the parallel development of PE strategies, frameworks, and functions across the entire work system design process, so as to bridge the gap between qualitative and quantitative analysis of psychological factors that can adversely or advantageously affect workers' physiological and behavioral performance.

  2. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  3. An engineering code to analyze hypersonic thermal management systems

    NASA Technical Reports Server (NTRS)

    Vangriethuysen, Valerie J.; Wallace, Clark E.

    1993-01-01

    Thermal loads on current and future aircraft are increasing and as a result are stressing the energy collection, control, and dissipation capabilities of current thermal management systems and technology. The thermal loads for hypersonic vehicles will be no exception. In fact, with their projected high heat loads and fluxes, hypersonic vehicles are a prime example of systems that will require thermal management systems (TMS) that have been optimized and integrated with the entire vehicle to the maximum extent possible during the initial design stages. This will be necessary not only to meet operational requirements, but also to fulfill weight and performance constraints in order for the vehicle to take off and complete its mission successfully. To meet this challenge, the TMS can no longer be two or more entirely independent systems, nor can thermal management be an afterthought in the design process, as was the pervasive approach in the past. Instead, a TMS that is integrated throughout the entire vehicle and subsequently optimized will be required. To accomplish this, a method that iteratively optimizes the TMS throughout the vehicle will not only be highly desirable, but advantageous in order to reduce the man-hours normally required to conduct the necessary tradeoff studies and comparisons. A thermal management engineering computer code that is under development and being managed at Wright Laboratory, Wright-Patterson AFB, is discussed. The primary goal of the code is to aid in the development of a hypersonic vehicle TMS that has been optimized and integrated on a total vehicle basis.

  4. A Recipe for Soft Fluidic Elastomer Robots

    PubMed Central

    Marchese, Andrew D.; Katzschmann, Robert K.

    2015-01-01

    Abstract This work provides approaches to designing and fabricating soft fluidic elastomer robots. That is, three viable actuator morphologies composed entirely from soft silicone rubber are explored, and these morphologies are differentiated by their internal channel structure, namely, ribbed, cylindrical, and pleated. Additionally, three distinct casting-based fabrication processes are explored: lamination-based casting, retractable-pin-based casting, and lost-wax-based casting. Furthermore, two ways of fabricating a multiple DOF robot are explored: casting the complete robot as a whole and casting single degree of freedom (DOF) segments with subsequent concatenation. We experimentally validate each soft actuator morphology and fabrication process by creating multiple physical soft robot prototypes. PMID:27625913

  5. A Recipe for Soft Fluidic Elastomer Robots.

    PubMed

    Marchese, Andrew D; Katzschmann, Robert K; Rus, Daniela

    2015-03-01

    This work provides approaches to designing and fabricating soft fluidic elastomer robots. That is, three viable actuator morphologies composed entirely from soft silicone rubber are explored, and these morphologies are differentiated by their internal channel structure, namely, ribbed, cylindrical, and pleated. Additionally, three distinct casting-based fabrication processes are explored: lamination-based casting, retractable-pin-based casting, and lost-wax-based casting. Furthermore, two ways of fabricating a multiple DOF robot are explored: casting the complete robot as a whole and casting single degree of freedom (DOF) segments with subsequent concatenation. We experimentally validate each soft actuator morphology and fabrication process by creating multiple physical soft robot prototypes.

  6. Nuclear Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silver, E G

    This document is a review journal that covers significant developments in the field of nuclear safety. Its scope includes the analysis and control of hazards associated with nuclear energy, operations involving fissionable materials, and the products of nuclear fission and their effects on the environment. Primary emphasis is on safety in reactor design, construction, and operation; however, the safety aspects of the entire fuel cycle, including fuel fabrication, spent-fuel processing, nuclear waste disposal, handling of radioisotopes, and environmental effects of these operations, are also treated.

  7. An Analysis of the Design-Build Delivery Approach in Air Force Military Construction

    DTIC Science & Technology

    2008-03-01

    ...revealed His love, wisdom, and grace through this entire thesis process. I greatly appreciate the guidance, support, and encouragement of my thesis... wife. It is a miracle that I have someone with such an abundant amount of love, patience, support, understanding, and joy to share my life with... I'm amazed at how much I love you and am excited for what the future holds for our family. James W. Rosner

  8. Nuclear Safety. Technical progress journal, April--June 1996: Volume 37, No. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muhlheim, M D

    1996-01-01

    This journal covers significant issues in the field of nuclear safety. Its primary scope is safety in the design, construction, operation, and decommissioning of nuclear power reactors worldwide and the research and analysis activities that promote this goal, but it also encompasses the safety aspects of the entire nuclear fuel cycle, including fuel fabrication, spent-fuel processing and handling, nuclear waste disposal, the handling of fissionable materials and radioisotopes, and the environmental effects of all these activities.

  9. Nuclear Safety. Technical progress journal, January--March 1994: Volume 35, No. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silver, E G

    1994-01-01

    This is a journal that covers significant issues in the field of nuclear safety. Its primary scope is safety in the design, construction, operation, and decommissioning of nuclear power reactors worldwide and the research and analysis activities that promote this goal, but it also encompasses the safety aspects of the entire nuclear fuel cycle, including fuel fabrication, spent-fuel processing and handling, nuclear waste disposal, the handling of fissionable materials and radioisotopes, and the environmental effects of all these activities.

  10. Reconfigurable silicon thermo-optical device based on spectral tuning of ring resonators.

    PubMed

    Fegadolli, William S; Almeida, Vilson R; Oliveira, José Edimar Barbosa

    2011-06-20

    A novel tunable and reconfigurable thermo-optical device is theoretically proposed and analyzed in this paper. The device is designed to be entirely compatible with the CMOS process and to work as a thermo-optical filter or modulator. Numerical results, obtained by means of analytical and finite-difference time-domain (FDTD) methods, show that a compact device enables broadband operation of up to 830 GHz, which allows the device to work under a large temperature variation of up to 96 K.
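The scale of that thermal tuning range can be sanity-checked with the standard thermo-optic shift of a ring resonance. The silicon thermo-optic coefficient and group index below are typical literature values assumed for illustration, not parameters taken from the paper.

```python
# Hedged sketch with assumed silicon-photonics parameters: the thermo-optic
# resonance shift of a ring resonator is approximately
#   d_lambda = lambda0 * (dn_dT * dT) / n_g,
# or in frequency terms df = c * d_lambda / lambda0**2.
C = 2.998e8                # speed of light (m/s)
LAM0 = 1550e-9             # resonance wavelength (m)
DN_DT = 1.86e-4            # silicon thermo-optic coefficient (1/K), typical value
NG = 4.2                   # assumed group index of the Si waveguide

def freq_shift(delta_T):
    """Approximate resonance frequency shift (Hz) for a temperature swing (K)."""
    d_lambda = LAM0 * DN_DT * delta_T / NG
    return C * d_lambda / LAM0**2

print(round(freq_shift(96.0) / 1e9, 1))   # shift in GHz for a 96 K swing
```

With these typical values a ~96 K swing tunes the resonance by roughly 800 GHz, the same order as the 830 GHz operation bandwidth quoted in the abstract.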

  11. Balanced-Rotating-Spray Tank-And-Pipe-Cleaning System

    NASA Technical Reports Server (NTRS)

    Thaxton, Eric A.; Caimi, Raoul E. B.

    1995-01-01

    Spray head translates and rotates to clean entire inner surface of tank or pipe. Cleansing effected by three laterally balanced gas/liquid jets from spray head that rotates about longitudinal axis. Uses much less liquid. Cleaning process in system relies on mechanical action of jets instead of contaminant dissolution. Eliminates very difficult machining needed to make multiple converging/diverging nozzles within one spray head. Makes nozzle much smaller. Basic two-phase-flow, supersonic-nozzle design applied to other spray systems for interior or exterior cleaning.

  12. Effective Application of a Quality System in the Donation Process at Hospital Level.

    PubMed

    Trujnara, M; Czerwiński, J; Osadzińska, J

    2016-06-01

    This article describes the application of a quality system at the hospital level at the Multidisciplinary Hospital in Warsaw-Międzylesie in Poland. A quality system of hospital procedures (in accordance with the ISO system 9001:2008) regarding the donation process, from the identification of a possible donor to the retrieval of organs, was applied there in 2014. Seven independent documents about hospital procedures were designed to cover the entire process of donation. The number of donors identified increased after the application of the quality system. The reason for this increase is, above all, the cooperation of the well-trained team of specialists who have been engaged in the process of donation for many years, but formal procedures certainly organize the process and make it easier. Copyright © 2016. Published by Elsevier Inc.

  13. Equalizer design techniques for dispersive cables with application to the SPS wideband kicker

    NASA Astrophysics Data System (ADS)

    Platt, Jason; Hofle, Wolfgang; Pollock, Kristin; Fox, John

    2017-10-01

    A wide-band vertical instability feedback control system in development at CERN requires 1-1.5 GHz of bandwidth for the entire processing chain, from the beam pickups through the feedback signal digital processing to the back-end power amplifiers and kicker structures. Dispersive effects in cables, amplifiers, pickup and kicker elements can result in distortions in the time domain signal as it proceeds through the processing system, and deviations from linear phase response reduce the allowable bandwidth for the closed-loop feedback system. We have developed an equalizer analog circuit that compensates for these dispersive effects. Here we present a design technique for the construction of an analog equalizer that incorporates the effect of parasitic circuit elements in the equalizer to increase the fidelity of the implemented equalizer. Finally, we show results from the measurement of an assembled backend equalizer that corrects for dispersive elements in the cables over a bandwidth of 10-1000 MHz.
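The equalization idea in this record can be sketched in a few lines (an illustration under assumed numbers: the 0.3 dB/√MHz loss coefficient is invented, and a real equalizer must also correct phase, not just magnitude):

```python
import math

def cable_loss_db(f_hz, k_db=0.3):
    """Assumed skin-effect-dominated cable loss: k_db * sqrt(f in MHz), in dB."""
    return k_db * math.sqrt(f_hz / 1e6)

def equalizer_gain_db(f_hz, k_db=0.3):
    """Ideal magnitude equalizer: apply the inverse of the cable loss."""
    return cable_loss_db(f_hz, k_db)

if __name__ == "__main__":
    for f in (10e6, 100e6, 1000e6):  # the 10-1000 MHz band from the record
        net = equalizer_gain_db(f) - cable_loss_db(f)
        print(f"{f / 1e6:6.0f} MHz: cable {-cable_loss_db(f):+.2f} dB, "
              f"equalized net {net:+.2f} dB")
```

The flat net response is the goal; the record's contribution is achieving a good approximation of this inverse with an analog circuit whose parasitics are included in the design.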

  14. Evolution of a modular software network

    PubMed Central

    Fortuna, Miguel A.; Bonachela, Juan A.; Levin, Simon A.

    2011-01-01

    “Evolution behaves like a tinkerer” (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed. PMID:22106260
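The random-installation experiment described in this abstract can be mimicked with a toy simulation (our construction, not the paper's Debian data: the package counts and conflict probability are invented). Conflicts are confined to co-modular packages, so splitting the same packages into more modules raises the fraction that install cleanly:

```python
import random

def working_fraction(n_modules, n_pkgs=60, k_install=20, p_conflict=0.3,
                     trials=400, seed=7):
    """Fraction of randomly installed packages that have no conflict.

    Packages are spread evenly over n_modules; only packages sharing a
    module may conflict, each co-modular pair doing so with p_conflict.
    """
    rng = random.Random(seed)
    module_of = [i % n_modules for i in range(n_pkgs)]
    ok = 0
    for _ in range(trials):
        installed = rng.sample(range(n_pkgs), k_install)
        for a in installed:
            clash = any(a != b and module_of[a] == module_of[b]
                        and rng.random() < p_conflict
                        for b in installed)
            ok += not clash
    return ok / (trials * k_install)

if __name__ == "__main__":
    print(f"2 big modules:    {working_fraction(2):.2f} of packages work")
    print(f"12 small modules: {working_fraction(12):.2f} of packages work")
```

With everything else held fixed, the higher-modularity partition leaves far more packages working, matching the qualitative claim in the abstract.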

  15. Cameras for semiconductor process control

    NASA Technical Reports Server (NTRS)

    Porter, W. A.; Parker, D. L.

    1977-01-01

    The application of X-ray topography to semiconductor process control is described, considering the novel features of the high speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlations made with probe data. Also included are temperature dependent effects of material defects. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented with a description of a real time TV system prototype and of the most recent vacuum chuck design. Discussion is included of our promotion of the use of the camera by various semiconductor manufacturers.

  16. Process Engineering Technology Center Initiative

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.

    2001-01-01

    NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC because the entire facility is practically a laboratory when observed from a macro level perspective. However, it becomes necessary, at times, to show and display how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed to offer a place with a centralized focus on PE projects, and a place where KSC's PE capabilities can be showcased, and a venue where new Process Engineering technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head mounted displays to deliver work instructions; the other test bed will look into developing simulation models that can be assembled into one to create a hierarchical model.

  17. Process Engineering Technology Center Initiative

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.

    2002-01-01

    NASA's Kennedy Space Center (KSC) is developing as a world-class Spaceport Technology Center (STC). From a process engineering (PE) perspective, the facilities used for flight hardware processing at KSC are NASA's premier factories. The products of these factories are safe, successful shuttle and expendable vehicle launches carrying state-of-the-art payloads. PE is devoted to process design, process management, and process improvement, rather than product design. PE also emphasizes the relationships of workers with systems and processes. Thus, it is difficult to speak of having a laboratory for PE at KSC because the entire facility is practically a laboratory when observed from a macro level perspective. However, it becomes necessary, at times, to show and display how KSC has benefited from PE and how KSC has contributed to the development of PE; hence, it has been proposed that a Process Engineering Technology Center (PETC) be developed to offer a place with a centralized focus on PE projects, and a place where KSC's PE capabilities can be showcased, and a venue where new Process Engineering technologies can be investigated and tested. Graphics for showcasing PE capabilities have been designed, and two initial test beds for PE technology research have been identified. Specifically, one test bed will look into the use of wearable computers with head mounted displays to deliver work instructions; the other test bed will look into developing simulation models that can be assembled into one to create a hierarchical model.

  18. 3S (Safeguards, Security, Safety) based pyroprocessing facility safety evaluation plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ku, J.H.; Choung, W.M.; You, G.S.

    The big advantage of pyroprocessing for the management of spent fuels over conventional reprocessing technologies lies in its proliferation resistance, since pure plutonium cannot be separated from the spent fuel. The extracted materials can be directly used as metal fuel in a fast reactor, and pyroprocessing drastically reduces the volume and heat load of the spent fuel. KAERI has implemented the SBD (Safeguards-By-Design) concept in nuclear fuel cycle facilities. The goal of SBD is to integrate international safeguards into the entire facility design process from the very beginning of the design phase. This paper presents a safety evaluation plan using a conceptual design of a reference pyroprocessing facility, in which the 3S (Safeguards, Security, Safety)-By-Design (3SBD) concept is integrated from the early conceptual design phase. The purpose of this paper is to establish an advanced pyroprocessing hot cell facility design concept based on 3SBD for the successful realization of pyroprocessing technology with enhanced safety and proliferation resistance.

  19. Fabrication and characterization of resonant SOI micromechanical silicon sensors based on DRIE micromachining, freestanding release process and silicon direct bonding

    NASA Astrophysics Data System (ADS)

    Gigan, Olivier; Chen, Hua; Robert, Olivier; Renard, Stephane; Marty, Frederic

    2002-11-01

    This paper is dedicated to the fabrication and technological aspects of a silicon microresonator sensor. The entire project includes the fabrication processes, the system modelling/simulation, and the electronic interface. The mechanical model of such a resonator is presented, including a description of the frequency stability and hysteresis behaviour of the electrostatically driven resonator. A numerical model and FEM simulations are used to simulate the system's dynamic behaviour. The complete fabrication process is based on standard microelectronics technology with specific MEMS technological steps. The key steps are described: micromachining on SOI by Deep Reactive Ion Etching (DRIE), specific release processes to prevent sticking (resist and HF-vapour release processes) and collective vacuum encapsulation by Silicon Direct Bonding (SDB). The complete process has been validated and prototypes have been fabricated. An ASIC was designed to interface the sensor and to control the vibration amplitude. This electronics was simulated and designed to work up to 200°C and implemented in a standard 0.6 μm CMOS technology. Characterizations of sensor prototypes were performed both mechanically and electrostatically. These measurements showed good agreement with theory and FEM simulations.

  20. Broadband omnidirectional antireflection coating based on subwavelength surface Mie resonators

    PubMed Central

    Spinelli, P.; Verschuuren, M.A.; Polman, A.

    2012-01-01

    Reflection is a natural phenomenon that occurs when light passes the interface between materials with different refractive index. In many applications, such as solar cells or photodetectors, reflection is an unwanted loss process. Many ways to reduce reflection from a substrate have been investigated so far, including dielectric interference coatings, surface texturing, adiabatic index matching and scattering from plasmonic nanoparticles. Here we present an entirely new concept that suppresses the reflection of light from a silicon surface over a broad spectral range. A two-dimensional periodic array of subwavelength silicon nanocylinders designed to possess strongly substrate-coupled Mie resonances yields almost zero total reflectance over the entire spectral range from the ultraviolet to the near-infrared. This new antireflection concept relies on the strong forward scattering that occurs when a scattering structure is placed in close proximity to a high-index substrate with a high optical density of states. PMID:22353722
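For scale (a textbook Fresnel estimate, not a number from the paper), the normal-incidence reflectance of bare silicon that such a coating must suppress is roughly:

```python
def fresnel_reflectance(n1, n2):
    """Normal-incidence power reflectance at a planar n1 -> n2 interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# bare silicon in air; n ~ 3.5 is an assumed near-infrared refractive index
r_bare = fresnel_reflectance(1.0, 3.5)
print(f"bare Si reflects ~{100 * r_bare:.0f}% at normal incidence")
```

Bringing roughly a third of the incident light back off the surface is the loss budget that the Mie-resonator array reduces to almost zero across the ultraviolet-to-near-infrared range.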

  1. System Engineering Infrastructure Evolution Galileo IOV and the Steps Beyond

    NASA Astrophysics Data System (ADS)

    Eickhoff, J.; Herpel, H.-J.; Steinle, T.; Birn, R.; Steiner, W.-D.; Eisenmann, H.; Ludwig, T.

    2009-05-01

    The trend toward increasingly constrained financial budgets in satellite engineering requires continuous optimization of the S/C system engineering processes and infrastructure. In recent years Astrium has built up a system simulation infrastructure - the "Model-based Development & Verification Environment" - which is meanwhile well known across Europe and is established as Astrium's standard approach for ESA and DLR projects and now even the EU/ESA project Galileo IOV. The key feature of the MDVE/FVE approach is to provide an entire S/C simulation (with full-featured OBC simulation) already in early phases, to start OBSW code tests on a simulated S/C and to later add hardware in the loop step by step, up to an entire "Engineering Functional Model (EFM)" or "FlatSat". The subsequent enhancements to this simulator infrastructure w.r.t. spacecraft design data handling are reported in the following sections.

  2. Reduction of oxygen concentration by heater design during Czochralski Si growth

    NASA Astrophysics Data System (ADS)

    Zhou, Bing; Chen, Wenliang; Li, Zhihui; Yue, Ruicun; Liu, Guowei; Huang, Xinming

    2018-02-01

    Oxygen is one of the highest-concentration impurities in single crystals grown by the Czochralski (CZ) process, and seriously impairs the quality of the Si wafer. In this study, computer simulations were applied to design a new CZ system. A more appropriate thermal field was acquired by optimization of the heater structure. The simulation results showed that, compared with the conventional system, the oxygen concentration in the newly designed CZ system was reduced significantly throughout the entire CZ process because of the lower crucible wall temperature and optimized convection. To verify the simulation results, experiments were conducted on an industrial single-crystal furnace. The experimental results showed that the oxygen concentration was reduced significantly, especially at the top of the CZ-Si ingot. Specifically, the oxygen concentration was 6.19 × 1017 atom/cm3 at the top of the CZ-Si ingot with the newly designed CZ system, compared with 9.22 × 1017 atom/cm3 with the conventional system. Corresponding light-induced degradation of solar cells based on the top of crystals from the newly designed CZ system was 1.62%, a reduction of 0.64% compared with crystals from the conventional system (2.26%).
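The quoted figures imply the following relative improvements (simple arithmetic on the numbers in the abstract):

```python
# Arithmetic check of the oxygen and degradation figures quoted above.
o_conventional = 9.22e17  # atoms/cm^3 at the ingot top, conventional CZ system
o_new = 6.19e17           # atoms/cm^3 at the ingot top, newly designed CZ system

relative_reduction = (o_conventional - o_new) / o_conventional
lid_drop = 2.26 - 1.62    # light-induced degradation, percentage points

print(f"oxygen down ~{100 * relative_reduction:.0f}% at the ingot top; "
      f"LID down {lid_drop:.2f} percentage points")
```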

  3. PCDD/PCDF reduction by the co-combustion process.

    PubMed

    Lee, Vinci K C; Cheung, Wai-Hung; McKay, Gordon

    2008-01-01

    A novel process, termed the co-combustion process, has been developed and designed to utilise the thermal treatment of municipal solid waste (MSW) in cement clinker production and reduce PCDD/PCDF emissions. To test the conceptual design, detailed engineering design of the process and equipment was performed and a pilot plant was constructed to treat up to 40 tonnes MSW per day. The novel process features included several units external to the main traditional cement rotary kiln: an external calcination unit in which the hot gas calcined the limestone, thus making significant energy savings for this chemical reaction; the lime generated was used in a second chamber to act as a giant acid-gas scrubber to remove SOx and particularly HCl (a source of chloride); an external rotary kiln and secondary combustion unit capable of producing a hot gas at 1200 degrees C; a gas cooler to simulate a boiler turbogenerator set for electricity generation; and the incorporation of some of the bottom ash, calcined lime and dust collector solids into the cement clinker. A PCDD/PCDF inventory has been completed for the entire process, and measured PCDD/PCDF emissions were 0.001 ng I-TEQ/Nm(3) on average, which is 1% of the best practicable means [Hong Kong Environmental Protection Department, 2001. A guidance note on the best practicable means for incinerators (municipal waste incineration), BPM12/1] MSW incineration emission limit values.
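The "1%" figure pins down the reference limit by simple division (arithmetic on the abstract's numbers; the result is consistent with the commonly cited 0.1 ng I-TEQ/Nm3 incineration limit):

```python
measured = 0.001          # ng I-TEQ/Nm^3, average measured PCDD/PCDF emission
fraction_of_limit = 0.01  # "1% of the ... emission limit values"

implied_limit = measured / fraction_of_limit
print(f"implied emission limit: {implied_limit:.1f} ng I-TEQ/Nm^3")
```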

  4. Reengineering a cardiovascular surgery service.

    PubMed

    Tunick, P A; Etkin, S; Horrocks, A; Jeglinski, G; Kelly, J; Sutton, P

    1997-04-01

    Reengineering, involving the radical redesign of business processes, has been used successfully in a variety of health care settings. In 1994 New York University (NYU) Medical Center (MC) launched its first reengineering team, whose purpose was to redesign the entire process of caring for patients, from referral to discharge, on the cardiovascular (CV) surgery service. REENGINEERING TEAM: The multidisciplinary CV Surgery Reengineering Team was charged with two goals: improving customer (patient, family, and referring physician) satisfaction and improving profitability. The methodology to be used was based on a reengineering philosophy: discarding basic assumptions and designing the patient care process from the ground up. THE TRANSFER-IN INITIATIVE: A survey of NYU cardiologists, distributed in April 1994, suggested that the organization was considered a difficult place to transfer patients to. The team's recommendations led to a new, streamlined transfer-in policy. The average waiting time between a referring physician's request for a patient transfer and an NYUMC physician's acceptance of the transfer decreased from an average of 9 hours under the old system to immediate acceptance. Three customer satisfaction task forces implemented multiple programs to make the service more user friendly. In addition, referrals increased and length of stay decreased, without an adverse impact on the mortality rate. For the first time at NYUMC, a multidisciplinary team was given the mandate to achieve major changes in an entire patient care process. Similar projects are now underway.

  5. Municipal waste stabilization in a reactor with an integrated active and passive aeration system.

    PubMed

    Kasinski, Slawomir; Slota, Monika; Markowski, Michal; Kaminska, Anna

    2016-04-01

    To test whether an integrated passive and active aeration system could be an effective solution for aerobic decomposition of municipal waste under technical conditions, a full-scale composting reactor was designed. The waste was actively aerated for 5 d, passively aerated for 35 d, and then actively aerated for 5 d, and the entire composting process was monitored. During the 45-day observation period, changes in the fractional, morphological and physico-chemical characteristics of the waste at the top of the reactor differed from those in the center of the reactor. The fractional and morphological analyses made during the entire stabilization process showed a total reduction of organic matter of 82 wt% and 86 wt% at the respective depths. The reduction of organic matter calculated from the results of Loss on Ignition (LOI) and Total Organic Carbon (TOC) showed, respectively, 40.51-46.62% organic matter loss at the top and 45.33-53.39% in the center of the reactor. At the end of the process, moisture content, LOI and TOC at the top were 3.29%, 6.10% and 4.13% higher, respectively, than in the center. The results showed that application of passive aeration at a larger scale allows thermophilic levels to be maintained during the municipal solid waste composting process without inhibiting microbial activity in the reactor. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. A holistic approach to SIM platform and its application to early-warning satellite system

    NASA Astrophysics Data System (ADS)

    Sun, Fuyu; Zhou, Jianping; Xu, Zheyao

    2018-01-01

    This study proposes a new simulation platform named Simulation Integrated Management (SIM) for the analysis of parallel and distributed systems. The platform eases the process of designing and testing both applications and architectures. The main characteristics of SIM are flexibility, scalability, and expandability. To improve the efficiency of project development, new models of an early-warning satellite system were designed based on the SIM platform. Finally, through a series of experiments, the correctness of the SIM platform and the aforementioned early-warning satellite models was validated, and systematic analyses of the orbit determination precision of a ballistic missile during its entire flight process are presented, as well as the deviation of the launch/landing point. Furthermore, the causes of the deviation and methods of preventing it are fully explained. The simulation platform and the models will lay the foundations for further validations of autonomy technology in space attack-defense architecture research.

  7. All-Digital Baseband 65nm PLL/FPLL Clock Multiplier using 10-cell Library

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L., Jr.; Wu, Qiong; Liu, Rui; Chen, Li

    2014-01-01

    PLLs for clock generation are essential for modern circuits, to generate specialized frequencies for many interfaces and high frequencies for chip internal operation. These circuits depend on analog circuits and careful tailoring for each new process, and making them fault tolerant is an incompletely solved problem. Until now, all digital PLLs have been restricted to sampled data DSP techniques and not available for the highest frequency baseband applications. This paper presents the design and preliminary evaluation of an all-digital baseband technique built entirely with an easily portable 10-cell digital library. The library is also described, as it aids in research and low volume design porting to new processes. The advantages of the digital approach are the wide variety of techniques available to give varying degrees of fault tolerance, and the simplicity of porting the design to new processes, even to exotic processes that may not have analog capability. The only tuning parameter is digital gate delay. An all-digital approach presents unique problems and standard analog loop stability design criteria cannot be directly used. Because of the quantization of frequency, there is effectively infinite gain for very small loop error feedback. The numerically controlled oscillator (NCO) based on a tapped delay line cannot be reliably updated while a pulse is active in the delay line, and ordinarily does not have enough frequency resolution for a low-jitter output.
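The delay-line quantization problem named in this abstract is easy to quantify with a toy calculation (the 20 ps cell delay is an assumed illustrative figure, not from the paper):

```python
# A ring/tapped-delay-line NCO can only produce periods that are integer
# multiples of one gate delay, so its output frequency is coarsely quantized.

def nco_freq_mhz(n_taps, tau_ps=20.0):
    """Oscillation frequency (MHz) for n_taps cells of tau_ps delay each."""
    return 1e6 / (n_taps * tau_ps)

if __name__ == "__main__":
    n = 50
    step = nco_freq_mhz(n) - nco_freq_mhz(n + 1)
    print(f"{nco_freq_mhz(n):.1f} MHz; the next available frequency "
          f"is {step:.1f} MHz lower")
```

Near 1 GHz the available frequencies are tens of MHz apart under these assumptions, which is why the abstract notes that a plain tapped delay line "does not have enough frequency resolution for a low-jitter output."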

  8. ALL-Digital Baseband 65nm PLL/FPLL Clock Multiplier Using 10-Cell Library

    NASA Technical Reports Server (NTRS)

    Schuler, Robert L., Jr.; Wu, Qiong; Liu, Rui; Chen, Li; Madala, Shridhar

    2014-01-01

    PLLs for clock generation are essential for modern circuits, to generate specialized frequencies for many interfaces and high frequencies for chip internal operation. These circuits depend on analog circuits and careful tailoring for each new process, and making them fault tolerant is an incompletely solved problem. Until now, all digital PLLs have been restricted to sampled data DSP techniques and not available for the highest frequency baseband applications. This paper presents the design and preliminary evaluation of an all-digital baseband technique built entirely with an easily portable 10-cell digital library. The library is also described, as it aids in research and low volume design porting to new processes. The advantages of the digital approach are the wide variety of techniques available to give varying degrees of fault tolerance, and the simplicity of porting the design to new processes, even to exotic processes that may not have analog capability. The only tuning parameter is digital gate delay. An all-digital approach presents unique problems and standard analog loop stability design criteria cannot be directly used. Because of the quantization of frequency, there is effectively infinite gain for very small loop error feedback. The numerically controlled oscillator (NCO) based on a tapped delay line cannot be reliably updated while a pulse is active in the delay line, and ordinarily does not have enough frequency resolution for a low-jitter output.

  9. Mathematical Modeling Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Seshan, Panchalam K.; Ganapathi, Balasubramanian; Jan, Darrell L.; Ferrall, Joseph F.; Rohatgi, Naresh K.

    1994-01-01

    Generic hierarchical model of life-support system developed to facilitate comparisons of options in design of system. Model represents combinations of interdependent subsystems supporting microbes, plants, fish, and land animals (including humans). Generic model enables rapid configuration of variety of specific life support component models for tradeoff studies culminating in single system design. Enables rapid evaluation of effects of substituting alternate technologies and even entire groups of technologies and subsystems. Used to synthesize and analyze life-support systems ranging from relatively simple, nonregenerative units like aquariums to complex closed-loop systems aboard submarines or spacecraft. Model, called Generic Modular Flow Schematic (GMFS), coded in such chemical-process-simulation languages as Aspen Plus and expressed as three-dimensional spreadsheet.
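The hierarchical, modular structure described for GMFS might be sketched as follows (names, numbers, and the chaining scheme are invented for illustration; GMFS itself was coded in chemical-process-simulation languages and spreadsheets, not Python):

```python
# Hypothetical sketch of a hierarchical flow model: each subsystem maps input
# streams (kg/day) to output streams, and a system is itself a subsystem
# composed of subsystems, enabling rapid substitution of alternate technologies.

class Subsystem:
    def __init__(self, name, transform):
        self.name = name
        self.transform = transform  # dict[stream, kg/day] -> dict[stream, kg/day]

    def run(self, streams):
        return self.transform(streams)

class System(Subsystem):
    def __init__(self, name, parts):
        super().__init__(name, None)
        self.parts = parts

    def run(self, streams):
        for part in self.parts:  # chain the parts in order
            streams = part.run(streams)
        return streams

# invented two-stage air loop: crew adds CO2, a scrubber removes 95% of it
crew = Subsystem("crew", lambda s: {**s, "co2": s.get("co2", 0.0) + 1.0})
scrubber = Subsystem("scrubber", lambda s: {**s, "co2": s["co2"] * 0.05})
cabin = System("cabin", [crew, scrubber])
print(cabin.run({"co2": 0.0}))  # -> {'co2': 0.05}
```

Swapping `scrubber` for an alternate technology (or nesting `cabin` inside a larger loop) changes one node rather than the whole model, which is the tradeoff-study workflow the record describes.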

  10. Optical Design of Segmented Hexagon Array Solar Mirror

    NASA Technical Reports Server (NTRS)

    Huegele, Vince

    2000-01-01

    A segmented array of mirrors was designed for a solar concentrator test stand at MSFC for firing solar thermal propulsion engines. The 144 mirrors each have a spherical surface to approximate a parabolic concentrator when combined into the entire 18-foot diameter array. The mirror segments are aluminum hexagons whose surfaces were diamond turned and quartz coated. The array focuses sunlight reflected from a heliostat to a 4-inch diameter spot containing 10 kW of power at the 15-foot focal point. The derivation of the surface figure for the respective mirror elements is shown. The alignment process of the array is discussed and test results of the system's performance are given.

  11. Lunar Landing Trajectory Design for Onboard Hazard Detection and Avoidance

    NASA Technical Reports Server (NTRS)

    Paschall, Steve; Brady, Tye; Sostaric, Ron

    2009-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing the software and hardware technology needed to support a safe and precise landing for the next generation of lunar missions. ALHAT provides this capability through terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard detection system to select safe landing locations, and an Autonomous Guidance, Navigation, and Control (AGNC) capability to process these measurements and safely direct the vehicle to a landing location. This paper focuses on the key trajectory design issues relevant to providing an onboard Hazard Detection and Avoidance (HDA) capability for the lander. Hazard detection can be accomplished by the crew visually scanning the terrain through a window, a sensor system imaging the terrain, or some combination of both. For ALHAT, this hazard detection activity is provided by a sensor system, which either augments the crew's perception or entirely replaces the crew in the case of a robotic landing. Detecting hazards influences the trajectory design by requiring the proper perspective, range to the landing site, and sufficient time to view the terrain. Following this, the trajectory design must provide additional time to process this information and make a decision about where to safely land. During the final part of the HDA process, the trajectory design must provide sufficient margin to enable a hazard avoidance maneuver. In order to demonstrate the effects of these constraints on the landing trajectory, a tradespace of trajectory designs was created for the initial ALHAT Design Analysis Cycle (ALDAC-1) and each case evaluated with these HDA constraints active. The ALHAT analysis process, described in this paper, narrows down this tradespace and subsequently better defines the trajectory design needed to support onboard HDA. Future ALDACs will enhance this trajectory design by balancing these issues and others in an overall system design process.

  12. Sustainable fuel for the transportation sector

    PubMed Central

    Agrawal, Rakesh; Singh, Navneet R.; Ribeiro, Fabio H.; Delgass, W. Nicholas

    2007-01-01

    A hybrid hydrogen-carbon (H2CAR) process for the production of liquid hydrocarbon fuels is proposed wherein biomass is the carbon source and hydrogen is supplied from carbon-free energy. To implement this concept, a process has been designed to co-feed a biomass gasifier with H2 and CO2 recycled from the H2-CO to liquid conversion reactor. Modeling of this biomass to liquids process has identified several major advantages of the H2CAR process. (i) The land area needed to grow the biomass is <40% of that needed by other routes that solely use biomass to support the entire transportation sector. (ii) Whereas the literature estimates known processes to be able to produce ≈30% of the United States transportation fuel from the annual biomass of 1.366 billion tons, the H2CAR process shows the potential to supply the entire United States transportation sector from that quantity of biomass. (iii) The synthesized liquid provides H2 storage in an open loop system. (iv) Reduction to practice of the H2CAR route has the potential to provide the transportation sector for the foreseeable future, using the existing infrastructure. The rationale of using H2 in the H2CAR process is explained by the significantly higher annualized average solar energy conversion efficiency for hydrogen generation versus that for biomass growth. For coal to liquids, the advantage of H2CAR is that there is no additional CO2 release to the atmosphere due to the replacement of petroleum with coal, thus eliminating the need to sequester CO2. PMID:17360377
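The land-area claims (i) and (ii) in this record are mutually consistent, as a quick check on the abstract's numbers shows:

```python
# Claim (ii): known processes yield ~30% of US transportation fuel from the
# annual 1.366 billion tons of biomass; H2CAR claims the entire sector from
# the same biomass. Claim (i): H2CAR needs <40% of the land of biomass-only routes.
conventional_share = 0.30
h2car_share = 1.00

fuel_per_ton_gain = h2car_share / conventional_share  # fuel yield multiplier
land_fraction = conventional_share / h2car_share      # land needed for equal fuel
print(f"~{fuel_per_ton_gain:.1f}x fuel per ton of biomass; "
      f"~{100 * land_fraction:.0f}% of the land, within the <40% claim")
```

The roughly threefold yield gain comes from supplying hydrogen externally, so the biomass carbon is not consumed to generate process hydrogen.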

  13. Sustainable fuel for the transportation sector.

    PubMed

    Agrawal, Rakesh; Singh, Navneet R; Ribeiro, Fabio H; Delgass, W Nicholas

    2007-03-20

    A hybrid hydrogen-carbon (H(2)CAR) process for the production of liquid hydrocarbon fuels is proposed wherein biomass is the carbon source and hydrogen is supplied from carbon-free energy. To implement this concept, a process has been designed to co-feed a biomass gasifier with H(2) and CO(2) recycled from the H(2)-CO to liquid conversion reactor. Modeling of this biomass to liquids process has identified several major advantages of the H(2)CAR process. (i) The land area needed to grow the biomass is <40% of that needed by other routes that solely use biomass to support the entire transportation sector. (ii) Whereas the literature estimates known processes to be able to produce approximately 30% of the United States transportation fuel from the annual biomass of 1.366 billion tons, the H(2)CAR process shows the potential to supply the entire United States transportation sector from that quantity of biomass. (iii) The synthesized liquid provides H(2) storage in an open loop system. (iv) Reduction to practice of the H(2)CAR route has the potential to provide the transportation sector for the foreseeable future, using the existing infrastructure. The rationale of using H(2) in the H(2)CAR process is explained by the significantly higher annualized average solar energy conversion efficiency for hydrogen generation versus that for biomass growth. For coal to liquids, the advantage of H(2)CAR is that there is no additional CO(2) release to the atmosphere due to the replacement of petroleum with coal, thus eliminating the need to sequester CO(2).

  14. Spoked-ring microcavities: enabling seamless integration of nanophotonics in unmodified advanced CMOS microelectronics chips

    NASA Astrophysics Data System (ADS)

    Wade, Mark T.; Shainline, Jeffrey M.; Orcutt, Jason S.; Ram, Rajeev J.; Stojanovic, Vladimir; Popovic, Milos A.

    2014-03-01

    We present the spoked-ring microcavity, a nanophotonic building block enabling energy-efficient, active photonics in unmodified, advanced CMOS microelectronics processes. The cavity is realized in the IBM 45nm SOI CMOS process - the same process used to make many commercially available microprocessors, including the IBM Power7 and Sony PlayStation 3 processors. In advanced SOI CMOS processes, no partial etch steps and no vertical junctions are available, which limits the types of optical cavities that can be used for active nanophotonics. To enable efficient active devices with no process modifications, we designed a novel spoked-ring microcavity which is fully compatible with the constraints of the process. As a modulator, the device leverages the sub-100nm lithography resolution of the process to create radially extending p-n junctions, providing high optical fill factor depletion-mode modulation and thereby eliminating the need for a vertical junction. The device is made entirely in the transistor active layer, low-loss crystalline silicon, which eliminates the need for the partial etch commonly used to create ridge cavities. In this work, we present the full optical and electrical design of the cavity, including rigorous mode solver and FDTD simulations to design the Q-limiting electrical contacts and the coupling/excitation. We address the layout of active photonics within the mask set of a standard advanced CMOS process and show that high-performance photonic devices can be seamlessly monolithically integrated alongside electronics on the same chip. The present designs enable monolithically integrated optoelectronic transceivers on a single advanced CMOS chip, without requiring any process changes, enabling the penetration of photonics into the microprocessor.

  15. Design of a control system for the LECR3

    NASA Astrophysics Data System (ADS)

    Zhou, Wen-Xiong; Wang, Yan-Yu; Zhou, De-Tai; Lin, Fu-Yuan; Luo, Jin-Fu; Yu, Yan-Juan; Feng, Yu-Cheng; Lu, Wang

    2013-11-01

    The Lanzhou Electron Cyclotron Resonance Ion Source No. 3 (LECR3) plays an important role in supplying many kinds of ion beams to the Heavy Ion Research Facility in Lanzhou (HIRFL). In this paper, we provide a detailed description of a new remote control system that we designed and implemented for the LECR3. This system uses a typical distributed control architecture for both the LECR3 and the newly-built Lanzhou All Permanent Magnet ECR Ion Source No. 1 (LAPECR1). The entire project, including the construction of the hardware and the software, was completed in September 2012. The hardware consists of an industrial PC (IPC), an intranet built around a switch, and various controllers with Ethernet access. The software is written in C++ and controls all of the respective equipment through the intranet, storing the relevant information in a database for later analysis. The entire system can efficiently acquire the necessary data from the respective equipment three times per second and store the data in the database. The system can also complete the interlock protection and alarm process within one second.
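The acquisition scheme described in this abstract (poll each Ethernet-attached controller at a fixed rate, persist the readings, and check interlock/alarm conditions) can be sketched as follows. This is a minimal illustration in Python, not the actual C++ LECR3 software; the controller names, thresholds, and in-memory "database" are invented for the example:

```python
import time

POLL_RATE_HZ = 3          # the abstract reports three acquisitions per second

class Controller:
    """Stand-in for an Ethernet-attached device controller (hypothetical)."""
    def __init__(self, name, alarm_threshold):
        self.name = name
        self.alarm_threshold = alarm_threshold
        self._value = 0.0

    def read(self):
        # A real controller would answer a network request; we fake a drift.
        self._value += 0.1
        return self._value

def acquire(controllers, database, cycles):
    """Poll every controller once per cycle, persist readings, flag alarms."""
    alarms = []
    for _ in range(cycles):
        stamp = time.time()
        for c in controllers:
            value = c.read()
            database.append((stamp, c.name, value))
            if value > c.alarm_threshold:   # interlock/alarm check
                alarms.append((c.name, value))
    return alarms

db = []                   # stands in for the relational database
devices = [Controller("magnet_psu", 0.35), Controller("vacuum_gauge", 99.0)]
alarms = acquire(devices, db, cycles=4)
```

In a production system the inner loop would be driven by a timer at `POLL_RATE_HZ` and the database would be an actual server; the separation of polling, persistence, and alarm checking is the point of the sketch.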

  16. Tracking the Time-Dependent Role of the Hippocampus in Memory Recall Using DREADDs.

    PubMed

    Varela, Carmen; Weiss, Sarah; Meyer, Retsina; Halassa, Michael; Biedenkapp, Joseph; Wilson, Matthew A; Goosens, Ki Ann; Bendor, Daniel

    2016-01-01

    The hippocampus is critical for the storage of new autobiographical experiences as memories. Following an initial encoding stage in the hippocampus, memories undergo a process of systems-level consolidation, which leads to greater stability through time and an increased reliance on neocortical areas for retrieval. The extent to which the retrieval of these consolidated memories still requires the hippocampus is unclear, as both spared and severely degraded remote memory recall have been reported following post-training hippocampal lesions. One difficulty in definitively addressing the role of the hippocampus in remote memory retrieval is the precision with which the entire volume of the hippocampal region can be inactivated. To address this issue, we used Designer Receptors Exclusively Activated by Designer Drugs (DREADDs), a chemical-genetic tool capable of highly specific neuronal manipulation over large volumes of brain tissue. We find that remote (>7 weeks after acquisition), but not recent (1-2 days after acquisition) contextual fear memories can be recalled after injection of the DREADD agonist (CNO) in animals expressing the inhibitory DREADD in the entire hippocampus. Our data demonstrate a time-dependent role of the hippocampus in memory retrieval, supporting the standard model of systems consolidation.

  17. Deployable bamboo structure project: A building life-cycle report

    NASA Astrophysics Data System (ADS)

    Firdaus, Adrian; Prastyatama, Budianastas; Sagara, Altho; Wirabuana, Revian N.

    2017-11-01

    Bamboo is considered a sustainable construction material, and it is widely available in Indonesia. While the material is used increasingly often, its use in deployable structures, a recently developed application of bamboo, remains largely untapped. This paper presents a report on a deployable bamboo structure project, covering the entire building life-cycle: design; fabrication; transportation; construction; operation and maintenance; as well as a plan for future re-use. The building is a configuration of structural modules, each a folding set of bars that can be reduced in size to fit into vehicles for easy transportation. Each structural module was made of Gigantochloa apus bamboo. The fabrication, transportation, and construction phases require a minimum of three workers. The fabrication and construction phases require three hours and fifteen minutes, respectively. The building is used as cafeteria stands; the operation and maintenance phase has been underway since early March 2017. Maintenance is scheduled on a monthly basis, focusing on inspection of the locking mechanism elements and the overall structural integrity. The building is designed for disassembly so that it can be reused in the future.

  18. Tracking the Time-Dependent Role of the Hippocampus in Memory Recall Using DREADDs

    PubMed Central

    Varela, Carmen; Weiss, Sarah; Meyer, Retsina; Halassa, Michael; Biedenkapp, Joseph; Wilson, Matthew A.; Goosens, Ki Ann

    2016-01-01

    The hippocampus is critical for the storage of new autobiographical experiences as memories. Following an initial encoding stage in the hippocampus, memories undergo a process of systems-level consolidation, which leads to greater stability through time and an increased reliance on neocortical areas for retrieval. The extent to which the retrieval of these consolidated memories still requires the hippocampus is unclear, as both spared and severely degraded remote memory recall have been reported following post-training hippocampal lesions. One difficulty in definitively addressing the role of the hippocampus in remote memory retrieval is the precision with which the entire volume of the hippocampal region can be inactivated. To address this issue, we used Designer Receptors Exclusively Activated by Designer Drugs (DREADDs), a chemical-genetic tool capable of highly specific neuronal manipulation over large volumes of brain tissue. We find that remote (>7 weeks after acquisition), but not recent (1–2 days after acquisition) contextual fear memories can be recalled after injection of the DREADD agonist (CNO) in animals expressing the inhibitory DREADD in the entire hippocampus. Our data demonstrate a time-dependent role of the hippocampus in memory retrieval, supporting the standard model of systems consolidation. PMID:27145133

  19. Viper cabin-fuselage structural design concept with engine installation and wing structural design

    NASA Technical Reports Server (NTRS)

    Marchesseault, B.; Carr, D.; Mccorkle, T.; Stevens, C.; Turner, D.

    1993-01-01

    This report describes the process and considerations in designing the cabin, nose, drive shaft, and wing assemblies for the 'Viper' concept aircraft. Interfaces of these assemblies, as well as interfaces with the sections of the aircraft aft of the cabin, are also discussed. The results of the design process are included. The goal of this project is to provide a structural design which complies with FAR 23 requirements regarding occupant safety, emergency landing loads, and maneuvering loads. The design must also address the interfaces of the various systems in the cabin, nose, and wing, including the drive shaft, venting, vacuum, electrical, fuel, and control systems. Interfaces between the cabin assembly and the wing carrythrough and empennage assemblies were required, as well. In the design of the wing assemblies, consistency with the existing cabin design was required. The major areas considered in this report are materials and construction, loading, maintenance, environmental considerations, wing assembly fatigue, and weight. The first three areas are developed separately for the nose, cabin, drive shaft, and wing assemblies, while the last three are discussed for the entire design. For each assembly, loading calculations were performed to determine the proper sizing of major load carrying components. Table 1.0 lists the resulting margins of safety for these key components, along with the types of the loads involved, and the page number upon which they are discussed.

  20. MAGMA: analysis of two-channel microarrays made easy.

    PubMed

    Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph

    2007-07-01

    The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user-friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow, consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R scripts that document all of MAGMA's data processing steps, thereby allowing the user to regenerate all results in a local R installation. The implementation of MAGMA follows the model-view-controller design pattern, which strictly separates the R-based statistical data processing, the web representation and the application logic. This modular design makes the application flexible and easily extendible by experts in any of the fields involved: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to process user input. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields, and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.
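The model-view-controller separation this abstract describes can be illustrated with a minimal sketch; the class methods and the toy mean-centering "normalization" below are invented for illustration and are not MAGMA's actual R/Java implementation:

```python
class Model:
    """Statistical data processing (MAGMA delegates this layer to R scripts)."""
    def normalize(self, values):
        mean = sum(values) / len(values)
        return [v - mean for v in values]

class View:
    """Web representation: renders results, holds no processing logic."""
    def render(self, values):
        return ", ".join(f"{v:.1f}" for v in values)

class Controller:
    """Application logic: wires user input to model and view."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_upload(self, raw):
        return self.view.render(self.model.normalize(raw))

app = Controller(Model(), View())
page = app.handle_upload([1.0, 2.0, 3.0])   # → "-1.0, 0.0, 1.0"
```

Because each layer only talks to the others through these narrow interfaces, one layer (say, the statistics engine) can be replaced or extended without touching the other two, which is the flexibility claim made in the abstract.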

  1. Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement

    NASA Technical Reports Server (NTRS)

    Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.

    2010-01-01

    The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain the high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in data analysis post-processing are also presented. Several challenges of extremely fast, high-volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring channel data in real time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.

  2. Design of 6 MeV X-band electron linac for dual-head gantry radiotherapy system

    NASA Astrophysics Data System (ADS)

    Shin, Seung-wook; Lee, Seung-Hyun; Lee, Jong-Chul; Kim, Huisu; Ha, Donghyup; Ghergherehchi, Mitra; Chai, Jongseo; Lee, Byung-no; Chae, Moonsik

    2017-12-01

    A compact 6 MeV electron linac is being developed at Sungkyunkwan University, in collaboration with the Korea Atomic Energy Research Institute (KAERI). The linac will be used as an X-ray source for a dual-head gantry radiotherapy system. X-band technology has been employed to satisfy the size requirement of the dual-head gantry radiotherapy machine. Among the several options available, we selected a π/2-mode, standing-wave, side-coupled cavity. This choice of radiofrequency (RF) cavity design is intended to enhance the shunt impedance of each cavity in the linac. An optimum structure for the RF cavity with high performance was determined by applying a genetic algorithm during the optimization procedure. This paper describes the detailed design process for a single normal RF cavity and for the entire structure, including the RF power coupler and coupling cavity, as well as the beam dynamics results.
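The genetic-algorithm optimization mentioned above can be shown in miniature. The sketch below evolves a single "gap" parameter against a made-up, one-dimensional shunt-impedance objective; the real cavity optimization involves many geometric parameters and electromagnetic field simulation:

```python
import random

random.seed(1)  # deterministic run for this illustration

def shunt_impedance(gap):
    """Toy figure of merit with a single optimum at gap = 0.7 (made up)."""
    return -(gap - 0.7) ** 2

def evolve(generations=60, pop_size=20, sigma=0.05):
    """Truncation selection plus Gaussian mutation, with elitism."""
    pop = [random.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=shunt_impedance, reverse=True)
        parents = pop[: pop_size // 2]
        children = [p + random.gauss(0.0, sigma) for p in parents]
        pop = parents + children    # parents survive: the best never regresses
    return max(pop, key=shunt_impedance)

best_gap = evolve()    # converges toward the optimum at 0.7
```

In the actual design workflow the objective function would be an RF solver evaluating shunt impedance for a candidate cavity geometry, which is why a derivative-free method like a genetic algorithm is attractive.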

  3. Expanding uses of building information modeling in life-cycle construction projects.

    PubMed

    Hannele, Kerosuo; Reijo, Miettinen; Tarja, Mäki; Sami, Paavola; Jenni, Korpela; Teija, Rantala

    2012-01-01

    BIM is targeted at providing information about the entire building and a complete set of design documents and data stored in an integrated database. In this paper, we study the use of BIM in two life-cycle construction projects in Kuopio, Finland during 2011. The analysis of the uses of BIM and their main problems will constitute a foundation for an intervention. We focus on the following questions: (1) How do different partners use the composite BIM model? (2) What are the major contradictions or problems in BIM use? The preliminary findings reported in this study show that BIM has been quite widely adopted for design use, but the old ways of collaboration seem to prevail, especially between designers and between designers and building sites. BIM has provided new means of and demands for collaboration, but expansive uses of BIM to create new interactive processes across professional fields have largely not been realized.

  4. Design of structure and simulation of the three-zone gasifier of dense layer of the inverted process

    NASA Astrophysics Data System (ADS)

    Zagrutdinov, R. Sh; Negutorov, V. N.; Maliykhin, D. G.; Nikishanin, M. S.; Senachin, P. K.

    2017-11-01

    Experts at LLC “New Energy Technologies” have developed gasifier designs implementing the three-zone gasification method that satisfy the following conditions: 1) the generated gas must be free of tar, soot and hydrocarbons, with a specified CO/H2 ratio; 2) the gasifier must accept as fuel a wide range of low-grade, low-value solid fuels, including biomass and various kinds of carbonaceous waste; 3) it must be highly reliable in operation, require no highly qualified operating personnel, be relatively inexpensive to produce, and use a steam-air blast instead of an expensive steam-oxygen one; 4) the line of standard sizes should be sufficiently wide (single-unit fuel capacity from 1 to 50-70 MW). Two models of pressurized gas generators implementing the inverted gasification process with three combustion zones have been adopted for design: 1) a gas generator with a remote combustion chamber, type GOP-VKS (two-block version), and 2) a gas generator with a common combustion chamber, type GOP-OK (single-block version), which is a nearly ideal model for increasing unit capacity. Various schemes have been worked out for the preparation of briquettes from practically the entire spectrum of low-grade fuel: high-ash and high-moisture coals, peat and biomass, including all types of waste (municipal solid waste and residues from crop production, livestock, poultry, etc.). The gas generators gasify cylindrical briquettes with a diameter of 20-25 mm and a length of 25-35 mm.
A mathematical model and computer code have been developed for numerical simulation of the synthesis gas generation processes in a dense-layer gasifier of the inverted process under a steam-air blast. The model includes: continuity equations for the 8 gas-phase components and for the solid phase; a heat balance equation for the entire heterogeneous system; the Darcy law equation (for porous media); equations of state for the 8 gas-phase components; equations for the rates of 3 gas-phase and 4 heterogeneous reactions; the macrokinetic law of coke combustion; and other equations and boundary conditions.
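A drastically reduced version of such a model (a single heterogeneous oxidation step with an Arrhenius rate, integrated by explicit Euler) can be sketched as follows; the rate constants, temperature, and concentrations are arbitrary placeholders, not the authors' calibrated kinetics:

```python
import math

R = 8.314                 # J/(mol*K), universal gas constant
A, EA = 1.0e4, 8.0e4      # placeholder Arrhenius prefactor and activation energy

def rate_constant(temp_k):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-EA / (R * temp_k))

def simulate(temp_k=1200.0, dt=1.0e-3, steps=5000):
    """Explicit Euler for d[O2]/dt = -k[O2], d[CO2]/dt = +k[O2] (C in excess)."""
    o2, co2 = 1.0, 0.0    # mol/m^3, initial concentrations
    k = rate_constant(temp_k)
    for _ in range(steps):
        r = k * o2 * dt   # moles of O2 consumed this step
        o2 -= r
        co2 += r
    return o2, co2

o2, co2 = simulate()      # oxygen nearly exhausted; o2 + co2 stays at 1.0
```

The full model couples eight such species balances to the solid-phase balance, the heat balance, and Darcy flow, but the structure of each balance is the same: a transport term plus Arrhenius-governed reaction sources and sinks.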

  5. Design and practice of a comprehensively functional integrated management information system for major construction

    NASA Astrophysics Data System (ADS)

    Liu, Yuling; Wang, Xiaoping; Zhu, Yuhui; Fei, Lanlan

    2017-08-01

    This paper introduces a Comprehensively Functional Integrated Management Information System designed for the optical engineering major by the College of Optical Science and Engineering, Zhejiang University, which combines the functions of teaching, student learning, educational assessment and management. The system consists of 5 modules: major overview, online curriculum, experiment teaching management, graduation project management and teaching quality feedback. The major overview module introduces the development history, training program, curricula, experiment syllabus and teaching achievements of the optical engineering major at Zhejiang University. The system lets students learn in a mobile, personalized way. The online curriculum module makes it very easy for teachers to set up a website for a new course. On the website, teachers can promptly answer students' questions about the course and collect their homework online. The experiment teaching management module and the graduation project management module enable students to complete their experiments and graduation theses with the help of their supervisors. Before students take an experiment in the lab, they must pass the pre-experiment quiz in the corresponding module. After the experiment, students submit the experiment report to the web server. Moreover, the module contains video recordings of the experiment process, which are very helpful for improving the effectiveness of experiment teaching. The management of the entire graduation project process, including project selection, mid-term inspection, biweekly progress reports and the final thesis, is handled by the graduation project management module.
The teaching quality feedback module helps teachers judge whether a course is achieving its intended educational effect, and helps the administrators of the college judge whether the syllabus design is reasonable. The Management Information System shifts the object of management from education results to the entire education process, and it improves the efficiency of management. It provides an effective method to promote curriculum construction management through supervision and evaluation, which improves students' learning outcomes and the quality of courses. As a result, it markedly strengthens the education quality system.

  6. Registry in a tube: multiplexed pools of retrievable parts for genetic design space exploration

    PubMed Central

    Woodruff, Lauren B. A.; Gorochowski, Thomas E.; Roehner, Nicholas; Densmore, Douglas; Gordon, D. Benjamin; Nicol, Robert

    2017-01-01

    Abstract Genetic designs can consist of dozens of genes and hundreds of genetic parts. After evaluating a design, it is desirable to implement changes without the cost and burden of starting the construction process from scratch. Here, we report a two-step process where a large design space is divided into deep pools of composite parts, from which individuals are retrieved and assembled to build a final construct. The pools are built via multiplexed assembly and sequenced using next-generation sequencing. Each pool consists of ∼20 Mb of up to 5000 unique and sequence-verified composite parts that are barcoded for retrieval by PCR. This approach is applied to a 16-gene nitrogen fixation pathway, which is broken into pools containing a total of 55 848 composite parts (71.0 Mb). The pools encompass an enormous design space (10^43 possible 23 kb constructs), from which an algorithm-guided 192-member 4.5 Mb library is built. Next, all 10^30 possible genetic circuits based on 10 repressors (NOR/NOT gates) are encoded in pools where each repressor is fused to all permutations of input promoters. These demonstrate that multiplexing can be applied to encompass entire design spaces from which individuals can be accessed and evaluated. PMID:28007941
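The size of such a combinatorial design space is simply the product of the pool sizes, since a construct draws one composite part per pool position; the sketch below uses hypothetical pool counts, not the paper's actual pool structure:

```python
import math

def design_space_size(pool_sizes):
    """A construct draws one composite part from each pool, so the design
    space is the product of the pool sizes."""
    return math.prod(pool_sizes)

# Hypothetical example: 12 construct positions, 5000 part variants per pool.
pools = [5000] * 12
n = design_space_size(pools)
print(f"{n:.2e}")   # on the order of 10^44 combinations
```

This multiplicative growth is why the pools can encode astronomically large design spaces (10^43 constructs in the abstract) while only ~10^4 physical parts per pool need to be synthesized and sequence-verified.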

  7. Application of Optical Coherence Tomography Freeze-Drying Microscopy for Designing Lyophilization Process and Its Impact on Process Efficiency and Product Quality.

    PubMed

    Korang-Yeboah, Maxwell; Srinivasan, Charudharshini; Siddiqui, Akhtar; Awotwe-Otoo, David; Cruz, Celia N; Muhammad, Ashraf

    2018-01-01

    Optical coherence tomography freeze-drying microscopy (OCT-FDM) is a novel technique that allows three-dimensional imaging of a drug product during the entire lyophilization process. OCT-FDM consists of a single-vial freeze dryer (SVFD) fitted with an optical coherence tomography (OCT) imaging system. Unlike the conventional techniques used for predicting the product collapse temperature (Tc), such as modulated differential scanning calorimetry (mDSC) and light transmission freeze-drying microscopy, the OCT-FDM approach seeks to mimic the actual product and process conditions during lyophilization. However, there is limited understanding of the application of this emerging technique to the design of the lyophilization process. In this study, we investigated the suitability of the OCT-FDM technique for designing a lyophilization process. Moreover, we compared the product quality attributes of the resulting lyophilized product manufactured using Tc, a critical process control parameter, as determined by OCT-FDM versus as estimated by mDSC. OCT-FDM analysis revealed the absence of collapse even for the low protein concentration (5 mg/ml) and low solid content formulation (1% w/v) studied. This was confirmed by lab-scale lyophilization. In addition, lyophilization cycles designed using Tc values obtained from OCT-FDM were more efficient, with a higher sublimation rate and mass flux than the conventional cycles, since drying was conducted at a higher shelf temperature. Finally, the quality attributes of the products lyophilized using Tc determined by OCT-FDM and mDSC were similar, and product shrinkage and cracks were observed in all batches of freeze-dried products irrespective of the technique employed to predict Tc.

  8. Design of Accumulators and Liquid/Gas Charging of Single Phase Mechanically Pumped Fluid Loop Heat Rejection Systems

    NASA Technical Reports Server (NTRS)

    Bhandari, Pradeep; Dudik, Brenda; Birur, Gajanana; Karlmann, Paul; Bame, David; Mastropietro, A. J.

    2012-01-01

    For single phase mechanically pumped fluid loops used for thermal control of spacecraft, a gas charged accumulator is typically used to modulate pressures within the loop. This is needed to accommodate changes in the working fluid volume due to changes in the operating temperatures as the spacecraft encounters varying thermal environments during its mission. Overall, the three key requirements on the accumulator to maintain an appropriate pressure range throughout the mission are: accommodation of the volume change of the fluid due to temperature changes, avoidance of pump cavitation and prevention of boiling in the liquid. The sizing and design of such an accumulator requires very careful and accurate accounting of the temperature distribution within each element of the working fluid for the entire range of conditions expected, accurate knowledge of the volume of each fluid element, assessment of the corresponding pressures needed to avoid boiling in the liquid, as well as the pressures needed to avoid cavitation in the pump. The appropriate liquid and accumulator strokes required to accommodate the liquid volume change, as well as the appropriate gas volumes, require proper sizing to ensure that the correct pressure range is maintained during the mission. Additionally, a very careful assessment of the process for charging both the gas side and the liquid side of the accumulator is required to properly position the bellows and pressurize the system to a level commensurate with requirements. To achieve the accurate sizing of the accumulator and the charging of the system, sophisticated Excel-based spreadsheets were developed to rapidly arrive at an accumulator design and the corresponding charging parameters. These spreadsheets have proven to be computationally fast and accurate tools for this purpose. 
This paper will describe the entire process of designing and charging the system, using a case study of the Mars Science Laboratory (MSL) fluid loops, which is en route to Mars for an August 2012 landing.
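The core of such a sizing calculation is an isothermal ideal-gas trade-off between accumulator gas volume and pressure as the liquid expands; a minimal sketch with made-up fluid properties and pressure limits (not the MSL values) might look like:

```python
def liquid_volume_change(v_liquid, beta, t_cold, t_hot):
    """Volume change of the working fluid over the mission temperature range
    (beta is the volumetric thermal expansion coefficient)."""
    return v_liquid * beta * (t_hot - t_cold)

def gas_pressure_after_stroke(p_charge, v_gas, dv):
    """Isothermal ideal gas: a bellows stroke dv compresses the gas charge."""
    return p_charge * v_gas / (v_gas - dv)

# Illustrative numbers only: 10 L loop, beta = 1e-3 /K, 80 K temperature swing.
dv = liquid_volume_change(10.0, 1.0e-3, -40.0, 40.0)   # 0.8 L of expansion
p_hot = gas_pressure_after_stroke(200.0, 4.0, dv)      # kPa, 4 L gas charge

# Sizing checks: stay above the pump cavitation margin and below the
# boiling/structural limit (both limits invented for this sketch).
assert p_hot > 120.0
assert p_hot < 400.0
```

The real spreadsheets track the temperature of each fluid element separately rather than a single bulk expansion, but each case reduces to checks of this form at the hot and cold extremes of the mission.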

  9. Reference Architecture Model Enabling Standards Interoperability.

    PubMed

    Blobel, Bernd

    2017-01-01

    Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential and even defining part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.

  10. Design, evaluation, and fabrication of low-cost composite blades for intermediate-size wind turbines

    NASA Technical Reports Server (NTRS)

    Weingart, O.

    1981-01-01

    Low-cost approaches for the production of 60 ft long glass fiber/resin composite rotor blades for the MOD-OA wind turbine were identified and evaluated. The most cost-effective configuration was selected for detailed design. Subelement and subscale specimens were fabricated for testing to confirm the physical and mechanical properties of the composite blade materials, to develop and evaluate blade fabrication techniques and processes, and to confirm the structural adequacy of the root end joint. Full-scale blade tooling was constructed, and a partial blade for tool and process tryout was built. Two full-scale blades were then fabricated and delivered to NASA-LeRC for installation on a MOD-OA wind turbine at Clayton, New Mexico for operational testing. Each blade was 60 ft long, with a 4.5 ft chord at the root end and a weight of 2575 lb including the metal hub adapter. The selected blade configuration was a three-cell design constructed using a resin-impregnated glass fiber tape winding process that allows rapid wrapping of primarily axially oriented fibers onto a tapered mandrel, with tapered wall thickness. The ring winder/transverse filament tape process combination was used for the first time on this program to produce entire rotor blade structures. This approach permitted the complete blade to be wound on stationary mandrels, an improvement which alleviated some of the tooling and process problems encountered on previous composite blade programs.

  11. Lessons from a Space Analog on Adaptation for Long-Duration Exploration Missions.

    PubMed

    Anglin, Katlin M; Kring, Jason P

    2016-04-01

    Exploration missions to asteroids and Mars will bring new challenges associated with communication delays and more autonomy for crews. Mission safety and success will rely on how well the entire system, from technology to the human elements, is adaptable and resilient to disruptive, novel, or potentially catastrophic events. The recent NASA Extreme Environment Missions Operations (NEEMO) 20 mission highlighted this need and produced valuable "lessons learned" that will inform future research on team adaptation and resilience. A team of NASA, industry, and academic members used an iterative process to design a tripod-shaped structure, called the CORAL Tower, for two astronauts to assemble underwater with minimal tools. The team also developed assembly procedures, administered training to the crew, and provided support during the mission. During the design, training, and assembly of the Tower, the team learned first-hand how adaptation in extreme environments depends on incremental testing, thorough procedures and contingency plans that predict possible failure scenarios, and effective team adaptation and resiliency for the crew and support personnel. Findings from NEEMO 20 provide direction on the design and testing process for future space systems and crews to maximize adaptation. This experience also underscored the need for more research on team adaptation, particularly how input and process factors affect adaptation outcomes, the team adaptation iterative process, and new ways to measure the adaptation process.

  12. High temperature aircraft research furnace facilities

    NASA Technical Reports Server (NTRS)

    Smith, James E., Jr.; Cashon, John L.

    1992-01-01

    Focus is on the design, fabrication, and development of the High Temperature Aircraft Research Furnace Facilities (HTARFF). The HTARFF was developed to process electrically conductive materials with high melting points in a low gravity environment. The basic principle of operation is to accurately translate a high temperature arc-plasma gas front as it orbits around a cylindrical sample, thereby making it possible to precisely traverse the entire surface of a sample. The furnace utilizes the gas-tungsten-arc-welding (GTAW) process, also commonly referred to as Tungsten-Inert-Gas (TIG). The HTARFF was developed to further research efforts in the areas of directional solidification, float-zone processing, welding in a low-gravity environment, and segregation effects in metals. The furnace is intended for use aboard the NASA-JSC Reduced Gravity Program KC-135A Aircraft.

  13. Space processing applications rocket project. SPAR 8

    NASA Technical Reports Server (NTRS)

    Chassay, R. P. (Editor)

    1984-01-01

    The Space Processing Applications Rocket Project (SPAR) VIII Final Report contains the engineering report prepared at the Marshall Space Flight Center (MSFC) as well as the three reports from the principal investigators. These reports also describe pertinent portions of ground-based research leading to the ultimate selection of the flight sample composition, including design, fabrication, and testing, all of which are expected to contribute immeasurably to an improved comprehension of materials processing in space. This technical memorandum is directed entirely to the payload manifest flown in the eighth of a series of SPAR flights conducted at the White Sands Missile Range (WSMR) and includes the experiments entitled Glass Formation Experiment SPAR 74-42/1R, Glass Fining Experiment in Low-Gravity SPAR 77-13/1, and Dynamics of Liquid Bubbles SPAR Experiment 77-18/2.

  14. Tesla: An application for real-time data analysis in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Aaij, R.; Amato, S.; Anderlini, L.; Benson, S.; Cattaneo, M.; Clemencic, M.; Couturier, B.; Frank, M.; Gligorov, V. V.; Head, T.; Jones, C.; Komarov, I.; Lupton, O.; Matev, R.; Raven, G.; Sciascia, B.; Skwarnicki, T.; Spradlin, P.; Stahl, S.; Storaci, B.; Vesterinen, M.

    2016-11-01

    Upgrades to the LHCb computing infrastructure in the first long shutdown of the LHC have allowed high-quality decay information to be calculated by the software trigger, making a separate offline event reconstruction unnecessary. Furthermore, the storage space of the triggered candidate is an order of magnitude smaller than that of the entire raw event that would otherwise need to be persisted. Tesla is an application designed to process the information calculated by the trigger, with the resulting output used directly to perform physics measurements.

  15. Automatic knowledge extraction from chemical structures: the case of mutagenicity prediction.

    PubMed

    Ferrari, T; Cattaneo, D; Gini, G; Golbamaki Bakhtyari, N; Manganaro, A; Benfenati, E

    2013-01-01

    This work proposes a new structure-activity relationship (SAR) approach to mine molecular fragments that act as structural alerts for biological activity. The entire process is designed to fit with human reasoning, not only to make the predictions more reliable but also to permit clear control by the user in order to meet customized requirements. This approach has been tested on the mutagenicity endpoint, showing marked prediction skills and, more interestingly, bringing to the surface much of the knowledge already collected in the literature as well as new evidence.

  16. Tracking data in the office environment.

    PubMed

    Erickson, Ty B

    2010-09-01

    Data tracking in the office setting focuses on a narrow spectrum of the entire patient safety arena; however, when properly executed, data tracking increases staff members' awareness of the importance of patient safety. Data tracking is also a high-volume event and thereby continues to loop back on the consciousness of providers in all aspects of their practice. Improvement in data tracking will improve collateral areas of patient safety such as proper medication usage, legibility of written communication, effective delegation of patient safety initiatives, and a collegial effort at developing teams for safety design processes.

  17. CENTRIFUGAL CASTING MACHINE

    DOEpatents

    Shuck, A.B.

    1958-04-01

    A device is described that is specifically designed to cast uranium fuel rods in a vacuum, in order to obtain flawless, nonoxidized castings which subsequently require a minimum of machining or wastage of the expensive processed material. A chamber surrounded with heating elements is connected to the molds, and the entire apparatus is housed in an airtight container. A charge of uranium is placed in the chamber and heated, and then allowed to flow into the molds while being rotated. Water circulating through passages in the molds chills the casting to form a fine-grained fuel rod in nearly finished form.

  18. Attachment of lead wires to thin film thermocouples mounted on high temperature materials using the parallel gap welding process

    NASA Technical Reports Server (NTRS)

    Holanda, Raymond; Kim, Walter S.; Pencil, Eric; Groth, Mary; Danzey, Gerald A.

    1990-01-01

    Parallel gap resistance welding was used to attach lead wires to sputtered thin film sensors. Ranges of optimum welding parameters to produce an acceptable weld were determined. The thin film sensors were Pt13Rh/Pt thermocouples; they were mounted on substrates of MCrAlY-coated superalloys, aluminum oxide, silicon carbide and silicon nitride. The entire sensor system is designed to be used on aircraft engine parts. These sensor systems, including the thin-film-to-lead-wire connectors, were tested to 1000 C.

  19. [Electronic data processing-assisted bookkeeping and accounting system at the Düsseldorf Institute of Forensic Medicine].

    PubMed

    Bonte, W; Bonte, I

    1989-01-01

    In 1985 we reported on the usefulness of a simple home computer (here: a Commodore C 64) for scientific work. This paper demonstrates that such an instrument can also be an appropriate tool for the entire accountancy of a medicolegal institute. Self-designed programs are presented which deal with the following matters: compilation of monthly performance reports, calculation of services for clinical care, typing of analytical results and brief interpretations, typing of liquidations, and clearing of proceeds from written expertises and autopsies against administration and staff.

  20. Exploration of a Capability-Focused Aerospace System of Systems Architecture Alternative with Bilayer Design Space, Based on RST-SOM Algorithmic Methods

    PubMed Central

    Li, Zhifei; Qin, Dongliang

    2014-01-01

    In defense-related programs, the use of capability-based analysis, design, and acquisition has been significant. In order to confront one of the most challenging features of capability-based analysis (CBA), a huge design space, a literature review of design space exploration was first conducted. Then, in the process of an aerospace system of systems design space exploration, a bilayer mapping method was put forward, based on existing experimental and operating data. Finally, the feasibility of the foregoing approach was demonstrated with an illustrative example. With the data mining RST (rough sets theory) and SOM (self-organizing map) techniques, the aerospace system of systems architecture alternative was mapped from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation. PMID:24790572

  1. Exploration of a capability-focused aerospace system of systems architecture alternative with bilayer design space, based on RST-SOM algorithmic methods.

    PubMed

    Li, Zhifei; Qin, Dongliang; Yang, Feng

    2014-01-01

    In defense-related programs, the use of capability-based analysis, design, and acquisition has been significant. In order to confront one of the most challenging features of capability-based analysis (CBA), a huge design space, a literature review of design space exploration was first conducted. Then, in the process of an aerospace system of systems design space exploration, a bilayer mapping method was put forward, based on existing experimental and operating data. Finally, the feasibility of the foregoing approach was demonstrated with an illustrative example. With the data mining RST (rough sets theory) and SOM (self-organizing map) techniques, the aerospace system of systems architecture alternative was mapped from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation.
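    The P-space-to-C-space step in this record rests on the self-organizing map. As an illustrative sketch only (not the authors' implementation; grid size, learning schedule, and data are invented), a minimal one-dimensional SOM in Python shows the core competitive-learning loop that maps high-dimensional samples onto map units:

```python
import math
import random

def train_som(data, grid_size=4, epochs=200, lr0=0.5, seed=0):
    """Minimal 1-D self-organizing map: maps n-dim samples onto
    grid_size units -- a toy analogue of the P-space -> C-space mapping."""
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(grid_size)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)               # decaying learning rate
        radius = max(1.0, grid_size / 2 * (1 - epoch / epochs))
        for x in data:
            # best matching unit: closest weight vector (squared distance)
            bmu = min(range(grid_size),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            # pull the BMU and its neighbours toward the sample
            for i in range(grid_size):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

def map_sample(weights, x):
    """Return the index of the unit a sample maps onto."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
```

    After training on two well-separated clusters, samples from different clusters land on different map units, which is the property the bilayer mapping exploits.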

  2. Design and Implementation of an Intrinsically Safe Liquid-Level Sensor Using Coaxial Cable

    PubMed Central

    Jin, Baoquan; Liu, Xin; Bai, Qing; Wang, Dong; Wang, Yu

    2015-01-01

    Real-time detection of liquid level in complex environments has always been a knotty issue. In this paper, an intrinsically safe liquid-level sensor system for flammable and explosive environments is designed and implemented. The poly vinyl chloride (PVC) coaxial cable is chosen as the sensing element and the measuring mechanism is analyzed. Then, the capacitance-to-voltage conversion circuit is designed and the expected output signal is achieved by adopting parameter optimization. Furthermore, the experimental platform of the liquid-level sensor system is constructed, which involves the entire process of measuring, converting, filtering, processing, visualizing and communicating. Additionally, the system is designed with characteristics of intrinsic safety by limiting the energy of the circuit to avoid or restrain the thermal effects and sparks. Finally, the approach of the piecewise linearization is adopted in order to improve the measuring accuracy by matching the appropriate calibration points. The test results demonstrate that over the measurement range of 1.0 m, the maximum nonlinearity error is 0.8% full-scale span (FSS), the maximum repeatability error is 0.5% FSS, and the maximum hysteresis error is reduced from 0.7% FSS to 0.5% FSS by applying software compensation algorithms. PMID:26029949

  3. Design and implementation of an intrinsically safe liquid-level sensor using coaxial cable.

    PubMed

    Jin, Baoquan; Liu, Xin; Bai, Qing; Wang, Dong; Wang, Yu

    2015-05-28

    Real-time detection of liquid level in complex environments has always been a knotty issue. In this paper, an intrinsically safe liquid-level sensor system for flammable and explosive environments is designed and implemented. The poly vinyl chloride (PVC) coaxial cable is chosen as the sensing element and the measuring mechanism is analyzed. Then, the capacitance-to-voltage conversion circuit is designed and the expected output signal is achieved by adopting parameter optimization. Furthermore, the experimental platform of the liquid-level sensor system is constructed, which involves the entire process of measuring, converting, filtering, processing, visualizing and communicating. Additionally, the system is designed with characteristics of intrinsic safety by limiting the energy of the circuit to avoid or restrain the thermal effects and sparks. Finally, the approach of the piecewise linearization is adopted in order to improve the measuring accuracy by matching the appropriate calibration points. The test results demonstrate that over the measurement range of 1.0 m, the maximum nonlinearity error is 0.8% full-scale span (FSS), the maximum repeatability error is 0.5% FSS, and the maximum hysteresis error is reduced from 0.7% FSS to 0.5% FSS by applying software compensation algorithms.
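    The piecewise-linearization step described in this record amounts to interpolating between matched calibration points. A hedged sketch of the idea, not the authors' code (the calibration pairs and function names are hypothetical):

```python
import bisect

def make_piecewise_calibration(points):
    """points: list of (voltage, level_m) calibration pairs.
    Returns a function mapping a measured voltage to liquid level by
    linear interpolation between the nearest calibration points."""
    points = sorted(points)
    xs = [p[0] for p in points]

    def level(v):
        if v <= xs[0]:               # clamp below the first point
            return points[0][1]
        if v >= xs[-1]:              # clamp above the last point
            return points[-1][1]
        i = bisect.bisect_right(xs, v)
        (x0, y0), (x1, y1) = points[i - 1], points[i]
        return y0 + (y1 - y0) * (v - x0) / (x1 - x0)

    return level
```

    Choosing denser calibration points where the sensor's capacitance-to-voltage curve bends most is what lets the piecewise model beat a single global line.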

  4. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

    Inverters are widely used for industrial, office, and residential purposes. The inverter supports the development of alternative energy sources such as solar cells, wind turbines, and fuel cells by converting DC voltage to AC voltage. Inverters have been built with a variety of hardware and software combinations, such as pure analog circuits and various types of microcontrollers as controllers. When a pure analog circuit is used, modification is difficult because it requires changing the entire hardware. In a microcontroller-based inverter design (with software), the calculations that generate the AC modulation are performed in the microcontroller. This increases programming complexity and the amount of code downloaded to the microcontroller chip (flash memory capacity in the microcontroller is limited). This paper discusses the design of a single-phase inverter using unipolar modulation of a sine wave and a triangular wave, computed outside the microcontroller using a data processing application (Microsoft Excel). Results show that programming complexity was reduced and that the sampling resolution strongly influences THD; a sampling resolution of half a degree was required to obtain the best THD (15.8%).
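    The unipolar sine-triangle comparison described above can be precomputed offline, as the paper does in a spreadsheet. A hypothetical Python sketch of building such a switching table (the frequencies, amplitude, and table length are assumed for illustration, not taken from the paper):

```python
import math

def unipolar_spwm_table(f_out=50.0, f_carrier=2000.0, amp=0.8, n=400):
    """Precompute unipolar SPWM gate states by comparing a sine
    reference (and its inverse) against one triangular carrier."""
    table = []
    for k in range(n):
        t = k / (n * f_out)                 # one fundamental period, n samples
        ref = amp * math.sin(2 * math.pi * f_out * t)
        # triangular carrier in [-1, 1]
        phase = (t * f_carrier) % 1.0
        tri = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
        leg_a = 1 if ref > tri else 0       # leg A compares +ref to carrier
        leg_b = 1 if -ref > tri else 0      # leg B compares -ref to carrier
        table.append((leg_a, leg_b))
    return table
```

    The instantaneous output in normalized units is leg_a - leg_b, which takes the three levels -1, 0, +1 characteristic of unipolar modulation; the finished table can then be burned into the microcontroller as a lookup, avoiding on-chip trigonometry.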

  5. Delivering better power: the role of simulation in reducing the environmental impact of aircraft engines.

    PubMed

    Menzies, Kevin

    2014-08-13

    The growth in simulation capability over the past 20 years has led to remarkable changes in the design process for gas turbines. The availability of relatively cheap computational power coupled to improvements in numerical methods and physical modelling in simulation codes have enabled the development of aircraft propulsion systems that are more powerful and yet more efficient than ever before. However, the design challenges are correspondingly greater, especially to reduce environmental impact. The simulation requirements to achieve a reduced environmental impact are described along with the implications of continued growth in available computational power. It is concluded that achieving the environmental goals will demand large-scale multi-disciplinary simulations requiring significantly increased computational power, to enable optimization of the airframe and propulsion system over the entire operational envelope. However even with massive parallelization, the limits imposed by communications latency will constrain the time required to achieve a solution, and therefore the position of such large-scale calculations in the industrial design process. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  6. OPC model generation procedure for different reticle vendors

    NASA Astrophysics Data System (ADS)

    Jost, Andrew M.; Belova, Nadya; Callan, Neal P.

    2003-12-01

    The challenge of delivering acceptable semiconductor products to customers in a timely fashion becomes more difficult as design complexity increases. The requirements of current-generation designs tax OPC engineers more than ever before, since the readiness of high-quality OPC models can delay new process qualifications or lead to respins, which add to the upward-spiraling costs of new reticle sets, extend time-to-market, and disappoint customers. In their efforts to extend the printability of new designs, OPC engineers generally focus on the data-to-wafer path, ignoring data-to-mask effects almost entirely. However, it is unknown whether reticle makers' disparate processes truly yield comparable reticles, even with identical tools. This approach raises the question of whether a single OPC model is applicable to all reticle vendors. LSI Logic has developed a methodology for quantifying vendor-to-vendor reticle manufacturing differences and adapting OPC models for use at several reticle vendors. This approach allows LSI Logic to easily adapt existing OPC models for use with several reticle vendors and obviates the generation of unnecessary models, allowing OPC engineers to focus their efforts on the most critical layers.

  7. Design and development of a bio-inspired, under-actuated soft gripper.

    PubMed

    Hassan, Taimoor; Manti, Mariangela; Passetti, Giovanni; d'Elia, Nicolò; Cianchetti, Matteo; Laschi, Cecilia

    2015-08-01

    The development of robotic devices able to perform manipulation tasks mimicking the human hand has been assessed on a large scale. This work stands in the challenging scenario where soft materials are combined with bio-inspired design in order to develop soft grippers with improved grasping and holding capabilities. We present a low-cost, under-actuated and adaptable soft gripper, highlighting the design and the manufacturing process. In particular, a critical analysis is made among three versions of the gripper with the same design and actuation mechanism but based on different materials. A novel actuation principle has been implemented in all versions, in order to reduce the encumbrance of the entire system and improve its aesthetics. Grasping and holding capabilities have been tested for each device, with target objects varying in shape, size and material. Results highlight the synergy between the geometry and the intrinsic properties of the soft material, showing the way to novel design principles for soft grippers.

  8. Design of low noise imaging system

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Chen, Xiaolai

    2017-10-01

    In order to meet the needs of engineering applications for a low-noise imaging system in global-shutter mode, a complete imaging system is designed based on the SCMOS (Scientific CMOS) image sensor CIS2521F. The paper introduces the hardware circuit and software system design. Based on an analysis of the key indexes and technologies of the imaging system, chips are selected and SCMOS + FPGA + DDRII + Camera Link is adopted as the processing architecture. The entire system workflow and the design of the power supply and distribution unit are then introduced. The software system, which consists of the SCMOS control module, image acquisition module, data cache control module, and transmission control module, is designed in Verilog and runs on a Xilinx FPGA. Imaging experiments show that the system delivers a 2560 x 2160 pixel resolution at a maximum frame rate of 50 fps, and its imaging quality satisfies the index requirements.

  9. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  10. A new method for designing dual foil electron beam forming systems. I. Introduction, concept of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work, existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve the overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor-intensive task, as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry, using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of system performance as a function of the foil parameters. The new method, while being computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality, as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real-life design problem, as described in Part II of this work.
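    The systematic, automated parameter scan described in this record can be sketched in outline. The Monte Carlo transport itself is replaced here by a toy stand-in; the thickness values and figure of merit are invented for illustration only:

```python
import random

def simulate_uniformity(t1, t2, n_particles=2000, seed=1):
    """Toy stand-in for a Monte Carlo transport run: returns a mock
    beam figure of merit (fraction of particles inside the target
    aperture) for a given pair of foil thicknesses."""
    rng = random.Random(seed)
    # toy model: scattering spread grows with the square root of thickness
    spread = (t1 + 0.5 * t2) ** 0.5
    hits = [rng.gauss(0.0, spread) for _ in range(n_particles)]
    inside = sum(1 for h in hits if abs(h) < 1.0)
    return inside / n_particles

def scan_foils(t1_values, t2_values):
    """Systematic scan over foil parameters; returns the best
    (t1, t2, figure_of_merit) found."""
    best = None
    for t1 in t1_values:
        for t2 in t2_values:
            fom = simulate_uniformity(t1, t2)
            if best is None or fom > best[2]:
                best = (t1, t2, fom)
    return best
```

    In the real method each call would be a full Monte Carlo simulation in realistic geometry, and the scan grid would cover the physically admissible foil configurations.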

  11. DISCO: An object-oriented system for music composition and sound design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaper, H. G.; Tipei, S.; Wright, J. M.

    2000-09-05

    This paper describes an object-oriented approach to music composition and sound design. The approach unifies the processes of music making and instrument building by using similar logic, objects, and procedures. The composition modules use an abstract representation of musical data, which can be easily mapped onto different synthesis languages or a traditionally notated score. An abstract base class is used to derive classes on different time scales. Objects can be related to act across time scales, as well as across an entire piece, and relationships between similar objects can replicate traditional music operations or introduce new ones. The DISCO (Digital Instrument for Sonification and Composition) system is an open-ended work in progress.
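    The abstract-base-class design described above can be illustrated with a hypothetical sketch (the class names and language here are invented, not DISCO's actual API): objects on different time scales share one interface, and composite objects render by delegating to their children:

```python
from abc import ABC, abstractmethod

class TimeScaleEvent(ABC):
    """Abstract base for musical objects on any time scale."""
    def __init__(self, start, duration):
        self.start = start
        self.duration = duration

    @abstractmethod
    def render(self):
        """Return a flat list of (start, duration, pitch) tuples."""

class Note(TimeScaleEvent):
    """Smallest time scale: a single note event."""
    def __init__(self, start, duration, pitch):
        super().__init__(start, duration)
        self.pitch = pitch

    def render(self):
        return [(self.start, self.duration, self.pitch)]

class Phrase(TimeScaleEvent):
    """Larger time scale: a composite whose duration spans its children."""
    def __init__(self, start, events):
        dur = max((e.start + e.duration for e in events), default=0) - start
        super().__init__(start, dur)
        self.events = events

    def render(self):
        out = []
        for e in self.events:
            out.extend(e.render())  # delegate rendering to children
        return out
```

    Because a Phrase is itself a TimeScaleEvent, phrases can nest inside sections and sections inside a piece, which is the across-time-scales relationship the abstract describes.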

  12. A critical assessment of in-flight particle state during plasma spraying of YSZ and its implications on coating properties and process reliability

    NASA Astrophysics Data System (ADS)

    Srinivasan, Vasudevan

    Air plasma spray is inherently complex due to its deviation from equilibrium conditions, its three-dimensional nature, the multitude of interrelated (controllable) parameters and (uncontrollable) variables involved, and stochastic variability at different stages. The resultant coatings are complex due to their layered, high-defect-density microstructure. Despite widespread use and commercial success for decades in the earthmoving, automotive, aerospace and power generation industries, plasma spray has not been completely understood, and prime reliance for critical applications such as thermal barrier coatings on gas turbines is yet to be accomplished. This dissertation is aimed at understanding the in-flight particle state of the plasma spray process towards designing coatings and achieving coating reliability with the aid of noncontact in-flight particle and spray stream sensors. Key issues such as the phenomenon of optimum particle injection and the definition of the spray stream using particle state are investigated. A few strategies to modify the microstructure and properties of Yttria Stabilized Zirconia coatings are examined systematically using the framework of process maps. An approach to designing a process window based on design-relevant coating properties is presented. Options to control the process for enhanced reproducibility and reliability are examined, and the resultant variability is evaluated systematically at the different stages in the process. The 3D variability due to differences in plasma characteristics has been critically examined by investigating splats collected from the entire spray footprint.

  13. Magnetically Enhanced Solid-Liquid Separation

    NASA Astrophysics Data System (ADS)

    Rey, C. M.; Keller, K.; Fuchs, B.

    2005-07-01

    DuPont is developing an entirely new method of solid-liquid filtration involving the use of magnetic fields and magnetic field gradients. The new hybrid process, entitled Magnetically Enhanced Solid-Liquid Separation (MESLS), is designed to improve the de-watering kinetics and reduce the residual moisture content of solid particulates mechanically separated from liquid slurries. Gravitation, pressure, temperature, centrifugation, and fluid dynamics have dictated traditional solid-liquid separation for the past 50 years. The introduction of an external field (i.e., the magnetic field) offers the promise of manipulating particle behavior in an entirely new manner, which leads to increased process efficiency. Traditional solid-liquid separation typically consists of two primary steps. The first is a mechanical step in which the solid particulate is separated from the liquid using, e.g., gas pressure through a filter membrane or centrifugation. The second step is a thermal drying process, which is required due to imperfect mechanical separation. The thermal drying process is 100 to 200 times less energy efficient than the mechanical step. Since enormous volumes of materials are processed each year, more efficient mechanical solid-liquid separations can be leveraged into dramatic reductions in overall energy consumption by reducing downstream drying requirements. Using DuPont's MESLS process, initial test results showed four very important effects of the magnetic field on the solid-liquid filtration process: 1) reduction of the time to reach gas breakthrough, 2) less loss of solid into the filtrate, 3) reduction of the (solids) residual moisture content, and 4) acceleration of the de-watering kinetics. These test results and their potential impact on future commercial solid-liquid filtration are discussed. New applications can be found in mining, chemical and bioprocesses.

  14. A novel processing platform for post tape out flows

    NASA Astrophysics Data System (ADS)

    Vu, Hien T.; Kim, Soohong; Word, James; Cai, Lynn Y.

    2018-03-01

    As the computational requirements for post tape out (PTO) flows increase at the 7nm and below technology nodes, there is a need to increase the scalability of the computational tools in order to reduce the turn-around time (TAT) of the flows. Utilization of design hierarchy has been one proven method to provide sufficient partitioning to enable PTO processing. However, as the data is processed through the PTO flow, its effective hierarchy is reduced. The reduction is necessary to achieve the desired accuracy. Also, the sequential nature of the PTO flow is inherently non-scalable. To address these limitations, we are proposing a quasi-hierarchical solution that combines multiple levels of parallelism to increase the scalability of the entire PTO flow. In this paper, we describe the system and present experimental results demonstrating the runtime reduction through scalable processing with thousands of computational cores.
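    The combination of layout partitioning and parallel processing described in this record can be sketched generically. This is not the proposed system's code; the per-tile processing step is a placeholder standing in for an OPC operation:

```python
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    """Hypothetical stand-in for per-partition PTO processing
    (e.g. applying a correction to every shape in the tile)."""
    return [shape * 2 for shape in tile]

def run_pto_flow(layout, n_tiles=4, workers=4):
    """Partition the layout into tiles, process the tiles in
    parallel, then merge the results in their original order --
    a toy sketch of data-level parallelism in a PTO flow."""
    size = max(1, len(layout) // n_tiles)
    tiles = [layout[i:i + size] for i in range(0, len(layout), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process_tile, tiles)  # map preserves tile order
    merged = []
    for r in results:
        merged.extend(r)
    return merged
```

    The quasi-hierarchical idea is to apply this kind of parallelism at several levels at once (cells, tiles, flow stages) rather than relying on design hierarchy alone.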

  15. Modeling of electrohydrodynamic drying process using response surface methodology

    PubMed Central

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-01-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising drying alternative is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model investigating the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM. PMID:24936289
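    The RSM modelling step above amounts to fitting a second-order polynomial in the coded factors by least squares. A generic sketch of that fit (the factor values and response data below are invented, not the paper's):

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Fit a second-order response-surface model
    y = b0 + sum_i(bi*xi) + sum_{i<=j}(bij*xi*xj) by least squares."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    cols = [np.ones(n)]                                  # intercept
    cols += [X[:, i] for i in range(k)]                  # linear terms
    cols += [X[:, i] * X[:, j]                           # quadratic + interaction
             for i in range(k) for j in range(i, k)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
    return beta, A

def predict(beta, a_row):
    """Predicted response for one row of the model matrix."""
    return a_row @ beta
```

    With a Box-Behnken design, the rows of X are the coded design points (three levels per factor), and the fitted coefficients map the entire response surface for subsequent optimization.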

  16. Development of hydrogen peroxide technique for bioburden reduction

    NASA Astrophysics Data System (ADS)

    Rohatgi, N.; Schwartz, L.; Stabekis, P.; Barengoltz, J.

    In order to meet the National Aeronautics and Space Administration (NASA) Planetary Protection microbial reduction requirements for Mars in-situ life detection and sample return missions, entire planetary spacecraft (including planetary entry probes and planetary landing capsules) may have to be exposed to a qualified sterilization process. Presently, dry heat is the only NASA approved sterilization technique available for spacecraft application. However, with the increasing use of various man-made materials, highly sophisticated electronic circuit boards, and sensors in a modern spacecraft, compatibility issues may render this process unacceptable to design engineers and thus impractical to achieve terminal sterilization of the entire spacecraft. An alternative vapor phase hydrogen peroxide sterilization process, which is currently used in various industries, has been selected for further development. Strategic Technology Enterprises, Incorporated (STE), a subsidiary of STERIS Corporation, under a contract from the Jet Propulsion Laboratory (JPL) is developing systems and methodologies to decontaminate spacecraft using vaporized hydrogen peroxide (VHP) technology. The VHP technology provides an effective, rapid and low temperature means for inactivation of spores, mycobacteria, fungi, viruses and other microorganisms. The VHP application is a dry process affording excellent material compatibility with many of the components found in spacecraft such as polymers, paints and electronic systems. Furthermore, the VHP process has innocuous residuals as it decomposes to water vapor and oxygen. This paper will discuss the approach that is being used to develop this technique and will present lethality data that have been collected to establish deep vacuum VHP sterilization cycles. In addition, the application of this technique to meet planetary protection requirements will be addressed.

  17. Single-Vector Calibration of Wind-Tunnel Force Balances

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; DeLoach, R.

    2003-01-01

    An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instruments that provide these measurements on a scale model of an aircraft during wind tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load and have no response to other components of load. This is not entirely possible, even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the necessary data to generate a mathematical model and determine the force measurement accuracy. In order to set the independent variables of applied load for the calibration experiment, a high-precision mechanical system is required. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high-confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in an even more complex system that degrades load application quality.
    The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, where each independent variable is incremented individually throughout its full-scale range while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research in a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process, covering all three major aspects of an experiment: the design of the experiment, the execution of the experiment, and the statistical analysis of the data. In order to overcome the weaknesses in the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
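    The contrast between OFAT and formally designed experiments can be made concrete by enumerating the load points each approach visits. A generic sketch in coded levels (not actual balance loads):

```python
from itertools import product

def ofat_points(levels, k, center=0):
    """One-factor-at-a-time design: vary each of k factors over its
    levels while holding all other factors at the center value."""
    pts = []
    for i in range(k):
        for lv in levels:
            p = [center] * k
            p[i] = lv
            pts.append(tuple(p))
    return pts

def factorial_points(levels, k):
    """Full factorial design: every combination of factor levels,
    so factor interactions can be estimated."""
    return list(product(levels, repeat=k))
```

    The key difference is visible in the point sets: OFAT never applies two factors off-center simultaneously, so interaction effects (such as the balance's channel-to-channel interactions) are never directly excited, whereas a factorial design covers those combined loadings.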

  18. EMG amplifier with wireless data transmission

    NASA Astrophysics Data System (ADS)

    Kowalski, Grzegorz; Wildner, Krzysztof

    2017-08-01

    Wireless medical diagnostics is a trend in modern technology used in medicine. This paper presents a concept of realization, the hardware architecture, and the software implementation of an electromyography (EMG) signal amplifier with wireless data transmission. The amplifier consists of three components: an analogue bioelectric-signal-processing module, a micro-controller circuit, and an application enabling data acquisition on a personal computer. The analogue bioelectric-signal-processing circuit receives electromyography signals from the skin surface, performs initial analogue processing, and prepares the signals for further digital processing. The second module is a micro-controller circuit designed to wirelessly transmit the electromyography signals from the analogue signal converter to a personal computer; its purpose is to eliminate the need for wired connections between the patient and the data-logging device. The third block is a computer application designed to display the transmitted electromyography signals, as well as to capture and analyse the data; its purpose is to provide a graphical representation of the collected data. The entire device has been thoroughly tested to ensure proper functioning. In use, the device displayed the captured electromyography signal from the arm of the patient. Amplitude-frequency characteristics were measured in order to investigate the bandwidth and the overall gain of the device.
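    An amplitude-frequency characterization of the kind described can be sketched as follows. The band edges, midband gain, and first-order filter model below are hypothetical placeholders chosen for illustration, not the parameters of the device in the paper.

```python
import math

def bandpass_gain(f, f_low=20.0, f_high=450.0, g0=1000.0):
    """Gain magnitude of a hypothetical first-order band-pass EMG front end."""
    hp = (f / f_low) / math.sqrt(1 + (f / f_low) ** 2)  # high-pass section
    lp = 1 / math.sqrt(1 + (f / f_high) ** 2)           # low-pass section
    return g0 * hp * lp

# Sweep test frequencies and keep those within -3 dB of the midband gain.
freqs = [5, 10, 20, 50, 100, 200, 450, 1000, 2000]
mid = bandpass_gain(100.0)
passband = [f for f in freqs if bandpass_gain(f) >= mid / math.sqrt(2)]
print(passband)
```

    Sweeping a known test tone through the frequency range and recording the output amplitude at each point is exactly the amplitude-frequency measurement the abstract refers to; the -3 dB points bound the usable bandwidth.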

  19. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activity has centered on the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increases is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes in which control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly in the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements in components, assemblies, and systems such as micro-processors, micro-computers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost-effective and justifiable.

  20. Preliminary design of nine high speed civil transports

    NASA Technical Reports Server (NTRS)

    Sandlin, Doral; Vantriet, Robert; Soban, Dani; Hoang, TY

    1992-01-01

    Sixty senior design students at Cal Poly, SLO have completed a year-long project to design the next generation of High Speed Civil Transports (HSCT). The design process was divided up into three distinct phases. The first third of the project was devoted entirely to research into the special problems associated with an HSCT. These included economic viability, airport compatibility, high speed aerodynamics, sonic boom minimization, environmental impact, and structures and materials. The result of this research was the development of nine separate Requests for Proposal (RFP) that outlined reasonable yet challenging design criteria for the aircraft. All were designed to be technically feasible in the year 2015. The next phase of the project divided the sixty students into nine design groups. Each group, with its own RFP, completed a Class 1 preliminary design of an HSCT. The nine configurations varied from conventional double deltas to variable geometry wings to a pivoting oblique wing design. The final phase of the project included a more detailed Class 2 sizing as well as performance and stability and control analysis. Cal Poly, San Luis Obispo presents nine unique solutions to the same problem: that of designing an economically viable, environmentally acceptable, safe and comfortable supersonic transport.

  1. A brief understanding of process optimisation in microwave-assisted extraction of botanical materials: options and opportunities with chemometric tools.

    PubMed

    Das, Anup Kumar; Mandal, Vivekananda; Mandal, Subhash C

    2014-01-01

    Extraction forms the very first step in research on natural products for drug discovery; a poorly optimised and planned extraction methodology can jeopardise the entire mission. The aim here is to provide a vivid picture of the different chemometric tools and the planning involved in process optimisation and method development for the extraction of botanical material, with emphasis on microwave-assisted extraction (MAE). A review of studies applying chemometric tools in combination with MAE of botanical materials was undertaken in order to identify the significant extraction factors. To optimise a response by fine-tuning those factors, experimental design, or statistical design of experiments (DoE), a core area of chemometrics, was then used for statistical analysis and interpretation. This review presents a brief explanation of the different aspects and methodologies of MAE of botanical materials that have been subjected to experimental design, along with some general chemometric tools and the steps involved in the practice of MAE. A detailed study of the various factors and responses involved in the optimisation is also presented. This article will assist in obtaining better insight into the chemometric strategies of process optimisation and method development, which will in turn improve the decision-making process in selecting influential extraction parameters. Copyright © 2013 John Wiley & Sons, Ltd.

  2. Fracture toughness of ultrashort pulse-bonded fused silica

    NASA Astrophysics Data System (ADS)

    Richter, S.; Naumann, F.; Zimmermann, F.; Tünnermann, A.; Nolte, S.

    2016-02-01

    We determined the bond interface strength of ultrashort pulse laser-welded fused silica for different processing parameters. To this end, we used a high repetition rate ultrashort pulse laser system to inscribe parallel welding lines with a specific V-shaped design into optically contacted fused silica samples. Afterward, we applied a micro-chevron test to measure the fracture toughness and surface energy of the laser-inscribed welding seams. We analyzed the influence of different processing parameters, such as laser repetition rate and line separation, on the fracture toughness and fracture surface energy. When the entire surface is welded, a fracture toughness of 0.71 MPa·m^(1/2), about 90% of that of the pristine bulk material (≈0.8 MPa·m^(1/2)), is obtained.
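    The fracture toughness and fracture surface energy measured in such tests are linked by the standard Irwin relation; for a plane-strain specimen the conversion is (using the textbook values E ≈ 72 GPa and ν ≈ 0.17 for fused silica, which are assumptions here and not values taken from the paper):

```latex
G_c = \frac{K_{Ic}^2\,(1-\nu^2)}{E}, \qquad \gamma \approx \frac{G_c}{2}
```

    With K_Ic = 0.71 MPa·m^(1/2) this gives a critical energy release rate G_c on the order of 7 J/m² for the fully welded interface.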

  3. Physical and mechanical metallurgy of NiAl

    NASA Technical Reports Server (NTRS)

    Noebe, Ronald D.; Bowman, Randy R.; Nathal, Michael V.

    1994-01-01

    Considerable research has been performed on NiAl over the last decade, with an exponential increase in effort occurring over the last few years. This is due to interest in this material for electronic, catalytic, coating and especially high-temperature structural applications. This report uses this wealth of new information to develop a complete description of the properties and processing of NiAl and NiAl-based materials. Emphasis is placed on the controlling fracture and deformation mechanisms of single and polycrystalline NiAl and its alloys over the entire range of temperatures for which data are available. Creep, fatigue, and environmental resistance of this material are discussed. In addition, issues surrounding alloy design, development of NiAl-based composites, and materials processing are addressed.

  4. ROMI-RIP: Rough mill rip-first simulator. Forest Service general technical report (Final)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, R.E.

    1995-07-01

    The ROugh Mill Rip-First Simulator (ROMI-RIP) is a computer software package that simulates the gang-ripping of lumber. ROMI-RIP was designed to closely simulate current machines and industrial practice. This simulator allows the user to perform 'what if' analyses on various gang-rip-first rough mill operations with fixed, floating-outer-blade, and all-movable-blade arbors. ROMI-RIP accepts cutting bills with up to 300 different part sizes. Plots of processed boards are easily viewed or printed. Detailed summaries of processing steps (number of rips and crosscuts) and yields (single boards or entire board files) can also be viewed or printed. ROMI-RIP requires IBM personal computers with 80286 or higher processors.

  5. Atomization and vaporization characteristics of airblast fuel injection inside a venturi tube

    NASA Technical Reports Server (NTRS)

    Sun, H.; Chue, T.-H.; Lai, M.-C.; Tacina, R. R.

    1993-01-01

    This paper describes the experimental and numerical characterization of capillary fuel injection, atomization, dispersion, and vaporization of liquid fuel in a coflowing air stream inside a single venturi tube. The experimental techniques used are all laser-based. A phase Doppler analyzer was used to characterize the atomization and vaporization process. Planar laser-induced fluorescence visualizations give a good qualitative picture of the fuel droplet and vapor distribution. Limited quantitative capabilities of the technique are also demonstrated. A modified version of the KIVA-II code was used to simulate the entire spray process, including breakup and vaporization. The advantage of the venturi nozzle is demonstrated in terms of better atomization, more uniform F/A distribution, and lower pressure drop. Multidimensional spray calculations can be used as a design tool only if care is taken with the breakup model and the wall impingement process.

  6. PatternQuery: web application for fast detection of biomacromolecular structural patterns in the entire Protein Data Bank.

    PubMed

    Sehnal, David; Pravda, Lukáš; Svobodová Vařeková, Radka; Ionescu, Crina-Maria; Koča, Jaroslav

    2015-07-01

    Well defined biomacromolecular patterns such as binding sites, catalytic sites, specific protein or nucleic acid sequences, etc. precisely modulate many important biological phenomena. We introduce PatternQuery, a web-based application designed for detection and fast extraction of such patterns. The application uses a unique query language with Python-like syntax to define the patterns that will be extracted from datasets provided by the user, or from the entire Protein Data Bank (PDB). Moreover, the database-wide search can be restricted using a variety of criteria, such as PDB ID, resolution, and organism of origin, to provide only relevant data. The extraction generally takes a few seconds for several hundreds of entries, up to approximately one hour for the whole PDB. The detected patterns are made available for download to enable further processing, as well as presented in a clear tabular and graphical form directly in the browser. The unique design of the language and the provided service could pave the way towards novel PDB-wide analyses, which were either difficult or unfeasible in the past. The application is available free of charge at http://ncbr.muni.cz/PatternQuery. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. GPU Optimizations for a Production Molecular Docking Code*

    PubMed Central

    Landaverde, Raphael; Herbordt, Martin C.

    2015-01-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER), which achieved a roughly 5× speed-up over a contemporaneous 4-core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server, which has over 4000 active users. PMID:26594667

  8. GPU Optimizations for a Production Molecular Docking Code.

    PubMed

    Landaverde, Raphael; Herbordt, Martin C

    2014-09-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER), which achieved a roughly 5× speed-up over a contemporaneous 4-core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server, which has over 4000 active users.
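    The abstract attributes the remaining CPU/GPU gap almost entirely to the 3D FFT. The kernel of FFT-based docking codes such as PIPER is scoring all relative translations of two grids with a single forward/inverse FFT pair instead of an explicit sweep. A minimal NumPy sketch of that correlation step (not PIPER's actual implementation, which runs on energy grids via MKL or cuFFT) is:

```python
import numpy as np

def correlation_scores(receptor, ligand):
    """Score every cyclic translation of the ligand grid against the receptor
    grid in one shot via the cross-correlation theorem:
    corr(t) = sum_x conj(receptor(x)) * ligand(x + t)."""
    R = np.fft.fftn(receptor)
    L = np.fft.fftn(ligand)
    return np.real(np.fft.ifftn(np.conj(R) * L))

rng = np.random.default_rng(0)
receptor = rng.standard_normal((16, 16, 16))
# A toy "ligand": the receptor grid displaced by a known shift.
ligand = np.roll(receptor, shift=(3, 5, 2), axis=(0, 1, 2))
scores = correlation_scores(receptor, ligand)
best = np.unravel_index(np.argmax(scores), scores.shape)
print(best)  # recovers the applied (3, 5, 2) shift
```

    Three FFTs replace an O(N²) sweep over translations, which is why the run time of the FFT library dominates the whole docking pipeline.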

  9. Pressure Mapping and Efficiency Analysis of an EPPLER 857 Hydrokinetic Turbine

    NASA Astrophysics Data System (ADS)

    Clark, Tristan

    A conceptual energy ship is presented to provide renewable energy. The ship, driven by the wind, drags a hydrokinetic turbine through the water. The power generated is used to run electrolysis on board, and the resultant hydrogen is taken back to shore to be used as an energy source. The basin efficiency (power divided by thrust times velocity) of the hydrokinetic turbine (HKT) plays a vital role in this process. In order to extract the maximum allowable power from the flow, the blades need to be optimized. The structural analysis of the blade is important, as the blade will undergo high pressure loads from the water. A procedure for the analysis of a preliminary hydrokinetic turbine blade design is developed. The blade was designed by a non-optimized Blade Element Momentum Theory (BEMT) code. Six simulations were run, with varying mesh resolution, turbulence models, and flow region size. The procedure provides a detailed explanation of the entire process, from geometry and mesh generation to post-processing analysis tools. The efficiency results from the simulations are used to study the mesh resolution, flow region size, and turbulence models. The results are compared to the BEMT model design targets. Static pressure maps are created that can be used for structural analysis of the blades.
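    The basin efficiency quoted above is a simple ratio: useful electrical power over the towing power (thrust times speed) spent dragging the turbine through the water. A minimal sketch, with made-up numbers rather than results from the study:

```python
def basin_efficiency(power_w, thrust_n, velocity_ms):
    """Basin efficiency: generated power over towing power (thrust * speed)."""
    return power_w / (thrust_n * velocity_ms)

# Illustrative numbers only: a turbine producing 40 kW while imposing
# 12 kN of drag at a 5 m/s tow speed.
eta = basin_efficiency(40e3, 12e3, 5.0)
print(round(eta, 3))  # 0.667
```

    Because the ship's sails must supply the towing power, maximizing this ratio, not just the raw power output, is what the blade optimization targets.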

  10. Determination of technological parameters in strip mining by time-of-flight and image processing

    NASA Astrophysics Data System (ADS)

    Elandaloussi, Frank; Mueller, B.; Osten, Wolfgang

    1999-09-01

    The conveying and dumping of the earth masses lying over the coal seam in lignite surface mining is usually done by overburden conveyor bridges. The overburden, obtained from connected excavators, is transported over the bridge construction using a conveyor belt system and poured into one front dump and three surface dumps. The shaping of the dump growth is of great importance, both to guarantee the stability of the dumped earth masses and of the whole construction, and to prepare the area for re-cultivation. This article describes three measurement systems: one to determine the impact point of the dumped earth masses, one to determine the shape of the entire mining process, and a sensor for the loading of the conveyor belt. For the first measurement system, a real-time video system has been designed, set up, and installed that is capable of determining the impact point of all three dumps simultaneously. The second measurement system is a combination of five specially designed laser distance-measuring instruments that are able to measure the shape of the mining process under unfavorable environmental conditions such as dust, high temperature changes, and heavy shocks. The third sensor is designed for monitoring the transportation of the masses via the conveyor belt system.

  11. Design of a Clinical Information Management System to Support DNA Analysis Laboratory Operation

    PubMed Central

    Dubay, Christopher J.; Zimmerman, David; Popovich, Bradley

    1995-01-01

    The LabDirector system has been developed at the Oregon Health Sciences University to support the operation of our clinical DNA analysis laboratory. Through an iterative design process which has spanned two years, we have produced a system that is both highly tailored to a clinical genetics production laboratory and flexible in its implementation, to support the rapid growth and change of protocols and methodologies in use in the field. The administrative aspects of the system are integrated with an enterprise schedule management system. The laboratory side of the system is driven by a protocol modeling and execution system. The close integration between these two aspects of the clinical laboratory facilitates smooth operations, and allows management to accurately measure costs and performance. The entire application has been designed and documented to provide utility to a wide range of clinical laboratory environments.

  12. In Situ Cyclization of Native Proteins: Structure-Based Design of a Bicyclic Enzyme.

    PubMed

    Pelay-Gimeno, Marta; Bange, Tanja; Hennig, Sven; Grossmann, Tom N

    2018-05-30

    Increased tolerance of enzymes towards thermal and chemical stress is required for many applications and can be achieved by macrocyclization of the enzyme, resulting in stabilization of its tertiary structure. So far, macrocyclization approaches utilize a very limited structural diversity, which complicates the design process. Here, we report an approach that enables cyclization via the installation of modular crosslinks into native proteins composed entirely of proteinogenic amino acids. Our stabilization procedure involves the introduction of three surface-exposed cysteines, which are reacted with a triselectrophile, resulting in the in situ cyclization of the protein (INCYPRO). A bicyclic version of Sortase A was designed exhibiting increased tolerance towards thermal as well as chemical denaturation, and proved efficient in protein labeling under denaturing conditions. In addition, we applied INCYPRO to the KIX domain, resulting in up to 24 °C increased thermal stability. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Advanced Multispectral Scanner (AMS) study. [aircraft remote sensing

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The status of aircraft multispectral scanner technology was assessed in order to develop preliminary design specifications for an advanced instrument to be used for remote sensing data collection by aircraft in the 1980 time frame. The system designed provides a no-moving-parts multispectral scanning capability through the exploitation of linear-array charge-coupled-device technology and advanced electronic signal processing techniques. Major advantages include: 10:1 V/H rate capability; 120 deg FOV at V/H = 0.25 rad/sec; 1 to 2 rad resolution; high sensitivity; large dynamic range capability; geometric fidelity; roll compensation; modularity; long life; and 24-channel data acquisition capability. The field-flattening techniques of the optical design allow a wide field of view to be achieved at fast f/nos for both the long and short wavelength regions. The digital signal averaging technique permits maximization of signal-to-noise performance over the entire V/H rate range.

  14. Registry in a tube: multiplexed pools of retrievable parts for genetic design space exploration.

    PubMed

    Woodruff, Lauren B A; Gorochowski, Thomas E; Roehner, Nicholas; Mikkelsen, Tarjei S; Densmore, Douglas; Gordon, D Benjamin; Nicol, Robert; Voigt, Christopher A

    2017-02-17

    Genetic designs can consist of dozens of genes and hundreds of genetic parts. After evaluating a design, it is desirable to implement changes without the cost and burden of starting the construction process from scratch. Here, we report a two-step process where a large design space is divided into deep pools of composite parts, from which individuals are retrieved and assembled to build a final construct. The pools are built via multiplexed assembly and sequenced using next-generation sequencing. Each pool consists of ∼20 Mb of up to 5000 unique and sequence-verified composite parts that are barcoded for retrieval by PCR. This approach is applied to a 16-gene nitrogen fixation pathway, which is broken into pools containing a total of 55 848 composite parts (71.0 Mb). The pools encompass an enormous design space (10^43 possible 23 kb constructs), from which an algorithm-guided 192-member 4.5 Mb library is built. Next, all 1030 possible genetic circuits based on 10 repressors (NOR/NOT gates) are encoded in pools where each repressor is fused to all permutations of input promoters. These demonstrate that multiplexing can be applied to encompass entire design spaces from which individuals can be accessed and evaluated. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Framework for Development of Object-Oriented Software

    NASA Technical Reports Server (NTRS)

    Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan

    2004-01-01

    The Real-Time Control (RTC) Application Framework is a high-level software framework, written in C++, that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called the Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing domain; other applicable high-level domains include command-and-control systems and simulation/training systems.

  16. Hybrid battery/supercapacitor energy storage system for the electric vehicles

    NASA Astrophysics Data System (ADS)

    Kouchachvili, Lia; Yaïci, Wahiba; Entchev, Evgueniy

    2018-01-01

    Electric vehicles (EVs) have recently attracted considerable attention, and so has the development of battery technologies. Although battery technology has advanced significantly, the available batteries do not entirely meet the energy demands of EV power consumption. One of the key issues is the non-monotonic consumption of energy, with frequent changes during the battery discharging process; this is very harmful to the electrochemical processes of the battery. A practical solution is to couple the battery with a supercapacitor, which is basically an electrochemical cell with a similar architecture but a higher rate capability and better cyclability. In this design, the supercapacitor can provide the excess energy required when the battery fails to do so. In addition to the battery and supercapacitor as individual units, designing the architecture of the corresponding hybrid system from an electrical engineering point of view is of utmost importance. The present manuscript reviews recent works devoted to the application of various battery/supercapacitor hybrid systems in EVs.
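    The division of labor described above can be sketched as a rule-based power split, with the supercapacitor covering whatever the battery cannot comfortably supply. The battery power limit and drive-cycle numbers below are hypothetical, chosen only to illustrate the idea:

```python
def split_power(demand_w, battery_limit_w=30e3):
    """Rule-based split for a hypothetical battery/supercapacitor hybrid:
    the battery supplies load up to its comfortable limit, and the
    supercapacitor covers (or absorbs) the transient remainder."""
    battery = max(min(demand_w, battery_limit_w), 0.0)
    supercap = demand_w - battery  # surplus demand, or regen when negative
    return battery, supercap

# A short drive-cycle snippet: cruise, hard acceleration, regenerative braking.
for demand in (20e3, 55e3, -15e3):
    battery, supercap = split_power(demand)
    print(demand, battery, supercap)
```

    Smoothing the battery's share in this way removes the frequent load swings the abstract identifies as harmful to the electrochemical process, at the cost of the control electronics that coordinate the two sources.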

  17. Plant growth chamber M design

    NASA Technical Reports Server (NTRS)

    Prince, R. P.; Knott, W. M.

    1986-01-01

    Crop production is just one of the many processes involved in establishing long-term survival of man in space. The benefits of integrating higher plants into the overall plan were recognized early by NASA through the Closed Ecological Life Support System (CELSS) program. The first step is to design, construct, and operate a sealed (gas, liquid, and solid) plant growth chamber. A 3.6 m diameter by 6.7 m high closed cylinder (previously used as a hypobaric vessel during the Mercury program) is being modified for this purpose. The chamber is mounted on legs with the central axis vertical. Entrance to the chamber is through an airlock. This chamber will be devoted entirely to higher-plant experimentation. Any waste treatment, food processing, or product storage studies will be carried on outside of this chamber. Its primary purpose is to provide input and output data on solids, liquids, and gases for single-crop-species and multiple-species production using different nutrient delivery systems.

  18. A domain-specific design architecture for composite material design and aircraft part redesign

    NASA Technical Reports Server (NTRS)

    Punch, W. F., III; Keller, K. J.; Bond, W.; Sticklen, J.

    1992-01-01

    Advanced composites have been targeted as a 'leapfrog' technology that would provide a unique global competitive position for U.S. industry. Composites are unique in their requirements for an integrated approach to designing, manufacturing, and marketing of products developed utilizing the new materials of construction. Numerous studies extending across the entire economic spectrum of the United States, from aerospace to military to durable goods, have identified composites as a 'key' technology. In general there have been two approaches to composite construction: build models of a given composite material, then determine characteristics of the material via numerical simulation and empirical testing; and experience-directed construction of fabrication plans for building composites with given properties. The first route sets a goal to capture a basic understanding of a device (the composite) through a rigorous mathematical model; the second attempts to capture expertise about the process of fabricating a composite, to date at a surface level, typically expressed in a rule-based system. From an AI perspective, these two research lines are attacking distinctly different problems, and both tracks have current limitations. The mathematical modeling approach has yielded a wealth of data, but a large number of simplifying assumptions are needed to make numerical simulation tractable. Likewise, although surface-level expertise about how to build a particular composite may yield important results, recent trends in the KBS area are towards augmenting surface-level problem solving with deeper-level knowledge. Many of the relative advantages of composites, e.g., the strength-to-weight ratio, are most prominent when the entire component is designed as a unitary piece. The bottleneck in undertaking such unitary design lies in the difficulty of the re-design task. Designing the fabrication protocols for a complex-shaped, thick-section composite is currently very difficult. It is in fact this difficulty that our research will address.

  19. Downsampling Photodetector Array with Windowing

    NASA Technical Reports Server (NTRS)

    Patawaran, Ferze D.; Farr, William H.; Nguyen, Danh H.; Quirk, Kevin J.; Sahasrabudhe, Adit

    2012-01-01

    In a photon counting detector array, each pixel in the array produces an electrical pulse when an incident photon on that pixel is detected. Detection and demodulation of an optical communication signal that modulates the intensity of the optical signal require counting the number of photon arrivals over a given interval. As the size of photon counting photodetector arrays increases, parallel processing of all the pixels exceeds the resources available in current application-specific integrated circuit (ASIC) and gate array (GA) technology; the desire for a high fill factor in avalanche photodiode (APD) detector arrays also precludes this. Through the use of downsampling and windowing of portions of the detector array, the processing is distributed between the ASIC and GA. This allows demodulation of the optical communication signal incident on a large photon counting detector array, as well as providing an architecture amenable to algorithmic changes. The detector array readout ASIC functions as a parallel-to-serial converter, serializing the photodetector array output for subsequent processing. Additional downsampling functionality for each pixel is added to this ASIC. Due to the large number of pixels in the array, the readout time of the entire photodetector is greater than the time between photon arrivals; therefore, a downsampling pre-processing step is done in order to increase the time allowed for the readout to occur. Each pixel drives a small counter that is incremented at every detected photon arrival or, equivalently, the charge in a storage capacitor is incremented. At the end of a user-configurable counting period (calculated independently from the ASIC), the counters are sampled and cleared. This downsampled photon count information is then sent one counter word at a time to the GA. For a large array, processing even the downsampled pixel counts exceeds the capabilities of the GA. 
Windowing of the array, whereby several subsets of pixels are designated for processing, is used to further reduce the computational requirements. Because the photon count information is sent one word at a time to the GA, the aggregation of the pixels in a window can be achieved by selecting only the designated pixel counts from the serial stream of photon counts, thereby obviating the need to store the entire frame of pixel counts in the gate array. The pixel count sequence from each window can then be processed, forming lower-rate pixel statistics for each window. By having this processing occur in the GA rather than in the ASIC, future changes to the processing algorithm can be readily implemented. The high-bandwidth requirements of a photon counting array, combined with the properties of the optical modulation being detected by the array, present a unique problem that has not been addressed by current CCD or CMOS sensor array solutions.
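    The windowed aggregation described above, selecting designated pixel counts from the serial stream instead of buffering the whole frame, can be sketched as follows. This is an illustrative model of the selection logic, not the flight gate-array implementation:

```python
def window_sums(count_stream, frame_width, windows):
    """Aggregate photon counts for several pixel windows directly from the
    serialized count stream, without buffering the whole frame.

    `count_stream` yields one count per pixel in raster order;
    `windows` maps a window name to a set of (row, col) pixels.
    """
    sums = {name: 0 for name in windows}
    for idx, count in enumerate(count_stream):
        pixel = (idx // frame_width, idx % frame_width)
        for name, pixels in windows.items():
            if pixel in pixels:
                sums[name] += count
    return sums

# A 4x4 frame serialized row by row; two 2x2 windows of interest.
frame = list(range(16))  # counts 0..15, so pixel (r, c) holds 4*r + c
windows = {"ul": {(0, 0), (0, 1), (1, 0), (1, 1)},
           "lr": {(2, 2), (2, 3), (3, 2), (3, 3)}}
print(window_sums(frame, 4, windows))  # {'ul': 10, 'lr': 50}
```

    Only a running sum per window is held, so the storage requirement scales with the number of windows rather than the number of pixels, which is the point of doing the selection on the serial stream.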

  20. Effects of process parameters on solid self-microemulsifying particles in a laboratory scale fluid bed.

    PubMed

    Mukherjee, Tusharmouli; Plakogiannis, Fotios M

    2012-01-01

    The purpose of this study was to select the critical process parameters of the fluid bed processes impacting the quality attributes of a solid self-microemulsifying (SME) system of albendazole (ABZ). A fractional factorial design (2^(4-1)) with four parameters (spray rate, inlet air temperature, inlet air flow, and atomization air pressure) was created with MINITAB software. Batches were manufactured in a laboratory top-spray fluid bed at 625-g scale. Loss-on-drying (LOD) samples were taken throughout each batch to build the entire moisture profiles. All dried granulations were sieved using mesh 20 and analyzed for particle size distribution (PSD), morphology, density, and flow. It was found that as spray rate increased, the Sauter mean diameter (D_s) also increased. The effect of inlet air temperature on the peak moisture, which is directly related to the mean particle size, was found to be significant. There were two-way interactions between the studied process parameters. The main effects of inlet air flow rate and atomization air pressure could not be determined, as the data were inconclusive. The partial least squares (PLS) regression model was found to be significant (P < 0.01) and predictive for optimization. This study established a design space for the parameters of the solid SME manufacturing process.
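    A 2^(4-1) fractional factorial like the one used in this study halves the 16-run full factorial to 8 runs by generating the fourth factor from the other three. The generator D = ABC below is the conventional resolution-IV choice; it is an assumption for illustration, since the abstract does not state the exact generator used:

```python
from itertools import product

def fractional_factorial_2_4_1():
    """Generate a 2^(4-1) half-fraction design: 8 runs of 4 two-level
    factors (coded -1/+1) using generator D = ABC, i.e. defining
    relation I = ABCD."""
    return [(a, b, c, a * b * c) for a, b, c in product((-1, 1), repeat=3)]

design = fractional_factorial_2_4_1()
for run in design:
    print(run)
# Every run satisfies the defining relation A*B*C*D == +1.
```

    The price of halving the run count is aliasing: with I = ABCD, each main effect is confounded with a three-factor interaction, and two-factor interactions are confounded in pairs, which is consistent with the study's note that some main effects could not be resolved.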

  1. Monte Carlo simulation of efficient data acquisition for an entire-body PET scanner

    NASA Astrophysics Data System (ADS)

    Isnaini, Ismet; Obi, Takashi; Yoshida, Eiji; Yamaya, Taiga

    2014-07-01

Conventional PET scanners can image the whole body using many bed positions. On the other hand, an entire-body PET scanner with an extended axial FOV, which can trace whole-body uptake images simultaneously and improve sensitivity, has been desired. Such a scanner must process a large amount of data effectively, and as a result has high dead time in the multiplex detector grouping process. It also has many oblique lines of response. In this work, we study efficient data acquisition for the entire-body PET scanner using Monte Carlo simulation. The simulated entire-body PET scanner, based on depth-of-interaction detectors, has a 2016-mm axial field-of-view (FOV) and an 80-cm ring diameter. Since the entire-body PET scanner has higher single data loss than a conventional PET scanner at the grouping circuits, its NECR decreases. However, single data loss is mitigated by separating the axially arranged detectors into multiple parts. Our choice of 3 groups of axially arranged detectors was shown to increase the peak NECR by 41%. An appropriate choice of maximum ring difference (MRD) also maintains high sensitivity and a high peak NECR while reducing the data size. The extremely oblique lines of response of a large axial FOV do not contribute much to the performance of the scanner: the total sensitivity with full MRD was only 15% higher than that with about half MRD, and the peak NECR saturated at about half MRD. The entire-body PET scanner promises to provide a large axial FOV and sufficient performance without using the full data.
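
    The maximum-ring-difference cut described above amounts to a simple filter on coincidence events. The sketch below uses invented ring counts and event pairs, not the simulated scanner's geometry.

```python
# Toy illustration of restricting lines of response (LORs) by maximum
# ring difference (MRD): extremely oblique LORs between far-apart
# detector rings are discarded, shrinking the data size.
def accept_lor(ring_a, ring_b, mrd):
    """Keep a coincidence only if the ring difference is within the MRD."""
    return abs(ring_a - ring_b) <= mrd

events = [(0, 5), (10, 90), (40, 41), (3, 80)]   # (ring_a, ring_b) pairs
mrd = 50                                         # e.g. about half of 100 rings
kept = [e for e in events if accept_lor(*e, mrd)]
print(kept)   # pairs with ring difference > 50 are rejected
```

    As the abstract reports, discarding these extreme oblique LORs costs little sensitivity while substantially reducing the data volume to be acquired and stored.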

  2. Development of the e-Baby serious game with regard to the evaluation of oxygenation in preterm babies: contributions of the emotional design.

    PubMed

    Fonseca, Luciana Mara Monti; Dias, Danielle Monteiro Vilela; Góes, Fernanda Dos Santos Nogueira; Seixas, Carlos Alberto; Scochi, Carmen Gracinda Silvan; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2014-09-01

    The present study aimed to describe the development process of a serious game that enables users to evaluate the respiratory process in a preterm infant based on an emotional design model. The e-Baby serious game was built to feature the simulated environment of an incubator, in which the user performs a clinical evaluation of the respiratory process in a virtual preterm infant. The user learns about the preterm baby's history, chooses the tools for the clinical evaluation, evaluates the baby, and determines whether his/her evaluation is appropriate. The e-Baby game presents phases that contain respiratory process impairments of higher or lower complexity in the virtual preterm baby. Included links give the user the option of recording the entire evaluation procedure and sharing his/her performance on a social network. e-Baby integrates a Clinical Evaluation of the Preterm Baby course in the Moodle virtual environment. This game, which evaluates the respiratory process in preterm infants, could support a more flexible, attractive, and interactive teaching and learning process that includes simulations with features very similar to neonatal unit realities, thus allowing more appropriate training for clinical oxygenation evaluations in at-risk preterm infants. e-Baby allows advanced user-technology-educational interactions because it requires active participation in the process and is emotionally integrated.

  3. Intelligent microchip networks: an agent-on-chip synthesis framework for the design of smart and robust sensor networks

    NASA Astrophysics Data System (ADS)

    Bosse, Stefan

    2013-05-01

Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication, with a strong focus on microchip-level implementation to meet the goals of miniaturization and low-power environments, a prerequisite for autonomous behaviour and operation. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents can decide for themselves which actions are performed, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models which are implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level, requires fully equipped computers and communication structures, and its hardware architecture does not reflect the requirements of agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems and a synthesis methodology and framework for multi-agent systems implementable entirely at the microchip level with resource- and power-constrained digital logic, supporting Agent-On-Chip (AoC) architectures. The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic.
The agent behaviour, interaction (communication), and mobility features are modelled and specified at a machine-independent abstract programming level using a state-based agent behaviour language (APL). From this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system with the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented at the node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used, for example, for the design and implementation of the APL compiler.
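
    The node-level tuple space mentioned above can be illustrated with a minimal software model: agents deposit tuples and retrieve them by pattern matching. This is a plain-Python sketch of the classic tuple-space idea, not the paper's register-transfer implementation, and the operation names (out/rd/inp) follow Linda convention as an assumption.

```python
# Minimal tuple-space sketch: None acts as a wildcard in patterns.
class TupleSpace:
    def __init__(self):
        self._tuples = []

    def out(self, tup):
        """Deposit a tuple into the space."""
        self._tuples.append(tup)

    def _match(self, tup, pattern):
        return len(tup) == len(pattern) and all(
            p is None or p == t for t, p in zip(tup, pattern))

    def rd(self, pattern):
        """Non-destructive read of the first matching tuple (or None)."""
        return next((t for t in self._tuples if self._match(t, pattern)), None)

    def inp(self, pattern):
        """Destructive take of the first matching tuple (or None)."""
        t = self.rd(pattern)
        if t is not None:
            self._tuples.remove(t)
        return t

ts = TupleSpace()
ts.out(("sensor", 3, 21.5))            # an agent publishes a reading
print(ts.rd(("sensor", 3, None)))      # -> ("sensor", 3, 21.5)
print(ts.inp(("sensor", None, None)))  # removes it from the space
print(ts.rd(("sensor", None, None)))   # -> None
```

    Decoupling agents through such a shared space is what lets communication tolerate node and link failures: producers and consumers never need a direct, simultaneous connection.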

  4. A robotically constructed production and supply base on Phobos

    NASA Astrophysics Data System (ADS)

    1989-05-01

    PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.

  5. A robotically constructed production and supply base on Phobos

    NASA Technical Reports Server (NTRS)

    1989-01-01

    PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.

  6. Spaceborne sensors (1983-2000 AD): A forecast of technology

    NASA Technical Reports Server (NTRS)

    Kostiuk, T.; Clark, B. P.

    1984-01-01

    A technical review and forecast of space technology as it applies to spaceborne sensors for future NASA missions is presented. A format for categorization of sensor systems covering the entire electromagnetic spectrum, including particles and fields is developed. Major generic sensor systems are related to their subsystems, components, and to basic research and development. General supporting technologies such as cryogenics, optical design, and data processing electronics are addressed where appropriate. The dependence of many classes of instruments on common components, basic R&D and support technologies is also illustrated. A forecast of important system designs and instrument and component performance parameters is provided for the 1983-2000 AD time frame. Some insight into the scientific and applications capabilities and goals of the sensor systems is also given.

  7. A high speed buffer for LV data acquisition

    NASA Technical Reports Server (NTRS)

    Cavone, Angelo A.; Sterlina, Patrick S.; Clemmons, James I., Jr.; Meyers, James F.

    1987-01-01

    The laser velocimeter (autocovariance) buffer interface is a data acquisition subsystem designed specifically for the acquisition of data from a laser velocimeter. The subsystem acquires data from up to six laser velocimeter components in parallel, measures the times between successive data points for each of the components, establishes and maintains a coincident condition between any two or three components, and acquires data from other instrumentation systems simultaneously with the laser velocimeter data points. The subsystem is designed to control the entire data acquisition process based on initial setup parameters obtained from a host computer and to be independent of the computer during the acquisition. On completion of the acquisition cycle, the interface transfers the contents of its memory to the host under direction of the host via a single 16-bit parallel DMA channel.
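
    The coincident condition maintained between components can be pictured as a time-gated pairing of events. The sketch below is a hedged software model under assumed behaviour (a fixed coincidence gate, two components); timestamps and the gate width are invented example values, not the interface's actual logic.

```python
# Sketch: treat two components' data points as coincident when their
# arrival times fall within a gate (hypothetical pairing rule).
def coincident_pairs(times_a, times_b, gate):
    """Pair each event in A with the first unused event in B within the gate.
    Both input lists are assumed time-sorted."""
    pairs, j = [], 0
    for ta in times_a:
        while j < len(times_b) and times_b[j] < ta - gate:
            j += 1                                # B event too early: skip
        if j < len(times_b) and abs(times_b[j] - ta) <= gate:
            pairs.append((ta, times_b[j]))
            j += 1                                # consume the B event
    return pairs

a = [1.0, 5.0, 9.0]
b = [1.2, 6.0, 9.1]
print(coincident_pairs(a, b, gate=0.5))   # [(1.0, 1.2), (9.0, 9.1)]
```

    In the hardware interface this gating runs in real time as words arrive, but the acceptance criterion, a bounded time difference between components, is the same.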

  8. GaN-Based Laser Wireless Power Transfer System.

    PubMed

    De Santi, Carlo; Meneghini, Matteo; Caria, Alessandro; Dogmus, Ezgi; Zegaoui, Malek; Medjdoub, Farid; Kalinic, Boris; Cesca, Tiziana; Meneghesso, Gaudenzio; Zanoni, Enrico

    2018-01-17

    The aim of this work is to present a potential application of gallium nitride-based optoelectronic devices. By using a laser diode and a photodetector, we designed and demonstrated a free-space compact and lightweight wireless power transfer system, whose efficiency is limited by the efficiency of the receiver. We analyzed the effect of the electrical load, temperature, partial absorption and optical excitation distribution on the efficiency, by identifying heating and band-filling as the most impactful processes. By comparing the final demonstrator with a commercial RF-based Qi system, we conclude that the efficiency is still low at close range, but is promising in medium to long range applications. Efficiency may not be a limiting factor, since this concept can enable entirely new possibilities and designs, especially relevant for space applications.

  9. GaN-Based Laser Wireless Power Transfer System

    PubMed Central

    Meneghini, Matteo; Caria, Alessandro; Dogmus, Ezgi; Zegaoui, Malek; Medjdoub, Farid; Kalinic, Boris; Meneghesso, Gaudenzio; Zanoni, Enrico

    2018-01-01

    The aim of this work is to present a potential application of gallium nitride-based optoelectronic devices. By using a laser diode and a photodetector, we designed and demonstrated a free-space compact and lightweight wireless power transfer system, whose efficiency is limited by the efficiency of the receiver. We analyzed the effect of the electrical load, temperature, partial absorption and optical excitation distribution on the efficiency, by identifying heating and band-filling as the most impactful processes. By comparing the final demonstrator with a commercial RF-based Qi system, we conclude that the efficiency is still low at close range, but is promising in medium to long range applications. Efficiency may not be a limiting factor, since this concept can enable entirely new possibilities and designs, especially relevant for space applications. PMID:29342114

  10. Reconfigurable, Cognitive Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.

  11. Curves showing column strength of steel and duralumin tubing

    NASA Technical Reports Server (NTRS)

    Ross, Orrin E

    1929-01-01

    Given here are a set of column strength curves that are intended to simplify the method of determining the size of struts in an airplane structure when the load in the member is known. The curves will also simplify the checking of the strength of a strut if the size and length are known. With these curves, no computations are necessary, as in the case of the old-fashioned method of strut design. The process is so simple that draftsmen or others who are not entirely familiar with mechanics can check the strength of a strut without much danger of error.

  12. Effective dimension reduction for sparse functional data

    PubMed Central

    YAO, F.; LEI, E.; WU, Y.

    2015-01-01

    Summary We propose a method of effective dimension reduction for functional data, emphasizing the sparse design where one observes only a few noisy and irregular measurements for some or all of the subjects. The proposed method borrows strength across the entire sample and provides a way to characterize the effective dimension reduction space, via functional cumulative slicing. Our theoretical study reveals a bias-variance trade-off associated with the regularizing truncation and decaying structures of the predictor process and the effective dimension reduction space. A simulation study and an application illustrate the superior finite-sample performance of the method. PMID:26566293

  13. A cryogenic thermal source for detector array characterization

    NASA Astrophysics Data System (ADS)

    Chuss, David T.; Rostem, Karwan; Wollack, Edward J.; Berman, Leah; Colazo, Felipe; DeGeorge, Martin; Helson, Kyle; Sagliocca, Marco

    2017-10-01

    We describe the design, fabrication, and validation of a cryogenically compatible quasioptical thermal source for characterization of detector arrays. The source is constructed using a graphite-loaded epoxy mixture that is molded into a tiled pyramidal structure. The mold is fabricated using a hardened steel template produced via a wire electron discharge machining process. The absorptive mixture is bonded to a copper backplate enabling thermalization of the entire structure and measurement of the source temperature. Measurements indicate that the reflectance of the source is <0.001 across a spectral band extending from 75 to 330 GHz.

  14. Nanoscale patterning of electronic devices at the amorphous LaAlO3/SrTiO3 oxide interface using an electron sensitive polymer mask

    NASA Astrophysics Data System (ADS)

    Bjørlig, Anders V.; von Soosten, Merlin; Erlandsen, Ricci; Dahm, Rasmus Tindal; Zhang, Yu; Gan, Yulin; Chen, Yunzhong; Pryds, Nini; Jespersen, Thomas S.

    2018-04-01

    A simple approach is presented for designing complex oxide mesoscopic electronic devices based on the conducting interfaces of room temperature grown LaAlO3/SrTiO3 heterostructures. The technique is based entirely on methods known from conventional semiconductor processing technology, and we demonstrate a lateral resolution of ˜100 nm. We study the low temperature transport properties of nanoscale wires and demonstrate the feasibility of the technique for defining in-plane gates allowing local control of the electrostatic environment in mesoscopic devices.

  15. A Cryogenic Thermal Source for Detector Array Characterization

    NASA Technical Reports Server (NTRS)

    Chuss, David T.; Rostem, Karwan; Wollack, Edward J.; Berman, Leah; Colazo, Felipe; DeGeorge, Martin; Helson, Kyle; Sagliocca, Marco

    2017-01-01

    We describe the design, fabrication, and validation of a cryogenically compatible quasioptical thermal source for characterization of detector arrays. The source is constructed using a graphite-loaded epoxy mixture that is molded into a tiled pyramidal structure. The mold is fabricated using a hardened steel template produced via a wire electron discharge machining process. The absorptive mixture is bonded to a copper backplate enabling thermalization of the entire structure and measurement of the source temperature. Measurements indicate that the reflectance of the source is less than 0.001 across a spectral band extending from 75 to 330 gigahertz.

  16. Vapor phase pyrolysis

    NASA Technical Reports Server (NTRS)

    Steurer, Wolfgang

    1992-01-01

    The vapor phase pyrolysis process is designed exclusively for the lunar production of oxygen. In this concept, granulated raw material (soil) that consists almost entirely of metal oxides is vaporized and the vapor is raised to a temperature where it dissociates into suboxides and free oxygen. Rapid cooling of the dissociated vapor to a discrete temperature causes condensation of the suboxides, while the oxygen remains essentially intact and can be collected downstream. The gas flow path and flow rate are maintained at an optimum level by control of the pressure differential between the vaporization region and the oxygen collection system with the aid of the environmental vacuum.

  17. Axiomatic Design and Fabrication of Composite Structures - Applications in Robots, Machine Tools, and Automobiles

    NASA Astrophysics Data System (ADS)

    Lee, Dai Gil; Suh, Nam Pyo

    2005-11-01

    The idea that materials can be designed to satisfy specific performance requirements is relatively new. With high-performance composites, however, the entire process of designing and fabricating a part can be worked out before manufacturing. The purpose of this book is to present an integrated approach to the design and manufacturing of products from advanced composites. It shows how the basic behavior of composites and their constitutive relationships can be used during the design stage, which minimizes the complexity of manufacturing composite parts and reduces the repetitive "design-build-test" cycle. Designing it right the first time is going to determine the competitiveness of a company, the reliability of the part, the robustness of fabrication processes, and ultimately, the cost and development time of composite parts. Most of all, it should expand the use of advanced composite parts in fields that use composites only to a limited extent at this time. To achieve these goals, this book presents the design and fabrication of novel composite parts made for machine tools and other applications like robots and automobiles. This book is suitable as a textbook for graduate courses in the design and fabrication of composites. It will also be of interest to practicing engineers learning about composites and axiomatic design. A CD-ROM is included in every copy of the book, containing Axiomatic CLPT software. This program, developed by the authors, will assist readers in calculating material properties from the microstructure of the composite. This book is part of the Oxford Series on Advanced Manufacturing.

  18. Expert-guided evolutionary algorithm for layout design of complex space stations

    NASA Astrophysics Data System (ADS)

    Qian, Zhiqin; Bi, Zhuming; Cao, Qun; Ju, Weiguo; Teng, Hongfei; Zheng, Yang; Zheng, Siyu

    2017-08-01

The layout of a space station should be designed in such a way that different equipment and instruments are placed for the station as a whole to achieve the best overall performance. The station layout design is a typical nondeterministic polynomial problem. In particular, how to manage the design complexity to achieve an acceptable solution within a reasonable timeframe poses a great challenge. In this article, a new evolutionary algorithm is proposed to meet this challenge: the expert-guided evolutionary algorithm with a tree-like structure decomposition (EGEA-TSD). Two innovations in EGEA-TSD are: (i) to deal with the design complexity, the entire design space is divided into subspaces with a tree-like structure, which reduces the computation and facilitates experts' involvement in the solving process; (ii) a human-intervention interface is developed to allow experts' involvement in avoiding local optima and accelerating convergence. To validate the proposed algorithm, the layout design of one space station is formulated as a multi-disciplinary design problem, the developed algorithm is programmed and executed, and the results are compared with those from two other algorithms, illustrating the superior performance of the proposed EGEA-TSD.

  19. Musical rhythm and reading development: does beat processing matter?

    PubMed

    Ozernov-Palchik, Ola; Patel, Aniruddh D

    2018-05-20

There is mounting evidence for links between musical rhythm processing and reading-related cognitive skills, such as phonological awareness. This may be because music and speech are rhythmic: both involve processing complex sound sequences with systematic patterns of timing, accent, and grouping. Yet, there is a salient difference between musical and speech rhythm: musical rhythm is often beat-based (based on an underlying grid of equal time intervals), while speech rhythm is not. Thus, the role of beat-based processing in the reading-rhythm relationship is not clear. Is there a distinct relation between beat-based processing mechanisms and reading-related language skills, or is the rhythm-reading link entirely due to shared mechanisms for processing non-beat-based aspects of temporal structure? We discuss recent evidence for a distinct link between beat-based processing and early reading abilities in young children, and suggest experimental designs that would allow this relationship to be investigated more methodically. We propose that beat-based processing taps into a listener's ability to use rich contextual regularities to form predictions, a skill important for reading development. © 2018 New York Academy of Sciences.

  20. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. 
The NESR is generally considered a measure of the instrument noise performance and can be estimated as the standard deviation of calibrated radiance spectra from multiple scans. To obtain an estimate of the FPA performance, we developed an efficient method of generating pixel performance assessments. In addition, a random pixel selection scheme is developed based on the pixel performance evaluation. This allows us to perform the calibration procedures on a random pixel population that is a good statistical representation of the entire FPA. The design and implementation of each individual component will be discussed in detail.
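
    The NESR estimate described above, the per-channel standard deviation of calibrated radiance spectra across repeated scans, can be sketched directly. The synthetic spectra below stand in for real calibrated blackbody scans.

```python
# Sketch: estimate NESR as the sample standard deviation, per spectral
# channel, over multiple calibrated scans (synthetic example data).
import math

def nesr(scans):
    """Per-channel sample std dev across scans (scans: list of spectra)."""
    n = len(scans)
    out = []
    for k in range(len(scans[0])):
        vals = [s[k] for s in scans]
        mean = sum(vals) / n
        out.append(math.sqrt(sum((v - mean) ** 2 for v in vals) / (n - 1)))
    return out

scans = [[100.0, 200.0], [102.0, 198.0], [98.0, 202.0]]
print(nesr(scans))   # [2.0, 2.0]
```

    Because the blackbody scene is constant, any scan-to-scan variation in the calibrated radiance is attributable to instrument noise, which is what makes this simple estimator valid.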

  1. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design so as to minimize errors and maintenance. It can also promote the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct and aid in finding obscure coding errors; however, they cannot detect errors in specifications or poor designs. TEAMWORK, an automated system for structured analysis and design that can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  2. Designing of deployment sequence for braking and drift systems in atmosphere of Mars and Venus

    NASA Astrophysics Data System (ADS)

    Vorontsov, Victor

    2006-07-01

Analysis of project development and space research by the contact method, namely, by means of automatic descent modules and balloons, shows that the design of the entry, descent, and landing (EDL) sequence and of operation in the atmosphere is of great importance. This process starts at the very beginning of design, undergoes many iterations, and influences the processing of normal operation results. Along with the design of descent module systems, including systems for braking in the atmosphere, the design of the flight operation sequence and of the trajectories of motion in the atmosphere is performed. Depending on how correctly the entire operation sequence and the transitions from one phase to another are chosen, the overall probability of experiment success and the efficiency of the various systems vary. By now, the most extensive experience of Russian specialists in research of the terrestrial planets has been gained with the automatic interplanetary stations "Mars", "Venera", and "Vega", which carried descent modules and balloons drifting in the atmosphere. The particular interest and complexity of forming the EDL and drift sequences for these planets arise from radically different operating conditions, in particular, the strongly rarefied atmosphere of one planet and the extremely dense atmosphere of the other. Consequently, this determines the choice of braking systems and their parameters and the method of EDL sequence formation. At the same time, there are general fundamental methods and design research techniques that allow a common technical approach to the design of EDL and drift sequences in the atmosphere.

  3. Contributions for the next generation of 3D metal printing machines

    NASA Astrophysics Data System (ADS)

    Pereira, M.; Thombansen, U.

    2015-03-01

3D metal printing processes are key technologies for new industrial manufacturing requirements, as small-lot production combined with high design complexity and high flexibility is needed for personalization and customization. The main challenges for these processes are associated with increasing printing volumes, maintaining the relative accuracy level, and reducing the global manufacturing time. Through a review of current technologies and of solutions proposed in global patents, new design solutions for 3D metal printing machines can be suggested. This paper surveys current technologies and trends in SLM and suggests some design approaches to overcome these challenges. As the SLM process is based on laser scanning, an increase in printing volume requires moving the scanner over the work surface with motion systems if printing accuracy is to be kept constant. This approach alone, however, does not reduce manufacturing time, as only one laser source remains responsible for building the entire work piece. Given the technology limits of galvo-based laser scanning systems, the most obvious solution consists in using multiple beam delivery systems in series, in parallel, or both. Another concern relates to the weight of large work pieces. A new powder recoater can control the layer thickness and uniformity and eliminate or diminish fumes. To improve global accuracy, a pair of high-frequency piezoelectric actuators can help in positioning the laser beam. Implementing such suggestions can improve SLM productivity; to do so, several research activities need to be accomplished in areas related to design, control, software, and process fundamentals.

  4. Design of a superconducting volume coil for magnetic resonance microscopy of the mouse brain

    NASA Astrophysics Data System (ADS)

    Nouls, John C.; Izenson, Michael G.; Greeley, Harold P.; Johnson, G. Allan

    2008-04-01

    We present the design process of a superconducting volume coil for magnetic resonance microscopy of the mouse brain at 9.4 T. The yttrium barium copper oxide coil has been designed through an iterative process of three-dimensional finite-element simulations and validation against room-temperature copper coils. Compared to previous designs, the Helmholtz pair provides substantially higher B1 homogeneity over an extended volume of interest sufficiently large to image biologically relevant specimens. A custom-built cryogenic cooling system maintains the superconducting probe at 60 ± 0.1 K. Specimen loading and probe retuning can be carried out interactively with the coil at operating temperature, enabling much higher throughput. The operation of the probe is a routine, consistent procedure. The signal-to-noise ratio in a mouse brain increased by a factor ranging from 1.1 to 2.9 as compared to a room-temperature solenoid coil optimized for mouse brain microscopy. We demonstrate images encoded at 10 × 10 × 20 μm for an entire mouse brain specimen with a signal-to-noise ratio of 18 and a total acquisition time of 16.5 h, revealing neuroanatomy unseen at lower resolution. Phantom measurements show an effective spatial resolution better than 20 μm.

  5. Design of a superconducting volume coil for magnetic resonance microscopy of the mouse brain.

    PubMed

    Nouls, John C; Izenson, Michael G; Greeley, Harold P; Johnson, G Allan

    2008-04-01

    We present the design process of a superconducting volume coil for magnetic resonance microscopy of the mouse brain at 9.4 T. The yttrium barium copper oxide coil has been designed through an iterative process of three-dimensional finite-element simulations and validation against room-temperature copper coils. Compared to previous designs, the Helmholtz pair provides substantially higher B1 homogeneity over an extended volume of interest sufficiently large to image biologically relevant specimens. A custom-built cryogenic cooling system maintains the superconducting probe at 60 ± 0.1 K. Specimen loading and probe retuning can be carried out interactively with the coil at operating temperature, enabling much higher throughput. The operation of the probe is a routine, consistent procedure. The signal-to-noise ratio in a mouse brain increased by a factor ranging from 1.1 to 2.9 as compared to a room-temperature solenoid coil optimized for mouse brain microscopy. We demonstrate images encoded at 10 × 10 × 20 μm for an entire mouse brain specimen with a signal-to-noise ratio of 18 and a total acquisition time of 16.5 h, revealing neuroanatomy unseen at lower resolution. Phantom measurements show an effective spatial resolution better than 20 μm.

  6. Techniques and Tools for Trustworthy Composition of Pre-Designed Embedded Software Components

    DTIC Science & Technology

    2012-07-01

    following option choices. 1. A plain vanilla pi-trie algorithm set to build the entire pi-trie. 2. A pi-trie algorithm filtered for positive prime...implicates only. 3. A plain vanilla pi-trie algorithm to build the entire pi-trie, but recognize variable-disjoint subformulas. 4. A pi-trie

  7. Green Infrastructure Modeling Tools

    EPA Pesticide Factsheets

    Modeling tools support planning and design decisions on a range of scales from setting a green infrastructure target for an entire watershed to designing a green infrastructure practice for a particular site.

  8. Design of a portable optical emission tomography system for microwave induced compact plasma for visible to near-infrared emission lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rathore, Kavita, E-mail: kavira@iitk.ac.in; Munshi, Prabhat, E-mail: pmunshi@iitk.ac.in; Bhattacharjee, Sudeep, E-mail: sudeepb@iitk.ac.in

    A new non-invasive diagnostic system is developed for Microwave Induced Plasma (MIP) to reconstruct tomographic images of a 2D emission profile. A compact MIP system has wide application in industry as well as in research, for example in thrusters for space propulsion, high-current ion beams, and the creation of negative ions for heating of fusion plasma. The emission profile depends on two crucial parameters, namely the electron temperature and density (over the entire spatial extent) of the plasma system. Emission tomography provides a basic understanding of plasmas and is very useful for monitoring the internal structure of plasma phenomena without disturbing the actual processes. This paper presents the development of a compact, modular, and versatile Optical Emission Tomography (OET) tool for a cylindrical, magnetically confined MIP system. It has eight slit-hole cameras, each containing a complementary metal-oxide-semiconductor linear image sensor for light detection. Optical noise is reduced by using an aspheric lens and interference band-pass filters in each camera. The entire cylindrical plasma can be scanned with an automated sliding-ring mechanism arranged in fan-beam data collection geometry. The design of the camera includes a unique possibility to incorporate different filters to select light of a particular wavelength from the plasma. This OET system includes band-pass filters for the argon emission lines at 750 nm, 772 nm, and 811 nm and the hydrogen emission lines Hα (656 nm) and Hβ (486 nm). A convolution back projection algorithm is used to obtain the tomographic images of the plasma emission lines. The paper mainly focuses on (a) the design of the OET system in detail and (b) the study of the emission profile for the 750 nm argon emission line to validate the system design.
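    The record above reconstructs emission profiles with a convolution back projection algorithm on fan-beam data. As a rough illustration of the idea only (not the authors' code), here is a minimal parallel-beam filtered back projection sketch in NumPy; the Ram-Lak filter, grid size, and angle set are illustrative assumptions.

    ```python
    import numpy as np

    def ramp_filter(n_det):
        # Ram-Lak (ramp) filter, defined in the frequency domain
        return np.abs(np.fft.fftfreq(n_det))

    def fbp(sinogram, angles_deg):
        """Reconstruct a square image from a parallel-beam sinogram
        of shape (n_angles, n_detectors)."""
        n_ang, n_det = sinogram.shape
        filt = ramp_filter(n_det)
        # filter each projection row in the frequency domain
        filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * filt, axis=1))
        coords = np.arange(n_det) - n_det // 2        # detector coordinates
        X, Y = np.meshgrid(coords, coords)
        recon = np.zeros((n_det, n_det))
        for proj, ang in zip(filtered, np.deg2rad(angles_deg)):
            t = X * np.cos(ang) + Y * np.sin(ang)     # ray coordinate of each pixel
            recon += np.interp(t, coords, proj)       # smear projection back
        return recon * np.pi / (2 * n_ang)

    # Point emitter at the center: its sinogram is a spike in every projection.
    n_det, n_ang = 64, 36
    sino = np.zeros((n_ang, n_det))
    sino[:, n_det // 2] = 1.0
    image = fbp(sino, np.linspace(0.0, 180.0, n_ang, endpoint=False))
    ```

    The reconstructed image peaks at the grid center, where the point emitter was placed.
    
    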

  9. The First Interlaced Continuum Robot, Devised to Intrinsically Follow the Leader

    PubMed Central

    Kang, Byungjeon; Kojcev, Risto; Sinibaldi, Edoardo

    2016-01-01

    Flexible probes that are safely deployed to hard-to-reach targets while avoiding critical structures are strategic in several high-impact application fields, including the biomedical sector and the sector of inspections at large. A critical problem for these tools is the best approach for deploying an entire tool body, not only its tip, on a sought trajectory. A probe that achieves this deployment is considered to follow the leader (or to achieve follow-the-leader deployment) because its body sections follow the track traced by its tip. Follow-the-leader deployment through cavities is complicated due to a lack of external supports. Currently, no definitive implementation for a probe that is intrinsically able to follow the leader, i.e., without relying on external supports, has been achieved. In this paper, we present a completely new device, namely the first interlaced continuum robot, devised to intrinsically follow the leader. We developed the interlaced configuration by pursuing a conceptual approach irrespective of application-specific constraints and assuming two flexible tools with controllable stiffness. We questioned the possibility of solving the previously mentioned deployment problem by harnessing probe symmetry during the design process. This study examines the entire development of the novel interlaced probe: model-based conceptual design, detailed design and prototyping, and preliminary experimental assessment. Our probe can build a track with a radius of curvature that is as small as twice the probe diameter, which enables it to outperform state-of-the-art tools that are aimed at follow-the-leader deployment. Despite the limitations that are inherently associated with its original character, this study provides a prototypical approach to the design of interlaced continuum systems and demonstrates the first interlaced continuum probe, which is intrinsically able to follow the leader. PMID:26914328

  10. The First Interlaced Continuum Robot, Devised to Intrinsically Follow the Leader.

    PubMed

    Kang, Byungjeon; Kojcev, Risto; Sinibaldi, Edoardo

    2016-01-01

    Flexible probes that are safely deployed to hard-to-reach targets while avoiding critical structures are strategic in several high-impact application fields, including the biomedical sector and the sector of inspections at large. A critical problem for these tools is the best approach for deploying an entire tool body, not only its tip, on a sought trajectory. A probe that achieves this deployment is considered to follow the leader (or to achieve follow-the-leader deployment) because its body sections follow the track traced by its tip. Follow-the-leader deployment through cavities is complicated due to a lack of external supports. Currently, no definitive implementation for a probe that is intrinsically able to follow the leader, i.e., without relying on external supports, has been achieved. In this paper, we present a completely new device, namely the first interlaced continuum robot, devised to intrinsically follow the leader. We developed the interlaced configuration by pursuing a conceptual approach irrespective of application-specific constraints and assuming two flexible tools with controllable stiffness. We questioned the possibility of solving the previously mentioned deployment problem by harnessing probe symmetry during the design process. This study examines the entire development of the novel interlaced probe: model-based conceptual design, detailed design and prototyping, and preliminary experimental assessment. Our probe can build a track with a radius of curvature that is as small as twice the probe diameter, which enables it to outperform state-of-the-art tools that are aimed at follow-the-leader deployment. Despite the limitations that are inherently associated with its original character, this study provides a prototypical approach to the design of interlaced continuum systems and demonstrates the first interlaced continuum probe, which is intrinsically able to follow the leader.

  11. Case studies on design, simulation and visualization of control and measurement applications using REX control system

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    REX Control System is a professional advanced tool for the design and implementation of complex control systems that belongs to the softPLC category. It covers the entire process, starting from simulation of the application's functionality before deployment, through implementation on a real-time target, to analysis, diagnostics and visualization. Basically it consists of two parts: the development tools and the runtime system. It is also compatible with the Simulink environment, and the way a control algorithm is implemented is very similar. The control scheme is finally compiled (using the RexDraw utility) and uploaded to a chosen real-time target (using the RexView utility). A wide variety of hardware platforms and real-time operating systems is supported by REX Control System, for example Windows Embedded, Linux, and Linux/Xenomai deployed on SBC, IPC, PAC, Raspberry Pi and others with many I/O interfaces. It is a modern system designed for both measurement and control applications, offering many additional functions for data archiving, visualization based on HTML5, and communication standards. The paper sums up the possibilities of its use in the educational process, focusing on case studies of controlling physical models with classical and advanced control algorithms.

  12. Ab initio design of laser pulses to control molecular motion

    NASA Astrophysics Data System (ADS)

    Balint-Kurti, Gabriel; Ren, Qinghua; Manby, Frederick; Artamonov, Maxim; Ho, Tak-San; Rabitz, Herschel; Zou, Shiyang; Singh, Harjinder

    2007-03-01

    Our recent attempts to design laser pulses entirely theoretically, in a quantitative and accurate manner, so as to fully understand the underlying mechanisms active in the control process will be outlined. We have developed a new Born-Oppenheimer-like separation called the electric-nuclear Born-Oppenheimer (ENBO) approximation. In this approximation, variations of both the nuclear geometry and the external electric field are assumed to be slow compared with the speed at which the electronic degrees of freedom respond to these changes. This assumption permits the generation of a potential energy surface that depends not only on the relative geometry of the nuclei, but also on the electric field strength and on the orientation of the molecule with respect to the electric field. The range of validity of the ENBO approximation is discussed. Optimal control theory is used along with the ENBO approximation to design laser pulses for exciting vibrational and rotational motion in H2 and CO molecules. Progress on other applications, including controlling photodissociation processes, isotope separation, and stabilization of molecular Bose-Einstein condensates, as well as applications to biological molecules, will also be presented. *Support acknowledged from EPSRC.

  13. AVE-SESAME program for the REEDA System

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.

    1981-01-01

    The REEDA system software was modified and improved to process the AVE-SESAME severe storm data. A random access file system for the AVE storm data was designed, tested, and implemented. The AVE/SESAME software was modified to incorporate the random access file input and to interface with the new graphics hardware/software now available on the REEDA system. Software was developed to graphically display the AVE/SESAME data in the convention normally used by severe storm researchers. Software was converted to the AVE/SESAME software systems and interfaced with the existing graphics hardware/software available on the REEDA System. Software documentation was provided for the existing AVE/SESAME programs, outlining functional flow charts and interactive questions. All AVE/SESAME data sets in random access format were processed to allow the developed software to access the entire AVE/SESAME data base. The existing software was modified to allow for processing of different AVE/SESAME data set types, including satellite, surface, and radar data.

  14. User's manual SIG: a general-purpose signal processing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lager, D.; Azevedo, S.

    1983-10-25

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Many of the basic operations one would perform on digitized data are contained in the core SIG package. Out of these core commands, more powerful signal processing algorithms may be built. Many different operations on time- and frequency-domain signals can be performed by SIG. They include operations on the samples of a signal, such as adding a scalar to each sample, operations on the entire signal, such as digital filtering, and operations on two or more signals, such as adding two signals. Signals may be simulated, such as a pulse train or a random waveform. Graphics operations display signals and spectra.
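    To make the three categories of operations in the record above concrete, here is a small NumPy sketch (with hypothetical function names, not SIG's actual command set): a per-sample operation, a whole-signal operation, and a two-signal operation, applied to a simulated pulse train.

    ```python
    import numpy as np

    def add_scalar(signal, c):
        # per-sample operation: add a scalar to each sample
        return signal + c

    def moving_average(signal, width):
        # whole-signal operation: a simple FIR digital filter
        kernel = np.ones(width) / width
        return np.convolve(signal, kernel, mode="same")

    def add_signals(a, b):
        # two-signal operation: sample-wise sum of two signals
        return a + b

    # simulate a pulse train, as SIG can, then manipulate it
    t = np.arange(100)
    pulses = ((t % 20) == 0).astype(float)   # spikes at t = 0, 20, 40, ...
    smoothed = moving_average(pulses, 5)     # each spike spread over 5 samples
    shifted = add_scalar(pulses, 1.0)
    combined = add_signals(pulses, smoothed)
    ```

    Out of such core operations, more elaborate processing chains can be composed, mirroring how SIG builds algorithms from its core commands.
    
    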

  15. Tailoring Selective Laser Melting Process Parameters for NiTi Implants

    NASA Astrophysics Data System (ADS)

    Bormann, Therese; Schumacher, Ralf; Müller, Bert; Mertmann, Matthias; de Wild, Michael

    2012-12-01

    Complex-shaped NiTi constructions become more and more essential for biomedical applications especially for dental or cranio-maxillofacial implants. The additive manufacturing method of selective laser melting allows realizing complex-shaped elements with predefined porosity and three-dimensional micro-architecture directly out of the design data. We demonstrate that the intentional modification of the applied energy during the SLM-process allows tailoring the transformation temperatures of NiTi entities within the entire construction. Differential scanning calorimetry, x-ray diffraction, and metallographic analysis were employed for the thermal and structural characterizations. In particular, the phase transformation temperatures, the related crystallographic phases, and the formed microstructures of SLM constructions were determined for a series of SLM-processing parameters. The SLM-NiTi exhibits pseudoelastic behavior. In this manner, the properties of NiTi implants can be tailored to build smart implants with pre-defined micro-architecture and advanced performance.

  16. The influence of emotion on lexical processing: insights from RT distributional analysis.

    PubMed

    Yap, Melvin J; Seow, Cui Shan

    2014-04-01

    In two lexical decision experiments, the present study was designed to examine emotional valence effects on visual lexical decision (standard and go/no-go) performance, using traditional analyses of means and distributional analyses of response times. Consistent with an earlier study by Kousta, Vinson, and Vigliocco (Cognition 112:473-481, 2009), we found that emotional words (both negative and positive) were responded to faster than neutral words. Finer-grained distributional analyses further revealed that the facilitation afforded by valence was reflected by a combination of distributional shifting and an increase in the slow tail of the distribution. This suggests that emotional valence effects in lexical decision are unlikely to be entirely mediated by early, preconscious processes, which are associated with pure distributional shifting. Instead, our results suggest a dissociation between early preconscious processes and a later, more task-specific effect that is driven by feedback from semantically rich representations.
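    The shifting-versus-slow-tail distinction in the abstract above is commonly read off quantile (delta) plots: a pure distributional shift moves all quantiles by the same amount, while a tail effect grows across quantiles. A minimal sketch of that diagnostic on simulated ex-Gaussian RTs (all parameter values are invented for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def ex_gaussian(n, mu, sigma, tau):
        # ex-Gaussian RTs: a normal component plus an exponential slow tail
        return rng.normal(mu, sigma, n) + rng.exponential(tau, n)

    n = 20000
    neutral = ex_gaussian(n, mu=600.0, sigma=50.0, tau=150.0)
    # emotional words: faster by a pure 30 ms distributional shift
    emotional = ex_gaussian(n, mu=570.0, sigma=50.0, tau=150.0)

    qs = np.linspace(0.1, 0.9, 5)
    delta = np.quantile(emotional, qs) - np.quantile(neutral, qs)
    # a pure shift yields a flat delta plot: every quantile moves by about -30 ms;
    # a tau (tail) effect would instead make delta grow toward the 0.9 quantile
    ```

    In this framework, an increase in tau alongside a mu shift, as reported above, argues against a purely early, preconscious locus of the valence effect.
    
    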

  17. Geostationary Lightning Mapper: Lessons Learned from Post Launch Test

    NASA Astrophysics Data System (ADS)

    Edgington, S.; Tillier, C. E.; Demroff, H.; VanBezooijen, R.; Christian, H. J., Jr.; Bitzer, P. M.

    2017-12-01

    Pre-launch calibration and algorithm design for the GOES Geostationary Lightning Mapper resulted in a successful and trouble-free on-orbit activation and post-launch test sequence. Within minutes of opening the GLM aperture door on January 4th, 2017, lightning was detected across the entire field of view. During the six-month post-launch test period, numerous processing parameters on board the instrument and in the ground processing algorithms were fine-tuned. Demonstrated on-orbit performance exceeded pre-launch predictions. We provide an overview of the ground calibration sequence, on-orbit tuning of the instrument, tuning of the ground processing algorithms (event filtering and navigation). We also touch on new insights obtained from analysis of a large and growing archive of raw GLM data, containing 3e8 flash detections derived from over 1e10 full-disk images of the Earth.

  18. New paradigms in internal architecture design and freeform fabrication of tissue engineering porous scaffolds.

    PubMed

    Yoo, Dongjin

    2012-07-01

    Advanced additive manufacture (AM) techniques are now being developed to fabricate scaffolds with controlled internal pore architectures in the field of tissue engineering. In general, these techniques use a hybrid method which combines computer-aided design (CAD) with computer-aided manufacturing (CAM) tools to design and fabricate complicated three-dimensional (3D) scaffold models. The mathematical descriptions of micro-architectures along with the macro-structures of the 3D scaffold models are limited by current CAD technologies as well as by the difficulty of transferring the designed digital models to standard formats for fabrication. To overcome these difficulties, we have developed an efficient internal pore architecture design system based on triply periodic minimal surface (TPMS) unit cell libraries and associated computational methods to assemble TPMS unit cells into an entire scaffold model. In addition, we have developed a process planning technique based on TPMS internal architecture pattern of unit cells to generate tool paths for freeform fabrication of tissue engineering porous scaffolds. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
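    As a toy illustration of the TPMS-based design idea in the record above (not the authors' system), the gyroid, a common TPMS unit cell, can be evaluated on a voxel grid and thresholded into a porous solid; the level-set value chosen below to control porosity is an illustrative assumption.

    ```python
    import numpy as np

    def gyroid(x, y, z):
        # implicit gyroid surface: F = 0 is the minimal surface
        return (np.sin(x) * np.cos(y)
                + np.sin(y) * np.cos(z)
                + np.sin(z) * np.cos(x))

    # sample one periodic unit cell on a voxel grid
    n = 48
    g = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    X, Y, Z = np.meshgrid(g, g, g, indexing="ij")
    F = gyroid(X, Y, Z)

    solid = F < 0.0                 # level set 0: roughly 50% porosity
    porosity = 1.0 - solid.mean()
    denser = F < 0.6                # raising the level grows the solid phase
    ```

    Tiling such thresholded unit cells over a macro-shape, and slicing the result into layer tool paths, is the essence of the assembly and process-planning steps the abstract describes.
    
    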

  19. A Visual Decision Aid for Gear Materials Selection

    NASA Astrophysics Data System (ADS)

    Maity, S. R.; Chakraborty, S.

    2013-10-01

    Materials play an important role during the entire design process, and designers need to identify materials with specific functionalities in order to find feasible design concepts. While selecting materials for engineering designs from an ever-increasing array of alternatives, each with its own characteristics, applications, advantages and limitations, a clear understanding of the functional requirements for each individual component is required, and various important criteria need to be considered. Although various approaches have already been adopted by past researchers to solve material selection problems, they all require profound knowledge of mathematics on the part of the designers for their implementation. This paper proposes the application of an integrated preference ranking organization method for enrichment evaluation and geometrical analysis for interactive aid method as a visual decision aid for material selection. Two real-world gear material selection problems are solved, which proves the potential and usefulness of this combined approach. It is observed that Nitralloy 135M and glass-fiber-reinforced Nylon 6/6 are, respectively, the choicest metallic and non-metallic gear materials.
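    The ranking side of the approach above (the preference ranking organization method for enrichment evaluation, commonly abbreviated PROMETHEE) can be sketched as a bare-bones PROMETHEE II computation with the usual (step) preference function; the decision matrix and weights below are invented for illustration, and the visual (GAIA-style) analysis is omitted.

    ```python
    import numpy as np

    def promethee_ii(matrix, weights):
        """Net outranking flows for alternatives (rows) scored on
        benefit criteria (columns), using the usual preference function."""
        m = matrix.shape[0]
        phi_plus = np.zeros(m)
        phi_minus = np.zeros(m)
        for i in range(m):
            for j in range(m):
                if i == j:
                    continue
                d = matrix[i] - matrix[j]
                pref = weights @ (d > 0)      # aggregated preference of i over j
                phi_plus[i] += pref / (m - 1)
                phi_minus[j] += pref / (m - 1)
        return phi_plus - phi_minus           # net flow; higher ranks first

    # hypothetical gear materials scored on three benefit criteria
    materials = ["A", "B", "C"]
    scores = np.array([[9.0, 7.0, 8.0],
                       [6.0, 8.0, 5.0],
                       [5.0, 4.0, 6.0]])
    weights = np.array([0.5, 0.3, 0.2])       # criteria weights, sum to 1
    net_flow = promethee_ii(scores, weights)
    best = materials[int(np.argmax(net_flow))]
    ```

    Net flows always sum to zero across alternatives; the complete ranking follows by sorting them in descending order.
    
    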

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Joshua; Burnham, Laurie; Jones, Christian Birk

    The U.S. DOE Regional Test Center for Solar Technologies program was established to validate photovoltaic (PV) technologies installed in a range of different climates. The program is funded by the Energy Department's SunShot Initiative. The initiative seeks to make solar energy cost competitive with other forms of electricity by the end of the decade. Sandia National Laboratory currently manages four different sites across the country. The National Renewable Energy Laboratory manages a fifth site in Colorado. The entire PV portfolio currently includes 20 industry partners and almost 500 kW of installed systems. The program follows a defined process that outlines tasks, milestones, agreements, and deliverables. The process is broken out into four main parts: 1) planning and design, 2) installation, 3) operations, and 4) decommissioning. This operations manual defines the various elements of each part.

  1. Video over IP design guidebook.

    DOT National Transportation Integrated Search

    2009-12-01

    Texas Department of Transportation (TxDOT) engineers are responsible for the design, evaluation, and : implementation of video solutions across the entire state. These installations occur with vast differences in : requirements, expectations, and con...

  2. Creation and utilization of a World Wide Web based space radiation effects code: SIREST

    NASA Technical Reports Server (NTRS)

    Singleterry, R. C. Jr; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.; Thibeault, S. A.; Noor, A. K.; Cucinotta, F. A.; Badavi, F. F.; Chang, C. K.; Qualls, G. D.; hide

    2001-01-01

    In order for humans and electronics to fully and safely operate in the space environment, codes like HZETRN (High Charge and Energy Transport) must be included in any designer's toolbox for design evaluation with respect to radiation damage. Currently, spacecraft designers do not have easy access to accurate radiation codes like HZETRN to evaluate their designs for radiation effects on humans and electronics. Today, the World Wide Web is sophisticated enough to support the entire HZETRN code and all of the associated pre- and post-processing tools. This package is called SIREST (Space Ionizing Radiation Effects and Shielding Tools). There are many advantages to SIREST. The most important advantage is the instant update capability of the web. Another major advantage is the modularity that the web imposes on the code. Right now, the major disadvantage of SIREST is its modularity inside the designer's system. This mostly comes from the fact that a consistent interface between the designer and the computer system used to evaluate the design is incomplete. This, however, is to be solved in the Intelligent Synthesis Environment (ISE) program currently being funded by NASA.

  3. The implementation of microstructural and heat treatment models to development of forming technology of critical aluminum-alloy parts

    NASA Astrophysics Data System (ADS)

    Biba, Nikolay; Alimov, Artem; Shitikov, Andrey; Stebunov, Sergei

    2018-05-01

    The demand for high-performance and energy-efficient transportation systems has boosted interest in lightweight design solutions. To achieve maximum weight reductions, it is not enough just to replace steel parts by their aluminium analogues; it is necessary to change the entire concept of vehicle design. In this case we must develop methods for manufacturing a variety of critical parts with unusual and difficult-to-produce shapes. The mechanical properties of the material in these parts must also be optimised and tightly controlled to provide the best distribution within the part volume. The only way to achieve these goals is to implement technology development methods based on simulation of the entire manufacturing chain, from preparing a billet through the forming operations and heat treatment of the product. The paper presents an approach to such technology development. The simulation of the technological chain starts with extruding a round billet. Depending on the extrusion process parameters, the billet can have different levels of material workout and variation of grain size throughout the volume. After extrusion, the billet is formed into the required shape in a forging process. The main requirements at this stage are to get the near-net shape of the product without defects and to provide a proper configuration of grain flow that strengthens the product in the most critical direction. Then the product undergoes solution treatment, quenching and ageing. The simulation of all these stages is performed by the QForm FEM code, which provides thermo-mechanically coupled simulation of the deformation of the material during extrusion and forging. To provide microstructure and heat treatment simulation, special subroutines have been developed by the authors. The proposed approach is illustrated by an industrial case study.

  4. Introduction of novel 3D-printed superficial applicators for high-dose-rate skin brachytherapy.

    PubMed

    Jones, Emma-Louise; Tonino Baldion, Anna; Thomas, Christopher; Burrows, Tom; Byrne, Nick; Newton, Victoria; Aldridge, Sarah

    Custom-made surface mold applicators often allow more flexibility when carrying out skin brachytherapy, particularly for small treatment areas with high surface obliquity. They can, however, be difficult to manufacture, particularly if there is a lack of experience in superficial high-dose-rate brachytherapy techniques or with limited resources. We present a novel method of manufacturing superficial brachytherapy applicators utilizing three-dimensional (3D)-printing techniques. We describe the treatment planning process and the process of applicator manufacture. The treatment planning process, with the introduction of a pre-plan, allows for an "ideal" catheter arrangement within an applicator to be determined, exploiting varying catheter orientations, heights, and curvatures if required. The pre-plan arrangement is then 3D printed to the exact specifications of the pre-plan applicator design. This results in improved target volume coverage and improved sparing of organs at risk. Using a pre-plan technique for ideal catheter placement followed by automated 3D-printed applicator manufacture has greatly improved the entire process of superficial high-dose-rate brachytherapy treatment. We are able to design and manufacture flexible, well-fitting, superior quality applicators resulting in a more efficient and improved patient pathway and patient experience. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  5. Data-based control of a multi-step forming process

    NASA Astrophysics Data System (ADS)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. In the field of forming technology, however, the fourth industrial revolution has so far arrived only gradually. In order to make a valuable contribution to the digital factory, the control of a multistage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how a tangible interface between the different forming machines can be designed, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes that are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for an improved process control of the subsequent process. On the basis of the gained scientific knowledge, it is possible to make forming operations more robust and at the same time more flexible, and thus create the foundation for linking various production processes in an efficient way.
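    As a minimal illustration of the control idea described above (the measured quantity, nominal values, and gain below are invented, not the authors' setup), a measurement recorded in the first forming step can be handed over an interface to correct the setpoint of the subsequent step:

    ```python
    # sketch of step-to-step data handoff in a multi-step forming chain
    NOMINAL_THICKNESS = 2.00   # mm, target thickness after step 1 (assumed)
    NOMINAL_STROKE = 50.0      # mm, nominal press stroke of step 2 (assumed)
    GAIN = 10.0                # mm of stroke correction per mm deviation (assumed)

    def step1_measurement(measured_thickness):
        # data recorded in the first step: deviation from nominal
        return measured_thickness - NOMINAL_THICKNESS

    def step2_setpoint(deviation):
        # the subsequent step compensates the recorded deviation
        return NOMINAL_STROKE - GAIN * deviation

    deviation = step1_measurement(2.05)   # part came out 0.05 mm too thick
    stroke = step2_setpoint(deviation)    # step 2 presses 0.5 mm deeper
    ```

    In a real chain the interface would carry such per-part records between machines, which is precisely the kind of coupling that makes the process both more robust and more flexible.
    
    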

  6. Health conditions of inmates in Italy.

    PubMed

    Voller, Fabio; Silvestri, Caterina; Martino, Gianrocco; Fanti, Eleonora; Bazzerla, Giorgio; Ferrari, Fabio; Grignani, Marco; Libianchi, Sandro; Pagano, Antonio Maria; Scarpa, Franco; Stasi, Cristina; Di Fiandra, Teresa

    2016-11-16

    Several studies have shown that prison is characterized by a higher prevalence of chronic diseases than unconfined settings. The aim of this study was to describe the characteristics and health of inmates, focusing on internal diseases. We designed a specific clinical record using the Python programming language. We considered all of the diagnoses according to the ICD-9-CM. Of a total of 17,086 inmates, 15,751 were enrolled in our study (M = 14,835; F = 869), corresponding to 92.2% of the entire inmate population (mean age of 39.6 years). The project involved a total of 57 detention facilities in six Italian regions (for a total of 28% of all detainees in Italy), as counted in a census taken on February 3, 2014. From the entire study sample, 32.5% of prisoners did not present any disorders, while 67.5% suffered from at least one disease. The most frequent pathologies were psychiatric (41.3%), digestive (14.5%), infectious (11.5%), cardiovascular (11.4%), endocrine, metabolic, and immune (8.6%), and respiratory (5.4%). The findings showed that a large number of detainees were affected by several chronic conditions such as hypertension, dyslipidemia and type 2 diabetes mellitus, with an unusually high prevalence for such a young population. Therefore, a series of preventive measures is recommended to strengthen the entire care process and improve the health and living conditions of prisoners.

  7. Simulation-Driven Design Approach for Design and Optimization of Blankholder

    NASA Astrophysics Data System (ADS)

    Sravan, Tatipala; Suddapalli, Nikshep R.; Johan, Pilthammar; Mats, Sigvant; Christian, Johansson

    2017-09-01

    Reliable design of stamping dies is desired for efficient and safe production. The design of stamping dies is today mostly based on casting feasibility, although it can also be based on criteria such as fatigue, stiffness, safety, and economy. The current work presents an approach built on Simulation Driven Design, enabling Design Optimization to address this issue. A structural finite element model of a stamping die, used to produce doors for Volvo V70/S80 car models, is studied. This die had developed cracks during its usage. To understand the behaviour of the stress distribution in the stamping die, structural analysis of the die is conducted and critical regions with high stresses are identified. The results from structural FE-models are compared with analytical calculations pertaining to the fatigue properties of the material. To arrive at an optimum design with increased stiffness and lifetime, topology and free-shape optimization are performed. In the optimization routine, the identified critical regions of the die are set as design variables. Other optimization variables are set to maintain manufacturability of the resultant stamping die. Thereafter a CAD model is built based on the geometrical results from the topology and free-shape optimizations. The CAD model is then subjected to structural analysis to visualize the new stress distribution. This process is iterated until a satisfactory result is obtained. The final results show a reduction in stress levels of 70% with a more homogeneous distribution. Even though the mass of the die is increased by 17%, overall, a stiffer die with a longer lifetime is obtained. Finally, by reflecting on the entire process, a coordinated approach to handle such situations efficiently is presented.

  8. SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1994-01-01

    Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing are attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. 
The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
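    The module cross-reference described above can be pictured as a small pass over a keyword-structured design text. SDDL itself is a PASCAL processor with designer-chosen keywords; the Python sketch below and its MODULE/INVOKE keywords are illustrative assumptions, not SDDL's actual syntax.

```python
# Sketch: build a module cross-reference (which module invokes which)
# from a design written with hypothetical MODULE/INVOKE keywords.
def cross_reference(design_text: str) -> dict:
    """Map each module name to the list of modules it invokes."""
    xref, current = {}, None
    for line in design_text.splitlines():
        words = line.split()
        if not words:
            continue
        if words[0] == "MODULE":
            current = words[1]
            xref[current] = []
        elif words[0] == "INVOKE" and current is not None:
            xref[current].append(words[1])
    return xref

design = """\
MODULE main
INVOKE read_input
INVOKE report
MODULE report
INVOKE format_page
"""
print(cross_reference(design))
```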

  9. Optical systems integrated modeling

    NASA Technical Reports Server (NTRS)

    Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck

    1992-01-01

    An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required. These are an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation-based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is a linkage of interface architecture that allows efficient interchange of information between existing large specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines and its users would be project level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for an integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.

  10. Use of lean and six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center.

    PubMed

    Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude

    2011-07-01

    Operating rooms (ORs) are resource-intensive and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, ie, personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
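    Two of the throughput metrics named in this record (on-time first-case starts and cases running past 5 pm) can be computed directly from case records. The data layout and the 5-minute on-time grace period below are illustrative assumptions, not the study's definitions.

```python
# Hypothetical OR case records: scheduled start, actual start, case end.
from datetime import time

cases = [
    {"scheduled": time(7, 30), "actual": time(7, 32), "end": time(12, 0)},
    {"scheduled": time(7, 30), "actual": time(7, 55), "end": time(17, 30)},
    {"scheduled": time(9, 0),  "actual": time(9, 0),  "end": time(16, 45)},
]

GRACE_MIN = 5  # assumed tolerance for counting a start as "on time"

def minutes(t: time) -> int:
    return t.hour * 60 + t.minute

on_time = sum(
    1 for c in cases
    if minutes(c["actual"]) - minutes(c["scheduled"]) <= GRACE_MIN
)
past_5pm = sum(1 for c in cases if minutes(c["end"]) > minutes(time(17, 0)))
print(f"on-time starts: {on_time}/{len(cases)}, cases past 5 pm: {past_5pm}")
```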

  11. The Design, Fabrication and Characterization of a Transparent Atom Chip

    PubMed Central

    Chuang, Ho-Chiao; Huang, Chia-Shiuan; Chen, Hung-Pin; Huang, Chi-Sheng; Lin, Yu-Hsin

    2014-01-01

    This study describes the design and fabrication of transparent atom chips for atomic physics experiments. A fabrication process was developed to define the wire patterns on a transparent glass substrate to create the desired magnetic field for atom trapping experiments. An area on the chip was reserved for optical access, so that the laser light can penetrate directly through the glass substrate for the laser cooling process. Furthermore, the thermal conductivity of the glass substrate is poorer than that of other common atom chip substrate materials, for example silicon, silicon carbide, and aluminum nitride. Thus, heat dissipation copper blocks are designed on the front and back of the glass substrate to improve the electrical current conduction. The testing results showed that a maximum burnout current of 2 A was measured from the wire pattern (with a width of 100 μm and a height of 20 μm) without any heat dissipation design, and that it can increase to 2.5 A with a heat dissipation design on the front side of the atom chips. Therefore, heat dissipation copper blocks were designed and fabricated on the back of the glass substrate just under the wire patterns, which increases the maximum burnout current to 4.5 A. Moreover, a maximum burnout current of 6 A was achieved when the entire backside glass substrate was recessed and a thicker copper block was electroplated, which meets most requirements of atomic physics experiments. PMID:24922456

  12. An optimal adder-based hardware architecture for the DCT/SA-DCT

    NASA Astrophysics Data System (ADS)

    Kinane, Andrew; Muresan, Valentin; O'Connor, Noel

    2005-07-01

    The explosive growth of the mobile multimedia industry has accentuated the need for efficient VLSI implementations of the associated computationally demanding signal processing algorithms. This need becomes greater as end-users demand increasingly enhanced features and more advanced underpinning video analysis. One such feature is object-based video processing as supported by the MPEG-4 core profile, which allows content-based interactivity. MPEG-4 has many computationally demanding underlying algorithms, an example of which is the Shape Adaptive Discrete Cosine Transform (SA-DCT). The dynamic nature of the SA-DCT processing steps poses significant VLSI implementation challenges, and many of the previously proposed approaches use area- and power-consumptive multipliers. Most also ignore the subtleties of the packing steps and the manipulation of the shape information. We propose a new multiplier-less serial datapath based solely on adders and multiplexers to improve area and power. The adder cost is minimised by employing resource re-use methods. The number of (physical) adders used has been derived using a common sub-expression elimination algorithm. Additional energy efficiency is factored into the design by employing guarded evaluation and local clock gating. Our design implements the SA-DCT packing with minimal switching using efficient addressing logic with a transpose memory RAM. The entire design has been synthesized using TSMC 0.09 µm TCBN90LP technology, yielding a gate count of 12028 for the datapath and its control logic.
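    The multiplier-less principle behind such a datapath is to replace each constant multiplication with shifts and adds. As a self-contained illustration (not the paper's actual decomposition), a DCT coefficient such as cos(π/4) ≈ 0.7071 can be quantized to 8 fractional bits, round(0.7071 × 256) = 181 = 128 + 32 + 16 + 4 + 1, and applied with adders only:

```python
# Fixed-point constant multiplication by 181/256 ~ cos(pi/4),
# using only shifts and adds (181 = 0b10110101).
COEFF_Q8 = 181

def mul_shift_add(x: int) -> int:
    """Compute x * 181 as (x<<7) + (x<<5) + (x<<4) + (x<<2) + x."""
    return (x << 7) + (x << 5) + (x << 4) + (x << 2) + x

x = 100
assert mul_shift_add(x) == x * COEFF_Q8
print(mul_shift_add(x) >> 8)  # ~ 100 * 0.7071 -> prints 70
```

    Common sub-expression elimination, as used in the record, would further share terms like (x<<5) + x across several coefficients so that fewer physical adders are needed.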

  13. Neutron residual stress measurement and numerical modeling in a curved thin-walled structure by laser powder bed fusion additive manufacturing

    DOE PAGES

    An, Ke; Yuan, Lang; Dial, Laura; ...

    2017-09-11

    Severe residual stresses in metal parts made by laser powder bed fusion additive manufacturing processes (LPBFAM) can cause both distortion and cracking during the fabrication processes. Limited data is currently available both for iterating through process conditions and design and, in particular, for validating numerical models to accelerate process certification. In this work, residual stresses of a curved thin-walled structure, made of Ni-based superalloy Inconel 625™ and fabricated by LPBFAM, were resolved by neutron diffraction without measuring the stress-free lattices along both the build and the transverse directions. The stresses of the entire part during fabrication and after cooling down were predicted by a simplified layer-by-layer finite element based numerical model. The simulated and measured stresses were found to be in good quantitative agreement. The validated simplified simulation methodology will make it possible to assess residual stresses in more complex structures and to significantly reduce manufacturing cycle time.

  14. Neutron residual stress measurement and numerical modeling in a curved thin-walled structure by laser powder bed fusion additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Ke; Yuan, Lang; Dial, Laura

    Severe residual stresses in metal parts made by laser powder bed fusion additive manufacturing processes (LPBFAM) can cause both distortion and cracking during the fabrication processes. Limited data is currently available both for iterating through process conditions and design and, in particular, for validating numerical models to accelerate process certification. In this work, residual stresses of a curved thin-walled structure, made of Ni-based superalloy Inconel 625™ and fabricated by LPBFAM, were resolved by neutron diffraction without measuring the stress-free lattices along both the build and the transverse directions. The stresses of the entire part during fabrication and after cooling down were predicted by a simplified layer-by-layer finite element based numerical model. The simulated and measured stresses were found to be in good quantitative agreement. The validated simplified simulation methodology will make it possible to assess residual stresses in more complex structures and to significantly reduce manufacturing cycle time.

  15. Multi-scale process and supply chain modelling: from lignocellulosic feedstock to process and products

    PubMed Central

    Hosseini, Seyed Ali; Shah, Nilay

    2011-01-01

    There is a large body of literature regarding the choice and optimization of different processes for converting feedstock to bioethanol and bio-commodities; moreover, there has been some reasonable technological development in bioconversion methods over the past decade. However, the eventual cost and other important metrics relating to sustainability of biofuel production will be determined not only by the performance of the conversion process, but also by the performance of the entire supply chain from feedstock production to consumption. Moreover, in order to ensure world-class biorefinery performance, both the network and the individual components must be designed appropriately, and allocation of resources over the resulting infrastructure must effectively be performed. The goal of this work is to describe the key challenges in bioenergy supply chain modelling and then to develop a framework and methodology to show how multi-scale modelling can pave the way to answer holistic supply chain questions, such as the prospects for second generation bioenergy crops. PMID:22482032

  16. mdFoam+: Advanced molecular dynamics in OpenFOAM

    NASA Astrophysics Data System (ADS)

    Longshaw, S. M.; Borg, M. K.; Ramisetti, S. B.; Zhang, J.; Lockerby, D. A.; Emerson, D. R.; Reese, J. M.

    2018-03-01

    This paper introduces mdFoam+, which is an MPI parallelised molecular dynamics (MD) solver implemented entirely within the OpenFOAM software framework. It is open-source and released under the same GNU General Public License (GPL) as OpenFOAM. The source code is released as a publicly open software repository that includes detailed documentation and tutorial cases. Since mdFoam+ is designed entirely within the OpenFOAM C++ object-oriented framework, it inherits a number of key features. The code is designed for extensibility and flexibility, so it is aimed first and foremost as an MD research tool, in which new models and test cases can be developed and tested rapidly. Implementing mdFoam+ in OpenFOAM also enables easier development of hybrid methods that couple MD with continuum-based solvers. Setting up MD cases follows the standard OpenFOAM format, as mdFoam+ also relies upon the OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of an MD simulation is not typical of most OpenFOAM applications. Results show that mdFoam+ compares well to another well-known MD code (e.g. LAMMPS) in terms of benchmark problems, although it also has additional functionality that does not exist in other open-source MD codes.

  17. Neurosecurity: security and privacy for neural devices.

    PubMed

    Denning, Tamara; Matsuoka, Yoky; Kohno, Tadayoshi

    2009-07-01

    An increasing number of neural implantable devices will become available in the near future due to advances in neural engineering. This discipline holds the potential to improve many patients' lives dramatically by offering improved-and in some cases entirely new-forms of rehabilitation for conditions ranging from missing limbs to degenerative cognitive diseases. The use of standard engineering practices, medical trials, and neuroethical evaluations during the design process can create systems that are safe and that follow ethical guidelines; unfortunately, none of these disciplines currently ensure that neural devices are robust against adversarial entities trying to exploit these devices to alter, block, or eavesdrop on neural signals. The authors define "neurosecurity"-a version of computer science security principles and methods applied to neural engineering-and discuss why neurosecurity should be a critical consideration in the design of future neural devices.

  18. The Future of Pharmaceutical Manufacturing Sciences

    PubMed Central

    2015-01-01

    The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial‐scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. In that regard, state‐of‐the‐art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular‐based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed, covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot‐melt processing and printing‐based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci 104:3612–3638, 2015 PMID:26280993
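    The design-of-experiments (DoE) step mentioned in this review can be sketched as the enumeration of a full-factorial design over process factors. The factor names and levels below are illustrative assumptions, not examples from the review.

```python
# Full-factorial DoE: every combination of the chosen factor levels.
from itertools import product

factors = {
    "granulation_speed_rpm": [200, 400],
    "binder_pct": [2.0, 4.0],
    "drying_temp_c": [50, 60],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2^3 = 8 experimental runs
for run in runs:
    print(run)
```

    In practice, fractional-factorial or optimal designs reduce the run count when many factors are screened.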

  19. Optical surface analysis: a new technique for the inspection and metrology of optoelectronic films and wafers

    NASA Astrophysics Data System (ADS)

    Bechtler, Laurie; Velidandla, Vamsi

    2003-04-01

    In response to demand for higher volumes and greater product capability, integrated optoelectronic device processing is rapidly increasing in complexity, benefiting from techniques developed for conventional silicon integrated circuit processing. The needs for high product yield and low manufacturing cost are also similar to the silicon wafer processing industry. This paper discusses the design and use of an automated inspection instrument called the Optical Surface Analyzer (OSA) to evaluate two critical production issues in optoelectronic device manufacturing: (1) film thickness uniformity, and (2) defectivity at various process steps. The OSA measurement instrument is better suited to photonics process development than most equipment developed for conventional silicon wafer processing in two important ways: it can handle both transparent and opaque substrates (unlike most inspection and metrology tools), and it is a full-wafer inspection method that captures defects and film variations over the entire substrate surface (unlike most film thickness measurement tools). Measurement examples will be provided in the paper for a variety of films and substrates used for optoelectronics manufacturing.
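    A film-thickness uniformity result of the kind such a full-wafer tool reports can be summarized as a percent non-uniformity over measurement sites. The formula (max − min) / (2 × mean) is one common convention, and the site values below are invented for illustration.

```python
# Percent non-uniformity from per-site film thickness measurements.
thickness_nm = [101.0, 99.0, 100.5, 98.5, 100.0]  # hypothetical sites

mean = sum(thickness_nm) / len(thickness_nm)
nonuniformity_pct = 100 * (max(thickness_nm) - min(thickness_nm)) / (2 * mean)
print(f"mean = {mean:.2f} nm, non-uniformity = {nonuniformity_pct:.2f}%")
```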

  20. The Future of Pharmaceutical Manufacturing Sciences.

    PubMed

    Rantanen, Jukka; Khinast, Johannes

    2015-11-01

    The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. In that regard, state-of-the-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed, covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.

  1. Converting customer expectations into achievable results.

    PubMed

    Landis, G A

    1999-11-01

    It is not enough in today's environment to just meet customers' expectations--we must exceed them. Therefore, one must learn what constitutes expectations. These needs have expanded during the past few years from just manufacturing the product and looking at the outcome from a provincial standpoint. Now we must understand and satisfy the entire supply chain. To manage this process and satisfy the customer, the process now involves the supplier, the manufacturer, and the entire distribution system.

  2. CMOS compatible fabrication process of MEMS resonator for timing reference and sensing application

    NASA Astrophysics Data System (ADS)

    Huynh, Duc H.; Nguyen, Phuong D.; Nguyen, Thanh C.; Skafidas, Stan; Evans, Robin

    2015-12-01

    Frequency reference and timing control devices are ubiquitous in electronic applications, and at least one resonator is required in each of these devices. Currently, electromechanical resonators such as crystal resonators and ceramic resonators are the ultimate choice, and this tendency will probably continue for many more years. However, current market demands for small, low-power, cheap, and reliable products have exposed many limitations of this type of resonator. They cannot be integrated into standard CMOS (complementary metal-oxide-semiconductor) ICs (integrated circuits) due to material and fabrication process incompatibility. Currently, these devices are off-chip and require external circuitry to interface with the ICs. This configuration significantly increases the overall size and cost of the entire electronic system. In addition, extra external connections, especially at high frequency, can degrade the performance of the entire system through signal degradation and parasitic effects. Furthermore, due to their off-chip packaging, these devices are quite expensive, particularly high-frequency and high-quality-factor devices. To address these issues, researchers have been intensively studying an alternative type of resonator utilizing the emerging MEMS (micro-electro-mechanical systems) technology. Recent progress in this field has demonstrated a MEMS resonator with a resonant frequency of 2.97 GHz and a quality factor (measured in vacuum) of 42900. Despite this great achievement, this prototype is still far from being fully integrated into a CMOS system due to incompatibility in the fabrication process and its high series motional impedance. On the other hand, fully integrated MEMS resonators have been demonstrated, but at lower frequency and quality factor. 
We propose a design and fabrication process for a low-cost, high-frequency, high-quality MEMS resonator that can be integrated into a standard CMOS IC. This device is expected to operate in the hundreds-of-MHz frequency range, with a quality factor surpassing 10000 and a series motional impedance low enough to be matched to conventional systems without enormous effort. This MEMS resonator can be used in the design of many blocks in wireless and RF (radio frequency) systems, such as low-phase-noise oscillators, band-pass filters, and power amplifiers, and in many sensing applications.

  3. Artificial neural networks to model formulation-property correlations in the process of inline-compounding on an injection moulding machine

    NASA Astrophysics Data System (ADS)

    Moritzer, Elmar; Müller, Ellen; Martin, Yannick; Kleeschulte, Rainer

    2015-05-01

    Today the global market poses great challenges for industrial product development. Complexity, diversity of variants, flexibility and individuality are just some of the features that products have to offer today. In addition, the product series have shorter lifetimes. Because of their high capacity for adaption, polymers are increasingly able to displace traditional materials such as wood, glass and metals from various fields of application. Polymers can only be used to substitute other materials, however, if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important. Integrating the compounding step in the injection moulding process permits a more efficient and faster development process for a new polymer formulation, making it possible to create new product-specific materials. This process is called inline-compounding on an injection moulding machine. The entire process sequence is supported by software from Bayer Technology called Product Design Workbench (PDWB), which provides assistance in all the individual steps from data management, via analysis and model compilation, right through to the optimization of the formulation and the design of experiments. The software is based on artificial neural networks and can model the formulation-property correlations and thus enable different formulations to be optimized. The study presents the workflow and the modelling performed with the software.
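    The formulation-property correlation modelling described above is done in PDWB with artificial neural networks. As a self-contained illustration of the idea, the sketch below fits a simple least-squares surrogate mapping one hypothetical formulation variable (filler content) to one property (stiffness); the data and variable names are invented.

```python
# Least-squares surrogate for a formulation-property correlation
# (stand-in for the neural-network model used by the actual software).
filler_pct = [0.0, 10.0, 20.0, 30.0]              # formulation variable
stiffness_mpa = [1000.0, 1220.0, 1410.0, 1630.0]  # measured property

n = len(filler_pct)
mx = sum(filler_pct) / n
my = sum(stiffness_mpa) / n
slope = sum((x - mx) * (y - my) for x, y in zip(filler_pct, stiffness_mpa)) \
        / sum((x - mx) ** 2 for x in filler_pct)
intercept = my - slope * mx

def predict(x: float) -> float:
    """Predicted stiffness for a candidate formulation."""
    return intercept + slope * x

print(f"stiffness(15%) ~ {predict(15.0):.0f} MPa")
```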

  4. Generation and use of human 3D-CAD models

    NASA Astrophysics Data System (ADS)

    Grotepass, Juergen; Speyer, Hartmut; Kaiser, Ralf

    2002-05-01

    Individualized products are one of the ten mega trends of the 21st century, with human modeling as the key issue for tomorrow's design and product development. The use of human modeling software for computer-based ergonomic simulations within the production process increases quality while reducing costs by 30-50 percent and shortening production time. This presentation focuses on the use of human 3D-CAD models for both the ergonomic design of working environments and made-to-measure garment production. Today, the entire production chain can be designed, and individualized models generated and analyzed, in 3D computer environments. Anthropometric design for ergonomics is matched to human needs, thus preserving health. Ergonomic simulation includes topics such as human vision, reachability, kinematics, force and comfort analysis, and international design capabilities. In Germany, more than 17 billion marks are lost to other industries because clothes do not fit. Individual clothing tailored to the customer's preference means added value, pleasure, and perfect fit. Body scanning technology is the key to the generation and use of human 3D-CAD models for both the ergonomic design of working environments and made-to-measure garment production.

  5. Modelling the EDLC-based Power Supply Module for a Maneuvering System of a Nanosatellite

    NASA Astrophysics Data System (ADS)

    Kumarin, A. A.; Kudryavtsev, I. A.

    2018-01-01

    The development of a model of the power supply module of a maneuvering system of a nanosatellite is described. The module is based on an EDLC battery as an energy buffer, and the choice of EDLC is explained. Experiments were conducted to provide data for the model. The power supply module is simulated for both the charging and the discharging processes of the battery. The difference between simulation and experiment does not exceed 0.5% for charging and 10% for discharging. The developed model can be used in early design and to adjust charger and load parameters, and it can be expanded to represent the entire power system.
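    A first-order version of the kind of model described above treats the EDLC as a capacitance charged through a series resistance from a constant source. The component values below are illustrative assumptions, not the paper's parameters.

```python
# First-order EDLC charging model: v(t) = Vs * (1 - exp(-t / (R*C))).
import math

R = 0.5         # ohm, assumed series resistance
C = 10.0        # farad, assumed EDLC capacitance
V_SOURCE = 5.0  # volt, assumed charger voltage

def v_cap(t: float) -> float:
    """Capacitor voltage at time t while charging from 0 V."""
    return V_SOURCE * (1.0 - math.exp(-t / (R * C)))

tau = R * C
print(f"tau = {tau} s, v(tau) = {v_cap(tau):.2f} V")  # ~63% of 5 V at t = tau
```

    A model of the accuracy reported in the record would add voltage-dependent capacitance and leakage terms fitted to the experimental data.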

  6. Self-assembling nucleic acid delivery vehicles via linear, water-soluble, cyclodextrin-containing polymers.

    PubMed

    Davis, M E; Pun, S H; Bellocq, N C; Reineke, T M; Popielarski, S R; Mishra, S; Heidel, J D

    2004-01-01

    Non-viral (synthetic) nucleic acid delivery systems have the potential to provide for the practical application of nucleic acid-based therapeutics. We have designed and prepared a tunable, non-viral nucleic acid delivery system that self-assembles with nucleic acids and centers around a new class of polymeric materials; namely, linear, water-soluble cyclodextrin-containing polymers. The relationships between polymer structure and gene delivery are illustrated, and the roles of the cyclodextrin moieties for minimizing toxicity and forming inclusion complexes in the self-assembly processes are highlighted. This vehicle is the first example of a polymer-based gene delivery system formed entirely by self-assembly.

  7. Integrated phased array for wide-angle beam steering.

    PubMed

    Yaacobi, Ami; Sun, Jie; Moresco, Michele; Leake, Gerald; Coolbaugh, Douglas; Watts, Michael R

    2014-08-01

    We demonstrate an on-chip optical phased array fabricated in a CMOS compatible process with continuous, fast (100 kHz), wide-angle (51°) beam-steering suitable for applications such as low-cost LIDAR systems. The device demonstrates the largest (51°) beam-steering and beam-spacing to date while providing the ability to steer continuously over the entire range. Continuous steering is enabled by a cascaded phase shifting architecture utilizing low-power, small-footprint thermo-optic phase shifters. We demonstrate these results in the telecom C-band, but the same design can easily be adjusted for any wavelength between 1.2 and 3.5 μm.
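
    The relationship between the inter-element phase increment and the steered beam angle can be illustrated with the standard phased-array relation sin(θ) = Δφ·λ/(2π·d). The pitch and phase values below are illustrative assumptions, not the actual geometry of the device in this record.

```python
import math

def steering_angle_deg(wavelength_m, spacing_m, delta_phi_rad):
    """Beam angle for a uniform inter-element phase increment delta_phi:
    sin(theta) = delta_phi * wavelength / (2 * pi * spacing)."""
    s = delta_phi_rad * wavelength_m / (2.0 * math.pi * spacing_m)
    return math.degrees(math.asin(s))

# Illustrative numbers: 1550 nm (telecom C-band) light, 2 um element
# pitch, pi/2 phase step between neighbouring emitters.
theta = steering_angle_deg(1.55e-6, 2.0e-6, math.pi / 2)
```

    Sweeping the phase increment (e.g. with thermo-optic phase shifters, as in the abstract) sweeps the beam continuously; smaller element pitch widens the achievable steering range.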

  8. Dashboard visualizations: Supporting real-time throughput decision-making.

    PubMed

    Franklin, Amy; Gantela, Swaroop; Shifarraw, Salsawit; Johnson, Todd R; Robinson, David J; King, Brent R; Mehta, Amit M; Maddow, Charles L; Hoot, Nathan R; Nguyen, Vickie; Rubio, Adriana; Zhang, Jiajie; Okafor, Nnaemeka G

    2017-07-01

    Providing timely and effective care in the emergency department (ED) requires the management of individual patients as well as the flow and demands of the entire department. Strategic changes to work processes, such as adding a flow coordination nurse or a physician in triage, have demonstrated improvements in throughput times. However, such global strategic changes do not address the real-time, often opportunistic workflow decisions of individual clinicians in the ED. We believe that real-time representation of the status of the entire emergency department, and each patient within it, through information visualizations will better support in-the-moment clinical decision-making and provide for rapid intervention to improve ED flow. This notion is based on previous work where we found that clinicians' workflow decisions were often based on an in-the-moment local perspective, rather than a global perspective. Here, we discuss the challenges of designing and implementing visualizations for the ED through a discussion of the development of our prototype Throughput Dashboard and the potential it holds for supporting real-time decision-making. Copyright © 2017. Published by Elsevier Inc.

  9. Highly flexible and all-solid-state paperlike polymer supercapacitors.

    PubMed

    Meng, Chuizhou; Liu, Changhong; Chen, Luzhuo; Hu, Chunhua; Fan, Shoushan

    2010-10-13

    In recent years, much effort has been dedicated to achieving thin, lightweight, and even flexible energy-storage devices for wearable electronics. Here we demonstrate a novel kind of ultrathin all-solid-state supercapacitor configuration, made by an extremely simple process, using two slightly separated polyaniline-based electrodes well solidified in the H(2)SO(4)-polyvinyl alcohol gel electrolyte. The thickness of the entire device is comparable to that of a piece of commercial standard A4 print paper. In its highly flexible (twisted) state, the integrated device shows a high specific capacitance of 350 F/g for the electrode materials, good cycle stability after 1000 cycles, and a leakage current as small as 17.2 μA. Furthermore, due to its polymer-based component structure, it has a specific capacitance as high as 31.4 F/g for the entire device, which is more than 6 times that of current high-level commercial supercapacitor products. These highly flexible and all-solid-state paperlike polymer supercapacitors may open new device-configuration design opportunities for energy storage in future wearable electronics.
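
    Specific-capacitance figures like those quoted above are conventionally extracted from galvanostatic discharge via C = I·Δt/(ΔV·m). A minimal sketch follows; the current, time, voltage window, and mass below are illustrative assumptions chosen only so the arithmetic reproduces the reported 350 F/g electrode value, not the paper's measured data.

```python
def specific_capacitance(current_a, discharge_time_s, delta_v, mass_g):
    """Standard galvanostatic-discharge estimate: C = I * dt / (dV * m), in F/g."""
    return current_a * discharge_time_s / (delta_v * mass_g)

# Hypothetical discharge: 1 mA for 350 s over a 1 V window, 1 mg of
# active electrode material.
c_spec = specific_capacitance(current_a=0.001, discharge_time_s=350.0,
                              delta_v=1.0, mass_g=0.001)
```

    Normalizing by the full device mass instead of the electrode mass is what produces the much lower whole-device figure (31.4 F/g) reported in the abstract.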

  10. Is the perception of 3D shape from shading based on assumed reflectance and illumination?

    PubMed

    Todd, James T; Egan, Eric J L; Phillips, Flip

    2014-01-01

    The research described in the present article was designed to compare three types of image shading: one generated with a Lambertian BRDF and homogeneous illumination such that image intensity was determined entirely by local surface orientation irrespective of position; one that was textured with a linear intensity gradient, such that image intensity was determined entirely by local surface position irrespective of orientation; and another that was generated with a Lambertian BRDF and inhomogeneous illumination such that image intensity was influenced by both position and orientation. A gauge figure adjustment task was used to measure observers' perceptions of local surface orientation on the depicted surfaces, and the probe points included 60 pairs of regions that both had the same orientation. The results show clearly that observers' perceptions of these three types of stimuli were remarkably similar, and that probe regions with similar apparent orientations could have large differences in image intensity. This latter finding is incompatible with any process for computing shape from shading that assumes any plausible reflectance function combined with any possible homogeneous illumination.
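
    The contrast between orientation-based and position-based shading in these stimuli can be sketched directly. This is a minimal illustration of the two intensity rules the abstract describes (Lambertian n·l versus a linear vertical gradient), not the authors' rendering code; the normal and light vectors are arbitrary examples.

```python
import math

def lambert_intensity(normal, light):
    """Lambertian shading: intensity depends only on surface orientation
    via the dot product n . l, clamped at zero."""
    return max(0.0, sum(n * l for n, l in zip(normal, light)))

def gradient_intensity(y, y_min=0.0, y_max=1.0):
    """Linear gradient: intensity depends only on vertical position."""
    return (y - y_min) / (y_max - y_min)

# Two surface points with identical orientation but different heights:
n = (0.0, 0.0, 1.0)                         # shared surface normal
l = (0.0, math.sqrt(0.5), math.sqrt(0.5))   # unit light direction
same_lambert = lambert_intensity(n, l)      # identical at both points
grad_low = gradient_intensity(0.2)          # differs between the points
grad_high = gradient_intensity(0.8)
```

    Points that the Lambertian rule shades identically can receive very different intensities under the gradient rule, which is exactly the dissociation the gauge-figure experiment exploits.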

  11. Is the perception of 3D shape from shading based on assumed reflectance and illumination?

    PubMed Central

    Todd, James T.; Egan, Eric J. L.; Phillips, Flip

    2014-01-01

    The research described in the present article was designed to compare three types of image shading: one generated with a Lambertian BRDF and homogeneous illumination such that image intensity was determined entirely by local surface orientation irrespective of position; one that was textured with a linear intensity gradient, such that image intensity was determined entirely by local surface position irrespective of orientation; and another that was generated with a Lambertian BRDF and inhomogeneous illumination such that image intensity was influenced by both position and orientation. A gauge figure adjustment task was used to measure observers' perceptions of local surface orientation on the depicted surfaces, and the probe points included 60 pairs of regions that both had the same orientation. The results show clearly that observers' perceptions of these three types of stimuli were remarkably similar, and that probe regions with similar apparent orientations could have large differences in image intensity. This latter finding is incompatible with any process for computing shape from shading that assumes any plausible reflectance function combined with any possible homogeneous illumination. PMID:26034561

  12. 77 FR 6781 - Opportunity for Designation in the Topeka, KS; Cedar Rapids, IA; Minot, ND; and Cincinnati, OH...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... Colorado, Kansas, Nebraska, and Wyoming, is assigned to this official agency: 1. The entire State of Colorado. 2. The entire State of Kansas. 3. In Nebraska and Wyoming: Bounded on the North by the northern...: Kansas Grain Inspection Service, Inc. (Kansas); Mid-Iowa Grain Inspection, Inc. (Mid-Iowa); Minot Grain...

  13. Integrated Design of Basic Training, Practicum and End-of-Course Assignment Modules in the Teacher Training Degree: Perception of University Teachers, Students, and School Teachers

    NASA Astrophysics Data System (ADS)

    Torremorell, Maria Carme Boqué; de Nicolás, Montserrat Alguacil; Valls, Mercè Pañellas

    Teacher training at the Blanquerna Faculty of Psychology and Educational and Sports Sciences (FPCEE), in Barcelona, has a long pedagogical tradition based on teaching innovation. Its educational style is characterised by methods focused on the students' involvement and on close collaboration with teaching practice centres. Within a core subject in the Teacher Training diploma course, students were asked to assess different methodological proposals aimed at promoting the development of their personal, social, and professional competences. In the assessment surveys, from a sample of 145 students, ratings of very satisfactory or satisfactory ranged from 83.4% to 95.8% across the entire set of methodological actions under analysis. Data obtained in this first research phase were very useful for designing basic training modules for the new Teacher Training Degree. In the second phase (in progress), active teachers are asked for their perceptions of the orientation of the practicum, its connection with the end-of-course assignment, and the in-service students' influence on innovation processes at school.

  14. Design and Field Experimentation of a Cooperative ITS Architecture Based on Distributed RSUs.

    PubMed

    Moreno, Asier; Osaba, Eneko; Onieva, Enrique; Perallos, Asier; Iovino, Giovanni; Fernández, Pablo

    2016-07-22

    This paper describes a new cooperative Intelligent Transportation System architecture that aims to enable collaborative sensing services. The main goal of this architecture is to improve transportation efficiency and performance. The system, which was validated through participation in the ICSI (Intelligent Cooperative Sensing for Improved traffic efficiency) European project, encompasses the entire process of capture and management of available road data. For this purpose, it applies a combination of cooperative services and methods for data sensing, acquisition, processing and communication amongst road users, vehicles, infrastructures and related stakeholders. Additionally, the advantages of using the proposed system are presented. The most important of these is the use of a distributed architecture, moving the system intelligence from the control centre to the peripheral devices. The global architecture of the system is presented, as well as the software design and the interaction between its main components. Finally, functional and operational results observed through the experimentation are described. This experimentation has been carried out in two real scenarios, in Lisbon (Portugal) and Pisa (Italy).

  15. Design and Field Experimentation of a Cooperative ITS Architecture Based on Distributed RSUs †

    PubMed Central

    Moreno, Asier; Osaba, Eneko; Onieva, Enrique; Perallos, Asier; Iovino, Giovanni; Fernández, Pablo

    2016-01-01

    This paper describes a new cooperative Intelligent Transportation System architecture that aims to enable collaborative sensing services. The main goal of this architecture is to improve transportation efficiency and performance. The system, which was validated through participation in the ICSI (Intelligent Cooperative Sensing for Improved traffic efficiency) European project, encompasses the entire process of capture and management of available road data. For this purpose, it applies a combination of cooperative services and methods for data sensing, acquisition, processing and communication amongst road users, vehicles, infrastructures and related stakeholders. Additionally, the advantages of using the proposed system are presented. The most important of these is the use of a distributed architecture, moving the system intelligence from the control centre to the peripheral devices. The global architecture of the system is presented, as well as the software design and the interaction between its main components. Finally, functional and operational results observed through the experimentation are described. This experimentation has been carried out in two real scenarios, in Lisbon (Portugal) and Pisa (Italy). PMID:27455277

  16. A systems engineering perspective on the human-centered design of health information systems.

    PubMed

    Samaras, George M; Horst, Richard L

    2005-02-01

    The discipline of systems engineering, over the past five decades, has used a structured systematic approach to managing the "cradle to grave" development of products and processes. While elements of this approach are typically used to guide the development of information systems that instantiate a significant user interface, it appears to be rare for the entire process to be implemented. In fact, a number of authors have put forth development lifecycle models that are subsets of the classical systems engineering method, but fail to include steps such as incremental hazard analysis and post-deployment corrective and preventative actions. Given that most health information systems have safety implications, we argue that the design and development of such systems would benefit from implementing this systems engineering approach in full. Particularly with regard to bringing a human-centered perspective to the formulation of system requirements and the configuration of effective user interfaces, this classical systems engineering method provides an excellent framework for incorporating human factors (ergonomics) knowledge and integrating ergonomists in the interdisciplinary development of health information systems.

  17. Unique Chernobyl Cranes for Deconstruction Activities in the New Safe Confinement - 13542

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parameswaran, N.A. Vijay; Chornyy, Igor; Owen, Rob

    2013-07-01

    The devastation left behind from the Chernobyl nuclear power plant (ChNPP) Unit 4 accident which occurred on April 26, 1986 presented unparalleled technical challenges to the world engineering and scientific community. One of the largest tasks that are in progress is the design and construction of the New Safe Confinement (NSC). The NSC is an engineered enclosure for the entire object shelter (OS) that includes a suite of process equipment. The process equipment will be used for the dismantling of the destroyed Chernobyl Nuclear Power Plant (ChNPP) Unit. One of the major mechanical handling systems to be installed in the NSC is the Main Cranes System (MCS). The planned decontamination and decommissioning or dismantling (D and D) activities will require the handling of heavily shielded waste disposal casks containing nuclear fuel as well as lifting and transporting extremely large structural elements. These activities, to be performed within the NSC, will require large and sophisticated cranes. The article will focus on the unique design features of the MCS for the D and D activities. (authors)

  18. System analysis of graphics processor architecture using virtual prototyping

    NASA Astrophysics Data System (ADS)

    Hancock, William R.; Groat, Jeff; Steeves, Todd; Spaanenburg, Henk; Shackleton, John

    1995-06-01

    Honeywell has been actively involved in the definition of the next generation display processors for military and commercial cockpits. A major concern is how to achieve super graphics workstation performance in avionics applications. Most notable are requirements for low volume, low power, harsh environmental conditions, real-time performance and low cost. This paper describes the application of VHDL to the system analysis tasks associated with achieving these goals in a cost effective manner. The paper will describe the top level architecture identified to provide the graphical and video processing power needed to drive future high resolution display devices and to generate more natural panoramic 3D formats. The major discussion, however, will be on the use of VHDL to model the processing elements and customized pipelines needed to realize the architecture and for doing the complex system tradeoff studies necessary to achieve a cost effective implementation. New software tools have been developed to allow 'virtual' prototyping in the VHDL environment. This results in a hardware/software codesign using VHDL performance and functional models. This unique architectural tool allows simulation and tradeoffs within a standard and tightly integrated toolset, which eventually will be used to specify and design the entire system from the top level requirements and system performance to the lowest level individual ASICs. New processing elements, algorithms, and standard graphical inputs can be designed, tested and evaluated without costly hardware prototyping using the innovative 'virtual' prototyping techniques which are evolving on this project. In addition, virtual prototyping of the display processor does not bind the preliminary design to point solutions as a physical prototype will. When the development schedule is known, one can extrapolate processing element performance and design the system around the most current technology.

  19. ScanImage: flexible software for operating laser scanning microscopes.

    PubMed

    Pologruto, Thomas A; Sabatini, Bernardo L; Svoboda, Karel

    2003-05-17

    Laser scanning microscopy is a powerful tool for analyzing the structure and function of biological specimens. Although numerous commercial laser scanning microscopes exist, some of the more interesting and challenging applications demand custom design. A major impediment to custom design is the difficulty of building custom data acquisition hardware and writing the complex software required to run the laser scanning microscope. We describe a simple, software-based approach to operating a laser scanning microscope without the need for custom data acquisition hardware. Data acquisition and control of laser scanning are achieved through standard data acquisition boards. The entire burden of signal integration and image processing is placed on the CPU of the computer. We quantitate the effectiveness of our data acquisition and signal conditioning algorithm under a variety of conditions. We implement our approach in an open source software package (ScanImage) and describe its functionality. We present ScanImage, software to run a flexible laser scanning microscope that allows easy custom design.
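
    The CPU-side signal integration the abstract describes amounts to reducing the stream of ADC samples acquired during each pixel dwell time to one pixel value. The sketch below illustrates that general idea under stated assumptions; it is not ScanImage's actual algorithm, and the sample values are arbitrary.

```python
def bin_samples_to_pixels(samples, samples_per_pixel):
    """Average the ADC samples acquired during each pixel dwell time into
    one pixel value (a hypothetical illustration of software-based signal
    integration on the host CPU)."""
    pixels = []
    for i in range(0, len(samples) - samples_per_pixel + 1, samples_per_pixel):
        chunk = samples[i:i + samples_per_pixel]
        pixels.append(sum(chunk) / samples_per_pixel)
    return pixels

# Six raw samples binned two-per-pixel yield a three-pixel scan line.
line = bin_samples_to_pixels([1, 3, 2, 2, 5, 7], samples_per_pixel=2)  # -> [2.0, 2.0, 6.0]
```

    Pushing this step onto the CPU, rather than dedicated hardware, is what lets a standard data acquisition board suffice, as the abstract argues.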

  20. Advanced online control mode selection for gas turbine aircraft engines

    NASA Astrophysics Data System (ADS)

    Wiseman, Matthew William

    The modern gas turbine aircraft engine is a complex, highly nonlinear system that operates in a widely varying environment. Traditional engine control techniques based on the hydromechanical control concepts of early turbojet engines are unable to deliver the performance required from today's advanced engine designs. A new type of advanced control utilizing multiple control modes and an online mode selector is investigated, and various strategies for improving the baseline mode-selection architecture are introduced. The ability to fine-tune actuator command outputs is added to the basic mode selection and blending process, and mode-selection designs that are valid for the entire flight envelope are presented. Methods for optimizing the mode selector to improve overall engine performance are also discussed. Finally, using flight test data from a GE F110-powered F16 aircraft, the full-envelope mode selector designs are validated and shown to provide significant performance benefits. Specifically, thrust command tracking is enhanced while critical engine limits are protected, with very little impact on engine efficiency.

  1. CMOS compatible on-chip telecom-band to mid-infrared supercontinuum generation in dispersion-engineered reverse strip/slot hybrid Si3N4 waveguide

    NASA Astrophysics Data System (ADS)

    Hui, Zhanqiang; Zhang, Lingxuan; Zhang, Wenfu

    2018-01-01

    A silicon nitride (Si3N4)-based reverse strip/slot hybrid waveguide with a single vertical silica slot is proposed to acquire an extremely low and flat chromatic dispersion profile. This is achieved by designing and optimizing the geometrical structural parameters of the reverse hybrid waveguide. Flat dispersion varying within ±10 ps/(nm·km) is obtained over a 610 nm bandwidth. Both the effective area and the nonlinear coefficient of the waveguide across the entire spectral range of interest are investigated. This leads to the design of an on-chip supercontinuum (SC) source with a -30 dB bandwidth of 2996 nm, covering 1.209 to 4.205 μm. Furthermore, we discuss the output signal's spectral and temporal characteristics as a function of the pump power. Our waveguide design offers a CMOS compatible, low-cost/high-yield (no photolithography or lift-off processes are necessary) on-chip SC source for near- and mid-infrared nonlinear applications.

  2. Development of an Optical Gas Leak Sensor for Detecting Ethylene, Dimethyl Ether and Methane

    PubMed Central

    Tan, Qiulin; Pei, Xiangdong; Zhu, Simin; Sun, Dong; Liu, Jun; Xue, Chenyang; Liang, Ting; Zhang, Wendong; Xiong, Jijun

    2013-01-01

    In this paper, we present an approach to develop an optical gas leak sensor that can be used to measure ethylene, dimethyl ether, and methane. The sensor is designed based on the principles of IR absorption spectrum detection, and comprises two crossed elliptical surfaces with a folded reflection-type optical path. We first analyze the optical path and the use of this structure to design a miniature gas sensor. The proposed sensor includes two detectors (one to acquire the reference signal and the other for the response signal), the light source, and the filter, all of which are integrated in a miniature gold-plated chamber. We also designed a signal detection device to extract the sensor signal and a microprocessor to calculate and control the entire process. The produced sensor prototype had an accuracy of ±0.05%. Experiments which simulate the transportation of hazardous chemicals demonstrated that the developed sensor exhibited a good dynamic response and adequately met technical requirements. PMID:23539025
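
    IR absorption-spectrum detection of this kind typically rests on the Beer-Lambert law, I = I0·exp(-α·C·L), with the dual-detector ratio cancelling source fluctuations. The sketch below shows the corresponding concentration retrieval; the absorption coefficient and path length are illustrative assumptions, not the sensor's calibration.

```python
import math

def concentration_from_absorption(i_signal, i_reference, alpha, path_len_m):
    """Invert Beer-Lambert I = I0 * exp(-alpha * C * L) for concentration C.
    alpha is the absorption coefficient per unit concentration and path
    length; dividing by the reference-detector signal cancels drift in the
    light source."""
    return -math.log(i_signal / i_reference) / (alpha * path_len_m)

# Round-trip check with hypothetical numbers: simulate the detector ratio
# for a known concentration, then recover it.
alpha, path_len, c_true = 2.0, 0.1, 1.5
ratio = math.exp(-alpha * c_true * path_len)
c_est = concentration_from_absorption(ratio, 1.0, alpha, path_len)
```

    A folded optical path, as in the sensor's crossed-ellipse chamber, increases the effective L and therefore the absorption contrast for a given gas concentration.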

  3. Participatory ergonomics and design of technical assistance.

    PubMed

    Rodríguez, Claudia Isabel Rojas

    2012-01-01

    This work describes the experience of applying a procedural initiative that aimed to identify and progressively address technical assistance needs in therapy and rehabilitation activities. The proposal, whose theoretical axes are the basics of participatory ergonomics and interdisciplinary work, was raised with the intention of addressing issues important for the entire design process, including perception, attention, memory, and human comfort, as well as the interrelationships that objects create in the context in which they are used. The project was carried out in collaboration with leading rehabilitation institutes of Colombia, Cirec and Roosevelt, through two investigative stages. In the first, ethnographic stage, one hundred forty-four (144) rehabilitation and therapy procedures were observed to build a bank of assistive technology needs, grounded in the project's observation variables. In the second, action-research stage, elements were designed to facilitate the efficient implementation of rehabilitation procedures. Experiential situations are currently being developed in different hospitals to examine the reliability of the proposed solutions.

  4. Entire Photodamaged Chloroplasts Are Transported to the Central Vacuole by Autophagy[OPEN

    PubMed Central

    2017-01-01

    Turnover of dysfunctional organelles is vital to maintain homeostasis in eukaryotic cells. As photosynthetic organelles, plant chloroplasts can suffer sunlight-induced damage. However, the process for turnover of entire damaged chloroplasts remains unclear. Here, we demonstrate that autophagy is responsible for the elimination of sunlight-damaged, collapsed chloroplasts in Arabidopsis thaliana. We found that vacuolar transport of entire chloroplasts, termed chlorophagy, was induced by UV-B damage to the chloroplast apparatus. This transport did not occur in autophagy-defective atg mutants, which exhibited UV-B-sensitive phenotypes and accumulated collapsed chloroplasts. Use of a fluorescent protein marker of the autophagosomal membrane allowed us to image autophagosome-mediated transport of entire chloroplasts to the central vacuole. In contrast to sugar starvation, which preferentially induced a distinct type of chloroplast-targeted autophagy that transports part of the stroma via the Rubisco-containing body (RCB) pathway, photooxidative damage induced chlorophagy without prior activation of RCB production. We further showed that chlorophagy is induced by chloroplast damage caused by either artificial visible light or natural sunlight. Thus, this report establishes that an autophagic process eliminates entire chloroplasts in response to light-induced damage. PMID:28123106

  5. Engineering Aerothermal Analysis for X-34 Thermal Protection System Design

    NASA Technical Reports Server (NTRS)

    Wurster, Kathryn E.; Riley, Christopher J.; Zoby, E. Vincent

    1998-01-01

    Design of the thermal protection system for any hypersonic flight vehicle requires determination of both the peak temperatures over the surface and the heating-rate history along the flight profile. In this paper, the process used to generate the aerothermal environments required for the X-34 Testbed Technology Demonstrator thermal protection system design is described as it has evolved from a relatively simplistic approach based on engineering methods applied to critical areas to one of detailed analyses over the entire vehicle. A brief description of the trajectory development leading to the selection of the thermal protection system design trajectory is included. Comparisons of engineering heating predictions with wind-tunnel test data and with results obtained using a Navier-Stokes flowfield code and an inviscid/boundary layer method are shown. Good agreement is demonstrated among all these methods for both the ground-test condition and the peak heating flight condition. Finally, the detailed analysis using engineering methods to interpolate the surface-heating-rate results from the inviscid/boundary layer method to predict the required thermal environments is described and results presented.

  6. Engineering Aerothermal Analysis for X-34 Thermal Protection System Design

    NASA Technical Reports Server (NTRS)

    Wurster, Kathryn E.; Riley, Christopher J.; Zoby, E. Vincent

    1998-01-01

    Design of the thermal protection system for any hypersonic flight vehicle requires determination of both the peak temperatures over the surface and the heating-rate history along the flight profile. In this paper, the process used to generate the aerothermal environments required for the X-34 Testbed Technology Demonstrator thermal protection system design is described as it has evolved from a relatively simplistic approach based on engineering methods applied to critical areas to one of detailed analyses over the entire vehicle. A brief description of the trajectory development leading to the selection of the thermal protection system design trajectory is included. Comparisons of engineering heating predictions with wind-tunnel test data and with results obtained using a Navier-Stokes flowfield code and an inviscid/boundary layer method are shown. Good agreement is demonstrated among all these methods for both the ground-test condition and the peak heating flight condition. Finally, the detailed analysis using engineering methods to interpolate the surface-heating-rate results from the inviscid/boundary layer method to predict the required thermal environments is described and results presented.

  7. Full-Scale Wind-Tunnel Investigation of Wing-Cooling Ducts Effects of Propeller Slipstream, Special Report

    NASA Technical Reports Server (NTRS)

    Nickle, F. R.; Freeman, Arthur B.

    1939-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  8. Adaptive wing structures

    NASA Astrophysics Data System (ADS)

    Reed, John L., Jr.; Hemmelgarn, Christopher D.; Pelley, Bryan M.; Havens, Ernie

    2005-05-01

    Cornerstone Research Group, Inc. (CRG) is developing a unique adaptive wing structure intended to enhance the capability of loitering Unmanned Air Vehicles (UAVs). In order to tailor the wing design to a specific application, CRG has developed a wing structure capable of morphing in chord and increasing planform area by 80 percent. With these features, aircraft will be capable of optimizing their flight efficiency throughout the entire mission profile. The key benefit from this morphing design is increased maneuverability, resulting in improved effectiveness over the current design. During the development process CRG has overcome several challenges in the design of such a structure while incorporating advanced materials capable of maintaining aerodynamic shape and transferring aerodynamic loads while enabling crucial changes in planform shape. To overcome some of these challenges, CRG is working on integration of their shape memory polymer materials into the wing skin to enable seamless morphing. This paper will address the challenges associated with the development of a morphing aerospace structure capable of such large shape change, the materials necessary for enabling morphing capabilities, and the current status of the morphing program within CRG.

  9. Finite Element Modeling of a Semi-Rigid Hybrid Mirror and a Highly Actuated Membrane Mirror as Candidates for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Craig, Larry; Jacobson, Dave; Mosier, Gary; Nein, Max; Page, Timothy; Redding, Dave; Sutherlin, Steve; Wilkerson, Gary

    2000-01-01

    Advanced space telescopes, which will eventually replace the Hubble Space Telescope (HST), will have apertures of 8-20 m. Primary mirrors of these dimensions will have to be foldable to fit into the space launcher. By necessity these mirrors will be extremely lightweight and flexible, and the historical approaches to mirror design, where the mirror is made as rigid as possible to maintain figure and to serve as the anchor for the entire telescope, can no longer be applied. New design concepts and verifications will depend entirely on analytical methods to predict optical performance. Finite element modeling of the structural and thermal behavior of such mirrors is becoming the tool for advanced space mirror designs. This paper discusses some of the preliminary tasks and study results, which are currently the basis for the design studies of the Next Generation Space Telescope.

  10. POWER-BURST FACILITY (PBF) CONCEPTUAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wasserman, A.A.; Johnson, S.O.; Heffner, R.E.

    1963-06-21

    A description is presented of the conceptual design of a high- performance, pulsed reactor called the Power Burst Facility (PBF). This reactor is designed to generate power bursts with initial asymptotic periods as short as 1 msec, producing energy releases large enough to destroy entire fuel subassemblies placed in a capsule or flow loop mounted in the reactor, all without damage to the reactor itself. It will be used primarily to evaluate the consequences and hazards of very rapid destructive accidents in reactors representing the entire range of current nuclear technology as applied to power generation, propulsion, and testing. It will also be used to carry out detailed studies of nondestructive reactivity feedback mechanisms in the short-period domain. The facility was designed to be sufficiently flexible to accommodate future cores of even more advanced design. The design for the first reactor core is based upon proven technology; hence, completion of the final design of this core will involve no significant development delays. Construction of the PBF is proposed to begin in September 1984, and is expected to take approximately 20 months to complete. (auth)

  11. Reinventing Material Science - Continuum Magazine | NREL

    Science.gov Websites

    It is no small task to reinvent an entire field of study, but that is exactly what the Center for Inverse Design is doing: discovering functional materials by developing an "inverse design" approach, powered by theory that guides experiment. The Center for Inverse Design was established as an Energy Frontier Research Center, funded by the U.S. Department of Energy.

  12. The Implications of "Contamination" for Experimental Design in Education

    ERIC Educational Resources Information Center

    Rhoads, Christopher H.

    2011-01-01

    Experimental designs that randomly assign entire clusters of individuals (e.g., schools and classrooms) to treatments are frequently advocated as a way of guarding against contamination of the estimated average causal effect of treatment. However, in the absence of contamination, experimental designs that randomly assign intact clusters to…

  13. Teaching and Learning by Design

    ERIC Educational Resources Information Center

    Doll, Carol A.

    2009-01-01

    As teacher and instructional partner, one will need skills in instructional design. The three main types of instructional design are curriculum planning, unit planning, and lesson planning. Curriculum planning, the big picture, looks at the entire set of skills and knowledge for the district, the school, or a specific grade level. Unit planning…

  14. Fabrication of corner cube array retro-reflective structure with DLP-based 3D printing technology

    NASA Astrophysics Data System (ADS)

    Riahi, Mohammadreza

    2016-06-01

    In this article, the fabrication of a corner cube array retro-reflective structure using DLP-based 3D printing technology is presented. In this additive manufacturing technology, a pattern of a corner cube array is designed on a computer and sliced with specific software. The image of each slice is then projected from the bottom side of a reservoir containing UV-curable resin, using a DLP video projector. The projected area is cured and attached to a base plate. This process is repeated until the entire part is made. The best orientation of the printing process and the effect of layer thickness on the surface finish of the cubes have been investigated. Thermal reflow surface finishing and replication by soft molding are also presented in this article.

  15. Reduced state feedback gain computation. [optimization and control theory for aircraft control

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1976-01-01

    Because application of conventional optimal linear regulator theory to flight controller design requires the capability of measuring and/or estimating the entire state vector, it is of interest to consider procedures for computing controls that are restricted to be linear feedback functions of a lower-dimensional output vector and that take into account the presence of measurement noise and process uncertainty. A stochastic linear model is therefore presented that accounts for aircraft parameter and initial-condition uncertainty, measurement noise, turbulence, pilot command, and a restricted number of measurable outputs. Optimization with respect to the corresponding output feedback gains was performed for both finite- and infinite-time performance indices, without gradient computation, using Zangwill's modification of a procedure originally proposed by Powell. Results using a seventh-order process show the proposed procedures to be very effective.
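The gradient-free gain search described above can be sketched with a modern stand-in: SciPy's Powell method minimizing a finite-horizon quadratic cost over a static output feedback gain. The toy two-state system and all numbers below are illustrative, not the paper's seventh-order model, and SciPy's Powell implementation stands in for Zangwill's modification.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative discrete-time plant: only one output y = C x is measurable,
# so the control is restricted to static output feedback u = -K y.
A = np.array([[0.95, 0.10],
              [0.00, 0.95]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])

def finite_horizon_cost(k_flat, horizon=100):
    """Accumulate sum(x'x + u'u) along a simulated closed-loop trajectory."""
    K = k_flat.reshape(1, 1)
    x = np.array([[1.0], [1.0]])
    cost = 0.0
    for _ in range(horizon):
        u = -K @ (C @ x)
        cost += float(x.T @ x + u.T @ u)
        x = A @ x + B @ u
    return cost

# Derivative-free optimization of the output feedback gain (Powell's method):
# the cost is evaluated purely by simulation, no gradients required.
res = minimize(finite_horizon_cost, x0=np.zeros(1), method="Powell")
```

The point mirrored from the abstract is that Powell-type search needs only cost evaluations, so measurement noise models or simulation-based costs pose no extra difficulty.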

  16. In situ fatigue loading stage inside scanning electron microscope

    NASA Technical Reports Server (NTRS)

    Telesman, Jack; Kantzos, Peter; Brewer, David

    1988-01-01

    A fatigue loading stage for use inside a scanning electron microscope (SEM) was developed. The stage allows dynamic and static high-magnification and high-resolution viewing of the fatigue crack initiation and crack propagation processes. The loading stage is controlled by a closed-loop servohydraulic system. Maximum load is 1000 lb (4450 N), with test frequencies ranging up to 30 Hz. The stage accommodates specimens up to 2 inches (50 mm) in length and tolerates substantial specimen translation to view the propagating crack. At room temperature, acceptable working resolution is obtainable at magnifications up to 10,000X. The system is equipped with a high-temperature setup designed for temperatures up to 2000 F (1100 C). The signal can be videotaped for further analysis of the pertinent fatigue damage mechanisms. The design allows for quick and easy interchange and conversion of the SEM from a loading stage configuration to its normal operational configuration and vice versa. Tests are performed entirely in the in-situ mode. In contrast to other designs, the NASA design has greatly extended the life of the loading stage by not exposing the bellows to cyclic loading. The loading stage was used to investigate the fatigue crack growth mechanisms in the (100)-oriented PWA 1480 single-crystal, nickel-based superalloy. The high-magnification observations revealed the details of the crack growth processes.

  17. 7 CFR 51.3416 - Classification of defects.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Maximum allowed for U.S. No. 2 processing Occurring outside of or not entirely confined to the vascular ring Internal Black Spot, Internal Discoloration, Vascular Browning, Fusarium Wilt, Net Necrosis, Other Necrosis, Stem End Browning 5% waste 10% waste. Occurring entirely within the vascular ring Hollow Heart or...

  18. Theoretical study of optical pump process in solid gain medium based on four-energy-level model

    NASA Astrophysics Data System (ADS)

    Ma, Yongjun; Fan, Zhongwei; Zhang, Bin; Yu, Jin; Zhang, Hongbo

    2018-04-01

    A semiclassical algorithm is applied to a four-energy-level model, aiming to identify the factors that affect the dynamic behavior during the pump process. The impacts of pump intensity Ω_p, non-radiative transition rate γ_43, and decay rate of the electric dipole δ_14 are discussed in detail. The calculation results show that large γ_43, small δ_14, and strong pumping Ω_p are beneficial to the establishment of population inversion. Under strong pumping conditions, the entire pump process can be divided into four phases, tentatively named the far-from-equilibrium process, the Rabi oscillation process, the quasi-dynamic-equilibrium process, and the 'equilibrium' process. The Rabi oscillation can slow the pumping process and cause some instability. Moreover, the duration of the entire process is negatively related to Ω_p and γ_43, and positively related to δ_14.
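As a rough companion to the abstract, here is a rate-equation caricature of a four-level pump scheme. Coherence is dropped, so the Rabi-oscillation phase of the semiclassical model cannot appear, and all rate values are invented for illustration, with the 4→3 non-radiative rate deliberately large as the paper's findings recommend.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative four-level scheme: 1 -> 4 pump, fast 4 -> 3 non-radiative
# decay, slow 3 -> 2 (upper laser level) decay, fast 2 -> 1 drain.
# All rates are assumed values, not the paper's parameters.
W_p = 5.0    # pump rate between levels 1 and 4
g43 = 50.0   # non-radiative transition rate 4 -> 3 (large)
g32 = 1.0    # decay rate 3 -> 2
g21 = 50.0   # fast drain 2 -> 1

def rates(t, n):
    n1, n2, n3, n4 = n
    return [
        -W_p * (n1 - n4) + g21 * n2,   # level 1
        g32 * n3 - g21 * n2,           # level 2 (lower laser level)
        g43 * n4 - g32 * n3,           # level 3 (upper laser level)
        W_p * (n1 - n4) - g43 * n4,    # level 4
    ]

# Start with all population in the ground state and pump for a while.
sol = solve_ivp(rates, (0.0, 10.0), [1.0, 0.0, 0.0, 0.0])
n1, n2, n3, n4 = sol.y[:, -1]
inversion = n3 - n2   # positive once population inversion is established
```

Even this coherence-free sketch reproduces the qualitative conclusion: with a fast 4→3 channel and a fast 2→1 drain, strong pumping piles population into level 3 and a large steady inversion results.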

  19. Case studies on design, simulation and visualization of control and measurement applications using REX control system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozana, Stepan, E-mail: stepan.ozana@vsb.cz; Pies, Martin, E-mail: martin.pies@vsb.cz; Docekal, Tomas, E-mail: docekalt@email.cz

    REX Control System is a professional advanced tool for the design and implementation of complex control systems, belonging to the softPLC category. It covers the entire process, starting from simulation of the application's functionality before deployment, through implementation on a real-time target, to analysis, diagnostics, and visualization. Basically it consists of two parts: the development tools and the runtime system. It is also compatible with the Simulink environment, and the way a control algorithm is implemented is very similar. The control scheme is finally compiled (using the RexDraw utility) and uploaded to a chosen real-time target (using the RexView utility). A wide variety of hardware platforms and real-time operating systems is supported by REX Control System, for example Windows Embedded, Linux, and Linux/Xenomai deployed on SBC, IPC, PAC, Raspberry Pi, and others with many I/O interfaces. It is a modern system designed for both measurement and control applications, offering many additional functions concerning data archiving, visualization based on HTML5, and communication standards. The paper sums up possibilities of its use in the educational process, focusing on the control of case-study physical models with classical and advanced control algorithms.

  20. A Robust, Gravity-Insensitive, High-Temperature Condenser for Water Recovery

    NASA Technical Reports Server (NTRS)

    Chen, Weibo; Conboy, Thomas; Ewert, Michael

    2016-01-01

    Regenerative life support systems are vital for NASA's future long-duration human space exploration missions. A Heat Melt Compactor (HMC) system is being developed by NASA to dry and compress trash generated during space missions. The resulting water vapor is recovered and separated from the process gas flow by a gravity-insensitive condenser. Creare is developing a high-temperature condenser for this application. The entire condenser is constructed from metals that have excellent resistance to chemical attack from contaminants and is suitable for high-temperature operation. The metal construction and design configuration also offer the greatest flexibility for potential coating and regeneration processes to reduce biofilm growth, thus enhancing the reliability of the condenser. The proposed condenser builds on the gravity-insensitive phase separator technology Creare developed for aircraft and spacecraft applications. This paper first discusses the design requirements for the condenser in an HMC system that will be demonstrated on the International Space Station (ISS). It then presents the overall design of the condenser and preliminary thermal test results from a subscale condenser. Finally, the paper discusses the predicted performance of the full-size condenser and the development plan to mature the technology and enhance its long-term reliability for a flight system.

  1. Omnidirectional, broadband light absorption using large-area, ultrathin lossy metallic film coatings

    NASA Astrophysics Data System (ADS)

    Li, Zhongyang; Palacios, Edgar; Butun, Serkan; Kocer, Hasan; Aydin, Koray

    2015-10-01

    Resonant absorbers based on nanostructured materials are promising for a variety of applications including optical filters, thermophotovoltaics, thermal emitters, and hot-electron collection. One of the significant challenges for such micro/nanoscale-featured media or surfaces, however, is the costly lithographic processing required for structural patterning, which restricts industrial production of complex designs. Here, we demonstrate lithography-free, broadband, polarization-independent optical absorbers based on a three-layer ultrathin film composed of subwavelength chromium (Cr) and oxide film coatings. We have measured almost perfect absorption, as high as 99.5%, across the entire visible regime and beyond (400-800 nm). In addition to near-ideal absorption, our absorbers are omnidirectional, independent of incidence angle over ±60 degrees. The broadband absorbers introduced in this study perform better than their nanostructured plasmonic counterparts in terms of bandwidth, polarization independence, and angle independence. The performance of such "blackbody" samples based on uniform thin-film coatings is attributed to the extremely low quality factor of asymmetric, highly lossy Fabry-Perot cavities. Such broadband absorber designs are ultrathin compared to carbon-nanotube-based black materials and do not require lithographic processes. This demonstration redirects broadband super-absorber design toward extreme simplicity, higher performance, and cost-effective manufacturing for practical industrial production.

  2. Capturing molecular multimode relaxation processes in excitable gases based on decomposition of acoustic relaxation spectra

    NASA Astrophysics Data System (ADS)

    Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng

    2017-08-01

    Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
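The additive structure of the decomposition, a multi-relaxation spectrum as the sum of interior single-relaxation contributions, can be illustrated with a toy Debye-type model. The relaxation times and strengths below are invented for illustration; they are not the measured CO2-N2 or CO2-O2 values, and the reconstruction (inverting 2N measurements) is not shown.

```python
import numpy as np

def single_relaxation(f, tau, strength):
    """Normalized absorption-per-wavelength contribution of one Debye-type
    relaxation process, peaking at f = 1 / (2*pi*tau)."""
    x = 2 * np.pi * f * tau
    return strength * x / (1 + x**2)

freqs = np.logspace(2, 7, 200)             # frequency grid, Hz
modes = [(1e-5, 0.8), (1e-3, 0.3)]         # (tau_i, strength_i), N = 2 processes

# A multi-relaxation spectrum is the sum of its interior single relaxations.
multi = sum(single_relaxation(freqs, tau, s) for tau, s in modes)
```

Each interior process is fully described by one (relaxation time, strength) pair, which is why 2N frequency measurements suffice in principle to pin down an N-mode spectrum.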

  3. The Apollo Experience Lessons Learned for Constellation Lunar Dust Management

    NASA Astrophysics Data System (ADS)

    Wagner, Sandra

    2006-09-01

    Lunar dust will present significant challenges to NASA's Lunar Exploration Missions. The challenges can be overcome by using best practices in system engineering design. For successful lunar surface missions, all systems that come into contact with lunar dust must consider the effects throughout the entire design process. Interfaces between these systems and other systems must also be considered. Incorporating dust management into Concept of Operations and Requirements development is the best place to begin mitigating the risks presented by lunar dust. However, that is only the beginning. To be successful, every person who works on NASA's Constellation lunar missions must be mindful of this problem. Success will also require fiscal responsibility. NASA must learn from Apollo the root cause of problems caused by dust, and then find the most cost-effective solutions to address each challenge. This will require a combination of common-sense existing technologies and promising, innovative technical solutions.

  4. Strategies for dealing with missing data in clinical trials: from design to analysis.

    PubMed

    Dziura, James D; Post, Lori A; Zhao, Qing; Fu, Zhixuan; Peduzzi, Peter

    2013-09-01

    Randomized clinical trials are the gold standard for evaluating interventions, as randomized assignment equalizes known and unknown characteristics between intervention groups. However, when participants miss visits, the ability to conduct an intent-to-treat analysis and draw conclusions about a causal link is compromised. As guidance to those performing clinical trials, this review is a non-technical overview of the consequences of missing data and a prescription for its treatment that extends beyond the typical analytic approaches to the entire research process. Examples of bias from incorrect analysis with missing data and discussion of the advantages and disadvantages of analytic methods are given. As no single analysis is definitive when data are missing, strategies for prevention throughout the course of a trial are presented. We aim to convey an appreciation for how missing data influence results and an understanding of the need for careful consideration of missing data during the design, planning, conduct, and analytic stages.

  5. Optical diagnostics of mercury jet for an intense proton target.

    PubMed

    Park, H; Tsang, T; Kirk, H G; Ladeinde, F; Graves, V B; Spampinato, P T; Carroll, A J; Titus, P H; McDonald, K T

    2008-04-01

    An optical diagnostic system was designed and constructed for imaging a free mercury jet interacting with a high-intensity proton beam in a pulsed high-field solenoid magnet. The optical imaging system employs a back-illuminated laser shadow photography technique. Object illumination and image capture are transmitted through radiation-hard multimode optical fibers and flexible coherent imaging fibers. A retroreflected illumination design allows the entire passive imaging system to fit inside the bore of the solenoid magnet. A sequence of synchronized short laser light pulses is used to freeze the transient events, and the images are recorded by several high-speed charge-coupled devices. Quantitative and qualitative data analysis using image processing based on a probability approach is described. The characteristics of the free mercury jet as a high-power target for beam-jet interaction at various levels of the magnetic induction field are reported in this paper.

  6. Quantification of Processing Effects on Filament Wound Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Aiello, Robert A.; Chamis, Christos C.

    1999-01-01

    A computational simulation procedure is described which is designed specifically for the modeling and analysis of filament wound pressure vessels. Cylindrical vessels with spherical or elliptical end caps can be generated automatically. End caps other than spherical or elliptical may be modeled by varying circular sections along the x-axis according to the end cap shape. The finite element model generated is composed of plate-type quadrilateral shell elements on the entire vessel surface. This computational procedure can also be used to generate grid, connectivity, and material cards (bulk data) for component parts of a larger model. These bulk data are assigned to a user-designated file for finite element structural/stress analysis of composite pressure vessels. The procedure accommodates filament wound pressure vessels of all types of shells-of-revolution. It has provisions to readily evaluate initial stresses due to pretension in the winding filaments and residual stresses due to cure temperature.

  7. Quantification of Processing Effects on Filament Wound Pressure Vessels. Revision

    NASA Technical Reports Server (NTRS)

    Aiello, Robert A.; Chamis, Christos C.

    2002-01-01

    A computational simulation procedure is described which is designed specifically for the modeling and analysis of filament wound pressure vessels. Cylindrical vessels with spherical or elliptical end caps can be generated automatically. End caps other than spherical or elliptical may be modeled by varying circular sections along the x-axis according to the end cap shape. The finite element model generated is composed of plate-type quadrilateral shell elements on the entire vessel surface. This computational procedure can also be used to generate grid, connectivity, and material cards (bulk data) for component parts of a larger model. These bulk data are assigned to a user-designated file for finite element structural/stress analysis of composite pressure vessels. The procedure accommodates filament wound pressure vessels of all types of shells-of-revolution. It has provisions to readily evaluate initial stresses due to pretension in the winding filaments and residual stresses due to cure temperature.

  8. The Apollo Experience Lessons Learned for Constellation Lunar Dust Management

    NASA Technical Reports Server (NTRS)

    Wagner, Sandra

    2006-01-01

    Lunar dust will present significant challenges to NASA's Lunar Exploration Missions. The challenges can be overcome by using best practices in system engineering design. For successful lunar surface missions, all systems that come into contact with lunar dust must consider the effects throughout the entire design process. Interfaces between these systems and other systems must also be considered. Incorporating dust management into Concept of Operations and Requirements development is the best place to begin mitigating the risks presented by lunar dust. However, that is only the beginning. To be successful, every person who works on NASA's Constellation lunar missions must be mindful of this problem. Success will also require fiscal responsibility. NASA must learn from Apollo the root cause of problems caused by dust, and then find the most cost-effective solutions to address each challenge. This will require a combination of common-sense existing technologies and promising, innovative technical solutions.

  9. Design of an 8-40 GHz Antenna for the Wideband Instrument for Snow Measurements (WISM)

    NASA Technical Reports Server (NTRS)

    Durham, Timothy E.; Vanhille, Kenneth J.; Trent, Christopher; Lambert, Kevin M.; Miranda, Felix A.

    2015-01-01

    Measurement of land surface snow remains a significant challenge in the remote sensing arena. Developing the tools needed to remotely measure Snow Water Equivalent (SWE) is an important priority. The Wideband Instrument for Snow Measurements (WISM) is being developed to address this need. WISM is an airborne instrument comprising a dual-frequency (X- and Ku-band) Synthetic Aperture Radar (SAR) and a dual-frequency (K- and Ka-band) radiometer. A unique feature of this instrument is that all measurement bands share a common antenna aperture, an array-fed reflector that covers the entire bandwidth. This paper covers the design and fabrication of the wideband array feed, which is based on tightly coupled dipole arrays. Implementation using a relatively new multi-layer microfabrication process results in a small, 6x6-element, dual-linear-polarized array with a beamformer that operates from 8 to 40 GHz.

  10. CHAMP - Camera, Handlens, and Microscope Probe

    NASA Technical Reports Server (NTRS)

    Mungas, G. S.; Beegle, L. W.; Boynton, J.; Sepulveda, C. A.; Balzer, M. A.; Sobel, H. R.; Fisher, T. A.; Deans, M.; Lee, P.

    2005-01-01

    CHAMP (Camera, Handlens And Microscope Probe) is a novel field microscope capable of color imaging with continuously variable spatial resolution, from infinity imaging down to diffraction-limited microscopy (3 micron/pixel). As an arm-mounted imager, CHAMP supports stereo imaging with variable baselines, can continuously image targets at increasing magnification during an arm approach, can provide precision range-finding estimates to targets, and can accommodate microscopic imaging of rough surfaces through an image filtering process called z-stacking. CHAMP is currently designed with a filter wheel holding 4 different filters, so that color and black-and-white images can be obtained over the entire field of view; future designs will increase the number of filter positions to include 8 different filters. Finally, CHAMP incorporates controlled white and UV illumination so that images can be obtained regardless of sun position and any potentially fluorescent species can be identified, allowing the most astrobiologically interesting samples to be selected.

  11. Real-time Simulation of Turboprop Engine Control System

    NASA Astrophysics Data System (ADS)

    Sheng, Hanlin; Zhang, Tianhong; Zhang, Yi

    2017-05-01

    Given the complexity of turboprop engine control systems, real-time simulation is a technology that, while maintaining real-time behavior, can effectively reduce development cost, shorten the development cycle, and avert testing risks. The paper takes RT-LAB as a platform and studies the real-time digital simulation of a turboprop engine control system. The architecture, working principles, and external interfaces of the RT-LAB real-time simulation platform are introduced first. Then, based on a turboprop engine model, the control laws of the propeller control loop and fuel control loop are studied. On this basis, an integrated controller is designed in Matlab/Simulink that realizes control of the entire process from start-up through maximum power to shutdown. Finally, the real-time digital simulation of the designed control system is studied on the RT-LAB platform; different regulating plans are tried and good control performance is obtained.

  12. Statistical Tools for Designing Initial and Post-Removal UXO Characterization Surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulsipher, Brent A.; Gilbert, Richard O.; Wilson, John E.

    2002-09-06

    The Department of Defense (DoD) is in the process of assessing and remediating closed, transferred, and transferring ranges. It is estimated that over 20 million acres of land in the United States potentially contain UXO. The release of DoD sites for public use will require high confidence that UXO is not present. This high confidence may be achieved solely from extensive knowledge of historical site operations, as documented in the conceptual site model, or in combination with geophysical sensor surveys designed to have a sufficiently high probability of finding UXO-contaminated zones. Many of these sites cover very large geographical areas, such that it is often impractical and/or cost-prohibitive to perform 100% surveys of the entire site of interest. In that case, it is necessary to be explicit about the performance required of a survey that covers less than 100% of the site.
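As a toy illustration of why less-than-100% coverage must come with an explicit performance statement: if each independent survey pass over a site has probability p of detecting a UXO-contaminated zone, the number of passes needed for overall confidence c follows from 1 - (1 - p)^n >= c. This simplistic independence model is an assumption for illustration, not the actual survey design statistics used for these sites.

```python
import math

def passes_needed(p, c):
    """Smallest number of passes n with 1 - (1 - p)**n >= c,
    assuming each pass detects the zone independently with probability p."""
    return math.ceil(math.log(1.0 - c) / math.log(1.0 - p))
```

For example, with a 30% per-pass detection probability, nine passes are needed to reach 95% confidence, which makes concrete how quickly survey effort grows as the required confidence rises.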

  13. Trash to treasure: production of biofuels and commodity chemicals via syngas fermenting microorganisms.

    PubMed

    Latif, Haythem; Zeidan, Ahmad A; Nielsen, Alex T; Zengler, Karsten

    2014-06-01

    Fermentation of syngas is a means through which unutilized organic waste streams can be converted biologically into biofuels and commodity chemicals. Despite recent advances, several issues remain that limit implementation of industrial-scale syngas fermentation processes. At the cellular level, the energy conservation mechanism of syngas-fermenting microorganisms has not yet been entirely elucidated. Furthermore, there has been a lack of genetic tools to study and ultimately enhance their metabolic capabilities. Recently, substantial progress has been made in understanding the intricate energy conservation mechanisms of these microorganisms. Given the complex relationship between energy conservation and metabolism, strain design greatly benefits from systems-level approaches. Numerous genetic manipulation tools have also been developed, paving the way for the use of metabolic engineering and systems biology approaches. Rational strain designs can now be deployed, resulting in desirable phenotypic traits for large-scale production. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Arsenic removal from water employing a combined system: photooxidation and adsorption.

    PubMed

    Lescano, Maia; Zalazar, Cristina; Brandi, Rodolfo

    2015-03-01

    A combined system employing photochemical oxidation (UV/H2O2) and adsorption for arsenic removal from water was designed and evaluated. In this work, a bench-scale annular photochemical reactor was developed and connected alternately to a pair of adsorption columns filled with titanium dioxide (TiO2) and granular ferric hydroxide (GFH). The experiments were performed by varying the As(III)/As(V) weight ratio at constant hydrogen peroxide concentration and incident radiation. Experimental oxidation results were compared with theoretical predictions using an intrinsic kinetic model obtained previously. In addition, the effectiveness of the process was evaluated using a groundwater sample. A mathematical model of the entire system was developed; it can be used as an effective tool for the design and prediction of the behaviour of these types of systems. The combined technology is efficient and promising for arsenic removal at small and medium scale.

  15. Conceptual design for spacelab pool boiling experiment

    NASA Technical Reports Server (NTRS)

    Lienhard, J. H.; Peck, R. E.

    1978-01-01

    A pool boiling heat transfer experiment to be incorporated with a larger two-phase flow experiment on Spacelab was designed to confirm (or alter) the results of earth-normal gravity experiments which indicate that the hydrodynamic peak and minimum pool boiling heat fluxes vanish at very low gravity. Twelve small sealed test cells containing water, methanol or Freon 113 and cylindrical heaters of various sizes are to be built. Each cell will be subjected to one or more 45 sec tests in which the surface heat flux on the heaters is increased linearly until the surface temperature reaches a limiting value of 500 C. The entire boiling process will be photographed in slow-motion. Boiling curves will be constructed from thermocouple and electric input data, for comparison with the motion picture records. The conduct of the experiment will require no more than a few hours of operator time.

  16. Screening Li-Ion Batteries for Internal Shorts

    NASA Technical Reports Server (NTRS)

    Darcy, Eric

    2006-01-01

    The extremely high cost of aerospace battery failures due to internal shorts makes it essential that their occurrence be very rare, if not eliminated altogether. With Li-ion cells/batteries, the potentially catastrophic safety hazard that some internal shorts present adds further incentive for prevention. Prevention can be achieved by design, manufacturing measures, and testing. Specifically for NASA's spacesuit application, a Li-ion polymer pouch cell battery design is in its final stages of production. One of the 20 flight batteries fabricated and tested developed a cell internal short, which did not present a safety hazard but has required revisiting the entire manufacturing and testing process. Herein are the details of the failure investigation that followed to get to the root cause of the internal short and the corrective actions that will be taken. The resulting lessons learned are applicable to most Li-ion battery applications.

  17. Computational aerodynamics requirements: The future role of the computer and the needs of the aerospace industry

    NASA Technical Reports Server (NTRS)

    Rubbert, P. E.

    1978-01-01

    The commercial airplane builder's viewpoint on the important issues involved in the development of improved computational aerodynamics tools, such as powerful computers optimized for fluid flow problems, is presented. The primary user of computational aerodynamics in a commercial aircraft company is the design engineer, who is concerned with solving practical engineering problems. From this viewpoint, the development of program interfaces and pre- and post-processing capability for new computational methods is just as important as the algorithms and machine architecture. As more and more details of the entire flow field are computed, the visibility of the output data becomes a major problem, one that is doubled when a design capability is added. The user must be able to see, understand, and interpret the calculated results. Enormous costs are expended because of the need to work with programs having only primitive user interfaces.

  18. An evidential reasoning-based AHP approach for the selection of environmentally-friendly designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NG, C.Y., E-mail: ng.cy@cityu.edu.hk

    Due to the stringent environmental regulatory requirements imposed by cross-national bodies in recent years, manufacturers have to minimize the environmental impact of their products. Among the environmental impact evaluation tools available, Life Cycle Assessment (LCA) is often employed to quantify a product's environmental impact throughout its entire life cycle. However, owing to the expert knowledge in environmental science and the vast data-collection effort required to carry out LCA, as well as the common absence of complete product information during product development, there is a need for a tool better suited to product designers. An evidential reasoning-based approach, which aims at providing a fast-track method for evaluating design alternatives by non-LCA experts, is therefore introduced as a new initiative to deal with incomplete or uncertain information. The proposed approach also enables decision makers to quantitatively assess life cycle phases and design alternatives by comparing their potential environmental impacts, effectively and efficiently facilitating the identification of greener designs. A case application is carried out to demonstrate the applicability of the proposed approach.
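The classical AHP core underlying such an approach can be sketched as a principal-eigenvector weight computation over a pairwise comparison matrix. The 3x3 matrix and criterion names below are invented for illustration, and the paper's evidential-reasoning extension for handling uncertain assessments is not reproduced here.

```python
import numpy as np

# Hypothetical pairwise comparisons of three design criteria, e.g.
# energy use vs. material toxicity vs. recyclability, on Saaty's 1-9 scale.
M = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priority weights: the principal eigenvector of M, normalized to sum to 1.
vals, vecs = np.linalg.eig(M)
w = np.real(vecs[:, np.argmax(np.real(vals))])
weights = w / w.sum()
```

Design alternatives would then be scored per criterion and combined with these weights; the evidential-reasoning step replaces crisp scores with belief distributions when product data are incomplete.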

  19. Design Issues of the Pre-Compression Rings of Iter

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Baker, W.; Bettinali, L.; Jong, C.; Mallick, K.; Nardi, C.; Rajainmaki, H.; Rossi, P.; Semeraro, L.

    2010-04-01

    The pre-compression system is the keystone of ITER. A centripetal force of ˜30 MN will be applied at cryogenic conditions to the top and bottom of each TF coil. It will prevent the `breathing effect' caused by the bursting forces occurring during plasma operation, which would otherwise affect the machine design life of 30000 cycles. Different alternatives have been studied over the years. Two major design requirements limit the engineering possibilities: 1) the limited available space and 2) the need to hamper eddy currents flowing in the structures. Six unidirectionally wound glass-fibre composite rings (˜5 m diameter and ˜300 mm cross section) are the final design choice. The rings will withstand maximum hoop stresses of <500 MPa at room temperature conditions. Although retightening or replacing the pre-compression rings in case of malfunctioning is possible, they have to sustain the load during the entire 20 years of machine operation. The present paper summarizes the pre-compression ring R&D carried out over several years. In particular, we address the composite choice and mechanical characterization, assessment of creep or stress relaxation phenomena, sub-sized ring testing, and the optimal ring fabrication processes that have led to the present final design.

  20. High Level Waste Remote Handling Equipment in the Melter Cave Support Handling System at the Hanford Waste Treatment Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardal, M.A.; Darwen, N.J.

    2008-07-01

    Cold War plutonium production left extensive amounts of radioactive waste stored in tanks at the Department of Energy's (DOE) Hanford site. Bechtel National, Inc. is building the world's largest nuclear Waste Treatment Plant at the Hanford site to immobilize the millions of gallons of radioactive waste. The site comprises five main facilities: Pretreatment, High Level Waste vitrification, Low Active Waste vitrification, an Analytical Lab, and the Balance of Facilities. The pretreatment facilities will separate the high and low level waste. The high level waste will then proceed to the HLW facility for vitrification. Vitrification is a process that uses a melter to mix molten glass with radioactive waste to form a stable product for storage. The melter cave is designated the High Level Waste Melter Cave Support Handling System (HSH). Several key processes necessary for vitrification occur in the HSH cell, including feed preparation, mixing, pouring, cooling, and all maintenance and repair of the process equipment. Due to the cell's high radiation level, remote handling equipment provided by PaR Systems, Inc. is required to install and remove all equipment in the HSH cell. The remote handling crane is composed of a bridge and trolley. The trolley supports a telescoping tube set that rigidly deploys a TR 4350 manipulator arm with seven degrees of freedom. A rotating, extending, and retracting slewing hoist is mounted to the bottom of the trolley and is centered about the telescoping tube set. Both the manipulator and slewer are unique to this cell. The slewer can reach into corners, and the manipulator's cross-pivoting wrist provides better operational dexterity and camera viewing angles at the end of the arm. Since the crane functions will be operated remotely, the entire cell and crane have been modeled with 3-D software. Model simulations have been used to confirm operational and maintenance functional and timing studies throughout the design process. Since no humans can go in or out of the cell, several recovery options have been designed into the system, including jack-down wheels for the bridge and trolley, recovery drums for the manipulator hoist, and a wire rope cable cutter for the slewer jib hoist. If the entire crane fails in cell, the large diameter cable reel that provides power, signal, and control to the crane can be used to retrieve the crane from the cell into the crane maintenance area. (authors)

  1. The aerodynamic design of an advanced rotor airfoil

    NASA Technical Reports Server (NTRS)

    Blackwell, J. A., Jr.; Hinson, B. L.

    1978-01-01

    An advanced rotor airfoil, designed utilizing supercritical airfoil technology and advanced design and analysis methodology, is described. The airfoil was designed subject to stringent aerodynamic design criteria for improving performance over the entire rotor operating regime. The design criteria are discussed. The design was accomplished using a physical-plane, viscous, transonic inverse design procedure and a constrained function minimization technique for optimizing the airfoil leading edge shape. The aerodynamic performance objectives of the airfoil are discussed.

  2. Multi-fidelity and multi-disciplinary design optimization of supersonic business jets

    NASA Astrophysics Data System (ADS)

    Choi, Seongim

    Supersonic jets have drawn great attention since the end of service for the Concorde was announced in April 2003. It is believed, however, that civilian supersonic aircraft may make a viable return in the business jet market. This thesis focuses on the design optimization of feasible supersonic business jet configurations. Preliminary design techniques for mitigation of ground sonic boom are investigated while ensuring that all relevant disciplinary constraints are satisfied (including aerodynamic performance, propulsion, stability & control, and structures). In order to achieve reasonable confidence in the resulting designs, high-fidelity simulations are required, making the entire design process both expensive and complex. To minimize the computational cost, surrogate/approximate models are constructed using a hierarchy of analysis tools of different fidelity, including PASS, A502/Panair, and Euler/NS codes. Direct search methods such as Genetic Algorithms (GAs) and a nonlinear SIMPLEX are employed in searches of large and noisy design spaces. A local gradient-based search method can be combined with these global search methods for small modifications of candidate optimum designs. The Mesh Adaptive Direct Search (MADS) method can also be used to explore the design space using a solution-adaptive grid refinement approach. These hybrid approaches, both in search methodology and in surrogate model construction, are shown to result in designs with reduced sonic boom and improved aerodynamic performance.
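    The surrogate-screening idea can be sketched with a toy one-variable problem, where a cheap biased model stands in for a PASS-level analysis and an expensive "truth" function stands in for an Euler/NS run; all functions and budgets here are invented:

```python
# Illustrative multi-fidelity search loop: a cheap low-fidelity model screens
# many candidates, and the expensive high-fidelity model is run only on the
# most promising few.  Both "models" are toy analytic functions.
import random

def low_fidelity(x):             # fast, deliberately biased approximation
    return (x - 1.9) ** 2

def high_fidelity(x):            # expensive "truth", offset from the cheap model
    return (x - 2.0) ** 2 + 0.1

def multi_fidelity_search(n_cheap=200, n_expensive=5, seed=0):
    rng = random.Random(seed)
    candidates = [rng.uniform(-5.0, 5.0) for _ in range(n_cheap)]
    # Screen with the surrogate, keep the best few for high-fidelity runs.
    shortlist = sorted(candidates, key=low_fidelity)[:n_expensive]
    return min(shortlist, key=high_fidelity)

best = multi_fidelity_search()
```

    The same pattern scales to GA or SIMPLEX outer loops: the surrogate decides where expensive evaluations are worth spending.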

  3. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

    This year, the process industry has reached an important milestone in process safety-the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  4. Designing Birefringent Filters For Solid-State Lasers

    NASA Technical Reports Server (NTRS)

    Monosmith, Bryan

    1992-01-01

    Mathematical model enables design of filter assembly of birefringent plates as integral part of resonator cavity of tunable solid-state laser. Proper design treats polarization eigenstate of entire resonator as function of wavelength. Program includes software modules for variety of optical elements including Pockels cell, laser rod, quarter- and half-wave plates, Faraday rotator, and polarizers.
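    For the simplest case of one birefringent plate between parallel polarizers at 45° to the plate axes, the wavelength-dependent polarization treatment described above reduces to the classic Lyot-stage transmission T = cos²(π·Δn·d/λ). A minimal sketch, with illustrative (not the article's) thickness and birefringence values:

```python
# Minimal sketch of one Lyot-filter stage: a birefringent plate between
# parallel polarizers at 45 degrees to the plate's fast/slow axes.
# Plate thickness and birefringence are illustrative assumptions.
import math

def lyot_stage_transmission(wavelength_nm, thickness_um=500.0, delta_n=0.009):
    """Transmission T = cos^2(pi * delta_n * d / lambda) of one stage."""
    retardance = math.pi * delta_n * thickness_um * 1e3 / wavelength_nm
    return math.cos(retardance) ** 2
```

    Cascading stages of different thicknesses narrows the passband, which is the basis of birefringent tuning elements in laser cavities.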

  5. Introducing Whole-Systems Design to First-Year Engineering Students with Case Studies

    ERIC Educational Resources Information Center

    Blizzard, Jackie; Klotz, Leidy; Pradhan, Alok; Dukes, Michael

    2012-01-01

    Purpose: A whole-systems approach, which seeks to optimize an entire system for multiple benefits, not isolated components for single benefits, is essential to engineering design for radically improved sustainability performance. Based on real-world applications of whole-systems design, the Rocky Mountain Institute (RMI) is developing educational…

  6. 34 CFR 200.79 - Exclusion of supplemental State and local funds from supplement, not supplant and comparability...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... academic achievement standards; (ii) Provides supplementary services designed to meet the special... at least 40 percent; (ii) Is designed to promote schoolwide reform and upgrade the entire educational... academic achievement standards that all students are expected to meet; (iii) Is designed to meet the...

  7. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically, so that once the problem is set up, one can modify the surfaces and the entire set of points, curves, and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C, the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure that allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects, such as points, curves, patches, sources, and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file discussed later.
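    The bi-linear patch and parametric-storage ideas can be sketched as follows; the corner coordinates and grid resolution are invented, and GridTool itself stores its objects in C linked lists rather than Python lists:

```python
# Sketch of the bi-linear patch concept: surface grid points are stored as
# (u, v) parameters on a patch, so moving a patch corner automatically
# relocates every grid point.  All geometry here is illustrative.

def bilinear(p00, p10, p01, p11, u, v):
    """Bi-linear interpolation of four 3-D corner points at (u, v) in [0,1]^2."""
    return tuple(
        (1 - u) * (1 - v) * a + u * (1 - v) * b + (1 - u) * v * c + u * v * d
        for a, b, c, d in zip(p00, p10, p01, p11)
    )

corners = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (2, 2, 1)]
grid_uv = [(i / 4, j / 4) for i in range(5) for j in range(5)]  # stored parameters
grid_xyz = [bilinear(*corners, u, v) for u, v in grid_uv]

# Moving a corner regenerates the whole grid from the same (u, v) data:
corners[3] = (2, 2, 2)
grid_xyz = [bilinear(*corners, u, v) for u, v in grid_uv]
```

    Because only the (u, v) pairs are stored, the physical grid is always re-derivable after a surface edit, which is what makes the approach convenient in a design-optimization loop.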

  8. Implementation of random contact hole design with CPL mask by using IML technology

    NASA Astrophysics Data System (ADS)

    Hsu, Michael; Van Den Broeke, Doug; Hsu, Stephen; Chen, J. Fung; Shi, Xuelong; Corcoran, Noel; Yu, Linda

    2005-11-01

    Contact hole imaging is a very challenging task for the optical lithography process in IC manufacturing. Many RETs have been proposed to improve the contrast of small contact openings. Scattering Bar (SB) OPC, together with optimized illumination, is without doubt one of the critical enablers for low-k1 contact imaging. In this study, an effective model-based SB OPC based on IML technology is implemented for the contact layer at the 90nm, 65nm, and 45nm nodes. In our full-chip implementation flow, the first step is to determine the critical design area and then to proceed with NA and illumination optimization. We then selected the best NA in combination with optimum illumination via a Diffraction Optical Element (DOE). With optimized illumination, it is possible to construct an interference map for the full-chip mask pattern. Utilizing the interference map, the model-based SB OPC is performed. Next, model OPC can be applied in the presence of SBs for the entire chip. It is important to note that, for patterning at k1 near 0.35 or below, it may be necessary to include 3D mask effects with a high-NA OPC model. With the enhanced DOF provided by IML and an immersion process, a production-worthy low-k1 contact process is feasible.

  9. Investigating the impact of blood pressure increase to the brain using high resolution serial histology and image processing

    NASA Astrophysics Data System (ADS)

    Lesage, F.; Castonguay, A.; Tardif, P. L.; Lefebvre, J.; Li, B.

    2015-09-01

    A combined serial OCT/confocal scanner was designed to image large sections of biological tissue at microscopic resolution. Serial imaging of organs embedded in agarose blocks is performed by cutting through tissue with a vibratome, which sequentially removes slices to reveal new tissue to image, overcoming the limited light penetration encountered in microscopy. Two linear stages move the tissue with respect to the microscope objective, acquiring a 2D grid of volumes (1x1x0.3 mm) with OCT and a 2D grid of images (1x1 mm) with the confocal arm. This process is repeated automatically until the entire sample is imaged. Raw data are then post-processed to re-stitch each individual acquisition and obtain a reconstructed volume of the imaged tissue. This design is being used to investigate correlations between white matter and microvasculature changes with aging and with increases in pulse pressure following transaortic constriction in mice. The dual imaging capability of the system revealed complementary contrast information: OCT imaging reveals changes in refractive indices, giving contrast between white and grey matter in the mouse brain, while transcardial perfusion of FITC or pre-sacrifice injection of Evans Blue shows microvasculature properties in the brain with confocal imaging.
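    The re-stitching of the acquired 2D grid of tiles can be sketched as below; the row-major acquisition order and non-overlapping tiles are assumptions, since real acquisitions typically require overlap registration between neighbouring tiles:

```python
# Toy sketch of mosaicking a row-major 2-D grid of image tiles into one
# large image.  Tiles are plain 2-D lists; no overlap correction is modelled.

def stitch(tiles, rows, cols, tile_h, tile_w):
    """tiles: row-major list of rows*cols 2-D lists, each tile_h x tile_w."""
    mosaic = [[0] * (cols * tile_w) for _ in range(rows * tile_h)]
    for idx, tile in enumerate(tiles):
        r0, c0 = (idx // cols) * tile_h, (idx % cols) * tile_w
        for r in range(tile_h):
            for c in range(tile_w):
                mosaic[r0 + r][c0 + c] = tile[r][c]
    return mosaic

# Four 1x1 "tiles" acquired in row-major order form a 2x2 mosaic:
mosaic = stitch([[[1]], [[2]], [[3]], [[4]]], rows=2, cols=2, tile_h=1, tile_w=1)
```
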

  10. Theoretical modelling and optimization of bubble column dehumidifier for a solar driven humidification-dehumidification system

    NASA Astrophysics Data System (ADS)

    Ranjitha, P. Raj; Ratheesh, R.; Jayakumar, J. S.; Balakrishnan, Shankar

    2018-02-01

    The availability and utilization of energy and water are among the foremost global challenges facing the new millennium. At present, water scarcity has become a global as well as a regional challenge; 40% of the world's population faces water shortages. The challenge of water scarcity can be tackled only by increasing the water supply beyond what is obtained from the hydrological cycle. This can be achieved either by desalinating sea water or by reusing waste water, and the high energy requirement of either process must be overcome. Of the many desalination technologies, humidification-dehumidification (HDH) technology powered by solar energy is widely accepted for small-scale production. Detailed optimization studies on the system have the potential to effectively utilize solar energy for brackish water desalination. Dehumidification, specifically, requires further study because the dehumidifier effectiveness controls the energetic performance of the entire HDH system, owing to the high resistance involved in diffusing dilute vapor through air in a dehumidifier. The present work optimizes the design of a bubble column dehumidifier for a solar-driven desalination process. Optimization is carried out using Matlab simulation. The design process identifies the unique needs of a bubble column dehumidifier in an HDH system.

  11. The synthesis and properties of linear A-π-D-π-A type organic small molecule containing diketopyrrolopyrrole terminal units.

    PubMed

    Zhang, Shanshan; Niu, Qingfen; Sun, Tao; Li, Yang; Li, Tianduo; Liu, Haixia

    2017-08-05

    A novel linear A-π-D-π-A-type organic small molecule, Ph2(PDPP)2, consisting of diketopyrrolopyrrole (DPP) as the acceptor unit, biphenylene as the donor unit, and acetylene units as π-linkages, has been successfully designed and synthesized. Its thermal, photophysical, and electrochemical properties, as well as its photoinduced charge-separation process, were investigated. Ph2(PDPP)2 exhibits high thermal stability and is soluble in common organic solvents such as chloroform and tetrahydrofuran. The photophysical properties show that Ph2(PDPP)2 harvests sunlight over the entire visible spectrum in the thin-film state (300-800 nm). Ph2(PDPP)2 has low band gaps and appropriate energy levels to satisfy the requirements of solution-processable organic solar cells. Efficient photoinduced charge separation was clearly observed between Ph2(PDPP)2 and PC61BM, with a Ksv value as high as 2.13×10^4 M^-1. These excellent properties demonstrate that the designed A-π-D-π-A-type small molecule Ph2(PDPP)2 is a promising candidate donor material for organic photovoltaics. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Enhanced modeling and simulation of EO/IR sensor systems

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Miller, Brian; May, Christopher

    2015-05-01

    The testing and evaluation process developed by the Night Vision and Electronic Sensors Directorate (NVESD) Modeling and Simulation Division (MSD) provides end-to-end systems evaluation, testing, and training of EO/IR sensors. By combining NV-LabCap, the Night Vision Integrated Performance Model (NV-IPM), One Semi-Automated Forces (OneSAF) input sensor file generation, and the Night Vision Image Generator (NVIG) capabilities, NVESD provides confidence to the M&S community that EO/IR sensor developmental and operational testing and evaluation are accurately represented throughout the lifecycle of an EO/IR system. This new process allows for both theoretical and actual sensor testing. A sensor can be theoretically designed and modeled in NV-IPM, and then seamlessly input into wargames for operational analysis. After theoretical design, prototype sensors can be measured using NV-LabCap, then modeled in NV-IPM and input into wargames for further evaluation. This measurement-to-modeling-and-simulation cycle can be repeated throughout the entire life cycle of an EO/IR sensor as needed, including LRIP, full-rate production, and even after depot-level maintenance. This is a prototypical example of how an engineering-level model and higher-level simulations can share models to mutual benefit.

  13. Optical lenses design and experimental investigations of a dynamic focusing unit for a CO2 laser scanning system

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Xu, Yue; Zhang, Huaxin; Liu, Peng; Jiao, Guohua

    2016-09-01

    Laser scanners are critical components in material processing systems for tasks such as welding, cutting, and drilling. To achieve high-accuracy processing, the laser spot size should be small and uniform over the entire objective flat field. However, the traditional static focusing method using an F-theta objective lens is limited by its narrow flat field. To overcome these limitations, a dynamic focusing unit consisting of two lenses is presented in this paper. The dual-lens system has a movable plano-concave lens and a fixed convex lens. As the location of the movable optical element is changed, the focal length is shifted to keep a small focus spot over a broad flat processing field. The optical parameters of the two elements are analyzed theoretically. The spot size is calculated to obtain the relationship between the travel of the first lens and the shift of the system focal length. A Zemax model of the optical system is also built to verify the theoretical design and optimize the optical parameters. The proposed lenses were manufactured, and a test system was built to investigate their performance. The experimental results show that the spot size is smaller than 450 µm over the entire 500×500 mm² field with a CO2 laser. Compared with other dynamic focusing units, this design has fewer lenses and no intermediate focus spot in the optical path. In addition, the focal length changes minimally with shifts of the incident laser beam.

  14. A highly reliable, autonomous data communication subsystem for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Masotto, Thomas; Alger, Linda

    1990-01-01

    The need to meet the stringent performance and reliability requirements of advanced avionics systems has frequently led to implementations that are tailored to a specific application and are therefore difficult to modify or extend. Furthermore, many integrated flight-critical systems are input/output intensive. By using a design methodology that customizes the input/output mechanism for each new application, the cost of implementing new systems becomes prohibitively expensive. One solution to this dilemma is to design computer systems and input/output subsystems that are general purpose but can be easily configured to support the needs of a specific application. The Advanced Information Processing System (AIPS), currently under development, has these characteristics. The design and implementation of the prototype I/O communication system for AIPS is described. AIPS addresses reliability issues related to data communications through the use of reconfigurable I/O networks. When a fault or damage event occurs, communication is restored to functioning parts of the network and the failed or damaged components are isolated. Performance issues are addressed by using a parallelized computer architecture that decouples input/output (I/O) redundancy management and I/O processing from the computational stream of an application. The autonomous nature of the system derives from the highly automated and independent manner in which I/O transactions are conducted for the application, as well as from the fact that hardware redundancy management is entirely transparent to the application.

  15. CPLOAS_2 user manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sallaberry, Cedric Jean-Marie.; Helton, Jon Craig

    2012-10-01

    Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as the probability of loss of assured safety (PLOAS). This report describes the Fortran 90 program CPLOAS_2, which implements the following representations of PLOAS for situations in which both link physical properties and link failure properties are time-dependent: (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory and epistemic uncertainty in the definition and numerical evaluation of PLOAS can be included in the calculations performed by CPLOAS_2.
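    One of the PLOAS definitions above, failure of all SLs before failure of any WL, can be estimated by simple Monte Carlo; the exponential failure-time distributions and rates below are purely illustrative, whereas CPLOAS_2 itself handles fully time-dependent link properties:

```python
# Monte Carlo sketch of PLOAS definition (i): probability that all strong
# links fail before any weak link fails.  Exponential failure times with
# illustrative rates; a well-designed WL fails much faster than an SL,
# so this probability should be small.
import random

def ploas_all_sl_before_any_wl(n_sl=2, n_wl=2, sl_rate=1.0, wl_rate=3.0,
                               trials=100_000, seed=1):
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        sl_times = [rng.expovariate(sl_rate) for _ in range(n_sl)]
        wl_times = [rng.expovariate(wl_rate) for _ in range(n_wl)]
        if max(sl_times) < min(wl_times):   # every SL fails before any WL
            count += 1
    return count / trials

p = ploas_all_sl_before_any_wl()
```

    For these rates the analytic answer is 1/28 ≈ 0.036, so the estimate should land nearby; aleatory uncertainty enters through the sampled failure times, and epistemic uncertainty would be layered on by sampling the rates themselves.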

  16. Anticipatory control: A software retrofit for current plant controllers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parthasarathy, S.; Parlos, A.G.; Atiya, A.F.

    1993-01-01

    The design and simulated testing of an artificial neural network (ANN)-based self-adapting controller for complex process systems are presented in this paper. The proposed controller employs concepts based on anticipatory systems, which have been widely used in the petroleum and chemical industries and are slowly finding their way into the power industry. In particular, model predictive control (MPC) is used for the systematic adaptation of the controller parameters to achieve desirable plant performance over the entire operating envelope. The versatile anticipatory control algorithm developed in this study is projected to enhance plant performance and lend robustness to drifts in plant parameters and to modeling uncertainties. This novel technique of integrating recurrent ANNs with a conventional controller structure appears capable of controlling complex, nonlinear, and nonminimum-phase process systems. The direct, on-line adaptive control algorithm presented in this paper considers the plant response over a finite time horizon, diminishing the need for manual control or process interruption for controller gain tuning.

  17. Mission Life Thermal Analysis and Environment Correlation for the Lunar Reconnaissance Orbiter

    NASA Technical Reports Server (NTRS)

    Garrison, Matthew B.; Peabody, Hume

    2012-01-01

    Standard thermal analysis practice stacks worst-case conditions, including environmental heat loads, thermo-optical properties, and orbital beta angles. This results in the design being driven by a few bounding thermal cases, although those cases may represent only a very small portion of the actual mission life. The NASA Goddard Space Flight Center Thermal Branch developed a procedure to predict flight temperatures over the entire mission life, assuming a known beta angle progression, variation in the thermal environment, and a degradation rate for the coatings. This was applied to the Global Precipitation Measurement core spacecraft. To assess the validity of this procedure, this work applies a similar process to the Lunar Reconnaissance Orbiter. A flight-correlated thermal model was exercised to give predictions of thermal performance over the mission life. These results were then compared against flight data from the first two years of the spacecraft's use. This comparison is used to validate the process and to suggest possible improvements for future analyses.

  18. Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.

    2002-01-01

    Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early stage sensory and information acquisition functions. The present research compares the effects of AA to the entire range of information processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA is dependent on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, as compared to AA applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA, as compared to completely manual control. These results are discussed in terms of implications for AA design for aviation.

  19. Identifying the Root Causes of Wait States in Large-Scale Parallel Applications

    DOE PAGES

    Böhme, David; Geimer, Markus; Arnold, Lukas; ...

    2016-07-20

    Driven by growing application requirements and accelerated by current trends in microprocessor design, the number of processor cores on modern supercomputers is increasing from generation to generation. However, load or communication imbalance prevents many codes from taking advantage of the available parallelism, as delays of single processes may spread wait states across the entire machine. Moreover, when employing complex point-to-point communication patterns, wait states may propagate along far-reaching cause-effect chains that are hard to track manually and that complicate an assessment of the actual costs of an imbalance. Building on earlier work by Meira Jr. et al., we present a scalable approach that identifies program wait states and attributes their costs, in terms of resource waste, to their original cause. Ultimately, by replaying event traces in parallel both forward and backward, we can identify the processes and call paths responsible for the most severe imbalances, even for runs with hundreds of thousands of processes.
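    The cost-attribution idea can be sketched in miniature: charge each receive-side wait to the process whose late send caused it. The trace format below is invented for illustration; the actual approach replays complete event traces forward and backward to follow multi-hop cause-effect chains:

```python
# Toy sketch of wait-state cost attribution.  A receive that starts before
# the matching send completes accumulates wait time, charged back to the
# late sender.  Timestamps and process names are illustrative.

def attribute_wait_states(messages):
    """messages: list of (sender, send_done_t, receiver, recv_start_t)."""
    cost_by_cause = {}
    for sender, send_done, receiver, recv_start in messages:
        wait = max(0.0, send_done - recv_start)
        if wait > 0:
            cost_by_cause[sender] = cost_by_cause.get(sender, 0.0) + wait
    return cost_by_cause

trace = [
    ("p0", 5.0, "p1", 2.0),   # p1 waits 3.0 on p0's late send
    ("p1", 7.0, "p2", 7.5),   # no wait: send completed first
    ("p0", 9.0, "p2", 8.0),   # p2 waits 1.0 on p0
]
costs = attribute_wait_states(trace)
```

    Ranking processes by attributed cost then points directly at the root cause of the imbalance (here, p0), rather than at the processes that merely exhibit the waiting.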

  1. The electrical properties of zero-gravity processed immiscibles

    NASA Technical Reports Server (NTRS)

    Lacy, L. L.; Otto, G. H.

    1974-01-01

    When dispersed or mixed immiscibles are solidified on earth, a large amount of separation of the constituents takes place due to differences in densities. However, when the immiscibles are dispersed and solidified in zero-gravity, density separation does not occur, and unique composite solids can be formed with many new and promising electrical properties. By measuring the electrical resistivity and superconducting critical temperature, Tc, of zero-g processed Ga-Bi samples, it has been found that the electrical properties of such materials are entirely different from the basic constituents and the ground control samples. Our results indicate that space processed immiscible materials may form an entirely new class of electronic materials.

  2. A Passive Earth-Entry Capsule for Mars Sample Return

    NASA Technical Reports Server (NTRS)

    Mitcheltree, Robert A.; Kellas, Sotiris

    1999-01-01

    A combination of aerodynamic analysis and testing, aerothermodynamic analysis, structural analysis and testing, impact analysis and testing, thermal analysis, ground characterization tests, configuration packaging, and trajectory simulation are employed to determine the feasibility of an entirely passive Earth entry capsule for the Mars Sample Return mission. The design circumvents the potential failure modes of a parachute terminal descent system by replacing that system with passive energy absorbing material to cushion the Mars samples during ground impact. The suggested design utilizes a spherically blunted 45-degree half-angle cone forebody with an ablative heat shield. The primary structure is a hemispherical, composite sandwich enclosing carbon foam energy absorbing material. Though no demonstration test of the entire system is included, results of the tests and analysis presented indicate that the design is a viable option for the Mars Sample Return Mission.

  3. Recent progress in plasmonic colour filters for image sensor and multispectral applications

    NASA Astrophysics Data System (ADS)

    Pinton, Nadia; Grant, James; Choubey, Bhaskar; Cumming, David; Collins, Steve

    2016-04-01

    Using nanostructured thin metal films as colour filters offers several important advantages, in particular high tunability across the entire visible spectrum and part of the infrared region, as well as compatibility with conventional CMOS processes. Since 2003, the field of plasmonic colour filters has evolved rapidly, and several different designs and materials, or combinations of materials, have been proposed and studied. In this paper we present a simulation study for a single-step lithographically patterned multilayer structure able to provide competitive transmission efficiencies above 40% together with a FWHM of the order of 30 nm across the visible spectrum. The total thickness of the proposed filters is less than 200 nm and is constant for every wavelength, unlike e.g. resonant-cavity-based filters such as Fabry-Perot, which require a variable stack of several layers according to the working frequency; the passband characteristics are entirely controlled by changing the lithographic pattern. It will also be shown that a key to obtaining a narrow-band optical response lies in the dielectric environment of a nanostructure, and that a symmetric structure is not necessary to ensure good coupling between the SPPs at the top and bottom interfaces. Moreover, an analytical method to evaluate the periodicity, given a specific structure and a desired working wavelength, will be proposed and its accuracy demonstrated. This method conveniently eliminates the need to optimize the design of a filter numerically, i.e. by running several time-consuming simulations with different periodicities.

  4. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    PubMed

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around-the-clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction of 94 % from 2008 to 2011 in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally, more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 using the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized, if not eliminated entirely.

  5. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and the state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment of project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site- and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.

  6. Demonstration of the James Webb Space Telescope commissioning on the JWST testbed telescope

    NASA Astrophysics Data System (ADS)

    Acton, D. Scott; Towell, Timothy; Schwenker, John; Swensen, John; Shields, Duncan; Sabatke, Erin; Klingemann, Lana; Contos, Adam R.; Bauer, Brian; Hansen, Karl; Atcheson, Paul D.; Redding, David; Shi, Fang; Basinger, Scott; Dean, Bruce; Burns, Laura

    2006-06-01

    The one-meter Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate the design and implementation of the wavefront sensing and control (WFS&C) capabilities of the James Webb Space Telescope (JWST). The TBT is used to develop and verify the WFS&C algorithms, check the communication interfaces, validate the WFS&C optical components and actuators, and provide risk reduction opportunities for test approaches for later full-scale cryogenic vacuum testing of the observatory. In addition, the TBT provides a vital opportunity to demonstrate the entire WFS&C commissioning process. This paper describes recent WFS&C commissioning experiments that have been performed on the TBT.

  7. Zero-Gravity Atmospheric Cloud Physics Experiment Laboratory engineering concepts/design tradeoffs. Volume 1: Study results

    NASA Technical Reports Server (NTRS)

    Greco, R. V.; Eaton, L. R.; Wilkinson, H. C.

    1974-01-01

    The work accomplished from January 1974 to October 1974 for the Zero-Gravity Atmospheric Cloud Physics Laboratory is summarized. The definition and development of an atmospheric cloud physics laboratory and the selection and delineation of candidate experiments that require the unique environment of zero gravity or near zero gravity are reported. The experiment program and the laboratory concept for a Spacelab payload to perform cloud microphysics research are defined. This multimission laboratory is planned to be available to the entire scientific community to utilize in furthering the basic understanding of cloud microphysical processes and phenomena, thereby contributing to improved weather prediction and ultimately to beneficial weather control and modification.

  8. Mean Flow Augmented Acoustics in Rocket Systems

    NASA Technical Reports Server (NTRS)

    Fischbach, Sean R.

    2015-01-01

    Combustion instability in solid rocket motors and liquid engines is a complication that continues to plague designers and engineers. Many rocket systems experience violent fluctuations in pressure, velocity, and temperature originating from the complex interactions between the combustion process and gas dynamics. In severe cases of combustion instability, fluctuation amplitudes can reach values equal to or greater than the average chamber pressure. Large-amplitude oscillations lead to damaged injectors, loss of rocket performance, damaged payloads, and in some cases breach of case/loss of mission. Historic difficulties in modeling and predicting combustion instability have reduced most rocket systems experiencing instability to a costly fix-through-testing paradigm or to scrapping the system entirely.

  9. A statistical model for radar images of agricultural scenes

    NASA Technical Reports Server (NTRS)

    Frost, V. S.; Shanmugan, K. S.; Holtzman, J. C.; Stiles, J. A.

    1982-01-01

    The presently derived and validated statistical model for radar images containing many different homogeneous fields predicts the probability density functions of radar images of entire agricultural scenes, thereby allowing histograms of large scenes composed of a variety of crops to be described. Seasat-A SAR images of agricultural scenes are accurately predicted by the model on the basis of three assumptions: each field has the same SNR, all target classes cover approximately the same area, and the true reflectivity characterizing each individual target class is a uniformly distributed random variable. The model is expected to be useful in the design of data processing algorithms and for scene analysis using radar images.
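
    The model's three assumptions lend themselves to a small simulation. The sketch below assumes exponential (speckle-like) intensity statistics within each field; the function name and parameter values are illustrative, not the paper's exact formulation: equal field areas, per-field reflectivity drawn uniformly at random, and a scene histogram built as the resulting mixture.

```python
import numpy as np

# Sketch of the three model assumptions: equal field areas, true
# reflectivity of each field drawn from a uniform distribution, and
# speckle-like (here: exponential) intensity statistics per field.
rng = np.random.default_rng(42)

def scene_density(n_fields, pixels_per_field, bins=50):
    reflectivities = rng.uniform(0.5, 2.0, n_fields)   # uniform true reflectivity
    samples = np.concatenate([rng.exponential(r, pixels_per_field)
                              for r in reflectivities])
    density, edges = np.histogram(samples, bins=bins, density=True)
    return density, edges

density, edges = scene_density(n_fields=20, pixels_per_field=1000)
```

    The resulting normalized histogram approximates the probability density function of the entire agricultural scene as a mixture over its homogeneous fields.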

  10. 24 CFR 3280.904 - Specific requirements for designing the transportation system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the transportation system. 3280.904 Section 3280.904 Housing and Urban Development Regulations... SAFETY STANDARDS Transportation § 3280.904 Specific requirements for designing the transportation system. (a) General. The entire system (frame, drawbar and coupling mechanism, running gear assembly, and...

  11. AIRSAR Web-Based Data Processing

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne

    2007-01-01

    The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. Also, it provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system to automatically generate co-registered multi-frequency images from both polarimetric and interferometric data collection modes in 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use in the remote-sensor community. Features include Survey Automation Processing, in which the software can automatically generate a quick-look image from an entire 90-GB tape of SAR raw data (32 MB/s) overnight without operator intervention. Also, the software allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user-friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates the products according to each data processing request stored in the database via a queue management system. Coregistered multi-frequency images are generated automatically as the software performs polarimetric and/or interferometric SAR data processing in ground and/or slant projection, according to user processing requests for one of the 12 radar modes.

  12. Automatic Neural Processing of Disorder-Related Stimuli in Social Anxiety Disorder: Faces and More

    PubMed Central

    Schulz, Claudia; Mothes-Lasch, Martin; Straube, Thomas

    2013-01-01

    It has been proposed that social anxiety disorder (SAD) is associated with automatic information processing biases resulting in hypersensitivity to signals of social threat such as negative facial expressions. However, the nature and extent of automatic processes in SAD at the behavioral and neural levels is not yet entirely clear. The present review summarizes neuroscientific findings on automatic processing of facial threat but also other disorder-related stimuli such as emotional prosody or negative words in SAD. We review initial evidence for automatic activation of the amygdala, insula, and sensory cortices as well as for automatic early electrophysiological components. However, findings vary depending on tasks, stimuli, and neuroscientific methods. Only a few studies set out to examine automatic neural processes directly, and systematic attempts are as yet lacking. We suggest that future studies should: (1) use different stimulus modalities, (2) examine different emotional expressions, (3) compare findings in SAD with other anxiety disorders, (4) use more sophisticated experimental designs to investigate features of automaticity systematically, and (5) combine different neuroscientific methods (such as functional neuroimaging and electrophysiology). Finally, the understanding of neural automatic processes could also provide hints for therapeutic approaches. PMID:23745116

  13. Semantic processing during morphological priming: an ERP study.

    PubMed

    Beyersmann, Elisabeth; Iakimova, Galina; Ziegler, Johannes C; Colé, Pascale

    2014-09-04

    Previous research has yielded conflicting results regarding the onset of semantic processing during morphological priming. The present study was designed to further explore the time-course of morphological processing using event-related potentials (ERPs). We conducted a primed lexical decision study comparing a morphological (LAVAGE - laver [washing - wash]), a semantic (LINGE - laver [laundry - wash]), an orthographic (LAVANDE - laver [lavender - wash]), and an unrelated control condition (HOSPICE - laver [nursing home - wash]), using the same targets across the four priming conditions. The behavioral data showed significant effects of morphological and semantic priming, with the magnitude of morphological priming being significantly larger than the magnitude of semantic priming. The ERP data revealed significant morphological but no semantic priming at 100-250 ms. Furthermore, a reduction of the N400 amplitude in the morphological condition compared to the semantic and orthographic condition demonstrates that the morphological priming effect was not entirely due to the semantic or orthographic overlap between the prime and the target. The present data reflect an early process of semantically blind morphological decomposition, and a later process of morpho-semantic decomposition, which we discuss in the context of recent morphological processing theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Dose rate prediction methodology for remote handled transuranic waste workers at the waste isolation pilot plant.

    PubMed

    Hayes, Robert

    2002-10-01

    An approach is described for estimating future dose rates to Waste Isolation Pilot Plant workers processing remote handled transuranic waste. The waste streams will come from the entire U.S. Department of Energy complex and can take virtually any form arising from the processing sequences of defense-related production, radiochemistry, activation, and related work. For this reason, the average waste matrix from all generator sites is used to estimate the average radiation fields over the facility lifetime. Innovative new techniques were applied to estimate expected radiation fields. Non-linear curve-fitting techniques were used to predict exposure-rate profiles from cylindrical sources using closed-form equations for lines and disks. This information becomes the basis for Safety Analysis Report dose rate estimates and for present and future ALARA design reviews when attempts are made to reduce worker doses.
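
    The curve-fitting idea can be sketched with assumed numbers (the source length, distances, and strength constant below are illustrative, not the facility's data). For a point on the perpendicular bisector of a line source, the exposure rate is proportional to arctan(L/2d)/d, and the source-strength constant enters linearly, so a one-parameter least-squares fit recovers it directly:

```python
import numpy as np

def line_source_shape(d, L=1.0):
    # Geometry factor for a point on the perpendicular bisector of a
    # line source of length L (assumed 1 m here) at distance d.
    return np.arctan(L / (2.0 * d)) / d

# Synthetic "measured" profile with an assumed true strength constant of 4.2.
d = np.linspace(0.2, 3.0, 20)
measured = 4.2 * line_source_shape(d)

# The strength enters linearly, so the least-squares fit is closed-form.
f = line_source_shape(d)
strength = float(np.dot(measured, f) / np.dot(f, f))
```

    Real measured profiles would be noisy, in which case the same closed-form geometry factors can be fed to an iterative non-linear fitter to estimate both strength and geometry parameters.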

  15. Smart CMOS image sensor for lightning detection and imaging.

    PubMed

    Rolando, Sébastien; Goiffon, Vincent; Magnan, Pierre; Corbière, Franck; Molina, Romain; Tulet, Michel; Bréart-de-Boisanger, Michel; Saint-Pé, Olivier; Guiry, Saïprasad; Larnaudie, Franck; Leone, Bruno; Perez-Cuevas, Leticia; Zayer, Igor

    2013-03-01

    We present a CMOS image sensor dedicated to lightning detection and imaging. The detector has been designed to evaluate the potential of an on-chip lightning detection solution based on a smart sensor. This evaluation is performed as part of the predevelopment phase of the lightning detector that will be implemented in the Meteosat Third Generation Imager satellite for the European Space Agency. The lightning detection process is performed by a smart detector combining an in-pixel frame-to-frame difference comparison with an adjustable threshold and on-chip digital processing, allowing efficient localization of a faint lightning pulse on the entire large-format array at a frequency of 1 kHz. A CMOS prototype sensor with a 256×256 pixel array and a 60 μm pixel pitch has been fabricated using a 0.35 μm 2P 5M technology and tested to validate the selected detection approach.
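
    The in-pixel frame-to-frame difference with an adjustable threshold can be mimicked in software. This is a minimal sketch, not the sensor's actual on-chip logic; the array sizes and threshold value are assumptions:

```python
import numpy as np

def detect_events(prev_frame, frame, threshold):
    """Frame-to-frame difference with an adjustable threshold:
    flag pixels whose brightness jumped by more than `threshold`."""
    diff = frame.astype(np.int32) - prev_frame.astype(np.int32)
    hits = np.argwhere(diff > threshold)
    return [(int(r), int(c)) for r, c in hits]   # (row, col) of candidate pulses

prev = np.zeros((8, 8), dtype=np.uint8)
cur = prev.copy()
cur[3, 5] = 200                                  # faint transient pulse
print(detect_events(prev, cur, threshold=50))    # -> [(3, 5)]
```

    On the real sensor this comparison happens in every pixel in parallel, which is what makes kilohertz-rate localization over the whole array feasible.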

  16. Automated ammunition logistics for the Crusader program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speaks, D.M.; Kring, C.T.; Lloyd, P.D.

    1997-03-01

    The US Army's next generation artillery system is called the Crusader. A self-propelled howitzer and a resupply vehicle constitute the Crusader system, which will be designed for improved mobility, increased firepower, and greater survivability than current generation vehicles. The Army's Project Manager, Crusader, gave Oak Ridge National Laboratory (ORNL) the task of developing and demonstrating a concept for the resupply vehicle. The resupply vehicle is intended to sustain the howitzer with ammunition and fuel and will significantly increase capabilities over those of current resupply vehicles. Ammunition is currently processed and transferred almost entirely by hand. ORNL identified and evaluated various concepts for automated upload, processing, storage, docking and delivery. Each of the critical technologies was then developed separately and demonstrated on discrete test platforms. An integrated technology demonstrator, incorporating each of the individual technology components to realistically simulate performance of the selected vehicle concept, was developed and successfully demonstrated for the Army.

  17. Synthesis of nanoparticles using ethanol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jia Xu

    The present disclosure relates to methods for producing nanoparticles. The nanoparticles may be made using ethanol as the solvent and the reductant to fabricate noble-metal nanoparticles with narrow particle size distributions, and to coat a thin metal shell on other metal cores. With or without carbon supports, particle size is controlled by fine-tuning the reduction power of ethanol, by adjusting the temperature, and by adding an alkaline solution during syntheses. The thickness of the added or coated metal shell can be varied easily from sub-monolayer to multiple layers in a seed-mediated growth process. The entire synthesis of designed core-shell catalysts can be completed using metal salts as the precursors with more than 98% yield; and, substantially no cleaning processes are necessary apart from simple rinsing. Accordingly, this method is considered to be a "green" chemistry method.

  18. Achieving continuous improvement in laboratory organization through performance measurements: a seven-year experience.

    PubMed

    Salinas, Maria; López-Garrigós, Maite; Gutiérrez, Mercedes; Lugo, Javier; Sirvent, Jose Vicente; Uris, Joaquin

    2010-01-01

    Laboratory performance can be measured using a set of model key performance indicators (KPIs). The design and implementation of KPIs are important issues. KPI results from 7 years are reported, and their implementation, monitoring, objectives, interventions, result reporting and delivery are analyzed. The KPIs of the entire laboratory process were obtained using Laboratory Information System (LIS) registers. These were collected automatically using a data warehouse application, spreadsheets and external quality program reports. Customer satisfaction was assessed using surveys. Nine model laboratory KPIs were proposed and measured, and results from some example KPIs used in our laboratory are reported. The corresponding corrective measures or the implementation of objectives led to improvement in the associated KPI results. Measurement of laboratory performance using KPIs and a data warehouse application that continuously collects registers and calculates KPIs confirmed the reliability of the indicators, indicator acceptability and usability for users, and continuous process improvement.
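
    A KPI calculated automatically from LIS registers might look like the sketch below. This is an assumed example indicator (mean turnaround time); the register layout and timestamps are hypothetical, not the laboratory's actual schema:

```python
from datetime import datetime

def turnaround_minutes(registers):
    """Sketch of one KPI: mean request-to-report turnaround computed
    from (received, reported) timestamp pairs in LIS registers."""
    spans = [(reported - received).total_seconds() / 60.0
             for received, reported in registers]
    return sum(spans) / len(spans)

regs = [(datetime(2010, 1, 1, 8, 0), datetime(2010, 1, 1, 9, 0)),
        (datetime(2010, 1, 1, 8, 30), datetime(2010, 1, 1, 9, 0))]
print(turnaround_minutes(regs))   # -> 45.0
```

    In a data warehouse setting, such a function would run periodically over newly collected registers so that the indicator is monitored continuously rather than computed ad hoc.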

  19. SkyMapper Filter Set: Design and Fabrication of Large-Scale Optical Filters

    NASA Astrophysics Data System (ADS)

    Bessell, Michael; Bloxham, Gabe; Schmidt, Brian; Keller, Stefan; Tisserand, Patrick; Francis, Paul

    2011-07-01

    The SkyMapper Southern Sky Survey will be conducted from Siding Spring Observatory with u, v, g, r, i, and z filters that comprise glued glass combination filters with dimensions of 309 × 309 × 15 mm. In this article we discuss the rationale for our bandpasses and physical characteristics of the filter set. The u, v, g, and z filters are entirely glass filters, which provide highly uniform bandpasses across the complete filter aperture. The i filter uses glass with a short-wave pass coating, and the r filter is a complete dielectric filter. We describe the process by which the filters were constructed, including the processes used to obtain uniform dielectric coatings and optimized narrowband antireflection coatings, as well as the technique of gluing the large glass pieces together after coating using UV transparent epoxy cement. The measured passbands, including extinction and CCD QE, are presented.

  20. Remotely Operated Aircraft (ROA) Impact on the National Airspace System (NAS) Work Package: Automation Impacts of ROA's in the NAS

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The purpose of this document is to analyze the impact of Remotely Operated Aircraft (ROA) operations on current and planned Air Traffic Control (ATC) automation systems in the En Route, Terminal, and Traffic Flow Management domains. The operational aspects of ROA flight, while similar, are not entirely identical to those of their manned counterparts and may not have been considered within the time-horizons of the automation tools. This analysis was performed to determine if the flight characteristics of ROAs would be compatible with current and future NAS automation tools. Improvements to existing systems and processes are recommended that would give Air Traffic Controllers an indication that a particular aircraft is an ROA, along with modifications to IFR flight plan processing algorithms and/or designation of airspace where an ROA will be operating for long periods of time.

  1. A RESTful Service Oriented Architecture for Science Data Processing

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Tilmes, C.; Durbin, P.; Masuoka, E.

    2012-12-01

    The Atmospheric Composition Processing System is an implementation of a RESTful Service Oriented Architecture which handles incoming data from the Ozone Monitoring Instrument and the Ozone Monitoring and Profiler Suite aboard the Aura and NPP spacecraft, respectively. The system has been built entirely from open source components, such as Postgres, Perl, and SQLite, and has leveraged the vast resources of the Comprehensive Perl Archive Network (CPAN). The modular design of the system also allows many of the components to be easily released, integrated into the CPAN ecosystem, and reused independently. At minimal expense, the CPAN infrastructure and community provide peer review, feedback and continuous testing in a wide variety of environments and architectures. A well defined set of conventions also facilitates dependency management, packaging, and distribution of code. Test driven development also provides a way to ensure stability despite a continuously changing base of dependencies.

  2. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessments - test construction, test implementation, results evaluation, storage and maintenance of the developed database, its statistical processing, analysis and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of data, needed to design a computer system for automation of certain psychological assessments, is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis and interpretation. A software project for computer administering personality psychological tests is suggested.

  3. Anisotropic growth of NiO nanorods from Ni nanoparticles by rapid thermal oxidation.

    PubMed

    Koga, Kenji; Hirasawa, Makoto

    2013-09-20

    NiO nanorods with extremely high crystallinity were grown by rapid thermal oxidation through exposure of Ni nanoparticles (NPs) heated above 400 °C to oxygen. Oxidation proceeds by nucleation of a NiO island on a Ni NP that grows anisotropically to produce a NiO nanorod. This process differs completely from that under mild oxidation conditions, where the surface of the NPs is completely covered with an oxide film during the early stage of oxidation. The observed novel behaviour strongly suggests an interfacial oxidation mechanism driven by the dissolution of adsorbed oxygen into the Ni NP sub-surface region, subsequent diffusion, and reaction at the NiO/Ni interface. The early oxidation conditions of metal NPs exert a significant influence on the entire oxidation process at the nanoscale and are therefore inherently important for the precise morphological control of oxidized NPs to design functional nanomaterials.

  4. Vision-based in-line fabric defect detection using yarn-specific shape features

    NASA Astrophysics Data System (ADS)

    Schneider, Dorian; Aach, Til

    2012-01-01

    We develop a methodology for automatic in-line flaw detection in industrial woven fabrics. Where state of the art detection algorithms apply texture analysis methods to operate on low-resolved (~200 ppi) image data, we describe here a process flow to segment single yarns in high-resolved (~1000 ppi) textile images. Four yarn shape features are extracted, allowing a precise detection and measurement of defects. The degree of precision reached allows a classification of detected defects according to their nature, providing an innovation in the field of automatic fabric flaw detection. The design has been carried out to meet real time requirements and face adverse conditions caused by loom vibrations and dirt. The entire process flow is discussed followed by an evaluation using a database with real-life industrial fabric images. This work pertains to the construction of an on-loom defect detection system to be used in manufacturing practice.
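
    Once individual yarns are segmented, classifying defects from shape features can be as simple as thresholding deviations from nominal values. The sketch below is hypothetical (a single width feature, assumed nominal value and tolerance), not the authors' four-feature classifier:

```python
def classify_defects(yarn_widths, nominal, tol):
    """Sketch: flag yarns whose measured width deviates from the
    nominal value by more than tol, and name the defect type."""
    defects = []
    for i, w in enumerate(yarn_widths):
        if abs(w - nominal) > tol:
            kind = "thick place" if w > nominal else "thin place"
            defects.append((i, kind))
    return defects

print(classify_defects([1.0, 1.4, 0.6, 1.05], nominal=1.0, tol=0.2))
# -> [(1, 'thick place'), (2, 'thin place')]
```

    Measuring per-yarn features rather than aggregate texture statistics is what makes this kind of defect-type classification possible.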

  5. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
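
    The optimization view described above can be sketched as a budget-constrained selection problem. The greedy heuristic, activity names, costs, and risk-reduction scores below are all illustrative assumptions, not the reported tool's actual algorithm:

```python
def select_activities(activities, budget):
    """Greedy sketch: pick assurance activities by risk reduction per
    unit cost until the budget is exhausted.
    activities: list of (name, cost, risk_reduction)."""
    chosen, remaining = [], budget
    for name, cost, gain in sorted(activities,
                                   key=lambda a: a[2] / a[1],
                                   reverse=True):
        if cost <= remaining:
            chosen.append(name)
            remaining -= cost
    return chosen

acts = [("code inspection", 3, 9), ("unit tests", 2, 4), ("design review", 4, 6)]
print(select_activities(acts, 5))   # -> ['code inspection', 'unit tests']
```

    A production tool would typically solve this exactly (it is a knapsack-style problem) and model interactions between activities, but the greedy ratio rule conveys the core trade-off.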

  6. Software for Analyzing Laminar-to-Turbulent Flow Transitions

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2004-01-01


  7. Application of the multi-disciplinary thematic seminar method in two homecare cases - a comparative study.

    PubMed

    Scandurra, Isabella; Hägglund, Maria; Koch, Sabine

    2008-01-01

A significant problem with current health information technologies is that they poorly support the collaborative work of healthcare professionals, sometimes leading to fragmented workflow and disrupted healthcare processes. This paper presents two homecare cases, both applying multi-disciplinary thematic seminars (MdTS) as a collaborative method for user needs elicitation and requirements specification. It describes how MdTS was applied in each case to elicit user needs from different perspectives and align them with the work practices of the collaborating professions. Despite their different objectives, the two cases confirmed that MdTS emphasizes the "points of intersection" in cooperative work. Different user groups with similar, yet distinct, needs reached a common understanding of the entire work process, agreed upon requirements, and participated in the design of prototypes supporting cooperative work. MdTS proved applicable in both exploratory and normative studies aiming to elicit specific requirements in a cooperative environment.

  8. [The workplace injury trends in the petrochemical industry: from data analysis to risk management].

    PubMed

    Campo, Giuseppe; Martini, Benedetta

    2013-01-01

The most recent INAIL data show that, in 2009-2011, the accident frequency rate and the severity rate of workplace injuries in the chemical industry were lower than for the total non-agricultural workforce. The chemical industry, primarily because of its complex and hazardous work processes, requires an appropriate system for assessing and monitoring specific risks. The implementation of Responsible Care, a risk management system specific to the chemical industry, in 1984 was a milestone in the development of chemical companies' critical awareness of risk management. Responsible Care is a risk management system designed around the risk profiles of this type of enterprise, integrating safety, health and environment. A risk management system suitable for the needs of a chemical company should extend its coverage, beyond the responsible management of products throughout the entire production cycle, to the issues of corporate responsibility.

  9. Plastic-Based Structurally Programmable Microfluidic Biochips for Clinical Diagnostics

    DTIC Science & Technology

    2005-05-01

[Report excerpt; front matter lists biocompatibility criteria of the selected UV adhesive Loctite 3211™.] I. Executive Summary: The objective of this project is to develop a smart ... added into the biochip design to improve the biocompatibility of the entire biochip. Detailed problems include: • design and development of a structured ... biocompatible biosensor array; • design and development of the sensor-to-circuit interface; • electronic control system and analyzer design of the ...

  10. Balloon-borne three-meter telescope for far-infrared and submillimeter astronomy

    NASA Technical Reports Server (NTRS)

    Fazio, Giovanni G.

    1986-01-01

The study and revision of the gimbal design of the Three-Meter Balloon Borne Telescope (TMBBT) is discussed. Efforts were made to eliminate the alignment and limited-rotation problems inherent in the flex-pivot design. A new design that replaces the flex-pivots with ball bearings was developed and its performance analyzed. An error analysis for the entire gondola pointing system was also prepared.

  11. Integrating ergonomics knowledge into business-driven design projects: The shaping of resource constraints in engineering consultancy.

    PubMed

    Hall-Andersen, Lene Bjerg; Neumann, Patrick; Broberg, Ole

    2016-10-17

The integration of ergonomics knowledge into engineering projects leads to both healthier and more efficient workplaces. Little is known, however, about how ergonomics knowledge is integrated into design practice in engineering consultancies. This study explores how organizational resources can constrain the integration of ergonomics knowledge into engineering design projects in a business-driven setting, and how ergonomists cope with these resource constraints. An exploratory case study in an engineering consultancy was conducted. A total of 27 participants were interviewed. Data were collected through semi-structured interviews, observations, and documentary studies. Interviews were transcribed, coded, and categorized into themes. From the analysis, five overall themes emerged as major constituents of resource constraints: 1) maximizing project revenue, 2) payment for ergonomics services, 3) value of ergonomics services, 4) role of the client, and 5) coping strategies to overcome resource constraints. We hypothesize that resource constraints were shaped by the sub-optimization of costs in design projects: the economic contribution of ergonomics measures was not evaluated over the entire life cycle of a designed workplace. Coping strategies included teaming up with engineering designers in the sales process or creating an alliance with ergonomists in the client organization.

  12. Post-Optimality Analysis In Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

    Braun, Robert D.; Kroo, Ilan M.; Gage, Peter J.

    1993-01-01

This analysis pertains to the applicability of optimal sensitivity information to aerospace vehicle design. An optimal sensitivity (or post-optimality) analysis refers to computations performed once the initial optimization problem is solved. These computations may be used to characterize the design space about the present solution and infer changes in this solution as a result of constraint or parameter variations, without reoptimizing the entire system. The present analysis demonstrates that post-optimality information generated through first-order computations can be used to accurately predict the effect of constraint and parameter perturbations on the optimal solution. This assessment is based on the solution of an aircraft design problem in which the post-optimality estimates are shown to be within a few percent of the true solution over the practical range of constraint and parameter variations. Through solution of a reusable, single-stage-to-orbit, launch vehicle design problem, this optimal sensitivity information is also shown to improve the efficiency of the design process. For a hierarchically decomposed problem, this computational efficiency is realized by estimating the main-problem objective gradient through optimal sensitivity calculations. By reducing the need for finite differentiation of a re-optimized subproblem, a significant decrease in the number of objective function evaluations required to reach the optimal solution is obtained.
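The first-order post-optimality idea can be illustrated on a toy problem (not the paper's aircraft or launch-vehicle design): minimize x1^2 + x2^2 subject to x1 + x2 = c. At the optimum, the Lagrange multiplier equals the sensitivity df*/dc, so the effect of a constraint perturbation can be estimated without re-optimizing.

```python
# Toy post-optimality illustration. For f = x1^2 + x2^2 with x1 + x2 = c,
# the optimum is x1 = x2 = c/2, f* = c^2/2, and the Lagrange multiplier
# lambda = c gives the first-order sensitivity of f* to the constraint level.

def solve(c):
    """Analytic optimum of the toy equality-constrained problem."""
    f_star = c * c / 2.0
    lam = c          # df*/dc at the optimum
    return f_star, lam

c = 4.0
f0, lam = solve(c)
delta = 0.2                        # perturb the constraint: c -> c + delta
estimate = f0 + lam * delta        # first-order post-optimality prediction
true_value, _ = solve(c + delta)   # re-optimized solution, for comparison
error = true_value - estimate      # leftover second-order term, delta^2 / 2
```

Here the estimate (8.8) is within a fraction of a percent of the re-optimized value (8.82), mirroring the few-percent accuracy the abstract reports over practical perturbation ranges.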

  13. Limits of thermochemical and photochemical syntheses of gaseous fuels: a finite-time thermodynamic analysis. Annual report, September 1983-February, 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, R.S.

The objectives of this project are to develop methods for the evaluation of syntheses of gaseous fuels in terms of their optimum possible performance, particularly when they are required to supply those fuels at nonzero rates. The first objective is entirely in the tradition of classical thermodynamics: setting performance limits for processes, given the characteristics and constraints that define them. The new element which this project introduces is the capability to set limits more realistic than those from classical thermodynamics, by including the influence of the rate or duration of a process on its performance. The development of these analyses is a natural step in the evolution represented by the evaluative papers of Appendix IV, e.g., by Funk et al., Abraham, Shinnar, Bilgen and Fletcher. A second objective is to determine how any given process should be carried out, within its constraints, in order to yield its optimum performance, and to use this information whenever possible to help guide the design of that process.

  14. Modeling the economics of landfilling organic processing waste streams

    NASA Astrophysics Data System (ADS)

    Rosentrater, Kurt A.

    2005-11-01

    As manufacturing industries become more cognizant of the ecological effects that their firms have on the surrounding environment, their waste streams are increasingly becoming viewed not only as materials in need of disposal, but also as resources that can be reused, recycled, or reprocessed into valuable products. Within the food processing sector are many examples of various liquid, sludge, and solid biological and organic waste streams that require remediation. Alternative disposal methods for food and other bio-organic manufacturing waste streams are increasingly being investigated. Direct shipping, blending, extrusion, pelleting, and drying are commonly used to produce finished human food, animal feed, industrial products, and components ready for further manufacture. Landfilling, the traditional approach to waste remediation, however, should not be dismissed entirely. It does provide a baseline to which all other recycling and reprocessing options should be compared. This paper discusses the implementation of a computer model designed to examine the economics of landfilling bio-organic processing waste streams. Not only are these results applicable to food processing operations, but any industrial or manufacturing firm would benefit from examining the trends discussed here.
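The baseline comparison the abstract describes can be sketched with a minimal cost model. All figures below (tipping fee, hauling cost, processing cost, product revenue) are invented for illustration; the actual model examined is far more detailed.

```python
# Illustrative-only economics sketch: landfilling as the baseline against
# which a reprocessing option for an organic waste stream is compared.
# Every number here is a hypothetical placeholder.

def landfill_cost(tons, tip_fee=45.0, haul_per_ton=12.0):
    """Annual disposal cost: tipping fee plus hauling, per ton (USD)."""
    return tons * (tip_fee + haul_per_ton)

def reprocess_net_cost(tons, process_per_ton=30.0, revenue_per_ton=40.0):
    """Processing cost minus revenue from the recovered product (USD)."""
    return tons * (process_per_ton - revenue_per_ton)

tons = 1000.0
baseline = landfill_cost(tons)          # cost of the landfilling baseline
alternative = reprocess_net_cost(tons)  # negative value means net revenue
savings = baseline - alternative        # advantage of reprocessing
```

Under these made-up figures, reprocessing turns a disposal cost into a small net revenue; in practice, the sign of the comparison depends entirely on local fees and product markets, which is why the baseline matters.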

  15. Space Processing Applications Rocket (SPAR) project: SPAR 10

    NASA Technical Reports Server (NTRS)

    Poorman, R. (Compiler)

    1986-01-01

    The Space Processing Applications Rocket Project (SPAR) X Final Report contains the compilation of the post-flight reports from each of the Principal Investigators (PIs) on the four selected science payloads, in addition to the engineering report as documented by the Marshall Space Flight Center (MSFC). This combined effort also describes pertinent portions of ground-based research leading to the ultimate selection of the flight sample composition, including design, fabrication and testing, all of which are expected to contribute to an improved comprehension of materials processing in space. The SPAR project was coordinated and managed by MSFC as part of the Microgravity Science and Applications (MSA) program of the Office of Space Science and Applications (OSSA) of NASA Headquarters. This technical memorandum is directed entirely to the payload manifest flown in the tenth of a series of SPAR flights conducted at the White Sands Missile Range (WSMR) and includes the experiments entitled, Containerless Processing Technology, SPAR Experiment 76-20/3; Directional Solidification of Magnetic Composites, SPAR Experiment 76-22/3; Comparative Alloy Solidification, SPAR Experiment 76-36/3; and Foam Copper, SPAR Experiment 77-9/1R.

  16. Solar kerosene from H2O and CO2

    NASA Astrophysics Data System (ADS)

    Furler, P.; Marxer, D.; Scheffe, J.; Reinalda, D.; Geerlings, H.; Falter, C.; Batteiger, V.; Sizmann, A.; Steinfeld, A.

    2017-06-01

The entire production chain for renewable kerosene obtained directly from sunlight, H2O, and CO2 is experimentally demonstrated. The key component of the production process is a high-temperature solar reactor containing a reticulated porous ceramic (RPC) structure made of ceria, which enables the splitting of H2O and CO2 via a 2-step thermochemical redox cycle. In the 1st reduction step, ceria is endothermally reduced using concentrated solar radiation as the energy source of process heat. In the 2nd oxidation step, nonstoichiometric ceria reacts with H2O and CO2 to form H2 and CO - syngas - which is finally converted into kerosene by the Fischer-Tropsch process. The RPC featured dual-scale porosity for enhanced heat and mass transfer: mm-size pores for volumetric radiation absorption during the reduction step and μm-size pores within its struts for fast kinetics during the oxidation step. We report on the engineering design of the solar reactor and the experimental demonstration of over 290 consecutive redox cycles for producing high-quality syngas suitable for the processing of liquid hydrocarbon fuels.
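The two-step cycle described above can be written in its standard textbook form (δ denotes the nonstoichiometry of the reduced ceria; these are the generic cycle equations, not reproduced from the paper):

```latex
% Two-step ceria redox cycle (standard form)
\begin{align*}
\text{Reduction (solar, endothermic):} \quad
  \mathrm{CeO_2} &\longrightarrow \mathrm{CeO_{2-\delta}} + \tfrac{\delta}{2}\,\mathrm{O_2} \\
\text{Oxidation with water:} \quad
  \mathrm{CeO_{2-\delta}} + \delta\,\mathrm{H_2O} &\longrightarrow \mathrm{CeO_2} + \delta\,\mathrm{H_2} \\
\text{Oxidation with carbon dioxide:} \quad
  \mathrm{CeO_{2-\delta}} + \delta\,\mathrm{CO_2} &\longrightarrow \mathrm{CeO_2} + \delta\,\mathrm{CO}
\end{align*}
```

The ceria is recovered unchanged after each cycle, so only H2O, CO2, and solar process heat are consumed net, yielding the H2/CO syngas fed to Fischer-Tropsch synthesis.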

  17. Under Construction: An Experiential Exercise Illustrating Elements of Work Design

    ERIC Educational Resources Information Center

    Donovan, Kimberly M.; Fluegge-Woolf, Erin R.

    2015-01-01

The Under Construction Exercise was developed by the authors to highlight key factors of work design that, when implemented in a work group or an entire organization, can lead to an environment conducive to fostering satisfaction and motivation. In the exercise, groups are assigned to one of four different conditions that are designed to emulate…

  18. Design optimum frac jobs using virtual intelligence techniques

    NASA Astrophysics Data System (ADS)

    Mohaghegh, Shahab; Popa, Andrei; Ameri, Sam

    2000-10-01

Designing optimal frac jobs is a complex and time-consuming process. It usually involves the use of a two- or three-dimensional computer model. For the computer models to perform as intended, a wealth of input data is required. The input data includes wellbore configuration and reservoir characteristics such as porosity, permeability, stress and thickness profiles of the pay layers as well as the overburden layers. Among other essential information required for the design process is fracturing fluid type and volume, proppant type and volume, injection rate, proppant concentration and frac job schedule. Some of the parameters such as fluid and proppant types have discrete possible choices. Other parameters such as fluid and proppant volume, on the other hand, assume values from within a range of minimum and maximum values. A potential frac design for a particular pay zone is a combination of all of these parameters. Finding the optimum combination is not a trivial process. It usually requires an experienced engineer and a considerable amount of time to tune the parameters in order to achieve a desirable outcome. This paper introduces a new methodology that integrates two virtual intelligence techniques, namely, artificial neural networks and genetic algorithms, to automate and simplify the optimum frac job design process. This methodology requires little input from the engineer beyond the reservoir characterization and wellbore configuration. The software tool that has been developed based on this methodology uses the reservoir characteristics and an optimization criterion indicated by the engineer, for example a certain propped frac length, and provides the details of the optimum frac design that will result in the specified criterion. An ensemble of neural networks is trained to mimic the two- or three-dimensional frac simulator. Once successfully trained, these networks are capable of providing instantaneous results in response to any set of input parameters.
These networks will be used as the fitness function for a genetic algorithm routine that will search for the best combination of the design parameters for the frac job. The genetic algorithm will search through the entire solution space and identify the optimal combination of parameters to be used in the design process. Considering the complexity of this task, this methodology converges relatively fast, providing the engineer with several near-optimum scenarios for the frac job design. These scenarios, which can be generated in just a minute or two, can be valuable starting points for the engineer's design work and save hours of runs on the simulator.
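The surrogate-plus-genetic-algorithm loop described above can be sketched as follows. The response surface standing in for the trained neural-network ensemble, the two design parameters, their ranges, and the target frac length are all invented for illustration:

```python
# Schematic sketch of the paper's idea: a cheap surrogate stands in for the
# trained NN ensemble, and a simple genetic algorithm searches the design
# space for parameters whose predicted propped frac length matches a target.
# The response surface and all numbers are hypothetical.
import random

random.seed(7)

def surrogate_frac_length(fluid_vol, prop_vol):
    """Stand-in for the NN ensemble: maps (fluid volume, proppant volume)
    to a predicted propped frac length (arbitrary toy response surface)."""
    return 0.4 * fluid_vol + 0.8 * prop_vol - 0.002 * prop_vol ** 2

TARGET = 250.0  # desired propped frac length (illustrative units)

def fitness(ind):
    """Negative miss distance from the target (higher is better)."""
    return -abs(surrogate_frac_length(*ind) - TARGET)

def evolve(pop_size=40, generations=60):
    pop = [(random.uniform(0, 500), random.uniform(0, 200))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitism: keep the top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # blend crossover plus Gaussian mutation
            child = tuple((x + y) / 2 + random.gauss(0, 5)
                          for x, y in zip(a, b))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
miss = abs(surrogate_frac_length(*best) - TARGET)
```

Because every fitness evaluation is a cheap surrogate call rather than a full simulator run, the loop above evaluates thousands of candidate designs in well under a second — the speedup the abstract attributes to the trained ensemble.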

  19. Expectation, information processing, and subjective duration.

    PubMed

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing: whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  20. Design and multifidelity analysis of dual mode scramjet compression system using coupled NPSS and fluent simulation

    NASA Astrophysics Data System (ADS)

    Vijayakumar, Nandakumar

Hypersonic airbreathing engines mark a potential future development of the aerospace industry, and immense efforts have been made over the past decades to gain knowledge of them. The physical phenomena occurring in the hypersonic flow regime make the design and performance prediction of a scramjet engine hard. Though cutting-edge simulation tools fight their way toward accurate prediction of the environment, the time consumed by the entire process of designing and analyzing a scramjet engine and its components may be exorbitant. A multi-fidelity approach for designing a scramjet with a cruising Mach number of 6 is detailed in this research, where high-order simulations are applied according to the physics involved in each component. Two state-of-the-art simulation tools were used to take the aerodynamic and propulsion disciplines into account for realistic prediction of the individual components as well as the entire scramjet. The specific goal of this research is to create a virtual environment to design and analyze a hypersonic, two-dimensional, planar inlet and isolator to check its operability for a dual-mode scramjet engine. The dual-mode scramjet engine starts at a Mach number of 3.5, where it operates as a ramjet, and accelerates to Mach 6 to be operated as a scramjet engine. The intercomponent interaction between the compression components and the rest of the engine is studied by varying the fidelity of the numerical simulation according to the complexity of the situation. Efforts were made to track the transition Mach number as the engine switches from ramjet to scramjet. A complete scramjet assembly was built using the Numerical Propulsion Simulation System (NPSS) and the performance of the engine was evaluated for various scenarios. Different numerical techniques were used to vary the fidelity of the analysis, with the highest fidelity consisting of 2D RANS CFD simulation.
The interaction between the NPSS elements and the CFD solver is governed by the top-level assembly solver of NPSS. The importance of intercomponent interactions is discussed. The methodology used in this research for design and analysis should provide an efficient way of estimating the design and off-design operating modes of a dual-mode scramjet engine.

  1. Full spectral optical modeling of quantum-dot-converted elements for light-emitting diodes considering reabsorption and reemission effect.

    PubMed

    Li, Jia-Sheng; Tang, Yong; Li, Zong-Tao; Cao, Kai; Yan, Cai-Man; Ding, Xin-Rui

    2018-07-20

Quantum dots (QDs) have attracted significant attention in light-emitting diode (LED) illumination and display applications, owing to their high quantum yield and unique spectral properties. However, an effective optical model of quantum-dot-converted elements (QDCEs) for LEDs that fully accounts for the reabsorption and reemission effect is lacking. This hinders the design of QDCE structures and further investigation of light-extraction/conversion mechanisms in QDCEs. In this paper, we propose a full spectral optical modeling method for QDCEs packaged in LEDs that entirely considers the reabsorption and reemission effect, and we compare its results with those of traditional models without reabsorption or reemission. The comparisons indicate that QDCE absorption loss of QD emission light is a major factor decreasing the radiant efficacy of LEDs and should be considered when designing QDCE structures. According to measurements of fabricated LEDs, only calculation results that entirely consider reabsorption and reemission show good agreement with the experimental radiant efficacy, spectra, and peak wavelength at the same down-conversion efficiency. Consequently, QDCEs should be modeled with reabsorption and reemission events taken into account. This study provides a simple and effective modeling method for QDCEs, which shows great potential for their structure design and fundamental investigation.
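A minimal sketch of why reabsorption and reemission matter, assuming a single reabsorption probability p and quantum yield eta per emission generation (a crude energy balance, not the paper's full spectral model): the escaping fraction of QD emission is the geometric series (1 - p) * sum((p*eta)**k) = (1 - p) / (1 - p*eta).

```python
# Hedged toy model (not the paper's full spectral ray trace): of each
# generation of QD-emitted light, a fraction p is reabsorbed before escaping
# and re-emitted with quantum yield eta; the remainder escapes. Summing the
# generations shows how reabsorption erodes radiant efficacy.

def escaped_fraction(p, eta, generations=200):
    """Fraction of first-generation QD emission that ultimately escapes."""
    total, gen = 0.0, 1.0
    for _ in range(generations):
        total += gen * (1.0 - p)   # light that escapes this generation
        gen *= p * eta             # reabsorbed, then re-emitted
    return total

p, eta = 0.3, 0.9                  # illustrative values
numeric = escaped_fraction(p, eta)
closed_form = (1.0 - p) / (1.0 - p * eta)   # geometric-series limit
```

With these illustrative values, only about 96% of the emitted light escapes even at 90% quantum yield; a model that ignores reabsorption would overpredict radiant efficacy accordingly.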

  2. Full spectral optical modeling of quantum-dot-converted elements for light-emitting diodes considering reabsorption and reemission effect

    NASA Astrophysics Data System (ADS)

    Li, Jia-Sheng; Tang, Yong; Li, Zong-Tao; Cao, Kai; Yan, Cai-Man; Ding, Xin-Rui

    2018-07-01

Quantum dots (QDs) have attracted significant attention in light-emitting diode (LED) illumination and display applications, owing to their high quantum yield and unique spectral properties. However, an effective optical model of quantum-dot-converted elements (QDCEs) for LEDs that fully accounts for the reabsorption and reemission effect is lacking. This hinders the design of QDCE structures and further investigation of light-extraction/conversion mechanisms in QDCEs. In this paper, we propose a full spectral optical modeling method for QDCEs packaged in LEDs that entirely considers the reabsorption and reemission effect, and we compare its results with those of traditional models without reabsorption or reemission. The comparisons indicate that QDCE absorption loss of QD emission light is a major factor decreasing the radiant efficacy of LEDs and should be considered when designing QDCE structures. According to measurements of fabricated LEDs, only calculation results that entirely consider reabsorption and reemission show good agreement with the experimental radiant efficacy, spectra, and peak wavelength at the same down-conversion efficiency. Consequently, QDCEs should be modeled with reabsorption and reemission events taken into account. This study provides a simple and effective modeling method for QDCEs, which shows great potential for their structure design and fundamental investigation.

  3. Analysis of power gating in different hierarchical levels of 2MB cache, considering variation

    NASA Astrophysics Data System (ADS)

    Jafari, Mohsen; Imani, Mohsen; Fathipour, Morteza

    2015-09-01

This article revisits the power gating technique at different hierarchical levels of static random-access memory (SRAM) design, including cell, row, bank and entire cache memory, in 16 nm FinFET technology. Different SRAM cell structures, such as 6T, 8T, 9T and 10T, are used in the design of a 2MB cache memory. The power reduction of the entire cache memory employing cell-level optimisation is 99.7%, at the expense of area and stability overheads. The power saving of the cell-level optimisation is 3× (1.2×) higher than power gating at cache (bank) level, owing to its superior selectivity. The access delay times are allowed to increase by 4% at the same energy-delay product to achieve the best power reduction for each supply voltage and optimisation level. The results show that row-level power gating is best for optimising the power of the entire cache, with the fewest drawbacks. Comparisons of the cells show that cells whose bodies have higher power consumption are the best candidates for the power gating technique in row-level optimisation. The technique has the lowest percentage of saving at the minimum energy point (MEP) of the design. Power gating also improves the power variation of all structures by at least 70%.

  4. A Quality by Design approach to investigate tablet dissolution shift upon accelerated stability by multivariate methods.

    PubMed

    Huang, Jun; Goolcharran, Chimanlall; Ghosh, Krishnendu

    2011-05-01

This paper presents the use of experimental design, optimization and multivariate techniques to investigate the root cause of a tablet dissolution shift (slow-down) upon stability and to develop control strategies for a drug product during formulation and process development. The effectiveness and usefulness of these methodologies were demonstrated through two application examples. In both applications, dissolution slow-down was observed during a 4-week accelerated stability test under the 51°C/75%RH storage condition. In Application I, an experimental design was carried out to evaluate the interactions and effects of the design factors on the critical quality attribute (CQA) of dissolution upon stability. The design space was studied by design of experiments (DOE) and multivariate analysis to ensure the desired dissolution profile and minimal dissolution shift upon stability. Multivariate techniques, such as multi-way principal component analysis (MPCA) of the entire dissolution profiles upon stability, were performed to reveal batch relationships and to evaluate the impact of design factors on dissolution. In Application II, an experiment was conducted to study the impact of varying tablet breaking force on dissolution upon stability, utilizing MPCA. It was demonstrated that the use of multivariate methods, as defined by Quality by Design (QbD) principles and tools in the ICH Q8 guidance, provides an effective means to achieve a greater understanding of tablet dissolution upon stability. Copyright © 2010 Elsevier B.V. All rights reserved.
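The batch-wise unfolding behind MPCA can be sketched on synthetic data. The array sizes, dissolution values, and the simple linear slow-down used below are invented; the study analyzed measured profiles.

```python
# Illustrative multi-way PCA (MPCA): unfold a (batch x timepoint x
# stability-pull) dissolution array into a 2-D matrix, mean-center, and
# take an SVD. Scores then expose batch-to-batch relationships.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 6 batches, 5 dissolution timepoints, 3 stability pulls.
base = np.array([20.0, 45.0, 70.0, 85.0, 95.0])       # % dissolved
data = np.stack([
    base * (1.0 - 0.03 * week) + rng.normal(0, 1.0, (6, 5))
    for week in range(3)                               # mild slow-down
], axis=2)                                             # shape (6, 5, 3)

X = data.reshape(6, -1)            # batch-wise unfolding -> (6, 15)
Xc = X - X.mean(axis=0)            # mean-center each unfolded column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                     # batch scores in PC space
explained = s**2 / np.sum(s**2)    # variance fraction per component
```

Plotting the first two columns of `scores` would reveal which batches drift together on stability, which is how MPCA exposes batch relationships in the study.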

  5. To Design or Not to Design (Part Three): Metacognition: How Problematizing Transforms a Complex System towards a Desired State

    DTIC Science & Technology

    2011-03-18

problematization's place in the entrepreneurial situation. In contrast to FM 5-0 Chapter 3 Design's confusing graphic on design, this one conveys meaning and... system transformation graphic that, although designed for entrepreneurial application, shares many overlapping contextual features that military design... doctrine attempts. Most significantly, they bound their entrepreneurial situation around the entire system in a manner that correlates to an

  6. Worklist handling in workflow-enabled radiological application systems

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part, for end-users, of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user-friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
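The contrast between the two worklist-generation approaches can be sketched as follows. The class, field, and task names are hypothetical, not taken from any real RIS/PACS product:

```python
# Hypothetical sketch of the two worklist approaches described above:
# a data-driven worklist as a filtered view on application data, versus
# a process-driven worklist emitted from an explicit process model.
from dataclasses import dataclass

@dataclass
class WorkItem:
    task: str           # e.g. "report study"
    study_id: str
    assignee_role: str  # workflow participant the item is offered to

# Shared application data (invented): studies and their current status.
studies = [
    {"id": "S1", "status": "acquired"},
    {"id": "S2", "status": "reported"},
    {"id": "S3", "status": "acquired"},
]

# Data-driven approach: the worklist is just a filtered database view,
# so only one status maps to one hard-coded task.
data_driven = [WorkItem("report study", s["id"], "radiologist")
               for s in studies if s["status"] == "acquired"]

# Process-driven approach: an explicit process model names the next task
# and role for every state, and a workflow service emits work items.
process_model = {"acquired": ("report study", "radiologist"),
                 "reported": ("verify report", "senior radiologist")}
process_driven = [WorkItem(task, s["id"], role)
                  for s in studies
                  for task, role in [process_model[s["status"]]]]
```

The data-driven list only ever sees the states its filter encodes, while the process-driven list covers every state in the model, illustrating the integral process view the paper argues for.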

  7. The limits of intelligence in design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papamichael, K.; Protzen, J.P.

    1993-05-01

A new, comprehensive design theory is presented, applicable to all design domains, such as engineering and industrial design, architecture, city and regional planning and, in general, any goal-oriented activity that involves decision making. The design process is analyzed into fundamental activities that are characterized with respect to the nature of their knowledge requirements and the degree to which they can be specified and delegated to others in general, and to computers in particular. Despite the characterization of design problems as "wicked" or "ill-defined," design has been understood as a rational activity, that is, "thinking before acting." The new theory presented in this paper suggests that design is "thinking and feeling while acting," supporting the position that design is only partially rational. Intelligence, "natural" or "artificial," is only one of two requirements for design, the other being emotions. Design decisions are only partially inferred, that is, they are not entirely the product of reasoning. Rather, design decisions are based on judgment that requires the notions of "good" and "bad," which are attributed to feelings rather than thoughts. The presentation of the design theory extends to the implications associated with the limits of intelligence in design, which, in turn, become constraints on the potential role of computers in design. Many of the current development efforts in computer-aided design violate these constraints, especially in the implementation of expert systems and multi-criterion evaluation models. These violations are identified and discussed in detail. Finally, specific areas for further research and development in computer-aided design are presented and discussed.

  8. From the ORFeome concept to highly comprehensive, full-genome screening libraries.

    PubMed

    Rid, Raphaela; Abdel-Hadi, Omar; Maier, Richard; Wagner, Martin; Hundsberger, Harald; Hintner, Helmut; Bauer, Johann; Onder, Kamil

    2013-02-01

Recombination-based cloning techniques have in recent times facilitated the establishment of genome-scale single-gene ORFeome repositories. Their further handling and downstream application in systematic fashion is, however, practically impeded by logistical and economic challenges. At this juncture, simultaneously transferring entire gene collections in compiled pool format could represent an advanced compromise between systematic ORFeome (an organism's entire set of protein-encoding open reading frames) projects and traditional random library approaches, but has not yet been considered in great detail. In our endeavor to merge the comprehensiveness of ORFeomes with a basically simple, streamlined, and easily executable single-tube design, we have here produced five different pooled screening-ready libraries for both Staphylococcus aureus and Homo sapiens. By evaluating the parallel transfer efficiencies of differentially sized genes, from initial polymerase chain reaction (PCR) product amplification to entry and final destination library construction, via quantitative real-time PCR, we found that the complexity of the gene population is fairly stably maintained once an entry resource has been successfully established, and that no apparent size-selection bias (loss of large inserts) takes place. Recombinational transfer processes are hence robust enough for straightforwardly achieving such pooled screening libraries.

  9. Computational multiscale modeling in protein--ligand docking.

    PubMed

    Taufer, Michela; Armen, Roger; Chen, Jianhan; Teller, Patricia; Brooks, Charles

    2009-01-01

    In biological systems, the binding of small molecule ligands to proteins is a crucial process for almost every aspect of biochemistry and molecular biology. Enzymes are proteins that function by catalyzing specific biochemical reactions that convert reactants into products. Complex organisms are typically composed of cells in which thousands of enzymes participate in complex and interconnected biochemical pathways. Some enzymes serve as sequential steps in specific pathways (such as energy metabolism), while others function to regulate entire pathways and cellular functions [1]. Small molecule ligands can be designed to bind to a specific enzyme and inhibit the biochemical reaction. Inhibiting the activity of key enzymes may result in entire biochemical pathways being turned on or off [2], [3]. Many small molecule drugs marketed today function in this generic way as enzyme inhibitors. If research identifies a specific enzyme as being crucial to the progress of a disease, then this enzyme may be targeted with an inhibitor, which may slow down or reverse the progress of the disease. In this way, enzymes from specific pathogens (e.g., viruses, bacteria, fungi) are targeted for infectious diseases [4], [5], and human enzymes are targeted for noninfectious diseases such as cardiovascular disease, cancer, diabetes, and neurodegenerative diseases [6].

  10. Policymaking in European healthy cities.

    PubMed

    de Leeuw, Evelyne; Green, Geoff; Spanswick, Lucy; Palmer, Nicola

    2015-06-01

    This paper assesses policy development in, with and for Healthy Cities in the European Region of the World Health Organization. Materials for the assessment were sourced through case studies, a questionnaire and statistical databases. They were compiled in a realist synthesis methodology, applying theory-based evaluation principles. Non-response analyses were applied to ascertain the degree of representativeness of the high response rates for the entire network of Healthy Cities in Europe. Further measures of reliability and validity were applied, and it was found that our material was indicative of the entire network. European Healthy Cities are successful in developing local health policy across many sectors within and outside government. They were also successful in addressing 'wicked' problems around equity, governance and participation in themes such as Healthy Urban Planning. It appears that strong local leadership for policy change is driven by international collaboration and the stewardship of the World Health Organization. The processes enacted by WHO, structuring membership of the Healthy City Network (designation) and the guidance on particular themes, are identified as being important for the success of local policy development. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Entirely irrelevant distractors can capture and captivate attention.

    PubMed

    Forster, Sophie; Lavie, Nilli

    2011-12-01

    The question of whether a stimulus onset may capture attention when it is entirely irrelevant to the task, and even in the absence of any attentional settings for abrupt onset or any dynamic changes, has been highly controversial. In the present study, we designed a novel irrelevant capture task to address this question. Participants engaged in a continuous task making sequential forced-choice (letter or digit) responses to each item in an alphanumeric matrix that remained on screen throughout many responses. This task therefore involved no attentional settings for onset, or indeed any dynamic changes, yet the brief onset of an entirely irrelevant distractor (a cartoon picture) resulted in significant slowing of the two (Experiment 1) or three (Experiment 2) responses immediately following distractor appearance. These findings provide a clear demonstration of attention being captured and captivated by a distractor that is entirely irrelevant to any attentional settings of the task.

  12. Sampling design by the core-food approach for the Taiwan total diet study on veterinary drugs.

    PubMed

    Chen, Chien-Chih; Tsai, Ching-Lun; Chang, Chia-Chin; Ni, Shih-Pei; Chen, Yi-Tzu; Chiang, Chow-Feng

    2017-06-01

    The core-food (CF) approach, first adopted in the United States in the 1980s, has been widely used by many countries to assess exposure to dietary hazards at a population level. However, the reliability of exposure estimates (C × CR) depends critically on sampling methods designed so that the detected chemical concentration (C) of each CF matches the corresponding consumption rate (CR) estimated from the surveyed intake data. In order to reduce the uncertainty of food matching, this study presents a sampling design scheme, namely the subsample method, for the 2016 Taiwan total diet study (TDS) on veterinary drugs. We first combined the four sets of national dietary recall data that covered the entire age range (1-65+ years), and aggregated them into 307 CFs by their similarity in nutritional values, manufacturing and cooking methods. The 40 CFs pertinent to veterinary drug residues were selected for this study, and 16 subsamples for each CF were designed by weighting their quantities in CR, product brands, manufacturing, processing and cooking methods. The calculated food matching rates of each CF from this study were 84.3-97.3%, higher than those obtained from many previous studies using the representative food (RF) method (53.1-57.8%). The subsample method not only considers the variety of food processing and cooking methods, but also provides better food matching and reduces the uncertainty of exposure assessment.

  13. A novel association rule mining approach using TID intermediate itemset.

    PubMed

    Aqra, Iyad; Herawan, Tutut; Abdul Ghani, Norjihan; Akhunzada, Adnan; Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang

    2018-01-01

    Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployments is of paramount concern. However, dynamic decision making that needs to modify the threshold, either to minimize or maximize the output knowledge, forces the extant state-of-the-art algorithms to rescan the entire database. The process consequently incurs heavy computation cost and is not feasible for real-time applications. This paper efficiently addresses the problem of dynamic threshold updating. It contributes a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets with diverse threshold values, improving overall efficiency because the whole database no longer needs to be scanned. After the entire itemset is built, the real support can be obtained without rebuilding the itemset (e.g., TID lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support for an independent purpose. Additionally, the experimental results of our proposed approach demonstrate its capability to be deployed in any mining system in a fully parallel mode, consequently increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art and shows promising results that reduce computation cost, increase accuracy, and produce all possible itemsets.
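    The TID-list mechanism mentioned in the abstract above can be illustrated concretely. The following is a minimal, hypothetical sketch rather than the authors' implementation: the toy transaction database, item names, and helper functions are all invented for illustration. It shows how, once per-item TID lists are built in a single pass, the actual support of any itemset is obtained by intersecting TID lists instead of rescanning the database.

```python
from functools import reduce

# Toy transaction database (TID -> items). These five transactions and the
# item names are hypothetical, for illustration only.
transactions = {
    1: {"a", "b", "c"},
    2: {"a", "c"},
    3: {"a", "d"},
    4: {"b", "c"},
    5: {"a", "b", "c"},
}

def build_tid_lists(db):
    """Single pass over the database: map each item to the set of TIDs containing it."""
    tid_lists = {}
    for tid, items in db.items():
        for item in items:
            tid_lists.setdefault(item, set()).add(tid)
    return tid_lists

def support(itemset, tid_lists):
    """Actual support of an itemset via TID-list intersection -- no database rescan."""
    return len(reduce(set.intersection, (tid_lists[i] for i in itemset)))

tid_lists = build_tid_lists(transactions)
print(support({"a", "c"}, tid_lists))  # 3 (transactions 1, 2, 5)
print(support({"b", "c"}, tid_lists))  # 3 (transactions 1, 4, 5)
```

    Because support comes from intersections over the stored lists, raising or lowering the minimum-support threshold only refilters already-computed counts rather than triggering a rescan, which is the efficiency argument the abstract makes.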

  14. A novel association rule mining approach using TID intermediate itemset

    PubMed Central

    Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang

    2018-01-01

    Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployments is of paramount concern. However, dynamic decision making that needs to modify the threshold, either to minimize or maximize the output knowledge, forces the extant state-of-the-art algorithms to rescan the entire database. The process consequently incurs heavy computation cost and is not feasible for real-time applications. This paper efficiently addresses the problem of dynamic threshold updating. It contributes a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets with diverse threshold values, improving overall efficiency because the whole database no longer needs to be scanned. After the entire itemset is built, the real support can be obtained without rebuilding the itemset (e.g., TID lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support for an independent purpose. Additionally, the experimental results of our proposed approach demonstrate its capability to be deployed in any mining system in a fully parallel mode, consequently increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art and shows promising results that reduce computation cost, increase accuracy, and produce all possible itemsets. PMID:29351287

  15. Manned Spacecraft Requirements for Materials and Processes

    NASA Technical Reports Server (NTRS)

    Vaughn, Timothy P.

    2006-01-01

    A major cause of project failure can be attributed to an emphasized focus on end products and inadequate attention to resolving development risks during the initial phases of a project. The initial phases of a project, which we will call the "study period", are critical to determining project scope and costs, and can make or break most projects. If the requirements are not defined adequately, how can the scope be determined, how can the costs of the entire project be effectively estimated, and how can the risk to project success be accurately assessed? Using the proper material specifications and standards, and incorporating these specifications and standards in the design process, should be considered crucial to the technical success of a project and, just as importantly, crucial to its cost and schedule success. This paper will intertwine several important aspects or considerations for project success: 1) characteristics of a "Good Material Requirement"; 2) linking material requirements to the implementation of "Design for Manufacturing" techniques; and 3) the importance of decomposing materials requirements during the study/development phase to mitigate project risk by maturing technologies before the building of hardware.

  16. Electrically-pumped 850-nm micromirror VECSELs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geib, Kent Martin; Peake, Gregory Merwin; Serkland, Darwin Keith

    Vertical-external-cavity surface-emitting lasers (VECSELs) combine high optical power and good beam quality in a device with surface-normal output. In this paper, we describe the design and operating characteristics of an electrically-pumped VECSEL that employs a wafer-scale fabrication process and operates at 850 nm. A curved micromirror output coupler is heterogeneously integrated with AlGaAs-based semiconductor material to form a compact and robust device. The structure relies on flip-chip bonding the processed epitaxial material to an aluminum nitride mount; this heatsink both dissipates thermal energy and permits high frequency modulation using coplanar traces that lead to the VECSEL mesa. Backside emission is employed, and laser operation at 850 nm is made possible by removing the entire GaAs substrate through selective wet etching. While substrate removal eliminates absorptive losses, it simultaneously compromises laser performance by increasing series resistance and degrading the spatial uniformity of current injection. Several aspects of the VECSEL design help to mitigate these issues, including the use of a novel current-spreading n type distributed Bragg reflector (DBR). Additionally, VECSEL performance is improved through the use of a p-type DBR that is modified for low thermal resistance.

  17. Electrically pumped 850-nm micromirror VECSELs

    NASA Astrophysics Data System (ADS)

    Keeler, Gordon A.; Serkland, Darwin K.; Geib, Kent M.; Peake, Gregory M.; Mar, Alan

    2005-03-01

    Vertical-external-cavity surface-emitting lasers (VECSELs) combine high optical power and good beam quality in a device with surface-normal output. In this paper, we describe the design and operating characteristics of an electrically-pumped VECSEL that employs a wafer-scale fabrication process and operates at 850 nm. A curved micromirror output coupler is heterogeneously integrated with AlGaAs-based semiconductor material to form a compact and robust device. The structure relies on flip-chip bonding the processed epitaxial material to an aluminum nitride mount; this heatsink both dissipates thermal energy and permits high frequency modulation using coplanar traces that lead to the VECSEL mesa. Backside emission is employed, and laser operation at 850 nm is made possible by removing the entire GaAs substrate through selective wet etching. While substrate removal eliminates absorptive losses, it simultaneously compromises laser performance by increasing series resistance and degrading the spatial uniformity of current injection. Several aspects of the VECSEL design help to mitigate these issues, including the use of a novel current-spreading n type distributed Bragg reflector (DBR). Additionally, VECSEL performance is improved through the use of a p-type DBR that is modified for low thermal resistance.

  18. The Defense Life Cycle Management System as a Working Model for Academic Application

    ERIC Educational Resources Information Center

    Burian, Philip E.; Keffel, Leslie M.; Maffei, Francis R., III

    2011-01-01

    Performing the review and assessment of masters' level degree programs can be an overwhelming and challenging endeavor. Getting organized and mapping out the entire review and assessment process can be extremely helpful and more importantly provide a path for successfully accomplishing the review and assessment of the entire program. This paper…

  19. Some Memories Are Odder than Others: Judgments of Episodic Oddity Violate Known Decision Rules

    ERIC Educational Resources Information Center

    O'Connor, Akira R.; Guhl, Emily N.; Cox, Justin C.; Dobbins, Ian G.

    2011-01-01

    Current decision models of recognition memory are based almost entirely on one paradigm, single item old/new judgments accompanied by confidence ratings. This task results in receiver operating characteristics (ROCs) that are well fit by both signal-detection and dual-process models. Here we examine an entirely new recognition task, the judgment…

  20. 77 FR 71794 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-04

    ... in that community. Formative research is research that occurs before a program is designed and... entirely behavioral but most often they are cycles of interviews and focus groups designed to inform the... specific data collection instruments, (3) methodological research (4) usability testing of technology-based...

  1. 7 CFR 1948.68 - Criteria for designation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... has historically occurred within the area; an increase (for which data or projected data is available... percent of the designated area's population; or data showing that available public facilities and services... calendar years. This data should be obtained from a single source for the entire State, if possible...

  2. Cryogenic Technology, part 1. [conference proceedings; cryogenic wind tunnel design and instrumentation

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Different engineering problems associated with the design of mechanisms and systems to operate in a cryogenic environment are discussed. The focal point for the entire engineering effort was the design of the National Transonic Facility, which is a closed-circuit cryogenic wind tunnel. The papers covered a variety of mechanical, structural, and systems design subjects including thermal structures, insulation systems, noise, seals, and materials.

  3. Numerical simulation of the actuation system for the ALDF's propulsion control valve. [Aircraft Landing Dynamics Facility

    NASA Technical Reports Server (NTRS)

    Korte, John J.

    1990-01-01

    A numerical simulation of the actuation system for the propulsion control valve (PCV) of the NASA Langley Aircraft Landing Dynamics Facility was developed during the preliminary design of the PCV and used throughout the entire project. The simulation is based on a predictive model of the PCV which is used to evaluate and design the actuation system. The PCV controls a 1.7 million-pound thrust water jet used in propelling a 108,000-pound test carriage. The PCV can open and close in 0.300 second and deliver over 9,000 gallons of water per sec at pressures up to 3150 psi. The numerical simulation results are used to predict transient performance and valve opening characteristics, specify the hydraulic control system, define transient loadings on components, and evaluate failure modes. The mathematical model used for numerically simulating the mechanical fluid power system is described, and numerical results are demonstrated for a typical opening and closing cycle of the PCV. A summary is then given on how the model is used in the design process.

  4. Dynamic Simulation of a Periodic 10 K Sorption Cryocooler

    NASA Technical Reports Server (NTRS)

    Bhandari, P.; Rodriguez, J.; Bard, S.; Wade, L.

    1994-01-01

    A transient thermal simulation model has been developed to simulate the dynamic performance of a multiple-stage 10 K sorption cryocooler for spacecraft sensor cooling applications that require periodic quick cooldown (under 2 minutes), negligible vibration, low power consumption, and long life (5 to 10 years). The model was specifically designed to represent the Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE), but it can be adapted to represent other sorption cryocooler systems as well. The model simulates the heat transfer, mass transfer, and thermodynamic processes in the cryostat and the sorbent beds for the entire refrigeration cycle, and includes the transient effects of variable hydrogen supply pressures due to expansion and overflow of hydrogen during the cooldown operation. The paper describes model limitations and simplifying assumptions, with estimates of the errors induced by them, and presents comparisons of performance predictions with ground experiments. An important benefit of the model is its ability to predict performance sensitivities to variations of key design and operational parameters. The insights thus obtained are expected to lead to higher efficiencies and lower weights for future designs.

  5. Low-profile heliostat design for solar central receiver systems

    NASA Technical Reports Server (NTRS)

    Fourakis, E.; Severson, A. M.

    1977-01-01

    Heliostat designs intended to reduce costs and the effect of adverse wind loads on the devices were developed. Included was the low-profile heliostat consisting of a stiff frame with sectional focusing reflectors coupled together to turn as a unit. The entire frame is arranged to turn angularly about a center point. The ability of the heliostat to rotate about both the vertical and horizontal axes permits a central computer control system to continuously aim the sun's reflection onto a selected target. An engineering model of the basic device was built and is being tested. Control and mirror parameters, such as roughness and need for fine aiming, are being studied. The fabrication of these prototypes is in process. The model was also designed to test mirror focusing techniques, heliostat geometry, mechanical functioning, and tracking control. The model can be easily relocated to test mirror imaging on a tower from various directions. In addition to steering and aiming studies, the tests include the effects of temperature changes, wind gusting and weathering. The results of economic studies on this heliostat are also presented.

  6. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  7. An integrated low carbon energy solution to cooking fuel, tailored to Niger state's rural population

    NASA Astrophysics Data System (ADS)

    Carvell, Aaron; Price-Allison, Andrew; Birch, Calum; Green, Toby; Harijan, Khanji; Maihankuri, Sheidi; Raji, Abdulganiy; Uqaili, Mohammed; Dupont, Valerie

    2017-11-01

    Niger State (Nigeria) was selected as a case study of renewable, affordable and user-friendly clean energy provision in remote areas of developing countries. Niger State has 80% of its 4.5 million population living in rural agrarian areas with low literacy rates, and a lack of wind eliminates wind power as a widely available energy source. Based on an assessment of the large local insolation and the types of agricultural, biomass and husbandry resources, this study selected the design of anaerobic digestion units processing mostly animal and human waste, whose heating and power requirements would be entirely provided by solar photovoltaic/thermal energy to maintain optimum efficiency of biogas production. The design was carried out at the scale of up to 15 households' demand (community scale). The volume, and therefore the production, of biogas may be increased or decreased in the design considered, and local, low-cost, resilient materials were proposed. The proposed system was costed for a community of 24 people, demonstrating the potential for economical clean and renewable gas production.

  8. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be < 4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved in the ranges of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.
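    The recovery figures quoted above follow the standard spike-recovery calculation. A minimal sketch, using hypothetical concentrations that do not come from the paper:

```python
# Standard spike-recovery calculation; the concentrations below are
# hypothetical and are not values reported in the paper.
def recovery_percent(measured, background, spiked):
    """Recovery (%) = (measured - background) / spiked * 100."""
    return (measured - background) / spiked * 100.0

# A fruit sample with 0.12 ppm native SO2, spiked with 1.00 ppm, that reads
# 1.09 ppm after the LVCC sampling-SERS workflow:
r = recovery_percent(measured=1.09, background=0.12, spiked=1.00)
print(round(r, 1))  # 97.0
```

    A recovery near 100% indicates that little of the spiked analyte is lost across the whole sampling, derivatization, and SERS detection chain.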

  9. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    PubMed

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be <4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved in the ranges of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Determining the 95% limit of detection for waterborne pathogen analyses from primary concentration to qPCR.

    PubMed

    Stokdyk, Joel P; Firnstahl, Aaron D; Spencer, Susan K; Burch, Tucker R; Borchardt, Mark A

    2016-06-01

    The limit of detection (LOD) for qPCR-based analyses is not consistently defined or determined in studies on waterborne pathogens. Moreover, the LODs reported often reflect the qPCR assay alone rather than the entire sample process. Our objective was to develop an approach to determine the 95% LOD (lowest concentration at which 95% of positive samples are detected) for the entire process of waterborne pathogen detection. We began by spiking the lowest concentration that was consistently positive at the qPCR step (based on its standard curve) into each procedural step working backwards (i.e., extraction, secondary concentration, primary concentration), which established a concentration that was detectable following losses of the pathogen from processing. Using the fraction of positive replicates (n = 10) at this concentration, we selected and analyzed a second, and then third, concentration. If the fraction of positive replicates equaled 1 or 0 for two concentrations, we selected another. We calculated the LOD using probit analysis. To demonstrate our approach we determined the 95% LOD for Salmonella enterica serovar Typhimurium, adenovirus 41, and vaccine-derived poliovirus Sabin 3, which were 11, 12, and 6 genomic copies (gc) per reaction (rxn), respectively (equivalent to 1.3, 1.5, and 4.0 gc L⁻¹ assuming the 1500 L tap-water sample volume prescribed in EPA Method 1615). This approach limited the number of analyses required and was amenable to testing multiple genetic targets simultaneously (i.e., spiking a single sample with multiple microorganisms). An LOD determined this way can facilitate study design, guide the number of required technical replicates, aid method evaluation, and inform data interpretation. Published by Elsevier Ltd.
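    The probit step described above can be sketched end to end. This is an illustrative, stdlib-only sketch, not the authors' statistical code: the dilution-series counts are hypothetical, and a coarse grid-search maximum-likelihood fit stands in for a proper probit regression. It fits p = Φ(a + b·log10(concentration)) to the fraction of positive replicates and inverts the fitted curve at p = 0.95 to obtain the 95% LOD.

```python
import math

PHI_INV_95 = 1.6448536269514722  # standard-normal quantile at 0.95

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fit_probit_lod95(dilution_series):
    """Fit p = Phi(a + b*log10(conc)) by maximum likelihood over a coarse
    parameter grid (illustrative substitute for a real probit routine),
    then invert the fitted curve at p = 0.95."""
    best_a, best_b, best_ll = 0.0, 1.0, -math.inf
    for ai in range(-150, 51):      # intercept a in [-15.0, 5.0], step 0.1
        a = ai * 0.1
        for bi in range(2, 121):    # slope b in [0.2, 12.0], step 0.1
            b = bi * 0.1
            ll = 0.0
            for conc, positives, total in dilution_series:
                p = norm_cdf(a + b * math.log10(conc))
                p = min(max(p, 1e-9), 1.0 - 1e-9)  # guard against log(0)
                ll += positives * math.log(p) + (total - positives) * math.log(1.0 - p)
            if ll > best_ll:
                best_a, best_b, best_ll = a, b, ll
    # Solve Phi(a + b*log10(c)) = 0.95 for the concentration c.
    return 10.0 ** ((PHI_INV_95 - best_a) / best_b)

# Hypothetical dilution series: (gc per reaction, positive replicates, total replicates)
trials = [(2, 2, 10), (6, 7, 10), (15, 10, 10)]
lod95 = fit_probit_lod95(trials)
print(f"95% LOD ~= {lod95:.1f} gc/reaction")
```

    In practice one would fit the probit model with a dedicated routine (e.g., a binomial GLM with a probit link) rather than a grid search; the inversion of the fitted curve at p = 0.95 is the same.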

  11. [The Balanced Scorecard as a management tool in a public health organization].

    PubMed

    Villalbí, Joan R; Villalbí, Joan; Guix, Joan; Casas, Conrad; Borrell, Carme; Duran, Júlia; Artazcoz, Lucía; Camprubí, Esteve; Cusí, Meritxell; Rodríguez-Montuquín, Pau; Armengol, Josep M; Jiménez, Guy

    2007-01-01

    The Balanced Scorecard is a tool for strategic planning in business. We present our experience after introducing this instrument in a public health agency to align daily management practice with strategic objectives. Our management team required deep discussions with external support to clarify the concepts behind the Balanced Scorecard, adapt them to a public organization in the health field distinct from the business sector in which the Balanced Scorecard was designed, and adopt this instrument as a management tool. This process led to definition of the Balanced Scorecard by our Management Committee in 2002, the subsequent evaluation of the degree to which its objectives had been reached, and its periodic redefinition. In addition, second-level Balanced Scorecards were defined for different divisions and services within the agency. The adoption of the Balanced Scorecard by the management team required a prior effort to clarify who the stockholders and the clients of a public health organization are. The agency's activity and production were also analyzed and a key processes model was defined. Although it is hard to attribute specific changes to a single cause, we believe several improvements in management can be ascribed, at least in part, to the use of the Balanced Scorecard. The systematic use of the Balanced Scorecard produced greater cohesion in the management team and the entire organization and brought the strategic objectives closer to daily management operations. The organization is more attentive to its clients, has taken steps to improve its most complex cross-sectional processes, and has developed further actions for the development and growth of its officers and its entire personnel. At the same time, its management team is more in tune with the needs of the agency's administrative bodies that compose its governing board.

  12. Determining the 95% limit of detection for waterborne pathogen analyses from primary concentration to qPCR

    USGS Publications Warehouse

    Stokdyk, Joel P.; Firnstahl, Aaron; Spencer, Susan K.; Burch, Tucker R; Borchardt, Mark A.

    2016-01-01

    The limit of detection (LOD) for qPCR-based analyses is not consistently defined or determined in studies on waterborne pathogens. Moreover, the LODs reported often reflect the qPCR assay alone rather than the entire sample process. Our objective was to develop an approach to determine the 95% LOD (lowest concentration at which 95% of positive samples are detected) for the entire process of waterborne pathogen detection. We began by spiking the lowest concentration that was consistently positive at the qPCR step (based on its standard curve) into each procedural step working backwards (i.e., extraction, secondary concentration, primary concentration), which established a concentration that was detectable following losses of the pathogen from processing. Using the fraction of positive replicates (n = 10) at this concentration, we selected and analyzed a second, and then third, concentration. If the fraction of positive replicates equaled 1 or 0 for two concentrations, we selected another. We calculated the LOD using probit analysis. To demonstrate our approach, we determined the 95% LOD for Salmonella enterica serovar Typhimurium, adenovirus 41, and vaccine-derived poliovirus Sabin 3, which were 11, 12, and 6 genomic copies (gc) per reaction (rxn), respectively (equivalent to 1.3, 1.5, and 4.0 gc L⁻¹ assuming the 1500 L tap-water sample volume prescribed in EPA Method 1615). This approach limited the number of analyses required and was amenable to testing multiple genetic targets simultaneously (i.e., spiking a single sample with multiple microorganisms). An LOD determined this way can facilitate study design, guide the number of required technical replicates, aid method evaluation, and inform data interpretation.
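
    The probit step described above can be sketched numerically. The spiking concentrations and positive counts below are invented for illustration (not the study's data), and the fit uses the standard probit model Φ⁻¹(p) = a + b·log₁₀(concentration):

    ```python
    # Hypothetical example of the probit-based 95% LOD calculation: fit
    # probit(detection fraction) against log10(spiked concentration), then
    # solve for the concentration giving 95% detection. Counts are invented.
    from statistics import NormalDist
    import math

    nd = NormalDist()
    conc = [3.0, 6.0, 12.0]          # spiked gc/reaction (assumed values)
    positives = [3, 7, 9]            # positive replicates out of n = 10
    n = 10

    x = [math.log10(c) for c in conc]
    y = [nd.inv_cdf(p / n) for p in positives]   # fractions must lie in (0, 1)

    # ordinary least-squares line y = a + b*x
    xm, ym = sum(x) / len(x), sum(y) / len(y)
    b = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) / \
        sum((xi - xm) ** 2 for xi in x)
    a = ym - b * xm

    lod95 = 10 ** ((nd.inv_cdf(0.95) - a) / b)   # ~15 gc/reaction here
    print(f"95% LOD ~ {lod95:.1f} gc/reaction")
    ```

    Note that fractions of exactly 0 or 1 have no finite probit, which is why the study's procedure of swapping in another concentration when replicates are all-positive or all-negative matters.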

  13. Sohbrit: Autonomous COTS System for Satellite Characterization

    NASA Astrophysics Data System (ADS)

    Blazier, N.; Tarin, S.; Wells, M.; Brown, N.; Nandy, P.; Woodbury, D.

    As technology continues to improve, driving down the cost of commercial astronomical products while increasing their capabilities, manpower to run observations has become the limiting factor in acquiring continuous and repeatable space situational awareness data. Sandia National Laboratories set out to automate a testbed composed entirely of commercial off-the-shelf (COTS) hardware for space object characterization (SOC), focusing on satellites in geosynchronous orbit. Using an entirely autonomous system allows collection parameters such as target illumination and nightly overlap to be accounted for consistently; this enables repeatable development of target light curves to establish patterns of life in a variety of spectral bands. The system, known as Sohbrit, is responsible for autonomously creating an optimized schedule, checking the weather, opening the observatory dome, aligning and focusing the telescope, executing the schedule by slewing to each target and imaging it in a number of spectral bands (e.g., B, V, R, I, wide-open) via a filter wheel, closing the dome at the end of observations, processing the data, and storing/disseminating the data for exploitation via the web. Sohbrit must handle various situations such as weather outages and focus changes due to temperature shifts and optical seeing variations without human interaction. Sohbrit can collect large volumes of data nightly due to its high level of automation. To store and disseminate these large quantities of data, we utilize a cloud-based big data architecture called Firebird, which exposes the data to the community for use by developers and analysts. Sohbrit is the first COTS system we are aware of to automate the full process of multispectral geosynchronous characterization from scheduling all the way to processed, disseminated data. In this paper we will discuss design decisions, issues encountered and overcome during implementation, and show results produced by Sohbrit.
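
    A toy sketch of the kind of nightly scheduling step described above: order targets greedily by a score combining priority and remaining visibility. The target fields and scoring weights are invented for illustration; Sohbrit's actual scheduler is not shown here.

    ```python
    # Greedy nightly-scheduling sketch: rank targets by priority, breaking
    # ties toward targets with the least remaining visibility. All fields
    # and values are hypothetical.
    targets = [
        {"name": "GEO-A", "priority": 3, "hours_visible": 6.0},
        {"name": "GEO-B", "priority": 1, "hours_visible": 2.0},
        {"name": "GEO-C", "priority": 2, "hours_visible": 1.0},
    ]

    def score(t):
        # favor high priority; among equals, observe targets setting soonest
        return (t["priority"], -t["hours_visible"])

    schedule = sorted(targets, key=score, reverse=True)
    print([t["name"] for t in schedule])   # ['GEO-A', 'GEO-C', 'GEO-B']
    ```

    A real scheduler would also fold in weather, slew time, and filter-wheel changes, but the greedy ranking above captures the basic shape of the decision.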

  14. Cluster man/system design requirements and verification. [for Skylab program

    NASA Technical Reports Server (NTRS)

    Watters, H. H.

    1974-01-01

    Discussion of the procedures employed for determining the man/system requirements that guided Skylab design, and review of the techniques used for implementing man/system design verification. The foremost lesson learned from this requirement-anticipation and design-verification experience is the necessity of allowing for human capabilities of in-flight maintenance and repair. It is now known that the entire program was salvaged by a series of unplanned maintenance and repair events, which were carried out in spite of poor design provisions for maintenance.

  15. Defense Utility of Commercial Vessels and Craft.

    DTIC Science & Technology

    1980-01-01

    competence represented on this Committee include: ship design (naval architecture and marine engineering); marine transportation systems analysis; port...entire maritime industry, including operators, designers, shipbuilders, suppliers, regulators, and researchers. This report was developed by a...larger commercial vessels now included in military contingency planning or the specially designed ships of the U.S. Navy and U.S. Coast Guard.

  16. Fashion Design: Designing a Learner-Active, Multi-Level High School Course

    ERIC Educational Resources Information Center

    Nelson, Diane

    2009-01-01

    A high school fashion design teacher has much in common with the ringmaster of a three-ring circus. The challenges of teaching a hands-on course are to facilitate the entire class and to meet the needs of individual students. When teaching family and consumer sciences, the goal is to have a learner-active classroom. Revamping the high school's…

  17. Design Core Commonalities: A Study of the College of Design at Iowa State University

    ERIC Educational Resources Information Center

    Venes, Jane

    2015-01-01

    This comprehensive study asks what a group of rather diverse disciplines have in common. It involves a cross-disciplinary examination of an entire college, the College of Design at Iowa State University. This research was intended to provide a sense of direction in developing and assessing possible core content. The reasoning was that material…

  18. Design optimization of space launch vehicles using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Bayley, Douglas James

    The United States Air Force (USAF) continues to have a need for assured access to space. In addition to flexible and responsive spacelift, a reduction in the cost per launch of space launch vehicles is also desirable. For this purpose, an investigation of the design optimization of space launch vehicles has been conducted. Using a suite of custom codes, the performance aspects of an entire space launch vehicle were analyzed. A genetic algorithm (GA) was employed to optimize the design of the space launch vehicle. A cost model was incorporated into the optimization process with the goal of minimizing the overall vehicle cost. The other goals of the design optimization included obtaining the proper altitude and velocity to achieve a low-Earth orbit. Specific mission parameters that are particular to USAF space endeavors were specified at the start of the design optimization process. Solid propellant motors, liquid fueled rockets, and air-launched systems in various configurations provided the propulsion systems for two-, three- and four-stage launch vehicles. Mass properties models, an aerodynamics model, and a six-degree-of-freedom (6DOF) flight dynamics simulator were all used to model the system. The results show the feasibility of this method in designing launch vehicles that meet mission requirements. Comparisons to existing real-world systems provide validation for the physical system models. However, the ability to obtain a truly minimized cost was elusive. The cost model uses an industry-standard approach; however, validation of this portion of the model was challenging due to the proprietary nature of cost figures and the dependence of many existing systems on surplus hardware.
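
    The cost-minimizing GA loop described above can be sketched in miniature. The two design variables, the cost model, and the orbit-miss penalty below are all invented placeholders; the study's actual mass, aerodynamics, and 6DOF models are far richer than this:

    ```python
    # Toy genetic algorithm: minimize an assumed vehicle cost subject to an
    # assumed delta-v requirement, via elitism + crossover + mutation.
    import random

    random.seed(1)

    def cost(x):
        prop_mass, stages = x
        vehicle_cost = 10.0 * prop_mass + 50.0 * stages     # assumed cost model
        delta_v = 0.9 * prop_mass * (1 + 0.1 * stages)      # assumed performance
        penalty = max(0.0, 9.5 - delta_v) * 1e3             # miss-orbit penalty
        return vehicle_cost + penalty

    def evolve(pop_size=30, gens=60):
        pop = [(random.uniform(5, 20), random.randint(2, 4))
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=cost)
            elite = pop[: pop_size // 2]                    # keep best half
            children = []
            for _ in range(pop_size - len(elite)):
                a, b = random.sample(elite, 2)              # crossover + mutation
                children.append(((a[0] + b[0]) / 2 + random.gauss(0, 0.5),
                                 random.choice([a[1], b[1]])))
            pop = elite + children
        return min(pop, key=cost)

    best = evolve()
    print("best design:", best, "cost:", round(cost(best), 1))
    ```

    The penalty term plays the role of the altitude/velocity constraints: infeasible designs survive selection only until a feasible, cheaper design displaces them.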

  19. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between each group responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed to enable distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology (SSETI) initiative (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Tradeoff simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  20. Design of a side coupled standing wave accelerating tube for NSTRI e-Linac

    NASA Astrophysics Data System (ADS)

    Zarei, S.; Abbasi Davani, F.; Lamehi Rachti, M.; Ghasemi, F.

    2017-09-01

    The design and construction of a 6 MeV electron linear accelerator (e-Linac) was defined in the Institute of Nuclear Science and Technology (NSTRI) for cargo inspection and medical applications. For this accelerator, a side coupled standing wave tube resonant at a frequency of 2998.5 MHz in π/2 mode was selected. In this article, the authors provide a step-by-step explanation of the design process for this tube. The design and simulation of the accelerating and coupling cavities were carried out in five steps: (1) separate design of the accelerating and coupling cavities, (2) design of the coupling aperture between the cavities, (3) design of the entire structure for resonance at the nominal frequency, (4) design of the buncher, and (5) design of the power coupling port. At all design stages, in addition to finding the dimensions of the cavity, the impact of construction tolerances and simulation errors on the electromagnetic parameters was investigated. The values obtained for the coupling coefficient, coupling constant, quality factor and capture efficiency are 2.11, 0.011, 16203 and 36%, respectively. The results of beam dynamics study of the simulated tube in ASTRA have yielded a value of 5.14 π-mm-mrad for the horizontal emittance, 5.06 π-mm-mrad for the vertical emittance, 1.17 mm for the horizontal beam size, 1.16 mm for the vertical beam size and 1090 keV for the energy spread of the output beam.
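
    For step (1), a first-cut cavity dimension can be estimated from the TM010 resonance of an ideal pillbox cavity, f = 2.405·c/(2πR). Real side-coupled cavities have nose cones and coupling slots, so this only sets a starting radius for the field simulations; the formula itself is the textbook pillbox result, not a value from the paper:

    ```python
    # First-cut radius of an ideal pillbox cavity resonating in TM010 mode
    # at the design frequency of 2998.5 MHz: f = (2.405 * c) / (2 * pi * R).
    import math

    c = 299_792_458.0          # speed of light, m/s
    f_target = 2998.5e6        # design frequency, Hz

    radius = 2.405 * c / (2 * math.pi * f_target)
    print(f"ideal pillbox radius ~ {radius * 1e3:.1f} mm")
    ```

    The simulated cavity is then detuned from this starting point to compensate for the nose cones and the coupling aperture, which is what steps (2) and (3) above iterate on.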

  1. A System for Interactive Computer Control of Experiments.

    DTIC Science & Technology

    1986-08-25

    for which the entire wave form is desired, requiring a transient digitizer for each channel. Pulse lengths vary between 1 and 30 microseconds, so the...to ensure that the computer knows which channel of the data acquisition system corresponds to each parameter. This manual is designed to be used in...are two types of voltage data to be recorded. First are the channels for which the entire wave form is to be recorded, such as the cathode voltage or

  2. Probability of loss of assured safety in systems with multiple time-dependent failure modes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helton, Jon Craig; Pilch, Martin.; Sallaberry, Cedric Jean-Marie.

    2012-09-01

    Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as probability of loss of assured safety (PLOAS). Representations for PLOAS for situations in which both link physical properties and link failure properties are time-dependent are derived and numerically evaluated for a variety of WL/SL configurations, including PLOAS defined by (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory uncertainty and epistemic uncertainty in the definition and numerical evaluation of PLOAS are considered.
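
    The four PLOAS definitions can be made concrete with a small Monte Carlo sketch. The failure-time distributions below (normal, with WLs designed to fail earlier than SLs) are purely illustrative, not the paper's time-dependent models:

    ```python
    # Monte Carlo estimate of the four PLOAS definitions (i)-(iv) for a
    # 2-WL / 2-SL system with assumed failure-time distributions.
    import random

    random.seed(2)

    def ploas(trials=50_000, n_wl=2, n_sl=2):
        counts = [0, 0, 0, 0]
        for _ in range(trials):
            wl = [random.gauss(1.0, 0.2) for _ in range(n_wl)]  # WL fail times
            sl = [random.gauss(1.5, 0.2) for _ in range(n_sl)]  # SL fail times
            counts[0] += max(sl) < min(wl)   # (i)  all SLs before any WL
            counts[1] += min(sl) < min(wl)   # (ii) any SL before any WL
            counts[2] += max(sl) < max(wl)   # (iii) all SLs before all WLs
            counts[3] += min(sl) < max(wl)   # (iv) any SL before all WLs
        return [c / trials for c in counts]

    probs = ploas()
    print("PLOAS (i)-(iv):", probs)
    ```

    Definitions (i)-(iv) are logically nested: (i) implies (ii), (ii) implies (iv), and (iii) implies (iv), so the estimates must respect those orderings regardless of the distributions chosen.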

  3. Melt-processed polymeric cellular dosage forms for immediate drug release.

    PubMed

    Blaesi, Aron H; Saka, Nannaji

    2015-12-28

    The present immediate-release solid dosage forms, such as the oral tablets and capsules, comprise granular matrices. While effective in releasing the drug rapidly, they are fraught with difficulties inherent in processing particulate matter. By contrast, liquid-based processes would be far more predictable; but the standard cast microstructures are unsuited for immediate-release because they resist fluid percolation and penetration. In this article, we introduce cellular dosage forms that can be readily prepared from polymeric melts by incorporating the nucleation, growth, and coalescence of microscopic gas bubbles in a molding process. We show that the cell topology and formulation of such cellular structures can be engineered to reduce the length-scale of the mass-transfer step, which determines the time of drug release, from as large as the dosage form itself to as small as the thickness of the cell wall. This allows the cellular dosage forms to achieve drug release rates over an order of magnitude faster compared with those of cast matrices, spanning the entire spectrum of immediate-release and beyond. The melt-processed polymeric cellular dosage forms enable predictive design of immediate-release solid dosage forms by tailoring microstructures, and could be manufactured efficiently in a single step.
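
    The length-scale argument above can be made concrete with the standard diffusion-time estimate t ~ L²/D. The diffusivity and the two lengths below are assumed round numbers for illustration, not the paper's measurements:

    ```python
    # Back-of-envelope illustration: diffusion time scales as L^2 / D, so
    # shrinking the mass-transfer distance from the whole dosage form to a
    # thin cell wall cuts release time by the square of the length ratio.
    D = 1e-10          # effective diffusivity, m^2/s (assumed)
    L_matrix = 5e-3    # cast-matrix length scale, m (assumed)
    L_wall = 100e-6    # cellular wall thickness, m (assumed)

    t_matrix = L_matrix**2 / D
    t_wall = L_wall**2 / D
    print(f"speed-up ~ {t_matrix / t_wall:.0f}x")   # (5e-3 / 1e-4)^2 = 2500
    ```

    Even with these rough numbers, the quadratic scaling shows why moving the rate-limiting step from the tablet scale to the cell-wall scale yields release rates over an order of magnitude faster, as reported.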

  4. Advanced Networks in Motion Mobile Sensorweb

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Stewart, David H.

    2011-01-01

    Advanced mobile networking technology applicable to mobile sensor platforms was developed, deployed and demonstrated. A two-tier sensorweb design was developed. The first tier utilized mobile network technology to provide mobility. The second tier, which sits above the first tier, utilizes 6LoWPAN (IPv6 over Low-Power Wireless Personal Area Networks) sensors. The entire network was IPv6-enabled. Successful mobile sensorweb system field tests took place in late August and early September of 2009. The network was monitored and controlled using a remote Web browser via IPv6. This paper describes the mobile networking and 6LoWPAN sensorweb design, implementation, deployment and testing, as well as the wireless systems and network-monitoring software developed to support testing and validation.

  5. Evaluation of Thermodynamic Models for Predicting Phase Equilibria of CO2 + Impurity Binary Mixture

    NASA Astrophysics Data System (ADS)

    Shin, Byeong Soo; Rho, Won Gu; You, Seong-Sik; Kang, Jeong Won; Lee, Chul Soo

    2018-03-01

    For the design and operation of CO2 capture and storage (CCS) processes, equation of state (EoS) models are used for phase equilibrium calculations. Reliability of an EoS model plays a crucial role, and many variations of EoS models have been reported and continue to be published. The prediction of phase equilibria for CO2 mixtures containing SO2, N2, NO, H2, O2, CH4, H2S, Ar, and H2O is important for CO2 transportation because the captured gas normally contains small amounts of impurities even though it is purified in advance. For the design of pipelines in deep sea or arctic conditions, flow assurance and safety are considered priority issues, and highly reliable calculations are required. In this work, predictive Soave-Redlich-Kwong, cubic plus association, Groupe Européen de Recherches Gazières (GERG-2008), perturbed-chain statistical associating fluid theory, and non-random lattice fluids hydrogen bond EoS models were compared regarding performance in calculating phase equilibria of CO2-impurity binary mixtures and with the collected literature data. No single EoS could cover the entire range of systems considered in this study. Weaknesses and strong points of each EoS model were analyzed, and recommendations are given as guidelines for safe design and operation of CCS processes.
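
    As a minimal illustration of the cubic-EoS machinery shared by several of the models compared above, here is a sketch of the standard Soave-Redlich-Kwong pure-component parameters for CO2. The critical constants are standard tabulated values; this is the textbook SRK form, not the predictive SRK variant evaluated in the paper:

    ```python
    # SRK pure-component parameters for CO2:
    #   b = 0.08664 R Tc / Pc
    #   a(T) = 0.42748 R^2 Tc^2 / Pc * alpha(T)
    #   alpha(T) = [1 + m (1 - sqrt(T/Tc))]^2, m = 0.480 + 1.574 w - 0.176 w^2
    import math

    R = 8.314462618                          # J/(mol K)
    Tc, Pc, omega = 304.13, 7.377e6, 0.224   # CO2 critical constants

    def srk_ab(T):
        m = 0.480 + 1.574 * omega - 0.176 * omega**2
        alpha = (1 + m * (1 - math.sqrt(T / Tc)))**2
        a = 0.42748 * R**2 * Tc**2 / Pc * alpha
        b = 0.08664 * R * Tc / Pc
        return a, b

    a, b = srk_ab(273.15)   # near a typical pipeline temperature
    print(f"a = {a:.3f} Pa m^6/mol^2, b = {b:.3e} m^3/mol")
    ```

    Mixture calculations for CO2 + impurity systems then combine such pure-component parameters through mixing rules with binary interaction coefficients, which is where the models compared in the paper begin to diverge.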

  6. A Drop in the Bucket or a Pebble in a Pond: Commercial Building Partners’ Replication of EEMs Across Their Portfolios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antonopoulos, Chrissi A.; Baechler, Michael C.; Dillon, Heather E.

    This study presents findings from questionnaire and interview data investigating replication efforts of Commercial Building Partnership (CBP) partners that worked directly with the Pacific Northwest National Laboratory (PNNL). PNNL partnered with 12 organizations on new and retrofit construction projects as part of the U.S. Department of Energy (DOE) CBP program. PNNL and other national laboratories collaborate with industry leaders that own large portfolios of buildings to develop high performance projects for new construction and renovation. This project accelerates market adoption of commercially available energy saving technologies into the design process for new and upgraded commercial buildings. The labs provide assistance to the partners’ design teams and make a business case for energy investments. From the owner’s perspective, a sound investment results in energy savings based on corporate objectives and design. Through a feedback questionnaire, along with personal interviews, PNNL gathered qualitative and quantitative information relating to replication efforts by each organization. Data through this process were analyzed to provide insight into two primary research areas: 1) CBP partners’ replication efforts of technologies and approaches used in the CBP project to the rest of the organization’s building portfolio (including replication verification), and, 2) the market potential for technology diffusion into the total U.S. commercial building stock, as a direct result of the entire CBP program.

  7. Methods for processing high-throughput RNA sequencing data.

    PubMed

    Ares, Manuel

    2014-11-03

    High-throughput sequencing (HTS) methods for analyzing RNA populations (RNA-Seq) are gaining rapid application to many experimental situations. The steps in an RNA-Seq experiment require thought and planning, especially because the expense in time and materials is currently higher and the protocols are far less routine than those used for other high-throughput methods, such as microarrays. As always, good experimental design will make analysis and interpretation easier. Having a clear biological question, an idea about the best way to do the experiment, and an understanding of the number of replicates needed will make the entire process more satisfying. Whether the goal is capturing transcriptome complexity from a tissue or identifying small fragments of RNA cross-linked to a protein of interest, conversion of the RNA to cDNA followed by direct sequencing using the latest methods is a developing practice, with new technical modifications and applications appearing every day. Even more rapid are the development and improvement of methods for analysis of the very large amounts of data that arrive at the end of an RNA-Seq experiment, making considerations regarding reproducibility, validation, visualization, and interpretation increasingly important. This introduction is designed to review and emphasize a pathway of analysis from experimental design through data presentation that is likely to be successful, with the recognition that better methods are right around the corner. © 2014 Cold Spring Harbor Laboratory Press.

  8. Key Elements of a Low Voltage, Ultracompact Plasma Spectrometer

    NASA Technical Reports Server (NTRS)

    Scime, E. E.; Barrie, A.; Dugas, M.; Elliott, D.; Ellison, S.; Keesee, A. M.; Pollock, C. J.; Rager, A.; Tersteeg, J.

    2016-01-01

    Taking advantage of technological developments in wafer-scale processing over the past two decades, such as deep etching, 3-D chip stacking, and double-sided lithography, we have designed and fabricated the key elements of an ultracompact 1.5 cm³ plasma spectrometer that requires only low-voltage power supplies, has no microchannel plates, and has a high aperture area to instrument volume ratio. The initial design of the instrument targets the measurement of charged particles in the 3-20 keV range with a highly directional field of view and a 100% duty cycle; i.e., the entire energy range is continuously measured. In addition to reducing mass, size, and voltage requirements, the new design will affect the manufacturing process of plasma spectrometers, enabling large quantities of identical instruments to be manufactured at low individual unit cost. Such a plasma spectrometer is ideal for heliophysics plasma investigations, particularly for small satellite and multispacecraft missions. Two key elements of the instrument have been fabricated: the collimator and the energy analyzer. An initial collimator transparency of 20% with 3° × 3° angular resolution was achieved. The targeted 40% collimator transparency appears readily achievable. The targeted energy-analyzer scaling factor of 1875 was achieved; i.e., 20 keV electrons were selected with only a 10.7 V bias voltage in the energy analyzer.
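
    As a quick consistency check using only the figures quoted above, the analyzer scaling factor relates bias voltage to selected energy roughly as E [eV] ≈ k·V [V]:

    ```python
    # Sanity check of the quoted energy-analyzer constant: a scaling factor
    # of 1875 applied to a 10.7 V bias should select ~20 keV electrons.
    k = 1875          # quoted energy-analyzer scaling factor
    v_bias = 10.7     # bias voltage, V
    energy_keV = k * v_bias / 1000.0
    print(f"selected energy ~ {energy_keV:.1f} keV")   # ~20 keV, as reported
    ```

    This large scaling factor is exactly what lets the instrument dispense with high-voltage supplies while still covering the 3-20 keV range.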

  9. New optical sensor systems for high-resolution satellite, airborne and terrestrial imaging systems

    NASA Astrophysics Data System (ADS)

    Eckardt, Andreas; Börner, Anko; Lehmann, Frank

    2007-10-01

    The department of Optical Information Systems (OS) at the Institute of Robotics and Mechatronics of the German Aerospace Center (DLR) has more than 25 years of experience with high-resolution imaging technology. Technology changes in detector development, together with significant improvements in manufacturing accuracy and ongoing engineering research, define the next generation of spaceborne sensor systems for Earth observation and remote sensing. The combination of large TDI lines, intelligent synchronization control, fast-readable sensors and new focal-plane concepts opens the door to new remote-sensing instruments. This class of instruments is suitable for high-resolution sensor systems, in both geometry and radiometry, and for their data products, such as 3D virtual reality. Systematic approaches are essential for the design of such complex sensor systems for dedicated tasks. The system theory of the instrument inside a simulated environment is the beginning of the optimization process for the optical, mechanical and electrical designs. Single modules and the entire system have to be calibrated and verified. Suitable procedures must be defined at component, module and system level for the assembly, test and verification process. This kind of development strategy allows hardware-in-the-loop design. The paper gives an overview of the current activities at DLR in the field of innovative sensor systems for photogrammetric and remote sensing purposes.

  10. Using "Game of Thrones" to Teach International Relations

    ERIC Educational Resources Information Center

    Young, Laura D.; Carranza Ko, Ñusta; Perrin, Michael

    2018-01-01

    Despite the known benefits of long-term, game-based simulations they remain underutilized in Political Science classrooms. Simulations used are typically designed to reinforce a concept and are short-lived, lasting one or two class sessions; rarely are entire courses designed around a single simulation. Creating real-world conditions in which…

  11. 23 CFR 1340.9 - Computation of estimates.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.9 Computation of estimates. (a) Data... design and any subsequent adjustments. (e) Sampling weight adjustments for observation sites with no... section, the nonresponse rate for the entire survey shall not exceed 10 percent for the ratio of the total...

  12. 23 CFR 1340.9 - Computation of estimates.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.9 Computation of estimates. (a) Data... design and any subsequent adjustments. (e) Sampling weight adjustments for observation sites with no... section, the nonresponse rate for the entire survey shall not exceed 10 percent for the ratio of the total...

  13. 23 CFR 1340.9 - Computation of estimates.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.9 Computation of estimates. (a) Data... design and any subsequent adjustments. (e) Sampling weight adjustments for observation sites with no... section, the nonresponse rate for the entire survey shall not exceed 10 percent for the ratio of the total...

  14. 49 CFR 178.707 - Standards for composite IBCs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... designed to bear the entire stacking load. The inner receptacle and outer packaging form an integral... outer packaging. (2) A composite IBC with a fully enclosing outer packaging must be designed to permit assessment of the integrity of the inner container following the leakproofness and hydraulic tests. The outer...

  15. 50 CFR 91.13 - Technical requirements for design and submission of entry.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF THE INTERIOR (CONTINUED) MISCELLANEOUS PROVISIONS MIGRATORY BIRD HUNTING AND CONSERVATION STAMP.... No scrollwork, lettering, bird band numbers, signatures or initials may appear on the design. Each..., or under glass, or have any protective covering (other than the matting) attached to them. The entire...

  16. 50 CFR 91.13 - Technical requirements for design and submission of entry.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF THE INTERIOR (CONTINUED) MISCELLANEOUS PROVISIONS MIGRATORY BIRD HUNTING AND CONSERVATION STAMP.... No scrollwork, lettering, bird band numbers, signatures or initials may appear on the design. Each..., or under glass, or have any protective covering (other than the matting) attached to them. The entire...

  17. 12 CFR 360.9 - Large-bank deposit insurance determination modernization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... domestic offices of an insured depository institution, the provisional hold algorithm must be designed to... determined by the FDIC, the algorithm must be designed to calculate and place a hold equal to the dollar... credit arrangement, the provisional hold algorithm will apply a provisional hold percentage to the entire...

  18. 12 CFR 360.9 - Large-bank deposit insurance determination modernization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... domestic offices of an insured depository institution, the provisional hold algorithm must be designed to... determined by the FDIC, the algorithm must be designed to calculate and place a hold equal to the dollar... credit arrangement, the provisional hold algorithm will apply a provisional hold percentage to the entire...

  19. 12 CFR 360.9 - Large-bank deposit insurance determination modernization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... domestic offices of an insured depository institution, the provisional hold algorithm must be designed to... determined by the FDIC, the algorithm must be designed to calculate and place a hold equal to the dollar... credit arrangement, the provisional hold algorithm will apply a provisional hold percentage to the entire...

  20. 12 CFR 360.9 - Large-bank deposit insurance determination modernization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... domestic offices of an insured depository institution, the provisional hold algorithm must be designed to... determined by the FDIC, the algorithm must be designed to calculate and place a hold equal to the dollar... credit arrangement, the provisional hold algorithm will apply a provisional hold percentage to the entire...

  1. Effects of Integrating Peace Education in the Nigeria Education System

    ERIC Educational Resources Information Center

    Olowo, Oluwatoyin Olusegun

    2016-01-01

    This paper attempted to investigate the effects of integrating Peace Education into Nigeria educational system. Four research questions were designed for the study. The researcher designed an instrument tagged: Questionnaire on effect of Integrating Peace Education (QEIPE). The entire population of two hundred respondents spread across Secondary…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Mittereder, A. Poerschke

    This report presents a cold-climate project that examines an alternative approach to ground source heat pump (GSHP) ground loop design. The innovative ground loop design is an attempt to reduce the installed cost of the ground loop heat exchange portion of the system by containing the entire ground loop within the excavated location beneath the basement slab.

  3. JOSE, Jupiter orbiting spacecraft: A systems study, volume 1

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A brief summary of the mechanical properties of Jupiter is presented along with an organizational outline of the entire JOSE program. Other aspects of the program described include: spacecraft design, mission trajectories, altitude control, propulsion subsystem, on-board power supply, spacecraft structures and environmental design considerations, and telemetry.

  4. Expanded Processing Techniques for EMI Systems

    DTIC Science & Technology

    2012-07-01

    possible to perform better target detection using physics-based algorithms and the entire data set, rather than simulating a simpler data set and mapping... Figure 4.25: Plots of simulated MetalMapper data for two oblate spheroidal targets

  5. Power control of SAFE reactor using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Irvine, Claude

    2002-01-01

    Controlling the 100 kW SAFE (Safe Affordable Fission Engine) reactor consists of the design and implementation of a fuzzy logic process control system to regulate dynamic variables related to nuclear system power. The first phase of development concentrates primarily on system power startup and regulation, maintaining core temperature equilibrium, and power profile matching. This paper discusses the experimental work performed in those areas. Nuclear core power from the fuel elements is simulated using resistive heating elements, while heat rejection is handled by a series of heat pipes. Both axial and radial nuclear power distributions are determined from neutronic modeling codes. The axial temperature profile of the simulated core is matched to the nuclear power profile by varying the resistance of the heating elements. The SAFE model establishes radial temperature profile equivalence by using 32 control zones as the nodal coordinates. Control features also allow for slow warm-up, since complete shutoff can occur in the heat pipes if heat-source temperatures drop below a certain minimum value, depending on the specific fluid and gas combination in the heat pipe. The entire system is expected to be self-adaptive, i.e., capable of responding to long-range changes in the space environment. Particular attention in the development of the fuzzy logic algorithm shall be given to ensuring that the process remains at set point, virtually eliminating overshoot on startup and during in-process disturbances. The controller design will withstand harsh environments and applications where it might come in contact with water, corrosive chemicals, radiation fields, etc.
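    The abstract does not publish the SAFE rule base, so the following is only a minimal sketch of a fuzzy temperature-to-power controller of the kind described; the membership ranges, rule consequents, and step sizes are all hypothetical:

```python
# Minimal fuzzy-logic controller sketch for a single temperature zone.
# Membership functions, rule base, and magnitudes are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_power_step(error):
    """Map temperature error (setpoint - measured, in K) to a power change (kW)."""
    # Fuzzify the error into three linguistic sets.
    mu = {
        "negative": tri(error, -40.0, -20.0, 0.0),   # too hot -> reduce power
        "zero":     tri(error, -20.0, 0.0, 20.0),    # on target -> hold
        "positive": tri(error, 0.0, 20.0, 40.0),     # too cold -> add power
    }
    # Rule consequents as output singletons (kW change per control step),
    # combined by weighted-average (centroid-of-singletons) defuzzification.
    out = {"negative": -1.0, "zero": 0.0, "positive": 1.0}
    num = sum(mu[k] * out[k] for k in mu)
    den = sum(mu.values())
    return num / den if den else 0.0

print(fuzzy_power_step(0.0))   # on setpoint: no power change
print(fuzzy_power_step(10.0))  # below setpoint: positive power step
```

    A controller like the one in the paper would add more linguistic sets per zone, rate-of-change inputs, and the slow warm-up limits noted above.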

  6. Legitimation problems of participatory processes in technology assessment and technology policy.

    PubMed

    Saretzki, Thomas

    2012-11-01

    Since James Carroll (1971) made a strong case for "participatory technology", scientists, engineers, policy-makers and the public at large have seen quite a number of different approaches to design and implement participatory processes in technology assessment and technology policy. As these participatory experiments and practices spread over the last two decades, one could easily get the impression that participation turned from a theoretical normative claim to a working practice that goes without saying. Looking beyond the well-known forerunners and considering the ambivalent experiences that have been made under different conditions in various places, however, the "if" and "how" of participation are still contested issues when questions of technology are on the agenda. Legitimation problems indicate that attempts to justify participation in a given case have not been entirely successful in the eyes of relevant groups among the sponsors, participants, organizers or observers. Legitimation problems of participatory processes in technology assessment and technology policy vary considerably, and they do so not only with the two domains, the ways they interrelate, and the specific features of the participatory processes. If we ask whether or not participation is seen as problematic in technology assessment and technology policy-making, and in what sense it is being evaluated as problematic, then we find that the answer depends also on the approaches and criteria that have been used to legitimize or delegitimize the call for a specific design of participation.

  7. An Optimization of Manufacturing Systems using a Feedback Control Scheduling Model

    NASA Astrophysics Data System (ADS)

    Ikome, John M.; Kanakana, Grace M.

    2018-03-01

    In complex production systems that involve multiple processes, unplanned disruptions often make the entire production system vulnerable to a number of problems, leading to customer dissatisfaction. This ongoing problem requires research and methods to streamline the entire process, or a model that addresses it. To that end, we developed a feedback control scheduling model that can minimize some of these problems; a series of experiments shows that several of them can be eliminated if the correct remedial actions are implemented on time.

  8. SEPARATION OF INORGANIC SALTS FROM ORGANIC SOLUTIONS

    DOEpatents

    Katzin, L.I.; Sullivan, J.C.

    1958-06-24

    A process is described for recovering the nitrates of uranium and plutonium from solution in oxygen-containing organic solvents such as ketones or ethers. The solution of such salts dissolved in an oxygen-containing organic compound is contacted with an ion exchange resin whereby sorption of the entire salt on the resin takes place and then the salt-depleted liquid and the resin are separated from each other. The reaction seems to be based on an anion formation of the entire salt by complexing with the anion of the resin. Strong base or quaternary ammonium type resins can be used successfully in this process.

  9. Process compensated resonance testing modeling for damage evolution and uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Biedermann, Eric; Heffernan, Julieanne; Mayes, Alexander; Gatewood, Garrett; Jauriqui, Leanne; Goodlet, Brent; Pollock, Tresa; Torbet, Chris; Aldrin, John C.; Mazdiyasni, Siamack

    2017-02-01

    Process Compensated Resonance Testing (PCRT) is a nondestructive evaluation (NDE) method based on the fundamentals of Resonant Ultrasound Spectroscopy (RUS). PCRT is used for material characterization, defect detection, process control and life monitoring of critical gas turbine engine and aircraft components. Forward modeling and model inversion for PCRT have the potential to greatly increase the method's material characterization capability while reducing its dependence on compiling a large population of physical resonance measurements. This paper presents progress on forward modeling studies for damage mechanisms and defects common to structural materials for gas turbine engines. Finite element method (FEM) models of single crystal (SX) Ni-based superalloy Mar-M247 dog bones and Ti-6Al-4V cylindrical bars were created, and FEM modal analyses calculated the resonance frequencies for the samples in their baseline condition. Then the frequency effects of superalloy creep (high-temperature plastic deformation) and macroscopic texture (preferred crystallographic orientation of grains detrimental to fatigue properties) were evaluated. A PCRT sorting module for creep damage in Mar-M247 was trained with a virtual database made entirely of modeled design points. The sorting module demonstrated successful discrimination of design points with as little as 1% creep strain in the gauge section from a population of acceptable design points with a range of material and geometric variation. The resonance frequency effects of macro-scale texture in Ti-6Al-4V were quantified with forward models of cylinder samples. FEM-based model inversion was demonstrated for Mar-M247 bulk material properties and variations in crystallographic orientation.
PCRT uncertainty quantification (UQ) was performed using Monte Carlo studies for Mar-M247 that quantified the overall uncertainty in resonance frequencies resulting from coupled variation in geometry, material properties, crystallographic orientation and creep damage. A model calibration process was also developed that evaluates inversion fitting to differences from a designated reference sample rather than absolute property values, yielding a reduction in fit error.
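    The paper's FEM model cannot be reproduced here, so this sketch substitutes a closed-form cantilever first-bending-mode formula to illustrate the Monte Carlo UQ step: sample geometry and material properties, propagate each draw through the frequency model, and summarize the resulting scatter. All distributions and dimensions below are hypothetical:

```python
# Monte Carlo sketch of resonance-frequency uncertainty, in the spirit of
# the PCRT UQ study. A closed-form cantilever-beam first bending mode
# stands in for the paper's FEM model; all inputs are hypothetical.
import math
import random
import statistics

def first_bending_hz(E, rho, L, t):
    """First bending mode of a uniform rectangular cantilever, length L, thickness t."""
    # f1 = (1.875^2 / (2*pi)) * sqrt(E*I/(rho*A)) / L^2, with I/A = t^2/12
    return (1.875**2 / (2.0 * math.pi)) * math.sqrt(E * t**2 / (12.0 * rho)) / L**2

random.seed(0)
samples = []
for _ in range(5000):
    E = random.gauss(205e9, 5e9)      # Young's modulus (Pa), ~2.5% scatter
    rho = random.gauss(8400.0, 50.0)  # density (kg/m^3)
    L = random.gauss(0.10, 0.0005)    # length (m), machining tolerance
    t = random.gauss(0.005, 0.00005)  # thickness (m)
    samples.append(first_bending_hz(E, rho, L, t))

mean = statistics.fmean(samples)
cov = statistics.stdev(samples) / mean
print(f"mean f1 = {mean:.0f} Hz, coefficient of variation = {100*cov:.2f}%")
```

    The same pattern (sample inputs, run the forward model, collect frequency statistics) applies when the forward model is a full FEM modal analysis, only at far greater cost per draw.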

  10. Preliminary design of a mini-Brayton Compressor-Alternator-Turbine (CAT)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The preliminary design of a mini-Brayton compressor-alternator-turbine system is discussed. The program design goals are listed. The optimum system characteristics over the entire range of power output were determined by performing a wide-range parametric study. The ability to develop the required components to the degree necessary within the limitations of present technology is evaluated. The sensitivity of the system to various individual design parameters was analyzed.

  11. Design intent optimization at the beyond 7nm node: the intersection of DTCO and EUVL stochastic mitigation techniques

    NASA Astrophysics Data System (ADS)

    Crouse, Michael; Liebmann, Lars; Plachecki, Vince; Salama, Mohamed; Chen, Yulu; Saulnier, Nicole; Dunn, Derren; Matthew, Itty; Hsu, Stephen; Gronlund, Keith; Goodwin, Francis

    2017-03-01

    The initial readiness of EUV patterning was demonstrated in 2016 with IBM Alliance's 7nm device technology. The focus has now shifted to driving the 'effective' k1 factor and enabling the second generation of EUV patterning. Thus, Design Technology Co-optimization (DTCO) has become a critical part of technology enablement as scaling has become more challenging and the industry pushes the limits of EUV lithography. The working partnership between the design teams and the process development teams typically involves an iterative approach to evaluate the manufacturability of proposed designs, subsequent modifications to those designs and finally a design manual for the technology. While this approach has served the industry well for many generations, the challenges at the Beyond 7nm node require a more efficient approach. In this work, we describe the use of "Design Intent" lithographic layout optimization, where we remove the iterative component of DTCO and replace it with an optimization that achieves a "patterning friendly" design while minimizing the well-known EUV stochastic effects. Solved together, this "design intent" approach can more quickly achieve superior lithographic results while still meeting the original device's functional specifications. Specifically, in this work we demonstrate "design intent" optimization for critical BEOL layers using design tolerance bands to guide the source mask co-optimization. The design tolerance bands can be either supplied as part of the original design or derived from some basic rules. Additionally, the EUV stochastic behavior is mitigated by enhancing the image log slope (ILS) for specific key features as part of the overall optimization. We show the benefit of the "design intent" approach on both bidirectional and unidirectional 28nm min pitch standard logic layouts and compare it with the more typical iterative SMO approach, thus demonstrating the benefit of allowing the design to float within the specified range. Lastly, we discuss how the evolution of this approach could lead to layout optimization based entirely on some minimal set of functional requirements and process constraints.

  12. Fundamentals in Biostatistics for Research in Pediatric Dentistry: Part I - Basic Concepts.

    PubMed

    Garrocho-Rangel, J A; Ruiz-Rodríguez, M S; Pozos-Guillén, A J

    The purpose of this report was to provide the reader with some basic concepts in order to better understand the significance and reliability of the results of any article on Pediatric Dentistry. Currently, Pediatric Dentists need the best evidence available in the literature on which to base their diagnoses and treatment decisions for children's oral care. A basic understanding of Biostatistics plays an important role throughout the entire Evidence-Based Dentistry (EBD) process. This report describes Biostatistics fundamentals in order to introduce the basic concepts used in statistics, such as summary measures, estimation, hypothesis testing, effect size, level of significance, p value, and confidence intervals, which are available to Pediatric Dentists interested in reading or designing original clinical or epidemiological studies.

  13. Using support vector machines to detect medical fraud and abuse.

    PubMed

    Francis, Charles; Pepper, Noah; Strong, Homer

    2011-01-01

    This paper examines the architecture and efficacy of Quash, an automated medical bill processing system capable of bill routing and abuse detection. Quash is designed to be used in conjunction with human auditors and a standard bill review software platform to provide a complete cost containment solution for medical claims. The primary contribution of Quash is to provide a real world speed up for medical fraud detection experts in their work. There will be a discussion of implementation details and preliminary experimental results. In this paper we are entirely focused on medical data and billing patterns that occur within the United States, though these results should be applicable to any financial transaction environment in which structured coding data can be mined.
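    Quash's features and trained model are not public; as an illustration of the general approach, here is a self-contained linear SVM trained by hinge-loss subgradient descent on synthetic billing features (both the features and the labels are invented for this sketch):

```python
# Toy linear SVM trained by hinge-loss subgradient descent, illustrating
# the kind of classifier the paper applies to billing data. Features,
# labels, and hyperparameters here are synthetic/hypothetical.
import random

def train_linear_svm(data, lam=0.01, lr=0.1, epochs=200):
    """data: list of (features, label) pairs with label in {-1, +1}."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in data:
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1.0:  # inside margin: hinge-loss subgradient step
                w = [wi - lr * (lam * wi - y * xi) for wi, xi in zip(w, x)]
                b += lr * y
            else:             # correctly classified: only regularize
                w = [wi - lr * lam * wi for wi in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Synthetic bills: (normalized charge amount, procedure-count ratio).
random.seed(1)
normal = [([random.gauss(0.3, 0.1), random.gauss(0.3, 0.1)], -1) for _ in range(50)]
suspect = [([random.gauss(0.8, 0.1), random.gauss(0.8, 0.1)], 1) for _ in range(50)]
data = normal + suspect
random.shuffle(data)
w, b = train_linear_svm(data)
print(predict(w, b, [0.25, 0.3]))   # typical bill
print(predict(w, b, [0.9, 0.85]))   # candidate for auditor review
```

    A production system would use a mature SVM library, many more engineered features from structured billing codes, and calibrated scores for routing bills to human auditors rather than hard labels.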

  14. Getting the Bigger Picture With Digital Surveillance

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Through a Space Act Agreement, Diebold, Inc., acquired the exclusive rights to Glenn Research Center's patented video observation technology, originally designed to accelerate video image analysis for various ongoing and future space applications. Diebold implemented the technology into its AccuTrack digital, color video recorder, a state-of-the-art surveillance product that uses motion detection for around-the-clock monitoring. AccuTrack captures digitally signed images and transaction data in real-time. This process replaces the onerous tasks involved in operating a VCR-based surveillance system, and subsequently eliminates the need for central viewing and tape archiving locations altogether. AccuTrack can monitor an entire bank facility, including four automated teller machines, multiple teller lines, and new account areas, all from one central location.

  15. Controllable g5p-Protein-Directed Aggregation of ssDNA-Gold Nanoparticles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S.; Maye, M; Zhang, Y

    We assembled single-stranded DNA (ssDNA) conjugated nanoparticles using the phage M13 gene 5 protein (g5p) as the molecular glue to bind two antiparallel noncomplementary ssDNA strands. The entire process was controlled tightly by the concentration of the g5p protein and the presence of double-stranded DNA. The g5p-ssDNA aggregate was disintegrated by hybridization with complementary ssDNA (C-ssDNA) that triggers the dissociation of the complex. Polyhistidine-tagged g5p was bound to nickel nitrilotriacetic acid (Ni2+-NTA) conjugated nanoparticles and subsequently used to coassemble the ssDNA-conjugated nanoparticles into multiparticle-type aggregates. Our approach offers great promise for designing biologically functional, controllable protein/nanoparticle composites.

  16. Performance calculation and simulation system of high energy laser weapon

    NASA Astrophysics Data System (ADS)

    Wang, Pei; Liu, Min; Su, Yu; Zhang, Ke

    2014-12-01

    High energy laser weapons are ready for some of today's most challenging military applications. Based on the analysis of the main tactical/technical index and combating process of high energy laser weapon, a performance calculation and simulation system of high energy laser weapon was established. Firstly, the index decomposition and workflow of high energy laser weapon was proposed. The entire system was composed of six parts, including classical target, platform of laser weapon, detect sensor, tracking and pointing control, laser atmosphere propagation and damage assessment module. Then, the index calculation modules were designed. Finally, anti-missile interception simulation was performed. The system can provide reference and basis for the analysis and evaluation of high energy laser weapon efficiency.

  17. Three-dimensional finite element magnetic simulation of an innovative multi-coiled magnetorheological brake

    NASA Astrophysics Data System (ADS)

    Ubaidillah; Permata, A. N. S.; Mazlan, S. A.; Tjahjana, D. D. D. P.; Widodo, P. J.

    2017-10-01

    This research delivers a finite element magnetic simulation of a novel disk-type multi-coil magnetorheological brake (MR brake). The MR brake's axial design had more than one coil located outside of the casing, which could simplify brake maintenance. One pair of coils was used to represent the entire coil set in the simulation, and it could distribute magnetic flux to all parts of the electromagnet. The objective of this simulation was to produce magnetic flux on the surface of the disc brake rotor. The magnetic flux of the proposed MR brake was higher than that of a current MR brake having a single, larger coil. The simulation results were used to identify the effect of different fluids in each variation: the magnetorheological fluids MRF-132DG and MRF-140CG were injected into gaps of 0.50, 1.00, and 1.50 mm, respectively. In the simulation, the coils were energized at 0.25, 0.50, 0.75, 1.00, 1.50, and 2.00 A, respectively. The magnetic flux density produced by MRF-140CG was 336 mT at the 0.50 mm gap. The results show that the smaller the gap, the higher the magnetic flux.
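    The 336 mT figure comes from the paper's full FEM model, but the reported smaller-gap/higher-flux trend can be illustrated with a lumped magnetic-circuit approximation; the coil turns, core path length, and relative permeability below are hypothetical:

```python
# Lumped magnetic-circuit sketch of the gap-vs-flux-density trend reported
# for the MR brake. Coil turns, core geometry, and permeability are
# hypothetical; the paper's numbers come from a full FEM model.
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability (H/m)

def gap_flux_density(turns, current, gap_m, core_len=0.08, mu_r=2000.0):
    """Flux density (T) in the fluid gap of a simple series magnetic circuit."""
    # mmf = N*I drives flux through core reluctance plus gap reluctance;
    # with uniform cross-section, B = mu0*N*I / (gap + core_len/mu_r).
    return MU0 * turns * current / (gap_m + core_len / mu_r)

for gap_mm in (0.50, 1.00, 1.50):
    B = gap_flux_density(turns=300, current=1.0, gap_m=gap_mm * 1e-3)
    print(f"gap {gap_mm:.2f} mm -> B = {B * 1000:.0f} mT")
```

    Because the air (fluid) gap dominates the total reluctance, halving the gap roughly doubles the flux density, which is the trend the FEM study reports.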

  18. Backwards Faded Scaffolding Impact on Pre-Service Teachers’ Cognition

    NASA Astrophysics Data System (ADS)

    Slater, T. F.; Slater, S. J.

    2009-12-01

    In response to national reform movements calling for future teachers to be prepared to design and deliver science instruction using the principles of inquiry in the context of Earth system science, we created and evaluated an innovative curriculum for specially designed courses for pre-service elementary education and secondary undergraduates based upon an inquiry-oriented teaching approach framed by the notions of backwards faded-scaffolding as an overarching theme for inquiry-oriented instruction. Students completed both structured- and open-inquiry projects using online scientific data bases, particularly those available from NASA, and presented the results of their investigations several times throughout the semester as a mini-science conference. Using a single-group, multiple-measures, quasi-experimental design, students demonstrated enhanced content knowledge of astronomy and inquiry as well as attitudes and self-efficacy toward teaching as measured by the Test of Astronomy STandards (TOAST), the Science Teaching Efficacy Belief Instrument - Version B, and the Attitudes Toward Science Inventory. We adopted a model of inquiry where: (i) students are engaged in questions; (ii) students are designing plans to pursue data; and (iii) students are generating and defending conclusions based on evidence they have collected. We developed an approach that is directly in contrast with the open inquiry “science fair” model to specifically use carefully scaffolded, shorter term inquiries, placing the most challenging aspects of “question generation” at the end of the lessons. In this model, during students' first experience with inquiry they are guided through the entire process, from research question to the appropriate content and format for a scientific conclusion. In their second experience, students generate their conclusions independently, with the previous experience set out as a guide for content and format. 
    They are required to make sense of data that has been purposefully and logically planned, collected, and analyzed with instructor guidance. They construct and defend conclusions based upon evidence that is, effectively, given to them. By the time students reach their third inquiry, they have been exposed to two experiences in which they were guided through the process of data collection and analysis. On this third inquiry, data collection and analysis become an independent task. By the fourth inquiry, students have received explicit instruction three times on the connection between the research question or hypothesis and the procedure undergone to address it. They are positioned to take responsibility for creating a plausible method for collecting data given a research prompt. By the fifth inquiry, students have seen four examples of quality research questions and hypotheses, and their relationship to procedures, data collection, and conclusions. At this point they are positioned to successfully conduct an entire inquiry cycle. This strategy is specifically designed to provide students with early success and a sense of how the pieces of the scientific process connect to each other.

  19. Reference Values of Within-District Intraclass Correlations of Academic Achievement by District Characteristics: Results from a Meta-Analysis of District-Specific Values

    ERIC Educational Resources Information Center

    Hedberg, E. C.; Hedges, Larry V.

    2014-01-01

    Randomized experiments are often considered the strongest designs to study the impact of educational interventions. Perhaps the most prevalent class of designs used in large scale education experiments is the cluster randomized design in which entire schools are assigned to treatments. In cluster randomized trials (CRTs) that assign schools to…
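    The variance penalty that makes ICC reference values so important when planning cluster randomized trials is the design effect, DEFF = 1 + (m - 1) * rho for clusters of size m and intraclass correlation rho; a short sketch with illustrative (not the paper's) numbers:

```python
# Design-effect sketch for cluster randomized trials: a within-cluster
# intraclass correlation rho inflates the variance of a treatment-effect
# estimate by DEFF = 1 + (m - 1) * rho, where m is the cluster size.
# The rho value below is illustrative, not an estimate from the paper.

def design_effect(m, rho):
    """Variance inflation for clusters of size m with intraclass correlation rho."""
    return 1.0 + (m - 1.0) * rho

def effective_n(n_total, m, rho):
    """Sample size of an equivalent simple random sample."""
    return n_total / design_effect(m, rho)

# 40 schools of 25 students each (n = 1000), with rho = 0.2:
print(design_effect(25, 0.2))             # variance inflated ~5.8x
print(round(effective_n(1000, 25, 0.2)))  # ~172 independent observations
```

    This is why district-specific ICC reference values matter: a power analysis that assumes the wrong rho can overstate the effective sample size several-fold.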

  20. Navy Columbia Class (Ohio Replacement) Ballistic Missile Submarine (SSBN[X]) Program: Background and Issues for Congress

    DTIC Science & Technology

    2016-10-25

    ...program, a program to design and build a new class of 12 ballistic missile submarines (SSBNs) to replace the Navy's current force of 14 Ohio-class SSBNs... billion in detailed design and nonrecurring engineering (DD/NRE) costs for the entire class, and $8.8 billion in construction costs for the ship...
